Why is my new server more expensive?
It’s not always obvious why costs change in IT.
But over the last three months, there have been significant increases in the prices of RAM.
That change has had a knock-on effect.
Servers, laptops, and desktops have all become noticeably more expensive as a result.
At first glance, the connection isn’t clear.
Why would the growth of AI affect the cost of everyday infrastructure?
It comes down to how the same raw materials are being used.
Silicon sits at the centre of it all.
It’s refined into ultra-pure ingots, sliced into wafers, and fabricated into both DRAM (the RAM modules in laptops and servers) and NAND flash (the memory inside SSDs).
For a long time, supply and demand stayed relatively balanced.
Then AI came along.
AI systems don’t use memory in quite the same way.
To improve speed and reduce latency, memory is stacked directly onto the chip package itself.
Layers of silicon are stacked vertically (12 deep in the latest B300 chip), creating highly dense, high-performance modules.
It’s efficient in operation.
But far more demanding to produce.
The trade-off becomes clearer when you look at scale.
A single NVIDIA Blackwell B300 chip carries 192 GB of memory on board: the equivalent of twelve 16 GB laptops (192 ÷ 16 = 12). And that simple division understates it, because stacked memory is far harder to manufacture, so each chip consumes even more wafer capacity than the arithmetic suggests.
This means a single high-end AI chip can contain hundreds of gigabytes of memory.
But producing that chip can consume the same raw silicon that would otherwise support dozens of standard memory modules.
In practical terms, more silicon is now being directed into fewer, more complex components.
And the total supply hasn’t increased at the same pace.
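As a rough, back-of-the-envelope illustration of the scale described above (the 192 GB and 16 GB figures come from this article; everything else is just arithmetic):

```python
# Illustrative only: how many typical laptops' worth of RAM
# a single high-end AI accelerator consumes.

AI_CHIP_MEMORY_GB = 192   # onboard memory on an NVIDIA B300-class chip
LAPTOP_RAM_GB = 16        # a common laptop configuration

equivalent_laptops = AI_CHIP_MEMORY_GB // LAPTOP_RAM_GB
print(equivalent_laptops)  # 12
```

In other words, every chip of this class that ships represents a dozen laptops' worth of memory that never reaches the consumer market, and in practice more than that, given the extra manufacturing overhead of stacked memory.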
That shift is already being felt across the market.
In late 2025, Micron (probably best known to techies through its consumer brand, Crucial) wound down its consumer output entirely due to unprecedented demand from AI hyperscalers.
The result is a tighter supply of RAM and SSDs for everything else.
AI, as it stands today, is extremely useful, and it’s not going anywhere.
But despite significant progress over the last 24 months, the idea of AGI (Artificial General Intelligence) doesn’t feel any closer - in some ways, it feels further away.
The limitations tend to become clearer with use.
In areas where knowledge is well established, it performs very well, often making older ways of searching feel slow by comparison.
But in more practical scenarios, especially where there are multiple ways to do something, it can be less reliable - mixing approaches or missing a key detail in a way that becomes frustrating.
It’s a capable tool.
Just not a complete one.
For most businesses, the impact is more straightforward.
Hardware costs have increased.
And they’re likely to remain elevated for some time.
Expanding silicon production takes years, not months.
So this isn’t a short-term fluctuation.
Changes in technology don’t always show up where you expect them. Sometimes they appear quietly, in the background, through pricing, availability, or small shifts in planning.
At Nitec, we can’t change the global supply of silicon.
But understanding what’s driving these changes helps businesses make better decisions about timing, investment, and lifecycle planning.
Progress here isn’t dramatic.
It’s just about staying informed and adjusting where needed.