I'm running a box I put together in 2014 with an i5-4460 (3.2 GHz), 16 GB of RAM, a GeForce 750 Ti, a first-gen SSD, an ASRock H97M Pro4 motherboard with a reasonable PSU, case and a number of fans. All of that, parted out at the time, came to $700.
I've never been more fearful of components breaking than current day. With GPU and now memory prices being crazy, I hope I never have to upgrade.
I don't know how, but the box is still great for everyday web development with heavy Docker usage, and for video recording / editing with a 4K monitor and a second 1440p monitor hooked up. Minor gaming is OK too; for example, I picked up Silksong last week and it runs very well at 2560x1440.
For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.
Don't worry, if you are happy with those specs you can get corporate ewaste Dell towers on eBay for low prices. Search "Dell Precision tower"; I just saw a listing with 32 GB RAM and a 3.6 GHz Xeon for about 300 USD.
Personally, at work I use the latest hardware; at home I use ewaste.
I got a junk Precision workstation last year as a "polite" home server (it's quiet and doesn't look like industrial equipment, but still has some server-like qualities, particularly the use of ECC RAM). I liked it so much that it ended up becoming my main desktop.
Ha, I bought one of those for $500 from Ebay. It's a dual Xeon Silver workstation with a Nvidia Quadro P400 8GB, 128GB RAM and 256G SSD. I threw in a 1TB SSD and it's been working pretty well.
At home some of my systems are ewaste from former employers who would just give it to employees rather than paying for disposal. A couple are eBay finds. I do have one highish-end system at a time specifically for games. Some of my systems are my old hardware reassembled after all the parts for gaming have been upgraded over the years.
About a month ago, the mobo for my 5950x decided to give up the ghost. I decided to just rebuild the whole thing and update from scratch.
So went crazy and bought a 9800X3D, purchased a ridiculous amount of DDR5 RAM (96GB, which matches my old machine’s DDR4 RAM quantity). At the time, it was about $400 USD or so.
I’ve been living in blissful ignorance since then. Seeing this post, I decided to check Amazon. The same amount of RAM is currently $1200!!!
Except nobody earns the minimum wage today; it's less than 1/2 of 1% of US labor.
The median full-time wage is now $62,000. You can start at $13 at almost any national retailer, and $15 or above at CVS / Walgreens / Costco. The cashier positions require zero work background, zero skill, zero education. You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.
In addition to the other comments, I presume the big box retailers do not hire for full-time positions when they don't have to, and gig economy work is rapidly replacing jobs that used to be minimum wage.
Counterpoint: affording average rent for a 1-bedroom apartment (~$1,675) requires that exact median full-time wage. $15 an hour affords you about $740 for monthly housing expenses. One can suggest getting two roommates for a one-bedroom apartment, but they would be missing the fact that this is very unusual for the last century. It's more in line with housing economics from the early-to-mid 19th century.
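For concreteness, here is roughly how those housing figures pencil out, assuming full-time hours and the common ~30%-of-gross-income guideline for housing (my assumptions; the commenter may have used slightly different inputs):

    # Rough affordability sketch. Assumptions: 40 h/week, 52 weeks/year,
    # housing capped at ~30% of gross income (the common budgeting guideline).
    hourly_wage = 15.00
    monthly_gross = hourly_wage * 40 * 52 / 12   # ~$2,600/month
    housing_budget = 0.30 * monthly_gross        # ~$780/month, close to the ~$740 cited

    median_annual = 62_000
    median_housing = 0.30 * median_annual / 12   # ~$1,550/month, in the ballpark of the ~$1,675 average rent

    print(f"${housing_budget:,.0f} vs ${median_housing:,.0f}")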
That's kinda like saying the mid-20s were pretty scary too, minimum wage was AMOUNT and a MacBook M4 Max was $3000..
In the mid-90s my brother and I were around 14 and 10, earning nothing but a small amount of monthly pocket money. We were fighting so much over our family PC that we decided to save and put together a machine from second-hand parts we could get our hands on. We built him a 386 DX 40 or 486SX2 50 or something like that, and it was fine enough for him to play most DOS games. Heck, you could even run Linux (I know because I ran Linux in 1994 on a 386SX 25, with 5 MB RAM and 20 MB disk space).
If you fast forward just a few years though, it wasn't too bad.
You could put together a decent fully parted out machine in the late 90s and early 00s for around $600-650. These were machines good enough to get a solid 120 FPS playing Quake 3.
As others mentioned, there is plenty of refurbished and second-hand hardware out there, so there isn't any risk of finding yourself having to buy something at insane prices if your computer were to die today.
If you don't need a GPU for gaming you can get a decent computer with an i5, 16 GB of RAM and an NVMe drive for USD 50. I bought one a few weeks ago.
I agree with you on SSDs, that was the last upgrade that felt like flipping the “modern computer” switch overnight. Everything since has been incremental unless you’re doing ML or high-end gaming.
I know it's not the same. But I think a lot of people had a similar feeling going from Intel-Macbooks to Apple Silicon. An insane upgrade that I still can't believe.
This. My M1 MacBook felt like a similarly shocking upgrade -- probably not quite as much as my first SSD did, but still the only other time when I've thought, "holy sh*t, this is a whole different thing".
The M1 was great. But the jump felt particularly great because Intel MacBooks had fallen behind in performance per dollar. Great build quality, great trackpad, but if you were after performance they were not exactly the best thing to get.
To me that reads 3x, not "almost 10x". The main difference here is probably power. A desktop/server is happy to send 15 W to the SSD and hundreds of watts to the CPU, while a laptop wants the SSD running in the ~1 watt range and the CPU in the tens of watts range.
There's over twice as much content in the first test. It's around 3.8 GB/s vs 30 GB/s if you divide each folder size by its du duration. That makes it 7.9 times faster, and I'm comfortable calling that "almost 10 times".
The total size isn't what matters in this case but rather the total number of files/directories that need to be traversed (and their file sizes summed).
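For reference, the division being argued about looks like this; the underlying folder sizes and du timings aren't reproduced upthread, so the two rates below are just the figures as quoted:

    # Effective scan rate = total bytes summed / wall-clock time of `du`.
    laptop_rate  = 3.8    # GB/s, as quoted
    desktop_rate = 30.0   # GB/s, as quoted

    speedup = desktop_rate / laptop_rate
    print(f"{speedup:.1f}x")    # ~7.9x, i.e. "almost 10 times"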
The only time I had this other than changing to SSD was when I got my first multi-core system, a Q6600 (confusingly labeled a Core 2 Quad). Had a great time with that machine.
For a DDR3-era machine, you'd be buying RAM for that on Ebay, not Newegg.
I have an industrial Mini-ITX motherboard of similar vintage that I use with an i5-4570 as my Unraid machine. It doesn't natively support NVMe, but I was able to get a dual-m2 expansion card with its own splitter (no motherboard bifurcation required) and that let me get a pretty modern-feeling setup with nice fast cache disks.
Don't all RAM manufacturers offer a lifetime warranty?
That said, if the shortage gets bad enough then maybe they could find themselves in a situation where they were unable/unwilling to honor warranty claims?
You can still buy DDR4 for pretty cheap, and if you're replacing a computer that old any system built around DDR4 will still be a massive jump in performance.
A few years later but similarly - I am still running a machine built spur-of-the-moment in a single trip to Micro Center for about $500 in late 2019 (little did we know what was coming in a few months!). I made one small upgrade in probably ~2022 to a Ryzen 5800X w/ 64GB of RAM but otherwise untouched. It still flies through basically anything & does everything I need, but I'm dreading when any of the major parts go and I have to fork out double or triple the original cost for replacements...
Man, it was just GPU for a while. But same boat. I regret not getting the 4090 for $1600 direct from Nvidia. "That's too much for a video card", and got the 4080 instead. I dread the day when I need to replace it.
Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum? I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year but compared to revenue it seems silly to be worrying about a few thousand dollars per year delta.
I buy the best phones and desktops money can buy, and upgrade them often, because, why take even the tiniest risk that my old or outdated hardware slows down my revenue generation which is orders of magnitude greater than their cost to replace?
Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.
Spend at least 1% of your gross revenue on your tools used to make that revenue.
What is the actual return on that investment, though? This is self indulgence justified as « investment ». I built a pretty beefy PC in 2020 and have made a couple of upgrades since (Ryzen 5950X, 64 GB RAM, Radeon 6900 XT, a few TB of NVMe) for like $2k all-in. Less than $40/month over that time. It was a game-changing upgrade from an aging laptop for my purposes of being able to run multiple VMs and a complex dev environment, but I really don't know what I would have gotten out of replacing it every year since. It's still blazing fast.
Even recreating it entirely with newer parts every single year would have cost less than $250/mo. Honestly it would probably be negative ROI just dealing with the logistics of replacing it that many times.
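A quick sanity check on those per-month figures (the ownership period is my rough assumption):

    # Cost per month of the 2020 build vs the $5k/yr proposal upthread.
    build_cost   = 2_000            # USD, all-in including upgrades
    months_owned = 5 * 12           # assuming roughly 2020 through 2025

    print(build_cost / months_owned)   # ~$33/month, i.e. "less than $40/month"
    print(build_cost / 12)             # ~$167/month even if rebuilt from scratch every year,
                                       # still under the $250/month quoted upthread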
> This is self indulgence justified as « investment ».
Exactly that. There's zero way that level of spending is paying for itself in increased productivity, considering they'll still be 99% as productive spending something like a tenth of that.
It's their luxury spending. Fine. Just don't pretend it's something else, or tell others they ought to be doing the same, right?
My main workstation is similar, basically a top-end AM4 build. I recently bumped from a 6600 XT to a 9070 XT to get more frames in Arc Raiders, but looking at what the cost would be to go to the current-gen platform (AM5 mobo + CPU + DDR5 RAM) I find myself having very little appetite for that upgrade.
Yes? I think that's crazy. I just maxed out my new Thinkpad with 96 GB of RAM and a 4 TB SSD and even at today's prices, it still came in at just about $2k and should run smoothly for many years.
Prices are high but they're not that high, unless you're buying the really big GPUs.
Where can you buy a new Thinkpad with 96GB and 4TB SSD for $2K? Prices are looking quite a bit higher than that for the P Series, at least on Lenovo.com in the U.S. And I don't see anything other than the P Series that lets you get 96GB of RAM.
You have to configure it with the lowest-spec SSD and then replace that with an aftermarket 4 TB SSD at around $215. The P14s I bought last week, with that and the 8 GB Nvidia GPU, came to a total of $2,150 USD after taxes, including the SSD. Their sale price today is not quite as good as it was last week, but it's still in that ballpark: with the 255H CPU, the iGPU and a decent screen, you can get the Intel P14s for $2,086 USD. That actually becomes $1,976 because you get $110 taken off at checkout. Then throw in the aftermarket SSD and it'll be around $2,190. And if you log in as a business customer you'll get another couple percent off as well.
The AMD-model P14s, with 96 GB, the upgraded CPU, the nice screen and Linux, still goes for under $1,600 at checkout, which becomes about $1,815 when you add the aftermarket SSD upgrade.
It's still certainly a lot to spend on a laptop if you don't need it, but it's a far cry from $5k/year.
> Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum?
Yes. This is how we get websites and apps that don't run on a normal person's computer, because the devs never noticed their performance issues on their monster machines.
Modern computing would be a lot better if devs had to use old phones, basic computers, and poor internet connections more often.
> maybe $250/month (...) which you can then use to go and earn 100x that.
25k/month? Most people will never come close to earning that much. Most developers in the third world don't make that in a full year, but they are still affected by increases in PC part prices.
I agree with the general principle of having savings for emergencies. For a Software Engineer, that should probably include buying a good enough computer for them, in case they need a new one. But the figures themselves seem skewed towards the reality of very well-paid SV engineers.
Yes, that's an absolutely deranged opinion. Most tech jobs can be done on a $500 laptop. You realise some people don't even make your computer budget in net income every year, right?
That's a bizarrely extreme position. For almost everyone, a ~$2000-3000 PC from several years ago is indistinguishable from one they can buy now from a productivity standpoint. Nobody is talking about $25 ten-year-old smartphones. Of course, claiming that a $500 laptop is sufficient is also a severe exaggeration; a used desktop, perhaps...
Yes, you don't want to underspend on your tools to the point where you suffer. But I think you are missing the flip side. I can do my work comfortably with 32 GB RAM, but my 1%-a-year budget could get me more. So why not pocket the difference?
The goal is the right tool for the job, not the best tool you can afford.
Overspending on your tools is a misallocation of resources. An annual $22k spend on computing is around a 10-20x overspend for a wealthy individual. I'm in the $200-300k/year, self-employed, buys-my-own-shit camp, and I can't imagine spending 1% of my income on computing needs, let alone close to 10%. There is no way to make that make sense.
I agree with the general sentiment - that you shouldn't pinch pennies on tools that you use every day. But at the same time, someone who makes their money writing with a pen shouldn't need to spend thousands on pens. Once you have adequate professional-grade tools, you don't need to throw more money at the problem.
It is crazy for anyone making any amount. A $15k desktop is overkill for anything but the most demanding ML or 3D work loads, and the majority of the cost will be in GPUs or dedicated specialty hardware and software.
A developer using even the clunkiest IDE (Visual Studio - I'm still a fan and daily user, it's just the "least efficient") can get away without a dedicated graphics card, and only 32GB of ram.
Most people who use computers for the main part of their jobs literally can't spend that much if they don't want to be homeless.
Most of the rest arguably shouldn't. If you have $10k/yr in effective pay after taxes, healthcare, rent, food, transportation to your job, etc, then a $5k/yr purchase is insane, especially if you haven't built up an emergency fund yet.
Of the rest (people who can relatively easily afford it), most still probably shouldn't. Unless the net present value of your post-tax future incremental gains (raises, promotions, etc) derived from that expenditure exceeds $5k/yr, you're better off financially doing almost anything else with that cash. That's doubly true when you consider that truly amazing computers cost $2k total nowadays without substantial improvements year-to-year. Contrasting buying one of those every 2 yrs vs your proposal, you'd need a $4k/yr net expenditure to pay off by somehow making use of the incremental CPU/RAM/etc to achieve that value. If it doesn't pay off then it's just a toy you're buying for personal enjoyment, not something that you should nebulously tie to revenue generation potential with an arbitrary 1% rule. Still maybe buy it, but be honest about the reason.
So, we're left with people who can afford such a thing and whose earning potential actually does increase enough with that hardware compared to a cheaper option for it to be worth it. I'm imagining that's an extremely small set. I certainly use computers heavily for work and could drop $5k/yr without batting an eye, but I literally have no idea what I could do with that extra hardware to make it pay off. If I could spend $5k/yr on internet worth a damn I'd do that in a heartbeat (moving soon I hope, which should fix that), but the rest of my setup handily does everything I want it to.
Don't get me wrong, I've bought hardware for work before (e.g., nobody seems to want to procure Linux machines for devs even when they're working on driver code and whatnot), and it's paid off, but at the scale of $5k/yr I don't think many people do something where that would have positive ROI.
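If you want to make the break-even test above concrete, here is a minimal sketch; the discount rate, horizon and incremental-income figure are all illustrative assumptions, not anyone's real numbers:

    # Does the extra hardware spend pay for itself? Compare present values.
    def present_value(annual_amount, rate, years):
        return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

    rate, years = 0.05, 5                 # assumed discount rate and horizon
    extra_spend  = 5_000 - 2_000 / 2      # $5k/yr proposal vs a $2k machine every 2 years
    extra_income = 1_000                  # placeholder: your estimated post-tax gain per year

    pv_cost = present_value(extra_spend, rate, years)
    pv_gain = present_value(extra_income, rate, years)
    print("worth it" if pv_gain > pv_cost else "not worth it on these numbers")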
It's when you find ways to spend the minimum amount of resources in order to get the maximum return on that spend.
With computer hardware, often buying one year old hardware and/or the second best costs a tiny fraction of the cost of the bleeding edge, while providing very nearly 100% of the performance you'll utilize.
That and your employer should pay for your hardware in many cases.
One concern I'd have: since the short-term supply of RAM is essentially fixed, even if all daily computer users increased their budgets to match the new pricing and demand exceeded supply again, pricing would just increase further in response, until prices got unreasonable enough that demand fell back to supply.
I try to come at it with a pragmatic approach. If I feel pain, I upgrade and don't skimp out.
========
COMPUTER
========
I feel no pain yet.
Browsing the web is fast enough that I'm not waiting around for pages to load. I never feel bound by limited tabs or anything like that.
My Rails / Flask + background worker + Postgres + Redis + esbuild + Tailwind based web apps start in a few seconds with Docker Compose. When I make code changes, I see the results in less than 1 second in my browser. Tests run fast enough (seconds to tens of seconds) for the size of apps I develop.
Programs open very quickly. Scripts I run within WSL 2 also run quickly. There's no input delay when typing or performance related nonsense that bugs me all day. Neovim runs buttery smooth with a bunch of plugins through the Windows Terminal.
I have no lag when I'm editing 1080p videos even with a 4k display showing a very wide timeline. I also record my screen with OBS to make screencasts with a webcam and have live streamed without perceivable dropped frames, all while running programming workloads in the background.
I can mostly play the games I want, but this is by far the weakest link. If I were more into gaming I would upgrade, no doubt about it.
========
PHONE
========
I had a Pixel 4a until Google busted the battery. It ran all of the apps (no games) I care about, and Google Maps was fast. The camera was great.
I recently upgraded to a Pixel 9a because the repair center that broke my 4a in a number of ways gave me $350, and the 9a was $400 a few months ago. It also runs everything well and the camera is great. In my day to day it makes no difference from the 4a, literally none. It even has the same amount of storage, of which I have around 50% left, with around 4,500 photos saved locally.
========
ASIDE
========
I have a pretty decked out M4 MBP laptop issued by my employer for work. I use it every day and for most tasks I feel no real difference vs my machine. The only thing it does noticeably faster is heavily CPU bound tasks that can be parallelized. It also loads the web version of Slack about 250ms faster, that's the impact of a $2,500+ upgrade for general web usage.
I'm really sensitive to skips, hitches and performance-related things. For real, as long as you have a decent machine with an SSD, using a computer feels really good, even for development workloads where you're not constantly compiling something.
I don't spend money on my computers from a work or "revenue-generating" perspective because my work buys me a computer to work on. Different story if you freelance/consult ofc.
I mean, as a frontline underpaid rural IT employee with no way to move outward from where I currently live, show me where I’m gonna put $5k a year into this budget out of my barren $55k/year salary. (And, mind you - this apparently is “more” than the local average by only around $10-15k.)
I’m struggling to buy hardware already as it is, and all these prices have basically fucked me out of everything. I’m riding rigs with 8 and 16GB of RAM and I have no way to go up from here. The AI boom has basically forced me out of the entire industry at this point. I can’t get hardware to learn, subscriptions to use, anything.
A machine with 8 GB or 16 GB of RAM is absolutely usable for many software development and IT tasks, especially if you set up compressed swap to stretch it further. Of course you need to run something other than Windows or macOS. It's only very niche use cases such as media production or running local LLMs that will absolutely require more RAM.
No modern IDE either, nor a modern Linux desktop environment (they are not that much more memory efficient than macOS or Windows). Yes, you can work with not much more than a text editor. But why?
It's the "how much can the banana cost, $10?" of HN.
The point they're trying to make is a valid one - a company should be willing to spend "some money" if it saves time of the employee they're paying.
The problem is usually that the "IT Budget" is a separate portion/group of the company than the "Salary" budget, and the "solution" can be force a certain dollar amount has to be spent each year (with one year carry-forward, perhaps) so that the employees always have good access to good equipment.
(Some companies are so bad at this that a senior engineer of 10+ years will have a ten year old PoS computer, and a new intern will get a brand new M5 MacBook.)
To be fair, Samsung's divisions having guns pointed at each other is nothing new. This is the same conglomerate that makes their own chip division fight for placement in their own phones, constantly flip-flopping between using Samsung or Qualcomm chips at the high end, Samsung or Mediatek chips at the low end, or even a combination of first-party and third-party chips in different variants of ostensibly the same device.
Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.
It makes absolutely no sense to apply the lessons from one into the other.
I think what the GP was referring to was the "new" owner of Sears, who reorganized the company into dozens of independent business units in the early 2010s (IT, HR, apparel, electronics, etc). Not departments, either; full-on internal businesses intended as a microcosm of the free market.
Each of these units were then given access to an internal "market" and directed to compete with each other for funding.
The idea was likely to try and improve efficiency... But what ended up happening is siloing increased, BUs started infighting for a dwindling set of resources (beyond normal politics you'd expect at an organization that size; actively trying to fuck each other over), and cohesion decreased.
It's often pointed to as one of the reasons for their decline, and worked out so badly that it's commonly believed their owner (who also owns the company holding their debt and stands to immensely profit if they go bankrupt) desired this outcome... to the point that he got sued a few years ago by investors over the conflict of interest and, let's say "creative" organizational decisions.
This happened at a place where I worked years ago, but not as 'on purpose.' We were a large company where most pieces depended on other pieces, and everything was fine - until a new CEO came in who started holding the numbers of each BU under a microscope. This led to each department trying to bill other departments as an enterprise customer, who then retaliated, which then led to internal departments threatening to go to competitors who charged less for the same service. Kinda stupid how that all works - on paper it would make each department look better if they used a bottom barrel competitor, but in reality the company would have bled millions of dollars as a whole.
To put a finer point on it, it wasn't just competition or rewarding-the-successful; the CEO straight up set them at odds with each other and told them directly to battle it out.
Basically "coffee is for closers... and if you don't sell you're fired" as a large-scale corporate policy.
That was a bullshit separation of a single horizontal cut of the market (all of those segments did consumer retail sales) without overlap.
The part about no overlaps already made it impossible for them to compete. The only "competition" they had was in the sense of TV gameshow competition where candidates do worthless tasks, judged by some arbitrary rules.
That has absolutely no similarity to how Samsung is organized.
> Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.
Sears was hardly horizontal. It was also Allstate insurance and Discover credit cards, among other things.
Ok. And if it had divided along the borders of insurance and payment services, the reorganization wouldn't have been complete bullshit and might even have been somewhat successful.
It's a forcing function that ensures the middle layers of a vertically integrated stack remain market-competitive and don't stagnate because they are the default/only option.
You mean Toyota putting a BMW engine in the Supra. Your statement is contradictory, as Toyota has TRD, which focuses on track performance. They just couldn't keep up with the straight-six's performance and reliability compared to their own 2JZ.
Buying a Supra is stupid. Either buy a proper BMW with the B58/ZF 8-speed and get a proper interior, or stop being poor and buy an LC500.
Better yet, get a C8 Corvette and gap all of the above for far better value. You can get 20% off MSRP on factory orders of C8 Corvettes if you know where to look.
Not sure that the opposite of transfer pricing is nepotism. As far as I know, it's far more common for someone who owns a lake house to assign four weeks a year to each grandkid than to make them bid real money on it and put that in a maintenance fund or something. Though it's an interesting idea, it's not very family friendly.
They operate with tension. They're supposed to have unified strategic direction from the top, but individual subsidiaries are also expected to be profit centers that compete in the market.
I worked with some supply chain consultants who mentioned "internal suppliers are often worse suppliers than external".
Their point was that service levels are often not as stringently tracked and SLAs become internal money shuffling, but the company as a whole pays the price in lower output/profit. The internal partner being the default allows an amount of complacency, and if you shopped around for a comparable level of service to what's being provided, you could often find it for a better price.
Basically every Galaxy phone comes in two versions, one with Exynos and one with Snapdragon. It's regional, though: the US always gets the Snapdragon phones, while Europe and most of Asia get the Exynos version.
My understanding is that the Exynos is inferior in a lot of ways, but also cheaper.
In the past using Snapdragon CPUs for the U.S. made sense due to Qualcomm having much better support for the CDMA frequencies needed by Verizon. Probably no longer relevant since the 5G transition though.
Not one phone, they did this all over the place. Their flagship line did this starting with the Galaxy S7 all the way up to Galaxy S24. Only the most recent Galaxy S25 is Qualcomm Snapdragon only, supposedly because their own Exynos couldn't hit volume production fast enough.
"Galaxy S II" and its aesthetics was already a mere branding shared across at least four different phones with different SoCs, before counting in sub-variants that share same SoCs. This isn't unique to Samsung, nor is it a new phenomenon, just how consumer products are made and sold.
The S23 too was Snapdragon only, allegedly to let the Exynos team catch some breath and come up with something competitive for the following generation. Which they partly did, as the Exynos S24 is almost on par with its Snapdragon brother. A bit worse on photo and gaming performance, a bit better in web browsing, from the benchmarks I remember.
"The price of eggs has nothing on the price of computer memory right now.". A dozen eggs went to ~$5. They are eggs and most people use what, max 12 eggs a month? Get out of here with that trite garbage. Everyone knew that the egg shortage was due to the extreme step the US does of culling herds infected with avian flu and that they were transitory.
Maybe if you include all the eggs in processed food like cookies or cakes and in restaurants or other catering operations you reach that number? And eggs consumed at home could still be around 12 per person?
There was also a lot of profiteering going on? This was talked about quite a bit? And it's still going on in other markets with other things like cars??
I really wanted to build a new PC this year, which is obviously not happening anymore. But I do have 2x16GB DDR5 SODIMMs from my laptop that I'm not using, after I upgraded to 64GB a while back. Now I wonder if I can build a tiny PC around those? Does anyone make motherboards that support DDR5 laptop memory?
A bunch of the NUC models use laptop RAM, and they often have barebones kits. It looks like ASUS has a decent range of kits and prebuilts, but you may be able to find bare boards too. If you want something expandable, look for the "Pro" and "Extreme" ranges. I had one of the first gaming-oriented NUCs a while back, Hades Canyon; highly capable.
Apple is going to be even more profitable in the consumer space because of RAM prices? I feel like they are the only player with the supply chain locked down enough to not get caught off guard, good prices locked in far enough in advance, and suppliers not willing to antagonize such a big customer by backing out of a deal.
They used to, but they've caught up. The flagship iPhone 17 has 12GB RAM, the same as the Galaxy S25. Only the most expensive Z Fold has more, with 16GB.
RAM pricing segmentation makes Apple a lot of money, but I think they scared themselves when AI took off and they had millions of 4GB and 8GB products out in the world. The Mac minimum RAM specs have gone up too, they're trying to get out of the hole they dug.
Tim Cook is the Supply Chain Guy. He has been for decades, before he ever worked at Apple. He does everything he can to make sure that Apple directly controls as much of the supply chain as possible, and uses the full extent of their influence to get favorable long-term deals on what they don't make themselves.
In the past this has resulted in stuff like Samsung Display sending their best displays to Apple instead of Samsung Mobile.
I had planned to build a new workstation this fall; all the parts were in the list. But seeing the RAM go from 300€ (96 GB) to 820€, or 999€ for what's actually in stock, in under a month made me decide that I will continue using that laptop from 2019 for maybe another 1.5 years.
It's a ridiculous situation and these companies, whoever they are, should be somewhat ashamed of themselves for the situation they're putting us in.
That goes specially for those MF at OpenAI who apparently grabbed 40% of the worldwide DRAM production, as well as those sold in stores.
Based on my time working for Samsung this does not surprise me. The silos within fight against one another more than they ever bother to compete with anyone else
It is absolutely the worst time to be a gamer. First it was GPU prices that went up as NVIDIA started to focus more and more on their enterprise cards, and now it's RAM prices. I don't think I've ever seen the price of computer components go up so much.
When RAM gets so expensive that even Samsung won’t buy Samsung from Samsung, you know the market has officially entered comic mode. At this rate their next quarterly report is just going to be one division sending the other an IOU.
Overleverage / debt, and refusing to sell at a certain price, are actually very different things though. OpenAI might be a tire fire, but Samsung is the gold pan seller here, and presumably has an excellent balance sheet.
In their defense, how many $20 billion fabs do you want to build in response to the AI ... (revolution|bubble|other words)? It seems very, very difficult to predict how long DRAM demand will remain this elevated.
It's dangerous for them in both directions: Overbuilding capacity if the boom busts vs. leaving themselves vulnerable to a competitor who builds out if the boom is sustained. Glad I don't have to make that decision. :)
I'd take the opposite bet on this. They're diverting wafer capacity from lower-profit items to things like HBM, but all indications are that wafer starts are up a bit. Just not up enough.
"Sequentially, DRAM revenue increased 15% with bit shipments increasing over 20% and prices decreasing in the low single-digit percentage range, primarily due to a higher consumer-oriented revenue mix"
(from June of this year).
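Those three figures hang together arithmetically, since revenue growth is approximately bit growth times price change; the exact price decline isn't stated, so the -4% below is an assumption:

    # Sequential revenue change ~= (1 + bit shipment change) * (1 + price change) - 1
    bit_growth   = 0.20     # "bit shipments increasing over 20%"
    price_change = -0.04    # "low single-digit" decline, assumed -4%

    revenue_change = (1 + bit_growth) * (1 + price_change) - 1
    print(f"{revenue_change:.0%}")    # ~15%, matching the quoted sequential increase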
The problem is that the DRAM market is pretty tight - supply or demand shocks tend to produce big swings. And right now we're seeing both an expected supply shock (transition to new processes/products) as well as a very sudden demand shock.
Most of the things people say about efficient markets assume low barriers to entry. When it takes years and tens of billions of dollars to add capacity, it makes more sense to sit back and enjoy the margins. Especially if you think there's a non-trivial possibility that the AI build out is a bubble.
If it’s an AI bubble, it would be stupid to open new manufacturing capacity right now. Spend years and billions spinning up a new fab, only to have the bottom of the market drop out as soon as it comes online.
I feel we have a RAM price surge every four years. The excuses change, but it's always when we see a generation switch to the next gen of DDR. Which makes me believe it's not AI, or graphics cards, or crypto, or gaming, or one of the billion other conceivable reasons, but price-gouging when new standards emerge and production capacity is still limited. Which would be much harder to justify than 'the AI/Crypto/Gaming folks (who no-one likes) are sweeping the market...'
But we're not currently switching to a next gen of DDR. DDR5 has been around for several years, DDR6 won't be here before 2027. We're right in the middle of DDR5's life cycle.
That is not to say there is no price-fixing going on, just that I really can't see a correlation with DDR generations.
Regardless of whether it is Crypto/AI/etc., this would seem to be wake-up call #2. We're finding the strangle-points in our "economy"—will we do anything about it? A single fab in Phoenix would seem inadequate?
If 'the West' were half as smart as it claims to be, there would be many more fabs in friendly territory. Stick a couple in Australia and NZ too for good measure; it is just too critical of a resource now.
I suspect there will be a shortage of something else then…
And regardless, you could flip it around and ask, what will we do in x years when the next shortage comes along and we have no fabs? (And that shortage of course could well be an imposed one from an unfriendly nation.)
It's a political problem: do we, the people, have a choice in what gets prioritized? I think it's clear that the majority of people don't give a damn about minor improvements in AI and would rather have a better computer, smartphone, or something else for their daily lives than fuel the follies of OpenAI and its competitors. At worst, they can build more fabs simultaneously to have the necessary production for AI within a few years, but reallocating it right now is detrimental and nobody wants that, except for a few members of the crazy elite like Sam Altman or Elon Musk.
Why is this downvoted? This is not the first time I've heard that opinion expressed, and every time it happens there is more evidence that maybe there is something to it. I've been following the DRAM market since the 4164 was the hot new thing and it cost - not kidding - $300 for 8 of them, which would give you all of 64K of RAM. Over the years I've seen the price surge multiple times, and usually there was some kind of hard-to-verify reason attached to it, from flooded factories to problems with new nodes and a whole slew of other issues.
RAM being a staple of the computing industry, you have to wonder if there aren't people cleaning up on this; it would be super easy to create an artificial shortage given the low number of players in this market. In contrast, the price of, say, gasoline has been remarkably steady, with one notable outlier with a very easy-to-verify and direct cause.
A few hours ago I looked at the RAM prices. I bought some DDR4, 32GB only, about a year or two ago. I kid you not - the local price here is now 2.5 times what it was back in 2023 or so, give or take.
This is important to point out. All the talk about AI companies underpricing is mistaken. The costs to consumers have just been externalized; the AI venture as a whole is so large that it simply distorts other markets in order to keep its economic reality intact. See also: the people whose electric bills have jumped due to increased demand from data centers.
Americans are subsidizing AI by paying more for their electricity so that the rest of the world can use ChatGPT (I'm not counting the data centers of Chinese models and a few European ones, though).
And even more outrageous is the power grid upgrades they are demanding.
If they need the power grid upgraded to handle the load for their data centers, they should pay 100% of the cost for EVERY part of every upgrade needed for the whole grid, just as a new building typically pays to upgrade the town road accessing it.
Making ordinary ratepayers pay even a cent for their upgrades is outrageous. I do not know why the regulators even allow it (yeah, we all do, but it is wrong).
I bought 2x16 (32GB) DDR4 in June for $50. It is now ~$150.
I'm kicking myself for not buying the mini PC that I was looking at over the summer. The cost nearly doubled from what it was then.
My state keeps trying to add Data Centers in residential areas, but the public seems to be very against it. It will succeed somewhere and I'm sure that there will be a fee on my electric bill for "modernization" or some other bullshit.
The problem is further upstream. Capitalism is nice in theory, but...
"The trouble with capitalism is capitalists; they're too damn greedy." - Herbert Hoover, U.S. President, 1929-1933
And the past half-century has seen both enormous reductions in the regulations enacted in Hoover's era (when out-of-control financial markets and capitalism resulted in the https://en.wikipedia.org/wiki/Great_Depression), and the growth of a class of grimly narcissistic/sociopathic techno-billionaires - who control way too many resources, and seem to share some techno-dystopian fever dream that the first one of them to grasp the https://en.wikipedia.org/wiki/Artificial_general_intelligenc... trophy will somehow become the God-Emperor of Earth.
OptiPlexes used to be my go-to for SFF. I had a 1050 Ti in there, nothing crazy, but it worked for basic gaming.
Surely these will soon be scavenged for RAM? Arbitrage opportunity?
If they're DDR4 (or even DDR3), they have no value to e.g. OpenAI, so it shouldn't really matter.
The price of DDR4 is also going up!
> I've never been more fearful of components breaking than current day.
The mid 90s was pretty scary too. Minimum wage was $4.25 and a new Pentium 133 was $935 in bulk.
If you were on minimum wage in the 90s, your livelihood likely didn't rely on Pentium processors.
Also, it is frightening how close that is to the current-day minimum wage.
I was an unemployed student then -- a generous family member gifted me my first Windows PC, and it cost about the same as a used car.
1990-1997 averaged >4% yearly compounded minimum wage hikes, which is probably about where it should have been. The late 90s to today has been <1.25%.
Yep, I had a Cyrix processor in mine during that time. Slackware didn't care.
It also worked as a very good space heater.
>You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.
Holy moly! 11 whole dollars an hour!?
Okay, so we went from $4.25 to $11.00. That's a 159% change. Awesome!
Now, lets look at... School, perhaps? So I can maybe skill-up out of Little Caesars and start building a slightly more comfortable life.
Median in-state tuition in 1995: $2,681. Median in-state tuition in 2025: $11,610. Wait a second! That's a 333% change. Uh oh.
Should we do the same calculation with housing...? Sure, I love making myself more depressed. 1995: $114,600. 2025: $522,200. 356% change. Fuck.
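Spelling out the percentage changes above (same arithmetic, figures as quoted):

    def pct_change(old, new):
        return (new - old) / old * 100

    print(f"Minimum wage: {pct_change(4.25, 11.00):.0f}%")       # ~159%
    print(f"Tuition:      {pct_change(2_681, 11_610):.0f}%")     # ~333%
    print(f"House price:  {pct_change(114_600, 522_200):.0f}%")  # ~356%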
> That's kinda like saying the mid-20s were pretty scary too, minimum wage was AMOUNT and a MacBook M4 Max was $3000..
A PowerBook 5300 was $6500 in 1995, which is $13,853 today.
> The mid 90s was pretty scary too.
Are you sure? From what I can tell it's more like 500 USD RRP on release, boxed.
Either way, it was the 90s: two years later that was a budget CPU because the top end was two to three times the speed.
I've had this with gen5 PCIe SSDs recently. My T710 is so fast it's hard to believe. But you need to have a lot of data to make it worth it.
Example: [du output omitted]
And on my laptop that has a gen3, lower-spec NVMe, it's almost 10 times faster. The CPU must have something to do with it too, but they're both Ryzen 9.
I believe you, but your benchmark is not very useful. I get this on two 5400rpm 3 TB HDDs in a mirror: [du output omitted]
Simply because there are fewer than 20 directories and the files are large.
Oops, I missed the size diff. That's a solid 8x. That's cool!
This and high resolution displays, for me at least.
I am still running an i5-4690K; really all I need is a better GPU, but those prices are criminal. I wish I'd gotten a 4090 when I had the chance, RIP.
You can still get brand new generic motherboards for old CPUs.
I swapped out old ASUS MBs for an i3-540 and an Athlon II X4 with brand new motherboards.
They are quite a bit cheaper than getting a new kit, so I guess that's the market they cater to: people who don't need an upgrade but whose motherboards gave out.
You can get these for US$20-US$30.
I've never heard of a lifetime warranty on anything in the enterprise space. Maybe consumer stuff, where it's just a marketing gimmick.
Oh, your RAM died? That means its lifetime ended at that moment, and so did the lifetime warranty. Is there anything else we can help you with today?
I got a used M1 MacBook Air a year ago.
By far the fastest computer I’ve ever used. It felt like the SSD leap of years earlier.
The Radeon RX 9070 XT performs at a similar level to the RTX 5070, and is retailing around $600 right now.
No CUDA means not an option for me.
> What kinds of applications do you use that require CUDA?
Molecular dynamics simulations, and related structural bio tasks.
What kinds of applications do you use that require CUDA?
You could still easily build a $800-$900 system that would dramatically jump forward from that machine.
$700 in 2014 is now $971 inflation adjusted (BLS calculator).
RTX 3060 12 GB: $180 (eBay). Sub-$200 CPU (~5-7 times faster than yours). 16 GB DDR4: $100-$120. PSU: $90. Motherboard: $100. WD Black 1 TB SSD: $120. Roughly $800 (which, inflation adjusted, beats your $700).
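Tallying that parts list against the inflation-adjusted budget (prices as quoted above; I've taken the CPU at $200 and the DDR4 at the top of its range):

    # Parts list from the comment above, prices as quoted.
    parts = {
        "RTX 3060 12 GB (eBay)": 180,
        "sub-$200 CPU":          200,
        "16 GB DDR4":            120,
        "PSU":                    90,
        "Motherboard":           100,
        "WD Black 1 TB SSD":     120,
    }
    total = sum(parts.values())
    adjusted_2014_budget = 971          # the $700 2014 build in today's dollars, per the BLS figure quoted above

    print(total)                        # 810, roughly the $800 quoted
    print(total <= adjusted_2014_budget)   # True: under the inflation-adjusted budget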
Right now is a rather amazing time for CPUs, even though RAM prices have gone crazy.
Assume you find some deals somewhere in there, you could do slightly better with either pricing or components.
GPU prices are actually at MSRP now for most cards other than the 5090.
Every hardware update for me involves hours or sometimes days of faffing with drivers and config and working round new bugs.
Nobody is paying for that time.
And whilst it is 'training', my training time is better spent elsewhere than battling with why CUDA won't work on my GPU upgrade.
Therefore, I avoid hardware and software changes merely because a tiny bit more speed isn't worth the hours I'll put in.
>I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year
>I buy the best phones and desktops money can buy
Sick man! Awesome, you spend 1/3 of the median US salary on a laptop and desktop every year. That's super fucking cool! Love that for you.
Anyways, please go brag somewhere else. You're rich, you shouldn't need extra validation from an online forum.
This is a crazy out of touch perspective.
Depending on salary, 2 magnitudes at $5k is $500k.
That amount of money for the vast majority of humans across the planet is unfathomable.
No one is worried about if the top 5% can afford DRAM. Literally zero people.
Typing this on a similar-spec P16s that was around 2.6k or so. So if you call anything under 3k simply 2k, then it was 2k.
That's in Germany, from a corporate supplier.
>Most developers in the third world don't make that in a full year
And many in the first world haha
> But the figures themselves seem skewed towards the reality of very well-paid SV engineers.
The soon-to-be-unemployed SV engineers, when LLMs mean anyone can design an app and backend with no coding knowledge.
Yes, that's an absolutely deranged opinion. Most tech jobs can be done on a $500 laptop. You realise some people don't even make your computer budget in net income every year, right?
Most tech jobs could be done on a $25 ten year old smartphone with a cracked screen and bulging battery.
That’s exactly my point. Underspending on your tools is a misallocation of resources.
That's a bizarrely extreme position. For almost everyone, a ~$2000-3000 PC from several years ago is indistinguishable from one they can buy now from a productivity standpoint. Nobody is talking about $25 ten-year-old smartphones. Of course, claiming that a $500 laptop is sufficient is also a severe exaggeration; a used desktop, perhaps...
Yes, you don't want to underspend on your tools to the point where you suffer. But I think you are missing the flip side: I can do my work comfortably with 32GB of RAM, and my 1%-a-year budget could get me more, but why not pocket the difference?
The goal is the right tool for the job, not the best tool you can afford.
Overspending on your tools is a misallocation of resources. An annual $22k spend on computing is around a 10-20x overspend even for a wealthy individual. I'm in the $200-300k/year, self-employed, buys-my-own-shit camp, and I can't imagine spending 1% of my income on computing needs, let alone close to 10%. There is no way to make that make sense.
I agree with the general sentiment - that you shouldn't pinch pennies on tools that you use every day. But at the same time, someone who makes their money writing with a pen shouldn't need to spend thousands on pens. Once you have adequate professional-grade tools, you don't need to throw more money at the problem.
That's crazy spend for anyone making sub 100K
It is crazy for anyone making any amount. A $15k desktop is overkill for anything but the most demanding ML or 3D work loads, and the majority of the cost will be in GPUs or dedicated specialty hardware and software.
A developer using even the clunkiest IDE (Visual Studio - I'm still a fan and daily user, it's just the "least efficient") can get away without a dedicated graphics card and with only 32GB of RAM.
That's a crazy spend for sub-200k or even sub-500k.
You're just building a gaming rig with a flimsy work-related justification.
Most people who use computers for the main part of their jobs literally can't spend that much if they don't want to be homeless.
Most of the rest arguably shouldn't. If you have $10k/yr in effective pay after taxes, healthcare, rent, food, transportation to your job, etc, then a $5k/yr purchase is insane, especially if you haven't built up an emergency fund yet.
Of the rest (people who can relatively easily afford it), most still probably shouldn't. Unless the net present value of your post-tax future incremental gains (raises, promotions, etc.) derived from that expenditure exceeds $5k/yr, you're better off financially doing almost anything else with that cash. That's doubly true when you consider that truly amazing computers cost $2k total nowadays without substantial improvements year-to-year. Contrasting buying one of those every two years with your proposal, you'd need a $4k/yr net expenditure to pay off, somehow making use of the incremental CPU/RAM/etc to achieve that value (roughly the arithmetic in the sketch below). If it doesn't pay off then it's just a toy you're buying for personal enjoyment, not something that you should nebulously tie to revenue-generation potential with an arbitrary 1% rule. Still, maybe buy it, but be honest about the reason.
So, we're left with people who can afford such a thing and whose earning potential actually does increase enough with that hardware compared to a cheaper option for it to be worth it. I'm imagining that's an extremely small set. I certainly use computers heavily for work and could drop $5k/yr without batting an eye, but I literally have no idea what I could do with that extra hardware to make it pay off. If I could spend $5k/yr on internet worth a damn I'd do that in a heartbeat (moving soon I hope, which should fix that), but the rest of my setup handily does everything I want it to.
Don't get me wrong, I've bought hardware for work before (e.g., nobody seems to want to procure Linux machines for devs even when they're working on driver code and whatnot), and it's paid off, but at the scale of $5k/yr I don't think many people do something where that would have positive ROI.
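To put rough numbers on that break-even point (all figures hypothetical, including the 30% marginal tax rate; this just restates the $5k/yr vs. $2k-every-two-years comparison above as a sketch):

    # Back-of-the-envelope comparison of two upgrade strategies.
    # All figures are hypothetical and only illustrate the argument above.

    def annual_cost(price: float, replacement_years: float) -> float:
        """Average yearly spend if you buy a machine at this price every N years."""
        return price / replacement_years

    lavish = annual_cost(5_000, 1)   # the proposed $5k-every-year budget
    frugal = annual_cost(2_000, 2)   # a "truly amazing" $2k machine kept for two years

    extra_spend = lavish - frugal    # the $4,000/yr of extra outlay quoted above
    print(f"Extra spend: ${extra_spend:,.0f}/yr")

    # For the lavish plan to beat the frugal one financially, the post-tax value of
    # raises/promotions you can attribute to the faster hardware has to exceed that
    # extra spend. At an assumed (hypothetical) 30% marginal tax rate:
    marginal_tax = 0.30
    breakeven_pretax_gain = extra_spend / (1 - marginal_tax)
    print(f"Break-even pre-tax gain: about ${breakeven_pretax_gain:,.0f}/yr")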
Have you ever heard of the term "efficiency"?
It's when you find ways to spend the minimum amount of resources in order to get the maximum return on that spend.
With computer hardware, often buying one year old hardware and/or the second best costs a tiny fraction of the cost of the bleeding edge, while providing very nearly 100% of the performance you'll utilize.
That and your employer should pay for your hardware in many cases.
One concern I'd have: if the short-term supply of RAM is fixed anyway, then even if all daily computer users increased their budgets to match the new pricing, demand would exceed supply again and prices would just keep climbing until they get unreasonable enough that demand drops back to supply.
I try to come at it with a pragmatic approach. If I feel pain, I upgrade and don't skimp out.
======== COMPUTER ========
I feel no pain yet.
Browsing the web is fast enough where I'm not waiting around for pages to load. I never feel bound by limited tabs or anything like that.
My Rails / Flask + background worker + Postgres + Redis + esbuild + Tailwind based web apps start in a few seconds with Docker Compose. When I make code changes, I see the results in less than 1 second in my browser. Tests run fast enough (seconds to tens of seconds) for the size of apps I develop.
Programs open very quickly. Scripts I run within WSL 2 also run quickly. There's no input delay when typing or performance related nonsense that bugs me all day. Neovim runs buttery smooth with a bunch of plugins through the Windows Terminal.
I have no lag when I'm editing 1080p videos even with a 4k display showing a very wide timeline. I also record my screen with OBS to make screencasts with a webcam and have live streamed without perceivable dropped frames, all while running programming workloads in the background.
I can mostly play the games I want, but this is by far the weakest link. If I were more into gaming I would upgrade, no doubt about it.
======== PHONE ========
I had a Pixel 4a until Google busted the battery. It ran all of the apps (no games) I care about, Google Maps was fast, and the camera was great.
I recently upgraded to a Pixel 9a because the repair center that broke my 4a in a number of ways gave me $350, and the 9a was $400 a few months ago. It also runs everything well and the camera is great. In my day-to-day it makes no difference from the 4a, literally none. It even has the same amount of storage, of which I have around 50% left with around 4,500 photos saved locally.
======== ASIDE ========
I have a pretty decked-out M4 MBP laptop issued by my employer for work. I use it every day, and for most tasks I feel no real difference vs my machine. The only thing it does noticeably faster is heavily CPU-bound tasks that can be parallelized. It also loads the web version of Slack about 250ms faster; that's the impact of a $2,500+ upgrade for general web usage.
I'm really sensitive to skips, hitches and performance-related things. For real, as long as you have a decent machine with an SSD, using a computer feels really good, even for development workloads where you're not constantly compiling something.
I don't spend money on my computers from a work or "revenue-generating" perspective because my work buys me a computer to work on. Different story if you freelance/consult ofc.
are you paid by the FLOP?
I mean, as a frontline underpaid rural IT employee with no way to move outward from where I currently live, show me where I’m gonna put $5k a year into this budget out of my barren $55k/year salary. (And, mind you - this apparently is “more” than the local average by only around $10-15k.)
I’m struggling to buy hardware already as it is, and all these prices have basically fucked me out of everything. I’m riding rigs with 8 and 16GB of RAM and I have no way to go up from here. The AI boom has basically forced me out of the entire industry at this point. I can’t get hardware to learn, subscriptions to use, anything.
Big Tech has made it unaffordable for everyone.
A machine with 8GB or 16GB of RAM is absolutely usable for many software development and IT tasks, especially if you set up compressed swap to stretch it further. Of course, you need to run something other than Windows or macOS. It's only very niche use cases, such as media production or running local LLMs, that absolutely require more RAM.
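For what it's worth, here's a minimal sketch of checking what compressed swap is actually buying you. It assumes a Linux box with a zram device already configured at zram0 (e.g. via zram-generator or a zramswap package), which is an assumption on my part; the sysfs paths are the standard ones:

    # Minimal sketch: report how much a zram compressed-swap device is saving.
    # Assumes Linux with an existing /sys/block/zram0 (not guaranteed on every distro).
    from pathlib import Path

    MM_STAT = Path("/sys/block/zram0/mm_stat")

    def zram_report() -> None:
        if not MM_STAT.exists():
            print("No zram0 device found; compressed swap is not set up.")
            return
        # First three mm_stat fields: orig_data_size, compr_data_size, mem_used_total (bytes)
        orig, compressed, mem_used = (int(x) for x in MM_STAT.read_text().split()[:3])
        ratio = orig / compressed if compressed else 0.0
        saved_mib = (orig - mem_used) / 2**20
        print(f"{orig / 2**20:.0f} MiB of swapped pages held in {mem_used / 2**20:.0f} MiB of RAM "
              f"(~{ratio:.1f}x compression, ~{saved_mib:.0f} MiB effectively reclaimed).")

    if __name__ == "__main__":
        zram_report()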
> something other than Windows or macOS
> 8GB
No modern IDE either, nor a modern Linux desktop environment (they are not that much more memory-efficient than macOS or Windows). Yes, you can work with not much more than a text editor. But why?
The bright side is that the bust is going to leave a glut of cheap used parts.
You’re an employee. You don’t use any of your computers to make your revenue; your employer provides them. Obviously I wasn’t speaking about you.
Oh. I’m not allowed to own a home computer to try to further my own learning and education and knowledge then.
Guess I’ll go fuck myself now then.
They're just using this comment section to brag about how well off they are, I wouldn't worry too much. They're completely out of touch.
It's the "how much can the banana cost, $10?" of HN.
The point they're trying to make is a valid one - a company should be willing to spend "some money" if it saves time of the employee they're paying.
The problem is usually that the "IT budget" is a separate portion/group of the company from the "salary" budget, and the "solution" can be to force a certain dollar amount to be spent each year (with one year of carry-forward, perhaps) so that employees always have access to good equipment.
(Some companies are so bad at this that a senior engineer of 10+ years will have a ten year old PoS computer, and a new intern will get a brand new M5 MacBook.)
To be fair, Samsung's divisions having guns pointed at each other is nothing new. This is the same conglomerate that makes their own chip division fight for placement in their own phones, constantly flip-flopping between using Samsung or Qualcomm chips at the high end, Samsung or Mediatek chips at the low end, or even a combination of first-party and third-party chips in different variants of ostensibly the same device.
To be honest, this actually sounds kinda healthy.
Sears would like to have a word about how healthy intra-company competition is.
Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.
It makes absolutely no sense to apply the lessons from one to the other.
I think what the GP was referring to was the "new" owner of Sears, who reorganized the company into dozens of independent business units in the early 2010s (IT, HR, apparel, electronics, etc). Not departments, either; full-on internal businesses intended as a microcosm of the free market.
Each of these units were then given access to an internal "market" and directed to compete with each other for funding.
The idea was likely to try and improve efficiency... But what ended up happening is siloing increased, BUs started infighting for a dwindling set of resources (beyond normal politics you'd expect at an organization that size; actively trying to fuck each other over), and cohesion decreased.
It's often pointed to as one of the reasons for their decline, and worked out so badly that it's commonly believed their owner (who also owns the company holding their debt and stands to immensely profit if they go bankrupt) desired this outcome... to the point that he got sued a few years ago by investors over the conflict of interest and, let's say "creative" organizational decisions.
This happened at a place where I worked years ago, though not as deliberately. We were a large company where most pieces depended on other pieces, and everything was fine - until a new CEO came in who started holding each BU's numbers under a microscope. This led to each department trying to bill other departments as if they were enterprise customers, who then retaliated, which led to internal departments threatening to go to outside competitors who charged less for the same service. Kinda stupid how that all works - on paper it would make each department look better if they used a bottom-of-the-barrel competitor, but in reality the company would have bled millions of dollars as a whole.
to put a finer point on it, it wasn't just competition or rewarding-the-successful, the CEO straight up set them at odds with each other and told them directly to battle it out.
basically "coffee is for closers... and if you don't sell you're fired" as a large scale corporate policy.
Yes, this is what I was referring to. I should have provided more context, thanks for doing so.
That was a bullshit separation of a single horizontal cut of the market (all of those segments did consumer retail sales) without overlap.
The part about no overlaps already made it impossible for them to compete. The only "competition" they had was in the sense of a TV game-show competition where contestants do worthless tasks, judged by arbitrary rules.
That has absolutely no similarity to how Samsung is organized.
> Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.
Sears was hardly horizontal. It was also Allstate insurance and Discover credit cards, among other things.
Ok. And if it had divided along the borders of insurance and payment services, the reorganization wouldn't have been complete bullshit and might even have been somewhat successful.
Nokia too
It's a forcing function that ensures the middle layers of a vertically integrated stack remain market competitive and don't stagnate because they are the default/only option
Yeah, makes absolute sense.
A bit like Toyota putting a GM engine in their car because the Toyota engine division is too self-centered, focusing too much on efficiency.
You mean Toyota putting a BMW engine in the Supra. Your statement is contradictory, since Toyota has TRD, which focuses on track performance. They just couldn't match the straight-six's performance and reliability compared to their own 2JZ.
Buying a Supra is stupid. Either buy a proper BMW with the B58/ZF 8-speed and get a proper interior, or stop being poor and buy an LC500.
Better yet, get a C8 Corvette and gap all of the above for far better value. You can get 20% off MSRP on factory orders of C8 Corvettes if you know where to look.
The opposite, nepotism, is very unhealthy, so I think you're correct.
Not sure that the opposite of transfer pricing is nepotism. As far as I know, it's far more common for someone who owns a lake house to assign four weeks a year to each grandkid than to make them bid real money on it and put that in a maintenance fund or something. Though it's an interesting idea, it's not very family-friendly.
I genuinely can't tell if this is sarcasm? Or do you live somewhere where this is taught?
Isn't this how South Korean chaebols work?
They operate with tension. They're supposed to have unified strategic direction from the top, but individual subsidiaries are also expected to be profit centers that compete in the market.
I worked with some supply chain consultants who mentioned "internal suppliers are often worse suppliers than external".
Their point was that service levels are often not as stringently tracked and SLAs become internal money shuffling, but the company as a whole pays the price in lower output/profit. The internal partner being the default allows a degree of complacency, and if you shop around for a comparable level of service to what's being provided, you can often find it for a better price.
That’s really good business. Everyone is pushing to be the best rather than accepting mediocrity.
> two versions of the same phone with different processors
That's hilarious, which phone is this?
Basically every Galaxy phone comes in two versions, one with Exynos and one with Snapdragon. It's regional, though: the US always gets the Snapdragon phones, while Europe and most of Asia get the Exynos version.
My understanding is that the Exynos is inferior in a lot of ways, but also cheaper.
In the past, using Snapdragon chips for the U.S. made sense because Qualcomm had much better support for the CDMA networks used by Verizon. That's probably no longer relevant since the 5G transition, though.
I might be out of date, but last I knew, it was "most of them."
International models tended to use Samsung's Exynos processors, while the ones for the North American market used Snapdragons or whatever.
Not one phone, they did this all over the place. Their flagship line did this starting with the Galaxy S7 all the way up to Galaxy S24. Only the most recent Galaxy S25 is Qualcomm Snapdragon only, supposedly because their own Exynos couldn't hit volume production fast enough.
"Galaxy S II" and its aesthetics was already a mere branding shared across at least four different phones with different SoCs, before counting in sub-variants that share same SoCs. This isn't unique to Samsung, nor is it a new phenomenon, just how consumer products are made and sold.
[1]: https://en.wikipedia.org/wiki/Samsung_Galaxy_S_II
The S23 too was Snapdragon only, allegedly to let the Exynos team catch some breath and come up with something competitive for the following generation. Which they partly did, as the Exynos S24 is almost on par with its Snapdragon brother. A bit worse on photo and gaming performance, a bit better in web browsing, from the benchmarks I remember.
The S23 was also Snapdragon-only as far as I know[1]. The S24 had the dual chips again, while as you say S25 is Qualcomm only once more.
[1]: https://www.androidauthority.com/samsung-exynos-versus-snapd...
This is the case as recently as the S24: phones can come with Exynos or Snapdragon, with Exynos usually featuring worse performance and battery life.
Several high end Galaxy S's AFAIK.
"The price of eggs has nothing on the price of computer memory right now.". A dozen eggs went to ~$5. They are eggs and most people use what, max 12 eggs a month? Get out of here with that trite garbage. Everyone knew that the egg shortage was due to the extreme step the US does of culling herds infected with avian flu and that they were transitory.
The average person buys, what, 0 ram per month? Which cares.
Surprisingly, Americans apparently average 279 eggs per person per year, or roughly 23 per month.
https://www.washingtonpost.com/business/2019/02/28/why-ameri...
(This is not a comment making any judgements about cost or the state of the economy, I was just surprised to find it that high)
Maybe if you include all the eggs in processed food like cookies or cakes and in restaurants or other catering operations you reach that number? And eggs consumed at home could still be around 12 per person?
cuz eggs are in breakfast sandwiches, are ingredients in pastries, act as binders in things like meatloaf or fried chicken, etc. etc.
There was also a lot of profiteering going on? This was talked about quite a bit? And it's still going on in other markets with other things like cars??
I really wanted to build a new PC this year, which is obviously not happening anymore. But I do have 2x16GB DDR5 SODIMMs from my laptop that I'm not using, after I upgraded to 64GB a while back. Now I wonder if I can build a tiny PC around those? Does anyone make motherboards that support DDR5 laptop memory?
A bunch of the NUC models use laptop RAM and often have barebones kits. Looks like ASUS has a decent range of kits and prebuilts, but you may be able to find bare boards as well. If you want something expandable, look for the "Pro" and "Extreme" ranges. I had one of the first gaming-oriented NUCs a while back, Hades Canyon; highly capable.
https://www.asrock.com/mb/AMD/X600TM-ITX/index.asp
Can't find it for sale, though. There's also a barebones mini-PC:
https://www.asrock.com/nettop/AMD/DeskMini%20X600%20Series/i...
Is Apple going to be even more profitable in the consumer space because of RAM prices? I feel like they're the only player with the supply chain locked down enough to not get caught off guard: good prices locked in far enough in advance, and suppliers unwilling to antagonize such a big customer by backing out of a deal.
Apple software typically seems to give a better user experience in less RAM in both desktop and mobile.
For the last 10+ years, Apple's iPhones have shipped with about half the RAM of a flagship Android, for example.
They used to, but they've caught up. The flagship iPhone 17 has 12GB RAM, the same as the Galaxy S25. Only the most expensive Z Fold has more, with 16GB.
RAM pricing segmentation makes Apple a lot of money, but I think they scared themselves when AI took off and they had millions of 4GB and 8GB products out in the world. The Mac minimum RAM specs have gone up too, they're trying to get out of the hole they dug.
> the only player to have the supply chain locked down enough to not get caught off guard
What?
Tim Cook is the Supply Chain Guy. He has been for decades, before he ever worked at Apple. He does everything he can to make sure that Apple directly controls as much of the supply chain as possible, and uses the full extent of their influence to get favorable long-term deals on what they don't make themselves.
In the past this has resulted in stuff like Samsung Display sending their best displays to Apple instead of Samsung Mobile.
I had planned to build a new workstation this fall; all the parts were on the list. But seeing the RAM go from 300€ (96 GB) to 820€, in stock for 999€, in under a month made me decide that I will continue using that laptop from 2019 for maybe another 1.5 years.
It's a ridiculous situation and these companies, whoever they are, should be somewhat ashamed of themselves for the situation they're putting us in.
That goes especially for those MFs at OpenAI who apparently grabbed 40% of the worldwide DRAM production, as well as what's sold in stores.
Dec 2023 vs. today:
96GB (2x48) DDR5-5x00: £260 then, £1050 today
128GB (4x32) DDR5-5x00: £350 then, £1500 today
Wut?
ECC memory has been one of my better investments of the past two years, and not just because of the crashes it might have prevented.
Based on my time working for Samsung this does not surprise me. The silos within fight against one another more than they ever bother to compete with anyone else
It is absolutely the worst time to be a gamer. First it was GPU prices going up as NVIDIA focused more and more on their enterprise cards, and now it's RAM prices. I don't think I've seen the price of computer components go up this much.
You make more money selling the good stuff. It's like this in just about every industry.
Ironically, that site was eating up my RAM. PC World has some issues in both Chrome and Firefox.
When RAM gets so expensive that even Samsung won’t buy Samsung from Samsung, you know the market has officially entered comic mode. At this rate their next quarterly report is just going to be one division sending the other an IOU.
Overleverage / debt, and refusing to sell at a certain price, are actually very different things though. OpenAI might be a tire fire, but Samsung is the gold pan seller here, and presumably has an excellent balance sheet.
The manufacturers are willing to quadruple prices for the foreseeable future but won't change their manufacturing quotas a bit.
So much for open markets; somebody should check their books and manufacturing schedules.
In their defense, how many $20 billion fabs do you want to build in response to the AI ... (revolution|bubble|other words)? It seems very, very difficult to predict how long DRAM demand will remain this elevated.
It's dangerous for them in both directions: Overbuilding capacity if the boom busts vs. leaving themselves vulnerable to a competitor who builds out if the boom is sustained. Glad I don't have to make that decision. :)
I don't think they're working at 100% capacity, nor that they lack other fabs they could utilize for the lower-profit stuff.
Let’s check their books and manufacturing schedule to see if they’re artificially constraining the supply to jack up the prices on purpose.
I'd take the opposite bet on this. They're diverting wafer capacity from lower-profit items to things like HBM, but all indications are that wafer starts are up a bit. Just not up enough.
For example: https://chipsandwafers.substack.com/p/mainstream-recovery
"Sequentially, DRAM revenue increased 15% with bit shipments increasing over 20% and prices decreasing in the low single-digit percentage range, primarily due to a higher consumer-oriented revenue mix"
(from June of this year).
The problem is that the DRAM market is pretty tight - supply or demand shocks tend to produce big swings. And right now we're seeing both an expected supply shock (transition to new processes/products) as well as a very sudden demand shock.
Most of the things people say about efficient markets assume low barriers to entry. When it takes years and tens of billions of dollars to add capacity, it makes more sense to sit back and enjoy the margins. Especially if you think there's a non-trivial possibility that the AI build out is a bubble.
If it’s an AI bubble, it would be stupid to open new manufacturing capacity right now. Spend years and billions spinning up a new fab, only to have the bottom of the market drop out as soon as it comes online.
I bought 2x16GB of Samsung ECC RAM last week for $150.
This seems to be for chips put in phones in 2026? I thought these orders were booked further in advance, or is that only for processors?
In the 90s, Motorola Mobile used Cypress SRAMs and not Motorola SRAMs.
Pricing.
I feel we have a RAM price surge every four years. The excuses change, but it's always when we see a generation switch to the next gen of DDR. Which makes me believe it's not AI, or graphics cards, or crypto, or gaming, or one of the billion other conceivable reasons, but price-gouging when new standards emerge and production capacity is still limited. Which would be much harder to justify than 'the AI/Crypto/Gaming folks (who no-one likes) are sweeping the market...'
But we're not currently switching to a next gen of DDR. DDR5 has been around for several years, DDR6 won't be here before 2027. We're right in the middle of DDR5's life cycle.
That is not to say there is no price-fixing going on, just that I really can't see a correlation with DDR generations.
Regardless of whether it is Crypto/AI/etc., this would seem to be wake-up call #2. We're finding the strangle-points in our "economy"—will we do anything about it? A single fab in Phoenix would seem inadequate?
If 'the West' were half as smart as it claims to be, there would be many more fabs in friendly territory. Stick a couple in Australia and NZ too for good measure; it's just too critical a resource now.
What will we do with that fab in two years when nobody needs that excess RAM?
I suspect there will be a shortage of something else then…
And regardless, you could flip it around and ask, what will we do in x years when the next shortage comes along and we have no fabs? (And that shortage of course could well be an imposed one from an unfriendly nation.)
There has never been 'an excess of RAM', the market has always absorbed what was available.
Sell it at lower prices. Demand is a function of price, not a scalar.
Tax write-off donations to schools and non-profits, too.
Micron is bringing one up in Boise, Idaho as well.
It's a political problem: do we, the people, have a choice in what gets prioritized? I think it's clear that the majority of people don't give a damn about minor improvements in AI and would rather have a better computer, smartphone, or something else for their daily lives than fuel the follies of OpenAI and its competitors. At worst, they can build more fabs simultaneously to have the necessary production for AI within a few years, but reallocating it right now is detrimental and nobody wants that, except for a few members of the crazy elite like Sam Altman or Elon Musk.
Why is this downvoted? This is not the first time I've heard that opinion expressed, and every time it comes up there is more evidence that maybe there is something to it. I've been following the DRAM market since the 4164 was the hot new thing and it cost - not kidding - $300 for 8 of them, which would give you all of 64K of RAM. Over the years I've seen the price surge multiple times, and usually there was some kind of hard-to-verify reason attached to it, from flooded factories to problems with new nodes and a whole slew of other issues.
RAM being a staple of the computing industry, you have to wonder if there aren't people cleaning up on this; it would be super easy to create an artificial shortage given the low number of players in this market. In contrast, the price of, say, gasoline has been remarkably steady, with one notable outlier with a very easy-to-verify and direct cause.
This industry has a history of forming cartels.
https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal
There is also the side effect of limiting people's ability to run powerful models themselves. Could very well be part of a strategy.
Kdrama on this when?
AI companies must compensate us for this outrage.
A few hours ago I looked at RAM prices. I bought some DDR4, 32GB only, about a year or two ago. I kid you not: the local price here is now 2.5 times what it was back in 2023 or so, give or take.
I want my money back, OpenAI!
This is important to point out. All the talk about AI companies underpricing is mistaken. The costs to consumers have just been externalized; the AI venture as a whole is so large that it simply distorts other markets in order to keep its economic reality intact. See also: the people whose electric bills have jumped due to increased demand from data centers.
I think we're going to regret this.
Americans are subsidizing AI by paying more for their electricity so the rest of the world can use ChatGPT (I'm not counting the data centers for Chinese models and a few European ones, though).
DDR4 manufacturing is being spun down due to lack of demand. The prices on it would be going up regardless of what's happening with DDR5.
Yup.
And even more outrageous is the power grid upgrades they are demanding.
If they need the power grid upgraded to handle the load for their data centers, they should pay 100% of the cost for EVERY part of every upgrade needed for the whole grid, just as a new building typically pays to upgrade the town road accessing it.
Making ordinary ratepayers pay even a cent for their upgrades is outrageous. I do not know why the regulators even allow it (yeah, we all do, but it is wrong).
Usually the narrative for externalizing these kinds of costs is that the investment will result in lots of jobs in the upgrade area.
Sometimes that materializes.
Here the narrative is almost the opposite: pay for our expensive infrastructure and we’ll take all your jobs.
It’s a bit mind boggling. One wonders how many friends our SV AI barons will have at the end of the day.
I bought 2x16 (32GB) DDR4 in June for $50. It is now ~$150.
I'm kicking myself for not buying the mini PC that I was looking at over the summer. The cost nearly doubled from what it was then.
My state keeps trying to add Data Centers in residential areas, but the public seems to be very against it. It will succeed somewhere and I'm sure that there will be a fee on my electric bill for "modernization" or some other bullshit.
I am so glad I built my PC back in April. My 2x16GB DDR5 sticks cost $105 all-in then; now it's $480 on Amazon. That is ridiculous!
The problem is further upstream. Capitalism is nice in theory, but...
"The trouble with capitalism is capitalists; they're too damn greedy." - Herbert Hoover, U.S. President, 1929-1933
And the past half-century has seen both enormous reductions in the regulations enacted in Hoover's era (when out-of-control financial markets and capitalism resulted in the https://en.wikipedia.org/wiki/Great_Depression), and the growth of a class of grimly narcissistic/sociopathic techno-billionaires - who control way too many resources, and seem to share some techno-dystopian fever dream that the first one of them to grasp the https://en.wikipedia.org/wiki/Artificial_general_intelligenc... trophy will somehow become the God-Emperor of Earth.