Hacker News

> Most GPUs are sold as integrated such as SoCs and in consoles nowadays.

But those are mostly AMD, and don't really have anything to do with what features someone puts on their discrete gaming cards, except insofar as it implies gamers don't need the cards some amateur ML hobbyist might buy.

> In a shrinking market, the midrange and low-end products will cease to be profitable.

That assumes the products have high independent development costs, but that isn't really the case. The low-end products are essentially the high-end products with fewer cores, which use correspondingly less silicon. They also yield better, because you don't need such a large area of perfect silicon, and a defective die can be sold as a slower part by disabling the defective section. That makes them profitable with a smaller margin per unit of die area.
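The binning economics can be sketched with a standard Poisson yield model. All numbers below (defect density, die areas) are assumptions chosen for illustration, not actual figures for any real GPU:

```python
import math

# Illustrative Poisson yield model for die binning. The defect density
# and die areas below are assumptions for this sketch, not real figures.
DEFECT_DENSITY = 0.1  # defects per cm^2 (assumed)

def defect_free_yield(die_area_cm2: float, d0: float = DEFECT_DENSITY) -> float:
    """P(zero defects) on a die of the given area, under a Poisson model."""
    return math.exp(-d0 * die_area_cm2)

big_die, small_die = 6.0, 1.5  # cm^2, hypothetical high-end vs low-end dies

# A big die with exactly one defect can often be salvaged: fuse off the
# defective section and sell it as a cut-down part.
lam = DEFECT_DENSITY * big_die
salvageable = lam * math.exp(-lam)  # P(exactly one defect)

print(f"perfect big-die yield:    {defect_free_yield(big_die):.1%}")    # ~54.9%
print(f"perfect small-die yield:  {defect_free_yield(small_die):.1%}")  # ~86.1%
print(f"big dies sellable as cut-down parts: {salvageable:.1%}")        # ~32.9%
```

The point of the sketch: a smaller die yields far more usable parts per wafer, and a third of the "failed" big dies are still sellable as slower SKUs, which is why cut-down parts carry little incremental cost.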

> Furthermore, node advancements have stopped scaling $/transistor. So the transistors aren't getting cheaper, just smaller.

Which implies that they can profitably continue producing almost-as-good GPUs on the older process node.

> Lastly, Nvidia wants to allocate every last wafer they pre-purchased from TSMC to their server GPUs.

Which is why the proposal is for them to make as many GPUs as the gamers could want at Samsung.



Customers who want midrange or low end GPUs are price sensitive. Therefore, midrange and low end GPUs are a low margin business. Hence, there's little to no reason to provide a very compelling product in those categories.

While midrange GPUs are cut down from high-end GPUs, they're still significantly more expensive to manufacture than, say, CPUs on a transistor-for-transistor basis. Look at the AMD 7950X transistor count, and then the RTX 4060 transistor count. The GPU has ~50% more transistors but sells at half the price. In addition, the GPU requires RAM, a board, circuitry, and a heatsink fan. The margins simply aren't there for low-end GPUs anymore.

Previously, Nvidia and AMD could make it up through volume. But again, the market has gotten much smaller, going from 60 million discrete GPUs sold per year to 30 million. That's half!

Based on your logic, AMD should feast on the midrange and low-end discrete GPU market, because Nvidia doesn't have value products there. But AMD isn't feasting. You know why? Because there's no profit there for AMD either.

Once you stop thinking like an angry gamer, these decisions start to make a lot of sense.


> Customers who want midrange or low end GPUs are price sensitive. Therefore, midrange and low end GPUs are a low margin business.

Customers who want petroleum are price sensitive. Therefore, petroleum exporting is a low margin business. This is why the Saudis make the profit margins they do. Wait, something's not right here.

> Look at an AMD 7950x transistor count, and then an RTX 4060 transistor count. The GPU has ~50% more transistors but sell at half the price.

You're comparing the high end CPU to the mid-range GPU. The AMD 8500G has more transistors than the RTX 4060 and costs less.

> In addition, the GPU requires RAM, a board, circuitry, and a heatsink fan.

The 8500G comes with a heatsink and fan. The 8GB of GDDR6 on the 4060 costs $27 but the 4060 costs $120 more. A printed circuit board doesn't cost $93.
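To make that back-of-envelope explicit (the $27 GDDR6 figure and the $120 price gap are the numbers cited in this thread; the rest is arithmetic):

```python
# Back-of-envelope for the price gap argued above. The $27 GDDR6 figure
# is the one cited in this thread; the $120 gap is the quoted price
# difference between the 8500G and the RTX 4060.
gddr6_8gb_cost = 27   # $, the 8GB of memory on the 4060
price_gap = 120       # $, 4060 price minus 8500G price
unexplained = price_gap - gddr6_8gb_cost
print(unexplained)    # 93 -> board, cooler, and assembly would have to cost ~$93
```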

> Previously, Nvidia and AMD can make it up through volume. But again, the market has gotten much smaller going from 60 million discrete GPUs per year sold to 30 million. That's half!

That's not because people stopped buying them, it's because they shifted production capacity to servers.

> Based on your logic, AMD should feast on midrange and low end discrete GPU market because Nvidia does not have value products there. But AMD isn't feasting. You know why? Because there's also no profit there for AMD either.

But they do though. You can find lower end AMD GPUs from the last two years for $125 (e.g. RX 6400) whereas the cheapest RTX 3000 or 4000 series is around twice that.

And anyway who is talking about the bottom end? The question is why they don't produce more of e.g. the RTX 3070, which is on the old Samsung 8LPP process, has fewer transistors than the RTX 4060, is faster, and is still selling for a higher price.


  And anyway who is talking about the bottom end? The question is why they don't produce more of e.g. the RTX 3070, which is on the old Samsung 8LPP process, has fewer transistors than the RTX 4060, is faster, and is still selling for a higher price.
What do you think? I gave you my reasons. Why don't you take a crack at your own question? There has to be a logical business reason right?

  Customers who want petroleum are price sensitive. Therefore, petroleum exporting is a low margin business. This is why the Saudis make the profit margins they do. Wait, something's not right here.
One is a commodity. The other is about as high tech as it gets. Completely different economic rules that govern these products.

  You're comparing the high end CPU to the mid-range GPU. The AMD 8500G has more transistors than the RTX 4060 and costs less. The 8500G comes with a heatsink and fan. The 8GB of GDDR6 on the 4060 costs $27 but the 4060 costs $120 more. A printed circuit board doesn't cost $93.
One has an entire board that needs soldering, assembled by a manufacturing line, tested with many parts, and a team of dedicated engineers optimizing drivers constantly. The other is a CPU that is machine tested, and shipped with a heatsink fan unattached. Come on now.

  That's not because people stopped buying them, it's because they shifted production capacity to servers.
That's not true. The discrete GPU market has been shrinking for 14 years straight, with some crypto boom years here and there. See the chart I posted previously. Fewer and fewer people are buying discrete GPUs.

  But they do though. You can find lower end AMD GPUs from the last two years for $125 (e.g. RX 6400) whereas the cheapest RTX 3000 or 4000 series is around twice that.
AMD cards do not "feast" on low-end and midrange. According to Steam charts, Nvidia still dominates midrange cards.[0] Furthermore, when I said "feast", I meant making profits. AMD does not make much profit from midrange or low end cards.

The bottom line is, you keep wondering why no one is offering compelling value in the midrange area but there's a very obvious reason why: profit is not there.

[0]https://store.steampowered.com/hwsurvey/videocard/


> Why don't you take a crack at your own question? There has to be a logical business reason right?

Selling 30M GPUs with a huge margin is more profitable than selling 60M GPUs with a modest margin, and they can point to Bitcoin or AI as an excuse.

But also, we're talking about them crippling the cards "for gamers" so there will be cards "for gamers" -- the premise of this has to be that they're supply constrained (artificially or otherwise) because otherwise they would just make more at the evidently profitable price gamers are already paying. It can't be a lack of demand because the purpose of removing the feature is to suppress demand (and shift it to more expensive cards).

> One is a commodity. The other is about as high tech as it gets. Completely different economic rules that govern these products.

So you're saying that if a high tech product only has a limited number of suppliers then they could charge high margins even if customers are price sensitive.

> One has an entire board that needs soldering, assembled by a manufacturing line, tested, with many parts, and a team of dedicated engineers optimizing drivers constantly. The other is a CPU that is machine tested, and shipped with a heatsink fan unattached. Come on now.

GPU manufacturing is automated. The CPU heatsink isn't attached because it mounts to the system board, not because attaching it would meaningfully affect the unit price.

Driver development isn't part of the unit cost, its contribution per unit goes down when you ship more units.
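As a sketch of that fixed-cost amortization (the budget and unit volumes here are made-up round numbers for illustration, not Nvidia's actual figures):

```python
# Fixed-cost amortization: a driver team costs roughly the same whether
# you ship 30M or 60M units, so the per-unit share halves with volume.
# The $100M budget and the unit counts are assumptions for illustration.
FIXED_DRIVER_BUDGET = 100_000_000  # $/year, assumed

def per_unit_share(units: int, fixed: int = FIXED_DRIVER_BUDGET) -> float:
    """Fixed cost divided across units shipped."""
    return fixed / units

print(f"${per_unit_share(30_000_000):.2f} per unit at 30M units")  # $3.33
print(f"${per_unit_share(60_000_000):.2f} per unit at 60M units")  # $1.67
```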

You can buy an entire GPU for the price difference between the 8500G and the RTX 4060.

> That's not true. The discrete GPU market has been shrinking for 14 years straight with some crypto boom years here and there. See the chart I posted previously.

That's only because you're limiting things to discrete GPUs and customers have increasingly been purchasing GPUs in other form factors (consoles, laptops, iGPUs) which have different attachment methods but are based on the same technology.

> According to Steam charts, Nvidia still dominates midrange cards.

Steam is measuring installed base. That changes slowly, especially when prices are high.

> Furthermore, when I said "feast", I meant making profits. AMD does not make much profit from midrange or low end cards.

They make a non-zero amount of profit, which is why they do it.


I think you answered your own question, really.

Although I would modify your statement slightly:

Original: Selling 30M GPUs with a huge margin is more profitable than selling 60M GPUs with a modest margin.

Modified: Nvidia and AMD must sell at a higher ASP because the market for discrete GPUs has shrunk from 60m to 30m/year.

That's your answer! It's what I've been arguing since my very first post. It isn't Nvidia and AMD's choice to have the market shrink in terms of raw volume. It's because many midrange gamers have moved on to laptops, phones, and consoles for gaming since 2010. The remaining PC gamers are willing to pay more for discrete GPUs. Hence, neither Nvidia nor AMD bothers making compelling midrange GPUs.

I remember midrange GPUs that have great value such as the AMD HD 4850. I don't think those days are ever coming back.


> Modified: Nvidia and AMD must sell at a higher ASP because the market for discrete GPUs has shrunk from 60m to 30m/year.

That makes no sense as a reason not to sell more.

It also makes no sense in general, because discrete GPUs aren't a distinct technology. The H100 isn't literally a bunch of RTX cards glued together, but it's approximately that, and the same R&D goes into both, implying that the higher demand for this technology should allow for lower ASPs, since you now have a new source of demand to spread the R&D costs over.

And the same is true for consoles and laptops. It's the same technology, it's just soldered to something instead of being in a PCIe card. Discrete GPUs aren't expensive because they can't justify the cost of the printed circuit board without selling more units, they're expensive because the market is consolidated and in the absence of more competition, Nvidia charges what they can get away with even when that is far in excess of what they would need to charge simply to remain a viable business.

> I remember midrange GPUs that have great value such as the AMD HD 4850. I don't think those days are ever coming back.

The way you get those days back is to get more competition. Support AMD and Intel and anyone else who might present a viable challenge to Nvidia so that Nvidia has to provide better value for money to keep you from switching.


It makes sense because there is now less demand for midrange GPUs than before. But demand for high end GPUs has either increased or stayed the same, making high end GPUs more lucrative.

You seem to think that midrange market shrunk in volume because Nvidia decided to stop offering value products, is that right?

Don’t support AMD or Intel if they have inferior products. There are now plenty of GPU makers out there, including Qualcomm and Apple. Let’s not become fanboys here. r/ayymd is where you want to go if you want to become a blind AMD supporter.



