Sunday, November 11, 2012

Speculation about the future of AMD, nVidia, and Xbox

There are a few important facts to highlight for you, the reader, before going any further into speculation:

1-AMD first tried to acquire nVidia, but ended up acquiring ATI instead (in 2006).

2-Since AMD launched their APUs (CPUs with an embedded GPU), it is evident that people at AMD are trying to convince consumers to adopt Radeon graphics: when an AMD APU is paired with an nVidia video card, the embedded Radeon GPU sits idle and is wasted. When paired with a Radeon video card and a motherboard that supports CrossFire, the embedded GPU's power is unlocked and added to that of the discrete Radeon card, creating instant added value for the customer. It also means the GPU embedded in the APU can serve as extra horsepower for other workloads, such as physics and/or A.I. But developers need to take advantage of this, and nVidia is likely to release something new before that happens in order to prevent it on the PC platform.

3-While being very inexpensive, AMD APUs lack any dedicated GDDR (RAM reserved for GPU processing) and instead use the standard, but slower, system RAM connected to the motherboard. The customer can unlock faster GPU performance by pairing the APU with faster RAM modules. For example, a motherboard running an APU with 2400 MHz DDR3 RAM will deliver much more graphics horsepower than one running the same APU with 1600 MHz DDR3 RAM.
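The size of that gap is easy to estimate with back-of-the-envelope arithmetic. The sketch below computes theoretical peak bandwidth (not measured throughput) and assumes a typical dual-channel DDR3 setup with a 64-bit bus per channel; the GDDR5 line assumes a common 256-bit card at 6000 MT/s for comparison:

```python
def peak_bandwidth_gb_s(mt_per_s, bus_bits=64, channels=2):
    """Theoretical peak bandwidth in GB/s.

    mt_per_s: effective transfer rate in mega-transfers per second
              (the "MHz" printed on DDR3 modules, e.g. 1600 or 2400).
    bus_bits: width of one memory channel in bits (DDR3 channel = 64).
    channels: number of channels populated (dual channel = 2).
    """
    bytes_per_transfer = bus_bits // 8  # 64-bit channel moves 8 bytes per transfer
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

print(peak_bandwidth_gb_s(1600))                          # DDR3-1600 dual channel -> 25.6
print(peak_bandwidth_gb_s(2400))                          # DDR3-2400 dual channel -> 38.4
print(peak_bandwidth_gb_s(6000, bus_bits=256, channels=1))  # typical GDDR5 card -> 192.0
```

Going from DDR3-1600 to DDR3-2400 is a 50% jump in peak bandwidth for the APU's GPU, yet even the faster kit is far below what a discrete GDDR5 card enjoys — which is why the APU's graphics performance scales so visibly with RAM speed.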

4-nVidia solutions are generally more expensive than equivalent AMD solutions.

5-nVidia Tegra chips for ultra-mobile devices are in fact slightly modified quad-core ARM CPUs that embed an nVidia GPU, a northbridge, a southbridge, a memory controller, and a fifth low-power companion core. They compete with Mali, a GPU family also designed by ARM and used in many smartphones, including many Samsung devices.

Now it is time for open speculation.

The next Xbox console is rumored to rely on a 16-core IBM ARM CPU coupled with an AMD GPU and custom, lightning-fast RAM. This is at least what is rumored to be included in one of the early development kits, but final specs often end up at only about half those of the devkits, which would bring the thing down to (approximately) an 8-core CPU, equivalent to today's high-end technology on the PC platform.

Chances are that the console's AMD GPU will not embed any standard GDDR; instead, the system would use very fast RAM to compensate for the lack of direct access to GDDR. But why? GDDR5 is already so fast; why try to improve on it?

Well, think about it for a minute: if Microsoft can get that RAM to work at least as fast as the GDDR5 technology that has been around for a few years already, it would open the door to a HUGE improvement in graphics, since developers would have access to the whole system RAM (which you can expect to be considerable, reaching 8 GB-12 GB) as a standard, instead of a non-standardized maximum of 1 GB or 2 GB as on the PC.

Today's PC games, even top-end ones like Crysis 3, are designed to use 1 GB of GDDR5, or 2 GB in the very best scenario. But we don't see major improvements in graphics quality between 1 GB and 2 GB, because developers are not wasting their time creating ultra-high-resolution assets for something that only 5-12% of their userbase will be able to run, even if those users have video chipsets powerful enough to render them. They are boosting a few things, like the screen resolution, the resolution of special effects (blur, lightmaps, etc.) and of course the framerate, but they are not bothering to develop new features or graphics, because the time and resource investment is not worth the pain, and because no big competitor will do it anyway. It is a whole different story, however, if you approach them with a console that promises to sell millions of units on day one, with very high and standardized specs. It is then not only worth the effort; it gives them no choice but to improve if they want to keep the flag in their camp.
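To make the memory argument concrete, here is a hypothetical back-of-the-envelope comparison (the texture sizes and budgets are illustrative assumptions, not figures from any real game). An uncompressed 32-bit RGBA texture costs width × height × 4 bytes, so the size of the guaranteed memory pool translates directly into how many high-resolution assets a developer can keep resident at once:

```python
def texture_mb(width, height, bytes_per_pixel=4):
    """Size of one uncompressed texture in MiB (RGBA8 = 4 bytes per pixel)."""
    return width * height * bytes_per_pixel / 2**20

def textures_that_fit(budget_gb, width, height):
    """How many such textures fit in a given memory budget (in GiB)."""
    return int(budget_gb * 1024 // texture_mb(width, height))

# A 2048x2048 RGBA texture is 16 MiB; a 4096x4096 one is 64 MiB.
print(texture_mb(4096, 4096))            # 64.0
print(textures_that_fit(2, 4096, 4096))  # 32 resident in a 2 GB card
print(textures_that_fit(8, 4096, 4096))  # 128 in an 8 GB unified pool
```

Quadrupling the guaranteed pool quadruples the number of top-resolution assets that can stay resident — and, crucially, a console makes that larger budget a floor every customer has, rather than a ceiling only a few enthusiasts reach.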

AMD will have a stronger presence and reputation on the PC platform, because up to now they have been providing very interesting price-versus-performance solutions, but they have failed to release any convincing ultra-mobile hardware. Unless they hurry up and release a solid, attractive offering for the ultra-mobile scene, they won't be able to catch up with nVidia's reputation in this field, and will instead keep focusing on growing their presence on the desktop and console scenes. Tablets and smartphones are taking market share from laptops every month, and this will continue in the years to come. AMD's plan is most probably to become a stronger actor in the desktop and console worlds than they have ever had the opportunity to be. This is where they will channel their resources, offerings, and R&D.

NVidia will continue to sit on their reputation in the mass-market desktop scene, just like Intel is doing right now, because nVidia currently has the perfect opportunity to rule the ultra-mobile scene with their Tegra chips. They will continue to compete with AMD on the desktop, in order to keep taking market share from them as much as possible, but they will only want to do so in exchange for high profit margins. In other words, they will target a more niche market (a smaller territory), but they will defend it with force. Much like Apple, nVidia will play the expensive high-end card, investing a lot in marketing strategy and corporate identity to justify their higher price points (which may well be worth it anyway). You can expect nVidia to announce many new business partnerships in the coming years.

I can also imagine them acquiring ARM. Not for direct profits, but rather with the objective of persuading all ARM partners to use their graphics chips (bigger production = lower costs), thus reinforcing their presence in the ultra-mobile scene and giving them a stronger edge over Intel and AMD in this quickly growing market. If nVidia kills the Mali GPUs in favor of nVidia GPUs, it would almost guarantee that Intel won't make the move to compete with ARM. Such a move would be very hard for Intel, whose only chance of catching up with the ARM + nVidia architecture would be to invest many billions of dollars; Intel won't be interested in investing that much if they cannot tackle the ARM + nVidia combo, let alone ARM alone. They currently need to tackle only ARM and have so far proved unable to do so (even with their Atom CPUs), so tackling ARM + nVidia is very unlikely to happen unless Intel finds a new way to innovate. This would be a way for nVidia to secure their young but already expensive venture into the ARM architecture, because Intel is reportedly preparing another attempt for 2014, and if they succeed, it will hurt not only ARM itself, but also nVidia.
But that remains to be seen...
