This week has been one of many firsts. Intel and AMD are clubbing together. Lian Li is making cases with tempered glass and regular steel. Microsoft is ahead of Sony in the console spec wars. But that’s nothing compared to what’s been announced today by Intel, and that is the creation of a new graphics and compute division inside the company headed up by former AMD executive Raja Koduri. Graphics cards, guys! Intel wants to make graphics cards.
In a press release posted to Intel’s blog site, the company revealed that they were eager to get back into the graphics and compute market because they see a future where GPU performance is valued far more than per-core processing power. The only snag in this plan is Intel’s own graphics portfolio – it’s capable, but it lacks the oomph to propel it past AMD and NVIDIA’s solutions. It’s seen as more of a productivity add-on than an actual graphics solution.
Enter Raja Koduri, ex-chief of AMD’s Radeon Technologies Group. At Intel, Koduri will head up the new Core and Visual Computing Group, which will oversee a transition in Intel’s graphics portfolio and products, expanding from their current integrated graphics solutions into high-end discrete chips with much more power. Koduri also takes on a second role alongside that one: he is Intel’s general manager “dedicated to driving edge computing solutions”.
Edge computing is an interesting field because it’s tied to the growth of the Internet of Things (IoT) market. Edge computing is defined as pushing computing applications, data, and services away from centralised nodes to the logical extremes of a network. More simply, it’s the act of moving compute capabilities from a server on to the device itself, allowing each device to act on the data it gathers on its own and to share that information with its peers. For example, robots in an assembly line might discover that there’s a fault in the metals the company is using, and adjust their welding technique to compensate without needing a factory operator to make the adjustment. In another scenario, a machine that is running a little slower than normal, losing tenths of a second per action, would be able to flag to the other machines that it is defective, and a floor manager could take it off the line without any downtime or product loss.
Edge computing is essentially the engine that would eventually allow factories to run fully automated with almost no human supervision or intervention.
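To make that second scenario concrete, here’s a minimal sketch of the idea in Python. Everything in it – the `EdgeMachine` class, the machine names, the baseline and tolerance figures – is hypothetical and invented for illustration; the point is only that the defect decision happens on the device itself, from locally gathered data, and the alert goes directly to peers rather than through a central server.

```python
from statistics import mean

class EdgeMachine:
    """Hypothetical assembly-line machine that monitors its own cycle
    times locally and reports itself to peers when it drifts."""

    def __init__(self, name, baseline, tolerance=0.1):
        self.name = name
        self.baseline = baseline    # expected seconds per action
        self.tolerance = tolerance  # allowed fractional slowdown
        self.samples = []           # cycle times measured on-device
        self.alerts = []            # messages received from peers

    def record_cycle(self, seconds):
        self.samples.append(seconds)

    def is_defective(self):
        # The decision is made on the device, from its own data --
        # no round trip to a central node.
        if len(self.samples) < 3:
            return False
        return mean(self.samples) > self.baseline * (1 + self.tolerance)

    def broadcast(self, peers):
        # Peer-to-peer notification instead of a server push.
        if self.is_defective():
            for peer in peers:
                peer.alerts.append(f"{self.name} is running slow")

line = [EdgeMachine(f"welder-{i}", baseline=1.0) for i in range(3)]
line[1].record_cycle(1.2)
line[1].record_cycle(1.3)
line[1].record_cycle(1.25)
for machine in line:
    machine.broadcast([p for p in line if p is not machine])

print(line[0].alerts)  # -> ['welder-1 is running slow']
```

Here `welder-1` averages 1.25 seconds per action against a 1.0-second baseline, decides on its own that it has drifted past tolerance, and tells its neighbours – which is where a floor manager (or scheduling software) could pull it from the line.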
“We have exciting plans to aggressively expand our computing and graphics capabilities and build on our very strong and broad differentiated IP foundation,” says Intel. “With Raja at the helm of our Core and Visual Computing Group, we will add to our portfolio of unmatched capabilities, advance our strategy to lead in computing and graphics, and ultimately be the driving force of the data revolution.”
As for how they’re doing this, they’re definitely not going to match NVIDIA’s compute capabilities with their existing Iris graphics. In steps AMD’s semi-custom silicon, embedded on the same package and acting almost like on-die graphics. AMD has much more experience in machine learning and visual computing systems, and they already have a framework for Intel to use thanks to their work in driving OpenCL adoption. Intel also benefits from having a GPU attached to their processor cores that gives them much more performance in mobile systems, satisfying the needs of clients like Apple. AMD benefits from the increased sales as well, and their semi-custom division may soon become their most important, and most lucrative, line of business.
Of course, we can’t take in this news without thinking of the future impact this will have on AMD as a company. If Intel comes to rely on AMD’s graphics prowess a bit too much, they may be tempted to offer to buy out AMD’s graphics department to keep things more in-house, leaving AMD’s CPU business at a disadvantage. If AMD relies too much on the semi-custom business, they may eventually decide to stop servicing the consumer market, and instead stick to delivering solutions to other companies for things like consoles, VR headsets, and all-in-one machine learning systems mixing graphics and compute capabilities.
There might be a third, and perhaps even a fourth ending to this story that I haven’t considered yet, but those two are the most likely. It will be interesting to see how things play out. For now, though, the PC industry has just had a major shake-up in the natural order of things, and Intel putting more money into GPU compute and graphics is bad news for NVIDIA.