AMD has been keeping the entire tech press busy this past month with a spree of product announcements, reveals, and leaks, and today the craze is finally over for at least another month. During its recent SIGGRAPH presentation, the company finally revealed Ryzen Threadripper as well as their new Radeon RX Vega graphics cards, detailing their prices and the competition AMD expects each to face. Grab some coffee, I guess, this is going to be another long one.

Ryzen Threadripper 1950X, 1920X, and 1900X

First, get a load of AMD’s box packaging for Threadripper! It’s quite a unique design, and AMD clearly wants it to make a bold statement on the shelf. To open the box, you unscrew the rear pin to pop off the back. Inside, you’ll find a bracket for the Asetek family of water coolers, along with the usual paraphernalia, and the enormous chip itself. The socket and the mounting mechanism are so large that AMD had to borrow old designs from server processors that had slide-in trays, which you can see in action in this video shot by MSI. Somehow, the rather pornographic music in the background suits the ridiculous levels of engineering that you see in just three minutes.

With Ryzen 3 and finally Threadripper on the way, there’s a new roadmap that AMD has just updated to reflect their future plans. Note how different it is to the original one they showed a few months ago. Aside from the fact that Threadripper now features in the table, look at the position of Ryzen Mobile. It has now shifted to the right, signalling a delay in AMD’s intended plans for the platform’s release.

It now hovers over the tail-end of Q3 2017 and mostly sits above Q4. With Threadripper only launching in August, the team behind Ryzen Mobile has to really pull up their socks to make it to a September launch, or else it’s Q4 2017 for both the consumer and pro versions of Ryzen Mobile. If that’s the case, there may not be enough time for their partners to have notebooks ready for sale in early November.

The Ryzen Threadripper family is composed of three processors, priced to go up against Intel’s Core i7 and Core i9 processors on the X299 platform. AMD is pricing the platform quite aggressively, starting at a $549 entry point with the TR 1900X. It’s quite similar to the Ryzen 7 1800X, but it starts off with a higher base clock, and it also carries more memory channels, full ECC support, 64 lanes of PCI Express connectivity, and some of the professional hardware features that make it desirable as a workstation processor. It is unlikely that AMD will ever make a “pro” version of Threadripper, given its placement in the market.

For that reason, AMD doesn’t make much of a fuss over the TR 1900X when the TR 1920X and 1950X also have to share the spotlight. It’s just an entry point into the platform, and it’s far more limited in performance than its bigger brothers. AMD is pitting the TR 1920X against Intel’s Core i9-7900X, promising higher performance for a much lower price, while the TR 1950X offers much more performance for the same price. It’s a tantalising option, especially when you get into the performance-per-watt comparisons. If you’re looking for something that is both faster and draws less power, AMD’s Threadripper family is well suited to that role.

Because AMD is also using the same socket as EPYC, with some tweaks, it stands to reason that the chip packaging is the same. Underneath the Threadripper heatspreader, the package does include four dies, although only two are ever active. AMD could be filling in the space for the other dies with absolutely anything. Dead dies that couldn’t even pass a single core test? Dies that have bigger cache? Dies that have underperforming or damaged memory controllers? Those can all be used as spacers to make the package match the socket specifications. AMD is using almost all of the physical processors they get back from GlobalFoundries’ fabs to make Zen and its derivatives, and using dead dies as spacers is a great way to cut down on waste.

The X399 ecosystem for Threadripper is much more mature than socket AM4 ever was, and I can tell you that it’s going to be a stable experience right out of the box. For months now, AMD has worked hard to make sure that the X399 launch goes swimmingly, and it’s at a level they can be comfortable with, and maybe even proud of. Not only will all of the major motherboard brands have their options up at launch, but there will also be an ecosystem of accessories and CPU coolers ready for new builds. There won’t be many existing coolers that work with the new socket, but there will be some compatible brackets, since every cooler has to screw directly into the TR4 socket.

Compared to Ryzen 3, 5, and 7, Threadripper processors can overclock more stably, can hold those overclocks for a longer period of time, and might even run cooler if you compare die-to-die temperatures. There’s also much higher memory stability, so difficult-to-run kits at DDR4-3400 and higher are probably not going to pose such an issue.

AMD Radeon RX Vega and Vega Nano

Radeon RX Vega was the other star of AMD’s show, and the biggest draw for enthusiasts to the brand this year. It’s been a very long road for consumers and AMD itself, because we first heard about Vega’s existence roughly two years before its architectural reveal earlier this year. That’s a whole twenty-four months we’ve spent aboard the hype train, and it has finally pulled into the station. For those who will make use of all the new features, Vega holds incredible promise to spur on technological advances. For gamers, its value is a little questionable given what we know about the performance of the Radeon Vega Frontier Edition, but AMD does promise that big changes are on the way to improve performance.

For those of you wondering what took AMD so long, the primary reason they’ll admit to, as seen in HardOCP’s interview with Chris Hook, is that they wanted enough stock for launch day for everyone to get a card. They’re clearly worried about the impact of mining operations on the card’s success in the market, and they want gamers to buy it instead.

The Next Compute Unit (NCU) that forms the building blocks of Vega has been redesigned to offer higher throughput for all manner of instructions and data, including double-rate floating-point performance at 16-bit precision. There are 40 new instructions that Vega can process, most of which will be useful for scientific computing.

Interestingly, a few are dedicated to cryptocurrency and data hashing for blockchain technology. AMD is betting that Vega will be used in the future to secure the blockchain, and having dedicated instructions available to improve its performance is now something we can look forward to. Crypto mining isn’t something AMD sees as a growth driver, but blockchain is a whole different ballgame.

In terms of architecture, Vega actually shares a boatload of hardware with the PlayStation 4’s GPU, and it will also directly contribute to game development on the Xbox One. I’ve written countless times before that the consoles designed by AMD turn into test platforms for their future hardware designs on the desktop market, and that’s been repeated here. Optimisations made by AMD first on console now wind their way into the hardware for the Vega architecture, and it’s good that both platforms are now on par.

AMD supports a wide range of console-like optimisations through their GPUOpen program, which offers open-source software and code to support advanced graphical features and more complex physics engines. It seems that it’s also Shader Model 6 compliant, so it’ll be ready for the next generation of games that make use of it.

On a related note about the floating point performance, AMD has partnered up with several game developers to make FP16 calculations take precedence over FP32 where it makes sense, which will bring a big increase in performance for specific workloads. There aren’t many who’ve signed up, but we can look forward to updates to Prey, Far Cry 5, Battlefield 1, and Wolfenstein II. NVIDIA currently isn’t allowing consumer Pascal GPUs to use FP16 calculations at full speed, so they’ll be at a disadvantage in these games once these updates are available.
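
To put the FP16 trade-off in concrete terms, here’s a minimal Python sketch (using NumPy purely for illustration; it has nothing to do with AMD’s hardware or drivers) of why packed 16-bit maths can double throughput: two half-precision values fit in the same 32 bits as one single-precision value, so hardware that operates on packed FP16 can do twice the arithmetic per cycle, at the cost of precision.

```python
import numpy as np

# Two FP16 values occupy the same 32 bits as a single FP32 value, which is
# why "packed math" hardware can issue two 16-bit operations in the slot
# normally used for one 32-bit operation.
two_halves = np.array([1.0, 2.0], dtype=np.float16)  # 2 values, 4 bytes
one_single = np.array([1.0], dtype=np.float32)       # 1 value,  4 bytes
print(two_halves.nbytes, one_single.nbytes)          # prints: 4 4

# The trade-off is precision: FP16 keeps roughly three decimal digits, which
# is fine for colour or lighting maths but not for every calculation.
print(np.float16(1.0) + np.float16(0.0001))  # 1.0 -- the small term is lost
print(np.float32(1.0) + np.float32(0.0001))  # 1.0001
```

This is why developers only swap FP16 in “where it makes sense” – the data has to tolerate the reduced precision before the doubled throughput is worth it.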

In terms of software support, Vega offers the most complete tier of DirectX 12 feature support possible. It’s one step ahead of NVIDIA’s feature support in the Pascal family, and it also supports Standard Swizzle, a method that allows the CPU to access and modify textures held in GPU memory when the GPU is a UMA device. That is mostly applicable to AMD’s upcoming Vega APU, so it’s not going to directly affect gaming performance for desktop RX Vega.

AMD’s Vega architecture also provides initial support for SR-IOV, a feature in both hardware and software that allows a graphics card to dedicate resources dynamically to virtual machines that need GPU acceleration for a particular workload. This sort of virtualisation has typically been an enterprise-only feature, found on cards like NVIDIA’s Tesla line, but AMD saw some benefit to including it by default in every Vega GPU.

This is going to be huge for small-scale scientific computing farms that operate in virtual machines, and it’s going to be a killer feature for gamers who want to use Linux as their base operating system while keeping a Windows 10 virtual machine around for gaming. Instead of mucking about with drivers, device unbinding, and a second GPU, SR-IOV promises to let a system share one graphics card across up to 16 virtual machines. This is going to be industry-changing, and it’s going to be the start of a reduction in Microsoft’s dominance on the desktop for professional use-cases.
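
As a rough illustration of the plumbing involved (this is not AMD’s tooling – the PCI address below is a placeholder, and whether consumer Vega drivers will expose this interface is an assumption on my part), SR-IOV capable PCI devices on Linux advertise a sriov_numvfs attribute in sysfs that controls how many virtual functions get carved out of the physical card:

```python
from pathlib import Path

# Hypothetical PCI address of an SR-IOV capable GPU -- substitute the real
# bus/device/function reported by `lspci` on your system.
GPU = Path("/sys/bus/pci/devices/0000:03:00.0")

def create_virtual_functions(count: int) -> None:
    """Ask the kernel to split the physical GPU into `count` virtual
    functions, each of which can be handed to a separate virtual machine."""
    total = int((GPU / "sriov_totalvfs").read_text())
    if count > total:
        raise ValueError(f"device only exposes {total} virtual functions")
    numvfs = GPU / "sriov_numvfs"
    numvfs.write_text("0")        # must reset to zero before changing the count
    numvfs.write_text(str(count))

if __name__ == "__main__":
    create_virtual_functions(4)   # requires root and an SR-IOV capable driver
```

Each virtual function then shows up as its own PCI device that a hypervisor can pass through to a guest, which is what makes the Linux-host, Windows-guest gaming setup so much simpler than full GPU passthrough.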

Where things get interesting is in the performance numbers AMD was willing to share about power efficiency. The Radeon Vega Frontier Edition did not, as reviewers discovered, have a working version of AMD’s next-generation rasteriser, which would have put it on par with NVIDIA in terms of power efficiency. Instead, it renders triangles in exactly the same way as its predecessor, Fiji, which means AMD allowed Vega to fall back to a slower rendering method that consumes more power.

The story coming out of SIGGRAPH was that the feature, called Draw Stream Binning Rasterisation (DSBR), was not functional and was stripped out of the drivers at launch. It’s not a good look for AMD to only confirm this now, but it’s better late than never. In terms of actual savings, DSBR can cut the work done to render a frame by as much as 33% by culling triangles that aren’t visible or are otherwise hidden, freeing up cache resources and reducing the GPU’s demand for power.
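
For the curious, here’s a toy Python sketch of the general idea behind a binning rasteriser – a conceptual model only, not AMD’s implementation. Triangles get sorted into screen-space tiles first, and the GPU then works one tile at a time, so occluded work can be rejected and each tile’s data stays in on-chip cache.

```python
# Toy illustration of tile-based binning: sort triangles into screen tiles by
# their bounding boxes, then process one tile at a time so the working set
# stays small and hidden work can be rejected per tile.
TILE = 32  # tile size in pixels, arbitrary for this sketch

def bin_triangles(triangles, width, height):
    """triangles: list of ((x0, y0), (x1, y1), (x2, y2)) in pixel coordinates."""
    bins = {}
    for tri in triangles:
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        # Find every tile the triangle's (clamped) bounding box touches.
        for tx in range(max(0, min(xs)) // TILE, min(width - 1, max(xs)) // TILE + 1):
            for ty in range(max(0, min(ys)) // TILE, min(height - 1, max(ys)) // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

def shade_tile(tx, ty, tris):
    # Stand-in for the actual per-tile rasterisation and shading work.
    print(f"tile ({tx},{ty}): {len(tris)} triangle(s)")

def render(triangles, width, height):
    for (tx, ty), tile_tris in bin_triangles(triangles, width, height).items():
        # A real binning rasteriser would also cull triangles hidden behind
        # others inside the tile before shading -- that rejection is where the
        # savings in shading work and memory bandwidth come from.
        shade_tile(tx, ty, tile_tris)

render([((5, 5), (60, 10), (20, 70))], width=128, height=128)
```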

As far as power efficiency goes, AMD is still behind NVIDIA. Vega creeps up a little on the GTX 1080 Founders Edition when running with a 220W TDP, and almost matches NVIDIA’s design when running with a 150W TDP. However, given that lower TDPs for Vega result in a slower card, this isn’t much of an advantage when it comes to performance benchmarks.

In terms of actual game performance, AMD provided only sparse data, but there is some stuff to go on. If you have a look at AMD’s “Gaming Zone” table, it lists some minimum frame rate numbers for Vega. Most of the time it’s within the range of what NVIDIA’s GTX 1080 can muster at 3440 x 1440 (which means, yes, AMD isn’t comfortable sharing 4K numbers just yet), but some games show it definitively trailing behind, by as much as ten frames per second in the case of Forza.

And these are just minimum framerates, mind you. The averages are going to be higher, and I think both cards will be pushing more frames than what the monitor can display. This is how AMD is justifying buying into the Vega experience – that it is cheaper than the competition once you throw in a FreeSync monitor to complete the package.

As far as pricing goes, things are looking okay. AMD is pricing the full-fat, 64 NCU RX Vega GPU at $499 (approximately R6,600). Of course, it’ll never be that cheap locally, and we’re looking at price parity with the GeForce GTX 1080 at around R9,000 to R12,000. Vega 56, the cut-down version, is priced at $399 to combat the GeForce GTX 1070. That will still be somewhat affordable locally, but it may be at just as much risk of being snapped up by miners as NVIDIA’s cards are now.

To help spur on sales, AMD is offering bundles with RX Vega. The Red Pack will include RX Vega and two free games, Prey and a pre-order for Wolfenstein II. That’s more likely to be offered locally than the other packs, which are region specific and not available in South Africa. The Black Pack gives you $100 off the purchase cost of a Ryzen 7 1700X processor and a qualifying motherboard, while the Aqua Pack includes the liquid-cooled Vega 64 reference design along with a $200 off coupon for a qualifying FreeSync monitor.

As a reminder for what’s coming next, AMD will have three RX Vega GPUs at launch – Vega 64 water-cooled, Vega 64 air-cooled, and Vega 56 air-cooled. The cards all have different power requirements and performance profiles, and it’ll be interesting to see how much of a performance gap exists between Vega 56 and Vega 64, given the sizable difference in TDP and hardware units between them. All three cards might also be more capable of meeting their boost clock speeds than previous cards, which is why AMD is now supplying base and boost clocks instead of a “typical” engine clock.

Additionally, we can’t expect Vega to be competitive against Volta or the existing GeForce GTX 1080 Ti. AMD only plans to address NVIDIA’s new GPUs and the top performance tier in 2018. For now, Vega has to exist with NVIDIA holding the performance crown at the top end of the market, and that’s just how the cookie has crumbled. AMD will still be successful in their sales, no doubt, but NVIDIA won this round quite convincingly. If the next generation of Volta cards is coming in late 2017, which is a possibility, then AMD will have to spend a lot of time on driver optimisations and hope that Vega remains competitive over the course of the next year.

And one more thing

Although AMD is not talking about it in detail, they also had Tim Sweeney whip out a Radeon RX Vega Nano on stage during their presentation. This is the successor to the Radeon R9 Nano, which sold really well because it could be overclocked to run on par with the much more expensive Radeon R9 Fury X. The new card is the same size as the original Nano and features the same cooler and shroud design, only this time it includes an LED-lit Radeon RX cube on the edge.

Looking closer at the pictures, it also has two 8-pin power connectors as well as three DisplayPort 1.4 outputs and one HDMI 2.0 output on the back. Those connectors suggest it could draw as much power as the RX Vega 56 despite the much smaller heatsink, although AMD’s representatives over the weekend claimed that they were targeting a 150W power draw. If you scroll up a bit, you’ll see that this is the card they used to test their efficiency against Pascal. The Nano is probably going to be popular with system builders who want to use it in an ITX chassis, or with anyone who wants to save some money instead of buying the RX Vega 64.

AMD expects Radeon RX Vega to be on store shelves in the middle of August 2017, and they’re aiming for a launch on the 14th.