2016 was particularly challenging for so many – the world, the arts, the tech industry, and for me personally. It was a year characterised by sudden surprises, gut-wrenching heartache, and tremendous disappointment; and yet, at the same time, some incredible things happened. Just the other day, we captured part of the light spectrum from antimatter, and NASA determined that the EmDrive sort of works (they just don’t know why yet). We took good pictures of Pluto. We detected gravitational waves. This year was totally cash for scientists everywhere.

In computing, we’ve seen incredible leaps in storage performance, and manufacturers are moving over to triple-level cell (TLC) 3D NAND memory. AMD has more than a handful of laptops on sale through its partners. Intel finally gave up the wild goose chase of trying to get Atom into the mobile market. We’re getting new CPU platforms this year, and DDR4 is finally cheap! Let’s look back at 2016 in tech, and ponder what 2017 has in store for us all.

Breakthrough storage speeds

I’ll start with the best part – solid state drives today are around eight times faster than their predecessors from just a few years ago. Comparing a Samsung 960 Pro to the old Intel X25-M is almost unfair; the gap is simply too enormous. The adoption of PCI Express-driven storage has allowed manufacturers to bypass the limits of the SATA specification and keep making faster and faster flash memory and controllers.

It’s even spurred some much-needed advancements in chipset design from Intel and AMD to increase the amount of bandwidth available for storage devices – this year, there will be NVMe-based SSDs consuming up to eight lanes of PCI Express connectivity (beyond the four lanes the M.2 form factor currently provides), opening up the potential for a drive that reads at speeds of eight gigabytes per second. Both companies have prepared for this eventuality, and it’s going to be glorious to see how low we can drive latency. Flash memory is also getting so cheap that people have begun replacing the secondary hard drives in their rigs with terabyte-sized SSDs.
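
As a rough sanity check of where those figures come from, here’s a back-of-the-envelope calculation. The per-lane and SATA numbers below are approximate, commonly cited ceilings rather than vendor specs, and the snippet only exists to show the arithmetic:

```python
# Rough throughput ceilings for storage interfaces.
# Assumes ~985 MB/s of usable bandwidth per PCIe 3.0 lane (after 128b/130b
# encoding) and ~550 MB/s usable on SATA III (after 8b/10b encoding).
# Approximate figures, for illustration only.

PCIE3_PER_LANE_MBPS = 985   # approximate usable MB/s per PCIe 3.0 lane
SATA3_USABLE_MBPS = 550     # approximate usable MB/s on SATA III

def pcie3_ceiling_gbps(lanes: int) -> float:
    """Theoretical sequential ceiling, in GB/s, for a PCIe 3.0 link."""
    return lanes * PCIE3_PER_LANE_MBPS / 1000

if __name__ == "__main__":
    for lanes in (4, 8):
        print(f"PCIe 3.0 x{lanes}: ~{pcie3_ceiling_gbps(lanes):.1f} GB/s")
    print(f"SATA III:    ~{SATA3_USABLE_MBPS / 1000:.2f} GB/s")
```

Four lanes top out just shy of 4 GB/s, which is why today’s fastest M.2 drives cluster around 3.5 GB/s, and eight lanes is where the eight-gigabyte-per-second figure comes from.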

A side effect of this change is how RAID setups in servers and professional workstations are marketed. Speed and pricing are no longer the critical factors; latency is. Drives are now manufactured and benchmarked according to their consistency over time, which is a great metric for SSD rollouts in web servers and data centres, where the time it takes to access data matters most and where serving up pages with availability measured in multiple nines is a requirement. If you’re scrubbing through 4K footage in Premiere Pro, you want little to no lag when jumping around to make edits. If you’re capturing high-framerate footage, you want latencies short enough to record every frame, instead of having frames skipped or dropped because the array encountered a hiccup.
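
To make “consistency over time” concrete, here’s a minimal sketch of how you’d summarise a drive’s latency by percentile rather than by average. The numbers are simulated purely for illustration; the point is that a drive can post a healthy mean while its 99.9th percentile, the figure that corresponds to stutters and dropped frames, tells a very different story:

```python
# Minimal sketch: summarising latency samples by percentile instead of by
# average. The latencies below are simulated for illustration only.

import random
import statistics

def percentile(samples, pct):
    """Return the pct-th percentile (0-100) of the samples, nearest-rank style."""
    ordered = sorted(samples)
    index = min(len(ordered) - 1, int(round(pct / 100 * (len(ordered) - 1))))
    return ordered[index]

random.seed(42)

# Simulate 10,000 4K random-read latencies in microseconds: mostly quick,
# with the occasional slow outlier (think background garbage collection).
latencies_us = [random.gauss(90, 10) for _ in range(10_000)]
for i in range(0, 10_000, 500):            # inject sporadic hiccups
    latencies_us[i] += random.uniform(2_000, 8_000)

print(f"mean:  {statistics.mean(latencies_us):8.1f} us")
print(f"p50:   {percentile(latencies_us, 50):8.1f} us")
print(f"p99:   {percentile(latencies_us, 99):8.1f} us")
print(f"p99.9: {percentile(latencies_us, 99.9):8.1f} us")
```

The mean barely budges, but the 99.9th percentile balloons, and that tail is exactly where a skipped frame or a laggy scrub comes from.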

“Wait for benchmarks”

The biggest change in the tech journalism industry in 2016, at least in the echo chamber I find myself in (a new thing we also discovered this year), is the public’s newfound cynicism towards leaks and benchmarks of questionable legitimacy. The AMD, Intel, and NVIDIA subreddits have slowly shifted towards not taking things too seriously when they’re based on a rumour. Watching the community struggle to hold itself back while waiting for Core i7-6900K benchmarks was hilarious because of how serious the self-policing became. It didn’t stop the deluge of rumours coming out of every forum and website imaginable, but the public took a more critical view of things this time around.

Perhaps the mantra of “no pre-orders!” and “wait for benchmarks!” is slowly working its way through the system and we’ll reach the point where people don’t take companies and their marketing departments at their word. That would be something, eh?

“Maybe we were a bit too aggressive with the upgrade…”

Those words were spoken by Microsoft’s head of marketing, Chris Capossela, in a year-end Windows Weekly podcast that ran a little over two hours. Chris has a reputation for being a bit more open about Microsoft’s thinking behind the scenes, and he admitted that the company’s push to market Windows 10 to consumers, and to try to get as many of them upgraded as possible, was borderline scary at times because of how aggressively they pursued that goal.

He also revealed that the “Billion devices running Windows 10” plan was quite lofty, because they wanted half of that figure to be mobile devices – 500 million tablets and phones running Windows 10 Mobile. That didn’t pan out as planned, because every time Microsoft’s mobile department came up with something cool, it would eventually be shut down because it either handed too much power to the competition or threatened to make Microsoft’s own slice of the pie smaller.

The joke is that their slice got smaller anyway, thanks to a lack of devices and software that people really wanted, and the Lumia 950 and 950 XL are still exorbitantly priced (and currently being phased out). Perhaps they’ll pull themselves out of the doldrums by targeting business customers, but if the waning popularity of BlackBerry phones running Android is anything to go by, they’ll probably stay below 1% market share for a good while longer. Microsoft seems far more comfortable developing Windows 10 Mobile and letting other manufacturers make Windows phones on its behalf, and perhaps things are better off that way.

On the bright side, Windows 10’s development into a more mature platform is still going swimmingly, and the wheels haven’t fallen off the wagon yet for Microsoft’s other products and services (with the exception of Skype). And Xbox is doing great.

Machine learning takes over

There was a well-publicised story that broke in October 2016, detailing how a machine learning experiment run by Google Brain had created its own cryptographic scheme for securing data. A neural net spread across three servers was tasked with the following problem: Server A needed to talk to Server B using data encoded with a process the two of them had to devise between themselves, while Server C had to intercept that traffic and attempt to decode it without access to the secret key held by Servers A and B. While Servers A and B eventually figured things out, Server C could only guess about half of the content of the messages it eavesdropped on.
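
For the curious, here’s a heavily simplified sketch of what that adversarial setup looks like in code. It’s loosely modelled on the experiment described above, but the framework (PyTorch), the network sizes, the architecture, and the loss terms are stand-ins chosen for brevity, not the researchers’ actual recipe:

```python
# A heavily simplified sketch of the adversarial setup described above.
# "alice" and "bob" share a key; "eve" only ever sees the ciphertext.
# Sizes, architecture, and training schedule are illustrative only.

import torch
import torch.nn as nn

N_BITS = 16      # length of plaintext, key, and ciphertext (values in -1/+1)
BATCH = 256

def net(in_dim, out_dim):
    """A small fully connected network mapping bit vectors to bit vectors."""
    return nn.Sequential(
        nn.Linear(in_dim, 64), nn.ReLU(),
        nn.Linear(64, out_dim), nn.Tanh(),   # outputs squashed into (-1, 1)
    )

alice = net(2 * N_BITS, N_BITS)   # sees plaintext + key, emits ciphertext
bob   = net(2 * N_BITS, N_BITS)   # sees ciphertext + key, recovers plaintext
eve   = net(N_BITS, N_BITS)       # sees ciphertext only

opt_ab  = torch.optim.Adam(list(alice.parameters()) + list(bob.parameters()), lr=1e-3)
opt_eve = torch.optim.Adam(eve.parameters(), lr=1e-3)

def random_bits():
    """Random plaintexts and keys encoded as vectors of -1/+1."""
    return (torch.randint(0, 2, (BATCH, N_BITS)).float() * 2 - 1,
            torch.randint(0, 2, (BATCH, N_BITS)).float() * 2 - 1)

for step in range(5_000):
    # Eve's turn: learn to recover the plaintext from the ciphertext alone.
    plain, key = random_bits()
    cipher = alice(torch.cat([plain, key], dim=1)).detach()
    eve_loss = (eve(cipher) - plain).abs().mean()
    opt_eve.zero_grad()
    eve_loss.backward()
    opt_eve.step()

    # Alice and Bob's turn: communicate accurately while keeping Eve at chance.
    plain, key = random_bits()
    cipher = alice(torch.cat([plain, key], dim=1))
    bob_err = (bob(torch.cat([cipher, key], dim=1)) - plain).abs().mean()
    eve_err = (eve(cipher) - plain).abs().mean()
    # For -1/+1 bits, blind guessing gives a per-bit error of about 1.0,
    # so Alice and Bob are penalised whenever Eve strays from that level.
    ab_loss = bob_err + (1.0 - eve_err) ** 2
    opt_ab.zero_grad()
    ab_loss.backward()
    opt_ab.step()

    if step % 1_000 == 0:
        print(f"step {step:5d}  bob_err={bob_err.item():.3f}  eve_err={eve_err.item():.3f}")
```

The tug-of-war lives in the two loss terms: Server C (“eve”) is rewarded for recovering the plaintext from the ciphertext alone, while Servers A and B (“alice” and “bob”) are rewarded both for communicating accurately and for keeping eve’s error pinned at the level of blind guessing.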

The important bits to take away from the study were that the machines used a method that was simple yet unexpected, and that it took tens of thousands of training iterations before the three computers in the neural net could encrypt and decrypt messages using a scheme devised entirely without human intervention. The study’s implications aren’t necessarily applicable to current encryption mechanisms, but they could be useful further down the road.

Brute-forcing encrypted data is time-consuming and takes a lot of power. Having a server attack the problem with a mixture of machine learning and big-data analytics could hold the key to accurately guessing what the contents of an encrypted message are likely to be. Really, the surprising part is that Server C managed to do any better than chance at all while the scheme was still evolving. The question now is whether the same result could be observed against stronger encryption schemes; if it can, we should start thinking about what measures will prevent machine learning from easily breaking into encrypted data.

That’s the scale on which we’re operating now. In 2015, people were mesmerised that an AI could search for patterns in pictures and produce works of art that were exceedingly creepy, depending on what you asked it to look for. In little over a year, we’ve leapt from the intelligence of something almost at the level of an infant to a pre-teen script kiddie coming up with a “my first encryption scheme”. That’s an astounding leap to make. This kind of progress is exponential rather than linear, so we can expect neural nets to improve in leaps and bounds in the coming years.

Before 2020, they’ll be beating us at far more than games of Jeopardy!, Go, or StarCraft.

Looking forward to the future…

While this is only a small selection of the headlining news in tech this past year, it’s been a great one overall. Despite nothing really happening on the CPU front, and the GPU market being in a bit of shock now that laptop GPUs are as fast as their desktop counterparts, a lot went right last year, and it’s been a wild ride. But the ride isn’t over yet.

Starting this month, I’m expecting several twists to the tale that not even M. Night Shyamalan could have written in. We’re going to see AMD’s popularity resurge as it becomes competitive in the CPU market again. We’re going to see Intel take the SSD market by storm with its Optane drives. Virtual reality is now an established and growing market for consoles. Desktop Linux installations could hit 3% market share by the end of next year. Rented machine learning services will be a dime a dozen. The slide in sales of pre-built computers might finally show signs of slowing. Skype for Linux might actually turn out to be useful.

2017 could be anyone’s year for success. I’m excited for it. Are you?