AI is on the rise. In some ways it was always inevitable. But ask any researcher who suffered through the 1990s in AI research and they might not agree. AI, and neural networks in particular, was considered a backwater for decades. If you wanted a dead-end career, you went into neural nets. In the 1990s, one of the leading thinkers behind neural networks, Geoffrey Hinton, could barely get funding. Nobody came to his classes. He worked on his ideas in isolation.
So few people have deep backgrounds in AI now that big companies like Google and Facebook can’t find enough of them or hire them fast enough. Top AI researchers are making more money than pro athletes. The shortage will continue for some time, since people just weren’t studying this stuff. That leaves the kids who skip college and figure it out for themselves with open source tools to pick up the mantle. Frankly, that’s probably the best bet nowadays anyway.
Geoffrey Hinton and his colleagues always knew the math behind the algorithms would work. They just didn’t have the hardware to make it a reality. It would take decades, and the rise of massive clouds, to make it possible to run the huge calculations behind neural networks.
And yet, even that wasn’t enough. It took the ascension of a different industry to really allow it to take off: Video Games.
The relentless acceleration of graphics processing power led to ever more parallel processors to deal with the constant calculations of drawing more and more photorealistic graphics in real time. Now computer scientists are tapping banks of those GPUs to make the giant matrix calculations of AI work faster and faster.
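To make that concrete, here’s a minimal sketch of the kind of math we’re talking about. It assumes PyTorch and an Nvidia GPU are available, and the matrix size is arbitrary; it just times the same big matrix multiply, the core operation of a neural network layer, on the CPU and then on the GPU.

```python
# A minimal sketch (assuming PyTorch and a CUDA GPU) of why GPUs matter here:
# the same matrix multiply that drives a neural network layer runs in parallel
# across thousands of GPU cores instead of a handful of CPU cores.
import time
import torch

def time_matmul(device: str, size: int = 4096) -> float:
    a = torch.randn(size, size, device=device)
    b = torch.randn(size, size, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # make sure setup has finished before timing
    start = time.time()
    _ = a @ b  # one giant matrix multiplication
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the GPU to actually finish
    return time.time() - start

print(f"CPU: {time_matmul('cpu'):.3f}s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f}s")
```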
All right, here you go.
Fifteen years ago, I proposed a revolutionary concept to my friend, Chris Dixon. He went on to become one of the top angel investors in Silicon Valley, but at the time he was just getting his start in investing, working at a company called Bessemer Venture Partners and starting his first company, which I built the Linux servers to power, and which was eventually purchased by McAfee and became Site Advisor. I was playing a lot of games back then, as many young techies tend to do. Gaming rigs had graphics cards and were even starting to get physics cards, but they seemed to be missing the brains of the operation: AI. Computer opponents were dumb and easy to defeat. They couldn’t think like people. So I said:
“We could accelerate pathfinding and any other calculations that make the enemies smarter.” He went away and did some research. After a bit he came back and said, “I think this is a great idea. Let’s do it.”
And you know what happened?
I chickened out. I didn’t think I was old enough, smart enough, or tough enough to make it a reality. In short, I was a coward.
I don’t look back in life with regret very often, but this is one of those moments where I had the chance to change my life and the world forever, and I failed. When I think about how much further along this industry would be right now, I’m really sad. We might already be close to artificial general intelligence (AGI). We almost certainly would’ve had Siri, smartphones and AlphaGo a decade earlier.
But with that said, I don’t regret it too much. In hindsight, it was the right decision. I wasn’t really meant to do that or I would have had the courage and monomaniacal focus to make it happen. That doesn’t happen for many things in life and it did not happen for me there. Without that kind of relentless focus it’s impossible to bring that kind of radical idea to life.
So there you have it. My focus is not tech companies anymore, it’s writing. That’s what I really want to do. That is my monomania. So maybe it wasn’t a mistake. Maybe it was just the universe saving me from myself. The universe has a way of incorporating our “mistakes” along with our “successes” into who we are today. It all works out in the end.
Frankly, it’s amazing to me that nobody else thought of this. Usually, I think of something and a few days later find out someone has been working on it for years or it already exists.
That said, the industry is starting to move in this direction, so get rolling fast. A bunch of companies are focusing on building AI silicon, but mostly for business apps. Fathom just released a USB neural compute stick, but it’s not really for games even if it is cool. Nvidia just built a card designed for AI, with a suggested price of $299 (but you can only buy them in quantities of 1,000 or more). The open source software out there is a great place to get started. If you want to build an ASIC, find some designers in China and see if you can accelerate those open source tools.
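Just to show how approachable the open source side is, here’s a minimal sketch of the kind of thing I mean: a tiny neural network that an in-game opponent could use to pick its next action. It assumes PyTorch, and the feature count, action list, and layer sizes are all made up for illustration, not any real game’s API.

```python
# A toy "enemy brain": a tiny neural network that scores possible actions
# from a handful of game-state features. Everything here (feature count,
# action names, layer sizes) is an illustrative assumption.
import torch
import torch.nn as nn

ACTIONS = ["attack", "flank", "take_cover", "retreat"]

class EnemyBrain(nn.Module):
    def __init__(self, n_features: int = 8, n_actions: int = len(ACTIONS)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, n_actions),
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.net(features)  # one score per possible action

brain = EnemyBrain()
state = torch.randn(1, 8)  # stand-in for real game-state features: distance, health, ammo...
scores = brain(state)
print("Chosen action:", ACTIONS[scores.argmax().item()])
```

A dedicated AI card is just this idea scaled up: run that kind of inference for hundreds of agents at once without stealing cycles from the graphics pipeline.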
The calculations video games need are what will push this to the next level, and they’re different from the math behind machine vision or self-driving cars: a game has to make decisions for dozens of agents in real time, every frame, not crunch through huge batches of training data offline.
There’s still time to take this idea and run with it, but believe me, someone is already out there trying to make it happen.
Beat them to it.
So what do I want for the idea? Nothing. (Well, if you want to give me an honorary position at the company and pay me for hanging around and drinking from your water cooler, I’m game.) But seriously, I ask for nothing, only that one day you say it was me who inspired you.
AI game cards. This idea is officially open source.
Now go make me proud.
A bit about me: I’m an author, engineer and serial entrepreneur. Over the last two decades, I’ve covered a broad range of tech, from Linux to virtualization and containers. You can check out my latest novel, an epic Chinese sci-fi civil war saga in which China throws off the chains of communism and becomes the world’s first direct democracy, running a highly advanced, artificially intelligent decentralized app platform with no leaders. You can also check out the Cicada open source project, based on ideas from the book, which outlines how to make that tech a reality right now, and you can get in on the beta.