If AI Is The New Electricity, Who Is The Samuel Insull?

by Rob May, January 6th, 2019

Note: This originally appeared in the weekend commentary section of my InsideAI newsletter.

Andrew Ng has repeatedly compared AI to electricity (https://www.gsb.stanford.edu/insights/andrew-ng-why-ai-new-electricity), positioning AI as a technology that will be everywhere, and in everything. But if you study the history of electricity adoption in the United States, it was not always on track for ubiquity. With all the attention paid to Tesla, Edison, Westinghouse, and others, the real brains behind the mass adoption of electricity was Samuel Insull (https://en.wikipedia.org/wiki/Samuel_Insull).

Insull realized that electricity could be everywhere only if it was cheap, and making it cheap required building larger generating plants with better economies of scale. He was a Jeff Bezos-style entrepreneur, always trying to lower unit prices. That didn't always sit well with others, but it worked in the long run: the more the per-unit price of electricity dropped, the more of it households used. From that perspective, Insull was very successful (though he eventually ran into trouble over his financial holding companies).

I bring up Insull because, at the moment, I don't see AI positioned in a way that lets it mirror the adoption pattern of electricity, despite the constant comparisons. Right now, AI is very application specific. So I want to ask: what changes need to be made for AI to be adopted the way electricity was, and which person (or company) will be its Samuel Insull?

To really be "the new electricity," AI needs to be more generic. Someone has to come up with a way to provide Intelligence-As-A-Service. This isn't picture classification as a service, or sentiment analysis as a service, or chatbot dialogue as a service; this is general intelligence as a service. Until we figure that out, I don't see how we can parallel electricity adoption.

Once that happens (if it does, and I could argue that it won’t, at least for a very long time, in another post), we have to think about where the economies of scale are for intelligence, then architect our systems in ways that take advantage of those. Let’s think through what that might mean.

Economies of scale usually come from high fixed costs that get spread across many units of output. In the case of AI, we could start by looking at labeled data. But data doesn't really provide economies of scale; it acts more as a barrier to entry. It can be hard to get, but once you have it, it's very valuable only until you hit the point of diminishing marginal returns, where more data doesn't improve your models much. To draw a parallel to electricity, data might be the coal needed to power a generator: having coal is helpful, but it isn't your advantage. In a world of AI as electricity, I'm not sure data is the advantage either.
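
To make the diminishing-returns point concrete, here is a toy sketch in Python. The saturating curve and all the constants are illustrative assumptions, not measurements from any real model:

# Toy illustration of diminishing marginal returns on labeled data.
# The curve shape and constants are made-up assumptions for illustration.
import math

def model_quality(num_examples, ceiling=0.95, scale=100_000):
    """Quality climbs quickly at first, then flattens as data grows."""
    return ceiling * (1 - math.exp(-num_examples / scale))

for n in (10_000, 100_000, 1_000_000, 10_000_000):
    print(f"{n:>10,} examples -> quality {model_quality(n):.3f}")

The first order of magnitude of data buys most of the quality; the next two buy almost nothing, which is why data behaves like a moat rather than a scale advantage.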

The next place to look is inference, the point at which an AI makes a decision. There is a computational cost to inference, and lowering that cost lets you do more and more of it. I believe inference is constrained by hardware right now, which is what drove my investments in Mythic and Rain. AI hardware innovation is going to help here significantly. So the next wave that moves us toward the AI-as-electricity world is dropping the per-unit cost of inference.

The place to look after that might be training, and by training I don't mean training a neural net. The current model of training is far too targeted to provide the generic benefit the AI-as-electricity framework needs. At some point, I think training will become a more generic process that includes humans training machines, and machines learning by reacting to a broad-based environment the way humans do, not just to narrowly targeted applications. Broad-based training is where the real economies of scale are: train a thing once and let it execute that training as many times as the world needs, across all kinds of applications.

Once we hit that point, you can see how scaling a large training business would drop the per-unit cost of intelligence. If you think of intelligence as performing "one smart process," the cost of that process could keep falling through a combination of hardware-driven drops in inference prices and human-plus-software training costs, which are largely fixed, being spread over many units of intelligence.
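
As a rough illustration of that arithmetic (a back-of-the-envelope sketch with made-up numbers, not real pricing), think of the per-unit cost of intelligence as a fixed training cost amortized over inference volume, plus a per-inference hardware cost:

# Back-of-the-envelope model of the per-unit cost of intelligence.
# All dollar figures and volumes are illustrative assumptions.

def unit_cost(training_cost, inferences, cost_per_inference):
    """Fixed training cost amortized over inference volume, plus per-inference cost."""
    return training_cost / inferences + cost_per_inference

# Train once, run many times: scale drives the amortized share toward zero.
for volume in (1e6, 1e9, 1e12):
    print(f"{volume:.0e} inferences: ${unit_cost(1_000_000, volume, 1e-4):.6f} each")

# Cheaper inference hardware attacks the second term; training scale attacks
# the first, so both levers push the per-unit price of intelligence down.

At small volumes the amortized training cost dominates; at large volumes the price converges on the cost of inference itself, which is the Insull-style dynamic the comparison to electricity depends on.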

In his book "The Age of Em," economist Robin Hanson pointed out that the cost of intelligence could eventually drop close to the cost of electricity, so the speed and length at which you want to run it would determine the price you pay more than the task you are performing.

We are a long way from this view of intelligence-as-a-service, and many technical hurdles have to be cleared before we get there. But the AI world is moving fast, and as an investor I'm keeping my eye out for the companies that could drive this view of the world. I'm certainly hoping to invest in the next Samuel Insull, and to help drive the price of intelligence down to the point where it is embedded everywhere.