These Technologies Will Shape The Future. How AI, mixed-reality, cryptocurrencies, and autonomy will play out over the next 10 years.
Exclusive: Andreessen Horowitz analyst Benedict Evans explains how AI, mixed-reality, cryptocurrencies, and autonomy will play out over the next 10 years. Dominant companies always look invulnerable–until they’re very vulnerable.
In his presentation, Evans likens the evolution of new technologies to the way a skyscraper rises into the sky. In the beginning, he says, there’s a hole in the ground and very little visible activity. But one day you walk by and the building’s frame has gone up. Then there’s another long period with little obvious advancement until, one day, it’s finished. The analog for tech innovation is that the pit is when the technology doesn’t quite work; the frame going up is when the tech is still finding product-market fit; and the building being done is when you’re “pouring…rocket fuel” on the tech’s market reach. And that’s how he sees the four areas he’s called out–each at a different point along that evolution: autonomy is “down in the muddy hole in the ground”; mixed-reality is just the building’s frame going up; cryptocurrencies have the frame up, but the facade is still being worked on; and AI is the finished tower looking for tenants.
S CURVES
To Evans, the tech industry has a long history of what he calls “S curves”: a slow start, rapid and steep growth, then slow maturation. Today’s S curve is the mobile internet; the previous one was the PC internet. In each case, as the curve matured, the question became less about the technology itself and more about what could be built on top of it. That’s where we are with the mobile internet–ride-sharing, Instagram, Instacart, and everything else we can do with our phones today. The thing is, though, when we get to that point in the curve, people always start to say the platform is dead, and “What have you done for me lately,” Evans argues. And these predictable evolutions and reactions belie the fact, he tells Fast Company, that “these things are not overnight revolutions. Yes, the iPhone launches and people can look at it and say this is amazing, but it has taken 10 years to get to the point that the majority of the population has got something like that….Most people think that you’ll have a fully autonomous…car in a 5-to-10-year view, but there’s a long time between the first such car being sold, and all cars that are sold being autonomous, and then a long time from all that to all cars on the road being autonomous.”
“THE BLACK MONOLITH”
In his presentation, Evans says that as the technology that’s furthest along, AI is the easiest to talk about. He also says that calling it “AI” is probably unhelpful. “It feels like someone’s put the black monolith in the beginning of 2001 into the room,” Evans says, “and we’ve all turned into monkeys and we’re dancing around and screaming at it. We don’t actually know what it is.” Instead, it’s better to call the technology “machine learning”–as many people do–and to talk about how, at its core, it enables the discovery and leveraging of new patterns–things you want to know the answers to, or even things you didn’t know you didn’t know–and how it enables new forms of automation. The trick is not to fall for the idea that automation means wide-ranging capabilities. Rather, Evans says, the benefits of machine learning-driven automation are very specific new capabilities–and, more importantly, the opportunity to build many new companies around single-purpose verticals. Automation also delivers a massive multiplier effect by doing the small tasks that thousands of people could do–like looking for patterns in images. Done by one person, that’s not all that powerful. Done at the scale of thousands, it’s huge. What if, Evans asks, machine learning could be used to examine imagery of what everybody in a certain city was wearing over the last few months? What kind of traffic analysis could you do if you could automatically count everyone who goes through a busy subway station, all the time? What new patterns, and thus applications, could come out of that?
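Evans’s subway-station example can be made concrete with a toy sketch. Assume some hypothetical vision model already emits a person count per camera frame (simulated below); the “pattern discovery” Evans describes is then just aggregation at a scale and constancy no human team could sustain. This is an illustrative sketch, not anything Evans presented:

```python
from collections import defaultdict

def busiest_hours(detections, top_n=3):
    """Roll per-minute person counts up into hourly totals.

    detections: list of (minute_of_day, count) pairs, e.g. the output
    of a (hypothetical) people-detector run on station camera frames.
    Returns the top_n (hour, total) pairs, busiest first.
    """
    hourly = defaultdict(int)
    for minute, count in detections:
        hourly[minute // 60] += count
    # Rank hours by total footfall; sorted() is stable, so ties keep
    # their original (chronological) order.
    return sorted(hourly.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Simulated day: heavy traffic during the 8am and 6pm hours, light otherwise.
sim = [(m, 50 if m // 60 in (8, 18) else 5) for m in range(24 * 60)]
print(busiest_hours(sim, top_n=2))  # → [(8, 3000), (18, 3000)]
```

The point of the sketch is Evans’s multiplier argument: the per-frame task is trivial, but running it continuously across every minute of every day surfaces a pattern (the commute peaks) that only emerges in aggregate.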
CHANGING ENTIRE CITIES
Although it’s the furthest from changing the world, Evans touts the broad potential impact of autonomy. When the day comes, he says, that cars, buses, and other vehicles no longer need drivers, it’ll be possible to completely re-imagine what those vehicles can be and, even better, re-imagine the world in which they move. Without drivers, you can probably have more cars on the roads. There will be almost no accidents as the vehicles move in tandem, always aware of each other, and that will mean different kinds of roads. That, in turn, can lead to all-new urban design–with no need to provide parking spaces, no congestion, dynamic road pricing, and a totally different dynamic around where people live, shop, eat, drink, and so on. It’s not easy to predict exactly where this will go, or what the opportunities are, Evans suggests, but they could be huge. The key is to think about the application you want to build and whether it’s necessary to look a decade into the future. “It depends on what timeline you’re operating on,” Evans says. “If you’re building public infrastructure or planning a city, then [thinking years ahead] absolutely needs to be something that figures into your thinking. If you are planning your TV schedule, you probably shouldn’t yet be thinking about what people will watch if they don’t have to look at the road.”
MIXED-REALITY
Still in its early stages, mixed-reality is Evans’s third area of massive opportunity–what happens if you wear a computer that can see? Today, he says, we’re at the stage of working prototypes and a few very early products–see Meta, or Microsoft’s HoloLens, for example–but in general, there’s not much yet available. The potential, though, is clearly there. Look at Magic Leap, which has raised nearly $1.9 billion in funding for a headset few have seen, but which is said to have impressive power to place virtual things in our real world.

Evans seems less interested in the idea of things like recipes hovering in front of our stoves while we cook, and more in the power of computers that can see, and interpret, what’s going on around us. For example, telling you who someone you’re meeting is, where you last encountered them, and whether you should want to talk to them. Or telling you instantly if a product you’re holding is available cheaper online. Or any number of other powerful tools that could evolve as the technology progresses.
CRYPTO
In his presentation, Evans says less about the future opportunities of cryptocurrencies than he does about the other three technologies. Still, he calls out their power in a couple of meaningful ways. First, he says, cryptocurrencies allow for the distributed storage of value without the need for a central authority–like the governments that manage traditional money. That’s important in a world where people are losing trust in such institutions and want more control over their money without governmental intrusion. Second, he argues, the records that make up cryptocurrencies can be programmed in ways that were never possible before, and used in all-new ways. Those opportunities, presumably, will materialize over time.

In the meantime, Evans doesn’t foreclose on other technologies presenting major opportunities. Rather, he focuses on the four identified above because of the way they each leverage the idea of automation, and because they seem to evolve naturally from today’s still-extant S-curve technology, the mobile internet.

The real opportunity, though, will lie in seeing what no one else sees, hidden in plain sight. Something like Airbnb had never been thought of before, despite not representing a huge technological advance. But until Brian Chesky, Joe Gebbia, and Nathan Blecharczyk came along, finding vacation rentals was far harder than it is today. “In 2007, we were thinking what might the smartphone be, and now we know and we’ve seen what [we] can do on top” of it, Evans says. Now “we’re thinking what might an autonomous car be, what might machine learning be, what might mixed-reality be, and in 20 years’ time, we’ll know, and we’ll be building stuff on top of them.”