Over the past two decades, Raquel Urtasun, founder and CEO of autonomous trucking startup Waabi, has developed AI systems that can think like humans.
The AI pioneer previously served as chief scientist at Uber ATG before launching Waabi in 2021. She founded Waabi around an “AI-first approach” meant to accelerate the commercial deployment of autonomous vehicles, starting with long-haul trucks.
“If you can build systems that can actually do that, you’ll suddenly need a lot less data,” Urtasun told TechCrunch. “You need a lot less computation. If you were able to do the thinking in an efficient way, you wouldn’t need to deploy fleets of vehicles everywhere in the world.”
Building a family of self-driving vehicles using AI that sees the world as humans do and reacts in real time is something Tesla is also attempting with its vision-first approach to self-driving. The difference, aside from Waabi’s reliance on lidar sensors, is that Tesla’s Full Self-Driving system uses “imitation learning” to learn how to drive. That requires Tesla to collect and analyze millions of videos of real-life driving situations to train its AI model.
The Waabi Driver, on the other hand, does most of its training, testing, and validation in a closed-loop simulator called Waabi World, which automatically builds digital twins of the world from data, performs real-time sensor simulation, manufactures scenarios to stress-test the Waabi Driver, and teaches it to learn from its mistakes without human intervention.
In just four years, this simulator has helped Waabi launch commercial pilots (with a human driver in the front seat) in Texas, many of which are being run through a partnership with Uber Freight. Waabi World is also enabling the startup to work toward a planned fully driverless commercial launch in 2025.
But Waabi’s long-term mission is much greater than just trucks.
“This technology is very, very powerful,” said Urtasun, who spoke to TechCrunch via a video interview. “It has an amazing ability to generalize, it’s very flexible, it’s fast to evolve. And it’s something we can expand to include a lot more than trucking in the future… This could be robotaxis. This could be humanoid robots or warehouse robots. This technology could solve any of these use cases.”
The promise of Waabi’s technology — which will first be applied at scale to autonomous trucking — has allowed the startup to close a $200 million Series B round, led by existing investors Uber and Khosla Ventures. Other strategic investors include Nvidia, Volvo Group Venture Capital, Porsche Automobil Holding SE, Scania Invest and Ingka Investments. This brings Waabi’s total funding to $283.5 million.
The size of the round and the strength of its participants are particularly noteworthy given the setbacks the autonomous vehicle industry has suffered in recent years. In trucking alone, Embark Trucks shut down, Waymo decided to pause its autonomous trucking program, and TuSimple closed its U.S. operations. Meanwhile, in the robotaxi space, Argo AI shut down, Cruise lost its permits to operate in California after a major safety incident, Motional cut nearly half its workforce, and regulators are actively investigating Waymo and Zoox.
“You build the strongest companies when you raise money in really difficult moments, and the autonomous vehicle industry in particular has seen a lot of setbacks,” Urtasun said.
However, AI-focused players in this second wave of autonomous vehicle startups have landed significant capital raises this year. UK-based Wayve, which is also developing a self-learning rather than rules-based system for self-driving, closed a $1.05 billion Series C led by SoftBank Group in May. Applied Intuition in March raised a $250 million round at a $6 billion valuation to bring AI to the automotive, defense, construction and agriculture industries.
“In the context of AV 1.0, it is very clear today that it is very capital intensive and very slow to make progress,” Urtasun said, noting that the robotics and autonomous driving industry has been hampered by complex and brittle AI systems. “I would say investors are not very enthusiastic about this approach.”
However, what gets investors excited today is the promise of generative AI, a term that wasn’t exactly in vogue when Waabi launched, but which nonetheless describes the system that Urtasun and her team have created. Urtasun says Waabi is building the next generation of generative AI, one that can be deployed in the physical world. And unlike today’s popular language-based generative AI models, such as OpenAI’s ChatGPT, Waabi has figured out how to build such systems without relying on massive datasets, large language models, and all the computing power that comes with them.
Urtasun says the Waabi Driver has a remarkable ability to generalize. So instead of trying to train the system on every possible data point that exists or could ever exist, the system can learn from a few examples and handle the unknown in a safe way.
“That was by design. We built these systems that could perceive the world, create abstractions of the world, and then take those abstractions and think: ‘What might happen if I did this?’” Urtasun said.
This human-like, reasoning-based approach is more scalable and more capital efficient, Urtasun says. It is also vital for safety-critical systems operating at the edge. “You don’t want a system that takes a few seconds to respond, otherwise you’ll crash into the vehicle,” she said. Waabi also announced a partnership to bring Nvidia’s Drive Thor to its self-driving trucks, which will give the startup access to automotive-grade computing power at scale.
On the road, the Waabi Driver seems to understand when there is something solid in front of it that it must drive around carefully. It may not know what the object is, but it knows how to avoid it. Urtasun also said the driver is able to predict how other road users will behave without needing to be trained on each specific situation.
“It understands things without us telling the system what things are, how they move in the world, that different things move differently, that there is occlusion, there is uncertainty, how to react when it rains heavily,” Urtasun said. “All those things, it learns automatically. And because it’s now exposed to driving scenarios, it’s learning all those capabilities.”
She noted that Waabi’s simplified, single architecture could be applied to other autonomy use cases.
“If you expose it to interactions in a warehouse, picking up and dropping things, it can learn that, no problem,” she said. “You can expose it to multiple use cases, and it can learn how to do all of these skills together. There are no limits in terms of what you can do.”