Honda-backed Helm.ai Unveils Vision System for Self-Driving Cars

Helm.ai, a California-based startup backed by Honda Motor, introduced its new camera-based urban environment interpretation system called Helm.ai Vision. The company is negotiating with multiple automakers to integrate its self-driving technology into mass-market vehicles.

Helm.ai is collaborating with Honda to embed the system in the upcoming 2026 Honda Zero series of electric vehicles, enabling hands-free driving that lets drivers take their eyes off the road.

CEO and founder Vladislav Voroninski told Reuters that the company’s business model centers on licensing its software, including foundation models, to automakers. Helm.ai’s vision-first approach mirrors Tesla’s, relying on cameras rather than additional sensors such as lidar or radar, which can add significant cost.

Voroninski acknowledged that Helm.ai’s foundation models can work with other sensors but emphasized that the primary offering remains vision-centric. Industry experts, however, maintain that supplementary sensors such as lidar and radar are vital for safety, especially in poor visibility.

In contrast, robotaxi companies such as Alphabet’s Waymo and May Mobility use a sensor-fusion approach that combines radar, lidar, and cameras for comprehensive environment perception.

Helm.ai has raised $102 million to date, with investors including Goodyear Ventures, Korean auto parts maker Sungwoo HiTech, and Amplo.

The Helm.ai Vision system merges inputs from multiple cameras into a bird’s-eye-view map that supports vehicle planning and control. It is optimized for hardware platforms from Nvidia, Qualcomm, and others, making it easier for automakers to integrate Helm.ai Vision into existing vehicle systems.