Blue Vision Labs (acquired by Lyft)
I held a variety of titles, ultimately ending as a Technical Program Manager at Lyft. My role started at Blue Vision Labs, where I led the team responsible for testing, improving, and scaling data collection efforts across the U.S. and the U.K. After Lyft acquired Blue Vision Labs, my role shifted to redesigning the sensors used to ingest data, optimizing the end-to-end data ingestion pipeline, and figuring out how to scale data collection across the U.S. quickly and cheaply.
Problem Statement
Blue Vision Labs, acquired by Lyft in 2018, developed precise collaborative augmented reality software. The underlying technology also enabled cheap, scalable, and accurate localization and mapping, useful for autonomous vehicles of all kinds. The primary challenge was integrating this technology into Lyft's Level 5 self-driving division to build city-scale 3D maps, understand human driving patterns, and improve simulation tests. The goal was to leverage data collected from Lyft's rideshare drivers and autonomous vehicles to accelerate the development of Lyft's autonomous vehicle technology.
Initiation & High-Level Requirements
Requirements included:
- Hardware Redesign: Redesign, prototype, and iterate on the dashcam hardware for the U.S. and U.K. markets.
- Data Integration: Integrate rideshare driver data and autonomous vehicle data to create city-scale 3D geometric maps.
- Real-time Analytics: Enable real-time data collection and analysis for continuous map updates and autonomous vehicle optimization, and resolve any glitches in data transmission.
- Cost Efficiency: Identify and estimate costs involved with sensor redesign, required tooling, and any other additional potential cost sources.
- Scalability: Scale data ingestion technology from a few cities to multiple countries.
- Compliance and Security: Ensure compliance with industry standards and secure data handling, which was crucial given the sensitivity around autonomous vehicles and the need to navigate data-capture rules and GDPR in the U.K.
Development & Implementation
The development of Blue Vision Labs’ integration into Lyft's self-driving program involved:
- Integrating Various Data Sources: Use forward-facing camera sensors in Express Drive vehicles and autonomous vehicles to collect data.
- Developing 3D Maps: Convert driver data into detailed 3D maps of urban environments.
- Implementing Advanced Analytics: Utilize data for simulation tests, trajectory tracking, and optimizing vehicle positioning on the road.
- Ensuring Security: Collaborate with Lyft’s legal team to secure patents for new technologies, ensure data protection, and carefully approach data capture in foreign markets, specifically the U.K.
- Automating Processes: Streamline data collection and map updating processes for efficiency and accuracy (see the pipeline sketch after this list).
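To make that flow concrete, here is a minimal, purely illustrative sketch of how incoming drive logs could move through validation, map-fragment generation, and analytics stages. The `DriveLog` structure, stage names, and checksum field are assumptions made for illustration, not the actual Blue Vision Labs / Lyft system.

```python
# Hypothetical sketch only: DriveLog, the stage names, and the checksum field
# are illustrative assumptions, not the actual Blue Vision Labs / Lyft pipeline.
from dataclasses import dataclass, field
from typing import Callable, Iterable, List

@dataclass
class DriveLog:
    vehicle_id: str
    frames: list                              # raw sensor frames from the dashcam
    metadata: dict = field(default_factory=dict)

Stage = Callable[[DriveLog], DriveLog]

def validate(log: DriveLog) -> DriveLog:
    # Drop frames that arrived corrupted (transmission glitches).
    log.frames = [f for f in log.frames if f.get("checksum_ok", False)]
    return log

def build_map_fragment(log: DriveLog) -> DriveLog:
    # Convert the drive's sensor data into a local 3D map fragment (placeholder).
    log.metadata["map_fragment"] = f"fragment_from_{len(log.frames)}_frames"
    return log

def run_analytics(log: DriveLog) -> DriveLog:
    # Derive trajectory / positioning statistics used downstream in simulation.
    log.metadata["frame_count"] = len(log.frames)
    return log

def ingest(logs: Iterable[DriveLog], stages: List[Stage]) -> List[DriveLog]:
    """Run every incoming drive log through each stage in order."""
    processed = []
    for log in logs:
        for stage in stages:
            log = stage(log)
        processed.append(log)
    return processed

pipeline = [validate, build_map_fragment, run_analytics]
```

The real stages were of course far more involved and distributed, but the ordering mirrors the bullets above: collect and validate data, turn it into map pieces, then run analytics for simulation and positioning.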
Results & Impact
- Operational Efficiency: Enhanced data collection efficiency, enabling continuous and real-time updates to 3D maps and improving simulation realism.
- Cost Savings: Negotiated lower vendor costs across a variety of parts.
- Scalability: Successfully scaled data ingestion technology from 3 cities to 2 countries, vastly increasing geographic coverage.
- Compliance and Security: Ensured secure data handling and compliance with industry standards.
- Enhanced Mapping: Created thousands of miles of up-to-date city-scale 3D maps, enabling large-scale, high-quality map generation. This effort led to the creation of the largest open-source autonomous vehicle dataset at the time, underpinned Lyft's HD spatial semantic mapping, and benefited both Lyft and the wider industry.
End-to-end Process
Raw camera footage is captured via a custom-built 'dashcam' secured inside hundreds of rentable Lyft rideshare vehicles.

Using camera footage, accelerometer data, gyroscopic data, GPS data, ambient light levels, and barometer data, we would build a 3D point cloud. The resulting point cloud for San Francisco is shown further below.
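As a rough illustration of that step (not the actual Blue Vision Labs reconstruction pipeline), the sketch below shows the basic idea: per-frame pose estimates, derived from GPS and inertial data, transform camera-derived points into one shared world frame, and the frames accumulate into a city-scale cloud. The `SensorFrame` fields and `frame_to_world` helper are hypothetical simplifications.

```python
# A minimal, illustrative sketch (not Blue Vision Labs' actual pipeline) of how
# per-frame sensor readings can be fused into a single world-frame point cloud.
# SensorFrame and frame_to_world are hypothetical stand-ins.
from dataclasses import dataclass
import numpy as np

@dataclass
class SensorFrame:
    points_camera: np.ndarray   # (N, 3) points triangulated from camera footage, in the camera frame
    gps_position: np.ndarray    # (3,) GPS fix converted to a local metric frame
    orientation: np.ndarray     # (3, 3) rotation estimated from gyro/accelerometer fusion

def frame_to_world(frame: SensorFrame) -> np.ndarray:
    """Transform a frame's local points into the shared world frame."""
    # World point = R * local point + GPS-derived translation (row-vector form).
    return frame.points_camera @ frame.orientation.T + frame.gps_position

def build_point_cloud(frames: list[SensorFrame]) -> np.ndarray:
    """Accumulate every frame's points into one city-scale cloud."""
    return np.vstack([frame_to_world(f) for f in frames])
```

The hard part in practice is estimating those poses and triangulated points accurately at city scale, which is what Blue Vision Labs' localization and mapping technology addressed.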

The above map data has a host of applications. For example, the map data can be used to predict autonomous vehicle trajectories in various scenarios. (See below.)
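As a toy illustration of that idea (not the prediction models actually used), the sketch below advances a vehicle along a map lane centerline at constant speed to produce a future trajectory. The `lane_centerline` input, `speed_mps`, and the constant-speed assumption are all hypothetical simplifications.

```python
# Hypothetical sketch: use map lane geometry to roll a vehicle's position
# forward in time. The constant-speed, follow-the-centerline assumption is an
# illustrative simplification, not the production prediction model.
import numpy as np

def predict_along_lane(lane_centerline: np.ndarray,
                       start_idx: int,
                       speed_mps: float,
                       horizon_s: float,
                       dt: float = 0.1) -> np.ndarray:
    """Predict future x/y positions by advancing along the (M, 2) lane
    centerline at a constant speed, starting from point start_idx."""
    # Cumulative arc length along the centerline ahead of the vehicle.
    seg = np.diff(lane_centerline[start_idx:], axis=0)
    arclen = np.concatenate([[0.0], np.cumsum(np.linalg.norm(seg, axis=1))])
    # Distance the vehicle will have traveled at each future timestep.
    travel = speed_mps * np.arange(dt, horizon_s + dt, dt)
    # Interpolate x and y independently against arc length.
    xs = np.interp(travel, arclen, lane_centerline[start_idx:, 0])
    ys = np.interp(travel, arclen, lane_centerline[start_idx:, 1])
    return np.stack([xs, ys], axis=1)
```

Real scenario prediction layers far more on top (other agents, signals, driver behavior models), but the map geometry is the backbone that makes those predictions tractable.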

Lessons Learned
- You can always be more resourceful: Utilizing the widespread presence of Lyft's rideshare vehicles as a mobile data collection network provided a cost-effective and scalable way to gather valuable data for autonomous driving development. Another fun example: we iterated on the first few camera re-designs with spare parts in the garage.
- When it comes to compliance & security, slow down: In 2018/2019, the market was abuzz with excitement about autonomous vehicles, and venture capitalists were eager to fund various experiments, prioritizing speed. However, navigating international and national privacy laws surrounding data collection from camera footage and in-cabin/Lyft vehicle tracking demanded a nuanced and detailed approach.
- Where possible, form strategic partnerships: Forming strategic partnerships with hardware providers and other technology firms facilitated access to cutting-edge technologies and resources, accelerating development timelines and reducing costs. 'Vertical integration' and 'do everything in house' can make sense, but it is a case-by-case decision.
- Recursively test in the real world: Emphasizing real-world testing and validation of the technology in diverse environments helped uncover edge cases and refine the algorithms for better performance in varied conditions. For example, we had to iterate on one sensor casing material to find a cheap but strong enough plastic that wouldn't melt in the sun or superheat the components inside the sensor, while keeping per-unit costs low. Conversely, we had to change the light sensor to better capture low-light conditions when cars drove through tunnels.
- Leverage cross-disciplinary expertise: Leveraging expertise from multiple disciplines, including computer vision, machine learning, and automotive engineering, was crucial for developing a comprehensive and effective data collection solution. Many smart minds made for light(er) work!
Conclusion
Blue Vision Labs played a crucial role in advancing Lyft’s self-driving technology. The integration of advanced augmented reality software, real-time data collection, and 3D mapping significantly improved operational efficiency, reduced costs, and enhanced the safety and navigation capabilities of Lyft’s autonomous vehicles. This project underscored the importance of cross-functional collaboration, scalable design, and proactive problem-solving in achieving technological innovation and operational success.
[edit: Toyota via Woven Planet has since acquired Lyft Level 5 and all of its IP.]