Mobileye Global Inc. (Nasdaq: MBLY) is an Israeli technology company headquartered in Jerusalem that develops vision-based advanced driver-assistance systems (ADAS) and autonomous driving technology. The company designs the EyeQ family of automotive system-on-chips, develops perception software for cameras, radars, and (until 2024) lidar, and produces a tiered stack of driving products that ranges from entry-level emergency braking up to a Level 4 robotaxi platform called Mobileye Drive. Mobileye chips have shipped in roughly 200 million vehicles worldwide as of the mid-2020s, which makes the company the dominant supplier of camera-based ADAS to the global auto industry.
Founded in 1999 by computer vision researcher Amnon Shashua and serial entrepreneur Ziv Aviram, Mobileye spent its first decade selling a single windshield-mounted vision module that handled lane departure warning, forward collision warning, and pedestrian detection. The company went public on the New York Stock Exchange in 2014, was acquired by Intel in 2017 for $15.3 billion, and returned to the public markets through a Nasdaq listing in October 2022 while remaining majority-owned by Intel. Founder Shashua remains CEO and is also chairman of the large language model company AI21 Labs.
Mobileye traces its origins to academic research at the Hebrew University of Jerusalem, where Shashua, a professor of computer science, worked on the problem of estimating three-dimensional scene structure from a single moving camera. Conventional thinking in the late 1990s held that depth perception in cars required either stereo vision (two cameras) or active sensors such as radar. Shashua argued that a single forward-facing camera, paired with enough computing horsepower and the right algorithms, could detect vehicles, pedestrians, lane markings, and traffic signs reliably enough to power active safety functions like automatic emergency braking.
The company was incorporated in 1999 in Jerusalem with Shashua as chief technology officer and Aviram as chief executive. The earliest funding came from a meeting with an Asian original equipment manufacturer (OEM), which paid for a feasibility demo. By the early 2000s, Mobileye was developing custom silicon to run its computer vision pipelines at the power and cost budgets the auto industry required. The first generation of that silicon, the EyeQ1, entered series production in 2007 in BMW, Volvo, and General Motors vehicles, marking the start of monocular camera ADAS as a mainstream feature in passenger cars.
Mobileye also built an aftermarket retrofit business that sold a packaged camera and warning module under names like Mobileye 5 and later Mobileye 8 Connect. Fleet operators bolted these into trucks, taxis, buses, and corporate cars to add forward collision warning, lane departure warning, headway monitoring, and pedestrian alerts to vehicles that did not roll off the line with such features. The aftermarket unit was always small relative to the OEM business, but it gave the company a steady second revenue stream and a public-facing product.
Mobileye listed on the New York Stock Exchange on July 31, 2014 under the ticker MBLY. The IPO priced 35.6 million shares at $25 each, raising about $890 million and giving the company a market value of roughly $7.6 billion. Shares closed up about 51 percent on the first day. At the time, this was the largest IPO ever by an Israeli company on a U.S. exchange.
In March 2017, Intel announced an agreement to acquire Mobileye for $63.54 per share in cash, valuing the equity at approximately $15.3 billion and the enterprise at about $14.7 billion. The deal closed in August 2017, and Intel folded its existing Automated Driving Group into Mobileye. The combined organization remained based in Israel and continued to be led by Shashua. The deal was the largest-ever acquisition of an Israeli technology company and one of the largest chip-and-software transactions tied to the self-driving car industry.
Intel held Mobileye as a wholly owned subsidiary for about five years. In October 2022, Intel spun the unit back out via a Nasdaq IPO. Mobileye Global Inc. priced its initial public offering on October 25, 2022 at $21 per share, implying a valuation well below the roughly $50 billion Intel had reportedly sought earlier in the year. The offering sold 41 million Class A shares for gross proceeds of $861 million. The stock began trading on the Nasdaq Global Select Market on October 26, 2022 under the ticker MBLY and closed up about 38 percent on its first day, giving the company a market capitalization near $23 billion. Intel kept the Class B shares, which carry ten times the voting power of Class A stock, and retained roughly 88 percent of total ownership.
Intel has since trimmed its position. In 2024 and 2025, the chipmaker offloaded blocks of Mobileye stock totaling more than $2 billion in proceeds as it raised cash to fund its turnaround plan. After these sales, Intel still held a controlling interest, with reported ownership in the low 80 percent range, but had reduced its economic stake by several percentage points.
The EyeQ chip is the technical core of Mobileye. Each generation pairs general-purpose CPU cores with a set of proprietary accelerators tuned to the workloads of automotive computer vision, including image signal processing, dense optical flow, deep learning inference, and structure from motion. Mobileye designed the chips itself, used external foundries (mostly STMicroelectronics, with later generations on TSMC) to manufacture them, and tightly co-designed the silicon with its perception algorithms.
The practical effect of this co-design is that Mobileye can hit ADAS performance targets at single-digit watts of power consumption, which lets the chips sit behind a windshield or in a small electronic control unit without active cooling. The trade-off, often pointed out by competitors and customers, is that EyeQ is largely a black box: OEMs typically buy the chip with Mobileye's perception software preloaded rather than programming it from scratch.
| Generation | First production | Approx. compute | Process node | Typical use case |
|---|---|---|---|---|
| EyeQ1 | 2007 | ~0.0044 TOPS | 180 nm | Single-feature ADAS (lane and vehicle warning) |
| EyeQ2 | 2010 | ~0.026 TOPS | 90 nm | Multi-feature ADAS, traffic sign recognition |
| EyeQ3 | 2014 | ~0.256 TOPS | 40 nm | Full ADAS suite, AEB, partial Tesla Autopilot 1 |
| EyeQ4 | 2018 | ~2.5 TOPS | 28 nm | Hands-on Level 2, surround sensing |
| EyeQ5 | 2021 | ~24 TOPS (DL) | 7 nm | Level 2+ to early Level 3 perception |
| EyeQ6L (Lite) | 2024 | ~5 TOPS | 7 nm | Entry and mid-tier ADAS, single windshield camera |
| EyeQ6H (High) | 2024-2025 | ~34 TOPS (int8) | 7 nm | Level 2+ surround, premium ADAS, SuperVision base |
| EyeQ Ultra | 2025 (samples), 2026 production | ~176 TOPS | 5 nm | End-to-end Level 4 single-chip AV computer |
| EyeQ7, EyeQ8 | In development | not disclosed | next node | Eyes-off Chauffeur and mind-off systems toward 2029-2030 |
The EyeQ Ultra is the most ambitious chip in the lineup. Unveiled at CES 2022 and now sampling on the road to volume production, it integrates four classes of proprietary accelerators alongside CPU cores, image signal processors, and graphics processing units onto a single 5 nm die. It is designed to ingest input from two independent perception subsystems, one camera-only and one radar plus lidar, and run the high-definition mapping and driving policy stack required for consumer-grade Level 4 autonomy.
In 2025, Mobileye disclosed that it had begun design of EyeQ8 to back its long-term "mind-off" roadmap targeted for the 2029 to 2030 window. Company statements describe EyeQ8 as three to four times more powerful than EyeQ7, with EyeQ7 itself slated to power the next wave of eyes-off Chauffeur deployments before the decade ends.
Mobileye's safety architecture for higher-autonomy systems is called True Redundancy. The idea is to build two independent perception stacks that each produce a complete model of the world around the vehicle, then cross-check their outputs rather than fuse raw signals.
The first stack is camera-only. A surround set of cameras (typically eleven on SuperVision and Chauffeur platforms) feeds Mobileye's vision algorithms, which produce a 360-degree environment model that includes road geometry, lane structure, dynamic objects, traffic lights, and signs. The second stack uses imaging radar (and, where present, lidar) to produce a separate 3D environment model. Each stack is filtered through Mobileye's safety policy and is, on its own, capable of driving the vehicle.
The operational logic is that if one stack fails (a software bug, a sensor occlusion, an unrecognized object class), the other can take over. From a safety-validation standpoint, this gives statistical leverage: the probability of both stacks failing simultaneously on the same scenario is the product of two independent failure rates, which can be much lower than fusing them and validating the combined fail rate. Mobileye has argued in its safety architecture papers that True Redundancy reduces the validation burden from "hundreds of millions of hours" associated with classical sensor fusion to "tens of thousands of hours" per channel, because each channel can be validated independently against its own perception accuracy targets.
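The statistical argument above can be sketched in a few lines. This is an illustrative calculation, not Mobileye's actual validation data: it simply shows how two independently validated channels multiply into a much rarer joint failure.

```python
# Illustrative sketch of the True Redundancy validation argument:
# if two perception channels fail independently, the chance of a
# simultaneous failure is the product of the per-channel rates, so
# each channel can be validated against a far looser target.

def joint_mtbf_hours(mtbf_a_hours: float, mtbf_b_hours: float) -> float:
    """Approximate mean time between simultaneous failures of two
    independent channels, treating failures as rare per-hour events."""
    p_a = 1.0 / mtbf_a_hours   # per-hour failure probability, channel A
    p_b = 1.0 / mtbf_b_hours   # per-hour failure probability, channel B
    return 1.0 / (p_a * p_b)   # hours between coincident failures

# Two channels each validated to ~10,000 hours between perception
# failures combine to roughly 100 million hours between joint failures,
# matching the "tens of thousands" vs "hundreds of millions" framing.
print(joint_mtbf_hours(1e4, 1e4))
```

The key assumption, which Mobileye's papers acknowledge, is genuine independence: the two stacks must not share sensors, failure modes, or blind spots, which is why the channels are built on different physical modalities.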
The practical visibility of this architecture changed in September 2024, when Mobileye announced that it would wind down its in-house frequency modulated continuous wave (FMCW) lidar development and close the related R&D group, affecting roughly 100 employees. Management said the decision reflected better-than-expected progress on EyeQ6 vision performance, growing confidence in the in-house imaging radar, and falling prices for third-party time-of-flight lidar units. The True Redundancy concept itself was not abandoned: the company continues to use lidar in production programs, but sources sensors from outside suppliers rather than developing its own next-generation FMCW unit.
Mobileye has been developing a software-defined imaging radar, sometimes referred to in the industry as 4D radar, since 2018. The radar uses a massive multiple-input multiple-output (MIMO) antenna design, a custom radio-frequency front end developed in-house, and a dedicated radar-on-chip processor capable of about 11 trillion operations per second. The processor digitally samples the entire radar return rather than relying on classical analog filtering, which the company says allows it to maintain more than 1,500 virtual channels at 20 frames per second and to track many more objects with finer angular resolution than conventional automotive radar.
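The virtual-channel figure follows from basic MIMO geometry: each transmit-receive antenna pair acts as one virtual element. The antenna counts below are illustrative assumptions chosen to match the "more than 1,500" figure, not Mobileye's disclosed configuration.

```python
# A MIMO radar synthesizes a large "virtual" antenna array: every
# transmit antenna paired with every receive antenna contributes one
# virtual channel, so channel count grows as Tx x Rx rather than Tx + Rx.

def virtual_channels(num_tx: int, num_rx: int) -> int:
    """Number of virtual channels formed by a MIMO antenna array."""
    return num_tx * num_rx

# Hypothetical example: 48 transmit x 32 receive antennas would give
# 1,536 virtual channels, consistent with "more than 1,500".
print(virtual_channels(48, 32))  # -> 1536
```

Angular resolution improves with the virtual aperture, which is why a MIMO design can out-resolve a conventional radar with the same physical antenna count.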
In 2025, Mobileye announced that a global automaker had selected its imaging radar for an eyes-off (Level 3) production program, the first publicly announced commercial design win for the in-house radar. Mobileye also signed a partnership with French Tier 1 supplier Valeo to industrialize and supply imaging radars at automotive volumes, with Valeo handling much of the manufacturing and integration work for OEM customers. The radar is positioned as the primary active-sensor backbone of True Redundancy now that the FMCW lidar program has been shelved.
Road Experience Management (REM) is Mobileye's crowdsourced approach to high-definition mapping. Vehicles equipped with EyeQ chips and a customer's data-sharing opt-in upload small packets of highly compressed semantic data (lane geometries, sign positions, traffic light placements, road markings) to the cloud as they drive. Mobileye stitches these contributions from millions of cars into a continuously updated map called the Roadbook.
The approach has two unusual properties. First, the bandwidth required is tiny, on the order of 10 kilobytes per kilometer of driving on average, because the cars upload extracted features rather than raw camera frames. Second, the map updates effectively in real time as driver behavior and road conditions change, which sidesteps the freshness problems that plague survey-based HD maps generated by dedicated mapping vehicles.
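The ~10 KB/km figure implies strikingly small per-vehicle upload volumes. A back-of-envelope sketch, using an illustrative annual mileage that is not a Mobileye disclosure:

```python
# Back-of-envelope REM upload volume at the ~10 KB/km average quoted
# above. The annual mileage is an illustrative assumption.

KB_PER_KM = 10  # compressed semantic features, not raw camera frames

def upload_mb_per_car_year(km_per_year: float) -> float:
    """Approximate REM upload per car per year, in megabytes (1 MB = 1024 KB)."""
    return km_per_year * KB_PER_KM / 1024

# A car driving 15,000 km/year uploads on the order of ~146 MB/year,
# versus the terabytes that raw video from the same drive would produce.
print(round(upload_mb_per_car_year(15_000), 1))
```

The low volume is what makes the crowdsourcing model viable over ordinary cellular connections across millions of production cars.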
REM is used in two ways. For ADAS, the Roadbook acts as an extra layer that lets a vehicle anticipate lane geometry, speed limits, or construction beyond what the camera can see at any given moment. Volkswagen Group, Ford, Nissan, and others license REM data for production ADAS features such as enhanced lane keeping. For autonomous driving, REM provides a prior map that the driving policy can rely on to plan maneuvers, complementing the live perception output from the camera and radar stacks.
Mobileye has disclosed REM coverage milestones over the years that put it among the largest crowdsourced map collection efforts in the industry. The cloud processing pipeline is run on Amazon Web Services using Apache Spark on Kubernetes, according to public AWS case studies.
Responsibility-Sensitive Safety (RSS) is a formal mathematical model of safe driving introduced by Mobileye in a 2017 paper titled "On a Formal Model of Safe and Scalable Self-driving Cars," authored by Shai Shalev-Shwartz, Shaked Shammah, and Shashua. The paper, posted to arXiv as 1708.06374, encodes a small set of common-sense driving rules into rigorous mathematical constraints on speed, gap, lateral position, and right-of-way.
The model has five rules. The first four define what counts as a dangerous situation and specify the proper response, including safe longitudinal and lateral distance formulas, right-of-way logic at intersections, and rules for limited visibility scenarios. The fifth rule covers cases where another road user has already created a dangerous situation that cannot be avoided without violating road rules; here, RSS specifies a constrained "proper response" that maintains the autonomous vehicle's status as the non-blameworthy party.
RSS does not specify how a driving policy should behave optimally; it specifies what a driving policy must never do if it wants to remain blameless. This separation of concerns lets engineers build aggressive, comfortable driving policies on top of an RSS "shield" that vetoes any planned action which would violate the model. Intel released a C++ open-source implementation called ad-rss-lib, and elements of RSS have been picked up in standards work at IEEE (P2846) and discussed in regulatory contexts in Europe, China, and Japan.
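The best-known RSS constraint is the safe longitudinal distance from the 2017 paper: the rear vehicle is assumed to accelerate at its maximum for the response time before braking at its minimum guaranteed rate, while the lead vehicle brakes at its maximum. A minimal sketch of that formula; the parameter values are illustrative, not values Mobileye prescribes:

```python
# Safe longitudinal following distance per RSS (arXiv:1708.06374).
# During the response time rho the rear car may accelerate at up to
# a_accel_max; it then brakes at at least a_brake_min, while the front
# car may brake at up to a_brake_max. Parameter values are illustrative.

def rss_safe_long_distance(v_rear: float, v_front: float,
                           rho: float = 1.0,          # response time, s
                           a_accel_max: float = 3.0,  # m/s^2
                           a_brake_min: float = 4.0,  # m/s^2
                           a_brake_max: float = 8.0) -> float:
    """Minimum safe following distance in meters (speeds in m/s)."""
    v_rho = v_rear + rho * a_accel_max               # rear speed after response time
    d = (v_rear * rho
         + 0.5 * a_accel_max * rho ** 2              # distance covered while responding
         + v_rho ** 2 / (2 * a_brake_min)            # rear worst-case stopping distance
         - v_front ** 2 / (2 * a_brake_max))         # front best-case stopping distance
    return max(0.0, d)                               # clamped at zero, per the paper

# Both cars at 30 m/s (~108 km/h): about 111 m of required gap.
print(round(rss_safe_long_distance(30.0, 30.0), 1))
```

An RSS "shield" in a planner would veto any action that shrinks the actual gap below this bound, regardless of what the learned driving policy proposes.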
Mobileye sells its driving capabilities as a tiered set of products that share software and hardware building blocks but target different price points and autonomy levels. The lineup also includes the older standalone ADAS chips that ship in mainstream vehicles for features like AEB and lane keeping.
| Product tier | Autonomy level | Driver oversight | Sensors | Typical platform | Status |
|---|---|---|---|---|---|
| Mobileye 8 Connect | Aftermarket warning | Driver in full control | Single front camera | Retrofit module for fleets | In market |
| EyeQ-based ADAS | Level 1 to Level 2 | Hands on, eyes on | 1 to 6 cameras, optional radar | Built into mass-market new cars | In market |
| Mobileye SuperVision | Level 2+ | Hands off, eyes on | 11 cameras plus radars, EyeQ5 or EyeQ6H | Zeekr 001, Polestar 4, future Audi, Porsche, Lamborghini | Shipping since 2021 |
| Mobileye Chauffeur | Level 3 (and toward Level 4 in defined domains) | Hands off, eyes off | SuperVision sensor set plus surround imaging radar and front lidar | Premium passenger cars, multiple OEMs | In development, first launches mid-decade |
| Mobileye Drive | Level 4 | Driverless in operational design domain | Full surround camera, radar, lidar suite, EyeQ Ultra | Robotaxis, shuttles, delivery vans (Holon, VW ID. Buzz, others) | In development, pilot deployments |
SuperVision is the centerpiece of the consumer lineup. It is a hands-off, eyes-on Level 2+ system that runs on a sensor set of eleven cameras and surround radars and currently uses EyeQ5 silicon, transitioning to EyeQ6H. SuperVision can navigate point-to-point on highways, rural roads, and urban streets, performing lane changes, on- and off-ramp transitions, intersection handling, and stops at red lights, while the driver keeps eyes on the road and remains ready to intervene. The first SuperVision deployment was in the Geely-owned Zeekr 001 in China starting in 2022. Polestar 4 followed, and the Volkswagen Group selected SuperVision as the basis for premium driver-assist features across Audi, Bentley, Lamborghini, and Porsche brands in a March 2024 announcement.
Chauffeur is Mobileye's eyes-off Level 3 product. It builds on the SuperVision compute and camera stack, adds the imaging radar and a front-facing lidar, and is engineered to allow the driver to take their eyes off the road in defined operational design domains (typically highway driving up to about 130 km/h, or roughly 80 mph, in well-mapped regions). Volkswagen Group is expanding from SuperVision into Chauffeur in the same partnership, and other OEM design wins have been announced for production starting later in the decade.
Drive is the Level 4 platform aimed at robotaxis, ride-pooling shuttles, public transit, and goods delivery. It is sold to fleet operators rather than to individual consumers and is built around the EyeQ Ultra. The most prominent public deployment is Holon, an autonomous shuttle brand launched at CES 2023 by German Tier 1 supplier Benteler. The Holon Mover carries up to fifteen passengers, drives at up to 60 km/h, runs on Mobileye Drive, and is built to automotive safety standards. In 2025, Lyft announced that it would add Holon shuttles to its U.S. ride-hailing network beginning in 2026. Volkswagen Commercial Vehicles is also working with Mobileye on a Drive-equipped electric ID. Buzz that VW plans to use to launch a robotaxi service in Hamburg, Germany.
Mobileye's customer base includes most of the world's volume automakers. The relationships range from straightforward chip-plus-software supply for ADAS to deep, multi-year strategic partnerships that wrap in REM mapping, SuperVision, and Chauffeur deployments. Unlike vertically integrated approaches at companies such as Tesla (which built its own Full Self-Driving stack after parting ways with Mobileye in 2016) or BYD (which has its own ADAS effort alongside Mobileye-equipped models), Mobileye sells into multiple OEMs and is the dominant outside supplier of camera-based ADAS at industrial scale.
| Automaker | Relationship | Products and notes |
|---|---|---|
| Volkswagen Group | Strategic, multi-tier (Audi, Bentley, Lamborghini, Porsche, VW, VW Commercial Vehicles) | SuperVision and Chauffeur on E3 1.2 architecture; Mobileye Drive in ID. Buzz robotaxi |
| BMW | Long-running ADAS supplier, original EyeQ1 launch partner | EyeQ-based ADAS in volume models |
| Ford | ADAS chips plus REM mapping integrated into BlueCruise | Multi-generation deal |
| General Motors | ADAS supplier across many models | EyeQ chips for camera-based safety functions |
| Honda | ADAS supplier | EyeQ-based forward-facing camera systems |
| Nissan | ADAS plus REM mapping | ProPILOT-related programs |
| Stellantis | Driving Solutions for Stellantis cloud-enhanced ADAS, plus historic FCA-BMW-Intel-Mobileye AV alliance | EyeQ-based ADAS across multiple brands |
| Geely / Zeekr | First production launch of SuperVision | Zeekr 001 in China |
| Polestar | Chauffeur design win | Polestar 4 |
| Holon (Benteler) | Robotaxi platform partner | Holon Mover shuttle, Mobileye Drive |
| Lyft | Robotaxi network partner | Holon shuttle deployment from 2026 |
Mobileye disclosed in early 2024 that a single Western automaker (widely reported to be the Volkswagen Group) had awarded design wins covering all three of its top platforms (SuperVision, Chauffeur, and Drive) across seventeen vehicle models, with rollouts starting in 2026. In 2026, Berenberg analysts initiated coverage of Mobileye stock with a buy rating, calling the company the "dominant" global ADAS supplier in their note.
Mobileye's post-IPO arc has not been uniformly upward. The company entered a difficult period in 2024 driven by two related forces: a sharp decline in revenue from China and customer inventory reductions among Tier 1 suppliers and OEMs in Europe and North America after the post-COVID semiconductor bullwhip ran its course.
Full-year 2024 revenue came in at about $1.65 billion, a 20 percent decline from 2023. The company posted a roughly $3 billion net loss for the year, which was dominated by a $2.6 billion non-cash goodwill write-off recorded in the third quarter that reflected the gap between the $15.3 billion 2017 acquisition value and Mobileye's lower 2024 market capitalization.
First-half 2024 China revenue fell from $269 million in the prior-year period to $199 million as Chinese OEMs leaned more heavily on domestic ADAS suppliers and as price competition in the Chinese EV market squeezed component budgets across the board. Shashua told investors during the year that the company had cut its China volume forecasts because of "the uncertainty surrounding the dynamics in the local market."
Mobileye responded with several rounds of cost reduction. In late 2024 it ended internal FMCW lidar development and shut down the associated R&D unit, affecting roughly 100 employees. Earlier the same year, the company phased out its legacy aftermarket driver-assistance warning business and reduced headcount in related functions. A separate round of layoffs cut about 130 jobs, including 90 in Israel, as part of a broader efficiency push. Through 2025 the company emphasized a narrower focus on EyeQ6H, EyeQ Ultra, in-house imaging radar, and the SuperVision-Chauffeur-Drive trio of platforms as the engines for a return to growth later in the decade.
Founder Amnon Shashua continues to serve as president and chief executive officer of Mobileye and remains the public face of the company at events such as CES, where he typically delivers an annual keynote on Mobileye's product and technology roadmap. Shashua was awarded the Israel Prize for Computer Sciences in 2019 for his work on computer vision and its commercialization. Aside from Mobileye, he co-founded the assistive-technology company OrCam in 2010 and serves as chairman of AI21 Labs, the Tel Aviv-based company building large language models such as the Jurassic and Jamba families. AI21 reached a valuation of about $1.4 billion after a 2023 funding round and was reported in early 2026 to have held preliminary acquisition talks with Nvidia.
Co-founder Ziv Aviram, who served as CEO from founding through the Intel acquisition, departed full-time operating roles after the deal closed and has since focused on venture investing and other startups, including OrCam alongside Shashua.
Mobileye sits in a distinctive position in the autonomous driving market. It is neither a closed in-house effort like Tesla's nor a single-customer robotaxi venture like Waymo. Instead it operates as a Tier 1.5 supplier that licenses a stack of perception, mapping, and silicon to many OEMs at once.
Compared to Tesla, Mobileye has long maintained that a camera-only stack is sufficient as one of two redundant perception systems, but not sufficient on its own: cameras alone cannot support an L4 robotaxi without an active-sensor cross-check. (Mobileye also points out, somewhat pointedly, that early versions of Tesla Autopilot ran on EyeQ3 silicon before Tesla split with Mobileye in 2016 and built its own hardware.)
Compared to Waymo, Mobileye relies far less on dense lidar and on heavy bespoke maps, and far more on crowdsourced REM and on a perception stack designed to scale to consumer price points. The bet is that consumer-level autonomy across many car models needs a solution that is two orders of magnitude cheaper per vehicle than what robotaxi-only companies can absorb.
Compared to Chinese ADAS suppliers and to in-house efforts at Chinese EV makers like BYD, Mobileye has historically led on per-watt computer vision performance, REM map scale, and OEM relationships outside China, while losing share inside China to local competitors who can iterate faster against domestic regulatory and consumer expectations.