| Eggie | |
|---|---|
| General information | |
| Manufacturer | Tangible Robots |
| Country of origin | United States |
| Year revealed | 2025 |
| Status | Prototype |
| Price | $32,000 USD (estimated) |
| Availability | Pre-commercial prototype |
| Locomotion | Wheeled base |
| Website | tangiblerobots.ai |
Eggie is a wheeled humanoid robot developed by Tangible Robots, a robotics startup based in Palo Alto, California. Designed for domestic environments, Eggie combines a mobile wheeled base with an anthropomorphic upper body featuring dexterous five-fingered hands. The robot emerged from stealth in November 2025 with a demonstration video showing it performing contact-rich household tasks such as wiping spills and handling objects in a kitchen setting. Tangible Robots positions Eggie as a general-purpose home assistant capable of operating in cluttered, unpredictable real-world spaces, differentiating itself from industrial robots and warehouse automation platforms.
Tangible Robots was founded in 2024 by three co-founders: Bipasha Sen, Shankara Narayanan Vaidyanathan, and Benjamin Soria. The company is incorporated in Palo Alto, California, and operates in the San Francisco Bay Area.[1][2]
Bipasha Sen serves as CEO. Sen spent over a decade in artificial intelligence research before founding the company. She began her career as a Data Scientist at Microsoft, where she worked on Outlook's recommendation and suggestion features, which reached over 100 million users per month.[3] She subsequently pursued doctoral studies at MIT CSAIL (Computer Science and Artificial Intelligence Laboratory) as an Ida Green Fellow, advised by Professor Pulkit Agrawal, focusing on generative modeling and robotic manipulation.[3][4] Sen left her MIT PhD program to found Tangible Robots, motivated by the belief that domestic robotics represents "the hardest and most meaningful application of the technology today."[5] Her published research includes work on 3D vision and perception (HyP-NeRF at NeurIPS 2023, ConceptGraphs at ICRA 2024), motion planning (EDMP at ICRA 2024), and computer vision (WACV 2023, BMVC 2021).[3]
Sen has articulated a distinctive philosophical position on embodied intelligence, stating: "What is AGI without a sense of touch? To me, robots aren't just 'embodied AGI'; they are the truest form of AGI."[6]
Shankara Narayanan Vaidyanathan serves as co-founder and Chief Product Officer (CPO), leading core robotics and control systems work. Vaidyanathan holds an MS in Robotics from Northeastern University, where he worked in the Robust Autonomy Lab on visual-inertial navigation systems. He previously earned a B.E. in Mechanical Engineering from BITS Pilani and conducted research at the Robotics Research Center at IIIT Hyderabad under Professor Madhava Krishna. His experience includes leading autonomous driving research at BITS Pilani and working at Invento Robotics on autonomous navigation pipelines.[7][1]
Benjamin Soria serves as co-founder and Chief Technology Officer (CTO), heading mechanical and software engineering for the company.[1][8]
Tangible Robots has raised approximately $4 million in total funding across two seed-stage rounds. The company completed a pre-seed investment round in late 2024, followed by a $3 million seed round in 2025 led by India-based Blume Ventures and Micelio Technology Fund. Additional investors include Hubert Thieblot.[1][9] The company was incubated through Founders, Inc., a San Francisco-based hardware accelerator.[2]
The company's leadership team draws from institutions including MIT, UC Berkeley, and USC. Team members have prior experience deploying thousands of autonomous robots commercially and developing control systems for surgical robotics applications.[10]
To attract top talent, Tangible Robots has advertised total compensation packages of $1 million to $2 million for senior research and engineering candidates, seeking expertise in imitation learning, reinforcement learning, dexterous manipulation, and vision-language-action models.[1][10]
Tangible Robots states its mission as putting "robots in every home." The company describes its vision as creating robots that serve "as a means to augment our physical world like AR/VR once promised," focusing on enhancement rather than replacement of human capabilities.[5] The Founders, Inc. portfolio describes Tangible as developing "dexterous butler robots designed to automate hospitality services," targeting complex tasks in dynamic environments such as "navigating cluttered spaces to retrieve items from sinks or refrigerators."[2]
The company positions itself as a platform company supporting diverse domestic applications rather than building single-purpose industrial machines. Initial target markets include luxury apartments and high-end hospitality venues.[2]
Eggie features a wheeled mobile base topped with an anthropomorphic upper body. The design is clean and minimalist, with smooth white panels covering much of the structure and exposed black joints at the wrists and elbows. The head is rectangular with two front-facing stereo cameras serving as the robot's primary perception system.[6][11]
The choice of a wheeled base over bipedal legs represents a deliberate engineering trade-off. Wheels offer greater stability, longer operational time, and simpler control compared to legged locomotion, at the cost of reduced terrain adaptability. This approach mirrors a broader industry trend: according to market research, the wheel drive segment held the highest market share (65.6%) among humanoid robots in 2024, reflecting the practical advantages of wheeled platforms for indoor applications.[12] Other wheeled humanoid competitors targeting the home market include Sunday Robotics' MEMO, which uses a similar torso-on-wheels configuration.[13]
The robot's overall aesthetic was designed to be friendly and approachable, intended to make Eggie a comfortable presence in domestic settings rather than an intimidating industrial machine.
Tangible Robots describes Eggie's engineering around three core pillars:[6][10]
Dexterity: Eggie features complex, anthropomorphic five-fingered hands as its primary hardware differentiator. While some competitors (such as Sunday Robotics' MEMO) have opted for simplified dual grippers or pincer-style end effectors, Tangible has invested in fully articulated human-like hands capable of manipulating a wide range of household objects. This design choice enables what the company calls "contact-rich" manipulation, meaning the robot can apply nuanced force and grip patterns suited to different objects and surfaces.[6][11]
Compliance: The robot's arms and joints incorporate compliance mechanisms that allow flexible, adaptive movement when interacting with objects and the environment. Compliant systems are important in domestic settings where the robot must handle fragile items, work alongside people, and navigate unpredictable contact situations without causing damage.
Whole-body control: Eggie coordinates its wheeled base, torso, arms, and hands as an integrated system rather than controlling each component independently. This whole-body coordination enables behaviors such as simultaneously navigating a kitchen while reaching for an object, or stabilizing the body while one hand wipes a surface and the other holds a mug.
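The whole-body coordination described above can be illustrated with a standard robotics technique: stacking the base and arm contributions into a single Jacobian and solving for all velocities at once. The toy model below (a planar base plus a two-link arm, with assumed link lengths) is a minimal sketch of the idea, not Tangible Robots' actual controller.

```python
import numpy as np

def planar_arm_jacobian(q, link_lengths=(0.4, 0.3)):
    """Position Jacobian of a 2-link planar arm (illustrative toy model)."""
    l1, l2 = link_lengths
    q1, q2 = q
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([
        [-l1 * s1 - l2 * s12, -l2 * s12],
        [ l1 * c1 + l2 * c12,  l2 * c12],
    ])

def whole_body_velocities(arm_q, ee_vel_desired, damping=1e-3):
    """Distribute a desired end-effector velocity over base + arm DOFs.

    The wheeled base contributes planar translation directly (identity
    block); the arm contributes through its Jacobian. Damped least
    squares keeps the solution well-behaved near singularities.
    """
    J = np.hstack([np.eye(2), planar_arm_jacobian(arm_q)])  # 2 x 4
    JT = J.T
    q_dot = JT @ np.linalg.solve(J @ JT + damping * np.eye(2), ee_vel_desired)
    return q_dot  # [base_vx, base_vy, arm_q1_dot, arm_q2_dot]
```

Because base and arm motion are solved jointly, the base naturally "helps" the arm when a target is out of reach, which is the behavior the paragraph above describes.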
Eggie employs dual stereo cameras mounted in its head for visual perception. These cameras provide both RGB imagery and depth information, enabling the robot to build spatial understanding of its environment. The perception system supports the robot's ability to identify objects, understand scene geometry, and plan manipulation actions in cluttered domestic spaces.[6][11]
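Stereo cameras recover depth from the horizontal offset (disparity) between matched pixels in the left and right images. Since Eggie's camera specifications are undisclosed, the focal length and baseline below are assumed values used only to illustrate the standard relation.

```python
import numpy as np

# Illustrative stereo-depth relation; focal length and baseline are
# assumed values, since Eggie's camera specs are undisclosed.
FOCAL_PX = 600.0   # focal length in pixels (assumption)
BASELINE_M = 0.06  # separation between the two cameras in metres (assumption)

def depth_from_disparity(disparity_px):
    """Z = f * B / d: larger disparity means a closer point."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(disparity_px > 0,
                        FOCAL_PX * BASELINE_M / disparity_px,
                        np.inf)
```

With these assumed parameters, a 36-pixel disparity corresponds to a point one metre away; halving the disparity doubles the estimated depth.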
The company's website indicates the use of multimodal perception systems, though detailed sensor specifications (resolution, frame rate, depth range) have not been publicly disclosed as of early 2026.
Tangible Robots takes a full-stack approach, developing hardware, control systems, and artificial intelligence software in-house. The company's AI stack integrates multiple machine learning paradigms:[10]
| AI Component | Description |
|---|---|
| Imitation learning | Robots learn from demonstrations captured via sensor-equipped exoskeletons worn by human operators |
| Reinforcement learning | Policies refined through trial-and-error optimization in both simulation and real-world settings |
| Dexterous manipulation | Specialized models for contact-rich grasping and object handling with five-fingered hands |
| Vision-language-action models | Integration of visual perception and language understanding to guide robot actions |
| Multimodal co-training | Combining data from multiple sensory modalities (vision, touch, force) during model training |
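The imitation-learning component in the table reduces, in its simplest form, to supervised regression from observations to demonstrated actions (behavior cloning). The sketch below illustrates that reduction on synthetic data; it is a minimal stand-in, not Tangible Robots' training pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for exoskeleton demonstrations: each row pairs an
# observation vector with the action a human operator took.
true_W = rng.normal(size=(5, 3))             # hidden demonstrator "policy" (toy)
obs = rng.normal(size=(200, 5))              # 200 demo observations
actions = obs @ true_W + 0.01 * rng.normal(size=(200, 3))  # demo actions

# Behavior cloning: fit a policy that maps observations to the
# demonstrated actions via least squares.
W_fit, *_ = np.linalg.lstsq(obs, actions, rcond=None)

def policy(o):
    """Cloned policy: predict an action for a new observation."""
    return o @ W_fit
```

Real systems replace the linear map with a neural network (often a diffusion or vision-language-action model), but the supervised structure of the problem is the same.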
Cheng Chi, a co-founder of Sunday Robotics and creator of the Diffusion Policy framework, has emphasized the importance of vertical integration in this space, noting that achieving "mm level precision beyond actuator limits" requires "owning the whole stack from HW to AI."[11] Tangible Robots appears to share this philosophy, maintaining control over the entire technology stack from hardware design through AI model development.
One of Tangible Robots' distinctive approaches is its emphasis on real-world data collection for training Eggie's manipulation policies. Rather than relying primarily on synthetic data generated in simulation, the company deploys a novel data capture pipeline centered on human demonstrations.[1][10]
Engineers at Tangible Robots wear partial exoskeletons equipped with dozens of sensors that capture human-level touch and force information during task execution. As operators perform household tasks such as cleaning, picking up objects, and organizing, the exoskeleton records detailed data about hand positions, grip forces, contact patterns, and the amount of pressure appropriate for different materials (for example, how hard to press on glass versus a milk carton).[1][6]
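A single frame of such a demonstration stream might bundle pose, force, and contact readings together. The schema below is hypothetical (field names and units are illustrative assumptions, not Tangible Robots' actual data format), sketched only to show the kind of record the paragraph describes.

```python
from dataclasses import dataclass, asdict

# Hypothetical schema for one frame of exoskeleton demonstration data;
# all field names and units are illustrative assumptions.
@dataclass
class DemoFrame:
    timestamp_s: float
    wrist_pose: tuple          # (x, y, z, qx, qy, qz, qw) of the wrist
    finger_joint_angles: tuple # one angle per sensed finger joint
    grip_force_n: float        # net grip force in newtons
    in_contact: bool           # whether fingertip sensors report contact
    material_label: str = ""   # e.g. "glass", "carton" (for force labels)

def to_record(frame: DemoFrame) -> dict:
    """Flatten a frame to a plain dict, e.g. for logging or training."""
    return asdict(frame)
```

Sequences of such frames, aligned with camera footage, would form the (observation, action) pairs that imitation learning consumes.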
This approach addresses a fundamental challenge in robot learning: collecting high-quality, diverse manipulation data at scale. While simulation-based approaches can generate large datasets quickly, they often fail to capture the nuanced physics of real-world contact, particularly for tasks involving deformable objects (cloths, sponges), liquids (spilled coffee), and fragile items (glassware).
Tangible Robots states that its robots learn from real deployments across "hundreds of unique environments and thousands of interactions."[10] This deployment-centric approach means that Eggie's capabilities improve iteratively as the system encounters new objects, surfaces, and spatial configurations in actual homes and hospitality settings.
The emphasis on real-world data collection places Tangible Robots alongside other robotics companies that have adopted similar methodologies, though each with distinct hardware. For example, the AirExo-2 system and the ExoStart framework represent parallel academic efforts to use low-cost exoskeletons for scalable robot demonstration data collection.[14]
Tangible Robots emerged from stealth in November 2025 with a 26-second demonstration video showing Eggie performing tasks in a kitchen environment, wiping a spill from a countertop with a cloth in one hand while holding a mug in the other.[1][6]
The demonstration highlighted what Tangible describes as "contact-rich" manipulation: the ability to apply appropriate force while cleaning a surface, handle a deformable object (the cloth), and coordinate multiple concurrent actions. Media coverage noted that while the movements were slow and somewhat stuttering, the robot successfully completed the manipulation task.[15]
Beyond the initial reveal video, Tangible Robots has indicated or demonstrated Eggie performing several additional household tasks:[10][16]
| Task Category | Examples |
|---|---|
| Cleaning | Wiping countertops, cleaning spills, handling cloths and sponges |
| Object manipulation | Grasping mugs, picking up household items, handling fragile objects |
| Garment and linen handling | Hanging jackets on coat stands, stripping beds |
| General domestic tasks | Counter cleaning, towel handling, interacting with everyday objects |
These tasks emphasize the common thread in Eggie's target use cases: contact-rich manipulation in unstructured domestic environments where objects vary in shape, weight, fragility, and material properties.
As of early 2026, Tangible Robots has not published a comprehensive specification sheet for Eggie. The company has focused its public communications on design philosophy and capabilities rather than detailed technical metrics. The following table summarizes what is publicly known and what remains undisclosed:[6][11][17]
| Category | Specification | Status |
|---|---|---|
| Form factor | Wheeled humanoid with anthropomorphic upper body | Confirmed |
| Locomotion | Wheeled mobile base | Confirmed |
| Hands | Five-fingered anthropomorphic, two hands | Confirmed |
| Head sensors | Dual stereo cameras (RGB + depth) | Confirmed |
| Height | Not officially disclosed | Unknown |
| Weight | Not officially disclosed | Unknown |
| Degrees of freedom | Not officially disclosed | Unknown |
| Arm payload capacity | Not officially disclosed | Unknown |
| Battery life / runtime | Not officially disclosed | Unknown |
| Maximum speed | Not officially disclosed | Unknown |
| Actuator type | Not officially disclosed | Unknown |
| Operating system | Not officially disclosed | Unknown |
| Price target | $32,000 USD (estimated) | Reported |
The limited disclosure of hardware specifications is not unusual for early-stage robotics startups operating in stealth or near-stealth mode. As Eggie moves toward commercial availability, more detailed specifications are expected to be released.
Eggie enters a growing market for domestic and service humanoid robots. The global humanoid robot market was valued at approximately $1.55 billion in 2024 and is projected to reach $4.04 billion by 2030, growing at a compound annual growth rate of 17.5%.[12] Several companies are developing robots that compete directly or indirectly with Eggie in the home and hospitality segments.
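The cited projection can be sanity-checked with the standard compound-growth formula, assuming six years of compounding from the 2024 base (the small gap to the quoted $4.04 billion is rounding in the source's base year or growth rate).

```python
# Sanity check of the cited market projection: $1.55B in 2024 compounding
# at a 17.5% CAGR over six years (2024 -> 2030).
base_2024_bn = 1.55
cagr = 0.175
projected_2030_bn = base_2024_bn * (1 + cagr) ** 6
# ~ $4.08B, broadly consistent with the cited ~$4.04B figure.
```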
| Robot | Company | Form Factor | Hands | Price | Status |
|---|---|---|---|---|---|
| Eggie | Tangible Robots | Wheeled humanoid | Five-fingered (2 hands) | $32,000 | Prototype |
| MEMO | Sunday Robotics | Wheeled humanoid | Dual grippers | $20,000 | Prototype (beta 2026) |
| NEO | 1X Technologies | Bipedal android | Dexterous hands | ~$20,000 | Pre-commercial |
| Figure 02 | Figure AI | Bipedal humanoid | Dexterous hands | Not disclosed | Commercial pilot |
| Optimus | Tesla | Bipedal humanoid | Dexterous hands | ~$20,000-$30,000 (target) | Prototype |
Eggie's primary differentiator from other wheeled humanoids is its investment in fully dexterous, five-fingered hands. Sunday Robotics' MEMO, the most directly comparable competitor, deliberately eschews human-like hands in favor of a custom dual-gripper design with fewer moving parts, prioritizing durability and easier control.[13] Tangible Robots takes the opposite approach, betting that human-like hands are essential for the range of manipulation tasks required in homes, environments whose objects and fixtures were designed around the human hand.
The competition between these philosophies (dexterous anthropomorphic hands versus simplified grippers) reflects a broader debate in the robotics community. Simplified grippers are more reliable and easier to control but limited in the range of objects and tasks they can handle. Five-fingered hands can theoretically manipulate any object a human can but introduce significant mechanical complexity and control challenges.
Compared to bipedal competitors like 1X Technologies' NEO and Tesla's Optimus, Eggie trades away the ability to navigate stairs and uneven terrain in exchange for the stability, energy efficiency, and simplicity of a wheeled platform. For purely indoor domestic applications, this trade-off may be favorable, as most homes have flat floors and the primary value proposition lies in manipulation rather than locomotion.
Tangible Robots targets several market segments with Eggie:[2][5]
| Market Segment | Description |
|---|---|
| Home assistance | Performing household chores including cleaning, tidying, and object manipulation in private residences |
| Luxury hospitality | Butler-style service automation in high-end apartments and hospitality venues |
| Research | Platform for academic and industrial research in manipulation, robot learning, and human-robot interaction |
| Education | Demonstrations and educational applications showcasing advanced robotics capabilities |
The estimated price of $32,000 positions Eggie at the higher end of the emerging home robot market. For comparison, 1X Technologies targets a price of approximately $20,000 for NEO (or $500 per month), and Sunday Robotics prices MEMO at $20,000 with aspirations to bring the cost below $10,000 at scale.[13][18] The pricing reflects Eggie's current prototype status and the mechanical complexity of its dexterous hand system.
The company's initial go-to-market strategy appears focused on luxury residential settings and hospitality, where the higher price point is more acceptable, before eventually targeting broader consumer adoption as costs decrease.
Several technical challenges face Eggie and the broader domestic humanoid robot category:
Dexterous manipulation reliability: While five-fingered hands offer greater versatility, achieving the reliability needed for unsupervised household operation remains an open research problem. Contact-rich tasks like cleaning with a cloth or handling fragile items require precise force control and robust tactile sensing.
Long-horizon task execution: Household chores often involve long sequences of actions (for example, loading a dishwasher requires opening the door, placing items in specific slots, adding detergent, and starting the cycle). Maintaining reliable performance across such extended task sequences is significantly more challenging than single-step demonstrations.
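One common way to frame the long-horizon problem is as an ordered sequence of steps with per-step retries, where a single unrecoverable failure aborts the chore. The sketch below is a generic illustration of that framing (step names are hypothetical), not a description of Eggie's task executive.

```python
# Minimal sketch of long-horizon execution as a step sequence with
# per-step retries; step names are illustrative, not Eggie's actual API.
def run_sequence(steps, max_retries=2):
    """Run (name, action) steps in order; retry a failed step, then abort.

    Each action is a callable returning True on success. A step that
    fails on every attempt aborts the whole sequence, which is why
    per-step reliability compounds over long chores.
    """
    for name, action in steps:
        for _attempt in range(max_retries + 1):
            if action():
                break
        else:  # all retries exhausted
            return f"aborted at {name}"
    return "done"
```

The compounding is the crux: a chore of ten steps at 95% per-step success completes only about 60% of the time without retries, which is why extended sequences are so much harder than single-step demonstrations.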
Safety in shared spaces: Operating alongside humans, children, and pets in unpredictable home environments demands robust safety systems. Eggie's compliant design and wheeled base (which is inherently more stable than bipedal locomotion) address some safety concerns, but comprehensive safety certification for consumer deployment has yet to be completed.
Data scaling: Tangible's approach of collecting real-world data via exoskeletons, while producing high-quality demonstrations, faces scalability questions. Gathering enough diverse training data to cover the enormous variety of objects, surfaces, and spatial configurations found across different homes is a persistent challenge for the entire field of robot learning.
As of early 2026, Tangible Robots has not announced a specific commercial launch date for Eggie. The company appears focused on refining autonomy through large-scale real-world training and iterating on hardware and software before entering commercial production.[1]