
1X Technologies’ NEO Robot Learns New Skills by Watching Videos, Thanks to a New ‘World Model’ AI

The 1X NEO humanoid robot uses a new “World Model” AI to learn tasks from video prompts, enabling it to perform actions like ironing or packing a lunchbox without prior specific training.

Norwegian-founded 1X Technologies has launched a breakthrough “World Model” AI that enables its NEO humanoid robot to learn physical tasks simply by watching videos, including actions it has never been explicitly trained to perform. Founder and CEO Bernt Børnich announced the update, which allows the $20,000 household robot to translate user prompts into autonomous actions by generating and executing its own visual plans.

What if you could teach a robot a new chore just by showing it a video or describing it? That science-fiction capability is now a claimed reality for 1X Technologies and its bipedal NEO robot. The company, headquartered in Palo Alto, California, has unveiled a new AI architecture that moves beyond painstaking, code-heavy robot training. Instead, NEO’s “World Model” uses a video-based understanding of physics to watch, interpret, and then replicate tasks from the vast world of online video data. “With the ability to transform any prompt into new actions—even without prior examples—this marks the starting point of NEO’s ability to teach itself to master nearly anything you could think to ask,” stated CEO Bernt Børnich.

This approach tackles a fundamental bottleneck in robotics: data collection. Traditionally, robots learn through demonstrations meticulously performed by human operators in controlled settings. The 1X World Model, however, allows NEO to collect and learn from its own interaction data and, more importantly, to generalize from the immense library of human activity captured in everyday online videos. As explained by 1X AI researcher Daniel Ho, “With the 1X World Model, you can turn any prompt into a fully autonomous robot action — even with tasks and objects NEO’s never seen before.” Demonstrations show the robot handling novel tasks like operating a toilet seat, ironing a shirt, or brushing a person’s hair after a simple voice command.
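
1X has not published the World Model’s internals, but the core idea of learning physics from raw, unlabeled video can be illustrated with a toy self-supervised next-frame predictor. The PyTorch sketch below is purely illustrative: TinyWorldModel and train_step are hypothetical names, not 1X code, and the real system is far larger and is conditioned on robot actions and language as well as pixels.

```python
# Toy sketch (an assumption, not 1X's architecture): a "world model" trained by
# self-supervised next-frame prediction on unlabeled video clips.
import torch
import torch.nn as nn

class TinyWorldModel(nn.Module):
    """Predicts the next video frame from a short history of frames."""
    def __init__(self, history=4, channels=3, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(history * channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.decoder = nn.Conv2d(hidden, channels, kernel_size=3, padding=1)

    def forward(self, frame_history):
        # frame_history: (batch, history * channels, H, W) -> predicted next frame
        return self.decoder(self.encoder(frame_history))

def train_step(model, optimizer, history, target):
    """One self-supervised step: predict the next frame from recent frames."""
    prediction = model(history)
    loss = nn.functional.mse_loss(prediction, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Trained this way on enough footage, such a model learns to anticipate how scenes evolve over time, which is what later lets it “imagine” a plausible video of a task being completed.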


Here’s how it works in practice. A user gives NEO a text or voice prompt, like “pack the lunch box.” The robot’s AI first generates a visualization of the steps required to complete that task in the specific environment it sees through its cameras. A built-in inverse dynamics model then translates that visual plan into the precise joint movements and forces needed for NEO’s body to execute it. This loop of perception, simulation, and action allows it to perform known tasks in unfamiliar settings and to attempt completely new ones.
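
In code, the loop the article describes might look roughly like the sketch below. This is an assumption-laden illustration, not 1X’s published API: world_model, inverse_dynamics_model, camera, and robot are hypothetical interfaces standing in for the components described above.

```python
# Illustrative sketch only -- not 1X's actual software. All interfaces here
# (world_model, inverse_dynamics_model, camera, robot) are hypothetical.

def execute_prompt(prompt, world_model, inverse_dynamics_model, camera, robot):
    """Turn a text or voice prompt into joint-level robot commands."""
    # 1. Perception: capture the robot's current view of the scene.
    current_frame = camera.read()

    # 2. Simulation: the world model "imagines" a short video of the task
    #    being completed in this specific scene, conditioned on the prompt.
    planned_frames = world_model.generate_video(prompt=prompt,
                                                first_frame=current_frame)

    # 3. Action: for each pair of adjacent imagined frames, an inverse
    #    dynamics model infers the joint movements and forces that would
    #    produce that change, and the robot executes them.
    for frame_now, frame_next in zip(planned_frames, planned_frames[1:]):
        joint_command = inverse_dynamics_model.infer_action(frame_now, frame_next)
        robot.apply(joint_command)

    # A real controller would also keep re-observing the scene and regenerate
    # the plan whenever reality diverges from the imagined frames.
```

In this framing, planning happens in the model’s visual space, so the same video-prediction machinery that learns from footage can double as the planner at execution time.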

The implications for practicality and adaptability are significant. 1X notes that traditional models often fail in the unpredictable “clutter or chaos” of a real home. Their model is designed to maintain composure amid rapid environmental changes, applying a more human-like understanding to navigate variability. This is crucial for the company’s goal of making NEO a useful domestic assistant. The robot is currently available through an early access program for $20,000, which includes priority delivery in 2026, with a $499/month subscription model also planned.


This development signals a shift in how humanoid robots may be trained at scale. By leveraging internet video, the potential training dataset becomes almost limitless. While the initial capabilities are focused on simple household tasks, the underlying principle—a robot that learns from observation like a human—could dramatically accelerate the path to versatile, general-purpose robots. 1X isn’t just programming behaviors; it’s building a foundation for robots that can watch, learn, and eventually figure things out for themselves.
