
OpenMind
@openmind_agi
Superintelligence for robots.
89 Following    159.3K Followers
Our Unitree Go2s are getting better at navigating real-world clutter. In this demo, one now detours away from corners and avoids obstacles on its own, using retuned MPPI parameters. Fully autonomous navigation means moving without bumping into potentially critical objects, which is especially important in workplaces and homes.
Our build night was a huge success! Thank you to all the technical teams for showing up and launching on OM1. Winners received exclusive OpenMind backpacks and showcased their amazing demos, including:
- A robot that mirrors human motion in real time
- A new robot form factor abstracted and automated in just 30 minutes
- Autonomous drones controlled in simulation by natural language
It will only get easier to build applications from here.
OpenMind OM1 Build Night w/ @OpenAI Codex
Location: San Francisco [Address given upon successful RSVP]
Date: Wednesday, May 6, 4:30 PM - 9:00 PM
This event is for robotics and agent developers, AI-native builders, technical founders, and curious engineers who want a practical way to learn OM1 by actually building with it.
Bring a laptop and come ready to ship something. Register:
All roboticists: if you are building VLAs, please stop what you are doing, take a break, get a coffee, and read the LeWorldModel paper by @lucasmaes_, @randall_balestr, @ylecun and collaborators: Then read it several more times. The same general approach can be mapped directly onto other key problems in robotics, including handling multimodal inputs such as vision and speech.
We were delighted to participate in CONNECT 2026: Global Embodied AI Innovation Summit, where we formalized our strategic partnership with @MagicLab_Robot. We also shared our thoughts on the panel "Perception, Foundation Models & Decision-Making." Excited to see continued collaboration in the space as we expand our reach across multiple robot manufacturers.
Why Generic Humanoid Robots Will Fail — And What's Next

Imagine an alternate world where we never invented the car. In that world, a robotics engineer might reasonably conclude that robotic horses are the future — replace the living ones, keep the stables and saddles, ride them to work. Convenient, modern, and the roads stay free of manure. It sounds absurd only because you already know about cars. We keep making the same mistake with humanoid robots.

Consider transportation. To finally make driving safe, we had two options: put a humanoid in the driver's seat, or embed sensing and compute directly into the vehicle. Waymo chose the latter. It has no steering wheel. It exists purely to move people efficiently from A to B. The humanoid was not needed.

Consider a sock factory. Yes, you could replace workers with humanoid robots one-for-one on the assembly line — and gain maybe 2-3x efficiency. Or you could completely redesign the workflow around a purpose-built autonomous sewing system and eliminate most of the factory: the chairs, the cafeteria, the manual sewing machines, the HVAC, the doors, and the restrooms. The actual optimization is to sidestep the previous human-imposed physical constraint.

Look at Ukraine. The front lines aren't filling up with Terminator-style humanoids carrying rifles. Human soldiers are being replaced by heterogeneous swarms of purpose-specific drones: some for reconnaissance, some for logistics, some for delivering munitions. War is being restructured around the desired outcome (survival), not the soldier's shape.

Consider a 1970s office. Want to move information through teams of people? We once used typists, paper, trucks to supply the paper, typewriters, and repair technicians. A linear improvement would have been to replace the human typist with a 10-fingered humanoid. What actually happened? The entire workflow — paper, printers, typewriter factories, delivery trucks, the desks, the offices — was obliterated. Email deleted the human clerk's entire universe.

Consider cancer early detection by mammography. Today, getting a mammogram requires expensive hardware, logistics infrastructure, human nurses and doctors, a biopsy workflow, a human pathologist with a microscope (imported from Germany or Japan), a written finding, and multiple physician reviews. Sure, you could replace the pathologist with a humanoid (the microscope focus knob requires finger dexterity) and get a modest efficiency gain (and faster responses at 2 am). Or — the far more likely future — we all swallow a cancer-detection pill every few months, and 24 hours later a color-changing sticker on our arm turns red or green. No hardware. No hospital. No logistics. No pathologist. No office. No desk. No humanoid. The workflow isn't optimized by a literal drop-in swap of a human pathologist for a humanoid. The entire workflow simply ceases to exist.

Consider life-sciences research and drug development. We're seeing excitement about robot arms and humanoids pipetting water in research labs. Robot horses, episode 7. We don't design aircraft by crashing test planes — we simulate them entirely in software first. Biology will go the same way. The path to scalable drug discovery isn't robot arms in conventional wet labs demonstrating 10-fingered prowess at manipulating Eppendorf tubes filled with purple food coloring. Rather, we need in-silico biological models that evaluate billions of hypotheses computationally, with physical manipulation of atoms only at the very end.

The clear pattern: efficient automation doesn't try to replicate a 10-fingered human in a static context. Automation eliminates physical rate-limiting steps in their entirety. That's why "classical" humanoid robots, as a generic category, will largely fail. They're robotic horses. They assume the infrastructure and workflows stay fixed and only the 10-fingered human is swapped out. That's not how economic and technological pressure works.

What actually matters? If humans continue to inhabit the physical world, then moving atoms will remain important, and that requires five things: atoms, energy, force generation and actuation, sensing, and compute. Everything else — form factor, number of limbs, type of end effector — is a variable to be optimized for the task.

So if you are a pathologist, a robotics engineer, a teacher, a parent, a politician, or a sewing-factory owner: please think different. Most obviously, we should all anticipate, and build for, a future in which robots exhibit extreme physical fluidity: two arms or four; wheels or legs; tentacles or flippers; three fingers or twelve, or none at all; eyes at the front, side, or tip of a tentacle.

At OpenMind, we don't care what you look like right now — we've got you, in all your physical form factors. OM2 ships in July, for all machines. Let's build.
This week, we were featured in Forbes and The Robot Report, where our Founder, @JanLiphardt, gave his thoughts on:
- Physical AI’s Future
- Microsoft-OpenAI Alliance Overhauled As Battle For AI Dominance Widens
Take a look at how OpenMind is navigating the changing AI + robotics landscape.
From gesture to intent: our latest work shows how advanced keypoint detection (body + hands) can unlock powerful, real-time action recognition. We’re pushing the boundaries with both data-driven models and zero-shot approaches, scaling from a handful of core actions to a richer set of human behaviors without always needing new training data. This is a glimpse into more adaptive, intelligent systems that understand people naturally, a capability vital for mass robot adoption.
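One minimal way to picture the zero-shot side of scaling to new actions without retraining: match a detected keypoint sequence against stored action prototypes by cosine similarity, so adding an action means adding a prototype rather than collecting training data. This is a sketch of the general idea only; `classify_action` and the prototype format are hypothetical, not the actual pipeline.

```python
import numpy as np

def classify_action(keypoints, prototypes):
    """Nearest-prototype action matching.

    keypoints: array of shape (frames, joints, 2) from a keypoint detector.
    prototypes: dict mapping action name -> reference array of same shape.
    Returns (best_action_name, cosine_similarity). Illustrative only.
    """
    feat = keypoints.flatten()
    feat = feat / (np.linalg.norm(feat) + 1e-8)   # unit-normalize the query
    best, best_sim = None, -1.0
    for name, proto in prototypes.items():
        p = proto.flatten()
        p = p / (np.linalg.norm(p) + 1e-8)        # unit-normalize the prototype
        sim = float(feat @ p)                      # cosine similarity
        if sim > best_sim:
            best, best_sim = name, sim
    return best, best_sim
```

In practice the flattened raw coordinates would be replaced by a learned or normalized pose embedding, but the add-a-prototype workflow is the same.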
Demonstrating our latest localization system in action. At bootup, the robot has no prior position estimate, but within seconds it autonomously localizes itself using a fusion of three algorithms. Unlike our previous version, this updated system incorporates vision, allowing the robot to adapt to real-world changes (e.g., moved furniture) by recognizing previously seen environments. In this demo, we repeatedly reset navigation and reposition the robot to random locations, showing robust, repeatable localization. The robot then executes a full patrol, following a planned path (visualized in RViz) with real-time path tracking. Because our software is hardware-agnostic, it brings the same reliable performance to any robot it runs on.
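The post describes localizing via a fusion of three algorithms. A standard textbook way to combine independent position estimates is inverse-variance weighting, sketched below for 2D positions. This is a toy illustration of the general concept, not the actual fusion used in the system; `fuse_pose_estimates` and its `(pose, variance)` input format are assumptions.

```python
import numpy as np

def fuse_pose_estimates(estimates):
    """Fuse independent 2D position estimates by inverse-variance weighting.

    estimates: list of (pose, variance) pairs, where pose is (x, y) and
    variance is a scalar confidence for that source. Less certain sources
    (larger variance) get proportionally smaller weight. Illustrative only.
    """
    poses = np.array([p for p, _ in estimates], dtype=float)
    variances = np.array([v for _, v in estimates], dtype=float)
    weights = 1.0 / variances
    weights /= weights.sum()
    return (weights[:, None] * poses).sum(axis=0)
```

With, say, a LiDAR match, a vision-based place recognition hit, and odometry each reporting a pose with its own uncertainty, the fused estimate lands closest to the most confident source while still using the others.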
Our CTO @boyuan is demoing our new localization algorithm, which surpasses leading industry solutions. Localization enables a robot to determine its position in an environment, which is essential for navigation. While most systems require robots to start from a predefined location, ours doesn’t. It lets robots boot up anywhere and immediately locate themselves, making deployment far more flexible for real-world scenarios.
2/ Unitree Robotics @UnitreeRobotics In its third Gala appearance, Unitree brought robotic martial arts to the stage, performing alongside professional athletes. Precision. Balance. Control. A clear signal of hardware maturity and advanced motion intelligence: the prerequisites for reliable, large-scale real-world adoption.
OpenMind kicked off the Future of Robots conversation on @BloombergTV. Founder @JanLiphardt discusses how software is shaping social & home robotics, and how our hardware-agnostic platform helps humanoid companies move faster.
Our robots integrate with @Virtuals_io to become your trading assistant. We’re showing how embodied AI means installing agents into robot brains, turning digital-only tasks into actions triggered by human-robot commands.