
Genesis AI unveils GENE-26.5 foundation model for robot hands

May 9, 2026

Detail of a mechanical robot hand with silver joints against a light background.

On 6 May 2026, Genesis AI introduced its GENE-26.5 foundation model that controls human-scale robot hands. Demos show cooking, pipetting, piano playing and solving a Rubik's Cube.

Genesis AI introduces GENE-26.5 for human-scale robot hands

Robotics startup Genesis AI, backed by Khosla Ventures and Eclipse, unveiled its first public foundation model, GENE-26.5, on 6 May 2026. The release pairs a model that controls a human-scale dexterous robotic hand with a matched data engine. Genesis AI emerged from stealth in July 2025 with a US$105 million seed round.

What GENE-26.5 actually does

GENE-26.5 is a vision-language-action model that translates camera and sensor input into fine-motor movements of a multi-jointed hand. Genesis AI builds the hand itself, in partnership with Chinese manufacturer Wuji Tech, and runs its own data pipeline that turns human motion directly into training data.
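Conceptually, a VLA model sits in a simple perception-to-action cycle: observe, condition on an instruction, emit joint targets. The sketch below is purely illustrative; Genesis AI has not published GENE-26.5's API, so every function name, frame shape and joint count here is an assumption.

```python
# Hypothetical sketch of a vision-language-action (VLA) control loop.
# Genesis AI has not published GENE-26.5's interfaces; all names,
# shapes and the joint count below are assumptions for illustration.

from typing import List

NUM_JOINTS = 21  # assumption: roughly a human hand's articulation


def policy_step(camera_frame: List[List[int]], instruction: str) -> List[float]:
    """Stand-in for the VLA model: maps one camera observation plus a
    text instruction to joint-angle targets for a multi-jointed hand.
    A real model would run vision and language encoders here."""
    return [0.0] * NUM_JOINTS  # dummy action so the loop is runnable


def control_loop(steps: int = 3) -> List[List[float]]:
    """One perception-action cycle per step."""
    instruction = "crack the egg into the bowl"
    trajectory = []
    for _ in range(steps):
        frame = [[0] * 224 for _ in range(224)]  # placeholder camera frame
        action = policy_step(frame, instruction)
        trajectory.append(action)  # a real system would drive motors here
    return trajectory


print(len(control_loop()))  # one action vector per cycle
```

The point of the loop structure is that the same model weights handle every task; only the instruction and the observations change, which is what distinguishes a foundation model from a per-task hand-programmed controller.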

Demo tasks

In a public demo video, Genesis AI showed a 20-step cooking recipe including chopping tomatoes and one-handed egg-cracking, smoothie preparation with two-hand coordination, lab work with pipetting and liquid transfer, wire harnessing, solving a Rubik's Cube, single-handed multi-object grasping and piano playing.

Strategic position

Vinod Khosla called Genesis AI a bet on a step toward AI that can act in the real world. The model competes with MolmoAct, NVIDIA Isaac GR00T, Figure and other VLA approaches, but emphasizes the combination with bespoke hardware.

Why this matters

Most foundation models today are language-centric. GENE-26.5 is one of the first broadly demonstrated models that puts hands β€” fine-motor manipulation β€” at the center. Anyone working in industry, care, logistics or research knows that many bottlenecks sit exactly where humans handle tools, cables, packaging or patients. Foundation models that control hands flexibly are a precondition for humanoid robots to move from show stages into workshops and labs. For German and Swiss industrial users that means: a field long treated as pure research is now seeing serious competition among Genesis AI, NVIDIA, Figure and Boston Dynamics, backed by credible investor voices.

In plain language

Imagine you have to teach a child to crack an egg. Words alone are not enough; the child has to see how a hand grasps the egg, taps it on the bowl, opens it gently. GENE-26.5 tries to do exactly that on a large scale: an artificial hand gathers enough experience with eggs, tomatoes, pipettes and piano keys via a model that it can attempt new tasks somewhat sensibly, instead of being painstakingly hand-programmed for each one.

A practical example

A Swiss lab-automation specialist with 250 employees is evaluating in 2026 whether to deploy a humanoid robot arm running GENE-26.5 in a pilot research cell. Tasks: pipetting 50 microliters into microplates, buffer swaps, reagent labeling. The team starts with three pilot tasks in a fenced cell and compares them for twelve weeks against today's standard arm. Key KPIs: volume-accuracy error rate, downtime per shift, maintenance hours. Only after solid evaluation does it make sense to scale to other stations.
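A pilot evaluation like this ultimately reduces to comparing a handful of KPIs between the new cell and the existing arm. A minimal sketch of such a gate, with entirely invented figures and a hypothetical tolerance threshold (none of this is vendor or Genesis AI data):

```python
# Hedged sketch of a pilot-vs-baseline KPI comparison.
# All numbers and the 5% tolerance are invented for illustration.


def evaluate_pilot(pilot: dict, baseline: dict, tolerance: float = 0.05) -> bool:
    """Recommend scaling only if the pilot cell is no worse than the
    baseline arm on every KPI (lower is better for all three here)."""
    kpis = ["volume_error_rate", "downtime_h_per_shift", "maintenance_h_per_week"]
    return all(pilot[k] <= baseline[k] * (1 + tolerance) for k in kpis)


baseline = {"volume_error_rate": 0.010, "downtime_h_per_shift": 0.3, "maintenance_h_per_week": 2.0}
pilot    = {"volume_error_rate": 0.012, "downtime_h_per_shift": 0.2, "maintenance_h_per_week": 1.5}

# With these invented numbers the pilot wins on downtime and maintenance
# but misses the volume-accuracy threshold, so the gate says "don't scale yet".
print(evaluate_pilot(pilot, baseline))
```

The design choice to require every KPI to pass, rather than an average, mirrors the article's caution: a single weak metric (here, volume accuracy in pipetting) is enough to block scaling.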

Scope and limits

In May 2026, GENE-26.5 is at the research and demo stage. Genesis AI has not announced broad commercial availability. Demos are informative but do not replace independent benchmarks. Fine-motor tasks involving both hands remain error-prone in unstructured environments; pipetting in a fenced lab cell is different from an open classroom. Safety, liability and EU machinery-regulation questions remain unresolved for production setups. Scaling too early risks downtime and audit problems.

SEO and GEO keywords

Genesis AI, GENE-26.5, vision-language-action, VLA, foundation model, robotics, robot hand, Khosla Ventures, Eclipse, embodied AI, Wuji Tech, humanoid robot, 2026

💡 In plain English

Genesis AI has unveiled an AI model that can control a multi-finger robotic hand. In demos it slices tomatoes, cracks eggs, pipettes in a lab and even solves a Rubik's Cube. The model is called GENE-26.5.

Key Takeaways

  • Genesis AI publicly introduced GENE-26.5 on 6 May 2026, the startup's first foundation model.
  • The model controls a human-scale robotic hand built with Wuji Tech in China.
  • Demos cover a 20-step cooking recipe, pipetting, wire harnessing, piano playing and a Rubik's Cube.
  • Genesis AI is backed by a US$105 million seed round from Khosla Ventures and Eclipse.
  • The model competes with MolmoAct, NVIDIA Isaac GR00T and Figure on the VLA front.
  • Broad commercial availability has not been announced as of May 2026.

Sources & Context