The BMW Talent Campus, located directly at the main Munich plant, only opened in October 2025. The building itself already says a lot about how the group currently wants to present itself and be understood: open, modern, technology‑driven, and close to development and production at the same time. That fits a brand which has recently cut quite a good figure around its iFactory. And “figure” in this case is more than a linguistic coincidence. With Figure.AI, BMW has already gained initial experience with humanoid robotics in production at its Spartanburg plant. Now Hexagon marks the next step. On site in Munich it becomes clear how seriously BMW takes this topic. In the new centre of competence for physical AI based at the Talent Campus, young engineers, robotics and AI experts are working to advance humanoid and other learning systems to the point where they not only work in demos, but extend into real plant operations on as many levels as possible.
A centre of competence is intended to bring speed and order together
Felix Häckel, head of the new “Center of Competence for Physical AI in Production” at BMW, describes the starting point quite clearly. The technology is currently developing at high speed. At the same time, various proofs of concept and technology trials are running in plants worldwide. The aim now is to pool knowledge, experience and capabilities so that these technologies can be industrialised more quickly. To this end, BMW has set up a multidisciplinary team in Munich that brings together robotics, AI and production expertise and is intended to organise the transfer from the lab into the plants. In doing so, Häckel refers not only to research partners and publicly funded projects, but above all to the concrete requirements from the factories. What is crucial is understanding what the plants actually need and how new systems can be transferred into existing processes.
We pool our expertise to make knowledge about AI and robotics usable across the entire production network.
Felix Häckel, head of the Center of Competence for Physical AI in Production at BMW
This statement is more than organisational prose. It describes the real core of the project. BMW does not want to treat physical AI as a single demo, but as a new production domain, with a network, standards, learning curves and a roadmap from the lab bench into day-to-day plant operations. That is precisely why Häckel also points to the exchange with sites such as Spartanburg and Leipzig. The aim is not to show individual robots, but to turn distributed experience into robust industrialisation.
The brain counts for more than the form factor
Nikita Aleshin, who is responsible at BMW for strategy and humanoid projects in the field of physical AI, then deliberately shifts the focus away from the hardware. Whether a system moves on wheels, walks on legs or consists only of arms is initially of secondary importance for BMW. What matters is what happens in the robot’s “brain”. Aleshin describes this brain as a foundation model into which BMW feeds industrial knowledge. It is supposed to understand what a BMW component is, how it is assembled with another part and what needs to be done in a specific production situation. To do this, the model combines several layers: language as instruction, camera images as perception and sensors for the robot’s self-perception. Concrete actions arise from this combination. Aleshin illustrates this with the system’s audible “click”: the output is not text, but the next movement.
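The layered “brain” Aleshin describes can be sketched as an interface: instruction, camera perception and self-perception go in, a movement comes out. This is only an illustrative stub under that assumption; all names and the trivial policy are invented, not BMW’s actual model.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    instruction: str              # language layer: what the robot is told to do
    camera_pixels: List[float]    # perception layer: camera images
    joint_angles: List[float]     # self-perception: the robot's own sensors

@dataclass
class Action:
    joint_deltas: List[float]     # the output is not text but the next movement

def policy_step(obs: Observation) -> Action:
    """Stand-in for the foundation model: combines instruction, perception
    and self-perception into the next motion increment."""
    # A real model would run a learned network here; this stub just nudges
    # every joint toward zero to show the shape of the interface.
    return Action(joint_deltas=[-0.1 * a for a in obs.joint_angles])
```

The point of the sketch is the signature, not the body: nothing in the output is text, and every layer of input feeds a single motion decision.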
For us, the decisive factor is not the form factor, but what is happening in the robot’s brain.
Nikita Aleshin, Physical AI expert at BMW
This is where the break with classical robotics lies. Today, factories are still largely dominated by pre-programmed systems. Every weld point, every position, every sequence is defined. Aleshin openly says that BMW wants to move away from exactly that. Instead of programming point by point, robots are to learn through imitation learning – by demonstration, repetition and adaptation. The model does not simply receive coordinates, but an understanding of the task. This is relevant for BMW because, while a generic model may be able to grasp a cup, it is still far from knowing what a factory looks like or how a specific assembly process works. This industrial domain knowledge first has to be built up.
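The contrast with point-by-point programming can be made concrete with a toy version of imitation learning. The following sketch uses a 1-nearest-neighbour lookup over demonstrated (state, action) pairs, which is an assumption of mine for illustration; BMW’s actual models are far more sophisticated, but the shift is the same: the robot receives demonstrations rather than coordinates.

```python
from typing import List, Tuple

Demo = Tuple[List[float], List[float]]  # (observed state, demonstrated action)

def nearest_demo_policy(demos: List[Demo], state: List[float]) -> List[float]:
    """Return the action whose demonstrated state is closest to the current one.
    1-nearest-neighbour is the simplest learner that still generalises across
    the small variations of real everyday factory work."""
    def dist(a: List[float], b: List[float]) -> float:
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, best_action = min(demos, key=lambda d: dist(d[0], state))
    return best_action

demos = [
    ([0.0, 0.0], [1.0, 0.0]),   # part at the reference position -> move right
    ([0.0, 1.0], [0.0, -1.0]),  # part shifted upward -> move down first
]
# The part is almost, but not exactly, where it was demonstrated:
action = nearest_demo_policy(demos, [0.05, 0.02])
```

No program branch was written for the offset position; the demonstrated behaviour covers it, which is exactly the property that fixed coordinate lists lack.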
The interesting moment is the error
What this looks like in practice is demonstrated by Michael Gentner, a physical AI expert at BMW, using a demo of a comparatively small task. An electronic control unit is to be placed on a bracket and then secured. Today this process is manual because it is either too complex or not economically viable to automate. It is precisely such tasks that BMW regards as an interesting test case for learning-capable robotics. Not because they are spectacular, but because they contain the subtle deviations of real everyday factory work. Parts are not always in the same place, their orientation varies, and small tolerances add up.
AI expert Gentner explains what is meant by the often abstract-sounding term recovery. If a grasp does not succeed immediately, the system does not simply stop. It tries a different approach and keeps working. This is precisely one of the key differences from classical automation. Not every eventuality has to be created in advance as a fixed program branch. The system is supposed to infer from demonstrations and training data how to deal with variance and disturbances.
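The recovery behaviour Gentner describes boils down to a loop rather than a fixed program branch. This sketch is a hypothetical illustration, with invented numbers and a mocked grasp check, of the principle that a failed grasp triggers an adjusted retry instead of a stop.

```python
def attempt_grasp(offset_m: float) -> bool:
    """Stand-in for one grasp attempt; succeeds only close to the true pose."""
    return abs(offset_m) < 0.01

def grasp_with_recovery(initial_offset_m: float, max_attempts: int = 5) -> bool:
    """If a grasp does not succeed immediately, do not stop: correct the
    approach from what was perceived and try again."""
    offset = initial_offset_m
    for _ in range(max_attempts):
        if attempt_grasp(offset):
            return True
        offset *= 0.3   # re-perceive and shrink the error toward the target
    return False
```

The classical alternative would require every failure mode to exist in advance as its own branch; here a single correction rule absorbs the variance.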
Without industrial data the model remains blind
It is at this point in particular that it becomes clear why BMW is approaching the topic so much through data and not only through robotics. The foundation models may be publicly available. That does not mean they have automatically seen industrial reality. Household objects, public datasets and generic movement patterns are comparatively easy to access. Industrial parts, assembly logic, production-typical processes and the imprecision of a real plant, on the other hand, are not.
Aleshin makes it clear that this domain knowledge is exactly the decisive factor. A model must first learn what is actually relevant in the factory. It has to understand how components belong together, how tasks vary depending on context and which processes in production really matter. Only then does a generic model become a system that can be used industrially. Physical AI thus becomes less a question of an individual robot and more a question of the data basis and specialisation.
Series production begins in the last millimetres
The demonstration also makes clear where the real hurdle lies. Not with the first grasp, but with the precision at the end of the movement. As soon as a component has to be placed exactly, for example in a defined position or on a mechanical reference point, it becomes clear whether a demo can become a robust process. From the outside this moment often looks like a brief hesitation. In reality, these are the final corrections that reveal whether a system can cope with tolerances.
In a production environment with fixed cycle times it is not enough if gripping or transporting works in principle. The final positioning also has to be repeatable. This is precisely where the dividing line runs between a technical demonstration and readiness for series production. The difference between a trial and shopfloor reality therefore rarely lies in the large movement, but almost always in the last millimetres.
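The dividing line between demo and series readiness can be stated as a repeatability check: not whether one placement succeeded, but whether the worst placement over many repetitions stays inside tolerance. The tolerance and error values below are invented for illustration.

```python
def is_series_ready(final_errors_mm: list, tolerance_mm: float = 0.5) -> bool:
    """A demo succeeds once; series production demands that the last
    millimetres stay inside tolerance on every repetition."""
    return max(abs(e) for e in final_errors_mm) <= tolerance_mm

demo_run   = [0.2]                         # one good placement looks finished
pilot_runs = [0.2, 0.4, 0.1, 0.8, 0.3]     # repetition exposes the outlier
```

The single demo passes; the pilot series fails on one outlier, which is precisely the difference between a trial and shopfloor reality.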
There is also the question of speed. This is not treated as a minor issue at BMW either. More training and better calibration can accelerate processes. At the same time, the economic perspective remains crucial. A slower system can still make sense in the factory if costs, parallelisation and station logic are right. It is not the individual movement that counts, but the overall performance of a process.
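The economic argument, that a slower system can still make sense if parallelisation and station logic are right, is simple arithmetic. The cycle times and station counts here are hypothetical, chosen only to show the trade-off.

```python
def parts_per_hour(cycle_time_s: float, stations: int) -> float:
    """Throughput of identical stations working in parallel."""
    return stations * 3600.0 / cycle_time_s

# Illustrative numbers: one fast station versus four slower, cheaper ones.
fast_single   = parts_per_hour(cycle_time_s=30.0, stations=1)   # one station
slow_parallel = parts_per_hour(cycle_time_s=90.0, stations=4)   # four stations
```

Three times the cycle time, yet higher overall output: it is not the individual movement that counts, but the performance of the process as a whole.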
Leipzig becomes the acid test
Where this logic is to be tested in practice is described by Michael Ströbel, head of process management and digitalisation in production at BMW. Leipzig is, for BMW, a particularly suitable starting point for the industrialisation of a robot because the plant maps the entire value chain. Press shop, body shop, paint shop, assembly, plus injection moulding, component production and energy modules. If a system works there, the transferability to the rest of the production network with more than 33 plants is significantly more likely. Ströbel also outlines the selection process for Hexagon. BMW continuously scans the market for technologies that could help with automation and efficiency. Partners are initially challenged in the laboratory with realistic tasks, such as sorting unstructured parts or measurement tasks. Only when that works does BMW consider a pilot in the plant. After an initial test in December 2025, two Aeon robots are now scheduled to come to Leipzig in the second quarter of this year. If everything goes to plan, they could be in production by the end of the year.
If we bring a robot to Leipzig and can use it there across all these technologies, then it can operate in our entire production network.
Michael Ströbel, head of process management and digitalisation in production at BMW
This gives the pilot a clear character. Leipzig is not just a backdrop, but an acid test. BMW is not looking there for an isolated demonstration, but for the answer to how robust a system remains in as many production environments as possible. That is precisely why the pilot is also strategically more interesting than a mere announcement of a humanoid robot.
Safety is not a footnote, but the price of admission
In parallel, BMW makes it clear that such systems only become relevant if they fit into industrial safety logic. In the workshops at the Talent Campus, attention therefore turns surprisingly quickly to classic questions from production. Where is the emergency stop? How is a cell stopped? Can people work in the same room? The answers are clear. This technology will only be used if it complies with current safety standards. At the same time, there is the requirement that people must be able to share the workspace with the robots. Otherwise BMW would not pursue this approach. This shifts the debate: the benchmark is not the spectacular movement, but certifiability. Humanoid robotics only becomes interesting in the factory when it can be treated like any other piece of equipment.
Hexagon relies on sensors, edge AI and automatic battery swapping
Arnaud Robert, president of Hexagon Robotics, describes Aeon as a multi‑purpose humanoid that was not built from the outset solely for manipulation. In addition to gripping tasks, the system is also intended to support inspection, reality capture and the creation of a digital twin. To this end, Aeon is equipped with 34 degrees of freedom and 22 sensors. Robert lists different camera types, including peripheral, SLAM, time‑of‑flight and infrared cameras. Hexagon also opted for wheels, because in a factory environment they handle distances and changes of direction more efficiently over the long term than other form factors. On the battery, too, Hexagon follows factory logic rather than show effect. Three hours of running time are only one part of the calculation. What really matters is 24/7 operation with automatic battery changing. The swap takes 23 seconds and is based on a superconducting magnetic mechanism.
From the very beginning we developed Aeon as a multi-purpose humanoid.
Arnaud Robert, president of Hexagon Robotics
Robert describes physical AI in several layers. Firstly, simulation and reinforcement learning. Secondly, perception, meaning not just object recognition but assigning a task to an object. Thirdly, imitation learning, where tasks are demonstrated rather than programmed. And fourthly, world models, which are intended to give the system a broader understanding of objects, material properties and the environment. A particularly important point for him is execution directly on the robot. Aeon works with two Nvidia Jetson boards. One consolidates the sensor data, the other translates task and perception into movements. Hexagon regards exactly this edge setup as critical in the factory environment, because obstacles, people and changing situations there require immediate reactions.
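The division of labour between the two on-robot boards, one consolidating sensor data, the other translating task and perception into movements, can be sketched as a two-stage pipeline. Everything below is illustrative: the function names, sensor streams and trivial motion rule are assumptions, not Hexagon’s software.

```python
from typing import Dict, List

def consolidate_sensors(raw: Dict[str, List[float]]) -> List[float]:
    """First board (illustrative): fuse the separate sensor streams into one
    state vector that the second board can consume."""
    state: List[float] = []
    for name in sorted(raw):          # deterministic ordering of the streams
        state.extend(raw[name])
    return state

def plan_motion(task: str, state: List[float]) -> List[float]:
    """Second board (illustrative): translate task plus perceived state into
    a motion command, entirely on-robot so reactions stay immediate."""
    # Stub rule: move against the perceived offsets to close in on the target.
    return [-s for s in state]

raw = {"tof_camera": [0.1], "slam_camera": [0.0, -0.2], "infrared": [0.05]}
command = plan_motion("inspect bracket", consolidate_sensors(raw))
```

Keeping both stages on the robot rather than in a remote backend is the design choice the article calls critical: obstacles, people and changing situations do not wait for a round trip to a server.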
The digital twin is a tool, not a crutch
Both BMW and Hexagon emphasise that the digital twin is indeed central for training and simulation, but is not intended as a permanent support for real operations. Aeon can capture the factory, generate 3D information and thus contribute to updating the digital twin. At the same time, the system creates its own maps during operation and uses them for localisation. This is an important point. The digital twin here is not a replacement for perception, but an accelerator of industrialisation. Development times are to be reduced, training loops shortened and new use cases brought into real environments more quickly. In the end, however, the benchmark remains the real factory and not the virtual model.
In the end, what remains is a strikingly sober conclusion. When it comes to physical AI, BMW is not primarily interested in the robot’s body. What matters is whether a system can learn, whether it can recover from errors, whether it can reliably master the final few millimetres, and whether it can integrate into the safety, cycle times and process logic of a real factory. This is where the real shift lies. No longer automation based only on rigid sequences, but automation based on trained behaviour. The path to this does not run via grand visions, but via data, repetition and many small corrections.