A common digital twin for production transparency at Audi

Audi is taking live data from sensors across the factory to create a real-time network on hyper-converged infrastructure to control robots

Audi has been working on a digital twin of its production processes and, through Edge Cloud 4 Production, it is now ready to control the automated physical factory with a virtual PLC connected in real time, according to Henning Löser, head of the Audi Production Lab.

Audi is using connected digital twin technology to manage production volatility, including for electric vehicle batteries, and to make assembly at its factories flexible, fast and robust. At the same time, the carmaker says it can monitor factory operations in near-real time with virtual programmable logic controllers (vPLCs), simulate changes before deployment, and rapidly adapt to new product variants or process adjustments. Audi sees the combination of real-time control and digital twin it has developed as key to the next generation of production planning, virtual commissioning and intelligent automation.

Speaking at this year’s Nvidia GTC conference held in March in San Jose, California, Henning Löser, senior manager and head of the Audi Production Lab, highlighted Audi’s strategy for integrating AI-driven robotics into its plants, anchored by digital twins of its production facilities. Using what Audi is calling Edge Cloud 4 Production (EC4P), the carmaker is planning to take live data from sensors across the factory to create a real-time network on hyper-converged infrastructure (HCI) to control robots, tools and other machinery.

We need a digital twin as a scalable, collaborative platform for next-gen production planning, virtual commissioning and robot training, in combination with real-time connectivity to the physical world

Henning Löser, Audi Production Lab

Nvidia provides accelerated computing and artificial intelligence (AI) software, including the Omniverse platform that Audi is using to assist with its production planning overhaul.

According to Löser, EC4P is enabling Audi to increase line efficiency, adaptability and overall competitiveness in the face of rapidly evolving manufacturing technologies, and he pointed to electric vehicle (EV) battery production as a good example of that rapid evolution.

Cross-functional coping strategy

According to Henning Löser, Edge Cloud 4 Production is enabling Audi to increase line efficiency, adaptability and overall competitiveness in the face of rapidly evolving manufacturing technologies

Product volatility is a core challenge in production planning for batteries, said Löser, because there is no guarantee that an EV designed today for launch in three years’ time will use batteries with the same chemistry. That chemistry has an impact on the geometry of the cells and, in turn, the cells have an impact on the packaging of the battery. On top of that, there is volatility in the number of battery units and EVs being produced, a lesson the automotive industry has been learning in the last couple of years.

Accommodating these unknowns in EV production requires flexibility in production planning. Managing variance proactively is essential for stable, cost-efficient and scalable manufacturing, said Löser. To organise that, Audi has to think cross-functionally about what a change to the battery means end-to-end along the production process and how it affects the assembly line.

“We have to manage variants proactively and for that we need the right toolbox,” said Löser, adding that while Audi had developed digital tools with its partners to cope with variations to production, those individual toolboxes need to be brought together in one common digital twin to provide transparency from the shopfloor all the way to the management board. Löser said that what Audi needed was a tool that is simple to use, that helps with variance modelling and provides speed and flexibility in the production planning process for batteries.

“What we did was start from the point cloud of our production site and integrate it into [Nvidia] Omniverse,” he said.

Audi has programmed its own asset library and used it to build up a new production line

Real-time collaboration

A point cloud is a dense collection of data points in 3D space that represents the external surfaces of an object or environment. Audi Production Planning made a digital map of the production layout, one example being its Ingolstadt plant in Germany. Nvidia’s Omniverse platform enables the carmaker to build and operate industrial-grade metaverse applications and physical AI: the software connects 3D design tools for real-time collaboration, physically accurate simulation and photorealistic rendering across digital twin, robotics training and simulation workflows.
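In code, a point cloud is little more than an N×3 array of sampled surface coordinates. A minimal sketch (the values are illustrative, not Audi data) shows the kind of representation a scan produces and a first step when fitting it into a planning tool’s coordinate system:

```python
import numpy as np

# A point cloud is just an N x 3 array of (x, y, z) surface samples.
# Hypothetical example: three scanned points from a factory layout.
scan = np.array([
    [0.0, 0.0, 0.0],   # floor reference point
    [2.5, 1.0, 0.0],   # base of a robot pedestal
    [2.5, 1.0, 1.8],   # top of the pedestal
])

# Axis-aligned bounding box of the scanned region, useful for aligning
# the scan with the digital twin's coordinate frame.
lower, upper = scan.min(axis=0), scan.max(axis=0)
print(lower, upper)
```

Real factory scans contain millions of such points, but the same array operations scale directly.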

“We started to program our own asset library and then took that asset library to build up our new production line,” explained Löser. “By building up this new production line [digitally], it is really easy to talk with all of our suppliers, with all of our partners on the same common ground – [sharing] what is happening, and why we are doing this.”

In this pilot Audi is able to duplicate the battery package and carry out accurate robot motion planning in the digital twin. “We can check whether the robot can reach all the parts it is supposed to reach and by doing that we can verify the production set up that we have chosen in this digital twin, so it can actually work for what we want to do,” said Löser.
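The reach verification Löser describes can be illustrated with a deliberately simplified geometric filter. A real motion planner solves full inverse kinematics and collision checks; this sketch (all names and values hypothetical) only tests whether assembly points fall inside a robot’s reach envelope:

```python
import math

# Hypothetical sketch: treat the robot's reach as a sphere of radius
# `max_reach` around its base, and flag any target outside it. A real
# planner would run inverse kinematics and collision checks instead.

def unreachable_points(base, max_reach, targets):
    """Return the targets that lie outside the robot's reach sphere."""
    return [t for t in targets if math.dist(base, t) > max_reach]

base = (0.0, 0.0, 0.5)  # robot base position (metres)
targets = [(0.5, 0.2, 0.8), (1.1, 0.0, 0.5), (2.4, 1.0, 0.3)]
missed = unreachable_points(base, max_reach=1.3, targets=targets)
print(missed)  # -> [(2.4, 1.0, 0.3)]
```

If `missed` is non-empty, the chosen production set-up fails verification and the layout or robot placement has to change in the digital twin before anything is built.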

The speed of development is a particular highlight for Audi Production Planning, according to Löser. “In April 2025 we made the first touchpoints on how this could be doable and by September we had the first showcasing,” he said. “In December we had two production facilities built up and now we are thinking about the rollout of the digital twin capabilities throughout our battery production planning. That is amazing. Think about the time it takes if you have to build up an expert system. It would take much longer to do that.”

Moving from local to remote

Audi is integrating more sensors into its production process, including on its robots, to enrich the feed of live data and monitor how production is performing. That improves efficiency in controlling production processes, including quality control.

Löser pointed out that conventional thinking about cyber-physical systems for discrete automation includes some kind of computer attached to a mechanical device.

“You have the robot and the robot controller cabinet and all the motion planning which you program, that is all tightly coupled together,” said Löser. “Your PLC [programmable logic controller] is in your automation cell; it is down there, it controls everything.”

Intelligent robot control will lead to a higher automation ratio through the use of cameras

However, leaps in software innovation are being made every seven to nine months and AI chips are changing fast. By comparison, an expensive assembly line robot and the automation technology that controls it have a lifespan of at least 15 years. So how, asked Löser, does a company run the same robot on software that will be released three years from now, using AI chips that come out five years in the future? For Audi, the solution is to follow the same path it took in evolving its data centre: disaggregating the mechanical automation equipment from the local controller.

“The only thing you need is real-time network and then you run all the software part on the HCI with real-time applications through the real-time network,” said Löser – the aforementioned EC4P.

An HCI is a software-defined IT infrastructure that virtualises all the elements of conventional hardware-defined systems. With HCI, both the storage area network and the underlying storage abstractions are implemented virtually in software rather than physically in hardware.

Automation cell control

The first application that Audi is running on its EC4P is the virtual PLC it developed with Siemens – the Simatic S7-1500V – which is now running on the HCI and controlling the carmaker’s automation cells at Ingolstadt. According to Siemens, the virtual PLC is essential for visualising the shopfloor and is the first controller certified by the independent German technical inspection associations (TÜV) for failsafe operational IT infrastructure. Löser said the virtual PLC performs in the same way that a physical PLC would in its automation cells. Rather than adding a control box to every new piece of equipment, each of which must be individually updated by maintenance staff, updates can be made through the HCI using the software that controls the specific function.

Transferring that function from the PLC to AI robotics means the software that controls the robot is on the HCI and that enables it to become scalable. “That is because it is easy to integrate new AI GPU chips into your server infrastructure,” explained Löser. “Then all of the robots in your factory can tap into that computing power that you have to run your facility.”

Bridging the gap

Audi is now bridging the virtual-to-physical gap between its real-time, connected HCI and the digital twin it uses for production planning. One of the things that needs to change is the way robots are programmed, to iron out the inevitable discrepancies between a digital twin and the physical assembly line once it is built.

“When you come to the construction site and somebody bolts down the robot to the shopfloor… will that robot be at exactly the data point in space where it was designed to be?” asked Löser. “Probably not. Most likely never.”

The discrepancy between your digital twin from production planning and the real world grows. What we want to do is have them do exactly the same thing, but do it in the digital twin

Henning Löser, Audi Production Lab

The risk is that digital twin programs won’t work because of inconsistencies between the shopfloor and the digital twin, which is why Audi’s mechanical engineers had to start with a point cloud of the real factory to bring the physical world into the digital twin before starting the production planning process. With that accurate data, Audi can begin to look at robot control and at speeding up assembly by removing the need for robot interlocks, the safety mechanisms built in to prevent a robot from operating while a hazard is present. Currently, every robot has to be programmed individually, meaning robots cannot interact without the interlock safety mechanisms.

“These robots waiting for one another is a waste of value-adding time,” said Löser. “In discrete automation there is no other way of doing this.”

An AI application detects weld spatter on the underbody of a car body in the body shop at the Audi site in Neckarsulm

What needs to happen instead is to connect all of the robots to path-planning software with millisecond latency so that the platform knows the position of every single robot. That way, Löser said, the software can operate all of the robots in their cells without interlocks. With that control, the software can recalculate the path every time there is a deviation and compensate by slowing one process down or speeding another up.
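A toy illustration of that idea, assuming a fixed control tick and two robots approaching each other on a single axis (a real planner would check 3D swept volumes across many robots, none of this reflects Audi’s actual software):

```python
# Hypothetical sketch of interlock-free coordination: a central planner
# ticks on a fixed cycle, predicts each robot's next position, and scales
# speeds just enough to hold a safety margin, instead of stopping a robot
# behind a hard interlock.

SAFETY_MARGIN = 0.5  # metres

def step(positions, speeds, dt=0.001):
    """Advance both robots one tick, slowing them only as much as needed."""
    gap = abs(positions[0] - positions[1])
    proposed = [p + v * dt for p, v in zip(positions, speeds)]
    new_gap = abs(proposed[0] - proposed[1])
    closing = gap - new_gap  # how much the gap shrinks this tick
    if new_gap < SAFETY_MARGIN and closing > 0:
        # Scale speeds so the robots stop approaching exactly at the margin.
        scale = (gap - SAFETY_MARGIN) / closing
        speeds = [v * scale for v in speeds]
        proposed = [p + v * dt for p, v in zip(positions, speeds)]
    return proposed, speeds

positions, speeds = [0.0, 1.0], [100.0, -100.0]  # exaggerated speeds (m/s)
for _ in range(5):
    positions, speeds = step(positions, speeds)
print(round(abs(positions[0] - positions[1]), 3))  # -> 0.5, held at the margin
```

The point of the sketch is the control placement: the adjustment happens in one central loop with a global view, not in per-cell controllers that can only wait on each other.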

“Do not program your robots deterministically at every single robot cabinet anymore, go one layer up,” said Löser, adding that in doing so you can disaggregate the computer from the real cabinets and land it on the hyper-converged, real-time connected infrastructure. “What you have then is a digital twin that is connected to your real world on a millisecond basis.”

Audi’s goal is intelligent robot control that delivers faster ramp-up times by removing the need for robot pre-programming in the digital twin. That also promises to lower costs by saving time in the robot automation cells and reducing the number of robots required to perform the tasks. Intelligent robot control will also lead to a higher automation ratio through the use of cameras. “If you have a camera you can start reacting to the environment, not only to the process data out of the robots and the automation tooling, but also by indicating that the piece you want to grasp is actually where it is supposed to be,” said Löser. There are also more opportunities to use robots to automate arduous tasks currently carried out by lineside workers, to drive greater efficiency.

“We are working on having a digital twin as a scalable, collaborative platform to do production planning,” said Löser. “If we do that, we are sure that we can do the virtual commissioning and the robot training in this digital twin. The next challenge then is that we have this digital twin and make it coherent to the physical world at all times.”