The latest simulation software can speed up vehicle modelling and compress development timescales

At the beginning of 2014, Swedish supercar manufacturer Koenigsegg launched a vehicle which, even by its own standards, achieved new levels of extreme performance. The One:1 takes its name from the one-to-one ratio of its horsepower to its kerb weight in kilograms, though perhaps its single most eye-catching statistic is its top speed: a stunning 440kph. Another speed-related fact is that the whole development cycle for the vehicle took only eight months, with just one year elapsing from project go-ahead to delivery of the first vehicle. Only six will be made.

As explained by Jon Gunner, the company’s technical director, like many mainstream carmakers Koenigsegg uses well-established design software: the Catia V5 3D modelling system from Dassault Systèmes and the Alias surface modelling system now supplied by Autodesk, the latter being used for Class-A surfaces. For simulating airflows around its vehicles, Koenigsegg also uses a computational fluid dynamics package called ‘Icon FoamPro’ from UK-based ICON. The software runs on an in-house supercomputing installation with 32 cores. All of these systems, adds Gunner, have been installed over the past decade as the company has pursued a policy of progressively in-sourcing all its design and development operations.

Gunner says the specification of the One:1 effectively pushed Koenigsegg “to the limits”, since the vehicle was intended to act as both a high-performance track-racing car and an ultra-high-speed on-road vehicle “without any set-up changes”. It was therefore to be “two cars in one”. As Gunner points out, those two applications conflict in one crucial aerodynamic respect: the “huge amount” of downforce required for the first role would, if allowed to build unchecked as the car gained speed, create excessive drag and suspension loads that would unacceptably inhibit the second.
 


The One:1 therefore has a variable aerodynamic system in which the huge downforce generated at track speeds is suppressed as much as possible as the car approaches maximum velocity. “From 300kph upwards, we actually start reducing the downforce rate as much as the bodywork allows,” Gunner confirms. Koenigsegg’s official figures are that the car generates a downforce of 610kg at 260kph, but only 830kg at 440kph.
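To see why 830kg at 440kph counts as suppression rather than growth: for a fixed aerodynamic configuration, downforce scales roughly with the square of airspeed. A back-of-envelope check using the article’s own figures (a simplified model that ignores Reynolds-number and compressibility effects):

```python
# Back-of-envelope check: for fixed bodywork, downforce scales ~ v^2.
# Figures from the article: 610 kg at 260 kph, 830 kg at 440 kph.
downforce_260 = 610.0            # kg at 260 kph
v_track, v_max = 260.0, 440.0    # kph

# What a fixed (track) configuration would generate at top speed:
unchecked = downforce_260 * (v_max / v_track) ** 2
print(f"Unchecked downforce at {v_max:.0f} kph: ~{unchecked:.0f} kg")

# What the variable aerodynamic system actually allows:
actual = 830.0
print(f"Suppressed to: {actual:.0f} kg ({actual / unchecked:.0%} of unchecked)")
```

The unchecked figure comes out at roughly 1,750kg, so the variable system cuts top-speed downforce to under half of what the track configuration would otherwise generate.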

Several systems on the car enable this feat. There is a dynamic flap system at the front that starts off as a high-pressure Venturi flap but is cut off from the airflow at higher speeds, as is some of the flow under the car. “We cut off a lot of the underbody downforce,” states Gunner. Furthermore, the rake of the car can be altered by slightly elevating the front of the vehicle. The most obvious aerodynamic feature of the One:1 – its twin wing system above the rear of the vehicle – can also be adjusted from a very high to a “minimal” downforce and drag setting, according to Gunner. The One:1 programme, he confirms, was the first time Koenigsegg had attempted this approach in one of its vehicles.

Simulating success
Intensive simulation of the vehicle’s aerodynamic performance under multiple scenarios was therefore an essential part of the development programme. However, Gunner says that doing so within the highly compressed development timescale would simply have been “impossible” with Koenigsegg’s in-house resources alone. Wind tunnel testing was also unfeasible, because no facility can generate airflows as fast as 440kph.

The solution was to use high-performance computing facilities remotely, via the cloud. These services were provided by HPC Wales in the UK. Gunner explains that the method involved running simulation routines on virtual models of the car at either a high or a slightly lower meshing density. The remote facilities were used to run multiple tests – six or seven at a time – on the less dense models, each representing a different aerodynamic configuration. Further compression of the run-times for those models was possible because each represented only one half of the car. Koenigsegg then carried out a more limited number of validation exercises using its in-house facilities, though in those instances complete vehicle representations were used.

Software provider ICON also played a role, installing its software on the HPC Wales hardware and using 128 cores – four times the carmaker’s in-house resources. From the perspective of Koenigsegg’s users, input procedures were almost identical. “We really didn’t see much difference,” states Gunner. “It just went a lot faster.”

Even so, a run of 24 hours was required to simulate just one second of virtual driving conditions (compared with several days on the in-house hardware back at Koenigsegg in Sweden). Altogether, the need to simulate the car’s performance across multiple settings for its aerodynamic elements meant that “several hundred” simulation routines were run, according to Gunner. Some tweaking of the software was required because, at the simulated speeds involved, compressibility effects in the airflow become significant. “The software is very tuneable,” explains Gunner.
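The compressibility point can be illustrated with a simple Mach-number estimate (assuming the standard sea-level speed of sound of roughly 343m/s; incompressible-flow treatments are conventionally trusted only below about Mach 0.3):

```python
# Estimate the Mach number at the One:1's 440 kph top speed to see why
# compressibility could no longer be neglected in the CFD runs.
speed_kph = 440.0
speed_ms = speed_kph / 3.6       # convert to metres per second (~122 m/s)
speed_of_sound = 343.0           # m/s at roughly 20 degrees C, sea level

mach = speed_ms / speed_of_sound
print(f"Mach number at top speed: {mach:.2f}")

# Rule of thumb: incompressible-flow assumptions hold below ~Mach 0.3,
# so at this speed the solver's handling of compressibility needed tuning.
```

The result is around Mach 0.36, comfortably above the usual incompressible-flow threshold.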

Looking back at the project, Gunner says the use of remote supercomputing resources went better than expected. The key enabling factor he identifies is the ability to access those resources as needed, for as long as needed. As a result, Gunner says Koenigsegg will certainly use the approach again. In fact, it has already done so for its next vehicle project, which is due for launch in February 2015.

Everyday simulation
But what is the day-to-day reality of using digital tools in mainstream automotive manufacturing? One company that can provide an answer is Envisage Group in Coventry, UK. Operations director Adrian Coppin explains that the 160-strong team provides major OEMs with a range of services including design, engineering, model-making and low-volume panel production, particularly for body-in-white prototype build tasks.

It is a business which, as Coppin admits, still has some distinctly manual elements – for instance the manufacture of one-off panels – but in which the communication between Envisage and its clients is now overwhelmingly digital. He says the format in which design information comes into the company is “primarily” CAD data, with the Catia V5 system effectively the industry standard. Envisage itself operates that system, though Coppin says it also has ‘seats’ of the Siemens PLM design software and the Alias surface modelling package.

A typical sequence of operations for making a clay model starts with the receipt of surface data in V5 format, which is then converted into instructions to drive the CNC machining of the clay surface. But that physical model inevitably requires some manual modifications after being viewed by the OEM’s designers, and so the amended model is scanned to record the alterations. However, the scanning process only records ‘point cloud’ data and so actual resurfacing of the CAD model is conducted using the Alias software, before the new virtual model is reconstituted in V5. The Alias software, says Coppin, is a “much easier tool” for actual surface creation.
 
But once those surfaces have been generated, further ‘engineering’ work back at the OEM, such as the creation of attachment points on the interior surfaces, will take place in V5. After that, the design data comes back to Envisage, now as detailed individual panels rather than a single surface model, and can be used to drive the manufacture of actual prototype parts. “We can machine a low-volume mould tool out of high-density composite board, which will be released to our panel makers to produce the parts,” Coppin explains. That final stage of production, he adds, is mostly manual, though if volumes are high enough – say ten to 12 parts – the process can be made semi-automatic through the use of a rubber press to pre-form the metal panels before they are finished off by hand.

An increasingly digital future
It is not a process that Coppin expects to see changing substantially in the foreseeable future, unless 3D printing advances to the point where it can be used to make large fabrications as a matter of course. However, additive techniques are already used to make the seals that fit around prototype doors, plus the grills on clay models.
 
Thus far, all the additively manufactured parts used by Envisage are sourced externally, but in line with the increasing uptake of the technology within automotive manufacturing, Coppin says that establishing an in-house capability is a target for 2015. At a strategic level, he believes that additive techniques can only become more pervasive and therefore represent a technology with which the company needs to become familiar. At a more immediate tactical level, the move will provide Envisage with the ability to turn out additive parts more quickly – “overnight” if necessary, he says.

Coppin predicts that colour-related information is another aspect of the company’s interaction with its clients that is likely to become increasingly digital in format. He explains that the accuracy of computerised rendering packages is now such that they are clearly superior to older alternatives such as PowerPoint presentations. Envisage already has this capability; it is really a question of waiting for some – though not all – of its clients to catch up.

Otherwise, a major development that Coppin expects to see over the next few years is a general upgrade within the automotive industry from the V5 to V6 Catia format. The significance of the latter is not so much that it possesses enhanced modelling capabilities as that it offers greatly increased support for simultaneous collaborative working between geographically dispersed partners – something with obvious benefits for companies like Envisage, which depend on effective communication of technical data for the viability of their business.

Consistency is key

Computerised design and development tools and methodologies are most effective when they are pervasive – in other words, when they make use of information generated in all parts of an organisation. At Austria-based powertrain engineering and testing company AVL List, this is the responsibility of Dirk Denger, head of synergetic methods and tool development. Denger’s remit covers the company’s three distinct but interlinked areas of activity: measurement and testing; powertrain engineering; and advanced simulation. He says the development of these globally dispersed activities involves “connecting knowledge from all over the world”.
 
According to Denger, the easiest part of his task is dealing with cultural differences, which are evident even in operations in countries as close as Germany and Austria. The more difficult part is the co-ordination of the highly varied “technical thinking” that must provide input to complex products which integrate multiple types of technologies. Denger says that a particular problem is overcoming communication difficulties across different specialisms: “The mechanical guy often cannot understand the software guy”.
 

The software tools used by AVL play a major role in mitigating this situation. Foremost is PTC Integrity Modeller – previously known as Artisan Studio – from Parametric Technology Corporation, which is described by Denger as a “model-based system engineering tool” that incorporates different relationships and requirements. In turn, this enables “objective communication” between different domain specialists. The underlying principle is that of ensuring “a single source of truth” for all engineering information within the company.
 
The information includes both design and product data. At AVL, these originate from the same supplier: the PTC Creo and PTC Windchill systems respectively. Surprisingly, Denger says this is not necessarily an advantage. He explains that there is a danger of a user becoming “locked in” to a particular supplier’s products in a way that might inhibit the adoption of new technologies in its own products. “You need stable base systems and we have a good relationship with PTC,” he states. “But you also need openness so that you can adapt new functionalities to enable you to come to a higher value.” In short: “The tools have to follow the process and not the process the tools,” says Denger.
 
These PTC products, together with the company’s SAP ERP system, comprise its set of ‘master’ software tools, which are fed by multiple specialist tools within AVL’s three broad application areas. The master tools are now being put to a relatively new use serving what Denger terms “requirements engineering”. He explains that the twin pressures of increasing product complexity and decreasing development timescales mean that “you have to work out what you need to do and what you have to look at” at the outset of a new project. This is, he observes, a “high-effort” but relatively “low-cost” exercise that produces a set of performance targets which can be fed into the Integrity Modeller system. The approach enables the development process to be supplied with information generated in other areas of the company. “When we have a development project in one domain we now know very fast what we have to consider from other developments,” he states.
 
Underpinning all this is one paramount requirement: consistency. This means ensuring that data generated in one application area “can be checked and used in another development step”, says Denger, describing the process as “intelligent communication”. The consequence, he states, is a “consistent tool architecture” that enables the company to exploit all the information sources available to it, and the ways in which they interact, and thus to provide customers with “solutions that are consistent from concept through to product”.