Engineering faces a labour shortage: Boston Consulting Group (BCG) projects that nearly one-third of engineering roles will remain unfilled each year through at least 2030. At the same time, a multitude of specialised engineering tools requires engineers to be ever more efficient, both in their work and in their use of the tools themselves. At the core of this challenge are CAx tools (CAD, CAE, CAM, etc.), where creating manufacturable and performant 3D designs remains highly manual and time-consuming, complicated by complex boundary representations and unintuitive interfaces that only experienced professionals can navigate effectively.
My last post explored the fundamentals of AI and its emergence in the engineering toolchain. This post builds on that foundation and dives deeper into emerging AI in design automation and simulation:
in Design Automation, AI accelerates workflows through generative design, text-to-CAD capabilities, automated reverse engineering, and improved tool interoperability;
in Simulation, AI is dramatically accelerating or even replacing traditional physics simulations, enabling faster validation and iteration cycles.
These AI-driven approaches directly address the talent shortage by streamlining processes, improving design quality, and reducing development time. This technological evolution responds to market pressures for faster innovation, lower costs, and enhanced competitiveness, with increasingly sophisticated AI systems now capable of handling complex engineering tasks across industries, fundamentally transforming the future of engineering design and development.
Note that for this article, I am listing my references in the small "link" icons below each segment to make it easier for people to read up on these topics more in depth.
Also note that this article is best consumed on a laptop or tablet; on mobile, the figures might be a little squeezed.
Last updated: 28th February 2025
The Challenge: Engineering faces a critical talent shortage with one-third of engineering roles projected to remain unfilled through 2030 (BCG). Simultaneously, market pressures demand faster innovation cycles at lower costs. Traditional CAx tools remain complex, time-consuming, and accessible only to specialists with years of training, creating significant barriers to productivity and innovation.
The AI Opportunity: AI-powered design automation and simulation technologies are transforming engineering workflows through three primary value drivers:
Dramatically accelerating development cycles
Machine learning surrogate models deliver near-instantaneous simulation results that previously required days of computation
Engineers can evaluate 120× more design configurations than conventional methods (Airbus case study)
Late-stage redesigns reduced by 63% through early-stage performance predictions (McKinsey, 2024)
Real-time validation brings simulation insights upstream to conceptual design phases
Democratising advanced engineering capabilities
Natural language and voice interfaces remove specialised tool barriers
AI assistants predict workflow steps and automate repetitive documentation and modeling tasks
Cross-domain knowledge integration bridges specialized silos between mechanical, electrical, and control engineering
Reduced reliance on scarce domain experts mitigates knowledge fluctuation challenges
Optimising business and product outcomes
Better conceptual design choices avoid the roughly fivefold (5×) lifetime-cost penalty of poor early decisions
Generative design achieves 6-20% part cost reduction across aerospace, automotive, and consumer products
Material optimisation simultaneously improves performance, weight, and sustainability metrics
Multi-objective optimisation balances competing requirements that traditional methods struggle to reconcile
Impact: AI-driven design automation is transforming diverse sectors including automotive, aerospace, energy, and civil engineering. These technologies directly address the engineering talent shortage while enabling organisations to achieve faster innovation cycles, lower development costs, and enhanced market competitiveness through higher-quality designs and optimised material usage.
In this chapter we will go a little deeper into the respective tools and concepts revolving around the automation of design and simulation, building on the last post. Let's start with the information exchange between the different teams and tools in the value chain of engineering design.
In recent years, industries ranging from automotive to aerospace, silicon chip manufacturing to energy systems, have faced increasing pressure to accelerate their product development cycles. This drive for efficiency stems from the need to remain competitive in rapidly evolving markets, meet changing consumer demands, and adapt to technological advancements.
The reduction in development time has been pursued through the introduction of various innovative systems and strategies. A significant milestone was the widespread adoption of Computer-Aided (CAx) systems and tools over the past few decades. These technologies have revolutionised design, analysis, and manufacturing processes across sectors.
The schematic on the upper right describes the information exchange in the form of communication (blue arrows) or data exchange (green arrows) between the different teams (white boxes) and their respective tools (orange boxes) in the engineering value chain. The tools are abbreviated as follows: Computer-Aided Design (CAD), Computer-Aided Engineering (or simulation: CAE), Computer-Aided Manufacturing (CAM), and finally Computer-Aided Testing (CAT).
The foundations of interoperability between software and hardware, in the context of production automation but also automation in general, were laid by Akerman, who introduced the ISA-95 automation hierarchy. Following ISA-95, approaches to interoperability for the sake of design automation can be derived. The image on the lower left shows a schematic depiction of an extended automation stack.
The engineering automation stack represents a comprehensive framework for integrated design and manufacturing systems. While traditionally visualised as a pyramid, the architecture functions as an interconnected network of services and protocols. Leaving out the field layer, which can be considered of secondary importance here, the Data Layer builds the foundation: it implements standardised formats such as .step and .3mf, establishing the fundamental data structures that enable interoperability between engineering systems.
Building upon this foundation, the API and Protocol Layer leverages industry-standard communication protocols like MQTT and REST to facilitate system integration. This layer ensures robust and standardised information exchange between various engineering tools and platforms.
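As a sketch of what travels across the API and Protocol Layer, the snippet below builds a hand-off message from the CAD team to the CAE team, carrying the metadata (material, revision, units) that plain geometry files tend to lose. The topic scheme and field names are my own illustration, not part of MQTT, REST, or any standard.

```python
import json

def build_handoff_message(part_id, step_file, material, revision):
    """Bundle a design artefact with metadata for transport over an
    MQTT topic or a REST endpoint (hypothetical schema)."""
    return {
        "topic": f"engineering/cad/{part_id}/released",
        "payload": {
            "part_id": part_id,
            "geometry": {"format": "step", "uri": step_file},
            "material": material,
            "revision": revision,
            "units": "mm",
        },
    }

msg = build_handoff_message("bracket-001", "s3://bucket/bracket.step",
                            "AlSi10Mg", 3)
wire = json.dumps(msg["payload"])   # serialised for transport
assert json.loads(wire)["material"] == "AlSi10Mg"
```

The point of such an envelope is that downstream tools receive design intent alongside the geometry, instead of reconstructing it manually after a STEP import.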
The Middleware Layer serves as a critical integration point, where platforms such as Synera and Grasshopper enable sophisticated engineering operations. This layer orchestrates complex workflows and manages the translation between different system requirements.
Above this, the Semantic Framework Layer, implementing standards such as MoSSEC, provides the semantic context necessary for consistent interpretation of engineering data across the enterprise.
Two cross-cutting components complete this architecture: Automation Scripts, typically implemented in Python or also using specialised platforms like Rhino.Inside, provide programmable control across all layers. The DevOps Infrastructure layer supports the entire stack, ensuring reliable deployment, maintenance, and scaling of automation systems.
This architecture represents a significant evolution from traditional linear hierarchies, acknowledging the need for direct communication between non-adjacent layers and bidirectional data flow. The framework enables organisations to implement robust automation strategies while maintaining system integrity and data consistency. Note that this is a high level overview and concepts like cloud native engineering platforms as well as security protocols etc. are not considered.
Engineering design optimisation has emerged as a transformative discipline that leverages computational power, mathematical rigour, and domain expertise to refine and innovate structural and mechanical systems. At its core, design optimisation seeks to balance competing objectives—such as minimising material usage, maximising structural integrity, and reducing costs—while adhering to physical, geometric, and operational constraints.
Three methodologies dominate this field: generative design, topology optimisation, and parametric optimisation. While each method operates under distinct principles, they share foundational connections in their reliance on algorithmic exploration, iterative refinement, and multidisciplinary integration. See an example of a standard topology optimisation problem in the image on the lower right (ref).
Design optimisation formalises engineering challenges into mathematical problems composed of variables (design parameters), objectives (quantities to maximise/minimise), and constraints (equalities/inequalities defining feasible solutions).
For instance, a structural beam’s optimisation might involve variables like cross-sectional dimensions, objectives like minimising weight, and constraints like maximum stress thresholds. These formulations enable systematic exploration of design spaces using gradient-based algorithms, heuristic methods, or hybrid approaches.
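The beam example can be written down directly as code: design variables, an objective, and a constraint expressed as plain Python callables. The numbers (load, length, density, allowable stress) and the simple cantilever bending model are illustrative, not taken from a specific case.

```python
# Design variables x = (width w, height h) of a rectangular cantilever, in mm.
RHO, L_BEAM, F, SIGMA_MAX = 2700e-9, 500.0, 1000.0, 150.0  # kg/mm^3, mm, N, MPa

def weight(x):                      # objective: minimise mass (kg)
    w, h = x
    return RHO * w * h * L_BEAM

def stress(x):                      # max bending stress at the root (MPa)
    w, h = x
    return 6 * F * L_BEAM / (w * h ** 2)

def is_feasible(x):                 # constraint: stress below the allowable
    return stress(x) <= SIGMA_MAX

candidate = (20.0, 40.0)            # one point in the design space
# stress(candidate) is 93.75 MPa, so this design is feasible at 1.08 kg.
```

Any optimisation algorithm, gradient-based or heuristic, then just needs these three ingredients to explore the design space systematically.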
Modern optimisation relies on computational tools such as Finite Element Analysis (FEA), Computational Fluid Dynamics (CFD) for simulating physical behaviours, and parametric modelling for geometric manipulation. These tools enable rapid evaluation of design alternatives, bridging theoretical models with practical engineering outcomes.
The design optimisation process can be represented using a flow diagram, as shown in the figure below. As mentioned before, design optimisation requires a formal formulation of the optimisation problem that includes the design variables that are to be changed, the objective to be minimised, and the constraints that need to be satisfied. The evaluation of the design is strictly based on numerical values for the objective and constraints.
When a rigorous optimisation algorithm is used, the decision to finalise the design is made only when the current design satisfies the optimality conditions that ensure no other similar design is better. The design changes are made automatically by the optimisation algorithm and do not require intervention from the designer. Even though the optimisation runs completely automatically, human intervention and expertise are still needed to guide the optimisation and evaluate results.
The flowchart illustrates a few key principles that are distinctly similar to the concept of differential engineering defined by Blake Courter. First, there needs to be a block defining the design Jacobian; here, the definition of the optimisation problem serves that purpose. Once the problem is defined, a shape needs to be generated, which happens in the initial-design and design-variable-update blocks. Finally, an evaluation of fitness is performed, indicated by the diamond blocks.
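The loop those blocks describe can be sketched in a few lines: initialise a design, evaluate it, update the design variable, and stop when the constraint becomes active. A single variable (beam height) and a naive shrink-until-infeasible update stand in for a real optimiser; all numbers are illustrative.

```python
# initial design -> evaluate -> update design variables -> converge
F, L, W, SIGMA_MAX = 1000.0, 500.0, 20.0, 150.0   # N, mm, mm, MPa

def stress(h):                       # "evaluate" block: bending stress (MPa)
    return 6 * F * L / (W * h ** 2)

h = 100.0                            # "initial design" block
for _ in range(200):                 # "update design variables" block
    if stress(h) > SIGMA_MAX:        # fitness check (diamond block)
        break                        # constraint violated: stop shrinking
    h *= 0.99                        # reduce height to save weight
h /= 0.99                            # step back to the last feasible design

assert stress(h) <= SIGMA_MAX        # converged design satisfies the constraint
```

A real algorithm replaces the fixed 1% shrink with gradient or heuristic updates, but the structure of the loop is exactly the one in the flowchart.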
Topology Optimisation (TO), driven in large part by Ole Sigmund, iteratively redistributes material within a predefined design space to achieve optimal performance under specified loads and constraints. Unlike shape or size optimisation, TO modifies the topology—the connectivity and arrangement of material—yielding organic, often non-intuitive geometries. A widely used example is the Solid Isotropic Material with Penalisation (SIMP) method, which relaxes binary material densities (0 for void, 1 for solid) into continuous variables that are penalised to converge toward 0/1 solutions. See an example of topology optimisation in the image on the upper right in this section.
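The SIMP penalisation itself is a one-liner; the sketch below shows the standard interpolation (with the usual small lower bound on stiffness to keep the finite element system well-posed):

```python
def simp_stiffness(rho, p=3.0, e0=1.0, e_min=1e-9):
    """Young's modulus as a function of the relaxed density rho in [0, 1].
    The penalisation exponent p > 1 makes intermediate densities
    inefficient, driving the optimiser toward crisp 0/1 layouts."""
    return e_min + rho ** p * (e0 - e_min)

# With p = 3, rho = 0.5 contributes 50% of the mass
# but only about 12.5% of the stiffness:
print(simp_stiffness(0.5))   # ~0.125
```

Because half-dense material buys so little stiffness per unit mass, the converged design naturally avoids grey regions, which is what makes the relaxed problem yield manufacturable solid/void layouts.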
Generative Design (GD) goes beyond traditional optimisation by algorithmically exploring vast design spaces defined by objectives, constraints, and performance metrics. Unlike TO’s focus on material efficiency, GD incorporates multi-objective optimisation, balancing factors like manufacturability, aesthetics, and sustainability.
Parametric optimisation refines predefined design concepts by adjusting continuous or discrete variables (e.g., beam thickness, fin spacing). Unlike TO or GD, it assumes a fixed topology, focusing on local improvements within a constrained solution space. The image below shows the steps of a parametric design workflow light-weighting a simple bracket with beam lattices. This workflow was developed together with my team at Hyperganic. In it, a latticed bracket experiences linear elastic static tension from the backside and is fixed to the floor. The objective is to keep the weight as low as possible while still sustaining the load (the maximum allowable stress is 150 MPa for the chosen aluminium material).
Shoutout to Luis for his Bracket.
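The parametric part of such a workflow boils down to a sweep: vary the lattice beam thickness, evaluate each candidate against the 150 MPa allowable, and keep the lightest feasible one. The stress model below is a crude F/A stand-in for the real linear elastic simulation, and the strut count and load are invented for illustration.

```python
ALLOWABLE_MPA = 150.0
FORCE_N = 2000.0

def peak_stress(thickness_mm):          # placeholder for the FEA result
    area = 8 * thickness_mm ** 2        # 8 load-bearing struts (assumed)
    return FORCE_N / area

def weight(thickness_mm):               # relative weight metric
    return thickness_mm ** 2

# Sweep thicknesses from 0.5 mm to 3.0 mm, keep feasible designs,
# pick the lightest one that still sustains the load.
feasible = [t / 10 for t in range(5, 31)
            if peak_stress(t / 10) <= ALLOWABLE_MPA]
best = min(feasible, key=weight)
```

In the real workflow each `peak_stress` call is a full simulation, which is exactly why the surrogate models discussed later in this post matter: they make sweeps like this cheap.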
Let's start by analysing the file exchange formats and the limitations that occur when communicating files and information between entities. This section builds on the explanations of geometry representations in my last post about AI in engineering.
Partial Standardisation: While STEP (ISO 10303) is a universal CAD exchange format, inconsistencies persist in how tools handle metadata (e.g., tolerances, parametric features). Critical design intent or material properties are often lost during conversions between CAD tools (e.g., CATIA to SolidWorks).
Simulation Incompatibility: STEP files lack native support for simulation-specific data like mesh settings or boundary conditions, requiring manual rework for CAE workflows.
Resolution vs. File Size: Voxel-based representations (common in medical imaging or additive manufacturing) produce massive datasets, complicating storage and transfer. As an example: to avoid stair-stepping (aliasing) in voxel models, the resolution needs to be at least two voxels per minimum feature size, in accordance with the Nyquist-Shannon sampling theorem. Furthermore, with insufficient compression, the number of voxels, and thus the memory requirement, scales cubically with resolution.
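A quick back-of-the-envelope check makes the cubic scaling concrete. Assuming two voxels per minimum feature size and one byte per voxel (no compression), both assumptions being simplifications:

```python
def voxel_budget(bbox_mm, min_feature_mm):
    """Voxel count and approx. memory for a cubic bounding box,
    at two voxels per minimum feature size (Nyquist-style rule)."""
    voxel_mm = min_feature_mm / 2
    n_per_axis = round(bbox_mm / voxel_mm)
    n_voxels = n_per_axis ** 3
    return n_voxels, n_voxels / 1e9        # count, approx. GB at 1 B/voxel

# A 100 mm cube with 0.1 mm features already needs 8 billion voxels (~8 GB);
# halving the feature size multiplies that by eight.
count, gb = voxel_budget(100.0, 0.1)
```

This is why voxel pipelines lean so heavily on sparse data structures and compression: every halving of the feature size costs an 8× memory multiplier.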
Conversion Loss: Translating voxel data to mesh formats (e.g., STL) risks losing detail that is critical for simulations like fluid flow or stress analysis. The advent of Immersed Boundary Methods (IBM) for both structural and fluid simulation partly circumvents the need to convert voxels to meshes, since the underlying voxel representation can directly be used as a discretised "simulation mesh" (see the figure on the lower right, which compares IBM with tetrahedral meshes for simulation; ref: Hyperganic).
Implicits: Signed Distance Fields (SDFs) (e.g., .implicit) offer a wide range of modelling capabilities but are hardly supported and can in most cases only be transferred as static representations of the original design. Evaluating SDFs is also compute-heavy on both CPU and GPU.
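To make the implicit idea tangible: an SDF returns a negative distance inside the body, positive outside, and zero on the surface, and Boolean operations reduce to min/max on those values. A minimal sketch:

```python
import math

def sphere_sdf(p, centre, r):
    """Signed distance from point p to a sphere surface."""
    return math.dist(p, centre) - r

def union(d1, d2):          # CSG union of two implicit bodies
    return min(d1, d2)

# Two overlapping spheres; the origin lies inside the combined body.
d = union(sphere_sdf((0, 0, 0), (0, 0, 0), 1.0),
          sphere_sdf((0, 0, 0), (1.5, 0, 0), 1.0))
assert d < 0                # negative distance: point is inside
```

The flexibility comes at the cost noted above: rendering or exporting the shape means evaluating this function at millions of sample points, which is why SDF evaluation is so compute-heavy.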
Approximation Errors: STL files approximate surfaces with triangles, introducing "stair-stepping" artifacts on curved geometries, degrading simulation accuracy. Also here the resolution of the triangulation of the original object has a significant impact on the quality of geometry representation and memory requirements of the file.
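The trade-off between tessellation resolution and accuracy can be estimated with plain geometry: approximating a circle of radius r with n straight segments leaves a maximum (chordal) deviation of r·(1 − cos(π/n)). The sketch below, using no STL library, shows how many segments a given tolerance costs.

```python
import math

def chord_error(r, n):
    """Max deviation between a circle of radius r and its n-segment
    polygonal approximation (the sagitta of each chord)."""
    return r * (1 - math.cos(math.pi / n))

def segments_for_tolerance(r, tol):
    """Smallest segment count keeping the chordal error below tol."""
    return math.ceil(math.pi / math.acos(1 - tol / r))

# A 10 mm radius at 0.01 mm tolerance needs ~70 segments per circle;
# halving the tolerance multiplies the count by roughly sqrt(2).
n = segments_for_tolerance(10.0, 0.01)
```

The square-root relationship between tolerance and segment count is why "just export it finer" quickly produces the huge STL files mentioned above.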
Metadata Absence: STLs lack native support for material properties, colours, or simulation parameters, forcing engineers to rebuild metadata post-conversion. The .3mf file exchange format aims to alleviate this barrier.
Simulation Mesh Incompatibility: Simulation tools (e.g., ANSYS, COMSOL) require structured meshes (e.g., hexahedral), but most CAD exports produce unstructured meshes (e.g., tetrahedral), requiring costly re-meshing.
Resolution vs. File Size: Point cloud-based representations (common in scanning for reverse engineering and testing) produce massive datasets, complicating storage and transfer.
Unstructured Data: Lack of topology complicates segmentation/classification; algorithms struggle with noise/occlusions.
The field of engineering design optimisation faces significant challenges in computational complexity, resource inefficiency, and scalability, particularly as systems grow more multidisciplinary and high-dimensional. The image below shows a sketch of a parametric optimisation with the parameters on the left and an iteration path down the slope of a cost/objective function (ref).
Dimensionality: High-dimensional design spaces involving thousands of tune-able parameters create immense computational challenges. The "curse of dimensionality" renders exhaustive searches impractical as each additional parameter exponentially multiplies the search space. Traditional gradient-based methods falter when faced with these expansive parameter landscapes, making it increasingly difficult to find globally optimal solutions.
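The exponential growth behind the "curse of dimensionality" takes one line to demonstrate: a full-factorial sweep with even a coarse 10 samples per parameter explodes as parameters are added.

```python
def grid_points(samples_per_param, n_params):
    """Evaluations needed for an exhaustive full-factorial sweep."""
    return samples_per_param ** n_params

# 3 parameters: a thousand runs. 12 parameters: a trillion.
for n in (3, 6, 12):
    print(n, grid_points(10, n))
```

At hours per simulation, anything beyond a handful of parameters rules out exhaustive search outright, which is what pushes the field toward gradients, heuristics, and the surrogate models discussed later.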
Fidelity: Engineers constantly navigate the tension between model accuracy and computational feasibility. Nonlinear relationships between design variables and objectives require sophisticated modelling approaches that capture true system behaviour. Simplified physics models may make optimisation tractable but risk overlooking critical phenomena.
Scale: As design problems grow in scale, optimisation algorithms struggle to maintain effectiveness. Local optimisers frequently converge to sub-optimal solutions when navigating vast design spaces. Even advanced techniques like pattern search face significant challenges with the discrete-continuous hybrid spaces commonly encountered in material selection and topology optimisation problems, further complicating the search for optimal designs.
Compute Time: The iterative nature of engineering optimisation creates substantial time burdens for design teams. Conventional simulation approaches such as finite element analysis (FEA) or computational fluid dynamics (CFD) typically demand hours or days per individual simulation. This leads to "long design lead times and untapped optimisation potential" when using classical methods, as engineers cannot feasibly explore the full design space.
Resource Consumption: Beyond just time, the resource demands of optimisation workflows create practical constraints on engineering teams. High-fidelity simulations and brute-force parameter sweeps consume substantial power and hardware resources, driving up operational costs. Cloud-based optimisation workflows, while offering scalability, incur escalating financial costs due to prolonged compute instances.
Complex Requirements: Modern engineering systems demand integration across multiple disciplines, creating layers of interacting requirements that must be simultaneously satisfied. Systems like aircraft require coordinated optimisation across mechanical, electrical, and control subsystems, where siloed approaches fail to capture critical cross-domain interactions. Engineers must constantly navigate difficult trade-offs between model accuracy, computational feasibility, and design performance, often without clear guidance on which compromises will yield the best overall results.
Engineers are required to innovate faster while reducing cost to keep their enterprise as competitive as possible. Projects are typically bound by three factors: scope, time, and money. Adjusting the weight of any one of these inevitably touches the underlying engineering processes.
Engineering design workflows face several critical challenges that impact efficiency, innovation, and project outcomes. These pain points can be categorised into three primary buckets: process bottlenecks, technical barriers, and knowledge management challenges.
Fluctuation: Organisations struggle with the capture and preservation of institutional knowledge, as valuable expertise and best practices often remain undocumented or are lost with departing personnel.
Reporting: The interpretation of technical metrics, and creation of appropriate visualisations and reports or educational materials for different audiences can delay critical decision-making processes.
Late Physics and Manufacturing Validation: Late-stage problem detection often necessitates significant backtracking, extending project timelines and budgets.
Inefficient Conceptualisation: Conceptualisation relies mostly on experienced staff or requires many (high-fidelity) simulations to validate ideas. Besides the subject matter experience of the engineer, CAx tool handling can be challenging.
Physical prototyping: While necessary for validating factors like weight, strength, and safety, it further intensifies resource demands.
Multi-domain expertise: The requirement for expertise across multiple domains - from CAD and simulation software to numerical methods, combined with numerous concurrent responsibilities, can become challenging.
Integration challenges between different software tools: Engineers must navigate diverse tool ecosystems with proprietary formats and interfaces, leading to data exchange issues and workflow inefficiencies.
As already illustrated in the previous post about AI in Engineering (design), AI primarily emerges in a few key areas: Design Automation and Simulation as well as Knowledge Management and Integration. It is no surprise that AI emerges in these areas, as bad conceptual design choices on average end up with roughly five times (5×) the lifetime cost of better ones. In this post we will therefore dive deeper into some methods that bring better decision-making power to mechanical engineers.
AI-driven Design Automation accelerates the design process through techniques like generative design, text-to-CAD, reverse engineering, and increased interoperability. These tools streamline workflows, improve design quality, and reduce development time. Additionally, AI is transforming Simulation by accelerating or replacing physics simulations, allowing for faster design validation and iteration.
These AI tools offer fundamental benefits to engineers, directly corresponding to the challenges identified by BCG:
Democratisation of CAx tools (better tool ergonomics)
by Text to CAx or Automated CAx features
by automation of repetitive tasks
by supporting the engineer and thus removing sources of errors
Enabling better decisions faster (accelerating core workflows):
by bringing simulation knowledge and data upstream to the conceptualisation phase
by saving materials and product lifetime costs with generative design
CAx tools typically require years of training to be used effectively. Democratisation broadens participation by simplifying tool interactions. Natural language (NL) interfaces are transforming CAD and CAE tools by streamlining workflows, enhancing accessibility, and reducing the learning curve for users. These interfaces leverage advancements in natural language processing (NLP) and machine learning to create more intuitive interactions. Thus, entry-level tool users, or engineers used to other CAx tools, will find the transition easier and more intuitive.
NL interfaces allow users to perform complex operations through conversational language or voice input, bypassing traditional menu navigation. For example, tools like CADgpt enable 3D modelling in Rhino3D via natural language, see the example image on the right.
Controlled natural languages (CNLs) allow users to define design constraints and objectives in plain text, which are then translated into formal optimisation models. This eliminates the need to learn specialised modelling languages, streamlining problem-solving in mechanical design.
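The CNL idea can be illustrated with a toy translator: a rigid sentence pattern is parsed into a constraint record an optimiser could consume. Real CNL systems use proper grammars; the regex, sentence pattern, and field names here are my own illustration.

```python
import re

# One hypothetical sentence pattern: "<quantity> must (be) below/above <value> <unit>"
PATTERN = re.compile(
    r"(?P<quantity>\w+) must (?:be )?(?P<op>below|above) "
    r"(?P<value>[\d.]+)\s*(?P<unit>\w+)"
)

def parse_constraint(sentence):
    """Translate a controlled-natural-language sentence into a
    formal constraint dictionary."""
    m = PATTERN.search(sentence.lower())
    if not m:
        raise ValueError(f"unrecognised constraint: {sentence!r}")
    op = "<=" if m["op"] == "below" else ">="
    return {"quantity": m["quantity"], "op": op,
            "value": float(m["value"]), "unit": m["unit"]}

c = parse_constraint("Stress must be below 150 MPa")
# -> {'quantity': 'stress', 'op': '<=', 'value': 150.0, 'unit': 'mpa'}
```

The output is exactly the kind of variables/objective/constraints record shown in the design optimisation section, which is what lets plain-text requirements feed a formal optimisation model.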
Speech recognition in NL interfaces makes CAD/CAE tools more accessible to users with motor impairments or disabilities. By translating voice commands into actions, these systems democratise access to advanced design workflows.
NL interfaces automate tasks like documentation analysis, extracting building codes or material specifications from text. They also enable direct model modifications via commands (e.g., “Make this bracket aluminum”), reducing time spent on manual adjustments.
AI-driven systems analyse user behaviour to predict next steps, automate repetitive tasks, and suggest context-aware tools. For instance, predictive interfaces can adjust workflows based on past actions, reducing manual input and minimising interruptions. See an example of Autodesk's natural-language-based AutoCAD assistant on the lower right.
CAD packages can directly infer the engineering drawing from the generated 3D constructive solid geometry (CSG) model, or the other way round. For example, Autodesk has implemented an automated function to derive CAD models from drawings using zone graphs.
The integration of artificial intelligence into mechanical engineering workflows represents a paradigm shift in how industries approach product development. By enabling faster, more informed decisions during critical early-stage design phases, AI-driven tools fundamentally alter cost structures and performance outcomes.
Traditional engineering processes often delay critical physics and manufacturing validation until late stages, leading to costly redesigns and extended timelines. See for example the results of a comparison between the ground truth simulation field (1st row) and the predictions of a simulation surrogate model (2nd row) and the error between the two (3rd row) in the image below (ref).
Democratising simulation expertise: AI assistants can guide less experienced engineers through complex simulation setups, reducing reliance on scarce domain experts and addressing the knowledge fluctuation pain point.
Predictive insights during conceptualisation: By training on historical simulation data, AI can provide immediate feedback on design viability before formal simulation, helping engineers avoid designs likely to fail downstream validation. A 2024 McKinsey study found this approach reduces late-stage redesigns by 63% in aerospace manufacturing compared to conventional methods.
Multi-domain knowledge integration: AI systems bridge knowledge gaps across specialised domains, serving as translators between mechanical, electrical, and control engineering considerations during early design phases.
Real-time performance predictions: Engineers can receive instantaneous performance estimates for conceptual designs, making refinements earlier when changes are less costly.
The computational complexity of design optimisation—with high-dimensional parameter spaces, fidelity requirements, and scale challenges—has traditionally limited exploration of design possibilities. See three iterations of NASA's excite bracket designed by Generative Design Methods in the image below.
Efficient design space exploration: Airbus reports using AI-powered generative systems to evaluate 120× more winglet configurations during preliminary design phases compared to pre-AI workflows, discovering novel solutions that human engineers might overlook.
Surrogate modelling for rapid iteration: Machine learning creates accurate surrogate models that approximate high-fidelity simulations at a fraction of the computational cost, enabling thousands of design iterations in the time previously required for a single analysis.
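A stripped-down illustration of the surrogate idea: run the expensive simulation offline on a few samples, then answer design queries by interpolation at negligible cost. Real surrogates use ML regressors trained on high-fidelity data; a lookup table with linear interpolation, and the invented response function, stand in for them here.

```python
import bisect

def expensive_simulation(x):       # placeholder for hours of FEA/CFD
    return x ** 2 - 4 * x + 7      # imagined cost/stress response

samples = [i / 2 for i in range(0, 13)]                 # 0.0 .. 6.0
responses = [expensive_simulation(x) for x in samples]  # offline, once

def surrogate(x):
    """Online query: O(log n) lookup plus linear interpolation."""
    i = bisect.bisect_left(samples, x)
    i = min(max(i, 1), len(samples) - 1)
    x0, x1 = samples[i - 1], samples[i]
    t = (x - x0) / (x1 - x0)
    return responses[i - 1] * (1 - t) + responses[i] * t

# Thousands of surrogate calls now cost less than one real simulation,
# e.g. for a quick minimum search over the design variable.
best = min(samples, key=surrogate)
```

The design choice is the classic offline/online split: all the expensive physics happens up front, and the iteration loop afterwards touches only the cheap approximation.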
Material optimisation: AI identifies optimal material selections and distributions, reducing weight while maintaining structural integrity—particularly valuable in aerospace, automotive, and consumer products. McKinsey analysis shows 6-20% part cost reductions across these industries through such optimisations.
Multi-objective optimisation: Rather than simple trade-offs, AI simultaneously balances competing requirements for performance, manufacturability, cost, and sustainability, delivering holistic designs that traditional methods would struggle to achieve.
Automating engineering design with advanced techniques like surrogate models, generative design, or parametric design automation requires a high level of application-specific understanding, since every application experiences different physical phenomena. Filtering through openly available sources, some key application areas can be summarised in a table. Advanced design techniques find usage in many industries, ranging from automotive over energy all the way to civil engineering. The following table gives a concise, high-level overview of the application areas of AI in engineering design automation.
Find a short description of the acronyms used here:
And the used references to the table above in the spreadsheet below:
In an endless pursuit of understanding and learning, this is just one more step. I hope this article can help you to gain more understanding about the role of AI in engineering design automation and how to best apply these transformative technologies.
Feel free to connect on LinkedIn and give feedback.