How to Test and Maintain PLM in Aerospace and Defense
Key takeaways:
- A product lifecycle management (PLM) system in the aerospace and defense industry is the single source of truth to facilitate consistent decision-making in product design, manufacturing, operations, and business.
- A PLM is complex in itself and also integrates with several equally complex enterprise systems.
- Keeping the PLM functional, reliable, and available poses several challenges that can be solved using non-invasive test automation.
A PLM is a critical system in every aerospace and defense (A&D) company. It ensures that all departments see the same consistent information about a product, updated in real time.
In this blog, learn how PLM systems are used in aerospace and defense, the challenges of testing them, and how Keysight Eggplant addresses those challenges.
How are PLM systems used in aerospace and defense?
Figure 1. PLM stores design, manufacturing, and operational data about a product.
A PLM is the enterprise-wide single source of truth about the engineering, design, manufacturing, and operational aspects of a product. As aircraft, satellites, and defense systems become more sophisticated — with complex software, electronics, and mechanical subsystems — an interconnected approach to development is essential. PLM systems act as the backbone for managing all such product-related data and processes.
PLMs have become critical in aerospace and defense due to:
- complexity of A&D systems
- pressure from the market, industry, and government to adopt digital transformation and artificial intelligence (AI) in aerospace and defense manufacturing
- lengthy product lifecycles spanning decades
- stringent safety requirements
- demanding manufacturing and quality standards
- strict regulatory compliance
- tight information security restrictions
Let's understand these challenges in more depth.
Challenges in A&D PLM systems validation
The technical and workflow challenges faced by A&D customers, system integrators, and solution providers when testing PLM systems are outlined below.
Issues in complex PLM integrations
Figure 2. PLM integrates with all other enterprise systems.
In aerospace and defense companies, PLM — as the enterprise-wide single source of truth — receives data from all the other enterprise systems like:
- computer-aided design (CAD)
- computer-aided engineering (CAE)
- product data management (PDM)
- manufacturing execution systems (MES)
- customer relationship management (CRM)
- supply chain management (SCM)
- enterprise resource planning (ERP)
The PLM must ensure data consistency across all systems to be a genuine single source of truth at all times. That requires continuously testing that the PLM is properly interworking with other systems, through workflows like these:
- CAD-to-PLM data validation: Any changes in CAD/CAE environments must always be accurately and consistently synchronized with the PLM's product definitions.
- Bill-of-materials (BOM) synchronization testing: The engineering BOM (in CAD/CAE), manufacturing BOM (in MES and ERP), and sales BOM (in ERP) must be consistent at all times.
- User interface (UI) testing: Cloud-based PLM and integrated platforms can suddenly change UI hierarchies, controls, and visual attributes, which makes test scripts brittle.
- Bug triaging: In complex multi-system workflows, pinpointing the origin of data inconsistencies or corruption can be very challenging.
- Multi-system environment setup: Setting up, configuring, and maintaining accurate and consistent staging environments that mimic production for multi-system integrations is a significant logistical challenge.
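To make the first two workflows above concrete, here is a minimal sketch of the kind of comparison a BOM synchronization test performs. The part numbers, quantities, and comparison logic are all hypothetical and are not tied to any PLM vendor's API; a real test would pull both BOMs from the live systems.

```python
# Minimal sketch of BOM synchronization testing: compare an engineering BOM
# (from CAD/PLM) against a manufacturing BOM (from MES/ERP).
# All part numbers and quantities are hypothetical.

def bom_diff(ebom: dict[str, int], mbom: dict[str, int]) -> list[str]:
    """Return human-readable discrepancies between two BOMs."""
    issues = []
    for part, qty in ebom.items():
        if part not in mbom:
            issues.append(f"{part}: missing from manufacturing BOM")
        elif mbom[part] != qty:
            issues.append(f"{part}: quantity mismatch (EBOM {qty} vs MBOM {mbom[part]})")
    for part in mbom:
        if part not in ebom:
            issues.append(f"{part}: present in manufacturing BOM only")
    return issues

ebom = {"PN-1001": 4, "PN-1002": 1, "PN-1003": 2}
mbom = {"PN-1001": 4, "PN-1002": 2, "PN-1004": 1}
issues = bom_diff(ebom, mbom)  # one mismatch, one missing, one extra part
```

An automated suite would run this comparison after every CAD or ERP change and fail the build on any non-empty diff.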
Hurdles in maintaining customizations
Typically, PLM platforms and partner systems like CAD and ERP are heavily customized to fit each customer's specific business processes. Every modification or upgrade of the backend or graphical user interface (GUI) of any system needs rigorous validation to ensure it works as expected without breaking existing functionality or integrations.
But testing dynamic UI modifications is challenging, as traditional tools struggle with changing UI structures. And backend testing often requires specialized validation for 3D models, domain-specific file formats, spreadsheets, and log files.
Complexities of model-based engineering
Model-based systems engineering (MBSE) and systems modeling language (SysML) provide structured, model-centric approaches to define, analyze, verify, and validate system requirements and architectures of complex products.
A PLM system typically manages the test cases and procedures derived from SysML requirements within the MBSE model. PLM also stores the resulting test data to provide traceable proof that the physical system meets its modeled requirements. Thorough requirements traceability testing is needed for all changes.
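As an illustration of what requirements traceability testing checks, the sketch below flags any requirement with no linked test case and any linked test that is not passing. The requirement and test-case IDs are hypothetical; a real check would query the PLM's traceability records.

```python
# Sketch of a requirements traceability check: every SysML-derived requirement
# must map to at least one test case with a passing result.
# All IDs and statuses are hypothetical.

requirements = {"REQ-001", "REQ-002", "REQ-003"}
test_results = {
    "TC-01": {"covers": "REQ-001", "status": "pass"},
    "TC-02": {"covers": "REQ-002", "status": "fail"},
}

covered = {tc["covers"] for tc in test_results.values()}
untraced = requirements - covered  # requirements with no test at all
failing = [tc_id for tc_id, tc in test_results.items() if tc["status"] != "pass"]
```

Here `untraced` would contain REQ-003 and `failing` would contain TC-02, both of which break the traceable proof chain and should block a change.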
Problems from fragmented testing toolchains
Test scripts developed with fragmented tools become brittle, requiring more script maintenance effort because they can't seamlessly adapt to UI changes or version upgrades. A Keysight survey on manual PLM testing showed that 80% of teams faced challenges managing and maintaining test cases.
Siloed testing teams are common, leading to inefficiencies like:
- duplication of test script production across development and production environments
- replicated work between disconnected teams
- lost knowledge, because disconnected teams don't share the issues they identify
Challenges in complying with regulations and standards
PLM software must comply with various regulations and standards at all times and under all conditions. Such verification itself must also comply with those regulations and testing standards.
The International Traffic in Arms Regulations (ITAR) and Export Administration Regulations (EAR) control the export of defense-related articles and technical data. ITAR/EAR-compliant PLM testing is essential to ensure that a PLM system that handles such data has proper access rights and data masking to prevent unauthorized access and export.
Industry standards like DO-178C for airborne software and DO-254 for airborne electronic hardware are mandatory. Based on the potential consequences of a system failure, a design assurance level (DAL) is assigned from A (failure is catastrophic) to E (no safety effect). Manual DO-254 verification for a level A or B design can take up to 12 months. The PLM system maintains these test plans, data, traceability, compliance documentation, and audit readiness. These capabilities must be thoroughly verified.
Heterogeneous environments and scalability
Testing is inherently difficult because PLM integrations include diverse deployment environments (like on-premises, cloud, client-server, mainframe, and embedded) and interface types (like thick clients, web applications, mobile apps, APIs, databases, and file systems).
PLM scalability testing is crucial to ensure responsiveness and availability across diverse locations, environments, and interfaces.
Challenges in securing A&D environments
PLM software testing must ensure that the platform complies with various Department of Defense (DoD) security standards and DevSecOps best practices.
To prove the ongoing security of a system, the DoD is moving from a single manual authority to operate (ATO) process to a continuous, automated ATO (C-ATO). Automated testing and continuous monitoring are essential.
PLM software must also follow the Cybersecurity Maturity Model Certification (CMMC) framework to protect federal contract information (FCI) and controlled unclassified information (CUI). CUI/FCI protection testing verifies that the necessary security controls are in place.
Cloud PLM solutions that store data in the cloud must comply with the Federal Risk and Authorization Management Program (FedRAMP), the standardized security assessment for cloud services. Third-party audit and continuous monitoring must be part of the cloud PLM security testing strategy.
Inefficiencies of manual testing
Manual PLM testing is time-consuming, expensive, inefficient, labor-intensive, and not scalable. The Keysight survey on manual testing came up with these alarming findings:
- 92% of teams had postponed or canceled releases because manual testing was not completed on time.
- 83% of all testing effort was spent on regression testing. For each full regression round, 66% of teams took at least one full week and 34% took two or more weeks.
- 54% of teams could only handle up to two integrated systems in their manual testing scope.
- A bug discovered after a product's release costs 100x more to fix than if found during planning.
- 88% of teams struggled with end-to-end testing, 87% with defect detection, and 84% with GUI complexity. This leads to tester burnout, high turnover, and a perpetual crisis mode.
Why test automation is essential in PLM environments
Figure 3. Eggplant testing two systems through a secure remote connection.
For aerospace and defense PLM testing, test automation is critical due to the extreme product complexity, decades-long lifecycles, stringent regulatory demands, and the dire consequences of failure. We outline some of the key reasons for PLM test automation below.
Conduct continuous regression testing
PLM systems are constantly evolving with new features, bug fixes, and security patches. Each change can inadvertently impact existing functionality. Automated PLM regression testing is the only way for rapid execution of a comprehensive test suite after every code change.
Handle PLM changes and upgrades
PLM solutions are always heavily customized and configured to suit the customer's processes. Manual re-testing after every configuration tweak or upgrade is impractical and expensive. Teams can accelerate PLM upgrades with automation and verify the customizations.
Implement continuous delivery
As part of its digital transformation, the aerospace and defense industry is adopting continuous integration/continuous delivery (CI/CD) in its development processes for faster time-to-market. CI/CD depends on fast, repeatable automated testing at every pipeline stage; manual testing cannot keep pace with it.
Reduce human errors
In A&D, precision and accuracy are paramount. Human errors can have catastrophic consequences, from safety hazards to financial losses.
Automating PLM's engineering change management (ECM) functions directly guards against human failures like communication gaps and complacency. Automated testing of the ECM workflows, in turn, keeps them streamlined and reliable.
Comply with stringent standards and regulations
Test automation is the only scalable way to ensure that PLM software meets the stringent safety, security, and regulatory requirements of the A&D industry as outlined below:
- Maintain audit trails: A PLM system is a centralized, auditable source of truth for documentation and certifications. Test automation ensures that these critical features are always working correctly and reliably.
- Ensure 100% test coverage: Test automation is the only way to achieve end-to-end test coverage of PLM systems and their integrations, as expected by regulatory requirements. For example, under DO-178C, DAL A requires 100% modified condition/decision coverage (MC/DC).
- Accelerate certification processes: Automated testing accelerates the generation of test data, execution of test cases, and the creation of comprehensive compliance documentation, reducing manual errors and ensuring audit readiness.
How Keysight Eggplant solves PLM testing challenges
Figure 4. Keysight Eggplant modeling.
Keysight Eggplant enables semantic understanding of PLM software workflows. It uses AI-powered computer vision to interact with any software's graphical elements, images, and text like a human user. This enables non-invasive GUI testing of certified systems.
This is unlike traditional tools that rely on low-level rule-based matching against the UI hierarchy, element type, caption, or screen location. Such tests usually look like this: "If the green button with the caption Export Data under the Tools <div> is clicked, check something." If these low-level details change even slightly during an update, the test becomes invalid.
In contrast, in Eggplant's semantic understanding, the tests are expressed using higher-level constructs, like: "If export data action is initiated, check something." The action may be implemented by a button, a menu item, a toolbar, or an application programming interface (API), but the test remains valid and reusable.
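The difference can be sketched in a few lines of Python. This is an illustrative abstraction, not Eggplant's actual scripting language: the test calls a semantic action name, and a single mapping resolves it to whatever UI element currently implements it, so UI changes are absorbed in one place instead of breaking every script.

```python
# Illustrative sketch of semantic test abstraction (hypothetical names, not a
# real Eggplant API). A brittle locator-based test would hard-code UI details;
# a semantic test names the action and delegates the details to a resolver.

ACTION_MAP = {
    # Updated in one place when the UI changes (button -> menu item -> API call).
    "export_data": lambda ui: ui.click_image("export_icon.png"),
}

def perform(action: str, ui) -> None:
    """Execute a semantic action against whatever UI driver is supplied."""
    if action not in ACTION_MAP:
        raise KeyError(f"No implementation registered for action '{action}'")
    ACTION_MAP[action](ui)

class FakeUI:
    """Stand-in for a real GUI driver, used here only to demonstrate the idea."""
    def __init__(self):
        self.calls = []
    def click_image(self, name):
        self.calls.append(name)

ui = FakeUI()
perform("export_data", ui)  # the test script never mentions buttons or captions
```

When the export button becomes a menu item, only the `ACTION_MAP` entry changes; every test that says `perform("export_data", ...)` stays valid.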
Let's look at some of the key capabilities of Eggplant for PLM testing.
Support for all PLM platforms
Figure 5. PLM-CAD integration testing using Eggplant.
Eggplant supports all PLM platforms and their integrations (CAD, MES, ERP, and more) at the UI layer without needing any access to the source code or API layers. It can test all popular PLM platforms like Teamcenter, Windchill, and Aras Innovator.
AI-powered test creation from models and UI
Eggplant's AI-driven computer vision allows it to see and interact with UIs like a human user. It includes optical character recognition and graphic element recognition.
Eggplant uses AI along with model-based testing to optimize and accelerate test creation. Using AI reasoning and data sets (like real user journeys, testing coverage, and past test failures) to guide new test creation, it goes far beyond happy path testing.
The model-based digital twin approach accurately represents an application's behaviors. It can auto-generate the most useful test cases based on learned behaviors.
Eggplant also supports defining tests graphically using state chart-like notation, representing the workflow and capturing all viable paths to generate a huge number of potential user journey variations.
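The path-generation idea behind that state-chart approach can be sketched as follows. The workflow states and transitions are hypothetical; the point is that a small graph model yields many distinct user journeys automatically.

```python
# Sketch of model-based test generation: represent a PLM workflow as a state
# graph and enumerate every simple path from start to finish. Each path is a
# candidate user journey to execute as a test. States are hypothetical.

WORKFLOW = {
    "login":         ["search_part", "create_part"],
    "search_part":   ["edit_bom", "logout"],
    "create_part":   ["edit_bom"],
    "edit_bom":      ["submit_change", "logout"],
    "submit_change": ["logout"],
    "logout":        [],
}

def journeys(graph, node="login", path=None):
    """Yield every simple (cycle-free) path from the start state to 'logout'."""
    path = (path or []) + [node]
    if node == "logout":
        yield path
        return
    for nxt in graph.get(node, []):
        if nxt not in path:  # skip transitions that would revisit a state
            yield from journeys(graph, nxt, path)

paths = list(journeys(WORKFLOW))  # five journeys from this six-state model
```

Even this tiny six-state model produces five journeys; a realistic workflow model produces far more variations than anyone would script by hand.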
Eggplant Generative AI (GAI) is a local, secure generative AI tool trained on testing, business analysis, and industry-specific knowledge. It can generate manual and executable tests, advise on risk-based testing, perform test deduplication and optimization, and update tests based on requirement changes.
Non-invasive testing of PLM capabilities
Eggplant's black-box non-invasive approach is a key differentiator, especially for secure A&D environments. It tests features directly through the UI, interacting with applications visually like a human user. It operates without needing any connection to the object (or API) layer, source code, or internal application structures.
This is perfect for even highly secure regulated environments, including sensitive air-gapped networks, as it does not require installing invasive software or accessing sensitive data.
End-to-end digital thread and system testing
Figure 6. Eggplant drives actions in the GUI of a satellite monitoring application
Eggplant supports comprehensive end-to-end testing of workflows spanning diverse enterprise applications like PLM, ERP, SCM, and more. It ensures a continuous digital thread from design to manufacturing and business operations, validating seamless data flow and functionality between them.
Eggplant allows easy switching between different systems in the same test. For example, it can launch an ERP application from the PLM system, perform actions, then switch to a CAD application, and initiate a simulation.
Tests can span any layer from the UI to APIs and databases, ensuring consistency from back-end operations to the UI layer.
Eggplant can also compare data between different systems or document types to ensure data consistency and prevent corruption.
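A minimal version of that cross-system comparison looks like the sketch below. The system names and revision values are hypothetical; in practice the readings would come from scraping or querying each application.

```python
# Sketch of cross-system data consistency checking: the same attribute (here,
# a part revision) read from several systems must agree. All values hypothetical.

def consistency_report(readings: dict[str, str]) -> str:
    """readings maps system name -> observed value for one attribute."""
    if len(set(readings.values())) == 1:
        return "consistent"
    details = ", ".join(f"{system}={value}" for system, value in sorted(readings.items()))
    return "MISMATCH: " + details

report = consistency_report({"PLM": "Rev C", "ERP": "Rev C", "MES": "Rev B"})
```

Here the MES holds a stale revision, so `report` flags a mismatch naming each system's value, which is exactly the evidence needed to triage where the digital thread broke.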
Integration with CI/CD pipelines
Eggplant supports CI/CD best practices. It offers adapters for popular tools like Jenkins, Azure DevOps, and GitHub. Automated tests can be triggered directly from the CI/CD pipeline, enabling fully automated deployment. Eggplant supports headless mode operation to run without a visible UI on DevOps servers.
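The pipeline-gating mechanism is conventional: the test runner's exit code tells the CI tool whether to proceed. The sketch below shows the idea with hypothetical result data, independent of any specific runner.

```python
# Sketch of gating a CI/CD stage on test results: a nonzero exit code is the
# standard signal that makes Jenkins, Azure DevOps, or GitHub Actions mark the
# stage as failed. Test names and statuses are hypothetical.

import sys

def ci_exit_code(results: list[dict]) -> int:
    """Return 0 if every test passed, 1 otherwise (the CI pass/fail signal)."""
    failures = [r["name"] for r in results if r["status"] != "pass"]
    for name in failures:
        print(f"FAILED: {name}", file=sys.stderr)
    return 1 if failures else 0

code = ci_exit_code([
    {"name": "bom_sync", "status": "pass"},
    {"name": "export_flow", "status": "fail"},
])
# In a real runner, this would end with: sys.exit(code)
```

Because one test failed, `code` is 1, so the pipeline stops before an unverified build reaches deployment.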
If a test fails, Eggplant provides detailed root cause analysis and failure analysis in a unified format with full visibility and traceability of test results.
Key benefits of Eggplant for aerospace and defense
Let's look at some key benefits of Eggplant for A&D companies.
Reduced release cycle time
Automation reduces release cycles from weeks to mere hours. Eggplant case studies show that:
- release cycle times reduced by 50% from four weeks to two
- month-long manual test process shortened to a four-hour fully automated test pipeline
- 15 days of manual testing reduced to an overnight test
- 10-hour test events every three days were upgraded to a daily 24-hour test
Eggplant achieves all this as follows:
- Single unified platform: Eggplant provides a comprehensive toolkit for managing all testing activities across various programs and supplier interfaces.
- Technology and platform agnostic: Eggplant can test any platform, device, operating system, application, and technology layer.
- Reusable and modular: Common PLM workflows, CAD manipulations, or data validation steps can be scripted once and then reused across multiple projects, programs, or even different supplier-specific implementations, significantly reducing redundant effort and speeding up new test development.
Minimized risks
An Eggplant customer achieved an 80% reduction in bugs. This directly translated to fewer costly post-production fixes, $2.5 million in cost savings, and enhanced product reliability, critical for aerospace safety.
Eggplant achieves this by ensuring data accuracy, a critical factor in A&D PLM where precision is non-negotiable. It validates data consistency across diverse systems and document types (e.g., comparing values in ERP with PLM or checking PDF content against web application data).
From a security viewpoint, Eggplant's non-intrusive black-box testing is ideal for A&D environments, reducing the risk of data breaches or intellectual property compromise during testing.
Improved system resilience
Eggplant enables highly resilient PLM systems by ensuring consistent performance, high availability, and adaptability to ongoing changes, which are crucial for complex A&D product development.
Streamline testing of PLM in aerospace and defense
In this blog, we looked at how test automation tools like Eggplant address the challenges that aerospace and defense organizations face in testing and maintaining complex software like PLM.
Contact us for demos of PLM test automation using Eggplant.
To find out how Keysight supports aerospace and defense innovation and modernization, visit our site.