Development and commissioning
Experts from Sick: What is important for sensors in the metaverse
Digital engineering tools are the basis of modern product development. Two experts from Sick discuss current developments, explain the requirements for engineering tools in the metaverse, and reveal how Sick uses them.
What is the status quo of engineering tools in the industrial metaverse, what role does the digital twin play, and which trends are emerging?
Oliver Huther (Head of Disruptive Business Development): The current landscape of engineering tools is dominated by CAx software and CAx-based simulation platforms. These are increasingly being supplemented by collaborative cloud platforms that enable teams to work together efficiently across locations and disciplines. In the past, especially in virtual commissioning, it has become clear that unified interfaces between virtual solutions and real devices are the essential prerequisite for seamless data exchange.
The asset administration shell (AAS), the digital twin, is of central importance here: in its role as a standardized interface, it enables the seamless integration of virtual solutions with real devices. At the same time, new trends are emerging that use artificial intelligence to support users in analyses and decisions. In some cases, augmented or virtual reality already places users in immersive environments that facilitate prototyping and problem-solving through interactive, realistic simulations.
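To make the idea of the AAS as a standardized digital twin interface concrete, the following minimal sketch models an asset with submodels as plain data structures. This is an illustration only: the identifiers, property names, and JSON layout here are simplified assumptions, not the actual AAS metamodel defined by the standard.

```python
from dataclasses import dataclass, field
import json

@dataclass
class Submodel:
    # A submodel groups related properties of the asset,
    # e.g. technical data of a sensor.
    id_short: str
    properties: dict = field(default_factory=dict)

@dataclass
class AssetAdministrationShell:
    # The AAS bundles submodels under one stable asset identifier,
    # so virtual tools and real devices can read the same structure.
    asset_id: str
    submodels: list = field(default_factory=list)

    def to_json(self) -> str:
        # Serialize into a simplified, illustrative JSON layout.
        return json.dumps({
            "assetId": self.asset_id,
            "submodels": [
                {"idShort": s.id_short, "properties": s.properties}
                for s in self.submodels
            ],
        }, indent=2)

# Hypothetical sensor asset with one technical-data submodel.
aas = AssetAdministrationShell(
    asset_id="urn:example:sensor:0001",
    submodels=[Submodel("TechnicalData", {"scanFrequencyHz": 50})],
)
print(aas.to_json())
```

Because both a simulation tool and a device gateway could parse the same serialized structure, the exchange format itself becomes the integration point, which is exactly the role the AAS plays between virtual solutions and real devices.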
What limitations do tools and users like Sick currently face, and how can these limits be pushed back or removed?
Jan Jarvis (Head of Virtualization): A central challenge for current engineering tools and simulation platforms is the still prevalent use of proprietary data formats, which complicates the exchange and interoperability between different platforms. Future-proof engineering tools, on the other hand, must be open, meaning non-proprietary, to enable and promote collaboration across data formats and domain boundaries. They must therefore support open standards for interoperability.
From Sick's perspective, another deficit becomes apparent: the simulation of optoelectronic sensor technology. CAD-based tools are often unable to realistically represent the complex physical processes of modern optical sensor technology. Modern rendering technologies such as physically based rendering (PBR) and GPU-accelerated real-time ray tracing (RTX) provide a remedy.
In combination, these technologies enable a more precise and significantly more powerful simulation of light reflections, materials, and environmental conditions, especially in dynamic environments. They are therefore indispensable for the virtual development, optimization, and commissioning of sensor solutions, especially within customer applications. Sick has done valuable pioneering work here over the past two years and has built up considerable know-how.
What impact does the industrial metaverse have on the collaboration of different users and disciplines?
Huther: When departments such as development, product management, sales, commissioning, and service, or even customers and manufacturers, interact seamlessly in virtual environments via the industrial metaverse in the future, this will naturally have a lasting impact on collaboration. Customized visualizations will present content in such a way that it meets the needs of each discipline and user group, whether at the 3D model level, the data level, or the process level. At the same time, the combination of real-time and historical data creates the basis for optimizing processes from commissioning to the operational phase.
Nevertheless, specialized expert tools that offer advanced functionality for specific applications are still needed. They, too, must enable and promote collaboration across domain boundaries, because seamless exchange of data and content between different tools is essential for efficient collaboration. The future engineering tool landscape will therefore be characterized by interoperability: only interoperability enables open platforms that allow real-time collaboration and immersive work environments in which the physical and digital worlds merge.
To what extent are there already suitable solutions to ensure the indispensable interoperability of the tools?
Jarvis: Open platforms and open standards for data formats and interfaces play the key role in ensuring interoperability. Cross-platform formats like OpenUSD (Universal Scene Description) enable the representation of complex scenarios and promote seamless exchange of data and content between different tools. With the Omniverse platform and the tools it contains, Nvidia currently offers a powerful software ecosystem. It embraces the fundamental idea of openness and interoperability and thus has the potential to serve as an orchestration platform and bridge between industrial metaverse tools.
Through the concept of Omniverse connectors, multiple users of different domain-specific design and simulation tools can incrementally and continuously synchronize scene changes via a so-called Nucleus server and collaborate in this way. OpenUSD as the underlying data format offers the advantages mentioned above in terms of acceptance, scaling, and interoperability, and it is increasingly establishing itself as a de facto standard for the exchange of 3D data. Several industry-renowned design and simulation tools already support Omniverse. This allows us at Sick to make the possibilities of metaverse concepts tangible in innovation projects today.
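To give a sense of what such an exchangeable scene description looks like, here is a minimal OpenUSD text file (`.usda`) sketching a cell with a robot and a scanner. The prim names and transforms are purely illustrative, not taken from any real Sick or Nvidia asset.

```usda
#usda 1.0
(
    defaultPrim = "Cell"
)

def Xform "Cell"
{
    def Xform "Robot"
    {
        double3 xformOp:translate = (0, 0, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    def Xform "SafetyScanner"
    {
        double3 xformOp:translate = (0.5, 0, 0.2)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because the format is human-readable text with a published schema, any tool that speaks OpenUSD can open, layer, or extend this scene, which is what makes incremental, multi-user synchronization via a Nucleus server feasible in the first place.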
How can you imagine that?
Huther: With Nvidia's Isaac Sim, we at Sick use a simulation platform that is specifically designed for developing, testing, and optimizing autonomous systems in virtual environments and that fully relies on OpenUSD. Fittingly, the tool provides access to the rendering technologies mentioned above (PBR and RTX), which are indispensable for simulating optoelectronic sensors. Isaac Sim allows us, for example, to realistically simulate robotic systems.
This allows us to efficiently simulate, test, and optimize complex interactions and scenarios involving our sensors before implementing them in the physical world as real devices with order numbers and data sheets. At the same time, we use the opportunity to provide sensor expertise through virtual Sick sensor models in Isaac Sim. Users already have access to four sensors, the multiScan136, picoScan150, TiM781, and microScan3, as digital models on the simulation platform, and more will follow.
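The virtual scanner models mentioned above ultimately compute one range value per beam for a given scene. Isaac Sim does this with PBR materials and ray-traced returns; as a deliberately simplified, standard-library-only illustration of the underlying idea, the toy sketch below casts beams in 2D against a single flat wall. It is not Sick's or Nvidia's code, and the parameters are made up.

```python
import math

def simulate_scan(wall_x: float, fov_deg: float, num_beams: int,
                  max_range: float) -> list:
    """Cast beams from the origin and return one range per beam.

    A beam hits the wall (the vertical line x = wall_x) only if it
    points toward it; otherwise it reports max_range (no echo).
    """
    ranges = []
    for i in range(num_beams):
        # Spread beams evenly across the field of view, centered on +x.
        angle = math.radians(-fov_deg / 2 + i * fov_deg / (num_beams - 1))
        if math.cos(angle) > 0:
            # Distance along the beam to the wall plane.
            r = wall_x / math.cos(angle)
            ranges.append(min(r, max_range))
        else:
            ranges.append(max_range)
    return ranges

# Toy scan: 5 beams over a 90-degree field of view, wall 2 m ahead.
scan = simulate_scan(wall_x=2.0, fov_deg=90.0, num_beams=5, max_range=10.0)
```

The center beam hits the wall perpendicularly and therefore reports the shortest range; oblique beams report longer ranges. A realistic sensor model would add the optical effects the interview highlights, such as material reflectance and ambient light, which is exactly where PBR and ray tracing come in.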