Keeping it FAIR: OGC and the Quest for Open Standards
November 21, 2022 · Written by Mish Sukharev


Have you ever wondered how a standard becomes – well… a standard? Why HTML? Why JPEG? Standards define a field and how we work in it. By creating a shared norm, they become the cornerstones around which industries are built. They allow an emerging technology to mature; to go from niche to mainstream. Though not always the case, setting standards can be an organically democratic exercise on a world scale. Recently, Duality got a glimpse into this process at the Open Geospatial Consortium (OGC).

The 124th Member Meeting of the Open Geospatial Consortium (OGC) was held October 3-7, 2022 in Singapore in partnership with the Singapore Land Authority. OGC, of which Duality is a proud member, is dedicated to the mission of making location information Findable, Accessible, Interoperable, and Reusable (FAIR). Composed of over 500 businesses, universities, and government and research agencies, OGC mounts a continuous community-driven effort to set open standards that make geospatial data of the most use to the most people.

Dr. Nadine Alameh, the CEO of OGC, understands the importance of open standards more than most: data interoperability is essential for the geospatial community to thrive. Thanks to Dr. Alameh’s leadership, and that of Senior Director Trevor Taylor, this mission now extends to the rapidly evolving areas of the geospatial metaverse and Digital Twins, underscored by OGC’s decision to become a founding member of the Metaverse Standards Forum.

Attendees of the 124th OGC Member Meeting, Singapore (image source: OGC)

To many members of OGC, Digital Twins (DTs) promise an exciting future for working with geospatial data. In fact, Singapore was a rather appropriate host location, being the first nation to launch the development of a Digital Twin of its entire geography, land and surrounding waters included (fittingly, the focus for this meeting was Digital Twins: Land & Sea). But since DTs are still an emerging technology, they do not yet have an agreed-upon set of standards, and what exactly a Digital Twin is remains open to interpretation. Fortunately, OGC has a well-defined process for identifying when standards are needed, how to debate them and, ultimately, how to vote on them and introduce them to organizations like the ISO (International Organization for Standardization).

Amey Godse, Duality’s Technical Program Manager, went to this OGC meeting to share our perspective on the Digital Twin standards conversation and hear the thoughts of the wider geospatial community. Given its focus on location data, OGC is specifically invested in Environment Digital Twins (which at Duality we often refer to as Space Twins). And even though Duality has a wider focus that encompasses all predictive Digital Twins, our work is founded on generating high-fidelity environments. The success of our customers often hinges on access to shared geospatial data and our ability to use it in a Digital Twin context.

Duality’s Amey Godse discussing Digital Twin standards at OGC’s 124th Member Meeting - details of this slide are presented below. (image source: Nadine Alameh)

Duality’s Digital Twin Encapsulation Standard

The concept of Digital Twins was first introduced by Michael Grieves in 2002, with the name itself emerging during a 2010 collaboration with NASA. Over the last 20 years, Dr. Grieves’s continuously evolving DT framework has been foundational for those on the Digital Twin frontier. The variety of digital twins, and the domains in which they operate, makes it nearly impossible to define a single standard that covers every use case. This is what motivated us to turn to the concept of an encapsulation standard for digital twins as the basis for composing an interoperable enterprise metaverse.

So what do we mean by a Digital Twin Encapsulation Standard? To answer that question it is important to understand how Falcon, our Digital Twin simulator, uses DTs. Beyond looking realistic, our DTs also need to be “simulation-ready” (more in-depth elaboration here), which means they need to function like their real-world counterparts. They need to carry information and exhibit properties tied to their ontology and physics (and much more!). In every way possible they must be grounded in real-world data, and this data structure needs to be accessible and interoperable. Through this lens, a Digital Twin can be viewed as an encapsulation of all the relevant data that using that Digital Twin may require, rather than as the data itself. And this is why at Duality we are evolving a Digital Twin Encapsulation Standard.

On the surface our DTs are standard 3D models – but these geometries are wrapped within a USD definition.
A variety of standard 3D model formats can be wrapped in a USD definition to encapsulate them as Digital Twins. Pulling in the Digital Twin’s appearance through a reference allows us to use the 3D representation that is most efficient and suitable for that type of twin and for the domain in which it is being used. We discussed the performance implications of runtime asset loading in this blog. By structuring our digital twin information in USD, we are also able to pull in the powerful concept of non-destructive composition arcs.
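To make this concrete, here is a minimal sketch of the pattern using the open-source USD Python API (pxr). The file names, prim paths, and variant names are purely illustrative and are not part of our published format; the point is that the geometry arrives through composition arcs rather than being copied into the twin.

```python
# Minimal sketch using the open-source USD Python API (pxr).
# File names, prim paths, and variant names are illustrative only.
from pxr import Usd, UsdGeom

# The twin gets its own USD layer; the heavy 3D geometry lives elsewhere.
stage = Usd.Stage.CreateNew("forklift_twin.usda")
twin = UsdGeom.Xform.Define(stage, "/ForkliftTwin").GetPrim()
stage.SetDefaultPrim(twin)

# The appearance is composed in non-destructively rather than copied in.
geom = UsdGeom.Xform.Define(stage, "/ForkliftTwin/Geometry").GetPrim()

# A variant set lets one twin carry multiple representations (e.g. LODs),
# each pulled in as a payload so it is only loaded when actually needed.
vset = geom.GetVariantSets().AddVariantSet("representation")
for name, asset in (("high_detail", "./assets/forklift_lod0.usd"),
                    ("low_detail",  "./assets/forklift_lod2.usd")):
    vset.AddVariant(name)
    vset.SetVariantSelection(name)
    with vset.GetVariantEditContext():
        geom.GetPayloads().AddPayload(asset)

vset.SetVariantSelection("high_detail")
stage.GetRootLayer().Save()
```

Because the appearance arrives through a composition arc, the same twin definition can point at whichever representation is most efficient for a given domain without ever touching the source asset.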

Simulation scenarios are also USD files – they combine Space, System, and Item twins, along with their initial states, into a unified whole.
This structure makes some very useful features intrinsic to the model, including level-of-detail (LOD) management, metadata properties, physics, composition, and variant and system specification support. Falcon efficiently parses these Digital Twin USD files, runs the simulation, and generates perception data using any number of simulated sensors.
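As a sketch of that idea (again with hypothetical names and paths, not our actual scenario format), a scenario layer can reference the twins it needs and record their initial states as overrides, leaving the referenced twin assets untouched:

```python
# Illustrative sketch: a scenario layer that composes twins by reference
# and records their initial states. Paths and prim names are hypothetical.
from pxr import Usd, UsdGeom, Gf

scenario = Usd.Stage.CreateNew("warehouse_scenario.usda")
root = UsdGeom.Xform.Define(scenario, "/Scenario").GetPrim()
scenario.SetDefaultPrim(root)

# A Space twin provides the environment...
warehouse = scenario.DefinePrim("/Scenario/Warehouse")
warehouse.GetReferences().AddReference("./warehouse_space.usda")

# ...and an Item twin is placed inside it.
forklift = UsdGeom.Xform.Define(scenario, "/Scenario/Forklift")
forklift.GetPrim().GetReferences().AddReference("./forklift_twin.usda")

# The initial state is an override authored in the scenario layer, so the
# referenced twin assets themselves are never modified.
forklift.AddTranslateOp().Set(Gf.Vec3d(12.0, 0.0, -4.5))

scenario.GetRootLayer().Save()
```

A simulator that understands USD composition can then open this single scenario layer and resolve everything it needs: the environment, the items within it, and their starting poses.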

Fig.1 - Three levels of the Digital Twin Encapsulation Standard

As dictated by our experience with an array of customer needs, we structured our standard into three distinct but interconnected levels:

3D Asset Properties:
Define structure and appearance
Digital Twin Properties:
Make the DT simulation ready and interoperable with other DTs
Platform Properties:
Enable workflow functions (e.g. versioning, permissions, sharing, etc.)

As shown in Fig.1, the first set of physically motivated properties is concerned with the Digital Twin’s appearance and how it physically operates in, and interacts with, its virtual environment. The rest of the Digital Twin Properties are dedicated to further connecting the DT with its real-world counterpart by including domain-specific aspects (source/scan data, variant information, SKUs, etc.) that even extend to intelligent and autonomous twins (actuators, virtual sensors, fully functional autonomy software). The Platform Properties provide metadata about the ownership of the DT and its version history.
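One way to picture how the three levels coexist on a single twin is as namespaced attributes authored on the twin’s prim. The namespaces and attribute names below ("asset:", "twin:", "platform:") are hypothetical and chosen only to illustrate the grouping; they are not the standard’s actual schema.

```python
# Hypothetical property names grouped by the three levels described above;
# they illustrate the grouping and are not the standard's actual schema.
from pxr import Usd, Sdf, Gf

stage = Usd.Stage.Open("forklift_twin.usda")
twin = stage.GetDefaultPrim()

# Level 1 - 3D asset properties: structure and appearance.
twin.CreateAttribute("asset:boundingBoxMeters",
                     Sdf.ValueTypeNames.Float3).Set(Gf.Vec3f(2.3, 1.2, 2.1))

# Level 2 - digital twin properties: what makes the twin simulation-ready.
twin.CreateAttribute("twin:ontologyClass",
                     Sdf.ValueTypeNames.String).Set("vehicle.forklift")
twin.CreateAttribute("twin:massKg",
                     Sdf.ValueTypeNames.Float).Set(4200.0)
twin.CreateAttribute("twin:sourceScan",
                     Sdf.ValueTypeNames.Asset).Set(Sdf.AssetPath("./scans/forklift_lidar.ply"))

# Level 3 - platform properties: workflow metadata such as version and owner.
twin.CreateAttribute("platform:version",
                     Sdf.ValueTypeNames.String).Set("1.4.0")
twin.CreateAttribute("platform:owner",
                     Sdf.ValueTypeNames.String).Set("duality")

stage.GetRootLayer().Save()
```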

The associated property types for digital twins can vary significantly across different industries and specific domains of interest. The flexibility of USD custom schemas makes it possible to extend the encapsulation to a wide range of formats and data types while maintaining the overall digital twin structure and standardizing within a specific domain of operation. For these reasons, we characterize the standard we use not as a regular Digital Twin standard but as a Digital Twin Encapsulation Standard.
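The payoff of keeping that overall structure consistent is that tooling can reason about any twin the same way, whatever its domain. As a rough sketch of what such a check might look like, reusing the hypothetical attribute names from the previous example (a real deployment would derive the required names from a domain’s custom schema):

```python
# Sketch of a conformance check over a composed scenario. The required
# attribute names reuse the hypothetical ones from the previous example.
from pxr import Usd

REQUIRED = ("asset:boundingBoxMeters", "twin:ontologyClass", "platform:version")

def missing_properties(prim):
    """Return the required encapsulation attributes absent on this prim."""
    return [name for name in REQUIRED if not prim.HasAttribute(name)]

stage = Usd.Stage.Open("warehouse_scenario.usda")
for prim in stage.Traverse():
    # Treat any prim with authored "twin:"-namespaced properties as a twin.
    if prim.GetAuthoredPropertiesInNamespace("twin"):
        missing = missing_properties(prim)
        if missing:
            print(f"{prim.GetPath()} is missing: {', '.join(missing)}")
```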

We will dive deeper into our Digital Twin Encapsulation Standard in the near future. For now it is simply important to note that this approach has been of high utility for Duality and our customers. Like physical assets, digital twins become even more valuable when they can be shared within teams and organizations, and even between them. This leads to a key question: how does any format become recognized as a standard and achieve broad adoption across the community?

Open Standards at OGC

Four times a year, OGC’s community of geospatial experts gathers to collaborate and share ideas on the interoperable technologies and open standards that matter in their respective domains. OGC strongly encourages all interested parties to participate in the process. The proceedings are involved and highly iterative – there is a great deal of nuance to understand, and diverse perspectives to incorporate – but we can share a simplified view. To help effectively survey this geospatial landscape, each member meeting is organized around established Domain Working Groups, each focused on a specific domain of interest (examples: Agriculture, Data Quality, Health, Urban Digital Twins, etc.). When a new Domain Working Group is proposed, members can vote it into existence. Once formed, the nascent Domain Working Group is made up of thought leaders from across the globe and can operate in a semi-open structure where anyone can share their ideas, concerns, and experiences around that specific domain.

A new standard is always conceived in a Domain Working Group. When a new standard begins to be deliberated, the relevant Domain Working Group may hold a vote to form a Standards Working Group to be put in charge of that standard. If the Standards Working Group proposal passes, the next step is inviting the members who will head up the new standards debate. These members are generally the groups or companies most involved with, or affected by, the potential standard. Leveraging their combined knowledge of the field, the Standards Working Group follows a formal, chartered process to draft the new standard. Finally, when consensus is reached, the Standards Working Group begins the process of proposing the standard to organizations like the ISO, culminating in its formal adoption and its spread to the world community at large.

We wish to express our gratitude to Scott Simmons, Chief Standards Officer at OGC. Scott’s shepherding of this meeting, along with the guidance he provided to members new to the standards process, was invaluable. We look forward to participating in this exemplary process, contributing our perspective and helping shape a more open world for the geospatial community.