Beyond the Map: Building Living Digital Twins with AI and LiDAR

Digital twins are transforming how we interact with the world, creating realistic, immersive 3D experiences from real-world geospatial data. At Xentity, our Digital Twinning team is at the forefront of this evolution, developing stunning virtual environments that are not just visually impressive but also deeply rooted in accurate, data-driven insights. We are also developing AI-driven modules to mass-produce high-quality digital twins through pipelines fed with detailed real-world data.

The challenge has always been bridging the gap between raw, complex data and a functional, intelligent model. With aerial LiDAR (Light Detection and Ranging) data now widely and easily available from sources like the USGS National Map 3DEP program, we have an incredible resource at our fingertips. However, manually translating these massive point clouds—our Angel Fire, New Mexico, project area alone consisted of approximately 60 LiDAR tiles—into distinct features like individual trees and buildings is a monumental task.
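
The 3DEP point clouds mentioned above can be discovered programmatically. As a minimal sketch, the query below uses the USGS TNM Access API to list downloadable lidar tiles for an area of interest; the bounding box, dataset label, and response field names are assumptions to verify against the current API documentation.

```python
import requests

# Query the USGS TNM Access API for 3DEP lidar point-cloud products
# covering an illustrative bounding box near Angel Fire, NM.
# Dataset label and response fields reflect the public API at the
# time of writing; verify against current documentation before use.
TNM_API = "https://tnmaccess.nationalmap.gov/api/v1/products"

params = {
    "datasets": "Lidar Point Cloud (LPC)",
    "bbox": "-105.32,36.35,-105.23,36.45",  # minX,minY,maxX,maxY (WGS84)
    "prodFormats": "LAS,LAZ",
    "max": 100,
}

response = requests.get(TNM_API, params=params, timeout=60)
response.raise_for_status()

for item in response.json().get("items", []):
    print(item["title"], item["downloadURL"])
```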

This is where artificial intelligence and automated workflows come in. By developing state-of-the-art tools and processes, we can automatically extract features from LiDAR, building the foundational layers of a digital twin with unprecedented speed and precision. Our Angel Fire pilot project serves as a real-world laboratory, proving out these advanced capabilities to create a truly dynamic and interactive representation of the natural and built environment.

Some of the Eye-Catching Outputs

Xentity’s Digital Twinning effort creates realistic, immersive, stunning 3D experiences driven by real-world geospatial data. Our Digital Twins combine the intuitive, immersive experience of 3D modeling with the accuracy of geospatial data.

Explore these two videos capturing the sky-to-zoomed-in view and the Street Walk view.

Xentity’s Digital Twins can incorporate any number of features and datasets. Our Angel Fire, New Mexico, Digital Twin includes:

True-Height 3D Building Footprints – Derived from LiDAR using our state-of-the-art Structure Feature Extraction tools and workflows

[Image: screenshot of extracted 3D building footprints]

True-Height 3D Trees modeled across the landscape – Derived from LiDAR using our in-house workflows, with realistic, data-driven modeling of individual tree location points

[Image: a modeled tree in a forest]

High-resolution satellite imagery – Bring in any imagery from any source

[Image: aerial view of a town]

Real-time streaming data – Enhance situational awareness and decision-making by connecting to authoritative data sources

[Image: a map of the world]

A Look Under the Hood: The Angel Fire Case Study

The true power of a digital twin lies in its components and the ability to layer multiple data sources into a single, cohesive view. Our work in Angel Fire demonstrates how we use a combination of powerful GIS tools and in-house expertise to construct these rich environments.

Automated Feature Extraction in Action

Using ArcGIS Pro and its 3D Analyst extension, we developed a streamlined workflow to process vast amounts of LiDAR data. Key steps, illustrated in the sketch after this list, include:

  • Classifying Buildings: The "Classify LAS Building" tool rapidly identifies points in the LiDAR cloud that represent structures. For our project, this step processed 60 LiDAR tiles in just 10 minutes.
  • Extracting 3D & 2D Footprints: From the classified points, we extract 3D building models and 2D footprints. This step was equally fast, taking only 10 minutes.
  • Refining with AI: The real "game-changer" was the "Regularize Building Footprint" tool, which cleans up the initial extracted shapes, converting jagged, computer-generated lines into clean, right-angled polygons that accurately represent real-world buildings.
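
The post names ArcGIS Pro's "Classify LAS Building" and "Regularize Building Footprint" geoprocessing tools; the sketch below strings them together with a generic rasterize-and-vectorize step to get from classified points to rough 2D footprints. All paths, thresholds, and parameter values are illustrative assumptions, not the project's production settings.

```python
import arcpy

# A minimal sketch of the building-extraction sequence, assuming
# ArcGIS Pro with the 3D Analyst extension licensed.
arcpy.CheckOutExtension("3D")
arcpy.env.workspace = r"C:\projects\angel_fire"  # illustrative workspace

lasd = "angel_fire.lasd"  # LAS dataset referencing the ~60 lidar tiles

# 1. Classify building rooftop points (LAS class code 6).
#    Height/area thresholds are illustrative defaults.
arcpy.ddd.ClassifyLasBuilding(lasd, min_height="2 Meters",
                              min_area="10 SquareMeters",
                              compute_stats="COMPUTE_STATS")

# 2. Isolate the building-classified points, rasterize them, and
#    vectorize the raster into rough 2D footprint polygons.
arcpy.management.MakeLasDatasetLayer(lasd, "buildings_lyr", class_code=[6])
arcpy.ddd.LasPointStatsAsRaster("buildings_lyr", "bldg_raster.tif",
                                method="PREDOMINANT_CLASS",
                                sampling_type="CELLSIZE",
                                sampling_value=1)
arcpy.conversion.RasterToPolygon("bldg_raster.tif", "bldg_rough.shp",
                                 simplify="NO_SIMPLIFY")

# 3. Regularize the jagged, raster-derived outlines into clean,
#    right-angled building footprints.
arcpy.ddd.RegularizeBuildingFootprint("bldg_rough.shp",
                                      "bldg_footprints.shp",
                                      method="RIGHT_ANGLES",
                                      tolerance=1)
```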

This same data-driven approach applies to the natural landscape. Using custom-built workflows, we model true-height 3D trees, each represented as a unique point with detailed attributes such as species, height, width, and health, all derived directly from the LiDAR data combined with additional authoritative vector geospatial data. A simplified sketch of the underlying idea follows.
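
The in-house tree workflow itself is not detailed here, so as a rough illustration of the concept, the sketch below finds candidate individual-tree points as local maxima in a canopy height model (CHM) raster and reads a height attribute per point. The file name, window size, and height threshold are assumptions, and rasterio plus SciPy stand in for whatever stack the production workflow actually uses.

```python
import numpy as np
import rasterio
from scipy.ndimage import maximum_filter

# Load a canopy height model (CHM = lidar surface minus bare earth).
# "chm.tif" and the thresholds below are illustrative.
with rasterio.open("chm.tif") as src:
    chm = src.read(1)
    transform = src.transform

MIN_TREE_HEIGHT = 2.0   # metres; ignore shrubs and noise
WINDOW = 5              # local-maximum search window, in cells

# A cell is a candidate treetop if it equals the maximum of its
# neighborhood and clears the minimum height threshold.
local_max = maximum_filter(chm, size=WINDOW)
treetops = (chm == local_max) & (chm >= MIN_TREE_HEIGHT)

rows, cols = np.nonzero(treetops)
for row, col in zip(rows, cols):
    x, y = rasterio.transform.xy(transform, row, col)
    print(f"tree at ({x:.1f}, {y:.1f}), height {chm[row, col]:.1f} m")
```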

Fusing Data for a Complete Picture

A digital twin is more than the sum of its parts. By integrating various datasets, we create a tool for enhanced situational awareness and decision-making:

  • High-Resolution Imagery: We can drape high-resolution satellite imagery over the 3D terrain and models, providing crucial visual context.
  • Real-Time Data Streams: We enhance the digital twin by connecting to live data services. For example, streaming current wildfire incident locations and perimeters from enterprise sources provides an up-to-the-minute operational view for emergency response (a minimal query sketch follows this list).
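
As one concrete pattern for that live-data connection, the sketch below queries an ArcGIS-style REST feature service for current wildfire perimeters as GeoJSON. The service URL is a placeholder, not the enterprise source the project actually connects to.

```python
import requests

# Placeholder URL: any ArcGIS REST feature service exposing wildfire
# perimeters would follow this same query pattern.
SERVICE = ("https://services.example.com/arcgis/rest/services/"
           "WildfirePerimeters/FeatureServer/0/query")

params = {
    "where": "1=1",     # return all current features
    "outFields": "*",   # all attributes
    "f": "geojson",     # ask the server for GeoJSON output
}

resp = requests.get(SERVICE, params=params, timeout=30)
resp.raise_for_status()

for feature in resp.json()["features"]:
    # e.g. incident name, acreage, containment, as published by the feed
    print(feature["properties"])
```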

Lessons Learned: The Right Tool for the Job

Our research involved a comprehensive evaluation of various software tools, each with distinct strengths:

  • ArcGIS and QGIS are GIS-centric powerhouses, essential for data processing and feature extraction. ArcGIS excels at handling GIS data, though its 3D visualization is more functional than photo-realistic. QGIS is a capable, free, and open-source alternative, whose functionality is greatly expanded by a universe of plugins.
  • Blender offers best-in-class visualization. While not a native GIS tool, it can import geospatial data to create stunningly realistic renderings with advanced lighting and environmental effects—perfect for presentation and simulation (see the sketch after this list).
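
To illustrate the hand-off into Blender, the snippet below, run inside Blender's own Python environment, imports a glTF export of GIS-derived 3D models and adds a sun lamp for quick directional lighting. The file path is illustrative, and add-ons such as BlenderGIS offer richer, georeferenced import options.

```python
import bpy

# Import a glTF/GLB export of the GIS-derived 3D models; Blender
# ships with a glTF importer. The path is illustrative.
bpy.ops.import_scene.gltf(filepath="/data/angel_fire_buildings.glb")

# Add a sun lamp for simple, realistic directional lighting.
bpy.ops.object.light_add(type='SUN', location=(0, 0, 500))
```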

By leveraging the strengths of each tool, we have created a workflow that is both efficient and capable of producing stunningly detailed and data-rich digital twins. This fusion of AI-driven feature extraction, multi-source data integration, and advanced visualization is unlocking the full potential of digital twins for planning, management, and operations.