Next Generation 9-1-1 (NG911) is a national initiative aimed at updating the 9-1-1 service infrastructure in the United States and Canada to improve public emergency communications services in an increasingly wireless, mobile society. It provides an Emergency Call Routing Function (ECRF) and a Location Validation Function (LVF). Xentity is teaming with NG911 primes to support:

  • Service Setup – Field Map and Data Migration to support ECRF and LVF services as well as LDB Integration solutions
  • Initial Data Creation – Transform and Digitize to NENA standards for Primary and Secondary Boundaries, Address Data (Routing and Dispatchable), Roads Data, and additional Rich NENA datasets
  • Design/Development & Enhancements – LVF, ECRF, LDB Integration tools, DR Reporting Integration, Geocoding, Data Pipeline, and Data Pipeline Self-Service Config App
  • Data Operations – Support for Data Management, Services, Tools/Configuration Support, and Training
  • GIS Self-Service Apps (as needed) – GIS importers, Boundary editors, Vector editors, Pipeline interface for syncing, call reporting, GIS Dashboard extensions to NGA911 reporting

State of California NG911 Strategy

The state of California required an implementation and data migration strategy that would address all of California's statewide GIS services requirements while also maintaining its GIS database. Xentity is helping augment GIS data, pipelines, and LVF/ECRF services to support California GIS standards, along with providing data maintenance solutions for the database. Xentity focused not only on the table stakes of data transformation and LVF services, but also on an automated, cloud-based, serverless data transformation and validation engine that runs as data changes occur, providing fast response to data errors and pushes to production.
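To make the event-driven idea concrete, here is a minimal sketch of what such a serverless validation step might look like: a cloud function invoked when a source file lands in storage, which validates each record and separates accepted records from those flagged for review. The handler name, event shape, field names, and validation checks are all hypothetical placeholders, not actual NENA rules or California standards.

```python
def validate_address_record(record: dict) -> list[str]:
    """Return a list of validation warnings for one address record.

    The checks below are illustrative placeholders only; a real
    implementation would encode NENA/state-specific rules.
    """
    warnings = []
    if not record.get("street_name"):
        warnings.append("missing street_name")
    if not record.get("msag_community"):
        warnings.append("missing msag_community")
    return warnings


def handle_upload(event: dict) -> dict:
    """Hypothetical cloud-function entry point, triggered on file upload.

    Validates each record in the event payload and splits the batch
    into accepted records and records flagged with warnings.
    """
    accepted, flagged = [], []
    for record in event.get("records", []):
        problems = validate_address_record(record)
        if problems:
            flagged.append({"record": record, "warnings": problems})
        else:
            accepted.append(record)
    return {"accepted": accepted, "flagged": flagged}
```

Because the function only runs when data changes, errors surface minutes after an upload rather than during a scheduled batch window.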

The Importance of Maintenance

The pipeline handles data acquisition, data ingestion, data validation & augmentation, and publishing & ledger merging:

  • Data acquisition posts files to cloud storage based on approved sources and standards.
  • Data ingestion supports near real-time queuing of record processing at over 1,000 records per minute. Files from the acquisition step trigger scripts that decompose them into individual files.
  • Data validation processes each record by its associated GIS feature class, leveraging modular QC scripts written to National Emergency Number Association (NENA) standards and requirements.
  • Publishing and ledger merging handles warnings and failures according to California governance standards. Accepted records are identified as "new," "changed," "same," or "removed."

Through this data pipeline assimilation strategy, Xentity also aims to ensure a properly maintained database for the state's service.
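The ledger-merge step above can be sketched as a simple diff between the current ledger and an incoming batch. This is a minimal illustration, assuming records are keyed by a unique ID and compared by a stable attribute hash; the function and field names are hypothetical, not the pipeline's actual interface.

```python
import hashlib
import json


def record_fingerprint(record: dict) -> str:
    """Stable, order-independent hash of a record's attributes."""
    return hashlib.sha256(
        json.dumps(record, sort_keys=True).encode("utf-8")
    ).hexdigest()


def classify_records(ledger: dict, incoming: dict) -> dict:
    """Classify an incoming batch against the current ledger.

    Both arguments map a unique record ID to its attribute dict.
    Returns a mapping of ID -> "new" | "changed" | "same" | "removed".
    """
    status = {}
    for rid, rec in incoming.items():
        if rid not in ledger:
            status[rid] = "new"
        elif record_fingerprint(rec) != record_fingerprint(ledger[rid]):
            status[rid] = "changed"
        else:
            status[rid] = "same"
    # Anything in the ledger but absent from the batch has been removed.
    for rid in ledger:
        if rid not in incoming:
            status[rid] = "removed"
    return status
```

Classifying records this way lets the publishing step merge only "new" and "changed" records into production while logging "removed" ones for governance review.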