Welcome to part 4 of our company’s ‘story’. Once again, we tell our tale through projects from 20 years of business dedicated to putting the ‘I’ back in ‘IT’ and ‘GIS’. In the previous parts, we introduced this concept, then told our story through geospatial data and open data, showcasing the variety of projects we’ve undertaken.
Data exhaust – logs of usage, traffic, interactions, transactions, and search trends – is now captured at such a granular level that what was once discarded is treated as data gold: it informs micro-reactions, trend analysis, and guidance for even the smallest interaction, search, algorithm, and model signals. For Xentity, this is big data. Former ‘data exhaust’ and metadata now fuel new data flow constructs that maintain fidelity. We help determine new patterns and solutions to feed this knowledge-first trend, as clients look to platforms that integrate traditional MIS and data warehouses with new data pipelining architectures to power their data-driven ambitions.
Some of Our Past Projects
And so, with our routine made clear, we tell our story once again – this time from the perspective of big data, and more specifically the big data-related projects we have undertaken in 20 years of business.
For the Aquatic Information Management data modeling initiative, Xentity developed the process/workflow and data model for the Assessment, Inventory and Monitoring (AIM) program of ecosystem scientists. The program required improved information sharing across disparate organizations and physical datastores to improve program delivery. After evaluating the many data management solutions already in use, the team developed a single, complete business architecture, including workflows with business rules and a supporting logical data model.
The models met BLM modeling standards and were designed for XML exchange standards, producing a single logical data model complete with data documentation. Through this, we demonstrated our knowledge of business and solution architecture in big data. As an organization focused on improving existing data and its uses, it is important that we understand good solution architecture.
The Federal Geographic Data Committee (FGDC) awarded Xentity a 5-year task order for GeoPlatform Services & Support. The FGDC developed the Geospatial Platform (GeoPlatform), a web-based application providing a suite of well-managed, highly available, and trusted geospatial data, services, and applications for use by Federal agencies. Given that GeoPlatform is available to the general public, this was arguably an exercise in both big data AND our previously discussed open data.
The major goal of the FGDC’s GeoPlatform.gov is to support sharing of geographic data, maps, and online services through an online portal. In response, Xentity supported adjustments to the GeoPlatform as new data policies were implemented, and executed a migration of existing GeoPlatform components so the platform could take advantage of new advances in cloud configurations, components, and technologies. We also applied data maturity best practices to improve the quality of the data. This is where we showed our chops in big data, even though the goal was more open data-based: a great deal of data and metadata needed to be migrated and adjusted in response to platform changes, and our development services helped with exactly that.
Continuing the theme of data migration, we turn to our work with the National Archives and Records Administration (NARA). NARA wanted to migrate its public web records to a cloud-hosted architecture to accelerate delivery and discovery for users of those records. At the time, NARA was invested in an architecture model developed before the “big data” era. It could handle full-text search as well as faceted search, but it was taxing on computing power to return accurate results, and those results were not contextually aware of popularity, typos, synonyms, and the like.
In a 2-week sprint, Xentity took a 1% sample of the billions of records. On AWS EC2, search time dropped from 45 seconds to sub-second in the NoSQL environment, and loading that 1% sample and building the search index took mere minutes. For the second phase, Xentity tuned Oracle storage and other configuration parameters, working with advanced Oracle components such as ASM, Active Data Guard, XML chunk sizing, and Oracle block sizing. This reduced Oracle query times from 45 seconds to under 3 seconds.
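The core idea behind that speedup is worth a quick illustration: instead of scanning every record for each query, a search index is built once, up front, so lookups become near-instant dictionary hits. The toy inverted index below is our own simplified sketch, not NARA’s actual system; the record text and schema are invented for illustration.

```python
from collections import defaultdict

# Hypothetical records keyed by ID -- invented for illustration,
# not NARA's actual data or schema.
records = {
    1: "census records for denver county 1910",
    2: "military service records world war one",
    3: "denver land patent records 1895",
}

# Build the inverted index once, at load time: each term maps to
# the set of record IDs that contain it.
index = defaultdict(set)
for rec_id, text in records.items():
    for term in text.split():
        index[term].add(rec_id)

def search(*terms):
    """Return IDs of records containing ALL the given terms."""
    if not terms:
        return set()
    result = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        result &= index.get(term, set())
    return result

print(sorted(search("denver", "records")))  # prints [1, 3]
```

Each query now touches only the index entries for its terms rather than every record, which is why load-once, query-many indexing scales to billions of records where per-query scans cannot.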
As with the GeoPlatform project, we see the results of assisting with data migration to newer, more technologically advanced platforms. In NARA’s case, that meant a more advanced search engine (the simplest way to summarize all the jargon) that could handle the enormous amount of data it holds. We may be more interested in information, but that does not mean we won’t assist with technology so that organizations can better use that information.
The Institute of Museum and Library Services (IMLS) is a small grant-making agency which finances programs at 123,000 libraries and 35,000 museums across the US. These programs provide vital resources that contribute significantly to Americans’ economic development, education, health, and well-being. At the time, IMLS had compiled a backlog of 3,000 forms, many over 3 years old, and needed a method to evaluate and quantify various approaches to digital data extraction from hard-copy forms versus manual data entry. Xentity worked to improve the processing of final grantee reports so that internal and external stakeholders could discover, query, analyze, and browse these materials more effectively.
We improved the processing of final hard-copy grantee reports, including preparing grantee-provided attachments and products for publishing on IMLS.gov. Xentity also recommended automated tools to improve how the agency records grantee results and shares products from grant-supported projects with the public. This work better described grantee projects and attachments, ensuring that stakeholders inside and outside the agency could find and use them.
The USGS Core Science Systems (CSS) program, with over $100 million in annual budget, underpins all mission areas of the USGS, delivering nationally focused earth-system-science and information science programs that provide fundamental research and data in support of the USGS Science Strategy and Secretarial, Presidential, and societal priorities. CSS engaged Xentity to analyze and plan efficiencies for migrating to shared services across thirty major characterization and analytic national data programs. Xentity architects analyzed a portfolio of over 100 systems and 40 major data sets, provided a blueprint addressing service platform integration, geospatial platform migration, web map application integration and migration, and infrastructure alternatives, and offered new models for supporting the major transformational governance needed.
Much like many of our other big data-related projects, these efforts helped government organizations migrate data and metadata to newer, better, more up-to-date platforms. At Xentity, we try to work where we believe we can have the greatest impact, and few places offer greater impact than the US government. So we’re always proud of projects that involve helping out at the federal, state, and even local levels.
Xentity worked as a subcontractor under First Data to support the establishment of a public Health Insurance Exchange, a competitive marketplace where individuals and small employers can directly compare available private health insurance options. Xentity provided an on-site team in Albany, NY, offering consultant and analyst support for the State of New York Health Insurance Exchange system and data development. Tasks included data architecture, data warehouse design, business process re-engineering, and overall enterprise and solutions architecture activities, as well as analysis and architecture support for health systems and processing in support of implementing the exchange/marketplace. Here we demonstrated strategic analysis and architecture for big data.
The USGS National Geospatial Program (NGP) was under extreme pressure to deliver data, become more flexible, drive down cost, reduce cycle time, and improve services, while its communities demanded the right level of innovation and organizational approaches. As a trusted partner of the USGS, Xentity played a key design, architecture, analysis, and communication role in advancing the program, providing support across project/change management, financial management, investment planning and management, enterprise architecture, and solution architecture. More work in architecture, strategy, and outreach.
The US Geological Survey (USGS) wished to improve both science research and management, better supporting a data-driven model for the organization. The hope was to create “at-the-ready data services and platforms” that could help answer difficult questions about how specific policies apply to certain types of protected-area government land ownership and terrestrial species. Xentity was tasked with developing data solution designs to establish the target platform, and with rationalizing and assessing the readiness, complexity, and other gaps involved in moving to the new data-driven model. Our services also help in the creation of data platforms.
The Colorado Department of Transportation (CDOT) looked to reinvent itself as a data-driven organization focused on using data as a strategic asset, with safety among its core challenges due to an increasing population and aging infrastructure. To achieve its new goals and combat these challenges, CDOT needed to pursue increased data integration. Xentity helped uncover that what was thought to be a few dozen core data assets was in fact well over 5,000 data assets and more than 600 data-driven applications and systems. This first inventory phase established the “as-is” state and a prioritized list of use cases. Architects then created a conceptual architecture that broke the core services down into a key portfolio of services. The common theme in our big data-related projects, as you can clearly see, is architecture.
A Preview For Part 5
We hope you enjoyed yet another trip down memory lane. Join us for the finale of this series of blogs, where we tell our story one final time, this time through IoT data.