Will geoscience go for a shared service environment?

As the previous “How can we help geoscience to move their data to shared services” blog noted, unless we align the stakeholders, get a clear line of sight on their needs, and focus on earning trust and demonstrating value, the answer is no. But let’s say we are moving that way. How do we get started on funding such an approach?

Well, first off, the current grant and programmatic funding models are not designed to develop shared services or interoperable data for the geosciences. Today, there are many geoscientists collaborating across disciplines and, as a result, improving the quality of knowledge and scaling the impact of their research. It is also well established that the vast majority operate individually or in small teams. Geoscientists, rightly so, remain very focused on targeted scientific objectives and not on enabling other scientists; it is a rare case when they have the necessary resources or skills. With the bright shiny object of data driven science / Big Data, do we have the Big Head wagging the body of the geoscientist community? Xentity sees opportunities to develop funding strategies to execute collaborative, performance-based, cross-discipline geoscience. It has been this way since World War II, when the successful war-time onesy-twosy grants to universities were expanded. There has been some movement towards hub and spoke grant funding models, but we are still out to get our PhD stripes, get our CVs bigger, and keep working with the same folks. I know it is a surly and cynical view. OK, the reality is they are doing amazing work, but in their own fields, and anything that slows down their work for the greater good lacks incentive.

Also, there are few true shared services that are managed and extended to the community as products and services should be. Data driven science, our fourth paradigm of science, has been indirectly “demanding” that scientific organizations and their systems become 24×7 service delivery providers. We have been demanding that IT programmers become service managers, and that scientists become product managers or data managers. With a few exceptions, it has not worked. Geoscientists are still struggling to find and use basic data/metadata and to produce quality metadata (only 60% meet quality standards per EarthCube studies) for their own purposes, let alone making the big leap to Big Data and analytics. Data driven science requires not only a different business or operating model, but a much clearer definition of the program, as well as scientists’ roles and expectations. It requires new funding strategies, incentive models and a service delivery model underpinned by the best practices of product management and service delivery.

Currently, and my favorite, there is limited to no incentive for most geoscientists to think beyond their immediate needs. If geoscientists are to be encouraged to increase the frequency and volume of cross-discipline science, there need to be enablement services, interoperable data, and information products that solve repetitive problems and provide incentive for participation. We need to develop the necessary incentive and management models to engage and motivate geoscientists, develop a maturity plan for the engineering of shared geoscience services, and develop resourcing strategies to support its execution. Is the answer new funding models, new recognition models, new education, gamification, crowdsourcing, increased competition, changed performance evaluation? We are not sure, as any change to the rules of the “game” can, and usually does, introduce new loopholes and ways to “game” the system.

The concept of shareable geoscience data, information products and commodity or analytical computing services has an existing operating precedent in the IT domain – shared services. Shared services could act as a major incentive for participation. An approach would identify the most valuable cross-cutting needs based on community stakeholder input. The team would use this information to develop a demand-driven plan for shared service planning and investment. As an example, a service-based commodity computing platform could be developed to support both the Big Head and the Long Tail, act as an incentive for participation, and perform highly repetitive data exchange operations.

How does one build and sustain a community as large and diverse as the geosciences? 

The ecosystem of geoscience is very complex from a geographic, discipline and skill level point of view. How does one engage so diverse a community in a sustainable manner? “Increased visibility of stakeholder interests will accelerate stakeholder dialogue and alignment – avoiding “dead ends” and pursuing opportunities.” The stakeholders can range from youthful STEM students to stern old school emeritus researchers; from high volume, high frequency producers of macro scale data to a single scientist with a geographically targeted research topic. It is estimated that between 80-85% of the science is done in small projects. That is an enormous intellectual resource that, if engaged, can be made more valuable and productive.

Here is a draft target value chain:

The change or shift puts a large emphasis on upfront collaborative idea generation, team building, knowledge sharing via syndication, and new forms of work decomposition in the context of crowd participation (citizen science and STEM). The recommended change in the value chain begins to accommodate the future needs of the community. However, the value chain becomes actionable based on the capabilities associated with the respective steps. Xentity has taken the liberty of alliteratively defining these four classes of capabilities, or capability clusters, as:

Encouragement, Engagement, Enablement, and Execution.

Encouragement capabilities are designed to incentivize or motivate scientists and data suppliers to participate in the community and to garner their trust. They are designed to increase collaboration and the quality and value of idea generation, and will have a strong network and community building multiplier effect.

Questions

  • How can new scientific initiatives be collaboratively planned for and developed?
  • How can one identify potential collaborators across disciplines?
  • How can one’s scientific accomplishments and recognition be assured and credited?
  • What are the data possibilities and how can I ensure that the data will be readily available?
  • How can scientific idea generation be improved?

Capabilities

  • Incentives based on game theory
  • Collaboration, crowd funding, crowd sourcing and casting
  • Needs Analysis
  • Project Management and work definition
  • Credit for work services

Engagement capabilities include the geoscience participant outreach and communication capabilities required to build and maintain the respective communities within the geoscience areas. These are the services that will give the community the ability to discuss and resolve where the most valued changes will occur within the geosciences community and who else should be involved in the effort.

Questions

  • What participants are developing collaborative key project initiatives?
  • What ideas have been developed and vetted within the broadest set of communities?
  • Who, with similar needs, may be interested in participating in my project?
  • How can Xentity cost share?

Capabilities

  • Customer Relationship Management
  • Promotions
  • Needs Analysis
  • Communications and Outreach
  • Social and Professional Networking

Enablement capabilities are technical and infrastructure services designed to eliminate acquisition, data processing and computing obstacles and save scientists time and resources. They are designed to solve frequently recurring problems that keep a wide variety of geoscience stakeholders from focusing on their core competency – the creation of scientific knowledge. Enablement services, if implemented and supported, will have a strong cost avoidance multiplier effect for the community as a whole.

Questions

  • How does one solve data interoperability challenges for data formats and context?
  • How do I get data into the same geographic coordinate system or scale of information?
  • How can I capture and bundle my metadata and scientific assets to support publication, validation and curation easily?
  • How can I get access to extensible data storage throughout the project lifecycle?
  • Where and how can I develop an application with my team?
  • How can I bundle and store my project datasets and other digital assets for later retrieval?
  • How can I get scalable computing resources without having to procure and manage servers to complete my project?

Capabilities

  • Workflow
  • Process Chaining
  • Data Interoperability
    • Data transformations
    • Semantics
    • Spatial Encoding and Transformation
    • Data Services
  • Publishing
  • Curation
  • Syndication

Execution capabilities comprise the key management disciplines required to support shared infrastructure and services, and to help a highly federated set of valuable assets – the “edges” – become more usable and valuable to the evolving community over time.

Questions

  • How do we collectively determine what information might require a greater future investment?
  • What are the right incentives in the grant processes?
  • What are the future funding models?
  • What models should be invested in?
  • Which technologies should be evaluated for the shared assets?
  • What upcoming shared data or technology needs are in common to a large number of participants?

Capabilities

  • Governance
  • IT Service Management (ITSM)
  • Product Management
  • Performance Management
  • Requirements Management
  • Data Management
  • Data Supply Management
  • Data Life Cycle Management
  • Funding
  • Grants and processing

So, why did we develop these classes of capabilities? 

They represent, at the macro level, a way to organize a much larger group of business, operating and technical services that have been explicitly discussed in NSF EarthCube efforts over the last 3-4 years. We then derived these outputs from analysis and associated them with the most important business drivers. Check out this draft relationship of capabilities, rationale, and drivers:

| Capability | Rationale | Drivers |
| --- | --- | --- |
| Engage | The best way to create communities and identify common needs and objectives, begin to build trust and value awareness, and bring the respective communities into an environment where they can build out their efforts and sustain collaborative approaches. | Agency (how to navigate planned versus emergent change), intellectual property rights, infrastructure winners and losers, agreement on data storage, preservation, curation policies and procedures, incentives to share data and data sharing policies, and trust between data generators and data users. |
| Encourage | The best models to incentivize scientists and data producers to participate and collaborate. Xentity has developed game theory based approaches and large scale customer relationship management solutions. | Social and cultural challenges: motivations and incentives, self-selected or closely-held leadership, levels of participation, types of organizations, and collaboration among domain and IT specialists. |
| Enable | The most costly data processing obstacles – the lowest common denominator, highest impact problem – a common problem found in shared service environments. We have developed enterprise service analysis tools for cost benefit for the DOI geospatial community, so we have seen this work. | 80% of scientist data needs can be expressed as standard data products, and 80% of scientist time is spent getting data into proper form for research analysis. |
| Execute | A governance model that will increase the “edge effect” between the legacy and future capabilities and a very diverse set of communities. Simple planning capabilities that empower scientists to work complex cross-discipline ideas amongst themselves, define work and coordinate with the power of the crowd. We have designed collaborative environments and crowd-based frameworks for data collection and analysis with a corresponding performance management system. | Conceptual and procedural challenges: time (short-term funding decisions versus the long-term time-scale needed for infrastructures to grow); scale (choices between worldwide interoperability and local optimization). |

So why don’t we do it?

Well, this does introduce an outside approach into a close-knit geoscience community that is very used to solving problems for itself. Having a facilitated method from outside consulting, or even teaming with agency operations that have begun moving down this route for their national geospatial data assets, is not seen as something that fits their culture. We are still learning hybrid ways we can collaborate and help the geoscientists set up such a framework, but for now it is still a bit foreign a concept, and while there is some awareness in the geoscientist community of adopting models that work for other sectors, industries, and operational models, the lack of familiarity is causing a lot of hesitation – which goes back to the earn-trust factor and finding ways to demonstrate value.

Til then, we will keep plugging away, connecting with the geoscience community in hopes that we can help them advance their infrastructure, data, and integration to improve earth science initiatives. Until that happens, we will remain one of the few top nations without an operational, enterprise national geoscience infrastructure.

How can we help geoscience to move their data to shared services

Most data generated by science is not intended to be used on broader scale problems outside the originating researcher’s own work or specific domain. Let’s stick with geoscience to check this ‘hypothesis’ out and check on the ‘so what?’ factor.

Current grant and programmatic funding models are not designed to develop shared services or interoperable data for the geosciences.  There are few true shared services that are managed and extended to the community as products and services should be. “We broadly estimate that 20% of users of a dataset might be experts in the field for which it was created, while 80% might be others.” There is currently limited to no incentive for most geoscientists to think beyond immediate needs. “The culture of collaborative Science is just being established.”  Finally, there is no current clear way to build and sustain the large and diverse geosciences community.

The key, we believe, starts not with the tech, the money, the management, or the governance, but with stakeholder alignment, which can be defined as “the extent to which interdependent stakeholders orient and connect with one another to advance their separate and shared interests”.

The geoscientist community spans the Big Head and the Long Tail: geoscientists with affiliated government institutions, academic and international partners; data scientists; data and information stewards, curators and administrators (content and metadata); data product and service managers; Citizen Earth Science participants; and emerging geoscientists found in the STEM community (K-12).

Additionally, the supply chain roles range from:

  • Data and Information Suppliers – NSF funded centers and systems; programmatic producers of geoscience related data and information, i.e. Earth Observation systems like Landsat or MODIS, or specialized information systems that produce value added products like NAWQA; Indonesia NSDI; DOI authoritative data sources and services
  • Cyber Infrastructure community/Development Collaborators – basic and applied research and development, software and system engineers, data managers and analysts
  • Infrastructure Management/Collaborators – IT Service Management (ITSM) – managers and operators of the shared infrastructure and key software services in industry, commercial, government, research, and cost-sharing FFRDCs
  • Consumers – reached via end user workshops: public policy, regulatory, legal and administrative analysts, the private sector, academia (non-participating state), and other science disciplines
  • Executive Sponsorship and Geoscience Governance – various cross-cutting governance communities, too numerous to get into in this blog

The stakeholder themes we have seen in data are generally the same. 

These challenges echo the organizing themes Xentity supported developing 7 years ago for the DOI Geospatial Services Architecture:

  • “I know the information exists, but I can’t find it or access it conveniently”, has its analog in “Considerable difficulties exist in finding and accessing data that already exists” 
  • “I don’t know who else I could be working with or who has the same needs”, has its analog in “Duplication of efforts across directorates and disciplines, disconnect between data and science; data graveyard –useless collection of data…”
  • “If I can find it, can I trust it?”, has its analog in “There is a need to evaluate consistency/accuracy of existing data.”

A start on this, to jump into boring consulting theory, is to develop a clear line of sight to address stakeholder needs and community objectives.

This ensures the analysis engages all the necessary dimensions and relationships within the architecture. Without a strategy like this, good solutions, business or technical, often suffer from lack of adoption, have unintended consequences, or introduce unwanted constraints. The reason for this is a lack of alignment: technology innovators tend not to share the same view of what is beneficial, nor does the geoscientist, who is accustomed to enabling a single or small set of technology directives.

How does one create the shared enterprise view? Using the line of sight, our approach to architecture transformation and analysis creates the framework and operating model. It connects business drivers, objectives, stakeholders, products and services, data assets, systems, models, services, components and technologies. Once the linkages have been established, the team will create the conceptual design using 40-50 geoscience domain investment areas. This will effectively describe the capabilities of the existing IT portfolio. The architecture and the portfolio will be designed to support governance and future transition planning.
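
To make the line-of-sight idea a bit more concrete, here is a minimal sketch (in Python, with made-up entity names and fields, not an actual Xentity deliverable) of how an IT asset could be traced back through capabilities to the business drivers it serves:

```python
from dataclasses import dataclass, field

# A minimal, hypothetical "line of sight" model: each layer points to the layer
# above it, so any system or data asset can be traced back to the business
# drivers that justify investing in it. Names and fields are illustrative only.

@dataclass
class Driver:
    name: str                                   # a business driver statement

@dataclass
class Capability:
    name: str                                   # e.g. "Data Interoperability"
    supports: list[Driver] = field(default_factory=list)

@dataclass
class Asset:
    name: str                                   # a system, service, or dataset
    realizes: list[Capability] = field(default_factory=list)

def line_of_sight(asset: Asset) -> list[str]:
    """Trace an IT asset up through capabilities to the drivers it serves."""
    return [d.name for cap in asset.realizes for d in cap.supports]

# Example usage with made-up entries
driver = Driver("Reduce time spent getting data into proper form for analysis")
interop = Capability("Data Interoperability", supports=[driver])
service = Asset("Shared coordinate transformation service", realizes=[interop])
print(line_of_sight(service))
```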

Sample Ecosystem Edge Analysis

| Human Edges (adaptive systems) | Data and Information Edges | Computing and Infrastructure Edges |
| --- | --- | --- |
| Citizen scientists/STEM and professional scientists | Data supply and information products/services | Centralization and federation of computing infrastructure |
| Geoscience as consumer and producer of data and information | Possessing the data and accessing the data | Commodity computing vs. analytical computing |
| Individual science and collaborative science | Macro scale data vs. micro scale data | Mission driven systems and shared services access |
| Science ideation: piecemeal or segmented vs. holistic | Five data dimensions – spatial (x, y, z), temporal and scale | Domain systems vs. interoperability frameworks |
| Individual vs. collective impact and credit | Authoritative sources vs. free-for-all data | Systems vs. managed services |
| Governance rigidity and flexibility | Data and models vs. products | Big Head and systematic data collection vs. project components |
| Earth science and cyber-infrastructure and engineering | Long Tail vs. Big Head data | |

The line of sight allows for exploring the complexities of geoscientist “ecosystem edges” and architecting for greater interaction and production in the geosciences. Those in the “Long Tail” encounter the same cross-domain access, interoperability, and management barriers as the “Big Head”. Neither has the incentive to develop common enabling data interoperability services, scalable incentive solutions, or common planning approaches, or to increase the participation of the earth science community. Xentity believes architecture is an enabling design service. It is used to empower the user community with the tools to expand its capacities. In this case, Xentity will provide the operating model and architecture framework in a conceptual design to bring together the currently unattended edges. In the long run, the models will provide the emerging governance system the tools to develop investment strategies for new and legacy capabilities.

The Broader Impact

At its core, we believe the geoscience integration challenge is to exploit the benefits and possibilities of the current and future geoscience “ecosystem edge effect”. In the ecosystem metaphor, the conceptual design approach will target the boundary zones lying between the habitats of the various geoscience disciplines and systems. What is needed is an operating model, architectural framework and governance system that can understand the complexities of a geoscientist shared environment and successfully induce the “edge effect”. It needs to balance the well performing aspects of the existing ecosystem with new edges to generate greater dynamism and diversification for all geosciences.

An operating model example: collaborative geoscience planning could make a good demonstration case for the benefits of the “edge effect”. A lot of science efforts are driven by large scale programs or individual research groups who have very little knowledge of who else may be working in the same environmental zones, geographies or even on related topics. A shared planning service could put disparate projects into known time, location and subject contexts, accelerate cross-domain project resource savings, and develop the interdisciplinary cross-pollination required to understand the earth’s systems. An enterprise geoscience initiative could provide a marketplace for geoscientists to shop around for collaborative opportunities. The plans can be exposed in a marketplace to other resources like citizen scientists or STEM institutions. The work can be decomposed so that environments like Amazon’s Mechanical Turk can post, track and monitor distributed tasks.
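
As an illustration only (the record structure and matching rule below are assumptions made for this sketch, not a specification of any existing service), a shared planning service might index project plans by time window, bounding box, and topic, then surface overlaps as collaboration candidates:

```python
from dataclasses import dataclass

# Hypothetical shared-planning records: projects indexed by time window,
# bounding box, and topic. Names and the matching rule are illustrative only.

@dataclass
class ProjectPlan:
    name: str
    topic: str                                       # e.g. "hydrography"
    start_year: int
    end_year: int
    bbox: tuple[float, float, float, float]          # (min_lon, min_lat, max_lon, max_lat)

def collaboration_candidates(a: ProjectPlan, b: ProjectPlan) -> bool:
    """Two plans are candidates if their time windows and bounding boxes
    intersect and they share a topic."""
    time_overlap = a.start_year <= b.end_year and b.start_year <= a.end_year
    bbox_overlap = (a.bbox[0] <= b.bbox[2] and b.bbox[0] <= a.bbox[2] and
                    a.bbox[1] <= b.bbox[3] and b.bbox[1] <= a.bbox[3])
    return time_overlap and bbox_overlap and a.topic == b.topic

plans = [
    ProjectPlan("Riverbank erosion study", "hydrography", 2014, 2016, (-105.5, 39.5, -104.5, 40.5)),
    ProjectPlan("Flood plain sensor network", "hydrography", 2015, 2018, (-105.2, 39.8, -104.8, 40.2)),
]
print([(a.name, b.name) for i, a in enumerate(plans) for b in plans[i + 1:]
       if collaboration_candidates(a, b)])
```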

By recognizing these edges, the architecture will create greater value or energy from the disciplines and improve the creativity, strength and diversity of ideas, and mitigate disruption.  The ecosystem-like design that balances the Big Head with the Long Tail will enable more cost effective geoscience projects and create a higher return on IT investments while collapsing the time to conduct quality impactful science. Most importantly, this will accelerate the realization of the sciences’ impact on other dependent scientific initiatives or time to develop and implement policy.  Xentity sees the potential to use this and other “ecosystem edges” to transform how geoscience is currently conducted. 

Xentity believes a geoscientist, emerging (STEM) or emeritus, would be willing to participate in a cross-cutting, shared service model based on how well these edges are architected and governed. If designed and operated effectively, the edges will create an environment that addresses the two key barriers to adoption: trust and value. In essence, we see the scientists as both consumers and producers.

  • As consumers of data, information and knowledge products and technology services, they are continuously looking to create more knowledge and contribute to social benefit. 
  • As producers they contribute data, information and knowledge back into their colleagues’ knowledge processes.  

In fact, the predominant challenge for such an approach is that the shared service will be developed by the community, who are themselves also its consumers. Just like any other consumer, they will have expectations when they purchase or use a product or a service. If the provider cannot uphold the terms and conditions of product quality or a service agreement, it loses the consumer. So, how does the architecture ensure these “edges” develop and evolve? It must ensure:

How to earn Geoscientists’ Trust

The scientists need to know that they will have highly reliable technical services and authoritative data that are available and perform well when requested. Most importantly, they will need to influence and control who conducts the work within the shared environment and how. They need to ensure the quality of the science and appropriate credit.

How to demonstrate the value to the Geoscientist:

The scientists need the provider to deliver the right products and services – the ones that eliminate the most significant barriers and constraints to doing more and higher quality science (research, analysis and experimentation) with less effort.

In the short term, the shared service challenge is to earn the scientists’ trust and identify the optimal suite of products and services to provision value from the “community resources” as defined in the Layered Architecture. For land elevation products, up to 80% of requests are for standardized products. If done correctly, the governance system, operating model and architecture framework will develop trust and value recognition from the shared community. In the longer term, the models and framework will guide the redirection of its limited resources towards an interoperable set of systems, processes and data.

Great, but even if we create this, how do we fund it?

See the next part, “Will geoscience go for a shared service environment”, which discusses ways to address funding and to engage, encourage, enable, and support execution of these enterprise capabilities for geoscientists.

Why we focus on spatial data science

The I in Information Technology is so broad – why does our first integrated data science problem focus on spatial data? It doesn’t seem to fit when looking at the face of our Services Catalog. We get asked this a lot, and this is our reason. Like geospatial itself, it is multi-dimensional, spanning different ways of thinking, audiences, maturity, progressions, science, modeling, and time:

In green, along the x-axis, is the time progression of public web content. The summary point is that data took the longest period – about 10-15 years – and data can only get better as it matures, now roughly 25 years old on the web. We are in the information period now, but moving swiftly into the knowledge period. Just look at how much more scientific data visualization there is, and how dependent we are on the internet. Just think how much you were on the web in 1998 compared to 15 years later – IT IS IN YOUR POCKET now.

This isn’t just our theory.

RadarNetworks put together the visual of progressing through the web eras. Web 1.0 was websites or Content and early Commerce sites. Web 2.0 raised the web community with blogs and the web began to link collaboratively built information with wikis. Web 3.0 is ushering in the semantic direction and building integrated knowledge.

Even scarier, public web content progression lags several business domains, though not necessarily in this order: intelligence, financial, energy, retail, and large corporate analytics. Meaning, this curve reflects public maturity; those other domains have different and faster curves.

Consider the recent discussions on intelligence analysis linking social/internet data with profiles, Facebook/Google privacy and its use for personalized advertising, the level of detail Salesforce knows about you and why companies pay so much for a license/seat, how energy exploration is optimizing where to drill in some harder-to-find areas, or the absolute complexity and risk of financial derivatives as the world market goes. The way we integrate public content – googling someone, or using the internet to learn more and faster – usually lags these technologies. Reason: those uses do not make money. It is the same reason the DoD invented the internet – it was driven by the security of the U.S., which makes money, which makes power.

So, that digression aside (as we have been told “well, my industry is different”), the public progression does follow a parabolic curve that matches the driving factor of Moore’s Law in IT capability – every 2 years, computing power doubles at the same cost (paraphrasing). The fact that we can do more, faster, at quality levels means we can continue to increase our complexity of analysis, shown in red. And there appears to be a stall as we move toward knowledge, not yet moving toward wisdom. It’s true our knowledge will continue to increase VERY fast, but what we do with that as a society is the “fear” as we move toward this singularity so fast.
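
To put a rough number on that doubling (a back-of-the-envelope illustration only), here is the arithmetic behind the curve:

```python
# Back-of-the-envelope Moore's Law arithmetic: computing power per dollar
# doubles roughly every 2 years, so over N years the factor is 2 ** (N / 2).
for years in (2, 10, 20, 30):
    print(f"{years:>2} years -> about {2 ** (years / 2):,.0f}x compute per dollar")
```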

Fast is an understatement – very fast, even for a logarithmic progression, as it’s hard to emote and digest the magnitude of just how fast it is moving. We moved from:

  • the early 90s, simply placing history up there, experimentation, and having general content with loose hyperlinking and web logs
  • to the late 90s, conducting eCommerce, doing math/financial interaction modeling and simulations, and building product catalogs with metadata that allowed us to relate items and say that if a user found that quality or metadata in something, they might like something else over here
  • to the early 2000s, engineering solutions including social and true community solutions that began to build on top of relational data and the network effect, use semantics, and continually share content on timelines and where a photo was taken as GPS devices began to appear in our pockets
  • to the 2010s, or today, where we are looking for new ways to collaborate, find new discoveries in the cloud, and use the billions and billions of sensors and data streams to create more powerful, more knowledgeable applications

Another way to digest this progression is via the table below.

| Web Version | Time | DIKW | Web Maturity | Knowledge Domain Leading Web | Data Use Model on Web | Data Maturity on Web |
| --- | --- | --- | --- | --- | --- | --- |
| 0.9 | early 90s | Data | Content | History | Experimental | Logs |
| 1.0 | 1995+ | Info | | History | Experimental | Content |
| 1.1 | 1997 | | | Math | Experimental | Relational |
| 1.2 | 1999+ | | +Commerce | Math | Hypothetical | Metadata |
| 1.3 | 2002 | | | Engineering | Hypothetical | Spatial |
| 2.0 | 2005+ | Knowledge | +Community | Engineering | Computational | Temporal |
| 2.1 | 2010s | | | Engineering | Computational | Semantic |
| 3.0 | 2015 and the predictable web | Knowledge | +Collaboration | Science | Data as 4th paradigm | TempoSpatial (goes public) |
| 4.0 | 2020-2030 | Wisdom in sectors | Advancing collaboration with 3rd world core | Advancing science into shared services (Philosophical is out-year) | Robot/Ant data quality | Sentiment and Predictive (goes public/useful; Sensitive is out-year) |

Now, think of the last teenager who could maintain eye contact in a conversation with an adult while holding a phone in their hand and not be distracted by the Pavlovian response to a text, tweet, Instagram, etc. Now imagine, ten years from now, when it’s not tidbits of data but, as a call comes in, auto-searches on terms they aren’t aware of come up in augmented reality – advice on how to react to the sentiment they just received, not just the information. The emotional knowledge quotient will be “Google Now” – “What do I do when?” – versus critical thinking and live-and-learn.

So, taking it back to the “now”, though this blog is lacking specific citations (blogs do allow us to cheat, but our research sources will make sure to detail and source our analysis): if you agree that spatial mapping for professionals occurred in the early 2000s, agree that it has now hit the public, and understand that spatially tagging data has passed the tipping point with the advent of smartphones, map apps, local scouts, augmented reality directions, and multi-dimensional modeling integrating GIS and CAD with the web, then you can see the data science maturity stage we are in that has the largest impact right now is – Geospatial.

Geospatial data is different. Prior to geospatial, data is non-dimension-based. It has many attributable and categorical facets, but it does not have to be stored in a mathematical or picture form with a specific relation to a position on the earth. Spatial data – GIS, CAD, lat/longs – has to be stored in numerical fashion in order to calculate upon it. Furthermore, it has to be related to a grounding point. Essentially, geospatial is storing vector maps or pixel maps. When you begin to put that together for tens of millions of streams, you get a very large, complicated, spatially referenced hydrography dataset. It gets even more complicated when you overlay 15-minute time-based data such as water attributes (flow, height, temperature, quality, changes, etc.) with that. It is even more complicated when you combine that data with other dimensions such as earth elevations and need to relate across domains of science, speaking different languages, to be able to calculate how fast water may carry a certain contaminant down a slope after a river bank or levee collapses.
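
To illustrate why spatial data has to be stored numerically and tied to a grounding point (a minimal sketch, not tied to any particular GIS package), here is the standard haversine great-circle distance between two lat/long points; the coordinates below are made up and stand in for two stream gauges:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/long points on a sphere."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))   # 6371 km = mean Earth radius

# Example: distance between two made-up stream gauge locations
print(round(haversine_km(39.74, -104.99, 40.01, -105.27), 1), "km")
```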

Before we can get to those more complex scenarios, geospatial data is the next progression in data complexity.

That said, definitely check out our Geospatial Integrated Services and Capabilities

Geo is more than a dot on the map

Geo is more than a dot on the map – it is the next progression in information. Xentity has combined the below services to maximize the value of geospatial programs, products, systems, and workforce. We have found common, re-usable, and tailorable patterns and issues applicable to small and large public and private institutions alike.

Xentity’s staff has been providing geospatial strategy, planning and solution expertise to large public and commercial organizations for the last 15 years. Our goal is to maximize the value of geospatial products, services, systems, technologies and data for our clients and their customers. We have observed, first hand, patterns of critical issues that have hampered our customers’ performance. These issues are found in the geospatial business areas of Geospatial Planning, Production and Data Lifecycle Management, and Service Delivery, as well as in advising executive or program leadership on managing and executing such complex and complicated changes.

We have quite a diverse Geospatial Client Portfolio from:

  • Geospatial Government Programs: NGA, DOI, USGS, BLM, EPA, The National Map, Core Science, Indonesia SDI
  • Private Sector: Space Imaging (GeoEye), Intrawest, Denver Metro Utilities, Cloud-based Geospatial CRM

Our approach and expertise has been developed by providing impactful analysis resulting in practical solutions to these patterns. Our proven approach to problem solving encompasses the best practices of management consulting, enterprise architecture planning coupled with geospatial domain knowledge.

Our Geospatial services are:

  • Geospatial Planning Analysis Service
  • Geospatial Production and Data Lifecycle Management Analysis
  • Geospatial Service Delivery Analysis
  • Overall Geospatial Strategic and Tactical Management Consulting

Geospatial Planning Analysis Service

Customer Relationship Management: Work with your customers to establish clearly defined needs and the benefits to help you prioritize your efforts and create greater value for your products and services.

  • For the US Geological Survey Land and Remote Sensing program and the National Geospatial Program, Xentity led the development of business practice changes that focused on bringing external customer segments into the program planning processes, leading to technological and product innovations and improved communications.

Cost Benefits Analysis of Geospatial: Optimizing the value of geospatial assets is often very complex due to their versatility and diverse uses. Organizations typically have numerous and varied stakeholders with similar requirements to exploit valuable data and powerful technologies. Xentity offers executive level planning and advisory services to clients to identify value opportunities, assess their strategic importance, and measure subsequent performance.

  • At the US Department of the Interior, Xentity provided a cost benefit study to establish return on investment for service deployment – the analysis yielded 10:1 benefits.

Data Acquisition and Quality Planning: Align your program data requirements with the available partners and volunteer data, and ensure the data flows through the system in an efficient, qualified and cost effective manner. Ensure you retain data in compliance with existing records, archive and use policies.

  • Developed and implemented several acquisition strategies for the USGS National Geospatial Program that led to improved data quality and long term maintenance for multiple scale datasets.

Funding Planning:  These are challenging economic times that have seen a considerable decrease in government geospatial funds even when the value of geospatial information has not been fully exploited.  Now more than ever it is critical to be able to determine and communicate the value and impact of geospatial activities internally, to your customers and with your executives. 

  • For USGS and its use of FGDC CAP Grants, Xentity designed an overall acquisition planning model that linked production tracking, multiple purchasing authorities, and locations of existing inventory and purchase targets to help increase ROI on geospatial data acquisition.

Geospatial Production and Data Lifecycle Management Analysis

Concept of Operations Development – Evaluate your existing production processes for new, more cost effective ways to move higher volumes of data through the system. Minimize the number of times the data is accessed.

  • For Indonesia BAKOSURTANAL NSDI, Xentity staff designed a shared production Environment across disparate incubating programs. Xentity has done similar work for DOI, BLM, USGS, EPA, and numerous private firms.

GeoData Quality Lifecycle Analysis – Synchronize the sources to increase the integration qualities of the data or to improve the quality (accuracy, currentness) of the data.

  • US National Geospatial Program – using DLCM, sourced data from the same suppliers to meet multiple program goals and provided consistency to the user community.

Geodata Use Preparation – Optimize how to prepare data for service consumption and use as a part of your production flow.

  • BAKOSURTANAL instituted a publishing model where data was integrated across 17 ministries prior to catalog publishing

Service Delivery Analysis

Xentity has provided architectural services to the USGS, DOI and Data.gov on geospatial metadata management, discovery and visualization. We have also conducted information sharing on re-usable information service delivery patterns with multiple agencies and international partners such as Canada, Australia, and Indonesia.

Geospatial Metadata Discovery Architectures – Improve how your data and products are described (metadata), discovered (cataloging), accessed and used in the online world. Integrate your efforts with open government initiatives.

  • For USGS Core Science Systems, amongst 4 major tenets, Xentity created a path/blueprint for moving to an integrated clearinghouse and harvest solution integrating sciencebase.gov and an existing metadata clearinghouse. We worked with Geospatial One Stop, Data.gov and USGS ScienceBase catalogs in support of digital data delivery.

Geospatial Delivery Application Architecture – Implement standardized application frameworks and specifications to support data access and rapid transition to new products and services or delivery methods.

  • Solution Architecture design for National Geospatial Program and BAKOSURTANAL for data services and delivery.

Geospatial Delivery Solutions Architecture – Design the access methods for online service delivery or download for all forms of products and services (vector, raster, data file formats, big data), from real-time service/stream access for application integration patterns, to bulk cloud computing, to big data discovery search indices with geospatially integrated interpretive signalling.

  • For USGS delivery, moving to an eCommerce model for products, moving to staged products, and improving web service access, architecture and standards for application integration has produced significant IT cost reduction and a 10% monthly increase in product download and service access use over 4 years.

Geospatial Delivery Prototype & Research Application Development – Our architects and developers have capabilities to bring designs to life. Experts in prototyping and early phase technology selection and demonstration in BigData, Visualization, Modeling, Service, Discovery, Semantics, and informatics solutions.

We blend this with our Agile project business management capabilities for rapid planning, scrum management, sprint planning and tracking, and tying back to the aforementioned requirements, line of sight, levels of architecture, and ITIL deployment requirements.

Our goal is to provide services, software tools, and automation procedures to assist in continued appropriate innovation and to keep your research, prototypes, or beta on track.

  • USGS: initial The National Map base map development in early research
  • USGS: cloud migration analysis with a few dozen pilots in storage, inventory, services, APIs, bigdata, basemaps and more
  • Rapid prototyping of BigData indices and interpretive signals for millions upon millions of records and complicated discovery requirements, replacing traditional RDBMS
  • Visualization application stacks and mashup developments in ESRI JS API, CartoDB, OpenLayers, ArcGIS Online, CKAN, Socrata, ESRI GeoDataPortal, PostGIS, Leaflet, GeoServer, and many more JS APIs, map servers, and basemap engines

Geospatial User-Centered Designs – Improve how your users access your digital data and mapping products and keep them aware as your content changes. Design solutions that have notification models for areas of interest, are topically aware, and balance popularity and other qualities and repeatable patterns unique to geospatial.

  • Space Imaging provided spatial notifications to customers based on area, product and intended use criteria

Geospatial Vendor Product Evaluation & Reseller Specialization – Xentity has evaluated many geospatial product lines in geospatial data, production, product generation, service platforms, and applications/APIs. Xentity has the capability to evaluate product lines as well as re-sell products.

  • Xentity has evaluated geospatial products and architectures in open source, COTS, and GOTS with and for USGS, GSA, DOI, a Salesforce AppExchange company, a Denver Metro Utilities company, and more. See Xentity Partners for products Xentity can resell today.

What makes Geospatial so different?

Why do we promote spatial at such a level? Read on in “Why we focus on spatial data science”…

What does geodata.gov mean to data.gov

During the First International Open Government Data Conference in November 2010, Xentity Geospatial Services and Architect Lead Jim Barrett had the opportunity to present alongside colleagues such as the OMB Federal CIO, the Federal CTO, Sir Tim Berners-Lee, and several other name-dropping figures in this space.

Jim, at the time part of the XPN as an independent consultant, presented on our recent conceptual architecture work for data.gov that looked to integrate the previous administration’s geodata.gov. Geodata.gov open data registrations account for over 80% of all data in data.gov, so it is by all means a major factor in where data.gov would need to focus.

The conference appears to continue as a bi-annual event, with the last one held in July 2012.

The following captures the extended version of his presentation: