How can we help geoscience move its data to shared services

Blog post added by Wiki Admin

Most data generated by scientists is not intended to be used on broader-scale problems outside of their own research or their specific domain. Let's stick with geoscience to test this 'hypothesis' and check on the 'so what?' factor.

Current grant and programmatic funding models are not designed to develop shared services or interoperable data for the geosciences. There are few true shared services that are managed and extended to the community as products and services should be. “We broadly estimate that 20% of users of a dataset might be experts in the field for which it was created, while 80% might be others.” There is currently little to no incentive for most geoscientists to think beyond immediate needs. “The culture of collaborative Science is just being established.” Finally, there is no clear way to build and sustain the large and diverse geosciences community.

The key, we believe, starts not with the tech, the money, the management, or the governance, but with stakeholder alignment, which can be defined as “the extent to which interdependent stakeholders orient and connect with one another to advance their separate and shared interests”.

The geoscientist community – the Big Head and the Long Tail: geoscientists affiliated with government institutions, academic and international partners; data scientists; data and information stewards, curators and administrators (content and metadata); data product and service managers; citizen earth science participants; and emerging geoscientists found in the STEM (K-12) community.

Additionally, the supply chain includes roles such as:

  • Data and Information Suppliers – NSF-funded centers and systems; programmatic producers of geoscience-related data and information (e.g., Earth observation systems like Landsat or MODIS, or specialized information systems that produce value-added products like NAWQA); the Indonesia NSDI; DOI authoritative data sources and services
  • Cyber Infrastructure Community/Development Collaborators – basic and applied research and development, software and system engineers, data managers and analysts
  • Infrastructure Management/Collaborators – IT Service Management (ITSM): managers and operators of the shared infrastructure and key software services in industry, commercial, government, research, and cost-sharing FFRDCs
  • Consumers – reached via end-user workshops; public policy, regulatory, legal and administrative analysts; the private sector; academia (non-participating states); other science disciplines
  • Executive Sponsorship and the various cross-cutting geoscience governance communities – too numerous to get into in this blog, at least

The stakeholder themes we have seen in data are generally the same. 

These challenges echo the organizing themes Xentity helped develop seven years ago for the DOI Geospatial Services Architecture:

  • “I know the information exists, but I can’t find it or access it conveniently” has its analog in “Considerable difficulties exist in finding and accessing data that already exists”
  • “I don’t know who else I could be working with or who has the same needs” has its analog in “Duplication of efforts across directorates and disciplines, disconnect between data and science; data graveyard – useless collection of data…”
  • “If I can find it, can I trust it?” has its analog in “There is a need to evaluate consistency/accuracy of existing data.”

A start on this, to jump into boring consulting theory, is to develop a clear line of sight that addresses stakeholder needs and community objectives.

This ensures the analysis engages all the necessary dimensions and relationships within the architecture. Without a strategy like this, good solutions, business or technical, often suffer from lack of adoption, have unintended consequences, or introduce unwanted constraints. The reason is a lack of alignment: technology innovators tend not to share the same view of what is beneficial, nor does the geoscientist, who is accustomed to enabling a single or small set of technology directives.

How does one create the shared enterprise view? Using the Line of Sight, our approach to architecture transformation and analysis creates the framework and operating model. It connects business drivers, objectives, stakeholders, products and services, data assets, systems, models, services, components and technologies. Once the linkages have been established, the team will create the conceptual design using 40-50 geoscience domain investment areas. This will effectively describe the capabilities of the existing IT portfolio. The architecture and the portfolio will be designed to support governance and future transition planning.
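To make the Line of Sight less abstract, here is a minimal sketch of the linkage idea in Python. The element names and the single chain shown are invented for illustration only – they are not drawn from an actual portfolio – and a real model would cover the full 40-50 investment areas rather than one path.

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """One node in the architecture: a driver, objective, stakeholder,
    product/service, data asset, system, or technology."""
    name: str
    kind: str
    supports: list = field(default_factory=list)  # downstream elements this traces to

def line_of_sight(element, depth=0):
    """Print the traceability chain from a business driver down to systems."""
    print("  " * depth + f"{element.kind}: {element.name}")
    for child in element.supports:
        line_of_sight(child, depth + 1)

# Hypothetical example of one chain out of the 40-50 geoscience investment areas
elevation_service = Element("Elevation delivery API", "System")
dem_asset = Element("National elevation dataset", "Data Asset", [elevation_service])
hazard_product = Element("Flood hazard product", "Product/Service", [dem_asset])
modelers = Element("Hydrology modelers", "Stakeholder", [hazard_product])
objective = Element("Reduce time-to-science for hazard response", "Objective", [modelers])
driver = Element("Shared geoscience services mandate", "Business Driver", [objective])

line_of_sight(driver)
```

The point of keeping the linkage explicit is that any system or data asset with no path back to a driver, and any driver with no path down to a system, shows up immediately as a gap in the portfolio.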

Sample Ecosystem Edge Analysis

| Human Edges (adaptive systems) | Data and Information Edges | Computing and Infrastructure Edges |
| Citizen scientists/STEM and professional scientists | Data supply and information products/services | Centralization and federation of computing infrastructure |
| Geoscience as consumer and producer of data and information | Possessing the data and accessing the data | Commodity computing vs. analytical computing |
| Individual science and collaborative science | Macro-scale data vs. micro-scale data | Mission-driven systems and shared services access |
| Science ideation: piecemeal or segmented vs. holistic | Five data dimensions – spatial (x, y, z), temporal and scale | Domain systems vs. interoperability frameworks |
| Individual vs. collective impact and credit | Authoritative sources vs. free-for-all data | Systems vs. managed services |
| Governance rigidity and flexibility | Data and models vs. products | Big Head and systematic data collection vs. project components |
| Earth science and cyber-infrastructure and engineering | Long Tail vs. Big Head data | |

 

The Line of Sight allows for exploring the complexities of the geoscience “ecosystem edges” and for architecting greater interaction and production in the geosciences. Those in the “Long Tail” encounter the same cross-domain access, interoperability, and management barriers as the “Big Head”. Neither has the incentive to develop common enabling data interoperability services, scalable incentive solutions, or common planning approaches, or to increase the participation of the earth science community. Xentity believes architecture is an enabling design service. It is used to empower the user community with the tools to expand its capacities. In this case, Xentity will provide the operating model and architecture framework in a conceptual design to bring together the currently unattended edges. In the long run, the models will provide the emerging governance system the tools to develop investment strategies for new and legacy capabilities.

The Broader Impact

At its core, we believe the geoscience integration challenge is to exploit the benefits and possibilities of the current and future geoscience “ecosystem edge effect”. In the ecosystem metaphor, the conceptual design approach will target the boundary zones lying between the habitats of the various geoscience disciplines and systems. What is needed is an operating model, architectural framework and governance system that can understand the complexities of a shared geoscience environment and successfully induce the “edge effect”. It needs to balance the well-performing aspects of the existing ecosystem with new edges to generate greater dynamism and diversification for all geosciences.

An operating model example: collaborative geoscience planning could make a good demonstration case for the benefits of the “edge effect”. A lot of science efforts are driven by large-scale programs or individual research groups who have very little knowledge of who else may be working in the same environmental zones, geographies or even on related topics. A shared planning service could put disparate projects into known time, location and subject contexts, accelerate cross-domain project resource savings, and develop the resulting interdisciplinary cross-pollination required to understand the earth’s systems. An enterprise geoscience initiative could provide a marketplace for geoscientists to shop around for collaborative opportunities. The plans can be exposed in a marketplace to other resources like citizen scientists or STEM institutions. The work can be decomposed so that environments like Amazon’s Mechanical Turk can post, track, and monitor distributed tasks.
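As a thought experiment only – the project names, years, subjects, and bounding boxes below are invented – the core of such a shared planning service is simply indexing plans by time, location, and subject and surfacing the overlaps:

```python
from dataclasses import dataclass

@dataclass
class ProjectPlan:
    name: str
    start_year: int
    end_year: int
    bbox: tuple    # (min_lon, min_lat, max_lon, max_lat)
    subjects: set  # e.g. {"hydrology", "sediment transport"}

def overlaps(a: ProjectPlan, b: ProjectPlan) -> bool:
    """True when two plans intersect in time, space, and at least one subject."""
    time_overlap = a.start_year <= b.end_year and b.start_year <= a.end_year
    space_overlap = (a.bbox[0] <= b.bbox[2] and b.bbox[0] <= a.bbox[2] and
                     a.bbox[1] <= b.bbox[3] and b.bbox[1] <= a.bbox[3])
    return time_overlap and space_overlap and bool(a.subjects & b.subjects)

def collaboration_candidates(new_plan, registry):
    """Surface existing projects a geoscientist could join instead of duplicating."""
    return [p for p in registry if overlaps(new_plan, p)]

# Hypothetical registry entries
registry = [
    ProjectPlan("Watershed nutrient study", 2014, 2016, (-105.5, 39.5, -104.5, 40.5), {"hydrology"}),
    ProjectPlan("Alpine snowpack survey", 2015, 2017, (-106.5, 38.5, -105.0, 40.0), {"snow", "hydrology"}),
]
mine = ProjectPlan("Streamflow sensor deployment", 2015, 2015, (-105.2, 39.7, -104.9, 40.1), {"hydrology"})
print([p.name for p in collaboration_candidates(mine, registry)])
```

Anything richer – task decomposition, crowdsourcing, credit tracking – layers on top of this same time/location/subject index.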

By recognizing these edges, the architecture will create greater value or energy from the disciplines, improve the creativity, strength and diversity of ideas, and mitigate disruption. An ecosystem-like design that balances the Big Head with the Long Tail will enable more cost-effective geoscience projects and create a higher return on IT investments while collapsing the time to conduct quality, impactful science. Most importantly, this will accelerate the realization of the sciences’ impact on other dependent scientific initiatives and shorten the time to develop and implement policy. Xentity sees the potential to use this and other “ecosystem edges” to transform how geoscience is currently conducted.

Xentity believes a geoscientist, emerging (STEM) or emeritus, would be willing to participate in a cross-cutting, shared-service model based on how well these edges are architected and governed. If designed and operated effectively, the edges will create an environment that addresses the two key barriers to adoption: trust and value. In essence, we see the scientists as both consumers and producers.

  • As consumers of data, information and knowledge products and technology services, they are continuously looking to create more knowledge and contribute to social benefit. 
  • As producers they contribute data, information and knowledge back into their colleagues’ knowledge processes.  

In fact, the predominant challenge for such an approach is that the shared service will be developed by the community, who are themselves consumers. Just like any other consumer, they will have expectations when they purchase or use a product or a service. If one cannot uphold the terms and conditions of product quality or a service agreement, one loses the consumer. So, how does the architecture ensure these “edges” develop and evolve? It must ensure:

How to earn Geoscientists’ Trust

The scientists need to know that they will have highly reliable technical services and authoritative data that are available and perform well when requested. Most importantly, they will need to influence and control who conducts the work within the shared environment and how. They need to be assured of the quality of the science and of appropriate credit.

How to demonstrate the value to the Geoscientist:

The scientists need the provider to deliver the right products and services – those that eliminate the most significant barriers and constraints to doing more and higher-quality science (research, analysis and experimentation) with less effort.

In the short term, the shared service challenge is to earn the scientists’ trust and identify the optimal suite of products and services to provision value from the “community resources” as defined in the Layered Architecture. For land elevation products, up to 80% of requests are for standardized products. If done correctly, the governance system, operating model and architecture framework will develop trust and value recognition from the shared community. In the longer term, the models and framework will guide the redirection of its limited resources towards an interoperable set of systems, processes and data.

Great, but even if we create this, how do we fund it?

See the next part on “Will geoscience go for a shared service environment” which discusses ways to address funding, ways to engage, encourage, enable, and support execution of these enterprise capabilities for geoscientists. 

Why we focus on spatial data science

Blog post edited by Matt Tricomi

The “I” in Information Technology is so broad – why does our first integrated data science focus land on spatial data? It doesn’t seem to fit when looking at the face of our Services Catalog. We get asked this a lot, and this is our reason. Like geospatial itself, the answer is multi-dimensional, spanning different ways of thinking, audiences, maturity, progressions, science, modeling, and time:

 

In green, along the x-axis, is the time progression of public web content. The summary point is that data took the longest period – about 10-15 years – and data can only get better as it matures, now roughly 25 years old on the web. We are in the information period now, but moving swiftly into the knowledge period. Just see how much more we depend on scientific data visualizations and on the internet itself. Think how much you were on the web in 1998 compared to 15 years later – IT IS IN YOUR POCKET now.

This isn’t just our theory.

RadarNetworks put together a visual of the progression through the web eras. Web 1.0 was websites – content and early commerce sites. Web 2.0 raised the web community with blogs, and the web began to link collaboratively built information with wikis. Web 3.0 is ushering in the semantic direction and building integrated knowledge.

Even scarier, public web content progression lags several business domains – not necessarily in this leading order: intelligence, financial, energy, retail, and large corporate analytics. Meaning, this curve reflects public maturity; those other domains have different and faster curves.

Consider the recent discussions on intelligence analysis linking social/internet data with profiles, Facebook/Google privacy and its use for personalized advertising, the level of detail Salesforce knows about you and why companies pay so much per license/seat, how energy exploration optimizes where to drill in harder-to-find areas, or the absolute complexity and risk of financial derivatives in the world market. Next to these, how we integrate public content – googling someone, or using the internet to learn more and faster – usually lags. Reason: the public uses do not make money. It is the same reason the DoD invented the internet – it was driven by the security of the U.S., which makes money, which makes power.

So, that digression aside (as we have been told “well, my industry is different”), the public progression does follow an exponential curve that matches the driving factor of Moore’s Law in IT capability – every two years, computing power doubles at the same cost (paraphrasing). The fact that we can do more, faster, at quality means we can continue to increase the complexity of our analysis, shown in red. And there appears to be a stall – not as we move toward knowledge, but in moving toward wisdom. It’s true our knowledge will continue to increase VERY fast, but what we do with that as a society is the “fear” as we move towards this singularity so fast.
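Taking that paraphrased doubling rule at face value, a quick back-of-the-envelope calculation (illustrative only, not a claim about any particular hardware) shows why the curve feels so steep:

```python
# Paraphrased Moore's Law: capability doubles every 2 years at the same cost.
def relative_capability(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

# From 1998 ("barely on the web") to 2013 ("in your pocket") is 15 years:
print(relative_capability(15))   # ~181x the computing capability per dollar
print(relative_capability(30))   # ~32768x over 30 years
```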

Fast is an understatement – very fast even for an exponential progression, as it is hard to emote and digest the magnitude of just how fast it is moving. We moved from:

  • the early 90s, simply placing history up there, experimenting, and having general content with loose hyperlinking and web logs;
  • to the late 90s, conducting eCommerce, doing math/financial interaction modeling and simulations, and building product catalogs with metadata that allowed us to relate items – if a user liked that quality or metadata in something, they might like something else over here;
  • to the early 2000s, engineering solutions, including social and true community solutions that began to build on top of relational data and the network effect, use semantics, and continually share content on timelines and where a photo was taken as GPS devices began to appear in our pockets;
  • to the 2010s, or today, where we are looking for new ways to collaborate, find new discoveries in the cloud, and use the billions and billions of sensors and data streams to create more powerful, more knowledgeable applications.

Another way to digest this progression is via the table below.

| Web Version | Time | DIKW | Web Maturity | Knowledge Domain Leading Web | Data Use Model on Web | Data Maturity on Web |
| 0.9 | early 90s | Data | Content | History | Experimental | Logs |
| 1.0 | 1995+ | Info | | History | Experimental | Content |
| 1.1 | 1997 | | | Math | Experimental | Relational |
| 1.2 | 1999 | | +Commerce | Math | Hypothetical | Metadata |
| 1.3 | 2002 | | | Engineering | Hypothetical | Spatial |
| 2.0 | 2005+ | Knowledge | +Community | Engineering | Computational | Temporal |
| 2.1 | 2010s | | | Engineering | Computational | Semantic |
| 3.0 | 2015 and predictable web | Knowledge | +Collaboration | Science | Data as 4th paradigm | TempoSpatial (goes public) |
| 4.0 | 2020-2030 | Wisdom in sectors | Advancing Collaboration with 3rd world core | Advancing Science into Shared Services (Philosophical is out-year) | Robot/Ant data quality | Sentiment and Predictive (goes public/useful; Sensitive is out-year) |

Now, think of the last teenager who could maintain eye contact in a conversation with an adult while holding a phone in their hand and not be distracted by the Pavlovian response to a text, tweet, Instagram, etc. Now imagine, ten years from now, when it’s not tidbits of data but, as a call comes in, auto-searches on terms they aren’t aware of appearing in augmented reality – advice on how to react to the sentiment they just received, not just the information. The emotional knowledge quotient will be Google now – “What do I do when?” – versus critical thinking and live-and-learn.

So, taking it back to the “now” – though this blog is lacking specific citations (blogs do allow us to cheat, but our research sources will detail and source our analysis) – if you agree that spatial mapping for professionals arrived in the early 2000s, agree that it has now hit the public, and understand that spatially tagging data has passed the tipping point with the advent of smartphones, map apps, local scouts, augmented-reality directions, and multi-dimensional modeling integrating GIS and CAD with the web, then you can see that the data science maturity stage with the largest impact right now is – geospatial.

Geospatial data is different. Prior to geospatial, data is non-dimensional. It has many attribute and categorical facets, but it does not have to be stored in a mathematical or picture form with a specific relation to a position on the earth. Spatial data – GIS, CAD, lat/longs – has to be stored in numerical fashion in order to calculate upon it. Furthermore, it has to be related to a grounding point. Essentially, geospatial means storing vector maps or pixel maps. When you begin to put that together for tens of millions of streams, you get a very large, complicated, spatially referenced hydrography dataset. It gets even more complicated when you overlay 15-minute time-based data such as water attributes (flow, height, temperature, quality, changes, etc.). It is more complicated still when you combine that data with other dimensions such as earth elevations and need to relate across domains of science, speaking different languages, to calculate how fast water may carry a certain contaminant down a slope after a riverbank or levee collapses.
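As a rough, hypothetical illustration – the coordinates, identifier, and observations below are made up – here is what “grounded and numerically computable” means in practice for a single stream reach with 15-minute water attributes layered on top of its geometry:

```python
# A single (hypothetical) stream reach: geometry grounded to earth coordinates,
# plus 15-minute time-based water attributes layered on top of it.
stream_reach = {
    "type": "Feature",
    "geometry": {                       # vector data: must be numeric lon/lat pairs
        "type": "LineString",
        "coordinates": [[-105.10, 39.98], [-105.08, 39.99], [-105.05, 40.01]],
    },
    "properties": {
        "reach_id": "example-0001",     # illustrative identifier, not a real reach code
        "observations": [               # temporal dimension overlaid on the spatial one
            {"time": "2013-06-01T00:00Z", "flow_cfs": 212.0, "temp_c": 11.4},
            {"time": "2013-06-01T00:15Z", "flow_cfs": 215.5, "temp_c": 11.3},
        ],
    },
}

def length_degrees(coords):
    """Crude planar length in degrees; real analysis needs a projection and datum."""
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(coords, coords[1:]))

print(length_degrees(stream_reach["geometry"]["coordinates"]))
```

Multiply that one feature by tens of millions of reaches, add elevation surfaces and cross-domain vocabularies, and the storage and computation problem looks nothing like an attribute-only table.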

Before we can get to those more complex scenarios, though, geospatial data is the next progression in data complexity.

That said, definitely check out our Geospatial Integrated Services and Capabilities

USGS executes Option year number one for Architecture IDIQ

Blog post edited by Wiki Admin

Today, Xentity was awarded Option Year #1 for the U.S. Geological Survey IDIQ for Enterprise and Solution Architecture. This is the 6th year Xentity has provided outstanding Geospatial Integrated Services and Capabilities and architecture services to the USGS, and the results have continued. If interested in leveraging our IDIQ, read the background below and review our USGS IDIQ for Enterprise and Solution Architecture.

USGS National Geospatial Program Architecture Background

Xentity provides segment architecture development support in Program and Product Planning, Geospatial Data Acquisition and Production Lifecycle Management, Delivery Services, and Resource Management. Xentity also supported analyzing strategic planning and its relation to architecture. EA development was based on the defined Federal Segment Architecture Methodology (FSAM). As well, Xentity supported the transition from development to implementation, which included such services as:

  • Blueprint Roadmap Development
  • Governance Formation, Analysis and Consulting
  • Solution Architecture Management Consulting
  • Operational Quality Analysis – Capacity, Availability, Continuity Analysis
  • Business Activity and Role Analysis
  • Planning Support and Consulting
  • Implementation Facilitation Support
  • Change Management and Consulting
  • Service-Level Management Consulting

Additionally, Xentity assisted the USGS in creating a communication product plan. This included:

  • Extraction of the communication products and communication product activities into a singular unified communication product plan.
  • Documented advice and tactics to product and service leads about opportunities for unifying and coordinating events for the purpose of (1) increasing presence relating to TNM and USGS brands and (2) making communication product spending more efficient.
  • Documented advice about strategic communication techniques and products, including but not limited to branding rollout, key differentiator message support, maturity road map support, conducting customer meetings, and capturing stakeholder positioning statements for products.
  • Training on newer communication concepts such as social media

Xentity Performance:

Xentity has had 100% deliverable acceptance by USGS, with all deliverables on time and on budget. Of the 4 blueprints, the resulting 200+ milestones are now being implemented under a Xentity-supported Program Management Office, and improvements are being seen in all areas. With Xentity’s support in providing solution architecture patterns, over 100 IT assets are slated to be retired while usership has still increased, at a lower cost. The NGP Director implemented some organizational management improvements, of which some were based on the Xentity-supported resource management roadmap. USGS NGP has shifted to a prioritized, stakeholder-driven model to help better invest in the data, product, and service content and features the users want and need. Operation Centers have taken to using new internal management processes and toolsets such as online document management collaboration and wikis, online issue/task tracking, migration of some NGP products and services to an ITIL Service Desk “triage” model with a much shorter and more reasonable response period, and some early adoption of Agile project management techniques to increase output.

In the area of communications, even slow adoption of the new concepts Xentity introduced on this contract is yielding large benefits. The services ranged from supporting the inaugural, few-hundred-person The National Map user conference – with a very well received brand treatment and event strategies, plus pre-, during-, and post-event functions such as A/V coordination, feedback mechanisms, and last-minute event communication product generation – to standing up new social media accounts and other tools as recommended in the product communication sequence plans. Product and Service Leads have received multiple communication training sessions, with concepts that included coaching on influencing/sales skills, consulting on visual identity compliance, and advising on timing, communication paths, relationships, and critical success factors.

User relevancy is critical: prior to 2008, usage was actually on the decline. Search the internet for The National Map User Conference, the inaugural event Xentity directly supported in setting up for May 2011; afterwards, program usership and product access rose by as much as 3x, but generally as a 25% uptick in usage. Delivery usage saw a jump post-conference and continues to grow. The delivery solutions architecture has continued to see higher usership and relevance to its users, all while not increasing the IT footprint – in many places decreasing it – while increasing service operation qualities.

Geo is more than a dot on the map

Blog post added by Wiki Admin

Geo is more than a dot on the map – it is the next progression in information. Xentity has combined the below services to maximize the value of geospatial programs, products, systems, and workforces. We have found common, re-usable, and tailorable patterns and issues applicable to small and large public and private institutions.

 

Xentity’s staff has been providing geospatial strategy, planning and solution expertise to large public and commercial organizations for the last 15 years. Our goal is to maximize the value of geospatial products, services, systems, technologies and data for our clients and their customers. We have observed, first hand, patterns of critical issues that have hampered our customers’ performance. These issues are found in the Geospatial business areas of Geospatial Planning, Production and Data Lifecycle Management, and Service Delivery as well as in advising the executive or program leadership to manage and execute such complex and complicated changes. 

We have quite a diverse Geospatial Client Portfolio from:

  • Geospatial Government Programs: NGA, DOI, USGS, BLM, EPA, The National Map, Core Science, Indonesia SDI
  • Private Sector: Space Imaging (GeoEye), Intrawest, Denver Metro Utilities, Cloud-based Geospatial CRM

Our approach and expertise have been developed by providing impactful analysis resulting in practical solutions to these patterns. Our proven approach to problem solving encompasses the best practices of management consulting and enterprise architecture planning, coupled with geospatial domain knowledge.

Our geospatial services are:

  • Geospatial Planning Analysis Service
  • Geospatial Production and Data Lifecycle Management Analysis
  • Geospatial Service Delivery Analysis
  • Overall Geospatial Strategic and Tactical Management Consulting

Geospatial Planning Analysis Service

Customer Relationship Management: Work with your customers to establish clearly defined needs and the benefits to help you prioritize your efforts and create greater value for your products and services.

  • For the US Geological Survey Land and Remote Sensing program and the National Geospatial Program, Xentity led the development business practice changes that focused on bringing external customer segments to the program planning processes leading to technological and product innovations and improved communications.

Cost Benefit Analysis of Geospatial: Optimizing the value of geospatial assets is often very complex due to their versatility and diverse uses. Organizations typically have numerous and varied stakeholders with similar requirements to exploit these valuable data and powerful technologies. Xentity offers executive-level planning and advisory services to clients to identify value opportunities, assess their strategic importance and measure subsequent performance.

  • At the US Department of the Interior, Xentity provided a cost-benefit study to establish the return on investment for service deployment – the analysis yielded 10:1 benefits.

Data Acquisition and Quality Planning: Align your program data requirements with the available partners, volunteer data, and ensure it flows through the system in an efficient, qualified and cost effective manner. Ensure you retain data in compliance with existing records, archive and use policies. 
  • Developed and implemented several acquisition strategies for the USGS National Geospatial Program that led to improved data quality and long term maintenance for multiple scale datasets.

Funding Planning:  These are challenging economic times that have seen a considerable decrease in government geospatial funds even when the value of geospatial information has not been fully exploited.  Now more than ever it is critical to be able to determine and communicate the value and impact of geospatial activities internally, to your customers and with your executives. 

  • For USGS and use of FGDC CAP Grants, Xentity designed overall acquisition planning model that linked production tracking, multiple purchasing authorities, and locations of existing inventory and purchase targets to help increase ROI on geospatial data acquisition.

Geospatial Production and Data Lifecycle Management Analysis

Concept of Operations Development – Evaluate your existing production processes for new, more cost-effective ways to move higher volumes of data through the system. Minimize the number of times the data is accessed.

  • For the Indonesia BAKOSURTANAL NSDI, Xentity staff designed a shared production environment across disparate incubating programs. Xentity has done similar work for DOI, BLM, USGS, EPA, and numerous private firms.

GeoData Quality Lifecycle Analysis – Synchronize the sources to increase the integration qualities of the data or to improve the quality (accuracy, currentness) of the data.

  • For the US National Geospatial Program, used data lifecycle management to source data from the same suppliers to meet multiple program goals and provide consistency to the user community.

Geodata Use Preparation – Optimize how to prepare data for service consumption and use as a part of your production flow.

  • BAKOSURTANAL instituted a publishing model where data was integrated across 17 ministries prior to catalog publishing

Service Delivery Analysis

Xentity has provided architectural services to the USGS, DOI and Data.gov on geospatial metadata management, discovery and visualization. We have also conducted information sharing on re-usable information service delivery patterns with multiple agencies and international bodies such as Canada, Australia, and Indonesia

Geospatial Metadata Discovery Architectures – Improve how your data and products are described (metadata), discovered (cataloging), accessed and used in the online world. Integrate your efforts with open government initiatives (a minimal harvest sketch follows the example below).

  • For USGS Core Science Systems, amongst 4 major tenets, Xentity created a path/blueprint for moving to an integrated clearinghouse and harvest solution integrating sciencebase.gov and an existing metadata clearinghouse. Worked with Geospatial One Stop, Data.gov and USGS ScienceBase catalogs in support of digital data delivery.
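As a rough sketch of the harvest pattern described above – the catalog endpoint URL is a placeholder, not an actual ScienceBase or Data.gov address – a standards-based OGC CSW GetRecords request can pull summary metadata records for indexing:

```python
# Sketch only: harvest summary metadata records from an OGC CSW catalog endpoint.
import requests
import xml.etree.ElementTree as ET

CSW_ENDPOINT = "https://catalog.example.gov/csw"   # hypothetical clearinghouse endpoint

params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "elementSetName": "summary",
    "resultType": "results",
    "maxRecords": "10",
}

resp = requests.get(CSW_ENDPOINT, params=params, timeout=30)
resp.raise_for_status()

ns = {
    "csw": "http://www.opengis.net/cat/csw/2.0.2",
    "dc": "http://purl.org/dc/elements/1.1/",
}
# Each csw:SummaryRecord carries Dublin Core fields such as title and identifier.
for record in ET.fromstring(resp.content).iter(f"{{{ns['csw']}}}SummaryRecord"):
    title = record.findtext("dc:title", default="(untitled)", namespaces=ns)
    ident = record.findtext("dc:identifier", default="", namespaces=ns)
    print(ident, title)
```

A real harvester would add paging, change detection, and mapping of each record into the target clearinghouse schema, but the discovery workflow starts with requests like this one.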

Geospatial Delivery Application Architecture – Implement standardized application frameworks and specifications to support data access and rapid transition to new products and services or delivery methods.

  • Solution Architecture design for National Geospatial Program and BAKOSURTANAL for data services and delivery.

Geospatial Delivery Solutions Architecture – Design the access methods for online service delivery or download for all forms of products and services (vector, raster, data file formats, big data): from real-time service/stream access for application integration patterns, to bulk cloud computing, to big data discovery search indices with geospatially integrated interpretive signalling (a small sketch follows the example below).

  • USGS delivery – by moving to an eCommerce model for products, moving to staged products, and improving web service access, architecture and standards for improved application integration – has seen significant IT cost reduction and a 10% monthly increase in product download and service access use over 4 years.
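Purely as an illustration of the “interpretive signalling” idea – the records, bounding boxes, and download counts are invented, and a production system would use a real search index rather than an in-memory list – discovery reduces to a spatial filter combined with a ranking signal:

```python
# Illustrative only: a tiny in-memory stand-in for a big data discovery index
# that combines a spatial filter with an interpretive signal (here, popularity).
records = [   # hypothetical product records: bbox = (min_lon, min_lat, max_lon, max_lat)
    {"id": "dem_tile_a", "bbox": (-106.0, 39.0, -105.0, 40.0), "downloads": 4200},
    {"id": "dem_tile_b", "bbox": (-105.0, 39.0, -104.0, 40.0), "downloads": 310},
    {"id": "nhd_subset", "bbox": (-110.0, 35.0, -100.0, 45.0), "downloads": 9800},
]

def intersects(a, b):
    """Simple bounding-box intersection test."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def search(query_bbox, limit=5):
    """Spatial filter first, then rank by the popularity signal."""
    hits = [r for r in records if intersects(r["bbox"], query_bbox)]
    return sorted(hits, key=lambda r: r["downloads"], reverse=True)[:limit]

print([r["id"] for r in search((-105.5, 39.4, -104.8, 39.9))])
```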

Geospatial Delivery Prototype & Research Application Development – Our architects and developers have capabilities to bring designs to life. Experts in prototyping and early phase technology selection and demonstration in BigData, Visualization, Modeling, Service, Discovery, Semantics, and informatics solutions.

We blend this with our Agile project business management capabilities for rapid planning, scrum management, sprint planning and tracking, and tying back to the aforementioned requirements, line of sight, levels of architecture, and ITIL deployment requirements.

Our goal is to provide services, software tools, and automation procedures to assist in continued appropriate innovation and to keep your research, prototypes, or beta on track.

  • USGS initial The National Map base map development in early research
  • USGS cloud migration analysis with a few dozen pilots in storage, inventory, services, APIs, bigdata, basemaps and more
  • Rapid Prototyping BigData Indices and interpretative signals for millions upon millions of records and complicated discovery requirements replacing traditional RDBMS.
  • Visualization Application stacks and mashup developments in ESRI JS API, CartoDB, OpenLayers, ArcGIS Online, CKAN, Socrata, ESRI GeoDataPortal, PostGIS, Leaflet, GeoServer, and many more JS APIs, map servers, and basemap engines.

Geospatial User-Centered Designs – Improve how your users access your digital data and mapping products and keep them aware as your content changes. Design solutions that have notification models for areas of interest, are topically aware, and balance popularity and other qualities and repeatable patterns unique to geospatial.

  • Space Imaging provided spatial notifications to customers based on area, product and intended use criteria

Geospatial Vendor Product Evaluation & Reseller Specialization – Xentity has evaluated many geospatial product lines in geospatial data, production, product generation, service platforms, and applications/APIs. Xentity has capability to evaluate product lines as well as re-sell products.

  • Xentity has evaluated geospatial products and architectures in open source, COTS, and GOTS with and for USGS, GSA, DOI, a Salesforce AppExchange company, a Denver metro utilities company, and more. See Xentity Partners for products Xentity has the capability to resell today.

What makes Geospatial so different?

Why do we promote spatial at such a level? See Why we focus on spatial data science. Read on…

What do current disruptive technologies mean to the roles of the Federal CIO office

Blog post edited by Matt Tricomi

We wanted to ask: What do current disruptive technologies mean to the roles of the Federal CIO office? 

Currently, the counterweights are in legacy footprints, primarily legacy policy

Traditionally, the operating model and funding approach for IT has been based on the Brooks Act of 1965, with only minor portfolio integration concepts added by the Clinger-Cohen Act of 1996. These acts were focused on internal IT cost centers, management information systems, mission control systems, and enterprise resource planning systems – all either internal mission or data processing systems used to run the business. Since 1996, a lot has happened in the IT space. IT has moved from cost center to profit center in the private sector, and in the government space to service or profit center as well (e.g., profit for the IRS in e-filing). As is normally the case when an asset shifts to the executive level, the investment models change and shift, as IT is now a critical part of executing transactions and interactions directly with the public – yet our policies are now 20-50 years old.

A few general observations:
  1. The Federal experiment with Clinger Cohen and Circular A-130 addressing the role of the CIO and Enterprise Architecture has not neared fulfilling its objective. New strategies such as CloudFirst and Federal Shared Services are guiding investment, but not new roles of CIOs.
  2. The policy and roles need to be readdressed to manage disruptive technologies like shared services, commoditized cloud computing, information exchange or data and knowledge driven analytics and “who-knows-what-else” coming down the line.  
  3. The CIO’s shop has not been able to transform to meet the basic demands of security and infrastructure disruptions let alone attempts to solve the needs of the mission.  
  4. Additionally, Enterprise Architecture is and has been miscast and ill-defined within the CIO organization and as a result is being used for compliance reporting or to support internal CIO initiatives leaving the mission out in the cold.  

If these statements are agreed to be true, is it any wonder that nearly a dozen years after the circular was published people are still asking “What is Enterprise Architecture?” Or does the Capital Planning & Investment Control (CPIC) process really lend itself to shared services? Are these skills and tools in the right organization?

Opportunities abound if the right people are managing the disruptions

The federal government opportunities for improvement are many but the most valued will be floating betwixt and between the current organizational, process and data architectures – in the federal architecture ether. 
This poses an especially difficult task for the business. Mission leaders need to allocate skilled resources to understanding how to assess the value of disruptive technologies or service changes against their goals. It is old-school thinking that the CIO, as a service provider, can penetrate mission problems with the timely and appropriate application of technology. The development of extensible cloud computing platforms with transparent accounting systems provides an essential key for the mission to step in, reposition itself, and own the movement towards shared services, enhanced information exchanges and improved mission processes. After all, they are the immediate beneficiaries.

What might these new roles look like?

What might this look like from a 100,000-foot perspective? A Business Week article summarizes how the role of the CIO, historically an IT manager, has changed:

 In sum, the successful CIO needs an intimate idea of how current technology can increase the company’s sales and not just reduce costs or improve clerical productivity.

Beyond the CIO role, there are several other key leadership roles to consider in new, coordinated policy.

  • The future CIO role should be targeted at managing infrastructure services and supporting shared mission services. The CIO can retain the acronym, but in essence they should be managers of cross-cutting infrastructure and – once agreed to, designed and built by the business – shared mission services.
  • The Chief Architect provides the analysis and design expertise to the Program Managers and Chief Knowledge Officer to help plan for the adoption of the disruption.  
  • Ultimate accountability for performance will be the charge of the Chief Performance Officer.  

In order to achieve true business agility supported by the adoption of disruptive technologies and services, we will need to figure out how to reposition these roles to improve the government’s business capabilities and satisfy citizens, businesses, and cross-government customers.