Macro versus Micro Geospatial Data Value

Blog post edited by Matt Tricomi

Kudos to the Canadian Government (NRCAN) for trying to get a clearer understanding of the economic significance and status of the geospatial data products, technologies, and professional services industry: http://www.nrcan.gc.ca/earth-sciences/geomatics/canadas-spatial-data-infrastructure/cgdi-initiatives/canadian-geomatics-0. The Boston Consulting Group (BCG) produced a similar economic impact assessment for the United States in April 2013: http://geospatial.blogs.com/geospatial/2013/04/contribution-of-geospatial-services-to-the-united-states-economy.html

We geo-bigots working to support the public sector realize, intuitively and intellectually, geospatial's potential value, and we are continuously frustrated by our lack of collective ability and capacity to make its implementation more immediate, simpler, and more powerful. While these macro-economic pictures are interesting and useful, they do not seem to influence micro-level government decision making. The BCG report "cautions that to continue this growth will require sustained public- and private-sector cooperation and partnership, open policies governing collection and dissemination of location-based data and increased technical education and training at all levels".

The intent of the US Federal Geographic Data Committee (FGDC) "GeoPlatform" and Canada's FGP, if supported by the proper policies and data management practices, could simplify the data quality and acquisition challenges resident in our hyper-federated geospatial data environment. Ironically, in the emerging, and soon to be overwhelming, information- and knowledge-based economy, we are still struggling to manage data content. Geospatial will not break into program and mission operations until the business leadership fundamentally adopts information-centered performance objectives as part of their organizational culture.

Geospatial has always been an obtuse concept to classify, evaluate, or pigeonhole into a nice neat framework, let alone one whose national economic value can be determined. For similar reasons, within the US Federal government, "geospatial" has struggled to find an organizational position that would enable its potential value to be maximized. This is partly because the complexity of geospatial data structures (their form, resolution, temporal range, scale, geometry, and accuracy qualities) has created an artificial "boundary" that organizations are having a hard time navigating out of. At least until the "director" sees the "map" or the "picture", and then it is the silver bullet.

Here at Xentity, we want to start to frame the discussion about how to exploit Geospatial Value at the micro or organizational level and begin to guide our customers to sustained geospatially driven business improvements.  Our initial cut at how to break the field down is found in the diagram.  How can this be improved upon?

In 2014 Every Business will be Disrupted by Open Technology

Blog post added by Wiki Admin

The article "In 2014 Every Business will be Disrupted by Open Technology" raised some very key points on key disruptive patterns. A theme we picked up on was the movement from the bleeding and leading edge to more popular adoption of open data and open platforms.

As the article notes:

Yet the true impact begins not with invention, but adoption. That's when the second and third-order effects kick in. After all, the automobile was important not because it ended travel by horse, but because it created suburbs, gas stations and shopping malls.

A few tangible themes we picked up on are:

  1. Stand-alone brands are shifting to open platforms for co-creation
  2. Open platforms built on a brand's intellectual property enhance its value rather than threaten it
  3. Open platforms provide enterprise-level capabilities to the smallest players, leveling the playing field, accelerating innovation, and amplifying competition
  4. Open platforms for co-creation shift the focus away from driving out inefficiencies and toward the power of networking and collaboration to create value

Where it's happening already: Federal, State, City, Commercial

FedFocus 2014 also emphasized that, with the budget appropriations for 2014 and 2015, two big disruptive areas will continue to be Open Data and Big Data. Especially with the May 2013 release of the White House Open Data memorandum, which went into effect in November 2013, the impact on Open Data will come from:

Making Open and Machine Readable the New Default for Government Information, this Memorandum establishes a framework to help institutionalize the principles of effective information management at each stage of the information's life cycle to promote interoperability and openness. Whether or not particular information can be made public, agencies can apply this framework to all information resources to promote efficiency and produce value.
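To make "machine readable by default" concrete, here is a minimal sketch of a single dataset entry in the data.json catalog format promoted by the memorandum's implementation guidance (Project Open Data). The dataset, URLs, and contact details below are hypothetical, and the field list is trimmed to a handful of the commonly required elements:

```python
import json

# Minimal sketch of one dataset entry in the Project Open Data
# data.json style. All dataset details below are hypothetical.
dataset = {
    "title": "Statewide Trailheads",
    "description": "Point locations of public trailheads, updated quarterly.",
    "keyword": ["recreation", "trails", "geospatial"],
    "modified": "2014-01-15",
    "publisher": {"name": "Example State GIS Office"},
    "contactPoint": "gis-help@example.gov",
    "identifier": "example-gis-trailheads-001",
    "accessLevel": "public",
    "distribution": [
        {
            "accessURL": "https://data.example.gov/trailheads.geojson",
            "format": "GeoJSON",
        }
    ],
}

# An agency catalog is essentially a list of such entries serialized as JSON,
# published at a well-known location so harvesters (e.g., data.gov) can index it.
catalog = json.dumps({"dataset": [dataset]}, indent=2)
print(catalog)
```

The point is less the exact schema than the pattern: structured, self-describing entries that any harvester can parse, rather than a human-only web page.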

We are seeing states get into the mix as well, with Open Data movements like http://gocode.colorado.gov/:

Go Code Colorado was created to help Colorado companies grow, by giving them better and more usable access to public data. Teams will compete to build business apps, creating tools that Colorado businesses actually need, making our economy stronger.

There is also movement at the city level, with the City of Raleigh, North Carolina well recognized for its award-winning Open Data Portal.

 

We had previously tweeted on how IBM opened up their Watson cognitive computing API for developers… publicly. This is a big deal. IBM knows that with an open data platform as an ecosystem, they not only get more use, which means more comfort, which means more apps, but also that every legally allowed transaction on the platform improves the interpretive signals that make Watson so wicked smart. The article points this key example out as well.

 

National Data Assets are also moving ahead to make their data more distributable over the cloud: moving data closer to cloud applications, and offering data via web services where datasets are too large or updated too often to sync, download, or sneakernet.
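For geospatial data, "offering data via web services" typically means standing up OGC interfaces such as WMS or WFS so consumers pull only the slice they need instead of downloading whole datasets. A minimal sketch of composing a standard WMS 1.1.1 GetMap request follows; the endpoint and layer name are hypothetical, while the query parameters come from the WMS specification:

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=512, height=512):
    """Build a WMS 1.1.1 GetMap URL for one layer over a bounding box."""
    params = {
        "service": "WMS",
        "version": "1.1.1",
        "request": "GetMap",
        "layers": layer,
        "styles": "",
        "srs": "EPSG:4326",
        "bbox": ",".join(str(c) for c in bbox),  # minx,miny,maxx,maxy
        "width": width,
        "height": height,
        "format": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url(
    "https://services.example.gov/wms",   # hypothetical endpoint
    "national_hydrography",               # hypothetical layer name
    (-109.05, 36.99, -102.04, 41.00),     # roughly Colorado
)
print(url)
```

A client renders just this tile-sized view on demand; the authoritative data stays at the source and can be updated daily without anyone re-downloading it.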

Xentity and its partners have been at the forefront of all these movements.

We have enjoyed being on the leading edge since the early phases of this movement. Our architectures focus less on commodity IT (not to undersell the importance of affordable, fast, robust, scalable, enabling IT services and data center models) and more on putting the "I" back in IT.

We have been moving National Geospatial Data Assets into these delivery models as data products and services (Xentity is awarded USGS IDIQ for Enterprise and Solution Architecture), supporting the architecture of data.gov (Xentity chosen to help re-arch data.gov), and recently supporting the data wrangling on Colorado's State OpenData efforts. We are examining Can a predictable supply chain for geospatial data be done, and we are actively participating in NSF EarthCube, which looks to "Imagine a world…where you can easily plot data from any source and visualize it any way you want." We have presented concepts.

Our architecture methods (Developing a Transformation Approach) are slanted to examine mission oriented performance gains, process efficiencies, data lifecycle management orientation, integrating service models, and balancing the technology footprint while innovating. For instance, we are heavily involved in the new ACT-IAC Smart Lean Government efforts to look at aligning services across government and organizational boundaries around community life events much like other nations are beginning to move to.

Xentity is very excited about the open data movement, its supported platforms, and the traction it is getting in industry. This may move us forward from information services into the popular space of knowledge services (Why we focus on spatial data science).


To do Big Data, address Data Quality, People and Processes, and Tech Access to Information

Blog post added by Wiki Admin

As a follow-on to the cliffhanger in "BigData is a big deal because it can help answer questions fast", there are three top limitations right now: Data Quality, People and Process, and Tech Access to Information.

Let's jump right in.

Number One and by far the biggest – Data Quality

Climate Change isn't a myth, but it is the first science ever to be presented on a data premise. And in doing so, its advocates prematurely presented models that didn't take into account the driving variables. Their models have changed over and over again. The resolution of their source data has increased. Their simulations on top of simulations have proven countless theories from various models that can only be demonstrated simply by Hollywood blockbusters. Point being, we are dealing with inferior data for a world-scale problem, and we jump into the political, emotionally driven world with a data report? We will be the frog in slowly warming water, and we will hit that boiling point late. All because we started with a data justification approach using low-quality data. Are they right that the world is warming? Yes. Do they have enough data to prove out the right mitigation, mediation, or policy adjustments? No, and not until we either increase the data quality or take a non-data tack.

People and processes are a generation away.

Our processes in IT have been driven by Defense and GSA business models from the fifties. Put anyone managing the 0s-and-1s technology in the back: they are nerds, look goofy, can't talk, don't understand what we actually do here, and by the way, they smell funny. That has been the approach to IT since the 50s, and nothing has changed, with the exception that there are a few baker's dozens of the hoodie-wearing, Mountain Dew-drinking night owls who happen to be loaded now, and there is a pseudo-culture of geek chic. We have not matured our people talent investment to balance maturity of service, data, governance, design, and product lifecycle, or to embrace that engine culture as core to the business. This means more effective information sharing processes to get the right information to the right people. This also means investing in the right skills (not just feeding Doritos and free soda to hackers) to manage the information sharing and data lifecycle. I am not as worried about this one. As the baby boomer generation retires, it will leave a massive vacuum: Generation X is too small, and we will have to groom Generation Y fast. That said, we will mess up a lot and lose a lot to brain drain, but the market will demand relevancy, which will, albeit slowly, create this workforce model in 10-15 years.

Access to Environments 

If you had asked this pre-hosting or pre-cloud, access would have been limited to massive corporations, defense, intel, and some of the academia co-investing with those groups. If you can manage the strain of shifting to a big data infrastructure, this barrier should be the least of your problems. If you can let your staff get the data they need at the speed they need, so they can process in parallel without long wait times, you are looking good. Get a credit card, or if Government, buy off a Cloud GWAC, and get your governance and policies moving, as they are likely behind and not ready; left alone, they will prolong the siloed information phenomenon. Focus on the I in IT, and let the CTO respond to the technology stack.

Focus on data quality, have a workforce investment plan, and continue working your information access policies

The tipping point that moves you into Big Data is where these factors, combined, require you to deal with complicated enormity at speed, producing answers not just for MIS and reports but for real decisions. If you can focus on those things in that order (likely solving them in reverse), you will be able to implement parallelization of data discovery.
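"Parallelization of data discovery" can start as simply as fanning a scan out across partitions of data with worker processes. A minimal sketch using Python's standard multiprocessing pool; the partitions and the matching logic are placeholders for whatever discovery step (catalog search, metadata scan, filtering) applies:

```python
from multiprocessing import Pool

def scan_partition(args):
    """Placeholder discovery step: scan one partition for matching records."""
    partition, term = args
    return [rec for rec in partition if term in rec]

def parallel_discover(partitions, term, workers=2):
    """Fan the scan out across partitions, then merge the hits."""
    with Pool(processes=workers) as pool:
        results = pool.map(scan_partition, [(p, term) for p in partitions])
    return [hit for part in results for hit in part]

if __name__ == "__main__":
    # Hypothetical partitions of a dataset inventory.
    partitions = [
        ["roads_2013.shp", "hydrology.gdb"],
        ["parcels.csv", "roads_2014.shp"],
    ]
    print(parallel_discover(partitions, "roads"))
```

The same shape scales from two lists on a laptop to thousands of partitions on a cluster; the organizational work described above (quality, workforce, access policy) is what determines whether the data is worth scanning at all.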

This will shorten the distance from A to B and create new economies, new networks, and enable your customer or user base to do things they could not before. It is the train, plane, and automobile factor all over again.

And to throw in the shameless plug: this is what we do. This is Why we focus on spatial data science and Why is change so fundamental.

GAO releases report on FGDC Role and Geospatial Information

Blog post edited by Wiki Admin

GAO released a report on the use of geospatial information titled "OMB and Agencies Can Reduce Duplication by Making Coordination a Priority". The Reader's Digest version: focus on integrating data.


We tend to agree. FGDC is currently very focused on a service-enabling management model (Geoplatform) to accomplish this. It is bold, but if their role as a service provisioner can directly or indirectly get them in the game to address the real problem of data lifecycle management, they will have a chance to address it.

Point being, FGDC knows its role is not to be in IT Operations as a direct goal. But they also saw that being a sideline judge with no carrot or stick would not garner the direction and recommendations that GAO suggests. They are getting on the playing field, taking advantage of the open service provider role, being that broker, and using that role to move IT costs down, which in turn enables shifts in monies to focus on the data issues cited. It is a bold and unique approach, and there are many questions about whether a traditionally non-operational group can develop the culture to be effective. The proof will show over the next two years.

Below is our summary of the strategic direction for FGDC's Geoplatform.

The challenges and recommendation sections are:

  1. FGDC Had Not Made Fully Implementing Key Activities for Coordinating Geospatial Data a Priority
  2. Departments Had Not Fully Implemented Important Activities for Coordinating and Managing Geospatial Data
  3. Theme-lead Agencies Had Not Fully Implemented Important Activities for Coordinating and Managing Geospatial Data
  4. OMB Did Not Have Complete and Reliable Information to Identify Duplicative Geospatial Investments

Our review of Background – then and now

First, the foundation the FGDC has put in place: the Federal Geographic Data Committee (FGDC) has always been a catalyst and leader in enabling the adoption and use of geospatial information.

The FGDC has been successfully creating the geospatial building blocks for the National Spatial Data Infrastructure (NSDI) and empowering users to exploit the value of geospatial information. The FGDC has led the development of the NSDI by creating the standards and tools to organize the asset inventory, enhance data and system interoperability, and increase the use of national geospatial assets. The FGDC has successfully created policy, metadata, data and lifecycle standards, clearinghouses, catalogs, segment architectures, and platforms that broaden the types and number of geospatial users while increasing the reuse of geospatial assets. [1]

What is next? The Geospatial Platform and NGDA portfolio will be the mechanism for adoption of shared geospatial services to create customer value

Recently, the FGDC and its partners have expanded their vision to include the management and development of a shared services platform and a National Geospatial Data Asset (NGDA) portfolio. The goals are to "develop National Shared Services Capabilities, Ensure Accountability and Effective Development and Management of Federal Geospatial Resources, and Convene Leadership of the National Geospatial Community", benefitting the communities of interest with cost savings and improved processes and decision making.[2]

As the FGDC continues on the road to establish a world class geospatial data, application and service infrastructure, it will face significant challenges “where the Managing Partner, along with a growing partner network, will move from start‐up and proof‐of‐concept to an operational Geospatial Platform”.[3]

Xentity has reviewed the FGDC’s current strategy, business plan and policies and identified the following critical issues that need to be solved to attain the goals:

  • Building and maintaining a federated, "tagged"[4], standards-based NGDA and an open, interoperable Geospatial Platform. The assets need to provide sufficient data quantity, quality, and service performance to attract and sustain partner and customer engagement[5]
  • Developing a customer base with enough critical mass to justify the FGDC portfolio and provide an "Increased return on existing geospatial investments by promoting the reuse of data application, web sites, and tools, executed through the Geospatial Platform" [6]
  • Improving Service Management and customer-partner relationship capabilities to accelerate the adoption of the interoperable "shared services" vision and satisfy customers [7]
  • Executing simple, transparent, and responsive Task Order and Requirements management processes that result in standards-based interoperable solutions [8]

The Big Challenges

Establish the financial value and business impact of the FGDC’s Portfolio!

The Geospatial Platform and NGDA will provide valuable cost-saving opportunities for their adopters. They will save employees' time, avoid redundant data acquisition and management costs, and improve decision making and business processes. The financial impact on government and commercial communities could be staggering; it is a big and unknown figure.

The Geospatial Platform, by definition and design, is a powerful, efficient technology with the capacity to generate a significant return on investment. It is a community investment and requires community participation to realize the return. The solution will need to assist the communities with the creation and sharing of return-on-investment information, cost modeling, case studies, funding strategies, tools, and references, and continue to build the investment justification. The solution will need to optimize funding enhancement and be responsive to shorter-term "spot" or within-current-budget opportunities while always positioning for long-term sustainability. The FGDC Geospatial Platform Strategic Plan suggests a truly efficient capability could create powerful streamlined channels between much broader stakeholder communities, including citizens, the private sector, and other government-to-government interfaces. Similar to the market and business impacts of GPS, DOQ, and satellite imaging technology, the platform could in turn promote more citizen satisfaction, private sector growth, and multiplier effects on engaged lines of business.

To get a big return, it will demand continuous creative thinking to develop investment, funding, management and communication approaches to realize and calculate the value.  It is a complex national challenge involving many organizations, geospatial policy, conflicting requirements, interests and intended uses.

The key is demonstrable successes.  Successes become the premise for investment strategy and cost savings for the customers.  Offering “a suite of well‐managed, highly available, and trusted geospatial data, services, and application, web site for use by Federal agencies—and their State, local, Tribal, and regional partners” [9] is the means to create the big value.  

”A successful model of enterprise service delivery will create an even greater business demand for these assets while reducing their incremental service delivery costs.” [10]

FGDC has to create and tell a compelling “geospatial” value proposition story

Successfully implementing the FGDC's vision will demand a robust set of outreach and marketing capabilities. The solution will need to help construct the platform's value proposition and marketing story to build and inform the community, with the objective of ensuring longer-term sustainable funding and community participation. It will need to bring geospatial community awareness, incentive modeling, financial evaluation tools, multi-channel communication, and funding development experience to the FGDC. It will need transparently developed and implemented communication and marketing strategies that lead to growth in the customer base, alternative portfolio funding models, and shared services environments for the geospatial communities. And its approach will need to be transparent, engage the customers and partners, and continuously build the community.

This is a challenging time to obtain needed capital and win customers, even for efficient economic engines like shared geospatial data and services. The community outreach will need to be impactful and trusted, and to tell the story of efficiencies, cost savings, and higher-quality information. The platform and NGDA must impact the customers' program objectives. Figure 1 – FGDC Performance and Value Framework – shows how the platform's value chain aligns with the types of performance benefits that can be realized throughout its inherent processes. The supporting team's understanding of this model will be needed to organize the "Story" to convince the customers and partners that the platform can:

  • Provide decision makers with content that they can use with confidence to support daily functions and important issues,
  • Provide consistency of base maps and services that can be used by multiple organizations to address complex issues,
  • Eliminate the need to choose from redundant geospatial resources by providing access to preferred data, maps and services[11] 

As the approach is implemented, the FGDC, its partners, and the Communities of Interest will have successfully accelerated the adoption and use of location-based information. Users will recognize the value offering and reap the benefits to their operations and bottom line. The benefits will be measurable and will support the following FGDC business case objectives:

  • Increasing Return on Existing Investments, Government Efficiency, Service Delivery
  • Reducing (unintentional) Redundancy and Development and Management Costs
  • Increasing Quality and Usability[12]

Our Suggested Solution

FGDC's challenges require a PMO, integrated lifecycle management, partner focus, and blended experience, with an integrated approach and a single voice designed to meet the FGDC's strategic objectives and provide a world-class shared services and data portfolio. Doing this, they can integrate organizations, data, and service provision.

A solution like this would provide the program, partner, and customer relationship management, communications, development, and operational capabilities required to successfully implement the FGDC's vision and business plan. The focus will need to:

  1. Coordinate cross-agency tasks and portfolio needs in agile program management with a single voice,
  2. Implement an understanding of the critical lifecycle processes to manage and operate the data, technology, capital assets, and development projects for a secure cloud-based platform,
  3. Have communications and outreach focused on communities, for partner and customer engagement in lifecycle decisions, and
  4. Ensure the secretariat staff and team have rotating collective experience, with representatives and contractors who have successfully performed at this scale across all functional areas, with domain knowledge in geospatial, technology, program, service, development, and operations.

The strategy, collective experience, and techniques will enable FGDC to provide a single voice from all management domains (PMO, Development, Operations, and Service Management) for customer engagement. The approach will need to be integrated with the existing FGDC operating model, creating a sum value greater than that of its individual parts. This will help create the relationships needed to develop trusted partner services.


[1] Geospatial Platform Business Plan (Redacted Final), page 7

[2] Draft NSDI Strategic Plan 2014-2016 V2, page 2

[3] Geospatial Platform Business Plan (Redacted Final), page 28

[4] Ibid., page 11

[5] Ibid., page 9

[6] Ibid., page 26

[7] Ibid., page 4

[8] Ibid., page 6

[9] Geospatial Platform Business Plan (Redacted Final), page 2

[10] DOI Geospatial Services Blueprint, 2007

[11] Geospatial Platform Business Plan (Redacted Final), page 13

[12] Geospatial Platform Business Plan (Redacted Final), Appendix A

[13] OMB Circular A-16 Supplemental Guidance, page 12

[14] Geospatial Platform Business Plan (Redacted Final), page 12

[15] Ibid., page 36

[16] ITSM – Service Operations V3.0

[17] Ibid., page 26


Apple Maps now has more share than Google Maps

Blog post edited by Wiki Admin

I have been tracking Apple for a long time (check out our 2011 article on Apple in the 80s and a local kid's view of the Jobs-Sculley re-organization), and once again their approach of releasing a solution that works by default in their ecosystem triumphs over better engineering. VHS wins over Betamax, again. Lots of articles cover this press release:

Apple maps: how Google lost when everyone thought it had won | Technology | theguardian.com