Go Code Colorado Open Data Effort is going into its final weeks


States across the country have joined the open data movement. Colorado has as well, and its recent Go Code Colorado effort is a unique entry in this arena (http://gocode.colorado.gov/).

Go Code Colorado was created to help Colorado companies grow by giving them better and more usable access to public data. Teams compete to build business apps, creating tools that Colorado businesses actually need and making our economy stronger.


The following is a great video, produced by the State and one of Xentity’s colleagues, Engine7 Media, that summarizes the event.




Xentity is very proud to be supporting this innovative government solution

Xentity was awarded IT consulting support for the Business Intelligence Center platform and data catalog, which supports the now-branded Go Code Colorado initiative. Xentity’s consultants have provided the data and technology resources to manage and advise the publication of public-sector data to the Colorado Information Marketplace and to provide technical support to developers who participate in the Challenge.

Xentity has primarily provided data platform support. We have provided data readiness analysis, data architecture guidance, project management, and the data analysts to “wrangle” the data (aka ETL) to get the datasets onto the platform. We have also provided on-site IT and data support at the multiple locations and events to ensure the challenge participants and finalists get the support they need to successfully access and use the data and services. Finally, we are supporting the technical review of applications to ensure these applications can have a life beyond the “hackathon” stage.
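
To make that “wrangling” step concrete, here is a minimal, hypothetical sketch of this kind of ETL flow: read a raw CSV extract, normalize field names, and upsert the rows to a Socrata-backed catalog (the Colorado Information Marketplace runs on Socrata). The dataset ID, file name, and credentials are placeholder assumptions, not actual Go Code Colorado datasets.

```python
# Hypothetical ETL sketch: clean a raw CSV and upsert it to a Socrata catalog.
# Dataset ID, file name, and credentials below are illustrative placeholders.
import csv
import requests

SOCRATA_DOMAIN = "data.colorado.gov"
DATASET_ID = "abcd-1234"       # hypothetical 4x4 dataset identifier
APP_TOKEN = "YOUR_APP_TOKEN"   # issued by the Socrata portal

def load_and_clean(path):
    """Read the raw extract and normalize field names (a typical wrangling step)."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield {k.strip().lower().replace(" ", "_"): v.strip() for k, v in row.items()}

def upsert(rows):
    """Push rows through the SODA API; Socrata treats POSTed rows as an upsert."""
    resp = requests.post(
        f"https://{SOCRATA_DOMAIN}/resource/{DATASET_ID}.json",
        json=list(rows),
        headers={"X-App-Token": APP_TOKEN},
        auth=("publisher@example.com", "password"),  # placeholder credentials
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(upsert(load_and_clean("new_business_filings.csv")))
```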

The final stages arrive in the first 10 days of May. The 10 finalists have demonstrated very viable solutions toward the goal of helping make our economy stronger.

Some more background and detail on how we got here

(The following is from the State as guidance to this effort)


Colorado government agencies possess large volumes of public business and economic data. This data can help businesses with strategic planning, but it exists in so many different places and formats that most businesses find it difficult to use. The Secretary of State’s office will address this problem through the creation of the Business Intelligence Center (BIC). BIC seeks to aggregate and analyze data available to the business community.

This effort is led by the Colorado Secretary of State. The Secretary of State’s office interacts with hundreds of thousands of business entities, charities, and nonprofits in the state. The office collects, manages, and disseminates large amounts of basic data about those organizations and wants to make that data useful to Colorado businesses.

The Department sought to make this data more useful and collaborated with the Leeds School of Business at the University of Colorado to publish the Quarterly Business and Economic Indicator Report. This report combines Department data with other economic data collected by the Leeds School to provide meaningful economic information to the business community. For instance, new business filings are a leading indicator of job creation. With this and other information provided in the report, the business community can make smarter decisions that will grow the Colorado economy.

Since first publishing the report in 2012, the Secretary of State has received comments from many members of the business community asking to see more detailed data regarding economic trends in order to better understand the distribution of commerce in Colorado. This includes access to the location, size, vibrancy, and concentration of key business nodes. While this level of detail would be tremendously helpful, the Department cannot provide the information because multiple state agencies collect the desired data and it is not readily available in a common place or even a common format.

A central data collection point is needed. During meetings with other government agencies, Department staff concluded that these data requests could be met by aggregating all the information spread throughout various agencies and databases into a single tool, breaking down agency silos and better cataloging existing resources. Department staff also concluded that access to and availability of the data are not enough. To make the raw data useful to the vast majority of business owners, data analysis and visualization tools are needed. These conclusions led to the Business Intelligence Center project.

The Business Intelligence Center consists of a centralized data catalog that combines public data into a meaningful tool for businesses. 

The vision for this project is two-fold. First, it consolidates public data relevant to businesses on a single platform. Second, it gives businesses the tools to make the data useful. The second goal is achieved through a civic apps challenge—the Colorado Business Innovation Challenge—that will give financial incentives to the technology community to build web and mobile applications that use state and other data to solve existing business challenges.

The data platform is akin to an information clearinghouse. It will make data sources currently dispersed over multiple government departments and agencies accessible in a common location. This platform will offer Colorado businesses unprecedented access to public data that is validated and relevant to short- and long-term needs. Besides enhancing businesses’ access to state data, the BIC will also contribute to economic growth. The creation of the BIC will make data available to all Colorado businesses at no additional cost. Currently, only large entities with the time, staff, and budget to engage in detailed statistical analysis can use these data sets. Providing this data to every type and size of business in Colorado provides a unique opportunity to contribute to economic development. The BIC will nurture key industry networks and lay the foundation for a digital infrastructure that will continue to expand and improve over time.

The Colorado Business Innovation Challenge is an innovative way to create solutions and ensure the BIC is useful to Colorado businesses.

Simply making the data available is insufficient for most business owners. To truly help the vast majority of businesses—especially small businesses—tools must be developed to present the data in a useful and consumable form. Normally, government agencies develop tools to fill this information vacuum, but historically the government has not been successful at developing highly useful and effective tools. A new approach is needed—that approach is the Colorado Business Innovation Challenge.

Modeled after “civic apps” challenges that have been run in multiple cities across the United States and internationally, the Challenge presents the software development community with problem questions and then asks that community to create possible solutions. At the end of the challenge, the Secretary of State will license the most innovative and implementable web or mobile application. The best design will receive a contract with the Secretary of State to make the application available to the public on the Business Intelligence Center platform. The Department will also pursue partnerships with the Colorado technology and startup industry to provide additional incentives, such as mentoring, hosting, and office space, to the Challenge winners. The long-term intent of the program is not only to create an environment for fostering community involvement through the Challenge, but also to sustain the tools developed in the Challenge.

Xentity holding Methodology Development review for ACT-IAC, Smart Lean Government


Xentity Methodology Development example (2002-2008: how TAA grew to MBT grew to OMB FSAM)

Over six years, the method went through several iterations: applying it to projects, scaling up its scope, applying it to programs and segments, lots of information sharing, and finally promotion as the basis of FSAM. The review is not meant to direct the content of the ACT-IAC Smart Lean Government (SLG) effort, but to show how the method got developed, how the training material was built out and trainers prepared, and how the team went out sharing information, promoting the method, and then helping apply it to support true business transformation and modernization blueprints.

Summary of Objectives:
– Cover how Xentity and partners supported the Department of the Interior in iterating on and developing the “Methodology for Business Transformation,” showing some sample deliverables, templates, and visuals
– Discuss how we iterated from bureau TAA (02-03) to DOI MBT (04-06) to OMB FSAM (07-08), maturing from bureau, to agency, to federal scope through not only iterative method development but also iterative application: first a few blueprints at the bureau, then four at the agency cutting across differently driven blueprints (legal, organization development, target architecture, and shared services), then improving the method and doing more blueprints, then doing more communications and training, and finally promotion to OMB through collaborative integration with other methods
– Cover lessons learned and critical success factors (i.e., framework agnostic, mapping to frameworks, culture understood, tested in 3 cycles of blueprints, breaking into services, first attempted foray into doing this as an NGO in 2006)
– Cover pitfalls (i.e., overselling, under-training, lacking certification, didn’t emphasize earlier phases, limited bureau accountability, perception and politics, moving to FSAM and losing key aspects, calling it a method instead of an approach, perceived as waterfall and not presented situationally, training/maintenance budget limits)
– Cover things we would do differently now (i.e., wiki, more examples, more new media communications)


How can we help geoscience move its data to shared services


Most data generated by science is not intended to be used on broader-scale problems outside of the originating research or its specific domain. Let’s stick with geoscience to check this ‘hypothesis’ out and check on the ‘so what?’ factor.

Current grant and programmatic funding models are not designed to develop shared services or interoperable data for the geosciences. There are few true shared services that are managed and extended to the community as products and services should be. “We broadly estimate that 20% of users of a dataset might be experts in the field for which it was created, while 80% might be others.” There is currently limited to no incentive for most geoscientists to think beyond immediate needs. “The culture of collaborative science is just being established.” Finally, there is no clear way to build and sustain the large and diverse geosciences community.

The key, we believe, starts not with the tech, the money, the management, or the governance, but with stakeholder alignment, which can be defined as “the extent to which interdependent stakeholders orient and connect with one another to advance their separate and shared interests.”

The geoscientist community spans a “Big Head” and a “Long Tail”: geoscientists with affiliated government institutions and academic and international partners; data scientists; data and information stewards, curators, and administrators (content and metadata); data product and service managers; citizen earth science participants; and emerging geoscientists found in the STEM (K-12) community.

Additionally, there are the supply chain roles:

  • Data and Information Suppliers – NSF-funded centers and systems; programmatic producers of geoscience-related data and information, e.g., Earth observation systems like Landsat or MODIS, or specialized information systems that produce value-added products like NAWQA; Indonesia NSDI; DOI authoritative data sources and services
  • Cyber Infrastructure Community/Development Collaborators – basic and applied research and development, software and system engineers, data managers and analysts
  • Infrastructure Management/Collaborators – IT Service Management (ITSM): managers and operators of the shared infrastructure and key software services in industry, commercial, government, research, and cost-sharing FFRDCs
  • Consumers – reached via end-user workshops: public policy, regulatory, legal, and administrative analysts; the private sector; academia (non-participating states); other science disciplines
  • Executive Sponsorship and Cross-Cutting Geoscience Governance – a community too numerous to get into in this blog, at least

The stakeholder themes we have seen in data are generally the same. 

These challenges echo the organizing themes Xentity supported developing seven years ago for the DOI Geospatial Services Architecture:

  • “I know the information exists, but I can’t find it or access it conveniently” has its analog in “Considerable difficulties exist in finding and accessing data that already exists”
  • “I don’t know who else I could be working with or who has the same needs” has its analog in “Duplication of efforts across directorates and disciplines, disconnect between data and science; data graveyard – useless collection of data…”
  • “If I can find it, can I trust it?” has its analog in “There is a need to evaluate consistency/accuracy of existing data”

A start on this, to jump into boring consulting theory, is to develop a clear line of sight that addresses stakeholder needs and community objectives.

This ensures the analysis engages all the necessary dimensions and relationships within the architecture. Without such a strategy, good solutions, business or technical, often suffer from lack of adoption, have unintended consequences, and introduce unwanted constraints. The reason is lack of alignment: technology innovators tend not to share the same view of what is beneficial, nor does the geoscientist, who is accustomed to enabling a single or small set of technology directives.

How does one create the shared enterprise view? Using the line of sight, our approach to architecture transformation and analysis creates the framework and operating model. It connects business drivers, objectives, stakeholders, products and services, data assets, systems, models, services, components, and technologies. Once the linkages have been established, the team will create the conceptual design using 40-50 geoscience domain investment areas. This will effectively describe the capabilities of the existing IT portfolio. The architecture and the portfolio will be designed to support governance and future transition planning.
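
To illustrate the idea (not the actual DOI/Xentity model), here is a minimal sketch of line-of-sight traceability as a data structure: each node links one layer up, so any asset can be walked back to the business drivers it serves. All layer names and example nodes are hypothetical.

```python
# Minimal line-of-sight sketch: trace any asset back to its business drivers.
# Layers and example nodes are illustrative assumptions, not the real model.
from dataclasses import dataclass, field

LAYERS = ["driver", "objective", "stakeholder", "product_service",
          "data_asset", "system", "technology"]

@dataclass
class Node:
    layer: str
    name: str
    supports: list = field(default_factory=list)  # nodes one layer up

def trace_up(node):
    """Return every path from this node up to a top-level business driver."""
    if not node.supports:
        return [[node.name]]
    return [[node.name] + path
            for parent in node.supports
            for path in trace_up(parent)]

# Illustrative chain: data asset -> product -> stakeholder need -> driver
driver = Node("driver", "Accelerate collaborative geoscience")
need = Node("stakeholder", "Researchers need trusted elevation data", [driver])
product = Node("product_service", "Standardized elevation products", [need])
asset = Node("data_asset", "National elevation dataset", [product])

for path in trace_up(asset):
    print(" -> ".join(path))
```

Once every investment area is linked this way, gaps show up as assets with no path to a driver, and duplication shows up as many assets converging on the same need.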

Sample Ecosystem Edge Analysis

| Human Edges (adaptive systems) | Data and Information Edges | Computing and Infrastructure Edges |
| --- | --- | --- |
| Citizen scientists/STEM and professional scientists | Data supply and information products/services | Centralization and federation of computing infrastructure |
| Geoscience as consumer and producer of data and information | Possessing the data and accessing the data | Commodity computing vs. analytical computing |
| Individual science and collaborative science | Macro-scale data vs. micro-scale data | Mission-driven systems and shared services access |
| Science ideation: piecemeal or segmented vs. holistic | Five data dimensions – spatial (x, y, z), temporal, and scale | Domain systems vs. interoperability frameworks |
| Individual vs. collective impact and credit | Authoritative sources vs. free-for-all data | Systems vs. managed services |
| Governance rigidity and flexibility | Data and models vs. products | Big Head and systematic data collection vs. project components |
| Earth science and cyber-infrastructure and engineering | Long Tail vs. Big Head data | |


The line of sight allows for exploring the complexities of these geoscientist “ecosystem edges” and architecting for greater interaction and production in the geosciences. Those in the “Long Tail” encounter the same cross-domain access, interoperability, and management barriers as the “Big Head.” Neither has the incentive to develop common enabling data interoperability services, scalable incentive solutions, or common planning approaches, or to increase the participation of the earth science community. Xentity believes architecture is an enabling design service. It is used to empower the user community with the tools to expand its capacities. In this case, Xentity will provide the operating model and architecture framework in a conceptual design to bring together the currently unattended edges. In the long run, the models will provide the emerging governance system the tools to develop investment strategies for new and legacy capabilities.

The Broader Impact

At its core, we believe the geoscience integration challenge is to exploit the benefits and possibilities of the current and future geoscience “ecosystem edge effect.” In the ecosystem metaphor, the conceptual design approach will target the boundary zones lying between the habitats of the various geoscience disciplines and systems. What is needed is an operating model, architectural framework, and governance system that can understand the complexities of a shared geoscientist environment and successfully induce the “edge effect.” It needs to balance the well-performing aspects of the existing ecosystem with new edges to generate greater dynamism and diversification for all geosciences.

An operating model example: collaborative geoscience planning could make a good demonstration case for the benefits of the “edge effect.” A lot of science efforts are driven by large-scale programs or individual research groups who have very little knowledge of who else may be working in the same environmental zones, geographies, or even on related topics. A shared planning service could put disparate projects into known time, location, and subject contexts, accelerate cross-domain project resource savings, and develop the resulting interdisciplinary cross-pollination required to understand the earth’s systems. An enterprise geoscience initiative could provide a marketplace for geoscientists to shop around for collaborative opportunities. The plans can be exposed in a marketplace to other resources like citizen scientists or STEM institutions. The work can be decomposed so that environments like Amazon’s Mechanical Turk can post, track, and monitor distributed tasks, as in the sketch below.
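
As a purely hypothetical illustration of that last step, the sketch below posts one small, decomposed task to Amazon Mechanical Turk’s requester sandbox via boto3. The task content, reward, and counts are made-up assumptions; the point is only that a decomposed plan reduces to postable, trackable units of work.

```python
# Hypothetical sketch: post one decomposed geoscience task to the MTurk
# requester sandbox. Task text, reward, and counts are illustrative only.
import boto3

QUESTION_XML = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>classify_site_photo</QuestionIdentifier>
    <QuestionContent><Text>Does this field photo show exposed bedrock?</Text></QuestionContent>
    <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
  </Question>
</QuestionForm>"""

# Sandbox endpoint, so the sketch can be run without spending real money.
client = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

hit = client.create_hit(
    Title="Classify a geoscience field photo",
    Description="One unit of a decomposed collaborative science plan",
    Keywords="geoscience, citizen science, classification",
    Reward="0.05",
    MaxAssignments=3,                 # independent answers for quality control
    LifetimeInSeconds=86400,          # visible for one day
    AssignmentDurationInSeconds=300,  # five minutes per worker
    Question=QUESTION_XML,
)
print("Posted HIT:", hit["HIT"]["HITId"])
```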

By recognizing these edges, the architecture will create greater value or energy from the disciplines and improve the creativity, strength and diversity of ideas, and mitigate disruption.  The ecosystem-like design that balances the Big Head with the Long Tail will enable more cost effective geoscience projects and create a higher return on IT investments while collapsing the time to conduct quality impactful science. Most importantly, this will accelerate the realization of the sciences’ impact on other dependent scientific initiatives or time to develop and implement policy.  Xentity sees the potential to use this and other “ecosystem edges” to transform how geoscience is currently conducted. 

Xentity believes a geoscientist, whether emerging (STEM) or emeritus, would be willing to participate in a cross-cutting, shared service model based on how well these edges are architected and governed. If designed and operated effectively, the edges will create an environment that addresses the two key barriers to adoption: trust and value. In essence, we see the scientists as both consumers and producers.

  • As consumers of data, information and knowledge products and technology services, they are continuously looking to create more knowledge and contribute to social benefit. 
  • As producers they contribute data, information and knowledge back into their colleagues’ knowledge processes.  

In fact, the predominant challenge for such an approach is that the shared service will be developed by the community, who are themselves the consumers. Just like any other consumers, they will have expectations when they purchase or use a product or a service. If one cannot uphold the terms and conditions of product quality or a service agreement, one loses the consumer. So, how does the architecture ensure these “edges” develop and evolve? It must ensure the following:

How to earn Geoscientists’ Trust

The scientists need to know that they will have highly reliable technical services and authoritative data that are available and perform well when requested. Most importantly, they will need to influence and control who conducts the work within the shared environment and how. They need to ensure the quality of the science and appropriate credit.

How to demonstrate the value to the Geoscientist:

The scientists need the provider to deliver the correct products or services, ones that eliminate the most significant barriers and constraints to doing more and higher-quality science (research, analysis, and experimentation) with less effort.

In the short term, the shared service challenge is to earn the scientists’ trust and identify the optimal suite of products and services to provision value from the “community resources” as defined in the layered architecture. For land elevation products, up to 80% of requests are for standardized products. If done correctly, the governance system, operating model, and architecture framework will develop trust and value recognition from the shared community. In the longer term, the models and framework will guide the redirection of limited resources toward an interoperable set of systems, processes, and data.

Great, but even if we create this, how do we fund it?

See the next part, “Will geoscience go for a shared service environment,” which discusses ways to address funding and ways to engage, encourage, enable, and support execution of these enterprise capabilities for geoscientists.

Whiteboarding Data.gov and Geoplatform Original Vision


During the same week Xentity team members presented on the data.gov and geoplatform integration vision at the International Data Conference, we had the opportunity to hold an information sharing exercise and whiteboard sessions with FGDC and data.gov executive sponsors to capture what geodata.gov means to data.gov.

In this session, we covered various user stories on the haves and have-nots for the current data suppliers, what our analysis suggests data users want, and how they want to access the data. Finally, we discussed how data.gov and geoplatform.gov can work with agencies to get these services funded and how geoplatform could serve the community.

The following captures a series of potential services that agencies may request on geoplatform.gov as it unfolds. Since the concept of geoplatform is for agencies to build out services as they see fit, with geoplatform.gov as more of the conveyance, governance, and service provider, how the actual services unfold could go anywhere.