Will geoscience go for a shared service environment?

As the previous “How can we help geoscience to move their data to shared services” blog noted, unless we align the stakeholders, get a clear line of sight on their needs, and focus on earning trust and demonstrating value, the answer is no. But let’s say we are moving that way. How do we get started to fund such an approach?

Well, first off, the current grant and programmatic funding models are not designed to develop shared services or interoperable data for the geosciences.  Today, many geoscientists are collaborating across disciplines and, as a result, improving the quality of knowledge and scaling the impact of their research.  It is also well established that the vast majority operate individually or in small teams.  Geoscientists, rightly so, remain very focused on targeted scientific objectives and not on enabling other scientists; it is a rare case when they have the necessary resources or skills.  With the bright shiny object of data-driven science and Big Data, do we have the Big Head wagging the body of the geoscientist community?  Xentity sees opportunities to develop funding strategies to execute collaborative, performance-based, cross-discipline geoscience. Funding has worked this way since World War II, when the government expanded its successful wartime model of onesy-twosy grants to universities. There has been some movement towards hub-and-spoke grant funding models, but we are still out to get our PhD stripes, pad our CVs, and keep working with the same folks. I know it is a surly and cynical view. OK, the reality is they are doing amazing work, but in their own fields, and anything that slows down that work, even for the greater good, lacks incentive.

Also, there are few true shared services that are managed and extended to the community as products and services should be. Data-driven science, our fourth paradigm of science, has been indirectly “demanding” that scientific organizations and their systems become 24×7 service delivery providers. We have been asking IT programmers to become service managers and scientists to become product managers or data managers.  With a few exceptions, it has not worked. Geoscientists are still struggling to find and use basic data and metadata, and to produce quality metadata for their own purposes (only 60% meet quality standards per EarthCube studies), let alone making the big leap to Big Data and analytics. Data-driven science requires not only a different business or operating model, but a much clearer definition of the program as well as scientists’ roles and expectations.  It requires new funding strategies, incentive models, and a service delivery model underpinned by the best practices of product management and service delivery.

Currently, and my favorite, there is limited to no incentive for most geoscientists to think beyond their immediate needs.  If geoscientists are to be encouraged to increase the frequency and volume of cross-discipline science, there need to be enablement services and interoperable data and information products that solve repetitive problems and provide an incentive for participation.  We need to develop the necessary incentive and management models to engage and motivate geoscientists, develop a maturity plan for the engineering of shared geoscience services, and develop resourcing strategies to support its execution. Is this new funding models, new recognition models, new education, gamification, crowdsourcing, increased competition, changed performance evaluation? Not sure, as any change to the “game” rules can, and usually does, introduce new loopholes and ways to “game” the system.

The concept of shareable geoscience data, information products, and commodity or analytical computing services has an existing operating precedent in the IT domain – shared services.  Shared services could act as a major incentive for participation.  An approach would identify the most valuable cross-cutting needs based on community stakeholder input. The team would use this information to develop a demand-driven plan for shared service planning and investment. As an example, a service-based commodity computing platform could be developed to support both the Big Head and the Long Tail, act as an incentive for participation, and perform highly repetitive data exchange operations.

How does one build and sustain a community as large and diverse as the geosciences? 

The ecosystem of geoscience is very complex from a geographic, discipline, and skill-level point of view. How does one engage so diverse a community in a sustainable manner?  “Increased visibility of stakeholder interests will accelerate stakeholder dialogue and alignment – avoiding ‘dead ends’ and pursuing opportunities.” The stakeholders can range from youthful STEM students to stern old-school emeritus researchers; from high-volume, high-frequency producers of macro-scale data to a single scientist with a geographically targeted research topic. It is estimated that 80-85% of the science is done in small projects.  That is an enormous intellectual resource that, if engaged, can be made more valuable and productive.

Here is a draft target value chain:

The change or shift puts a large emphasis on upfront collaborative idea generation, team building, knowledge sharing via syndication, and new forms of work decomposition in the context of crowd participation (citizen science and STEM).  The recommended change in the value chain begins to accommodate the future needs of the community.  However, the value chain only becomes actionable through the capabilities associated with the respective steps.  Xentity has taken the liberty of alliteratively defining these four classes of capabilities, or capability clusters, as:

Encouragement, Engagement, Enablement, and Execution.

Encouragement capabilities are designed to incentivize or motivate scientists and data suppliers to participate in the community and to garner their trust. They are designed to increase collaboration and the quality and value of idea generation, and they will have a strong network and community building multiplier effect.

Questions

  • How can new scientific initiatives be collaboratively planned for and developed?
  • How can one identify potential collaborators across disciplines?
  • How can one’s scientific accomplishments and recognition be assured and credited?
  • What are the data possibilities, and how can I ensure that they will be readily available?
  • How can scientific idea generation be improved?

Capabilities

  • Incentives based on game theory
  • Collaboration, crowd funding, crowd sourcing and casting
  • Needs analysis
  • Project management and work definition
  • Credit-for-work services

Engagement capabilities include the geoscience participant outreach and communication capabilities required to build and maintain the respective communities within the geoscience areas.  These are the services that will give the community the ability to discuss and resolve where the most valued changes will occur within the geosciences community and who else should be involved in the effort.

Questions

  • What participants are developing collaborative key project initiatives?
  • What ideas have been developed and vetted within the broadest set of communities?
  • Who, with similar needs, may be interested in participating in my project?
  • How can Xentity cost share?

Capabilities

  • Customer relationship management
  • Promotions
  • Needs analysis
  • Communications and outreach
  • Social and professional networking

Enablement capabilities are technical and infrastructure services designed to eliminate acquisition, data processing, and computing obstacles and to save scientists time and resources.  They are designed to solve frequently recurring problems that keep a wide variety and number of geoscience stakeholders from focusing on their core competency – the creation of scientific knowledge. Enablement services, if implemented and supported, will have a strong cost-avoidance multiplier effect for the community as a whole.

Questions

  • How does one solve data interoperability challenges for data formats and context?
  • How do I get data into the same geographic coordinate system or scale of information? (A minimal reprojection sketch follows the Capabilities list below.)
  • How can I capture and bundle my metadata and scientific assets to easily support publication, validation, and curation?
  • How can I get access to extensible data storage throughout the project lifecycle?
  • Where and how can I develop an application with my team?
  • How can I bundle and store my project datasets and other digital assets for later retrieval?
  • How can I get scalable computing resources to complete my project without having to procure and manage servers?

Capabilities

  • Workflow
  • Process chaining
  • Data interoperability
    • Data transformations
    • Semantics
    • Spatial encoding and transformation
    • Data services
  • Publishing
  • Curation
  • Syndication
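
To make the data interoperability and spatial encoding capabilities above a bit more concrete, here is a minimal reprojection sketch, assuming Python with the pyproj library; the coordinate reference systems and points are illustrative only, not tied to any particular program:

```python
# Minimal sketch: reproject observation points into a shared coordinate
# reference system so datasets from different suppliers can be overlaid.
# Assumes Python with the pyproj package; coordinates are illustrative only.
from pyproj import Transformer

# Source data published in UTM Zone 13N (EPSG:32613); target is WGS84 (EPSG:4326).
transformer = Transformer.from_crs("EPSG:32613", "EPSG:4326", always_xy=True)

utm_points = [(500000.0, 4400000.0), (505250.0, 4398300.0)]  # (easting, northing)

for easting, northing in utm_points:
    lon, lat = transformer.transform(easting, northing)
    print(f"UTM ({easting}, {northing}) -> lon/lat ({lon:.5f}, {lat:.5f})")
```

Wrapping this kind of routine transformation in a shared service is exactly the sort of repetitive, low-glamour work that saves the Long Tail time without asking them to become geodesists.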

Execution capabilities comprise the key management-oriented disciplines required to support shared infrastructure and services, or to help a highly federated set of valuable assets (“edges”) evolve to be more usable and valuable to the community over time.

Questions

  • How do we collectively determine what information might require a greater future investment?
  • What are the right incentives in the grant processes?
  • What are the future funding models?
  • What models should be invested in?
  • Which technologies should be evaluated for the shared assets?
  • What upcoming shared data or technology needs are common to a large number of participants?

Capabilities

  • Governance
  • IT Service Management (ITSM)
  • Product management
  • Performance management
  • Requirements management
  • Data management
  • Data supply management
  • Data lifecycle management
  • Funding
  • Grants and processing

So, why did we develop these classes of capabilities? 

They represent, at the macro level, a way to organize a much larger group of business, operating, and technical services that have been explicitly discussed in NSF EarthCube efforts over the last 3-4 years. We then derived these outputs from that analysis and associated them with the most important business drivers. Check out this draft relationship of capabilities, drivers, and rationale:

  • Engage
    • Rationale: The best way to create communities and identify common needs and objectives; begin to build trust and value awareness; bring the respective communities into an environment where they can build out their efforts and sustain collaborative approaches.
    • Drivers: Agency (how to navigate planned versus emergent change), intellectual property rights, infrastructure winners and losers, agreement on data storage, preservation, and curation policies and procedures, incentives to share data and data sharing policies, and trust between data generators and data users.
  • Encourage
    • Rationale: The best models to incentivize scientists and data producers to participate and collaborate. Xentity has developed game-theory-based approaches and large-scale customer relationship management solutions.
    • Drivers: Social and cultural challenges: motivations and incentives, self-selected or closely-held leadership, levels of participation, types of organizations, and collaboration among domain and IT specialists.
  • Enable
    • Rationale: The most costly data processing obstacles – the lowest common denominator, highest impact problems; a common problem found in shared service environments. We have developed enterprise service analysis tools for cost-benefit work in the DOI geospatial community, so we have seen this work.
    • Drivers: 80% of scientist data needs can be expressed as standard data products, and 80% of scientist time is spent getting data into the proper form for research analysis.
  • Execute
    • Rationale: A governance model that will increase the “edge effect” between legacy and future capabilities and a very diverse set of communities; simple planning capabilities that empower scientists to work complex cross-discipline ideas amongst themselves, define work, and coordinate with the power of the crowd. We have designed collaborative environments and crowd-based frameworks for data collection and analysis with corresponding performance management systems.
    • Drivers: Conceptual and procedural challenges: time (short-term funding decisions versus the long-term time-scale needed for infrastructures to grow) and scale (choices between worldwide interoperability and local optimization).

So why don’t we do it?

Well, this does introduce an outside approach into a close-knit geoscience community that is very used to solving problems for itself. Bringing in a facilitated method from outside consultants, or even teaming with agency operations that have begun moving down this route for their national geospatial data assets, is not seen as something that fits their culture. We are still learning hybrid ways we can collaborate and help the geoscientists set up such a framework, but for now it is still a bit of a foreign concept. While there is some willingness in the geoscientist community to adopt models that work in other sectors, industries, and operational models, the lack of familiarity is causing a lot of hesitation – which goes back to earning trust and finding ways to demonstrate value.

Till then, we will keep plugging away, connecting with the geoscience community in hopes that we can help them advance their infrastructure, data, and integration to improve earth science initiatives. Until that happens, we will remain one of the few top nations without an operational, enterprise national geoscience infrastructure.

How can we help geoscience to move their data to shared services


Most data generated by science is not intended to be used on broader-scale problems outside of the originating research or its specific domain. Let’s stick with geoscience to check this ‘hypothesis’ out and check on the ‘so what?’ factor.

Current grant and programmatic funding models are not designed to develop shared services or interoperable data for the geosciences.  There are few true shared services that are managed and extended to the community as products and services should be. “We broadly estimate that 20% of users of a dataset might be experts in the field for which it was created, while 80% might be others.” There is currently limited to no incentive for most geoscientists to think beyond immediate needs. “The culture of collaborative Science is just being established.”  Finally, there is no current clear way to build and sustain the large and diverse geosciences community.

The key, we believe, starts not with the tech, the money, the management, or the governance, but with stakeholder alignment, which can be described as “the extent to which interdependent stakeholders orient and connect with one another to advance their separate and shared interests”.

The stakeholders include the geoscientist community – both the Big Head and the Long Tail: geoscientists with affiliated government institutions and academic and international partners; data scientists; data and information stewards, curators, and administrators (content and metadata); data product and service managers; citizen earth science participants; and emerging geoscientists found in the STEM (K-12) community.

Additionally, the supply chain roles range from:

  • Data and Information Suppliers – NSF-funded centers and systems; programmatic producers of geoscience-related data and information, e.g., Earth observation systems like Landsat or MODIS, or specialized information systems that produce value-added products like NAWQA; Indonesia NSDI; DOI authoritative data sources and services
  • Cyber Infrastructure community / Development Collaborators – basic and applied research and development, software and system engineers, data managers and analysts
  • Infrastructure Management / Collaborators – IT Service Management (ITSM): managers and operators of the shared infrastructure and key software services in industry, commercial, government, research, and cost-sharing FFRDCs
  • Consumers reached via end-user workshops – public policy, regulatory, legal, and administrative analysts, the private sector, academia (non-participating states), and other science disciplines
  • Executive sponsorship and the various cross-cutting geoscience governance communities – too numerous to get into in this blog, at least

The stakeholder themes we have seen in data are generally the same.

These challenges echo the organizing themes Xentity supported developing seven years ago for the DOI Geospatial Services Architecture:

  • “I know the information exists, but I can’t find it or access it conveniently”, has its analog in “Considerable difficulties exist in finding and accessing data that already exists” 
  • “I don’t know who else I could be working with or who has the same needs”, has its analog in “Duplication of efforts across directorates and disciplines, disconnect between data and science; data graveyard –useless collection of data…”
  • “If I can find it, can I trust it?”, has its analog in “There is a need to evaluate consistency/accuracy of existing data.”

A start on this, to jump into boring consulting theory, is to develop a clear line of sight to address stakeholder needs and community objectives.

This ensures the analysis engages all the necessary dimensions and relationships within the architecture. Without a strategy like this, good solutions, business or technical, often suffer from lack of adoption or have unintended consequences and introduce unwanted constraints. The reason is a lack of alignment: technology innovators tend not to share the same view of what is beneficial, nor does the geoscientist, who is accustomed to enabling a single or small set of technology directives.

How does one create the shared enterprise view? Using the Line of Sight, our approach to architecture transformation and analysis creates the framework and operating model. It connects business drivers, objectives, stakeholders, products and services, data assets, systems, models, services, components, and technologies.  Once the linkages have been established, the team can create the conceptual design using 40-50 geoscience domain investment areas. This will effectively describe the capabilities of the existing IT portfolio.  The architecture and the portfolio will be designed to support governance and future transition planning.
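
To make the Line of Sight idea a bit more tangible, here is a minimal sketch, assuming Python with the networkx library; the node names are hypothetical examples, not actual EarthCube or DOI artifacts. It simply traces which downstream services, data assets, and technologies a given business driver justifies:

```python
# Minimal sketch of a "line of sight" traceability graph linking business
# drivers -> objectives -> services -> data assets -> technologies.
# Assumes Python with networkx; all node names are hypothetical examples.
import networkx as nx

los = nx.DiGraph()
los.add_edges_from([
    ("Driver: cross-discipline science", "Objective: interoperable data"),
    ("Objective: interoperable data",    "Service: data transformation"),
    ("Service: data transformation",     "Data asset: hydrography layers"),
    ("Data asset: hydrography layers",   "Technology: commodity computing platform"),
])

# Everything downstream of a single business driver -- the investments it justifies.
print(nx.descendants(los, "Driver: cross-discipline science"))

# One traceable path from a driver all the way down to a technology.
print(nx.shortest_path(los, "Driver: cross-discipline science",
                       "Technology: commodity computing platform"))
```

The same linkage, maintained at portfolio scale, is what lets a governance body ask which drivers an investment actually serves.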

Sample Ecosystem Edge Analysis

Human Edges (adaptive systems)

  • Citizen scientists/STEM and professional scientists
  • Geoscience as consumer and producer of data and information
  • Individual science and collaborative science
  • Science ideation: piecemeal or segmented vs. holistic
  • Individual vs. collective impact and credit
  • Governance rigidity and flexibility
  • Earth science and cyber-infrastructure and engineering

Data and Information Edges

  • Data supply and information products/services
  • Possessing the data and accessing the data
  • Macro-scale data vs. micro-scale data
  • Five data dimensions – spatial (x, y, z), temporal, and scale
  • Authoritative sources vs. free-for-all data
  • Data and models vs. products
  • Long Tail vs. Big Head data

Computing and Infrastructure Edges

  • Centralization and federation of computing infrastructure
  • Commodity computing vs. analytical computing
  • Mission-driven systems and shared services access
  • Domain systems vs. interoperability frameworks
  • Systems vs. managed services
  • Big Head and systematic data collection vs. project components

The Line of Sight allows for exploring the complexities of geoscientist “ecosystem edges” and for architecting greater interaction and production in the geosciences.   Those in the “Long Tail” encounter the same cross-domain access, interoperability, and management barriers as the “Big Head”. Neither has the incentive to develop common enabling data interoperability services, scalable incentive solutions, or common planning approaches, or to increase the participation of the earth science community. Xentity believes architecture is an enabling design service.  It is used to empower the user community with the tools to expand its capacities. In this case, Xentity will provide the operating model and architecture framework in a conceptual design to bring together the currently unattended edges.  In the long run, the models will provide the emerging governance system the tools to develop investment strategies for new and legacy capabilities.

The Broader Impact

At its core, we believe the geoscience integration challenge is to exploit the benefits and possibilities of the current and future geoscience “ecosystems edge effect”.  In the ecosystem metaphor, the conceptual design approach will target the boundary zones lying between the habitats of the various geoscience disciplines and systems.   What is needed is an operating model, architectural framework and governance system that can understand the complexities of a geoscientist shared environment and successfully induce the “edge effect”.  It needs to balance the well performing aspects of the existing ecosystem with new edges to generate greater dynamism and diversification for all geosciences. 

An Operating Model example: Collaborative geoscience planning could make a good demonstration case for the benefits of the “edge effect”.  A lot of science efforts are driven by large-scale programs or individual research groups who have very little knowledge of who else may be working in the same environmental zones or geographies, or even on related topics.  A shared planning service could put disparate projects into known time, location, and subject contexts, accelerate cross-domain project resource savings, and develop the resulting interdisciplinary cross-pollination required to understand the earth’s systems.  An enterprise geoscience initiative could provide a marketplace for geoscientists to shop around for collaborative opportunities.  The plans can be exposed in a marketplace to other resources like citizen scientists or STEM institutions.  The work can be decomposed so that environments like Amazon’s Mechanical Turk can post, track, and monitor distributed tasks.
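
As a rough illustration of what such a shared planning service could check, here is a minimal sketch in plain Python; the project names, bounding boxes, dates, and keywords are hypothetical, not drawn from any real program:

```python
# Minimal sketch: flag potential collaborations between planned projects that
# overlap in time window, bounding box, and subject keywords.
# All project data below is hypothetical.
from dataclasses import dataclass
from datetime import date
from itertools import combinations

@dataclass
class ProjectPlan:
    name: str
    bbox: tuple          # (min_lon, min_lat, max_lon, max_lat)
    start: date
    end: date
    keywords: set

def bbox_overlap(a, b):
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def time_overlap(p, q):
    return p.start <= q.end and q.start <= p.end

plans = [
    ProjectPlan("Snowpack chemistry survey", (-107.0, 37.0, -105.0, 39.0),
                date(2015, 5, 1), date(2016, 5, 1), {"hydrology", "snow"}),
    ProjectPlan("Alpine watershed modeling", (-106.5, 38.0, -104.5, 40.0),
                date(2015, 9, 1), date(2017, 1, 1), {"hydrology", "modeling"}),
]

for p, q in combinations(plans, 2):
    shared = p.keywords & q.keywords
    if bbox_overlap(p.bbox, q.bbox) and time_overlap(p, q) and shared:
        print(f"Possible collaboration: {p.name} <-> {q.name} "
              f"(shared topics: {', '.join(sorted(shared))})")
```

Even a simple overlap check like this, run across a registry of planned projects, is enough to surface candidate collaborations before field seasons and budgets are locked in.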

By recognizing these edges, the architecture will create greater value and energy from the disciplines, improve the creativity, strength, and diversity of ideas, and mitigate disruption.  An ecosystem-like design that balances the Big Head with the Long Tail will enable more cost-effective geoscience projects and create a higher return on IT investments while collapsing the time to conduct quality, impactful science. Most importantly, this will accelerate the realization of the science’s impact on other dependent scientific initiatives and shorten the time to develop and implement policy.  Xentity sees the potential to use this and other “ecosystem edges” to transform how geoscience is currently conducted.

Xentity believes a geoscientist, whether emerging (STEM) or emeritus, would be willing to participate in a cross-cutting, shared service model based on how well these edges are architected and governed.  If designed and operated effectively, the edges will create an environment that addresses the two key barriers to adoption: trust and value.  In essence, we see the scientists as both consumers and producers.

  • As consumers of data, information and knowledge products and technology services, they are continuously looking to create more knowledge and contribute to social benefit. 
  • As producers they contribute data, information and knowledge back into their colleagues’ knowledge processes.  

In fact, the predominant challenge for such an approach is that the shared service will be developed by the community, who are themselves consumers. Just like any other consumer, they will have expectations when they purchase or use a product or a service.  If one cannot uphold the terms and conditions of product quality or a service agreement, one loses the consumer. So, how does the architecture ensure these “edges” develop and evolve?  It must ensure:

How to earn Geoscientists’ Trust

The scientists need to know that they will have highly reliable technical services and authoritative data that are available and perform well when requested.  Most importantly, they will need to influence and control who conducts the work within the shared environment and how it is conducted. They need to be assured of the quality of the science and of appropriate credit.

How to demonstrate the value to the Geoscientist:

The scientists need the provider to deliver the right products or services – those that will eliminate the most significant barriers and constraints to doing more and higher-quality science (research, analysis, and experimentation) with less effort.

In the short term, the shared service challenge is to earn the scientists’ trust and identify the optimal suite of products and services to provision value from the “community resources” as defined in the Layered Architecture. For land elevation products, up to 80% of the requests are for standardized products. If done correctly, the governance system, operating model, and architecture framework will develop trust and value recognition from the shared community.  In the longer term, the models and framework will guide the redirection of its limited resources towards an interoperable set of systems, processes, and data.

Great, but even if we create this, how do we fund it?

See the next part on “Will geoscience go for a shared service environment” which discusses ways to address funding, ways to engage, encourage, enable, and support execution of these enterprise capabilities for geoscientists. 

Geo is more than a dot on the map


Geo is more than a dot on the map – it is the next progression in information. Xentity has combined the services below to maximize the value of geospatial programs, products, systems, and workforce. We have found common, re-usable, and tailorable patterns and issues applicable to small and large public and private institutions alike.

 

Xentity’s staff has been providing geospatial strategy, planning and solution expertise to large public and commercial organizations for the last 15 years. Our goal is to maximize the value of geospatial products, services, systems, technologies and data for our clients and their customers. We have observed, first hand, patterns of critical issues that have hampered our customers’ performance. These issues are found in the Geospatial business areas of Geospatial Planning, Production and Data Lifecycle Management, and Service Delivery as well as in advising the executive or program leadership to manage and execute such complex and complicated changes. 

We have quite a diverse Geospatial Client Portfolio from:

  • Geospatial Government Programs: NGA, DOI, USGS, BLM, EPA, The National Map, Core Science, Indonesia SDI
  • Private Sector: Space Imaging (GeoEye), Intrawest, Denver Metro Utilities, Cloud-based Geospatial CRM

Our approach and expertise have been developed by providing impactful analysis resulting in practical solutions to these patterns. Our proven approach to problem solving encompasses the best practices of management consulting and enterprise architecture planning, coupled with geospatial domain knowledge.

Our Geospatial services are:

  • Geospatial Planning Analysis Service
  • Geospatial Production and Data Lifecycle Management Analysis
  • Geospatial Service Delivery Analysis
  • Overall Geospatial Strategic and Tactical Management Consulting

Geospatial Planning Analysis Service

Customer Relationship Management: Work with your customers to establish clearly defined needs and the benefits to help you prioritize your efforts and create greater value for your products and services.

  • For the US Geological Survey Land and Remote Sensing program and the National Geospatial Program, Xentity led the development of business practice changes that focused on bringing external customer segments into the program planning processes, leading to technological and product innovations and improved communications.

Cost Benefits Analysis of Geospatial: Optimizing the value of geospatial assets is often very complex due to their versatility and diverse uses. Organizations typically have numerous and varied stakeholders with similar requirements to exploit these valuable data and powerful technologies.  Xentity offers executive-level planning and advisory services to clients to identify value opportunities, assess their strategic importance, and measure subsequent performance.

  • At the US Department of the Interior, Xentity provided a cost benefit study to establish the return on investment for service deployment – the analysis yielded 10:1 benefits.

Data Acquisition and Quality Planning: Align your program data requirements with available partners and volunteer data, and ensure the data flows through the system in an efficient, qualified, and cost-effective manner. Ensure you retain data in compliance with existing records, archive, and use policies.

  • Developed and implemented several acquisition strategies for the USGS National Geospatial Program that led to improved data quality and long-term maintenance for multiple-scale datasets.

Funding Planning:  These are challenging economic times that have seen a considerable decrease in government geospatial funds even when the value of geospatial information has not been fully exploited.  Now more than ever it is critical to be able to determine and communicate the value and impact of geospatial activities internally, to your customers and with your executives. 

  • For USGS and use of FGDC CAP Grants, Xentity designed overall acquisition planning model that linked production tracking, multiple purchasing authorities, and locations of existing inventory and purchase targets to help increase ROI on geospatial data acquisition.

Geospatial Production and Data Lifecycle Management Analysis

Concept of Operations Development – Evaluate your existing production processes for new, more cost-effective ways to move higher volumes of data through the system. Minimize the number of times the data is accessed.

  • For the Indonesia BAKOSURTANAL NSDI, Xentity staff designed a shared production environment across disparate incubating programs. Xentity has done similar work for DOI, BLM, USGS, EPA, and numerous private firms.

GeoData Quality Lifecycle Analysis – Synchronize the sources to increase the integration qualities of the data or to improve the quality (accuracy, currentness) of the data.

  • US National Geospatial Program – using DLCM, sourced data from the same suppliers to meet multiple program goals and provided consistency to the user community.

Geodata Use Preparation – Optimize how to prepare data for service consumption and use as a part of your production flow.

  • BAKOSURTANAL instituted a publishing model where data was integrated across 17 ministries prior to catalog publishing

Service Delivery Analysis

Xentity has provided architectural services to the USGS, DOI and Data.gov on geospatial metadata management, discovery and visualization. We have also conducted information sharing on re-usable information service delivery patterns with multiple agencies and international bodies such as Canada, Australia, and Indonesia

Geospatial Metadata Discovery Architectures – Improve how your data and products are described (metadata), discovered (cataloging), accessed, and used in the online world. Integrate your efforts with open government initiatives.

  • For USGS Core Science Systems, amongst 4 major tenets, Xentity created a path/blueprint for moving to an integrated clearinghouse and harvest solution joining sciencebase.gov and an existing metadata clearinghouse. Worked with Geospatial One Stop, Data.gov, and USGS ScienceBase catalogs in support of digital data delivery.
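
To give a flavor of what discovery against such a clearinghouse looks like in practice, here is a minimal sketch, assuming Python with the OWSLib package; the catalog endpoint URL and the search term are placeholders, not actual USGS or Data.gov endpoints:

```python
# Minimal sketch: query a CSW (Catalogue Service for the Web) metadata catalog
# for records matching a keyword. Assumes Python with OWSLib installed; the
# endpoint URL and search term below are placeholders, not production endpoints.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://example.gov/geoportal/csw")  # placeholder URL

# Full-text constraint against any metadata field.
keyword = PropertyIsLike("csw:AnyText", "%elevation%")
csw.getrecords2(constraints=[keyword], maxrecords=10)

for record_id, record in csw.records.items():
    print(record_id, "-", record.title)
```

The same pattern, pointed at a harvested clearinghouse, is what lets downstream portals stay in sync without hand-curated catalogs.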

Geospatial Delivery Application Architecture – Implement standardized application frameworks and specifications to support data access and rapid transition to new products and services or delivery methods.

  • Solution Architecture design for National Geospatial Program and BAKOSURTANAL for data services and delivery.

Geospatial Delivery Solutions Architecture – Design the access methods for online service delivery or download for all forms of products and services (vector, raster, data file formats, big data), from real-time service/stream access for application integration patterns, to bulk cloud computing, to big data discovery search indices with geospatially integrated interpretive signaling.

  • For USGS delivery, moving to an eCommerce model for products, staging products, and improving web service access, architecture, and standards for application integration has yielded significant IT cost reduction and a 10% monthly increase in product download and service access use over 4 years.

Geospatial Delivery Prototype & Research Application Development – Our architects and developers have capabilities to bring designs to life. Experts in prototyping and early phase technology selection and demonstration in BigData, Visualization, Modeling, Service, Discovery, Semantics, and informatics solutions.

We blend this with our Agile Project Business Management capabilities for rapid planning, scrum management, sprint planning and tracking, and tying back to the aforementioned requirements, line of sight, levels of architecture, and ITIL deployment requirements.

Our goal is to provide services, software tools, and automation procedures to assist in continued appropriate innovation and to keep your research, prototypes, or beta on track.

  • USGS initial The National Map base map development in early research
  • USGS Cloud Migration analysis – a few dozen pilots in storage, inventory, services, APIs, big data, basemaps, and more
  • Rapid prototyping of big data indices and interpretive signals for millions upon millions of records and complicated discovery requirements, replacing traditional RDBMS
  • Visualization Application stacks and mashup developments in ESRI JS API, CartoDB, OpenLayers, ArcGIS Online, CKAN, Socrata, ESRI GeoDataPortal, PostGIS, Leaflet, GeoServer, and many more JS APIs, map servers, and basemap engines.

Geospatial User-Centered Designs – Improve how your users access your digital data and mapping products and keep them aware as your content changes. Design solutions that have notification models for areas of interest, are topically aware, and balance popularity and other qualities and repeatable patterns unique to geospatial data.

  • Space Imaging provided spatial notifications to customers based on area, product and intended use criteria
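
A minimal sketch of the core check behind such a spatial notification model, assuming Python with the shapely library (the product footprint, subscriber names, and areas of interest are hypothetical):

```python
# Minimal sketch: notify subscribers whose area of interest (AOI) intersects the
# footprint of a newly published product. Assumes Python with shapely installed;
# all geometries and subscriber names are hypothetical.
from shapely.geometry import box

# New product footprint as a bounding box (min_lon, min_lat, max_lon, max_lat).
new_product_footprint = box(-105.5, 39.5, -104.5, 40.5)

subscribers = {
    "Front Range hydrology team": box(-105.8, 39.8, -105.0, 40.2),
    "Gulf Coast storm surge team": box(-95.0, 28.0, -93.0, 30.0),
}

for name, aoi in subscribers.items():
    if aoi.intersects(new_product_footprint):
        print(f"Notify {name}: a new product overlaps their area of interest")
```

In a production design the same intersection test would be combined with product type and intended-use criteria before a notification is sent.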

Geospatial Vendor Product Evaluation & Reseller Specialization – Xentity has evaluated many geospatial product lines in geospatial data, production, product generation, service platforms, and applications/APIs. Xentity has capability to evaluate product lines as well as re-sell products.

  • Xentity has evaluated geospatial products and architectures in open source, COTS, and GOTS with and for USGS, GSA, DOI, a Salesforce AppExchange company, a Denver metro utilities company, and more. See Xentity Partners for products Xentity can resell today.

What makes Geospatial so different?

Why do we promote spatial at such a level? Why do we focus on spatial data science? Read on…

Transformation cannot survive without a Powerpoint intervention


We’ve subscribed to the Google Groups “The Enterprise Architecture Network” group for several years now, mostly lurking.  Over that time, we definitely have been able to put great items in our “shopping cart” and get a sense of the tenor of the state of EA. Our methods and work products are becoming more streamlined – at varying paces of satisfaction. Our work products are improving with training and information availability. But in that time, much like our U.S. policy on EA, which hasn’t changed since the pre-internet Clinger-Cohen Act of 1996, neither have our EA communication mediums.

We could blame ourselves, as we do, since we are natural evaluators and we tend to work on content, methods, and arguing approaches – we look at the hard science. This is not uncommon, as EAs tend to come from IT, which is about logic, organization, and somehow dealing with rapid transformation.

But the executives, the ones we work for, are still struggling with how to layer cultural and business transformation on top of rapid technology transformation.

I’d offer that we are looking in the wrong place. Yes, our methods, frameworks, and tools could improve – a lot! But it’s the softer side – how we present and what we present to which audience – that I believe is killing enterprise, segment, portfolio, or any large transformation effort with enterprise architecture seeking to be at the forefront.

We’d offer an initial focus change for EA and for transformation in and of itself: change our communication mediums.

Corporations and government are in lust with slideware, so we use it. To the point: PowerPoint is killing transformation analysis, EA, other complex topics, and our core architecture concepts. PPT is too sequential and too outline-oriented. It’s been used, abused, and most of the time we reach for it in knee-jerk fashion.

How we get them off slide-meth (or pick your drug metaphor) is the trick.  Call me a Tuftian (search “Edward Tufte PowerPoint”).

Getting people to care about Transformation and Architecture

Anyhow, if we want people to care – back to the title of the thread in Google Groups – I think the content from this group and the general community is all well and good in many cases, but we are in need of a branding enema.

EA is in desperate need of marketing and branding. The efforts of the EA Google Group community and across the spectrum are a good basis of content, but finding the audience to engage is our interest.  We need a way to focus on storytelling the output to multiple audiences. Anyone interested in some new mediums for presenting EA or transformation concepts?

To be very fair, at my company we’re a big offender as well – buzzword bingo, writing too much (a constant battle with the Twain/Pascal quote “If I Had More Time I Would Write a Shorter Letter”), going too far down the path ahead of the client, pushing, etc., while balancing challenging the client, helping them challenge themselves, simplifying complexity, and seeking what most people fear – change.

So, as a start, we have beta’d some variant approaches with some clients:
  • Early phases of a PowerPoint exit strategy
    • Weaning off PowerPoint with a 1-pager and a graphic and more discussion time, with no presentation (maybe some highlights)
    • Switching PowerPoints to be graphical only/mostly and using a whitepaper compendium so attendees can read for themselves when discussing the graphic
  • Increasing the communication role over work products
    • Treating communication products as near and dear as our architecture work products when organizing by audience level and focus area (choose your favorite framework when considering that)
    • Interactive data visualization for self-exploration with an executive summary – make it fun to see the line of sight
  • Agile management and its communication mediums/tools – notification tracking with plans as they are implemented in Program Management Offices
    • Minor little things like allowing users to reply to a notification without having to log in
    • Integrating content wikis with data repositories to increase exploration possibilities and context, which allows notifications on discussions
    • Shifting email discussions to comment threads on issue tracking and content wikis, which further increases transparency and exposes transactive knowledge hidden in emails or closed group threads
  • Storytelling with the goals, outcomes, impacts, and other high points and denouement
    • This can be done using a low-cost, high-quality mini-documentary (TLC/History Channel quality) style that tells the story of programs and their changes from customer and investor points of view (rather than self-serving ones)
    • Blog more – storytell more – bits and pieces at a time; rather than leaving gems hidden in industry groups, open up the transparency and inclusivity of the topics

But we would like to take this a step further.

Here is an example of a mini-documentary, one in a series of four, that helps tell the story of the client’s program, its value, and the underlying tone of change it has undergone, is undergoing, and has planned:

More can be seen at Communication Services (Video Series Example), which further describes: These simple stories can show and explain the value of complex programs, products, services, solutions, and systems in ways that PowerPoints, whitepapers, or multiple conferences cannot achieve. These mediums have also helped convey messages, even during times of pressure to reduce travel.

We’d like to do a video series for transformation and EA, not unlike what Penn State did for geospatial with their Geospatial Revolution Series – http://geospatialrevolution.psu.edu/ . It would be interesting to pool partners and efforts, gather up funding, and put together a series (process disclaimers aside) using mediums that connect EA and transformation in this new light. Anyone interested can keep posted on threads in the Google Groups “The Enterprise Architecture Network” group.

The Surprising Reasons Why America Lost Its Ability To Compete


Our Architecture Services Lead found this interesting Forbes article, “The Surprising Reasons Why America Lost Its Ability To Compete”, written by several Harvard Business School MBA alumni. The article ultimately calls out management, not external factors, as the reason for the failure.

 if there are disastrous shortfalls in the ability to compete, then surely the quality of management itself—the art and science of getting things done—must have a lot to do with it

Specifically, the focus on the short term and the blaming of external factors. At Xentity, we agree. We understand that management in the private and public sectors faces pressing issues in keeping the organization within budget (public sector) and maintaining shareholder margins (private sector), but without an investment in out-year and next-generation transformation, workforce, and research, the bailing-water approach to management will not allow the organization to survive without adaptation. The article outlines:

  • Management is trending toward blaming external factors instead of innovating, adapting, and overcoming.
  • Management has shifted to a short-term focus on today’s numbers, versus investing in shared resources and pooling for the longer haul.
  • Managers have focused innovations and transformations more on cost-efficiency and cost reduction and less on adding value and increasing relevancy.
    • Management education is partly to blame for focusing on short-term financial outcomes.
    • Management shifted to maximizing shareholder outcomes while ignoring stakeholder needs.
  • Instead of focusing on workforce/talent strategy and research, management continued focusing on short-term needs.
  • Management can complain about government and external factors, but unless management finds a way to move beyond short-term needs, there is little that government execution of new policies can do to stimulate growth.
  • Management didn’t mention the customer once in the report. C-level types have lost sight of understanding the communities of use and supply and of understanding their market.
    • Management has lost the ability to look back at the purpose of the program – to create the customer and balance that with shareholder value.

These observations from the study are very much in line with Xentity’s published list of anti-patterns and core architecture concepts on transformation. As we published back in 2008, our concepts are biased towards the next “generations” concept. The solutions recommended by the article generally align with our focus on change as well:

Achieving continuous innovation and customer delight lies outside the performance envelope of firms that are built on hierarchical bureaucracy and focused on short-term gains and the stock price. It requires a fundamentally different way of leading and managing—in effect, a paradigm shift in management. It means:

Harvard Study management shifts, paired with Xentity’s core concepts on addressing change:

  • A shift from controlling individuals to self-organizing teams. – We are growing partners.
  • A shift from coordinating work by hierarchical bureaucracy to dynamic linking. – We think big on change, while changing small bits at a time.
  • A shift from a preoccupation with economic value to an embrace of values that will grow the firm. – We support executives in transforming their visions into action.
  • A shift from top-down communications to horizontal conversations. – We share our concepts and supporting assets openly.

The article’s solutions wrap up with balancing a shareholder/budget-interest focus with a stakeholder/relevancy focus.

The article also had some follow-on reads relating to this problem that reinforce many of its points.

In the private and public sectors, the management challenge is the same – external factors continually battle against the mission, but management responds in the same way: short-term cost-efficiency or cost-reduction approaches focused only on the shareholder (private) or year-to-year budgeting (public). Management is not finding ways to balance short-term needs with long-term relevancy, and only education and leadership can address that – not waiting for external factors to make it easier.