Why we focus on spatial data science

Blog post edited by Matt Tricomi

The I in Information Technology is broad – so why does our first integrated data science focus land on spatial data? It doesn't seem to fit at face value when looking at our Services Catalog. We get asked this a lot, and this is our reason. Like geospatial itself, the answer is multi-dimensional, spanning different ways of thinking, audiences, maturity progressions, science, modeling, and time:

 

In green, along the x-axis, is the time progression of public web content. The summary point is that the data period took the longest – about 10 to 15 years – and data can only get better as it matures, now roughly 25 years old on the web. We are in the information period now, but moving swiftly into the knowledge period. Just look at how much more scientific our data visualizations have become, and how dependent we are on the internet. Think how much you were on the web in 1998 compared to 15 years later – IT IS IN YOUR POCKET now.

This isn’t just our theory.

RadarNetworks put together a visual of the progression through the web eras. Web 1.0 was websites – Content and early Commerce sites. Web 2.0 raised the web community with blogs, and the web began to link collaboratively built information with wikis. Web 3.0 is ushering in the semantic direction and building integrated knowledge.

Even scarier, Public Web Content progression lags several business domains – Intelligence, Financial, Energy, Retail, and Large Corporate Analytics – though not necessarily in that leading order. Meaning, this curve reflects public maturity; those other domains have different and faster curves.

Consider the recent discussions on intelligence analysis linking social/internet data with profiles, on Facebook/Google privacy and its use for personalized advertising, on the level of detail Salesforce knows about you (and why companies pay so much per license/seat), on how energy exploration optimizes where to drill in harder-to-find areas, or on the absolute complexity and risk of financial derivatives as the world market goes. How we integrate public content – googling someone, or using the internet to learn more and faster – usually lags those technologies. The reason: the public uses do not make money. It is the same reason the DoD invented the internet – it was driven by the security of the U.S., which makes money, which makes power.

So, that digression aside (as we have been told, "well, my industry is different"), the public progression does follow a curve that matches Moore's Law, the driving factor in IT capability: every two years, computing power doubles at the same cost (paraphrasing). The fact that we can do more, faster, at the same quality levels means we can continue to increase our complexity of analysis, shown in red. And there appears to be a stall: we keep moving toward knowledge, but not toward wisdom. It is true our knowledge will continue to increase VERY fast, but what we as a society do with it is the "fear" as we move toward this singularity so fast.
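
As a back-of-the-envelope illustration of that doubling – a minimal sketch assuming a clean two-year doubling period, not a precise industry figure – the compounding is easy to compute:

    # Moore's Law, paraphrased: computing power doubles every ~2 years at the same cost.
    def capability_multiplier(years, doubling_period=2.0):
        """Estimated multiple of computing power after `years` have elapsed."""
        return 2 ** (years / doubling_period)

    # The 1998-to-2013 span mentioned above: roughly 180x the computing power,
    # which is how the web moved from the desktop to your pocket.
    print(round(capability_multiplier(15)))  # -> 181

That compounding, not any single breakthrough, is what keeps pushing the red analysis-complexity curve upward.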

Fast is an understatement – very fast, even for an exponential progression – as it is hard to convey and digest the magnitude of just how fast it is moving. We moved from

  • the early 90s: simply putting history up on the web, experimenting, with general content, loose hyperlinking, and web logs
  • to the late 90s: conducting eCommerce, doing math/financial interaction modeling and simulations, and building product catalogs with metadata that allowed us to relate items – if a user liked that quality or metadata in something, they might like something else over here
  • to the early 2000s: engineering solutions, including social and true community solutions that began to build on top of relational data and the network effect, use semantics, and continually share content on timelines – along with where a photo was taken, as GPS devices began to appear in our pockets
  • to the 2010s, or today: looking for new ways to collaborate, finding new discoveries in the cloud, and using the billions and billions of sensors and data streams to create more powerful, more knowledgeable applications

Another way to digest this progression is via the table below.

Web Version | Time | DIKW | Web Maturity | Knowledge Domain Leading Web | Data Use Model on Web | Data Maturity on Web
0.9 | early 90s | Data | Content | History | Experimental | Logs
1.0 | 1995 | +Info | | History | Experimental | Content
1.1 | 1997 | | | Math | Experimental | Relational
1.2 | 1999 | | +Commerce | Math | Hypothetical | Metadata
1.3 | 2002 | | | Engineering | Hypothetical | Spatial
2.0 | 2005 | +Knowledge | +Community | Engineering | Computational | Temporal
2.1 | 2010s | | | Engineering | Computational | Semantic
3.0 | 2015+ (the predictable web) | Knowledge | +Collaboration | Science | Data as 4th paradigm | TempoSpatial (goes public)
4.0 | 2020-2030 | Wisdom in sectors | Advancing Collaboration with 3rd-world core | Advancing Science into Shared Services (Philosophical is out-year) | Robot/Ant data quality | Sentiment and Predictive (goes public/useful; Sensitive is out-year)

Now, think of the last teenager who could maintain eye contact in a conversation with an adult while holding a phone, without being distracted by the Pavlovian response to a text, tweet, Instagram, etc. Now imagine ten years from now, when it is not tidbits of data but, as a call comes in, auto-searches on terms they are not even aware of appearing in augmented reality – advice on how to react to the sentiment they just received, not just the information. The emotional knowledge quotient will be googled – "What do I do when?" – versus critical thinking and living and learning.

So, taking it back to the "now" – though this blog post lacks specific citations (blogs let us cheat, but our research sources will detail and source our analysis) – if you agree that spatial mapping for professionals arrived in the early 2000s, agree that it has now hit the public, and understand that spatially tagging data has passed the tipping point with the advent of smartphones, map apps, local scouts, augmented-reality directions, and multi-dimensional modeling integrating GIS and CAD with the web, then you can see that the data science maturity stage with the largest impact right now is – Geospatial.

Geospatial data is different. Prior to geospatial, data is non-dimensional: it has many attribute and categorical facets, but it does not have to be stored in a mathematical or pictorial form with a specific relation to a position on Earth. Spatial data – GIS, CAD, lat/longs – has to be stored numerically in order to calculate upon it, and furthermore it has to be related to a grounding point. Essentially, geospatial means storing vector maps or pixel maps. When you put that together for tens of millions of streams, you get a very large, complicated, spatially referenced hydrography dataset. It gets even more complicated when you overlay 15-minute time-based data such as water attributes (flow, height, temperature, quality, changes, etc.). It is more complicated still when you combine that data with other dimensions, such as earth elevations, and need to relate across domains of science that speak different languages in order to calculate how fast water may carry a certain contaminant down a slope after a river bank or levee collapses.
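
To make "stored numerically and related to a grounding point" concrete, here is a minimal sketch (the function and coordinates are our own illustration, not any particular GIS package): even the simplest spatial question – how far apart are two points? – only works because the coordinates are numbers referenced to an Earth model:

    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two lat/long points, grounded to a
        # spherical Earth model with a mean radius of 6371 km.
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # Denver to Washington, DC: roughly 2,400 km.
    print(round(haversine_km(39.74, -104.99, 38.91, -77.04)))

None of this is possible with purely categorical data, and the moment you add time steps and elevation, every record needs this same numeric grounding in more dimensions.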

Before we can get to those more complex scenarios, we must recognize that geospatial data is the next progression in data complexity.

That said, definitely check out our Geospatial Integrated Services and Capabilities.

Macro versus Micro Geospatial Data Value

Blog post edited by Matt Tricomi

Kudos to the Canadian Government – NRCAN – for trying to get a clearer understanding of the economic significance and industry status of the geospatial data products, technologies, and professional services industry: http://www.nrcan.gc.ca/earth-sciences/geomatics/canadas-spatial-data-infrastructure/cgdi-initiatives/canadian-geomatics-0. The Boston Consulting Group (BCG) produced a somewhat similar economic impact assessment for the United States in April 2013: http://geospatial.blogs.com/geospatial/2013/04/contribution-of-geospatial-services-to-the-united-states-economy.html

We geo-bigots working to support the public sector intuitively and intellectually realize geospatial's potential value, and are continuously frustrated by our lack of collective ability and capacity to make its implementation more immediate, simpler, and more powerful. While these macro-economic pictures are interesting and useful, they do not seem to influence micro-level government decision making. The BCG report "cautions that to continue this growth will require sustained public- and private-sector cooperation and partnership, open policies governing collection and dissemination of location-based data and increased technical education and training at all levels".

The intent of the US Federal Geographic Data Committee (FGDC) "GeoPlatform" and Canada's FGP, if supported by the proper policies and data management practices, could simplify the data quality and acquisition challenges resident in our hyper-federated geospatial data environment. Ironically, in the emerging and soon-to-be-overwhelming information- and knowledge-based economy, we are still struggling to manage data content. Geospatial will not break into program and mission operations until business leadership fundamentally adopts information-centered performance objectives as a part of its organizational culture.

Geospatial has always been an obtuse concept to classify, evaluate, or pigeonhole into a nice neat framework, let alone to value in national economic terms. For similar reasons, within the US Federal government, "geospatial" has struggled to find an organizational position that would let its potential value be maximized. This is partly due to data structure complexity: its form, resolution, temporal range, scale, geometry, and accuracy qualities have created an artificial "boundary" that organizations are having a hard time navigating out of – at least until the "director" sees the "map" or the "picture", and then it is the silver bullet.

Here at Xentity, we want to start framing the discussion about how to exploit geospatial value at the micro, or organizational, level and begin to guide our customers to sustained, geospatially driven business improvements. Our initial cut at how to break the field down is found in the diagram. How can this be improved upon?

How Open Data Contributes Toward Better Interagency Collaboration and Orchestration at all Levels

Blog post edited by Matt Tricomi

In a recent email thread between Xentity, NASCIO, and members of Smart Lean Government, the following thoughts on open data were offered by Eric Sweden, Program Director, Enterprise Architecture & Governance at the National Association of State Chief Information Officers (NASCIO), and are republished with his permission:

I believe open data contributes toward better interagency collaboration and orchestration at all levels – provided PII is specifically removed from open data initiatives, as it must be. But there is a place for open data in serving the individual needs of citizens – for example, clinical epidemiology: employing population data, and even specific population data, in evaluating prognosis and treatment regimes. Think of the value in public health and medical services to underserved populations AND really anyone else. Trends, patterns, and correlations will surface for a similar approach/strategy in other government lines of business – we're just at the brink of this kind of data exploitation.

I'm looking beyond life events and also considering the complete Smart Lean Government concept. Life events are a critical element, but there are also events abstracted up from individuals to communities. So we move up an upside-down pyramid from life events to "community events" or "community issues." Consider open data – and the larger concept of open government – as enabling better government, and thus a necessary part of Smart Lean Government. Think about how government is able to work better together in collaboration, which leads to sharing data and information.

For example, the Minnesota Department of Public Safety and Department of Transportation are working together to draw the necessary correlations between crash data (from DPS) and speed, road-condition, and weather data (from DOT) to develop a strategy for safer roads and highways.
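
As a hypothetical sketch of that kind of cross-agency join – the file names and columns below are invented for illustration and imply nothing about either agency's actual schemas – the mechanics are simply a shared key and a merge:

    import pandas as pd

    # Hypothetical extracts: DPS crash records and DOT road/weather conditions,
    # both keyed by road segment and hour.
    crashes = pd.read_csv("dps_crashes.csv", parse_dates=["hour"])
    conditions = pd.read_csv("dot_conditions.csv", parse_dates=["hour"])

    # Join the two agencies' data on the shared keys...
    merged = crashes.merge(conditions, on=["segment_id", "hour"], how="inner")

    # ...then surface correlations, e.g. average crash counts by road condition.
    print(merged.groupby("road_condition")["crash_count"].mean())

The open data part is what makes the shared keys and the exchange possible at all; the analysis itself is ordinary.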

This particular example resonates with the "Imperatives of 21st Century Government Services" from volume one of the practical guide, and with steps 1-4 of the "Sustainable Shared Services Lifecycle Model" from volume two.

This example is at the community event level – but impacting every individual and family that uses those roads and highways.

Flipping the Educational Value Chain

Blog post edited by Wiki Admin

Businesses, governments, and even many non-profits have benefited from the windfall of a flattening world – less war, a trend toward better resource distribution, new business models, digital economy proliferation, a shared workforce. Education has not.

At Xentity, exploring NextGen transformation using architecture, analysis, and design is not about IT. IT is a core component, but we are looking at how the next generation will progress and transform. And with generational lines becoming more of a blur, this isn't a 30-year delay, or even a 15-year delay; in some cases, we are talking 5 to 10 years for the transformation of a generation. Given that, when we examine workforce capital, we are truly interested in the changing models – not just in the employee, which, by the way, is a relic of the industrial age, but also in how those employed in your organization (employee, contractor, consultant, vendor, service provider) are changing themselves.

One way of examining this is looking at the actual next generation – the kids. This is very important. For instance, the current incoming generation, aside from now being larger than the Baby Boomer generation, has benefited from the previous 30 years of relative stability, and Millennials engage in collaborative environments naturally, as a result of growing up in a connected world.

They weren't taught this, though. What they were taught, for the most part – with some Montessori, STEM academy, and other cloud-school minor exceptions – came from a school model intended to send children into an industrial-revolution business workforce: one that had bells to change shifts, required the discipline of a "robot" in the factory for efficiency and safety, and required still minds to take orders and execute.

When examining your organization, you may have unwritten rules, or codes that have been passed down out of habit, institutionalization, or simply what we know. Those unspoken rules of engagement or life definitely help manage the chaos and focus on the mission, but the question that at times needs to be asked is "Is this the right mission? If not, are these the right rules?" – and thereafter, of course, whether you or your organization have the political and actual capital to make the transformation.

In the following two parts, Jim Barrett examines this phenomenon:

Mr. Barrett is not only Xentity's architecture lead, but has actively served and presently engages on multiple early childhood education development advisory and exploratory boards.

 

 

2013 Year in Review

Blog post added by Wiki Admin

2013 was definitely a very fluid year for Xentity, mostly in directly adjusting to Federal Government instability caused by the sequester, the shutdown, and the lingering debt-ceiling and fiscal-cliff budget effects. As a large part of Xentity's services helps Federal clients look at longer-term investment, in a year when it was hard for Federal programs to look much beyond a few months, those services saw definitely lower utilization. It is a shame, from a citizen perspective, to know our Federal clients are limited in their capability to improve, enhance, and become more efficient for the long term, but 2014 is already shaping up to be more on focus for those goals.

That said, Xentity hit several milestones this year:

Diversified into more State Government services – In previous years, Xentity supported healthcare information solutions in New York, and staff supported IT transformations adopting business transformation methods in Virginia, but in 2013 Xentity was awarded two prime contracts in Colorado, for open data competitions and for transformation services.

Xentity served over 30 clients in 2013, with over 25 staff across all services, for clients ranging across:

  • Government Clients: DOI, multiple USGS programs, EPA, CDC, VA, NARA, multiple State of Colorado programs, NPS
  • Commercial Clients: Intrawest, Cloudbilt, reVision, Inc., LVI Services, Inc., Black Tusk Group, Center for Professional Development, Soaring Eagle, Synergy Staffing, Raytheon, Sky Research, Inc., Skyline Reclamation, Solidyn, SPEC, TriHydro, Vexcel, Winningham Forest Mgmt
  • Prime Contractors: IBM, PPC, PhaseOne Consulting Group, TomTom, SRA

Xentity staff worked in over 12 major geographic locations – GA, NC, CO, MA, TX, AK, DC, VA, WI, NY, WA, NV – including 5 new states (GA, NC, CO, MA, TX), as well as Canada and Australia, bringing Xentity's services to a total of 4 countries supported (US, Canada, Australia, and Indonesia).

Xentity was awarded more government-wide access vehicles and schedules, now totaling seven, including: GSA MOBIS, GSA IT Schedule 70, DOI Foundational Cloud Hosting Services GWAC, USGS-wide Architecture Services IDIQ, State of Colorado-wide Transformation Services IDIQ, CIO-SP3 via PPC, and our 8(a) sole-source access.

Xentity held information-sharing sessions and participated in industry-wide activities with Australian geospatial mapping programs and ACT-IAC Smart Lean Government.

Xentity's active large business, small business, and academia agreements now total over 50, adding ten (10) new partners.

Xentity received accolades, making the Hispanic Business Top 500 list again and being recognized on the CIO Review list of Most Promising Government Technology Solution and Consulting Providers 2013. In addition, our staff presented and received recognition at the E-Gov Institute's annual Enterprise Architecture Conference.

All in all, though 2013 was fluid as noted, we were very excited to provide our transformational services to more clients, with more partners, in more locations than ever before. We look forward in 2014 to continued impact for our clients.