What Perpetual Acceleration Means to Geospatial Data
In a recent article, “Why We Focus On Spatial Data Science”, we discussed exactly what the title promises. One subject, however, felt vast enough to deserve a blog of its own: Perpetual Acceleration. At its most basic, perpetual acceleration is the constant, compounding progression of a concept, such as spatial data. In our previous blog, we talked about how spatial data was rapidly growing in importance. That growth is accelerating, and we are seeing other technological capabilities, like spatial data, progressing at an astonishing rate.
Accelerating Forward, Really Fast
The progression of these capabilities has brought us to a point where we can blend what, when, and where – TempoSpatial data – into the flourishing cognitive and language world.
This isn’t just our theory.
Radar Networks put together this visual, tracing the progress of the Internet through the different web eras. Web 1.0 consisted of content websites and early commerce sites. Web 2.0 augmented the web community with blogs and began to link collaboratively built information through wikis. Web 3.0 is ushering in semantic direction and building integrated knowledge.
Is It Too Fast Though?
The fact that we can do more, at faster speeds and at higher levels of quality, means we can continue to increase the complexity of our analyses. However, there also appears to be an unfortunate disconnect: we are moving toward knowledge, but not toward wisdom. It is true our knowledge will continue to increase ever more quickly. But what we do with that knowledge as a society is a source of great concern as we move toward this singularity so fast.
Fast is an understatement. This progression we are witnessing is fast even by the standard of exponential progressions. So fast that it is hard to express and digest the magnitude of how quickly we are moving. We have already moved from:
- The early 90s, when we experimented with simply placing content on the web: general pages with loose hyperlinking and web logs
- The late 90s, when we conducted eCommerce, ran financial modeling and simulations, and built product catalogs with metadata that let us relate and predict: if a user liked a quality in one item, they might like something similar elsewhere
- The early 2000s, when we engineered social and true-community solutions built on top of relational networks, used semantics, continually shared content on timelines, and tracked where photos were taken as GPS devices began to appear in our pockets
- The 2010s, where today we look for new ways to collaborate, make new discoveries in the cloud, and use the billions and billions of sensors and data streams to create more powerful and knowledgeable applications
Some Graphs For Your Consideration
Here’s another way to digest this progression:

| Web Version | Time | DIKW | Web Maturity | Knowledge Domain Leading Web | Data Use Model on Web | Data Maturity on Web |
|---|---|---|---|---|---|---|
| 3.0 | 2015 and the predictable web | Knowledge | +Collaboration | Science | Data as 4th paradigm | TempoSpatial (goes public) |
| 4.0 | 2020–2030 | Wisdom in sectors | Advancing collaboration with 3rd-world core | Advancing science into shared services (philosophical is out-year) | Robot/ant data quality | Sentiment and predictive (goes public/useful); sensitive is out-year |

In summation, it is all moving forward at an almost alarming rate, and it is showing no signs of stopping. It is perpetually accelerating, and we are all trying to play catch-up.
Can We Keep Up Though?
Just because perpetual acceleration is a challenge does not mean we cannot keep up. If knowledge is progressing rapidly, we need to strive for a future where we use that knowledge properly. This ties back to why we at Xentity focus so much on the information in IT, rather than the technology. We want the wisdom to properly understand and use all of this information that is suddenly surfacing. And we want to share that wisdom with everyone who wants it.