In 2014 Every Business will be Disrupted by Open Technology


The article “In 2014 Every Business will be Disrupted by Open Technology” raised several important points about key disruptive patterns. A theme we picked up on was the movement from the bleeding and leading edge to broader adoption of open data and open platforms.

As the article notes:

Yet the true impact begins not with invention, but adoption. That’s when the second and third-order effects kick in. After all, the automobile was important not because it ended travel by horse, but because it created suburbs, gas stations and shopping malls.

A few tangible themes we picked up on are:

  1. Stand-alone brands are shifting to open platforms for co-creation
  2. Open platforms built on a brand's intellectual property enhance its value rather than threaten it
  3. Open platforms provide enterprise-level capabilities to the smallest players, leveling the playing field, accelerating innovation, and amplifying competition
  4. Open platforms for co-creation shift the focus away from driving out inefficiencies and toward the power of networking and collaboration to create value.

Where it's happening already: Federal, State, City, Commercial

FedFocus 2014 also emphasized that, given the budget appropriations for 2014 and 2015, two big disruptive areas will continue to be Open Data and Big Data. In particular, the White House Open Data memorandum, released in May 2013 and in effect since November 2013, impacts Open Data by:

Making Open and Machine Readable the New Default for Government Information, this Memorandum establishes a framework to help institutionalize the principles of effective information management at each stage of the information's life cycle to promote interoperability and openness. Whether or not particular information can be made public, agencies can apply this framework to all information resources to promote efficiency and produce value.

We are seeing states get into the mix as well with Open Data movements like http://gocode.colorado.gov/

Go Code Colorado was created to help Colorado companies grow, by giving them better and more usable access to public data. Teams will compete to build business apps, creating tools that Colorado businesses actually need, making our economy stronger.

It is also happening at the city level: the City of Raleigh, North Carolina is well recognized for its award-winning Open Data Portal.

We had previously tweeted about how IBM opened up its Watson cognitive computing API to developers… publicly. This is a big deal. IBM knows that with an open platform as an ecosystem, it not only gets more use, which means more comfort, which means more apps, but also that every legally allowed transaction on the platform helps improve the interpretive signals that make Watson so wicked smart. The article points out this key example as well.

National Data Assets are also moving ahead to make their data more distributable over the cloud: moving data closer to cloud applications and offering data via web services where datasets are too large or updated too often to sync, download, or sneakernet.
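To make that last delivery pattern concrete, below is a minimal sketch of consuming data as a web service rather than as a bulk download. The endpoint and query parameters are hypothetical placeholders, not a real National Data Asset service:

    # Minimal sketch: ask the service for just the slice needed,
    # instead of syncing or downloading the whole dataset.
    # The URL and parameters are hypothetical placeholders.
    import json
    import urllib.parse
    import urllib.request

    BASE_URL = "https://data.example.gov/api/records"  # hypothetical endpoint

    query = urllib.parse.urlencode({
        "bbox": "-105.3,39.6,-104.7,40.0",  # only the area of interest
        "updated_since": "2014-01-01",      # only the recent changes
        "format": "json",
    })

    with urllib.request.urlopen(BASE_URL + "?" + query) as response:
        records = json.load(response)

    print("Fetched %d records without a bulk download" % len(records))

The point of the pattern is that the provider, not the consumer, carries the storage and update burden; the consumer pays only for the slice it asks for.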

Xentity and its partners have been at the forefront of all these movements.

We have enjoyed being on the leading edge since the early phases of this movement. Our architectures focus less on commodity IT, which is not to undersell the importance of affordable, fast, robust, scalable, enabling IT services and data center models. Our architectures have focused more on putting the “I” back in IT.

We have been moving National Geospatial Data Assets into these delivery models as data products and services (Xentity is awarded USGS IDIQ for Enterprise and Solution Architecture), supporting the architecture of data.gov (Xentity chosen to help re-arch data.gov), and recently supporting the data wrangling on Colorado's State OpenData efforts. We are examining “Can a predictable supply chain for geospatial data be done” and actively participating in NSF EarthCube, which looks to “Imagine a world…where you can easily plot data from any source and visualize it any way you want.” We have presented concepts in these forums as well.

Our architecture methods (Developing a Transformation Approach) are slanted to examine mission-oriented performance gains, process efficiencies, data lifecycle management orientation, integrating service models, and balancing the technology footprint while innovating. For instance, we are heavily involved in the new ACT-IAC Smart Lean Government efforts to align services across government and organizational boundaries around community life events, much as other nations are beginning to do.

Xentity is very excited about the open data movement, its supported platforms, and the traction they are getting in industry. This may move us forward from information services into the popular space of knowledge services (Why we focus on spatial data science).


Xentity recognized on CIO Review list for Most Promising Government Technology Solution and Consulting Providers 2013


Xentity was recognized on the CIO Review list of “20 Most Promising Government Technology Solution and Consulting Providers 2013”.

With the advent of internet technologies, there has been a change in the landscape of business processes related to the Federal Government. But the change hasn’t been easy: it requires constant dedication to move the entire workforce off traditional systems and get them to seamlessly adapt to modern ones. This transition also includes the role of technology consulting providers, whose responsibility is to provide a wide spectrum of services to help federal agencies cope with the changes in the best possible manner.

As customers and business partners increasingly demand greater empowerment, it is imperative for government organizations to seek improved interactions and relationships across their entire business ecosystems by enhancing software capabilities for collaboration, gaining deeper customer and market insight, and improving process management.

In the last few months, we have looked at hundreds of solution providers and consulting companies, and shortlisted the ones at the forefront of tackling challenges related to the government industry.

In our selection we looked at the vendors' capability to fulfill the needs of government agencies through the supply of a variety of services that support core business processes of all government verticals, including innovation areas related to advanced technologies and smart customer management. We also looked at the service providers' capabilities related to the deployment of cloud, Big Data and analytics, mobility, and social media in the specific context of government business.

We also evaluated the vendors' support for government in bridging the gap between IT and Operations Technology. We present to you CIOReview’s 20 Most Promising Government Technology Solution and Consulting Providers 2013.

CIO Review Magazine Full Article on Xentity:

Xentity Corporation: Rapidly Designing The Needed Change In Cost-Cutting Times

By Benita M
Friday, December 6, 2013

“We always try to believe that leaders want to execute positive change and can overcome the broken system. We are just that naïve,” says Matt Tricomi, Founder of Xentity Corporation in Golden, CO, named for “change your entity,” which started on this premise just after 9/11 in 2001. “This desire started in 1999. I was lucky enough to be solution architect on the award-winning re-architecture of united.com. It was a major revenue shift from paper to e-ticket, but the rollout included introducing kiosks to airports. Now that was both simple and impactful.” Xentity found its niche in providing these types of transformations in information lifecycle solutions. Xentity started slowly, first providing embedded CIO and Chief Architect leadership for medium to large commercial organizations.

Xentity progressed, in 2003, into supporting the Federal Government, and soon thereafter international work, to help IT move from the 40-year-old cost center model to where the commercial world had successfully transitioned: a service center. “Our first Federal engagement was serendipitous. Our staff was core support on the Department of the Interior (DOI) Enterprise Architecture team,” Matt recalls of how the program went from “worst to first” after over $65 million in cuts. “We wanted to help turn architecture on its head by focusing on business areas, mission, or segments at a time, rather than attack the entire enterprise from an IT-first perspective.” The business transformation approach developed there was ultimately adopted as the core of the OMB Federal Segment Architecture Methodology (FSAM) in 2008.

Xentity focuses on the rapid and strategic design, planning, and transformation outreach portion of the technology investment in programs or CIO services. This upfront portion is generally 5 to 10 percent of overall IT spending. Xentity helps address the near-term cost-cutting need while introducing the right multi-year operating concepts and shifts that take advantage of disruptions like Geospatial, Cloud, Big Data, Data Supply Chain, Visualization, and Knowledge Transfer. Xentity helped data.gov overcome an eighty percent budget cut this way. “Healthcare.gov is an unfortunate classic example. If acquisition teams had access to experts to help register risks early on, the procurement could have increased the technically acceptable threshold for success.”

One success story of Xentity is at the United States Geological Survey (USGS). “After completing the DOI Geospatial Services Blueprint, one of several, the first program to be addressed was the largest: the USGS National Mapping Program.” This very respected and proud 125-year-old program had just been through major reductions in force and was trying to catch its breath. “The nation needs this program. The blueprint cited studies in which spending $1 on common ‘geo’ data can spur $8 to $16 in economic development. Google Maps is one of thousands which use this data.” The challenge was to transition a paper map production program into a data product and delivery services provider. “The effort affected program planning, data lifecycle, new delivery and service models, and road-mapping the technology and human resource plan. We did architecture, PMO, governance, planning, BPR, branding, etc.” Xentity, with its respected TV production capability, even supported high-gloss video production to deal with travel reductions and to help communicate the program's value and changes to partners and the new administration. This is definitely different from most technology firms. The National Map got back on the radar, increased usage significantly, and is expanding into more needed open data.

Presently, Xentity is a certified 8(a) small disadvantaged business with multiple GSA Schedules and GWACs (Government Wide Acquisition Contracts). Xentity invested heavily in Federal business management. Part of providing innovative, pragmatic, and rapid architecture and embedded talent is being able to respond quickly with compliant business management vehicles. Xentity is constantly seeking out passionate CIOs, Program Directors, Architects, and Managers looking at transformation in this cost-cutting environment. “Sequester, fiscal cliff, debt ceiling, continuing resolutions: it’s all tying the hands of executives, who can look at best six months out. They don’t have the time to both re-budget and rapidly design multi-year scenarios toward out-year performance drivers and options, let alone stay up to speed on the latest disruptions or the right innovation. That is where we come in. We start small, or as fast or slow as the executive wants or believes their organization can absorb, and progress.”

BigData is a big deal because it can help answer questions fast


BigData is not just size and speed of complex data – it is moving us from information to knowledge

As our Why we focus on spatial data science article discusses, the progress of knowledge fields – history to math to engineering to science to philosophy – or the individual pursuit of knowledge, is based on moving from experiments to hypotheses to computation and now to The Fourth Paradigm: Data-Intensive Scientific Discovery. This progression has happened over the course of human history and is now playing out again on the internet.

The early-90s web was about content, history, and experiments. The late-90s web was about transactions, security, and eCommerce. The 2000s web was about engineering entities breaking silos – within companies, organizations, sectors, and communities. The 2010s web has been about increasing collaboration in communication and work production, and entering into knowledge collaboration. The internet's progression is just emulating the development of human capability throughout history.

When you are ready to move into BigData, it means you want to answer new questions.

That said, the BigData phenomenon is not about the input of all the raw data and the explosion that the Internet of Things is touted as. The resource sells, and the end product is the consumed byproduct. So let's focus on that byproduct: knowledge. It is not about the speed of massive amounts of new, complex, variable-quality data, as our discussion of IBM's 4 V's focuses on.

It's about what we can do with technology on the cheap that previously required supercomputer clusters only the big boys had. Now, with cloud, the internet, and enough standards, if we have good and improving data, we ALL have the environment to answer complicated questions while sifting through the noise. It's about enabling the initial phase of knowledge discovery – the phase everyone is complaining about on the web right now as “too much information” or “drowning in data.”

The article on Throwing a Lifeline to Scientists Drowning in Data discusses how we need to be able to “sift through the noise” and make search faster. That is the roadblock, the tall pole in the tent, the showstopper.

Parallelizing the search is the killer app – this is the Big Deal, we should call it BigSearch

If you have to search billions of records and map them to another billion records, doing it in sequence is the problem. You need to shorten the time it takes to sift through the noise. That is why Google became an amazing success out of nowhere: they did, and currently do, it better than anyone else – sifting through the noise.
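As a minimal sketch of why parallelizing the sift matters (the records and the match test are invented for illustration), compare one sequential scan to the same scan split across eight worker processes:

    # Minimal sketch: sequential vs. parallel sifting of records.
    # The records and the match() predicate are invented for illustration.
    from multiprocessing import Pool

    N = 10_000_000  # stand-in for "billions"

    def match(record):
        # Stand-in for a real relevance test (parsing, joining, scoring).
        return record % 9973 == 0

    def scan(chunk):
        # Each worker sifts its own slice of the noise.
        return [r for r in chunk if match(r)]

    if __name__ == "__main__":
        # Sequential: one pass over everything.
        hits_sequential = scan(range(N))

        # Parallel: eight workers, each scanning an eighth of the records.
        chunks = [range(i, N, 8) for i in range(8)]
        with Pool(processes=8) as pool:
            hits_parallel = sorted(h for part in pool.map(scan, chunks) for h in part)

        # Same answers either way; only the time to sift changes.
        assert hits_parallel == hits_sequential

The answer set is identical; eight sifters just finish roughly eight times sooner than one, and that saved time is the whole value proposition.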

The United States' amazing growth is because of two things: we have resources, and we found out how to get to them faster. Each growth phase of the United States was based on that fact alone, plus a bit of stopping the barbarians at the gates, or ourselves, from implosion. You could say the same of civilization. Some softball examples out of hundreds:

  • Expanding West dramatically exploded after trains, which allowed for regional foraging and mining
  • Manufacturing dramatically exploded production output, which allowed for city growth
  • Engines shortened time between towns and cities, which allowed for a job explosion
  • Highway systems shortened time between large cities, which allowed for regional economies
  • Airplanes shortened time between the legacy railroad time zones, which allowed for national economies
  • The Internet shortened access to national resources internationally, which allowed for international economies
  • Computing shortened the processing time of information, which allows for micro-targeted economies worldwide

Each “age” resulted in shortening the distance from A to B. But Google is sifting through data. Scientists are trying to sift as well, through defined data sensors, to link them together and ask very targeted simulated or modeled questions. We need to address the barriers limiting entities' success in doing this.

So what is the point of this metaphoric drivel


So what is the point of this metaphoric drivel about cow paths, space shuttles, and chariots?

Yes, fair enough. Aside from being a fun story, there should be a point.

I think there are three, not unlike the Goldilocks story.

Change Agents can't come in too hot, putting in new technology and abandoning the old, as there are consequences.

Change Agents can't come in too cold, putting new technologies into the same footprint and design as the old.

Change Agents need to find the transition balance between the old and new that allows the new ecosystems to be adopted and the old ecosystem to adapt.

To get this balance, there are three factors standing in the way of introducing a disruption such as this:

  • Scaling – research readiness for solution expansion, adoption, and architecture qualities
  • Legacy – legacy investment stakeholders' agendas
  • Transition – patterns for new investment that benefit the new solution and address legacy investment stakeholders

Read the next blog post, which considers these disruption factors on an example topic: advancing our global network to keep up with computers and make the internet truly 21st-century.

More to come.

What cow paths, space shuttles, and chariots have in common


A colleague recently sent me a chain email (they do still exist) about the old adage that new technology is driven by thousand-year-old standards. I had seen it before. I remember that I liked it then. But my new habit with chain emails and viral urban legends is to poke around. Being childlike, I hope for fun new ways to see things, but being a problem-solver as well, I am skeptical of these amazing discoveries of trivial connections. Regardless, it's still a fun story where one can mine some good nuggets.

The anecdote essentially notes how historical inventions are connected, and draws a moral. Reading it backwards, it connotes that the width of the space shuttle's rocket boosters is due to the width of a railroad tunnel; that railroad track width is due to carriage wheel width; and that carriage width is tied to chariot width, which was set by the width of two horses. Point being, the boosters' width derives from the width of two horses' rear ends.

Like I said, it is fun, but the tangents are more loosely coupled and coincidental than the “six degrees of Kevin Bacon” concept. Snopes nicely walks us through how this is true only through generalities – not unlike saying the clothes we wear now are the way they are because a medieval tailor sized them that way. Snopes can be a party pooper sometimes, but they also note a few things about people and change (insert my agenda HERE). This is why I like stories like this: I can tie my own tangential takeaways to them.

Snopes points out humans' preset attitudes toward change:

Although we humans can be remarkably inventive, we are also often resistant to change and can be persistently stubborn (or perhaps practical) in trying to apply old solutions to new conditions. When confronted with a new idea such as a “rail,” why go to the expense and effort of designing a new vehicle for it rather than simply adapting ones already in abundant use on roadways? If someone comes along with an invention known as an “iron horse,” wouldn’t it make sense to put the same type of conveyance pulled by “regular” horses behind it?

It goes on with several more examples noting how new innovations leverage the blueprints of previous-generation inventions, regardless of their direct influence. The tone felt a bit down in noting this, but I felt this continuity is not wholly a bad thing.

As a physical society that builds infrastructure to share, this compatibility is needed to limit the impact of disruption while progressing toward addressing the societal challenges of Maslow's Hierarchy of Needs globally.

For example, let's say there is a future decision to stop using dams for hydroelectric power and move to a series of nano-electric generators that work off river flow, impeding the water less while generating more power. This is great: we have a lower-cost, simpler, more efficient solution that also does not disrupt the ecosystem (riparian development, fish spawning, etc.) the way dams have for decades.

How do we transition to the new nano solution? The railroad story says we would use the previous footprint of the dam and, once ready, slowly migrate to the new solution to allow the water flow to slowly come back into place. This would allow the wetlands and riparian ecosystem to grow back at nature's pace, and allow fish and river life to adapt generationally.

Yet the new solution does not require the same footprint. We could build it anywhere along the river. It could even be set up as a series of micro generators, and once the level of energy put into the grid matches the dam's, in theory, the dam could just be exploded, and we could progress on with no trace in the future anthropocene footprint that a dam was ever there.

But removing the previous infrastructure in a responsible way will be key. Blowing up a dam means the water release would cause major sediment displacement, kill the riparian and wetland ecosystems that adapted to the dam, and generations of fish and river life would actually die as a result. The dismantling process, though not required for the new direct human energy need, is critical for considering the indirect impact on the evolved ecosystem.

If still interested, check out the follow-up blog post: So what is the point of this metaphoric drivel.