Future Map for Neuro Technology and Five Other Areas

Blog post edited by Matt Tricomi

We tweeted a ‘harrumph’ at Geekwire’s article First, we kill all the ‘futurists’. Then the Policy Horizons Canada group put out a fantastic emerging tech futures map. Futurists are damned if they do or don’t. On the new study published by Envisioning and Policy Horizons Canada, a Business Insider blog notes:

On Friday, the group published a giant graphic summarizing emerging technologies and showing when they could become scientifically viable, mainstream, and financially feasible. This follows more detailed graphics (PDF files) showing future innovations in agricultural and natural manufacturing, neurology and cognition, nanotechnology and materials, health, digital and communication technology, and energy.

These predictions may not be so far off. 

Moore’s Law is accelerating digital processing well into the hockey-stick part of the curve. The web and the flat world are kicking in Metcalfe’s Law of network interconnection (a quick sketch of the math follows below), creating a tipping point for adopting new tech rapidly and globally.
So, at this point, some of these information-related futures will only be delayed by political, geopolitical, or epidemic disruptions. That said, we’re keeping in mind the main premise of First, we kill all the ‘futurists’: that it needs to be more than a smart guy presenting ideas ‘ripped from the pages of Google News Alerts.’ Some of the references did feel a bit like that, but it is a conversation starter that gets the mind ‘context switched’ from the day-to-day rat race to what could be.
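
For reference, Metcalfe’s Law is the observation that a network’s value grows roughly with the square of its connected users, because the number of possible pairwise links grows quadratically:

$$ V(n) \propto n^2, \qquad \text{pairwise links} = \binom{n}{2} = \frac{n(n-1)}{2} $$

Paired with Moore’s-Law cost curves, that quadratic value growth is what makes adoption tip so quickly once a technology crosses into the mainstream.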

Call it our guilty pleasure, or call it (regardless of pinpointing “futurist” timelines) a great way to help teach awareness of the pace of emerging and disruptive tech.

A few findings we enjoyed

First, the graphics are framed the way we love to present tech, by how it changes business models:

The near future of technology promises change at an ever-increasing pace while rapidly transforming business models, governments and institutions worldwide. In order to help us make sense of our uncertain future, Policy Horizons Canada engaged Michell Zappa of Envisioning Technology to explore key technologies that are likely to have a profound effect on humanity on a global level and generational timeframe.

Across the six areas, the focus is on economic impact, geopolitical impact (energy), human-computer interaction, and societal impacts.

Neuro and AI

Our lens, of course, is the slices related to information progression and completeness, and how information becomes more compelling and knowledgeable. As noted in Why we focus on spatial data science, we are very interested in the path from research to mainstream, and from data to information to knowledge to wisdom. We also keep discovering that our own graphics are, truthfully, still at the whiteboard stage.

So, of course, we are enthralled by and drooling over the neurology and cognition aspects. It is great to see agreement with our leaning that we must invoke sentiment (emotion tracking) before we can have prediction (crime prevention). Yet the focus looks to be on facial recognition for emotion; given that there are so many other tells of lying and of other emotions, not to mention composite emotion detection across speech, setting, background, environment, and contemporary context, the timeline does appear a bit aggressive. There is also now an abstraction of emotion through devices (text, Twitter, Facebook, etc.) that creates different faces of a person and their emotions. It will take large data to bring the HUMINT concepts that the intelligence agencies have access to down to the civilian level.

While they nailed some interesting concepts of physical, physiological, and neural human-computer interaction, what felt missing in the Neuro area was the fact that a computer like Watson went from racks of servers to a single server, and then to a cloud service, in a matter of five years (From Jeopardy champ to cloud service). What will that capability make 2010-era Siri look like in ten years: a novelty, a joke? Already by late 2014, Microsoft’s Cortana has progressed from lookup and secretarial duties to an executive administrative assistant. What will happen in another ten years? What will happen when major brain-mapping or DARPA brain-mimicry efforts produce their research in that time period? What will happen when the storage capacity of the web can handle brain-scale storage?

Will we have personalized, emotionally sensitive advisors and therapists? Has the slew of recent sci-fi movies about such cognitive devices painted that new picture (e.g., Transcendence (flop or not), Her)? Believing we can get to emotion in ten years is very bold, but we will have the power of Watson in our tablet or smartphone-like devices in ten years. What that will bring for intelligence and information will be interesting.

The full publication is at http://www.horizons.gc.ca/eng/content/metascan-3-emerging-technologies-0 and a great way to learn more about the study quickly is at http://envisioning.io/horizons/.

There is so much more; these were a couple of notes on one-sixth of the study. But so as not to spoil your exploration, we’ll just summarize by saying: go in, explore, and get your mind on the possible. As an IBM colleague of ours used to put in his email signature:

A pessimist sees the difficulty in every opportunity; an optimist sees the opportunity in every difficulty.

Winston Churchill (Brainyquote.com)

Xentity recognized on CIO Review list for Most Promising Government Technology Solution and Consulting Providers 2013

Blog post edited by Matt Tricomi

Xentity was recognized on the CIO Review “20 Most Promising Government Technology Solution and Consulting Providers 2013” list.

With the advent of internet technologies, the landscape of business processes across the Federal Government has changed. But the change hasn’t been easy: it requires constant dedication to move the entire workforce off traditional systems and get them to adapt seamlessly to modern ones. This transition also includes the role of technology consulting providers, whose responsibility is to provide a wide spectrum of services to help federal agencies cope with the changes in the best possible manner.

As customers and business partners increasingly demand greater empowerment, it is imperative for government organizations to seek improved interactions and relationships across their entire business ecosystems by enhancing software capabilities for collaboration, gaining deeper customer and market insight, and improving process management.

In the last few months, we have looked at hundreds of solution providers and consulting companies and shortlisted the ones that are at the forefront of tackling challenges related to the government sector.

In our selection, we looked at the vendors’ capability to fulfill the needs of government organizations through a variety of services that support core business processes across all government verticals, including innovation areas related to advanced technologies and smart customer management. We also looked at the service providers’ capabilities in deploying cloud, Big Data and analytics, mobility, and social media in the specific context of government business.

We also evaluated the vendors’ support for government in bridging the gap between IT and Operational Technology. We present to you CIOReview’s 20 Most Promising Government Technology Solution and Consulting Providers 2013.

CIO Review Magazine Full Article on Xentity:

Xentity Corporation: Rapidly Designing The Needed Change In Cost-Cutting Times

By Benita M
Friday, December 6, 2013

 


“We always try to believe that leaders want to execute positive change and can overcome the broken system. We are just that naïve,” says Matt Tricomi, Founder of Xentity Corporation in Golden, CO. The company, named for “change your entity,” started on this premise just after 9/11 in 2001. “The desire started in 1999. I was lucky enough to be solution architect on the award-winning re-architecture of united.com. It was a major revenue shift from paper to e-ticket, but the rollout included introducing kiosks to airports. Now that was both simple and impactful.” Xentity found its niche in providing these types of transformations in information lifecycle solutions. Xentity started slowly, first providing embedded CIO and Chief Architect leadership for medium to large commercial organizations.

Xentity progressed, in 2003, into supporting the Federal Government, and soon thereafter international clients, to help IT move from the 40-year-old cost-center model to where the commercial world had successfully transitioned: a service center. “Our first Federal engagement was serendipitous. Our staff was core support on the Department of the Interior (DOI) Enterprise Architecture team,” Matt recalls of how the program went from “worst to first” after over $65 million in cuts. “We wanted to help turn architecture on its head by focusing on business areas, missions, or segments one at a time, rather than attacking the entire enterprise from an IT-first perspective.” The business transformation approach that was developed ultimately became the core of the OMB Federal Segment Architecture Methodology (FSAM) in 2008.

Xentity focuses on the rapid, strategic design, planning, and transformation outreach portion of the technology investment in programs or CIO services. This upfront portion is generally 5 to 10 percent of overall IT spending. Xentity helps address the near-term cost-cutting need while introducing the right multi-year operating concepts and shifts that take advantage of disruptions like Geospatial, Cloud, Big Data, Data Supply Chain, Visualization, and Knowledge Transfer. Xentity helped data.gov overcome an eighty percent budget cut this way. “Healthcare.gov is an unfortunate classic example. If acquisition teams had access to experts to help register risks early on, the procurement could have increased the technically acceptable threshold for success.”

One success story of Xentity’s is at the United States Geological Survey (USGS). “After completing the DOI Geospatial Services Blueprint, one of several, the first program to be addressed was the largest: the USGS National Mapping Program.” This very respected and proud 125-year-old program had just been through major reductions in force and was just trying to catch its breath. “The nation needs this program. The blueprint cited studies in which spending $1 on common ‘geo’ data can spur $8 to $16 in economic development. Google Maps is one of thousands of products which use this data.” The challenge was to transition a paper map production program into a data product and delivery services provider. “The effort affected program planning, the data lifecycle, new delivery and service models, and road-mapping the technology and human resource plan. We did architecture, PMO, governance, planning, BPR, branding, etc.” Xentity, with its respected TV production capability, even supported high-gloss video production to deal with travel reductions and to support communicating the program’s value and changes to partners and the new administration. This is definitely different from most technology firms. The National Map got back on the radar, increased usage significantly, and is expanding into more needed open data.

Presently, Xentity is a certified 8(a) small disadvantaged business with multiple GSA Schedules and GWACs (Government Wide Acquisition Contracts). Xentity has invested heavily in Federal business management: part of providing innovative, pragmatic, and rapid architecture and embedded talent is being able to respond quickly with compliant business management vehicles. Xentity is constantly seeking out the passionate CIOs, Program Directors, Architects, and Managers looking at transformation in this cost-cutting environment. “Sequester, fiscal cliff, debt ceiling, continuing resolutions: it’s all tying the hands of executives, who can look at best six months out. They don’t have the time to both re-budget and rapidly design multi-year scenarios against out-year performance drivers and options, let alone get staff up to speed on the latest disruptions or the right innovation. That is where we come in. We start small, or as fast or slow as the executive wants or believes their organization can absorb, and progress.”

To do BigData, address Data Quality, People and Processes, and Tech Access to Information

Blog post added by Wiki Admin

As a follow-on to the “cliffhanger” in BigData is a big deal because it can help answer questions fast, there are three top limitations right now: data quality, people and process, and tech access to information.

Let’s jump right in.

Number One and by far the biggest – Data Quality

Climate change isn’t a myth, but it is the first science ever to be presented primarily on a data premise. In doing so, its advocates prematurely presented models that didn’t take the driving variables into account. Their models have changed over and over again. The resolution of their source data has increased. Their simulations on top of simulations have produced countless theories from various models that can only be demonstrated simply by Hollywood blockbusters. The point being: we are dealing with inferior data for a world-scale problem, and we jump into the political, emotionally driven world with a data report? We will be the frog in slowly warming water, and we will hit that boiling point late, all because we started with a data-justification approach using low-quality data. Are they right that the world is warming? Yes. Do they have enough data to prove the right mitigation, remediation, or policy adjustments? No, and not until we either increase the data quality or take a non-data tack.

People and processes are a generation away.

Our processes in IT have been driven by Defense and GSA business models from the fifties: put anyone managing the 0s-and-1s technology in the back. They are nerds, look goofy, can’t talk, don’t understand what we actually do here, and by the way, they smell funny. That has been the approach to IT since the 50s. Nothing has changed, with the exception that there are a few baker’s dozens of hoodie-wearing, Mountain Dew-drinking night owls who happen to be loaded now, and there is a pseudo-culture of geek chic. We have not matured our people-talent investment to balance maturity of service, data, governance, design, and product lifecycle, or to embrace that engine culture as core to the business. This means more effective information-sharing processes to get the right information to the right people. This also means investing in the right skills to manage the information sharing and data lifecycle, not just feeding Doritos and free soda to hackers. I am not as worried about this one. As the baby boomer generation retires, it will leave a massive vacuum; Generation X is too small, and we’ll have to groom Generation Y fast. That said, we will mess up a lot and lose a lot to brain drain, but the market will demand relevancy, which will, albeit slowly, create this workforce model in 10 to 15 years.

Access to Environments 

If you had asked this before hosting environments or the cloud, access would have been limited to massive corporations, defense, intel, and the parts of academia co-investing with those groups. If you can manage the strain of shifting to a big data infrastructure, this barrier should be the least of your problems. If you can let your staff get the data they need at the speed they need, so they can process it in parallel without long wait times, you are looking good. Get a credit card, or if you are in Government, buy off a Cloud GWAC, and get your governance and policies moving, as they are likely behind and not ready; left alone, they will prolong the siloed-information phenomenon. Focus on the I in IT, and let the CTO handle the technology stack.

Focus on data quality, have a workforce investment plan, and continue working your information access policies

The tipping point that moves you into Big Data is where these factors, combined, require you to deal with complicated enormity at speed, answering questions not just for MIS and reports but for genuine discovery. If you can focus on those things in that order (likely solving them in reverse), you will be able to implement parallelized data discovery.

This will shorten the distance from A to B and create new economies, new networks, and enable your customer or user base to do things they could not before. It is the train, plane, and automobile factor all over again.

And to throw in the shameless plug, this is what we do. This is Why we focus on spatial data science and Why is change so fundamental.

BigData is a big deal because it can help answer questions fast

Blog post added by Wiki Admin

BigData is not just size and speed of complex data – it is moving us from information to knowledge

 

As our Why we focus on spatial data science article discusses, the progress of knowledge fields (history to math to engineering to science to philosophy), and the individual pursuit of knowledge, is based on moving from experiments to hypotheses to computation and now to The Fourth Paradigm: Data-Intensive Scientific Discovery. This progression has happened over the course of human history and is now replaying itself on the internet.

The early-90s web was about content, history, and experiments. The late-90s web was about transactions, security, and eCommerce. The 2000s web was about engineering entities breaking silos within companies, organizations, sectors, and communities. The 2010s web has been about increasing collaboration in communication and work production, and entering into knowledge collaboration. The internet’s progression is simply emulating how human capability developed throughout history.

When you are ready to move into BigData, it means you are wanting to Answer new questions.

That said, the BigData phenomenon is not about the input of all the raw data and the explosion that the Internet of Things is being touted as. The resource sells, but the end product is the consumed byproduct. So let’s focus on that byproduct: knowledge. It’s not about the speed of massive amounts of new, complex data of varying quality, as our discussion of IBM’s 4 V’s focuses on.

It’s about what we can do on the cheap with technology that previously required supercomputer clusters only the big boys had. Now, with cloud, the internet, and enough standards, if we have good and improving data, we ALL have the environment to answer complicated questions while sifting through the noise. It’s about enabling the initial phase of knowledge discovery, the one everyone is complaining about on the “web” right now: “too much information,” “drowning in data.”

The article on Throwing a Lifeline to Scientists Drowning in Data discusses how we need to be able to “sift through the noise” and make search faster. That is the roadblock, the tall pole in the tent, the showstopper.

Parallelizing the search is the killer app – this is the Big Deal, we should call it BigSearch

If you have to search billions of records and map them to another billion records, doing that in sequence is the problem. You need to shorten the time it takes to sift through the noise. That is why Google became an amazing success out of nowhere: they did, and still do, sift through the noise better than anyone else.
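
To make the contrast concrete, here is a minimal sketch (illustrative Python with a made-up record set and match rule, not anything from the study) of splitting a scan across worker processes so each one sifts its own slice of the noise instead of one process scanning everything in sequence:

```python
# Minimal illustrative sketch: sequential vs. parallel "sifting" over records.
# The record data, chunking scheme, and match rule are all invented for illustration.
from multiprocessing import Pool

RECORDS = [f"record-{i}" for i in range(1_000_000)]  # stand-in for billions of rows

def matches(record: str) -> bool:
    # Stand-in for whatever "signal" test you need (keyword, join key, model score, ...)
    return record.endswith("777")

def search_chunk(chunk):
    # Each worker sifts its own slice independently.
    return [r for r in chunk if matches(r)]

def sequential_search(records):
    # One process scans everything, one record after another.
    return search_chunk(records)

def parallel_search(records, workers=8):
    # Split the records into roughly equal chunks, one per worker, then merge the hits.
    size = len(records) // workers + 1
    chunks = [records[i:i + size] for i in range(0, len(records), size)]
    with Pool(workers) as pool:
        results = pool.map(search_chunk, chunks)
    return [r for part in results for r in part]

if __name__ == "__main__":
    hits = parallel_search(RECORDS)
    print(len(hits), "matching records")
```

The same split-then-merge idea is what MapReduce-style frameworks industrialized; once the scan is embarrassingly parallel, wall-clock time falls roughly in proportion to the number of workers.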

The United States’ amazing growth is because of two things: we have resources, and we figured out how to get to them faster. Each growth phase of the United States was based on that fact alone, plus a bit of stopping the barbarians at the gates, or ourselves, from implosion. You could say the same of civilization. Some softball examples out of hundreds:

  • Expanding west dramatically accelerated after trains, which allowed for regional foraging and mining
  • Manufacturing dramatically exploded production output, which allowed for city growth
  • Engines shortened the time between towns and cities, which allowed for a job explosion
  • Highway systems shortened the time between large cities, which allowed for regional economies
  • Airplanes shortened the time between the legacy railroad time zones, which allowed for national economies
  • The internet shortened access to national resources internationally, which allowed for international economies
  • Computing shortened the processing time of information, which allows for micro-targeted economies worldwide

Each “age” resulted in shortening the distance from A to B. But Google is sifting through data; scientists are trying to sift as well, pulling from defined data sensors, linking them together, and asking very targeted simulated or modeled questions. We need to address the barriers limiting entities’ success in doing this.

 

When you are ready to move into BigData, it means you are wanting to Answer new questions.

Blog post added by Wiki Admin

The article Throwing a Lifeline to Scientists Drowning in Data discusses how we need to be able to “sift through the noise” to handle the faster and faster deluge of sensors and feeds. It’s not about the information management models of fast, large retail or defense data. It is about finding the signals you need so you can take advantage of them.

In controlled environments, like retail and business, this has been done for years on end to guide business analytics and targeted micro-actions. 

For instance, the gambling industry has been doing this for 15-plus years: taking in the transactional data of every pull of every slot machine across all their hotels, linked with the loyalty card you entered, the time of year, when you go, your profile, and your trip patterns. Then, laws allowing, they adjust the looseness of the slots, the coupons provided, and the trip rewards, all to make sure they do what they are supposed to do in capitalism: be profitable.
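
As a loose illustration of that loop (the field names, thresholds, and offer rule below are invented for the sketch, not an actual casino system), the core is just rolling up an event stream, joining it to a loyalty profile, and deriving a per-player action:

```python
# Illustrative only: invented fields and a toy rule, not a real casino analytics pipeline.
from collections import defaultdict

# Stream of slot-pull events, already tagged with the player's loyalty ID.
pulls = [
    {"loyalty_id": "A1", "machine": "M-204", "wager": 1.00, "payout": 0.00},
    {"loyalty_id": "A1", "machine": "M-204", "wager": 1.00, "payout": 5.00},
    {"loyalty_id": "B7", "machine": "M-310", "wager": 0.25, "payout": 0.00},
]

# Loyalty profiles captured at sign-up / check-in.
profiles = {
    "A1": {"trips_per_year": 4, "home_state": "CO"},
    "B7": {"trips_per_year": 1, "home_state": "TX"},
}

def summarize(pulls):
    # Roll the raw event stream up to per-player totals.
    totals = defaultdict(lambda: {"wagered": 0.0, "paid_out": 0.0})
    for p in pulls:
        totals[p["loyalty_id"]]["wagered"] += p["wager"]
        totals[p["loyalty_id"]]["paid_out"] += p["payout"]
    return totals

def choose_offer(summary, profile):
    # Toy decision rule: frequent visitors who are down get a comp to encourage a return trip.
    net = summary["paid_out"] - summary["wagered"]
    if profile["trips_per_year"] >= 3 and net < 0:
        return "free-night coupon"
    return "no offer"

for player_id, summary in summarize(pulls).items():
    print(player_id, choose_offer(summary, profiles[player_id]))
```

In a real operation the roll-up and the decision rule would be driven by models over far more signals, but the join-summarize-act shape stays the same.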

Even in uncontrolled environments such as intelligence, defense, or internet search, the model is to build analytics on top of analytics, improving the data quality and lifecycle so that the end analytics can improve. It’s sound equalizers on top of the soundboard.

Don’t go for the neat tech for your MIS. Go because your users are asking more of you in the data-to-information-to-knowledge chain.

Continue on to our follow-on article: BigData is a big deal because it can help answer questions fast