Go Code Colorado recognized with a StateScoop 50 award

Blog post edited by Wiki Admin

Colorado has done something quite innovative and has been recognized with a StateScoop 50 award for State Innovation of the Year. 

And Xentity was glad to be a part of it.
Colorado’s Office of the Secretary of State invented and delivered Go Code Colorado, an app development competition for using public data to address challenges identified by the business community.  Three teams built winning apps and business plans and will have their technology licensed by the state for a year.  These teams have formed businesses; some are in discussions with investors and some have left their day jobs to focus on building their new venture.
Here’s the genius of Go Code Colorado: the State took funds from the filing fees that businesses pay to register their businesses and GAVE THEM BACK to the business community through Go Code Colorado… AND incubated three new promising businesses in the process.
Plans are to build on this first year and run Go Code Colorado for a second year… on a larger scale and improving on many aspects based on lessons learned from year one.
Xentity had the pleasure of participating in Go Code Colorado as the State’s data management provider — sourcing, curating, improving and publishing 38 datasets from state agencies from Higher Ed to Labor and Employment.
You can see more about Go Code and Xentity’s support in our April blog entry: Go Code Colorado Open Data Effort is going into its final weeks. Also, check out the photo gallery of the final event.

What if there was a way to improve the game of football and reduce concussive injuries at the same time without losing the game’s appeal?

Blog post edited by Matt Tricomi

We are always interested in transformational projects for data. Or data that can help drive transformational projects. Transformation – which we admit is a word that is likely overused, but nonetheless – may be due to adopting new cultural norms, a new business practice, or a technological evolution. It may impact how a program runs, how a product is made or distributed, workforce efficiency, or asset acquisition.

It could also be how a policy change impacts its community, its constituents, or its workforce as a whole. 

Given the Super Bowl season we are in here in the U.S., let’s take a look at how professional football has adopted – or not – major rule changes that have impacted its game play, its fans, its players, safety, and entertainment value.

It has been a long time since American Football has rethought its approach to scoring.

It has dabbled around the edges with two-point conversions, but it has not fundamentally addressed this aspect of the game since 1912, nor introduced a truly transformational offensive rule change since 1933, when it emphasized the forward pass.

What if there was a way to improve scoring, and hence the offensive strategy of the game of American Football, and reduce injuries while at the same time increasing the game’s scoring options, its unpredictability, and hence its fan appeal? What would this game look like?

By challenging some of the basic unspoken assumptions underlying the game, football can be refactored to draw out exciting and unpredictable aspects of a team’s offensive potential, turn the offensive side of the field into a point-generation sweepstakes, and reduce the probability of injuries.

At the heart of this new design are principles that challenge football’s current assumptions, which have determined its scoring system for the last hundred years and hence its offensive strategies for the last half century.  These new principles are:

  1. Any play that generates points cannot have an excessively high rate of predictability for success, i.e. the current Point after Touchdown (PAT),
  2. The points from successful scoring plays, field goals or touchdowns from scrimmage, should be directly correlated to the yards gained during the scoring play,
  3. Increase the risk and reward opportunities for the offense whenever and wherever possible without slowing the game down,
  4. Develop incentives to maximize the scope of the field’s scoring geography,
  5. Incentivize select types of plays and skills to reduce excessive injury-causing collisions.

When these principles are applied to the offensive scoring events/plays like field goals, extra points and touchdowns from scrimmage, they open the door to the development of dramatically different offensive and, by implication, defensive strategies.  These new principles and designs will incentivize the types of play calling that mitigate the chances of injuries by moving play downfield. Additionally, they can create opportunities for a completely new and different emphasis on underrepresented skill positions like place kicking or rare long-distance scoring from scrimmage. Lastly, they add a richer strategic dynamic for the fans, who currently know 95% of the time what type of play will be called next; being fooled is the exception.

Before we get into the details, we need to introduce a couple of borrowed and proven concepts from other sports that support the principles articulated above. These concepts will enable the offense to drive new approaches to football strategy. The concepts are defined as follows:

Degree of Difficulty (DoD) – a rating which reflects the difficulty of the maneuver or action an athlete is attempting to perform in sports such as gymnastics and diving, and which is factored into the final score. In the new approach to football we define the DoD as follows:

The DoD for field goals and points after touchdown is equivalent to a reduction of the width of the goal posts by 0%, 25%, 50% or 75%. Table 1 describes the goal post width and the allowed scoring methods for each.

Table 1 – Degree of Difficulty – Goal width and Scoring Method

DoD    Goal Post Width    Used for Field Goal    Used for Point after Touchdown
0%     18’ 6”             Yes                    No
25%    13’ 7.5”           Yes                    No
50%    9’ 3”              Yes                    Yes
75%    4’ 8.5”            Yes                    Yes

Borrowing the idea from the three-point play in basketball – that the further you are from the goal, the more valuable the shot should be – the DoD for football is the distance from the line of scrimmage to the end zone. For simplicity of audience understanding and visualization on television, we introduce the idea of Point Zones on the field.

Point Zones are predefined areas of the field that determine the possible points on a scoring play based on the distance from the ball to the goal posts or goal line.

Offensive Scoring

So what do these concepts look like on the field? We will now describe how these ideas affect the offense’s three main scoring methods, and where the approach should not be applied.

Notable Changes

1898: A touchdown was changed from four points to five.

1904: A field goal was changed from five points to four.

1906: The forward pass was legalized. The first authenticated pass completion in a pro game came on October 27, when George (Peggy) Parratt of Massillon threw a completion to Dan (Bullet) Riley in a victory over a combined Benwood-Moundsville team.

1909: A field goal dropped from four points to three.

1912: A touchdown was increased from five points to six.

1933: The NFL, which long had followed the rules of college football, made a number of significant changes from the college game for the first time and began to develop rules serving its needs and the style of play it preferred. The innovations from the 1932 championship game (inbounds lines, or hash marks, and goal posts on the goal lines) were adopted. Also, the forward pass was legalized from anywhere behind the line of scrimmage.

1960: The AFL adopted the two-point option on points after touchdown.

1994: A two-point conversion was added following touchdowns (teams now have the option of passing or running for two points, or kicking for one, after a TD).

Field Goals

Let’s talk field goal!  In Figure 1, we introduce the combined Point Zone and DoD for field goals that support the principles listed above and show them in the context of the field of play.    

Figure 1 – Field Goal – Point Zones and Degree of Difficulty

For example, if the offensive team chooses to kick a field goal from the 23 yard line, in effect kicking a 40 yard field goal, they would be situated in Point Zone 3. If they chose a DoD of 50%, 25% or 0%, they would have the opportunity to score 5, 4 or 3 points respectively.  A coach’s decision would obviously need to take into consideration the current score of the game, environmental conditions, and the skill of the kicker and the supporting special teams unit.  In effect this opens up the field and offensive strategy dramatically for teams positioned with talented kicking operations, or provides alternative approaches as the game clock winds down at the half or end of game. This creates a “moment” where the fans at the stadium or watching on TV no longer have a high degree of certainty about what is going to happen next – hence increased attention. It also will lead to a fundamental redesign of offensive strategies.  For the players, the model provides opportunities to score more points from more areas of the field without having to “grind” out the drives, risking injuries as the field shortens.

The recommended Point Zones take into account current statistical kicking performance.  In 2014, there were no misses in the NFL statistics for the 0-20 yard range (in effect the 3 yard line) on five attempts.  This practice, in effect, violates principles #1, #2 and #3.  In the recommended model, the Point Zone scoring system acts as a disincentive to taking the chip shot by only granting 1 point up to 27 yards, or requires the team to change the DoD and increase risk to achieve up to 3 points, or to go for the touchdown.  This stimulates a change in risk-reward thinking, possibly moving teams to take more shots at the end zone while in Zone 1.  It certainly provides more options for the fans to think and speculate about what could happen by removing the predictable decisions. More fans would sit and watch what would have been the “gimmes”.

As the Point Zones move away from the goal posts/line, the risk and reward calculus changes. The field goal now has the potential to nearly rival the touchdown as a primary objective for the offense. The field goal’s maximum value is 6 points if kicked beyond the 62 yard distance with the maximum DoD of 50%.  This may seem like an unlikely event, nearly equivalent to the current record of 63 yards, but we believe that with the reintroduction of tees and the greater point incentive, the distance will be conquered with increased investment in kicking skills and techniques.  Most importantly, it gives the offense numerous options to exercise, keeps the fans guessing, and supports all of the principles.

Point Zone 3 is where the value of the field goal in the new and old models converges.  The field goal kicker can score 3 points with no change in the degree of difficulty while kicking between 42 and 62 yards.  In 2014, through week 12, kickers in this range were hitting 75 of 102 attempts successfully, roughly a 25% failure rate. No guarantees. It is here that the model provides an incentive for the team, with a greater reward for riskier behavior.  An accurate kicker can realize up to 5 points for a successful attempt with a DoD of 50% within Point Zone 3. How many of the 75 successful field goals could have earned 1 or 2 more points and, as a result, made a difference in the game’s outcome? Once again, the situational context of the game will be a key to the decision process and provide a means to capture the audience with new strategies.  Not all fans want to see just hard hitting.
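To make the combined Point Zone and Degree of Difficulty scoring concrete, here is a minimal Python sketch of the field goal model described above. Figure 1 is not reproduced in the text, so the zone boundaries and base point values below are inferred from the worked examples (a 40 yard attempt in Point Zone 3 worth 3, 4 or 5 points at 0%, 25% or 50% DoD; 1 point inside 27 yards; a maximum of 6 points beyond 62 yards at 50% DoD) and should be treated as illustrative placeholders, not the official zone map.

```python
# Illustrative sketch of the proposed field goal scoring model.
# Zone boundaries and base values are inferred from the worked examples in
# the text; Figure 1 (the official zone map) is not reproduced here.

# Degree of Difficulty -> goal post width, as listed in Table 1.
GOAL_POST_WIDTH = {
    0: "18' 6\"",
    25: "13' 7.5\"",
    50: "9' 3\"",
    75: "4' 8.5\"",  # allowed for field goals in Table 1; the prose examples stop at 50%
}

# Assumed Point Zones for field goals: (maximum kick distance in yards, base points).
# The text places a 40 yard kick in Zone 3 and elsewhere describes Zone 3 as the
# 42-62 yard range, so the Zone 2/3 boundary is approximate; Zone 2's base value
# of 2 points is an inference.
FIELD_GOAL_ZONES = [
    (27, 1),            # Zone 1: the former "gimme" range, worth only 1 point
    (39, 2),            # Zone 2 (inferred)
    (62, 3),            # Zone 3: converges with today's 3-point field goal
    (float("inf"), 4),  # Zone 4: beyond 62 yards, up to 6 points at 50% DoD
]


def field_goal_points(kick_distance_yards: float, dod_percent: int) -> int:
    """Points for a made field goal: the zone's base value plus one point
    for each 25% step of Degree of Difficulty the offense elects."""
    if dod_percent not in (0, 25, 50):  # the prose examples use up to 50% for field goals
        raise ValueError("unsupported DoD for a field goal in this sketch")
    for max_distance, base_points in FIELD_GOAL_ZONES:
        if kick_distance_yards <= max_distance:
            return base_points + dod_percent // 25
    raise AssertionError("unreachable: the last zone is open-ended")


# The worked example from the text: a 40 yard attempt (Point Zone 3)
# is worth 3, 4 or 5 points at a DoD of 0%, 25% or 50%.
for dod in (0, 25, 50):
    pts = field_goal_points(40, dod)
    print(f"40 yard attempt, DoD {dod}% (posts {GOAL_POST_WIDTH[dod]}): {pts} points")
```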

Touchdown from Scrimmage

In the new model, touchdowns from the line of scrimmage are also subject to a similar risk-reward calculus as the field goal.  For a play from the line of scrimmage, the DoD is the yardage required to score. Figure 2 shows the Point Zones and the associated additional points that would be added to the six points when a touchdown is scored.  Once again, the idea is to incentivize the offense to attempt more scoring tries over longer distances by increasing the number of points that can be gained.  The incentives would encourage teams to open up the offensive strategy and introduce plays that spread the field and reduce the number of direct collisions occurring at the line of scrimmage.

A rushing-oriented offensive strategy seems to lead to the most injuries.  “Offensive lineman (center, offensive guard, and offensive tackle) sustained the most injuries (18.3%) of all positions; however running back had the highest percentage of injury for any one position (16.3%)”. (3)

Spreading the offense can mitigate what the data shows: “the leading mechanism of injury is football’s full-contact nature, with player-player contact accounting for 64% of all injuries and 13.4% of injuries attributed to player-surface contact. More specifically, being tackled (24.4%) and tackling (21.8%) accounted for a majority of the injuries.”

In the spirit of reducing injuries, the DoD points would not be used to incentivize kickoff and punt returns.

Figure 2 – Point Zones for Plays from Scrimmage

Point after Touchdown (PAT)

The point after touchdown is straightforward.  By default, the goal posts will be set to a DoD of 50% for a 1 point kick (see Figure 3). The team will have the option to set the DoD to 75% and go for two points.  Passing or running for two points will no longer be an option, in order to minimize injuries.

Figure 3 – DoD and Points after Touchdown
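As a companion to the field goal sketch above, here is the point after touchdown rule as stated, again as an illustrative Python snippet rather than an official specification; the 50% and 75% DoD values and their 1 and 2 point outcomes come directly from the description above and Table 1.

```python
def pat_points(dod_percent: int) -> int:
    """Point after touchdown under the proposal: kicks only (no run or pass).
    A 50% DoD (9' 3" posts) is worth 1 point; 75% (4' 8.5" posts) is worth 2."""
    if dod_percent == 50:
        return 1
    if dod_percent == 75:
        return 2
    raise ValueError("a PAT may only be attempted at a 50% or 75% DoD")


print(pat_points(50))  # 1 point: the default kick
print(pat_points(75))  # 2 points: the riskier narrow-post kick
```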

Tying it back to transformation in our world

Professional football has been around for several decades now and has adapted and adopted to changing norms. There are more norms for it to address: social responsibilities, player safety, impact on youth, simplifying rules, financial access, the organization’s non-profit status, etc. The tale, though, shows how a major program/entity such as this, with so much on the line, can choose to adapt and adopt, with some fall off, turbulence, and alienation, but ultimately thrive with its constituents. It’s easy to throw mud at the largest professional sports league in the U.S., and there is a lot to throw. At the same time, there are always lots of good takeaways from leading organizations that show how adapting, adopting, and adjusting minor and major rules over varying timetables can actually happen when leadership can stand behind a change.

Is Federal Enterprise Architecture dead as we know it – yes

Blog post edited by Matt Tricomi

In short, yes! And it was not a humane end. It lived on way too long. As we noted in our Blog What do current disruptive technologies mean to the roles of the Federal CIO office:

Traditionally, the operating model and funding approach for IT has been based on the Brooks Act of 1965 and only added minor portfolio integration concepts based on the Clinger-Cohen Act of 1996. These acts were focused on internal IT cost-based centers, management information systems, mission control systems, and enterprise resource planning systems. These systems were all either internal mission or data processing systems used to run business. Since 1996, a lot has happened in the IT space. It has moved from cost center to profit center in the private sector, and in the government space to service or profit center as well (i.e. profit for the IRS in e-filing). As normally is the case when shifting an asset to the executive level, this also means the investment models change and shift, as IT is now a critical part of executing transactions and interactions directly with the public, yet our policies are now 20-50 years old.

How Enterprise System and Conceptual Solution Architecture Approaches changed in 10 years

The approach is still built on the same “original concepts of governance gates between major decision points/steps, step at-a-glance view, work product based methods,” but you can see the appropriate gravitational move away from system architecting, toward portfolio designing and roadmapping.

For instance, in MBT and FSAM, Step 4 “Analyze the IT and Develop the Target Conceptual Solution Architecture” took 70 days. Now in CPM it’s a very high-level set of boxes for executives, with risks, done in less than a couple of weeks, with a series of transition plan notes to do the detail closer to implementation and investment budgeting time.

In MBT and FSAM, Step 4 was nearly identical. Here, it’s gone.

My assumption is three-fold:

  • by the time the team goes to invest, doing conceptual target solution architecture has already aged 6-18 months
  • target solution architecture depends highly on the progression of other enterprise services – what if the cloud contract gets delayed or re-scoped, what if that enterprise workflow component is done differently, what happens to your architecture?
  • service principles are becoming more like best practices and standards, akin to building guidelines, which are less project-specific; the project needs to choose at the right time what is required to apply.

The con to this could be that the investment learns too late, after budget season, what the true CAPEX (capital costs) is, and projects get further delayed. But that is not all bad, as many times the original CAPEX outlay is wrong anyhow, even with a good conceptual solution architecture, due to the notes above.

Does this mean Architecture should not be done at the Enterprise any more?

Don’t we wish that could be true. That the tech developed in one group would integrate naturally with another. The data would perfectly interface. The security worked seamlessly. The protocols were hidden. Redundancies and sunsets were robotically self-realized. OK, the last was too much tongue in cheek. But, no, systems must interface and be interoperable, accessible, and all the other architecture “ility” words.

So EA policy was constructed for internal systems and local data centers. Now we are in an era of utility hardware running software for the untrained user, or the trained user with very little time allotted. This is just a natural progression, as noted in IT Footprint Progression in the Federal Government… and the role of Architecture.

All that being said, losing architecture in the planning is a reality. Disruptions are moving too fast. The best way we have found is making sure the agile development efforts are built on solid plans (per the above) and making sure the architecture principles and component selection are guided by an excellent architecture runway – either via governance or an enterprise agile framework.

We have adopted more of a three-pronged approach:

  • Strategically building collaborative relationships with our clients using Smart Lean Government concepts of building communities, integrating life events across organization types, and working towards a standing Service Integration Model
  • Tactically, doing the Step 4 work products when needed
  • Timing the tactical work products by implementing Agile-with-Architecture concepts driven by the Scaled Agile Framework

FSAM Replacement is coming – Collaborative Planning Methodology (CPM)

Blog post edited by Matt Tricomi

Our colleagues over at Phase One Consulting Group have sponsored the creation of The Planning Institute. The Planning Institute was formed to help foster the standards and methods by which collaborative planning is performed. We stumbled upon this update in following Phase One’s blog post Plan for the Future.

From The Planning Institute methodology overview and the overall web site, the new Collaborative Planning Methodology (CPM) is the replacement for the 2008 FSAM method, which is primarily sourced from the 2006 MBT method. You can see how those methods were originally developed at Developing a Transformation Approach.

The Planning Institute concept has been in the works since 2006, when MBT originally got legs and the idea was to move to an open-source methodology and get it tied into more and more methods, approaches, frameworks, maturity models, etc. For instance, MBT work products were mapped to support various architecture frameworks (FEAF, DODAF, TOGAF, C4ISR), and FSAM mapped into various other IT portfolio planning processes (CPIC, A&A, Privacy). MBT also was built with the premise of linking segment architecture to the IT Investment Maturity Framework. It will be interesting to see where CPM gets mapped to bring in larger communities – PMP? Smart Lean Government?

This did get slightly sidetracked in 2007, when MBT was lifted to be the OMB FSAM, and for good reason, as that did help establish a Federal Government direction towards common analysis, architecture, and planning methods. It is nice to see the progression back towards open-source methods.

As the Phase One blog notes:

No one in our industry wants to be on the front page of the Washington Post! There are many approaches to planning … many that result in the dreaded analysis paralysis.  When we do take the time to plan, why does it sometimes not result in better performance?  I believe the critical element to planning is collaboration.  Ultimately, we have to realize that in most mission and business areas, there are a lot of people who have a lot of opinions.  There is never a single individual who is always right, always wrong, or all knowledgeable about any given topic.  Why do people come from so many different perspectives?  Well, all of these people have different experiences, different responsibilities, and different levels of creativity and ambition.  The bottom line:  working hard to get the best out of these individuals is key to gaining consensus, and ultimately delivering a better plan and a better product.

Why do we need an update, aside from the fact that it has been 7-8 years?

FSAM did stumble by being unwieldy and being presented as too waterfall. Only a few groups, like CDC and USGS (the latter with Xentity), developed an Agile or adapted version of FSAM to produce the work products needed and required at the right time.

MBT had a similar curse. It could take 6-9 months to get to a 3-5 year plan, which is hard to fathom now. Yet the DOI Geospatial Modernization Blueprint is still being implemented and still is relevant, as the tack there was more like the CPM intent: the business and data investment issues are as relevant today as when the Geospatial blueprint started on MBT 1.0.

In response to that, Xentity did introduce, with Government sponsors, the concept of FSAM and MBT as Transformation Lab Services to get more agile, shorter, more tactical wins in change, but it was bad timing to get off the ground.

Checking out the new Collaborative Planning Methodology (CPM)

The CPM has taken the Geospatial Blueprint tack of focusing more on the longer-term planning issue level and less on the technical architecture. The CPM is an update less due to technology, disruptions, or even agile project methods, and more a response to the gridlock of technology portfolio change.

It is a “full planning and implementation lifecycle”, whereas FSAM stopped at planning and design and was more about architecture, and MBT attempted the implementation portion but, in all honesty, its strength was the collaborative blueprint development by enterprise service or mission area.

CPM is not bent on any technology pattern such as cloud, nor even IT. Nor does it suggest, as MBT and FSAM did, focusing on specific segments. It really leans more towards planning, and less towards architecture. That does hint at the fact that disruptions are moving so fast that actual architecture recommendations are becoming more and more difficult to keep current.

Collaborative Planning Methodology Overview

What has carried on in CPM from FSAM and MBT?

What is very cool to see is that the DOI, Phase One, and Xentity teams’ original concepts of governance gates between major decision points/steps, the step at-a-glance view, and work product based methods have carried on. This pattern is still critical to assure a solid foundation is laid. CPM is still broken down the same way as MBT and FSAM, even down to the complexity burden pie chart.

Looking back at when we originally introduced this 1-page concept for a method during the methodology creation production workshops in DC and Denver, it’s amazing to see it stand as a key information reduction graphic ever since MBT 1.0. Overall, it also gives a sense of how the style of planning has matured as disruptions move SO MUCH faster in just ten years.

Methodology              Step 3 Title
CPM Step 3 (2014)        Define and Plan
FSAM Step 3 (2008)       Define Business and Information Requirements
MBT Step 3 (2005-2007)   Analyze the Business and Define the Target Business Environment

Amazing similarities, but you can see the appropriate gravitational move away from system architecting, toward portfolio designing and roadmapping.

To save some clicks, here are some excerpts from the Planning Institute site

THE COLLABORATIVE PLANNING METHODOLOGY (CPM)

Planning is done to effect change in support of an organization’s Strategic Plan, and the many types of planners (e.g. architects, organization and program managers, strategic planners, capital planners, and other planners) must work together to develop an integrated, actionable plan to implement that change.  Planning should be used to determine the exact changes that are needed to implement an organization’s Strategic Plan, enable consistent decision-making, and provide measurable benefits to the organization.  In short, an organization’s Strategic Plan should be executed by well-rounded planning that results in purposeful projects with measurable benefits.

In today’s environment, which demands more efficient government through the reuse of solutions and services, organizations need actionable, consistent, and rigorous plans to implement Strategic Plans and solve priority needs.  These integrated plans should support efforts to leverage other Federal, state, local, tribal, and international experiences and results as a means of reusing rather than inventing from scratch.  Plans should be consistent and rigorous descriptions of the structure of the organization or enterprise, how IT resources will be efficiently used, and how the use of assets such as IT will ultimately achieve stated strategies and needs.

The role of planners is to help facilitate and support a common understanding of needs based on the organization’s Strategic Plan, help formulate recommendations to meet those needs, and facilitate the development of a plan of action that is grounded in an integrated view of not just technology planning, but the full spectrum of planning disciplines to include, but not limited to, mission/business, IT resources, capital, security, infrastructure, human capital, performance, and records planning. 

Planners provide facilitation and integration to enable this collaborative planning discipline, and work with specialists and subject matter experts from these planning groups in order to formulate a plan of action that not only meets needs but is also implementable within financial, political, and organizational constraints.  In addition, planners have an important role to play in the investment, implementation, and performance measurement activities and decisions that result from this integrated planning process.

The Collaborative Planning Methodology, shown in Figure 1, is a simple, repeatable process that consists of integrated, multi-disciplinary analysis that results in recommendations formed in collaboration with sponsors, stakeholders, planners, and implementers.  This methodology includes the master steps and detailed guidance for planners to use throughout the planning process.  Architecture is but one planning discipline included in this methodology.  Over time the methods and approaches of other planning disciplines will continue to be interwoven into this common methodology to provide a single, collaborative approach for organizations to use. 

The Collaborative Planning Methodology is the next generation replacement for the Federal Segment Architecture Methodology (FSAM).  As the replacement for the FSAM, the Collaborative Planning Methodology has been designed to be more flexible, more widely applicable, and more inclusive of the larger set of planning disciplines.

The Collaborative Planning Methodology is intended as a full planning and implementation lifecycle for use at all levels of scope defined in the Common Approach to Federal Enterprise Architecture: International, National, Federal, Sector, Agency, Segment, System, and Application. 

Collaborative Planning Methodology Overview

The Collaborative Planning Methodology consists of two phases: (1) Organize and Plan and (2) Implement and Measure.  Although the phases are shown as sequential, in fact there are frequent and important iterations within and between the phases.  In the first phase, planners serve a key role facilitating the collaboration between sponsors and various stakeholders to clearly identify and prioritize needs, research other organizations facing similar needs, and formulate the plans to address the stated needs.  In the second phase, planners shift into a participatory role, supporting other key personnel working to implement and monitor change related activities.  As part of the second phase of the methodology, planners specifically support investment, procurement, implementation, and performance measurement actions and decisions. 

The Collaborative Planning Methodology is stakeholder-centered with a focus on understanding and validating needs from sponsor and stakeholder perspectives, planning for those needs, and ensuring that what is planned ultimately results in the intended outcomes (Step 1).  Additionally, this methodology is structured to embrace the principles of leverage and reuse by assisting planners in determining whether there are other organizations that have previously addressed similar needs, and whether their business model, experiences, and work products can be leveraged to expedite improvement (Step 2). 

Ultimately, the Collaborative Planning Methodology helps planners work with sponsors and stakeholders to clearly articulate a roadmap that defines needs, what will be done to address those needs, when actions will be taken, how much it will cost, what benefits will be achieved, when those benefits will be achieved, and how those benefits will be measured (Step 3).  The methodology also helps planners support sponsors and stakeholders as they make decisions regarding which courses of action are appropriate for the mission, including specific investment and implementation decisions (Step 4).  Finally and perhaps most importantly, the methodology provides planners with guidance in their support of measuring the actual performance changes that have resulted from the recommendations, and in turn, using these results in future planning activities (Step 5). 

For more information please see the other CPM pages as well as the Downloads Page where detailed guidance documents are available.

More about the Planning Institute notes:

WHO ARE WE?

We are a collection of government, industry, and non-profit organizations and individuals who are interested in better ways to conduct planning.  We advocate open source methodologies that can be used around the globe to solve major IT and non-IT challenges.

WHAT IS OUR GOAL?

Our goal is to see the widespread use of open source methodologies for planning so that we can better (1) collaborate, (2) innovate, and (3) build a better future.  The easier it is for us to work together, using a common vocabulary and process, the easier it will be to build a better future.

HOW CAN YOU HELP?

Get involved!  Contact us via Twitter or through our contact form on this site.  We would love to work with you, hear your case studies, feature your best practices, or just hear some words of encouragement!

Future Map for Neuro Technology and Five Other Areas

Blog post edited by Matt Tricomi

We tweeted a ‘harumph’ for Geekwire’s article First, we kill all the ‘futurists’. Then the Policy Horizons Canada group put out a fantastic emerging tech futures map. Futurists be damned if they do or don’t. On the new study published by Envisioning and Policy Horizons Canada, a blog on Business Insider notes:

On Friday, the group published a giant graphic summarizing emerging technologies and showing when they could become scientifically viable, mainstream, and financially feasible. This follows more detailed graphics (pdf files) showing future innovations in agricultural and natural manufacturing, neurology and cognition, nanotechnology and materials, health, digital and communication technology, and energy.

These predictions may not be so far off. 

Moore’s Law is accelerating digital processing well into the hockey stick shift. The web and the flat world are kicking in Metcalfe’s law of network interconnection, creating a tipping point for rapidly adapting to new tech globally.
So, some of these information-related futures will only be delayed by political, geopolitical, or epidemic disruptions at this point. That all said, we’re keeping in mind the main premise of First, we kill all the ‘futurists’ – that it needs to be more than a smart guy presenting ideas ‘ripped from the pages of Google News Alerts’. Some of the references did feel a bit like that, but it is a conversation starter to get the mind ‘context switched’ from the day-to-day rat race to what could be.

Call it our guilty pleasure or call it – regardless of pinpointing “futurist” timelines – a great way to help teach awareness of the pace of emerging and disruptive tech.

A few findings we enjoyed

First the graphics are presented in how we love to present tech – they change business models:

The near future of technology promises change at an ever-increasing pace while rapidly transforming business models, governments and institutions worldwide. In order to help us make sense of our uncertain future, Policy Horizons Canada engaged Michell Zappa of Envisioning Technology to explore key technologies that are likely to have a profound effect on humanity on a global level and generational timeframe.

Across the six areas, the focus is on economic impact, geopolitical impact (energy), and human-computer interaction and societal impacts.

Neuro and AI

Looking at the slices related to information progression and completeness, and how it becomes more compelling and knowledgeable, is of course our lens. As noted in Why we focus on spatial data science, we are very interested in the path from research to mainstream, of data to information to knowledge to wisdom. We also continuously discover that our own graphics are truly still at the whiteboard stage.

So, we of course are enthralled by and drooling over the neurology and cognition aspects. It is great to see the agreement with our leanings and concepts that we must invoke sentiment (emotion tracking) prior to having prediction (crime prevention). Yet, it looks like the focus is on facial recognition aspects for emotions, and given there are so many other pantomimes of liars and other emotions, not to mention composite emotion detection across verbal cues, setting, background, environment, and contemporary context, this does appear a bit aggressive. Not to mention there is now an abstraction of emotion through devices (text, Twitter, Facebook, etc.) that creates different faces of a person and emotion. This will take large data to help integrate, at the civilian level, the HUMINT concepts that the intelligence agencies have access to.

While they nailed some interesting concepts of physical, physiological, and neuro interactions – human-computer interaction – what felt missing in the Neuro area was the concept that computers like Watson went from multiples of servers to one server, and then to a cloud service, in a matter of five years (From Jeopardy champ to cloud service). What will that capability make 2010-era Siri look like in ten years – a novelty, a joke? Already, Microsoft’s Cortana in late 2014 has progressed from lookup and secretarial duties to executive administrative assistant. What will happen in another 10 years? What will happen when major brain mapping or DARPA’s brain mimic efforts produce their research in that time period? What will happen when the storage capacity of the web can handle brain storage?

Will we have personalized, sensitive advisors and therapists? Has the slew of updated sci-fi movies on such cognitive devices painted that new picture (e.g. Transcendence (flop or not), Her)? To believe we can get the emotion part in ten years is very bold, but we will have the power of Watson in our tablet or smartphone-like devices in 10 years. What that will bring for intelligence and information will be interesting.

The full publication is at http://www.horizons.gc.ca/eng/content/metascan-3-emerging-technologies-0 and a great way to learn more about the study quickly is at http://envisioning.io/horizons/.

There is so much more. This was a couple of notes on one-sixth of the study. But to not spoil your exploration too much more, we’ll just summarize by saying: go in and explore and get your mind on the possible. As an IBM colleague of ours used to put in his email signature:

A pessimist sees the difficulty in every opportunity; an optimist sees the opportunity in every difficulty.

Winston Churchill (Brainyquote.com)