Blog

FSAM Replacement – Collaborative Planning Methodology (CPM)

Blog post
edited by
Matt Tricomi

Our colleagues over at Phase One Consulting Group have sponsored the creation of The Planning Institute. The Planning Institute was formed to help foster the standards and methods by which collaborative planning is performed. We stumbled upon this update while following Phase One's blog on Plan for the Future.

From The Planning Institute methodology overview and the overall web site, the new Collaborative Planning Methodology (CPM) is the replacement for the 2008 FSAM method, which was primarily sourced from the 2006 MBT method. You can see how those methods were originally developed at Developing a Transformation Approach.

The Planning Institute concept has been in the works since 2006, when MBT originally got legs; the idea was to move to an open-source methodology and tie into more and more methods, approaches, frameworks, maturity models, etc. For instance, MBT work products were mapped to support various architecture frameworks (FEAF, DODAF, TOGAF, C4ISR), and FSAM mapped into various other IT portfolio planning processes (CPIC, A&A, Privacy). MBT was also built on the premise of linking segment architecture to the IT Investment Maturity Framework. It will be interesting to see where CPM gets mapped to bring in larger communities – PMP? Smart Lean Government?

This did get slightly sidetracked in 2007, when MBT was elevated to become the OMB FSAM, and for good reason, as that helped establish a Federal Government direction towards common analysis, architecture, and planning methods. It is nice to see the progression back towards open-source methods.

As the Phase One blog notes:

No one in our industry wants to be on the front page of the Washington Post! There are many approaches to planning … many that result in the dreaded analysis paralysis.  When we do take the time to plan, why does it sometimes not result in better performance?  I believe the critical element to planning is collaboration.  Ultimately, we have to realize that in most mission and business areas, there are a lot of people who have a lot of opinions.  There is never a single individual who is always right, always wrong, or all knowledgeable about any given topic.  Why do people come from so many different perspectives?  Well, all of these people have different experiences, different responsibilities, and different levels of creativity and ambition.  The bottom line:  working hard to get the best out of these individuals is key to gaining consensus, and ultimately delivering a better plan and a better product.

Why do we need an update, aside from the fact that it has been 7-8 years?

FSAM did prove unwieldy – it presented as too waterfall. Only a few groups, like CDC and USGS (the latter with Xentity), developed an Agile or adapted version of FSAM to produce the needed and required products at the right time.

MBT had a similar curse. It could take 6-9 months to get to a 3-5 year plan, which is hard to fathom now. Still, the DOI Geospatial Modernization Blueprint is being implemented and remains relevant, as the tack there was more like the CPM intent – the business and data investment issues are as relevant today as when the Geospatial Blueprint started on MBT 1.0.

In response to that, Xentity did introduce, with Government sponsors, the concept of FSAM and MBT as Transformation Lab Services to get shorter, more agile, more tactical wins in change, but it was bad timing to get off the ground.

Checking out the new Collaborative Planning Methodology (CPM)

The CPM has taken the Geospatial Blueprint tack of focusing more on the longer-term planning issue level and less on the technical architecture. The CPM is an update driven less by technology, disruptions, or even agile project methods, and more by a response to the gridlock of technology portfolio change.

It is a “full planning and implementation lifecycle,” whereas FSAM stopped at planning and design and was more about architecture. MBT attempted the implementation portion, but in all honesty, its strength was the collaborative blueprint development by enterprise service or mission area.

CPM is not bent on any technology pattern such as cloud, nor even on IT. Nor does it suggest, as MBT and FSAM did, focusing on specific segments. It leans more towards planning and less towards architecture. That hints at the fact that disruptions are moving so fast that actual architecture recommendations are becoming more and more difficult to keep current.

Collaborative Planning Methodology Overview

What has carried on in CPM from FSAM and MBT?

What is very cool to see is that the DOI, Phase One, and Xentity teams' original concepts carry on: governance gates between major decision points/steps, the step at-a-glance view, and work-product-based methods. This pattern is still critical to assure a solid foundation is laid. CPM is still broken down the same way as MBT and FSAM, even down to the complexity burden pie chart.

Looking back to when we originally introduced this 1-page concept during the methodology creation workshops in DC and Denver, it's amazing to see it stand as a key information reduction graphic ever since MBT 1.0. Overall, it also gives a sense of how the style of planning has matured as disruptions move SO MUCH faster in just ten years. Compare Step 3 across the three methods:

CPM Step 3 (2014) – Step Title: Define and Plan
FSAM Step 3 (2008) – Step Title: Define Business and Information Requirements
MBT Step 3 (2005-2007) – Step Title: Analyze the Business and Define the Target Business Environment

Amazing similarities, but you can see the appropriate gravitational move away from system architecting, to portfolio designing, to roadmapping.

To save some clicks, here are some excerpts from the Planning Institute site:

THE COLLABORATIVE PLANNING METHODOLOGY (CPM)

Planning is done to effect change in support of an organization’s Strategic Plan, and the many types of planners (e.g. architects, organization and program managers, strategic planners, capital planners, and other planners) must work together to develop an integrated, actionable plan to implement that change.  Planning should be used to determine the exact changes that are needed to implement an organization’s Strategic Plan, enable consistent decision-making, and provide measurable benefits to the organization.  In short, an organization’s Strategic Plan should be executed by well-rounded planning that results in purposeful projects with measurable benefits.

In today’s environment, which demands more efficient government through the reuse of solutions and services, organizations need actionable, consistent, and rigorous plans to implement Strategic Plans and solve priority needs.  These integrated plans should support efforts to leverage other Federal, state, local, tribal, and international experiences and results as a means of reusing rather than inventing from scratch.  Plans should be consistent and rigorous descriptions of the structure of the organization or enterprise, how IT resources will be efficiently used, and how the use of assets such as IT will ultimately achieve stated strategies and needs.

The role of planners is to help facilitate and support a common understanding of needs based on the organization’s Strategic Plan, help formulate recommendations to meet those needs, and facilitate the development of a plan of action that is grounded in an integrated view of not just technology planning, but the full spectrum of planning disciplines to include, but not limited to, mission/business, IT resources, capital, security, infrastructure, human capital, performance, and records planning. 

Planners provide facilitation and integration to enable this collaborative planning discipline, and work with specialists and subject matter experts from these planning groups in order to formulate a plan of action that not only meets needs but is also implementable within financial, political, and organizational constraints.  In addition, planners have an important role to play in the investment, implementation, and performance measurement activities and decisions that result from this integrated planning process.

The Collaborative Planning Methodology, shown in Figure 1, is a simple, repeatable process that consists of integrated, multi-disciplinary analysis that results in recommendations formed in collaboration with sponsors, stakeholders, planners, and implementers.  This methodology includes the master steps and detailed guidance for planners to use throughout the planning process.  Architecture is but one planning discipline included in this methodology.  Over time the methods and approaches of other planning disciplines will continue to be interwoven into this common methodology to provide a single, collaborative approach for organizations to use. 

The Collaborative Planning Methodology is the next generation replacement for the Federal Segment Architecture Methodology (FSAM).  As the replacement for the FSAM, the Collaborative Planning Methodology has been designed to be more flexible, more widely applicable, and more inclusive of the larger set of planning disciplines.

The Collaborative Planning Methodology is intended as a full planning and implementation lifecycle for use at all levels of scope defined in the Common Approach to Federal Enterprise Architecture: International, National, Federal, Sector, Agency, Segment, System, and Application. 

Collaborative Planning Methodology Overview

The Collaborative Planning Methodology consists of two phases: (1) Organize and Plan and (2) Implement and Measure.  Although the phases are shown as sequential, in fact there are frequent and important iterations within and between the phases.  In the first phase, planners serve a key role facilitating the collaboration between sponsors and various stakeholders to clearly identify and prioritize needs, research other organizations facing similar needs, and formulate the plans to address the stated needs.  In the second phase, planners shift into a participatory role, supporting other key personnel working to implement and monitor change related activities.  As part of the second phase of the methodology, planners specifically support investment, procurement, implementation, and performance measurement actions and decisions.

The Collaborative Planning Methodology is stakeholder-centered with a focus on understanding and validating needs from sponsor and stakeholder perspectives, planning for those needs, and ensuring that what is planned ultimately results in the intended outcomes (Step 1).  Additionally, this methodology is structured to embrace the principles of leverage and reuse by assisting planners in determining whether there are other organizations that have previously addressed similar needs, and whether their business model, experiences, and work products can be leveraged to expedite improvement (Step 2). 

Ultimately, the Collaborative Planning Methodology helps planners work with sponsors and stakeholders to clearly articulate a roadmap that defines needs, what will be done to address those needs, when actions will be taken, how much it will cost, what benefits will be achieved, when those benefits will be achieved, and how those benefits will be measured (Step 3).  The methodology also helps planners support sponsors and stakeholders as they make decisions regarding which courses of action are appropriate for the mission, including specific investment and implementation decisions (Step 4).  Finally and perhaps most importantly, the methodology provides planners with guidance in their support of measuring the actual performance changes that have resulted from the recommendations, and in turn, using these results in future planning activities (Step 5). 

For more information please see the other CPM pages as well as the Downloads Page where detailed guidance documents are available.

More notes about the Planning Institute:


WHO ARE WE?

We are a collection of government, industry, and non-profit organizations and individuals who are interested in better ways to conduct planning.  We advocate open source methodologies that can be used around the globe to solve major IT and non-IT challenges.


WHAT IS OUR GOAL?

Our goal is to see the wide-spread use of open source methodologies for planning so that we can better (1) collaborate, (2) innovate, (3) and build a better future.  The easier it is for us to work together, using a common vocabulary and process, the easier it will be to build a better future.


HOW CAN YOU HELP?

Get involved!  Contact us via Twitter or through our contact form on this site.  We would love to work with you, hear your case studies, feature your best practices, or just hear some words of encouragement!

Future Map for Neuro Technology and Five Other Areas

Blog post
edited by
Matt Tricomi

We tweeted a ‘harrumph’ for Geekwire’s article First, we kill all the ‘futurists’. Then the Policy Horizons Canada group put out a fantastic emerging tech futures map. Futurists be damned if they do or don’t. On the new study published by Envisioning and Policy Horizons Canada, a blog on Business Insider notes:

On Friday, the group published a giant graphic summarizing emerging technologies and showing when they could become scientifically viable, mainstream, and financially feasible. This follows more detailed graphics (pdf files) showing future innovations in agricultural and natural manufacturing, neurology and cognition, nanotechnology and materials, health, digital and communication technology, and energy.

These predictions may not be so far off. 

Moore’s Law is accelerating digital processing well into the hockey-stick shift, and the web and flat world are kicking in Metcalfe’s law of network interconnection, creating a tipping point for rapidly adopting new tech globally. So, some of these information-related futures will only be delayed by political, geopolitical, or epidemic disruptions at this point. That all said, we’re keeping in mind the main premise of First, we kill all the ‘futurists’ – that it needs to be more than a smart guy presenting ideas ‘ripped from the pages of Google News Alerts’. Some of the references did feel a bit like that, but it is a conversation starter to get the mind ‘context switched’ from the day-to-day rat race to what could be.
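As a shorthand for those two growth laws (our gloss, not from the study): Moore's Law is the observation that transistor counts double roughly every two years, while Metcalfe's law holds that a network's value grows with the square of its connected users.

```latex
% Moore's Law: transistor count N(t) after t years, doubling roughly every two years
\[ N(t) \approx N_0 \cdot 2^{t/2} \]
% Metcalfe's law: the value V of a network grows with the square of its n connected users
\[ V(n) \propto n^2 \]
```

Exponential capability compounding against quadratic network value is exactly the hockey stick described above.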

Call it our guilty pleasure or call it – regardless of pinpointing “futurist” timelines – a great way to help teach awareness of the pace of emerging and disruptive tech.

A few findings we enjoyed

First, the graphics are presented the way we love to present tech – they change business models:

The near future of technology promises change at an ever-increasing pace while rapidly transforming business models, governments and institutions worldwide. In order to help us make sense of our uncertain future, Policy Horizons Canada engaged Michell Zappa of Envisioning Technology to explore key technologies that are likely to have a profound effect on humanity on a global level and generational timeframe.

Across the six areas, the focus is on economic impact, geopolitics (energy), and human-computer interaction and societal impacts.

Neuro and AI

Looking at the slices related to information progression and completeness, and how information becomes more compelling and knowledgeable, is of course our lens. As noted in Why we focus on spatial data science, we are very interested in the path from research to mainstream, from data to information to knowledge to wisdom. We also continuously discover that our graphics are truly still at the whiteboard stage.

So, we of course are enthralled by and drooling over the neurology and cognition aspects. It is great to see the agreement with our leanings and concepts that we must invoke sentiment (emotion tracking) prior to having prediction (crime prevention). Yet it looks like the focus is on facial recognition aspects for emotions; given there are so many other pantomimes of liars and other emotions, not to mention composite emotion detection across verbal, setting, background, environment, and contemporary context, this does appear a bit aggressive. Not to mention there is now an abstraction of emotion through devices (text, Twitter, Facebook, etc.) that creates different faces of a person and emotion. It will take large data to help integrate, at the civilian level, the HUMINT concepts that the intelligence agencies have access to.

While they nailed some interesting concepts of physical, physiological, and neuro interactions – human-computer interaction – what felt missing in the Neuro area was the concept that computers like Watson went from multiples of servers to one server, and then to a cloud service, in a matter of five years (From Jeopardy champ to cloud service). What will that capability make 2010's Siri look like in ten years – a novelty, a joke? Already Microsoft's Cortana in late 2014 has progressed from lookup and secretarial duties to executive administrative assistant. What will happen in another 10 years? What will happen when major brain mapping or DARPA's brain mimic efforts produce their research in that time period? What will happen when the storage capacity of the web can handle brain storage?

Will we have personalized, sensitive advisors and therapists? Has the slew of updated sci-fi movies on such cognitive devices painted that new picture (e.g. Transcendence (flop or not), Her)? To believe we can get the emotion in ten years is very bold, but we will have the power of Watson in our tablet or smartphone-like devices in 10 years. What that will bring for intelligence and information will be interesting.

The full publication is at http://www.horizons.gc.ca/eng/content/metascan-3-emerging-technologies-0 and a great way to learn more about the study quickly is at http://envisioning.io/horizons/.

There is so much more. This was a couple of notes on one-sixth of the study. But so as not to spoil your exploration too much more, we'll just summarize by saying: go in, explore, and get your mind on the possible. As an IBM colleague of ours used to put in his email signature:

A pessimist sees the difficulty in every opportunity; an optimist sees the opportunity in every difficulty.

Winston Churchill (Brainyquote.com)

Go Code Colorado Open Data Effort is going into its final weeks

Blog post
edited by
Wiki Admin

States all around have gotten into Open Data movements. Colorado has as well, and its recent Go Code Colorado effort is a unique entry into this foray (http://gocode.colorado.gov/).

Go Code Colorado was created to help Colorado companies grow, by giving them better and more usable access to public data. Teams will compete to build business apps, creating tools that Colorado businesses actually need, making our economy stronger.


The following is a great video summarizing the event, produced by the State and one of Xentity's colleagues, Engine7 Media.


Xentity is very proud to be supporting this innovative Government Solution

Xentity was awarded IT consulting support for the Business Intelligence Center platform and data catalog, which supports the now-branded Go Code Colorado initiative. Xentity's consultants have provided the data and technology resources to manage and advise the publication of public sector data to the Colorado Information Marketplace and to provide technical support to developers who participate in the Challenge.

Xentity primarily has provided data platform support. We have provided data readiness analysis, data architecture guidance, project management, and the data analysts to “wrangle” the data (aka ETL) to get the datasets onto the platform; a rough sketch of that kind of wrangling is below. We also have provided the IT and data support on-site at the multiple locations and events to assure the challenge participants and finalists get the support they need to be successful in accessing and using the data and services. Finally, we are supporting the technical review of applications to assure these applications can have a life beyond the “hackathon” stage.
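As a hypothetical illustration of that pull-and-clean step (not the project's actual pipeline), the snippet below reads a dataset from the Socrata-based Colorado Information Marketplace with the Python sodapy client and does some light cleanup; the dataset ID abcd-1234 is a placeholder.

```python
import pandas as pd
from sodapy import Socrata

# Unauthenticated Socrata client (subject to throttling); pass an app token for real use.
client = Socrata("data.colorado.gov", None)

# "abcd-1234" is a placeholder dataset ID -- look up a real one in the catalog.
rows = client.get("abcd-1234", limit=5000)  # returns a list of dicts, one per record

df = pd.DataFrame.from_records(rows)

# Light "wrangling": normalize column names and drop columns with no values.
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
df = df.dropna(axis=1, how="all")

df.to_csv("cleaned_dataset.csv", index=False)
```

In practice, most of the effort is in the dataset-specific cleanup and in validating fields before publication, not in the API calls themselves.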

The final stages are coming in the first 10 days of May. The 10 finalists have demonstrated very viable solutions toward the goal of helping make our economy stronger.

Some more background and detail on how we got here

(The following is from the State as guidance to this effort)


Colorado government agencies possess large volumes of public business and economic data. This data can help businesses with strategic planning, but it exists in so many different places and formats that it is difficult for most businesses to use. The Secretary of State's office will address this problem through the creation of the Business Intelligence Center (BIC). BIC seeks to aggregate and analyze data available to the business community.

This effort is led by the Colorado Secretary of State. The Secretary of State’s office interacts with hundreds of thousands of business entities, charities, and nonprofits in the state. The Secretary of State’s office collects, manages, and disseminates large amounts of basic data about those organizations and wanted to make the data useful to Colorado businesses. 

The Department sought to make this data more useful and collaborated with the Leeds School of Business at the University of Colorado to publish the Quarterly Business and Economic Indicator Report. This report combines Department data with other economic data collected by the Leeds School to provide meaningful economic information to the business community. For instance, new business filings are a leading indicator of job creation. With this and other information provided in the report, the business community can make smarter decisions that will grow the Colorado economy.

Since first publishing the report in 2012, the Secretary of State received comments from many members of the business community asking to see more detailed data regarding economic trends in order to better understand the distribution of commerce in Colorado. This includes access to the location, size, vibrancy, and concentration of key business nodes. While this level of detail would be tremendously helpful, the Department cannot provide the information because multiple state agencies collect the desired data and it is not readily available in a common place or even a common format.

A central data collection point is needed. During meetings with other government agencies, Department staff concluded that these data requests could be met by aggregating all the information spread throughout various agencies and databases into a single tool by breaking down agency silos and better cataloging existing resources. Department staff also concluded that access and availability to the data is not enough. In order to make the raw data useful to the vast majority of business owners, data analysis and visualization tools are needed. These conclusions led to the Business Intelligence Center project.

The Business Intelligence Center consists of a centralized data catalog that combines public data into a meaningful tool for businesses. 

The vision for this project is two-fold. First, it consolidates public data relevant to businesses on a single platform. Second, it gives businesses the tools to make the data useful. The second goal is achieved through a civic apps challenge—the Colorado Business Innovation Challenge—that will give financial incentives to the technology community to build web and mobile applications that use state and other data to solve existing business challenges.

The data platform is akin to an information clearinghouse. It will make data sources currently dispersed over multiple government departments and agencies accessible in a common location. This platform will offer Colorado businesses unprecedented access to public data that is validated and relevant to short- and long-term needs. Besides enhancing businesses' access to state data, the BIC will also contribute to economic growth. The creation of the BIC will make data available to all Colorado businesses at no additional cost. Currently only large entities with the time, staff, and budget to engage in detailed statistical analysis can use these data sets. Providing this data to every type and size of business in Colorado provides a unique opportunity to contribute to economic development. The BIC will nurture key industry networks and lay the foundation for a digital infrastructure that will continue to expand and improve over time.

The Colorado Business Innovation Challenge is an innovative way to create solutions and ensure the BIC is useful to Colorado businesses.

Simply making the data available is insufficient for most business owners. To truly help the vast majority of businesses—especially small businesses—tools must be developed to present the data in a useful and consumable form. Normally government agencies develop tools to fill this information vacuum, but historically the government has not been successful developing highly useful and effective tools. A new approach is needed—that approach is the Colorado Business Innovation Challenge.

Modeled after the “civic apps” challenges that have been run in multiple cities across the United States and internationally, the Challenge presents the software development community with problem questions and then asks that community to create possible solutions. At the end of the challenge, the Secretary of State will license the most innovative and implementable web or mobile application. The best design will receive a contract with the Secretary of State to make the application available to the public on the Business Intelligence Center platform. The Department will also pursue partnerships with the Colorado technology and startup industry to provide additional incentives, such as mentoring, hosting, and office space to the Challenge winners. The long-term intent of the program is not only to create an environment for fostering community involvement through the Challenge, but also to sustain the tools that are developed in the Challenge.

GAO 4th annual report cites Enterprise Architecture, Geospatial, and Coordinating Research among ways to help reduce duplication – fragmentation – overlap

Blog post
edited by
Wiki Admin

GAO released its 2014 Annual Report, which identified 11 new areas of fragmentation, overlap, and duplication in federal programs and activities. GAO also identified 15 new opportunities for cost savings and revenue enhancement. See the related work and GAO's Action Tracker—a tool that tracks progress on GAO's specific suggestions for improvement.

The social services are clearly under the highest scrutiny, as heritage.org notes:

In the previous three reports, the GAO found that Congress spent:

GAO summarizes their highlights as:

In its 2014 report, GAO presents 64 actions that the executive branch or Congress could take to improve efficiency and effectiveness across 26 areas that span a broad range of government missions and functions.
  • GAO suggests 19 actions to address evidence of fragmentation, overlap, or duplication in 11 new areas across the government missions of defense, health, income security, information technology, and international affairs.
  • GAO also presents 45 opportunities for executive branch agencies or Congress to take actions to reduce the cost of government operations or enhance revenue collections for the Treasury across 15 areas of government.

One blog took a fairly direct, yet appropriate, view of this – “Why Have One Government Program When 10 Can Do the Same Thing? GAO Report Reveals Duplicated Efforts, Wasted Money.”

Because, as the GAO points out, “the federal government faces an unsustainable fiscal path,” and getting out of its own way is one of the easier means of cutting costs.

They do point out a sort of ray of hope in that:

After taking a grand tour of federal government multiplicity, the GAO recommends 45 actions for cutting costs. Don’t get your hopes too high, though. Of the 380 reforms previously recommended, only 124 have been fully addressed.

I say a ray of hope, as about 1/3 improvement is actually, possibly sadly, not bad for the largest organization in the world. Beyond the 1/3, who can or will use this? This is a fantastic guiding light, but for whom? Clearly it is for Congress, executive branch politicals, and Program Directors, but will they be interested in acting? Does it fit with their agendas and objectives? Who has the influence to do more than suggest it should be part of such?


Topics we’ll be tracking

But more in our neck of the woods, where we look to help and how we analyzed it, here are some mission takeaways:

  • Renewable Energy programs are VERY fragmented, though that is not uncommon for organizations all trying to get a service or solution piece of a new, up-and-coming, relevant disruption. Specifically, they cite the need for more coordination between USDA and DOE:

Area 4: Renewable Energy Initiatives: Federal support for wind and solar energy, biofuels, and other renewable energy sources, which has been estimated at several billion dollars per year, is fragmented because 23 agencies implemented hundreds of renewable energy initiatives in fiscal year 2010—the latest year for which GAO developed these original data. Further, the DOE and USDA could take additional actions—to the extent possible within their statutory authority—to help ensure effective use of financial support from several wind initiatives, which GAO found provided duplicative support that may not have been needed in all cases for projects to be built.

  • In the 2011 reports under General Government, Enterprise Architecture and Data Center Consolidation were high on the list (Page 24):

Area 14: Enterprise architectures: key mechanisms for identifying potential overlap and duplication. Well-defined and implemented enterprise architectures in federal agencies can lead to consolidation and reuse of shared services and elimination of antiquated and redundant mission operations, which can result in significant cost savings. For example, the Department of the Interior demonstrated that it had used enterprise architecture to modernize agency information technology operations and avoid costs through enterprise software license agreements and hardware procurement consolidation, resulting in financial savings of at least $80 million. In addition, Health and Human Services will achieve savings and cost avoidance of over $150 million between fiscal years 2011 to 2015 by leveraging its enterprise architecture to improve its telecommunications infrastructure.

Area 15: Consolidating federal data centers provides opportunity to improve government efficiency. Consolidating federal data centers provides an opportunity to improve government efficiency and achieve cost savings of up to $3 billion over 10 years.

For what it's worth, and for some horn tooting, Xentity staff provided support to the Interior Enterprise Architecture during the 2003-2005 period, when DOI EA focused on consolidating enterprise licenses across all its bureaus department-wide.

In 2012, GAO also added:

Area 19: Information Technology Investment Management: The Office of Management and Budget and the Departments of Defense and Energy need to address potentially duplicative information technology investments to avoid investing in unnecessary systems.

For Geospatial Investment, it cites (Page 196) our general mantra on Geospatial Integrated Services and Capabilities:

Area 11: Geospatial Investments: Better coordination among federal agencies that collect, maintain, and use geospatial information could help reduce duplication of geospatial investments and provide the opportunity for potential savings of millions of dollars.

This repeats – mind you, at a higher level – the concepts from its earlier report, which we covered in GAO releases report on FGDC Role and Geospatial Information.

Also, there was a trend toward improving and simplifying Federal Contracting.

Finally, in talking about Research, they noted:

Area 10: Dissemination of Technical Research Reports: Congress should consider whether the fee-based model under which the National Technical Information Service currently operates for disseminating technical information is still viable or appropriate, given that many of the reports overlap with similar information available from the issuing organizations or other sources for free.

There is so much more in this report. The question is: will the powers that be embrace these ideas as part of their program or political agendas and objectives?


State of Colorado IT highlighted in GovExec as example of Smart Lean Government

Blog post
edited by
Matt Tricomi

Xentity staff have been supporting methodology development with ACT-IAC on Smart Lean Government (http://smartleangovernment.com). A recent GovExec article has come out highlighting Smart Lean Government as well as the State of Colorado.

Opening

In a data-driven world, agencies can’t afford to go it alone any more. When Hurricane Katrina struck the Gulf Coast in 2005, the response and recovery were considered a disaster for government. There was no clear chain of command.

Our effort at Smart Lean Government is to introduce cross-government management and design of services across communities, be life-event-based, and find ways to integrate services in design and implementation.

You can see the print magazine version in PDF here or the web version.

Here specifically is the part on the State of Colorado Office of Information Technology's point of view:

Treating Citizens Like Customers in Colorado

As a private sector technology executive, Kristin Russell [note: recently outgoing CIO] watched companies become adept at tracking customers from one division to the next and learning everything they could about them along the way. 

When a warranty expired, a product was recalled or a superior product came out, they knew just who to contact. And they knew the best way to contact them. 

When Russell became Colorado’s chief information officer, she saw something different. State agencies weren’t competing with anyone, so they had little incentive to offer great customer service. 

This wasn’t just bad for citizens. It was costly for government too. One agency spent $4 million annually on postage. If citizens could opt for email-only contacts statewide, that figure could be reduced significantly, Russell says. 

Russell and Colorado’s Chief Technology Officer Sherri Hammons started planning for a governmentwide customer relations management system that could recognize citizens from one agency to the next, save their addresses and personal information, and alert them to services they might qualify for. 

An early version, called PEAK, offers a unified portal for medical, welfare and child support services and links to the state’s new online health insurance marketplace. Russell hopes to expand the PEAK concept across Colorado’s 22 agencies so citizens can interact with government once and be done. 

****

Xentity is excited to be supporting both the SLG initiative and, as a service provider for the OIT, the state's move of IT to a customer-oriented set of services, as part of Xentity's recent IT IDIQ award from the State of Colorado.