Go Code Colorado Open Data Effort is going into its final weeks


States across the country have gotten into the Open Data movement. Colorado has as well, and its recent Go Code Colorado effort (http://gocode.colorado.gov/) is a unique entry in this arena.

Go Code Colorado was created to help Colorado companies grow, by giving them better and more usable access to public data. Teams will compete to build business apps, creating tools that Colorado businesses actually need, making our economy stronger.


The following is a great summary video of the event, produced by the State and one of Xentity’s colleagues, Engine7 Media.




Xentity is very proud to be supporting this innovative government solution

Xentity was awarded the IT consulting support contract for the Business Intelligence Center platform and data catalog, which underpins the now-branded Go Code Colorado initiative. Xentity’s consultants have provided the data and technology resources to manage and advise the publication of public-sector data to the Colorado Information Marketplace, and to provide technical support to developers who participate in the Challenge.

Xentity has primarily provided data platform support: data readiness analysis, data architecture guidance, project management, and the data analysts who “wrangle” the data (aka ETL) to get the datasets onto the platform. We have also provided on-site IT and data support at the various locations and events to ensure that challenge participants and finalists get the help they need to access and use the data and services. Finally, we are supporting the technical review of applications to ensure they can have a life beyond the “hackathon” stage.
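
To give a flavor of that “wrangling,” here is a minimal sketch, in Python, of the normalize-and-validate step that precedes publication. It is illustrative only: the file name and the “business_name” column are hypothetical stand-ins, not actual Go Code Colorado datasets, and actual loading onto the Colorado Information Marketplace goes through that platform’s own publishing tools.

    # Minimal ETL sketch: clean a hypothetical agency CSV so it is ready to stage.
    # The file name and the "business_name" column are illustrative assumptions.
    import csv

    def wrangle(in_path, out_path):
        """Normalize headers, trim whitespace, and drop rows missing key fields."""
        rows_out = 0
        with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            # Normalize headers like "Business Name " -> "business_name"
            fields = [h.strip().lower().replace(" ", "_") for h in reader.fieldnames]
            writer = csv.DictWriter(dst, fieldnames=fields)
            writer.writeheader()
            for row in reader:
                clean = {k.strip().lower().replace(" ", "_"): (v or "").strip()
                         for k, v in row.items() if k}
                if not clean.get("business_name"):  # skip unusable rows
                    continue
                writer.writerow(clean)
                rows_out += 1
        return rows_out

    if __name__ == "__main__":
        print(wrangle("agency_extract.csv", "marketplace_ready.csv"), "rows staged")

Each real dataset needed its own source-specific fixes (date formats, geocoding, agency code tables), but the pattern of normalize, validate, and stage for publication is the same.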

The final stages come in the first ten days of May. The ten finalists have demonstrated very viable solutions toward the goal of making our economy stronger.

Some more background and detail on how we got here

(The following is from the State as guidance on this effort.)


Colorado government agencies possess large volumes of public business and economic data. This data can help businesses with strategic planning, but it exists in so many different places and formats that most businesses find it difficult to use. The Secretary of State’s office will address this problem through the creation of the Business Intelligence Center (BIC), which seeks to aggregate and analyze data available to the business community.

This effort is led by the Colorado Secretary of State. The Secretary of State’s office interacts with hundreds of thousands of business entities, charities, and nonprofits in the state. The Secretary of State’s office collects, manages, and disseminates large amounts of basic data about those organizations and wanted to make the data useful to Colorado businesses. 

The Department sought to make this data more useful and collaborated with the Leeds School of Business at the University of Colorado to publish the Quarterly Business and Economic Indicator Report. This report combines Department data with other economic data collected by the Leeds School to provide meaningful economic information to the business community. For instance, new business filings are a leading indicator of job creation. With this and other information provided in the report, the business community can make smarter decisions that will grow the Colorado economy.

Since first publishing the report in 2012, the Secretary of State has received comments from many members of the business community asking to see more detailed data on economic trends in order to better understand the distribution of commerce in Colorado, including access to the location, size, vibrancy, and concentration of key business nodes. While this level of detail would be tremendously helpful, the Department cannot provide the information because multiple state agencies collect the desired data and it is not readily available in a common place, or even a common format.

A central data collection point is needed. During meetings with other government agencies, Department staff concluded that these data requests could be met by aggregating the information spread throughout various agencies and databases into a single tool, breaking down agency silos and better cataloging existing resources. Department staff also concluded that access and availability are not enough: to make the raw data useful to the vast majority of business owners, data analysis and visualization tools are needed. These conclusions led to the Business Intelligence Center project.

The Business Intelligence Center consists of a centralized data catalog that combines public data into a meaningful tool for businesses. 

The vision for this project is two-fold. First, it consolidates public data relevant to businesses on a single platform. Second, it gives businesses the tools to make the data useful. The second goal is achieved through a civic apps challenge, the Colorado Business Innovation Challenge, which will give financial incentives to the technology community to build web and mobile applications that use state and other data to solve existing business challenges.

The data platform is akin to an information clearinghouse. It will make data sources currently dispersed across multiple government departments and agencies accessible in a common location. This platform will offer Colorado businesses unprecedented access to public data that is validated and relevant to short- and long-term needs. Besides enhancing businesses’ access to state data, the BIC will also contribute to economic growth. The creation of the BIC will make data available to all Colorado businesses at no additional cost. Currently, only large entities with the time, staff, and budget to engage in detailed statistical analysis can use these data sets. Providing this data to Colorado businesses of every type and size presents a unique opportunity to contribute to economic development. The BIC will nurture key industry networks and lay the foundation for a digital infrastructure that will continue to expand and improve over time.

The Colorado Business Innovation Challenge is an innovative way to create solutions and ensure the BIC is useful to Colorado businesses.

Simply making the data available is insufficient for most business owners. To truly help the vast majority of businesses, especially small businesses, tools must be developed that present the data in a useful and consumable form. Normally government agencies develop tools to fill this information vacuum, but historically the government has not been successful at developing highly useful and effective tools. A new approach is needed; that approach is the Colorado Business Innovation Challenge.

Modeled after the “civic apps” challenges that have been run in multiple cities across the United States and internationally, the Challenge presents the software development community with problem questions and then asks that community to create possible solutions. At the end of the challenge, the Secretary of State will license the most innovative and implementable web or mobile application. The best design will receive a contract with the Secretary of State to make the application available to the public on the Business Intelligence Center platform. The Department will also pursue partnerships with the Colorado technology and startup industry to provide additional incentives, such as mentoring, hosting, and office space, to the Challenge winners. The long-term intent of the program is not only to create an environment that fosters community involvement through the Challenge, but also to sustain the tools developed in the Challenge.

Flipping the Educational Value Chain


Businesses, governments, and even many non-profits have benefited from the windfall of a flattening world: less war, a trend toward better resource distribution, new business models, digital economy proliferation, a sharing workforce. Education has not.

At Xentity, exploring NextGen transformation using architecture, analysis, and design is not about IT. IT is a core component, but we are looking at how the next generation will progress and transform. And with generation lines becoming more of a blur, this is not a 30-year delay, or even a 15-year delay; in some cases, a generation transforms in 5 to 10 years. Given that, when we examine workforce capital, we are truly interested in the changing models not just of the employee (which, by the way, is a relic of the industrial age) but of how all those employed in your organization (employee, contractor, consultant, vendor, service provider) are changing themselves.

One way of examining this is to look at the actual next generation: the kids. This is very important. The current incoming generation, aside from now being larger than the Baby Boomer generation, has benefited from the previous 30 years of relative stability, and Millennials engage in collaborative environments naturally, as a result of growing up in a connected world.

They weren’t taught this, though. What they were taught, for the most part (with some Montessori, STEM academy, and other cloud-school exceptions), came through a school model intended to send children into an industrial-revolution business workforce: one with bells to change shifts, the discipline of a “robot” in the factory for efficiency and safety, and still minds to take orders and execute.

When examining your organization, you may find unwritten rules or codes that have been passed down out of habit, institutionalization, or “what we know.” Those unspoken rules of engagement or life definitely help manage the chaos and keep focus on the mission, but the questions that at times need to be asked are: “Is this the right mission? If not, are these the right rules?” And thereafter, of course: do you, and does your organization, have the political and actual capital to make the transformation?

In the following two-part series, Jim Barrett examines this phenomenon.

Mr. Barrett is not only Xentity’s architecture lead, but has also actively served on, and presently engages with, multiple early childhood education development advisory and exploratory boards.


In 2014 Every Business will be Disrupted by Open Technology


The article “In 2014 Every Business will be Disrupted by Open Technology” raised some very good points about key disruptive patterns. A central theme we picked up on was the movement of open data and open platforms from the bleeding and leading edge to more popular adoption.

As the article notes:

Yet the true impact begins not with invention, but adoption. That’s when the second and third-order effects kick in. After all, the automobile was important not because it ended travel by horse, but because it created suburbs, gas stations and shopping malls.

A few tangible themes we picked up on are:

  1. Stand-alone brands are shifting to open platforms for co-creation
  2. Open platforms built on a brand’s intellectual property enhance its value rather than threaten it
  3. Open platforms provide enterprise-level capabilities to the smallest players, leveling the playing field, accelerating innovation, and amplifying competition
  4. Open platforms for co-creation shift the focus away from driving out inefficiencies and toward the power of networking and collaboration to create value

Where it’s happening already: Federal, State, City, Commercial

FedFocus 2014 also emphasized that the budget appropriations for 2014 and 2015 will keep two big disruptive areas front and center: Open Data and Big Data. In particular, the White House Open Data memorandum, released in May 2013 and taking effect in November 2013, will impact Open Data by:

Making Open and Machine Readable the New Default for Government Information, this Memorandum establishes a framework to help institutionalize the principles of effective information management at each stage of the information’s life cycle to promote interoperability and openness. Whether or not particular information can be made public, agencies can apply this framework to all information resources to promote efficiency and produce value.

We are seeing states get into the mix as well, with Open Data movements like http://gocode.colorado.gov/:

Go Code Colorado was created to help Colorado companies grow, by giving them better and more usable access to public data. Teams will compete to build business apps, creating tools that Colorado businesses actually need, making our economy stronger.

Cities are in the mix too: the City of Raleigh, North Carolina, is well recognized for its award-winning Open Data Portal.


We had previously tweeted about how IBM opened up their Watson cognitive computing API to developers… publicly. This is a big deal. They know that with an open data platform as an ecosystem, they not only get more use, which means more comfort, which means more apps, but also that every legally allowed transaction that happens on it improves the interpretive signals that make Watson so wickedly smart. The article points out this key example as well.


National Data Assets, too, are moving ahead to make their data more distributable over the cloud: moving data closer to cloud applications and offering data via web services where datasets are too large or updated too often to sync, download, or sneakernet.
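
As a concrete illustration of that web-services pattern, below is a minimal sketch of requesting a small, filtered slice of a large dataset from a Socrata-style service such as the Colorado Information Marketplace (data.colorado.gov), rather than downloading the whole file. The dataset identifier and field names are hypothetical placeholders; $select, $where, and $limit are standard SODA query parameters.

    # Pull a filtered slice of a large open dataset via its web service,
    # rather than downloading the full file. The "xxxx-xxxx" dataset id and
    # the field names below are placeholders, not a real Colorado dataset.
    import json
    import urllib.parse
    import urllib.request

    BASE = "https://data.colorado.gov/resource/xxxx-xxxx.json"  # hypothetical id

    params = urllib.parse.urlencode({
        "$select": "business_name, principal_city",
        "$where": "principal_city = 'Denver'",
        "$limit": 25,
    })

    with urllib.request.urlopen(BASE + "?" + params) as resp:
        records = json.load(resp)

    for rec in records:
        print(rec.get("business_name"), "-", rec.get("principal_city"))

The same slice-on-request idea applies to geospatial services: a web map service can hand back just the tiles or features a user is looking at, instead of the full national dataset.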

Xentity and its partners have been at the forefront of all these movements.

We have enjoyed being on the leading edge since the early phases of this movement. Our architectures dwell less on commodity IT (not to undersell the importance of affordable, fast, robust, scalable, enabling IT services and data center models) and focus instead on putting the “I” back in IT.

We have been moving National Geospatial Data Assets into these delivery models as data products and services (Xentity is awarded USGS IDIQ for Enterprise and Solution Architecture), supporting the architecture of data.gov (Xentity chosen to help re-arch data.gov), and recently supporting the data wrangling on Colorado’s State OpenData efforts. We are examining whether a predictable supply chain for geospatial data can be done, and we are actively participating in NSF EarthCube, which looks to “Imagine a world…where you can easily plot data from any source and visualize it any way you want,” and where we have presented concepts.

Our architecture methods (Developing a Transformation Approach) are slanted to examine mission-oriented performance gains, process efficiencies, data lifecycle management orientation, integrating service models, and balancing the technology footprint while innovating. For instance, we are heavily involved in the new ACT-IAC Smart Lean Government efforts to align services across government and organizational boundaries around community life events, much as other nations are beginning to do.

Xentity is very excited about the open data movement, the platforms supporting it, and the traction it is getting in industry. This may move us forward from information services into the popular space of knowledge services (Why we focus on spatial data science).


Xentity recognized on CIO Review list for Most Promising Government Technology Solution and Consulting Providers 2013


Xentity was recognized on the CIO Review “20 Most Promising Government Technology Solution and Consulting Providers 2013” list.

With the advent of internet technologies, the landscape of business processes related to the Federal Government has changed. But the change hasn’t been easy, as it requires constant dedication to move the entire workforce off traditional systems and get them to adapt seamlessly to modern ones. This transition also includes the role of technology consulting providers, whose responsibility is to provide a wide spectrum of services that help federal agencies cope with the changes in the best possible manner.

As customers and business partners increasingly demand greater empowerment, it is imperative for government organizations to seek improved interactions and relationships across their entire business ecosystems by enhancing software capabilities for collaboration, gaining deeper customer and market insight, and improving process management.

In the last few months, we have looked at hundreds of solution providers and consulting companies and shortlisted the ones at the forefront of tackling challenges related to the government industry.

In our selection, we looked at each vendor’s capability to fulfill the needs of government organizations through a variety of services that support core business processes across all government verticals, including innovation areas related to advanced technologies and smart customer management. We also looked at the service providers’ capabilities in deploying cloud, Big Data and analytics, mobility, and social media in the specific context of government business.

We also evaluated each vendor’s support for government in bridging the gap between IT and Operations Technology. We present to you CIOReview’s 20 Most Promising Government Technology Solution and Consulting Providers 2013.

CIO Review Magazine Full Article on Xentity:

Xentity Corporation: Rapidly Designing The Needed Change In Cost-Cutting Times

By Benita M
Friday, December 6, 2013


“We always try to believe that leaders want to execute positive change and can overcome the broken system. We are just that naïve,” says Matt Tricomi, Founder of Xentity Corporation in Golden, CO, named for “change your entity,” which started on this premise just after 9/11 in 2001. “This desire started in 1999. I was lucky enough to be solution architect on the award-winning re-architecture of united.com. It was a major revenue shift from paper to e-ticket, but the rollout included introducing kiosks to airports. Now that was both simple and impactful.” Xentity found its niche in providing these types of transformations in information lifecycle solutions. Xentity started slow, first providing embedded CIO and Chief Architect leadership for medium to large commercial organizations.

Xentity progressed, in 2003, into supporting the Federal Government, and soon thereafter international clients, to help IT move from the 40-year-old cost center model to where the commercial world had successfully transitioned: the service center. “Our first Federal engagement was serendipitous. Our staff was core support on the Department of the Interior (DOI) Enterprise Architecture team,” Matt recalls of how the program went from “worst to first” after over $65 million in cuts. “We wanted to help turn architecture on its head by focusing on business areas, missions, or segments one at a time, rather than attack the entire enterprise from an IT-first perspective.” The business transformation approach developed there was ultimately adopted as the centerpiece, or core, of the OMB Federal Segment Architecture Methodology (FSAM) in 2008.

Xentity focuses on the rapid and strategic design, planning, and transformation outreach portion of the technology investment in programs or CIO services. This upfront portion is generally 5 to 10 percent of overall IT spending. Xentity helps address the near-term cost-cutting need while introducing the right multi-year operating concepts and shifts that take advantage of disruptions like Geospatial, Cloud, Big Data, Data Supply Chain, Visualization, and Knowledge Transfer. Xentity helped data.gov overcome an eighty percent budget cut this way. “Healthcare.gov is an unfortunate classic example. If acquisition teams had access to experts to help register risks early on, the procurement could have increased the technically acceptable threshold for success.”

One Xentity success story is at the United States Geological Survey (USGS). “After completing the DOI Geospatial Services Blueprint, one of several, the first program to be addressed was the largest: the USGS National Mapping Program.” This very respected and proud 125-year-old program had just been through major reductions in force and was trying to catch its breath. “The nation needs this program. The blueprint cited studies in which spending $1 on common “geo” data can spur $8 to $16 in economic development. Google Maps is one of thousands of products which use this data.” The challenge was to transition a paper map production program into a data product and delivery services provider. “The effort affected program planning, data lifecycle, new delivery and service models, and road-mapping the technology and human resource plan. We did architecture, PMO, governance, planning, BPR, branding, etc.” Xentity, with its respected TV production capability, even supported high-gloss video production to deal with travel reductions and to communicate the program’s value and changes with partners and the new administration. This is definitely different from most technology firms. The National Map got back on the radar, increased usage significantly, and is expanding into more needed open data.

Presently, Xentity is a certified 8(a) small disadvantaged business with multiple GSA Schedules and GWACs (Government-Wide Acquisition Contracts). Xentity has invested heavily in Federal business management: part of providing innovative, pragmatic, and rapid architecture and embedded talent is being able to respond quickly with compliant business management vehicles. Xentity is constantly seeking out the passionate CIOs, Program Directors, Architects, and Managers looking at transformation in this cost-cutting environment. “Sequester, fiscal cliff, debt ceiling, continuing resolutions: it’s all tying the hands of executives who can look at best six months out. They don’t have the time to both re-budget and rapidly design multi-year scenarios against out-year performance drivers and options, let alone stay up to speed on the latest disruptions or the right innovation. That is where we come in. We start small, or as fast or slow as the executive wants or believes their organization can absorb, and progress.”

BigData is a big deal because it can help answer questions fast


BigData is not just the size and speed of complex data; it is moving us from information to knowledge


As our Why we focus on spatial data science article discusses, the progress of knowledge fields (history to math to engineering to science to philosophy), like the individual pursuit of knowledge, is based on moving from experiments to hypotheses to computation and now to The Fourth Paradigm: Data-Intensive Scientific Discovery. This progression has happened over the course of human history and is now repeating itself on the internet.

The early-90s web was about content, history, and experiments. The late-90s web was about transactions, security, and eCommerce. The 2000s web was about engineering entities breaking silos, within companies, organizations, sectors, and communities. The 2010s web has been about increasing collaboration in communication and work production, and entering into knowledge collaboration. The internet’s progression simply emulates the development of human capability through history.

When you are ready to move into BigData, it means you want to answer new questions.

That said, the BigData phenomenon is not about the input of all the raw data and the explosion that the Internet of Things is being touted as. The resource sells, and the end product is the consumed byproduct, so let’s focus on that byproduct: knowledge. It’s not about the speed of massive amounts of new, complex data of varying quality, as our discussion of IBM’s 4 V’s covers.

It’s about what we can do on the cheap with technology that previously required supercomputer clusters only the big boys had. Now, with cloud, internet, and enough standards, if we have good and improving data, we ALL have the environment to answer complicated questions while sifting through the noise. It’s about enabling the initial phase of knowledge discovery, the one everyone is complaining about on the web right now: “too much information,” “drowning in data.”

The article Throwing a Lifeline to Scientists Drowning in Data discusses how we need to be able to “sift through the noise” and make search faster. That is the roadblock, the tall pole in the tent, the showstopper.

Parallelizing the search is the killer app; this is the Big Deal, and we should call it BigSearch

If you have to search billions of records and map them to another billion records, doing it in sequence is the problem. You need to shorten the time it takes to sift through the noise. That is why Google became an amazing success out of nowhere: they did, and currently do, it better than anyone else, sifting through the noise.
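
To make the point concrete, here is a toy sketch (assuming nothing about Google’s internals) of parallelizing a scan: split the records into chunks and search the chunks concurrently, so wall-clock time drops roughly with the number of workers.

    # Toy "BigSearch" sketch: scan record chunks in parallel instead of in sequence.
    # Illustrative only; real systems shard across machines, not just processes.
    from concurrent.futures import ProcessPoolExecutor

    def scan_chunk(args):
        """Return indices (offset by chunk start) of records matching the term."""
        start, chunk, term = args
        return [start + i for i, rec in enumerate(chunk) if term in rec]

    def parallel_search(records, term, workers=4):
        size = max(1, len(records) // workers)
        chunks = [(i, records[i:i + size], term)
                  for i in range(0, len(records), size)]
        hits = []
        with ProcessPoolExecutor(max_workers=workers) as pool:
            for partial in pool.map(scan_chunk, chunks):
                hits.extend(partial)
        return hits

    if __name__ == "__main__":
        data = [f"record {n}: widget" if n % 3 else f"record {n}: gadget"
                for n in range(1_000_000)]
        print(len(parallel_search(data, "gadget")), "matches")

Sharding the same scan across machines rather than processes is the identical idea at datacenter scale; it is what MapReduce-style systems do.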

The United States’ amazing growth comes down to two things: we have resources, and we found out how to get to them faster. Each growth phase of the United States was based on that fact alone, plus a bit of stopping the barbarians at the gates, or ourselves from implosion. You could say that is civilization. Some softball examples out of hundreds:

  • Expanding West dramatically exploded after trains, which allowed for regional foraging and mining
  • Manufacturing dramatically exploded production output, which allowed for city growth
  • Engines shortened time between towns and cities, which allowed for job explosion
  • Highway systems shortened time between large cities, which allowed for regional economies
  • Airplanes shortened time between the legacy railroad time zones, which allowed for national economies
  • The Internet shortened access to national resources internationally, which allowed for international economies
  • Computing shortened the processing time of information, which allows for micro-targeted economies worldwide

Each “age” shortened the distance from A to B. Google is sifting through data; scientists are trying to sift as well, through defined data sensors, linking them together and asking very targeted simulated or modeled questions. We need to address the barriers limiting entities’ success in doing this.