This wildfire season has definitely seen a different trend: not necessarily a massive national increase in fire overall, but definitely […]
Taking a light look at Denver West, what follows covers just two facets of free public-domain data. There is so much more beyond imagery and maps, and even more maps and more imagery (e.g., MODIS, Landsat, and other NASA products).
The USGS topo map 1-meter quads below are arranged clockwise:
(Download this KML file to see all options, or click the years above the thumbnails below to grab that historic map.)
- 6 different dates for 15 quads, 1939-1965
- 5 different dates for 12 quads, 1941-1965
- 5 different dates for 11 quads, 1938-1965
- 4 different dates for 8 quads, 1939-1965
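The KML index mentioned above is plain XML, so you can pull the list of quads out of it programmatically. Below is a minimal sketch using only the Python standard library; the Placemark name/description layout shown in the sample is an assumption for illustration, not the actual schema of the USGS file.

```python
# Sketch: list Placemark entries from a KML index file.
# The sample document below is invented for illustration; the real
# USGS KML will have its own structure and fields.
import xml.etree.ElementTree as ET

KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

def list_placemarks(kml_text):
    """Return (name, description) pairs for every Placemark in a KML string."""
    root = ET.fromstring(kml_text)
    results = []
    for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        name = pm.findtext("kml:name", default="", namespaces=KML_NS)
        desc = pm.findtext("kml:description", default="", namespaces=KML_NS)
        results.append((name.strip(), desc.strip()))
    return results

sample = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Golden quad</name>
      <description>1939 edition</description>
    </Placemark>
    <Placemark>
      <name>Morrison quad</name>
      <description>1965 edition</description>
    </Placemark>
  </Document>
</kml>"""

if __name__ == "__main__":
    for name, desc in list_placemarks(sample):
        print(f"{name}: {desc}")
```

The same loop works on the downloaded file by swapping `ET.fromstring` for `ET.parse(path).getroot()`.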
| Historic Imagery Type | Year | Metadata | Thumbnail (actual files range from 20-200 MB) |
| --- | --- | --- | --- |
| Digital Orthophoto Quadrangles (DOQs) | 1994 | | |
| National Aerial Photography Program (NAPP) | 1988 | | |
| National High Altitude Photography (NHAP) | 1983 | | |
| Space Acquired Photography | | Black-and-white, natural color, and color infrared aerial photographs; 400 or 1,000 dpi | |
| Aerial Photo Mosaics (used when creating early/mid USGS topo maps) | 1953 | | |
| High Resolution Orthoimagery (corrected; generally 0.3-meter, color) | | | |
| Declass 1 (1996) Stereo Images | 1965 | | |
| Declass 2 (2002) Stereo Images | 1966 | | |
As a follow-on to the "cliffhanger" in BigData is a big deal because it can help answer questions fast, there are three top limitations right now: data quality, people and process, and technology access to information.
Let's jump right in.
Number One and by far the biggest – Data Quality
Climate change isn't a myth, but it is the first science ever presented to the public primarily on a data premise. In doing so, its advocates prematurely presented models that didn't take the driving variables into account. Their models have changed over and over again. The resolution of their source data has increased. Simulations stacked on simulations have produced countless theories across various models that can only really be demonstrated by Hollywood blockbusters. The point being: we are dealing with inferior data for a world-scale problem, and we jumped into the political, emotionally driven arena with a data report. We will be the frog in slowly warming water, and we will hit that boiling point late, all because we started with a data-justification approach using low-quality data. Are they right that the world is warming? Yes. Do they have enough data to prove the right mitigation, remediation, or policy adjustments? No, and not until we either increase the data quality or take a non-data tack.
People and process are a generation away.
Our IT processes have been driven by Defense and GSA business models from the fifties: put anyone managing the 0s-and-1s technology in the back. They are nerds, they look goofy, they can't talk, they don't understand what we actually do here, and by the way, they smell funny. That has been the approach to IT since the 50s, and nothing has changed except that there are now a few baker's dozens of hoodie-wearing, Mountain Dew-drinking late-night owls who happen to be loaded, and a pseudo-culture of geek chic. We have not matured our investment in people to balance maturity of service, data, governance, design, and product lifecycle, or to embrace that engine culture as core to the business. This means more effective information-sharing processes to get the right information to the right people. It also means investing in the right skills (not just feeding Doritos and free soda to hackers) to manage the information-sharing and data lifecycle. I am not as worried about this one. As the baby-boomer generation retires, it will leave a massive vacuum; Generation X is too small, and we'll have to groom Generation Y fast. That said, we will mess up a lot and lose a lot to brain drain, but the market will demand relevancy, which will, albeit slowly, create this workforce model in 10-15 years.
Access to Environments
If you had asked this before hosting environments or the cloud, access would have been limited to massive corporations, defense, intel, and the parts of academia co-investing with those groups. If you can manage the strain of shifting to a big-data infrastructure, this barrier should be the least of your problems. If your staff can get the data they need at the speed they need, so they can process in parallel without long wait times, you are looking good. Get a credit card, or, if you are Government, buy off a Cloud GWAC, and get your governance and policies moving, as they are likely behind and not ready; left alone, they will prolong the siloed-information phenomenon. Focus on the I in IT, and let the CTO respond to the technology stack.
Focus on data quality, have a workforce investment plan, and continue working your information access policies
The tipping point that moves you into big data is where these factors, combined, require you to deal with complicated enormity at speed, serving not just MIS and reports but genuinely helping answer questions. If you can focus on those things in that order (likely solving them in reverse), you will be able to implement parallelization of data discovery.
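As a sketch of what "parallelization of data discovery" can look like in practice, the snippet below fans a filter function out across many data chunks at once instead of scanning them one by one. The chunk format and the threshold are illustrative assumptions, not a prescribed pipeline.

```python
# Sketch: scan many data chunks in parallel for readings of interest,
# so analysts are not stuck waiting on a single serial pass.
from concurrent.futures import ThreadPoolExecutor

def find_signal(chunk):
    """Keep only the readings that clear an (assumed) threshold of 90."""
    return [x for x in chunk if x > 90]

def discover(chunks, workers=4):
    """Scan every chunk concurrently and merge the hits in input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        hits = []
        for found in pool.map(find_signal, chunks):
            hits.extend(found)
    return hits

if __name__ == "__main__":
    data = [[10, 95, 40], [88, 91], [12, 15], [99, 3]]
    print(discover(data))  # prints [95, 91, 99]
```

Threads suit I/O-bound discovery (pulling chunks from storage or the network); for CPU-bound filtering, the same code works with `ProcessPoolExecutor`.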
This will shorten the distance from A to B and create new economies, new networks, and enable your customer or user base to do things they could not before. It is the train, plane, and automobile factor all over again.
The article Throwing a Lifeline to Scientists Drowning in Data discusses how we need to be able to "sift through the noise" of this ever-faster deluge of sensors and feeds. It is not about the information-management models of fast, large retail or defense data; it is about finding the signals you need to know so you can take advantage.
In controlled environments, like retail and business, this has been done for years on end to guide business analytics and targeted micro-actions.
For instance, the gambling industry has been doing this for 15-plus years: taking in the transactional data of every pull of every slot machine in every one of their hotels, linked with the loyalty card you entered, the time of year, when you go, your profile, and your trip patterns. Then, laws allowing, they adjust the looseness of the slots, the coupons provided, and the trip rewards, all to make sure they do what they are supposed to do in capitalism: be profitable.
Even in uncontrolled environments such as intelligence, defense, or internet search, the model is to build analytics on analytics to improve the data quality and lifecycle so that the end analytics can improve. It's sound equalizers stacked on top of the soundboard.
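A toy illustration of that "analytics on analytics" layering: a first-stage pass scores and cleans the raw records, and the end analytic only ever sees the improved feed. The record shape and quality checks are made up for the sketch.

```python
# Sketch: two-stage pipeline where an upstream quality analytic
# feeds a downstream end analytic, so the end result improves as
# the data-quality stage improves.
def quality_pass(records):
    """Stage 1: drop records that fail basic (assumed) quality checks."""
    return [r for r in records if r.get("value") is not None and r["value"] >= 0]

def trend_pass(records):
    """Stage 2: the end analytic, run only on the cleaned feed."""
    values = [r["value"] for r in records]
    return sum(values) / len(values) if values else 0.0

if __name__ == "__main__":
    raw = [{"value": 10}, {"value": None}, {"value": -5}, {"value": 30}]
    print(trend_pass(quality_pass(raw)))  # prints 20.0
```

Tightening `quality_pass` changes nothing downstream except the quality of the answer, which is the point of stacking the equalizers.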
Don't go for the neat tech for your MIS. Go because your users are asking more of you in the data-information-knowledge chain.
Continue on to read more in our follow-on article: BigData is a big deal because it can help answer questions fast.