Using big data to map climate change

When I think about climate change and biological systems in general, I tend to think not in terms of the mere warming of the planet but in terms of balance/stability versus volatility/anomaly. When we see Hurricane Sandy and attempt to draw a connection to climate change, what we’re concerned about is not the warming of the planet by itself but anomalous and volatile weather.

To that end, the folks at Enigma Labs have mined the last two centuries of daily weather data to produce expected normal high and low temperatures for each day of the year. They then mapped the last 50 years of observations against those baselines to produce a compelling graphic indicating that daily weather behavior is getting more anomalous. We’re seeing more warm or strongly warm days, as measured against historical norms. In 1964, 42 percent of days were warm or strongly warm relative to the historical average. Today that figure is 67 percent, and the model suggests it will top 70 percent by 2030.

Enigma’s model was a very big data project: they started with daily measurements from 90,000 weather stations worldwide and ended up with almost 900 million rows of data, which mapped to over 3 million anomalies. The project is one more indicator that data science has a clear role to play not just in understanding climate change but in responding to it, as many of the digital green startups are attempting to do by leveraging data to improve energy efficiency.
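The core of the approach, computing an expected normal for each station and calendar day from the historical record, then flagging more recent days that run warmer than that baseline, can be sketched in a few lines. This is an illustrative reconstruction, not Enigma's actual pipeline; the record format, the baseline cutoff year, and the function names here are my own assumptions.

```python
from collections import defaultdict

# Each record is assumed to be (station_id, day_of_year, year, high_temp).

def daily_normals(records, baseline_end_year=1960):
    """Mean daily high per (station, day-of-year) over the baseline period."""
    sums = defaultdict(lambda: [0.0, 0])
    for station, doy, year, high in records:
        if year <= baseline_end_year:
            acc = sums[(station, doy)]
            acc[0] += high
            acc[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

def percent_warm(records, normals, year, threshold=0.0):
    """Share of daily readings in `year` that exceed the station's normal."""
    warm = total = 0
    for station, doy, yr, high in records:
        if yr != year or (station, doy) not in normals:
            continue
        total += 1
        if high > normals[(station, doy)] + threshold:
            warm += 1
    return 100.0 * warm / total if total else 0.0
```

Run over 900 million rows, a per-year `percent_warm` series is exactly the kind of trend line the Enigma graphic plots; the `threshold` parameter would let you separate merely warm days from strongly warm ones.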

Adam Lesser

Analyst, Gigaom Research
