
Can I Ever Have Too Much Data?

or, Nyquist Wanted Less Data, Not More

This article is not about the retired Kentucky Derby-winning racehorse. It’s not about the hockey player with the Red Wings, either. Both have probably made more money in their lives than Harry Nyquist, who has a mini-bio at the bottom of our glossary page.

Nyquist was the most important thinker of his day about the relationships between bandwidth, channel capacity, and noise.

Several data and signal processing principles are named for Nyquist.

The Nyquist Rate is one of them: the minimum sampling rate needed to capture a signal without losing information. This is complicated stuff. It was groundbreaking 80 and 90 years ago, and it still confuses people today. Some of the confusion over Nyquist’s ideas has resurfaced in big data.
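
In one line, the idea behind the Nyquist rate is simple: to faithfully capture a signal whose highest frequency of interest is B, you must sample at least twice that fast.

```latex
f_s \ge 2B \qquad \text{where } f_s \text{ is the sampling rate and } B \text{ is the highest frequency of interest}
```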

His ideas can be a big help in thinking about IoT and analytics. But misunderstanding them can all but guarantee your project fails.

The link above, or a web search, can provide some basics on Nyquist’s work. But we haven’t seen a good article on how his work gets misused, misunderstood, or misapplied. In particular, few people seem to understand what sampling rate of IoT data they need to collect or send to the cloud.

Your IoT project can succeed or fail based on one of three choices you make:

  1. You ignore Nyquist. Most likely you are making horrible mistakes. You may be blissfully unaware of this until something happens.
  2. You think about Nyquist but misunderstand. This ends the same way as choice number 1.
  3. You understand and apply Nyquist correctly, so life is good.

This is like the joke about passes in American football: three things can happen, and two of them are bad.

To understand these choices, imagine installing vibration sensors at a wind farm.  You think a fatal vibration frequency is 2,000 cycles per second, caused by turbulence acting on the blades.

Trap #1 would be thinking you know how to apply Nyquist from what we know so far. But the 2,000 cycles per second is not what matters. What we care about is how fast the turbines degrade, and how fast the vibration signature can change.

If the vibration signature changes over a matter of, say, three hours, then sampling every half hour is already far more than Nyquist demands. And if it takes days of exposure to vibration above a certain level to cause damage, that same collection rate exceeds what we need for that purpose, too.

But it is tempting to look at the 2,000 cycles per second “information” and think we need samples at 4,000 or 5,000 per second. That would only be true if you wanted to make a perfect recording of the vibration waveform itself, assuming a sinusoid. And if you did, who would listen to it?
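
To see how big the gap is, it helps to put numbers on both questions. Here is a quick back-of-the-envelope sketch in Python; the three-hour signature-change time comes from the example above and is an assumption, not a measurement.

```python
# Back-of-the-envelope numbers for the wind-farm example. The 3-hour
# signature-change time is an assumption for illustration.

blade_vibration_hz = 2_000                 # highest frequency in the raw signal

# Question 1: make a faithful recording of the 2,000 Hz waveform.
# Nyquist says sample at least twice the highest frequency.
recording_per_hour = 2 * blade_vibration_hz * 3_600   # 14,400,000 samples/hour

# Question 2: track how the vibration signature drifts over ~3 hours.
# Nyquist only demands a sample every 1.5 hours; every half hour is
# already generous.
signature_per_hour = 2

ratio = recording_per_hour / signature_per_hour
print(f"Recording the waveform: {recording_per_hour:,} samples/hour")
print(f"Tracking the signature: {signature_per_hour} samples/hour")
print(f"Ratio: {ratio:,.0f}x more data than the decision needs")  # 7,200,000x
```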

5,000 samples per second is 18 million samples per hour. We already noticed that twice an hour is overkill. So why are people tempted to collect thousands, or even millions, of times more data than they need?

Remember what Harry Nyquist was trying to do. He wanted to find out how little data he needed. He’d be surprised at the religious zeal among big data fans who think all data is valuable. He proved it’s not: there is zero value in collecting, storing, transporting, and sorting excess information.

Some of the zeal is fear of missing out (FOMO). FOMO makes us wonder if we’ll lose some critical insight. But if we make IoT too hard, we miss all opportunity for insight.

AnalyticsOS helps solve the FOMO problem with the Swan Trap™. When something interesting happens, AOS can capture the data around the event at a much higher rate. So we get the best of both worlds: we obey Nyquist, and we don’t have FOMO.
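
For readers who like to see the shape of such a thing, here is a minimal sketch of the general pattern behind event-triggered capture: keep a rolling high-rate buffer at the edge, and ship it only when a trigger fires. This is an illustration of the idea, not AnalyticsOS’s actual implementation; the buffer size, threshold, and names are hypothetical.

```python
from collections import deque

# Sketch of event-triggered capture -- the general pattern behind ideas
# like the Swan Trap. NOT AnalyticsOS's actual implementation; the
# buffer size and trigger threshold below are hypothetical.

PRE_EVENT_SAMPLES = 4_000      # e.g., one second of 4 kHz vibration data
TRIGGER_LEVEL = 3.0            # assumed anomaly threshold, in sensor units

ring = deque(maxlen=PRE_EVENT_SAMPLES)  # rolling high-rate buffer, kept at the edge

def on_sample(value, upload):
    """Buffer every high-rate sample locally; ship data only when triggered."""
    ring.append(value)
    if abs(value) > TRIGGER_LEVEL:
        # Something interesting happened: send the high-rate window
        # surrounding the event, then reset.
        upload(list(ring))
        ring.clear()

# Routine reporting still happens at the slow, Nyquist-appropriate rate
# (say, a summary every half hour). The cloud only ever sees high-rate
# data for the moments that matter.
```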

So, for IoT analytics success we suggest three things:

  1. Realize the three choices Nyquist dictates.
  2. If you have FOMO, set a Swan Trap.
  3. Do IoT analytics at the edge with AOS.

And remember, the Nyquist rate is not how fast the horse ran.

About Lone Star Analysis

Lone Star Analysis enables customers to make insightful decisions faster than their competitors.  We are a predictive guide bridging the gap between data and action.  Prescient insights support confident decisions for customers in Oil & Gas, Transportation & Logistics, Industrial Products & Services, Aerospace & Defense, and the Public Sector.

Lone Star delivers fast time to value, supporting customers’ planning and ongoing management needs.  Utilizing our TruNavigator® software platform, Lone Star brings proven modeling tools and analysis that improve customers’ top line, by winning more business, and improve the bottom line, by quickly enabling operational efficiency, cost reduction, and performance improvement. Our trusted AnalyticsOS℠ software solutions support our customers’ real-time predictive analytics needs when continuous operational performance optimization, cost minimization, safety improvement, and risk reduction are important.

Headquartered in Dallas, Texas, Lone Star is found on the web at http://www.Lone-Star.com
