Crunch Your IoT Data Before It Clogs the Network
Paul Glynn makes a good point here: for the Internet of Things to be effective and efficient, some data refinement needs to take place at the local level. If you pass every measurement, second by second, you will clog the pipes between M2M devices and the cloud. Essential data about context and environment should pass through; the rest should stay local on the internal M2M framework. For measurements that need to be near real time, you have to determine the sampling rate that makes your application effective. (Every three seconds? Every five minutes?) You will also need to threshold some M2M events for detection purposes - e.g. a heart rate sensor ticking away at 65 BPM isn't news the cloud needs, but the same sensor reading 120 BPM while the subject is at rest might be.
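The thresholding idea above can be sketched in a few lines. This is a minimal illustration, not any platform's actual API: the function names, the resting threshold of 100 BPM, and the tuple format are all assumptions chosen to mirror the heart-rate example.

```python
# Minimal sketch of local event thresholding at an IoT gateway.
# All names and thresholds are illustrative assumptions, not part
# of any specific M2M platform.

def should_forward(bpm, at_rest, resting_max=100):
    """Decide whether a heart-rate reading is worth sending upstream.

    A reading in the normal range stays on the local network; a reading
    above the threshold while the subject is at rest is an event the
    cloud should see.
    """
    return at_rest and bpm > resting_max

def filter_readings(readings, resting_max=100):
    """Keep only the (bpm, at_rest) readings that cross the threshold."""
    return [(bpm, at_rest) for bpm, at_rest in readings
            if should_forward(bpm, at_rest, resting_max)]

# A steady 65 BPM at rest is suppressed; 120 BPM at rest is forwarded;
# 140 BPM while active is expected, so it also stays local.
samples = [(65, True), (66, True), (120, True), (140, False)]
print(filter_readings(samples))  # [(120, True)]
```

In a real deployment this filter would run on the gateway or router, so only the anomalous readings ever transit the carrier network.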
In today’s IoT frenzy, a lot of companies rush to connect sensors and provide all sorts of monitoring services, and carriers will happily bill them for the data that transits through their networks.
But sending all the raw data to the cloud for processing and intelligence is inefficient and expensive, notes Paul Glynn, CEO of Irish startup Davra Networks. With the release of its RuBAN application enablement platform, the three-year-old company jumps on the "fog computing" bandwagon with a clear goal: add local value to IoT data before it reaches the cloud.
“Out of the estimated 50 billion connected devices that may be deployed by 2020, the vast majority will not have a direct connection to the cloud but will pass on their data through local gateways or routers,” explains Glynn.