In almost every industry, data is being created in places it never was before. As a result, it’s becoming increasingly difficult to reach it and secure it. And it’s even harder to draw insight from data before acting on it.
Just look at where information is now flowing. Sensors create terabytes of data in offshore oil wells, for instance. There’s also extremely time-sensitive data created by robots in manufacturing.
Globally, the amount of data associated with the internet of things (IoT) will grow 3.8-fold by 2019; at a compound annual growth rate of 30%, that comes to 507.5 zettabytes per year. If that’s not big data, I don’t know what is.
But why is this so challenging? It’s not the data itself but its hyper-distributed nature: large volumes of data spread out across a wide array of locations.
Consider the data created by a retailer’s in-store video cameras, which can reveal customer behaviour and buying preferences in real time.
Yet a store employee can only act to influence a purchase if they are empowered with that insight while the customer is still physically in the store.
On your marks, get set…
The biggest theme I am seeing by far when talking with our customers is the race to become digital. It’s something many will be familiar with by now, but in essence it’s about using digital technologies to change business models, generate new revenue, and create value for customers.
But this too comes with challenges. Cisco’s Digital Vortex study claims that four of the top 10 incumbents in each industry will be displaced by digital disruption in the next three years.
Combine this drive for digitisation with the hyper-connectivity brought by IoT (an estimated 50 billion devices connected to the internet by 2020!) and the sources from which data can be collected multiply exponentially.
What does this mean for analytics?
Being able to make sense of this increasingly complex space becomes vital. Take, for example, a car manufacturer that can use Wi-Fi analytics to pinpoint foot traffic patterns or repeat customers.
As a dealership manager, if I can understand the ratio of customers waiting for sales versus service, I can make adjustments to staffing and resources in order to create a more positive customer experience. This is digitisation in practice.
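To make that concrete, here is a minimal sketch of the staffing calculation, assuming a hypothetical Wi-Fi presence feed (the device IDs, zone names, and data below are invented for illustration, not a real analytics product’s output):

```python
from collections import Counter

# Hypothetical Wi-Fi presence feed: (device_id, zone) pairs showing
# where each detected device is currently dwelling in the dealership.
presence = [
    ("d1", "sales"), ("d2", "service"), ("d3", "sales"),
    ("d4", "sales"), ("d5", "service"), ("d6", "sales"),
]

# Count devices per zone and compare waiting customers across zones.
zones = Counter(zone for _, zone in presence)
ratio = zones["sales"] / zones["service"]
print(f"sales:service = {ratio:.1f}")  # 2.0 -> shift staff toward sales
```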
To take an example from the world of IoT, sensors can highlight failing machines or dangerous conditions before they become serious.
Of the data generated by the 30,000 sensors on an oil rig, currently only 1% is examined, and what is used serves anomaly detection rather than optimisation. But analysing more of this data could illuminate ways for employees to improve operations and provide greater value to the business.
Putting this all together, hyper-connectivity plus digitisation implies a growing need for edge analytics.
Take it to the edge to stay ahead
Edge analytics analyses data close to its source instead of sending it all to a central place for processing.
Traditional approaches require all data to be moved to a central repository for analysis. For businesses looking to sustain a competitive advantage in this digital era, that centralised model will need to be supplemented with a distributed analytics model.
A big reason for this is that much of the data coming off connected devices (sensors, for example) loses its value within minutes of being collected. Staying ahead means being able to react quickly.
It means responding strategically, supported by data-driven decisions made in the moment. The challenge is pinpointing the business value within the massive amounts of machine-generated data in an organisation’s environment.
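As a minimal sketch of the pattern (the class, window size, threshold, and callback below are my own invention, not any particular product’s API), an edge node might analyse readings locally and forward only the events worth sending upstream:

```python
from collections import deque
from statistics import mean

WINDOW = 60        # readings kept at the edge
THRESHOLD = 3.0    # deviation from the local baseline that counts as an event

class EdgeNode:
    """Analyses sensor readings locally; ships only anomalies upstream."""

    def __init__(self, forward):
        self.window = deque(maxlen=WINDOW)
        self.forward = forward  # callback standing in for the central repository

    def ingest(self, reading):
        self.window.append(reading)
        baseline = mean(self.window)
        # Forward the reading only if it deviates sharply from the local
        # baseline, instead of streaming every raw value to the centre.
        if abs(reading - baseline) > THRESHOLD:
            self.forward({"reading": reading, "baseline": round(baseline, 2)})

# Most readings stay at the edge; only the spike is sent on.
node = EdgeNode(forward=print)
for r in [20.1, 20.3, 19.9, 20.2, 31.7, 20.0]:
    node.ingest(r)
```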
Let me give you an example of something Cisco is doing in the IoT space.
The recent winter flooding in the UK is a stark reminder of the environmental threat the country faces.
Rapid urbanisation, combined with a warmer (and wetter) climate means 3.2 million people in the UK will be at higher risk of flooding by 2050. Last year alone, floods cost the UK £5 billion. And the weather doesn’t look like it’s going to be getting better (or drier) any time soon…
Other than spending billions on flood defences, what else can be done to reduce the risk and the damage? We’re looking to the internet of things for the answer.
Being able to instantly respond to rises in water levels in certain areas could be the difference between people getting to safety or not. We’re building an integrated platform that predicts disaster situations such as floods and helps co-ordinate first responders.
It draws in huge amounts of information from thousands of sensors measuring river levels, ground saturation, and weather conditions, then uses smart algorithms to mine that data and predict flooding events.
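For a sense of what such a prediction step could look like, here is a deliberately simplified sketch; the weights, thresholds, and parameter names are hypothetical, not the platform’s actual model:

```python
def flood_risk(river_level_m, ground_saturation, rainfall_forecast_mm,
               bankfull_m=4.0):
    """Combine sensor signals into a 0-1 flood risk score.

    river_level_m:        current gauge reading in metres
    ground_saturation:    0-1, how waterlogged the ground already is
    rainfall_forecast_mm: expected rain over the next 24 hours
    """
    level_factor = min(river_level_m / bankfull_m, 1.0)
    rain_factor = min(rainfall_forecast_mm / 50.0, 1.0)
    # Saturated ground sends rainfall straight into the river, so it
    # amplifies the forecast rather than counting as a separate signal.
    return 0.5 * level_factor + 0.5 * rain_factor * ground_saturation

# A nearly bankfull river, soaked ground, and heavy rain forecast:
print(flood_risk(3.6, 0.9, 40.0))  # ~0.81 -> alert first responders
```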
Speed and intelligence are crucial. In an emergency situation, having a platform that can get resources and people to the right place, at the right time, is critical. Duplication of resources is something you just can’t afford.
First responders are given only the information they each need. Streamlining this information increases the efficiency of operations.
Maybe in future this could be extended to flood prevention, with weather stations automatically talking to river barriers, sensors, storm tanks, and flow meters. This could result in drains opening or closing, or water flow being diverted. The possibilities are limited only by our imagination.
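A sketch of what one such rule might look like, with invented device interfaces and thresholds rather than any real control API:

```python
def prevention_rules(rainfall_forecast_mm, river_level_m, storm_tank_fill):
    """Return the actions an automated flood controller would take."""
    actions = []
    if rainfall_forecast_mm > 30 and storm_tank_fill < 0.8:
        actions.append("open storm tank inlet")   # buffer the coming rain
    if river_level_m > 3.5:
        actions.append("raise river barrier")     # protect low-lying areas
        actions.append("divert flow to relief channel")
    return actions

print(prevention_rules(rainfall_forecast_mm=45, river_level_m=3.8,
                       storm_tank_fill=0.4))
```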
While businesses today might not be fighting flood waters, they will be wary of the impact of digital disruption.
Being able to move fast, and with intelligence, becomes part and parcel of everyday business, and edge analytics is one of the key ways to achieve that.
Find out more about how the internet of things and analytics can reduce the cost of flooding to the UK in the video below:
https://www.youtube.com/watch?v=Z3vLjTzGOOk