
Vast data – new tools needed to create new realities

It’s clear to any observer of our industry that its focus is fast shifting away from connectivity and towards the very acute issue of what exactly we’re going to do with all the data that we can now collect and analyse, writes M2M Now’s Alun Lewis. As a result, attention is concentrating on the role that analytic tools such as machine learning and artificial intelligence can play in helping us better understand our products, services and communities.

When it comes to spotting patterns in data – or at least the data that we receive through our senses – humans are superb. The trouble is that we’re so good at spotting patterns that we impose them even when there’s no pattern actually there, hence our predilection for seeing fairies, UFOs and other entities. Indeed, there’s a useful word of Greek derivation – pareidolia – for this human tendency to see faces in clouds, and it’s one worth remembering throughout the current analytics debate.

The idea of getting machines to do our learning for us – or at least a lot of the heavy lifting – goes back to the days of Alan Turing. But what’s the current situation and where could this intensely important field be taking us? The algorithms that we create will inevitably influence human behaviour, a topic that Jaron Lanier has explored in two excellent recent books, ‘You Are Not a Gadget’ and ‘Who Owns the Future?’.

Ben Parker, principal technologist at Guavus

In terms of the scale of the challenge facing us, Ben Parker, principal technologist at analytics company Guavus, comments: “IDC’s Digital Universe study forecasts that the IoT will drive annual data production from 4.4 zettabytes today to 44 zettabytes by 2020. Machine learning and analytics will be critical to cope with this overload, so that all the data points can create benefits automatically. For traffic congestion in a smart city, a properly designed machine learning algorithm blends structured data such as historical rush-hour traffic patterns with unstructured data such as Tweets about an accident to automatically take measures to reduce congestion, for example retiming traffic lights and diverting traffic down uncongested alternate routes.”
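
As a rough illustration of the blending Parker describes, the sketch below combines a structured signal (how far current flow at a junction exceeds its historical norm) with an unstructured one (tweets mentioning an accident) to decide on mitigation. It is a simple rule-based stand-in for the machine learning Guavus would actually use, and every name, threshold and data shape here is an invented assumption.

```python
# A rule-based stand-in for the ML blending described in the quote: structured
# historical rush-hour data plus an unstructured signal (accident tweets)
# feeding a congestion decision. All names, thresholds and data shapes are
# illustrative assumptions.
from dataclasses import dataclass


@dataclass
class JunctionState:
    junction_id: str
    historical_flow: float  # vehicles/hour typical for this time slot
    current_flow: float     # vehicles/hour observed right now


ACCIDENT_KEYWORDS = ("accident", "crash", "collision", "pile-up")


def accident_reported(tweets: list[str]) -> bool:
    """Crude unstructured-data signal: does any tweet mention an accident?"""
    return any(k in t.lower() for t in tweets for k in ACCIDENT_KEYWORDS)


def plan_response(state: JunctionState, tweets: list[str]) -> list[str]:
    """Blend the structured and unstructured inputs into mitigation actions."""
    actions = []
    overload = state.current_flow / max(state.historical_flow, 1.0)
    if overload > 1.3:  # 30% above the historical norm for this time of day
        actions.append(f"retime lights at {state.junction_id} for a longer green phase")
    if accident_reported(tweets):
        actions.append(f"divert traffic onto alternate routes around {state.junction_id}")
    return actions


print(plan_response(
    JunctionState("J42", historical_flow=1200, current_flow=1700),
    ["Nasty crash on the ring road near J42"],
))
```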

Analysis in real-time
In aerospace, real-time analytics already plays a huge role as Rupak Ghosh, vice president of engineering services company Cyient, explains: “The IoT is revolutionising the aerospace industry. Airlines can translate vast volumes of data into meaningful business information that can then be applied to determine the status and performance of aircraft systems and subsystems. Sensors are now distributed throughout the aircraft to monitor key performance parameters, such as fuel burn in the engine. On landing, this information can be analysed, with action taken to correct any minor faults and get the aircraft back in service as soon as possible. Five years ago, this post-flight analysis would have taken many days to complete, whereas now there are solutions available that provide useful data within minutes of landing. There’s more work to be done before real-time health monitoring from the skies becomes a reality – current bandwidth for in-flight data transfer is around 400 Kbps and the next planned upgrade is up to 10 Mbps.”
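
The post-flight check Ghosh describes can be pictured as something like the following: compare a flight’s recorded fuel-burn samples against a baseline and raise a maintenance flag if they drift too far. The baseline, tolerance and sample values are assumptions made for this sketch, not Cyient’s actual analytics.

```python
# A simplified post-flight check: flag the aircraft for inspection if mean
# fuel burn drifts beyond a tolerance around the expected baseline. Baseline,
# tolerance and sample values are invented for this sketch.
from statistics import mean


def flag_fuel_burn(samples_kg_per_hr: list[float],
                   baseline_kg_per_hr: float,
                   tolerance: float = 0.05) -> str:
    """Return a maintenance note if mean fuel burn drifts past the tolerance."""
    observed = mean(samples_kg_per_hr)
    drift = (observed - baseline_kg_per_hr) / baseline_kg_per_hr
    if abs(drift) > tolerance:
        return f"INSPECT: fuel burn {drift:+.1%} vs baseline ({observed:.0f} kg/h)"
    return "OK: fuel burn within tolerance"


# e.g. engine samples downloaded minutes after landing
print(flag_fuel_burn([2510, 2545, 2602, 2580], baseline_kg_per_hr=2400))
```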

For real-time analytics company VoltDB’s CMO, Peter Vescuso, addressing this mix of dynamic and historic data will be critical: “Organisations are interacting with Big Data – data that has volume and variety – but very few are successfully interacting with Fast Data – data that is both Big and has velocity. We are quickly reaching the point at which a new corporate data architecture is necessary to support both Big and Fast Data. Frameworks such as FIWARE and Hypercat are being discussed as approaches to a solution. In the immediate term, other systems, based on very fast, transactional databases, are being used to manage flows of sensor data from electric and power grids, enabling utilities to both manage fast flows of sensor data and use it in real time to make policy, billing and utilisation decisions. The challenge is being able to analyse this data as it’s coming in, create context for it based on current and historical information, and make instant, intelligent decisions that directly translate into business and civic value.”
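
A minimal sketch of the ingest-enrich-decide loop Vescuso outlines is shown below: each incoming grid reading is contextualised against that meter’s own recent history and turned into an immediate decision. A production system would keep this state in a fast transactional database rather than a Python dictionary, and the meter name, threshold and readings are invented for illustration.

```python
# Ingest-enrich-decide on a stream of grid sensor readings: each reading is
# compared against that meter's own rolling history and acted on immediately.
# In production this state would live in a fast transactional database; the
# meter name, threshold and readings below are invented.
from collections import defaultdict, deque

history: dict[str, deque] = defaultdict(lambda: deque(maxlen=100))


def on_reading(meter_id: str, load_kw: float) -> str:
    """Ingest one reading, contextualise it, and decide in the same call."""
    past = history[meter_id]
    avg = sum(past) / len(past) if past else load_kw
    past.append(load_kw)
    if load_kw > 1.5 * avg:  # surge relative to this meter's own history
        return f"{meter_id}: shed non-critical load ({load_kw} kW vs avg {avg:.1f} kW)"
    return f"{meter_id}: normal ({load_kw} kW)"


for reading in (42.0, 44.0, 43.5, 71.0):  # a short simulated feed
    print(on_reading("feeder-7", reading))
```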

Puneet Pandit, founder and CEO of Glassbeam

In terms of the growing power of tools, for Puneet Pandit, founder and CEO of specialist IoT analytics company Glassbeam – which recently announced a partnership with ThingWorx – techniques like AI and neural networks are appearing as compute resources become available more cost-effectively via the cloud: “The cutting-edge focus of analytics is around machine learning and AI – and there are many ways that systems can now draw on unstructured data to solve problems, such as searching through existing knowledge bases. In addition, there are other techniques such as supervised machine learning and genetic algorithms.”
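
As a compact example of the supervised machine learning Pandit mentions, applied to unstructured text, the snippet below trains a classifier on a handful of labelled knowledge-base entries and uses it to route a new log message. The snippets, labels and model choice are assumptions for illustration only.

```python
# Supervised learning over unstructured text: a tiny classifier trained on
# labelled knowledge-base snippets routes a new log line to a category.
# The snippets, labels and model choice are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

kb_snippets = [
    "disk latency spiked during nightly backup",
    "raid controller reported media errors",
    "dhcp lease pool exhausted on vlan 20",
    "packet loss between core switches",
]
kb_labels = ["storage", "storage", "network", "network"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(kb_snippets, kb_labels)

print(model.predict(["intermittent packet loss reported by field sensors"]))  # likely -> ['network']
```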

Brian Gilmore, solution expert, Internet of Things and Industrial Data, at Splunk

One significant issue in many implementations of this vision, however, is the tendency of organisations – and the humans that they consist of – to create disconnected departments and silos, as Brian Gilmore, solution expert, Internet of Things and Industrial Data, at Splunk, suggests: “One word – data silos. There’s a painful lack of interoperability in M2M. Machines talk like machines – and even worse, only to machines that are just like them. There are cultural and language barriers in the space, and this is both stifling innovation and blocking advanced concepts in machine data analytics. So how do we fix this? Some cultural change is required first – the domain owners of M2M, especially on the industrial side, need to remove some of the walls to their environments, either letting the data out, or letting the tools in.”

Gilmore concludes: “But why stop there? What value can social graphs bring to M2M? Augmented reality? And what happens when we move advanced analytics of M2M data to new platforms? We’ve already seen very early results in the value that elastic cloud technologies can add, so how about quantum computing?”

Similar perspectives, but focused on the utilities sector, come from Mike Ballard, senior director, utilities at Oracle EMEA: “There’s now the ability to mix utility and non-utility data – especially with the introduction of smart meters. This enables us to drill down in much more detail into the greatly diversified patterns of consumption that we see across different households. Analytics can already present thirty to forty typical consumer profiles, and this number is going to grow as we gather more data and energy markets themselves become more complex. This in turn is going to drive much more accurately targeted marketing – spotting home workers, for example, and offering them particular deals.”
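
A toy version of the profiling Ballard describes might look like the following: cluster households by their daily smart-meter load shape to recover a few typical consumer profiles. The synthetic load curves, the use of k-means and the choice of three clusters are assumptions for the sketch; a real segmentation would, as he notes, distinguish thirty to forty profiles.

```python
# Cluster synthetic half-hourly daily load curves (48 readings per household)
# into a handful of typical consumer profiles. Data, cluster count and the
# use of k-means are assumptions for this sketch.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Three invented behaviours: evening-peak commuters, daytime home workers,
# and flat low-consumption households.
evening = rng.normal(0.3, 0.05, (40, 48))
evening[:, 34:44] += 1.5
daytime = rng.normal(0.3, 0.05, (40, 48))
daytime[:, 16:36] += 1.0
flat = rng.normal(0.2, 0.05, (40, 48))
households = np.vstack([evening, daytime, flat])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(households)
for k in range(3):
    profile = households[labels == k].mean(axis=0)
    print(f"profile {k}: {np.sum(labels == k)} households, peak at half-hour slot {profile.argmax()}")
```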

And on the topic of quantum computing – seemingly IT’s own version of our perpetual wait for fusion power – Robert Bates, head of Information Architecture at Wipro Analytics, comments: “Quantum computing enables a leap forward in the ability to extract predictive and prescriptive solutions to many data problems. Wipro is currently working with a number of customers on developing quantum-like simulators using large-scale platforms from public cloud providers as well as internal assets, for those that can afford the effort. These simulators run multiple simulations in parallel and then use regression and correlation statistics to determine progressive direction and rates of change. We’ve been partnering with many leading Big Data analytics platforms and have also incorporated many open source technologies into our portfolio strategy.”

He adds: “An immediate practical commercial application of current quantum computing simulators would be to identify trends in behavioural analytics. Individual, as well as segmented, behavioural data could be worth billions, and Google has publicly stated its efforts to build a quantum computing platform. Additionally, Wipro sees tremendous potential in the explosion of IoT information in conjunction with advances in nanotechnology. This will be key for the manufacturing, healthcare and life sciences industries.”
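
The “run many simulations, then apply regression and correlation statistics” pattern Bates outlines can be pictured roughly as below. Each run would in practice be fanned out in parallel to cloud workers; here they execute in a simple loop, and the stochastic model and all numbers are invented for illustration.

```python
# Run repeated stochastic simulations of a metric, then use regression and
# correlation statistics to estimate its direction and rate of change. The
# drift/noise model and all numbers are invented; real runs would be fanned
# out in parallel across cloud workers rather than a local loop.
import numpy as np

rng = np.random.default_rng(1)


def simulate(steps: int = 60) -> np.ndarray:
    """One stochastic trajectory: small upward drift plus noise."""
    return np.cumsum(rng.normal(0.2, 1.0, steps))


runs = np.array([simulate() for _ in range(500)])  # 500 notionally parallel runs
mean_path = runs.mean(axis=0)
t = np.arange(mean_path.size)

slope, intercept = np.polyfit(t, mean_path, 1)  # rate of change per step
corr = np.corrcoef(t, mean_path)[0, 1]          # strength of the trend

print(f"estimated drift: {slope:+.3f} per step (correlation with time: {corr:.2f})")
```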

Touching on future potential and the need to work together to break down the silos, Tom Gilley, founder and CTO of wot.io, an innovative data service exchange company that provides companies with access to different analytic tools, observes: “When you look at successful implementations of AI – ones so successful that people don’t really recognise the advances that have been made – such as voice recognition, these have usually involved the cumulative sharing of models from different disciplines to refine the intelligence. With voice recognition, for example, this involved the application of acoustic models, language models and dictionary models. This is a critical issue in areas that involve multiple parties – such as in Smart Cities.”

 
