
The future of Big Data – drawing insight from the deluge

By Mark Steel, CEO at IP EXPO Europe 2014

 

Big Data has truly taken hold of the tech world over the last few years, and it shows no signs of slowing down. Gartner research suggested that by 2015, 4.4 million IT jobs would be created globally to support Big Data, with 1.9 million of those in the US alone, while SNS Research forecasts that Big Data investments will account for $76 billion worldwide by 2020. What’s more, for every Big Data-related role created in the US, a further three jobs will be created for people outside of IT – for instance, artists and designers to build data visualisations – meaning that within the next year, a total of 6 million jobs in the US will be created by the Big Data boom.

 

However, as most organisations have discovered, this volume of unstructured information is multiplying more rapidly than anyone could have predicted, and while the potential for using the resulting analytics to raise productivity and improve decision-making is huge, the process of handling the data itself can be complex. As global research firm Forrester sums it up, “The amount of data available is growing faster than our ability to deal with it, and more is coming.” While the technology designed to store, aggregate and analyse the data has evolved accordingly, releasing the true value of Big Data – and drawing meaningful insights from it – remains a challenge for businesses.

With Data Centre EXPO approaching, and the landscape so complex, I thought I would bring together some of the companies exhibiting at the show to discuss the future of Big Data.

According to Justin Milligan, Managing Director of Xenith Document Systems, “It is becoming increasingly clear to organisations that IT decisions should be taken based on insights gleaned through data analysis – which, until recently, was easier said than done – but thankfully, as the need to analyse Big Data increases, so does the quality of the tools and services available in order to accomplish this. For example, companies have saved hundreds of thousands of pounds each year by making changes to their printing policies, output devices and software solutions through good analysis of data.”

He continues, “The latest software allows you to have a dashboard view of all your devices at any given time in order to understand costs and prepare detailed reports. It also provides a tool to run ‘what if?’ modelling to provide a visual representation of the direct benefits that could be achieved by making changes to processes and devices.”
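To make the idea concrete, here is a minimal, hypothetical sketch of the kind of ‘what if?’ cost modelling Milligan describes – this is not Xenith’s actual software, and every device name, volume and cost in it is an invented assumption for illustration only.

```python
# A hypothetical 'what if?' model: compare the annual cost of the current
# printing estate against a proposed one. All figures are invented examples.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    pages_per_month: int   # average output volume
    cost_per_page: float   # consumables and service, in pounds

def annual_cost(fleet: list[Device]) -> float:
    """Total yearly printing cost for a fleet of output devices."""
    return sum(d.pages_per_month * d.cost_per_page * 12 for d in fleet)

# Current estate: many small desktop printers with a high cost per page.
current = [Device(f"desktop-{i}", 2_000, 0.045) for i in range(40)]

# Proposed estate: fewer shared multifunction devices with a lower cost per page.
proposed = [Device(f"mfd-{i}", 10_000, 0.012) for i in range(8)]

saving = annual_cost(current) - annual_cost(proposed)
print(f"Current estate:  £{annual_cost(current):,.0f} per year")
print(f"Proposed estate: £{annual_cost(proposed):,.0f} per year")
print(f"Modelled saving: £{saving:,.0f} per year")
```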

While a more comprehensive view of devices and incoming data may now be a realistic possibility for IT professionals, the emerging ‘Internet of Things’ has added further to the data deluge as the number of consumer and industrial devices generating a steady stream of data grows. The Internet of Things is the ever-expanding ecosystem of connected objects that consumers use throughout their everyday lives – from the most basic smartphones to activity trackers and smart household devices.

Although, in theory, access to this comprehensive and detailed portrait of consumers’ lives would be any marketer’s dream, the deluge of new information throws up its own set of problems because of the technical and administrative challenges of handling such granular data. The problem for IT professionals is no longer a lack of data, but the risk of missing vital information while focusing on less relevant data – in effect, the methods by which organisations prioritise and analyse their data have become just as important as the data itself.

Dirk Paessler, CEO at Paessler, says: “Our customers’ networks are becoming increasingly complex and need to be even more efficient. This is creating demand for more sophisticated network monitoring tools, capable of reporting on the performance and response time of these ‘super’ networks. Therefore, as vendors, it is vital that we continue adapting our product to our customers’ needs as they evolve.”

“Besides the necessity to continue monitoring all devices, virtual machines and cloud-based applications, the ‘Internet of Things’ also launches a new era in network monitoring (we call it the ‘Monitoring of Things’),” continues Paessler. “With every new thing connected to the network, the amount of data that can and should be monitored is constantly growing.”
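To give a sense of scale, here is a rough, hypothetical calculation of how quickly monitoring data grows as more ‘things’ join the network – the device counts, sensor counts and polling interval below are assumptions for illustration, not Paessler figures.

```python
# Back-of-the-envelope sketch: each new connected device adds its own steady
# stream of measurements to be collected, stored and analysed.

def datapoints_per_day(devices: int, sensors_per_device: int, poll_interval_s: int) -> int:
    """Measurements collected per day when every sensor is polled at a fixed interval."""
    polls_per_day = 86_400 // poll_interval_s
    return devices * sensors_per_device * polls_per_day

for devices in (100, 1_000, 10_000):
    n = datapoints_per_day(devices, sensors_per_device=5, poll_interval_s=60)
    print(f"{devices:>6} devices -> {n:,} data points per day")
```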

More data generated by consumers and objects means a greater and more diverse spread over multiple devices and locations – the practical aspects of storage and capacity have caused widespread debate throughout the industry, with Forbes.com arguing that “the capacity of hard drives isn’t increasing fast enough to keep up with the explosion of digital data worldwide”. The issue of storage is one that no business can ignore – after all, that data still needs to live somewhere.

Over the years, many organisations have built up increasingly complex and dispersed storage systems, with greater maintenance and higher costs as a necessary consequence. Amr Awadallah, CTO of Cloudera, says: “The pro of the Big Data approach is that it scales way, way better; the con is that to add more storage you have to add more servers, and more CPUs, whether you need them or not.”

However, Neil Stobart, EMEA Sales Engineering Director at Nexenta, sees things differently: “Big Data is about analytics – not about storing lots of data. This is where the ‘hype’ and reality become confused – it is typically not a storage project, but a data mining and analysis project; these projects require large CPU cycles as well as storage, so, typically, we look at storage architectures that use storage within the server and then combine a number of servers together to provide a scaled out solution.”
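As a simple illustration of that scale-out idea – data stored on the same servers that process it, with capacity added one node at a time – here is a hypothetical sketch; the hash-based placement below is for illustration only and is not Nexenta’s (or any vendor’s) actual architecture.

```python
import hashlib

class ScaleOutCluster:
    """Toy scale-out store: each server holds a shard of the data it also processes."""

    def __init__(self, nodes):
        self.nodes = list(nodes)
        self.data = {n: {} for n in self.nodes}

    def _node_for(self, key: str) -> str:
        # Place each key on a node derived from its hash.
        digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
        return self.nodes[digest % len(self.nodes)]

    def put(self, key: str, value: bytes) -> None:
        self.data[self._node_for(key)][key] = value

    def add_node(self, name: str) -> None:
        # Scaling out means adding a whole server: storage *and* CPU arrive together.
        # A real system would also rebalance existing keys (omitted here for brevity).
        self.nodes.append(name)
        self.data[name] = {}

cluster = ScaleOutCluster(["server-1", "server-2", "server-3"])
for i in range(1_000):
    cluster.put(f"record-{i}", b"...")
print({node: len(shard) for node, shard in cluster.data.items()})
cluster.add_node("server-4")   # grow capacity by adding another server
```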

The concept of automation, particularly for data analysis in personalisation and behavioural targeting, has been lauded by many industry experts as the solution to having too much data of too many different types, but GigaOM journalist Derrick Harris sums up the resulting tension: “Smart data scientists know they can’t trust the machines alone, which is why companies doing everything from predicting the content you’ll like to predicting your credit risk have figured out how to make machines work for humans instead of replacing them”. Although offloading the work to smart software may cut down the man-hours needed to sift through the piles of data, creative human insight is still needed to ask the right questions, identify patterns that may not be immediately obvious, and draw conclusions that actually reflect real-life user behaviours, habits and tendencies.

Help is at hand for businesses struggling to bridge this gap, according to Ross Dyer, Technical Director UK at Trend Micro: “Organisations have to recognise they are responsible for the data they hold and be prepared to work out where it is and how they manage this. Luckily, there is technology designed to automate the process of data analysis, and third party data specialists that can work with companies if they don’t have the expertise in-house.”

While Gartner’s research predicted a wave of new jobs created to support Big Data, the skills needed to extract meaningful conclusions from data remain relatively scarce. The role of the ‘data scientist’ – someone within an organisation dedicated to sifting through large amounts of data and applying critical thinking to work out how best to act on the findings – has been described by Harvard Business Review as ‘the sexiest job of the 21st century’, but demand for suitable candidates currently far outstrips supply. The skills shortage isn’t just a consequence of an empty talent pool – according to a recent study by EMC, only 29% of agencies have hired trained professionals to manage and analyse Big Data, or educated senior management on related issues, pointing to the need for a major shift in attitude at board level in preparing existing staff to deal with Big Data.

The future of Big Data, then, looks to be paved with challenges for most organisations – but the end result promises to be worth it. Data analytics is becoming an increasingly important part of major businesses’ strategies, and those businesses are investing in new vendor technologies to draw the most valuable insights out of the information they have access to. While human resources, storage and the sheer magnitude of data to be dealt with may still throw up difficulties for the industry, Big Data is no longer just a buzzword. Neil Stobart at Nexenta concludes: “Big Data is not hype. Companies are waking up to the fact that the data they hold on their customers, products and markets is invaluable. It is unique and in fact makes up the DNA of their organisation’s intellectual property.”

Today’s leading companies need to be information-driven to succeed. For true insight to become a reality, organisations must leverage an unprecedented volume and variety of data. When practised effectively, the outcome is strong competitive advantage and industry differentiation; the side effects, however, can be far-reaching and costly. That is why Data Centre EXPO, co-located with IP EXPO Europe and Cyber Security EXPO, has organised a wide range of activities focusing on Big Data.

The Big Data Evolution Summit, in partnership with Cloudera, offers a range of sessions and a partner zone, including the seminar ‘Our Big Data Journey – Finding the Gold and Delivering the Value’ presented by TUI Travel and ‘Accelerating Enterprise Big Data Success’ by Intel. You can register for Data Centre EXPO at www.datacentreexpo.com (ExCeL, 8–9 October).



