What is big data?

Large Hadron Collider generates 25 petabytes of data per year

Chancellor George Osborne has announced government funding for a new Alan Turing Institute with the aim of putting UK universities at the centre of the so-called “big data” phenomenon.

So what is big data and why does the government believe it is so important to the UK?

Digital systems run on digital data: the streams of ‘1s’ and ‘0s’ which flash around the electronic circuits in your computer, phone or car.
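
As a trivial illustration of that idea, here is a one-off Python sketch (the message text is just an example) showing a short string as the bits it is stored as:

```python
# Every piece of digital data is ultimately a string of 1s and 0s.
message = "big data"
bits = " ".join(f"{byte:08b}" for byte in message.encode("utf-8"))
print(bits)  # 01100010 01101001 01100111 ...
```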

Electronic devices not only use data to operate, they also create data. Some devices are massive data producers, such as test systems, car analytics and even the Large Hadron Collider.

Consider the amount of digital data produced by a car diagnostics system: the data behind the readouts displayed in tables and graphs on a computer. Now imagine how much diagnostic data is being produced by the systems monitoring a telephone network or a power station.

Data is not just created by physical processes in a car or power station. It is also produced by the logical processes of an IT network.

These are the large amounts of data derived from corporate enterprise applications, such as ERM, CRM and HR systems, and IT data such as events, logs and inventories.

Then there is all the data being created on social media. This includes ‘selfie’ pictures: the photographs and videos posted on websites add significant amounts of data to the “big data” pool.

We are now getting an idea of why the computer industry has added the prefix “big” to the data being created by electronic systems.

There is a mind-bogglingly large amount of data being created. One number I have seen quoted is 2.5 exabytes (2.5×10¹⁸ bytes) of data created each day.
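
To put that figure in context, here is a quick back-of-the-envelope sketch; the world-population figure is my own round assumption, not from the quoted number:

```python
# Rough scale of 2.5 exabytes of data created per day (illustrative only).
EXABYTE = 10**18  # bytes

daily_data = 2.5 * EXABYTE      # quoted figure: bytes created per day
population = 7 * 10**9          # assumption: roughly 7 billion people

per_person = daily_data / population
print(f"~{per_person / 10**6:.0f} MB created per person, per day")  # ~357 MB
```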

The next question is what we do with all this data. It is only useful if it can be stored and analysed in a meaningful and practical way.

Networks of servers, referred to as “the cloud”, are being created to store and analyse all these exabytes of data.

Storing the large amounts of data the world is creating will require new high capacity storage techniques.

For example, scientists at the University of Southampton have experimentally demonstrated a digital data recording process which they say will allow data storage up to 360Tbyte on a single optical disc with “practically unlimited lifetime”.
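
Setting that demonstrated capacity against the daily figure quoted above gives a sense of the scale of the storage problem (again, a rough illustrative calculation):

```python
# How many 360 Tbyte optical discs would one day's data fill?
TERABYTE = 10**12  # bytes
EXABYTE = 10**18   # bytes

disc_capacity = 360 * TERABYTE   # Southampton's demonstrated capacity
daily_data = 2.5 * EXABYTE       # quoted daily data-creation figure

print(f"~{daily_data / disc_capacity:,.0f} discs per day")  # ~6,944 discs
```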

The biggest challenge is processing and analysing the data, and this will require massively parallel processing systems and efficient software algorithms which will run on them.
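
To give a flavour of the ‘divide and conquer’ approach such parallel systems take, here is a minimal map-reduce-style sketch in Python that splits a counting job across processor cores. It is a toy illustration of the principle, not any institute’s actual tooling:

```python
from collections import Counter
from multiprocessing import Pool

def count_words(chunk):
    """Map step: count word occurrences in one chunk of text."""
    return Counter(chunk.split())

if __name__ == "__main__":
    # Toy dataset: in a real big-data system each chunk would be a
    # file block distributed across many machines, not a short string.
    chunks = [
        "big data needs big storage",
        "big data needs parallel processing",
        "parallel algorithms process data",
    ]

    with Pool() as pool:
        partial_counts = pool.map(count_words, chunks)  # map in parallel

    # Reduce step: merge the per-chunk counts into one result.
    total = sum(partial_counts, Counter())
    print(total.most_common(3))  # e.g. [('big', 3), ('data', 3), ('needs', 2)]
```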

This is where the research work at the Turing Institute will be focused. Alan Turing was the father of computing not from a hardware standpoint but from a mathematical one: he devised the algorithms and rules which have become the basis of today’s computing processes.

Software engineers are adapting these processes for the world of big data, a world that did not exist when Turing was doing his ground-breaking research during World War II.

Today big data is big business and the government probably has an eye on the prize with its plans for the ‘big data’ research institute.

Big data is also important because it is an element in other business- and lifestyle-changing technologies, such as social media, mobile communications, personal healthcare and the internet-of-things.
