Big Data Analytics
Big data analytics is a growing area that can save your agency time and money, while also helping provide more targeted services to your citizens.
In issue 10 of our Digital Transformation in Government (DTIG) series we looked at the five tech enablers for digital transformation, as identified by worldwide information and communications technology (ICT) company Huawei, in its Global Connectivity Index. One of these was big data and we wanted to explore it in more depth.
What is big data and big data analytics?
Wikipedia (the fount of knowledge!) defines big data as “data sets that are so large or complex that traditional data processing applications are inadequate.” What qualifies as big data has changed as the volume of data has skyrocketed, driven by the many devices that collect it (e.g. smartphones, cameras, wireless sensor networks and software logs). While big data might have been measured in terabytes a few years ago, now it’s petabytes and exabytes. And soon… well, you get the picture. The word ‘big’ in big data is a shifting definition.
Why big data analytics?
A key IT focus area, now and into the future, is analysing this ‘big data’ and using the insights to improve existing services and to develop new products and services.
Global analytics company SAS identifies three major benefits of big data analytics — cost reduction; faster, better decision-making; and new products and services to meet customer needs. In its 2013 whitepaper, Big Data in Big Companies, SAS said: “Like many new information technologies, big data can bring about dramatic cost reductions, substantial improvements in the time required to perform a computing task, or new product and service offerings. Like traditional analytics, it can also support internal business decisions.” Let’s look at some examples.
The SAS whitepaper included a great case study of UPS, which showed big data’s cost reductions in action. The whitepaper gives some basic numbers for UPS — 16.3 million packages per day for 8.8 million customers, and an average of 39.5 million tracking requests from customers per day. UPS set up a program called ORION (On-Road Integrated Optimization and Navigation), which fitted telematics sensors to over 46,000 vehicles to track speed, direction, braking and more. This data, in conjunction with online map data, was used to redesign UPS drivers’ routes, and by 2011 the project had saved more than 38 million litres of fuel by cutting 137 million kilometres from daily routes. “UPS estimates that saving only one daily mile [1.6 kilometres] driven per driver saves the company $30 million, so the overall dollar savings are substantial.”
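The “one daily mile saves $30 million” figure makes for easy back-of-envelope maths. Here’s a minimal sketch of that arithmetic — the 2.5-mile input is a hypothetical illustration, not a UPS statistic, and the whitepaper quote doesn’t specify a period, so treating the savings as recurring is our assumption.

```python
# Back-of-envelope arithmetic on the UPS figure quoted above:
# shaving one daily mile per driver saves the company $30 million.
# NOTE: the 2.5-mile example input below is hypothetical, for illustration only.

SAVINGS_PER_DAILY_MILE_USD = 30_000_000  # from the SAS whitepaper quote
KM_PER_MILE = 1.609344

def estimated_savings(daily_miles_saved_per_driver: float) -> float:
    """Estimate company-wide savings from trimming each driver's daily route."""
    return daily_miles_saved_per_driver * SAVINGS_PER_DAILY_MILE_USD

# e.g. trimming 2.5 miles (~4 km) from each driver's daily route:
print(f"${estimated_savings(2.5):,.0f}")  # $75,000,000
```

Even small per-driver route improvements scale to very large dollar figures — which is why route optimisation was worth the investment in sensors and analytics.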
The SAS whitepaper also points out that the value in big data comes not from its collection, but from its interpretation. “It’s important to remember that the primary value from big data comes not from the data in its raw form, but from the processing and analysis of it and the insights, products, and services that emerge from analysis.”
New products and services
Many companies use big data to refine existing products and services or to create new ones. Google is a good example, with its use of big data to refine its search and ad-placement algorithms. Google also says its self-driving car is an example of a big data application.
Big data analytics in the Australian public service
You probably know firsthand about the importance of big data analytics in government, and perhaps you’ve read the 2015 publication Australian Public Service Better Practice Guide for Big Data. The Guide states that “Big data analytics can be used to streamline service delivery, create opportunities for innovation, and identify new service and policy approaches as well as support the effective delivery of existing programs across a broad range of government operations — from the maintenance of our national infrastructure, through the enhanced delivery of health services, to reduced response times for emergency personnel.”
Importantly, the Guide outlines some of the key things you’ll need in place to run with big data analytics: storage; processing structures (e.g. grid computing and cloud computing); business processes and change management procedures; skills and personnel; governance; and a culture that supports big data analytics (one example given here is agile project methodologies, something close to Salsa Digital’s heart). As you’d imagine, the Guide also identifies privacy and security as key issues for government use of big data and big data analytics.
The Guide also includes a case study on the Patient Admissions Prediction Tool (PAPT). This software, a collaborative effort of the Australian e-Health Research Centre, Queensland Health, Griffith University and Queensland University of Technology, predicts how many patients will arrive at the emergency department, what their medical needs will be, and how many will be admitted or discharged. Staff can get an idea of their expected patient load in the coming hours and even weeks. Importantly, PAPT has led to changes in hospital emergency procedures and delivers cost savings. “It is estimated that PAPT software has the potential to save $23 million a year in improved service efficiency for the health system if implemented in hospitals across Australia.”
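To give a flavour of the kind of prediction a tool like PAPT performs, here’s a minimal sketch. PAPT’s actual model is far more sophisticated; this simply forecasts arrivals for a time slot as the average of past observations for the same weekday and hour, using made-up sample data.

```python
# A minimal sketch of admissions-style prediction: forecast arrivals for a
# (weekday, hour) slot as the mean of past observations for that slot.
# The history below is made-up sample data, not real hospital figures.
from collections import defaultdict
from statistics import mean

# (weekday, hour, observed arrivals) from past weeks — sample data only
history = [
    ("Mon", 9, 14), ("Mon", 9, 17), ("Mon", 9, 16),
    ("Sat", 22, 31), ("Sat", 22, 28), ("Sat", 22, 33),
]

# Group past observations by time slot
buckets = defaultdict(list)
for weekday, hour, arrivals in history:
    buckets[(weekday, hour)].append(arrivals)

def forecast(weekday: str, hour: int) -> float:
    """Predict arrivals for a slot as the mean of past observations for it."""
    return mean(buckets[(weekday, hour)])

print(forecast("Sat", 22))  # mean of the three Saturday-10pm observations
```

A real system would draw on years of admissions data and account for seasonality, public holidays and local events — but even this simple averaging idea shows how historical data turns into an operational forecast.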
Salsa Digital’s take
Huawei’s Global Connectivity Index estimated the “global market for big data, its analytics, and its technology” at US$200 billion by 2020. Data’s phenomenal growth will continue and, as it does, big data analytics will play an increasingly important role in government and industry. Government will need to continue to embrace and optimise its efforts in big data collection and analytics to keep up with the growth of big data and to create the most value possible.
The Better Practice Guide covers how government can get the most out of big data and big data tools, so it’s a good next read from here!