What is data analysis?
Data analysis is the use of data to make better business decisions. It means looking at large sets of data, analysing them and discovering trends or patterns. In this way, you can predict future events or changes with a reasonable degree of confidence.
Traditionally, data analysis has involved manually gathering large quantities of information to produce a firm’s figures. But our world is changing, observe Fintalent’s data analysis consultants. With agile technology, new ways of processing information are emerging that enable firms to extract meaning from the sheer volume of data available. This allows them to look beyond the raw numbers and focus on what is actually going on in the business.
Firms can scale this sort of analysis by using more advanced technologies such as machine learning, artificial intelligence, neural networks, predictive analytics and real-time self-learning algorithms (or even combining these). These solutions may sound daunting, but they can help firms make much better decisions based on data rather than intuition or opinion.
Let’s take an example from the hospitality industry. Hotels have long relied on a combination of intuition and poor data when deciding where to build new properties. In an interview with the Harvard Business Review, TripAdvisor’s VP of Data Sciences June Lee described how a major hotel chain wanted to open a new property in America but was torn between two locations. With no data on how successful either location would be, it simply went with the one that had generated the most emails. Over the following weeks, sales at that property increased by 20 per cent.
“It was the kind of thing you could not do with just intuition,” Ms Lee said. “It was something we could have done with data. [The hotel chain] used its intuition about which locations seemed to be most popular and put up a sign… but it wasn’t predictive intelligence that told them that tens of thousands of people would communicate by email to say ‘please not in this city.’”
Data analytics can also help firms make important business decisions. In the insurance industry, small changes in a company’s data can mean significant changes to a firm’s risk profile. In one case, an insurer took over half a year to respond to a change in its customers’ data that could have been detected within days; had it responded sooner, it would have saved over $5m in losses from fraud.
Firms are coming to use data analysis in other areas too. Traditionally, governments have struggled to tackle complex issues such as crime and healthcare because of limited IT capacity and a shortage of human analysts. But recent improvements in AI technology have enabled governments to use automated data analysis and machine learning to tackle these tough issues.
You might not realise it, but many of the things you do every day are already heavily influenced by data analysts. From the apps you use, such as Facebook and Tinder, to your favourite TV shows – everything is being translated into sets of numbers right now. It’s no surprise, then, that data analytics is one of the fastest-growing sectors in the job market.
How do firms use data?
Firms turn raw numbers into something that can be used in everyday business decisions by running them through different tools such as dashboards and reports. But the quality of this data is heavily dependent on the quality of its acquisition. The way firms gather what they then call “intelligence” is not new – although the actual process has seen many changes over time.
More than 150 years ago, when Britain was first developing as an industrial power, private organisations were already beginning to collect data on their customers and prospects. This included information such as age, income and location, but it also extended to more intimate details, such as how many people lived in a household.
This early data collection was funded by the philanthropies of the day. These organisations were in the habit of writing down and sharing their collected data (a practice called “Gossman coding”), which made it easier to apply analytics processes to the figures.
In the early 1900s, gathering data by systematically tracking customer behaviour began in earnest. Companies selected a sample of people from a wide range of professions and recorded how often they bought, what they bought and how much it cost them. This enabled firms to build profiles of the customers most likely to buy again in future – and to tailor their offers accordingly.
By the 1950s, firms were more sophisticated in their attitudes towards data collection. Newer techniques such as correlation analysis provided a way to relate measurements of customers and prospects to one another, allowing firms to work with far larger quantities of information. This led to the development of new techniques such as value-added analytics (VAN), which live on in many branches of the business today.
VAN allowed businesses to build models of their users’ behaviour and establish correlations between behaviours, showing how buyers differ from non-buyers and which factors drive sales. Data collected this way became known as “actionable intelligence” – meaning it could be applied immediately to drive revenue or other business goals.
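To make that concrete, here is a minimal sketch of the buyers-versus-non-buyers idea in modern form. The dataset, column names and the use of correlation plus logistic regression are illustrative assumptions, not a reconstruction of VAN itself:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical behavioural data: one row per customer. All numbers are invented.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "visits_per_month": rng.poisson(4, n),
    "avg_basket_value": rng.gamma(2.0, 25.0, n),
    "emails_opened": rng.poisson(2, n),
})
# Synthetic "bought" label loosely driven by the behaviours above.
score = (0.4 * df["visits_per_month"]
         + 0.02 * df["avg_basket_value"]
         + 0.3 * df["emails_opened"])
df["bought"] = (score + rng.normal(0, 1, n) > score.mean()).astype(int)

# Which behaviours move together with buying?
print(df.corr(numeric_only=True)["bought"].drop("bought"))

# A simple buyers-vs-non-buyers model; coefficients hint at what drives sales.
features = df.drop(columns="bought")
model = LogisticRegression(max_iter=1000).fit(features, df["bought"])
for name, coef in zip(features.columns, model.coef_[0]):
    print(f"{name}: {coef:+.3f}")
```

The correlations show which behaviours move with buying, while the model’s coefficients hint at which factors drive sales once the others are held fixed – the same two questions the paragraph above attributes to VAN.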
But for all its apparent usefulness, VAN was something of a fad that didn’t last long. It was too complex to apply in real time, and firms found it difficult to justify the cost of collecting data on large numbers of customers. It also saw little adoption outside the US and UK, which made it harder for companies elsewhere to use the techniques. The term “data analytics” started being used by firms around 1964 but didn’t become mainstream until much later.
Before the dotcom crash at the turn of the millennium, organisations devoted most of their time to getting a single point of view on their market and customers. This led to a boom in market research and customer relationship management (CRM) firms. Companies were finally able to see all their interactions with each customer in one place (a realisation that would later be called “big data”).
But what forced firms to keep collecting ever greater quantities of data was the emergence of machine learning in the early 2000s. Firms now had enough data about each customer and transaction to employ machine learning techniques and generate new insights from it. In essence, machine learning allowed companies to use the data they had collected in the past to improve their products and services in the future.
This new ability to use data in real time became a cornerstone of big data analytics. It could be used to predict what customers needed based on their behaviour and buying history, and to recommend related products or services they might be interested in purchasing.
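As a rough illustration of the recommendation side, here is a minimal co-occurrence recommender. The products, baskets and scoring rule are all assumptions made for the sketch; production systems use far richer models:

```python
from collections import Counter, defaultdict

# Hypothetical purchase histories: one basket of product IDs per customer.
baskets = [
    {"coffee", "mug", "filter"},
    {"coffee", "filter"},
    {"mug", "tea"},
    {"coffee", "mug"},
]

# Count how often each pair of products is bought together.
co_counts = defaultdict(Counter)
for basket in baskets:
    for item in basket:
        for other in basket - {item}:
            co_counts[item][other] += 1

def recommend(history, k=2):
    """Suggest the k products most often co-purchased with a customer's history."""
    scores = Counter()
    for item in history:
        scores.update(co_counts[item])
    for item in history:  # don't recommend what they already have
        scores.pop(item, None)
    return [item for item, _ in scores.most_common(k)]

print(recommend({"coffee"}))  # e.g. ['filter', 'mug']
```

Scoring candidates by how often they were bought alongside a customer’s past purchases is the simplest form of the “customers who bought X also bought Y” logic described above.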
These techniques are mainly used by firms engaged in digital marketing, but they can also be applied to other sectors such as healthcare, transportation and finance. For example, FinTech firms have begun using machine learning techniques (similar in spirit to those behind Google’s search ranking) for tasks such as credit scoring and loan pricing.
In addition to machine learning, we are also seeing renewed interest in more traditional data analytics techniques such as correlation analysis and VAN.
This latest wave shows just how much of an impact firms can have if they choose the right tools and use them correctly.
How is data changing?
Although the basics of big data analytics have remained the same since the 1950s (namely, it is a way of analysing large quantities of data), other aspects have become increasingly complex. For many organisations, this has created confusion about what exactly they should be looking for when analysing their marketing data – often down to a limited understanding of these new technologies.
One of the most common types of big data analytics is predictive analytics. Predictive analytics automatically looks for patterns in what has happened in the past in order to predict what is likely to happen next; when applied to live data feeds, it is often paired with event stream processing.
Based on a user’s online search history and past purchases, online stores use predictive analytics to suggest other products they might like. The user’s response to those suggestions provides feedback, and new data is collected. This loop continues until the system can accurately predict what users want before they even know it themselves.
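A minimal sketch of that feedback loop follows, with invented features and scikit-learn’s SGDClassifier standing in for whatever model a real store would run. The point is the incremental update: each batch of user responses is folded back into the model:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Each row: hypothetical features of a (user, suggested product) pair,
# e.g. [times user searched the category, times user bought the brand].
model = SGDClassifier(loss="log_loss")
classes = np.array([0, 1])  # 0 = suggestion ignored, 1 = suggestion accepted

rng = np.random.default_rng(1)
for day in range(30):
    # New interactions arrive; the users' clicks and purchases are the feedback.
    X = rng.random((50, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(0, 0.2, 50) > 1.0).astype(int)
    model.partial_fit(X, y, classes=classes)  # fold the feedback back in

# Estimated probability that a new suggestion will be accepted.
print(model.predict_proba([[0.9, 0.7]])[0, 1])
```

Online learning of this kind is what lets the system keep improving as feedback accumulates, rather than being retrained from scratch.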
A company can also use predictive analytics to estimate how many people it needs to reach with advertising to hit its targets at a given price point. For example, a retailer might predict the price at which a product is expected to sell, then target advertisements and promotions in line with the predicted demand.
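A toy version of that pricing exercise, with invented historical data and a deliberately simple linear demand model, might look like this:

```python
import numpy as np

# Hypothetical historical data: price charged vs units sold at that price.
prices = np.array([8.0, 9.0, 10.0, 11.0, 12.0, 13.0])
units = np.array([540, 480, 430, 360, 300, 250])

# Fit a simple linear demand curve: units ≈ a * price + b.
a, b = np.polyfit(prices, units, 1)

# Predicted demand and revenue across candidate price points.
candidates = np.linspace(8, 13, 51)
demand = a * candidates + b
revenue = candidates * demand

best = candidates[np.argmax(revenue)]
print(f"Revenue-maximising price under this model: {best:.2f}")
```

A real retailer would model demand with far more than price alone, but the shape of the exercise is the same: predict demand at each candidate price, then plan promotions around the prediction.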
Similarly, a company surveying potential customers can use predictive analytics to predict what questions those customers are likely to ask. This allows the firm to decide where it should conduct research and how much time it needs for its surveys.
Data analysis is now a crucial part of every firm’s operations, and it will continue to grow in importance. It may seem baffling at first, but given the emerging strategic role of data in a firm’s operations and the rise of senior figures such as the Chief Data Officer, data analysis should see steady growth for the foreseeable future.