When retail stores started using bar code scanning devices at checkout counters, companies emerged to collect and sell that point-of-sale (POS) data to a retail chain’s suppliers and other manufacturers of products sold to retail outlets. Nielsen has built a worldwide business supplying POS data and analytics.
In the 21st century, retailers themselves are collecting and analyzing POS data with their own systems; some are offering to share it with or sell it to companies whose products they sell in their stores. Many manufacturers want this data and believe significant insights can be gained, such as how often and when their products are out of stock in stores and how special pricing or incentives affect sales.
But processing that data is easier said than done. Let’s say 3M wants to analyze the sale of its tape products at Home Depot stores in the U.S. Every single sale of a roll of tape would be captured by Home Depot’s POS system, recording the price, the SKU (stock keeping unit, whether it is 1-inch masking tape, 2-inch masking tape, a 3-roll pack, blue painter’s tape, electrical tape, packing tape, or one of the many other variations of tape sold by 3M), the date, the store number and address, and other pertinent information, for each of Home Depot’s 1,974 U.S. stores.
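A single POS record of this kind might look like the following sketch. The field and SKU names are invented for illustration and are not Home Depot’s actual schema:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical point-of-sale record; field names and SKU codes are
# illustrative only, not any retailer's real schema.
@dataclass
class PosRecord:
    sku: str           # stock keeping unit, e.g. a specific tape variant
    price: float       # price at which this unit sold
    sale_date: date    # date of the transaction
    store_id: int      # store number
    store_address: str

record = PosRecord(
    sku="3M-MASK-1IN",   # invented SKU for a 1-inch masking tape
    price=4.29,
    sale_date=date(2010, 11, 5),
    store_id=1042,
    store_address="123 Main St, Springfield",
)
```

Multiplied across every tape sale in nearly 2,000 stores, records like this accumulate quickly, which is what makes the volume problem described below so acute.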
This represents a flood of raw data, like drinking from a fire hose. Few companies have tools with which to analyze this enormous volume of data. Some are building their own data warehouses; others are considering packaged business intelligence applications.
In late 2010 a well-known brand-name company, Company H, was weighing its options for analyzing its products’ POS data from one of its largest customers, Really Big Retailer (RBR). The data would arrive in raw form, about 9 million lines of data per week. For several months the company investigated storing the data in a data mart and using query tools to analyze it. Some of Company H’s staff tried using Excel spreadsheets on smaller samples of the data. At the same time, Company H’s sales team investigated outside software options.
Company I, a firm that specializes in analyzing retail POS data, had reached an agreement with RBR to collect, organize and sell RBR’s store sales data to RBR’s suppliers. Company H had been in discussions with RBR on how best to gain more insight into shelf performance and shelf conditions.
So Company H turned away from building something on its own and instead negotiated an agreement to purchase Company I’s POS analytics software on a pay-as-you-go basis. Company I’s model is to sell proprietary analytics, via a web portal, of the POS data it obtains. It sells reports, analysis and alerts via a monthly subscription. Company H signed a contract with Company I for a collection of specific reports and alerts covering Company H’s sales at RBR stores.
Initial results were promising. In one of the first “use cases” by Company H, Company I’s alerts identified out-of-stock (OOS) conditions in stores that, when corrected, recovered an estimated $1.2 million in lost retail sales.
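The source does not describe how Company I’s alerts actually work, but one common heuristic for spotting a likely out-of-stock condition from POS data alone is to flag a store/SKU whose recent sales drop to zero despite a healthy historical average. The following is a minimal sketch of that idea, with invented thresholds, not Company I’s actual logic:

```python
# Illustrative OOS heuristic: zero recent sales against a meaningful
# historical baseline suggests an empty shelf rather than vanished demand.
# Thresholds are invented for the example.
def flag_possible_oos(weekly_units, recent_weeks=2, min_avg=5.0):
    """weekly_units: unit sales per week for one store/SKU, oldest first."""
    history = weekly_units[:-recent_weeks]
    recent = weekly_units[-recent_weeks:]
    if not history:
        return False  # not enough history to establish a baseline
    avg = sum(history) / len(history)
    return avg >= min_avg and all(units == 0 for units in recent)

flag_possible_oos([12, 9, 11, 10, 0, 0])  # steady seller gone quiet: flagged
flag_possible_oos([1, 0, 2, 0, 0, 0])     # low-velocity item: not flagged
```

A real system would also have to account for seasonality, promotions and store-level assortment changes, which is part of why prepackaged analytics were attractive.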
Company H has expanded the use of the application to some of its other retail customers. Many more “wins” have been scored by Company H, such as identifying and correcting distribution voids (missing products on shelves). While results have been good, Company H now wants to realize bigger benefits, create a self-sustaining process that delivers provable sales increases, and make a bigger impact on in-store replenishment practices.
Again, knowing about emerging software options yielded an attractive alternative. Just a few years ago, companies that wanted to process and analyze retail sales data had to find a place to store the data, figure out how to “normalize” it to match the company’s different definitions of sales, products and customers, and then determine what type of analytics software to use to make use of the data. This is commonly known as creating a data warehouse or a data mart, and a “BI stack,” all of which are tremendously expensive and can take many months or even years to construct. Another option would have been massive Excel spreadsheets, which in fact Company H did use, but with limited results prior to acquiring the analytics from Company I.
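The “normalization” step mentioned above amounts to translating the retailer’s item codes into the manufacturer’s own product identifiers so sales can be aggregated consistently. A toy sketch, with entirely invented codes and mappings:

```python
# Toy illustration of POS data normalization: map a retailer's item codes
# onto the manufacturer's internal product IDs. All codes are invented.
RETAILER_TO_INTERNAL = {
    "RBR-000123": "TAPE-MASK-1IN",
    "RBR-000124": "TAPE-MASK-2IN",
}

def normalize(rows):
    """rows: iterable of (retailer_item_code, units_sold) tuples.

    Returns total units sold keyed by the manufacturer's product ID.
    """
    totals = {}
    for retailer_code, units in rows:
        internal = RETAILER_TO_INTERNAL.get(retailer_code)
        if internal is None:
            continue  # unmapped codes would be queued for manual review
        totals[internal] = totals.get(internal, 0) + units
    return totals

normalize([("RBR-000123", 3), ("RBR-000123", 2), ("RBR-000124", 7)])
# → {"TAPE-MASK-1IN": 5, "TAPE-MASK-2IN": 7}
```

Maintaining such mappings across millions of rows and thousands of SKUs is a large part of the data-warehouse expense the article describes.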
Company H had nailed a clear ROI. It is hard to overestimate the resources that would have been required to duplicate the analytics that were prepackaged in Company I’s application. Such an effort would have taken a year or more just to put the data into a usable format, in a standard relational database, and to build or set up the analytics application(s). That’s not including costs for data architects, programmers, database administrators and expert users who would know how to structure the data into usable reports and alerts. The cost of one good analyst alone at the time was more than the monthly cost of using Company I’s application.
Limiting the scope of the project to a narrow set of requirements made for a quick implementation. The company decided that only three to four key indicators of retail sales conditions were essential for it to reduce OOS through corrective measures. Yet there was much more data available – inventories, pending deliveries, pricing information – that the company chose to leave out of scope in order to get a quick win on its main objective.