Better quality data equals increased profitability

In sectors such as mining and manufacturing, where prices are often fixed or at least expected to fall within certain parameters, improving profitability is not simply a matter of increasing the selling price.

February 13, 2012

By Gary Allemann, senior consultant at Master Data Management

In sectors such as mining and manufacturing, where prices are often fixed or at least expected to fall within certain parameters, improving profitability is not simply a matter of increasing the selling price. To increase the bottom line, operational efficiency needs to be improved so that the cost of production can be lowered, thereby increasing profit margins.
 


 

However, operations is one area that is typically plagued by inefficiency as a result of inadequate, poor-quality data. This relates to the axiom ‘if you can’t measure it, you can’t manage it’: without good data, you cannot measure anything accurately. Data quality and master data management solutions, on the other hand, ensure that information is accurate, can be measured effectively, and can be used to drive the efficiencies that make for leaner, more profitable enterprises.

While data quality and master data management are tools generally associated with the financial services sector, all organisations need to understand what is happening within their business in order to improve processes and increase efficiency. There are three main problem areas that quality data can help to solve when it comes to improving operational efficiency in mining and manufacturing: asset management, the supply chain, and health and safety. All three are plagued by unnecessary expense and inefficiency driven by a lack of quality data.

Within asset management, maximum efficiency requires a clear, accurate and complete view of what assets the enterprise owns and where these assets are deployed at any given time, from disposable tools to heavy machinery. While the majority of organisations have an asset register, inconsistencies in the way objects and assets are described often lead to problems, particularly in industries such as mining and manufacturing, where geographically distributed locations may capture data in different ways.

For example, if an asset is described as ‘ACME Water Pump, 1500HP’ in the asset register, but as ‘Water Pump, ACME, 1500HP’ on the deployment schedule, the two records cannot be reconciled as the same piece of equipment. This means that assets can easily go missing without anyone realising it, leading to unnecessary expense when they have to be replaced. The same inconsistencies leave enterprises with no clear idea of the lifespan of their assets or of where those assets are. Fault management also becomes problematic with inaccurate and duplicated data, since the same fault may be logged multiple times in different ways, each time showing up as a different fault.

Data quality tools address these challenges by de-duplicating and standardising the asset register, so that accurate inventory control and management can be achieved through a single version of the truth about assets, increasing operational efficiency.
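To illustrate the idea, the two pump descriptions above can be reconciled with a simple, order-insensitive comparison. This is only a minimal sketch of the kind of standardisation a data quality tool performs; the `normalise` function and its rules are hypothetical, not a real product's matching logic.

```python
import re

def normalise(description: str) -> tuple:
    """Hypothetical standardisation: lower-case the description,
    strip punctuation and sort the tokens, so that word order
    no longer matters when comparing two asset records."""
    tokens = re.split(r"[\s,]+", description.lower())
    return tuple(sorted(t for t in tokens if t))

register_entry = normalise("ACME Water Pump, 1500HP")
schedule_entry = normalise("Water Pump, ACME, 1500HP")
print(register_entry == schedule_entry)  # True: the same asset
```

Real matching engines go much further (fuzzy matching, abbreviation expansion, unit handling), but even this crude key shows why standardisation lets the register and the deployment schedule agree.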

In a similar way, data quality can help to improve supply chain management in several key areas. When it comes to spend analysis, both total spend per vendor and total spend on specific materials, equipment and assets, an accurate view of the data is vital. If the data is inconsistent, it is impossible to gauge how much money is spent with specific suppliers across different geographies, which means that supplier discounts cannot be accurately negotiated because spend cannot be properly quantified. It also becomes impossible to determine whether supplies are lasting as long as they should, whether equipment is faulty or going missing, and a whole host of other issues.

If data is cleaned and its integrity restored, it provides a single, accurate view of spend. It suddenly becomes a far simpler task for an organisation to see exactly how much money is being spent, where and on what. This in turn provides the facts needed for more informed decisions about suppliers, costing and the individual needs of various locations and branch operations, which once again improves efficiency.
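Once vendor names have been standardised, spend per vendor reduces to a simple aggregation. The sketch below assumes a cleansing step has already produced a canonical-name mapping; the vendors, amounts and the `canonical` dictionary are all invented for illustration.

```python
from collections import defaultdict

# Hypothetical purchase records, with vendor names as captured
# at different sites before standardisation.
purchases = [
    {"vendor": "ACME Pumps (Pty) Ltd", "amount": 120_000},
    {"vendor": "Acme Pumps",           "amount":  80_000},
    {"vendor": "Bolt Supplies",        "amount":  40_000},
]

# Illustrative mapping that a data quality tool might produce.
canonical = {
    "ACME Pumps (Pty) Ltd": "ACME Pumps",
    "Acme Pumps": "ACME Pumps",
}

spend = defaultdict(int)
for p in purchases:
    spend[canonical.get(p["vendor"], p["vendor"])] += p["amount"]

print(dict(spend))  # {'ACME Pumps': 200000, 'Bolt Supplies': 40000}
```

Without the mapping, the same supplier would appear twice and the true spend of 200,000 with ACME would be invisible, which is exactly what undermines discount negotiations.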

Poor supply chain data can also lead to ‘lost’ inventory. For example, stock records may show only one ‘Water Pump, ACME, 1500HP’ on hand, leading the stock controller to order more, when in fact a further 12 units are recorded as ‘ACME Water Pump, 1500HP’. New stock is then ordered unnecessarily, costing money that need not be spent. Standardised, de-duplicated data would prevent this problem from occurring.
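The stock scenario above can be sketched the same way: counting inventory against a standardised key rather than the raw description. The `canonical_key` function and the stock figures are hypothetical, chosen only to mirror the example in the text.

```python
import re
from collections import Counter

def canonical_key(description: str) -> tuple:
    """Order-insensitive token key: a hypothetical stand-in for
    the standardisation a data quality tool would apply."""
    tokens = re.split(r"[\s,]+", description.lower())
    return tuple(sorted(t for t in tokens if t))

# Illustrative stock records: the same pump captured two ways.
stock_records = (
    ["Water Pump, ACME, 1500HP"]
    + ["ACME Water Pump, 1500HP"] * 12
)

on_hand = Counter(canonical_key(r) for r in stock_records)
pump = canonical_key("Water Pump, ACME, 1500HP")

# The raw register sees only 1 unit under this spelling;
# the standardised view sees all 13, so no reorder is needed.
print(on_hand[pump])  # 13
```

The stock controller looking at raw descriptions would see a shortage; the standardised count shows the warehouse is actually well stocked.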

Health and safety is another area where quality data will not only ensure more efficient operations, but will also help organisations to mitigate risk. In terms of regulations, employees who work underground or in dangerous scenarios need to have the appropriate qualifications or health clearances. However, given the contract nature of much of the work, particularly on the mines, employee data tends to be duplicated across a variety of HR systems, being re-entered each time an employee’s contract is renewed. Different locations under the same parent company may also run different HR systems, making a centralised database difficult to manage, which in turn makes both cost and safety compliance difficult to manage. Data needs to be correct and up to date to ensure that workers are qualified for their positions and medically cleared to be there. Without data quality it is impossible to keep track of these records, which increases both risk and liability should something go wrong.

Ultimately, improving operational efficiency requires control, and without an accurate, single version of data this control is nearly impossible to achieve. With the right information, and the right data quality and master data management tools, that control becomes attainable, which in turn optimises processes, improves operational efficiency and ultimately boosts the all-important bottom line.