2.07.2012

Better data = improved profits for the mining and manufacturing sectors

By Gary Allemann, senior consultant at Master Data Management

The mining and manufacturing sectors face a unique challenge when it comes to maintaining and improving profitability in difficult economic times. Unlike in other industries, it is not possible to simply raise prices, since these are often fixed or must fall within certain parameters. This means that to improve profitability, these organisations need to look beyond the surface for solutions, addressing operational inefficiency to reduce the cost of operations and thereby running a leaner, tighter operation that can in turn generate greater profit.

Operations in mining and manufacturing are often plagued by inefficiencies, typically as a direct result of inadequate or poor-quality data. If data is inadequate or inaccurate, it stands to reason that any decision based on that data will be correspondingly flawed.

In order to make quality, fact-based decisions and improve operations, organisations need to know exactly what is happening across the business. For this reason, data quality management and master data management solutions, historically the sole domain of the financial services sector, have become vital tools in commodities and manufacturing operations as well. Such solutions ensure that data is accurate, adequate and measurable, which in turn improves both decision-making and operational efficiency.

Improvements to operational efficiency resulting from data quality initiatives can be leveraged in three main problem areas within the mining and manufacturing sectors: asset management; supply chain; and health and safety. While quality data can have a positive impact on many aspects of operations, these areas specifically tend to be plagued by unnecessary expenses and inefficient processes that cannot be managed due to inadequate data.

Asset management in particular relies on an accurate, single and comprehensive view of assets. This links back to the axiom, ‘if you can’t measure it, you can’t manage it’. If an enterprise has no idea of what assets it owns, where they are deployed at any given time, and what the expected lifecycle of these assets is, it cannot manage these assets, full stop.

Inconsistencies in the asset register can wreak havoc on operational efficiency, particularly in distributed environments. Multiple methods of data capture and multiple asset descriptions lead to an inconsistent picture of assets across an organisation. As an example, the same widget may be described as Widget B, 5cm, Silver in one instance, and as B Widget, SLVR, 2in in another. Although these entries describe the exact same widget, the asset register shows them as two separate widgets, and they cannot be reconciled as the same equipment.

The upshot of this is that the lifespan of various assets is impossible to determine, which may mean that assets are purchased when they needn’t be and decommissioned before they should be. Assets can also go missing without anyone realising they have disappeared, because they cannot be tracked back to a specific entry in the asset register. In general, inaccurate data means that maximum efficiency is impossible to achieve. The end result is unnecessary expense, poor decision-making and duplicated data that delivers no real value to the organisation.

If data quality management initiatives are put into place, duplicates can be identified and removed and the asset register can be standardised. However, this is not simply a matter of installing a piece of software and expecting a once-off project to solve all of an organisation’s problems. Data quality requires processes to be put into place, for example processes that govern the standards for capturing data for the asset register. If processes and software work together, inventory control becomes more accurate and asset management is vastly improved, which has a knock-on effect on operational efficiency.
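By way of illustration, the minimal sketch below (in Python, using a hypothetical synonym table and parsing rules, not taken from any particular product) shows how the two widget descriptions above might be standardised to a common key so that de-duplication can reconcile them as a single asset:

```python
# Illustrative sketch only: standardise inconsistent asset descriptions so that
# duplicates can be reconciled. Synonym table and parsing rules are hypothetical.
import re

SYNONYMS = {"slvr": "silver", "b widget": "widget b"}
INCH_TO_CM = 2.54
SIZE_PATTERN = re.compile(r"^([\d.]+)\s*(cm|in)$")

def standardise(description: str) -> str:
    """Return a normalised key in the form name|size in cm|colour."""
    name, size_cm, colour = "", "", ""
    for token in (t.strip().lower() for t in description.split(",")):
        token = SYNONYMS.get(token, token)  # map known synonyms first
        match = SIZE_PATTERN.match(token)
        if match:  # size field: convert inches to centimetres
            value, unit = float(match.group(1)), match.group(2)
            cm = value * INCH_TO_CM if unit == "in" else value
            size_cm = f"{round(cm)}cm"
        elif token in ("silver", "black", "red"):  # toy colour list for the example
            colour = token
        else:  # anything else is treated as the item name
            name = token
    return f"{name}|{size_cm}|{colour}"

register = ["Widget B, 5cm, Silver", "B Widget, SLVR, 2in"]
print({standardise(entry) for entry in register})
# both entries collapse to {'widget b|5cm|silver'}
```

In practice the matching and capture rules would be defined by the data standards agreed as part of the data quality process, rather than hard-coded as above.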

Often the supply chain experiences similar issues to asset management when it comes to data quality, and the resulting challenges are similar too. Spend analysis, as an example, relies heavily on the underlying data in order to produce results. If this data is inaccurate, spend analysis is essentially meaningless. Without an accurate view of spend per vendor and spend per asset, it is impossible to understand how much is being spent with a specific vendor or on specific assets and equipment. This means that discounts with suppliers cannot be accurately negotiated, since organisations often cannot fully quantify spend.

Inventory management can also be problematic, which again links to inconsistent data capture and a poor asset register. For example, stock records may show that Site A is out of stock of Widget B, 5cm, Silver, and so the manager may order more widgets. However, there may be 75 B Widgets, SLVR, 2in in stock, so spend is unnecessarily increased. It also means that trends in assets cannot be picked up, such as a new brand of widget having half the lifespan of the previous brand, so intelligent decisions regarding suppliers, assets and equipment are impossible to make.
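As a rough sketch of the same idea applied to the supply chain (again in Python, with purely hypothetical purchase and stock records), once items are reduced to a standardised key, spend per vendor can be totalled and stock levels reconciled before a re-order is triggered:

```python
# Illustrative sketch only: aggregate spend and stock over standardised item keys.
# The records, vendor name and re-order threshold below are hypothetical.
from collections import defaultdict

purchases = [
    {"site": "A", "item": "widget b|5cm|silver", "vendor": "Acme", "spend": 1200.0},
    {"site": "A", "item": "widget b|5cm|silver", "vendor": "Acme", "spend": 800.0},
]
stock = [
    {"site": "A", "item": "widget b|5cm|silver", "quantity": 75},
]

spend_per_vendor = defaultdict(float)
stock_per_item = defaultdict(int)
for record in purchases:
    spend_per_vendor[record["vendor"]] += record["spend"]
for record in stock:
    stock_per_item[(record["site"], record["item"])] += record["quantity"]

# A re-order is only triggered when the reconciled stock level is genuinely low;
# with 75 reconciled units on hand, no unnecessary order is placed.
REORDER_THRESHOLD = 10
for (site, item), quantity in stock_per_item.items():
    if quantity < REORDER_THRESHOLD:
        print(f"Re-order {item} at site {site}")

print(dict(spend_per_vendor))  # e.g. {'Acme': 2000.0}: a quantifiable spend figure
```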

Clean, accurate, adequate and properly reconciled data will, on the other hand, provide a far more accurate view of spend and existing inventory. This eliminates unnecessary purchasing and makes analysing spend a far simpler task, which informs better decisions around suppliers, costing and assets. This again improves efficiency, reduces unnecessary expense and adds directly to the bottom line.

When it comes to health and safety, more accurate and adequate data not only helps to improve operational efficiency, but also assists with mitigating organisational risk. Regulations within mining and manufacturing often dictate that employees need to have achieved certain qualifications or health clearances, so keeping this data accurate and up to date is essential. Without an accurate view of exactly who is working for the organisation, at what time, and what their clearances are, mining and manufacturing organisations in particular open themselves up to risk in terms of health and safety legislation.

However, the complexity of dealing with this problem is compounded by the contract nature of employment, a variety of HR systems across locations, and different regulations in different countries. These issues make a centralised database difficult to manage, which in turn increases both risk and liability. Data quality is an essential tool not only to manage records but to mitigate risk and ensure compliance.

An accurate, single version of the truth resulting from quality data and master data management improves control across the organisation. Control is vital for optimising processes and improving operational efficiency across the board. In turbulent economic times, improving operational efficiency improves profitability, particularly in commoditised environments – and that is the bottom line.

 
