The BI industry – a realistic look at new developments
By Len de Goede, Vice President Systems Integration, T-Systems in South Africa
- Understand business requirements
- In-memory computing – the challenges
- Free-to-fee – outsourcing BI
Like so many other elements of the IT marketplace, Business Intelligence (BI) continues to evolve, and with it the challenge of understanding what is relevant to your organisation.
Indeed, one often has to hit the pause button on technology to allow the business to drive technology, and not the other way around. The key is to understand South Africa’s unique challenges and how these should be met, mindful of local constraints such as regulatory requirements.
However, in order to make these decisions it is important to have a clearly defined strategy. In the case of BI, one version of the truth has always been the mantra both providers and companies should subscribe to. Ultimately, BI is a key enabler for organisations to understand and act on intelligence to better execute their strategy. This explains why year after year, it remains a hot topic within CIOs’ agendas.
And the good news is that BI continues to make exciting strides – the key is to understand which advancements serve your company’s own requirements. For example, in the operational reporting space, fast analytic turnaround and processing times are critically important. Contrast that with the strategic importance of executive information, which is based on longer-term trends, but where the presentation of the information is key, combined with ease of use and mobile accessibility.
One technology that is enjoying considerable hype and addresses the above is in-memory computing, which enables companies to analyse huge volumes of data at the speed of thought, detect patterns quickly and adjust their operations almost immediately. Data can therefore be analysed in seconds rather than hours, providing an edge in adapting to ever-changing demand.
Consider this
Clearly in-memory computing is taking BI to a whole new level and represents a tremendous leap in its application; however – as with most technology developments – it also brings a new set of considerations.
As mentioned, one version of the truth is a fundamental principle of classic BI; in the case of in-memory computing, that principle still stands.
Improved performance created by in-memory computing makes BI mobility an accessible and achievable reality.
Most organisations run a number of legacy systems that have been harmonised and consolidated in a data warehouse – one set of data throughout the enterprise. The objective of in-memory computing is to support and not replace this – for now at least.
There is considerable value in in-memory computing; the key is to determine whether or not it is relevant for your organisation right now and, if so, what information should be accelerated given the current cost of in-memory computing.
Pressure on Capex
Whilst in-memory computing is relatively new in the BI space in South Africa, other developments are impacting the marketplace. One of the most important is the increasing demand for Opex-based models, which in essence sees BI and Enterprise Performance Management (EPM) offered “as a service” at a fixed monthly rate.
Many companies recognise that BI is not their core focus and may therefore choose to outsource it to partners with the skills and technology infrastructure to ensure it meets business demands in a reliable and effective manner.
Clearly, the BI industry is going through some exciting changes; however, as an organisation we continue to find that it is critical to understand why companies require BI. For one, BI is and remains a discipline that helps us focus on effectively and efficiently executing our strategy; technology is purely the enabler.
BI should remain the foundation from which decisions are made and this emphasises the importance of carefully considering technology within the context of your strategy.