Should organizations with traditional business intelligence implementations stick with their existing tools and try to run cloud tools alongside them? Should they consider a more hybrid approach to running analytics? When should they look at migrating away from internal implementations and moving to the cloud instead? Birst’s Southard Jones explores the options.

As a technology, Business Intelligence (BI) has been in existence for more than 55 years in one form or another. From the Decision Support Systems of the 1970s and 80s, to the revival of BI that took place in 1989 and into the 1990s, BI has always promised to deliver greater insight that could help improve performance. Now, Gartner lists BI and Analytics as the top area for CIO investment in 2016.

Are CIOs trying to jump ahead of competitors and take advantage of the opportunities that BI can create? Or, are they still spending to get BI right in the first place? More importantly, what role will the cloud play in improving the value that BI delivers back to the business?

To see what the future might hold, it’s important to look at how the use of BI has changed over time. Traditional BI tools have been in place for years and perform a vital role in providing management information to the board or the CEO. However, many of these projects were built around large data sets required for enterprise-scale reporting, such as financial close reports or integration with a company’s Enterprise Resource Planning (ERP) system.

These big implementations were likely to have been completed years ago, requiring on-premises IT systems and an army of specialist consultants to get things done. Once they were in place, they tended to be left alone. The sheer cost of some of these projects meant that they remained the preserve of the CFO and management team, rather than being extended out across the business. At the same time, the reporting normally required specialist skills, further limiting the potential number of users within a company.

Enter the cloud

These traditional implementations provide valuable insight to those who use them. However, in recent years, the demand for data from all reaches of the business has grown exponentially. Business leaders recognize that data can provide their competitive advantage; at the same time, they won’t wait weeks, months or even years for the answers they require from traditional BI tools. This demand for speed in delivering data to people across the organization – coupled with the need for more self-service IT – is one that traditional BI solutions struggle to meet. At the same time, traditional BI tools are often now in “care and maintenance mode,” meaning that they are no longer being actively updated and no new functionality is being added.

So how should CIOs look at the future of their BI and analytics projects? There are a few options available:

  • Stick with the existing platform and work around it. This has the benefit of being a known quantity, but it lacks the user interactivity and agility necessary to meet the analytics and data demands of everyday business users.
  • Take an augmentation approach. If the business needs new reporting and visualization functionality, then alternative tools can be chosen and run in parallel with the existing legacy BI implementation. Depending on the approach, the new BI tools can take data from the existing BI platforms and join information sources together (a minimal sketch of this kind of join follows the list), or users can combine data themselves manually.
  • Consider a “rip and replace” option. There are new options for running BI in the cloud, from new and established vendors alike. Shifting existing BI platforms over to a “new” cloud service is one option, while migrating to a cloud-architected BI platform is another.
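
For illustration only, here is a minimal sketch of that augmentation pattern in Python with pandas. Every name and number in it is a hypothetical stand-in: in practice the first table would be an extract exported from the legacy BI platform and the second would come from a query against a newer cloud source.

import pandas as pd

# Hypothetical stand-in for an extract exported from the legacy BI platform.
legacy_extract = pd.DataFrame({
    "region": ["EMEA", "AMER", "APAC"],
    "budget": [1_200_000, 2_500_000, 900_000],
})

# Hypothetical stand-in for data pulled from a newer cloud source.
cloud_sales = pd.DataFrame({
    "region": ["EMEA", "AMER", "APAC"],
    "actual": [1_050_000, 2_700_000, 850_000],
})

# Join the two sources on a shared key so users get one combined view
# instead of reconciling the numbers by hand on their desktops.
combined = legacy_extract.merge(cloud_sales, on="region")
combined["variance"] = combined["actual"] - combined["budget"]
print(combined)

The point of the sketch is simply that the join happens once, in a shared tool, rather than separately in each user’s spreadsheet.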

How and when companies migrate their data and analytics implementations requires some careful thought. Traditional BI platforms play vital roles within the businesses that have implemented them. However, the need for data across the business will force more CIOs to consider the future of their BI initiatives.

Migrating from legacy BI tools

Migrating off legacy BI tools is therefore an increasingly attractive option, as modern BI platforms have all the capability of traditional ones while also delivering the agility that decentralized BI teams demand. Putting a more proactive strategy in place around BI and migration – whether this involves augmenting existing BI systems or moving to cloud or Networked BI systems – can help the centralized team reinvigorate how data is used across the whole business.

Taking the middle path and integrating both new and traditional BI platforms should offer greater flexibility and faster delivery, while also de-risking the implementation process for the CIO. More importantly, where investments in data discovery solutions have already been made, users can continue with their existing front-end tools while still getting consistency and uniformity of data. Supporting users in the analytic front-end they prefer enables the wider business to be successful with the tools of their choice.

For CIOs looking at how to shift off legacy BI implementations, there are new options coming through from the same vendors they currently use. This “lift and shift” to the cloud can be perceived as less risky than implementing a new platform. However, it can actually store up trouble for the future. The “lift and shift” mantra does exactly what it says – rather than taking on the advantages of true cloud, these platforms remain the same and simply run in someone else’s data center. This represents the worst of both worlds: loss of control for the CIO as well as losing out on the benefits of multi-tenancy and better data integration.

Linking BI and Data Discovery

A challenge that exists today is that users can easily extract data from the central BI and reporting tools and blend it locally with their own line-of-business data. The risk with this approach is that the analytics each end-user builds on his or her desktop create a silo that no one else can re-use, and this is what creates inconsistencies.

This is exacerbated by the growth of desktop discovery tools that can make use of data but do not have the same level of control in place around data sharing. Instead, it is worth looking at how a multi-tenant cloud architecture can be applied to BI to help users get access to the information they require. This model involves networking data sources together so that all the data sets across the business are accessible and known to all users, while users can also bring in their own data.
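
To make the idea concrete, here is a small hedged sketch, again in Python with pandas, of what “networking” data sets together might look like. The GovernedCatalog class and the data set names are invented for illustration; they are not a description of any particular product.

import pandas as pd

class GovernedCatalog:
    """Registry of centrally governed data sets, shared by all users."""

    def __init__(self):
        self._datasets = {}

    def register(self, name, df):
        # The central team publishes a data set once; everyone reads the same copy.
        self._datasets[name] = df

    def blend(self, name, user_df, on):
        # A user joins their own data to the governed set without detaching
        # a private extract, so the shared definitions remain the single source.
        return self._datasets[name].merge(user_df, on=on, how="left")

# The central team publishes a governed customer data set.
catalog = GovernedCatalog()
catalog.register("customers", pd.DataFrame({
    "customer_id": [1, 2, 3],
    "segment": ["Enterprise", "SMB", "Enterprise"],
}))

# A line-of-business user brings their own data and blends it in place.
local_scores = pd.DataFrame({"customer_id": [1, 3], "nps": [42, 61]})
print(catalog.blend("customers", local_scores, on="customer_id"))

Because every blend starts from the registered copy, each user’s additions stay local to them while the governed definitions stay consistent for everyone.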

Creating the flexibility and ease of use of desktop tools

The aim here is to create the flexibility and ease of use that individual desktop data discovery tools bring, without giving up central data governance and compliance where they are needed. For those companies that have invested in data discovery tools – whether at a company level or for individual teams – that investment should not be wasted either.

By bringing centrally governed data closer to the user and making it easier to link up with other internal and external data sets, the networked BI approach can provide support for self-service and greater user autonomy. This is a great opportunity for cloud, as it can support multiple user groups while still being secure and controlled when it comes to the critical data that everyone needs.

When migrating away from traditional BI models, many companies see an opportunity both to reduce their reliance on older products that can be expensive to maintain and to forge closer relationships among business units. The role of the cloud in BI is to help CIOs choose the best approach to support their long-term ambitions around data, both for central data and for individuals in line-of-business teams.