

12 April 2022

Benchmarking fundamentals in an evolving industry

Defining benchmarking  

Benchmarking involves the use of data to compare a company’s performance against industry measures and key indicators. At a fundamental level, it serves to answer three questions:

  • Who performs better?  
  • Why are they better? 
  • What actions do we need to take in order to improve our performance?  

Essentially, benchmarking means gauging performance against other organizations, business divisions or projects in a systematic, logical way, and leveraging the emerging insights to make targeted improvements. It is a valuable forensic tool that enables strategic planning of project and program execution. Below, Linesight Associate Director Diarmaid Connolly looks at some of the key considerations in benchmarking.

The rationale and some key challenges 

Benchmarking provides a level of outcome predictability and certainty, making it an invaluable tool when building a business case for capital expenditure (CAPEX) and presenting it to financial decision-makers. It demonstrates capital efficiency, and identifies opportunities for improvement and optimization in the program.

Against the backdrop of global programs and increasingly large, complex projects, the role of benchmarking in key sectors has become more important than ever. Below, we discuss some of the key considerations:

Consistency of information  

When comparing data across businesses or projects, consistency is key. At a fundamental level, data should be captured in a templated format, that is, recorded in a uniform, standardized way so that meaningful comparisons can be made. Levelling and normalization are an integral part of this: ensuring that we are comparing like with like (e.g., accounting for geographic and time factors that may have a bearing), and that the data is converted to the same currency or unit of measure. Normalization of a benchmark is not an exact science; location factors and escalation indices are not specific to your project and typically only show trends. However, it is generally accepted that this process allows the data set to be sliced, diced and drilled into in multiple ways without introducing unintentional bias. Ultimately, no two projects or organizations are alike, and levelling and cleansing of data must be conducted to deliver a meaningful comparison.
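
To make levelling and normalization concrete, the sketch below adjusts raw project costs to a common currency, a base location and a base year before computing a comparable unit rate. All exchange rates, location factors and escalation indices shown are illustrative placeholders, not real benchmark values; a real exercise would use published indices.

```python
from dataclasses import dataclass

# Illustrative placeholder factors only -- a real benchmarking exercise
# would use published exchange rates, location indices and escalation data.
FX_TO_USD = {"USD": 1.00, "EUR": 1.08, "GBP": 1.26}            # currency -> USD
LOCATION_FACTOR = {"Dublin": 1.00, "Singapore": 0.92, "Raleigh": 0.88}
ESCALATION_INDEX = {2019: 0.90, 2020: 0.93, 2021: 0.97, 2022: 1.00}

@dataclass
class Project:
    name: str
    cost: float           # total cost in local currency
    currency: str
    location: str
    year: int
    floor_area_m2: float  # gross floor area

def normalized_cost_per_m2(p: Project, base_year: int = 2022) -> float:
    """Level a project's cost to USD, the base location and the base year."""
    cost_usd = p.cost * FX_TO_USD[p.currency]
    cost_levelled = cost_usd / LOCATION_FACTOR[p.location]         # strip location premium
    cost_escalated = cost_levelled * (ESCALATION_INDEX[base_year]
                                      / ESCALATION_INDEX[p.year])  # bring to base year
    return cost_escalated / p.floor_area_m2                        # comparable unit rate

projects = [
    Project("Plant A", 120_000_000, "EUR", "Dublin", 2020, 15_000),
    Project("Plant B", 95_000_000, "USD", "Raleigh", 2021, 12_500),
]
for p in projects:
    print(f"{p.name}: {normalized_cost_per_m2(p):,.0f} USD/m2 (normalized)")
```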

The volume and quality of data 

While the key objectives are to draw out insights and review performance against industry measures and key indicators, it is crucial that this is done in a statistically meaningful way. The volume of data matters: the data set must be sufficiently large to be representative of the entire category. We must also be confident that the data is of sufficient quality to support the insights drawn from it. Data capture should be conducted by an experienced party with the necessary expertise; if a benchmark is tampered with or subject to human manipulation, its quality suffers, as it becomes an interpretation of a number rather than a genuine benchmark. Nor is benchmarking a one-off or occasional exercise; it is a continuous process of adding new information and data to enhance the data set.
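
One simple way to sanity-check whether a data set is statistically meaningful is to look at the confidence interval around the benchmark mean: a wide interval signals that more data points are needed before the benchmark can be relied upon. A minimal sketch, using illustrative unit rates and the standard library only:

```python
import math
import statistics

# Illustrative unit rates (e.g., USD per m2) from a benchmark data set.
unit_rates = [4200, 3900, 4500, 4100, 4800, 3700, 4300, 4000]

n = len(unit_rates)
mean = statistics.mean(unit_rates)
stdev = statistics.stdev(unit_rates)   # sample standard deviation
std_err = stdev / math.sqrt(n)         # standard error of the mean

# Rough 95% interval (z ~ 1.96); a small sample like this would strictly
# warrant a t-multiplier, but the principle is the same.
low, high = mean - 1.96 * std_err, mean + 1.96 * std_err
print(f"n={n}, mean={mean:,.0f}, 95% CI = ({low:,.0f}, {high:,.0f})")

# If the interval is wide relative to the mean, the sample is too small
# to represent the category reliably.
print(f"CI width as a share of the mean: {(high - low) / mean:.1%}")
```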

The role of collaboration 

There are some interesting lessons emerging from a global benchmarking initiative that Linesight is leading with major life sciences organizations, which could be drawn upon by other sectors. While there can be a reticence in some sectors for competitors to come together and collaborate, the participants in this initiative recognize the value of the data output and knowledge sharing, the role it will play in bringing predictability and meaningful insights to the sector, and, overall, the greater good it is driving towards for the sector at large. Naturally, this is based on anonymized data sets, whereby participants cannot identify which project is which, bar their own, and the initiative is working extremely well to date.
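
For illustration, the sketch below shows one possible anonymization mechanism: assigning each submitted project a random pseudonym, where each participant is told only their own. This is an assumed approach for the purposes of the example, not a description of the actual initiative's method.

```python
import secrets

# Illustrative only: assign each submitted project a random pseudonym.
# Each participant is told only their own pseudonym, so they can find
# their project in the shared output but cannot identify anyone else's.
def anonymize(project_names: list[str]) -> dict[str, str]:
    return {name: f"Project-{secrets.token_hex(4)}" for name in project_names}

submissions = ["Acme Biologics Plant", "Globex Fill-Finish Facility"]
mapping = anonymize(submissions)
for original, alias in mapping.items():
    # The mapping table stays with a trusted party; only aliases are shared.
    print(f"{original} -> {alias}")
```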


Hindsight versus insight versus foresight  

While benchmarking has traditionally focused on historical data, looking solely at what happened in the past, more advanced branches of analytics are emerging that look ahead: from what happened (descriptive analytics) and why it happened (diagnostic analytics), to what might happen (predictive analytics), and what can be done about it (prescriptive analytics, increasingly leveraging artificial intelligence (AI)). AI-driven modelling can be particularly powerful, as it can account for the multiple factors at play.
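
To make the distinction concrete, a predictive step can be as simple as fitting a trend to historical benchmark data and extrapolating. The sketch below fits a least-squares line to illustrative yearly unit rates; real AI-driven models would account for many more factors, but the principle of learning from past data to anticipate future outcomes is the same.

```python
# Minimal predictive-analytics sketch: fit a linear trend to historical
# unit rates (illustrative numbers only) and forecast the next year.
years = [2018, 2019, 2020, 2021, 2022]
rates = [3600, 3750, 3900, 4150, 4400]   # e.g., USD per m2

n = len(years)
mean_x = sum(years) / n
mean_y = sum(rates) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, rates))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

forecast_year = 2023
forecast = slope * forecast_year + intercept
print(f"Trend: {slope:,.1f}/year; {forecast_year} forecast: {forecast:,.0f}")
```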


Summary 

It is clear that benchmarking holds a huge amount of potential for a number of sectors, particularly at a time when margins are tight and demands are significant. As more complex branches of analytics take hold and offer increasing value, the principle remains the same: benchmarking and analytics deliver the most when conducted at a relatively granular level, informing design and optimizing cost and schedule.
