A study carried out for the US National Institute of Standards and Technology (NIST) gives the first in-depth analysis of the technology infrastructure needs of that country’s biopharmaceutical industry (*). Its findings are equally valid for the worldwide industry.
Carried out by RTI International, the study estimates the biopharmaceutical industry’s annual spending on technology infrastructure-related investments and activities.
Most experts interviewed over the course of this study conceptualised technology infrastructure expenditures as falling into two general categories. The first is that these expenditures are, in part, an investment in current and future R&D efficiency: the labour, systems, and instruments expended are essential, unavoidable, and integral to a firm's primary economic activity. The second is that these expenditures represent costs incurred to develop work-arounds and to overcome technical barriers stemming from a pervasive lack of standardisation.
As a result, organisations view the current level of expenditure on infrastructure technologies as above the level that a more efficient infrastructure would require. The industry expended US$884 million on the R&D technology infrastructure and US$335 million on the commercial manufacturing and postmarket surveillance infrastructure (Table 1); some portion of that total US$1,219 million in spending could have been avoided with an improved infrastructure.
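The Table 1 components sum as follows (a minimal sketch; the avoidable fraction is not quantified in the study at this point, so it appears below only as an illustrative parameter):

```python
# Table 1 figures from the study (US$ millions, annual)
rd_infrastructure = 884            # R&D technology infrastructure
mfg_and_surveillance = 335         # commercial manufacturing and postmarket surveillance

total = rd_infrastructure + mfg_and_surveillance
print(f"Total annual spending: US${total} million")  # Total annual spending: US$1219 million

# Hypothetical: if some fraction were avoidable with a better infrastructure.
# The value below is illustrative only and does not come from the study.
avoidable_fraction = 0.20
print(f"Avoidable at {avoidable_fraction:.0%}: US${total * avoidable_fraction:.0f} million")
```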
Challenges in the biopharmaceutical technology infrastructure and the differences in how organisations respond to these challenges are rooted in:
- The inherent trial and error of the drug discovery and development process;
- Development times averaging 12 years, which are compounded by changes in regulatory requirements, information systems, and procedures;
- Variability in the methodologies and protocols used to acquire information, and in how that information is described and characterised;
- Few industry standards for ontologies, data formats, and data communications systems;
- The rapid introduction and adoption of data acquisition technologies, which far outpaces industry's ability to manage, communicate, analyse, and synthesise data; and
- Changes in the regulatory environment across a diverse set of countries and languages.
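The cost of missing data standards described above shows up as ad-hoc conversion code. The sketch below illustrates the kind of work-around two incompatible laboratory conventions force; the field names and units are purely hypothetical and not drawn from the study:

```python
# Hypothetical records from two labs using incompatible data conventions
lab_a = {"compound": "X-17", "ic50_nM": 42.0}          # potency in nanomolar
lab_b = {"cmpd_id": "X-17", "IC50 (uM)": 0.042}        # same measurement, micromolar

def normalise(record):
    """Map either convention onto one schema, with IC50 in nanomolar."""
    if "ic50_nM" in record:
        return {"compound": record["compound"], "ic50_nm": record["ic50_nM"]}
    # Convert micromolar to nanomolar (1 uM = 1000 nM)
    return {"compound": record["cmpd_id"], "ic50_nm": record["IC50 (uM)"] * 1000}

# Without a shared standard, every pairing of data sources needs glue like this
assert normalise(lab_a) == normalise(lab_b)
```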
Even though industry has made significant investments in some areas of technical infrastructure over the past several years, the results of this study indicate that the expected actual expense for a new approved drug could be reduced by 25 percent under a conservative scenario and by 48 percent under an optimistic scenario (Table 2). The study also found that commercial manufacturing costs could be reduced by up to 22 percent. A 25–48 percent improvement is significant and would require substantial, broad-based advances in a range of technical infrastructures, including bioimaging, biomarkers, bioinformatics, and gene expression.
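The Table 2 scenarios can be applied to a baseline as follows. The reduction rates are from the study; the baseline cost per approved drug is an assumed, illustrative figure only, since no baseline is stated here:

```python
# Reduction scenarios from Table 2 of the study
conservative = 0.25        # 25% reduction in expected expense per approved drug
optimistic = 0.48          # 48% reduction
manufacturing_cap = 0.22   # up to 22% reduction in commercial manufacturing costs

# Hypothetical baseline, US$ millions per approved drug (assumed for illustration)
baseline_cost = 1000.0

for name, rate in [("conservative", conservative), ("optimistic", optimistic)]:
    remaining = baseline_cost * (1 - rate)
    print(f"{name}: US${remaining:.0f} million (saves US${baseline_cost * rate:.0f} million)")
```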
Advanced bioimaging techniques
Bioimaging studies are becoming more important in the analysis of how chemical compounds impact biological systems, particularly as FDA increasingly requests that such studies accompany submissions and Phase IV clinical trials.
While these studies can capture and analyse fluorescently coded information to seek out biomarkers, improvements in measurement, calibration, and metrology systems are needed to improve comparability, repeatability, and throughput.
Stakeholders cited the significant benefits of using bioimaging techniques to seek and quantify key biomarkers in large-scale imaging studies.
However, identification and quantification of key biomarkers for a disease first requires standard protocols for patient or specimen positioning, instrument calibration, and settings to reduce variability and allow images to be compared across tests (in discovery) or trials (in the clinical testing phases).
Next generation biomarkers
New biomarkers, such as those that predict toxicity or detect concentrations of antibodies in cells, are needed if the potential time and cost benefits quantified as part of this study are to be realised.
The industry experts interviewed for the study stressed the benefits of using validated efficacy biomarkers as surrogate end points in clinical trials, particularly the ability to rank and prioritise drug candidates in a firm’s portfolio of potential products. Enhancing researchers’ ability to predict biological response to treatment gives them access to information that previously was available only at later stages of the discovery process or in clinical trials.
Outside of drug R&D, next generation biomarkers hold promise for patients whose quality of care may improve because doctors have improved foresight into the transition from wellness to disease.
Bioinformatics
One of the key challenges facing the industry is improving the effectiveness and efficiency of bioinformatics as drug discovery shifts away from traditional laboratory science toward computational systems. Tools and strategies for interpreting, synthesising, and communicating data have not kept pace with the ability to generate it.
Standardisation in bioinformatics is essential if the full potential of advances in gene expression, bioimaging, and biomarkers is to be realised.
Achieving key priorities within informatics would increase analysts’ confidence in the data used in their analyses.
Gene and protein expression analysis
Standardisation in gene and protein expression analysis is needed if drug R&D and the practice of medicine are to achieve NIH’s and FDA’s goals of personalised medicine. Expression analysis systems capture and analyse information from organisms’ cells that can be used in medical research and health management.
It will likely be a decade or more until medical care reaches the point of personalisation, but if the underlying technology infrastructure supporting expression analysis remains fragmented and characterised by dissimilar approaches to diagnostics and assays, then availability of personalised medicine will be still further delayed.
Commercial manufacturing
Improvements in data integration and in-line process measurement techniques are needed in commercial manufacturing. Biopharmaceutical manufacturing is a complex process with a propensity for high variability in reproducing biological substances.
Standardisation of impurity testing methods and protein aggregation studies will ensure safer and more effective products. Over the next 10–20 years, manufacturers will continue to achieve major milestones in improving production yields, demonstrating a better understanding of the bioprocess underlying their products. However, the benefits to consumers resulting from these innovations could be constrained by continued inefficiencies if parallel improvements to the measurement and technical infrastructure that supports the commercial manufacturing process are not made.
Postmarket surveillance
Improved standardisation of data elements, improved data integration, and more powerful statistical methodologies are needed to improve the efficiency and precision of postmarket surveillance activities. Over the next decade, the scope and quantity of data required for postmarket surveillance will increase dramatically, largely because of the globalisation of markets and increasing FDA requirements for long-term safety studies. A robust and efficient technology infrastructure for data collection, retrieval, and analysis will be required to ensure that these postmarket studies can be implemented cost-effectively.
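One widely used statistical method in postmarket surveillance, sketched below, is the proportional reporting ratio (PRR), a standard disproportionality measure in pharmacovigilance; the study itself does not name a specific method, and the report counts here are invented for illustration:

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR from a 2x2 contingency table of adverse-event reports.

    a: reports of the event of interest for the drug of interest
    b: reports of all other events for that drug
    c: reports of the event of interest for all other drugs
    d: reports of all other events for all other drugs
    """
    return (a / (a + b)) / (c / (c + d))

# Illustrative counts only (not real surveillance data)
prr = proportional_reporting_ratio(a=30, b=970, c=120, d=49880)
print(f"PRR = {prr:.1f}")  # PRR = 12.5; values well above 1 flag a possible safety signal
```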
(*) The full report, titled Economic analysis of the technology infrastructure needs of the US biopharmaceutical industry, is available at: