Managing 70TB Of Reliable Data: EMA’s Big Challenge

Data analysis and interoperability across more than 100 information systems are essential to the European Medicines Agency's regulatory work, says Hans-Georg Wagner, IT chief at EMA

What are the leading health care IT issues for the EMA?

The most urgent and challenging one is how to go outside the EU. The pharmaceutical industry really is global, so it is necessary but not sufficient to have EU-wide standards. We need to have our standards aligned with worldwide standards, and we're working very closely with colleagues at the FDA and within ISO and HL7 to create worldwide technical standards – for example, to describe a medicinal product, or to describe the minimal information about an adverse drug reaction and the way it's structured. That way we can understand it if the FDA sends us information about something that happened with a drug which is also sold in Europe.

When they tell us about some real insufficiency or reaction, a patient or a specialist here in Europe can actually see that information, understand it, and use it in his or her data analysis. So by far the biggest challenge is to align ourselves with worldwide standards.
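
As a rough illustration of the kind of minimal, structured adverse-reaction record such standards describe, here is a sketch in Python. The field names are hypothetical, chosen for readability; they are not the actual ISO/HL7 schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of a minimal, structured adverse drug reaction
# record. The field names are illustrative only, not the actual
# ISO/HL7 message schema the agencies are standardising.
@dataclass
class AdverseReactionReport:
    reporting_agency: str      # e.g. "FDA" or "EMA"
    product_name: str          # the medicinal product involved
    active_substance: str      # lets the same drug be matched across markets
    reaction_description: str  # what happened to the patient
    onset_date: date           # when the reaction occurred
    seriousness: str           # e.g. "serious" / "non-serious"

# A report received from a partner agency could then be matched against
# products authorised in Europe by active substance.
report = AdverseReactionReport(
    reporting_agency="FDA",
    product_name="ExampleDrug",
    active_substance="examplamine",
    reaction_description="severe rash within 24 hours of first dose",
    onset_date=date(2011, 3, 14),
    seriousness="serious",
)
print(report.active_substance)
```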

Changing a standard has a massive impact on existing information systems. You have to change processing logic, and this causes big problems throughout Europe because people don't have the money to do that.

I have some visions I'm working on with some of my colleagues in Europe about offering a common portal for all applicants and sponsors, not just with intelligent forms but also with software service providers. That way, depending on the size of the pharmaceutical company, they can go there to get whatever they need to submit good-quality dossiers to the regulators. And of course we also need to do this worldwide.

We’re entering a period with very severe pressures on budgets. We have very strong pressure to reduce the cost of operation and also the cost of development.

How does the EMA’s structural analysis software measure and scrutinise the 70 terabytes of data the agency stores?

We’re using a product from a French company called Know and Decide, which allows us to monitor the consumption of actual storage used.

For us, this addresses two important issues. The first is understanding where we have duplicates we can throw out. The second, which in many ways is more demanding, is capacity planning: at the beginning of every year I find myself signing purchase orders for large amounts of additional hard disk storage, and my people assure me that it will be comfortably enough for the next two years.
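
Tools in this space typically spot duplicates by comparing content hashes rather than file names. Below is a minimal sketch of that general technique, offered as an illustration only, not as Know and Decide's actual implementation.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def file_digest(path: Path) -> str:
    """Hash a file in 1 MB chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by content hash; keep groups of size > 1."""
    by_hash: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            by_hash[file_digest(path)].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    # "/data/archive" is a hypothetical path for illustration.
    for digest, paths in find_duplicates("/data/archive").items():
        print(f"{len(paths)} copies of {digest[:12]}...: {paths}")
```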

What is Cast and how does it work?

Cast is a tool that analyses source code. For example, it looks for database tables that don't have foreign keys.
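
To make that example concrete, here is a minimal sketch of the same kind of structural check expressed against a live SQLite database: list the tables and flag those declaring no foreign keys. Cast itself works from source code, so this is an analogy for the check described, not Cast's implementation.

```python
import sqlite3

def tables_without_foreign_keys(db_path: str) -> list[str]:
    """Return the names of user tables that declare no foreign keys."""
    conn = sqlite3.connect(db_path)
    try:
        tables = [name for (name,) in conn.execute(
            "SELECT name FROM sqlite_master "
            "WHERE type = 'table' AND name NOT LIKE 'sqlite_%'")]
        # PRAGMA foreign_key_list returns one row per declared foreign key;
        # an empty result means the table has none.
        return [t for t in tables
                if not conn.execute(f'PRAGMA foreign_key_list("{t}")').fetchall()]
    finally:
        conn.close()

# "example.db" is a hypothetical database file for illustration.
print(tables_without_foreign_keys("example.db"))
```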

If you have a source code versioning system, you provision a project team with licences for Cast and ask them to run it against the code they produce at least once per iteration, and then you look at the results. It reports against a number of headings to do with performance, maintainability and several other issues – there are six or seven headings in total. It reports the state of health of that source code to me as the CIO, but more importantly to the project manager and the software architect.
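
A simple way to picture this per-iteration reporting is a script that maps each heading's score to a red/amber/green status, which is essentially what a management-level dashboard rolls up. The headings below echo the ones just mentioned, but the 1-4 score scale, the thresholds, and the function itself are illustrative assumptions, not Cast's actual API.

```python
# Hypothetical per-iteration health report built on analysis results.
HEADINGS = ["performance", "maintainability", "robustness",
            "security", "transferability", "changeability"]

def health_report(scores: dict[str, float],
                  red_below: float = 2.0,
                  amber_below: float = 3.0) -> dict[str, str]:
    """Map each heading's score (assumed 1-4 scale) to red/amber/green."""
    statuses = {}
    for heading in HEADINGS:
        score = scores.get(heading, 0.0)  # a missing heading counts as red
        if score < red_below:
            statuses[heading] = "red"
        elif score < amber_below:
            statuses[heading] = "amber"
        else:
            statuses[heading] = "green"
    return statuses

# Example run after one iteration's analysis; a CIO-level dashboard
# only needs the worst colour across all headings.
scores = {"performance": 3.4, "maintainability": 1.8, "robustness": 3.1,
          "security": 3.6, "transferability": 2.7, "changeability": 3.0}
report = health_report(scores)
worst = ("red" if "red" in report.values()
         else "amber" if "amber" in report.values() else "green")
print(report)
print("overall:", worst)
```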

We are now moving towards a more proactive use of the tool. At the moment we have 125 seats licensed for the use of Cast.

We use [Cast] for somewhere around 10 to 12 of our applications. Increasingly, project groups are adopting the Cast tool, each using it at a different level of granularity. Obviously a project manager and a software architect will look into things in much more detail than I will as the CIO. I have a dashboard, and if I don't see red, I just concentrate on something else.

You have this information at your fingertips, and you can dig into lower levels of detail.

Then I can drill down to see where Cast highlights a problem. If you use Cast systematically and regularly, debugging becomes much easier.