Clinical Evaluation Masterclass: Overcoming Non-conformities - Episode 1

Dr Paul Hercock

Clinical Evaluation Plans (CEPs) and Clinical Evaluation Reports (CERs) are critical components of medical device regulatory compliance. However, the reality is that meeting requirements isn’t always plain sailing.

Based on our experience with over 250 successful submissions, we’ve identified that overcoming clinical evaluation non-conformities is one of the major challenges manufacturers face during the device approval journey.

What is a Non-Conformity in Clinical Evaluation?

A non-conformity arises when a CEP or CER fails to meet MDR requirements. Non-conformities signal that the evaluation does not yet adequately demonstrate a medical device’s safety and performance or that full alignment with regulatory obligations has not been shown, requiring correction before approval.

The Impact of Non-Conformities on Manufacturers

Non-conformities significantly disrupt the approval process, leading to extended review timelines, increased costs, and mandatory revision cycles. Since notified bodies (NBs) cannot accept a CEP or CER with unresolved non-conformities, addressing them is critical for market access.

A helping hand

Our new video series applies our deep experience to help manufacturers proactively identify and fix these issues for smoother audits and faster certifications. This series breaks down the most common clinical evaluation non-conformities, explains why they keep happening, and most importantly, shows you how to fix them for good. Each episode provides step-by-step guidance to help you avoid these pitfalls in your CER submissions.

Episode 1: “Safety and Performance Objectives Lack Specific and Measurable Acceptance Criteria”

This frequently cited regulatory observation typically reflects not an actual safety or performance deficiency, but rather insufficiently defined evaluation parameters. The episode will address:

  • Fundamental concepts of safety and performance objectives
  • Deriving safety and performance objectives using weighted values
  • Defining “specific” and “measurable” objectives
  • Practical approaches to establishing and implementing acceptance criteria

Coming in Episode 2: We’ll tackle another frequent non-conformity, “Appraisal of literature articles has not been conducted appropriately”, showing you how to strengthen your literature review process. Sign up to our newsletter now so you don’t miss out.


Transcript follows:

Okay, hi everyone, it’s Paul here from Mantra Systems. Welcome to a brand new series that we’re running on how to fix clinical evaluation non-conformities.

The idea behind this series is to equip you with the capability to, first of all, avoid common non-conformities seen in medical device clinical evaluations. And secondly, if you have been unfortunate enough to receive one, by the end of the series you should have a powerful strategy for correcting it and moving on to full acceptance of your CER.

So, just for this first episode, we’re going to cover a few basics, and it’s worth starting with the question: What is a clinical evaluation non-conformity?

Well, obviously, apart from Class I devices, clinical evaluation plans and clinical evaluation reports need to be submitted for review by a notified body. A non-conformity — or NC for short — is raised when conformity with requirements has not been adequately demonstrated. A non-conformity requires correction and then resubmission of the CEP and CER, and resolution — certainly of major non-conformities — is necessary for the clinical evaluation to be accepted.

So, it is a really important topic, and non-conformities under MDR are quite common. But the problem with non-conformities is that the questions — which is how they often manifest, as a question from a reviewer — can be difficult to interpret. Sometimes it’s not clear exactly what a reviewer means, and it may be unclear how to fully address them as well.

A failure to resolve non-conformities in full may lead to a further round of review following yet another resubmission, which multiplies costs, loses time, and leads to stress and worry — because trying to get a clinical evaluation through approval is anxiety-inducing.

Okay, so the aim of the series is to work through common non-conformities one at a time, and to dig deep into what they mean and how to solve or avoid them. The series and the principles within it link over to general principles for optimal conduct of clinical evaluation under the MDR.

So, let’s start with our first example non-conformity. This one is a really common one, and it states:

“Safety and performance objectives do not appear to have specific and measurable acceptance criteria.”

Breaking that down, we need to understand exactly what the question means and how to address it. It’s possible to break it down into the following bullet points:

  • What actually are safety and performance objectives?
  • What in this context does “specific and measurable” mean?
  • What are acceptance criteria?

And importantly, of course, it needs to go beyond just knowing what they are: how do we derive and analyse them?

So, let’s begin with safety and performance objectives. What are safety and performance objectives?

Well, these ultimately are benchmark values against which the device under evaluation will be compared. They are derived from the state-of-the-art literature review, which is an essential component of clinical evaluation.

Technically speaking, objectives have two components. They consist of a clinical outcome parameter — which is a qualitative concept or type of outcome — and, attached to that, a quantitative value that forms the actual objective.

As per the non-conformity, safety and performance objectives must be specific and measurable. Now, that covers a lot of ground, so let’s take an example and break this down.

An example of a completed safety — or in this case, performance — objective might be:

“Increasing walking distance in meters at six weeks post-procedure of 61.46m.”

You can see within this the two components: the “increase in walking distance in meters at six weeks” — that’s not a value, okay? It contains a number, but that’s just because we need to compare like with like. That’s not the actual objective.

The clinical outcome parameter is a thing, a concept, something you might measure, something to which you might attach a value. And then the second part of it is the actual performance outcome, which is quantitative.

These two things together constitute a specific and measurable safety or performance objective.
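To make that two-part structure concrete, here is a minimal sketch in Python of how an objective could be represented. The class name and fields are purely illustrative — nothing in the MDR prescribes this representation:

```python
from dataclasses import dataclass

@dataclass
class PerformanceObjective:
    """A safety/performance objective: a qualitative parameter plus a quantitative value."""
    parameter: str   # the clinical outcome parameter (the qualitative "thing")
    value: float     # the quantitative objective, e.g. a weighted mean from the SOA review
    unit: str        # unit of measurement, so comparisons are like-with-like
    timepoint: str   # timepoint, again to ensure like-with-like comparison

# The worked example from the episode, expressed in this hypothetical structure
obj = PerformanceObjective(
    parameter="increase in walking distance",
    value=61.46,
    unit="m",
    timepoint="6 weeks post-procedure",
)
```

Separating the parameter, unit, and timepoint from the value mirrors the speaker's point: the number alone is not the objective, and the parameter alone is not measurable — both parts are needed.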

So, how are these derived? Well, they’re derived during the analysis stage of a state-of-the-art literature review. We’re going to go on to a working example in a moment to show exactly how they are derived.

Remember, a clinical outcome parameter is a “thing,” and you decide upon those by looking for outcomes that are seen in a comparable form across multiple sources within the state-of-the-art literature review. If you have four, five, six sources all reporting the same type of outcome, it’s a strong candidate for a clinical outcome parameter.

In the state-of-the-art protocol, there should be a method for determining what constitutes a clinical outcome parameter. That, of course, would feed into the clinical evaluation plan.

Safety and performance objectives then are weighted mean values attached to those parameters. As we’ve seen before, you need both parts.

It’s worth at this point just reflecting on the principles of an effective state-of-the-art literature review. Because there’s no point producing state-of-the-art safety or performance outcomes that have not been derived properly.

We’ll go into this in another video, but a state-of-the-art literature review requires:

  • a detailed protocol,
  • use of a validated method to define research questions and search terms (one such method is PICO),
  • a well-documented literature extraction process, with recording and justification of all excluded sources,
  • a structured appraisal — which is where we get very specific about the production of safety and performance objectives, and
  • the analysis — which is where the actual objectives are produced.

But we cover that in a lot more detail in a separate video.

So, let’s look at a working example of how safety and performance objectives are derived.

Here is fictional data. Let’s say we’ve done a state-of-the-art literature review, and sticking with the same example, we found four publications that all reported a mean increase in walking distance at six weeks, and they all did it in a comparable way. Here are the mean values from each of those sources.

How do we produce a weighted mean?

You can see in the table that the way we’re going to weight it is by a representation of the quality of each of these studies, to ensure that greater weighting or prominence is given to results from higher quality studies. In order to do that, we need to produce an appraisal score.

Appraisal scores — there are lots of different ways to do this. On the next slide we’ve got, again, a simplified fictional example of how to calculate an appraisal score. You might look at different aspects of the study: you might consider study type, sample size, the use of statistical tests and whether they were appropriate, and the length of follow-up. You’d probably in reality consider other factors as well, but for the purpose of this, this will suffice.

You’ve perhaps seen this — usually these manifest as an alphanumeric code, and again, the meaning behind this code should be expressed in the literature search protocol, in the appraisal plan section. So this will all be mapped out.

Ultimately, this enables the calculation of a numerical appraisal score for each paper, with a higher number representing a higher quality study. Those values can then be entered back into the original table, with all the other values staying the same.
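As a rough illustration only — the criteria and point values below are entirely hypothetical, and a real appraisal plan would define its own scheme in the literature search protocol — an appraisal score of this kind might be sketched as:

```python
# Hypothetical appraisal scheme: each criterion maps a category to a point value.
# A real scheme would be defined and justified in the appraisal plan section.
CRITERIA = {
    "study_type": {"RCT": 8, "cohort": 5, "case_series": 2},
    "sample_size": {">=100": 6, "30-99": 4, "<30": 2},
    "appropriate_stats": {True: 3, False: 0},
    "follow_up": {">=6 months": 3, "<6 months": 1},
}

def appraisal_score(study: dict) -> int:
    """Sum the points for each criterion; a higher score means a higher quality study."""
    return sum(CRITERIA[criterion][category] for criterion, category in study.items())

# Example appraisal of one (fictional) paper
paper = {
    "study_type": "RCT",
    "sample_size": ">=100",
    "appropriate_stats": True,
    "follow_up": ">=6 months",
}
score = appraisal_score(paper)  # 8 + 6 + 3 + 3 = 20 under this hypothetical scheme
```

In practice, as the episode notes, the score is usually recorded alongside an alphanumeric code whose meaning is mapped out in the protocol.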

Now we have an appraisal score by which we can weight the results from each study. That’s done fairly simply just by multiplying the actual result by the appraisal score. We do that for all of them to produce a weighted value for each study.

But we’re not quite done there, because we need to calculate a weighted mean. That’s done by taking the sum of all the weighted values and dividing that by the sum of the appraisal scores.

What we’re doing there is making the sum of the appraisal scores the denominator — they are the weighting factor being applied across all studies. This means we achieve our objective of moving the weighted mean closer to the values of the highest quality studies.

In this case, that means: 3872 / 63, which gives a weighted mean of 61.46.

It’s always worth sanity checking these, but if you look at the plain mean values from each study and the appraisal scores, you’d expect it to land around the 60 mark — and that’s where it falls: 61.46.

So, that’s a simplified example of how to calculate a weighted mean for the quantitative component of a safety and performance objective.
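The weighted-mean calculation can be sketched in a few lines of Python. The study means and appraisal scores below are hypothetical, chosen so the totals reproduce the worked example from the video (3872 / 63 ≈ 61.46):

```python
# Hypothetical data: (mean increase in walking distance at 6 weeks, appraisal score)
# for four fictional studies from the state-of-the-art literature review.
studies = [(64, 20), (62, 17), (60, 15), (58, 11)]

# Weight each study's result by its appraisal score...
weighted_sum = sum(mean * score for mean, score in studies)   # 3872
# ...and use the sum of the appraisal scores as the denominator.
total_score = sum(score for _, score in studies)              # 63

weighted_mean = weighted_sum / total_score                    # 3872 / 63 ≈ 61.46
```

A plain (unweighted) mean of these four values would be 61.0; the weighted mean lands higher because the higher-scoring studies reported larger increases — exactly the sanity check described above.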

Don’t forget, the non-conformity required that we produce safety and performance objectives that are specific and measurable.

Let’s just reflect on what that means:

  • Specific means unambiguous, clear, and would be consistently interpreted as meaning the same thing.
  • Measurable means it contains a value — a quantitative value — against which another value from another device can be measured.

So we have that in this case: the objective is both specific and measurable, and that’s why we included the reference to six weeks — to ensure we’re comparing like with like.

That leaves us to consider the final aspect of the non-conformity, which was reference to acceptance criteria.

An acceptance criterion basically defines when the safety or performance of the device under evaluation is acceptable in comparison to state-of-the-art.

What we’re doing here is setting out a means for comparing the device-under-evaluation outcomes with the state-of-the-art objectives for specified clinical outcome parameters. By now, all of these terms should have a meaning attached to them.

So, the clinical evaluation plan needs to contain an analysis plan for how this comparison will be done. Suffice to say that the outcomes for the device under evaluation are calculated as weighted means, using a very similar method to the one we used for the state-of-the-art objectives.

And acceptance — this is the key part — acceptance can be defined as showing (statistically) that the outcomes for the device under evaluation are non-inferior to objectives derived from the SOA. We don’t need to show superiority — we’re just showing non-inferiority. That’s a key distinction.

Let’s take a final example, then, working this through.

We can see:

  • Increasing walking distance (spelled correctly this time) of 61.46,
  • and in this example there were some other clinical outcome parameters as well:
    • improvement in pain (VAS, a measure of pain score),
    • and range of motion at six months.

We have values attached to all of those. Then we also have weighted mean values for the device under evaluation.

Remember, the job here is to determine whether the device under evaluation is non-inferior to SOA objectives.

For the top one, it’s very easy, because a plain number comparison shows that the device under evaluation did better than the SOA objective. We don’t need to do any fancy tests — it’s obviously non-inferior. That’s a straightforward conclusion.

But with the others, on a plain number comparison, the device under evaluation actually looks like it’s done less well than the state-of-the-art objectives. The important thing to understand here is whether that represents true inferiority, or whether these values are statistically non-different.

For that reason, conducting a statistical test is relevant. Often a t-test is an appropriate test because it’s a comparison of means, and it generates a p-value denoting significance or non-significance of the difference between these values.

Let’s say we conducted a t-test in this case, and these were the p-values that were derived. Generally, significance requires a p-value of less than 0.05. That’s not what we’re seeing here, so these differences were not statistically significant, meaning the device outcomes can be regarded as non-inferior.
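As a hedged sketch of what such a test might look like — here framed as a one-sample t-test of hypothetical device outcomes against the SOA objective, which is one possible way to set up the comparison, with the p-value computed by numerical integration so the example needs only the standard library (in practice you would use a statistics package such as SciPy):

```python
import math
from statistics import mean, stdev

def t_pdf(x: float, df: int) -> float:
    """Probability density of Student's t-distribution with df degrees of freedom."""
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return c * (1 + x * x / df) ** (-(df + 1) / 2)

def two_sided_p(t: float, df: int, upper: float = 60.0, steps: int = 20000) -> float:
    """Two-sided p-value: 2 x the upper-tail area beyond |t|, via the trapezoid rule."""
    a, h = abs(t), (upper - abs(t)) / steps
    tail = 0.5 * (t_pdf(a, df) + t_pdf(upper, df))
    tail += sum(t_pdf(a + i * h, df) for i in range(1, steps))
    return 2 * tail * h

# Entirely hypothetical outcome data: device-under-evaluation VAS pain improvements,
# compared against a (hypothetical) SOA weighted-mean objective.
device = [3.0, 3.1, 2.9, 3.3, 3.0, 3.2, 2.8, 3.1]
soa_objective = 3.1

n = len(device)
t_stat = (mean(device) - soa_objective) / (stdev(device) / math.sqrt(n))
p = two_sided_p(t_stat, n - 1)

# The device mean (3.05) looks slightly worse than the objective (3.1), but p > 0.05,
# so the difference is not statistically significant -> non-inferior per the
# acceptance criteria described in the episode.
```

The design point is the one the episode makes: a plain-number shortfall is not by itself inferiority; the statistical test determines whether the difference is real or just noise.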

According to our acceptance criteria, the device under evaluation has been shown to have appropriate performance in reference to state-of-the-art.

So let’s go back to the original non-conformity:

“Safety and performance objectives do not appear to have specific and measurable acceptance criteria.”

We covered:

  • what safety and performance objectives are,
  • how to derive them using weighted values following a state-of-the-art literature review,
  • what “specific and measurable” means and how the outlined process generates specific and measurable objectives,
  • and what acceptance criteria are and how to apply them — including through use of statistical testing.

If you need any further support in relation to clinical evaluation or working through non-conformities, Mantra Systems are clinical evaluation specialists. You’re free to contact our team at any time.

If you’ve just got some general questions, don’t worry — that’s absolutely fine, and we’re more than happy to speak at any time. Feel free to reach out to us if you need any additional support.

That concludes the first episode of the Clinical Evaluation Non-Conformity series. I’d like to thank you very much, and if you have any questions or comments, please let me know below the video.

Thank you very much.
