Developing new laboratory tests and putting them into practice

    21-Feb-2020
We often see and hear in the news about new laboratory tests that have been developed to detect or manage conditions and diseases that affect our lives or the lives of people we know. In times of epidemics, there is news of research into tests for hitherto unknown virus strains. As with other products and services, new laboratory tests are meant to satisfy a need: to help us and our healthcare providers diagnose, screen for, or monitor conditions faster, more easily, and with more confidence.
But how does a diagnostic test that shows promise in the research stage actually get to the point where it is commercially available for use at the doctor’s office, clinic, or hospital? What does it mean for patients when a new test is announced? How are your healthcare needs met by the introduction of new tests, and how is your health protected from new tests that might misinform or mislead your healthcare provider? Becoming familiar with how laboratory tests go through the development, validation, and approval stages and are placed into practice may help you answer these questions.
It may take years for a new test to pass through the many phases – research, testing, clinical evaluation, development of manufacturing processes, and review by regulatory authorities – before it becomes available for use. The process is rigorous, and there is no assurance that a test, once developed and validated, will actually be adopted by healthcare providers for use on patients.
WHY ARE NEW TESTS DEVELOPED?
Researchers continually look for new ways to improve early detection and diagnosis of diseases, more accurately monitor conditions, and better predict outcomes (prognosis). The goals of improving and advancing patient care often provide the incentive for the development and use of new or improved laboratory tests.
One of the most common ways a new test gets developed is through the recognition of a need for an accurate test to diagnose or monitor a particular disease or condition. An example is the test for troponins.
After years of searching for a better way to diagnose heart attacks and acute coronary syndrome (ACS), researchers realized that the protein troponin is released into the blood when heart muscle is damaged. Measurement of blood troponin levels is now routinely used to evaluate patients with chest pain and help determine whether they have had a heart attack.
Often researchers look to improve the way in which a condition is detected or a substance of interest is measured. Their goal is to improve upon the accuracy, precision, sensitivity and/or specificity of an existing test. This can sometimes be accomplished by developing and employing a new way of testing.
Take, for example, the development of polymerase chain reaction (PCR) testing to detect infections, replacing immunoassay methods that may be less sensitive or specific. Sometimes the decision to adopt a new test for an established analyte is based on whether the new method offers faster results, moving from a slower, more labour-intensive method to an automated one that generates patient results in a shorter time. This can directly affect how quickly a diagnosis is reached, how long a patient stays in the hospital, or whether medications need to be changed.
QUESTIONS TO BE CONSIDERED
There are several questions that are considered when evaluating the merits of developing a new test:
· Is the new test more accurate? In other words, can it detect disease when it is present and rule it out when it is not present?
· Is it less invasive? Is the sample required easier to obtain and/or does the procedure cause less discomfort for the patient?
· Does it provide results more quickly so that treatment can begin sooner?
Though the reasons for developing new tests or methods may vary, it is important to note that the development of all new tests is highly regulated. Each new test must meet certain criteria before it is allowed to be used on patient samples.
COMMERCIAL LABORATORY TESTS AND THEIR APPROVAL
Commercial laboratory tests are those that are performed using commercially manufactured kits and equipment. The majority of lab tests in use today fall into this category. Unlike tests developed for use in a single laboratory or laboratory company (known as lab-developed tests), they are manufactured, marketed, and sold in volume as kits to multiple laboratories and other healthcare facilities.
The development and marketing of most commercial tests are regulated by the U.S. Food and Drug Administration (FDA), which evaluates and approves these tests and methodologies. Many countries around the world have comparable agencies responsible for approving the use of clinical laboratory tests.
The process required for a new commercial test to gain approval for marketing by the FDA can be long and costly, sometimes taking many years, depending on how complicated the test is.
(The writer is Senior Consultant Pathologist & Managing Director, BABINA Diagnostics, Imphal)