BOSTON, Mass.––Consumers may place a high value on information to predict their future health, and may be willing to pay out of pocket to get it. In a national survey conducted by researchers at Tufts Medical Center, roughly 76% of people indicated that they would take a hypothetical predictive test to find out if they will later develop Alzheimer’s disease, breast or prostate cancer, or arthritis. On average, respondents were willing to pay $300 to $600, depending on the specific disease and the accuracy of the test.
Published online in the journal Health Economics, the study examined individuals’ willingness to take, and pay for, hypothetical predictive laboratory tests whose results would carry no direct treatment consequences. Overall, the researchers found that in most situations people were willing to pay for this ‘value of knowing’—even when the tests were not perfectly accurate.
Responses to the survey varied according to the information provided about the disease risk profile and the accuracy of the hypothetical test. Of the 1,463 respondents, willingness to be tested was greatest for prostate cancer (87% of respondents), followed by breast cancer (81%), arthritis (79%), and Alzheimer’s disease (72%). Average willingness to pay ranged from roughly $300 for an arthritis test to $600 for a prostate cancer test.
The randomized, population-based internet survey presented participants with the option to take a hypothetical predictive blood test for one of the four diseases, with the understanding that the test would not be covered by insurance. Participants were asked how much they would be willing to pay for a test that could predict their disease. Some respondents were asked about a ‘perfectly accurate’ test, and others about an ‘imperfect’ one. They were also queried about their socioeconomic characteristics, health status, risk attitudes and behaviors, and likely actions after receiving a positive test result.
The advancing field of in vitro diagnostics (IVDs) includes an increasing number of clinical laboratory tests that offer the hope of personalized screening to assess an individual’s risk of developing certain diseases based on genetic markers found in blood or tissue samples.
According to study lead author Peter J. Neumann, the growing use of predictive testing worldwide has resulted in increasing demands for evidence that demonstrates the value of such tests. Health technology assessment groups typically measure the utility of diagnostic tests in terms such as increased accuracy of test results, cost-effectiveness, or improved health outcomes for patients. But assessing the value of predictive testing may also require new or different measures.
“By taking into account all implications of these tests—including the risks, costs, potential cost offsets, and the value they have outside of medical outcomes—we can build better policies and make better decisions about coverage and reimbursement, so that we may more accurately reflect patient preferences and appropriate uses of societal resources,” says Neumann.
The study, “Willingness to Pay for Predictive Diagnostic Information with No Immediate Treatment: A Survey of U.S. Residents,” (Health Economics, published online before print, 28 December 2010: doi: 10.1002/hec.1704) was supported by a grant from the Institute for Health Technology Studies (InHealth). Coauthors of the study are Joshua T. Cohen, James K. Hammitt, Thomas W. Concannon, Hannah R. Auerbach, ChiHui Fang, and David M. Kent.