
Tokenization: Leveraging Patient Health Data to Improve Care in a Meaningful, Sustainable Way

Tokenization expands coverage and understanding of the patient journey by enabling the linking of real-world participant data. Our experts discuss the value of tokenization and practical applications in clinical and real-world research.

Leveraging health care data for real-world research has historically been challenging due to patient privacy and data integration concerns.

The variety and volume of available health care data continue to grow, along with newer applications of real-world data (RWD) and real-world evidence (RWE) that speed patient access to new therapies. Tokenization has emerged as a valuable solution for accurately linking real-world patient data without compromising patient confidentiality.

Tokenization of health care data is a process by which patient identifiers are pseudonymized through the generation of a patient-specific encrypted "token." By tokenizing participants in a clinical trial, researchers gain the ability to link RWD from trial participants' medical records to their clinical trial data.
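To make the idea concrete, here is a minimal Python sketch of one plausible approach, in which a token is derived as a keyed hash (HMAC-SHA256) of normalized identifiers. The identifier fields, normalization rules and SECRET_KEY are illustrative assumptions; commercial tokenization vendors use their own proprietary schemes and key management.

```python
import hashlib
import hmac
import unicodedata

# Hypothetical secret; real tokenization vendors manage keys so that
# no single party can regenerate or reverse tokens on its own.
SECRET_KEY = b"replace-with-vendor-managed-secret"

def normalize(value: str) -> str:
    """Normalize an identifier so trivial variations map to the same token."""
    value = unicodedata.normalize("NFKD", value)
    return "".join(ch for ch in value if ch.isalnum()).lower()

def create_token(first_name: str, last_name: str, dob: str, sex: str) -> str:
    """Derive a deterministic, irreversible token from patient identifiers.

    The same patient yields the same token in every dataset, enabling
    linkage, while the raw identifiers never need to leave the site.
    """
    message = "|".join(normalize(v) for v in (first_name, last_name, dob, sex))
    return hmac.new(SECRET_KEY, message.encode("utf-8"), hashlib.sha256).hexdigest()

# The same patient, entered slightly differently, produces the same token.
assert create_token("María", "Gonzalez", "1980-02-14", "F") == \
       create_token("MARIA", "GONZALEZ", "1980-02-14", "F")
```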

Here are five things you need to know about tokenization.

1. Although not new, tokenization is rising to the forefront as the industry continues to realize the value of RWE and its impact on health care delivery.

The health care industry and regulatory bodies are becoming more accepting of RWD and its ability to provide a greater understanding of each patient's unique circumstances over time. Tokenization allows researchers to link disparate data sources – both in and out of the health care setting – while maintaining patient privacy, giving them a more complete view of the patient and a better ability to answer more research questions.

2. The process of creating a token is straightforward, but several factors must be considered when generating tokens and using them to build a combined dataset.

  • Study startup: In this stage, it’s critical to define research questions, determine what types of data will be collected and tokenized, evaluate the ability of tokenization and data vendors to deliver on your studies, and define the operational model. Part of the operational model includes selection of the tokenization vendor and determining what technology and data sources are required.
  • Consent development: Informed consent is the most important step in the tokenization process. Consent for tokenization can be collected at any time, but the token cannot be created until consent is provided (see the sketch after this list). The consent form should include clear and appropriate information to ensure patients understand the importance of the study, the tokenization process, how to withdraw consent and the types of personally identifiable information that will be used to generate a token.
  • Site training and activation: It is essential to ensure that site personnel understand the tokenization process, how patient confidentiality is maintained and ways that tokens are used so staff can effectively answer high-level questions regarding tokenization and linking of health care data.
  • Consent and withdrawal: Institutional review boards require the management of long-term participant consent, and a pathway must be available for patients to withdraw their consent at any time during the study. To reduce training requirements at each site, it is recommended that a centralized operations team manage tokenization, consent and withdrawal.
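The bullet points above describe an operational gate: no consent, no token, and withdrawal must remain possible at any time. The simplified Python sketch below illustrates that lifecycle; the Participant fields and ConsentRegistry class are hypothetical, and it reuses the create_token helper from the earlier sketch.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, Optional

@dataclass
class Participant:
    """Consent state for one study participant (illustrative fields)."""
    participant_id: str
    consented_at: Optional[datetime] = None
    withdrawn_at: Optional[datetime] = None
    token: Optional[str] = None

class ConsentRegistry:
    """Hypothetical workflow for a centralized operations team managing
    consent, tokenization and withdrawal."""

    def record_consent(self, p: Participant) -> None:
        p.consented_at = datetime.now(timezone.utc)

    def tokenize(self, p: Participant, identifiers: Dict[str, str]) -> str:
        # The token cannot be created until consent is on file,
        # and never after the participant has withdrawn.
        if p.consented_at is None or p.withdrawn_at is not None:
            raise PermissionError("Active consent is required before tokenization.")
        p.token = create_token(**identifiers)  # from the earlier sketch
        return p.token

    def withdraw(self, p: Participant) -> None:
        # A withdrawal pathway must exist at any point in the study;
        # downstream linkage jobs should exclude withdrawn tokens.
        p.withdrawn_at = datetime.now(timezone.utc)
```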

3. A key application of tokens is linking clinical study data and real-world data.

To optimize the utility of these diverse data, it's recommended to use a centralized, cloud-based data lake that hosts data from various RWD sources. Once in the data lake, tokenized data can be linked using specialized linkage software, subject to the applicable data use agreements, to create longitudinal patient records. This enables deeper and more meaningful real-world insights on clinical study data.
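As a rough illustration of the linkage step, the pandas sketch below joins a tokenized clinical trial extract to tokenized claims data. The tokens, column names and values are made up, and real pipelines would also enforce the terms of each data use agreement before linking.

```python
import pandas as pd

# Illustrative tokenized extracts; in practice these would be tables
# in the cloud data lake, each keyed by the same privacy-preserving token.
trial_data = pd.DataFrame({
    "token": ["a1f9", "b2e7"],
    "arm": ["treatment", "placebo"],
    "baseline_score": [42, 51],
})
claims_data = pd.DataFrame({
    "token": ["a1f9", "b2e7", "c3d8"],
    "claim_date": ["2023-01-10", "2023-03-22", "2023-02-05"],
    "diagnosis_code": ["E11.9", "I10", "J45.909"],
})

# Join on the token to build a longitudinal view of each trial participant.
longitudinal = trial_data.merge(claims_data, on="token", how="left")
print(longitudinal)
```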

4. Tokenization can accelerate innovation in clinical trials and drug development.

Following tokenized study participants in the real world and in real time, across a broad data ecosystem, extends researchers' ability to answer more questions through continuous evidence generation, spanning a variety of use cases and therapy areas throughout the development life cycle.

5. If you don’t tokenize today, you may miss an opportunity to do so tomorrow.

While studies can be tokenized at any stage of clinical development, across all major therapeutic areas, and in studies of all shapes and sizes, implementing tokenization at the beginning of a study unlocks the full potential of these connections and can yield future cost and time savings. In addition, existing real-world data can be leveraged in place of a prospective data collection study, avoiding costly site setup for data collection and reducing the patient burden and cost of long follow-up studies.

Learn more about ways to leverage patient health data through tokenization. Watch our recent webinar, Patient Tokenization: Practical Applications in Clinical and Real-World Research, where our experts discuss why, when and how patients are tokenized, the new methods being used and how to use tokenization in practical applications.

Want to get started on your tokenization journey?
