Hongxiang Hu, PhD: No financial relationships to disclose
Despite significant advances in machine learning within quantitative pharmacology, there remains a critical need for comprehensive evaluation of natural language processing (NLP) algorithms in longitudinal pharmacokinetic/pharmacodynamic (PK/PD) modeling. In this study, we explore the application of the transformer model and compare its performance against other NLP techniques, including long short-term memory (LSTM) networks and neural ordinary differential equations (neural-ODEs), in analyzing virtual PK/PD data derived from three different treatment regimens. Our findings suggest that both LSTM and neural-ODE, along with their respective variants, perform strongly when predicting outcomes for training-included (seen) regimens, though with some loss of accuracy when applied to training-excluded (unseen) regimens. Like neural-ODEs, the transformer model shows promise in describing time-series PK/PD data. When extrapolating to unseen regimens, however, it captures the general trends but struggles to reproduce the finer fluctuations in the data. Notably, incorporating even a small amount of data from unseen regimens into the training set markedly improves predictive performance for both seen and unseen regimens. Our study marks a pioneering effort in deploying the transformer model for time-series PK/PD analysis and provides a systematic exploration of various NLP techniques for this application.
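The abstract does not specify how the virtual PK/PD data were generated. As a purely illustrative sketch, the snippet below shows one common way such profiles could be simulated: a one-compartment model with first-order absorption, evaluated under three hypothetical dosing regimens via superposition. All parameter values (`ka`, `ke`, `V`) and regimen definitions here are assumptions for illustration, not the study's actual simulation settings.

```python
import numpy as np

def simulate_pk(doses, times, ka=1.0, ke=0.1, V=10.0):
    """Concentration-time profile for a one-compartment model with
    first-order absorption, summed over multiple doses (superposition).

    doses: list of (dose_time, dose_amount) pairs
    times: 1-D array of observation times (same time unit as rate constants)
    ka, ke, V: hypothetical absorption rate, elimination rate, and volume
    """
    conc = np.zeros_like(times, dtype=float)
    for t0, amt in doses:
        dt = times - t0
        mask = dt >= 0  # each dose contributes only after it is given
        conc[mask] += (amt * ka / (V * (ka - ke))) * (
            np.exp(-ke * dt[mask]) - np.exp(-ka * dt[mask])
        )
    return conc

times = np.linspace(0, 48, 97)  # 0-48 h, 0.5 h grid

# Three hypothetical regimens (names and amounts are illustrative only)
regimens = {
    "QD_100": [(0.0, 100.0), (24.0, 100.0)],
    "BID_50": [(t, 50.0) for t in (0.0, 12.0, 24.0, 36.0)],
    "single_200": [(0.0, 200.0)],
}
profiles = {name: simulate_pk(d, times) for name, d in regimens.items()}
```

Profiles simulated this way, one set of regimens held out, would give the kind of seen/unseen split the study describes.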