Abstract 4613

Aim

PAINReportIt® is a computer-based, self-report pain assessment tool designed to facilitate communication between patients and medical care providers. Originally developed and validated for reporting cancer pain, its application to sickle cell pain is novel. One component of the program is a body outline on which patients indicate specific pain locations by drawing on anterior and posterior body images. However, because of the systemic nature of sickle cell disease (SCD), affected individuals may experience a wider range in the number and distribution of painful sites than individual cancer patients. Therefore, the algorithms within the program, which convert a graphic pain representation into a numerical description of the number of painful sites and their anatomic distribution, must account for this variability. The purpose of this study was to assess the reliability and accuracy of the computer-generated data by comparing it with human-generated data on painful sites, and to identify factors that may affect the accuracy of the computer-generated data.
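To make the conversion concrete, the sketch below shows one way drawn marks on a body outline could be mapped to named body segments. It is a minimal illustration only: the rectangular regions, coordinates, and function name are hypothetical assumptions for this example and are not PAINReportIt®'s actual algorithm or data.

```python
# Hypothetical sketch: mapping a drawn point on the anterior body outline to
# body segments. Region coordinates are invented placeholders; posterior
# segments (upper back, lower back) would be defined on the posterior outline
# in the same way.

SEGMENTS = {
    # name: (x_min, y_min, x_max, y_max) in arbitrary outline units
    "head":      (40, 0, 60, 15),
    "chest":     (35, 15, 65, 35),
    "abdomen":   (35, 35, 65, 55),
    "right_arm": (10, 15, 35, 60),
    "left_arm":  (65, 15, 90, 60),
    "right_leg": (35, 55, 50, 100),
    "left_leg":  (50, 55, 65, 100),
}

def segments_for_mark(x, y):
    """Return all segments whose region contains the drawn point (x, y)."""
    return [name for name, (x0, y0, x1, y1) in SEGMENTS.items()
            if x0 <= x <= x1 and y0 <= y <= y1]
```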

Methods

Individual PAINReportIt® body outlines completed by 49 adolescent SCD patients aged 14 to 27 years (mean = 18 ± 2.6) were analyzed by two researchers according to a specific set of guidelines. The researchers were blinded to the computer-generated data as they coded the drawings following guidelines that defined 1) the method for counting the number of pain sites in each drawing, 2) an ordering hierarchy for assigning a site number to each site, and 3) whether or not a particular body segment was included in each site. The 9 body segments were the head, chest, right and left arms, abdomen, right and left legs, upper back, and lower back. A third researcher and consensus discussion resolved coding discrepancies between the two researchers. Descriptive statistics and t-tests were used to compare the researcher-generated and computer-generated results.
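As a rough illustration of this comparison, and not the study's actual analysis code, a paired t-test on per-patient site counts could be run as follows; the variable names and data values are invented for the example.

```python
# Illustrative comparison of computer-generated vs. researcher-coded painful
# site counts for the same patients, using a paired t-test. Data are
# placeholders, not study data.
from scipy import stats

computer_counts   = [4, 2, 7, 1, 5, 3]   # sites per patient, computer algorithm
researcher_counts = [4, 2, 6, 1, 5, 3]   # sites per patient, researcher coding

t_stat, p_value = stats.ttest_rel(computer_counts, researcher_counts)
print(f"paired t = {t_stat:.2f}, P = {p_value:.3f}")
```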

Results

There was strong agreement between the researcher-generated and computer-generated data in the interpretation of site number and location. For example, patients reported a mean of 3.9 (± 3.3) and 3.8 (± 3.5) painful sites according to the computer program and the researcher, respectively. There was also strong agreement in site assignments for 8 of the 9 body segments. For example, the fraction of painful sites that included the head was 0.2 ± 0.4 by the computer and 0.3 ± 0.5 by the researcher. However, a significant difference between the computer and the researcher was found for inclusion of the lower back segment: 0.1 ± 0.2 and 0.5 ± 0.5 of painful sites, respectively (P < 0.001).
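For clarity, the per-segment metric reported above can be read as the fraction of a patient's painful sites that touch a given segment, computed separately from each coder's site assignments. The sketch below illustrates that calculation with invented data in which the two coders disagree about lower back involvement; it is an assumption-laden example, not the study's code.

```python
# Illustrative computation of the "fraction of painful sites that include a
# segment" metric. Each site is represented as the set of segments it touches;
# data are invented placeholders showing a lower-back coding discrepancy.

def fraction_including(sites, segment):
    """Fraction of a patient's painful sites that include the given segment."""
    if not sites:
        return 0.0
    return sum(segment in s for s in sites) / len(sites)

computer_sites   = [{"upper_back"}, {"chest", "abdomen"}, {"right_leg"}]
researcher_sites = [{"upper_back", "lower_back"}, {"chest", "abdomen"}, {"right_leg"}]

print(fraction_including(computer_sites, "lower_back"))    # 0.0
print(fraction_including(researcher_sites, "lower_back"))  # 0.33
```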

Conclusion

Findings suggest that the algorithms used to interpret patient drawings in the PAINReportIt® computer program generally account for patients' intentions. The minor discrepancies may reflect ambiguities in the guidelines used by the researcher to identify involvement of this particular body segment, or an idiosyncrasy in the computer algorithm itself. Analysis of painful-site data by multiple independent observers will likely resolve this discrepancy. Findings also indicate that there is still room for adjustment so that differences between the computer program's algorithms and the researcher's guidelines can be reconciled to accurately reflect the patient's report of pain location.

Disclosures:

No relevant conflicts of interest to declare.

