Podium (General) 11

Inter-rater and Intra-rater Agreement of Radiographic Interpretations of Adult Hip Fractures Using Radiographic Plates, a Mobile Phone, and a Computer: A Concordance Study

Tristan Marvin T. Tan, MD, TMC

Background: Recent years have seen a shift in how information is stored, accessed, and transmitted, brought about by the abundance of mobile phone applications (“apps”). Mobile phones themselves continue to evolve, with newer models featuring better cameras and higher-resolution screens. The use of the messaging application Viber in orthopedics has changed the referral workflow, making it possible to refer cases even from remote areas, supplemented by gross and radiographic images, and helping orthopedic practice become more efficient and expedient in decision-making and treatment implementation.


Methods: Twelve hip fracture cases were selected and rated by an AO trauma specialist using radiographic plates; these ratings served as the gold standard for diagnosis and as the control values in this study. Twelve orthopedic surgeons from our institution participated. Each rated the hip fracture cases using radiographic plates, a mobile phone, and a laptop, at three-week intervals. For the mobile phone arm, images were photographed from the PACS console with the phone camera, sent to an identical phone, and rated through the Viber application. The images rated on the laptop were acquired from the PACS and saved in PowerPoint format. The AO classification was used to diagnose the hip fracture cases. Concordance analysis using Cohen’s kappa coefficient was performed to determine the inter-rater and intra-rater agreement of the participants’ ratings from the mobile phone and the laptop against the radiographic plates.
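As a rough illustration of the concordance analysis described above (not the authors’ actual analysis code), Cohen’s kappa between the AO classifications assigned to the same cases on two media could be computed as in the sketch below; the AO labels shown are hypothetical examples, not study data.

```python
# Minimal sketch: Cohen's kappa between AO classifications assigned to the
# same 12 hip fracture cases on two media. Labels are hypothetical examples.
from sklearn.metrics import cohen_kappa_score

# Ratings of the same cases from radiographic plates (reference medium)
# and from the mobile phone (comparison medium), as AO class strings.
plate_ratings = ["31-A1", "31-A2", "31-B1", "31-B2", "31-A1", "31-C1",
                 "31-A3", "31-B3", "31-A2", "31-B1", "31-A1", "31-C2"]
phone_ratings = ["31-A1", "31-A3", "31-B1", "31-B2", "31-A2", "31-C1",
                 "31-A3", "31-B3", "31-A2", "31-B2", "31-A1", "31-C2"]

# Unweighted Cohen's kappa: chance-corrected agreement between the two media.
kappa = cohen_kappa_score(plate_ratings, phone_ratings)
print(f"Cohen's kappa: {kappa:.3f}")
```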


Results: There was fair agreement for the mobile phone and moderate agreement for the laptop compared with the radiographic plates, with calculated Cohen’s kappa coefficients of 0.3234 and 0.4492, respectively. Overall intra-rater agreement for all participants across all media had a kappa coefficient of 0.519. Intra-rater agreement for the mobile phone was moderate, with a Cohen’s kappa of 0.597, while that of the laptop was 1.0, indicating very good agreement.
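The verbal categories above (fair, moderate, very good) are consistent with the Altman (1991) interpretation bands for kappa, which is assumed here since the abstract does not state which scale was used; a small sketch of that mapping is shown below.

```python
# Hedged sketch: mapping a kappa value to the Altman (1991) verbal bands,
# which appear consistent with the fair/moderate/very good labels used in
# this study (an assumption, not stated in the abstract).
def agreement_label(kappa: float) -> str:
    if kappa < 0.20:
        return "poor"
    elif kappa <= 0.40:
        return "fair"
    elif kappa <= 0.60:
        return "moderate"
    elif kappa <= 0.80:
        return "good"
    return "very good"

for k in (0.3234, 0.4492, 0.519, 0.597, 1.0):
    print(f"kappa = {k}: {agreement_label(k)} agreement")
```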


Conclusion: This study showed moderate inter-rater agreement in diagnosing hip fractures when comparing the laptop, and fair agreement when comparing the mobile phone, against the radiographic plates. The use of a laptop is recommended for the diagnosis of hip fractures because it produced more reproducible and consistent ratings than the other media.

Keywords: Viber, Teleradiology, AO Trauma, Concordance analysis, hip fractures