by Prasanth Perugupalli, Pramana

 

If you have attended recent digital pathology conferences, you have very likely struggled to decide which track to attend among the parallel discourses running throughout the day. Interest in the many topics surrounding digital transformation is now palpable, and the adoption curve is indisputably starting to steepen. However, the industry has struggled to put its finger on one killer application. Telepathology will probably remain the least contested enabler, since it cuts through shipping costs and saves time. Artificial intelligence (AI) algorithms and pre-screening have been gaining traction and will very likely prevail over a long horizon. Although AI so far addresses less than 1% of a general pathologist's workload, and thus cannot by itself justify large-scale investments of time, capex, and uncertain add-ons, it has been clearly established that AI-enabled digital workflows are the future of tissue- and cellular-level diagnostics.

This means that IT departments need to acclimatize themselves to a host of new data types, storage needs, and AI-laced software tools with different life cycles than anything that has been the norm for a lab in the past. Lab administrators need to adapt to modified workflow schedules to fit into a new paradigm. Visionary trailblazers, who have established the incredible value of the data in their possession in its various forms, are leading the way with digitization and curated association of patient reports, lab tests, glass slides, and treatment outcomes. Archival scanning delivers a massive boost to the development of inference algorithms, paying rich dividends akin to the indelible advantage Google gained in search or Tesla in autonomous driving.

A more tangible roadmap for every healthcare institution is a gradual approach to digital transformation. Patient referrals transferred from remote labs provide a perfect opportunity to digitize right now, since digitization directly cuts down shipment times, organizes the data in predetermined formats, and mitigates errors that often result in long delays. Patient notes and glass slides for primary diagnosis, when digitized at the remote lab and transferred over the internet, give a reference lab the window to better schedule esoteric testing and specialist time so that it can hit its turnaround-time (TAT) targets.

However, data quality is of paramount concern, and it is imperative that the needs of the specialist are fully addressed: no missing tissue or out-of-focus fields on the digital image, and preservation of pen marks, if any. If confidence is not built with the specialist at the reference lab, this business model breaks down rapidly. In-line quality assurance that is validated at the reference lab and also adopted at the referral lab has become a necessity to enable this business case; full dependence on human operators to make these decisions must be done away with (a minimal sketch of such an automated check follows below).

The no-brainer candidate that is gaining momentum across institutions is post-signout organization of cases in digital form. This allows for gradual accumulation of experience and of structured datasets across sub-specialties to create and manage digital assets, while also preparing the validation datasets that each institution will need when getting ready to test AI, whether developed in-house or by a third-party vendor, for its own consumption; a simple case-manifest sketch also follows below.
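To make the in-line quality-assurance point concrete, here is a minimal sketch of an automated focus check using the widely used variance-of-Laplacian sharpness metric. The tile size, threshold, and file name are illustrative assumptions rather than validated parameters, and a production check would also look for missing tissue and preserved pen marks.

```python
# Minimal sketch: flag out-of-focus tiles in a scanned region using the
# variance-of-Laplacian sharpness metric. Threshold and tile size are
# illustrative assumptions, not validated clinical parameters.
import cv2
import numpy as np

def tile_sharpness(tile_bgr: np.ndarray) -> float:
    """Low variance of the Laplacian suggests a blurred (out-of-focus) tile."""
    gray = cv2.cvtColor(tile_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def flag_blurry_tiles(region_bgr: np.ndarray, tile: int = 512, threshold: float = 50.0):
    """Yield (row, col) indices of tiles whose sharpness falls below threshold."""
    h, w = region_bgr.shape[:2]
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            if tile_sharpness(region_bgr[y:y + tile, x:x + tile]) < threshold:
                yield y // tile, x // tile

# Usage on a region exported as a plain image (hypothetical file name).
region = cv2.imread("scanned_region.png")
if region is not None:
    suspect = list(flag_blurry_tiles(region))
    print(f"{len(suspect)} tiles flagged for re-scan review")
```

The same tiling loop could host a tissue-coverage check, for example the fraction of non-background pixels per tile, to catch missing tissue before a case ships to the specialist.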
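And as one hedged illustration of post-signout organization, the sketch below writes a structured case manifest associating reports, slides, and outcomes under a sub-specialty label. The schema, field names, and paths are illustrative assumptions, not an institutional standard.

```python
# Minimal sketch: a post-signout case manifest linking reports, slides, and
# outcomes per sub-specialty. All field names and paths are hypothetical.
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class CaseRecord:
    case_id: str        # accession or case identifier
    subspecialty: str   # e.g. "GI", "dermatopathology"
    report_path: str    # signed-out report
    slide_paths: str    # semicolon-separated whole-slide image files
    outcome: str        # treatment outcome, if available

records = [
    CaseRecord("C-0001", "GI", "reports/C-0001.pdf",
               "slides/C-0001_A.tiff;slides/C-0001_B.tiff", "remission"),
]

with open("case_manifest.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(CaseRecord)])
    writer.writeheader()
    writer.writerows(asdict(r) for r in records)
```

A manifest of this kind doubles as the index for the validation datasets mentioned above.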
However an institution chooses to start its journey into the digital realm, future-proofing the data generated during the early days of adoption is highly recommended. It is to be expected that AI, storage, and visualization algorithms will continue to advance every year, and your data assets should not become less desirable over time. Preserving the images with lossless compression, storing the Z-stack data to mimic the fine focus of a conventional microscope, and associating quality tags with the data are low-hanging fruit that can be implemented with the least pain in the early days of digital adoption (a sketch follows at the end of this section). It would be best to promote standardization of data formats across the industry, all the way down to pixel-level data management. While there is tremendous potential and progress on the DICOM front as a standardized container, what is yet to be addressed is the data filling the DICOM folders. Until the ecosystem matures through several generations of technology, it is not unusual for confusion to prevail. A clear path to mitigating this uncertainty is to adopt modular solutions across the institution. The optical path and the imaging and panorama-generation pipeline are critical, and wherever possible it is advisable to find solutions that offer a uniform output for a high-volume reference lab as well as for its partner referral labs, which may need very small capacity.
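As a hedged sketch of that future-proofing advice, the code below stores a Z-stack with lossless compression and attaches quality tags, using the tifffile Python library. The array shape, tag names, and file name are illustrative assumptions; "zlib" is one of tifffile's lossless codecs.

```python
# Minimal sketch: archive a Z-stack losslessly with quality tags attached.
# Shape, tag names, and file name are illustrative assumptions.
import numpy as np
import tifffile

# Hypothetical Z-stack: 5 focal planes of a 1024x1024 RGB field.
zstack = np.random.randint(0, 256, size=(5, 1024, 1024, 3), dtype=np.uint8)

quality_tags = {
    "axes": "ZYXS",             # Z planes, height, width, color samples
    "focus_score_mean": 182.4,  # assumed output of an in-line QC step
    "scanner_id": "SCANNER-01", # assumed provenance tag
}

tifffile.imwrite(
    "case_C-0001_field_001.tif",
    zstack,
    compression="zlib",     # lossless; pixels survive future re-encoding
    metadata=quality_tags,  # stored as JSON in the image description
)
```

Because the compression is lossless, the same pixels can later be re-packaged into whatever container the ecosystem settles on, DICOM included, without generational loss.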

 

 

Disclaimer: In seeking to foster discourse on a wide array of ideas, the Digital Pathology Association believes that it is important to share a range of prominent industry viewpoints. This article does not necessarily express the viewpoints of the DPA; however, we view it as a valuable perspective with which to facilitate discussion.