
AI scans the tongue to diagnose diseases and illnesses with 96 percent accuracy

Researchers have created a novel artificial intelligence model that can accurately diagnose a number of diseases by looking at a patient’s tongue. Although extremely sophisticated, this state-of-the-art technology builds on traditional medical practices that have used tongue analysis for more than two millennia.

The tongue has long been used as a diagnostic instrument in traditional Chinese medicine and other holistic therapies. Its shape, color, and texture can provide important information about a variety of medical conditions, from diabetes and cancer to digestive and respiratory issues. Building on this age-old knowledge, scientists have recently used machine learning to develop an artificial intelligence system that can give physicians an additional diagnostic viewpoint.

Engineers from Iraq’s Middle Technical University (MTU) and the University of South Australia (UniSA) collaborated on the research, which was published in the journal Technologies. The study found that the human tongue has unique properties linked to the internal organs, making it possible to identify and track illnesses. Among these properties, tongue color stands out as particularly important.

A scientist demonstrates how a camera captures images of the tongue and analyzes them for signs of illness. (Middle Technical University)

Lead researcher and adjunct associate professor Ali Al-Naji of UniSA offered several examples: people with diabetes frequently have a yellowish tongue, while people with cancer may have a thick, greasy purple coating on theirs. Acute stroke patients usually have a red, atypically shaped tongue. A white tongue can indicate anemia, and an indigo or violet tongue can point to gastrointestinal, vascular, or respiratory problems. Recent studies have also associated deep red tongues with severe COVID-19 cases.

The researchers used a two-phase training process to create their AI model. First, 5,260 images covering a range of tongue colors and lighting conditions were fed into the system; within this dataset, 310 photos of healthy tongues were labeled “red” and 300 photos of sick tongues were classified as “gray.” The model was then refined with real-time data from two hospitals in Iraq, comprising sixty pictures of both healthy and sick tongues.
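The study’s actual training code and feature set are not reproduced in this article, but the general idea, classifying tongue images by their dominant color, can be sketched in a few lines. The folder layout, the mean-HSV color feature, and the k-nearest-neighbors classifier below are illustrative assumptions, not the researchers’ published method:

```python
# Illustrative sketch only -- not the researchers' published pipeline.
# Assumptions: tongue images are stored in per-label folders (e.g. data/red,
# data/gray), and a simple mean-HSV colour feature with a k-NN classifier
# stands in for whatever features and model the study actually used.
from pathlib import Path

import cv2
import numpy as np
from joblib import dump
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier


def mean_hsv_feature(image_path: Path) -> np.ndarray:
    """Average hue/saturation/value of an image as a 3-element feature vector."""
    bgr = cv2.imread(str(image_path))
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    return hsv.reshape(-1, 3).mean(axis=0)


def load_dataset(root: Path):
    features, labels = [], []
    for label_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        for img in label_dir.glob("*.jpg"):
            features.append(mean_hsv_feature(img))
            labels.append(label_dir.name)  # folder name is the tongue-colour label
    return np.array(features), np.array(labels)


if __name__ == "__main__":
    X, y = load_dataset(Path("data"))  # hypothetical dataset layout
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0, stratify=y
    )
    model = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
    print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
    dump(model, "tongue_model.joblib")  # reused by the live-test sketch below
```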

The algorithm’s performance was then assessed in live testing. Participants with a range of health conditions placed their tongues 20 centimeters from a webcam connected to the AI system, and the results were striking.
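The article does not specify the software used in the live test, so the snippet below is only a sketch of that step: it grabs a single webcam frame, computes the same mean-color feature as the training sketch above, and prints the predicted class. The camera index, the saved-model filename, and the single-frame capture are all assumptions:

```python
# Illustrative sketch of the live-test step: grab one webcam frame, compute the
# same mean-HSV feature as the training sketch, and print the predicted class.
# The camera index, the model filename, and single-frame capture are assumptions.
import cv2
from joblib import load

model = load("tongue_model.joblib")  # classifier saved by the training sketch

cap = cv2.VideoCapture(0)  # default webcam, positioned ~20 cm from the tongue
ok, frame = cap.read()
cap.release()

if ok:
    feature = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV).reshape(-1, 3).mean(axis=0)
    print("predicted tongue-colour class:", model.predict(feature.reshape(1, -1))[0])
else:
    print("could not read a frame from the webcam")
```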

The researchers reported an accuracy rate of more than 98 percent in identifying illnesses linked to observable changes in tongue color; on the set of sixty tongue images, the system achieved 96.6 percent accuracy.

This innovative study highlights the possibility of incorporating comparable artificial intelligence systems into healthcare facilities. Such technology, which provides a safe, effective, practical, and affordable diagnostic tool, has the potential to completely transform the disease screening process.

Kyle James Lee

