By Katie Elyce Jones, PillarQ
In October 2022, the U.S. Food and Drug Administration (FDA) updated its list of Artificial Intelligence and Machine Learning (AI/ML)-Enabled Medical Devices. These devices have been reviewed and authorized by the FDA to be legally marketed (a step known as premarket clearance). Not surprisingly, the list has grown quickly: it now includes 521 AI/ML-enabled devices cleared since 1995, the vast majority of them since 2016.
This article is a snapshot of the laboratory research goals and challenges behind just one AI/ML-enabled device that was recently cleared by the FDA. The device is called QmTRIAGE, a breast cancer screening software by MedCognetics, a Dallas-based company that is the brainchild of researchers from the University of Texas at Dallas and the UT Southwestern Medical Center (UTSW). UTSW licensed the intellectual property to MedCognetics, and both institutions hold equity in the company as well.
The AI research started at the UT Dallas Quality of Life Technology Laboratory, which aims to “develop innovative methods, and to design intelligent systems for personalized health care, vital signs monitoring and disease prevention.” The lab consists of faculty and students, mostly doctoral students in electrical and computer engineering.
“The goal is to build technologies that will enhance the quality of human life,” said Lakshman Tamil, Professor of Electrical and Computer Engineering at UT Dallas, Director of the Quality of Life Technology Lab, and Co-founder of MedCognetics. “We’ve built a lot of technology for heart wellness, lung wellness, cancer detection, and sleep wellness.”
The lab’s AI imaging technology for breast cancer screening is trained on hundreds of thousands of mammogram images, spanning both 2D and 3D screening exams and both cancer and non-cancer diagnoses. The UT Dallas lab receives this deidentified data (meaning the images are not connected to patients’ identities, to protect privacy) from UTSW.
“In the mammogram, we look at features around the cancer or non-cancer, which are very different in terms of variation and contour of image. We are analyzing thousands of features, whereas the human eye can do only a few features at a time,” Tamil said.
“AI has the ability to peer through things that the human eye cannot.”
Tamil outlined a few goals and challenges their research team considered when developing their AI technology.

Dr. Lakshman Tamil, left, Professor of Electrical and Computer Engineering and Director of the Quality of Life Technology Laboratory at UT Dallas, with his former graduate students Dr. Tim Cogan BS’15, PhD’21 and Maribeth Cogan BS’15 MS’18. The three researchers demonstrated an AI technique for accurately detecting breast cancer, which led to the formation of MedCognetics, Inc. Tamil and Tim Cogan are co-founders of the company. Image courtesy of UT Dallas.
Goals
The team wanted their AI technology to reduce training bias, increase robustness for better detecting early manifestations of cancer, and help improve the productivity of the radiologist using the software.
1. Avoid bias. “Bias in AI is a serious issue,” Tamil said. For example, if an AI model is trained on data (in this case, mammogram images) from mostly white patients, it may not perform as well on data from diverse patient groups. “We have built techniques that will remove the bias,” he said, explaining this is a patent-pending deep learning technique. (A generic sketch of one common debiasing approach appears first at the end of this section.)
2. Increase robustness. Detecting early cancer features is a training challenge because there are fewer labeled images of early cancers. “Early manifestation means very small and feeble. It’s very difficult for the naked eye to discern that,” Tamil said. “We have built a technology that can look at these feeble images and still make sense of it.” (The second sketch at the end of this section illustrates one generic way to train on such faint examples.)
3. Improve radiologist productivity. Like many AI technologies, AI-enabled breast cancer screening isn’t replacing human experts. The MedCognetics press release cites an analysis, also reported online by the journal Applied Radiology, that finds burnout among about 50 percent of radiologists worldwide. “It [reviewing radiology images] is a routine, mundane thing, looking again and again,” he said. “It contributes to burnout and increases mistakes and errors.”
In addition to helping radiologists review images in less time, Tamil said AI technology can also provide some level of radiology service where little to none is available: “In underdeveloped countries, the availability of radiologists is rare, and they are overwhelmed. AI can be a great boon to these countries.”
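MedCognetics’ patent-pending debiasing technique has not been published, so the first sketch below is only a generic illustration of one common mitigation strategy: reweighting training samples so that under-represented demographic groups contribute equally to the loss. The group labels and data here are hypothetical placeholders, not the company’s method or data.

import numpy as np

def group_balanced_weights(groups):
    """Per-sample weights inversely proportional to group frequency, so
    under-represented demographic groups contribute equally to the loss."""
    groups = np.asarray(groups)
    uniq, counts = np.unique(groups, return_counts=True)
    freq = dict(zip(uniq, counts / len(groups)))
    return np.array([1.0 / (len(uniq) * freq[g]) for g in groups])

# Toy training set that is 80 percent group "A" and 20 percent group "B".
weights = group_balanced_weights(["A"] * 8 + ["B"] * 2)
print(weights)  # each "B" sample gets 4x the weight of each "A" sample

Weights like these can be passed to most training APIs (for example, the sample_weight argument accepted by many scikit-learn estimators) so that the loss no longer favors the majority group.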
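The lab’s method for reading faint early manifestations is likewise not described in detail. As a stand-in, this second sketch shows a standard alternative: augmenting scarce labeled examples with flips and contrast squeezes so the model sees many faint, low-contrast variants of each finding. The 64x64 array is a synthetic placeholder for a mammogram patch.

import numpy as np

rng = np.random.default_rng(0)

def augment(patch, rng):
    """Make a training variant of a grayscale patch: a random horizontal
    flip plus a contrast squeeze that mimics faint, low-contrast findings."""
    out = patch.copy()
    if rng.random() < 0.5:
        out = np.fliplr(out)
    contrast = rng.uniform(0.3, 1.0)    # 0.3 = very "feeble" contrast
    return out.mean() + contrast * (out - out.mean())

patch = rng.normal(loc=0.5, scale=0.1, size=(64, 64))  # synthetic patch
variants = [augment(patch, rng) for _ in range(8)]     # 8 extra examples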
Challenges
“When you build any system, it should be unbiased and then it should be robust. And then you see that one of the biggest problems is annotating these images,” Tamil said.
4. Enable affordable annotation. Using traditional methods, this step of data curation can be costly and time consuming. “Through hundreds of thousands of images, someone has to sit down and say, ‘Here is the cancer markup,’” he said. “If you hire a radiologist to do this at $50 for an image, that’s the cost of data curation.” The UT Dallas lab used AI techniques like semi-supervised learning, or “expert-in-the-loop” training, in which the AI model learns by working with the radiologist: the AI annotates a small batch of images, some of its annotations come out right and some wrong, an expert radiologist cleans up the labels for the AI to learn from, and the process repeats (see the first sketch after this list).
5. Protect patient privacy. “When you use someone’s data, you don’t want to find out who it is,” Tamil said. “So, collecting data becomes difficult.” To tackle this data challenge, the team used federated learning, meaning they built the model where the data originates. “We don’t take that data out of the premises, we only take the model,” he said. (The second sketch below illustrates the idea.)
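The article does not specify the lab’s exact training pipeline, so the first sketch below shows only a minimal, generic expert-in-the-loop cycle: the model pre-annotates a batch, an expert corrects it, and the corrected labels are folded back into training. Synthetic feature vectors stand in for mammogram images, and the ground-truth labels stand in for the radiologist.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins: a small labeled seed set and a large unlabeled pool.
X_seed = rng.normal(size=(50, 16));  y_seed = (X_seed[:, 0] > 0).astype(int)
X_pool = rng.normal(size=(500, 16)); y_pool = (X_pool[:, 0] > 0).astype(int)

model = LogisticRegression().fit(X_seed, y_seed)

for round_ in range(3):
    # 1. The model pre-annotates a small batch drawn from the pool.
    idx = rng.choice(len(X_pool), size=40, replace=False)
    proposed = model.predict(X_pool[idx])
    # 2. The expert reviews the proposals, keeping the correct ones and
    #    fixing the wrong ones (ground truth plays the radiologist here).
    corrected = y_pool[idx]
    print(f"round {round_}: model matched the expert on "
          f"{(proposed == corrected).mean():.0%} of the batch")
    # 3. The cleaned batch is folded back into the training set.
    X_seed = np.vstack([X_seed, X_pool[idx]])
    y_seed = np.concatenate([y_seed, corrected])
    model = LogisticRegression().fit(X_seed, y_seed)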
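Federated learning comes in many variants, and the article does not say which one the team used. This second sketch shows the classic federated-averaging idea under stated assumptions: three hypothetical hospital sites each run a local update on private synthetic data, and only the model parameters (here, a toy logistic-regression weight vector rather than a deep network) travel to be averaged, weighted by each site’s dataset size.

import numpy as np

def local_update(w_global, X, y, lr=0.1, epochs=5):
    """One site's training pass; the data never leaves the premises.
    A toy logistic-regression update stands in for deep-network training."""
    w = w_global.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)   # one gradient step
    return w

rng = np.random.default_rng(0)
# Three hypothetical hospitals with private, synthetic data of varied size.
sites = []
for n in (100, 250, 50):
    X = rng.normal(size=(n, 8))
    sites.append((X, (X[:, 0] > 0).astype(float)))

w_global = np.zeros(8)
for _ in range(10):
    # Each site trains locally; only model weights travel to the server.
    local = [local_update(w_global, X, y) for X, y in sites]
    sizes = [len(y) for _, y in sites]
    # Federated averaging: combine site models weighted by dataset size.
    w_global = np.average(local, axis=0, weights=sizes)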
The next step for the team is expanding the underlying AI technology to other types of cancer screening, such as lung and prostate screenings. While the technology is similar, the model cannot simply be retrained for other types of cancer; new neural networks (that is, new deep learning algorithms) will need to be built.
Research References
T. Cogan and L. Tamil, “Deep Understanding of Breast Density Classification,” 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 2020, pp. 1140-1143, doi: 10.1109/EMBC44109.2020.9176628.
T. Cogan, M. Cogan, and L. Tamil, “RAMS: Remote and Automatic Mammogram Screening,” Computers in Biology and Medicine, 107 (April 2019): 18-29. https://doi.org/10.1016/j.compbiomed.2019.01.024.
Want to read more about university AI research? Check out this PillarQ&A with Lynne Parker, former Director of the National AI Initiative Office who is leading the AI Tennessee Initiative at the University of Tennessee.