In a significant advancement at the crossroads of artificial intelligence and healthcare, Tobi Titus Oyekanmi, a computer scientist at New Mexico Highlands University, has introduced a deep learning model poised to transform brain cancer diagnosis, particularly in under-resourced areas. His research, titled “Deep Learning-Based Diagnosis of Brain Cancer Using Convolutional Neural Networks on MRI Scans: A Comparative Study of Model Architectures and Tumor Classification Accuracy,” presents a new system called LightBT-CNN, which has demonstrated a remarkable 98% accuracy in analyzing MRI brain scans.
The findings were published in the American Academic Scientific Research Journal for Engineering, Technology, and Sciences (ASRJETS), placing Oyekanmi at the forefront of the rapidly evolving field of explainable artificial intelligence (XAI) in medical imaging. Brain cancer is recognized as one of the most lethal forms of cancer worldwide, with conventional diagnosis often reliant on the manual interpretation of MRI scans—a method that can be time-consuming and susceptible to human error.
Oyekanmi expressed his hopes for the technology, stating, “AI can help level the playing field. The goal was to build a lightweight yet powerful neural network that can analyze brain MRI scans with accuracy comparable to expert radiologists but without the need for expensive infrastructure.” To realize this vision, he led a diverse team, including Peter Adigun, Nelson Azeez, and Ayodeji Adeniyi, to develop the LightBT-CNN. This convolutional neural network was trained on more than 7,000 MRI images to classify scans into four categories: glioma, meningioma, pituitary tumor, and healthy brain.
In contrast to large deep learning architectures like VGG16 or ResNet50, which necessitate high-performance GPUs, Oyekanmi's LightBT-CNN contains only 3.6 million trainable parameters, making it compact and cost-effective, thus ideal for healthcare facilities in developing nations. Built using Python and TensorFlow, the model achieved precision and recall rates exceeding 95% across all tumor classifications.
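The paper's exact layer configuration is not reproduced here, but the general shape of such a system is familiar: a few convolutional blocks feeding a four-way softmax. As a rough illustration only, with hypothetical layer sizes and input dimensions (not the published LightBT-CNN design), a compact TensorFlow/Keras classifier of this kind might be sketched as:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_light_cnn(input_shape=(224, 224, 1), num_classes=4):
    """Illustrative compact CNN for 4-class MRI classification.
    Layer sizes are hypothetical, not the paper's LightBT-CNN."""
    model = models.Sequential([
        layers.Input(shape=input_shape),           # single-channel MRI slice
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.GlobalAveragePooling2D(),           # keeps the head small
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        # glioma, meningioma, pituitary, healthy
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_light_cnn()
```

The design choice the article emphasizes is parameter economy: global average pooling instead of large fully connected layers is one standard way to keep a classifier in the low millions of parameters (or below) so it can run without high-end GPUs.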
A distinctive feature of LightBT-CNN is its interpretability. Utilizing Gradient-weighted Class Activation Mapping (Grad-CAM), the system visually identifies the brain regions that influence its predictions, providing clinicians with transparent insights into each diagnosis. Oyekanmi noted, “Trust is everything in medicine. If an AI can show why it made a decision, clinicians are more likely to adopt it.”
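Grad-CAM's core computation is well documented: the gradients of the target class score with respect to the last convolutional layer's feature maps are global-average-pooled into per-channel weights, the feature maps are combined with those weights, and a ReLU keeps only the regions that support the prediction. A minimal NumPy sketch of that step (the feature maps and gradients themselves would come from the trained network):

```python
import numpy as np

def grad_cam_heatmap(feature_maps, gradients):
    """Compute a Grad-CAM heatmap.

    feature_maps: (H, W, K) activations of the last conv layer.
    gradients:    (H, W, K) gradients of the class score w.r.t.
                  those activations.
    Returns an (H, W) heatmap normalized to [0, 1].
    """
    # Per-channel weights: global-average-pool the gradients.
    alphas = gradients.mean(axis=(0, 1))                     # shape (K,)
    # Weighted sum of feature maps, then ReLU to keep
    # only positive evidence for the predicted class.
    cam = np.maximum((feature_maps * alphas).sum(axis=-1), 0.0)
    # Normalize for overlay on the MRI slice.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

The resulting heatmap is typically upsampled to the input image size and overlaid on the scan, which is what lets a clinician see which brain regions drove the model's decision.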
Although based in the United States, Oyekanmi maintains strong ties to Nigerian research networks, collaborating with physicist Nelson Abimbola Azeez from the University of Abuja. Their joint efforts focus on leveraging AI to tackle diagnostic challenges throughout Africa, addressing issues ranging from brain tumors to pneumonia detection. “This is not just about publishing papers,” Oyekanmi remarked. “It's about creating practical tools that improve patient outcomes and build local capacity in medical AI.”
Previously, Oyekanmi collaborated with Adigun and Adeniyi on AI-based X-ray interpretation for pneumonia detection, laying the groundwork for this significant brain cancer initiative. Earlier this year, he was honored with the 2025 NIPES Award for Outstanding Contribution to Research and Innovation, awarded by the National Institute of Professional Engineers and Scientists. His work was recognized among over 1,200 nominations from four countries for its originality and measurable impact, further establishing Oyekanmi's standing as a leading figure in AI-driven scientific research in Nigeria.
Oyekanmi expressed that receiving the NIPES award is deeply meaningful, stating, “It reminds me that impactful research isn't just about algorithms; it's about improving lives.” Experts have praised his research for merging academic rigor with practical application. The study compares the performance of LightBT-CNN against international benchmarks like ResNet and EfficientNet, showing comparable accuracy with significantly lower computational demands, which is crucial for healthcare systems with limited digital resources. “AI doesn't have to be complicated to be effective. Sometimes simplicity and efficiency matter more than brute-force computation,” Oyekanmi added.
Despite the success of the model, he acknowledged its limitations, noting that the dataset reflects controlled MRI conditions rather than the variability encountered in actual hospital environments. Looking forward, Oyekanmi intends to collaborate with clinical partners to validate the system using real patient data and explore multi-modal imaging that combines MRI with CT and PET scans. Additionally, he advocates for cross-institutional AI training programs in Nigerian universities to prepare the next generation of scientists for hands-on experience in medical machine learning.
Reflecting on his work, Oyekanmi stated, “This work reminds us that innovation isn't confined to big tech companies. With the right vision, collaboration, and compassion, AI can become a tool for equity, helping every patient, everywhere, get the care they deserve.”
