Artificial intelligence (AI) can be defined as a computer algorithm that exhibits cognitive properties such as the ability to learn. AI-supported image analysis already plays an important role in pathology, radiology, and dermatology, and in genomics AI is harnessed to forecast phenotypes from genotypes (1, 2). AI algorithms frequently depend on sophisticated machine learning (ML) methods, encompassing natural language processing and computer vision. AI systems can be categorized as analytical, human-inspired, or humanized based on the types of intelligence they demonstrate. Alternatively, AI can be classified as general or super AI, depending on its evolutionary stage. All these categories share a common characteristic: AI is frequently not recognized as such once it is in broad use. This occurrence, known as the AI effect, arises when observers discount the achievements of an AI system by arguing that it does not possess genuine intelligence. Arthur C. Clarke's famous observation that any sufficiently advanced technology is indistinguishable from magic holds true here: once one gains a comprehensive understanding of a technology, the enchantment surrounding it inevitably fades away (3, 4). Since the 1950s, professionals have periodically forecast the imminent arrival of comprehensive AI systems that exhibit human-like behavior across all domains of cognitive, emotional, and social intelligence. The veracity of these predictions remains to be seen. However, to gain a more comprehensive understanding of the potential of AI, we can examine it from two perspectives: the trajectory already traversed and the uncharted territory that lies ahead (5, 6). The aim of this study is to review the role of artificial intelligence in the diagnosis of microbial infections.
History of AI
The history of AI is shrouded in uncertainty, but it is believed to have originated in the 1940s with the publication of Isaac Asimov's short story, Runaround. An overview of the evolution of AI is presented in Table 1.
Table 1. Milestones in the history of artificial intelligence

| Years | Key figure(s) | History of artificial intelligence | Reference |
|---|---|---|---|
| 1942 | Isaac Asimov | The science fiction writer Isaac Asimov published his short story "Runaround" | (5) |
| 1950-1952 | Alan Turing | Construction of the first electromechanical computer; description of the "Imitation Game", a game of deceit between a man and a machine | (7) |
| 1952 | Marvin Minsky et al. | Establishment of the artificial intelligence laboratory at MIT | (5) |
| 1956 | | AI defined as the study of "intelligent agents" | (8) |
| 1956 | John McCarthy | Artificial intelligence (AI) first described by John McCarthy at Dartmouth College | (9) |
| 1956 | | AI described as the "fourth industrial revolution" | (9) |
| 1960 | Stanford University researchers | Development of the first problem-solving program, Dendral, whose purpose was to evaluate hypotheses; this line of systems was later used to identify the bacteria causing serious blood infections and to recommend appropriate antibiotic therapies | (10) |
| 1975 | Edward Shortliffe | MYCIN, the first AI expert system (also known as a knowledge-based system), provided consultation and diagnosis for antimicrobial therapy | (11) |
| 1953-1993 | Miller | PIP (Present Illness Program), for diagnosing patients with renal disease; INTERNIST-1, for internal medicine diagnosis by modeling clinician behavior; CASNET (Causal Associational Network), for glaucoma assessment and therapy; PUFF, for pulmonary function test interpretation; Miller reviewed these medical expert systems | (11) |
| 1995-2004 | Shu-Hsien Liao | Conducted a review of expert systems | (11) |
Advances in AI in medicine from its beginnings to the present
In the field of medicine, AI has emerged as a powerful and promising addition to existing analytical methods (12). While most health-related research can be supported by statisticians and bioinformatics experts, the advent of omics has generated vast amounts of data on gene polymorphisms, gene expression, metabolism, lipidomics, and proteomics, necessitating more sophisticated tools to recognize meaningful patterns within this global volume of data. The use of AI in infectious disease management has improved diagnosis and helped prevent transmission, with mathematical models proving effective in predicting the magnitude of emerging infectious disease outbreaks (13, 14). In recent years, there has been significant progress in the development of predictive models and big data sets for non-communicable diseases such as hypertension, heart disease, and diabetes; one comprehensive study gathered data from all fifty states in the United States over a span of five years (15). The emergence of life-threatening epidemics, such as Ebola and SARS-CoV-2, has instigated a wave of innovation in prediction. Various techniques, including machine learning approaches such as single-layer artificial neural networks, logistic regression, decision trees, and SVM classifiers, have been employed to develop a range of predictors that can be applied to diverse combinations of Ebola-related data (16-18). Clinical laboratories are also increasingly automated, with advanced robotics and AI-capable software being integrated into orchestrated workflow lines (19, 20).
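As a purely illustrative aside, and not drawn from the cited studies, the sketch below compares three of the classifier families mentioned above (logistic regression, decision trees, and SVMs) on a synthetic stand-in for case-level outbreak data; all features and labels are hypothetical placeholders.

```python
# Minimal sketch: comparing classifier families mentioned above on a
# synthetic outbreak-style dataset. Features and labels are hypothetical.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for case-level epidemiological features
# (e.g., age, contact count, days since symptom onset).
X, y = make_classification(n_samples=500, n_features=8, n_informative=5,
                           random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, model in models.items():
    # 5-fold cross-validated AUC gives a comparable score for each family.
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean cross-validated AUC = {scores.mean():.3f}")
```

In real outbreak work, the choice among such families is usually driven by interpretability and by how much labeled data is available, rather than by raw accuracy alone.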
The utilization of automated equipment has enabled robotic systems to perform, swiftly and efficiently, procedures that were previously carried out by laboratory assistants. The implementation of contemporary automated systems for sample storage and analysis can significantly increase testing capacity (21). An intelligent Internet of Things for disease monitoring presents a viable solution for effectively monitoring diseases and detecting their various forms. Such a plan entails a vast network of smart devices that automatically process and interpret entered data, which is then transferred to a central backend, potentially a Ministry of Health database. This system would function as an early-alert mechanism to monitor the dissemination of illnesses: once trends are identified and analyzed, it becomes easier to act to curb the rapid spread of a disease and contain it nationally and globally, and it also enables patients to identify disease at an early stage (22). To address this pressing issue, many groups are utilizing the Internet of Things to collect real-time sensory data, which was not feasible until recently; this involves monitoring individuals, medical facilities, the environment, and, in certain circumstances, even remote areas of the world (23). AI and its subset, machine learning (ML), have demonstrated immense value in analyzing data-rich sources, including macroscopic or microscopic images, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS), and whole bacterial genome sequences. The application of AI in clinical microbiology has already yielded significant benefits, providing valuable assistance to laboratory personnel in various aspects of diagnostic testing. The improvement of AI tools, the enhancement of analytical accuracy and dependability by AI software, and the integration of AI into the clinical microbiology laboratory workflow are all bound to progress. Consequently, microbiologists will increasingly depend on AI for initial screening or routine analysis of infectious disease tests, enabling them to dedicate their attention to diagnostic complexities such as intricate analyses and laboratory quality monitoring. These changes will improve the effectiveness and quality of laboratory testing in clinical microbiology, with positive outcomes for both the laboratory and the patients it serves (24, 25). Advances in microbiological technologies are rapidly transforming the ability to diagnose infections, enhance patient care, and streamline clinical workflows. These innovative tools broaden the scope, depth, and speed of the diagnostic data and tests generated for patients, and bring testing closer to the patient through rapid diagnostic approaches such as point-of-care (POC) technology. Despite the significance and potential of these novel technologies, a gap exists between clinicians, on the one hand, and certain payers and hospital management, on the other, in recognizing their clinical usefulness. Consequently, a primary obstacle for the clinical microbiology community is to communicate the value proposition of these technologies effectively in order to encourage payers and hospitals to adopt advanced microbiology testing. Specific guidance on how to define and demonstrate clinical utility would be immensely advantageous (26, 27).
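To make the MALDI-TOF MS use case concrete, here is a minimal, hypothetical sketch of how binned spectral intensities might feed a supervised classifier for species-level identification. The random forest is one reasonable choice among many, and the spectra below are random placeholders rather than real instrument output.

```python
# Illustrative sketch only: binned MALDI-TOF spectra feeding a classifier.
# The spectra here are random placeholders, not real instrument data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_spectra, n_bins = 300, 1000        # e.g., an m/z range binned into 1,000 features
X = rng.random((n_spectra, n_bins))  # stand-in peak-intensity matrix
y = rng.integers(0, 3, n_spectra)    # stand-in labels for three species

X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
# With random data this accuracy is near chance; with real spectra the same
# pipeline can separate species by their characteristic peak patterns.
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```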
Within the field of medicine, there exist two distinct categories of artificial intelligence: virtual and physical, the latter of which pertains to robotics. The virtual category comprises mathematical algorithms used for diagnosis and prognosis, imaging (for example, in osteoporosis), appointment scheduling, dosing algorithms, drug-interaction checking, and electronic health records. The physical category encompasses robotic assistance in surgical procedures, telepresence, rehabilitation, and social-assistance robots used in the care of elderly patients (28, 29). In endodontics, AI has predominantly been employed through virtual means, specifically in the detection of periapical lesions and crown and root fractures, the determination of working length, and the recognition of root canal morphology. In the 1970s, certain AI-based methodologies were introduced to elucidate diseases, interpret electrocardiography, facilitate the selection of appropriate treatment, and assist clinicians in generating hypotheses pertaining to complex diseases (30, 31). AI-based technologies are employed to collect personal health information from patients, which is then used to generate a comprehensive knowledge dataset. This dataset aids clinicians in making well-informed decisions and developing personalized treatment plans for their patients. The primary objective of AI-based healthcare technologies is to analyze the effectiveness of disease treatments, disease prevention strategies, and their respective outcomes for patients. To accomplish this, approaches such as ML, convolutional and deep neural networks, and Bayesian networks are implemented to enhance healthcare through intelligent computing systems. Recent advances in information and communication technologies have led to a substantial increase in the volume of data gathered from public health surveillance. By integrating AI-based tools with reliable disease management platforms, it is possible to establish robust analytics that empower stakeholders to respond effectively to outbreaks of infectious diseases. In the past, confirming a tuberculosis diagnosis was a laborious and time-consuming process that posed challenges for global control efforts; at present, however, an AI-based tool, the artificial immune recognition system (AIRS), has been largely successful in enabling early detection of tuberculosis (28). Similarly, AIRS has facilitated AI-assisted diagnosis in various challenging infections. A further illustration of AI in human healthcare is its efficacious application to the precise diagnosis of malaria: a straightforward automated system that bypasses intricate processing and labeling protocols, thereby helping clinicians mitigate possible errors (32, 33). AI-based technologies have been applied successfully in the epidemiology of infectious diseases, including but not limited to Kyasanur Forest disease, chikungunya, Middle East respiratory syndrome (MERS), Zika, and Ebola (34). AI algorithms capable of predicting the genome and protein ultrastructure of successive virus generations can prove to be of significant value in preparing for potential viral infections; applying such algorithms to datasets of the Newcastle disease virus in China and South Korea predicted mutant nucleotides with an accuracy rate of 70% (35). AI-based methodologies have also been applied in the diagnosis of COVID-19.
Certain AI-based tools have proved effective in grading the severity of COVID-19 through the use of radiological images such as CT scans and X-rays (36). Various machine learning algorithms, such as convolutional neural networks (CNNs), linear discriminant analysis, naïve Bayes, support vector machines, decision trees, logistic regression, and random forests, are utilized to diagnose COVID-19. These algorithms use diverse datasets to classify images into different groups, thereby assisting in determining the severity of the disease and distinguishing COVID-19 from diseases with comparable symptoms, such as other pneumonias (37, 38).
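For illustration only, the following sketch defines a small CNN of the general kind such studies employ for radiological image classification. The input shape, grayscale channel, and two-class output are assumptions, and no real imaging data are involved.

```python
# Hypothetical sketch of a small CNN for binary classification of
# radiological images (e.g., COVID-19 vs. other pneumonia). The input
# shape and class count are assumptions for illustration.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 1)),      # grayscale CT/X-ray slice
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.GlobalAveragePooling2D(),         # collapse feature maps
    tf.keras.layers.Dense(2, activation="softmax"),   # two diagnostic classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()  # training would then call model.fit() on labeled images
```

Published COVID-19 classifiers are typically much deeper and trained on curated multi-center datasets; the value of this toy version is only to show how convolution, pooling, and a dense output head fit together.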
Types of programs used in AI
AI is a dynamic and continuously evolving field of computing research that aims to develop systems capable of simulating human intelligence and performing tasks such as visual perception, decision making, speech recognition, and natural language processing (39, 40). The development of AI is driven by two key factors: the availability of electronic health record data and advances in computing power. These factors are closely tied to complex mathematical models, namely ML and neural networks (NNs) (41). The advent of deep neural network (DNN) architectures has further increased the sophistication of AI over the past decade (42).
ML, a subset of AI, distinguishes itself from expert systems by its capacity to adapt in the face of extensive data. Unlike expert systems, whose rules are manually defined using human expertise, ML strives to acquire rules autonomously, without human intervention, akin to the functioning of the human brain; this renders ML less brittle and less reliant on human experts (43) (a minimal code contrast between the two approaches is sketched at the end of this section). NNs, in turn, are computational models that employ mathematical calculations and are inspired by the workings of biological neural networks. Comprising interconnected processing units, NNs possess the ability to identify underlying relationships within vast amounts of data. A DNN, specifically, encompasses multiple layers of processing units, enabling enhanced data prediction and independent learning (42). AI has the potential to revolutionize clinical decision making by effectively managing the extensive data associated with a patient's care and medical history; however, many healthcare professionals still do not fully comprehend the advantages of AI and continue to rely solely on personal experience and treatment guidelines when making decisions (41). ML was created to address the limitations of expert systems (40). In ML, engineers design algorithms that establish their own rules from data, replacing rules manually coded by humans; this enables ML systems to learn from data and interpret unfamiliar situations. Among the various ML techniques developed, deep learning, which relies on artificial neural networks, is the best known (9). AI-driven medical devices have been developed and are currently utilized in clinical settings for a range of tasks, such as analyzing medical images, conducting omics analysis, and processing natural language for drug discovery, electronic health record information, and literature searches (44, 45). Additionally, AI is actively employed in vaccine development, the creation of new diagnostic methods, and the development of novel therapeutics by extracting crucial information from the vast amounts of data generated in ongoing COVID-19 research (46). Fully automated diagnostic pipelines and machine learning have gained a foothold in various fields of clinical medicine, including clinical microbiology. Next-generation sequencing (NGS) techniques allow insight into pathogens by analyzing millions of tiny fragments of their genomes, and even into the composition of the microbiota. Automation combined with novel technologies can transform traditional clinical microbiological tests, which often require a substantial amount of manual work. However, the impact of these advances on clinical routine, in terms of sample-to-result time, resources and management, and the interpretation of the large multimodal data resulting from these novel technologies, is still being studied (47, 48). Commercially available instruments, such as the WASPLab™ by Copan and the Kiestra TLA by Becton Dickinson, offer automated culture-based workflows that encompass tasks such as sample streaking, slide preparation, and the transfer of intermediate inoculated media; these systems also include imaging instrumentation and automated incubators (49). By utilizing such systems, the number of manual pre-analytical, analytical, and post-analytical steps typically performed in a non-automated laboratory can be significantly reduced.
Moreover, studies have demonstrated that these automated systems enhance the processing of samples and decrease the time required to obtain results (50). Furthermore, the complete automation of diagnostic procedures holds the potential for additional advantages (47).
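To make the earlier contrast between hand-coded expert systems and learning-based models concrete, the minimal sketch below pairs a manually defined rule (with purely illustrative thresholds) against a small multilayer neural network that learns its own decision boundary from data.

```python
# Contrast sketch (hypothetical thresholds): a hand-coded expert-system
# rule versus a small multilayer network that learns rules from data.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

def expert_rule(crp, wbc):
    """Manually defined rule, as in a classical expert system.
    The CRP and WBC cutoffs below are illustrative only."""
    return int(crp > 10.0 and wbc > 12.0)

print("expert rule output:", expert_rule(14.2, 15.0))  # fires: both above cutoff

# An MLP with two hidden layers infers its decision boundary directly
# from labeled examples, with no hand-written thresholds.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
mlp = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000,
                    random_state=0).fit(X, y)
print("MLP training accuracy:", mlp.score(X, y))
```

The trade-off mirrors the prose above: the rule is transparent but rigid, while the network adapts to data it has never been explicitly programmed for.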
Clinical applications
An ideal clinical application in microbiology is characterized by its ability to expedite and ensure the precise identification of the microorganisms responsible for a patient's infectious disease, in turn allowing timely administration of the appropriate treatment. ML has the potential to achieve precise diagnoses with the aid of suitable input data, which is contingent upon the extent to which the sample or patient can be digitally represented; microscopic image data is a widely accepted format for this purpose (51). A comprehensive review has elucidated the typical conversion of image data into units that serve as input for machine learning methods. Although the conventional method of examining thick and thin blood smears under a microscope is considered the most reliable technique for diagnosing malaria, recent research has investigated advances in automating malaria diagnosis using image analysis. To achieve this, machine learning methods have been combined with computer-aided diagnostic (CADx) software that utilizes image analysis, aiming to automate the process of diagnosing malaria. Nevertheless, the analysis of image variations through "hand-designed features" still requires human expertise. To address this, researchers have applied convolutional neural networks (CNNs), a type of deep learning model, to image analysis. CNN models have been successfully used to classify parasitized and uninfected malaria cells, employing pre-trained models as feature extractors and conducting patient-level cross-validation. Additionally, pilot studies have assessed the effectiveness of deploying CNN models on mobile devices, an approach that holds promise for reducing delays in disease-endemic or resource-constrained settings. Similarly, a trained CNN model was able to differentiate scanned images of stained fecal smears containing parasites from those without, thereby reducing the workload of laboratory workers and increasing sensitivity compared with human slide examination alone (52, 53).
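The following is a minimal sketch of the transfer-learning pattern described above, using a pre-trained CNN purely as a feature extractor ahead of a simple classifier. The backbone choice (MobileNetV2), image size, and data are assumptions, with random arrays standing in for cell images; in practice, cross-validation would be performed at the patient level, as the cited studies emphasize.

```python
# Sketch under assumptions: pre-trained CNN as a feature extractor for
# parasitized vs. uninfected cell images. Random arrays stand in for
# real cell crops; MobileNetV2 is an arbitrary but common backbone choice.
import numpy as np
import tensorflow as tf
from sklearn.linear_model import LogisticRegression

# ImageNet-pretrained backbone with the classification head removed;
# global average pooling yields one embedding vector per image.
backbone = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False,
    weights="imagenet", pooling="avg")

rng = np.random.default_rng(0)
images = rng.random((32, 224, 224, 3)).astype("float32")  # stand-in cell crops
labels = rng.integers(0, 2, 32)                           # 1 = parasitized

# preprocess_input expects pixel values in [0, 255].
features = backbone.predict(
    tf.keras.applications.mobilenet_v2.preprocess_input(images * 255.0))
clf = LogisticRegression(max_iter=1000).fit(features, labels)
print("embedding shape per image:", features.shape)  # (32, 1280)
```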
Challenges Faced when Using Machine Learning
The use of ML has its own set of advantages and disadvantages, similar to other statistical methods in biology. These drawbacks can be partially mitigated by carefully selecting the problem or question to be examined; it is crucial to choose the outcome variables and covariates meticulously in order to optimize the application of ML. ML algorithms are most suitable for questions that concentrate on a specific population, where the influence of covariates on the risk of the selected outcomes is expected to be relatively consistent within a well-defined population group (54), often defined by certain disease states or microbial species. If the populations under analysis are not well defined, or if their characteristics exhibit high variability, ML may encounter challenges. Additionally, since ML's effectiveness is limited by small sample sizes, either the diseased cohort within the population being studied must be sufficiently large, or the overall sample size must be large enough to leverage the capabilities of ML (55).
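The sample-size caveat can be illustrated with a learning curve on synthetic data, where validation performance typically stabilizes only as the number of training examples grows. This is a hedged illustration, not a result from the cited studies.

```python
# Hedged illustration of the sample-size caveat: a learning curve shows
# validation performance improving and stabilizing as n grows.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
sizes, train_scores, val_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=np.linspace(0.1, 1.0, 5), cv=5)

for n, v in zip(sizes, val_scores.mean(axis=1)):
    # Small training sets tend to give unstable, lower validation accuracy.
    print(f"n={n:4d}: mean validation accuracy = {v:.3f}")
```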
Application of AI in the diagnosis of infectious diseases
To detect and track evolving pathogens, the Global Influenza Surveillance and Response System (GISRS) monitors the evolutionary mechanisms of influenza viruses (13). Artificial intelligence (AI) programs have been acknowledged as conventional tools for detecting early indications of infectious diseases and have become fundamental to infectious disease practice (56). AI is increasingly utilized in laboratory medicine for various purposes, including the interpretation of antinuclear antibody (ANA) patterns and the analysis of white blood cells (57). Simple tree-based analytics rely on a set of predefined rules, such as CLSI or EUCAST interpretation criteria, and are often referred to as non-adaptive AI. Adaptive AI, on the other hand, can derive rules either from human input or from machine discovery; these rules are then applied to new data to classify it. In the clinical laboratory, a simple form of machine learning, linear regression, is used to fit standard curves for instrument calibration: in regression analysis, a computer is provided with an equation and tasked with optimizing the values of its variables to achieve the best possible representation, or prediction, of a two-dimensional data set of analyte concentration versus assay reading. To illustrate some common machine learning terms, consider AI-based image analysis. While humans can easily interpret images thanks to the dedicated visual cortex in our brains, teaching a computer to understand images is a complex undertaking; machine learning algorithms lack inherent knowledge of which data or features within an image are crucial for classification or diagnosis (24). ML applications include risk stratification for specific infections, identification of disease risk factors, characterization of host-pathogen interactions, prediction of the emergence and spread of pathogens (6), digital screening of bacterial cultures on agar (58), and review of Gram stains of blood cultures (59). ML has been utilized in various studies to predict different aspects of infectious diseases; for instance, it has been employed to forecast the risk of nosocomial Clostridium difficile infection, zoonotic reservoirs, outcomes in Ebola virus infection, the risk of developing septic shock based on a severity score, and sepsis mortality. In a study by Guilamet et al., cluster analysis was employed for risk stratification. Previous attempts to identify episodes associated with bloodstream infections had relied on predetermined classification groups based on known microbiological episodes, site of infection, and patient characteristics. The authors hypothesized, however, that clinically relevant groupings may transcend these previous classifications, even within a heterogeneous population. To test this hypothesis, they applied cluster analysis to variables from three domains: patient characteristics, disease severity/clinical presentation, and infection characteristics. The analysis yielded four stable clusters: Cluster 1, "surgical transfers from the hospital"; Cluster 2, "functionally immunosuppressed patients"; Cluster 3, "women with skin problems and urinary tract infection"; and Cluster 4, "acute pneumonia". Notably, Staphylococcus aureus was predominantly distributed in Clusters 3 and 4, while non-fermenting gram-negative bacteria were mainly grouped in Clusters 2 and 4. Furthermore, more than 50% of the pneumonia cases occurred in Cluster 4.
These findings highlight the potential of machine learning methods to identify homogeneous clusters in infectious diseases, surpassing traditional patient categories. By identifying new clinical phenotypes, these methods may enhance severity assessment and facilitate the development of new treatments for complex or chronic infectious diseases (Figure 1) (10, 60).
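As a minimal sketch, and not the authors' actual pipeline, the following shows the general shape of such a cluster analysis: standardizing mixed clinical variables and fitting k-means with four clusters, mirroring the four-cluster solution described above. The feature matrix is a random placeholder, and in practice the number of clusters would be chosen with stability or silhouette criteria rather than fixed in advance.

```python
# Minimal sketch (not the cited study's pipeline): k-means with k=4 on
# standardized patient/infection variables. The matrix is a placeholder.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in for 200 episodes x 10 variables (demographics, severity
# scores, infection characteristics).
X = rng.random((200, 10))
X_std = StandardScaler().fit_transform(X)  # put variables on a common scale

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_std)
print("cluster sizes:", np.bincount(kmeans.labels_))
```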
Figure 1. Diagnostic algorithm using artificial intelligence in infectious disease (designed by the authors with BioRender, 2024)
AI has been used in the diagnosis of diseases in clinical settings; a list of relevant trials is provided in Table 2.
Table 2. AI-related clinical trials in the diagnosis of infectious diseases
| Title | Status | Conditions | Interventions | Phases | Locations/URL |
|---|---|---|---|---|---|
| Validation of Artificial Intelligence Enabled TB Screening and Diagnosis in Zambia | Recruiting | Tuberculosis | | | Zambia; https://ClinicalTrials.gov/show/NCT05139940 |
| Optimal Antibiotic Treatment of Moderate to Severe Bacterial Infections | Unknown status | Community-associated infections; health-care-acquired infections; nosocomial infections | Other: antibiotic treatment by TREAT/PCR | Phase 3 | Israel; https://ClinicalTrials.gov/show/NCT01338116 |
| Development of an Artificial Intelligence System for Intelligent Pathological Diagnosis and Therapeutic Effect Prediction Based on Multimodal Data Fusion of Common Tumors and Major Infectious Diseases in the Respiratory System Using Deep Learning Technology | Recruiting | Artificial intelligence; deep learning; pathology, molecular; medical informatics; database; lung cancer; pulmonary tuberculosis; COVID-19 | | | China; https://ClinicalTrials.gov/show/NCT05046366 |
| PEMF Therapy to Treat Lingering Symptoms of Lyme Disease After Treatment With Antibiotics | Terminated | Lyme disease; Lyme neuroborreliosis; Lyme arthritis; fever of unknown origin | Device: Scientific Consciousness Interface Operations (SCIO), a Class II FDA-approved medical device | Not applicable | United Kingdom; https://ClinicalTrials.gov/show/NCT04577053 |
| Early Risk Assessment in Household Contacts (≥10 Years) of TB Patients by New Diagnostic Tests in 3 African Countries | Recruiting | Tuberculosis | Diagnostic Test: new test candidates | | Zimbabwe; https://ClinicalTrials.gov/show/NCT04781257 |
Digitalization in healthcare is expanding, and machine learning is changing the way we interact with health-related data, especially in clinical microbiology and infectious diseases. We are likely to see a shift from the Internet of Things environment to the Internet of Bodies, where implanted devices continuously provide accurate health data even during disease-free periods. In addition, advances in molecular diagnostics, such as metagenomics, add to the complexity of the data, and in vitro diagnostics are expected to play an important role in the next decade. In the future, digital technologies such as personal assistants, internet-connected devices and bodies, smartphone technologies, self-driving vehicles, drones, and self-healing algorithms will undoubtedly shape our lives. Expectations for digitization and artificial intelligence in healthcare are high, driven by the need to optimize quality and reduce costs.
Acknowledgements
None to declare.
Ethical Considerations
The study was approved by the Ethics Committee of Research & Technology of Hamadan University of Medical Sciences, Hamadan, Iran (ethic code: IR.UMSHA.REC.1401.642).
Authors’ Contributions
HM and HH designed the topic and wrote the manuscript. HM and HH participated in the initial draft and the revision of the manuscript. HM revised the final version of the manuscript. All authors read and approved the final manuscript.
Funding
This research was supported by the Vice Chancellor for Research & Technology of Hamadan University of Medical Sciences, Hamadan, Iran.
Conflicts of Interest
The authors declare that they have no competing interests.
Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.