Biomedical research and drug discovery have long been constrained by labor-intensive processes. To kick off a drug discovery campaign, researchers typically comb through numerous scientific papers for details about known protein targets and small-molecule pairs. Reading and deeply comprehending a single paper takes one to six hours, while summarizing findings without AI assistance…
Innovation in medical devices continues to accelerate, with a record number authorized by the FDA every year. When these new or updated devices are introduced, clinicians and patients need training to use them properly and safely. Once the devices are in use, clinicians or patients may need help troubleshooting issues. Medical devices are often accompanied by lengthy and technically complex…
Clinical applications of AI are improving digital surgery, helping to reduce errors, provide consistency, and enable surgeon augmentations that were previously unimaginable. In endoscopy, a minimally invasive procedure used to examine the interior of an organ or body cavity, AI and accelerated computing are enabling better detection rates and visibility.
The process of building an AI-powered solution from start to finish can be daunting. First, datasets must be curated and pre-processed. Next, models need to be trained and tested for inference performance, and then finally deployed into a usable, customer-facing application. At each step along the way, developers constantly face time-consuming challenges, such as building efficient…
This post is the second in a series (Part 1) that addresses the challenges of training an accurate deep learning model using a large public dataset and deploying the model on the edge for real-time inference using NVIDIA DeepStream. In the previous post, you learned how to train a RetinaNet network with a ResNet34 backbone for object detection. This included pulling a container…
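The series itself runs that training inside an NVIDIA-provided container, so the exact commands live in those posts. Purely as an illustration of the model being described, the sketch below builds a RetinaNet detector with a ResNet-34 FPN backbone in torchvision; the class count and input size are placeholder assumptions, not values from the series.

```python
# Illustrative sketch only: RetinaNet with a ResNet-34 FPN backbone in
# torchvision. The post series trains with NVIDIA's container and tooling;
# num_classes and the dummy input size here are placeholder assumptions.
import torch
from torchvision.models.detection import RetinaNet
from torchvision.models.detection.backbone_utils import resnet_fpn_backbone

# ResNet-34 feature pyramid backbone (no pretrained weights loaded here).
backbone = resnet_fpn_backbone(backbone_name="resnet34", weights=None)

# RetinaNet detection head on top of the backbone; 91 classes is a placeholder.
model = RetinaNet(backbone, num_classes=91)
model.eval()

# Sanity-check the forward pass with a single dummy image.
with torch.no_grad():
    detections = model([torch.rand(3, 512, 512)])
print(detections[0]["boxes"].shape)
```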
Some of the biggest challenges in deploying an AI-based application are achieving high model accuracy and extracting insights in real time, and there is a trade-off between the two: making the model more accurate typically makes it larger, which reduces inference throughput. This post series addresses both challenges. In part 1, you train an accurate…
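One rough way to see that trade-off is to benchmark a small and a large backbone on the same batch and compare images per second. The models, batch size, and iteration count below are arbitrary stand-ins for illustration, not the networks used in this series.

```python
# Rough illustration of the accuracy/throughput trade-off: larger backbones
# (usually more accurate) process fewer images per second. Model choices,
# batch size, and iteration count are arbitrary stand-ins for this sketch.
import time
import torch
import torchvision

def images_per_second(model, batch, iters=20):
    model.eval()
    with torch.no_grad():
        start = time.time()
        for _ in range(iters):
            model(batch)
    return iters * batch.shape[0] / (time.time() - start)

batch = torch.rand(8, 3, 224, 224)
small = torchvision.models.resnet18(weights=None)
large = torchvision.models.resnet101(weights=None)

print(f"resnet18:  {images_per_second(small, batch):.1f} images/s")
print(f"resnet101: {images_per_second(large, batch):.1f} images/s")
```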