James Sohn – NVIDIA Technical Blog
News and tutorials for developers, data scientists, and IT admins
Feed: http://www.open-lab.net/blog/feed/ (last updated 2023-07-27)

Developing a Question Answering Application Quickly Using NVIDIA Riva
Published 2021-11-09 | http://www.open-lab.net/blog/?p=24073

There is a high chance that you have asked your smart speaker a question like, “How tall is Mount Everest?” If you did, it probably said, “Mount Everest is 29,032 feet above sea level.” Have you ever wondered how it found an answer for you? Question answering (QA) is loosely defined as a system consisting of information retrieval (IR)…
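
The full post builds this kind of pipeline with NVIDIA Riva: an information-retrieval step fetches a candidate passage, and an extractive reader pulls the answer span out of it. As a rough, hedged sketch of the extractive-reading step only, here is a stand-in using the Hugging Face transformers question-answering pipeline rather than the Riva client API; the context string is a placeholder that an IR step (for example, a Wikipedia search) would normally supply.

```python
# Minimal extractive QA sketch (a stand-in for the Riva NLP service described in the post).
# The context would normally come from an information-retrieval step, e.g. a Wikipedia search.
from transformers import pipeline

qa = pipeline("question-answering")  # downloads a default extractive QA model

context = (
    "Mount Everest is Earth's highest mountain above sea level, "
    "with an elevation of 29,032 feet (8,849 metres)."
)
result = qa(question="How tall is Mount Everest?", context=context)

print(result["answer"], f"(score: {result['score']:.2f})")
# Expected output is a span such as '29,032 feet (8,849 metres)'.
```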

Building a Question and Answering Service Using Natural Language Processing with NVIDIA NGC and Google Cloud
Published 2021-03-04 | http://www.open-lab.net/blog/?p=24231

Enterprises across industries are leveraging natural language processing (NLP) solutions—from chatbots to audio transcription—to improve customer engagement, increase employee productivity, and drive revenue growth. NLP is one of the most challenging tasks for AI because it must understand the underlying context of human language without explicit rules. Building an AI-powered solution…

Getting the Most Out of the NVIDIA A100 GPU with Multi-Instance GPU
Published 2020-12-01 | http://www.open-lab.net/blog/?p=21816

With third-generation Tensor Core technology, NVIDIA recently unveiled the A100 Tensor Core GPU, which delivers unprecedented acceleration at every scale for AI, data analytics, and high-performance computing. Along with the great performance increase over prior-generation GPUs comes another groundbreaking innovation, Multi-Instance GPU (MIG). With MIG, each A100 GPU can be partitioned into up to seven…
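
To get a feel for how a MIG-partitioned A100 shows up to software, here is a hedged sketch using the NVML Python bindings (pynvml, from the nvidia-ml-py package); it is not taken from the post, it assumes MIG mode is already enabled and GPU instances have been created with nvidia-smi, and return types can vary slightly between pynvml releases.

```python
# Sketch: enumerate MIG devices on GPU 0 with pynvml (nvidia-ml-py).
# Assumes MIG mode is already enabled and GPU instances have been created
# (e.g. via `nvidia-smi mig`); exact return types vary by pynvml version.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

mig_mode = pynvml.nvmlDeviceGetMigMode(gpu)  # (current mode, pending mode)
print("MIG mode (current, pending):", mig_mode)

max_mig = pynvml.nvmlDeviceGetMaxMigDeviceCount(gpu)  # up to 7 on an A100
for i in range(max_mig):
    try:
        mig = pynvml.nvmlDeviceGetMigDeviceHandleByIndex(gpu, i)
    except pynvml.NVMLError:
        continue  # no MIG device at this index
    # The UUID can be used in CUDA_VISIBLE_DEVICES to pin a workload to one slice.
    print(i, pynvml.nvmlDeviceGetUUID(mig))

pynvml.nvmlShutdown()
```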

Deploying a Natural Language Processing Service on a Kubernetes Cluster with Helm Charts from NVIDIA NGC
Published 2020-11-11 | http://www.open-lab.net/blog/?p=22018

Conversational AI solutions such as chatbots are now deployed in the data center, in the cloud, and at the edge to deliver lower latency and a high quality of service while meeting ever-increasing demand. The strategic decision to run AI inference on any or all of these compute platforms not only varies by the use case but also evolves over time with the business. Hence…

Simplifying AI Inference with NVIDIA Triton Inference Server from NVIDIA NGC
Published 2020-08-25 | http://www.open-lab.net/blog/?p=19889

Seamlessly deploying AI services at scale in production is as critical as creating the most accurate AI model. Conversational AI services, for example, need multiple models to handle automatic speech recognition (ASR), natural language understanding (NLU), and text-to-speech (TTS) in order to complete the application pipeline. To provide real-time conversation to users…
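
Once Triton is serving a model, clients submit inference requests over HTTP or gRPC. The sketch below uses the tritonclient Python package; the server URL, the model name my_model, and the tensor names and shape are hypothetical placeholders that must match the model's configuration in your model repository.

```python
# Sketch: query a model served by NVIDIA Triton Inference Server over HTTP.
# "my_model", the tensor names, and the shape are hypothetical; they must
# match the model's config.pbtxt in your model repository.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")
assert client.is_server_ready()

batch = np.random.rand(1, 128).astype(np.float32)  # placeholder input
infer_input = httpclient.InferInput("INPUT__0", list(batch.shape), "FP32")
infer_input.set_data_from_numpy(batch)
requested = httpclient.InferRequestedOutput("OUTPUT__0")

response = client.infer(model_name="my_model",
                        inputs=[infer_input],
                        outputs=[requested])
print(response.as_numpy("OUTPUT__0").shape)
```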

Optimizing and Accelerating AI Inference with the TensorRT Container from NVIDIA NGC
Published 2020-07-23 | http://www.open-lab.net/blog/?p=19032

Natural language processing (NLP) is one of the most challenging tasks for AI because it needs to understand context, phonics, and accent to convert human speech into text. Building this AI workflow starts with training a model that can understand and process spoken language into text. BERT is one of the best models for this task. Instead of starting from scratch to build state-of-the-art…
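
The post centers on the NGC TensorRT container for optimizing trained models for inference. As a rough, hedged sketch of that general workflow (not the exact steps from the post), the snippet below parses an ONNX export and builds a serialized TensorRT engine; the file names are placeholders and the calls follow the TensorRT 8.x Python API, which differs from both older and newer releases.

```python
# Sketch: build a TensorRT engine from an ONNX model inside the NGC TensorRT container.
# "model.onnx" / "model.plan" are placeholder paths; API details vary by TensorRT version
# (this follows the TensorRT 8.x Python bindings).
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parsing failed")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)  # enable mixed precision where supported

serialized_engine = builder.build_serialized_network(network, config)
with open("model.plan", "wb") as f:
    f.write(serialized_engine)
```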
