    Natural Language Understanding and Conversational AI

    Christopher Manning, Stanford University

    GTC 2020

    Natural language processing (NLP) has made dramatic advances over the last two years, ranging from deep generative models for text-to-speech, such as WaveNet, through the extensive deployment of deep contextual language models, such as BERT. Pre-training with models like BERT has significantly raised the performance of almost all NLP tasks, allowed much better domain adaptation, and brought us human-level performance for tasks like answering straightforward factual questions. New neural language models have also brought much more fluent language generation. On one hand, we should not be too impressed by these linguistic savants; things like understanding the consequences of events in a story or performing common-sense reasoning remain out of reach. But on the other hand, we now live in an era where there are many good commercial uses of NLP, with much of the heavy lifting already done in the construction of large but downloadable models. Question answering can be used for pre-sales, customer support, and fine-grained information access; neural machine translation can offer worldwide customers more natural and accurate translations; and neural response generation can power fluent, more conversational dialog agents.
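To make the "large but downloadable models" point concrete, here is a minimal sketch of extractive question answering with a pre-trained BERT-style model. It assumes the Hugging Face `transformers` library and an illustrative model checkpoint (`distilbert-base-cased-distilled-squad`); neither is specified in the talk.

```python
# Sketch: extractive QA with a downloadable pre-trained model.
# Assumes the `transformers` package is installed; the model name
# is illustrative, not taken from the talk.
from transformers import pipeline

# Downloads the checkpoint on first use.
qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

context = ("Pre-training with models like BERT has raised performance "
           "on almost all NLP tasks and enabled better domain adaptation.")
result = qa(question="What has pre-training with BERT raised?",
            context=context)
print(result["answer"])
```

The pipeline returns a dict with the extracted answer span, its character offsets in the context, and a confidence score, which is the typical shape of the "straightforward factual question" answering the abstract describes.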




