How can an AI understand language? Scott Leishman, XOKind | Productive AI Podcast with Troy Angrignon

How do AI programs understand language? Hear Scott Leishman explain the state of the art of Natural Language Processing.

How can an AI understand language? Computer-human communication is undergoing a revolution, and AI can now listen to, understand, and speak back to us in far more powerful ways than before. On this episode, hear Scott Leishman discuss how AI can now write news articles, blog posts, poetry, and novels, and how recent work is making it easier than ever to build incredibly powerful AI applications that can communicate with human beings.

Watch on YouTube

Listen on Simplecast

To see or hear more episodes:


00:00 Introduction

00:48 Scott’s background in computer science at FICO, CoreLogic, Nervana Systems (acquired by Intel for $400M in 2016), and Intel

06:56 What is Natural Language Processing (NLP)?

11:40 What was the significance of GPT-3’s release this year?

16:31 What can GPT-3 do? (explain it to somebody who doesn’t follow the field).

19:15 NLP is having its “ImageNet moment” – what does that mean? (Technical explanation)

25:39 Simplifying NLP for less-technical listeners

28:17 Standing on the shoulders of giants: Pre-trained models are making it easier to build AI applications

30:05 What kinds of new use cases are possible with the current state-of-the-art NLP?

33:29 Apple Knowledge Navigator – are we there yet?

37:25 Where does NLP live in the AI stack?

41:34 What are you doing with NLP at XOKind?

49:47 What should people be doing to improve their chances of working in this space?

54:05 Summary

-- LINKS --

Books: Jurafsky & Martin is probably the best-known comprehensive text, though it’s a bit dated at this point. Fortunately, they are working on a new draft:

Conferences: the big ones for NLP are ACL, EMNLP (held just last week), and CoNLL, but you’ll also see a lot of new work at ICLR and NeurIPS

Papers: the field moves quickly, and arXiv is the first place to find new results. I’d highly recommend searching through something like arxiv-sanity for a subject/topic of interest.

Mailing lists: I’m a big fan of Sebastian Ruder’s monthly update, which you can sign up for at NLP News


I mentioned keeping tabs on the current state of the art for given downstream tasks

For folks who want a good practical introduction, I’d recommend Stanford’s undergraduate NLP course (complete with video lectures online):

For getting interested in ML more generally, this course is pretty good too if you have some programming experience under your belt:

Hugging Face is doing a lot of great work in the NLP space: easy integrations for various models, a solid Python library, and more.

Rasa is another open-source solution; they now offer APIs as well to help build conversational agents


Sign up for our mailing list on the front page here:

Job openings: the list is here (scroll down the page). Growing frontend and backend engineering is a current focus for us.

Apple Knowledge Navigator video: