Self-driving cars must get better at understanding people’s intentions by “reading” the body language of the humans around them, so that they can co-exist more safely with us. In this discussion with Sam, the CTO and co-founder of Perceptive Automata, we learn how a branch of cognitive science known as psychophysics is being used to teach cars to interpret human intention, making them better and safer drivers.

Watch on YouTube

Listen on Simplecast

To see or hear more episodes:

-- TIMING --

00:00 Introduction
01:12 Sam’s career
05:04 Perceptive Au-TAW-mah-ta, not Au-to-MAH-ta
05:52 How are you teaching cars to understand human intention?
11:25 Structure of the autonomous vehicle market and the 5 Levels of autonomous driving
17:06 The Autonomous Vehicle technology stack
23:25 Use case discussion – how does an intuitive car understand multiple scenarios?
31:25 Cars will have general purpose compute platforms
33:49 Where does Tesla fit in here?
38:14 Typical customers for Perceptive Automata’s tools
41:09 Can intuition extend to other non-vehicle environments such as hospitality, delivery robots, and construction?
42:34 What about the military applications?
46:52 Summary of market and global need
48:44 How much of this is Edge AI vs. being processed in the datacenter or cloud?
49:53 What does it take to train AI to understand human body language?
53:00 Career advice for people interested in getting into the autonomous vehicle market
57:48 How to contact Perceptive Automata and Sam
59:14 Close

-- LINKS --

If you found this podcast episode helpful, don’t forget to subscribe at

DISCLOSURE: To support the channel, we use referral links wherever possible, which means if you click one of the links in this video or description and make a purchase, we may receive a small commission or other compensation.