TED Talks · Civilisational risk and strategy · Spotlight · Released: 26 Jan 2024

Could AI give you X-ray vision? | Tara Boroushaki

Why this matters

Auto-discovered candidate. Editorial positioning to be finalized.

Summary

Auto-discovered from TED Talks. Editorial summary pending review.

Perspective map

Mixed · Governance · Medium confidence · Transcript-informed

The amber marker shows the most Risk-forward score. The white marker shows the most Opportunity-forward score. The black marker shows the median perspective for this library item. Tap the band, a marker, or the track to open the transcript there.

An explanation of the Perspective Map framework is provided in the methodology.

Episode arc by segment

Early → late · height = spectrum position · colour = band

Risk-forward · Mixed · Opportunity-forward

Each bar is tinted by where its score sits on the same strip as above (amber → cyan midpoint → white). Same lexicon as the headline. Bars are evenly spaced in transcript order (not clock time).

Start → End

Across 5 full-transcript segments: median 0 · mean -2 · spread -100 (p10–p90 -100) · 0% risk-forward, 100% mixed, 0% opportunity-forward slices.

Slice bands
5 slices · p10–p90 -100

Mixed leaning, primarily in the Governance lens. Evidence mode: interview. Confidence: medium.

  • Emphasizes governance
  • Emphasizes safety
  • Full transcript scored in 5 sequential slices (median slice 0).

Editor note

Auto-ingested from daily feed check. Review for editorial curation under intake methodology.

ai-safety · ted-talks

Play on sAIfe Hands

On-site playback is enabled when an episode-level media URL is connected. This entry currently has a show-level source URL, not an episode-level media URL.

Episode transcript

YouTube captions (TED associates this talk with a public YouTube mirror) · video MA-uwhr50FU · stored Apr 10, 2026 · 109 caption segments

Captions are an imperfect primary: they can mis-hear names and technical terms. Use them alongside the audio and publisher materials when verifying claims.

No editorial assessment file yet. Add content/resources/transcript-assessments/could-ai-give-you-x-ray-vision-tara-boroushaki.json when you have a listen-based summary.

As a teenager, I was fascinated with the wizarding world. I wanted to be like Hermione Granger. I wanted to be a powerful witch with a wand. Out of all the spells, I really liked "Accio." I could say "Accio" and name anything and it would fly into my hands, even if it's a restricted book, locked and hidden inside the headmaster's office. But as you might have noticed, I did not get into Hogwarts. So I came to my backup school, MIT. (Laughter) And I made my own magic. I can now -- I can have the spells now that are even more powerful than Hermione's, because I don't need a wand. I can see the invisible. Let me show you my first spell. For example, I want to find my black shirt in my closet that I lost. I put on my AR headset and I say "Accio." Within a matter of seconds, my shirt will light up. Like I said, I can see the invisible. (Applause) So how does this spell work? How did I make an AR headset able to see what my own eyes cannot see? My trick is to use wireless signals like Bluetooth and Wi-Fi. These signals are great. This is exactly why you can get Wi-Fi from another room. So these signals are sent by this headset. They are scattered in the environment. They go through boxes and then are reflected back from all of these hidden objects, including my shirt. We then design algorithms that use these reflected signals to find these objects. Let me show you how these algorithms work. First, the headset tries to understand its environment by creating a virtual 3D map. And then as I walk in the environment, it keeps sending these wireless signals, tries to collect all of the reflections, and then combines them to collect this information, and then tries to locate the object that I was looking for. Then after some time, it becomes confident about it and then tells me, "Here it is." This is the glow, which is where the object I want is, and I can go and grab it. Besides helping me having these magical powers, it has a lot of industrial applications. 
For example, in a warehouse, it can help warehouse workers to find packages, and then it can guide them through the warehouse to find these packages. In a retail store, it can help the store associates to pack customer's orders and find misplaced items. It also has a lot of other applications in logistics and manufacturing. So having X-ray vision is very cool. It's better than what Hermione had. But sometimes I'm just so lazy, I don't want to go ahead and put my headset on and then say "Accio" and then go and find my key, grab it under the pile. So I was thinking, what if I had a robot that knew my spell? So I taught my spells to a robot. My robot can find my keys and bring them to me. Let me show it to you. For example, here, my keychain is hidden under the pile in the basket. My robot is able to search the environment and locate my keys with centimeter-level accuracy, purposefully declutter the environment and then grab the keys and bring them to me. So how do we do this? We built a specialized gripper instead of a wand. So it has a camera and a wireless sensor, which sends similar signals like the headset that I just talked about. So this is great, now this gripper can locate the keys. But there's one challenge. What if I put my robot in a completely new environment? For example, put it in the living room and ask it to find a remote control. How does this robot adapt itself to a completely new environment and a completely new object? If the robot does exactly the same steps as before, then it can only find a key in a basket. We designed an AI algorithm specifically to help this robot to adapt itself to a completely new environment and find objects it has never seen before. For example, here it's trying to find a remote control on a couch in a living room it has never seen before. It's able to find it and then bring it to me, and I can go and grab this remote control and then watch my favorite show. 
Magical X-ray vision can change the way we think about our environment and the way we interact with our environment. It opens up new possibilities that we never thought are possible. For example, it can help our future robots in our homes to help us. It can also help us interact with our future smart homes in ways we never imagined before. But the application that I'm personally very excited about is helping our first responders to find and track humans in conditions with low visibility. Or helping humans under the rubble after disaster hits. And as the cherry on top, it can help those of us who did not get into Hogwarts to get one step closer to the wizarding world. Thank you. (Applause)
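The localization loop the talk describes — move the headset through the room, measure RF reflections from the hidden object at several known poses, then combine those measurements to pin down its location — can be sketched as a least-squares search. Everything below (the `localize` function, the grid-search approach, the simulated distance readings) is an illustrative assumption, not the speaker's actual system, which fuses real Bluetooth/Wi-Fi reflections with a virtual 3D map of the environment.

```python
import math
import random

def localize(poses, measured_dists, lo=-2.0, hi=2.0, step=0.1):
    """Hypothetical sketch: grid-search the 3D point whose distances to
    each headset pose best match the measured reflection distances
    (least squares). `poses` are (x, y, z) headset positions; each
    measured distance is the one-way range inferred from a reflection."""
    n = int(round((hi - lo) / step)) + 1
    axis = [lo + i * step for i in range(n)]
    best, best_err = None, float("inf")
    for x in axis:
        for y in axis:
            for z in axis:
                err = 0.0
                for pose, d in zip(poses, measured_dists):
                    err += (math.dist((x, y, z), pose) - d) ** 2
                if err < best_err:
                    best, best_err = (x, y, z), err
    return best

# Simulated demo: hide an object, take noisy range readings from 8 poses.
random.seed(0)
obj = (0.5, -0.3, 0.1)
poses = [tuple(random.uniform(-1.5, 1.5) for _ in range(3)) for _ in range(8)]
dists = [math.dist(p, obj) + random.gauss(0, 0.01) for p in poses]
est = localize(poses, dists)  # lands near the true object position
```

A real system cannot read off ranges this cleanly — reflections arrive mixed with multipath clutter, which is why the talk pairs the RF sensor with a camera-built 3D map and accumulates evidence over many positions before declaring "here it is."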

Counterbalance on this topic

Ranked with the mirror rule in the methodology: picks sit closer to the opposite end of the spectrum from this item's score on the same axis (lens alignment preferred). Each card plots this item and the pick together.

More from this source