Computer Vision Decoded

In this episode of Computer Vision Decoded, we are going to dive into our in-house computer vision expert's reaction to the iPhone 15 and iPhone 15 Pro announcement.

We break down the camera upgrades, decode what a quad-pixel sensor means, and even discuss the importance of depth maps.

Episode timeline:

00:00 Intro
02:59 iPhone 15 Overview
05:15 iPhone 15 Main Camera
07:20 Quad Pixel Sensor Explained
15:45 Depth Maps Explained
22:57 iPhone 15 Pro Overview
27:01 iPhone 15 Pro Cameras
32:20 Spatial Video
36:00 A17 Pro Chipset

This episode is brought to you by EveryPoint. Learn more about how EveryPoint is building an infinitely scalable data collection and processing platform for the next generation of spatial computing applications and services: https://www.everypoint.io

Creators & Guests

Host
Jared Heinly
Chief Scientist at @EveryPointIO | 3D computer vision researcher (PhD) and engineer
Host
Jonathan Stephens
Chief Evangelist at @EveryPointIO | Neural Radiance Fields (NeRF) | Industry 4.0

What is Computer Vision Decoded?

A tidal wave of computer vision innovation is quickly making an impact on everyone's lives, but not everyone has the time to sit down and read through news articles to learn what it means for them. In Computer Vision Decoded, we sit down with Jared Heinly, the Chief Scientist at EveryPoint, to discuss topics in today's quickly evolving world of computer vision and decode what they mean for you. If you want to be sure you understand everything happening in the world of computer vision, don't miss an episode!