Facebook is teaching AI to see the world through human eyes
The Future. Facebook announced “Ego4D” — a project that gathers hours of first-person video to teach its AI systems to see and hear the world as humans do. Putting aside Facebook’s recent avalanche of criticism, the insights this project uncovers may ultimately convince customers to feel comfortable wearing Facebook on their faces all the time.
Facebook is taking machine learning to a whole new level.
- For its smartglasses to “become as useful in everyday life as smartphones,” the company is launching a project called “Ego4D.”
- The project aims to teach Facebook’s AI to see and hear the world as humans do so that devices can appropriately respond when being worn.
- Ego4D brings together 13 universities and labs across nine countries, which have collectively gathered 2,200 hours of first-person video.
- The videos capture 700 participants going about their daily lives, shot from their own point of view.
According to a Facebook blog post, all data will be open to the research community (which may still ring alarms).
Poking the singularity
Over the past year, Facebook has been charting the company’s future, focusing its ambitions on creating a metaverse and VR/AR devices that either bring people into digital worlds or augment the one they’re currently living in.
In the past couple of months, Facebook announced its first smartglasses, the Ray-Ban Stories, which were developed using a real-world capture program similar to Ego4D.
Still, it’s a strange time for Facebook to be announcing an always-learning AI system that tries to see and hear as humans do. With recent scathing reports from WSJ and more whistleblowers waiting in the wings to testify to the company’s alleged abuses, the last thing the public may want to celebrate is Facebook becoming more powerful.