Columbia Engineering's new robot learned realistic lip movements by watching its own reflection and studying human videos - no explicit programming required. Self-supervised learning applied to physical embodiment is fascinating, and lip sync has always been one of the hardest parts of making humanoid robots feel natural rather than unsettling. Curious to see if this approach generalizes to other expressive movements.