Modeling how a person’s eyes move toward an object before their hands touch it.

Often includes synchronized gaze data (where the person is looking).

**Content and Activity**
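As an illustration of how such synchronized gaze data might be consumed, here is a minimal sketch that parses a hypothetical CSV of per-frame gaze points. The column names and the inline sample are assumptions for illustration; the actual GTEA Gaze+ export format should be checked against the dataset's documentation:

```python
import csv
import io

# Hypothetical gaze log: frame index plus normalized (x, y) gaze position.
# Real dataset exports vary in format; this is only a sketch.
GAZE_CSV = """frame,x,y
0,0.51,0.62
1,0.53,0.60
2,0.55,0.58
"""

def load_gaze(fp):
    """Parse gaze samples into a dict mapping frame index -> (x, y)."""
    reader = csv.DictReader(fp)
    return {int(row["frame"]): (float(row["x"]), float(row["y"]))
            for row in reader}

gaze = load_gaze(io.StringIO(GAZE_CSV))
# gaze[1] now holds the gaze point for frame 1
```

Once loaded this way, each video frame can be paired with its gaze point by frame index, which is the usual first step before any gaze-conditioned model training.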

The video belongs to a collection designed to help AI models understand how humans perform daily tasks. It was filmed using head-mounted cameras (such as a GoPro or specialized eye-tracking glasses) to capture exactly what the subject sees.

Dataset: GTEA Gaze+
Perspective: Egocentric (first-person)
Primary Focus: Meal preparation and kitchen activities

If you tell me more about your specific project, I can provide:

- for this specific timestamp (if available)
- Code snippets for loading GTEA Gaze+ videos in Python
- Related research papers that utilize the Group 4 dataset
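On the loading point, a minimal sketch using OpenCV is shown below. This assumes `opencv-python` is installed; the filename is the clip discussed above, and the sampling step of 30 frames is an arbitrary choice, not something prescribed by the dataset:

```python
try:
    import cv2  # requires: pip install opencv-python
except ImportError:
    cv2 = None  # allow the module to import even without OpenCV

def iter_frames(path, step=30):
    """Yield (frame_index, frame) for every `step`-th frame of a video."""
    cap = cv2.VideoCapture(path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            yield idx, frame
        idx += 1
    cap.release()

# Usage sketch (run only where the video file actually exists):
# for idx, frame in iter_frames("g4_01136.mp4", step=30):
#     print(idx, frame.shape)
```

Sampling every Nth frame like this is a common preprocessing step when full-frame-rate video is more data than an action-recognition pipeline needs.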

Researchers use "g4_01136.mp4" and similar clips to train and test algorithms in several key areas:

Understanding the logical sequence of steps required to complete a complex task.

**Usage in AI Benchmarking**
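One common way to benchmark step-sequence understanding is to compare a model's predicted sequence of actions against ground-truth annotations using an edit distance. The sketch below is a generic Levenshtein distance over action labels; the labels themselves are hypothetical examples, not actual annotations from the dataset:

```python
def edit_distance(pred, gold):
    """Levenshtein distance between two action-label sequences:
    the minimum number of insertions, deletions, and substitutions
    needed to turn `pred` into `gold`."""
    m, n = len(pred), len(gold)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all remaining predicted steps
    for j in range(n + 1):
        dp[0][j] = j  # insert all remaining gold steps
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if pred[i - 1] == gold[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # match/substitution
    return dp[m][n]

# Hypothetical example: the model skipped one step of the task.
gold = ["take_bread", "spread_jam", "close_jar"]
pred = ["take_bread", "close_jar"]
score = edit_distance(pred, gold)  # one missing step -> distance 1
```

Lower distance means the predicted step sequence more closely follows the logical order of the task; some benchmarks normalize this by the length of the ground-truth sequence.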