Tuesday, 19 March 2019

Atari-HEAD: Atari Human Eye-Tracking and Demonstration Dataset. (arXiv:1903.06754v1 [cs.LG])

We introduce a large-scale dataset of human actions and eye movements recorded while playing Atari video games. The dataset currently contains 44 hours of gameplay from 16 games, totaling 2.97 million demonstrated actions. Human subjects played in a frame-by-frame manner, giving them enough decision time to produce near-optimal actions. This dataset could potentially be used for research in imitation learning, reinforcement learning, and visual saliency.
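As a rough illustration of the imitation-learning use case, the sketch below fits a softmax policy to synthetic (frame, action) pairs via behavioral cloning. The array shapes, sample counts, and data are stand-in assumptions for illustration only, not the actual Atari-HEAD schema or contents.

```python
import numpy as np

# Hypothetical stand-in for demonstration data: each example pairs a
# flattened, downsampled game frame with the action the human chose.
rng = np.random.default_rng(0)
n_samples, n_pixels, n_actions = 200, 64, 4
frames = rng.normal(size=(n_samples, n_pixels))   # synthetic "frames"
true_w = rng.normal(size=(n_pixels, n_actions))
actions = (frames @ true_w).argmax(axis=1)        # synthetic "human" actions

# Behavioral cloning: fit a linear softmax policy to imitate the
# demonstrated actions with plain gradient descent on cross-entropy.
w = np.zeros((n_pixels, n_actions))
for _ in range(300):
    logits = frames @ w
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    onehot = np.eye(n_actions)[actions]
    grad = frames.T @ (probs - onehot) / n_samples
    w -= 0.5 * grad

accuracy = ((frames @ w).argmax(axis=1) == actions).mean()
print(f"training imitation accuracy: {accuracy:.2f}")
```

A real pipeline would replace the synthetic arrays with the dataset's frames and logged actions (and could use the gaze data as an auxiliary attention signal), but the cloning loop itself has the same shape.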



from cs updates on arXiv.org https://ift.tt/2UMzCLq