This page contains the data collected through a collaboration between RITMO (University of Oslo), the BIRD team of Loria (Lorraine Research Laboratory in Computer Science and its Applications), and Ullo. The data was made available for a tutorial organised at ISMIR 2023.
If you have any questions or comments, feel free to e-mail me: bonnin [at] loria [dot] fr
PowerPoint version (with video) (157 MB)
PDF version (without video) (3 MB)
The code of the tutorial can be found here: https://github.com/laurabishop/MusicDiscoveryPupil
Download (42 MB)
We provide the data in JSON and tab-separated text formats. There are six files in total:
full-dataset.json | All of the collected data, in JSON format
format.json | A descriptive file showing the structure of full-dataset.json (same content as the description below)
P2067560969.txt | A tab-separated file containing the subset of the full dataset recorded by the eye tracker
PLR.txt | A subset of the dataset containing only pupil diameters
ET-partial1.txt | A subset of the dataset containing only participant IDs, session numbers, track Spotify IDs, track times, pupil data, effort data, liking ratings, and track familiarity
ET-partial2.txt | A subset of the dataset containing only participant IDs, session numbers, track Spotify IDs, track times, pupil and gaze data, and subjective emotion data (AOI hits in the circumplex model)
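As a sketch of how the tab-separated files might be loaded in Python (the column names below are illustrative assumptions, not the actual headers of the released files):

```python
import csv
import io

# Small in-memory sample mimicking the tab-separated layout.
# The header names here are assumptions for illustration only.
sample = (
    "participant_id\tsession\ttimestamp\tPupil diameter left [mm]\n"
    "P01\t1\t0.016\t3.42\n"
    "P01\t1\t0.033\t3.45\n"
)

def read_tsv(stream):
    """Parse a tab-separated stream into a list of dict rows (values as strings)."""
    return list(csv.DictReader(stream, delimiter="\t"))

rows = read_tsv(io.StringIO(sample))
print(len(rows))                                 # 2
print(rows[0]["Pupil diameter left [mm]"])       # 3.42
```

To read one of the actual files, replace the `io.StringIO(sample)` stream with `open("PLR.txt", newline="")`; numeric columns will still need an explicit `float()` conversion.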
The format of the full-dataset.json file is as follows:
{
<participant_id>: {
"session1": {
"eyetracker": {
"timestamps": [ <timestamp1>, <timestamp2>, ... ],
"Gaze point X": [ <value1>, <value2>, ... ],
"Gaze point Y": [ ... ],
"Gaze point left X": [ ... ],
"Gaze point left Y": [ ... ],
"Gaze point right X": [ ... ],
"Gaze point right Y": [ ... ],
"Fixation point X": [ ... ],
"Fixation point Y": [ ... ],
"Pupil diameter left [mm]": [ ... ],
"Pupil diameter right [mm]": [ ... ],
"AOI hit [Pleasant]": [ ... ],
"AOI hit [Unpleasant]": [ ... ],
"AOI hit [Activation]": [ ... ],
"AOI hit [Deactivation]": [ ... ],
"AOI hit [Alert]": [ ... ],
"AOI hit [Excited]": [ ... ],
"AOI hit [Elated]": [ ... ],
"AOI hit [Happy]": [ ... ],
"AOI hit [Contented]": [ ... ],
"AOI hit [Serene]": [ ... ],
"AOI hit [Relaxed]": [ ... ],
"AOI hit [Fatigued]": [ ... ],
"AOI hit [Lethargic]": [ ... ],
"AOI hit [Calm]": [ ... ],
"AOI hit [Depressed]": [ ... ],
"AOI hit [Sad]": [ ... ],
"AOI hit [Upset]": [ ... ],
"AOI hit [Stressed]": [ ... ],
"AOI hit [Nervous]": [ ... ],
"AOI hit [Tense]": [ ... ]
},
"wristband": {
"TEMP": [
{ "timestamp": <timestamp>, "value": <value> },
...
],
"EDA": [ ... ],
"BVP": [ ... ],
"IBI": [ ... ],
"HR": [ ... ]
},
"post_questionnaires": [
{
"music_beginning": <timestamp>,
"music_end": <timestamp>,
"spotify_id": <spotify_id>, // null if metronome
"liking": <liking>,
"liking_comment_fr": <liking_comment>,
"liking_comment_en": <liking_comment>,
"genre_familiarity": <genre_familiarity>,
"genre_familiarity_comment_fr": <genre_familiarity_comment>,
"genre_familiarity_comment_en": <genre_familiarity_comment>,
"artist_familiarity": <artist_familiarity>,
"artist_familiarity_comment_fr": <artist_familiarity_comment>,
"artist_familiarity_comment_en": <artist_familiarity_comment>,
"track_familiarity": <track_familiarity>,
"track_familiarity_comment_fr": <track_familiarity_comment>,
"track_familiarity_comment_en": <track_familiarity_comment>,
"RSME": <RSME>,
"relaxed": <relaxed>,
"motivation": <motivation>,
"fatigue": <fatigue>,
"playcount": <playcount> //only for session 2
},
...
]
},
"session2": { ... },
"participant_occupation": <occupation>
},
<participant_id>: {
...
},
...,
"tracks": [
{
"spotify_id": <spotify_id>,
"artist": <artist>,
"track": <track>,
"duration_ms": <duration_ms>,
},
...
]
}
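Under this schema, every top-level key except "tracks" is a participant ID. The following sketch shows how the structure might be traversed; the data values are synthetic and the keys shown are only the subset of the schema needed for the example:

```python
# Minimal synthetic example following the documented schema (all values made up).
dataset = {
    "P01": {
        "session1": {
            "eyetracker": {
                "timestamps": [0.0, 0.016],
                "Pupil diameter left [mm]": [3.1, 3.2],
                "Pupil diameter right [mm]": [3.0, 3.1],
            },
            "wristband": {
                "HR": [{"timestamp": 0.0, "value": 72}],
            },
            "post_questionnaires": [
                {"spotify_id": None, "liking": 4, "RSME": 30},  # null id = metronome
            ],
        },
        "participant_occupation": "student",
    },
    "tracks": [
        {"spotify_id": "abc123", "artist": "X", "track": "Y", "duration_ms": 180000},
    ],
}

# Participant entries are every top-level key except "tracks".
participants = [key for key in dataset if key != "tracks"]

# Example: mean left pupil diameter for one participant and session.
left = dataset["P01"]["session1"]["eyetracker"]["Pupil diameter left [mm]"]
mean_left = sum(left) / len(left)
print(participants, round(mean_left, 2))
```

For the real file, the dict would come from `json.load(open("full-dataset.json"))` instead of being built inline.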
If you use this dataset, please cite the following paper:
Laura Bishop, Geoffray Bonnin, and Jérémy Frey. 2023. Analysing Physiological Data Collected During Music Listening: An Introduction. In Proceedings of the 24th International Society for Music Information Retrieval Conference (ISMIR '23).