Eyetracking data analysis


Viewing 6 posts - 1 through 6 (of 6 total)
  • #10813
    stradee
    Participant

    Hi, Jeremy. I am collecting some eye-tracking data, and I have a few questions after looking through it.

    trial	time	left	right
    11	0	0	0
    11	157	0	0
    11	340	0	0
    11	492	0	0
    11	651	0	1
    11	809	0	0
    11	968	0	0
    11	1129	0	0
    11	1286	0	0
    11	1441	1	0
    11	1597	0	0
    11	1758	0	1
    11	1920	0	1
    11	2087	0	0
    11	2250	1	0
    11	2416	1	0
    11	2581	1	0
    11	2751	0	0

    This is part of the data.

    First, I wonder what it means when both left and right are 0. Does it mean that the participant didn’t look at either canvas that I designated? If so, is it okay to remove those data points and analyze only (left, right) = (0, 1) or (1, 0)?

    Second, I want to analyze the data only from the point where the target word in the sentence appears. In trial 11, the target word appears at 697 ms and the whole audio ends at 1366 ms. How can I process this data relative to the target onset time? Is there any script or code for that?

    Lastly, is it possible to add an extra column, such as ‘ItemNo.’, next to the ‘trial’ column?

    Here is the demonstration link >> https://farm.pcibex.net/r/ipbjNz/

    Thanks for all your wonderful help.

    #10814
    Jeremy
    Keymaster

    Hi,

    1. The analysis part is up to you. If you discard the (0,0) data points to focus on (0,1) vs (1,0), you’ll be looking at left-vs-right preference only when the participant is (estimated to be) looking at either left or right; but keep in mind that the participant may be looking at neither image for some time. For example, you could get a 30% left vs 70% right distribution after discarding the (0,0) data points, but 15% left vs 35% right vs 50% neither when factoring in all the data points.
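To make the two ways of computing proportions concrete, here is a minimal Python sketch (not PennController code). The sample counts are hypothetical, chosen only to reproduce the 30%/70% vs 15%/35%/50% example above:

```python
# Hypothetical (left, right) samples: 3 left, 7 right, 10 neither
samples = [(0, 1)] * 7 + [(1, 0)] * 3 + [(0, 0)] * 10

n_total = len(samples)
n_left = sum(1 for l, r in samples if (l, r) == (1, 0))
n_right = sum(1 for l, r in samples if (l, r) == (0, 1))
n_neither = sum(1 for l, r in samples if (l, r) == (0, 0))

# Discarding the (0,0) points: left vs right among "looking" samples only
n_looking = n_left + n_right
print(n_left / n_looking, n_right / n_looking)  # 0.3 0.7

# Keeping all points: left vs right vs neither
print(n_left / n_total, n_right / n_total, n_neither / n_total)  # 0.15 0.35 0.5
```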

    2. Subtract the EventTime value of the tracker element from the EventTime value of the audio element in your results file: that gives you the offset of the audio’s start relative to the tracker’s start. Then add 697 to that number, and you’ll get the word onset relative to the tracker’s start time. You can then compare that number to the time values in the eye-tracker data file and, for example, discard all the lines where time is lower than it.
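A minimal Python sketch of this step, with hypothetical EventTime values standing in for the ones you would read from your results file (697 is the word onset within the audio, as in the question):

```python
# Hypothetical EventTime values taken from the results file
tracker_event_time = 1650000000000  # EventTime of the tracker element
audio_event_time = 1650000000450    # EventTime of the audio element

audio_offset = audio_event_time - tracker_event_time  # audio start relative to tracker start
word_onset = audio_offset + 697                       # word onset relative to tracker start

# Rows from the eye-tracking data file: (trial, time, left, right)
rows = [
    (11, 968, 0, 0),
    (11, 1129, 0, 0),
    (11, 1286, 0, 0),
    (11, 1441, 1, 0),
]
# Keep only the samples collected at or after the word onset
after_onset = [row for row in rows if row[1] >= word_onset]
```

With these hypothetical numbers, `word_onset` comes out to 1147, so only the last two rows survive the filter.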

    3. It is not possible to add a column to the eye-tracking data file during runtime. However, you can use the trial column from that file to cross-reference the PennController trial ID from the results file, and add an “itemNo” column during analysis if you want.
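The cross-referencing described above could be sketched in Python as follows; the trial-to-item mapping values here are hypothetical placeholders for what you would build from your own results file:

```python
# Hypothetical mapping from trial number to item, built from the results file
trial_to_item = {11: "item_03", 12: "item_07"}

# Rows from the eye-tracking data file
eye_rows = [
    {"trial": 11, "time": 651, "left": 0, "right": 1},
    {"trial": 12, "time": 340, "left": 1, "right": 0},
]

# Add an "itemNo" column during analysis by looking up the trial number
for row in eye_rows:
    row["itemNo"] = trial_to_item[row["trial"]]
```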

    Jeremy

    #10819
    stradee
    Participant

    Jeremy, I have one more question.

    It is about the fixation duration.
    In the data attached above,

    trial time left right
    11 1920 0 1
    11 2087 0 0
    11 2250 1 0
    11 2416 1 0
    11 2581 1 0
    11 2751 0 0

    The fixation duration of (left, right) = (0,1) is 2087-1920. Is that right?
    Then, how can I calculate the duration of the last fixation (time = 2751)?

    Thanks as always!

    #10820
    Jeremy
    Keymaster

    Hi,

    The tracker collects data points at regular intervals; in this case, about every 160 ms to 170 ms. You cannot necessarily conclude from a change between one line and the next that the participant kept looking at the same point during that interval: not only because these are just estimates from the model, but also because the participant might have started looking elsewhere soon after the former data point was collected, with the model only registering that on the next cycle, so it shows up in the file about 160 ms later.

    Based on an average interval of 165ms, what you can say from the lines you posted is that the participant was (estimated to be) looking at the left item for at least 2581-2250 = 331ms and at most 331+(165*2) = 661ms. As for the right item, all you can say from these lines is that the participant was (estimated to be) looking at it for anywhere between 1ms and ~330ms.
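The lower- and upper-bound arithmetic above can be sketched in Python using the posted rows, assuming the ~165 ms average interval:

```python
# Rows from the posted excerpt: (time, left, right)
rows = [
    (1920, 0, 1),
    (2087, 0, 0),
    (2250, 1, 0),
    (2416, 1, 0),
    (2581, 1, 0),
    (2751, 0, 0),
]
interval = 165  # assumed average sampling interval in ms

# Times at which the participant was estimated to look at the left item
left_times = [t for t, l, r in rows if l == 1]

# Lower bound: span covered by the sampled left-looks themselves
lower = left_times[-1] - left_times[0]  # 2581 - 2250 = 331 ms
# Upper bound: the gaze may have started just after the preceding sample
# and ended just before the following one, adding up to one interval each side
upper = lower + 2 * interval            # 331 + 330 = 661 ms
print(lower, upper)  # 331 661
```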

    Jeremy

    #10850
    stradee
    Participant

    Hi, Jeremy.

    While I was analyzing some eye data, I found some problems.

    trial time left right
    11 1920 0 1
    11 2087 0 0
    11 2250 1 0
    11 2416 1 0
    11 2581 1 0
    11 2751 0 0

    1. In the above data set, the sampling intervals differ across participants. Some are 150~170ms, others are 50~70ms, and a few are 30~50ms. As a result, the total number of data points varies widely across participants. Why do the intervals differ? I haven’t modified anything while collecting data.

    2. There’s some missing data: a few participants have (left, right) = (0, 0) for all data points. What causes this problem?

    3. In your last reply, you said the fixation duration can only be estimated from the sampling intervals.
    Is there any way to get the exact fixation duration for each data point on the PCIbex Farm? If so, I’d like to modify my script before collecting more data.

    Thanks!

    #10858
    Jeremy
    Keymaster

    Hi,

    Sorry for the late reply

    1. Participants run the experiment under different conditions (different machines, different browsers, different numbers of tabs and programs open), all of which affect performance; hence the large variability in the intervals. The best you can do is invite all your participants to use the same browser and to close as many tabs and programs as they can before taking your experiment.

    2. When the data says (0,0), it means the eye tracker estimated that the gaze fell on neither left nor right. This can happen if the tracker estimated that the gaze was somewhere else on the page or somewhere else on the screen (for example, the browser’s address bar), or if the participant closed their eyes. All of these remain estimates, though: maybe the participant was indeed looking at either left or right, but the tracker got it wrong.

    3. I’m not sure I understand this question

    Jeremy
