Asynchronous API¶
Device Discovery¶
- class pupil_labs.realtime_api.discovery.Network[source]¶
Bases: object
- property devices¶
- Return type:
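A minimal discovery sketch, assuming Network can be used as an async context manager and offers a wait_for_new_device() helper (neither is listed above):

import asyncio

from pupil_labs.realtime_api.discovery import Network


async def main():
    async with Network() as network:
        # wait_for_new_device() is assumed here; it is not part of the
        # entries documented above.
        device_info = await network.wait_for_new_device(timeout_seconds=5)
        if device_info is None:
            print("No device found")
            return
        # The devices property lists everything discovered so far.
        print(network.devices)


asyncio.run(main())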
Remote Control¶
- class pupil_labs.realtime_api.device.Device(*args, **kwargs)[source]¶
Bases: DeviceBase
- async get_calibration()[source]¶
- Raises:
pupil_labs.realtime_api.device.DeviceError – if the request fails
- Return type:
- async get_status()[source]¶
- Raises:
pupil_labs.realtime_api.device.DeviceError – if the request fails
- Return type:
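A minimal status-query sketch, assuming Device is importable from the package root, accepts an address and port, and works as an async context manager (none of which is spelled out above):

import asyncio

from pupil_labs.realtime_api import Device


async def main():
    # Placeholder address; 8080 is commonly used by the Companion app,
    # but verify both values for your setup.
    async with Device(address="192.168.1.21", port="8080") as device:
        status = await device.get_status()
        print(status.phone)  # assumes the returned Status exposes a phone component


asyncio.run(main())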
- async get_template()[source]¶
Gets the template currently selected on the device.
- Raises:
pupil_labs.realtime_api.device.DeviceError – if the template can’t be fetched.
- Return type:
- async get_template_data(format='simple')[source]¶
Gets the template data entered on the device.
- Parameters:
format (str) – “simple” | “api”. “api” returns the data as-is from the API, e.g. {“item_uuid”: [“42”]}; “simple” returns the parsed data, e.g. {“item_uuid”: 42}.
- Raises:
pupil_labs.realtime_api.device.DeviceError – if the template’s data could not be fetched
- async post_template_data(template_answers, format='simple')[source]¶
Sets the data for the currently selected template
- Parameters:
format (str) – “simple” | “api”. “api” accepts the data in the Realtime API format, e.g. {“item_uuid”: [“42”]}; “simple” accepts the data in parsed format, e.g. {“item_uuid”: 42}.
- Raises:
pupil_labs.realtime_api.device.DeviceError – if the data cannot be sent. ValueError – if the data has an invalid type.
- Return type:
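A minimal template sketch, reusing a connected Device from the previous example; the item UUID is a placeholder:

async def fill_template(device):
    # `device` is a connected Device as in the status sketch above.
    template = await device.get_template()
    data = await device.get_template_data(format="simple")
    print(template, data)

    # The item UUID is a placeholder; look it up in the fetched template.
    await device.post_template_data({"your-item-uuid": "subject-042"}, format="simple")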
- async recording_cancel()[source]¶
- Raises:
pupil_labs.realtime_api.device.DeviceError – if the recording could not be cancelled. Possible reasons include: recording not running.
- async recording_start()[source]¶
- Raises:
pupil_labs.realtime_api.device.DeviceError – if the recording could not be started. Possible reasons include: recording already running, template has required fields, low battery, low storage, no wearer selected, no workspace selected, setup bottom sheets not completed.
- Return type:
- async recording_stop_and_save()[source]¶
- Raises:
pupil_labs.realtime_api.device.DeviceError – if the recording could not be stopped and saved. Possible reasons include: recording not running, template has required fields.
- async send_event(event_name, event_timestamp_unix_ns=None)[source]¶
- Raises:
pupil_labs.realtime_api.device.DeviceError – if sending the event fails
- Return type:
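A minimal recording sketch with a timestamped event, reusing a connected Device; the event name and duration are arbitrary, and the returned recording id is an assumption:

import asyncio
import time


async def record_snippet(device):
    # `device` is a connected Device as above.
    recording_id = await device.recording_start()  # assumes the recording id is returned
    print("Started recording", recording_id)

    # Without event_timestamp_unix_ns, the event is timestamped on arrival at the device.
    await device.send_event("stimulus onset", event_timestamp_unix_ns=time.time_ns())

    await asyncio.sleep(5)
    await device.recording_stop_and_save()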
- async status_updates()[source]¶
- Return type:
AsyncIterator[Union[Phone, Hardware, Sensor, Recording, NetworkDevice]]
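A minimal sketch of consuming live status updates from a connected Device:

async def watch_status(device):
    # Each iteration yields one updated component: Phone, Hardware, Sensor,
    # Recording, or NetworkDevice.
    async for component in device.status_updates():
        print(type(component).__name__, component)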
- pupil_labs.realtime_api.device.UpdateCallback¶
Type annotation for synchronous and asynchronous callbacks
Alias of Union[Callable[[Component], None], Callable[[Component], Awaitable[None]]]
- pupil_labs.realtime_api.device.UpdateCallbackAsync¶
Type annotation for asynchronous update callbacks
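A minimal sketch of the two callback shapes these aliases describe; the Component import path is an assumption, and how the callbacks are registered is not covered by the entries above:

from pupil_labs.realtime_api.models import Component  # assumed import path


def on_update(component: Component) -> None:
    # Synchronous form, matching UpdateCallback.
    print("update:", component)


async def on_update_async(component: Component) -> None:
    # Asynchronous form, matching UpdateCallbackAsync.
    print("async update:", component)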
Streaming¶
Gaze Data¶
- namedtuple pupil_labs.realtime_api.streaming.gaze.EyestateGazeData(x, y, worn, pupil_diameter_left, eyeball_center_left_x, eyeball_center_left_y, eyeball_center_left_z, optical_axis_left_x, optical_axis_left_y, optical_axis_left_z, pupil_diameter_right, eyeball_center_right_x, eyeball_center_right_y, eyeball_center_right_z, optical_axis_right_x, optical_axis_right_y, optical_axis_right_z, timestamp_unix_seconds)[source]¶
Bases: NamedTuple
EyestateGazeData(x, y, worn, pupil_diameter_left, eyeball_center_left_x, eyeball_center_left_y, eyeball_center_left_z, optical_axis_left_x, optical_axis_left_y, optical_axis_left_z, pupil_diameter_right, eyeball_center_right_x, eyeball_center_right_y, eyeball_center_right_z, optical_axis_right_x, optical_axis_right_y, optical_axis_right_z, timestamp_unix_seconds)
- Fields:
x (float) – Alias for field number 0
y (float) – Alias for field number 1
worn (bool) – Alias for field number 2
pupil_diameter_left (float) – Alias for field number 3
eyeball_center_left_x (float) – Alias for field number 4
eyeball_center_left_y (float) – Alias for field number 5
eyeball_center_left_z (float) – Alias for field number 6
optical_axis_left_x (float) – Alias for field number 7
optical_axis_left_y (float) – Alias for field number 8
optical_axis_left_z (float) – Alias for field number 9
pupil_diameter_right (float) – Alias for field number 10
eyeball_center_right_x (float) – Alias for field number 11
eyeball_center_right_y (float) – Alias for field number 12
eyeball_center_right_z (float) – Alias for field number 13
optical_axis_right_x (float) – Alias for field number 14
optical_axis_right_y (float) – Alias for field number 15
optical_axis_right_z (float) – Alias for field number 16
timestamp_unix_seconds (float) – Alias for field number 17
- property datetime¶
- property timestamp_unix_ns¶
- namedtuple pupil_labs.realtime_api.streaming.gaze.GazeData(x, y, worn, timestamp_unix_seconds)[source]¶
Bases: NamedTuple
GazeData(x, y, worn, timestamp_unix_seconds)
- Fields:
x (float) – Alias for field number 0
y (float) – Alias for field number 1
worn (bool) – Alias for field number 2
timestamp_unix_seconds (float) – Alias for field number 3
- property datetime¶
- property timestamp_unix_ns¶
- namedtuple pupil_labs.realtime_api.streaming.gaze.Point(x, y)[source]¶
Bases: NamedTuple
Point(x, y)
- class pupil_labs.realtime_api.streaming.gaze.RTSPGazeStreamer(*args, **kwargs)[source]¶
Bases: RTSPRawStreamer
- async receive()[source]¶
- Return type:
AsyncIterator[Union[GazeData, DualMonocularGazeData, EyestateGazeData]]
- async pupil_labs.realtime_api.streaming.gaze.receive_gaze_data(url, *args, **kwargs)[source]¶
- Return type:
AsyncIterator[Union[GazeData, DualMonocularGazeData, EyestateGazeData]]
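A minimal gaze-streaming sketch; the RTSP URL is assumed to come from the device status via a direct_gaze_sensor() helper (not documented in this section), and the address is a placeholder:

import asyncio

from pupil_labs.realtime_api import Device
from pupil_labs.realtime_api.streaming.gaze import receive_gaze_data


async def main():
    async with Device(address="192.168.1.21", port="8080") as device:  # placeholder address
        status = await device.get_status()
        sensor = status.direct_gaze_sensor()  # assumed Status helper
        if not sensor.connected:
            print("Gaze sensor is not connected")
            return

        # run_loop is forwarded to the underlying RTSP reader.
        async for gaze in receive_gaze_data(sensor.url, run_loop=True):
            print(gaze.x, gaze.y, gaze.timestamp_unix_seconds)
            break  # stop after the first sample


asyncio.run(main())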
IMU Data¶
- namedtuple pupil_labs.realtime_api.streaming.imu.Data3D(x, y, z)[source]¶
Bases: NamedTuple
Data3D(x, y, z)
- namedtuple pupil_labs.realtime_api.streaming.imu.IMUData(gyro_data, accel_data, quaternion, timestamp_unix_seconds)[source]¶
Bases: NamedTuple
IMUData(gyro_data, accel_data, quaternion, timestamp_unix_seconds)
- Fields:
gyro_data (Data3D) – Alias for field number 0
accel_data (Data3D) – Alias for field number 1
quaternion (Quaternion) – Alias for field number 2
timestamp_unix_seconds (float) – Alias for field number 3
- property datetime¶
- property timestamp_unix_nanoseconds¶
- property timestamp_unix_ns¶
- namedtuple pupil_labs.realtime_api.streaming.imu.Quaternion(x, y, z, w)[source]¶
Bases: NamedTuple
Quaternion(x, y, z, w)
- class pupil_labs.realtime_api.streaming.imu.RTSPImuStreamer(*args, **kwargs)[source]¶
Bases: RTSPRawStreamer
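A minimal IMU-streaming sketch, assuming the IMU RTSP URL comes from the device status via a direct_imu_sensor() helper and that RTSPImuStreamer behaves like RTSPGazeStreamer (async context manager with a receive() iterator); neither assumption is documented above:

import asyncio

from pupil_labs.realtime_api import Device
from pupil_labs.realtime_api.streaming.imu import RTSPImuStreamer


async def main():
    async with Device(address="192.168.1.21", port="8080") as device:  # placeholder address
        status = await device.get_status()
        sensor = status.direct_imu_sensor()  # assumed Status helper

        async with RTSPImuStreamer(sensor.url, run_loop=True) as streamer:
            async for imu in streamer.receive():
                print(imu.gyro_data, imu.accel_data, imu.quaternion)
                break


asyncio.run(main())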
Scene Video¶
- pupil_labs.realtime_api.streaming.video.BGRBuffer¶
Type annotation for raw BGR image buffers of the scene camera
- class pupil_labs.realtime_api.streaming.video.RTSPVideoFrameStreamer(*args, **kwargs)[source]¶
Bases: RTSPRawStreamer
- property sprop_parameter_set_payloads¶
- Raises:
pupil_labs.realtime_api.streaming.base.SDPDataNotAvailableError –
- Return type:
- namedtuple pupil_labs.realtime_api.streaming.video.VideoFrame(av_frame, timestamp_unix_seconds)[source]¶
Bases: NamedTuple
VideoFrame(av_frame, timestamp_unix_seconds)
- Fields:
av_frame (VideoFrame) – Alias for field number 0
timestamp_unix_seconds (float) – Alias for field number 1
- property datetime¶
- property timestamp_unix_ns¶
- async pupil_labs.realtime_api.streaming.video.receive_video_frames(url, *args, **kwargs)[source]¶
- Return type:
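A minimal scene-video sketch, assuming the scene camera RTSP URL comes from the device status via a direct_world_sensor() helper (not documented here) and that av_frame is a PyAV frame that can be converted to a BGR array:

import asyncio

from pupil_labs.realtime_api import Device
from pupil_labs.realtime_api.streaming.video import receive_video_frames


async def main():
    async with Device(address="192.168.1.21", port="8080") as device:  # placeholder address
        status = await device.get_status()
        sensor = status.direct_world_sensor()  # assumed Status helper

        async for frame in receive_video_frames(sensor.url, run_loop=True):
            bgr = frame.av_frame.to_ndarray(format="bgr24")  # PyAV conversion to a BGR buffer
            print(bgr.shape, frame.datetime)
            break


asyncio.run(main())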
- pupil_labs.realtime_api.streaming.nal_unit.extract_payload_from_nal_unit(unit)[source]¶
Prepends the NAL unit start code to the payload if necessary and handles fragmented units (of type FU-A).
Inspired by https://github.com/runtheops/rtsp-rtp/blob/master/transport/primitives/nal_unit.py Rewritten due to license incompatibility.
- Return type:
Raw RTSP Data¶
- namedtuple pupil_labs.realtime_api.streaming.base.RTSPData(raw, timestamp_unix_seconds)[source]¶
Bases: NamedTuple
RTSPData(raw, timestamp_unix_seconds)
- Fields:
raw (ByteString) – Alias for field number 0
timestamp_unix_seconds (float) – Alias for field number 1
- _asdict()¶
Return a new dict which maps field names to their values.
- classmethod _make(iterable)¶
Make a new RTSPData object from a sequence or iterable
- _replace(**kwds)¶
Return a new RTSPData object replacing specified fields with new values
- property datetime¶
- property timestamp_unix_ns¶
- class pupil_labs.realtime_api.streaming.base.RTSPRawStreamer(*args, **kwargs)[source]¶
Bases: object
Forwards all arguments to aiortsp.rtsp.reader.RTSPReader
- property encoding¶
- property reader¶
- class pupil_labs.realtime_api.streaming.base._WallclockRTSPReader(*args, **kwargs)[source]¶
Bases: RTSPReader
Time Echo Protocol¶
Manual time offset estimation via the Pupil Labs Time Echo protocol
The Realtime Network API host device timestamps its data with nanoseconds since the Unix epoch (January 1, 1970, 00:00:00 UTC). This clock is kept in sync by the operating system through NTP (Network Time Protocol). For some use cases, this sync is not good enough. For more accurate time syncs, the Time Echo protocol allows the estimation of the direct offset between the host’s and the client’s clocks.
The Time Echo protocol works in the following way:
1. The API host (Neon / Pupil Invisible Companion app) opens a TCP server at a specific port.
2. The client connects to the host address and port.
3. The client sends its current time (t1) in milliseconds as a uint64 in network byte order to the host.
4. The host responds with the time echo, two uint64 values in network byte order:
1. The first value is equal to the sent client time (t1).
2. The second value corresponds to the host's time in milliseconds (tH).
5. The client calculates the duration of steps 3 and 4 (roundtrip time) by measuring the client time before sending the request (t1) and after receiving the echo (t2).
6. The protocol assumes that the transport duration is symmetric. It will assume that tH was measured at the same time as the midpoint between t1 and t2.
7. To calculate the offset between the host's and the client's clocks, we subtract tH from the client's midpoint (t1 + t2) / 2:
offset_ms = ((t1 + t2) / 2) - tH
This measurement can be repeated multiple times to make the time offset estimation more robust.
To convert client to host time, subtract the offset:
host_time_ms = client_time_ms() - offset_ms
This is particularly helpful to accurately timestamp local events, e.g. a stimulus presentation.
To convert host to client time, add the offset:
client_time_ms = host_time_ms() + offset_ms
This is particularly helpful to convert the received data into the client’s time domain.
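A short worked example with made-up numbers: the client measures t1 = 1000 ms before sending and t2 = 1010 ms after receiving the echo, and the host reports tH = 1305 ms. Then:
offset_ms = ((1000 + 1010) / 2) - 1305 = 1005 - 1305 = -300
host_time_ms = client_time_ms - (-300) = client_time_ms + 300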
- class pupil_labs.realtime_api.time_echo.Estimate(measurements)[source]¶
Bases: object
Provides easy access to statistics over a collection of measurements
- namedtuple pupil_labs.realtime_api.time_echo.TimeEcho(roundtrip_duration_ms, time_offset_ms)[source]¶
Bases: NamedTuple
Measurement of a single time echo
- namedtuple pupil_labs.realtime_api.time_echo.TimeEchoEstimates(roundtrip_duration_ms, time_offset_ms)[source]¶
Bases: NamedTuple
Provides estimates for the roundtrip duration and time offsets
- pupil_labs.realtime_api.time_echo.TimeFunction¶
Returns time in milliseconds
- class pupil_labs.realtime_api.time_echo.TimeOffsetEstimator(address, port)[source]¶
Bases: object
- pupil_labs.realtime_api.time_echo.time_ms()[source]¶
Return milliseconds since Unix epoch (January 1, 1970, 00:00:00 UTC)
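A minimal estimation sketch; the Time Echo port is assumed to be exposed on the phone component of the device status, and estimate() with its number_of_measurements argument, as well as the mean statistic on Estimate, are assumptions too:

import asyncio

from pupil_labs.realtime_api import Device
from pupil_labs.realtime_api.time_echo import TimeOffsetEstimator


async def main():
    async with Device(address="192.168.1.21", port="8080") as device:  # placeholder address
        status = await device.get_status()
        estimator = TimeOffsetEstimator(
            status.phone.ip,  # assumed Status field
            status.phone.time_echo_port,  # assumed Status field
        )
        estimates = await estimator.estimate(number_of_measurements=100)  # assumed API
        print("clock offset (ms):", estimates.time_offset_ms.mean)  # assumed Estimate statistic


asyncio.run(main())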