In order to properly monitor the measurement of your AV content, there are a few things you need to know.
`av.play` (method `play`) represents the first click on the play button, which starts loading the video.
`av.start` (method `playbackStart`) represents the playback of the first frame of the content.
The classic sequence of events is therefore `av.play` followed by `av.start`. In between, the video loading time can be measured with the events `av.buffer.start` followed by `av.buffer.heartbeat`.
If you are not able to differentiate these two times, it is necessary to send both events at the same moment.
If playback is resumed (after a pause or a loading time (rebuffer), for example), it is necessary to send a resume event (method `playbackResumed`) rather than `av.start`.
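The classic sequence above can be sketched as follows. This is a minimal illustration, not a real SDK call: `sendEvent` and the surrounding scaffolding are hypothetical stand-ins for your tagging layer; only the event names come from the text.

```typescript
// Illustrative sketch of the classic AV event sequence.
// `sendEvent` is a hypothetical stand-in for your tagging layer.
type AvEvent = { name: string; position: number };

const sent: AvEvent[] = [];
function sendEvent(name: string, position: number): void {
  sent.push({ name, position });
}

// User clicks play: loading starts.
sendEvent("av.play", 0);
// Loading takes a while: buffer events measure it.
sendEvent("av.buffer.start", 0);
sendEvent("av.buffer.heartbeat", 0);
// First frame is displayed.
sendEvent("av.start", 0);

console.log(sent.map((e) => e.name).join(" -> "));
// av.play -> av.buffer.start -> av.buffer.heartbeat -> av.start
```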
When using methods related to content playback, the cursor position is required.
This position makes it possible to track the actual progress of playback.
In order to measure the cursor position, the libraries calculate the time actually spent playing. By default, one real second corresponds to one second of playing.
However, you can change this playback speed via the `setPlaybackSpeed` method, by specifying the speed factor.
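A minimal sketch of how a library could derive the cursor position from the real time spent playing and the speed factor set via a `setPlaybackSpeed`-style method. The `PositionTracker` class and its other method names are assumptions for illustration, not the libraries' actual API.

```typescript
// Hypothetical sketch: cursor position derived from real playing time
// multiplied by the playback speed factor.
class PositionTracker {
  private position = 0; // cursor position, in seconds of content
  private speed = 1;    // by default, 1 real second = 1 second of playing

  setPlaybackSpeed(factor: number): void {
    this.speed = factor;
  }

  // Called with the real seconds elapsed while the player was playing.
  advance(realSeconds: number): void {
    this.position += realSeconds * this.speed;
  }

  getPosition(): number {
    return this.position;
  }
}

const tracker = new PositionTracker();
tracker.advance(10);         // 10 real seconds at normal speed
tracker.setPlaybackSpeed(2); // switch to 2x
tracker.advance(5);          // 5 real seconds at 2x = 10 seconds of content
console.log(tracker.getPosition()); // 20
```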
Some processing is handled automatically by the tagging libraries.
However, you need to know about it and take it into account when tagging without the libraries.
When calling a seek (method `seek`), several events are raised:
- `av.seek.start`, indicating the start of the seek and bearing the cursor position before the move
- `av.forward` or `av.backward`, depending on the start and end positions of the cursor move.
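A sketch of the seek handling described above: the first event carries the position before the move, then the direction event is chosen from the start and end positions. `av.seek.start` and `av.backward` come from the text; the `av.forward` counterpart and the `seekEvents` helper are assumptions.

```typescript
// Hypothetical sketch: derive the pair of seek events from the cursor
// positions before and after the move.
function seekEvents(from: number, to: number): { name: string; position: number }[] {
  return [
    // av.seek.start carries the position before the move.
    { name: "av.seek.start", position: from },
    // Direction is derived from the start and end positions.
    { name: to >= from ? "av.forward" : "av.backward", position: to },
  ];
}

console.log(seekEvents(30, 120).map((e) => e.name).join(", ")); // av.seek.start, av.forward
console.log(seekEvents(120, 30).map((e) => e.name).join(", ")); // av.seek.start, av.backward
```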
Each playback session has a unique identifier (property `av_session_id`).
A playback session is defined by a content play. It stops if the content changes, or if the video is stopped (method `playbackStopped`).
Two types of loading times can be identified:
- buffer (`av.buffer.start` then `av.buffer.heartbeat`), for loading before the start of playback (between `av.play` and `av.start`)
- rebuffer (`av.rebuffer.start` then `av.rebuffer.heartbeat`), for loading during playback (after `av.start`).
The `bufferStart` method checks whether `playbackStart` has been called on the current session, in order to send the corresponding load type.
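The check above can be sketched as follows. The `Session` class is illustrative; the point is only the branch: loading before the first frame yields a buffer event, loading after it yields a rebuffer event (the `av.rebuffer.start` name mirrors `av.buffer.start` and is an assumption).

```typescript
// Hypothetical sketch: bufferStart picks the load type depending on
// whether playbackStart was already called in the current session.
class Session {
  private started = false;

  playbackStart(): string {
    this.started = true;
    return "av.start";
  }

  bufferStart(): string {
    // Loading before playback => buffer; loading during playback => rebuffer.
    return this.started ? "av.rebuffer.start" : "av.buffer.start";
  }
}

const session = new Session();
console.log(session.bufferStart()); // av.buffer.start (playback not started yet)
session.playbackStart();
console.log(session.bufferStart()); // av.rebuffer.start (loading during playback)
```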
During playback, in order to measure durations more precisely, heartbeat events provide information on the status of the player.
Several types of heartbeat are available:
- `av.heartbeat`, triggered after a `playbackStart` or `playbackResumed`, measures the content playback
- `av.(re)buffer.heartbeat`, triggered after a `bufferStart`, measures the duration of loading times.
The time between two heartbeats is defined when the media is instantiated. It can be fixed for the whole playback, or it can evolve via a JSON object.
On short media, we recommend 5 seconds between heartbeats, in order to get a measurement fine enough without overloading the user’s network and the collection servers. On longer media, you should space heartbeats further apart, up to 30 minutes, in order to keep the playback session alive.
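One way to express an evolving heartbeat delay is sketched below. The `{minuteOffset: seconds}` shape and the `delayAt` helper are assumptions for illustration, not the libraries' documented JSON format; the 5-second and 30-minute bounds come from the recommendation above.

```typescript
// Hypothetical sketch of an evolving heartbeat schedule:
// keys are minute offsets into playback, values are delays in seconds.
const heartbeatSchedule: Record<number, number> = {
  0: 5,     // from minute 0: every 5 seconds (fine-grained on short media)
  1: 15,    // from minute 1: every 15 seconds
  5: 1800,  // from minute 5: every 30 minutes (keeps the session alive)
};

function delayAt(playbackMinute: number): number {
  // Use the largest configured offset not exceeding the current minute.
  const offsets = Object.keys(heartbeatSchedule)
    .map(Number)
    .sort((a, b) => a - b);
  let delay = heartbeatSchedule[offsets[0]];
  for (const offset of offsets) {
    if (offset <= playbackMinute) delay = heartbeatSchedule[offset];
  }
  return delay;
}

console.log(delayAt(0));  // 5
console.log(delayAt(3));  // 15
console.log(delayAt(10)); // 1800
```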
The time actually spent in front of the content is filled in the `av_duration` property, via a timer.
This is the time elapsed, in milliseconds, between the previous event and the current one.
On each event, it is necessary to fill in the position (`av_previous_position`) and the name (`av_previous_event`) of the previously sent event.
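The bookkeeping above can be sketched as follows. The property names `av_duration`, `av_previous_position`, and `av_previous_event` come from the text; `buildEvent`, `av_position`, and the timestamp handling are illustrative assumptions.

```typescript
// Hypothetical sketch: fill av_duration and the av_previous_* properties
// from the previously sent event.
type AvProps = {
  av_position: number;           // current cursor position (ms) - assumed name
  av_duration: number;           // ms elapsed since the previous event
  av_previous_position?: number; // position of the previously sent event
  av_previous_event?: string;    // name of the previously sent event
};

let previous: { name: string; position: number; sentAt: number } | null = null;

function buildEvent(name: string, position: number, now: number): AvProps {
  const props: AvProps = {
    av_position: position,
    // Timer: milliseconds elapsed between the previous event and this one.
    av_duration: previous ? now - previous.sentAt : 0,
  };
  if (previous) {
    props.av_previous_position = previous.position;
    props.av_previous_event = previous.name;
  }
  previous = { name, position, sentAt: now };
  return props;
}

buildEvent("av.play", 0, 1_000);
const startEvent = buildEvent("av.start", 0, 3_500);
console.log(startEvent.av_duration, startEvent.av_previous_event); // 2500 av.play
```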