End-to-end quality metrics measured by Sinefa include delay, jitter, loss, and availability. These are measured by sending UDP, TCP, or ICMP packets to the test target.
Delay (also known as latency) is measured by calculating the time it takes to receive each packet back, minus any time spent by the operating system. It's important to account for OS processing time (such as the time the OS takes to schedule the request, or the time taken to serialize the packet); not doing so can produce inaccurate results, particularly on low-delay links. Over a sample period, the median value is calculated, and this is what's reported. Why median and not mean? A set of delay samples is typically skewed to the right: when there are outliers, they are usually high rather than low. Even a few such outliers can pull the mean upward, so the median better represents "typical" network delay, which in turn better reflects how applications are impacted.
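The per-period delay calculation can be sketched as follows. The `os_overhead_ms` parameter is illustrative (not Sinefa's actual API); it stands in for whatever OS scheduling/serialization time the measurement subtracts:

```python
import statistics

def median_delay_ms(rtt_samples_ms, os_overhead_ms):
    """Report the median round-trip delay for one sample period.

    rtt_samples_ms: raw round-trip times for the period.
    os_overhead_ms: estimated per-packet OS processing time to subtract
    (an assumed parameter for illustration).
    """
    corrected = [max(rtt - os_overhead_ms, 0.0) for rtt in rtt_samples_ms]
    return statistics.median(corrected)

# One 95 ms outlier barely moves the median:
print(median_delay_ms([10.2, 10.5, 10.4, 95.0, 10.3], 0.2))  # → ~10.2
```

The mean of the same corrected samples would be about 27 ms, illustrating why the median is the better summary here.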
Jitter is calculated by taking the median absolute deviation of the delays for each sample period. Median absolute deviation (MAD for short) is best described as the median deviation from the median. As with delay, MAD is not as affected by large positive outliers, so it provides a good representation of "typical" network jitter. Again, this better reflects how applications are impacted on the network.
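MAD is simple to compute: take the median delay, then take the median of each sample's absolute distance from it. A minimal sketch:

```python
import statistics

def jitter_mad_ms(delays_ms):
    """Jitter as the median absolute deviation (MAD) of delay samples:
    the median of each sample's absolute distance from the median delay."""
    med = statistics.median(delays_ms)
    return statistics.median(abs(d - med) for d in delays_ms)

# A single 80 ms spike leaves the MAD at 1 ms,
# while the standard deviation would exceed 27 ms:
print(jitter_mad_ms([10, 12, 11, 80, 10]))  # → 1
```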
Loss is calculated by counting the number of packets lost (those whose replies time out) and dividing that by the total number of packets sent for each sample period. Lost packets do not count towards delay or jitter calculations.
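As a sketch, the per-period loss ratio is just:

```python
def loss_pct(sent, received):
    """Packet loss for one sample period, as a percentage.
    A packet counts as lost once its reply timeout expires;
    those packets are excluded from delay/jitter math."""
    if sent == 0:
        return 0.0
    return 100.0 * (sent - received) / sent

# 3 of 60 packets lost in a 1-minute period at 1 packet/second:
print(loss_pct(60, 57))  # → 5.0
```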
Availability is calculated by measuring, within each sample period, how long loss was continuous for 10 or more seconds. For each sample period, the available and unavailable time are recorded, so availability measured over long periods is highly accurate.
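The rule can be sketched as below. The 10-second threshold comes from the text; the per-second boolean loss input is an assumed data shape for illustration:

```python
def availability_pct(loss_by_second, outage_threshold_s=10):
    """Availability for a sample window: the share of time NOT inside a
    run of `outage_threshold_s` or more consecutive lossy seconds.

    loss_by_second: one boolean per second, True if that second saw loss.
    """
    total = len(loss_by_second)
    unavailable = 0
    run = 0  # length of the current consecutive-loss run
    for lossy in loss_by_second:
        if lossy:
            run += 1
        else:
            if run >= outage_threshold_s:
                unavailable += run  # run was long enough to count as an outage
            run = 0
    if run >= outage_threshold_s:  # run extending to the end of the window
        unavailable += run
    return 100.0 * (total - unavailable) / total

# A 12-second outage in a 60-second window → 80% available;
# a 5-second loss burst is below the threshold and doesn't count:
window = [False] * 20 + [True] * 12 + [False] * 28
print(availability_pct(window))  # → 80.0
```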
NQS is the Network Quality Score, a measure of layer 3 network quality calculated by taking into account delay, jitter and loss. It is based on the MOS (mean opinion score) formula used for VoIP quality scoring.
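Sinefa's exact NQS formula is not published here. For illustration only, a widely used simplified E-model approximation (per ITU-T G.107-style VoIP scoring, not necessarily the vendor's formula) converts delay, jitter, and loss into an R-factor and then a MOS-like score:

```python
def mos_estimate(delay_ms, jitter_ms, loss_pct):
    """Illustrative MOS estimate from the common simplified E-model
    approximation. This is an assumption about the kind of formula a
    MOS-based score uses, not Sinefa's actual NQS calculation."""
    # Jitter is weighted as extra effective latency in this approximation.
    effective_latency = delay_ms + 2 * jitter_ms + 10
    if effective_latency < 160:
        r = 93.2 - effective_latency / 40
    else:
        r = 93.2 - (effective_latency - 120) / 10
    r -= 2.5 * loss_pct  # each percent of loss costs 2.5 R-factor points
    r = max(r, 0)
    # Map the R-factor onto the 1-5 MOS scale.
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

# A healthy link (20 ms delay, 2 ms jitter, no loss) scores around 4.4:
print(round(mos_estimate(20, 2, 0), 2))
```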
By default, one packet is sent per second from Probes, or one packet per 10 seconds from Endpoint Agents, and the sample period is 1 minute. This default configuration consumes less than 5MB of data per test, per day.
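A back-of-envelope check of the data budget for the Probe case, assuming a small probe packet of around 56 bytes on the wire (the packet size is an assumption, not a published figure):

```python
# Probe default: 1 packet per second, all day.
PACKETS_PER_DAY = 1 * 60 * 60 * 24       # 86,400 packets
ASSUMED_PACKET_BYTES = 56                # small probe packet incl. headers (assumption)

mb_per_day = PACKETS_PER_DAY * ASSUMED_PACKET_BYTES / 1_000_000
print(f"{mb_per_day:.1f} MB/day")  # → 4.8 MB/day, consistent with "less than 5MB"
```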