How to choose a sampling interval for mapping

Jakob Konradsen
CQO & co-founder

The sampling interval determines how much of what actually happened in your environment ends up in the record, and a gap in data can become a finding in an audit. Understand how to choose the right interval, what the guidance actually says, and why inspectors are increasingly asking for documented rationale.

Visualize how the time between samples (interval) affects data fidelity.

What is a sampling interval in temperature mapping?

A sampling interval is the frequency at which a data logger records a temperature measurement. It determines how much of what actually happened in your environment ends up in the data record.

A mapping study is only as reliable as the data it captures. Calibration gets most of the attention, but a factor of equal importance often goes unchallenged in protocol reviews: the sampling interval.

A data logger does not record temperature continuously. It "wakes up" at a set interval, takes a snapshot, saves the value, and goes back to sleep. The software then connects those snapshots into a line graph, and everything between two data points is interpolation, not measurement.

In other words, the sampling interval is simply how much time passes between those snapshots.
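To make the snapshot-and-interpolate behavior concrete, here is a minimal Python sketch. The temperature curve and interval values are illustrative, not taken from any real logger:

```python
import math

def true_temperature(t_min: float) -> float:
    """Hypothetical 'true' temperature curve (degC) over time in minutes."""
    return 5.0 + 0.8 * math.sin(t_min / 20.0)

def sample(interval_min: float, duration_min: float):
    """Record one snapshot every `interval_min` minutes, like a data logger."""
    n = int(duration_min // interval_min) + 1
    return [(i * interval_min, true_temperature(i * interval_min)) for i in range(n)]

# A 1-minute interval yields 61 snapshots over an hour; 15 minutes yields 5.
print(len(sample(1, 60)))   # 61
print(len(sample(15, 60)))  # 5
```

Everything between consecutive tuples in the returned list is what charting software later fills in by drawing a straight line, not what the environment actually did.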

What happens when the sampling interval is too wide

The frequency at which a data logger records a measurement determines how much of the thermal picture you actually see. Set the interval too wide, and the final report may present a stable environment that never existed. Set it too narrow, and you are drowning in data points without gaining useful resolution.

For a validation engineer, the question is straightforward: what happened to the temperature between those two dots? If the interval is short enough, the answer is "probably nothing significant." If it is not, you may have gaps in your data that no amount of post-hoc analysis can fill.

This matters for three reasons that show up in audits and batch decisions.

Missed excursions

A compressor failure that lasts eight minutes will appear in data logged every minute. It may be completely invisible in data logged every fifteen minutes if the system recovers between samples. The mapping report would show a stable environment, even though a real excursion occurred and went unrecorded.

In GxP environments, undocumented excursions are not harmless. If an inspector reviews your mapping data alongside HVAC event logs and finds a cooling failure that does not appear in the temperature record, the gap itself becomes the finding.
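The eight-minute compressor failure described above can be sketched numerically. This is a toy model with invented temperatures and timings, but it shows how the same event is either recorded or invisible depending on the interval:

```python
def true_temperature(t_min: float) -> float:
    """Baseline 5 degC with a hypothetical 8-minute excursion starting at t=20."""
    if 20 <= t_min < 28:
        return 12.0  # compressor failure pushes the temperature up
    return 5.0

def logged_max(interval_min: int, duration_min: int = 60) -> float:
    """Highest value a logger would record at the given interval."""
    return max(true_temperature(t) for t in range(0, duration_min + 1, interval_min))

print(logged_max(1))   # 12.0 — the excursion appears in the record
print(logged_max(15))  # 5.0  — samples at t=0,15,30,45,60 all miss the event
```

With 15-minute logging, the system has recovered before the next sample fires, and the report shows a perfectly stable environment.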

MKT calculation accuracy

Mean kinetic temperature calculations depend on the granularity of the input data. USP <1079.2> defines specific evaluation windows for excursion assessment — 30 consecutive days for controlled room temperature, 24 consecutive hours for controlled cold temperature. If the underlying data has 15-minute gaps, short but significant temperature spikes will be averaged into the calculation rather than captured as discrete events. The result is an MKT value that understates actual thermal exposure.

Audit defensibility

When you present mapping results during an inspection, the sampling interval is part of the story. Inspectors expect the interval to be justified by a documented rationale tied to the thermal characteristics of the environment. "We used 5 minutes because that is what the logger defaults to" is not a defensible answer. "We used 5 minutes because the area's recovery time from door-opening events is 8–12 minutes, and this interval captures at least two data points per event" is.

Also read: GDP temperature mapping: protocol template and guidelines

What WHO and ISPE actually require

A common rule of thumb is that GxP guidelines suggest intervals between 1 and 15 minutes. That is roughly accurate but worth unpacking.

WHO TRS 961 Annex 9 Supplement 8 specifies that data loggers should have a programmable sampling period with intervals ranging from one second to beyond one hour. This describes the capability requirement for loggers, not a recommended interval for studies. The WHO does not prescribe a single interval for all applications.

The ISPE Good Practice Guide for Controlled Temperature Chambers elaborates on recording intervals and provides rationales that connect interval selection to the thermal behavior of the mapped space. Dickson and other industry sources cite 1–15 minutes as a typical range for warehouse mapping, with shorter intervals for equipment that changes temperature more rapidly, such as freezers.

Eupry's own WHO mapping guidelines page notes that five-minute intervals will often be sufficient, but emphasizes that this should be based on your risk assessment, not applied as a blanket default.

The common thread across all of these sources is that the interval should be justified by the rate of thermal change in the environment, not chosen by convenience or logger defaults.


8-step checklist: WHO’s temperature mapping guidelines

Get a step-by-step guide to implementing the principles in our WHO temperature mapping guideline checklist.


How do you choose the right sampling interval for mapping your environment?

Shorter is not always better. A 10-second interval on a 72-hour warehouse study generates over 25,000 data points per sensor. Multiply that across 30 or more loggers, and you have a dataset that is expensive to store, slow to analyze, and adds no value if the warehouse temperature changes meaningfully only over minutes, not seconds.

The decision depends on three factors.

Rate of change. How fast does the environment respond to disturbances? A shipping dock with frequent door openings and direct exposure to outside air changes temperature rapidly, sometimes within a few minutes. A large cold room with high thermal mass and stable HVAC may take an hour to shift by a single degree. The interval should be short enough to capture at least two data points during the fastest expected temperature change.
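The "at least two data points per fastest expected change" rule can be expressed as a one-line calculation, conceptually similar to a Nyquist margin. The function and its defaults are our own sketch, not from any guideline:

```python
def max_interval_minutes(fastest_event_min: float, points_per_event: int = 2) -> float:
    """Longest interval that still guarantees `points_per_event` samples
    within the fastest expected temperature change."""
    return fastest_event_min / points_per_event

# A dock where door-opening excursions recover in ~8 minutes:
print(max_interval_minutes(8))  # 4.0 — log every 4 minutes or faster
```

The recovery time itself comes from observation or a pilot study of the environment; the formula only converts that observation into an interval ceiling.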

Also read: Thermal mass and thermal lag: why air alarms do not tell the full story

Risk assessment. What is the consequence if a short excursion goes unrecorded? For a warehouse storing controlled room temperature products with broad stability budgets, a missed two-minute fluctuation is unlikely to matter. For a freezer holding biologics at -70°C (-94°F), the same gap could mask a critical event.

Practical constraints. Battery life, memory capacity, and data processing time all scale with logging frequency. If your quality system cannot efficiently handle the data volume, the extra resolution creates a new problem without solving the original one. The right interval balances fidelity with operational feasibility.

A reasonable starting framework for most pharmaceutical mapping studies:

  • Freezers and ULT equipment: 10 seconds to 3 minutes, depending on door frequency and recovery characteristics
  • Refrigerators and cold rooms: 1 to 5 minutes
  • Ambient warehouses: 5 to 15 minutes, with shorter intervals near docks and doors

These ranges align with what we recommend for mapping equipment selection and should always be documented in the mapping protocol with a rationale tied to the specific environment.
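For teams encoding these starting points into a protocol template or tooling, the ranges above can be captured as a simple lookup. The keys and structure are hypothetical; the values mirror the list above and must still be adjusted per risk assessment:

```python
# Starting interval ranges in minutes (low, high); illustrative only.
STARTING_INTERVALS_MIN = {
    "freezer_ult": (10 / 60, 3),          # 10 seconds to 3 minutes
    "refrigerator_cold_room": (1, 5),
    "ambient_warehouse": (5, 15),          # shorter near docks and doors
}

def starting_range(environment: str) -> tuple:
    """Return the documented starting range for an environment type."""
    return STARTING_INTERVALS_MIN[environment]

print(starting_range("ambient_warehouse"))  # (5, 15)
```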

Also see: How to create a temperature mapping protocol in pharma

Tool

Sensor Sampling Simulator: How interval changes affect your data

To move beyond theoretical understanding, we have built a Sensor Sampling Simulator to provide a dynamic way to see how data changes as the interval is adjusted.

The tool presents a simulated "true temperature" curve alongside the data points a logger would actually record at your chosen interval.

How to use the tool

  • Adjust the sampling interval: Use the slider to increase or decrease the interval (e.g., from 1 minute to 30 minutes) and watch the logged points move further apart.
  • Simulate a peak event: Insert a "simulated peak event" into the timeline to see how a sudden, sharp temperature rise is represented (or misrepresented) at different sampling frequencies.
  • Observe the fidelity loss: As the interval increases, especially during a peak event, the "Logged Data" line deviates from the "True" curve: the recorded peak height falls below the actual peak, or the event is missed entirely as resolution decreases.
  • Analyze the impact: This visualization demonstrates why a high-turnover area (like a shipping dock) requires a much shorter interval than a high-thermal-mass area (like a large cold room).
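The peak-attenuation effect the simulator illustrates can also be reproduced in a few lines. The triangular spike below is a made-up event, but the mechanism is the same: a coarse grid rarely lands on the true peak, so the recorded maximum understates it.

```python
def true_temp(t: float) -> float:
    """Triangular spike peaking at 12 degC at t=32 min over a 5 degC baseline."""
    return 5.0 + max(0.0, 7.0 - abs(t - 32.0))

def recorded_peak(interval_min: int, duration_min: int = 60) -> float:
    """Highest value recorded when sampling every interval_min minutes."""
    return max(true_temp(t) for t in range(0, duration_min + 1, interval_min))

print(recorded_peak(1))   # 12.0 — fine sampling lands on the true peak
print(recorded_peak(15))  # 10.0 — the nearest sample (t=30) understates it
```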

Note: The simulator is an educational tool designed to illustrate how sampling frequency affects data representation. It does not recommend specific intervals for your application. Final logging intervals should always be determined by the thermal characteristics of the equipment, the products stored, and your applicable GxP requirements.

FAQ

5 questions and answers about sampling intervals

What sampling interval should I use for mapping?

It depends on the environment. Freezers typically need 10 seconds to 3 minutes; warehouses 5 to 15 minutes. Justify the choice in your protocol.

Learn more about temperature mapping in GxP.

Can a long sampling interval cause me to miss a temperature excursion?

Yes. If the environment recovers between samples, a real excursion can go entirely unrecorded in your mapping data.

Do WHO or ISPE guidelines specify a required sampling interval?

No single interval is mandated. WHO requires programmable intervals; ISPE links interval choice to the thermal behavior of the space.

Learn more about the WHO's mapping guidelines.

How does the sampling interval affect MKT calculations?

Wide intervals can smooth out short temperature spikes, producing MKT values that understate actual thermal exposure.

Also see: USP <1079.2> in action: How Eupry’s alarm handling provides faster, safer excursion evaluation

What is the trade-off of using very short sampling intervals?

Shorter intervals increase data volume, reduce battery life, and extend analysis time. Balance fidelity with practical constraints.

Temperature mapping solutions built for GxP

Avoid delays, errors, and compliance risks with Eupry’s complete temperature mapping services and equipment.

  • Audit-ready mapping with a click: Digital protocols, analysis, and GxP-compliant reports in the software.
  • Validated right, every time: No costly errors or delays with expert guidance and live insights.
  • Minimal disruptions: Fast and flexible setup keeps operational downtime to a minimum.