Time Measurement and Reporting in Adobe Analytics

In a world where user attention is the ultimate currency, truly understanding how people spend their time on your digital properties is no longer optional — it’s the difference between guessing and knowing.

Adobe Analytics remains one of the most powerful tools for measuring time-based behaviour, but only if you understand how it actually works under the hood. Most analysts use “Time Spent” metrics every day without realising the critical mechanics: time is always calculated backwards, attributed to the previous hit, and the last page or event in a visit gets zero seconds by default — unless you close the loop properly.

This comprehensive guide — fully updated and battle-tested through dozens of enterprise implementations — will show you exactly:

  • How time is really calculated (with clear rules and hit-by-hit examples)
  • The four essential time-spent dimensions you should be using in 99% of analyses
  • How to capture true end-to-end process timing (checkout, forms, applications) that native metrics can’t measure

Whether you’re trying to spot content that’s actually being read, identify channels bringing high-value attention, or prove that your new checkout flow cut completion time in half, mastering time measurement is your superpower.

Let’s stop treating time as a vanity metric and turn it into one of your most trustworthy leading indicators. Every second is waiting to tell its story — here’s how to listen correctly. 🚀

Why Time Matters in Digital Analytics

Time-based metrics are the heartbeat of user engagement analysis. They help answer critical questions like: How long do visitors linger on your content? Which pages hold attention, and which cause quick exits?

Adobe Analytics excels here by providing nuanced time measurements that go beyond simple averages. These metrics tie directly to user behaviour, influencing everything from content strategy to A/B testing. Start with time data as your foundation, and you’ll build a robust understanding of your audience’s journey.

Key Time Metrics

| Metric | Level | Description | Best For |
| --- | --- | --- | --- |
| Total Time Spent | Any | Sum of all seconds attributed to a dimension item (page, campaign, etc.) | Overall engagement totals |
| Time Spent per Visit | Visit | Total or average time spent in visits where the dimension persists | Session-level engagement |
| Average Time Spent on Site | Visitor | Average across visits (excludes bounces), shown in seconds or hh:mm:ss | Site-wide benchmarks |
| Time Spent per Visitor | Visitor | Lifetime average time spent per unique visitor (cookie-based) | Loyalty & long-term engagement |
| Time Spent on Page | Hit | Seconds spent on an individual page view or link event | Page-level consumption analysis |
Key Time Dimensions

| Dimension | Type | Description | When to Use |
| --- | --- | --- | --- |
| Time Spent per Visit – Bucketed | Visit | Adobe's out-of-the-box visit-level time dimension with pre-defined ranges | Quick visit-level engagement segmentation with zero setup |
| Time Spent per Visit – Granular | Visit | The underlying second-level dimension; you can define your own time ranges from it | Tailored engagement tiers (e.g., "Highly Engaged" = >5 min, "Bounce" = <15 sec) |
| Time Spent on Page – Bucketed | Hit | Adobe's out-of-the-box hit-level time dimension with pre-defined ranges | Instantly identify skimmed vs. deeply consumed content without setup |
| Time Spent on Page – Granular | Hit | The underlying second-level dimension; you can define new ranges | Apply business-specific attention thresholds (e.g., <8 sec = bounce, >90 sec = high attention) |

How Time is Calculated: Behind the Scenes

Adobe Analytics calculates time based on the timestamp difference between consecutive hits (any server call). The time is always attributed backwards to the previous hit — regardless of hit type.

Core Rules (apply to every hit type)

  1. First hit in a visit → credits 0 seconds when it fires (there is no previous hit to receive time)
  2. Every subsequent hit → the elapsed time since the previous hit is credited to that prior hit
  3. Last hit in a visit → itself receives 0 seconds (nothing follows it to close the timer)
  4. Single-hit (bounce) visits → 0 seconds total
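
To see these rules in action, here's a minimal sketch (plain JavaScript, an illustration only — not Adobe's actual processing code) of how backward attribution distributes time across hits:

/* Minimal sketch of backward time attribution — illustration only */
function attributeTimeSpent(hits) {
    // hits: [{ name: "Home", ts: 0 }, ...], ts = seconds since visit start
    var timeSpent = hits.map(function (h) { return { name: h.name, seconds: 0 }; });
    for (var i = 1; i < hits.length; i++) {
        // Elapsed time since the previous hit is credited to that previous hit
        timeSpent[i - 1].seconds += hits[i].ts - hits[i - 1].ts;
    }
    // The last hit — and a single-hit visit — keeps 0 seconds
    return timeSpent;
}

attributeTimeSpent([
    { name: "Home", ts: 0 },
    { name: "Product", ts: 35 },
    { name: "Exit Link", ts: 52 }
]);
// → Home: 35s, Product: 17s, Exit Link: 0s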

Hit Types and Their Impact on Time Spent

| Hit Type | Counts as a hit? | Can it receive time? | Real-world Example & Effect |
| --- | --- | --- | --- |
| Page View | Yes | Yes | Standard page load → normal backward attribution |
| Custom Link Event (trackLink) | Yes | Yes | "Download PDF" button → time between the PDF click and the previous page is credited to that page |
| Exit Link | Yes | Yes | User clicks an outbound link → time spent on the final page is finally captured |
| Download Link | Yes | Yes | File download → prevents the final page from getting 0 seconds |
| Video Heartbeat (milestones, play, pause, complete) | Yes | Yes | Video play → 15 sec heartbeat → 15 seconds credited to the video page; video complete → captures final time |
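
For reference, these hit types map to standard AppMeasurement calls — each of the following creates a timestamped server call that can close the timer on the previous hit (link names and URLs here are illustrative):

s.t();                                          // page view hit

s.tl(true, "o", "Add to Cart");                 // custom link hit (trackLink, type "o")
s.tl(true, "d", "whitepaper.pdf");              // download link hit (type "d")
s.tl(true, "e", "https://partner.example.com"); // exit link hit (type "e") —
                                                // finally captures time on the last page

In many implementations, download and exit links are also captured automatically when s.trackDownloadLinks and s.trackExternalLinks are enabled.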

Example Journey with Mixed Hit Types

| # | Hit Type | Timestamp | Elapsed | Time Credited To | eVar50 = "Video Campaign" (visit expiration) | prop50 = page/event name (hit-only) | Result: where does the time get allocated? |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Page View: Home | 0s | 0s | — | (not set) | "Home" | First hit → always 0s |
| 2 | Page View: Product Page | 35s | 35s | Hit #1: Home → 35s | "Video Campaign" (set here) | "Product Page" | 35s credited to Home; eVar50 now persists from this hit onward |
| 3 | Custom Link: "Add to Cart" | 52s | 17s | Hit #2: Product Page → 17s | "Video Campaign" (persisted) | "Add to Cart" | 17s to Product Page; eVar50 still active |
| 4 | Video Heartbeat (play) | 55s | 3s | Hit #3: "Add to Cart" link → 3s | "Video Campaign" | (no prop set) | 3s credited to the custom link hit; eVar50 receives it in reports |
| 5 | Video Heartbeat (25%) | 80s | 25s | Hit #4: previous heartbeat → 25s | "Video Campaign" | (no prop set) | 25s to the previous heartbeat hit; eVar50 continues to accumulate time |
| 6 | Video Complete | 140s | 60s | Hit #5: previous heartbeat → 60s | "Video Campaign" | (no prop set) | Another 60s to video-related hits; eVar50 gets full credit |
| 7 | Custom Link: Download PDF | 155s | 15s | Hit #6: Video Complete → 15s | "Video Campaign" | "Download PDF" | 15s credited to the video-complete event; eVar50 still active |
| 8 | Exit Link: outbound | 170s | 15s | Hit #7: PDF download → 15s | "Video Campaign" (final hit) | "Exit Link" | Final 15s captured thanks to the exit link; eVar50 receives it too |

Total visit time captured = 170 seconds (only possible because the exit link hit closed the timer).

eVar vs. Prop – How Time Allocation Actually Works

| Variable Type | Persistence | Receives Time Spent From | Most Common Use in Time Reporting |
| --- | --- | --- | --- |
| prop | Hit-level only | Only the exact hit where the prop is set | Page name, section, server, hit-level content grouping |
| eVar | Visit, custom expiration, or forever (allocation: Most Recent/Last Touch by default) | Every subsequent hit until the eVar expires or is overwritten | Campaigns, internal search, content category, logged-in status, etc. |

What actually happened in the journey above

  • Page Name / prop50 report → Time Spent is split across many different hits (Home 35s, Product 17s, custom links & heartbeats get the rest). The last page looks terrible.
  • Campaign eVar50 = “Video Campaign” report → Time Spent = 135 of the 170 seconds (everything from hit #2 onward), because the eVar was set on hit #2 and persisted through the rest of the visit; the first 35s, credited to hit #1 before the eVar was set, fall under “Unspecified”.

Rule of Thumb You Can Take to the Bank

  • Use props or the Page dimension when you want true page-level or hit-level time spent.
  • Use eVars with Visit (or longer) expiration when you want time allocated to campaigns, content groups, customer segments, etc.
  • Never be surprised when an eVar shows 5–20× more “Time Spent” than the corresponding prop/page — that’s exactly how Adobe Analytics is designed to work.
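
To make the difference concrete, here's how both variables might be set on the Product Page hit from the journey above (a minimal sketch; the variable numbers simply mirror the example):

s.pageName = "Product Page";
s.prop50 = "Product Page";   // hit-level: credited only with time attributed to this exact hit
s.eVar50 = "Video Campaign"; // visit expiration: accumulates time from every
                             // subsequent hit until it expires or is overwritten
s.t();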

Key Takeaways for Accurate Time Measurement

  • Any properly implemented s.tl() or heartbeat = a timestamp → prevents time loss
  • The last hit wins for capturing remaining time — always track downloads, outbound exits, video completes, etc.
  • Custom link events themselves can receive time if something happens after them
  • If you only fire page views, you will massively under-report time spent on the final page of every visit

Implement those non-page-view hits correctly, and your time-spent numbers finally become trustworthy.

Practical Reporting in Analysis Workspace

| Goal | Recommended Combo | What You'll See Instantly |
| --- | --- | --- |
| True page consumption ranking | Pages → break down by Time Spent on Page – Granular or – Bucketed | Which pages users actually read vs. skim |
| High-engagement sessions | Visits → break down by Time Spent per Visit – Granular or – Bucketed | Which sessions are deeply engaged vs. quick bounces |
| Content performance heatmap | Pages + Occurrences + Time Spent on Page – Bucketed (custom <8s, 8–30s, etc.) | Red-flag pages with high views but low time spent |
| Channel quality beyond volume | Marketing Channel + Time Spent per Visit – Bucketed (>3 min = high engagement) | Which channels bring valuable, engaged traffic |
| Mobile UX issues | Device Type + Time Spent on Page – Granular | Are mobile users bouncing faster than desktop? |

30-second quick win

  1. Drag dimension Pages
  2. Break down by Time Spent on Page – Granular
  3. Add metric Occurrences side-by-side
    → Instantly reveals hidden gems and content that’s being ignored.

Best Practices & Common Pitfalls

  • Enable exit-link, download, and video heartbeat tracking to capture the final-hit time
  • Background tabs inflate time — always cross-validate with scroll or interaction events
  • Use Visit-level dimensions with visit metrics, Hit-level with page metrics
  • Build your own buckets from the – Granular dimensions — the default – Bucketed ranges are good, but rarely perfect for your KPIs
  • Exclude bots & internal traffic — they destroy time averages

Measuring Process & Transaction Completion Time

While the native “Time Spent” metrics excel at measuring passive engagement, many businesses need to actively measure how long it takes users to complete a specific process (e.g., checkout, lead form, registration, application funnel). This is fundamentally different from backwards-attributed time spent — it is forward-looking, event-to-event elapsed time.

Why Native Time Spent Is Not Enough for Processes

  • Native time is always credited backwards to the previous hit.
  • The last page/event in a multi-step flow usually gets 0 seconds (unless an exit/download/video-complete hit exists).
  • You cannot natively answer “How long did the average successful checkout take?” using only Page/eVar + Time Spent metrics.

The industry-standard (and Adobe-recommended) way is to explicitly record the start timestamp and end timestamp of the process and calculate the difference.
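
The idea in its simplest form looks like this — a minimal sketch assuming a single-tab flow, with a hypothetical sessionStorage key and the same event numbers used later in this section:

// On the first step of the process (e.g., checkout step 1):
sessionStorage.setItem("checkoutStart", String(Date.now()));

// On the confirmation page:
var start = Number(sessionStorage.getItem("checkoutStart"));
if (start) {
    var elapsedSeconds = Math.round((Date.now() - start) / 1000);
    s.events = "event102=" + elapsedSeconds;    // numeric event holding the elapsed time
    sessionStorage.removeItem("checkoutStart"); // reset so the next run starts clean
}

This works, but the plug-in below adds cookie-based persistence across pages, time formatting, benchmark rounding, and reset handling for you.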

Adobe provides the getTimeBetweenEvents plug-in to measure the time between events automatically. Official reference can be found at: https://experienceleague.adobe.com/en/docs/analytics/implementation/vars/plugins/gettimebetweenevents

First, implement the plug-in code in your AppMeasurement file or via the Common Analytics Plugins extension in Adobe Experience Platform Data Collection.

Plug-in code (paste after s_gi() instantiation or use the extension to initialise):

/* Adobe Consulting Plugin: getTimeBetweenEvents v3.0 (AppMeasurement highly recommended) */
function getTimeBetweenEvents(ste,rt,stp,res,cn,etd,fmt,bml,rte){var v=ste,B=rt,x=stp,C=res,k=cn,m=etd,E=fmt,F=bml,p=rte;if("-v"===v)return{plugin:"getTimeBetweenEvents",version:"3.0"};var q=function(){if("undefined"!==typeof window.s_c_il)for(var c=0,b;c<window.s_c_il.length;c++)if(b=window.s_c_il[c],b._c&&"s_c"===b._c)return b}();if("undefined"!==typeof q&&(q.contextData.getTimeBetweenEvents="3.0",window.cookieWrite=window.cookieWrite||function(c,b,d){if("string"===typeof c){var n=window.location.hostname,f=window.location.hostname.split(".").length-1;if(n&&!/^[0-9.]+$/.test(n)){f=2<f?f:2;var l=n.lastIndexOf(".");if(0<=l){for(;0<=l&&1<f;)l=n.lastIndexOf(".",l-1),f--;l=0<l?n.substring(l):n}}g=l;b="undefined"!==typeof b?""+b:"";if(d||""===b)if(""===b&&(d=-60),"number"===typeof d){var e=new Date;e.setTime(e.getTime()+6E4*d)}else e=d;return c&&(document.cookie=encodeURIComponent(c)+"="+encodeURIComponent(b)+"; path=/;"+(d?" expires="+e.toUTCString()+";":"")+(g?" domain="+g+";":"") ,"undefined"!==typeof window.cookieRead)?window.cookieRead(c)===b:!1}},window.cookieRead=window.cookieRead||function(c){if("string"===typeof c)c=encodeURIComponent(c);else return"";var b=" "+document.cookie,d=b.indexOf(" "+c+"="),e=0>d?d:b.indexOf(";",d);return(c=0>d?"":decodeURIComponent(b.substring(d+2+c.length,0>e?b.length:e)))?c:""},window.formatTime=window.formatTime||function(c,b,d){function e(b,d,c,e){if("string"!==typeof d)return!1;if("string"===typeof b)b=b.split(c||",");else if("object"!==typeof b)return!1;c=0;for(a=b.length;c<a;c++)if(1==e&&d===b[c]||d.toLowerCase()===b[c].toLowerCase())return!0;return!1}if(!("undefined"===typeof c||isNaN(c)||0>Number(c))){var f="";"string"===typeof b&&"d"===b||("string"!==typeof b||!e("h,m,s",b))&&86400<=c?(b=86400,f="days",d=isNaN(d)?1:b/(d*b)):"string"===typeof b&&"h"===b||("string"!==typeof b||!e("m,s",b))&&3600<=c?(b=3600,f="hours",d=isNaN(d)?4:b/(d*b)):"string"===typeof b&&"m"===b||("string"!==typeof b||!e("s",b))&&60<=c?(b=60,f="minutes",d=isNaN(d)?2:b/(d*b)):(b=1,f="seconds",d=isNaN(d)?.2:b/d);f=Math.round(c*d/b)/d+" "+f;0===f.indexOf("1 ")&&(f=f.substring(0,f.length-1));return f}},window.inList=window.inList||function(c,b,d,e){if("string"!==typeof b)return!1;if("string"===typeof c)c=c.split(d||",");else if("object"!==typeof c)return!1;d=0;for(a=c.length;d<a;d++)if(1==e&&b===c[d]||b.toLowerCase()===c[d].toLowerCase())return!0;return!1},"string"===typeof v&&"undefined"!==typeof B&&"string"===typeof x&&"undefined"!==typeof C)){k=k?k:"s_tbe";m=isNaN(m)?1:Number(m);var r=!1,t=!1,y=v.split(","),z=x.split(",");p=p?p.split(","):[];for(var u=window.cookieRead(k),w,D=new Date,A=D.getTime(),h=new Date,e=0;e<p.length;++e)if(window.inList(q.events,p[e])){h.setDate(h.getDate()-1);window.cookieWrite(k,"",h);return}h.setTime(h.getTime()+864E5*m);for(e=0;e<y.length&&!r&&(r=window.inList(q.events,y[e]),!0!==r);++e);for(e=0;e<z.length&&!t&&(t=window.inList(q.events,z[e]),!0!==t);++e);1===y.length&&1===z.length&&v===x&&r&&t?(u&&(w=(A-u)/1E3),window.cookieWrite(k,A,m?h:0)):(!r||1!=B&&u||window.cookieWrite(k,A,m?h:0),t&&u&&(w=(D.getTime()-u)/1E3,!0===C&&(h.setDate(h.getDate()-1),window.cookieWrite(k,"",h))));return w?window.formatTime(w,E,F):""}}};
/******************************************** END CODE TO DEPLOY ********************************************/

Then, call the plug-in in the doPlugins function on every hit:

s.doPlugins = function (s) {
    // Set s.events on the appropriate hits elsewhere in your code
    // For example, on start hit: s.events = s.apl(s.events, "event100", ",", 2);
    // On end hit: s.events = s.apl(s.events, "event101", ",", 2);

    var elapsed = getTimeBetweenEvents("event100", true, "event101", true, "", 0, "s");
    if (elapsed) {
        // The plug-in returns a labelled string such as "45 seconds" —
        // strip the unit before writing it into a numeric event
        var seconds = parseInt(elapsed, 10);
        s.events = s.apl(s.events, "event102=" + seconds, ",", 1); // numeric event carrying the elapsed time
        s.eVar10 = elapsed; // optional: formatted string for classification/bucketing
    }
};

(Figure: capturing and calculating process elapsed time)

What happens behind the scenes:

  • When a hit includes a start event (e.g., event100 in s.events), the plug-in stores the current timestamp in a cookie.
  • When a hit includes a stop event (e.g., event101), the plug-in calculates the elapsed time since the start, formats it (here in seconds, e.g., “45 seconds”), and returns it as a string.
  • The numeric part of that string is appended as a numeric event value (event102=45) for reporting, and the formatted string can be set to an eVar for classification/bucketing.
  • The cookie is automatically reset after recording when res=true.

Key plug-in features:

  • Supports seconds, minutes, hours, or days granularity via the fmt parameter.
  • Works across pages and visits if you adjust etd (expiration in days) and rt (restart timer).
  • Can round to custom benchmarks via bml.
  • Use the rte parameter for events that explicitly reset the timer.

Note: This plug-in relies on cookies for persistence. It is not supported in Web SDK yet. Ensure you have the apl (append to list) plug-in for safely adding events.

Reporting the Results in Analysis Workspace

| Goal | Recommended Report | Key Metrics / Breakdowns |
| --- | --- | --- |
| Average completion time | Custom Event 102 (from plug-in) or your numeric event | Instances + Average (Workspace auto-calculates) |
| Completion time distribution | Your numeric event → break down by custom buckets (0–30s, 30–60s, 1–3 min, 3–10 min, 10+ min) | % of conversions in each speed tier |
| Funnel drop-off vs. speed | Fallout report on steps → add segment where completion time < 2 min vs. > 5 min | Identify friction points that cause slow completions |
| Channel → checkout speed | Marketing Channel → metric = your completion-time numeric event | Which channels bring fast-converting traffic |
| Device performance on forms | Device Type → completion-time event | Spot mobile slowness |

Best Practices for Process Timing

  1. Always fire the start event before any possible abandonment (e.g., on first step view, not on “Continue” click).
  2. Use a numeric or counter event for the elapsed time so you get proper averages and can bucket it.
  3. Create a “Process Speed” calculated metric: Average(Completion Time Event).
  4. Segment “Fast Conversions” (< 90 seconds) vs. “Slow Conversions” (> 5 min) to correlate with revenue, satisfaction scores, etc.
  5. Reset the timer appropriately (plug-in handles this via the reset parameter).
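
Putting practices 1 and 2 together, here is an illustrative sketch of where the start and stop events might fire (the event numbers match the doPlugins example earlier; configure them as events in your own report suite):

// On the first step view of the process — before any possible abandonment:
s.events = s.apl(s.events, "event100", ",", 2); // start the timer
s.t();

// On the confirmation / success page:
s.events = s.apl(s.events, "event101", ",", 2); // stop the timer
s.t(); // doPlugins (above) appends event102=<seconds> to this same hit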

With this approach, you finally get accurate answers to questions like:

  • “How long does the average checkout take?”
  • “Are paid search users completing registration 40% faster than organic?”
  • “Did our new one-page checkout reduce average time-to-purchase from 180 s to 72 s?”

Combine native time spent analysis (for engagement) with explicit process-timer tracking (for efficiency), and you’ll have the complete time picture in Adobe Analytics.

Final Thoughts

Mastering time measurement in Adobe Analytics gives you two superpowers:

  1. Passive engagement insight – using the four core time-spent dimensions
    (Time Spent per Visit – Granular/Bucketed and Time Spent on Page – Granular/Bucketed) together with the standard time metrics to reveal exactly where attention and friction live across pages, content, campaigns, and channels.
  2. Active process efficiency insight – explicitly measuring start-to-finish elapsed time for critical journeys (checkout, registration, lead forms, applications, etc.) with tools like the getTimeBetweenEvents plug-in or manual timestamping. This finally lets you answer questions that native time spent metrics can’t touch:
    “How long does the average successful checkout actually take?”
    “Which channels or devices drive the fastest conversions?”
    “Did our UX improvement shave 90 seconds off registration time?”

Combine both approaches — native backwards-attributed time spent for understanding engagement depth, and forward-looking process timers for measuring conversion efficiency — and you’ll have the complete, trustworthy picture of how time flows through your digital experience.

The data is waiting. Go make every second count. 🚀

Questions, war stories, or breakthrough reports? Drop them in the comments — I read every one!

