Analytics
Ecommerce Analytics and Reporting Services
Pixeltree builds ecommerce analytics: GA4, server-side tagging, MER attribution, dashboards, cohort analysis, and LTV models for DTC decision-making.
What we offer
Services under Ecommerce Analytics and Reporting Services.
Why Pixeltree
Built for operators, not orgs.
Senior operators only
No junior handoffs. The person scoping the work is the person doing the work.
Fixed-scope, productized
Clear deliverables, clear price, clear timeline. No retainer sprawl.
No long lock-ins
Month-to-month on retainers. Cancel anytime. We earn the renewal.
How we work
Our approach.
Most DTC brands in 2026 are flying partially blind. Not because they lack data, but because the data they have is contradictory, delayed, or quietly wrong. Meta says it drove the sale. Google claims the same sale. Shopify shows a third number. The finance team builds a spreadsheet that reconciles none of them. Somewhere in that gap between platform dashboards and bank deposits, real money is being misallocated every week.
The root cause is signal loss. Apple's App Tracking Transparency kicked off the unraveling back in 2021. Safari's Intelligent Tracking Prevention kept clipping cookie lifetimes. Firefox followed. Chrome spent years threatening to deprecate third-party cookies before backing off, leaving everyone building for a cookieless future that arrived unevenly. On top of that, consent frameworks in the EU, the UK, California, Colorado, Virginia, Connecticut, Texas, and half a dozen other jurisdictions now require explicit opt-in for a growing share of tracking pixels. The result is a measurement environment where the click-based attribution models most brands built between 2015 and 2020 simply do not work anymore.
This is the problem Pixeltree's analytics practice exists to solve. We rebuild measurement stacks so DTC operators can make capital allocation decisions with confidence again. Not perfect decisions. Confident ones, grounded in data you can defend to a board, an investor, or a skeptical CFO.
TL;DR
If your Meta ROAS, GA4 revenue, and Shopify revenue do not reconcile within a reasonable tolerance, your measurement stack is broken. We fix it. That usually means a clean GA4 implementation, a server-side GTM container to recover signal, an MER-first attribution layer (Triple Whale, Northbeam, or Polar), cohort and LTV models that finance can trust, and a dashboard suite pared down to the ten or twelve numbers that actually drive decisions. We also tear out the dashboards nobody reads. See our companion posts on GA4 and server-side tagging, MER attribution for DTC, and LTV modeling for deeper technical walkthroughs.
Why ecommerce analytics is harder than it used to be
Five forces have converged to make DTC measurement genuinely difficult in a way that it was not a decade ago.
First, the iOS 14.5 aftermath. When Apple forced the ATT prompt, roughly three out of four iPhone users opted out of tracking. For brands that skew young or premium, that share often pushes past eighty percent. Meta's Aggregated Event Measurement and Google's Enhanced Conversions patched some of the loss, but the pixel-level fidelity that once made Meta's optimization algorithms terrifyingly accurate is simply gone. Platform ROAS has become an optimistic signal, not a ground truth.
Second, browser-level tracking prevention. Safari's ITP caps first-party cookies set via JavaScript at a seven-day lifespan and blocks third-party cookies outright. Firefox's Enhanced Tracking Protection does something similar by default. Chrome's long-running Privacy Sandbox effort, after repeated delays and reversals, left the ecosystem building against APIs that are harder to use and return less granular data. A user on a MacBook who clicks a Meta ad, leaves, and comes back a week later is almost never attributed correctly by browser-side pixels alone.
Third, the consent layer. GDPR, the UK GDPR, CPRA, and the wave of US state laws mean that a meaningful fraction of visitors in many markets now arrive with tracking consent denied. Google Consent Mode v2 and server-side conditional logic can partially paper over this, but they require implementation discipline that most brands never budgeted for.
Fourth, platform incentive misalignment. Meta, Google, TikTok, and every other ad network you buy on grade themselves on their own reported ROAS. They mark their own homework. When a user sees a Meta ad, then a Google ad, then converts through an email click, all three platforms may claim the conversion. Add up platform-reported revenue in a typical DTC account and you will often find it exceeds actual revenue by thirty to fifty percent.
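To make the over-claiming concrete, here is a minimal sketch of the reconciliation check, with invented numbers standing in for what would come out of each platform's reporting export:

```python
# Hypothetical weekly figures; real numbers come from each ad platform's
# reporting export and from Shopify net sales for the same window.
platform_reported = {"meta": 61_000, "google": 48_000, "tiktok": 14_000}
actual_revenue = 90_000  # Shopify net sales, same window

claimed = sum(platform_reported.values())
overclaim_pct = (claimed / actual_revenue - 1) * 100

print(f"Platforms claim ${claimed:,} against ${actual_revenue:,} actual "
      f"({overclaim_pct:.0f}% over-claimed)")
```

With these illustrative inputs the platforms collectively claim 37 percent more revenue than the store actually booked, squarely in the thirty-to-fifty-percent range described above.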
Fifth, the proliferation of tools. The modern DTC stack now includes Shopify, Klaviyo, Attentive, Meta, Google, TikTok, a headless CMS, a subscription app, a loyalty app, a returns platform, a helpdesk, and half a dozen apps that each pipe data somewhere different. Without a deliberate measurement architecture, you end up with six versions of the truth and no way to choose between them.
You cannot fix any of this by buying one more tool. You fix it by building a measurement stack deliberately, with a single source of truth at the center, and by teaching the team which numbers to trust for which decisions.
What our analytics services cover
Our engagements cluster into six workstreams. Most brands need two or three. A few need all six.
GA4 implementation. If you migrated from Universal Analytics in a hurry and have not revisited it since, your GA4 is almost certainly under-implemented. We rebuild event taxonomy, wire up enhanced ecommerce events correctly (view_item, add_to_cart, begin_checkout, purchase with item-level parameters), set up cross-domain tracking where you have a separate checkout subdomain, configure consent mode v2 properly, and establish internal traffic filters. We also set up BigQuery export so you own the raw event data rather than being trapped in the GA4 UI.
Server-side tagging. We deploy a GTM server-side container, typically on a subdomain of your main site, and route your key events through it. This moves the pixel firing off the browser and onto a server you control. It recovers signal lost to ad blockers, browser privacy features, and iOS restrictions. Most implementations recover ten to twenty percent of previously lost conversion signal, and in some verticals substantially more.
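The server container itself is configured in the GTM UI, but the underlying mechanic is simple: events are posted from infrastructure you control rather than from the visitor's browser. A minimal sketch using GA4's Measurement Protocol, where the measurement ID, API secret, and all values are placeholders:

```python
import json
from urllib import request

# Placeholder endpoint constants; measurement_id and api_secret come
# from the GA4 admin UI in a real deployment.
MP_ENDPOINT = "https://www.google-analytics.com/mp/collect"

def build_purchase_payload(client_id, transaction_id, value, currency, items):
    """GA4 Measurement Protocol body for a server-side purchase event."""
    return {
        # client_id should match the browser's _ga cookie ID so GA4 can
        # stitch the server-sent event to the user's session.
        "client_id": client_id,
        "events": [{
            "name": "purchase",
            "params": {
                "transaction_id": transaction_id,  # dedupe key vs client-side event
                "value": value,
                "currency": currency,
                "items": items,  # item-level parameters, as in enhanced ecommerce
            },
        }],
    }

def send_event(measurement_id, api_secret, payload):
    """POST the event from the server. GA4 accepts silently, so
    validate payloads in DebugView rather than trusting the status code."""
    url = f"{MP_ENDPOINT}?measurement_id={measurement_id}&api_secret={api_secret}"
    req = request.Request(url, data=json.dumps(payload).encode(), method="POST",
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req)
```

A production sGTM setup layers consent gating, deduplication against client-side pixels, and per-destination routing on top of this basic post-from-server pattern.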
Attribution setup. We pick an attribution layer based on your scale and your existing stack, and we implement it end to end. Polar Analytics for brands under a couple million in revenue. Triple Whale for mid-market DTC. Northbeam for larger brands where marketing mix modeling starts to matter. We also build the MER and blended ROAS framework so your weekly ops meeting starts from truth rather than platform noise. More on this in our MER attribution guide.
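Part of why we anchor weekly reporting on MER is that it is computed from two numbers no ad platform can inflate: total revenue and total paid spend. A sketch with invented figures:

```python
# Illustrative weekly figures, not real client data.
total_revenue = 250_000  # all revenue, all channels (Shopify net sales)
spend = {"meta": 40_000, "google": 25_000, "tiktok": 10_000}

total_spend = sum(spend.values())
mer = total_revenue / total_spend  # Marketing Efficiency Ratio (blended)

# Platform-reported ROAS, by contrast, is computed by each network on its
# own attributed revenue, so the per-channel numbers cannot be summed.
print(f"MER: {mer:.2f}")
```

A falling MER alongside rising platform ROAS is the divergence pattern flagged later in this page as the clearest symptom of broken attribution.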
Ecommerce dashboards. We build dashboards in Looker Studio, Metabase, Hex, or whatever BI tool your team already lives in. The default topline view covers revenue, gross margin, contribution margin, CAC, MER, new-customer rate, repeat rate, AOV, LTV, and payback period. From there we layer on channel drill-downs, cohort views, product-level margin, and geographic breakdowns where they matter.
Cohort analysis. We build cohort retention curves from your order data, usually by acquisition month and acquisition channel. This tells you whether your newer customers are retaining better or worse than older ones, and whether certain channels bring in customers who actually come back. It is the single most under-used piece of analysis in DTC. It directly informs paid acquisition budgets and retention spend.
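For readers who want the mechanics, the cohort computation can be sketched in pandas on a toy order log; real inputs would be a Shopify order export or a BigQuery table:

```python
import pandas as pd

# Toy order log: three customers, orders over three months.
orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 3, 3, 3],
    "order_date": pd.to_datetime([
        "2025-01-05", "2025-02-10",                 # cust 1: acquired Jan
        "2025-01-20", "2025-03-02",                 # cust 2: acquired Jan
        "2025-02-07", "2025-02-25", "2025-03-15",   # cust 3: acquired Feb
    ]),
})

orders["order_month"] = orders["order_date"].dt.to_period("M")
# Acquisition cohort = each customer's first order month.
orders["cohort"] = orders.groupby("customer_id")["order_month"].transform("min")
orders["months_since"] = (orders["order_month"] - orders["cohort"]).apply(lambda d: d.n)

# Distinct customers active in each cohort x month-offset cell.
cohort_counts = (orders.groupby(["cohort", "months_since"])["customer_id"]
                       .nunique().unstack(fill_value=0))
# Divide by month-0 size to get the share of each cohort still ordering.
retention = cohort_counts.div(cohort_counts[0], axis=0)
print(retention.round(2))
```

In practice we segment the same table by acquisition channel as well, which is what reveals whether a given channel brings in customers who actually come back.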
LTV modeling. We build a forward-looking LTV model that accounts for margin, return rates, subscription retention if applicable, and realistic projection windows. The output is a defensible CAC ceiling you can give your paid team and a payback period number finance can use in cashflow planning. See our LTV deep-dive for the math.
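A simplified version of the arithmetic, with illustrative inputs rather than benchmarks; a real model also handles returns, discounting, and subscription churn:

```python
# All inputs are illustrative assumptions, not benchmarks.
aov = 80.0                          # average order value
gross_margin = 0.60                 # after COGS, shipping, returns
orders_per_year = [1.8, 0.9, 0.5]   # expected orders per customer, years 1-3

# Margin-dollar LTV over a three-year projection window.
ltv = sum(aov * gross_margin * n for n in orders_per_year)

# A target LTV:CAC ratio turns the LTV into a spend ceiling for paid.
target_ltv_to_cac = 3.0
cac_ceiling = ltv / target_ltv_to_cac

print(f"3-year margin LTV: ${ltv:.2f}, CAC ceiling: ${cac_ceiling:.2f}")
```

The payback-period number finance uses falls out of the same model: how many months of margin dollars it takes to recover the CAC actually being paid.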
How we work: methodology
Every engagement begins with a stack audit. We spend the first week mapping what you have. Which pixels are firing. Which are not. Whether Meta's CAPI is deduping correctly. Whether GA4 agrees with Shopify on orders and revenue (a gap of two or three percent is usually acceptable; beyond five percent there is a real problem). Whether your consent layer is blocking more than it should. Whether any of your dashboards are silently broken.
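The GA4-versus-Shopify tolerance check can be written down directly. This sketch encodes the two-to-three and five percent thresholds as a small reusable function:

```python
def reconcile(ga4_revenue, shopify_revenue, warn_pct=3.0, fail_pct=5.0):
    """Classify GA4-vs-Shopify revenue divergence using audit tolerances.

    Thresholds are the ones from our audit: a gap within ~3% is normal
    measurement noise; past 5% something is genuinely broken.
    """
    gap_pct = abs(ga4_revenue - shopify_revenue) / shopify_revenue * 100
    if gap_pct <= warn_pct:
        return "ok", gap_pct
    if gap_pct <= fail_pct:
        return "investigate", gap_pct
    return "broken", gap_pct

# Example: GA4 reporting $92k against $100k in Shopify is an 8% gap.
status, gap = reconcile(92_000, 100_000)
print(status, round(gap, 1))
```

We run the same comparison on order counts, since revenue can reconcile by accident while order-level tracking is still dropping events.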
The audit output is a plain-English document listing every issue we found, prioritized by revenue impact. Not a 90-page PDF. A document your CFO can read and understand.
From there we sequence the work. Usually we fix GA4 first because it is the substrate everything else depends on. Then we stand up server-side tagging so the remaining work is built on recovered signal rather than degraded browser data. Then we implement attribution and dashboards in parallel. Cohort and LTV models come last because they depend on having clean order-level data plumbed through.
Throughout the build we work in your stack, not ours. If you already have Hex, we build dashboards in Hex. If your team lives in Looker Studio, we meet them there. We do not force a vendor lock-in that benefits us and punishes you on the way out.
We also work closely with your paid media team, whether that is in-house or Pixeltree's own paid ads practice. Measurement that does not change what paid optimization does is wasted measurement. The two functions have to be coupled.
What you get: deliverables
A typical analytics engagement ships the following concrete artifacts.
A written measurement plan. One document that describes every event you track, where it fires, which platforms receive it, how consent gating applies, and what each downstream metric means. This is the document your next agency or next analyst reads on day one to come up to speed.
A fully implemented GA4 property with correct enhanced ecommerce, cross-domain tracking, consent mode v2, internal traffic filtering, and a BigQuery export that retains raw event data for as long as you want to keep it.
A server-side GTM container running on a subdomain you own, routing Meta CAPI, Google Ads conversions, TikTok Events API, and any other pixel that matters. With deduplication set up properly against client-side events.
An attribution platform fully configured with your order data, ad spend feeds, and margin overrides. Dashboards set up so your team can see blended and last-click numbers side by side and understand the difference.
A dashboard suite. One topline executive view. One paid media view. One retention and cohort view. One product and margin view. Optional geographic, wholesale, or subscription views where relevant. Each dashboard has a written one-pager explaining what the numbers mean and how to act on them.
A cohort retention model and an LTV model, both rebuilt whenever your underlying order economics shift materially.
Documentation of every named metric. Agreement across marketing, finance, and ops on what CAC means, what MER means, what contribution margin includes and excludes, and what LTV window you report against. This sounds trivial. It is not. Most internal disagreements in DTC ops meetings are actually disagreements about definitions masquerading as disagreements about strategy.
Who this is for
Analytics engagements make sense for DTC and ecommerce brands at a few specific inflection points.
Brands doing over half a million in monthly revenue where bad data is now costing real money. Below that scale the complexity usually is not worth the investment, and a well-configured GA4 plus a clean break-even ROAS calculator is often enough.
Brands preparing for an investment round or a sale. Due diligence will dig into your numbers. If CAC, LTV, cohort retention, and channel attribution are not defensible, valuations suffer.
Brands where the finance team and the marketing team fundamentally disagree about what is working. This is almost always a measurement problem dressed as a strategy problem.
Brands scaling paid spend past a million dollars a year. At that point the cost of misallocated spend from bad attribution easily exceeds the cost of a proper measurement stack.
Brands whose platform ROAS has been trending up while actual MER trends down. This divergence is the single clearest symptom that attribution is broken.
Brands planning to launch new channels (retail, wholesale, marketplace, international). Without clean incrementality measurement you will not know whether the new channel is additive or cannibalizing.
If none of those describe you, spend your budget elsewhere. Analytics investment should follow a specific business problem, not a general feeling that you should have better data.
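For brands below the revenue threshold above, the break-even ROAS calculator really is simple. A sketch, assuming contribution margin is expressed as a share of revenue:

```python
def break_even_roas(contribution_margin_pct):
    """ROAS below which paid spend loses money on the first order.

    If each revenue dollar yields `contribution_margin_pct` in margin,
    you need 1 / margin dollars of revenue per ad dollar to break even.
    """
    return 1.0 / contribution_margin_pct

# Example: at 40% contribution margin, anything under 2.5x ROAS loses money.
print(break_even_roas(0.40))
```

Brands willing to pay back acquisition over a customer's lifetime rather than on the first order can run below this number, but that is a deliberate choice the LTV model has to justify.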
How we engage
Most analytics engagements fall into one of three shapes.
A diagnostic audit. Two to three weeks. We map your stack, document every issue, and hand over a prioritized roadmap. Some brands implement the fixes in-house from there. Others bring us back for the build.
A build engagement. Six to twelve weeks depending on scope. We execute against the roadmap, stand up the tooling, build the dashboards, train your team, and hand off full documentation. Most brands that come to us end up here.
Ongoing analytics partnership. Monthly retainer. We maintain your stack, ship new dashboards as questions arise, run quarterly cohort and LTV refreshes, and stay plugged into your weekly ops meeting as the analytics voice in the room. This is most common for brands between five and fifty million in revenue where the volume of analytics questions justifies dedicated capacity but not a full-time senior hire.
Every engagement includes written SOPs and a Loom walkthrough of every system we build. When we leave, your team should be able to run the stack. If they cannot, we did not do our job.
Ready to see what your data is actually telling you?
Start with our GA4 and server-side tagging post to get a feel for how we think about the plumbing.
Read our MER attribution guide if platform ROAS has stopped matching your bank deposits.
Work through our break-even ROAS guide to get your acquisition math back on solid ground.
Or reach out to scope an audit. We will tell you honestly whether measurement is your bottleneck or whether your time is better spent elsewhere.
FAQ
Questions we hear most.
Let's see if we're a fit.
15 minutes. We'll tell you whether this service is the right call for where you are — and if not, we'll name what is.
Book a 15-min call