Identity, a GA4 checklist & how to decide on your analytics stack
Talk #1: Managing identity in the modern analytics stack.
Friend and colleague David Grinberg, The Lumery’s Head of MarTech Development, and I teamed up to chat about where identity lives in the modern analytics stack.
The great Data & Analytics Summit was hosted by Forefront Events at Sydney’s Doltone House, Darling Island.
If you missed it and want to soak up the atmos’ virtually, you can watch the wrap-up video here.
For our session, here are a few of the soundbites we shared:
- With the cookie crumbling, identity management is becoming increasingly fragmented. From Google’s Identity Platform to The Trade Desk’s Unified ID 2.0, to Meta and TikTok still cheekily requesting real or hashed PII, there’s now a need to collect a number of IDs for each platform, in addition to devices, to activate and attribute the audiences we are targeting.
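On the hashed PII point: platforms that accept hashed identifiers generally expect the value to be normalised before hashing, typically trimmed, lowercased, then SHA-256 hashed. A minimal sketch of that pattern (the exact normalisation rules vary by platform, so check each vendor’s spec):

```python
import hashlib

def hash_pii(value: str) -> str:
    """Normalise a PII field (e.g. an email address) and return its
    SHA-256 hex digest, the common shape for hashed-identifier uploads."""
    normalised = value.strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Differently formatted versions of the same email hash to the same ID
print(hash_pii("  Jane.Doe@Example.com "))
print(hash_pii("jane.doe@example.com"))
```

The normalisation step matters: without it, `Jane.Doe@Example.com` and `jane.doe@example.com` would produce different hashes and fail to match on the platform side.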
- Attribution doesn’t always need 1:1 data. Increasingly, as organisations invest more in marketing mix modelling (MMM), there will be an alternative mechanism to help us allocate spend, while we revert to the walled gardens for tactical optimisation.
- Data clean rooms are looking like the next frontier of tech investment with organisations considering data sharing partnerships as one among a number of strategies to close the visibility gap.
If you missed us, The Lumery has a great write-up over on the blog to get you across the detail. Otherwise, reach out to David Grinberg or myself on LinkedIn for a chat.
Talk #2: Final preparations for a move to GA4
Time is ticking. Just before this keynote for Destination Victoria, I had quite a passionate discussion on LinkedIn with a couple of e-commerce industry leaders on the shift.
Regardless of just how much of a heads up Google has given us (with an extension!), many are yet to run both systems in parallel and truly grasp what the shift will mean practically for their businesses.
As I mentioned in my speech, this shift is not like updating your Toyota. It’s like changing to a Tesla. Despite retaining the Google name in the title, the data model, interface and metrics are completely different.
Me scaring down a room full of people telling them their analytics world is finally about to change for good…
Change management takes time.
Beyond the need to ensure teams are enabled and well equipped to navigate and interpret the new data, there’s the upstream impact of the broader markets and business partners who need to understand what the new metrics and benchmarks mean.
Here’s the list of initiatives you need to consider:
- Audit Universal Analytics (UA): Don’t bring the noise and dirty data over to your shiny new car.
- Migrate & Configure: Move over the good stuff and define the custom events you need. Then, configure and QA in platform.
- Rebuild: All your reports will need to be reconnected. If you want to maintain some semblance of like-for-like comparisons, many metrics will need to be recalculated (Sessions and Bounces to name only a few, bookmark this resource to help you).
- Embed: Once all the technical work is done, it’s time to ensure all stakeholders who reference these metrics (especially the C-suite) understand the “why” behind the changes and what the new benchmarks are likely to be.
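To make the Rebuild point concrete: bounce rate is one metric whose calculation changes outright. In GA4, bounce rate is defined as the inverse of engagement rate (the share of sessions that were not engaged), rather than the single-page-visit definition UA used. A small sketch of the GA4-style calculation, using illustrative numbers:

```python
def ga4_bounce_rate(sessions: int, engaged_sessions: int) -> float:
    """GA4 defines bounce rate as the share of sessions that were NOT
    engaged, i.e. 1 - engagement rate -- a different calculation to the
    Universal Analytics single-page-session definition."""
    if sessions == 0:
        return 0.0
    return 1 - engaged_sessions / sessions

# e.g. 1,000 sessions of which 550 were engaged -> 0.45 (45%)
print(f"{ga4_bounce_rate(1000, 550):.0%}")
```

This is why like-for-like comparisons against historical UA benchmarks need recalculating rather than copying across: the same site behaviour produces a different number under each definition.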
Beyond the above mid to long term initiatives to be actioned, there are some other items you need to be sure you have ticked off IMMEDIATELY (like, yesterday)! For those who aren’t on the paid Google Analytics 360, you have around 42 days to tick these boxes.
Thankfully, my brilliant team at The Lumery pulled together a checklist to make this easy for our clients, so you can find the checklist to download over on the blog.
Even if you think you’re prepared, this checklist offers a good prompt to do a final check of your settings.
How to decide on your reporting stack
Outside of blabbing about all things analytics on stage on Tuesday and Wednesday, I still managed to get the output of an Analytics Stack Solutions Discovery out the door. Similar to my framework for solving attribution shared a few weeks back, albeit a little different, I used the following framework to investigate and determine which combination of tools made the most sense for a client looking to improve their reporting maturity.
Note that prior to investigating tools, we conducted a discovery workshop to align stakeholder needs and gain consensus on the priorities for reporting. We then dug deep into defining the requirements that were needed in a tool to realise the required reports.
Once the detailed requirements were defined, it was time to investigate a set of tools and score them. This team required a “no-code” stack as reporting responsibilities would need to remain in their team with no opportunity to bring on a dedicated analyst for the foreseeable future.
When the client asks how I know all the differences between the analytics stack tools I’m investigating…
We investigated Supermetrics, Funnel.io and Salesforce Marketing Intelligence (previously Datorama). The first two providers handle the data ETL and would require a visualisation platform to sit on top (whether Looker Studio, Tableau, Power BI or another). They also have the flexibility to allow that visualisation tool to change. Salesforce Marketing Intelligence is a more all-in-one option with premium bells and whistles like its Einstein AI solution (and, of course, the price tag to boot).
My job is to be Switzerland in this process. I dig in and serve back a model to interpret the similarities and differences between vendors that otherwise showcase apples vs oranges on their websites. It’s terribly difficult to navigate unless you’ve used each of the platforms and have a scoring framework that allows you to navigate the core reasons you might choose one option over another.
This is the framework I used for scoring (and the questions I asked and answered to myself along the way).
Data & Insights:
- Which tools service the highest number of metrics required to make decisions?
- Do all data sources have out of the box connectors? If not, how simple to use are the alternatives or workarounds?
Technology:
- Does the technology require one platform or many?
- What is the sophistication level of their features?
- Are there low or high levels of customisation and bespoke support available?
- Beyond the immediate need, how many use cases can be applied using the technology?
Cost:
- What are the costs associated with the vendor or platform itself?
- Is the contract short or long term?
- Based on the current team, what would be the effort to implement and onboard?
Skills & Effort:
- What is the complexity involved in the number of tools required in the stack to realise the output?
- What is the level of technical skill required? What is the likely learning curve of each?
- What is the level of training required and the availability of training support online and by the vendor?
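The framework above can be turned into a simple weighted scoring matrix: each criterion group gets a weight reflecting the client’s priorities, each vendor gets a score per group, and a weighted total falls out. A minimal sketch with entirely hypothetical weights and scores (these are illustrative numbers, not the actual assessment):

```python
# Hypothetical weights per criterion group (must sum to 1.0) -- illustrative only.
weights = {"data_insights": 0.35, "technology": 0.25, "cost": 0.20, "skills_effort": 0.20}

# Hypothetical scores (1-5) per vendor, per criterion group -- not the client's real data.
scores = {
    "Supermetrics":  {"data_insights": 4, "technology": 3, "cost": 4, "skills_effort": 4},
    "Funnel.io":     {"data_insights": 4, "technology": 4, "cost": 3, "skills_effort": 4},
    "Salesforce MI": {"data_insights": 5, "technology": 5, "cost": 2, "skills_effort": 2},
}

# Weighted total per vendor
for vendor, s in scores.items():
    total = sum(weights[c] * s[c] for c in weights)
    print(f"{vendor}: {total:.2f}")
```

The value of laying it out this way is less the final number and more the conversation it forces: shifting the weights (say, prioritising cost over sophistication) can flip the ranking, which surfaces what the stakeholders actually care about.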
Outside of the above, I also investigated a few hygiene items such as data freshness, storage and security.
Again, I’m neutral on which direction is taken. My job is to surface an easy-to-interpret view of how the alternatives fit their reporting requirements so they can make the best decision based on fit for purpose, budget and willingness to upskill.
I’m excited to see which direction they go and from there, work with my team to execute on the reporting requirements and give them back the reports they need to make better decisions.