There are 7 big marketing mistakes agencies make when they deliver reporting to corporate clients. I’m excluding small businesses here: the small business has a very different perspective and shouldn’t normally use strategies suited to large corporates.
Now of course many of these mistakes aren’t deliberate. They are probably accidental – but that doesn’t stop them hurting both your agency and the client.
1 Regarding analytics as separable from technical skills.
Many people see analytics and data as separate skills from web development. Analytics may bore the brilliant web developer. Or they might be no good at it.
And the analyst might be no good as a programmer.
But one mustn’t assume that knowledge of web development and programming doesn’t matter for the analytics expert.
Why? Because modern analytics relies completely on data gathering facilitated by web technologies.
And it’s not just the technologies you might expect.
So the analyst should understand the crucial role of the scripting language used to produce the HTML output and its structure (e.g. PHP, Ruby, Python). For instance, scripting languages have looping constructs. These can produce duplicate elements within the same page, which completely undermines the notion of uniqueness that the naive analyst might be relying on.
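To make this concrete, here is a rough sketch from the browser side (the page and the .add-to-basket class name are hypothetical):

```typescript
// Hypothetical page: a server-side loop has rendered a product card per item,
// each containing a button with the same class.
const buttons = document.querySelectorAll<HTMLButtonElement>(".add-to-basket");
console.log(buttons.length); // e.g. 12 – one per product in the loop, not 1

// A naive tracking call that assumes the selector is unique only ever
// attaches to the first match; clicks on the other buttons go unrecorded.
const first = document.querySelector<HTMLButtonElement>(".add-to-basket");
first?.addEventListener("click", () => {
  console.log("add-to-basket clicked"); // fires only for the first product card
});
```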
The analytics expert who can interpret the network traces in the browser’s developer tools can play an important part in diagnosing those stubborn measurement errors.
Beyond these obvious “programming languages” there are other technologies that have a crucial impact on the data gathering. Cascading Style Sheets (CSS) are an important mechanism that tag managers like Google Tag Manager use: CSS selectors let them retrofit data collection onto existing web pages. Whilst this can work well, your agency can’t safely assume that development expertise is no longer needed.
The single page application approach to building websites jeopardises one of the fundamental tenets of web analytics. It uses a single page address (the one visible in the browser address bar) for a number of different purposes. Without additional measures it becomes increasingly difficult for the analyst to know when step 1 has finished and step 2 has started.
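A rough sketch of one such additional measure, assuming GTM and a router that announces route changes (the routechange event and the virtual_pageview value are hypothetical and would need a matching trigger in the container):

```typescript
// Listen for the SPA router's route-change notification and record a
// "virtual" page view, since the real address bar URL never changes.
const dataLayer: Record<string, unknown>[] =
  (window as any).dataLayer ?? ((window as any).dataLayer = []);

window.addEventListener("routechange", (event: Event) => {
  const { path, title } = (event as CustomEvent<{ path: string; title: string }>).detail;
  dataLayer.push({
    event: "virtual_pageview", // hypothetical event name; the GTM trigger must match
    virtualPagePath: path,     // e.g. "/quote/step-2"
    virtualPageTitle: title,
  });
});
```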
2 Making do with limited in-house expertise
Many of those working in marketing understand the concepts of analytics. Many of the things that an analyst does – like looking at different channels or at navigation paths – may be things that the marketeer is well able to do. The challenge is that limited knowledge may well contribute to marketing mistakes. A deeper appreciation of the principles and technologies underpinning the reports will help the agency present a richer view to the client. It will also help avoid a situation in which the client feels it has been misled by the agency. It’s like the classic 80/20 Pareto analysis. The generalist may be able to deal with 80% of the analytics. But 80% of the value may lie in the 20% of the analytics that isn’t as easily understood.
To pick some examples:
The eCommerce add-ins to Google Analytics can be configured – and the reports are relatively easy to read. But how does one know whether the numbers are correct? One has to start with an external system, and with the rules around currencies, tax and shipping applied by the business. These may not be simple. And it may be hard to reconcile the two sets of figures – particularly if there are time lags in the reporting. For instance the Google Analytics account could be on US Pacific time – but the internal system could be on a different timezone. In these cases the transactions that occurred on particular dates may differ between the two systems.
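A minimal sketch of one reconciliation step, assuming the internal system exports transactions with UTC timestamps (the field names and the Pacific timezone are assumptions):

```typescript
interface Transaction { id: string; revenue: number; timestampUtc: string; }

// Re-date each transaction in the Google Analytics view's timezone before
// comparing daily totals, so "Tuesday" means the same thing in both systems.
function dailyTotals(transactions: Transaction[], timeZone = "America/Los_Angeles") {
  const totals = new Map<string, number>();
  for (const t of transactions) {
    // the en-CA locale formats dates as YYYY-MM-DD
    const day = new Date(t.timestampUtc).toLocaleDateString("en-CA", { timeZone });
    totals.set(day, (totals.get(day) ?? 0) + t.revenue);
  }
  return totals;
}

// Example: an order placed at 02:00 UTC on the 5th lands on the 4th in Pacific time.
console.log(dailyTotals([{ id: "1001", revenue: 49.99, timestampUtc: "2024-03-05T02:00:00Z" }]));
// Map { "2024-03-04" => 49.99 }
```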
Configuring goal funnels may be easy. But on a number of occasions I’ve found that the way the shopping cart works interferes with easy configuration of a goal funnel. For instance parameters may be appended to the page URLs. This can mean that a regular expression is the only reliable way to segregate wanted from unwanted pages. And regular expressions use a particularly awkward syntax that is easy to get wrong.
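As an illustration of the kind of expression involved (the checkout paths here are made up):

```typescript
// Match the payment step whether or not the cart has appended parameters,
// e.g. /checkout/payment or /checkout/payment?cartId=abc123&step=2
const paymentStep = /^\/checkout\/payment(\?.*)?$/;

console.log(paymentStep.test("/checkout/payment"));               // true
console.log(paymentStep.test("/checkout/payment?cartId=abc123")); // true
console.log(paymentStep.test("/checkout/payment-failed"));        // false

// Drop the ^ and $ anchors and the "failed" page quietly matches too –
// exactly the kind of mistake that is easy to make and hard to spot.
```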
Another example: the analytics may appear to be working correctly, but not in every circumstance. It takes more expertise to work out that this is happening.
So it would be wise for agencies to consider getting access to “top-up expertise”. This helps where the number of complex queries doesn’t justify an extra member of staff. The result can be significant in terms of the quality of delivery and the value added for the client.
3 Leaving the can of worms intact.
Your agency may dream of winning that new big client.
Unfortunately many big corporates have a history of mergers and acquisitions. Each acquired company typically brings one or more websites and a variety of marketing techniques. The individual “child” sites may be re-skinned and modified to make them “appear to fit” into their new family. This may, however, be the easy bit to achieve. It can be much harder to mesh the reporting for all these marketing techniques and sites. For instance each may have a separate Google Analytics account. The need to achieve a comprehensive overview doesn’t fit well with such a fragmented approach.
One apparently attractive option would be to simply replace the Google Analytics snippet on each site with one single “rollup” snippet. Then all the reporting would come into one location. But without further work the visitor numbers to every home page on every site would simply be added together. An option is to insert the hostname as a prefix to the URL in Google Analytics. This stops everything being added together but the result is that there are vast numbers of pages. We now have all traffic from the original accounts in one new account. This could lead to sampling problems and other nasty side effects. In particular the naming conventions for some of the customisations are likely to conflict.
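The hostname-prefix idea can be sketched roughly as follows (the domains are invented, and how the value is fed into the rollup property depends on the tag setup – for example a GTM variable or a view filter):

```typescript
// Build a page path that carries the hostname, so that "/index" on
// brand-a.example and "/index" on brand-b.example no longer collapse
// into a single row in the rollup account.
function rollupPagePath(loc: Location = window.location): string {
  return "/" + loc.hostname + loc.pathname + loc.search;
}

// e.g. "/brand-a.example/index?utm_source=newsletter"
console.log(rollupPagePath());
```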
The actual solution is likely to rely on dedicating some expertise to designing a reporting solution that can bring all the individual children together in a rational and coherent structure. Even then it won’t be easy – and the solution may be awkward to implement.
These problems don’t only afflict big corporates, however. Sometimes small and growing companies bring a can of worms with them from their “early years”. The initial struggle to create a viable business may mean the business has IT systems that are not reliable or maintainable over a longer time period. When the business emerges from these early years and starts to scale up more rapidly, it can strain systems beyond their ability to cope. Often it isn’t the system that fails completely – but the surrounding processes that provide evidence of a “hand to mouth” existence. At this point a number of interrelated issues will affect the growing business and its partners (e.g. your agency).
- Processes and practices may need reform, to make them more robust and capable for the next phase of growth.
- Systems and code that relied on these processes and practices will need auditing. Bringing them to an acceptable baseline for reliability and robustness may require considerable remedial work.
Sometimes the solution will be to start again from scratch. However the biggest problem is likely to be that no-one really knows whether a clean break or a fix is better. And in these circumstances people can swap inconclusive anecdotes for a long time as they try and decide what to do for the best.
So the honest view has to be that it’s likely to be a hard slog to sort out the can of worms. But the cost of leaving the mess intact is that the whole client relationship can be poisoned through the lack of reliable reporting.
4 Trying to do everything through Google Tag Manager (GTM)
Google Tag Manager and other tag management systems can be a really useful mechanism. The business advantage in separating data gathering from the development road map and test cycles is clear. However, whilst GTM can lessen the linkage, it doesn’t break it completely.
So there is one simple non technical message to take away. Tag Managers aren’t foolproof – and because they rely on code, they can be broken by code.
Another problem with GTM can arise from the mismatch of objectives between the web developer and the analyst.
The web developer’s objective when using CSS is to control the look and feel of the elements. Achieving uniformity across the website is important, so the web developer wants all buttons to look the same and behave in the same way. This is a very different objective from that of the analyst, who wants to know that button A has been clicked and to differentiate that click clearly from any clicks of button B. This difference of perspective means that CSS designed for uniformity may be almost useless for the analyst using Google Tag Manager.
Without some revision the analyst using GTM may find themselves resorting to really convoluted and fragile techniques to identify particular elements. It’s a bit like trying to break into a house by hooking the keys out through a letter box. Fine if the keys are in view, but if they are round the corner on an unseen shelf – almost impossible.
These potential problems with GTM can mean that development changes are the best course of action. They can make an almost impossible situation much much better.
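To make the contrast concrete, here is a rough sketch (the markup and the data-analytics-id attribute are hypothetical, not a GTM feature):

```typescript
// Fragile: a CSS selector of the kind GTM triggers often end up relying on.
// It depends on the page layout – if a designer reorders the columns,
// "button A" silently becomes "button B".
const fragile = document.querySelector(".content > div:nth-child(3) .btn");

// More robust, but it needs a small development change: give each button a
// stable identifier that exists purely for measurement, e.g.
//   <button class="btn" data-analytics-id="request-quote">Request a quote</button>
const robust = document.querySelector('[data-analytics-id="request-quote"]');

console.log(fragile, robust);
```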
5 Trying to avoid development changes that would enable smart data gathering
Smart data gathering doesn’t just happen. Sometimes we must invent it. I learnt to appreciate this in my early career as a mechanical engineer.
I had a project where I had to work out quality assurance criteria for Ford Motor Company. These criteria had to ensure assembled car suspension arms didn’t fall apart. These arms were fitted onto all the then current Ford Mondeos. A tape measure was useless for determining whether the arm would or wouldn’t stay intact. We needed to find the appropriate technique. In this case we monitored the forces used during assembly at particular points.
The challenges with collecting data from online systems are different – but often need the same kind of creative thinking.
So the approach I would recommend is:
- Identify the processes for which you want to count successes and failures, or assess progress.
- Identify each step in the process.
- Create a means of identifying beyond doubt which step we’ve reached.
- Develop criteria to help evaluate whether each step succeeded or failed.
- Design and implement measurements for each step. The data collection must enable us to prove the failure or success of each step (a rough sketch follows this list).
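A minimal sketch of what step-level measurement might look like, assuming GTM and the dataLayer (the quote_step event name and its fields are hypothetical and would need matching triggers and variables in the container):

```typescript
const dataLayer: Record<string, unknown>[] =
  (window as any).dataLayer ?? ((window as any).dataLayer = []);

// Record that a named step was reached, and whether it succeeded or failed.
function recordStep(stepName: string, outcome: "success" | "failure", detail?: string) {
  dataLayer.push({ event: "quote_step", stepName, outcome, detail });
}

// e.g. called from the form's validation handler
recordStep("contact-details", "failure", "postcode rejected");
recordStep("contact-details", "success");
```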
This probably doesn’t fit with the standard configuration that “came out of the box”.
The standard package can’t cater for all the unique challenges that your agency’s client has. We must design, implement and test particular data gathering techniques to fit that client.
In many cases these techniques may not be particularly challenging. Perhaps we need to count particular button clicks, or assess particular form entries, to create a “world of measurement” that properly supports the client’s business. But they may well require development support to make them happen. The way the Cascading Style Sheets and the code are constructed will determine whether Google Tag Manager is viable.
In some cases the techniques themselves aren’t challenging but they have implications for the rest of the process. The speed with which the website moves a user from one page to another might interrupt the data collection before it completes. In these cases, review the process design.
6 Not recognising when a report “doesn’t work”.
The standard reports that “come in the box” with Google Analytics are certainly comprehensive. But that doesn’t mean that they provide answers to the questions the client has; nor that all of them are relevant.
It’s worth reflecting for a moment how assumptions would change if we didn’t have free or near-free cloud based services. Some 20 years ago, a comprehensive package like Google Analytics would have sold for many thousands of pounds. And the vendor wouldn’t just offer the software – they would have pushed hard to sell consulting services to customise the package to suit the customer’s business. The current world enables Google to cross-subsidise the software supply. But if we aren’t careful the customisation just disappears: the lack of a vendor with a commercial reason to sell it misleads the user into thinking customisation isn’t required.
But a mass market standard offering like Google Analytics cannot cater for a customer’s key business processes. Proper customisation takes time and effort.
Some of the customisation will push valuable information into standard reports that were blank until the customisation happened. For instance there are no goal funnels until someone has configured the goals in the Analytics account. Using this stepwise notion of process, the “goal funnel” can achieve a good deal: it can make the standard package reflect the nuances of the unique businesses agencies work with.
Another pitfall arises when too many dimensions are shown in one report, or the wrong one is chosen as the primary dimension.
For instance I’ve seen attempts to show the various channels that web traffic uses. A mini bar chart shows the varying number of sessions for desktop vs mobile vs tablet devices for each marketing channel.
Does this work? No.
The problem here is that the number of sessions varies dramatically by channel – so what 100% represents is dramatically different from one channel to the next.
It would be clearer to show how popular each channel was for a particular device type.
The way to clarify this is to recognise the perspective of the web visitor. A visitor doesn’t choose the marketing channel to which they wish to respond and then choose the device they would use. Instead a visitor selects a device – they pick up their mobile and react to information found or searched for, or they sit at their desktop, and so on. So the likelihood of responding to a particular marketing channel for a given choice of device is probably the clearer and more informative presentation.
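As a toy illustration of the re-pivot (the session counts are invented):

```typescript
// Rows of (channel, device, sessions). The useful percentage is each
// channel's share WITHIN a device, not the device split within a channel.
const rows = [
  { channel: "Organic Search", device: "mobile",  sessions: 8200 },
  { channel: "Email",          device: "mobile",  sessions: 1400 },
  { channel: "Organic Search", device: "desktop", sessions: 2100 },
  { channel: "Email",          device: "desktop", sessions: 900 },
];

// Total sessions per device, then express each channel as a share of that total.
const deviceTotals = new Map<string, number>();
for (const r of rows) deviceTotals.set(r.device, (deviceTotals.get(r.device) ?? 0) + r.sessions);

for (const r of rows) {
  const share = (100 * r.sessions / deviceTotals.get(r.device)!).toFixed(1);
  console.log(`${r.device}: ${r.channel} = ${share}% of that device's sessions`);
}
```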
7 Failing to implement proper naming conventions
Naming is surprisingly important. There are two criteria that matter:
- The names of the events, goals, concepts, stages and processes within the reporting need to be easily understood.
- The naming needs to show the user how to navigate any hierarchy that may exist. This is particularly important when users are searching for the answer to a question and are trying to locate the information that will help them.
In most cases the naming should relate specifically to the client’s business. Even if there are standard terms, there are likely to be subtleties about the way a particular term is understood within a particular client. For instance “marketing qualified lead” carries with it a set of assumptions about what it took for the prospect to be “qualified”. How subjective or objective was that process?
There should be a glossary of terms that enables newcomers to the reporting to rapidly learn the new terminology. The stakeholders may well have come into the business from competitors who use the terms in a subtly different way. Or be from areas of the business that don’t share the same assumptions.
The challenge of getting naming conventions implemented is normally more about process than fundamental difficulty. However we have to accept that perfection and complete coherency are unattainable. We are simply aiming for a set of names that fulfil the two criteria above.
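A hedged sketch of one possible shape for such a convention (the segments and example values are invented, not a prescription):

```typescript
// Encode area, process, step and outcome in every event name, so the
// hierarchy is visible in the name itself and new names slot in later
// without destroying the underlying logic.
type Outcome = "success" | "failure";

function eventName(area: string, process: string, step: string, outcome: Outcome): string {
  return [area, process, step, outcome].join(".");
}

// e.g. "sales.quote-request.contact-details.success"
console.log(eventName("sales", "quote-request", "contact-details", "success"));
```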
The developers or analysts need to have a detailed conversation with the stakeholders. This conversation needs to cover:
- how to achieve a robust convention that will accept new names later on without destroying the fundamental logic.
- how to communicate to the user any subtleties arising from the data gathering or processing.
I hope this post about Marketing Mistakes in reporting is useful. I realise it’s long and involved. Please get in touch or comment below if you want.