There are 7 marketing mistakes agencies make when reporting to corporate clients. Many of these mistakes are accidental. But accidental marketing mistakes can still hurt both your agency and the client.
1. Separating analytics from technical skills
Many people see analytics as a separate skill from web development. Analytics may bore the brilliant web developer, and the developer may be no good at it.
And the analyst might be no good as a programmer.
Web technologies and programming knowledge still matter for the analytics expert. Why? Because the analyst relies on data gathering that is built on those web technologies. It's a marketing mistake to try to separate them.
And it’s not just the web technologies you might expect.
Your analyst should understand the crucial role of web scripting languages, because scripts produce the HTML output. Those scripts can include loops, and loops can produce duplicate elements on the same page. These duplicates destroy the uniqueness a naive analyst might be relying on.
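As a rough sketch of the problem (the product data and class name below are invented for illustration), a server-side loop can easily emit the same button markup several times on one page:

```javascript
// Hypothetical templating loop (e.g. in a Node.js view) that repeats a
// "Buy now" button for every product on a listing page.
const products = [{ name: 'Widget A' }, { name: 'Widget B' }, { name: 'Widget C' }];

const html = products
  .map(p => `<div class="product">
    <h2>${p.name}</h2>
    <button class="buy-button">Buy now</button>
  </div>`)
  .join('\n');

console.log(html);
// The page now contains three identical "buy-button" elements, so an analyst
// who assumed "the buy button" was unique cannot tell which one was clicked.
```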
Client-side script (JavaScript) is crucial to data collection. The Google Analytics snippet uses JavaScript to gather lots of information from the browser environment, then sends an image request to https://www.google-analytics.com/r/collect. If the JavaScript click handlers don't work, data collection can fail.
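A minimal sketch of the kind of handler involved, assuming the classic analytics.js snippet is already on the page (the button id and event names are invented):

```javascript
// Assumes the analytics.js snippet has loaded, so the global ga() queue exists.
var downloadButton = document.getElementById('download-brochure'); // hypothetical element

if (downloadButton) {
  downloadButton.addEventListener('click', function () {
    // analytics.js packages this up and fires the image request to
    // https://www.google-analytics.com/r/collect mentioned above.
    ga('send', 'event', 'Brochure', 'download');
  });
}
```

If that handler never runs (a script error earlier on the page, the element id changing, the listener attached before the element exists), the click is simply never measured.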
Understanding a browser's page-load network traffic is invaluable for diagnosing stubborn measurement errors. The analytics expert who can work at this level is much more capable.
Other web technologies also affect data gathering. Google Tag Manager relies on Cascading Style Sheets (CSS), because CSS selectors allow people to retrofit data collection to existing web pages.
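Under the hood a GTM click trigger is essentially a CSS selector test against the clicked element. A rough illustration (the selector is invented for this example):

```javascript
// A tag-manager-style listener that fires only for clicks inside a
// particular form, identified purely by a CSS selector.
document.addEventListener('click', function (event) {
  if (event.target.closest('form.newsletter-signup button')) {
    // In GTM this is the point at which the trigger would fire a tag.
    console.log('Newsletter sign-up button clicked');
  }
});
```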
Google Tag Manager works well, but development expertise is still needed. For example, a single-page application breaks web analytics assumptions. It uses a single page address (seen in the browser address bar) for many steps, which makes it hard for the analyst to know where step 1 ends and step 2 starts. Developers often need to install extra code to sort this out.
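One common fix, sketched here with the classic analytics.js library and an invented route name, is for developers to send a "virtual pageview" each time the application changes step without a real page load:

```javascript
// Send a virtual pageview whenever the single page application moves to a new step.
function trackVirtualPageview(virtualPath) {
  ga('set', 'page', virtualPath); // overwrite the page that Google Analytics reports
  ga('send', 'pageview');         // record it as if a new page had loaded
}

// e.g. called from the application's router when the user reaches checkout step 2
trackVirtualPageview('/checkout/step-2');
```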
2. Making do with limited in-house expertise
Many marketing professionals understand the concepts of analytics. The marketer may be able to interpret many items as an expert analyst would. For example, looking at different channels and bounce rates isn't hard.
But limited knowledge contributes to marketing mistakes.
A deeper appreciation of the principles and technologies underpinning the reports helps. An agency can use this to provide more value to its clients. This will help avoid any client feeling shortchanged.
It's like the classic 80/20 Pareto analysis. The generalist may be able to deal with 80% of the analytics. But 80% of the value may lie in the other 20%, in explaining the subtleties.
To pick some examples:
It's fairly easy to install the eCommerce add-ins to Google Analytics, and the reports are relatively easy to read. But how does one know whether the numbers are correct?
One has to start with an external system, and with the rules around currencies, tax and shipping applied by the business. These may be complex, and the outputs may be hard to reconcile.
There may be time zone differences. The Google Analytics account could be in the US Pacific timezone while the external system is in the US Central timezone. A specific transaction could then appear on different dates in the two systems.
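A small illustration of how one and the same purchase ends up on two different dates (the timestamp is invented):

```javascript
// A transaction recorded at 01:00 on 1 July in the shop's US Central timezone
// falls on 30 June once Google Analytics applies its US Pacific timezone.
const purchase = new Date('2023-07-01T06:00:00Z'); // 01:00 Central / 23:00 Pacific

const asDate = timeZone =>
  new Intl.DateTimeFormat('en-GB', { dateStyle: 'short', timeZone }).format(purchase);

console.log(asDate('America/Chicago'));     // "01/07/2023" in the external system
console.log(asDate('America/Los_Angeles')); // "30/06/2023" in Google Analytics
```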
Sometimes the shopping cart makes the goal funnel configuration hard.
For instance, shopping carts may append parameters to page URLs. Matching those pages can require regular expressions, and that can be hard given their awkward syntax.
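For example, a hypothetical cart that appends a session parameter makes the "same" page appear under many URLs. A funnel step defined with a regular expression can treat them as one page:

```javascript
// Both of these are really the payment step:
//   /checkout/payment?cartId=81723&locale=en
//   /checkout/payment?cartId=90412
const paymentStep = /^\/checkout\/payment(\?.*)?$/;

console.log(paymentStep.test('/checkout/payment?cartId=81723&locale=en')); // true
console.log(paymentStep.test('/checkout/payment-failed'));                 // false
```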
Another example: the analytics may appear to be working correctly, but there may be intermittent failures in the reports. It takes more expertise to identify this and sort it out.
Wise agencies may seek "top-up expertise". This helps where the number of complex queries doesn't justify an extra member of staff. The result can significantly improve the quality and value delivered to the client.
3. Limping on with a broken infrastructure
Your agency may dream of winning that new big client. Can you leave the can of worms alone? To do so can be one of the biggest marketing mistakes.
Many big corporates have a history of mergers and acquisitions. Each acquired company has websites and different marketing techniques. The individual “child” sites may be re-skinned and modified. They may “appear to fit” into their new family. This may be the easy bit to achieve.
It can be much harder to mesh the reporting for all the websites. For instance, each may have a separate Google Analytics account. Achieving a comprehensive overview is hard with such a fragmented approach.
One can put a single "rollup" Google Analytics snippet on each website, either replacing or sitting alongside the existing snippet. Rollup accounts bring all the reporting into one location. But without careful configuration the home page visitor numbers will be wrong: the home page visitors to every site would simply be added together. One option is to insert the hostname as a prefix to the URL recorded in Google Analytics, which stops the reader being misled by inflated numbers.
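A sketch of that hostname-prefix approach, using a named analytics.js tracker (the property ID and hostnames are placeholders):

```javascript
// A second, named tracker sends every hit to the rollup property.
ga('create', 'UA-XXXXXXX-1', 'auto', 'rollup');

// Prefix the recorded page with the hostname, so the home pages of the
// different child sites no longer collapse into a single "/" row.
ga('rollup.set', 'page', location.hostname + location.pathname);
ga('rollup.send', 'pageview');
```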
But a "rollup" account can still contain a vast number of pages, because it now carries the traffic from all the original accounts in one place. This can lead to sampling problems and other nasty side effects, and the naming conventions for some of the customisations are likely to conflict.
The best solution is for an expert to design the reporting solution. This can bring all the individual children together into a coherent structure. Even then it won’t be easy. The solution may be awkward to implement.
These problems don’t only afflict big corporates. Sometimes small growing companies can have a can of worms. These come from their “early years”. The growing business may have IT systems that can’t support the greater scale.
The system may not fail completely. Instead the surrounding processes provide evidence of a “hand to mouth” existence. At this point many interrelated issues will affect the growing business. These can have serious implications for your agency.
- Processes and practices may need reform to become more robust and support growth.
- Systems and code that relied on these processes and practices will need auditing. Bringing them to an acceptable baseline may be hard.
Sometimes people will want to start again. However, serious marketing mistakes arise if evidence isn't gathered first. Swapping inconclusive anecdotes doesn't help good decision making. A well-founded decision for a fresh start or a fix is required.
It may be a hard slog to sort out the mess. But leaving it can poison the whole client relationship.
4. Trying to use Google Tag Manager (GTM) to avoid any development involvement
Google Tag Manager can be really useful. The business advantage of releasing marketing from the development road map and test cycles is obvious. But marketing mistakes occur if people assume it breaks the link with development completely.
Google Tag Manager and Analytics rely on the full page being loaded. So any loading problem can stop them working entirely.
Google Tag Manager uses website code to collect data. So if the browser fails to run all of the code on a page, data collection can fail.
Browsers run the JavaScript on a page largely on a single main thread. If a script throws an error, the code that follows it in that script simply isn't reached. So any data collection that "follows" the error fails.
Something called "event bubbling" can also interfere with a tag manager like GTM. Tag managers listen for events which "bubble up" to the topmost element of the page. If event bubbling is stopped early, the event of interest never reaches the tag manager.
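A minimal sketch of the problem (the element id is invented): a single stopPropagation() call is enough to hide a click from a document-level listener of the kind a tag manager uses.

```javascript
// A tag-manager-style listener at the top of the document.
document.addEventListener('click', function () {
  console.log('Click seen at document level'); // never logs for the button below
});

const widgetButton = document.getElementById('fancy-widget-button'); // hypothetical element
if (widgetButton) {
  widgetButton.addEventListener('click', function (event) {
    event.stopPropagation(); // the click never bubbles up to the document listener
    // ...the widget carries on working normally, but the measurement is lost
  });
}
```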
So there is one simple non-technical message to take away. Tag managers aren't foolproof. And because they rely on code, they can be broken by code.
Differing objectives
The mismatch of objectives between web developer and analyst can cause problems with GTM.
A web developer uses CSS to control the look and feel of page elements, aiming for a similar appearance everywhere. For instance, all buttons should look and behave in the same way.
The analyst has a very different objective. The analyst wants to know whether button A has been clicked, and not confuse this with clicks on button B. This difference of objectives means that CSS designed for uniformity may be almost useless to the analyst using Google Tag Manager.
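One way round this, sketched below with invented attribute names, is to give the analyst a dedicated hook, such as a data attribute, that sits alongside the uniform CSS:

```javascript
// The buttons share the same .btn styling, so the class alone can't tell them apart:
//   <button class="btn" data-analytics="request-quote">Request a quote</button>
//   <button class="btn" data-analytics="download-brochure">Download brochure</button>

document.addEventListener('click', function (event) {
  const tracked = event.target.closest('[data-analytics]');
  if (tracked) {
    // In GTM this value would typically be read by a variable and passed to a tag.
    console.log('Clicked:', tracked.dataset.analytics);
  }
});
```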
So a big marketing mistake is to stop any development activity.
5. Trying to avoid process changes that would enable smart data gathering
Smart data gathering doesn’t just happen. Sometimes we must invent it. I learnt to appreciate this in my early career as a mechanical engineer.
I had to work out quality assurance criteria for Ford Motor Company. We had to ensure assembled car suspension arms didn’t fall apart.
These arms were fitted onto Ford cars. A tape measure was useless for determining whether the arm would remain intact. We needed to find an appropriate technique. We decided to monitor the forces used during assembly at particular points.
The challenges with collecting data from online systems are different. We need the same kind of creative thinking to avoid marketing mistakes.
So the approach I would recommend is:
- Identify the processes for which to count success and failures, or assess progress steps.
- Identify each step in the process.
- Create a means of identifying beyond doubt which step we’ve reached.
- Decide how to evaluate whether each step succeeded or failed.
- Design and implement measurements for each step. The data collection must prove the failure or success of each step (a minimal sketch follows after this list).
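Here is that sketch, using analytics.js events (the process and step names are invented for illustration):

```javascript
// Record the outcome of each step in a business process as an event.
function recordStep(processName, stepName, outcome) {
  // outcome is expected to be 'success' or 'failure'
  ga('send', 'event', processName, stepName, outcome);
}

// Called at the points in the journey identified in the steps above.
recordStep('Quote request', 'Step 1: contact details', 'success');
recordStep('Quote request', 'Step 2: vehicle details', 'failure');
```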
This means your analytics is no longer restricted to the standard configuration that "came out of the box".
The standard package can’t cater for all the unique challenges that your agency’s client has. We must design, implement and test data gathering techniques to fit a specific client.
Our aim is to create a "world of measurement" that supports the client's business. This may require development support: the developers' help in constructing Cascading Style Sheets and code will determine whether Google Tag Manager is viable.
In many cases these techniques may not be difficult. Perhaps we need to count button clicks, or assess particular form entries.
However, the techniques may have implications for the rest of the process. The speed with which the website moves a user between pages might stop the data collection. In these cases, review the process design.
6. Not recognising when a report "doesn't work"
The standard reports that "come in the box" with Google Analytics are certainly comprehensive. But marketing mistakes arise if it's assumed they answer all the questions the client has, or that all of them are relevant.
It's worth reflecting for a moment on how assumptions would change if we didn't have free or near-free cloud-based services.
Some 20 years ago, a comprehensive package like Google Analytics would have sold for many thousands of pounds.
And the vendor wouldn't just have offered a software package. It would have pushed hard to sell consulting services.
These consultancy services customised the package to suit the customer's business. Today, Google cross-subsidises the supply of the software from digital advertising.
But it's a bad marketing mistake to let customisation just disappear. Without a vendor with a commercial reason to sell customisation, people are easily misled into thinking customisation isn't required.
But a mass market standard offering like Google Analytics cannot cater for a customer’s key business processes. And proper customisation takes time and effort.
Some of the customisation will push valuable information into otherwise empty standard reports. For instance, there are no goal funnels until goals are configured. Using the stepwise process definition above means the "goal funnel" report can provide a lot of value: value that makes Google Analytics reflect the nuances of a unique business.
Other marketing mistakes occur when too many dimensions are shown in one report, or the wrong dimension is chosen as the primary index.
For instance, I've seen attempts to show the various channels that web traffic uses. A mini bar chart shows the varying number of sessions for desktop vs mobile vs tablet devices for each marketing channel.
Does this work? No.
The number of sessions per channel can vary dramatically, so the total that each mini chart treats as 100% is far from consistent.
It would be clearer to show the popularity of each channel within each device type.
Recognising how the web visitor thinks clarifies this.
A visitor doesn't choose the marketing channel to which they wish to respond and then choose a device afterwards.
A visitor selects a device first, and only afterwards reads information on it.
Framed that way, the likelihood that a user on a particular device responds to a given marketing channel becomes much clearer.
7. Failing to implement proper naming conventions
More marketing mistakes occur with poor naming. Naming is surprisingly important. There are two important criteria:
- The names within reporting need to be easily understood. These might be for events, goals, concepts, stages or processes.
- Naming needs to show the user how to navigate the hierarchy. This is vital if users are searching for the answer to a question and are trying to locate information to help them.
In most cases the naming should reflect the client’s business.
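To illustrate (both names below are invented), here is the same click recorded under two naming styles. The first is technically valid but opaque; the second reads in the client's own terms and shows where it sits in a process, step, outcome hierarchy.

```javascript
// Hard to understand, and gives no clue where it belongs in the hierarchy.
ga('send', 'event', 'frm3', 'btn_submit', 'ok');

// Self-explanatory in the reports and easy to navigate to.
ga('send', 'event', 'Quote request', 'Step 3: confirm cover', 'Success');
```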
There are probably subtleties about the way a particular term is understood by a particular client.
For instance, "marketing qualified lead" carries with it a set of assumptions about what it took for the prospect to be "qualified".
There should be a glossary of terms so newcomers can rapidly learn any new terminology.
Stakeholders may well have come from competitors who use the terms in a subtly different way. Or from other areas within the same business that don’t share the same assumptions.
Getting naming conventions implemented is normally more about process than technical difficulty. However, we must accept that perfection and complete coherence are unattainable. We are simply looking for a set of names that fulfil the two criteria above.
Conversations about naming conventions need to cover:
- how the convention can remain robust and accept future names without destroying the fundamental logic;
- how it communicates any subtleties arising from the data gathering or processing.
I hope this post about marketing mistakes in reporting is useful. I realise it's long and involved. Please get in touch or comment below if you want to discuss anything.