Door-to-door selling still works, depending on what you do, what you sell, and whom you are visiting. In times of the corona crisis, however, digital prospecting has the upper hand.
1- DIGITAL PROSPECTING
When we talk about digital prospecting, we mean:
a) INBOUND MARKETING (ENTRANT)
Digital marketing is also called inbound marketing (in French: prospection entrante). Inbound marketing is how your leads (in French: prospects) get to know you, and how you draw them to you to create contact opportunities.
The basis for marketing is your website, which you will connect to Google. Google is the motorway where everyone searches for information. Your website must be well indexed on Google for good visibility, and it must also be connected to the existing social networks (e.g. LinkedIn, Twitter, Facebook, Instagram).
In marketing, we use three foundations:
SEO: Search Engine Optimisation (in French: référencement naturel). This is how you write relevant web content so that you rank on Google when someone is looking for your company.
SEA: Search Engine Advertising (in French: référencement payant). It refers to Google ads on the Internet. When people look for you, you appear among the first results shown, labelled “Ad”. This makes you visible.
Both SEO and SEA are important. Be careful, however: one takes longer to get results than the other. SEO takes between 6 and 12 months to become effective, and requires following a real content strategy.
SEA is much quicker. In general, you pay Google and you are visible within the following days. So it allows you to generate website traffic quickly and convert that traffic into leads.
The two are complementary. SEO is a long-term time investment but doesn’t cost anything, whereas SEA only works in the short term: once you stop paying Google, it’s over for you. It is recommended to mix both.
SMO: Social Media Optimisation (in French: Optimisation des Réseaux Sociaux). It allows you to build a community, make what you sell (products or services) known, generate traffic to your website, and convert that traffic into leads or even customers.
Contrary to what some digital agencies will tell you, inbound marketing isn’t a one-size-fits-all solution. Outbound marketing (in French: prospection sortante) is still necessary, and you will have to prospect hard. Outbound marketing still carries a lot of weight.
b) OUTBOUND MARKETING (SORTANT)
When one talks about outbound marketing (in French: prospection sortante), they mean contacting leads by phone or by email. Although these are more traditional prospecting techniques, if you do not carry them out, you will cut yourself off from a large part of your prospecting. Outbound marketing allows you to connect what you sell directly to your target market. Furthermore, just because it is a traditional technique does not mean you have to do it the old-fashioned way. Today, we use many digital products to prospect in a qualitative way. This is what I am going to delve into, introducing you to the different digital products I use.
Note as well that in 2019, 80% of companies in France achieved more than 80% of their turnover thanks to outbound marketing.
2- GOAL: CREATING A QUALITY DATABASE
You must build a good, solid database.
You can do in-depth research on your ideal client profiles with LinkedIn Sales Navigator. Sales Navigator is a paid tool, but it offers better-targeted searches than LinkedIn Premium, which doesn’t allow a huge amount of data scraping. You can use filters such as:
Years of experience
You can also scrape data from LinkedIn and automate this process with Phantom Buster.
Phantom Buster pretends to be you, using your LinkedIn profile. To do this, you copy and paste the link to your Sales Navigator/LinkedIn search. It will then scrape data from each lead profile and export it into an Excel spreadsheet.
Then, you can use other tools to find the phone numbers and email addresses of your leads. There are many options. Zoho is a free tool, but the downside is that it doesn’t integrate with any other tools.
Other tools they recommend you use:
These tools allow you to find the information you require and qualify your prospects while respecting the current General Data Protection Regulation (GDPR).
There is also ProspectIn, but you can’t export the data: everything stays inside the tool.
Anyway, what you need to remember is that these tools allow you to organise your prospecting information.
3- DATABASE CONSOLIDATION AND PROCESSING
Once you have this data, you will need to feed it into a tool that allows you to capitalise on the information you have just gathered.
The unavoidable tool for this is Customer Relationship Management (CRM) software. Without a CRM, a salesperson is nothing today! The CRM allows you to keep in touch with your clients and keep track of all actions done or required. In other words, it allows you to follow up with your leads and clients.
You have several options for CRM software like:
HubSpot (free of charge)
Pipedrive (€12 per month).
In your CRM tool, you can add tasks such as dates for follow-up calls. You can also add details on the follow-up call results such as:
Not interested – do not call back.
Follow-up call in X days…
If a lead answers that they are not interested, you can just answer: ‘I take into account… I am available…’. Most importantly, do not re-contact someone who has said they don’t want to hear from you again.
You can also use Buffer to schedule your social media posts, which can also serve as a form of follow-up.
Afraid of cold calling? Why not start with cold emailing? Before sending any emails, of course, do your research to find the best approach and tailor your emails to your leads. For your research, you need to answer questions such as:
What are the issues your leads are facing?
Is it relevant to discuss the topic with them? (this is also a question you can ask your lead)
What do they post and talk about?
For cold emailing, they recommend using Lemlist to create campaigns.
Lemlist will tell you how successful your campaign is by giving you data such as:
You can also add pictures to your emails.
Before starting your outreach, make sure you find out what problems your target market is facing, so you can better customise your campaign. Then, schedule monthly follow-up calls with a targeted lead list. If there are customers in your target market who aren’t online, you can connect with them through business networks and associations.
Finally, they recommend dedicating 1 to 2 hours to prospecting every day, and budgeting roughly €100-160 per month for a comprehensive set of prospecting tools, as these will replace a good few of your car trips to visit customers.
If you wish to get in touch with the UP BIZ, you can contact David Julien by email at firstname.lastname@example.org. He is based in Rouen, Normandy in France.
Since it is quite technical, I recommend you sign up for Google Tag Manager and follow the process he talks us through.
If you have a more auditory or visual memory, you will find the podcast transcript and PowerPoint presentation further down in this article.
1. Understand and invest in your data
Google Tag Manager helps you measure success in Google Analytics.
If you take away only one thing from this evening, it’s understanding and investing in your data.
Google Analytics is designed to work well out of the box. With zero customisation, it’s very easy to set up.
But let’s be honest, the ‘one size fits all’ approach to marketing is rarely the best. Indeed, the needs of your business and the Key Performance Indicators (KPIs) of your website are unique.
Consequently, data collection is crucial for the entirety of the analysis process. It doesn’t matter:
how many segments you build
or how many goals you define,
if you mess up your data collection, it will screw up every other stage too. So the value of the insights your analytics software gives you is directly tied to the investment you have made in data collection at the first stage of the whole process.
There are no magic bullets, but I hope everyone here will be able to take away at least one technique they weren’t previously aware of and get some of the value from it.
2. The challenges of engagement traffic
So, we are going to start with engagement tactics, specifically content engagement, because so many organisations are stuck trying to answer meaningless questions like ‘why is that Bounce Rate so high?’
The problem with that is that you see reports saying things like ‘Our content is really good because our sitewide average bounce rate is down to 10%’. But this statement is worse than misleading, and it is often inaccurate.
In fact, many people who use Bounce Rate as their primary KPI don’t actually understand what Bounce Rate measures. The effect of this is that individuals are encouraged to fix the metric rather than the underlying problems, which are of course unique to your site.
So, let’s refresh ourselves on the definition of Bounce Rate.
Google counts a single-page session when only a single request reaches the Analytics server. What that typically means is that a user arrives at and leaves your site via a single page, without doing anything on any other pages in between.
It’s important to remember that sessions are really fictional constructs that Analytics comes up with when it processes your data.
Analytics doesn’t know how long a user spends looking at a particular page. It doesn’t set any kind of timer to measure when a session started and when it ended. All it has is this raw hit data:
From this data, it extrapolates and builds this arbitrary notion of a session, which ends after 30 minutes of inactivity (a time gap between hits), at midnight, or on a campaign change.
Now, incidentally, this is why, if you commit the sin of tagging your internal links with UTM parameters, you generally see a very high Bounce Rate on most pages: navigation via those links starts a new session.
So, to calculate something like ‘average time on page’, GA actually measures how long it takes until the next page hit is received. To get the session duration, it just measures the time between the first and the last ‘hit’ in that session.
So, with ‘Bounces’, GA doesn’t have enough data to generate all those metrics it reports, such as average time on page.
Indeed, there is no second hit it can measure against to calculate ‘time on page’. That is why Bounce Rate is not a good metric to use as your sole KPI, especially when used in aggregate: it becomes meaningless, because the questions we can’t answer are substantial.
We don’t know what the user did on the page, or how valuable they are to us as potential customers. We don’t know either whether:
the website functions properly on that device
they read every single word of that content and
they bookmarked it to come back later.
Ultimately we lack data.
3. Google Tag Manager can help us improve our data collection
A smart implementation of Google Tag Manager (GTM) is necessary.
CONTROLLING AND TWEAKING THE BOUNCE RATE
So, we will stick to the ‘Bounce Rate’ for a while because it demonstrates some good points. You do have control over the bounce rate calculation.
Indeed, you can control which hits affect Bounce Rate (BR) and which don’t. To illustrate this point, here is an example from a client I recently on-boarded. They were seeing a 0% BR on most of their pages and couldn’t figure out why.
Ultimately, what happened is that the development team had configured not just the standard pageview, but also an ‘Event’ that fired when all the dependent resources on the page were ready (images, skyscrapers…).
Consequently, it was impossible to have a single-hit session, because every page viewed was firing two hits. That’s the same principle behind why really bad WordPress implementations often see a low Bounce Rate: duplicate tracking code means two hits per page.
But don’t worry, you can control which ‘events’ affect the Bounce Rate by using the ‘Non-Interaction Hit’ flag. You can set this very easily in GTM when you are configuring your ‘event’ tag, by setting ‘Non-Interaction Hit’ to ‘True’. The BR for the page on which this ‘event’ fires will then be calculated as if the event wasn’t there.
So, for example, if you absolutely have to fire an event when an auto-playing video starts, just set ‘Non-Interaction Hit’ to ‘True’, and the BR will be calculated as if that second hit wasn’t there, making it more accurate.
This idea of using ‘events’ to control our BR plays nicely into the whole idea of ‘On-page Engagement Tracking’, e.g. within a single-page session.
A lot of people have started using some of GTM’s built-in triggers to try and manipulate the BR. For example, GTM has a ‘Timer’ trigger, and by using that, you can avoid relying on GA’s arbitrary ‘time-on-page’ calculations.
But one trigger I’m really fond of is the new ‘Element Visibility’ trigger. To illustrate my point, I picked a random example from the Learn Inbound website. Let’s say you have ‘Calls-to-Action’, like this email sign-up widget, strategically distributed throughout your longer pieces of content.
You may be interested in who gets to that position in your content, or in preventing people who got that far through your guides from being counted as Bounces.
If you strategically position these kinds of elements at different positions throughout your various page types, then the ‘element visibility’ trigger can be a powerful way to take advantage of this.
So, we’ll set up a trigger now. As you can see, it lets us define an ‘event’ based on either an ID or a CSS selector. We have control over when this trigger fires: we can set it to fire when the element has been on-screen for a certain duration as the user scrolls through your content, or once a certain percentage of the element is visible in the viewport. You can even control how many times it fires if the element appears multiple times per page.
So, in this example, we use this trigger and others to fire an ‘event’ when someone starts scrolling through our content (obviously set as a ‘Non-Interaction Hit’), another when they view the ‘call-to-action’, and another when they reach the footer.
So, by drilling down to a particular page and then viewing this kind of ‘event’ data, it can be very powerful in allowing us to get a sense of who is actually reading our content versus just bouncing immediately.
It can also be segmented by audience type and page to give us insight. This way, we can actually steer our internal linking or content strategy based on what we learnt about which pages people are engaging with, and it can be specific to your other page types. So, needless to say, this goes much further than tweaking the Bounce Rate.
TAILORING YOUR DATA COLLECTION METHOD AROUND THE PAGE TYPES
Your data collection method needs to be tailored not just to your business, but to the different page types and the different types of content on your site.
As an example, we are going to look at ‘Interactive Content’: an interactive piece of content marketing which lets users calculate the heating costs for their home. You can select your ‘Room Types’, ‘Sizes’ and ‘Glazing’, and it will give you an approximate cost for heating.
Now, in a classic example of ineffective communication between marketing and developing teams, this was pushed out of the door with very little consideration given to its tracking requirements.
It is a shame, because GTM is really good at letting us track highly relevant interactions taking place on a piece of content like this. Interactions which are very relevant to the kind of audience we are trying to appeal to with this content.
In this instance, we have attached a ‘dataLayer.push’ to the ‘Event’ and pushed ‘CalculatorGo’. To listen for this as a trigger in GTM, all we do is set up a ‘Custom Event’ trigger and name the ‘Event’ that will appear in the ‘Data Layer’ ‘CalculatorGo’. We can use this to fire a Google Analytics Event Tag, so we know how many people are using the interactive content.
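As a sketch, the push behind the ‘Go’ button could look like this. Only the event name ‘CalculatorGo’ comes from the talk; the handler wiring and element ID are assumptions:

```javascript
// Minimal sketch of the dataLayer push described above. In a browser,
// dataLayer is the global array GTM listens to; we initialise it here
// so the sketch is self-contained.
var dataLayer = dataLayer || [];

// This would normally run from the calculator's 'Go' button, e.g.
// document.getElementById('calculator-go')
//   .addEventListener('click', onCalculatorGo);
function onCalculatorGo() {
  dataLayer.push({ event: 'CalculatorGo' });
}

onCalculatorGo();
```

GTM’s ‘Custom Event’ trigger then matches the event name ‘CalculatorGo’ and fires the GA Event Tag.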
USING CUSTOM VARIABLES TO GET MORE GRANULAR
We want to know how people are using this content. Its purpose is to appeal to a wide audience and drive more revenue. Ultimately, we want to know how people are engaging with the content we built.
So, let’s say, for example, we want to know which options users are selecting when they use our calculator. We can supplement our ‘Data Layer’ event with two data variables, ‘Room Type’ and ‘Glazing Type’. These simply populate the ‘Data Layer’ with variables reflecting the user’s choices at the moment they hit ‘Go’.
Then, we set these up as ‘Data Layer Variables’ in GTM. This means they are now available for us to use in our tags, in our Google Analytics ‘Event’ tag for example.
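A sketch of the supplemented push, assuming the two Data Layer keys are named roomType and glazingType (the key names and example values are illustrative, not from the talk):

```javascript
var dataLayer = dataLayer || [];

// Push the event together with the user's current selections, so GTM
// can expose them as Data Layer Variables for the GA 'Event' tag.
function onCalculatorGo(roomType, glazingType) {
  dataLayer.push({
    event: 'CalculatorGo',
    roomType: roomType,       // e.g. 'living-room'
    glazingType: glazingType  // e.g. 'double'
  });
}

onCalculatorGo('living-room', 'double');
```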
So, here we have referenced those variables as the ‘Event’ action and label respectively. This will give us relevant data about:
what they are using our interactive content for
and what they are looking for.
We can use this to iterate not just on the layout and functionality of the page, but also as the basis for guiding our content strategy or improving our lead nurturing process.
You can extend this approach a long way by using ‘Goals’. By segmenting to a particular campaign, for example, we can then see how people are engaging with this content and analyse it in isolation.
Thanks to native ‘variable types’, we can get quite creative.
So, to keep the same example, we could set up an ‘Event’ which fires when someone engages with our piece of content, and set its value based on what we know about them as users.
We could come up with systems using ‘Lookup Tables’, or even ‘Custom JavaScript’ variables running in GTM, which assign an arbitrary value based on how valuable those users are to us as ‘leads’. We then set this as the ‘Goal’ value in GA.
This will give us a sense of how valuable that traffic is in terms of potential customers. So, we can see the absolute number of conversions, but also an approximation of their fair value to us as customers.
And of course, when segmented based on a particular campaign, we can start to gauge the content value of our marketing content efforts.
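A GTM ‘Custom JavaScript’ variable is just a function that returns a value, so the lead-scoring idea can be sketched as below. The roles and scores are invented for illustration:

```javascript
// Assigns an arbitrary monetary value to a visitor based on a
// (hypothetical) declared role, mimicking a GTM Lookup Table.
function leadValue(role) {
  var scores = {
    'decision-maker': 50,
    'influencer': 20,
    'student': 1
  };
  // Unknown roles fall back to a default score.
  return scores[role] || 5;
}

console.log(leadValue('decision-maker')); // 50
console.log(leadValue('unknown-role'));   // 5
```

The returned number is then referenced as the ‘Goal’ value in the GA tag.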
4. Smarter segmentation
The last area I want to explore is using GTM to better group our content.
For example, if we wish to segment our content into different groups based on the author, we can do that with ‘Content Grouping’. It’s very easy to implement.
We create the ‘Content Grouping’ at the ‘View’ level. Then, we enable a tracking-code-based implementation and give it an ‘Index Number’ of ‘1’. Afterwards, we can set the actual author using a ‘Data Layer Variable’.
By using the ‘Data Layer’, you can work much more smartly. We get our development team to implement the ‘Blog Author‘ as a ‘Data Layer Variable’.
It’s the same principle as we used earlier for our interactive content; we can then reference that in our ‘Pageview’ tag. Under ‘More Settings’, we reference the ‘Data Layer Variable’, so that every pageview hit will fetch the name of the author from the ‘Data Layer’ and fire it as the value for that ‘Content Grouping’.
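As a sketch, the development team’s push could look like this, emitted in the page template before the GTM container snippet. The key name blogAuthor and the author name are hypothetical:

```javascript
// Written into each blog post's template by the development team, so
// GTM can read the author via a Data Layer Variable on every pageview.
var dataLayer = dataLayer || [];
dataLayer.push({
  blogAuthor: 'Jane Doe'  // hypothetical author name
});
```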
As a result of this, you can view an aggregate performance of pages by particular authors and get a sense of how they perform as a whole. That’s very useful data when it comes to assessing how well your content strategy is performing.
To segment users further, let’s look at particular groups of our audience, such as ‘Behaviours’.
For example, we might decide to track users who comment on our blog, then view that ‘Audience’ group as a separate segment of traffic using ‘Custom Dimensions’.
Whereas ‘Content Grouping’ allows us to organise our pages into logical groups, ‘Custom Dimensions’ let us record extra, non-standard data on top of GA’s standard dimensions. They are also very flexible in how they let us do this.
Remember that every hit which goes to GA has a scope. For example, a ‘Pageview’ hit has a scope limited to that pageview, but ‘Landing Page’ has a scope which applies to the whole session.
Now, it’s the ‘User Level Scope’ we are interested in, because it lets us apply the data from that hit to the user and to all of their subsequent interactions on the website.
So we set it up at the ‘Property’ level, which gives us 20 ‘Dimensions’ per ‘Property’. We’ll give it an ‘Index Number’ of ‘1’ and set the ‘Scope’ to ‘User’. So, back in GTM, we are going to fire this ‘Custom Dimension’ as part of an ‘Event’ hit launched when someone comments on our blog.
Then, under ‘More Settings’, we can set the ‘Custom Dimension’. We will put an ‘Index Number’ of ‘1’ and a ‘Dimension Value’ of ‘Commenter’.
In terms of the trigger, we can once again use a ‘Data Layer Event’. To run through what happens behind the scenes: a user submits a comment. That action pushes an ‘Event’ to the ‘Data Layer’, which we are listening for in GTM. GTM then fires a normal GA ‘Event’ tag. That hit includes a ‘Custom Dimension’ which defines the user as a commenter, and it will apply to all of their subsequent actions on the site as well.
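The comment push could be sketched like this (the event name commentSubmitted is an assumption, not from the talk):

```javascript
var dataLayer = dataLayer || [];

// Fired from the comment form's submit handler. GTM's 'Custom Event'
// trigger matches the event name and fires the GA Event tag carrying
// the user-scoped 'Commenter' Custom Dimension.
function onCommentSubmitted() {
  dataLayer.push({ event: 'commentSubmitted' });
}

onCommentSubmitted();
```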
As a result, we can now view the behaviour of our engaged users as a segment in GA and see how they differ from our wider readership. We can also use that as the primary dimension in a report to analyse the results in our funnel.
5. Work with your developers
It is important to collaborate with your development team when it comes to data collection.
It is really vital that you understand how these technologies work so that you can communicate effectively with your development team.
The ‘Data Layer’, which underpins a lot of the techniques we ran through today, sits on neutral ground between marketing and development. The kind of data encoded into the ‘Data Layer’ is semantic information about:
our audience and our customers,
and what they are doing.
It also enforces a shared language between teams.
A well-defined and well-maintained ‘Data Layer’ means the data about your content and the interactions that take place is accessible in a format independent of any platform or technology. You are not reliant on scraping your HTML; you can instead make the data points you are interested in available to use.
However, you need to get your development team to implement it: it is a very powerful tool, but done badly it can easily break your website. The ‘Data Layer’ should be regarded as a pre-requisite for good measurement.
Thanks to the built-in variables (error message, error URL, error line), information which the user wouldn’t be reporting themselves, we can then fire information to GA on real-world usability issues. Don’t forget to set that ‘Non-Interaction Hit’ to ‘True’. This will take no more than 5 minutes to implement, and it will get you real-world data about:
what’s breaking on your website
and for whom.
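GTM’s built-in JavaScript Error trigger captures these variables for you. Outside GTM, the same signal can be sketched with a plain handler; the push structure below is an assumption, not GTM’s internal format:

```javascript
var dataLayer = dataLayer || [];

// In a browser you would wire this to the global error event:
//   window.addEventListener('error', function (e) {
//     reportError(e.message, e.filename, e.lineno);
//   });
// GTM's JavaScript Error trigger does the equivalent and exposes
// Error Message, Error URL and Error Line as built-in variables.
function reportError(message, url, line) {
  dataLayer.push({
    event: 'jsError',
    errorMessage: message,
    errorUrl: url,
    errorLine: line
  });
}

// Simulate a captured error for the sketch.
reportError('undefined is not a function', '/guides/heating.html', 42);
```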
You can cross-reference it with the other built-in dimensions as well, like operating system and browser. You can give that information to your developers and segment it by page. And you will make your website more accessible and functional. The value of the insight you can get from your analytics software is tied to the investment you make in data collection.
By demonstrating success and by unlocking the kind of actionable insights that you need, you can justify whatever it is that you are looking for:
more innovative projects
more development time for your team
and ultimately whatever you need to do your job better.
For those who would like to download the PowerPoint slides, containing more visuals and his contact details, click on the link below: