You Ask, I Answer: Content Intent Measurement?

Susan asks, “Some marketers say that clicks are not a great indicator of content performance. They say we need a tool that measures intent. What do you think?”

Can’t see anything? Watch it on YouTube here.

Listen to the audio here:

Download the MP3 audio here.

Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for watching the video.

Christopher Penn 0:13

In today’s episode, Susan asks, some marketers and vendors say that clicks are not a great indicator of content performance.

They say we need a tool that measures intent.

What do you think? Oh, a vendor recommending a tool? Gosh, I can’t imagine which vendor they want us to select for that.

I would agree that clicks alone aren’t necessarily a great indicator of content performance.

Because a click really is sort of the top of the content funnel, right? If somebody clicks through to a blog post on my blog, I know that they got there.

What I don’t know is: did they read it? How much of it did they read? Did they then go take an action of some kind implicit in that post, to do something that is of value?

And those additional measures would indicate intent, right? If you’re on my blog, and you read a post, and you click through to other posts you may like, and eventually you fill out a form, then I know that the metrics around that piece of content probably should be checked to see if they correlate to the outcome I care about, which is somebody filling out a form.

So the way that you do this is you take all of the metrics that you have: clicks, scroll depth, time on page, bounce rate, all the content metrics that you know and love.

And then you line them up in a really big table by day.

And then you line up a marketing objective, some kind of quantified goal or conversion, in that same table.
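Here’s a minimal sketch of that step in Python, assuming your analytics and conversion data have been exported to daily CSV files; the file and column names are hypothetical placeholders, not anything your tools produce by default.

```python
# A minimal sketch of lining up daily content metrics and the marketing
# objective in one table. File and column names are hypothetical placeholders
# for whatever your analytics and CRM exports actually contain.
import pandas as pd

metrics = pd.read_csv("content_metrics_by_day.csv", parse_dates=["date"])
goals = pd.read_csv("conversions_by_day.csv", parse_dates=["date"])

# One row per day: clicks, scroll depth, time on page, bounce rate, plus the
# quantified goal (here, daily form-fill conversions) joined on the same date.
daily = metrics.merge(goals, on="date", how="inner").sort_values("date")
daily.to_csv("daily_content_metrics.csv", index=False)
print(daily.head())
```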

And then you do the math: you run a regression analysis and ask which variables, alone or in combination, have a statistical relationship with the outcome we care about. It might be a bounce rate of 25% or less, it might be clicks, it might be scroll depth of 80% or more.
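And here’s a minimal sketch of that regression step, continuing from the hypothetical merged table above; an ordinary least squares regression is just one reasonable choice of model.

```python
# A minimal sketch of regressing daily conversions on daily content metrics.
# Uses the hypothetical merged table from the previous sketch; column names
# are placeholders for your own metrics.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("daily_content_metrics.csv", parse_dates=["date"])

# Predictors: the content metrics you already collect.
X = sm.add_constant(
    df[["clicks", "scroll_depth_pct", "time_on_page_sec", "bounce_rate"]]
)
# Outcome: the marketing objective lined up in the same table.
y = df["conversions"]

# Ordinary least squares: which variables, alone or in combination, have a
# statistical relationship with the outcome we care about?
model = sm.OLS(y, X).fit()
print(model.summary())  # inspect coefficients and p-values
```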

Whatever the metric combination is, you correlate it to the outcome you care about, and then you test it, right? So if you find out that time on page has to equal two minutes or more, well then start writing longer content. If people spend longer on the page, do you then see a commensurate increase in the number of conversions? If the answer is yes, if it’s proportional, then you know you’ve found causation. You’ve shown that longer content keeps someone on the page longer, and keeping them on the page longer makes them more likely to convert; that’s a causal chain you’ve traced.

If, on the other hand, you take your blog posts that were 200 words, make them all 1,200 words, and frankly bore everyone, and you see time on page go up but you don’t see conversions go up, then you know that in that initial analysis you had a correlation, but not necessarily causation.
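A hedged sketch of that test step: compare the conversion rate before and after the content change with a two-proportion z-test. The counts below are made-up placeholders.

```python
# A sketch of checking whether conversions actually rose after a content
# change (e.g., longer posts), using a two-proportion z-test. The counts are
# made-up placeholders; substitute your own before/after numbers.
from statsmodels.stats.proportion import proportions_ztest

conversions = [120, 185]      # conversions before vs. after the change
sessions = [10_000, 11_000]   # sessions before vs. after the change

stat, p_value = proportions_ztest(count=conversions, nobs=sessions)
print(f"z = {stat:.2f}, p = {p_value:.4f}")

# A small p-value suggests the conversion rate genuinely changed. If time on
# page went up but this shows no lift, you had correlation, not causation.
```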

And now in order to do this, you have to have an analytical infrastructure that supports a lot of these metrics.

That means using tools like Google Tag Manager or Adobe’s tag manager, and analytics tools like Google Analytics, Adobe Analytics, Matomo, Plausible, or any of the others. And you’ve got to have all the different measures set up: scroll depth, for example, which is just how far down a page somebody has read; time on page; average time per session; pages per session. All of these metrics need to be configured and set up in things like Tag Manager and Google Analytics so that you can run the analysis later on.
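As a rough illustration only, pulling those daily measures from a Google Analytics 4 property with the google-analytics-data Python client might look something like this; the property ID is a placeholder, and the metric names are assumptions that depend on what your property actually collects.

```python
# A rough sketch of pulling daily metrics from a Google Analytics 4 property
# with the google-analytics-data client. The property ID is a placeholder and
# the metric names are assumptions; verify them against your own property.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

client = BetaAnalyticsDataClient()  # uses your Google Cloud credentials

request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    dimensions=[Dimension(name="date")],
    metrics=[
        Metric(name="screenPageViews"),
        Metric(name="bounceRate"),
        Metric(name="averageSessionDuration"),
        Metric(name="conversions"),
    ],
    date_ranges=[DateRange(start_date="90daysAgo", end_date="yesterday")],
)

response = client.run_report(request)
for row in response.rows:
    print(row.dimension_values[0].value, [m.value for m in row.metric_values])
```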

If you don’t have those metrics set up, you need to do that first.

Right.

And then, as with almost everything in web analytics especially, you’ve got to wait some time, because those measures are never, ever retroactive.

They only start collecting data the day you turn them on.

Once you’ve done that, then you do the math.

And you say, okay, well, what is the outcome we care about? Is it leads? Is it form fills? Is it contacts, scheduled demos, free trial bookings, you know, timeshare rentals? Whatever the outcome is, do you have that data collected? And is it in a format that you can analyze?

Christopher Penn 4:31

There has been a lot of work done with trying to discern intent.

And you can break down intent very much like a funnel, right? Somebody goes from general awareness, to knowing they have a problem but not knowing what the solution is, to knowing there’s a problem and knowing there’s a generic solution.

And then from knowing there’s a problem and knowing there’s a generic solution, to knowing there’s a specific solution, which ideally is your company.

That progression of intent is something that you want to measure.

When you think about it, all the different metrics that we have access to probably fall into different buckets within that operational funnel, right? Somebody who’s just browsing, who maybe is in the early stages of understanding the problem they have, is not necessarily in the market for a solution, and certainly doesn’t want to talk to one of your sales folks.

Their content metrics might be different. For example, their time on page might be really long, because they’re trying to learn about the issue.

On the other hand, somebody who’s figured out the problem, figured out the solution, and knows you are the solution, their time on page might be really short, right? The person who’s trying to understand the problem might spend 15 minutes reading a blog post; the person who knows they want to hire you might spend two seconds on a blog post, because they just scroll down to find the contact-us form, hit that form, and fill it out.

And so even the content intent metrics that you have may need to be broken out based on where the customer is in their lifecycle journey, and then appropriately analyzed for each stage of that journey.
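One hedged way to picture “broken out by journey stage”: if each session row already carries a stage label (and assigning that label is the hard part), you can analyze the metrics within each stage separately. Everything below, including the funnel_stage column, is a hypothetical placeholder.

```python
# A sketch of analyzing content metrics separately per journey stage, assuming
# a per-session table with a hypothetical "funnel_stage" label already assigned
# (e.g., problem-aware, solution-aware, vendor-aware). Column names are
# placeholders.
import pandas as pd

sessions = pd.read_csv("sessions_with_stage.csv")

for stage, group in sessions.groupby("funnel_stage"):
    # Correlate each content metric with conversion within this stage only,
    # since "good" time on page means different things at different stages.
    corr = group[
        ["time_on_page_sec", "scroll_depth_pct", "pages_per_session"]
    ].corrwith(group["converted"])
    print(f"\nStage: {stage}")
    print(corr.round(2))
```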

That’s not something that’s super easy to do; it requires a lot of number crunching, advanced math, and coding.

It’s not something that any analytics tool on the market does out of the box, at least not to my knowledge.

If you do know of one, leave a note in the comments and let me know.

But that’s how you would approach the path of trying to understand what metrics are good indicators of content performance.

And I will caution you that just because something is a good measure in an industry study or white paper does not necessarily mean it’s right for your audience; your audience may be different from the collective as a whole.

Right? Say you take toy makers in aggregate who make toys for girls, and you have Hasbro in there with My Little Pony.

There’s this whole subculture, there’s a Netflix special about bronies, men 26 to 40 who are really into My Little Pony. That audience, I guarantee you, behaves differently than eight-to-14-year-old girls. They buy more, they consume content differently.

Their intent is different.

And so an industry study about what eight-to-14-year-old girls like in toys probably is not going to be super helpful if you’re Hasbro and you’ve got bronies in your stable of customers.

So you need to figure out, of all the content marketing metrics that are available, which ones matter to your audience specifically, and that means getting to know your audience, too.

So that’s what I would say.

The last thing I’d say is: if a vendor that makes content intent tools is telling you that existing tools are not great indicators, take that with a grain of salt.

And by a grain of salt, I mean like a 40-pound bag, which is about 18 kilograms.

They very clearly have an interest in selling you their software.

And so they’re going to position everything that exists as insufficient, and only their software will solve the problem for you.

That’s been my experience with a lot of vendors.

And it’s simply not true.

Now, if their software does this level of analysis, great, maybe it’s a good fit.

But if they just say, well, use our proprietary measurement system, then it’s our right, and our responsibility, to push back and say: great, I’m going to take that measure and run the same analysis I run on all these other measures.

And we’re going to find out if your tool is actually effective enough or not for our audience.

And if it’s not effective, then guess what, we’re probably not going to buy it.

So as you negotiate with vendors, if you’ve got the analytical chops to do this kind of analysis, put them to the test, right? See if they’re willing to submit their data for analysis, in the same way that you’ve analyzed all your other content intent metrics, and see how the tool performs.

Usually you never get to that stage, because the vendor just bails out.

So a good question, complicated question.

But thank you for asking.

I’ll talk to you soon.




