When bad data can be okay

As marketers, and especially as marketing technologists, we prize correct data, correct metrics, and correct information (or we should). Incorrect, faulty, and misleading data are anathema to our profession and our ability to do our jobs. So it might seem absolutely absurd, even heretical, to say that sometimes, bad data can be okay.

When could wrong data, bad data ever be okay? Here’s a thought exercise for you.


Imagine for a moment that you want to know what time it is. On the wall is a clock that is clearly showing the wrong time. You know for sure it’s noon, but the clock says 3. When you check the clock later that day as you head home, you know it’s 5 o’clock, but the clock says 8.

Is the data bad? Yes. Can you still use it? Yes, as long as you know the clock is three hours ahead. The data is bad but predictably and reliably bad. You can develop a mental model (just subtract three) to compensate for the error.

Now imagine the clock shows 3 when you know it’s 12. In an hour, it shows 5. Then it goes backwards and shows 11 within a few minutes. Is the data bad? Yes. Can you still use it? No. In this case the data is bad and unreliable, unpredictable, seemingly random. It’s not something you can build a model on, and thus it’s totally useless.

Many of the measures we use in marketing come from derived sources, such as Klout scores, other social influence measures, or estimated web traffic. When you’re looking at metrics and tools, the question you should be asking yourself isn’t necessarily whether the data is right (though that’s an important question) but whether it’s reliable.

You can model reliably wrong data that you understand. You cannot model correct data with surety if you don’t know what it’s made of, because things could be changing behind the scenes that you can’t see or compensate for. One day you wake up and what seemed like right data has become wrong data overnight.
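To make the clock example concrete, here’s a minimal sketch in Python, using made-up readings, of the difference between a reliably wrong source and an unreliably wrong one: a constant error can be measured once and subtracted forever, while erratic error leaves you nothing to model.

```python
# Paired observations of (true hour, hour the clock reported).
# These readings are invented purely to illustrate the point.
reliable_clock = [(12, 15), (17, 20), (9, 12)]      # always 3 hours ahead
erratic_clock = [(12, 15), (13, 17), (13.1, 11)]    # off by a different amount each time

def constant_offset(observations):
    """Return the clock's offset if every reading is wrong by the same amount, else None."""
    offsets = {reported - true for true, reported in observations}
    return offsets.pop() if len(offsets) == 1 else None

for name, clock in [("reliable clock", reliable_clock), ("erratic clock", erratic_clock)]:
    offset = constant_offset(clock)
    if offset is None:
        print(f"{name}: error is unpredictable -- no usable model")
    else:
        print(f"{name}: always {offset:+g} hours off -- subtract it and the data is usable")
```

The point isn’t the code; it’s that a known, stable bias is a one-line correction, while an unstable one can’t be corrected at all.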

No better example of this exists than Google’s algorithm. No one knows what’s in it, and thus trying to “win at SEO” is an impossible task: what you think is right today may be wrong tomorrow, and you have no way of knowing until you lose search rankings. Even worse, because you don’t know what’s in it, you don’t know how to fix what’s wrong except by random experimentation.

Ask how reliable your data is!




The secret of future SEO

SEO tricks continue to get devalued. Google keeps getting better at picking up tricks and rendering them valueless.

So how do you know which tricks are, at best, a waste of time and, at worst, will get you penalized?

Here’s a simple rule.

Anything that can be repeated and scaled can be automated.

Anything that can be automated can be detected and discredited by Google.

It’s very easy to buy into a bot network and spam links across the web. Google caught on and has applied massive penalties to people who do so. It’s very easy to hire massive numbers of people through services like Amazon Mechanical Turk or Fiverr to mindlessly create links in blog comments or on social networks. Google can catch those, too. It’s very easy to buy an absurd number of press releases and stuff them with links. Google caught on and slapped penalties liberally on companies that behaved badly.

If you’re considering a marketing tactic and it can be automated and scaled, it can be caught by Google. They have more robots, more machines, more Ph.D.s, more networks, more everything than any one SEO company or marketing department.

So how do you know what won’t earn you a punishment from Google?

Google values what doesn’t scale. Google values great content, which is exceptionally difficult to scale. Google values innovative ideas, and heaven knows innovation is a struggle. If it’s unique and difficult to do, Google will probably value it. Being a great content provider? Hard to do, even harder to scale – ask anyone with a successful website or team blog just how difficult it is to consistently crank out great content. Being an innovative developer? Very hard to do, exceptionally hard to come up with consistently great new ideas, and incredibly hard to scale well.

Do what’s unique. Do what’s hard to replicate, hard to scale, hard to automate. And you now know that every proposal, pitch, or offer claiming an easily automated system to do X is a trap just waiting to happen.




Improve on-site SEO with Webmaster Tools Data Highlighter [Video]

Want to make sure you’re using the latest markup tags from Google for optimum on-site SEO? Google has made it easier than ever with the Data Highlighter inside its free Webmaster Tools software.

(Video: Google Webmaster Tools Data Highlighter – Getting Started)

Just click on the structural elements of your website or blog, categorize the different on-page pieces of data, and let Google do the rest.
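The Data Highlighter is essentially a point-and-click way to give Google the same structured data you could embed yourself with schema.org markup (microdata or JSON-LD). As a rough illustration only (not output from the tool, and with placeholder values), here is roughly what that structured data looks like for an article, generated as JSON-LD from a short Python script:

```python
import json

# A hypothetical schema.org Article record -- the kind of information the
# Data Highlighter teaches Google by pointing and clicking. All values below
# are placeholders, not output from the tool.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Improve on-site SEO with Webmaster Tools Data Highlighter",
    "datePublished": "2013-07-01",                           # placeholder date
    "author": {"@type": "Person", "name": "Author Name"},    # placeholder author
}

# If you marked up the page yourself instead, this JSON would go inside a
# <script type="application/ld+json"> tag in the page's HTML.
print(json.dumps(article, indent=2))
```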

Is it a magic wand or a silver bullet that will catapult you to the top of the rankings? Of course not. Will it help, particularly if Google isn’t surfacing obvious pages or links from your site when you search for it? Yes.

Watch this short, 4-minute video I made just for you:

