Google+ and Search Signals: Tinfoil Hat Edition

I was recently examining Google+ and how it transparently redirects clicks as it sends readers to your website. One of the oddities I noticed was this:

Take a look at how the URL starts. The second query string variable looks awfully familiar. If you’re not familiar with query string variables, they’re the key/value pairs in the URL that have an equal sign. For example, let’s break up that G+ URL into key/value pairs:

  • sa=z
  • n=1310267970417
  • usg=Fl5VGX6zftZiPhe6N1gENpM0EDQ.
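If you want to pull those pairs apart yourself, modern JavaScript's built-in URL API does it in a few lines. (The hostname and the url parameter's value below are stand-ins for illustration; only sa, n, and usg are the values from the redirect above.)

```javascript
// Break a Google+-style redirect URL into its query string key/value pairs.
// The hostname and url value are illustrative stand-ins.
const redirect =
  "https://example.google.com/url?sa=z&n=1310267970417" +
  "&url=http%3A%2F%2Fexample.com%2Fpost&usg=Fl5VGX6zftZiPhe6N1gENpM0EDQ";

const params = new URL(redirect).searchParams;
for (const [key, value] of params) {
  // searchParams decodes percent-encoding automatically.
  console.log(key + " = " + value);
}
```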

URL is pretty obvious: that’s the URL to the post I shared about Google+ Analytics.

USG appears to be a hash of some kind, but none of the usual reverse hash hacking tools made sense of it, which means it’s probably just a unique identifier, such as which Google+ post the URL was linked from.

SA is their standard URL discriminator; Z appears to be the type assigned to Google+.

That leaves us with the mystery of N. In the example above, N is a 13 digit number, 1310267970417. At first glance, it doesn’t appear to be anything, but if you look closely, it resembles a UNIX timestamp. Feed it into a UNIX timestamp calculator, however, and it returns a senseless result:

Sat, 27 Sep 43490 13:06:57 GMT

However, I poked around a bit more and did a regular date-to-UNIX conversion. The UNIX timestamp for the day and time this post was written is 1311151020, which is only 10 digits long. So what if we trimmed the mysterious G+ number down from the right to match the length of a current UNIX timestamp? 1310267970 turns into:

Sun, 10 Jul 2011 03:19:30 GMT
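As a sanity check on the trimming trick, assuming N really is a millisecond timestamp: JavaScript’s Date type counts milliseconds since the UNIX epoch, so the 13-digit value decodes directly, and dividing by 1,000 (the same as trimming the last three digits) gives back the familiar 10-digit seconds value.

```javascript
// Decode the 13-digit n value as a millisecond UNIX timestamp.
const n = 1310267970417;

console.log(new Date(n).toUTCString()); // Sun, 10 Jul 2011 03:19:30 GMT
console.log(Math.floor(n / 1000));      // 1310267970 -- the 10-digit seconds value
```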

Now isn’t that interesting? That’s closer to the time that I posted the article. Here’s where it gets funky. I went to that post and clicked through just now, as I was writing this. The N variable now reads: 1311165558. That’s just seconds ago.

Wed, 20 Jul 2011 12:39:18 GMT

Google+ is assigning a UNIX timestamp with an extra three digits – I’m guessing a sort of microtime – to every outbound click from G+ at the time of the click. Let me state that again: they’re uniquely timestamping every CLICK from G+ in the URL in realtime. Not just when a post was shared, not just when a post was reshared, but Every. Single. Click.

Here’s where we get into tinfoil hat territory. There’s no logical reason to be timestamping clicks for things like spam control or malware control. You can, and they do in other places, just shut down the destination URL or redirect it to a warning page.

So why would G+ be timestamping every outbound click? This is pure speculation, but the only reason I can think of is that you’d want to track the velocity of a link’s popularity. You’d want to know not only how often it was shared or reshared, but how engaged people were with the link, and over what period of time. When I post a link on Google+, it seems that G+ is measuring when clicks occur relative to that content – how popular it is over any given period of time.
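To make the velocity idea concrete, here’s a purely hypothetical sketch (my illustration, not anything Google has confirmed): bucket the millisecond click timestamps by hour and count them, and the resulting series shows how interest in a link rises and falls over time.

```javascript
// Hypothetical sketch: bucket millisecond click timestamps into hourly
// counts to see how a link's popularity changes over time.
function clicksPerHour(timestamps) {
  const buckets = {};
  for (const ms of timestamps) {
    // Round each timestamp down to the start of its hour (3,600,000 ms).
    const hour = Math.floor(ms / 3600000) * 3600000;
    buckets[hour] = (buckets[hour] || 0) + 1;
  }
  return buckets;
}

// Made-up click timestamps: two fall in one hour, one in the next.
const counts = clicksPerHour([1310267970417, 1310268100000, 1310271700000]);
console.log(counts);
```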

We’ve known for a while, and read in many places, that Google is using social signals to influence search. What we have here may be the next iteration of that. Twitter’s data feed with Google came to an end, but Google has beefed up its social base with G+, and if they’re timestamping every single click, that data can be used to assess the validity and virality of content in a tight, compact fashion that any data analysis tool can process. Further, by putting the timestamp data right in the URL, they may be making it easy for other Google properties like Google Analytics to process G+ data with a minimum of overhead.

How easy are they making it? Using my existing Google+ data, this is my Google+ analytics blog post mapped in Excel using the timestamps from G+:

Notice that with this explicit timestamping, I can measure exactly when the clicks to the article really started to take off, and then when they plateaued again. Google is paying attention to this data, so it’s probably a good idea for you to pay attention to it as well.

If you don’t know how to collect this data… well, stay tuned. Tomorrow I’ll show you.

  • Joe Robb

    Man, you’re smart.

    Thanks so much for sharing this – you’ve sent me off into an analytics rabbit hole … Again.

  • Only you, CP, would take the time to not only *think* about this, but to also share it. Thanks for opening my eyes (again) and making me think about something I normally would not.

  • I’ll say it again — Chris, you never fail to amaze me with how you tackle arcane, high-level concepts and just annihilate them. At this point I’m expecting you to invent warp drive or the flux capacitor.

  • Anonymous

    Really all I can say is unbelievable. First impression of the string wasn’t even in the neighborhood of a UNIX timestamp, but more of a random integer. Tim is right, you definitely never fail to amaze us with your ability to break it down so easily. Great work.

    • What tipped it off was when I did a sort of the data file and saw that the numbers were ascending linearly. There was a definite order to them.

      • Anonymous

        Compiling that set of data would definitely trigger some curiosity. This is extremely useful and (I will don the tinfoil hat now), imagine what Google could do for real-time search with this data. Your SERP could be constantly fluctuating depending upon these metrics, especially when incorporating social search and relevancy. Instead of returning blog posts from 2009 with no relevance, it could deliver more ‘recent’ articles based on search history.

  • Anonymous

    Wow, great detective work.  

    There are several good reasons that Google would want to measure popularity velocity (as you’ve termed it), and I have no doubt they’ll incorporate it into search rankings soon-ish.

  • Amazing stuff Chris. G+ is a pain in the ass to adopt. I wonder if it will overcome the inertia of Facebook (and the status quo). The analytics that you are showing make it a marketing dream — but only if it is widely adopted. So far, I just respond to all the G+ requests that I get — have to think I am like a lot of people. I’m on G+ but use FB far more often.

  • Anonymous

    Really good read Chris! One question, where exactly are these transparent redirects? As far as I can tell all links within posts & on your profile are direct links.

    • They’re transitory. Watch your browser’s address bar very carefully as you click through.

  • Anonymous

    You totally just blew my mind with this post. Now all of the data you need to track all of the who’s what’s and when’s is right there waiting!

  • Well done, Chris. I was just starting to play around with G+ sharing inner workings, now I see that I might have to poke around with their links too 🙂

  • That is terribly fascinating! Thanks for sharing your insights.

  • Justin

    I’m sorry but this doesn’t really make sense. If you want to log timestamps, you do it server-side. That way you can extract useful patterns (like in your illustration). If you do it client-side you get a lot of noise: every computer has a clock offset, so you can safely assume that they generate their own timestamp when the event arrives. So the question is not “what’s the benefit of timestamping events” but rather “why does Google need your local time”.

    • That’s an interesting point. So what would the benefit be? The only thing I can think of is that G+ is inherently location-aware, so perhaps that’s tied in somehow, too. But I agree – it seems strange to embed it in the URL when server-side is a lot more reliable.

  • There’s nothing odd about the timestamp being in milliseconds; that’s the native format in Javascript. And it’s awfully nice of them to include it in the URL, so it shows up in my web server’s logs.

  • That’s just a standard technique to proxy cache-bust the link. This ensures that some proxy upstream from the user’s browser doesn’t serve up the redirection information (and thus rob Google of the click-counting opportunity). You need something unique, and JavaScript has a handy-dandy current time.

    Typically: url + ‘?n=’ + (new Date()).getTime()

  • Christopher, I wrote a post today about Google+ and The Illusion of Privacy. I mentioned black helicopters, and one of the commenters linked to your post here.

    From a non-tech perspective, this is a fascinating look at just how much data is collected that the average user will never know about.

    I’m glad to have found your blog. Looking forward to digging in further.

  • Anonymous

    OK, that looks like it might just actually work. Wow.

  • Fascinated to know how much data Google is capturing via a URL. Hopefully your assumption regarding velocity tracking turns out to be true. This might be a new reporting dimension which Google is trying to work out.

  • I so wish that when a blogger says “If you don’t know how to collect this data… well, stay tuned. Tomorrow I’ll show you.” they would take the time to come back and add a link to the next post for those who aren’t reading this the day you wrote it. If there is an easy way for ME to find that post I don’t know what it is. Some blogs have advanced navigation where I can at least go forward and back between posts but if you have that here I don’t see it. And what would I search on to find that post without a title or any specifics?