Did you build your site thinking that Googlebot can't understand your JavaScript? I did, and I was a bit surprised to learn I was wrong...
About a month ago, Starr Horne and I launched a reboot of OfficeSpace.com, which is all about helping people who are looking for office space find it. We've had a lot of fun trying out various techniques and technologies, including trying different approaches to analytics.
I was especially interested in tracking on-page events, in addition to the usual page views, so of course I implemented event tracking with Google Analytics. I also wanted to track those events internally, so alongside the call to GA I made an Ajax POST request to a Rails metal controller that logs the events in Redis.

For example, when the map view loads, pins are dropped on the map for buildings that have available office space, and a click trigger is set on the markers to pop up a dialog with more info about each building. When the click event fires, JSON is fetched, the dialog is displayed, and a POST is sent to the stats-collecting controller to track an event for previewing a building.

Since the request was a POST and was triggered by JavaScript, I assumed that only humans would trigger it. As I watched the logs, though, and saw Googlebot posting to the stats tracker with data from the pages it was loading, I realized how wrong I was.
So now I'm a little wiser and a little more rigorous about keeping bots from creating data they shouldn't be creating. I'm also more curious about just how much dynamic content Googlebot can index. :)
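For what it's worth, the kind of guard I mean can be as simple as checking the User-Agent header before writing to the stats store. This is a hedged sketch, not the exact check I use; it only filters well-behaved crawlers like Googlebot that identify themselves, since User-Agent strings can be spoofed.

```ruby
# Common substrings that well-behaved crawlers put in their User-Agent.
BOT_UA_PATTERN = /bot|crawler|spider|slurp/i

# Returns true if the request looks like it came from a crawler.
# A missing or empty User-Agent is treated as suspect too.
def bot_request?(user_agent)
  return true if user_agent.nil? || user_agent.empty?
  !!(user_agent =~ BOT_UA_PATTERN)
end
```

In a Rails controller you'd consult something like `request.user_agent` and skip the stats write when this returns true, so Googlebot's POSTs never pollute the event counts.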