Making Your AJAX-Driven Site Crawlable
The only reason I’ve ever been against an AJAX-driven site is that the content is not accessible to the bots, which seriously limits your SEO and findability. Google has come up with a solution to make AJAX-driven sites crawlable, and by following their easy guide, that old objection of mine goes away entirely. Below is my interpretation of the guide, but you can check it out in its entirety here.
Supporting the AJAX Crawlability Scheme
You need to let the bots know that you support the AJAX crawlability scheme, and you do this simply by adding an exclamation point (!) immediately after the hash (#) in your URLs. That single character turns the plain hash fragment into the #! (“hashbang”) form, completes your site’s adoption of the scheme, and your site is now considered AJAX crawlable.
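For example, using the same placeholder URL that appears in the next step:

Before: www.example.com/ajax.html#key=value
After: www.example.com/ajax.html#!key=value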
Add _escaped_fragment_ Support to Your Server
You need to provide HTML snapshots of your URLs so the bots can see your content. When a bot encounters the pretty URL that users see, www.example.com/ajax.html#!key=value, it requests the ugly URL www.example.com/ajax.html?_escaped_fragment_=key=value from your server, and your server must answer with an HTML snapshot of that page state. We accomplish this using one of three methods:
- Recreate your content using a headless browser
- Recreate your content by replacing your client-side JavaScript with server-side code
- Create a static version of your page(s) offline
Whichever method you choose, the result needs to be tested using Fetch as Googlebot. A minimal sketch of the server-side piece follows.
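Here is a minimal sketch of that server-side support, written in TypeScript with Node and Express. The framework, the port, and the snapshots/ directory naming are my assumptions for illustration, not part of Google’s guide; this follows the third method above (pre-generated static snapshots).

import express from "express";
import { readFile } from "fs/promises";
import path from "path";

const app = express();

app.use(async (req, res, next) => {
  // Bots rewrite www.example.com/ajax.html#!key=value into
  // www.example.com/ajax.html?_escaped_fragment_=key=value,
  // so the fragment arrives as an ordinary query parameter.
  const fragment = req.query["_escaped_fragment_"];
  if (typeof fragment !== "string") return next(); // regular visitor: serve the ajax page as usual

  // Map the fragment to a pre-rendered snapshot file, e.g.
  // snapshots/key=value.html; an empty value maps to index.html.
  const safeName = fragment === "" ? "index" : fragment.replace(/[^\w=.-]/g, "_");
  try {
    const snapshot = await readFile(path.join(__dirname, "snapshots", `${safeName}.html`), "utf8");
    res.type("html").send(snapshot);
  } catch {
    res.sendStatus(404); // no snapshot generated for this state
  }
});

app.listen(3000);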
Enable Crawlability in Pages Without Hash Fragments
To make pages without hash fragments crawlable, simply add this meta element to the document’s head section. It tells the bots to crawl the ugly version of the URL:
<meta name="fragment" content="!" />
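With that tag in place, a bot that finds www.example.com/ will fetch the snapshot from:

www.example.com/?_escaped_fragment_=

Note the empty value after the equals sign; the server-side support from the previous step should handle this case too (the sketch above maps it to an index snapshot).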
Update Your Sitemap
Update your existing sitemap so that every URL you want indexed is listed there, using the pretty #! form you would like to appear in search results. See, I told y’all this was easy.
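As a sketch, the entry for the example page would look something like this in sitemap.xml (the surrounding <urlset> wrapper and the rest of the file are omitted):

<url>
  <loc>http://www.example.com/ajax.html#!key=value</loc>
</url>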