Modern websites make heavy use of JavaScript to support a better user experience, but what is the impact on SEO?

There are established server-side techniques, such as "hash bang", for making JavaScript content "crawlable", but I wanted to see how much I could get indexed without relying on them. Here are some of my findings:
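For context, the hash-bang scheme works by having the crawler rewrite a `#!` URL into a query-string form that actually reaches the server, which is then expected to respond with a pre-rendered HTML snapshot. A minimal sketch of that URL translation (the mapping is from Google's AJAX crawling scheme; the example URL is made up):

```javascript
// Translate a hash-bang URL into the "_escaped_fragment_" URL that a
// crawler following Google's AJAX crawling scheme would request instead.
function toEscapedFragmentUrl(url) {
  const hashIndex = url.indexOf('#!');
  if (hashIndex === -1) return url; // no hash-bang: nothing to translate

  const base = url.slice(0, hashIndex);
  const fragment = url.slice(hashIndex + 2);
  // Append to an existing query string if one is already present.
  const separator = base.includes('?') ? '&' : '?';
  return base + separator + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

// A hypothetical single-page-app route:
toEscapedFragmentUrl('https://example.com/#!/products/42');
// → 'https://example.com/?_escaped_fragment_=%2Fproducts%2F42'
```

The server-side work is then to recognize the `_escaped_fragment_` parameter and serve static HTML for that route, which is exactly the extra machinery I was hoping to avoid.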

Angular sites

After reading that Google was working on support for Ajax sites, I decided to test it by deploying a pure Angular site in the hope that Google would index it. Unfortunately, the '#'-based URL format used by Angular does not seem to agree with Google: anything to the right of the hash is dropped when making the request. Google offers a webmaster tool where you can enter a URL and see for yourself how the Google crawler sees your page. The behavior I observed in the tool was that Google would only consider the part of the URL to the left of the '#'. As a result, the page is not indexed, since the hash and the URL fragment that follows it are necessary to serve the correct content. It seems that server-side workarounds like hash-bang are still necessary to successfully index Angular websites.
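This behavior is consistent with how URLs work in general: the fragment is a purely client-side concept, and a browser never includes it in the HTTP request it sends. A small sketch using the standard WHATWG `URL` parser (the example URL is hypothetical) illustrates the split the crawler is apparently honoring:

```javascript
// The fragment ('#' and everything after it) never leaves the client:
// it is not part of the HTTP request line the server receives.
const url = new URL('https://example.com/app#/products/42');

// What the server (and, per the observation above, the crawler) sees:
const requestTarget = url.pathname + url.search; // '/app'

// What only client-side JavaScript (e.g. Angular's router) can see:
const clientRoute = url.hash; // '#/products/42'
```

So when the crawler requests only the part left of the '#', every Angular route collapses to the same server-side URL, and there is nothing distinct for Google to index.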

Websites making Ajax requests during loading

Another category of Ajax sites consists of sites that make Ajax requests as part of the initial loading of the requested page. These differ from Angular-style sites in that the URL contains no hash fragment, but the content still relies on JavaScript to load. My finding is that these sites fare much better when it comes to SEO: the Google crawler seems able to execute JavaScript, which the webmaster tool mentioned above also confirms, and I am seeing the content appear in regular Google searches.
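The pattern described above can be sketched as follows: a clean, fragment-free URL whose content is fetched and rendered at load time. The endpoint, element id, and data shape here are all hypothetical, but the structure is the kind of page that indexed successfully in my tests:

```javascript
// Build the markup for a list of products fetched via Ajax.
// Kept as a pure function so the rendering logic is easy to test.
function renderProducts(products) {
  return products
    .map(p => `<li>${p.name}</li>`)
    .join('');
}

// In the browser this would run during the initial page load, e.g.:
//
// document.addEventListener('DOMContentLoaded', async () => {
//   const res = await fetch('/api/products');   // hypothetical endpoint
//   const products = await res.json();
//   document.querySelector('#product-list').innerHTML =
//     renderProducts(products);
// });
```

Because the URL itself already identifies the page, a crawler that executes JavaScript ends up seeing the same filled-in DOM that a user does, which matches the indexing behavior I observed.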