The Google Webmaster Central blog reports that Google has updated its technical guidelines to help webmasters prepare their sites to be properly indexed by the search engine.
The post explains why: until now, Google indexed websites in text mode, without rendering their content the way a web browser does, so CSS, JavaScript and other components did not affect how the search engine understood a given page. That is no longer the case.
Googlebot now sees pages the same way a browser does, so it is important to give the robot access to the directories that hold images, JavaScript, CSS and similar assets. These directories have often been blocked to speed up crawling, but the opposite is now recommended.
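As a minimal sketch, a robots.txt that grants Googlebot access to asset directories could look like the following (the /css/, /js/ and /images/ paths are hypothetical; adjust them to your own site structure):

    # Let Googlebot fetch the assets it needs to render pages
    User-agent: Googlebot
    Allow: /css/
    Allow: /js/
    Allow: /images/

    # Keep blocking only content that genuinely should stay out of the index
    User-agent: *
    Disallow: /private/

If your current robots.txt has Disallow rules for these directories, simply removing them achieves the same effect.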
Google also recommends enabling compression so the robot can do its work as quickly as possible, merging CSS and JS files that are currently served separately to avoid unnecessary downloads, and checking in Webmaster Tools that the Fetch as Google feature, which has just been updated with a new rendering option, processes your pages correctly.
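To illustrate the compression advice, and assuming an Apache server with mod_deflate available (the directives below would go in a .htaccess file or virtual-host configuration; other servers have equivalent settings), gzip compression of text-based assets can be enabled like this:

    # Compress text assets before sending them to browsers and crawlers
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

Combined with merging separate CSS and JS files, this reduces both the size and the number of downloads Googlebot has to perform for each page.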
Tips for keeping a website fast, for both users and search engines, are still available at developers.google.com/web/fundamentals/performance/