When Googlebot holds off crawling URLs for performance reasons, it is referring only to the base HTML of the page, not to resources such as JS, CSS, and images. Googlebot has detected that fetching your HTML takes longer than it would expect for a site of your size.
The most obvious way to improve this is to speed up your server. There are straightforward (but often $$$) ways to make your site faster:
- Upgrading your hosting plan
- Putting your site behind a load balancer
- Upgrading your database
- Getting a content delivery network (CDN)
You can also look into ways to improve the performance without upgrading your hardware:
- Add a caching layer
- Profile and optimize your database queries
- Make sure your server uses compression (such as gzip)
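Compression in particular is cheap to enable and pays off immediately for HTML, which tends to be highly repetitive. As a rough illustration (using a made-up chunk of boilerplate markup, not a real page), you can measure the savings yourself with Python's standard `gzip` module:

```python
import gzip

# Hypothetical sample: repetitive HTML boilerplate, the kind of
# markup that makes up most of a typical page's byte count.
html = ("<div class='nav-item'><a href='/page'>Link</a></div>\n" * 200).encode("utf-8")

compressed = gzip.compress(html)

# Repetitive markup typically compresses to a small fraction of its
# original size, which is exactly the kind of HTML Googlebot fetches.
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes")
```

The exact ratio depends on your markup, but a 10x reduction on boilerplate-heavy HTML is common, which is why virtually every server (Apache, nginx, IIS) ships a gzip module you can switch on.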
Googlebot is particularly sensitive (compared to real users) to the number of bytes in the HTML code. Finding ways to cut bytes from your HTML documents is one of the easiest ways to get Googlebot to crawl more. I've seen the following techniques substantially cut the size of HTML files to increase Googlebot crawling:
- Move resources (CSS, JS, and images) into separate files rather than serving them as part of the HTML document
- Replace absolute URLs in links to your own site with relative URLs. For example, replace `http://example.com/page.html` with `/page.html`
- Have boilerplate such as headers, footers, and navigation rendered into the page with JavaScript rather than duplicated in every HTML document.
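The URL-shortening idea above is easy to automate at publish time. Here is a minimal sketch, assuming your hostname is `example.com` (the `SITE_HOST` constant and `to_relative` helper are illustrative names, not part of any standard tool):

```python
from urllib.parse import urlsplit

SITE_HOST = "example.com"  # assumption: your own site's hostname

def to_relative(url: str) -> str:
    """Rewrite absolute links to this site as root-relative paths.

    External links are returned unchanged, since shortening only
    applies to links pointing at your own host.
    """
    parts = urlsplit(url)
    if parts.netloc != SITE_HOST:
        return url  # external link: leave it absolute
    relative = parts.path or "/"
    if parts.query:
        relative += "?" + parts.query  # keep query strings intact
    return relative

print(to_relative("http://example.com/page.html"))   # -> /page.html
print(to_relative("http://other.com/page.html"))     # unchanged
```

Running a pass like this over your templates saves the full scheme-and-hostname prefix on every internal link, which adds up quickly on link-heavy pages.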
If you are using a content management system (CMS) such as WordPress, you could install a lightweight or performance-optimized theme rather than trying to implement all of this yourself.
For more specifics on improving the performance of your site, see Ideas to improve website loading speed?, but keep in mind that the most important improvements for Googlebot are ones that make the initial HTML download faster.
Site performance is just one of the factors Googlebot uses to determine how much of your site gets crawled, and how often. The other big factor is reputation as measured by PageRank. The more sites that link to your site, the more crawling Googlebot is willing to do and the more often it is willing to come back and re-crawl pages. Reputation is often the biggest limitation on your site's crawl budget. If improving site performance is too hard or too costly for you, link building is your second-best option for getting more of your site crawled by Googlebot.