[Figure: Find bots and pageviews]

What matters here is not just that search bots visit your site, but that they actually spend their time crawling the right pages. Which pages are they exploring? What is the HTTP status of those pages? Do the different search bots crawl the same pages or different ones?

You can select each of the search user agents you want to check and export the data to compare it with pivot tables in Excel:

[Figure: HTTP status by user agent]
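If you prefer to build that comparison outside Excel, the same pivot can be scripted. Below is a minimal Python sketch, assuming a standard combined-format Apache/Nginx access log saved under the hypothetical name access.log and a hand-picked list of crawler names; adjust both to your own setup.

```python
import re
from collections import Counter

# Hypothetical file name; point this at your own server access log.
LOG_FILE = "access.log"

# A minimal matcher for the Apache/Nginx "combined" log format:
# it captures the request path, the HTTP status code and the user agent.
LINE_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

# The crawlers we want to compare (matched as substrings of the user agent).
BOTS = ["Googlebot", "Bingbot", "YandexBot", "Baiduspider"]

counts = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as f:
    for line in f:
        m = LINE_RE.search(line)
        if not m:
            continue
        agent = m.group("agent")
        bot = next((b for b in BOTS if b in agent), None)
        if bot is None:
            continue  # skip regular visitors and unknown crawlers
        status_class = m.group("status")[0] + "xx"  # 200 -> "2xx", 404 -> "4xx"
        counts[(bot, status_class)] += 1

# Print a simple pivot: one row per bot, one column per status class.
classes = ["2xx", "3xx", "4xx", "5xx"]
print(f"{'bot':<12}" + "".join(f"{c:>8}" for c in classes))
for bot in BOTS:
    row = "".join(f"{counts.get((bot, c), 0):>8}" for c in classes)
    print(f"{bot:<12}{row}")
```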
Based on this initial information, we'll start digging deeper to check not only how these bots differ in their crawling behavior, but also whether they're really crawling where they should be.

3. Which pages are not being served properly? Look for pages with 3xx, 4xx and 5xx HTTP statuses.

By searching for the desired search bot (in this case, Googlebot) and then choosing the "state" filter, you can select the HTTP status values of the pages you want to analyze. I recommend looking for pages with 3xx, 4xx and 5xx status codes, because these are the redirected and error pages you are serving to crawlers.

[Figure: Error and redirected pages for Googlebot]

From there, you can identify the top pages generating the most redirects or errors, export the data and prioritize these pages in your SEO recommendations.

4. What are the main pages crawled by each search bot? Check whether they coincide with the most important pages on your site.

When searching for the bot you want, you can select the "requestURI" filter directly to get a list of the top web documents, whether resources or pages, that the bot requests. You can review them directly in the interface (to verify that they return an HTTP 200 status, for example) or export them to an Excel document, where you can check whether they coincide with your priority pages.

[Figure: Main crawled pages]
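Continuing the same scripted approach, here is a hedged sketch that lists the URLs a chosen bot requests most often, shows the last status code seen for each, and flags whether each URL is on a priority list. The log file name, the PRIORITY_PAGES set and the chosen bot are placeholders for illustration, not part of any particular tool's workflow.

```python
import re
from collections import Counter

# Same hypothetical access log and combined-format regex as above.
LOG_FILE = "access.log"
LINE_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

BOT = "Googlebot"  # the crawler whose top pages we want to inspect

# Hypothetical list of the pages you consider most important.
PRIORITY_PAGES = {"/", "/category/shoes/", "/category/bags/", "/sale/"}

requests = Counter()
statuses = {}
with open(LOG_FILE, encoding="utf-8", errors="replace") as f:
    for line in f:
        m = LINE_RE.search(line)
        if not m or BOT not in m.group("agent"):
            continue
        path = m.group("path")
        requests[path] += 1
        statuses[path] = m.group("status")  # last status seen for that URL

print(f"Top 20 URLs requested by {BOT}:")
for path, hits in requests.most_common(20):
    flag = "PRIORITY" if path in PRIORITY_PAGES else ""
    print(f"{hits:>6}  {statuses[path]}  {path}  {flag}")

# Priority pages the bot never requested at all.
missing = PRIORITY_PAGES - set(requests)
if missing:
    print("\nPriority pages with no crawl requests in this log:")
    for path in sorted(missing):
        print("  " + path)
```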
If your most important pages are not among the most crawled pages (or worse, are not included at all), you can decide on the appropriate actions in your SEO recommendations. You may want to improve the internal linking to these pages (whether from the homepage or from some of the top crawled pages you've identified), and then generate and submit a new XML sitemap.

5. Are search bots crawling pages they shouldn't?

You will also want to identify pages and resources that are not meant to be indexed and therefore should not be crawled. Use the "requestURI" filter again to get a list of the most requested pages for the desired bot, then export the data.
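One way to approximate this check in a script is to test every URL the bot requested against your live robots.txt using Python's standard urllib.robotparser, as in the sketch below. The domain, log file name and bot are again placeholders, and note that a robots.txt check only catches disallowed URLs; pages that rely on a noindex tag still need to be reviewed separately.

```python
import re
from collections import Counter
from urllib.robotparser import RobotFileParser

# Same hypothetical access log and combined-format regex as before.
LOG_FILE = "access.log"
LINE_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

BOT = "Googlebot"
SITE = "https://www.example.com"  # hypothetical domain; use your own

# Fetch the live robots.txt so each crawled URL can be tested against it.
robots = RobotFileParser(SITE + "/robots.txt")
robots.read()

crawled = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as f:
    for line in f:
        m = LINE_RE.search(line)
        if m and BOT in m.group("agent"):
            crawled[m.group("path")] += 1

# URLs the bot requested even though robots.txt disallows them.
print(f"URLs crawled by {BOT} that robots.txt disallows:")
for path, hits in crawled.most_common():
    if not robots.can_fetch(BOT, SITE + path):
        print(f"{hits:>6}  {path}")
```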