
Fetch, Googlebot! Google’s New Way to Submit URLs & Updated Pages

Recently, Google launched a new way for site owners to request that specific web pages be crawled. How is this different from the other ways available to let Google know about your pages, and when should you use this feature vs. the others? Read on for more.

This new method for submitting URLs to Google is limited, so reserve it for cases where it's important that certain pages be crawled right away. Although Google doesn't guarantee that it will index every page it crawls, this new feature does seem to at least escalate that evaluation process.
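For less urgent updates, the routine channels can be automated. As a rough illustration (assuming a standard XML sitemap, with placeholder hostnames and paths), a script could ping Google's sitemap endpoint whenever the sitemap is regenerated; the new submission feature itself is used by hand inside Webmaster Tools and is better saved for the urgent cases described above.

```python
# Sketch: notify Google that a sitemap has been updated via the sitemap
# ping endpoint. The sitemap URL below is a placeholder, not a real site.
import urllib.parse
import urllib.request

def ping_sitemap(sitemap_url):
    """Ask Google to re-fetch the sitemap that lists new or updated URLs."""
    endpoint = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
    with urllib.request.urlopen(endpoint) as response:
        # A 200 response means the ping was received, not that anything
        # was crawled or indexed -- that decision stays with Google.
        return response.status

if __name__ == "__main__":
    print(ping_sitemap("https://www.example.com/sitemap.xml"))
```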

To better understand how this feature works, let’s take a look at how Google crawls the web and the various ways URLs are fed into Google’s crawling and indexing system.

How Google Crawls & Indexes the Web

First, it’s important to know a bit about Google’s crawling and indexing pipeline. Google learns about URLs through all of the ways described below and then adds those URLs to its crawl scheduling system. It dedupes the list, orders the URLs by priority, and crawls them in that order.

The priority is based on many factors, including the overall value of the page (based in part on PageRank), how often the content changes, and how important it is for Google to index that new content quickly (a news home page would fall into this category, for instance).
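To make the scheduling idea concrete, here is a toy sketch of that deduplicate-then-prioritize step. This is not Google's actual system: the scoring function and its weights are invented purely for illustration.

```python
# Illustrative crawl scheduler: dedupe known URLs, score each one, and
# crawl the highest-priority URLs first. Scores and weights are made up.
import heapq

def priority(page):
    # Hypothetical scoring: a PageRank-like value for the page, weighted
    # against how often its content changes (freshness matters more for
    # something like a news home page).
    return page["value"] * 0.6 + page["change_rate"] * 0.4

def schedule(known_urls):
    deduped = {p["url"]: p for p in known_urls}            # one entry per URL
    heap = [(-priority(p), url) for url, p in deduped.items()]
    heapq.heapify(heap)                                     # highest score pops first
    while heap:
        _, url = heapq.heappop(heap)
        yield url                                           # crawl in this order

pages = [
    {"url": "https://news.example.com/", "value": 0.9, "change_rate": 1.0},
    {"url": "https://example.com/about", "value": 0.5, "change_rate": 0.1},
    {"url": "https://news.example.com/", "value": 0.9, "change_rate": 1.0},  # duplicate
]
print(list(schedule(pages)))
```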

Once a page is crawled, Google then goes through another algorithmic process to determine whether to store the page in its index. This means that Google doesn’t crawl every page it knows about, and doesn’t index every page it crawls.
