After you have written a great blog post and hit publish, you want your text to reach the search engines as soon as possible. But the search bots don't always crawl your site as often as you would wish. Especially if your blog updates sporadically or has a small readership, your fine blog post may end up ignored by the search engines for days. The good news is that you can nudge the search engines to crawl your blog posts faster.
I need to start with a disclaimer: using these methods, I always get my own blog posts indexed quickly. However, none of them guarantees that your posts will be indexed by the search engines, let alone indexed immediately.
Google and Bing both offer free tools for monitoring and maintaining your site's search performance, and I highly recommend using them if you want your blog posts to be found through searches. Both were previously known as Webmaster Tools; Google has since renamed its tool Search Console. Begin by signing up for Google's Search Console and Bing's Webmaster Tools.
Add your site to Google Search Console and Bing Webmaster Tools by following the on-screen instructions. Both require you to verify that you own the website, so that just anyone cannot take control of your site in Search Console or Bing's Webmaster Tools. Verification is done either by uploading a verification file to your website's root or by adding a certain meta tag to your site. If uploading a file to your site root or editing meta data sounds scary to you, or your blogging platform prevents it, most content management systems and blogging platforms offer easy ways to handle it.
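To give you an idea of what the meta tag method looks like, here are the kinds of tags the two services ask you to place inside your page's head element. The content values below are made-up placeholders; each service generates a unique token for you.

    <head>
      <!-- Google Search Console verification tag (placeholder token) -->
      <meta name="google-site-verification" content="your-unique-google-token" />
      <!-- Bing Webmaster Tools verification tag (placeholder token) -->
      <meta name="msvalidate.01" content="your-unique-bing-token" />
    </head>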
WordPress.com provides inbuilt Site Verification Services. For self-hosted WordPress there are multiple plugins for this: WP Site Verification tool is a standalone plugin, while both Jetpack and Yoast SEO include verification tools. For Drupal there's a simple Site verification module, which also allows using verification files. If you are on Squarespace, you can use Settings > Advanced > Code Injection to add the meta tags to your site. Most of these tools can also be used to verify your site ownership with Pinterest, giving you access to the analytics it provides.
After you have verified your ownership, you can access the tools and search performance analytics. These include the search phrases used to find your site and its various pages, as well as the keywords that appear most often in your pages. You can use this data to optimise your blog posts, and your whole site, for the search engines. If you want to see your Google Search Console data in Google Analytics reports, you can associate the two properties.
Using an XML sitemap can help the search bots crawl your content faster. You can submit your sitemap to Google Search Console and Bing Webmaster Tools once you have set them up and verified your ownership. WordPress.com and Squarespace have XML sitemaps enabled automatically; for self-hosted websites you need to create one. For a WP site you can use the aforementioned Jetpack and Yoast SEO plugins, which only require enabling the sitemaps, or a standalone plugin such as Google XML Sitemaps. For Drupal sites there's a very good module called XML sitemap.
Usually a sitemap can be accessed at yoursite.com/sitemap.xml. For example, my sitemap is at http://merviemilia.com/sitemap.xml. Some websites have multiple sitemaps, which can be a good idea if the site has a very large number of separate pages.
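If you're curious about what lives at that address, a minimal sitemap is just an XML file listing your page URLs, optionally with the date each one was last changed. The addresses below are placeholders; your sitemap plugin or platform generates the real entries for you.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One url entry per page; lastmod tells the bots when it changed -->
      <url>
        <loc>http://yoursite.com/a-great-blog-post/</loc>
        <lastmod>2017-05-01</lastmod>
      </url>
      <url>
        <loc>http://yoursite.com/another-post/</loc>
        <lastmod>2017-04-20</lastmod>
      </url>
    </urlset>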
After you have submitted your sitemap to Google Search Console and Bing Webmaster Tools, the search engines are notified when your site updates. Unfortunately it can sometimes take a while for the sitemap to update, and the search engines may be lazy about crawling it. I still recommend creating a sitemap and submitting it to both, as it helps the search engines crawl your website content.
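In addition to submitting the sitemap through the tools, you can advertise it in your site's robots.txt file, which the search bots check on their own. A single line in the file at yoursite.com/robots.txt is enough; the address below is a placeholder for your own sitemap's location.

    # robots.txt: point the crawlers to the sitemap
    Sitemap: http://yoursite.com/sitemap.xml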
In my experience the faster way to get your blog posts indexed is to submit them manually to Google Search Console and Bing Webmaster Tools.
In Google Search Console, go to your website's dashboard and, in the side navigation, choose Crawl > Fetch as Google, then request your blog post's address to be fetched. Fetching also tells you whether Google's search bot can successfully crawl your page or whether there are issues with it. The same tool can be used to check how your blog post looks on desktop and mobile devices by requesting Fetch and Render, which is slower than a plain fetch.
As soon as the URL has been fetched successfully, click the Request indexing button in the fetch attempt's row. You can ask Google to crawl only the blog post, or the blog post and all the links it features, which includes the links in the navigation, sidebar and footer in addition to the links within the post itself. Do note that both request types have monthly quotas: 500 requests for crawling a single URL and 10 requests for crawling a URL and its direct links. Google shows how many crawl requests you have left.
In Bing Webmaster Tools, go to your website's dashboard and, in the side navigation, choose Configure My Site > Submit URLs. Here you can submit multiple URLs at once, limited to 10 submissions per day and 50 per month, and nothing more is required. You can also resubmit URLs you have submitted before by ticking the checkbox next to a submitted URL and clicking the Resubmit button. Bing also shows how many submissions you have left of your daily and monthly quotas.
So, why would you want to submit your blog posts for indexing? For starters, it helps your blog posts be found through the search engines. While the search engines may crawl your site at some point either way, submitting can make it happen faster; otherwise it can take days, if not weeks, depending on how popular your site is and how often you usually update it. Over half of my site's traffic currently comes through search engines, so I guess something is working.
Another reason is to make sure the search engines know your site published the content first, in case someone steals your blog posts, either by hand or by automatically scraping your feed. You may have been advised to prevent content theft with right click or copy blocking, but those don't really work: they are easily bypassed by disabling JavaScript, and they only annoy those of us who aren't trying to steal your content but merely want to quote you on our blogs or on social media. Another common piece of advice is to offer only a trimmed version of your blog posts in your feed, but that means I'm not going to subscribe to your blog with my feed reader and will thus miss your posts. It's better to make sure the search engines know where the content was published first, so they can penalise the site that stole it.
You can use Google Search Console and Bing Webmaster Tools to improve your blog posts' search visibility. They also help you see whether there are issues with crawling and indexing your website, and they give you information about how people have found your site, including the search keywords and keyphrases they used.