Zap SEO
We help with Search Engines   

Indexing

Remember books? They had an index at the back: a list of topics with the page numbers where you can find them. When we talk about indexing for search engines it's much the same sort of thing. The search engine builds a list of pages, and it picks results from that list when you type in a search query.

But how do you get on this list? Well, actually all you have to do is wait, and a "robot" (you've guessed it, another type of algorithm) will "crawl" your site, adding pages from your site to the list. However, it will only add pages it likes the look of, and that is where the SEO work comes in.

A good place to go to find out what is going on is Google Webmasters. There you will find a heading called "Crawl", where you can check for problems. There are always problems!

Crawl Errors

Multiple crawl errors are not great for your ranking. It's not the worst thing in the world to have a few, but if you have hundreds then perhaps have a look at fixing them. The most common type of crawl error is a "404": this is when a link on your page points to... nothing! Perhaps the page the link points to (the landing page) has been deleted or moved. To fix these, simply redo the link or delete it. Simple, but time consuming. While you are at it, cut out old code to improve your page speed.
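
If you have hundreds of links to work through, a short script can at least tell you which ones are dead. Below is a minimal sketch in Python (assuming the requests library is installed, and using made-up example URLs) that fetches each link and flags any that come back as a 404.

    import requests

    # Made-up example URLs - swap in the links from your own pages.
    urls = [
        "https://www.example.com/services",
        "https://www.example.com/old-page",
    ]

    for url in urls:
        try:
            response = requests.get(url, timeout=10)
            if response.status_code == 404:
                print("404 - broken link:", url)
            else:
                print(response.status_code, "-", url)
        except requests.RequestException as error:
            print("Could not reach", url, "-", error)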

Find and Replace

One thing not to do is use find and replace. Although this is fine for Word documents and the like, find and replace can have disastrous effects on code. Editing your web pages in bulk this way is a recipe for disaster, because one careless match can change things you never meant to touch.
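
To see how it goes wrong, here is a small Python sketch using a made-up snippet of HTML. The intention is to repoint the old /contact link at /contact-us, but the blanket replace also hits the link that was already correct.

    # Made-up HTML snippet with one old link and one correct link.
    html = '<a href="/contact">Contact</a> and <a href="/contact-us">Get in touch</a>'

    # Blanket find and replace: change every "/contact" to "/contact-us".
    result = html.replace("/contact", "/contact-us")
    print(result)
    # The second link is now "/contact-us-us" - a brand new broken link.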

Robots.txt

If you like, you can leave instructions for your uninvited robot visitors (the ones that index your pages) to tell them what to do. It may be beneficial to your ranking not to have some pages indexed, for example pages describing jobs you have done or PDF versions of web pages. You can ask the Google bots to leave those pages alone, as shown below.
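
Those instructions live in a plain text file called robots.txt in the root of your site. As a rough sketch (the folder names here are only examples), it might look like this:

    # Example robots.txt - the paths below are made-up examples
    User-agent: *
    Disallow: /pdfs/
    Disallow: /old-jobs/

    Sitemap: https://www.example.com/sitemap.xml

The User-agent line says which robots the rules apply to (* means all of them), and each Disallow line names a folder or page you would rather they left alone.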

Site Maps

There are a number of different sitemaps and formats. Whilst Bing and Yahoo accept a simple list of URLs, Google opts for an XML sitemap. These can be created with any number of sitemap generators, although some will fail to list all of your pages.
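
For reference, a very small XML sitemap (in the standard sitemaps.org format, with placeholder URLs and dates) looks something like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2015-01-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/services</loc>
      </url>
    </urlset>

Each url entry lists one page you would like indexed; the lastmod line is optional and tells the robots when the page last changed.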

The point of putting a sitemap file on your server for the robots to find is to list all the pages that you would like indexed, so use a good sitemap tool. Once you have your files, drop them on your web server and verify them through the tool in Google Webmasters.