Drupal by itself is very search engine friendly. For example, it is not uncommon for Drupal-based sites to have a Google ranking of 5 (out of 10) or more, where the same content on another CMS would score much lower.
Still, you can make Drupal even more search engine friendly by changing some default parameters. On this page you will find several ways of tweaking your Drupal installation to make it more search engine friendly.
First of all, you might want to enable clean URLs. Then make sure that you get rid of the session ID in the URL by changing the .htaccess file if you are on 4.5.x. On 4.6, session IDs in URLs are disabled by default.
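As a sketch, the relevant .htaccess lines could look like the following. This assumes Apache with mod_php; the settings are standard PHP session options, but compare against the .htaccess that ships with your Drupal version before editing:

```
# Keep PHP from appending ?PHPSESSID=... to URLs, so search
# engines index each page under a single, stable address
php_flag session.use_trans_sid off
php_flag session.use_only_cookies on
```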
Optionally, use URL aliasing for all or some selected nodes. You can use the pathauto module to automatically create aliases for new nodes.
By default Drupal does not ship with a pointer file for the search engines called "robots.txt". By adding this file to the root of your (virtual) webserver, you can guide bots through your site or forbid indexing of parts of it. For an example, see the file for drupal.org itself at http://drupal.org/robots.txt.
If you want a robots.txt file, follow the instructions below. For more details, check http://www.robotstxt.org
Create a file with the content shown below and call this file "robots.txt".
# small robots.txt
# more information about this file can be found at
# lines beginning with the pound ("#") sign are comments and can be deleted.

# in case your drupal site is in a directory
# lower than your docroot (e.g. /drupal)
# please add this before the /-es below

# to stop a polite robot indexing an exampledir:
# User-agent: polite-bot
# Disallow: /exampledir/

# a list of known bots can be found at
# for syntax checking

User-agent: *
Crawl-Delay: 10
Disallow: /aggregator
Disallow: /tracker
Disallow: /comment/reply
Disallow: /node/add
Disallow: /user
Disallow: /files
Disallow: /search
Disallow: /book/print
This file tells indexing robots to stay out of pages that are meant for users rather than robots, such as the search page. It also stops robots from entering the first step of adding a comment.
Many robots obey the "Crawl-delay:" parameter. Since Drupal sites seem to be popular with search engines, and lots of people have more aggressive bots than visitors at their site, it might be wise to slow down the robots by adding a robots.txt line like:
User-Agent: *
Crawl-Delay: 10
(10 is the time in seconds between page requests)
Both "Slurp" (the robot that indexes for Yahoo! and AltaVista) and the Microsoft robots for the MSN sites obey this parameter. Googlebot does not use the "Crawl-delay" parameter yet, but most likely will in a future version of its indexing robot.
Change the file as you wish and save it. Now upload it to your webserver and make sure it is in the root of the (virtual) webserver. If you have installed Drupal in a subdirectory (for example /drupal), then change the URLs in the robots.txt file, but place the file in the root of the webserver, not in the root of your Drupal installation.
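For instance, if Drupal lives under /drupal, the disallow lines would gain that prefix. This is only a sketch; keep whichever paths actually apply to your site:

```
User-agent: *
Crawl-Delay: 10
Disallow: /drupal/aggregator
Disallow: /drupal/tracker
Disallow: /drupal/comment/reply
Disallow: /drupal/node/add
Disallow: /drupal/user
Disallow: /drupal/files
Disallow: /drupal/search
Disallow: /drupal/book/print
```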
Now watch the robots visit your site and, after some time, check your log files (the "referrer log") to see how many visitors came from a search engine.
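A quick way to count search-engine referrals is to grep your webserver's access log. The log path, format, and the two sample lines below are illustrative only; point the command at your server's real combined-format log:

```shell
LOG=access.log   # hypothetical path; use your server's actual access log

# two sample combined-format lines, just for demonstration
cat > "$LOG" <<'EOF'
1.2.3.4 - - [01/Jan/2006:10:00:00 +0000] "GET /node/1 HTTP/1.1" 200 512 "http://www.google.com/search?q=drupal" "Mozilla"
5.6.7.8 - - [01/Jan/2006:10:01:00 +0000] "GET /node/2 HTTP/1.1" 200 512 "-" "Mozilla"
EOF

# count requests whose referrer field mentions a google.* domain
grep -c 'google\.' "$LOG"   # → 1 for the sample log above
```

The same pattern works for other engines: swap the regular expression for `yahoo\.` or `msn\.` as needed.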