Search Engine Traffic Guide

The Most Powerful All-in-one SEO Tool Suite

SERPed is a game-changing SEO suite that aims to unify all the must-have tools needed to rank your website higher, outperform your competition, and grow your business. The suite includes tools that help you quickly discover profitable keywords, perform SEO analysis from a single interface, manage your sites, track rankings on all major search engines across different devices and locations, acquire clients, and produce detailed reports. SERPed integrates data from the world's most trusted sources, such as Google, Moz, Majestic, Bing, Yahoo, YouTube, Amazon, GoDaddy, and WordPress, among many others. It also comes with additional tools such as Link Index, which submits your links to up to three different link indexing platforms, as well as a Google Index Checker, Spintax Checker, Grammar Checker, Content Curator, and Content Restorer. SERPed backs its high-quality tools and services with a world-class customer support system and video tutorials to help you get started swiftly, and its FAQ section covers virtually anything you may encounter while using the software.

The Most Powerful All-in-one SEO Tool Suite Summary


4.7 stars out of 13 votes

Contents: Software
Official Website:
Price: $79.00

Access Now

My The Most Powerful All-in-one SEO Tool Suite Review

Highly Recommended

All of the information that the author discovered has been compiled into a downloadable book, so purchasers of The Most Powerful All-in-one SEO Tool Suite can begin putting the methods it teaches to use as soon as possible.

This ebook does what it says, and you can read all of the claims on the official website. I highly recommend getting this book.

Google Webmaster Tools

If you've got a site that shows up in Google, then you need a Google Webmaster Tools account. Google Webmaster Tools provides you with detailed reports about your pages' visibility on Google, and it's one of the most direct ways that you can communicate with Google about your site. It allows you to upload an XML sitemap, see whether there are any problems with your site, and fix them. It even lets you throttle the Google spider so that it doesn't drag your site down with constant visits. To use the tool, you need to verify your site. Fortunately, there is a great module called the Site Verification module that helps you verify your site with the search engines. It was created and is maintained by Dave Reid. (Thanks, Dave! You'll always be verified in my book.) 10. Go back to Google Webmaster Tools and click on the Verify button. In a few seconds, you should see the success message, as shown in the following screenshot (Google Webmaster Tools settings). Now that your site is...

Removing duplicate search engine results

It is good practice to prevent search engines from indexing duplicate pages. For instance, after moving the front page to the actual home page, there is no need for search engines to index the separate frontpage URL. (From the point of view of a search engine, this URL involves a frontpage directory, even though no such directory exists on your system.)

Submitting Your Sitemap to Search Engines

Creating a sitemap is no magic potion that increases search engine rankings. The ugly truth is that search engines won't even know about your sitemap until you tell them. Fortunately, you have tools on your side. XML Sitemaps comes with the search engine sub-module, and each search engine has a set of tools on its site to help you manage yours. Before you submit a sitemap to a search engine, you must verify that you are the owner of the site and are therefore allowed to provide a sitemap for it. Each search engine has a slightly different method of verification, but they are all roughly the same. In the following exercise, you verify your site with Google in order to submit a sitemap, which you'll do in the exercise directly following this one. In this exercise, you verify ownership of your site within Google Webmaster Tools. 6. On the Site Verifications page, click the Add Verification tab and select Google as the search engine, as shown in Figure 19-6. Click...
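Verification works by proving you control the site's HTML. One method Google supports is a verification meta tag placed on your home page; the token below is a hypothetical placeholder, not a real one:

```html
<!-- Placed inside <head> on the home page. The content value is the
     site-specific token Google Webmaster Tools gives you (placeholder here). -->
<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
```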

Search Engine Friendly Migration

The biggest pain for me, having just migrated from Geeklog to Drupal, is that my site has a good index in Google and other search engines. All those users finding my site on Google are suddenly getting the 404 error page, giving up, and going away, when they could dig a bit deeper and find exactly what they were looking for. So, I harnessed my experience writing HTTP_REFERER logging systems for Geeklog to provide myself with a page of PHP-enabled content that parses the search engine query out of the HTTP_REFERER and runs a search on my Drupal site to try to find what the user was actually looking for. I thought I'd share it with the world, as that's the entire point of open source systems like Drupal. The page echoes a message along the lines of t('Search Engine Detected: It would appear you arrived here on the wings of a search engine, so I will search my local database and show you anything that matches what you were looking for'). This provides a (partial) list of regular expressions...

Search engine friendliness and robots.txt

Drupal by itself is very search engine friendly. For example, it is not uncommon for Drupal-based sites to have a Google ranking of 5 (out of 10) or more, where the same content on another CMS would score much lower. Still, you can make Drupal even more search engine friendly by changing some default parameters. On this page you will find several ways of tweaking your Drupal installation to make it more search engine friendly. By default, Drupal does not ship with a pointer file for the search engines called robots.txt. By adding this file to the root of your (virtual) web server, you can guide bots through your site or forbid them from indexing parts of it. See, for example, the site's own robots.txt file. Many robots obey the Crawl-delay parameter. Since Drupal sites seem to be popular with search engines, and lots of people have more aggressive bots than human visitors at their site, it might be wise to slow the robots down by adding a robots.txt line...
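As a sketch of how a Crawl-delay line is read, Python's standard urllib.robotparser can parse a robots.txt body directly. The file content below is an invented example, not Drupal's default file:

```python
from urllib import robotparser

# Hypothetical robots.txt asking all bots to wait 10 seconds
# between requests and to stay out of /cgi-bin/.
ROBOTS = """\
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

print(rp.crawl_delay("*"))                   # -> 10
print(rp.can_fetch("*", "/cgi-bin/env.pl"))  # -> False
print(rp.can_fetch("*", "/node/1"))          # -> True
```

Note that the standard-library parser reads Crawl-delay, but not every search engine honors it (Google, for one, ignores it in favor of its own crawl-rate setting).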

Improving Your Site's Search Engine Visibility

An important way to get more visitors to your site and help build your community is to have good search engine visibility. WordPress is great for natural search engine visibility, and by that I mean the real search results that appear below and to the left of the paid-for-inclusion ads in Google and other popular search engines. Getting search engine optimization for free, right out of the box, is a great bonus when you're trying to build an online community. Getting good search engine ranking (that is, getting your site higher up in the search results when people look for websites on your subject) is essential to guiding visitors to your site. WordPress helps you get good search engine ranking in a number of ways: by constructing structured, semantic, standards-compliant HTML, by providing multiple views of your content, and by creating search-engine-friendly permalinks. But that's not all there is to improving search engine visibility for your site. There are also a number of things that...

Search Engine Optimization and Website Promotion

One of the most common goals for a website is to appear high up in the big search engines' rankings. As you should know, having a good ranking increases the chances of potential users finding your site among the mass of other sites. So what can be done to rank as highly as possible without actually having to pay anyone? It certainly helps to have everything named meaningfully, not least because search engines do look at file names. Instead of naming a page page19.html, give it something appropriate like expert-opinion. IMPORTANT: Search engines, in particular Google, place a large amount of emphasis on the anchor text used in links. Make sure all your links have meaningful text associated with them. For example, you could rewrite the following sentence. The reason for this is that the word "here" is not particularly meaningful to a search engine, even though humans can easily make the connection. For the sake of your rankings, simply move the link to the key phrase Wildlife community to place more...

Understanding search engine crawlers

Did you ever wonder how all those pages got into the search engines in the first place? There's a magic search engine genie that flies from server to server waving a magic wand... not really, but close. Actually, there is a computer program called a crawler (or sometimes a spider, robot, or 'bot) that lives on the search engine's servers. Its job is to surf the web and save anything it finds. It starts by visiting sites that it already knows about and, after that, follows any links that it finds along the way. At each site that it visits, it grabs all the HTML code from every single page it can find and saves it on its own servers. Later, an indexing server takes that HTML code and examines it, parses it, filters it, analyzes it, and does some other secret stuff (a lot like waving that magic wand). Finally, your site is saved into the search engine's index. Now it's finally ready to be served up as a search result. Total time elapsed? About two minutes. One important thing to note here is that...
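The "follows any links that it finds" step can be sketched in a few lines. This is not how any real search engine is implemented; it's a minimal illustration, using Python's standard html.parser, of how a crawler pulls candidate URLs out of a page it has just downloaded:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, the way a crawler
    discovers new URLs to add to its visit queue."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A toy page standing in for the HTML a crawler just fetched.
page = ('<p>See the <a href="/about">about page</a> or '
        '<a href="http://example.com/blog">the blog</a>.</p>')

extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # -> ['/about', 'http://example.com/blog']
```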

Search Engine Optimization

One of the most common goals for a website is to appear high up in the big search engines' rankings. As you should know, having a good ranking increases the chances of potential users finding your site among the mass of other sites. So what is it that you can do to make your site rank as highly as possible without actually having to pay anyone to do it for you? Submit your site to search engines and online directories. While this is not as important as the first point, it certainly will help to have everything named meaningfully, because search engines do look at file names. Instead of naming a page page19.html, you should name it something like awesome_webpage.html. Don't go overboard on this, because it is not too important. IMPORTANT: Search engines, in particular Google, place a large amount of emphasis on the anchor text used in links. As a result, make sure all your links have meaningful text associated with them. For example, you would rewrite the following sentence: Donate to the Wildlife...

Contributing to Your Site's Search Engine Ranking

The things you can do to help your search engine rankings are all about the content. The best and most important advice I can give you is to write well. By sticking to your subject and writing relatively short or medium-length articles that stay on topic, written in the language the community uses, you will not only keep your readers interested, but you will also gain ranking for the very keywords your readers will use in their searches. Still, you should bear in mind which measures are important to the search engines. Each search engine has its own recipe for what its administrators believe is the right mixture of the keyword and link measures. However, here are a few simple guidelines that will help you with all search engine ranking algorithms. WordPress is designed to help you achieve the goal of following these guidelines, but what it can't help you with is choosing the keywords and writing compelling posts that someone seeing your website in a search engine results list would be...

Caution: Search Engine Penalization

Most search engines use a number of different metrics to determine where a website is positioned in a list of search results. Some search engines, such as Google, have a metric based on in-bound links, where a link from one website to another acts as a vote (Google's is called PageRank, and it factors in other things too). Some websites and businesses use this to their advantage and offer to pay for advertising space on other websites (normally ones that have a high ranking). Neither the buying nor the selling of advertising in this way is something that many search engines like. Google even has an online tool for reporting websites that do this, which results in their rankings being penalized. The nofollow attribute signals to search engines that we don't wish the link to be counted as a vote for the linked website's rankings. So if this were an advertisement, we would not be penalized by search engines for buying or selling advertising space for it on a website. Don't risk...
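As an illustration, a paid or otherwise untrusted link would carry the attribute like this (the URL and anchor text are hypothetical):

```html
<!-- rel="nofollow" asks search engines not to count this link
     as a ranking vote for the destination site. -->
<a href="http://example.com/" rel="nofollow">Our sponsor</a>
```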

When to Use the Indexer

Indexers are generally used when implementing search engines that evaluate large data sets, where you wish to process more than the standard most-words-matched queries. Indexers are also used to extract and organize metadata from files or other sources where text is not the default format. Search relevancy refers to content passing through a (usually complex) rule set to determine its ranking within an index.
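To make the idea concrete, here is a minimal sketch (not any particular product's indexer) of an inverted index with a simple most-words-matched ranking:

```python
from collections import defaultdict

# Toy document set standing in for a large data set.
documents = {
    1: "drupal search engine optimization tips",
    2: "wordpress search visibility basics",
}

# Inverted index: word -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.split():
        index[word].add(doc_id)

def search(query):
    """Rank documents by how many query words they match."""
    scores = defaultdict(int)
    for word in query.split():
        for doc_id in index.get(word, ()):
            scores[doc_id] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("search engine"))  # -> [1, 2]
```

Document 1 matches both query words while document 2 matches only "search", so document 1 ranks first. A real relevancy rule set layers much more on top of this (field weights, freshness, link metrics), but the inverted index is the core structure.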

Editing your robots.txt file

If, for some reason, the robots.txt file is missing, you can easily create one using any plain text editor, such as Notepad or TextEdit. Avoid using a word processor, though, as word processors add extra formatting that will make the file unreadable to the search engines. 5. Most directives in the robots.txt file are grouped under a User-agent line. If you are going to give different instructions to different engines, be sure to place them above the User-agent: * section; some search engines will only read the directives for * if their specific instructions are placed after that section. 6. Add the lines you want. Later in this chapter, you'll learn several changes that will help you with your SEO.
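The per-agent grouping can be checked with Python's standard urllib.robotparser. The file content below is an invented example in which Googlebot gets its own section and every other bot falls through to the catch-all * section:

```python
from urllib import robotparser

# Hypothetical robots.txt: Googlebot has specific rules,
# everything else is governed by the * section.
ROBOTS = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

# Googlebot obeys only its own section, so most pages stay open to it...
print(rp.can_fetch("Googlebot", "/page.html"))      # -> True
# ...while an unnamed bot matches the * section and is shut out.
print(rp.can_fetch("SomeOtherBot", "/page.html"))   # -> False
```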

What This Book Covers

Chapter 12 gives you the knowledge you need to help promote and market your business online by looking at optimizing the store for search engines, advertising your site, and bringing visitors back to your site, as well as some important tips and advice for advertising, promoting, and marketing on the Web.

Understanding payment workflow

Ubercart Order Process

Before we continue with the configuration, we need a high-level understanding of how Ubercart handles payments and order statuses. After your client finds the product he or she wishes to buy, either by searching your online catalog, as we discussed in Chapter 4, Managing Categories, Products, and Attributes, or through a Google search (SEO will be covered in Chapter 10, Optimizing and Promoting Your Store), the customer's cart is updated with each selection. From this point, you have three alternatives, as depicted in the following image. It depends on your merchant plan whether you will accept electronic payments on your site, take the payment process offline via bank transfer or a check, or redirect it to a safe provider location.

Creating a Company Blog

A blog is a series of typically short postings that are displayed in reverse chronological order. Blogs normally allow users to comment on each posting. With a regularly updated blog or blogs, you can ensure that your site always has fresh content, which ensures that both visitors and the search engines keep coming back. In this chapter, we will create a blog where Chef Wanyama can discuss the Good Eatin' restaurant, cooking, the restaurant business, and more. He plans to use the blog to make the site more interactive and to draw search engine traffic.

Using Open Calais to offer More Like This blocks

One of the ways that search engines determine the relevance of a piece of content to a particular topic is by looking at where it links to and which pages link to it. For example, real estate and mortgages are related topics as far as Google is concerned, because a lot of realtors link to mortgage companies and vice versa. Your content is now more connected, and that's a good thing for both your site visitors and the search engines.

Mastering the .htaccess file

There is a server configuration file at the root level of your Drupal 6 site called the .htaccess file. This file is a list of instructions to your web server software, usually Apache. These instructions are very helpful for cleaning up some redirects and otherwise making your site function a bit better for the search engines. In Chapter 1, The Tools You'll Need, we told Google Webmaster Tools that we wanted our site to show up in Google either with or without the www in the URL. The .htaccess file allows you to do the same thing directly on your website. Why are both necessary? In Google's tool, you're only telling Google how you want them to display your URLs; you're not actually changing the URLs on your website. With the .htaccess file, you're actually affecting how the files are served. This will change how your site is displayed in all search engines. 7. Don't forget to tell Google which you prefer using Google Webmaster Tools. See the Google Webmaster Tools section in Chapter 1, The...
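A common way to pick one hostname in .htaccess is a mod_rewrite redirect. This sketch, with example.com as a placeholder domain, 301-redirects the bare domain to the www version; it is illustrative, not copied from Drupal's shipped file:

```apache
# Rewrite requests for example.com to www.example.com with a
# permanent (301) redirect, so engines index only one hostname.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
```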

Indexing Content That Isn't a Node: hook_update_index()

In the case that you need to wrap the search engine around content that isn't made up of Drupal nodes, you can hook right into the indexer and feed it any textual data you need, thus making it searchable within Drupal. Suppose your group supports a legacy application that has been used for entering and viewing technical notes about products for the last several years. For political reasons you cannot yet replace it with a Drupal solution, but you'd love to be able to search those technical notes from within Drupal. No problem. Let's assume the legacy application keeps its data in a database table called technote. We'll create a short module that will send the information in this database to Drupal's indexer using hook_update_index() and present search results using hook_search().

Adding your XML Sitemap to the robots.txt file

Another way that the robots.txt file helps you search engine optimize your Drupal site is by allowing you to specify where your sitemaps are located. While you probably want to submit your sitemap directly to Google, Yahoo!, and MSN, it's a good idea to put a reference to it in the robots.txt file for all of those other search engines. You can do this by carrying out the following steps. If you have an XML sitemap, use it. If not, use the URL list sitemap. However, do not add both an XML sitemap and a URL list sitemap to the robots.txt file: it could confuse the search engines, possibly even causing duplicate content on your site. Also, do not add your visitor-facing sitemap to your robots.txt file.
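The reference itself is a single Sitemap: line, and you can confirm it parses with Python's urllib.robotparser (the site_maps() helper exists in Python 3.8+; the URL below is a placeholder):

```python
from urllib import robotparser

# Hypothetical robots.txt advertising one XML sitemap to all crawlers.
ROBOTS = """\
Sitemap: http://www.example.com/sitemap.xml

User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

print(rp.site_maps())  # -> ['http://www.example.com/sitemap.xml']
```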

Using the Memcache API and Integration module

The Memcache API and Integration module provides an API for integration with the Memcached daemon or service. There are many methods of installing the prerequisites for using this module, specifically the Memcached library, but we're going to focus on how to easily install Memcached and the Drupal 6.x version of the module in a Windows WAMP environment using PHP 5.2.x. In order to use the Drupal module, we'll need to first install the Memcached service on our local development server and then integrate this service with our PHP version. There are many instructions available on the web for installing Memcached on a Linux or Mac OS system. You can do a Google search for tutorials on how to get Memcached installed, or see the note below with links to resources online and on Lullabot's website. Our focus here is to get Memcached up and running as quickly as possible so that you can see examples of how it works in a sandbox environment, so we're going to install using WAMP on...

Using Google's Webmaster Tools to evaluate your robots.txt file

Warning: The robots.txt file is easy to mess up. It's not written for humans, so it's easy for site owners and webmasters to misunderstand exactly how to use it. Take care not to break your SEO campaign simply because a poorly written robots.txt file is excluding your site from Google. Fortunately, Google's Webmaster Tools provides a helpful utility that shows you exactly which pages are being excluded and included by your robots.txt file. Carry out the following steps to evaluate your robots.txt file using Google's Webmaster Tools (Rule Ignored by Googlebot). 6. Further down, Choose User-agents allows you to specify which Googlebot you want to evaluate. Google has several that they use, like Googlebot-Mobile and Googlebot-Image. Let's try an example. We're going to tell Googlebot-Image to leave our site alone. I chose Googlebot-Image as the Googlebot, and here's what it looks like:
User-agent: Googlebot-Image
Disallow: /images/
Disallow: /*.jpg
Disallow: /*.gif...
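You can also sanity-check a rule like this locally before uploading it. Note that Python's standard-library parser does not understand Google's * wildcard extensions, so this sketch uses a blanket Disallow for the image bot instead:

```python
from urllib import robotparser

# Hypothetical robots.txt telling only Google's image crawler to stay out.
ROBOTS = """\
User-agent: Googlebot-Image
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

# The image bot is excluded everywhere...
print(rp.can_fetch("Googlebot-Image", "/photos/cat.jpg"))  # -> False
# ...but the regular Googlebot is unaffected.
print(rp.can_fetch("Googlebot", "/photos/cat.jpg"))        # -> True
```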

The alt and title attributes

The alt attribute specifies alternative text to display if the image, movie, or other media can't be displayed. Suppose someone has images turned off in their browser settings (common on dial-up connections and in text-based browsers), or the images get moved; the alt text would be displayed instead. For search engines, the alt text can be another indicator of what that element of the page is about and, thence, what the entire page or site is about. Unfortunately, many black-hat SEOs have used alt and title text as a way to stuff keywords into their sites. This was a useful tactic... back in 1995. Just use alt and title attributes as you would if you didn't care about SEO. Put keywords in there if it helps your users. Search engines are smart: they'll figure out if you're stuffing your keywords and penalize you in the search results. Many web designers want to use graphics instead of text to represent links or menu options. While this may be helpful to users, text links are better for SEO and are...
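For reference, here is what the two attributes look like on an image tag (the file name and wording are invented for illustration):

```html
<!-- alt: shown when the image can't be displayed, and read by engines.
     title: shown as a tooltip in most browsers. -->
<img src="/images/lighthouse.jpg"
     alt="Lighthouse at Peggy's Cove at sunset"
     title="Peggy's Cove lighthouse" />
```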

Submitting your Google News sitemap to Google News

Log in to Google Webmaster Tools by pointing your browser to the Webmaster Tools site. The XML Sitemap is the ideal choice because it allows you to specify a lot of information about the content of your site. But say, for some reason, that you can't install an XML Sitemap. Maybe there's a conflict with another module that you just have to have. Perhaps your server doesn't have the power to handle the large overhead that an XML sitemap needs for large sites. Or possibly you want to submit a sitemap to a search engine that doesn't support XML yet.

Analyzing Your Site With Google

Few web statistics tracking systems are as user-friendly and versatile as Google Analytics. It enables you to view a wide range of data about traffic to your site, including the top content of the day or month, where your visitors are from, which search engines and sites send the most traffic, and much more.

How to pick the best keywords

By now, you know the goals of your SEO campaign: branding, lead generation, sales transactions, and so on. Now it's time to dig into the data. There is an infinite number of ways to go about doing keyword research. I'm going to take you step by step through one of them. It's not necessarily the right or the best way, but it's a good, solid technique that I've used many times to produce excellent results.

Structure your site hierarchically

There's a reason you learned the outlining format in grade school: it's easier to organize related ideas when they're structured hierarchically. It turns out that it's easier for search engines to figure out your site when it's structured that way as well. So, send a long-overdue thank-you note to your fifth-grade language arts teacher, and let's get organized. Search engines also need to work in an organized manner. It turns out that search engines organize around much broader concepts than just keywords (although keywords are still the most important element). To find related words that will help your ranking, try typing your keyword into Google, in the following manner

Indexing Your Content

Indexing your content could also be called making your site's search engine work. If you perform a search after completing the previous exercise, you won't get any results; this is because your site has not been indexed yet. Indexing occurs automatically when your site runs cron (discussed in Chapter 3, Your First Drupal Website). You can see the status of your site's index, as shown in Figure 9-2, by navigating to Configuration > Search Settings.

Bold, strong, and emphasized text

Many search engines take into account text that is set apart on the page. You can set apart a word or phrase using a couple of methods. Bold and italics will do just that: bold or italicize the text. <strong> and <em> are tags that can be styled to look like anything you'd like using a style sheet in your theme. Typically, strong and emphasized text tend to look like bold and italics. All are good methods for pulling a word out of a block of text and making it stand out.
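For reference, a hypothetical fragment showing the two approaches side by side:

```html
<!-- Presentational tags: simply bold / italicize. -->
<p>This sauce is <b>spicy</b> and <i>fresh</i>.</p>

<!-- Semantic tags: look like bold / italics by default,
     but can be restyled in your theme's style sheet. -->
<p>This sauce is <strong>spicy</strong> and <em>fresh</em>.</p>
```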

Adding the Data Entry Form

Next, we quickly check for cases in which we don't want to display the annotation field. One case is when the teaser parameter is TRUE; if it is, this node is not being displayed by itself but is being displayed in a list, such as in search engine results, and we are not interested in adding anything in that case. Another case is when the user ID of the user object is 0, which means the

robots.txt, .htaccess, and W3C Validation

Much of the SEO that we've accomplished so far is visible to your visitors (for example, titles, headings, body text, and even a sitemap or two). In this chapter, we're going to address some of the more technical aspects of on-page SEO. Over the last ten years, many elements have been added to the HTML specification, and the search engines themselves have developed other elements to help you communicate better with them. Since our ultimate goal is to do well by the search engines and our visitors, it's time to embrace your inner geek and get technical with your SEO. Pocket protectors ready? Let's do this thing. In this chapter, we discuss making edits to two different files that are considered core Drupal. Core means part of the base installation of Drupal, not in the sites directory. While what you'll accomplish in this chapter is not considered hacking core, it does mean that when you upgrade your Drupal site (say, from 6.14 to 6.15), you will need to preserve your robots.txt and...

Strategies to Crack Drupal

This chapter goes example by example through several strategies to crack Drupal. The first is simply to search for a common security mistake in the code and then use some advanced Google search modifiers to find potentially vulnerable sites. Then you take a look at two vulnerabilities that were ''happened upon'' and discuss some things to be aware of as you click around sites and review code to increase the likelihood that you will happen upon these issues as well.

Submitting your XML sitemap to Google

If you have not already done so, you need to verify your website with Google Webmaster Tools. Refer to Chapter 1, The Tools You'll Need, for details. Google Webmaster Central: improve traffic with Google Webmaster Tools and make your site more search engine friendly. Google Webmaster Blog: tips on requesting reconsideration. 8. Log in to Google Webmaster Tools, click your domain, and then click the Sitemaps Overview. If the status is still Pending, then wait a bit longer. When your sitemap has been crawled, it will say OK. You can easily see who has accessed your XML Sitemap by visiting your Watchdog log at admin/reports/dblog. You can see how recently each search engine has visited your sitemap. What about all those other search engines out there? It's easy to let them all know where your XML Sitemap is located by adding it to your robots.txt file. We'll cover that in the robots.txt section in Chapter 7, robots.txt, .htaccess, and W3C Validation.

Search engine optimization

You have many ways to promote your website, but the main traffic source will always be search engines. Search engine optimization helps your site improve its position in the natural search results, thus generating more traffic and attracting visitors who search for your products. The Pathauto module automatically creates path aliases for our nodes, categories, and users. It generates search engine-friendly URLs and improves the ranking of our pages. Browse to the Pathauto project page and, right after you download the module, upload it and unzip it to your site's sites/all/modules folder, then go to Administer > Site building > Modules to enable it. To configure it, go to Home > Administer > Site building > URL aliases. This is a simple but very useful module. In Drupal, especially when you're using clean URLs and the Pathauto module, the same content can be reached using different URLs. For example, node/34, node/34/, index.php?q=node/34, and products/ipod32mb are different URLs that may target...

Contents of This Book

This chapter covers several tools that can be used to create a wiki in Drupal, among other uses. The node revisions system (coupled with the useful Diff module), the Markdown filter for easy HTML entry, the Freelinking module to automatically create and link wiki pages, and the Pathauto module for automatically creating search engine-friendly URLs are all discussed in detail.

Problems with the default Drupal robots.txt file

There are several problems with the default Drupal robots.txt file. If you use Google Webmaster Tools' robots.txt testing utility (detailed instructions on this utility appear later in this chapter) to test each line of the file, you'll find that a lot of paths that look like they're being blocked will actually be crawled. The reason is that Drupal does not require the trailing slash (/) after the path to show you the content. Because of the way robots.txt files are parsed, Googlebot will avoid the page with the slash but crawl the page without the slash. Google what? Googlebot? Google and other search engines use server systems (sometimes called spiders, crawlers, or robots) to go around the Internet and find each website. We sometimes refer to Google's system as the Googlebot to distinguish it from other search engine robots. While Google doesn't report this number anymore, it is estimated that the Googlebot crawls 10 billion websites each week. That is one fast little robot.
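The trailing-slash behavior described above can be reproduced with Python's standard urllib.robotparser; the blocked path below is an invented example, not a line from Drupal's file:

```python
from urllib import robotparser

# Hypothetical robots.txt blocking only the slash-terminated path.
ROBOTS = """\
User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

# The path with the trailing slash is excluded...
print(rp.can_fetch("Googlebot", "/admin/"))  # -> False
# ...but the same path without the slash is still crawlable,
# which is exactly the Drupal problem described above.
print(rp.can_fetch("Googlebot", "/admin"))   # -> True
```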

Validating New User Accounts

Enabling e-mail validation limits an e-mail address to just one associated account, which is good for sanity purposes. It is a must in this day and age of automated registration bots, which create mass accounts on various web services and are ultimately designed to abuse search engines' indexes. Additionally, it keeps the onus on users to keep their account information up to date, as a valid e-mail address is required to recover passwords.

Conducting Special Searches

phpBB's search engine is also equipped with three types of predefined special searches. Any users, guests or registered, are able to find posts that have yet to receive a reply by clicking the View unanswered posts link in the top-right portion of the page, just above the forum and category listing. Registered users can perform two additional searches.

Writing page titles that Google and your visitors will love

There are two competing forces pulling at your page title. First, the search engines use your page title to help determine where your website fits. Second, your customers will see and use your page title to help them determine whether your site has what they're looking for, and to remind them what your site is about when they see it in their bookmark list. A good page title achieves both objectives. Now, that works for search engines, but not so much for customers. Sure, they'll know that you sell mortgages, but they may not remember which company. You always want them to remember who you are. So, you could do this: that's good for your customers; however, it moves the best keywords out of the first position. Search engines assume that the most important words come early in the page title. So, how about this: you confuse the search engine, which thinks that you've got the same page twice
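To make the trade-off concrete, here are three hypothetical title variations for an invented mortgage company (these examples are mine, not the book's):

```html
<!-- Keywords first, no brand: fine for engines, forgettable for people. -->
<title>Mortgages and Home Loans</title>

<!-- Brand first: memorable, but pushes the best keywords out of position one. -->
<title>Acme Lending | Mortgages and Home Loans</title>

<!-- A common compromise: keywords first, brand last. -->
<title>Mortgages and Home Loans | Acme Lending</title>
```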

A brief history of static and dynamic URLs

The method they used involved a question mark (?), equals signs (=), and ampersands (&). It was magnificent, until search engines came along. Search engines couldn't understand these often long, complex strings of data being passed from a browser to a server and back again. URLs often looked like this. As it turns out, the only important piece of such a URL, at least as far as the search engines were concerned, was the first element. Even then, some sites would put the important things at the end of the URL, and there was just no way for Google to know that. So, they ignored everything after the ?. That meant that websites with thousands of products would look to Google like they only had one or two pages. Finally, in 1996, a really clever guy named Ralf Engelschall came up with a URL rewriting patch for Apache called mod_rewrite. It acts as a translator between URLs and databases, so the ugly query strings that confused search engines before now show up clean and friendly. You could now...
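A minimal mod_rewrite sketch of that translation (paths and parameter names are invented for illustration): it maps a clean URL like /products/42 back to the query-string form the application actually understands.

```apache
# Translate /products/42 into the underlying dynamic URL
# /index.php?page=product&id=42, so neither visitors nor
# crawlers ever see the query string.
RewriteEngine on
RewriteRule ^products/([0-9]+)$ /index.php?page=product&id=$1 [L]
```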

Optimizing the robots.txt file

The robots.txt file is a file that sits at the root level of your web site and asks spiders and bots to behave themselves when they're on your site. You can take a look at it by pointing your browser to your site's /robots.txt path. Think of it like an electronic No Trespassing sign that can easily tell the search engines not to crawl a certain directory or page of your site. Using wildcards, you can even tell the engines not to crawl certain file types like .jpg or .pdf. This means none of your JPEG images or PDF files will show up in the search engines. (I'm not recommending that you do that, but you could.) On December 1, 2008, John Mueller, a Google analyst, said that if the Googlebot can't access the robots.txt file (say the server is unreachable or returns a 5xx error result code) then it won't crawl the web site at all. In other words, the robots.txt file must be there if you want the web site to be crawled and indexed by Google. Read his full comment at...
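A minimal robots.txt along the lines described above might look like this (the directory name is invented for illustration, and the * wildcard in file paths is an extension supported by Googlebot and some other major crawlers rather than part of the original robots standard):

```text
# Rules for all well-behaved crawlers
User-agent: *

# Keep crawlers out of a private directory
Disallow: /staging/

# Wildcard rule: ask engines that support it to skip all PDFs
Disallow: /*.pdf$

# Point crawlers at the XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```

Remember that robots.txt is a request, not an access control: well-behaved bots honor it, but nothing forces them to.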

What your keyword goal is

A 2004 survey by iProspect found that two out of three search engine users believed that the highest search results in Google were the top brands for their industry, and there is little reason to believe this perception has changed. That means that just being at the top of Google will gain you a certain level of trust among search engine users. If Big Computers Unlimited can rank in the top three for Gaming PCs, they'll develop a lot of cred among gamers.

Modules used in Chapter

Pathauto automatically creates path aliases for our nodes, categories, and users. It generates search engine-friendly URLs and improves the ranking of our pages. Global Redirect creates page redirects and solves problems related to duplicate content and lower rankings by search engines. Site map creates a page with a site map in a readable form. Visitors to your site can view its structure, and it's also accessible to search engines.

Overloading a Permission

Weaknesses with overloaded permissions are generally more difficult to exploit. You have to find a site that has the module installed, gain an account on the site, and then probe for the misconfiguration. That said, a site that is totally misconfigured and allows anonymous users to perform the actions can often be found via a search engine. Again, this will be covered more thoroughly in Chapter 9.

Keyword research tools

The most important keyword research tool at your disposal is your own web site. If you already have some traffic, chances are that it's coming from somewhere. With analytics installed, you should be able to find out what visitors are searching on with just a few clicks. If you have Google Analytics installed, then you can easily see this data by logging in to its admin section and then going to Traffic Sources > Search Engines > Google. This information can be invaluable if you cross-reference it with your current positions in the search engines. Say, for example, that you're getting 100 searchers a month on a term for which you're on page 2 of Google. That's a good indicator that people are searching hard to find a company like yours, and that it may be a very good term to focus on with your search engine campaign. If you're getting that much traffic on page 2, imagine if you were in the top three on page 1. Drupal has a built-in search engine, another great tool to see...

Optimize images, video, and other media

Video, Flash, and other media are not far behind. Great graphics, animations, or videos add a lot of value to an otherwise boring web site. The drawback is that search engine spiders cannot read the content of these files. Nothing kills your site's searchability faster than embedding all your best keywords into graphic or Flash files. Users see them just fine but search engines see nothing. It is extremely important to communicate as much information as you can to the search engines about each graphic or media piece that you use on your site. Google Image Search uses the file name and the alt tag to determine what the image is all about. Other services, like YouTube, Viddler, and video search engines, need help to determine what your video is about so that they can show it to people that are interested in watching your masterpiece. Make sure that you can take advantage of these powerful streams of traffic by adhering to the following guidelines. Did...

Project autocompletion and search

One area where autocomplete is not enabled in Drupal is on the search pages. As always, there are a variety of reasons for this, the most important being performance. Having lots of autocompletion scripts, on various clients, all running numerous searches would significantly increase the load on the search engine.

If You Don't Want to Outsource

If you are running an intranet, have privacy concerns, or simply don't want a third party crawling through your content, there are other options. The Apache Solr and Lucene open source search engines are extremely powerful, highly flexible, and freely available. Better yet, they have been integrated with Drupal. Be warned that although these search engines are incredibly powerful, they are also very complex. Entire books are dedicated to their installation and configuration. If this is your first introduction to Drupal or a content management system (CMS), you should leave this until a bit later in your Drupal journey.

Searching and Indexing Content

Both MySQL and PostgreSQL have built-in full-text search capabilities. While it's very easy to use these database-specific solutions to build a search engine, you sacrifice control over the mechanics and lose the ability to fine-tune the system according to the behavior of your application. What the database sees as a high-ranking word might actually be considered a noise word by the application, if it had a say. The Drupal community decided to build a custom search engine in order to implement Drupal-specific indexing and page-ranking algorithms. The result is a search engine that walks, talks, and quacks like the rest of the Drupal framework, with a standardized configuration and user interface no matter which database back-end is used.

Alias Your Way to a Better Search Ranking

It is a generally assumed best practice that having a well-organized and aliased website will better your search engine rank. The voodoo magic that is SEO changes rapidly, so whether or not this is actually true is another matter. Regardless, a well-aliased site simply looks better. Here are a few tips to help keep your site looking good and give it a leg up on the search engine front. One alias per node: multiple aliases for the same node may lead search engines to believe that you are a spam site with little valuable content. The Global Redirect module can help to ensure that this doesn't happen to you. Organization: a well thought out and organized website is easier to navigate and helps users, and search engines, understand your content. For example, place news at /news, blogs at /blogs, and your store at /store.

Tip: The TinyMCE module (http://drupal.org/project/tinymce) allows WYSIWYG HTML editing and integrates nicely with the Image

The final field on the HTML filter's configuration page is Spam Link Deterrent. In early 2005, Google announced that it would no longer award any page rank credit to sites based on links with the rel="nofollow" attribute in them. This was done in response to the increasing phenomenon of spammers posting comments on blogs with links to their own sites just to increase their page ranking with Google and other search engines. Drupal quickly responded, and by checking the Spam Link Deterrent option, you ensure that any links posted by your site's users will have the rel="nofollow" attribute, and Google will not follow them when spidering. Let's hope that the incentive for comment spam will dwindle as spammers realize that they are wasting their time.
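A nofollow link as described above looks like this in the rendered markup (the URL and link text are invented for illustration):

```html
<!-- A user-posted link after the Spam Link Deterrent filter runs:
     rel="nofollow" tells crawlers not to pass ranking credit
     through this link -->
<a href="http://example.com/spammy-page" rel="nofollow">cheap widgets</a>
```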

phpBB's Security Features

Visual confirmation: phpBB's later releases come bundled with a visual confirmation system, which serves as a deterrent against automated registration bots that use your member list as ad space for shady web sites interested in increasing their search engine placement. Humans should be the only ones using your forums, and visual confirmation helps ensure that is the case.

Banning Abusive Users

One of the most useful features of the Statistics module is that it allows you to identify visitors who are abusing your site. Usually, these are not human visitors, but rather search engine crawlers that are malfunctioning or machines automatically accessing your site in an abusive manner. Once you identify a user, usually represented by an IP address, that is abusing your site, you can ban access from that particular abuser. This is a fantastic tool if your site is buckling under an artificially high load that is being generated by attackers or corrupt automated programs generating excessive requests.

Search Performance Statistics and Reporting

Enabling and configuring your search engine

Thus far, you have added user accounts, created and managed your content, and explored many of your site's configuration settings. In this chapter, you explore two of the most important aspects of creating a robust website. The first is enabling and configuring your search engine to provide a way for your visitors to dig through your site's content. The second is performance, or perhaps better said, how to make your site fast. Along the way, you explore Drupal's caching options and other site optimization techniques.

Search and Metadata Clean URLs

As you can see, the first URL is easier to tell your friends about or to modify to select different shoes. Clean URLs are commonly a part of search engine optimization (SEO), helping search engines more easily find your data, which may lead to a higher search engine ranking. Regardless of whether it helps your search ranking, it simply looks better.

The Importance of URLs

The last URL is self-describing, making it easier for your site visitors and, more importantly, for search engines to understand what the content of the page will contain. As you can probably guess, this last clean, self-describing URL is what you want. To make your life easier, the Pathauto module can automatically create these URL aliases for you based on information like title, taxonomy, and content type. The only thing you need to do is download, install, and enable the module, although you should take a few minutes to configure or verify the default settings.

Providing Semantic Standards Compliant Content

Out of the box, WordPress gives you a semantic, standards-compliant web site. The search engines love this because it makes their job easier. When search engines send their spiders and crawlers out into the Internet, they often encounter barriers to navigating a site: JavaScript links, Flash content, and so on. WordPress blogs don't have these barriers to navigation. Furthermore, when the search engines analyze the content of the pages they have gathered to identify the keywords and the keyword density, and determine the ranking of your pages for those keywords, WordPress makes it easy for them. WordPress uses the name of your blog in the title tags on each page. For category pages, the title also contains the category name, and for individual post pages, the post title is also in there. This common-sense approach to structuring a web site achieves search engine optimization by publishing unique page titles, and by giving meaning and importance to keywords.

Visitor-facing sitemaps

XML sitemaps are great for search engines, but as you can see, they're not user-friendly at all. Some of your site visitors will want to see all of the pages or sections available to them on your web site. That's where a visitor-facing sitemap comes in handy. Fortunately, there is a Drupal module that will do that for you automatically. It's called the Site map module. Not only does it show you a nice overview of your site, but it can show the RSS feeds too. Everybody raise a glass to Nic Ivy and Fredrik Jonsson, respectively the original author and current maintainer of this module. Cheers, gentlemen!

Sitemap and webmaster tools

Google has released some webmaster tools, including one that allows us to create a list of the pages within our site, rank them in order of importance, and specify the frequency with which those sections will be updated. This is then saved as an XML file and stored on the web site. Google can then read it to see which content it should check regularly, and which content it should check less often. This helps keep the web site more relevant to search engines.

Creating Aliases to Drupal Paths

Tip: The Pathauto module (http node 17345) automatically generates path aliases for various kinds of content (nodes, categories, and users) when no explicit alias is provided by the user. The patterns used for generating the aliases are fully configurable, making the Pathauto module an indispensable tool for search engine optimization.

Some Forum Administration Lingo

Bot: Bots are a relatively new phenomenon in the fight for message board security; the typical bot registers new accounts on message boards automatically in order to boost its operators' search engine rankings. Some bots are sophisticated enough to spam certain types of message boards. If you enable the proper validation in phpBB, you won't have to deal with these. (I'll cover defense against bots at length in Chapter 10.)

Starting to Blog and Building Your Community

In this chapter, I'll take you through some simple steps to enhance your blog and build your community. First, you'll look in a little more depth at posting to your blog, using both the standard and advanced editing options. I'll show you that you don't need to have any great HTML skills to make rich content for your site. Then you'll see how to manage categories, manage comments, add multiple authors for your site, and create blog pages. During the course of this chapter, you'll install and use two plug-ins, giving you an idea of what WordPress plug-ins can help you do on your site. Finally, I'll give you some tips on improving the search engine visibility of your site, to attract more visitors.

Optimizing URLs with the Path module

Sure, search engines can read the URL but that's just the first step to making your web site addresses work for you. Search engines look at the URL for keywords just like they look at the Page Title or the body content. That means that a site with keywords in the URL path will do better than a site without them. Thankfully, Drupal core includes the Path module which lets you write your own paths. The Path module allows you to manually create search engine friendly URLs based on your content. This allows you to get addresses that look like the following URLs

The Tools Youll Need

Congratulations! You're about to embark on a fun and interesting journey into the world of online marketing. Whether you're trying to sell more products, generate leads, or get more pageviews on your sponsors' ads, Search Engine Optimization (SEO) will take you where you want to go. And, you're using Drupal 6! You've picked a great platform for building your web site. It's widely held that Drupal is one of the best choices if you want to rank well in the search engines. I personally believe that it's hands-down the best possible platform for SEO. I've seen clients triple their traffic within a few weeks of switching from a lesser platform. Believe it: Drupal is the best. But, you already knew that, didn't you? In this chapter, we're going to dive right in and cover some of the top tips for Drupal SEO, some great paid tools to help you with your SEO, and the Drupal SEO group on


Now this is a truly interesting part of your site's design, and the art of writing for the Web is a lot more subtle than just saying what you mean. The reason for this is that you are no longer writing simply for human consumption, but also for consumption by machines. Since machines can only follow a certain number of rules when interpreting a page, the concessions on the language used must be made by the writers (if they want their sites to feature highly on search engines).

Website Activities

Backup, Web Protect, Change Password, Custom Error Pages, Redirects, Mime Types, Apache Handlers, Frontpage Extensions, Search Engine Submit, HotLink Protection. Obviously, there are a lot of toys to play with here, and it is recommended that you spend some time finding out what is available for you to use and how to use it. Knowing what you have available is very important because it means you are better able to plan how you work. For example, the demo site's hosts offer an automated Search Engine Submit facility that allows the new website to be submitted for indexing to all the major search engines, much better than simply waiting around for the site to begin appearing on them. Since it is possible you have an entirely different set of options available on your hosted site, we won't discuss this any further here, but there are still a couple of...

Configuring Modules

Obviously, the nature of the setup for one module can differ wildly from the next. This is because modules can provide nearly any type of functionality imaginable, ranging from a simple poll to a search engine, or whatever. Accordingly, there are a host of different settings associated with each one.

Performing a Search

PhpBB also provides several options for displaying your results. Returning results as posts, as illustrated in Figure 9-15, displays a preview of posts returned by the search engine, with your search terms highlighted. Returning results as topics, as shown in Figure 9-16, displays the results in topic form. In every case, when you click a result, phpBB takes you to the topic, and the keywords you searched on are highlighted in the page, making it fast and easy to find the post you're seeking.

Interface Components

All Web sites have some kind of identifying mark that tells you which Web site you are visiting. It might consist of an image-based logo, a line of plain text, or a combination of the two. Generally this information appears at the top of your Web site. You may also want to include a value proposition or slogan as part of your site name. Visitors arriving from a search engine will be able to use this statement to quickly identify if they have arrived at a page that is useful to them.

The Module

If you don't find what you need with a search, try asking others in the drupal-support IRC channel or the Post Installation forum, or perform a custom Google search restricted to the project site, as demonstrated in Figure 17-1.

Google Account

Google is the undisputed leader in search. One way that they stay on top is by providing tools to help web site owners manage their sites. Among other things, they've created Google Analytics, Google Webmaster Tools, and Google Website Optimizer, all three essential to a good SEO campaign. Oh, and they're free. To access all this SEO goodness, you'll need to set up a Google Account.


When Drupal serves a page, the last task completed is to write the session to the sessions table (see sess_write() in includes). This is done only if the browser has presented a valid cookie, to avoid bloating the sessions table with sessions for web crawlers.


A module is a community-created plugin that enhances Drupal's core functionality. From XML sitemaps to better page titles, modules are crucial to the search engine optimization of any Drupal site. Installing modules is easy, and once you know how to install one, you probably know how to install them all. Pathauto: automatically creates search engine friendly URLs based on the title of your content. XML Sitemap: creates a compliant, search engine readable, dynamic sitemap. Site verification: assists with search engine site ownership verification. Google Website Optimizer: integrates your site with Google's A/B and multivariate testing tool.

Global Redirect

The module also redirects any specific node ID page to its alias if an alias exists. This is important, as your site will get requests from visitors for duplicate nodes or pages. Some people will visit your node/12 page and others may visit it by its alias name. This module makes sure your node can only be loaded through its corresponding alias. This helps with performance and also makes sure you're not getting slammed by search engine bots. We already installed Global Redirect in the last chapter. Let's go to its settings page and see what we can configure. Go to Site configuration > Global Redirect. This will launch the configuration page.

Table of Contents

Google Webmaster Tools
Google Webmaster Tools settings
Understanding search engine crawlers
Make Drupal URLs clean and search engine optimized
Search engine optimizing content
Write for your audience, not the search engines
A/B testing with Google Website Optimizer
Setting up a Google Website Optimizer account
Integrating Google Website Optimizer with Drupal


To prevent this situation, you need something called a 301 redirect, or Permanent Redirect. It tells visitors (including the search engines) that the content has permanently moved to a new location. For search engines, a permanent redirect means a few different things, as follows: they'll show the new URL in the search listings instead of the old one. Default redirect status: set to 301 Moved Permanently. This tells search engines that the content is gone and not coming back. There are other options, but this is the best one most of the time. Don't stop here! Be sure to complete steps 6-8 to finish creating your 301 redirect, or you will have duplicate content on your site and risk penalties from the search engines. How long you keep redirects depends on a couple of things. For the search engines' index, I would only keep them for a couple of months. However, if someone has bookmarked that page or put a link to it on their web site, maybe you should keep it longer. I would keep them as...
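Outside of Drupal, a 301 redirect can also be issued directly by the web server. As a sketch (the paths and domain are invented for illustration), a one-line Apache rule looks like this:

```apacheconf
# Permanently redirect the old page to its new address.
# Browsers and search engine crawlers requesting /old-page
# receive a 301 status code plus the new location, so the
# engines update their index to the new URL.
Redirect 301 /old-page http://www.example.com/new-page
```

This is the same "change of address form" the text describes, just configured at the server level instead of inside Drupal.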

Preflight Checklist

Using a basic checklist for site maintenance, users, Search Engine Optimization (SEO), performance, and disaster recovery to cover all your bases before going live In this chapter, you will explore the best ways of putting your website before a potential audience of millions. With SEO, you can increase your site's visibility by optimizing your site's content and correctly submitting it to search engines. The chapter introduces you to the SEO Checklist module for Drupal, which helps you properly implement SEO techniques. This chapter also covers the XML Sitemap module, which catalogs your site to help search engines like Google, Yahoo , and Bing crawl your site more efficiently and retrieve more accurate information.


This can be a great setting if you are transitioning a website to a new URL, but be aware that Drupal does not send a 301 redirect with this setting. A 301 redirect is a web server directive that tells web browsers and search engines that the content has permanently moved. In short, it is a change-of-address form for the Web. 301 redirects are set up within your web server configuration and not within Drupal.

Good body content

Base pages are the anchor content on any site and are always visible. They're considered the main pages, and they don't change or move around much. These are the pages that you'll build links to and that will probably show up in the search engines for the most difficult terms. The following are some good base pages that you should consider creating. They're easy to create and are a positive addition to any site. Supplemental pages are the pages on your site that support your core pages. They're relevant, but wouldn't be considered the primary site message. They'll show up in the search engines for long-tail keywords. Long-tail keywords are keywords that are 3 or more words long and tend to be searched less often. But, they're still valuable to your site. They may be pages or stories, blog posts, or even user-generated content like forums. What information would be interesting to your visitors that supplements your core content? Be creative. The goal is to be a trusted source that...

Meta tags

Meta tags are pieces of text in the header of your web site that tell search engine spiders about your site. They are not visible to your site visitors, which makes them handy places to communicate details about your site that visitors just don't care about. The problem is that in the stone age of search engines (1997), many people abused the meta tags by stuffing them full of keywords. This was invisible to their visitors, but the search engines gave a lot of credence to the meta tags, so it was a viable way to get to the top of the search engines. Nowadays, most search engines ignore meta tags as a ranking mechanism but do take them into account for other things, so they're important to maintain on your sites. The description tag holds a two- or three-sentence description of the content of the page, used by many search engines in the search results as the text under your link. A geo tag attributes a specific geographical location to your site. It helps search engines, like Google, who display different results based on...
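The two kinds of tags described above can be sketched like this (the description text, region code, and place name are invented for illustration):

```html
<head>
  <!-- Description: often shown as the snippet under your link
       in the search results -->
  <meta name="description"
        content="Acme Home Loans offers fixed-rate mortgages and
                 refinancing advice for first-time buyers.">

  <!-- Geo tags: associate the site with a geographical location -->
  <meta name="geo.region" content="US-TX">
  <meta name="geo.placename" content="Austin">
</head>
```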

XML Sitemap

Search engines such as Google, Yahoo!, and Bing crawl sites all the time, looking for snippets of text that will tell them what a site is about. An engine typically finds a hyperlink that links to a piece of content that in turn links to more content, thus recursively searching a site or sites. This method doesn't allow the search engine to find everything, and it doesn't tell the engine what is and what is not important on your site. For example, if your contact page contains numerous links to other content or other sites, is that considered important? Also, how often should the search engine come back and look for new information? Should it check once a day, once a week, or every few months? The XML Sitemap module helps search engines find your best and most relevant content by creating categorized and prioritized lists of your site's information. It might help to think of this list as a directory of your site. You then provide this list to the search engines to help them understand...
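A sitemap entry of the kind the module generates looks roughly like this (the URL, date, and values are invented for illustration; the element names come from the shared sitemap specification):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's address -->
    <loc>http://www.example.com/about-us</loc>
    <!-- When the content last changed -->
    <lastmod>2010-03-15</lastmod>
    <!-- How often the engine should come back and check -->
    <changefreq>monthly</changefreq>
    <!-- Relative importance within this site, 0.0 to 1.0 -->
    <priority>0.8</priority>
  </url>
</urlset>
```

The changefreq and priority elements answer exactly the questions raised above: how often to re-crawl, and what matters most on the site.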

W3C markup validation

You should run a comprehensive scan of the site to check for improperly formed code, broken links, and other oversights that could hinder your search engine positioning. Obviously, Google can't reject sites just because they have bad markup (most of the sites out there have at least one thing wrong with them). However, bad HTML can confuse the search engine spiders. They're not as forgiving as a modern browser is to technical issues. By eliminating any problem markup, you can remove this concern from your site.

Spotlight Pathauto

In Chapter 2, you learned about the Drupal path and how to use clean URLs. One reason to use clean URLs is so that they don't look so ugly. (To review, clean URLs remove the ?q= from the URL.) That helps, but still leaves the URLs lacking a bit. Having a URL with node/123 in it doesn't really tell either humans or search engines much about the page itself. Isn't it much better to instead have a URL with something like about-us in it? That is going to be much more memorable for a person, and the addition of pertinent keywords in the URL makes for better search engine optimization. So, even without clean URLs, you can still benefit from good pathnames. The second option, making a new alias and keeping the old one, may sound ideal, because you can then access the content from either path and the problem of link rot is eliminated. But, while addressing the issue of link rot, the disadvantage of this option is that some search engines will penalize you for having many paths that point to the same...

OnPage Optimization

Google and the other search engines look at the content of your site to determine whether you should show up for a particular search. It makes sense. If you don't even mention the keyword, then you probably aren't talking about that topic. It's like picking up a book and skimming the title, chapter titles, headings, text, and appendices to find out what it's about. If you're looking for a book on Drupal and you don't see the word Drupal mentioned anywhere, chances are it's not a Drupal book. Google basically does the same thing with your web site. If the keyword doesn't appear anywhere on your site, they won't rank you for that keyword. The first step in convincing Google that you are the best is to tweak your site so that the keywords show up in all the right places. These changes to your site for the search engines are collectively called On-Page Optimization. Thankfully, because you're using Drupal, it's a lot easier than it might be otherwise. Making URL paths clean and search...

Other Great Tools

Another helpful tool is the Google Toolbar. It gives you some very helpful features, like a Google search box and the Google PageRank indicator. Visit any page on the web and the toolbar will tell you the PageRank of that page. Currently, the Google Toolbar only supports Firefox and Internet Explorer. The following screenshot shows the Google Toolbar for Mozilla Firefox. PageRank is a very important factor that needs to be taken into consideration for the search engine optimization of your site. Google isn't the only search engine that offers some great tools. Yahoo! is one of the few search engines that will provide you with a list of all the links you have coming into your site. Just put your URL into their link-reporting tool. You can even add a badge to your site that tells you how many inbound links you have, as shown in the following screenshot.

Path Module

In the earlier hands-on section (Hands-On Content Management), we created an About Us page and a Home page and added them to our site's menu. But if you return to those pages, you'll see that the URL is something like http://www.example.com/node/1. Wouldn't it be great if we could instead give these pages nice, search engine-friendly URLs like http://www.example.com/about? We can, with Drupal core's Path module.


One of the key elements of successful search engine optimization is providing URLs on your site that are meaningful. By default, Drupal 7 out-of-the-box URLs look something like http://localhost/node/1. A search engine has no idea what node/1 means, nor what the content associated with that page may be about, just by looking at the URL. Humans visiting the site may also have a difficult time navigating to pages that are not linked or accessible by a menu, as http://localhost/node/2487 is not very intuitive. Fortunately, we have the Pathauto module, which creates an alias URL for the node being created; the alias URL takes the form of the title of the node, with hyphens used to separate words and all words made lowercase. For example, if node 2487 has a title of "Special deals of the month", the URL as generated by Pathauto would be http://localhost/special-deals-month (Pathauto removes common words like "the" and "of" from titles when...

Word Tracker

WordTracker has been around the longest, for good reason. They provide dozens of tools to help you find the best keywords. They gather their data from meta search engines, that is, search engines that aggregate search results from many different search engines. They then extrapolate the number of searches on a particular term based on market-share data. It's not perfect, but it does provide some great data points.

Finding themes

Themes are what will make your Drupal site look different from everyone else's Drupal site. Themes can be free or commercial. The best place to start your search for a theme is in the Themes section of the Drupal web site. You can also use your favorite search engine to find many themes. If you cannot find something that's quite what you're looking for, then you can pay a web designer to have a theme custom-developed for you. Make certain that the designer has specifically done Drupal work before. Being able to design a slick home page isn't good enough, because, as you have seen, Drupal has functional areas on the page, and you cannot simply take just any web page design and superimpose it on Drupal.

What a keyword is

Keywords are single words that a search engine user types into the search box to try to find what they're looking for. Key phrases are the same as keywords, except that they consist of two or more words. For the sake of simplicity, throughout this book let's use keywords to mean both keywords and key phrases. Millions of random people visit Google every day. When they arrive, they are amorphous: a huddled mass yearning for enlightenment, with nothing more than a blank Google search form to guide them. As each person types keywords into Google and clicks the Search button, this random mass of people becomes extraordinarily organized. Each keyword identifies exactly what that person is looking for and allows Google to show them results that would satisfy their query.

XML sitemaps

In the early 2000s, Google started supporting XML sitemaps. Soon after, Yahoo! came out with their own standard, and other search engines started to follow suit. Fortunately, in 2006, Google, Yahoo!, Microsoft, and a handful of smaller players all got together and decided to support the same sitemap specification. That made it much easier for site owners to make sure every page of their web site is crawled and added to the search engine index. They published their specification at http://www.sitemaps.org. Shortly thereafter, the Drupal community stepped up and created a module called (surprise!) the XML sitemap module. This module automatically generates an XML sitemap containing every node and taxonomy term on your Drupal site. It was originally written by Matthew Loar as part of the Google Summer of Code. The Drupal 6 version of the module was developed by Kiam LaLuno. Finally, in mid-2009, Dave Reid began working on version 2.0 of the module to address performance, scalability, and reliability...
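A minimal sitemap following the joint specification looks like this; the URL and date below are placeholder examples, not values from the book:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; only <loc> is required. -->
  <url>
    <loc>http://www.example.com/my-first-content-item</loc>
    <lastmod>2009-06-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The XML sitemap module generates a file like this for you automatically, so you rarely need to write one by hand.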

Who this book is for

No matter your reason, you hold in your hands the knowledge that you need to rank at the top of the search engines and turn visitors into paying customers for your business. Each page of this book tells you exactly what you'll need to do to properly search engine optimize your web site. If you're relatively new to Drupal, just follow the easy, step-by-step instructions and screenshots. If you're an old hand, skip past the basic steps and review the best configuration options for each module. I've boiled down years of experience in Drupal, online marketing, monetization, dozens of modules, some tips, and a few tricks into a powerful potion of Drupal SEO goodness.

File name

The first and easiest way to identify what a file is about is to use a descriptive file name. A filename like img0004.jpg does nothing for you. However, president-obama-eats-donut.jpg is descriptive, keyword-filled, and does wonders for the findability of, say, a presidential pastries web site. The file extension (.jpg) also tells the search engines a lot about what that file is and how to display it to users. Make sure your videos have video extensions, your Flash files have Flash extensions, and so on.
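The same idea can be sketched as a small helper that turns a short description into a keyword-filled file name. This function is hypothetical, written for illustration, and not part of any Drupal module:

```python
import re

# Hypothetical helper: build a descriptive, hyphenated file name from a
# short description, keeping the extension so search engines (and browsers)
# still know what kind of file it is.
def descriptive_filename(description, extension):
    slug = "-".join(re.findall(r"[a-z0-9]+", description.lower()))
    return f"{slug}.{extension}"

print(descriptive_filename("President Obama eats donut", "jpg"))
# president-obama-eats-donut.jpg
```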

Content is King

Search engines make their money by taking good content and serving it up to the masses. Think about it from Google's perspective: the better the content on your site, the happier the search engine user is going to be when they get there. Happy visitors will use Google again and again. The purpose of this chapter is to talk about ways that you can create fantastic content that the search engines will crawl, index, and then put in their search results. The more excellent the content on your site, the more visitors will find you. In this chapter, we're going to cover search engine optimizing your content...


To optimize their site's content, Volacci advised Acquia on the implementation of specific Drupal modules for SEO. These modules included Pathauto, XML Sitemap, and others. Then, the site's content, HTML, and structure were modified to better communicate to the search engines that Acquia is an authority on Drupal. Consequently, its attractiveness to both visitors and search engines was enhanced.


Drupal 6 Search Engine Optimization: Rank high in search engines with professional SEO tips, modules, and best practices for Drupal web sites. 3. Create search engine friendly and optimized title tags, paths, sitemaps, headings, navigation, and more

Removing content

There will come a time when you want to take something off of your site. If you want to be friendly with the search engines, don't just unpublish the node. Search engines crawl your site on a regular basis and assign value to each page. If you simply delete a page, you throw that value away, which is a loss for your site. Take the time to do it right, and you can redirect all of the search-engine goodness that the page carries to another page on your site, or even to another web site entirely. 3. Do a backlink search using Google Webmaster Tools by carrying out the following steps: Log in to Google Webmaster Tools.
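If your site runs on Apache, one common way to pass that value along is a 301 (permanent) redirect. The paths below are hypothetical examples, not ones from the book, and within Drupal a redirect module can achieve the same result without editing server configuration:

```apache
# .htaccess (Apache, mod_alias): permanently redirect the removed page to
# its replacement so visitors and accumulated search-engine value follow it.
# Both paths here are made-up examples.
Redirect 301 /special-offer /current-offers
```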

URL Path Settings

You may have noticed while working with the revisions feature that the URL shown in your browser's address bar looked something like http://localhost/node/1, where node tells us that Drupal is displaying a single piece of content (a node) and 1 is the unique ID of the node being displayed. In this case, it's the first node we created in the system, so the ID is 1. That number increases by 1 for each node we add. Although http://localhost/node/1 gets us to the content that we wanted, the URL is not very people- or search-engine-friendly. Fortunately, Drupal lets us override the URL with one that is. After entering the new URL alias, click the Save button at the bottom of the page. Drupal will redisplay the page using the new alias URL that you created on the previous page. In my example, the new URL is http://localhost/my-first-content-item. The new URL is easy for a human to understand and, more importantly, easy for a search engine to pick up...

Google ranking

Clearly, the combination of high-quality, targeted content and SEO in partnership with Volacci has propelled Acquia to the top of the search engines. Comparing the first thirty days with Volacci (May 2008) to the sixth month (October 2008), site visits from search engines increased by an incredible 270. "SEO was a key part of making the launch successful and contributing to the organic search traffic gains at a crucial time in our evolution, when market visibility was critical." - Bryan House, Director of Marketing at Acquia

Outsourcing Search

Drupal's built-in search engine is great for small sites that don't have much content, but as your content grows and your users expect more relevant search results, you'll start to see signs of stress on your site. (After all, Google, Microsoft, and Yahoo! wouldn't be in an all-out slugfest if search were easy.) Consider outsourcing your search results with one of the following modules. Acquia Search: Based on the extremely powerful and complex Apache Lucene and Apache Solr (the same technologies that power the search on http), Acquia Search provides you all of the power with none of the complexity. This quick drop-in replacement for Drupal's search engine will provide you with faster searches, more relevant results, and a host of other features. Google Custom Search: This customizable Google search engine can provide all of the power of Google with none of the work. Read more about Google Custom Search at http, and then add it to your site at http

Search Engine Traffic Tactics

Within this guide you will learn tactics such as the following: domain age, regular upgrades, writing for your visitors, press releases, Flash, meta tags, heading tags, site maps, keywords, external links, business address, article distribution, images, multiple domains, link exchanges, and so much more.
