Search Engine Traffic Guide

Google Sniper

This system of money-making online has what NO other online money-making system has: solid proof that it works. Google Sniper has screenshots and bank statements to prove that the system makes money. And how does it work? Google Sniper is an advanced marketing tool that helps you set up websites and start making money from them right away, by using Google's algorithms to target customers who want to buy what you are selling! People have made upwards of $12,000 per month on Google Sniper, and some sites are cranking out as much as $400 per day. Google Sniper is a complete online marketing machine that is impervious to changes in the Google algorithm because it works inside the algorithms. This is the only system you will find online that makes the money it promises AND has proof to back it up! Continue reading...

Google Sniper Overview

Rating:

4.8 stars out of 17 votes

Contents: Premium Membership
Creator: George Brown
Official Website: gsniper.com
Price: $47.00

Access Now

My Google Sniper Review

Highly Recommended

Maintaining your trust is number one. Therefore I try to provide as much reliable information as possible.

I personally recommend buying this product. The quality is excellent, and with this low price and a 100% money-back guarantee, you have nothing to lose.

Improving Your Site's Search Engine Visibility

An important way to get more visitors to your site and help build your community is to have good search engine visibility. WordPress is great for natural search engine visibility, and by that I mean the real search results that appear below and to the left of the paid-for-inclusion ads in Google and other popular search engines. Getting search engine optimization for free, right out of the box, is a great bonus when you're trying to build an online community. Getting good search engine ranking (that is, getting your site higher up in the search results when people look for web sites on your subject) is essential to guiding visitors to your site. WordPress helps you get good search engine ranking in a number of ways: by constructing structured, semantic, standards-compliant HTML; by providing multiple views of your content; and by creating search-engine-friendly permalinks. But that's not all there is to improving search engine visibility for your site. There are also a number of things that...

Caution: Search Engine Penalization

Most search engines use a number of different metrics to determine where a web site is positioned in a list of search results. Some search engines, such as Google, have a metric (Google's is called PageRank, and it factors in other things too) based on in-bound links, where a link from one web site to another acts as a vote. Some web sites and businesses use this to their advantage and offer to pay for advertising space on other web sites (normally ones with a high ranking). Neither the buying nor the selling of advertising in this way is something many search engines like. Google even has an online tool for reporting web sites that do this, which can result in their rankings being penalized. The nofollow attribute signals to search engines that we don't wish the link to be counted as a vote for the linked web site's rankings. So if this were an advertisement, we would not be penalized by search engines for buying or selling advertising space for it on a web site. Don't risk...
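To make the point concrete, here is a sketch of what such a link might look like in HTML; the URL and anchor text are placeholders, not taken from the original text:

```html
<!-- The rel="nofollow" attribute asks search engines not to count this
     link as a ranking "vote" for the destination site -->
<a href="http://example.com/" rel="nofollow">Example advertiser</a>
```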

Contributing to Your Site's Search Engine Ranking

The things you can do to help your search engine rankings are all about the content. The best and most important advice I can give you is to write well. By sticking to your subject and writing relatively short or medium-length articles that stay on topic, written in the language the community uses, you will not only keep your readers interested, but you will also gain ranking for the very keywords your readers will use in their searches. Still, you should bear in mind what measures are important to the search engines. Each search engine has its own recipe for what its administrators believe is the right mixture of the keyword and link measures. However, here are a few simple guidelines that will help you with all search engine ranking algorithms. WordPress is designed to help you achieve the goal of following these guidelines, but what it can't help you with is choosing the keywords and writing compelling posts that someone seeing your web site in a search engine results list would be...

Submitting Your Sitemap to Search Engines

Creating a sitemap is no magic potion for increasing search engine rankings. The ugly truth is that search engines won't even know about your sitemap until you tell them. Fortunately, you have tools on your side. XML Sitemaps comes with the search engine sub-module, and each search engine has a set of tools on its site to help you manage yours. Before you submit a sitemap to a search engine, you must verify that you are the owner of the site and are therefore allowed to provide a sitemap for it. Each search engine has a slightly different method of verification, but they are all roughly the same. In the following exercise, you verify your site with Google in order to submit a sitemap, which you'll do in the exercise directly following this one. In this exercise, you verify ownership of your site within Google Webmaster Tools. 6. On the Site Verifications page, click the Add Verification tab and select Google as the search engine, as shown in Figure 19-6. Click...
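One common verification method (shown here as a general illustration, not the exact steps of this exercise) is to place a Google-issued meta tag in the head of your front page; the token below is a placeholder:

```html
<!-- Google compares this token against the one it issued to your account -->
<meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
```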

Understanding search engine crawlers

Did you ever wonder how all those pages got into the search engines in the first place? There's a magic search engine genie that flies from server to server waving a magic wand... not really, but close. Actually, there is a computer program called a crawler (or sometimes a spider, robot, or 'bot) that lives on the search engine's servers. Its job is to surf the Web and save anything it finds. It starts by visiting sites that it already knows about and, after that, follows any links that it finds along the way. At each site that it visits, it grabs all the HTML code from every single page it can find and saves that on its own servers. Later, an indexing server will take that HTML code, examine it, parse it, filter it, analyze it, and some other secret stuff (a lot like waving that magic wand). Finally, your site is saved into the search engine's index. Now, it's finally ready to be served up as a search result. Total time elapsed: about two minutes. One important thing to note here is that...
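As a rough sketch of the "follows any links that it finds" step, the snippet below extracts and resolves the links on a fetched page using only the Python standard library; real crawlers add politeness rules, deduplication, and robots.txt checks on top of this.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links become absolute, ready to crawl next
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

A crawler would fetch each returned URL in turn, queue the new links it finds, and hand the saved HTML to the indexing stage.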

Search Engine Friendly Migration

The biggest pain for me, having just migrated from Geeklog to Drupal, is that my site has a good index in Google and other search engines. All those users finding my site on Google are suddenly getting the 404 error page, giving up, and going away when they could dig a bit deeper and find exactly what they were looking for. So, I harnessed my experience writing HTTP_REFERER logging systems for Geeklog to provide myself with a page of PHP-enabled content that parses the search engine query out of the HTTP_REFERER and runs a search on my Drupal site to try and find what the user was actually looking for. I thought I'd share with the world, as that's the entire point of open source systems like Drupal. 'www.altavista.com' => 'query', echo(t('<br><h2>Search Engine Detected</h2> It would appear you arrived here on the wings of a search engine, so I will search my local database and show you anything that matches what you were looking for.<br>')) This...

Search engine friendliness and robots.txt

Drupal by itself is very search engine friendly. For example, it is not uncommon for Drupal-based sites to have a Google ranking of 5 (out of 10) or more where the same content on another CMS would score much lower. Still, you can make Drupal even more search engine friendly by changing some default parameters. On this page you will find several ways of tweaking your Drupal installation to make it more search engine friendly. By default, Drupal does not ship with a pointer file for the search engines called robots.txt. By adding this file to the root of your (virtual) web server, you can guide bots through your site or forbid indexing of parts of it. See, for an example, the file for drupal.org itself at http://drupal.org/robots.txt. Many robots obey the Crawl-delay parameter. Since Drupal sites seem to be popular with search engines, and lots of people have more aggressive bots than visitors at their site, it might be wise to slow down the robots by adding a robots.txt line...
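For instance, a minimal robots.txt asking all well-behaved bots to pause between requests might look like this (the delay value is illustrative, and not every engine honors Crawl-delay):

```
User-agent: *
# Ask bots to wait 10 seconds between requests
Crawl-delay: 10
```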

Search Engine Optimization and Website Promotion

One of the most common goals for a website is to appear high up in the big search engines' rankings. As you should know, a good ranking increases the chances of potential users finding your site among the mass of other sites. So what can be done to rank as highly as possible without actually having to pay anyone? It certainly helps to have everything named meaningfully, not least because search engines do look at file names. Instead of naming a page 19, give it something appropriate like expert-opinion. IMPORTANT: Search engines, in particular Google, place a large amount of emphasis on the anchor text used in links. Make sure all your links have meaningful text associated with them. For example, you could rewrite the following sentence. The reason for this is that the word "here" is not particularly meaningful to a search engine, even though humans can easily make the connection. For the sake of your rankings, simply move the link to the key phrase "Wildlife community" to place more...
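As an illustration of the idea (the sentence and path are invented for this example), compare a link hung on the word "here" with one hung on the key phrase itself:

```html
<!-- Weak: "here" tells a search engine nothing about the target page -->
Read about our wildlife community <a href="/wildlife-community">here</a>.

<!-- Better: the key phrase itself carries the link -->
Read about our <a href="/wildlife-community">wildlife community</a>.
```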

Search Engine Optimization

One of the most common goals for a website is to appear high up in the big search engines' rankings. As you should know, a good ranking increases the chances of potential users finding your site among the mass of other sites. So what is it that you can do to make your site rank as highly as possible without actually having to pay anyone to do it for you? Submit your site to search engines and online directories. While this is not as important as the first point, it certainly helps to have everything named meaningfully, because search engines do look at file names. Instead of naming a page 19, you should name it something like awesome_webpage.html. Don't go overboard on this, because it is not too important. IMPORTANT: Search engines, in particular Google, place a large amount of emphasis on the anchor text used in links. As a result, make sure all your links have meaningful text associated with them. For example, you would rewrite the following sentence: Donate to the Wildlife...

Google Webmaster Tools

If you've got a site that shows up in Google, then you need a Google Webmaster Tools account. Google Webmaster Tools provides you with detailed reports about your pages' visibility on Google. It's one of the most direct ways you can communicate with Google about your site. It allows you to upload an XML sitemap, and to see whether there are any problems with your site and fix them. It even lets you control the Google spider so that it doesn't drag your site down with constant visits. To use the tool, you need to verify your site. Fortunately, there is a great module called the Site verification module that helps you verify your site with the search engines. It was created and is maintained by Dave Reid. Thanks, Dave! You'll always be verified in my book. 10. Go back to Google Webmaster Tools and click on the Verify button. In a few seconds, you should see the success message, as shown in the following screenshot. Now that your site is...

Removing duplicate search engine results

It is good practice to prevent search engines from indexing duplicate pages. For instance, after moving the front page to the actual home page, there is no need for search engines to index the separate http://YOURSITE.com/frontpage URL. (From the point of view of a search engine, this URL involves a frontpage directory, even though there is no such directory on your system.)

Making File System Backups

The files in the first category include the scripts that were delivered with the Drupal installation, contributed modules that you may have installed, customized themes, and the configuration file (settings.php). From the standpoint of a backup strategy, these files are relatively easy to deal with. Their volume doesn't increase even as site traffic grows, and you know every time they change since you make the changes yourself. If you simply take the time to manually copy these files to your local machine every time you install something new or make a change, you're already ahead of the game.

Keyword research tools

The most important keyword research tool at your disposal is your own web site, http://www.yourDrupalsite.com. If you already have some traffic, chances are that it's coming from somewhere. With analytics installed, you should be able to find out what visitors are searching on with just a few clicks. If you have Google Analytics installed, you can easily see this data by logging in to its admin section and then going to Traffic Sources > Search Engines > Google. This information can be invaluable if you cross-reference it with your current positions in the search engines. Say, for example, that you're getting 100 searchers a month on a term for which you're on page 2 of Google. That's a good indicator that people are searching hard to find a company like yours, and that may be a very good term to focus on in your search engine campaign. If you're getting that much traffic on page 2, imagine if you were in the top three on page 1. Drupal has a built-in search engine, another great tool to see...

Optimizing URLs with the Path module

Sure, search engines can read the URL, but that's just the first step to making your web site addresses work for you. Search engines look at the URL for keywords just like they look at the page title or the body content. That means that a site with keywords in the URL path will do better than a site without them. Thankfully, Drupal core includes the Path module, which lets you write your own paths. The Path module allows you to manually create search engine friendly URLs based on your content. This allows you to get addresses that look like the following URLs
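The original list of example URLs did not survive in this excerpt; as a purely hypothetical illustration, a Path alias typically turns an internal node address into a keyword-rich one:

```
http://www.example.com/?q=node/42   becomes   http://www.example.com/blog/drupal-seo-tips
```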

Providing Semantic Standards Compliant Content

Out of the box, WordPress gives you a semantic, standards-compliant web site. The search engines love this because it makes their job easier. When search engines send their spiders and crawlers out into the Internet, they often encounter barriers to navigating a site: JavaScript links, Flash content, and so on. WordPress blogs don't have these barriers to navigation. Furthermore, when the search engines analyze the content of the pages they have gathered to identify the keywords and the keyword density, and to determine the ranking of your pages for those keywords, WordPress makes it easy for them. WordPress uses the name of your blog in <title> and <h1> tags on each page. For category pages, the <title> also contains the category name, and for individual post pages, the post title is in there as well. This common-sense approach to structuring a web site achieves search engine optimization by publishing unique page <title> tags, and by giving meaning and importance to...

If You Don't Want to Outsource

If you are running an intranet, have privacy concerns, or simply don't want a third party crawling through your content, there are other options. The Apache Solr and Lucene open source search engines are extremely powerful, highly flexible, and freely available. Better yet, they have been integrated with Drupal. Be warned that although these search engines are incredibly powerful, they are also very complex. Entire books are dedicated to their installation and configuration. If this is your first introduction to Drupal or a content management system (CMS), you should leave this until a bit later in your Drupal journey.

A brief history of static and dynamic URLs

The method they used involved a question mark (?), equal signs (=), and ampersands (&). It was magnificent, until search engines came along. Search engines couldn't understand these often long, complex strings of data being passed from a browser to a server and back again. URLs often looked like this. As it turns out, the only important piece of such a URL, at least as far as the search engines were concerned, was the first element. Even then, some sites would put the important things at the end of the URL, and there was just no way for Google to know that. So, they ignored everything after the ?. That meant that web sites with thousands of products would look to Google like they only had one or two pages. Finally, in 1996, a really clever guy named Ralf Engelschall came up with a URL rewriting patch for Apache called mod_rewrite. It acts as a translator between URLs and databases, so that ugly query strings that confused search engines before now show up clean and friendly. You could...
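A minimal mod_rewrite rule, sketched here with invented paths and parameter names, shows the translation at work: a clean URL arriving at the server is mapped back onto the query string the application expects.

```apache
RewriteEngine On
# Map the clean /products/123 onto the script's query-string form
RewriteRule ^products/([0-9]+)/?$ /index.php?product_id=$1 [L]
```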

Optimizing the robots.txt file

The robots.txt file is a file that sits at the root level of your web site and asks spiders and bots to behave themselves while they're on your site. You can take a look at it by pointing your browser to http://www.yourDrupalsite.com/robots.txt. Think of it like an electronic No Trespassing sign that can easily tell the search engines not to crawl a certain directory or page of your site. Using wildcards, you can even tell the engines not to crawl certain file types like .jpg or .pdf. This means none of your JPEG images or PDF files will show up in the search engines. (I'm not recommending that you do that... but you could.) On December 1, 2008, John Mueller, a Google analyst, said that if the Googlebot can't access the robots.txt file (say, the server is unreachable or returns a 5xx error result code) then it won't crawl the web site at all. In other words, the robots.txt file must be there if you want the web site to be crawled and indexed by Google. Read his full comment...
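A sketch of the wildcard idea (this syntax is understood by Googlebot and several other major crawlers, though it is not part of the original robots.txt standard):

```
User-agent: *
# Ask crawlers to skip PDF and JPEG files anywhere on the site
Disallow: /*.pdf$
Disallow: /*.jpg$
```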

Optimize images, video, and other media

Video, Flash, and other media are not far behind. Great graphics, animations, or videos add a lot of value to an otherwise boring web site. The drawback is that search engine spiders cannot read the content of these files. Nothing kills your site's searchability faster than embedding all your best keywords into graphic or Flash files. Users see them just fine, but search engines see nothing. It is extremely important to communicate as much information as you can to the search engines about each graphic or media piece that you use on your site. Google Image Search uses the file name and the alt tag to determine what the image is all about. Other services, like YouTube, Viddler, and video search engines, need help to determine what your video is about so that they can show it to people who are interested in watching your masterpiece. Make sure that you can take advantage of these powerful streams of traffic by adhering to the following guidelines. Did...
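For example (the file name and alt text are invented for illustration), a descriptive name plus an alt attribute gives image search something to index:

```html
<!-- File name and alt text both describe the image for search engines -->
<img src="/images/grilled-salmon-entree.jpg"
     alt="Grilled salmon entree served at the restaurant" />
```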

Sitemap and webmaster tools

Google has released some webmaster tools, including one that allows us to create a list of the pages within our site, rank them in order of importance, and specify the frequency with which those sections will be updated. This is then saved as an XML file and stored on the web site. Google can then read it and see which content it should check regularly, and which content it should check less often. This helps keep the web site more relevant to search engines.
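The XML file follows the sitemaps.org protocol; a minimal sketch with placeholder URLs looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/archive</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```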

Visitor-facing sitemaps

XML sitemaps are great for search engines but, as you can see, they're not user-friendly at all. Some of your site visitors will want to see all of the pages or sections available to them on your web site. That's where a visitor-facing sitemap comes in handy. Fortunately, there is a Drupal module that will do that for you automatically. It's called the Site map module. Not only does it show a nice overview of your site, but it can show your RSS feeds too. Everybody raise a glass to Nic Ivy and Fredrik Jonsson, respectively the original author and the current maintainer of this module. Cheers, gentlemen!

Writing page titles that Google and your visitors will love

There are two competing forces pulling at your page title. First, the search engines use your page title to help determine where your web site fits. Second, your customers will see and use your page title to help them determine whether your site has what they're looking for, and to remind them what your site is about when they see it in their bookmark list. A good page title achieves both objectives. Now, that works for search engines but not so much for customers. Sure, they'll know that you sell mortgages, but they may not remember which company. You always want them to remember who you are. So, you could do this. That's good for your customers; however, it moves the best keywords out of the first position. Search engines assume that the most important words come early in the page title. So, how about this? You confuse the search engine, which thinks that you've got the same page twice.
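The specific titles discussed above were lost in this excerpt; as a hypothetical illustration of the principle (the company and keywords are invented), putting keywords first and the brand second serves both audiences:

```html
<!-- Brand only: fine for returning customers, weak for search engines -->
<title>Example Lending Co.</title>

<!-- Keywords first, brand second: serves both forces at once -->
<title>Mortgage Refinancing Rates | Example Lending Co.</title>
```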

Tip: The TinyMCE module (http://drupal.org/project/tinymce) allows WYSIWYG HTML editing and integrates nicely with the Image

The final field on the HTML filter's configuration page is Spam Link Deterrent. In early 2005, Google announced that it would no longer award any PageRank credit to sites based on links carrying the rel="nofollow" attribute. This was done in response to the increasing phenomenon of spammers posting comments on blogs with links to their own sites just to increase their page ranking with Google and other search engines. Drupal quickly responded, and by checking the Spam Link Deterrent option, you ensure that any links posted by your site's users will have the rel="nofollow" attribute, and Google will not follow them when spidering. Let's hope that the incentive for comment spam will dwindle as spammers realize that they are wasting their time.

Validating New User Accounts

Enabling e-mail validation limits an e-mail address to just one associated account, which is good for sanity purposes. It is a must in this day and age of automated registration bots, which create mass accounts on various web services that are ultimately designed to abuse search engines' indexes. Additionally, it keeps the onus on users to keep their account information up-to-date, as a valid e-mail address is required to recover passwords.

When to Use the Indexer

Indexers are generally used when implementing search engines that evaluate large data sets and you wish to process more than the standard "most words matched" queries. Indexers are also used to extract and organize metadata from files or other sources where text is not the default format. Search relevancy refers to content passing through a (usually complex) rule set to determine its ranking within an index.

Editing your robots.txt file

If, for some reason, the robots.txt file is missing, you can easily create one using any plain text editor like Notepad or TextEdit. Avoid using a word processor, though, as it adds extra content that will make the file unreadable to the search engines. 5. Most directives in the robots.txt file are grouped under a User-agent line. If you are going to give different instructions to different engines, be sure to place those sections above the User-agent: * section, as some search engines will only read the catch-all * directives when no earlier section names them specifically. 6. Add the lines you want. Later in this chapter, you'll learn several changes that will help you with your SEO.
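A sketch of that ordering, with invented paths: the engine-specific section comes first, the catch-all section last.

```
# Section read by Googlebot only
User-agent: Googlebot
Disallow: /drafts/

# Catch-all section for every other bot
User-agent: *
Disallow: /tmp/
```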

Modules used in Chapter

Pathauto automatically creates path aliases for our nodes, categories, and users. It generates search engine friendly URLs and improves the ranking of our pages. Global Redirect creates page redirects and solves problems related to duplicate content and lower rankings by search engines. Site map creates a page with a site map in readable form. Visitors can use it to view the structure of your site, and it is also accessible to search engines.

The Importance of URLs

The last URL is self-describing, making it easier for your site visitors and, more importantly, for search engines to understand what the content of the page will contain. As you can probably guess, this last clean, self-describing URL is what you want. To make your life easier, the Pathauto module can automatically create these URL aliases for you based on information like title, taxonomy, and content type. The only thing you need to do is to download, install, and enable the module although you should take a few minutes to configure or verify the default settings.

Search and Metadata Clean URLs

As you can see, the first URL is easier to tell your friends about or modify to select different shoes. Clean URLs are commonly a part of search engine optimization (SEO), helping search engines find your data more easily, which may lead to a higher search engine ranking. Regardless of whether it helps your search ranking, it simply looks better.

What this book covers

Chapter 5, Sitemaps, discusses the origin of sitemaps and how they're used to make sure your entire site is crawled by the search engines. It also teaches you how to make a user-friendly sitemap for your site visitors. Chapter 7, RSS Feeds, Site Speed, and SEO Testing, helps you get your web site Search Engine Optimized. It teaches you about RSS Feeds, PageRank, Drupal's built-in caching, and checking your site with SEOmoz. Chapter 8, Content is King, teaches you how to get good content and search engine optimize it. It also teaches you how to maintain the content and keep it search engine-optimized.

Conducting Special Searches

phpBB's search engine is also equipped with three types of predefined special searches. Any users, guests or registered, are able to find posts that have yet to receive a reply by clicking the View unanswered posts link in the top-right portion of the page, just above the forum and category listing. Registered users can perform two additional searches

Searching and Indexing Content

Both MySQL and PostgreSQL have built-in full-text search capabilities. While it's very easy to use these database-specific solutions to build a search engine, you sacrifice control over the mechanics and lose the ability to fine-tune the system according to the behavior of your application. What the database sees as a high-ranking word might actually be considered a noise word by the application, if it had a say. The Drupal community decided to build a custom search engine in order to implement Drupal-specific indexing and page-ranking algorithms. The result is a search engine that walks, talks, and quacks like the rest of the Drupal framework, with a standardized configuration and user interface no matter which database back-end is used.
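To make the indexing and noise-word ideas concrete, here is a minimal Python sketch of a custom inverted index with an application-chosen noise-word list; it illustrates the concept only and is not Drupal's actual implementation.

```python
import re
from collections import defaultdict

# The application, not the database, decides what counts as noise
NOISE_WORDS = {"the", "a", "an", "is", "of", "and", "or", "to"}

def tokenize(text):
    """Lowercase the text, split into word tokens, drop noise words."""
    return [w for w in re.findall(r"[a-z0-9]+", text.lower())
            if w not in NOISE_WORDS]

def build_index(documents):
    """Map each token to the set of document IDs containing it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for token in tokenize(text):
            index[token].add(doc_id)
    return index

def search(index, query):
    """Return IDs of documents containing every query token
    (an all-words simplification of 'most words matched')."""
    tokens = tokenize(query)
    if not tokens:
        return set()
    result = index.get(tokens[0], set()).copy()
    for token in tokens[1:]:
        result &= index.get(token, set())
    return result
```

Because the application owns both the tokenizer and the noise-word list, ranking rules can be tuned to the content rather than left to whatever the database considers relevant.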

Alias Your Way to a Better Search Ranking

It is a generally assumed best practice that a well-organized and well-aliased website will better your search engine rank. The voodoo magic that is SEO changes rapidly, so whether or not this is actually true is another matter. Regardless, a well-aliased site simply looks better. Here are a few tips to help keep your site looking good and give it a leg up on the search engine front. One alias per node: Multiple aliases for the same node may lead search engines to believe that you are a spam site with little valuable content. The Global Redirect module can help to ensure that this doesn't happen to you. Organization: A well-thought-out and organized website is easier to navigate and helps users, and search engines, understand your content. For example, place news at /news, blogs at /blogs, and your store at /store.

Starting to Blog and Building Your Community

In this chapter, I'll take you through some simple steps to enhance your blog and build your community. First, you'll look in a little more depth at posting to your blog, using both the standard and advanced editing options. I'll show you that you don't need to have any great HTML skills to make rich content for your site. Then you'll see how to manage categories, manage comments, add multiple authors for your site, and create blog pages. During the course of this chapter, you'll install and use two plug-ins, giving you an idea of what WordPress plug-ins can help you do on your site. Finally, I'll give you some tips on improving the search engine visibility of your site, to attract more visitors.

Search Performance Statistics and Reporting

Thus far, you have added user accounts, created and managed your content, and explored many of your site's configuration settings. In this chapter, you explore two of the most important aspects of creating a robust website. The first is enabling and configuring your search engine to provide a way for your visitors to dig through your site's content. The second is performance, or, perhaps better said, how to make your site fast. In this chapter, you explore Drupal's caching options and other site optimization techniques.

Banning Abusive Users

One of the most useful features of the Statistics module is that it allows you to identify visitors who are abusing your site. Usually, these are not human visitors, but rather search engine crawlers that are malfunctioning or machines automatically accessing your site in an abusive manner. Once you identify a user, usually represented by an IP address, that is abusing your site, you can ban access from that particular abuser. This is a fantastic tool if your site is buckling under an artificially high load that is being generated by attackers or corrupt automated programs generating excessive requests.

Creating Aliases to Drupal Paths

Tip: The Pathauto module (http://drupal.org/node/17345) automatically generates path aliases for various kinds of content (nodes, categories, and users) when no explicit alias is provided by the user. The patterns used for generating the aliases are fully configurable, making the Pathauto module an indispensable tool for search engine optimization.

Overloading a Permission

Weaknesses with overloaded permissions are generally more difficult to exploit. You have to find a site that has the module installed, gain an account on the site, and then probe for the misconfiguration. That said, a site that is totally misconfigured and allows anonymous users to perform the actions can often be found via a search engine. Again, this will be covered more thoroughly in Chapter 9.

PhpBBs Security Features

Visual confirmation: phpBB's later releases come bundled with a visual confirmation system, which acts as a deterrent against automated registration bots that use your member list as ad space for shady web sites interested in increasing their search engine placement. Humans should be the only ones using your forums, and visual confirmation helps ensure that is the case.

What your keyword goal is

A 2004 survey by iProspect found that two out of three search engine users believed that the highest search results in Google were the top brands for their industry, and there is little reason to believe this perception has changed. That means that simply being at the top of Google will gain you a certain level of trust among search engine users. If Big Computers Unlimited can rank in the top three for Gaming PCs, they'll develop a lot of cred among gamers.

Project autocompletion and search

One area where autocomplete is not enabled in Drupal is on the search pages. As always, there are a variety of reasons for this, the most important being performance. Having lots of autocompletion scripts, on various clients, all running numerous searches would significantly increase the load on the search engine.

Some Forum Administration Lingo

Bot: Bots are a relatively new phenomenon in the fight for message board security. The typical bot registers new accounts on message boards automatically in order to boost a site's search engine rankings. Some bots are sophisticated enough to spam certain types of message boards. If you enable the proper validation in phpBB, you won't have to deal with these. (I'll cover defense against bots at length in Chapter 10.)

Understanding payment workflow

Ubercart Order Process

Before we continue with the configuration, we need a high-level understanding of how Ubercart handles payments and order status. After your client finds the product he or she wishes to buy, either by searching your online catalog as we discussed in Chapter 4, Managing Categories, Products, and Attributes, or through a Google search (SEO will be covered in Chapter 10, Optimizing and Promoting Your Store), the customer's cart is updated with each selection. From this point, you have three alternatives, as depicted in the following image. It depends on your merchant plan whether you will accept electronic payments on your site, take the payment process offline via bank transfer or a check, or redirect it to a safe provider location.

Creating a Company Blog

A blog is a series of typically short postings that are displayed in reverse chronological order. Blogs normally allow users to comment on each posting. With a regularly updated blog or blogs, you can ensure that your site always has fresh content, which ensures that both visitors and the search engines keep coming back. In this chapter, we will create a blog where Chef Wanyama can discuss the Good Eatin' restaurant, cooking, the restaurant business, and more. He plans to use the blog to make the site more interactive and to draw search engine traffic.

Using Open Calais to offer More Like This blocks

One of the ways that search engines determine the relevance of a piece of content to a particular topic is where it links to and which pages link to it. For example, real estate and mortgages are related in Google's eyes because a lot of realtors link to mortgage companies and vice versa. Your content is now more connected, and that's a good thing for your site visitors and search engines.

Mastering the .htaccess file

There is a server configuration file at the root level of your Drupal 6 site called the .htaccess file. This file is a list of instructions to your web server software, usually Apache. These instructions are very helpful for cleaning up some redirects and otherwise making your site function a bit better for the search engines. In Chapter 1, The Tools You'll Need, we told Google Webmaster Tools that we wanted our site to show up in Google with or without the www in the URL. The .htaccess file allows you to do the same thing directly on your web site. Why are both necessary? In Google's tool, you're only telling Google how you want them to display your URLs; you're not actually changing the URLs on your web site. With the .htaccess file, you're actually affecting how the files are served. This will change how your site is displayed in all search engines. 7. Don't forget to tell Google which you prefer using Google Webmaster Tools. See the Google Webmaster Tools section in Chapter 1, The...
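As a sketch of the redirect described above (assuming Apache's mod_rewrite is enabled; example.com is a placeholder domain — Drupal's stock .htaccess ships with commented-out rules along these lines that you uncomment and adapt):

```apache
# Force the www version of the domain (example.com is a placeholder)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
```

The 301 status tells search engines the move is permanent, so ranking value flows to the www version.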

Indexing Content That Isn't a Node: hook_update_index()

If you need to wrap the search engine around content that isn't made up of Drupal nodes, you can hook right into the indexer and feed it any textual data you need, thus making it searchable within Drupal. Suppose your group supports a legacy application that has been used for entering and viewing technical notes about products for the last several years. For political reasons you cannot yet replace it with a Drupal solution, but you'd love to be able to search those technical notes from within Drupal. No problem. Let's assume the legacy application keeps its data in a database table called technote. We'll create a short module that sends the information in this table to Drupal's indexer using hook_update_index() and presents search results using hook_search().

Adding your XML Sitemap to the robots.txt file

Another way that the robots.txt file helps you search engine optimize your Drupal site is by allowing you to specify where your sitemaps are located. While you probably want to submit your sitemap directly to Google, Yahoo!, and MSN, it's a good idea to put a reference to it in the robots.txt file for all of those other search engines. You can do this by carrying out the following steps. If you have an XML sitemap, use it. If not, use the URL list sitemap. However, do not add both an XML sitemap and a URL list sitemap to the robots.txt file; it could confuse the search engines, possibly even causing duplicate content on your site. Also, do not add your visitor-facing sitemap to your robots.txt file.
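The reference itself is a single Sitemap line pointing at the sitemap's full URL. A minimal sketch (www.yourDrupalsite.com and the sitemap path are placeholders — use whatever URL your sitemap module actually serves):

```
# robots.txt
Sitemap: http://www.yourDrupalsite.com/sitemap.xml
```

Per the sitemaps.org protocol, the Sitemap directive must be an absolute URL and can appear anywhere in the file, independent of any User-agent section.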

Using the Memcache API and Integration module

The Memcache API and Integration module provides an API for integration with the Memcached daemon or service. There are many methods of installing the prerequisites for using this module, specifically the Memcached library, but we're going to focus on how to easily install Memcached and the Drupal 6.x version of the module in a Windows WAMP environment using PHP 5.2.x. In order to use the Drupal module, we'll first need to install the Memcached service on our local development server and then integrate this service with our PHP version. There are many instructions available on the web for installing Memcached on a Linux or Mac OS system. You can do a Google search for tutorials on how to get Memcached installed, or see the note below with links to resources on http://drupal.org and on Lullabot's website. Our focus here is to get Memcached up and running as quickly as possible so that you can see examples of how it works in a sandbox environment, so we're going to install using WAMP on...

Using Google's Webmaster Tools to evaluate your robots.txt file

Warning: The robots.txt file is easy to mess up! It's not written for humans, so it's easy for site owners and webmasters to misunderstand exactly how to use it. Take care not to break your SEO campaign simply because a poorly written robots.txt file is excluding your site from Google. Fortunately, Google's Webmaster Tools provides a helpful utility that shows you exactly which pages are being excluded and included by your robots.txt file (problem lines are flagged, for example, as "Rule ignored by Googlebot"). Carry out the following steps to evaluate your robots.txt file using Google's Webmaster Tools. 6. Further down, Choose User-agents allows you to specify which Googlebot you want to evaluate. Google has several that they use, like Googlebot-Mobile and Googlebot-Image. Let's try an example. We're going to tell Googlebot-Image to leave our site alone: User-agent: Googlebot-Image. I chose Googlebot-Image as the Googlebot, and here's what it looks like: User-agent: Googlebot-Image, Disallow: /images/, Disallow: /*.jpg, Disallow: /*.gif...

Separate Database Server and a Web Server Cluster

Load balancers distribute web traffic among web servers. There are other kinds of load balancers for distributing other resources, such as hard disks and databases, but we'll cover those later. In the case of multiple web servers, load balancers allow web services to continue in the face of one web server's downtime or maintenance.
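As an illustrative sketch (not from this book), a minimal nginx configuration that balances requests across two hypothetical web servers might look like this; the IP addresses are made up:

```nginx
# Hypothetical load balancer: distribute traffic across two web servers
upstream web_cluster {
    server 10.0.0.11;   # web server 1
    server 10.0.0.12;   # web server 2 (requests keep flowing if one goes down)
}
server {
    listen 80;
    location / {
        proxy_pass http://web_cluster;
    }
}
```

By default nginx round-robins between the upstream servers and skips a server that stops responding, which is exactly the continuity property described above.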

The alt and title attributes

The alt attribute specifies alternative text to display if the image, movie, or other media can't be displayed. Suppose someone has images turned off in their browser settings (common on dial-up connections and text-based browsers) or the images get moved; the alt text would be displayed instead. For search engines, the alt text can be another indicator of what that element of the page is about and, hence, what the entire page or site is about. Unfortunately, many black-hat SEOs have used alt and title text as a way to stuff keywords into their sites. This was a useful tactic... back in 1995. Just use alt and title attributes as you would if you didn't care about SEO. Put keywords in there if it helps your users. Search engines are smart; they'll figure out if you're stuffing your keywords and penalize you in the search results. Many web designers want to use graphics instead of text to represent links or menu options. While this may be helpful to users, text links are better for SEO and are...
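For example, a user-first image tag might look like the following (a hypothetical snippet — the file name and text are made up, borrowing the Good Eatin' restaurant example used elsewhere in the book):

```html
<!-- Descriptive alt and title text written for people, not keyword stuffing -->
<img src="dinner-menu.jpg"
     alt="Good Eatin' restaurant dinner menu"
     title="Dinner menu">
```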

Enabling and configuring the Throttle module

Drupal allows you to control when your modules and blocks get enabled and shown to your site visitors. This helps you prevent bottlenecks in your server's web traffic and optimize your server load to prevent any congestion it might experience with its bandwidth and traffic. Throttling blocks and modules becomes increasingly important on larger-scale websites where you have many blocks and modules active. You may have a site that contains a large number of blocks, for example, that have been built with the Views module. You can throttle these blocks so that they only get enabled when the site visitor calls a page that is supposed to show that block. The Throttle module can be configured so that it automatically gets enabled when the usage of your site goes above a certain threshold, for example, the number of anonymous users visiting your site. When a certain number of visitors are on your site, you can have Drupal enable throttling.

Submitting your Google News sitemap to Google News

Log in to Google Webmaster Tools by pointing your browser to... The XML Sitemap is the ideal choice because it allows you to specify a lot of information about the content of your site. But say, for some reason, that you can't install an XML Sitemap. Maybe there's a conflict with another module that you just have to have. Perhaps your server doesn't have the power to handle the large overhead that an XML sitemap needs for large sites. Or possibly you want to submit a sitemap to a search engine that doesn't support XML yet.

Analyzing Your Site With Google

Few web statistics tracking systems are as user-friendly and versatile as Google Analytics. It enables you to view a wide range of data about traffic to your site, including the top content of the day or month, where your visitors are from, which search engines and sites send the most traffic, and much

How to pick the best keywords

By now, you know the goals of your SEO campaign: branding, lead generation, sales transactions, and so on. Now it's time to dig into the data. There are an infinite number of ways to go about doing keyword research. I'm going to take you step by step through one of them. It's not necessarily the right way or the best way, but it's a good, solid technique that I've used many times to produce excellent results.

Structure your site hierarchically

There's a reason you learned the outlining format in grade school: it's easier to organize related ideas when they're structured hierarchically. It turns out that it's easier for search engines to figure your site out when it's structured that way as well. So, send a long-overdue thank-you note to your fifth grade language arts teacher and let's get organized. Search engines also need to work in an organized manner. It turns out that search engines organize around much broader concepts than just keywords (although keywords are still the most important element). To find related words that will help your ranking, try typing your keyword into Google in the following manner...

Indexing Your Content

Indexing your content could also be called making your site's search engine work. If you perform a search after completing the previous exercise, you won't get any results; this is because your site has not been indexed yet. Indexing occurs automatically when your site runs cron (discussed in Chapter 3, Your First Drupal Website). You can see the status of your site's index, as shown in Figure 9-2, by navigating to Configuration > Search Settings.

Bold, strong, and emphasized text

Many search engines take into account text that is set apart on the page. You can set apart a word or phrase using a couple of methods. Bold and italics will do just that: bold or italicize the text. <strong> and <em> are tags that can be styled to look like anything you'd like using a style sheet in your theme. Typically, strong and emphasis tend to look like bold or italics. All are good methods for pulling a word out of a block of text and making it stand out.
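As a quick illustration (a hypothetical sentence, not from the book):

```html
<p>Our menu features <strong>locally sourced produce</strong> and
<em>daily-baked bread</em>.</p>
```

Browsers render these as bold and italics by default, but because the tags carry semantic weight, they also signal to search engines which phrases the author considers important.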

Adding the Data Entry Form

Next, we quickly check for cases in which we don't want to display the annotation field. One case is when the teaser parameter is TRUE. If it is, this node is not being displayed by itself but is being displayed in a list, such as in search engine results. We are not interested in adding anything in that case. Another case is when the user ID of the user object is 0, which means the

Robots.txt, .htaccess, and W3C Validation

Much of the SEO that we've accomplished so far is visible to your visitors (for example, titles, headings, body text, and even a sitemap or two). In this chapter, we're going to address some of the more technical aspects of on-page SEO. Over the last ten years, many elements have been added to the HTML specification. The search engines themselves have developed other elements to help you communicate better with them. Since our ultimate goal is to do well by the search engines and our visitors, it's time to embrace your inner geek and get technical with your SEO. Pocket protectors ready? Let's do this thing. In this chapter, we discuss making edits to two different files that are considered core Drupal. Core means part of the base installation of Drupal and not in the sites directory. While what you'll accomplish in this chapter is not considered hacking core, it does mean that when you upgrade your Drupal site (say, from 6.14 to 6.15) you will need to preserve your robots.txt and...

Strategies to Crack Drupal

This chapter goes example by example through several strategies to crack Drupal. The first is simply to search for a common security mistake in the code and then use some advanced Google search modifiers to find potentially vulnerable sites. Then you take a look at two vulnerabilities that were "happened upon" and discuss some things to be aware of as you click around sites and review code to increase the likelihood that you will happen upon these issues as well.

Submitting your XML sitemap to Google

If you have not already done so, you need to verify your web site with Google Webmaster Tools. Refer to Chapter 1, The Tools You'll Need, for details. Google Webmaster Central: Improve traffic with Google Webmaster Tools; make your site more search engine friendly. Google Webmaster Blog: Tips on requesting reconsideration. 8. Log in to Google Webmaster Tools, click your domain, and then click the Sitemaps Overview. If the status is still Pending, then wait a bit longer. When your sitemap has been crawled, it will say OK. You can easily see who has accessed your XML Sitemap by visiting your Watchdog log at http://www.yourDrupalsite.com/admin/reports/dblog. You can see how recently each search engine has visited your sitemap. What about all those other search engines out there? It's easy to let them all know where your XML Sitemap is located by adding it to your robots.txt file. We'll cover that in the robots.txt section in Chapter 7, Robots.txt, .htaccess, and W3C Validation.

Search engine optimization

You have many ways to promote your website, but the main traffic source will always be search engines. Search engine optimization helps your site improve its position in the natural search results, thus generating more traffic and attracting visitors who search for your products. The Pathauto module automatically creates path aliases for our nodes, categories, and users. It generates search engine-friendly URLs and improves the ranking of our pages. Browse to http://drupal.org/project/pathauto and, right after you download the module, upload it and unzip it to your site's sites/all/modules folder, and go to Administer > Site building > Modules to enable it. To configure it, go to Home > Administer > Site building > URL aliases. This is a simple but very useful module. In Drupal, especially when you're using clean URLs and the Pathauto module, the same content can be reached using different URLs. For example, node/34, node/34/, index.php?q=node/34, and products/ipod32mb are different URLs that may target...

Contents of This Book

This chapter covers several tools that can be used to create a wiki in Drupal, among other uses. The node revisions system (coupled with the useful Diff module), the Markdown filter for easy HTML entry, the Freelinking module to automatically create and link wiki pages, and the Pathauto module for automatically creating search engine-friendly URLs are all discussed in detail.

Problems with the default Drupal robots.txt file

There are several problems with the default Drupal robots.txt file. If you use Google Webmaster Tools' robots.txt testing utility (detailed instructions on this utility appear later in this chapter) to test each line of the file, you'll find that a lot of paths which look like they're being blocked will actually be crawled. The reason is that Drupal does not require the trailing slash (/) after the path to show you the content. Because of the way robots.txt files are parsed, Googlebot will avoid the page with the slash but crawl the page without the slash. Google what? Googlebot: Google and other search engines use server systems (sometimes called spiders, crawlers, or robots) to go around the Internet and find each web site. We sometimes refer to Google's system as the Googlebot to distinguish it from other search engine robots. While Google doesn't report this number anymore, it is estimated that the Googlebot crawls 10 billion web sites each week. That is a fast little robot.
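The trailing-slash behavior can be demonstrated with Python's standard-library robots.txt parser (a sketch, not from this book; the /admin/ rule and example.com URLs are made up, but the rule's shape matches many lines in Drupal's default robots.txt):

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.modified()  # mark the rules as read so can_fetch() will evaluate them
parser.parse([
    "User-agent: *",
    "Disallow: /admin/",   # rule written with a trailing slash
])

# The path with the trailing slash is blocked...
print(parser.can_fetch("*", "http://example.com/admin/"))  # False
# ...but the same page reached without the slash is still crawlable,
# because robots.txt rules are simple prefix matches.
print(parser.can_fetch("*", "http://example.com/admin"))   # True
```

Since Drupal serves the same content at both paths, the slash-less variant slips past the rule — exactly the leak described above.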

Throttling

If your site gets linked to by a popular website, or otherwise comes under a Denial of Service (DoS) attack, your web server might become overwhelmed. This module provides a congestion-control throttling mechanism for automatically detecting a surge in incoming traffic. This mechanism is utilized by other Drupal modules to automatically optimize their performance by temporarily disabling CPU-intensive functionality. You can also reduce the load on your servers by preventing access to certain parts of your site for web crawlers of various kinds, using the robots.txt file.

Word Tracker

WordTracker has been around the longest, for good reason. They provide dozens of tools to help you find the best keywords. They gather their data from meta search engines (search engines that aggregate search results from many different search engines). They then extrapolate the number of searches on a particular term based on market-share data. It's not perfect, but it does provide some great data points.

Content is King

The search engines make their money by taking good content and serving it up to the masses. Think about it from Google's perspective: the better the content on your site, the happier the search engine user is going to be when they get there. Happy visitors will use Google again and again. The purpose of this chapter is to talk about ways that you can create fantastic content that the search engines will crawl, index, and then put in their search results. The more excellent the content on your site, the more visitors will find you. In this chapter, we're going to cover search engine optimizing your content.


Drupal 6 Search Engine Optimization: Rank high in search engines with professional SEO tips, modules, and best practices for Drupal web sites. 3. Create search engine friendly and optimized title tags, paths, sitemaps, headings, navigation, and more

OnPage Optimization

Google and the other search engines look at the content of your site to determine whether you should show up for a particular search. It makes sense: if you don't even mention the keyword, then you probably aren't talking about that topic. It's like picking up a book and skimming the title, chapter titles, headings, text, and appendices to find out what it's about. If you're looking for a book on Drupal and you don't see the word Drupal mentioned anywhere, chances are it's not a Drupal book. Google basically does the same thing with your web site. If the keyword doesn't appear anywhere on your site, they won't rank you for that keyword. The first step in convincing Google that you are the best is to tweak your site so that the keywords show up in all the right places. These changes to your site for the search engines are collectively called On-Page Optimization. Thankfully, because you're using Drupal, it's a lot easier than it might be otherwise. Making URL paths clean and search...

Solution

To optimize their site's content, Volacci advised Acquia on the implementation of specific Drupal modules for SEO. These modules included Pathauto, XML Sitemap, and others. Then, the site's content, HTML, and structure were modified to better communicate to the search engines that Acquia is an authority on Drupal. Consequently, its attractiveness to both visitors and search engines was enhanced.

Google ranking

Clearly, the combination of high-quality, targeted content and SEO in partnership with Volacci has propelled Acquia to the top of the search engines. Comparing the first thirty days with Volacci (May 2008) to the sixth month (October 2008), site visits from search engines increased by an incredible 270 percent. "SEO was a key part of making the launch successful and contributing to the organic search traffic gains at a crucial time in our evolution when market visibility was critical." - Bryan House, Director of Marketing at Acquia

XML sitemaps

In the early 2000s, Google started supporting XML sitemaps. Soon after, Yahoo! came out with their own standard, and other search engines started to follow suit. Fortunately, in 2006, Google, Yahoo!, Microsoft, and a handful of smaller players all got together and decided to support the same sitemap specification. That made it much easier for site owners to make sure every page of their web site is crawled and added to the search engine index. They published their specification at http://sitemaps.org. Shortly thereafter, the Drupal community stepped up and created a module called (surprise!) the XML sitemap module. This module automatically generates an XML sitemap containing every node and taxonomy on your Drupal site. Actually, it was written by Matthew Loar as part of the Google Summer of Code. The Drupal 6 version of the module was developed by Kiam LaLuno. Finally, in mid-2009, Dave Reid began working on version 2.0 of the module to address performance, scalability, and reliability...

Path Module

We live in the time of search engines, and optimizing your site to work well with the crawling and indexing programs that are used to build search engines is vital. Your site's ranking in the results of search engines such as Google or Yahoo will greatly influence how many visitors it gets. One very important factor in facilitating this is the nature of the URLs that are used by your site to link to all of the content. If the URLs are meaningful and contain keywords pertinent to the content they link to, your site will fare better in the search results. Drupal offers

The Tools You'll Need

Congratulations! You're about to embark on a fun and interesting journey into the world of online marketing. Whether you're trying to sell more products, generate leads, or get more pageviews on your sponsors' ads, Search Engine Optimization (SEO) will take you where you want to go. And you're using Drupal 6! You've picked a great platform for building your web site. It's widely held that Drupal is one of the best choices if you want to rank well in the search engines. I personally believe that it's hands-down the best possible platform for SEO. I've seen clients triple their traffic within a few weeks of switching from a lesser platform. Believe it: Drupal is the best. But you already knew that, didn't you? In this chapter, we're going to dive right in and cover some of the top tips for Drupal SEO, some great paid tools to help you with your SEO, and the Drupal SEO group on www.Drupal.org.

The scenario

Let's say that I'm doing keyword research for a large computer manufacturer called Big Computers Unlimited. They sell computers all over the U.S. from their web site and a few select retail outlets. They recently acquired a smaller competitor called Intergalactic Gaming that specialized in high-end gaming PCs. The purpose of this campaign is to create more online sales of the specialized line of entertainment computers by increasing traffic from the search engines. While Big Computers spends millions each year on search marketing, this campaign is a trial to test the waters so they've only allocated a few thousand dollars over the course of three months.

Removing content

There will come a time when you want to take something off of your site. If you want to be friendly with the search engines, don't just unpublish the node. Search engines are crawling through your site on a regular basis and assigning value to each page. If you delete a page, you're just throwing that value away, which is a loss for your site. Take the time to do it right and you can redirect all of the search engine goodness that your page carries to another page on your site, or even to another web site entirely. 3. Do a backlink search using Google Webmaster Tools by carrying out the following steps: Log in to Google Webmaster Tools.

Who this book is for

No matter your reason, you hold in your hands the knowledge that you need to rank at the top of the search engines and turn visitors into paying customers for your business. Each page of this book tells you exactly what you'll need to do to properly search engine optimize your web site. If you're relatively new to Drupal, just follow the easy, step-by-step instructions and screenshots. If you're an old hand, skip past the basic steps and review the best configuration options for each module. I've boiled down years of experience in Drupal, online marketing, monetization, dozens of modules, some tips, and a few tricks into a powerful potion of Drupal SEO goodness.

File name

The first and easiest way to identify what the file is about is to use a descriptive file name. A filename like img0004.jpg does nothing for you. However, president-obama-eats-donut.jpg is descriptive, keyword-filled, and does wonders for the findability of, say, a presidential pastries web site. The file extension (.jpg) also tells the search engines a lot about what that file is and how to display it to users. Make sure your videos have video extensions, your flash files have flash extensions, and so on.

Language

Now this is a truly interesting part of a site's design, and the art of writing for the Web is a lot more subtle than just saying what you mean. The reason for this is that you are no longer writing simply for human consumption, but also for consumption by machines. Because machines can only follow a certain number of rules when interpreting a page, the concessions on the language used must be made by the writers (if they want their sites to feature highly on search engines).

Path Aliases

Every web page that you visit has an address, or URL, that is listed in the address bar of your web browser. If you've paid attention to the URLs in your web browser, then you've probably seen quite a few that look as though they span a paragraph in length. This doesn't affect the visitor's ability to get to the page. After all, clicking on a link is clicking a link, regardless of its size. However, if a person tries to remember the link, or needs to write it down, then it gets ugly. Beyond that, it's nice to have URLs that are actually meaningful for the search engine 'spiders' that crawl the Web and index everything. Not very helpful at all, is it? Fortunately, we can make this link much nicer to look at, and much more meaningful. Drupal comes with a Path module. The term 'path' refers to the portion of the URL that follows the domain name (node/15 in the example above). The Path module allows us to create an alias path. The original path will still exist and be usable (Drupal will...
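As a hypothetical illustration of the before-and-after (example.com and the alias are placeholders), the Path module maps an opaque internal path to a readable one:

```
http://www.example.com/node/15       default Drupal path (what the book's example shows)
http://www.example.com/about-us      human-readable alias created with the Path module
```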

Spotlight Pathauto

In Chapter 2, you learned about the Drupal path and how to use clean URLs. One reason to use clean URLs is so that they don't look so ugly. (To review, clean URLs remove the ?q= from the URL.) That helps, but still leaves the URLs lacking a bit. Having a URL with node/123 in it doesn't really tell either humans or search engines much about the page itself. Isn't it much better to instead have a URL with something like about-us in it? That is going to be much more memorable for a person, and the addition of pertinent keywords in the URL makes for better search engine optimization. So, even without clean URLs, you can still benefit from good pathnames. The second option, making a new alias and keeping the old one, may sound ideal, because you can then access the content from either path and the problem of link rot is eliminated. But, while addressing the issue of link rot, the disadvantage of this option is that some search engines will penalize you for having many paths that point to the same...

Website Activities

Backup, Web Protect, Change Password, Custom Error Pages, Redirects, Mime Types, Apache Handlers, Frontpage Extensions, Search Engine Submit, HotLink Protection. Obviously, there are a lot of toys to play with here, and it is recommended that you spend some time finding out what is available for you to use and how to use it. Knowing what you have available is very important because it means you are better able to plan how you work. For example, the demo site's hosts offer an automated Search Engine Submit facility that allows the new website to be submitted for indexing to all the major search engines, much better than simply waiting around for the site to begin appearing on them. Since it is possible you have an entirely different set of options available on your hosted site, we won't discuss this any further here, but there are still a couple of...

Other Great Tools

Another helpful tool is the Google Toolbar. The Google Toolbar gives you some very helpful tools, like a Google search box and the Google PageRank indicator. Visit any page on the web, and the toolbar will tell you the PageRank of that page. Currently, the Google Toolbar only supports Firefox and Internet Explorer. The following screenshot shows the Google Toolbar for Mozilla Firefox. PageRank is a very important factor that needs to be taken into consideration for the Search Engine Optimization of your site. Google isn't the only search engine that offers some great tools. Yahoo! is one of the few search engines that will provide you with a list of all the links you have coming into your site. Just point your browser at http://siteexplorer.search.yahoo.com and put in your URL. You can even add a badge to your site that tells you how many inbound links you have, as shown in the following screenshot.

Outsourcing Search

Drupal's built-in search engine is great for small sites that don't have much content, but as your content grows and your users expect more relevant search results, you'll start to see signs of stress on your site. (After all, Google, Microsoft, and Yahoo wouldn't be in an all-out slugfest if search were easy.) Consider outsourcing your search results with one of the following modules. Acquia Search: Based on the extremely powerful and complex Apache Lucene and Apache Solr (the same technologies that power the search on http://drupal.org), Acquia Search provides you all the power with none of the complexity. This quick drop-in replacement for Drupal's search engine will provide you with faster searches, more relevant results, and a host of other features. Google Custom Search: This customizable Google search engine can provide all of the power of Google with none of the work. Read more about Google Custom Search at http://www.google.com/cse/, and then add it to your site at http://drupal.org...

Buying Themes

If you do not find a theme that you can use with standard configurations, and if you do not want to build your own (or add code to an existing theme), you may be in the market for a theme. As with everything Drupal, this is an expanding and changing market, so you can get the best feel for it by starting from a search engine to find the latest information.

Search

Your site may have the most thought-provoking and informative articles on the Web, but if people can't find them, your message won't be heard. With just a bit of configuration, you can get a search engine up and running on your site in no time at all. Drupal's core search modules automatically index your site's content types (Articles, Pages, and so on) and your users, making them all available to search queries. You can selectively exclude content types, as well as users, from the search. The Search module also includes a block that can be displayed within any region, as well as a special search form.

Finding themes

Themes are what will make your Drupal site look different from everyone else's Drupal site. Themes can be free or commercial. The best place to start your search for a theme is on the Drupal web site, at http://drupal.org/project/themes. You can also use your favorite search engine to find many themes. If you cannot find something that's quite what you're looking for, then you can pay a web designer to have a theme custom-developed for you. Make certain that the designer has specifically done Drupal work before. Being able to design a slick home page isn't good enough, because, as you have seen, Drupal has functional areas on the page, and you cannot simply take just any web page design and superimpose it on Drupal.

Our plan

If this were a book on PHP development, I would suggest writing some server-side code to collect and analyze search engine data, and then use that information to pre-populate an autocompletion field. But we can't readily get that information, so we will try another approach. We will use a Drupal taxonomy as the source for our suggestions.
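The idea of driving autocompletion from a fixed vocabulary can be sketched in a few lines. This is not the book's code; it is a minimal illustration in which the term names stand in for a Drupal taxonomy:

```python
def suggest(terms, prefix, limit=5):
    """Return up to `limit` vocabulary terms that start with the typed prefix."""
    p = prefix.lower()
    # Case-insensitive prefix match against the taxonomy terms.
    return sorted(t for t in terms if t.lower().startswith(p))[:limit]

vocabulary = ["Search", "Search Engine", "SEO", "Sitemap", "Spam"]
print(suggest(vocabulary, "se"))  # ['SEO', 'Search', 'Search Engine']
```

In Drupal itself, the taxonomy module can expose its terms through an autocomplete path, so the matching happens server-side as the user types; the logic is essentially the prefix filter shown here.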

Configuring Modules

Obviously, the nature of the setup for each module can differ wildly from the next. This is because modules can provide almost any type of functionality you can imagine, ranging from a simple poll to a full search engine. Accordingly, there is a host of different settings associated with each one.

What a keyword is

Keywords are single words that a search engine user types into the search box to try to find what they're looking for. Key phrases are the same as keywords, except that they consist of two or more words. For the sake of simplicity, throughout this book let's use keywords to mean both keywords and key phrases. Millions of random people visit Google every day. When they arrive, they are amorphous: a huddled mass yearning for enlightenment, with nothing more than a blank Google search form to guide them. As each person types keywords into Google and clicks the Search button, this random mass of people becomes extraordinarily organized. Each keyword identifies exactly what that person is looking for and allows Google to show them results that would satisfy their query.

Pathauto

One of the key elements of successful search engine optimization is providing URLs on your site that are meaningful. By default, Drupal 7 out-of-the-box URLs look something like http://localhost/node/1. A search engine has no idea what node 1 means, nor what the content associated with that page may be about, just by looking at the URL. Humans visiting the site may also have a difficult time navigating to pages that are not linked or accessible by a menu, as http://localhost/node/2487 is not very intuitive. Fortunately, we have the Pathauto module, which creates an alias URL for the node being created; the alias URL takes the form of the title of the node, with hyphens used to separate words and all words made lowercase. For example, take the node at http://localhost/node/2487. If that node has a title of Special deals of the month, the URL as generated by Pathauto would be http://localhost/special-deals-month (Pathauto removes common words like the and of from titles when...
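The alias-building rule just described can be sketched as a small function. This is a simplified approximation of Pathauto's behavior, not its actual code; the stop-word list here is an assumption, and the real module's list and patterns are configurable:

```python
import re

# Assumed stop-word list for illustration; Pathauto's own list is configurable.
STOP_WORDS = {"a", "an", "and", "as", "at", "in", "of", "on", "or", "the", "to"}

def make_alias(title):
    """Lowercase the title, strip punctuation, drop stop words, hyphenate."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(w for w in words if w not in STOP_WORDS)

print(make_alias("Special deals of the month"))  # special-deals-month
```

Running the book's example title through this sketch reproduces the alias the text describes: the common words "of" and "the" are dropped, and the rest is lowercased and hyphen-joined.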

Search Module

With the aid of the Search module, Drupal will index all of the content on your site and make it available through keyword searches. As with any search engine, ranking the search results is an important consideration, since a query will eventually return more results than anyone cares to sort through. Ideally, the most relevant results will appear highest on the list.

Google

Instead of sifting through the numerous modules on Drupal.org to find the module you need, or if Drupal's search engine is failing you, you can just Google it. The simple trick is to use the site-restricting operator. For example, if you were searching for a module that allows users to vote, you could type the following into the Google search box: vote site:drupal.org

Faceted Search

Users often resort to a search engine after they become frustrated at the inability to find the information they were searching for. The Faceted Search module lets users drill down through your site's content to narrow down the results and home in on what they want. With this module enabled, users can filter content by multiple sets of taxonomy terms, search phrases, or a combination of the two. This is commonly done by adding a block with a set of top-level links that users start from to drill down into your site's content.
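The drill-down behavior can be illustrated with a toy sketch: each piece of content carries a set of taxonomy terms, and every facet the user selects narrows the result set further. The content titles and terms below are invented for illustration and are not from the module:

```python
def drill_down(items, selected_terms):
    """Keep only items tagged with every currently selected facet term."""
    return [title for title, terms in items.items()
            if selected_terms <= terms]  # set inclusion: all facets present

content = {
    "Installing Drupal": {"drupal", "install"},
    "Theming Drupal":    {"drupal", "themes"},
    "Buying Themes":     {"themes", "commerce"},
}
print(drill_down(content, {"drupal"}))            # two matches
print(drill_down(content, {"drupal", "themes"}))  # ['Theming Drupal']
```

Each click on a facet link adds a term to `selected_terms`, so the visible result list shrinks monotonically as the user drills down, which is exactly the browsing experience the module provides.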

Summary

This chapter focused on creating content, setting the various options that are available when creating a content item, and updating and deleting content. You learned how to place a content item on a menu so users can easily find and view content, and how to create search-engine- and user-friendly URLs. At this point, you have the basic skills and understanding necessary to create a basic Drupal website, but stopping now means that you would miss out on all of the other rich and powerful features that Drupal has to offer. In the chapters that follow, I will describe the processes for creating complex page layouts, rendering lists of content, and controlling who has access to various features and functions on your website, and I will share tips and tricks for managing your new site.

Spam Module

Recent years have seen an explosion in web site spam. Spam is any content posted to a web site that is unwanted or has an ulterior motive beyond being part of the online community. The most common ulterior motive is getting links to third-party web sites published, in pursuit of the higher search engine rankings and elevated page rank that those sites enjoy when links from external sites point to them.

URL Path Settings

You may have noticed while working with the revisions feature that the URL shown in your browser's address bar looked something like http://localhost/node/1, where node in the URL tells us that Drupal is displaying a single piece of content (a node) and 1 represents the unique ID of the node being displayed. In this case, it's the first node that we created in the system, so the ID is 1. That number will increase by 1 for each node we add. Although http://localhost/node/1 gets us to the content that we wanted, the URL is not very people- or search-engine-friendly. Fortunately, Drupal lets us override the URL with something that is. After entering the new URL alias, click the Save button at the bottom of the page. Drupal will redisplay the page using the new alias URL that you created on the previous page. In my example, the new URL is http://localhost/my-first-content-item. The new URL is easy for a human to understand and, more important, easy for a search engine to pick up...

Word Press Releases

The WordPress developer team grew, and more releases followed over the next few months. January 2004 saw the leap to version 1.0, which heralded a massive increase in functionality: search-engine-friendly permalinks, multiple categories, the much-touted quick installation, comment moderation, yet more improvements to the administration interface, and so on. By this time, the number of users was quite considerable. The forums were well established and quite busy.
