A 301 redirect is the best method for preserving your current search engine rankings when redirecting web pages or an entire web site. The code “301” is interpreted as “moved permanently”.
After the code, the URL of the missing or renamed page is noted, followed by a space, then the new location or file name. You implement the 301 redirect by creating a .htaccess file.
When a visitor or spider requests a web page, your web server checks for a .htaccess file. The .htaccess file contains specific instructions for certain requests, including security, redirection, and how to handle certain errors.
1. To create a .htaccess file, open Notepad, then name and save the file as .htaccess (the entire file name is .htaccess; there is no extension).
2. If you already have a .htaccess file on your server, download it to your desktop for editing.
3. Place this code in your .htaccess file:
redirect 301 /old/old.htm https://www.you.com/new.htm
4. If the .htaccess file already has lines of code in it, skip a line, then add the above code (see the sketch after this list).
5. Save the .htaccess file.
6. Upload this file to the root folder of your server.
7. Test it by typing in the old address to the page you’ve changed. You should be immediately taken to the new location.
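For instance, a .htaccess file that already holds a few redirects might look like the sketch below; the old file names here are hypothetical, and each redirect sits on its own line:
# hypothetical sketch: one redirect per line
redirect 301 /old/old.htm https://www.you.com/new.htm
redirect 301 /archive/2004.htm https://www.you.com/archive.htm
redirect 301 /contact-us.htm https://www.you.com/contact.htm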
301 non-www to www
To 301 redirect the non-www version of your site to the www version, add the following to your .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^seobook\.com [NC]
RewriteRule ^(.*)$ https://www.seobook.com/$1 [L,R=301]
The ‘(.*)$’ says that we’ll take anything that comes after https://seobook.com, append it to the end of ‘https://www.seobook.com’ (that’s the ‘$1’ part), and redirect to that URL. For more grit on how this works, check out a good regular expressions resource or two.
Note: You only have to enter ‘RewriteEngine On’ once at the top of your .htaccess file.
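Putting the pieces together, a .htaccess file that forces the www host name and also 301s a renamed page might look like this sketch (the renamed-page rule is a hypothetical example):
RewriteEngine On
# force the www version of the host name
RewriteCond %{HTTP_HOST} ^seobook\.com [NC]
RewriteRule ^(.*)$ https://www.seobook.com/$1 [L,R=301]
# hypothetical example: a renamed page
RewriteRule ^old/old\.htm$ /new.htm [L,R=301]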
Alternately, you may choose to do this 301 redirect in the Apache config file, httpd.conf (the VirtualHost tags below use *:80 as a generic address; yours may name a specific IP):
<VirtualHost *:80>
ServerName www.seobook.com
ServerAdmin [email protected]
DocumentRoot /home/seobook/public_html
</VirtualHost>

<VirtualHost *:80>
ServerName seobook.com
RedirectMatch permanent ^/(.*) https://www.seobook.com/$1
</VirtualHost>
Note that web host control panels such as cPanel often place a ‘ServerAlias seobook.com’ line in the first VirtualHost entry, which would negate the following VirtualHost, so be sure to remove the non-www ServerAlias.
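In other words, if your first VirtualHost entry looks like the sketch below, delete the ServerAlias line before adding the non-www VirtualHost:
<VirtualHost *:80>
ServerName www.seobook.com
# remove this ServerAlias line so the non-www VirtualHost can take effect
ServerAlias seobook.com
DocumentRoot /home/seobook/public_html
</VirtualHost>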
301 www to non-www
Finally, the www 301 redirect to the non-www version would look like:
RewriteCond %{HTTP_HOST} ^www\.seobook\.com [NC]
RewriteRule ^(.*)$ https://seobook.com/$1 [L,R=301]
Redirect All Files in a Folder to One File
Let’s say you no longer carry ‘Super Hot Product’ and hence want to redirect all requests for the folder /superhotproduct to a single page called /new-hot-stuff.php. This redirect can be accomplished easily by adding the following to your .htaccess file:
RewriteRule ^superhotproduct(.*)$ /new-hot-stuff.php [L,R=301]
But what if you want to do the same as the above example EXCEPT for one file? In the next example, all files from the /superhotproduct/ folder will redirect to the /new-hot-stuff.php file EXCEPT /superhotproduct/tony.html, which will redirect to /imakemoney.html. Order matters here: the more specific rule must come first, and its [L] flag stops processing before the catch-all rule can fire.
RewriteRule ^superhotproduct/tony\.html /imakemoney.html [L,R=301]
RewriteRule ^superhotproduct(.*)$ /new-hot-stuff.php [L,R=301]
Redirect a Dynamic URL to a New Single File
It’s common that one will need to redirect dynamic URLs with parameters to a single static file. Note that a RewriteRule pattern never sees the query string, so the parameter has to be matched with a RewriteCond, and the trailing ‘?’ on the target strips the old query string from the redirect:
RewriteCond %{QUERY_STRING} ^id=(.*)$
RewriteRule ^article\.jsp$ /latestnews.htm? [L,R=301]
In the above example, a request to a dynamic URL such as https://www.seobook.com/article.jsp?id=8932
will be redirected to https://www.seobook.com/latestnews.htm
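If you would rather carry the parameter over into the new URL, a %1 back-reference to the RewriteCond capture handles it. A minimal sketch, assuming numeric ids and a hypothetical /articles/ folder for the new pages:
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^article\.jsp$ /articles/%1.htm? [L,R=301]
With this rule, a request for article.jsp?id=8932 would 301 to /articles/8932.htm.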
SSL https to http
This one is more difficult, but I have experienced serious canonicalization problems when the secure https version of my site was fully indexed alongside my http version. I have yet to find a way to redirect https for the bots only, so the only solution I have for now is to attempt to tell the bots not to index the https version. There are only two ways I know to do this, and neither is pretty.
1. Create the following PHP file and include it at the top of each page; it emits a robots noindex meta tag only when the page is served over SSL:
<?php
if (isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on') {
    echo '<meta name="robots" content="noindex,nofollow">' . "\n";
}
?>
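Then include it near the top of the <head> section of every page template; the file name noindex-ssl.php here is hypothetical:
<?php include 'noindex-ssl.php'; ?>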
2. Cloak your robots.txt file.
If the request comes over https and happens to be from one of the known bots, such as googlebot, you will display:
User-agent: *
Disallow: /
Otherwise, display your normal robots.txt. To do this you’ll need to alter your .htaccess file to treat .txt files as PHP or some other dynamic language, and then write the cloaking code.
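A minimal sketch of that cloaking setup, assuming your server lets .htaccess hand .txt files to the PHP handler (the handler name varies by host, so check yours):
# .htaccess: run robots.txt through PHP
<Files "robots.txt">
SetHandler application/x-httpd-php
</Files>
The robots.txt file itself then becomes a small PHP script; the bot list here is illustrative, not exhaustive:
<?php
header('Content-Type: text/plain');
$ua  = isset($_SERVER['HTTP_USER_AGENT']) ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';
$ssl = isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on';
$is_bot = strpos($ua, 'googlebot') !== false
       || strpos($ua, 'slurp') !== false
       || strpos($ua, 'msnbot') !== false;
if ($ssl && $is_bot) {
    // keep bots out of the https version entirely
    echo "User-agent: *\nDisallow: /\n";
} else {
    // your normal robots.txt rules go here
    echo "User-agent: *\nDisallow:\n";
}
?>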
I really wish the search engines would get together and add a new attribute to robots.txt that would allow us to stop them from indexing https URLs.