httpd-users mailing list archives

From Josh Trutwin <j...@trutwins.homeip.net>
Subject Re: [users@httpd] avoiding a redirect loop with rewrites
Date Fri, 23 Feb 2007 02:52:13 GMT
On Thu, 22 Feb 2007 21:36:03 +0000
matt farey <matt.farey@gmail.com> wrote:

> RewriteEngine On
> RewriteCond %{REQUEST_FILENAME} pid5\.html$
> RewriteCond %{QUERY_STRING} !stop=yes
> RewriteRule . /products.html [R=301,L]
> RewriteRule ^products\.html$ /pages/pid5.html?stop=yes [L,QSA]

Thanks Matt - I'll give that a try and see if it works.  I
assume this would still work if I wanted to be as specific as
possible:

RewriteCond %{REQUEST_FILENAME} pages/pid5\.html$
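
Putting the pieces together, the whole ruleset would look something like
this (a sketch only - it assumes the rules live in the site's .htaccess
file and that pid5.html/products.html are the real page names):

```apache
RewriteEngine On

# External 301 from the old URL to the new one, but only when the
# stop=yes marker is absent - that marker is what breaks the loop.
RewriteCond %{REQUEST_FILENAME} pages/pid5\.html$
RewriteCond %{QUERY_STRING} !stop=yes
RewriteRule . /products.html [R=301,L]

# Internal rewrite of the new URL back to the page that actually
# exists, appending stop=yes so the condition above won't fire again.
RewriteRule ^products\.html$ /pages/pid5.html?stop=yes [L,QSA]
```

A browser asking for /pages/pid5.html gets a 301 to /products.html; the
follow-up request for /products.html is served internally from
/pages/pid5.html?stop=yes, and the stop=yes guard keeps the first rule
from redirecting it again.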

> I'll be embarrassed if it works, the easier way would be to let
> your CMS handle this internally using PHP perhaps, then the rewrite
> rules can be simple, and the CMS ensures the right url in all the
> links.

The CMS is written in PHP and I am trying to catch all instances of
the old URL where I can and replace them with the new URL (the rules
are also stored in a database for quick lookup).  So *internally* all
the pidXX.html references should be taken care of.

> Where does the bookmark come from, chase down all the places where
> they can see that link, and force it to be the new url, all seems a
> bit backward. Next time tell your boss, "look you employed me to do
> this job, so trust me to do it" sounds like a micro manager!

Well - technically not a manager - a reseller I think.  Here's
verbatim what they told me (old pages = pages/pidXX.html):

"If we are changing url paths, we will need to setup 301's in
their .htaccess files from the old pages to the 'new' pages. Thats
the only way to insure the rankings wont drop.  ...  You want to
avoid systems that create two separate instances of a page that have
identical content but are addressable from different URLs. Duplicate
content risks the appearance of "spamming" the search index."
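
For what it's worth, the straightforward per-page 301s the reseller is
asking for would look like this (hypothetical target URLs - example.com
stands in for the real host):

```apache
# One permanent redirect per old page, via mod_alias.
Redirect 301 /pages/pid5.html http://example.com/products.html
Redirect 301 /pages/pid6.html http://example.com/contact.html
```

The catch, and the reason for this thread, is that a bare Redirect like
this loops as soon as the new URL is itself rewritten back to the old
page internally - which is exactly what the stop=yes query-string guard
in the mod_rewrite rules above is there to prevent.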

The big concern, I think, is search engines finding two URLs for the
same page and somehow punishing the site for it.  They might index
pages/pid5.html as the products page and then crawl the site and
discover that products.html is the same page.

I don't know enough about SEO to say much about this.

Thanks,

Josh

---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org

