1

I'm dealing with a hosting team that is fairly skittish about managing many rewrite rules. What are your experiences with the number of rules your sites currently manage?

I can see dozens (if not more) coming up as the site grows and contracts, and I need to set expectations that this isn't out of the norm.

Thanks

Dave M
  • 4,514

4 Answers

2

Are they concerned about the logistics of managing many rules, or the performance?

In the former case, consider a rewrite map (hash map) generated from a database and managed by either your CMS or a CRUD tool if, as in Alex's answer, most of your rules actually come from things like content moves and marketing campaigns. The mappings can be tested by your content people and then migrated into production with little effort from your server team.
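As a rough sketch of how that can look (the file paths and map entries here are made up), the CMS or a small CRUD tool regenerates a plain-text map file from the database, and a single pair of directives serves every entry in it. Note that RewriteMap must be declared in the server or virtual-host config, not in .htaccess:

    # /etc/apache2/maps/redirects.txt -- regenerated from the database by the CMS
    # old path               new target
    /summer-2013-promo       /store/specials
    /press/old-brochure      /about/press

    # httpd.conf or vhost context, mod_rewrite enabled
    RewriteEngine On
    RewriteMap redirects txt:/etc/apache2/maps/redirects.txt

    # One rule covers every entry: redirect only when the map returns a target
    RewriteCond ${redirects:$1} !=""
    RewriteRule ^(/.*)$ ${redirects:$1} [R=301,L]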

If the problem is performance, well, that's a "how long is a piece of string" question, but I've certainly worked on sites with literally hundreds of rewrite rules to support things like content migrations, and there was no measurable impact on server response times.

Rodger
  • 609
1

You could consider using http://httpd.apache.org/docs/2.2/mod/mod_rewrite.html#rewritemap which would allow a single rule to read from a hash file.
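For example (paths are illustrative), the whole rule set collapses to a single rule doing lookups against a pre-built hash file:

    # Server or virtual-host config; RewriteMap is not allowed in .htaccess
    RewriteEngine On
    RewriteMap redirects dbm:/etc/apache2/maps/redirects.map

    # Look the requested path up in the hash file, redirect only if an entry exists
    RewriteCond ${redirects:$1} !=""
    RewriteRule ^(/.*)$ ${redirects:$1} [R=301,L]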

1

We recently had some practical experience with this. See my answer here: https://stackoverflow.com/questions/1364673/apache-redirects-rewrite-maximum/18120886#18120886

I had the very same question recently. As I found no practical answer, we implemented an .htaccess with 6 rules, 3 of which had 200,000 conditions each.
That means an .htaccess file roughly 150 MB in size. It was actually fine for half a day, while no one was using this particular website, even though page load times were in the seconds. The next day, however, our whole server got hammered, with load averages well above 400. (The machine is 8 cores, 16 GB RAM, SAS RAID 5, so resources are not usually a problem.)

If you need to implement anything like this, I suggest designing your rules so they don't need conditions and putting the mappings in a dbm rewrite map instead. That easily solved the performance issues for us.

http://httpd.apache.org/docs/current/rewrite/rewritemap.html#dbm
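A minimal sketch of that setup, with made-up file names: keep the mappings in a plain-text source file, compile it to a dbm hash with Apache's httxt2dbm utility whenever it changes, and reference it from a single rule instead of thousands of RewriteCond lines:

    # Compile the text map (one "old-path new-target" pair per line) into a dbm hash
    httxt2dbm -i /etc/apache2/maps/redirects.txt -o /etc/apache2/maps/redirects.map

    # Server or virtual-host config -- replaces the giant .htaccess
    RewriteEngine On
    RewriteMap redirects dbm:/etc/apache2/maps/redirects.map
    RewriteCond ${redirects:$1} !=""
    RewriteRule ^(/.*)$ ${redirects:$1} [R=301,L]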

  • Welcome to Server Fault! Generally we like answers on the site to be able to stand on their own - Links are great, but if that link ever breaks the answer should have enough information to still be helpful. Please consider editing your answer to include more detail. See the FAQ for more info. – slm Aug 08 '13 at 08:33
  • @slm: That is a link to an answer on another StackExchange site. If that link ever breaks this site is probably going with it. – larsks Aug 08 '13 at 10:33
  • @larsks - true but how hard would it have been to add some context to the link? – slm Aug 08 '13 at 11:12
  • Or better yet, copy and paste. – Michael Hampton Aug 08 '13 at 12:54
  • sorry about it. would have probably quoted if it was an external site... – Gergely Zsamboki Aug 12 '13 at 11:27
0

Obviously everyone is going to have a significantly different number of rules to manage depending on their individual situation. I would guess that dozens of rules are not uncommon. We typically use rewrites to deal with things like content moves and technology changes. Our marketing department is constantly coming at us with search engine optimization requests for things like expired content, which we typically handle with a RewriteRule. We also end up handling things like marketing campaigns that get printed with a URL that doesn't exist, so we make it valid by adding a RewriteRule.
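For example, a couple of one-off rules of the kind described above might look like this (the paths are invented for illustration):

    # Printed campaign URL that never existed on the site
    RewriteRule ^/summer-sale$ /store/specials [R=302,L]

    # Expired product page pointed at its replacement
    RewriteRule ^/products/old-widget$ /products/new-widget [R=301,L]

(That's server or virtual-host context; in a .htaccess file the patterns would lose the leading slash.)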

For comparison, at my organization we have 140 RewriteRules spanning 19 subdomains in our production environment.

Alex
  • 6,623