
If an ASP.NET website is designed to be accessed by many users (e.g. 10,000) simultaneously, what techniques, methods, designs, or practices can be implemented?

Can you please share the names of the designs/practices? I just need the names of the techniques, so that I can continue to do further research on Google.

Thanks.

mjb
  • Well, you can start with horizontal scaling (get a lot of servers) and vertical scaling (get more powerful servers) – Superzadeh Oct 01 '13 at 08:00
  • Where is the requirement for 10,000 concurrent users coming from? – Davin Tryon Oct 01 '13 at 08:02
  • Look at http://stackoverflow.com/questions/3312569/what-are-the-ways-to-improve-asp-net-websites-performance – vborutenko Oct 01 '13 at 08:03
  • Are you getting issues with a web application, or is this just prior research before building? There isn't a silver bullet approach, it totally depends on resources, available technology, what the application is designed for etc. – Christian Phillips Oct 01 '13 at 08:10
  • @cahmadzadeh, it is perfectly possible to make an ASP.Net web site that can be accessed by 10,000 concurrent users on a basic PC. It all depends on what you need the web site to do. It concerns me when people jump immediately to hardware as a performance solution, especially on this web site. – Jodrell Oct 01 '13 at 08:14
  • @Jodrell of course, hardware is only one of the many solutions (do not hesitate to contribute). Still, running a website accessed by 10k concurrent users on a basic PC is not recommended ... – Superzadeh Oct 01 '13 at 08:17
  • I can't decide whether this question is too broad, primarily opinion-based (since different perspectives will offer partisan answers), or a duplicate of an existing question. – Jodrell Oct 01 '13 at 08:21
  • Interesting article: http://highscalability.com/plentyoffish-architecture – Jodrell Oct 01 '13 at 08:24
  • @Jodrell, I think it's definitely too broad. We don't even know if this is data-driven, or just static HTML. – Christian Phillips Oct 01 '13 at 08:49

2 Answers


This is a HUGE topic - and as the comments say, there's no magic bullet.

I'll separate the response into two sections: architecture and process.

From an architecture point of view, there are a number of practices. Firstly, there is horizontal scaling - i.e. you add more servers, typically managed by a load balancer. This is a relatively cheap hardware solution, but requires you to know where your bottleneck is. The easiest horizontal scalability trick is adding more web servers; scaling database servers horizontally typically requires significant complexity, using techniques such as sharding. Horizontal scaling can improve your resilience as well as performance.
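To make sharding concrete, here is a minimal sketch in C# of routing each user to a fixed database shard by hashing their id; the connection strings and shard count are purely illustrative:

    using System;

    public static class ShardRouter
    {
        // Illustrative shard list; in practice this would come from configuration.
        private static readonly string[] ShardConnectionStrings =
        {
            "Server=db-shard-0;Database=App;Integrated Security=true",
            "Server=db-shard-1;Database=App;Integrated Security=true",
            "Server=db-shard-2;Database=App;Integrated Security=true"
        };

        // Map a user id deterministically onto a shard, so the same user's
        // data always lives on the same database server.
        public static string GetConnectionString(int userId)
        {
            int shard = Math.Abs(userId % ShardConnectionStrings.Length);
            return ShardConnectionStrings[shard];
        }
    }

Note the complexity trade-off: once data is split like this, cross-shard queries and rebalancing become significantly harder.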

Vertical scalability basically means upgrading the hardware - more RAM, more CPU, SSD disks, etc. This is often the cheapest solution. It may also mean separating elements of the solution - e.g. separating the web server from the database server.

The next architectural solution is caching - this is a huge topic in its own right. Adding a CDN is a good first step; many CDN providers also offer "application accelerator" options, which effectively act as a reverse caching proxy (much like @Aviatrix recommends). Adding your own reverse caching proxy is often a solution for some weirdness in your own environment, or for offloading static file serving from your ASP.Net servers.

Of course, ASP.Net offers lots of caching options within the framework - make sure you read up on those and understand them; they give huge bang for buck. Also make sure you run your solution through a tool like YSlow to make sure you're setting the appropriate HTTP cache headers.
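For example, here is a minimal sketch of ASP.NET MVC's built-in output caching (the controller and durations are illustrative); the Location setting also emits the corresponding HTTP cache headers for the client:

    using System.Web.Mvc;
    using System.Web.UI;

    public class ProductController : Controller
    {
        // Cache the rendered page for 60 seconds, both on the server and in
        // the browser; VaryByParam keeps a separate cache entry per product id.
        [OutputCache(Duration = 60, VaryByParam = "id",
                     Location = OutputCacheLocation.ServerAndClient)]
        public ActionResult Details(int id)
        {
            var product = LoadProduct(id); // hypothetical data-access helper
            return View(product);
        }

        private object LoadProduct(int id)
        {
            // Stubbed out for the sketch; imagine a database call here.
            return new { Id = id };
        }
    }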

Another architectural solution that may or may not help is invoking external services asynchronously. If your solution depends on an external web service, calling that service synchronously basically limits your site to the capacity and resilience of the external system. For high-traffic solutions, that's not a good idea.
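As a sketch of what that looks like in ASP.NET MVC (the endpoint URL is a placeholder), async/await releases the request thread while the external call is in flight, so the thread pool can serve other users in the meantime:

    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Mvc;

    public class QuoteController : Controller
    {
        // HttpClient is intended to be shared and reused.
        private static readonly HttpClient Client = new HttpClient();

        public async Task<ActionResult> Index()
        {
            // The thread returns to the pool while we await the external
            // service instead of blocking on it.
            string json = await Client.GetStringAsync("https://api.example.com/quotes");
            return Content(json, "application/json");
        }
    }

In a real system you would also add timeouts and a fallback, so the external service's outages don't become your outages.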

For very high scalability, many web sites use NoSQL for persistence - this is another huge topic, and there are many complex trade-offs.
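As one illustration, here is a minimal sketch using the modern MongoDB .NET driver; the database, collection, and document type are all hypothetical:

    using System;
    using System.Threading.Tasks;
    using MongoDB.Driver;

    public class PageView
    {
        public string Id { get; set; }
        public string Url { get; set; }
        public DateTime Timestamp { get; set; }
    }

    public class PageViewStore
    {
        private readonly IMongoCollection<PageView> _collection;

        public PageViewStore()
        {
            // Connection details are illustrative.
            var client = new MongoClient("mongodb://localhost:27017");
            _collection = client.GetDatabase("analytics")
                                .GetCollection<PageView>("pageviews");
        }

        // Writes like this can be spread across many nodes if the collection
        // is sharded, which is where the scalability comes from.
        public Task RecordAsync(PageView view) => _collection.InsertOneAsync(view);
    }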

From a process point of view, if scalability is a primary concern, you need to bake it into your development process. This means conducting regular performance and scalability assessments throughout the project, and building a measurement framework so you can decide which optimizations to pursue.

You need to be able to load test your solution - but load testing at production levels of traffic is usually commercially unrealistic, so you need to find an alternative solution - I regularly use JMeter with representative infrastructure. You also need to be able to find your bottlenecks under load - this may require instrumenting your code, and using a profiler (RedGate do a great one).
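Instrumentation can start very simply; here is a minimal sketch of a timing wrapper (the names are illustrative, and Console stands in for whatever logging framework you use):

    using System;
    using System.Diagnostics;

    public static class Instrumentation
    {
        // Time an operation and log how long it took, so hot spots show up
        // when you replay load against the system.
        public static T Measure<T>(string operationName, Func<T> operation)
        {
            var stopwatch = Stopwatch.StartNew();
            try
            {
                return operation();
            }
            finally
            {
                stopwatch.Stop();
                Console.WriteLine("{0} took {1} ms", operationName,
                                  stopwatch.ElapsedMilliseconds);
            }
        }
    }

    // Usage (hypothetical repository):
    // var orders = Instrumentation.Measure("LoadOrders", () => repository.GetOrders());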

Most important is to have a process for evaluating trade-offs - nearly every performance/scalability improvement comes at the expense of some other thing you care about. Load balancers cost money; reverse caching proxy solutions increase complexity; NoSQL requires new skills from your development team; "clever" coding practices often reduce maintainability. I recommend establishing your required baseline, building a measurement framework to evaluate your solution against that baseline, and profiling to identify the bottleneck. Each solution to improve scalability must address the current bottleneck, and I recommend a proof-of-concept stage to make sure the solution really does have the expected impact.

Finally, 10,000 concurrent users isn't a particularly large number for most web applications on modern hardware.

Neville Kuyt

Here's my 2c as someone who is currently building a scalable system with an ASP.NET backend:

  1. Use NGINX as a reverse proxy and for caching. Odds are your users will mostly request the same data, which can be cached; use that.
  2. Use proper HTTP caching headers and cache as much as you can, both on the server and on the client (see the sketch after this list). Be careful, though: this can introduce a delay between when you update something and when the user sees it.
  3. Have servers with a lot of RAM and SSDs; the SSDs really do help a lot!
  4. Use NGINX or something else as a load balancer to spread the load between servers.
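As a sketch of point 2 on the ASP.NET side (the controller, file path, and one-hour lifetime are illustrative), these calls emit the Cache-Control and Expires headers that both browsers and a reverse proxy like NGINX respect:

    using System;
    using System.Web;
    using System.Web.Mvc;

    public class AssetController : Controller
    {
        public ActionResult Logo()
        {
            // Mark the response as cacheable by anyone (browser, proxy, CDN)
            // for one hour.
            Response.Cache.SetCacheability(HttpCacheability.Public);
            Response.Cache.SetMaxAge(TimeSpan.FromHours(1));
            Response.Cache.SetExpires(DateTime.UtcNow.AddHours(1));
            return File(Server.MapPath("~/Content/logo.png"), "image/png");
        }
    }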
Nikola Sivkov