Steps for profiling the performance of a website running on Azure

I am quite new to running websites in general. I am familiar with statistical profilers for desktop applications, but I am unsure how to even begin profiling a website, since there are many additional potential bottlenecks and I am not sure what profilers are available for websites.

I have looked around and seen useful suggestions in other questions, but I am not sure they amount to a complete solution. The main suggestions are Azure performance counters and the advice from this answer.

Summarizing, they are: use Firebug to determine rendering time and loading time separately, so one can tell whether one has a rendering issue or a server issue.

If server side: test a small static page, such as a page with a single GIF. If that is slow, one has a CPU issue; otherwise one is probably I/O bound or has database performance problems.

One can use performance counters to check server aspects such as: memory and garbage collection, TCP/IP issues, bytes sent/received, requests received/queued/rejected, and request wait time and processing time.
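
(For what it's worth, I assume those counters can also be read directly from code, roughly as in the sketch below; the category and counter names here are just illustrative examples I have seen, not a recommended set.)

    using System;
    using System.Diagnostics;
    using System.Threading;

    class CounterSample
    {
        static void Main()
        {
            // Illustrative counters only; the exact categories/instances vary by machine.
            var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
            var gcTime = new PerformanceCounter(".NET CLR Memory", "% Time in GC", "_Global_");

            // Rate-based counters return 0 on the first read, so sample twice.
            cpu.NextValue();
            Thread.Sleep(1000);

            Console.WriteLine("CPU: {0:0.0}%", cpu.NextValue());
            Console.WriteLine("Time in GC: {0:0.0}%", gcTime.NextValue());
        }
    }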

From my naive standpoint, some things that seem to be missing from this list are the sort of profiling one has for a traditional desktop application, i.e. what the stack looked like for what percentage of the time (in other words, which functions we were spending time in, and in what context). Another missing item is profiling database performance, which may behave differently on Azure than in a local environment, especially once one starts dealing with scaling. Another is time spent on requests to third-party services, though maybe that can be done with Azure performance counters(?).

I apologize for the naive nature of this question. What tools and aspects am I missing here to profile an Azure ASP.NET MVC website, and what changes would you make to the above list?


There are a lot of aspects to profiling a site: database calls, business logic, rendering a view, and even client-side performance (any jQuery that might run, for example).

Stack Overflow's MiniProfiler is one of the easiest things to get going: just install a NuGet package, add some JavaScript includes, and wrap whatever you want to test inside a using() block, and you'll see execution times (including LINQ to SQL and Entity Framework). You can even create steps if you want finer-grained timings of individual calls, as in the sketch below.
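
For illustration, a minimal sketch of a controller action instrumented with MiniProfiler steps might look roughly like this (using the StackExchange.Profiling package; the step names and the data-access call are made up for the example):

    using StackExchange.Profiling;
    using System.Web.Mvc;

    public class ProductsController : Controller
    {
        public ActionResult Index()
        {
            var profiler = MiniProfiler.Current; // no-op when profiling is disabled

            using (profiler.Step("Load products"))
            {
                using (profiler.Step("Query database"))
                {
                    // e.g. _repository.GetProducts() -- hypothetical data-access call
                }
                using (profiler.Step("Build view model"))
                {
                    // e.g. map entities to the view model
                }
            }

            return View();
        }
    }

The JavaScript includes go in the layout (via MiniProfiler's RenderIncludes helper, or whatever the version you install provides), which is what draws the timing widget on each page.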

The nice thing about MiniProfiler is that you can enable or disable it based on the environment, which makes it suitable for running inside Azure (as opposed to, say, the Visual Studio Profiler).
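
One way to do that (a sketch assuming a Global.asax-style setup and the older MiniProfiler 2.x/3.x static Start/Stop API; the "EnableMiniProfiler" app setting name is just illustrative) is to start the profiler only for requests where it's wanted:

    using System;
    using System.Configuration;
    using System.Web;
    using StackExchange.Profiling;

    public class MvcApplication : HttpApplication
    {
        protected void Application_BeginRequest()
        {
            // Profile local requests, or any environment where an app setting
            // (deployed with the Azure service configuration) turns it on.
            bool enabled = string.Equals(
                ConfigurationManager.AppSettings["EnableMiniProfiler"], "true",
                StringComparison.OrdinalIgnoreCase);

            if (Request.IsLocal || enabled)
            {
                MiniProfiler.Start();
            }
        }

        protected void Application_EndRequest()
        {
            MiniProfiler.Stop();
        }
    }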

You can also look at Azure performance counters, which will give you an idea of system resource usage, but that isn't profiling in the sense that MiniProfiler is. It will, however, give you an idea of network latency and CPU and memory utilization.
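
If you are running a classic cloud-service web role, a rough sketch of collecting one such counter through the older DiagnosticMonitor API looks like this (newer SDKs configure this via the diagnostics configuration file or the portal instead; the sample rate and transfer period below are arbitrary):

    using System;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Sample CPU utilization every 30 seconds and push the samples
            // to table storage (WADPerformanceCountersTable) every 5 minutes.
            var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

            config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration
            {
                CounterSpecifier = @"\Processor(_Total)\% Processor Time",
                SampleRate = TimeSpan.FromSeconds(30)
            });
            config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

            DiagnosticMonitor.Start(
                "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);

            return base.OnStart();
        }
    }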

Once you're satisfied there, you can use Chrome's Developer Tools to profile your application on the client side. It'll give you an idea of how well your JavaScript is doing, including CSS selectors and rendering.

Also worth noting: the higher editions of Visual Studio include a really good profiler that can give you deep insight into your code, such as time spent in methods and call counts.

Between these four methods, you should be able to find most bottlenecks, especially for a first pass.
