Optimizing Queries in the ASP.NET Membership Tables

With the ASP.NET Membership provider, you can find a user by user name or get profile information by the user's ID. You can also change a user's email address by locating the user by user name. The following example shows some of these queries.

Some common queries in ASP.NET Membership tables

SELECT * FROM aspnet_Users WHERE UserName = 'john@hotmail.com'
SELECT * FROM aspnet_Profile WHERE UserId = '......'
UPDATE aspnet_Membership
SET Email = 'newemailaddress@somewhere.com'
WHERE Email = '...'

But when you have a giant database on your production server, running any of these queries can bring the server down. Although they look obvious, and you might need to run them frequently, these queries don't use an index, which results in a table scan over millions of rows in the respective tables, the worst-case scenario for any query.
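
If you want to confirm this yourself, a quick sketch like the following (run in SQL Server Management Studio, preferably against a copy of the database) shows the I/O cost and the scan in the execution plan; the user name is just a sample value:

-- Show logical reads and timings for one of the queries above
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT * FROM aspnet_Users WHERE UserName = 'john@hotmail.com';

-- The Messages tab reports the logical reads, and the execution plan shows a
-- clustered index scan (effectively a table scan), because UserName alone does
-- not match any of the default indexes.
SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;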


Problem: Generating reports slowed the server and increased CPU usage.

Solution: Put the ApplicationID in the WHERE clause.

In a previous portal project that I worked on, we used fields such as UserName, Email, UserId, and IsAnonymous in many marketing reports, which were used only by the marketing team. Now, the site ran fine, but we would get calls several times a day from marketing and customer support telling us that the site was slow, users were reporting extremely slow performance, and some pages were timing out. Usually when they called, we would check SQL Profiler for a long-running query. But we couldn't find any problem in the profiler, and the CPU load was within parameters. The site ran nice and smooth while we investigated the problem. So, why was the site really slow several times during the day but not while we were investigating the problem?

The marketing team generated reports several times every day, which meant running queries that worked on a large number of rows. Those queries made the server's disk I/O and CPU spike, as you can see in the figure below.

The CPU sees a spike when an index is not used on tables with many rows

We had SAS drives that spun at 15,000 RPM, very expensive and very fast, and the CPU was a dual-core, dual-Xeon 64-bit setup. Still, those queries brought us down because of the huge database, which was about 70 GB at that time. However, the problem occurred only when the marketing team tried to generate any of the reports.

Let's look at the indexes and see whether our queries really match any index on the ASP.NET Membership tables. The table below shows the default indexes available on the ASP.NET Membership tables.

Default indexes available on the ASP.NET Membership tables
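
If the table isn't handy, you can list the index definitions straight from SQL Server; this is just a quick way to see them, using the standard sp_helpindex procedure:

-- List the indexes that the ASP.NET Membership provider creates by default
EXEC sp_helpindex 'aspnet_Users';
EXEC sp_helpindex 'aspnet_Membership';
EXEC sp_helpindex 'aspnet_Profile';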

Most of the indexes include ApplicationId. This means that unless you put ApplicationId = '...' in the WHERE clause, the query is not going to use any of the indexes. As a result, all the queries were suffering from table scans. So, if I put ApplicationId in the queries, they should become blazingly fast, but in reality they didn't. This was because the Email and UserName fields were not part of the indexes; instead, LoweredUserName and LoweredEmail were the fields used in conjunction with ApplicationId in those indexes.

ApplicationId is used in these tables to support multiple applications running on the same database tables. ASP.NET allows you to run several applications on the same database and on the same aspnet_Users tables. If you are running only one application in a single database, then you don't need the ApplicationId at all. But because it is part of the index, you need to put the ApplicationId in the WHERE clause.
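
Putting the two points together, here is a sketch of the earlier queries rewritten to match the default indexes. The application name 'myapp' is a placeholder; look your own name up once in the standard aspnet_Applications table and reuse the ApplicationId:

-- Look up the ApplicationId once (aspnet_Applications stores a lowered name column)
DECLARE @ApplicationId uniqueidentifier;
SELECT @ApplicationId = ApplicationId
FROM aspnet_Applications
WHERE LoweredApplicationName = 'myapp';   -- placeholder application name

-- Now the queries can seek on the (ApplicationId, LoweredUserName/LoweredEmail) indexes
SELECT * FROM aspnet_Users
WHERE ApplicationId = @ApplicationId
  AND LoweredUserName = LOWER('john@hotmail.com');

UPDATE aspnet_Membership
SET Email = 'newemailaddress@somewhere.com'
WHERE ApplicationId = @ApplicationId
  AND LoweredEmail = LOWER('...');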

Our admin site had many such reports, and each had lots of queries on the aspnet_Users, aspnet_Membership, and aspnet_Profile tables. As a result, whenever the marketing team tried to generate reports, they took all the power from the CPU and HDD, and the rest of the site became very slow and sometimes nonresponsive.

However, the solution was not just using the right index. Sometimes the marketing reports ran queries that required a table scan no matter how many indexes were on a table. Imagine a query that makes SQL Server go through thousands of index entries and look up rows one by one. SQL Server would be better off doing a clustered index scan. The following example shows such a query, where a large number of rows can be returned.

A query that runs through a lot of rows

SELECT * FROM aspnet_Users WHERE LastActivityDate < '1/1/2007'

If you have 10,000 users in your aspnet_Users table who were active before 2007, then even if you put a nonclustered index on LastActivityDate, SQL Server won't use the index when there are only 20,000 rows in the table, because the query matches half of them. In fact, it is better not to use the index, because SQL Server would then go through 10,000 index entries and look up the table row for each entry, which means 10,000 row lookups from the table. In this scenario, it's better to just do a table scan, which means you can never run such queries on a production server. You must move the data to a staging server and then run these queries there.
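
One low-effort way to do that, sketched below under the assumption that you already take full backups, is to restore the latest production backup on a staging box and point the reports there; the database, file, and path names are placeholders:

-- On the staging server: restore last night's production backup
RESTORE DATABASE PortalStaging
FROM DISK = 'D:\Backups\Portal_full.bak'                       -- placeholder backup path
WITH MOVE 'Portal_Data' TO 'D:\Data\PortalStaging.mdf',        -- placeholder logical/physical file names
     MOVE 'Portal_Log'  TO 'D:\Data\PortalStaging_log.ldf',
     REPLACE;

-- The expensive report query now does its table scan here, not on production
SELECT * FROM PortalStaging.dbo.aspnet_Users
WHERE LastActivityDate < '1/1/2007';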

