Building the Data Access Layer Using LINQ to SQL

The first step to using LINQ to SQL is to build a DataContext, which contains all entity classes and their data access methods. You will generate one entity class per table, e.g., the Page entity class for the Page table in the database. Visual Studio 2008's ORM designer allows you to easily drag and drop tables onto a designer surface and generate entity classes. The next step will be to create a simple helper class that makes working with LINQ to SQL even easier. Let's start with designing the ORM in Visual Studio 2008.

Generating a Data Model Using the Visual Studio 2008 Designer

Visual Studio 2008 comes with an object relational mapping designer that allows you to create a LINQ to SQL classes file and then drag and drop tables from the Server Explorer onto the designer surface. Visual Studio will then automatically generate classes from those tables. You can further modify the associations, turn lazy loading of properties on or off, add validation, and much more. Nothing special was done to generate the data model shown in the figure (the database model showing the LINQ to SQL classes); all the tables from the database were simply put onto the designer surface.

After you create the data model in the designer and build the project, the data access and entity classes will be ready for use. DashboardData is the data context class that's included in the project.

Manipulating Data with a Database Helper

Example shows the code for a DatabaseHelper that makes working with LINQ to SQL a lot easier.

DatabaseHelper, part 1
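
The code listing itself is not reproduced in this copy, so here is a minimal sketch of what such a helper can look like, based on the description that follows. The connection string name and member names are assumptions for illustration, not the book's exact code:

    using System;
    using System.Configuration;

    public static partial class DatabaseHelper
    {
        // Assumed name of the entry in the <connectionStrings> block of
        // web.config or app.config
        private const string ConnectionStringName = "DashboardConnectionString";

        private static readonly string ConnectionString =
            ConfigurationManager.ConnectionStrings[ConnectionStringName].ConnectionString;

        // Cached ApplicationId of the ASP.NET membership application
        // (see the aspnet_Applications discussion below)
        public static Guid ApplicationId { get; set; }

        // Returns the designer-generated LINQ to SQL DataContext,
        // already configured with the connection string
        public static DashboardData GetDashboardData()
        {
            return new DashboardData(ConnectionString);
        }
    }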


DatabaseHelper also takes care of configuration management and initialization of the DataContext class. It has a GetDashboardData function that returns a reference to a DashboardData instance already configured with the connection string. The Insert, Update, and Delete methods offer shortcuts for performing common database operations.

DatabaseHelper reads the connection string from the <connectionStrings> block in the web.config or app.config file. It also stores the ApplicationId for the ASP.NET membership provider.


Whenever you create a new application using ASP.NET membership, ASP.NET creates an entry in the Aspnet_Applications table. This ApplicationId is also used in the aspnet_users table to identify which application a user belongs to.


Here's a problem with LINQ to SQL: if an entity travels through multiple tiers, it gets detached from the DataContext that created it. As soon as an entity leaves the data access layer for the business or web layer, it is detached, because the DataContext is only available within the data access layer. So, when you try to update the entity using a different DataContext, you first need to attach the entity instance to the new DataContext, then make the changes, and then call SubmitChanges. The problem is that the business layer has no access to the DataContext, which the data access layer creates when the entity object is being updated. The business layer just sends the entity object to the data access component, and the component performs the update by creating a new DataContext.

LINQ to SQL requires that you attach the entity object before making changes to it so that it can track which properties are being changed and determine what kind of UPDATE or INSERT statement to generate. However, a typical business layer makes the object modifications first and then sends the objects to the data access component. So, a traditional attempt like the following example will fail.

Common way of updating objects in the database
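
The original listing is not reproduced here; the following is a minimal sketch of the failing pattern. The Page entity, its ID and Title properties, and the Pages table property are assumptions based on the data model described earlier:

    using System.Linq;

    public static class UpdateFailsSample
    {
        public static void RenamePage(int pageId)
        {
            // Load with one DataContext, then modify while detached...
            Page p = DatabaseHelper.GetDashboardData().Pages
                .First(page => page.ID == pageId);

            p.Title = "New Title";  // no DataContext is tracking this change

            // ...then try to save with a second DataContext
            using (DashboardData db = DatabaseHelper.GetDashboardData())
            {
                db.Pages.Attach(p);  // attached in *unmodified* state
                db.SubmitChanges();  // the change happened before Attach, so no UPDATE is generated
            }
        }
    }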


Proper way of updating objects in the database using LINQ to SQL
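
Again as a hedged sketch with the same assumed entity names: the attach has to happen before the modification, so the same DataContext must be alive for both steps:

    public static class UpdateWorksSample
    {
        public static void RenamePage(Page p)  // p is a detached entity
        {
            using (DashboardData db = DatabaseHelper.GetDashboardData())
            {
                db.Pages.Attach(p);     // start change tracking first
                p.Title = "New Title";  // this modification is now tracked
                db.SubmitChanges();     // generates UPDATE Page SET Title = ...
            }
        }
    }

The catch is that the modification logic belongs to the business layer, yet it must run between Attach and SubmitChanges, inside the data access layer.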

However, this approach is not workable because it would require DashboardData to give up being stateless: you would have to create the DataContext inside one method and then store a reference to it between function calls so the business layer could modify the attached entity. That is fine in a single-user scenario, but it is not an acceptable solution for multiuser web sites. So, I made a workaround (see the following example).

Workaround for stateless data persistence
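
A hedged sketch of the call site: given a detached Page entity p, the business layer hands its modification logic to the data access layer as an anonymous delegate, which runs after the entity has been attached:

    DatabaseHelper.Update<Page>(p, delegate(Page page)
    {
        page.Title = "New Title";  // executed after Attach, so the change is tracked
    });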

Here, the Update<T> method first attaches the page object to the DataContext and then calls the delegate, passing the reference to the attached object. Inside the delegate, you can modify the passed object as if you were modifying the original object. Once the delegate completes, the object is updated by calling DataContext.SubmitChanges().

DashboardData.Update<T> updates an object in the database
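
The original listing isn't reproduced here; a minimal implementation along the lines the text describes looks like this (the caption names it on DashboardData, while the surrounding prose uses DatabaseHelper; this sketch follows the prose and continues the earlier DatabaseHelper sketch):

    using System;
    using System.Data.Linq;

    public static partial class DatabaseHelper
    {
        // Attach first, let the caller's delegate modify the entity,
        // then submit, all against the same DataContext
        public static void Update<T>(T obj, Action<T> detachedUpdater) where T : class
        {
            using (DashboardData db = GetDashboardData())
            {
                db.GetTable<T>().Attach(obj);  // start change tracking
                detachedUpdater(obj);          // caller's modifications, now tracked
                db.SubmitChanges();            // generates the UPDATE statement
            }
        }
    }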

The widget container uses DatabaseHelper to update objects in the database, as shown in the following example.

The widget container uses DatabaseHelper to update objects
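
A hedged sketch of such a call site; the WidgetInstance entity and its Id and Expanded members are assumptions based on the Dropthings schema, not the book's exact code. (Note that in the released version of LINQ to SQL, an entity must be fully detached from the context that loaded it before it can be attached to a new one.)

    using System.Linq;

    public class WidgetContainerSample
    {
        // Hypothetical event handler: persist an expand/collapse state change
        public void SaveExpandState(int widgetInstanceId, bool isExpanded)
        {
            WidgetInstance widgetInstance = DatabaseHelper.GetDashboardData()
                .WidgetInstances.Single(wi => wi.Id == widgetInstanceId);

            DatabaseHelper.Update<WidgetInstance>(widgetInstance, delegate(WidgetInstance wi)
            {
                wi.Expanded = isExpanded;  // runs against the attached entity
            });
        }
    }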

The delegate in Update<T> allows you to stay in the context of the business layer or the caller. So, you can access the UI elements or other functions/properties that you need to update the entity's properties.

Similarly, there’s an UpdateAll<> function that updates a list of objects in the database (see Example).

DashboardData.UpdateAll<T> updates multiple objects
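
A sketch under the same assumptions as Update<T>:

    using System;
    using System.Collections.Generic;
    using System.Data.Linq;

    public static partial class DatabaseHelper
    {
        // Attach and modify each item against one DataContext,
        // then submit everything in a single SubmitChanges call
        public static void UpdateAll<T>(List<T> items, Action<T> detachedUpdater) where T : class
        {
            using (DashboardData db = GetDashboardData())
            {
                Table<T> table = db.GetTable<T>();
                foreach (T item in items)
                {
                    table.Attach(item);
                    detachedUpdater(item);
                }
                db.SubmitChanges();
            }
        }
    }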

For convenience, I have made Insert<> and Delete<> also. But they are not strictly required, because they do not have the "attach first, modify later" requirement (see Example).

Insert<> and Delete<> functions in DashboardData
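
Here is a hedged sketch of Insert<T>; note that the method for queuing an insert is named InsertOnSubmit in the released version of LINQ to SQL (earlier pre-release builds called it Add). The Delete<T> counterpart is sketched after the next paragraph, since it needs a little more explanation:

    using System.Data.Linq;

    public static partial class DatabaseHelper
    {
        // New objects need no attach dance: queue the insert and submit
        public static void Insert<T>(T obj) where T : class
        {
            using (DashboardData db = GetDashboardData())
            {
                db.GetTable<T>().InsertOnSubmit(obj);
                db.SubmitChanges();
            }
        }
    }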

The Delete<> method is a tricky one. First you need to attach the object to the Table, and then call the table's DeleteOnSubmit function (named Remove in pre-release versions of LINQ to SQL). This means you need to get the object first before you can call Delete, which adds a read overhead while deleting an object (see Example).

Delete<T> takes the object to delete and then removes it from the database
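
One hedged way to write it, matching the read-then-delete behavior described above; the selector-delegate signature is an assumption for illustration, not the book's exact code:

    using System;
    using System.Data.Linq;

    public static partial class DatabaseHelper
    {
        public static void Delete<T>(Func<DashboardData, T> selector) where T : class
        {
            using (DashboardData db = GetDashboardData())
            {
                // The read overhead: load the row so this context can track it
                T entity = selector(db);
                db.GetTable<T>().DeleteOnSubmit(entity);
                db.SubmitChanges();  // generates the DELETE statement
            }
        }
    }

    // Usage, e.g.:
    // DatabaseHelper.Delete<Page>(db => db.Pages.Single(p => p.ID == pageId));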

Now that you have learned how to build the data access layer, let's address some of the challenges you'll face while running the portal project in a production environment.

Cleaning Up Inactive Users and Related Data

An Ajax web portal faces a unique challenge in cleaning up unused data generated by anonymous users who never return. Every first visit creates one anonymous user, a page setup, widgets, and so on. If the user doesn't come back, that information remains in the database permanently. The user might come back within a day, a week, or a month, but there's no guarantee. Generally, sticky users (users who return to your site frequently) make up 30 to 50 percent of the total users who come to an Ajax web portal, so you end up with 50 to 70 percent unused data. Dropthings requires a daily data cleanup to keep the database size down: user accounts expire, RSS feeds get old, anonymous sessions expire, and users never come back.

This is a huge cleanup operation once a web portal becomes popular and starts receiving thousands of users every day. Think about deleting millions of rows from 20 or 30 tables, one after another, while maintaining foreign key constraints. Also, the cleanup operation needs to run while the site is running, without hampering its overall performance. The whole operation leaves heavily fragmented indexes and empty space in the MDF file. The log file also becomes enormous in order to keep track of the large transactions. Hard drives get really hot and sweat furiously. Although the CPU keeps going, it's really painful to watch SQL Server go through this every day. But there is no alternative if you want to keep SQL Server's RAM and disk I/O requirements under control. Most importantly, it keeps invalid users from being counted in monthly reports.

When a user visits the site, the ASP.NET membership provider updates the LastActivityDate of the aspnet_users table. From this field, you can find out how long the user has been idle. The IsAnonymous bit field shows whether the user account is anonymous or registered. If it is registered, there is no need to worry. But if it is anonymous and more than 30 days old, you can be sure that the user will never come back, because the cookie has already expired. However, you can't avoid creating an anonymous user, because the user might want a fresh start (see the "Implementing Authentication and Authorization" section in Chapter 3). Another scenario is a user logging out on a shared computer (e.g., a cybercafé) and the next person using it as an anonymous user.

Here’s how the whole cleanup process works:

  1. Find the users who are anonymous and old enough to be discarded.
  2. Find the pages those users own.
  3. Delete all of the widget instances on those pages.
  4. Delete those pages.
  5. Remove rows from child tables related to aspnet_users, such as aspnet_Profile, aspnet_UsersInRoles, and aspnet_PersonalizationPerUser.
  6. Remove rows for the users to be deleted.
  7. Remove the users from aspnet_users.

Example is the giant DB script that does it all. I have added inline comments to explain what the script is doing.

Cleaning up old anonymous users and their related data
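
The script itself is not reproduced in this copy. The following is a heavily condensed T-SQL sketch of the steps listed above, assuming the Dropthings schema (Page.UserId, WidgetInstance.PageId) and the standard aspnet_* membership tables; the real script also needs batching so it doesn't hold giant locks while the site is running:

    -- 1. Anonymous users idle for more than 30 days
    DECLARE @OldUsers TABLE (UserId UNIQUEIDENTIFIER PRIMARY KEY);
    INSERT INTO @OldUsers (UserId)
    SELECT UserId FROM dbo.aspnet_Users
    WHERE IsAnonymous = 1
      AND LastActivityDate < DATEADD(DAY, -30, GETUTCDATE());

    -- 2 & 3. Widget instances on those users' pages
    DELETE wi FROM dbo.WidgetInstance wi
    INNER JOIN dbo.Page p ON wi.PageId = p.ID
    INNER JOIN @OldUsers o ON p.UserId = o.UserId;

    -- 4. The pages themselves
    DELETE p FROM dbo.Page p
    INNER JOIN @OldUsers o ON p.UserId = o.UserId;

    -- 5 & 6. Child rows in the membership tables
    DELETE FROM dbo.aspnet_Profile
    WHERE UserId IN (SELECT UserId FROM @OldUsers);
    DELETE FROM dbo.aspnet_UsersInRoles
    WHERE UserId IN (SELECT UserId FROM @OldUsers);
    DELETE FROM dbo.aspnet_PersonalizationPerUser
    WHERE UserId IN (SELECT UserId FROM @OldUsers);

    -- 7. Finally, the users
    DELETE FROM dbo.aspnet_Users
    WHERE UserId IN (SELECT UserId FROM @OldUsers);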


Now the question is, when can I run this script? The answer depends on several factors:

• The period of lowest traffic on your site. For example, in the U.S., most users are asleep at midnight. Of course, that works only if the majority of your users are from the U.S.

• A time when other maintenance tasks, such as index defrag or database backup, are least likely to be running. If by any chance another maintenance task conflicts with this enormous delete operation, SQL Server is dead.

• The time it takes to run the script. The operation can take anywhere from 10 minutes to several hours, depending on the volume of trash to clean up. So, consider how long the script will take to run and plan other maintenance jobs accordingly.

• When you typically run index defrag. It's best to run the script 30 minutes before the index defrag jobs run, because after the script completes, the tables will be heavily fragmented, so you need to defrag the indexes.

Before running this script:

• Turn off auto shrink in the database properties. With auto shrink on, the database size would automatically be reduced after the cleanup, but shrinking a database requires a lot of disk I/O activity and slows the database down, and the database will eventually grow again anyway.

• Ensure that the log file's initial size is big enough to hold such enormous transactions. You can specify one-third of the MDF size as the LDF's initial size. Also make sure the log file does not get shrunk; let it occupy the hard drive space. This saves SQL Server from repeatedly expanding and shrinking the file, both of which require heavy disk I/O. (A sketch of both preparations follows this list.)
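
As a hedged T-SQL sketch of both preparations, assuming the database is named Dropthings and its log file's logical name is Dropthings_log:

    -- Turn off auto shrink so SQL Server doesn't burn disk I/O
    -- shrinking a database that will grow again anyway
    ALTER DATABASE Dropthings SET AUTO_SHRINK OFF;

    -- Pre-size the log to roughly one-third of the MDF size so the big
    -- delete transaction doesn't force repeated file growth
    ALTER DATABASE Dropthings
    MODIFY FILE (NAME = Dropthings_log, SIZE = 2048MB);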

Once the cleanup job is done and the index defrag has run, the performance of your database will improve significantly. Because the tables are now smaller, the indexes are smaller, and SQL Server no longer has to run through large indexes. Future index defrags take less time because there's not much data left to optimize. SQL Server also takes less RAM because it has to work with far less data. Database backup size is also reduced because the MDF size does not keep increasing indefinitely. As a result, the significant overhead of this cleanup operation is quite acceptable compared to all the benefits.

