Category Archives: SQL for Beginners

How to find the Isolation level being used by a Database?

Hi Friends,

Recently I was preparing a dashboard sheet for one of our critical boxes. The client had a long list of requirements, and one odd requirement was to embed the isolation level being used for every database. This was a unique requirement for me and I didn't know how to get this, but within 5 minutes of searching I came across an interesting command which returns this info along with other useful stuff.

The command is:
DBCC UserOptions
This command also tells you about the current date format and various ANSI settings.

The pic below shows the output from one of my databases where the isolation level is set to snapshot.

[Screenshot: DBCC USEROPTIONS output showing the isolation level set to snapshot]
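Note that DBCC USEROPTIONS reports the options in effect for your current session. If you also want a database-level view of the snapshot isolation settings across all databases, a query along these lines should do the trick (a small sketch based on the sys.databases catalog view; verify the output on your own build):

SELECT name,
       snapshot_isolation_state_desc,   -- ON when ALLOW_SNAPSHOT_ISOLATION is enabled
       is_read_committed_snapshot_on    -- 1 when READ_COMMITTED_SNAPSHOT is enabled
FROM sys.databases
ORDER BY name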

 

Hope this will help you in some way.

 

Regards

Sarabpreet Anand

What is Checkpoint in SQL Server?

Checkpoint is an internal process that writes all dirty pages (modified pages) from the buffer cache to physical disk; apart from this, it also writes the log records from the log buffer to the physical log file. Writing dirty pages from the buffer cache to the data file is also known as hardening the dirty pages.
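If you're curious how many dirty pages are currently sitting in the buffer cache for each database, you can get a rough idea from the sys.dm_os_buffer_descriptors DMV (just a quick sketch; the counts change constantly, and the scan itself can take a while on servers with lots of memory):

SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) AS dirty_pages          -- pages modified in memory but not yet hardened to disk
FROM sys.dm_os_buffer_descriptors
WHERE is_modified = 1
GROUP BY database_id
ORDER BY dirty_pages DESC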

It is a dedicated process that SQL Server runs automatically at specific intervals, and it runs for each database individually.

Checkpoint helps reduce the recovery time for SQL Server in the event of an unexpected shutdown or system crash\failure.

 

In SQL Server 2012 there are four types of Checkpoints:

Automatic: This is the most common checkpoint. It runs as a background process to make sure a SQL Server database can be recovered within the time limit defined by the Recovery Interval server configuration option.

Indirect: This is new in SQL Server 2012. It also runs in the background, but to meet a user-specified target recovery time for the specific database where the option has been configured. Once TARGET_RECOVERY_TIME has been set for a given database, it overrides the Recovery Interval specified for the server and avoids automatic checkpoints on that database.

Manual: This one runs just like any other T-SQL statement; once you issue the CHECKPOINT command it runs to completion. A manual checkpoint runs for your current database only. You can also specify the optional checkpoint_duration, which is the time (in seconds) within which you want the checkpoint to complete. (T-SQL examples for these checkpoint options are given after this list.)

Internal: As a user you can't control internal checkpoints. They are issued by specific operations such as:
1. Shutdown initiates a checkpoint operation on all databases except when the shutdown is not clean (SHUTDOWN WITH NOWAIT).
2. If the recovery model gets changed from Full\Bulk-logged to Simple.
3. While taking a backup of the database.
4. If your DB is in the Simple recovery model, the checkpoint process executes automatically either when the log becomes 70% full or based on the Recovery Interval server option.
5. An ALTER DATABASE command to add or remove a data\log file also initiates a checkpoint.
6. A checkpoint also takes place when the recovery model of the DB is Bulk-Logged and a minimally logged operation is performed.
7. DB snapshot creation.
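For reference, here is roughly how each of these checkpoint options is configured or invoked in T-SQL (a hedged sketch: adjust the values for your environment, and note that AdventureWorks is just a placeholder database name):

-- Automatic: server-wide recovery interval in minutes (0 = let SQL Server manage it)
-- 'recovery interval (min)' is an advanced option, so 'show advanced options' must be enabled first
EXEC sp_configure 'recovery interval (min)', 5;
RECONFIGURE;

-- Indirect: per-database target recovery time (SQL Server 2012 and later)
ALTER DATABASE AdventureWorks SET TARGET_RECOVERY_TIME = 60 SECONDS;

-- Manual: checkpoint the current database, optionally asking it to finish in about 10 seconds
USE AdventureWorks;
CHECKPOINT 10;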

Happy Learning 🙂

Thanks,
Sarabpreet Singh Anand
Subscribe now to get latest Tips\blog posts in your Inbox

Using Covering Index

A non-clustered index that contains all the columns referenced by a query (in the SELECT, WHERE or JOIN clauses) is called a covering index.

In other words

When all the columns required to satisfy a query are included in a non-clustered index so that the extra key lookup is avoided, that index is known as a covering index.

This type of index can effectively improve query performance. To understand why a covering index helps gain extra performance, let's do some testing.

In this example, I'll create two identical tables, Stock and Stock1, with exactly the same data and number of rows. To test the performance benefit, we'll create a different type of index on each table. On the first table we'll create an index with only the key columns, and on the second table we'll create a covering index which also includes the non-key column. Let's see whether it impacts performance or not.

This is the query we need to test (the full setup script is given below):

select PID, Qty, calories, PName from Stock where Qty>=11 and calories< 200

On table Stock we created a clustered index on PID and a non-clustered index on Qty and calories, whereas on table Stock1 we created a clustered index on PID and a non-clustered index on Qty and calories that also includes the non-key column PName, turning it into a covering index.

For the first table, the query uses the non-clustered index to get PID, Qty and calories, and once all these values are available the SQL engine looks up the PName column at the leaf level of the clustered index (basically a lookup against the clustering key, which is PID). If we could avoid traversing the clustered index to get the relevant PName values, we could easily improve query performance. To do so we created the second table, Stock1, where the non-clustered index was created on Qty and calories but PName (a non-key column) was included in the index. Since this non-clustered index already has the PName values, there is no need to traverse the clustered index and we save those resources.

When we executed both queries after setting STATISTICS IO ON, we saw a decrease in logical reads from 43 to 39. I know the numbers are not huge and there's not a major difference, but there is definitely an improvement, and how much you gain depends heavily on the number of records and the selection criteria.

[Screenshot: STATISTICS IO output showing logical reads dropping from 43 to 39]

 

Then we compared the execution plans and saw that the first query, on the Stock table where the index has no included non-key column, takes 93% of the batch cost compared to just 7% for the second query.

[Screenshot: execution plans showing 93% vs 7% relative cost]

 

Moreover, if you compare the IO and CPU costs there is a drastic improvement. Hence we can conclude that, performance-wise, covering indexes are much better than a normal non-clustered index.

[Screenshots: operator IO and CPU cost comparison for both queries]

 

In case you want to test this yourself, you can use the script below. (Don't execute these queries on your production box; they are provided as is, without any warranty.)

create database Test_Sarab
go
use test_sarab
go

create table Stock
(
PID int Identity primary key,
Qty int,
calories int,
PName varchar(30)
)

create table Stock1
(
PID int Identity primary key,
Qty int,
calories int,
PName varchar(30)
)

insert into Stock values
(10, 267, 'Doughnut Custard'),
(11, 410, 'Doughnut Iced Glazed'),
(11, 256, 'Doughnut Jam'),
(11, 266, 'Muffin Blueberry'),
(11, 149, 'Breakfast Muffin'),
(13, 401, 'Muffin Chocolate Chip'),
(9, 141, 'Wholemeal Muffin'),
(13, 78, 'Brown Bread')
go 1001

insert into Stock1 values
(10, 267, 'Doughnut Custard'),
(11, 410, 'Doughnut Iced Glazed'),
(11, 256, 'Doughnut Jam'),
(11, 266, 'Muffin Blueberry'),
(11, 149, 'Breakfast Muffin'),
(13, 401, 'Muffin Chocolate Chip'),
(9, 141, 'Wholemeal Muffin'),
(13, 78, 'Brown Bread')
go 1001

create index Id2 on Stock (Qty, calories)

create index FinalIDx on Stock1 (Qty, calories) include (PName)

--dbcc dropcleanbuffers

SET STATISTICS IO ON;
select PID, Qty, calories, PName from Stock where Qty>=11 and calories< 200

select PID, Qty, calories, PName from Stock1 where Qty>=11 and calories< 200
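Once you're done testing, you may want to drop the sample database to clean up (again, only on a test box):

use master
go
drop database Test_Sarab
go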

 

Thanks,

Sarabpreet Singh Anand

Read Before you even think of Installing SQL Server 2014 CTP1

SQL Server 2014 CTP1 is out with lots of cool & exciting features. If you've not yet downloaded it, you can get it from the link below: http://www.microsoft.com/en-us/sqlserver/sql-server-2014.aspx

If you want to try out any of these cool features you might want to install this version, but before you even think of doing so, read the limitations below carefully; chances are your installation will fail if you do not meet them.

Microsoft SQL Server 2014 Community Technology Preview 1 (CTP1) installation limitations:

  1. Microsoft SQL Server 2014 CTP1 is pre-release software and should be installed only on a clean machine. It does not support upgrade from or side-by-side installations with any previous version of SQL Server, including SQL Server 2012. In addition, it does not work with or support side-by-side installations with any client redistributables of SQL Server such as feature pack components, Visual Studio 2012 or earlier versions, SQL Server Data Tools, Data Tier Application Framework, etc.
  2. To administer Microsoft SQL Server 2014 CTP1, use only the SQL Server Management Studio that ships as part of Microsoft SQL Server 2014 CTP1.
  3. The Microsoft SQL Server 2014 CTP1 release is NOT supported by Microsoft Customer Services and Support (CSS).
  4. The Microsoft SQL Server 2014 CTP1 release is only available in the x64 architecture.
  5. The Microsoft SQL Server 2014 CTP1 release is a NON-PRODUCTION release and should not be installed and used in production environments.
  6. The Microsoft SQL Server 2014 CTP1 release does NOT have an associated Microsoft SQL Server 2014 CTP1 Feature Pack.
  7. The Microsoft SQL Server 2014 CTP1 release on a Windows Azure VM is only intended for functional development and testing, not for performance testing.
  8. The Microsoft SQL Server 2014 CTP1 release contains branding and versioning from older Microsoft SQL Server releases.

I actually tried installing this on my laptop, which already had every version of SQL Server starting from 2005. As with previous installations, the Setup Support Rules screen in the SQL Server 2014 installation media detected the existing SQL Server installations on my laptop and stopped my attempt to install something that is against the rules.

[Screenshot: Setup Support Rules screen with a failed rule]

After clicking on the "Failed" hyperlink I got the complete error message in a dialog box, same as earlier versions.

[Screenshot: error message dialog box]

So, to fulfill these rules, I created a new virtual machine to install and test out these features. I'll be posting all those steps in my next blog; till then stay tuned and Happy Learning 🙂

Thanks,
Sarabpreet Singh Anand
Subscribe now to get latest Blogs in your Inbox or Follow on twitter @SQLDB