Trace Flag 2371 to Control Auto update Statistics threshold and behaviour in SQL Server

Hi Friends,

As you already know, the auto update statistics feature of SQL Server relies on the number of rows changed or updated to determine whether a statistics update is needed. The statistics on a table are automatically updated only when the number of rows changed exceeds a threshold.

AutoStat Threshold

The sysindexes.rowmodctr column maintains a running total of all modifications to a table that, over time, can adversely affect the query processor’s decision making process. This counter is updated each time any of the following events occurs:

  • A single row insert is made.
  • A single row delete is made.
  • An update to an indexed column is made.

NOTE: TRUNCATE TABLE does not update rowmodctr.

The basic algorithm for auto update statistics is:

  • If the cardinality for a table is less than six and the table is in the tempdb database, auto update with every six modifications to the table.
  • If the cardinality for a table is greater than 6, but less than or equal to 500, update statistics every 500 modifications.
  • If the cardinality for a table is greater than 500, update statistics when (500 + 20 percent of the table) changes have occurred.
  • For table variables, cardinality changes do not trigger auto update statistics.

NOTE: In the strictest sense, SQL Server counts cardinality as the number of rows in the table.

NOTE: In addition to cardinality, the selectivity of the predicate also affects AutoStats generation. This means statistics may not be updated at exactly every 500 modifications when cardinality is <= 500, or at exactly every 20% of changes when cardinality is > 500.
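To see these counters in practice, you can query the rowmodctr value directly. A minimal sketch (dbo.MyTable is a placeholder name; note that from SQL Server 2005 onwards sys.sysindexes is a backward-compatibility view and rowmodctr is a calculated value):

```sql
-- Inspect cardinality and the modification counter for a table.
-- dbo.MyTable is a placeholder; substitute your own table name.
SELECT  name      AS index_or_stats_name,
        rowcnt    AS table_cardinality,
        rowmodctr AS modifications_since_last_stats_update
FROM    sys.sysindexes
WHERE   id = OBJECT_ID('dbo.MyTable');
```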

Problem with VLDBs \ Large Tables

When a table becomes very large, the old threshold (a fixed rate of 20% of rows changed) may be too high and the AutoStat process may not be triggered frequently enough, which can lead to performance problems. SQL Server 2008 R2 Service Pack 1 and later versions introduce trace flag 2371, which you can enable to change this default behaviour. With the trace flag enabled, the higher the number of rows in a table, the lower the threshold becomes to trigger an update of the statistics.

For example, if the trace flag is activated, update statistics will be triggered on a table with 1 billion rows when 1 million changes occur. If the trace flag is not activated, then the same table with 1 billion records would need 200 million changes before an update statistics is triggered.

You can use the following guidance for enabling the new trace flag in your environment:

  1. If you have not observed performance issues due to outdated statistics, there is no need to enable this trace flag.
  2. If you are on SAP systems, enable this trace flag. Refer to this blog.
  3. If you have to rely on a nightly job to update statistics because the current automatic update is not triggered frequently enough, consider enabling trace flag 2371 to reduce the threshold.
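If you decide to enable the trace flag, a quick sketch of the commands (enabling it globally requires sysadmin rights; to persist it across restarts, add -T2371 as a startup parameter via SQL Server Configuration Manager instead):

```sql
-- Turn on trace flag 2371 globally (-1 = apply to all sessions).
DBCC TRACEON (2371, -1);

-- Confirm the flag is active.
DBCC TRACESTATUS (2371);
```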

sys.dm_db_stats_properties DMV

Starting with SQL Server 2008 R2 SP2 and SQL Server 2012 Service Pack 1, a new DMV, sys.dm_db_stats_properties, was introduced to more accurately track the number of rows changed in a table. You can use this DMV to detect the number of rows changed in a table and decide whether you wish to update statistics manually.

sys.dm_db_stats_properties (Transact-SQL) returns properties of statistics for the specified database object (table or indexed view) in the current SQL Server database.

The DMV takes two parameters, object_id and stats_id, both of type int. Syntax: sys.dm_db_stats_properties (object_id, stats_id)

object_id: The ID of the object in the current database for which properties of one of its statistics are requested.

stats_id: The ID of the statistics for the specified object_id. The statistics ID can be obtained from the sys.stats catalog view.

The DMV returns a number of columns; the two that require our attention are last_updated and modification_counter.

  1. last_updated tells us when the statistics for the given table\index were last updated.
  2. modification_counter reflects the total number of modifications for the leading statistics column (the column on which the histogram is built) since statistics were last updated.
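Putting the two columns together, a minimal sketch of how you might inspect them for a table (dbo.MyTable is a placeholder name):

```sql
-- List every statistic on a table with its last update time and the
-- number of modifications since then. dbo.MyTable is a placeholder.
SELECT  s.name AS stats_name,
        sp.last_updated,
        sp.rows,
        sp.modification_counter
FROM    sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE   s.object_id = OBJECT_ID('dbo.MyTable');
```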

System Center Advisor checks for statistics information using this DMV and raises an alert if the column modification counter is more than 1000.

Hope this new trace flag and DMV help you resolve performance issues related to obsolete stats.

Do let me know your take on this – just leave a comment.



Sarabpreet Anand

Find Jobs scheduled to run during a specific time range

Hi Friends,

Today I was going to raise a change related to a SAN migration for one of our production servers. The affected server will not be available for almost 3 hours. I prepared the plan and asked Mr. X to look into it and execute the change during the weekend. Mr. X happily agreed, and I asked him to go through the plan and clear up any doubts.

After reviewing the plan he asked multiple questions, which I answered promptly. His last question was: “Sarab, you’ve mentioned that we need to manually execute the daily jobs which are scheduled to run between 2:30 AM and 4:30 AM CST. But there are 48-odd jobs on this server; do you want me to check each and every job manually?”

This question from Mr. X forced me to complete and document this long-pending query (which I always wanted to document somewhere but kept putting off out of laziness).

So I wrote the complete query and documented it so that everyone can make use of it.

Here is the link for the query:

The logic is pretty simple; the only challenge I faced was converting the time and date, which are stored in a non-standard format.

To change the time I created a function (also available with this script) which converts the time from 15500 to 01:55:00, which is much more readable.

This query gives you the ability to filter\find the enabled jobs on the server that are scheduled to execute within a certain time range.
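For those who just want the gist, here is a simplified sketch of the logic (not the full documented query): it joins the msdb job and schedule tables, converts active_start_time from its integer format, and filters on the range.

```sql
-- Find enabled jobs whose schedule starts between 02:30:00 and 04:30:00.
-- active_start_time stores the time as an integer, e.g. 15500 = 01:55:00.
SELECT  j.name AS job_name,
        STUFF(STUFF(RIGHT('000000' + CAST(sch.active_start_time AS varchar(6)), 6),
                    5, 0, ':'), 3, 0, ':') AS start_time
FROM    msdb.dbo.sysjobs         AS j
JOIN    msdb.dbo.sysjobschedules AS js  ON js.job_id = j.job_id
JOIN    msdb.dbo.sysschedules    AS sch ON sch.schedule_id = js.schedule_id
WHERE   j.enabled = 1
  AND   sch.enabled = 1
  AND   sch.active_start_time BETWEEN 23000 AND 43000;
```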

You can also tweak the query to include the date parameter in where clause.

Here is the example output:


Do let me know your inputs on the same. – Just leave a quick comment.



Sarabpreet Anand

Error while opening SQL Server Configuration Manager – Fixed

Hi Friends,

Today I again experienced the same old frustrating error while opening SQL Server Configuration Manager on one of the SQL Servers. Refer to the screenshot below.


The error says:  Cannot connect to WMI provider. You do not have permission or the server is unreachable. Note that you can only manage SQL Server 2005 and later servers with SQL Server Configuration Manager.

Invalid namespace [0x8004100]  (Adding the error text here so that you can find this post through a search engine when you face this issue.)

Now, as the error suggests, everybody will jump straight to the services.msc console to check whether the WMI service (Windows Management Instrumentation) is running. I checked the same thing; it was running. The next step you might want to check is whether the WMI service is running properly, and so on.

Now, please stop validating your WMI service; the issue is not WMI connectivity or the service itself. Having said that, it is related to WMI, but the error message does not point to the underlying issue.

The issue is with the WMI provider for your SQL instance. Basically, whenever you uninstall a SQL instance from a server, it also removes the WMI provider. This becomes a problem when you have multiple instances on a server and you uninstall one of them.

The 32-bit instance and the 64-bit instance of SQL Server 2008 share the same WMI configuration file. This file is located in the %programfiles(x86)% folder.

So what’s the solution\fix?
The solution is a .mof file which contains all the configuration settings and regenerates the WMI provider for SQL Server. All you need to do is locate the folder mentioned below on your server and issue the command with the correct path.

mofcomp "C:\Program Files (x86)\Microsoft SQL Server\100\Shared\sqlmgmproviderxpsp2up.mof"

This is how it will look once you issue this command on a Command Prompt.


That’s it. Once you see “Done!” on the command prompt, everything is fixed and back to normal.

Now you can open your SQL Server Configuration Manager and work as you wish.

Do let me know if there is any other way to fix this; I’ll add it to my knowledge base.



Sarabpreet Anand

Resolving – Disk Missing Issue for TempDB with a Twist

A not-so-common but major issue is a missing disk\drive on a SQL Server. If there were user database files on that disk, chances are you’ll lose some data if the same disk is not allocated back to you; moreover, you have to bear downtime. (Though there are some exceptions, which we’ll discuss in another blog post.)

But if your TempDB was on the missing drive\location, it will not even allow your SQL Server to start, and gives the below error:

CREATE FILE encountered operating system error 3(The system cannot find the path specified.) while attempting to open or create the physical file ‘S:\TempDB_Date\tempdb.mdf’.

The only two options left with you are:

  1. Ask your storage admin to allocate a LUN (fresh\old one), map the storage to the server, assign the same drive letter, create the folder if it’s not already there, and then start your SQL Server. (You might also need to work on permissions if needed.)
  2. If you have space available on any other drive, start your SQL Server in single-user mode, change the location of the TempDB files with an ALTER DATABASE command, and then restart SQL Server in normal mode.
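A minimal sketch of option 2, assuming the default logical file names tempdev and templog (D:\TempDB is a placeholder path; the new location takes effect on the next restart):

```sql
-- Repoint the TempDB files to a drive that actually exists.
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, FILENAME = 'D:\TempDB\tempdb.mdf');

ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, FILENAME = 'D:\TempDB\templog.ldf');
```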

Both of the above solutions will work, but the problem is you might not have enough time to implement either of them. If you encounter such an issue on a Tier 1 or Tier 2 server, a Sev. 1 will be raised immediately, awaiting your response. On top of that, you need to represent your tower on a bridge call to answer multiple questions (what\where\when\why, etc.). All of this adds stress, and amid all this you need to take multiple steps to solve the issue.

Your SLA will also take a hit if you are not able to resolve the issue in the given time. (SLAs may vary, but starting SQL in single-user mode, or getting approval and allocating storage on the server, takes time.)

Quickest Solution

The quickest solution could be learning just one DOS command. Yes, good old DOS exposes a highly powerful command which can resolve the issue in a few minutes, if not seconds.

In my example the S: Drive was missing and here is the screenshot of the available drives.

Now, all you need to do is find the drive\disk on the server where you have enough space, and issue the below command:

SUBST S: D:\Sarabpreet\SSG_Example

– You need to replace the first parameter with the drive letter you need, and the second parameter with the location\drive where you have enough space to create the TempDB files.

Issuing the above command maps the path given as the second parameter to the drive letter given as the first parameter.


You can also check the location a drive is pointing to: verify it by issuing the SUBST command with no parameters.


Post that you can create the folder and restart the SQL Services.

If you want to delete the created drive once the disk has been allocated back to you, you can use the /D switch.
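For example, to drop the virtual S: drive created above:

```
SUBST S: /D
```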


Note: A reboot will clear drives created by SUBST, but you can work around this by keeping the command in an autoexec.bat file.

Hope you find it helpful.


Sarabpreet Anand

LepideAuditor Suite Review


Product: LepideAuditor Suite Version 15.0

LepideAuditor Suite offers a composite platform to audit and monitor changes to Active Directory, Group Policy Objects, Exchange Server, SQL Server, and SharePoint Server. The software ensures business continuity by monitoring all configuration changes within the IT environment. You get explicit answers to the Who, What, When and Where questions related to all changes. It provides a single console to audit, report, review and alert on changes made to important network components. Let’s review the SQL Server audit part of this tool.

It is a robust and impressive product which meets all your instance- and DB-level auditing requirements. Report scheduling, alerts and getting custom reports in your mailbox are a few of the out-of-the-box features which make this product outstanding.


1. Installation & Configuration

Installation is very easy, just like any other ordinary software. To start auditing you first have to register your SQL Server instance, configure certain things and install the monitoring agent.

Let’s see how to register an instance and what parameters we can configure. After launching the product for the first time, go to the Settings tab and you’ll get this dialogue box.


Once you select SQL Server, you’ll get a dialogue box to connect to the SQL instance you wish to monitor, and from the same console you’ll be able to install the Lepide agent.


Once the agent is installed properly you’ll get the configuration console to configure the monitoring scope of the tool. The screenshots below are self-explanatory.


Once the agent is installed you can configure the SQL Instance where LepideAuditor Suite can keep all Audit logs and you can provide the name of the DB.


What I liked the most is the inbuilt feature to schedule the Archival of audit data – this is obviously optional but a great feature.



What happens under the hood?

To make this kind of robust auditing possible, LepideAuditor Suite creates:

a. A couple of stored procedures in your master database

b. A new database named “LepideSQLTracker” (customizable)

c. A SQL Agent job

d. A SQL Server login


2. Usage

Auditing is a basic need for every production environment, but in some cases it becomes mandatory due to regulatory and compliance requirements like SOX, HIPAA, etc. LepideAuditor Suite for SQL Server not only handles those compliance requirements but also generates presentable reports for them.


3. Is it really required?

You must be wondering why we need a tool when you can do all this yourself with the help of built-in SQL features and some coding. Yes, you are correct, you can do so, but this tool gives you much more than the basic auditing features. There is virtually no way to skip the tracking; it gives you the option to generate reports in multiple formats; you can schedule reports and get them right in your mailbox; and you can even configure alerts on events. You’ll realize the true power of this tool when you have more than just a couple of servers: in those situations it helps you manage auditing from a single console and empowers you to standardize all aspects, be it reporting, tracking and monitoring changes, auditing, or archiving of the logs.

This tool also provides a graphical representation of changes, and you get multiple filter options. Moreover, you get the actual T-SQL statements of the executed changes along with the tracking\auditing records.

You can also create real-time monitoring alerts using the console. You don’t even have to change anything or worry about having different versions of SQL Server in your environment, since the tool supports all SQL Server versions from SQL 2000 through SQL Server 2014.

Here are a couple of screenshots to show you how the console looks:

Dashboard: There is a default dashboard which shows the 6 most common things, but you can customize it based on your needs; moreover, you can also add another dashboard.




Alerts: you can schedule reports or create alerts to be delivered directly to your mailbox.



Audit Reports: there are tons of reports already available which you can extract.



The UI is easy to use, and you can easily play with the columns and re-arrange them.



Live Feed: I am sure you must have noticed the Live Updates console by now. This is the coolest thing added in this version; in this console you get to see all changes in almost real time.


There are 6 different graphs visible at the main SQL instance level, and you can change the timeline for each graph individually to select whether you want to see changes made during the last day, week or month.


Nice and Presentable Reports

The tool lets you export the reports in different formats like CSV, PDF or HTML; apart from that, you also get the chance to export those nice-looking presentable graphs as images.


Why is this a Robust Product?

Every tool has some bugs or loopholes, and malicious users exploit those loopholes to do the unexpected without being caught, but with this tool there is no easy way out. I tested a couple of scenarios to see if I could temporarily stop the monitoring or find a way to hide my activities, but the tool was able to track each and every little activity. Here are the scenarios I tested:

a. Stopped the SQL Agent Service.

b. Disabled the default trace.

c. Stopped the monitoring itself from the Lepide console.


I personally recommend this tool.

You can download the trial version from here….

Sarabpreet Singh

Tech-Meet by Delhi User Group | 01-Nov-2014 | Gurgaon

Hi friends!

Delhi User Group presents Tech-Meet, your chance to learn hot technologies from industry experts.
The half-day free event for professionals in Delhi NCR includes a great line-up of industry-recognized speakers.

The event will start sharp at 9:30 AM, and security takes time, so please be on time to skip the line and grab a front seat.

I’ll be speaking on Backup and Restore improvements in SQL Server 2014.

REGISTER | Registration required | Limited seats only

Refer the Image below for the complete agenda:



See you at the event.


Sarabpreet Singh

Screen cast of SQL Server VM Creation on Microsoft Azure

Yesterday one of my friends asked me about the process of creating a SQL VM on Microsoft Azure. Though the process is quite simple and easy, he asked me to help him since he had never worked on the Azure platform before.

So this screencast post is for those who have never worked on Azure and want a step-by-step approach to creating the VM.


You can create a SQL Server VM on Azure by selecting the New menu on the lower left part of the page.



You can select the VM image from two sources: Quick Create or From Gallery (there is another way as well, but let’s not complicate this post; we may discuss that later). You need to select the image from the Gallery.



You can browse the multiple images available in the gallery; as you can see, you can select the SQL Server 2014 RTM image on Windows Server 2012 R2.



While creating the VM, you can select its size from the options. Size means the cores and memory you want to allocate to this VM.









Finally the VM is ready and started. You may wish to connect to it and start working.

Enjoy learning


Sarabpreet Singh

SQL Server data files on Azure | Configure Azure security

Before we even start discussing this feature, we must have an idea of Azure storage accounts and how security works in Azure.

You can access the containers linked to your Azure storage account because you are the owner; but let’s say you want to share this access (full or partial) to a blob/container with someone else. To do this, you must configure security keys.

In Azure, security works with two types of keys, and you can use these keys to fulfill your security requirements; they are:

a. Storage Access Key (Shared Storage Key)

b. Shared Access Signature Key

Storage Access Key (Shared Storage Key): This is an admin key which provides full access to your storage account; with this key a person can perform any operation on the underlying containers\blobs.

Shared Access Signature Key: This is another key, which can be used to provide granular access to a container and helps you enforce restricted permissions (read-only, read\write with expiry, etc.). This key is also known as a SAS key. A SAS key normally looks like a URL which defines the resources you can access, along with the permissions and the validity of the key.
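To connect this back to the title of the post: once you have a SAS key for a container, the SQL Server data files on Azure feature consumes it through a credential whose name matches the container URL. A minimal sketch (the account, container and database names are placeholders; the secret is the SAS token without the leading ‘?’):

```sql
-- Store the SAS key in a credential named after the container URL.
CREATE CREDENTIAL [https://myaccount.blob.core.windows.net/mycontainer]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET   = 'sr=c&si=mypolicy&sig=<signature>';

-- SQL Server 2014 can then place database files directly in the container.
CREATE DATABASE TestDB
ON (NAME = TestDB_data,
    FILENAME = 'https://myaccount.blob.core.windows.net/mycontainer/TestDB_data.mdf')
LOG ON (NAME = TestDB_log,
    FILENAME = 'https://myaccount.blob.core.windows.net/mycontainer/TestDB_log.ldf');
```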

How to get these keys:


Storage Access Key (Shared Storage Key)

The Azure portal provides the Shared Storage Key, and you don’t need any other tool. All you need to do is log in to the Azure portal, select the storage account and select Manage Access Keys at the bottom of the portal. Refer to the screenshots below:



You can easily copy the access key by using the copy link, marked in the image above.


Shared Access Signature Key

As of now, the Azure portal unfortunately doesn’t have a GUI for this. If you want to generate a SAS key, you can do so either programmatically (by calling the REST APIs or by using PowerShell cmdlets) or by using freeware or third-party tools. Let me show you a way to generate this key using an awesome open-source CodePlex project named “Azure Storage Explorer”, contributed by David.

You can download the tool from here….. It’s a small 3 MB file. The new version is very intuitive and has many new features. Once the tool has been installed you must add your storage account; to add your account, you have to provide your Storage Access Key. Refer to the screenshot below:



Once the storage account has been added, you can easily navigate the storage account and its containers. Select the container where you want to configure the SAS key and select Security from the upper tab.



A window will pop up with two tabs: Access Level and Shared Access Signature. You can change the default access level of your storage container or configure a SAS key using the respective tabs. Refer to the screenshots below for more clarity.


You can read more about the tool on David’s website….here


Sarabpreet Singh