13 Apr

LepideAuditor Suite Review

 

Product: LepideAuditor Suite Version 15.0

LepideAuditor Suite offers a composite platform to audit and monitor changes to Active Directory, Group Policy Objects, Exchange Server, SQL Server, and SharePoint Server. The software ensures business continuity by monitoring all configuration changes within your IT environment. You get explicit answers to the Who, What, When and Where questions for every change, and a single console to audit, report, review and alert on changes made to important network components. Let’s review the SQL Server auditing part of this tool.

It is a robust product which meets all your instance- and database-level auditing requirements. Report scheduling, alerts and delivery of custom reports to your mailbox are a few of the out-of-the-box features which make this product stand out.

 

1. Installation & Configuration

Installation is very easy, just like any other ordinary software. To start auditing, you first have to register your SQL Server instance, configure a few settings and install the monitoring agent.

Let’s see how to register an instance and which parameters we can configure. After launching the product for the first time, go to the Settings tab and you’ll get this dialog box.

1_sarabpreet

Once you select SQL Server, you’ll get a dialog box to connect to the SQL instance you wish to monitor, and from the same console you’ll be able to install the Lepide agent.

2_sarabpreet   3_sarabpreet    4_sarabpreet   6_sarabpreet 5_sarabpreet    

Once the agent is installed properly, you’ll get the configuration console to define the monitoring scope of the tool. The screenshots below are self-explanatory.

8_sarabpreet    7_sarabpreet

Once the agent is installed, you can configure the SQL instance where LepideAuditor Suite will keep all audit logs, and you can provide the name of the database.

9_sarabpreet

What I liked the most is the built-in feature to schedule archival of the audit data – this is optional, but a great feature.

10_sarabpreet

 

What happens under the hood?

To make this kind of robust auditing possible, LepideAuditor Suite creates:

a. A couple of stored procedures in your master database

b. A new database named “LepideSQLTracker” (customizable)

c. A SQL Agent job

d. A SQL Server login
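If you want to verify these artifacts yourself, a quick sanity check like the one below works. Note that apart from the “LepideSQLTracker” database name mentioned above, none of the object names are guaranteed – this is only an illustrative sketch, so check your own instance for the actual names.

```sql
-- Confirm the tracker database exists (the name is configurable, see above)
SELECT name, create_date
FROM sys.databases
WHERE name = 'LepideSQLTracker';

-- List recently created stored procedures in master;
-- the Lepide procedures should show up near the top
SELECT name, create_date
FROM master.sys.procedures
ORDER BY create_date DESC;

-- List SQL Agent jobs and SQL logins to spot the ones the suite created
SELECT name, date_created FROM msdb.dbo.sysjobs;
SELECT name, create_date FROM sys.server_principals WHERE type = 'S';
```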

 

2. Usage

Auditing is a basic need for every production environment, and in some cases it becomes mandatory due to regulatory and compliance requirements like SOX, HIPAA etc. LepideAuditor Suite for SQL Server not only handles those compliance requirements but also generates presentable reports for them.

 

3. Is it really required?

You must be wondering why you need a tool when you can do all this yourself with the help of in-built SQL Server features and some coding. Yes, you can, but this tool gives you much more than the basic auditing features. There is virtually no way to skip the tracking; it gives you an option to generate reports in multiple formats; you can schedule reports and get them right in your mailbox; and you can even configure alerts on events. You’ll realize the true power of this tool when you have more than just a couple of servers: in those situations it helps you manage auditing from a single console and empowers you to maintain a standard in all aspects, be it reporting, tracking and monitoring changes, auditing, or archiving of the logs.

This tool also provides a graphical representation of changes, and you get multiple filter options. Moreover, you get the actual T-SQL statements of the changes that were executed, along with the tracking/auditing records.

You can also create real-time monitoring alerts using the console. You don’t have to change anything or worry about having different versions of SQL Server in your environment, since the tool supports all versions from SQL Server 2000 through SQL Server 2014.

Here are a couple of screenshots to show you what the console looks like:

Dashboard: There is a default dashboard which shows the six most common items, but you can customize it based on your needs, and you can also add another dashboard.

12_sarabpreet   

11_sarabpreet

 

Alerts: You can schedule reports or create alerts to be delivered directly to your mailbox.

13_sarabpreet

 

Audit Reports: There are tons of reports already available which you can extract.

15_sarabpreet   

14_sarabpreet

The UI is easy to use, and you can easily rearrange the columns to suit your needs.

16_sarabpreet

 

Live Feed: I am sure you have noticed the Live Updates console by now. This is the coolest addition in this version: in this console you get to see all changes in near real time.

17_sarabpreet

There are six different graphs visible at the main SQL instance level, and you can change the timeline for each graph individually to see the changes made during the last day, week or month.

 

Nice and Presentable Reports

The tool lets you export the reports in different formats like CSV, PDF or HTML; apart from that, you can also export those nice-looking graphs as images.

 

Why is this a robust product?

Every tool has bugs or loopholes, and malicious users exploit those loopholes to do the unexpected without being caught – but with this tool there is no easy way out. I tested a couple of scenarios to see if I could temporarily stop the monitoring or find a way to hide activities, but the tool was able to track each and every little activity. Here are the scenarios I tested:

a. Stopped the SQL Agent Service.

b. Disabled the default trace.

c. Stopped the monitoring itself from the Lepide console.

 

I personally recommend this tool.

You can download the trial version from here….

Thanks,
Sarabpreet Singh


20 Oct

Tech-Meet by Delhi User Group | 01-Nov-2014 | Gurgaon

Hi friends!

Delhi User Group presents Tech-Meet, your chance to learn hot technologies from industry experts.
The half-day free event for professionals in Delhi NCR includes a great line-up of industry-recognized speakers.

The event will start at 9:30 AM sharp, and security checks take time, so please be on time to skip the line and grab a front seat.

I’ll be speaking on Backup and Restore improvements in SQL Server 2014.

REGISTER | Registration required | Limited seats only

Refer to the image below for the complete agenda:

agenda

location

See you at the event.

 

Thanks,
Sarabpreet Singh


18 Aug

SQL Server data files on Azure | Configure Azure security

Before we start discussing this feature, we must have an idea of Azure storage accounts and how security works in Azure.

You can access the containers linked to your Azure storage account because you are the owner, but let’s say you want to share this access (full or partial) to a blob/container with someone else; to do this you must configure security keys.

In the Azure portal, security works with two types of keys, which you can use to fulfill your security requirements; these are:

a. Storage Access Key (Shared Storage Key)

b. Shared Access Signature Key

Storage Access Key (Shared Storage Key): This is an admin key which provides full access to your storage account; with this key a person can perform any operation on the underlying containers/blobs.

Shared Access Signature Key: This key can be used to provide granular access to a container and helps you enforce restricted permissions (read-only, read/write with expiry, etc.). It is also known as a SAS key. A SAS key normally looks like a URL which defines the resources you can access, along with the permissions and the validity period of the key.
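For illustration, a SAS URL for a container generally looks something like this (the account, container and signature values here are made up):

```text
https://mystorageacct.blob.core.windows.net/sqlprod?sv=2014-02-14&sr=c&sp=rw&se=2015-01-01T00%3A00%3A00Z&sig=<signature>
```

Here `sv` is the storage service version, `sr=c` says the resource is a container, `sp=rw` grants read/write permissions, `se` is the expiry time, and `sig` is the cryptographic signature that makes the token tamper-proof.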

How to get these keys:

 

Storage Access Key (Shared Storage Key)

The Azure portal provides the Shared Storage Key directly, and you don’t need any other tool. All you need to do is log in to the Azure portal, select the storage account and select Manage Access Keys at the bottom of the portal. Refer to the screenshots below:

AzureStorageAccount-sarabpreet

AzureStorageAccessKey-sarabpreet

You can easily copy the access key by using the copy link, marked in the image above.

 

Shared Access Signature Key

Unfortunately, as of now the Azure portal doesn’t have a GUI for this. If you want to generate a SAS key, you can do so either programmatically (by calling the REST APIs or by using PowerShell cmdlets) or by using freeware or third-party tools. Let me show you a way to generate this key using an awesome open-source CodePlex project named “Azure Storage Explorer”, which is contributed by David.

You can download the tool from here….. It’s a small 3 MB file, and the new version is very intuitive with many new features. Once the tool has been installed you must add your storage account; in order to add your account you have to provide your Storage Access Key. Refer to the screenshot below:

AzureStorageExplorer-add_new_account_sarabpreet

 

Once the storage account has been added, you can easily navigate the storage account and containers. Select the container where you want to configure the SAS key and select Security from the upper tab.

StorageExplorer-container

 

It will pop up a window with two tabs: Access Level and Shared Access Signature. You can change the default access level of your storage container or configure a SAS key using the respective tabs. Refer to the screenshots below for more clarity.

AzureStorageExplorer-AccessLevel  AzureStorageExplorer-Create_SAS_Key_sarabpreet

You can read more about the tool on David’s website….here

 

Thanks,
Sarabpreet Singh


3 Jun

SQL Server Data Files in Windows Azure | Part-1

This is a brand-new capability introduced in SQL Server 2014. Having said that, believe me, the underlying idea is not new to us – we’ve been using this kind of technology for a long time in our daily life, for example when streaming music over the internet or saving all your data on OneDrive.

Just like the examples mentioned above, starting with SQL Server 2014 you can take advantage of Azure and place your data and log files in Azure, no matter whether your SQL Server instance is running on-premise or on a Windows Azure virtual machine.

It lets you create and place your data and log files directly on Windows Azure storage. By doing this you get all the advantages of Azure storage: bottomless storage, an offsite location, geo-redundancy, no provisioning, no device management, decay-free media (full media safety) and remote accessibility from anywhere. To know more, read this.
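To give you a flavour of the syntax, a minimal sketch looks like the script below. The storage account, container and database names are placeholders, and the SAS key must be one you generated for your own container:

```sql
-- The credential name must match the container URL, and the identity
-- must be exactly 'SHARED ACCESS SIGNATURE'
CREATE CREDENTIAL [https://mystorageacct.blob.core.windows.net/sqldata]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<your SAS key, without the leading ?>';
GO

-- Create a database whose files live directly on Azure blob storage
CREATE DATABASE CloudDB
ON (NAME = CloudDB_data,
    FILENAME = 'https://mystorageacct.blob.core.windows.net/sqldata/CloudDB_data.mdf')
LOG ON (NAME = CloudDB_log,
    FILENAME = 'https://mystorageacct.blob.core.windows.net/sqldata/CloudDB_log.ldf');
GO
```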

 

Why do you want to place your data\log files on Azure?

Don’t you think it is counter-intuitive to keep the files in the cloud? At first even I was thinking the same. It looks odd, but it can be useful in many circumstances. Essentially, all you need to do is re-attach your data and log files from the cloud to a SQL instance; that instance can be on-premise or in the cloud.

The next question will obviously be about performance. In case you are skeptical about the performance, you can host your SQL Server in the cloud as well (on a Windows Azure VM); this way both your SQL Server and your data/log files will be in the cloud.

This feature gives you the power of a DR solution: even if your data centre goes down, you can bring the SQL instance back in minutes without any special technique. All you need is a SQL instance which you can use just like your production server, because all the data is still intact in the cloud – you just have to re-attach the files to make it work.

 

Is it Secure?

Yes, it is very secure, and your data is protected in several ways:

a) Windows Azure itself provides a mechanism to secure and protect your data; it’s not like a public folder which is open to all. To learn more, you can read this blog, which will help you understand how security works in Azure Storage and the importance of each type of key.

b) Granular encryption encrypts the data (cell-by-cell encryption) at all times, and the key to decrypt it is only available to the SQL Server instance and is never stored in the cloud along with the data/log files.

c) If you are still not convinced, you can use TDE (Transparent Data Encryption) to encrypt it further. It is fully supported.

 

Ok, I am convinced but what about my RTO & RPO?

You don’t have to worry about these two, because this technology inherently takes care of them. It saves a lot of your time: you don’t have to restore the database from a backup even if your server crashes, because your data files are intact in the cloud. You just need to re-attach them to a new instance, and your database is available right from the point where you left off (you lose only uncommitted transactions). So it gives you a better RPO (Recovery Point Objective) along with a better RTO (Recovery Time Objective).
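As a sketch of that recovery scenario (file names and URLs are placeholders), re-attaching on a new instance is just a CREATE DATABASE … FOR ATTACH pointing at the blob URLs, assuming the matching SAS credential for the container already exists on the new instance:

```sql
-- On the new instance, after creating the container-scoped SAS credential,
-- re-attach the files that are still sitting intact in Azure storage
CREATE DATABASE CloudDB
ON (FILENAME = 'https://mystorageacct.blob.core.windows.net/sqldata/CloudDB_data.mdf'),
   (FILENAME = 'https://mystorageacct.blob.core.windows.net/sqldata/CloudDB_log.ldf')
FOR ATTACH;
GO
```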

Enough of talking for now – in the next part I’ll show you how to use this amazing feature.

 

Thanks,
Sarabpreet Singh


25 Apr

SQL Server Managed Backup to Microsoft (Windows) Azure | Part-1

SQL Server Managed Backup is a new feature introduced in SQL Server 2014, and it is simply outstanding: you don’t have to worry about your backup strategy anymore, because this feature manages and automates the backup process.

This feature uses a sort of artificial intelligence. If you look at the existing scheduling features, they just keep taking backups at a specified time interval, whereas this feature checks a couple of things and, based on the situation, decides whether a backup needs to be initiated for a database or not. It is driven by the retention period selected for backups and the transaction workload on the database, and it also provides point-in-time recovery. You want to know how? Keep reading… you’ll get your answers soon.

You can enable this feature at two levels: the instance level as well as the database level. It doesn’t really matter whether your SQL Server is running on-premise or in the cloud; in fact, SQL Server Managed Backup to Microsoft (Windows) Azure is recommended for SQL Server running on a VM in the cloud.

The retention period is a value which determines which backup files to keep and which to delete in Azure Storage, so that the underlying database can be recovered to a point in time within the specified time frame. Valid retention period values range from 1 to 30 days. The backup_finish_date of each backup is matched against the time specified by the retention period settings.
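To make this concrete, here is a sketch of enabling managed backup for a single database with 30-day retention, using the msdb smart_admin procedure that ships with SQL Server 2014; the database, storage URL and credential names are placeholders you would replace with your own:

```sql
USE msdb;
GO
-- Enable managed backup for one database with a 30-day retention period
EXEC smart_admin.sp_set_db_backup
    @database_name = 'MyDB',
    @enable_backup = 1,
    @storage_url = 'https://mystorageacct.blob.core.windows.net',
    @retention_days = 30,
    @credential_name = 'MyAzureCredential',
    @encryption_algorithm = 'NO_ENCRYPTION';
GO
```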

Naming convention for backups and containers

The backup files are named using a combination of the first 40 characters of the database name, the GUID of the database without “-”, and the timestamp; these values are separated by underscores to keep the name readable. The .bak and .log extensions are used for full and log backups respectively. For availability group databases, the availability group database GUID is appended after the first 40 characters of the database name; you can get this GUID value from the group_database_id column of sys.databases.

In the same way, the backup container on Microsoft (Windows) Azure is named after the SQL instance, and in the case of availability databases it is named using the GUID of the availability group.
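Putting that together, a full backup file name for a database called AdventureWorks would look roughly like this (the GUID and timestamp here are invented purely for illustration):

```text
AdventureWorks_1A2B3C4D5E6F47089AB0CDEF12345678_20140425093000.bak
```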

 

Conditions that trigger a backup

As we discussed earlier, SQL Server Managed Backup doesn’t initiate a backup merely on the basis of a fixed schedule, but takes a calculated decision on whether a backup is really required for the given database. It decides based on the conditions below:

A. Full backup gets triggered:

1. At the time when Managed Backup is configured on DB or when it is configured on the complete instance with default settings.

2. If the log generation since last full backup is greater than or equal to 1 GB in size.

3. If the maximum time interval of one week has passed since the last full backup (it tries to take at least one full backup every week).

4. When the log chain is broken. It periodically checks whether the log chain is intact by comparing the first and last LSNs of the backup files; if it finds a gap, it initiates a full database backup. There can be any number of reasons for a broken log chain; the most common are:
a. Switching the recovery model to simple.
b. Taking a backup through another job or a manual process.
c. Deleting some backup files.
d. Accidentally overwriting backup files.
e. Truncating the log.

B. Transaction Log backup gets triggered:

1. When the Transaction Log space used is equal to or greater than 5 MB.

2. The maximum time interval of 2 hours has passed since the last log backup.

3. If no log backup history is found; for example when you enable Managed backup for the first time for the respective DB.

4. Any time the transaction log backup is lagging behind the full backup; it always tries to keep the log chain ahead of the full backup. In simple words, after every full backup it will initiate one transaction log backup.
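The log-chain check described above can be approximated yourself against the backup history in msdb: for an intact chain, each log backup’s first_lsn should equal the previous one’s last_lsn. A hypothetical sketch (the database name is a placeholder):

```sql
-- Inspect the log backup chain for gaps: each row's first_lsn should equal
-- the previous row's last_lsn; a mismatch means the chain is broken
SELECT database_name, backup_start_date, first_lsn, last_lsn,
       LAG(last_lsn) OVER (ORDER BY backup_start_date) AS previous_last_lsn
FROM msdb.dbo.backupset
WHERE database_name = 'MyDB' AND type = 'L'
ORDER BY backup_start_date;
```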

 

Thanks,
Sarabpreet Singh Anand


4 Apr

SQL Tip-10 | SQL Server Database Backup to Cloud (Microsoft Azure)

This feature helps you leverage the power of the cloud with your SQL instance by taking your backups directly to the cloud.

Backup to URL is not new in SQL Server 2014; it was first introduced in SQL Server 2012 SP1 CU2, but you could only use the URL in a query. Starting with SQL Server 2014 this feature is fully integrated: now you can specify a URL while taking a backup or doing a restore, no matter whether you perform the task via a query, the Backup/Restore GUI, PowerShell cmdlets or SMO.

To be able to take backups to or restore from Microsoft Azure (earlier named Windows Azure), you must create a credential. Before creating a credential you must already have a storage account created on Azure. Once you create your storage account you’ll be able to get the Storage Access Key from the Azure portal. This storage access key provides full admin control over your storage account.

Here is the script to create the Credential:

Create credential Storage_4Backup
with identity = 'dbbackup12',
SECRET = 'H61bUxq5ld+AUFGa3Zc30CKYGSOyMDGEb/MKcCjvssdrgF34PZt+TV/42/HoJpOgnrM82KQpuCYYHEudjg=='
go

 

You need to replace the secret with your own storage access key; the screenshot below shows where to copy it from.

clip_image002

Once the credential is created on your SQL instance, you are all set to take backups to or restore from your Windows Azure storage account.

Here is the T-SQL Script for the same:

Backup database azure_test
to URL = 'http://dbbackup12.blob.core.windows.net/sqlprod/azure_test.bak'
with credential = 'Storage_4Backup', format, compression, stats = 10,
MediaName = 'azure_test_22_02_2014', mediadescription = 'Backup_of_azure_test'

 

Note: To get the complete script bundled with step-by-step demo follow this link.
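The restore is symmetrical; as a sketch, restoring the same database from the same URL with the credential created earlier would look like this:

```sql
-- Restore the database from the Azure blob using the existing credential
Restore database azure_test
from URL = 'http://dbbackup12.blob.core.windows.net/sqlprod/azure_test.bak'
with credential = 'Storage_4Backup', replace, stats = 10
```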

You can also back up to Windows Azure using the Backup UI – but the credential is a must. Here is the screenshot.

If the credential already exists you can select it from the drop-down menu; otherwise the wizard itself gives you an option to create one.

clip_image004

Everything else is self-explanatory. 😉

 

Thanks,
Sarabpreet Singh


30 Jan

Microsoft Cloud OS Roadshow-Gurgaon | 22-Feb-2014

This February MS is organizing Cloud OS Roadshows, and I’ve got a chance to be a speaker at the Microsoft Cloud OS Roadshow happening on 22nd Feb ’14 in Gurgaon.

I’ll be sharing the stage with two amazing speakers and MVPs: Gaurav Mantri (Microsoft Azure MVP) and Shantanu Kaushik (Windows Expert IT Pro MVP)

Join this exciting full day event to learn:

  • Breakthrough Data Performance Platform
  • Virtualization
  • Building Modern Apps and Windows Azure
  • Introduction to Cloud Computing and Windows Azure Overview
  • Windows Azure Compute – IaaS, Websites and Cloud Services
  • Windows Azure Storage – SQL Database, Blob Storage, Table Storage

Lunch will be provided.

Register today: http://bit.ly/KkS1cL

MS Cloud OS Roadshow - Infopage- sarabpreet singh anand

Cheers!

 

Update: Covered by a few bloggers: https://plus.google.com/+DronacharyaInfo/posts/XHxSzkUDRK8


17 Jul

Read Before You Even Think of Installing SQL Server 2014 CTP1

SQL Server 2014 CTP1 is out with lots of cool and exciting features. If you’ve not yet downloaded it, you can get it from the link below: http://www.microsoft.com/en-us/sqlserver/sql-server-2014.aspx

If you want to try out any of these cool features you might want to install this version, but before you even think of doing so, read the limitations below carefully – chances are your installation will fail if you do not meet them.

Microsoft SQL Server 2014 Community Technology Preview 1 (CTP1) installation limitations:

  1. Microsoft SQL Server 2014 CTP1 is pre-release software and should be installed only on a clean machine. It does not support upgrade from or side-by-side installations with any previous version of SQL Server, including SQL Server 2012. In addition, it does not work with or support side-by-side installations with any client redistributables of SQL Server such as feature pack components, Visual Studio 2012 or earlier versions, SQL Server Data Tools, Data-Tier Application Framework, etc.
  2. To administer Microsoft SQL Server 2014 CTP1, use only the SQL Server Management Studio that ships as part of Microsoft SQL Server 2014 CTP1.
  3. The Microsoft SQL Server 2014 CTP1 release is NOT supported by Microsoft Customer Services and Support (CSS).
  4. The Microsoft SQL Server 2014 CTP1 release is only available in the x64 architecture.
  5. The Microsoft SQL Server 2014 CTP1 release is a NON-PRODUCTION release and should not be installed or used in production environments.
  6. The Microsoft SQL Server 2014 CTP1 release does NOT have an associated Feature Pack.
  7. The Microsoft SQL Server 2014 CTP1 release on a Windows Azure VM is only intended for functional development and testing, not for performance testing.
  8. The Microsoft SQL Server 2014 CTP1 release contains branding and versioning from older Microsoft SQL Server releases.

I actually tried installing this on my laptop, which already had every version of SQL Server starting from 2005. As in previous installations, the Setup Support Rules screen in the SQL Server 2014 installation media detected the previous installations of SQL Server on my laptop and stopped my attempt at installing something against the rules.

clip_image002

After clicking the “Failed” hyperlink, I got the complete error message in a dialog box – same as in earlier versions.

clip_image003

So, to satisfy these rules, I created a new virtual machine to install and test these features. I’ll be posting all those steps in my next blog – till then, stay tuned and happy learning! 🙂

Thanks,
Sarabpreet Singh Anand
Subscribe now to get latest Blogs in your Inbox or Follow on twitter @SQLDB