SQL Server 2016 Compare Showplan Option

Just released in SQL Server 2016 CTP 2.4 is a new feature that allows us to compare execution plans. The main purpose of this feature is to provide a side-by-side comparison of two execution plans, which makes it easier to spot similarities and changes in each plan. This is a very nice enhancement that aids in troubleshooting issues such as understanding the impact of rewriting a query or observing how a design change, such as a new index, may impact the plan. In the tip below, we’ll go over how it works.

For this post, I’m using the AdventureWorks2014 database and I’m going to use the following query:

SELECT FirstName, LastName, CreditCardID 
FROM Person.Person p 
INNER JOIN Sales.PersonCreditCard c ON p.BusinessEntityID = c.BusinessEntityID
WHERE p.ModifiedDate > '2014-01-01'

Open SQL Server Management Studio (SSMS) 2016 and execute the query with SET STATISTICS IO ON and “Include Actual Execution Plan” enabled.
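If you’d rather enable the statistics from the query window itself, here’s a minimal sketch (SET STATISTICS IO is session-scoped, so run it in the same window as the query):

SET STATISTICS IO ON;
GO

SELECT FirstName, LastName, CreditCardID
FROM Person.Person p
INNER JOIN Sales.PersonCreditCard c ON p.BusinessEntityID = c.BusinessEntityID
WHERE p.ModifiedDate > '2014-01-01'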

Click the Messages tab and you’ll notice that we’re doing 3819 logical reads on the Person table. This is a sign that this query can be tuned. We’d like to get this number as low as possible.


Click the execution plan tab and you’ll see the plan along with some Missing Index Details.


The execution plan is what we’ll focus on in this tip, but I wanted to look at STATISTICS IO as well to see if we can improve the number of logical reads.

To be able to compare plans, we’ll need to save the execution plan. Right click anywhere in the plan window and select Save Execution Plan As… Choose a name and location and hit Save.

Click here to view the rest of this post.

SQL Server Management Studio 2015

Microsoft SQL Server Management Studio is a product I use throughout the day, every day. I’ve tried using other environments to access and manage SQL Server, but SSMS is what I learned on and what I always go back to. The problem with SSMS was that you couldn’t download it individually. You always had to have a licensed copy of SQL Server or install SQL Server Express with Tools to get it… until now!

Microsoft has finally released a standalone download of SSMS. This release supports SQL Server 2005 through SQL Server 2016. It also provides the greatest level of support when working with Azure.

Some enhancements include:

  • New SSMS Installer – SSMS can now be installed with a lightweight stand-alone web installer.
  • SSMS Updates – Receive notification within SSMS when new updates are available, and choose to install them at your convenience.
  • Enhanced SSMS support for Azure SQL Database – Several fixes and enhancements, including expanded SQL Server Management Objects (SMO) coverage, and an updated Import/Export wizard.

Let’s step through the install:

Download the SSMS-Web-Setup.exe from here.

Start the install, agree to the license terms, and click Install.

It should only take a few minutes; once the install completes, restart your computer and that’s it! Easy install.


Passing Multiple Values into a Variable

Passing multiple values into a variable is a little more difficult than it should be. In other languages you can use structures such as lists or arrays, but SQL makes it a bit more complicated. To show you what I mean, let’s look at an example.

First, let’s create a Demo table:

CREATE TABLE [dbo].[Demo](
 [ID] [int] NULL,
 [Name] [varchar](50) NULL,
 [Address] [varchar](50) NULL,
 [State] [varchar](50) NULL,
 [Zip] [int] NULL
)

Next, populate it with some data:

INSERT INTO [dbo].[Demo]
VALUES (1, 'Brady', '123 Main Street', 'TN', 12345)

INSERT INTO [dbo].[Demo]
VALUES (2, 'Tommy', '124 Main Street', 'TN', 12345)

INSERT INTO [dbo].[Demo]
VALUES (3, 'Jonny', '125 Main Street', 'TN', 12345)

Now that we have some data, let’s try a query using variables. I want to use a variable to filter on the ID column.

DECLARE @MultipleValue varchar(200)
SET @MultipleValue = '1,2'

SELECT * FROM Demo WHERE ID IN (@MultipleValue)

After running this query, I get 0 results and an error:

Msg 245, Level 16, State 1, Line 24
Conversion failed when converting the varchar value '1,2' to data type int.

Why? I know the IDs 1 and 2 are in the table, but SQL is looking at this variable as one string. So unless I have '1,2' in the same ID column, it will show 0 results.
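You can reproduce the same error directly: since the variable is treated as a single scalar value, SQL Server is effectively attempting a conversion like the one below, which fails the same way.

SELECT CONVERT(int, '1,2')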

One way to get around this is to use a UDF (user-defined function). In this function, we’re going to convert the comma separated values (1,2) into a table, then query from that.

CREATE FUNCTION [dbo].[MultipleValues] (@InStr VARCHAR(MAX))
RETURNS @TempTable TABLE
(id int not null)
AS
BEGIN

 -- Make sure the string ends with exactly one comma so the loop
 -- below also picks up the last value
 SET @InStr = REPLACE(@InStr + ',', ',,', ',')
 DECLARE @SP INT
 DECLARE @Value VARCHAR(1000)

 -- Slice off one comma-delimited value per pass
 WHILE PATINDEX('%,%', @InStr) <> 0
 BEGIN
  SELECT @SP = PATINDEX('%,%', @InStr)      -- position of the next comma
  SELECT @Value = LEFT(@InStr, @SP - 1)     -- value before that comma
  SELECT @InStr = STUFF(@InStr, 1, @SP, '') -- strip the value and its comma
  INSERT INTO @TempTable(id) VALUES (@Value)
 END
 RETURN
END
GO

Now that we have a UDF, let’s use this in the query:

DECLARE @MultipleValue varchar(200)
SET @MultipleValue = '1,2'

SELECT * FROM Demo WHERE ID IN (SELECT * FROM dbo.MultipleValues(@MultipleValue))

Ta da!

We now have two results, IDs 1 and 2.

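As a side note, if you’re on SQL Server 2016 or later (with database compatibility level 130 or higher), the built-in STRING_SPLIT function does the same job without a custom UDF:

DECLARE @MultipleValue varchar(200)
SET @MultipleValue = '1,2'

SELECT * FROM Demo WHERE ID IN (SELECT value FROM STRING_SPLIT(@MultipleValue, ','))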

Using a DateTime Expression in an SSIS Destination

Recently, I’ve run into a few situations where I needed to export some SQL data into a CSV on a daily basis. I also needed to keep these CSV files in an archive folder for reference in case a problem came up. Moving the files to an archive folder is not a problem using the “File System” task, but the CSV needs a unique name, otherwise it will just overwrite the previous file. There are numerous ways to make the filename unique, but in this post I’ll describe adding a datetime stamp to each file, which also helps determine when the file was created.

I’ll assume you already have a Data Flow task configured with an OLE DB source and a flat file destination. In this example, I’m going to use the AdventureWorksDW2012 database and copy data from the DimProduct table to a CSV. I already have my OLE DB source configured, and now I’m going to configure the Flat File Destination.

Right click the Flat File Destination and click Edit:


In the Flat File Destination Editor, click New…


Select Delimited and click OK.

I’ve created a blank CSV on my Desktop which we’ll use for now, but this will eventually change:


Click OK, select Mappings, and verify that all the mappings from the source are correct:


Once this is verified, click OK. (For this example, I only want to copy the first 6 columns)

Next, we need to create an expression on the connection string so that it knows to add the datetime stamp. Click on the Flat File Connection Manager. Scroll down the Properties window until you see Expressions:


Click the ellipsis beside Expressions and the Property Expressions Editor will appear:


Under Property, select Connection String and click the ellipsis under Expression:


You should now see the Expression Builder window. Here is where we will define our complete file name, including the file path. For this example, I want to create the CSV in a directory named CSV on the C: drive. The filename that I will use is Products_[datetime stamp].csv, so the final output should be C:\CSV\Products_20150417092331.csv.

The expression I’ll use to achieve this is below:

"C:\\CSV\\Products_" + (DT_STR,4,1252)DATEPART( "yyyy" , getdate() ) + RIGHT("0" + (DT_STR,4,1252)DATEPART( "mm" , getdate() ), 2) + RIGHT("0" + (DT_STR,4,1252)DATEPART( "dd" , getdate() ), 2) + RIGHT("0" + (DT_STR,4,1252)DATEPART( "hh" , getdate() ), 2) + RIGHT("0" + (DT_STR,4,1252)DATEPART( "mi" , getdate() ), 2) + RIGHT("0" + (DT_STR,4,1252)DATEPART( "ss" , getdate() ), 2) + ".csv"

Type your expression in the Expression window of the Expression builder:


Click Evaluate Expression to view the output:


Now that the expression is built, click OK twice. You should now see the expression formula icon beside the Flat File Connection Manager:


If we execute the package and look inside the CSV folder, we should see our new file:


SQL Server Integration Services Connector for SalesForce

Problem:

I have a list of contacts in my SQL Server database that I would like to import into Salesforce, and I’d like to keep Salesforce up to date as new contacts are entered daily into my database. I know there are a few different ways to accomplish this, but I’d like to use an SSIS package so I can automate it.

Solution:

In my opinion, the best (and easiest) way to accomplish this goal is using SSIS packages and a component from CozyRoc called Salesforce Destination. CozyRoc provides advanced SSIS libraries of custom tasks, components and scripts for handling Zip, GZip, BZip2, Tar archives, enhanced script reuse and secure Internet communications. I always go to CozyRoc when I need a custom task. In this tip I’ll show you how to take a contact list from SQL Server and import it into SalesForce. I’ll also show you how to schedule this on a daily basis.

For this example, let’s say I have a contact table in my SQL Server database like the following:


I need to import these records into my Contacts tab in SalesForce:


Create SSIS package

First thing I’ll need to do is download the CozyRoc SalesForce component. This download can be found here: http://www.cozyroc.com/ssis/salesforce-destination

I’ll download and install this component on the server or workstation where I plan on developing the package. Once installed, open SQL Server Data Tools and create an SSIS project:


First, let’s create a connection to the SQL Server database that stores the contact information and the Salesforce destination:

Right click in Connection Manager and create “New OLE DB Connection”:


Type the server name and credentials, and select the database that stores the contact information from the drop down. In this example, I’m selecting my local server and the Demo database:


Test Connection and click OK. Right click on the Connection Manager again and click “New Connection”:


If the Salesforce Destination component was installed properly, you should see a connection called “SFORCE”:


Click Add. Type your UserName and Password + Security token. If you don’t know your security token click here to read more about resetting it.


Once you have entered your login credentials, make sure to test the connection.

You should now have two connections in Connection Managers:


Now on to designing the package:

From the Control Flow, we will add our Data Flow Task and right click it to configure.


Add the OLE DB source and SalesForce Destination tasks to the designer:


Configure OLE DB Source:


To configure the SalesForce Destination, right click it and select Edit. Choose the connection manager you created earlier:


Click the Component Properties tab and you’ll notice a few options under Custom Properties:

Action:

  • Create – Create a new record in the destination object.
  • Update – Update an existing record in the destination object. You must specify the appropriate object record identifier for the update to work.
  • Delete – Delete an existing record from the destination object. You must specify the appropriate object record identifier for the delete to work.
  • Upsert – Update or insert a record in the destination object. Selecting this value displays the dynamic parameter ExternalId.

Batch Size: Specify the number of rows to be sent as a batch. The maximum for regular mode is 200. The maximum for bulk-load is 10,000.

Destination Object: Specify the destination Salesforce object from a dropdown where the data is to be loaded.

ExternalID: Specify the external identifier field for updating the foreign key lookup with an external identifier.

Mode:

  • Regular – Process the data in regular mode.
  • BulkData – Process the data in bulk-load data mode.
  • BulkBinary – Process the data in bulk-load binary mode.


Since we are adding contacts, we’ll select Create as the Action and Contact as the DestinationObject.

Click the Column Mappings tab and make sure all the columns are mapped correctly (delete the mapping between the IDs for this example):


Run the package, and if everything is set up correctly your Salesforce instance should have the new contacts in it:


Run this package as a daily job

To run this package as a job that will populate Salesforce hourly, daily, weekly, etc., we’ll need to create a SQL Server Agent job and set a schedule.
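A rough sketch of what that might look like in T-SQL is below. The job, step, and package names are made up, and the package path will depend on how you deploy, so treat this as a starting point rather than a finished script:

USE msdb;
GO

-- Create the job
EXEC dbo.sp_add_job @job_name = N'Load Salesforce Contacts';

-- Add a step that runs the SSIS package (hypothetical file path)
EXEC dbo.sp_add_jobstep @job_name = N'Load Salesforce Contacts',
 @step_name = N'Run SSIS package',
 @subsystem = N'SSIS',
 @command = N'/FILE "C:\SSIS\SalesforceContacts.dtsx"';

-- Schedule it daily at 6:00 AM
EXEC dbo.sp_add_jobschedule @job_name = N'Load Salesforce Contacts',
 @name = N'Daily 6AM',
 @freq_type = 4,        -- daily
 @freq_interval = 1,    -- every 1 day
 @active_start_time = 060000;

-- Target the local server
EXEC dbo.sp_add_jobserver @job_name = N'Load Salesforce Contacts',
 @server_name = N'(local)';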

Next Steps:

  • Updating Salesforce is a little more complicated and requires a custom field in Salesforce. The video titled “Upsert with external ID” located here will help with this.
  • Check out CozyRoc.com for your custom SSIS needs.

Managing the size of the SQL Server SSIS catalog database

The SSIS catalog is the central point for working with Integration Services (SSIS) projects that you’ve deployed to the SSIS server. When the SSIS catalog is implemented, the SSISDB database is automatically created. The scope of this post will focus on the SSISDB growth more than the SSIS catalog itself.

SSIS projects, packages, parameters, environments, and operational history are all stored in the SSISDB database, so if you have hundreds of SSIS packages, or packages that run every few minutes, you can see how the database storing all of this historical information could grow exponentially.

Also included when you enable this feature is a SQL Server Agent Job called SSIS Server Maintenance job:

Inside this job are two steps, SSIS Server Operation Records Maintenance and SSIS Server Max Version Per Project Maintenance, that will help clean up the database. By default, this job is set to run at 12:00am nightly which is sufficient:

Looking at the first step, SSIS Server Operation Records Maintenance, you will notice that it executes a stored procedure named internal.cleanup_server_retention_window. This sounds like it could be the stored procedure that cleans up history:

Let’s browse out to the stored procedure in Management Studio and take a look at the code:

You can see from the very beginning of the stored procedure, in the BEGIN TRY block, that it first checks whether operation cleanup is enabled, and if so, it looks up the retention window:

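If you’d rather check or change the retention window yourself, the catalog exposes it through catalog.catalog_properties and catalog.configure_catalog. A quick sketch (the 60-day value is just an example; pick what fits your environment):

-- Check the current catalog settings, including RETENTION_WINDOW (in days)
SELECT property_name, property_value
FROM SSISDB.catalog.catalog_properties;

-- Shrink the retention window to 60 days
EXEC SSISDB.catalog.configure_catalog
 @property_name = N'RETENTION_WINDOW',
 @property_value = 60;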
Click here to view the rest of this post.

SQL Server Maintenance Plans Reporting and Logging

This post will focus on the reporting and logging option of maintenance plans.

When a maintenance plan executes, it’s nice to know the results, especially in case of a failure. You can view the results in a few different ways, including the following:

  • Maintenance Plan history
  • SQL Server Agent Job history
  • SQL Server Error Log
  • sp_readerrorlog (see the example after this list)
  • Maintenance Plan Reporting and Logging
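As a quick example of the sp_readerrorlog option above, you can search the error log for specific text. The search string here is just an illustration; pass NULL (or omit it) to return the whole log:

-- Log 0 = current log, type 1 = SQL Server error log,
-- third parameter = text to search for
EXEC sp_readerrorlog 0, 1, N'Maintenance';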

Maintenance Plan Reporting and Logging Options

The Maintenance Plan reporting and logging option is enabled by default, but a lot of DBAs and developers don’t even realize it is an option, much less that it’s enabled.

To configure this option, open a maintenance plan, and on the top bar beside Manage Connections… you’ll notice a little chart/paper icon. It’s not hidden, but it doesn’t jump out at you, and that’s probably why a lot of DBAs don’t pay any attention to it.


If you click the icon you’ll notice there are a few options to choose from:


Let’s go over each one of these:

Generate a text file report

This option allows you to enable or disable the text file report.

Create a new file

This option allows you to create a new report file each time a maintenance plan is executed. Create a new file is the default option, and the default location is the LOG folder that SQL Server is configured to use. You can specify a different location if preferred.

Here is a screenshot of Windows Explorer where a new file is created each execution:


You may notice that if you run a maintenance plan throughout the day, it could quickly fill up your drive with these 1 KB files. Luckily, we don’t have to check the file system and delete these files manually. SQL Server has a task that will automate this for us (see below).

While we are on this point, go ahead and check one of your servers. Check the location of a maintenance plan text file and then check that folder on the server. Or maybe you’ve noticed these files while browsing through your LOG folder and weren’t sure where they came from.

Append to file

This option allows you to create one text file and append the results to that file. This will reduce the number of files, but it will increase the size of the file, and in my opinion it makes the report more difficult to read.

Click here to view the rest of this post.

Managing the Transaction Log

In this post, I’ll discuss some tasks that you can do to manage the transaction log. The transaction log is very important to SQL Server, and entire books have been written on how to manage it, so consider this an introductory post.

Where do I start? The transaction log is an integral part of SQL Server. Every database has a transaction log that is stored within the log file, separate from the data file. The transaction log basically records all database modifications. When a user issues an INSERT, for example, it is logged in the transaction log. This enables the database to roll back or restore the transaction if a failure were to occur, and it prevents data corruption. For example, let’s say Sue is using an application and inserts 2000 rows of data. While SQL Server is processing this data, let’s say someone pulls the plug on the server. (Again, this is just an example; I hope nobody would actually do this.) Because the INSERT statement was writing to the transaction log and SQL Server knows a failure occurred, it will roll back the statement. If this weren’t in place, could you imagine having to sift through the data to see how many rows were inserted and then changing the code to insert the remaining rows? Or even better, what if the application inserted random columns in no order and you had to determine what data was inserted and what data was left out? This could take forever!

The transaction log supports the following:

ROLLBACK TRANSACTION – If a user or application issues the ROLLBACK statement, or if the DB engine detects a failure, the log records are used to roll back the transaction.

Recover Incomplete Transactions – If you have ever started SQL Server after a failure, you may have noticed databases in the (In Recovery) mode:


This is an indication that SQL Server is rolling back transactions that did not complete before SQL Server was restarted, or rolling forward modifications that were recorded in the log but not yet written to the data file.

Rolling a restored DB, file, filegroup, or page forward to the point of failure – If SQL Server were to fail and you need to restore the database to the point at which the failure occurred, you can, as long as you are using the FULL recovery model. Start with a full backup, then apply the latest differential, and then the subsequent transaction log backups up to the point of failure.

High availability solutions – Transactional replication, mirroring, and log shipping all use the transaction log.


Transaction Log Truncation

One thing that I see a lot of administrators ask about is transaction log size and how to truncate it. Log records that are not managed correctly will eventually fill up the disk, preventing any further modifications to the database. Transaction log growth can occur for a few different reasons: long running transactions, incorrect recovery model configuration, and a lack of log backups can all grow the log.

Log truncation frees up space in the log file so the transaction log can reuse it. Unless there is some kind of unexpected delay, log truncation occurs automatically after a checkpoint (if the database is in the SIMPLE recovery model) or after a log backup (if the database is in the FULL or BULK_LOGGED recovery model). MSSQLTips.com offers plenty of tips regarding transaction log truncation.


Recovery Models

I mentioned recovery models above so I’ll give a brief overview of each one…

Full – In the Full recovery model, log files should be backed up on a regular basis to prevent the log from filling up the drive. Also, in the Full recovery model, you can restore a database to the point of failure, depending on your log backup schedule. This is the default for the model database in SQL Server. If you leave this recovery model set to Full, make sure you schedule log backups.

Simple – In the Simple recovery model, transaction log backups cannot be used. This means that SQL Server will automatically reclaim log space at certain intervals, which can be good, but it also means that if a failure were to occur you can only restore back to the last full backup, since all transactions in between are lost. This model is generally used in development environments.

Bulk Logged – In the Bulk Logged recovery model, certain large-scale or bulk copy operations are minimally logged. I have never left a database in the Bulk Logged recovery model for a long period of time. I usually keep a database in the Full recovery model, and if I’m doing a bulk insert and don’t need to log the transactions, I’ll switch to Bulk Logged and then back to Full once the load is complete. Bulk Logged does support log backups, and point-in-time recovery is possible as long as the log backup doesn’t contain any bulk-logged operations.
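Checking and switching recovery models is straightforward. A small sketch using the Demo database from earlier (swap in your own database name):

-- Check the current recovery model of every database
SELECT name, recovery_model_desc
FROM sys.databases;

-- Switch to Bulk Logged before a large bulk operation...
ALTER DATABASE [Demo] SET RECOVERY BULK_LOGGED;

-- ...run the bulk load here...

-- ...then switch back to Full and take a log backup
ALTER DATABASE [Demo] SET RECOVERY FULL;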


Backup the Log File

When using the Full or Bulk Logged recovery models, it is recommended to back up the transaction log regularly. Remember, when the log is backed up, it will automatically truncate.

Before you back up the transaction log, you must have a full backup, otherwise you will get an error:


You can perform a backup using SQL Server Management Studio (SSMS) or the following syntax:

BACKUP LOG [Demo]
TO DISK = N'D:\Backups\Demo.bak'

Monitoring the Log

Monitoring the log file is very important, and SQL Server has made it fairly easy for us to do this. One way to find information about the log is the catalog view sys.database_files. This view returns information about data and log files, including type of file, name, location, state, size, growth, etc. The following query filters down to only the log file and displays some very useful information:

SELECT name AS [File Name],
 physical_name AS [Physical Name],
 size/128.0 AS [Total Size in MB],
 size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)/128.0 AS [Available Space In MB],
 [growth],
 [file_id]
FROM sys.database_files
WHERE type_desc = 'LOG'

You can also use DBCC SQLPERF('logspace'), which has been around for a while. This command displays useful details such as database name, Log Size (MB) and Log Space Used (%):


Controlling the growth of the transaction log can help avoid performance issues. To change the growth increment, use the FILEGROWTH option. A value of 0 indicates that automatic growth is off and no additional space is permitted. A small autogrowth increment on a log file can reduce performance, so the growth increment should be large enough to avoid frequent expansion. The default growth increment of 10 percent is generally suitable.
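As an example, here’s how you might set a fixed growth increment on a log file. The logical file name Demo_log is an assumption; check sys.database_files for the actual name:

-- Set a fixed 512 MB growth increment on the log file
ALTER DATABASE [Demo]
MODIFY FILE (NAME = N'Demo_log', FILEGROWTH = 512MB);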

As you can see, the transaction log is used throughout SQL Server, and the information above will get you started managing the transaction log. There are multiple items that I did not cover in this tip, including:

  • Modifying the log file
  • Shrinking the log file
  • Adding/Removing a log file
  • How a log file is used in High Availability solutions
  • How to restore to a point in time

Creating SQL Server performance based reports using Excel

I’m not a big fan of Performance Monitor, but in this post we will review how to export its results so they can be read more easily in Excel. This post will not discuss how to set up a data collector set and will assume the reader already has results saved to the file system in the .blg file format.

If you’ve ever used PerfMon, you’ve probably noticed .blg files. These are the files that hold all of the performance data collected, say, over the past week. When you open one of these files in PerfMon, it can be tedious work to get the data into a presentable form. Actually, I don’t know if there’s a good way to get this data into a presentable form using PerfMon’s GUI.


The first thing we need to do is convert the .blg to .csv so we can open it in Excel. To convert the file, open a command prompt, navigate to the directory where the file is located, and type:

Relog SQL_BASELINE_20140128.blg -f CSV -o NewFile.csv
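As a side note, relog can also filter the output down to specific counters with its -c switch, which keeps the CSV small when the capture contains more counters than you need. A hypothetical example:

Relog SQL_BASELINE_20140128.blg -c "\Processor(_Total)\% Processor Time" -f CSV -o NewFile.csv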

Once the command completes successfully, you should see the new file in the same directory:


Open NewFile.csv in Excel:


Not pretty, huh? The first thing we need to do is format the first column, which is the date column. Change the text in the A1 cell from PDH-CSV 4.0… to DateTime:


Next, remove Row 2; we do not need this data. Also, highlight column A, right click anywhere in the column, and choose Format Cells:


On the Number tab, click the Date category and select 3/14/2012 1:30 PM:


Press Ctrl+Home to select the A1 cell.

Create Pivot Chart

Click on the Insert tab and select Pivot Chart:


The Pivot Chart dialog box should automatically select the correct cells to analyze. If not, make sure all of the cells are selected:


Once the new worksheet opens, drag DateTime to the Axis Fields pane:


Now, depending on what counter(s) you want to analyze, drag it down to the Values pane. For this example, I’ll analyze Processor(_Total)\% Processor Time:


Ta da! You now have a graph that displays your Processor %.


To make this presentable, simply format the graph to your liking and you’re done.

Identify SQL Server databases that are no longer in use

I have come across this problem a few different times in my career. I’ll change jobs, and there will be numerous database servers that I inherit and know nothing about. It’s a process to learn what each server does, what applications use them, and which databases are no longer used and can be removed.

There is no “tried and true” method to knowing if a database is truly no longer used, but I have three different suggestions that may help with your research. These suggestions are all based around capturing user connections.

SQL Server User Connection Count

One suggestion for finding orphaned databases is to get connection counts. In most cases, if a database has zero user connections over a long period of time, it may be time to look into removing it. The following query captures the server name, database name, number of connections, and the time the query was executed, and it also filters out the system databases because they are needed:

SELECT @@ServerName AS server
 ,NAME AS dbname
 ,COUNT(STATUS) AS number_of_connections
 ,GETDATE() AS timestamp
FROM sys.databases sd
LEFT JOIN sysprocesses sp ON sd.database_id = sp.dbid
WHERE database_id NOT BETWEEN 1 AND 4
GROUP BY NAME


I’m using a server named BUADMIN for this example. As you can see, I have 3 active connections to the SQLCensus database. This is a good indication that this database is in use. MonitorDB and SSISDB have 0 connections, so I may need to monitor them further. The easiest way to monitor these databases is to create a stored procedure using this query so it can be scheduled. You can also put this query directly into a SQL Server Agent job and set a schedule.

Before setting a schedule, you will need to create a table to hold the results. Create the table using the following code:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Connections](
 [server] [nvarchar](130) NOT NULL,
 [name] [nvarchar](130) NOT NULL,
 [number_of_connections] [int] NOT NULL,
 [timestamp] [datetime] NOT NULL
) ON [PRIMARY]
GO

Next, create a stored procedure that will INSERT the results into the table:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE usp_ConnectionsCount 
AS
BEGIN
 SET NOCOUNT ON;
INSERT INTO Connections 
  SELECT @@ServerName AS server
 ,NAME AS dbname
 ,COUNT(STATUS) AS number_of_connections
 ,GETDATE() AS timestamp
FROM sys.databases sd
LEFT JOIN master.dbo.sysprocesses sp ON sd.database_id = sp.dbid
WHERE database_id NOT BETWEEN 1
  AND 4
GROUP BY NAME
END

Once the stored procedure is created, you can create a SQL Server Agent job and set it to run on a schedule. I’ll set it to run every 10 minutes.
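If you prefer to script the job rather than click through the Agent UI, a sketch is below. The job name is made up, and it assumes usp_ConnectionsCount lives in a database called DBA, so adjust both for your environment:

USE msdb;
GO

EXEC dbo.sp_add_job @job_name = N'Capture Connection Counts';

EXEC dbo.sp_add_jobstep @job_name = N'Capture Connection Counts',
 @step_name = N'Run usp_ConnectionsCount',
 @subsystem = N'TSQL',
 @database_name = N'DBA',   -- wherever you created the stored procedure
 @command = N'EXEC dbo.usp_ConnectionsCount;';

EXEC dbo.sp_add_jobschedule @job_name = N'Capture Connection Counts',
 @name = N'Every 10 minutes',
 @freq_type = 4,            -- daily
 @freq_interval = 1,
 @freq_subday_type = 4,     -- units of minutes
 @freq_subday_interval = 10;

EXEC dbo.sp_add_jobserver @job_name = N'Capture Connection Counts',
 @server_name = N'(local)';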

Let this run a few days, a few months, or however long you think is appropriate, and then go back and examine the results. Once you are happy with the timeframe chosen, use the following query to select the MAX number of connections per database:

SELECT NAME
 ,MAX(number_of_connections) AS MAX#
FROM Connections
GROUP BY NAME


From here you will be able to determine whether any databases have not had a user connection in the timeframe specified.
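Building on the same Connections table, a small follow-up query can list the candidates directly, i.e. databases that never showed a single user connection during the capture window:

SELECT NAME
FROM Connections
GROUP BY NAME
HAVING MAX(number_of_connections) = 0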

Detailed SQL Server Connection Information

The above suggestion is good if you just need connection counts. However, sometimes a count isn’t good enough; sometimes you need to know exactly what is connecting. This suggestion helps in that respect.

It’s basically set up the same way: create a stored procedure, insert data into a table, set a schedule, and examine the results.

The following query gives you more information:

SELECT @@ServerName AS SERVER
 ,NAME
 ,login_time
 ,last_batch
 ,getdate() AS DATE
 ,STATUS
 ,hostname
 ,program_name
 ,nt_username
 ,loginame
FROM sys.databases d
LEFT JOIN sysprocesses sp ON d.database_id = sp.dbid
WHERE database_id NOT BETWEEN 0
  AND 4
 AND loginame IS NOT NULL

Click here to view the rest of this post.