Tuesday, June 12, 2018

Using Symbolic Links with SSMS templates

For years I have used the template browser in SQL Server Management Studio to store commonly used queries. I created a sub-folder prefixed with 'aa_' so it is listed first, then created more sub-folders under that to store my various queries. Those queries are then available on every server I am connected to. This keeps me from having to remember whether I stored the latest version in Confluence, on my laptop, or on a network share. They are always available right from SSMS.
If you haven't used SSMS templates before, go to View → Template Explorer to bring up the browser. Then you can create a folder structure there.

Now, if you are like me, you have more than one edition of SSMS installed on your laptop. Currently I have SSMS 2014, 2016, and 2017. The different versions of SSMS use different folders to store templates, so with each new version I have to copy my aa_Rutherford folder from the old template location to the new one. The problem is that the copies then fall out of sync: if I add folders or files in one version of SSMS, the changes don't show up in the others.
I solved this problem using something called symbolic links. If you have used Linux before you are probably familiar with these, as the operating system uses them quite a bit. Basically, a symbolic link is a file system object that points to another, real file system object. You can create a folder in one directory that is a link to a folder somewhere else on your machine; when you browse to that folder, you see the files and folders that exist in the other directory.
This might be a bit confusing, so let's see an example. The command is: mklink /D [symlink_name] [C:\Path\To\Actual_folder]
Suppose we have a folder named aa_Rutherford located at C:\JRTemp and we want to create a symlink to it at C:\JR_NewTemp. First, launch a command prompt as administrator, then:
  1. Use 'CD' to change to the new directory
  2. We can use the 'DIR' command to see that it is empty
  3. Run our 'MKLINK' command to create the symlink
  4. Run 'DIR' again to see the new symlink and where it is pointing.
  5. Then run 'DIR' on the symlink and you can see the files that are inside the destination location.
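The same walkthrough is easy to try on Linux, where `ln -s` plays the role of `mklink /D`. Here is a sketch of the five steps above using made-up temporary directories (not the real SSMS paths):

```shell
# Stand-ins for the real folders; mktemp -d just gives us scratch directories.
real_dir=$(mktemp -d)       # plays the part of C:\JRTemp\aa_Rutherford
link_parent=$(mktemp -d)    # plays the part of C:\JR_NewTemp
echo 'SELECT 1' > "$real_dir/demo.sql"

cd "$link_parent"                # step 1: change to the new directory
ls                               # step 2: confirm it is empty
ln -s "$real_dir" aa_Rutherford  # step 3: create the symlink (mklink /D equivalent)
ls -l                            # step 4: the listing shows the link and its target
ls aa_Rutherford                 # step 5: the files inside the real directory appear
```

Browsing into `aa_Rutherford` here shows `demo.sql`, even though the file physically lives in the other directory.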
Here are the locations of the template folders for each version of SSMS:
2014 - C:\Users\[username]\AppData\Roaming\Microsoft\SQL Server Management Studio\12.0\Templates\Sql\
2016 - C:\Users\[username]\AppData\Roaming\Microsoft\SQL Server Management Studio\13.0\Templates\Sql\
2017 - C:\Users\[username]\AppData\Roaming\Microsoft\SQL Server Management Studio\14.0\Templates\Sql\
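The fix, then, is to keep one real aa_Rutherford folder and point every version's template location at it with a symlink. As a hedged sketch of that layout on Linux (illustrative paths only, with `ln -s` standing in for `mklink /D`):

```shell
# One real folder holds the templates; each "version" folder gets a symlink to it.
shared=$(mktemp -d)    # the single real aa_Rutherford folder
v13=$(mktemp -d)       # stands in for the 13.0 Templates\Sql folder
v14=$(mktemp -d)       # stands in for the 14.0 Templates\Sql folder

ln -s "$shared" "$v13/aa_Rutherford"
ln -s "$shared" "$v14/aa_Rutherford"

# A template saved through one version lands in the shared folder,
# so it is immediately visible from the other version as well.
echo 'SELECT @@VERSION' > "$v13/aa_Rutherford/version.sql"
cat "$v14/aa_Rutherford/version.sql"
```

Add a file through either link and it shows up in both, which is exactly the out-of-sync problem solved.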

Tuesday, May 15, 2018

Windows 10, Credential Guard and Linked Servers

Currently we are looking into upgrading the user desktops and laptops to Windows 10. During testing we ran into an issue using linked servers: any time you attempted to run a query against a linked server, you would get an Anonymous Logon error.
On the machine we were attempting to run the query against there would be a similar error in the log:
     Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'. Reason: Could not find a login matching the name provided.
Normally when I see the Anonymous Logon error I immediately think SPN issue, but this message is different from the one I usually see because of the added "Could not find a login matching the name provided". Also, we had no issue running the same linked server query from a Windows 7 machine.

After some fruitless googling, we came across an MSDN article that led us to the culprit: Credential Guard. (Here is the link to the article)

Credential Guard and Linked Servers

Credential Guard is a feature new in Windows 10 that aims to "isolate and harden key system and user secrets against compromise". Here are a few links to some Microsoft documentation if you want to read up on it.
The third link gave us the information we needed to figure out what was happening in our environment. According to the article, Credential Guard will break applications that require Kerberos unconstrained delegation. This means that if you enable delegation for a service account and do not set it to use specified services only, your linked server connections will fail whenever you are not connecting directly from the server (the classic double-hop scenario).
If we look at the delegation page of a service account in Active Directory we can see the difference between constrained and unconstrained delegation. In our environment, we chose "Trust this user for delegation to any service". This is unconstrained delegation: the service account can act on behalf of another user to any server in our environment.
To change this to constrained delegation you select the radio button for "Trust this user for delegation to specified services only", but then you must create an entry for every host and service that the account can connect to.
This might be a valid solution if you run a shop with only a few servers and rarely add linked servers. If you are constantly building new servers and adding linked servers to the environment this becomes unmanageable. Especially if you are in an environment where the people who create linked servers and the people who are in charge of creating SPNs and configuring delegation are not part of the same team.

Solutions?

So how do we get around this? I mentioned constrained delegation, which might be suitable for some people, but even on the MSDN post mentioned at the beginning of this article, the poster was only able to get it working on 9 of his 12 servers. I have also read that this solution might not work properly with availability group listener names, cluster names, or CNAMEs, but I have not had the opportunity to test those.
There was also mention, in the MSDN post, of setting up a privileged access workstation, or PAW (see this link), but users are not going to want to connect to a secondary workstation every time they want to use SSMS.
Another option is to disable Credential Guard (see this link). This is probably the workaround most people will start with, but you will have to work with your security team to see if they are willing to disable this built-in protection long term.
I was surprised to find only a few forum posts dealing with this, as Windows 10 has been out for a few years and, while linked servers are generally frowned upon, they are still widely used.
Has anyone else run into this issue? Post in the comments if you have any additional solutions.

Monday, March 5, 2018

Getting Replication Snapshot Delivery Times


We use transactional replication quite heavily in our environment, with multiple publishers, distributors, and subscribers. Currently we have three subscribers overseas: one in London and two in Sydney. When we were setting up the second subscriber in Sydney I was asked how long it would take. My answer was, "More than an hour but less than two days." Really, I had no idea; we had only ever reinitialized a publication or two out to Sydney, never tried to set up all sixty-four international subscriptions and initialize them at once. I started setting up replication around midnight on Friday (from scripts, of course; you never want to use the GUI for a go-live event) and it finished around 10 PM Saturday. Afterwards, I decided I wanted to know which publications took the longest.
Normally I figure out the times by opening Replication Monitor, expanding the publication, right-clicking the subscription, and selecting "View Details". Then I look for the following lines.

First I look for the start time:

Then I look for the completion event:

I grab the dates and times from these two events and, using a little math (or Excel), I can figure out how long the snapshots took to deliver.
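The math itself is just a timestamp subtraction. As a quick sketch with made-up start and completion times (not real values from my environment), the same calculation can be done at a shell prompt with GNU date:

```shell
# Hypothetical timestamps copied from the two Replication Monitor events
start='2018-02-24 01:10:00'
end='2018-02-24 03:25:00'

# Convert each to epoch seconds, subtract, and express as minutes
start_s=$(date -d "$start" +%s)
end_s=$(date -d "$end" +%s)
minutes=$(( (end_s - start_s) / 60 ))
echo "Snapshot delivery took $minutes minutes"   # 2h15m = 135 minutes
```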

So, I opened a new Excel sheet, created a row for the first publication, and started logging the data. Then I looked down the page and realized I did not want to repeat this process sixty-three more times. I figured the data in Replication Monitor had to be stored somewhere on one of the servers involved, and there must be a way to query it. I assumed (correctly) that I could get the information from the distributor, and after poking around I found the tables I needed. Below is the completed query:


USE distribution_Research
GO
DECLARE @mysub sysname,
    @mystart DATETIME
-- Variables - change these you will
SET @mysub = 'SubscriberDB-PRD'
SET @mystart = '2018-02-24 00:00:00.000'
-- Query - touch these you will not
SELECT DISTINCT pub.name,
    ms.publisher_db,
    mp.publication,
    sub.name,
    ms.subscriber_db,
    msh.start_time,
    msh.time AS endtime,
    CAST((msh.time - msh.start_time) AS TIME) AS duration,
    msh.comments
FROM dbo.MSdistribution_history AS msh
JOIN dbo.MSsubscriptions AS ms
    ON ms.agent_id = msh.agent_id
JOIN master.sys.servers AS pub
    ON ms.publisher_id = pub.server_id
JOIN master.sys.servers AS sub
    ON ms.subscriber_id = sub.server_id
JOIN dbo.MSpublications AS mp
    ON ms.publication_id = mp.publication_id
WHERE 1=1
AND sub.name = @mysub
--AND ms.subscriber_db = 'Returns'
AND msh.start_time > @mystart
AND (msh.comments LIKE 'Delivered%')
ORDER BY mp.publication, msh.start_time DESC


Using this I was able to get the times for all the publications at once and throw them into Excel. I combined the data with some row counts and table sizes and was able to make some pretty nice reports for my boss. Hopefully this will help someone else do the same.

Friday, February 16, 2018

Getting Started with Dbatools




When I started using PowerShell with SQL Server I used a module from CodePlex named SQLPSX. The module provided quite a few functions that made working with SQL Server easier. But today CodePlex is gone, and I don't think anyone has updated SQLPSX in years; you can still get it on GitHub, but some of the folders have update dates like "3 years ago". Thank goodness for dbatools (https://dbatools.io/). This year I attended a pre-conference session at SQL PASS Summit run by three people on the team, and their module does everything I was using SQLPSX for and more. Click the link above and go to the download page for installation instructions, then use this page as a quick primer on some of the things you can use it for.

Getting Data In and Out of SQL

One of the things I use PowerShell for is to gather information on all my database servers and store that data in my central DBAAdmin database. Dbatools has a function, Invoke-DbaSqlQuery, that can both read and write to the database (SQLPSX used two different functions for this). In this first example I use the function to get a list of all servers from my server table.
Import-Module dbatools
$server = 'DBAAdminDB-PRD'
$db = 'DBAAdmin'
$qry = 'select Server_name from dbo.vw_ServerListing'
Invoke-DbaSqlQuery -SqlInstance $server -Database $db -Query $qry
Very easy; no need to load .NET assemblies or create objects like you do when using SMO. The first line imports the module, then I create some variables and finally execute the statement. I could have done the whole thing in two lines by not using any variables.
Next, let's use the function to insert some data into a SQL Server table. Suppose my server table has only three columns: Server_name, Server_Env, and Description.
Import-Module dbatools
$server = 'DBAAdminDB-PRD'
$db = 'DBAAdmin'
$qry = "insert into dbo.server (Server_Name, Server_Env, Description)
         VALUES ('SERVER1','PRD','Mission Critical Server')"
Invoke-DbaSqlQuery -SqlInstance $server -Database $db -Query $qry
Again, very simple. Now let's look at some more interesting things you can do with the module.

Getting Information about your Server and Databases

Here are a few commands you can use to get information about your SQL Server instance and its databases.
Import-Module dbatools
Get-DbaSqlInstanceProperty -SqlInstance TESTSERVER | Select-Object Name, Value
This command lists instance-level information such as build number, edition, etc. You can see the output here:


Now let's get some information on the databases on the server.
Import-Module dbatools
Get-DbaDatabase -SqlInstance TESTSERVER
Here is the output


And finally, let's get some information about the data files of one of the databases.
Import-Module dbatools
Get-DbaDatabaseFile -SqlInstance TESTSERVER -Database admin


These are just some very basic things you can do with dbatools. There are over 300 commands in the module that you can use for gathering information, backing up SQL Server, or even for server migrations. PowerShell is a powerful language that is being used more and more by SQL and Windows admins, and modules like dbatools are a great way to start using it easily and effectively.