John McCormack DBA

SQL Server Databases and Cloud


John McCormack's blogs

When should I stripe SQL Server database backups

23rd March 2020 By John McCormack

What is a striped backup?

First of all, a definition of what we mean by striping SQL Server database backups: striping a backup simply means the overall backup is distributed across several files, often on different drives. See Microsoft’s description for more detail.

What are the benefits?

  1. Time
    1. Writing to multiple files can save time because the files can be written to the file system in parallel. This is useful if your backups are taking too long and you need them to complete faster.
  2. Storage
    1. If a backup drive is getting full, you can split your backup across multiple drives, saving space on each one. The total space used will be similar to a single-file backup, though you may see slightly more overall space taken up.

Striped vs Single File

It’s a balancing act. My advice is to stick with the defaults until you have an issue or want something to perform better; then it’s a good time to experiment and see how you get on. As a rule, the bigger your backups become, the more useful you may find striping them.

How do I stripe my backups?

We will discuss T-SQL and PowerShell. For PowerShell, I’m using the dbatools module. dbatools covers far more scenarios than Microsoft’s own module for SQL Server and is used and tested all day, every day. If you wish to run the PowerShell commands below, you will need to have dbatools installed. The T-SQL commands all run natively in SSMS.

T-SQL

-- Split file across two drives as not enough space on E:
BACKUP DATABASE StackOverFlow2010
TO DISK = N'E:\Backup\StackOverFlow2010\StackOverFlow2010_1.bak',
DISK = N'F:\Backup\StackOverFlow2010\StackOverFlow2010_2.bak'
WITH COMPRESSION;

T-SQL (Using Ola Hallengren’s DatabaseBackup procedure)

EXECUTE dbo.DatabaseBackup
@Databases = 'StackOverFlow2010',
@Directory = 'E:\Backup\StackOverFlow2010\,F:\Backup\StackOverFlow2010\',
@BackupType = 'FULL',
@Compress = 'Y',
@NumberOfFiles = 2

PowerShell (dbatools)

Backup-DbaDatabase -SqlInstance localhost -Database StackOverflow2010 -Path E:\Backup\StackOverflow2010\,F:\Backup\StackOverflow2010\ -CompressBackup

Restoring striped backups

To restore a striped backup, you need every file in the stripe set; a missing file means the restore will fail. It is therefore worth testing your backups and restore scripts every so often. A backup that cannot be restored is not really a backup.

T-SQL

RESTORE DATABASE StackOverFlow2010
FROM DISK = N'E:\Backup\StackOverFlow2010\StackOverFlow2010_1.bak',
DISK = N'F:\Backup\StackOverFlow2010\StackOverFlow2010_2.bak'

PowerShell (dbatools)

Restore-DbaDatabase -SqlInstance localhost -DatabaseName StackOverflow2010 -Path E:\Backup\StackOverflow2010\,F:\Backup\StackOverflow2010\ -WithReplace
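
Whichever method you use, it is worth a quick T-SQL sanity check that the striped set is complete and readable before you rely on it. RESTORE HEADERONLY and RESTORE VERIFYONLY both accept the full list of stripe files; the sketch below reuses the example paths from above. Note that VERIFYONLY is not a substitute for an actual test restore.

-- List the backup header information for the striped set
RESTORE HEADERONLY
FROM DISK = N'E:\Backup\StackOverFlow2010\StackOverFlow2010_1.bak',
DISK = N'F:\Backup\StackOverFlow2010\StackOverFlow2010_2.bak';

-- Check that the set is complete and the files are readable
RESTORE VERIFYONLY
FROM DISK = N'E:\Backup\StackOverFlow2010\StackOverFlow2010_1.bak',
DISK = N'F:\Backup\StackOverFlow2010\StackOverFlow2010_2.bak';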

Filed Under: front-page, Guides, PowerShell, T-SQL

Data Glasgow User Group

13th February 2020 By John McCormack

I spoke at the relaunch of the Glasgow Data User Group (formerly the Glasgow SQL Server User Group).

It was a great event, hosted by Craig, Louise and Robert. My presentation was on some useful free community tools which can make your life as a DBA much easier. It shows that DBAs with zero budget for expensive software still have a wide range of free tools to make their lives easier. Moreover, even with a large budget, some of these free tools are still the best in their class, so they are worth reviewing.

Slides for tonight’s talk on SQL Server Community Tools are available on GitHub.


Filed Under: front-page

Change a user’s resource class within Azure Synapse Analytics using Azure Automation runbooks

17th December 2019 By John McCormack

Change a user’s resource class within Azure Synapse Analytics | A solution

This post shows a method for allowing developers to change a user’s resource class within Azure Synapse Analytics | Azure SQL Data Warehouse without granting them membership of the db_owner fixed database role.

Scenario:

I was approached by a data warehouse developer who wanted to take advantage of a feature in Azure Synapse Analytics (formerly Azure SQL Data Warehouse) called resource classes. Resource classes allow users in certain roles to have access to extra compute resources for their queries. This can increase their max memory per query (although there is an overall trade-off on concurrency). Unfortunately, because this is managed via database roles, the person looking to change resource classes needs to be in the db_owner role so they can manage role membership. As I didn’t want to add the developer to db_owner, a different solution was required.
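
For context, resource classes (such as smallrc, mediumrc, largerc and xlargerc) are implemented as database roles, so moving a user between them is just a role membership change like the sketch below, and that is exactly the operation which normally needs db_owner rights. The user name here is hypothetical.

-- Give a user the largerc resource class (a role membership change)
EXEC sp_addrolemember @rolename = 'largerc', @membername = 'dw_developer';

-- And move them back down again when finished
EXEC sp_droprolemember @rolename = 'largerc', @membername = 'dw_developer';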

Solution:

A separate SQL user can be added to the db_owner role on the SQL Data Warehouse in question. We can then create a credential in Azure Automation based upon this user. Finally, we create a runbook which executes sp_addrolemember (but only allows a specific few roles related to resource classes). This means we can give an end user the ability to add/remove people by granting them access to run the runbook.

How to:

Create SQL login/user

-- Create sql login (run in master)
CREATE LOGIN [dev_dba_automation] WITH PASSWORD = 'S3cur£P@ssword!'

-- Change into DWH context, not master
CREATE USER [dev_dba_automation] WITH DEFAULT_SCHEMA=[dbo]

-- Add the new user to db_owner
-- (ALTER ROLE X ADD MEMBER Y doesn't currently work with Azure SQL DWH)
EXEC sp_addrolemember 'db_owner', 'dev_dba_automation'
Create Azure Automation Credential

From your automation account, find the Credentials option (under Shared Resources) in the left-hand menu.

Add Credential To Azure Automation

Then create your new credential, using the newly created SQL login name as the user name, and add the correct password.

Add New Credential To Azure Automation

Please note that if the end user has privileged access to Azure Automation, with a role such as Owner or Contributor, they could use the credential to carry out other tasks in the data warehouse, so please be careful about who can create new runbooks.

Create Runbook
  • Log into Azure Automation and create a new runbook.
  • Copy in the PowerShell code below.
<#
.DESCRIPTION
This runbook allows the person running it to add a database user to a
database role on dev-datawarehouse

.PARAMETER Database
String name of the SQL Server database to connect to

.PARAMETER databaserole
String name of the database role you wish to add a member to

.PARAMETER rolemember
String name of the user you wish to add to the database role

.PARAMETER add_or_drop
String, must = 'add' for sp_addrolemember or 'drop' for sp_droprolemember

.NOTES
AUTHOR: John McCormack
LASTEDIT: Dec 16, 2019
#>

param(

[parameter(Mandatory=$True)]
[string] $Database,

[parameter(Mandatory=$True)]
[string] $databaserole,

[parameter(Mandatory=$True)]
[string] $rolemember,

[parameter(Mandatory=$True)]
[string] $add_or_drop
)

# Variables
$SqlCredentialAsset = 'dev_az_automation'
$SqlServer = 'dev-datawarehouse.database.windows.net'
$SqlServerPort = 1433
$SqlCredential = Get-AutomationPSCredential -Name $SqlCredentialAsset

if ($SqlCredential -eq $null)
{
throw "Could not retrieve '$SqlCredentialAsset' credential asset. Check that you created this first in the Automation service."
}
# Get the username and password from the SQL Credential
$SqlUsername = $SqlCredential.UserName
$SqlPass = $SqlCredential.GetNetworkCredential().Password

# Define the connection to the SQL Database
$Conn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$SqlServer,$SqlServerPort;Database=$Database;User ID=$SqlUsername;Password=$SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")

# Open the SQL connection
$Conn.Open()

# Build the sp_addrolemember / sp_droprolemember command to run as the credential user
if($add_or_drop -eq "add")
{
$Cmd=new-object system.Data.SqlClient.SqlCommand("EXEC sp_addrolemember @rolename = '$databaserole', @membername = '$rolemember'", $Conn)
}
else
{
$Cmd=new-object system.Data.SqlClient.SqlCommand("EXEC sp_droprolemember @rolename = '$databaserole', @membername = '$rolemember'", $Conn)
}
$Cmd.CommandTimeout=120

if ($databaserole -Like "*rc*")
{
# Execute the SQL command
$Ds=New-Object system.Data.DataSet
$Da=New-Object system.Data.SqlClient.SqlDataAdapter($Cmd)
[void]$Da.fill($Ds)

# We should look at adding some logging for this
WRITE-OUTPUT "Ran '$add_or_drop' for $rolemember on database role: $databaserole."
}
else
{
WRITE-OUTPUT "Invalid role"
}

# Close the SQL connection
$Conn.Close()
  • Run the runbook and observe the output.
Further reading
  • https://docs.microsoft.com/en-us/azure/sql-data-warehouse/resource-classes-for-workload-management#example-code-for-finding-the-best-resource-class
  • https://docs.microsoft.com/en-us/sql/t-sql/statements/alter-role-transact-sql?view=sql-server-ver15#examples
  • https://gallery.technet.microsoft.com/scriptcenter/How-to-use-a-SQL-Command-be77f9d2
    • (Base script I adapted)
Other Azure Content
  • https://johnmccormack.it/2019/03/put-tempdb-files-on-d-drive-in-azure-iaas/

Filed Under: Azure Automation, Azure SQL Data Warehouse, Azure Synapse Analytics Tagged With: azure, azure automation, azure sql data warehouse, azure synapse analytics

3 things a new DBA should learn first

6th November 2019 By John McCormack

3 things a new DBA should learn first

Why would I put a post together about the 3 things a new DBA should learn first? Well, I was asked by a colleague what I thought she should be focusing on learning and mastering as a newish SQL DBA. She was looking to become more experienced and confident in her role. Obviously this is an opinion piece, but I felt it was worth sharing. The 3 suggestions are not in any particular order, and of course I’m sure not everyone will agree, but it’s an opinion based on my own experience as a SQL DBA. (Feel free to disagree in the comments; I’m sure people will feel strongly about some items I’ve omitted.)

  • What is happening on the server right now
  • Backup and restore of SQL Databases
  • Scripting, i.e. T-SQL and PowerShell

What is happening on the SQL Server right now

Help, the SQL Server is on fire and my really important process is not working. Why is the server so slow?

This scenario will happen to you. Invariably, the person (often a manager) will stand over you and expect you to knock out one or two lines of T-SQL wizardry to get things running smoothly again. First of all, I should say that in a perfect world, you would ask them to raise a ticket and work on it according to its priority against your other tasks. Then you could let them know what the issue was and what can be done to prevent it happening again. But we rarely work in a perfect world. In this scenario, you need one primary line of T-SQL to get started.

EXEC sp_whoisactive

The procedure sp_whoisactive doesn’t come with SQL Server. It is a community script created by Adam Machanic, and you can download it from GitHub. If you don’t have it installed on your SQL Servers, it is something you should really consider, as it gives much more useful and readable information than sp_who2, and it’s a lot easier than pulling together your own code. It’s mature, very safe and has been installed on SQL Servers worldwide since 2007.

sp_whoisactive offers loads of optional parameters that allow you to customise and sort the output according to your own preference, but just running the proc on its own, without parameters, will give you an ordered list of everything that is executing at that point in time (ordered by duration descending). If you see things running for minutes that usually take seconds, maybe you need to see if they are blocking other transactions.

example output from sp_whoisactive
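
As a flavour of those optional parameters, this is the sort of call I use when I want a bit more detail than the default output (both parameters are part of sp_whoisactive):

-- Include estimated execution plans and extra session detail in the output
EXEC sp_whoisactive @get_plans = 1, @get_additional_info = 1;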

One parameter I find really useful during an incident is @find_block_leaders. By running EXEC sp_whoisactive @find_block_leaders = 1, you can see exactly how many sessions are being blocked by each blocking session. In the example below, you can see that an INSERT transaction in session_id 52 is blocking 5 other sessions. Each of these is trying to read from the table with an open insert transaction, so they are blocked. You either need to wait for 52 to finish, or you need to kill it in order for the other transactions to move on.

example output from sp_whoisactive @find_block_leaders

A quick note on killing spids: I really only recommend this if you know what the process is and you have an idea of how long it will take to roll back. (Remember, those other 5 spids are still blocked until the rollback completes, and rollback is a single threaded process.)
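
If you do decide to kill the blocking session, KILL can also report on the rollback afterwards. A minimal sketch, using session_id 52 from the example above:

-- Kill the blocking session (only once you understand what it is doing)
KILL 52;

-- Check rollback progress and the estimated time remaining
KILL 52 WITH STATUSONLY;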

Of course, it might not be blocking and in that case, you will need more scripts to analyse what is running regularly, how queries are performing and if any optimisations are needed. This will need to be in another blog as I want to keep this post fairly succinct.

Backup and restore of SQL Databases

Knowing how to back up a database is an essential skill of a DBA, certainly one of the top required skills. Equally important is knowing how to restore your database backups. This is something you should regularly practice to ensure that when/if the time comes that you need to act quickly, you are well rehearsed. Trying to restore a database to a point in time in the middle of a P1 emergency is the stuff of nightmares if you haven’t done it before.
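
To give a flavour of what that rehearsal looks like, a point in time restore is a chain of full (and optionally differential) restores followed by log restores, with only the final step bringing the database online. A rough sketch with hypothetical database, file names and times:

-- Restore the most recent full backup, leaving the database in a restoring state
RESTORE DATABASE Orders
FROM DISK = N'E:\Backup\Orders\Orders_FULL.bak'
WITH NORECOVERY, REPLACE;

-- Restore each log backup in sequence, stopping at the required point in time
RESTORE LOG Orders
FROM DISK = N'E:\Backup\Orders\Orders_LOG_1.trn'
WITH NORECOVERY, STOPAT = '2019-11-06 14:55:00';

-- Bring the database online once the chain is complete
RESTORE DATABASE Orders WITH RECOVERY;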

Learn the different types of backups available and how often you should be doing each of them. This will vary depending on your business needs. Even on the same instance, some databases may need point in time recovery and others won’t. e.g. It might be fairly acceptable to back up your small master database once per day, but you cannot afford to lose more than 5 minutes of orders in the event of your instance going down. In the case of your orders database, you will need a transaction log backup every 5 minutes. Depending on the size of your database, you will be looking at a combination of regular full and differential backups, or just regular full backups (as well as your transaction log backups of course).
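
The commands behind that schedule are simple; it’s the scheduling (SQL Agent jobs, Ola Hallengren’s scripts or similar) that enforces the cadence. A rough sketch for the hypothetical orders database above, which needs to be in the FULL recovery model for log backups:

-- Nightly full backup
BACKUP DATABASE Orders
TO DISK = N'E:\Backup\Orders\Orders_FULL.bak'
WITH COMPRESSION;

-- Transaction log backup, scheduled every 5 minutes
-- (in a real job, each log backup would get a unique file name)
BACKUP LOG Orders
TO DISK = N'E:\Backup\Orders\Orders_LOG.trn'
WITH COMPRESSION;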

If you are on a PaaS database such as Azure SQL Database or AWS RDS, the backups can be done automatically without any administration effort, but you will still want to practice restoring to a new instance, possibly into a new region or availability zone, in the event of a disaster.

Other backup related topics to look into are compression, encryption, striping and retention management.

Scripting, i.e. T-SQL and PowerShell

This may be a strong opinion, but I believe point and click DBAs are going the way of the dinosaurs. SQL Server Management Studio (SSMS) still makes it possible to do lots of work without knowing much T-SQL or PowerShell, but this does not scale. If you end up managing tens or even hundreds of SQL Servers, scripting will be your friend. The more tedious work you can automate, the more time you have to work on more interesting tasks.

Learn T-SQL

Despite me saying that point and click is not the way to go, SSMS is very useful for showing you the T-SQL code for how to do something. Say I want to create a new SQL login that doesn’t expire, and I want to give it sysadmin permissions. The first time I do this, I can step through the GUI, but instead of clicking OK, I can click Script and the code is opened in SSMS for me to review and run. I can also save this as a .sql script and either use it as a template for creating future SQL logins, or refer to it often enough that I learn the syntax.

script out create login from ssms

USE [master]
GO
CREATE LOGIN [john] WITH PASSWORD=N'ae34bhijkfgcd5', DEFAULT_DATABASE=[master], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
GO
ALTER SERVER ROLE [sysadmin] ADD MEMBER [john]
GO

Learn PowerShell

If this seems daunting, please don’t worry. There is a huge community project called dbatools that has more functionality for managing SQL Server than the official Microsoft SQL Server module. It’s a great way to start running PowerShell commands and building your knowledge as you go.

You can start with commands as and when you need them and build up your knowledge from there. The website is extremely useful, and by tagging @psdbatools on Twitter, you will usually get help pretty quickly from someone involved. As your skills and confidence increase, you may even choose to add code to the project by making a pull request. As I write this post, dbatools has over 15,000 commits from 175 contributors. (I’m one of them, in a VERY small way.)

Summary

There are more than 3 things a new DBA should learn, a lot more, but these 3 items will help you remain calm in a crisis and let you automate work, meaning you can be more efficient and have time to spend on interesting work.

Further reading

  • A brief history of activity monitoring
  • DBATools in a month of lunches
  • Backup and restore of SQL Server Databases

Filed Under: front-page, Guides, SQL Server

Data Scotland 2019 presentation

13th September 2019 By John McCormack

Data Scotland 2019 presentation:

My Data Scotland 2019 presentation: You can run SQL Server on AWS


Useful links:

https://aws.amazon.com/rds/sqlserver/

https://docs.aws.amazon.com/en_pv/AmazonRDS/latest/UserGuide/USER_ConnectToMicrosoftSQLServerInstance.html

https://aws.amazon.com/blogs/developer/handling-credentials-with-aws-tools-for-windows-powershell/

https://johnmccormack.it/?s=aws


Filed Under: AWS RDS, AWS SQL Server, front-page Tagged With: #datascotland, datascotland2019, ec2, rds, SQL server
