
PowerShell copy database backups

I needed to set up an automatic copy of the latest SQL Server database backup from a production database to a test server. A twist was that the production database was part of an AlwaysOn availability group with backups taken on the secondary replica. So, how do we find the latest database backup? I solved it by querying both servers' msdb and comparing the backup dates. Ugly, but it works…

# Script to setup automatic copy of database backup from AlwaysOn
# production servers to test server
# I use it on a separate admin server
# Run this script as a Scheduled Task or on demand,
# the user needs to be able to access the backup files remotely
# and also the destination server catalog
# This is just an example script, needs to be edited to suit your environment
# It works by logging in to both AlwaysOn instances and
# checking the file path and time stamp
# for the latest backup of the database you want to copy.
# You need a simple user to log in to the production instances,
# no special rights granted, since public can read this
# information in msdb. I use the login RemoteFilecopy
# I use diskshares for backups in the form of \\servername\drive$
# eg. \\192.168.0.2\z$
# edit to suit your need

########## Common variables for all copying ##########

$user = "RemoteFilecopy"
$password = "YOUR PASSWORD FOR THIS LOGIN"
$db = "msdb"
$DestinationCatalog = "\\TESTSERVER\DISKSHARE$\DESTINATIONCATALOG\"

########## GET LATEST PRODUCTION DATABASE BACKUP ##########

$ServerInstance1 = "PRODUCTION_SERVER_1"
$ServerInstance2 = "PRODUCTION_SERVER_2"
$SourceDatabase = "DATABASE_NAME"

# Check latest backup from each server and get the latest one
$sqlstring = @"
SELECT TOP 1 bmf.physical_device_name [Filename], bs.backup_finish_date [BackupDate]
FROM msdb.dbo.backupmediafamily bmf
INNER JOIN msdb.dbo.backupset bs ON bmf.media_set_id = bs.media_set_id
WHERE bs.type = 'D'
AND bs.database_name = '$SourceDatabase'
ORDER BY bs.backup_finish_date DESC
"@

# I use a simple dummy date below just to have a non-null date to compare to
# If your backups are older than 2010, then you have other problems :-)

# Query once per server and keep the whole result row
$Result1 = Invoke-Sqlcmd -Username $user -Password $password -ServerInstance $ServerInstance1 -Database $db -Query $sqlstring
$Filename1 = $Result1.Filename
$BackupDate1 = Get-Date "1/1/2010"
if ($Filename1.length -gt 0) {
    $BackupDate1 = $Result1.BackupDate
}

$Result2 = Invoke-Sqlcmd -Username $user -Password $password -ServerInstance $ServerInstance2 -Database $db -Query $sqlstring
$Filename2 = $Result2.Filename
$BackupDate2 = Get-Date "1/1/2010"
if ($Filename2.length -gt 0) {
    $BackupDate2 = $Result2.BackupDate
}

# Pick the newest backup, strip the local drive letter (e.g. "Z:\")
# from the recorded path and prepend the UNC share
if ($BackupDate1 -gt $BackupDate2) {
    $SourceBaseCatalog = "\\PRODUCTION_SERVER_1\DISKSHARE$\"
    $Filename = $SourceBaseCatalog + $Filename1.Remove(0,3)
}
if ($BackupDate1 -lt $BackupDate2) {
    $SourceBaseCatalog = "\\PRODUCTION_SERVER_2\DISKSHARE$\"
    $Filename = $SourceBaseCatalog + $Filename2.Remove(0,3)
}
if ($BackupDate1 -eq $BackupDate2) { Write-Host "Found no backup or same date, break!!"; Break }

################### START COPYING BACKUP ################################

# Change from sqlps to normal filesystem usage
Set-Location C:

# Create destination filename, I use the original name plus "FULL_LATEST"
$DestinationFilename = "$DestinationCatalog" + $SourceDatabase + "_FULL_LATEST.bak"

# Check if file exists, exit otherwise
if (!(Test-Path -Path $Filename)) {
    Write-Host "$Filename not found!"
    exit
}

# Check if destination catalog exists, exit otherwise
if (!(Test-Path -Path $DestinationCatalog)) {
    Write-Host "$DestinationCatalog not found, exiting`r`n"
    exit
}

# Copy file from PROD server to TEST server
Write-Host "Start copying file $Filename to $DestinationFilename"

# Uncomment line below to actually copy the file, but ONLY after testing, there are numerous possibilities to screw up here :-)
# Copy-Item -Path $Filename -Destination $DestinationFilename
########## COPY BACKUP DONE ##########


Database user roles

When moving a database to a new instance you need to make sure all database users are still connected to server logins. The script below checks that, and it is also handy whenever you just want to review users' role memberships.

SELECT u.name,
       CASE
           WHEN r.principal_id IS NULL THEN 'public'
           ELSE r.name
       END AS GroupName,
       l.name AS LoginName,
       l.default_database_name,
       u.default_schema_name,
       u.principal_id,
       u.sid
FROM sys.database_principals u
LEFT JOIN (sys.database_role_members m
           JOIN sys.database_principals r
               ON m.role_principal_id = r.principal_id)
    ON m.member_principal_id = u.principal_id
LEFT JOIN sys.server_principals l ON u.sid = l.sid
WHERE u.type <> 'R'
ORDER BY u.name
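
If LoginName comes back NULL for a user, that user is orphaned. Assuming a matching login already exists on the new instance, re-mapping it is a one-liner; a minimal sketch, where the user and login names are placeholders:

-- Re-map an orphaned database user to its server login (SQL Server 2005 SP2 and later)
-- YourUserName and YourLoginName are placeholders, edit to suit your environment
ALTER USER YourUserName WITH LOGIN = YourLoginName;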


Unused indexes

Indexes are only really ever useful for finding rows. When they are not being used they only take up space and slow down updates, so use something like the script below to find indexes that are never used for what they are good at.

SELECT o.name AS object_name,
       i.name AS index_name,
       i.type_desc,
       u.user_seeks,
       u.user_scans,
       u.user_lookups,
       u.user_updates
FROM sys.indexes i
JOIN sys.objects o ON i.object_id = o.object_id
LEFT JOIN sys.dm_db_index_usage_stats u ON i.object_id = u.object_id
    AND i.index_id = u.index_id
    AND u.database_id = DB_ID()
WHERE o.type <> 'S'
AND i.type_desc <> 'HEAP'
AND (u.index_id IS NULL
     OR (u.user_updates > 0
         AND u.user_seeks < 10
         AND u.user_scans < 10
         AND u.user_lookups < 10))
ORDER BY u.user_updates DESC
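
To act on the output, one option is to generate DROP INDEX statements for review rather than dropping anything directly. A hedged sketch, restricted to nonclustered indexes on user tables with no usage row at all; remember that usage stats are reset on instance restart, so make sure the server has been up over a full business cycle first:

SELECT 'DROP INDEX ' + QUOTENAME(i.name)
       + ' ON ' + QUOTENAME(SCHEMA_NAME(o.schema_id)) + '.' + QUOTENAME(o.name)
FROM sys.indexes i
JOIN sys.objects o ON i.object_id = o.object_id
LEFT JOIN sys.dm_db_index_usage_stats u ON i.object_id = u.object_id
    AND i.index_id = u.index_id
    AND u.database_id = DB_ID()
WHERE o.type = 'U'
AND i.type_desc = 'NONCLUSTERED'
AND i.is_primary_key = 0
AND i.is_unique_constraint = 0
AND u.index_id IS NULL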


When were the statistics updated?

Run the script below after adding your table name in "ON o.name = 'YourTableName'".
It shows the date and time when the statistics for each index on the table were last updated.

SELECT 'Index Name' = i.name,
'Statistics Date' = STATS_DATE(i.object_id, i.index_id)
FROM sys.objects AS o WITH (NOLOCK)
JOIN sys.indexes AS i WITH (NOLOCK)
ON o.name = 'YourTableName'
AND o.object_id = i.object_id
ORDER BY STATS_DATE(i.object_id, i.index_id);
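
If the dates turn out to be old and the table sees a lot of changes, you may want to refresh the statistics by hand, something like:

-- Rebuild statistics for one table, dbo.YourTableName is a placeholder
-- FULLSCAN reads the whole table, which can be heavy on large tables
UPDATE STATISTICS dbo.YourTableName WITH FULLSCAN;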


SPID -2 in the instance

Sometimes you come upon an instance with sessions that have a session id below zero: they show up as -2.

A -2 spid is an orphaned session, meaning a session that no longer has an outer connection, maybe a DTC transaction that went wrong or something else. Whatever the reason, the problem is that these hang around for a long while, and if they are locking resources, those resources will never be released until we either kill the -2 spid or restart the instance.

But we can't kill them by session id, that won't work. And how do we find them in the first place?

First, to find them, run:

SELECT request_owner_guid 
FROM sys.dm_tran_locks 
WHERE request_session_id = -2 
and request_owner_guid <> '00000000-0000-0000-0000-000000000000'
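
To see what the orphaned transactions are actually holding, you can widen the query with a few more columns from the same DMV:

SELECT request_owner_guid,
       resource_type,
       DB_NAME(resource_database_id) AS database_name,
       request_mode,
       request_status
FROM sys.dm_tran_locks
WHERE request_session_id = -2
AND request_owner_guid <> '00000000-0000-0000-0000-000000000000'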

They will probably need to be killed, but I leave that decision for you to consider. Do your homework before just killing them; I'm showing you how, not saying you should do it blindly.

To kill them, use the UOW (unit of work) GUID, the long alphanumeric identifier, instead of the session id, e.g.:

KILL 'DEF12078-0199-1212-B810-CD46A73F2498'

Or if you have a lot of them, generate the KILL statements:

SELECT 'kill ''' + convert(varchar(100), request_owner_guid) + '''' 
FROM sys.dm_tran_locks 
WHERE request_session_id = -2 
and request_owner_guid <> '00000000-0000-0000-0000-000000000000'
order by 1

After taking care of these, talk to the application/web group to find out how you can fix this problem.

//Richard


Find all logins with sysadmin role

Having more than just one or a few logins with the sysadmin role is potentially a great big security problem, so try to keep the list as short as possible. Find all of them with this script:

SELECT p.name AS [Name],
       p.type_desc,
       p.is_disabled,
       p.create_date,
       p.modify_date,
       p.default_database_name
FROM sys.server_principals r
INNER JOIN sys.server_role_members m ON r.principal_id = m.role_principal_id
INNER JOIN sys.server_principals p ON p.principal_id = m.member_principal_id
WHERE r.type = 'R'
AND r.name = N'sysadmin'
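
Once you have the list, removing a login from the role is a single statement on SQL Server 2012 and later (on older versions use sp_dropsrvrolemember); the login name below is a placeholder:

-- Remove a login from the sysadmin role, [DOMAIN\SomeLogin] is a placeholder
ALTER SERVER ROLE sysadmin DROP MEMBER [DOMAIN\SomeLogin];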

//R


Top 50 queries with SQL text and plan

The script below shows you the top 50 queries with SQL text and query plan, ordered by total worker time.

Total worker time is the CPU time a plan has consumed since it was compiled, so even a relatively simple plan that runs fast but very often can rack up a high total worker time.

select
eqp.query_plan,
highest_cpu_queries.total_worker_time,
db_name(q.dbid) as dbname,
q.objectid,
highest_cpu_queries.execution_count,
highest_cpu_queries.last_execution_time,
highest_cpu_queries.creation_time,
q.[text]
from
(select top 50
qs.plan_handle,
qs.total_worker_time,
qs.last_execution_time,
qs.execution_count,
qs.creation_time
from
sys.dm_exec_query_stats qs
order by qs.total_worker_time desc) as highest_cpu_queries
cross apply sys.dm_exec_sql_text(plan_handle) as q
cross apply sys.dm_exec_query_plan(plan_handle) eqp
order by highest_cpu_queries.total_worker_time desc
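
If you would rather rank by cost per execution than by total cost, a small variation is to divide by execution_count and order by the average instead, something like:

select top 50
    qs.total_worker_time / qs.execution_count as avg_worker_time,
    qs.execution_count,
    qs.total_worker_time,
    qs.last_execution_time,
    qs.creation_time,
    q.[text]
from sys.dm_exec_query_stats qs
cross apply sys.dm_exec_sql_text(qs.plan_handle) as q
order by qs.total_worker_time / qs.execution_count desc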

//R


Find long running queries

One of the first scripts I go to when looking at what’s happening in an instance is this:

SELECT db_name(r.database_id) AS database_name,
       r.session_id,
       r.blocking_session_id,
       r.cpu_time,
       SUBSTRING(qt.text, (r.statement_start_offset/2) + 1,
           (CASE WHEN r.statement_end_offset = -1
                 THEN LEN(CONVERT(nvarchar(max), qt.text)) * 2
                 ELSE r.statement_end_offset
            END - r.statement_start_offset)/2) AS stmt_executing,
       s.login_name,
       s.host_name,
       s.program_name,
       s.host_process_id,
       r.status,
       r.wait_time,
       r.wait_type,
       r.wait_resource,
       r.total_elapsed_time,
       r.reads,
       r.writes,
       r.logical_reads,
       r.plan_handle,
       (SELECT query_plan FROM sys.dm_exec_query_plan(r.plan_handle)) AS xml_plan
FROM sys.dm_exec_requests r
INNER JOIN sys.dm_exec_sessions s ON r.session_id = s.session_id
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS qt
WHERE r.session_id > 50
ORDER BY r.cpu_time DESC

It shows you, among other things, blocked sessions, a link to the SQL plan, reads, writes, the executing statement and a lot of other handy stuff.

A simple good script to have around.
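
When blocking_session_id comes back non-zero, a quick follow-up is to list just the blocking chains:

SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       r.wait_resource
FROM sys.dm_exec_requests r
WHERE r.blocking_session_id <> 0
ORDER BY r.wait_time DESC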

//R


SQL Server Scripts

I have a large file with a lot of different SQL scripts covering performance and tuning, indexes, logins, mirroring, DMVs, DBCC commands and so on. I'm in the process of sorting them and creating shorter posts with a couple of scripts in each category; they will show up here shortly.

Hopefully they can come in handy. I refer to them quite often in my work as a DBA, and I have been compiling these scripts for several years now, so it's quite a large collection.