SQL Support on TV

In the show How It’s Done on RTL-Z we showed the importance of data at one of our main customers, Meander Medisch Centrum in Amersfoort (a large hospital).

The director Digital&ICT, Vincent van Luling, tells you more about the critical role of data in today’s hospitals, and the role we play in that area.

We have supported him and his team in properly managing all their databases, and as you can hear in the video, they are happy with our approach and the way we work with them.

Their data is safe with us!

Make extensive job history more readable

In SQL Server, backups and maintenance of databases are usually managed using jobs.
In this article I’ll explain how the history of these jobs can be read, and what you can do if the output is too long or hard to read.

In the example below you’ll see the job that makes backups of the system databases. If you, like me, are using the maintenance scripts made by Ola Hallengren, then the job should look familiar to you. Of course this example works with all jobs.

Request the properties of the job ‘DatabaseBackup – SYSTEM_DATABASES – FULL’:

Click ‘View Job History’.
Click the plus-sign in front of the run you want to investigate to see the history of the step:

At the bottom of the screen you see the details of the run. The scripts of Ola provide extensive logging, but that has a side effect: the long text is sometimes hard to read properly.

By default Ola configures logging to file, and I am a big fan of that feature. It makes troubleshooting a job run much easier.

However, sometimes I work at a client that does not allow me access to the underlying operating system. In those cases I need another way to get this important info. And (of course) that is possible, I just need to make a small adjustment to the job.

Open the job you want to adjust:

Click ‘Steps’

Select the job-step you need to adjust and click Edit (or doubleclick the job-step) to open the properties:

Click ‘Advanced’

Put checkmarks in front of ‘Log to table’ and ‘Append output to existing entry in table’.
The second option is needed in jobs with more than one step, to make sure you get the history of all job steps in the table (instead of just the latest one).
Click OK to save your job update.
Done!
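If you prefer scripting over clicking, or need to enable this on many jobs, the same setting can be applied with the system procedure msdb.dbo.sp_update_jobstep. A minimal sketch, using the example job from this article; the flag value 16 is the documented value for writing the step log to the table with append (8 would overwrite instead):

```sql
-- Enable 'Log to table' with 'Append output to existing entry in table'
-- for step 1 of the example backup job. Adjust job name and step id as needed.
EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'DatabaseBackup – SYSTEM_DATABASES – FULL',
    @step_id  = 1,
    @flags    = 16;  -- 16 = write log to table, appending to existing history
```

Repeat this for every step of the job if you want the full history of a multi-step job in the table.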

Now the fun part: reading the history from our new source.
You just need a query to do that. Of course you’ll have to wait for the job to run before you see any data, so you may need some patience.

SELECT *
FROM msdb.dbo.sysjobstepslogs
ORDER BY log_id DESC;

Find the log_id of the run you want to check out and copy it to the clipboard.
Adjust the output of the query so that it outputs to text instead of grid, by clicking the button in the GUI or just pressing Ctrl+T.
Next, execute the following query:

SELECT log_id, [log]
FROM msdb.dbo.sysjobstepslogs
WHERE log_id = <copied log_id>
ORDER BY log_id DESC;
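If you don’t want to look up the log_id first, you can also filter on the job name directly. A sketch, joining sysjobstepslogs to sysjobsteps and sysjobs (step_uid is how the log table relates to the job steps); the job name is the example from this article:

```sql
-- Show the logged output per step for one job, newest entries first.
SELECT j.name AS job_name,
       s.step_name,
       l.date_created,
       l.[log]
FROM msdb.dbo.sysjobstepslogs AS l
JOIN msdb.dbo.sysjobsteps     AS s ON s.step_uid = l.step_uid
JOIN msdb.dbo.sysjobs         AS j ON j.job_id   = s.job_id
WHERE j.name = N'DatabaseBackup – SYSTEM_DATABASES – FULL'
ORDER BY l.date_created DESC;
```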

Sometimes you still won’t see the whole text; Results to Text by default shows a maximum of 256 characters per column.
You can raise that for this query to the maximum value of 8192 in the query options:
Click in the main menu of SSMS ‘Query’ and ‘Query Options’:

Adjust the indicated field to the max value of 8192.
If you want to set this for all future queries, use ‘Tools’ in the menu and then ‘Options’:

Of course there are cases when 8192 characters is not enough. We can use the Import/Export Wizard to export the output of the query to any destination we want, for example a text file on the machine you use to run SSMS.
Right-click the msdb database, which contains all job-related info:

Select ‘Tasks’, ‘Export Data…’

Select as a source ‘SQL Server Native Client’ and click ‘Next’

Select as ‘Destination’ the ‘Flat File Destination’ and enter a filename and path.
Don’t forget to set the checkmarks in front of ‘Unicode’ and ‘Column names in the first data row’ as shown above!
Click ‘Next’

Select ‘Write a query…’ and click ‘Next’.

Fill in the query we used before and click ‘Next’.

Click ‘Finish’
The export is executed:

The results of the query can be found in the file you set as ‘Destination’.
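As an alternative to the wizard, the sqlcmd utility can write the same query output straight to a file from the command line. A sketch, assuming a default local instance and Windows authentication (adjust -S for your server); the log_id value is a placeholder for the one you copied earlier, and -y 0 lifts the default display-width limit so long log text is not cut off:

```shell
rem Export the job-step log to a text file, bypassing SSMS limits.
rem 42 is a placeholder log_id; replace it with the one you copied.
sqlcmd -S localhost -E -d msdb -y 0 -Q "SELECT [log] FROM dbo.sysjobstepslogs WHERE log_id = 42" -o joblog.txt
```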

Good luck!

Updates to the SQL Server Incremental Servicing Model | SQL Server Release Services

In the old days, every experienced DBA updated SQL Server with only the latest service packs. CUs (Cumulative Update packages) were available, but the advice was to only install them when you hit a problem that was fixed in the CU. ‘If it isn’t broken, don’t fix it!‘
Occasionally a CU would break a smoothly running installation…

Well, those were the old days.

Starting in 2016, Microsoft tests CUs the same way as service packs, and advises us to proactively install CUs when available to get the most out of SQL Server. ‘If it isn’t broken, improve it!‘

Read more here: Announcing updates to the SQL Server Incremental Servicing Model (ISM) | SQL Server Release Services

What’s new in SQL Server 2016

SQL Server 2016 was released some time ago. This new version contains a lot of new features and possibilities, but you might wonder if it’s worth the effort of upgrading.

The answer to that question will most likely be ‘YES’, but you need to be careful *).

If you’d like to know what has changed, I’ll refer you to the following link:

What’s New in SQL Server 2016 | Microsoft Docs

*) Before upgrading, I advise you to check what the new version does to YOUR workload. With this new version that’s even more important than before. If the necessary preparations are skipped, performance may suffer heavily instead of improving. These issues can be avoided, though, and performance can improve hugely; but only after thorough testing and the right preparations before the migration.