Conquering the world one query at a time
Ultrawide Monitor - Samsung CF971 - First impressions

As a developer, I've been a big fan of multiple monitors for a long time. I moved to a triple screen setup years ago, and up until recently, had no desire to try anything different.

Well... I've finally found the replacement. The holy grail of productivity, immersion, and usability: ultrawides :-)

Don't you miss the extra screen?

No. The width of the screen is plenty for 2 full Visual Studio environments, or 3 text editors side by side. I've always had issues with effectively managing multiple IDE environments with floating windows, as it can get pretty confusing which parent they belong to. I've found I prefer to keep floating windows in the parent environment, and an ultrawide suits this type of workflow perfectly.

Do you need the curve?

I prefer it. The Samsung CF971 is curved at 1500R, one of the most aggressive curves available at the current time. Unlike on a TV, where this feature is pretty much useless, with an ultrawide monitor only a few feet from your face the curve is welcome, and still subtle. I do videography, photography, and some gaming, and for my purposes I've found it to be perfect.

Do I need to get a 3440x1440 or can I save money and get 2560x1080?

Get the 3440x1440 if you are doing anything besides gaming, then just toggle to the lower resolution for games. I way overthought this too, and assumed the downscaling would be terrible. It's not. Don't listen to the naysayers. If you aren't a competitive esports gamer, it will serve you just fine at the lower resolution, and you can toggle back up to the high resolution for any type of productivity work. I recommend DisplayFusion, as you can set up some quick monitor profiles to toggle without delay.

Gaming on it?

Awesome! I'm still running an old AMD 7950, and when I turn down the settings I'm still enjoying it. It's about time to upgrade with Vega, but until then I'm getting some life out of the current card, even if I'm not yet leveraging the 100 Hz refresh rate with FreeSync.

Any negatives?

I had 1 stuck pixel that I couldn't fix, so on a black background this small red dot can be seen. However, I'm still happy with the choice.

For Development

I plan on moving to this screen for development in the future as well. I included some images of the workspace you gain, and I have no complaints about my time working on it at home.

But I'm scared of moving away from IPS!

Me too. No regrets. The SVA panel is beautiful, and I have no complaints whatsoever about leaving my IPS display behind for this.

Fixing VirtualBox Only Showing 32bit Windows OS Options

The original help came from this article: Why is VirtualBox only showing 32 bit guest versions on my 64 bit host OS?

In browsing through the comments, I saw mention that the root issue is the running Hyper-V hypervisor interfering with VirtualBox's virtualization management, so I disabled the Hyper-V service, repaired the VirtualBox install, and rebooted. I also disabled automatic start for the service. This resolved the issue without requiring uninstallation of the Hyper-V feature in Windows.
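For reference, a rough sketch of that fix in PowerShell. The service name and commands are my best recollection of a standard Windows install, so verify them on your machine; run from an elevated prompt:

```powershell
# The Hyper-V Virtual Machine Management service (name may differ by Windows
# version; check with: Get-Service *vmms*, *hyper*)
Stop-Service -Name vmms -ErrorAction SilentlyContinue
Set-Service -Name vmms -StartupType Disabled

# Alternative: leave the feature installed but stop the hypervisor loading
# at boot (takes effect after a reboot)
bcdedit /set hypervisorlaunchtype off
```

The bcdedit route is easy to reverse later with `bcdedit /set hypervisorlaunchtype auto` if you need Hyper-V back.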

Exploring SQL Server With Powershell And SMO Basics

SqlServer Powershell Cmdlets 2017 - Initial Look

Diving into the SQL Server Management Objects library can be a pretty interesting process. You get to work with database objects in a new way, and begin manipulating and executing code in a much different manner than purely using T-SQL. PowerShell offers prebuilt cmdlets to interact with, and you can also leverage .NET from PowerShell to have a powerful toolkit of options.

This post is not a full walk-through; instead it communicates some of the exploration I've done, to help if you are beginning to explore more database automation and management.

I plan on doing some basic walk-throughs for the PowerShell newbie in the future, so if you are confused about anything PowerShell related, feel free to post a comment and I'll add it to my list of topics to cover.

cmdlets vs .NET approach

What I've found interesting is that there are really 2 main approaches to interacting with SQL Server. You can directly invoke the SMO DLLs and access the methods, properties, and extensibility this offers; this requires more .NET knowledge, as you work with the SMO namespace in almost the same way you would code it in C#. The other approach is to leverage cmdlets. The cmdlets try to abstract away many of the complexities of working directly with the SMO namespace, for ease of use and automation, and to simplify the process for those not as comfortable with C# or with directly leveraging the SMO namespace.

If you are purely focused on automation and have little experience working with .NET, then cmdlets will be by far the way to go; there is a serious learning curve in working with .NET directly versus prebuilt cmdlets. If you want to expand your .NET knowledge, or find that the prebuilt cmdlets don't offer the behavior you are trying to achieve, then exploring the SMO namespace and directly invoking methods and accessing properties can be valuable. The learning curve is more intense, so just be prepared for that if you are new to working with .NET directly in PowerShell.
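To make the contrast concrete, here's a minimal sketch answering the same question both ways; the instance and database names are placeholders:

```powershell
Import-Module SqlServer

# Cmdlet approach: the module abstracts away the plumbing
Get-SqlDatabase -ServerInstance 'localhost' -Name 'tempdb' |
    Select-Object Name, RecoveryModel

# Direct SMO approach: construct the objects and walk the object model yourself
$s = [Microsoft.SqlServer.Management.Smo.Server]::new('localhost')
$s.Databases['tempdb'].RecoveryModel
```

Same answer either way; the SMO version just leaves you holding the full object model for whatever you want to do next.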

When possible, I personally recommend leveraging a package like dbatools instead of rolling your own. dbatools is a powerful project that I've recently begun to explore more. This well-rounded package gives you a powerful set of PowerShell commands that can help you set server properties, obtain default paths, back up, restore, migrate entire sets of databases to a new location, and more. To code all of this from scratch would be a massive project. I'd recommend considering it, and getting involved in the project if you have something to contribute. I found it really helpful for quickly setting up some default server options without having to configure them manually myself.
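As a small taste (the instance name is a placeholder; both cmdlets are from the dbatools module), obtaining default paths and setting a server option looks roughly like this:

```powershell
Install-Module dbatools -Scope CurrentUser

# Default data, log, and backup paths for the instance
Get-DbaDefaultPath -SqlInstance 'localhost'

# Set an sp_configure option without writing the T-SQL yourself
Set-DbaSpConfigure -SqlInstance 'localhost' -Name 'CostThresholdForParallelism' -Value 50
```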

Exploring SQL Path Provider

Trying to find the path initially can be challenging. However, by opening SSMS, right-clicking a node, and launching the PowerShell window, you'll be able to easily find the correct path to get the server-level object.

This allows you to leverage default methods in powershell like Get-ChildItem for iterating through objects. It treats the navigated SQL server path basically as a "file structure" allowing some interesting actions to be performed. One of these is a different approach to killing connections to a particular database.

I found this great pointer by reading Killing SPIDS in Powershell from MidnightDBA

Review that article for scripts focused on the termination of running spids.

For an ad hoc purpose, the scripts MidnightDBA provided are excellent and allow quickly executing a kill script on connections from the SSMS > PowerShell prompt.

Import-Module -Name SqlServer -DisableNameChecking -Verbose:$false -Debug:$false
CD SQLSERVER:\SQL\$ServerName -Verbose:$false -Debug:$false
dir | ?{ $_.Name -eq $DatabaseName } | %{ $_.KillAllProcesses($DatabaseName) }

Thanks to MidnightDBA's pointer, I approach this with a different method in one final script, using just the SMO server method KillAllProcesses. For some tasks I've found it really helpful to have a simple 1-line kill statement similar to the ones above.
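That version skips the provider path entirely; a sketch, with $ServerName and $DatabaseName as placeholders:

```powershell
Import-Module SqlServer -DisableNameChecking
$s = [Microsoft.SqlServer.Management.Smo.Server]::new($ServerName)
# Kills every process connected to the target database in one call
$s.KillAllProcesses($DatabaseName)
```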

Microsoft's documented method shows another example of how to restart the service; below is a modified version of that approach. I prefer not to use this style, as working with Get-ChildItem against server objects feels a little unintuitive to me.


$ErrorActionPreference = 'Continue'
[bool]$Matched = (Get-Module -Name SqlServer -ListAvailable | Measure-Object).Count
if ($Matched -eq $false) { Install-Package SqlServer -Scope CurrentUser -Verbose:$false -Force }

[datetime]$StepTimer = [datetime]::Now

$private:ServerName = $env:ServerName
Import-Module -Name SqlServer -DisableNameChecking -Verbose:$false -Debug:$false

# Get a reference to the ManagedComputer class.
CD SQLSERVER:\SQL\$private:ServerName -Verbose:$false -Debug:$false
$Wmi = (Get-Item -Debug:$false -Verbose:$false .).ManagedComputer
$DfltInstance = $Wmi.Services['MSSQLSERVER']

# Display the state of the service, then stop it.
Write-Host "Stopping Instance: $($DfltInstance.ServiceState.value__)"
$DfltInstance.Stop()
while ($DfltInstance.ServiceState.value__ -ne 1) # 1 = stopped
{
    Start-Sleep -Seconds 5
    $DfltInstance.Refresh()
    Write-Host "... state: $($DfltInstance.ServiceState)"
}

## Start the service.
Write-Host "Current Service State: $($DfltInstance.ServiceState)"
Write-Host "Initiating Service Start"
$DfltInstance.Start()
while ($DfltInstance.ServiceState.value__ -ne 4) # 4 = running
{
    Start-Sleep -Seconds 5
    $DfltInstance.Refresh()
    Write-Host "... state: $($DfltInstance.ServiceState)"
}
Write-Host ("{0:hh\:mm\:ss\.fff} {1}: finished" -f [timespan]::FromMilliseconds(((Get-Date) - $StepTimer).TotalMilliseconds), 'SQL Service Restart')

Database as an Object

Getting the database as an object proved to be easy, though a little confusing to navigate initially.

$s = SqlServer\Get-SqlDatabase -ServerInstance $ServerInstance -Verbose

Once the object is obtained, you can begin scripting objects, change database properties and more very easily.
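For instance (a sketch with placeholder names), scripting out tables and altering a property from a database object obtained this way:

```powershell
Import-Module SqlServer
$d = Get-SqlDatabase -ServerInstance 'localhost' -Name 'AdventureWorks'

# Each SMO table exposes Script(), returning its CREATE statement
$d.Tables | ForEach-Object { $_.Script() }

# Properties can be changed, then persisted with Alter()
$d.AutoShrink = $false
$d.Alter()
```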

I found this method an interesting alternative to using the .NET type accelerators, as it was a quick way to get a database-level object to work with. However, the limitation of not having the server-level object immediately available made me end up preferring the .NET accelerator version, which could look like this:

$ServerName = 'localhost'
$DatabaseName = 'tempdb'
$s = [Microsoft.SqlServer.Management.Smo.Server]::New($ServerName)
$d = [Microsoft.SqlServer.Management.Smo.Database]::New($s, $DatabaseName)
$s.EnumProcesses() | Format-Table -AutoSize
$d.EnumObjects() | Out-GridView

Interestingly, to access many of the database properties you actually have to go through the server object with SMO calls instead of the cmdlet. Trying $d.PrimaryFilePath doesn't work; I believe the constructor initiates a new database object intended for creation, rather than referencing an existing database. I found the documentation a bit challenging to sift through for an answer, so YMMV. Someone coming from a .NET-focused background might find the process a little clearer, but for me it took some work to correctly identify the behavior.

# Doesn't work: probably tries to initialize a new object for creating a db
$d = [Microsoft.SqlServer.Management.Smo.Database]::New($s, $DatabaseName)

# Works to access the current existing object
$d = $s.Databases[$DatabaseName]

Exploring Properties

If you want to explore the properties of an object, try using the ever-faithful Get-Member.

Depending on the type of object, you can additionally explore it with GetEnumerator, GetProperties, and so on. You'll find IntelliSense helpful as you explore further.

For instance, there are various ways you might explore an object, and you'll often find you need to dig into it to get the full detail of what you have access to.
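As a sketch of that kind of digging, using the SMO server object (instance and database names are placeholders):

```powershell
Import-Module SqlServer
$s = [Microsoft.SqlServer.Management.Smo.Server]::new('localhost')

# The ever-faithful Get-Member lists what's on the object
$s | Get-Member -MemberType Property

# The Properties collection can also be enumerated directly
$s.Properties | Select-Object Name, Value

# Drill into a nested collection the same way
$s.Databases['tempdb'] | Get-Member -MemberType Method
```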

Comparing Restoring a Database with Cmdlet vs SMO

using dbatools cmdlet

Here's an example of how simple the dbatools cmdlet can make restoring a database copy.
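A hedged sketch of what that can look like (instance, backup path, and database names are placeholders):

```powershell
Import-Module dbatools

# Restore the most recent backups found in the folder, renaming the database
# and its files so it can live alongside the original as a copy
Restore-DbaDatabase -SqlInstance 'localhost' -Path 'C:\Backups\MyDb' `
    -DatabaseName 'MyDb_Copy' -ReplaceDbNameInFile -WithReplace
```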

rolling your own wheel

Now compare this to the complexity of rolling your own invocation of the SMO namespace, which requires a lot more coding. Since dbatools wraps up a lot of this functionality, I've migrated to leveraging this toolkit for these DBA-related tasks instead of trying to reinvent the wheel.
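For contrast, a rough sketch of a hand-rolled SMO restore (paths and names are placeholders; a production script would also need RelocateFile objects, error handling, and more):

```powershell
Import-Module SqlServer
$s = [Microsoft.SqlServer.Management.Smo.Server]::new('localhost')

$restore = New-Object Microsoft.SqlServer.Management.Smo.Restore
$restore.Database = 'MyDb_Copy'
$restore.Devices.AddDevice('C:\Backups\MyDb\MyDb.bak',
    [Microsoft.SqlServer.Management.Smo.DeviceType]::File)
$restore.ReplaceDatabase = $true

# Performs the restore against the connected server
$restore.SqlRestore($s)
```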

Capturing Perfmon Counters With Telegraf

InfluxDb & Grafana Series

I had a lot of issues getting the Go environment set up in Windows, both this time and previously. For using Telegraf, I'd honestly recommend just leveraging the compiled binary provided.

Once downloaded, generate a new config file by running the first command, then run the next to install it as a service. (FYI: I originally tried doing this through NSSM and it failed to work with Telegraf.)
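From my recollection of the documented Windows setup (paths are placeholders; check the Telegraf docs for your version), the two commands look roughly like this:

```powershell
# Generate a fresh sample config to edit
.\telegraf.exe config > telegraf.conf

# Register Telegraf as a Windows service pointing at that config, then start it
.\telegraf.exe --service install --config "C:\telegraf\telegraf.conf"
.\telegraf.exe --service start
```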

Once the service was set up and credentials were entered, it was ready to run in the background, sending whatever you've configured to the destination of your choice.

In my test on Amazon Web Services, using EC2 with Windows Server 2016, I had no problems once the EC2 configuration issues were resolved: the services started sending their metrics, and Grafana showed me the load being experienced across all of them.

InfluxDB and Annotations

InfluxDb & Grafana Series

This post assumes you've already setup InfluxDB and have Grafana running.

Inserting annotations

Annotations are not a special type of resource; an annotation is just another metric that you query, with a feature in Grafana to display it on top of other metrics. This means the same insert Line Protocol applies to annotations.

This post greatly helped me get started: Creating Grafana Annotations with InfluxDb | Max Chadwick

Per Max's original post, annotations support HTML as well, so you could link, for example, to a build, a test result, or anything else you want to reach from your performance statistics.
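A minimal sketch of such an insert from PowerShell, assuming an InfluxDB 1.x write endpoint; the database name, measurement name, and field names here are my own placeholders, not a required schema:

```powershell
# Line Protocol: measurement,tag=value field=value -- an annotation is just another metric
$body = 'events,type=deploy title="Load test start",text="<a href=\"http://build/123\">build 123</a>"'
Invoke-RestMethod -Method Post -Uri 'http://localhost:8086/write?db=metrics' -Body $body
```

Omitting the trailing timestamp lets InfluxDB stamp the point with its own current time, which is usually what you want for "this just happened" events.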

This provides an annotation on your timeline in a nice format for browsing. I can see use cases for identifying specific activity or progress in tests, helping correlate the performance metrics with known activity steps from a build, script, or other related task. You could have any type of activity trigger this PowerShell insert, providing a lot of flexibility to relate useful metrics to your monitoring.

My personal use case has been to ensure load testing start/end times and other significant points of time in a test are easily visible in the same timeline I'm reviewing metrics on.

Warning: I did experience performance degradation in Grafana with many annotations on a timeline. Disabling the annotations kept this from occurring, so only pull them in when you need them.

inserting annotation shows on graph

Adding Annotations to Grafana

Now that you have the results being inserted into InfluxDB, you can query them in Grafana as annotations to overlay on your graphs.

Adding Annotation To Grafana to pull from InfluxDB

Potential Uses

I could see a whole lot of uses for this!

  • Inserts for build-related activity
  • Windows updates
  • Specific database maintenance, like Ola Hallengren's index optimize or database integrity check

Monitoring loses its value when you have a limited picture of what is happening. Capturing relevant details that help explain activity might be the key to immediately understanding what is causing a spike, or to better evaluating the timeline of a load test.