All microblog

2021

Go R1 Day 23

Day 23 of 100

progress

  • Used Viper to load configuration for a CLI tool.
  • Ran into problems trying to print out the map[string]interface{} type. .NET performs a lot of magic behind the scenes for this type of action; Go is a lot more hands-on 😁.
  • Had stability issues with VS Code today, so I finally switched over to IntelliJ with the Go plugin, and it worked well. The keyboard mappings are different, so that was painful, but it was still overall a good experience that got me unblocked.

2020

Go R1 Day 22

Day 22 of 100

progress

Using Dash, I read through much of the language specification. Dry reading for sure, but helped a bit in understanding a little more on stuff like arrays, slices, loops, etc.

Nothing profound to add, except to say I don’t think I want to write a language specification.

Go R1 Day 21

Day 21 of 100

progress

  • Signed up for exercism.io, which is a pretty great website for working through progressively harder exercises.
  • Did Hello World to start, since the site requires progressing through the exercises step by step.
  • Did a string concatenation exercise as well (Two Fer).

I like the mentor feedback and work submission concept. After I finish this, it would be good to add myself as a mentor and contribute back to the community. This is a fantastic way to get acclimated to a new language, working through progressively harder exercises to better learn its usage.

Go R1 Day 20

Go R1 Day 19

Go R1 Day 18

SQL Server Meets AWS Systems Manager

Excited. Have a new solution in the works to deploy Ola Hallengren's maintenance solution via an SSM Automation runbook across all SQL Server instances, with full scheduling and synchronization to S3. Hoping to get the OK to publish this soon, as I haven't seen anything like it built.

Includes:

  • Building the SSM Automation YAML doc from a PS1 file using AST & metadata
  • Downloading dependencies from S3 automatically
  • Pulling credentials automatically via AWS Parameter Store (could be adapted to Secrets Manager as well)
  • Leveraging s5cmd for roughly 40x faster sync performance with no aws-cli required. It's a Go executable. #ilovegolang
  • Deploying a job that automates flipping instances between FULL and SIMPLE recovery, similar to how RDS does it, for those cases where you can't control the creation scripts and want to flip from SIMPLE to FULL for immediate backups.
  • Sending a formatted deployment summary card with all properties to Microsoft Teams. #imissslack
  • Managing these docs via Terraform.
  • A snippet for setting up an S3 lifecycle policy to automatically clean up old backups. (I prefer Terraform, but this is still good to know for retroactive fixes.)

I'm pretty proud of this, as it replaces Cloudberry, which in my experience struggles at scale. I've seen a lot of issues with Cloudberry when dealing with 1,000-3,000 databases on a server.

Once I get things running, I'll see if I can share this in full, since it's driven by dbatools + the Ola Hallengren backup solution.

I also plan on adding a few things, like sending a PagerDuty incident on failure, and other little enhancements to possibly enable better response handling.

Other Resources

Five

I asked my daughter (3) how much she loved me. She held up her hands and said: “Five”.

I’ll take that as a win considering that’s all the fingers on that hand. 😂

Leave Me Alone

Free Means You Are the Product

Over time, I've begun to look at free products with more judgment. The saying is: "If it's free, you are the product". This often means your data and privacy are the price you pay.

This has resulted in me looking more favorably at apps I would have dismissed in the past, such as Leave Me Alone.


The notion of buying credits for something I could script, click, or do myself meant I only used it sporadically last year. This year, I took the plunge, spent $10, and appreciate the concept and cost.

If you have a lot of tech interaction, you'll have a slew of newsletter and marketing subscriptions coming your way. This noise can drown out your email.

I saw one children's clothing store, which got my email from a receipt, generate an average of 64 emails a month!

Leave Me Alone simplifies the cleanup process by summarizing the noisiest offenders and offering one-click unsubscribes for any of them.

You can sort by an automatically generated rating that ranks mailing lists by value, read engagement, number of emails sent monthly, and more.

Take a look; the free tier is enough to figure out if you like it.

Other Tools

Combine this type of tool with:

  • Kill The Newsletter
  • Inoreader (RSS Reader)
  • Subscription Score: a really promising tool made by the same folks, but I haven't added it at this time, as the price seems a bit high for this specific feature if I'm already using their app (currently $49 a year). It would be nice if this were included automatically for those who bought 250 credits or more, since it's powered by data-mining the lists users unsubscribe from the most.

With this noise reduced, you'll be more likely to keep up to date. Last tip: add GitHub release notes (Terraform's, for example) as subscriptions in your RSS reader; it might reduce the release-announcement noise in email and Slack.

Go R1 Day 17

Day 17 of 100

progress

  • Reviewed adding Excel conversion to the CLI.
  • Shelved this after reviewing the implementation requirements.
  • This is one of those cases where PowerShell makes much more sense for ad hoc work, since it automatically converts a pscustomobject (similar to a struct) to an Excel sheet via the pipeline.

Go R1 Day 16

Day 16 of 100

progress

  • Refactored an AWS SDK call to export a named file using flags.
  • Iterated through regions so the CLI call aggregated results from all regions into a single JSON file.
  • Working with v1 makes me want v2 so much more. The level of pointers required is ridiculous; at one point I had something like &*ec2 due to the SDK requirements. Having to write a filter with Filters: { Name: aws.String("foo") } is clunky. I believe v2 greatly simplifies this, and the code is much cleaner.
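To illustrate that pointer overhead without pulling in the real SDK, here's a sketch with a local strPtr helper mirroring aws.String; the Filter type only loosely mimics the v1 ec2.Filter shape:

```go
package main

import "fmt"

// strPtr mirrors the aws.String helper from the v1 SDK: every string
// field in a filter has to be a *string, hence all the pointer noise.
func strPtr(s string) *string { return &s }

// Filter loosely mirrors the shape of ec2.Filter in the v1 SDK.
type Filter struct {
	Name   *string
	Values []*string
}

func main() {
	// The clunky v1-style construction described above: pointers
	// everywhere, even for simple literal strings.
	filters := []*Filter{
		{
			Name:   strPtr("tag:Name"),
			Values: []*string{strPtr("foo")},
		},
	}
	fmt.Println(*filters[0].Name, *filters[0].Values[0])
}
```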

Go R1 Day 15

Day 15 of 100

progress

  • Figured out scope issues with a pointer and struct.
  • Used omitempty in a struct.
  • Exported the final report in JSON format after searching for the image matching each EC2 instance's image ID.
  • I find it interesting how much wordier the Go search method was, but I appreciate it in a way: the "syntactic sugar" that's missing is also the reason there is more complication at times in languages like PowerShell/C#.
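As a sketch of that wordier search, assuming illustrative Image fields and IDs rather than the real report types:

```go
package main

import "fmt"

// Image is a minimal stand-in for the EC2 image metadata in play here;
// the omitempty tag drops empty fields from the JSON report.
type Image struct {
	ImageID string `json:"imageId"`
	Name    string `json:"name,omitempty"`
}

// findImage is the kind of explicit loop Go requires where other
// languages offer a one-line Where()/filter call.
func findImage(images []Image, id string) (Image, bool) {
	for _, img := range images {
		if img.ImageID == id {
			return img, true
		}
	}
	return Image{}, false
}

func main() {
	images := []Image{{ImageID: "ami-123", Name: "base"}, {ImageID: "ami-456"}}
	if img, ok := findImage(images, "ami-123"); ok {
		fmt.Println(img.Name)
	}
}
```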

Go R1 Day 14

Day 14 of 100

progress

  • Built a Go function with the aws-sdk that returned EC2 instances and images.
  • Joined the data together to search for the matching image from the EC2 metadata.
  • Generated a JSON report from the results and the final struct.

Go R1 Day 14

Day 14 of 100

progress

  • Migrated my new AWS Lambda logger from zap to zerolog. Zap gave me some problems initially, so zerolog is my favorite structured logger right now; it's much simpler.
  • Constructed a go-task runner file for launching go test and go build/run.
  • Structured logging required a little bit of refactoring, but it worked.

Here's an example of logging a string with structure (don't log secrets normally, but I'm in the testing phase).

```go
log.Debug().
	Str("decodedBinarySecret", decodedBinarySecret).
	Str("secretString", secretString).
	Msg("Depending on whether the secret is a string or binary, one of these fields will be populated.")
```

Based on my improved understanding of conversions vs. type assertions: converting uses what other languages would call a "cast" (Go calls these conversions, and yes, it makes a copy in memory):

```go
log.Info().Str("requestDump", string(requestDump)).Msg("request information")
```

Type assertions are done when working with an interface. I’m still working on my understanding of interfaces as they are their own beast in Go. Unlike most other languages, a Go type implements an interface when all the required methods are matched. This provides a great deal of the flexibility in Go interfaces.

The scoping of the interfaces is important, and while I listened to a lecture on this, I haven't yet worked through the interface design principles to ensure the best reusability/narrowness-of-scope concepts. I think that's going to take more "getting my hands dirty" for it to click.
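A minimal sketch of that implicit interface satisfaction, with illustrative names:

```go
package main

import "fmt"

// Describer is satisfied by any type with a Describe() method; no type
// ever declares "implements Describer", which is the implicit matching
// described above.
type Describer interface {
	Describe() string
}

type Server struct{ Name string }

// Having this method is all it takes for Server to satisfy Describer.
func (s Server) Describe() string { return "server: " + s.Name }

// announce accepts anything satisfying the interface.
func announce(d Describer) {
	fmt.Println(d.Describe())
}

func main() {
	announce(Server{Name: "web-01"})
}
```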

Unable To Resolve Provider AWS with Terraform Version 0.13.4

I couldn't get past this for a while until I stumbled across a fix. I believe the fix was merged; however, the problem still existed in 0.13.4, so I was stuck with it.

GitHub Issues

When investigating the cause, I found this PR which intended this to be the installer behaviour for the implicit global cache, in order to match 0.12. Any providers found in the global cache directory are only installed from the cache, and the registry is not queried. Note that this behaviour can be overridden using provider_installation configuration. That is, you can specify configuration like this in ~/.terraform.d/providercache.tfrc

GitHub Issue Comment

I used the code snippet from there, editing the config file with: micro ~/.terraform.d/providercache.tfrc

I wasn't sure if the path would be interpreted by the shell, so I didn't use the relative path ~/.terraform.d/plugins, though that might work as well.

```hcl
provider_installation {
  filesystem_mirror {
    path = "/Users/sheldonhull/.terraform.d/plugins"
  }
  direct {
    exclude = []
  }
}
```

After this, terraform init worked.

Go R1 Day 13

Day 13 of 100

progress

  • Worked with type assertions in my efforts to generate a JSON collection from the parsed front matter.
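A small sketch of the comma-ok type assertion against a front-matter-style map; the keys and values here are illustrative:

```go
package main

import "fmt"

func main() {
	// Parsed front matter typically comes back as map[string]interface{},
	// so pulling out the title requires a type assertion.
	frontMatter := map[string]interface{}{
		"title": "Go R1 Day 13",
		"draft": false,
	}

	// The comma-ok form avoids a panic when the value isn't a string.
	if title, ok := frontMatter["title"].(string); ok {
		fmt.Println(title)
	}

	// An unchecked assertion like frontMatter["draft"].(string) would panic.
}
```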

Set Theory Basics in the Eyes of 10 Year Old

My morning: explaining basic set and intersect theory to my 10-year-old using Minecraft gamer tags. Trying to justify the need to know this, the best I could come up with was his future need to build a shark attack report accurately.

Kids are the best. Tech is fun. What other job would have me spin up my MSSQL container with docker-compose up -d and write a quick SQL example with INTERSECT, UNION, and more to demonstrate this magic?

Followed it up with a half-hearted lie that my day consists of cmatrix 😂, which he didn't believe for more than a couple of seconds.

Ways to Improve Codespaces Local Docker Experience

I’ve been enjoying Codespaces local development workflow with Docker containers.

I’m using macOS and on Docker experimental release. Here are some ideas to get started on improving the development experience.

  • Clone the repository in the virtual volume (supported by the extension) to eliminate the binding between host and container. This would entail working exclusively inside the container.
  • Increased Docker's allowed RAM to 8GB from the default of 2GB.

Any other ideas? Add a comment (powered by GitHub Issues, so it's just a GitHub issue in the backend).

Keep the Snippet Simple

I took a quick step back when too many parentheses started showing up. If you question the complexity of your quick snippet, you are probably right that there is a much simpler way to do things.

I wanted to get a trimmed message of the results of git status -s. As I worked on this snippet, I realized it was becoming way overcomplicated. 😆

```powershell
$(((git status -s) -join ',') -split '')[0..20] -join ''
```

I knew my experimentation was going down the wrong road, so I took a quick step back to see what someone else did. Sure enough, Stack Overflow provided me a snippet.

```powershell
$(((git status -s) -join ','))[0..20] -join ''     # returns the string '12345'
```

Moral of the story… there’s always someone smarter on Stack Overflow. 😆

Go R1 Day 12

Day 12 of 100

progress

  • Worked on an Algolia index project to do atomic updates on the search index for my blog.
  • Worked with JSON, structs, ranges, and more.
  • Saw success: the first value in my output now correctly parses the title from the front matter.
  • Implemented zerolog.
  • Used a front matter library to parse the YAML front matter into a map.
  • Accessed the map to get the title into JSON.
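As a rough stdlib-only sketch of that parsing step (the real code used a front matter library; this hand-rolls the title lookup):

```go
package main

import (
	"fmt"
	"strings"
)

// titleFromFrontMatter pulls the title out of a document's YAML front
// matter by splitting on the --- delimiters and scanning for "title:".
// A real parser handles quoting and nesting; this is only a sketch.
func titleFromFrontMatter(doc string) string {
	parts := strings.SplitN(doc, "---", 3)
	if len(parts) < 3 {
		return "" // no front matter block found
	}
	for _, line := range strings.Split(parts[1], "\n") {
		trimmed := strings.TrimSpace(line)
		if strings.HasPrefix(trimmed, "title:") {
			return strings.TrimSpace(strings.TrimPrefix(trimmed, "title:"))
		}
	}
	return ""
}

func main() {
	post := "---\ntitle: Go R1 Day 12\ndate: 2020-10-01\n---\nContent here."
	fmt.Println(titleFromFrontMatter(post))
}
```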

Hoping that eventually I can build out a Go app worth sharing that's the equivalent of "atomic Algolia", allowing diff-based updates. I haven't found anything like that for Hugo so far.