If you’ve been working with both AWS and Azure, you’ve probably noticed that each of them has its own advantages.
Tools like Terraform can be very helpful if you’re not familiar with both CloudFormation and Azure RM.
However, don’t treat Terraform as a cure-all (it isn’t): it’s a simple tool for simple tasks.
So in this post I’ll walk through Terraform terms and concepts and show an example with AWS & Azure.
For more complicated infrastructure you’ll still have to use CloudFormation and Azure RM.
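As a minimal sketch of what a dual-cloud configuration looks like (the bucket and resource-group names below are placeholders, and newer azurerm provider versions require the empty `features` block), one `main.tf` can declare both providers side by side:

```shell
# Write a minimal main.tf declaring both providers (resource names are placeholders).
cat > main.tf <<'EOF'
provider "aws" {
  region = "us-east-1"
}

provider "azurerm" {
  features {}   # required on newer azurerm provider versions
}

resource "aws_s3_bucket" "demo" {
  bucket = "tf-demo-bucket-12345"   # placeholder bucket name
}

resource "azurerm_resource_group" "demo" {
  name     = "tf-demo-rg"           # placeholder resource group
  location = "West Europe"
}
EOF

# With AWS and Azure credentials configured, the usual workflow applies:
# terraform init && terraform plan && terraform apply
echo "wrote main.tf"
```

One state file then tracks resources in both clouds, which is exactly the cross-provider convenience the post is describing.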
If you’re working with GitHub, you should definitely consider Gists for smaller pieces of code, like scripts.
Once you start using them, you might want to commit to Gists over SSH, and the GUI provides that option:
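The screenshot isn’t reproduced here, but the SSH clone URL from the GUI behaves like any git remote. A sketch with a placeholder gist id:

```shell
GIST_ID="0123456789abcdef"                # placeholder, not a real gist
git init -q gist-demo && cd gist-demo     # stand-in for your cloned gist
git config user.email "you@example.com" && git config user.name "you"
git remote add origin "git@gist.github.com:${GIST_ID}.git"
git remote get-url origin                 # the remote now points at the SSH URL
# from here, plain git over SSH works: git add / git commit / git push
```

Because a Gist is an ordinary git repository under the hood, nothing else changes in your workflow.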
When you start working with Azure Service Fabric, you might be disappointed by the customisation possibilities.
Even if you just want to add a few internal load balancers, you have to customise the ARM template.
When you’re working with ARM templates, it’s a good idea to split the VNet and Key Vaults from the compute resources.
To make provisioning easier you can use PowerShell scripts, like this one for certificates.
You can find the full solution in my GitHub repo: https://github.com/kagarlickij/arm-fabric
Now let’s see how it can be implemented:
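The full implementation is in the repo above; as a rough sketch of the deployment order only (the template file names here are assumptions, not the actual files from the repo), the split templates go out network-first:

```shell
RG="fabric-demo-rg"                               # hypothetical resource group name
TEMPLATES="vnet.json keyvault.json cluster.json"  # assumed names: VNet, Key Vault, then the cluster
DEPLOYED=""
for t in $TEMPLATES; do
  echo "deploying $t into $RG"
  # az group deployment create --resource-group "$RG" --template-file "$t"
  DEPLOYED="$DEPLOYED $t"   # uncomment the az line on a machine with Azure credentials
done
```

Deploying the VNet and Key Vault first means the cluster template (with its extra internal load balancers) can reference them without circular dependencies.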
When you’re working on CI/CD, security is always important, and certificates are quite useful.
Managing Azure Service Fabric with certificates is very easy, but creating the certificate can be a bit confusing.
However, like most things, it can be easily automated with PowerShell, and here’s an example for you:
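The post’s own example uses PowerShell (New-SelfSignedCertificate); as an equivalent sketch with openssl (the CN and export password below are placeholders), producing the .pfx bundle that Service Fabric consumes:

```shell
# Self-signed certificate; the CN and password are placeholders.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=mycluster.westeurope.cloudapp.azure.com" \
  -keyout cluster.key -out cluster.crt 2>/dev/null
# Bundle key + cert into a .pfx, which is what Service Fabric expects.
openssl pkcs12 -export -inkey cluster.key -in cluster.crt \
  -passout pass:ChangeMe123 -out cluster.pfx
openssl x509 -noout -subject -in cluster.crt   # sanity-check the subject
```

The resulting cluster.pfx can then be uploaded to Key Vault and referenced from the ARM template.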
Sometimes you might need to run telnet from a Docker container, but telnet typically isn’t installed and can’t always be installed easily.
However, there’s a simple solution which I want to share with you.
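The original snippet isn’t shown above; one common workaround (an assumption on my part, not necessarily the post’s exact fix) is bash’s built-in /dev/tcp pseudo-device, which needs nothing installed in the container:

```shell
# Port check without telnet, using bash's /dev/tcp (works in most base images with bash).
check_port() {   # usage: check_port HOST PORT
  if timeout 3 bash -c ": < /dev/tcp/$1/$2" 2>/dev/null; then
    echo "$1:$2 open"
  else
    echo "$1:$2 closed"
  fi
}
check_port 127.0.0.1 1   # port 1 should be closed on almost any machine
```

This covers the usual “is the port reachable?” use of telnet; for an interactive session you’d still need a real client.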
It’s quite common practice nowadays to keep databases on dedicated servers, especially if you use AWS RDS or Azure.
Relational database performance is always painful, and you might want to split data across at least a few databases. But even when the data is divided, you still have to run some logical operations across the whole set. It’s quite simple, so let me show how it can be done using both SQL Server Management Studio and the CLI.
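As a hedged sketch (the database and table names below are invented), the trick is simply to fully qualify each table with its database name; the same query then runs unchanged in SSMS or via sqlcmd:

```shell
# Cross-database UNION; Sales2019/Sales2020 and dbo.Orders are invented names.
QUERY='
SELECT OrderId, Total FROM Sales2019.dbo.Orders
UNION ALL
SELECT OrderId, Total FROM Sales2020.dbo.Orders;'
echo "$QUERY"   # paste into SSMS as-is, or run it from the CLI:
# sqlcmd -S myserver -U sa -P "$SA_PASSWORD" -Q "$QUERY"
```

This only works for databases on the same SQL Server instance; across instances you’d need linked servers instead.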
Since it’s possible to run .NET Core apps on Linux, Docker immediately comes into play.
The typical approach is to execute dotnet run, which writes output to the console:
...which is nice for debugging, but in real scenarios you’ll probably redirect application output to CloudWatch Logs, Loggly or ELK.
So what if you want to run the app without this console output?
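The post’s own fix isn’t shown here; the simplest shell-level approach (the project path is a placeholder) is to discard the console stream and background the process, leaving file- or agent-based log shipping unaffected:

```shell
# Discard stdout/stderr and run in the background; ./MyApp is a placeholder path.
dotnet run --project ./MyApp > /dev/null 2>&1 &
APP_PID=$!
echo "app running with pid $APP_PID (console output suppressed)"
```

For ASP.NET Core apps you can alternatively remove the console logging provider in configuration, which silences the output at the source rather than at the shell.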
If you’re interested in installing Tableau on AWS, you should have a look at the CloudFormation templates from Tableau.
The single-server installation suits a trial well, but it has a number of limitations, including a tie to the default VPC. But what if you want to deploy it into a dedicated VPC, or you don’t have a default one?
Not a big deal: I’ve updated the template, and you can use it:
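A launch sketch for the updated template; the file name and parameter keys (VpcId, SubnetId) are assumptions on my part, so check the template’s Parameters section first:

```shell
VPC_ID="vpc-0123456789abcdef0"        # your dedicated VPC (placeholder id)
SUBNET_ID="subnet-0123456789abcdef0"  # placeholder subnet id
CMD="aws cloudformation create-stack \
  --stack-name tableau-server \
  --template-body file://tableau-single-server.yaml \
  --parameters ParameterKey=VpcId,ParameterValue=$VPC_ID \
               ParameterKey=SubnetId,ParameterValue=$SUBNET_ID"
echo "$CMD"   # review the call, then run it with AWS credentials configured
```

Passing the VPC and subnet explicitly is what removes the dependency on the account’s default VPC.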
If you have MS SQL Server in your environment and have to perform some actions against it (execute migrations, change data, etc.) during your CI/CD, it can be quite inconvenient to use a Windows machine.
Fortunately, we have sqlcmd for Linux, and Microsoft provides instructions for popular Linux distributions – https://docs.microsoft.com/en-us/sql/linux/sql-server-linux-setup-tools
But what if you have Amazon Linux? If you follow the instructions from MS you will fail, and there’s not much information about this topic on the Internet. The only useful link is here.
Since my environment is highly automated, I decided to create a simple script to install sqlcmd on Amazon Linux and share it with you:
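The gist of the approach, assuming Amazon Linux can consume Microsoft’s RHEL 7 package repo (it is RHEL-family), looks like this; the function body must run as root on the target instance:

```shell
# Install sketch: register Microsoft's RHEL 7 repo, then install mssql-tools.
install_sqlcmd() {
  curl -s -o /etc/yum.repos.d/msprod.repo \
    https://packages.microsoft.com/config/rhel/7/prod.repo
  ACCEPT_EULA=Y yum install -y mssql-tools unixODBC-devel
  echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc
}
command -v sqlcmd >/dev/null && echo "sqlcmd already installed" \
  || echo "call install_sqlcmd (as root) to install"
```

The PATH line matters: mssql-tools installs into /opt/mssql-tools/bin, which isn’t on the default PATH.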
When you run CI/CD jobs, you might want to mark some builds with the name of the current (active) Jira sprint.
We have a dozen components in the project, each with a dedicated Jira project, and sprint names look like “Backend sprint 12”. You probably don’t want to add useless information to the build, and need only the number to identify it.
Jira has a nice REST API, so you can get what you want in a very simple way:
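A sketch against Jira’s Agile REST API (the host, board id, and credentials are placeholders); the curl call is commented out so the number-extraction part stands on its own:

```shell
# Fetch the active sprint for a board, then keep only the trailing number.
# curl -s -u "$JIRA_USER:$JIRA_TOKEN" \
#   "https://jira.example.com/rest/agile/1.0/board/42/sprint?state=active" \
#   | jq -r '.values[0].name' > sprint_name.txt
SPRINT_NAME="Backend sprint 12"                  # sample value of .values[0].name
SPRINT_NUM=$(echo "$SPRINT_NAME" | grep -Eo '[0-9]+$')
echo "$SPRINT_NUM"                               # -> 12
```

The resulting number can then be stamped into the build name or a tag by your CI tool.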