Task
When the API specification is updated in git (master branch only), it must be updated on Confluence automatically.
Solution
https://github.com/kagarlickij/aws-cloudformation-s3-basic-auth
Task
Provide a sample API specification with a CI pipeline that deploys it to AWS API Gateway and S3 (only if petstore.yaml was changed in the master branch).
This sample API specification is used by the aws-cloudformation-git-confluence-integration sample.
Solution
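As an illustration only (this is not the actual pipeline code from the sample), here is a minimal sketch of the change-detection idea: deploy to S3 and API Gateway only when petstore.yaml changed in the latest commit on master. The bucket name and REST API ID are placeholders.

```bash
#!/usr/bin/env bash
# Sketch: deploy the spec only when petstore.yaml changed in the last commit.
# SPEC_BUCKET and API_ID are placeholder environment variables.
set -euo pipefail

if git diff --name-only HEAD~1 HEAD | grep -qx 'petstore.yaml'; then
  # Publish the raw spec to S3
  aws s3 cp petstore.yaml "s3://${SPEC_BUCKET}/petstore.yaml"

  # Re-import the spec into an existing API Gateway REST API
  aws apigateway put-rest-api \
    --rest-api-id "${API_ID}" \
    --mode overwrite \
    --body fileb://petstore.yaml

  # Deploy the updated API to a stage
  aws apigateway create-deployment \
    --rest-api-id "${API_ID}" \
    --stage-name prod
fi
```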
It’s quite common practice to keep databases on dedicated servers nowadays, especially if you use AWS RDS or Azure.
Relational database performance is always painful, and you might want to split data across at least a few databases. But even when the data is divided, you still have to perform some logical operations across the whole data set. It’s quite simple, so let me show how it can be done using both SQL Server Management Studio and the CLI.
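As a rough sketch of the CLI side (one possible approach, not necessarily the exact method from this post; hostnames, credentials, database and table names are all placeholders), you can run the same aggregate against each database server with sqlcmd and combine the results on the client:

```bash
#!/usr/bin/env bash
# Sketch: sum one aggregate across several SQL Server databases with sqlcmd.
# Hostnames, credentials, database and table names are placeholders.
set -euo pipefail

total=0
for host in db1.example.com db2.example.com; do
  # -h -1 suppresses headers, -W trims trailing spaces, NOCOUNT hides row counts
  count=$(sqlcmd -S "$host" -U sa -P "$SA_PASSWORD" -d SalesDb -h -1 -W \
            -Q "SET NOCOUNT ON; SELECT COUNT(*) FROM dbo.Orders;" \
          | tr -d '[:space:]')
  total=$((total + count))
done

echo "Orders across all databases: $total"
```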
Since it’s possible to run dotnet core apps on Linux, Docker immediately comes into play.
The typical approach is to execute dotnet run, and the application will write its output to the console:
...which is nice for debugging, but in real scenarios you will probably redirect application output to CloudWatch Logs, Loggly or ELK.
So what if you want to run the app without this console output?
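Since Docker was already mentioned, one simple option (a sketch with a hypothetical image name, not necessarily the exact setup from this post) is to run the app in a detached container, so nothing is printed to your terminal and the output stays available through Docker's logging:

```bash
# Build the image from the app's Dockerfile (image name is a placeholder)
docker build -t my-dotnet-app .

# Run detached: no console output in your terminal
docker run -d --name my-dotnet-app my-dotnet-app

# Inspect the output later, or configure a log driver
# (e.g. awslogs) to ship it to CloudWatch Logs instead
docker logs -f my-dotnet-app
```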
If you have MS SQL Server in your environment and have to do some actions with it (execute migrations, change data, etc.) during your CI/CD, it might be quite inconvenient to use a Windows machine.
Fortunately we have sqlcmd for Linux, and Microsoft provides some instructions for popular Linux distributions – https://docs.microsoft.com/en-us/sql/linux/sql-server-linux-setup-tools
But what if you have AWS Linux? If you try to use the instructions from MS, you will fail, and there’s not much information on the Internet about this topic. The only useful link is here.
Since my env is highly automated, I decided to create a simple script to install sqlcmd on AWS Linux and share it with you:
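Here's a minimal sketch of such a script, based on Microsoft's RHEL 7 package repository (which generally works on Amazon Linux); it may differ from the exact script used in this post:

```bash
#!/usr/bin/env bash
# Sketch: install sqlcmd (mssql-tools) on AWS Linux via Microsoft's RHEL 7 repo.
set -euo pipefail

# Register Microsoft's package repository
sudo curl -o /etc/yum.repos.d/msprod.repo \
  https://packages.microsoft.com/config/rhel/7/prod.repo

# Install sqlcmd/bcp and the ODBC driver dependency
sudo ACCEPT_EULA=Y yum install -y mssql-tools unixODBC-devel

# Make sqlcmd available on the PATH for this session and future logins
export PATH="$PATH:/opt/mssql-tools/bin"
echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc

# Smoke test: fail if sqlcmd is not on the PATH
command -v sqlcmd
```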
When you do CI/CD jobs you might want to mark some builds with the name of the current (active) Jira sprint.
We have a dozen components in the project, each with a dedicated Jira project, and the sprint names look like “Backend sprint 12”, so you probably don’t want to add useless information to the build and only need the number to identify it.
Jira has a nice REST API, so you can get what you want in a very simple way:
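For example (a sketch using the Jira Agile REST API; the Jira URL, board ID and credentials are placeholders), you can fetch the active sprint for a board and keep only the trailing number:

```bash
#!/usr/bin/env bash
# Sketch: get the number of the active sprint for a Jira board.
# JIRA_URL, BOARD_ID, JIRA_USER and JIRA_TOKEN are placeholders.
set -euo pipefail

SPRINT_NAME=$(curl -s -u "${JIRA_USER}:${JIRA_TOKEN}" \
  "${JIRA_URL}/rest/agile/1.0/board/${BOARD_ID}/sprint?state=active" \
  | jq -r '.values[0].name')                              # e.g. "Backend sprint 12"

SPRINT_NUMBER=$(grep -oE '[0-9]+$' <<< "$SPRINT_NAME")    # e.g. "12"

echo "Active sprint number: ${SPRINT_NUMBER}"
```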
Although Xcode Server is almost perfect for building iOS apps, Jenkins is still more popular. If your application consists of a few parts such as a database, backend, frontend, Android, and iOS apps, you typically want to have the same CI/CD for all components.
My Jenkins master is running in the AWS cloud together with a dozen Linux and Windows slaves. However, an iOS app can be built only on macOS, so you have to use an Apple computer to build it, typically a Mac Mini located in the office.
In this post we’ll consider a secure and reliable connection between the Mac in the office and the Jenkins master in the AWS cloud.
Both Jenkins and GitHub are very popular, so integrating them shouldn’t be a problem. It still might be a bit confusing if you’re doing it for the first time, which is why I decided to spend a few minutes showing you how it can be done.
The Jenkins master can be accessed through a URL different from the one specified in the Jenkins configuration.
Why might we need this? Well, you probably want your Jenkins server to be publicly accessible (this is required for GitHub integration, by the way), and since it’s public you typically want to use an encrypted HTTPS connection.
You could install an nginx proxy to achieve this, but in that case you’d have to maintain SSL certs yourself, which obviously sucks, especially when you can use AWS Certificate Manager with an AWS ELB instead (see the sketch below).
Another reason to use a different URL is to save time: when you use Windows slaves via JNLP, there are well-known issues with both nginx and load balancers.
And last but not least, a “LAN” connection between the Jenkins master and slaves is faster and more secure, so it’s preferable in most cases.
So let’s start implementing the Jenkins and GitHub integration under these conditions!
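For the public HTTPS side, here is a rough sketch (assuming an Application Load Balancer in front of Jenkins; all ARNs are placeholders) of attaching an ACM certificate to the load balancer's HTTPS listener, which is what lets you avoid maintaining SSL certs on an nginx proxy:

```bash
# Sketch: terminate HTTPS on an ALB with an ACM certificate instead of
# maintaining SSL certs on an nginx proxy. All ARNs are placeholders.
aws elbv2 create-listener \
  --load-balancer-arn "${JENKINS_ALB_ARN}" \
  --protocol HTTPS \
  --port 443 \
  --certificates CertificateArn="${ACM_CERT_ARN}" \
  --default-actions Type=forward,TargetGroupArn="${JENKINS_TARGET_GROUP_ARN}"
```

The slaves, meanwhile, keep talking to the master over its internal ("LAN") address, so only browsers and GitHub webhooks go through the public HTTPS endpoint.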
Nowadays applications can be created really fast, but Ops teams are still slow.
It this article I’ll show an example how to manage this situation.