Long time no hear, but we are back! And we are not coming empty-handed. I want to introduce you to my new project: Azure Pipelines How To, the series. 🎉

Recently, I was assigned an exciting task: moving the environment files (non-critical “secret” files) out of the repository and fetching them dynamically during the pipeline build job. Let’s go through the development process together once again.
Prerequisites
There are several circumstances we have to consider. Firstly, the Pipeline space is shared between several projects. Thus, the Library is also shared, meaning file names must be unique. Secondly, we work with a React app, so KeyVault is not an option (I will explain why later). Thirdly, we are working with something that already exists, so we don’t want to change a naming convention that has already been approved. Lastly, we have 2 different environments – dev and test. CI pipelines run in Azure DevOps.
The first and third constraints can give us the most hiccups, but fear not. We’ll solve them.
What needs to be done
Let’s start with the to-do list.
- Upload your files to Library.
- Locate an existing pipeline and find your way around it. Find a build job. (or any other place where you need your secret files. Can also be in e2e 🤷🏼♀️)
- At the very beginning of it, add a download file task.
- Write a short bash script to rename the file from the custom name to the one agreed on before.
- Pray it works on the first try.
Spoiler alert: didn’t work for me.
With that, we can start!
Uploading secured files in ADO
Uploading your files is as simple as going to your Azure DevOps project and then to Pipelines > Library (pic 1). Go to the Secure files tab and click “+ Secure file” (pic 2). The only thing to remember is naming. Give each file a unique name. In our case, it’ll be GosiaTestApp-env.{ENVIRONMENT}.
Pic. 1. Navigate to Library in the Pipelines tab | Pic. 2. Add secure files
Modify pipeline job
I won’t be able to help you find the exact place where you will need this in your pipeline. It can be different – build, e2e, units, whatever. Just find a job in which you are going to need those files. In my case, it’ll be the beginning of the build process. Now, the fun part begins! We have to extend the job with some more steps. First, let’s download the files we’ve just published. You can get inspiration from the lines below:
```yaml
parameters:
  - name: environment
    type: string
    values:
      - test
      - dev

# [...]

- task: DownloadSecureFile@1
  name: environmentSecrets
  displayName: "Download secure file for Gosia's test app"
  inputs:
    secureFile: 'GosiaTestApp-env.${{ parameters.environment }}'
```
`DownloadSecureFile@1` is the name of the task we have to execute to fetch the files. In `inputs`, we can define several things. We used `secureFile` to define the file name we want to fetch – custom for our application. That’s the only required input. Add the optional `retryCount` (with default value 8) and/or `socketTimeout` (more info in MSDN) if needed. `${{ parameters.environment }}` is the variable from which we get the environment the build is running on.
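For illustration, the optional inputs could be sketched like this (the values below are just examples, not recommendations – check the task documentation for units and defaults):

```yaml
- task: DownloadSecureFile@1
  name: environmentSecrets
  displayName: "Download secure file for Gosia's test app"
  inputs:
    secureFile: 'GosiaTestApp-env.${{ parameters.environment }}'
    retryCount: 8         # optional: how many times to retry the download (default: 8)
    socketTimeout: 60000  # optional: socket timeout for the download request
```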
With that in place, let’s move to my personal hell – scripts.
Rename your file
Currently, the file’s name is set to `GosiaTestApp-env.${{ parameters.environment }}` and we expect it to be `.env.{ENVIRONMENT}`. Let’s change that.
```yaml
- bash: mv $(Agent.TempDirectory)/GosiaTestApp-env.${{ parameters.environment }} $(Build.SourcesDirectory)/.env.${{ parameters.environment }}
  displayName: Rename environment file script
```
It’s a simple one-liner, but it has one trick: figuring out the paths. It took me quite a while to work that out. I cannot promise it’ll be the same in your case, but I’m giving you a starting point. There are some predefined variables in Azure Pipelines. Among them are the Agent variables (with `TempDirectory`, where the downloaded file is stored), as well as the Build variables, which were important for us, as that is where we find `SourcesDirectory` – the local path on the agent where your source code files are downloaded. If you are not working with build-related files, you have to figure that part out yourself. It’s important!
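If you want to sanity-check the rename logic locally before burning pipeline minutes, you can simulate it with stand-in directories. Note that the variables below are plain local stand-ins I made up for this sketch; in the real pipeline, `$(Agent.TempDirectory)` and `$(Build.SourcesDirectory)` are provided by the agent.

```shell
# Stand-ins for the Azure Pipelines predefined variables (local simulation only).
AGENT_TEMPDIRECTORY=$(mktemp -d)
BUILD_SOURCESDIRECTORY=$(mktemp -d)
ENVIRONMENT=test

# The downloaded secure file lands in the agent temp directory...
echo "API_URL=https://example.test" > "$AGENT_TEMPDIRECTORY/GosiaTestApp-env.$ENVIRONMENT"

# ...and the build expects .env.{ENVIRONMENT} next to the sources.
mv "$AGENT_TEMPDIRECTORY/GosiaTestApp-env.$ENVIRONMENT" \
   "$BUILD_SOURCESDIRECTORY/.env.$ENVIRONMENT"

ls -la "$BUILD_SOURCESDIRECTORY"
```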
Put it together and pray it works
In the end, your code should look like this:
```yaml
parameters:
  - name: environment
    type: string
    values:
      - test
      - dev

jobs:
  - job: some_test_${{ parameters.environment }}
    # [...]
    steps:
      # [...]
      - task: DownloadSecureFile@1
        name: environmentSecrets
        displayName: "Download secure file for Gosia's test app"
        inputs:
          secureFile: 'GosiaTestApp-env.${{ parameters.environment }}'
      - bash: mv $(Agent.TempDirectory)/GosiaTestApp-env.${{ parameters.environment }} $(Build.SourcesDirectory)/.env.${{ parameters.environment }}
        displayName: Rename environment file script
```
This worked fine, but you can imagine that getting there was a journey. Please see the next section if you want to learn from some of my mistakes.
Why not KeyVault for frontend app
In terms of securing the application’s secrets, there is a KeyVault option in the Azure Portal; however, we decided not to go that path for several reasons:
- Increased code complexity. It is easy to make frontend code complex and difficult to read. We don’t want to add extra code if we can avoid it.
- The gap between frontend and “infrastructure”. Those who are purely front-end developers might not be interested in understanding how things work in the Azure Portal, let alone taking a look into it. That might create extra load for DevOps/Ops/platform/backend teams to maintain. Secret files are much more straightforward – you upload them once, and if a file has to change, you only re-upload it.
- Maintenance cost. As I mentioned in the second point – even if the frontender wants to learn and is willing to maintain those values, it adds extra load and cost for maintenance. First, the initial step is much longer, as you manually create key-value pairs for every secret in every environment, while files most likely exist already from the development process. Then, every update and/or tiny change means tons of updates.
Of course, nothing is stopping you from choosing this path. This doc on MSDN can be a good starting point. Let us know if it works for you in the comment section below.
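If you do choose the KeyVault path, a pipeline step might look roughly like the sketch below. The service connection and vault names are placeholders I invented for illustration; check the `AzureKeyVault` task documentation for the full list of inputs.

```yaml
# Hypothetical example – 'my-service-connection' and 'my-keyvault'
# are placeholders for your own Azure service connection and vault.
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'my-service-connection'
    KeyVaultName: 'my-keyvault'
    SecretsFilter: '*'   # or a comma-separated list of secret names
    RunAsPreJob: false
```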
“There are no mistakes in life, only lessons.”
Here are my lessons from this process.
1️⃣ Firstly, I tried to fetch the files and rename them in a separate job. That was silly, as jobs are create-do-kill containers, so that multiple jobs can run simultaneously (I’ll create a separate post on those)! No wonder I was getting:
Error: Failed to find .env file at path: .env.test
if I fetched a file and then deleted it right away 🤣 ✅
2️⃣ Then, renaming. I started without any directory. That resulted in the same error – not surprising when you remember that the “cloud” is a physical server somewhere, with physical files. We have to define a directory. ✅
3️⃣ Lastly, plan separate Library spaces for each project. That will make your life easier and your pipelines prettier. No renaming, no scripting, no figuring out the correct directory. ✅
Mischief managed,