The deployment process continues to garner interest from new developers and seasoned engineers alike, as the technologies and processes around it evolve and get more efficient. Gone are the days of cowboy-coding and manual FTP transfers. The modern deployment process is command-line driven and continuously integrated with a remote repository system. As a Top 25 Software Development Company on SoftwareDevelopmentCompany.co, we spend a lot of time working to perfect our process.
Finding the right fit can be tricky, though. By leveraging tools like GitHub and (a new favorite) DeployHQ, we finally nailed down a process that we love, one that lets us control our deployments from the command line and keeps them in sync with our git repos.
In this article, we’ll outline our preferred setup. In an effort to not get too into the weeds, we make one fairly large assumption: that you’re using GitHub. This process doesn’t strictly require GitHub, so you can hopefully just replace references to GitHub with your remote repository service of choice, as I’m sure they all have similar setups. With that said, if you have any questions regarding this or a similar setup and workflow, we’d be happy to work through them with you – drop me a line, or leave a comment below.
- Create accounts for both GitHub and DeployHQ.
- Create a new repo on GitHub (or you can use existing project repo).
- Connect your DeployHQ account to your GitHub account and repo.
- Start working locally.
- When you’re at a good place locally, commit your changes to git.
- Push your changes to your GitHub repo.
- GitHub notifies DeployHQ of the changes to the repo.
- DeployHQ deploys your files to your server (only the diffs).
- QA your site to see the changes in action.
- Do your happy dance!
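In practice, the local half of that list (steps 4 through 6) is just the git routine you likely already know. A quick sketch, assuming your repo and remote are already set up (the branch name and commit message are examples):

```shell
# From your project directory, once DeployHQ is connected:
git add -A
git commit -m "Tweak homepage copy"   # example message
git push origin staging               # example branch; this push kicks off the deploy
```

Everything after the push happens without you.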
For reference, the “old way” of deploying sites…
For those who have been around a while, the old process will be familiar:
- Work locally as much as possible for efficiency
- When ready, connect via FTP to a staging server to upload your files, show your progress, and get client review
- Upon approval, connect via FTP to your production server and upload your site
- When feeling rebellious, you may even connect to the server and edit live files right there 🤫
On the surface, that all seems pretty easy, and in fact, it is. But it could, and should, be easier. We’re in the age of productivity – if we can save 5 minutes on anything, we do it.
The first solution was SSH. SSH is kind of like FTPing into your server, except you do it all via the command line and it’s more secure. Since we were already in the command line, this let us integrate git: we could connect to our server and do a git pull to bring our repo updates down to the server.
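Concretely, that routine looked something like this (the user, host, and path here are made up):

```shell
# Connect to the server, then pull the latest changes:
ssh deploy@staging.example.com
cd /var/www/example-site
git pull origin master

# Or, as a one-liner worth stashing in TextExpander:
ssh deploy@staging.example.com 'cd /var/www/example-site && git pull origin master'
```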
This was pretty nice.
Very quickly, though, this approach became too cumbersome (at least for me). We found getting git installed on our client servers to be unnecessarily complicated (many clients used shared servers, which was the basis of most issues). Also, remembering the SSH connection command for each client’s server became more effort than it was worth. I’m sure there was some way we could have made this easier (I did use TextExpander to at least store shortcuts), but it was still no bueno, in my opinion.
So, we set out to find the next best option to implement a more modern deployment process.
We needed a way that was:
1. Reasonably easy to set up
2. Connected to GitHub
First Attempt – Beanstalk
Beanstalk basically does exactly what we wanted. The hang-up for me was that you have to host your repos with them – you can’t connect to GitHub. Initially, this wasn’t a deal breaker, so we gave it a go. However, we have a lot of repos and you are fairly limited with Beanstalk, so the cost (while more than reasonable for the product they’re providing) soon became something we couldn’t quite justify, especially when we really preferred to be using GitHub.
Second Attempt – Manual
Part of the reason I couldn’t stomach the cost of Beanstalk was that I knew (or thought I knew) you could basically set up the server yourself to do what Beanstalk was doing for you. You add a script (a webhook endpoint) that listens for changes to your repo, then does some git magic to pull the changes to your server. Much easier said than done. Granted, I didn’t give it the old college try, but it soon became clear it was more work than we were willing to commit to this. My time should be spent building stuff, not managing servers, so we gave up.
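For the curious, here’s a minimal sketch of that DIY idea. A small deploy script does the “git magic” (fetch and hard-reset), and we simulate the webhook firing by calling the script directly; local temp directories stand in for GitHub and the server, and every path and name is hypothetical. A real endpoint would also need to verify the webhook’s shared secret before deploying:

```shell
#!/bin/sh
set -e
ROOT=/tmp/webhook-demo; rm -rf "$ROOT"; mkdir -p "$ROOT"

# Stand-ins: hub.git plays GitHub, server/ plays the web root
git init -q --bare "$ROOT/hub.git"
git clone -q "$ROOT/hub.git" "$ROOT/laptop"
cd "$ROOT/laptop"
git config user.email dev@example.com
git config user.name Dev
echo "v1" > index.html
git add .
git commit -q -m "v1"
git push -q origin HEAD:refs/heads/main

git clone -q -b main "$ROOT/hub.git" "$ROOT/server"

# deploy.sh: what the webhook endpoint would run on each push
cat > "$ROOT/deploy.sh" <<'EOF'
#!/bin/sh
cd /tmp/webhook-demo/server || exit 1
git fetch -q origin
git reset -q --hard origin/main
EOF
chmod +x "$ROOT/deploy.sh"

# Push a change, then pretend GitHub's webhook just fired
echo "v2" > index.html
git commit -q -am "v2"
git push -q origin HEAD:refs/heads/main
"$ROOT/deploy.sh"
```

Getting this far is the easy part; wiring up a real HTTP endpoint, permissions, and secrets on a shared host is where it fell apart for us.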
Finally, I Discovered DeployHQ…
I stumbled across DeployHQ and it checked off all the boxes for me:
- Set up is a breeze.
- It connects to your existing GitHub repos.
- It creates a webhook that allows for automatic deployments to your staging and/or production servers any time you push to a remote repo (if you want it to, which we did).
Detailed Instructions for How to Deploy a Website Automatically Using GitHub and DeployHQ
First things first, make sure you’re set up on GitHub. It’s $7/month for unlimited private repos – can’t beat that in my opinion. I won’t get into the weeds on how to do that; there are lots of great resources already out there about getting set up on GitHub.
Next, set up a DeployHQ account. Once your account is set up, you can dive right in and “Create Your First Project”.
Give your project a name and select where your repository is hosted. Again, in our case, it’s GitHub.
When you click “Create Project”, if you’re not already logged in to your GitHub account, you’ll be prompted to do so, so that you can select your repo and branch from GitHub.
In doing this, DeployHQ does a lot of magic for you. It creates a webhook and automatically adds that webhook to your selected repo in GitHub.
In short, this essentially connects DeployHQ to GitHub and allows the two services to securely communicate with each other when changes happen in your repo.
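If you’re curious what actually travels over that webhook, GitHub POSTs a JSON payload describing each push. A heavily trimmed sketch (the values are placeholders):

```json
{
  "ref": "refs/heads/master",
  "before": "abc123...",
  "after": "def456...",
  "commits": [
    {
      "message": "Tweak homepage copy",
      "added": [],
      "modified": ["index.html"],
      "removed": []
    }
  ]
}
```

Those added/modified/removed lists are what make diff-only deployments possible.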
The last thing you need to do is add a server. You can do this in any number of ways — SSH, FTP, etc.
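If you go the SSH route, the mechanics are ordinary public-key authentication: DeployHQ provides a public key for the project, and you authorize it on your server. Roughly (the key and paths below are placeholders):

```shell
# On your server, as the deploy user, authorize DeployHQ's public key:
mkdir -p ~/.ssh
echo "ssh-ed25519 AAAA... deployhq" >> ~/.ssh/authorized_keys
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
```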
And so, as you can see, it all eventually goes back to the old school way of deploying files via SSH or FTP, except now, we’re letting a computer do all of the connecting and deploying for us while we sit back and sip our coffee.
More specifically, here’s the workflow…
- Work locally on a staging branch of your repo, for example.
- Commit your changes and push to the GitHub repo.
- Once that commit hits GitHub’s servers, the webhook that DeployHQ set up on our repo sends a notification to the DeployHQ servers saying, “Hey, there have been some changes to this repo!”
- DeployHQ, which has access to both our server and our repo, says, “Cool, I’ve got it from here!” and transfers any modified files from GitHub to our server automatically.
- You can then pull up your staging site and review the changes. Assuming all is good, merge those changes from the staging branch into master, push master to GitHub, and the process starts over again — GitHub notifies DeployHQ, DeployHQ deploys files to production.
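That staging-to-production dance can be sketched end to end. In the runnable demo below, a local bare repo stands in for GitHub (so nothing actually deploys); the branch names match the example above, and in real life each push is what triggers DeployHQ:

```shell
#!/bin/sh
set -e
ROOT=/tmp/promote-demo; rm -rf "$ROOT"; mkdir -p "$ROOT"

git init -q --bare "$ROOT/hub.git"            # stand-in for the GitHub repo
git clone -q "$ROOT/hub.git" "$ROOT/site"
cd "$ROOT/site"
git config user.email dev@example.com
git config user.name Dev

echo "base" > index.html
git add .
git commit -q -m "base"
git branch -M master
git push -q origin master

git checkout -q -b staging                    # day-to-day work happens here
echo "update" > index.html
git commit -q -am "update"
git push -q origin staging                    # would deploy to the staging server

# After QA approves, promote to production:
git checkout -q master
git merge -q staging
git push -q origin master                     # would deploy to production
```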
*Note: many people would recommend NOT automating the deployment process to production. That’s not a bad idea if you want to be extra careful. If you don’t automate that part, you just need to log in to DeployHQ and trigger a manual deployment whenever you’re ready – or there may well be a way to trigger one from the command line, too.
There’s waaay more cool stuff you can do (like creating templates to make setup quicker), but even this basic setup allows us to quickly and easily deploy sites from our local machines to a public server from the command line.
It’s amazing to be able to just push your changes to GitHub (something you’re probably doing already) and be done. No more opening an FTP app, connecting, waiting to connect, manually dragging files to the server, or forgetting which files have changed and uploading a whole batch just to be safe.
All of this happens for you very, very efficiently.