
Cron in the Cloud for smoother BizOps


An example use case of how to manage cron with Octopus Deploy and Google Firebase for a modern technology company.

We are rolling out a new business process that requires a daily function which synchronizes several databases and does some deep data learning, which can take anywhere from 15 to 25 minutes. This process has to run daily, ideally multiple times per day, so that our CRM databases stay in sync and our people have the latest information at their disposal.

Usually, when a technically inclined person needs to run some sort of daily or hourly job, they reach for cron. Cron has been around for a long time and is a very reliable way to have a server, or a computer, execute a command at an interval.
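Conceptually, cron wakes up every minute and compares the current time against the schedule fields of each crontab entry (for example, `0 * * * *` means "at minute zero of every hour"). Here is a toy sketch of just the minute and hour matching, with hypothetical helper names, to illustrate the interval idea:

```python
# Toy illustration of how cron decides when to fire. A crontab line's first
# two fields give the minute and hour; '*' matches any value. (Real cron also
# handles day, month, weekday, ranges, and step values.)
def matches(field: str, value: int) -> bool:
    return field == "*" or int(field) == value

def should_run(spec: str, minute: int, hour: int) -> bool:
    """spec is the first two crontab fields, e.g. '0 *' = hourly at minute 0."""
    m, h = spec.split()
    return matches(m, minute) and matches(h, hour)

print(should_run("0 *", minute=0, hour=14))   # hourly job fires at 14:00 -> True
print(should_run("30 6", minute=0, hour=14))  # daily 06:30 job stays idle -> False
```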

But what happens when you have invested tens of thousands of dollars into building a multi-tenant, container-based web hosting infrastructure? Where do you run cron?

We could run it inside a container, or we could have Kubernetes handle it, but in our particular use case that doesn't really make sense, because our Kubernetes cluster is, once again, a multi-tenant hosting platform we provide for clients.

So to find a solution to this problem I decided to go further into the cloud.

For a long-term project I've been involved with, where my primary objective is to make sure a highly available infrastructure is maintained and routinely updated as new requirements come in, I have become very familiar with Octopus Deploy.


Although I don't use their scheduler feature for our infrastructure management, I have seen it around and have been considering using it to run routine checks.

So here is how I used DevOps tools to solve a business problem. I suppose if I liked buzzwords, I would call this BizOps.

First, here is a summary of where our special business-oriented functions live. Since these functions are very purpose-specific, I could not justify provisioning a dedicated Docker container, which in all honesty is fairly overpowered for a single function, as most of our containers are optimized for hosting high-end WordPress sites. I did not need Memcached, an Elasticsearch index, or a MySQL cluster to run one function, so our infrastructure isn't the best fit.

Instead, I used Google Cloud Functions, which are integrated with Firebase. This made sense because I was already using the Firebase Realtime Database, which synchronizes all of its updates to an Elasticsearch cluster that we use for data analysis.
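The function itself just needs to respond to an HTTP trigger, do its work, and report a status the caller can check. The original setup would have used Firebase's Node.js runtime; as a rough sketch of the same shape, here is an HTTP-style handler in Python with hypothetical names and the actual sync work stubbed out:

```python
# Sketch of an HTTP-triggered cloud function (hypothetical names; the real
# sync and data-learning work is elided). It returns a JSON status so the
# caller can tell a clean run from a failure.
import json

def sync_databases(request):
    """HTTP entry point: run the sync, then report the outcome as JSON."""
    try:
        # ... synchronize the CRM databases and run the analysis here ...
        return json.dumps({"status": "ok"})
    except Exception as exc:
        return json.dumps({"status": "failed", "error": str(exc)})
```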

Simply put, I created an Octopus project that invokes the URL given to me by the Firebase cloud function, on an interval managed by Octopus Deploy via their hosted solution.


Every hour or so, Octopus creates a new release for this project and processes the steps outlined above. It's quite simple, really: all Octopus does is make a web request to my cloud function, which otherwise sits there waiting for somebody to invoke it.
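In other words, the scheduled step boils down to a single web request. A minimal sketch of that request, assuming a hypothetical function URL (in practice Octopus would run the equivalent in a script step on its schedule trigger):

```python
# All the scheduled step does: one web request to the cloud function,
# returning its decoded JSON reply. FUNCTION_URL is a made-up example.
import json
import urllib.request

FUNCTION_URL = "https://us-central1-example.cloudfunctions.net/syncDatabases"

def invoke_sync(url: str = FUNCTION_URL, timeout: int = 1800) -> dict:
    """Call the function and decode its JSON reply; it can run for ~25 minutes,
    so the timeout is generous."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))
```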

After the function is done processing, we do a quick check to make sure it returned something that is not an error or failure message of some kind.
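Assuming the function replies with a JSON body that carries a status field (a hypothetical shape, not the actual response format), that check can be as small as this:

```python
# Sketch of the post-run check: treat any reply with an 'error' key or a
# non-'ok' status as a failure. The response shape here is assumed.
def looks_like_failure(payload: dict) -> bool:
    return "error" in payload or payload.get("status") != "ok"

print(looks_like_failure({"status": "ok"}))                       # -> False
print(looks_like_failure({"status": "failed", "error": "boom"}))  # -> True
```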


Finally, we simply send a Slack notification to let us know that everything worked well.
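Slack makes this easy with incoming webhooks, which accept a JSON body containing a `text` field. A sketch of that final step, with a placeholder webhook URL:

```python
# Sketch of the Slack notification step via an incoming webhook.
# WEBHOOK_URL is a placeholder, not a real hook.
import json
import urllib.request

WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXXXXXX"

def slack_payload(text: str) -> bytes:
    """Incoming webhooks accept a JSON body with a 'text' field."""
    return json.dumps({"text": text}).encode("utf-8")

def notify(text: str, url: str = WEBHOOK_URL) -> None:
    """POST the message to the webhook."""
    req = urllib.request.Request(
        url,
        data=slack_payload(text),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```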

[Screenshot: the Slack notification]

On a side note, one problem with cron was that it was very difficult to know that it worked; you would essentially have to SSH into a Linux box and tail some logs. Nobody likes doing this, and if they say they do, they are lying and trying to justify their job. Ideally, the people who actually need the data should be able to dig in and make sure everything works if something seems out of whack. Octopus Deploy solves this problem as well, since each release can be easily viewed in the Octopus interface by anybody who has read permissions. Since we like to pride ourselves on transparency and universal access to data, this fits right in with our core values.


And now we have what used to be a traditional cron job, but completely in the cloud. We are ready for 2019 with a little less technical debt than before.