
CodePipeline for Client and Server Web Projects

What is the recommended way to build and deploy a project with client static assets (Webpack) and server code (in this case Python) using AWS CodePipeline? I’m not sure, but here’s what I’ve learned.

I set up AWS CodeBuild to build and test a Python (Django) web project with a Node (React) front end.
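
The CodeBuild side is driven by a buildspec, and something in this spirit covers both halves of the project (runtime versions, paths, and commands here are illustrative placeholders, not our exact configuration):

```
# buildspec.yml -- a sketch; adjust runtimes and paths to your project layout
version: 0.2

phases:
  install:
    runtime-versions:
      python: 3.7
      nodejs: 10
    commands:
      - pip install -r requirements.txt       # server (Django) dependencies
      - npm ci --prefix client                # front-end (React) dependencies
  build:
    commands:
      - npm run build --prefix client         # Webpack build of the static assets
      - python manage.py test                 # Django test suite
```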

The take-away: if you’re committed to getting everything on AWS, or you expect your builds to be too frequent and CPU-intensive for another CI provider, sure, go for it. Otherwise, stick with something that has more features and better documentation (like CircleCI).

A single GitHub repository hosts the source code for both projects. Despite some Docker machinations to allow the tests to run against a real Postgres instance (SQLite doesn’t support JSON fields), it runs fine.
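
One way to pull off those Docker machinations, assuming the CodeBuild project has privileged mode enabled, is to start a throwaway Postgres container before the tests run and point Django’s test database settings at it. A sketch of the extra install-phase commands (image tag and password are placeholders):

```
phases:
  install:
    commands:
      # Start a disposable Postgres container for the test run
      - docker run -d --name test-db -p 5432:5432 -e POSTGRES_PASSWORD=postgres postgres:11
      # Wait until the database accepts connections before the tests start
      - until docker exec test-db pg_isready -U postgres; do sleep 1; done
```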

The next step was to enable Continuous Deployment using AWS CodePipeline. We were already using Elastic Beanstalk, so the server deploy from my first pipeline was straightforward. However, EB deploys the code from source control, not the compiled front-end code.

I didn’t really want the staging server to host the front-end code anyway. This is a staging/test server; ideally it would be similar to production and reference the front-end code from CloudFront, or at least S3. While CodeBuild can produce multiple artifacts, I couldn’t figure out how to reference a secondary artifact (the compiled front end) in my pipeline. So, I added a second pipeline with another build for just the front end. It was easy enough to get it to push the build to S3. Since this is a staging environment, I’m using a public S3 bucket to host the front-end code rather than CloudFront as we use in production. Also, I was just too tired of clicking on things by that point.
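
The front-end-only build is little more than compiling the assets and syncing them to the bucket. A sketch of what that buildspec can look like (the bucket name is a placeholder, and a CodePipeline S3 deploy action could handle the upload instead):

```
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 10
    commands:
      - npm ci --prefix client
  build:
    commands:
      - npm run build --prefix client
  post_build:
    commands:
      # Publish the compiled assets to the public staging bucket
      - aws s3 sync client/build s3://my-staging-assets/ --acl public-read
```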

The first build still redundantly compiles the front end in order to run the tests. That first build is also integrated with GitHub for reporting build status. At night, just before I go to sleep, I think about how CodeBuild is running three builds for every change in GitHub: the standalone build runs once, and each pipeline triggers a build of its own. I could probably consolidate both builds into one pipeline and restrict what triggers a build on that pipeline to reduce the number of builds. But again, tired of clicking on things.

UPDATE

Tuesday, Sep 24 2019

We’ve now given up on AWS CodeBuild and CodePipeline and migrated to CircleCI. This is disappointing, given that our simple JavaScript + Python build feels like a common scenario.

The next thing I wanted to do was deploy the assets to a new S3 folder on every build and then pass the name of that folder to the Python build (so it could be set in an environment variable when the app is deployed to Elastic Beanstalk). This would give us control over switching to the new version of the client-side assets at the moment the server is deployed.
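
Concretely, the CodeBuild half of that idea might look like the sketch below (the bucket name and JSON key are hypothetical); it’s handing the prefix over to the deploy stage that I never got working:

```
phases:
  post_build:
    commands:
      # Upload the compiled assets under a per-build prefix (the commit SHA)
      - aws s3 sync client/build "s3://my-staging-assets/builds/${CODEBUILD_RESOLVED_SOURCE_VERSION}/"
      # Emit the prefix as a small JSON artifact for a later pipeline stage to read
      - printf '{"AssetPrefix":"builds/%s"}' "${CODEBUILD_RESOLVED_SOURCE_VERSION}" > build-output.json

artifacts:
  files:
    - build-output.json
```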

Unfortunately, I couldn’t figure out how to do it.

I found clues that you can write a JSON file as a build artifact and then use the “ParameterOverrides” setting (with Fn::GetParam) to read a value out of that build output. However, despite using boto3 to fetch the relevant stage of the pipeline and set the ParameterOverrides from the command line (you can’t do it via the AWS Console UI), the overrides would disappear. My best guess is that ParameterOverrides can only be specified when the pipeline itself is constructed with CloudFormation (not via the AWS Console UI or the CLI).
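
For the record, this is roughly the sort of thing I was attempting with boto3; the pipeline, action, artifact, and parameter names below are placeholders:

```
import json
import boto3

codepipeline = boto3.client("codepipeline")

# Fetch the current pipeline definition
pipeline = codepipeline.get_pipeline(name="staging-pipeline")["pipeline"]

# Find the deploy action and add a ParameterOverrides entry that reads a
# value out of the JSON file in the build output artifact via Fn::GetParam
for stage in pipeline["stages"]:
    for action in stage["actions"]:
        if action["name"] == "Deploy":
            action["configuration"]["ParameterOverrides"] = json.dumps({
                "AssetPrefix": {
                    "Fn::GetParam": ["BuildOutput", "build-output.json", "AssetPrefix"]
                }
            })

# Write the definition back; this is where the overrides would vanish for us
codepipeline.update_pipeline(pipeline=pipeline)
```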
