@reason.co/pipelines-support

██████╗ ███████╗ █████╗ ███████╗ ██████╗ ███╗   ██╗    ██████╗ ██████╗ 
██╔══██╗██╔════╝██╔══██╗██╔════╝██╔═══██╗████╗  ██║   ██╔════╝██╔═══██╗
██████╔╝█████╗  ███████║███████╗██║   ██║██╔██╗ ██║   ██║     ██║   ██║
██╔══██╗██╔══╝  ██╔══██║╚════██║██║   ██║██║╚██╗██║   ██║     ██║   ██║
██║  ██║███████╗██║  ██║███████║╚██████╔╝██║ ╚████║██╗╚██████╗╚██████╔╝
╚═╝  ╚═╝╚══════╝╚═╝  ╚═╝╚══════╝ ╚═════╝ ╚═╝  ╚═══╝╚═╝ ╚═════╝ ╚═════╝ 
                                                                       
Pipeline Support Library (c) 2020 With Reason Ltd

-----------------------------------------------------------------------------

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
IN THE SOFTWARE.

----------------------------------------------------------------------------- 

Pipeline Support scripts

For projects with multiple sub-services living under a "mono repo" structure, this tooling automatically scans subfolders within the service directory and invokes the relevant test or deploy scripts.

service/
├── s3-or-azure-service/
│   ├── [project files...]
│   └── package.json
└── serverless-service/
    ├── [project files...]
    └── serverless.yaml

Once correctly configured, deployments are triggered automatically based on the folder structure within your project and the changes since the given commit. YAML and JSON files are also inspected to determine the correct deployment type, such as S3 or Serverless.

This readme uses Bitbucket Pipelines as the example, but the same commands can be run in any CI pipeline.

Environment variables

Credentials should be stored in "secured" environment variables in the Pipeline settings.

For Bitbucket, you can edit these in Repository Settings > Pipelines > Repository variables.

AWS

To deploy to AWS, ensure the SERVICE_ENV, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables are available.

Azure

To deploy a zip bundle to the Azure SCM endpoint, you will need the AZURE_DEPLOY_USER and AZURE_DEPLOY_PASS credentials populated with a service account.

Adding deployments to your project

Install the package (npm i @reason.co/pipelines-support) and add the deployer to your pipeline:

Example for Bitbucket Pipelines (deploying to the remotedev stage):

pipelines:
  branches:
    master:
      - step:
          script:
            - export SERVICE_ENV=remotedev
            - export AWS_ACCESS_KEY_ID=$REMOTEDEV_AWS_ACCESS_KEY_ID
            - export AWS_SECRET_ACCESS_KEY=$REMOTEDEV_AWS_SECRET_ACCESS_KEY
            - node ./node_modules/@reason.co/pipelines-support/index.js test $BITBUCKET_COMMIT remotedev
            - node ./node_modules/@reason.co/pipelines-support/index.js deploy $BITBUCKET_COMMIT remotedev

Per the above, ensure SERVICE_ENV, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are available.

When invoking pipelines-support, specify the following arguments in order (generic form shown after the list):

  • Job (test or deploy)
  • Commit used to check for file changes ($BITBUCKET_COMMIT is exposed by default in Bitbucket Pipelines; the equivalent variable will differ depending on the CI service you are using)
  • Stage to apply (such as remotedev)
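
Putting these together, the generic invocation is:

node ./node_modules/@reason.co/pipelines-support/index.js <job> <commit> <stage>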

S3 static website deployments

Ensure your local package.json has an "s3-deployment" node:

"s3-deployment": {
    "remotedev": {
      "region": "eu-west-1",
      "bucket": "fqdn.bucketname-remotedev.com"
    },
    "staging": {
      "region": "eu-west-1",
      "bucket": "fqdn.bucketname-staging.com"
    },
    "production": {
      "region": "eu-west-1",
      "bucket": "fqdn.bucketname-production.com"
    }
  },

This will match the stage specified when executing the script (such as remotedev) and pick the corresponding bucket for that stage.
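
Conceptually, a remotedev run then syncs your built assets into the matching bucket, similar to the following sketch (the ./build directory and flags are illustrative assumptions, not necessarily the exact command the tool runs):

aws s3 sync ./build s3://fqdn.bucketname-remotedev.com --region eu-west-1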

This will automatically create some rules on your bucket, but others need to be added manually and double-checked.

Bucket Policy

Either in the GUI or via Terraform if applicable, ensure your bucket policy (in Permissions) allows s3:GetObject per:

{
    "Version": "2012-10-17",
    "Id": "REASONDEPLOYPOLICY",
    "Statement": [
        {
            "Sid": "Stmt1508317523362",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::your-bucket-name-here/*"
        }
    ]
}
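
If you prefer the AWS CLI to the console, the same policy can be applied from a file (assuming the JSON above is saved as policy.json):

aws s3api put-bucket-policy --bucket your-bucket-name-here --policy file://policy.json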

If using Terraform, add the following (this assumes you have an aws_s3_bucket.www resource):

resource "aws_s3_bucket_policy" "policy" {
  bucket = aws_s3_bucket.www.id
  policy = <<POLICY
{
    "Version": "2012-10-17",
    "Id": "REASONDEPLOYPOLICY",
    "Statement": [
        {
            "Sid": "Stmt1508317523362",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::${aws_s3_bucket.www.id}/*"
        }
    ]
}
POLICY
}

Static Website Hosting Configuration

Also ensure that "Static website hosting" (under the bucket's Properties) is enabled and that the default index document is set to your index.html or equivalent.

If using Terraform, please add the following:

resource "aws_s3_bucket" "www" {
  bucket = var.bucket
  acl    = "public-read"
  website {
    index_document = "index.html"
    error_document = "error.html"
  }
}
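
Note that this snippet targets AWS provider v3, current when this was written. On AWS provider v4 or later, the acl and website arguments have moved to standalone resources; a rough equivalent:

resource "aws_s3_bucket" "www" {
  bucket = var.bucket
}

resource "aws_s3_bucket_acl" "www" {
  bucket = aws_s3_bucket.www.id
  acl    = "public-read"
}

resource "aws_s3_bucket_website_configuration" "www" {
  bucket = aws_s3_bucket.www.id

  index_document {
    suffix = "index.html"
  }

  error_document {
    key = "error.html"
  }
}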

Azure ZIP deployments

As noted under Environment variables, the AZURE_DEPLOY_USER and AZURE_DEPLOY_PASS credentials must be populated with a service account.

This does not auto-create services; the target service must already exist before you push to it.

Ensure your local package.json has an "azure-push-deployment" node:

"azure-push-deployment": {
    "remotedev": {
      "service-name": "the-name-of-your-azure-service"
    }
  },
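
Under the hood, a push deployment like this targets the service's SCM (Kudu) zipdeploy endpoint. Conceptually it is similar to the following sketch (the bundle name is illustrative; this is not necessarily the exact command the tool runs):

curl -X POST \
  -u "$AZURE_DEPLOY_USER:$AZURE_DEPLOY_PASS" \
  --data-binary @bundle.zip \
  https://the-name-of-your-azure-service.scm.azurewebsites.net/api/zipdeploy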

Serverless deployments

If a Serverless YAML file is detected, this will trigger a Serverless deployment after running an npm install.

If a stage is specified, it is passed to the Serverless job as --stage STAGE_HERE (for example, serverless deploy --stage remotedev).

Whilst this guide specifies AWS environment variables, any other provider supported by the Serverless Framework will work, as long as the correct environment variables are supplied.
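
For reference, a minimal serverless.yaml that would be picked up might look like this (service and function names are illustrative):

service: serverless-service

provider:
  name: aws
  runtime: nodejs12.x
  region: eu-west-1

functions:
  hello:
    handler: handler.hello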

Other Job Types

Migrate DB

node ./node_modules/@reason.co/pipelines-support/index.js migrate-db $BITBUCKET_COMMIT remotedev

Will run the NPM script migrate-db on any package.json files in service subfolders.
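
For this to do anything, each service's package.json needs a matching script; for example (the knex command is purely illustrative):

"scripts": {
  "migrate-db": "knex migrate:latest"
}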

Test

node ./node_modules/@reason.co/pipelines-support/index.js test $BITBUCKET_COMMIT remotedev

Will run the NPM script test on any package.json files in service subfolders.

Other deploy

node ./node_modules/@reason.co/pipelines-support/index.js deploy $BITBUCKET_COMMIT remotedev

If there is a deploy script in the package.json, it will be run.

Note that this will also trigger the S3/Azure deploys if the relevant properties are populated in package.json.

Scripts run by this task are (in order): npm install -q, npm run build (if a "build" script exists), npm run deploy.
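
A service package.json opting into this flow might therefore include (script bodies are illustrative assumptions):

"scripts": {
  "build": "webpack --mode production",
  "deploy": "node scripts/deploy.js",
  "test": "jest"
}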
