Using GitHub Actions cache

I am using GitHub Actions to run tests for my Elixir repo. Here is part of my GitHub workflow .yml:

    steps:
      - uses: actions/checkout@v2
      - uses: erlef/setup-elixir@v1
        with:
          otp-version: '23.1.2'
          elixir-version: '1.11.2'

      - name: Restore dependencies cache
        uses: actions/cache@v2
        with:
          path: deps
          key: ${{ runner.os }}-mix-${{ hashFiles('mix.lock') }}-myapp
          restore-keys: ${{ runner.os }}-mix-myapp-

      - name: Restore build cache
        uses: actions/cache@v2
        with:
          path: _build
          key: ${{ runner.os }}-build-${{ hashFiles('mix.lock') }}-myapp
          restore-keys: ${{ runner.os }}-build-myapp

      - name: Get deps
        run: |
          mix local.hex --force
          mix local.rebar
          mix deps.get

      - name: Compile
        run: mix compile

      - name: Test
        run: mix test

I followed cache/examples.md at main · actions/cache · GitHub, but my compile time is still very slow when I open PRs. Did I do something wrong? Maybe I need more options for restore-keys? Compile time is sometimes 3 minutes. Thank you for any ideas!
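
One thing I was not sure about: my restore-keys value is not actually a prefix of my key (the hash sits in the middle of the key), so maybe the partial restore never matches anything. This is the kind of change I was wondering about, but it is only a guess on my part:

      - name: Restore dependencies cache
        uses: actions/cache@v2
        with:
          path: deps
          key: ${{ runner.os }}-mix-myapp-${{ hashFiles('mix.lock') }}
          restore-keys: ${{ runner.os }}-mix-myapp-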


:wave:

It's a bit tough to get GitHub Actions caching right because GitHub has some restrictive rules about which branches can use a cache: a workflow run can only restore caches that were created on its own branch, on the base branch of the PR, or on the default branch.

We use GitHub Actions caching pretty extensively across our repos, and we have something of a standard template now:

slipstream/ci.yml at 2148aaee152335e9c2a61b62da6767d50658f0de · NFIBrokerage/slipstream · GitHub has code very similar to your workflow:

    - name: Restore the deps cache
      uses: actions/cache@v1
      id: deps-cache
      with:
        path: deps
        key: ${{ runner.os }}-${{ env.ELIXIR_VERSION }}-${{ env.OTP_VERSION }}-${{ env.MIX_ENV }}-deps-mixlockhash-${{ hashFiles(format('{0}{1}', github.workspace, '/mix.lock')) }}
        restore-keys: |
          ${{ runner.os }}-${{ env.ELIXIR_VERSION }}-${{ env.OTP_VERSION }}-${{ env.MIX_ENV }}-deps-

    - name: Restore the _build cache
      uses: actions/cache@v1
      id: build-cache
      with:
        path: _build
        key: ${{ runner.os }}-${{ env.ELIXIR_VERSION }}-${{ env.OTP_VERSION }}-${{ env.MIX_ENV }}-build-mixlockhash-${{ hashFiles(format('{0}{1}', github.workspace, '/mix.lock')) }}
        restore-keys: |
          ${{ runner.os }}-${{ env.ELIXIR_VERSION }}-${{ env.OTP_VERSION }}-${{ env.MIX_ENV }}-build-

(This is the CI workflow; it mainly runs tests.)
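
Since those cache steps have ids, you can also use the cache-hit output to skip re-fetching and re-compiling dependencies when the primary key matches exactly. Roughly like this; this part is my sketch rather than a verbatim quote of the linked workflow:

    # Only fetch/compile deps when the deps cache missed its exact key
    - name: Install dependencies
      if: steps.deps-cache.outputs.cache-hit != 'true'
      run: |
        mix local.hex --force
        mix local.rebar --force
        mix deps.get

    - name: Compile dependencies
      if: steps.deps-cache.outputs.cache-hit != 'true'
      run: mix deps.compile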

And we have a second workflow that only runs on pushes to main/master and builds up the cache: slipstream/refresh-docs-cache.yml at 2148aaee152335e9c2a61b62da6767d50658f0de · NFIBrokerage/slipstream · GitHub. Runs on other branches (e.g. new branches and tags) can then use the caches created on the default branch.
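
The general shape of that cache-refresh workflow is roughly this; a trimmed-down sketch rather than the exact contents of the linked file, with placeholder names and versions:

    name: Refresh cache

    on:
      push:
        branches:
          - main

    jobs:
      refresh:
        runs-on: ubuntu-latest
        env:
          MIX_ENV: test
        steps:
          - uses: actions/checkout@v2
          - uses: erlef/setup-elixir@v1
            with:
              otp-version: '23.1.2'
              elixir-version: '1.11.2'
          # Keys here are simplified placeholders; in practice they should
          # mirror the CI workflow's keys so its restore-keys can find them
          - uses: actions/cache@v2
            with:
              path: deps
              key: ${{ runner.os }}-mix-${{ hashFiles('mix.lock') }}
          - uses: actions/cache@v2
            with:
              path: _build
              key: ${{ runner.os }}-build-${{ hashFiles('mix.lock') }}
          - run: mix local.hex --force
          - run: mix local.rebar --force
          - run: mix deps.get
          - run: mix compile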

Before adding dialyzer to this particular build, we were averaging around 30s/build!


Thank you! This is helpful. I also looked at the second workflow in your link. I think my workflow is OK, but it is still slow. With no cache, it takes 3 minutes. With the cache, it takes about 1 minute if there are only changes in the app. If there are other changes, it takes about 2 minutes. I do not see anything obvious that I could do better. Thank you for your input!


This does not seem to be working. When I watch the run on GitHub, I see the compile step running even though no code changed in my second PR. I thought maybe it was because the default branch was changed. I cannot find a cache hit.
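
One way I found to check whether the restore actually hits is to give the cache step an id and echo its cache-hit output. My original workflow does not set ids, so the steps below are an assumption on my part; also note that cache-hit only reports true on an exact primary-key match, not on a restore-keys match:

      - name: Restore dependencies cache
        uses: actions/cache@v2
        id: deps-cache
        with:
          path: deps
          key: ${{ runner.os }}-mix-${{ hashFiles('mix.lock') }}-myapp
          restore-keys: ${{ runner.os }}-mix-myapp-

      # Prints true only when the primary key above matched exactly
      - name: Show deps cache status
        run: echo "deps cache hit = ${{ steps.deps-cache.outputs.cache-hit }}"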

I came up with a workflow that uses S3 to cache deps/ and _build/ between branches:

    ###################################################################
    # Sample Github workflow to cache deps/ and _build/ in S3
    # between branches
    ###################################################################
    # S3 credentials needed (define these as Github secrets)
    # - AWS_ACCESS_KEY_ID
    # - AWS_SECRET_ACCESS_KEY
    # - BUILD_BUCKET
    #
    # If you have private Git repos, you need to add a private key to the repo
    # - HEADLESS_PRIV
    name: Test

    on:
      pull_request:
        branches:
          - develop
        paths-ignore:
          - 'docs/**'
          - '*.md'

    jobs:
      test:

        name: Tests

        runs-on: ubuntu-18.04

        env:
          MIX_ENV: test

        services:
          db:
            image: postgres:11
            ports: ['5432:5432']
            env:
              POSTGRES_PASSWORD: postgres
            options: >-
              --health-cmd pg_isready
              --health-interval 10s
              --health-timeout 5s
              --health-retries 5

        steps:
          - uses: actions/checkout@v2
          - uses: erlef/setup-elixir@v1
            with:
              otp-version: '23.1.2'
              elixir-version: '1.11.2'

          - name: Configure SSH for private Repos
            uses: webfactory/ssh-agent@v0.4.1
            with:
              ssh-private-key: ${{ secrets.HEADLESS_PRIV }}

          - name: Restore Dependencies from S3 (if present)
            id: restore-deps-cache
            run: '(aws s3 ls s3://${{ secrets.BUILD_BUCKET }}/some-dir/deps.tar && aws s3 cp s3://${{ secrets.BUILD_BUCKET }}/some-dir/deps.tar deps.tar) || echo deps.tar not found on S3, skipping step'
            env:
              AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
              AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
              AWS_REGION: 'us-east-1'

          - name: Restore Build from S3 (if present)
            id: restore-build-cache
            run: '(aws s3 ls s3://${{ secrets.BUILD_BUCKET }}/some-dir/_build.tar && aws s3 cp s3://${{ secrets.BUILD_BUCKET }}/some-dir/_build.tar _build.tar) || echo _build.tar not found on S3, skipping step'
            env:
              AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
              AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
              AWS_REGION: 'us-east-1'

          - name: 'Untar deps.tar (if present)'
            run: '(test -f deps.tar && tar -xvf deps.tar) || echo deps.tar not found, skipping step'

          - name: 'Untar _build.tar (if present)'
            run: '(test -f _build.tar && tar -xvf _build.tar) || echo _build.tar not found, skipping step'

          - name: Get dependencies
            run: |
              mix local.hex --force
              mix local.rebar
              mix deps.get

          - name: Compile
            run: mix compile

          - name: Create database
            run: mix ecto.create

          - name: Migrate database
            run: mix ecto.migrate

          - name: Test
            run: mix test

          - name: 'Tar deps'
            run: sudo tar -cvf deps.tar deps

          - name: 'Tar _build'
            run: sudo tar -cvf _build.tar _build

          - name: Cache Dependencies to S3
            id: deps-cache
            run: '(test -f deps.tar && aws s3 cp deps.tar s3://${{ secrets.BUILD_BUCKET }}/some-dir/deps.tar) || echo deps.tar not found locally, skipping step'
            env:
              AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
              AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
              AWS_REGION: 'us-east-1'

          - name: Cache Build to S3
            id: build-cache
            run: '(test -f _build.tar && aws s3 cp _build.tar s3://${{ secrets.BUILD_BUCKET }}/some-dir/_build.tar) || echo _build.tar not found locally, skipping step'
            env:
              AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
              AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
              AWS_REGION: 'us-east-1'


I think this could be improved by using aws s3 sync instead of aws s3 cp, syncing the deps/ and _build/ directories directly rather than tarring them.
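
Something along these lines could replace the restore, tar, and upload steps; this is an untested sketch using the same bucket layout as above, but syncing the directories instead of tar files:

      - name: Restore deps and _build from S3
        run: |
          aws s3 sync s3://${{ secrets.BUILD_BUCKET }}/some-dir/deps deps
          aws s3 sync s3://${{ secrets.BUILD_BUCKET }}/some-dir/_build _build
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_REGION: 'us-east-1'

      # After the tests, push only new or changed files back up
      - name: Sync deps and _build back to S3
        run: |
          aws s3 sync deps s3://${{ secrets.BUILD_BUCKET }}/some-dir/deps
          aws s3 sync _build s3://${{ secrets.BUILD_BUCKET }}/some-dir/_build
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_REGION: 'us-east-1'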