
Integrating GitHub Actions Logs to Your Elasticsearch

Shahar Glazner | May 11, 2021
Read this useful guide from anecdotes on how to integrate GitHub Actions logs into your Elasticsearch

TL;DR

If you’re using GitHub, you’re probably familiar with GitHub Actions, a simple and clean CI/CD platform that helps you build automation around your code repositories. In this blog post, I’ll guide you through how to sync your GitHub Actions workflow logs into Elasticsearch.

Intro

Once you become a GitHub Actions user, you’ll probably want to add some monitoring and reporting to your CI/CD. You may want statistics on the average duration of a specific workflow, or on how many of your jobs fail over time.

The first step to achieve your goal is to stream your logs into central log storage. At anecdotes.ai, our logs are streamed to Elasticsearch, so it was intuitive to stream the logs produced by the CI/CD workflows to Elasticsearch too.


However, I was surprised to find out that there isn’t a built-in way to do it. You can manually download specific workflow logs and upload them yourself, but of course, that’s not something you want to do by hand every time.

So I took the opportunity to develop a GitHub Action that will do it for you. There were several obstacles in developing the action, but I’ll cover them in another more technical blog post.

In this blog post, I’ll focus on how to use the GitHub Action to easily stream your workflow logs to Elastic.


{{banner-image}}

Step #0: Python Demo App

To make this even easier to understand, let’s review some code. For the purposes of this post, I’ve created a python-demo repository with an example workflow, which just triggers the app entrypoint.

Notice that the repository is private, since we use sensitive information such as the Elasticsearch credentials (which, of course, are protected in GitHub Secrets).


main.py

def main():
    # echo output
    for i in range(100):
        print("XXX" * i)
        print(i)
        print("XXX" * i)


if __name__ == '__main__':
    main()

example.yml

name: Example workflow

on:
  workflow_dispatch

jobs:
  run-demo-app:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
        with:
          submodules: 'recursive'
          token: ${{ secrets.GITHUB }}
      - name: Set up Python 3.7.8
        uses: actions/setup-python@v2
        with:
          python-version: '3.7.8'
      - name: Run script that echo outputs
        run: python main.py
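Since the workflow is triggered by workflow_dispatch, you run it manually from the Actions tab. If you'd rather kick it off programmatically, here is a minimal sketch against the GitHub REST API (the token and branch name are placeholders, and it assumes the requests package is installed):

import requests

# Placeholder values; replace with your own PAT and default branch
TOKEN = "ghp_your_pat_here"
URL = ("https://api.github.com/repos/shahargl/python-demo"
       "/actions/workflows/example.yml/dispatches")

# workflow_dispatch events can also be fired through the REST API;
# GitHub answers 204 No Content when the run was queued
resp = requests.post(
    URL,
    headers={
        "Authorization": f"token {TOKEN}",
        "Accept": "application/vnd.github.v3+json",
    },
    json={"ref": "main"},
)
print(resp.status_code)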

Triggering the workflow will produce the following logs:

example workflow runs

And our final goal is to stream them to Elastic.

Step #1: Modifying the Workflow

The first (and only) thing you’ll need to change in your code is to add a new job that uploads your workflow logs to Elastic.

Add this job to each of your workflows. Note that needs makes it wait for the job whose logs you’re collecting, and if: always() ensures the logs are uploaded even when that job fails (which is exactly when you need them most):

upload-logs-to-elastic:
  runs-on: ubuntu-latest
  needs: run-demo-app
  if: always()
  steps:
    - name: Upload GitHub Action workflows logs to elastic
      uses: shahargl/upload-github-workflow-logs-to-elastic@1.0.13
      with:
        github_token: "${{ secrets.GITHUB }}"
        github_org: "shahargl"
        github_repository: "python-demo"
        github_run_id: "${{ github.run_id }}"
        elastic_host: "${{ secrets.ELASTIC_HOST }}"
        elastic_api_key_id: "${{ secrets.ELASTIC_KEY_ID }}"
        elastic_api_key: "${{ secrets.ELASTIC_API_KEY }}"
        elastic_index: "ci-cd"

Where:

  1. github_token: your GitHub PAT; I’ll explain how to get one and keep it in secrets (see Step #2).
  2. github_org: the org/user; change it to your org/user.
  3. github_repository: the GitHub repository; change it to your repo.
  4. github_run_id: the GitHub run id; nothing to change here.
  5. elastic_host: the Elastic host; change it to your Elastic host, e.g. https://some-elastic-host.com:9243.
  6. elastic_api_key_id + elastic_api_key: the Elastic API key id and API key; I’ll explain how to generate them and store them securely (see Step #3).
  7. elastic_index: your Elastic index prefix; you can keep “ci-cd” or change it to whatever you like. You don’t need to worry about creating the index, as the GitHub Action will create it for you (see the sketch right after this list).
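Once the upload job has run at least once, you can confirm the index exists from Kibana’s Dev Tools; a quick check, assuming you kept the “ci-cd” prefix:

GET /_cat/indices/ci-cd*?v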

Step #2: Generate a GitHub PAT

GitHub has pretty good documentation on how to generate a GitHub PAT. The permission you’ll need is “workflow”, so the Action will be able to read the workflow logs.


Then, copy the token and add it as a repository secret.
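If you want to verify that the token can actually read workflow logs before wiring it into the workflow, here is a minimal sketch (the token, repo, and run id are placeholders):

import requests

# Placeholder values; replace with your PAT, org/repo, and a real run id
TOKEN = "ghp_your_pat_here"
ORG, REPO, RUN_ID = "shahargl", "python-demo", 123456789

# Workflow run logs are served as a zip archive from this endpoint;
# a 200 response here means the PAT has permission to read them
resp = requests.get(
    f"https://api.github.com/repos/{ORG}/{REPO}/actions/runs/{RUN_ID}/logs",
    headers={"Authorization": f"token {TOKEN}"},
)
print(resp.status_code)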


Step #3: Generate an Elastic API Key

To get an Elastic API key, just log in to Kibana and go to “Dev Tools”:

Now create an API key:

POST /_security/api_key
{
  "name": "api-key-for-github",
  "expiration": "1d"
}

Notice that the key expires after one day; once you’re sure everything works for you, generate a new key with a longer expiration.

The response should be something like:

{
  "id" : "4NtdXXXXXXXXXXXXXX",
  "name" : "api-key-for-github",
  "expiration" : 1618909336568,
  "api_key" : "BAs-XXXXXXXXXXXXXXXXXXXXXXXXXXXX"
}
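Elasticsearch accepts this pair as an Authorization header of the form ApiKey base64(id:api_key). If you want to sanity-check the key before storing it, here is a minimal sketch (the host and key values are the placeholders from above):

import base64
import requests

# Placeholder values; use the "id" and "api_key" from the response above
KEY_ID = "4NtdXXXXXXXXXXXXXX"
API_KEY = "BAs-XXXXXXXXXXXXXXXXXXXXXXXXXXXX"
HOST = "https://some-elastic-host.com:9243"

# API key auth header: ApiKey base64("id:api_key")
token = base64.b64encode(f"{KEY_ID}:{API_KEY}".encode()).decode()
resp = requests.get(f"{HOST}/", headers={"Authorization": f"ApiKey {token}"})
print(resp.status_code, resp.json().get("cluster_name"))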

Now, store the values of “id” and “api_key” in the GitHub secrets store (as ELASTIC_KEY_ID and ELASTIC_API_KEY, to match the workflow above).

Your repository secrets should look like this:


Step #4: Run Your Workflow


Now for the interesting part: let’s run the workflow. Go to your repository’s Actions tab and run your workflow with the upload-logs-to-elastic job:


We can see that the demo app created the following logs:


Now let’s finally review them on Elastic!


Step #5: Review Your Logs on Elastic

The first thing we need to do (after logging into Kibana) is create an index pattern for the ci-cd index.

Go to Elastic -> Stack Management -> Kibana -> Index Patterns:


And then “Create index pattern”:


In the index pattern name, write the prefix you chose (“ci-cd”) and then click “Next”. For the “Time field”, choose “@timestamp”:


Finally, click “Create index pattern” and go again to Kibana to review the logs:
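If you prefer Dev Tools over the Discover view, you can also pull the most recent documents directly; a quick query, again assuming the “ci-cd” prefix:

GET /ci-cd*/_search
{
  "size": 5,
  "sort": [{ "@timestamp": "desc" }]
}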

And there you have it! If you've been following along, you should have successfully integrated GitHub Actions logs into your Elasticsearch. You can now easily and quickly stream the logs produced by your GitHub Actions workflows to Elasticsearch and use them for monitoring, alerting, and continuous improvement.


Shahar Glazner
Passionate about technology, sports, economy, chess and system architecture in a cloud-native world. A long-distance runner and Lead Developer at Anecdotes.
