Automated network configuration backups using GitHub Actions and git-scraping

Overview

Network Config Scraper

This repository demonstrates the use of GitHub Actions and git-scraping to build an automated backup solution for network configuration files. Git already provides an efficient way to track and manage changes to textual data, and GitHub Actions provides the automation we need to fetch and process configuration backups without relying on any additional infrastructure. The solution in this repository uses both to retrieve configurations from network devices on a defined schedule and commit any detected changes back to the repository.

This approach is heavily inspired by "Git scraping: track changes over time by scraping to a Git repository" by Simon Willison. His post provides an excellent overview of git-scraping data from various sources on the Internet.


Could I use this?

Because of the textual nature of network configurations, git-scraping offers a simple yet effective way to back up and version configuration data. It runs entirely on GitHub Actions, so there's no complex infrastructure or orchestration to manage.

Because we retrieve data from devices in a private lab, the repository uses a self-hosted runner that connects out to GitHub and has access to the lab. You'd have to decide whether this is an acceptable model for your environment.

That said, the solution works wonderfully and can be extended to fit into your existing automation. For example, GitHub webhooks let you integrate external automation systems by subscribing them to specific events on the repository; each configured event triggers an HTTP POST to a system like Ansible Tower, which can then kick off additional automation.
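
To make that concrete, here is a minimal sketch of such an integration: a small receiver that listens for GitHub push events and launches an Ansible Tower/AWX job template in response. It is not part of this repository, and the URL, job template ID, and credentials are hypothetical placeholders you would replace with your own.

import os

import requests
from flask import Flask, request

app = Flask(__name__)

# Hypothetical Tower/AWX details -- replace with values for your environment.
TOWER_URL = os.environ.get("TOWER_URL", "https://tower.example.com")
TEMPLATE_ID = os.environ.get("TOWER_TEMPLATE_ID", "42")
TOWER_AUTH = (os.environ["TOWER_USER"], os.environ["TOWER_PASSWORD"])


@app.route("/github-webhook", methods=["POST"])
def handle_push():
    """Launch a Tower job template when a push to main is reported."""
    event = request.headers.get("X-GitHub-Event", "")
    payload = request.get_json(silent=True) or {}

    if event == "push" and payload.get("ref") == "refs/heads/main":
        requests.post(
            f"{TOWER_URL}/api/v2/job_templates/{TEMPLATE_ID}/launch/",
            auth=TOWER_AUTH,
            json={"extra_vars": {"commit": payload.get("after", "")}},
            timeout=30,
        )
    return "", 204


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)

You would then point a repository webhook at this receiver and subscribe it to push events.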

How does it work?

The configuration scraper is configured and scheduled in the .github/workflows/scrape.yml action workflow. It's a short and simple workflow that defines all the triggers and steps to run our automation.

trigger

The workflow can be triggered in three different ways: on a push event to the repo, manually via workflow_dispatch, or, most importantly, on a cron schedule. In the snippet below, you can see that the workflow is triggered every 30 minutes.

on:
  push:
    branches:
      - main
  workflow_dispatch:
  schedule:
    - cron:  '*/30 * * * *'

self-hosted runner(s)

Because our use case is slightly different from fetching data from flat files, we need to account for how GitHub will reach the devices we are fetching data from. For example, the devices in inventory.yml are in a lab that sits behind a firewall, so a public GitHub-hosted runner would not have the required access. For that reason, the repo is configured to use a self-hosted GitHub Actions runner that has access to the lab environment. It took about 15 minutes to provision and configure a runner in this environment, so it's a relatively easy and painless process. Once configured, the self-hosted runner opens a connection to GitHub and listens for job requests to execute your actions.

Within the action workflow, the only things to specify are that we're using a self-hosted runner and the tags that identify which runner should pick up the job. The runs-on directive in the action does just that, as shown below.

jobs:
  scheduled:
    runs-on: [self-hosted, atc-runners]

data

For data gathering, the retrieve_configs.py script retrieves and saves the configuration from each of the devices listed in the inventory.yml file. The script uses the async SSH transport from the scrapli library to run sessions to the devices in parallel. Once the files are saved, the action workflow uses git to stage any files in which changes have been detected, then creates a commit with a timestamp and pushes the changes back to the repository.

  - name: Fetch latest configs
    run: |-
      python retrieve_configs.py
    env:
      SSH_AUTH_USERNAME: ${{ secrets.SSH_AUTH_USERNAME }}
      SSH_AUTH_PASSWORD: ${{ secrets.SSH_AUTH_PASSWORD }}

  - name: Commit and push if it changed
    run: |-
      git config user.name "Automated"
      git config user.email "[email protected]"
      git add -A
      timestamp=$(date -u)
      git commit -m "Latest data: ${timestamp}" || exit 0
      git push
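
For context, the heart of such a retrieval script might look roughly like the sketch below. This is a minimal illustration, not the repository's actual retrieve_configs.py: it assumes an inventory.yml that maps device names to a host address and a scrapli platform string, and it writes each running configuration into a configs/ directory. It does reuse the SSH_AUTH_USERNAME and SSH_AUTH_PASSWORD environment variables that the workflow passes in.

import asyncio
import os
from pathlib import Path

import yaml
from scrapli import AsyncScrapli

# Assumed (hypothetical) inventory.yml layout:
#   devices:
#     sw01: {host: 10.0.0.11, platform: cisco_iosxe}
#     sw02: {host: 10.0.0.12, platform: cisco_iosxe}


async def fetch_config(name: str, device: dict) -> None:
    """Pull the running config from one device and save it to configs/<name>.cfg."""
    conn = AsyncScrapli(
        host=device["host"],
        platform=device["platform"],
        transport="asyncssh",
        auth_username=os.environ["SSH_AUTH_USERNAME"],
        auth_password=os.environ["SSH_AUTH_PASSWORD"],
        auth_strict_key=False,
    )
    async with conn:
        response = await conn.send_command("show running-config")
    Path("configs").mkdir(exist_ok=True)
    Path("configs", f"{name}.cfg").write_text(response.result)


async def main() -> None:
    inventory = yaml.safe_load(Path("inventory.yml").read_text())
    # One coroutine per device; scrapli's asyncssh transport lets them run concurrently.
    await asyncio.gather(
        *(fetch_config(name, dev) for name, dev in inventory["devices"].items())
    )


if __name__ == "__main__":
    asyncio.run(main())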

How do I track changes?

The neat thing about using Git to manage your configuration backups is that the commit log gives you a complete history of the changes made to your configs.

Author(s)

World Wide Technology, Inc. (WWT)