API Programmability - Part 3: Webhooks



In API Programmability - Part 2 we showed you the basics of how to use IP Fabric’s Python SDK.  In this post, we will create a web server using FastAPI to receive webhooks from IP Fabric. After certain events occur, we will use python-ipfabric to extract and manipulate data to enhance your automation efforts.

Find today's code example at ipfabric-webhook-listener.

Real-Life Use Case

Today we will import IP Fabric data into a PostgreSQL database after a Snapshot Discovery completes. This is beneficial for visualizing trends in important information such as the number of devices, End of Life migrations, or Intent Verifications. Because IP Fabric keeps at most five loaded snapshots, it is difficult to look back at last week's KPIs, let alone data from six months ago. Long-term historical analysis can be accomplished by extracting a subset of the data using the API, transforming it into a Python data model, and loading it into a PostgreSQL database. Connecting this database to a visualization tool like Grafana or Tableau will allow your teams to create interactive dashboards.
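The extract-transform-load flow described above can be sketched as below. The KPI columns, the `snapshot_kpis` table name, and the device fields are illustrative assumptions, not the project's actual schema.

```python
from datetime import datetime, timezone
from typing import Dict, List


def transform(devices: List[dict], snapshot_id: str) -> Dict[str, object]:
    """Reduce a raw device inventory into one row of KPI data."""
    return {
        "snapshot_id": snapshot_id,
        "loaded_at": datetime.now(timezone.utc).isoformat(),
        "device_count": len(devices),
        "vendor_count": len({d.get("vendor") for d in devices}),
    }


def load_statement(row: Dict[str, object]) -> str:
    """Build a parameterized INSERT for PostgreSQL; a driver such as
    psycopg2 would receive `row` as bound parameters alongside it."""
    cols = ", ".join(row)
    params = ", ".join(f"%({c})s" for c in row)
    return f"INSERT INTO snapshot_kpis ({cols}) VALUES ({params})"
```

Keeping the transform step a pure function makes it easy to unit-test before any database is involved.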

This example takes all the Intent Rules in an Intent Group and adds severities together to summarize the entire group.

It is also possible to graph the individual intent rules for further analysis.
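The group roll-up can be sketched as a per-severity sum over the group's rules. The input shape below is an assumption; the 0/10/20/30 keys follow IP Fabric's green/blue/amber/red severity convention.

```python
from collections import Counter
from typing import List


def summarize_group(rules: List[dict]) -> Counter:
    """Add the per-severity check counts of every intent rule in a
    group into one total for the whole group."""
    totals: Counter = Counter()
    for rule in rules:
        # "checks" maps severity level -> number of matching checks
        # (hypothetical field name for illustration)
        totals.update(rule.get("checks", {}))
    return totals
```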


Here are some basic requirements for working on this project. Please note that this is a base example for developmental purposes and extra caution should be taken into account prior to running this project in a production environment (enabling HTTPS, healthchecks, etc).

  • Desktop or Server accessible by the IP Fabric server to run the application. (For instance, if you are running this script on a desktop connected remotely through a VPN, the IP Fabric server may not be able to route to your IP.)
  • Python (version 3.7+)
  • git (optional)
  • Poetry installed globally
    • python3 -m pip install -U pip poetry
  • Branch-specific requirements, described in each branch's README:
    • postgres takes key details about your inventory and intent rules and inserts them into a PostgreSQL database for historical trending (the branch we will be using today).
    • notify creates a listener which will post to a Slack or Teams channel about configurable events and their statuses.
    • pdf-report creates a PDF report after Intent Verifications are calculated and sends an email.
    • tableau also extracts key data, creates a Tableau file, and uploads it to a Tableau server to display the most recent snapshot data.


If you are interested in merging two branches and are new to version control, please take a look at Git Branching and Merging: A Step-By-Step Guide. If you need further assistance, please reach out via email or open a GitHub issue on the repository.


The easiest way to download the project is to use git for cloning and switching branches.

  • Cloning
    • SSH: git clone [email protected]:community-fabric/ipfabric-webhook-listener.git
    • HTTPS: git clone https://github.com/community-fabric/ipfabric-webhook-listener.git
  • Change to postgres branch:
    • git checkout postgres

Another option is to go to GitHub, change the branch to postgres in the top left, and download the zip file. This method will not allow you to easily merge two integrations from separate branches.


Installing the Python-specific requirements for this project is a simple command. Please take a look at the branch-specific README files (linked above) for the exact options to use to ensure all packages are included. The postgres branch requires extra packages selected with -E postgres. Using the extras option ensures that you do not download unnecessary packages meant for other branches.

poetry install -E postgres

If you have decided to merge two branch integrations into a single project, please note that running poetry install -E postgres and then poetry install -E notify will remove the postgres requirements and install only the notify requirements. Instead, install both extras in a single command: poetry install -E postgres -E notify. If you are planning to use Docker, please ensure the merge correctly adds both options in the Dockerfile.

IP Fabric Setup

To create a new Webhook navigate to Settings > Webhooks in IP Fabric and select Add Webhook:

Here you will enter a name and your URL, and select the events to listen for. The postgres branch requires both Snapshot and Intent verification events to load all the required data. Prior to saving, please copy the Secret key, as it will be used in the configuration. If you forget this step, the secret can still be viewed after saving, unlike API tokens.

Environment Setup

In your project copy sample.env file to .env and enter your environment's specific data.

  • IPF_SECRET is the secret key copied above; this validates that the message came from your IP Fabric instance and not another server.
  • IPF_URL must be in this format https://demo3.ipfabric.io/
  • IPF_TOKEN is created in Settings > API Token
  • Set IPF_VERIFY to false if your IP Fabric certificate is not trusted.
  • Set IPF_TEST to true for initial testing and then change to false.
  • Finally, this requires PostgreSQL specific variables for connections which can be found in the README.
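A minimal sketch of how these variables might be read at startup. The real project's settings loader may differ; the defaults and normalization here are assumptions.

```python
import os
from typing import Dict


def load_settings(env: Dict[str, str] = os.environ) -> dict:
    """Read the IP Fabric settings from environment variables."""

    def as_bool(name: str, default: str) -> bool:
        return env.get(name, default).strip().lower() == "true"

    return {
        "secret": env["IPF_SECRET"],                 # validates webhook origin
        "url": env["IPF_URL"].rstrip("/") + "/",     # normalize trailing slash
        "token": env["IPF_TOKEN"],                   # API token for the SDK
        "verify": as_bool("IPF_VERIFY", "true"),     # TLS certificate check
        "test": as_bool("IPF_TEST", "false"),        # accept test webhooks
    }
```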

Running the Server

The webhook listener can be run through Poetry or Docker. Either way, the application communicates with the PostgreSQL database prior to starting the web server to ensure that the schema is installed and the tables are set up.

  • Poetry
    • poetry run api
  • Docker
    • Requires Docker to be installed and running
    • docker-compose up
C:\Code\ipfabric-webhook-listener>poetry run api                         
INFO:     Started server process [12740]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)

This output provides the information we need for the Webhook's Settings URL. The 0.0.0.0 signifies it is listening on all IP addresses of your system (run ipconfig or ip addr to get the correct IP to use instead). The server is also configured to run on port 8000, so the URL to enter in IP Fabric will look like http://<your IP address>:8000.


When the IPF_TEST variable is set to true the server will process a test message as a normal webhook and verify it is working. Select the lightning bolt icon in the Webhooks settings and then choose which rule to send.

When a test webhook event is sent, the postgres branch performs the automation against the $last snapshot (make sure to test both Snapshot - discover and Intent verification - calculate to load all the data for that snapshot). Each test run creates a random snapshot ID that does not conflict with others in the system.

Once the test is successful it is advisable to set IPF_TEST back to false and restart the server. If you try to run the test again it will fail because the unique snapshot_id has already been inserted into the database to prevent duplicate entries.
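The duplicate-entry guard described above can be sketched with a unique key on snapshot_id plus an idempotent insert. SQLite stands in for PostgreSQL here so the example is self-contained; the table and column names are illustrative.

```python
import sqlite3


def insert_snapshot(conn: sqlite3.Connection, snapshot_id: str, devices: int) -> None:
    """Insert a snapshot KPI row, silently skipping duplicates."""
    # ON CONFLICT ... DO NOTHING makes re-runs idempotent: a
    # snapshot_id that already exists in the table is simply skipped,
    # which is why re-sending the same test webhook adds no new row
    conn.execute(
        "INSERT INTO snapshot_kpis (snapshot_id, device_count) "
        "VALUES (?, ?) ON CONFLICT (snapshot_id) DO NOTHING",
        (snapshot_id, devices),
    )
    conn.commit()


conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE snapshot_kpis (snapshot_id TEXT PRIMARY KEY, device_count INTEGER)"
)
```

The same ON CONFLICT syntax works in PostgreSQL, so the guard carries over unchanged.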

This branch will also only process snapshot events that were run through the scheduler (user:cron). If a user manually creates a new snapshot or updates an existing one, the webhook messages will be received but ignored.
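The scheduler-only filter amounts to a single check on the event payload. The `requester` field name below is an assumption based on the user:cron behaviour described above, not a documented schema.

```python
def should_process(payload: dict) -> bool:
    """Accept only snapshot events started by the scheduler (cron)."""
    # Hypothetical field name; adjust to the actual webhook payload
    return payload.get("requester", "").endswith("cron")
```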


Using IP Fabric Webhooks will take your team further on its Network Automation journey and provide the ability to integrate with any external system you can imagine. Today we focused on importing the data into an external database, but this can be extended to import into a Configuration Management Database (CMDB), Network Management System (NMS), or Monitoring Tools to ensure that these critical infrastructure components have full visibility of your network.

If you have found this article helpful, please follow our company’s LinkedIn or check out our other blog posts. If you would like to test our solution to see for yourself how IP Fabric can help you manage your network more effectively, please contact us through www.ipfabric.io.

Further Reading

Grafana Examples

The Grafana JSON models are located in the Grafana directory on GitHub. You will need to configure your Grafana instance to connect to your PostgreSQL database and find the generated UID for that connection. Then in the JSON files replace all instances of <REPLACE WITH UID OF YOUR CONNECTED POSTGRES DB> with the correct value. Finally, you should be able to import the new dashboards.
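The placeholder replacement can be scripted rather than done by hand. This small helper patches every dashboard JSON file in a directory; the placeholder string comes from the article, while the function and path handling are just one way to do it.

```python
from pathlib import Path

PLACEHOLDER = "<REPLACE WITH UID OF YOUR CONNECTED POSTGRES DB>"


def patch_dashboards(directory: Path, uid: str) -> int:
    """Rewrite every .json dashboard in place, substituting the
    PostgreSQL datasource UID; return the number of files changed."""
    changed = 0
    for path in directory.glob("*.json"):
        text = path.read_text()
        if PLACEHOLDER in text:
            path.write_text(text.replace(PLACEHOLDER, uid))
            changed += 1
    return changed
```

For example, `patch_dashboards(Path("Grafana"), "my-uid")` would prepare all dashboards in the repository's Grafana directory for import.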


If you are interested in extracting more data than the example provides (perhaps the BGP Summary table) this can be accomplished by adding on to the existing Python code. If you need assistance with this or have an idea for a new integration please open a GitHub issue.
