In API Programmability - Part 2 we showed you the basics of how to use IP Fabric’s Python SDK. In this post, we will create a web server using FastAPI to receive webhooks from IP Fabric. After certain events occur, we will use python-ipfabric to extract and manipulate data to enhance your automation efforts.
Find today's code example at ipfabric-webhook-listener.
Today we will be importing IP Fabric data into a PostgreSQL database after a Snapshot Discovery is completed. This is useful for visualizing trends in important information such as the number of devices, End of Life migrations, or Intent Verifications. Because IP Fabric keeps at most five loaded snapshots, it is difficult to see last week's KPIs, let alone those from six months ago. Long-term historical analysis can be accomplished by extracting a subset of the data via the API, transforming it into a Python data model, and loading it into a PostgreSQL database. Connecting this database to a visualization tool like Grafana or Tableau allows your teams to create interactive dashboards.
This example takes all the Intent Rules in an Intent Group and adds severities together to summarize the entire group.
It is also possible to graph the individual intent rules for further analysis.
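The pipeline described above (extract via the SDK, transform into a summary, load into PostgreSQL) can be sketched in a few lines of Python. The severity colors follow IP Fabric's green/blue/amber/red scheme, but the target table and its columns are illustrative assumptions, not the project's actual schema:

```python
def summarize_intent_group(rules):
    """Sum the severity counters of all intent rules in one group."""
    totals = {"green": 0, "blue": 0, "amber": 0, "red": 0}
    for rule in rules:
        for color in totals:
            totals[color] += rule.get(color, 0)
    return totals

# Extraction requires a live IP Fabric instance, so it is shown here
# only as a comment for context:
# from ipfabric import IPFClient
# ipf = IPFClient("https://ipfabric.example.com", token="...")
# devices = ipf.inventory.devices.all()

def insert_snapshot_summary(cursor, snapshot_id, device_count, totals):
    """Load one summary row via a DB-API cursor (e.g. psycopg2).

    The snapshot_summary table and its columns are assumptions for
    illustration; the real schema lives in the project repository.
    """
    cursor.execute(
        "INSERT INTO snapshot_summary "
        "(snapshot_id, device_count, green, blue, amber, red) "
        "VALUES (%s, %s, %s, %s, %s, %s)",
        (snapshot_id, device_count, totals["green"], totals["blue"],
         totals["amber"], totals["red"]),
    )
```

Summing per-rule counters first keeps the database writes to one row per group per snapshot, which is all a trending dashboard needs.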
Here are some basic requirements for working on this project. Please note that this is a base example for development purposes, and extra caution should be taken before running this project in a production environment (enabling HTTPS, health checks, etc.).
python3 -m pip install -U pip
The project is structured in branches so users can easily combine multiple integrations into one server instance by merging two or more branches into a single branch. For compatibility between branches, this project uses Poetry, which ensures that all the Python dependencies can be installed without conflict. Today we will be focusing on the `postgres` branch, which takes the snapshot data and inserts inventory and intent data into a PostgreSQL database for long-term historical trending. This requires access to either a local or remote PostgreSQL database, as well as the other requirements listed above.
If you are interested in merging two branches and are new to version control, please take a look at Git Branching and Merging: A Step-By-Step Guide. If you need further assistance, please reach out via email or open a GitHub issue on the repository.
The easiest way to download the project is to use git for cloning and switching branches:

```shell
# clone with SSH
git clone git@github.com:community-fabric/ipfabric-webhook-listener.git
# or clone with HTTPS
git clone https://github.com/community-fabric/ipfabric-webhook-listener.git

cd ipfabric-webhook-listener
git checkout postgres
```
Another option is to go to GitHub, change the branch to `postgres` in the top left, and download the zip file. This method will not allow you to easily merge two integrations from separate branches.
Installing the Python-specific requirements for this project is a simple command. Please check the branch-specific README files (linked above) for the exact options to use to ensure all packages are included. The `postgres` branch requires extra packages, signified by `-E postgres`. Using the extras option ensures that you do not have to download unnecessary packages needed only by other branches.
poetry install -E postgres
If you have decided to merge two branch integrations into a single project, please note that running `poetry install -E postgres` and then `poetry install -E notify` will remove the `postgres` requirements and install only the `notify` requirements. The correct way to install both is `poetry install -E postgres -E notify`. If you are planning to use Docker, please ensure the merge correctly adds both options in the Dockerfile.
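For reference, these optional groups are declared in the project's `pyproject.toml` through Poetry's `extras` mechanism. A hedged sketch of the pattern (the actual package lists and versions in the repository will differ):

```toml
[tool.poetry.dependencies]
# core dependencies are always installed
fastapi = "*"
# branch-specific packages are marked optional...
psycopg2-binary = { version = "*", optional = true }

[tool.poetry.extras]
# ...and grouped into named extras selected with `poetry install -E <name>`
postgres = ["psycopg2-binary"]
```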
To create a new webhook, navigate to Settings > Webhooks in IP Fabric and select Add Webhook. Here you will set a name and your URL, and select the events to listen for. The `postgres` branch requires both Snapshot and Intent verification events to load all the required data. Before saving, please copy the Secret key, as it will be used in the configuration. (If you forget this step, the key can still be viewed after saving, unlike API tokens.)
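To illustrate why the secret matters: webhook receivers commonly recompute an HMAC digest of the raw request body with the shared secret and compare it to a signature sent alongside the message. The scheme below is a generic sketch, not necessarily IP Fabric's exact mechanism; check the listener's code for the real validation logic.

```python
import hashlib
import hmac

def signature_is_valid(secret: str, body: bytes, received_sig: str) -> bool:
    """Generic HMAC-SHA256 webhook check; the exact scheme IP Fabric
    uses may differ from this sketch."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(expected, received_sig)
```

Only the holder of the secret can produce a matching digest, so a forged request from another server fails the check.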
In your project, copy the `sample.env` file to `.env` and enter your environment-specific data:

- `IPF_SECRET` is the secret key copied above; this validates that the message came from your IP Fabric instance and not another server.
- `IPF_URL` must be the base URL of your IP Fabric instance.
- `IPF_TOKEN` is created in Settings > API Token.
- Set the certificate-verification variable to `false` if your IP Fabric certificate is not trusted.
- `IPF_TEST` should be `true` for initial testing and then changed to `false` (see below).
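A sketch of what the resulting `.env` might contain. The values are placeholders, and any variable name not confirmed by `sample.env` itself should be treated as an assumption:

```shell
IPF_URL=https://ipfabric.example.com
IPF_TOKEN=<token-from-settings-api-token>
IPF_SECRET=<secret-key-copied-from-the-webhook>
# process a test webhook message on receipt; set back to false afterwards
IPF_TEST=true
# assumption: name of the certificate-verification toggle, confirm in sample.env
IPF_VERIFY=false
```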
The webhook listener can be run through Poetry or Docker. Before the webserver starts, it must be able to communicate with the PostgreSQL database to ensure that the schema is installed and the tables are set up.
poetry run api
```
C:\Code\ipfabric-webhook-listener>poetry run api
INFO:     Started server process
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
```
This output provides the information we need for the Webhook's URL setting. The `0.0.0.0` signifies that the server is listening on all IP addresses of your system (run `ip addr` to find the correct IP to use in its place). It is also configured to run on port `8000`, so the URL to enter in IP Fabric will be your server's IP and port `8000`, followed by the listener's endpoint path.
If the `IPF_TEST` variable is set to `true`, the server will process a test message as a normal webhook and verify that it is working. Select the lightning-bolt icon in the Webhooks settings and then choose which rule to send. The `postgres` branch will run the automation against the `$last` snapshot when a test webhook event is sent (make sure to test both Snapshot - discover and Intent verification - calculate to load all the data for that snapshot). When a test webhook runs, it creates a random snapshot ID that does not conflict with others in the system.
Once the test is successful, it is advisable to set `IPF_TEST` back to `false` and restart the server. If you run the test again it will fail, because the unique `snapshot_id` has already been inserted into the database; this prevents duplicate entries.
This branch will also only process snapshot events that were run through the scheduler (`user:cron`). If a user manually creates a new snapshot or updates an existing one, the webhook messages will be received but ignored.
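That filter amounts to a small check on the incoming event before any processing happens. A minimal sketch, where the `type` and `requester` field names are assumptions based on the `user:cron` value mentioned above, not a documented payload schema:

```python
def should_process(event: dict) -> bool:
    """Process only scheduled snapshot discoveries, not manual ones.

    The 'type' and 'requester' keys are illustrative assumptions; the
    real payload layout is defined by IP Fabric's webhook messages.
    """
    return (
        event.get("type") == "snapshot"
        and event.get("requester") == "user:cron"
    )
```

Returning early for manual snapshots keeps the database limited to the regular scheduled samples that trending dashboards expect.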
Using IP Fabric Webhooks will advance your team's Network Automation journey and provide the ability to integrate with any external system you can imagine. Today we focused on importing the data into an external database, but this can be extended to import into a Configuration Management Database (CMDB), Network Management System (NMS), or Monitoring Tools to ensure that these critical infrastructure components have full visibility of your network.
If you have found this article helpful, please follow our company’s LinkedIn or check out our other blog posts. If you would like to test our solution to see for yourself how IP Fabric can help you manage your network more effectively, please contact us through www.ipfabric.io.
The Grafana JSON models are located in the Grafana directory on GitHub. You will need to configure your Grafana instance to connect to your PostgreSQL database and find the generated UID for that connection. Then, in the JSON files, replace all instances of `<REPLACE WITH UID OF YOUR CONNECTED POSTGRES DB>` with the correct value. Finally, you should be able to import the new dashboards.
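If you have several dashboard files, the substitution can be scripted rather than done by hand. A small sketch, assuming the JSON models sit in a local `grafana/` directory (the directory name is an assumption; the placeholder string is the one quoted above):

```python
from pathlib import Path

# Placeholder string used in the dashboard JSON files
PLACEHOLDER = "<REPLACE WITH UID OF YOUR CONNECTED POSTGRES DB>"

def set_datasource_uid(text: str, uid: str) -> str:
    """Replace every placeholder occurrence with the real datasource UID."""
    return text.replace(PLACEHOLDER, uid)

def patch_dashboards(directory: str, uid: str) -> None:
    """Rewrite every dashboard JSON file in the given directory in place."""
    for path in Path(directory).glob("*.json"):
        path.write_text(set_datasource_uid(path.read_text(), uid))
```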
If you are interested in extracting more data than the example provides (perhaps the BGP Summary table), this can be accomplished by extending the existing Python code. If you need assistance with this, or have an idea for a new integration, please open a GitHub issue.
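As a sketch of such an extension, a BGP import could reduce the neighbor table to per-state counts before loading it. The summarizer below is plain Python; the SDK accessor shown in the comment is an assumption from memory, so verify the exact table path against the python-ipfabric documentation:

```python
from collections import Counter

def bgp_state_summary(neighbors):
    """Count BGP neighbors per session state for trending.

    Each neighbor is a row dict as returned by the SDK's table methods;
    the 'state' column name is an assumption to verify against the docs.
    """
    return Counter(n.get("state", "unknown") for n in neighbors)

# Extraction sketch (requires a live instance; the accessor path is an
# assumption, check the python-ipfabric documentation):
# from ipfabric import IPFClient
# ipf = IPFClient()
# summary = bgp_state_summary(ipf.technology.routing.bgp_neighbors.all())
```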