diegoscara•2h ago
In my OSS project OpenETL I built a Garmin data pipeline in Airflow that automates the daily extraction, processing, and loading of data from the Garmin Connect API into a well-documented PostgreSQL database. The pipeline uses python-garminconnect for extraction, which in turn relies on garth for authentication.
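For anyone curious what the moving parts look like, here is a rough sketch of that kind of daily DAG. It is not the actual OpenETL code: the connection id, table name, and credential handling below are placeholders, and the real pipeline does more processing than this.

    # Minimal daily Garmin -> PostgreSQL DAG sketch (placeholder names, not OpenETL itself)
    import json

    import pendulum
    from airflow.decorators import dag, task
    from airflow.providers.postgres.hooks.postgres import PostgresHook
    from garminconnect import Garmin


    @dag(schedule="@daily", start_date=pendulum.datetime(2024, 1, 1), catchup=False)
    def garmin_daily_stats():
        @task
        def extract(ds=None):
            # python-garminconnect wraps the Garmin Connect API; garth handles the auth flow.
            client = Garmin("me@example.com", "password")  # use a secrets backend in practice
            client.login()
            return client.get_stats(ds)  # daily summary for the DAG's logical date (YYYY-MM-DD)

        @task
        def load(stats: dict, ds=None):
            # Upsert the raw daily summary; assumes a table with (day date primary key, payload jsonb).
            hook = PostgresHook(postgres_conn_id="garmin_dwh")  # placeholder connection id
            hook.run(
                "INSERT INTO garmin_daily_stats (day, payload) VALUES (%s, %s) "
                "ON CONFLICT (day) DO UPDATE SET payload = EXCLUDED.payload;",
                parameters=(ds, json.dumps(stats)),
            )

        load(extract())


    garmin_daily_stats()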
The result is a set of well-structured tables in a SQL database that you can use for analytics in standard ways (dashboards, Jupyter, etc.) and in newer ways, such as with the @psql tool in GitHub Copilot Chat's agent mode in your IDE, for a natural-language analytics experience.
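As a hedged example of the "standard ways" side, assuming a table like the one sketched above (the table and JSON key names are assumptions, not OpenETL's actual schema), a Jupyter session might look like:

    # Chart the last 30 days of step totals from the warehouse (placeholder names)
    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("postgresql+psycopg2://user:pass@localhost:5432/garmin")

    df = pd.read_sql(
        """
        SELECT day, (payload->>'totalSteps')::int AS total_steps
        FROM garmin_daily_stats
        ORDER BY day DESC
        LIMIT 30
        """,
        engine,
    )
    df.sort_values("day").plot(x="day", y="total_steps")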
Check it out and let me know what you think!