Compare commits

...

87 Commits

Author SHA1 Message Date
Sebastian Goscik
4b4cb86749 Bump version: 0.12.0 → 0.13.0 2025-04-09 13:01:45 +01:00
Sebastian Goscik
c091fa4f92 changelog 2025-04-09 13:01:45 +01:00
Sebastian Goscik
2bf90b6763 Update readme with parallel downloads 2025-04-09 11:27:45 +01:00
Sebastian Goscik
f275443a7a Fix issue with duplicated logging with parallel loggers 2025-04-09 11:25:34 +01:00
Sebastian Goscik
3a43c1b670 Enable multiple parallel uploaders 2025-04-09 11:25:34 +01:00
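A minimal asyncio sketch of the pattern this commit enables: several uploader tasks draining one shared queue. The names and queue contents are illustrative, not the tool's actual classes:

```
import asyncio

async def uploader(name: str, queue: asyncio.Queue) -> None:
    # Each uploader task pulls finished downloads off the shared queue.
    while True:
        video = await queue.get()
        print(f"{name} uploading {video}")
        await asyncio.sleep(0.1)  # stand-in for the rclone upload
        queue.task_done()

async def main(parallel_uploads: int = 3) -> None:
    queue: asyncio.Queue = asyncio.Queue()
    for i in range(9):
        queue.put_nowait(f"event_{i}.mp4")
    workers = [
        asyncio.create_task(uploader(f"uploader-{n}", queue))
        for n in range(parallel_uploads)
    ]
    await queue.join()  # wait until every queued video is handled
    for w in workers:
        w.cancel()

asyncio.run(main())
```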
Sebastian Goscik
e0421c1dd1 Add all smart detection types 2025-04-09 02:37:10 +01:00
Sebastian Goscik
4ee70e6d4b Updating dev dependencies 2025-04-09 02:25:10 +01:00
Sebastian Goscik
ce2993624f Correct CAMERAS envvar 2025-04-09 02:12:52 +01:00
Sebastian Goscik
cec1f69d8d Bump uiprotect 2025-04-09 02:06:38 +01:00
Sebastian Goscik
c07fb30fff update pre-commit 2025-04-09 01:54:57 +01:00
Sebastian Goscik
1de9b9a757 [actions] Fix CRLF issue on windows 2025-04-09 01:51:29 +01:00
Sebastian Goscik
3ec69a7a97 [actions] Fix uv install on windows 2025-04-09 01:47:06 +01:00
Sebastian Goscik
855607fa29 Migrate project to use uv 2025-04-09 01:40:24 +01:00
Sebastian Goscik
e11828bd59 Update makefile to use ruff 2025-04-08 23:54:24 +01:00
Sebastian Goscik
7439ac9bda Bump version: 0.11.0 → 0.12.0 2025-01-18 18:23:33 +00:00
Sebastian Goscik
e3cbcc819e Fix github action python version parsing 2025-01-18 18:23:33 +00:00
Sebastian Goscik
ccb816ddbc fix bump2version config 2025-01-18 17:19:47 +00:00
Sebastian Goscik
9d2d6558a6 Changelog 2025-01-18 17:18:05 +00:00
Sebastian Goscik
3c5056614c Monkey patch in experimental downloader 2025-01-18 17:07:44 +00:00
Sebastian Goscik
1f18c06e17 Bump dependency versions 2025-01-18 17:07:44 +00:00
Sebastian Goscik
3181080bca Fix issue when --camera isn't specified
Click defaults options with multiple=true to an empty list, not None, if they are not provided
2025-01-18 16:43:02 +00:00
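The Click behaviour the fix relies on, as a standalone sketch (the option mirrors the tool's `--camera` flag):

```
import click

@click.command()
@click.option("--camera", "cameras", multiple=True)
def main(cameras):
    # With multiple=True, Click passes an empty tuple (not None) when
    # the option is omitted, so an `is None` check never fires;
    # truthiness must be used instead.
    if not cameras:
        click.echo("No --camera given; backing up all cameras")
    else:
        click.echo(f"Backing up only: {', '.join(cameras)}")

if __name__ == "__main__":
    main()
```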
Wietse Wind
6e5d90a9f5 Add ability to INCLUDE specific cameras instead of EXCLUDE (#179)
Co-authored-by: Sebastian Goscik <sebastian.goscik@live.co.uk>
2025-01-18 15:12:55 +00:00
dependabot[bot]
475beaee3d Bump aiohttp from 3.10.10 to 3.10.11 in the pip group across 1 directory
Bumps the pip group with 1 update in the / directory: [aiohttp](https://github.com/aio-libs/aiohttp).


Updates `aiohttp` from 3.10.10 to 3.10.11
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.10.10...v3.10.11)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: indirect
  dependency-group: pip
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-06 21:02:05 +00:00
Wietse Wind
75cd1207b4 Fix iterating over empty events 2025-01-06 20:41:11 +00:00
Sebastian Goscik
c067dbd9f7 Filter out on-going events
Unifi Protect has started to return events that have not ended. These are now explicitly filtered out
2024-10-26 22:12:50 +01:00
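A sketch of the filtering idea, using a hypothetical stand-in for uiprotect's `Event` (whose `end` field is unset while an event is still in progress):

```
from datetime import datetime
from typing import Optional

class Event:
    # Hypothetical minimal event shape, for illustration only.
    def __init__(self, event_id: str, end: Optional[datetime]):
        self.id = event_id
        self.end = end

def finished_events(events: list[Event]) -> list[Event]:
    # Drop events Protect reports before they have ended.
    return [e for e in events if e.end is not None]
```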
Sebastian Goscik
2c43149c99 ruff formatting 2024-10-26 22:12:50 +01:00
Sebastian Goscik
78a2c3034d Bump uiprotect 2024-10-26 22:12:50 +01:00
jimmydoh
1bb8496b30 Adding support for SMART_DETECT_LINE events 2024-10-26 22:12:50 +01:00
Sebastian Goscik
80ad55d0d0 Simplified websocket reconnection logic
This is now handled automatically by uiprotect internally. We do not need to worry about this, greatly simplifying the logic here to just logging messages
2024-10-26 21:27:19 +01:00
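With reconnection handled inside uiprotect, the subscriber reduces to logging; a sketch of such a callback (the library's actual subscription API is not shown here):

```
import logging

logger = logging.getLogger(__name__)

def on_websocket_state_change(connected: bool) -> None:
    # The library reconnects on its own, so the tool only reports state.
    if connected:
        logger.info("Websocket connected")
    else:
        logger.warning("Websocket disconnected; library will reconnect")
```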
jimmydoh
0b2c46888c Replace check_ws with subscription to websocket state 2024-10-26 21:27:19 +01:00
Jonathan Laliberte
0026eaa2ca #171 - Use exponential backoff when logging into Unifi API (#172) 2024-10-10 20:55:00 +00:00
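A generic sketch of the backoff approach; the actual delays and attempt count used by the tool may differ:

```
import asyncio
import random

async def login_with_backoff(login, max_attempts: int = 10):
    # Retry `login` with exponentially increasing, jittered delays so
    # repeated failures don't hammer the Unifi API's rate limiter.
    delay = 1.0
    for attempt in range(1, max_attempts + 1):
        try:
            return await login()
        except Exception:
            if attempt == max_attempts:
                raise
            await asyncio.sleep(delay + random.uniform(0, 1))
            delay *= 2
```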
Sebastian Goscik
c3290a223a Update 30-config
Fixed path in error message
2024-09-10 12:32:50 +01:00
Sebastian Goscik
4265643806 Update contribution guide setup steps 2024-08-10 00:38:42 +01:00
Sebastian Goscik
78be4808d9 mypy fixes 2024-08-10 00:17:55 +01:00
Sebastian Goscik
0a6a259120 remove twine dev dependency 2024-08-09 23:53:18 +01:00
Sebastian Goscik
de4f69dcb5 switch pre-commit to ruff 2024-08-09 23:49:11 +01:00
Sebastian Goscik
a7c4eb8dae remove editor config 2024-08-09 23:46:50 +01:00
Sebastian Goscik
129d89480e update git ignore 2024-08-09 23:46:09 +01:00
Sebastian Goscik
a7ccef7f1d ruff check 2024-08-09 23:45:21 +01:00
Sebastian Goscik
bbd70f49bf ruff format 2024-08-09 23:43:03 +01:00
Sebastian Goscik
f9d74c27f9 change linter to ruff 2024-08-09 23:39:54 +01:00
Sebastian Goscik
9d79890eff Update poetry lock 2024-08-09 23:38:46 +01:00
Lloyd Pickering
ccf2cde272 Switch to using UIProtect library (#160)
* Updated poetry dependencies to remove optional flags on dev/test

* file fixups from running poetry run tox

* Updated to Python 3.10

* Switched to UI Protect library

* Updated changelog

* Fix docker permissions

- Make scripts executable by everyone
- Correct XDG variable name to fix incorrect config path being used

* Revert "Updated poetry dependencies to remove optional flags on dev/test" and regenerated lock file
This reverts commit 432d0d3df7.

---------

Co-authored-by: Sebastian Goscik <sebastian.goscik@live.co.uk>
2024-08-09 22:16:19 +00:00
Sebastian Goscik
a8328fd09e Bump version: 0.10.7 → 0.11.0 2024-06-08 01:31:58 +01:00
Sebastian Goscik
28d241610b changelog 2024-06-08 01:31:21 +01:00
Sebastian Goscik
aa1335e73b Fix typos and add experimental downloader to README 2024-06-08 01:29:06 +01:00
Sebastian Goscik
9cb2ccf8b2 Update pyunifiprotect to point to my fork
This is done to pull in features that have not yet been merged into the upstream repo. This also allows for stability in the future.
2024-06-08 01:18:14 +01:00
Sebastian Goscik
30ea7de5c2 Add experimental downloader
This uses a new API to download events the way the web UI does: it first asks for a video to be prepared (on the unifi protect host) and then downloads it. This is potentially more stable than the existing downloader.
2024-06-06 00:41:42 +01:00
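A sketch of the prepare-then-fetch flow described above; all three `api` calls are hypothetical placeholders, not real uiprotect methods:

```
import asyncio

async def prepare_and_download(api, camera_id, start, end):
    # Ask the NVR to prepare the clip, poll until it is ready,
    # then download it, mirroring what the web UI does.
    job = await api.request_export(camera_id, start, end)
    while not await api.export_ready(job):
        await asyncio.sleep(1)
    return await api.fetch_export(job)
```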
Sebastian Goscik
2dac2cee23 TEMP: Switch to fork of pyunifiprotect
In order to test new functionality of a PR this commit temporarily changes the source of pyunifiprotect
2024-06-06 00:41:42 +01:00
Sebastian Goscik
f4d992838a Fix permissions issue with ufp/sessions.json in docker container
The python library `platformdirs` detects the user as root instead of the uid set to execute UPB. This workaround forces the session cache file to be placed in /config
2024-06-06 00:41:20 +01:00
Sebastian Goscik
9fe4394ee4 bump pyunifiprotect to 6.0.1 2024-05-27 23:05:19 +01:00
Sebastian Goscik
e65d8dde6c Bump version: 0.10.6 → 0.10.7 2024-03-23 00:18:57 +00:00
Sebastian Goscik
90108edeb8 Force using pyunifiprotect >= 5.0.1 2024-03-23 00:18:49 +00:00
Sebastian Goscik
1194e957a5 Bump version: 0.10.5 → 0.10.6 2024-03-22 22:50:20 +00:00
Sebastian Goscik
65128b35dd changelog 2024-03-22 22:50:14 +00:00
mmolitor87
64bb353f67 Bump pyunifiprotect to support protect 3.0.22 (#133) 2024-03-22 22:47:54 +00:00
Adrian Keenan
558859dd72 Update docs for ignoring cameras (#134)
* update docs

* remove docker from log scanning notes
2024-03-21 23:09:09 +00:00
Sebastian Goscik
d3b40b443a Bump version: 0.10.4 → 0.10.5 2024-02-24 16:19:22 +00:00
Sebastian Goscik
4bfe9afc10 Bump pyunifiprotect 2024-02-24 16:19:11 +00:00
Sebastian Goscik
c69a3e365a Bump version: 0.10.3 → 0.10.4 2024-01-26 19:49:36 +00:00
Sebastian Goscik
ace6a09bba changelog 2024-01-26 19:49:32 +00:00
Sebastian Goscik
e3c00e3dfa Update pyunifiprotect version 2024-01-26 19:47:44 +00:00
Sebastian Goscik
5f7fad72d5 Bump version: 0.10.2 → 0.10.3 2023-12-07 19:59:13 +00:00
Sebastian Goscik
991998aa37 changelog 2023-12-07 19:59:10 +00:00
Sebastian Goscik
074f5b372c bump pyunifiprotect version 2023-12-07 19:57:21 +00:00
Sebastian Goscik
00aec23805 Bump version: 0.10.1 → 0.10.2 2023-11-21 00:20:46 +00:00
Sebastian Goscik
52e4ecd50d changelog 2023-11-21 00:20:35 +00:00
Sebastian Goscik
6b116ab93b Fixed issue where duplicate events were being downloaded
Previously unifi would only send one update which contained the end timestamp,
so it was sufficient to check if it existed in the new event data.
However, it is now possible to get update events after the end timestamp
has been set. With this change we now look for when the event change
data contains the end timestamp. So long as unifi does not change its
mind about when an event ends, this should solve the issue.
2023-11-21 00:18:36 +00:00
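The check described above, as a hypothetical helper (the `end` key name is assumed for illustration):

```
def update_marks_event_end(change_data: dict) -> bool:
    # Trigger the backup when this particular update carries the end
    # timestamp, rather than when the stored event merely has one, so
    # further updates arriving after the end was set are ignored.
    return change_data.get("end") is not None
```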
Sebastian Goscik
70526b2f49 Make default file path format use event start time 2023-11-21 00:08:24 +00:00
Sebastian Goscik
5069d28f0d Bump version: 0.10.0 → 0.10.1 2023-11-01 21:34:01 +00:00
Sebastian Goscik
731ab1081d changelog 2023-11-01 21:33:55 +00:00
Sebastian Goscik
701fd9b0a8 Fix event enum string conversion to value 2023-11-01 21:32:19 +00:00
Sebastian Goscik
5fa202005b Bump version: 0.9.5 → 0.10.0 2023-11-01 00:16:17 +00:00
Sebastian Goscik
3644ad3754 changelog 2023-11-01 00:15:54 +00:00
Sebastian Goscik
9410051ab9 Add feature to skip events longer than a maximum length 2023-11-01 00:11:49 +00:00
Sebastian Goscik
d5a74f475a failed rcat no longer writes to database 2023-10-31 23:37:52 +00:00
Sebastian Goscik
dc8473cc3d Fix bug with event chunking during initial ignore of events 2023-10-31 17:47:59 +00:00
Sebastian Goscik
60901e9a84 Fix crash caused by no events occurring in retention interval 2023-10-31 17:35:30 +00:00
Sebastian Goscik
4a0bd87ef2 Move docker base image to alpine edge to get latest rclone release 2023-10-31 17:32:43 +00:00
Sebastian Goscik
8dc0f8a212 Bump version: 0.9.4 → 0.9.5 2023-10-07 22:52:45 +01:00
Sebastian Goscik
34252c461f changelog 2023-10-07 22:52:17 +01:00
Sebastian Goscik
acc405a1f8 Chunk event query to prevent crashing unifi protect 2023-10-07 22:50:04 +01:00
Sebastian Goscik
b66d40736c Bump dependency versions 2023-10-07 21:49:46 +01:00
cyberpower678
171796e5c3 Update unifi_protect_backup_core.py (#100)
Fix typo in connection attempts.  The application only attempts to connect once instead of 10 times.
2023-09-08 16:27:09 +01:00
Sebastian Goscik
cbc497909d linting 2023-07-29 12:07:31 +01:00
Sebastian Goscik
66b3344e29 Add download rate limiter 2023-07-29 12:07:31 +01:00
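The limiter pattern, shown with `aiolimiter` (which the downloader diff below imports). `AsyncLimiter(n)` permits n acquisitions per 60-second window by default; a faster window is used here only so the example finishes quickly:

```
import asyncio
from aiolimiter import AsyncLimiter

limiter = AsyncLimiter(3, time_period=1)  # 3 downloads per second

async def download(i: int) -> None:
    async with limiter:  # waits here once the rate is exhausted
        print(f"downloading event {i}")

async def main() -> None:
    await asyncio.gather(*(download(i) for i in range(6)))

asyncio.run(main())
```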
Sebastian Goscik
89cab64679 Add validation of retention/purge interval 2023-07-29 12:06:54 +01:00
29 changed files with 2873 additions and 3049 deletions

.bumpversion.cfg

@@ -1,5 +1,5 @@
[bumpversion]
current_version = 0.9.4
current_version = 0.13.0
commit = True
tag = True
@@ -8,8 +8,8 @@ search = version = "{current_version}"
replace = version = "{new_version}"
[bumpversion:file:unifi_protect_backup/__init__.py]
search = __version__ = '{current_version}'
replace = __version__ = '{new_version}'
search = __version__ = "{current_version}"
replace = __version__ = "{new_version}"
[bumpversion:file:Dockerfile]
search = COPY dist/unifi_protect_backup-{current_version}.tar.gz sdist.tar.gz

.editorconfig

@@ -1,24 +0,0 @@
# http://editorconfig.org
root = true
[*]
indent_style = space
indent_size = 4
trim_trailing_whitespace = true
insert_final_newline = true
charset = utf-8
end_of_line = lf
[*.bat]
indent_style = tab
end_of_line = crlf
[LICENSE]
insert_final_newline = false
[Makefile]
indent_style = tab
[*.{yml, yaml}]
indent_size = 2

.github/workflows/dev.yml

@@ -1,90 +1,108 @@
# This is a basic workflow to help you get started with Actions
name: Test and Build
name: dev workflow
# Controls when the action will run.
on:
# Triggers the workflow on push events but only for the dev branch
push:
branches: [ dev ]
branches-ignore:
- main
pull_request:
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "test"
test:
# The type of runner that the job will run on
strategy:
matrix:
python-versions: [3.9]
os: [ubuntu-18.04, macos-latest, windows-latest]
python-versions: ["3.10", "3.11", "3.12", "3.13"]
os: [ubuntu-latest, macos-latest, windows-latest]
runs-on: ${{ matrix.os }}
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- name: Configure Git to maintain line endings
run: |
git config --global core.autocrlf false
git config --global core.eol lf
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-versions }}
- name: Install dependencies
- name: Install uv (Unix)
if: runner.os != 'Windows'
run: |
python -m pip install --upgrade pip
pip install poetry tox tox-gh-actions
curl -LsSf https://astral.sh/uv/install.sh | sh
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
- name: test with tox
run:
tox
- name: Install uv (Windows)
if: runner.os == 'Windows'
run: |
iwr -useb https://astral.sh/uv/install.ps1 | iex
echo "$HOME\.cargo\bin" | Out-File -FilePath $env:GITHUB_PATH -Encoding utf8 -Append
- name: list files
run: ls -l .
- uses: codecov/codecov-action@v1
with:
fail_ci_if_error: true
files: coverage.xml
- name: Install dev dependencies
run: |
uv sync --dev
- name: Run pre-commit
run: uv run pre-commit run --all-files
- name: Run pytest
run: uv run pytest
- name: Build
run: uv build
dev_container:
name: Create dev container
runs-on: ubuntu-20.04
if: github.event_name != 'pull_request'
# Steps represent a sequence of tasks that will be executed as part of the job
name: Create dev container
needs: test
if: github.ref == 'refs/heads/dev'
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: 3.9
python-version: '3.12'
- name: Install dependencies
- name: Install uv (Unix)
if: runner.os != 'Windows'
run: |
python -m pip install --upgrade pip
pip install poetry tox tox-gh-actions
curl -LsSf https://astral.sh/uv/install.sh | sh
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
- name: Build wheels and source tarball
run: >-
poetry build
- name: Install uv (Windows)
if: runner.os == 'Windows'
run: |
iwr -useb https://astral.sh/uv/install.ps1 | iex
echo "$HOME\.cargo\bin" | Out-File -FilePath $env:GITHUB_PATH -Encoding utf8 -Append
- name: Build
run: uv build
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Log in to container registry
uses: docker/login-action@v2
uses: docker/setup-buildx-action@v3
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push dev
uses: docker/build-push-action@v2
- name: Build and push Docker image
uses: docker/build-push-action@v5
with:
context: .
platforms: linux/amd64,linux/arm64

.github/workflows/release.yml

@@ -1,34 +1,27 @@
# Publish package on main branch if it's tagged with 'v*'
name: release & publish workflow
name: Release & Publish Workflow
# Controls when the action will run.
on:
# Triggers the workflow on push events but only for the master branch
push:
tags:
- 'v*'
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "release"
release:
name: Create Release
runs-on: ubuntu-20.04
runs-on: ubuntu-latest
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
- name: Get version from tag
id: tag_name
run: |
echo ::set-output name=current_version::${GITHUB_REF#refs/tags/v}
echo "current_version=${GITHUB_REF#refs/tags/v}" >> $GITHUB_OUTPUT
shell: bash
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v2
- name: Checkout code
uses: actions/checkout@v4
- name: Get Changelog Entry
id: changelog_reader
@@ -37,56 +30,57 @@ jobs:
version: ${{ steps.tag_name.outputs.current_version }}
path: ./CHANGELOG.md
- uses: actions/setup-python@v2
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: 3.9
python-version: "3.10"
- name: Install dependencies
- name: Install uv
run: |
python -m pip install --upgrade pip
pip install poetry
curl -LsSf https://astral.sh/uv/install.sh | sh
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
- name: Build wheels and source tarball
run: >-
poetry build
run: uv build
- name: show temporary files
run: >-
ls -lR
- name: Show build artifacts
run: ls -lR dist/
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
uses: docker/setup-buildx-action@v3
- name: Log in to container registry
uses: docker/login-action@v2
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push dev
uses: docker/build-push-action@v2
- name: Build and push container
uses: docker/build-push-action@v5
with:
context: .
platforms: linux/amd64,linux/arm64
push: true
tags: ghcr.io/${{ github.repository }}:${{ steps.tag_name.outputs.current_version }}, ghcr.io/${{ github.repository }}:latest
tags: |
ghcr.io/${{ github.repository }}:${{ steps.tag_name.outputs.current_version }}
ghcr.io/${{ github.repository }}:latest
- name: create github release
- name: Create GitHub release
id: create_release
uses: softprops/action-gh-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
body: ${{ steps.changelog_reader.outputs.changes }}
files: dist/*.whl
files: dist/*
draft: false
prerelease: false
- name: publish to PyPI
- name: Publish to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
with:
user: __token__

.gitignore

@@ -119,4 +119,6 @@ data/
.envrc
clips/
*.sqlite
*.sqlite
.tool-versions
docker-compose.yml

.pre-commit-config.yaml

@@ -5,32 +5,26 @@ repos:
- id: forbid-crlf
- id: remove-crlf
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.4.0
rev: v5.0.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-merge-conflict
- id: check-yaml
args: [ --unsafe ]
- repo: https://github.com/pre-commit/mirrors-isort
rev: v5.8.0
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.5.7
hooks:
- id: isort
args: [ "--filter-files" ]
- repo: https://github.com/ambv/black
rev: 21.5b1
hooks:
- id: black
language_version: python3.9
- repo: https://github.com/pycqa/flake8
rev: 3.9.2
hooks:
- id: flake8
additional_dependencies: [ flake8-typing-imports==1.10.0 ]
# Run the linter.
- id: ruff
# Run the formatter.
- id: ruff-format
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.901
rev: v1.14.1
hooks:
- id: mypy
exclude: tests/
additional_dependencies:
- types-click
- types-pytz
- types-cryptography
- types-python-dateutil
- types-aiofiles

CHANGELOG.md

@@ -4,6 +4,80 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.13.0] - 2025-04-09
### Added
- Parallel uploaders are now supported
- All smart detection types are now supported
- Migrated the project from poetry to uv
### Fixed
- Corrected the envvar for setting cameras to backup from ONLY_CAMERAS -> CAMERAS
- Bumped to the latest uiprotect library to fix issue when unifi access devices are present
## [0.12.0] - 2025-01-18
### Added
- Tool now targets UIProtect instead of pyunifiprotect, which should help with any lingering auth issues with Unifi OS 4.X
- Python Version bumped to 3.10 (based on UIProtect need)
- The ability to specify only specific cameras to backup
- Re-enabled the experimental downloader after adding a monkey patch for UIProtect to include the unmerged code
- Switched linter to `ruff`
- Added support for SMART_DETECT_LINE events
### Fixed
- Unifi now returns unfinished events; this is now handled correctly
- Login attempts now use an exponentially increasing delay to try to work around aggressive rate limiting on logins
## [0.11.0] - 2024-06-08
### Added
- A new experimental downloader that uses the same mechanism the web ui does. Enable with
`--experimental-downloader`
### Fixed
- Support for UniFi OS 4.x.x
## [0.10.7] - 2024-03-22
### Fixed
- Set pyunifiprotect to a minimum version of 5.0.0
## [0.10.6] - 2024-03-22
### Fixed
- Bumped `pyunifiprotect` version to fix issues with versions of Unifi Protect after 3.0.10
## [0.10.5] - 2024-01-26
### Fixed
- Bumped `pyunifiprotect` version to fix issue with old version of yarl
## [0.10.4] - 2024-01-26
### Fixed
- Bumped `pyunifiprotect` version to fix issue caused by new video modes
## [0.10.3] - 2023-12-07
### Fixed
- Bumped `pyunifiprotect` version to fix issue caused by unifi protect returning invalid UUIDs
## [0.10.2] - 2023-11-21
### Fixed
- Issue where duplicate events were being downloaded, causing database errors
- Default file path format now uses event start time instead of event end time, which makes more logical sense
## [0.10.1] - 2023-11-01
### Fixed
- Event type enum conversion string was no longer converting to the enum value; this is now done explicitly.
## [0.10.0] - 2023-11-01
### Added
- Command line option to skip events longer than a given length (default 2 hours)
- Docker image is now based on alpine edge giving access to the latest version of rclone
### Fixed
- Failed uploads no longer write to the database, meaning they will be retried
- Fixed issue with chunked event fetch during initial ignore of events
- Fixed error when no events were fetched for the retention period
## [0.9.5] - 2023-10-07
### Fixed
- Errors caused by the latest unifi protect version, by bumping the version of pyunifiprotect used
- Queries for events are now chunked into groups of 500, which should help stop this tool from crashing
large unifi protect instances.
## [0.9.4] - 2023-07-29
### Fixed
- Time period parsing, 'Y' -> 'y'

CONTRIBUTING.md

@@ -55,12 +55,11 @@ Ready to contribute? Here's how to set up `unifi-protect-backup` for local devel
$ git clone git@github.com:your_name_here/unifi-protect-backup.git
```
3. Ensure [poetry](https://python-poetry.org/docs/) is installed.
4. Install dependencies and start your virtualenv:
3. Ensure [uv](https://docs.astral.sh/uv/) is installed.
4. Create virtual environment and install dependencies:
```
$ poetry install -E test -E dev
$ poetry shell
$ uv sync --dev
```
5. Create a branch for local development:
@@ -75,14 +74,21 @@ Ready to contribute? Here's how to set up `unifi-protect-backup` for local devel
be inside the `poetry shell` virtualenv or run it via poetry:
```
$ poetry run unifi-protect-backup {args}
$ uv run unifi-protect-backup {args}
```
7. When you're done making changes, check that your changes pass the
tests, including testing other Python versions, with tox:
7. Install pre-commit git hooks to ensure all code committed to the repository
is formatted correctly and meets coding standards:
```
$ uv run pre-commit install
```
8. When you're done making changes, check that your changes pass the
tests:
```
$ poetry run tox
$ uv run pytest
```
8. Commit your changes and push your branch to GitHub:
@@ -103,14 +109,14 @@ Before you submit a pull request, check that it meets these guidelines:
2. If the pull request adds functionality, the docs should be updated. Put
your new functionality into a function with a docstring. If adding a CLI
option, you should update the "usage" in README.md.
3. The pull request should work for Python 3.9. Check
3. The pull request should work for Python 3.10. Check
https://github.com/ep1cman/unifi-protect-backup/actions
and make sure that the tests pass for all supported Python versions.
## Tips
```
$ poetry run pytest tests/test_unifi_protect_backup.py
$ uv run pytest tests/test_unifi_protect_backup.py
```
To run a subset of tests.
@@ -123,7 +129,7 @@ Make sure all your changes are committed (including an entry in CHANGELOG.md).
Then run:
```
$ poetry run bump2version patch # possible: major / minor / patch
$ uv run bump2version patch # possible: major / minor / patch
$ git push
$ git push --tags
```

Dockerfile

@@ -1,16 +1,16 @@
# To build run:
# make docker
FROM ghcr.io/linuxserver/baseimage-alpine:3.16
FROM ghcr.io/linuxserver/baseimage-alpine:edge
LABEL maintainer="ep1cman"
WORKDIR /app
COPY dist/unifi_protect_backup-0.9.4.tar.gz sdist.tar.gz
COPY dist/unifi_protect_backup-0.13.0.tar.gz sdist.tar.gz
# https://github.com/rust-lang/cargo/issues/2808
ENV CARGO_NET_GIT_FETCH_WITH_CLI=true
ENV CARGO_NET_GIT_FETCH_WITH_CLI=true
RUN \
echo "**** install build packages ****" && \
@@ -29,7 +29,7 @@ RUN \
py3-pip \
python3 && \
echo "**** install unifi-protect-backup ****" && \
pip install --no-cache-dir sdist.tar.gz && \
pip install --no-cache-dir --break-system-packages sdist.tar.gz && \
echo "**** cleanup ****" && \
apk del --purge \
build-dependencies && \
@@ -50,6 +50,9 @@ ENV TZ=UTC
ENV IGNORE_CAMERAS=""
ENV SQLITE_PATH=/config/database/events.sqlite
# Fixes issue where `platformdirs` is unable to properly detect the user directory
ENV XDG_CONFIG_HOME=/config
COPY docker_root/ /
RUN mkdir -p /config/database /config/rclone

README.md

@@ -28,8 +28,8 @@ retention period.
- Automatic pruning of old clips
## Requirements
- Python 3.9+
- Unifi Protect version 1.20 or higher (as per [`pyunifiprotect`](https://github.com/briis/pyunifiprotect))
- Python 3.10+
- Unifi Protect version 1.20 or higher (as per [`uiprotect`](https://github.com/uilibs/uiprotect))
- `rclone` installed with at least one remote configured.
# Setup
@@ -48,7 +48,7 @@ In order to connect to your unifi protect instance, you will first need to setup
## Installation
*The prefered way to run this tool is using a container*
*The preferred way to run this tool is using a container*
### Docker Container
You can run this tool as a container if you prefer with the following command.
@@ -129,14 +129,21 @@ Options:
example.
--rclone-purge-args TEXT Optional extra arguments to pass to `rclone delete` directly.
Common usage for this would be to execute a permanent delete
instead of using the recycle bin on a destination.
Google Drive example: `--drive-use-trash=false`
instead of using the recycle bin on a destination. Google Drive
example: `--drive-use-trash=false`
--detection-types TEXT A comma separated list of which types of detections to backup.
Valid options are: `motion`, `person`, `vehicle`, `ring`
[default: motion,person,vehicle,ring]
--ignore-camera TEXT IDs of cameras for which events should not be backed up. Use
multiple times to ignore multiple IDs. If being set as an
environment variable the IDs should be separated by whitespace.
Alternatively, use a Unifi user with a role which has access
restricted to the subset of cameras that you wish to backup.
--camera TEXT IDs of *ONLY* cameras for which events should be backed up. Use
multiple times to include multiple IDs. If being set as an
environment variable the IDs should be separated by whitespace.
Alternatively, use a Unifi user with a role which has access
restricted to the subset of cameras that you wish to backup.
--file-structure-format TEXT A Python format string used to generate the file structure/name
on the rclone remote. For details of the fields available, see
the project's `README.md` file. [default: {camera_name}/{event.s
@@ -187,8 +194,16 @@ Options:
More details about supported platforms can be found here:
https://github.com/caronc/apprise
--skip-missing If set, events which are 'missing' at the start will be ignored.
--skip-missing If set, events which are 'missing' at the start will be ignored.
Subsequent missing events will be downloaded (e.g. a missed event) [default: False]
--download-rate-limit FLOAT Limit how many events can be downloaded in one minute. Disabled by
default
--max-event-length INTEGER Only download events shorter than this maximum length, in
seconds [default: 7200]
--experimental-downloader If set, a new experimental download mechanism will be used to match
what the web UI does. This might be more stable if you are experiencing
a lot of failed downloads with the default downloader. [default: False]
--parallel-uploads INTEGER Max number of parallel uploads to allow [default: 1]
--help Show this message and exit.
```
@@ -204,6 +219,7 @@ always take priority over environment variables):
- `RCLONE_ARGS`
- `RCLONE_PURGE_ARGS`
- `IGNORE_CAMERAS`
- `CAMERAS`
- `DETECTION_TYPES`
- `FILE_STRUCTURE_FORMAT`
- `SQLITE_PATH`
@@ -212,6 +228,10 @@ always take priority over environment variables):
- `PURGE_INTERVAL`
- `APPRISE_NOTIFIERS`
- `SKIP_MISSING`
- `DOWNLOAD_RATELIMIT`
- `MAX_EVENT_LENGTH`
- `EXPERIMENTAL_DOWNLOADER`
- `PARALLEL_UPLOADS`
## File path formatting
@@ -223,7 +243,7 @@ If you wish for the clips to be structured differently you can do this using the
option. It uses standard [python format string syntax](https://docs.python.org/3/library/string.html#formatstrings).
The following fields are provided to the format string:
- *event:* The `Event` object as per https://github.com/briis/pyunifiprotect/blob/master/pyunifiprotect/data/nvr.py
- *event:* The `Event` object as per https://github.com/uilibs/uiprotect/blob/main/src/uiprotect/data/nvr.py
- *duration_seconds:* The duration of the event in seconds
- *detection_type:* A nicely formatted list of the event detection type and the smart detection types (if any)
- *camera_name:* The name of the camera that generated this event
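For illustration only (this example is not from the project's docs), a custom format string combining these fields could look like:
```
$ unifi-protect-backup [...] --file-structure-format "{camera_name}/{event.start:%Y-%m}/{event.id} {detection_type}.mp4"
```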
@@ -236,6 +256,46 @@ now on, you can use the `--skip-missing` flag. This does not enable the periodic
If you use this feature it is advised that you run the tool once with this flag, then stop it once the database has been created and the events are ignored. Keeping this flag set permanently could cause events to be missed if the tool crashes and is restarted etc.
## Selecting cameras
By default unifi-protect-backup backs up clips from all cameras.
If you want to limit the backups to certain cameras, you can do that in one of two ways.
Note: Camera IDs can be obtained by scanning the logs for `Found cameras:`. You can find this section of the logs by piping the logs into this `sed` command:
`sed -n '/Found cameras:/,/NVR TZ/p'`
### Back-up only specific cameras
By using the `--camera` argument, you can specify the IDs of the cameras you want to backup. If you want to backup more than one camera, you can specify this argument more than once. If this argument is specified, all other cameras will be ignored.
#### Example:
If you have three cameras:
- `CAMERA_ID_1`
- `CAMERA_ID_2`
- `CAMERA_ID_3`
and run the following command:
```
$ unifi-protect-backup [...] --camera CAMERA_ID_1 --camera CAMERA_ID_2
```
Only `CAMERA_ID_1` and `CAMERA_ID_2` will be backed up.
### Ignoring cameras
By using the `--ignore-camera` argument, you can specify the IDs of the cameras you *do not* want to backup. If you want to ignore more than one camera, you can specify this argument more than once. If this argument is specified, all cameras will be backed up except the ones specified.
#### Example:
If you have three cameras:
- `CAMERA_ID_1`
- `CAMERA_ID_2`
- `CAMERA_ID_3`
and run the following command:
```
$ unifi-protect-backup [...] --ignore-camera CAMERA_ID_1 --ignore-camera CAMERA_ID_2
```
Only `CAMERA_ID_3` will be backed up.
### Note about unifi protect accounts
It is possible to limit which cameras a unifi protect account can see. If an account does not have access to a camera, this tool will never see it as available, so it will not be impacted by the above arguments.
# A note about `rclone` backends and disk wear
This tool attempts to not write the downloaded files to disk to minimise disk wear, and instead streams them directly to
rclone. Sadly, not all storage backends supported by `rclone` allow "Stream Uploads". Please refer to the `StreamUpload` column in this table to see which ones do and don't: https://rclone.org/overview/#optional-features
@@ -265,7 +325,7 @@ tmpfs /mnt/tmpfs tmpfs nosuid,nodev,noatime 0 0
```
# Running Backup Tool as a Service (LINUX ONLY)
You can create a service that will run the docker or local version of this backup tool. The service can be configured to launch on boot. This is likely the preferred way you want to execute the tool once you have it completely configured and tested so it is continiously running.
You can create a service that will run the docker or local version of this backup tool. The service can be configured to launch on boot. This is likely the preferred way you want to execute the tool once you have it completely configured and tested so it is continuously running.
First create a service configuration file. You can replace `protectbackup` in the filename below with the name you wish to use for your service; if you change it, remember to change the other locations in the following scripts as well.
@@ -335,7 +395,7 @@ docker run \
</a>
- Heavily utilises [`pyunifiprotect`](https://github.com/briis/pyunifiprotect) by [@briis](https://github.com/briis/)
- Heavily utilises [`uiprotect`](https://github.com/uilibs/uiprotect)
- All the cloud functionality is provided by [`rclone`](https://rclone.org/)
- This package was created with [Cookiecutter](https://github.com/audreyr/cookiecutter) and the [waynerv/cookiecutter-pypackage](https://github.com/waynerv/cookiecutter-pypackage) project template.

docker_root/etc/cont-init.d/30-config Normal file → Executable file

@@ -4,7 +4,7 @@ mkdir -p /config/rclone
# For backwards compatibility
[[ -f "/root/.config/rclone/rclone.conf" ]] && \
echo "DEPRECATED: Copying rclone conf from /root/.config/rclone/rclone.conf, please change your mount to /config/rclone.conf"
echo "DEPRECATED: Copying rclone conf from /root/.config/rclone/rclone.conf, please change your mount to /config/rclone/rclone.conf"
cp \
/root/.config/rclone/rclone.conf \
/config/rclone/rclone.conf

docker_root/etc/services.d/unifi-protect-backup/run Normal file → Executable file

@@ -1,9 +1,21 @@
#!/usr/bin/with-contenv bash
export RCLONE_CONFIG=/config/rclone/rclone.conf
export XDG_CACHE_HOME=/config
echo $VERBOSITY
[[ -n "$VERBOSITY" ]] && export VERBOSITY_ARG=-$VERBOSITY || export VERBOSITY_ARG=""
exec \
s6-setuidgid abc unifi-protect-backup ${VERBOSITY_ARG}
# Run without exec to catch the exit code
s6-setuidgid abc unifi-protect-backup ${VERBOSITY_ARG}
exit_code=$?
# If exit code is 200 (arg error), exit the container
if [ $exit_code -eq 200 ]; then
# Send shutdown signal to s6
/run/s6/basedir/bin/halt
exit $exit_code
fi
# Otherwise, let s6 handle potential restart
exit $exit_code

Makefile

@@ -6,11 +6,10 @@ container_arches ?= linux/amd64,linux/arm64
test: format lint unittest
format:
isort $(sources) tests
black $(sources) tests
ruff format $(sources) tests
lint:
flake8 $(sources) tests
ruff check $(sources) tests
mypy $(sources) tests
unittest:
@@ -29,5 +28,5 @@ clean:
rm -rf coverage.xml .coverage
docker:
poetry build
docker buildx build . --platform $(container_arches) -t $(container_name) --push
uv build
docker buildx build . --platform $(container_arches) -t $(container_name) --push

poetry.lock

File diff suppressed because it is too large

pyproject.toml

@@ -1,99 +1,76 @@
[tool]
[tool.poetry]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "unifi_protect_backup"
version = "0.9.4"
homepage = "https://github.com/ep1cman/unifi-protect-backup"
version = "0.13.0"
description = "Python tool to backup unifi event clips in realtime."
authors = ["sebastian.goscik <sebastian@goscik.com>"]
readme = "README.md"
license = "MIT"
classifiers=[
'Development Status :: 5 - Production/Stable',
'Intended Audience :: Information Technology',
'License :: OSI Approved :: MIT License',
'Natural Language :: English',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.9',
license = {text = "MIT"}
authors = [
{name = "sebastian.goscik", email = "sebastian@goscik.com"}
]
packages = [
{ include = "unifi_protect_backup" },
{ include = "tests", format = "sdist" },
classifiers = [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Information Technology",
"License :: OSI Approved :: MIT License",
"Natural Language :: English",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.10",
]
requires-python = ">=3.10.0,<4.0"
dependencies = [
"click==8.0.1",
"aiorun>=2023.7.2",
"aiosqlite>=0.17.0",
"python-dateutil>=2.8.2",
"apprise>=1.5.0",
"expiring-dict>=1.1.0",
"async-lru>=2.0.4",
"aiolimiter>=1.1.0",
"uiprotect==7.5.2",
"aiohttp==3.11.16",
]
[tool.poetry.dependencies]
python = ">=3.9.0,<4.0"
click = "8.0.1"
pyunifiprotect = "^4.0.11"
aiorun = "^2022.11.1"
aiosqlite = "^0.17.0"
python-dateutil = "^2.8.2"
apprise = "^1.3.0"
expiring-dict = "^1.1.0"
async-lru = "^2.0.3"
[project.urls]
Homepage = "https://github.com/ep1cman/unifi-protect-backup"
[tool.poetry.group.dev]
optional = true
[project.scripts]
unifi-protect-backup = "unifi_protect_backup.cli:main"
[tool.poetry.group.dev.dependencies]
black = "^22.10.0"
isort = "^5.8.0"
flake8 = "^3.9.2"
flake8-docstrings = "^1.6.0"
virtualenv = "^20.2.2"
mypy = "^0.900"
types-pytz = "^2021.3.5"
types-cryptography = "^3.3.18"
twine = "^3.3.0"
bump2version = "^1.0.1"
pre-commit = "^2.12.0"
types-python-dateutil = "^2.8.19.10"
[dependency-groups]
dev = [
"mypy>=1.15.0",
"types-pytz>=2021.3.5",
"types-cryptography>=3.3.18",
"types-python-dateutil>=2.8.19.10",
"types-aiofiles>=24.1.0.20241221",
"bump2version>=1.0.1",
"pre-commit>=4.2.0",
"ruff>=0.11.4",
"pytest>=8.3.5",
]
[tool.poetry.group.test]
optional = true
[tool.hatch.build.targets.wheel]
packages = ["unifi_protect_backup"]
[tool.poetry.group.test.dependencies]
pytest = "^6.2.4"
pytest-cov = "^2.12.0"
tox = "^3.20.1"
tox-asdf = "^0.1.0"
[tool.hatch.build.targets.sdist]
include = ["unifi_protect_backup", "tests"]
[tool.poetry.scripts]
unifi-protect-backup = 'unifi_protect_backup.cli:main'
[tool.black]
[tool.ruff]
line-length = 120
skip-string-normalization = true
target-version = ['py39']
include = '\.pyi?$'
exclude = '''
/(
\.eggs
| \.git
| \.hg
| \.mypy_cache
| \.tox
| \.venv
| _build
| buck-out
| build
| dist
)/
'''
target-version = "py310"
[tool.isort]
multi_line_output = 3
include_trailing_comma = true
force_grid_wrap = 0
use_parentheses = true
ensure_newline_before_comments = true
line_length = 120
skip_gitignore = true
# you can skip files as below
#skip_glob = docs/conf.py
[tool.ruff.lint]
[tool.ruff.format]
[tool.mypy]
allow_redefinition=true
allow_redefinition = true
exclude = [
'unifi_protect_backup/uiprotect_patch.py'
]
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
[tool.uv]
default-groups = []

setup.cfg

@@ -1,88 +0,0 @@
[flake8]
max-line-length = 120
max-complexity = 18
ignore = E203, E266, W503
docstring-convention = google
per-file-ignores = __init__.py:F401
exclude = .git,
__pycache__,
setup.py,
build,
dist,
docs,
releases,
.venv,
.tox,
.mypy_cache,
.pytest_cache,
.vscode,
.github,
# By default test codes will be linted.
# tests
[mypy]
ignore_missing_imports = True
[coverage:run]
# uncomment the following to omit files during running
#omit =
[coverage:report]
exclude_lines =
pragma: no cover
def __repr__
if self.debug:
if settings.DEBUG
raise AssertionError
raise NotImplementedError
if 0:
if __name__ == .__main__.:
def main
[tox:tox]
isolated_build = true
envlist = py39, format, lint, build
[gh-actions]
python =
3.9: py39, format, lint, build
[testenv]
allowlist_externals = pytest
extras =
test
passenv = *
setenv =
PYTHONPATH = {toxinidir}
PYTHONWARNINGS = ignore
commands =
pytest --cov=unifi_protect_backup --cov-branch --cov-report=xml --cov-report=term-missing tests
[testenv:format]
allowlist_externals =
isort
black
extras =
test
commands =
isort unifi_protect_backup
black unifi_protect_backup tests
[testenv:lint]
allowlist_externals =
flake8
mypy
extras =
test
commands =
flake8 unifi_protect_backup tests
mypy unifi_protect_backup tests
[testenv:build]
allowlist_externals =
poetry
twine
extras =
dev
commands =
poetry build
twine check dist/*

unifi_protect_backup/__init__.py

@@ -1,12 +1,21 @@
"""Top-level package for Unifi Protect Backup."""
__author__ = """sebastian.goscik"""
__email__ = 'sebastian@goscik.com'
__version__ = '0.9.4'
__email__ = "sebastian@goscik.com"
__version__ = "0.13.0"
from .downloader import VideoDownloader
from .downloader_experimental import VideoDownloaderExperimental
from .event_listener import EventListener
from .purge import Purge
from .uploader import VideoUploader
from .missing_event_checker import MissingEventChecker
from .missing_event_checker import MissingEventChecker # isort: skip
__all__ = [
"VideoDownloader",
"VideoDownloaderExperimental",
"EventListener",
"Purge",
"VideoUploader",
"MissingEventChecker",
]

unifi_protect_backup/cli.py

@@ -1,18 +1,24 @@
"""Console script for unifi_protect_backup."""
import sys
import re
import click
from aiorun import run # type: ignore
from dateutil.relativedelta import relativedelta
from uiprotect.data.types import SmartDetectObjectType
from unifi_protect_backup import __version__
from unifi_protect_backup.unifi_protect_backup_core import UnifiProtectBackup
from unifi_protect_backup.utils import human_readable_to_float
DETECTION_TYPES = ["motion", "person", "vehicle", "ring"]
DETECTION_TYPES = ["motion", "ring", "line"] + SmartDetectObjectType.values()
def _parse_detection_types(ctx, param, value):
# split columns by ',' and remove whitespace
types = [t.strip() for t in value.split(',')]
types = [t.strip() for t in value.split(",")]
# validate passed columns
for t in types:
@@ -22,77 +28,110 @@ def _parse_detection_types(ctx, param, value):
return types
def parse_rclone_retention(ctx, param, retention) -> relativedelta:
"""Parses the rclone `retention` parameter into a relativedelta which can then be used to calculate datetimes."""
matches = {k: int(v) for v, k in re.findall(r"([\d]+)(ms|s|m|h|d|w|M|y)", retention)}
# Check that we matched the whole string
if len(retention) != len("".join([f"{v}{k}" for k, v in matches.items()])):
raise click.BadParameter("See here for expected format: https://rclone.org/docs/#time-option")
return relativedelta(
microseconds=matches.get("ms", 0) * 1000,
seconds=matches.get("s", 0),
minutes=matches.get("m", 0),
hours=matches.get("h", 0),
days=matches.get("d", 0),
weeks=matches.get("w", 0),
months=matches.get("M", 0),
years=matches.get("y", 0),
)
@click.command(context_settings=dict(max_content_width=100))
@click.version_option(__version__)
@click.option('--address', required=True, envvar='UFP_ADDRESS', help='Address of Unifi Protect instance')
@click.option('--port', default=443, envvar='UFP_PORT', show_default=True, help='Port of Unifi Protect instance')
@click.option('--username', required=True, envvar='UFP_USERNAME', help='Username to login to Unifi Protect instance')
@click.option('--password', required=True, envvar='UFP_PASSWORD', help='Password for Unifi Protect user')
@click.option("--address", required=True, envvar="UFP_ADDRESS", help="Address of Unifi Protect instance")
@click.option("--port", default=443, envvar="UFP_PORT", show_default=True, help="Port of Unifi Protect instance")
@click.option("--username", required=True, envvar="UFP_USERNAME", help="Username to login to Unifi Protect instance")
@click.option("--password", required=True, envvar="UFP_PASSWORD", help="Password for Unifi Protect user")
@click.option(
'--verify-ssl/--no-verify-ssl',
"--verify-ssl/--no-verify-ssl",
default=True,
show_default=True,
envvar='UFP_SSL_VERIFY',
envvar="UFP_SSL_VERIFY",
help="Set if you do not have a valid HTTPS Certificate for your instance",
)
@click.option(
'--rclone-destination',
"--rclone-destination",
required=True,
envvar='RCLONE_DESTINATION',
envvar="RCLONE_DESTINATION",
help="`rclone` destination path in the format {rclone remote}:{path on remote}."
" E.g. `gdrive:/backups/unifi_protect`",
)
@click.option(
'--retention',
default='7d',
"--retention",
default="7d",
show_default=True,
envvar='RCLONE_RETENTION',
help="How long should event clips be backed up for. Format as per the `rclone1 time option format "
"(https://rclone.org/docs/#time-option)",
envvar="RCLONE_RETENTION",
help="How long should event clips be backed up for. Format as per the `--max-age` argument of `rclone` "
"(https://rclone.org/filtering/#max-age-don-t-transfer-any-file-older-than-this)",
callback=parse_rclone_retention,
)
@click.option(
'--rclone-args',
default='',
envvar='RCLONE_ARGS',
"--rclone-args",
default="",
envvar="RCLONE_ARGS",
help="Optional extra arguments to pass to `rclone rcat` directly. Common usage for this would "
"be to set a bandwidth limit, for example.",
)
@click.option(
'--rclone-purge-args',
default='',
envvar='RCLONE_PURGE_ARGS',
"--rclone-purge-args",
default="",
envvar="RCLONE_PURGE_ARGS",
help="Optional extra arguments to pass to `rclone delete` directly. Common usage for this would "
"be to execute a permanent delete instead of using the recycle bin on a destination. "
"Google Drive example: `--drive-use-trash=false`",
)
@click.option(
'--detection-types',
envvar='DETECTION_TYPES',
default=','.join(DETECTION_TYPES),
"--detection-types",
envvar="DETECTION_TYPES",
default=",".join(DETECTION_TYPES),
show_default=True,
help="A comma separated list of which types of detections to backup. "
f"Valid options are: {', '.join([f'`{t}`' for t in DETECTION_TYPES])}",
callback=_parse_detection_types,
)
@click.option(
'--ignore-camera',
'ignore_cameras',
"--ignore-camera",
"ignore_cameras",
multiple=True,
envvar="IGNORE_CAMERAS",
help="IDs of cameras for which events should not be backed up. Use multiple times to ignore "
"multiple IDs. If being set as an environment variable the IDs should be separated by whitespace.",
"multiple IDs. If being set as an environment variable the IDs should be separated by whitespace. "
"Alternatively, use a Unifi user with a role which has access restricted to the subset of cameras "
"that you wish to backup.",
)
@click.option(
'--file-structure-format',
envvar='FILE_STRUCTURE_FORMAT',
"--camera",
"cameras",
multiple=True,
envvar="CAMERAS",
help="IDs of *ONLY* cameras for which events should be backed up. Use multiple times to include "
"multiple IDs. If being set as an environment variable the IDs should be separated by whitespace. "
"Alternatively, use a Unifi user with a role which has access restricted to the subset of cameras "
"that you wish to backup.",
)
@click.option(
"--file-structure-format",
envvar="FILE_STRUCTURE_FORMAT",
default="{camera_name}/{event.start:%Y-%m-%d}/{event.end:%Y-%m-%dT%H-%M-%S} {detection_type}.mp4",
show_default=True,
help="A Python format string used to generate the file structure/name on the rclone remote."
"For details of the fields available, see the projects `README.md` file.",
)
@click.option(
'-v',
'--verbose',
"-v",
"--verbose",
count=True,
help="How verbose the logging output should be."
"""
@@ -112,37 +151,38 @@ all warnings, and websocket data
""",
)
@click.option(
'--sqlite_path',
default='events.sqlite',
envvar='SQLITE_PATH',
"--sqlite_path",
default="events.sqlite",
envvar="SQLITE_PATH",
help="Path to the SQLite database to use/create",
)
@click.option(
'--color-logging/--plain-logging',
"--color-logging/--plain-logging",
default=False,
show_default=True,
envvar='COLOR_LOGGING',
envvar="COLOR_LOGGING",
help="Set if you want to use color in logging output",
)
@click.option(
'--download-buffer-size',
default='512MiB',
"--download-buffer-size",
default="512MiB",
show_default=True,
envvar='DOWNLOAD_BUFFER_SIZE',
envvar="DOWNLOAD_BUFFER_SIZE",
help='How big the download buffer should be (you can use suffixes like "B", "KiB", "MiB", "GiB")',
callback=lambda ctx, param, value: human_readable_to_float(value),
)
@click.option(
'--purge_interval',
default='1d',
"--purge_interval",
default="1d",
show_default=True,
envvar='PURGE_INTERVAL',
envvar="PURGE_INTERVAL",
help="How frequently to check for file to purge.\n\nNOTE: Can create a lot of API calls, so be careful if "
"your cloud provider charges you per api call",
callback=parse_rclone_retention,
)
@click.option(
'--apprise-notifier',
'apprise_notifiers',
"--apprise-notifier",
"apprise_notifiers",
multiple=True,
envvar="APPRISE_NOTIFIERS",
help="""\b
@@ -160,20 +200,74 @@ If no tags are specified, it defaults to ERROR
More details about supported platforms can be found here: https://github.com/caronc/apprise""",
)
@click.option(
'--skip-missing',
"--skip-missing",
default=False,
show_default=True,
is_flag=True,
envvar='SKIP_MISSING',
envvar="SKIP_MISSING",
help="""\b
If set, events which are 'missing' at the start will be ignored.
Subsequent missing events will be downloaded (e.g. a missed event)
""",
)
@click.option(
"--download-rate-limit",
default=None,
show_default=True,
envvar="DOWNLOAD_RATELIMIT",
type=float,
help="Limit how events can be downloaded in one minute. Disabled by default",
)
@click.option(
"--max-event-length",
default=2 * 60 * 60,
show_default=True,
envvar="MAX_EVENT_LENGTH",
type=int,
help="Only download events shorter than this maximum length, in seconds",
)
@click.option(
"--experimental-downloader",
"use_experimental_downloader",
default=False,
show_default=True,
is_flag=True,
envvar="EXPERIMENTAL_DOWNLOADER",
help="""\b
If set, a new experimental download mechanism will be used to match
what the web UI does. This might be more stable if you are experiencing
a lot of failed downloads with the default downloader.
""",
)
@click.option(
"--parallel-uploads",
default=1,
show_default=True,
envvar="PARALLEL_UPLOADS",
type=int,
help="Max number of parallel uploads to allow",
)
def main(**kwargs):
"""A Python based tool for backing up Unifi Protect event clips as they occur."""
event_listener = UnifiProtectBackup(**kwargs)
run(event_listener.start(), stop_on_unhandled_errors=True)
try:
# Validate only one of the camera select arguments was given
if kwargs.get("cameras") and kwargs.get("ignore_cameras"):
click.echo(
"Error: --camera and --ignore-camera options are mutually exclusive. "
"Please use only one of these options.",
err=True,
)
raise SystemExit(200) # throw 200 = arg error, service will not be restarted (docker)
# Only create the event listener and run if validation passes
event_listener = UnifiProtectBackup(**kwargs)
run(event_listener.start(), stop_on_unhandled_errors=True)
except SystemExit as e:
sys.exit(e.code)
except Exception as e:
click.echo(f"Error: {str(e)}", err=True)
sys.exit(1)
if __name__ == "__main__":

unifi_protect_backup/downloader.py

@@ -10,10 +10,11 @@ from typing import Optional
import aiosqlite
import pytz
from aiohttp.client_exceptions import ClientPayloadError
from aiolimiter import AsyncLimiter
from expiring_dict import ExpiringDict # type: ignore
from pyunifiprotect import ProtectApiClient
from pyunifiprotect.data.nvr import Event
from pyunifiprotect.data.types import EventType
from uiprotect import ProtectApiClient
from uiprotect.data.nvr import Event
from uiprotect.data.types import EventType
from unifi_protect_backup.utils import (
SubprocessException,
@@ -28,14 +29,14 @@ from unifi_protect_backup.utils import (
async def get_video_length(video: bytes) -> float:
"""Uses ffprobe to get the length of the video file passed in as a byte stream."""
returncode, stdout, stderr = await run_command(
'ffprobe -v quiet -show_streams -select_streams v:0 -of json -', video
"ffprobe -v quiet -show_streams -select_streams v:0 -of json -", video
)
if returncode != 0:
raise SubprocessException(stdout, stderr, returncode)
json_data = json.loads(stdout)
return float(json_data['streams'][0]['duration'])
return float(json_data["streams"][0]["duration"])
class VideoDownloader:
@@ -48,6 +49,8 @@ class VideoDownloader:
download_queue: asyncio.Queue,
upload_queue: VideoQueue,
color_logging: bool,
download_rate_limit: float,
max_event_length: timedelta,
):
"""Init.
@@ -57,6 +60,8 @@ class VideoDownloader:
download_queue (asyncio.Queue): Queue to get event details from
upload_queue (VideoQueue): Queue to place downloaded videos on
color_logging (bool): Whether or not to add color to logging output
download_rate_limit (float): Limit how many events can be downloaded in one minute
max_event_length (timedelta): Maximum length in seconds for an event to be considered valid and downloaded
"""
self._protect: ProtectApiClient = protect
self._db: aiosqlite.Connection = db
@@ -64,13 +69,16 @@ class VideoDownloader:
self.upload_queue: VideoQueue = upload_queue
self.current_event = None
self._failures = ExpiringDict(60 * 60 * 12) # Time to live = 12h
self._download_rate_limit = download_rate_limit
self._max_event_length = max_event_length
self._limiter = AsyncLimiter(self._download_rate_limit) if self._download_rate_limit is not None else None
self.base_logger = logging.getLogger(__name__)
setup_event_logger(self.base_logger, color_logging)
self.logger = logging.LoggerAdapter(self.base_logger, {'event': ''})
self.logger = logging.LoggerAdapter(self.base_logger, {"event": ""})
# Check if `ffprobe` is available
ffprobe = shutil.which('ffprobe')
ffprobe = shutil.which("ffprobe")
if ffprobe is not None:
self.logger.debug(f"ffprobe found: {ffprobe}")
self._has_ffprobe = True
@@ -81,15 +89,20 @@ class VideoDownloader:
"""Main loop."""
self.logger.info("Starting Downloader")
while True:
if self._limiter:
self.logger.debug("Waiting for rate limit")
await self._limiter.acquire()
try:
# Wait for unifi protect to be connected
await self._protect.connect_event.wait()
event = await self.download_queue.get()
self.current_event = event
self.logger = logging.LoggerAdapter(self.base_logger, {'event': f' [{event.id}]'})
# Fix timezones since pyunifiprotect sets all timestamps to UTC. Instead localize them to
self.current_event = event
self.logger = logging.LoggerAdapter(self.base_logger, {"event": f" [{event.id}]"})
# Fix timezones since uiprotect sets all timestamps to UTC. Instead localize them to
# the timezone of the unifi protect NVR.
event.start = event.start.replace(tzinfo=pytz.utc).astimezone(self._protect.bootstrap.nvr.timezone)
event.end = event.end.replace(tzinfo=pytz.utc).astimezone(self._protect.bootstrap.nvr.timezone)
@@ -101,14 +114,19 @@ class VideoDownloader:
self.logger.debug(f"Video Download Buffer: {output_queue_current_size}/{output_queue_max_size}")
self.logger.debug(f" Camera: {await get_camera_name(self._protect, event.camera_id)}")
if event.type == EventType.SMART_DETECT:
self.logger.debug(f" Type: {event.type} ({', '.join(event.smart_detect_types)})")
self.logger.debug(f" Type: {event.type.value} ({', '.join(event.smart_detect_types)})")
else:
self.logger.debug(f" Type: {event.type}")
self.logger.debug(f" Type: {event.type.value}")
self.logger.debug(f" Start: {event.start.strftime('%Y-%m-%dT%H-%M-%S')} ({event.start.timestamp()})")
self.logger.debug(f" End: {event.end.strftime('%Y-%m-%dT%H-%M-%S')} ({event.end.timestamp()})")
duration = (event.end - event.start).total_seconds()
self.logger.debug(f" Duration: {duration}s")
# Skip invalid events
if not self._valid_event(event):
await self._ignore_event(event)
continue
# Unifi protect does not return full video clips if the clip is requested too soon.
# There are two issues at play here:
# - Protect will only cut a clip on a keyframe, which happens every 5s
@@ -137,15 +155,7 @@ class VideoDownloader:
self.logger.error(
"Event has failed to download 10 times in a row. Permanently ignoring this event"
)
# ignore event
await self._db.execute(
"INSERT INTO events VALUES "
f"('{event.id}', '{event.type}', '{event.camera_id}',"
f"'{event.start.timestamp()}', '{event.end.timestamp()}')"
)
await self._db.commit()
await self._ignore_event(event)
continue
# Remove successfully downloaded event from failures list
@@ -184,6 +194,15 @@ class VideoDownloader:
self.logger.debug(f" Downloaded video size: {human_readable_size(len(video))}s")
return video
async def _ignore_event(self, event):
self.logger.warning("Ignoring event")
await self._db.execute(
"INSERT INTO events VALUES "
f"('{event.id}', '{event.type.value}', '{event.camera_id}',"
f"'{event.start.timestamp()}', '{event.end.timestamp()}')"
)
await self._db.commit()
async def _check_video_length(self, video, duration):
"""Check if the downloaded event is at least the length of the event, warn otherwise.
@@ -198,3 +217,11 @@ class VideoDownloader:
self.logger.debug(msg)
except SubprocessException as e:
self.logger.warning(" `ffprobe` failed", exc_info=e)
def _valid_event(self, event):
duration = event.end - event.start
if duration > self._max_event_length:
self.logger.warning(f"Event longer ({duration}) than max allowed length {self._max_event_length}")
return False
return True

View File

@@ -0,0 +1,238 @@
# noqa: D100
import asyncio
import json
import logging
import shutil
from datetime import datetime, timedelta, timezone
from typing import Optional
import aiosqlite
import pytz
from aiohttp.client_exceptions import ClientPayloadError
from aiolimiter import AsyncLimiter
from expiring_dict import ExpiringDict # type: ignore
from uiprotect import ProtectApiClient
from uiprotect.data.nvr import Event
from uiprotect.data.types import EventType
from unifi_protect_backup.utils import (
SubprocessException,
VideoQueue,
get_camera_name,
human_readable_size,
run_command,
setup_event_logger,
)
async def get_video_length(video: bytes) -> float:
"""Uses ffprobe to get the length of the video file passed in as a byte stream."""
returncode, stdout, stderr = await run_command(
"ffprobe -v quiet -show_streams -select_streams v:0 -of json -", video
)
if returncode != 0:
raise SubprocessException(stdout, stderr, returncode)
json_data = json.loads(stdout)
return float(json_data["streams"][0]["duration"])
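For reference, a short usage sketch of this helper; the clip file name and the JSON shape sketched in the comments are illustrative assumptions, not taken from a real ffprobe run.

# Hypothetical usage sketch for get_video_length. ffprobe's JSON output looks
# roughly like {"streams": [{"codec_type": "video", "duration": "12.480000", ...}]}
# and only streams[0].duration is read.
async def example() -> None:
    with open("clip.mp4", "rb") as f:  # hypothetical local clip
        video = f.read()
    print(f"clip is {await get_video_length(video):.3f}s long")

asyncio.run(example())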
class VideoDownloaderExperimental:
"""Downloads event video clips from Unifi Protect."""
def __init__(
self,
protect: ProtectApiClient,
db: aiosqlite.Connection,
download_queue: asyncio.Queue,
upload_queue: VideoQueue,
color_logging: bool,
download_rate_limit: float,
max_event_length: timedelta,
):
"""Init.
Args:
protect (ProtectApiClient): UniFi Protect API client to use
db (aiosqlite.Connection): Async SQLite database to check for missing events
download_queue (asyncio.Queue): Queue to get event details from
upload_queue (VideoQueue): Queue to place downloaded videos on
color_logging (bool): Whether or not to add color to logging output
download_rate_limit (float): Limit of how many events can be downloaded in one minute
max_event_length (timedelta): Maximum length in seconds for an event to be considered valid and downloaded
"""
self._protect: ProtectApiClient = protect
self._db: aiosqlite.Connection = db
self.download_queue: asyncio.Queue = download_queue
self.upload_queue: VideoQueue = upload_queue
self.current_event = None
self._failures = ExpiringDict(60 * 60 * 12) # Time to live = 12h
self._download_rate_limit = download_rate_limit
self._max_event_length = max_event_length
self._limiter = AsyncLimiter(self._download_rate_limit) if self._download_rate_limit is not None else None
self.base_logger = logging.getLogger(__name__)
setup_event_logger(self.base_logger, color_logging)
self.logger = logging.LoggerAdapter(self.base_logger, {"event": ""})
# Check if `ffprobe` is available
ffprobe = shutil.which("ffprobe")
if ffprobe is not None:
self.logger.debug(f"ffprobe found: {ffprobe}")
self._has_ffprobe = True
else:
self._has_ffprobe = False
async def start(self):
"""Main loop."""
self.logger.info("Starting Downloader")
while True:
if self._limiter:
self.logger.debug("Waiting for rate limit")
await self._limiter.acquire()
try:
# Wait for unifi protect to be connected
await self._protect.connect_event.wait()
event = await self.download_queue.get()
self.current_event = event
self.logger = logging.LoggerAdapter(self.base_logger, {"event": f" [{event.id}]"})
# Fix timezones since uiprotect sets all timestamps to UTC. Instead localize them to
# the timezone of the unifi protect NVR.
event.start = event.start.replace(tzinfo=pytz.utc).astimezone(self._protect.bootstrap.nvr.timezone)
event.end = event.end.replace(tzinfo=pytz.utc).astimezone(self._protect.bootstrap.nvr.timezone)
self.logger.info(f"Downloading event: {event.id}")
self.logger.debug(f"Remaining Download Queue: {self.download_queue.qsize()}")
output_queue_current_size = human_readable_size(self.upload_queue.qsize())
output_queue_max_size = human_readable_size(self.upload_queue.maxsize)
self.logger.debug(f"Video Download Buffer: {output_queue_current_size}/{output_queue_max_size}")
self.logger.debug(f" Camera: {await get_camera_name(self._protect, event.camera_id)}")
if event.type == EventType.SMART_DETECT:
self.logger.debug(f" Type: {event.type.value} ({', '.join(event.smart_detect_types)})")
else:
self.logger.debug(f" Type: {event.type.value}")
self.logger.debug(f" Start: {event.start.strftime('%Y-%m-%dT%H-%M-%S')} ({event.start.timestamp()})")
self.logger.debug(f" End: {event.end.strftime('%Y-%m-%dT%H-%M-%S')} ({event.end.timestamp()})")
duration = (event.end - event.start).total_seconds()
self.logger.debug(f" Duration: {duration}s")
# Skip invalid events
if not self._valid_event(event):
await self._ignore_event(event)
continue
# Unifi protect does not return full video clips if the clip is requested too soon.
# There are two issues at play here:
# - Protect will only cut a clip on a keyframe, which happens every 5s
# - Protect's pipeline needs a finite amount of time to make a clip available
# So we will wait 1.5x the keyframe interval to ensure that there is always ample video
# stored and Protect can return a full clip (which should be at least the length requested,
# but often longer)
time_since_event_ended = datetime.utcnow().replace(tzinfo=timezone.utc) - event.end
sleep_time = (timedelta(seconds=5 * 1.5) - time_since_event_ended).total_seconds()
if sleep_time > 0:
self.logger.debug(f" Sleeping ({sleep_time}s) to ensure clip is ready to download...")
await asyncio.sleep(sleep_time)
try:
video = await self._download(event)
assert video is not None
except Exception as e:
# Increment failure count
if event.id not in self._failures:
self._failures[event.id] = 1
else:
self._failures[event.id] += 1
self.logger.warning(
f"Event failed download attempt {self._failures[event.id]}",
exc_info=e,
)
if self._failures[event.id] >= 10:
self.logger.error(
"Event has failed to download 10 times in a row. Permanently ignoring this event"
)
await self._ignore_event(event)
continue
# Remove successfully downloaded event from failures list
if event.id in self._failures:
del self._failures[event.id]
# Get the actual length of the downloaded video using ffprobe
if self._has_ffprobe:
await self._check_video_length(video, duration)
await self.upload_queue.put((event, video))
self.logger.debug("Added to upload queue")
self.current_event = None
except Exception as e:
self.logger.error(
f"Unexpected exception occurred, abandoning event {event.id}:",
exc_info=e,
)
async def _download(self, event: Event) -> Optional[bytes]:
"""Downloads the video clip for the given event."""
self.logger.debug(" Downloading video...")
for x in range(5):
assert isinstance(event.camera_id, str)
assert isinstance(event.start, datetime)
assert isinstance(event.end, datetime)
try:
prepared_video_file = await self._protect.prepare_camera_video( # type: ignore
event.camera_id, event.start, event.end
)
video = await self._protect.download_camera_video( # type: ignore
event.camera_id, prepared_video_file["fileName"]
)
assert isinstance(video, bytes)
break
except (AssertionError, ClientPayloadError, TimeoutError) as e:
self.logger.warning(f" Failed download attempt {x+1}, retying in 1s", exc_info=e)
await asyncio.sleep(1)
else:
self.logger.error(f"Download failed after 5 attempts, abandoning event {event.id}:")
return None
self.logger.debug(f" Downloaded video size: {human_readable_size(len(video))}s")
return video
async def _ignore_event(self, event):
self.logger.warning("Ignoring event")
await self._db.execute(
"INSERT INTO events VALUES "
f"('{event.id}', '{event.type.value}', '{event.camera_id}',"
f"'{event.start.timestamp()}', '{event.end.timestamp()}')"
)
await self._db.commit()
async def _check_video_length(self, video, duration):
"""Check if the downloaded event is at least the length of the event, warn otherwise.
It is expected that downloaded clips will regularly be slightly longer than the requested event duration
"""
try:
downloaded_duration = await get_video_length(video)
msg = f" Downloaded video length: {downloaded_duration:.3f}s" f"({downloaded_duration - duration:+.3f}s)"
if downloaded_duration < duration:
self.logger.warning(msg)
else:
self.logger.debug(msg)
except SubprocessException as e:
self.logger.warning(" `ffprobe` failed", exc_info=e)
def _valid_event(self, event):
duration = event.end - event.start
if duration > self._max_event_length:
self.logger.warning(f"Event longer ({duration}) than max allowed length {self._max_event_length}")
return False
return True
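A note on the failure counter used above: it leans on expiring_dict's time-to-live semantics, so entries vanish after 12 hours and stale failures stop counting toward the 10-strike limit. A minimal sketch of that assumed behaviour (a dict-like API with a TTL in seconds):

import time

from expiring_dict import ExpiringDict

failures = ExpiringDict(3)  # short TTL for illustration; the downloader uses 12h
failures["event-id"] = 1
print("event-id" in failures)  # True while the entry is fresh
time.sleep(4)
print("event-id" in failures)  # False: the entry expired, so the failure count resets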

View File

@@ -5,10 +5,11 @@ import logging
from time import sleep
from typing import List
from pyunifiprotect.api import ProtectApiClient
from pyunifiprotect.data.nvr import Event
from pyunifiprotect.data.types import EventType
from pyunifiprotect.data.websocket import WSAction, WSSubscriptionMessage
from uiprotect.api import ProtectApiClient
from uiprotect.websocket import WebsocketState
from uiprotect.data.nvr import Event
from uiprotect.data.types import EventType
from uiprotect.data.websocket import WSAction, WSSubscriptionMessage
logger = logging.getLogger(__name__)
@@ -22,6 +23,7 @@ class EventListener:
protect: ProtectApiClient,
detection_types: List[str],
ignore_cameras: List[str],
cameras: List[str],
):
"""Init.
@@ -30,22 +32,22 @@ class EventListener:
protect (ProtectApiClient): UniFi Protect API client to use
detection_types (List[str]): Desired Event detection types to look for
ignore_cameras (List[str]): Cameras IDs to ignore events from
cameras (List[str]): Cameras IDs to ONLY include events from
"""
self._event_queue: asyncio.Queue = event_queue
self._protect: ProtectApiClient = protect
self._unsub = None
self._unsub_websocket_state = None
self.detection_types: List[str] = detection_types
self.ignore_cameras: List[str] = ignore_cameras
self.cameras: List[str] = cameras
async def start(self):
"""Main Loop."""
logger.debug("Subscribed to websocket")
self._unsub_websocket_state = self._protect.subscribe_websocket_state(self._websocket_state_callback)
self._unsub = self._protect.subscribe_websocket(self._websocket_callback)
while True:
await asyncio.sleep(60)
await self._check_websocket_and_reconnect()
def _websocket_callback(self, msg: WSSubscriptionMessage) -> None:
"""Callback for "EVENT" websocket messages.
@@ -61,9 +63,16 @@ class EventListener:
return
if msg.new_obj.camera_id in self.ignore_cameras:
return
if msg.new_obj.end is None:
if self.cameras and msg.new_obj.camera_id not in self.cameras:
return
if msg.new_obj.type not in [EventType.MOTION, EventType.SMART_DETECT, EventType.RING]:
if "end" not in msg.changed_data:
return
if msg.new_obj.type not in [
EventType.MOTION,
EventType.SMART_DETECT,
EventType.RING,
EventType.SMART_DETECT_LINE,
]:
return
if msg.new_obj.type is EventType.MOTION and "motion" not in self.detection_types:
logger.extra_debug(f"Skipping unwanted motion detection event: {msg.new_obj.id}") # type: ignore
@@ -71,6 +80,9 @@ class EventListener:
if msg.new_obj.type is EventType.RING and "ring" not in self.detection_types:
logger.extra_debug(f"Skipping unwanted ring event: {msg.new_obj.id}") # type: ignore
return
if msg.new_obj.type is EventType.SMART_DETECT_LINE and "line" not in self.detection_types:
logger.extra_debug(f"Skipping unwanted line event: {msg.new_obj.id}") # type: ignore
return
elif msg.new_obj.type is EventType.SMART_DETECT:
for event_smart_detection_type in msg.new_obj.smart_detect_types:
if event_smart_detection_type not in self.detection_types:
@@ -89,42 +101,20 @@ class EventListener:
# Unifi protect has started sending the event id in the websocket as a {event_id}-{camera_id} but when the
# API is queried they only have {event_id}. Keeping track of both of these would be complicated so
# instead we fudge the ID here to match what the API returns
if '-' in msg.new_obj.id:
msg.new_obj.id = msg.new_obj.id.split('-')[0]
if "-" in msg.new_obj.id:
msg.new_obj.id = msg.new_obj.id.split("-")[0]
logger.debug(f"Adding event {msg.new_obj.id} to queue (Current download queue={self._event_queue.qsize()})")
async def _check_websocket_and_reconnect(self):
"""Checks for websocket disconnect and triggers a reconnect."""
logger.extra_debug("Checking the status of the websocket...")
if self._protect.check_ws():
logger.extra_debug("Websocket is connected.")
else:
self._protect.connect_event.clear()
logger.warning("Lost connection to Unifi Protect.")
def _websocket_state_callback(self, state: WebsocketState) -> None:
"""Callback for websocket state messages.
# Unsubscribe, close the session.
self._unsub()
await self._protect.close_session()
Flags the websocket for reconnection
while True:
logger.warning("Attempting reconnect...")
try:
# Start the pyunifiprotect connection by calling `update`
await self._protect.close_session()
self._protect._bootstrap = None
await self._protect.update(force=True)
if self._protect.check_ws():
self._unsub = self._protect.subscribe_websocket(self._websocket_callback)
break
else:
logger.error("Unable to establish connection to Unifi Protect")
except Exception as e:
logger.error("Unexpected exception occurred while trying to reconnect:", exc_info=e)
# Back off for a little while
await asyncio.sleep(10)
self._protect.connect_event.set()
logger.info("Re-established connection to Unifi Protect and to the websocket.")
Args:
state (WebsocketState): new state of the websocket
"""
if state == WebsocketState.DISCONNECTED:
logger.error("Unifi Protect Websocket lost connection. Reconnecting...")
elif state == WebsocketState.CONNECTED:
logger.info("Unifi Protect Websocket connection restored")

View File

@@ -3,13 +3,13 @@
import asyncio
import logging
from datetime import datetime
from typing import List
from typing import AsyncIterator, List
import aiosqlite
from dateutil.relativedelta import relativedelta
from pyunifiprotect import ProtectApiClient
from pyunifiprotect.data.nvr import Event
from pyunifiprotect.data.types import EventType
from uiprotect import ProtectApiClient
from uiprotect.data.nvr import Event
from uiprotect.data.types import EventType
from unifi_protect_backup import VideoDownloader, VideoUploader
@@ -25,10 +25,11 @@ class MissingEventChecker:
db: aiosqlite.Connection,
download_queue: asyncio.Queue,
downloader: VideoDownloader,
uploader: VideoUploader,
uploaders: List[VideoUploader],
retention: relativedelta,
detection_types: List[str],
ignore_cameras: List[str],
cameras: List[str],
interval: int = 60 * 5,
) -> None:
"""Init.
@@ -38,82 +39,118 @@ class MissingEventChecker:
db (aiosqlite.Connection): Async SQLite database to check for missing events
download_queue (asyncio.Queue): Download queue to check for on-going downloads
downloader (VideoDownloader): Downloader to check for on-going downloads
uploader (VideoUploader): Uploader to check for on-going uploads
uploaders (List[VideoUploader]): Uploaders to check for on-going uploads
retention (relativedelta): Retention period to limit search window
detection_types (List[str]): Detection types wanted to limit search
ignore_cameras (List[str]): Ignored camera IDs to limit search
cameras (List[str]): Included (ONLY) camera IDs to limit search
interval (int): How frequently, in seconds, to check for missing events
"""
self._protect: ProtectApiClient = protect
self._db: aiosqlite.Connection = db
self._download_queue: asyncio.Queue = download_queue
self._downloader: VideoDownloader = downloader
self._uploader: VideoUploader = uploader
self._uploaders: List[VideoUploader] = uploaders
self.retention: relativedelta = retention
self.detection_types: List[str] = detection_types
self.ignore_cameras: List[str] = ignore_cameras
self.cameras: List[str] = cameras
self.interval: int = interval
async def _get_missing_events(self) -> List[Event]:
# Get list of events that need to be backed up from unifi protect
unifi_events = await self._protect.get_events(
start=datetime.now() - self.retention,
end=datetime.now(),
types=[EventType.MOTION, EventType.SMART_DETECT, EventType.RING],
)
unifi_events = {event.id: event for event in unifi_events}
async def _get_missing_events(self) -> AsyncIterator[Event]:
start_time = datetime.now() - self.retention
end_time = datetime.now()
chunk_size = 500
# Get list of events that have been backed up from the database
while True:
# Get list of events that need to be backed up from unifi protect
logger.extra_debug(f"Fetching events for interval: {start_time} - {end_time}") # type: ignore
events_chunk = await self._protect.get_events(
start=start_time,
end=end_time,
types=[
EventType.MOTION,
EventType.SMART_DETECT,
EventType.RING,
EventType.SMART_DETECT_LINE,
],
limit=chunk_size,
)
# events(id, type, camera_id, start, end)
async with self._db.execute("SELECT * FROM events") as cursor:
rows = await cursor.fetchall()
db_event_ids = {row[0] for row in rows}
if not events_chunk:
break # There were no events to backup
# Prevent re-adding events currently in the download/upload queue
downloading_event_ids = {event.id for event in self._downloader.download_queue._queue} # type: ignore
current_download = self._downloader.current_event
if current_download is not None:
downloading_event_ids.add(current_download.id)
# Filter out on-going events
unifi_events = {event.id: event for event in events_chunk if event.end is not None}
uploading_event_ids = {event.id for event, video in self._uploader.upload_queue._queue} # type: ignore
current_upload = self._uploader.current_event
if current_upload is not None:
uploading_event_ids.add(current_upload.id)
if not unifi_events:
break # No completed events to process
missing_event_ids = set(unifi_events.keys()) - (db_event_ids | downloading_event_ids | uploading_event_ids)
# Next chunks start time should be the end of the oldest complete event in the current chunk
start_time = max([event.end for event in unifi_events.values() if event.end is not None])
def wanted_event_type(event_id):
event = unifi_events[event_id]
if event.start is None or event.end is None:
return False # This event is still on-going
if event.camera_id in self.ignore_cameras:
return False
if event.type is EventType.MOTION and "motion" not in self.detection_types:
return False
if event.type is EventType.RING and "ring" not in self.detection_types:
return False
elif event.type is EventType.SMART_DETECT:
for event_smart_detection_type in event.smart_detect_types:
if event_smart_detection_type not in self.detection_types:
return False
return True
# Get list of events that have been backed up from the database
wanted_event_ids = set(filter(wanted_event_type, missing_event_ids))
# events(id, type, camera_id, start, end)
async with self._db.execute("SELECT * FROM events") as cursor:
rows = await cursor.fetchall()
db_event_ids = {row[0] for row in rows}
return [unifi_events[id] for id in wanted_event_ids]
# Prevent re-adding events currently in the download/upload queue
downloading_event_ids = {event.id for event in self._downloader.download_queue._queue} # type: ignore
current_download = self._downloader.current_event
if current_download is not None:
downloading_event_ids.add(current_download.id)
uploading_event_ids = {event.id for event, video in self._downloader.upload_queue._queue} # type: ignore
for uploader in self._uploaders:
current_upload = uploader.current_event
if current_upload is not None:
uploading_event_ids.add(current_upload.id)
missing_event_ids = set(unifi_events.keys()) - (db_event_ids | downloading_event_ids | uploading_event_ids)
# Exclude events of unwanted types
def wanted_event_type(event_id):
event = unifi_events[event_id]
if event.start is None or event.end is None:
return False # This event is still on-going
if event.camera_id in self.ignore_cameras:
return False
if self.cameras and event.camera_id not in self.cameras:
return False
if event.type is EventType.MOTION and "motion" not in self.detection_types:
return False
if event.type is EventType.RING and "ring" not in self.detection_types:
return False
if event.type is EventType.SMART_DETECT_LINE and "line" not in self.detection_types:
return False
elif event.type is EventType.SMART_DETECT:
for event_smart_detection_type in event.smart_detect_types:
if event_smart_detection_type not in self.detection_types:
return False
return True
wanted_event_ids = set(filter(wanted_event_type, missing_event_ids))
# Yield events one by one to allow the async loop to start other tasks while
# waiting on the full list of events
for id in wanted_event_ids:
yield unifi_events[id]
# Last chunk was incomplete, we can stop now
if len(events_chunk) < chunk_size:
break
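In outline, the chunked fetch above reduces to the following pattern; this is a condensed sketch in which fetch_events is a hypothetical stand-in for self._protect.get_events plus the queue and database filtering:

async def paginate(fetch_events, start_time, end_time, chunk_size=500):
    # Condensed sketch of the pagination above; fetch_events is hypothetical.
    while True:
        chunk = await fetch_events(start_time, end_time, limit=chunk_size)
        if not chunk:
            break  # nothing to back up
        complete = [e for e in chunk if e.end is not None]
        if not complete:
            break  # only on-going events remain
        # Resume the next query from the newest completed event's end time.
        start_time = max(e.end for e in complete)
        for event in complete:
            yield event
        if len(chunk) < chunk_size:
            break  # final, partial chunk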
async def ignore_missing(self):
"""Ignore missing events by adding them to the event table."""
wanted_events = await self._get_missing_events()
logger.info(" Ignoring missing events")
logger.info(f" Ignoring {len(wanted_events)} missing events")
for event in wanted_events:
async for event in self._get_missing_events():
logger.extra_debug(f"Ignoring event '{event.id}'")
await self._db.execute(
"INSERT INTO events VALUES "
f"('{event.id}', '{event.type}', '{event.camera_id}',"
f"('{event.id}', '{event.type.value}', '{event.camera_id}',"
f"'{event.start.timestamp()}', '{event.end.timestamp()}')"
)
await self._db.commit()
@@ -123,28 +160,24 @@ class MissingEventChecker:
logger.info("Starting Missing Event Checker")
while True:
try:
shown_warning = False
# Wait for unifi protect to be connected
await self._protect.connect_event.wait()
logger.extra_debug("Running check for missing events...")
logger.debug("Running check for missing events...")
wanted_events = await self._get_missing_events()
async for event in self._get_missing_events():
if not shown_warning:
logger.warning(" Found missing events, adding to backup queue")
shown_warning = True
logger.debug(f" Undownloaded events of wanted types: {len(wanted_events)}")
if len(wanted_events) > 20:
logger.warning(f" Adding {len(wanted_events)} missing events to backup queue")
missing_logger = logger.extra_debug
else:
missing_logger = logger.warning
for event in wanted_events:
if event.type != EventType.SMART_DETECT:
event_name = f"{event.id} ({event.type})"
event_name = f"{event.id} ({event.type.value})"
else:
event_name = f"{event.id} ({', '.join(event.smart_detect_types)})"
missing_logger(
logger.extra_debug(
f" Adding missing event to backup queue: {event_name}"
f" ({event.start.strftime('%Y-%m-%dT%H-%M-%S')} -"
f" {event.end.strftime('%Y-%m-%dT%H-%M-%S')})"
@@ -152,6 +185,9 @@ class MissingEventChecker:
await self._download_queue.put(event)
except Exception as e:
logger.error("Unexpected exception occurred during missing event check:", exc_info=e)
logger.error(
"Unexpected exception occurred during missing event check:",
exc_info=e,
)
await asyncio.sleep(self.interval)

View File

@@ -8,11 +8,11 @@ notifier = apprise.Apprise()
def add_notification_service(url):
"""Add apprise URI with support for tags e.g. TAG1,TAG2=PROTOCOL://settings."""
config = apprise.AppriseConfig()
config.add_config(url, format='text')
config.add_config(url, format="text")
# If no tags are specified, default to errors, otherwise ALL logging will
# be spammed to the notification service
if not config.servers()[0].tags:
config.servers()[0].tags = {'ERROR'}
config.servers()[0].tags = {"ERROR"}
notifier.add(config)
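For example (service URLs hypothetical), an untagged URL falls back to the ERROR tag, while an explicit tag list routes matching log records to that service:

add_notification_service("mailto://user:password@gmail.com")  # defaults to the ERROR tag
add_notification_service("ERROR,WARNING=discord://webhook_id/webhook_token")  # explicit tags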

View File

@@ -64,7 +64,6 @@ class Purge:
f"SELECT * FROM events WHERE end < {retention_oldest_time}"
) as event_cursor:
async for event_id, event_type, camera_id, event_start, event_end in event_cursor:
logger.info(f"Purging event: {event_id}.")
# For every backup for this event
@@ -86,5 +85,5 @@ class Purge:
logger.error("Unexpected exception occurred during purge:", exc_info=e)
next_purge_time = datetime.now() + self.interval
logger.extra_debug(f'sleeping until {next_purge_time}')
logger.extra_debug(f"sleeping until {next_purge_time}")
await wait_until(next_purge_time)

View File

@@ -0,0 +1,135 @@
import enum
from datetime import datetime
from pathlib import Path
from typing import Any, Optional
import aiofiles
from uiprotect.data import Version
from uiprotect.exceptions import BadRequest
from uiprotect.utils import to_js_time
# First, let's add the new VideoExportType enum
class VideoExportType(str, enum.Enum):
TIMELAPSE = "timelapse"
ROTATING = "rotating"
def monkey_patch_experimental_downloader():
from uiprotect.api import ProtectApiClient
# Add the version constant
ProtectApiClient.NEW_DOWNLOAD_VERSION = Version("4.0.0")  # Version is imported from uiprotect.data above
async def _validate_channel_id(self, camera_id: str, channel_index: int) -> None:
if self._bootstrap is None:
await self.update()
try:
camera = self._bootstrap.cameras[camera_id]
camera.channels[channel_index]
except (IndexError, AttributeError, KeyError) as e:
raise BadRequest(f"Invalid input: {e}") from e
async def prepare_camera_video(
self,
camera_id: str,
start: datetime,
end: datetime,
channel_index: int = 0,
validate_channel_id: bool = True,
fps: Optional[int] = None,
filename: Optional[str] = None,
) -> Optional[dict[str, Any]]:
if self.bootstrap.nvr.version < self.NEW_DOWNLOAD_VERSION:
raise ValueError("This method is only support from Unifi Protect version >= 4.0.0.")
if validate_channel_id:
await self._validate_channel_id(camera_id, channel_index)
params = {
"camera": camera_id,
"start": to_js_time(start),
"end": to_js_time(end),
}
if channel_index == 3:
params.update({"lens": 2})
else:
params.update({"channel": channel_index})
if fps is not None and fps > 0:
params["fps"] = fps
params["type"] = VideoExportType.TIMELAPSE.value
else:
params["type"] = VideoExportType.ROTATING.value
if not filename:
start_str = start.strftime("%m-%d-%Y, %H.%M.%S %Z")
end_str = end.strftime("%m-%d-%Y, %H.%M.%S %Z")
filename = f"{camera_id} {start_str} - {end_str}.mp4"
params["filename"] = filename
return await self.api_request(
"video/prepare",
params=params,
raise_exception=True,
)
async def download_camera_video(
self,
camera_id: str,
filename: str,
output_file: Optional[Path] = None,
iterator_callback: Optional[callable] = None,
progress_callback: Optional[callable] = None,
chunk_size: int = 65536,
) -> Optional[bytes]:
if self.bootstrap.nvr.version < self.NEW_DOWNLOAD_VERSION:
raise ValueError("This method is only support from Unifi Protect version >= 4.0.0.")
params = {
"camera": camera_id,
"filename": filename,
}
if iterator_callback is None and progress_callback is None and output_file is None:
return await self.api_request_raw(
"video/download",
params=params,
raise_exception=False,
)
r = await self.request(
"get",
f"{self.api_path}video/download",
auto_close=False,
timeout=0,
params=params,
)
if output_file is not None:
async with aiofiles.open(output_file, "wb") as output:
async def callback(total: int, chunk: Optional[bytes]) -> None:
if iterator_callback is not None:
await iterator_callback(total, chunk)
if chunk is not None:
await output.write(chunk)
await self._stream_response(r, chunk_size, callback, progress_callback)
else:
await self._stream_response(
r,
chunk_size,
iterator_callback,
progress_callback,
)
r.close()
return None
# Patch the methods into the class
ProtectApiClient._validate_channel_id = _validate_channel_id
ProtectApiClient.prepare_camera_video = prepare_camera_video
ProtectApiClient.download_camera_video = download_camera_video
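Once the patch is applied, the client can be driven the same way VideoDownloaderExperimental uses it; a minimal sketch, with hypothetical host, credentials, and camera ID:

import asyncio
from datetime import datetime, timedelta

from uiprotect.api import ProtectApiClient

async def example() -> None:
    monkey_patch_experimental_downloader()
    client = ProtectApiClient("192.168.1.1", 443, "username", "password", verify_ssl=False)
    await client.update()  # populates the bootstrap used for the version check
    end = datetime.now()
    start = end - timedelta(minutes=1)
    prepared = await client.prepare_camera_video("camera-id", start, end)
    video = await client.download_camera_video("camera-id", prepared["fileName"])
    print(f"downloaded {len(video)} bytes")

asyncio.run(example())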

View File

@@ -1,20 +1,23 @@
"""Main module."""
import asyncio
import logging
import os
import shutil
from datetime import datetime, timezone
from datetime import datetime, timedelta, timezone
from typing import Callable, List
import aiosqlite
from pyunifiprotect import ProtectApiClient
from pyunifiprotect.data.types import ModelType
from dateutil.relativedelta import relativedelta
from uiprotect import ProtectApiClient
from uiprotect.data.types import ModelType
from unifi_protect_backup import (
EventListener,
MissingEventChecker,
Purge,
VideoDownloader,
VideoDownloaderExperimental,
VideoUploader,
notifications,
)
@@ -22,16 +25,23 @@ from unifi_protect_backup.utils import (
SubprocessException,
VideoQueue,
human_readable_size,
parse_rclone_retention,
run_command,
setup_logging,
)
from unifi_protect_backup.uiprotect_patch import monkey_patch_experimental_downloader
logger = logging.getLogger(__name__)
# TODO: https://github.com/cjrh/aiorun#id6 (smart shield)
# We have been waiting for a long time for this PR to get merged
# https://github.com/uilibs/uiprotect/pull/249
# Since it has not progressed, we will for now patch in the functionality ourselves
monkey_patch_experimental_downloader()
async def create_database(path: str):
"""Creates sqlite database and creates the events abd backups tables."""
db = await aiosqlite.connect(path)
@@ -57,20 +67,25 @@ class UnifiProtectBackup:
password: str,
verify_ssl: bool,
rclone_destination: str,
retention: str,
retention: relativedelta,
rclone_args: str,
rclone_purge_args: str,
detection_types: List[str],
ignore_cameras: List[str],
cameras: List[str],
file_structure_format: str,
verbose: int,
download_buffer_size: int,
purge_interval: str,
purge_interval: relativedelta,
apprise_notifiers: str,
skip_missing: bool,
max_event_length: int,
sqlite_path: str = "events.sqlite",
color_logging=False,
color_logging: bool = False,
download_rate_limit: float | None = None,
port: int = 443,
use_experimental_downloader: bool = False,
parallel_uploads: int = 1,
):
"""Will configure logging settings and the Unifi Protect API (but not actually connect).
@@ -91,6 +106,7 @@ class UnifiProtectBackup:
rclone_purge_args (str): Optional extra arguments to pass to `rclone delete` directly.
detection_types (List[str]): List of which detection types to backup.
ignore_cameras (List[str]): List of camera IDs for which to not backup events.
cameras (List[str]): List of ONLY camera IDs for which to backup events.
file_structure_format (str): A Python format string for output file path.
verbose (int): How verbose to setup logging, see :func:`setup_logging` for details.
download_buffer_size (int): How many bytes big the download buffer should be
@@ -99,6 +115,10 @@ class UnifiProtectBackup:
skip_missing (bool): If initial missing events should be ignored
sqlite_path (str): Path where to find/create sqlite database
color_logging (bool): Whether to add color to logging output or not
download_rate_limit (float): Limit of how many events can be downloaded in one minute. Disabled by default
max_event_length (int): Maximum length in seconds for an event to be considered valid and downloaded
use_experimental_downloader (bool): Use the new experimental downloader (the same method as used by the webUI)
parallel_uploads (int): Max number of parallel uploads to allow
"""
self.color_logging = color_logging
setup_logging(verbose, self.color_logging)
@@ -125,6 +145,7 @@ class UnifiProtectBackup:
logger.debug(f" {rclone_args=}")
logger.debug(f" {rclone_purge_args=}")
logger.debug(f" {ignore_cameras=}")
logger.debug(f" {cameras=}")
logger.debug(f" {verbose=}")
logger.debug(f" {detection_types=}")
logger.debug(f" {file_structure_format=}")
@@ -133,9 +154,13 @@ class UnifiProtectBackup:
logger.debug(f" {purge_interval=}")
logger.debug(f" {apprise_notifiers=}")
logger.debug(f" {skip_missing=}")
logger.debug(f" {download_rate_limit=} events per minute")
logger.debug(f" {max_event_length=}s")
logger.debug(f" {use_experimental_downloader=}")
logger.debug(f" {parallel_uploads=}")
self.rclone_destination = rclone_destination
self.retention = parse_rclone_retention(retention)
self.retention = retention
self.rclone_args = rclone_args
self.rclone_purge_args = rclone_purge_args
self.file_structure_format = file_structure_format
@@ -155,6 +180,7 @@ class UnifiProtectBackup:
subscribed_models={ModelType.EVENT},
)
self.ignore_cameras = ignore_cameras
self.cameras = cameras
self._download_queue: asyncio.Queue = asyncio.Queue()
self._unsub: Callable[[], None]
self.detection_types = detection_types
@@ -162,8 +188,12 @@ class UnifiProtectBackup:
self._sqlite_path = sqlite_path
self._db = None
self._download_buffer_size = download_buffer_size
self._purge_interval = parse_rclone_retention(purge_interval)
self._purge_interval = purge_interval
self._skip_missing = skip_missing
self._download_rate_limit = download_rate_limit
self._max_event_length = timedelta(seconds=max_event_length)
self._use_experimental_downloader = use_experimental_downloader
self._parallel_uploads = parallel_uploads
async def start(self):
"""Bootstrap the backup process and kick off the main loop.
@@ -180,18 +210,25 @@ class UnifiProtectBackup:
logger.info("Checking rclone configuration...")
await self._check_rclone()
# Start the pyunifiprotect connection by calling `update`
# Start the uiprotect connection by calling `update`
logger.info("Connecting to Unifi Protect...")
for attempts in range(1):
delay = 5 # Start with a 5 second delay
max_delay = 3600 # 1 hour in seconds
for attempts in range(20):
try:
await self._protect.update()
break
except Exception as e:
logger.warning(f"Failed to connect to UniFi Protect, retrying in {attempts}s...", exc_info=e)
await asyncio.sleep(attempts)
logger.warning(
f"Failed to connect to UniFi Protect, retrying in {delay}s...",
exc_info=e,
)
await asyncio.sleep(delay)
delay = min(max_delay, delay * 2) # Double the delay but do not exceed max_delay
else:
raise ConnectionError("Failed to connect to UniFi Protect after 10 attempts")
raise ConnectionError("Failed to connect to UniFi Protect after 20 attempts")
# Add a lock to the protect client that can be used to prevent code accessing the client when it has
# lost connection
@@ -204,8 +241,8 @@ class UnifiProtectBackup:
logger.info(f" - {camera.id}: {camera.name}")
# Print timezone info for debugging
logger.debug(f'NVR TZ: {self._protect.bootstrap.nvr.timezone}')
logger.debug(f'Local TZ: {datetime.now(timezone.utc).astimezone().tzinfo}')
logger.debug(f"NVR TZ: {self._protect.bootstrap.nvr.timezone}")
logger.debug(f"Local TZ: {datetime.now(timezone.utc).astimezone().tzinfo}")
tasks = []
@@ -223,32 +260,54 @@ class UnifiProtectBackup:
# Create downloader task
# This will download video files to its buffer
downloader = VideoDownloader(self._protect, self._db, download_queue, upload_queue, self.color_logging)
if self._use_experimental_downloader:
downloader_cls = VideoDownloaderExperimental
else:
downloader_cls = VideoDownloader
downloader = downloader_cls(
self._protect,
self._db,
download_queue,
upload_queue,
self.color_logging,
self._download_rate_limit,
self._max_event_length,
)
tasks.append(downloader.start())
# Create upload task
# Create upload tasks
# This will upload the videos in the downloader's buffer to the rclone remotes and log it in the database
uploader = VideoUploader(
self._protect,
upload_queue,
self.rclone_destination,
self.rclone_args,
self.file_structure_format,
self._db,
self.color_logging,
)
tasks.append(uploader.start())
uploaders = []
for i in range(self._parallel_uploads):
uploader = VideoUploader(
self._protect,
upload_queue,
self.rclone_destination,
self.rclone_args,
self.file_structure_format,
self._db,
self.color_logging,
)
uploaders.append(uploader)
tasks.append(uploader.start())
# Create event listener task
# This will connect to the unifi protect websocket and listen for events. When one is detected it will
# be added to the queue of events to download
event_listener = EventListener(download_queue, self._protect, self.detection_types, self.ignore_cameras)
event_listener = EventListener(
download_queue, self._protect, self.detection_types, self.ignore_cameras, self.cameras
)
tasks.append(event_listener.start())
# Create purge task
# This will, every midnight, purge old backups from the rclone remotes and database
purge = Purge(
self._db, self.retention, self.rclone_destination, self._purge_interval, self.rclone_purge_args
self._db,
self.retention,
self.rclone_destination,
self._purge_interval,
self.rclone_purge_args,
)
tasks.append(purge.start())
@@ -260,10 +319,11 @@ class UnifiProtectBackup:
self._db,
download_queue,
downloader,
uploader,
uploaders,
self.retention,
self.detection_types,
self.ignore_cameras,
self.cameras,
)
if self._skip_missing:
logger.info("Ignoring missing events")
@@ -292,7 +352,7 @@ class UnifiProtectBackup:
ValueError: The given rclone destination is for a remote that is not configured
"""
rclone = shutil.which('rclone')
rclone = shutil.which("rclone")
if not rclone:
raise RuntimeError("`rclone` is not installed on this system")
logger.debug(f"rclone found: {rclone}")

View File

@@ -6,10 +6,17 @@ import re
from datetime import datetime
import aiosqlite
from pyunifiprotect import ProtectApiClient
from pyunifiprotect.data.nvr import Event
from uiprotect import ProtectApiClient
from uiprotect.data.nvr import Event
from unifi_protect_backup.utils import VideoQueue, get_camera_name, human_readable_size, run_command, setup_event_logger
from unifi_protect_backup.utils import (
SubprocessException,
VideoQueue,
get_camera_name,
human_readable_size,
run_command,
setup_event_logger,
)
class VideoUploader:
@@ -49,7 +56,7 @@ class VideoUploader:
self.base_logger = logging.getLogger(__name__)
setup_event_logger(self.base_logger, color_logging)
self.logger = logging.LoggerAdapter(self.base_logger, {'event': ''})
self.logger = logging.LoggerAdapter(self.base_logger, {"event": ""})
async def start(self):
"""Main loop.
@@ -63,7 +70,7 @@ class VideoUploader:
event, video = await self.upload_queue.get()
self.current_event = event
self.logger = logging.LoggerAdapter(self.base_logger, {'event': f' [{event.id}]'})
self.logger = logging.LoggerAdapter(self.base_logger, {"event": f" [{event.id}]"})
self.logger.info(f"Uploading event: {event.id}")
self.logger.debug(
@@ -74,10 +81,13 @@ class VideoUploader:
destination = await self._generate_file_path(event)
self.logger.debug(f" Destination: {destination}")
await self._upload_video(video, destination, self._rclone_args)
await self._update_database(event, destination)
try:
await self._upload_video(video, destination, self._rclone_args)
await self._update_database(event, destination)
self.logger.debug("Uploaded")
except SubprocessException:
self.logger.error(f" Failed to upload file: '{destination}'")
self.logger.debug("Uploaded")
self.current_event = None
except Exception as e:
@@ -99,7 +109,7 @@ class VideoUploader:
"""
returncode, stdout, stderr = await run_command(f'rclone rcat -vv {rclone_args} "{destination}"', video)
if returncode != 0:
self.logger.error(f" Failed to upload file: '{destination}'")
raise SubprocessException(stdout, stderr, returncode)
async def _update_database(self, event: Event, destination: str):
"""Add the backed up event to the database along with where it was backed up to."""
@@ -107,7 +117,7 @@ class VideoUploader:
assert isinstance(event.end, datetime)
await self._db.execute(
"INSERT INTO events VALUES "
f"('{event.id}', '{event.type}', '{event.camera_id}',"
f"('{event.id}', '{event.type.value}', '{event.camera_id}',"
f"'{event.start.timestamp()}', '{event.end.timestamp()}')"
)
@@ -128,7 +138,7 @@ class VideoUploader:
Provides the following fields to the format string:
event: The `Event` object as per
https://github.com/briis/pyunifiprotect/blob/master/pyunifiprotect/data/nvr.py
https://github.com/briis/uiprotect/blob/master/uiprotect/data/nvr.py
duration_seconds: The duration of the event in seconds
detection_type: A nicely formatted list of the event detection type and the smart detection types (if any)
camera_name: The name of the camera that generated this event
@@ -147,13 +157,13 @@ class VideoUploader:
format_context = {
"event": event,
"duration_seconds": (event.end - event.start).total_seconds(),
"detection_type": f"{event.type} ({' '.join(event.smart_detect_types)})"
"detection_type": f"{event.type.value} ({' '.join(event.smart_detect_types)})"
if event.smart_detect_types
else f"{event.type}",
else f"{event.type.value}",
"camera_name": await get_camera_name(self._protect, event.camera_id),
}
file_path = self._file_structure_format.format(**format_context)
file_path = re.sub(r'[^\w\-_\.\(\)/ ]', '', file_path) # Sanitize any invalid chars
file_path = re.sub(r"[^\w\-_\.\(\)/ ]", "", file_path) # Sanitize any invalid chars
return pathlib.Path(f"{self._rclone_destination}/{file_path}")
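As an illustration (the format string and all values below are hypothetical), given the context built above:

# Hypothetical format string and the path it would produce:
fmt = "{camera_name}/{event.start:%Y-%m-%d}/{event.id} - {detection_type}.mp4"
# With camera_name="Front Door", event.start=2025-04-09 13:01:45,
# event.id="abc123" and detection_type="smartDetectZone (person)", this yields:
#   Front Door/2025-04-09/abc123 - smartDetectZone (person).mp4
# before sanitization and prefixing with the rclone destination.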

View File

@@ -7,10 +7,9 @@ from datetime import datetime
from typing import List, Optional
from apprise import NotifyType
from dateutil.relativedelta import relativedelta
from pyunifiprotect import ProtectApiClient
from pyunifiprotect.data.nvr import Event
from async_lru import alru_cache
from uiprotect import ProtectApiClient
from uiprotect.data.nvr import Event
from unifi_protect_backup import notifications
@@ -51,11 +50,11 @@ def add_logging_level(levelName: str, levelNum: int, methodName: Optional[str] =
methodName = levelName.lower()
if hasattr(logging, levelName):
raise AttributeError('{} already defined in logging module'.format(levelName))
raise AttributeError("{} already defined in logging module".format(levelName))
if hasattr(logging, methodName):
raise AttributeError('{} already defined in logging module'.format(methodName))
raise AttributeError("{} already defined in logging module".format(methodName))
if hasattr(logging.getLoggerClass(), methodName):
raise AttributeError('{} already defined in logger class'.format(methodName))
raise AttributeError("{} already defined in logger class".format(methodName))
# This method was inspired by the answers to Stack Overflow post
# http://stackoverflow.com/q/2183233/2988730, especially
@@ -85,19 +84,19 @@ def add_color_to_record_levelname(record):
"""Colorizes logging level names."""
levelno = record.levelno
if levelno >= logging.CRITICAL:
color = '\x1b[31;1m' # RED
color = "\x1b[31;1m" # RED
elif levelno >= logging.ERROR:
color = '\x1b[31;1m' # RED
color = "\x1b[31;1m" # RED
elif levelno >= logging.WARNING:
color = '\x1b[33;1m' # YELLOW
color = "\x1b[33;1m" # YELLOW
elif levelno >= logging.INFO:
color = '\x1b[32;1m' # GREEN
color = "\x1b[32;1m" # GREEN
elif levelno >= logging.DEBUG:
color = '\x1b[36;1m' # CYAN
color = "\x1b[36;1m" # CYAN
elif levelno >= logging.EXTRA_DEBUG:
color = '\x1b[35;1m' # MAGENTA
color = "\x1b[35;1m" # MAGENTA
else:
color = '\x1b[0m'
color = "\x1b[0m"
return f"{color}{record.levelname}\x1b[0m"
@@ -175,7 +174,7 @@ class AppriseStreamHandler(logging.StreamHandler):
def create_logging_handler(format, color_logging):
"""Constructs apprise logging handler for the given format."""
date_format = "%Y-%m-%d %H:%M:%S"
style = '{'
style = "{"
sh = AppriseStreamHandler(color_logging)
formatter = logging.Formatter(format, date_format, style)
@@ -204,11 +203,11 @@ def setup_logging(verbosity: int, color_logging: bool = False, apprise_notifiers
"""
add_logging_level(
'EXTRA_DEBUG',
"EXTRA_DEBUG",
logging.DEBUG - 1,
)
add_logging_level(
'WEBSOCKET_DATA',
"WEBSOCKET_DATA",
logging.DEBUG - 2,
)
@@ -239,12 +238,18 @@ def setup_logging(verbosity: int, color_logging: bool = False, apprise_notifiers
logger.setLevel(logging.WEBSOCKET_DATA) # type: ignore
_initialized_loggers = []
def setup_event_logger(logger, color_logging):
"""Sets up a logger that also displays the event ID currently being processed."""
format = "{asctime} [{levelname:^11s}] {name:<42} :{event} {message}"
sh = create_logging_handler(format, color_logging)
logger.addHandler(sh)
logger.propagate = False
global _initialized_loggers
if logger not in _initialized_loggers:
format = "{asctime} [{levelname:^11s}] {name:<42} :{event} {message}"
sh = create_logging_handler(format, color_logging)
logger.addHandler(sh)
logger.propagate = False
_initialized_loggers.append(logger)
_suffixes = ["B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB", "ZiB", "YiB"]
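The helper built on this suffix list is elided by the diff; a minimal sketch of the assumed behaviour:

def human_readable_size(num: float) -> str:
    # Sketch only; the real implementation is not shown in this diff.
    # Divide by 1024 until the value fits under the next suffix.
    for suffix in _suffixes:
        if abs(num) < 1024:
            return f"{num:.1f}{suffix}"
        num /= 1024
    return f"{num:.1f}{_suffixes[-1]}"

print(human_readable_size(536870912))  # "512.0MiB"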
@@ -287,7 +292,7 @@ async def get_camera_name(protect: ProtectApiClient, id: str):
If the camera ID is not known, it tries refreshing the cached data
"""
# Wait for unifi protect to be connected
await protect.connect_event.wait()
await protect.connect_event.wait() # type: ignore
try:
return protect.bootstrap.cameras[id].name
@@ -295,7 +300,7 @@ async def get_camera_name(protect: ProtectApiClient, id: str):
# Refresh cameras
logger.debug(f"Unknown camera id: '{id}', checking API")
await protect.update(force=True)
await protect.update()
try:
name = protect.bootstrap.cameras[id].name
@@ -328,21 +333,6 @@ class SubprocessException(Exception):
return f"Return Code: {self.returncode}\nStdout:\n{self.stdout}\nStderr:\n{self.stderr}"
def parse_rclone_retention(retention: str) -> relativedelta:
"""Parses the rclone `retention` parameter into a relativedelta which can then be used to calculate datetimes."""
matches = {k: int(v) for v, k in re.findall(r"([\d]+)(ms|s|m|h|d|w|M|y)", retention)}
return relativedelta(
microseconds=matches.get("ms", 0) * 1000,
seconds=matches.get("s", 0),
minutes=matches.get("m", 0),
hours=matches.get("h", 0),
days=matches.get("d", 0),
weeks=matches.get("w", 0),
months=matches.get("M", 0),
years=matches.get("y", 0),
)
async def run_command(cmd: str, data=None):
"""Runs the given command returning the exit code, stdout and stderr."""
proc = await asyncio.create_subprocess_shell(
@@ -353,9 +343,9 @@ async def run_command(cmd: str, data=None):
)
stdout, stderr = await proc.communicate(data)
stdout = stdout.decode()
stdout_indented = '\t' + stdout.replace('\n', '\n\t').strip()
stdout_indented = "\t" + stdout.replace("\n", "\n\t").strip()
stderr = stderr.decode()
stderr_indented = '\t' + stderr.replace('\n', '\n\t').strip()
stderr_indented = "\t" + stderr.replace("\n", "\n\t").strip()
if proc.returncode != 0:
logger.error(f"Failed to run: '{cmd}")
@@ -393,7 +383,7 @@ class VideoQueue(asyncio.Queue):
self._queue.append(item) # type: ignore
self._bytes_sum += len(item[1])
def full(self, item: tuple[Event, bytes] = None):
def full(self, item: tuple[Event, bytes] | None = None):
"""Return True if there are maxsize bytes in the queue.
optionally if `item` is provided, it will return False if there is enough space to
@@ -423,7 +413,7 @@ class VideoQueue(asyncio.Queue):
)
while self.full(item):
putter = self._loop.create_future() # type: ignore
putter = self._get_loop().create_future() # type: ignore
self._putters.append(putter) # type: ignore
try:
await putter
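Taken together, VideoQueue behaves like an asyncio.Queue whose maxsize bounds the summed video bytes rather than the item count; a brief usage sketch (buffer size hypothetical):

async def buffer_example(event: Event, video: bytes) -> None:
    queue = VideoQueue(maxsize=512 * 1024 * 1024)  # 512 MiB download buffer
    await queue.put((event, video))  # blocks while the buffer would overflow
    _event, _video = await queue.get()  # frees len(video) bytes of buffer space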

uv.lock generated (1648 lines; file diff suppressed because it is too large)