Merge remote-tracking branch 'meisnate12/nightly' into nightly

pull/1866/head
bullmoose20 10 months ago
commit 63dc9da7cf

@@ -6,13 +6,17 @@ Updated python-dotenv requirement to 1.0.1
# New Features
Added `monitor_existing` to sonarr and radarr to update the monitored status of items existing in Plex to match the declared `monitor` value.
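As a sketch, this might be configured like so (the url/token are placeholders and `monitor: all` is illustrative; see the Sonarr/Radarr pages for the full attribute set):

```yaml
sonarr:
  url: http://192.168.1.12:8989
  token: ################################
  monitor: all
  monitor_existing: true  # also sync the monitored status of shows already in Plex
```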
Added [Gotify](https://gotify.net/) as a notification service. Thanks @krstn420 for the initial code.
# Updates
Added new [BoxOfficeMojo Builder](https://metamanager.wiki/en/latest/files/builders/mojo/) - credit to @nwithan8 for the suggestion and initial code submission
Added new [`trakt_chart` attributes](https://metamanager.wiki/en/latest/files/builders/trakt/#trakt-chart) `network_ids`, `studio_ids`, `votes`, `tmdb_ratings`, `tmdb_votes`, `imdb_ratings`, `imdb_votes`, `rt_meters`, `rt_user_meters`, and `metascores`, and removed the deprecated `network` attribute
Added [Trakt and MyAnimeList Authentication Page](https://metamanager.wiki/en/latest/config/auth/) allowing users to authenticate against those services directly from the wiki - credit to @chazlarson for developing the script
Trakt Builder `trakt_userlist` value `recommendations` removed and `favorites` added.
Mass Update operations can now be given a list of sources to fall back on when one fails, including a manual source.
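The fallback behavior can be sketched as follows (a hypothetical illustration, not Kometa's actual implementation; the `resolve_rating` helper and the provider lambdas are invented for the example):

```python
# Sketch: resolve a value from an ordered list of sources with a manual fallback.
def resolve_rating(sources, providers):
    """Return the first value an available provider yields; a source that is
    not a known provider is treated as a literal (manual) fallback value."""
    for source in sources:
        provider = providers.get(source)
        if provider is None:
            return source  # manual source, e.g. a literal 2.0
        value = provider()
        if value is not None:
            return value
    return None

# Providers simulating lookups that may fail (return None).
providers = {
    "mdb": lambda: None,         # MdbList has no rating for this item
    "mdb_average": lambda: 7.4,  # MdbList average succeeds
}

print(resolve_rating(["mdb", "mdb_average", 2.0], providers))  # 7.4
print(resolve_rating(["mdb", 2.0], providers))                 # 2.0
```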
`mass_content_rating_update` has a new source `mdb_age_rating`
`mass_originally_available_update` has a new source `mdb_digital`
`plex` attributes `clean_bundles`, `empty_trash`, and `optimize` can now take any schedule option so they run only when desired.
# Defaults

@@ -1 +1 @@
1.20.0-develop24

@@ -105,6 +105,9 @@ mdblist:
  cache_expiration: 60
notifiarr:
  apikey: ####################################
gotify:
  url: http://192.168.1.12:80
  token: ####################################
anidb: # Not required for AniDB builders unless you want mature content
  username: ######
  password: ######

@@ -0,0 +1,31 @@
# Gotify Attributes

Configuring [Gotify](https://gotify.net/) is optional but allows you to send [webhooks](webhooks.md)
straight to Gotify.

A `gotify` mapping is in the root of the config file.

Below is a `gotify` mapping example and the full set of attributes:

```yaml
gotify:
  url: ####################################
  token: ####################################
```

| Attribute | Allowed Values           | Required                                   |
|:----------|:-------------------------|:------------------------------------------:|
| `url`     | Gotify Server URL        | :fontawesome-solid-circle-check:{ .green } |
| `token`   | Gotify Application Token | :fontawesome-solid-circle-check:{ .green } |

Once you have added the `gotify` mapping to your config.yml, add `gotify` to any [webhook](webhooks.md) to send that
notification to Gotify.

```yaml
webhooks:
  error: gotify
  version: gotify
  run_start: gotify
  run_end: gotify
  changes: gotify
```

@@ -104,7 +104,7 @@ You can create individual blocks of operations by using a list under `operations
**Attribute:** `mass_genre_update`

**Accepted Values:** Source or List of sources to use in that order

<table class="clearTable">
<tr><td>`tmdb`</td><td>Use TMDb for Genres</td></tr>
@@ -123,6 +123,7 @@ You can create individual blocks of operations by using a list under `operations
<tr><td>`unlock`</td><td>Unlock all Genre Fields</td></tr>
<tr><td>`remove`</td><td>Remove all Genres and Lock all Fields</td></tr>
<tr><td>`reset`</td><td>Remove all Genres and Unlock all Fields</td></tr>
<tr><td colspan="2">List of Strings for Genres (<code>["String 1", "String 2"]</code>)</td></tr>
</table>

???+ example "Example"

@@ -131,7 +132,10 @@ You can create individual blocks of operations by using a list under `operations
    libraries:
      Movies:
        operations:
          mass_genre_update:
            - imdb
            - tmdb
            - ["Unknown"]
    ```
??? blank "`mass_content_rating_update` - Updates the content rating of every item in the library.<a class="headerlink" href="#mass-content-rating-update" title="Permanent link"></a>"

@@ -143,18 +147,21 @@ You can create individual blocks of operations by using a list under `operations
**Attribute:** `mass_content_rating_update`

**Accepted Values:** Source or List of sources to use in that order

<table class="clearTable">
<tr><td>`mdb`</td><td>Use MdbList for Content Ratings</td></tr>
<tr><td>`mdb_commonsense`</td><td>Use Common Sense Rating through MDbList for Content Ratings</td></tr>
<tr><td>`mdb_commonsense0`</td><td>Use Common Sense Rating with Zero Padding through MDbList for Content Ratings</td></tr>
<tr><td>`mdb_age_rating`</td><td>Use MDbList Age Rating for Content Ratings</td></tr>
<tr><td>`mdb_age_rating0`</td><td>Use MDbList Age Rating with Zero Padding for Content Ratings</td></tr>
<tr><td>`omdb`</td><td>Use IMDb through OMDb for Content Ratings</td></tr>
<tr><td>`mal`</td><td>Use MyAnimeList for Content Ratings</td></tr>
<tr><td>`lock`</td><td>Lock Content Rating Field</td></tr>
<tr><td>`unlock`</td><td>Unlock Content Rating Field</td></tr>
<tr><td>`remove`</td><td>Remove Content Rating and Lock Field</td></tr>
<tr><td>`reset`</td><td>Remove Content Rating and Unlock Field</td></tr>
<tr><td colspan="2">Any String for Content Ratings</td></tr>
</table>

???+ example "Example"

@@ -163,7 +170,10 @@ You can create individual blocks of operations by using a list under `operations
    libraries:
      Movies:
        operations:
          mass_content_rating_update:
            - mdb_commonsense
            - mdb_age_rating
            - NR
    ```
??? blank "`mass_original_title_update` - Updates the original title of every item in the library.<a class="headerlink" href="#mass-original-title-update" title="Permanent link"></a>"

@@ -175,7 +185,7 @@ You can create individual blocks of operations by using a list under `operations
**Attribute:** `mass_original_title_update`

**Accepted Values:** Source or List of sources to use in that order

<table class="clearTable">
<tr><td>`anidb`</td><td>Use AniDB Main Title for Original Titles</td></tr>
@@ -187,6 +197,7 @@ You can create individual blocks of operations by using a list under `operations
<tr><td>`unlock`</td><td>Unlock Original Title Field</td></tr>
<tr><td>`remove`</td><td>Remove Original Title and Lock Field</td></tr>
<tr><td>`reset`</td><td>Remove Original Title and Unlock Field</td></tr>
<tr><td colspan="2">Any String for Original Titles</td></tr>
</table>

???+ example "Example"

@@ -195,7 +206,10 @@ You can create individual blocks of operations by using a list under `operations
    libraries:
      Anime:
        operations:
          mass_original_title_update:
            - anidb_official
            - anidb
            - Unknown
    ```
??? blank "`mass_studio_update` - Updates the studio of every item in the library.<a class="headerlink" href="#mass-studio-update" title="Permanent link"></a>"

@@ -206,16 +220,17 @@ You can create individual blocks of operations by using a list under `operations
**Attribute:** `mass_studio_update`

**Accepted Values:** Source or List of sources to use in that order

<table class="clearTable">
<tr><td>`anidb`</td><td>Use AniDB Animation Work for Studio</td></tr>
<tr><td>`mal`</td><td>Use MyAnimeList Studio for Studio</td></tr>
<tr><td>`tmdb`</td><td>Use TMDb Studio for Studio</td></tr>
<tr><td>`lock`</td><td>Lock Studio Field</td></tr>
<tr><td>`unlock`</td><td>Unlock Studio Field</td></tr>
<tr><td>`remove`</td><td>Remove Studio and Lock Field</td></tr>
<tr><td>`reset`</td><td>Remove Studio and Unlock Field</td></tr>
<tr><td colspan="2">Any String for Studio</td></tr>
</table>

???+ example "Example"

@@ -224,7 +239,10 @@ You can create individual blocks of operations by using a list under `operations
    libraries:
      Anime:
        operations:
          mass_studio_update:
            - mal
            - anidb
            - Unknown
    ```
??? blank "`mass_originally_available_update` - Updates the originally available date of every item in the library.<a class="headerlink" href="#mass-originally-available-update" title="Permanent link"></a>"

@@ -241,19 +259,21 @@ You can create individual blocks of operations by using a list under `operations
**Attribute:** `mass_originally_available_update`

**Accepted Values:** Source or List of sources to use in that order

<table class="clearTable">
<tr><td>`tmdb`</td><td>Use TMDb Release Date</td></tr>
<tr><td>`tvdb`</td><td>Use TVDb Release Date</td></tr>
<tr><td>`omdb`</td><td>Use IMDb Release Date through OMDb</td></tr>
<tr><td>`mdb`</td><td>Use MdbList Release Date</td></tr>
<tr><td>`mdb_digital`</td><td>Use MdbList Digital Release Date</td></tr>
<tr><td>`anidb`</td><td>Use AniDB Release Date</td></tr>
<tr><td>`mal`</td><td>Use MyAnimeList Release Date</td></tr>
<tr><td>`lock`</td><td>Lock Originally Available Field</td></tr>
<tr><td>`unlock`</td><td>Unlock Originally Available Field</td></tr>
<tr><td>`remove`</td><td>Remove Originally Available and Lock Field</td></tr>
<tr><td>`reset`</td><td>Remove Originally Available and Unlock Field</td></tr>
<tr><td colspan="2">Any String in the Format: YYYY-MM-DD for Originally Available (<code>2022-05-28</code>)</td></tr>
</table>

???+ example "Example"

@@ -262,7 +282,10 @@ You can create individual blocks of operations by using a list under `operations
    libraries:
      TV Shows:
        operations:
          mass_originally_available_update:
            - mdb_digital
            - mdb
            - 1900-01-01
    ```
??? blank "`mass_***_rating_update` - Updates the audience/critic/user rating of every item in the library.<a class="headerlink" href="#mass-star-rating-update" title="Permanent link"></a>"

@@ -284,7 +307,7 @@ You can create individual blocks of operations by using a list under `operations
**Attribute:** `mass_audience_rating_update`/`mass_critic_rating_update`/`mass_user_rating_update`

**Accepted Values:** Source or List of sources to use in that order

<table class="clearTable">
<tr><td>`tmdb`</td><td>Use TMDb Rating</td></tr>
@@ -310,6 +333,7 @@ You can create individual blocks of operations by using a list under `operations
<tr><td>`unlock`</td><td>Unlock Rating Field</td></tr>
<tr><td>`remove`</td><td>Remove Rating and Lock Field</td></tr>
<tr><td>`reset`</td><td>Remove Rating and Unlock Field</td></tr>
<tr><td colspan="2">Any Number between 0.0-10.0 for Ratings</td></tr>
</table>

???+ example "Example"

@@ -318,9 +342,17 @@ You can create individual blocks of operations by using a list under `operations
    libraries:
      Movies:
        operations:
          mass_audience_rating_update:
            - mdb
            - mdb_average
            - 2.0
          mass_critic_rating_update:
            - imdb
            - omdb
            - 2.0
          mass_user_rating_update:
            - trakt_user
            - 2.0
    ```
??? blank "`mass_episode_***_rating_update` - Updates the audience/critic/user rating of every episode in the library.<a class="headerlink" href="#mass-episode-star-rating-update" title="Permanent link"></a>"

@@ -342,7 +374,7 @@ You can create individual blocks of operations by using a list under `operations
**Attribute:** `mass_episode_audience_rating_update`/`mass_episode_critic_rating_update`/`mass_episode_user_rating_update`

**Accepted Values:** Source or List of sources to use in that order

<table class="clearTable">
<tr><td>`tmdb`</td><td>Use TMDb Rating</td></tr>
@@ -351,6 +383,7 @@ You can create individual blocks of operations by using a list under `operations
<tr><td>`unlock`</td><td>Unlock Rating Field</td></tr>
<tr><td>`remove`</td><td>Remove Rating and Lock Field</td></tr>
<tr><td>`reset`</td><td>Remove Rating and Unlock Field</td></tr>
<tr><td colspan="2">Any Number between 0.0-10.0 for Ratings</td></tr>
</table>

???+ example "Example"

@@ -359,9 +392,17 @@ You can create individual blocks of operations by using a list under `operations
    libraries:
      TV Shows:
        operations:
          mass_episode_audience_rating_update:
            - mdb
            - mdb_average
            - 2.0
          mass_episode_critic_rating_update:
            - imdb
            - omdb
            - 2.0
          mass_episode_user_rating_update:
            - trakt_user
            - 2.0
    ```
??? blank "`mass_poster_update` - Updates the poster of every item in the library.<a class="headerlink" href="#mass-poster-update" title="Permanent link"></a>"

@@ -24,6 +24,7 @@ requirements for setup that can be found by clicking the links within the table.
| [`tautulli`](tautulli.md) | :fontawesome-solid-circle-xmark:{ .red } |
| [`omdb`](omdb.md) | :fontawesome-solid-circle-xmark:{ .red } |
| [`notifiarr`](notifiarr.md) | :fontawesome-solid-circle-xmark:{ .red } |
| [`gotify`](gotify.md) | :fontawesome-solid-circle-xmark:{ .red } |
| [`anidb`](anidb.md) | :fontawesome-solid-circle-xmark:{ .red } |
| [`radarr`](radarr.md) | :fontawesome-solid-circle-xmark:{ .red } |
| [`sonarr`](sonarr.md) | :fontawesome-solid-circle-xmark:{ .red } |

@@ -23,14 +23,14 @@ plex:
```
| Attribute | Allowed Values | Default | Required |
|:----------------|:-------------------------------------------------------------------------------------------------------------------------------|:--------|:------------------------------------------:|
| `url` | Plex Server URL<br><strong>Example:</strong> http://192.168.1.12:32400 | N/A | :fontawesome-solid-circle-check:{ .green } |
| `token` | Plex Server Authentication Token | N/A | :fontawesome-solid-circle-check:{ .green } |
| `timeout` | Plex Server Timeout | 60 | :fontawesome-solid-circle-xmark:{ .red } |
| `db_cache` | Plex Server Database Cache Size | None | :fontawesome-solid-circle-xmark:{ .red } |
| `clean_bundles` | Runs Clean Bundles on the Server after all Collection Files are run<br>(`true`, `false` or Any [schedule option](schedule.md)) | false | :fontawesome-solid-circle-xmark:{ .red } |
| `empty_trash` | Runs Empty Trash on the Server after all Collection Files are run<br>(`true`, `false` or Any [schedule option](schedule.md)) | false | :fontawesome-solid-circle-xmark:{ .red } |
| `optimize` | Runs Optimize on the Server after all Collection Files are run<br>(`true`, `false` or Any [schedule option](schedule.md)) | false | :fontawesome-solid-circle-xmark:{ .red } |
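As a sketch, scheduling these maintenance tasks might look like the following (the url/token are placeholders and the `weekly(sunday)`/`monthly(1)` expressions are illustrative; see [schedule option](schedule.md) for the exact syntax):

```yaml
plex:
  url: http://192.168.1.12:32400
  token: ####################################
  clean_bundles: weekly(sunday)  # run Clean Bundles only on Sunday runs
  empty_trash: weekly(sunday)
  optimize: monthly(1)           # run Optimize only on the 1st of the month
```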
???+ warning

@@ -28,6 +28,7 @@ webhooks:
* Each Attribute can be either a webhook url as a string or a comma-separated list of webhook urls.
* To send notifications to [Notifiarr](notifiarr.md) just add `notifiarr` to a webhook instead of the webhook url.
* To send notifications to [Gotify](gotify.md) just add `gotify` to a webhook instead of the webhook url.
## Error Notifications

@@ -77,7 +78,6 @@ level the error occurs.
"error": str,        // Error Message
"critical": bool,    // Critical Error
"server_name": str,  // Server Name
"library_name": str, // Library Name
"playlist": str      // Playlist Name
}
```

@@ -293,17 +293,17 @@ table.dualTable td, table.dualTable th {
  right: 0;
}

.md-typeset :is(.admonition, details) {
  background-color: #1b1b1b;
}

/* Custom tooltips */
.md-tooltip {
  background-color: var(--md-primary-fg-color);
  border-radius: 6px;
}

[data-md-color-scheme="slate"] .md-typeset .admonition.builder,
.md-typeset details.quicklink {
  background-color: #1b1b1b;
}

.md-typeset .admonition.builder,
.md-typeset details.builder {
  border: 1px solid var(--pg-light-border);

@@ -25,6 +25,9 @@
"notifiarr": {
  "$ref": "#/definitions/notifiarr-api"
},
"gotify": {
  "$ref": "#/definitions/gotify-api"
},
"anidb": {
  "$ref": "#/definitions/anidb-api"
},
@@ -283,6 +286,24 @@
],
"title": "notifiarr"
},
"gotify-api": {
  "type": "object",
  "additionalProperties": false,
  "properties": {
    "url": {
      "type": "string"
    },
    "token": {
      "type": "string"
    }
  },
  "required": [
    "url",
    "token"
  ],
  "title": "gotify"
},
"anidb-api": {
  "type": "object",
  "additionalProperties": false,
@@ -1116,7 +1137,7 @@
"type": "object",
"additionalProperties": false,
"patternProperties": {
  "^(?!plex|tmdb|tautulli|webhooks|omdb|mdblist|notifiarr|gotify|anidb|radarr|sonarr|trakt|mal).+$": {
    "additionalProperties": false,
    "properties": {
      "metadata_files": {

@@ -472,6 +472,9 @@ mdblist:
  cache_expiration: 60
notifiarr:
  apikey: this-is-a-placeholder-string
gotify:
  url: http://192.168.1.12:80
  token: this-is-a-placeholder-string
anidb: # Not required for AniDB builders unless you want mature content
  username: this-is-a-placeholder-string
  password: this-is-a-placeholder-string

@@ -178,6 +178,7 @@ nav:
- Radarr: config/radarr.md
- Sonarr: config/sonarr.md
- Notifiarr: config/notifiarr.md
- Gotify: config/gotify.md
- Tautulli: config/tautulli.md
- Github: config/github.md
- MdbList: config/mdblist.md

@@ -25,6 +25,7 @@ class Cache:
cursor.execute("DROP TABLE IF EXISTS mdb_data")
cursor.execute("DROP TABLE IF EXISTS mdb_data2")
cursor.execute("DROP TABLE IF EXISTS mdb_data3")
cursor.execute("DROP TABLE IF EXISTS mdb_data4")
cursor.execute("DROP TABLE IF EXISTS omdb_data")
cursor.execute("DROP TABLE IF EXISTS omdb_data2")
cursor.execute("DROP TABLE IF EXISTS tvdb_data")
@@ -98,12 +99,13 @@ class Cache:
    expiration_date TEXT)"""
)
cursor.execute(
    """CREATE TABLE IF NOT EXISTS mdb_data5 (
    key INTEGER PRIMARY KEY,
    key_id TEXT UNIQUE,
    title TEXT,
    year INTEGER,
    released TEXT,
    released_digital TEXT,
    type TEXT,
    imdbid TEXT,
    traktid INTEGER,
@@ -119,8 +121,9 @@ class Cache:
    tmdb_rating INTEGER,
    letterboxd_rating REAL,
    myanimelist_rating REAL,
    certification TEXT,
    commonsense TEXT,
    age_rating TEXT,
    expiration_date TEXT)"""
)
cursor.execute(
cursor.execute( cursor.execute(
@@ -480,20 +483,22 @@ class Cache:
with sqlite3.connect(self.cache_path) as connection:
    connection.row_factory = sqlite3.Row
    with closing(connection.cursor()) as cursor:
        cursor.execute("SELECT * FROM mdb_data5 WHERE key_id = ?", (key_id,))
        row = cursor.fetchone()
        if row:
            mdb_dict["title"] = row["title"] if row["title"] else None
            mdb_dict["year"] = row["year"] if row["year"] else None
            mdb_dict["released"] = row["released"] if row["released"] else None
            mdb_dict["released_digital"] = row["released_digital"] if row["released_digital"] else None
            mdb_dict["type"] = row["type"] if row["type"] else None
            mdb_dict["imdbid"] = row["imdbid"] if row["imdbid"] else None
            mdb_dict["traktid"] = row["traktid"] if row["traktid"] else None
            mdb_dict["tmdbid"] = row["tmdbid"] if row["tmdbid"] else None
            mdb_dict["score"] = row["score"] if row["score"] else None
            mdb_dict["score_average"] = row["average"] if row["average"] else None
            mdb_dict["certification"] = row["certification"] if row["certification"] else None
            mdb_dict["commonsense"] = row["commonsense"] if row["commonsense"] else None
            mdb_dict["age_rating"] = row["age_rating"] if row["age_rating"] else None
            mdb_dict["ratings"] = [
                {"source": "imdb", "value": row["imdb_rating"] if row["imdb_rating"] else None},
                {"source": "metacritic", "value": row["metacritic_rating"] if row["metacritic_rating"] else None},
@ -515,16 +520,17 @@ class Cache:
with sqlite3.connect(self.cache_path) as connection: with sqlite3.connect(self.cache_path) as connection:
connection.row_factory = sqlite3.Row connection.row_factory = sqlite3.Row
with closing(connection.cursor()) as cursor: with closing(connection.cursor()) as cursor:
cursor.execute("INSERT OR IGNORE INTO mdb_data4(key_id) VALUES(?)", (key_id,)) cursor.execute("INSERT OR IGNORE INTO mdb_data5(key_id) VALUES(?)", (key_id,))
update_sql = "UPDATE mdb_data4 SET title = ?, year = ?, released = ?, type = ?, imdbid = ?, traktid = ?, " \ update_sql = "UPDATE mdb_data5 SET title = ?, year = ?, released = ?, released_digital = ?, type = ?, imdbid = ?, traktid = ?, " \
"tmdbid = ?, score = ?, average = ?, imdb_rating = ?, metacritic_rating = ?, metacriticuser_rating = ?, " \ "tmdbid = ?, score = ?, average = ?, imdb_rating = ?, metacritic_rating = ?, metacriticuser_rating = ?, " \
"trakt_rating = ?, tomatoes_rating = ?, tomatoesaudience_rating = ?, tmdb_rating = ?, " \ "trakt_rating = ?, tomatoes_rating = ?, tomatoesaudience_rating = ?, tmdb_rating = ?, " \
"letterboxd_rating = ?, myanimelist_rating = ?, certification = ?, commonsense = ?, expiration_date = ? WHERE key_id = ?" "letterboxd_rating = ?, myanimelist_rating = ?, certification = ?, commonsense = ?, age_rating = ?, expiration_date = ? WHERE key_id = ?"
cursor.execute(update_sql, ( cursor.execute(update_sql, (
mdb.title, mdb.year, mdb.released.strftime("%Y-%m-%d") if mdb.released else None, mdb.type, mdb.title, mdb.year, mdb.released.strftime("%Y-%m-%d") if mdb.released else None,
mdb.released_digital.strftime("%Y-%m-%d") if mdb.released_digital else None, mdb.type,
mdb.imdbid, mdb.traktid, mdb.tmdbid, mdb.score, mdb.average, mdb.imdb_rating, mdb.metacritic_rating, mdb.imdbid, mdb.traktid, mdb.tmdbid, mdb.score, mdb.average, mdb.imdb_rating, mdb.metacritic_rating,
mdb.metacriticuser_rating, mdb.trakt_rating, mdb.tomatoes_rating, mdb.tomatoesaudience_rating, mdb.metacriticuser_rating, mdb.trakt_rating, mdb.tomatoes_rating, mdb.tomatoesaudience_rating,
mdb.tmdb_rating, mdb.letterboxd_rating, mdb.myanimelist_rating, mdb.content_rating, mdb.commonsense, mdb.tmdb_rating, mdb.letterboxd_rating, mdb.myanimelist_rating, mdb.content_rating, mdb.commonsense, mdb.age_rating,
expiration_date.strftime("%Y-%m-%d"), key_id expiration_date.strftime("%Y-%m-%d"), key_id
)) ))
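The cache hunk above migrates to an `mdb_data5` table but keeps the same write pattern: an `INSERT OR IGNORE` to guarantee the row exists, then a single `UPDATE` that writes every column. A minimal sketch of that idiom, against a hypothetical two-column table rather than the real `mdb_data5` schema:

```python
import sqlite3
from contextlib import closing

# Hypothetical table standing in for mdb_data5; key_id is the lookup key.
connection = sqlite3.connect(":memory:")
with closing(connection.cursor()) as cursor:
    cursor.execute("CREATE TABLE ratings (key_id TEXT PRIMARY KEY, score INTEGER)")

def upsert(key_id, score):
    with closing(connection.cursor()) as cursor:
        # Ensure the row exists; this is a no-op when key_id is already present.
        cursor.execute("INSERT OR IGNORE INTO ratings(key_id) VALUES(?)", (key_id,))
        # Then write all columns in one UPDATE, so insert and refresh share one path.
        cursor.execute("UPDATE ratings SET score = ? WHERE key_id = ?", (score, key_id))
    connection.commit()

upsert("tt0111161", 93)
upsert("tt0111161", 95)  # second call updates in place instead of duplicating
```

The two-statement form predates SQLite's `INSERT ... ON CONFLICT DO UPDATE` upsert (3.24+) and stays portable to older SQLite builds.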

@ -16,6 +16,7 @@ from modules.mal import MyAnimeList
from modules.meta import PlaylistFile from modules.meta import PlaylistFile
from modules.mojo import BoxOfficeMojo from modules.mojo import BoxOfficeMojo
from modules.notifiarr import Notifiarr from modules.notifiarr import Notifiarr
from modules.gotify import Gotify
from modules.omdb import OMDb from modules.omdb import OMDb
from modules.overlays import Overlays from modules.overlays import Overlays
from modules.plex import Plex from modules.plex import Plex
@ -55,8 +56,10 @@ mass_genre_options = {
} }
mass_content_options = { mass_content_options = {
"lock": "Lock Rating", "unlock": "Unlock Rating", "remove": "Remove and Lock Rating", "reset": "Remove and Unlock Rating", "lock": "Lock Rating", "unlock": "Unlock Rating", "remove": "Remove and Lock Rating", "reset": "Remove and Unlock Rating",
"omdb": "Use IMDb Rating through OMDb", "mdb": "Use MdbList Rating", "mdb_commonsense": "Use Commonsense Rating through MDbList", "omdb": "Use IMDb Rating through OMDb", "mdb": "Use MdbList Rating",
"mdb_commonsense0": "Use Commonsense Rating with Zero Padding through MDbList", "mal": "Use MyAnimeList Rating" "mdb_commonsense": "Use Commonsense Rating through MDbList", "mdb_commonsense0": "Use Commonsense Rating with Zero Padding through MDbList",
"mdb_age_rating": "Use MDbList Age Rating", "mdb_age_rating0": "Use MDbList Age Rating with Zero Padding",
"mal": "Use MyAnimeList Rating"
} }
mass_studio_options = { mass_studio_options = {
"lock": "Lock Rating", "unlock": "Unlock Rating", "remove": "Remove and Lock Rating", "reset": "Remove and Unlock Rating", "lock": "Lock Rating", "unlock": "Unlock Rating", "remove": "Remove and Lock Rating", "reset": "Remove and Unlock Rating",
@ -69,7 +72,7 @@ mass_original_title_options = {
} }
mass_available_options = { mass_available_options = {
"lock": "Lock Originally Available", "unlock": "Unlock Originally Available", "remove": "Remove and Lock Originally Available", "reset": "Remove and Unlock Originally Available", "lock": "Lock Originally Available", "unlock": "Unlock Originally Available", "remove": "Remove and Lock Originally Available", "reset": "Remove and Unlock Originally Available",
"tmdb": "Use TMDb Release", "omdb": "Use IMDb Release through OMDb", "mdb": "Use MdbList Release", "tvdb": "Use TVDb Release", "tmdb": "Use TMDb Release", "omdb": "Use IMDb Release through OMDb", "mdb": "Use MdbList Release", "mdb_digital": "Use MdbList Digital Release", "tvdb": "Use TVDb Release",
"anidb": "Use AniDB Release", "mal": "Use MyAnimeList Release" "anidb": "Use AniDB Release", "mal": "Use MyAnimeList Release"
} }
mass_image_options = { mass_image_options = {
@ -287,6 +290,7 @@ class ConfigFile:
if "omdb" in self.data: self.data["omdb"] = self.data.pop("omdb") if "omdb" in self.data: self.data["omdb"] = self.data.pop("omdb")
if "mdblist" in self.data: self.data["mdblist"] = self.data.pop("mdblist") if "mdblist" in self.data: self.data["mdblist"] = self.data.pop("mdblist")
if "notifiarr" in self.data: self.data["notifiarr"] = self.data.pop("notifiarr") if "notifiarr" in self.data: self.data["notifiarr"] = self.data.pop("notifiarr")
if "gotify" in self.data: self.data["gotify"] = self.data.pop("gotify")
if "anidb" in self.data: self.data["anidb"] = self.data.pop("anidb") if "anidb" in self.data: self.data["anidb"] = self.data.pop("anidb")
if "radarr" in self.data: if "radarr" in self.data:
if "monitor" in self.data["radarr"] and isinstance(self.data["radarr"]["monitor"], bool): if "monitor" in self.data["radarr"] and isinstance(self.data["radarr"]["monitor"], bool):
@ -524,6 +528,24 @@ class ConfigFile:
else: else:
logger.info("notifiarr attribute not found") logger.info("notifiarr attribute not found")
self.GotifyFactory = None
if "gotify" in self.data:
logger.info("Connecting to Gotify...")
try:
self.GotifyFactory = Gotify(self, {
"url": check_for_attribute(self.data, "url", parent="gotify", throw=True),
"token": check_for_attribute(self.data, "token", parent="gotify", throw=True)
})
except Failed as e:
if str(e).endswith("is blank"):
logger.warning(e)
else:
logger.stacktrace()
logger.error(e)
logger.info(f"Gotify Connection {'Failed' if self.GotifyFactory is None else 'Successful'}")
else:
logger.info("gotify attribute not found")
self.webhooks = { self.webhooks = {
"error": check_for_attribute(self.data, "error", parent="webhooks", var_type="list", default_is_none=True), "error": check_for_attribute(self.data, "error", parent="webhooks", var_type="list", default_is_none=True),
"version": check_for_attribute(self.data, "version", parent="webhooks", var_type="list", default_is_none=True), "version": check_for_attribute(self.data, "version", parent="webhooks", var_type="list", default_is_none=True),
@ -532,7 +554,7 @@ class ConfigFile:
"changes": check_for_attribute(self.data, "changes", parent="webhooks", var_type="list", default_is_none=True), "changes": check_for_attribute(self.data, "changes", parent="webhooks", var_type="list", default_is_none=True),
"delete": check_for_attribute(self.data, "delete", parent="webhooks", var_type="list", default_is_none=True) "delete": check_for_attribute(self.data, "delete", parent="webhooks", var_type="list", default_is_none=True)
} }
self.Webhooks = Webhooks(self, self.webhooks, notifiarr=self.NotifiarrFactory) self.Webhooks = Webhooks(self, self.webhooks, notifiarr=self.NotifiarrFactory, gotify=self.GotifyFactory)
try: try:
self.Webhooks.start_time_hooks(self.start_time) self.Webhooks.start_time_hooks(self.start_time)
if self.version[0] != "Unknown" and self.latest_version[0] != "Unknown" and self.version[1] != self.latest_version[1] or (self.version[2] and self.version[2] < self.latest_version[2]): if self.version[0] != "Unknown" and self.latest_version[0] != "Unknown" and self.version[1] != self.latest_version[1] or (self.version[2] and self.version[2] < self.latest_version[2]):
@ -717,11 +739,17 @@ class ConfigFile:
"url": check_for_attribute(self.data, "url", parent="plex", var_type="url", default_is_none=True), "url": check_for_attribute(self.data, "url", parent="plex", var_type="url", default_is_none=True),
"token": check_for_attribute(self.data, "token", parent="plex", default_is_none=True), "token": check_for_attribute(self.data, "token", parent="plex", default_is_none=True),
"timeout": check_for_attribute(self.data, "timeout", parent="plex", var_type="int", default=60), "timeout": check_for_attribute(self.data, "timeout", parent="plex", var_type="int", default=60),
"db_cache": check_for_attribute(self.data, "db_cache", parent="plex", var_type="int", default_is_none=True), "db_cache": check_for_attribute(self.data, "db_cache", parent="plex", var_type="int", default_is_none=True)
"clean_bundles": check_for_attribute(self.data, "clean_bundles", parent="plex", var_type="bool", default=False),
"empty_trash": check_for_attribute(self.data, "empty_trash", parent="plex", var_type="bool", default=False),
"optimize": check_for_attribute(self.data, "optimize", parent="plex", var_type="bool", default=False)
} }
for attr in ["clean_bundles", "empty_trash", "optimize"]:
try:
self.general["plex"][attr] = check_for_attribute(self.data, attr, parent="plex", var_type="bool", default=False, throw=True)
except Failed as e:
if "plex" in self.data and attr in self.data["plex"] and self.data["plex"][attr]:
self.general["plex"][attr] = self.data["plex"][attr]
else:
self.general["plex"][attr] = False
logger.warning(str(e).replace("Error", "Warning"))
self.general["radarr"] = { self.general["radarr"] = {
"url": check_for_attribute(self.data, "url", parent="radarr", var_type="url", default_is_none=True), "url": check_for_attribute(self.data, "url", parent="radarr", var_type="url", default_is_none=True),
"token": check_for_attribute(self.data, "token", parent="radarr", default_is_none=True), "token": check_for_attribute(self.data, "token", parent="radarr", default_is_none=True),
@ -845,8 +873,32 @@ class ConfigFile:
for op, data_type in library_operations.items(): for op, data_type in library_operations.items():
if op not in config_op: if op not in config_op:
continue continue
if isinstance(data_type, list): if op == "mass_imdb_parental_labels":
section_final[op] = check_for_attribute(config_op, op, test_list=data_type, default_is_none=True, save=False) section_final[op] = check_for_attribute(config_op, op, test_list=data_type, default_is_none=True, save=False)
elif isinstance(data_type, dict):
try:
if not config_op[op]:
raise Failed("is blank")
input_list = config_op[op] if isinstance(config_op[op], list) else [config_op[op]]
final_list = []
for list_attr in input_list:
if not list_attr:
raise Failed("has a blank value")
if str(list_attr).lower() in data_type:
final_list.append(str(list_attr).lower())
elif op in ["mass_content_rating_update", "mass_studio_update", "mass_original_title_update"]:
final_list.append(str(list_attr))
elif op == "mass_genre_update":
final_list.append(list_attr if isinstance(list_attr, list) else [list_attr])
elif op == "mass_originally_available_update":
final_list.append(util.validate_date(list_attr))
elif op.endswith("rating_update"):
final_list.append(util.check_int(list_attr, datatype="float", minimum=0, maximum=10, throw=True))
else:
raise Failed(f"has an invalid value: {list_attr}")
section_final[op] = final_list
except Failed as e:
logger.error(f"Config Error: {op} {e}")
elif op == "mass_collection_mode": elif op == "mass_collection_mode":
section_final[op] = util.check_collection_mode(config_op[op]) section_final[op] = util.check_collection_mode(config_op[op])
elif data_type == "dict": elif data_type == "dict":
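The dict-typed branch added in this hunk normalizes a scalar-or-list config value into a validated list: known keywords are lower-cased, blanks fail fast, anything else is rejected. A reduced sketch of that shape, where the allowed set is illustrative rather than the real per-operation tables:

```python
class Failed(Exception):
    pass

def validate_sources(value, allowed):
    # Accept either a single value or a list of values.
    if not value:
        raise Failed("is blank")
    input_list = value if isinstance(value, list) else [value]
    final_list = []
    for list_attr in input_list:
        if not list_attr:
            raise Failed("has a blank value")
        if str(list_attr).lower() in allowed:
            final_list.append(str(list_attr).lower())
        else:
            raise Failed(f"has an invalid value: {list_attr}")
    return final_list
```

The real code adds per-operation fallbacks (literal strings for `mass_content_rating_update`, dates for `mass_originally_available_update`, 0-10 floats for the rating operations) before rejecting a value.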
@ -901,24 +953,28 @@ class ConfigFile:
logger.warning(f"Config Warning: Operation {k} already scheduled") logger.warning(f"Config Warning: Operation {k} already scheduled")
for k, v in final_operations.items(): for k, v in final_operations.items():
params[k] = v params[k] = v
def error_check(err_attr, service):
logger.error(f"Config Error: Operation {err_attr} cannot be {params[err_attr]} without a successful {service} Connection")
params[err_attr] = None
for mass_key in operations.meta_operations: for mass_key in operations.meta_operations:
if not params[mass_key]: if not params[mass_key]:
continue continue
source = params[mass_key]["source"] if isinstance(params[mass_key], dict) else params[mass_key] sources = params[mass_key]["source"] if isinstance(params[mass_key], dict) else params[mass_key]
if source == "omdb" and self.OMDb is None: if not isinstance(sources, list):
error_check(mass_key, "OMDb") sources = [sources]
if source and source.startswith("mdb") and not self.Mdblist.has_key: try:
error_check(mass_key, "MdbList") for source in sources:
if source and source.startswith("anidb") and not self.AniDB.is_authorized: if source and source == "omdb" and self.OMDb is None:
error_check(mass_key, "AniDB") raise Failed(f"{source} without a successful OMDb Connection")
if source and source.startswith("mal") and self.MyAnimeList is None: if source and str(source).startswith("mdb") and not self.Mdblist.has_key:
error_check(mass_key, "MyAnimeList") raise Failed(f"{source} without a successful MdbList Connection")
if source and source.startswith("trakt") and self.Trakt is None: if source and str(source).startswith("anidb") and not self.AniDB.is_authorized:
error_check(mass_key, "Trakt") raise Failed(f"{source} without a successful AniDB Connection")
if source and str(source).startswith("mal") and self.MyAnimeList is None:
raise Failed(f"{source} without a successful MyAnimeList Connection")
if source and str(source).startswith("trakt") and self.Trakt is None:
raise Failed(f"{source} without a successful Trakt Connection")
except Failed as e:
logger.error(f"Config Error: {mass_key} cannot use {e}")
params[mass_key] = None
lib_vars = {} lib_vars = {}
if lib and "template_variables" in lib and lib["template_variables"] and isinstance(lib["template_variables"], dict): if lib and "template_variables" in lib and lib["template_variables"] and isinstance(lib["template_variables"], dict):
@ -1071,11 +1127,21 @@ class ConfigFile:
"url": check_for_attribute(lib, "url", parent="plex", var_type="url", default=self.general["plex"]["url"], req_default=True, save=False), "url": check_for_attribute(lib, "url", parent="plex", var_type="url", default=self.general["plex"]["url"], req_default=True, save=False),
"token": check_for_attribute(lib, "token", parent="plex", default=self.general["plex"]["token"], req_default=True, save=False), "token": check_for_attribute(lib, "token", parent="plex", default=self.general["plex"]["token"], req_default=True, save=False),
"timeout": check_for_attribute(lib, "timeout", parent="plex", var_type="int", default=self.general["plex"]["timeout"], save=False), "timeout": check_for_attribute(lib, "timeout", parent="plex", var_type="int", default=self.general["plex"]["timeout"], save=False),
"db_cache": check_for_attribute(lib, "db_cache", parent="plex", var_type="int", default=self.general["plex"]["db_cache"], default_is_none=True, save=False), "db_cache": check_for_attribute(lib, "db_cache", parent="plex", var_type="int", default=self.general["plex"]["db_cache"], default_is_none=True, save=False)
"clean_bundles": check_for_attribute(lib, "clean_bundles", parent="plex", var_type="bool", default=self.general["plex"]["clean_bundles"], save=False),
"empty_trash": check_for_attribute(lib, "empty_trash", parent="plex", var_type="bool", default=self.general["plex"]["empty_trash"], save=False),
"optimize": check_for_attribute(lib, "optimize", parent="plex", var_type="bool", default=self.general["plex"]["optimize"], save=False)
} }
for attr in ["clean_bundles", "empty_trash", "optimize"]:
try:
params["plex"][attr] = check_for_attribute(lib, attr, parent="plex", var_type="bool", save=False, throw=True)
except Failed as er:
test = lib["plex"][attr] if "plex" in lib and attr in lib["plex"] and lib["plex"][attr] else self.general["plex"][attr]
params["plex"][attr] = False
if test is not True and test is not False:
try:
util.schedule_check(attr, test, current_time, self.run_hour)
params["plex"][attr] = True
except NotScheduled:
logger.info(f"Skipping Operation Not Scheduled for {test}")
if params["plex"]["url"].lower() == "env": if params["plex"]["url"].lower() == "env":
params["plex"]["url"] = self.env_plex_url params["plex"]["url"] = self.env_plex_url
if params["plex"]["token"].lower() == "env": if params["plex"]["token"].lower() == "env":
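The per-library loop above lets `clean_bundles`, `empty_trash`, and `optimize` be either booleans or schedule strings, resolving non-booleans through `util.schedule_check`. A sketch of that bool-or-schedule resolution; the `daily`/`weekly(...)` grammar here is a simplification of PMM's real schedule syntax:

```python
import datetime

class NotScheduled(Exception):
    pass

def schedule_check(value, current_time):
    # Toy schedule grammar: "daily" always runs, "weekly(<day>)" runs on that day.
    if value == "daily":
        return
    if value.startswith("weekly(") and value.endswith(")"):
        if current_time.strftime("%A").lower() == value[7:-1].lower():
            return
    raise NotScheduled(value)

def resolve(value, current_time):
    if value is True or value is False:
        return value
    try:
        schedule_check(str(value), current_time)
        return True   # scheduled for this run
    except NotScheduled:
        return False  # valid schedule, just not today
```

Treating "not scheduled today" as `False` rather than an error is what lets one config value drive both always-on and periodic maintenance.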
@ -1175,7 +1241,7 @@ class ConfigFile:
logger.info("") logger.info("")
logger.info(f"{display_name} library's Tautulli Connection {'Failed' if library.Tautulli is None else 'Successful'}") logger.info(f"{display_name} library's Tautulli Connection {'Failed' if library.Tautulli is None else 'Successful'}")
library.Webhooks = Webhooks(self, {}, library=library, notifiarr=self.NotifiarrFactory) library.Webhooks = Webhooks(self, {}, library=library, notifiarr=self.NotifiarrFactory, gotify=self.GotifyFactory)
library.Overlays = Overlays(self, library) library.Overlays = Overlays(self, library)
logger.info("") logger.info("")

@ -0,0 +1,99 @@
from json import JSONDecodeError
from modules import util
from modules.util import Failed
logger = util.logger
class Gotify:
def __init__(self, config, params):
self.config = config
self.token = params["token"]
self.url = params["url"].rstrip("/")
logger.secret(self.url)
logger.secret(self.token)
try:
logger.info(f"Gotify Version: {self._request(path='version', post=False)['version']}")
except Exception:
logger.stacktrace()
raise Failed("Gotify Error: Invalid URL")
def _request(self, path="message", json=None, post=True):
if post:
response = self.config.post(f"{self.url}/{path}", headers={"X-Gotify-Key": self.token}, json=json)
else:
response = self.config.get(f"{self.url}/{path}")
try:
response_json = response.json()
except JSONDecodeError as e:
logger.error(response.content)
logger.debug(e)
raise e
if response.status_code >= 400:
raise Failed(f"({response.status_code} [{response.reason}]) {response_json['errorDescription']}")
return response_json
def notification(self, json):
message = ""
if json["event"] == "run_end":
title = "Run Completed"
message = f"Start Time: {json['start_time']}\n" \
f"End Time: {json['end_time']}\n" \
f"Run Time: {json['run_time']}\n" \
f"Collections Created: {json['collections_created']}\n" \
f"Collections Modified: {json['collections_modified']}\n" \
f"Collections Deleted: {json['collections_deleted']}\n" \
f"Items Added: {json['items_added']}\n" \
f"Items Removed: {json['items_removed']}"
if json["added_to_radarr"]:
message += f"\n{json['added_to_radarr']} Movies Added To Radarr"
if json["added_to_sonarr"]:
message += f"\n{json['added_to_sonarr']} Series Added To Sonarr"
elif json["event"] == "run_start":
title = "Run Started"
message = json["start_time"]
elif json["event"] == "version":
title = "New Version Available"
message = f"Current: {json['current']}\n" \
f"Latest: {json['latest']}\n" \
f"Notes: {json['notes']}"
elif json["event"] == "delete":
if "library_name" in json:
title = "Collection Deleted"
else:
title = "Playlist Deleted"
message = json["message"]
else:
new_line = "\n"
if "server_name" in json:
message += f"{new_line if message else ''}Server: {json['server_name']}"
if "library_name" in json:
message += f"{new_line if message else ''}Library: {json['library_name']}"
if "collection" in json:
message += f"{new_line if message else ''}Collection: {json['collection']}"
if "playlist" in json:
message += f"{new_line if message else ''}Playlist: {json['playlist']}"
if json["event"] == "error":
if "collection" in json:
title_name = "Collection"
elif "playlist" in json:
title_name = "Playlist"
elif "library_name" in json:
title_name = "Library"
else:
title_name = "Global"
title = f"{'Critical ' if json['critical'] else ''}{title_name} Error"
message += f"{new_line if message else ''}Error Message: {json['error']}"
else:
title = f"{'Collection' if 'collection' in json else 'Playlist'} {'Created' if json['created'] else 'Modified'}"
if json['radarr_adds']:
message += f"{new_line if message else ''}{len(json['radarr_adds'])} Radarr Additions:"
if json['sonarr_adds']:
message += f"{new_line if message else ''}{len(json['sonarr_adds'])} Sonarr Additions:"
message += f"{new_line if message else ''}{len(json['additions'])} Additions:"
for add_dict in json['additions']:
message += f"\n{add_dict['title']}"
message += f"{new_line if message else ''}{len(json['removals'])} Removals:"
for add_dict in json['removals']:
message += f"\n{add_dict['title']}"
self._request(json={"message": message, "title": title})
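The `_request` method above targets Gotify's documented REST API: `POST {url}/message` authenticated with the `X-Gotify-Key` header (and `GET /version` as an unauthenticated health check). Building the request as plain data makes the wiring testable without a live server; sending it is then a single `requests.post`. The helper name and the `priority` default are illustrative:

```python
def build_gotify_request(url, token, title, message, priority=5):
    # Mirror the class above: strip a trailing slash, auth via X-Gotify-Key.
    return {
        "url": f"{url.rstrip('/')}/message",
        "headers": {"X-Gotify-Key": token},
        "json": {"title": title, "message": message, "priority": priority},
    }

req = build_gotify_request("http://gotify.local/", "AppToken123",
                           "Run Started", "2024-01-01 00:00:00")
# To actually send it (assuming requests is installed):
#   import requests
#   requests.post(req["url"], headers=req["headers"], json=req["json"])
```

Gotify distinguishes application tokens (allowed to post messages) from client tokens, so the token configured here must be an application token.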

@ -643,6 +643,9 @@ class IMDb:
def get_rating(self, imdb_id): def get_rating(self, imdb_id):
return self.ratings[imdb_id] if imdb_id in self.ratings else None return self.ratings[imdb_id] if imdb_id in self.ratings else None
def get_genres(self, imdb_id):
return self.genres[imdb_id] if imdb_id in self.genres else []
def get_episode_rating(self, imdb_id, season_num, episode_num): def get_episode_rating(self, imdb_id, season_num, episode_num):
season_num = str(season_num) season_num = str(season_num)
episode_num = str(episode_num) episode_num = str(episode_num)

@ -16,7 +16,6 @@ class Library(ABC):
self.Webhooks = None self.Webhooks = None
self.Operations = Operations(config, self) self.Operations = Operations(config, self)
self.Overlays = None self.Overlays = None
self.Notifiarr = None
self.collections = [] self.collections = []
self.collection_names = [] self.collection_names = []
self.metadatas = [] self.metadatas = []
@ -129,7 +128,6 @@ class Library(ABC):
self.library_operation = True if self.items_library_operation or self.delete_collections or self.mass_collection_mode \ self.library_operation = True if self.items_library_operation or self.delete_collections or self.mass_collection_mode \
or self.radarr_remove_by_tag or self.sonarr_remove_by_tag or self.show_unmanaged or self.show_unconfigured \ or self.radarr_remove_by_tag or self.sonarr_remove_by_tag or self.show_unmanaged or self.show_unconfigured \
or self.metadata_backup or self.update_blank_track_titles else False or self.metadata_backup or self.update_blank_track_titles else False
self.meta_operations = [i["source"] if isinstance(i, dict) else i for i in [getattr(self, o) for o in operations.meta_operations] if i]
self.label_operations = True if self.assets_for_all or self.mass_imdb_parental_labels else False self.label_operations = True if self.assets_for_all or self.mass_imdb_parental_labels else False
if self.asset_directory: if self.asset_directory:

@ -28,6 +28,10 @@ class MDbObj:
self.released = datetime.strptime(data["released"], "%Y-%m-%d") self.released = datetime.strptime(data["released"], "%Y-%m-%d")
except (ValueError, TypeError): except (ValueError, TypeError):
self.released = None self.released = None
try:
self.released_digital = datetime.strptime(data["released_digital"], "%Y-%m-%d")
except (ValueError, TypeError):
self.released_digital = None
self.type = data["type"] self.type = data["type"]
self.imdbid = data["imdbid"] self.imdbid = data["imdbid"]
self.traktid = util.check_num(data["traktid"]) self.traktid = util.check_num(data["traktid"])
@ -64,6 +68,7 @@ class MDbObj:
self.myanimelist_rating = util.check_num(rating["value"], is_int=False) self.myanimelist_rating = util.check_num(rating["value"], is_int=False)
self.content_rating = data["certification"] self.content_rating = data["certification"]
self.commonsense = data["commonsense"] self.commonsense = data["commonsense"]
self.age_rating = data["age_rating"]
class Mdblist: class Mdblist:
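The new `released_digital` field above reuses `MDbObj`'s optional-date idiom: `datetime.strptime` raises `TypeError` when the field is `None` and `ValueError` when the string is malformed, so one except clause covers both "missing" and "garbled". As a standalone sketch:

```python
from datetime import datetime

def parse_optional_date(value):
    # None -> TypeError, bad format -> ValueError; both mean "no usable date".
    try:
        return datetime.strptime(value, "%Y-%m-%d")
    except (ValueError, TypeError):
        return None
```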

@ -98,8 +98,6 @@ class Operations:
ep_lock_edits = {} ep_lock_edits = {}
ep_unlock_edits = {} ep_unlock_edits = {}
trakt_ratings = self.config.Trakt.user_ratings(self.library.is_movie) if any([o == "trakt_user" for o in self.library.meta_operations]) else []
reverse_anidb = {} reverse_anidb = {}
for k, v in self.library.anidb_map.items(): for k, v in self.library.anidb_map.items():
reverse_anidb[v] = k reverse_anidb[v] = k
@ -164,110 +162,163 @@ class Operations:
path = path[:-1] if path.endswith(("/", "\\")) else path path = path[:-1] if path.endswith(("/", "\\")) else path
sonarr_adds.append((tvdb_id, path)) sonarr_adds.append((tvdb_id, path))
tmdb_item = None _trakt_ratings = None
if any([o == "tmdb" for o in self.library.meta_operations]): def trakt_ratings():
nonlocal _trakt_ratings
if _trakt_ratings is None:
_trakt_ratings = self.config.Trakt.user_ratings(self.library.is_movie)
if not _trakt_ratings:
raise Failed
return _trakt_ratings
_tmdb_obj = None
def tmdb_obj():
nonlocal _tmdb_obj
if _tmdb_obj is None:
_tmdb_obj = False
try: try:
tmdb_item = self.config.TMDb.get_item(item, tmdb_id, tvdb_id, imdb_id, is_movie=self.library.is_movie) _item = self.config.TMDb.get_item(item, tmdb_id, tvdb_id, imdb_id, is_movie=self.library.is_movie)
except Failed as e: if _item:
logger.error(str(e)) _tmdb_obj = _item
except Failed as err:
logger.error(str(err))
if not _tmdb_obj:
raise Failed
return _tmdb_obj
omdb_item = None _omdb_obj = None
if any([o == "omdb" for o in self.library.meta_operations]): def omdb_obj():
nonlocal _omdb_obj
if _omdb_obj is None:
_omdb_obj = False
if self.config.OMDb.limit is not False: if self.config.OMDb.limit is not False:
logger.error("Daily OMDb Limit Reached") logger.error("Daily OMDb Limit Reached")
elif not imdb_id: elif not imdb_id:
logger.info(f"No IMDb ID for Guid: {item.guid}") logger.info(f"No IMDb ID for Guid: {item.guid}")
else: else:
try: try:
omdb_item = self.config.OMDb.get_omdb(imdb_id) _omdb_obj = self.config.OMDb.get_omdb(imdb_id)
except Failed as e: except Failed as err:
logger.error(str(e)) logger.error(str(err))
except Exception: except Exception:
logger.error(f"IMDb ID: {imdb_id}") logger.error(f"IMDb ID: {imdb_id}")
raise raise
if not _omdb_obj:
raise Failed
return _omdb_obj
tvdb_item = None _tvdb_obj = None
if any([o == "tvdb" for o in self.library.meta_operations]): def tvdb_obj():
nonlocal _tvdb_obj
if _tvdb_obj is None:
_tvdb_obj = False
if tvdb_id: if tvdb_id:
try: try:
tvdb_item = self.config.TVDb.get_tvdb_obj(tvdb_id, is_movie=self.library.is_movie) _tvdb_obj = self.config.TVDb.get_tvdb_obj(tvdb_id, is_movie=self.library.is_movie)
except Failed as e: except Failed as err:
logger.error(str(e)) logger.error(str(err))
else: else:
logger.info(f"No TVDb ID for Guid: {item.guid}") logger.info(f"No TVDb ID for Guid: {item.guid}")
if not _tvdb_obj:
raise Failed
return _tvdb_obj
anidb_item = None _mdb_obj = None
mal_item = None def mdb_obj():
if any([o.startswith("anidb") or o.startswith("mal") for o in self.library.meta_operations]): nonlocal _mdb_obj
if _mdb_obj is None:
_mdb_obj = False
if self.config.Mdblist.limit is False:
if self.library.is_show and tvdb_id:
try:
_mdb_obj = self.config.Mdblist.get_series(tvdb_id)
except LimitReached as err:
logger.debug(err)
except Failed as err:
logger.error(str(err))
except Exception:
logger.trace(f"TVDb ID: {tvdb_id}")
raise
if self.library.is_movie and tmdb_id:
try:
_mdb_obj = self.config.Mdblist.get_movie(tmdb_id)
except LimitReached as err:
logger.debug(err)
except Failed as err:
logger.error(str(err))
except Exception:
logger.trace(f"TMDb ID: {tmdb_id}")
raise
if imdb_id and not _mdb_obj:
try:
_mdb_obj = self.config.Mdblist.get_imdb(imdb_id)
except LimitReached as err:
logger.debug(err)
except Failed as err:
logger.error(str(err))
except Exception:
logger.trace(f"IMDb ID: {imdb_id}")
raise
if not _mdb_obj:
logger.warning(f"No MdbItem for {item.title} (Guid: {item.guid})")
if not _mdb_obj:
raise Failed
return _mdb_obj
anidb_id = None
def get_anidb_id():
if item.ratingKey in reverse_anidb: if item.ratingKey in reverse_anidb:
anidb_id = reverse_anidb[item.ratingKey] return reverse_anidb[item.ratingKey]
elif tvdb_id in self.config.Convert._tvdb_to_anidb: elif tvdb_id in self.config.Convert._tvdb_to_anidb:
anidb_id = self.config.Convert._tvdb_to_anidb[tvdb_id] return self.config.Convert._tvdb_to_anidb[tvdb_id]
elif imdb_id in self.config.Convert._imdb_to_anidb: elif imdb_id in self.config.Convert._imdb_to_anidb:
anidb_id = self.config.Convert._imdb_to_anidb[imdb_id] return self.config.Convert._imdb_to_anidb[imdb_id]
else: else:
anidb_id = None return False
if any([o.startswith("anidb") for o in self.library.meta_operations]):
_anidb_obj = None
def anidb_obj():
nonlocal anidb_id, _anidb_obj
if _anidb_obj is None:
_anidb_obj = False
if anidb_id is None:
anidb_id = get_anidb_id()
if anidb_id: if anidb_id:
try: try:
anidb_item = self.config.AniDB.get_anime(anidb_id) _anidb_obj = self.config.AniDB.get_anime(anidb_id)
except Failed as e: except Failed as err:
logger.error(str(e)) logger.error(str(err))
else: else:
logger.warning(f"No AniDB ID for Guid: {item.guid}") logger.warning(f"No AniDB ID for Guid: {item.guid}")
if any([o.startswith("mal") for o in self.library.meta_operations]): if not _anidb_obj:
raise Failed
return _anidb_obj
_mal_obj = None
def mal_obj():
nonlocal anidb_id, _mal_obj
if _mal_obj is None:
_mal_obj = False
if anidb_id is None:
anidb_id = get_anidb_id()
mal_id = None
if item.ratingKey in reverse_mal: if item.ratingKey in reverse_mal:
mal_id = reverse_mal[item.ratingKey] mal_id = reverse_mal[item.ratingKey]
elif not anidb_id: elif not anidb_id:
logger.warning(f"Convert Warning: No AniDB ID to Convert to MyAnimeList ID for Guid: {item.guid}") logger.warning(f"Convert Warning: No AniDB ID to Convert to MyAnimeList ID for Guid: {item.guid}")
mal_id = None
elif anidb_id not in self.config.Convert._anidb_to_mal: elif anidb_id not in self.config.Convert._anidb_to_mal:
logger.warning(f"Convert Warning: No MyAnimeList Found for AniDB ID: {anidb_id} of Guid: {item.guid}") logger.warning(f"Convert Warning: No MyAnimeList Found for AniDB ID: {anidb_id} of Guid: {item.guid}")
mal_id = None
else: else:
mal_id = self.config.Convert._anidb_to_mal[anidb_id] mal_id = self.config.Convert._anidb_to_mal[anidb_id]
if mal_id: if mal_id:
try: try:
mal_item = self.config.MyAnimeList.get_anime(mal_id) _mal_obj = self.config.MyAnimeList.get_anime(mal_id)
except Failed as e: except Failed as err:
logger.error(str(e)) logger.error(str(err))
if not _mal_obj:
raise Failed
return _mal_obj
mdb_item = None
if any([o and o.startswith("mdb") for o in self.library.meta_operations]):
if self.config.Mdblist.limit is False:
try:
if self.library.is_show and tvdb_id and mdb_item is None:
try:
mdb_item = self.config.Mdblist.get_series(tvdb_id)
except Failed as e:
logger.trace(str(e))
except Exception:
logger.trace(f"TVDb ID: {tvdb_id}")
raise
if tmdb_id and mdb_item is None:
try:
mdb_item = self.config.Mdblist.get_movie(tmdb_id)
except LimitReached as e:
logger.debug(e)
except Failed as e:
logger.trace(str(e))
except Exception:
logger.trace(f"TMDb ID: {tmdb_id}")
raise
if imdb_id and mdb_item is None:
try:
mdb_item = self.config.Mdblist.get_imdb(imdb_id)
except LimitReached as e:
logger.debug(e)
except Failed as e:
logger.trace(str(e))
except Exception:
logger.trace(f"IMDb ID: {imdb_id}")
raise
if mdb_item is None:
logger.warning(f"No MdbItem for {item.title} (Guid: {item.guid})")
except LimitReached as e:
logger.debug(e)
for attribute, item_attr in [ for attribute, item_attr in [
(self.library.mass_audience_rating_update, "audienceRating"), (self.library.mass_audience_rating_update, "audienceRating"),
(self.library.mass_critic_rating_update, "rating"), (self.library.mass_critic_rating_update, "rating"),
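The refactor in this hunk replaces eager per-item lookups (`tmdb_item`, `omdb_item`, `mdb_item`, ...) with lazy closures: each source fetches on first use, caches the result in a `nonlocal`, and distinguishes "not tried yet" (`None`) from "tried and failed" (`False`), raising `Failed` so every consumer handles a missing source the same way. A generic sketch of that pattern (the call counter exists only to demonstrate single-fetch behavior):

```python
def make_lazy(fetch):
    _cache = None          # None = not tried, False = tried and failed
    calls = {"n": 0}
    def get():
        nonlocal _cache
        if _cache is None:
            _cache = False
            calls["n"] += 1
            try:
                result = fetch()
                if result:
                    _cache = result
            except Exception:
                pass       # a failed fetch leaves the False sentinel in place
        if not _cache:
            raise LookupError("source unavailable")
        return _cache
    get.calls = calls
    return get

rating = make_lazy(lambda: {"vote_average": 8.7})
```

Because failure is cached too, an item that has no TMDb match pays for exactly one lookup no matter how many operations ask for it.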
@ -275,105 +326,124 @@ class Operations:
]: ]:
if attribute: if attribute:
current = getattr(item, item_attr) current = getattr(item, item_attr)
if attribute == "remove" and current is not None: for option in attribute:
if option in ["lock", "remove"]:
if option == "remove" and current:
if item_attr not in remove_edits: if item_attr not in remove_edits:
remove_edits[item_attr] = [] remove_edits[item_attr] = []
remove_edits[item_attr].append(item.ratingKey) remove_edits[item_attr].append(item.ratingKey)
             item_edits += f"\nRemove {name_display[item_attr]} (Batched)"
-        elif attribute == "reset" and current is not None:
+        elif item_attr not in locked_fields:
+            if item_attr not in lock_edits:
+                lock_edits[item_attr] = []
+            lock_edits[item_attr].append(item.ratingKey)
+            item_edits += f"\nLock {name_display[item_attr]} (Batched)"
+        break
+    elif option in ["unlock", "reset"]:
+        if option == "reset" and current:
             if item_attr not in reset_edits:
                 reset_edits[item_attr] = []
             reset_edits[item_attr].append(item.ratingKey)
             item_edits += f"\nReset {name_display[item_attr]} (Batched)"
-        elif attribute in ["unlock", "reset"] and item_attr in locked_fields:
+        elif item_attr in locked_fields:
             if item_attr not in unlock_edits:
                 unlock_edits[item_attr] = []
             unlock_edits[item_attr].append(item.ratingKey)
             item_edits += f"\nUnlock {name_display[item_attr]} (Batched)"
-        elif attribute in ["lock", "remove"] and item_attr not in locked_fields:
-            if item_attr not in lock_edits:
-                lock_edits[item_attr] = []
-            lock_edits[item_attr].append(item.ratingKey)
-            item_edits += f"\nLock {name_display[item_attr]} (Batched)"
-        elif attribute not in ["lock", "unlock", "remove", "reset"]:
-            if tmdb_item and attribute == "tmdb":
-                found_rating = tmdb_item.vote_average
-            elif imdb_id and attribute == "imdb":
-                found_rating = self.config.IMDb.get_rating(imdb_id)
-            elif attribute == "trakt_user" and self.library.is_movie and tmdb_id in trakt_ratings:
-                found_rating = trakt_ratings[tmdb_id]
-            elif attribute == "trakt_user" and self.library.is_show and tvdb_id in trakt_ratings:
-                found_rating = trakt_ratings[tvdb_id]
-            elif omdb_item and attribute == "omdb":
-                found_rating = omdb_item.imdb_rating
-            elif mdb_item and attribute == "mdb":
-                found_rating = mdb_item.score / 10 if mdb_item.score else None
-            elif mdb_item and attribute == "mdb_average":
-                found_rating = mdb_item.average / 10 if mdb_item.average else None
-            elif mdb_item and attribute == "mdb_imdb":
-                found_rating = mdb_item.imdb_rating if mdb_item.imdb_rating else None
-            elif mdb_item and attribute == "mdb_metacritic":
-                found_rating = mdb_item.metacritic_rating / 10 if mdb_item.metacritic_rating else None
-            elif mdb_item and attribute == "mdb_metacriticuser":
-                found_rating = mdb_item.metacriticuser_rating if mdb_item.metacriticuser_rating else None
-            elif mdb_item and attribute == "mdb_trakt":
-                found_rating = mdb_item.trakt_rating / 10 if mdb_item.trakt_rating else None
-            elif mdb_item and attribute == "mdb_tomatoes":
-                found_rating = mdb_item.tomatoes_rating / 10 if mdb_item.tomatoes_rating else None
-            elif mdb_item and attribute == "mdb_tomatoesaudience":
-                found_rating = mdb_item.tomatoesaudience_rating / 10 if mdb_item.tomatoesaudience_rating else None
-            elif mdb_item and attribute == "mdb_tmdb":
-                found_rating = mdb_item.tmdb_rating / 10 if mdb_item.tmdb_rating else None
-            elif mdb_item and attribute == "mdb_letterboxd":
-                found_rating = mdb_item.letterboxd_rating * 2 if mdb_item.letterboxd_rating else None
-            elif mdb_item and attribute == "mdb_myanimelist":
-                found_rating = mdb_item.myanimelist_rating if mdb_item.myanimelist_rating else None
-            elif anidb_item and attribute == "anidb_rating":
-                found_rating = anidb_item.rating
-            elif anidb_item and attribute == "anidb_average":
-                found_rating = anidb_item.average
-            elif anidb_item and attribute == "anidb_score":
-                found_rating = anidb_item.score
-            elif mal_item and attribute == "mal":
-                found_rating = mal_item.score
-            else:
-                found_rating = None
-            if found_rating and float(found_rating) > 0:
-                found_rating = f"{float(found_rating):.1f}"
-                if str(current) != found_rating:
-                    if found_rating not in rating_edits[item_attr]:
-                        rating_edits[item_attr][found_rating] = []
-                    rating_edits[item_attr][found_rating].append(item.ratingKey)
-                    item_edits += f"\nUpdate {name_display[item_attr]} (Batched) | {found_rating}"
-            else:
-                logger.info(f"No {name_display[item_attr]} Found")
+        break
+    else:
+        try:
+            if option == "tmdb":
+                found_rating = tmdb_obj().vote_average  # noqa
+            elif option == "imdb":
+                found_rating = self.config.IMDb.get_rating(imdb_id)
+            elif option == "trakt_user":
+                _ratings = trakt_ratings()
+                _id = tmdb_id if self.library.is_movie else tvdb_id
+                if _id in _ratings:
+                    found_rating = _ratings[_id]
+                else:
+                    raise Failed
+            elif str(option).startswith("mdb"):
+                mdb_item = mdb_obj()
+                if option == "mdb_average":
+                    found_rating = mdb_item.average / 10 if mdb_item.average else None  # noqa
+                elif option == "mdb_imdb":
+                    found_rating = mdb_item.imdb_rating if mdb_item.imdb_rating else None  # noqa
+                elif option == "mdb_metacritic":
+                    found_rating = mdb_item.metacritic_rating / 10 if mdb_item.metacritic_rating else None  # noqa
+                elif option == "mdb_metacriticuser":
+                    found_rating = mdb_item.metacriticuser_rating if mdb_item.metacriticuser_rating else None  # noqa
+                elif option == "mdb_trakt":
+                    found_rating = mdb_item.trakt_rating / 10 if mdb_item.trakt_rating else None  # noqa
+                elif option == "mdb_tomatoes":
+                    found_rating = mdb_item.tomatoes_rating / 10 if mdb_item.tomatoes_rating else None  # noqa
+                elif option == "mdb_tomatoesaudience":
+                    found_rating = mdb_item.tomatoesaudience_rating / 10 if mdb_item.tomatoesaudience_rating else None  # noqa
+                elif option == "mdb_tmdb":
+                    found_rating = mdb_item.tmdb_rating / 10 if mdb_item.tmdb_rating else None  # noqa
+                elif option == "mdb_letterboxd":
+                    found_rating = mdb_item.letterboxd_rating * 2 if mdb_item.letterboxd_rating else None  # noqa
+                elif option == "mdb_myanimelist":
+                    found_rating = mdb_item.myanimelist_rating if mdb_item.myanimelist_rating else None  # noqa
+                else:
+                    found_rating = mdb_item.score / 10 if mdb_item.score else None  # noqa
+            elif option == "anidb_rating":
+                found_rating = anidb_obj().rating  # noqa
+            elif option == "anidb_average":
+                found_rating = anidb_obj().average  # noqa
+            elif option == "anidb_score":
+                found_rating = anidb_obj().score  # noqa
+            elif option == "mal":
+                found_rating = mal_obj().score  # noqa
+            else:
+                found_rating = option
+            if not found_rating:
+                logger.info(f"No {option} {name_display[item_attr]} Found")
+                raise Failed
+            found_rating = f"{float(found_rating):.1f}"
+            if str(current) != found_rating:
+                if found_rating not in rating_edits[item_attr]:
+                    rating_edits[item_attr][found_rating] = []
+                rating_edits[item_attr][found_rating].append(item.ratingKey)
+                item_edits += f"\nUpdate {name_display[item_attr]} (Batched) | {found_rating}"
+            break
+        except Failed:
+            continue
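The hunk above replaces per-attribute `if tmdb_item and attribute == ...` chains with a loop over a *list* of configured sources: each source is tried in order, a `Failed` falls through to the next option via `continue`, and the first usable value wins via `break`. A minimal standalone sketch of that fallback pattern (names like `resolve_rating` and the `sources` dict are illustrative, not PMM's actual API):

```python
class Failed(Exception):
    """Stand-in for modules.util.Failed."""


def resolve_rating(options, sources):
    """Return the first rating any configured source can provide, else None."""
    for option in options:
        try:
            if option not in sources:
                raise Failed
            found = sources[option]()  # lazily query the service
            if not found:
                raise Failed
            return f"{float(found):.1f}"  # Plex stores ratings to one decimal
        except Failed:
            continue  # fall through to the next configured source
    return None


sources = {"tmdb": lambda: None, "imdb": lambda: 8.34}
print(resolve_rating(["tmdb", "imdb"], sources))  # tmdb has nothing -> "8.3" from imdb
```

The lazy `tmdb_obj()` / `mdb_obj()` callables in the real diff serve the same purpose as the lambdas here: a service is only queried when its option is actually reached.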
 if self.library.mass_genre_update or self.library.genre_mapper:
-    try:
-        new_genres = []
-        if self.library.mass_genre_update and self.library.mass_genre_update not in ["lock", "unlock", "remove", "reset"]:
-            if tmdb_item and self.library.mass_genre_update == "tmdb":
-                new_genres = tmdb_item.genres
-            elif imdb_id and self.library.mass_genre_update == "imdb" and imdb_id in self.config.IMDb.genres:
-                new_genres = self.config.IMDb.genres[imdb_id]
-            elif omdb_item and self.library.mass_genre_update == "omdb":
-                new_genres = omdb_item.genres
-            elif tvdb_item and self.library.mass_genre_update == "tvdb":
-                new_genres = tvdb_item.genres
-            elif anidb_item and self.library.mass_genre_update in anidb.weights:
-                logger.trace(anidb_item.main_title)
-                logger.trace(anidb_item.tags)
-                new_genres = [str(t).title() for t, w in anidb_item.tags.items() if w >= anidb.weights[self.library.mass_genre_update]]
-            elif mal_item and self.library.mass_genre_update == "mal":
-                new_genres = mal_item.genres
-            else:
-                raise Failed
-            if not new_genres:
-                logger.info("No Genres Found")
-        if self.library.genre_mapper or self.library.mass_genre_update in ["lock", "unlock"]:
-            if not new_genres and self.library.mass_genre_update not in ["remove", "reset"]:
-                new_genres = [g.tag for g in item.genres]
+    if self.library.mass_genre_update:
+        new_genres = []
+        extra_option = None
+        for option in self.library.mass_genre_update:
+            if option in ["lock", "unlock", "remove", "reset"]:
+                extra_option = option
+                break
+            try:
+                if option == "tmdb":
+                    new_genres = tmdb_obj().genres  # noqa
+                elif option == "imdb":
+                    new_genres = self.config.IMDb.get_genres(imdb_id)
+                elif option == "omdb":
+                    new_genres = omdb_obj().genres  # noqa
+                elif option == "tvdb":
+                    new_genres = tvdb_obj().genres  # noqa
+                elif str(option) in anidb.weights:
+                    new_genres = [str(t).title() for t, w in anidb_obj().tags.items() if w >= anidb.weights[str(option)]]  # noqa
+                elif option == "mal":
+                    new_genres = mal_obj().genres  # noqa
+                else:
+                    new_genres = option
+                if not new_genres:
+                    logger.info(f"No {option} Genres Found")
+                    raise Failed
+                break
+            except Failed:
+                continue
+    item_genres = [g.tag for g in item.genres]
+    if not new_genres and extra_option not in ["remove", "reset"]:
+        new_genres = item_genres
     if self.library.genre_mapper:
         mapped_genres = []
         for genre in new_genres:
@@ -383,7 +453,6 @@ class Operations:
             else:
                 mapped_genres.append(genre)
         new_genres = mapped_genres
-    item_genres = [g.tag for g in item.genres]
     _add = list(set(new_genres) - set(item_genres))
     _remove = list(set(item_genres) - set(new_genres))
     for genre_list, edit_type in [(_add, "add"), (_remove, "remove")]:
@@ -393,41 +462,63 @@ class Operations:
                 genre_edits[edit_type][g] = []
             genre_edits[edit_type][g].append(item.ratingKey)
         item_edits += f"\n{edit_type.capitalize()} Genres (Batched) | {', '.join(genre_list)}"
-    if self.library.mass_genre_update in ["unlock", "reset"] and ("genre" in locked_fields or _add or _remove):
+    if extra_option in ["unlock", "reset"] and ("genre" in locked_fields or _add or _remove):
         if "genre" not in unlock_edits:
             unlock_edits["genre"] = []
         unlock_edits["genre"].append(item.ratingKey)
         item_edits += "\nUnlock Genre (Batched)"
-    elif self.library.mass_genre_update in ["lock", "remove"] and "genre" not in locked_fields and not _add and not _remove:
+    elif extra_option in ["lock", "remove"] and "genre" not in locked_fields and not _add and not _remove:
         if "genre" not in lock_edits:
             lock_edits["genre"] = []
         lock_edits["genre"].append(item.ratingKey)
         item_edits += "\nLock Genre (Batched)"
-    except Failed:
-        pass
 if self.library.mass_content_rating_update or self.library.content_rating_mapper:
-    try:
-        new_rating = None
-        if self.library.mass_content_rating_update and self.library.mass_content_rating_update not in ["lock", "unlock", "remove", "reset"]:
-            if omdb_item and self.library.mass_content_rating_update == "omdb":
-                new_rating = omdb_item.content_rating
-            elif mdb_item and self.library.mass_content_rating_update == "mdb":
-                new_rating = mdb_item.content_rating if mdb_item.content_rating else None
-            elif mdb_item and self.library.mass_content_rating_update == "mdb_commonsense":
-                new_rating = mdb_item.commonsense if mdb_item.commonsense else None
-            elif mdb_item and self.library.mass_content_rating_update == "mdb_commonsense0":
-                new_rating = str(mdb_item.commonsense).rjust(2, "0") if mdb_item.commonsense else None
-            elif mal_item and self.library.mass_content_rating_update == "mal":
-                new_rating = mal_item.rating
-            else:
-                raise Failed
-        if new_rating is None:
-            logger.info("No Content Rating Found")
-        else:
-            new_rating = str(new_rating)
+    if self.library.mass_content_rating_update:
+        new_rating = None
+        extra_option = None
+        for option in self.library.mass_content_rating_update:
+            if option in ["lock", "unlock", "remove", "reset"]:
+                extra_option = option
+                break
+            try:
+                if option == "omdb":
+                    new_rating = omdb_obj().content_rating  # noqa
+                elif option == "mdb":
+                    _rating = mdb_obj().content_rating  # noqa
+                    new_rating = _rating if _rating else None
+                elif str(option).startswith("mdb_commonsense"):
+                    _rating = mdb_obj().commonsense  # noqa
+                    if not _rating:
+                        new_rating = None
+                    elif option == "mdb_commonsense0":
+                        new_rating = str(_rating).rjust(2, "0")
+                    else:
+                        new_rating = _rating
+                elif str(option).startswith("mdb_age_rating"):
+                    _rating = mdb_obj().age_rating  # noqa
+                    if not _rating:
+                        new_rating = None
+                    elif option == "mdb_age_rating0":
+                        new_rating = str(_rating).rjust(2, "0")
+                    else:
+                        new_rating = _rating
+                elif option == "mal":
+                    new_rating = mal_obj().rating  # noqa
+                else:
+                    new_rating = option
+                if new_rating is None:
+                    logger.info(f"No {option} Content Rating Found")
+                    raise Failed
+                else:
+                    new_rating = str(new_rating)
+                break
+            except Failed:
+                continue
     is_none = False
+    do_lock = False
+    do_unlock = False
     current_rating = item.contentRating
     if not new_rating:
         new_rating = current_rating
@@ -436,182 +527,199 @@ class Operations:
         new_rating = self.library.content_rating_mapper[new_rating]
     if not new_rating:
         is_none = True
-    has_edit = False
-    if (is_none or self.library.mass_content_rating_update == "remove") and current_rating:
-        if "contentRating" not in remove_edits:
-            remove_edits["contentRating"] = []
-        remove_edits["contentRating"].append(item.ratingKey)
-        item_edits += "\nRemove Content Rating (Batched)"
-    elif self.library.mass_content_rating_update == "reset" and current_rating:
+    if extra_option == "reset":
+        if current_rating:
             if "contentRating" not in reset_edits:
                 reset_edits["contentRating"] = []
             reset_edits["contentRating"].append(item.ratingKey)
             item_edits += "\nReset Content Rating (Batched)"
+        elif "contentRating" in locked_fields:
+            do_unlock = True
+    elif extra_option == "remove" or is_none:
+        if current_rating:
+            if "contentRating" not in remove_edits:
+                remove_edits["contentRating"] = []
+            remove_edits["contentRating"].append(item.ratingKey)
+            item_edits += "\nRemove Content Rating (Batched)"
+        elif "contentRating" not in locked_fields:
+            do_lock = True
     elif new_rating and new_rating != current_rating:
         if new_rating not in content_edits:
             content_edits[new_rating] = []
         content_edits[new_rating].append(item.ratingKey)
         item_edits += f"\nUpdate Content Rating (Batched) | {new_rating}"
-        has_edit = True
-    if self.library.mass_content_rating_update in ["unlock", "reset"] and ("contentRating" in locked_fields or has_edit):
-        if "contentRating" not in unlock_edits:
-            unlock_edits["contentRating"] = []
-        unlock_edits["contentRating"].append(item.ratingKey)
-        item_edits += "\nUnlock Content Rating (Batched)"
-    elif self.library.mass_content_rating_update in ["lock", "remove"] and "contentRating" not in locked_fields and not has_edit:
+        do_lock = False
+    if extra_option == "lock" or do_lock:
         if "contentRating" not in lock_edits:
             lock_edits["contentRating"] = []
         lock_edits["contentRating"].append(item.ratingKey)
         item_edits += "\nLock Content Rating (Batched)"
-    except Failed:
-        pass
+    elif extra_option == "unlock" or do_unlock:
+        if "contentRating" not in unlock_edits:
+            unlock_edits["contentRating"] = []
+        unlock_edits["contentRating"].append(item.ratingKey)
+        item_edits += "\nUnlock Content Rating (Batched)"
 if self.library.mass_original_title_update:
     current_original = item.originalTitle
-    has_edit = False
-    if self.library.mass_original_title_update == "remove" and current_original:
+    for option in self.library.mass_original_title_update:
+        if option in ["lock", "remove"]:
+            if option == "remove" and current_original:
                 if "originalTitle" not in remove_edits:
                     remove_edits["originalTitle"] = []
                 remove_edits["originalTitle"].append(item.ratingKey)
                 item_edits += "\nRemove Original Title (Batched)"
-    elif self.library.mass_original_title_update == "reset" and current_original:
+            elif "originalTitle" not in locked_fields:
+                if "originalTitle" not in lock_edits:
+                    lock_edits["originalTitle"] = []
+                lock_edits["originalTitle"].append(item.ratingKey)
+                item_edits += "\nLock Original Title (Batched)"
+            break
+        elif option in ["unlock", "reset"]:
+            if option == "reset" and current_original:
                 if "originalTitle" not in reset_edits:
                     reset_edits["originalTitle"] = []
                 reset_edits["originalTitle"].append(item.ratingKey)
                 item_edits += "\nReset Original Title (Batched)"
-    elif self.library.mass_original_title_update not in ["lock", "unlock", "remove", "reset"]:
+            elif "originalTitle" in locked_fields:
+                if "originalTitle" not in unlock_edits:
+                    unlock_edits["originalTitle"] = []
+                unlock_edits["originalTitle"].append(item.ratingKey)
+                item_edits += "\nUnlock Original Title (Batched)"
+            break
+        else:
             try:
-                if anidb_item and self.library.mass_original_title_update == "anidb":
-                    new_original_title = anidb_item.main_title
-                elif anidb_item and self.library.mass_original_title_update == "anidb_official":
-                    new_original_title = anidb_item.official_title
-                elif mal_item and self.library.mass_original_title_update == "mal":
-                    new_original_title = mal_item.title
-                elif mal_item and self.library.mass_original_title_update == "mal_english":
-                    new_original_title = mal_item.title_english
-                elif mal_item and self.library.mass_original_title_update == "mal_japanese":
-                    new_original_title = mal_item.title_japanese
+                if option == "anidb":
+                    new_original_title = anidb_obj().main_title  # noqa
+                elif option == "anidb_official":
+                    new_original_title = anidb_obj().official_title  # noqa
+                elif option == "mal":
+                    new_original_title = mal_obj().title  # noqa
+                elif option == "mal_english":
+                    new_original_title = mal_obj().title_english  # noqa
+                elif option == "mal_japanese":
+                    new_original_title = mal_obj().title_japanese  # noqa
                 else:
-                    raise Failed
+                    new_original_title = option
                 if not new_original_title:
-                    logger.info("No Original Title Found")
-                elif str(current_original) != str(new_original_title):
+                    logger.info(f"No {option} Original Title Found")
+                    raise Failed
+                if str(current_original) != str(new_original_title):
                     item.editOriginalTitle(new_original_title)
                     item_edits += f"\nUpdated Original Title | {new_original_title}"
-                    has_edit = True
+                break
             except Failed:
-                pass
-    if self.library.mass_original_title_update in ["unlock", "reset"] and ("originalTitle" in locked_fields or has_edit):
-        if "originalTitle" not in unlock_edits:
-            unlock_edits["originalTitle"] = []
-        unlock_edits["originalTitle"].append(item.ratingKey)
-        item_edits += "\nUnlock Original Title (Batched)"
-    elif self.library.mass_original_title_update in ["lock", "remove"] and "originalTitle" not in locked_fields and not has_edit:
-        if "originalTitle" not in lock_edits:
-            lock_edits["originalTitle"] = []
-        lock_edits["originalTitle"].append(item.ratingKey)
-        item_edits += "\nLock Original Title (Batched)"
+                continue
 if self.library.mass_studio_update:
     current_studio = item.studio
-    has_edit = False
-    if self.library.mass_studio_update == "remove" and current_studio:
+    for option in self.library.mass_studio_update:
+        if option in ["lock", "remove"]:
+            if option == "remove" and current_studio:
                 if "studio" not in remove_edits:
                     remove_edits["studio"] = []
                 remove_edits["studio"].append(item.ratingKey)
                 item_edits += "\nRemove Studio (Batched)"
-    elif self.library.mass_studio_update == "reset" and current_studio:
+            elif "studio" not in locked_fields:
+                if "studio" not in lock_edits:
+                    lock_edits["studio"] = []
+                lock_edits["studio"].append(item.ratingKey)
+                item_edits += "\nLock Studio (Batched)"
+            break
+        elif option in ["unlock", "reset"]:
+            if option == "reset" and current_studio:
                 if "studio" not in reset_edits:
                     reset_edits["studio"] = []
                 reset_edits["studio"].append(item.ratingKey)
                 item_edits += "\nReset Studio (Batched)"
-    elif self.library.mass_studio_update not in ["lock", "unlock", "remove", "reset"]:
+            elif "studio" in locked_fields:
+                if "studio" not in unlock_edits:
+                    unlock_edits["studio"] = []
+                unlock_edits["studio"].append(item.ratingKey)
+                item_edits += "\nUnlock Studio (Batched)"
+            break
+        else:
             try:
-                if anidb_item and self.library.mass_studio_update == "anidb":
-                    new_studio = anidb_item.studio
-                elif mal_item and self.library.mass_studio_update == "mal":
-                    new_studio = mal_item.studio
-                elif tmdb_item and self.library.mass_studio_update == "tmdb":
-                    new_studio = tmdb_item.studio
+                if option == "tmdb":
+                    new_studio = tmdb_obj().studio  # noqa
+                elif option == "anidb":
+                    new_studio = anidb_obj().studio  # noqa
+                elif option == "mal":
+                    new_studio = mal_obj().studio  # noqa
                 else:
-                    raise Failed
+                    new_studio = option
                 if not new_studio:
-                    logger.info("No Studio Found")
-                elif str(current_studio) != str(new_studio):
+                    logger.info(f"No {option} Studio Found")
+                    raise Failed
+                if str(current_studio) != str(new_studio):
                     if new_studio not in studio_edits:
                         studio_edits[new_studio] = []
                     studio_edits[new_studio].append(item.ratingKey)
                     item_edits += f"\nUpdate Studio (Batched) | {new_studio}"
-                    has_edit = True
+                break
             except Failed:
-                pass
-    if self.library.mass_studio_update in ["unlock", "reset"] and ("studio" in locked_fields or has_edit):
-        if "studio" not in unlock_edits:
-            unlock_edits["studio"] = []
-        unlock_edits["studio"].append(item.ratingKey)
-        item_edits += "\nUnlock Studio (Batched)"
-    elif self.library.mass_studio_update in ["lock", "remove"] and "studio" not in locked_fields and not has_edit:
-        if "studio" not in lock_edits:
-            lock_edits["studio"] = []
-        lock_edits["studio"].append(item.ratingKey)
-        item_edits += "\nLock Studio (Batched)"
+                continue
 if self.library.mass_originally_available_update:
     current_available = item.originallyAvailableAt
     if current_available:
         current_available = current_available.strftime("%Y-%m-%d")
-    has_edit = False
-    if self.library.mass_originally_available_update == "remove" and current_available:
+    for option in self.library.mass_originally_available_update:
+        if option in ["lock", "remove"]:
+            if option == "remove" and current_available:
                 if "originallyAvailableAt" not in remove_edits:
                     remove_edits["originallyAvailableAt"] = []
                 remove_edits["originallyAvailableAt"].append(item.ratingKey)
                 item_edits += "\nRemove Originally Available Date (Batched)"
-    elif self.library.mass_originally_available_update == "reset" and current_available:
+            elif "originallyAvailableAt" not in locked_fields:
+                if "originallyAvailableAt" not in lock_edits:
+                    lock_edits["originallyAvailableAt"] = []
+                lock_edits["originallyAvailableAt"].append(item.ratingKey)
+                item_edits += "\nLock Originally Available Date (Batched)"
+            break
+        elif option in ["unlock", "reset"]:
+            if option == "reset" and current_available:
                 if "originallyAvailableAt" not in reset_edits:
                     reset_edits["originallyAvailableAt"] = []
                 reset_edits["originallyAvailableAt"].append(item.ratingKey)
                 item_edits += "\nReset Originally Available Date (Batched)"
-    elif self.library.mass_originally_available_update not in ["lock", "unlock", "remove", "reset"]:
+            elif "originallyAvailableAt" in locked_fields:
+                if "originallyAvailableAt" not in unlock_edits:
+                    unlock_edits["originallyAvailableAt"] = []
+                unlock_edits["originallyAvailableAt"].append(item.ratingKey)
+                item_edits += "\nUnlock Originally Available Date (Batched)"
+            break
+        else:
             try:
-                if omdb_item and self.library.mass_originally_available_update == "omdb":
-                    new_available = omdb_item.released
-                elif mdb_item and self.library.mass_originally_available_update == "mdb":
-                    new_available = mdb_item.released
-                elif tvdb_item and self.library.mass_originally_available_update == "tvdb":
-                    new_available = tvdb_item.release_date
-                elif tmdb_item and self.library.mass_originally_available_update == "tmdb":
-                    new_available = tmdb_item.release_date if self.library.is_movie else tmdb_item.first_air_date
-                elif anidb_item and self.library.mass_originally_available_update == "anidb":
-                    new_available = anidb_item.released
-                elif mal_item and self.library.mass_originally_available_update == "mal":
-                    new_available = mal_item.aired
+                if option == "tmdb":
+                    new_available = tmdb_obj().release_date if self.library.is_movie else tmdb_obj().first_air_date  # noqa
+                elif option == "omdb":
+                    new_available = omdb_obj().released  # noqa
+                elif option == "tvdb":
+                    new_available = tvdb_obj().release_date  # noqa
+                elif option == "mdb":
+                    new_available = mdb_obj().released  # noqa
+                elif option == "mdb_digital":
+                    new_available = mdb_obj().released_digital  # noqa
+                elif option == "anidb":
+                    new_available = anidb_obj().released  # noqa
+                elif option == "mal":
+                    new_available = mal_obj().aired  # noqa
                 else:
-                    raise Failed
-                if new_available:
-                    new_available = new_available.strftime("%Y-%m-%d")
-                    if current_available != new_available:
-                        if new_available not in available_edits:
-                            available_edits[new_available] = []
-                        available_edits[new_available].append(item.ratingKey)
-                        item_edits += f"\nUpdate Originally Available Date (Batched) | {new_available}"
-                        has_edit = True
-                else:
-                    logger.info("No Originally Available Date Found")
+                    new_available = option
+                if not new_available:
+                    logger.info(f"No {option} Originally Available Date Found")
+                    raise Failed
+                new_available = new_available.strftime("%Y-%m-%d")
+                if current_available != new_available:
+                    if new_available not in available_edits:
+                        available_edits[new_available] = []
+                    available_edits[new_available].append(item.ratingKey)
+                    item_edits += f"\nUpdate Originally Available Date (Batched) | {new_available}"
+                break
             except Failed:
-                pass
-    if self.library.mass_originally_available_update in ["unlock", "reset"] and ("originallyAvailableAt" in locked_fields or has_edit):
-        if "originallyAvailableAt" not in unlock_edits:
-            unlock_edits["originallyAvailableAt"] = []
-        unlock_edits["originallyAvailableAt"].append(item.ratingKey)
-        item_edits += "\nUnlock Originally Available Date (Batched)"
-    elif self.library.mass_originally_available_update in ["lock", "remove"] and "originallyAvailableAt" not in locked_fields and not has_edit:
-        if "originallyAvailableAt" not in lock_edits:
-            lock_edits["originallyAvailableAt"] = []
-        lock_edits["originallyAvailableAt"].append(item.ratingKey)
-        item_edits += "\nLock Originally Available Date (Batched)"
+                continue
 if len(item_edits) > 0:
     logger.info(f"Item Edits{item_edits}")
@@ -626,10 +734,14 @@ class Operations:
 name = None
 new_poster = None
 new_background = None
+try:
+    tmdb_item = tmdb_obj()
+except Failed:
+    tmdb_item = None
 if self.library.mass_poster_update:
-    self.library.poster_update(item, new_poster, tmdb=tmdb_item.poster_url if tmdb_item else None, title=item.title)
+    self.library.poster_update(item, new_poster, tmdb=tmdb_item.poster_url if tmdb_item else None, title=item.title)  # noqa
 if self.library.mass_background_update:
-    self.library.background_update(item, new_background, tmdb=tmdb_item.backdrop_url if tmdb_item else None, title=item.title)
+    self.library.background_update(item, new_background, tmdb=tmdb_item.backdrop_url if tmdb_item else None, title=item.title)  # noqa
 if self.library.is_show and (
     (self.library.mass_poster_update and
@@ -639,7 +751,7 @@ class Operations:
 ):
     real_show = None
     try:
-        real_show = tmdb_item.load_show() if tmdb_item else None
+        real_show = tmdb_item.load_show() if tmdb_item else None  # noqa
     except Failed as e:
         logger.error(e)
     tmdb_seasons = {s.season_number: s for s in real_show.seasons} if real_show else {}
@@ -725,10 +837,14 @@ class Operations:
         ep_lock_edits[item_attr].append(ep)
         item_edits += f"\nLock {name_display[item_attr]} (Batched)"
 elif attribute not in ["lock", "unlock", "remove", "reset"]:
+    try:
+        tmdb_item = tmdb_obj()
+    except Failed:
+        tmdb_item = None
     found_rating = None
     if tmdb_item and attribute == "tmdb":
         try:
-            found_rating = self.config.TMDb.get_episode(tmdb_item.tmdb_id, ep.seasonNumber, ep.episodeNumber).vote_average
+            found_rating = self.config.TMDb.get_episode(tmdb_item.tmdb_id, ep.seasonNumber, ep.episodeNumber).vote_average  # noqa
         except Failed as er:
             logger.error(er)
     elif imdb_id and attribute == "imdb":
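Throughout these hunks, every `*_edits` dict follows the same bookkeeping: instead of issuing one Plex edit per item, rating keys are grouped under the value they should receive and flushed later as one batched call per distinct value. A minimal sketch of that accumulation step (the `accumulate` helper name is illustrative, not PMM's actual API):

```python
def accumulate(edits, value, rating_key):
    """Queue rating_key to receive value; the queue is flushed in one batch call per value."""
    if value not in edits:
        edits[value] = []
    edits[value].append(rating_key)


studio_edits = {}
accumulate(studio_edits, "A24", 101)
accumulate(studio_edits, "A24", 102)
accumulate(studio_edits, "Studio Ghibli", 103)
# studio_edits now maps each distinct studio to the items needing it,
# so applying it costs one batched edit per studio, not one per item
```

This is why the lock/unlock/reset/remove branches above all append to `lock_edits`, `unlock_edits`, `reset_edits`, or `remove_edits` rather than editing `item` directly.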
@@ -724,13 +724,16 @@ def schedule_check(attribute, data, current_time, run_hour, is_all=False):
         raise NotScheduled(schedule_str)
     return schedule_str

-def check_int(value, datatype="int", minimum=1, maximum=None):
+def check_int(value, datatype="int", minimum=1, maximum=None, throw=False):
     try:
         value = int(str(value)) if datatype == "int" else float(str(value))
         if (maximum is None and minimum <= value) or (maximum is not None and minimum <= value <= maximum):
             return value
     except ValueError:
-        pass
+        if throw:
+            message = f"{value} must be {'an integer' if datatype == 'int' else 'a number'}"
+            raise Failed(f"{message} {minimum} or greater" if maximum is None else f"{message} between {minimum} and {maximum}")
+    return None

 def parse_and_or(error, attribute, data, test_list):
     out = ""
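Extracted from the `check_int` hunk above, the revised helper is runnable standalone (`Failed` is stubbed here; in PMM it comes from `modules.util`). Note the behavior of `throw=True`: the raise sits inside the `ValueError` handler, so only unparseable values raise; an out-of-range value still returns `None` silently.

```python
class Failed(Exception):
    """Stand-in for modules.util.Failed."""


def check_int(value, datatype="int", minimum=1, maximum=None, throw=False):
    try:
        value = int(str(value)) if datatype == "int" else float(str(value))
        if (maximum is None and minimum <= value) or (maximum is not None and minimum <= value <= maximum):
            return value
    except ValueError:
        if throw:
            message = f"{value} must be {'an integer' if datatype == 'int' else 'a number'}"
            raise Failed(f"{message} {minimum} or greater" if maximum is None else f"{message} between {minimum} and {maximum}")
    return None


print(check_int("5"))    # 5
print(check_int("0"))    # None (below the default minimum of 1, no exception)
print(check_int("x"))    # None (unparseable; throw defaults to False)
```

With `throw=True`, `check_int("x", throw=True)` raises `Failed("x must be an integer 1 or greater")`, which lets config parsing surface a descriptive error instead of a silent `None`.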
@@ -5,7 +5,7 @@ from modules.util import Failed, YAML
 logger = util.logger

 class Webhooks:
-    def __init__(self, config, system_webhooks, library=None, notifiarr=None):
+    def __init__(self, config, system_webhooks, library=None, notifiarr=None, gotify=None):
         self.config = config
         self.error_webhooks = system_webhooks["error"] if "error" in system_webhooks else []
         self.version_webhooks = system_webhooks["version"] if "version" in system_webhooks else []
@@ -14,6 +14,7 @@ class Webhooks:
         self.delete_webhooks = system_webhooks["delete"] if "delete" in system_webhooks else []
         self.library = library
         self.notifiarr = notifiarr
+        self.gotify = gotify

     def _request(self, webhooks, json):
         logger.trace("")
@@ -30,6 +31,9 @@ class Webhooks:
                 response = self.notifiarr.notification(json)
                 if response.status_code < 500:
                     break
+            elif webhook == "gotify":
+                if self.gotify:
+                    self.gotify.notification(json)
             else:
                 if webhook.startswith("https://discord.com/api/webhooks"):
                     json = self.discord(json)
@@ -326,3 +330,4 @@ class Webhooks:
             fields.append(field)
         new_json["embeds"][0]["fields"] = fields
         return new_json
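The new `gotify` branch above delegates to a Gotify client object. A stdlib-only sketch of what such a client might look like: the endpoint and `X-Gotify-Key` header follow Gotify's `POST /message` API, but the class shape and the payload field mapping here are assumptions for illustration, not PMM's actual Gotify module.

```python
import json as jsonlib
from urllib import request


def build_gotify_payload(webhook_json):
    """Map a PMM webhook body onto Gotify's message fields (illustrative mapping)."""
    return {
        "title": str(webhook_json.get("event", "Plex Meta Manager")),
        "message": str(webhook_json.get("error", webhook_json)),
        "priority": 5,
    }


class Gotify:
    def __init__(self, url, token):
        self.url = url.rstrip("/")  # e.g. http://gotify.example (hypothetical host)
        self.token = token          # a Gotify application token

    def notification(self, webhook_json):
        # Gotify authenticates application messages via the X-Gotify-Key header
        req = request.Request(
            f"{self.url}/message",
            data=jsonlib.dumps(build_gotify_payload(webhook_json)).encode("utf-8"),
            headers={"X-Gotify-Key": self.token, "Content-Type": "application/json"},
        )
        return request.urlopen(req)
```

Because `_request` dispatches on the literal webhook value `"gotify"`, users opt in by listing `gotify` among their webhooks, mirroring how the existing `notifiarr` branch works.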
@@ -927,11 +927,11 @@ def run_playlists(config):
 #logger.add_playlist_handler(playlist_log_name)
 status[mapping_name] = {"status": "Unchanged", "errors": [], "added": 0, "unchanged": 0, "removed": 0, "radarr": 0, "sonarr": 0}
 server_name = None
-library_names = None
 try:
     builder = CollectionBuilder(config, playlist_file, mapping_name, playlist_attrs, extra=output_str)
     stats["names"].append(builder.name)
     logger.info("")
+    server_name = builder.libraries[0].PlexServer.friendlyName
     logger.separator(f"Running {mapping_name} Playlist", space=False, border=False)
@ -1049,7 +1049,7 @@ def run_playlists(config):
except Deleted as e: except Deleted as e:
logger.info(e) logger.info(e)
status[mapping_name]["status"] = "Deleted" status[mapping_name]["status"] = "Deleted"
config.notify_delete(e) config.notify_delete(e, server=server_name)
except NotScheduled as e: except NotScheduled as e:
logger.info(e) logger.info(e)
if str(e).endswith("and was deleted"): if str(e).endswith("and was deleted"):
@ -1059,13 +1059,13 @@ def run_playlists(config):
else: else:
status[mapping_name]["status"] = "Not Scheduled" status[mapping_name]["status"] = "Not Scheduled"
except Failed as e: except Failed as e:
config.notify(e, server=server_name, library=library_names, playlist=mapping_name) config.notify(e, server=server_name, playlist=mapping_name)
logger.stacktrace() logger.stacktrace()
logger.error(e) logger.error(e)
status[mapping_name]["status"] = "PMM Failure" status[mapping_name]["status"] = "PMM Failure"
status[mapping_name]["errors"].append(e) status[mapping_name]["errors"].append(e)
except Exception as e: except Exception as e:
config.notify(f"Unknown Error: {e}", server=server_name, library=library_names, playlist=mapping_name) config.notify(f"Unknown Error: {e}", server=server_name, playlist=mapping_name)
logger.stacktrace() logger.stacktrace()
logger.error(f"Unknown Error: {e}") logger.error(f"Unknown Error: {e}")
status[mapping_name]["status"] = "Unknown Error" status[mapping_name]["status"] = "Unknown Error"

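This change drops the unused `library_names` from playlist failure notifications and instead captures the Plex server's friendly name inside the `try` block, as soon as the builder exists, so every exception handler can report it. A minimal sketch of that capture-early pattern (all names here are hypothetical, not PMM's actual API):

```python
def run_one(make_builder, notify):
    """Run a single build, reporting the server name on failure when known.

    server_name starts as None and is filled in as soon as the builder
    resolves it, so an exception raised at any point can still be reported
    with whatever identifying information exists by then.
    """
    server_name = None
    try:
        builder = make_builder()          # may raise before a server is resolved
        server_name = builder["server"]   # captured as soon as it is known
        return builder["result"]
    except Exception as e:
        notify(str(e), server=server_name)  # server_name is None on early failure
        return None

# usage: a successful run leaves the failure log empty
reports = []
ok = run_one(lambda: {"server": "Plex", "result": "built"},
             lambda msg, server: reports.append(server))
```

The same shape explains why `server_name = None` stays above the `try`: handlers must be able to read it even when `CollectionBuilder` itself raises.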