Merge pull request #590 from meisnate12/develop

v1.15.0
pull/597/head v1.15.0
meisnate12 3 years ago committed by GitHub
commit 3acb867139
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23

@@ -5,34 +5,104 @@
[![Docker Image Version (latest semver)](https://img.shields.io/docker/v/meisnate12/plex-meta-manager?label=docker&sort=semver&style=plastic)](https://hub.docker.com/r/meisnate12/plex-meta-manager)
[![Docker Cloud Build Status](https://img.shields.io/docker/cloud/build/meisnate12/plex-meta-manager?style=plastic)](https://hub.docker.com/r/meisnate12/plex-meta-manager)
[![Docker Pulls](https://img.shields.io/docker/pulls/meisnate12/plex-meta-manager?style=plastic)](https://hub.docker.com/r/meisnate12/plex-meta-manager)
[![Discord](https://img.shields.io/discord/822460010649878528?label=Discord&style=plastic)](https://discord.gg/NfH6mGFuAB)
[![Sponsor or Donate](https://img.shields.io/badge/-Sponsor_or_Donate-blueviolet?style=plastic)](https://github.com/sponsors/meisnate12)
The original concept for Plex Meta Manager is [Plex Auto Collections](https://github.com/mza921/Plex-Auto-Collections), but this is rewritten from the ground up to be able to include a scheduler, metadata edits, multiple libraries, and logging. Plex Meta Manager is a Python 3 script that can be continuously run using YAML configuration files to update on a schedule the metadata of the movies, shows, and collections in your libraries as well as automatically build collections based on various methods all detailed in the wiki. Some collection examples that the script can automatically build and update daily include Plex Based Searches like actor, genre, or studio collections or Collections based on TMDb, IMDb, Trakt, TVDb, AniDB, or MyAnimeList lists and various other services.
The script can update many metadata fields for movies, shows, collections, seasons, and episodes and can act as a backup if your plex DB goes down. If the time is put into the metadata configuration file you can have a way to recreate your library and all its metadata changes with the click of a button.
The script works with most Metadata agents including the New Plex Movie Agent, New Plex TV Agent, [Hama Anime Agent](https://github.com/ZeroQI/Hama.bundle), [MyAnimeList Anime Agent](https://github.com/Fribb/MyAnimeList.bundle), and [XBMC NFO Movie and TV Agents](https://github.com/gboudreau/XBMCnfoMoviesImporter.bundle).
![Collections1](https://raw.githubusercontent.com/wiki/meisnate12/Plex-Meta-Manager/collections1.png)
![Collections2](https://raw.githubusercontent.com/wiki/meisnate12/Plex-Meta-Manager/collections2.png)
## Getting Started
1. Install Plex Meta Manager either by installing Python3 and following the [Local Walkthrough](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Local-Walkthrough)
or by installing Docker and following the [Docker Walkthrough](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Docker-Walkthrough) or the [unRAID Walkthrough](https://github.com/meisnate12/Plex-Meta-Manager/wiki/unRAID-Walkthrough).
2. Once installed, you have to create a [Configuration File](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Configuration-File) filled with all your values to connect to the various services.
3. After that you can start updating Metadata and building automatic Collections by creating a [Metadata File](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Metadata-and-Playlist-File) for each Library you want to interact with.
4. Explore the [Wiki](https://github.com/meisnate12/Plex-Meta-Manager/wiki) to see all the different Collection Builders that can be used to create collections.
## Wiki
The [Wiki](https://github.com/meisnate12/Plex-Meta-Manager/wiki) details every available option you have with Plex Meta Manager; its Table of Contents is below.
## Example Community Metadata Files
To see user-submitted Metadata configuration files, or even to add your own, go to the [Plex Meta Manager Configs](https://github.com/meisnate12/Plex-Meta-Manager-Configs) repository.
## Support Discord
Before posting on GitHub about an enhancement, error, or configuration question please visit the [Plex Meta Manager Discord Server](https://discord.gg/NfH6mGFuAB); **it is without a doubt the best place to get support**.
## Feature Requests, Errors, and Configuration Questions
* If you have an idea for how to enhance Plex Meta Manager please open a new [Feature Request](https://github.com/meisnate12/Plex-Meta-Manager/issues/new?assignees=meisnate12&labels=status%3Anot-yet-viewed%2C+enhancement&template=feature_request.md&title=Feature+Request%3A+).
* If you're getting an Error please update to the latest develop branch and then open a [Bug Report](https://github.com/meisnate12/Plex-Meta-Manager/issues/new?assignees=meisnate12&labels=status%3Anot-yet-viewed%2C+bug&template=bug_report.md&title=Bug%3A+) if it's still happening.
* If you have a metadata configuration question post in the [Discussions](https://github.com/meisnate12/Plex-Meta-Manager/discussions).
## Development Build
* There is a [develop](https://github.com/meisnate12/Plex-Meta-Manager/tree/develop) branch which will have the most updated fixes and enhancements to the script.
* To access the Docker Image for the develop branch use the `develop` tag by adding `:develop` to the image name, i.e. `meisnate12/plex-meta-manager:develop`.
## Contributing
* Pull Requests are welcome but please submit them to the develop branch.
* If you wish to contribute to the Wiki please fork and send a pull request on the [Plex Meta Manager Wiki Repository](https://github.com/meisnate12/Plex-Meta-Manager-Wiki).
## IBRACORP Video Walkthrough
[IBRACORP](https://ibracorp.io/) made a video walkthrough for installing Plex Meta Manager on unRAID. While you might not be using unRAID, the video goes over many key aspects of Plex Meta Manager and can be a great place to start learning how to use the script.
[![Plex Meta Manager](https://img.youtube.com/vi/dF69MNoot3w/0.jpg)](https://www.youtube.com/watch?v=dF69MNoot3w "Plex Meta Manager")
## Wiki Table of Contents
- [Home](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Home)
- [Installation](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Installation)
- [Local Walkthrough](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Local-Walkthrough)
- [Docker Walkthrough](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Docker-Walkthrough)
- [unRAID Walkthrough](https://github.com/meisnate12/Plex-Meta-Manager/wiki/unRAID-Walkthrough)
- [Configuration File](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Configuration-File)
- [Libraries Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Libraries-Attributes)
- [Operations Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Operations-Attributes)
- [Playlist Files Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Playlist-Files-Attributes)
- [Settings Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Settings-Attributes)
- [Image Asset Directory](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Image-Asset-Directory)
- [Webhooks Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Webhooks-Attributes)
- [Plex Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Plex-Attributes)
- [TMDb Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/TMDb-Attributes)
- [Tautulli Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Tautulli-Attributes)
- [OMDb Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/OMDb-Attributes)
- [Notifiarr Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Notifiarr-Attributes)
- [AniDB Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/AniDB-Attributes)
- [Radarr Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Radarr-Attributes)
- [Sonarr Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Sonarr-Attributes)
- [Trakt Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Trakt-Attributes)
- [MyAnimeList Attributes](https://github.com/meisnate12/Plex-Meta-Manager/wiki/MyAnimeList-Attributes)
- [Metadata and Playlist Files](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Metadata-and-Playlist-Files)
- Metadata
- [Movies Metadata](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Movies-Metadata)
- [Shows Metadata](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Shows-Metadata)
- [Artists Metadata](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Artists-Metadata)
- [Templates](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Templates)
- [Filters](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Filters)
- Builders
- [Plex Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Plex-Builders)
- [Smart Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Smart-Builders)
- [TMDb Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/TMDb-Builders)
- [TVDb Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/TVDb-Builders)
- [IMDb Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/IMDb-Builders)
- [Trakt Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Trakt-Builders)
- [Tautulli Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Tautulli-Builders)
- [Letterboxd Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Letterboxd-Builders)
- [ICheckMovies Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/ICheckMovies-Builders)
- [FlixPatrol Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/FlixPatrol-Builders)
- [StevenLu Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/StevenLu-Builders)
- [AniDB Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/AniDB-Builders)
- [AniList Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/AniList-Builders)
- [MyAnimeList Builders](https://github.com/meisnate12/Plex-Meta-Manager/wiki/MyAnimeList-Builders)
- Details
- [Setting Details](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Setting-Details)
- [Schedule Detail](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Schedule-Detail)
- [Image Overlay Detail](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Image-Overlay-Detail)
- [Metadata Details](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Metadata-Details)
- [Arr Details](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Arr-Details)
- [Acknowledgements](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Acknowledgements)

@@ -1 +1 @@
1.15.0

@@ -14,7 +14,8 @@ libraries: # Library mappings must have a c
- file: config/Anime.yml # You have to create this file the other is online
- git: meisnate12/AnimeCharts
playlist_files:
- file: config/playlists.yml # You have to create this file the other is online
- git: meisnate12/Playlists
settings: # Can be individually specified per library as well
cache: true
cache_expiration: 60
@@ -23,9 +24,11 @@ settings: # Can be individually specified
asset_depth: 0
create_asset_folders: false
dimensional_asset_rename: false
download_url_assets: false
show_missing_season_assets: false
sync_mode: append
minimum_items: 1
default_collection_order:
delete_below_minimum: true
delete_not_scheduled: false
run_again_delay: 2
@@ -41,11 +44,12 @@ settings: # Can be individually specified
ignore_ids:
ignore_imdb_ids:
playlist_sync_to_user: all
verify_ssl: true
webhooks: # Can be individually specified per library as well
error:
run_start:
run_end:
changes:
plex: # Can be individually specified per library as well; REQUIRED for the script to run
url: http://192.168.1.12:32400
token: ####################
@@ -69,7 +73,8 @@ anidb: # Not required for AniDB builder
radarr: # Can be individually specified per library as well
url: http://192.168.1.12:7878
token: ################################
add_missing: false
add_existing: false
root_folder_path: S:/Movies
monitor: true
availability: announced
@@ -81,7 +86,8 @@ radarr: # Can be individually specified
sonarr: # Can be individually specified per library as well
url: http://192.168.1.12:8989
token: ################################
add_missing: false
add_existing: false
root_folder_path: "S:/TV Shows"
monitor: all
quality_profile: HD-1080p
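The template above shows the renamed webhook option (`collection_changes` becomes `changes`). A minimal sketch, assuming the migration logic introduced in this commit, of how the old per-event webhook keys could be collapsed into the single `changes` value; `collapse_hooks` and its inputs are hypothetical names for illustration:

```python
# Hypothetical sketch: collect every configured legacy hook, then store None,
# a single value, or the list under the new "changes" key.
def collapse_hooks(webhooks: dict) -> dict:
    changes = []

    def hooks(attr):
        # Pop a legacy key and remember its value once.
        if attr in webhooks:
            value = webhooks.pop(attr)
            if value and value not in changes:
                changes.append(value)

    for attr in ("collection_creation", "collection_addition",
                 "collection_removal", "collection_changes"):
        hooks(attr)
    # None when nothing was set, the bare value for one hook, else the list.
    webhooks["changes"] = None if not changes else changes if len(changes) > 1 else changes[0]
    return webhooks

print(collapse_hooks({"error": "notifiarr", "collection_addition": "notifiarr"}))
# {'error': 'notifiarr', 'changes': 'notifiarr'}
```

This keeps old config files loadable while the file is rewritten with the new shape.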

File diff suppressed because it is too large

@@ -22,6 +22,7 @@ class Cache:
cursor.execute("DROP TABLE IF EXISTS imdb_to_tvdb_map")
cursor.execute("DROP TABLE IF EXISTS tmdb_to_tvdb_map")
cursor.execute("DROP TABLE IF EXISTS imdb_map")
cursor.execute("DROP TABLE IF EXISTS omdb_data")
cursor.execute(
"""CREATE TABLE IF NOT EXISTS guids_map (
key INTEGER PRIMARY KEY,
@@ -69,7 +70,7 @@ class Cache:
expiration_date TEXT)"""
)
cursor.execute(
"""CREATE TABLE IF NOT EXISTS omdb_data2 (
key INTEGER PRIMARY KEY,
imdb_id TEXT UNIQUE,
title TEXT,
@@ -80,6 +81,9 @@ class Cache:
imdb_votes INTEGER,
metacritic_rating INTEGER,
type TEXT,
series_id TEXT,
season_num INTEGER,
episode_num INTEGER,
expiration_date TEXT)"""
)
cursor.execute(
@@ -235,7 +239,7 @@ class Cache:
with sqlite3.connect(self.cache_path) as connection:
connection.row_factory = sqlite3.Row
with closing(connection.cursor()) as cursor:
cursor.execute("SELECT * FROM omdb_data2 WHERE imdb_id = ?", (imdb_id,))
row = cursor.fetchone()
if row:
omdb_dict["imdbID"] = row["imdb_id"] if row["imdb_id"] else None
@@ -247,6 +251,9 @@ class Cache:
omdb_dict["imdbVotes"] = row["imdb_votes"] if row["imdb_votes"] else None
omdb_dict["Metascore"] = row["metacritic_rating"] if row["metacritic_rating"] else None
omdb_dict["Type"] = row["type"] if row["type"] else None
omdb_dict["seriesID"] = row["series_id"] if row["series_id"] else None
omdb_dict["Season"] = row["season_num"] if row["season_num"] else None
omdb_dict["Episode"] = row["episode_num"] if row["episode_num"] else None
omdb_dict["Response"] = "True"
datetime_object = datetime.strptime(row["expiration_date"], "%Y-%m-%d")
time_between_insertion = datetime.now() - datetime_object
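The cached OMDb row carries a stored date that the reader compares against the clock before trusting the row. A minimal, self-contained sketch of that age check, under the assumption (the helper name and exact cutoff are hypothetical) that a row older than the configured expiration window is treated as stale:

```python
from datetime import datetime, timedelta

def is_expired(stored_date: str, expiration_days: int = 60) -> bool:
    """Return True when the stored "%Y-%m-%d" date is more than
    expiration_days in the past, i.e. the cached row should be re-fetched."""
    datetime_object = datetime.strptime(stored_date, "%Y-%m-%d")
    time_between_insertion = datetime.now() - datetime_object
    return time_between_insertion.days > expiration_days

fresh = (datetime.now() + timedelta(days=1)).strftime("%Y-%m-%d")
print(is_expired("2000-01-01"))  # True: decades past the window
print(is_expired(fresh))         # False: date has not elapsed yet
```

Storing the date as ISO text keeps the SQLite schema simple while still allowing exact age arithmetic on read.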
@@ -258,9 +265,14 @@ class Cache:
with sqlite3.connect(self.cache_path) as connection:
connection.row_factory = sqlite3.Row
with closing(connection.cursor()) as cursor:
cursor.execute("INSERT OR IGNORE INTO omdb_data2(imdb_id) VALUES(?)", (omdb.imdb_id,))
update_sql = "UPDATE omdb_data2 SET title = ?, year = ?, content_rating = ?, genres = ?, " \
"imdb_rating = ?, imdb_votes = ?, metacritic_rating = ?, type = ?, series_id = ?, " \
"season_num = ?, episode_num = ?, expiration_date = ? WHERE imdb_id = ?"
cursor.execute(update_sql, (omdb.title, omdb.year, omdb.content_rating, omdb.genres_str,
omdb.imdb_rating, omdb.imdb_votes, omdb.metacritic_rating, omdb.type,
omdb.series_id, omdb.season_num, omdb.episode_num,
expiration_date.strftime("%Y-%m-%d"), omdb.imdb_id))
def query_anime_map(self, anime_id, id_type):
ids = None
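The write path above uses the two-step `INSERT OR IGNORE` + `UPDATE` upsert. A minimal runnable sketch of that pattern against an in-memory database (the table and columns here are a simplified, hypothetical subset of the real `omdb_data2` schema):

```python
import sqlite3
from contextlib import closing

with sqlite3.connect(":memory:") as connection:
    with closing(connection.cursor()) as cursor:
        cursor.execute("CREATE TABLE omdb_data2 (imdb_id TEXT UNIQUE, title TEXT, year INTEGER)")
        for title, year in [("Old Title", 1999), ("New Title", 2000)]:
            # Creates the row keyed by imdb_id only if it does not exist yet...
            cursor.execute("INSERT OR IGNORE INTO omdb_data2(imdb_id) VALUES(?)", ("tt0133093",))
            # ...then unconditionally fills in every column, so the same code
            # path handles both first writes and cache refreshes.
            cursor.execute("UPDATE omdb_data2 SET title = ?, year = ? WHERE imdb_id = ?",
                           (title, year, "tt0133093"))
        cursor.execute("SELECT title, year FROM omdb_data2")
        rows = cursor.fetchall()

print(rows)  # one row, holding the most recent write: [('New Title', 2000)]
```

The same effect is achievable with SQLite's `INSERT ... ON CONFLICT DO UPDATE`, but the two-statement form works on older SQLite versions.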

@@ -46,6 +46,9 @@ class ConfigFile:
self.read_only = read_only
self.test_mode = attrs["test"] if "test" in attrs else False
self.trace_mode = attrs["trace"] if "trace" in attrs else False
self.delete_collections = attrs["delete"] if "delete" in attrs else False
self.ignore_schedules = attrs["ignore_schedules"] if "ignore_schedules" in attrs else False
self.library_first = attrs["library_first"] if "library_first" in attrs else False
self.start_time = attrs["time_obj"]
self.run_hour = datetime.strptime(attrs["time"], "%H:%M").hour
self.requested_collections = util.get_list(attrs["collections"]) if "collections" in attrs else None
@@ -79,14 +82,34 @@ class ConfigFile:
replace_attr(new_config, "save_missing", "plex")
if new_config["libraries"]:
for library in new_config["libraries"]:
if not new_config["libraries"][library]:
continue
if "radarr_add_all" in new_config["libraries"][library]:
new_config["libraries"][library]["radarr_add_all_existing"] = new_config["libraries"][library].pop("radarr_add_all")
if "sonarr_add_all" in new_config["libraries"][library]:
new_config["libraries"][library]["sonarr_add_all_existing"] = new_config["libraries"][library].pop("sonarr_add_all")
if "plex" in new_config["libraries"][library] and new_config["libraries"][library]["plex"]:
replace_attr(new_config["libraries"][library], "asset_directory", "plex")
replace_attr(new_config["libraries"][library], "sync_mode", "plex")
replace_attr(new_config["libraries"][library], "show_unmanaged", "plex")
replace_attr(new_config["libraries"][library], "show_filtered", "plex")
replace_attr(new_config["libraries"][library], "show_missing", "plex")
replace_attr(new_config["libraries"][library], "save_missing", "plex")
if "settings" in new_config["libraries"][library] and new_config["libraries"][library]["settings"]:
if "collection_minimum" in new_config["libraries"][library]["settings"]:
new_config["libraries"][library]["settings"]["minimum_items"] = new_config["libraries"][library]["settings"].pop("collection_minimum")
if "radarr" in new_config["libraries"][library] and new_config["libraries"][library]["radarr"]:
if "add" in new_config["libraries"][library]["radarr"]:
new_config["libraries"][library]["radarr"]["add_missing"] = new_config["libraries"][library]["radarr"].pop("add")
if "sonarr" in new_config["libraries"][library] and new_config["libraries"][library]["sonarr"]:
if "add" in new_config["libraries"][library]["sonarr"]:
new_config["libraries"][library]["sonarr"]["add_missing"] = new_config["libraries"][library]["sonarr"].pop("add")
if "operations" in new_config["libraries"][library] and new_config["libraries"][library]["operations"]:
if "radarr_add_all" in new_config["libraries"][library]["operations"]:
new_config["libraries"][library]["operations"]["radarr_add_all_existing"] = new_config["libraries"][library]["operations"].pop("radarr_add_all")
if "sonarr_add_all" in new_config["libraries"][library]["operations"]:
new_config["libraries"][library]["operations"]["sonarr_add_all_existing"] = new_config["libraries"][library]["operations"].pop("sonarr_add_all")
if "webhooks" in new_config["libraries"][library] and new_config["libraries"][library]["webhooks"] and "collection_changes" not in new_config["libraries"][library]["webhooks"]:
changes = []
def hooks(attr):
if attr in new_config["libraries"][library]["webhooks"]:
@@ -95,10 +118,14 @@ class ConfigFile:
hooks("collection_addition")
hooks("collection_removal")
hooks("collection_changes")
new_config["libraries"][library]["webhooks"]["changes"] = None if not changes else changes if len(changes) > 1 else changes[0]
if "libraries" in new_config: new_config["libraries"] = new_config.pop("libraries")
if "playlists" in new_config: new_config["playlists"] = new_config.pop("playlists")
if "settings" in new_config:
temp = new_config.pop("settings")
if "collection_minimum" in temp:
temp["minimum_items"] = temp.pop("collection_minimum")
new_config["settings"] = temp
if "webhooks" in new_config:
temp = new_config.pop("webhooks")
if "changes" not in temp:
@@ -112,7 +139,7 @@ class ConfigFile:
hooks("collection_addition")
hooks("collection_removal")
hooks("collection_changes")
temp["changes"] = None if not changes else changes if len(changes) > 1 else changes[0]
new_config["webhooks"] = temp
if "plex" in new_config: new_config["plex"] = new_config.pop("plex")
if "tmdb" in new_config: new_config["tmdb"] = new_config.pop("tmdb")
@@ -120,8 +147,16 @@ class ConfigFile:
if "omdb" in new_config: new_config["omdb"] = new_config.pop("omdb")
if "notifiarr" in new_config: new_config["notifiarr"] = new_config.pop("notifiarr")
if "anidb" in new_config: new_config["anidb"] = new_config.pop("anidb")
if "radarr" in new_config:
temp = new_config.pop("radarr")
if temp and "add" in temp:
temp["add_missing"] = temp.pop("add")
new_config["radarr"] = temp
if "sonarr" in new_config:
temp = new_config.pop("sonarr")
if temp and "add" in temp:
temp["add_missing"] = temp.pop("add")
new_config["sonarr"] = temp
if "trakt" in new_config: new_config["trakt"] = new_config.pop("trakt")
if "mal" in new_config: new_config["mal"] = new_config.pop("mal")
if not self.read_only:
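The migration code above repeatedly renames deprecated keys in place with `dict.pop`. A condensed sketch of that pattern, generalized into a hypothetical `migrate` helper (the function name is illustrative; the key pairs match this commit's renames):

```python
def migrate(section: dict, renames: dict) -> dict:
    """Rename deprecated config keys in place so old files keep working:
    the old key is popped and its value reinserted under the new name."""
    for old_key, new_key in renames.items():
        if section and old_key in section:
            section[new_key] = section.pop(old_key)
    return section

# "add" -> "add_missing" for radarr/sonarr blocks,
# "collection_minimum" -> "minimum_items" for settings.
radarr = migrate({"add": False, "url": "http://localhost:7878"}, {"add": "add_missing"})
settings = migrate({"collection_minimum": 1}, {"collection_minimum": "minimum_items"})
print(radarr)    # {'url': 'http://localhost:7878', 'add_missing': False}
print(settings)  # {'minimum_items': 1}
```

Because the rewrite happens on the parsed dict before it is saved back, users never have to edit their config by hand when option names change.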
@@ -184,8 +219,8 @@ class ConfigFile:
if len(warning_message) > 0:
warning_message += "\n"
warning_message += f"Config Warning: Path does not exist: {os.path.abspath(p)}"
if do_print and warning_message:
util.print_multiline(warning_message)
if len(temp_list) > 0: return temp_list
else: message = "No Paths exist"
elif var_type == "lower_list": return util.get_list(data[attribute], lower=True)
@@ -220,8 +255,6 @@ class ConfigFile:
util.print_multiline(options)
return default
self.general = {
"cache": check_for_attribute(self.data, "cache", parent="settings", var_type="bool", default=True),
"cache_expiration": check_for_attribute(self.data, "cache_expiration", parent="settings", var_type="int", default=60),
@@ -230,9 +263,11 @@ class ConfigFile:
"asset_depth": check_for_attribute(self.data, "asset_depth", parent="settings", var_type="int", default=0),
"create_asset_folders": check_for_attribute(self.data, "create_asset_folders", parent="settings", var_type="bool", default=False),
"dimensional_asset_rename": check_for_attribute(self.data, "dimensional_asset_rename", parent="settings", var_type="bool", default=False),
"download_url_assets": check_for_attribute(self.data, "download_url_assets", parent="settings", var_type="bool", default=False),
"show_missing_season_assets": check_for_attribute(self.data, "show_missing_season_assets", parent="settings", var_type="bool", default=False), "show_missing_season_assets": check_for_attribute(self.data, "show_missing_season_assets", parent="settings", var_type="bool", default=False),
"sync_mode": check_for_attribute(self.data, "sync_mode", parent="settings", default="append", test_list=sync_modes), "sync_mode": check_for_attribute(self.data, "sync_mode", parent="settings", default="append", test_list=sync_modes),
"collection_minimum": check_for_attribute(self.data, "collection_minimum", parent="settings", var_type="int", default=1), "default_collection_order": check_for_attribute(self.data, "default_collection_order", parent="settings", default_is_none=True),
"minimum_items": check_for_attribute(self.data, "minimum_items", parent="settings", var_type="int", default=1),
"delete_below_minimum": check_for_attribute(self.data, "delete_below_minimum", parent="settings", var_type="bool", default=False), "delete_below_minimum": check_for_attribute(self.data, "delete_below_minimum", parent="settings", var_type="bool", default=False),
"delete_not_scheduled": check_for_attribute(self.data, "delete_not_scheduled", parent="settings", var_type="bool", default=False), "delete_not_scheduled": check_for_attribute(self.data, "delete_not_scheduled", parent="settings", var_type="bool", default=False),
"run_again_delay": check_for_attribute(self.data, "run_again_delay", parent="settings", var_type="int", default=0), "run_again_delay": check_for_attribute(self.data, "run_again_delay", parent="settings", var_type="int", default=0),
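Every settings entry above goes through `check_for_attribute`. As a rough illustration of what such a lookup helper does (this is a simplified stand-in, not the project's actual implementation; only the `parent`, `var_type`, `default`, and `default_is_none` parameters seen at the call sites are mimicked):

```python
def check_for_attribute(data, attribute, parent=None, var_type="str",
                        default=None, default_is_none=False):
    """Simplified stand-in: walk into the optional parent dict, coerce the
    value to var_type, and fall back to the default when missing or invalid."""
    scope = data.get(parent, {}) if parent else data
    if not isinstance(scope, dict) or attribute not in scope or scope[attribute] is None:
        return None if default_is_none else default
    value = scope[attribute]
    if var_type == "bool":
        return value if isinstance(value, bool) else default
    if var_type == "int":
        try:
            return int(value)
        except (TypeError, ValueError):
            return default
    return str(value)

config = {"settings": {"cache": True, "cache_expiration": "60"}}
print(check_for_attribute(config, "cache", parent="settings", var_type="bool", default=True))
print(check_for_attribute(config, "cache_expiration", parent="settings", var_type="int", default=60))
print(check_for_attribute(config, "sync_mode", parent="settings", default="append"))
```

The point of the pattern is that a missing or malformed key never raises; it silently resolves to the documented default, which is why the new `minimum_items` key can replace `collection_minimum` without breaking older configs.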
@@ -248,8 +283,17 @@ class ConfigFile:
"ignore_ids": check_for_attribute(self.data, "ignore_ids", parent="settings", var_type="int_list", default_is_none=True),
"ignore_imdb_ids": check_for_attribute(self.data, "ignore_imdb_ids", parent="settings", var_type="list", default_is_none=True),
"playlist_sync_to_user": check_for_attribute(self.data, "playlist_sync_to_user", parent="settings", default="all", default_is_none=True),
+ "verify_ssl": check_for_attribute(self.data, "verify_ssl", parent="settings", var_type="bool", default=True),
"assets_for_all": check_for_attribute(self.data, "assets_for_all", parent="settings", var_type="bool", default=False, save=False, do_print=False)
}
+ self.session = requests.Session()
+ if not self.general["verify_ssl"]:
+     self.session.verify = False
+     if self.session.verify is False:
+         import urllib3
+         urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
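The new `verify_ssl` handling above boils down to a small pattern: one shared `requests.Session`, with certificate verification (and the resulting warning spam) switched off only when the user asks for it. A hedged sketch of the same idea in isolation (`build_session` is a hypothetical helper, not a function in the codebase):

```python
import requests
import urllib3

def build_session(verify_ssl: bool) -> requests.Session:
    """Return a Session that optionally skips TLS certificate verification."""
    session = requests.Session()
    if not verify_ssl:
        session.verify = False
        # Without this, every request through the session would log an
        # InsecureRequestWarning to the console.
        urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
    return session
```

Because the session is stored on the config object, every later HTTP call routed through it inherits the setting; that is also the motivation for the overlay hunk further down, where a bare `requests.get(...)` becomes `self.config.get(...)`.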
self.webhooks = {
"error": check_for_attribute(self.data, "error", parent="webhooks", var_type="list", default_is_none=True),
"run_start": check_for_attribute(self.data, "run_start", parent="webhooks", var_type="list", default_is_none=True),
@@ -452,7 +496,7 @@ class ConfigFile:
self.general["radarr"] = {
"url": check_for_attribute(self.data, "url", parent="radarr", var_type="url", default_is_none=True),
"token": check_for_attribute(self.data, "token", parent="radarr", default_is_none=True),
- "add": check_for_attribute(self.data, "add", parent="radarr", var_type="bool", default=False),
+ "add_missing": check_for_attribute(self.data, "add_missing", parent="radarr", var_type="bool", default=False),
"add_existing": check_for_attribute(self.data, "add_existing", parent="radarr", var_type="bool", default=False),
"root_folder_path": check_for_attribute(self.data, "root_folder_path", parent="radarr", default_is_none=True),
"monitor": check_for_attribute(self.data, "monitor", parent="radarr", var_type="bool", default=True),
@@ -466,7 +510,7 @@ class ConfigFile:
self.general["sonarr"] = {
"url": check_for_attribute(self.data, "url", parent="sonarr", var_type="url", default_is_none=True),
"token": check_for_attribute(self.data, "token", parent="sonarr", default_is_none=True),
- "add": check_for_attribute(self.data, "add", parent="sonarr", var_type="bool", default=False),
+ "add_missing": check_for_attribute(self.data, "add_missing", parent="sonarr", var_type="bool", default=False),
"add_existing": check_for_attribute(self.data, "add_existing", parent="sonarr", var_type="bool", default=False),
"root_folder_path": check_for_attribute(self.data, "root_folder_path", parent="sonarr", default_is_none=True),
"monitor": check_for_attribute(self.data, "monitor", parent="sonarr", test_list=sonarr.monitor_descriptions, default="all"),
@@ -500,7 +544,8 @@ class ConfigFile:
"genre_mapper": None,
"radarr_remove_by_tag": None,
"sonarr_remove_by_tag": None,
- "mass_collection_mode": None
+ "mass_collection_mode": None,
+ "genre_collections": None
}
display_name = f"{params['name']} ({params['mapping_name']})" if lib and "library_name" in lib and lib["library_name"] else params["mapping_name"]
@@ -515,6 +560,7 @@ class ConfigFile:
params["asset_folders"] = check_for_attribute(lib, "asset_folders", parent="settings", var_type="bool", default=self.general["asset_folders"], do_print=False, save=False)
params["asset_depth"] = check_for_attribute(lib, "asset_depth", parent="settings", var_type="int", default=self.general["asset_depth"], do_print=False, save=False)
params["sync_mode"] = check_for_attribute(lib, "sync_mode", parent="settings", test_list=sync_modes, default=self.general["sync_mode"], do_print=False, save=False)
+ params["default_collection_order"] = check_for_attribute(lib, "default_collection_order", parent="settings", default=self.general["default_collection_order"], default_is_none=True, do_print=False, save=False)
params["show_unmanaged"] = check_for_attribute(lib, "show_unmanaged", parent="settings", var_type="bool", default=self.general["show_unmanaged"], do_print=False, save=False)
params["show_filtered"] = check_for_attribute(lib, "show_filtered", parent="settings", var_type="bool", default=self.general["show_filtered"], do_print=False, save=False)
params["show_options"] = check_for_attribute(lib, "show_options", parent="settings", var_type="bool", default=self.general["show_options"], do_print=False, save=False)
@@ -525,8 +571,9 @@ class ConfigFile:
params["only_filter_missing"] = check_for_attribute(lib, "only_filter_missing", parent="settings", var_type="bool", default=self.general["only_filter_missing"], do_print=False, save=False)
params["create_asset_folders"] = check_for_attribute(lib, "create_asset_folders", parent="settings", var_type="bool", default=self.general["create_asset_folders"], do_print=False, save=False)
params["dimensional_asset_rename"] = check_for_attribute(lib, "dimensional_asset_rename", parent="settings", var_type="bool", default=self.general["dimensional_asset_rename"], do_print=False, save=False)
+ params["download_url_assets"] = check_for_attribute(lib, "download_url_assets", parent="settings", var_type="bool", default=self.general["download_url_assets"], do_print=False, save=False)
params["show_missing_season_assets"] = check_for_attribute(lib, "show_missing_season_assets", parent="settings", var_type="bool", default=self.general["show_missing_season_assets"], do_print=False, save=False)
- params["collection_minimum"] = check_for_attribute(lib, "collection_minimum", parent="settings", var_type="int", default=self.general["collection_minimum"], do_print=False, save=False)
+ params["minimum_items"] = check_for_attribute(lib, "minimum_items", parent="settings", var_type="int", default=self.general["minimum_items"], do_print=False, save=False)
params["delete_below_minimum"] = check_for_attribute(lib, "delete_below_minimum", parent="settings", var_type="bool", default=self.general["delete_below_minimum"], do_print=False, save=False)
params["delete_not_scheduled"] = check_for_attribute(lib, "delete_not_scheduled", parent="settings", var_type="bool", default=self.general["delete_not_scheduled"], do_print=False, save=False)
params["delete_unmanaged_collections"] = check_for_attribute(lib, "delete_unmanaged_collections", parent="settings", var_type="bool", default=False, do_print=False, save=False)
@@ -543,8 +590,9 @@ class ConfigFile:
params["mass_critic_rating_update"] = check_for_attribute(lib, "mass_critic_rating_update", test_list=mass_update_options, default_is_none=True, save=False, do_print=False)
params["mass_trakt_rating_update"] = check_for_attribute(lib, "mass_trakt_rating_update", var_type="bool", default=False, save=False, do_print=False)
params["split_duplicates"] = check_for_attribute(lib, "split_duplicates", var_type="bool", default=False, save=False, do_print=False)
- params["radarr_add_all"] = check_for_attribute(lib, "radarr_add_all", var_type="bool", default=False, save=False, do_print=False)
- params["sonarr_add_all"] = check_for_attribute(lib, "sonarr_add_all", var_type="bool", default=False, save=False, do_print=False)
+ params["radarr_add_all_existing"] = check_for_attribute(lib, "radarr_add_all_existing", var_type="bool", default=False, save=False, do_print=False)
+ params["sonarr_add_all_existing"] = check_for_attribute(lib, "sonarr_add_all_existing", var_type="bool", default=False, save=False, do_print=False)
+ params["missing_path"] = check_for_attribute(lib, "missing_path", var_type="path", default_is_none=True, save=False)
if lib and "operations" in lib and lib["operations"]:
if isinstance(lib["operations"], dict):
@@ -564,12 +612,12 @@ class ConfigFile:
params["mass_trakt_rating_update"] = check_for_attribute(lib["operations"], "mass_trakt_rating_update", var_type="bool", default=False, save=False)
if "split_duplicates" in lib["operations"]:
params["split_duplicates"] = check_for_attribute(lib["operations"], "split_duplicates", var_type="bool", default=False, save=False)
- if "radarr_add_all" in lib["operations"]:
-     params["radarr_add_all"] = check_for_attribute(lib["operations"], "radarr_add_all", var_type="bool", default=False, save=False)
+ if "radarr_add_all_existing" in lib["operations"]:
+     params["radarr_add_all_existing"] = check_for_attribute(lib["operations"], "radarr_add_all_existing", var_type="bool", default=False, save=False)
if "radarr_remove_by_tag" in lib["operations"]:
params["radarr_remove_by_tag"] = check_for_attribute(lib["operations"], "radarr_remove_by_tag", var_type="comma_list", default=False, save=False)
- if "sonarr_add_all" in lib["operations"]:
-     params["sonarr_add_all"] = check_for_attribute(lib["operations"], "sonarr_add_all", var_type="bool", default=False, save=False)
+ if "sonarr_add_all_existing" in lib["operations"]:
+     params["sonarr_add_all_existing"] = check_for_attribute(lib["operations"], "sonarr_add_all_existing", var_type="bool", default=False, save=False)
if "sonarr_remove_by_tag" in lib["operations"]:
params["sonarr_remove_by_tag"] = check_for_attribute(lib["operations"], "sonarr_remove_by_tag", var_type="comma_list", default=False, save=False)
if "mass_collection_mode" in lib["operations"]:
@@ -585,7 +633,6 @@ class ConfigFile:
"template": {"tmdb_collection_details": "<<collection_id>>"}
}
if lib["operations"]["tmdb_collections"] and isinstance(lib["operations"]["tmdb_collections"], dict):
params["tmdb_collections"]["exclude_ids"] = check_for_attribute(lib["operations"]["tmdb_collections"], "exclude_ids", var_type="int_list", default_is_none=True, save=False)
params["tmdb_collections"]["remove_suffix"] = check_for_attribute(lib["operations"]["tmdb_collections"], "remove_suffix", var_type="comma_list", default_is_none=True, save=False)
if "dictionary_variables" in lib["operations"]["tmdb_collections"] and lib["operations"]["tmdb_collections"]["dictionary_variables"] and isinstance(lib["operations"]["tmdb_collections"]["dictionary_variables"], dict):
@@ -608,6 +655,32 @@ class ConfigFile:
params["genre_mapper"][old_genre] = new_genre
else:
logger.error("Config Error: genre_mapper is blank")
+ if "genre_collections" in lib["operations"]:
+     params["genre_collections"] = {
+         "exclude_genres": [],
+         "dictionary_variables": {},
+         "title_format": "Top <<genre>> <<library_type>>s",
+         "template": {"smart_filter": {"limit": 50, "sort_by": "critic_rating.desc", "all": {"genre": "<<genre>>"}}}
+     }
+     if lib["operations"]["genre_collections"] and isinstance(lib["operations"]["genre_collections"], dict):
+         params["genre_collections"]["exclude_genres"] = check_for_attribute(lib["operations"]["genre_collections"], "exclude_genres", var_type="comma_list", default_is_none=True, save=False)
+         title_format = check_for_attribute(lib["operations"]["genre_collections"], "title_format", default=params["genre_collections"]["title_format"], save=False)
+         if "<<genre>>" in title_format:
+             params["genre_collections"]["title_format"] = title_format
+         else:
+             logger.error(f"Config Error: using default title_format. <<genre>> not in title_format attribute: {title_format} ")
+         if "dictionary_variables" in lib["operations"]["genre_collections"] and lib["operations"]["genre_collections"]["dictionary_variables"] and isinstance(lib["operations"]["genre_collections"]["dictionary_variables"], dict):
+             for key, value in lib["operations"]["genre_collections"]["dictionary_variables"].items():
+                 if isinstance(value, dict):
+                     params["genre_collections"]["dictionary_variables"][key] = value
+                 else:
+                     logger.warning(f"Config Warning: genre_collections dictionary_variables {key} must be a dictionary")
+         if "template" in lib["operations"]["genre_collections"] and lib["operations"]["genre_collections"]["template"] and isinstance(lib["operations"]["genre_collections"]["template"], dict):
+             params["genre_collections"]["template"] = lib["operations"]["genre_collections"]["template"]
+         else:
+             logger.warning("Config Warning: Using default template for genre_collections")
+     else:
+         logger.error("Config Error: genre_collections blank using default settings")
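The `title_format` guard in the new `genre_collections` block can be shown in isolation. The helpers below are hypothetical (`resolve_title_format`, `build_title` do not exist in the codebase); they only mirror the check above: a format without the `<<genre>>` token is rejected in favor of the default, and tokens are later substituted per genre:

```python
import logging

logger = logging.getLogger("Plex Meta Manager")
DEFAULT_TITLE_FORMAT = "Top <<genre>> <<library_type>>s"

def resolve_title_format(title_format: str) -> str:
    """Mirror of the config check: require the <<genre>> token."""
    if "<<genre>>" in title_format:
        return title_format
    logger.error(f"Config Error: using default title_format. <<genre>> not in title_format attribute: {title_format}")
    return DEFAULT_TITLE_FORMAT

def build_title(title_format: str, genre: str, library_type: str) -> str:
    # Hypothetical substitution step; the real template engine handles many more tokens.
    return title_format.replace("<<genre>>", genre).replace("<<library_type>>", library_type)

print(build_title(resolve_title_format("Best of <<genre>>"), "Horror", "Movie"))  # Best of Horror
print(build_title(resolve_title_format("All Films"), "Horror", "Movie"))          # Top Horror Movies
```

Without the token, one collection per genre could not be generated, so falling back to the default is the only safe behavior.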
else:
logger.error("Config Error: operations must be a dictionary")
@@ -653,7 +726,7 @@ class ConfigFile:
params["default_dir"] = default_dir
params["skip_library"] = False
- if lib and "schedule" in lib:
+ if lib and "schedule" in lib and not self.requested_libraries and not self.ignore_schedules:
if not lib["schedule"]:
raise Failed(f"Config Error: schedule attribute is blank")
else:
@@ -691,7 +764,7 @@ class ConfigFile:
library.Radarr = Radarr(self, library, {
"url": check_for_attribute(lib, "url", parent="radarr", var_type="url", default=self.general["radarr"]["url"], req_default=True, save=False),
"token": check_for_attribute(lib, "token", parent="radarr", default=self.general["radarr"]["token"], req_default=True, save=False),
- "add": check_for_attribute(lib, "add", parent="radarr", var_type="bool", default=self.general["radarr"]["add"], save=False),
+ "add_missing": check_for_attribute(lib, "add_missing", parent="radarr", var_type="bool", default=self.general["radarr"]["add_missing"], save=False),
"add_existing": check_for_attribute(lib, "add_existing", parent="radarr", var_type="bool", default=self.general["radarr"]["add_existing"], save=False),
"root_folder_path": check_for_attribute(lib, "root_folder_path", parent="radarr", default=self.general["radarr"]["root_folder_path"], req_default=True, save=False),
"monitor": check_for_attribute(lib, "monitor", parent="radarr", var_type="bool", default=self.general["radarr"]["monitor"], save=False),
@@ -719,7 +792,7 @@ class ConfigFile:
library.Sonarr = Sonarr(self, library, {
"url": check_for_attribute(lib, "url", parent="sonarr", var_type="url", default=self.general["sonarr"]["url"], req_default=True, save=False),
"token": check_for_attribute(lib, "token", parent="sonarr", default=self.general["sonarr"]["token"], req_default=True, save=False),
- "add": check_for_attribute(lib, "add", parent="sonarr", var_type="bool", default=self.general["sonarr"]["add"], save=False),
+ "add_missing": check_for_attribute(lib, "add_missing", parent="sonarr", var_type="bool", default=self.general["sonarr"]["add_missing"], save=False),
"add_existing": check_for_attribute(lib, "add_existing", parent="sonarr", var_type="bool", default=self.general["sonarr"]["add_existing"], save=False),
"root_folder_path": check_for_attribute(lib, "root_folder_path", parent="sonarr", default=self.general["sonarr"]["root_folder_path"], req_default=True, save=False),
"monitor": check_for_attribute(lib, "monitor", parent="sonarr", test_list=sonarr.monitor_descriptions, default=self.general["sonarr"]["monitor"], save=False),

@@ -40,13 +40,15 @@ class Library(ABC):
self.default_dir = params["default_dir"]
self.mapping_name, output = util.validate_filename(self.original_mapping_name)
self.image_table_name = self.config.Cache.get_image_table_name(self.original_mapping_name) if self.config.Cache else None
- self.missing_path = os.path.join(self.default_dir, f"{self.mapping_name}_missing.yml")
+ self.missing_path = params["missing_path"] if params["missing_path"] else os.path.join(self.default_dir, f"{self.mapping_name}_missing.yml")
self.asset_folders = params["asset_folders"]
self.create_asset_folders = params["create_asset_folders"]
self.dimensional_asset_rename = params["dimensional_asset_rename"]
+ self.download_url_assets = params["download_url_assets"]
self.show_missing_season_assets = params["show_missing_season_assets"]
self.sync_mode = params["sync_mode"]
- self.collection_minimum = params["collection_minimum"]
+ self.default_collection_order = params["default_collection_order"]
+ self.minimum_items = params["minimum_items"]
self.delete_below_minimum = params["delete_below_minimum"]
self.delete_not_scheduled = params["delete_not_scheduled"]
self.missing_only_released = params["missing_only_released"]
@@ -66,12 +68,13 @@ class Library(ABC):
self.mass_audience_rating_update = params["mass_audience_rating_update"]
self.mass_critic_rating_update = params["mass_critic_rating_update"]
self.mass_trakt_rating_update = params["mass_trakt_rating_update"]
- self.radarr_add_all = params["radarr_add_all"]
+ self.radarr_add_all_existing = params["radarr_add_all_existing"]
self.radarr_remove_by_tag = params["radarr_remove_by_tag"]
- self.sonarr_add_all = params["sonarr_add_all"]
+ self.sonarr_add_all_existing = params["sonarr_add_all_existing"]
self.sonarr_remove_by_tag = params["sonarr_remove_by_tag"]
self.mass_collection_mode = params["mass_collection_mode"]
self.tmdb_collections = params["tmdb_collections"]
+ self.genre_collections = params["genre_collections"]
self.genre_mapper = params["genre_mapper"]
self.error_webhooks = params["error_webhooks"]
self.changes_webhooks = params["changes_webhooks"]
@@ -79,10 +82,15 @@ class Library(ABC):
self.clean_bundles = params["plex"]["clean_bundles"]  # TODO: Here or just in Plex?
self.empty_trash = params["plex"]["empty_trash"]  # TODO: Here or just in Plex?
self.optimize = params["plex"]["optimize"]  # TODO: Here or just in Plex?
- self.library_operation = self.assets_for_all or self.delete_unmanaged_collections or self.delete_collections_with_less \
-     or self.mass_genre_update or self.mass_audience_rating_update or self.mass_critic_rating_update \
-     or self.mass_trakt_rating_update or self.radarr_add_all or self.sonarr_add_all \
-     or self.tmdb_collections or self.genre_mapper
+ self.stats = {"created": 0, "modified": 0, "deleted": 0, "added": 0, "unchanged": 0, "removed": 0, "radarr": 0, "sonarr": 0}
+ self.status = {}
+
+ self.tmdb_library_operation = self.assets_for_all or self.mass_genre_update or self.mass_audience_rating_update \
+     or self.mass_critic_rating_update or self.mass_trakt_rating_update \
+     or self.tmdb_collections or self.radarr_add_all_existing or self.sonarr_add_all_existing
+ self.library_operation = self.tmdb_library_operation or self.delete_unmanaged_collections or self.delete_collections_with_less \
+     or self.radarr_remove_by_tag or self.sonarr_remove_by_tag or self.mass_collection_mode \
+     or self.genre_collections or self.genre_mapper or self.show_unmanaged
metadata = []
for file_type, metadata_file in self.metadata_path:
if file_type == "Folder":
@@ -151,7 +159,7 @@ class Library(ABC):
if poster_uploaded or image is None or image != item.thumb or f"{overlay_name.lower()} overlay" not in item_labels:
if not item.posterUrl:
raise Failed(f"Overlay Error: No existing poster to Overlay for {item.title}")
- response = requests.get(item.posterUrl)
+ response = self.config.get(item.posterUrl)
if response.status_code >= 400:
raise Failed(f"Overlay Error: Overlay Failed for {item.title}")
og_image = response.content
@@ -218,7 +226,7 @@ class Library(ABC):
pass
@abstractmethod
- def get_all(self):
+ def get_all(self, collection_level=None):
pass
def add_missing(self, collection, items, is_movie):

@@ -9,6 +9,18 @@ logger = logging.getLogger("Plex Meta Manager")
github_base = "https://raw.githubusercontent.com/meisnate12/Plex-Meta-Manager-Configs/master/"
+ advance_tags_to_edit = {
+     "Movie": ["metadata_language", "use_original_title"],
+     "Show": ["episode_sorting", "keep_episodes", "delete_episodes", "season_display", "episode_ordering",
+              "metadata_language", "use_original_title"],
+     "Artist": ["album_sorting"]
+ }
+ tags_to_edit = {
+     "Movie": ["genre", "label", "collection", "country", "director", "producer", "writer"],
+     "Show": ["genre", "label", "collection"],
+     "Artist": ["genre", "style", "mood", "country", "collection", "similar_artist"]
+ }
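The two module-level dicts added above map a library type to the tag fields and advanced settings a metadata file may edit. A small sketch of how a consumer might query them (the dicts are copied from the diff; `editable_fields` is a hypothetical helper, not part of the codebase):

```python
advance_tags_to_edit = {
    "Movie": ["metadata_language", "use_original_title"],
    "Show": ["episode_sorting", "keep_episodes", "delete_episodes", "season_display",
             "episode_ordering", "metadata_language", "use_original_title"],
    "Artist": ["album_sorting"]
}
tags_to_edit = {
    "Movie": ["genre", "label", "collection", "country", "director", "producer", "writer"],
    "Show": ["genre", "label", "collection"],
    "Artist": ["genre", "style", "mood", "country", "collection", "similar_artist"]
}

def editable_fields(library_type: str) -> list:
    """Everything editable for a library type; unknown types yield nothing."""
    return tags_to_edit.get(library_type, []) + advance_tags_to_edit.get(library_type, [])

print(editable_fields("Artist"))
```

Centralizing the lists this way means validation code can look up the allowed keys per library type instead of hard-coding them at each call site.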
def get_dict(attribute, attr_data, check_list=None):
if check_list is None:
@@ -19,14 +31,11 @@ def get_dict(attribute, attr_data, check_list=None):
new_dict = {} new_dict = {}
for _name, _data in attr_data[attribute].items(): for _name, _data in attr_data[attribute].items():
if _name in check_list: if _name in check_list:
logger.error( logger.error(f"Config Warning: Skipping duplicate {attribute[:-1] if attribute[-1] == 's' else attribute}: {_name}")
f"Config Warning: Skipping duplicate {attribute[:-1] if attribute[-1] == 's' else attribute}: {_name}")
elif _data is None: elif _data is None:
logger.error( logger.error(f"Config Warning: {attribute[:-1] if attribute[-1] == 's' else attribute}: {_name} has no data")
f"Config Warning: {attribute[:-1] if attribute[-1] == 's' else attribute}: {_name} has no data")
elif not isinstance(_data, dict): elif not isinstance(_data, dict):
logger.error( logger.error(f"Config Warning: {attribute[:-1] if attribute[-1] == 's' else attribute}: {_name} must be a dictionary")
f"Config Warning: {attribute[:-1] if attribute[-1] == 's' else attribute}: {_name} must be a dictionary")
else: else:
new_dict[str(_name)] = _data new_dict[str(_name)] = _data
return new_dict return new_dict
@@ -65,14 +74,14 @@ class DataFile:
             util.print_stacktrace()
             raise Failed(f"YAML Error: {e}")

-    def apply_template(self, name, data, template):
+    def apply_template(self, name, data, template_call):
         if not self.templates:
             raise Failed(f"{self.data_type} Error: No templates found")
-        elif not template:
+        elif not template_call:
             raise Failed(f"{self.data_type} Error: template attribute is blank")
         else:
-            logger.debug(f"Value: {template}")
-            for variables in util.get_list(template, split=False):
+            logger.debug(f"Value: {template_call}")
+            for variables in util.get_list(template_call, split=False):
                 if not isinstance(variables, dict):
                     raise Failed(f"{self.data_type} Error: template attribute is not a dictionary")
                 elif "name" not in variables:
@@ -84,9 +93,15 @@ class DataFile:
                 elif not isinstance(self.templates[variables["name"]], dict):
                     raise Failed(f"{self.data_type} Error: template {variables['name']} is not a dictionary")
                 else:
+                    remove_variables = []
                     for tm in variables:
-                        if not variables[tm]:
-                            raise Failed(f"{self.data_type} Error: template sub-attribute {tm} is blank")
+                        if variables[tm] is None:
+                            remove_variables.append(tm)
+                    optional = []
+                    for remove_variable in remove_variables:
+                        variables.pop(remove_variable)
+                        optional.append(str(remove_variable))
                     if self.data_type == "Collection" and "collection_name" not in variables:
                         variables["collection_name"] = str(name)
                     if self.data_type == "Playlist" and "playlist_name" not in variables:
@@ -100,16 +115,21 @@ class DataFile:
                     if template["default"]:
                         if isinstance(template["default"], dict):
                             for dv in template["default"]:
-                                if template["default"][dv]:
-                                    default[dv] = template["default"][dv]
-                                else:
-                                    raise Failed(f"{self.data_type} Error: template default sub-attribute {dv} is blank")
+                                if str(dv) not in optional:
+                                    if template["default"][dv] is not None:
+                                        final_value = str(template["default"][dv])
+                                        if "<<collection_name>>" in final_value:
+                                            final_value = final_value.replace("<<collection_name>>", str(name))
+                                        if "<<playlist_name>>" in final_value:
+                                            final_value = final_value.replace("<<playlist_name>>", str(name))
+                                        default[dv] = final_value
+                                    else:
+                                        raise Failed(f"{self.data_type} Error: template default sub-attribute {dv} is blank")
                         else:
                             raise Failed(f"{self.data_type} Error: template sub-attribute default is not a dictionary")
                     else:
                         raise Failed(f"{self.data_type} Error: template sub-attribute default is blank")
-                    optional = []
                 if "optional" in template:
                     if template["optional"]:
                         for op in util.get_list(template["optional"]):
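The reworked default handling substitutes the `<<collection_name>>` / `<<playlist_name>>` placeholders into template default values. That substitution step in isolation (a sketch; `resolve_default` is a hypothetical helper, the diff inlines this logic):

```python
def resolve_default(value, name):
    """Stringify a template default and replace name placeholders,
    mirroring the diff's handling of template["default"] values."""
    final_value = str(value)
    for placeholder in ("<<collection_name>>", "<<playlist_name>>"):
        if placeholder in final_value:
            final_value = final_value.replace(placeholder, str(name))
    return final_value
```

Note the stringify-first behavior: a numeric default like `5` becomes the string `"5"`, which matches how the diff builds `final_value = str(template["default"][dv])`.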
@@ -221,6 +241,48 @@ class MetadataFile(DataFile):
         else:
             return self.collections

+    def edit_tags(self, attr, obj, group, alias, extra=None):
+        if attr in alias and f"{attr}.sync" in alias:
+            logger.error(f"Metadata Error: Cannot use {attr} and {attr}.sync together")
+        elif f"{attr}.remove" in alias and f"{attr}.sync" in alias:
+            logger.error(f"Metadata Error: Cannot use {attr}.remove and {attr}.sync together")
+        elif attr in alias and group[alias[attr]] is None:
+            logger.error(f"Metadata Error: {attr} attribute is blank")
+        elif f"{attr}.remove" in alias and group[alias[f"{attr}.remove"]] is None:
+            logger.error(f"Metadata Error: {attr}.remove attribute is blank")
+        elif f"{attr}.sync" in alias and group[alias[f"{attr}.sync"]] is None:
+            logger.error(f"Metadata Error: {attr}.sync attribute is blank")
+        elif attr in alias or f"{attr}.remove" in alias or f"{attr}.sync" in alias:
+            add_tags = util.get_list(group[alias[attr]]) if attr in alias else []
+            if extra:
+                add_tags.extend(extra)
+            remove_tags = util.get_list(group[alias[f"{attr}.remove"]]) if f"{attr}.remove" in alias else None
+            sync_tags = util.get_list(group[alias[f"{attr}.sync"]] if group[alias[f"{attr}.sync"]] else []) if f"{attr}.sync" in alias else None
+            return self.library.edit_tags(attr, obj, add_tags=add_tags, remove_tags=remove_tags, sync_tags=sync_tags)
+        return False
+
+    def set_images(self, obj, group, alias):
+        def set_image(attr, is_poster=True, is_url=True):
+            if group[alias[attr]]:
+                return ImageData(attr, group[alias[attr]], is_poster=is_poster, is_url=is_url)
+            else:
+                logger.error(f"Metadata Error: {attr} attribute is blank")
+
+        poster = None
+        background = None
+        if "url_poster" in alias:
+            poster = set_image("url_poster")
+        elif "file_poster" in alias:
+            poster = set_image("file_poster", is_url=False)
+        if "url_background" in alias:
+            background = set_image("url_background", is_poster=False)
+        elif "file_background" in alias:
+            background = set_image("file_background", is_poster=False, is_url=False)
+        if poster or background:
+            self.library.upload_images(obj, poster=poster, background=background)
+
     def update_metadata(self):
         if not self.metadata:
             return None
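The promoted `edit_tags` method resolves three mutually exclusive spellings of a tag attribute (`attr`, `attr.remove`, `attr.sync`) into add/remove/sync lists before delegating to `library.edit_tags`. The core set arithmetic behind those three modes, in isolation (a simplified sketch; `resolve_tag_edit` is a hypothetical helper, the real method hands the lists to the Plex library object):

```python
def resolve_tag_edit(current, add=None, remove=None, sync=None):
    """Return the tag list after an edit: sync replaces wholesale,
    otherwise add then remove are applied to the current tags."""
    if sync is not None:
        return sorted(set(sync))          # sync wins: the item ends up with exactly these tags
    result = set(current)
    result.update(add or [])
    result.difference_update(remove or [])
    return sorted(result)
```

This also illustrates why the diff rejects `attr` together with `attr.sync`: a sync discards the add list entirely, so combining them would silently ignore user input.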
@@ -229,8 +291,6 @@ class MetadataFile(DataFile):
             logger.info("")
             for mapping_name, meta in self.metadata.items():
                 methods = {mm.lower(): mm for mm in meta}
-                if self.config.test_mode and ("test" not in methods or meta[methods["test"]] is not True):
-                    continue
                 updated = False
                 edits = {}
@@ -243,13 +303,11 @@ class MetadataFile(DataFile):
                     if value is None: value = group[alias[name]]
                     try:
                         current = str(getattr(current_item, key, ""))
+                        final_value = None
                         if var_type == "date":
                             final_value = util.validate_date(value, name, return_as="%Y-%m-%d")
                             current = current[:-9]
                         elif var_type == "float":
-                            if value is None:
-                                raise Failed(f"Metadata Error: {name} attribute is blank")
-                            final_value = None
                             try:
                                 value = float(str(value))
                                 if 0 <= value <= 10:
@@ -258,6 +316,13 @@ class MetadataFile(DataFile):
                                 pass
                             if final_value is None:
                                 raise Failed(f"Metadata Error: {name} attribute must be a number between 0 and 10")
+                        elif var_type == "int":
+                            try:
+                                final_value = int(str(value))
+                            except ValueError:
+                                pass
+                            if final_value is None:
+                                raise Failed(f"Metadata Error: {name} attribute must be an integer")
                         else:
                             final_value = value
                         if current != str(final_value):
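The new `int` branch mirrors the existing `float` one: coerce through `str`, swallow `ValueError`, and raise only if nothing parsed. The coercion logic on its own (a sketch; `coerce_value` is a hypothetical helper, the diff does this inline in `add_edit`):

```python
def coerce_value(value, var_type):
    """Validate a metadata value like the diff's add_edit branches:
    float must be a rating in 0-10, int any integer, everything else passes through."""
    if var_type == "float":
        try:
            number = float(str(value))
            if 0 <= number <= 10:
                return number
        except ValueError:
            pass
        raise ValueError("attribute must be a number between 0 and 10")
    if var_type == "int":
        try:
            return int(str(value))
        except ValueError:
            raise ValueError("attribute must be an integer")
    return value
```

Going through `str()` first means YAML values that arrive as either strings or numbers are handled uniformly, at the cost of rejecting float-looking strings like `"7.0"` in the int branch.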
@@ -269,13 +334,11 @@ class MetadataFile(DataFile):
                 else:
                     logger.error(f"Metadata Error: {name} attribute is blank")

-            def add_advanced_edit(attr, obj, group, alias, show_library=False, new_agent=False):
+            def add_advanced_edit(attr, obj, group, alias, new_agent=False):
                 key, options = plex.item_advance_keys[f"item_{attr}"]
                 if attr in alias:
                     if new_agent and self.library.agent not in plex.new_plex_agents:
                         logger.error(f"Metadata Error: {attr} attribute only works for with the New Plex Movie Agent and New Plex TV Agent")
-                    elif show_library and not self.library.is_show:
-                        logger.error(f"Metadata Error: {attr} attribute only works for show libraries")
                     elif group[alias[attr]]:
                         method_data = str(group[alias[attr]]).lower()
                         if method_data not in options:
@@ -286,54 +349,11 @@ class MetadataFile(DataFile):
                     else:
                         logger.error(f"Metadata Error: {attr} attribute is blank")

-            def edit_tags(attr, obj, group, alias, extra=None, movie_library=False):
-                if movie_library and not self.library.is_movie and (attr in alias or f"{attr}.sync" in alias or f"{attr}.remove" in alias):
-                    logger.error(f"Metadata Error: {attr} attribute only works for movie libraries")
-                elif attr in alias and f"{attr}.sync" in alias:
-                    logger.error(f"Metadata Error: Cannot use {attr} and {attr}.sync together")
-                elif f"{attr}.remove" in alias and f"{attr}.sync" in alias:
-                    logger.error(f"Metadata Error: Cannot use {attr}.remove and {attr}.sync together")
-                elif attr in alias and group[alias[attr]] is None:
-                    logger.error(f"Metadata Error: {attr} attribute is blank")
-                elif f"{attr}.remove" in alias and group[alias[f"{attr}.remove"]] is None:
-                    logger.error(f"Metadata Error: {attr}.remove attribute is blank")
-                elif f"{attr}.sync" in alias and group[alias[f"{attr}.sync"]] is None:
-                    logger.error(f"Metadata Error: {attr}.sync attribute is blank")
-                elif attr in alias or f"{attr}.remove" in alias or f"{attr}.sync" in alias:
-                    add_tags = util.get_list(group[alias[attr]]) if attr in alias else []
-                    if extra:
-                        add_tags.extend(extra)
-                    remove_tags = util.get_list(group[alias[f"{attr}.remove"]]) if f"{attr}.remove" in alias else None
-                    sync_tags = util.get_list(group[alias[f"{attr}.sync"]] if group[alias[f"{attr}.sync"]] else []) if f"{attr}.sync" in alias else None
-                    return self.library.edit_tags(attr, obj, add_tags=add_tags, remove_tags=remove_tags, sync_tags=sync_tags)
-                return False
-
-            def set_image(attr, group, alias, is_poster=True, is_url=True):
-                if group[alias[attr]]:
-                    return ImageData(attr, group[alias[attr]], is_poster=is_poster, is_url=is_url)
-                else:
-                    logger.error(f"Metadata Error: {attr} attribute is blank")
-
-            def set_images(obj, group, alias):
-                poster = None
-                background = None
-                if "url_poster" in alias:
-                    poster = set_image("url_poster", group, alias)
-                elif "file_poster" in alias:
-                    poster = set_image("file_poster", group, alias, is_url=False)
-                if "url_background" in alias:
-                    background = set_image("url_background", group, alias, is_poster=False)
-                elif "file_background" in alias:
-                    background = set_image("file_background", group, alias, is_poster=False, is_url=False)
-                if poster or background:
-                    self.library.upload_images(obj, poster=poster, background=background)
-
             logger.info("")
             util.separator()
             logger.info("")

             year = None
-            if "year" in methods:
+            if "year" in methods and not self.library.is_music:
                 next_year = datetime.now().year + 1
                 if meta[methods["year"]] is None:
                     raise Failed("Metadata Error: year attribute is blank")
@@ -370,15 +390,14 @@ class MetadataFile(DataFile):
                 logger.error(f"Skipping {mapping_name}")
                 continue

-            item_type = "Movie" if self.library.is_movie else "Show"
-            logger.info(f"Updating {item_type}: {title}...")
+            logger.info(f"Updating {self.library.type}: {title}...")

             tmdb_item = None
             tmdb_is_movie = None
-            if ("tmdb_show" in methods or "tmdb_id" in methods) and "tmdb_movie" in methods:
+            if not self.library.is_music and ("tmdb_show" in methods or "tmdb_id" in methods) and "tmdb_movie" in methods:
                 logger.error("Metadata Error: Cannot use tmdb_movie and tmdb_show when editing the same metadata item")
-            if "tmdb_show" in methods or "tmdb_id" in methods or "tmdb_movie" in methods:
+            if not self.library.is_music and "tmdb_show" in methods or "tmdb_id" in methods or "tmdb_movie" in methods:
                 try:
                     if "tmdb_show" in methods or "tmdb_id" in methods:
                         data = meta[methods["tmdb_show" if "tmdb_show" in methods else "tmdb_id"]]
@@ -421,134 +440,255 @@ class MetadataFile(DataFile):
             edits = {}
             add_edit("title", item, meta, methods, value=title)
             add_edit("sort_title", item, meta, methods, key="titleSort")
-            add_edit("originally_available", item, meta, methods, key="originallyAvailableAt", value=originally_available, var_type="date")
-            add_edit("critic_rating", item, meta, methods, value=rating, key="rating", var_type="float")
-            add_edit("audience_rating", item, meta, methods, key="audienceRating", var_type="float")
-            add_edit("user_rating", item, meta, methods, key="userRating", var_type="float")
-            add_edit("content_rating", item, meta, methods, key="contentRating")
-            add_edit("original_title", item, meta, methods, key="originalTitle", value=original_title)
-            add_edit("studio", item, meta, methods, value=studio)
-            add_edit("tagline", item, meta, methods, value=tagline)
+            if not self.library.is_music:
+                add_edit("originally_available", item, meta, methods, key="originallyAvailableAt", value=originally_available, var_type="date")
+                add_edit("critic_rating", item, meta, methods, value=rating, key="rating", var_type="float")
+                add_edit("audience_rating", item, meta, methods, key="audienceRating", var_type="float")
+                add_edit("user_rating", item, meta, methods, key="userRating", var_type="float")
+                add_edit("content_rating", item, meta, methods, key="contentRating")
+                add_edit("original_title", item, meta, methods, key="originalTitle", value=original_title)
+                add_edit("studio", item, meta, methods, value=studio)
+                add_edit("tagline", item, meta, methods, value=tagline)
             add_edit("summary", item, meta, methods, value=summary)
-            if self.library.edit_item(item, mapping_name, item_type, edits):
+            if self.library.edit_item(item, mapping_name, self.library.type, edits):
                 updated = True

             advance_edits = {}
-            for advance_edit in ["episode_sorting", "keep_episodes", "delete_episodes", "season_display", "episode_ordering", "metadata_language", "use_original_title"]:
-                is_show = advance_edit in ["episode_sorting", "keep_episodes", "delete_episodes", "season_display", "episode_ordering"]
+            for advance_edit in advance_tags_to_edit[self.library.type]:
                 is_new_agent = advance_edit in ["metadata_language", "use_original_title"]
-                add_advanced_edit(advance_edit, item, meta, methods, show_library=is_show, new_agent=is_new_agent)
-            if self.library.edit_item(item, mapping_name, item_type, advance_edits, advanced=True):
+                add_advanced_edit(advance_edit, item, meta, methods, new_agent=is_new_agent)
+            if self.library.edit_item(item, mapping_name, self.library.type, advance_edits, advanced=True):
                 updated = True

-            for tag_edit in ["genre", "label", "collection", "country", "director", "producer", "writer"]:
-                is_movie = tag_edit in ["country", "director", "producer", "writer"]
-                has_extra = genres if tag_edit == "genre" else None
-                if edit_tags(tag_edit, item, meta, methods, movie_library=is_movie, extra=has_extra):
+            for tag_edit in tags_to_edit[self.library.type]:
+                if self.edit_tags(tag_edit, item, meta, methods, extra=genres if tag_edit == "genre" else None):
                     updated = True

-            logger.info(f"{item_type}: {mapping_name} Details Update {'Complete' if updated else 'Not Needed'}")
+            logger.info(f"{self.library.type}: {mapping_name} Details Update {'Complete' if updated else 'Not Needed'}")

-            set_images(item, meta, methods)
+            self.set_images(item, meta, methods)

             if "seasons" in methods and self.library.is_show:
-                if meta[methods["seasons"]]:
-                    for season_id in meta[methods["seasons"]]:
-                        updated = False
-                        logger.info("")
-                        logger.info(f"Updating season {season_id} of {mapping_name}...")
-                        if isinstance(season_id, int):
-                            season = None
-                            for s in item.seasons():
-                                if s.index == season_id:
-                                    season = s
-                                    break
-                            if season is None:
-                                logger.error(f"Metadata Error: Season: {season_id} not found")
-                            else:
-                                season_dict = meta[methods["seasons"]][season_id]
-                                season_methods = {sm.lower(): sm for sm in season_dict}
-                                if "title" in season_methods and season_dict[season_methods["title"]]:
-                                    title = season_dict[season_methods["title"]]
-                                else:
-                                    title = season.title
-                                if "sub" in season_methods:
-                                    if season_dict[season_methods["sub"]] is None:
-                                        logger.error("Metadata Error: sub attribute is blank")
-                                    elif season_dict[season_methods["sub"]] is True and "(SUB)" not in title:
-                                        title = f"{title} (SUB)"
-                                    elif season_dict[season_methods["sub"]] is False and title.endswith(" (SUB)"):
-                                        title = title[:-6]
-                                    else:
-                                        logger.error("Metadata Error: sub attribute must be True or False")
-                                edits = {}
-                                add_edit("title", season, season_dict, season_methods, value=title)
-                                add_edit("summary", season, season_dict, season_methods)
-                                if self.library.edit_item(season, season_id, "Season", edits):
-                                    updated = True
-                                set_images(season, season_dict, season_methods)
-                        else:
-                            logger.error(f"Metadata Error: Season: {season_id} invalid, it must be an integer")
-                        logger.info(f"Season {season_id} of {mapping_name} Details Update {'Complete' if updated else 'Not Needed'}")
-                else:
-                    logger.error("Metadata Error: seasons attribute is blank")
-            elif "seasons" in methods:
-                logger.error("Metadata Error: seasons attribute only works for show libraries")
+                if not meta[methods["seasons"]]:
+                    logger.error("Metadata Error: seasons attribute is blank")
+                elif not isinstance(meta[methods["seasons"]], dict):
+                    logger.error("Metadata Error: seasons attribute must be a dictionary")
+                else:
+                    for season_id, season_dict in meta[methods["seasons"]].items():
+                        updated = False
+                        logger.info("")
+                        logger.info(f"Updating season {season_id} of {mapping_name}...")
+                        try:
+                            if isinstance(season_id, int):
+                                season = item.season(season=season_id)
+                            else:
+                                season = item.season(title=season_id)
+                        except NotFound:
+                            logger.error(f"Metadata Error: Season: {season_id} not found")
+                            continue
+                        season_methods = {sm.lower(): sm for sm in season_dict}
+                        if "title" in season_methods and season_dict[season_methods["title"]]:
+                            title = season_dict[season_methods["title"]]
+                        else:
+                            title = season.title
+                        if "sub" in season_methods:
+                            if season_dict[season_methods["sub"]] is None:
+                                logger.error("Metadata Error: sub attribute is blank")
+                            elif season_dict[season_methods["sub"]] is True and "(SUB)" not in title:
+                                title = f"{title} (SUB)"
+                            elif season_dict[season_methods["sub"]] is False and title.endswith(" (SUB)"):
+                                title = title[:-6]
+                            else:
+                                logger.error("Metadata Error: sub attribute must be True or False")
+                        edits = {}
+                        add_edit("title", season, season_dict, season_methods, value=title)
+                        add_edit("summary", season, season_dict, season_methods)
+                        if self.library.edit_item(season, season_id, "Season", edits):
+                            updated = True
+                        self.set_images(season, season_dict, season_methods)
+                        logger.info(f"Season {season_id} of {mapping_name} Details Update {'Complete' if updated else 'Not Needed'}")
+
+                        if "episodes" in season_methods and self.library.is_show:
+                            if not season_dict[season_methods["episodes"]]:
+                                logger.error("Metadata Error: episodes attribute is blank")
+                            elif not isinstance(season_dict[season_methods["episodes"]], dict):
+                                logger.error("Metadata Error: episodes attribute must be a dictionary")
+                            else:
+                                for episode_str, episode_dict in season_dict[season_methods["episodes"]].items():
+                                    updated = False
+                                    logger.info("")
+                                    logger.info(f"Updating episode {episode_str} in {season_id} of {mapping_name}...")
+                                    try:
+                                        if isinstance(episode_str, int):
+                                            episode = season.episode(episode=episode_str)
+                                        else:
+                                            episode = season.episode(title=episode_str)
+                                    except NotFound:
+                                        logger.error(f"Metadata Error: Episode {episode_str} in Season {season_id} not found")
+                                        continue
+                                    episode_methods = {em.lower(): em for em in episode_dict}
+                                    if "title" in episode_methods and episode_dict[episode_methods["title"]]:
+                                        title = episode_dict[episode_methods["title"]]
+                                    else:
+                                        title = episode.title
+                                    if "sub" in episode_dict:
+                                        if episode_dict[episode_methods["sub"]] is None:
+                                            logger.error("Metadata Error: sub attribute is blank")
+                                        elif episode_dict[episode_methods["sub"]] is True and "(SUB)" not in title:
+                                            title = f"{title} (SUB)"
+                                        elif episode_dict[episode_methods["sub"]] is False and title.endswith(" (SUB)"):
+                                            title = title[:-6]
+                                        else:
+                                            logger.error("Metadata Error: sub attribute must be True or False")
+                                    edits = {}
+                                    add_edit("title", episode, episode_dict, episode_methods, value=title)
+                                    add_edit("sort_title", episode, episode_dict, episode_methods, key="titleSort")
+                                    add_edit("rating", episode, episode_dict, episode_methods, var_type="float")
+                                    add_edit("originally_available", episode, episode_dict, episode_methods, key="originallyAvailableAt", var_type="date")
+                                    add_edit("summary", episode, episode_dict, episode_methods)
+                                    if self.library.edit_item(episode, f"{episode_str} in Season: {season_id}", "Episode", edits):
+                                        updated = True
+                                    for tag_edit in ["director", "writer"]:
+                                        if self.edit_tags(tag_edit, episode, episode_dict, episode_methods):
+                                            updated = True
+                                    self.set_images(episode, episode_dict, episode_methods)
+                                    logger.info(f"Episode {episode_str} in Season {season_id} of {mapping_name} Details Update {'Complete' if updated else 'Not Needed'}")

             if "episodes" in methods and self.library.is_show:
-                if meta[methods["episodes"]]:
-                    for episode_str in meta[methods["episodes"]]:
-                        updated = False
-                        logger.info("")
-                        match = re.search("[Ss]\\d+[Ee]\\d+", episode_str)
-                        if match:
-                            output = match.group(0)[1:].split("E" if "E" in match.group(0) else "e")
-                            season_id = int(output[0])
-                            episode_id = int(output[1])
-                            logger.info(f"Updating episode S{season_id}E{episode_id} of {mapping_name}...")
-                            try:
-                                episode = item.episode(season=season_id, episode=episode_id)
-                            except NotFound:
-                                logger.error(f"Metadata Error: episode {episode_id} of season {season_id} not found")
-                            else:
-                                episode_dict = meta[methods["episodes"]][episode_str]
-                                episode_methods = {em.lower(): em for em in episode_dict}
-                                if "title" in episode_methods and episode_dict[episode_methods["title"]]:
-                                    title = episode_dict[episode_methods["title"]]
-                                else:
-                                    title = episode.title
-                                if "sub" in episode_dict:
-                                    if episode_dict[episode_methods["sub"]] is None:
-                                        logger.error("Metadata Error: sub attribute is blank")
-                                    elif episode_dict[episode_methods["sub"]] is True and "(SUB)" not in title:
-                                        title = f"{title} (SUB)"
-                                    elif episode_dict[episode_methods["sub"]] is False and title.endswith(" (SUB)"):
-                                        title = title[:-6]
-                                    else:
-                                        logger.error("Metadata Error: sub attribute must be True or False")
-                                edits = {}
-                                add_edit("title", episode, episode_dict, episode_methods, value=title)
-                                add_edit("sort_title", episode, episode_dict, episode_methods, key="titleSort")
-                                add_edit("rating", episode, episode_dict, episode_methods, var_type="float")
-                                add_edit("originally_available", episode, episode_dict, episode_methods, key="originallyAvailableAt", var_type="date")
-                                add_edit("summary", episode, episode_dict, episode_methods)
-                                if self.library.edit_item(episode, f"{season_id} Episode: {episode_id}", "Season", edits):
-                                    updated = True
-                                if edit_tags("director", episode, episode_dict, episode_methods):
-                                    updated = True
-                                if edit_tags("writer", episode, episode_dict, episode_methods):
-                                    updated = True
-                                set_images(episode, episode_dict, episode_methods)
-                                logger.info(f"Episode S{season_id}E{episode_id} of {mapping_name} Details Update {'Complete' if updated else 'Not Needed'}")
-                        else:
-                            logger.error(f"Metadata Error: episode {episode_str} invalid must have S##E## format")
-                else:
-                    logger.error("Metadata Error: episodes attribute is blank")
-            elif "episodes" in methods:
-                logger.error("Metadata Error: episodes attribute only works for show libraries")
+                if not meta[methods["episodes"]]:
+                    logger.error("Metadata Error: episodes attribute is blank")
+                elif not isinstance(meta[methods["episodes"]], dict):
+                    logger.error("Metadata Error: episodes attribute must be a dictionary")
+                else:
+                    for episode_str, episode_dict in meta[methods["episodes"]].items():
+                        updated = False
+                        logger.info("")
+                        match = re.search("[Ss]\\d+[Ee]\\d+", episode_str)
+                        if not match:
+                            logger.error(f"Metadata Error: episode {episode_str} invalid must have S##E## format")
+                            continue
+                        output = match.group(0)[1:].split("E" if "E" in match.group(0) else "e")
+                        season_id = int(output[0])
+                        episode_id = int(output[1])
+                        logger.info(f"Updating episode S{season_id}E{episode_id} of {mapping_name}...")
+                        try:
+                            episode = item.episode(season=season_id, episode=episode_id)
+                        except NotFound:
+                            logger.error(f"Metadata Error: episode {episode_id} of season {season_id} not found")
+                            continue
+                        episode_methods = {em.lower(): em for em in episode_dict}
+                        if "title" in episode_methods and episode_dict[episode_methods["title"]]:
+                            title = episode_dict[episode_methods["title"]]
+                        else:
+                            title = episode.title
+                        if "sub" in episode_dict:
+                            if episode_dict[episode_methods["sub"]] is None:
+                                logger.error("Metadata Error: sub attribute is blank")
+                            elif episode_dict[episode_methods["sub"]] is True and "(SUB)" not in title:
+                                title = f"{title} (SUB)"
+                            elif episode_dict[episode_methods["sub"]] is False and title.endswith(" (SUB)"):
+                                title = title[:-6]
+                            else:
+                                logger.error("Metadata Error: sub attribute must be True or False")
+                        edits = {}
+                        add_edit("title", episode, episode_dict, episode_methods, value=title)
+                        add_edit("sort_title", episode, episode_dict, episode_methods, key="titleSort")
+                        add_edit("rating", episode, episode_dict, episode_methods, var_type="float")
+                        add_edit("originally_available", episode, episode_dict, episode_methods, key="originallyAvailableAt", var_type="date")
+                        add_edit("summary", episode, episode_dict, episode_methods)
+                        if self.library.edit_item(episode, f"{season_id} Episode: {episode_id}", "Season", edits):
+                            updated = True
+                        for tag_edit in ["director", "writer"]:
+                            if self.edit_tags(tag_edit, episode, episode_dict, episode_methods):
+                                updated = True
+                        self.set_images(episode, episode_dict, episode_methods)
+                        logger.info(f"Episode S{season_id}E{episode_id} of {mapping_name} Details Update {'Complete' if updated else 'Not Needed'}")
+
+            if "albums" in methods and self.library.is_music:
+                if not meta[methods["albums"]]:
+                    logger.error("Metadata Error: albums attribute is blank")
+                elif not isinstance(meta[methods["albums"]], dict):
+                    logger.error("Metadata Error: albums attribute must be a dictionary")
+                else:
+                    for album_name, album_dict in meta[methods["albums"]].items():
+                        updated = False
+                        title = None
+                        album_methods = {am.lower(): am for am in album_dict}
+                        logger.info("")
+                        logger.info(f"Updating album {album_name} of {mapping_name}...")
+                        try:
+                            album = item.album(album_name)
+                        except NotFound:
+                            try:
+                                if "alt_title" not in album_methods or not album_dict[album_methods["alt_title"]]:
+                                    raise NotFound
+                                album = item.album(album_dict[album_methods["alt_title"]])
+                                title = album_name
+                            except NotFound:
+                                logger.error(f"Metadata Error: Album: {album_name} not found")
+                                continue
+                        if not title:
+                            title = album.title
+                        edits = {}
+                        add_edit("title", album, album_dict, album_methods, value=title)
+                        add_edit("sort_title", album, album_dict, album_methods, key="titleSort")
+                        add_edit("rating", album, album_dict, album_methods, var_type="float")
+                        add_edit("originally_available", album, album_dict, album_methods, key="originallyAvailableAt", var_type="date")
+                        add_edit("record_label", album, album_dict, album_methods, key="studio")
+                        add_edit("summary", album, album_dict, album_methods)
+                        if self.library.edit_item(album, title, "Album", edits):
+                            updated = True
+                        for tag_edit in ["genre", "style", "mood", "collection", "label"]:
+                            if self.edit_tags(tag_edit, album, album_dict, album_methods):
+                                updated = True
+                        self.set_images(album, album_dict, album_methods)
+                        logger.info(f"Album: {title} of {mapping_name} Details Update {'Complete' if updated else 'Not Needed'}")
+
+                        if "tracks" in album_methods:
+                            if not album_dict[album_methods["tracks"]]:
+                                logger.error("Metadata Error: tracks attribute is blank")
+                            elif not isinstance(album_dict[album_methods["tracks"]], dict):
+                                logger.error("Metadata Error: tracks attribute must be a dictionary")
+                            else:
+                                for track_num, track_dict in album_dict[album_methods["tracks"]].items():
+                                    updated = False
+                                    title = None
+                                    track_methods = {tm.lower(): tm for tm in track_dict}
+                                    logger.info("")
+                                    logger.info(f"Updating track {track_num} on {album_name} of {mapping_name}...")
+                                    try:
+                                        if isinstance(track_num, int):
+                                            track = album.track(track=track_num)
+                                        else:
+                                            track = album.track(title=track_num)
+                                    except NotFound:
+                                        try:
+                                            if "alt_title" not in track_methods or not track_dict[track_methods["alt_title"]]:
+                                                raise NotFound
+                                            track = album.track(title=track_dict[track_methods["alt_title"]])
+                                            title = track_num
+                                        except NotFound:
+                                            logger.error(f"Metadata Error: Track: {track_num} not found")
+                                            continue
+                                    if not title:
+                                        title = track.title
+                                    edits = {}
+                                    add_edit("title", track, track_dict, track_methods, value=title)
+                                    add_edit("rating", track, track_dict, track_methods, var_type="float")
+                                    add_edit("track", track, track_dict, track_methods, key="index", var_type="int")
+                                    add_edit("disc", track, track_dict, track_methods, key="parentIndex", var_type="int")
+                                    add_edit("original_artist", track, track_dict, track_methods, key="originalTitle")
+                                    if self.library.edit_item(album, title, "Track", edits):
+                                        updated = True
+                                    if self.edit_tags("mood", track, track_dict, track_methods):
+                                        updated = True
+                                    logger.info(f"Track: {track_num} on Album: {title} of {mapping_name} Details Update {'Complete' if updated else 'Not Needed'}")

 class PlaylistFile(DataFile):
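The flattened show-level episode loop above parses mapping keys like `S01E05` with `re.search("[Ss]\\d+[Ee]\\d+", ...)` and bails early on bad formats. That parsing step in isolation (a sketch; `parse_episode_key` is a hypothetical helper, the diff does this inline):

```python
import re

def parse_episode_key(episode_str):
    """Return (season, episode) from an S##E## key, or None if the format is wrong."""
    match = re.search(r"[Ss]\d+[Ee]\d+", episode_str)
    if not match:
        return None
    # Drop the leading S/s, then split on whichever case of E the match used
    output = match.group(0)[1:].split("E" if "E" in match.group(0) else "e")
    return int(output[0]), int(output[1])
```

Returning `None` instead of raising matches the diff's `continue`-on-bad-format behavior, which lets one malformed key skip without aborting the whole metadata run.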

@@ -34,6 +34,19 @@ class OMDbObj:
         self.metacritic_rating = None
         self.imdb_id = data["imdbID"]
         self.type = data["Type"]
+        try:
+            self.series_id = data["seriesID"]
+        except (ValueError, TypeError, KeyError):
+            self.series_id = None
+        try:
+            self.season_num = int(data["Season"])
+        except (ValueError, TypeError, KeyError):
+            self.season_num = None
+        try:
+            self.episode_num = int(data["Episode"])
+        except (ValueError, TypeError, KeyError):
+            self.episode_num = None

 class OMDb:
     def __init__(self, config, params):
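The new `OMDbObj` fields tolerate missing or non-numeric payload values (OMDb returns strings like `"N/A"`, and episode fields are absent for movies) by catching `ValueError`/`TypeError`/`KeyError` together. The same pattern as a reusable helper (a sketch; `safe_int` is hypothetical, the diff repeats the try/except inline per field):

```python
def safe_int(data, key):
    """Return data[key] as an int, or None when the key is missing,
    the value is None, or it is not numeric (e.g. OMDb's "N/A")."""
    try:
        return int(data[key])          # KeyError: missing; TypeError: None; ValueError: "N/A"
    except (ValueError, TypeError, KeyError):
        return None
```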

@ -41,7 +41,47 @@ search_translation = {
"audio_language": "audioLanguage", "audio_language": "audioLanguage",
"progress": "inProgress", "progress": "inProgress",
"episode_progress": "episode.inProgress", "episode_progress": "episode.inProgress",
"unplayed_episodes": "show.unwatchedLeaves" "unplayed_episodes": "show.unwatchedLeaves",
"artist_title": "artist.title",
"artist_user_rating": "artist.userRating",
"artist_genre": "artist.genre",
"artist_collection": "artist.collection",
"artist_country": "artist.country",
"artist_mood": "artist.mood",
"artist_style": "artist.style",
"artist_added": "artist.addedAt",
"artist_last_played": "artist.lastViewedAt",
"artist_unmatched": "artist.unmatched",
"album_title": "album.title",
"album_year": "album.year",
"album_decade": "album.decade",
"album_genre": "album.genre",
"album_plays": "album.viewCount",
"album_last_played": "album.lastViewedAt",
"album_user_rating": "album.userRating",
"album_critic_rating": "album.rating",
"album_record_label": "album.studio",
"album_mood": "album.mood",
"album_style": "album.style",
"album_format": "album.format",
"album_type": "album.subformat",
"album_collection": "album.collection",
"album_added": "album.addedAt",
"album_released": "album.originallyAvailableAt",
"album_unmatched": "album.unmatched",
"album_source": "album.source",
"album_label": "album.label",
"track_mood": "track.mood",
"track_title": "track.title",
"track_plays": "track.viewCount",
"track_last_played": "track.lastViewedAt",
"track_skips": "track.skipCount",
"track_last_skipped": "track.lastSkippedAt",
"track_user_rating": "track.userRating",
"track_last_rated": "track.lastRatedAt",
"track_added": "track.addedAt",
"track_trash": "track.trash",
"track_source": "track.source"
}
show_translation = {
"title": "show.title",
@@ -71,6 +111,7 @@ modifier_translation = {
".before": "%3C%3C", ".after": "%3E%3E", ".begins": "%3C", ".ends": "%3E"
}
episode_sorting_options = {"default": "-1", "oldest": "0", "newest": "1"}
album_sorting_options = {"default": -1, "newest": 0, "oldest": 1, "name": 2}
keep_episodes_options = {"all": 0, "5_latest": 5, "3_latest": 3, "latest": 1, "past_3": -3, "past_7": -7, "past_30": -30}
delete_episodes_options = {"never": 0, "day": 1, "week": 7, "refresh": 100}
season_display_options = {"default": -1, "show": 0, "hide": 1}
@@ -83,10 +124,13 @@ metadata_language_options = {lang.lower(): lang for lang in plex_languages}
metadata_language_options["default"] = None
use_original_title_options = {"default": -1, "no": 0, "yes": 1}
collection_order_options = ["release", "alpha", "custom"]
collection_level_show_options = ["episode", "season"]
collection_level_music_options = ["album", "track"]
collection_level_options = collection_level_show_options + collection_level_music_options
collection_mode_keys = {-1: "default", 0: "hide", 1: "hideItems", 2: "showItems"}
collection_order_keys = {0: "release", 1: "alpha", 2: "custom"}
item_advance_keys = {
"item_album_sorting": ("albumSort", album_sorting_options),
"item_episode_sorting": ("episodeSort", episode_sorting_options),
"item_keep_episodes": ("autoDeletionItemPolicyUnwatchedLibrary", keep_episodes_options),
"item_delete_episodes": ("autoDeletionItemPolicyWatchedLibrary", delete_episodes_options),
@@ -96,6 +140,48 @@ item_advance_keys = {
"item_use_original_title": ("useOriginalTitle", use_original_title_options)
}
new_plex_agents = ["tv.plex.agents.movie", "tv.plex.agents.series"]
music_searches = [
"artist_title", "artist_title.not", "artist_title.is", "artist_title.isnot", "artist_title.begins", "artist_title.ends",
"artist_user_rating.gt", "artist_user_rating.gte", "artist_user_rating.lt", "artist_user_rating.lte",
"artist_genre", "artist_genre.not",
"artist_collection", "artist_collection.not",
"artist_country", "artist_country.not",
"artist_mood", "artist_mood.not",
"artist_style", "artist_style.not",
"artist_added", "artist_added.not", "artist_added.before", "artist_added.after",
"artist_last_played", "artist_last_played.not", "artist_last_played.before", "artist_last_played.after",
"artist_unmatched",
"album_title", "album_title.not", "album_title.is", "album_title.isnot", "album_title.begins", "album_title.ends",
"album_year.gt", "album_year.gte", "album_year.lt", "album_year.lte",
"album_decade",
"album_genre", "album_genre.not",
"album_plays.gt", "album_plays.gte", "album_plays.lt", "album_plays.lte",
"album_last_played", "album_last_played.not", "album_last_played.before", "album_last_played.after",
"album_user_rating.gt", "album_user_rating.gte", "album_user_rating.lt", "album_user_rating.lte",
"album_critic_rating.gt", "album_critic_rating.gte", "album_critic_rating.lt", "album_critic_rating.lte",
"album_record_label", "album_record_label.not", "album_record_label.is", "album_record_label.isnot", "album_record_label.begins", "album_record_label.ends",
"album_mood", "album_mood.not",
"album_style", "album_style.not",
"album_format", "album_format.not",
"album_type", "album_type.not",
"album_collection", "album_collection.not",
"album_added", "album_added.not", "album_added.before", "album_added.after",
"album_released", "album_released.not", "album_released.before", "album_released.after",
"album_unmatched",
"album_source", "album_source.not",
"album_label", "album_label.not",
"track_mood", "track_mood.not",
"track_title", "track_title.not", "track_title.is", "track_title.isnot", "track_title.begins", "track_title.ends",
"track_plays.gt", "track_plays.gte", "track_plays.lt", "track_plays.lte",
"track_last_played", "track_last_played.not", "track_last_played.before", "track_last_played.after",
"track_skips.gt", "track_skips.gte", "track_skips.lt", "track_skips.lte",
"track_last_skipped", "track_last_skipped.not", "track_last_skipped.before", "track_last_skipped.after",
"track_user_rating.gt", "track_user_rating.gte", "track_user_rating.lt", "track_user_rating.lte",
"track_last_rated", "track_last_rated.not", "track_last_rated.before", "track_last_rated.after",
"track_added", "track_added.not", "track_added.before", "track_added.after",
"track_trash",
"track_source", "track_source.not"
]
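Each entry in music_searches pairs with the search_translation and modifier_translation tables above; a minimal sketch of the translation step (the `translate` helper is illustrative, not from the codebase; the tables below are trimmed copies of entries visible above):

```python
# Trimmed copies of the translation tables from this file:
search_translation = {"album_added": "album.addedAt", "artist_title": "artist.title"}
modifier_translation = {"": "", ".before": "%3C%3C", ".after": "%3E%3E", ".begins": "%3C", ".ends": "%3E"}

def translate(search):
    """Split a user-facing search like 'album_added.before' into attribute and
    modifier, then map both into the URL-encoded Plex filter key."""
    attr, _, mod = search.partition(".")
    mod = f".{mod}" if mod else ""
    key = search_translation.get(attr, attr)
    return key + modifier_translation[mod]

translate("album_added.before")  # -> "album.addedAt%3C%3C"
```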
searches = [
"title", "title.not", "title.is", "title.isnot", "title.begins", "title.ends",
"studio", "studio.not", "studio.is", "studio.isnot", "studio.begins", "studio.ends",
@@ -129,7 +215,7 @@ searches = [
"episode_plays.gt", "episode_plays.gte", "episode_plays.lt", "episode_plays.lte",
"episode_user_rating.gt", "episode_user_rating.gte", "episode_user_rating.lt", "episode_user_rating.lte",
"episode_year", "episode_year.not", "episode_year.gt", "episode_year.gte", "episode_year.lt", "episode_year.lte"
] + music_searches
and_searches = [
"title.and", "studio.and", "actor.and", "audio_language.and", "collection.and",
"content_rating.and", "country.and", "director.and", "genre.and", "label.and",
@@ -157,18 +243,29 @@ show_only_searches = [
"episode_year", "episode_year.not", "episode_year.gt", "episode_year.gte", "episode_year.lt", "episode_year.lte",
"unplayed_episodes", "episode_unplayed", "episode_duplicate", "episode_progress", "episode_unmatched",
]
string_attributes = ["title", "studio", "episode_title", "artist_title", "album_title", "album_record_label", "track_title"]
float_attributes = [
"user_rating", "episode_user_rating", "critic_rating", "audience_rating",
"artist_user_rating", "album_user_rating", "album_critic_rating", "track_user_rating"
]
boolean_attributes = [
"hdr", "unmatched", "duplicate", "unplayed", "progress", "trash", "unplayed_episodes", "episode_unplayed",
"episode_duplicate", "episode_progress", "episode_unmatched", "artist_unmatched", "album_unmatched", "track_trash"
]
tmdb_attributes = ["actor", "director", "producer", "writer"]
date_attributes = [
"added", "episode_added", "release", "episode_air_date", "last_played", "episode_last_played",
"first_episode_aired", "last_episode_aired", "artist_added", "artist_last_played", "album_last_played",
"album_added", "album_released", "track_last_played", "track_last_skipped", "track_last_rated", "track_added"
]
year_attributes = ["decade", "year", "episode_year", "album_year", "album_decade"]
number_attributes = ["plays", "episode_plays", "duration", "tmdb_vote_count", "album_plays", "track_plays", "track_skips"] + year_attributes
search_display = {"added": "Date Added", "release": "Release Date", "hdr": "HDR", "progress": "In Progress", "episode_progress": "Episode In Progress"}
tag_attributes = [
"actor", "audio_language", "collection", "content_rating", "country", "director", "genre", "label", "network",
"producer", "resolution", "studio", "subtitle_language", "writer", "artist_genre", "artist_collection",
"artist_country", "artist_mood", "artist_style", "album_genre", "album_mood", "album_style", "album_format",
"album_type", "album_collection", "album_source", "album_label", "track_mood", "track_source"
]
movie_sorts = {
"title.asc": "titleSort", "title.desc": "titleSort%3Adesc",
@@ -180,8 +277,12 @@ movie_sorts = {
"user_rating.asc": "userRating", "user_rating.desc": "userRating%3Adesc",
"content_rating.asc": "contentRating", "content_rating.desc": "contentRating%3Adesc",
"duration.asc": "duration", "duration.desc": "duration%3Adesc",
"progress.asc": "viewOffset", "progress.desc": "viewOffset%3Adesc",
"plays.asc": "viewCount", "plays.desc": "viewCount%3Adesc",
"added.asc": "addedAt", "added.desc": "addedAt%3Adesc",
"viewed.asc": "lastViewedAt", "viewed.desc": "lastViewedAt%3Adesc",
"resolution.asc": "mediaHeight", "resolution.desc": "mediaHeight%3Adesc",
"bitrate.asc": "mediaBitrate", "bitrate.desc": "mediaBitrate%3Adesc",
"random": "random"
}
show_sorts = {
@@ -193,8 +294,10 @@ show_sorts = {
"audience_rating.asc": "audienceRating", "audience_rating.desc": "audienceRating%3Adesc",
"user_rating.asc": "userRating", "user_rating.desc": "userRating%3Adesc",
"content_rating.asc": "contentRating", "content_rating.desc": "contentRating%3Adesc",
"unplayed.asc": "unviewedLeafCount", "unplayed.desc": "unviewedLeafCount%3Adesc",
"episode_added.asc": "episode.addedAt", "episode_added.desc": "episode.addedAt%3Adesc",
"added.asc": "addedAt", "added.desc": "addedAt%3Adesc",
"viewed.asc": "lastViewedAt", "viewed.desc": "lastViewedAt%3Adesc",
"random": "random"
}
season_sorts = {
@@ -215,11 +318,61 @@ episode_sorts = {
"audience_rating.asc": "audienceRating", "audience_rating.desc": "audienceRating%3Adesc",
"user_rating.asc": "userRating", "user_rating.desc": "userRating%3Adesc",
"duration.asc": "duration", "duration.desc": "duration%3Adesc",
"progress.asc": "viewOffset", "progress.desc": "viewOffset%3Adesc",
"plays.asc": "viewCount", "plays.desc": "viewCount%3Adesc",
"added.asc": "addedAt", "added.desc": "addedAt%3Adesc",
"viewed.asc": "lastViewedAt", "viewed.desc": "lastViewedAt%3Adesc",
"resolution.asc": "mediaHeight", "resolution.desc": "mediaHeight%3Adesc",
"bitrate.asc": "mediaBitrate", "bitrate.desc": "mediaBitrate%3Adesc",
"random": "random"
}
artist_sorts = {
"title.asc": "titleSort", "title.desc": "titleSort%3Adesc",
"user_rating.asc": "userRating", "user_rating.desc": "userRating%3Adesc",
"added.asc": "addedAt", "added.desc": "addedAt%3Adesc",
"played.asc": "lastViewedAt", "played.desc": "lastViewedAt%3Adesc",
"plays.asc": "viewCount", "plays.desc": "viewCount%3Adesc",
"random": "random"
}
album_sorts = {
"title.asc": "titleSort", "title.desc": "titleSort%3Adesc",
"album_artist.asc": "artist.titleSort%2Calbum.titleSort%2Calbum.index%2Calbum.id%2Calbum.originallyAvailableAt",
"album_artist.desc": "artist.titleSort%3Adesc%2Calbum.titleSort%2Calbum.index%2Calbum.id%2Calbum.originallyAvailableAt",
"year.asc": "year", "year.desc": "year%3Adesc",
"originally_available.asc": "originallyAvailableAt", "originally_available.desc": "originallyAvailableAt%3Adesc",
"release.asc": "originallyAvailableAt", "release.desc": "originallyAvailableAt%3Adesc",
"critic_rating.asc": "rating", "critic_rating.desc": "rating%3Adesc",
"user_rating.asc": "userRating", "user_rating.desc": "userRating%3Adesc",
"added.asc": "addedAt", "added.desc": "addedAt%3Adesc",
"played.asc": "lastViewedAt", "played.desc": "lastViewedAt%3Adesc",
"plays.asc": "viewCount", "plays.desc": "viewCount%3Adesc",
"random": "random"
}
track_sorts = {
"title.asc": "titleSort", "title.desc": "titleSort%3Adesc",
"album_artist.asc": "artist.titleSort%2Calbum.titleSort%2Calbum.year%2Ctrack.absoluteIndex%2Ctrack.index%2Ctrack.titleSort%2Ctrack.id",
"album_artist.desc": "artist.titleSort%3Adesc%2Calbum.titleSort%2Calbum.year%2Ctrack.absoluteIndex%2Ctrack.index%2Ctrack.titleSort%2Ctrack.id",
"artist.asc": "originalTitle", "artist.desc": "originalTitle%3Adesc",
"album.asc": "album.titleSort", "album.desc": "album.titleSort%3Adesc",
"user_rating.asc": "userRating", "user_rating.desc": "userRating%3Adesc",
"duration.asc": "duration", "duration.desc": "duration%3Adesc",
"plays.asc": "viewCount", "plays.desc": "viewCount%3Adesc",
"added.asc": "addedAt", "added.desc": "addedAt%3Adesc",
"played.asc": "lastViewedAt", "played.desc": "lastViewedAt%3Adesc",
"rated.asc": "lastRatedAt", "rated.desc": "lastRatedAt%3Adesc",
"popularity.asc": "ratingCount", "popularity.desc": "ratingCount%3Adesc",
"bitrate.asc": "mediaBitrate", "bitrate.desc": "mediaBitrate%3Adesc",
"random": "random"
}
sort_types = {
"movies": (1, movie_sorts),
"shows": (2, show_sorts),
"seasons": (3, season_sorts),
"episodes": (4, episode_sorts),
"artists": (8, artist_sorts),
"albums": (9, album_sorts),
"tracks": (10, track_sorts)
}
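The sort_types table above maps each collection level to a Plex search type number plus its sort dictionary; a hedged sketch of how a builder might resolve a user-supplied sort into query parameters (`resolve_sort` is illustrative, not from the codebase; the tables below are trimmed copies):

```python
# Trimmed copies of the tables defined above:
album_sorts = {"added.asc": "addedAt", "added.desc": "addedAt%3Adesc", "random": "random"}
sort_types = {"albums": (9, album_sorts)}

def resolve_sort(level, sort):
    """Validate a sort name for a collection level and build the query fragment."""
    search_type, sorts = sort_types[level]
    if sort not in sorts:
        raise ValueError(f"sort {sort} invalid for {level}")
    return f"type={search_type}&sort={sorts[sort]}"

resolve_sort("albums", "added.desc")  # -> "type=9&sort=addedAt%3Adesc"
```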
class Plex(Library):
def __init__(self, config, params):
@@ -246,8 +399,8 @@ class Plex(Library):
self.Plex = s
break
if not self.Plex:
raise Failed(f"Plex Error: Plex Library '{params['name']}' not found. Options: {library_names}")
if self.Plex.type in ["movie", "show", "artist"]:
self.type = self.Plex.type.capitalize()
else:
raise Failed(f"Plex Error: Plex Library must be a Movies or TV Shows library")
@@ -256,6 +409,7 @@ class Plex(Library):
self.agent = self.Plex.agent
self.is_movie = self.type == "Movie"
self.is_show = self.type == "Show"
self.is_music = self.type == "Artist"
self.is_other = self.agent == "com.plexapp.agents.none"
if self.is_other:
self.type = "Video"
@@ -293,9 +447,12 @@ class Plex(Library):
def fetchItem(self, data):
return self.PlexServer.fetchItem(data)
def get_all(self, collection_level=None):
collection_type = collection_level if collection_level else self.Plex.TYPE
if not collection_level:
collection_level = self.type
logger.info(f"Loading All {collection_level.capitalize()}s from Library: {self.name}")
key = f"/library/sections/{self.Plex.key}/all?includeGuids=1&type={utils.searchType(collection_type)}"
container_start = 0
container_size = plexapi.X_PLEX_CONTAINER_SIZE
results = []
@@ -303,7 +460,7 @@ class Plex(Library):
results.extend(self.fetchItems(key, container_start, container_size))
util.print_return(f"Loaded: {container_start}/{self.Plex._totalViewSize}")
container_start += container_size
logger.info(util.adjust_space(f"Loaded {self.Plex._totalViewSize} {collection_level.capitalize()}s"))
return results
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
@@ -376,6 +533,7 @@ class Plex(Library):
item.uploadArt(filepath=image.location)
self.reload(item)
except BadRequest as e:
item.refresh()
raise Failed(e)
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
@@ -383,6 +541,10 @@ class Plex(Library):
item.uploadPoster(filepath=image)
self.reload(item)
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
def get_genres(self):
return [genre.title for genre in self.Plex.listFilterChoices("genre")]
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
def get_search_choices(self, search_name, title=True):
final_search = search_translation[search_name] if search_name in search_translation else search_name
@@ -395,6 +557,8 @@ class Plex(Library):
names.append(choice.title)
if choice.key not in names:
names.append(choice.key)
choices[choice.title] = choice.title if title else choice.key
choices[choice.key] = choice.title if title else choice.key
choices[choice.title.lower()] = choice.title if title else choice.key
choices[choice.key.lower()] = choice.title if title else choice.key
return choices, names
@@ -533,8 +697,8 @@ class Plex(Library):
def get_rating_keys(self, method, data):
items = []
if method == "plex_all":
logger.info(f"Processing Plex All {data.capitalize()}s")
items = self.get_all(collection_level=data)
elif method == "plex_search":
util.print_multiline(data[1], info=True)
items = self.get_filter_items(data[2])
@@ -636,6 +800,8 @@ class Plex(Library):
def edit_tags(self, attr, obj, add_tags=None, remove_tags=None, sync_tags=None):
display = ""
key = builder.filter_translation[attr] if attr in builder.filter_translation else attr
attr_display = attr.replace("_", " ").title()
attr_call = attr_display.replace(" ", "")
if add_tags or remove_tags or sync_tags is not None:
_add_tags = add_tags if add_tags else []
_remove_tags = [t.lower() for t in remove_tags] if remove_tags else []
@@ -648,13 +814,13 @@ class Plex(Library):
_add = [f"{t[:1].upper()}{t[1:]}" for t in _add_tags + _sync_tags if t.lower() not in _item_tags]
_remove = [t for t in _item_tags if (sync_tags is not None and t not in _sync_tags) or t in _remove_tags]
if _add:
self.query_data(getattr(obj, f"add{attr_call}"), _add)
display += f"+{', +'.join(_add)}"
if _remove:
self.query_data(getattr(obj, f"remove{attr_call}"), _remove)
display += f"-{', -'.join(_remove)}"
if len(display) > 0:
logger.info(f"{obj.title[:25]:<25} | {attr_display} | {display}")
return len(display) > 0
def find_assets(self, item, name=None, upload=True, overlay=None, folders=None, create=None):
@@ -665,12 +831,12 @@ class Plex(Library):
elif isinstance(item, Collection):
name = name if name else item.title
else:
return None, None, None
if not folders:
folders = self.asset_folders
if not create:
create = self.create_asset_folders
found_folder = None
poster = None
background = None
for ad in self.asset_directory:
@@ -689,7 +855,7 @@ class Plex(Library):
break
if item_dir is None:
continue
found_folder = item_dir
poster_filter = os.path.join(item_dir, "poster.*")
background_filter = os.path.join(item_dir, "background.*")
else:
@@ -725,7 +891,7 @@ class Plex(Library):
if upload:
self.upload_images(item, poster=poster, background=background, overlay=overlay)
else:
return poster, background, item_dir
if isinstance(item, Show):
missing_assets = ""
found_season = False
@@ -761,13 +927,15 @@ class Plex(Library):
self.upload_images(episode, poster=episode_poster)
if self.show_missing_season_assets and found_season and missing_assets:
util.print_multiline(f"Missing Season Posters for {item.title}{missing_assets}", info=True)
if isinstance(item, (Movie, Show)) and not poster and overlay:
self.upload_images(item, overlay=overlay)
if create and folders and not found_folder:
found_folder = os.path.join(self.asset_directory[0], name)
os.makedirs(found_folder, exist_ok=True)
logger.info(f"Asset Directory Created: {found_folder}")
elif isinstance(item, (Movie, Show)) and not overlay and folders and not found_folder:
logger.warning(f"Asset Warning: No asset folder found called '{name}'")
elif isinstance(item, (Movie, Show)) and not poster and not background and self.show_missing_assets:
logger.warning(f"Asset Warning: No poster or background found in an assets folder for '{name}'")
return None, None, found_folder

@@ -2,7 +2,7 @@ import logging
from modules import util
from modules.util import Failed
from arrapi import RadarrAPI
from arrapi.exceptions import ArrException
logger = logging.getLogger("Plex Meta Manager")
@@ -22,7 +22,7 @@ class Radarr:
self.api._validate_add_options(params["root_folder_path"], params["quality_profile"])
except ArrException as e:
raise Failed(e)
self.add_missing = params["add_missing"]
self.add_existing = params["add_existing"]
self.root_folder_path = params["root_folder_path"]
self.monitor = params["monitor"]
@@ -34,9 +34,6 @@ class Radarr:
self.plex_path = params["plex_path"] if params["radarr_path"] and params["plex_path"] else ""
def add_tmdb(self, tmdb_ids, **options):
_ids = []
_paths = []
for tmdb_id in tmdb_ids:
@@ -44,6 +41,9 @@ class Radarr:
_paths.append(tmdb_id)
else:
_ids.append(tmdb_id)
logger.info("")
util.separator(f"Adding {'Missing' if _ids else 'Existing'} to Radarr", space=False, border=False)
logger.debug("")
logger.debug(f"Radarr Adds: {_ids if _ids else ''}")
for tmdb_id in _paths:
logger.debug(tmdb_id)
@@ -68,10 +68,23 @@ class Radarr:
exists = []
skipped = []
invalid = []
invalid_root = []
movies = []
path_lookup = {}
mismatched = {}
path_in_use = {}
def mass_add():
try:
_a, _e, _i = self.api.add_multiple_movies(movies, folder, quality_profile, monitor, search,
availability, tags, per_request=100)
added.extend(_a)
exists.extend(_e)
invalid.extend(_i)
except ArrException as e:
util.print_stacktrace()
raise Failed(f"Radarr Error: {e}")
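The mass_add closure above implements a batch-and-flush pattern: buffer up to 100 movies, flush on the API, and flush once more after the loop for any remainder. A generic sketch of that pattern (`add_in_batches` and `process_batch` are illustrative names standing in for the loop and `api.add_multiple_movies`):

```python
def add_in_batches(ids, process_batch, batch_size=100):
    """Feed ids to process_batch in chunks of batch_size, flushing the
    remainder after the loop, and collect everything the batches return."""
    added = []
    batch = []
    for i, item in enumerate(ids, 1):
        batch.append(item)
        # Flush when the buffer is full or we have consumed the last id
        if len(batch) == batch_size or i == len(ids):
            added.extend(process_batch(batch))
            batch = []
    if batch:  # safety flush, mirroring the trailing "if movies: mass_add()"
        added.extend(process_batch(batch))
    return added
```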
for i, item in enumerate(tmdb_ids, 1):
path = item[1] if isinstance(item, tuple) else None
tmdb_id = item[0] if isinstance(item, tuple) else item
@@ -88,7 +101,12 @@ class Radarr:
if path and path.lower() in arr_paths:
mismatched[path] = tmdb_id
continue
if path and not path.startswith(folder):
invalid_root.append(item)
continue
movie = self.api.get_movie(tmdb_id=tmdb_id)
if self.config.trace_mode:
logger.debug(f"Folder to Check: {folder}/{movie.folder}")
if f"{folder}/{movie.folder}".lower() in arr_paths:
path_in_use[f"{folder}/{movie.folder}"] = tmdb_id
continue
@@ -100,20 +118,16 @@ class Radarr:
except ArrException:
invalid.append(item)
if len(movies) == 100 or len(tmdb_ids) == i:
mass_add()
movies = []
if movies:
mass_add()
movies = []
if len(added) > 0:
logger.info("")
for movie in added:
logger.info(f"Added to Radarr | {movie.tmdbId:<7} | {movie.title}")
if self.config.Cache:
self.config.Cache.update_radarr_adds(movie.tmdbId, self.library.original_mapping_name)
logger.info(f"{len(added)} Movie{'s' if len(added) > 1 else ''} added to Radarr")
@ -122,7 +136,7 @@ class Radarr:
logger.info("") logger.info("")
if len(exists) > 0: if len(exists) > 0:
for movie in exists: for movie in exists:
logger.info(f"Already in Radarr | {movie.tmdbId:<6} | {movie.title}") logger.info(f"Already in Radarr | {movie.tmdbId:<7} | {movie.title}")
if self.config.Cache: if self.config.Cache:
self.config.Cache.update_radarr_adds(movie.tmdbId, self.library.original_mapping_name) self.config.Cache.update_radarr_adds(movie.tmdbId, self.library.original_mapping_name)
if len(skipped) > 0: if len(skipped) > 0:
@ -150,6 +164,12 @@ class Radarr:
logger.info(f"Invalid TMDb ID | {tmdb_id}") logger.info(f"Invalid TMDb ID | {tmdb_id}")
logger.info(f"{len(invalid)} Movie{'s' if len(invalid) > 1 else ''} with Invalid IDs") logger.info(f"{len(invalid)} Movie{'s' if len(invalid) > 1 else ''} with Invalid IDs")
if len(invalid_root) > 0:
logger.info("")
for tmdb_id, path in invalid_root:
logger.info(f"Invalid Root Folder for TMDb ID | {tmdb_id:<7} | {path}")
logger.info(f"{len(invalid_root)} Movie{'s' if len(invalid_root) > 1 else ''} with Invalid Paths")
return len(added) return len(added)
def edit_tags(self, tmdb_ids, tags, apply_tags): def edit_tags(self, tmdb_ids, tags, apply_tags):
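The hunks above replace the inline try/except with a local `mass_add()` that flushes the buffered `movies` list every 100 IDs and once more after the loop for any remainder. A minimal standalone sketch of that batch-and-flush pattern (the `batch_process`/`record` names are illustrative, not part of the arrapi API):

```python
def batch_process(items, flush, batch_size=100):
    """Buffer items and call flush() on every full batch, then once for the remainder."""
    results = []
    buffer = []
    for item in items:
        buffer.append(item)
        if len(buffer) == batch_size:
            results.extend(flush(buffer))
            buffer = []
    if buffer:  # mirrors the trailing `if movies: mass_add()` added in this commit
        results.extend(flush(buffer))
    return results

calls = []
def record(batch):
    calls.append(len(batch))  # track how large each flushed batch was
    return list(batch)

results = batch_process(range(250), record)
```

With 250 inputs this flushes two full batches of 100 and a remainder of 50, which is exactly why the commit resets `movies = []` after each `mass_add()` call.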

modules/sonarr.py

@@ -2,7 +2,7 @@ import logging
 from modules import util
 from modules.util import Failed
 from arrapi import SonarrAPI
-from arrapi.exceptions import ArrException, Invalid
+from arrapi.exceptions import ArrException

 logger = logging.getLogger("Plex Meta Manager")
@@ -40,7 +40,7 @@ class Sonarr:
             self.api._validate_add_options(params["root_folder_path"], params["quality_profile"], params["language_profile"])
         except ArrException as e:
             raise Failed(e)
-        self.add = params["add"]
+        self.add_missing = params["add_missing"]
         self.add_existing = params["add_existing"]
         self.root_folder_path = params["root_folder_path"]
         self.monitor = params["monitor"]
@@ -56,9 +56,6 @@ class Sonarr:
         self.plex_path = params["plex_path"] if params["sonarr_path"] and params["plex_path"] else ""

     def add_tvdb(self, tvdb_ids, **options):
-        logger.info("")
-        util.separator("Adding to Sonarr", space=False, border=False)
-        logger.debug("")
         _ids = []
         _paths = []
         for tvdb_id in tvdb_ids:
@@ -66,6 +63,9 @@ class Sonarr:
                 _paths.append(tvdb_id)
             else:
                 _ids.append(tvdb_id)
+        logger.info("")
+        util.separator(f"Adding {'Missing' if _ids else 'Existing'} to Sonarr", space=False, border=False)
+        logger.debug("")
         logger.debug(f"Sonarr Adds: {_ids if _ids else ''}")
         for tvdb_id in _paths:
             logger.debug(tvdb_id)
@@ -94,10 +94,23 @@ class Sonarr:
         exists = []
         skipped = []
         invalid = []
+        invalid_root = []
         shows = []
         path_lookup = {}
         mismatched = {}
         path_in_use = {}
+
+        def mass_add():
+            try:
+                _a, _e, _i = self.api.add_multiple_series(shows, folder, quality_profile, language_profile, monitor,
+                                                          season, search, cutoff_search, series_type, tags, per_request=100)
+                added.extend(_a)
+                exists.extend(_e)
+                invalid.extend(_i)
+            except ArrException as e:
+                util.print_stacktrace()
+                raise Failed(f"Sonarr Error: {e}")
+
         for i, item in enumerate(tvdb_ids, 1):
             path = item[1] if isinstance(item, tuple) else None
             tvdb_id = item[0] if isinstance(item, tuple) else item
@@ -114,7 +127,12 @@ class Sonarr:
             if path and path.lower() in arr_paths:
                 mismatched[path] = tvdb_id
                 continue
+            if path and not path.startswith(folder):
+                invalid_root.append(item)
+                continue
             show = self.api.get_series(tvdb_id=tvdb_id)
+            if self.config.trace_mode:
+                logger.debug(f"Folder to Check: {folder}/{show.folder}")
             if f"{folder}/{show.folder}".lower() in arr_paths:
                 path_in_use[f"{folder}/{show.folder}"] = tvdb_id
                 continue
@@ -126,20 +144,16 @@ class Sonarr:
             except ArrException:
                 invalid.append(item)
             if len(shows) == 100 or len(tvdb_ids) == i:
-                try:
-                    _a, _e, _i = self.api.add_multiple_series(shows, folder, quality_profile, language_profile, monitor,
-                                                              season, search, cutoff_search, series_type, tags, per_request=100)
-                    added.extend(_a)
-                    exists.extend(_e)
-                    invalid.extend(_i)
-                    shows = []
-                except Invalid as e:
-                    raise Failed(f"Sonarr Error: {e}")
+                mass_add()
+                shows = []
+        if shows:
+            mass_add()
+            shows = []

         if len(added) > 0:
             logger.info("")
             for series in added:
-                logger.info(f"Added to Sonarr | {series.tvdbId:<6} | {series.title}")
+                logger.info(f"Added to Sonarr | {series.tvdbId:<7} | {series.title}")
                 if self.config.Cache:
                     self.config.Cache.update_sonarr_adds(series.tvdbId, self.library.original_mapping_name)
             logger.info(f"{len(added)} Series added to Sonarr")
@@ -148,7 +162,7 @@ class Sonarr:
             logger.info("")
         if len(exists) > 0:
             for series in exists:
-                logger.info(f"Already in Sonarr | {series.tvdbId:<6} | {series.title}")
+                logger.info(f"Already in Sonarr | {series.tvdbId:<7} | {series.title}")
                 if self.config.Cache:
                     self.config.Cache.update_sonarr_adds(series.tvdbId, self.library.original_mapping_name)
         if len(skipped) > 0:
@@ -176,6 +190,12 @@ class Sonarr:
                 logger.info(f"Invalid TVDb ID | {tvdb_id}")
             logger.info(f"{len(invalid)} Series with Invalid IDs")
+        if len(invalid_root) > 0:
+            logger.info("")
+            for tvdb_id, path in invalid_root:
+                logger.info(f"Invalid Root Folder for TVDb ID | {tvdb_id:<7} | {path}")
+            logger.info(f"{len(invalid_root)} Series with Invalid Paths")
         return len(added)

     def edit_tags(self, tvdb_ids, tags, apply_tags):
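Both Arr classes now skip any `(id, path)` tuple whose path does not sit under the configured root folder, collecting it into `invalid_root` for the summary log. The check is a plain `str.startswith`, so it is case-sensitive and purely textual; a sketch with an illustrative helper name:

```python
def split_by_root(items, root):
    """Partition (id, path) tuples into those under root and those with an invalid root folder."""
    valid, invalid_root = [], []
    for item in items:
        item_id, path = item
        if path.startswith(root):  # same textual prefix test the commit adds
            valid.append(item)
        else:
            invalid_root.append(item)
    return valid, invalid_root

# sample IDs/paths for illustration only
valid, bad = split_by_root([(73244, "/tv/The Office"), (10283, "/media/Archer")], "/tv")
```

Being a prefix test, it would also accept a sibling folder like `/tv2/...` for root `/tv`; the real code inherits that quirk.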

modules/tautulli.py

@@ -24,7 +24,7 @@ class Tautulli:
         if response["response"]["result"] != "success":
             raise Failed(f"Tautulli Error: {response['response']['message']}")

-    def get_rating_keys(self, library, params):
+    def get_rating_keys(self, library, params, all_items):
         query_size = int(params["list_size"]) + int(params["list_buffer"])
         logger.info(f"Processing Tautulli Most {params['list_type'].capitalize()}: {params['list_size']} {'Movies' if library.is_movie else 'Shows'}")
         response = self._request(f"{self.url}/api/v2?apikey={self.apikey}&cmd=get_home_stats&time_range={params['list_days']}&stats_count={query_size}")
@@ -42,7 +42,7 @@ class Tautulli:
         section_id = self._section_id(library.name)
         rating_keys = []
         for item in items:
-            if item["section_id"] == section_id and len(rating_keys) < int(params['list_size']):
+            if (all_items or item["section_id"] == section_id) and len(rating_keys) < int(params['list_size']):
                 if int(item[stat_type]) < params['list_minimum']:
                     continue
                 try:
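The new `all_items` parameter lets `get_rating_keys` keep stats from every library section instead of only the matching one. A standalone sketch of that filter, using `plays` as a stand-in for whatever key `stat_type` resolves to in the real method:

```python
def filter_stats(items, section_id, list_size, list_minimum, all_items=False):
    """Keep up to list_size rating keys, optionally ignoring the section check."""
    rating_keys = []
    for item in items:
        if (all_items or item["section_id"] == section_id) and len(rating_keys) < list_size:
            if item["plays"] < list_minimum:  # below the configured minimum, skip
                continue
            rating_keys.append(item["rating_key"])
    return rating_keys

items = [
    {"section_id": 1, "plays": 5, "rating_key": 10},
    {"section_id": 2, "plays": 9, "rating_key": 11},
    {"section_id": 1, "plays": 1, "rating_key": 12},
]
```

With `all_items=False` only section 1's qualifying item survives; with `all_items=True` the section 2 item is kept as well, while the below-minimum item is dropped either way.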

modules/util.py

@@ -2,6 +2,7 @@ import glob, logging, os, re, signal, sys, time, traceback
 from datetime import datetime, timedelta
 from logging.handlers import RotatingFileHandler
 from pathvalidate import is_valid_filename, sanitize_filename
+from plexapi.audio import Artist, Album, Track
 from plexapi.exceptions import BadRequest, NotFound, Unauthorized
 from plexapi.video import Season, Episode, Movie
@@ -23,6 +24,9 @@ class Failed(Exception):
 class NotScheduled(Exception):
     pass

+class NotScheduledRange(NotScheduled):
+    pass
+
 class ImageData:
     def __init__(self, attribute, location, prefix="", is_poster=True, is_url=True):
         self.attribute = attribute
@@ -182,19 +186,19 @@ def regex_first_int(data, id_type, default=None):
     else:
         raise Failed(f"Regex Error: Failed to parse {id_type} from {data}")

-def centered(text, sep=" "):
+def centered(text, sep=" ", side_space=True, left=False):
     if len(text) > screen_width - 2:
         return text
     space = screen_width - len(text) - 2
-    text = f" {text} "
+    text = f"{' ' if side_space else sep}{text}{' ' if side_space else sep}"
     if space % 2 == 1:
         text += sep
         space -= 1
     side = int(space / 2) - 1
-    final_text = f"{sep * side}{text}{sep * side}"
+    final_text = f"{text}{sep * side}{sep * side}" if left else f"{sep * side}{text}{sep * side}"
     return final_text

-def separator(text=None, space=True, border=True, debug=False):
+def separator(text=None, space=True, border=True, debug=False, side_space=True, left=False):
     sep = " " if space else separating_character
     for handler in logger.handlers:
         apply_formatter(handler, border=False)
@@ -207,9 +211,9 @@ def separator(text=None, space=True, border=True, debug=False):
         text_list = text.split("\n")
         for t in text_list:
             if debug:
-                logger.debug(f"|{sep}{centered(t, sep=sep)}{sep}|")
+                logger.debug(f"|{sep}{centered(t, sep=sep, side_space=side_space, left=left)}{sep}|")
             else:
-                logger.info(f"|{sep}{centered(t, sep=sep)}{sep}|")
+                logger.info(f"|{sep}{centered(t, sep=sep, side_space=side_space, left=left)}{sep}|")
     if border and debug:
         logger.debug(border_text)
     elif border:
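The new `side_space` and `left` flags change how `centered` pads a separator title. Lifting the new function out with a fixed `screen_width` (module-level state in the real file) shows the three modes:

```python
screen_width = 20  # module-level in modules/util.py; pinned here for the demo

def centered(text, sep=" ", side_space=True, left=False):
    if len(text) > screen_width - 2:
        return text
    space = screen_width - len(text) - 2
    # side_space=False swaps the spaces hugging the text for the separator character
    text = f"{' ' if side_space else sep}{text}{' ' if side_space else sep}"
    if space % 2 == 1:
        text += sep
        space -= 1
    side = int(space / 2) - 1
    # left=True pushes all padding to the right instead of splitting it
    return f"{text}{sep * side}{sep * side}" if left else f"{sep * side}{text}{sep * side}"
```

For `screen_width = 20` and `sep="="`, `centered("hi", sep="=")` splits the fill evenly, `side_space=False` fills right up to the text, and `left=True` left-aligns the title.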
@@ -261,6 +265,10 @@ def item_title(item):
         return f"{text}: {item.parentTitle}: {item.title}"
     elif isinstance(item, Movie) and item.year:
         return f"{item.title} ({item.year})"
+    elif isinstance(item, Album):
+        return f"{item.parentTitle}: {item.title}"
+    elif isinstance(item, Track):
+        return f"{item.grandparentTitle}: {item.parentTitle}: {item.title}"
     else:
         return item.title
@@ -370,12 +378,14 @@ def check_day(_m, _d):
 def schedule_check(attribute, data, current_time, run_hour):
     skip_collection = True
+    range_collection = False
     schedule_list = get_list(data)
     next_month = current_time.replace(day=28) + timedelta(days=4)
     last_day = next_month - timedelta(days=next_month.day)
     schedule_str = ""
     for schedule in schedule_list:
         run_time = str(schedule).lower()
+        display = f"{attribute} attribute {schedule} invalid"
         if run_time.startswith(("day", "daily")):
             skip_collection = False
         elif run_time == "never":
@@ -395,10 +405,10 @@ def schedule_check(attribute, data, current_time, run_hour):
                 else:
                     raise ValueError
             except ValueError:
-                logger.error(f"Schedule Error: hourly {attribute} attribute {schedule} invalid must be an integer between 0 and 23")
+                logger.error(f"Schedule Error: hourly {display} must be an integer between 0 and 23")
         elif run_time.startswith("week"):
             if param.lower() not in days_alias:
-                logger.error(f"Schedule Error: weekly {attribute} attribute {schedule} invalid must be a day of the week i.e. weekly(Monday)")
+                logger.error(f"Schedule Error: weekly {display} must be a day of the week i.e. weekly(Monday)")
                 continue
             weekday = days_alias[param.lower()]
             schedule_str += f"\nScheduled weekly on {pretty_days[weekday]}"
@@ -414,7 +424,7 @@ def schedule_check(attribute, data, current_time, run_hour):
                 else:
                     raise ValueError
             except ValueError:
-                logger.error(f"Schedule Error: monthly {attribute} attribute {schedule} invalid must be an integer between 1 and 31")
+                logger.error(f"Schedule Error: monthly {display} must be an integer between 1 and 31")
         elif run_time.startswith("year"):
             try:
                 if "/" in param:
@@ -428,12 +438,11 @@ def schedule_check(attribute, data, current_time, run_hour):
                 else:
                     raise ValueError
             except ValueError:
-                logger.error(
-                    f"Schedule Error: yearly {attribute} attribute {schedule} invalid must be in the MM/DD format i.e. yearly(11/22)")
+                logger.error(f"Schedule Error: yearly {display} must be in the MM/DD format i.e. yearly(11/22)")
         elif run_time.startswith("range"):
             match = re.match("^(1[0-2]|0?[1-9])/(3[01]|[12][0-9]|0?[1-9])-(1[0-2]|0?[1-9])/(3[01]|[12][0-9]|0?[1-9])$", param)
             if not match:
-                logger.error(f"Schedule Error: range {attribute} attribute {schedule} invalid must be in the MM/DD-MM/DD format i.e. range(12/01-12/25)")
+                logger.error(f"Schedule Error: range {display} must be in the MM/DD-MM/DD format i.e. range(12/01-12/25)")
                 continue
             month_start, day_start = check_day(int(match.group(1)), int(match.group(2)))
             month_end, day_end = check_day(int(match.group(3)), int(match.group(4)))
@@ -441,12 +450,15 @@ def schedule_check(attribute, data, current_time, run_hour):
             check = datetime.strptime(f"{month_check}/{day_check}", "%m/%d")
             start = datetime.strptime(f"{month_start}/{day_start}", "%m/%d")
             end = datetime.strptime(f"{month_end}/{day_end}", "%m/%d")
+            range_collection = True
             schedule_str += f"\nScheduled between {pretty_months[month_start]} {make_ordinal(day_start)} and {pretty_months[month_end]} {make_ordinal(day_end)}"
             if start <= check <= end if start < end else (check <= end or check >= start):
                 skip_collection = False
         else:
-            logger.error(f"Schedule Error: {attribute} attribute {schedule} invalid")
+            logger.error(f"Schedule Error: {display}")
     if len(schedule_str) == 0:
         skip_collection = False
-    if skip_collection:
+    if skip_collection and range_collection:
+        raise NotScheduledRange(schedule_str)
+    elif skip_collection:
         raise NotScheduled(schedule_str)
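The range branch keys off `start < end` so a window that wraps the new year, such as `range(12/01-01/05)`, still matches dates on either side of the boundary. The comparison, isolated from the surrounding scheduler:

```python
from datetime import datetime

def in_range(month_day, start_str, end_str):
    """True when month_day falls inside the MM/DD-MM/DD window, including year wrap."""
    check = datetime.strptime(month_day, "%m/%d")
    start = datetime.strptime(start_str, "%m/%d")
    end = datetime.strptime(end_str, "%m/%d")
    # same expression the commit keeps: a wrapped window matches either tail
    return start <= check <= end if start < end else (check <= end or check >= start)
```

A normal window is a simple chained comparison; once `start` is after `end`, a date matches if it lands in either tail of the wrapped window.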

modules/webhooks.py

@@ -1,6 +1,6 @@
 import logging
 from json import JSONDecodeError
+from modules import util
 from modules.util import Failed

 logger = logging.getLogger("Plex Meta Manager")
@@ -16,6 +16,7 @@ class Webhooks:
     def _request(self, webhooks, json):
         if self.config.trace_mode:
+            util.separator("Webhooks", space=False, border=False)
             logger.debug("")
             logger.debug(f"JSON: {json}")
         for webhook in list(set(webhooks)):

plex_meta_manager.py

@@ -27,11 +27,14 @@ parser.add_argument("-c", "--config", dest="config", help="Run with desired *.ym
 parser.add_argument("-t", "--time", "--times", dest="times", help="Times to update each day use format HH:MM (Default: 03:00) (comma-separated list)", default="03:00", type=str)
 parser.add_argument("-re", "--resume", dest="resume", help="Resume collection run from a specific collection", type=str)
 parser.add_argument("-r", "--run", dest="run", help="Run without the scheduler", action="store_true", default=False)
+parser.add_argument("-is", "--ignore-schedules", dest="ignore_schedules", help="Run ignoring collection schedules", action="store_true", default=False)
 parser.add_argument("-rt", "--test", "--tests", "--run-test", "--run-tests", dest="test", help="Run in debug mode with only collections that have test: true", action="store_true", default=False)
 parser.add_argument("-co", "--collection-only", "--collections-only", dest="collection_only", help="Run only collection operations", action="store_true", default=False)
 parser.add_argument("-lo", "--library-only", "--libraries-only", dest="library_only", help="Run only library operations", action="store_true", default=False)
+parser.add_argument("-lf", "--library-first", "--libraries-first", dest="library_first", help="Run library operations before collections", action="store_true", default=False)
 parser.add_argument("-rc", "-cl", "--collection", "--collections", "--run-collection", "--run-collections", dest="collections", help="Process only specified collections (comma-separated list)", type=str)
 parser.add_argument("-rl", "-l", "--library", "--libraries", "--run-library", "--run-libraries", dest="libraries", help="Process only specified libraries (comma-separated list)", type=str)
+parser.add_argument("-dc", "--delete", "--delete-collections", dest="delete", help="Deletes all Collections in the Plex Library before running", action="store_true", default=False)
 parser.add_argument("-nc", "--no-countdown", dest="no_countdown", help="Run without displaying the countdown", action="store_true", default=False)
 parser.add_argument("-nm", "--no-missing", dest="no_missing", help="Run without running the missing section", action="store_true", default=False)
 parser.add_argument("-ro", "--read-only-config", dest="read_only_config", help="Run without writing to the config", action="store_true", default=False)
@@ -60,10 +63,13 @@ config_file = get_arg("PMM_CONFIG", args.config)
 times = get_arg("PMM_TIME", args.times)
 run = get_arg("PMM_RUN", args.run, arg_bool=True)
 test = get_arg("PMM_TEST", args.test, arg_bool=True)
+ignore_schedules = get_arg("PMM_IGNORE_SCHEDULES", args.ignore_schedules, arg_bool=True)
 collection_only = get_arg("PMM_COLLECTIONS_ONLY", args.collection_only, arg_bool=True)
 library_only = get_arg("PMM_LIBRARIES_ONLY", args.library_only, arg_bool=True)
+library_first = get_arg("PMM_LIBRARIES_FIRST", args.library_first, arg_bool=True)
 collections = get_arg("PMM_COLLECTIONS", args.collections)
 libraries = get_arg("PMM_LIBRARIES", args.libraries)
+delete = get_arg("PMM_DELETE_COLLECTIONS", args.delete, arg_bool=True)
 resume = get_arg("PMM_RESUME", args.resume)
 no_countdown = get_arg("PMM_NO_COUNTDOWN", args.no_countdown, arg_bool=True)
 no_missing = get_arg("PMM_NO_MISSING", args.no_missing, arg_bool=True)
@@ -72,7 +78,6 @@ divider = get_arg("PMM_DIVIDER", args.divider)
 screen_width = get_arg("PMM_WIDTH", args.width, arg_int=True)
 debug = get_arg("PMM_DEBUG", args.debug, arg_bool=True)
 trace = get_arg("PMM_TRACE", args.trace, arg_bool=True)
-stats = {}

 util.separating_character = divider[0]
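Every new flag (`-is`, `-lf`, `-dc`) follows the same resolution rule as the existing ones: the `PMM_*` environment variable overrides the parsed CLI value. A sketch of that precedence; the real `get_arg` is defined earlier in this script and its exact bool/int coercion rules may differ slightly:

```python
import os

def get_arg(env_str, default, arg_bool=False, arg_int=False):
    """Return the environment override when set, else the CLI/default value."""
    env_value = os.environ.get(env_str)
    if env_value is None:
        return default
    if arg_bool:  # accept the usual truthy spellings (assumed set)
        return env_value.strip().lower() in ("1", "t", "true")
    if arg_int:
        return int(env_value)
    return str(env_value)

os.environ["PMM_IGNORE_SCHEDULES"] = "true"
os.environ.pop("PMM_TIME", None)  # ensure the fallback path is exercised
ignore_schedules = get_arg("PMM_IGNORE_SCHEDULES", False, arg_bool=True)
times = get_arg("PMM_TIME", "03:00")  # unset env var falls back to the CLI default
```

This layering is what lets the same options be driven from a shell, a cron entry, or the Docker image's environment without code changes.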
@@ -150,8 +155,11 @@ def start(attrs):
     logger.debug(f"--run-tests (PMM_TEST): {test}")
     logger.debug(f"--collections-only (PMM_COLLECTIONS_ONLY): {collection_only}")
     logger.debug(f"--libraries-only (PMM_LIBRARIES_ONLY): {library_only}")
+    logger.debug(f"--libraries-first (PMM_LIBRARIES_FIRST): {library_first}")
     logger.debug(f"--run-collections (PMM_COLLECTIONS): {collections}")
     logger.debug(f"--run-libraries (PMM_LIBRARIES): {libraries}")
+    logger.debug(f"--ignore-schedules (PMM_IGNORE_SCHEDULES): {ignore_schedules}")
+    logger.debug(f"--delete-collections (PMM_DELETE_COLLECTIONS): {delete}")
     logger.debug(f"--resume (PMM_RESUME): {resume}")
     logger.debug(f"--no-countdown (PMM_NO_COUNTDOWN): {no_countdown}")
     logger.debug(f"--no-missing (PMM_NO_MISSING): {no_missing}")
@@ -163,8 +171,7 @@ def start(attrs):
     logger.debug("")
     util.separator(f"Starting {start_type}Run")
     config = None
-    global stats
-    stats = {"created": 0, "modified": 0, "deleted": 0, "added": 0, "removed": 0, "radarr": 0, "sonarr": 0}
+    stats = {"created": 0, "modified": 0, "deleted": 0, "added": 0, "unchanged": 0, "removed": 0, "radarr": 0, "sonarr": 0}
     try:
         config = ConfigFile(default_dir, attrs, read_only_config)
     except Exception as e:
@@ -172,7 +179,7 @@ def start(attrs):
         util.print_multiline(e, critical=True)
     else:
         try:
-            update_libraries(config)
+            stats = update_libraries(config)
         except Exception as e:
             config.notify(e)
             util.print_stacktrace()
@ -186,11 +193,10 @@ def start(attrs):
except Failed as e: except Failed as e:
util.print_stacktrace() util.print_stacktrace()
logger.error(f"Webhooks Error: {e}") logger.error(f"Webhooks Error: {e}")
util.separator(f"Finished {start_type}Run\nRun Time: {run_time}") util.separator(f"Finished {start_type}Run\nFinished: {end_time.strftime('%H:%M:%S %Y-%m-%d')} Run Time: {run_time}")
logger.removeHandler(file_handler) logger.removeHandler(file_handler)
def update_libraries(config): def update_libraries(config):
global stats
for library in config.libraries: for library in config.libraries:
if library.skip_library: if library.skip_library:
logger.info("") logger.info("")
@ -210,6 +216,9 @@ def update_libraries(config):
logger.info("") logger.info("")
util.separator(f"{library.name} Library") util.separator(f"{library.name} Library")
if config.library_first and library.library_operation and not config.test_mode and not collection_only:
library_operations(config, library)
logger.debug("") logger.debug("")
logger.debug(f"Mapping Name: {library.original_mapping_name}") logger.debug(f"Mapping Name: {library.original_mapping_name}")
logger.debug(f"Folder Name: {library.mapping_name}") logger.debug(f"Folder Name: {library.mapping_name}")
@ -218,10 +227,12 @@ def update_libraries(config):
logger.debug(f"Asset Directory: {ad}") logger.debug(f"Asset Directory: {ad}")
logger.debug(f"Asset Folders: {library.asset_folders}") logger.debug(f"Asset Folders: {library.asset_folders}")
logger.debug(f"Create Asset Folders: {library.create_asset_folders}") logger.debug(f"Create Asset Folders: {library.create_asset_folders}")
logger.debug(f"Download URL Assets: {library.download_url_assets}")
logger.debug(f"Sync Mode: {library.sync_mode}") logger.debug(f"Sync Mode: {library.sync_mode}")
logger.debug(f"Collection Minimum: {library.collection_minimum}") logger.debug(f"Collection Minimum: {library.minimum_items}")
logger.debug(f"Delete Below Minimum: {library.delete_below_minimum}") logger.debug(f"Delete Below Minimum: {library.delete_below_minimum}")
logger.debug(f"Delete Not Scheduled: {library.delete_not_scheduled}") logger.debug(f"Delete Not Scheduled: {library.delete_not_scheduled}")
logger.debug(f"Default Collection Order: {library.default_collection_order}")
logger.debug(f"Missing Only Released: {library.missing_only_released}") logger.debug(f"Missing Only Released: {library.missing_only_released}")
logger.debug(f"Only Filter Missing: {library.only_filter_missing}") logger.debug(f"Only Filter Missing: {library.only_filter_missing}")
logger.debug(f"Show Unmanaged: {library.show_unmanaged}") logger.debug(f"Show Unmanaged: {library.show_unmanaged}")
@ -229,24 +240,19 @@ def update_libraries(config):
logger.debug(f"Show Missing: {library.show_missing}") logger.debug(f"Show Missing: {library.show_missing}")
logger.debug(f"Show Missing Assets: {library.show_missing_assets}") logger.debug(f"Show Missing Assets: {library.show_missing_assets}")
logger.debug(f"Save Missing: {library.save_missing}") logger.debug(f"Save Missing: {library.save_missing}")
logger.debug(f"Assets For All: {library.assets_for_all}")
logger.debug(f"Delete Collections With Less: {library.delete_collections_with_less}")
logger.debug(f"Delete Unmanaged Collections: {library.delete_unmanaged_collections}")
logger.debug(f"Mass Genre Update: {library.mass_genre_update}")
logger.debug(f"Mass Audience Rating Update: {library.mass_audience_rating_update}")
logger.debug(f"Mass Critic Rating Update: {library.mass_critic_rating_update}")
logger.debug(f"Mass Trakt Rating Update: {library.mass_trakt_rating_update}")
logger.debug(f"Split Duplicates: {library.split_duplicates}")
logger.debug(f"Radarr Add All: {library.radarr_add_all}")
logger.debug(f"Sonarr Add All: {library.sonarr_add_all}")
logger.debug(f"TMDb Collections: {library.tmdb_collections}")
logger.debug(f"Genre Mapper: {library.genre_mapper}")
logger.debug(f"Clean Bundles: {library.clean_bundles}") logger.debug(f"Clean Bundles: {library.clean_bundles}")
logger.debug(f"Empty Trash: {library.empty_trash}") logger.debug(f"Empty Trash: {library.empty_trash}")
logger.debug(f"Optimize: {library.optimize}") logger.debug(f"Optimize: {library.optimize}")
logger.debug(f"Timeout: {library.timeout}") logger.debug(f"Timeout: {library.timeout}")
if not library.is_other: if config.delete_collections:
logger.info("")
util.separator(f"Deleting all Collections from the {library.name} Library", space=False, border=False)
logger.info("")
for collection in library.get_all_collections():
logger.info(f"Collection {collection.title} Deleted")
library.query(collection.delete)
if not library.is_other and not library.is_music:
logger.info("") logger.info("")
util.separator(f"Mapping {library.name} Library", space=False, border=False) util.separator(f"Mapping {library.name} Library", space=False, border=False)
logger.info("") logger.info("")
@@ -281,7 +287,7 @@ def update_libraries(config):
 logger.info("")
 builder.sort_collection()
-if not config.test_mode and not collection_only:
+if not config.library_first and library.library_operation and not config.test_mode and not collection_only:
     library_operations(config, library)
 logger.removeHandler(library_handler)
@@ -290,6 +296,8 @@ def update_libraries(config):
 util.print_stacktrace()
 util.print_multiline(e, critical=True)
+playlist_status = {}
+playlist_stats = {}
 if config.playlist_files:
     os.makedirs(os.path.join(default_dir, "logs", "playlists"), exist_ok=True)
     pf_file_logger = os.path.join(default_dir, "logs", "playlists", "playlists.log")
@@ -299,7 +307,7 @@ def update_libraries(config):
     if should_roll_over:
         playlists_handler.doRollover()
     logger.addHandler(playlists_handler)
-    run_playlists(config)
+    playlist_status, playlist_stats = run_playlists(config)
     logger.removeHandler(playlists_handler)
 has_run_again = False
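The `should_roll_over` / `doRollover()` / `addHandler` / `removeHandler` sequence repeated throughout this diff is a standard pattern for giving each run its own fresh log file while keeping older runs as numbered backups. A minimal self-contained sketch of that pattern (not PMM code; the logger name and file name are invented for illustration):

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

logger = logging.getLogger("pmm_sketch")
logger.setLevel(logging.INFO)

log_file = os.path.join(tempfile.mkdtemp(), "playlists.log")
# Roll over only when a previous run left a log behind, so this run
# starts a fresh file and the old one is kept as a .1 backup.
handler = RotatingFileHandler(log_file, delay=True, mode="w", backupCount=3, encoding="utf-8")
if os.path.isfile(log_file):
    handler.doRollover()
logger.addHandler(handler)
logger.info("Run Playlists")
logger.removeHandler(handler)  # detach so later runs do not double-log
handler.close()
```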
@@ -308,6 +316,7 @@ def update_libraries(config):
     has_run_again = True
     break
+amount_added = 0
 if has_run_again and not library_only:
     logger.info("")
     util.separator("Run Again")
@@ -332,10 +341,10 @@ def update_libraries(config):
     library.map_guids()
     for builder in library.run_again:
         logger.info("")
-        util.separator(f"{builder.name} Collection")
+        util.separator(f"{builder.name} Collection in {library.name}")
         logger.info("")
         try:
-            builder.run_collections_again()
+            amount_added += builder.run_collections_again()
         except Failed as e:
             library.notify(e, collection=builder.name, critical=False)
             util.print_stacktrace()
@@ -357,6 +366,59 @@ def update_libraries(config):
 if library.optimize:
     library.query(library.PlexServer.library.optimize)
+longest = 20
+for library in config.libraries:
+    for title in library.status:
+        if len(title) > longest:
+            longest = len(title)
+if playlist_status:
+    for title in playlist_status:
+        if len(title) > longest:
+            longest = len(title)
+def print_status(section, status):
+    logger.info("")
+    util.separator(f"{section} Summary", space=False, border=False)
+    logger.info("")
+    logger.info(f"{'Title':^{longest}} | + | = | - | {'Status':^13}")
+    breaker = f"{util.separating_character * longest}|{util.separating_character * 5}|{util.separating_character * 5}|{util.separating_character * 5}|"
+    util.separator(breaker, space=False, border=False, side_space=False, left=True)
+    for name, data in status.items():
+        logger.info(f"{name:<{longest}} | {data['added']:^3} | {data['unchanged']:^3} | {data['removed']:^3} | {data['status']}")
+        if data["errors"]:
+            for error in data["errors"]:
+                util.print_multiline(error, info=True)
+logger.info("")
+util.separator("Summary")
+for library in config.libraries:
+    print_status(library.name, library.status)
+if playlist_status:
+    print_status("Playlists", playlist_status)
+stats = {"created": 0, "modified": 0, "deleted": 0, "added": 0, "unchanged": 0, "removed": 0, "radarr": 0, "sonarr": 0}
+stats["added"] += amount_added
+for library in config.libraries:
+    stats["created"] += library.stats["created"]
+    stats["modified"] += library.stats["modified"]
+    stats["deleted"] += library.stats["deleted"]
+    stats["added"] += library.stats["added"]
+    stats["unchanged"] += library.stats["unchanged"]
+    stats["removed"] += library.stats["removed"]
+    stats["radarr"] += library.stats["radarr"]
+    stats["sonarr"] += library.stats["sonarr"]
+if playlist_stats:
+    stats["created"] += playlist_stats["created"]
+    stats["modified"] += playlist_stats["modified"]
+    stats["deleted"] += playlist_stats["deleted"]
+    stats["added"] += playlist_stats["added"]
+    stats["unchanged"] += playlist_stats["unchanged"]
+    stats["removed"] += playlist_stats["removed"]
+    stats["radarr"] += playlist_stats["radarr"]
+    stats["sonarr"] += playlist_stats["sonarr"]
+return stats
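The new summary table is built entirely with Python's format-spec mini-language: `<` left-aligns a field, `^` centers it, and the width itself can be a nested replacement field such as `{longest}`. A minimal sketch of the same row layout, with invented collection names and counts (the real code logs these rows instead of printing them):

```python
# Hypothetical status data mirroring the shape of library.status entries.
status = {
    "Marvel": {"added": 2, "unchanged": 40, "removed": 1, "status": "Modified"},
    "Oscars Best Picture": {"added": 0, "unchanged": 93, "removed": 0, "status": "Unchanged"},
}
longest = max(20, max(len(name) for name in status))

def summary_rows(status, longest):
    # Header centers "Title" in a column whose width is computed at runtime.
    rows = [f"{'Title':^{longest}} | + | = | - | {'Status':^13}"]
    for name, data in status.items():
        # Name left-aligned to the widest title; counts centered in 3 chars.
        rows.append(f"{name:<{longest}} | {data['added']:^3} | {data['unchanged']:^3} | {data['removed']:^3} | {data['status']}")
    return rows

for row in summary_rows(status, longest):
    print(row)
```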
 def library_operations(config, library):
     logger.info("")
     util.separator(f"{library.name} Library Operations")
@@ -370,16 +432,14 @@ def library_operations(config, library):
 logger.debug(f"Mass Trakt Rating Update: {library.mass_trakt_rating_update}")
 logger.debug(f"Mass Collection Mode Update: {library.mass_collection_mode}")
 logger.debug(f"Split Duplicates: {library.split_duplicates}")
-logger.debug(f"Radarr Add All: {library.radarr_add_all}")
+logger.debug(f"Radarr Add All Existing: {library.radarr_add_all_existing}")
 logger.debug(f"Radarr Remove by Tag: {library.radarr_remove_by_tag}")
-logger.debug(f"Sonarr Add All: {library.sonarr_add_all}")
+logger.debug(f"Sonarr Add All Existing: {library.sonarr_add_all_existing}")
 logger.debug(f"Sonarr Remove by Tag: {library.sonarr_remove_by_tag}")
 logger.debug(f"TMDb Collections: {library.tmdb_collections}")
+logger.debug(f"Genre Collections: {library.genre_collections}")
 logger.debug(f"Genre Mapper: {library.genre_mapper}")
-tmdb_operation = library.assets_for_all or library.mass_genre_update or library.mass_audience_rating_update \
-    or library.mass_critic_rating_update or library.mass_trakt_rating_update \
-    or library.tmdb_collections or library.radarr_add_all or library.sonarr_add_all
-logger.debug(f"TMDb Operation: {tmdb_operation}")
+logger.debug(f"TMDb Operation: {library.tmdb_library_operation}")
 if library.split_duplicates:
     items = library.search(**{"duplicate": True})
@@ -387,11 +447,11 @@ def library_operations(config, library):
     item.split()
     logger.info(util.adjust_space(f"{item.title[:25]:<25} | Splitting"))
-if tmdb_operation:
+tmdb_collections = {}
+if library.tmdb_library_operation:
     items = library.get_all()
     radarr_adds = []
     sonarr_adds = []
-    tmdb_collections = {}
     trakt_ratings = config.Trakt.user_ratings(library.is_movie) if library.mass_trakt_rating_update else []
     for i, item in enumerate(items, 1):
@@ -435,11 +495,11 @@ def library_operations(config, library):
     pass
 path = os.path.dirname(str(item.locations[0])) if library.is_movie else str(item.locations[0])
-if library.Radarr and library.radarr_add_all and tmdb_id:
+if library.Radarr and library.radarr_add_all_existing and tmdb_id:
     path = path.replace(library.Radarr.plex_path, library.Radarr.radarr_path)
     path = path[:-1] if path.endswith(('/', '\\')) else path
     radarr_adds.append((tmdb_id, path))
-if library.Sonarr and library.sonarr_add_all and tvdb_id:
+if library.Sonarr and library.sonarr_add_all_existing and tvdb_id:
     path = path.replace(library.Sonarr.plex_path, library.Sonarr.sonarr_path)
     path = path[:-1] if path.endswith(('/', '\\')) else path
     sonarr_adds.append((tvdb_id, path))
@@ -545,38 +605,54 @@ def library_operations(config, library):
 except Failed:
     pass
-if library.Radarr and library.radarr_add_all:
+if library.Radarr and library.radarr_add_all_existing:
     try:
         library.Radarr.add_tmdb(radarr_adds)
     except Failed as e:
         logger.error(e)
-if library.Sonarr and library.sonarr_add_all:
+if library.Sonarr and library.sonarr_add_all_existing:
     try:
         library.Sonarr.add_tvdb(sonarr_adds)
     except Failed as e:
         logger.error(e)
+if tmdb_collections or library.genre_collections:
+    logger.info("")
+    util.separator(f"Starting Automated Collections")
+    logger.info("")
+    new_collections = {}
+    templates = {}
     if tmdb_collections:
-        logger.info("")
-        util.separator(f"Starting TMDb Collections")
-        logger.info("")
-        new_collections = {}
+        templates["TMDb Collection"] = library.tmdb_collections["template"]
         for _i, _n in tmdb_collections.items():
             if int(_i) not in library.tmdb_collections["exclude_ids"]:
                 template = {"name": "TMDb Collection", "collection_id": _i}
-                for k, v in library.tmdb_collections["dictionary_variables"]:
+                for k, v in library.tmdb_collections["dictionary_variables"].items():
                     if int(_i) in v:
                         template[k] = v[int(_i)]
                 for suffix in library.tmdb_collections["remove_suffix"]:
                     if _n.endswith(suffix):
                         _n = _n[:-len(suffix)]
                 new_collections[_n.strip()] = {"template": template}
-        metadata = MetadataFile(config, library, "Data", {
-            "collections": new_collections,
-            "templates": {"TMDb Collection": library.tmdb_collections["template"]}
-        })
-        run_collection(config, library, metadata, metadata.get_collections(None))
+    if library.genre_collections:
+        templates["Genre Collection"] = library.genre_collections["template"]
+        for genre in library.get_genres():
+            if genre not in library.genre_collections["exclude_genres"]:
+                template = {"name": "Genre Collection", "genre": genre}
+                for k, v in library.genre_collections["dictionary_variables"].items():
+                    if genre in v:
+                        template[k] = v[genre]
+                title = library.genre_collections["title_format"]
+                title = title.replace("<<genre>>", genre)
+                if "<<library_type>>" in title:
+                    title = title.replace("<<library_type>>", library.type)
+                new_collections[title] = {"template": template}
+    metadata = MetadataFile(config, library, "Data", {"collections": new_collections, "templates": templates})
+    run_collection(config, library, metadata, metadata.get_collections(None))
 if library.radarr_remove_by_tag:
     library.Radarr.remove_all_with_tags(library.radarr_remove_by_tag)
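The genre-collection titles added above come from a `title_format` string with `<<genre>>` and `<<library_type>>` placeholders replaced by plain `str.replace`. A hedged standalone sketch of that substitution (the format string `"Top <<genre>> <<library_type>>s"` is a hypothetical example, not a PMM default):

```python
def build_title(title_format, genre, library_type):
    # Substitute the genre placeholder unconditionally, the library-type
    # placeholder only when the format actually uses it.
    title = title_format.replace("<<genre>>", genre)
    if "<<library_type>>" in title:
        title = title.replace("<<library_type>>", library_type)
    return title

print(build_title("Top <<genre>> <<library_type>>s", "Horror", "Movie"))  # → Top Horror Movies
```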
@@ -629,7 +705,6 @@ def library_operations(config, library):
     library.find_assets(col)
 def run_collection(config, library, metadata, requested_collections):
-    global stats
     logger.info("")
     for mapping_name, collection_attrs in requested_collections.items():
         collection_start = datetime.now()
@@ -668,9 +743,10 @@ def run_collection(config, library, metadata, requested_collections):
 if should_roll_over:
     collection_handler.doRollover()
 logger.addHandler(collection_handler)
+library.status[mapping_name] = {"status": "", "errors": [], "created": False, "modified": False, "deleted": False, "added": 0, "unchanged": 0, "removed": 0, "radarr": 0, "sonarr": 0}
 try:
-    util.separator(f"{mapping_name} Collection")
+    util.separator(f"{mapping_name} Collection in {library.name}")
     logger.info("")
     if output_str:
         logger.info(output_str)
@@ -707,15 +783,16 @@ def run_collection(config, library, metadata, requested_collections):
 builder.find_rating_keys()
 if len(builder.added_items) >= builder.minimum and builder.build_collection:
-    logger.info("")
-    util.separator(f"Adding to {mapping_name} Collection", space=False, border=False)
-    logger.info("")
-    items_added = builder.add_to_collection()
-    stats["added"] += items_added
+    items_added, items_unchanged = builder.add_to_collection()
+    library.stats["added"] += items_added
+    library.status[mapping_name]["added"] = items_added
+    library.stats["unchanged"] += items_unchanged
+    library.status[mapping_name]["unchanged"] = items_unchanged
     items_removed = 0
     if builder.sync:
         items_removed = builder.sync_collection()
-        stats["removed"] += items_removed
+        library.stats["removed"] += items_removed
+        library.status[mapping_name]["removed"] = items_removed
 elif len(builder.added_items) < builder.minimum and builder.build_collection:
     logger.info("")
     logger.info(f"Collection Minimum: {builder.minimum} not met for {mapping_name} Collection")
@@ -726,22 +803,22 @@ def run_collection(config, library, metadata, requested_collections):
 builder.deleted = True
 if builder.do_missing and (len(builder.missing_movies) > 0 or len(builder.missing_shows) > 0):
-    if builder.details["show_missing"] is True:
-        logger.info("")
-        util.separator(f"Missing from Library", space=False, border=False)
-        logger.info("")
     radarr_add, sonarr_add = builder.run_missing()
-    stats["radarr"] += radarr_add
-    stats["sonarr"] += sonarr_add
+    library.stats["radarr"] += radarr_add
+    library.status[mapping_name]["radarr"] += radarr_add
+    library.stats["sonarr"] += sonarr_add
+    library.status[mapping_name]["sonarr"] += sonarr_add
 run_item_details = True
 if valid and builder.build_collection and (builder.builders or builder.smart_url):
     try:
         builder.load_collection()
         if builder.created:
-            stats["created"] += 1
+            library.stats["created"] += 1
+            library.status[mapping_name]["created"] = True
         elif items_added > 0 or items_removed > 0:
-            stats["modified"] += 1
+            library.stats["modified"] += 1
+            library.status[mapping_name]["modified"] = True
     except Failed:
         util.print_stacktrace()
         run_item_details = False
@@ -751,7 +828,8 @@ def run_collection(config, library, metadata, requested_collections):
 builder.update_details()
 if builder.deleted:
-    stats["deleted"] += 1
+    library.stats["deleted"] += 1
+    library.status[mapping_name]["deleted"] = True
 if builder.server_preroll is not None:
     library.set_server_preroll(builder.server_preroll)
@@ -776,22 +854,36 @@ def run_collection(config, library, metadata, requested_collections):
 if builder.run_again and (len(builder.run_again_movies) > 0 or len(builder.run_again_shows) > 0):
     library.run_again.append(builder)
+if library.status[mapping_name]["created"]:
+    library.status[mapping_name]["status"] = "Created"
+elif library.status[mapping_name]["deleted"]:
+    library.status[mapping_name]["status"] = "Deleted"
+elif library.status[mapping_name]["modified"]:
+    library.status[mapping_name]["status"] = "Modified"
+else:
+    library.status[mapping_name]["status"] = "Unchanged"
 except NotScheduled as e:
     util.print_multiline(e, info=True)
+    library.status[mapping_name]["status"] = "Not Scheduled"
 except Failed as e:
     library.notify(e, collection=mapping_name)
     util.print_stacktrace()
     util.print_multiline(e, error=True)
+    library.status[mapping_name]["status"] = "PMM Failure"
+    library.status[mapping_name]["errors"].append(e)
 except Exception as e:
     library.notify(f"Unknown Error: {e}", collection=mapping_name)
     util.print_stacktrace()
     logger.error(f"Unknown Error: {e}")
+    library.status[mapping_name]["status"] = "Unknown Error"
+    library.status[mapping_name]["errors"].append(e)
 logger.info("")
 util.separator(f"Finished {mapping_name} Collection\nCollection Run Time: {str(datetime.now() - collection_start).split('.')[0]}")
 logger.removeHandler(collection_handler)
 def run_playlists(config):
+    stats = {"created": 0, "modified": 0, "deleted": 0, "added": 0, "unchanged": 0, "removed": 0, "radarr": 0, "sonarr": 0}
+    status = {}
     logger.info("")
     util.separator("Playlists")
     logger.info("")
@@ -828,6 +920,7 @@ def run_playlists(config):
 if should_roll_over:
     playlist_handler.doRollover()
 logger.addHandler(playlist_handler)
+status[mapping_name] = {"status": "", "errors": [], "created": False, "modified": False, "deleted": False, "added": 0, "unchanged": 0, "removed": 0, "radarr": 0, "sonarr": 0}
 server_name = None
 library_names = None
 try:
@@ -1027,15 +1120,16 @@ def run_playlists(config):
 builder.filter_and_save_items(items)
 if len(builder.added_items) >= builder.minimum:
-    logger.info("")
-    util.separator(f"Adding to {mapping_name} Playlist", space=False, border=False)
-    logger.info("")
-    items_added = builder.add_to_collection()
+    items_added, items_unchanged = builder.add_to_collection()
     stats["added"] += items_added
+    status[mapping_name]["added"] += items_added
+    stats["unchanged"] += items_unchanged
+    status[mapping_name]["unchanged"] += items_unchanged
     items_removed = 0
     if builder.sync:
         items_removed = builder.sync_collection()
         stats["removed"] += items_removed
+        status[mapping_name]["removed"] += items_removed
 elif len(builder.added_items) < builder.minimum:
     logger.info("")
     logger.info(f"Playlist Minimum: {builder.minimum} not met for {mapping_name} Playlist")
@@ -1046,21 +1140,21 @@ def run_playlists(config):
 builder.deleted = True
 if builder.do_missing and (len(builder.missing_movies) > 0 or len(builder.missing_shows) > 0):
-    if builder.details["show_missing"] is True:
-        logger.info("")
-        util.separator(f"Missing from Library", space=False, border=False)
-        logger.info("")
     radarr_add, sonarr_add = builder.run_missing()
     stats["radarr"] += radarr_add
+    status[mapping_name]["radarr"] += radarr_add
     stats["sonarr"] += sonarr_add
+    status[mapping_name]["sonarr"] += sonarr_add
 run_item_details = True
 try:
     builder.load_collection()
     if builder.created:
         stats["created"] += 1
+        status[mapping_name]["created"] = True
     elif items_added > 0 or items_removed > 0:
         stats["modified"] += 1
+        status[mapping_name]["modified"] = True
 except Failed:
     util.print_stacktrace()
     run_item_details = False
@@ -1071,6 +1165,7 @@ def run_playlists(config):
 if builder.deleted:
     stats["deleted"] += 1
+    status[mapping_name]["deleted"] = True
 if valid and run_item_details and builder.builders and (builder.item_details or builder.custom_sort):
     try:
@@ -1091,27 +1186,34 @@ def run_playlists(config):
 except NotScheduled as e:
     util.print_multiline(e, info=True)
+    status[mapping_name]["status"] = "Not Scheduled"
 except Failed as e:
     config.notify(e, server=server_name, library=library_names, playlist=mapping_name)
     util.print_stacktrace()
     util.print_multiline(e, error=True)
+    status[mapping_name]["status"] = "PMM Failure"
+    status[mapping_name]["errors"].append(e)
 except Exception as e:
     config.notify(f"Unknown Error: {e}", server=server_name, library=library_names, playlist=mapping_name)
     util.print_stacktrace()
     logger.error(f"Unknown Error: {e}")
+    status[mapping_name]["status"] = "Unknown Error"
+    status[mapping_name]["errors"].append(e)
 logger.info("")
-util.separator(
-    f"Finished {mapping_name} Playlist\nPlaylist Run Time: {str(datetime.now() - playlist_start).split('.')[0]}")
+util.separator(f"Finished {mapping_name} Playlist\nPlaylist Run Time: {str(datetime.now() - playlist_start).split('.')[0]}")
 logger.removeHandler(playlist_handler)
+return status, stats
 try:
     if run or test or collections or libraries or resume:
         start({
             "config_file": config_file,
             "test": test,
+            "delete": delete,
+            "ignore_schedules": ignore_schedules,
             "collections": collections,
             "libraries": libraries,
+            "library_first": library_first,
             "resume": resume,
             "trace": trace
         })
@@ -1127,7 +1229,7 @@ try:
 else:
     raise Failed(f"Argument Error: blank time argument")
 for time_to_run in valid_times:
-    schedule.every().day.at(time_to_run).do(start, {"config_file": config_file, "time": time_to_run, "trace": trace})
+    schedule.every().day.at(time_to_run).do(start, {"config_file": config_file, "time": time_to_run, "delete": delete, "library_first": library_first, "trace": trace})
 while True:
     schedule.run_pending()
     if not no_countdown:

@@ -2,9 +2,9 @@ PlexAPI==4.8.0
 tmdbv3api==1.7.6
 arrapi==1.3.0
 lxml==4.7.1
-requests==2.26.0
-ruamel.yaml==0.17.19
+requests==2.27.1
+ruamel.yaml==0.17.20
 schedule==1.1.0
 retrying==1.3.3
 pathvalidate==2.5.0
-pillow==8.4.0
+pillow==9.0.0