Merge pull request #309 from meisnate12/develop

v1.11.0
meisnate12 4 years ago committed by GitHub
commit f9cd722875

@ -0,0 +1 @@
github: meisnate12

@ -0,0 +1,25 @@
---
name: Bug Report
about: Please do not use bug reports for support issues.
title: 'Bug: '
labels: 'status:not-yet-viewed, bug'
assignees: 'meisnate12'
---
<!---
THIS IS NOT THE PLACE TO ASK FOR SUPPORT! Please use [Discord](https://discord.gg/TsdpsFYqqm) for support issues.
DO NOT ERASE THE TEMPLATE! Please complete the entire template.
--->
**Describe the Bug**
A clear and concise description of what the bug is.
**Relevant Collection Config**
- If you're having a problem with a collection, include the collection config from your metadata file.
**Plex Meta Manager Info**
- Version Number (can be found at the beginning of your meta.log file).
**Link to logs (required)**
- If you're having an error with a specific collection, include the collection.log; otherwise, please include the full meta.log file on [Gist](http://gist.github.com). _Do not upload attachments_.

@ -0,0 +1,8 @@
blank_issues_enabled: false
contact_links:
- name: Plex Meta Manager Wiki
url: https://github.com/meisnate12/Plex-Meta-Manager/wiki
about: Please check the wiki to see if your question has already been answered.
- name: Discord
url: https://discord.gg/TsdpsFYqqm
about: Please use Discord to ask for support.

@ -0,0 +1,20 @@
---
name: Feature Request
about: Suggest a new feature for Plex Meta Manager.
title: 'Feature Request: '
labels: 'status:not-yet-viewed, enhancement'
assignees: 'meisnate12'
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

@ -0,0 +1,21 @@
## Description
Please include a summary of the changes.
### Issues Fixed or Closed
- Fixes #(issue)
## Type of Change
Please delete options that are not relevant.
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
## Checklist
- [ ] My code follows the style guidelines of this project
- [ ] I have performed a self-review of my own code
- [ ] I have commented my code, particularly in hard-to-understand areas

.gitignore

@ -12,7 +12,10 @@ __pycache__/
/test.py
logs/
config/*
!config/overlays/
!config/*.template
*.png
!overlay.png
build/
develop-eggs/
dist/

@ -1,5 +1,11 @@
# Plex Meta Manager
#### Version 1.10.0
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/meisnate12/Plex-Meta-Manager?style=plastic)](https://github.com/meisnate12/Plex-Meta-Manager/releases)
[![GitHub commits since latest release (by SemVer)](https://img.shields.io/github/commits-since/meisnate12/plex-meta-manager/latest/develop?label=Number%20of%20Commits%20in%20Develop&style=plastic)](https://github.com/meisnate12/Plex-Meta-Manager/tree/develop)
[![Docker Image Version (latest semver)](https://img.shields.io/docker/v/meisnate12/plex-meta-manager?label=docker&sort=semver&style=plastic)](https://hub.docker.com/r/meisnate12/plex-meta-manager)
[![Docker Cloud Build Status](https://img.shields.io/docker/cloud/build/meisnate12/plex-meta-manager?style=plastic)](https://hub.docker.com/r/meisnate12/plex-meta-manager)
[![Discord](https://img.shields.io/discord/822460010649878528?label=Discord&style=plastic)](https://discord.gg/TsdpsFYqqm)
[![Sponsor or Donate](https://img.shields.io/badge/-Sponsor_or_Donate-blueviolet?style=plastic)](https://github.com/sponsors/meisnate12)
The original concept for Plex Meta Manager is [Plex Auto Collections](https://github.com/mza921/Plex-Auto-Collections), but this is rewritten from the ground up to include a scheduler, metadata edits, multiple libraries, and logging. Plex Meta Manager is a Python 3 script that can be run continuously, using YAML configuration files to update the metadata of the movies, shows, and collections in your libraries on a schedule, as well as automatically build collections based on various methods, all detailed in the wiki. Some collection examples the script can automatically build and update daily include Plex-based searches (such as actor, genre, or studio collections) and collections based on TMDb, IMDb, Trakt, TVDb, AniDB, or MyAnimeList lists, among various other services.
@ -7,8 +13,6 @@ The script can update many metadata fields for movies, shows, collections, seaso
The script is designed to work with most metadata agents, including the new Plex Movie Agent, the new Plex TV Agent, [Hama Anime Agent](https://github.com/ZeroQI/Hama.bundle), and [MyAnimeList Anime Agent](https://github.com/Fribb/MyAnimeList.bundle).
[![paypal](https://www.paypalobjects.com/en_US/i/btn/btn_donateCC_LG.gif)](https://www.paypal.com/donate?business=JTK3CVKF3ZHP2&item_name=Plex+Meta+Manager&currency_code=USD)
## Getting Started
1. Install Plex Meta Manager either by installing Python3 and following the [Local Installation Guide](https://github.com/meisnate12/Plex-Meta-Manager/wiki/Local-Installation)

Binary file not shown (new image, 34 KiB)
Binary file not shown (new image, 42 KiB)
Binary file not shown (new image, 32 KiB)
Binary file not shown (new image, 7.4 KiB)
Binary file not shown (new image, 16 KiB)

@ -8,7 +8,7 @@ logger = logging.getLogger("Plex Meta Manager")
builders = ["anidb_id", "anidb_relation", "anidb_popular"]
class AniDBAPI:
class AniDB:
def __init__(self, config):
self.config = config
self.urls = {
@ -60,7 +60,7 @@ class AniDBAPI:
else: raise Failed(f"AniDB Error: Method {method} not supported")
movie_ids, show_ids = self.config.Convert.anidb_to_ids(anidb_ids)
logger.debug("")
logger.debug(f"AniDB IDs Found: {anidb_ids}")
logger.debug(f"TMDb IDs Found: {movie_ids}")
logger.debug(f"TVDb IDs Found: {show_ids}")
logger.debug(f"{len(anidb_ids)} AniDB IDs Found: {anidb_ids}")
logger.debug(f"{len(movie_ids)} TMDb IDs Found: {movie_ids}")
logger.debug(f"{len(show_ids)} TVDb IDs Found: {show_ids}")
return movie_ids, show_ids

@ -22,7 +22,7 @@ pretty_names = {
tag_query = "query{MediaTagCollection {name}}"
genre_query = "query{GenreCollection}"
class AniListAPI:
class AniList:
def __init__(self, config):
self.config = config
self.url = "https://graphql.anilist.co"
@ -243,7 +243,7 @@ class AniListAPI:
raise Failed(f"AniList Error: Method {method} not supported")
movie_ids, show_ids = self.config.Convert.anilist_to_ids(anilist_ids)
logger.debug("")
logger.debug(f"AniList IDs Found: {anilist_ids}")
logger.debug(f"Shows Found: {show_ids}")
logger.debug(f"Movies Found: {movie_ids}")
logger.debug(f"{len(anilist_ids)} AniList IDs Found: {anilist_ids}")
logger.debug(f"{len(movie_ids)} TMDb IDs Found: {movie_ids}")
logger.debug(f"{len(show_ids)} TVDb IDs Found: {show_ids}")
return movie_ids, show_ids

@ -1,7 +1,8 @@
import logging, os, re
from datetime import datetime, timedelta
from modules import anidb, anilist, imdb, letterboxd, mal, plex, radarr, sonarr, tautulli, tmdb, trakttv, tvdb, util
from modules.util import Failed
from modules import anidb, anilist, icheckmovies, imdb, letterboxd, mal, plex, radarr, sonarr, tautulli, tmdb, trakttv, tvdb, util
from modules.util import Failed, ImageData
from PIL import Image
from plexapi.exceptions import BadRequest, NotFound
from plexapi.video import Movie, Show
from urllib.parse import quote
@ -57,7 +58,7 @@ filter_translation = {
"writer": "writers"
}
modifier_alias = {".greater": ".gt", ".less": ".lt"}
all_builders = anidb.builders + anilist.builders + imdb.builders + letterboxd.builders + mal.builders + plex.builders + tautulli.builders + tmdb.builders + trakttv.builders + tvdb.builders
all_builders = anidb.builders + anilist.builders + icheckmovies.builders + imdb.builders + letterboxd.builders + mal.builders + plex.builders + tautulli.builders + tmdb.builders + trakttv.builders + tvdb.builders
dictionary_builders = [
"filters",
"anilist_genre",
@ -81,6 +82,8 @@ show_only_builders = [
movie_only_builders = [
"letterboxd_list",
"letterboxd_list_details",
"icheckmovies_list",
"icheckmovies_list_details",
"tmdb_collection",
"tmdb_collection_details",
"tmdb_movie",
@ -123,21 +126,29 @@ smart_url_collection_invalid = [
"sonarr_series", "sonarr_season", "sonarr_tag", "sonarr_search", "sonarr_cutoff_search",
"filters"
]
all_details = [
"sort_title", "content_rating", "collection_mode", "collection_order",
summary_details = [
"summary", "tmdb_summary", "tmdb_description", "tmdb_biography", "tvdb_summary",
"tvdb_description", "trakt_description", "letterboxd_description",
"url_poster", "tmdb_poster", "tmdb_profile", "tvdb_poster", "file_poster",
"url_background", "tmdb_background", "tvdb_background", "file_background",
"name_mapping", "label", "show_filtered", "show_missing", "save_missing"
"tvdb_description", "trakt_description", "letterboxd_description", "icheckmovies_description"
]
collectionless_details = [
"sort_title", "content_rating",
"summary", "tmdb_summary", "tmdb_description", "tmdb_biography",
"collection_order", "plex_collectionless",
"url_poster", "tmdb_poster", "tmdb_profile", "file_poster",
"url_background", "file_background",
"name_mapping", "label", "label_sync_mode", "test"
poster_details = [
"url_poster", "tmdb_poster", "tmdb_profile", "tvdb_poster", "file_poster"
]
background_details = [
"url_background", "tmdb_background", "tvdb_background", "file_background"
]
boolean_details = [
"visible_library",
"visible_home",
"visible_shared",
"show_filtered",
"show_missing",
"save_missing",
"item_assets"
]
string_details = [
"sort_title",
"content_rating",
"name_mapping"
]
ignored_details = [
"smart_filter",
@ -148,29 +159,29 @@ ignored_details = [
"sync_mode",
"template",
"test",
"tmdb_person"
]
boolean_details = [
"show_filtered",
"show_missing",
"save_missing"
"tmdb_person",
"build_collection"
]
collectionless_details = [
"collection_order", "plex_collectionless",
"label", "label_sync_mode", "test"
] + poster_details + background_details + summary_details + string_details
all_filters = [
"actor", "actor.not",
"audio_language", "audio_language.not",
"audio_track_title", "audio_track_title.not", "audio_track_title.begins", "audio_track_title.ends",
"audio_track_title", "audio_track_title.not", "audio_track_title.begins", "audio_track_title.ends", "audio_track_title.regex",
"collection", "collection.not",
"content_rating", "content_rating.not",
"country", "country.not",
"director", "director.not",
"filepath", "filepath.not",
"filepath", "filepath.not", "filepath.begins", "filepath.ends", "filepath.regex",
"genre", "genre.not",
"label", "label.not",
"producer", "producer.not",
"release", "release.not", "release.before", "release.after",
"added", "added.not", "added.before", "added.after",
"last_played", "last_played.not", "last_played.before", "last_played.after",
"title", "title.not", "title.begins", "title.ends",
"release", "release.not", "release.before", "release.after", "release.regex", "history",
"added", "added.not", "added.before", "added.after", "added.regex",
"last_played", "last_played.not", "last_played.before", "last_played.after", "last_played.regex",
"title", "title.not", "title.begins", "title.ends", "title.regex",
"plays.gt", "plays.gte", "plays.lt", "plays.lte",
"tmdb_vote_count.gt", "tmdb_vote_count.gte", "tmdb_vote_count.lt", "tmdb_vote_count.lte",
"duration.gt", "duration.gte", "duration.lt", "duration.lte",
@ -178,7 +189,7 @@ all_filters = [
"user_rating.gt", "user_rating.gte", "user_rating.lt", "user_rating.lte",
"audience_rating.gt", "audience_rating.gte", "audience_rating.lt", "audience_rating.lte",
"critic_rating.gt", "critic_rating.gte", "critic_rating.lt", "critic_rating.lte",
"studio", "studio.not", "studio.begins", "studio.ends",
"studio", "studio.not", "studio.begins", "studio.ends", "studio.regex",
"subtitle_language", "subtitle_language.not",
"resolution", "resolution.not",
"writer", "writer.not",
@ -186,7 +197,7 @@ all_filters = [
]
movie_only_filters = [
"audio_language", "audio_language.not",
"audio_track_title", "audio_track_title.not", "audio_track_title.begins", "audio_track_title.ends",
"audio_track_title", "audio_track_title.not", "audio_track_title.begins", "audio_track_title.ends", "audio_track_title.regex",
"country", "country.not",
"director", "director.not",
"duration.gt", "duration.gte", "duration.lt", "duration.lte",
@ -207,7 +218,8 @@ class CollectionBuilder:
self.details = {
"show_filtered": self.library.show_filtered,
"show_missing": self.library.show_missing,
"save_missing": self.library.save_missing
"save_missing": self.library.save_missing,
"item_assets": False
}
self.item_details = {}
self.radarr_options = {}
@ -439,7 +451,7 @@ class CollectionBuilder:
if "build_collection" in methods:
logger.info("")
logger.info("Validating Method: build_collection")
if not self.data[methods["build_collection"]]:
if self.data[methods["build_collection"]] is None:
logger.warning(f"Collection Warning: build_collection attribute is blank defaulting to true")
else:
logger.debug(f"Value: {self.data[methods['build_collection']]}")
@ -447,7 +459,7 @@ class CollectionBuilder:
if "tmdb_person" in methods:
logger.info("")
logger.info("Validating Method: build_collection")
logger.info("Validating Method: tmdb_person")
if not self.data[methods["tmdb_person"]]:
raise Failed("Collection Error: tmdb_person attribute is blank")
else:
@ -560,6 +572,8 @@ class CollectionBuilder:
self.summaries[method_name] = config.Trakt.standard_list(config.Trakt.validate_trakt(util.get_list(method_data))[0]).description
elif method_name == "letterboxd_description":
self.summaries[method_name] = config.Letterboxd.get_list_description(method_data, self.library.Plex.language)
elif method_name == "icheckmovies_description":
self.summaries[method_name] = config.ICheckMovies.get_list_description(method_data, self.library.Plex.language)
elif method_name == "collection_mode":
if str(method_data).lower() == "default":
self.details[method_name] = "default"
@ -601,20 +615,37 @@ class CollectionBuilder:
if os.path.exists(method_data): self.backgrounds[method_name] = os.path.abspath(method_data)
else: raise Failed(f"Collection Error: Background Path Does Not Exist: {os.path.abspath(method_data)}")
elif method_name == "label":
if "label" in self.data and "label.sync" in self.data:
raise Failed(f"Collection Error: Cannot use label and label.sync together")
if "label.remove" in self.data and "label.sync" in self.data:
raise Failed(f"Collection Error: Cannot use label.remove and label.sync together")
if method_final == "label" and "label_sync_mode" in self.data and self.data["label_sync_mode"] == "sync":
if "label" in methods and "label.sync" in methods:
raise Failed("Collection Error: Cannot use label and label.sync together")
if "label.remove" in methods and "label.sync" in methods:
raise Failed("Collection Error: Cannot use label.remove and label.sync together")
if method_final == "label" and "label_sync_mode" in methods and self.data[methods["label_sync_mode"]] == "sync":
self.details["label.sync"] = util.get_list(method_data)
else:
self.details[method_final] = util.get_list(method_data)
elif method_name == "item_label":
if "item_label" in self.data and "item_label.sync" in self.data:
if "item_label" in methods and "item_label.sync" in methods:
raise Failed(f"Collection Error: Cannot use item_label and item_label.sync together")
if "item_label.remove" in self.data and "item_label.sync" in self.data:
if "item_label.remove" in methods and "item_label.sync" in methods:
raise Failed(f"Collection Error: Cannot use item_label.remove and item_label.sync together")
self.item_details[method_final] = util.get_list(method_data)
elif method_name in ["item_radarr_tag", "item_sonarr_tag"]:
if method_name in methods and f"{method_name}.sync" in methods:
raise Failed(f"Collection Error: Cannot use {method_name} and {method_name}.sync together")
if f"{method_name}.remove" in methods and f"{method_name}.sync" in methods:
raise Failed(f"Collection Error: Cannot use {method_name}.remove and {method_name}.sync together")
if method_name in methods and f"{method_name}.remove" in methods:
raise Failed(f"Collection Error: Cannot use {method_name} and {method_name}.remove together")
self.item_details[method_name] = util.get_list(method_data)
self.item_details["apply_tags"] = method_mod[1:] if method_mod else ""
elif method_name == "item_overlay":
overlay = os.path.join(config.default_dir, "overlays", method_data, "overlay.png")
if not os.path.exists(overlay):
raise Failed(f"Collection Error: {method_data} overlay image not found at {overlay}")
if method_data in self.library.overlays:
raise Failed("Each Overlay can only be used once per Library")
self.library.overlays.append(method_data)
self.item_details[method_name] = method_data
elif method_name in plex.item_advance_keys:
key, options = plex.item_advance_keys[method_name]
if method_name in advance_new_agent and self.library.agent not in plex.new_plex_agents:
@ -627,8 +658,8 @@ class CollectionBuilder:
self.item_details[method_name] = str(method_data).lower()
elif method_name in boolean_details:
self.details[method_name] = util.get_bool(method_name, method_data)
elif method_name in all_details:
self.details[method_name] = method_data
elif method_name in string_details:
self.details[method_name] = str(method_data)
elif method_name == "radarr_add":
self.add_to_radarr = util.get_bool(method_name, method_data)
elif method_name == "radarr_folder":
@ -714,6 +745,17 @@ class CollectionBuilder:
list_count = 0
new_list.append({"url": imdb_url, "limit": list_count})
self.methods.append((method_name, new_list))
elif method_name == "icheckmovies_list":
valid_lists = []
for icheckmovies_list in util.get_list(method_data, split=False):
valid_lists.append(config.ICheckMovies.validate_icheckmovies_list(icheckmovies_list, self.library.Plex.language))
self.methods.append((method_name, valid_lists))
elif method_name == "icheckmovies_list_details":
valid_lists = []
for icheckmovies_list in util.get_list(method_data, split=False):
valid_lists.append(config.ICheckMovies.validate_icheckmovies_list(icheckmovies_list, self.library.Plex.language))
self.methods.append((method_name[:-8], valid_lists))
self.summaries[method_name] = config.ICheckMovies.get_list_description(method_data, self.library.Plex.language)
elif method_name == "letterboxd_list":
self.methods.append((method_name, util.get_list(method_data, split=False)))
elif method_name == "letterboxd_list_details":
@ -721,7 +763,8 @@ class CollectionBuilder:
self.summaries[method_name] = config.Letterboxd.get_list_description(values[0], self.library.Plex.language)
self.methods.append((method_name[:-8], values))
elif method_name in dictionary_builders:
if isinstance(method_data, dict):
for dict_data in util.get_list(method_data):
if isinstance(dict_data, dict):
def get_int(parent, int_method, data_in, methods_in, default_in, minimum=1, maximum=None):
if int_method not in methods_in:
logger.warning(f"Collection Warning: {parent} {int_method} attribute not found using {default_in} as default")
@ -737,13 +780,13 @@ class CollectionBuilder:
return default_in
if method_name == "filters":
validate = True
if "validate" in method_data:
if method_data["validate"] is None:
if "validate" in dict_data:
if dict_data["validate"] is None:
raise Failed("Collection Error: validate filter attribute is blank")
if not isinstance(method_data["validate"], bool):
if not isinstance(dict_data["validate"], bool):
raise Failed("Collection Error: validate filter attribute must be either true or false")
validate = method_data["validate"]
for filter_method, filter_data in method_data.items():
validate = dict_data["validate"]
for filter_method, filter_data in dict_data.items():
filter_attr, modifier, filter_final = self._split(filter_method)
if filter_final not in all_filters:
raise Failed(f"Collection Error: {filter_final} is not a valid filter attribute")
@ -757,19 +800,19 @@ class CollectionBuilder:
self.filters.append((filter_final, self.validate_attribute(filter_attr, modifier, f"{filter_final} filter", filter_data, validate)))
elif method_name == "plex_collectionless":
new_dictionary = {}
dict_methods = {dm.lower(): dm for dm in method_data}
dict_methods = {dm.lower(): dm for dm in dict_data}
prefix_list = []
if "exclude_prefix" in dict_methods and method_data[dict_methods["exclude_prefix"]]:
if isinstance(method_data[dict_methods["exclude_prefix"]], list):
prefix_list.extend([exclude for exclude in method_data[dict_methods["exclude_prefix"]] if exclude])
if "exclude_prefix" in dict_methods and dict_data[dict_methods["exclude_prefix"]]:
if isinstance(dict_data[dict_methods["exclude_prefix"]], list):
prefix_list.extend([exclude for exclude in dict_data[dict_methods["exclude_prefix"]] if exclude])
else:
prefix_list.append(str(method_data[dict_methods["exclude_prefix"]]))
prefix_list.append(str(dict_data[dict_methods["exclude_prefix"]]))
exact_list = []
if "exclude" in dict_methods and method_data[dict_methods["exclude"]]:
if isinstance(method_data[dict_methods["exclude"]], list):
exact_list.extend([exclude for exclude in method_data[dict_methods["exclude"]] if exclude])
if "exclude" in dict_methods and dict_data[dict_methods["exclude"]]:
if isinstance(dict_data[dict_methods["exclude"]], list):
exact_list.extend([exclude for exclude in dict_data[dict_methods["exclude"]] if exclude])
else:
exact_list.append(str(method_data[dict_methods["exclude"]]))
exact_list.append(str(dict_data[dict_methods["exclude"]]))
if len(prefix_list) == 0 and len(exact_list) == 0:
raise Failed("Collection Error: you must have at least one exclusion")
exact_list.append(self.name)
@ -777,10 +820,10 @@ class CollectionBuilder:
new_dictionary["exclude"] = exact_list
self.methods.append((method_name, [new_dictionary]))
elif method_name == "plex_search":
self.methods.append((method_name, [self.build_filter("plex_search", method_data)]))
self.methods.append((method_name, [self.build_filter("plex_search", dict_data)]))
elif method_name == "tmdb_discover":
new_dictionary = {"limit": 100}
for discover_name, discover_data in method_data.items():
for discover_name, discover_data in dict_data.items():
discover_final = discover_name.lower()
if discover_data:
if (self.library.is_movie and discover_final in tmdb.discover_movie) or (self.library.is_show and discover_final in tmdb.discover_tv):
@ -800,12 +843,12 @@ class CollectionBuilder:
else:
raise Failed(f"Collection Error: {method_name} attribute {discover_final}: {discover_data} is invalid")
elif discover_final == "certification_country":
if "certification" in method_data or "certification.lte" in method_data or "certification.gte" in method_data:
if "certification" in dict_data or "certification.lte" in dict_data or "certification.gte" in dict_data:
new_dictionary[discover_final] = discover_data
else:
raise Failed(f"Collection Error: {method_name} attribute {discover_final}: must be used with either certification, certification.lte, or certification.gte")
elif discover_final in ["certification", "certification.lte", "certification.gte"]:
if "certification_country" in method_data:
if "certification_country" in dict_data:
new_dictionary[discover_final] = discover_data
else:
raise Failed(f"Collection Error: {method_name} attribute {discover_final}: must be used with certification_country")
@ -843,22 +886,22 @@ class CollectionBuilder:
new_dictionary["list_type"] = "watched"
else:
raise Failed(f"Collection Error: {method_name} attribute not supported")
dict_methods = {dm.lower(): dm for dm in method_data}
new_dictionary["list_days"] = get_int(method_name, "list_days", method_data, dict_methods, 30)
new_dictionary["list_size"] = get_int(method_name, "list_size", method_data, dict_methods, 10)
new_dictionary["list_buffer"] = get_int(method_name, "list_buffer", method_data, dict_methods, 20)
dict_methods = {dm.lower(): dm for dm in dict_data}
new_dictionary["list_days"] = get_int(method_name, "list_days", dict_data, dict_methods, 30)
new_dictionary["list_size"] = get_int(method_name, "list_size", dict_data, dict_methods, 10)
new_dictionary["list_buffer"] = get_int(method_name, "list_buffer", dict_data, dict_methods, 20)
self.methods.append((method_name, [new_dictionary]))
elif method_name == "mal_season":
new_dictionary = {"sort_by": "anime_num_list_users"}
dict_methods = {dm.lower(): dm for dm in method_data}
dict_methods = {dm.lower(): dm for dm in dict_data}
if "sort_by" not in dict_methods:
logger.warning("Collection Warning: mal_season sort_by attribute not found using members as default")
elif not method_data[dict_methods["sort_by"]]:
elif not dict_data[dict_methods["sort_by"]]:
logger.warning("Collection Warning: mal_season sort_by attribute is blank using members as default")
elif method_data[dict_methods["sort_by"]] not in mal.season_sort:
logger.warning(f"Collection Warning: mal_season sort_by attribute {method_data[dict_methods['sort_by']]} invalid must be either 'members' or 'score' using members as default")
elif dict_data[dict_methods["sort_by"]] not in mal.season_sort:
logger.warning(f"Collection Warning: mal_season sort_by attribute {dict_data[dict_methods['sort_by']]} invalid must be either 'members' or 'score' using members as default")
else:
new_dictionary["sort_by"] = mal.season_sort[method_data[dict_methods["sort_by"]]]
new_dictionary["sort_by"] = mal.season_sort[dict_data[dict_methods["sort_by"]]]
if self.current_time.month in [1, 2, 3]: new_dictionary["season"] = "winter"
elif self.current_time.month in [4, 5, 6]: new_dictionary["season"] = "spring"
@ -867,49 +910,49 @@ class CollectionBuilder:
if "season" not in dict_methods:
logger.warning(f"Collection Warning: mal_season season attribute not found using the current season: {new_dictionary['season']} as default")
elif not method_data[dict_methods["season"]]:
elif not dict_data[dict_methods["season"]]:
logger.warning(f"Collection Warning: mal_season season attribute is blank using the current season: {new_dictionary['season']} as default")
elif method_data[dict_methods["season"]] not in util.pretty_seasons:
logger.warning(f"Collection Warning: mal_season season attribute {method_data[dict_methods['season']]} invalid must be either 'winter', 'spring', 'summer' or 'fall' using the current season: {new_dictionary['season']} as default")
elif dict_data[dict_methods["season"]] not in util.pretty_seasons:
logger.warning(f"Collection Warning: mal_season season attribute {dict_data[dict_methods['season']]} invalid must be either 'winter', 'spring', 'summer' or 'fall' using the current season: {new_dictionary['season']} as default")
else:
new_dictionary["season"] = method_data[dict_methods["season"]]
new_dictionary["season"] = dict_data[dict_methods["season"]]
new_dictionary["year"] = get_int(method_name, "year", method_data, dict_methods, self.current_time.year, minimum=1917, maximum=self.current_time.year + 1)
new_dictionary["limit"] = get_int(method_name, "limit", method_data, dict_methods, 100, maximum=500)
new_dictionary["year"] = get_int(method_name, "year", dict_data, dict_methods, self.current_time.year, minimum=1917, maximum=self.current_time.year + 1)
new_dictionary["limit"] = get_int(method_name, "limit", dict_data, dict_methods, 100, maximum=500)
self.methods.append((method_name, [new_dictionary]))
elif method_name == "mal_userlist":
new_dictionary = {"status": "all", "sort_by": "list_score"}
dict_methods = {dm.lower(): dm for dm in method_data}
dict_methods = {dm.lower(): dm for dm in dict_data}
if "username" not in dict_methods:
raise Failed("Collection Error: mal_userlist username attribute is required")
elif not method_data[dict_methods["username"]]:
elif not dict_data[dict_methods["username"]]:
raise Failed("Collection Error: mal_userlist username attribute is blank")
else:
new_dictionary["username"] = method_data[dict_methods["username"]]
new_dictionary["username"] = dict_data[dict_methods["username"]]
if "status" not in dict_methods:
logger.warning("Collection Warning: mal_season status attribute not found using all as default")
elif not method_data[dict_methods["status"]]:
elif not dict_data[dict_methods["status"]]:
logger.warning("Collection Warning: mal_season status attribute is blank using all as default")
elif method_data[dict_methods["status"]] not in mal.userlist_status:
logger.warning(f"Collection Warning: mal_season status attribute {method_data[dict_methods['status']]} invalid must be either 'all', 'watching', 'completed', 'on_hold', 'dropped' or 'plan_to_watch' using all as default")
elif dict_data[dict_methods["status"]] not in mal.userlist_status:
logger.warning(f"Collection Warning: mal_season status attribute {dict_data[dict_methods['status']]} invalid must be either 'all', 'watching', 'completed', 'on_hold', 'dropped' or 'plan_to_watch' using all as default")
else:
new_dictionary["status"] = mal.userlist_status[method_data[dict_methods["status"]]]
new_dictionary["status"] = mal.userlist_status[dict_data[dict_methods["status"]]]
if "sort_by" not in dict_methods:
logger.warning("Collection Warning: mal_season sort_by attribute not found using score as default")
elif not method_data[dict_methods["sort_by"]]:
elif not dict_data[dict_methods["sort_by"]]:
logger.warning("Collection Warning: mal_season sort_by attribute is blank using score as default")
elif method_data[dict_methods["sort_by"]] not in mal.userlist_sort:
logger.warning(f"Collection Warning: mal_season sort_by attribute {method_data[dict_methods['sort_by']]} invalid must be either 'score', 'last_updated', 'title' or 'start_date' using score as default")
elif dict_data[dict_methods["sort_by"]] not in mal.userlist_sort:
logger.warning(f"Collection Warning: mal_season sort_by attribute {dict_data[dict_methods['sort_by']]} invalid must be either 'score', 'last_updated', 'title' or 'start_date' using score as default")
else:
new_dictionary["sort_by"] = mal.userlist_sort[method_data[dict_methods["sort_by"]]]
new_dictionary["sort_by"] = mal.userlist_sort[dict_data[dict_methods["sort_by"]]]
new_dictionary["limit"] = get_int(method_name, "limit", method_data, dict_methods, 100, maximum=1000)
new_dictionary["limit"] = get_int(method_name, "limit", dict_data, dict_methods, 100, maximum=1000)
self.methods.append((method_name, [new_dictionary]))
elif "anilist" in method_name:
new_dictionary = {"sort_by": "score"}
dict_methods = {dm.lower(): dm for dm in method_data}
dict_methods = {dm.lower(): dm for dm in dict_data}
if method_name == "anilist_season":
if self.current_time.month in [12, 1, 2]: new_dictionary["season"] = "winter"
elif self.current_time.month in [3, 4, 5]: new_dictionary["season"] = "spring"
@ -918,43 +961,43 @@ class CollectionBuilder:
if "season" not in dict_methods:
logger.warning(f"Collection Warning: anilist_season season attribute not found using the current season: {new_dictionary['season']} as default")
elif not method_data[dict_methods["season"]]:
elif not dict_data[dict_methods["season"]]:
logger.warning(f"Collection Warning: anilist_season season attribute is blank using the current season: {new_dictionary['season']} as default")
elif method_data[dict_methods["season"]] not in util.pretty_seasons:
logger.warning(f"Collection Warning: anilist_season season attribute {method_data[dict_methods['season']]} invalid must be either 'winter', 'spring', 'summer' or 'fall' using the current season: {new_dictionary['season']} as default")
elif dict_data[dict_methods["season"]] not in util.pretty_seasons:
logger.warning(f"Collection Warning: anilist_season season attribute {dict_data[dict_methods['season']]} invalid must be either 'winter', 'spring', 'summer' or 'fall' using the current season: {new_dictionary['season']} as default")
else:
new_dictionary["season"] = method_data[dict_methods["season"]]
new_dictionary["season"] = dict_data[dict_methods["season"]]
new_dictionary["year"] = get_int(method_name, "year", method_data, dict_methods, self.current_time.year, minimum=1917, maximum=self.current_time.year + 1)
new_dictionary["year"] = get_int(method_name, "year", dict_data, dict_methods, self.current_time.year, minimum=1917, maximum=self.current_time.year + 1)
elif method_name == "anilist_genre":
if "genre" not in dict_methods:
raise Failed(f"Collection Warning: anilist_genre genre attribute not found")
elif not method_data[dict_methods["genre"]]:
elif not dict_data[dict_methods["genre"]]:
raise Failed(f"Collection Warning: anilist_genre genre attribute is blank")
else:
new_dictionary["genre"] = self.config.AniList.validate_genre(method_data[dict_methods["genre"]])
new_dictionary["genre"] = self.config.AniList.validate_genre(dict_data[dict_methods["genre"]])
elif method_name == "anilist_tag":
if "tag" not in dict_methods:
raise Failed(f"Collection Warning: anilist_tag tag attribute not found")
elif not method_data[dict_methods["tag"]]:
elif not dict_data[dict_methods["tag"]]:
raise Failed(f"Collection Warning: anilist_tag tag attribute is blank")
else:
new_dictionary["tag"] = self.config.AniList.validate_tag(method_data[dict_methods["tag"]])
new_dictionary["tag"] = self.config.AniList.validate_tag(dict_data[dict_methods["tag"]])
if "sort_by" not in dict_methods:
logger.warning(f"Collection Warning: {method_name} sort_by attribute not found using score as default")
elif not method_data[dict_methods["sort_by"]]:
elif not dict_data[dict_methods["sort_by"]]:
logger.warning(f"Collection Warning: {method_name} sort_by attribute is blank using score as default")
elif str(method_data[dict_methods["sort_by"]]).lower() not in ["score", "popular"]:
logger.warning(f"Collection Warning: {method_name} sort_by attribute {method_data[dict_methods['sort_by']]} invalid must be either 'score' or 'popular' using score as default")
elif str(dict_data[dict_methods["sort_by"]]).lower() not in ["score", "popular"]:
logger.warning(f"Collection Warning: {method_name} sort_by attribute {dict_data[dict_methods['sort_by']]} invalid must be either 'score' or 'popular' using score as default")
else:
new_dictionary["sort_by"] = method_data[dict_methods["sort_by"]]
new_dictionary["sort_by"] = dict_data[dict_methods["sort_by"]]
new_dictionary["limit"] = get_int(method_name, "limit", method_data, dict_methods, 0, maximum=500)
new_dictionary["limit"] = get_int(method_name, "limit", dict_data, dict_methods, 0, maximum=500)
self.methods.append((method_name, [new_dictionary]))
else:
raise Failed(f"Collection Error: {method_name} attribute is not a dictionary: {method_data}")
raise Failed(f"Collection Error: {method_name} attribute is not a dictionary: {dict_data}")
elif method_name in numbered_builders:
list_count = util.regex_first_int(method_data, "List Size", default=10)
if list_count < 1:
@ -1036,8 +1079,7 @@ class CollectionBuilder:
if self.build_collection:
try:
self.obj = self.library.get_collection(self.name)
collection_smart = self.library.smart(self.obj)
if (self.smart and not collection_smart) or (not self.smart and collection_smart):
if (self.smart and not self.obj.smart) or (not self.smart and self.obj.smart):
logger.info("")
logger.error(f"Collection Error: Converting {self.obj.title} to a {'smart' if self.smart else 'normal'} collection")
self.library.query(self.obj.delete)
@ -1093,6 +1135,7 @@ class CollectionBuilder:
elif "mal" in method: check_map(self.config.MyAnimeList.get_items(method, value))
elif "tvdb" in method: check_map(self.config.TVDb.get_items(method, value, self.library.Plex.language))
elif "imdb" in method: check_map(self.config.IMDb.get_items(method, value, self.library.Plex.language, self.library.is_movie))
elif "icheckmovies" in method: check_map(self.config.ICheckMovies.get_items(method, value, self.library.Plex.language))
elif "letterboxd" in method: check_map(self.config.Letterboxd.get_items(method, value, self.library.Plex.language))
elif "tmdb" in method: check_map(self.config.TMDb.get_items(method, value, self.library.is_movie))
elif "trakt" in method: check_map(self.config.Trakt.get_items(method, value, self.library.is_movie))
@ -1226,11 +1269,12 @@ class CollectionBuilder:
base_dict = {}
any_dicts = []
for alias_key, alias_value in filter_alias.items():
if alias_key in plex.and_searches:
_, _, final = self._split(alias_key)
if final in plex.and_searches:
base_dict[alias_value[:-4]] = plex_filter[alias_value]
elif alias_key in plex.or_searches:
elif final in plex.or_searches:
any_dicts.append({alias_value: plex_filter[alias_value]})
elif alias_key in plex.searches:
elif final in plex.searches:
base_dict[alias_value] = plex_filter[alias_value]
if len(any_dicts) > 0:
base_dict["any"] = any_dicts
@ -1258,12 +1302,34 @@ class CollectionBuilder:
def validate_attribute(self, attribute, modifier, final, data, validate, pairs=False):
def smart_pair(list_to_pair):
return [(t, t) for t in list_to_pair] if pairs else list_to_pair
if attribute in ["title", "studio", "episode_title", "audio_track_title"] and modifier in ["", ".not", ".begins", ".ends"]:
if modifier == ".regex":
regex_list = util.get_list(data, split=False)
valid_regex = []
for reg in regex_list:
try:
re.compile(reg)
valid_regex.append(reg)
except re.error:
util.print_stacktrace()
err = f"Collection Error: Regular Expression Invalid: {reg}"
if validate:
raise Failed(err)
else:
logger.error(err)
return valid_regex
elif attribute in ["title", "studio", "episode_title", "audio_track_title"] and modifier in ["", ".not", ".begins", ".ends"]:
return smart_pair(util.get_list(data, split=False))
elif attribute == "original_language":
return util.get_list(data, lower=True)
elif attribute == "filepath":
return util.get_list(data)
elif attribute == "history":
try:
return util.check_number(data, final, minimum=1, maximum=30)
except Failed:
if str(data).lower() in ["day", "month"]:
return data.lower()
raise Failed(f"Collection Error: history attribute invalid: {data} must be a number between 1-30, day, or month")
elif attribute in plex.tags and modifier in ["", ".not"]:
if attribute in plex.tmdb_attributes:
final_values = []
@ -1275,13 +1341,21 @@ class CollectionBuilder:
final_values.append(value)
else:
final_values = util.get_list(data)
try:
return self.library.validate_search_list(final_values, attribute, title=not pairs, pairs=pairs)
except Failed as e:
search_choices = self.library.get_search_choices(attribute, title=not pairs)
valid_list = []
for value in final_values:
if str(value).lower() in search_choices:
if pairs:
valid_list.append((value, search_choices[str(value).lower()]))
else:
valid_list.append(search_choices[str(value).lower()])
else:
error = f"Plex Error: {attribute}: {value} not found"
if validate:
raise
raise Failed(error)
else:
logger.error(e)
logger.error(error)
return valid_list
elif attribute in ["year", "episode_year"] and modifier in [".gt", ".gte", ".lt", ".lte"]:#
return util.check_year(data, self.current_year, final)
elif attribute in plex.date_attributes and modifier in [".before", ".after"]:#
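The inlined tag validation above replaces the old validate_search_list call: each value is matched case-insensitively against the library's search choices and, when pairs are requested, kept as an (original, resolved) tuple; misses either raise or are logged depending on validate. A toy sketch of that lookup, using a made-up search_choices dict in place of library.get_search_choices:

# Minimal sketch of the case-insensitive search-choice lookup; search_choices is hypothetical example data.
search_choices = {"science fiction": "Science Fiction", "sci-fi": "Science Fiction"}

def resolve(values, attribute="genre", pairs=False, validate=True):
    valid_list = []
    for value in values:
        key = str(value).lower()
        if key in search_choices:
            valid_list.append((value, search_choices[key]) if pairs else search_choices[key])
        else:
            error = f"Plex Error: {attribute}: {value} not found"
            if validate:
                raise ValueError(error)
            print(error)  # the real code logs the error and keeps going
    return valid_list

print(resolve(["Sci-Fi"], pairs=True))  # [('Sci-Fi', 'Science Fiction')]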
@ -1302,10 +1376,10 @@ class CollectionBuilder:
attribute = method_alias[attribute] if attribute in method_alias else attribute
modifier = modifier_alias[modifier] if modifier in modifier_alias else modifier
if attribute.lower() == "add_to_arr":
if attribute == "add_to_arr":
attribute = "radarr_add" if self.library.is_movie else "sonarr_add"
elif attribute.lower() in ["arr_tag", "arr_folder"]:
attribute = f"{'rad' if self.library.is_movie else 'son'}{attribute.lower()}"
elif attribute in ["arr_tag", "arr_folder"]:
attribute = f"{'rad' if self.library.is_movie else 'son'}{attribute}"
elif attribute in plex.date_attributes and modifier in [".gt", ".gte"]:
modifier = ".after"
elif attribute in plex.date_attributes and modifier in [".lt", ".lte"]:
@ -1334,8 +1408,9 @@ class CollectionBuilder:
except Failed as e:
logger.error(e)
continue
current_title = f"{current.title} ({current.year})" if current.year else current.title
if self.check_filters(current, f"{(' ' * (max_length - len(str(i))))}{i}/{total}"):
logger.info(util.adjust_space(f"{name} Collection | {'=' if current in collection_items else '+'} | {current.title}"))
logger.info(util.adjust_space(f"{name} Collection | {'=' if current in collection_items else '+'} | {current_title}"))
if current in collection_items:
self.plex_map[current.ratingKey] = None
elif self.smart_label_collection:
@ -1343,7 +1418,7 @@ class CollectionBuilder:
else:
self.library.query_data(current.addCollection, name)
elif self.details["show_filtered"] is True:
logger.info(f"{name} Collection | X | {current.title}")
logger.info(f"{name} Collection | X | {current_title}")
media_type = f"{'Movie' if self.library.is_movie else 'Show'}{'s' if total > 1 else ''}"
util.print_end()
logger.info("")
@ -1356,7 +1431,7 @@ class CollectionBuilder:
for filter_method, filter_data in self.filters:
filter_attr, modifier, filter_final = self._split(filter_method)
filter_actual = filter_translation[filter_attr] if filter_attr in filter_translation else filter_attr
if filter_attr in ["release", "added", "last_played"]:
if filter_attr in ["release", "added", "last_played"] and modifier != ".regex":
current_data = getattr(current, filter_actual)
if modifier in ["", ".not"]:
threshold_date = current_date - timedelta(days=filter_data)
@ -1366,6 +1441,17 @@ class CollectionBuilder:
elif (modifier == ".before" and (current_data is None or current_data >= filter_data)) \
or (modifier == ".after" and (current_data is None or current_data <= filter_data)):
return False
elif filter_attr in ["release", "added", "last_played"] and modifier == ".regex":
jailbreak = False
current_data = getattr(current, filter_actual)
if current_data is None:
return False
for check_data in filter_data:
if re.compile(check_data).match(current_data.strftime("%m/%d/%Y")):
jailbreak = True
break
if not jailbreak:
return False
elif filter_attr == "audio_track_title":
jailbreak = False
for media in current.media:
@ -1375,13 +1461,14 @@ class CollectionBuilder:
title = audio.title if audio.title else ""
if (modifier in ["", ".not"] and check_title.lower() in title.lower()) \
or (modifier == ".begins" and title.lower().startswith(check_title.lower())) \
or (modifier == ".ends" and title.lower().endswith(check_title.lower())):
or (modifier == ".ends" and title.lower().endswith(check_title.lower())) \
or (modifier == ".regex" and re.compile(check_title).match(title)):
jailbreak = True
break
if jailbreak: break
if jailbreak: break
if jailbreak: break
if (jailbreak and modifier == ".not") or (not jailbreak and modifier in ["", ".begins", ".ends"]):
if (jailbreak and modifier == ".not") or (not jailbreak and modifier in ["", ".begins", ".ends", ".regex"]):
return False
elif filter_attr == "filepath":
jailbreak = False
@ -1389,11 +1476,12 @@ class CollectionBuilder:
for check_text in filter_data:
if (modifier in ["", ".not"] and check_text.lower() in location.lower()) \
or (modifier == ".begins" and location.lower().startswith(check_text.lower())) \
or (modifier == ".ends" and location.lower().endswith(check_text.lower())):
or (modifier == ".ends" and location.lower().endswith(check_text.lower())) \
or (modifier == ".regex" and re.compile(check_text).match(location)):
jailbreak = True
break
if jailbreak: break
if (jailbreak and modifier == ".not") or (not jailbreak and modifier in ["", ".begins", ".ends"]):
if (jailbreak and modifier == ".not") or (not jailbreak and modifier in ["", ".begins", ".ends", ".regex"]):
return False
elif filter_attr in ["title", "studio"]:
jailbreak = False
@ -1401,10 +1489,29 @@ class CollectionBuilder:
for check_data in filter_data:
if (modifier in ["", ".not"] and check_data.lower() in current_data.lower()) \
or (modifier == ".begins" and current_data.lower().startswith(check_data.lower())) \
or (modifier == ".ends" and current_data.lower().endswith(check_data.lower())):
or (modifier == ".ends" and current_data.lower().endswith(check_data.lower())) \
or (modifier == ".regex" and re.compile(check_data).match(current_data)):
jailbreak = True
break
if (jailbreak and modifier == ".not") or (not jailbreak and modifier in ["", ".begins", ".ends"]):
if (jailbreak and modifier == ".not") or (not jailbreak and modifier in ["", ".begins", ".ends", ".regex"]):
return False
elif filter_attr == "history":
item_date = current.originallyAvailableAt
if item_date is None:
return False
elif filter_data == "day":
if item_date.month != current_date.month or item_date.day != current_date.day:
return False
elif filter_data == "month":
if item_date.month != current_date.month:
return False
else:
date_match = False
for i in range(filter_data):
check_date = current_date - timedelta(days=i)
if item_date.month == check_date.month and item_date.day == check_date.day:
date_match = True
if date_match is False:
return False
elif filter_attr == "original_language":
movie = None
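Two filter branches added above are easy to miss in the diff: the .regex modifier on date attributes (release, added, last_played) matches each pattern against the value formatted as %m/%d/%Y, and the new history filter keeps items whose original release anniversary is today (day), in the current month (month), or within the last 1 to 30 days. A standalone sketch of both checks, using datetime directly:

# Sketch of the history / date-regex matching logic from the hunks above.
import re
from datetime import datetime, timedelta

def history_match(item_date, filter_data, current_date=None):
    # item_date: originallyAvailableAt; filter_data: an int 1-30, "day", or "month"
    current_date = current_date or datetime.now()
    if item_date is None:
        return False
    if filter_data == "day":
        return item_date.month == current_date.month and item_date.day == current_date.day
    if filter_data == "month":
        return item_date.month == current_date.month
    for i in range(filter_data):
        check_date = current_date - timedelta(days=i)
        if item_date.month == check_date.month and item_date.day == check_date.day:
            return True
    return False

def date_regex_match(item_date, patterns):
    # .regex on release/added/last_played matches the date rendered as "%m/%d/%Y"
    return item_date is not None and any(re.compile(p).match(item_date.strftime("%m/%d/%Y")) for p in patterns)

print(history_match(datetime(1999, 3, 31), 7, current_date=datetime(2021, 4, 2)))  # True: anniversary was 2 days ago
print(date_regex_match(datetime(2021, 12, 25), [r"^12/25/"]))                      # True: released on December 25th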
@ -1491,12 +1598,13 @@ class CollectionBuilder:
or (filter_method == "tmdb_vote_count.lte" and movie.vote_count > filter_data):
match = False
break
current_title = f"{movie.title} ({util.check_date(movie.release_date, 'test', plex_date=True).year})" if movie.release_date else movie.title
if match:
missing_movies_with_names.append((movie.title, missing_id))
missing_movies_with_names.append((current_title, missing_id))
if self.details["show_missing"] is True:
logger.info(f"{self.name} Collection | ? | {movie.title} (TMDb: {missing_id})")
logger.info(f"{self.name} Collection | ? | {current_title} (TMDb: {missing_id})")
elif self.details["show_filtered"] is True:
logger.info(f"{self.name} Collection | X | {movie.title} (TMDb: {missing_id})")
logger.info(f"{self.name} Collection | X | {current_title} (TMDb: {missing_id})")
logger.info("")
logger.info(f"{len(missing_movies_with_names)} Movie{'s' if len(missing_movies_with_names) > 1 else ''} Missing")
if self.details["save_missing"] is True:
@ -1581,8 +1689,30 @@ class CollectionBuilder:
except Failed as e:
logger.error(e)
overlay = None
overlay_folder = None
rating_keys = []
if "item_overlay" in self.item_details:
overlay_name = self.item_details["item_overlay"]
if self.config.Cache:
rating_keys = self.config.Cache.query_image_map_overlay(self.library.original_mapping_name, "poster", overlay_name)
overlay_folder = os.path.join(self.config.default_dir, "overlays", overlay_name)
overlay_image = Image.open(os.path.join(overlay_folder, "overlay.png"))
temp_image = os.path.join(overlay_folder, f"temp.png")
overlay = (overlay_name, overlay_folder, overlay_image, temp_image)
tmdb_ids = []
tvdb_ids = []
for item in items:
if int(item.ratingKey) in rating_keys:
rating_keys.remove(int(item.ratingKey))
if self.details["item_assets"] or overlay is not None:
self.library.update_item_from_assets(item, overlay=overlay)
self.library.edit_tags("label", item, add_tags=add_tags, remove_tags=remove_tags, sync_tags=sync_tags)
if "item_radarr_tag" in self.item_details and item.ratingKey in self.library.movie_rating_key_map:
tmdb_ids.append(self.library.movie_rating_key_map[item.ratingKey])
if "item_sonarr_tag" in self.item_details and item.ratingKey in self.library.show_rating_key_map:
tvdb_ids.append(self.library.show_rating_key_map[item.ratingKey])
advance_edits = {}
for method_name, method_data in self.item_details.items():
if method_name in plex.item_advance_keys:
@ -1591,6 +1721,24 @@ class CollectionBuilder:
advance_edits[key] = options[method_data]
self.library.edit_item(item, item.title, "Movie" if self.library.is_movie else "Show", advance_edits, advanced=True)
if len(tmdb_ids) > 0:
self.library.Radarr.edit_tags(tmdb_ids, self.item_details["item_radarr_tag"], self.item_details["apply_tags"])
if len(tvdb_ids) > 0:
self.library.Sonarr.edit_tags(tvdb_ids, self.item_details["item_sonarr_tag"], self.item_details["apply_tags"])
for rating_key in rating_keys:
try:
item = self.fetch_item(rating_key)
except Failed as e:
logger.error(e)
continue
og_image = os.path.join(overlay_folder, f"{rating_key}.png")
if os.path.exists(og_image):
self.library._upload_file_poster(item, og_image)
os.remove(og_image)
self.config.Cache.update_image_map(item.ratingKey, self.library.original_mapping_name, "poster", "", "", "")
def update_details(self):
if not self.obj and self.smart_url:
self.library.create_smart_collection(self.name, self.smart_type_key, self.smart_url)
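item_overlay expects config/overlays/<name>/overlay.png, caches each item's original poster next to it as <ratingKey>.png so it can be restored later, and stages the composited result at temp.png before upload; the compositing itself happens inside library.update_item_from_assets, which is not part of this diff. A rough PIL sketch of how an alpha overlay could be stamped onto a poster under those assumptions:

# Hedged sketch only: composite an RGBA overlay onto a poster; the overlay is assumed to cover the full poster.
import os
from PIL import Image

def apply_overlay(poster_path, overlay_folder, rating_key):
    overlay = Image.open(os.path.join(overlay_folder, "overlay.png")).convert("RGBA")
    poster = Image.open(poster_path).convert("RGBA")
    # keep an untouched copy so the original poster can be re-uploaded when the item leaves the collection
    poster.save(os.path.join(overlay_folder, f"{rating_key}.png"))
    overlay = overlay.resize(poster.size, Image.LANCZOS)
    composited = Image.alpha_composite(poster, overlay)
    temp_path = os.path.join(overlay_folder, "temp.png")
    composited.save(temp_path)
    return temp_path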
@ -1621,6 +1769,8 @@ class CollectionBuilder:
elif "tmdb_collection_details" in self.summaries: summary = get_summary("tmdb_collection_details", self.summaries)
elif "trakt_list_details" in self.summaries: summary = get_summary("trakt_list_details", self.summaries)
elif "tmdb_list_details" in self.summaries: summary = get_summary("tmdb_list_details", self.summaries)
elif "letterboxd_list_details" in self.summaries: summary = get_summary("letterboxd_list_details", self.summaries)
elif "icheckmovies_list_details" in self.summaries: summary = get_summary("icheckmovies_list_details", self.summaries)
elif "tmdb_actor_details" in self.summaries: summary = get_summary("tmdb_actor_details", self.summaries)
elif "tmdb_crew_details" in self.summaries: summary = get_summary("tmdb_crew_details", self.summaries)
elif "tmdb_director_details" in self.summaries: summary = get_summary("tmdb_director_details", self.summaries)
@ -1660,6 +1810,25 @@ class CollectionBuilder:
self.library.collection_order_query(self.obj, self.details["collection_order"])
logger.info(f"Detail: collection_order updated Collection Order to {self.details['collection_order']}")
if "visible_library" in self.details or "visible_home" in self.details or "visible_shared" in self.details:
visibility = self.library.collection_visibility(self.obj)
visible_library = None
visible_home = None
visible_shared = None
if "visible_library" in self.details and self.details["visible_library"] != visibility["library"]:
visible_library = self.details["visible_library"]
if "visible_home" in self.details and self.details["visible_home"] != visibility["library"]:
visible_home = self.details["visible_home"]
if "visible_shared" in self.details and self.details["visible_shared"] != visibility["library"]:
visible_shared = self.details["visible_shared"]
if visible_library is not None or visible_home is not None or visible_shared is not None:
self.library.collection_visibility_update(self.obj, visibility=visibility, library=visible_library, home=visible_home, shared=visible_shared)
logger.info("Detail: Collection visibility updated")
add_tags = self.details["label"] if "label" in self.details else None
remove_tags = self.details["label.remove"] if "label.remove" in self.details else None
sync_tags = self.details["label.sync"] if "label.sync" in self.details else None
@ -1675,60 +1844,59 @@ class CollectionBuilder:
if "name_mapping" in self.details:
if self.details["name_mapping"]: name_mapping = self.details["name_mapping"]
else: logger.error("Collection Error: name_mapping attribute is blank")
poster_image, background_image = self.library.update_item_from_assets(self.obj, collection_mode=True, upload=False, name=name_mapping)
poster_image, background_image = self.library.find_collection_assets(self.obj, name=name_mapping)
if poster_image:
self.posters["asset_directory"] = poster_image
if background_image:
self.backgrounds["asset_directory"] = background_image
def set_image(image_method, images, is_background=False):
message = f"{'background' if is_background else 'poster'} to [{'File' if image_method in image_file_details else 'URL'}] {images[image_method]}"
try:
self.library.upload_image(self.obj, images[image_method], poster=not is_background, url=image_method not in image_file_details)
logger.info(f"Detail: {image_method} updated collection {message}")
except BadRequest:
logger.error(f"Detail: {image_method} failed to update {message}")
if len(self.posters) > 1:
logger.info(f"{len(self.posters)} posters found:")
poster = None
if len(self.posters) > 0:
logger.debug(f"{len(self.posters)} posters found:")
for p in self.posters:
logger.info(f"Method: {p} Poster: {self.posters[p]}")
if "url_poster" in self.posters: set_image("url_poster", self.posters)
elif "file_poster" in self.posters: set_image("file_poster", self.posters)
elif "tmdb_poster" in self.posters: set_image("tmdb_poster", self.posters)
elif "tmdb_profile" in self.posters: set_image("tmdb_profile", self.posters)
elif "tvdb_poster" in self.posters: set_image("tvdb_poster", self.posters)
elif "asset_directory" in self.posters: set_image("asset_directory", self.posters)
elif "tmdb_person" in self.posters: set_image("tmdb_person", self.posters)
elif "tmdb_collection_details" in self.posters: set_image("tmdb_collection_details", self.posters)
elif "tmdb_actor_details" in self.posters: set_image("tmdb_actor_details", self.posters)
elif "tmdb_crew_details" in self.posters: set_image("tmdb_crew_details", self.posters)
elif "tmdb_director_details" in self.posters: set_image("tmdb_director_details", self.posters)
elif "tmdb_producer_details" in self.posters: set_image("tmdb_producer_details", self.posters)
elif "tmdb_writer_details" in self.posters: set_image("tmdb_writer_details", self.posters)
elif "tmdb_movie_details" in self.posters: set_image("tmdb_movie_details", self.posters)
elif "tvdb_movie_details" in self.posters: set_image("tvdb_movie_details", self.posters)
elif "tvdb_show_details" in self.posters: set_image("tvdb_show_details", self.posters)
elif "tmdb_show_details" in self.posters: set_image("tmdb_show_details", self.posters)
else: logger.info("No poster to update")
if len(self.backgrounds) > 1:
logger.info(f"{len(self.backgrounds)} backgrounds found:")
logger.debug(f"Method: {p} Poster: {self.posters[p]}")
if "url_poster" in self.posters: poster = ImageData("url_poster", self.posters["url_poster"])
elif "file_poster" in self.posters: poster = ImageData("file_poster", self.posters["file_poster"], is_url=False)
elif "tmdb_poster" in self.posters: poster = ImageData("tmdb_poster", self.posters["tmdb_poster"])
elif "tmdb_profile" in self.posters: poster = ImageData("tmdb_poster", self.posters["tmdb_profile"])
elif "tvdb_poster" in self.posters: poster = ImageData("tvdb_poster", self.posters["tvdb_poster"])
elif "asset_directory" in self.posters: poster = self.posters["asset_directory"]
elif "tmdb_person" in self.posters: poster = ImageData("tmdb_person", self.posters["tmdb_person"])
elif "tmdb_collection_details" in self.posters: poster = ImageData("tmdb_collection_details", self.posters["tmdb_collection_details"])
elif "tmdb_actor_details" in self.posters: poster = ImageData("tmdb_actor_details", self.posters["tmdb_actor_details"])
elif "tmdb_crew_details" in self.posters: poster = ImageData("tmdb_crew_details", self.posters["tmdb_crew_details"])
elif "tmdb_director_details" in self.posters: poster = ImageData("tmdb_director_details", self.posters["tmdb_director_details"])
elif "tmdb_producer_details" in self.posters: poster = ImageData("tmdb_producer_details", self.posters["tmdb_producer_details"])
elif "tmdb_writer_details" in self.posters: poster = ImageData("tmdb_writer_details", self.posters["tmdb_writer_details"])
elif "tmdb_movie_details" in self.posters: poster = ImageData("tmdb_movie_details", self.posters["tmdb_movie_details"])
elif "tvdb_movie_details" in self.posters: poster = ImageData("tvdb_movie_details", self.posters["tvdb_movie_details"])
elif "tvdb_show_details" in self.posters: poster = ImageData("tvdb_show_details", self.posters["tvdb_show_details"])
elif "tmdb_show_details" in self.posters: poster = ImageData("tmdb_show_details", self.posters["tmdb_show_details"])
else:
logger.info("No poster collection detail or asset folder found")
background = None
if len(self.backgrounds) > 0:
logger.debug(f"{len(self.backgrounds)} backgrounds found:")
for b in self.backgrounds:
logger.info(f"Method: {b} Background: {self.backgrounds[b]}")
if "url_background" in self.backgrounds: set_image("url_background", self.backgrounds, is_background=True)
elif "file_background" in self.backgrounds: set_image("file_background", self.backgrounds, is_background=True)
elif "tmdb_background" in self.backgrounds: set_image("tmdb_background", self.backgrounds, is_background=True)
elif "tvdb_background" in self.backgrounds: set_image("tvdb_background", self.backgrounds, is_background=True)
elif "asset_directory" in self.backgrounds: set_image("asset_directory", self.backgrounds, is_background=True)
elif "tmdb_collection_details" in self.backgrounds: set_image("tmdb_collection_details", self.backgrounds, is_background=True)
elif "tmdb_movie_details" in self.backgrounds: set_image("tmdb_movie_details", self.backgrounds, is_background=True)
elif "tvdb_movie_details" in self.backgrounds: set_image("tvdb_movie_details", self.backgrounds, is_background=True)
elif "tvdb_show_details" in self.backgrounds: set_image("tvdb_show_details", self.backgrounds, is_background=True)
elif "tmdb_show_details" in self.backgrounds: set_image("tmdb_show_details", self.backgrounds, is_background=True)
else: logger.info("No background to update")
logger.debug(f"Method: {b} Background: {self.backgrounds[b]}")
if "url_background" in self.backgrounds: background = ImageData("url_background", self.backgrounds["url_background"], is_poster=False)
elif "file_background" in self.backgrounds: background = ImageData("file_background", self.backgrounds["file_background"], is_poster=False, is_url=False)
elif "tmdb_background" in self.backgrounds: background = ImageData("tmdb_background", self.backgrounds["tmdb_background"], is_poster=False)
elif "tvdb_background" in self.backgrounds: background = ImageData("tvdb_background", self.backgrounds["tvdb_background"], is_poster=False)
elif "asset_directory" in self.backgrounds: background = self.backgrounds["asset_directory"]
elif "tmdb_collection_details" in self.backgrounds: background = ImageData("tmdb_collection_details", self.backgrounds["tmdb_collection_details"], is_poster=False)
elif "tmdb_movie_details" in self.backgrounds: background = ImageData("tmdb_movie_details", self.backgrounds["tmdb_movie_details"], is_poster=False)
elif "tvdb_movie_details" in self.backgrounds: background = ImageData("tvdb_movie_details", self.backgrounds["tvdb_movie_details"], is_poster=False)
elif "tvdb_show_details" in self.backgrounds: background = ImageData("tvdb_show_details", self.backgrounds["tvdb_show_details"], is_poster=False)
elif "tmdb_show_details" in self.backgrounds: background = ImageData("tmdb_show_details", self.backgrounds["tmdb_show_details"], is_poster=False)
else:
logger.info("No background collection detail or asset folder found")
if poster or background:
self.library.upload_images(self.obj, poster=poster, background=background)
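For reference, the hand-off above is a fixed priority list: the first poster source present in self.posters wins and everything after it is ignored. A minimal sketch of that rule as a data-driven loop (editor's illustration, not part of the patch; ImageData is the helper from modules.util used throughout this change):

from modules.util import ImageData

# Editor's sketch of the same precedence as a loop; the real code keeps the
# explicit elif chain above. "asset_directory" already holds a finished
# ImageData, and "tmdb_profile" is relabelled "tmdb_poster" when wrapped.
POSTER_PRIORITY = [
    "url_poster", "file_poster", "tmdb_poster", "tmdb_profile", "tvdb_poster",
    "asset_directory", "tmdb_person", "tmdb_collection_details", "tmdb_actor_details",
    "tmdb_crew_details", "tmdb_director_details", "tmdb_producer_details",
    "tmdb_writer_details", "tmdb_movie_details", "tvdb_movie_details",
    "tvdb_show_details", "tmdb_show_details",
]

def pick_poster(posters):
    for method in POSTER_PRIORITY:
        if method in posters:
            if method == "asset_directory":
                return posters[method]
            name = "tmdb_poster" if method == "tmdb_profile" else method
            return ImageData(name, posters[method], is_url=method != "file_poster")
    return None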
def run_collections_again(self):
self.obj = self.library.get_collection(self.name)
@ -1748,13 +1916,12 @@ class CollectionBuilder:
except (BadRequest, NotFound):
logger.error(f"Plex Error: Item {rating_key} not found")
continue
current_title = f"{current.title} ({current.year})" if current.year else current.title
if current in collection_items:
logger.info(f"{name} Collection | = | {current.title}")
elif self.smart_label_collection:
self.library.query_data(current.addLabel, name)
logger.info(f"{name} Collection | = | {current_title}")
else:
self.library.query_data(current.addCollection, name)
logger.info(f"{name} Collection | + | {current.title}")
self.library.query_data(current.addLabel if self.smart_label_collection else current.addCollection, name)
logger.info(f"{name} Collection | + | {current_title}")
logger.info(f"{len(rating_keys)} {'Movie' if self.library.is_movie else 'Show'}{'s' if len(rating_keys) > 1 else ''} Processed")
if len(self.run_again_movies) > 0:
@ -1767,7 +1934,8 @@ class CollectionBuilder:
logger.error(e)
continue
if self.details["show_missing"] is True:
logger.info(f"{name} Collection | ? | {movie.title} (TMDb: {missing_id})")
current_title = f"{movie.title} ({util.check_date(movie.release_date, 'test', plex_date=True).year})" if movie.release_date else movie.title
logger.info(f"{name} Collection | ? | {current_title} (TMDb: {missing_id})")
logger.info("")
logger.info(f"{len(self.run_again_movies)} Movie{'s' if len(self.run_again_movies) > 1 else ''} Missing")

@ -79,6 +79,16 @@ class Cache:
kitsu TEXT,
expiration_date TEXT)"""
)
cursor.execute(
"""CREATE TABLE IF NOT EXISTS image_map (
INTEGER PRIMARY KEY,
rating_key TEXT,
library TEXT,
type TEXT,
overlay TEXT,
compare TEXT,
location TEXT)"""
)
self.expiration = expiration
self.cache_path = cache
@ -145,7 +155,7 @@ class Cache:
if row and row[to_id]:
datetime_object = datetime.strptime(row["expiration_date"], "%Y-%m-%d")
time_between_insertion = datetime.now() - datetime_object
id_to_return = int(row[to_id])
id_to_return = row[to_id] if to_id == "imdb_id" else int(row[to_id])
expired = time_between_insertion.days > self.expiration
return id_to_return, expired
@ -180,6 +190,7 @@ class Cache:
omdb_dict["imdbVotes"] = row["imdb_votes"] if row["imdb_votes"] else None
omdb_dict["Metascore"] = row["metacritic_rating"] if row["metacritic_rating"] else None
omdb_dict["Type"] = row["type"] if row["type"] else None
omdb_dict["Response"] = "True"
datetime_object = datetime.strptime(row["expiration_date"], "%Y-%m-%d")
time_between_insertion = datetime.now() - datetime_object
expired = time_between_insertion.days > self.expiration
@ -221,3 +232,31 @@ class Cache:
with closing(connection.cursor()) as cursor:
cursor.execute("INSERT OR IGNORE INTO anime_map(anidb) VALUES(?)", (anime_ids["anidb"],))
cursor.execute("UPDATE anime_map SET anilist = ?, myanimelist = ?, kitsu = ?, expiration_date = ? WHERE anidb = ?", (anime_ids["anidb"], anime_ids["myanimelist"], anime_ids["kitsu"], expiration_date.strftime("%Y-%m-%d"), anime_ids["anidb"]))
def query_image_map_overlay(self, library, image_type, overlay):
rks = []
with sqlite3.connect(self.cache_path) as connection:
connection.row_factory = sqlite3.Row
with closing(connection.cursor()) as cursor:
cursor.execute(f"SELECT * FROM image_map WHERE overlay = ? AND library = ? AND type = ?", (overlay, library, image_type))
rows = cursor.fetchall()
for row in rows:
rks.append(int(row["rating_key"]))
return rks
def query_image_map(self, rating_key, library, image_type):
with sqlite3.connect(self.cache_path) as connection:
connection.row_factory = sqlite3.Row
with closing(connection.cursor()) as cursor:
cursor.execute(f"SELECT * FROM image_map WHERE rating_key = ? AND library = ? AND type = ?", (rating_key, library, image_type))
row = cursor.fetchone()
if row and row["location"]:
return row["location"], row["compare"], row["overlay"]
return None, None, None
def update_image_map(self, rating_key, library, image_type, location, compare, overlay):
with sqlite3.connect(self.cache_path) as connection:
connection.row_factory = sqlite3.Row
with closing(connection.cursor()) as cursor:
cursor.execute("INSERT OR IGNORE INTO image_map(rating_key, library, type) VALUES(?, ?, ?)", (rating_key, library, image_type))
cursor.execute("UPDATE image_map SET location = ?, compare = ?, overlay = ? WHERE rating_key = ? AND library = ? AND type = ?", (location, compare, overlay, rating_key, library, image_type))
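Taken together, query_image_map_overlay, query_image_map, and update_image_map give upload_images a way to skip artwork that has not changed since the last run. A standalone illustration of the round trip using only the sqlite3 standard library (the rating key, library name, and artwork values are placeholders):

import sqlite3
from contextlib import closing

with sqlite3.connect(":memory:") as connection:
    connection.row_factory = sqlite3.Row
    with closing(connection.cursor()) as cursor:
        cursor.execute(
            """CREATE TABLE IF NOT EXISTS image_map (
            key INTEGER PRIMARY KEY,
            rating_key TEXT,
            library TEXT,
            type TEXT,
            overlay TEXT,
            compare TEXT,
            location TEXT)"""
        )
        # update_image_map: insert the row once, then refresh its details
        cursor.execute("INSERT OR IGNORE INTO image_map(rating_key, library, type) VALUES(?, ?, ?)",
                       ("12345", "Movies", "poster"))
        cursor.execute("UPDATE image_map SET location = ?, compare = ?, overlay = ? WHERE rating_key = ? AND library = ? AND type = ?",
                       ("/library/metadata/12345/thumb/1", "poster.png", "4K", "12345", "Movies", "poster"))
        # query_image_map: read back location/compare/overlay for one item
        cursor.execute("SELECT * FROM image_map WHERE rating_key = ? AND library = ? AND type = ?",
                       ("12345", "Movies", "poster"))
        row = cursor.fetchone()
        print(row["location"], row["compare"], row["overlay"])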

@ -1,35 +1,34 @@
import logging, os
from datetime import datetime
from modules import util
from modules.anidb import AniDBAPI
from modules.anilist import AniListAPI
from modules.anidb import AniDB
from modules.anilist import AniList
from modules.cache import Cache
from modules.convert import Convert
from modules.imdb import IMDbAPI
from modules.letterboxd import LetterboxdAPI
from modules.mal import MyAnimeListAPI
from modules.omdb import OMDbAPI
from modules.plex import PlexAPI
from modules.radarr import RadarrAPI
from modules.sonarr import SonarrAPI
from modules.tautulli import TautulliAPI
from modules.tmdb import TMDbAPI
from modules.trakttv import TraktAPI
from modules.tvdb import TVDbAPI
from modules.icheckmovies import ICheckMovies
from modules.imdb import IMDb
from modules.letterboxd import Letterboxd
from modules.mal import MyAnimeList
from modules.omdb import OMDb
from modules.plex import Plex
from modules.radarr import Radarr
from modules.sonarr import Sonarr
from modules.tautulli import Tautulli
from modules.tmdb import TMDb
from modules.trakttv import Trakt
from modules.tvdb import TVDb
from modules.util import Failed
from ruamel import yaml
logger = logging.getLogger("Plex Meta Manager")
sync_modes = {"append": "Only Add Items to the Collection", "sync": "Add & Remove Items from the Collection"}
radarr_versions = {"v2": "For Radarr 0.2", "v3": "For Radarr 3.0"}
radarr_availabilities = {
"announced": "For Announced",
"cinemas": "For In Cinemas",
"released": "For Released",
"db": "For PreDB"
}
sonarr_versions = {"v2": "For Sonarr 0.2", "v3": "For Sonarr 3.0"}
sonarr_monitors = {
"all": "Monitor all episodes except specials",
"future": "Monitor episodes that have not aired yet",
@ -57,6 +56,7 @@ class Config:
else: raise Failed(f"Config Error: config not found at {os.path.abspath(default_dir)}")
logger.info(f"Using {self.config_path} as config")
self.default_dir = default_dir
self.test_mode = is_test
self.run_start_time = time_scheduled
self.run_hour = datetime.strptime(time_scheduled, "%H:%M").hour
@ -214,7 +214,7 @@ class Config:
try: self.tmdb["apikey"] = check_for_attribute(self.data, "apikey", parent="tmdb", throw=True)
except Failed as e: raise Failed(e)
self.tmdb["language"] = check_for_attribute(self.data, "language", parent="tmdb", default="en")
self.TMDb = TMDbAPI(self, self.tmdb)
self.TMDb = TMDb(self, self.tmdb)
logger.info(f"TMDb Connection {'Failed' if self.TMDb is None else 'Successful'}")
else:
raise Failed("Config Error: tmdb attribute not found")
@ -227,7 +227,7 @@ class Config:
self.omdb = {}
try:
self.omdb["apikey"] = check_for_attribute(self.data, "apikey", parent="omdb", throw=True)
self.OMDb = OMDbAPI(self.omdb, Cache=self.Cache)
self.OMDb = OMDb(self.omdb, Cache=self.Cache)
except Failed as e:
logger.error(e)
logger.info(f"OMDb Connection {'Failed' if self.OMDb is None else 'Successful'}")
@ -245,7 +245,7 @@ class Config:
self.trakt["client_secret"] = check_for_attribute(self.data, "client_secret", parent="trakt", throw=True)
self.trakt["config_path"] = self.config_path
authorization = self.data["trakt"]["authorization"] if "authorization" in self.data["trakt"] and self.data["trakt"]["authorization"] else None
self.Trakt = TraktAPI(self.trakt, authorization)
self.Trakt = Trakt(self.trakt, authorization)
except Failed as e:
logger.error(e)
logger.info(f"Trakt Connection {'Failed' if self.Trakt is None else 'Successful'}")
@ -263,19 +263,20 @@ class Config:
self.mal["client_secret"] = check_for_attribute(self.data, "client_secret", parent="mal", throw=True)
self.mal["config_path"] = self.config_path
authorization = self.data["mal"]["authorization"] if "authorization" in self.data["mal"] and self.data["mal"]["authorization"] else None
self.MyAnimeList = MyAnimeListAPI(self.mal, self, authorization)
self.MyAnimeList = MyAnimeList(self.mal, self, authorization)
except Failed as e:
logger.error(e)
logger.info(f"My Anime List Connection {'Failed' if self.MyAnimeList is None else 'Successful'}")
else:
logger.warning("mal attribute not found")
self.TVDb = TVDbAPI(self)
self.IMDb = IMDbAPI(self)
self.AniDB = AniDBAPI(self)
self.TVDb = TVDb(self)
self.IMDb = IMDb(self)
self.AniDB = AniDB(self)
self.Convert = Convert(self)
self.AniList = AniListAPI(self)
self.Letterboxd = LetterboxdAPI(self)
self.AniList = AniList(self)
self.Letterboxd = Letterboxd(self)
self.ICheckMovies = ICheckMovies(self)
util.separator()
@ -292,7 +293,6 @@ class Config:
self.general["radarr"] = {}
self.general["radarr"]["url"] = check_for_attribute(self.data, "url", parent="radarr", var_type="url", default_is_none=True)
self.general["radarr"]["token"] = check_for_attribute(self.data, "token", parent="radarr", default_is_none=True)
self.general["radarr"]["version"] = check_for_attribute(self.data, "version", parent="radarr", test_list=radarr_versions, default="v3")
self.general["radarr"]["add"] = check_for_attribute(self.data, "add", parent="radarr", var_type="bool", default=False)
self.general["radarr"]["root_folder_path"] = check_for_attribute(self.data, "root_folder_path", parent="radarr", default_is_none=True)
self.general["radarr"]["monitor"] = check_for_attribute(self.data, "monitor", parent="radarr", var_type="bool", default=True)
@ -304,7 +304,6 @@ class Config:
self.general["sonarr"] = {}
self.general["sonarr"]["url"] = check_for_attribute(self.data, "url", parent="sonarr", var_type="url", default_is_none=True)
self.general["sonarr"]["token"] = check_for_attribute(self.data, "token", parent="sonarr", default_is_none=True)
self.general["sonarr"]["version"] = check_for_attribute(self.data, "version", parent="sonarr", test_list=sonarr_versions, default="v3")
self.general["sonarr"]["add"] = check_for_attribute(self.data, "add", parent="sonarr", var_type="bool", default=False)
self.general["sonarr"]["root_folder_path"] = check_for_attribute(self.data, "root_folder_path", parent="sonarr", default_is_none=True)
self.general["sonarr"]["monitor"] = check_for_attribute(self.data, "monitor", parent="sonarr", test_list=sonarr_monitors, default="all")
@ -321,8 +320,7 @@ class Config:
self.general["tautulli"]["apikey"] = check_for_attribute(self.data, "apikey", parent="tautulli", default_is_none=True)
self.libraries = []
try: libs = check_for_attribute(self.data, "libraries", throw=True)
except Failed as e: raise Failed(e)
libs = check_for_attribute(self.data, "libraries", throw=True)
for library_name, lib in libs.items():
if self.requested_libraries and library_name not in self.requested_libraries:
@ -404,6 +402,11 @@ class Config:
else:
params["mass_critic_rating_update"] = None
if lib and "split_duplicates" in lib and lib["split_duplicates"]:
params["split_duplicates"] = check_for_attribute(lib, "split_duplicates", var_type="bool", default=False, save=False)
else:
params["split_duplicates"] = None
if lib and "radarr_add_all" in lib and lib["radarr_add_all"]:
params["radarr_add_all"] = check_for_attribute(lib, "radarr_add_all", var_type="bool", default=False, save=False)
else:
@ -449,10 +452,11 @@ class Config:
params["plex"]["clean_bundles"] = check_for_attribute(lib, "clean_bundles", parent="plex", var_type="bool", default=self.general["plex"]["clean_bundles"], save=False)
params["plex"]["empty_trash"] = check_for_attribute(lib, "empty_trash", parent="plex", var_type="bool", default=self.general["plex"]["empty_trash"], save=False)
params["plex"]["optimize"] = check_for_attribute(lib, "optimize", parent="plex", var_type="bool", default=self.general["plex"]["optimize"], save=False)
library = PlexAPI(params)
library = Plex(self, params)
logger.info("")
logger.info(f"{display_name} Library Connection Successful")
except Failed as e:
util.print_stacktrace()
util.print_multiline(e, error=True)
logger.info(f"{display_name} Library Connection Failed")
continue
@ -467,7 +471,6 @@ class Config:
try:
radarr_params["url"] = check_for_attribute(lib, "url", parent="radarr", var_type="url", default=self.general["radarr"]["url"], req_default=True, save=False)
radarr_params["token"] = check_for_attribute(lib, "token", parent="radarr", default=self.general["radarr"]["token"], req_default=True, save=False)
radarr_params["version"] = check_for_attribute(lib, "version", parent="radarr", test_list=radarr_versions, default=self.general["radarr"]["version"], save=False)
radarr_params["add"] = check_for_attribute(lib, "add", parent="radarr", var_type="bool", default=self.general["radarr"]["add"], save=False)
radarr_params["root_folder_path"] = check_for_attribute(lib, "root_folder_path", parent="radarr", default=self.general["radarr"]["root_folder_path"], req_default=True, save=False)
radarr_params["monitor"] = check_for_attribute(lib, "monitor", parent="radarr", var_type="bool", default=self.general["radarr"]["monitor"], save=False)
@ -475,8 +478,9 @@ class Config:
radarr_params["quality_profile"] = check_for_attribute(lib, "quality_profile", parent="radarr", default=self.general["radarr"]["quality_profile"], req_default=True, save=False)
radarr_params["tag"] = check_for_attribute(lib, "search", parent="radarr", var_type="lower_list", default=self.general["radarr"]["tag"], default_is_none=True, save=False)
radarr_params["search"] = check_for_attribute(lib, "search", parent="radarr", var_type="bool", default=self.general["radarr"]["search"], save=False)
library.Radarr = RadarrAPI(radarr_params)
library.Radarr = Radarr(radarr_params)
except Failed as e:
util.print_stacktrace()
util.print_multiline(e, error=True)
logger.info("")
logger.info(f"{display_name} library's Radarr Connection {'Failed' if library.Radarr is None else 'Successful'}")
@ -491,7 +495,6 @@ class Config:
try:
sonarr_params["url"] = check_for_attribute(lib, "url", parent="sonarr", var_type="url", default=self.general["sonarr"]["url"], req_default=True, save=False)
sonarr_params["token"] = check_for_attribute(lib, "token", parent="sonarr", default=self.general["sonarr"]["token"], req_default=True, save=False)
sonarr_params["version"] = check_for_attribute(lib, "version", parent="sonarr", test_list=sonarr_versions, default=self.general["sonarr"]["version"], save=False)
sonarr_params["add"] = check_for_attribute(lib, "add", parent="sonarr", var_type="bool", default=self.general["sonarr"]["add"], save=False)
sonarr_params["root_folder_path"] = check_for_attribute(lib, "root_folder_path", parent="sonarr", default=self.general["sonarr"]["root_folder_path"], req_default=True, save=False)
sonarr_params["monitor"] = check_for_attribute(lib, "monitor", parent="sonarr", test_list=sonarr_monitors, default=self.general["sonarr"]["monitor"], save=False)
@ -505,8 +508,9 @@ class Config:
sonarr_params["tag"] = check_for_attribute(lib, "search", parent="sonarr", var_type="lower_list", default=self.general["sonarr"]["tag"], default_is_none=True, save=False)
sonarr_params["search"] = check_for_attribute(lib, "search", parent="sonarr", var_type="bool", default=self.general["sonarr"]["search"], save=False)
sonarr_params["cutoff_search"] = check_for_attribute(lib, "cutoff_search", parent="sonarr", var_type="bool", default=self.general["sonarr"]["cutoff_search"], save=False)
library.Sonarr = SonarrAPI(sonarr_params, library.Plex.language)
library.Sonarr = Sonarr(sonarr_params)
except Failed as e:
util.print_stacktrace()
util.print_multiline(e, error=True)
logger.info("")
logger.info(f"{display_name} library's Sonarr Connection {'Failed' if library.Sonarr is None else 'Successful'}")
@ -521,8 +525,9 @@ class Config:
try:
tautulli_params["url"] = check_for_attribute(lib, "url", parent="tautulli", var_type="url", default=self.general["tautulli"]["url"], req_default=True, save=False)
tautulli_params["apikey"] = check_for_attribute(lib, "apikey", parent="tautulli", default=self.general["tautulli"]["apikey"], req_default=True, save=False)
library.Tautulli = TautulliAPI(tautulli_params)
library.Tautulli = Tautulli(tautulli_params)
except Failed as e:
util.print_stacktrace()
util.print_multiline(e, error=True)
logger.info("")
logger.info(f"{display_name} library's Tautulli Connection {'Failed' if library.Tautulli is None else 'Successful'}")

@ -53,21 +53,20 @@ class Convert:
unconverted_id_sets = []
for anime_dict in all_ids:
if self.config.Cache:
for id_type, anime_id in anime_dict.items():
query_ids = None
expired = None
if self.config.Cache:
query_ids, expired = self.config.Cache.query_anime_map(anime_id, id_type)
if query_ids and not expired:
converted_ids.append(query_ids)
else:
unconverted_ids.append({id_type: anime_id})
if len(unconverted_ids) == 100:
unconverted_id_sets.append(unconverted_ids)
unconverted_ids = []
else:
if query_ids is None or expired:
unconverted_ids.append(anime_dict)
if len(unconverted_ids) == 100:
unconverted_id_sets.append(unconverted_ids)
unconverted_ids = []
if len(unconverted_ids) > 0:
unconverted_id_sets.append(unconverted_ids)
for unconverted_id_set in unconverted_id_sets:
for anime_ids in self._request(unconverted_id_set):
if anime_ids:

@ -0,0 +1,55 @@
import logging, requests
from lxml import html
from modules import util
from modules.util import Failed
from retrying import retry
logger = logging.getLogger("Plex Meta Manager")
builders = ["icheckmovies_list", "icheckmovies_list_details"]
class ICheckMovies:
def __init__(self, config):
self.config = config
self.list_url = "https://www.icheckmovies.com/lists/"
@retry(stop_max_attempt_number=6, wait_fixed=10000)
def _request(self, url, language):
return html.fromstring(requests.get(url, headers={"Accept-Language": language, "User-Agent": "Mozilla/5.0 x64"}).content)
def _parse_list(self, list_url, language):
response = self._request(list_url, language)
imdb_urls = response.xpath("//a[@class='optionIcon optionIMDB external']/@href")
return [t[t.find("/tt") + 1:-1] for t in imdb_urls]
def get_list_description(self, list_url, language):
descriptions = self._request(list_url, language).xpath("//div[@class='span-19 last']/p/em/text()")
return descriptions[0] if len(descriptions) > 0 and len(descriptions[0]) > 0 else None
def validate_icheckmovies_list(self, list_url, language):
list_url = list_url.strip()
if not list_url.startswith(self.list_url):
raise Failed(f"ICheckMovies Error: {list_url} must begin with: {self.list_url}")
if len(self._parse_list(list_url, language)) > 0:
return list_url
raise Failed(f"ICheckMovies Error: {list_url} failed to parse")
def get_items(self, method, data, language):
pretty = util.pretty_names[method] if method in util.pretty_names else method
movie_ids = []
if method == "icheckmovies_list":
logger.info(f"Processing {pretty}: {data}")
imdb_ids = self._parse_list(data, language)
total_ids = len(imdb_ids)
for i, imdb_id in enumerate(imdb_ids, 1):
try:
util.print_return(f"Converting IMDb ID {i}/{total_ids}")
movie_ids.append(self.config.Convert.imdb_to_tmdb(imdb_id))
except Failed as e:
logger.error(e)
logger.info(util.adjust_space(f"Processed {total_ids} IMDb IDs"))
else:
raise Failed(f"ICheckMovies Error: Method {method} not supported")
logger.debug("")
logger.debug(f"{len(movie_ids)} TMDb IDs Found: {movie_ids}")
return movie_ids, []
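Since icheckmovies.py is new in this release, here is a standalone sketch of the scrape it performs, mirroring _parse_list above with only requests and lxml; the list URL is just an example and the site's markup may change:

import requests
from lxml import html

def icheckmovies_imdb_ids(list_url, language="en"):
    # Same request and XPath as ICheckMovies._parse_list: grab the IMDb links
    # on the list page and keep only the ttXXXXXXX identifiers.
    headers = {"Accept-Language": language, "User-Agent": "Mozilla/5.0 x64"}
    page = html.fromstring(requests.get(list_url, headers=headers).content)
    imdb_urls = page.xpath("//a[@class='optionIcon optionIMDB external']/@href")
    return [u[u.find("/tt") + 1:-1] for u in imdb_urls]

print(icheckmovies_imdb_ids("https://www.icheckmovies.com/lists/imdbs+top+250/")[:5])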

@ -8,7 +8,7 @@ logger = logging.getLogger("Plex Meta Manager")
builders = ["imdb_list", "imdb_id"]
class IMDbAPI:
class IMDb:
def __init__(self, config):
self.config = config
self.urls = {
@ -94,13 +94,15 @@ class IMDbAPI:
pretty = util.pretty_names[method] if method in util.pretty_names else method
show_ids = []
movie_ids = []
fail_ids = []
def run_convert(imdb_id):
tvdb_id = self.config.Convert.imdb_to_tvdb(imdb_id) if not is_movie else None
tmdb_id = self.config.Convert.imdb_to_tmdb(imdb_id) if tvdb_id is None else None
if not tmdb_id and not tvdb_id:
logger.error(f"Convert Error: No {'' if is_movie else 'TVDb ID or '}TMDb ID found for IMDb: {imdb_id}")
if tmdb_id: movie_ids.append(tmdb_id)
if tvdb_id: show_ids.append(tvdb_id)
elif tvdb_id: show_ids.append(tvdb_id)
else:
logger.error(f"Convert Error: No {'' if is_movie else 'TVDb ID or '}TMDb ID found for IMDb: {imdb_id}")
fail_ids.append(imdb_id)
if method == "imdb_id":
logger.info(f"Processing {pretty}: {data}")
@ -117,6 +119,7 @@ class IMDbAPI:
else:
raise Failed(f"IMDb Error: Method {method} not supported")
logger.debug("")
logger.debug(f"TMDb IDs Found: {movie_ids}")
logger.debug(f"TVDb IDs Found: {show_ids}")
logger.debug(f"{len(fail_ids)} IMDb IDs Failed to Convert: {fail_ids}")
logger.debug(f"{len(movie_ids)} TMDb IDs Found: {movie_ids}")
logger.debug(f"{len(show_ids)} TVDb IDs Found: {show_ids}")
return movie_ids, show_ids

@ -8,7 +8,7 @@ logger = logging.getLogger("Plex Meta Manager")
builders = ["letterboxd_list", "letterboxd_list_details"]
class LetterboxdAPI:
class Letterboxd:
def __init__(self, config):
self.config = config
self.url = "https://letterboxd.com"
@ -69,5 +69,5 @@ class LetterboxdAPI:
else:
logger.error(f"Letterboxd Error: No List Items found in {data}")
logger.debug("")
logger.debug(f"TMDb IDs Found: {movie_ids}")
logger.debug(f"{len(movie_ids)} TMDb IDs Found: {movie_ids}")
return movie_ids, []

@ -72,7 +72,7 @@ userlist_status = [
"plan_to_watch"
]
class MyAnimeListAPI:
class MyAnimeList:
def __init__(self, params, config, authorization=None):
self.config = config
self.urls = {
@ -214,7 +214,7 @@ class MyAnimeListAPI:
raise Failed(f"MyAnimeList Error: Method {method} not supported")
movie_ids, show_ids = self.config.Convert.myanimelist_to_ids(mal_ids)
logger.debug("")
logger.debug(f"MyAnimeList IDs Found: {mal_ids}")
logger.debug(f"Shows Found: {show_ids}")
logger.debug(f"Movies Found: {movie_ids}")
logger.debug(f"{len(mal_ids)} MyAnimeList IDs Found: {mal_ids}")
logger.debug(f"{len(movie_ids)} TMDb IDs Found: {movie_ids}")
logger.debug(f"{len(show_ids)} TVDb IDs Found: {show_ids}")
return movie_ids, show_ids

@ -1,14 +1,15 @@
import logging, os, re, requests
from datetime import datetime
from modules import plex, util
from modules.util import Failed
from modules.util import Failed, ImageData
from plexapi.exceptions import NotFound
from ruamel import yaml
logger = logging.getLogger("Plex Meta Manager")
class Metadata:
def __init__(self, library, file_type, path):
def __init__(self, config, library, file_type, path):
self.config = config
self.library = library
self.type = file_type
self.path = path
@ -16,7 +17,7 @@ class Metadata:
logger.info("")
logger.info(f"Loading Metadata {file_type}: {path}")
def get_dict(attribute, attr_data, check_list=None):
if attribute in attr_data:
if attr_data and attribute in attr_data:
if attr_data[attribute]:
if isinstance(attr_data[attribute], dict):
if check_list:
@ -102,7 +103,7 @@ class Metadata:
logger.error(f"Metadata Error: {name} attribute is blank")
def add_advanced_edit(attr, obj, group, alias, show_library=False, new_agent=False):
key, options = plex.advance_keys[attr]
key, options = plex.item_advance_keys[f"item_{attr}"]
if attr in alias:
if new_agent and self.library.agent not in plex.new_plex_agents:
logger.error(f"Metadata Error: {attr} attribute only works with the New Plex Movie Agent and New Plex TV Agent")
@ -140,23 +141,26 @@ class Metadata:
return self.library.edit_tags(attr, obj, add_tags=add_tags, remove_tags=remove_tags, sync_tags=sync_tags)
return False
def set_image(attr, obj, group, alias, poster=True, url=True):
def set_image(attr, group, alias, is_poster=True, is_url=True):
if group[alias[attr]]:
message = f"{'poster' if poster else 'background'} to [{'URL' if url else 'File'}] {group[alias[attr]]}"
self.library.upload_image(obj, group[alias[attr]], poster=poster, url=url)
logger.info(f"Detail: {attr} updated {message}")
return ImageData(attr, group[alias[attr]], is_poster=is_poster, is_url=is_url)
else:
logger.error(f"Metadata Error: {attr} attribute is blank")
def set_images(obj, group, alias):
poster = None
background = None
if "url_poster" in alias:
set_image("url_poster", obj, group, alias)
poster = set_image("url_poster", group, alias)
elif "file_poster" in alias:
set_image("file_poster", obj, group, alias, url=False)
poster = set_image("file_poster", group, alias, is_url=False)
if "url_background" in alias:
set_image("url_background", obj, group, alias, poster=False)
background = set_image("url_background", group, alias, is_poster=False)
elif "file_background" in alias:
set_image("file_background", obj, group, alias, poster=False, url=False)
background = set_image("file_background", group, alias, is_poster=False, is_url=False)
if poster or background:
self.library.upload_images(obj, poster=poster, background=background)
logger.info("")
util.separator()
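The ImageData objects that set_image now returns live in modules/util.py, which is outside this diff. Roughly, inferred from how the fields are used here and in plex.py (an approximation, not the actual class):

class ImageData:
    # Editor's approximation of modules.util.ImageData; the real implementation
    # may differ, e.g. in how `compare` is derived for local files.
    def __init__(self, attribute, location, prefix="", is_poster=True, is_url=True):
        self.attribute = attribute  # which option supplied the image, e.g. "url_poster"
        self.location = location    # URL or file path
        self.prefix = prefix        # log-line prefix such as "Movie Title's "
        self.is_poster = is_poster
        self.is_url = is_url
        self.compare = location     # value stored in the image_map cache for change detection
        self.message = f"{prefix}{'poster' if is_poster else 'background'} to [{'URL' if is_url else 'File'}] {location}"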

@ -6,8 +6,11 @@ from retrying import retry
logger = logging.getLogger("Plex Meta Manager")
class OMDbObj:
def __init__(self, data):
def __init__(self, imdb_id, data):
self._imdb_id = imdb_id
self._data = data
if data["Response"] == "False":
raise Failed(f"OMDb Error: {data['Error']} IMDb ID: {imdb_id}")
self.title = data["Title"]
try:
self.year = int(data["Year"])
@ -31,7 +34,7 @@ class OMDbObj:
self.imdb_id = data["imdbID"]
self.type = data["Type"]
class OMDbAPI:
class OMDb:
def __init__(self, params, Cache=None):
self.url = "http://www.omdbapi.com/"
self.apikey = params["apikey"]
@ -45,10 +48,10 @@ class OMDbAPI:
if self.Cache:
omdb_dict, expired = self.Cache.query_omdb(imdb_id)
if omdb_dict and expired is False:
return OMDbObj(omdb_dict)
return OMDbObj(imdb_id, omdb_dict)
response = requests.get(self.url, params={"i": imdb_id, "apikey": self.apikey})
if response.status_code < 400:
omdb = OMDbObj(response.json())
omdb = OMDbObj(imdb_id, response.json())
if self.Cache:
self.Cache.update_omdb(expired, omdb)
return omdb
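A quick illustration of the new guard, assuming the repo's modules are importable: an OMDb "not found" payload now surfaces as a Failed carrying the offending id instead of a KeyError on "Title" (cached rows get Response set to "True" above so they pass the same check); the id below is a placeholder:

from modules.util import Failed
from modules.omdb import OMDbObj

try:
    OMDbObj("tt0000000", {"Response": "False", "Error": "Incorrect IMDb ID."})
except Failed as e:
    print(e)  # OMDb Error: Incorrect IMDb ID. IMDb ID: tt0000000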

@ -1,15 +1,16 @@
import glob, logging, os, requests
import glob, logging, os, plexapi, requests, shutil
from modules import builder, util
from modules.meta import Metadata
from modules.util import Failed
import plexapi
from modules.util import Failed, ImageData
from plexapi import utils
from plexapi.exceptions import BadRequest, NotFound, Unauthorized
from plexapi.collection import Collections
from plexapi.collection import Collection
from plexapi.server import PlexServer
from PIL import Image
from retrying import retry
from ruamel import yaml
from urllib import parse
from xml.etree.ElementTree import ParseError
logger = logging.getLogger("Plex Meta Manager")
@ -24,6 +25,7 @@ search_translation = {
"critic_rating": "rating",
"user_rating": "userRating",
"plays": "viewCount",
"unplayed": "unwatched",
"episode_title": "episode.title",
"episode_added": "episode.addedAt",
"episode_air_date": "episode.originallyAvailableAt",
@ -57,15 +59,6 @@ metadata_language_options["default"] = None
use_original_title_options = {"default": -1, "no": 0, "yes": 1}
collection_mode_keys = {-1: "default", 0: "hide", 1: "hideItems", 2: "showItems"}
collection_order_keys = {0: "release", 1: "alpha", 2: "custom"}
advance_keys = {
"episode_sorting": ("episodeSort", episode_sorting_options),
"keep_episodes": ("autoDeletionItemPolicyUnwatchedLibrary", keep_episodes_options),
"delete_episodes": ("autoDeletionItemPolicyWatchedLibrary", delete_episodes_options),
"season_display": ("flattenSeasons", season_display_options),
"episode_ordering": ("showOrdering", episode_ordering_options),
"metadata_language": ("languageOverride", metadata_language_options),
"use_original_title": ("useOriginalTitle", use_original_title_options)
}
item_advance_keys = {
"item_episode_sorting": ("episodeSort", episode_sorting_options),
"item_keep_episodes": ("autoDeletionItemPolicyUnwatchedLibrary", keep_episodes_options),
@ -259,15 +252,16 @@ sort_types = {
"episodes": (4, episode_sorts),
}
class PlexAPI:
def __init__(self, params):
class Plex:
def __init__(self, config, params):
self.config = config
try:
self.PlexServer = PlexServer(params["plex"]["url"], params["plex"]["token"], timeout=params["plex"]["timeout"])
except Unauthorized:
raise Failed("Plex Error: Plex token is invalid")
except ValueError as e:
raise Failed(f"Plex Error: {e}")
except requests.exceptions.ConnectionError:
except (requests.exceptions.ConnectionError, ParseError):
util.print_stacktrace()
raise Failed("Plex Error: Plex url is invalid")
self.Plex = next((s for s in self.PlexServer.library.sections() if s.title == params["name"]), None)
@ -285,7 +279,7 @@ class PlexAPI:
self.metadata_files = []
for file_type, metadata_file in params["metadata_path"]:
try:
meta_obj = Metadata(self, file_type, metadata_file)
meta_obj = Metadata(config, self, file_type, metadata_file)
if meta_obj.collections:
self.collections.extend([c for c in meta_obj.collections])
if meta_obj.metadata:
@ -307,7 +301,8 @@ class PlexAPI:
self.Sonarr = None
self.Tautulli = None
self.name = params["name"]
self.mapping_name, output = util.validate_filename(params["mapping_name"])
self.original_mapping_name = params["mapping_name"]
self.mapping_name, output = util.validate_filename(self.original_mapping_name)
if output:
logger.info(output)
self.missing_path = os.path.join(params["default_dir"], f"{self.name}_missing.yml")
@ -323,9 +318,10 @@ class PlexAPI:
self.mass_genre_update = params["mass_genre_update"]
self.mass_audience_rating_update = params["mass_audience_rating_update"]
self.mass_critic_rating_update = params["mass_critic_rating_update"]
self.split_duplicates = params["split_duplicates"]
self.radarr_add_all = params["radarr_add_all"]
self.sonarr_add_all = params["sonarr_add_all"]
self.mass_update = self.mass_genre_update or self.mass_audience_rating_update or self.mass_critic_rating_update or self.radarr_add_all or self.sonarr_add_all
self.mass_update = self.mass_genre_update or self.mass_audience_rating_update or self.mass_critic_rating_update or self.split_duplicates or self.radarr_add_all or self.sonarr_add_all
self.plex = params["plex"]
self.url = params["plex"]["url"]
self.token = params["plex"]["token"]
@ -339,6 +335,7 @@ class PlexAPI:
self.movie_rating_key_map = {}
self.show_rating_key_map = {}
self.run_again = []
self.overlays = []
def get_all_collections(self):
return self.search(libtype="collection")
@ -363,13 +360,24 @@ class PlexAPI:
def fetchItem(self, data):
return self.PlexServer.fetchItem(data)
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
def get_all(self):
return self.Plex.all()
logger.info(f"Loading All {'Movie' if self.is_movie else 'Show'}s from Library: {self.name}")
logger.info("")
key = f"/library/sections/{self.Plex.key}/all?type={utils.searchType(self.Plex.TYPE)}"
container_start = 0
container_size = plexapi.X_PLEX_CONTAINER_SIZE
results = []
while self.Plex._totalViewSize is None or container_start <= self.Plex._totalViewSize:
results.extend(self.fetchItems(key, container_start, container_size))
util.print_return(f"Loaded: {container_start}/{self.Plex._totalViewSize}")
container_start += container_size
logger.info(util.adjust_space(f"Loaded {self.Plex._totalViewSize} {'Movies' if self.is_movie else 'Shows'}"))
logger.info("")
return results
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
def server_search(self, data):
return self.PlexServer.search(data)
def fetchItems(self, key, container_start, container_size):
return self.Plex.fetchItems(key, container_start=container_start, container_size=container_size)
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
def query(self, method):
@ -389,12 +397,16 @@ class PlexAPI:
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
def get_guids(self, item):
self.reload(item)
return item.guids
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
def reload(self, item):
item.reload(checkFiles=False, includeAllConcerts=False, includeBandwidths=False, includeChapters=False,
includeChildren=False, includeConcerts=False, includeExternalMedia=False, includeExtras=False,
includeFields='', includeGeolocation=False, includeLoudnessRamps=False, includeMarkers=False,
includeOnDeck=False, includePopularLeaves=False, includePreferences=False, includeRelated=False,
includeFields=False, includeGeolocation=False, includeLoudnessRamps=False, includeMarkers=False,
includeOnDeck=False, includePopularLeaves=False, includeRelated=False,
includeRelatedCount=0, includeReviews=False, includeStations=False)
return item.guids
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
def edit_query(self, item, edits, advanced=False):
@ -402,24 +414,93 @@ class PlexAPI:
item.editAdvanced(**edits)
else:
item.edit(**edits)
item.reload()
self.reload(item)
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
def _upload_image(self, item, image):
if image.is_poster and image.is_url:
item.uploadPoster(url=image.location)
elif image.is_poster:
item.uploadPoster(filepath=image.location)
elif image.is_url:
item.uploadArt(url=image.location)
else:
item.uploadArt(filepath=image.location)
self.reload(item)
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_plex)
def upload_image(self, item, location, poster=True, url=True):
if poster and url:
item.uploadPoster(url=location)
elif poster:
item.uploadPoster(filepath=location)
elif url:
item.uploadArt(url=location)
def _upload_file_poster(self, item, image):
item.uploadPoster(filepath=image)
self.reload(item)
def upload_images(self, item, poster=None, background=None, overlay=None):
poster_uploaded = False
if poster is not None:
try:
image = None
if self.config.Cache:
image, image_compare, _ = self.config.Cache.query_image_map(item.ratingKey, self.original_mapping_name, "poster")
if str(poster.compare) != str(image_compare):
image = None
if image is None or image != item.thumb:
self._upload_image(item, poster)
poster_uploaded = True
logger.info(f"Detail: {poster.attribute} updated {poster.message}")
else:
item.uploadArt(filepath=location)
logger.info(f"Detail: {poster.prefix}poster update not needed")
except BadRequest:
util.print_stacktrace()
logger.error(f"Detail: {poster.attribute} failed to update {poster.message}")
overlay_name = ""
if overlay is not None:
overlay_name, overlay_folder, overlay_image, temp_image = overlay
image_overlay = None
if self.config.Cache:
image, _, image_overlay = self.config.Cache.query_image_map(item.ratingKey, self.original_mapping_name, "poster")
if poster_uploaded or not image_overlay or image_overlay != overlay_name:
og_image = requests.get(item.posterUrl).content
with open(temp_image, "wb") as handler:
handler.write(og_image)
shutil.copyfile(temp_image, os.path.join(overlay_folder, f"{item.ratingKey}.png"))
new_poster = Image.open(temp_image)
new_poster = new_poster.resize(overlay_image.size, Image.ANTIALIAS)
new_poster.paste(overlay_image, (0, 0), overlay_image)
new_poster.save(temp_image)
self._upload_file_poster(item, temp_image)
poster_uploaded = True
logger.info(f"Detail: Overlay: {overlay_name} applied to {item.title}")
background_uploaded = False
if background is not None:
try:
image = None
if self.config.Cache:
image, image_compare, _ = self.config.Cache.query_image_map(item.ratingKey, self.original_mapping_name, "background")
if str(background.compare) != str(image_compare):
image = None
if image is None or image != item.art:
self._upload_image(item, background)
background_uploaded = True
logger.info(f"Detail: {background.attribute} updated {background.message}")
else:
logger.info(f"Detail: {background.prefix}background update not needed")
except BadRequest:
util.print_stacktrace()
logger.error(f"Detail: {background.attribute} failed to update {background.message}")
if self.config.Cache:
if poster_uploaded:
self.config.Cache.update_image_map(item.ratingKey, self.original_mapping_name, "poster", item.thumb, poster.compare if poster else "", overlay_name)
if background_uploaded:
self.config.Cache.update_image_map(item.ratingKey, self.original_mapping_name, "background", item.art, background.compare, "")
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
def get_search_choices(self, search_name, title=True):
final_search = search_translation[search_name] if search_name in search_translation else search_name
try:
choices = {}
for choice in self.Plex.listFilterChoices(search_name):
for choice in self.Plex.listFilterChoices(final_search):
choices[choice.title.lower()] = choice.title if title else choice.key
choices[choice.key.lower()] = choice.title if title else choice.key
return choices
@ -436,6 +517,7 @@ class PlexAPI:
elif put: method = self.Plex._server._session.put
else: method = None
self.Plex._server.query(key, method=method)
return self.Plex._server.query(key, method=method)
def smart_label_url(self, title, sort):
labels = self.get_labels()
@ -474,37 +556,39 @@ class PlexAPI:
self.test_smart_filter(uri_args)
self._query(f"/library/collections/{collection.ratingKey}/items{utils.joinArgs({'uri': self.build_smart_filter(uri_args)})}", put=True)
def smart(self, collection):
return utils.cast(bool, self.get_collection(collection)._data.attrib.get('smart', '0'))
def smart_filter(self, collection):
smart_filter = self.get_collection(collection)._data.attrib.get('content')
smart_filter = self.get_collection(collection).content
return smart_filter[smart_filter.index("?"):]
def validate_search_list(self, data, search_name, title=True, pairs=False):
final_search = search_translation[search_name] if search_name in search_translation else search_name
search_choices = self.get_search_choices(final_search, title=title)
valid_list = []
for value in util.get_list(data):
if str(value).lower() in search_choices:
if pairs:
valid_list.append((value, search_choices[str(value).lower()]))
else:
valid_list.append(search_choices[str(value).lower()])
else:
raise Failed(f"Plex Error: {search_name}: {value} not found")
return valid_list
def collection_visibility(self, collection):
try:
attrs = self._query(f"/hubs/sections/{self.Plex.key}/manage?metadataItemId={collection.ratingKey}")[0].attrib
return {
"library": utils.cast(bool, attrs.get("promotedToRecommended", "0")),
"home": utils.cast(bool, attrs.get("promotedToOwnHome", "0")),
"shared": utils.cast(bool, attrs.get("promotedToSharedHome", "0"))
}
except IndexError:
return {"library": False, "home": False, "shared": False}
def collection_visibility_update(self, collection, visibility=None, library=None, home=None, shared=None):
if visibility is None:
visibility = self.collection_visibility(collection)
key = f"/hubs/sections/{self.Plex.key}/manage?metadataItemId={collection.ratingKey}"
key += f"&promotedToRecommended={1 if (library is None and visibility['library']) or library else 0}"
key += f"&promotedToOwnHome={1 if (home is None and visibility['home']) or home else 0}"
key += f"&promotedToSharedHome={1 if (shared is None and visibility['shared']) or shared else 0}"
self._query(key, post=True)
def get_collection(self, data):
if isinstance(data, int):
collection = self.fetchItem(data)
elif isinstance(data, Collections):
elif isinstance(data, Collection):
collection = data
else:
collection = util.choose_from_list(self.search(title=str(data), libtype="collection"), "collection", str(data), exact=True)
if collection:
return collection
else:
raise Failed(f"Plex Error: Collection {data} not found")
def validate_collections(self, collections):
@ -554,7 +638,7 @@ class PlexAPI:
for i, item in enumerate(all_items, 1):
util.print_return(f"Processing: {i}/{len(all_items)} {item.title}")
add_item = True
self.query(item.reload)
self.reload(item)
for collection in item.collections:
if collection.id in collection_indexes:
add_item = False
@ -586,9 +670,9 @@ class PlexAPI:
def get_collection_items(self, collection, smart_label_collection):
if smart_label_collection:
return self.get_labeled_items(collection.title if isinstance(collection, Collections) else str(collection))
elif isinstance(collection, Collections):
if self.smart(collection):
return self.get_labeled_items(collection.title if isinstance(collection, Collection) else str(collection))
elif isinstance(collection, Collection):
if collection.smart:
return self.get_filter_items(self.smart_filter(collection))
else:
return self.query(collection.items)
@ -600,19 +684,17 @@ class PlexAPI:
return self.Plex._search(key, None, 0, plexapi.X_PLEX_CONTAINER_SIZE)
def get_collection_name_and_items(self, collection, smart_label_collection):
name = collection.title if isinstance(collection, Collections) else str(collection)
name = collection.title if isinstance(collection, Collection) else str(collection)
return name, self.get_collection_items(collection, smart_label_collection)
def map_guids(self, config):
logger.info(f"Loading {'Movie' if self.is_movie else 'Show'} Library: {self.name}")
logger.info("")
items = self.Plex.all()
def map_guids(self):
items = self.get_all()
logger.info(f"Mapping {'Movie' if self.is_movie else 'Show'} Library: {self.name}")
logger.info("")
for i, item in enumerate(items, 1):
util.print_return(f"Processing: {i}/{len(items)} {item.title}")
if item.ratingKey not in self.movie_rating_key_map and item.ratingKey not in self.show_rating_key_map:
id_type, main_id = config.Convert.get_id(item, self)
id_type, main_id = self.config.Convert.get_id(item, self)
if main_id:
if not isinstance(main_id, list):
main_id = [main_id]
@ -663,82 +745,98 @@ class PlexAPI:
updated = False
key = builder.filter_translation[attr] if attr in builder.filter_translation else attr
if add_tags or remove_tags or sync_tags:
item_tags = [item_tag.tag for item_tag in getattr(obj, key)]
input_tags = []
if add_tags:
input_tags.extend(add_tags)
if sync_tags:
input_tags.extend(sync_tags)
if sync_tags or remove_tags:
remove_method = getattr(obj, f"remove{attr.capitalize()}")
for tag in item_tags:
if (sync_tags and tag not in sync_tags) or (remove_tags and tag in remove_tags):
_add_tags = add_tags if add_tags else []
_remove = [t.lower() for t in remove_tags] if remove_tags else []
_sync_tags = sync_tags if sync_tags else []
_sync = [t.lower() for t in _sync_tags]
item_tags = [item_tag.tag.lower() for item_tag in getattr(obj, key)]
_add = _add_tags + _sync_tags
if _add:
add = [f"{t[:1].upper()}{t[1:]}" for t in _add if t.lower() not in item_tags]
if add:
updated = True
self.query_data(remove_method, tag)
logger.info(f"Detail: {attr.capitalize()} {tag} removed")
if input_tags:
add_method = getattr(obj, f"add{attr.capitalize()}")
for tag in input_tags:
if tag not in item_tags:
self.query_data(getattr(obj, f"add{attr.capitalize()}"), add)
logger.info(f"Detail: {attr.capitalize()} {add} added")
if _remove or _sync:
remove = [t for t in item_tags if (_sync and t not in _sync) or t in _remove]
if remove:
updated = True
self.query_data(add_method, tag)
logger.info(f"Detail: {attr.capitalize()} {tag} added")
self.query_data(getattr(obj, f"remove{attr.capitalize()}"), remove)
logger.info(f"Detail: {attr.capitalize()} {remove} removed")
return updated
def update_item_from_assets(self, item, collection_mode=False, upload=True, dirs=None, name=None):
if dirs is None:
dirs = self.asset_directory
if not name and collection_mode:
name = item.title
elif not name:
def update_item_from_assets(self, item, overlay=None):
name = os.path.basename(os.path.dirname(item.locations[0]) if self.is_movie else item.locations[0])
for ad in dirs:
poster_image = None
background_image = None
found_one = False
for ad in self.asset_directory:
poster = None
background = None
item_dir = None
if self.asset_folders:
if not os.path.isdir(os.path.join(ad, name)):
if os.path.isdir(os.path.join(ad, name)):
item_dir = os.path.join(ad, name)
else:
matches = glob.glob(os.path.join(ad, "*", name))
if len(matches) > 0:
item_dir = os.path.abspath(matches[0])
if item_dir is None:
continue
poster_filter = os.path.join(ad, name, "poster.*")
background_filter = os.path.join(ad, name, "background.*")
found_one = True
poster_filter = os.path.join(item_dir, "poster.*")
background_filter = os.path.join(item_dir, "background.*")
else:
poster_filter = os.path.join(ad, f"{name}.*")
background_filter = os.path.join(ad, f"{name}_background.*")
matches = glob.glob(poster_filter)
if len(matches) > 0:
poster_image = os.path.abspath(matches[0])
if upload:
self.upload_image(item, poster_image, url=False)
logger.info(f"Detail: asset_directory updated {item.title}'s poster to [file] {poster_image}")
poster = ImageData("asset_directory", os.path.abspath(matches[0]), prefix=f"{item.title}'s ", is_url=False)
matches = glob.glob(background_filter)
if len(matches) > 0:
background_image = os.path.abspath(matches[0])
if upload:
self.upload_image(item, background_image, poster=False, url=False)
logger.info(f"Detail: asset_directory updated {item.title}'s background to [file] {background_image}")
if collection_mode:
for ite in self.query(item.items):
self.update_item_from_assets(ite, dirs=[os.path.join(ad, name)])
if not upload:
return poster_image, background_image
if self.is_show and not collection_mode:
background = ImageData("asset_directory", os.path.abspath(matches[0]), prefix=f"{item.title}'s ", is_poster=False, is_url=False)
if poster or background:
self.upload_images(item, poster=poster, background=background, overlay=overlay)
if self.is_show:
for season in self.query(item.seasons):
if self.asset_folders:
season_filter = os.path.join(ad, name, f"Season{'0' if season.seasonNumber < 10 else ''}{season.seasonNumber}.*")
if item_dir:
season_filter = os.path.join(item_dir, f"Season{'0' if season.seasonNumber < 10 else ''}{season.seasonNumber}.*")
else:
season_filter = os.path.join(ad, f"{name}_Season{'0' if season.seasonNumber < 10 else ''}{season.seasonNumber}.*")
matches = glob.glob(season_filter)
if len(matches) > 0:
season_path = os.path.abspath(matches[0])
self.upload_image(season, season_path, url=False)
logger.info(f"Detail: asset_directory updated {item.title} Season {season.seasonNumber}'s poster to [file] {season_path}")
season_poster = ImageData("asset_directory", os.path.abspath(matches[0]), prefix=f"{item.title} Season {season.seasonNumber}'s ", is_url=False)
self.upload_images(season, poster=season_poster)
for episode in self.query(season.episodes):
if self.asset_folders:
episode_filter = os.path.join(ad, name, f"{episode.seasonEpisode.upper()}.*")
if item_dir:
episode_filter = os.path.join(item_dir, f"{episode.seasonEpisode.upper()}.*")
else:
episode_filter = os.path.join(ad, f"{name}_{episode.seasonEpisode.upper()}.*")
matches = glob.glob(episode_filter)
if len(matches) > 0:
episode_path = os.path.abspath(matches[0])
self.upload_image(episode, episode_path, url=False)
logger.info(f"Detail: asset_directory updated {item.title} {episode.seasonEpisode.upper()}'s poster to [file] {episode_path}")
episode_poster = ImageData("asset_directory", os.path.abspath(matches[0]), prefix=f"{item.title} {episode.seasonEpisode.upper()}'s ", is_url=False)
self.upload_images(episode, poster=episode_poster)
if not found_one:
logger.error(f"Asset Warning: No asset folder found called '{name}'")
def find_collection_assets(self, item, name=None):
if name is None:
name = item.title
for ad in self.asset_directory:
poster = None
background = None
if self.asset_folders:
if not os.path.isdir(os.path.join(ad, name)):
continue
poster_filter = os.path.join(ad, name, "poster.*")
background_filter = os.path.join(ad, name, "background.*")
else:
poster_filter = os.path.join(ad, f"{name}.*")
background_filter = os.path.join(ad, f"{name}_background.*")
matches = glob.glob(poster_filter)
if len(matches) > 0:
poster = ImageData("asset_directory", os.path.abspath(matches[0]), prefix=f"{item.title}'s ", is_url=False)
matches = glob.glob(background_filter)
if len(matches) > 0:
background = ImageData("asset_directory", os.path.abspath(matches[0]), prefix=f"{item.title}'s ", is_poster=False, is_url=False)
if poster or background:
return poster, background
return None, None
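With the per-image upload_image calls gone, callers hand ImageData objects to the single upload_images entry point, which also handles the image_map cache checks and overlays. A minimal sketch of a caller, where library is assumed to be a connected modules.plex.Plex instance, item a Plex video object, and the URL and path are placeholders:

from modules.util import ImageData

def set_artwork(library, item):
    poster = ImageData("url_poster", "https://example.com/poster.jpg")
    background = ImageData("file_background", "/config/assets/Example Movie/background.png",
                           is_poster=False, is_url=False)
    # change detection and overlay handling happen inside upload_images
    library.upload_images(item, poster=poster, background=background)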

@ -1,7 +1,8 @@
import logging, requests
import logging
from modules import util
from modules.util import Failed
from retrying import retry
from arrapi import RadarrAPI
from arrapi.exceptions import ArrException, Invalid
logger = logging.getLogger("Plex Meta Manager")
@ -11,122 +12,75 @@ availability_translation = {
"released": "released",
"db": "preDB"
}
apply_tags_translation = {
"": "add",
"sync": "replace",
"remove": "remove"
}
class RadarrAPI:
class Radarr:
def __init__(self, params):
self.url = params["url"]
self.token = params["token"]
self.version = params["version"]
self.base_url = f"{self.url}/api{'/v3' if self.version == 'v3' else ''}/"
try:
result = requests.get(f"{self.base_url}system/status", params={"apikey": f"{self.token}"}).json()
except Exception:
util.print_stacktrace()
raise Failed(f"Radarr Error: Could not connect to Radarr at {self.url}")
if "error" in result and result["error"] == "Unauthorized":
raise Failed("Radarr Error: Invalid API Key")
if "version" not in result:
raise Failed("Radarr Error: Unexpected Response Check URL")
self.api = RadarrAPI(self.url, self.token)
except ArrException as e:
raise Failed(e)
self.add = params["add"]
self.root_folder_path = params["root_folder_path"]
self.monitor = params["monitor"]
self.availability = params["availability"]
self.quality_profile_id = self.get_profile_id(params["quality_profile"])
self.quality_profile = params["quality_profile"]
self.tag = params["tag"]
self.tags = self.get_tags()
self.search = params["search"]
def get_profile_id(self, profile_name):
profiles = ""
for profile in self._get("qualityProfile" if self.version == "v3" else "profile"):
if len(profiles) > 0:
profiles += ", "
profiles += profile["name"]
if profile["name"] == profile_name:
return profile["id"]
raise Failed(f"Radarr Error: quality_profile: {profile_name} does not exist in radarr. Profiles available: {profiles}")
def get_tags(self):
return {tag["label"]: tag["id"] for tag in self._get("tag")}
def add_tags(self, tags):
added = False
for label in tags:
if str(label).lower() not in self.tags:
added = True
self._post("tag", {"label": str(label).lower()})
if added:
self.tags = self.get_tags()
def lookup(self, tmdb_id):
results = self._get("movie/lookup", params={"term": f"tmdb:{tmdb_id}"})
if results:
return results[0]
else:
raise Failed(f"Radarr Error: TMDb ID: {tmdb_id} not found")
def add_tmdb(self, tmdb_ids, **options):
logger.info("")
util.separator(f"Adding to Radarr", space=False, border=False)
logger.info("")
util.separator("Adding to Radarr", space=False, border=False)
logger.debug("")
logger.debug(f"TMDb IDs: {tmdb_ids}")
tag_nums = []
add_count = 0
folder = options["folder"] if "folder" in options else self.root_folder_path
monitor = options["monitor"] if "monitor" in options else self.monitor
availability = options["availability"] if "availability" in options else self.availability
quality_profile_id = self.get_profile_id(options["quality"]) if "quality" in options else self.quality_profile_id
availability = availability_translation[options["availability"] if "availability" in options else self.availability]
quality_profile = options["quality"] if "quality" in options else self.quality_profile
tags = options["tag"] if "tag" in options else self.tag
search = options["search"] if "search" in options else self.search
if tags:
self.add_tags(tags)
tag_nums = [self.tags[label.lower()] for label in tags if label.lower() in self.tags]
for tmdb_id in tmdb_ids:
try:
movie_info = self.lookup(tmdb_id)
except Failed as e:
logger.error(e)
continue
added, exists, invalid = self.api.add_multiple_movies(tmdb_ids, folder, quality_profile, monitor, search, availability, tags)
except Invalid as e:
raise Failed(f"Radarr Error: {e}")
poster_url = None
for image in movie_info["images"]:
if "coverType" in image and image["coverType"] == "poster" and "remoteUrl" in image:
poster_url = image["remoteUrl"]
if len(added) > 0:
logger.info("")
for movie in added:
logger.info(f"Added to Radarr | {movie.tmdbId:<6} | {movie.title}")
logger.info(f"{len(added)} Movie{'s' if len(added) > 1 else ''} added to Radarr")
url_json = {
"title": movie_info["title"],
f"{'qualityProfileId' if self.version == 'v3' else 'profileId'}": quality_profile_id,
"year": int(movie_info["year"]),
"tmdbid": int(tmdb_id),
"titleslug": movie_info["titleSlug"],
"minimumAvailability": availability_translation[availability],
"monitored": monitor,
"rootFolderPath": folder,
"images": [{"covertype": "poster", "url": poster_url}],
"addOptions": {"searchForMovie": search}
}
if tag_nums:
url_json["tags"] = tag_nums
response = self._post("movie", url_json)
if response.status_code < 400:
logger.info(f"Added to Radarr | {tmdb_id:<6} | {movie_info['title']}")
add_count += 1
else:
try:
logger.error(f"Radarr Error: ({tmdb_id}) {movie_info['title']}: ({response.status_code}) {response.json()[0]['errorMessage']}")
except KeyError:
logger.debug(url_json)
logger.error(f"Radarr Error: {response.json()}")
logger.info(f"{add_count} Movie{'s' if add_count > 1 else ''} added to Radarr")
if len(exists) > 0:
logger.info("")
for movie in exists:
logger.info(f"Already in Radarr | {movie.tmdbId:<6} | {movie.title}")
logger.info(f"{len(exists)} Movie{'s' if len(exists) > 1 else ''} already existing in Radarr")
if len(invalid) > 0:
logger.info("")
for tmdb_id in invalid:
logger.info(f"Invalid TMDb ID | {tmdb_id}")
def edit_tags(self, tmdb_ids, tags, apply_tags):
logger.info("")
logger.info(f"{apply_tags_translation[apply_tags].capitalize()} Radarr Tags: {tags}")
@retry(stop_max_attempt_number=6, wait_fixed=10000)
def _get(self, url, params=None):
url_params = {"apikey": f"{self.token}"}
if params:
for param in params:
url_params[param] = params[param]
return requests.get(f"{self.base_url}{url}", params=url_params).json()
edited, not_exists = self.api.edit_multiple_movies(tmdb_ids, tags=tags, apply_tags=apply_tags)
if len(edited) > 0:
logger.info("")
for movie in edited:
logger.info(f"Radarr Tags | {movie.title:<25} | {movie.tags}")
logger.info(f"{len(edited)} Movie{'s' if len(edited) > 1 else ''} edited in Radarr")
if len(not_exists) > 0:
logger.info("")
for tmdb_id in not_exists:
logger.info(f"TMDb ID Not in Radarr | {tmdb_id}")
@retry(stop_max_attempt_number=6, wait_fixed=10000)
def _post(self, url, url_json):
return requests.post(f"{self.base_url}{url}", json=url_json, params={"apikey": f"{self.token}"})
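The hand-rolled requests calls removed here are replaced by arrapi. A minimal sketch of the new add path, mirroring the positional arguments add_tmdb passes above; the URL, API key, folder, profile, and tag values are placeholders:

from arrapi import RadarrAPI
from arrapi.exceptions import Invalid

api = RadarrAPI("http://localhost:7878", "RADARR_API_KEY")
try:
    # ids, root folder, quality profile, monitor, search, availability, tags
    added, exists, invalid = api.add_multiple_movies(
        [603, 604, 605], "/movies", "HD-1080p", True, True, "released", ["pmm"])
except Invalid as e:
    print(f"Radarr Error: {e}")
else:
    print(f"{len(added)} added, {len(exists)} already in Radarr, {len(invalid)} invalid TMDb IDs")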

@ -1,8 +1,8 @@
import logging, requests
from json.decoder import JSONDecodeError
import logging
from modules import util
from modules.util import Failed
from retrying import retry
from arrapi import SonarrAPI
from arrapi.exceptions import ArrException, Invalid
logger = logging.getLogger("Plex Meta Manager")
@ -17,150 +17,82 @@ monitor_translation = {
"latest": "latestSeason",
"none": "none"
}
apply_tags_translation = {
"": "add",
"sync": "replace",
"remove": "remove"
}
class SonarrAPI:
def __init__(self, params, language):
class Sonarr:
def __init__(self, params):
self.url = params["url"]
self.token = params["token"]
self.version = params["version"]
self.base_url = f"{self.url}/api{'/v3/' if self.version == 'v3' else '/'}"
try:
result = requests.get(f"{self.base_url}system/status", params={"apikey": f"{self.token}"}).json()
except Exception:
util.print_stacktrace()
raise Failed(f"Sonarr Error: Could not connect to Sonarr at {self.url}")
if "error" in result and result["error"] == "Unauthorized":
raise Failed("Sonarr Error: Invalid API Key")
if "version" not in result:
raise Failed("Sonarr Error: Unexpected Response Check URL")
self.api = SonarrAPI(self.url, self.token)
except ArrException as e:
raise Failed(e)
self.add = params["add"]
self.root_folder_path = params["root_folder_path"]
self.monitor = params["monitor"]
self.quality_profile_id = self.get_profile_id(params["quality_profile"], "quality_profile")
self.quality_profile = params["quality_profile"]
self.language_profile_id = None
if self.version == "v3" and params["language_profile"] is not None:
self.language_profile_id = self.get_profile_id(params["language_profile"], "language_profile")
if self.language_profile_id is None:
self.language_profile_id = 1
self.language_profile = params["language_profile"]
self.series_type = params["series_type"]
self.season_folder = params["season_folder"]
self.tag = params["tag"]
self.tags = self.get_tags()
self.search = params["search"]
self.cutoff_search = params["cutoff_search"]
self.language = language
def get_profile_id(self, profile_name, profile_type):
profiles = ""
if profile_type == "quality_profile" and self.version == "v3":
endpoint = "qualityProfile"
elif profile_type == "language_profile":
endpoint = "languageProfile"
else:
endpoint = "profile"
for profile in self._get(endpoint):
if len(profiles) > 0:
profiles += ", "
profiles += profile["name"]
if profile["name"] == profile_name:
return profile["id"]
raise Failed(f"Sonarr Error: {profile_type}: {profile_name} does not exist in sonarr. Profiles available: {profiles}")
def get_tags(self):
return {tag["label"]: tag["id"] for tag in self._get("tag")}
def add_tags(self, tags):
added = False
for label in tags:
if str(label).lower() not in self.tags:
added = True
self._post("tag", {"label": str(label).lower()})
if added:
self.tags = self.get_tags()
def lookup(self, tvdb_id):
results = self._get("series/lookup", params={"term": f"tvdb:{tvdb_id}"})
if results:
return results[0]
else:
raise Failed(f"Sonarr Error: TVDb ID: {tvdb_id} not found")
def add_tvdb(self, tvdb_ids, **options):
logger.info("")
util.separator(f"Adding to Sonarr", space=False, border=False)
logger.info("")
util.separator("Adding to Sonarr", space=False, border=False)
logger.debug("")
logger.debug(f"TVDb IDs: {tvdb_ids}")
tag_nums = []
add_count = 0
folder = options["folder"] if "folder" in options else self.root_folder_path
monitor = options["monitor"] if "monitor" in options else self.monitor
quality_profile_id = self.get_profile_id(options["quality"], "quality_profile") if "quality" in options else self.quality_profile_id
language_profile_id = self.get_profile_id(options["language"], "language_profile") if "quality" in options else self.language_profile_id
monitor = monitor_translation[options["monitor"] if "monitor" in options else self.monitor]
quality_profile = options["quality"] if "quality" in options else self.quality_profile
language_profile = options["language"] if "language" in options else self.language_profile
language_profile = language_profile if self.api.v3 else 1
series = options["series"] if "series" in options else self.series_type
season = options["season"] if "season" in options else self.season_folder
tags = options["tag"] if "tag" in options else self.tag
search = options["search"] if "search" in options else self.search
cutoff_search = options["cutoff_search"] if "cutoff_search" in options else self.cutoff_search
if tags:
self.add_tags(tags)
tag_nums = [self.tags[label.lower()] for label in tags if label.lower() in self.tags]
for tvdb_id in tvdb_ids:
try:
show_info = self.lookup(tvdb_id)
except Failed as e:
logger.error(e)
continue
added, exists, invalid = self.api.add_multiple_series(tvdb_ids, folder, quality_profile, language_profile, monitor, season, search, cutoff_search, series, tags)
except Invalid as e:
raise Failed(f"Sonarr Error: {e}")
poster_url = None
for image in show_info["images"]:
if "coverType" in image and image["coverType"] == "poster" and "remoteUrl" in image:
poster_url = image["remoteUrl"]
if len(added) > 0:
logger.info("")
for series in added:
logger.info(f"Added to Sonarr | {series.tvdbId:<6} | {series.title}")
logger.info(f"{len(added)} Series added to Sonarr")
url_json = {
"title": show_info["title"],
f"{'qualityProfileId' if self.version == 'v3' else 'profileId'}": quality_profile_id,
"languageProfileId": language_profile_id,
"tvdbId": int(tvdb_id),
"titleslug": show_info["titleSlug"],
"language": self.language,
"monitored": monitor != "none",
"seasonFolder": season,
"seriesType": series,
"rootFolderPath": folder,
"seasons": [],
"images": [{"covertype": "poster", "url": poster_url}],
"addOptions": {
"searchForMissingEpisodes": search,
"searchForCutoffUnmetEpisodes": cutoff_search,
"monitor": monitor_translation[monitor]
}
}
if tag_nums:
url_json["tags"] = tag_nums
response = self._post("series", url_json)
if response.status_code < 400:
logger.info(f"Added to Sonarr | {tvdb_id:<6} | {show_info['title']}")
add_count += 1
else:
try:
logger.error(f"Sonarr Error: ({tvdb_id}) {show_info['title']}: ({response.status_code}) {response.json()[0]['errorMessage']}")
except KeyError:
logger.debug(url_json)
logger.error(f"Sonarr Error: {response.json()}")
except JSONDecodeError:
logger.debug(url_json)
logger.error(f"Sonarr Error: {response}")
if len(exists) > 0:
logger.info("")
for series in exists:
logger.info(f"Already in Sonarr | {series.tvdbId:<6} | {series.title}")
logger.info(f"{len(exists)} Series already existing in Sonarr")
if len(invalid) > 0:
for tvdb_id in invalid:
logger.info("")
logger.info(f"Invalid TVDb ID | {tvdb_id}")
logger.info(f"{add_count} Show{'s' if add_count > 1 else ''} added to Sonarr")
def edit_tags(self, tvdb_ids, tags, apply_tags):
logger.info("")
logger.info(f"{apply_tags_translation[apply_tags].capitalize()} Sonarr Tags: {tags}")
edited, not_exists = self.api.edit_multiple_series(tvdb_ids, tags=tags, apply_tags=apply_tags)
@retry(stop_max_attempt_number=6, wait_fixed=10000)
def _get(self, url, params=None):
url_params = {"apikey": f"{self.token}"}
if params:
for param in params:
url_params[param] = params[param]
return requests.get(f"{self.base_url}{url}", params=url_params).json()
if len(edited) > 0:
logger.info("")
for series in edited:
logger.info(f"Sonarr Tags | {series.title:<25} | {series.tags}")
logger.info(f"{len(edited)} Series edited in Sonarr")
@retry(stop_max_attempt_number=6, wait_fixed=10000)
def _post(self, url, url_json):
return requests.post(f"{self.base_url}{url}", json=url_json, params={"apikey": f"{self.token}"})
if len(not_exists) > 0:
logger.info("")
for tvdb_id in not_exists:
logger.info(f"TVDb ID Not in Sonarr | {tvdb_id}")
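A similar sketch of the tag-editing path above, assuming arrapi's edit_multiple_series accepts the add/replace/remove modes listed in apply_tags_translation; the connection details, TVDb IDs, and tag label are placeholders.
# Hypothetical sketch of the edit_multiple_series call used by edit_tags above.
from arrapi import SonarrAPI

api = SonarrAPI("http://localhost:8989", "SONARR_API_KEY")  # placeholder connection details
edited, not_exists = api.edit_multiple_series([81189, 305288], tags=["pmm"], apply_tags="add")  # "add" mode assumed from apply_tags_translation
for series in edited:
    print(f"Sonarr Tags | {series.title:<25} | {series.tags}")
for tvdb_id in not_exists:
    print(f"TVDb ID Not in Sonarr | {tvdb_id}")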

@ -8,7 +8,7 @@ logger = logging.getLogger("Plex Meta Manager")
builders = ["tautulli_popular", "tautulli_watched"]
class TautulliAPI:
class Tautulli:
def __init__(self, params):
self.url = params["url"]
self.apikey = params["apikey"]

@ -108,7 +108,7 @@ discover_tv_sort = [
"popularity.desc", "popularity.asc"
]
class TMDbAPI:
class TMDb:
def __init__(self, config, params):
self.config = config
self.TMDb = tmdbv3api.TMDb()
@ -362,6 +362,6 @@ class TMDbAPI:
if not is_movie and len(show_ids) > 0:
logger.info(f"Processing {pretty}: ({tmdb_id}) {tmdb_name} ({len(show_ids)} Show{'' if len(show_ids) == 1 else 's'})")
logger.debug("")
logger.debug(f"TMDb IDs Found: {movie_ids}")
logger.debug(f"TVDb IDs Found: {show_ids}")
logger.debug(f"{len(movie_ids)} TMDb IDs Found: {movie_ids}")
logger.debug(f"{len(show_ids)} TVDb IDs Found: {show_ids}")
return movie_ids, show_ids

@ -3,7 +3,7 @@ from modules import util
from modules.util import Failed, TimeoutExpired
from retrying import retry
from ruamel import yaml
from trakt import Trakt
from trakt import Trakt as TraktAPI
from trakt.objects.episode import Episode
from trakt.objects.movie import Movie
from trakt.objects.season import Season
@ -23,7 +23,7 @@ builders = [
"trakt_watchlist"
]
class TraktAPI:
class Trakt:
def __init__(self, params, authorization=None):
self.base_url = "https://api.trakt.tv"
self.redirect_uri = "urn:ietf:wg:oauth:2.0:oob"
@ -36,20 +36,20 @@ class TraktAPI:
self.client_secret = params["client_secret"]
self.config_path = params["config_path"]
self.authorization = authorization
Trakt.configuration.defaults.client(self.client_id, self.client_secret)
TraktAPI.configuration.defaults.client(self.client_id, self.client_secret)
if not self._save(self.authorization):
if not self._refresh():
self._authorization()
def _authorization(self):
url = Trakt["oauth"].authorize_url(self.redirect_uri)
url = TraktAPI["oauth"].authorize_url(self.redirect_uri)
logger.info(f"Navigate to: {url}")
logger.info("If you get an OAuth error your client_id or client_secret is invalid")
webbrowser.open(url, new=2)
try: pin = util.logger_input("Trakt pin (case insensitive)", timeout=300).strip()
except TimeoutExpired: raise Failed("Input Timeout: Trakt pin required.")
if not pin: raise Failed("Trakt Error: No input. Trakt pin required.")
new_authorization = Trakt["oauth"].token(pin, self.redirect_uri)
new_authorization = TraktAPI["oauth"].token(pin, self.redirect_uri)
if not new_authorization:
raise Failed("Trakt Error: Invalid trakt pin. If you're sure you typed it in correctly your client_id or client_secret may be invalid")
if not self._save(new_authorization):
@ -57,8 +57,8 @@ class TraktAPI:
def _check(self, authorization):
try:
with Trakt.configuration.oauth.from_response(authorization, refresh=True):
if Trakt["users/settings"].get():
with TraktAPI.configuration.oauth.from_response(authorization, refresh=True):
if TraktAPI["users/settings"].get():
return True
except ValueError: pass
return False
@ -66,7 +66,7 @@ class TraktAPI:
def _refresh(self):
if self.authorization and "refresh_token" in self.authorization and self.authorization["refresh_token"]:
logger.info("Refreshing Access Token...")
refreshed_authorization = Trakt["oauth"].token_refresh(self.authorization["refresh_token"], self.redirect_uri)
refreshed_authorization = TraktAPI["oauth"].token_refresh(self.authorization["refresh_token"], self.redirect_uri)
return self._save(refreshed_authorization)
return False
@ -86,13 +86,13 @@ class TraktAPI:
logger.info(f"Saving authorization information to {self.config_path}")
yaml.round_trip_dump(config, open(self.config_path, "w"), indent=ind, block_seq_indent=bsi)
self.authorization = authorization
Trakt.configuration.defaults.oauth.from_response(self.authorization)
TraktAPI.configuration.defaults.oauth.from_response(self.authorization)
return True
return False
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
def convert(self, external_id, from_source, to_source, media_type):
lookup = Trakt["search"].lookup(external_id, from_source, media_type)
lookup = TraktAPI["search"].lookup(external_id, from_source, media_type)
if lookup:
lookup = lookup[0] if isinstance(lookup, list) else lookup
if lookup.get_key(to_source):
@ -107,13 +107,13 @@ class TraktAPI:
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
def _user_list(self, list_type, data, is_movie):
items = Trakt[f"users/{data}/{list_type}"].movies() if is_movie else Trakt[f"users/{data}/{list_type}"].shows()
items = TraktAPI[f"users/{data}/{list_type}"].movies() if is_movie else TraktAPI[f"users/{data}/{list_type}"].shows()
if items is None: raise Failed("Trakt Error: No List found")
else: return [i for i in items]
@retry(stop_max_attempt_number=6, wait_fixed=10000, retry_on_exception=util.retry_if_not_failed)
def standard_list(self, data):
try: trakt_list = Trakt[requests.utils.urlparse(data).path].get()
try: trakt_list = TraktAPI[requests.utils.urlparse(data).path].get()
except AttributeError: trakt_list = None
if trakt_list is None: raise Failed("Trakt Error: No List found")
else: return trakt_list
@ -181,6 +181,6 @@ class TraktAPI:
show_ids.append(int(trakt_item.show.pk[1]))
logger.debug(f"Trakt {media_type} Found: {trakt_items}")
logger.debug("")
logger.debug(f"TMDb IDs Found: {movie_ids}")
logger.debug(f"TVDb IDs Found: {show_ids}")
logger.debug(f"{len(movie_ids)} TMDb IDs Found: {movie_ids}")
logger.debug(f"{len(show_ids)} TVDb IDs Found: {show_ids}")
return movie_ids, show_ids
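A minimal sketch of the import-aliasing pattern above: the trakt.py package entry point is bound to TraktAPI so the module can define its own Trakt wrapper class without a name clash; the credential values are placeholders.
# Hypothetical, stripped-down illustration of the Trakt -> TraktAPI rename above.
from trakt import Trakt as TraktAPI

class Trakt:
    def __init__(self, client_id, client_secret):
        # Placeholder credentials; the real values come from the PMM config block.
        TraktAPI.configuration.defaults.client(client_id, client_secret)

connection = Trakt("placeholder_client_id", "placeholder_client_secret")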

@ -74,7 +74,7 @@ class TVDbObj:
self.is_movie = is_movie
self.TVDb = TVDb
class TVDbAPI:
class TVDb:
def __init__(self, config):
self.config = config
self.site_url = "https://www.thetvdb.com"
@ -164,6 +164,6 @@ class TVDbAPI:
else:
raise Failed(f"TVDb Error: Method {method} not supported")
logger.debug("")
logger.debug(f"TMDb IDs Found: {movie_ids}")
logger.debug(f"TVDb IDs Found: {show_ids}")
logger.debug(f"{len(movie_ids)} TMDb IDs Found: {movie_ids}")
logger.debug(f"{len(show_ids)} TVDb IDs Found: {show_ids}")
return movie_ids, show_ids

@ -1,4 +1,4 @@
import logging, re, signal, sys, time, traceback
import logging, os, re, signal, sys, time, traceback
from datetime import datetime
from pathvalidate import is_valid_filename, sanitize_filename
from plexapi.exceptions import BadRequest, NotFound, Unauthorized
@ -18,6 +18,16 @@ class TimeoutExpired(Exception):
class Failed(Exception):
pass
class ImageData:
def __init__(self, attribute, location, prefix="", is_poster=True, is_url=True):
self.attribute = attribute
self.location = location
self.prefix = prefix
self.is_poster = is_poster
self.is_url = is_url
self.compare = location if is_url else os.stat(location).st_size
self.message = f"{prefix}{'poster' if is_poster else 'background'} to [{'URL' if is_url else 'File'}] {location}"
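# A small usage sketch of the ImageData helper defined above, assuming it is importable
# from modules.util; the attribute names, URL, and file path below are placeholders.
from modules.util import ImageData

url_poster = ImageData("url_poster", "https://example.com/poster.png", prefix="Example Collection's ")
file_background = ImageData("file_background", "/config/assets/background.png", is_poster=False, is_url=False)
print(url_poster.message)       # Example Collection's poster to [URL] https://example.com/poster.png
print(file_background.compare)  # file size from os.stat, since is_url is False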
def retry_if_not_failed(exception):
return not isinstance(exception, Failed)
@ -78,6 +88,7 @@ pretty_names = {
"anilist_studio": "AniList Studio",
"anilist_tag": "AniList Tag",
"anilist_top_rated": "AniList Top Rated",
"icheckmovies_list": "I Check Movies List",
"imdb_list": "IMDb List",
"imdb_id": "IMDb ID",
"letterboxd_list": "Letterboxd List",
@ -277,6 +288,7 @@ def unix_input(prompt, timeout=60):
signal.signal(signal.SIGALRM, alarm_handler)
signal.alarm(timeout)
try: return input(prompt)
except EOFError: raise Failed("Input Failed")
finally: signal.alarm(0)
def old_windows_input(prompt, timeout=60, timer=time.monotonic):
@ -349,7 +361,7 @@ def regex_first_int(data, id_type, default=None):
def centered(text, sep=" "):
if len(text) > screen_width - 2:
raise Failed("text must be shorter than screen_width")
return text
space = screen_width - len(text) - 2
text = f" {text} "
if space % 2 == 1:


@ -59,7 +59,7 @@ for time_to_run in times_to_run:
util.separating_character = os.environ.get("PMM_DIVIDER")[0] if os.environ.get("PMM_DIVIDER") else args.divider[0]
screen_width = os.environ.get("PMM_WIDTH") if os.environ.get("PMM_WIDTH") else args.width
screen_width = int(os.environ.get("PMM_WIDTH")) if os.environ.get("PMM_WIDTH") else args.width
if 90 <= screen_width <= 300:
util.screen_width = screen_width
else:
@ -105,7 +105,7 @@ def start(config_path, is_test=False, time_scheduled=None, requested_collections
logger.info(util.centered("| __/| | __/> < | | | | __/ || (_| | | | | | (_| | | | | (_| | (_| | __/ | "))
logger.info(util.centered("|_| |_|\\___/_/\\_\\ |_| |_|\\___|\\__\\__,_| |_| |_|\\__,_|_| |_|\\__,_|\\__, |\\___|_| "))
logger.info(util.centered(" |___/ "))
logger.info(util.centered(" Version: 1.10.0 "))
logger.info(util.centered(" Version: 1.11.0 "))
if time_scheduled: start_type = f"{time_scheduled} "
elif is_test: start_type = "Test "
elif requested_collections: start_type = "Collections "
@ -144,7 +144,7 @@ def update_libraries(config):
logger.info("")
util.separator(f"Mapping {library.name} Library", space=False, border=False)
logger.info("")
library.map_guids(config)
library.map_guids()
if not config.test_mode and not config.resume_from and not collection_only and library.mass_update:
mass_metadata(config, library)
for metadata in library.metadata_files:
@ -189,7 +189,8 @@ def update_libraries(config):
util.separator(f"All {'Movies' if library.is_movie else 'Shows'} Assets Check for {library.name} Library", space=False, border=False)
logger.info("")
for col in unmanaged_collections:
library.update_item_from_assets(col, collection_mode=True)
poster, background = library.find_collection_assets(col)
library.upload_images(col, poster=poster, background=background)
for item in library.get_all():
library.update_item_from_assets(item)
@ -221,7 +222,7 @@ def update_libraries(config):
logger.info("")
util.separator(f"{library.name} Library Run Again")
logger.info("")
library.map_guids(config)
library.map_guids()
for builder in library.run_again:
logger.info("")
util.separator(f"{builder.name} Collection")
@ -248,10 +249,16 @@ def mass_metadata(config, library):
logger.info("")
util.separator(f"Mass Editing {'Movie' if library.is_movie else 'Show'} Library: {library.name}")
logger.info("")
if library.split_duplicates:
items = library.search(**{"duplicate": True})
for item in items:
item.split()
logger.info(util.adjust_space(f"{item.title[:25]:<25} | Splitting"))
radarr_adds = []
sonarr_adds = []
items = library.Plex.all()
items = library.get_all()
for i, item in enumerate(items, 1):
library.reload(item)
util.print_return(f"Processing: {i}/{len(items)} {item.title}")
tmdb_id = None
tvdb_id = None
@ -297,6 +304,9 @@ def mass_metadata(config, library):
omdb_item = config.OMDb.get_omdb(imdb_id)
except Failed as e:
logger.info(util.adjust_space(str(e)))
except Exception:
logger.error(f"IMDb ID: {imdb_id}")
raise
else:
logger.info(util.adjust_space(f"{item.title[:25]:<25} | No IMDb ID for Guid: {item.guid}"))
@ -313,12 +323,18 @@ def mass_metadata(config, library):
raise Failed
item_genres = [genre.tag for genre in item.genres]
display_str = ""
for genre in (g for g in item_genres if g not in new_genres):
library.query_data(item.removeGenre, genre)
display_str += f"{', ' if len(display_str) > 0 else ''}-{genre}"
add_genre = []
for genre in (g for g in new_genres if g not in item_genres):
library.query_data(item.addGenre, genre)
add_genre.append(genre)
display_str += f"{', ' if len(display_str) > 0 else ''}+{genre}"
if len(add_genre) > 0:
library.query_data(item.addGenre, add_genre)
remove_genre = []
for genre in (g for g in item_genres if g not in new_genres):
remove_genre.append(genre)
display_str += f"{', ' if len(display_str) > 0 else ''}-{genre}"
if len(remove_genre) > 0:
library.query_data(item.removeGenre, remove_genre)
if len(display_str) > 0:
logger.info(util.adjust_space(f"{item.title[:25]:<25} | Genres | {display_str}"))
except Failed:
@ -446,7 +462,6 @@ def run_collection(config, library, metadata, requested_collections):
logger.info("")
builder.update_details()
if len(builder.item_details) > 0:
logger.info("")
util.separator(f"Updating Details of the Items in {mapping_name} Collection", space=False, border=False)
logger.info("")
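A simplified, self-contained sketch of the batched genre handling introduced in mass_metadata above: the additions and removals are computed first, and each non-empty batch is then applied with a single call; the sample genre lists are placeholders.
# Hypothetical distillation of the add_genre / remove_genre batching above.
def split_genre_edits(current, wanted):
    add_genre = [g for g in wanted if g not in current]
    remove_genre = [g for g in current if g not in wanted]
    return add_genre, remove_genre

add_genre, remove_genre = split_genre_edits(["Action", "Drama"], ["Action", "Comedy"])
print(add_genre)     # ['Comedy']  -> handed to item.addGenre in one query_data call
print(remove_genre)  # ['Drama']   -> handed to item.removeGenre in one query_data call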

@ -1,8 +1,9 @@
# Remove
# Less common, pinned
PlexAPI==4.5.2
PlexAPI==4.6.1
tmdbv3api==1.7.5
trakt.py==4.3.0
arrapi==1.0.2
# More common, flexible
lxml
requests>=2.4.2
@ -10,3 +11,4 @@ ruamel.yaml
schedule
retrying
pathvalidate
pillow
