diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md
index 28ddae8..4721aee 100644
--- a/.github/ISSUE_TEMPLATE/bug_report.md
+++ b/.github/ISSUE_TEMPLATE/bug_report.md
@@ -1,10 +1,9 @@
---
name: Bug report
about: Create a report to help us improve
-title: ''
-labels: ''
+title: ""
+labels: ""
assignees: Sp5rky
-
---
**Describe the bug**
@@ -12,21 +11,55 @@ A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
-1. Run command uv run [...]
+
+1. Run command `uv run unshackle [...]`
2. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
+**System Details**
+
+- OS: [e.g. Windows 11, Ubuntu 22.04, macOS 14]
+- unshackle Version: [e.g. 1.0.1]
+
+**Dependency Versions** (if relevant)
+
+- Shaka-packager: [e.g. 2.6.1]
+- n_m3u8dl-re: [e.g. 0.3.0-beta]
+- aria2c: [e.g. 1.36.0]
+- ffmpeg: [e.g. 6.0]
+- Other: [e.g. ccextractor, subby]
+
+**Logs/Error Output**
+
+<details>
+<summary>Click to expand logs</summary>
+
+```
+Paste relevant error messages or stack traces here
+```
+
+</details>
+
+**Configuration** (if relevant)
+Please describe relevant configuration settings (DO NOT paste credentials or API keys):
+
+- Downloader used: [e.g. requests, aria2c, n_m3u8dl_re]
+- Proxy provider: [e.g. NordVPN, none]
+- Other relevant config options
+
**Screenshots**
If applicable, add screenshots to help explain your problem.
-**Desktop (please complete the following information):**
- - OS: [e.g. Windows/Unix]
- - Version [e.g. 1.0.1]
- - Shaka-packager Version [e.g. 2.6.1]
- - n_m3u8dl-re Version [e.g. 0.3.0 beta]
- - Any additional software, such as subby/ccextractor/aria2c
-
**Additional context**
-Add any other context about the problem here, if you're reporting issues with services not running or working, please try to expand on where in your service it breaks but don't include service code (unless you have rights to do so.)
+Add any other context about the problem here.
+
+---
+
+**⚠️ Important:**
+
+- **DO NOT include service-specific implementation code** unless you have explicit rights to share it
+- **DO NOT share credentials, API keys, WVD files, or authentication tokens**
+- For service-specific issues, describe the behavior without revealing proprietary implementation details
+- Focus on core framework issues (downloads, DRM, track handling, CLI, configuration, etc.)
diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md
index d5714f3..2f3d79e 100644
--- a/.github/ISSUE_TEMPLATE/feature_request.md
+++ b/.github/ISSUE_TEMPLATE/feature_request.md
@@ -1,21 +1,53 @@
---
name: Feature request
about: Suggest an idea for this project
-title: ''
-labels: ''
+title: ""
+labels: ""
assignees: Sp5rky
-
---
+**Feature Category**
+What area does this feature request relate to?
+
+- [ ] Core framework (downloaders, DRM, track handling)
+- [ ] CLI/commands (new commands or command improvements)
+- [ ] Configuration system
+- [ ] Manifest parsing (DASH, HLS, ISM)
+- [ ] Output/muxing (naming, metadata, tagging)
+- [ ] Proxy system
+- [ ] Key vault system
+- [ ] Documentation
+- [ ] Other (please specify)
+
**Is your feature request related to a problem? Please describe.**
-A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
+A clear and concise description of what the problem is.
+Example: "I'm always frustrated when [...]"
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
-Other tools like Devine/VT had this function [...]
+
+**Reference implementations** (if applicable)
+Have you seen this feature in other tools?
+
+- [ ] Vinetrimmer
+- [ ] yt-dlp
+- [ ] Other: [please specify]
+
+Please describe how it works there (without sharing proprietary code).
+
+**Use case / Impact**
+
+- How would this feature benefit users?
+- How often would you use this feature?
+- Does this solve a common workflow issue?
**Additional context**
Add any other context or screenshots about the feature request here.
+
+---
+
+**⚠️ Note:**
+This project focuses on the core framework and tooling. Service-specific feature requests should focus on what the framework should support, not specific service implementations.
diff --git a/.gitignore b/.gitignore
index 4535362..e5f7d2e 100644
--- a/.gitignore
+++ b/.gitignore
@@ -23,6 +23,9 @@ unshackle/cookies/
unshackle/certs/
unshackle/WVDs/
unshackle/PRDs/
+temp/
+logs/
+services/
Temp/
# Byte-compiled / optimized / DLL files
@@ -215,6 +218,7 @@ cython_debug/
# you could uncomment the following to ignore the entire vscode folder
.vscode/
.github/copilot-instructions.md
+CLAUDE.md
# Ruff stuff:
.ruff_cache/
diff --git a/CHANGELOG.md b/CHANGELOG.md
index abe5c9e..9625999 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -5,6 +5,200 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+## [2.1.0] - 2025-11-27
+
+### Added
+
+- **Per-Track Quality-Based CDM Selection**: Dynamic CDM switching during runtime DRM operations
+ - Enables quality-based CDM selection during runtime DRM switching
+ - Different CDMs can be used for different video quality levels within the same download session
+ - Example: Use Widevine L3 for SD/HD and PlayReady SL3 for 4K content
+- **Enhanced Track Export**: Improved export functionality with additional metadata
+ - Added URL field to track export for easier identification
+ - Added descriptor information in export output
+ - Keys now exported in hex-formatted strings
+
+### Changed
+
+- **Dependencies**: Upgraded to latest compatible versions
+ - Updated various dependencies to their latest versions
+
+### Fixed
+
+- **Attachment Preservation**: Fixed attachments being dropped during track filtering
+ - Attachments (screenshots, fonts) were being lost when track list was rebuilt
+ - Fixes image files remaining in temp directory after muxing
+- **DASH BaseURL Resolution**: Added AdaptationSet-level BaseURL support per DASH spec
+ - URL resolution chain now properly follows: MPD → Period → AdaptationSet → Representation
+- **WindscribeVPN Region Support**: Restricted to supported regions with proper error handling
+ - Added error handling for unsupported regions in get_proxy method
+ - Prevents cryptic errors when using unsupported region codes
+- **Filename Sanitization**: Fixed space-hyphen-space handling in filenames
+ - Pre-process space-hyphen-space patterns (e.g., "Title - Episode") before other replacements
+ - Made space-hyphen-space handling conditional on scene_naming setting
+ - Addresses PR #44 by fixing the root cause
+- **CICP Enum Values**: Corrected values to match ITU-T H.273 specification
+ - Added Primaries.Unspecified (value 2) per H.273 spec
+ - Renamed Primaries/Transfer value 0 from Unspecified to Reserved for spec accuracy
+ - Simplified Transfer value 2 from Unspecified_Image to Unspecified
+ - Verified against ITU-T H.273, ISO/IEC 23091-2, H.264/H.265 specs, and FFmpeg enums
+- **HLS Byte Range Parsing**: Fixed TypeError in range_offset conversion
+ - Converted range_offset to int to prevent TypeError in calculate_byte_range
+- **pyplayready Compatibility**: Pinned to <0.7 to avoid KID extraction bug
+
+## [2.0.0] - 2025-11-10
+
+### Breaking Changes
+
+- **REST API Integration**: Core architecture modified to support REST API functionality
+ - Changes to internal APIs for download management and tracking
+ - Title and track classes updated with API integration points
+ - Core component interfaces modified for queue management support
+- **Configuration Changes**: New required configuration options for API and enhanced features
+  - Added `simkl_client_id`, now required for Simkl functionality
+ - Service-specific configuration override structure introduced
+ - Debug logging configuration options added
+- **Forced Subtitles**: Behavior change for forced subtitle inclusion
+  - Forced subs are no longer auto-included; they require the explicit `--forced-subs` flag
+
+### Added
+
+- **REST API Server**: Complete download management via REST API (early development)
+ - Implemented download queue management and worker system
+ - Added OpenAPI/Swagger documentation for easy API exploration
+ - Included download progress tracking and status endpoints
+ - API authentication and comprehensive error handling
+ - Updated core components to support API integration
+ - Early development work with more changes planned
+- **CustomRemoteCDM**: Highly configurable custom CDM API support
+ - Support for third-party CDM API providers with maximum configurability
+ - Full configuration through YAML without code changes
+ - Addresses GitHub issue #26 for flexible CDM integration
+- **WindscribeVPN Proxy Provider**: New VPN provider support
+ - Added WindscribeVPN following NordVPN and SurfsharkVPN patterns
+ - Fixes GitHub issue #29
+- **Latest Episode Download**: New `--latest-episode` CLI option
+ - `-le, --latest-episode` flag to download only the most recent episode
+ - Automatically selects the single most recent episode regardless of season
+ - Fixes GitHub issue #28
+- **Video Track Exclusion**: New `--no-video` CLI option
+ - `-nv, --no-video` flag to skip downloading video tracks
+ - Allows downloading only audio, subtitles, attachments, and chapters
+ - Useful for audio-only or subtitle extraction workflows
+ - Fixes GitHub issue #39
+- **Service-Specific Configuration Overrides**: Per-service fine-tuned control
+ - Support for per-service configuration overrides in YAML
+ - Fine-tuned control of downloader and command options per service
+ - Fixes GitHub issue #13
+- **Comprehensive JSON Debug Logging**: Structured logging for troubleshooting
+ - Binary toggle via `--debug` flag or `debug: true` in config
+ - JSON Lines (.jsonl) format for easy parsing and analysis
+ - Comprehensive logging of all operations (session info, CLI params, CDM details, auth status, title/track metadata, DRM operations, vault queries)
+ - Configurable key logging via `debug_keys` option with smart redaction
+ - Error logging for all critical operations
+ - Removed old text logging system
+- **curl_cffi Retry Handler**: Enhanced session reliability
+ - Added automatic retry mechanism to curl_cffi Session
+ - Improved download reliability with configurable retries
+- **Simkl API Configuration**: New API key support
+ - Added `simkl_client_id` configuration option
+ - Simkl now requires client_id from https://simkl.com/settings/developer/
+- **Custom Session Fingerprints**: Enhanced browser impersonation capabilities
+ - Added custom fingerprint and preset support for better service compatibility
+ - Configurable fingerprint presets for different device types
+ - Improved success rate with services using advanced bot detection
+- **TMDB and Simkl Metadata Caching**: Enhanced title cache system
+ - Added metadata caching to title cache to reduce API calls
+ - Caches movie/show metadata alongside title information
+ - Improves performance for repeated title lookups and reduces API rate limiting
+- **API Enhancements**: Improved REST API functionality
+ - Added default parameter handling for better request processing
+ - Added URL field to services endpoint response for easier service identification
+  - General API hardening for production readiness
+ - Improved error responses with better detail and debugging information
+
+### Changed
+
+- **Binary Search Enhancement**: Improved binary discovery
+ - Refactored to search for binaries in root of binary folder or subfolder named after the binary
+ - Better organization of binary dependencies
+- **Type Annotations**: Modernized to PEP 604 syntax
+ - Updated session.py type annotations to use modern Python syntax
+ - Improved code readability and type checking
+
+### Fixed
+
+- **Audio Description Track Support**: Added option to download audio description tracks
+ - Added `--audio-description/-ad` flag to optionally include descriptive audio tracks
+ - Previously, audio description tracks were always filtered out
+ - Users can now choose to download AD tracks when needed
+ - Fixes GitHub issue #33
+- **Config Directory Support**: Cross-platform user config directory support
+ - Fixed config loading to properly support user config directories across all platforms
+ - Fixes GitHub issue #23
+- **HYBRID Mode Validation**: Pre-download validation for hybrid processing
+ - Added validation to check both HDR10 and DV tracks are available before download
+ - Prevents wasted downloads when hybrid processing would fail
+- **TMDB/Simkl API Keys**: Graceful handling of missing API keys
+ - Improved error handling when TMDB or Simkl API keys are not configured
+ - Better user messaging for API configuration requirements
+- **Forced Subtitles Behavior**: Correct forced subtitle filtering
+ - Fixed forced subtitles being incorrectly included without `--forced-subs` flag
+ - Forced subs now only included when explicitly requested
+- **Font Attachment Constructor**: Fixed ASS/SSA font attachment
+ - Use keyword arguments for Attachment constructor in font attachment
+ - Fixes "Invalid URL: No scheme supplied" error
+ - Fixes GitHub issue #24
+- **Binary Subdirectory Checking**: Enhanced binary location discovery (by @TPD94, PR #19)
+ - Updated binaries.py to check subdirectories in binaries folders named after the binary
+ - Improved binary detection and loading
+- **HLS Manifest Processing**: Minor HLS parser fix (by @TPD94, PR #19)
+- **lxml and pyplayready**: Updated dependencies (by @Sp5rky)
+ - Updated lxml constraint and pyplayready import path for compatibility
+- **DASH Segment Calculation**: Corrected segment handling
+ - Fixed segment count calculation for DASH manifests with startNumber=0
+ - Ensures accurate segment processing for all DASH manifest configurations
+ - Prevents off-by-one errors in segment downloads
+- **HDR Detection and Naming**: Comprehensive HDR format support
+ - Improved HDR detection with comprehensive transfer characteristics checks
+ - Added hybrid DV+HDR10 support for accurate file naming
+ - Better identification of HDR formats across different streaming services
+ - More accurate HDR/DV detection in filename generation
+- **Subtitle Processing**: VTT subtitle handling improvements
+  - Resolved SDH (Subtitles for the Deaf and Hard of Hearing) stripping crash when processing VTT files
+ - More robust subtitle processing pipeline with better error handling
+ - Fixes crashes when filtering specific VTT subtitle formats
+- **DRM Processing**: Enhanced encoding handling
+ - Added explicit UTF-8 encoding to mp4decrypt subprocess calls
+ - Prevents encoding issues on systems with non-UTF-8 default encodings
+ - Improves cross-platform compatibility for Windows and some Linux configurations
+- **Session Fingerprints**: Updated OkHttp presets
+ - Updated OkHttp fingerprint presets for better Android TV compatibility
+ - Improved success rate with services using fingerprint-based detection
+
+### Documentation
+
+- **GitHub Issue Templates**: Enhanced issue reporting
+ - Improved bug report template with better structure and required fields
+ - Enhanced feature request template for clearer specifications
+ - Added helpful guidance for contributors to provide complete information
+
+### Refactored
+
+- **Import Cleanup**: Removed unused imports
+ - Removed unused mypy import from binaries.py
+ - Fixed import ordering in API download_manager and handlers
+
+### Contributors
+
+This release includes contributions from:
+
+- @Sp5rky - REST API server implementation, dependency updates
+- @stabbedbybrick - curl_cffi retry handler (PR #31)
+- @stabbedbybrick - n_m3u8dl-re refactor (PR #38)
+- @TPD94 - Binary search enhancements, manifest parser fixes (PR #19)
+- @scene (Andy) - Core features, configuration system, bug fixes
+
## [1.4.8] - 2025-10-08
### Added
@@ -179,7 +373,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Fixed
-- **Matroska Tag Compliance**: Enhanced media container compatibility
+- **Matroska Tag Compliance**: Enhanced media container compatibility
- Fixed Matroska tag compliance with official specification
- **Application Branding**: Cleaned up version display
- Removed old devine version reference from banner to avoid developer confusion
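The filename-sanitization fix in the changelog above (pre-processing space-hyphen-space patterns before other replacements, conditional on scene naming) can be sketched as follows. This is an illustrative sketch, not the project's actual implementation; the function name and regexes are hypothetical:

```python
import re


def sanitize_filename(name: str, scene_naming: bool = True) -> str:
    # Pre-process " - " separators first (hypothetical sketch), so
    # "Title - Episode" becomes "Title.Episode" instead of "Title.-.Episode"
    if scene_naming:
        name = re.sub(r"\s+-\s+", " ", name)
    # Replace remaining whitespace runs with dots (scene-style naming)
    name = re.sub(r"\s+", ".", name)
    return name
```

With scene naming disabled, the hyphen is kept as its own dot-separated token, which is the behavior the conditional handling preserves.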
diff --git a/CONFIG.md b/CONFIG.md
index 40aa078..15eef05 100644
--- a/CONFIG.md
+++ b/CONFIG.md
@@ -547,9 +547,12 @@ Configuration data for pywidevine's serve functionality run through unshackle.
This effectively allows you to run `unshackle serve` to start serving pywidevine Serve-compliant CDMs right from your
local widevine device files.
+- `api_secret` - Secret key for REST API authentication. When set, enables the REST API server alongside the CDM serve functionality. This key is required for authenticating API requests.
+
For example,
```yaml
+api_secret: "your-secret-key-here"
users:
secret_key_for_jane: # 32bit hex recommended, case-sensitive
devices: # list of allowed devices for this user
diff --git a/pyproject.toml b/pyproject.toml
index 906dfd8..5c91c9b 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "hatchling.build"
[project]
name = "unshackle"
-version = "1.4.8"
+version = "2.1.0"
description = "Modular Movie, TV, and Music Archival Software."
authors = [{ name = "unshackle team" }]
requires-python = ">=3.10,<3.13"
@@ -31,21 +31,22 @@ dependencies = [
"click>=8.1.8,<9",
"construct>=2.8.8,<3",
"crccheck>=1.3.0,<2",
- "jsonpickle>=3.0.4,<4",
+ "fonttools>=4.0.0,<5",
+ "jsonpickle>=3.0.4,<5",
"langcodes>=3.4.0,<4",
"lxml>=5.2.1,<7",
"pproxy>=2.7.9,<3",
- "protobuf>=4.25.3,<5",
+ "protobuf>=4.25.3,<7",
"pycaption>=2.2.6,<3",
"pycryptodomex>=3.20.0,<4",
"pyjwt>=2.8.0,<3",
- "pymediainfo>=6.1.0,<7",
+ "pymediainfo>=6.1.0,<8",
"pymp4>=1.4.0,<2",
"pymysql>=1.1.0,<2",
"pywidevine[serve]>=1.8.0,<2",
"PyYAML>=6.0.1,<7",
- "requests[socks]>=2.31.0,<3",
- "rich>=13.7.1,<14",
+ "requests[socks]>=2.32.5,<3",
+ "rich>=13.7.1,<15",
"rlaphoenix.m3u8>=3.4.0,<4",
"ruamel.yaml>=0.18.6,<0.19",
"sortedcontainers>=2.4.0,<3",
@@ -53,12 +54,14 @@ dependencies = [
"Unidecode>=1.3.8,<2",
"urllib3>=2.2.1,<3",
"chardet>=5.2.0,<6",
- "curl-cffi>=0.7.0b4,<0.8",
+ "curl-cffi>=0.7.0b4,<0.14",
"pyplayready>=0.6.3,<0.7",
"httpx>=0.28.1,<0.29",
- "cryptography>=45.0.0",
+ "cryptography>=45.0.0,<47",
"subby",
+ "aiohttp-swagger3>=0.9.0,<1",
"pysubs2>=1.7.0,<2",
+ "PyExecJS>=1.5.1,<2",
]
[project.urls]
@@ -73,14 +76,14 @@ unshackle = "unshackle.core.__main__:main"
[dependency-groups]
dev = [
- "pre-commit>=3.7.0,<4",
+ "pre-commit>=3.7.0,<5",
"mypy>=1.9.0,<2",
"mypy-protobuf>=3.6.0,<4",
- "types-protobuf>=4.24.0.20240408,<5",
+ "types-protobuf>=4.24.0.20240408,<7",
"types-PyMySQL>=1.1.0.1,<2",
"types-requests>=2.31.0.20240406,<3",
- "isort>=5.13.2,<6",
- "ruff~=0.3.7",
+ "isort>=5.13.2,<8",
+ "ruff>=0.3.7,<0.15",
"unshackle",
]
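The dl.py hunk that follows applies service-specific `dl` overrides only when the user left an option at its global default. A minimal sketch of that precedence rule (the function and its signature are hypothetical, for illustration only):

```python
def resolve_param(cli_value, global_default, service_override):
    """Return the effective value for one dl option.

    Precedence (highest first): explicit CLI value, then service-level
    override, then global default. A CLI value equal to the global
    default is treated as "not explicitly set by the user".
    """
    if service_override is not None and cli_value == global_default:
        return service_override
    return cli_value
```

So a service override wins over an untouched default, but never over a value the user passed explicitly on the command line.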
diff --git a/unshackle/commands/dl.py b/unshackle/commands/dl.py
index c8d951e..2918167 100644
--- a/unshackle/commands/dl.py
+++ b/unshackle/commands/dl.py
@@ -41,22 +41,24 @@ from rich.text import Text
from rich.tree import Tree
from unshackle.core import binaries
-from unshackle.core.cdm import DecryptLabsRemoteCDM
+from unshackle.core.cdm import CustomRemoteCDM, DecryptLabsRemoteCDM
from unshackle.core.config import config
from unshackle.core.console import console
from unshackle.core.constants import DOWNLOAD_LICENCE_ONLY, AnyTrack, context_settings
from unshackle.core.credential import Credential
from unshackle.core.drm import DRM_T, PlayReady, Widevine
from unshackle.core.events import events
-from unshackle.core.proxies import Basic, Hola, NordVPN, SurfsharkVPN
+from unshackle.core.proxies import Basic, Hola, NordVPN, SurfsharkVPN, WindscribeVPN
from unshackle.core.service import Service
from unshackle.core.services import Services
+from unshackle.core.title_cacher import get_account_hash
from unshackle.core.titles import Movie, Movies, Series, Song, Title_T
from unshackle.core.titles.episode import Episode
from unshackle.core.tracks import Audio, Subtitle, Tracks, Video
from unshackle.core.tracks.attachment import Attachment
from unshackle.core.tracks.hybrid import Hybrid
-from unshackle.core.utilities import get_system_fonts, is_close_match, time_elapsed_since
+from unshackle.core.utilities import (find_font_with_fallbacks, get_debug_logger, get_system_fonts, init_debug_logger,
+ is_close_match, suggest_font_packages, time_elapsed_since)
from unshackle.core.utils import tags
from unshackle.core.utils.click_types import (LANGUAGE_RANGE, QUALITY_LIST, SEASON_RANGE, ContextData, MultipleChoice,
SubtitleCodecChoice, VideoCodecChoice)
@@ -67,7 +69,7 @@ from unshackle.core.vaults import Vaults
class dl:
@staticmethod
- def _truncate_pssh_for_display(pssh_string: str, drm_type: str) -> str:
+ def truncate_pssh_for_display(pssh_string: str, drm_type: str) -> str:
"""Truncate PSSH string for display when not in debug mode."""
if logging.root.level == logging.DEBUG or not pssh_string:
return pssh_string
@@ -78,6 +80,114 @@ class dl:
return pssh_string[: max_width - 3] + "..."
+ def find_custom_font(self, font_name: str) -> Optional[Path]:
+ """
+ Find font in custom fonts directory.
+
+ Args:
+ font_name: Font family name to find
+
+ Returns:
+ Path to font file, or None if not found
+ """
+ family_dir = Path(config.directories.fonts, font_name)
+ if family_dir.exists():
+ fonts = list(family_dir.glob("*.*tf"))
+ return fonts[0] if fonts else None
+ return None
+
+ def prepare_temp_font(
+ self,
+ font_name: str,
+ matched_font: Path,
+ system_fonts: dict[str, Path],
+ temp_font_files: list[Path]
+ ) -> Path:
+ """
+ Copy system font to temp and log if using fallback.
+
+ Args:
+ font_name: Requested font name
+ matched_font: Path to matched system font
+ system_fonts: Dictionary of available system fonts
+ temp_font_files: List to track temp files for cleanup
+
+ Returns:
+ Path to temp font file
+ """
+ # Find the matched name for logging
+ matched_name = next(
+ (name for name, path in system_fonts.items() if path == matched_font),
+ None
+ )
+
+ if matched_name and matched_name.lower() != font_name.lower():
+ self.log.info(f"Using '{matched_name}' as fallback for '{font_name}'")
+
+ # Create unique temp file path
+ safe_name = font_name.replace(" ", "_").replace("/", "_")
+ temp_path = config.directories.temp / f"font_{safe_name}{matched_font.suffix}"
+
+        # Copy only if not already present
+ if not temp_path.exists():
+ shutil.copy2(matched_font, temp_path)
+ temp_font_files.append(temp_path)
+
+ return temp_path
+
+ def attach_subtitle_fonts(
+ self,
+ font_names: list[str],
+ title: Title_T,
+ temp_font_files: list[Path]
+ ) -> tuple[int, list[str]]:
+ """
+ Attach fonts for subtitle rendering.
+
+ Args:
+ font_names: List of font names requested by subtitles
+ title: Title object to attach fonts to
+ temp_font_files: List to track temp files for cleanup
+
+ Returns:
+ Tuple of (fonts_attached_count, missing_fonts_list)
+ """
+ system_fonts = get_system_fonts()
+
+ font_count = 0
+ missing_fonts = []
+
+ for font_name in set(font_names):
+ # Try custom fonts first
+ if custom_font := self.find_custom_font(font_name):
+ title.tracks.add(Attachment(path=custom_font, name=f"{font_name} ({custom_font.stem})"))
+ font_count += 1
+ continue
+
+ # Try system fonts with fallback
+ if system_font := find_font_with_fallbacks(font_name, system_fonts):
+ temp_path = self.prepare_temp_font(font_name, system_font, system_fonts, temp_font_files)
+ title.tracks.add(Attachment(path=temp_path, name=f"{font_name} ({system_font.stem})"))
+ font_count += 1
+ else:
+ self.log.warning(f"Subtitle uses font '{font_name}' but it could not be found")
+ missing_fonts.append(font_name)
+
+ return font_count, missing_fonts
+
+ def suggest_missing_fonts(self, missing_fonts: list[str]) -> None:
+ """
+ Show package installation suggestions for missing fonts.
+
+ Args:
+ missing_fonts: List of font names that couldn't be found
+ """
+ if suggestions := suggest_font_packages(missing_fonts):
+ self.log.info("Install font packages to improve subtitle rendering:")
+ for package_cmd, fonts in suggestions.items():
+ self.log.info(f" $ sudo apt install {package_cmd}")
+ self.log.info(f" → Provides: {', '.join(fonts)}")
+
@click.command(
short_help="Download, Decrypt, and Mux tracks for titles from a Service.",
cls=Services,
@@ -151,6 +261,13 @@ class dl:
default=None,
help="Wanted episodes, e.g. `S01-S05,S07`, `S01E01-S02E03`, `S02-S02E03`, e.t.c, defaults to all.",
)
+ @click.option(
+ "-le",
+ "--latest-episode",
+ is_flag=True,
+ default=False,
+ help="Download only the single most recent episode available.",
+ )
@click.option(
"-l",
"--lang",
@@ -229,6 +346,8 @@ class dl:
@click.option("-ns", "--no-subs", is_flag=True, default=False, help="Do not download subtitle tracks.")
@click.option("-na", "--no-audio", is_flag=True, default=False, help="Do not download audio tracks.")
@click.option("-nc", "--no-chapters", is_flag=True, default=False, help="Do not download chapters tracks.")
+ @click.option("-nv", "--no-video", is_flag=True, default=False, help="Do not download video tracks.")
+ @click.option("-ad", "--audio-description", is_flag=True, default=False, help="Download audio description tracks.")
@click.option(
"--slow",
is_flag=True,
@@ -308,11 +427,77 @@ class dl:
self.log = logging.getLogger("download")
self.service = Services.get_tag(ctx.invoked_subcommand)
+ service_dl_config = config.services.get(self.service, {}).get("dl", {})
+ if service_dl_config:
+ param_types = {param.name: param.type for param in ctx.command.params if param.name}
+
+ for param_name, service_value in service_dl_config.items():
+ if param_name not in ctx.params:
+ continue
+
+ current_value = ctx.params[param_name]
+ global_default = config.dl.get(param_name)
+ param_type = param_types.get(param_name)
+
+ try:
+ if param_type and global_default is not None:
+ global_default = param_type.convert(global_default, None, ctx)
+ except Exception as e:
+ self.log.debug(f"Failed to convert global default for '{param_name}': {e}")
+
+ if current_value == global_default or (current_value is None and global_default is None):
+ try:
+ converted_value = service_value
+ if param_type and service_value is not None:
+ converted_value = param_type.convert(service_value, None, ctx)
+
+ ctx.params[param_name] = converted_value
+ self.log.debug(f"Applied service-specific '{param_name}' override: {converted_value}")
+ except Exception as e:
+ self.log.warning(
+ f"Failed to apply service-specific '{param_name}' override: {e}. "
+ f"Check that the value '{service_value}' is valid for this parameter."
+ )
+
self.profile = profile
self.tmdb_id = tmdb_id
self.tmdb_name = tmdb_name
self.tmdb_year = tmdb_year
+ # Initialize debug logger with service name if debug logging is enabled
+ if config.debug or logging.root.level == logging.DEBUG:
+ from collections import defaultdict
+ from datetime import datetime
+
+ debug_log_path = config.directories.logs / config.filenames.debug_log.format_map(
+ defaultdict(str, service=self.service, time=datetime.now().strftime("%Y%m%d-%H%M%S"))
+ )
+ init_debug_logger(log_path=debug_log_path, enabled=True, log_keys=config.debug_keys)
+ self.debug_logger = get_debug_logger()
+
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="INFO",
+ operation="download_init",
+ message=f"Download command initialized for service {self.service}",
+ service=self.service,
+ context={
+ "profile": profile,
+ "proxy": proxy,
+ "tag": tag,
+ "tmdb_id": tmdb_id,
+ "tmdb_name": tmdb_name,
+ "tmdb_year": tmdb_year,
+ "cli_params": {
+ k: v
+ for k, v in ctx.params.items()
+ if k not in ["profile", "proxy", "tag", "tmdb_id", "tmdb_name", "tmdb_year"]
+ },
+ },
+ )
+ else:
+ self.debug_logger = None
+
if self.profile:
self.log.info(f"Using profile: '{self.profile}'")
@@ -321,6 +506,13 @@ class dl:
if service_config_path.exists():
self.service_config = yaml.safe_load(service_config_path.read_text(encoding="utf8"))
self.log.info("Service Config loaded")
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="DEBUG",
+ operation="load_service_config",
+ service=self.service,
+ context={"config_path": str(service_config_path), "config": self.service_config},
+ )
else:
self.service_config = {}
merge_dict(config.services.get(self.service), self.service_config)
@@ -331,71 +523,134 @@ class dl:
if getattr(config, "decryption_map", None):
config.decryption = config.decryption_map.get(self.service, config.decryption)
- with console.status("Loading Key Vaults...", spinner="dots"):
+ service_config = config.services.get(self.service, {})
+ if service_config:
+ reserved_keys = {
+ "profiles",
+ "api_key",
+ "certificate",
+ "api_endpoint",
+ "region",
+ "device",
+ "endpoints",
+ "client",
+ "dl",
+ }
+
+ for config_key, override_value in service_config.items():
+ if config_key in reserved_keys or not isinstance(override_value, dict):
+ continue
+
+ if hasattr(config, config_key):
+ current_config = getattr(config, config_key, {})
+
+ if isinstance(current_config, dict):
+ merged_config = deepcopy(current_config)
+ merge_dict(override_value, merged_config)
+ setattr(config, config_key, merged_config)
+
+ self.log.debug(
+ f"Applied service-specific '{config_key}' overrides for {self.service}: {override_value}"
+ )
+
+ cdm_only = ctx.params.get("cdm_only")
+
+ if cdm_only:
self.vaults = Vaults(self.service)
- total_vaults = len(config.key_vaults)
- failed_vaults = []
+ self.log.info("CDM-only mode: Skipping vault loading")
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="INFO",
+ operation="vault_loading_skipped",
+ service=self.service,
+ context={"reason": "cdm_only flag set"},
+ )
+ else:
+ with console.status("Loading Key Vaults...", spinner="dots"):
+ self.vaults = Vaults(self.service)
+ total_vaults = len(config.key_vaults)
+ failed_vaults = []
- for vault in config.key_vaults:
- vault_type = vault["type"]
- vault_name = vault.get("name", vault_type)
- vault_copy = vault.copy()
- del vault_copy["type"]
+ for vault in config.key_vaults:
+ vault_type = vault["type"]
+ vault_name = vault.get("name", vault_type)
+ vault_copy = vault.copy()
+ del vault_copy["type"]
- if vault_type.lower() == "api" and "decrypt_labs" in vault_name.lower():
- if "token" not in vault_copy or not vault_copy["token"]:
- if config.decrypt_labs_api_key:
- vault_copy["token"] = config.decrypt_labs_api_key
- else:
- self.log.warning(
- f"No token provided for DecryptLabs vault '{vault_name}' and no global "
- "decrypt_labs_api_key configured"
- )
+ if vault_type.lower() == "api" and "decrypt_labs" in vault_name.lower():
+ if "token" not in vault_copy or not vault_copy["token"]:
+ if config.decrypt_labs_api_key:
+ vault_copy["token"] = config.decrypt_labs_api_key
+ else:
+ self.log.warning(
+ f"No token provided for DecryptLabs vault '{vault_name}' and no global "
+ "decrypt_labs_api_key configured"
+ )
- if vault_type.lower() == "sqlite":
- try:
- self.vaults.load_critical(vault_type, **vault_copy)
- self.log.debug(f"Successfully loaded vault: {vault_name} ({vault_type})")
- except Exception as e:
- self.log.error(f"vault failure: {vault_name} ({vault_type}) - {e}")
- raise
- else:
- # Other vaults (MySQL, HTTP, API) - soft fail
- if not self.vaults.load(vault_type, **vault_copy):
- failed_vaults.append(vault_name)
- self.log.debug(f"Failed to load vault: {vault_name} ({vault_type})")
+ if vault_type.lower() == "sqlite":
+ try:
+ self.vaults.load_critical(vault_type, **vault_copy)
+ self.log.debug(f"Successfully loaded vault: {vault_name} ({vault_type})")
+ except Exception as e:
+ self.log.error(f"vault failure: {vault_name} ({vault_type}) - {e}")
+ raise
else:
- self.log.debug(f"Successfully loaded vault: {vault_name} ({vault_type})")
+ # Other vaults (MySQL, HTTP, API) - soft fail
+ if not self.vaults.load(vault_type, **vault_copy):
+ failed_vaults.append(vault_name)
+ self.log.debug(f"Failed to load vault: {vault_name} ({vault_type})")
+ else:
+ self.log.debug(f"Successfully loaded vault: {vault_name} ({vault_type})")
- loaded_count = len(self.vaults)
- if failed_vaults:
- self.log.warning(f"Failed to load {len(failed_vaults)} vault(s): {', '.join(failed_vaults)}")
- self.log.info(f"Loaded {loaded_count}/{total_vaults} Vaults")
+ loaded_count = len(self.vaults)
+ if failed_vaults:
+ self.log.warning(f"Failed to load {len(failed_vaults)} vault(s): {', '.join(failed_vaults)}")
+ self.log.info(f"Loaded {loaded_count}/{total_vaults} Vaults")
- # Debug: Show detailed vault status
- if loaded_count > 0:
- vault_names = [vault.name for vault in self.vaults]
- self.log.debug(f"Active vaults: {', '.join(vault_names)}")
- else:
- self.log.debug("No vaults are currently active")
+ # Debug: Show detailed vault status
+ if loaded_count > 0:
+ vault_names = [vault.name for vault in self.vaults]
+ self.log.debug(f"Active vaults: {', '.join(vault_names)}")
+ else:
+ self.log.debug("No vaults are currently active")
with console.status("Loading DRM CDM...", spinner="dots"):
try:
self.cdm = self.get_cdm(self.service, self.profile)
except ValueError as e:
self.log.error(f"Failed to load CDM, {e}")
+ if self.debug_logger:
+ self.debug_logger.log_error("load_cdm", e, service=self.service)
sys.exit(1)
if self.cdm:
+ cdm_info = {}
if isinstance(self.cdm, DecryptLabsRemoteCDM):
drm_type = "PlayReady" if self.cdm.is_playready else "Widevine"
self.log.info(f"Loaded {drm_type} Remote CDM: DecryptLabs (L{self.cdm.security_level})")
+ cdm_info = {"type": "DecryptLabs", "drm_type": drm_type, "security_level": self.cdm.security_level}
elif hasattr(self.cdm, "device_type") and self.cdm.device_type.name in ["ANDROID", "CHROME"]:
self.log.info(f"Loaded Widevine CDM: {self.cdm.system_id} (L{self.cdm.security_level})")
+ cdm_info = {
+ "type": "Widevine",
+ "system_id": self.cdm.system_id,
+ "security_level": self.cdm.security_level,
+ "device_type": self.cdm.device_type.name,
+ }
else:
self.log.info(
f"Loaded PlayReady CDM: {self.cdm.certificate_chain.get_name()} (L{self.cdm.security_level})"
)
+ cdm_info = {
+ "type": "PlayReady",
+ "certificate": self.cdm.certificate_chain.get_name(),
+ "security_level": self.cdm.security_level,
+ }
+
+ if self.debug_logger and cdm_info:
+ self.debug_logger.log(
+ level="INFO", operation="load_cdm", service=self.service, context={"cdm": cdm_info}
+ )
self.proxy_providers = []
if no_proxy:
@@ -408,6 +663,8 @@ class dl:
self.proxy_providers.append(NordVPN(**config.proxy_providers["nordvpn"]))
if config.proxy_providers.get("surfsharkvpn"):
self.proxy_providers.append(SurfsharkVPN(**config.proxy_providers["surfsharkvpn"]))
+ if config.proxy_providers.get("windscribevpn"):
+ self.proxy_providers.append(WindscribeVPN(**config.proxy_providers["windscribevpn"]))
if binaries.HolaProxy:
self.proxy_providers.append(Hola())
for proxy_provider in self.proxy_providers:
@@ -468,6 +725,7 @@ class dl:
channels: float,
no_atmos: bool,
wanted: list[str],
+ latest_episode: bool,
lang: list[str],
v_lang: list[str],
a_lang: list[str],
@@ -483,6 +741,8 @@ class dl:
no_subs: bool,
no_audio: bool,
no_chapters: bool,
+ no_video: bool,
+ audio_description: bool,
slow: bool,
list_: bool,
list_titles: bool,
@@ -521,29 +781,117 @@ class dl:
else:
vaults_only = not cdm_only
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="DEBUG",
+ operation="drm_mode_config",
+ service=self.service,
+ context={
+ "cdm_only": cdm_only,
+ "vaults_only": vaults_only,
+ "mode": "CDM only" if cdm_only else ("Vaults only" if vaults_only else "Both CDM and Vaults"),
+ },
+ )
+
with console.status("Authenticating with Service...", spinner="dots"):
- cookies = self.get_cookie_jar(self.service, self.profile)
- credential = self.get_credentials(self.service, self.profile)
- service.authenticate(cookies, credential)
- if cookies or credential:
- self.log.info("Authenticated with Service")
+ try:
+ cookies = self.get_cookie_jar(self.service, self.profile)
+ credential = self.get_credentials(self.service, self.profile)
+ service.authenticate(cookies, credential)
+ if cookies or credential:
+ self.log.info("Authenticated with Service")
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="INFO",
+ operation="authenticate",
+ service=self.service,
+ context={
+ "has_cookies": bool(cookies),
+ "has_credentials": bool(credential),
+ "profile": self.profile,
+ },
+ )
+ except Exception as e:
+ if self.debug_logger:
+ self.debug_logger.log_error(
+ "authenticate", e, service=self.service, context={"profile": self.profile}
+ )
+ raise
with console.status("Fetching Title Metadata...", spinner="dots"):
- titles = service.get_titles_cached()
- if not titles:
- self.log.error("No titles returned, nothing to download...")
- sys.exit(1)
+ try:
+ titles = service.get_titles_cached()
+ if not titles:
+ self.log.error("No titles returned, nothing to download...")
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="ERROR",
+ operation="get_titles",
+ service=self.service,
+ message="No titles returned from service",
+ success=False,
+ )
+ sys.exit(1)
+ except Exception as e:
+ if self.debug_logger:
+ self.debug_logger.log_error("get_titles", e, service=self.service)
+ raise
- if self.tmdb_year and self.tmdb_id:
+ if self.debug_logger:
+ titles_info = {
+ "type": titles.__class__.__name__,
+ "count": len(titles) if hasattr(titles, "__len__") else 1,
+ "title": str(titles),
+ }
+ if hasattr(titles, "seasons"):
+ titles_info["seasons"] = len(titles.seasons)
+ self.debug_logger.log(
+ level="INFO", operation="get_titles", service=self.service, context={"titles": titles_info}
+ )
+
+ title_cacher = getattr(service, "title_cache", None)
+ cache_title_id = None
+ if hasattr(service, "title"):
+ cache_title_id = service.title
+ elif hasattr(service, "title_id"):
+ cache_title_id = service.title_id
+ cache_region = getattr(service, "current_region", None)
+ cache_account_hash = get_account_hash(service.credential) if hasattr(service, "credential") else None
+
+ if (self.tmdb_year or self.tmdb_name) and self.tmdb_id:
sample_title = titles[0] if hasattr(titles, "__getitem__") else titles
kind = "tv" if isinstance(sample_title, Episode) else "movie"
- tmdb_year_val = tags.get_year(self.tmdb_id, kind)
- if tmdb_year_val:
- if isinstance(titles, (Series, Movies)):
- for t in titles:
+
+ tmdb_year_val = None
+ tmdb_name_val = None
+
+ if self.tmdb_year:
+ tmdb_year_val = tags.get_year(
+ self.tmdb_id, kind, title_cacher, cache_title_id, cache_region, cache_account_hash
+ )
+
+ if self.tmdb_name:
+ tmdb_name_val = tags.get_title(
+ self.tmdb_id, kind, title_cacher, cache_title_id, cache_region, cache_account_hash
+ )
+
+ if isinstance(titles, (Series, Movies)):
+ for t in titles:
+ if tmdb_year_val:
t.year = tmdb_year_val
- else:
+ if tmdb_name_val:
+ if isinstance(t, Episode):
+ t.title = tmdb_name_val
+ else:
+ t.name = tmdb_name_val
+ else:
+ if tmdb_year_val:
titles.year = tmdb_year_val
+ if tmdb_name_val:
+ if isinstance(titles, Episode):
+ titles.title = tmdb_name_val
+ else:
+ titles.name = tmdb_name_val
console.print(Padding(Rule(f"[rule.text]{titles.__class__.__name__}: {titles}"), (1, 2)))
@@ -551,18 +899,36 @@ class dl:
if list_titles:
return
+ # Determine the latest episode if --latest-episode is set
+ latest_episode_id = None
+ if latest_episode and isinstance(titles, Series) and len(titles) > 0:
+ # Series is already sorted by (season, number, year)
+ # The last episode in the sorted list is the latest
+ latest_ep = titles[-1]
+ latest_episode_id = f"{latest_ep.season}x{latest_ep.number}"
+ self.log.info(f"Latest episode mode: Selecting S{latest_ep.season:02}E{latest_ep.number:02}")
+
for i, title in enumerate(titles):
- if isinstance(title, Episode) and wanted and f"{title.season}x{title.number}" not in wanted:
+ if isinstance(title, Episode) and latest_episode and latest_episode_id:
+ # If --latest-episode is set, only process the latest episode
+ if f"{title.season}x{title.number}" != latest_episode_id:
+ continue
+ elif isinstance(title, Episode) and wanted and f"{title.season}x{title.number}" not in wanted:
continue
console.print(Padding(Rule(f"[rule.text]{title}"), (1, 2)))
+ temp_font_files = []
if isinstance(title, Episode) and not self.tmdb_searched:
kind = "tv"
if self.tmdb_id:
- tmdb_title = tags.get_title(self.tmdb_id, kind)
+ tmdb_title = tags.get_title(
+ self.tmdb_id, kind, title_cacher, cache_title_id, cache_region, cache_account_hash
+ )
else:
- self.tmdb_id, tmdb_title, self.search_source = tags.search_show_info(title.title, title.year, kind)
+ self.tmdb_id, tmdb_title, self.search_source = tags.search_show_info(
+ title.title, title.year, kind, title_cacher, cache_title_id, cache_region, cache_account_hash
+ )
if not (self.tmdb_id and tmdb_title and tags.fuzzy_match(tmdb_title, title.title)):
self.tmdb_id = None
if list_ or list_titles:
@@ -578,7 +944,9 @@ class dl:
self.tmdb_searched = True
if isinstance(title, Movie) and (list_ or list_titles) and not self.tmdb_id:
- movie_id, movie_title, _ = tags.search_show_info(title.name, title.year, "movie")
+ movie_id, movie_title, _ = tags.search_show_info(
+ title.name, title.year, "movie", title_cacher, cache_title_id, cache_region, cache_account_hash
+ )
if movie_id:
console.print(
Padding(
@@ -591,11 +959,7 @@ class dl:
if self.tmdb_id and getattr(self, "search_source", None) != "simkl":
kind = "tv" if isinstance(title, Episode) else "movie"
- tags.external_ids(self.tmdb_id, kind)
- if self.tmdb_year:
- tmdb_year_val = tags.get_year(self.tmdb_id, kind)
- if tmdb_year_val:
- title.year = tmdb_year_val
+ tags.external_ids(self.tmdb_id, kind, title_cacher, cache_title_id, cache_region, cache_account_hash)
if slow and i != 0:
delay = random.randint(60, 120)
@@ -620,26 +984,83 @@ class dl:
s_lang = None
title.tracks.subtitles = []
+ if no_video:
+ console.log("Skipped video as --no-video was used...")
+ v_lang = None
+ title.tracks.videos = []
+
with console.status("Getting tracks...", spinner="dots"):
- title.tracks.add(service.get_tracks(title), warn_only=True)
- title.tracks.chapters = service.get_chapters(title)
+ try:
+ title.tracks.add(service.get_tracks(title), warn_only=True)
+ title.tracks.chapters = service.get_chapters(title)
+ except Exception as e:
+ if self.debug_logger:
+ self.debug_logger.log_error(
+ "get_tracks", e, service=self.service, context={"title": str(title)}
+ )
+ raise
+
+ if self.debug_logger:
+ tracks_info = {
+ "title": str(title),
+ "video_tracks": len(title.tracks.videos),
+ "audio_tracks": len(title.tracks.audio),
+ "subtitle_tracks": len(title.tracks.subtitles),
+ "has_chapters": bool(title.tracks.chapters),
+ "videos": [
+ {
+ "codec": str(v.codec),
+ "resolution": f"{v.width}x{v.height}" if v.width and v.height else "unknown",
+ "bitrate": v.bitrate,
+ "range": str(v.range),
+ "language": str(v.language) if v.language else None,
+ "drm": [str(type(d).__name__) for d in v.drm] if v.drm else [],
+ }
+ for v in title.tracks.videos
+ ],
+ "audio": [
+ {
+ "codec": str(a.codec),
+ "bitrate": a.bitrate,
+ "channels": a.channels,
+ "language": str(a.language) if a.language else None,
+ "descriptive": a.descriptive,
+ "drm": [str(type(d).__name__) for d in a.drm] if a.drm else [],
+ }
+ for a in title.tracks.audio
+ ],
+ "subtitles": [
+ {
+ "codec": str(s.codec),
+ "language": str(s.language) if s.language else None,
+ "forced": s.forced,
+ "sdh": s.sdh,
+ }
+ for s in title.tracks.subtitles
+ ],
+ }
+ self.debug_logger.log(
+ level="INFO", operation="get_tracks", service=self.service, context=tracks_info
+ )
# strip SDH subs to non-SDH if no equivalent same-lang non-SDH is available
# uses a loose check, e.g., won't strip an en-US SDH sub if a non-SDH en-GB is available
- for subtitle in title.tracks.subtitles:
- if subtitle.sdh and not any(
- is_close_match(subtitle.language, [x.language])
- for x in title.tracks.subtitles
- if not x.sdh and not x.forced
- ):
- non_sdh_sub = deepcopy(subtitle)
- non_sdh_sub.id += "_stripped"
- non_sdh_sub.sdh = False
- title.tracks.add(non_sdh_sub)
- events.subscribe(
- events.Types.TRACK_MULTIPLEX,
- lambda track: (track.strip_hearing_impaired()) if track.id == non_sdh_sub.id else None,
- )
+ # Check if automatic SDH stripping is enabled in config
+ if config.subtitle.get("strip_sdh", True):
+ for subtitle in title.tracks.subtitles:
+ if subtitle.sdh and not any(
+ is_close_match(subtitle.language, [x.language])
+ for x in title.tracks.subtitles
+ if not x.sdh and not x.forced
+ ):
+ non_sdh_sub = deepcopy(subtitle)
+ non_sdh_sub.id += "_stripped"
+ non_sdh_sub.sdh = False
+ title.tracks.add(non_sdh_sub)
+ events.subscribe(
+ events.Types.TRACK_MULTIPLEX,
+ lambda track, sub_id=non_sdh_sub.id: (track.strip_hearing_impaired()) if track.id == sub_id else None,
+ )
with console.status("Sorting tracks by language and bitrate...", spinner="dots"):
video_sort_lang = v_lang or lang
@@ -788,6 +1209,29 @@ class dl:
selected_videos.append(match)
title.tracks.videos = selected_videos
+ # validate hybrid mode requirements
+ if any(r == Video.Range.HYBRID for r in range_):
+ hdr10_tracks = [v for v in title.tracks.videos if v.range == Video.Range.HDR10]
+ dv_tracks = [v for v in title.tracks.videos if v.range == Video.Range.DV]
+
+ if not hdr10_tracks or not dv_tracks:
+ available_ranges = sorted(set(v.range.name for v in title.tracks.videos))
+ if hdr10_tracks:
+ detail = "only HDR10 is available"
+ elif dv_tracks:
+ detail = "only DV is available"
+ else:
+ detail = "neither is available"
+ self.log.error(f"HYBRID mode requires both HDR10 and DV tracks, but {detail}")
+ self.log.error(f"Available ranges: {', '.join(available_ranges) if available_ranges else 'none'}")
+ sys.exit(1)
+
# filter subtitle tracks
if require_subs:
missing_langs = [
@@ -828,7 +1272,8 @@ class dl:
# filter audio tracks
# might have no audio tracks if part of the video, e.g. transport stream hls
if len(title.tracks.audio) > 0:
- title.tracks.select_audio(lambda x: not x.descriptive) # exclude descriptive audio
+ if not audio_description:
+ title.tracks.select_audio(lambda x: not x.descriptive) # exclude descriptive audio
if acodec:
title.tracks.select_audio(lambda x: x.codec == acodec)
if not title.tracks.audio:
@@ -887,7 +1332,7 @@ class dl:
self.log.error(f"There's no {processed_lang} Audio Track, cannot continue...")
sys.exit(1)
- if video_only or audio_only or subs_only or chapters_only or no_subs or no_audio or no_chapters:
+ if video_only or audio_only or subs_only or chapters_only or no_subs or no_audio or no_chapters or no_video:
keep_videos = False
keep_audio = False
keep_subtitles = False
@@ -914,6 +1359,8 @@ class dl:
keep_audio = False
if no_chapters:
keep_chapters = False
+ if no_video:
+ keep_videos = False
kept_tracks = []
if keep_videos:
@@ -924,6 +1371,7 @@ class dl:
kept_tracks.extend(title.tracks.subtitles)
if keep_chapters:
kept_tracks.extend(title.tracks.chapters)
+ kept_tracks.extend(title.tracks.attachments)
title.tracks = Tracks(kept_tracks)
@@ -1010,8 +1458,17 @@ class dl:
)
):
download.result()
+
except KeyboardInterrupt:
console.print(Padding(":x: Download Cancelled...", (0, 5, 1, 5)))
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="WARNING",
+ operation="download_tracks",
+ service=self.service,
+ message="Download cancelled by user",
+ context={"title": str(title)},
+ )
return
except Exception as e: # noqa
error_messages = [
@@ -1034,6 +1491,19 @@ class dl:
# CalledProcessError already lists the exception trace
console.print_exception()
console.print(Padding(Group(*error_messages), (1, 5)))
+
+ if self.debug_logger:
+ self.debug_logger.log_error(
+ "download_tracks",
+ e,
+ service=self.service,
+ context={
+ "title": str(title),
+ "error_type": type(e).__name__,
+ "tracks_count": len(title.tracks),
+ "returncode": getattr(e, "returncode", None),
+ },
+ )
return
if skip_dl:
@@ -1049,6 +1519,7 @@ class dl:
and not no_subs
and not (hasattr(service, "NO_SUBTITLES") and service.NO_SUBTITLES)
and not video_only
+ and not no_video
and len(title.tracks.videos) > video_track_n
and any(
x.get("codec_name", "").startswith("eia_")
@@ -1101,26 +1572,16 @@ class dl:
if line.startswith("Style: "):
font_names.append(line.removeprefix("Style: ").split(",")[1])
- font_count = 0
- system_fonts = get_system_fonts()
- for font_name in set(font_names):
- family_dir = Path(config.directories.fonts, font_name)
- fonts_from_system = [file for name, file in system_fonts.items() if name.startswith(font_name)]
- if family_dir.exists():
- fonts = family_dir.glob("*.*tf")
- for font in fonts:
- title.tracks.add(Attachment(path=font, name=f"{font_name} ({font.stem})"))
- font_count += 1
- elif fonts_from_system:
- for font in fonts_from_system:
- title.tracks.add(Attachment(path=font, name=f"{font_name} ({font.stem})"))
- font_count += 1
- else:
- self.log.warning(f"Subtitle uses font [text2]{font_name}[/] but it could not be found...")
+ font_count, missing_fonts = self.attach_subtitle_fonts(
+ font_names, title, temp_font_files
+ )
if font_count:
self.log.info(f"Attached {font_count} fonts for the Subtitles")
+ if missing_fonts and sys.platform != "win32":
+ self.suggest_missing_fonts(missing_fonts)
+
# Handle DRM decryption BEFORE repacking (must decrypt first!)
service_name = service.__class__.__name__.upper()
decryption_method = config.decryption_map.get(service_name, config.decryption)
@@ -1274,8 +1735,17 @@ class dl:
video_track.delete()
for track in title.tracks:
track.delete()
+
+ # Clear temp font attachment paths and delete other attachments
for attachment in title.tracks.attachments:
- attachment.delete()
+ if attachment.path and attachment.path in temp_font_files:
+ attachment.path = None
+ else:
+ attachment.delete()
+
+ # Clean up temp fonts
+ for temp_path in temp_font_files:
+ temp_path.unlink(missing_ok=True)
else:
# dont mux
@@ -1286,9 +1756,13 @@ class dl:
final_dir = config.directories.downloads
if not no_folder and isinstance(title, (Episode, Song)):
# Use first available track for filename generation
- sample_track = title.tracks.videos[0] if title.tracks.videos else (
- title.tracks.audio[0] if title.tracks.audio else (
- title.tracks.subtitles[0] if title.tracks.subtitles else None
+ sample_track = (
+ title.tracks.videos[0]
+ if title.tracks.videos
+ else (
+ title.tracks.audio[0]
+ if title.tracks.audio
+ else (title.tracks.subtitles[0] if title.tracks.subtitles else None)
)
)
if sample_track and sample_track.path:
@@ -1315,7 +1789,9 @@ class dl:
track_suffix = f".{track.codec.name if hasattr(track.codec, 'name') else 'video'}"
elif isinstance(track, Audio):
lang_suffix = f".{track.language}" if track.language else ""
- track_suffix = f"{lang_suffix}.{track.codec.name if hasattr(track.codec, 'name') else 'audio'}"
+ track_suffix = (
+ f"{lang_suffix}.{track.codec.name if hasattr(track.codec, 'name') else 'audio'}"
+ )
elif isinstance(track, Subtitle):
lang_suffix = f".{track.language}" if track.language else ""
forced_suffix = ".forced" if track.forced else ""
@@ -1382,30 +1858,51 @@ class dl:
if not drm:
return
+ track_quality = None
if isinstance(track, Video) and track.height:
- pass
+ track_quality = track.height
if isinstance(drm, Widevine):
if not isinstance(self.cdm, (WidevineCdm, DecryptLabsRemoteCDM)) or (
isinstance(self.cdm, DecryptLabsRemoteCDM) and self.cdm.is_playready
):
- widevine_cdm = self.get_cdm(self.service, self.profile, drm="widevine")
+ widevine_cdm = self.get_cdm(self.service, self.profile, drm="widevine", quality=track_quality)
if widevine_cdm:
- self.log.info("Switching to Widevine CDM for Widevine content")
+ if track_quality:
+ self.log.info(f"Switching to Widevine CDM for Widevine {track_quality}p content")
+ else:
+ self.log.info("Switching to Widevine CDM for Widevine content")
self.cdm = widevine_cdm
elif isinstance(drm, PlayReady):
if not isinstance(self.cdm, (PlayReadyCdm, DecryptLabsRemoteCDM)) or (
isinstance(self.cdm, DecryptLabsRemoteCDM) and not self.cdm.is_playready
):
- playready_cdm = self.get_cdm(self.service, self.profile, drm="playready")
+ playready_cdm = self.get_cdm(self.service, self.profile, drm="playready", quality=track_quality)
if playready_cdm:
- self.log.info("Switching to PlayReady CDM for PlayReady content")
+ if track_quality:
+ self.log.info(f"Switching to PlayReady CDM for PlayReady {track_quality}p content")
+ else:
+ self.log.info("Switching to PlayReady CDM for PlayReady content")
self.cdm = playready_cdm
if isinstance(drm, Widevine):
+ if self.debug_logger:
+ self.debug_logger.log_drm_operation(
+ drm_type="Widevine",
+ operation="prepare_drm",
+ service=self.service,
+ context={
+ "track": str(track),
+ "title": str(title),
+ "pssh": drm.pssh.dumps() if drm.pssh else None,
+ "kids": [k.hex for k in drm.kids],
+ "track_kid": track_kid.hex if track_kid else None,
+ },
+ )
+
with self.DRM_TABLE_LOCK:
- pssh_display = self._truncate_pssh_for_display(drm.pssh.dumps(), "Widevine")
+ pssh_display = self.truncate_pssh_for_display(drm.pssh.dumps(), "Widevine")
cek_tree = Tree(Text.assemble(("Widevine", "cyan"), (f"({pssh_display})", "text"), overflow="fold"))
pre_existing_tree = next(
(x for x in table.columns[0].cells if isinstance(x, Tree) and x.label == cek_tree.label), None
@@ -1432,11 +1929,32 @@ class dl:
if not any(f"{kid.hex}:{content_key}" in x.label for x in cek_tree.children):
cek_tree.add(label)
self.vaults.add_key(kid, content_key, excluding=vault_used)
+
+ if self.debug_logger:
+ self.debug_logger.log_vault_query(
+ vault_name=vault_used,
+ operation="get_key_success",
+ service=self.service,
+ context={
+ "kid": kid.hex,
+ "content_key": content_key,
+ "track": str(track),
+ "from_cache": True,
+ },
+ )
elif vaults_only:
msg = f"No Vault has a Key for {kid.hex} and --vaults-only was used"
cek_tree.add(f"[logging.level.error]{msg}")
if not pre_existing_tree:
table.add_row(cek_tree)
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="ERROR",
+ operation="vault_key_not_found",
+ service=self.service,
+ message=msg,
+ context={"kid": kid.hex, "track": str(track)},
+ )
raise Widevine.Exceptions.CEKNotFound(msg)
else:
need_license = True
@@ -1447,6 +1965,18 @@ class dl:
if need_license and not vaults_only:
from_vaults = drm.content_keys.copy()
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="INFO",
+ operation="get_license",
+ service=self.service,
+ message="Requesting Widevine license from service",
+ context={
+ "track": str(track),
+ "kids_needed": [k.hex for k in all_kids if k not in drm.content_keys],
+ },
+ )
+
try:
if self.service == "NF":
drm.get_NF_content_keys(cdm=self.cdm, licence=licence, certificate=certificate)
@@ -1460,8 +1990,27 @@ class dl:
cek_tree.add(f"[logging.level.error]{msg}")
if not pre_existing_tree:
table.add_row(cek_tree)
+ if self.debug_logger:
+ self.debug_logger.log_error(
+ "get_license",
+ e,
+ service=self.service,
+ context={"track": str(track), "exception_type": type(e).__name__},
+ )
raise e
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="INFO",
+ operation="license_keys_retrieved",
+ service=self.service,
+ context={
+ "track": str(track),
+ "keys_count": len(drm.content_keys),
+ "kids": [k.hex for k in drm.content_keys.keys()],
+ },
+ )
+
for kid_, key in drm.content_keys.items():
if key == "0" * 32:
key = f"[red]{key}[/]"
@@ -1498,17 +2047,40 @@ class dl:
if export:
keys = {}
if export.is_file():
- keys = jsonpickle.loads(export.read_text(encoding="utf8"))
+ keys = jsonpickle.loads(export.read_text(encoding="utf8")) or {}
if str(title) not in keys:
keys[str(title)] = {}
if str(track) not in keys[str(title)]:
keys[str(title)][str(track)] = {}
- keys[str(title)][str(track)].update(drm.content_keys)
+
+ track_data = keys[str(title)][str(track)]
+ track_data["url"] = track.url
+ track_data["descriptor"] = track.descriptor.name
+
+ if "keys" not in track_data:
+ track_data["keys"] = {}
+ for kid, key in drm.content_keys.items():
+ track_data["keys"][kid.hex] = key
+
export.write_text(jsonpickle.dumps(keys, indent=4), encoding="utf8")
elif isinstance(drm, PlayReady):
+ if self.debug_logger:
+ self.debug_logger.log_drm_operation(
+ drm_type="PlayReady",
+ operation="prepare_drm",
+ service=self.service,
+ context={
+ "track": str(track),
+ "title": str(title),
+ "pssh": drm.pssh_b64 or "",
+ "kids": [k.hex for k in drm.kids],
+ "track_kid": track_kid.hex if track_kid else None,
+ },
+ )
+
with self.DRM_TABLE_LOCK:
- pssh_display = self._truncate_pssh_for_display(drm.pssh_b64 or "", "PlayReady")
+ pssh_display = self.truncate_pssh_for_display(drm.pssh_b64 or "", "PlayReady")
cek_tree = Tree(
Text.assemble(
("PlayReady", "cyan"),
@@ -1541,11 +2113,33 @@ class dl:
if not any(f"{kid.hex}:{content_key}" in x.label for x in cek_tree.children):
cek_tree.add(label)
self.vaults.add_key(kid, content_key, excluding=vault_used)
+
+ if self.debug_logger:
+ self.debug_logger.log_vault_query(
+ vault_name=vault_used,
+ operation="get_key_success",
+ service=self.service,
+ context={
+ "kid": kid.hex,
+ "content_key": content_key,
+ "track": str(track),
+ "from_cache": True,
+ "drm_type": "PlayReady",
+ },
+ )
elif vaults_only:
msg = f"No Vault has a Key for {kid.hex} and --vaults-only was used"
cek_tree.add(f"[logging.level.error]{msg}")
if not pre_existing_tree:
table.add_row(cek_tree)
+ if self.debug_logger:
+ self.debug_logger.log(
+ level="ERROR",
+ operation="vault_key_not_found",
+ service=self.service,
+ message=msg,
+ context={"kid": kid.hex, "track": str(track), "drm_type": "PlayReady"},
+ )
raise PlayReady.Exceptions.CEKNotFound(msg)
else:
need_license = True
@@ -1566,6 +2160,17 @@ class dl:
cek_tree.add(f"[logging.level.error]{msg}")
if not pre_existing_tree:
table.add_row(cek_tree)
+ if self.debug_logger:
+ self.debug_logger.log_error(
+ "get_license_playready",
+ e,
+ service=self.service,
+ context={
+ "track": str(track),
+ "exception_type": type(e).__name__,
+ "drm_type": "PlayReady",
+ },
+ )
raise e
for kid_, key in drm.content_keys.items():
@@ -1596,12 +2201,21 @@ class dl:
if export:
keys = {}
if export.is_file():
- keys = jsonpickle.loads(export.read_text(encoding="utf8"))
+ keys = jsonpickle.loads(export.read_text(encoding="utf8")) or {}
if str(title) not in keys:
keys[str(title)] = {}
if str(track) not in keys[str(title)]:
keys[str(title)][str(track)] = {}
- keys[str(title)][str(track)].update(drm.content_keys)
+
+ track_data = keys[str(title)][str(track)]
+ track_data["url"] = track.url
+ track_data["descriptor"] = track.descriptor.name
+
+ if "keys" not in track_data:
+ track_data["keys"] = {}
+ for kid, key in drm.content_keys.items():
+ track_data["keys"][kid.hex] = key
+
export.write_text(jsonpickle.dumps(keys, indent=4), encoding="utf8")
@staticmethod
@@ -1640,7 +2254,7 @@ class dl:
@staticmethod
def save_cookies(path: Path, cookies: CookieJar):
- if hasattr(cookies, 'jar'):
+ if hasattr(cookies, "jar"):
cookies = cookies.jar
cookie_jar = MozillaCookieJar(path)
@@ -1760,8 +2374,9 @@ class dl:
cdm_api = next(iter(x.copy() for x in config.remote_cdm if x["name"] == cdm_name), None)
if cdm_api:
- is_decrypt_lab = True if cdm_api.get("type") == "decrypt_labs" else False
- if is_decrypt_lab:
+ cdm_type = cdm_api.get("type")
+
+ if cdm_type == "decrypt_labs":
del cdm_api["name"]
del cdm_api["type"]
@@ -1776,14 +2391,22 @@ class dl:
# All DecryptLabs CDMs use DecryptLabsRemoteCDM
return DecryptLabsRemoteCDM(service_name=service, vaults=self.vaults, **cdm_api)
+
+ elif cdm_type == "custom_api":
+ del cdm_api["name"]
+ del cdm_api["type"]
+
+ # All Custom API CDMs use CustomRemoteCDM
+ return CustomRemoteCDM(service_name=service, vaults=self.vaults, **cdm_api)
+
else:
return RemoteCdm(
- device_type=cdm_api['Device Type'],
- system_id=cdm_api['System ID'],
- security_level=cdm_api['Security Level'],
- host=cdm_api['Host'],
- secret=cdm_api['Secret'],
- device_name=cdm_api['Device Name'],
+ device_type=cdm_api["Device Type"],
+ system_id=cdm_api["System ID"],
+ security_level=cdm_api["Security Level"],
+ host=cdm_api["Host"],
+ secret=cdm_api["Secret"],
+ device_name=cdm_api["Device Name"],
)
prd_path = config.directories.prds / f"{cdm_name}.prd"
diff --git a/unshackle/commands/kv.py b/unshackle/commands/kv.py
index 035f7f7..28c870d 100644
--- a/unshackle/commands/kv.py
+++ b/unshackle/commands/kv.py
@@ -12,7 +12,7 @@ from unshackle.core.vault import Vault
from unshackle.core.vaults import Vaults
-def _load_vaults(vault_names: list[str]) -> Vaults:
+def load_vaults(vault_names: list[str]) -> Vaults:
"""Load and validate vaults by name."""
vaults = Vaults()
for vault_name in vault_names:
@@ -30,7 +30,7 @@ def _load_vaults(vault_names: list[str]) -> Vaults:
return vaults
-def _process_service_keys(from_vault: Vault, service: str, log: logging.Logger) -> dict[str, str]:
+def process_service_keys(from_vault: Vault, service: str, log: logging.Logger) -> dict[str, str]:
"""Get and validate keys from a vault for a specific service."""
content_keys = list(from_vault.get_keys(service))
@@ -41,9 +41,9 @@ def _process_service_keys(from_vault: Vault, service: str, log: logging.Logger)
return {kid: key for kid, key in content_keys if kid not in bad_keys}
-def _copy_service_data(to_vault: Vault, from_vault: Vault, service: str, log: logging.Logger) -> int:
+def copy_service_data(to_vault: Vault, from_vault: Vault, service: str, log: logging.Logger) -> int:
"""Copy data for a single service between vaults."""
- content_keys = _process_service_keys(from_vault, service, log)
+ content_keys = process_service_keys(from_vault, service, log)
total_count = len(content_keys)
if total_count == 0:
@@ -95,7 +95,7 @@ def copy(to_vault_name: str, from_vault_names: list[str], service: Optional[str]
log = logging.getLogger("kv")
all_vault_names = [to_vault_name] + list(from_vault_names)
- vaults = _load_vaults(all_vault_names)
+ vaults = load_vaults(all_vault_names)
to_vault = vaults.vaults[0]
from_vaults = vaults.vaults[1:]
@@ -112,7 +112,7 @@ def copy(to_vault_name: str, from_vault_names: list[str], service: Optional[str]
services_to_copy = [service] if service else from_vault.get_services()
for service_tag in services_to_copy:
- added = _copy_service_data(to_vault, from_vault, service_tag, log)
+ added = copy_service_data(to_vault, from_vault, service_tag, log)
total_added += added
if total_added > 0:
@@ -164,7 +164,7 @@ def add(file: Path, service: str, vaults: list[str]) -> None:
log = logging.getLogger("kv")
service = Services.get_tag(service)
- vaults_ = _load_vaults(list(vaults))
+ vaults_ = load_vaults(list(vaults))
data = file.read_text(encoding="utf8")
kid_keys: dict[str, str] = {}
@@ -194,7 +194,7 @@ def prepare(vaults: list[str]) -> None:
"""Create Service Tables on Vaults if not yet created."""
log = logging.getLogger("kv")
- vaults_ = _load_vaults(vaults)
+ vaults_ = load_vaults(vaults)
for vault in vaults_:
if hasattr(vault, "has_table") and hasattr(vault, "create_table"):
diff --git a/unshackle/commands/serve.py b/unshackle/commands/serve.py
index 85c9739..a28d633 100644
--- a/unshackle/commands/serve.py
+++ b/unshackle/commands/serve.py
@@ -1,19 +1,32 @@
+import logging
import subprocess
import click
+from aiohttp import web
from unshackle.core import binaries
+from unshackle.core.api import cors_middleware, setup_routes, setup_swagger
from unshackle.core.config import config
from unshackle.core.constants import context_settings
-@click.command(short_help="Serve your Local Widevine Devices for Remote Access.", context_settings=context_settings)
+@click.command(
+ short_help="Serve your Local Widevine Devices and REST API for Remote Access.", context_settings=context_settings
+)
@click.option("-h", "--host", type=str, default="0.0.0.0", help="Host to serve from.")
@click.option("-p", "--port", type=int, default=8786, help="Port to serve from.")
@click.option("--caddy", is_flag=True, default=False, help="Also serve with Caddy.")
-def serve(host: str, port: int, caddy: bool) -> None:
+@click.option("--api-only", is_flag=True, default=False, help="Serve only the REST API, not pywidevine CDM.")
+@click.option("--no-key", is_flag=True, default=False, help="Disable API key authentication (allows all requests).")
+@click.option(
+ "--debug-api",
+ is_flag=True,
+ default=False,
+ help="Include technical debug information (tracebacks, stderr) in API error responses.",
+)
+def serve(host: str, port: int, caddy: bool, api_only: bool, no_key: bool, debug_api: bool) -> None:
"""
- Serve your Local Widevine Devices for Remote Access.
+ Serve your Local Widevine Devices and REST API for Remote Access.
\b
Host as 127.0.0.1 may block remote access even if port-forwarded.
@@ -23,8 +36,28 @@ def serve(host: str, port: int, caddy: bool) -> None:
You may serve with Caddy at the same time with --caddy. You can use Caddy
as a reverse-proxy to serve with HTTPS. The config used will be the Caddyfile
next to the unshackle config.
+
+ \b
+ The REST API provides programmatic access to unshackle functionality.
+ Configure authentication in your config under serve.users and serve.api_secret.
"""
- from pywidevine import serve
+ from pywidevine import serve as pywidevine_serve
+
+ log = logging.getLogger("serve")
+
+ # Validate API secret for REST API routes (unless --no-key is used)
+ if not no_key:
+ api_secret = config.serve.get("api_secret")
+ if not api_secret:
+ raise click.ClickException(
+ "API secret key is not configured. Please add 'api_secret' to the 'serve' section in your config."
+ )
+ else:
+ api_secret = None
+ log.warning("Running with --no-key: Authentication is DISABLED for all API endpoints!")
+
+ if debug_api:
+ log.warning("Running with --debug-api: Error responses will include technical debug information!")
if caddy:
if not binaries.Caddy:
@@ -39,7 +72,53 @@ def serve(host: str, port: int, caddy: bool) -> None:
if not config.serve.get("devices"):
config.serve["devices"] = []
config.serve["devices"].extend(list(config.directories.wvds.glob("*.wvd")))
- serve.run(config.serve, host, port)
+
+ if api_only:
+ # API-only mode: serve just the REST API
+ log.info("Starting REST API server (pywidevine CDM disabled)")
+ if no_key:
+ app = web.Application(middlewares=[cors_middleware])
+ app["config"] = {"users": []}
+ else:
+ app = web.Application(middlewares=[cors_middleware, pywidevine_serve.authentication])
+ app["config"] = {"users": [api_secret]}
+ app["debug_api"] = debug_api
+ setup_routes(app)
+ setup_swagger(app)
+ log.info(f"REST API endpoints available at http://{host}:{port}/api/")
+ log.info(f"Swagger UI available at http://{host}:{port}/api/docs/")
+ log.info("(Press CTRL+C to quit)")
+ web.run_app(app, host=host, port=port, print=None)
+ else:
+ # Integrated mode: serve both pywidevine + REST API
+ log.info("Starting integrated server (pywidevine CDM + REST API)")
+
+ # Create integrated app with both pywidevine and API routes
+ if no_key:
+ app = web.Application(middlewares=[cors_middleware])
+ app["config"] = dict(config.serve)
+ app["config"]["users"] = []
+ else:
+ app = web.Application(middlewares=[cors_middleware, pywidevine_serve.authentication])
+ # Setup config - add API secret to users for authentication
+ serve_config = dict(config.serve)
+ if not serve_config.get("users"):
+ serve_config["users"] = []
+ if api_secret not in serve_config["users"]:
+ serve_config["users"].append(api_secret)
+ app["config"] = serve_config
+
+ app.on_startup.append(pywidevine_serve._startup)
+ app.on_cleanup.append(pywidevine_serve._cleanup)
+ app.add_routes(pywidevine_serve.routes)
+ app["debug_api"] = debug_api
+ setup_routes(app)
+ setup_swagger(app)
+
+ log.info(f"REST API endpoints available at http://{host}:{port}/api/")
+ log.info(f"Swagger UI available at http://{host}:{port}/api/docs/")
+ log.info("(Press CTRL+C to quit)")
+ web.run_app(app, host=host, port=port, print=None)
finally:
if caddy_p:
caddy_p.kill()
diff --git a/unshackle/core/__init__.py b/unshackle/core/__init__.py
index 4963389..9aa3f90 100644
--- a/unshackle/core/__init__.py
+++ b/unshackle/core/__init__.py
@@ -1 +1 @@
-__version__ = "1.4.8"
+__version__ = "2.1.0"
diff --git a/unshackle/core/__main__.py b/unshackle/core/__main__.py
index e4717fa..6cf2fac 100644
--- a/unshackle/core/__main__.py
+++ b/unshackle/core/__main__.py
@@ -1,6 +1,5 @@
import atexit
import logging
-from pathlib import Path
import click
import urllib3
@@ -16,23 +15,16 @@ from unshackle.core.config import config
from unshackle.core.console import ComfyRichHandler, console
from unshackle.core.constants import context_settings
from unshackle.core.update_checker import UpdateChecker
-from unshackle.core.utilities import rotate_log_file
-
-LOGGING_PATH = None
+from unshackle.core.utilities import close_debug_logger, init_debug_logger
@click.command(cls=Commands, invoke_without_command=True, context_settings=context_settings)
@click.option("-v", "--version", is_flag=True, default=False, help="Print version information.")
-@click.option("-d", "--debug", is_flag=True, default=False, help="Enable DEBUG level logs.")
-@click.option(
- "--log",
- "log_path",
- type=Path,
- default=config.directories.logs / config.filenames.log,
- help="Log path (or filename). Path can contain the following f-string args: {name} {time}.",
-)
-def main(version: bool, debug: bool, log_path: Path) -> None:
+@click.option("-d", "--debug", is_flag=True, default=False, help="Enable DEBUG level logs and JSON debug logging.")
+def main(version: bool, debug: bool) -> None:
"""unshackle—Modular Movie, TV, and Music Archival Software."""
+ debug_logging_enabled = debug or config.debug
+
logging.basicConfig(
level=logging.DEBUG if debug else logging.INFO,
format="%(message)s",
@@ -48,11 +40,8 @@ def main(version: bool, debug: bool, log_path: Path) -> None:
],
)
- if log_path:
- global LOGGING_PATH
- console.record = True
- new_log_path = rotate_log_file(log_path)
- LOGGING_PATH = new_log_path
+ if debug_logging_enabled:
+ init_debug_logger(enabled=True)
urllib3.disable_warnings(InsecureRequestWarning)
@@ -98,10 +87,9 @@ def main(version: bool, debug: bool, log_path: Path) -> None:
@atexit.register
-def save_log():
- if console.record and LOGGING_PATH:
- # TODO: Currently semi-bust. Everything that refreshes gets duplicated.
- console.save_text(LOGGING_PATH)
+def cleanup():
+ """Clean up resources on exit."""
+ close_debug_logger()
if __name__ == "__main__":
diff --git a/unshackle/core/api/__init__.py b/unshackle/core/api/__init__.py
new file mode 100644
index 0000000..8369876
--- /dev/null
+++ b/unshackle/core/api/__init__.py
@@ -0,0 +1,3 @@
+from unshackle.core.api.routes import cors_middleware, setup_routes, setup_swagger
+
+__all__ = ["setup_routes", "setup_swagger", "cors_middleware"]
diff --git a/unshackle/core/api/download_manager.py b/unshackle/core/api/download_manager.py
new file mode 100644
index 0000000..2f45d44
--- /dev/null
+++ b/unshackle/core/api/download_manager.py
@@ -0,0 +1,660 @@
+import asyncio
+import json
+import logging
+import os
+import sys
+import tempfile
+import threading
+import uuid
+from contextlib import suppress
+from dataclasses import dataclass, field
+from datetime import datetime, timedelta
+from enum import Enum
+from typing import Any, Callable, Dict, List, Optional
+
+log = logging.getLogger("download_manager")
+
+
+class JobStatus(Enum):
+ QUEUED = "queued"
+ DOWNLOADING = "downloading"
+ COMPLETED = "completed"
+ FAILED = "failed"
+ CANCELLED = "cancelled"
+
+
+@dataclass
+class DownloadJob:
+ """Represents a download job with all its parameters and status."""
+
+ job_id: str
+ status: JobStatus
+ created_time: datetime
+ service: str
+ title_id: str
+ parameters: Dict[str, Any]
+
+ # Progress tracking
+ started_time: Optional[datetime] = None
+ completed_time: Optional[datetime] = None
+ progress: float = 0.0
+
+ # Results and error info
+ output_files: List[str] = field(default_factory=list)
+ error_message: Optional[str] = None
+ error_details: Optional[str] = None
+ error_code: Optional[str] = None
+ error_traceback: Optional[str] = None
+ worker_stderr: Optional[str] = None
+
+ # Cancellation support
+ cancel_event: threading.Event = field(default_factory=threading.Event)
+
+ def to_dict(self, include_full_details: bool = False) -> Dict[str, Any]:
+ """Convert job to dictionary for JSON response."""
+ result = {
+ "job_id": self.job_id,
+ "status": self.status.value,
+ "created_time": self.created_time.isoformat(),
+ "service": self.service,
+ "title_id": self.title_id,
+ "progress": self.progress,
+ }
+
+ if include_full_details:
+ result.update(
+ {
+ "parameters": self.parameters,
+ "started_time": self.started_time.isoformat() if self.started_time else None,
+ "completed_time": self.completed_time.isoformat() if self.completed_time else None,
+ "output_files": self.output_files,
+ "error_message": self.error_message,
+ "error_details": self.error_details,
+ "error_code": self.error_code,
+ "error_traceback": self.error_traceback,
+ "worker_stderr": self.worker_stderr,
+ }
+ )
+
+ return result
+
+
+def _perform_download(
+ job_id: str,
+ service: str,
+ title_id: str,
+ params: Dict[str, Any],
+ cancel_event: Optional[threading.Event] = None,
+ progress_callback: Optional[Callable[[Dict[str, Any]], None]] = None,
+) -> List[str]:
+ """Execute the synchronous download logic for a job."""
+
+ def _check_cancel(stage: str):
+ if cancel_event and cancel_event.is_set():
+ raise Exception(f"Job was cancelled {stage}")
+
+ from contextlib import redirect_stderr, redirect_stdout
+ from io import StringIO
+
+ _check_cancel("before execution started")
+
+ # Import dl.py components lazily to avoid circular deps during module import
+ import click
+ import yaml
+
+ from unshackle.commands.dl import dl
+ from unshackle.core.config import config
+ from unshackle.core.services import Services
+ from unshackle.core.utils.click_types import ContextData
+ from unshackle.core.utils.collections import merge_dict
+
+ log.info(f"Starting sync download for job {job_id}")
+
+ # Load service configuration
+ service_config_path = Services.get_path(service) / config.filenames.config
+ if service_config_path.exists():
+ service_config = yaml.safe_load(service_config_path.read_text(encoding="utf8"))
+ else:
+ service_config = {}
+ merge_dict(config.services.get(service), service_config)
+
+    # `dl` (imported above) doubles as the click command group
+
+    ctx = click.Context(dl.cli)
+ ctx.invoked_subcommand = service
+ ctx.obj = ContextData(config=service_config, cdm=None, proxy_providers=[], profile=params.get("profile"))
+ ctx.params = {
+ "proxy": params.get("proxy"),
+ "no_proxy": params.get("no_proxy", False),
+ "profile": params.get("profile"),
+ "tag": params.get("tag"),
+ "tmdb_id": params.get("tmdb_id"),
+ "tmdb_name": params.get("tmdb_name", False),
+ "tmdb_year": params.get("tmdb_year", False),
+ }
+
+ dl_instance = dl(
+ ctx=ctx,
+ no_proxy=params.get("no_proxy", False),
+ profile=params.get("profile"),
+ proxy=params.get("proxy"),
+ tag=params.get("tag"),
+ tmdb_id=params.get("tmdb_id"),
+ tmdb_name=params.get("tmdb_name", False),
+ tmdb_year=params.get("tmdb_year", False),
+ )
+
+ service_module = Services.load(service)
+
+ _check_cancel("before service instantiation")
+
+ try:
+ import inspect
+
+ service_init_params = inspect.signature(service_module.__init__).parameters
+
+ service_ctx = click.Context(click.Command(service))
+ service_ctx.parent = ctx
+ service_ctx.obj = ctx.obj
+
+ service_kwargs = {}
+
+ if "title" in service_init_params:
+ service_kwargs["title"] = title_id
+
+ for key, value in params.items():
+ if key in service_init_params and key not in ["service", "title_id"]:
+ service_kwargs[key] = value
+
+ for param_name, param_info in service_init_params.items():
+ if param_name not in service_kwargs and param_name not in ["self", "ctx"]:
+ if param_info.default is inspect.Parameter.empty:
+ if param_name == "movie":
+ service_kwargs[param_name] = "/movies/" in title_id
+ elif param_name == "meta_lang":
+ service_kwargs[param_name] = None
+ else:
+ log.warning(f"Unknown required parameter '{param_name}' for service {service}, using None")
+ service_kwargs[param_name] = None
+
+ service_instance = service_module(service_ctx, **service_kwargs)
+
+ except Exception as exc: # noqa: BLE001 - propagate meaningful failure
+ log.error(f"Failed to create service instance: {exc}")
+ raise
+
+ original_download_dir = config.directories.downloads
+
+ _check_cancel("before download execution")
+
+ stdout_capture = StringIO()
+ stderr_capture = StringIO()
+
+ # Simple progress tracking if callback provided
+ if progress_callback:
+ # Report initial progress
+ progress_callback({"progress": 0.0, "status": "starting"})
+
+ # Simple approach: report progress at key points
+ original_result = dl_instance.result
+
+ def result_with_progress(*args, **kwargs):
+ try:
+ # Report that download started
+ progress_callback({"progress": 5.0, "status": "downloading"})
+
+ # Call original method
+ result = original_result(*args, **kwargs)
+
+ # Report completion
+ progress_callback({"progress": 100.0, "status": "completed"})
+ return result
+ except Exception as e:
+ progress_callback({"progress": 0.0, "status": "failed", "error": str(e)})
+ raise
+
+ dl_instance.result = result_with_progress
+
+ try:
+ with redirect_stdout(stdout_capture), redirect_stderr(stderr_capture):
+ dl_instance.result(
+ service=service_instance,
+ quality=params.get("quality", []),
+ vcodec=params.get("vcodec"),
+ acodec=params.get("acodec"),
+ vbitrate=params.get("vbitrate"),
+ abitrate=params.get("abitrate"),
+ range_=params.get("range", ["SDR"]),
+ channels=params.get("channels"),
+ no_atmos=params.get("no_atmos", False),
+ wanted=params.get("wanted", []),
+ latest_episode=params.get("latest_episode", False),
+ lang=params.get("lang", ["orig"]),
+ v_lang=params.get("v_lang", []),
+ a_lang=params.get("a_lang", []),
+ s_lang=params.get("s_lang", ["all"]),
+ require_subs=params.get("require_subs", []),
+ forced_subs=params.get("forced_subs", False),
+ exact_lang=params.get("exact_lang", False),
+ sub_format=params.get("sub_format"),
+ video_only=params.get("video_only", False),
+ audio_only=params.get("audio_only", False),
+ subs_only=params.get("subs_only", False),
+ chapters_only=params.get("chapters_only", False),
+ no_subs=params.get("no_subs", False),
+ no_audio=params.get("no_audio", False),
+ no_chapters=params.get("no_chapters", False),
+ audio_description=params.get("audio_description", False),
+ slow=params.get("slow", False),
+ list_=False,
+ list_titles=False,
+ skip_dl=params.get("skip_dl", False),
+ export=params.get("export"),
+ cdm_only=params.get("cdm_only"),
+ no_proxy=params.get("no_proxy", False),
+ no_folder=params.get("no_folder", False),
+ no_source=params.get("no_source", False),
+ no_mux=params.get("no_mux", False),
+ workers=params.get("workers"),
+ downloads=params.get("downloads", 1),
+ best_available=params.get("best_available", False),
+ )
+
+ except SystemExit as exc:
+ if exc.code != 0:
+ stdout_str = stdout_capture.getvalue()
+ stderr_str = stderr_capture.getvalue()
+ log.error(f"Download exited with code {exc.code}")
+ log.error(f"Stdout: {stdout_str}")
+ log.error(f"Stderr: {stderr_str}")
+ raise Exception(f"Download failed with exit code {exc.code}")
+
+ except Exception as exc: # noqa: BLE001 - propagate to caller
+ stdout_str = stdout_capture.getvalue()
+ stderr_str = stderr_capture.getvalue()
+ log.error(f"Download execution failed: {exc}")
+ log.error(f"Stdout: {stdout_str}")
+ log.error(f"Stderr: {stderr_str}")
+ raise
+
+ log.info(f"Download completed for job {job_id}, files in {original_download_dir}")
+
+ return []
+
+
+class DownloadQueueManager:
+ """Manages download job queue with configurable concurrency limits."""
+
+ def __init__(self, max_concurrent_downloads: int = 2, job_retention_hours: int = 24):
+ self.max_concurrent_downloads = max_concurrent_downloads
+ self.job_retention_hours = job_retention_hours
+
+ self._jobs: Dict[str, DownloadJob] = {}
+ self._job_queue: asyncio.Queue = asyncio.Queue()
+ self._active_downloads: Dict[str, asyncio.Task] = {}
+ self._download_processes: Dict[str, asyncio.subprocess.Process] = {}
+ self._job_temp_files: Dict[str, Dict[str, str]] = {}
+ self._workers_started = False
+ self._shutdown_event = asyncio.Event()
+
+ log.info(
+ f"Initialized download queue manager: max_concurrent={max_concurrent_downloads}, retention_hours={job_retention_hours}"
+ )
+
+ def create_job(self, service: str, title_id: str, **parameters) -> DownloadJob:
+ """Create a new download job and add it to the queue."""
+ job_id = str(uuid.uuid4())
+ job = DownloadJob(
+ job_id=job_id,
+ status=JobStatus.QUEUED,
+ created_time=datetime.now(),
+ service=service,
+ title_id=title_id,
+ parameters=parameters,
+ )
+
+ self._jobs[job_id] = job
+ self._job_queue.put_nowait(job)
+
+ log.info(f"Created download job {job_id} for {service}:{title_id}")
+ return job
+
+ def get_job(self, job_id: str) -> Optional[DownloadJob]:
+ """Get job by ID."""
+ return self._jobs.get(job_id)
+
+ def list_jobs(self) -> List[DownloadJob]:
+ """List all jobs."""
+ return list(self._jobs.values())
+
+ def cancel_job(self, job_id: str) -> bool:
+ """Cancel a job if it's queued or downloading."""
+ job = self._jobs.get(job_id)
+ if not job:
+ return False
+
+ if job.status == JobStatus.QUEUED:
+ job.status = JobStatus.CANCELLED
+ job.cancel_event.set() # Signal cancellation
+ log.info(f"Cancelled queued job {job_id}")
+ return True
+ elif job.status == JobStatus.DOWNLOADING:
+ # Set the cancellation event first - this will be checked by the download thread
+ job.cancel_event.set()
+ job.status = JobStatus.CANCELLED
+ log.info(f"Signaled cancellation for downloading job {job_id}")
+
+ # Cancel the active download task
+ task = self._active_downloads.get(job_id)
+ if task:
+ task.cancel()
+ log.info(f"Cancelled download task for job {job_id}")
+
+ process = self._download_processes.get(job_id)
+ if process:
+ try:
+ process.terminate()
+ log.info(f"Terminated worker process for job {job_id}")
+ except ProcessLookupError:
+ log.debug(f"Worker process for job {job_id} already exited")
+
+ return True
+
+ return False
+
+ def cleanup_old_jobs(self) -> int:
+ """Remove jobs older than retention period."""
+ cutoff_time = datetime.now() - timedelta(hours=self.job_retention_hours)
+ jobs_to_remove = []
+
+ for job_id, job in self._jobs.items():
+ if job.status in [JobStatus.COMPLETED, JobStatus.FAILED, JobStatus.CANCELLED]:
+ if job.completed_time and job.completed_time < cutoff_time:
+ jobs_to_remove.append(job_id)
+ elif not job.completed_time and job.created_time < cutoff_time:
+ jobs_to_remove.append(job_id)
+
+ for job_id in jobs_to_remove:
+ del self._jobs[job_id]
+
+ if jobs_to_remove:
+ log.info(f"Cleaned up {len(jobs_to_remove)} old jobs")
+
+ return len(jobs_to_remove)
+
+ async def start_workers(self):
+ """Start worker tasks to process the download queue."""
+ if self._workers_started:
+ return
+
+ self._workers_started = True
+
+        # Start worker tasks (keep references so they are not garbage-collected mid-run)
+        self._worker_tasks = [
+            asyncio.create_task(self._download_worker(f"worker-{i}")) for i in range(self.max_concurrent_downloads)
+        ]
+        # Start cleanup task
+        self._worker_tasks.append(asyncio.create_task(self._cleanup_worker()))
+
+ log.info(f"Started {self.max_concurrent_downloads} download workers")
+
+ async def shutdown(self):
+ """Shutdown the queue manager and cancel all active downloads."""
+ log.info("Shutting down download queue manager")
+ self._shutdown_event.set()
+
+ # Cancel all active downloads
+ for task in self._active_downloads.values():
+ task.cancel()
+
+ # Terminate worker processes
+ for job_id, process in list(self._download_processes.items()):
+ try:
+ process.terminate()
+ except ProcessLookupError:
+ log.debug(f"Worker process for job {job_id} already exited during shutdown")
+
+ for job_id, process in list(self._download_processes.items()):
+ try:
+ await asyncio.wait_for(process.wait(), timeout=5)
+ except asyncio.TimeoutError:
+ log.warning(f"Worker process for job {job_id} did not exit, killing")
+ process.kill()
+ await process.wait()
+ finally:
+ self._download_processes.pop(job_id, None)
+
+ # Clean up any remaining temp files
+ for paths in self._job_temp_files.values():
+ for path in paths.values():
+ try:
+ os.remove(path)
+ except OSError:
+ pass
+ self._job_temp_files.clear()
+
+ # Wait for workers to finish
+ if self._active_downloads:
+ await asyncio.gather(*self._active_downloads.values(), return_exceptions=True)
+
+ async def _download_worker(self, worker_name: str):
+ """Worker task that processes jobs from the queue."""
+ log.debug(f"Download worker {worker_name} started")
+
+ while not self._shutdown_event.is_set():
+ try:
+ # Wait for a job or shutdown signal
+ job = await asyncio.wait_for(self._job_queue.get(), timeout=1.0)
+
+ if job.status == JobStatus.CANCELLED:
+ continue
+
+ # Start processing the job
+ job.status = JobStatus.DOWNLOADING
+ job.started_time = datetime.now()
+
+ log.info(f"Worker {worker_name} starting job {job.job_id}")
+
+ # Create download task
+ download_task = asyncio.create_task(self._execute_download(job))
+ self._active_downloads[job.job_id] = download_task
+
+ try:
+ await download_task
+ except asyncio.CancelledError:
+ job.status = JobStatus.CANCELLED
+ log.info(f"Job {job.job_id} was cancelled")
+ except Exception as e:
+ job.status = JobStatus.FAILED
+ job.error_message = str(e)
+ log.error(f"Job {job.job_id} failed: {e}")
+ finally:
+ job.completed_time = datetime.now()
+ if job.job_id in self._active_downloads:
+ del self._active_downloads[job.job_id]
+
+ except asyncio.TimeoutError:
+ continue
+ except Exception as e:
+ log.error(f"Worker {worker_name} error: {e}")
+
+ async def _execute_download(self, job: DownloadJob):
+ """Execute the actual download for a job."""
+ log.info(f"Executing download for job {job.job_id}")
+
+ try:
+ output_files = await self._run_download_async(job)
+ job.status = JobStatus.COMPLETED
+ job.output_files = output_files
+ job.progress = 100.0
+ log.info(f"Download completed for job {job.job_id}: {len(output_files)} files")
+ except Exception as e:
+ import traceback
+
+ from unshackle.core.api.errors import categorize_exception
+
+ job.status = JobStatus.FAILED
+ job.error_message = str(e)
+ job.error_details = str(e)
+
+ api_error = categorize_exception(
+ e, context={"service": job.service, "title_id": job.title_id, "job_id": job.job_id}
+ )
+ job.error_code = api_error.error_code.value
+
+ job.error_traceback = traceback.format_exc()
+
+ log.error(f"Download failed for job {job.job_id}: {e}")
+ raise
+
+ async def _run_download_async(self, job: DownloadJob) -> List[str]:
+ """Invoke a worker subprocess to execute the download."""
+
+ payload = {
+ "job_id": job.job_id,
+ "service": job.service,
+ "title_id": job.title_id,
+ "parameters": job.parameters,
+ }
+
+ payload_fd, payload_path = tempfile.mkstemp(prefix=f"unshackle_job_{job.job_id}_", suffix="_payload.json")
+ os.close(payload_fd)
+ result_fd, result_path = tempfile.mkstemp(prefix=f"unshackle_job_{job.job_id}_", suffix="_result.json")
+ os.close(result_fd)
+ progress_fd, progress_path = tempfile.mkstemp(prefix=f"unshackle_job_{job.job_id}_", suffix="_progress.json")
+ os.close(progress_fd)
+
+ with open(payload_path, "w", encoding="utf-8") as handle:
+ json.dump(payload, handle)
+
+ process = await asyncio.create_subprocess_exec(
+ sys.executable,
+ "-m",
+ "unshackle.core.api.download_worker",
+ payload_path,
+ result_path,
+ progress_path,
+ stdout=asyncio.subprocess.PIPE,
+ stderr=asyncio.subprocess.PIPE,
+ )
+
+ self._download_processes[job.job_id] = process
+ self._job_temp_files[job.job_id] = {"payload": payload_path, "result": result_path, "progress": progress_path}
+
+ communicate_task = asyncio.create_task(process.communicate())
+
+ stdout_bytes = b""
+ stderr_bytes = b""
+
+ try:
+ while True:
+ done, _ = await asyncio.wait({communicate_task}, timeout=0.5)
+ if communicate_task in done:
+ stdout_bytes, stderr_bytes = communicate_task.result()
+ break
+
+ # Check for progress updates
+ try:
+ if os.path.exists(progress_path):
+ with open(progress_path, "r", encoding="utf-8") as handle:
+ progress_data = json.load(handle)
+ if "progress" in progress_data:
+ new_progress = float(progress_data["progress"])
+ if new_progress != job.progress:
+ job.progress = new_progress
+ log.info(f"Job {job.job_id} progress updated: {job.progress}%")
+ except (FileNotFoundError, json.JSONDecodeError, ValueError) as e:
+ log.debug(f"Could not read progress for job {job.job_id}: {e}")
+
+ if job.cancel_event.is_set() or job.status == JobStatus.CANCELLED:
+ log.info(f"Cancellation detected for job {job.job_id}, terminating worker process")
+ process.terminate()
+ try:
+ await asyncio.wait_for(communicate_task, timeout=5)
+ except asyncio.TimeoutError:
+ log.warning(f"Worker process for job {job.job_id} did not terminate, killing")
+ process.kill()
+ await asyncio.wait_for(communicate_task, timeout=5)
+ raise asyncio.CancelledError("Job was cancelled")
+
+ returncode = process.returncode
+ stdout = stdout_bytes.decode("utf-8", errors="ignore")
+ stderr = stderr_bytes.decode("utf-8", errors="ignore")
+
+ if stdout.strip():
+ log.debug(f"Worker stdout for job {job.job_id}: {stdout.strip()}")
+ if stderr.strip():
+ log.warning(f"Worker stderr for job {job.job_id}: {stderr.strip()}")
+ job.worker_stderr = stderr.strip()
+
+ result_data: Optional[Dict[str, Any]] = None
+ try:
+ with open(result_path, "r", encoding="utf-8") as handle:
+ result_data = json.load(handle)
+ except FileNotFoundError:
+ log.error(f"Result file missing for job {job.job_id}")
+ except json.JSONDecodeError as exc:
+ log.error(f"Failed to parse worker result for job {job.job_id}: {exc}")
+
+ if returncode != 0:
+ message = result_data.get("message") if result_data else "unknown error"
+ if result_data:
+ job.error_details = result_data.get("error_details", message)
+ job.error_code = result_data.get("error_code")
+ raise Exception(f"Worker exited with code {returncode}: {message}")
+
+ if not result_data or result_data.get("status") != "success":
+ message = result_data.get("message") if result_data else "worker did not report success"
+ if result_data:
+ job.error_details = result_data.get("error_details", message)
+ job.error_code = result_data.get("error_code")
+ raise Exception(f"Worker failure: {message}")
+
+ return result_data.get("output_files", [])
+
+ finally:
+ if not communicate_task.done():
+ communicate_task.cancel()
+ with suppress(asyncio.CancelledError):
+ await communicate_task
+
+ self._download_processes.pop(job.job_id, None)
+
+ temp_paths = self._job_temp_files.pop(job.job_id, {})
+ for path in temp_paths.values():
+ try:
+ os.remove(path)
+ except OSError:
+ pass
+
+ def _execute_download_sync(self, job: DownloadJob) -> List[str]:
+ """Execute download synchronously using existing dl.py logic."""
+ return _perform_download(job.job_id, job.service, job.title_id, job.parameters.copy(), job.cancel_event)
+
+ async def _cleanup_worker(self):
+ """Worker that periodically cleans up old jobs."""
+ while not self._shutdown_event.is_set():
+ try:
+ await asyncio.sleep(3600) # Run every hour
+ self.cleanup_old_jobs()
+ except Exception as e:
+ log.error(f"Cleanup worker error: {e}")
+
+
+# Global instance
+download_manager: Optional[DownloadQueueManager] = None
+
+
+def get_download_manager() -> DownloadQueueManager:
+ """Get the global download manager instance."""
+ global download_manager
+ if download_manager is None:
+ # Load configuration from unshackle config
+ from unshackle.core.config import config
+
+ max_concurrent = getattr(config, "max_concurrent_downloads", 2)
+ retention_hours = getattr(config, "download_job_retention_hours", 24)
+
+ download_manager = DownloadQueueManager(max_concurrent, retention_hours)
+
+ return download_manager
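The manager above pairs an `asyncio.Queue` of jobs with a per-job `threading.Event` for cancellation: workers pull jobs off the queue, skip ones whose event is already set, and mark the rest completed. A self-contained sketch of that pattern (a simplified `Job`, not the real `DownloadJob`, and a no-op standing in for the actual download):

```python
import asyncio
import threading
from dataclasses import dataclass, field


@dataclass
class Job:
    job_id: str
    status: str = "queued"
    cancel_event: threading.Event = field(default_factory=threading.Event)


async def worker(queue: "asyncio.Queue[Job]", done: list) -> None:
    while True:
        job = await queue.get()
        if job.cancel_event.is_set():
            job.status = "cancelled"
        else:
            await asyncio.sleep(0)  # stand-in for the real download work
            job.status = "completed"
        done.append(job)
        queue.task_done()


async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    jobs = [Job("j1"), Job("j2")]
    jobs[1].cancel_event.set()  # cancelled before a worker picks it up
    for j in jobs:
        queue.put_nowait(j)
    done: list = []
    task = asyncio.create_task(worker(queue, done))
    await queue.join()  # wait until every queued job was handled
    task.cancel()
    return [(j.job_id, j.status) for j in done]


print(asyncio.run(main()))  # [('j1', 'completed'), ('j2', 'cancelled')]
```

The real manager layers process termination on top of this: when a downloading job is cancelled, the event is set *and* the worker subprocess is terminated, since the event alone cannot interrupt an external process.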
diff --git a/unshackle/core/api/download_worker.py b/unshackle/core/api/download_worker.py
new file mode 100644
index 0000000..7afca32
--- /dev/null
+++ b/unshackle/core/api/download_worker.py
@@ -0,0 +1,102 @@
+"""Standalone worker process entry point for executing download jobs."""
+
+from __future__ import annotations
+
+import json
+import logging
+import sys
+import traceback
+from pathlib import Path
+from typing import Any, Dict
+
+from .download_manager import _perform_download
+
+log = logging.getLogger("download_worker")
+
+
+def _read_payload(path: Path) -> Dict[str, Any]:
+ with path.open("r", encoding="utf-8") as handle:
+ return json.load(handle)
+
+
+def _write_result(path: Path, payload: Dict[str, Any]) -> None:
+ path.parent.mkdir(parents=True, exist_ok=True)
+ with path.open("w", encoding="utf-8") as handle:
+ json.dump(payload, handle)
+
+
+def main(argv: list[str]) -> int:
+ if len(argv) not in [3, 4]:
+ print(
+ "Usage: python -m unshackle.core.api.download_worker [progress_path]",
+ file=sys.stderr,
+ )
+ return 2
+
+ payload_path = Path(argv[1])
+ result_path = Path(argv[2])
+ progress_path = Path(argv[3]) if len(argv) > 3 else None
+
+ result: Dict[str, Any] = {}
+ exit_code = 0
+
+ try:
+ payload = _read_payload(payload_path)
+ job_id = payload["job_id"]
+ service = payload["service"]
+ title_id = payload["title_id"]
+ params = payload.get("parameters", {})
+
+ log.info(f"Worker starting job {job_id} ({service}:{title_id})")
+
+ def progress_callback(progress_data: Dict[str, Any]) -> None:
+ """Write progress updates to file for main process to read."""
+ if progress_path:
+ try:
+ log.info(f"Writing progress update: {progress_data}")
+ _write_result(progress_path, progress_data)
+ log.info(f"Progress update written to {progress_path}")
+ except Exception as e:
+ log.error(f"Failed to write progress update: {e}")
+
+ output_files = _perform_download(
+ job_id, service, title_id, params, cancel_event=None, progress_callback=progress_callback
+ )
+
+ result = {"status": "success", "output_files": output_files}
+
+ except Exception as exc: # noqa: BLE001 - capture for parent process
+ from unshackle.core.api.errors import categorize_exception
+
+ exit_code = 1
+ tb = traceback.format_exc()
+ log.error(f"Worker failed with error: {exc}")
+
+ api_error = categorize_exception(
+ exc,
+ context={
+ "service": payload.get("service") if "payload" in locals() else None,
+ "title_id": payload.get("title_id") if "payload" in locals() else None,
+ "job_id": payload.get("job_id") if "payload" in locals() else None,
+ },
+ )
+
+ result = {
+ "status": "error",
+ "message": str(exc),
+ "error_details": api_error.message,
+ "error_code": api_error.error_code.value,
+ "traceback": tb,
+ }
+
+ finally:
+ try:
+ _write_result(result_path, result)
+ except Exception as exc: # noqa: BLE001 - last resort logging
+ log.error(f"Failed to write worker result file: {exc}")
+
+ return exit_code
+
+
+if __name__ == "__main__":
+ sys.exit(main(sys.argv))
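The worker and the manager communicate purely through JSON files: the manager writes a payload file, the worker reads it and writes a result (and optional progress) file back. A minimal round-trip of that contract, using the same field names as the code above (`job_id`/`service`/`title_id`/`parameters` in, `status`/`output_files` out):

```python
import json
import tempfile
from pathlib import Path


def write_json(path: Path, payload: dict) -> None:
    """Mirror of the worker's _write_result: serialize a dict to a file."""
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(payload), encoding="utf-8")


def round_trip(tmp: Path) -> tuple:
    payload_path = tmp / "payload.json"
    result_path = tmp / "result.json"

    # Manager side: hand the job description to the worker
    write_json(payload_path, {"job_id": "j1", "service": "EXAMPLE", "title_id": "t1", "parameters": {}})

    # Worker side: read the payload, then report success
    payload = json.loads(payload_path.read_text(encoding="utf-8"))
    write_json(result_path, {"status": "success", "output_files": []})

    # Manager side: read the outcome back
    result = json.loads(result_path.read_text(encoding="utf-8"))
    return payload["job_id"], result["status"]


with tempfile.TemporaryDirectory() as tmp:
    print(round_trip(Path(tmp)))  # ('j1', 'success')
```

File-based exchange keeps the protocol robust even if the worker crashes mid-job: a missing or unparsable result file is itself a failure signal, which is exactly how `_run_download_async` treats it.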
diff --git a/unshackle/core/api/errors.py b/unshackle/core/api/errors.py
new file mode 100644
index 0000000..312ee12
--- /dev/null
+++ b/unshackle/core/api/errors.py
@@ -0,0 +1,322 @@
+"""
+API Error Handling System
+
+Provides structured error responses with error codes, categorization,
+and optional debug information for the unshackle REST API.
+"""
+
+from __future__ import annotations
+
+import traceback
+from datetime import datetime, timezone
+from enum import Enum
+from typing import Any
+
+from aiohttp import web
+
+
+class APIErrorCode(str, Enum):
+ """Standard API error codes for programmatic error handling."""
+
+ # Client errors (4xx)
+ INVALID_INPUT = "INVALID_INPUT" # Missing or malformed request data
+ INVALID_SERVICE = "INVALID_SERVICE" # Unknown service name
+ INVALID_TITLE_ID = "INVALID_TITLE_ID" # Invalid or malformed title ID
+ INVALID_PROFILE = "INVALID_PROFILE" # Profile doesn't exist
+ INVALID_PROXY = "INVALID_PROXY" # Invalid proxy specification
+ INVALID_LANGUAGE = "INVALID_LANGUAGE" # Invalid language code
+ INVALID_PARAMETERS = "INVALID_PARAMETERS" # Invalid download parameters
+
+ AUTH_FAILED = "AUTH_FAILED" # Authentication failure (invalid credentials/cookies)
+ AUTH_REQUIRED = "AUTH_REQUIRED" # Missing authentication
+ FORBIDDEN = "FORBIDDEN" # Action not allowed
+ GEOFENCE = "GEOFENCE" # Content not available in region
+
+ NOT_FOUND = "NOT_FOUND" # Resource not found (title, job, etc.)
+ NO_CONTENT = "NO_CONTENT" # No titles/tracks/episodes found
+ JOB_NOT_FOUND = "JOB_NOT_FOUND" # Download job doesn't exist
+
+ RATE_LIMITED = "RATE_LIMITED" # Service rate limiting
+
+ # Server errors (5xx)
+ INTERNAL_ERROR = "INTERNAL_ERROR" # Unexpected server error
+ SERVICE_ERROR = "SERVICE_ERROR" # Streaming service API error
+ NETWORK_ERROR = "NETWORK_ERROR" # Network connectivity issue
+ DRM_ERROR = "DRM_ERROR" # DRM/license acquisition failure
+ DOWNLOAD_ERROR = "DOWNLOAD_ERROR" # Download process failure
+ SERVICE_UNAVAILABLE = "SERVICE_UNAVAILABLE" # Service temporarily unavailable
+ WORKER_ERROR = "WORKER_ERROR" # Download worker process error
+
+
+class APIError(Exception):
+ """
+ Structured API error with error code, message, and details.
+
+ Attributes:
+ error_code: Standardized error code from APIErrorCode enum
+ message: User-friendly error message
+ details: Additional structured error information
+ retryable: Whether the operation can be retried
+ http_status: HTTP status code to return (default based on error_code)
+ """
+
+ def __init__(
+ self,
+ error_code: APIErrorCode,
+ message: str,
+ details: dict[str, Any] | None = None,
+ retryable: bool = False,
+ http_status: int | None = None,
+ ):
+ super().__init__(message)
+ self.error_code = error_code
+ self.message = message
+ self.details = details or {}
+ self.retryable = retryable
+ self.http_status = http_status or self._default_http_status(error_code)
+
+ @staticmethod
+ def _default_http_status(error_code: APIErrorCode) -> int:
+ """Map error codes to default HTTP status codes."""
+ status_map = {
+ # 400 Bad Request
+ APIErrorCode.INVALID_INPUT: 400,
+ APIErrorCode.INVALID_SERVICE: 400,
+ APIErrorCode.INVALID_TITLE_ID: 400,
+ APIErrorCode.INVALID_PROFILE: 400,
+ APIErrorCode.INVALID_PROXY: 400,
+ APIErrorCode.INVALID_LANGUAGE: 400,
+ APIErrorCode.INVALID_PARAMETERS: 400,
+ # 401 Unauthorized
+ APIErrorCode.AUTH_REQUIRED: 401,
+ APIErrorCode.AUTH_FAILED: 401,
+ # 403 Forbidden
+ APIErrorCode.FORBIDDEN: 403,
+ APIErrorCode.GEOFENCE: 403,
+ # 404 Not Found
+ APIErrorCode.NOT_FOUND: 404,
+ APIErrorCode.NO_CONTENT: 404,
+ APIErrorCode.JOB_NOT_FOUND: 404,
+ # 429 Too Many Requests
+ APIErrorCode.RATE_LIMITED: 429,
+            # 500 Internal Server Error
+            APIErrorCode.INTERNAL_ERROR: 500,
+            APIErrorCode.DOWNLOAD_ERROR: 500,
+            APIErrorCode.WORKER_ERROR: 500,
+            # 502 Bad Gateway
+            APIErrorCode.SERVICE_ERROR: 502,
+            APIErrorCode.DRM_ERROR: 502,
+            # 503 Service Unavailable
+            APIErrorCode.NETWORK_ERROR: 503,
+            APIErrorCode.SERVICE_UNAVAILABLE: 503,
+ }
+ return status_map.get(error_code, 500)
+
+
+def build_error_response(
+ error: APIError | Exception,
+ debug_mode: bool = False,
+ extra_debug_info: dict[str, Any] | None = None,
+) -> web.Response:
+ """
+ Build a structured JSON error response.
+
+ Args:
+ error: APIError or generic Exception to convert to response
+ debug_mode: Whether to include technical debug information
+ extra_debug_info: Additional debug info (stderr, stdout, etc.)
+
+ Returns:
+ aiohttp JSON response with structured error data
+ """
+ if isinstance(error, APIError):
+ error_code = error.error_code.value
+ message = error.message
+ details = error.details
+ http_status = error.http_status
+ retryable = error.retryable
+ else:
+ # Generic exception - convert to INTERNAL_ERROR
+ error_code = APIErrorCode.INTERNAL_ERROR.value
+ message = str(error) or "An unexpected error occurred"
+ details = {}
+ http_status = 500
+ retryable = False
+
+ response_data: dict[str, Any] = {
+ "status": "error",
+ "error_code": error_code,
+ "message": message,
+ "timestamp": datetime.now(timezone.utc).isoformat(),
+ }
+
+ # Add details if present
+ if details:
+ response_data["details"] = details
+
+ # Add retryable hint if specified
+ if retryable:
+ response_data["retryable"] = True
+
+ # Add debug information if in debug mode
+ if debug_mode:
+ debug_info: dict[str, Any] = {
+ "exception_type": type(error).__name__,
+ }
+
+ # Add traceback for debugging
+ if isinstance(error, Exception):
+ debug_info["traceback"] = traceback.format_exc()
+
+ # Add any extra debug info provided
+ if extra_debug_info:
+ debug_info.update(extra_debug_info)
+
+ response_data["debug_info"] = debug_info
+
+ return web.json_response(response_data, status=http_status)
+
+
+def categorize_exception(
+ exc: Exception,
+ context: dict[str, Any] | None = None,
+) -> APIError:
+ """
+ Categorize a generic exception into a structured APIError.
+
+ This function attempts to identify the type of error based on the exception
+ type, message patterns, and optional context information.
+
+ Args:
+ exc: The exception to categorize
+ context: Optional context (service name, operation type, etc.)
+
+ Returns:
+ APIError with appropriate error code and details
+ """
+ context = context or {}
+ exc_str = str(exc).lower()
+ exc_type = type(exc).__name__
+
+ # Authentication errors
+ if any(keyword in exc_str for keyword in ["auth", "login", "credential", "unauthorized", "forbidden", "token"]):
+ return APIError(
+ error_code=APIErrorCode.AUTH_FAILED,
+ message=f"Authentication failed: {exc}",
+ details={**context, "reason": "authentication_error"},
+ retryable=False,
+ )
+
+ # Network errors
+ if any(
+ keyword in exc_str
+ for keyword in [
+ "connection",
+ "timeout",
+ "network",
+ "unreachable",
+ "socket",
+ "dns",
+ "resolve",
+ ]
+ ) or exc_type in ["ConnectionError", "TimeoutError", "URLError", "SSLError"]:
+ return APIError(
+ error_code=APIErrorCode.NETWORK_ERROR,
+ message=f"Network error occurred: {exc}",
+ details={**context, "reason": "network_connectivity"},
+ retryable=True,
+ http_status=503,
+ )
+
+ # Geofence/region errors
+ if any(keyword in exc_str for keyword in ["geofence", "region", "not available in", "territory"]):
+ return APIError(
+ error_code=APIErrorCode.GEOFENCE,
+ message=f"Content not available in your region: {exc}",
+ details={**context, "reason": "geofence_restriction"},
+ retryable=False,
+ )
+
+ # Not found errors
+ if any(keyword in exc_str for keyword in ["not found", "404", "does not exist", "invalid id"]):
+ return APIError(
+ error_code=APIErrorCode.NOT_FOUND,
+ message=f"Resource not found: {exc}",
+ details={**context, "reason": "not_found"},
+ retryable=False,
+ )
+
+ # Rate limiting
+ if any(keyword in exc_str for keyword in ["rate limit", "too many requests", "429", "throttle"]):
+ return APIError(
+ error_code=APIErrorCode.RATE_LIMITED,
+ message=f"Rate limit exceeded: {exc}",
+ details={**context, "reason": "rate_limited"},
+ retryable=True,
+ http_status=429,
+ )
+
+ # DRM errors
+ if any(keyword in exc_str for keyword in ["drm", "license", "widevine", "playready", "decrypt"]):
+ return APIError(
+ error_code=APIErrorCode.DRM_ERROR,
+ message=f"DRM error: {exc}",
+ details={**context, "reason": "drm_failure"},
+ retryable=False,
+ )
+
+ # Service unavailable
+ if any(keyword in exc_str for keyword in ["service unavailable", "503", "maintenance", "temporarily unavailable"]):
+ return APIError(
+ error_code=APIErrorCode.SERVICE_UNAVAILABLE,
+ message=f"Service temporarily unavailable: {exc}",
+ details={**context, "reason": "service_unavailable"},
+ retryable=True,
+ http_status=503,
+ )
+
+ # Validation errors
+ if any(keyword in exc_str for keyword in ["invalid", "malformed", "validation"]) or exc_type in [
+ "ValueError",
+ "ValidationError",
+ ]:
+ return APIError(
+ error_code=APIErrorCode.INVALID_INPUT,
+ message=f"Invalid input: {exc}",
+ details={**context, "reason": "validation_failed"},
+ retryable=False,
+ )
+
+ # Default to internal error for unknown exceptions
+ return APIError(
+ error_code=APIErrorCode.INTERNAL_ERROR,
+ message=f"An unexpected error occurred: {exc}",
+ details={**context, "exception_type": exc_type},
+ retryable=False,
+ )
+
+
+def handle_api_exception(
+ exc: Exception,
+ context: dict[str, Any] | None = None,
+ debug_mode: bool = False,
+ extra_debug_info: dict[str, Any] | None = None,
+) -> web.Response:
+ """
+ Convenience function to categorize an exception and build an error response.
+
+ Args:
+ exc: The exception to handle
+ context: Optional context information
+ debug_mode: Whether to include debug information
+ extra_debug_info: Additional debug info
+
+ Returns:
+ Structured JSON error response
+ """
+ if isinstance(exc, APIError):
+ api_error = exc
+ else:
+ api_error = categorize_exception(exc, context)
+
+ return build_error_response(api_error, debug_mode, extra_debug_info)
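`categorize_exception` above is a heuristic: it scans the lowercased exception message for keyword sets, falls back on the exception type name, and defaults to `INTERNAL_ERROR` when nothing matches. A self-contained sketch of the same pattern, trimmed to three codes (this is a reduced illustration, not the module's actual API):

```python
from enum import Enum


class Code(str, Enum):
    AUTH_FAILED = "AUTH_FAILED"
    NETWORK_ERROR = "NETWORK_ERROR"
    INTERNAL_ERROR = "INTERNAL_ERROR"


def categorize(exc: Exception) -> Code:
    text = str(exc).lower()
    # Order matters: the first matching rule wins, so more specific
    # keyword sets must be checked before generic ones.
    if any(k in text for k in ("auth", "credential", "unauthorized")):
        return Code.AUTH_FAILED
    if any(k in text for k in ("connection", "timeout", "network")) or type(exc).__name__ in (
        "ConnectionError",
        "TimeoutError",
    ):
        return Code.NETWORK_ERROR
    return Code.INTERNAL_ERROR


print(categorize(TimeoutError("read timed out")).value)  # NETWORK_ERROR
print(categorize(RuntimeError("invalid credentials")).value)  # AUTH_FAILED
```

Note that `TimeoutError("read timed out")` matches by type name, not by message, which is why the real function checks both.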
diff --git a/unshackle/core/api/handlers.py b/unshackle/core/api/handlers.py
new file mode 100644
index 0000000..ba94adb
--- /dev/null
+++ b/unshackle/core/api/handlers.py
@@ -0,0 +1,936 @@
+import logging
+from typing import Any, Dict, List, Optional
+
+from aiohttp import web
+
+from unshackle.core.api.errors import APIError, APIErrorCode, handle_api_exception
+from unshackle.core.constants import AUDIO_CODEC_MAP, DYNAMIC_RANGE_MAP, VIDEO_CODEC_MAP
+from unshackle.core.proxies.basic import Basic
+from unshackle.core.proxies.hola import Hola
+from unshackle.core.proxies.nordvpn import NordVPN
+from unshackle.core.proxies.surfsharkvpn import SurfsharkVPN
+from unshackle.core.services import Services
+from unshackle.core.titles import Episode, Movie, Title_T
+from unshackle.core.tracks import Audio, Subtitle, Video
+
+log = logging.getLogger("api")
+
+DEFAULT_DOWNLOAD_PARAMS = {
+ "profile": None,
+ "quality": [],
+ "vcodec": None,
+ "acodec": None,
+ "vbitrate": None,
+ "abitrate": None,
+ "range": ["SDR"],
+ "channels": None,
+ "no_atmos": False,
+ "wanted": [],
+ "latest_episode": False,
+ "lang": ["orig"],
+ "v_lang": [],
+ "a_lang": [],
+ "s_lang": ["all"],
+ "require_subs": [],
+ "forced_subs": False,
+ "exact_lang": False,
+ "sub_format": None,
+ "video_only": False,
+ "audio_only": False,
+ "subs_only": False,
+ "chapters_only": False,
+ "no_subs": False,
+ "no_audio": False,
+ "no_chapters": False,
+ "audio_description": False,
+ "slow": False,
+ "skip_dl": False,
+ "export": None,
+ "cdm_only": None,
+ "no_proxy": False,
+ "no_folder": False,
+ "no_source": False,
+ "no_mux": False,
+ "workers": None,
+ "downloads": 1,
+ "best_available": False,
+}
+
+
+def initialize_proxy_providers() -> List[Any]:
+ """Initialize and return available proxy providers."""
+ proxy_providers = []
+ try:
+ from unshackle.core import binaries
+ # Load the main unshackle config to get proxy provider settings
+ from unshackle.core.config import config as main_config
+
+ log.debug(f"Main config proxy providers: {getattr(main_config, 'proxy_providers', {})}")
+ log.debug(f"Available proxy provider configs: {list(getattr(main_config, 'proxy_providers', {}).keys())}")
+
+ # Use main_config instead of the service-specific config for proxy providers
+ proxy_config = getattr(main_config, "proxy_providers", {})
+
+ if proxy_config.get("basic"):
+ log.debug("Loading Basic proxy provider")
+ proxy_providers.append(Basic(**proxy_config["basic"]))
+ if proxy_config.get("nordvpn"):
+ log.debug("Loading NordVPN proxy provider")
+ proxy_providers.append(NordVPN(**proxy_config["nordvpn"]))
+ if proxy_config.get("surfsharkvpn"):
+ log.debug("Loading SurfsharkVPN proxy provider")
+ proxy_providers.append(SurfsharkVPN(**proxy_config["surfsharkvpn"]))
+ if hasattr(binaries, "HolaProxy") and binaries.HolaProxy:
+ log.debug("Loading Hola proxy provider")
+ proxy_providers.append(Hola())
+
+ for proxy_provider in proxy_providers:
+ log.info(f"Loaded {proxy_provider.__class__.__name__}: {proxy_provider}")
+
+ if not proxy_providers:
+ log.warning("No proxy providers were loaded. Check your proxy provider configuration in unshackle.yaml")
+
+ except Exception as e:
+ log.warning(f"Failed to initialize some proxy providers: {e}")
+
+ return proxy_providers
+
+
+def resolve_proxy(proxy: str, proxy_providers: List[Any]) -> str:
+ """Resolve proxy parameter to actual proxy URI."""
+ import re
+
+ if not proxy:
+ return proxy
+
+ # Check if explicit proxy URI
+ if re.match(r"^https?://", proxy):
+ return proxy
+
+ # Handle provider:country format (e.g., "nordvpn:us")
+ requested_provider = None
+ if re.match(r"^[a-z]+:.+$", proxy, re.IGNORECASE):
+ requested_provider, proxy = proxy.split(":", maxsplit=1)
+
+ # Handle country code format (e.g., "us", "uk")
+ if re.match(r"^[a-z]{2}(?:\d+)?$", proxy, re.IGNORECASE):
+ proxy = proxy.lower()
+
+ if requested_provider:
+ # Find specific provider (case-insensitive matching)
+ proxy_provider = next(
+ (x for x in proxy_providers if x.__class__.__name__.lower() == requested_provider.lower()),
+ None,
+ )
+ if not proxy_provider:
+ available_providers = [x.__class__.__name__ for x in proxy_providers]
+ raise ValueError(
+ f"The proxy provider '{requested_provider}' was not recognized. Available providers: {available_providers}"
+ )
+
+ proxy_uri = proxy_provider.get_proxy(proxy)
+ if not proxy_uri:
+            raise ValueError(f"The proxy provider '{requested_provider}' had no proxy for '{proxy}'")
+
+ log.info(f"Using {proxy_provider.__class__.__name__} Proxy: {proxy_uri}")
+ return proxy_uri
+ else:
+ # Try all providers
+ for proxy_provider in proxy_providers:
+ proxy_uri = proxy_provider.get_proxy(proxy)
+ if proxy_uri:
+ log.info(f"Using {proxy_provider.__class__.__name__} Proxy: {proxy_uri}")
+ return proxy_uri
+
+ raise ValueError(f"No proxy provider had a proxy for {proxy}")
+
+ # Return as-is if not recognized format
+ log.info(f"Using explicit Proxy: {proxy}")
+ return proxy
+
+
+def validate_service(service_tag: str) -> Optional[str]:
+ """Validate and normalize service tag."""
+ try:
+ normalized = Services.get_tag(service_tag)
+ service_path = Services.get_path(normalized)
+ if not service_path.exists():
+ return None
+ return normalized
+ except Exception:
+ return None
+
+
+def serialize_title(title: Title_T) -> Dict[str, Any]:
+ """Convert a title object to JSON-serializable dict."""
+ if isinstance(title, Episode):
+ episode_name = title.name if title.name else f"Episode {title.number:02d}"
+ result = {
+ "type": "episode",
+ "name": episode_name,
+ "series_title": str(title.title),
+ "season": title.season,
+ "number": title.number,
+ "year": title.year,
+ "id": str(title.id) if hasattr(title, "id") else None,
+ }
+ elif isinstance(title, Movie):
+ result = {
+ "type": "movie",
+ "name": str(title.name) if hasattr(title, "name") else str(title),
+ "year": title.year,
+ "id": str(title.id) if hasattr(title, "id") else None,
+ }
+ else:
+ result = {
+ "type": "other",
+ "name": str(title.name) if hasattr(title, "name") else str(title),
+ "id": str(title.id) if hasattr(title, "id") else None,
+ }
+
+ return result
+
+
+def serialize_video_track(track: Video) -> Dict[str, Any]:
+ """Convert video track to JSON-serializable dict."""
+ codec_name = track.codec.name if hasattr(track.codec, "name") else str(track.codec)
+ range_name = track.range.name if hasattr(track.range, "name") else str(track.range)
+
+ return {
+ "id": str(track.id),
+ "codec": codec_name,
+ "codec_display": VIDEO_CODEC_MAP.get(codec_name, codec_name),
+ "bitrate": int(track.bitrate / 1000) if track.bitrate else None,
+ "width": track.width,
+ "height": track.height,
+ "resolution": f"{track.width}x{track.height}" if track.width and track.height else None,
+ "fps": track.fps if track.fps else None,
+ "range": range_name,
+ "range_display": DYNAMIC_RANGE_MAP.get(range_name, range_name),
+ "language": str(track.language) if track.language else None,
+ "drm": str(track.drm) if hasattr(track, "drm") and track.drm else None,
+ }
+
+
+def serialize_audio_track(track: Audio) -> Dict[str, Any]:
+ """Convert audio track to JSON-serializable dict."""
+ codec_name = track.codec.name if hasattr(track.codec, "name") else str(track.codec)
+
+ return {
+ "id": str(track.id),
+ "codec": codec_name,
+ "codec_display": AUDIO_CODEC_MAP.get(codec_name, codec_name),
+ "bitrate": int(track.bitrate / 1000) if track.bitrate else None,
+ "channels": track.channels if track.channels else None,
+ "language": str(track.language) if track.language else None,
+ "atmos": track.atmos if hasattr(track, "atmos") else False,
+ "descriptive": track.descriptive if hasattr(track, "descriptive") else False,
+ "drm": str(track.drm) if hasattr(track, "drm") and track.drm else None,
+ }
+
+
+def serialize_subtitle_track(track: Subtitle) -> Dict[str, Any]:
+ """Convert subtitle track to JSON-serializable dict."""
+ return {
+ "id": str(track.id),
+ "codec": track.codec.name if hasattr(track.codec, "name") else str(track.codec),
+ "language": str(track.language) if track.language else None,
+ "forced": track.forced if hasattr(track, "forced") else False,
+ "sdh": track.sdh if hasattr(track, "sdh") else False,
+ "cc": track.cc if hasattr(track, "cc") else False,
+ }
+
+
+async def list_titles_handler(data: Dict[str, Any], request: Optional[web.Request] = None) -> web.Response:
+ """Handle list-titles request."""
+ service_tag = data.get("service")
+ title_id = data.get("title_id")
+ profile = data.get("profile")
+
+ if not service_tag:
+ raise APIError(
+ APIErrorCode.INVALID_INPUT,
+ "Missing required parameter: service",
+ details={"missing_parameter": "service"},
+ )
+
+ if not title_id:
+ raise APIError(
+ APIErrorCode.INVALID_INPUT,
+ "Missing required parameter: title_id",
+ details={"missing_parameter": "title_id"},
+ )
+
+ normalized_service = validate_service(service_tag)
+ if not normalized_service:
+ raise APIError(
+ APIErrorCode.INVALID_SERVICE,
+ f"Invalid or unavailable service: {service_tag}",
+ details={"service": service_tag},
+ )
+
+ try:
+ import inspect
+
+ import click
+ import yaml
+
+ from unshackle.commands.dl import dl
+ from unshackle.core.config import config
+ from unshackle.core.utils.click_types import ContextData
+ from unshackle.core.utils.collections import merge_dict
+
+ service_config_path = Services.get_path(normalized_service) / config.filenames.config
+ if service_config_path.exists():
+ service_config = yaml.safe_load(service_config_path.read_text(encoding="utf8"))
+ else:
+ service_config = {}
+ merge_dict(config.services.get(normalized_service), service_config)
+
+ @click.command()
+ @click.pass_context
+ def dummy_service(ctx: click.Context) -> None:
+ pass
+
+ # Handle proxy configuration
+ proxy_param = data.get("proxy")
+ no_proxy = data.get("no_proxy", False)
+ proxy_providers = []
+
+ if not no_proxy:
+ proxy_providers = initialize_proxy_providers()
+
+ if proxy_param and not no_proxy:
+ try:
+ resolved_proxy = resolve_proxy(proxy_param, proxy_providers)
+ proxy_param = resolved_proxy
+ except ValueError as e:
+ raise APIError(
+ APIErrorCode.INVALID_PROXY,
+ f"Proxy error: {e}",
+ details={"proxy": proxy_param, "service": normalized_service},
+                ) from e
+
+ ctx = click.Context(dummy_service)
+ ctx.obj = ContextData(config=service_config, cdm=None, proxy_providers=proxy_providers, profile=profile)
+ ctx.params = {"proxy": proxy_param, "no_proxy": no_proxy}
+
+ service_module = Services.load(normalized_service)
+
+ dummy_service.name = normalized_service
+        dummy_service.params = [click.Argument(["title"], type=str)]
+ ctx.invoked_subcommand = normalized_service
+
+ service_ctx = click.Context(dummy_service, parent=ctx)
+ service_ctx.obj = ctx.obj
+
+ service_kwargs = {"title": title_id}
+
+ # Add additional parameters from request data
+ for key, value in data.items():
+ if key not in ["service", "title_id", "profile", "season", "episode", "wanted", "proxy", "no_proxy"]:
+ service_kwargs[key] = value
+
+ # Get service parameter info and click command defaults
+ service_init_params = inspect.signature(service_module.__init__).parameters
+
+ # Extract default values from the click command
+ if hasattr(service_module, "cli") and hasattr(service_module.cli, "params"):
+ for param in service_module.cli.params:
+ if hasattr(param, "name") and param.name not in service_kwargs:
+ # Add default value if parameter is not already provided
+ if hasattr(param, "default") and param.default is not None:
+ service_kwargs[param.name] = param.default
+
+ # Handle required parameters that don't have click defaults
+ for param_name, param_info in service_init_params.items():
+ if param_name not in service_kwargs and param_name not in ["self", "ctx"]:
+ # Check if parameter is required (no default value in signature)
+ if param_info.default is inspect.Parameter.empty:
+ # Provide sensible defaults for common required parameters
+ if param_name == "meta_lang":
+ service_kwargs[param_name] = None
+ elif param_name == "movie":
+ service_kwargs[param_name] = False
+ else:
+ # Log warning for unknown required parameters
+ log.warning(f"Unknown required parameter '{param_name}' for service {normalized_service}")
+
+ # Filter out any parameters that the service doesn't accept
+ filtered_kwargs = {}
+ for key, value in service_kwargs.items():
+ if key in service_init_params:
+ filtered_kwargs[key] = value
+
+ service_instance = service_module(service_ctx, **filtered_kwargs)
+
+ cookies = dl.get_cookie_jar(normalized_service, profile)
+ credential = dl.get_credentials(normalized_service, profile)
+ service_instance.authenticate(cookies, credential)
+
+ titles = service_instance.get_titles()
+
+ if hasattr(titles, "__iter__") and not isinstance(titles, str):
+ title_list = [serialize_title(t) for t in titles]
+ else:
+ title_list = [serialize_title(titles)]
+
+ return web.json_response({"titles": title_list})
+
+ except APIError:
+ raise
+ except Exception as e:
+ log.exception("Error listing titles")
+ debug_mode = request.app.get("debug_api", False) if request else False
+ return handle_api_exception(
+ e,
+ context={"operation": "list_titles", "service": normalized_service, "title_id": title_id},
+ debug_mode=debug_mode,
+ )
+
+
+async def list_tracks_handler(data: Dict[str, Any], request: Optional[web.Request] = None) -> web.Response:
+ """Handle list-tracks request."""
+ service_tag = data.get("service")
+ title_id = data.get("title_id")
+ profile = data.get("profile")
+
+ if not service_tag:
+ raise APIError(
+ APIErrorCode.INVALID_INPUT,
+ "Missing required parameter: service",
+ details={"missing_parameter": "service"},
+ )
+
+ if not title_id:
+ raise APIError(
+ APIErrorCode.INVALID_INPUT,
+ "Missing required parameter: title_id",
+ details={"missing_parameter": "title_id"},
+ )
+
+ normalized_service = validate_service(service_tag)
+ if not normalized_service:
+ raise APIError(
+ APIErrorCode.INVALID_SERVICE,
+ f"Invalid or unavailable service: {service_tag}",
+ details={"service": service_tag},
+ )
+
+ try:
+ import inspect
+
+ import click
+ import yaml
+
+ from unshackle.commands.dl import dl
+ from unshackle.core.config import config
+ from unshackle.core.utils.click_types import ContextData
+ from unshackle.core.utils.collections import merge_dict
+
+ service_config_path = Services.get_path(normalized_service) / config.filenames.config
+ if service_config_path.exists():
+ service_config = yaml.safe_load(service_config_path.read_text(encoding="utf8"))
+ else:
+ service_config = {}
+ merge_dict(config.services.get(normalized_service), service_config)
+
+ @click.command()
+ @click.pass_context
+ def dummy_service(ctx: click.Context) -> None:
+ pass
+
+ # Handle proxy configuration
+ proxy_param = data.get("proxy")
+ no_proxy = data.get("no_proxy", False)
+ proxy_providers = []
+
+ if not no_proxy:
+ proxy_providers = initialize_proxy_providers()
+
+ if proxy_param and not no_proxy:
+ try:
+ resolved_proxy = resolve_proxy(proxy_param, proxy_providers)
+ proxy_param = resolved_proxy
+ except ValueError as e:
+ raise APIError(
+ APIErrorCode.INVALID_PROXY,
+ f"Proxy error: {e}",
+ details={"proxy": proxy_param, "service": normalized_service},
+                ) from e
+
+ ctx = click.Context(dummy_service)
+ ctx.obj = ContextData(config=service_config, cdm=None, proxy_providers=proxy_providers, profile=profile)
+ ctx.params = {"proxy": proxy_param, "no_proxy": no_proxy}
+
+ service_module = Services.load(normalized_service)
+
+ dummy_service.name = normalized_service
+        dummy_service.params = [click.Argument(["title"], type=str)]
+ ctx.invoked_subcommand = normalized_service
+
+ service_ctx = click.Context(dummy_service, parent=ctx)
+ service_ctx.obj = ctx.obj
+
+ service_kwargs = {"title": title_id}
+
+ # Add additional parameters from request data
+ for key, value in data.items():
+ if key not in ["service", "title_id", "profile", "season", "episode", "wanted", "proxy", "no_proxy"]:
+ service_kwargs[key] = value
+
+ # Get service parameter info and click command defaults
+ service_init_params = inspect.signature(service_module.__init__).parameters
+
+ # Extract default values from the click command
+ if hasattr(service_module, "cli") and hasattr(service_module.cli, "params"):
+ for param in service_module.cli.params:
+ if hasattr(param, "name") and param.name not in service_kwargs:
+ # Add default value if parameter is not already provided
+ if hasattr(param, "default") and param.default is not None:
+ service_kwargs[param.name] = param.default
+
+ # Handle required parameters that don't have click defaults
+ for param_name, param_info in service_init_params.items():
+ if param_name not in service_kwargs and param_name not in ["self", "ctx"]:
+ # Check if parameter is required (no default value in signature)
+ if param_info.default is inspect.Parameter.empty:
+ # Provide sensible defaults for common required parameters
+ if param_name == "meta_lang":
+ service_kwargs[param_name] = None
+ elif param_name == "movie":
+ service_kwargs[param_name] = False
+ else:
+ # Log warning for unknown required parameters
+ log.warning(f"Unknown required parameter '{param_name}' for service {normalized_service}")
+
+ # Filter out any parameters that the service doesn't accept
+ filtered_kwargs = {}
+ for key, value in service_kwargs.items():
+ if key in service_init_params:
+ filtered_kwargs[key] = value
+
+ service_instance = service_module(service_ctx, **filtered_kwargs)
+
+ cookies = dl.get_cookie_jar(normalized_service, profile)
+ credential = dl.get_credentials(normalized_service, profile)
+ service_instance.authenticate(cookies, credential)
+
+ titles = service_instance.get_titles()
+
+ wanted_param = data.get("wanted")
+ season = data.get("season")
+ episode = data.get("episode")
+
+ if hasattr(titles, "__iter__") and not isinstance(titles, str):
+ titles_list = list(titles)
+
+ wanted = None
+ if wanted_param:
+ from unshackle.core.utils.click_types import SeasonRange
+
+ try:
+ season_range = SeasonRange()
+ wanted = season_range.parse_tokens(wanted_param)
+ log.debug(f"Parsed wanted '{wanted_param}' into {len(wanted)} episodes: {wanted[:10]}...")
+ except Exception as e:
+ raise APIError(
+ APIErrorCode.INVALID_PARAMETERS,
+ f"Invalid wanted parameter: {e}",
+ details={"wanted": wanted_param, "service": normalized_service},
+                    ) from e
+ elif season is not None and episode is not None:
+ wanted = [f"{season}x{episode}"]
+
+ if wanted:
+ # Filter titles based on wanted episodes, similar to how dl.py does it
+ matching_titles = []
+ log.debug(f"Filtering {len(titles_list)} titles with {len(wanted)} wanted episodes")
+ for title in titles_list:
+ if isinstance(title, Episode):
+ episode_key = f"{title.season}x{title.number}"
+ if episode_key in wanted:
+ log.debug(f"Episode {episode_key} matches wanted list")
+ matching_titles.append(title)
+ else:
+ log.debug(f"Episode {episode_key} not in wanted list")
+ else:
+ matching_titles.append(title)
+
+ log.debug(f"Found {len(matching_titles)} matching titles")
+
+ if not matching_titles:
+ raise APIError(
+ APIErrorCode.NO_CONTENT,
+ "No episodes found matching wanted criteria",
+ details={
+ "service": normalized_service,
+ "title_id": title_id,
+ "wanted": wanted_param or f"{season}x{episode}",
+ },
+ )
+
+ # If multiple episodes match, return tracks for all episodes
+ if len(matching_titles) > 1 and all(isinstance(t, Episode) for t in matching_titles):
+ episodes_data = []
+ failed_episodes = []
+
+ # Sort matching titles by season and episode number for consistent ordering
+ sorted_titles = sorted(matching_titles, key=lambda t: (t.season, t.number))
+
+ for title in sorted_titles:
+ try:
+ tracks = service_instance.get_tracks(title)
+ video_tracks = sorted(tracks.videos, key=lambda t: t.bitrate or 0, reverse=True)
+ audio_tracks = sorted(tracks.audio, key=lambda t: t.bitrate or 0, reverse=True)
+
+ episode_data = {
+ "title": serialize_title(title),
+ "video": [serialize_video_track(t) for t in video_tracks],
+ "audio": [serialize_audio_track(t) for t in audio_tracks],
+ "subtitles": [serialize_subtitle_track(t) for t in tracks.subtitles],
+ }
+ episodes_data.append(episode_data)
+ log.debug(f"Successfully got tracks for {title.season}x{title.number}")
+ except SystemExit:
+ # Service calls sys.exit() for unavailable episodes - catch and skip
+ failed_episodes.append(f"S{title.season}E{title.number:02d}")
+ log.debug(f"Episode {title.season}x{title.number} not available, skipping")
+ continue
+ except Exception as e:
+ # Handle other errors gracefully
+ failed_episodes.append(f"S{title.season}E{title.number:02d}")
+ log.debug(f"Error getting tracks for {title.season}x{title.number}: {e}")
+ continue
+
+ if episodes_data:
+ response = {"episodes": episodes_data}
+ if failed_episodes:
+ response["unavailable_episodes"] = failed_episodes
+ return web.json_response(response)
+ else:
+ raise APIError(
+ APIErrorCode.NO_CONTENT,
+ f"No available episodes found. Unavailable: {', '.join(failed_episodes)}",
+ details={
+ "service": normalized_service,
+ "title_id": title_id,
+ "unavailable_episodes": failed_episodes,
+ },
+ )
+ else:
+ # Single episode or movie
+ first_title = matching_titles[0]
+ else:
+ first_title = titles_list[0]
+ else:
+ first_title = titles
+
+ tracks = service_instance.get_tracks(first_title)
+
+ video_tracks = sorted(tracks.videos, key=lambda t: t.bitrate or 0, reverse=True)
+ audio_tracks = sorted(tracks.audio, key=lambda t: t.bitrate or 0, reverse=True)
+
+ response = {
+ "title": serialize_title(first_title),
+ "video": [serialize_video_track(t) for t in video_tracks],
+ "audio": [serialize_audio_track(t) for t in audio_tracks],
+ "subtitles": [serialize_subtitle_track(t) for t in tracks.subtitles],
+ }
+
+ return web.json_response(response)
+
+ except APIError:
+ raise
+ except Exception as e:
+ log.exception("Error listing tracks")
+ debug_mode = request.app.get("debug_api", False) if request else False
+ return handle_api_exception(
+ e,
+ context={"operation": "list_tracks", "service": normalized_service, "title_id": title_id},
+ debug_mode=debug_mode,
+ )
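The wanted-filtering step inside `list_tracks_handler` keeps episodes whose `"SxE"` key appears in the wanted list while letting non-episode titles (movies) pass through unconditionally. A reduced sketch of that filter, with a dataclass standing in for the project's `Episode` type:

```python
from dataclasses import dataclass


@dataclass
class Ep:
    season: int
    number: int


def filter_wanted(titles: list, wanted: list[str]) -> list:
    # Episodes are kept only when their "SxE" key is wanted;
    # anything that is not an episode always passes through.
    return [t for t in titles if not isinstance(t, Ep) or f"{t.season}x{t.number}" in wanted]


eps = [Ep(1, 1), Ep(1, 2), Ep(2, 1)]
kept = filter_wanted(eps, ["1x2", "2x1"])
print([(e.season, e.number) for e in kept])  # [(1, 2), (2, 1)]
```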
+
+
+def validate_download_parameters(data: Dict[str, Any]) -> Optional[str]:
+ """
+ Validate download parameters and return error message if invalid.
+
+ Returns:
+ None if valid, error message string if invalid
+ """
+ if "vcodec" in data and data["vcodec"]:
+ valid_vcodecs = ["H264", "H265", "VP9", "AV1"]
+ if data["vcodec"].upper() not in valid_vcodecs:
+ return f"Invalid vcodec: {data['vcodec']}. Must be one of: {', '.join(valid_vcodecs)}"
+
+ if "acodec" in data and data["acodec"]:
+ valid_acodecs = ["AAC", "AC3", "EAC3", "OPUS", "FLAC", "ALAC", "VORBIS", "DTS"]
+ if data["acodec"].upper() not in valid_acodecs:
+ return f"Invalid acodec: {data['acodec']}. Must be one of: {', '.join(valid_acodecs)}"
+
+ if "sub_format" in data and data["sub_format"]:
+ valid_sub_formats = ["SRT", "VTT", "ASS", "SSA"]
+ if data["sub_format"].upper() not in valid_sub_formats:
+ return f"Invalid sub_format: {data['sub_format']}. Must be one of: {', '.join(valid_sub_formats)}"
+
+ if "vbitrate" in data and data["vbitrate"] is not None:
+ if not isinstance(data["vbitrate"], int) or data["vbitrate"] <= 0:
+ return "vbitrate must be a positive integer"
+
+ if "abitrate" in data and data["abitrate"] is not None:
+ if not isinstance(data["abitrate"], int) or data["abitrate"] <= 0:
+ return "abitrate must be a positive integer"
+
+ if "channels" in data and data["channels"] is not None:
+ if not isinstance(data["channels"], (int, float)) or data["channels"] <= 0:
+ return "channels must be a positive number"
+
+ if "workers" in data and data["workers"] is not None:
+ if not isinstance(data["workers"], int) or data["workers"] <= 0:
+ return "workers must be a positive integer"
+
+ if "downloads" in data and data["downloads"] is not None:
+ if not isinstance(data["downloads"], int) or data["downloads"] <= 0:
+ return "downloads must be a positive integer"
+
+ exclusive_flags = []
+ if data.get("video_only"):
+ exclusive_flags.append("video_only")
+ if data.get("audio_only"):
+ exclusive_flags.append("audio_only")
+ if data.get("subs_only"):
+ exclusive_flags.append("subs_only")
+ if data.get("chapters_only"):
+ exclusive_flags.append("chapters_only")
+
+ if len(exclusive_flags) > 1:
+ return f"Cannot use multiple exclusive flags: {', '.join(exclusive_flags)}"
+
+ if data.get("no_subs") and data.get("subs_only"):
+ return "Cannot use both no_subs and subs_only"
+ if data.get("no_audio") and data.get("audio_only"):
+ return "Cannot use both no_audio and audio_only"
+
+ if data.get("s_lang") and data.get("require_subs"):
+ return "Cannot use both s_lang and require_subs"
+
+ if "range" in data and data["range"]:
+ valid_ranges = ["SDR", "HDR10", "HDR10+", "DV", "HLG"]
+ if isinstance(data["range"], list):
+ for r in data["range"]:
+ if r.upper() not in valid_ranges:
+ return f"Invalid range value: {r}. Must be one of: {', '.join(valid_ranges)}"
+ elif data["range"].upper() not in valid_ranges:
+ return f"Invalid range value: {data['range']}. Must be one of: {', '.join(valid_ranges)}"
+
+ return None
+
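The `*_only` mutual-exclusion rule above can be sketched in isolation (a standalone re-implementation for illustration; the helper name is hypothetical, not part of the API):

```python
# Illustrative, standalone re-implementation of the *_only mutual-exclusion
# rule enforced by validate_download_parameters.
from typing import Any, Dict, Optional

EXCLUSIVE_FLAGS = ("video_only", "audio_only", "subs_only", "chapters_only")

def check_exclusive_flags(data: Dict[str, Any]) -> Optional[str]:
    """Return an error message when more than one exclusive flag is truthy."""
    set_flags = [flag for flag in EXCLUSIVE_FLAGS if data.get(flag)]
    if len(set_flags) > 1:
        return f"Cannot use multiple exclusive flags: {', '.join(set_flags)}"
    return None
```

As in the handler, the check returns `None` for a valid payload and an error string otherwise, so callers can surface the message directly in an API error.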
+
+async def download_handler(data: Dict[str, Any], request: Optional[web.Request] = None) -> web.Response:
+ """Handle download request - create and queue a download job."""
+ from unshackle.core.api.download_manager import get_download_manager
+
+ service_tag = data.get("service")
+ title_id = data.get("title_id")
+
+ if not service_tag:
+ raise APIError(
+ APIErrorCode.INVALID_INPUT,
+ "Missing required parameter: service",
+ details={"missing_parameter": "service"},
+ )
+
+ if not title_id:
+ raise APIError(
+ APIErrorCode.INVALID_INPUT,
+ "Missing required parameter: title_id",
+ details={"missing_parameter": "title_id"},
+ )
+
+ normalized_service = validate_service(service_tag)
+ if not normalized_service:
+ raise APIError(
+ APIErrorCode.INVALID_SERVICE,
+ f"Invalid or unavailable service: {service_tag}",
+ details={"service": service_tag},
+ )
+
+ validation_error = validate_download_parameters(data)
+ if validation_error:
+ raise APIError(
+ APIErrorCode.INVALID_PARAMETERS,
+ validation_error,
+ details={"service": normalized_service, "title_id": title_id},
+ )
+
+ try:
+ # Load service module to extract service-specific parameter defaults
+ service_module = Services.load(normalized_service)
+ service_specific_defaults = {}
+
+ # Extract default values from the service's click command
+ if hasattr(service_module, "cli") and hasattr(service_module.cli, "params"):
+ for param in service_module.cli.params:
+ if hasattr(param, "name") and hasattr(param, "default") and param.default is not None:
+ # Store service-specific defaults (e.g., drm_system, hydrate_track, profile for NF)
+ service_specific_defaults[param.name] = param.default
+
+ # Get download manager and start workers if needed
+ manager = get_download_manager()
+ await manager.start_workers()
+
+ # Create download job with filtered parameters (exclude service and title_id as they're already passed)
+ filtered_params = {k: v for k, v in data.items() if k not in ["service", "title_id"]}
+ # Merge defaults with provided parameters (user params override service defaults, which override global defaults)
+ params_with_defaults = {**DEFAULT_DOWNLOAD_PARAMS, **service_specific_defaults, **filtered_params}
+ job = manager.create_job(normalized_service, title_id, **params_with_defaults)
+
+ return web.json_response(
+ {"job_id": job.job_id, "status": job.status.value, "created_time": job.created_time.isoformat()}, status=202
+ )
+
+ except APIError:
+ raise
+ except Exception as e:
+ log.exception("Error creating download job")
+ debug_mode = request.app.get("debug_api", False) if request else False
+ return handle_api_exception(
+ e,
+ context={"operation": "create_download_job", "service": normalized_service, "title_id": title_id},
+ debug_mode=debug_mode,
+ )
+
+
+async def list_download_jobs_handler(data: Dict[str, Any], request: Optional[web.Request] = None) -> web.Response:
+ """Handle list download jobs request with optional filtering and sorting."""
+ from unshackle.core.api.download_manager import get_download_manager
+
+ try:
+ manager = get_download_manager()
+ jobs = manager.list_jobs()
+
+ status_filter = data.get("status")
+ if status_filter:
+ jobs = [job for job in jobs if job.status.value == status_filter]
+
+ service_filter = data.get("service")
+ if service_filter:
+ jobs = [job for job in jobs if job.service == service_filter]
+
+ sort_by = data.get("sort_by", "created_time")
+ sort_order = data.get("sort_order", "desc")
+
+ valid_sort_fields = ["created_time", "started_time", "completed_time", "progress", "status", "service"]
+ if sort_by not in valid_sort_fields:
+ raise APIError(
+ APIErrorCode.INVALID_PARAMETERS,
+ f"Invalid sort_by: {sort_by}. Must be one of: {', '.join(valid_sort_fields)}",
+ details={"sort_by": sort_by, "valid_values": valid_sort_fields},
+ )
+
+ if sort_order not in ["asc", "desc"]:
+ raise APIError(
+ APIErrorCode.INVALID_PARAMETERS,
+ "Invalid sort_order: must be 'asc' or 'desc'",
+ details={"sort_order": sort_order, "valid_values": ["asc", "desc"]},
+ )
+
+ reverse = sort_order == "desc"
+
+ def get_sort_key(job):
+ """Get the sorting key value, handling None values."""
+ value = getattr(job, sort_by, None)
+ if value is None:
+ if sort_by in ["created_time", "started_time", "completed_time"]:
+ from datetime import datetime
+
+ return datetime.min if not reverse else datetime.max
+ elif sort_by == "progress":
+ return 0
+ elif sort_by in ["status", "service"]:
+ return ""
+ return value
+
+ jobs = sorted(jobs, key=get_sort_key, reverse=reverse)
+
+ job_list = [job.to_dict(include_full_details=False) for job in jobs]
+
+ return web.json_response({"jobs": job_list})
+
+ except APIError:
+ raise
+ except Exception as e:
+ log.exception("Error listing download jobs")
+ debug_mode = request.app.get("debug_api", False) if request else False
+ return handle_api_exception(
+ e,
+ context={"operation": "list_download_jobs"},
+ debug_mode=debug_mode,
+ )
+
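The None-safe sort key used above can be shown standalone (dict-based jobs here are illustrative; the real handler sorts job objects via `getattr`):

```python
# Standalone sketch of the None-safe sort: jobs missing a timestamp are
# mapped to a sentinel datetime so sorting never raises a TypeError.
from datetime import datetime
from typing import Any, Dict, List

def sort_jobs(jobs: List[Dict[str, Any]], sort_by: str = "created_time",
              reverse: bool = True) -> List[Dict[str, Any]]:
    def key(job: Dict[str, Any]) -> Any:
        value = job.get(sort_by)
        if value is None:
            # Mirror the handler: datetime.max under descending order,
            # datetime.min under ascending
            return datetime.max if reverse else datetime.min
        return value
    return sorted(jobs, key=key, reverse=reverse)
```

Note the sentinels assume naive datetimes, matching the handler's use of `datetime.min`/`datetime.max`.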
+
+async def get_download_job_handler(job_id: str, request: Optional[web.Request] = None) -> web.Response:
+ """Handle get specific download job request."""
+ from unshackle.core.api.download_manager import get_download_manager
+
+ try:
+ manager = get_download_manager()
+ job = manager.get_job(job_id)
+
+ if not job:
+ raise APIError(
+ APIErrorCode.JOB_NOT_FOUND,
+ "Job not found",
+ details={"job_id": job_id},
+ )
+
+ return web.json_response(job.to_dict(include_full_details=True))
+
+ except APIError:
+ raise
+ except Exception as e:
+ log.exception(f"Error getting download job {job_id}")
+ debug_mode = request.app.get("debug_api", False) if request else False
+ return handle_api_exception(
+ e,
+ context={"operation": "get_download_job", "job_id": job_id},
+ debug_mode=debug_mode,
+ )
+
+
+async def cancel_download_job_handler(job_id: str, request: Optional[web.Request] = None) -> web.Response:
+ """Handle cancel download job request."""
+ from unshackle.core.api.download_manager import get_download_manager
+
+ try:
+ manager = get_download_manager()
+
+ if not manager.get_job(job_id):
+ raise APIError(
+ APIErrorCode.JOB_NOT_FOUND,
+ "Job not found",
+ details={"job_id": job_id},
+ )
+
+ success = manager.cancel_job(job_id)
+
+ if success:
+ return web.json_response({"status": "success", "message": "Job cancelled"})
+ else:
+ raise APIError(
+ APIErrorCode.INVALID_PARAMETERS,
+ "Job cannot be cancelled (already completed or failed)",
+ details={"job_id": job_id},
+ )
+
+ except APIError:
+ raise
+ except Exception as e:
+ log.exception(f"Error cancelling download job {job_id}")
+ debug_mode = request.app.get("debug_api", False) if request else False
+ return handle_api_exception(
+ e,
+ context={"operation": "cancel_download_job", "job_id": job_id},
+ debug_mode=debug_mode,
+ )
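From the client side, a request body for the download handler above only requires `service` and `title_id`; everything else is optional and validated server-side. A minimal payload-builder sketch (the helper name is hypothetical; codec names are upper-cased to match the server's validation lists):

```python
# Client-side sketch: building a JSON body for POST /api/download.
from typing import Any, Dict, List, Optional

def build_download_request(service: str, title_id: str,
                           wanted: Optional[List[str]] = None,
                           vcodec: Optional[str] = None) -> Dict[str, Any]:
    body: Dict[str, Any] = {"service": service, "title_id": title_id}
    if wanted:
        body["wanted"] = wanted  # e.g. ["S01E01", "S01E02"]
    if vcodec:
        body["vcodec"] = vcodec.upper()  # server validates against upper-cased names
    return body
```

A successful request returns HTTP 202 with a `job_id`, which can then be polled via `GET /api/download/jobs/{job_id}`.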
diff --git a/unshackle/core/api/routes.py b/unshackle/core/api/routes.py
new file mode 100644
index 0000000..a5202c5
--- /dev/null
+++ b/unshackle/core/api/routes.py
@@ -0,0 +1,758 @@
+import logging
+import re
+
+from aiohttp import web
+from aiohttp_swagger3 import SwaggerDocs, SwaggerInfo, SwaggerUiSettings
+
+from unshackle.core import __version__
+from unshackle.core.api.errors import APIError, APIErrorCode, build_error_response, handle_api_exception
+from unshackle.core.api.handlers import (cancel_download_job_handler, download_handler, get_download_job_handler,
+ list_download_jobs_handler, list_titles_handler, list_tracks_handler)
+from unshackle.core.services import Services
+from unshackle.core.update_checker import UpdateChecker
+
+
+@web.middleware
+async def cors_middleware(request: web.Request, handler):
+ """Add CORS headers to all responses."""
+ # Handle preflight requests
+ if request.method == "OPTIONS":
+ response = web.Response()
+ else:
+ response = await handler(request)
+
+ # Add CORS headers
+ response.headers["Access-Control-Allow-Origin"] = "*"
+ response.headers["Access-Control-Allow-Methods"] = "GET, POST, PUT, DELETE, OPTIONS"
+ response.headers["Access-Control-Allow-Headers"] = "Content-Type, X-API-Key, Authorization"
+ response.headers["Access-Control-Max-Age"] = "3600"
+
+ return response
+
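For reference, the header set the middleware attaches can be written as plain data (values copied from the middleware above; the max-age is one hour):

```python
# The CORS headers applied to every response, including OPTIONS preflights.
def cors_headers() -> dict:
    return {
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Methods": "GET, POST, PUT, DELETE, OPTIONS",
        "Access-Control-Allow-Headers": "Content-Type, X-API-Key, Authorization",
        "Access-Control-Max-Age": "3600",
    }
```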
+
+log = logging.getLogger("api")
+
+
+async def health(request: web.Request) -> web.Response:
+ """
+ Health check endpoint.
+ ---
+ summary: Health check
+ description: Get server health status, version info, and update availability
+ responses:
+ '200':
+ description: Health status
+ content:
+ application/json:
+ schema:
+ type: object
+ properties:
+ status:
+ type: string
+ example: ok
+ version:
+ type: string
+ example: "2.0.0"
+ update_check:
+ type: object
+ properties:
+ update_available:
+ type: boolean
+ nullable: true
+ current_version:
+ type: string
+ latest_version:
+ type: string
+ nullable: true
+ """
+ try:
+ latest_version = await UpdateChecker.check_for_updates(__version__)
+ update_info = {
+ "update_available": latest_version is not None,
+ "current_version": __version__,
+ "latest_version": latest_version,
+ }
+ except Exception as e:
+ log.warning(f"Failed to check for updates: {e}")
+ update_info = {"update_available": None, "current_version": __version__, "latest_version": None}
+
+ return web.json_response({"status": "ok", "version": __version__, "update_check": update_info})
+
+
+async def services(request: web.Request) -> web.Response:
+ """
+ List available services.
+ ---
+ summary: List services
+ description: Get all available streaming services with their details
+ responses:
+ '200':
+ description: List of services
+ content:
+ application/json:
+ schema:
+ type: object
+ properties:
+ services:
+ type: array
+ items:
+ type: object
+ properties:
+ tag:
+ type: string
+ aliases:
+ type: array
+ items:
+ type: string
+ geofence:
+ type: array
+ items:
+ type: string
+ title_regex:
+ oneOf:
+ - type: string
+ - type: array
+ items:
+ type: string
+ nullable: true
+ url:
+ type: string
+ nullable: true
+ description: Service URL from short_help
+ help:
+ type: string
+ nullable: true
+ description: Full service documentation
+ '500':
+ description: Server error
+ content:
+ application/json:
+ schema:
+ type: object
+ properties:
+ status:
+ type: string
+ example: error
+ error_code:
+ type: string
+ example: INTERNAL_ERROR
+ message:
+ type: string
+ example: An unexpected error occurred
+ details:
+ type: object
+ timestamp:
+ type: string
+ format: date-time
+ debug_info:
+ type: object
+ description: Only present when --debug-api flag is enabled
+ """
+ try:
+ service_tags = Services.get_tags()
+ services_info = []
+
+ for tag in service_tags:
+ service_data = {"tag": tag, "aliases": [], "geofence": [], "title_regex": None, "url": None, "help": None}
+
+ try:
+ service_module = Services.load(tag)
+
+ if hasattr(service_module, "ALIASES"):
+ service_data["aliases"] = list(service_module.ALIASES)
+
+ if hasattr(service_module, "GEOFENCE"):
+ service_data["geofence"] = list(service_module.GEOFENCE)
+
+ if hasattr(service_module, "TITLE_RE"):
+ title_re = service_module.TITLE_RE
+ # Handle different types of TITLE_RE
+ if isinstance(title_re, re.Pattern):
+ service_data["title_regex"] = title_re.pattern
+ elif isinstance(title_re, str):
+ service_data["title_regex"] = title_re
+ elif isinstance(title_re, (list, tuple)):
+ # Convert list/tuple of patterns to list of strings
+ patterns = []
+ for item in title_re:
+ if isinstance(item, re.Pattern):
+ patterns.append(item.pattern)
+ elif isinstance(item, str):
+ patterns.append(item)
+ service_data["title_regex"] = patterns if patterns else None
+
+ if hasattr(service_module, "cli") and hasattr(service_module.cli, "short_help"):
+ service_data["url"] = service_module.cli.short_help
+
+ if service_module.__doc__:
+ service_data["help"] = service_module.__doc__.strip()
+
+ except Exception as e:
+ log.warning(f"Could not load details for service {tag}: {e}")
+
+ services_info.append(service_data)
+
+ return web.json_response({"services": services_info})
+ except Exception as e:
+ log.exception("Error listing services")
+ debug_mode = request.app.get("debug_api", False)
+ return handle_api_exception(e, context={"operation": "list_services"}, debug_mode=debug_mode)
+
+
+async def list_titles(request: web.Request) -> web.Response:
+ """
+ List titles for a service and title ID.
+ ---
+ summary: List titles
+ description: Get available titles for a service and title ID
+ requestBody:
+ required: true
+ content:
+ application/json:
+ schema:
+ type: object
+ required:
+ - service
+ - title_id
+ properties:
+ service:
+ type: string
+ description: Service tag
+ title_id:
+ type: string
+ description: Title identifier
+ responses:
+ '200':
+ description: List of titles
+ '400':
+ description: Invalid request (missing parameters, invalid service)
+ content:
+ application/json:
+ schema:
+ type: object
+ properties:
+ status:
+ type: string
+ example: error
+ error_code:
+ type: string
+ example: INVALID_INPUT
+ message:
+ type: string
+ example: Missing required parameter
+ details:
+ type: object
+ timestamp:
+ type: string
+ format: date-time
+ '401':
+ description: Authentication failed
+ content:
+ application/json:
+ schema:
+ type: object
+ properties:
+ status:
+ type: string
+ example: error
+ error_code:
+ type: string
+ example: AUTH_FAILED
+ message:
+ type: string
+ details:
+ type: object
+ timestamp:
+ type: string
+ format: date-time
+ '404':
+ description: Title not found
+ content:
+ application/json:
+ schema:
+ type: object
+ properties:
+ status:
+ type: string
+ example: error
+ error_code:
+ type: string
+ example: NOT_FOUND
+ message:
+ type: string
+ details:
+ type: object
+ timestamp:
+ type: string
+ format: date-time
+ '500':
+ description: Server error
+ content:
+ application/json:
+ schema:
+ type: object
+ properties:
+ status:
+ type: string
+ example: error
+ error_code:
+ type: string
+ example: INTERNAL_ERROR
+ message:
+ type: string
+ details:
+ type: object
+ timestamp:
+ type: string
+ format: date-time
+ """
+ try:
+ data = await request.json()
+ except Exception as e:
+ return build_error_response(
+ APIError(
+ APIErrorCode.INVALID_INPUT,
+ "Invalid JSON request body",
+ details={"error": str(e)},
+ ),
+ request.app.get("debug_api", False),
+ )
+
+ try:
+ return await list_titles_handler(data, request)
+ except APIError as e:
+ debug_mode = request.app.get("debug_api", False)
+ return build_error_response(e, debug_mode)
+
+
+async def list_tracks(request: web.Request) -> web.Response:
+ """
+ List tracks for a title, separated by type.
+ ---
+ summary: List tracks
+ description: Get available video, audio, and subtitle tracks for a title
+ requestBody:
+ required: true
+ content:
+ application/json:
+ schema:
+ type: object
+ required:
+ - service
+ - title_id
+ properties:
+ service:
+ type: string
+ description: Service tag
+ title_id:
+ type: string
+ description: Title identifier
+ wanted:
+ type: string
+ description: Specific episode/season (optional)
+ proxy:
+ type: string
+ description: Proxy configuration (optional)
+ responses:
+ '200':
+ description: Track information
+ '400':
+ description: Invalid request
+ """
+ try:
+ data = await request.json()
+ except Exception as e:
+ return build_error_response(
+ APIError(
+ APIErrorCode.INVALID_INPUT,
+ "Invalid JSON request body",
+ details={"error": str(e)},
+ ),
+ request.app.get("debug_api", False),
+ )
+
+ try:
+ return await list_tracks_handler(data, request)
+ except APIError as e:
+ debug_mode = request.app.get("debug_api", False)
+ return build_error_response(e, debug_mode)
+
+
+async def download(request: web.Request) -> web.Response:
+ """
+ Download content based on provided parameters.
+ ---
+ summary: Download content
+ description: Download video content based on specified parameters
+ requestBody:
+ required: true
+ content:
+ application/json:
+ schema:
+ type: object
+ required:
+ - service
+ - title_id
+ properties:
+ service:
+ type: string
+ description: Service tag
+ title_id:
+ type: string
+ description: Title identifier
+ profile:
+ type: string
+ description: Profile to use for credentials and cookies (default - None)
+ quality:
+ type: array
+ items:
+ type: integer
+ description: Download resolution(s) (default - best available)
+ vcodec:
+ type: string
+ description: Video codec to download (e.g., H264, H265, VP9, AV1) (default - None)
+ acodec:
+ type: string
+ description: Audio codec to download (e.g., AAC, AC3, EAC3) (default - None)
+ vbitrate:
+ type: integer
+ description: Video bitrate in kbps (default - None)
+ abitrate:
+ type: integer
+ description: Audio bitrate in kbps (default - None)
+ range:
+ type: array
+ items:
+ type: string
+ description: Video color range (SDR, HDR10, DV) (default - ["SDR"])
+ channels:
+ type: number
+ description: Audio channels (e.g., 2.0, 5.1, 7.1) (default - None)
+ no_atmos:
+ type: boolean
+ description: Exclude Dolby Atmos audio tracks (default - false)
+ wanted:
+ type: array
+ items:
+ type: string
+ description: Wanted episodes (e.g., ["S01E01", "S01E02"]) (default - all)
+ latest_episode:
+ type: boolean
+ description: Download only the single most recent episode (default - false)
+ lang:
+ type: array
+ items:
+ type: string
+ description: Language for video and audio (use 'orig' for original) (default - ["orig"])
+ v_lang:
+ type: array
+ items:
+ type: string
+ description: Language for video tracks only (default - [])
+ a_lang:
+ type: array
+ items:
+ type: string
+ description: Language for audio tracks only (default - [])
+ s_lang:
+ type: array
+ items:
+ type: string
+ description: Language for subtitle tracks (default - ["all"])
+ require_subs:
+ type: array
+ items:
+ type: string
+ description: Required subtitle languages (default - [])
+ forced_subs:
+ type: boolean
+ description: Include forced subtitle tracks (default - false)
+ exact_lang:
+ type: boolean
+ description: Use exact language matching (no variants) (default - false)
+ sub_format:
+ type: string
+ description: Output subtitle format (SRT, VTT, etc.) (default - None)
+ video_only:
+ type: boolean
+ description: Only download video tracks (default - false)
+ audio_only:
+ type: boolean
+ description: Only download audio tracks (default - false)
+ subs_only:
+ type: boolean
+ description: Only download subtitle tracks (default - false)
+ chapters_only:
+ type: boolean
+ description: Only download chapters (default - false)
+ no_subs:
+ type: boolean
+ description: Do not download subtitle tracks (default - false)
+ no_audio:
+ type: boolean
+ description: Do not download audio tracks (default - false)
+ no_chapters:
+ type: boolean
+ description: Do not download chapters (default - false)
+ audio_description:
+ type: boolean
+ description: Download audio description tracks (default - false)
+ slow:
+ type: boolean
+ description: Add 60-120s delay between downloads (default - false)
+ skip_dl:
+ type: boolean
+ description: Skip downloading, only retrieve decryption keys (default - false)
+ export:
+ type: string
+ description: Path to export decryption keys as JSON (default - None)
+ cdm_only:
+ type: boolean
+ description: Only use CDM for key retrieval (true) or only vaults (false) (default - None)
+ proxy:
+ type: string
+ description: Proxy URI or country code (default - None)
+ no_proxy:
+ type: boolean
+ description: Force disable all proxy use (default - false)
+ tag:
+ type: string
+ description: Set the group tag to be used (default - None)
+ tmdb_id:
+ type: integer
+ description: Use this TMDB ID for tagging (default - None)
+ tmdb_name:
+ type: boolean
+ description: Rename titles using TMDB name (default - false)
+ tmdb_year:
+ type: boolean
+ description: Use release year from TMDB (default - false)
+ no_folder:
+ type: boolean
+ description: Disable folder creation for TV shows (default - false)
+ no_source:
+ type: boolean
+ description: Disable source tag from output file name (default - false)
+ no_mux:
+ type: boolean
+ description: Do not mux tracks into a container file (default - false)
+ workers:
+ type: integer
+ description: Max workers/threads per track download (default - None)
+ downloads:
+ type: integer
+ description: Amount of tracks to download concurrently (default - 1)
+ best_available:
+ type: boolean
+ description: Continue with best available if requested quality unavailable (default - false)
+ responses:
+ '202':
+ description: Download job created
+ content:
+ application/json:
+ schema:
+ type: object
+ properties:
+ job_id:
+ type: string
+ status:
+ type: string
+ created_time:
+ type: string
+ '400':
+ description: Invalid request
+ """
+ try:
+ data = await request.json()
+ except Exception as e:
+ return build_error_response(
+ APIError(
+ APIErrorCode.INVALID_INPUT,
+ "Invalid JSON request body",
+ details={"error": str(e)},
+ ),
+ request.app.get("debug_api", False),
+ )
+
+ try:
+ return await download_handler(data, request)
+ except APIError as e:
+ debug_mode = request.app.get("debug_api", False)
+ return build_error_response(e, debug_mode)
+
+
+async def download_jobs(request: web.Request) -> web.Response:
+ """
+ List all download jobs with optional filtering and sorting.
+ ---
+ summary: List download jobs
+      description: List all download jobs with their status, optionally filtered by status or service and sorted
+ parameters:
+ - name: status
+ in: query
+ required: false
+ schema:
+ type: string
+ enum: [queued, downloading, completed, failed, cancelled]
+ description: Filter jobs by status
+ - name: service
+ in: query
+ required: false
+ schema:
+ type: string
+ description: Filter jobs by service tag
+ - name: sort_by
+ in: query
+ required: false
+ schema:
+ type: string
+ enum: [created_time, started_time, completed_time, progress, status, service]
+ default: created_time
+ description: Field to sort by
+ - name: sort_order
+ in: query
+ required: false
+ schema:
+ type: string
+ enum: [asc, desc]
+ default: desc
+ description: Sort order (ascending or descending)
+ responses:
+ '200':
+ description: List of download jobs
+ content:
+ application/json:
+ schema:
+ type: object
+ properties:
+ jobs:
+ type: array
+ items:
+ type: object
+ properties:
+ job_id:
+ type: string
+ status:
+ type: string
+ created_time:
+ type: string
+ service:
+ type: string
+ title_id:
+ type: string
+ progress:
+ type: number
+ '400':
+ description: Invalid query parameters
+ '500':
+ description: Server error
+ """
+ # Extract query parameters
+ query_params = {
+ "status": request.query.get("status"),
+ "service": request.query.get("service"),
+ "sort_by": request.query.get("sort_by", "created_time"),
+ "sort_order": request.query.get("sort_order", "desc"),
+ }
+ try:
+ return await list_download_jobs_handler(query_params, request)
+ except APIError as e:
+ debug_mode = request.app.get("debug_api", False)
+ return build_error_response(e, debug_mode)
+
+
+async def download_job_detail(request: web.Request) -> web.Response:
+ """
+ Get download job details.
+ ---
+ summary: Get download job
+ description: Get detailed information about a specific download job
+ parameters:
+ - name: job_id
+ in: path
+ required: true
+ schema:
+ type: string
+ responses:
+ '200':
+ description: Download job details
+ '404':
+ description: Job not found
+ '500':
+ description: Server error
+ """
+ job_id = request.match_info["job_id"]
+ try:
+ return await get_download_job_handler(job_id, request)
+ except APIError as e:
+ debug_mode = request.app.get("debug_api", False)
+ return build_error_response(e, debug_mode)
+
+
+async def cancel_download_job(request: web.Request) -> web.Response:
+ """
+ Cancel download job.
+ ---
+ summary: Cancel download job
+ description: Cancel a queued or running download job
+ parameters:
+ - name: job_id
+ in: path
+ required: true
+ schema:
+ type: string
+ responses:
+ '200':
+ description: Job cancelled successfully
+ '400':
+ description: Job cannot be cancelled
+ '404':
+ description: Job not found
+ '500':
+ description: Server error
+ """
+ job_id = request.match_info["job_id"]
+ try:
+ return await cancel_download_job_handler(job_id, request)
+ except APIError as e:
+ debug_mode = request.app.get("debug_api", False)
+ return build_error_response(e, debug_mode)
+
+
+def setup_routes(app: web.Application) -> None:
+ """Setup all API routes."""
+ app.router.add_get("/api/health", health)
+ app.router.add_get("/api/services", services)
+ app.router.add_post("/api/list-titles", list_titles)
+ app.router.add_post("/api/list-tracks", list_tracks)
+ app.router.add_post("/api/download", download)
+ app.router.add_get("/api/download/jobs", download_jobs)
+ app.router.add_get("/api/download/jobs/{job_id}", download_job_detail)
+ app.router.add_delete("/api/download/jobs/{job_id}", cancel_download_job)
+
+
+def setup_swagger(app: web.Application) -> None:
+ """Setup Swagger UI documentation."""
+ swagger = SwaggerDocs(
+ app,
+ swagger_ui_settings=SwaggerUiSettings(path="/api/docs/"),
+ info=SwaggerInfo(
+ title="Unshackle REST API",
+ version=__version__,
+ description="REST API for Unshackle - Modular Movie, TV, and Music Archival Software",
+ ),
+ )
+
+ # Add routes with OpenAPI documentation
+ swagger.add_routes(
+ [
+ web.get("/api/health", health),
+ web.get("/api/services", services),
+ web.post("/api/list-titles", list_titles),
+ web.post("/api/list-tracks", list_tracks),
+ web.post("/api/download", download),
+ web.get("/api/download/jobs", download_jobs),
+ web.get("/api/download/jobs/{job_id}", download_job_detail),
+ web.delete("/api/download/jobs/{job_id}", cancel_download_job),
+ ]
+ )
diff --git a/unshackle/core/binaries.py b/unshackle/core/binaries.py
index da31fb5..f846256 100644
--- a/unshackle/core/binaries.py
+++ b/unshackle/core/binaries.py
@@ -8,22 +8,20 @@ __shaka_platform = {"win32": "win", "darwin": "osx"}.get(sys.platform, sys.platf
def find(*names: str) -> Optional[Path]:
"""Find the path of the first found binary name."""
- # Get the directory containing this file to find the local binaries folder
- current_dir = Path(__file__).parent.parent
+ current_dir = Path(__file__).resolve().parent.parent
local_binaries_dir = current_dir / "binaries"
- for name in names:
- # First check local binaries folder
- if local_binaries_dir.exists():
- local_path = local_binaries_dir / name
- if local_path.is_file() and local_path.stat().st_mode & 0o111: # Check if executable
- return local_path
+ ext = ".exe" if sys.platform == "win32" else ""
- # Also check with .exe extension on Windows
- if sys.platform == "win32":
- local_path_exe = local_binaries_dir / f"{name}.exe"
- if local_path_exe.is_file():
- return local_path_exe
+ for name in names:
+ if local_binaries_dir.exists():
+ candidate_paths = [local_binaries_dir / f"{name}{ext}", local_binaries_dir / name / f"{name}{ext}"]
+
+ for path in candidate_paths:
+ if path.is_file():
+ # On Unix-like systems, check if file is executable
+ if sys.platform == "win32" or (path.stat().st_mode & 0o111):
+ return path
# Fall back to system PATH
path = shutil.which(name)
diff --git a/unshackle/core/cacher.py b/unshackle/core/cacher.py
index ba0c6a8..28cee47 100644
--- a/unshackle/core/cacher.py
+++ b/unshackle/core/cacher.py
@@ -91,7 +91,7 @@ class Cacher:
except jwt.DecodeError:
pass
- self.expiration = self._resolve_datetime(expiration) if expiration else None
+ self.expiration = self.resolve_datetime(expiration) if expiration else None
payload = {"data": self.data, "expiration": self.expiration, "version": self.version}
payload["crc32"] = zlib.crc32(jsonpickle.dumps(payload).encode("utf8"))
@@ -109,7 +109,7 @@ class Cacher:
return self.path.stat()
@staticmethod
- def _resolve_datetime(timestamp: EXP_T) -> datetime:
+ def resolve_datetime(timestamp: EXP_T) -> datetime:
"""
Resolve multiple formats of a Datetime or Timestamp to an absolute Datetime.
@@ -118,15 +118,15 @@ class Cacher:
datetime.datetime(2022, 6, 27, 9, 49, 13, 657208)
>>> iso8601 = now.isoformat()
'2022-06-27T09:49:13.657208'
- >>> Cacher._resolve_datetime(iso8601)
+ >>> Cacher.resolve_datetime(iso8601)
datetime.datetime(2022, 6, 27, 9, 49, 13, 657208)
- >>> Cacher._resolve_datetime(iso8601 + "Z")
+ >>> Cacher.resolve_datetime(iso8601 + "Z")
datetime.datetime(2022, 6, 27, 9, 49, 13, 657208)
- >>> Cacher._resolve_datetime(3600)
+ >>> Cacher.resolve_datetime(3600)
datetime.datetime(2022, 6, 27, 10, 52, 50, 657208)
- >>> Cacher._resolve_datetime('3600')
+ >>> Cacher.resolve_datetime('3600')
datetime.datetime(2022, 6, 27, 10, 52, 51, 657208)
- >>> Cacher._resolve_datetime(7800.113)
+ >>> Cacher.resolve_datetime(7800.113)
datetime.datetime(2022, 6, 27, 11, 59, 13, 770208)
In the int/float examples you may notice that it did not return now + 3600 seconds
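The contract of the now-public `resolve_datetime` can be sketched as a standalone re-implementation: datetimes pass through, ISO-8601 strings (with or without a trailing `Z`) parse as absolute times, and ints/floats or numeric strings are relative seconds. Note the real method anchors relative offsets to a different reference time, as its docstring goes on to explain; `datetime.now()` here is a simplifying assumption:

```python
# Sketch of resolve_datetime's contract (not the cached implementation above).
from datetime import datetime, timedelta
from typing import Union

def resolve_datetime(timestamp: Union[int, float, str, datetime]) -> datetime:
    if isinstance(timestamp, datetime):
        return timestamp
    if isinstance(timestamp, str):
        try:
            # Accept a trailing "Z" UTC marker, as in the doctest examples
            return datetime.fromisoformat(timestamp.rstrip("Z"))
        except ValueError:
            timestamp = float(timestamp)  # numeric string: fall through
    return datetime.now() + timedelta(seconds=float(timestamp))
```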
diff --git a/unshackle/core/cdm/__init__.py b/unshackle/core/cdm/__init__.py
index 10c0131..226f9ea 100644
--- a/unshackle/core/cdm/__init__.py
+++ b/unshackle/core/cdm/__init__.py
@@ -1,3 +1,4 @@
+from .custom_remote_cdm import CustomRemoteCDM
from .decrypt_labs_remote_cdm import DecryptLabsRemoteCDM
-__all__ = ["DecryptLabsRemoteCDM"]
+__all__ = ["DecryptLabsRemoteCDM", "CustomRemoteCDM"]
diff --git a/unshackle/core/cdm/custom_remote_cdm.py b/unshackle/core/cdm/custom_remote_cdm.py
new file mode 100644
index 0000000..cd4c559
--- /dev/null
+++ b/unshackle/core/cdm/custom_remote_cdm.py
@@ -0,0 +1,1092 @@
+from __future__ import annotations
+
+import base64
+import secrets
+from typing import Any, Dict, List, Optional, Union
+from uuid import UUID
+
+import requests
+from pywidevine.cdm import Cdm as WidevineCdm
+from pywidevine.device import DeviceTypes
+from requests import Session
+
+from unshackle.core import __version__
+from unshackle.core.vaults import Vaults
+
+
+class MockCertificateChain:
+ """Mock certificate chain for PlayReady compatibility."""
+
+ def __init__(self, name: str):
+ self._name = name
+
+ def get_name(self) -> str:
+ return self._name
+
+
+class Key:
+ """Key object compatible with pywidevine."""
+
+ def __init__(self, kid: str, key: str, type_: str = "CONTENT"):
+ if isinstance(kid, str):
+ clean_kid = kid.replace("-", "")
+ if len(clean_kid) == 32:
+ self.kid = UUID(hex=clean_kid)
+ else:
+ self.kid = UUID(hex=clean_kid.ljust(32, "0"))
+ else:
+ self.kid = kid
+
+ if isinstance(key, str):
+ self.key = bytes.fromhex(key)
+ else:
+ self.key = key
+
+ self.type = type_
+
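The KID normalization above (strip dashes, right-pad short hex to 32 characters) can be exercised on its own; `normalize_kid` below is an illustrative standalone helper that mirrors `Key.__init__`, not part of the class:

```python
from uuid import UUID

def normalize_kid(kid: str) -> UUID:
    # Mirror Key.__init__: drop dashes, then right-pad short hex KIDs
    # with zeros so they always form a full 32-character UUID.
    clean = kid.replace("-", "")
    if len(clean) != 32:
        clean = clean.ljust(32, "0")
    return UUID(hex=clean)

print(normalize_kid("01234567-89ab-cdef-0123-456789abcdef"))
print(normalize_kid("abcd"))  # padded: abcd0000-0000-0000-0000-000000000000
```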
+
+class CustomRemoteCDMExceptions:
+ """Exception classes for compatibility with pywidevine CDM."""
+
+ class InvalidSession(Exception):
+ """Raised when session ID is invalid."""
+
+ class TooManySessions(Exception):
+ """Raised when session limit is reached."""
+
+ class InvalidInitData(Exception):
+ """Raised when PSSH/init data is invalid."""
+
+ class InvalidLicenseType(Exception):
+ """Raised when license type is invalid."""
+
+ class InvalidLicenseMessage(Exception):
+ """Raised when license message is invalid."""
+
+ class InvalidContext(Exception):
+ """Raised when session has no context data."""
+
+ class SignatureMismatch(Exception):
+ """Raised when signature verification fails."""
+
+
+class CustomRemoteCDM:
+ """
+ Highly Configurable Custom Remote CDM implementation.
+
+ This class provides a maximally flexible CDM interface that can adapt to
+ ANY CDM API format through YAML configuration alone. It's designed to support
+ both current and future CDM providers without requiring code changes.
+
+ Key Features:
+ - Fully configuration-driven behavior (all logic controlled via YAML)
+ - Pluggable authentication strategies (header, body, bearer, basic, custom)
+ - Flexible endpoint configuration (custom paths, methods, timeouts)
+ - Advanced parameter mapping (rename, add static, conditional, nested)
+ - Powerful response parsing (deep field access, type detection, transforms)
+ - Transform engine (base64, hex, JSON, custom key formats)
+ - Condition evaluation (response type detection, success validation)
+ - Compatible with both Widevine and PlayReady DRM schemes
+ - Vault integration for intelligent key caching
+
+ Configuration Philosophy:
+ - 90% of new CDM providers: YAML config only
+ - 9% of cases: Add new transform type (minimal code)
+ - 1% of cases: Add new auth strategy (minimal code)
+ - 0% need to modify core request/response logic
+
+ The class is designed to handle diverse API patterns including:
+ - Different authentication mechanisms (headers vs body vs tokens)
+ - Custom endpoint paths and HTTP methods
+ - Parameter name variations (scheme vs device, init_data vs pssh)
+ - Nested JSON structures in requests/responses
+ - Various key formats (JSON objects, colon-separated strings, etc.)
+ - Different response success indicators and error messages
+ - Conditional parameters based on device type or other factors
+ """
+
+ service_certificate_challenge = b"\x08\x04"
+
+ def __init__(
+ self,
+ host: str,
+ service_name: Optional[str] = None,
+ vaults: Optional[Vaults] = None,
+ device: Optional[Dict[str, Any]] = None,
+ auth: Optional[Dict[str, Any]] = None,
+ endpoints: Optional[Dict[str, Any]] = None,
+ request_mapping: Optional[Dict[str, Any]] = None,
+ response_mapping: Optional[Dict[str, Any]] = None,
+ caching: Optional[Dict[str, Any]] = None,
+ legacy: Optional[Dict[str, Any]] = None,
+ timeout: int = 30,
+ **kwargs,
+ ):
+ """
+ Initialize Custom Remote CDM with highly configurable options.
+
+ Args:
+ host: Base URL for the CDM API
+ service_name: Service name for key caching and vault operations
+ vaults: Vaults instance for local key caching
+ device: Device configuration (name, type, system_id, security_level)
+ auth: Authentication configuration (type, credentials, headers)
+ endpoints: Endpoint configuration (paths, methods, timeouts)
+ request_mapping: Request transformation rules (param names, static params, transforms)
+ response_mapping: Response parsing rules (field locations, type detection, success conditions)
+ caching: Caching configuration (enabled, use_vaults, etc.)
+ legacy: Legacy mode configuration
+ timeout: Default request timeout in seconds
+ **kwargs: Additional configuration options for future extensibility
+ """
+ self.host = host.rstrip("/")
+ self.service_name = service_name or ""
+ self.vaults = vaults
+ self.timeout = timeout
+
+ # Device configuration
+ device = device or {}
+ self.device_name = device.get("name", "ChromeCDM")
+ self.device_type_str = device.get("type", "CHROME")
+ self.system_id = device.get("system_id", 26830)
+ self.security_level = device.get("security_level", 3)
+
+ # Determine if this is a PlayReady CDM
+ self._is_playready = self.device_type_str.upper() == "PLAYREADY" or self.device_name in ["SL2", "SL3"]
+
+        # Get device type enum for compatibility (falls back to CHROME)
+        self.device_type = self._get_device_type_enum(self.device_type_str or "CHROME")
+
+ # Authentication configuration
+ self.auth_config = auth or {"type": "header", "header_name": "Authorization", "key": ""}
+
+        # Endpoints configuration with defaults; each endpoint may be given
+        # as a bare path string or as a dict with path/method/timeout keys
+        endpoints = endpoints or {}
+
+        def _endpoint(name: str, default_path: str) -> Dict[str, Any]:
+            raw = endpoints.get(name, default_path)
+            if isinstance(raw, dict):
+                return {
+                    "path": raw.get("path", default_path),
+                    "method": raw.get("method", "POST"),
+                    "timeout": raw.get("timeout", self.timeout),
+                }
+            return {"path": raw, "method": "POST", "timeout": self.timeout}
+
+        self.endpoints = {
+            "get_request": _endpoint("get_request", "/get-challenge"),
+            "decrypt_response": _endpoint("decrypt_response", "/get-keys"),
+        }
+
+ # Request mapping configuration
+ self.request_mapping = request_mapping or {}
+
+ # Response mapping configuration
+ self.response_mapping = response_mapping or {}
+
+ # Caching configuration
+ caching = caching or {}
+ self.caching_enabled = caching.get("enabled", True)
+ self.use_vaults = caching.get("use_vaults", True) and self.vaults is not None
+ self.check_cached_first = caching.get("check_cached_first", True)
+
+ # Legacy configuration
+ self.legacy_config = legacy or {"enabled": False}
+
+ # Session management
+ self._sessions: Dict[bytes, Dict[str, Any]] = {}
+ self._pssh_b64 = None
+ self._required_kids: Optional[List[str]] = None
+
+ # HTTP session setup
+ self._http_session = Session()
+ self._http_session.headers.update(
+ {"Content-Type": "application/json", "User-Agent": f"unshackle-custom-cdm/{__version__}"}
+ )
+
+ # Apply custom headers from auth config
+ custom_headers = self.auth_config.get("custom_headers", {})
+ if custom_headers:
+ self._http_session.headers.update(custom_headers)
+
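To make the shape of these constructor options concrete, here is a hypothetical configuration expressed as the Python dict a YAML file would load into. The host, token, endpoint paths, and field names are all illustrative, not a real provider's API:

```python
# Every value below is a made-up example of the constructor's options.
config = {
    "host": "https://cdm.example.com",
    "auth": {"type": "bearer", "bearer_token": "example-token"},
    "endpoints": {
        "get_request": {"path": "/challenge", "method": "POST", "timeout": 15},
        "decrypt_response": "/keys",  # bare string form: path only, defaults elsewhere
    },
    "request_mapping": {
        "get_request": {
            "param_names": {"scheme": "device"},       # rename before sending
            "static_params": {"client": "unshackle"},  # always included
        }
    },
    "response_mapping": {
        "get_request": {
            "fields": {"challenge": "data.challenge", "session_id": "data.session"},
            "success_conditions": ["status == 'ok'"],
        }
    },
}
print(config["endpoints"]["get_request"]["path"])
```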
+    def _get_device_type_enum(self, device_type: str):
+        """Convert device type string to enum for compatibility."""
+        if device_type.upper() == "ANDROID":
+            return DeviceTypes.ANDROID
+        # CHROME and any unrecognized types fall back to the Chrome profile
+        return DeviceTypes.CHROME
+
+ @property
+ def is_playready(self) -> bool:
+ """Check if this CDM is in PlayReady mode."""
+ return self._is_playready
+
+ @property
+ def certificate_chain(self) -> MockCertificateChain:
+ """Mock certificate chain for PlayReady compatibility."""
+ return MockCertificateChain(f"{self.device_name}_Custom_Remote")
+
+ def set_pssh_b64(self, pssh_b64: str) -> None:
+ """Store base64-encoded PSSH data for PlayReady compatibility."""
+ self._pssh_b64 = pssh_b64
+
+ def set_required_kids(self, kids: List[Union[str, UUID]]) -> None:
+ """
+ Set the required Key IDs for intelligent caching decisions.
+
+ This method enables the CDM to make smart decisions about when to request
+ additional keys via license challenges. When cached keys are available,
+ the CDM will compare them against the required KIDs to determine if a
+ license request is still needed for missing keys.
+
+ Args:
+ kids: List of required Key IDs as UUIDs or hex strings
+
+ Note:
+ Should be called by DRM classes (PlayReady/Widevine) before making
+ license challenge requests to enable optimal caching behavior.
+ """
+        # Normalize all KIDs to dash-less lowercase hex for comparison
+        self._required_kids = [str(kid).replace("-", "").lower() for kid in kids]
+
+ def _generate_session_id(self) -> bytes:
+ """Generate a unique session ID."""
+ return secrets.token_bytes(16)
+
+ def _get_init_data_from_pssh(self, pssh: Any) -> str:
+ """Extract init data from various PSSH formats."""
+ if self.is_playready and self._pssh_b64:
+ return self._pssh_b64
+
+ if hasattr(pssh, "dumps"):
+ dumps_result = pssh.dumps()
+
+ if isinstance(dumps_result, str):
+ try:
+ base64.b64decode(dumps_result)
+ return dumps_result
+ except Exception:
+ return base64.b64encode(dumps_result.encode("utf-8")).decode("utf-8")
+ else:
+ return base64.b64encode(dumps_result).decode("utf-8")
+ elif hasattr(pssh, "raw"):
+ raw_data = pssh.raw
+ if isinstance(raw_data, str):
+ raw_data = raw_data.encode("utf-8")
+ return base64.b64encode(raw_data).decode("utf-8")
+ elif hasattr(pssh, "__class__") and "WrmHeader" in pssh.__class__.__name__:
+ if self.is_playready:
+ raise ValueError("PlayReady WRM header received but no PSSH B64 was set via set_pssh_b64()")
+
+ if hasattr(pssh, "raw_bytes"):
+ return base64.b64encode(pssh.raw_bytes).decode("utf-8")
+ elif hasattr(pssh, "bytes"):
+ return base64.b64encode(pssh.bytes).decode("utf-8")
+ else:
+ raise ValueError(f"Cannot extract PSSH data from WRM header type: {type(pssh)}")
+ else:
+ raise ValueError(f"Unsupported PSSH type: {type(pssh)}")
+
+ def _get_nested_field(self, data: Dict[str, Any], field_path: str, default: Any = None) -> Any:
+ """
+ Get a nested field from a dictionary using dot notation.
+
+ Args:
+ data: Dictionary to extract field from
+ field_path: Field path using dot notation (e.g., "data.cached_keys")
+ default: Default value if field not found
+
+ Returns:
+ Field value or default
+
+ Examples:
+ _get_nested_field({"data": {"keys": [1,2,3]}}, "data.keys") -> [1,2,3]
+ _get_nested_field({"message": "success"}, "message") -> "success"
+ """
+ if not field_path:
+ return default
+
+ keys = field_path.split(".")
+ current = data
+
+ for key in keys:
+ if isinstance(current, dict) and key in current:
+ current = current[key]
+ else:
+ return default
+
+ return current
+
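The dot-notation traversal above can be sketched as a standalone function mirroring the method's logic outside the class:

```python
from typing import Any, Dict

def get_nested_field(data: Dict[str, Any], field_path: str, default: Any = None) -> Any:
    # Walk the dict one dotted segment at a time; bail out to the default
    # as soon as a segment is missing or the current value is not a dict.
    if not field_path:
        return default
    current: Any = data
    for key in field_path.split("."):
        if isinstance(current, dict) and key in current:
            current = current[key]
        else:
            return default
    return current

print(get_nested_field({"data": {"keys": [1, 2, 3]}}, "data.keys"))  # [1, 2, 3]
print(get_nested_field({"message": "success"}, "data.keys", "n/a"))  # n/a
```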
+ def _apply_transform(self, value: Any, transform_type: str) -> Any:
+ """
+ Apply a transformation to a value.
+
+ Args:
+ value: Value to transform
+ transform_type: Type of transformation to apply
+
+ Returns:
+ Transformed value
+
+ Supported transforms:
+ - base64_encode: Encode bytes/string to base64
+ - base64_decode: Decode base64 string to bytes
+ - hex_encode: Encode bytes to hex string
+ - hex_decode: Decode hex string to bytes
+ - json_stringify: Convert object to JSON string
+ - json_parse: Parse JSON string to object
+ - parse_key_string: Parse "kid:key" format strings
+ """
+ if transform_type == "base64_encode":
+ if isinstance(value, str):
+ value = value.encode("utf-8")
+ return base64.b64encode(value).decode("utf-8")
+
+ elif transform_type == "base64_decode":
+ if isinstance(value, str):
+ return base64.b64decode(value)
+ return value
+
+ elif transform_type == "hex_encode":
+ if isinstance(value, bytes):
+ return value.hex()
+ elif isinstance(value, str):
+ return value.encode("utf-8").hex()
+ return value
+
+ elif transform_type == "hex_decode":
+ if isinstance(value, str):
+ return bytes.fromhex(value)
+ return value
+
+ elif transform_type == "json_stringify":
+ import json
+
+ return json.dumps(value)
+
+ elif transform_type == "json_parse":
+ import json
+
+ if isinstance(value, str):
+ return json.loads(value)
+ return value
+
+ elif transform_type == "parse_key_string":
+ # Handle key formats like "kid:key" or "--key kid:key"
+ if isinstance(value, str):
+ keys = []
+ for line in value.split("\n"):
+ line = line.strip()
+ if line.startswith("--key "):
+ line = line[6:]
+ if ":" in line:
+ kid, key = line.split(":", 1)
+ keys.append({"kid": kid.strip(), "key": key.strip(), "type": "CONTENT"})
+ return keys
+ return value
+
+ # Unknown transform type - return value unchanged
+ return value
+
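Of the transforms above, `parse_key_string` is the least obvious; this standalone sketch of the same logic shows how the "--key kid:key" lines some key servers emit are turned into key dicts:

```python
def parse_key_string(value: str):
    # Accepts newline-separated "kid:key" pairs, optionally prefixed with
    # "--key " (the shell-argument style some key servers emit).
    keys = []
    for line in value.split("\n"):
        line = line.strip()
        if line.startswith("--key "):
            line = line[6:]
        if ":" in line:
            kid, key = line.split(":", 1)
            keys.append({"kid": kid.strip(), "key": key.strip(), "type": "CONTENT"})
    return keys

sample = "--key 0123456789abcdef0123456789abcdef:00112233445566778899aabbccddeeff"
print(parse_key_string(sample))
```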
+ def _evaluate_condition(self, condition: str, context: Dict[str, Any]) -> bool:
+ """
+ Evaluate a simple condition against a context.
+
+ Args:
+ condition: Condition string (e.g., "message == 'success'")
+ context: Context dictionary with values to check
+
+ Returns:
+ True if condition is met, False otherwise
+
+ Supported conditions:
+ - "field == value": Equality check
+ - "field != value": Inequality check
+ - "field == null": Null check
+ - "field != null": Not null check
+ - "field exists": Existence check
+ """
+ condition = condition.strip()
+
+ # Check for existence
+ if " exists" in condition:
+ field = condition.replace(" exists", "").strip()
+ return self._get_nested_field(context, field) is not None
+
+ # Check for null comparisons
+ if " == null" in condition:
+ field = condition.replace(" == null", "").strip()
+ return self._get_nested_field(context, field) is None
+
+ if " != null" in condition:
+ field = condition.replace(" != null", "").strip()
+ return self._get_nested_field(context, field) is not None
+
+ # Check for equality
+ if " == " in condition:
+ parts = condition.split(" == ", 1)
+ field = parts[0].strip()
+ expected_value = parts[1].strip().strip("'\"")
+ actual_value = self._get_nested_field(context, field)
+ return str(actual_value) == expected_value
+
+ # Check for inequality
+ if " != " in condition:
+ parts = condition.split(" != ", 1)
+ field = parts[0].strip()
+ expected_value = parts[1].strip().strip("'\"")
+ actual_value = self._get_nested_field(context, field)
+ return str(actual_value) != expected_value
+
+ # Unknown condition format - return False
+ return False
+
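The condition grammar above can be illustrated with a simplified standalone evaluator; this sketch supports only flat (non-nested) context keys, unlike the method itself:

```python
def evaluate_condition(condition: str, context: dict) -> bool:
    # Order matters: the null checks are matched before the generic
    # ==/!= comparisons so "x != null" is not parsed as x != "null".
    condition = condition.strip()
    if condition.endswith(" exists"):
        return context.get(condition[:-7].strip()) is not None
    if condition.endswith(" == null"):
        return context.get(condition[:-8].strip()) is None
    if condition.endswith(" != null"):
        return context.get(condition[:-8].strip()) is not None
    for op in (" == ", " != "):
        if op in condition:
            field, expected = condition.split(op, 1)
            actual = str(context.get(field.strip()))
            matches = actual == expected.strip().strip("'\"")
            return matches if op == " == " else not matches
    return False

print(evaluate_condition("message == 'success'", {"message": "success"}))  # True
print(evaluate_condition("error != null", {"message": "success"}))         # False
```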
+ def _build_request_params(
+ self, endpoint_name: str, base_params: Dict[str, Any], session: Optional[Dict[str, Any]] = None
+ ) -> Dict[str, Any]:
+ """
+ Build request parameters with mapping and transformations.
+
+ Args:
+ endpoint_name: Name of the endpoint (e.g., "get_request", "decrypt_response")
+ base_params: Base parameters to transform
+ session: Optional session data for context
+
+ Returns:
+ Transformed parameters dictionary
+
+ This method applies the following transformations in order:
+ 1. Parameter name mappings (rename parameters)
+ 2. Static parameters (add fixed values)
+ 3. Conditional parameters (add based on conditions)
+ 4. Parameter transforms (apply data transformations)
+ 5. Nested parameter structure (create nested objects)
+ 6. Parameter exclusions (remove unwanted params)
+ """
+ # Get mapping config for this endpoint
+ mapping_config = self.request_mapping.get(endpoint_name, {})
+
+ # Start with base parameters
+ params = base_params.copy()
+
+ # 1. Apply parameter name mappings
+ param_names = mapping_config.get("param_names", {})
+ if param_names:
+ renamed_params = {}
+ for old_name, new_name in param_names.items():
+ if old_name in params:
+ renamed_params[new_name] = params.pop(old_name)
+ params.update(renamed_params)
+
+ # 2. Add static parameters
+ static_params = mapping_config.get("static_params", {})
+ if static_params:
+ params.update(static_params)
+
+ # 3. Add conditional parameters
+ conditional_params = mapping_config.get("conditional_params", [])
+ for condition_block in conditional_params:
+ condition = condition_block.get("condition", "")
+ # Create context for condition evaluation
+ context = {
+ "device_type": self.device_type_str,
+ "device_name": self.device_name,
+ "is_playready": self._is_playready,
+ }
+ if session:
+ context.update(session)
+
+ if self._evaluate_condition(condition, context):
+ params.update(condition_block.get("params", {}))
+
+ # 4. Apply parameter transforms
+ transforms = mapping_config.get("transforms", [])
+ for transform in transforms:
+ param_name = transform.get("param")
+ transform_type = transform.get("type")
+ if param_name in params:
+ params[param_name] = self._apply_transform(params[param_name], transform_type)
+
+ # 5. Handle nested parameter structure
+ nested_params = mapping_config.get("nested_params", {})
+ if nested_params:
+ for parent_key, child_keys in nested_params.items():
+ nested_obj = {}
+ for child_key in child_keys:
+ if child_key in params:
+ nested_obj[child_key] = params.pop(child_key)
+ if nested_obj:
+ params[parent_key] = nested_obj
+
+ # 6. Exclude unwanted parameters
+ exclude_params = mapping_config.get("exclude_params", [])
+ for param_name in exclude_params:
+ params.pop(param_name, None)
+
+ return params
+
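The transformation pipeline above can be sketched with a cut-down standalone version covering steps 1, 2, and 5 (rename, static merge, nesting); the mapping values are illustrative:

```python
def build_params(base: dict, mapping: dict) -> dict:
    params = dict(base)
    # 1. Rename parameters (param_names: old -> new)
    for old, new in mapping.get("param_names", {}).items():
        if old in params:
            params[new] = params.pop(old)
    # 2. Merge fixed values in (static_params)
    params.update(mapping.get("static_params", {}))
    # 5. Fold selected keys into a nested object (nested_params)
    for parent, children in mapping.get("nested_params", {}).items():
        nested = {c: params.pop(c) for c in children if c in params}
        if nested:
            params[parent] = nested
    return params

mapping = {
    "param_names": {"scheme": "device"},
    "static_params": {"version": 2},
    "nested_params": {"payload": ["init_data", "device"]},
}
result = build_params({"scheme": "L3", "init_data": "AAAA"}, mapping)
print(result)  # {'version': 2, 'payload': {'init_data': 'AAAA', 'device': 'L3'}}
```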
+ def _apply_authentication(self, session: Session) -> None:
+ """
+ Apply authentication to the HTTP session based on auth configuration.
+
+ Args:
+ session: requests.Session to apply authentication to
+
+ Supported auth types:
+ - header: Add authentication header (e.g., x-api-key, Authorization)
+ - body: Authentication will be added to request body (handled in request building)
+ - bearer: Add Bearer token to Authorization header
+ - basic: Add HTTP Basic authentication
+ - query: Authentication will be added to query string (handled in request)
+ """
+ auth_type = self.auth_config.get("type", "header")
+
+ if auth_type == "header":
+ header_name = self.auth_config.get("header_name", "Authorization")
+ key = self.auth_config.get("key", "")
+ if key:
+ session.headers[header_name] = key
+
+ elif auth_type == "bearer":
+ token = self.auth_config.get("bearer_token") or self.auth_config.get("key", "")
+ if token:
+ session.headers["Authorization"] = f"Bearer {token}"
+
+ elif auth_type == "basic":
+ username = self.auth_config.get("username", "")
+ password = self.auth_config.get("password", "")
+ if username and password:
+ from requests.auth import HTTPBasicAuth
+
+ session.auth = HTTPBasicAuth(username, password)
+
+ def _parse_response_data(self, endpoint_name: str, response_data: Dict[str, Any]) -> Dict[str, Any]:
+ """
+ Parse response data based on response mapping configuration.
+
+ Args:
+ endpoint_name: Name of the endpoint (e.g., "get_request", "decrypt_response")
+ response_data: Raw response data from API
+
+ Returns:
+ Parsed response with standardized field names
+
+ This method extracts fields from the response using the response_mapping
+ configuration, handling nested fields, type detection, and transformations.
+ """
+ # Get mapping config for this endpoint
+ mapping_config = self.response_mapping.get(endpoint_name, {})
+
+ # Extract fields based on mapping
+ fields_config = mapping_config.get("fields", {})
+ parsed = {}
+
+ for standard_name, field_path in fields_config.items():
+ value = self._get_nested_field(response_data, field_path)
+ if value is not None:
+ parsed[standard_name] = value
+
+ # Apply response transforms
+ transforms = mapping_config.get("transforms", [])
+ for transform in transforms:
+ field_name = transform.get("field")
+ transform_type = transform.get("type")
+ if field_name in parsed:
+ parsed[field_name] = self._apply_transform(parsed[field_name], transform_type)
+
+ # Determine response type
+ response_types = mapping_config.get("response_types", [])
+ for response_type_config in response_types:
+ condition = response_type_config.get("condition", "")
+ if self._evaluate_condition(condition, parsed):
+ parsed["_response_type"] = response_type_config.get("type")
+ break
+
+ # Check success conditions
+ success_conditions = mapping_config.get("success_conditions", [])
+ is_success = True
+ if success_conditions:
+ is_success = all(self._evaluate_condition(cond, parsed) for cond in success_conditions)
+ parsed["_is_success"] = is_success
+
+ # Extract error messages if not successful
+ if not is_success:
+ error_fields = mapping_config.get("error_fields", ["error", "message", "details"])
+ error_messages = []
+ for error_field in error_fields:
+ error_msg = self._get_nested_field(response_data, error_field)
+ if error_msg and error_msg not in error_messages:
+ error_messages.append(str(error_msg))
+ parsed["_error_message"] = " - ".join(error_messages) if error_messages else "Unknown error"
+
+ return parsed
+
+ def _parse_keys_from_response(self, endpoint_name: str, response_data: Dict[str, Any]) -> List[Dict[str, Any]]:
+ """
+ Parse keys from response data using key field mapping.
+
+ Args:
+ endpoint_name: Name of the endpoint
+ response_data: Parsed response data
+
+ Returns:
+ List of key dictionaries with standardized format
+ """
+ mapping_config = self.response_mapping.get(endpoint_name, {})
+ key_fields = mapping_config.get("key_fields", {"kid": "kid", "key": "key", "type": "type"})
+
+ keys = []
+        keys_data = response_data.get("keys") or response_data.get("cached_keys") or []
+
+ if isinstance(keys_data, list):
+ for key_obj in keys_data:
+ if isinstance(key_obj, dict):
+ kid = key_obj.get(key_fields.get("kid", "kid"))
+ key = key_obj.get(key_fields.get("key", "key"))
+ key_type = key_obj.get(key_fields.get("type", "type"), "CONTENT")
+
+ if kid and key:
+ keys.append({"kid": str(kid), "key": str(key), "type": str(key_type)})
+
+ # Handle string format keys (e.g., "kid:key" format)
+ elif isinstance(keys_data, str):
+ keys = self._apply_transform(keys_data, "parse_key_string")
+
+ return keys
+
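The two accepted key shapes (list of dicts with remappable field names, or a "kid:key"-per-line string) can be shown with a standalone sketch of the same parsing logic; the `key_id`/`content_key` field names are made up for the example:

```python
def parse_keys(keys_data, key_fields=None):
    # Keys may arrive as a list of dicts (field names remappable via
    # key_fields) or as a "kid:key"-per-line string blob.
    key_fields = key_fields or {"kid": "kid", "key": "key", "type": "type"}
    if isinstance(keys_data, list):
        return [
            {"kid": str(o[key_fields["kid"]]), "key": str(o[key_fields["key"]]),
             "type": str(o.get(key_fields["type"], "CONTENT"))}
            for o in keys_data
            if o.get(key_fields["kid"]) and o.get(key_fields["key"])
        ]
    if isinstance(keys_data, str):
        return [
            {"kid": k.strip(), "key": v.strip(), "type": "CONTENT"}
            for k, v in (line.split(":", 1) for line in keys_data.splitlines() if ":" in line)
        ]
    return keys_data

remapped = parse_keys([{"key_id": "ab" * 16, "content_key": "cd" * 16}],
                      {"kid": "key_id", "key": "content_key", "type": "type"})
print(remapped)
```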
+ def open(self) -> bytes:
+ """
+ Open a new CDM session.
+
+ Returns:
+ Session identifier as bytes
+ """
+ session_id = self._generate_session_id()
+ self._sessions[session_id] = {
+ "service_certificate": None,
+ "keys": [],
+ "pssh": None,
+ "challenge": None,
+ "remote_session_id": None,
+ "tried_cache": False,
+ "cached_keys": None,
+ }
+ return session_id
+
+ def close(self, session_id: bytes) -> None:
+ """
+ Close a CDM session and perform comprehensive cleanup.
+
+ Args:
+ session_id: Session identifier
+
+ Raises:
+            InvalidSession: If session ID is invalid
+ """
+ if session_id not in self._sessions:
+ raise CustomRemoteCDMExceptions.InvalidSession(f"Invalid session ID: {session_id.hex()}")
+
+ session = self._sessions[session_id]
+ session.clear()
+ del self._sessions[session_id]
+
+ def get_service_certificate(self, session_id: bytes) -> Optional[bytes]:
+ """
+ Get the service certificate for a session.
+
+ Args:
+ session_id: Session identifier
+
+ Returns:
+ Service certificate if set, None otherwise
+
+ Raises:
+            InvalidSession: If session ID is invalid
+ """
+ if session_id not in self._sessions:
+ raise CustomRemoteCDMExceptions.InvalidSession(f"Invalid session ID: {session_id.hex()}")
+
+ return self._sessions[session_id]["service_certificate"]
+
+ def set_service_certificate(self, session_id: bytes, certificate: Optional[Union[bytes, str]]) -> str:
+ """
+ Set the service certificate for a session.
+
+ Args:
+ session_id: Session identifier
+ certificate: Service certificate (bytes or base64 string)
+
+ Returns:
+ Certificate status message
+
+ Raises:
+            InvalidSession: If session ID is invalid
+ """
+ if session_id not in self._sessions:
+ raise CustomRemoteCDMExceptions.InvalidSession(f"Invalid session ID: {session_id.hex()}")
+
+ if certificate is None:
+ if not self._is_playready and self.device_name == "L1":
+ certificate = WidevineCdm.common_privacy_cert
+ self._sessions[session_id]["service_certificate"] = base64.b64decode(certificate)
+ return "Using default Widevine common privacy certificate for L1"
+ else:
+ self._sessions[session_id]["service_certificate"] = None
+ return "No certificate set (not required for this device type)"
+
+ if isinstance(certificate, str):
+ certificate = base64.b64decode(certificate)
+
+ self._sessions[session_id]["service_certificate"] = certificate
+ return "Successfully set Service Certificate"
+
+ def has_cached_keys(self, session_id: bytes) -> bool:
+ """
+ Check if cached keys are available for the session.
+
+ Args:
+ session_id: Session identifier
+
+ Returns:
+ True if cached keys are available
+
+ Raises:
+            InvalidSession: If session ID is invalid
+ """
+ if session_id not in self._sessions:
+ raise CustomRemoteCDMExceptions.InvalidSession(f"Invalid session ID: {session_id.hex()}")
+
+ session = self._sessions[session_id]
+ session_keys = session.get("keys", [])
+ return len(session_keys) > 0
+
+ def get_license_challenge(
+ self, session_id: bytes, pssh_or_wrm: Any, license_type: str = "STREAMING", privacy_mode: bool = True
+ ) -> bytes:
+ """
+ Generate a license challenge using the custom CDM API.
+
+ This method implements intelligent caching logic that checks vaults first,
+ then attempts to retrieve cached keys from the API, and only makes a
+ license request if keys are missing.
+
+ Args:
+ session_id: Session identifier
+ pssh_or_wrm: PSSH object or WRM header (for PlayReady compatibility)
+ license_type: Type of license (STREAMING, OFFLINE, AUTOMATIC) - for compatibility only
+ privacy_mode: Whether to use privacy mode - for compatibility only
+
+ Returns:
+ License challenge as bytes, or empty bytes if available keys satisfy requirements
+
+ Raises:
+ InvalidSession: If session ID is invalid
+ requests.RequestException: If API request fails
+ """
+ _ = license_type, privacy_mode
+
+ if session_id not in self._sessions:
+ raise CustomRemoteCDMExceptions.InvalidSession(f"Invalid session ID: {session_id.hex()}")
+
+ session = self._sessions[session_id]
+ session["pssh"] = pssh_or_wrm
+ init_data = self._get_init_data_from_pssh(pssh_or_wrm)
+
+ # Check vaults for cached keys first
+ if self.use_vaults and self._required_kids:
+ vault_keys = []
+ for kid_str in self._required_kids:
+ try:
+ clean_kid = kid_str.replace("-", "")
+ if len(clean_kid) == 32:
+ kid_uuid = UUID(hex=clean_kid)
+ else:
+ kid_uuid = UUID(hex=clean_kid.ljust(32, "0"))
+ key, _ = self.vaults.get_key(kid_uuid)
+                    if key and key.count("0") != len(key):  # ignore all-zero placeholder keys
+ vault_keys.append({"kid": kid_str, "key": key, "type": "CONTENT"})
+ except (ValueError, TypeError):
+ continue
+
+ if vault_keys:
+ vault_kids = set(k["kid"] for k in vault_keys)
+ required_kids = set(self._required_kids)
+
+ if required_kids.issubset(vault_kids):
+ session["keys"] = vault_keys
+ return b""
+ else:
+ session["vault_keys"] = vault_keys
+
+ # Build request parameters
+ base_params = {
+ "scheme": self.device_name,
+ "init_data": init_data,
+ }
+
+ if self.service_name:
+ base_params["service"] = self.service_name
+
+ if session["service_certificate"]:
+ base_params["service_certificate"] = base64.b64encode(session["service_certificate"]).decode("utf-8")
+
+ # Transform parameters based on configuration
+ request_params = self._build_request_params("get_request", base_params, session)
+
+ # Apply authentication
+ self._apply_authentication(self._http_session)
+
+ # Make API request
+ endpoint_config = self.endpoints["get_request"]
+ url = f"{self.host}{endpoint_config['path']}"
+ timeout = endpoint_config["timeout"]
+
+ response = self._http_session.post(url, json=request_params, timeout=timeout)
+
+ if response.status_code != 200:
+ raise requests.RequestException(f"API request failed: {response.status_code} {response.text}")
+
+ # Parse response
+ response_data = response.json()
+ parsed_response = self._parse_response_data("get_request", response_data)
+
+ # Check if request was successful
+ if not parsed_response.get("_is_success", False):
+ error_msg = parsed_response.get("_error_message", "Unknown error")
+ raise requests.RequestException(f"API error: {error_msg}")
+
+ # Determine response type
+ response_type = parsed_response.get("_response_type")
+
+ # Handle cached keys response
+ if response_type == "cached_keys" or "cached_keys" in parsed_response:
+ cached_keys = self._parse_keys_from_response("get_request", parsed_response)
+
+ all_available_keys = list(cached_keys)
+ if "vault_keys" in session:
+ all_available_keys.extend(session["vault_keys"])
+
+ session["tried_cache"] = True
+
+ # Check if we have all required keys
+ if self._required_kids:
+ available_kids = set()
+ for key in all_available_keys:
+ if isinstance(key, dict) and "kid" in key:
+ available_kids.add(key["kid"].replace("-", "").lower())
+
+ required_kids = set(self._required_kids)
+ missing_kids = required_kids - available_kids
+
+ if missing_kids:
+ # Store cached keys separately - don't populate session["keys"] yet
+ # This allows parse_license() to properly combine cached + license keys
+ session["cached_keys"] = cached_keys
+ else:
+ # All required keys are available from cache
+ session["keys"] = all_available_keys
+ return b""
+ else:
+ # No required KIDs specified - return cached keys
+ session["keys"] = all_available_keys
+ return b""
+
+ # Handle license request response or fetch license if keys missing
+ challenge = parsed_response.get("challenge")
+ remote_session_id = parsed_response.get("session_id")
+
+ if challenge and remote_session_id:
+ # Decode challenge if it's base64
+ if isinstance(challenge, str):
+ try:
+ challenge = base64.b64decode(challenge)
+ except Exception:
+ challenge = challenge.encode("utf-8")
+
+ session["challenge"] = challenge
+ session["remote_session_id"] = remote_session_id
+ return challenge
+
+ # If we have some keys but not all, return empty to skip license parsing
+ if session.get("keys"):
+ return b""
+
+ raise requests.RequestException("API response did not contain challenge or cached keys")
+
+ def parse_license(self, session_id: bytes, license_message: Union[bytes, str]) -> None:
+ """
+ Parse license response using the custom CDM API.
+
+ This method intelligently combines cached keys with newly obtained license keys,
+ avoiding duplicates while ensuring all required keys are available.
+
+ Args:
+ session_id: Session identifier
+ license_message: License response from license server
+
+ Raises:
+            InvalidSession: If session ID is invalid
+            ValueError: If no challenge is available for the session
+ requests.RequestException: If API request fails
+ """
+ if session_id not in self._sessions:
+ raise CustomRemoteCDMExceptions.InvalidSession(f"Invalid session ID: {session_id.hex()}")
+
+ session = self._sessions[session_id]
+
+ # Skip parsing if we already have final keys (no cached keys to combine)
+ # If cached_keys exist (Widevine or PlayReady), we need to combine them with license keys
+ if session["keys"] and "cached_keys" not in session:
+ return
+
+ # Ensure we have a challenge and session ID
+ if not session.get("challenge") or not session.get("remote_session_id"):
+ raise ValueError("No challenge available - call get_license_challenge first")
+
+        # Prepare the license message as base64 for the remote API
+        if isinstance(license_message, str):
+            if self.is_playready and license_message.strip().startswith("<"):
+                # Raw PlayReady XML license; encode it for transport
+                license_b64 = base64.b64encode(license_message.encode("utf-8")).decode("utf-8")
+            else:
+                # Assume the string is already base64-encoded
+                license_b64 = license_message
+        else:
+            license_b64 = base64.b64encode(license_message).decode("utf-8")
+
+        # Build request parameters (these base names are internal defaults and
+        # can be remapped per provider via request_mapping)
+        base_params = {
+            "session_id": session["remote_session_id"],
+            "license_response": license_b64,
+        }
+        request_params = self._build_request_params("decrypt_response", base_params, session)
+
+        # Make API request
+        self._apply_authentication(self._http_session)
+        endpoint_config = self.endpoints["decrypt_response"]
+        url = f"{self.host}{endpoint_config['path']}"
+        response = self._http_session.post(url, json=request_params, timeout=endpoint_config["timeout"])
+        if response.status_code != 200:
+            raise requests.RequestException(f"API request failed: {response.status_code} {response.text}")
+
+        parsed_response = self._parse_response_data("decrypt_response", response.json())
+        if not parsed_response.get("_is_success", False):
+            raise requests.RequestException(f"API error: {parsed_response.get('_error_message', 'Unknown error')}")
+
+        # Combine license keys with any cached/vault keys, de-duplicating by KID
+        all_keys = self._parse_keys_from_response("decrypt_response", parsed_response)
+        seen_kids = {k["kid"].replace("-", "").lower() for k in all_keys}
+        for extra in (session.get("cached_keys") or []) + (session.get("vault_keys") or []):
+            kid = extra["kid"].replace("-", "").lower()
+            if kid not in seen_kids:
+                all_keys.append(extra)
+                seen_kids.add(kid)
+        session["keys"] = all_keys
+
+    def keys(self, session_id: bytes, type_: Optional[str] = None) -> List[Key]:
+ """
+ Get keys from the session.
+
+ Args:
+ session_id: Session identifier
+ type_: Optional key type filter (CONTENT, SIGNING, etc.)
+
+ Returns:
+ List of Key objects
+
+ Raises:
+ InvalidSession: If session ID is invalid
+ """
+ if session_id not in self._sessions:
+ raise CustomRemoteCDMExceptions.InvalidSession(f"Invalid session ID: {session_id.hex()}")
+
+ key_dicts = self._sessions[session_id]["keys"]
+ keys = [Key(kid=k["kid"], key=k["key"], type_=k["type"]) for k in key_dicts]
+
+ if type_:
+ keys = [key for key in keys if key.type == type_]
+
+ return keys
+
+
+__all__ = ["CustomRemoteCDM"]
diff --git a/unshackle/core/cdm/decrypt_labs_remote_cdm.py b/unshackle/core/cdm/decrypt_labs_remote_cdm.py
index 8645b4e..3806107 100644
--- a/unshackle/core/cdm/decrypt_labs_remote_cdm.py
+++ b/unshackle/core/cdm/decrypt_labs_remote_cdm.py
@@ -474,7 +474,6 @@ class DecryptLabsRemoteCDM:
if "vault_keys" in session:
all_available_keys.extend(session["vault_keys"])
- session["keys"] = all_available_keys
session["tried_cache"] = True
if self._required_kids:
@@ -505,10 +504,7 @@ class DecryptLabsRemoteCDM:
license_request_data = request_data.copy()
license_request_data["get_cached_keys_if_exists"] = False
- session["decrypt_labs_session_id"] = None
- session["challenge"] = None
- session["tried_cache"] = False
-
+ # Make license request for missing keys
response = self._http_session.post(
f"{self.host}/get-request", json=license_request_data, timeout=30
)
@@ -522,8 +518,12 @@ class DecryptLabsRemoteCDM:
return b""
else:
+ # All required keys are available from cache
+ session["keys"] = all_available_keys
return b""
else:
+ # No required KIDs specified - return cached keys
+ session["keys"] = all_available_keys
return b""
if message_type == "license-request" or "challenge" in data:
@@ -572,7 +572,9 @@ class DecryptLabsRemoteCDM:
session = self._sessions[session_id]
- if session["keys"] and not (self.is_playready and "cached_keys" in session):
+ # Skip parsing if we already have final keys (no cached keys to combine)
+ # If cached_keys exist (Widevine or PlayReady), we need to combine them with license keys
+ if session["keys"] and "cached_keys" not in session:
return
if not session.get("challenge") or not session.get("decrypt_labs_session_id"):
diff --git a/unshackle/core/config.py b/unshackle/core/config.py
index 79483ce..6eb7b26 100644
--- a/unshackle/core/config.py
+++ b/unshackle/core/config.py
@@ -31,6 +31,7 @@ class Config:
class _Filenames:
# default filenames, do not modify here, set via config
log = "unshackle_{name}_{time}.log" # Directories.logs
+ debug_log = "unshackle_debug_{service}_{time}.jsonl" # Directories.logs
config = "config.yaml" # Directories.services / tag
root_config = "unshackle.yaml" # Directories.user_configs
chapters = "Chapters_{title}_{random}.txt" # Directories.temp
@@ -88,6 +89,7 @@ class Config:
self.tag_group_name: bool = kwargs.get("tag_group_name", True)
self.tag_imdb_tmdb: bool = kwargs.get("tag_imdb_tmdb", True)
self.tmdb_api_key: str = kwargs.get("tmdb_api_key") or ""
+ self.simkl_client_id: str = kwargs.get("simkl_client_id") or ""
self.decrypt_labs_api_key: str = kwargs.get("decrypt_labs_api_key") or ""
self.update_checks: bool = kwargs.get("update_checks", True)
self.update_check_interval: int = kwargs.get("update_check_interval", 24)
@@ -98,6 +100,9 @@ class Config:
self.title_cache_max_retention: int = kwargs.get("title_cache_max_retention", 86400) # 24 hours default
self.title_cache_enabled: bool = kwargs.get("title_cache_enabled", True)
+ self.debug: bool = kwargs.get("debug", False)
+ self.debug_keys: bool = kwargs.get("debug_keys", False)
+
@classmethod
def from_yaml(cls, path: Path) -> Config:
if not path.exists():
@@ -113,8 +118,8 @@ POSSIBLE_CONFIG_PATHS = (
Config._Directories.namespace_dir / Config._Filenames.root_config,
# The Parent Folder to the unshackle Namespace Folder (e.g., %appdata%/Python/Python311/site-packages)
Config._Directories.namespace_dir.parent / Config._Filenames.root_config,
- # The AppDirs User Config Folder (e.g., %localappdata%/unshackle)
- Config._Directories.user_configs / Config._Filenames.root_config,
+ # The AppDirs User Config Folder (e.g., ~/.config/unshackle on Linux, %LOCALAPPDATA%\unshackle on Windows)
+ Path(Config._Directories.app_dirs.user_config_dir) / Config._Filenames.root_config,
)
diff --git a/unshackle/core/constants.py b/unshackle/core/constants.py
index 6a14f7d..65c6681 100644
--- a/unshackle/core/constants.py
+++ b/unshackle/core/constants.py
@@ -8,7 +8,13 @@ DRM_SORT_MAP = ["ClearKey", "Widevine"]
LANGUAGE_MAX_DISTANCE = 5 # this is max to be considered "same", e.g., en, en-US, en-AU
LANGUAGE_EXACT_DISTANCE = 0 # exact match only, no variants
VIDEO_CODEC_MAP = {"AVC": "H.264", "HEVC": "H.265"}
-DYNAMIC_RANGE_MAP = {"HDR10": "HDR", "HDR10+": "HDR10P", "Dolby Vision": "DV", "HDR10 / HDR10+": "HDR10P", "HDR10 / HDR10": "HDR"}
+DYNAMIC_RANGE_MAP = {
+ "HDR10": "HDR",
+ "HDR10+": "HDR10P",
+ "Dolby Vision": "DV",
+ "HDR10 / HDR10+": "HDR10P",
+ "HDR10 / HDR10": "HDR",
+}
AUDIO_CODEC_MAP = {"E-AC-3": "DDP", "AC-3": "DD"}
context_settings = dict(
diff --git a/unshackle/core/downloaders/n_m3u8dl_re.py b/unshackle/core/downloaders/n_m3u8dl_re.py
index d183111..7472c59 100644
--- a/unshackle/core/downloaders/n_m3u8dl_re.py
+++ b/unshackle/core/downloaders/n_m3u8dl_re.py
@@ -1,12 +1,10 @@
-import logging
import os
import re
import subprocess
import warnings
from http.cookiejar import CookieJar
-from itertools import chain
from pathlib import Path
-from typing import Any, Generator, MutableMapping, Optional, Union
+from typing import Any, Generator, MutableMapping
import requests
from requests.cookies import cookiejar_from_dict, get_cookie_header
@@ -16,251 +14,331 @@ from unshackle.core.config import config
from unshackle.core.console import console
from unshackle.core.constants import DOWNLOAD_CANCELLED
+PERCENT_RE = re.compile(r"(\d+\.\d+%)")
+SPEED_RE = re.compile(r"(\d+\.\d+(?:MB|KB)ps)")
+SIZE_RE = re.compile(r"(\d+\.\d+(?:MB|GB|KB)/\d+\.\d+(?:MB|GB|KB))")
+WARN_RE = re.compile(r"(WARN : Response.*|WARN : One or more errors occurred.*)")
+ERROR_RE = re.compile(r"(ERROR.*)")
+
+DECRYPTION_ENGINE = {
+ "shaka": "SHAKA_PACKAGER",
+ "mp4decrypt": "MP4DECRYPT",
+}
+
# Ignore FutureWarnings
warnings.simplefilter(action="ignore", category=FutureWarning)
-AUDIO_CODEC_MAP = {"AAC": "mp4a", "AC3": "ac-3", "EC3": "ec-3"}
-VIDEO_CODEC_MAP = {"AVC": "avc", "HEVC": "hvc", "DV": "dvh", "HLG": "hev"}
+def get_track_selection_args(track: Any) -> list[str]:
+ """
+ Generates track selection arguments for N_m3u8dl_RE.
-def track_selection(track: object) -> list[str]:
- """Return the N_m3u8DL-RE stream selection arguments for a track."""
+ Args:
+ track: A track object with attributes like descriptor, data, and class name.
- if "dash" in track.data:
- adaptation_set = track.data["dash"]["adaptation_set"]
- representation = track.data["dash"]["representation"]
+ Returns:
+ A list of strings for track selection.
- track_type = track.__class__.__name__
- codec = track.codec.name
- bitrate = track.bitrate // 1000
- language = track.language
- width = track.width if track_type == "Video" else None
- height = track.height if track_type == "Video" else None
- range = track.range.name if track_type == "Video" else None
+ Raises:
+ ValueError: If the manifest type is unsupported or track selection fails.
+ """
+ descriptor = track.descriptor.name
+ track_type = track.__class__.__name__
- elif "ism" in track.data:
- stream_index = track.data["ism"]["stream_index"]
- quality_level = track.data["ism"]["quality_level"]
+ def _create_args(flag: str, parts: list[str], type_str: str, extra_args: list[str] | None = None) -> list[str]:
+ if not parts:
+ raise ValueError(f"[N_m3u8DL-RE]: Unable to select {type_str} track from {descriptor} manifest")
- track_type = track.__class__.__name__
- codec = track.codec.name
- bitrate = track.bitrate // 1000
- language = track.language
- width = track.width if track_type == "Video" else None
- height = track.height if track_type == "Video" else None
- range = track.range.name if track_type == "Video" else None
- adaptation_set = stream_index
- representation = quality_level
+ final_args = [flag, ":".join(parts)]
+ if extra_args:
+ final_args.extend(extra_args)
- else:
- return []
+ return final_args
- if track_type == "Audio":
- codecs = AUDIO_CODEC_MAP.get(codec)
- langs = adaptation_set.findall("lang") + representation.findall("lang")
- track_ids = list(
- set(
- v
- for x in chain(adaptation_set, representation)
- for v in (x.get("audioTrackId"), x.get("id"))
- if v is not None
+ match descriptor:
+ case "HLS":
+ # HLS playlists are direct inputs; no selection arguments needed.
+ return []
+
+ case "DASH":
+ representation = track.data.get("dash", {}).get("representation", {})
+ adaptation_set = track.data.get("dash", {}).get("adaptation_set", {})
+ parts = []
+
+ if track_type == "Audio":
+ if track_id := representation.get("id") or adaptation_set.get("audioTrackId"):
+ parts.append(rf'"id=\b{track_id}\b"')
+ else:
+ if codecs := representation.get("codecs"):
+ parts.append(f"codecs={codecs}")
+ if lang := representation.get("lang") or adaptation_set.get("lang"):
+ parts.append(f"lang={lang}")
+ if bw := representation.get("bandwidth"):
+ bitrate = int(bw) // 1000
+ parts.append(f"bwMin={bitrate}:bwMax={bitrate + 5}")
+ if roles := representation.findall("Role") + adaptation_set.findall("Role"):
+ if role := next((r.get("value") for r in roles if r.get("value", "").lower() == "main"), None):
+ parts.append(f"role={role}")
+ return _create_args("-sa", parts, "audio")
+
+ if track_type == "Video":
+ if track_id := representation.get("id"):
+ parts.append(rf'"id=\b{track_id}\b"')
+ else:
+ if width := representation.get("width"):
+ parts.append(f"res={width}*")
+ if codecs := representation.get("codecs"):
+ parts.append(f"codecs={codecs}")
+ if bw := representation.get("bandwidth"):
+ bitrate = int(bw) // 1000
+ parts.append(f"bwMin={bitrate}:bwMax={bitrate + 5}")
+ return _create_args("-sv", parts, "video")
+
+ if track_type == "Subtitle":
+ if track_id := representation.get("id"):
+ parts.append(rf'"id=\b{track_id}\b"')
+ else:
+ if lang := representation.get("lang"):
+ parts.append(f"lang={lang}")
+ return _create_args("-ss", parts, "subtitle", extra_args=["--auto-subtitle-fix", "false"])
+
+ case "ISM":
+ quality_level = track.data.get("ism", {}).get("quality_level", {})
+ stream_index = track.data.get("ism", {}).get("stream_index", {})
+ parts = []
+
+ if track_type == "Audio":
+ if name := stream_index.get("Name") or quality_level.get("Index"):
+ parts.append(rf'"id=\b{name}\b"')
+ else:
+ if codecs := quality_level.get("FourCC"):
+ parts.append(f"codecs={codecs}")
+ if lang := stream_index.get("Language"):
+ parts.append(f"lang={lang}")
+ if br := quality_level.get("Bitrate"):
+ bitrate = int(br) // 1000
+ parts.append(f"bwMin={bitrate}:bwMax={bitrate + 5}")
+ return _create_args("-sa", parts, "audio")
+
+ if track_type == "Video":
+ if name := stream_index.get("Name") or quality_level.get("Index"):
+ parts.append(rf'"id=\b{name}\b"')
+ else:
+ if width := quality_level.get("MaxWidth"):
+ parts.append(f"res={width}*")
+ if codecs := quality_level.get("FourCC"):
+ parts.append(f"codecs={codecs}")
+ if br := quality_level.get("Bitrate"):
+ bitrate = int(br) // 1000
+ parts.append(f"bwMin={bitrate}:bwMax={bitrate + 5}")
+ return _create_args("-sv", parts, "video")
+
+ # I've yet to encounter a subtitle track in ISM manifests, so this is mostly theoretical.
+ if track_type == "Subtitle":
+ if name := stream_index.get("Name") or quality_level.get("Index"):
+ parts.append(rf'"id=\b{name}\b"')
+ else:
+ if lang := stream_index.get("Language"):
+ parts.append(f"lang={lang}")
+ return _create_args("-ss", parts, "subtitle", extra_args=["--auto-subtitle-fix", "false"])
+
+ case "URL":
+ raise ValueError(
+ f"[N_m3u8DL-RE]: Direct URL downloads are not supported for {track_type} tracks. "
+ f"The track should use a different downloader (e.g., 'requests', 'aria2c')."
)
- )
- roles = adaptation_set.findall("Role") + representation.findall("Role")
- role = ":role=main" if next((i for i in roles if i.get("value").lower() == "main"), None) else ""
- bandwidth = f"bwMin={bitrate}:bwMax={bitrate + 5}"
- if langs:
- track_selection = ["-sa", f"lang={language}:codecs={codecs}:{bandwidth}{role}"]
- elif len(track_ids) == 1:
- track_selection = ["-sa", f"id={track_ids[0]}"]
- else:
- track_selection = ["-sa", f"for=best{role}"]
- return track_selection
+ raise ValueError(f"[N_m3u8DL-RE]: Unsupported manifest type: {descriptor}")
- if track_type == "Video":
- # adjust codec based on range
- codec_adjustments = {("HEVC", "DV"): "DV", ("HEVC", "HLG"): "HLG"}
- codec = codec_adjustments.get((codec, range), codec)
- codecs = VIDEO_CODEC_MAP.get(codec)
- bandwidth = f"bwMin={bitrate}:bwMax={bitrate + 5}"
- if width and height:
- resolution = f"{width}x{height}"
- elif width:
- resolution = f"{width}*"
- else:
- resolution = "for=best"
- if resolution.startswith("for="):
- track_selection = ["-sv", resolution]
- track_selection.append(f"codecs={codecs}:{bandwidth}")
- else:
- track_selection = ["-sv", f"res={resolution}:codecs={codecs}:{bandwidth}"]
- return track_selection
+def build_download_args(
+ track_url: str,
+ filename: str,
+ output_dir: Path,
+ thread_count: int,
+ retry_count: int,
+ track_from_file: Path | None,
+ custom_args: dict[str, Any] | None,
+ headers: dict[str, Any] | None,
+ cookies: CookieJar | None,
+ proxy: str | None,
+ content_keys: dict[str, str] | None,
+ ad_keyword: str | None,
+ skip_merge: bool | None = False,
+) -> list[str]:
+ """Constructs the CLI arguments for N_m3u8DL-RE."""
+
+ # Default arguments
+ args = {
+ "--save-name": filename,
+ "--save-dir": output_dir,
+ "--tmp-dir": output_dir,
+ "--thread-count": thread_count,
+ "--download-retry-count": retry_count,
+ "--write-meta-json": False,
+ "--no-log": True,
+ }
+ if proxy:
+ args["--custom-proxy"] = proxy
+ if skip_merge:
+ args["--skip-merge"] = skip_merge
+ if ad_keyword:
+ args["--ad-keyword"] = ad_keyword
+    if content_keys:
+        # N_m3u8DL-RE accepts --key multiple times, so pass every KID:KEY pair
+        # rather than only the first one.
+        args["--key"] = [f"{kid.hex}:{key.lower()}" for kid, key in content_keys.items()]
+        args["--decryption-engine"] = DECRYPTION_ENGINE.get(config.decryption.lower()) or "SHAKA_PACKAGER"
+    if custom_args:
+        args.update(custom_args)
+
+    command = [track_from_file or track_url]
+    for flag, value in args.items():
+        if value is True:
+            command.append(flag)
+        elif value is False:
+            command.extend([flag, "false"])
+        elif isinstance(value, list):
+            for item in value:
+                command.extend([flag, str(item)])
+        elif value is not None:
+            command.extend([flag, str(value)])
+
+ if headers:
+ for key, value in headers.items():
+ if key.lower() not in ("accept-encoding", "cookie"):
+ command.extend(["--header", f"{key}: {value}"])
+
+ if cookies:
+ req = requests.Request(method="GET", url=track_url)
+ cookie_header = get_cookie_header(cookies, req)
+ command.extend(["--header", f"Cookie: {cookie_header}"])
+
+ return command
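The loop in `build_download_args` above turns the `args` dict into argv tokens with three rules: `True` becomes a bare flag, `False` becomes `<flag> false`, and `None` is omitted. A standalone sketch of just that conversion (the helper name is illustrative, not part of the diff):

```python
# Standalone sketch of the flag-to-argv rules in build_download_args above:
# True -> bare flag, False -> "<flag> false", None -> omitted,
# anything else -> "<flag> <str(value)>".
def flags_to_argv(args: dict) -> list[str]:
    command: list[str] = []
    for flag, value in args.items():
        if value is True:
            command.append(flag)
        elif value is False:
            command.extend([flag, "false"])
        elif value is not None:
            command.extend([flag, str(value)])
    return command
```

For example, `{"--no-log": True, "--write-meta-json": False, "--thread-count": 16, "--custom-proxy": None}` yields `["--no-log", "--write-meta-json", "false", "--thread-count", "16"]`.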
def download(
- urls: Union[str, dict[str, Any], list[str], list[dict[str, Any]]],
- track: object,
+ urls: str | dict[str, Any] | list[str | dict[str, Any]],
+ track: Any,
output_dir: Path,
filename: str,
- headers: Optional[MutableMapping[str, Union[str, bytes]]] = None,
- cookies: Optional[Union[MutableMapping[str, str], CookieJar]] = None,
- proxy: Optional[str] = None,
- max_workers: Optional[int] = None,
- content_keys: Optional[dict[str, Any]] = None,
+ headers: MutableMapping[str, str | bytes] | None,
+ cookies: MutableMapping[str, str] | CookieJar | None,
+ proxy: str | None,
+ max_workers: int | None,
+ content_keys: dict[str, Any] | None,
+ skip_merge: bool | None = False,
) -> Generator[dict[str, Any], None, None]:
if not urls:
raise ValueError("urls must be provided and not empty")
- elif not isinstance(urls, (str, dict, list)):
- raise TypeError(f"Expected urls to be {str} or {dict} or a list of one of them, not {type(urls)}")
-
- if not output_dir:
- raise ValueError("output_dir must be provided")
- elif not isinstance(output_dir, Path):
- raise TypeError(f"Expected output_dir to be {Path}, not {type(output_dir)}")
-
- if not filename:
- raise ValueError("filename must be provided")
- elif not isinstance(filename, str):
- raise TypeError(f"Expected filename to be {str}, not {type(filename)}")
-
+ if not isinstance(urls, (str, dict, list)):
+ raise TypeError(f"Expected urls to be str, dict, or list, not {type(urls)}")
+ if not isinstance(output_dir, Path):
+ raise TypeError(f"Expected output_dir to be Path, not {type(output_dir)}")
+ if not isinstance(filename, str) or not filename:
+ raise ValueError("filename must be a non-empty string")
if not isinstance(headers, (MutableMapping, type(None))):
- raise TypeError(f"Expected headers to be {MutableMapping}, not {type(headers)}")
-
+ raise TypeError(f"Expected headers to be a mapping or None, not {type(headers)}")
if not isinstance(cookies, (MutableMapping, CookieJar, type(None))):
- raise TypeError(f"Expected cookies to be {MutableMapping} or {CookieJar}, not {type(cookies)}")
-
+ raise TypeError(f"Expected cookies to be a mapping, CookieJar, or None, not {type(cookies)}")
if not isinstance(proxy, (str, type(None))):
- raise TypeError(f"Expected proxy to be {str}, not {type(proxy)}")
-
- if not max_workers:
- max_workers = min(32, (os.cpu_count() or 1) + 4)
- elif not isinstance(max_workers, int):
- raise TypeError(f"Expected max_workers to be {int}, not {type(max_workers)}")
-
- if not isinstance(urls, list):
- urls = [urls]
-
- if not binaries.N_m3u8DL_RE:
- raise EnvironmentError("N_m3u8DL-RE executable not found...")
+ raise TypeError(f"Expected proxy to be a str or None, not {type(proxy)}")
+ if not isinstance(max_workers, (int, type(None))):
+ raise TypeError(f"Expected max_workers to be an int or None, not {type(max_workers)}")
+ if not isinstance(content_keys, (dict, type(None))):
+ raise TypeError(f"Expected content_keys to be a dict or None, not {type(content_keys)}")
+ if not isinstance(skip_merge, (bool, type(None))):
+ raise TypeError(f"Expected skip_merge to be a bool or None, not {type(skip_merge)}")
if cookies and not isinstance(cookies, CookieJar):
cookies = cookiejar_from_dict(cookies)
- track_type = track.__class__.__name__
- thread_count = str(config.n_m3u8dl_re.get("thread_count", max_workers))
- retry_count = str(config.n_m3u8dl_re.get("retry_count", max_workers))
+ if not binaries.N_m3u8DL_RE:
+ raise EnvironmentError("N_m3u8DL-RE executable not found...")
+
+ effective_max_workers = max_workers or min(32, (os.cpu_count() or 1) + 4)
+
+ if proxy and not config.n_m3u8dl_re.get("use_proxy", True):
+ proxy = None
+
+ thread_count = config.n_m3u8dl_re.get("thread_count", effective_max_workers)
+ retry_count = config.n_m3u8dl_re.get("retry_count", 10)
ad_keyword = config.n_m3u8dl_re.get("ad_keyword")
- arguments = [
- track.url,
- "--save-dir",
- output_dir,
- "--tmp-dir",
- output_dir,
- "--thread-count",
- thread_count,
- "--download-retry-count",
- retry_count,
- "--no-log",
- "--write-meta-json",
- "false",
- ]
+ arguments = build_download_args(
+ track_url=track.url,
+ track_from_file=track.from_file,
+ filename=filename,
+ output_dir=output_dir,
+ thread_count=thread_count,
+ retry_count=retry_count,
+ custom_args=track.downloader_args,
+ headers=headers,
+ cookies=cookies,
+ proxy=proxy,
+ content_keys=content_keys,
+ skip_merge=skip_merge,
+ ad_keyword=ad_keyword,
+ )
+ arguments.extend(get_track_selection_args(track))
- for header, value in (headers or {}).items():
- if header.lower() in ("accept-encoding", "cookie"):
- continue
- arguments.extend(["--header", f"{header}: {value}"])
-
- if cookies:
- cookie_header = get_cookie_header(cookies, requests.Request(url=track.url))
- if cookie_header:
- arguments.extend(["--header", f"Cookie: {cookie_header}"])
-
- if proxy:
- arguments.extend(["--custom-proxy", proxy])
-
- if content_keys:
- for kid, key in content_keys.items():
- keys = f"{kid.hex}:{key.lower()}"
- arguments.extend(["--key", keys])
- arguments.extend(["--use-shaka-packager"])
-
- if ad_keyword:
- arguments.extend(["--ad-keyword", ad_keyword])
-
- if track.descriptor.name == "URL":
- error = f"[N_m3u8DL-RE]: {track.descriptor} is currently not supported"
- raise ValueError(error)
- elif track.descriptor.name == "DASH":
- arguments.extend(track_selection(track))
-
- # TODO: improve this nonsense
- percent_re = re.compile(r"(\d+\.\d+%)")
- speed_re = re.compile(r"(? Generator[dict[str, Any], None, None]:
"""
Download files using N_m3u8DL-RE.
@@ -275,28 +353,33 @@ def n_m3u8dl_re(
The data is in the same format accepted by rich's progress.update() function.
Parameters:
- urls: Web URL(s) to file(s) to download. You can use a dictionary with the key
- "url" for the URI, and other keys for extra arguments to use per-URL.
+        urls: Web URL(s) to file(s) to download. NOTE: currently unused; the
+            track's own URL (or from_file path) is passed to N_m3u8DL-RE instead.
track: The track to download. Used to get track attributes for the selection
process. Note that Track.Descriptor.URL is not supported by N_m3u8DL-RE.
output_dir: The folder to save the file into. If the save path's directory does
not exist then it will be made automatically.
- filename: The filename or filename template to use for each file. The variables
- you can use are `i` for the URL index and `ext` for the URL extension.
- headers: A mapping of HTTP Header Key/Values to use for the download.
- cookies: A mapping of Cookie Key/Values or a Cookie Jar to use for the download.
+ filename: The filename or filename template to use for each file.
+ headers: A mapping of HTTP Header Key/Values to use for all downloads.
+ cookies: A mapping of Cookie Key/Values or a Cookie Jar to use for all downloads.
+ proxy: A proxy to use for all downloads.
max_workers: The maximum amount of threads to use for downloads. Defaults to
min(32,(cpu_count+4)). Can be set in config with --thread-count option.
content_keys: The content keys to use for decryption.
+ skip_merge: Whether to skip merging the downloaded chunks.
"""
- track_type = track.__class__.__name__
- log = logging.getLogger("N_m3u8DL-RE")
- if proxy and not config.n_m3u8dl_re.get("use_proxy", True):
- log.warning(f"{track_type}: Ignoring proxy as N_m3u8DL-RE is set to use_proxy=False")
- proxy = None
-
- yield from download(urls, track, output_dir, filename, headers, cookies, proxy, max_workers, content_keys)
+ yield from download(
+ urls=urls,
+ track=track,
+ output_dir=output_dir,
+ filename=filename,
+ headers=headers,
+ cookies=cookies,
+ proxy=proxy,
+ max_workers=max_workers,
+ content_keys=content_keys,
+ skip_merge=skip_merge,
+ )
__all__ = ("n_m3u8dl_re",)
diff --git a/unshackle/core/drm/playready.py b/unshackle/core/drm/playready.py
index a26428a..b1fcea0 100644
--- a/unshackle/core/drm/playready.py
+++ b/unshackle/core/drm/playready.py
@@ -338,7 +338,7 @@ class PlayReady:
]
try:
- subprocess.run(cmd, check=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
+            subprocess.run(cmd, check=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True, encoding="utf-8")
except subprocess.CalledProcessError as e:
error_msg = e.stderr if e.stderr else f"mp4decrypt failed with exit code {e.returncode}"
raise subprocess.CalledProcessError(e.returncode, cmd, output=e.stdout, stderr=error_msg)
diff --git a/unshackle/core/drm/widevine.py b/unshackle/core/drm/widevine.py
index 6c3d683..7fee1c9 100644
--- a/unshackle/core/drm/widevine.py
+++ b/unshackle/core/drm/widevine.py
@@ -289,7 +289,7 @@ class Widevine:
]
try:
- subprocess.run(cmd, check=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
+            subprocess.run(cmd, check=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True, encoding="utf-8")
except subprocess.CalledProcessError as e:
error_msg = e.stderr if e.stderr else f"mp4decrypt failed with exit code {e.returncode}"
raise subprocess.CalledProcessError(e.returncode, cmd, output=e.stdout, stderr=error_msg)
diff --git a/unshackle/core/manifests/dash.py b/unshackle/core/manifests/dash.py
index 56fec08..442ac96 100644
--- a/unshackle/core/manifests/dash.py
+++ b/unshackle/core/manifests/dash.py
@@ -297,8 +297,9 @@ class DASH:
manifest_base_url = track.url
elif not re.match("^https?://", manifest_base_url, re.IGNORECASE):
manifest_base_url = urljoin(track.url, f"./{manifest_base_url}")
- period_base_url = urljoin(manifest_base_url, period.findtext("BaseURL"))
- rep_base_url = urljoin(period_base_url, representation.findtext("BaseURL"))
+ period_base_url = urljoin(manifest_base_url, period.findtext("BaseURL") or "")
+ adaptation_set_base_url = urljoin(period_base_url, adaptation_set.findtext("BaseURL") or "")
+ rep_base_url = urljoin(adaptation_set_base_url, representation.findtext("BaseURL") or "")
period_duration = period.get("duration") or manifest.get("mediaPresentationDuration")
init_data: Optional[bytes] = None
@@ -384,7 +385,8 @@ class DASH:
segment_duration = float(segment_template.get("duration")) or 1
if not end_number:
- end_number = math.ceil(period_duration / (segment_duration / segment_timescale))
+ segment_count = math.ceil(period_duration / (segment_duration / segment_timescale))
+ end_number = start_number + segment_count - 1
for s in range(start_number, end_number + 1):
segments.append(
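The `end_number` fix in the dash.py hunk above matters whenever `startNumber` is not 1: the old expression returned the segment *count*, which overshoots by one for zero-based SegmentTemplate numbering. A sketch of the corrected arithmetic (the helper name is illustrative):

```python
import math

# The last segment number must be derived from start_number; returning the
# raw segment count only happens to be correct when startNumber is 1.
def last_segment_number(period_duration: float, segment_duration: float,
                        segment_timescale: float, start_number: int) -> int:
    segment_count = math.ceil(period_duration / (segment_duration / segment_timescale))
    return start_number + segment_count - 1
```

For a 60 s period of 6 s segments (timescale 1) there are 10 segments: numbered 0 through 9 when `startNumber="0"`, or 1 through 10 when `startNumber="1"`.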
diff --git a/unshackle/core/manifests/hls.py b/unshackle/core/manifests/hls.py
index d48d96e..6f49c6a 100644
--- a/unshackle/core/manifests/hls.py
+++ b/unshackle/core/manifests/hls.py
@@ -249,17 +249,20 @@ class HLS:
log = logging.getLogger("HLS")
- # Get the playlist text and handle both session types
- response = session.get(track.url)
- if isinstance(response, requests.Response):
- if not response.ok:
- log.error(f"Failed to request the invariant M3U8 playlist: {response.status_code}")
- sys.exit(1)
- playlist_text = response.text
+ if track.from_file:
+ master = m3u8.load(str(track.from_file))
else:
- raise TypeError(f"Expected response to be a requests.Response or curl_cffi.Response, not {type(response)}")
+ # Get the playlist text and handle both session types
+ response = session.get(track.url)
+ if isinstance(response, requests.Response):
+ if not response.ok:
+ log.error(f"Failed to request the invariant M3U8 playlist: {response.status_code}")
+ sys.exit(1)
+ playlist_text = response.text
+ else:
+ raise TypeError(f"Expected response to be a requests.Response or curl_cffi.Response, not {type(response)}")
- master = m3u8.loads(playlist_text, uri=track.url)
+ master = m3u8.loads(playlist_text, uri=track.url)
if not master.segments:
log.error("Track's HLS playlist has no segments, expecting an invariant M3U8 playlist.")
@@ -310,7 +313,7 @@ class HLS:
if segment.byterange:
byte_range = HLS.calculate_byte_range(segment.byterange, range_offset)
- range_offset = byte_range.split("-")[0]
+ range_offset = int(byte_range.split("-")[0])
else:
byte_range = None
@@ -439,7 +442,7 @@ class HLS:
elif len(files) != range_len:
raise ValueError(f"Missing {range_len - len(files)} segment files for {segment_range}...")
- if isinstance(drm, Widevine):
+ if isinstance(drm, (Widevine, PlayReady)):
# with widevine we can merge all segments and decrypt once
merge(to=merged_path, via=files, delete=True, include_map_data=True)
drm.decrypt(merged_path)
diff --git a/unshackle/core/proxies/__init__.py b/unshackle/core/proxies/__init__.py
index 10008c1..ecb97de 100644
--- a/unshackle/core/proxies/__init__.py
+++ b/unshackle/core/proxies/__init__.py
@@ -2,5 +2,6 @@ from .basic import Basic
from .hola import Hola
from .nordvpn import NordVPN
from .surfsharkvpn import SurfsharkVPN
+from .windscribevpn import WindscribeVPN
-__all__ = ("Basic", "Hola", "NordVPN", "SurfsharkVPN")
+__all__ = ("Basic", "Hola", "NordVPN", "SurfsharkVPN", "WindscribeVPN")
diff --git a/unshackle/core/proxies/windscribevpn.py b/unshackle/core/proxies/windscribevpn.py
new file mode 100644
index 0000000..c8ffd2d
--- /dev/null
+++ b/unshackle/core/proxies/windscribevpn.py
@@ -0,0 +1,109 @@
+import json
+import random
+import re
+from typing import Optional
+
+import requests
+
+from unshackle.core.proxies.proxy import Proxy
+
+
+class WindscribeVPN(Proxy):
+ def __init__(self, username: str, password: str, server_map: Optional[dict[str, str]] = None):
+ """
+ Proxy Service using WindscribeVPN Service Credentials.
+
+ A username and password must be provided. These are Service Credentials, not your Login Credentials.
+ The Service Credentials can be found here: https://windscribe.com/getconfig/openvpn
+ """
+ if not username:
+ raise ValueError("No Username was provided to the WindscribeVPN Proxy Service.")
+ if not password:
+ raise ValueError("No Password was provided to the WindscribeVPN Proxy Service.")
+
+ if server_map is not None and not isinstance(server_map, dict):
+ raise TypeError(f"Expected server_map to be a dict mapping a region to a hostname, not '{server_map!r}'.")
+
+ self.username = username
+ self.password = password
+ self.server_map = server_map or {}
+
+ self.countries = self.get_countries()
+
+ def __repr__(self) -> str:
+ countries = len(set(x.get("country_code") for x in self.countries if x.get("country_code")))
+ servers = sum(
+ len(host)
+ for location in self.countries
+ for group in location.get("groups", [])
+ for host in group.get("hosts", [])
+ )
+
+ return f"{countries} Countr{['ies', 'y'][countries == 1]} ({servers} Server{['s', ''][servers == 1]})"
+
+ def get_proxy(self, query: str) -> Optional[str]:
+ """
+ Get an HTTPS proxy URI for a WindscribeVPN server.
+
+ Note: Windscribe's static OpenVPN credentials work reliably on US, AU, and NZ servers.
+ """
+ query = query.lower()
+ supported_regions = {"us", "au", "nz"}
+
+ if query not in supported_regions and query not in self.server_map:
+            raise ValueError(
+                f"Windscribe proxy does not currently support the '{query.upper()}' region. "
+                f"Supported regions with reliable credentials: {', '.join(sorted(supported_regions))}. "
+                "Custom hostnames for other regions can be supplied via server_map."
+            )
+
+ if query in self.server_map:
+ hostname = self.server_map[query]
+ else:
+ if re.match(r"^[a-z]+$", query):
+ hostname = self.get_random_server(query)
+ else:
+ raise ValueError(f"The query provided is unsupported and unrecognized: {query}")
+
+ if not hostname:
+ return None
+
+        hostname = hostname.split(":")[0]
+ return f"https://{self.username}:{self.password}@{hostname}:443"
+
+ def get_random_server(self, country_code: str) -> Optional[str]:
+ """
+ Get a random server hostname for a country.
+
+ Returns None if no servers are available for the country.
+ """
+ for location in self.countries:
+ if location.get("country_code", "").lower() == country_code.lower():
+ hostnames = []
+ for group in location.get("groups", []):
+ for host in group.get("hosts", []):
+ if hostname := host.get("hostname"):
+ hostnames.append(hostname)
+
+ if hostnames:
+ return random.choice(hostnames)
+
+ return None
+
+ @staticmethod
+ def get_countries() -> list[dict]:
+ """Get a list of available Countries and their metadata."""
+ res = requests.get(
+ url="https://assets.windscribe.com/serverlist/firefox/1/1",
+ headers={
+ "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36",
+ "Content-Type": "application/json",
+ },
+ )
+ if not res.ok:
+ raise ValueError(f"Failed to get a list of WindscribeVPN locations [{res.status_code}]")
+
+ try:
+ data = res.json()
+ return data.get("data", [])
+ except json.JSONDecodeError:
+ raise ValueError("Could not decode list of WindscribeVPN locations, not JSON data.")
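`get_proxy` above ultimately returns an HTTPS proxy URI built from the Service Credentials and a server hostname. The URI shape can be sketched in isolation (hypothetical helper; the real method also validates the region and picks a random host from the server list):

```python
# Sketch of the proxy URI produced by WindscribeVPN.get_proxy above.
def windscribe_proxy_uri(username: str, password: str, hostname: str) -> str:
    hostname = hostname.split(":")[0]  # drop any port suffix from server_map entries
    return f"https://{username}:{password}@{hostname}:443"
```

Stripping the port first means `server_map` entries may be given either as bare hostnames or as `host:port` pairs; port 443 is always used.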
diff --git a/unshackle/core/session.py b/unshackle/core/session.py
index 4cda472..3a4f704 100644
--- a/unshackle/core/session.py
+++ b/unshackle/core/session.py
@@ -2,9 +2,16 @@
from __future__ import annotations
+import logging
+import random
+import time
import warnings
+from datetime import datetime, timezone
+from email.utils import parsedate_to_datetime
+from typing import Any
+from urllib.parse import urlparse
-from curl_cffi.requests import Session as CurlSession
+from curl_cffi.requests import Response, Session, exceptions
from unshackle.core.config import config
@@ -14,30 +21,145 @@ warnings.filterwarnings(
"ignore", message="Make sure you are using https over https proxy.*", category=RuntimeWarning, module="curl_cffi.*"
)
+FINGERPRINT_PRESETS = {
+ "okhttp4": {
+ "ja3": (
+ "771," # TLS 1.2
+ "4865-4866-4867-49195-49196-52393-49199-49200-52392-49171-49172-156-157-47-53," # Ciphers
+ "0-23-65281-10-11-35-16-5-13-51-45-43," # Extensions
+ "29-23-24," # Named groups (x25519, secp256r1, secp384r1)
+ "0" # EC point formats
+ ),
+ "akamai": "4:16777216|16711681|0|m,p,a,s",
+ "description": "OkHttp 3.x/4.x (BoringSSL TLS stack)",
+ },
+ "okhttp5": {
+ "ja3": (
+ "771," # TLS 1.2
+ "4865-4866-4867-49195-49199-49196-49200-52393-52392-49171-49172-156-157-47-53," # Ciphers
+ "0-23-65281-10-11-35-16-5-13-51-45-43," # Extensions
+ "29-23-24," # Named groups (x25519, secp256r1, secp384r1)
+ "0" # EC point formats
+ ),
+ "akamai": "4:16777216|16711681|0|m,p,a,s",
+ "description": "OkHttp 5.x (BoringSSL TLS stack)",
+ },
+}
-class Session(CurlSession):
- """curl_cffi Session with warning suppression."""
- def request(self, method, url, **kwargs):
- with warnings.catch_warnings():
- warnings.filterwarnings(
- "ignore", message="Make sure you are using https over https proxy.*", category=RuntimeWarning
- )
+class MaxRetriesError(exceptions.RequestException):
+ def __init__(self, message, cause=None):
+ super().__init__(message)
+ self.__cause__ = cause
+
+
+class CurlSession(Session):
+ def __init__(
+ self,
+ max_retries: int = 10,
+ backoff_factor: float = 0.2,
+ max_backoff: float = 60.0,
+ status_forcelist: list[int] | None = None,
+ allowed_methods: set[str] | None = None,
+ catch_exceptions: tuple[type[Exception], ...] | None = None,
+ **session_kwargs: Any,
+ ):
+ super().__init__(**session_kwargs)
+
+ self.max_retries = max_retries
+ self.backoff_factor = backoff_factor
+ self.max_backoff = max_backoff
+ self.status_forcelist = status_forcelist or [429, 500, 502, 503, 504]
+ self.allowed_methods = allowed_methods or {"GET", "POST", "HEAD", "OPTIONS", "PUT", "DELETE", "TRACE"}
+ self.catch_exceptions = catch_exceptions or (
+ exceptions.ConnectionError,
+ exceptions.ProxyError,
+ exceptions.SSLError,
+ exceptions.Timeout,
+ )
+ self.log = logging.getLogger(self.__class__.__name__)
+
+ def get_sleep_time(self, response: Response | None, attempt: int) -> float | None:
+ if response:
+ retry_after = response.headers.get("Retry-After")
+ if retry_after:
+ try:
+ return float(retry_after)
+                except ValueError:
+                    # parsedate_to_datetime raises ValueError on malformed dates
+                    # (Python 3.10+), so guard it instead of letting a bad
+                    # Retry-After header escape the retry loop.
+                    try:
+                        retry_date = parsedate_to_datetime(retry_after)
+                    except (TypeError, ValueError):
+                        retry_date = None
+                    if retry_date:
+                        return (retry_date - datetime.now(timezone.utc)).total_seconds()
+
+ if attempt == 0:
+ return 0.0
+
+ backoff_value = self.backoff_factor * (2 ** (attempt - 1))
+ jitter = backoff_value * 0.1
+ sleep_time = backoff_value + random.uniform(-jitter, jitter)
+ return min(sleep_time, self.max_backoff)
+
+ def request(self, method: str, url: str, **kwargs: Any) -> Response:
+ if method.upper() not in self.allowed_methods:
return super().request(method, url, **kwargs)
+ last_exception = None
+ response = None
-def session(browser: str | None = None, **kwargs) -> Session:
+ for attempt in range(self.max_retries + 1):
+ try:
+ response = super().request(method, url, **kwargs)
+ if response.status_code not in self.status_forcelist:
+ return response
+ last_exception = exceptions.HTTPError(f"Received status code: {response.status_code}")
+ self.log.warning(
+                    f"{response.status_code} {response.reason} ({urlparse(url).path}). Retrying... "
+ f"({attempt + 1}/{self.max_retries})"
+ )
+
+ except self.catch_exceptions as e:
+ last_exception = e
+ response = None
+ self.log.warning(
+ f"{e.__class__.__name__}({urlparse(url).path}). Retrying... ({attempt + 1}/{self.max_retries})"
+ )
+
+ if attempt < self.max_retries:
+                sleep_duration = self.get_sleep_time(response, attempt + 1)
+                if sleep_duration and sleep_duration > 0:
+                    time.sleep(sleep_duration)
+ else:
+ break
+
+ raise MaxRetriesError(f"Max retries exceeded for {method} {url}", cause=last_exception)
+
+
+def session(
+ browser: str | None = None,
+ ja3: str | None = None,
+ akamai: str | None = None,
+ extra_fp: dict | None = None,
+ **kwargs,
+) -> CurlSession:
"""
- Create a curl_cffi session that impersonates a browser.
+ Create a curl_cffi session that impersonates a browser or custom TLS/HTTP fingerprint.
This is a full replacement for requests.Session with browser impersonation
and anti-bot capabilities. The session uses curl-impersonate under the hood
to mimic real browser behavior.
Args:
- browser: Browser to impersonate (e.g. "chrome124", "firefox", "safari").
+ browser: Browser to impersonate (e.g. "chrome124", "firefox", "safari") OR
+ fingerprint preset name (e.g. "okhttp4").
Uses the configured default from curl_impersonate.browser if not specified.
- See https://github.com/lexiforest/curl_cffi#sessions for available options.
+            Available presets: okhttp4, okhttp5.
+ See https://github.com/lexiforest/curl_cffi#sessions for browser options.
+ ja3: Custom JA3 TLS fingerprint string (format: "SSLVersion,Ciphers,Extensions,Curves,PointFormats").
+ When provided, curl_cffi will use this exact TLS fingerprint instead of the browser's default.
+ See https://curl-cffi.readthedocs.io/en/latest/impersonate/customize.html
+ akamai: Custom Akamai HTTP/2 fingerprint string (format: "SETTINGS|WINDOW_UPDATE|PRIORITY|PSEUDO_HEADERS").
+ When provided, curl_cffi will use this exact HTTP/2 fingerprint instead of the browser's default.
+ See https://curl-cffi.readthedocs.io/en/latest/impersonate/customize.html
+ extra_fp: Additional fingerprint parameters dict for advanced customization.
+ See https://curl-cffi.readthedocs.io/en/latest/impersonate/customize.html
**kwargs: Additional arguments passed to CurlSession constructor:
- headers: Additional headers (dict)
- cookies: Cookie jar or dict
@@ -49,31 +171,80 @@ def session(browser: str | None = None, **kwargs) -> Session:
- max_redirects: Maximum redirect count (int)
- cert: Client certificate (str or tuple)
- Returns:
- curl_cffi.requests.Session configured with browser impersonation, common headers,
- and equivalent retry behavior to requests.Session.
+ Extra arguments for retry handler:
+ - max_retries: Maximum number of retries (int, default 10)
+ - backoff_factor: Backoff factor (float, default 0.2)
+ - max_backoff: Maximum backoff time (float, default 60.0)
+ - status_forcelist: List of status codes to force retry (list, default [429, 500, 502, 503, 504])
+        - allowed_methods: Set of HTTP methods eligible for retry (set, default {"GET", "POST", "HEAD", "OPTIONS", "PUT", "DELETE", "TRACE"})
+        - catch_exceptions: Tuple of exception types to retry on (tuple, default (exceptions.ConnectionError, exceptions.ProxyError, exceptions.SSLError, exceptions.Timeout))
- Example:
+ Returns:
+ curl_cffi.requests.Session configured with browser impersonation or custom fingerprints,
+ common headers, and equivalent retry behavior to requests.Session.
+
+ Examples:
+ # Standard browser impersonation
from unshackle.core.session import session
class MyService(Service):
@staticmethod
def get_session():
return session() # Uses config default browser
- """
- if browser is None:
- browser = config.curl_impersonate.get("browser", "chrome124")
- session_config = {
- "impersonate": browser,
- "timeout": 30.0,
- "allow_redirects": True,
- "max_redirects": 15,
- "verify": True,
- }
+ # Use OkHttp 4.x preset for Android TV
+ class AndroidService(Service):
+ @staticmethod
+ def get_session():
+ return session("okhttp4")
+
+ # Custom fingerprint (manual)
+ class CustomService(Service):
+ @staticmethod
+ def get_session():
+ return session(
+ ja3="771,4865-4866-4867-49195...",
+ akamai="1:65536;2:0;4:6291456;6:262144|15663105|0|m,a,s,p",
+ )
+
+ # With retry configuration
+ class MyService(Service):
+ @staticmethod
+ def get_session():
+ return session(
+ "okhttp4",
+ max_retries=5,
+ status_forcelist=[429, 500],
+ allowed_methods={"GET", "HEAD", "OPTIONS"},
+ )
+ """
+
+ if browser and browser in FINGERPRINT_PRESETS:
+ preset = FINGERPRINT_PRESETS[browser]
+ if ja3 is None:
+ ja3 = preset.get("ja3")
+ if akamai is None:
+ akamai = preset.get("akamai")
+ if extra_fp is None:
+ extra_fp = preset.get("extra_fp")
+ browser = None
+
+ if browser is None and ja3 is None and akamai is None:
+ browser = config.curl_impersonate.get("browser", "chrome")
+
+ session_config = {}
+ if browser:
+ session_config["impersonate"] = browser
+
+ if ja3:
+ session_config["ja3"] = ja3
+ if akamai:
+ session_config["akamai"] = akamai
+ if extra_fp:
+ session_config["extra_fp"] = extra_fp
session_config.update(kwargs)
- session_obj = Session(**session_config)
- session_obj.headers.update(config.headers)
+ session_obj = CurlSession(**session_config)
+ session_obj.headers.update(config.headers)
return session_obj
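The retry timing in `CurlSession.get_sleep_time` can be sketched standalone; this reproduces the backoff arithmetic (exponential growth, ±10% jitter, cap) without the Retry-After handling:

```python
import random

def sleep_time(backoff_factor: float, attempt: int, max_backoff: float = 60.0) -> float:
    # Attempt 0 retries immediately; later attempts grow exponentially
    # with +/-10% jitter, capped at max_backoff.
    if attempt == 0:
        return 0.0
    backoff = backoff_factor * (2 ** (attempt - 1))
    jitter = backoff * 0.1
    return min(backoff + random.uniform(-jitter, jitter), max_backoff)

# With the default backoff_factor of 0.2, attempts 1..5 centre on
# 0.2, 0.4, 0.8, 1.6 and 3.2 seconds before the cap applies.
```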
diff --git a/unshackle/core/title_cacher.py b/unshackle/core/title_cacher.py
index f3346aa..76ca639 100644
--- a/unshackle/core/title_cacher.py
+++ b/unshackle/core/title_cacher.py
@@ -180,6 +180,167 @@ class TitleCacher:
"hit_rate": f"{hit_rate:.1f}%",
}
+ def get_cached_tmdb(
+ self, title_id: str, kind: str, region: Optional[str] = None, account_hash: Optional[str] = None
+ ) -> Optional[dict]:
+ """
+ Get cached TMDB data for a title.
+
+ Args:
+ title_id: The title identifier
+ kind: "movie" or "tv"
+ region: The region/proxy identifier
+ account_hash: Hash of account credentials
+
+ Returns:
+ Dict with 'detail' and 'external_ids' if cached and valid, None otherwise
+ """
+ if not config.title_cache_enabled:
+ return None
+
+ cache_key = self._generate_cache_key(title_id, region, account_hash)
+ cache = self.cacher.get(cache_key, version=1)
+
+ if not cache or not cache.data:
+ return None
+
+ tmdb_data = getattr(cache.data, "tmdb_data", None)
+ if not tmdb_data:
+ return None
+
+ tmdb_expiration = tmdb_data.get("expires_at")
+ if not tmdb_expiration or datetime.now() >= tmdb_expiration:
+ self.log.debug(f"TMDB cache expired for {title_id}")
+ return None
+
+ if tmdb_data.get("kind") != kind:
+ self.log.debug(f"TMDB cache kind mismatch for {title_id}: cached {tmdb_data.get('kind')}, requested {kind}")
+ return None
+
+ self.log.debug(f"TMDB cache hit for {title_id}")
+ return {
+ "detail": tmdb_data.get("detail"),
+ "external_ids": tmdb_data.get("external_ids"),
+ "fetched_at": tmdb_data.get("fetched_at"),
+ }
+
+ def cache_tmdb(
+ self,
+ title_id: str,
+ detail_response: dict,
+ external_ids_response: dict,
+ kind: str,
+ region: Optional[str] = None,
+ account_hash: Optional[str] = None,
+ ) -> None:
+ """
+ Cache TMDB data for a title.
+
+ Args:
+ title_id: The title identifier
+ detail_response: Full TMDB detail API response
+ external_ids_response: Full TMDB external_ids API response
+ kind: "movie" or "tv"
+ region: The region/proxy identifier
+ account_hash: Hash of account credentials
+ """
+ if not config.title_cache_enabled:
+ return
+
+ cache_key = self._generate_cache_key(title_id, region, account_hash)
+ cache = self.cacher.get(cache_key, version=1)
+
+ if not cache or not cache.data:
+ self.log.debug(f"Cannot cache TMDB data: no title cache exists for {title_id}")
+ return
+
+ now = datetime.now()
+ tmdb_data = {
+ "detail": detail_response,
+ "external_ids": external_ids_response,
+ "kind": kind,
+ "fetched_at": now,
+ "expires_at": now + timedelta(days=7), # 7-day expiration
+ }
+
+ cache.data.tmdb_data = tmdb_data
+
+ cache.set(cache.data, expiration=cache.expiration)
+ self.log.debug(f"Cached TMDB data for {title_id} (kind={kind})")
+
+ def get_cached_simkl(
+ self, title_id: str, region: Optional[str] = None, account_hash: Optional[str] = None
+ ) -> Optional[dict]:
+ """
+ Get cached Simkl data for a title.
+
+ Args:
+ title_id: The title identifier
+ region: The region/proxy identifier
+ account_hash: Hash of account credentials
+
+ Returns:
+ Simkl response dict if cached and valid, None otherwise
+ """
+ if not config.title_cache_enabled:
+ return None
+
+ cache_key = self._generate_cache_key(title_id, region, account_hash)
+ cache = self.cacher.get(cache_key, version=1)
+
+ if not cache or not cache.data:
+ return None
+
+ simkl_data = getattr(cache.data, "simkl_data", None)
+ if not simkl_data:
+ return None
+
+ simkl_expiration = simkl_data.get("expires_at")
+ if not simkl_expiration or datetime.now() >= simkl_expiration:
+ self.log.debug(f"Simkl cache expired for {title_id}")
+ return None
+
+ self.log.debug(f"Simkl cache hit for {title_id}")
+ return simkl_data.get("response")
+
+ def cache_simkl(
+ self,
+ title_id: str,
+ simkl_response: dict,
+ region: Optional[str] = None,
+ account_hash: Optional[str] = None,
+ ) -> None:
+ """
+ Cache Simkl data for a title.
+
+ Args:
+ title_id: The title identifier
+ simkl_response: Full Simkl API response
+ region: The region/proxy identifier
+ account_hash: Hash of account credentials
+ """
+ if not config.title_cache_enabled:
+ return
+
+ cache_key = self._generate_cache_key(title_id, region, account_hash)
+ cache = self.cacher.get(cache_key, version=1)
+
+ if not cache or not cache.data:
+ self.log.debug(f"Cannot cache Simkl data: no title cache exists for {title_id}")
+ return
+
+ now = datetime.now()
+ simkl_data = {
+ "response": simkl_response,
+ "fetched_at": now,
+ "expires_at": now + timedelta(days=7),
+ }
+
+ cache.data.simkl_data = simkl_data
+
+ cache.set(cache.data, expiration=cache.expiration)
+ self.log.debug(f"Cached Simkl data for {title_id}")
+
def get_region_from_proxy(proxy_url: Optional[str]) -> Optional[str]:
"""
diff --git a/unshackle/core/titles/episode.py b/unshackle/core/titles/episode.py
index 16ccc3d..bd767d1 100644
--- a/unshackle/core/titles/episode.py
+++ b/unshackle/core/titles/episode.py
@@ -95,9 +95,9 @@ class Episode(Title):
media_info.audio_tracks,
key=lambda x: (
float(x.bit_rate) if x.bit_rate else 0,
- bool(x.format_additionalfeatures and "JOC" in x.format_additionalfeatures)
+ bool(x.format_additionalfeatures and "JOC" in x.format_additionalfeatures),
),
- reverse=True
+ reverse=True,
)
primary_audio_track = sorted_audio[0]
unique_audio_languages = len({x.language.split("-")[0] for x in media_info.audio_tracks if x.language})
@@ -173,20 +173,26 @@ class Episode(Title):
if primary_video_track:
codec = primary_video_track.format
hdr_format = primary_video_track.hdr_format_commercial
+ hdr_format_full = primary_video_track.hdr_format or ""
trc = (
primary_video_track.transfer_characteristics
or primary_video_track.transfer_characteristics_original
+ or ""
)
frame_rate = float(primary_video_track.frame_rate)
+
+ # Primary HDR format detection
if hdr_format:
- if (primary_video_track.hdr_format or "").startswith("Dolby Vision"):
+ if hdr_format_full.startswith("Dolby Vision"):
name += " DV"
- if DYNAMIC_RANGE_MAP.get(hdr_format) and DYNAMIC_RANGE_MAP.get(hdr_format) != "DV":
+ if any(indicator in hdr_format_full for indicator in ["HDR10", "SMPTE ST 2086"]):
name += " HDR"
else:
name += f" {DYNAMIC_RANGE_MAP.get(hdr_format)} "
- elif trc and "HLG" in trc:
+            elif "HLG" in trc or "Hybrid Log-Gamma" in trc or "arib-std-b67" in trc.lower():
                 name += " HLG"
+            elif (
+                any(indicator in trc for indicator in ["PQ", "SMPTE ST 2084", "BT.2100"])
+                or "smpte2084" in trc.lower()
+                or "bt.2020-10" in trc.lower()
+            ):
+                name += " HDR"
if frame_rate > 30:
name += " HFR"
name += f" {VIDEO_CODEC_MAP.get(codec, codec)}"
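The branch above can be condensed into one pure function for eyeballing. This collapses `hdr_format_commercial` and the full `hdr_format` string into a single argument for brevity, so it is a sketch of the decision order rather than the exact production logic:

```python
def dynamic_range_label(hdr_format: str, trc: str) -> str:
    # HDR metadata wins first; otherwise the transfer characteristics
    # decide between HLG and PQ-style HDR.
    label = ""
    if hdr_format.startswith("Dolby Vision"):
        label += " DV"
        if any(x in hdr_format for x in ("HDR10", "SMPTE ST 2086")):
            label += " HDR"
    elif "HLG" in trc or "arib-std-b67" in trc.lower():
        label = " HLG"
    elif any(x in trc for x in ("PQ", "SMPTE ST 2084", "BT.2100")):
        label = " HDR"
    return label

assert dynamic_range_label("Dolby Vision / HDR10", "") == " DV HDR"
assert dynamic_range_label("", "HLG / BT.2020") == " HLG"
assert dynamic_range_label("", "SMPTE ST 2084") == " HDR"
```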
diff --git a/unshackle/core/titles/movie.py b/unshackle/core/titles/movie.py
index 515c6b0..3d73fb5 100644
--- a/unshackle/core/titles/movie.py
+++ b/unshackle/core/titles/movie.py
@@ -58,9 +58,9 @@ class Movie(Title):
media_info.audio_tracks,
key=lambda x: (
float(x.bit_rate) if x.bit_rate else 0,
- bool(x.format_additionalfeatures and "JOC" in x.format_additionalfeatures)
+ bool(x.format_additionalfeatures and "JOC" in x.format_additionalfeatures),
),
- reverse=True
+ reverse=True,
)
primary_audio_track = sorted_audio[0]
unique_audio_languages = len({x.language.split("-")[0] for x in media_info.audio_tracks if x.language})
@@ -124,20 +124,26 @@ class Movie(Title):
if primary_video_track:
codec = primary_video_track.format
hdr_format = primary_video_track.hdr_format_commercial
+ hdr_format_full = primary_video_track.hdr_format or ""
trc = (
primary_video_track.transfer_characteristics
or primary_video_track.transfer_characteristics_original
+ or ""
)
frame_rate = float(primary_video_track.frame_rate)
+
+ # Primary HDR format detection
if hdr_format:
- if (primary_video_track.hdr_format or "").startswith("Dolby Vision"):
+ if hdr_format_full.startswith("Dolby Vision"):
name += " DV"
- if DYNAMIC_RANGE_MAP.get(hdr_format) and DYNAMIC_RANGE_MAP.get(hdr_format) != "DV":
+ if any(indicator in hdr_format_full for indicator in ["HDR10", "SMPTE ST 2086"]):
name += " HDR"
else:
name += f" {DYNAMIC_RANGE_MAP.get(hdr_format)} "
- elif trc and "HLG" in trc:
+            elif "HLG" in trc or "Hybrid Log-Gamma" in trc or "arib-std-b67" in trc.lower():
                 name += " HLG"
+            elif (
+                any(indicator in trc for indicator in ["PQ", "SMPTE ST 2084", "BT.2100"])
+                or "smpte2084" in trc.lower()
+                or "bt.2020-10" in trc.lower()
+            ):
+                name += " HDR"
if frame_rate > 30:
name += " HFR"
name += f" {VIDEO_CODEC_MAP.get(codec, codec)}"
diff --git a/unshackle/core/tracks/subtitle.py b/unshackle/core/tracks/subtitle.py
index e336345..e807bff 100644
--- a/unshackle/core/tracks/subtitle.py
+++ b/unshackle/core/tracks/subtitle.py
@@ -239,25 +239,29 @@ class Subtitle(Track):
# Sanitize WebVTT timestamps before parsing
text = Subtitle.sanitize_webvtt_timestamps(text)
+ preserve_formatting = config.subtitle.get("preserve_formatting", True)
- try:
- caption_set = pycaption.WebVTTReader().read(text)
- Subtitle.merge_same_cues(caption_set)
- Subtitle.filter_unwanted_cues(caption_set)
- subtitle_text = pycaption.WebVTTWriter().write(caption_set)
- self.path.write_text(subtitle_text, encoding="utf8")
- except pycaption.exceptions.CaptionReadSyntaxError:
- # If first attempt fails, try more aggressive sanitization
- text = Subtitle.sanitize_webvtt(text)
+ if preserve_formatting:
+ self.path.write_text(text, encoding="utf8")
+ else:
try:
caption_set = pycaption.WebVTTReader().read(text)
Subtitle.merge_same_cues(caption_set)
Subtitle.filter_unwanted_cues(caption_set)
subtitle_text = pycaption.WebVTTWriter().write(caption_set)
self.path.write_text(subtitle_text, encoding="utf8")
- except Exception:
- # Keep the sanitized version even if parsing failed
- self.path.write_text(text, encoding="utf8")
+ except pycaption.exceptions.CaptionReadSyntaxError:
+ # If first attempt fails, try more aggressive sanitization
+ text = Subtitle.sanitize_webvtt(text)
+ try:
+ caption_set = pycaption.WebVTTReader().read(text)
+ Subtitle.merge_same_cues(caption_set)
+ Subtitle.filter_unwanted_cues(caption_set)
+ subtitle_text = pycaption.WebVTTWriter().write(caption_set)
+ self.path.write_text(subtitle_text, encoding="utf8")
+ except Exception:
+ # Keep the sanitized version even if parsing failed
+ self.path.write_text(text, encoding="utf8")
@staticmethod
def sanitize_webvtt_timestamps(text: str) -> str:
@@ -979,20 +983,33 @@ class Subtitle(Track):
stdout=subprocess.DEVNULL,
)
else:
- sub = Subtitles(self.path)
+ if config.subtitle.get("convert_before_strip", True) and self.codec != Subtitle.Codec.SubRip:
+ self.path = self.convert(Subtitle.Codec.SubRip)
+ self.codec = Subtitle.Codec.SubRip
+
try:
- sub.filter(rm_fonts=True, rm_ast=True, rm_music=True, rm_effects=True, rm_names=True, rm_author=True)
- except ValueError as e:
- if "too many values to unpack" in str(e):
- # Retry without name removal if the error is due to multiple colons in time references
- # This can happen with lines like "at 10:00 and 2:00"
- sub = Subtitles(self.path)
+ sub = Subtitles(self.path)
+ try:
sub.filter(
- rm_fonts=True, rm_ast=True, rm_music=True, rm_effects=True, rm_names=False, rm_author=True
+ rm_fonts=True, rm_ast=True, rm_music=True, rm_effects=True, rm_names=True, rm_author=True
)
+ except ValueError as e:
+ if "too many values to unpack" in str(e):
+ # Retry without name removal if the error is due to multiple colons in time references
+ # This can happen with lines like "at 10:00 and 2:00"
+ sub = Subtitles(self.path)
+ sub.filter(
+ rm_fonts=True, rm_ast=True, rm_music=True, rm_effects=True, rm_names=False, rm_author=True
+ )
+ else:
+ raise
+ sub.save()
+            except OSError as e:  # IOError is an alias of OSError since Python 3.3
+ if "is not valid subtitle file" in str(e):
+ self.log.warning(f"Failed to strip SDH from {self.path.name}: {e}")
+ self.log.warning("Continuing without SDH stripping for this subtitle")
else:
raise
- sub.save()
def reverse_rtl(self) -> None:
"""
diff --git a/unshackle/core/tracks/track.py b/unshackle/core/tracks/track.py
index 12c7af0..0b1a38f 100644
--- a/unshackle/core/tracks/track.py
+++ b/unshackle/core/tracks/track.py
@@ -25,7 +25,7 @@ from unshackle.core.constants import DOWNLOAD_CANCELLED, DOWNLOAD_LICENCE_ONLY
from unshackle.core.downloaders import aria2c, curl_impersonate, n_m3u8dl_re, requests
from unshackle.core.drm import DRM_T, PlayReady, Widevine
from unshackle.core.events import events
-from unshackle.core.utilities import get_boxes, try_ensure_utf8
+from unshackle.core.utilities import get_boxes, get_extension, try_ensure_utf8
from unshackle.core.utils.subprocess import ffprobe
@@ -47,6 +47,8 @@ class Track:
drm: Optional[Iterable[DRM_T]] = None,
edition: Optional[str] = None,
downloader: Optional[Callable] = None,
+ downloader_args: Optional[dict] = None,
+ from_file: Optional[Path] = None,
data: Optional[Union[dict, defaultdict]] = None,
id_: Optional[str] = None,
extra: Optional[Any] = None,
@@ -69,6 +71,10 @@ class Track:
raise TypeError(f"Expected edition to be a {str}, not {type(edition)}")
if not isinstance(downloader, (Callable, type(None))):
raise TypeError(f"Expected downloader to be a {Callable}, not {type(downloader)}")
+ if not isinstance(downloader_args, (dict, type(None))):
+ raise TypeError(f"Expected downloader_args to be a {dict}, not {type(downloader_args)}")
+ if not isinstance(from_file, (Path, type(None))):
+ raise TypeError(f"Expected from_file to be a {Path}, not {type(from_file)}")
if not isinstance(data, (dict, defaultdict, type(None))):
raise TypeError(f"Expected data to be a {dict} or {defaultdict}, not {type(data)}")
@@ -100,6 +106,8 @@ class Track:
self.drm = drm
self.edition: str = edition
self.downloader = downloader
+ self.downloader_args = downloader_args
+ self.from_file = from_file
self._data: defaultdict[Any, Any] = defaultdict(dict)
self.data = data or {}
self.extra: Any = extra or {} # allow anything for extra, but default to a dict
@@ -203,7 +211,21 @@ class Track:
save_path = config.directories.temp / f"{track_type}_{self.id}.mp4"
if track_type == "Subtitle":
save_path = save_path.with_suffix(f".{self.codec.extension}")
- if self.downloader.__name__ == "n_m3u8dl_re":
+        # n_m3u8dl_re expects a manifest, so it can't fetch subtitles served as
+        # plain URLs or as direct files with a subtitle extension
+ if self.downloader.__name__ == "n_m3u8dl_re" and (
+ self.descriptor == self.Descriptor.URL
+ or get_extension(self.url) in {
+ ".srt",
+ ".vtt",
+ ".ttml",
+ ".ssa",
+ ".ass",
+ ".stpp",
+ ".wvtt",
+ ".xml",
+ }
+ ):
self.downloader = requests
if self.descriptor != self.Descriptor.URL:
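The new extension gate relies on `get_extension` from `unshackle.core.utilities`. A hypothetical stand-in (assumed here to return the lowercase suffix of the URL's path) shows why a signed subtitle URL still matches while a manifest does not:

```python
from pathlib import Path
from urllib.parse import urlparse

# Hypothetical stand-in for unshackle.core.utilities.get_extension; the gate
# above only compares its result against a fixed set of subtitle suffixes.
def get_extension(url: str) -> str:
    return Path(urlparse(url).path).suffix.lower()

SUBTITLE_EXTS = {".srt", ".vtt", ".ttml", ".ssa", ".ass", ".stpp", ".wvtt", ".xml"}

def needs_plain_downloader(url: str) -> bool:
    # n_m3u8dl_re only handles manifests, so direct subtitle files
    # fall back to the requests downloader.
    return get_extension(url) in SUBTITLE_EXTS

assert needs_plain_downloader("https://cdn.example.com/subs/en.vtt?sig=abc")
assert not needs_plain_downloader("https://cdn.example.com/stream/master.m3u8")
```

Query strings don't confuse the check because only the parsed path's suffix is inspected.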
diff --git a/unshackle/core/tracks/video.py b/unshackle/core/tracks/video.py
index 8a9d344..5c54631 100644
--- a/unshackle/core/tracks/video.py
+++ b/unshackle/core/tracks/video.py
@@ -99,24 +99,42 @@ class Video(Track):
@staticmethod
def from_cicp(primaries: int, transfer: int, matrix: int) -> Video.Range:
"""
- ISO/IEC 23001-8 Coding-independent code points to Video Range.
+ Convert CICP (Coding-Independent Code Points) values to Video Range.
+
+ CICP is defined in ITU-T H.273 and ISO/IEC 23091-2 for signaling video
+ color properties independently of the compression codec. These values are
+ used across AVC (H.264), HEVC (H.265), VVC, AV1, and other modern codecs.
+
+ The enum values (Primaries, Transfer, Matrix) match the official specifications:
+ - ITU-T H.273: Coding-independent code points for video signal type identification
+ - ISO/IEC 23091-2: Information technology — Coding-independent code points — Part 2: Video
+ - H.264 Table E-3 (Colour Primaries) and Table E-4 (Transfer Characteristics)
+ - H.265 Table E.3 and E.4 (identical to H.264)
+
+ Note: Value 0 = "Reserved" and Value 2 = "Unspecified" per specification.
+ While both effectively mean "unknown" in practice, the distinction matters for
+ spec compliance. Value 2 was added based on user feedback (GitHub issue) and
+ verified against FFmpeg's AVColorPrimaries/AVColorTransferCharacteristic enums.
Sources:
- https://www.itu.int/rec/T-REC-H.Sup19-202104-I
+ - https://www.itu.int/rec/T-REC-H.273
+ - https://www.itu.int/rec/T-REC-H.Sup19-202104-I
+ - https://github.com/FFmpeg/FFmpeg/blob/master/libavutil/pixfmt.h
"""
class Primaries(Enum):
- Unspecified = 0
+ Reserved = 0
BT_709 = 1
+ Unspecified = 2
BT_601_625 = 5
BT_601_525 = 6
BT_2020_and_2100 = 9
SMPTE_ST_2113_and_EG_4321 = 12 # P3D65
class Transfer(Enum):
- Unspecified = 0
+ Reserved = 0
BT_709 = 1
- Unspecified_Image = 2
+ Unspecified = 2
BT_601 = 6
BT_2020 = 14
BT_2100 = 15
@@ -143,7 +161,7 @@ class Video(Track):
# primaries and matrix does not strictly correlate to a range
- if (primaries, transfer, matrix) == (0, 0, 0):
+ if (primaries, transfer, matrix) == (Primaries.Reserved, Transfer.Reserved, Matrix.RGB):
return Video.Range.SDR
elif primaries in (Primaries.BT_601_625, Primaries.BT_601_525):
return Video.Range.SDR
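Beyond the code points enumerated above, the same ITU-T H.273 table also assigns the HDR transfer functions. A reduced sketch of mapping a transfer code point to a range label, using the spec's PQ/HLG values (16 and 18, which are taken from the H.273 table rather than from the enum shown):

```python
def range_from_transfer(transfer: int) -> str:
    # Transfer code points per the ITU-T H.273 table:
    # 16 = SMPTE ST 2084 (PQ), 18 = ARIB STD-B67 (HLG).
    if transfer == 16:
        return "HDR10"
    if transfer == 18:
        return "HLG"
    # BT.709/BT.601 transfers, plus Reserved (0) / Unspecified (2).
    return "SDR"

assert range_from_transfer(16) == "HDR10"
assert range_from_transfer(18) == "HLG"
assert range_from_transfer(1) == "SDR"
```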
diff --git a/unshackle/core/update_checker.py b/unshackle/core/update_checker.py
index 5ca6502..8d601fc 100644
--- a/unshackle/core/update_checker.py
+++ b/unshackle/core/update_checker.py
@@ -28,21 +28,21 @@ class UpdateChecker:
DEFAULT_CHECK_INTERVAL = 24 * 60 * 60
@classmethod
- def _get_cache_file(cls) -> Path:
+ def get_cache_file(cls) -> Path:
"""Get the path to the update check cache file."""
from unshackle.core.config import config
return config.directories.cache / "update_check.json"
@classmethod
- def _load_cache_data(cls) -> dict:
+ def load_cache_data(cls) -> dict:
"""
Load cache data from file.
Returns:
Cache data dictionary or empty dict if loading fails
"""
- cache_file = cls._get_cache_file()
+ cache_file = cls.get_cache_file()
if not cache_file.exists():
return {}
@@ -54,7 +54,7 @@ class UpdateChecker:
return {}
@staticmethod
- def _parse_version(version_string: str) -> str:
+ def parse_version(version_string: str) -> str:
"""
Parse and normalize version string by removing 'v' prefix.
@@ -107,7 +107,7 @@ class UpdateChecker:
return None
data = response.json()
- latest_version = cls._parse_version(data.get("tag_name", ""))
+ latest_version = cls.parse_version(data.get("tag_name", ""))
return latest_version if cls._is_valid_version(latest_version) else None
@@ -125,7 +125,7 @@ class UpdateChecker:
Returns:
True if we should check for updates, False otherwise
"""
- cache_data = cls._load_cache_data()
+ cache_data = cls.load_cache_data()
if not cache_data:
return True
@@ -144,7 +144,7 @@ class UpdateChecker:
latest_version: The latest version found, if any
current_version: The current version being used
"""
- cache_file = cls._get_cache_file()
+ cache_file = cls.get_cache_file()
try:
cache_file.parent.mkdir(parents=True, exist_ok=True)
@@ -231,7 +231,7 @@ class UpdateChecker:
Returns:
The latest version string if an update is available from cache, None otherwise
"""
- cache_data = cls._load_cache_data()
+ cache_data = cls.load_cache_data()
if not cache_data:
return None
diff --git a/unshackle/core/utilities.py b/unshackle/core/utilities.py
index cf29d07..22c0b75 100644
--- a/unshackle/core/utilities.py
+++ b/unshackle/core/utilities.py
@@ -1,23 +1,27 @@
import ast
import contextlib
import importlib.util
+import json
import logging
import os
import re
import socket
import sys
import time
+import traceback
import unicodedata
from collections import defaultdict
-from datetime import datetime
+from datetime import datetime, timezone
from pathlib import Path
from types import ModuleType
-from typing import Optional, Sequence, Union
+from typing import Any, Optional, Sequence, Union
from urllib.parse import ParseResult, urlparse
+from uuid import uuid4
import chardet
import requests
from construct import ValidationError
+from fontTools import ttLib
from langcodes import Language, closest_match
from pymp4.parser import Box
from requests.adapters import HTTPAdapter
@@ -27,6 +31,30 @@ from unshackle.core.cacher import Cacher
from unshackle.core.config import config
from unshackle.core.constants import LANGUAGE_EXACT_DISTANCE, LANGUAGE_MAX_DISTANCE
+"""
+Utility functions for the unshackle media archival tool.
+
+This module provides various utility functions including:
+- Font discovery and fallback system for subtitle rendering
+- Cross-platform system font scanning with Windows → Linux font family mapping
+- Log file management and rotation
+- IP geolocation with caching and provider rotation
+- Language matching utilities
+- MP4/ISOBMFF box parsing
+- File sanitization and path handling
+- Structured JSON debug logging
+
+Font System:
+ The font subsystem enables cross-platform font discovery for ASS/SSA subtitles.
+ On Linux, it scans standard font directories and maps Windows font names (Arial,
+ Times New Roman) to their Linux equivalents (Liberation Sans, Liberation Serif).
+
+Main Font Functions:
+ - get_system_fonts(): Discover installed fonts across platforms
+ - find_font_with_fallbacks(): Match fonts with intelligent fallback strategies
+ - suggest_font_packages(): Recommend packages to install for missing fonts
+"""
+
def rotate_log_file(log_path: Path, keep: int = 20) -> Path:
"""
@@ -100,8 +128,11 @@ def sanitize_filename(filename: str, spacer: str = ".") -> str:
# remove or replace further characters as needed
filename = "".join(c for c in filename if unicodedata.category(c) != "Mn") # hidden characters
filename = filename.replace("/", " & ").replace(";", " & ") # e.g. multi-episode filenames
- filename = re.sub(r"[;]", spacer, filename) # structural chars to (spacer)
-    filename = re.sub(r"[\\:*!?¿,'\"“”<>|$#~]", "", filename)  # not filename safe chars
+ # From original repo.
+ # if spacer == ".":
+ # filename = re.sub(r" - ", spacer, filename) # title separators to spacer (avoids .-. pattern)
+ filename = re.sub(r"[:; ]", spacer, filename) # structural chars to (spacer)
+    filename = re.sub(r"[\\*!?¿,'\"“”()<>|$#~]", "", filename)  # not filename safe chars
# filename = re.sub(rf"[{spacer}]{{2,}}", spacer, filename) # remove extra neighbouring (spacer)s
return filename
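A condensed rerun of the two substitutions above (omitting the curly-quote and parenthesis classes for brevity) shows the resulting filename shape with the default `.` spacer:

```python
import re

def sanitize(filename: str, spacer: str = ".") -> str:
    filename = filename.replace("/", " & ").replace(";", " & ")
    filename = re.sub(r"[:; ]", spacer, filename)           # structural chars to spacer
    filename = re.sub(r"[\\*!?¿,'\"<>|$#~]", "", filename)  # drop unsafe chars
    return filename

assert sanitize("Show: Part 1/Part 2") == "Show..Part.1.&.Part.2"
assert sanitize('He said "hi"') == "He.said.hi"
```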
@@ -123,7 +154,7 @@ def is_exact_match(language: Union[str, Language], languages: Sequence[Union[str
return closest_match(language, list(map(str, languages)))[1] <= LANGUAGE_EXACT_DISTANCE
-def get_boxes(data: bytes, box_type: bytes, as_bytes: bool = False) -> Box:
+def get_boxes(data: bytes, box_type: bytes, as_bytes: bool = False) -> Box: # type: ignore
"""
Scan a byte array for a wanted MP4/ISOBMFF box, then parse and yield each find.
@@ -429,21 +460,263 @@ def get_extension(value: Union[str, Path, ParseResult]) -> Optional[str]:
return ext
-def get_system_fonts() -> dict[str, Path]:
- if sys.platform == "win32":
- import winreg
+def extract_font_family(font_path: Path) -> Optional[str]:
+ """
+ Extract font family name from TTF/OTF file using fontTools.
- with winreg.ConnectRegistry(None, winreg.HKEY_LOCAL_MACHINE) as reg:
- key = winreg.OpenKey(reg, r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts", 0, winreg.KEY_READ)
- total_fonts = winreg.QueryInfoKey(key)[1]
- return {
- name.replace(" (TrueType)", ""): Path(r"C:\Windows\Fonts", filename)
- for n in range(0, total_fonts)
- for name, filename, _ in [winreg.EnumValue(key, n)]
- }
- else:
- # TODO: Get System Fonts for Linux and mac OS
- return {}
+ Args:
+ font_path: Path to the font file
+
+ Returns:
+ Font family name if successfully extracted, None otherwise
+ """
+ # Suppress verbose fontTools logging during font table parsing
+ import io
+
+ logging.getLogger("fontTools").setLevel(logging.ERROR)
+ logging.getLogger("fontTools.ttLib").setLevel(logging.ERROR)
+ logging.getLogger("fontTools.ttLib.tables").setLevel(logging.ERROR)
+ logging.getLogger("fontTools.ttLib.tables._n_a_m_e").setLevel(logging.ERROR)
+ stderr_backup = sys.stderr
+ sys.stderr = io.StringIO()
+
+ try:
+ font = ttLib.TTFont(font_path, lazy=True)
+ name_table = font["name"]
+
+ # Try to get family name (nameID 1) for Windows platform (platformID 3)
+ # This matches the naming convention used in Windows registry
+ for record in name_table.names:
+ if record.nameID == 1 and record.platformID == 3:
+ return record.toUnicode()
+
+ # Fallback to other platforms if Windows name not found
+ for record in name_table.names:
+ if record.nameID == 1:
+ return record.toUnicode()
+
+ except Exception:
+ pass
+ finally:
+ sys.stderr = stderr_backup
+
+ return None
+
+
+def get_windows_fonts() -> dict[str, Path]:
+ """
+    Get fonts from the Windows registry.
+
+ Returns:
+ Dictionary mapping font family names to their file paths
+ """
+ import winreg
+
+ with winreg.ConnectRegistry(None, winreg.HKEY_LOCAL_MACHINE) as reg:
+ key = winreg.OpenKey(reg, r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts", 0, winreg.KEY_READ)
+ total_fonts = winreg.QueryInfoKey(key)[1]
+ return {
+ name.replace(" (TrueType)", ""): Path(r"C:\Windows\Fonts", filename)
+        for n in range(total_fonts)
+ for name, filename, _ in [winreg.EnumValue(key, n)]
+ }
+
+
+def scan_font_directory(font_dir: Path, fonts: dict[str, Path], log: logging.Logger) -> None:
+ """
+ Scan a single directory for fonts.
+
+ Args:
+ font_dir: Directory to scan
+ fonts: Dictionary to populate with found fonts
+ log: Logger instance for error reporting
+ """
+ font_files = list(font_dir.rglob("*.ttf")) + list(font_dir.rglob("*.otf"))
+
+ for font_file in font_files:
+ try:
+ if family_name := extract_font_family(font_file):
+ if family_name not in fonts:
+ fonts[family_name] = font_file
+ except Exception as e:
+ log.debug(f"Failed to process {font_file}: {e}")
+
+
+def get_unix_fonts() -> dict[str, Path]:
+ """
+ Get fonts from Linux/macOS standard directories.
+
+ Returns:
+ Dictionary mapping font family names to their file paths
+ """
+ log = logging.getLogger("get_system_fonts")
+ fonts = {}
+
+ font_dirs = [
+ Path("/usr/share/fonts"),
+ Path("/usr/local/share/fonts"),
+ Path.home() / ".fonts",
+ Path.home() / ".local/share/fonts",
+ ]
+
+ for font_dir in font_dirs:
+ if not font_dir.exists():
+ continue
+
+ try:
+ scan_font_directory(font_dir, fonts, log)
+ except Exception as e:
+ log.warning(f"Failed to scan {font_dir}: {e}")
+ return fonts
+
+
+def get_system_fonts() -> dict[str, Path]:
+ """
+ Get system fonts as a mapping of font family names to font file paths.
+
+ On Windows: Uses registry to get font display names
+ On Linux/macOS: Scans standard font directories and extracts family names using fontTools
+
+ Returns:
+ Dictionary mapping font family names to their file paths
+ """
+ if sys.platform == "win32":
+ return get_windows_fonts()
+ return get_unix_fonts()
+
+
+# Common Windows font names mapped to their Linux equivalents
+# Ordered by preference (first match is used)
+FONT_ALIASES = {
+ "Arial": ["Liberation Sans", "DejaVu Sans", "Nimbus Sans", "FreeSans"],
+ "Arial Black": ["Liberation Sans", "DejaVu Sans", "Nimbus Sans"],
+ "Arial Bold": ["Liberation Sans", "DejaVu Sans"],
+ "Arial Unicode MS": ["DejaVu Sans", "Noto Sans", "FreeSans"],
+ "Times New Roman": ["Liberation Serif", "DejaVu Serif", "Nimbus Roman", "FreeSerif"],
+ "Courier New": ["Liberation Mono", "DejaVu Sans Mono", "Nimbus Mono PS", "FreeMono"],
+ "Comic Sans MS": ["Comic Neue", "Comic Relief", "DejaVu Sans"],
+ "Georgia": ["Gelasio", "DejaVu Serif", "Liberation Serif"],
+ "Impact": ["Impact", "Anton", "Liberation Sans"],
+ "Trebuchet MS": ["Ubuntu", "DejaVu Sans", "Liberation Sans"],
+ "Verdana": ["DejaVu Sans", "Bitstream Vera Sans", "Liberation Sans"],
+ "Tahoma": ["DejaVu Sans", "Liberation Sans"],
+ "Adobe Arabic": ["Noto Sans Arabic", "DejaVu Sans"],
+ "Noto Sans Thai": ["Noto Sans Thai", "Noto Sans"],
+}
+
+
+def find_case_insensitive(font_name: str, fonts: dict[str, Path]) -> Optional[Path]:
+ """
+ Find font by case-insensitive name match.
+
+ Args:
+ font_name: Font family name to find
+ fonts: Dictionary of available fonts
+
+ Returns:
+ Path to matched font, or None if not found
+ """
+ font_lower = font_name.lower()
+ for name, path in fonts.items():
+ if name.lower() == font_lower:
+ return path
+ return None
+
+
+def find_font_with_fallbacks(font_name: str, system_fonts: dict[str, Path]) -> Optional[Path]:
+ """
+ Find a font by name with intelligent fallback matching.
+
+ Tries multiple strategies in order:
+ 1. Exact match (case-sensitive)
+ 2. Case-insensitive match
+ 3. Alias lookup (Windows → Linux font equivalents)
+ 4. Partial/prefix match
+
+ Args:
+ font_name: The requested font family name (e.g., "Arial", "Times New Roman")
+ system_fonts: Dictionary of available fonts (family name → path)
+
+ Returns:
+        Path to the matched font file, or None if no match is found
+ """
+ if not system_fonts:
+ return None
+
+ # Strategy 1: Exact match (case-sensitive)
+ if font_name in system_fonts:
+ return system_fonts[font_name]
+
+ # Strategy 2: Case-insensitive match
+ if result := find_case_insensitive(font_name, system_fonts):
+ return result
+
+ # Strategy 3: Alias lookup
+ if font_name in FONT_ALIASES:
+ for alias in FONT_ALIASES[font_name]:
+ # Try exact match for alias
+ if alias in system_fonts:
+ return system_fonts[alias]
+ # Try case-insensitive match for alias
+ if result := find_case_insensitive(alias, system_fonts):
+ return result
+
+ # Strategy 4: Partial/prefix match as last resort
+ font_name_lower = font_name.lower()
+ for name, path in system_fonts.items():
+ if name.lower().startswith(font_name_lower):
+ return path
+
+ return None
+
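The four-stage lookup in `find_font_with_fallbacks` (exact, case-insensitive, alias, prefix) can be sketched standalone; the alias table and font paths below are trimmed illustrations, not the module's full data:

```python
from pathlib import Path
from typing import Optional

# Minimal alias table mirroring the shape of FONT_ALIASES above
ALIASES = {"Arial": ["Liberation Sans", "DejaVu Sans"]}

def resolve_font(name: str, fonts: dict[str, Path]) -> Optional[Path]:
    # Strategy 1: exact match
    if name in fonts:
        return fonts[name]
    # Strategy 2: case-insensitive match
    lowered = {k.lower(): v for k, v in fonts.items()}
    if name.lower() in lowered:
        return lowered[name.lower()]
    # Strategy 3: alias lookup (exact, then case-insensitive)
    for alias in ALIASES.get(name, []):
        if alias in fonts:
            return fonts[alias]
        if alias.lower() in lowered:
            return lowered[alias.lower()]
    # Strategy 4: prefix match as last resort
    for k, v in fonts.items():
        if k.lower().startswith(name.lower()):
            return v
    return None

fonts = {"Liberation Sans": Path("/usr/share/fonts/LiberationSans.ttf")}
print(resolve_font("Arial", fonts))       # resolved via alias
print(resolve_font("liberation", fonts))  # resolved via prefix
```

Ordering matters: the cheap exact match short-circuits before any scan, and the prefix scan runs only when every targeted strategy has failed.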
+
+# Mapping of font families to system packages that provide them
+FONT_PACKAGES = {
+ "liberation": {
+ "debian": "fonts-liberation fonts-liberation2",
+ "fonts": ["Liberation Sans", "Liberation Serif", "Liberation Mono"],
+ },
+ "dejavu": {
+ "debian": "fonts-dejavu fonts-dejavu-core fonts-dejavu-extra",
+ "fonts": ["DejaVu Sans", "DejaVu Serif", "DejaVu Sans Mono"],
+ },
+ "noto": {
+ "debian": "fonts-noto fonts-noto-core",
+ "fonts": ["Noto Sans", "Noto Serif", "Noto Sans Mono", "Noto Sans Arabic", "Noto Sans Thai"],
+ },
+ "ubuntu": {
+ "debian": "fonts-ubuntu",
+ "fonts": ["Ubuntu", "Ubuntu Mono"],
+ },
+}
+
+
+def suggest_font_packages(missing_fonts: list[str]) -> dict[str, list[str]]:
+ """
+ Suggest system packages to install for missing fonts.
+
+ Args:
+ missing_fonts: List of font family names that couldn't be found
+
+ Returns:
+ Dictionary mapping package names to lists of fonts they would provide
+ """
+ suggestions = {}
+
+ # Check which fonts from aliases would help
+ needed_aliases = set()
+ for font in missing_fonts:
+ if font in FONT_ALIASES:
+ needed_aliases.update(FONT_ALIASES[font])
+
+ # Map needed aliases to packages
+ for package_name, package_info in FONT_PACKAGES.items():
+ provided_fonts = package_info["fonts"]
+ matching_fonts = [f for f in provided_fonts if f in needed_aliases]
+ if matching_fonts:
+ suggestions[package_info["debian"]] = matching_fonts
+
+ return suggestions
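The suggestion flow above expands each missing font into its alias candidates, then inverts the package table to find installable packages. A self-contained sketch (package names mirror the `FONT_PACKAGES` table and are assumptions about Debian packaging):

```python
# Mirrors the alias -> package suggestion flow, trimmed for illustration
ALIASES = {"Arial": ["Liberation Sans", "DejaVu Sans"]}
PACKAGES = {
    "fonts-liberation": ["Liberation Sans", "Liberation Serif"],
    "fonts-dejavu": ["DejaVu Sans", "DejaVu Serif"],
}

def suggest(missing: list[str]) -> dict[str, list[str]]:
    # Collect every alias that would satisfy a missing font
    needed = {alias for font in missing for alias in ALIASES.get(font, [])}
    # Keep only packages that provide at least one needed alias
    return {
        pkg: hits
        for pkg, provided in PACKAGES.items()
        if (hits := [f for f in provided if f in needed])
    }

print(suggest(["Arial"]))  # both packages provide a usable Arial substitute
```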
class FPS(ast.NodeVisitor):
@@ -461,3 +734,334 @@ class FPS(ast.NodeVisitor):
@classmethod
def parse(cls, expr: str) -> float:
return cls().visit(ast.parse(expr).body[0])
+
+
+"""
+Structured JSON debug logging for unshackle.
+
+Provides comprehensive debugging information for service developers and for troubleshooting.
+When enabled, logs all operations, requests, responses, DRM operations, and errors in JSON format.
+"""
+
+
+class DebugLogger:
+ """
+ Structured JSON debug logger for unshackle.
+
+ Outputs JSON Lines format where each line is a complete JSON object.
+ This makes it easy to parse, filter, and analyze logs programmatically.
+ """
+
+ def __init__(self, log_path: Optional[Path] = None, enabled: bool = False, log_keys: bool = False):
+ """
+ Initialize the debug logger.
+
+ Args:
+ log_path: Path to the log file. If None, logging is disabled.
+ enabled: Whether debug logging is enabled.
+ log_keys: Whether to log decryption keys (for debugging key issues).
+ """
+ self.enabled = enabled and log_path is not None
+ self.log_path = log_path
+ self.session_id = str(uuid4())[:8]
+ self.file_handle = None
+ self.log_keys = log_keys
+
+ if self.enabled:
+ self.log_path.parent.mkdir(parents=True, exist_ok=True)
+ self.file_handle = open(self.log_path, "a", encoding="utf-8")
+ self.log_session_start()
+
+ def log_session_start(self):
+ """Log the start of a new session with environment information."""
+ import platform
+
+ from unshackle.core import __version__
+
+ self.log(
+ level="INFO",
+ operation="session_start",
+ message="Debug logging session started",
+ context={
+ "unshackle_version": __version__,
+ "python_version": sys.version,
+ "platform": platform.platform(),
+ "platform_system": platform.system(),
+ "platform_release": platform.release(),
+ },
+ )
+
+ def log(
+ self,
+ level: str = "DEBUG",
+ operation: str = "",
+ message: str = "",
+ context: Optional[dict[str, Any]] = None,
+ service: Optional[str] = None,
+ error: Optional[Exception] = None,
+ request: Optional[dict[str, Any]] = None,
+ response: Optional[dict[str, Any]] = None,
+ duration_ms: Optional[float] = None,
+ success: Optional[bool] = None,
+ **kwargs,
+ ):
+ """
+ Log a structured JSON entry.
+
+ Args:
+ level: Log level (DEBUG, INFO, WARNING, ERROR)
+ operation: Name of the operation being performed
+ message: Human-readable message
+ context: Additional context information
+ service: Service name (e.g., DSNP, NF)
+ error: Exception object if an error occurred
+ request: Request details (URL, method, headers, body)
+ response: Response details (status, headers, body)
+ duration_ms: Operation duration in milliseconds
+ success: Whether the operation succeeded
+ **kwargs: Additional fields to include in the log entry
+ """
+ if not self.enabled or not self.file_handle:
+ return
+
+ entry = {
+ "timestamp": datetime.now(timezone.utc).isoformat(),
+ "session_id": self.session_id,
+ "level": level,
+ }
+
+ if operation:
+ entry["operation"] = operation
+ if message:
+ entry["message"] = message
+ if service:
+ entry["service"] = service
+ if context:
+ entry["context"] = self.sanitize_data(context)
+ if request:
+ entry["request"] = self.sanitize_data(request)
+ if response:
+ entry["response"] = self.sanitize_data(response)
+ if duration_ms is not None:
+ entry["duration_ms"] = duration_ms
+ if success is not None:
+ entry["success"] = success
+
+ if error:
+ entry["error"] = {
+ "type": type(error).__name__,
+ "message": str(error),
+ "traceback": traceback.format_exception(type(error), error, error.__traceback__),
+ }
+
+ for key, value in kwargs.items():
+ if key not in entry:
+ entry[key] = self.sanitize_data(value)
+
+ try:
+ self.file_handle.write(json.dumps(entry, default=str) + "\n")
+ self.file_handle.flush()
+ except Exception as e:
+ print(f"Failed to write debug log: {e}", file=sys.stderr)
+
+ def sanitize_data(self, data: Any) -> Any:
+ """
+ Sanitize data for JSON serialization.
+ Handles complex objects and removes sensitive information.
+ """
+ if data is None:
+ return None
+
+ if isinstance(data, (str, int, float, bool)):
+ return data
+
+ if isinstance(data, (list, tuple)):
+ return [self.sanitize_data(item) for item in data]
+
+ if isinstance(data, dict):
+ sanitized = {}
+ for key, value in data.items():
+ key_lower = str(key).lower()
+ has_prefix = key_lower.startswith("has_")
+
+ is_always_sensitive = not has_prefix and any(
+ sensitive in key_lower for sensitive in ["password", "token", "secret", "auth", "cookie"]
+ )
+
+ is_key_field = (
+ "key" in key_lower
+ and not has_prefix
+ and not any(safe in key_lower for safe in ["_count", "_id", "_type", "kid", "keys_", "key_found"])
+ )
+
+ should_redact = is_always_sensitive or (is_key_field and not self.log_keys)
+
+ if should_redact:
+ sanitized[key] = "[REDACTED]"
+ else:
+ sanitized[key] = self.sanitize_data(value)
+ return sanitized
+
+ if isinstance(data, bytes):
+ try:
+ return data.hex()
+ except Exception:
+ return "[BINARY_DATA]"
+
+ if isinstance(data, Path):
+ return str(data)
+
+ try:
+ return str(data)
+ except Exception:
+ return f"[{type(data).__name__}]"
+
+ def log_operation_start(self, operation: str, **kwargs) -> str:
+ """
+ Log the start of an operation and return an operation ID.
+
+ Args:
+ operation: Name of the operation
+ **kwargs: Additional context
+
+ Returns:
+ Operation ID that can be used to log the end of the operation
+ """
+ op_id = str(uuid4())[:8]
+ self.log(
+ level="DEBUG",
+ operation=f"{operation}_start",
+ message=f"Starting operation: {operation}",
+ operation_id=op_id,
+ **kwargs,
+ )
+ return op_id
+
+ def log_operation_end(
+ self, operation: str, operation_id: str, success: bool = True, duration_ms: Optional[float] = None, **kwargs
+ ):
+ """
+ Log the end of an operation.
+
+ Args:
+ operation: Name of the operation
+ operation_id: Operation ID from log_operation_start
+ success: Whether the operation succeeded
+ duration_ms: Operation duration in milliseconds
+ **kwargs: Additional context
+ """
+ self.log(
+ level="INFO" if success else "ERROR",
+ operation=f"{operation}_end",
+ message=f"Finished operation: {operation}",
+ operation_id=operation_id,
+ success=success,
+ duration_ms=duration_ms,
+ **kwargs,
+ )
+
+ def log_service_call(self, method: str, url: str, **kwargs):
+ """
+ Log a service API call.
+
+ Args:
+ method: HTTP method (GET, POST, etc.)
+ url: Request URL
+ **kwargs: Additional request details (headers, body, etc.)
+ """
+ self.log(level="DEBUG", operation="service_call", request={"method": method, "url": url, **kwargs})
+
+ def log_drm_operation(self, drm_type: str, operation: str, **kwargs):
+ """
+ Log a DRM operation (PSSH extraction, license request, key retrieval).
+
+ Args:
+ drm_type: DRM type (Widevine, PlayReady, etc.)
+ operation: DRM operation name
+ **kwargs: Additional context (PSSH, KIDs, keys, etc.)
+ """
+ self.log(
+ level="DEBUG", operation=f"drm_{operation}", message=f"{drm_type} {operation}", drm_type=drm_type, **kwargs
+ )
+
+ def log_vault_query(self, vault_name: str, operation: str, **kwargs):
+ """
+ Log a vault query operation.
+
+ Args:
+ vault_name: Name of the vault
+ operation: Vault operation (get_key, add_key, etc.)
+ **kwargs: Additional context (KID, key, success, etc.)
+ """
+ self.log(
+ level="DEBUG",
+ operation=f"vault_{operation}",
+ message=f"Vault {vault_name}: {operation}",
+ vault=vault_name,
+ **kwargs,
+ )
+
+ def log_error(self, operation: str, error: Exception, **kwargs):
+ """
+ Log an error with full context.
+
+ Args:
+ operation: Operation that failed
+ error: Exception that occurred
+ **kwargs: Additional context
+ """
+ self.log(
+ level="ERROR",
+ operation=operation,
+ message=f"Error in {operation}: {str(error)}",
+ error=error,
+ success=False,
+ **kwargs,
+ )
+
+ def close(self):
+ """Close the log file and clean up resources."""
+ if self.file_handle:
+ self.log(level="INFO", operation="session_end", message="Debug logging session ended")
+ self.file_handle.close()
+ self.file_handle = None
+
+
+# Global debug logger instance
+_debug_logger: Optional[DebugLogger] = None
+
+
+def get_debug_logger() -> Optional[DebugLogger]:
+ """Get the global debug logger instance."""
+ return _debug_logger
+
+
+def init_debug_logger(log_path: Optional[Path] = None, enabled: bool = False, log_keys: bool = False):
+ """
+ Initialize the global debug logger.
+
+ Args:
+ log_path: Path to the log file
+ enabled: Whether debug logging is enabled
+ log_keys: Whether to log decryption keys (for debugging key issues)
+ """
+ global _debug_logger
+ if _debug_logger:
+ _debug_logger.close()
+ _debug_logger = DebugLogger(log_path=log_path, enabled=enabled, log_keys=log_keys)
+
+
+def close_debug_logger():
+ """Close the global debug logger."""
+ global _debug_logger
+ if _debug_logger:
+ _debug_logger.close()
+ _debug_logger = None
+
+
+__all__ = (
+ "DebugLogger",
+ "get_debug_logger",
+ "init_debug_logger",
+ "close_debug_logger",
+)
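The JSON Lines format the `DebugLogger` writes (one complete JSON object per line) is what makes the log greppable and filterable without a streaming parser. A minimal sketch of writing and filtering such a log, with hypothetical entries in memory:

```python
import io
import json
from datetime import datetime, timezone

# Write two entries, one object per line (JSON Lines)
buf = io.StringIO()
for level, op in [("DEBUG", "service_call"), ("ERROR", "download")]:
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "session_id": "ab12cd34",  # hypothetical 8-char session id
        "level": level,
        "operation": op,
    }
    buf.write(json.dumps(entry) + "\n")

# Filtering is just line-by-line parsing -- no framing needed
errors = [
    json.loads(line)
    for line in buf.getvalue().splitlines()
    if json.loads(line)["level"] == "ERROR"
]
print([e["operation"] for e in errors])  # ['download']
```

The same filtering works from the shell with `grep '"level": "ERROR"'` or `jq 'select(.level == "ERROR")'` against the log file.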
diff --git a/unshackle/core/utils/tags.py b/unshackle/core/utils/tags.py
index 5a5e616..5fad48c 100644
--- a/unshackle/core/utils/tags.py
+++ b/unshackle/core/utils/tags.py
@@ -1,7 +1,6 @@
from __future__ import annotations
import logging
-import os
import re
import subprocess
import tempfile
@@ -44,7 +43,11 @@ def _get_session() -> requests.Session:
def _api_key() -> Optional[str]:
- return config.tmdb_api_key or os.getenv("TMDB_API_KEY")
+ return config.tmdb_api_key
+
+
+def _simkl_client_id() -> Optional[str]:
+ return config.simkl_client_id
def _clean(s: str) -> str:
@@ -62,10 +65,44 @@ def fuzzy_match(a: str, b: str, threshold: float = 0.8) -> bool:
return ratio >= threshold
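The diff shows only the tail of `fuzzy_match`; a plausible standalone reconstruction using `difflib.SequenceMatcher` (the case-folding and exact ratio computation are assumptions, not confirmed by the hunk shown):

```python
from difflib import SequenceMatcher

def fuzzy_match(a: str, b: str, threshold: float = 0.8) -> bool:
    # Case-insensitive similarity ratio; 1.0 means identical strings
    ratio = SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return ratio >= threshold

print(fuzzy_match("The Office", "The Office (US)"))
```

A threshold near 0.8 tolerates suffixes like region tags while still rejecting unrelated titles.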
-def search_simkl(title: str, year: Optional[int], kind: str) -> Tuple[Optional[dict], Optional[str], Optional[int]]:
- """Search Simkl API for show information by filename (no auth required)."""
+def search_simkl(
+ title: str,
+ year: Optional[int],
+ kind: str,
+ title_cacher=None,
+ cache_title_id: Optional[str] = None,
+ cache_region: Optional[str] = None,
+ cache_account_hash: Optional[str] = None,
+) -> Tuple[Optional[dict], Optional[str], Optional[int]]:
+ """Search Simkl API for show information by filename."""
+
+ if title_cacher and cache_title_id:
+ cached_simkl = title_cacher.get_cached_simkl(cache_title_id, cache_region, cache_account_hash)
+ if cached_simkl:
+ log.debug("Using cached Simkl data")
+ if cached_simkl.get("type") == "episode" and "show" in cached_simkl:
+ show_info = cached_simkl["show"]
+ show_title = show_info.get("title")
+ tmdb_id = show_info.get("ids", {}).get("tmdbtv")
+ if tmdb_id:
+ tmdb_id = int(tmdb_id)
+ return cached_simkl, show_title, tmdb_id
+ elif cached_simkl.get("type") == "movie" and "movie" in cached_simkl:
+ movie_info = cached_simkl["movie"]
+ movie_title = movie_info.get("title")
+ ids = movie_info.get("ids", {})
+ tmdb_id = ids.get("tmdb") or ids.get("moviedb")
+ if tmdb_id:
+ tmdb_id = int(tmdb_id)
+ return cached_simkl, movie_title, tmdb_id
+
log.debug("Searching Simkl for %r (%s, %s)", title, kind, year)
+ client_id = _simkl_client_id()
+ if not client_id:
+        log.debug("No Simkl client ID configured; skipping Simkl search")
+ return None, None, None
+
# Construct appropriate filename based on type
filename = f"{title}"
if year:
@@ -78,7 +115,8 @@ def search_simkl(title: str, year: Optional[int], kind: str) -> Tuple[Optional[d
try:
session = _get_session()
- resp = session.post("https://api.simkl.com/search/file", json={"file": filename}, timeout=30)
+ headers = {"simkl-api-key": client_id}
+ resp = session.post("https://api.simkl.com/search/file", json={"file": filename}, headers=headers, timeout=30)
resp.raise_for_status()
data = resp.json()
log.debug("Simkl API response received")
@@ -102,19 +140,23 @@ def search_simkl(title: str, year: Optional[int], kind: str) -> Tuple[Optional[d
log.debug("Simkl year mismatch: searched %d, got %d", year, show_year)
return None, None, None
+ if title_cacher and cache_title_id:
+ try:
+ title_cacher.cache_simkl(cache_title_id, data, cache_region, cache_account_hash)
+ except Exception as exc:
+ log.debug("Failed to cache Simkl data: %s", exc)
+
tmdb_id = show_info.get("ids", {}).get("tmdbtv")
if tmdb_id:
tmdb_id = int(tmdb_id)
log.debug("Simkl -> %s (TMDB ID %s)", show_title, tmdb_id)
return data, show_title, tmdb_id
- # Handle movie responses
elif data.get("type") == "movie" and "movie" in data:
movie_info = data["movie"]
movie_title = movie_info.get("title")
movie_year = movie_info.get("year")
- # Verify title matches and year if provided
if not fuzzy_match(movie_title, title):
log.debug("Simkl title mismatch: searched %r, got %r", title, movie_title)
return None, None, None
@@ -122,6 +164,12 @@ def search_simkl(title: str, year: Optional[int], kind: str) -> Tuple[Optional[d
log.debug("Simkl year mismatch: searched %d, got %d", year, movie_year)
return None, None, None
+ if title_cacher and cache_title_id:
+ try:
+ title_cacher.cache_simkl(cache_title_id, data, cache_region, cache_account_hash)
+ except Exception as exc:
+ log.debug("Failed to cache Simkl data: %s", exc)
+
ids = movie_info.get("ids", {})
tmdb_id = ids.get("tmdb") or ids.get("moviedb")
if tmdb_id:
@@ -135,18 +183,85 @@ def search_simkl(title: str, year: Optional[int], kind: str) -> Tuple[Optional[d
return None, None, None
-def search_show_info(title: str, year: Optional[int], kind: str) -> Tuple[Optional[int], Optional[str], Optional[str]]:
+def search_show_info(
+ title: str,
+ year: Optional[int],
+ kind: str,
+ title_cacher=None,
+ cache_title_id: Optional[str] = None,
+ cache_region: Optional[str] = None,
+ cache_account_hash: Optional[str] = None,
+) -> Tuple[Optional[int], Optional[str], Optional[str]]:
"""Search for show information, trying Simkl first, then TMDB fallback. Returns (tmdb_id, title, source)."""
- simkl_data, simkl_title, simkl_tmdb_id = search_simkl(title, year, kind)
+ simkl_data, simkl_title, simkl_tmdb_id = search_simkl(
+ title, year, kind, title_cacher, cache_title_id, cache_region, cache_account_hash
+ )
if simkl_data and simkl_title and fuzzy_match(simkl_title, title):
return simkl_tmdb_id, simkl_title, "simkl"
- tmdb_id, tmdb_title = search_tmdb(title, year, kind)
+ tmdb_id, tmdb_title = search_tmdb(title, year, kind, title_cacher, cache_title_id, cache_region, cache_account_hash)
return tmdb_id, tmdb_title, "tmdb"
-def search_tmdb(title: str, year: Optional[int], kind: str) -> Tuple[Optional[int], Optional[str]]:
+def _fetch_tmdb_detail(tmdb_id: int, kind: str) -> Optional[dict]:
+ """Fetch full TMDB detail response for caching."""
+ api_key = _api_key()
+ if not api_key:
+ return None
+
+ try:
+ session = _get_session()
+ r = session.get(
+ f"https://api.themoviedb.org/3/{kind}/{tmdb_id}",
+ params={"api_key": api_key},
+ timeout=30,
+ )
+ r.raise_for_status()
+ return r.json()
+ except requests.RequestException as exc:
+ log.debug("Failed to fetch TMDB detail: %s", exc)
+ return None
+
+
+def _fetch_tmdb_external_ids(tmdb_id: int, kind: str) -> Optional[dict]:
+ """Fetch full TMDB external_ids response for caching."""
+ api_key = _api_key()
+ if not api_key:
+ return None
+
+ try:
+ session = _get_session()
+ r = session.get(
+ f"https://api.themoviedb.org/3/{kind}/{tmdb_id}/external_ids",
+ params={"api_key": api_key},
+ timeout=30,
+ )
+ r.raise_for_status()
+ return r.json()
+ except requests.RequestException as exc:
+ log.debug("Failed to fetch TMDB external IDs: %s", exc)
+ return None
+
+
+def search_tmdb(
+ title: str,
+ year: Optional[int],
+ kind: str,
+ title_cacher=None,
+ cache_title_id: Optional[str] = None,
+ cache_region: Optional[str] = None,
+ cache_account_hash: Optional[str] = None,
+) -> Tuple[Optional[int], Optional[str]]:
+ if title_cacher and cache_title_id:
+ cached_tmdb = title_cacher.get_cached_tmdb(cache_title_id, kind, cache_region, cache_account_hash)
+ if cached_tmdb and cached_tmdb.get("detail"):
+ detail = cached_tmdb["detail"]
+ tmdb_id = detail.get("id")
+ tmdb_title = detail.get("title") or detail.get("name")
+ log.debug("Using cached TMDB data: %r (ID %s)", tmdb_title, tmdb_id)
+ return tmdb_id, tmdb_title
+
api_key = _api_key()
if not api_key:
return None, None
@@ -205,15 +320,41 @@ def search_tmdb(title: str, year: Optional[int], kind: str) -> Tuple[Optional[in
)
if best_id is not None:
+ if title_cacher and cache_title_id:
+ try:
+ detail_response = _fetch_tmdb_detail(best_id, kind)
+ external_ids_response = _fetch_tmdb_external_ids(best_id, kind)
+ if detail_response and external_ids_response:
+ title_cacher.cache_tmdb(
+ cache_title_id, detail_response, external_ids_response, kind, cache_region, cache_account_hash
+ )
+ except Exception as exc:
+ log.debug("Failed to cache TMDB data: %s", exc)
+
return best_id, best_title
first = results[0]
return first.get("id"), first.get("title") or first.get("name")
-def get_title(tmdb_id: int, kind: str) -> Optional[str]:
+def get_title(
+ tmdb_id: int,
+ kind: str,
+ title_cacher=None,
+ cache_title_id: Optional[str] = None,
+ cache_region: Optional[str] = None,
+ cache_account_hash: Optional[str] = None,
+) -> Optional[str]:
"""Fetch the name/title of a TMDB entry by ID."""
+ if title_cacher and cache_title_id:
+ cached_tmdb = title_cacher.get_cached_tmdb(cache_title_id, kind, cache_region, cache_account_hash)
+ if cached_tmdb and cached_tmdb.get("detail"):
+ detail = cached_tmdb["detail"]
+ tmdb_title = detail.get("title") or detail.get("name")
+ log.debug("Using cached TMDB title: %r", tmdb_title)
+ return tmdb_title
+
api_key = _api_key()
if not api_key:
return None
@@ -226,17 +367,44 @@ def get_title(tmdb_id: int, kind: str) -> Optional[str]:
timeout=30,
)
r.raise_for_status()
+ js = r.json()
+
+ if title_cacher and cache_title_id:
+ try:
+ external_ids_response = _fetch_tmdb_external_ids(tmdb_id, kind)
+ if external_ids_response:
+ title_cacher.cache_tmdb(
+ cache_title_id, js, external_ids_response, kind, cache_region, cache_account_hash
+ )
+ except Exception as exc:
+ log.debug("Failed to cache TMDB data: %s", exc)
+
+ return js.get("title") or js.get("name")
except requests.RequestException as exc:
log.debug("Failed to fetch TMDB title: %s", exc)
return None
- js = r.json()
- return js.get("title") or js.get("name")
-
-def get_year(tmdb_id: int, kind: str) -> Optional[int]:
+def get_year(
+ tmdb_id: int,
+ kind: str,
+ title_cacher=None,
+ cache_title_id: Optional[str] = None,
+ cache_region: Optional[str] = None,
+ cache_account_hash: Optional[str] = None,
+) -> Optional[int]:
"""Fetch the release year of a TMDB entry by ID."""
+ if title_cacher and cache_title_id:
+ cached_tmdb = title_cacher.get_cached_tmdb(cache_title_id, kind, cache_region, cache_account_hash)
+ if cached_tmdb and cached_tmdb.get("detail"):
+ detail = cached_tmdb["detail"]
+ date = detail.get("release_date") or detail.get("first_air_date")
+ if date and len(date) >= 4 and date[:4].isdigit():
+ year = int(date[:4])
+ log.debug("Using cached TMDB year: %d", year)
+ return year
+
api_key = _api_key()
if not api_key:
return None
@@ -249,18 +417,41 @@ def get_year(tmdb_id: int, kind: str) -> Optional[int]:
timeout=30,
)
r.raise_for_status()
+ js = r.json()
+
+ if title_cacher and cache_title_id:
+ try:
+ external_ids_response = _fetch_tmdb_external_ids(tmdb_id, kind)
+ if external_ids_response:
+ title_cacher.cache_tmdb(
+ cache_title_id, js, external_ids_response, kind, cache_region, cache_account_hash
+ )
+ except Exception as exc:
+ log.debug("Failed to cache TMDB data: %s", exc)
+
+ date = js.get("release_date") or js.get("first_air_date")
+ if date and len(date) >= 4 and date[:4].isdigit():
+ return int(date[:4])
+ return None
except requests.RequestException as exc:
log.debug("Failed to fetch TMDB year: %s", exc)
return None
- js = r.json()
- date = js.get("release_date") or js.get("first_air_date")
- if date and len(date) >= 4 and date[:4].isdigit():
- return int(date[:4])
- return None
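The year extraction used in `get_year` (take the first four characters of `release_date` or `first_air_date` when they are digits) can be isolated; the field names follow TMDB's response shape:

```python
from typing import Optional

def year_from_tmdb(detail: dict) -> Optional[int]:
    # TMDB returns "release_date" for movies and "first_air_date" for TV
    date = detail.get("release_date") or detail.get("first_air_date")
    if date and len(date) >= 4 and date[:4].isdigit():
        return int(date[:4])
    return None

print(year_from_tmdb({"release_date": "1999-03-31"}))  # 1999
print(year_from_tmdb({"first_air_date": ""}))          # None
```

The digit check guards against malformed or partial dates rather than assuming a full `YYYY-MM-DD` string.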
+def external_ids(
+ tmdb_id: int,
+ kind: str,
+ title_cacher=None,
+ cache_title_id: Optional[str] = None,
+ cache_region: Optional[str] = None,
+ cache_account_hash: Optional[str] = None,
+) -> dict:
+ if title_cacher and cache_title_id:
+ cached_tmdb = title_cacher.get_cached_tmdb(cache_title_id, kind, cache_region, cache_account_hash)
+ if cached_tmdb and cached_tmdb.get("external_ids"):
+ log.debug("Using cached TMDB external IDs")
+ return cached_tmdb["external_ids"]
-def external_ids(tmdb_id: int, kind: str) -> dict:
api_key = _api_key()
if not api_key:
return {}
@@ -277,13 +468,22 @@ def external_ids(tmdb_id: int, kind: str) -> dict:
r.raise_for_status()
js = r.json()
log.debug("External IDs response: %s", js)
+
+ if title_cacher and cache_title_id:
+ try:
+ detail_response = _fetch_tmdb_detail(tmdb_id, kind)
+ if detail_response:
+ title_cacher.cache_tmdb(cache_title_id, detail_response, js, kind, cache_region, cache_account_hash)
+ except Exception as exc:
+ log.debug("Failed to cache TMDB data: %s", exc)
+
return js
except requests.RequestException as exc:
log.warning("Failed to fetch external IDs for %s %s: %s", kind, tmdb_id, exc)
return {}
-def _apply_tags(path: Path, tags: dict[str, str]) -> None:
+def apply_tags(path: Path, tags: dict[str, str]) -> None:
if not tags:
return
if not binaries.Mkvpropedit:
@@ -334,83 +534,109 @@ def tag_file(path: Path, title: Title, tmdb_id: Optional[int] | None = None) ->
name = title.title
year = title.year
else:
- _apply_tags(path, custom_tags)
+ apply_tags(path, custom_tags)
return
if config.tag_imdb_tmdb:
- # If tmdb_id is provided (via --tmdb), skip Simkl and use TMDB directly
- if tmdb_id is not None:
- log.debug("Using provided TMDB ID %s for tags", tmdb_id)
- else:
- # Try Simkl first for automatic lookup
- simkl_data, simkl_title, simkl_tmdb_id = search_simkl(name, year, kind)
-
- if simkl_data and simkl_title and fuzzy_match(simkl_title, name):
- log.debug("Using Simkl data for tags")
- if simkl_tmdb_id:
- tmdb_id = simkl_tmdb_id
-
- # Handle TV show data from Simkl
- if simkl_data.get("type") == "episode" and "show" in simkl_data:
- show_ids = simkl_data.get("show", {}).get("ids", {})
- if show_ids.get("imdb"):
- standard_tags["IMDB"] = show_ids["imdb"]
- if show_ids.get("tvdb"):
- standard_tags["TVDB2"] = f"series/{show_ids['tvdb']}"
- if show_ids.get("tmdbtv"):
- standard_tags["TMDB"] = f"tv/{show_ids['tmdbtv']}"
-
- # Handle movie data from Simkl
- elif simkl_data.get("type") == "movie" and "movie" in simkl_data:
- movie_ids = simkl_data.get("movie", {}).get("ids", {})
- if movie_ids.get("imdb"):
- standard_tags["IMDB"] = movie_ids["imdb"]
- if movie_ids.get("tvdb"):
- standard_tags["TVDB2"] = f"movies/{movie_ids['tvdb']}"
- if movie_ids.get("tmdb"):
- standard_tags["TMDB"] = f"movie/{movie_ids['tmdb']}"
-
- # Use TMDB API for additional metadata (either from provided ID or Simkl lookup)
+ # Check if we have any API keys available for metadata lookup
api_key = _api_key()
- if not api_key:
- log.debug("No TMDB API key set; applying basic tags only")
- _apply_tags(path, custom_tags)
+ simkl_client = _simkl_client_id()
+
+ if not api_key and not simkl_client:
+ log.debug("No TMDB API key or Simkl client ID configured; skipping IMDB/TMDB tag lookup")
+ apply_tags(path, custom_tags)
return
-
- tmdb_title: Optional[str] = None
- if tmdb_id is None:
- tmdb_id, tmdb_title = search_tmdb(name, year, kind)
- log.debug("TMDB search result: %r (ID %s)", tmdb_title, tmdb_id)
- if not tmdb_id or not tmdb_title or not fuzzy_match(tmdb_title, name):
- log.debug("TMDB search did not match; skipping external ID lookup")
- _apply_tags(path, custom_tags)
- return
-
- prefix = "movie" if kind == "movie" else "tv"
- standard_tags["TMDB"] = f"{prefix}/{tmdb_id}"
- try:
- ids = external_ids(tmdb_id, kind)
- except requests.RequestException as exc:
- log.debug("Failed to fetch external IDs: %s", exc)
- ids = {}
else:
- log.debug("External IDs found: %s", ids)
-
- imdb_id = ids.get("imdb_id")
- if imdb_id:
- standard_tags["IMDB"] = imdb_id
- tvdb_id = ids.get("tvdb_id")
- if tvdb_id:
- if kind == "movie":
- standard_tags["TVDB2"] = f"movies/{tvdb_id}"
+ # If tmdb_id is provided (via --tmdb), skip Simkl and use TMDB directly
+ if tmdb_id is not None:
+ log.debug("Using provided TMDB ID %s for tags", tmdb_id)
else:
- standard_tags["TVDB2"] = f"series/{tvdb_id}"
+ # Try Simkl first for automatic lookup (only if client ID is available)
+ if simkl_client:
+ simkl_data, simkl_title, simkl_tmdb_id = search_simkl(name, year, kind)
+
+ if simkl_data and simkl_title and fuzzy_match(simkl_title, name):
+ log.debug("Using Simkl data for tags")
+ if simkl_tmdb_id:
+ tmdb_id = simkl_tmdb_id
+
+ # Handle TV show data from Simkl
+ if simkl_data.get("type") == "episode" and "show" in simkl_data:
+ show_ids = simkl_data.get("show", {}).get("ids", {})
+ if show_ids.get("imdb"):
+ standard_tags["IMDB"] = show_ids["imdb"]
+ if show_ids.get("tvdb"):
+ standard_tags["TVDB2"] = f"series/{show_ids['tvdb']}"
+ if show_ids.get("tmdbtv"):
+ standard_tags["TMDB"] = f"tv/{show_ids['tmdbtv']}"
+
+ # Handle movie data from Simkl
+ elif simkl_data.get("type") == "movie" and "movie" in simkl_data:
+ movie_ids = simkl_data.get("movie", {}).get("ids", {})
+ if movie_ids.get("imdb"):
+ standard_tags["IMDB"] = movie_ids["imdb"]
+ if movie_ids.get("tvdb"):
+ standard_tags["TVDB2"] = f"movies/{movie_ids['tvdb']}"
+ if movie_ids.get("tmdb"):
+ standard_tags["TMDB"] = f"movie/{movie_ids['tmdb']}"
+
+ # Use TMDB API for additional metadata (either from provided ID or Simkl lookup)
+ if api_key:
+ tmdb_title: Optional[str] = None
+ if tmdb_id is None:
+ tmdb_id, tmdb_title = search_tmdb(name, year, kind)
+ log.debug("TMDB search result: %r (ID %s)", tmdb_title, tmdb_id)
+ if not tmdb_id or not tmdb_title or not fuzzy_match(tmdb_title, name):
+ log.debug("TMDB search did not match; skipping external ID lookup")
+ else:
+ prefix = "movie" if kind == "movie" else "tv"
+ standard_tags["TMDB"] = f"{prefix}/{tmdb_id}"
+ try:
+ ids = external_ids(tmdb_id, kind)
+ except requests.RequestException as exc:
+ log.debug("Failed to fetch external IDs: %s", exc)
+ ids = {}
+ else:
+ log.debug("External IDs found: %s", ids)
+
+ imdb_id = ids.get("imdb_id")
+ if imdb_id:
+ standard_tags["IMDB"] = imdb_id
+ tvdb_id = ids.get("tvdb_id")
+ if tvdb_id:
+ if kind == "movie":
+ standard_tags["TVDB2"] = f"movies/{tvdb_id}"
+ else:
+ standard_tags["TVDB2"] = f"series/{tvdb_id}"
+ elif tmdb_id is not None:
+ # tmdb_id was provided or found via Simkl
+ prefix = "movie" if kind == "movie" else "tv"
+ standard_tags["TMDB"] = f"{prefix}/{tmdb_id}"
+ try:
+ ids = external_ids(tmdb_id, kind)
+ except requests.RequestException as exc:
+ log.debug("Failed to fetch external IDs: %s", exc)
+ ids = {}
+ else:
+ log.debug("External IDs found: %s", ids)
+
+ imdb_id = ids.get("imdb_id")
+ if imdb_id:
+ standard_tags["IMDB"] = imdb_id
+ tvdb_id = ids.get("tvdb_id")
+ if tvdb_id:
+ if kind == "movie":
+ standard_tags["TVDB2"] = f"movies/{tvdb_id}"
+ else:
+ standard_tags["TVDB2"] = f"series/{tvdb_id}"
+ else:
+ log.debug("No TMDB API key configured; skipping TMDB external ID lookup")
merged_tags = {
**custom_tags,
**standard_tags,
}
- _apply_tags(path, merged_tags)
+ apply_tags(path, merged_tags)
__all__ = [
diff --git a/unshackle/core/utils/webvtt.py b/unshackle/core/utils/webvtt.py
index 76a8a36..9379fc6 100644
--- a/unshackle/core/utils/webvtt.py
+++ b/unshackle/core/utils/webvtt.py
@@ -3,8 +3,11 @@ import sys
import typing
from typing import Optional
+import pysubs2
from pycaption import Caption, CaptionList, CaptionNode, CaptionReadError, WebVTTReader, WebVTTWriter
+from unshackle.core.config import config
+
class CaptionListExt(CaptionList):
@typing.no_type_check
@@ -142,7 +145,24 @@ def merge_segmented_webvtt(vtt_raw: str, segment_durations: Optional[list[int]]
"""
MPEG_TIMESCALE = 90_000
- vtt = WebVTTReaderExt().read(vtt_raw)
+ # Check config for conversion method preference
+ conversion_method = config.subtitle.get("conversion_method", "auto")
+ use_pysubs2 = conversion_method in ("pysubs2", "auto")
+
+ if use_pysubs2:
+ # Try using pysubs2 first for more lenient parsing
+ try:
+ # Use pysubs2 to parse and normalize the VTT
+ subs = pysubs2.SSAFile.from_string(vtt_raw)
+ # Convert back to WebVTT string for pycaption processing
+ normalized_vtt = subs.to_string("vtt")
+ vtt = WebVTTReaderExt().read(normalized_vtt)
+ except Exception:
+ # Fall back to direct pycaption parsing
+ vtt = WebVTTReaderExt().read(vtt_raw)
+ else:
+ # Use pycaption directly
+ vtt = WebVTTReaderExt().read(vtt_raw)
for lang in vtt.get_languages():
prev_caption = None
duplicate_index: list[int] = []
diff --git a/unshackle/unshackle-example.yaml b/unshackle/unshackle-example.yaml
index 74447c1..36e7c2c 100644
--- a/unshackle/unshackle-example.yaml
+++ b/unshackle/unshackle-example.yaml
@@ -1,3 +1,10 @@
+# API key for The Movie Database (TMDB)
+tmdb_api_key: ""
+
+# Client ID for SIMKL API (optional, improves metadata matching)
+# Get your free client ID at: https://simkl.com/settings/developer/
+simkl_client_id: ""
+
# Group or Username to postfix to the end of all download filenames following a dash
tag: user_tag
@@ -32,6 +39,26 @@ title_cache_enabled: true # Enable/disable title caching globally (default: true
title_cache_time: 1800 # Cache duration in seconds (default: 1800 = 30 minutes)
title_cache_max_retention: 86400 # Maximum cache retention for fallback when API fails (default: 86400 = 24 hours)
+# Debug logging configuration
+# Comprehensive JSON-based debug logging for troubleshooting and service development
+debug: false # Enable structured JSON debug logging (default: false)
+ # When enabled with --debug flag or set to true:
+ # - Creates JSON Lines (.jsonl) log files with complete debugging context
+ # - Logs: session info, CLI params, service config, CDM details, authentication,
+ # titles, tracks metadata, DRM operations, vault queries, errors with stack traces
+  #   - File location: logs/unshackle_debug_{service}_{time}.jsonl
+  #   - Also creates text log: logs/unshackle_root_{time}.log
+
+debug_keys: false # Log decryption keys in debug logs (default: false)
+ # Set to true to include actual decryption keys in logs
+ # Useful for debugging key retrieval and decryption issues
+ # SECURITY NOTE: Passwords, tokens, cookies, and session tokens
+ # are ALWAYS redacted regardless of this setting
+ # Only affects: content_key, key fields (the actual CEKs)
+ # Never affects: kid, keys_count, key_id (metadata is always logged)
+
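The redaction policy described in the `debug_keys` comments above can be sketched as follows. This is an illustrative sketch, not unshackle's actual implementation: credential-like fields are always redacted, content keys are redacted only when `debug_keys` is false, and key metadata always passes through. The field names mirror the ones listed in the comments.

```python
# Sketch of the debug-log redaction policy (hypothetical, for illustration).
ALWAYS_REDACT = {"password", "token", "cookie", "session_token"}  # never logged
KEY_FIELDS = {"content_key", "key"}  # logged only when debug_keys is true


def redact(record: dict, debug_keys: bool = False) -> dict:
    """Return a copy of a debug record with sensitive fields masked."""
    out = {}
    for field, value in record.items():
        if field in ALWAYS_REDACT or (field in KEY_FIELDS and not debug_keys):
            out[field] = "<redacted>"
        else:
            out[field] = value  # metadata like kid, keys_count, key_id survives
    return out
```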
# Muxing configuration
muxing:
set_title: false
@@ -109,6 +136,74 @@ cdm:
default: netflix_standard_l3
# Use pywidevine Serve-compliant Remote CDMs
+
+# Example: Custom CDM API Configuration
+# This demonstrates the highly configurable custom_api type that can adapt to any CDM API format
+# - name: "chrome"
+# type: "custom_api"
+# host: "http://remotecdm.test/"
+# timeout: 30
+# device:
+# name: "ChromeCDM"
+# type: "CHROME"
+# system_id: 34312
+# security_level: 3
+# auth:
+# type: "header"
+# header_name: "x-api-key"
+# key: "YOUR_API_KEY_HERE"
+# custom_headers:
+# User-Agent: "Unshackle/2.0.0"
+# endpoints:
+# get_request:
+# path: "/get-challenge"
+# method: "POST"
+# timeout: 30
+# decrypt_response:
+# path: "/get-keys"
+# method: "POST"
+# timeout: 30
+# request_mapping:
+# get_request:
+# param_names:
+# scheme: "device"
+# init_data: "init_data"
+# static_params:
+# scheme: "Widevine"
+# decrypt_response:
+# param_names:
+# scheme: "device"
+# license_request: "license_request"
+# license_response: "license_response"
+# static_params:
+# scheme: "Widevine"
+# response_mapping:
+# get_request:
+# fields:
+# challenge: "challenge"
+# session_id: "session_id"
+# message: "message"
+# message_type: "message_type"
+# response_types:
+# - condition: "message_type == 'license-request'"
+# type: "license_request"
+# success_conditions:
+# - "message == 'success'"
+# decrypt_response:
+# fields:
+# keys: "keys"
+# message: "message"
+# key_fields:
+# kid: "kid"
+# key: "key"
+# type: "type"
+# success_conditions:
+# - "message == 'success'"
+# caching:
+# enabled: true
+# use_vaults: true
+# check_cached_first: true
+
remote_cdm:
- name: "chrome"
device_name: chrome
@@ -239,27 +334,40 @@ headers:
# Override default filenames used across unshackle
filenames:
- log: "unshackle_{name}_{time}.log"
+ debug_log: "unshackle_debug_{service}_{time}.jsonl" # JSON Lines debug log file
config: "config.yaml"
root_config: "unshackle.yaml"
chapters: "Chapters_{title}_{random}.txt"
subtitle: "Subtitle_{id}_{language}.srt"
-# API key for The Movie Database (TMDB)
-tmdb_api_key: ""
-
# conversion_method:
-# - auto (default): Smart routing - subby for WebVTT/SAMI, standard for others
+# - auto (default): Smart routing - subby for WebVTT/SAMI, pycaption for others
# - subby: Always use subby with advanced processing
# - pycaption: Use only pycaption library (no SubtitleEdit, no subby)
# - subtitleedit: Prefer SubtitleEdit when available, fall back to pycaption
# - pysubs2: Use pysubs2 library (supports SRT/SSA/ASS/WebVTT/TTML/SAMI/MicroDVD/MPL2/TMP)
subtitle:
conversion_method: auto
+ # sdh_method: Method to use for SDH (hearing impaired) stripping
+ # - auto (default): Try subby (SRT only), then SubtitleEdit (if available), then subtitle-filter
+ # - subby: Use subby library (SRT only)
+ # - subtitleedit: Use SubtitleEdit tool (Windows only, falls back to subtitle-filter)
+ # - filter-subs: Use subtitle-filter library directly
sdh_method: auto
+ # strip_sdh: Automatically create stripped (non-SDH) versions of SDH subtitles
+ # Set to false to disable automatic SDH stripping entirely (default: true)
+ strip_sdh: true
+ # convert_before_strip: Auto-convert VTT/other formats to SRT before using subtitle-filter
+ # This ensures compatibility when subtitle-filter is used as fallback (default: true)
+ convert_before_strip: true
+ # preserve_formatting: Preserve original subtitle formatting (tags, positioning, styling)
+  # When true, skips pycaption processing for WebVTT files to keep inline tags (e.g. <i>, <b>) and positioning intact
+ # Combined with no sub_format setting, ensures subtitles remain in their original format (default: true)
+ preserve_formatting: true
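The `sdh_method` fallback order described in the comments above can be sketched as a small dispatcher. This is a hedged sketch of the documented selection order only; the function and tool names are stand-ins, not real unshackle APIs, and the actual code may differ.

```python
# Sketch of sdh_method tool selection (hypothetical helper, per the config comments):
# auto tries subby (SRT only), then SubtitleEdit if available, then subtitle-filter.
def pick_sdh_strippers(method: str, fmt: str, subtitleedit_available: bool) -> list[str]:
    """Return the ordered list of SDH-stripping tools to attempt."""
    if method == "subby":
        return ["subby"]  # SRT only
    if method == "subtitleedit":
        # Windows-only tool; falls back to subtitle-filter when unavailable
        return ["subtitleedit"] if subtitleedit_available else ["subtitle-filter"]
    if method == "filter-subs":
        return ["subtitle-filter"]
    # auto (default)
    chain = []
    if fmt == "srt":
        chain.append("subby")
    if subtitleedit_available:
        chain.append("subtitleedit")
    chain.append("subtitle-filter")
    return chain
```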
# Configuration for pywidevine's serve functionality
serve:
+ api_secret: "your-secret-key-here"
users:
secret_key_for_user:
devices:
@@ -273,9 +381,13 @@ services:
# Service-specific configuration goes here
# Profile-specific configurations can be nested under service names
- # Example: with profile-specific device configs
+ # You can override ANY global configuration option on a per-service basis
+ # This allows fine-tuned control for services with special requirements
+ # Supported overrides: dl, aria2c, n_m3u8dl_re, curl_impersonate, subtitle, muxing, headers, etc.
+
+ # Example: Comprehensive service configuration showing all features
EXAMPLE:
- # Global service config
+ # Standard service config
api_key: "service_api_key"
# Service certificate for Widevine L1/L2 (base64 encoded)
@@ -296,6 +408,42 @@ services:
app_name: "AIV"
device_model: "Fire TV Stick 4K"
+ # NEW: Configuration overrides (can be combined with profiles and certificates)
+ # Override dl command defaults for this service
+ dl:
+ downloads: 4 # Limit concurrent track downloads (global default: 6)
+ workers: 8 # Reduce workers per track (global default: 16)
+ lang: ["en", "es-419"] # Different language priority for this service
+ sub_format: srt # Force SRT subtitle format
+
+ # Override n_m3u8dl_re downloader settings
+ n_m3u8dl_re:
+ thread_count: 8 # Lower thread count for rate-limited service (global default: 16)
+ use_proxy: true # Force proxy usage for this service
+ retry_count: 10 # More retries for unstable connections
+ ad_keyword: "advertisement" # Service-specific ad filtering
+
+ # Override aria2c downloader settings
+ aria2c:
+ max_concurrent_downloads: 2 # Limit concurrent downloads (global default: 4)
+ max_connection_per_server: 1 # Single connection per server
+ split: 3 # Fewer splits (global default: 5)
+ file_allocation: none # Faster allocation for this service
+
+ # Override subtitle processing for this service
+ subtitle:
+ conversion_method: pycaption # Use specific subtitle converter
+ sdh_method: auto
+
+ # Service-specific headers
+ headers:
+ User-Agent: "Service-specific user agent string"
+ Accept-Language: "en-US,en;q=0.9"
+
+ # Override muxing options
+ muxing:
+ set_title: true
+
# Example: Service with different regions per profile
SERVICE_NAME:
profiles:
@@ -306,6 +454,25 @@ services:
region: "GB"
api_endpoint: "https://api.uk.service.com"
+ # Example: Rate-limited service
+ RATE_LIMITED_SERVICE:
+ dl:
+ downloads: 2 # Limit concurrent downloads
+ workers: 4 # Reduce workers to avoid rate limits
+ n_m3u8dl_re:
+ thread_count: 4 # Very low thread count
+ retry_count: 20 # More retries for flaky service
+ aria2c:
+ max_concurrent_downloads: 1 # Download tracks one at a time
+ max_connection_per_server: 1 # Single connection only
+
+ # Notes on service-specific overrides:
+ # - Overrides are merged with global config, not replaced
+ # - Only specified keys are overridden, others use global defaults
+ # - Reserved keys (profiles, api_key, certificate, etc.) are NOT treated as overrides
+ # - Any dict-type config option can be overridden (dl, aria2c, n_m3u8dl_re, subtitle, etc.)
+ # - CLI arguments always take priority over service-specific config
+
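The merge semantics in the notes above (overrides merged key-by-key, reserved keys skipped, unspecified keys keeping their global defaults) can be sketched like this. The function name and reserved-key list are illustrative assumptions, not unshackle's actual implementation.

```python
# Hypothetical sketch of service-override merging as described in the notes:
# service config is merged into the global config, not substituted wholesale.
RESERVED_KEYS = ("profiles", "api_key", "certificate")  # service metadata, not overrides


def merge_overrides(global_cfg: dict, service_cfg: dict) -> dict:
    """Merge a service's override dict into the global config."""
    merged = dict(global_cfg)
    for key, value in service_cfg.items():
        if key in RESERVED_KEYS:
            continue  # reserved keys are not treated as config overrides
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            # only the specified sub-keys are overridden; the rest keep defaults
            merged[key] = {**merged[key], **value}
        else:
            merged[key] = value
    return merged
```

CLI arguments would still be applied on top of the merged result, since they take priority over any service-specific config.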
# External proxy provider services
proxy_providers:
nordvpn:
@@ -320,6 +487,12 @@ proxy_providers:
us: 3844 # force US server #3844 for US proxies
gb: 2697 # force GB server #2697 for GB proxies
au: 4621 # force AU server #4621 for AU proxies
+ windscribevpn:
+ username: your_windscribe_username # Service credentials from https://windscribe.com/getconfig/openvpn
+ password: your_windscribe_password # Service credentials (not your login password)
+ server_map:
+ us: "us-central-096.totallyacdn.com" # force US server
+ gb: "uk-london-055.totallyacdn.com" # force GB server
basic:
GB:
- "socks5://username:password@bhx.socks.ipvanish.com:1080" # 1 (Birmingham)
diff --git a/unshackle/vaults/SQLite.py b/unshackle/vaults/SQLite.py
index ac89fec..f1922d7 100644
--- a/unshackle/vaults/SQLite.py
+++ b/unshackle/vaults/SQLite.py
@@ -37,7 +37,9 @@ class SQLite(Vault):
if not self.has_table(service_name):
continue
- cursor.execute(f"SELECT `id`, `key_` FROM `{service_name}` WHERE `kid`=? AND `key_`!=?", (kid, "0" * 32))
+ cursor.execute(
+ f"SELECT `id`, `key_` FROM `{service_name}` WHERE `kid`=? AND `key_`!=?", (kid, "0" * 32)
+ )
cek = cursor.fetchone()
if cek:
return cek[1]
diff --git a/uv.lock b/uv.lock
index 7cf248f..f2ad4bb 100644
--- a/uv.lock
+++ b/uv.lock
@@ -13,7 +13,7 @@ wheels = [
[[package]]
name = "aiohttp"
-version = "3.12.15"
+version = "3.13.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "aiohappyeyeballs" },
@@ -25,59 +25,75 @@ dependencies = [
{ name = "propcache" },
{ name = "yarl" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/9b/e7/d92a237d8802ca88483906c388f7c201bbe96cd80a165ffd0ac2f6a8d59f/aiohttp-3.12.15.tar.gz", hash = "sha256:4fc61385e9c98d72fcdf47e6dd81833f47b2f77c114c29cd64a361be57a763a2", size = 7823716, upload-time = "2025-07-29T05:52:32.215Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/1c/ce/3b83ebba6b3207a7135e5fcaba49706f8a4b6008153b4e30540c982fae26/aiohttp-3.13.2.tar.gz", hash = "sha256:40176a52c186aefef6eb3cad2cdd30cd06e3afbe88fe8ab2af9c0b90f228daca", size = 7837994, upload-time = "2025-10-28T20:59:39.937Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/47/dc/ef9394bde9080128ad401ac7ede185267ed637df03b51f05d14d1c99ad67/aiohttp-3.12.15-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:b6fc902bff74d9b1879ad55f5404153e2b33a82e72a95c89cec5eb6cc9e92fbc", size = 703921, upload-time = "2025-07-29T05:49:43.584Z" },
- { url = "https://files.pythonhosted.org/packages/8f/42/63fccfc3a7ed97eb6e1a71722396f409c46b60a0552d8a56d7aad74e0df5/aiohttp-3.12.15-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:098e92835b8119b54c693f2f88a1dec690e20798ca5f5fe5f0520245253ee0af", size = 480288, upload-time = "2025-07-29T05:49:47.851Z" },
- { url = "https://files.pythonhosted.org/packages/9c/a2/7b8a020549f66ea2a68129db6960a762d2393248f1994499f8ba9728bbed/aiohttp-3.12.15-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:40b3fee496a47c3b4a39a731954c06f0bd9bd3e8258c059a4beb76ac23f8e421", size = 468063, upload-time = "2025-07-29T05:49:49.789Z" },
- { url = "https://files.pythonhosted.org/packages/8f/f5/d11e088da9176e2ad8220338ae0000ed5429a15f3c9dfd983f39105399cd/aiohttp-3.12.15-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2ce13fcfb0bb2f259fb42106cdc63fa5515fb85b7e87177267d89a771a660b79", size = 1650122, upload-time = "2025-07-29T05:49:51.874Z" },
- { url = "https://files.pythonhosted.org/packages/b0/6b/b60ce2757e2faed3d70ed45dafee48cee7bfb878785a9423f7e883f0639c/aiohttp-3.12.15-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3beb14f053222b391bf9cf92ae82e0171067cc9c8f52453a0f1ec7c37df12a77", size = 1624176, upload-time = "2025-07-29T05:49:53.805Z" },
- { url = "https://files.pythonhosted.org/packages/dd/de/8c9fde2072a1b72c4fadecf4f7d4be7a85b1d9a4ab333d8245694057b4c6/aiohttp-3.12.15-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4c39e87afe48aa3e814cac5f535bc6199180a53e38d3f51c5e2530f5aa4ec58c", size = 1696583, upload-time = "2025-07-29T05:49:55.338Z" },
- { url = "https://files.pythonhosted.org/packages/0c/ad/07f863ca3d895a1ad958a54006c6dafb4f9310f8c2fdb5f961b8529029d3/aiohttp-3.12.15-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d5f1b4ce5bc528a6ee38dbf5f39bbf11dd127048726323b72b8e85769319ffc4", size = 1738896, upload-time = "2025-07-29T05:49:57.045Z" },
- { url = "https://files.pythonhosted.org/packages/20/43/2bd482ebe2b126533e8755a49b128ec4e58f1a3af56879a3abdb7b42c54f/aiohttp-3.12.15-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1004e67962efabbaf3f03b11b4c43b834081c9e3f9b32b16a7d97d4708a9abe6", size = 1643561, upload-time = "2025-07-29T05:49:58.762Z" },
- { url = "https://files.pythonhosted.org/packages/23/40/2fa9f514c4cf4cbae8d7911927f81a1901838baf5e09a8b2c299de1acfe5/aiohttp-3.12.15-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8faa08fcc2e411f7ab91d1541d9d597d3a90e9004180edb2072238c085eac8c2", size = 1583685, upload-time = "2025-07-29T05:50:00.375Z" },
- { url = "https://files.pythonhosted.org/packages/b8/c3/94dc7357bc421f4fb978ca72a201a6c604ee90148f1181790c129396ceeb/aiohttp-3.12.15-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:fe086edf38b2222328cdf89af0dde2439ee173b8ad7cb659b4e4c6f385b2be3d", size = 1627533, upload-time = "2025-07-29T05:50:02.306Z" },
- { url = "https://files.pythonhosted.org/packages/bf/3f/1f8911fe1844a07001e26593b5c255a685318943864b27b4e0267e840f95/aiohttp-3.12.15-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:79b26fe467219add81d5e47b4a4ba0f2394e8b7c7c3198ed36609f9ba161aecb", size = 1638319, upload-time = "2025-07-29T05:50:04.282Z" },
- { url = "https://files.pythonhosted.org/packages/4e/46/27bf57a99168c4e145ffee6b63d0458b9c66e58bb70687c23ad3d2f0bd17/aiohttp-3.12.15-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:b761bac1192ef24e16706d761aefcb581438b34b13a2f069a6d343ec8fb693a5", size = 1613776, upload-time = "2025-07-29T05:50:05.863Z" },
- { url = "https://files.pythonhosted.org/packages/0f/7e/1d2d9061a574584bb4ad3dbdba0da90a27fdc795bc227def3a46186a8bc1/aiohttp-3.12.15-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:e153e8adacfe2af562861b72f8bc47f8a5c08e010ac94eebbe33dc21d677cd5b", size = 1693359, upload-time = "2025-07-29T05:50:07.563Z" },
- { url = "https://files.pythonhosted.org/packages/08/98/bee429b52233c4a391980a5b3b196b060872a13eadd41c3a34be9b1469ed/aiohttp-3.12.15-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:fc49c4de44977aa8601a00edbf157e9a421f227aa7eb477d9e3df48343311065", size = 1716598, upload-time = "2025-07-29T05:50:09.33Z" },
- { url = "https://files.pythonhosted.org/packages/57/39/b0314c1ea774df3392751b686104a3938c63ece2b7ce0ba1ed7c0b4a934f/aiohttp-3.12.15-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:2776c7ec89c54a47029940177e75c8c07c29c66f73464784971d6a81904ce9d1", size = 1644940, upload-time = "2025-07-29T05:50:11.334Z" },
- { url = "https://files.pythonhosted.org/packages/1b/83/3dacb8d3f8f512c8ca43e3fa8a68b20583bd25636ffa4e56ee841ffd79ae/aiohttp-3.12.15-cp310-cp310-win32.whl", hash = "sha256:2c7d81a277fa78b2203ab626ced1487420e8c11a8e373707ab72d189fcdad20a", size = 429239, upload-time = "2025-07-29T05:50:12.803Z" },
- { url = "https://files.pythonhosted.org/packages/eb/f9/470b5daba04d558c9673ca2034f28d067f3202a40e17804425f0c331c89f/aiohttp-3.12.15-cp310-cp310-win_amd64.whl", hash = "sha256:83603f881e11f0f710f8e2327817c82e79431ec976448839f3cd05d7afe8f830", size = 452297, upload-time = "2025-07-29T05:50:14.266Z" },
- { url = "https://files.pythonhosted.org/packages/20/19/9e86722ec8e835959bd97ce8c1efa78cf361fa4531fca372551abcc9cdd6/aiohttp-3.12.15-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d3ce17ce0220383a0f9ea07175eeaa6aa13ae5a41f30bc61d84df17f0e9b1117", size = 711246, upload-time = "2025-07-29T05:50:15.937Z" },
- { url = "https://files.pythonhosted.org/packages/71/f9/0a31fcb1a7d4629ac9d8f01f1cb9242e2f9943f47f5d03215af91c3c1a26/aiohttp-3.12.15-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:010cc9bbd06db80fe234d9003f67e97a10fe003bfbedb40da7d71c1008eda0fe", size = 483515, upload-time = "2025-07-29T05:50:17.442Z" },
- { url = "https://files.pythonhosted.org/packages/62/6c/94846f576f1d11df0c2e41d3001000527c0fdf63fce7e69b3927a731325d/aiohttp-3.12.15-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3f9d7c55b41ed687b9d7165b17672340187f87a773c98236c987f08c858145a9", size = 471776, upload-time = "2025-07-29T05:50:19.568Z" },
- { url = "https://files.pythonhosted.org/packages/f8/6c/f766d0aaafcee0447fad0328da780d344489c042e25cd58fde566bf40aed/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bc4fbc61bb3548d3b482f9ac7ddd0f18c67e4225aaa4e8552b9f1ac7e6bda9e5", size = 1741977, upload-time = "2025-07-29T05:50:21.665Z" },
- { url = "https://files.pythonhosted.org/packages/17/e5/fb779a05ba6ff44d7bc1e9d24c644e876bfff5abe5454f7b854cace1b9cc/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7fbc8a7c410bb3ad5d595bb7118147dfbb6449d862cc1125cf8867cb337e8728", size = 1690645, upload-time = "2025-07-29T05:50:23.333Z" },
- { url = "https://files.pythonhosted.org/packages/37/4e/a22e799c2035f5d6a4ad2cf8e7c1d1bd0923192871dd6e367dafb158b14c/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:74dad41b3458dbb0511e760fb355bb0b6689e0630de8a22b1b62a98777136e16", size = 1789437, upload-time = "2025-07-29T05:50:25.007Z" },
- { url = "https://files.pythonhosted.org/packages/28/e5/55a33b991f6433569babb56018b2fb8fb9146424f8b3a0c8ecca80556762/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b6f0af863cf17e6222b1735a756d664159e58855da99cfe965134a3ff63b0b0", size = 1828482, upload-time = "2025-07-29T05:50:26.693Z" },
- { url = "https://files.pythonhosted.org/packages/c6/82/1ddf0ea4f2f3afe79dffed5e8a246737cff6cbe781887a6a170299e33204/aiohttp-3.12.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b5b7fe4972d48a4da367043b8e023fb70a04d1490aa7d68800e465d1b97e493b", size = 1730944, upload-time = "2025-07-29T05:50:28.382Z" },
- { url = "https://files.pythonhosted.org/packages/1b/96/784c785674117b4cb3877522a177ba1b5e4db9ce0fd519430b5de76eec90/aiohttp-3.12.15-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6443cca89553b7a5485331bc9bedb2342b08d073fa10b8c7d1c60579c4a7b9bd", size = 1668020, upload-time = "2025-07-29T05:50:30.032Z" },
- { url = "https://files.pythonhosted.org/packages/12/8a/8b75f203ea7e5c21c0920d84dd24a5c0e971fe1e9b9ebbf29ae7e8e39790/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:6c5f40ec615e5264f44b4282ee27628cea221fcad52f27405b80abb346d9f3f8", size = 1716292, upload-time = "2025-07-29T05:50:31.983Z" },
- { url = "https://files.pythonhosted.org/packages/47/0b/a1451543475bb6b86a5cfc27861e52b14085ae232896a2654ff1231c0992/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:2abbb216a1d3a2fe86dbd2edce20cdc5e9ad0be6378455b05ec7f77361b3ab50", size = 1711451, upload-time = "2025-07-29T05:50:33.989Z" },
- { url = "https://files.pythonhosted.org/packages/55/fd/793a23a197cc2f0d29188805cfc93aa613407f07e5f9da5cd1366afd9d7c/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:db71ce547012a5420a39c1b744d485cfb823564d01d5d20805977f5ea1345676", size = 1691634, upload-time = "2025-07-29T05:50:35.846Z" },
- { url = "https://files.pythonhosted.org/packages/ca/bf/23a335a6670b5f5dfc6d268328e55a22651b440fca341a64fccf1eada0c6/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:ced339d7c9b5030abad5854aa5413a77565e5b6e6248ff927d3e174baf3badf7", size = 1785238, upload-time = "2025-07-29T05:50:37.597Z" },
- { url = "https://files.pythonhosted.org/packages/57/4f/ed60a591839a9d85d40694aba5cef86dde9ee51ce6cca0bb30d6eb1581e7/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:7c7dd29c7b5bda137464dc9bfc738d7ceea46ff70309859ffde8c022e9b08ba7", size = 1805701, upload-time = "2025-07-29T05:50:39.591Z" },
- { url = "https://files.pythonhosted.org/packages/85/e0/444747a9455c5de188c0f4a0173ee701e2e325d4b2550e9af84abb20cdba/aiohttp-3.12.15-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:421da6fd326460517873274875c6c5a18ff225b40da2616083c5a34a7570b685", size = 1718758, upload-time = "2025-07-29T05:50:41.292Z" },
- { url = "https://files.pythonhosted.org/packages/36/ab/1006278d1ffd13a698e5dd4bfa01e5878f6bddefc296c8b62649753ff249/aiohttp-3.12.15-cp311-cp311-win32.whl", hash = "sha256:4420cf9d179ec8dfe4be10e7d0fe47d6d606485512ea2265b0d8c5113372771b", size = 428868, upload-time = "2025-07-29T05:50:43.063Z" },
- { url = "https://files.pythonhosted.org/packages/10/97/ad2b18700708452400278039272032170246a1bf8ec5d832772372c71f1a/aiohttp-3.12.15-cp311-cp311-win_amd64.whl", hash = "sha256:edd533a07da85baa4b423ee8839e3e91681c7bfa19b04260a469ee94b778bf6d", size = 453273, upload-time = "2025-07-29T05:50:44.613Z" },
- { url = "https://files.pythonhosted.org/packages/63/97/77cb2450d9b35f517d6cf506256bf4f5bda3f93a66b4ad64ba7fc917899c/aiohttp-3.12.15-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:802d3868f5776e28f7bf69d349c26fc0efadb81676d0afa88ed00d98a26340b7", size = 702333, upload-time = "2025-07-29T05:50:46.507Z" },
- { url = "https://files.pythonhosted.org/packages/83/6d/0544e6b08b748682c30b9f65640d006e51f90763b41d7c546693bc22900d/aiohttp-3.12.15-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f2800614cd560287be05e33a679638e586a2d7401f4ddf99e304d98878c29444", size = 476948, upload-time = "2025-07-29T05:50:48.067Z" },
- { url = "https://files.pythonhosted.org/packages/3a/1d/c8c40e611e5094330284b1aea8a4b02ca0858f8458614fa35754cab42b9c/aiohttp-3.12.15-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8466151554b593909d30a0a125d638b4e5f3836e5aecde85b66b80ded1cb5b0d", size = 469787, upload-time = "2025-07-29T05:50:49.669Z" },
- { url = "https://files.pythonhosted.org/packages/38/7d/b76438e70319796bfff717f325d97ce2e9310f752a267bfdf5192ac6082b/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e5a495cb1be69dae4b08f35a6c4579c539e9b5706f606632102c0f855bcba7c", size = 1716590, upload-time = "2025-07-29T05:50:51.368Z" },
- { url = "https://files.pythonhosted.org/packages/79/b1/60370d70cdf8b269ee1444b390cbd72ce514f0d1cd1a715821c784d272c9/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6404dfc8cdde35c69aaa489bb3542fb86ef215fc70277c892be8af540e5e21c0", size = 1699241, upload-time = "2025-07-29T05:50:53.628Z" },
- { url = "https://files.pythonhosted.org/packages/a3/2b/4968a7b8792437ebc12186db31523f541943e99bda8f30335c482bea6879/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3ead1c00f8521a5c9070fcb88f02967b1d8a0544e6d85c253f6968b785e1a2ab", size = 1754335, upload-time = "2025-07-29T05:50:55.394Z" },
- { url = "https://files.pythonhosted.org/packages/fb/c1/49524ed553f9a0bec1a11fac09e790f49ff669bcd14164f9fab608831c4d/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6990ef617f14450bc6b34941dba4f12d5613cbf4e33805932f853fbd1cf18bfb", size = 1800491, upload-time = "2025-07-29T05:50:57.202Z" },
- { url = "https://files.pythonhosted.org/packages/de/5e/3bf5acea47a96a28c121b167f5ef659cf71208b19e52a88cdfa5c37f1fcc/aiohttp-3.12.15-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd736ed420f4db2b8148b52b46b88ed038d0354255f9a73196b7bbce3ea97545", size = 1719929, upload-time = "2025-07-29T05:50:59.192Z" },
- { url = "https://files.pythonhosted.org/packages/39/94/8ae30b806835bcd1cba799ba35347dee6961a11bd507db634516210e91d8/aiohttp-3.12.15-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c5092ce14361a73086b90c6efb3948ffa5be2f5b6fbcf52e8d8c8b8848bb97c", size = 1635733, upload-time = "2025-07-29T05:51:01.394Z" },
- { url = "https://files.pythonhosted.org/packages/7a/46/06cdef71dd03acd9da7f51ab3a9107318aee12ad38d273f654e4f981583a/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:aaa2234bb60c4dbf82893e934d8ee8dea30446f0647e024074237a56a08c01bd", size = 1696790, upload-time = "2025-07-29T05:51:03.657Z" },
- { url = "https://files.pythonhosted.org/packages/02/90/6b4cfaaf92ed98d0ec4d173e78b99b4b1a7551250be8937d9d67ecb356b4/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:6d86a2fbdd14192e2f234a92d3b494dd4457e683ba07e5905a0b3ee25389ac9f", size = 1718245, upload-time = "2025-07-29T05:51:05.911Z" },
- { url = "https://files.pythonhosted.org/packages/2e/e6/2593751670fa06f080a846f37f112cbe6f873ba510d070136a6ed46117c6/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a041e7e2612041a6ddf1c6a33b883be6a421247c7afd47e885969ee4cc58bd8d", size = 1658899, upload-time = "2025-07-29T05:51:07.753Z" },
- { url = "https://files.pythonhosted.org/packages/8f/28/c15bacbdb8b8eb5bf39b10680d129ea7410b859e379b03190f02fa104ffd/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5015082477abeafad7203757ae44299a610e89ee82a1503e3d4184e6bafdd519", size = 1738459, upload-time = "2025-07-29T05:51:09.56Z" },
- { url = "https://files.pythonhosted.org/packages/00/de/c269cbc4faa01fb10f143b1670633a8ddd5b2e1ffd0548f7aa49cb5c70e2/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:56822ff5ddfd1b745534e658faba944012346184fbfe732e0d6134b744516eea", size = 1766434, upload-time = "2025-07-29T05:51:11.423Z" },
- { url = "https://files.pythonhosted.org/packages/52/b0/4ff3abd81aa7d929b27d2e1403722a65fc87b763e3a97b3a2a494bfc63bc/aiohttp-3.12.15-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b2acbbfff69019d9014508c4ba0401822e8bae5a5fdc3b6814285b71231b60f3", size = 1726045, upload-time = "2025-07-29T05:51:13.689Z" },
- { url = "https://files.pythonhosted.org/packages/71/16/949225a6a2dd6efcbd855fbd90cf476052e648fb011aa538e3b15b89a57a/aiohttp-3.12.15-cp312-cp312-win32.whl", hash = "sha256:d849b0901b50f2185874b9a232f38e26b9b3d4810095a7572eacea939132d4e1", size = 423591, upload-time = "2025-07-29T05:51:15.452Z" },
- { url = "https://files.pythonhosted.org/packages/2b/d8/fa65d2a349fe938b76d309db1a56a75c4fb8cc7b17a398b698488a939903/aiohttp-3.12.15-cp312-cp312-win_amd64.whl", hash = "sha256:b390ef5f62bb508a9d67cb3bba9b8356e23b3996da7062f1a57ce1a79d2b3d34", size = 450266, upload-time = "2025-07-29T05:51:17.239Z" },
+ { url = "https://files.pythonhosted.org/packages/6d/34/939730e66b716b76046dedfe0842995842fa906ccc4964bba414ff69e429/aiohttp-3.13.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:2372b15a5f62ed37789a6b383ff7344fc5b9f243999b0cd9b629d8bc5f5b4155", size = 736471, upload-time = "2025-10-28T20:55:27.924Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/cf/dcbdf2df7f6ca72b0bb4c0b4509701f2d8942cf54e29ca197389c214c07f/aiohttp-3.13.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e7f8659a48995edee7229522984bd1009c1213929c769c2daa80b40fe49a180c", size = 493985, upload-time = "2025-10-28T20:55:29.456Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/87/71c8867e0a1d0882dcbc94af767784c3cb381c1c4db0943ab4aae4fed65e/aiohttp-3.13.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:939ced4a7add92296b0ad38892ce62b98c619288a081170695c6babe4f50e636", size = 489274, upload-time = "2025-10-28T20:55:31.134Z" },
+ { url = "https://files.pythonhosted.org/packages/38/0f/46c24e8dae237295eaadd113edd56dee96ef6462adf19b88592d44891dc5/aiohttp-3.13.2-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6315fb6977f1d0dd41a107c527fee2ed5ab0550b7d885bc15fee20ccb17891da", size = 1668171, upload-time = "2025-10-28T20:55:36.065Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/c6/4cdfb4440d0e28483681a48f69841fa5e39366347d66ef808cbdadddb20e/aiohttp-3.13.2-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6e7352512f763f760baaed2637055c49134fd1d35b37c2dedfac35bfe5cf8725", size = 1636036, upload-time = "2025-10-28T20:55:37.576Z" },
+ { url = "https://files.pythonhosted.org/packages/84/37/8708cf678628216fb678ab327a4e1711c576d6673998f4f43e86e9ae90dd/aiohttp-3.13.2-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e09a0a06348a2dd73e7213353c90d709502d9786219f69b731f6caa0efeb46f5", size = 1727975, upload-time = "2025-10-28T20:55:39.457Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/2e/3ebfe12fdcb9b5f66e8a0a42dffcd7636844c8a018f261efb2419f68220b/aiohttp-3.13.2-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a09a6d073fb5789456545bdee2474d14395792faa0527887f2f4ec1a486a59d3", size = 1815823, upload-time = "2025-10-28T20:55:40.958Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/4f/ca2ef819488cbb41844c6cf92ca6dd15b9441e6207c58e5ae0e0fc8d70ad/aiohttp-3.13.2-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b59d13c443f8e049d9e94099c7e412e34610f1f49be0f230ec656a10692a5802", size = 1669374, upload-time = "2025-10-28T20:55:42.745Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/fe/1fe2e1179a0d91ce09c99069684aab619bf2ccde9b20bd6ca44f8837203e/aiohttp-3.13.2-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:20db2d67985d71ca033443a1ba2001c4b5693fe09b0e29f6d9358a99d4d62a8a", size = 1555315, upload-time = "2025-10-28T20:55:44.264Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/2b/f3781899b81c45d7cbc7140cddb8a3481c195e7cbff8e36374759d2ab5a5/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:960c2fc686ba27b535f9fd2b52d87ecd7e4fd1cf877f6a5cba8afb5b4a8bd204", size = 1639140, upload-time = "2025-10-28T20:55:46.626Z" },
+ { url = "https://files.pythonhosted.org/packages/72/27/c37e85cd3ece6f6c772e549bd5a253d0c122557b25855fb274224811e4f2/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:6c00dbcf5f0d88796151e264a8eab23de2997c9303dd7c0bf622e23b24d3ce22", size = 1645496, upload-time = "2025-10-28T20:55:48.933Z" },
+ { url = "https://files.pythonhosted.org/packages/66/20/3af1ab663151bd3780b123e907761cdb86ec2c4e44b2d9b195ebc91fbe37/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:fed38a5edb7945f4d1bcabe2fcd05db4f6ec7e0e82560088b754f7e08d93772d", size = 1697625, upload-time = "2025-10-28T20:55:50.377Z" },
+ { url = "https://files.pythonhosted.org/packages/95/eb/ae5cab15efa365e13d56b31b0d085a62600298bf398a7986f8388f73b598/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:b395bbca716c38bef3c764f187860e88c724b342c26275bc03e906142fc5964f", size = 1542025, upload-time = "2025-10-28T20:55:51.861Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/2d/1683e8d67ec72d911397fe4e575688d2a9b8f6a6e03c8fdc9f3fd3d4c03f/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:204ffff2426c25dfda401ba08da85f9c59525cdc42bda26660463dd1cbcfec6f", size = 1714918, upload-time = "2025-10-28T20:55:53.515Z" },
+ { url = "https://files.pythonhosted.org/packages/99/a2/ffe8e0e1c57c5e542d47ffa1fcf95ef2b3ea573bf7c4d2ee877252431efc/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:05c4dd3c48fb5f15db31f57eb35374cb0c09afdde532e7fb70a75aede0ed30f6", size = 1656113, upload-time = "2025-10-28T20:55:55.438Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/42/d511aff5c3a2b06c09d7d214f508a4ad8ac7799817f7c3d23e7336b5e896/aiohttp-3.13.2-cp310-cp310-win32.whl", hash = "sha256:e574a7d61cf10351d734bcddabbe15ede0eaa8a02070d85446875dc11189a251", size = 432290, upload-time = "2025-10-28T20:55:56.96Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/ea/1c2eb7098b5bad4532994f2b7a8228d27674035c9b3234fe02c37469ef14/aiohttp-3.13.2-cp310-cp310-win_amd64.whl", hash = "sha256:364f55663085d658b8462a1c3f17b2b84a5c2e1ba858e1b79bff7b2e24ad1514", size = 455075, upload-time = "2025-10-28T20:55:58.373Z" },
+ { url = "https://files.pythonhosted.org/packages/35/74/b321e7d7ca762638cdf8cdeceb39755d9c745aff7a64c8789be96ddf6e96/aiohttp-3.13.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4647d02df098f6434bafd7f32ad14942f05a9caa06c7016fdcc816f343997dd0", size = 743409, upload-time = "2025-10-28T20:56:00.354Z" },
+ { url = "https://files.pythonhosted.org/packages/99/3d/91524b905ec473beaf35158d17f82ef5a38033e5809fe8742e3657cdbb97/aiohttp-3.13.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:e3403f24bcb9c3b29113611c3c16a2a447c3953ecf86b79775e7be06f7ae7ccb", size = 497006, upload-time = "2025-10-28T20:56:01.85Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/d3/7f68bc02a67716fe80f063e19adbd80a642e30682ce74071269e17d2dba1/aiohttp-3.13.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:43dff14e35aba17e3d6d5ba628858fb8cb51e30f44724a2d2f0c75be492c55e9", size = 493195, upload-time = "2025-10-28T20:56:03.314Z" },
+ { url = "https://files.pythonhosted.org/packages/98/31/913f774a4708775433b7375c4f867d58ba58ead833af96c8af3621a0d243/aiohttp-3.13.2-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e2a9ea08e8c58bb17655630198833109227dea914cd20be660f52215f6de5613", size = 1747759, upload-time = "2025-10-28T20:56:04.904Z" },
+ { url = "https://files.pythonhosted.org/packages/e8/63/04efe156f4326f31c7c4a97144f82132c3bb21859b7bb84748d452ccc17c/aiohttp-3.13.2-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:53b07472f235eb80e826ad038c9d106c2f653584753f3ddab907c83f49eedead", size = 1704456, upload-time = "2025-10-28T20:56:06.986Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/02/4e16154d8e0a9cf4ae76f692941fd52543bbb148f02f098ca73cab9b1c1b/aiohttp-3.13.2-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e736c93e9c274fce6419af4aac199984d866e55f8a4cec9114671d0ea9688780", size = 1807572, upload-time = "2025-10-28T20:56:08.558Z" },
+ { url = "https://files.pythonhosted.org/packages/34/58/b0583defb38689e7f06798f0285b1ffb3a6fb371f38363ce5fd772112724/aiohttp-3.13.2-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ff5e771f5dcbc81c64898c597a434f7682f2259e0cd666932a913d53d1341d1a", size = 1895954, upload-time = "2025-10-28T20:56:10.545Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/f3/083907ee3437425b4e376aa58b2c915eb1a33703ec0dc30040f7ae3368c6/aiohttp-3.13.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a3b6fb0c207cc661fa0bf8c66d8d9b657331ccc814f4719468af61034b478592", size = 1747092, upload-time = "2025-10-28T20:56:12.118Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/61/98a47319b4e425cc134e05e5f3fc512bf9a04bf65aafd9fdcda5d57ec693/aiohttp-3.13.2-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:97a0895a8e840ab3520e2288db7cace3a1981300d48babeb50e7425609e2e0ab", size = 1606815, upload-time = "2025-10-28T20:56:14.191Z" },
+ { url = "https://files.pythonhosted.org/packages/97/4b/e78b854d82f66bb974189135d31fce265dee0f5344f64dd0d345158a5973/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:9e8f8afb552297aca127c90cb840e9a1d4bfd6a10d7d8f2d9176e1acc69bad30", size = 1723789, upload-time = "2025-10-28T20:56:16.101Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/fc/9d2ccc794fc9b9acd1379d625c3a8c64a45508b5091c546dea273a41929e/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:ed2f9c7216e53c3df02264f25d824b079cc5914f9e2deba94155190ef648ee40", size = 1718104, upload-time = "2025-10-28T20:56:17.655Z" },
+ { url = "https://files.pythonhosted.org/packages/66/65/34564b8765ea5c7d79d23c9113135d1dd3609173da13084830f1507d56cf/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:99c5280a329d5fa18ef30fd10c793a190d996567667908bef8a7f81f8202b948", size = 1785584, upload-time = "2025-10-28T20:56:19.238Z" },
+ { url = "https://files.pythonhosted.org/packages/30/be/f6a7a426e02fc82781afd62016417b3948e2207426d90a0e478790d1c8a4/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:2ca6ffef405fc9c09a746cb5d019c1672cd7f402542e379afc66b370833170cf", size = 1595126, upload-time = "2025-10-28T20:56:20.836Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/c7/8e22d5d28f94f67d2af496f14a83b3c155d915d1fe53d94b66d425ec5b42/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:47f438b1a28e926c37632bff3c44df7d27c9b57aaf4e34b1def3c07111fdb782", size = 1800665, upload-time = "2025-10-28T20:56:22.922Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/11/91133c8b68b1da9fc16555706aa7276fdf781ae2bb0876c838dd86b8116e/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9acda8604a57bb60544e4646a4615c1866ee6c04a8edef9b8ee6fd1d8fa2ddc8", size = 1739532, upload-time = "2025-10-28T20:56:25.924Z" },
+ { url = "https://files.pythonhosted.org/packages/17/6b/3747644d26a998774b21a616016620293ddefa4d63af6286f389aedac844/aiohttp-3.13.2-cp311-cp311-win32.whl", hash = "sha256:868e195e39b24aaa930b063c08bb0c17924899c16c672a28a65afded9c46c6ec", size = 431876, upload-time = "2025-10-28T20:56:27.524Z" },
+ { url = "https://files.pythonhosted.org/packages/c3/63/688462108c1a00eb9f05765331c107f95ae86f6b197b865d29e930b7e462/aiohttp-3.13.2-cp311-cp311-win_amd64.whl", hash = "sha256:7fd19df530c292542636c2a9a85854fab93474396a52f1695e799186bbd7f24c", size = 456205, upload-time = "2025-10-28T20:56:29.062Z" },
+ { url = "https://files.pythonhosted.org/packages/29/9b/01f00e9856d0a73260e86dd8ed0c2234a466c5c1712ce1c281548df39777/aiohttp-3.13.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:b1e56bab2e12b2b9ed300218c351ee2a3d8c8fdab5b1ec6193e11a817767e47b", size = 737623, upload-time = "2025-10-28T20:56:30.797Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/1b/4be39c445e2b2bd0aab4ba736deb649fabf14f6757f405f0c9685019b9e9/aiohttp-3.13.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:364e25edaabd3d37b1db1f0cbcee8c73c9a3727bfa262b83e5e4cf3489a2a9dc", size = 492664, upload-time = "2025-10-28T20:56:32.708Z" },
+ { url = "https://files.pythonhosted.org/packages/28/66/d35dcfea8050e131cdd731dff36434390479b4045a8d0b9d7111b0a968f1/aiohttp-3.13.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c5c94825f744694c4b8db20b71dba9a257cd2ba8e010a803042123f3a25d50d7", size = 491808, upload-time = "2025-10-28T20:56:34.57Z" },
+ { url = "https://files.pythonhosted.org/packages/00/29/8e4609b93e10a853b65f8291e64985de66d4f5848c5637cddc70e98f01f8/aiohttp-3.13.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ba2715d842ffa787be87cbfce150d5e88c87a98e0b62e0f5aa489169a393dbbb", size = 1738863, upload-time = "2025-10-28T20:56:36.377Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/fa/4ebdf4adcc0def75ced1a0d2d227577cd7b1b85beb7edad85fcc87693c75/aiohttp-3.13.2-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:585542825c4bc662221fb257889e011a5aa00f1ae4d75d1d246a5225289183e3", size = 1700586, upload-time = "2025-10-28T20:56:38.034Z" },
+ { url = "https://files.pythonhosted.org/packages/da/04/73f5f02ff348a3558763ff6abe99c223381b0bace05cd4530a0258e52597/aiohttp-3.13.2-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:39d02cb6025fe1aabca329c5632f48c9532a3dabccd859e7e2f110668972331f", size = 1768625, upload-time = "2025-10-28T20:56:39.75Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/49/a825b79ffec124317265ca7d2344a86bcffeb960743487cb11988ffb3494/aiohttp-3.13.2-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e67446b19e014d37342f7195f592a2a948141d15a312fe0e700c2fd2f03124f6", size = 1867281, upload-time = "2025-10-28T20:56:41.471Z" },
+ { url = "https://files.pythonhosted.org/packages/b9/48/adf56e05f81eac31edcfae45c90928f4ad50ef2e3ea72cb8376162a368f8/aiohttp-3.13.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4356474ad6333e41ccefd39eae869ba15a6c5299c9c01dfdcfdd5c107be4363e", size = 1752431, upload-time = "2025-10-28T20:56:43.162Z" },
+ { url = "https://files.pythonhosted.org/packages/30/ab/593855356eead019a74e862f21523db09c27f12fd24af72dbc3555b9bfd9/aiohttp-3.13.2-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:eeacf451c99b4525f700f078becff32c32ec327b10dcf31306a8a52d78166de7", size = 1562846, upload-time = "2025-10-28T20:56:44.85Z" },
+ { url = "https://files.pythonhosted.org/packages/39/0f/9f3d32271aa8dc35036e9668e31870a9d3b9542dd6b3e2c8a30931cb27ae/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d8a9b889aeabd7a4e9af0b7f4ab5ad94d42e7ff679aaec6d0db21e3b639ad58d", size = 1699606, upload-time = "2025-10-28T20:56:46.519Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/3c/52d2658c5699b6ef7692a3f7128b2d2d4d9775f2a68093f74bca06cf01e1/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:fa89cb11bc71a63b69568d5b8a25c3ca25b6d54c15f907ca1c130d72f320b76b", size = 1720663, upload-time = "2025-10-28T20:56:48.528Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/d4/8f8f3ff1fb7fb9e3f04fcad4e89d8a1cd8fc7d05de67e3de5b15b33008ff/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:8aa7c807df234f693fed0ecd507192fc97692e61fee5702cdc11155d2e5cadc8", size = 1737939, upload-time = "2025-10-28T20:56:50.77Z" },
+ { url = "https://files.pythonhosted.org/packages/03/d3/ddd348f8a27a634daae39a1b8e291ff19c77867af438af844bf8b7e3231b/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:9eb3e33fdbe43f88c3c75fa608c25e7c47bbd80f48d012763cb67c47f39a7e16", size = 1555132, upload-time = "2025-10-28T20:56:52.568Z" },
+ { url = "https://files.pythonhosted.org/packages/39/b8/46790692dc46218406f94374903ba47552f2f9f90dad554eed61bfb7b64c/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:9434bc0d80076138ea986833156c5a48c9c7a8abb0c96039ddbb4afc93184169", size = 1764802, upload-time = "2025-10-28T20:56:54.292Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/e4/19ce547b58ab2a385e5f0b8aa3db38674785085abcf79b6e0edd1632b12f/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ff15c147b2ad66da1f2cbb0622313f2242d8e6e8f9b79b5206c84523a4473248", size = 1719512, upload-time = "2025-10-28T20:56:56.428Z" },
+ { url = "https://files.pythonhosted.org/packages/70/30/6355a737fed29dcb6dfdd48682d5790cb5eab050f7b4e01f49b121d3acad/aiohttp-3.13.2-cp312-cp312-win32.whl", hash = "sha256:27e569eb9d9e95dbd55c0fc3ec3a9335defbf1d8bc1d20171a49f3c4c607b93e", size = 426690, upload-time = "2025-10-28T20:56:58.736Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/0d/b10ac09069973d112de6ef980c1f6bb31cb7dcd0bc363acbdad58f927873/aiohttp-3.13.2-cp312-cp312-win_amd64.whl", hash = "sha256:8709a0f05d59a71f33fd05c17fc11fcb8c30140506e13c2f5e8ee1b8964e1b45", size = 453465, upload-time = "2025-10-28T20:57:00.795Z" },
+]
+
+[[package]]
+name = "aiohttp-swagger3"
+version = "0.10.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "aiohttp" },
+ { name = "attrs" },
+ { name = "fastjsonschema" },
+ { name = "pyyaml" },
+ { name = "rfc3339-validator" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/a1/06/00ccb2c8afdde4ca7c3cac424d54715c7d90cdd4e13e1ca71d68f5b2e665/aiohttp_swagger3-0.10.0.tar.gz", hash = "sha256:a333c59328f64dd64587e5f276ee84dc256f587d09f2da6ddaae3812fa4d4f33", size = 1839028, upload-time = "2025-02-11T10:51:26.974Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/0a/8f/db4cb843999a3088846d170f38eda2182b50b5733387be8102fed171c53f/aiohttp_swagger3-0.10.0-py3-none-any.whl", hash = "sha256:0ae2d2ba7dbd8ea8fe1cffe8f0197db5d0aa979eb9679bd699ecd87923912509", size = 1826491, upload-time = "2025-02-11T10:51:25.174Z" },
]
[[package]]
@@ -137,73 +153,53 @@ wheels = [
[[package]]
name = "beautifulsoup4"
-version = "4.13.4"
+version = "4.14.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "soupsieve" },
{ name = "typing-extensions" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/d8/e4/0c4c39e18fd76d6a628d4dd8da40543d136ce2d1752bd6eeeab0791f4d6b/beautifulsoup4-4.13.4.tar.gz", hash = "sha256:dbb3c4e1ceae6aefebdaf2423247260cd062430a410e38c66f2baa50a8437195", size = 621067, upload-time = "2025-04-15T17:05:13.836Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/77/e9/df2358efd7659577435e2177bfa69cba6c33216681af51a707193dec162a/beautifulsoup4-4.14.2.tar.gz", hash = "sha256:2a98ab9f944a11acee9cc848508ec28d9228abfd522ef0fad6a02a72e0ded69e", size = 625822, upload-time = "2025-09-29T10:05:42.613Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/50/cd/30110dc0ffcf3b131156077b90e9f60ed75711223f306da4db08eff8403b/beautifulsoup4-4.13.4-py3-none-any.whl", hash = "sha256:9bbbb14bfde9d79f38b8cd5f8c7c85f4b8f2523190ebed90e950a8dea4cb1c4b", size = 187285, upload-time = "2025-04-15T17:05:12.221Z" },
+ { url = "https://files.pythonhosted.org/packages/94/fe/3aed5d0be4d404d12d36ab97e2f1791424d9ca39c2f754a6285d59a3b01d/beautifulsoup4-4.14.2-py3-none-any.whl", hash = "sha256:5ef6fa3a8cbece8488d66985560f97ed091e22bbc4e9c2338508a9d5de6d4515", size = 106392, upload-time = "2025-09-29T10:05:43.771Z" },
]
[[package]]
name = "brotli"
-version = "1.1.0"
+version = "1.2.0"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/2f/c2/f9e977608bdf958650638c3f1e28f85a1b075f075ebbe77db8555463787b/Brotli-1.1.0.tar.gz", hash = "sha256:81de08ac11bcb85841e440c13611c00b67d3bf82698314928d0b676362546724", size = 7372270, upload-time = "2023-09-07T14:05:41.643Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/f7/16/c92ca344d646e71a43b8bb353f0a6490d7f6e06210f8554c8f874e454285/brotli-1.2.0.tar.gz", hash = "sha256:e310f77e41941c13340a95976fe66a8a95b01e783d430eeaf7a2f87e0a57dd0a", size = 7388632, upload-time = "2025-11-05T18:39:42.86Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/6d/3a/dbf4fb970c1019a57b5e492e1e0eae745d32e59ba4d6161ab5422b08eefe/Brotli-1.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e1140c64812cb9b06c922e77f1c26a75ec5e3f0fb2bf92cc8c58720dec276752", size = 873045, upload-time = "2023-09-07T14:03:16.894Z" },
- { url = "https://files.pythonhosted.org/packages/dd/11/afc14026ea7f44bd6eb9316d800d439d092c8d508752055ce8d03086079a/Brotli-1.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c8fd5270e906eef71d4a8d19b7c6a43760c6abcfcc10c9101d14eb2357418de9", size = 446218, upload-time = "2023-09-07T14:03:18.917Z" },
- { url = "https://files.pythonhosted.org/packages/36/83/7545a6e7729db43cb36c4287ae388d6885c85a86dd251768a47015dfde32/Brotli-1.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1ae56aca0402a0f9a3431cddda62ad71666ca9d4dc3a10a142b9dce2e3c0cda3", size = 2903872, upload-time = "2023-09-07T14:03:20.398Z" },
- { url = "https://files.pythonhosted.org/packages/32/23/35331c4d9391fcc0f29fd9bec2c76e4b4eeab769afbc4b11dd2e1098fb13/Brotli-1.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:43ce1b9935bfa1ede40028054d7f48b5469cd02733a365eec8a329ffd342915d", size = 2941254, upload-time = "2023-09-07T14:03:21.914Z" },
- { url = "https://files.pythonhosted.org/packages/3b/24/1671acb450c902edb64bd765d73603797c6c7280a9ada85a195f6b78c6e5/Brotli-1.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:7c4855522edb2e6ae7fdb58e07c3ba9111e7621a8956f481c68d5d979c93032e", size = 2857293, upload-time = "2023-09-07T14:03:24Z" },
- { url = "https://files.pythonhosted.org/packages/d5/00/40f760cc27007912b327fe15bf6bfd8eaecbe451687f72a8abc587d503b3/Brotli-1.1.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:38025d9f30cf4634f8309c6874ef871b841eb3c347e90b0851f63d1ded5212da", size = 3002385, upload-time = "2023-09-07T14:03:26.248Z" },
- { url = "https://files.pythonhosted.org/packages/b8/cb/8aaa83f7a4caa131757668c0fb0c4b6384b09ffa77f2fba9570d87ab587d/Brotli-1.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e6a904cb26bfefc2f0a6f240bdf5233be78cd2488900a2f846f3c3ac8489ab80", size = 2911104, upload-time = "2023-09-07T14:03:27.849Z" },
- { url = "https://files.pythonhosted.org/packages/bc/c4/65456561d89d3c49f46b7fbeb8fe6e449f13bdc8ea7791832c5d476b2faf/Brotli-1.1.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:a37b8f0391212d29b3a91a799c8e4a2855e0576911cdfb2515487e30e322253d", size = 2809981, upload-time = "2023-09-07T14:03:29.92Z" },
- { url = "https://files.pythonhosted.org/packages/05/1b/cf49528437bae28abce5f6e059f0d0be6fecdcc1d3e33e7c54b3ca498425/Brotli-1.1.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:e84799f09591700a4154154cab9787452925578841a94321d5ee8fb9a9a328f0", size = 2935297, upload-time = "2023-09-07T14:03:32.035Z" },
- { url = "https://files.pythonhosted.org/packages/81/ff/190d4af610680bf0c5a09eb5d1eac6e99c7c8e216440f9c7cfd42b7adab5/Brotli-1.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f66b5337fa213f1da0d9000bc8dc0cb5b896b726eefd9c6046f699b169c41b9e", size = 2930735, upload-time = "2023-09-07T14:03:33.801Z" },
- { url = "https://files.pythonhosted.org/packages/80/7d/f1abbc0c98f6e09abd3cad63ec34af17abc4c44f308a7a539010f79aae7a/Brotli-1.1.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:5dab0844f2cf82be357a0eb11a9087f70c5430b2c241493fc122bb6f2bb0917c", size = 2933107, upload-time = "2024-10-18T12:32:09.016Z" },
- { url = "https://files.pythonhosted.org/packages/34/ce/5a5020ba48f2b5a4ad1c0522d095ad5847a0be508e7d7569c8630ce25062/Brotli-1.1.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e4fe605b917c70283db7dfe5ada75e04561479075761a0b3866c081d035b01c1", size = 2845400, upload-time = "2024-10-18T12:32:11.134Z" },
- { url = "https://files.pythonhosted.org/packages/44/89/fa2c4355ab1eecf3994e5a0a7f5492c6ff81dfcb5f9ba7859bd534bb5c1a/Brotli-1.1.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:1e9a65b5736232e7a7f91ff3d02277f11d339bf34099a56cdab6a8b3410a02b2", size = 3031985, upload-time = "2024-10-18T12:32:12.813Z" },
- { url = "https://files.pythonhosted.org/packages/af/a4/79196b4a1674143d19dca400866b1a4d1a089040df7b93b88ebae81f3447/Brotli-1.1.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:58d4b711689366d4a03ac7957ab8c28890415e267f9b6589969e74b6e42225ec", size = 2927099, upload-time = "2024-10-18T12:32:14.733Z" },
- { url = "https://files.pythonhosted.org/packages/e9/54/1c0278556a097f9651e657b873ab08f01b9a9ae4cac128ceb66427d7cd20/Brotli-1.1.0-cp310-cp310-win32.whl", hash = "sha256:be36e3d172dc816333f33520154d708a2657ea63762ec16b62ece02ab5e4daf2", size = 333172, upload-time = "2023-09-07T14:03:35.212Z" },
- { url = "https://files.pythonhosted.org/packages/f7/65/b785722e941193fd8b571afd9edbec2a9b838ddec4375d8af33a50b8dab9/Brotli-1.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:0c6244521dda65ea562d5a69b9a26120769b7a9fb3db2fe9545935ed6735b128", size = 357255, upload-time = "2023-09-07T14:03:36.447Z" },
- { url = "https://files.pythonhosted.org/packages/96/12/ad41e7fadd5db55459c4c401842b47f7fee51068f86dd2894dd0dcfc2d2a/Brotli-1.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:a3daabb76a78f829cafc365531c972016e4aa8d5b4bf60660ad8ecee19df7ccc", size = 873068, upload-time = "2023-09-07T14:03:37.779Z" },
- { url = "https://files.pythonhosted.org/packages/95/4e/5afab7b2b4b61a84e9c75b17814198ce515343a44e2ed4488fac314cd0a9/Brotli-1.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c8146669223164fc87a7e3de9f81e9423c67a79d6b3447994dfb9c95da16e2d6", size = 446244, upload-time = "2023-09-07T14:03:39.223Z" },
- { url = "https://files.pythonhosted.org/packages/9d/e6/f305eb61fb9a8580c525478a4a34c5ae1a9bcb12c3aee619114940bc513d/Brotli-1.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:30924eb4c57903d5a7526b08ef4a584acc22ab1ffa085faceb521521d2de32dd", size = 2906500, upload-time = "2023-09-07T14:03:40.858Z" },
- { url = "https://files.pythonhosted.org/packages/3e/4f/af6846cfbc1550a3024e5d3775ede1e00474c40882c7bf5b37a43ca35e91/Brotli-1.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ceb64bbc6eac5a140ca649003756940f8d6a7c444a68af170b3187623b43bebf", size = 2943950, upload-time = "2023-09-07T14:03:42.896Z" },
- { url = "https://files.pythonhosted.org/packages/b3/e7/ca2993c7682d8629b62630ebf0d1f3bb3d579e667ce8e7ca03a0a0576a2d/Brotli-1.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a469274ad18dc0e4d316eefa616d1d0c2ff9da369af19fa6f3daa4f09671fd61", size = 2918527, upload-time = "2023-09-07T14:03:44.552Z" },
- { url = "https://files.pythonhosted.org/packages/b3/96/da98e7bedc4c51104d29cc61e5f449a502dd3dbc211944546a4cc65500d3/Brotli-1.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:524f35912131cc2cabb00edfd8d573b07f2d9f21fa824bd3fb19725a9cf06327", size = 2845489, upload-time = "2023-09-07T14:03:46.594Z" },
- { url = "https://files.pythonhosted.org/packages/e8/ef/ccbc16947d6ce943a7f57e1a40596c75859eeb6d279c6994eddd69615265/Brotli-1.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:5b3cc074004d968722f51e550b41a27be656ec48f8afaeeb45ebf65b561481dd", size = 2914080, upload-time = "2023-09-07T14:03:48.204Z" },
- { url = "https://files.pythonhosted.org/packages/80/d6/0bd38d758d1afa62a5524172f0b18626bb2392d717ff94806f741fcd5ee9/Brotli-1.1.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:19c116e796420b0cee3da1ccec3b764ed2952ccfcc298b55a10e5610ad7885f9", size = 2813051, upload-time = "2023-09-07T14:03:50.348Z" },
- { url = "https://files.pythonhosted.org/packages/14/56/48859dd5d129d7519e001f06dcfbb6e2cf6db92b2702c0c2ce7d97e086c1/Brotli-1.1.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:510b5b1bfbe20e1a7b3baf5fed9e9451873559a976c1a78eebaa3b86c57b4265", size = 2938172, upload-time = "2023-09-07T14:03:52.395Z" },
- { url = "https://files.pythonhosted.org/packages/3d/77/a236d5f8cd9e9f4348da5acc75ab032ab1ab2c03cc8f430d24eea2672888/Brotli-1.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:a1fd8a29719ccce974d523580987b7f8229aeace506952fa9ce1d53a033873c8", size = 2933023, upload-time = "2023-09-07T14:03:53.96Z" },
- { url = "https://files.pythonhosted.org/packages/f1/87/3b283efc0f5cb35f7f84c0c240b1e1a1003a5e47141a4881bf87c86d0ce2/Brotli-1.1.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c247dd99d39e0338a604f8c2b3bc7061d5c2e9e2ac7ba9cc1be5a69cb6cd832f", size = 2935871, upload-time = "2024-10-18T12:32:16.688Z" },
- { url = "https://files.pythonhosted.org/packages/f3/eb/2be4cc3e2141dc1a43ad4ca1875a72088229de38c68e842746b342667b2a/Brotli-1.1.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:1b2c248cd517c222d89e74669a4adfa5577e06ab68771a529060cf5a156e9757", size = 2847784, upload-time = "2024-10-18T12:32:18.459Z" },
- { url = "https://files.pythonhosted.org/packages/66/13/b58ddebfd35edde572ccefe6890cf7c493f0c319aad2a5badee134b4d8ec/Brotli-1.1.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:2a24c50840d89ded6c9a8fdc7b6ed3692ed4e86f1c4a4a938e1e92def92933e0", size = 3034905, upload-time = "2024-10-18T12:32:20.192Z" },
- { url = "https://files.pythonhosted.org/packages/84/9c/bc96b6c7db824998a49ed3b38e441a2cae9234da6fa11f6ed17e8cf4f147/Brotli-1.1.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f31859074d57b4639318523d6ffdca586ace54271a73ad23ad021acd807eb14b", size = 2929467, upload-time = "2024-10-18T12:32:21.774Z" },
- { url = "https://files.pythonhosted.org/packages/e7/71/8f161dee223c7ff7fea9d44893fba953ce97cf2c3c33f78ba260a91bcff5/Brotli-1.1.0-cp311-cp311-win32.whl", hash = "sha256:39da8adedf6942d76dc3e46653e52df937a3c4d6d18fdc94a7c29d263b1f5b50", size = 333169, upload-time = "2023-09-07T14:03:55.404Z" },
- { url = "https://files.pythonhosted.org/packages/02/8a/fece0ee1057643cb2a5bbf59682de13f1725f8482b2c057d4e799d7ade75/Brotli-1.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:aac0411d20e345dc0920bdec5548e438e999ff68d77564d5e9463a7ca9d3e7b1", size = 357253, upload-time = "2023-09-07T14:03:56.643Z" },
- { url = "https://files.pythonhosted.org/packages/5c/d0/5373ae13b93fe00095a58efcbce837fd470ca39f703a235d2a999baadfbc/Brotli-1.1.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:32d95b80260d79926f5fab3c41701dbb818fde1c9da590e77e571eefd14abe28", size = 815693, upload-time = "2024-10-18T12:32:23.824Z" },
- { url = "https://files.pythonhosted.org/packages/8e/48/f6e1cdf86751300c288c1459724bfa6917a80e30dbfc326f92cea5d3683a/Brotli-1.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b760c65308ff1e462f65d69c12e4ae085cff3b332d894637f6273a12a482d09f", size = 422489, upload-time = "2024-10-18T12:32:25.641Z" },
- { url = "https://files.pythonhosted.org/packages/06/88/564958cedce636d0f1bed313381dfc4b4e3d3f6015a63dae6146e1b8c65c/Brotli-1.1.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:316cc9b17edf613ac76b1f1f305d2a748f1b976b033b049a6ecdfd5612c70409", size = 873081, upload-time = "2023-09-07T14:03:57.967Z" },
- { url = "https://files.pythonhosted.org/packages/58/79/b7026a8bb65da9a6bb7d14329fd2bd48d2b7f86d7329d5cc8ddc6a90526f/Brotli-1.1.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:caf9ee9a5775f3111642d33b86237b05808dafcd6268faa492250e9b78046eb2", size = 446244, upload-time = "2023-09-07T14:03:59.319Z" },
- { url = "https://files.pythonhosted.org/packages/e5/18/c18c32ecea41b6c0004e15606e274006366fe19436b6adccc1ae7b2e50c2/Brotli-1.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:70051525001750221daa10907c77830bc889cb6d865cc0b813d9db7fefc21451", size = 2906505, upload-time = "2023-09-07T14:04:01.327Z" },
- { url = "https://files.pythonhosted.org/packages/08/c8/69ec0496b1ada7569b62d85893d928e865df29b90736558d6c98c2031208/Brotli-1.1.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7f4bf76817c14aa98cc6697ac02f3972cb8c3da93e9ef16b9c66573a68014f91", size = 2944152, upload-time = "2023-09-07T14:04:03.033Z" },
- { url = "https://files.pythonhosted.org/packages/ab/fb/0517cea182219d6768113a38167ef6d4eb157a033178cc938033a552ed6d/Brotli-1.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d0c5516f0aed654134a2fc936325cc2e642f8a0e096d075209672eb321cff408", size = 2919252, upload-time = "2023-09-07T14:04:04.675Z" },
- { url = "https://files.pythonhosted.org/packages/c7/53/73a3431662e33ae61a5c80b1b9d2d18f58dfa910ae8dd696e57d39f1a2f5/Brotli-1.1.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6c3020404e0b5eefd7c9485ccf8393cfb75ec38ce75586e046573c9dc29967a0", size = 2845955, upload-time = "2023-09-07T14:04:06.585Z" },
- { url = "https://files.pythonhosted.org/packages/55/ac/bd280708d9c5ebdbf9de01459e625a3e3803cce0784f47d633562cf40e83/Brotli-1.1.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:4ed11165dd45ce798d99a136808a794a748d5dc38511303239d4e2363c0695dc", size = 2914304, upload-time = "2023-09-07T14:04:08.668Z" },
- { url = "https://files.pythonhosted.org/packages/76/58/5c391b41ecfc4527d2cc3350719b02e87cb424ef8ba2023fb662f9bf743c/Brotli-1.1.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:4093c631e96fdd49e0377a9c167bfd75b6d0bad2ace734c6eb20b348bc3ea180", size = 2814452, upload-time = "2023-09-07T14:04:10.736Z" },
- { url = "https://files.pythonhosted.org/packages/c7/4e/91b8256dfe99c407f174924b65a01f5305e303f486cc7a2e8a5d43c8bec3/Brotli-1.1.0-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:7e4c4629ddad63006efa0ef968c8e4751c5868ff0b1c5c40f76524e894c50248", size = 2938751, upload-time = "2023-09-07T14:04:12.875Z" },
- { url = "https://files.pythonhosted.org/packages/5a/a6/e2a39a5d3b412938362bbbeba5af904092bf3f95b867b4a3eb856104074e/Brotli-1.1.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:861bf317735688269936f755fa136a99d1ed526883859f86e41a5d43c61d8966", size = 2933757, upload-time = "2023-09-07T14:04:14.551Z" },
- { url = "https://files.pythonhosted.org/packages/13/f0/358354786280a509482e0e77c1a5459e439766597d280f28cb097642fc26/Brotli-1.1.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:87a3044c3a35055527ac75e419dfa9f4f3667a1e887ee80360589eb8c90aabb9", size = 2936146, upload-time = "2024-10-18T12:32:27.257Z" },
- { url = "https://files.pythonhosted.org/packages/80/f7/daf538c1060d3a88266b80ecc1d1c98b79553b3f117a485653f17070ea2a/Brotli-1.1.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:c5529b34c1c9d937168297f2c1fde7ebe9ebdd5e121297ff9c043bdb2ae3d6fb", size = 2848055, upload-time = "2024-10-18T12:32:29.376Z" },
- { url = "https://files.pythonhosted.org/packages/ad/cf/0eaa0585c4077d3c2d1edf322d8e97aabf317941d3a72d7b3ad8bce004b0/Brotli-1.1.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:ca63e1890ede90b2e4454f9a65135a4d387a4585ff8282bb72964fab893f2111", size = 3035102, upload-time = "2024-10-18T12:32:31.371Z" },
- { url = "https://files.pythonhosted.org/packages/d8/63/1c1585b2aa554fe6dbce30f0c18bdbc877fa9a1bf5ff17677d9cca0ac122/Brotli-1.1.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e79e6520141d792237c70bcd7a3b122d00f2613769ae0cb61c52e89fd3443839", size = 2930029, upload-time = "2024-10-18T12:32:33.293Z" },
- { url = "https://files.pythonhosted.org/packages/5f/3b/4e3fd1893eb3bbfef8e5a80d4508bec17a57bb92d586c85c12d28666bb13/Brotli-1.1.0-cp312-cp312-win32.whl", hash = "sha256:5f4d5ea15c9382135076d2fb28dde923352fe02951e66935a9efaac8f10e81b0", size = 333276, upload-time = "2023-09-07T14:04:16.49Z" },
- { url = "https://files.pythonhosted.org/packages/3d/d5/942051b45a9e883b5b6e98c041698b1eb2012d25e5948c58d6bf85b1bb43/Brotli-1.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:906bc3a79de8c4ae5b86d3d75a8b77e44404b0f4261714306e3ad248d8ab0951", size = 357255, upload-time = "2023-09-07T14:04:17.83Z" },
+ { url = "https://files.pythonhosted.org/packages/64/10/a090475284fc4a71aed40a96f32e44a7fe5bda39687353dd977720b211b6/brotli-1.2.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3b90b767916ac44e93a8e28ce6adf8d551e43affb512f2377c732d486ac6514e", size = 863089, upload-time = "2025-11-05T18:38:01.181Z" },
+ { url = "https://files.pythonhosted.org/packages/03/41/17416630e46c07ac21e378c3464815dd2e120b441e641bc516ac32cc51d2/brotli-1.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6be67c19e0b0c56365c6a76e393b932fb0e78b3b56b711d180dd7013cb1fd984", size = 445442, upload-time = "2025-11-05T18:38:02.434Z" },
+ { url = "https://files.pythonhosted.org/packages/24/31/90cc06584deb5d4fcafc0985e37741fc6b9717926a78674bbb3ce018957e/brotli-1.2.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0bbd5b5ccd157ae7913750476d48099aaf507a79841c0d04a9db4415b14842de", size = 1532658, upload-time = "2025-11-05T18:38:03.588Z" },
+ { url = "https://files.pythonhosted.org/packages/62/17/33bf0c83bcbc96756dfd712201d87342732fad70bb3472c27e833a44a4f9/brotli-1.2.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3f3c908bcc404c90c77d5a073e55271a0a498f4e0756e48127c35d91cf155947", size = 1631241, upload-time = "2025-11-05T18:38:04.582Z" },
+ { url = "https://files.pythonhosted.org/packages/48/10/f47854a1917b62efe29bc98ac18e5d4f71df03f629184575b862ef2e743b/brotli-1.2.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:1b557b29782a643420e08d75aea889462a4a8796e9a6cf5621ab05a3f7da8ef2", size = 1424307, upload-time = "2025-11-05T18:38:05.587Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/b7/f88eb461719259c17483484ea8456925ee057897f8e64487d76e24e5e38d/brotli-1.2.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:81da1b229b1889f25adadc929aeb9dbc4e922bd18561b65b08dd9343cfccca84", size = 1488208, upload-time = "2025-11-05T18:38:06.613Z" },
+ { url = "https://files.pythonhosted.org/packages/26/59/41bbcb983a0c48b0b8004203e74706c6b6e99a04f3c7ca6f4f41f364db50/brotli-1.2.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:ff09cd8c5eec3b9d02d2408db41be150d8891c5566addce57513bf546e3d6c6d", size = 1597574, upload-time = "2025-11-05T18:38:07.838Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/e6/8c89c3bdabbe802febb4c5c6ca224a395e97913b5df0dff11b54f23c1788/brotli-1.2.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:a1778532b978d2536e79c05dac2d8cd857f6c55cd0c95ace5b03740824e0e2f1", size = 1492109, upload-time = "2025-11-05T18:38:08.816Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/9a/4b19d4310b2dbd545c0c33f176b0528fa68c3cd0754e34b2f2bcf56548ae/brotli-1.2.0-cp310-cp310-win32.whl", hash = "sha256:b232029d100d393ae3c603c8ffd7e3fe6f798c5e28ddca5feabb8e8fdb732997", size = 334461, upload-time = "2025-11-05T18:38:10.729Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/39/70981d9f47705e3c2b95c0847dfa3e7a37aa3b7c6030aedc4873081ed005/brotli-1.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:ef87b8ab2704da227e83a246356a2b179ef826f550f794b2c52cddb4efbd0196", size = 369035, upload-time = "2025-11-05T18:38:11.827Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/ef/f285668811a9e1ddb47a18cb0b437d5fc2760d537a2fe8a57875ad6f8448/brotli-1.2.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:15b33fe93cedc4caaff8a0bd1eb7e3dab1c61bb22a0bf5bdfdfd97cd7da79744", size = 863110, upload-time = "2025-11-05T18:38:12.978Z" },
+ { url = "https://files.pythonhosted.org/packages/50/62/a3b77593587010c789a9d6eaa527c79e0848b7b860402cc64bc0bc28a86c/brotli-1.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:898be2be399c221d2671d29eed26b6b2713a02c2119168ed914e7d00ceadb56f", size = 445438, upload-time = "2025-11-05T18:38:14.208Z" },
+ { url = "https://files.pythonhosted.org/packages/cd/e1/7fadd47f40ce5549dc44493877db40292277db373da5053aff181656e16e/brotli-1.2.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:350c8348f0e76fff0a0fd6c26755d2653863279d086d3aa2c290a6a7251135dd", size = 1534420, upload-time = "2025-11-05T18:38:15.111Z" },
+ { url = "https://files.pythonhosted.org/packages/12/8b/1ed2f64054a5a008a4ccd2f271dbba7a5fb1a3067a99f5ceadedd4c1d5a7/brotli-1.2.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2e1ad3fda65ae0d93fec742a128d72e145c9c7a99ee2fcd667785d99eb25a7fe", size = 1632619, upload-time = "2025-11-05T18:38:16.094Z" },
+ { url = "https://files.pythonhosted.org/packages/89/5a/7071a621eb2d052d64efd5da2ef55ecdac7c3b0c6e4f9d519e9c66d987ef/brotli-1.2.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:40d918bce2b427a0c4ba189df7a006ac0c7277c180aee4617d99e9ccaaf59e6a", size = 1426014, upload-time = "2025-11-05T18:38:17.177Z" },
+ { url = "https://files.pythonhosted.org/packages/26/6d/0971a8ea435af5156acaaccec1a505f981c9c80227633851f2810abd252a/brotli-1.2.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:2a7f1d03727130fc875448b65b127a9ec5d06d19d0148e7554384229706f9d1b", size = 1489661, upload-time = "2025-11-05T18:38:18.41Z" },
+ { url = "https://files.pythonhosted.org/packages/f3/75/c1baca8b4ec6c96a03ef8230fab2a785e35297632f402ebb1e78a1e39116/brotli-1.2.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:9c79f57faa25d97900bfb119480806d783fba83cd09ee0b33c17623935b05fa3", size = 1599150, upload-time = "2025-11-05T18:38:19.792Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/1a/23fcfee1c324fd48a63d7ebf4bac3a4115bdb1b00e600f80f727d850b1ae/brotli-1.2.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:844a8ceb8483fefafc412f85c14f2aae2fb69567bf2a0de53cdb88b73e7c43ae", size = 1493505, upload-time = "2025-11-05T18:38:20.913Z" },
+ { url = "https://files.pythonhosted.org/packages/36/e5/12904bbd36afeef53d45a84881a4810ae8810ad7e328a971ebbfd760a0b3/brotli-1.2.0-cp311-cp311-win32.whl", hash = "sha256:aa47441fa3026543513139cb8926a92a8e305ee9c71a6209ef7a97d91640ea03", size = 334451, upload-time = "2025-11-05T18:38:21.94Z" },
+ { url = "https://files.pythonhosted.org/packages/02/8b/ecb5761b989629a4758c394b9301607a5880de61ee2ee5fe104b87149ebc/brotli-1.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:022426c9e99fd65d9475dce5c195526f04bb8be8907607e27e747893f6ee3e24", size = 369035, upload-time = "2025-11-05T18:38:22.941Z" },
+ { url = "https://files.pythonhosted.org/packages/11/ee/b0a11ab2315c69bb9b45a2aaed022499c9c24a205c3a49c3513b541a7967/brotli-1.2.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:35d382625778834a7f3061b15423919aa03e4f5da34ac8e02c074e4b75ab4f84", size = 861543, upload-time = "2025-11-05T18:38:24.183Z" },
+ { url = "https://files.pythonhosted.org/packages/e1/2f/29c1459513cd35828e25531ebfcbf3e92a5e49f560b1777a9af7203eb46e/brotli-1.2.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7a61c06b334bd99bc5ae84f1eeb36bfe01400264b3c352f968c6e30a10f9d08b", size = 444288, upload-time = "2025-11-05T18:38:25.139Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/6f/feba03130d5fceadfa3a1bb102cb14650798c848b1df2a808356f939bb16/brotli-1.2.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:acec55bb7c90f1dfc476126f9711a8e81c9af7fb617409a9ee2953115343f08d", size = 1528071, upload-time = "2025-11-05T18:38:26.081Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/38/f3abb554eee089bd15471057ba85f47e53a44a462cfce265d9bf7088eb09/brotli-1.2.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:260d3692396e1895c5034f204f0db022c056f9e2ac841593a4cf9426e2a3faca", size = 1626913, upload-time = "2025-11-05T18:38:27.284Z" },
+ { url = "https://files.pythonhosted.org/packages/03/a7/03aa61fbc3c5cbf99b44d158665f9b0dd3d8059be16c460208d9e385c837/brotli-1.2.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:072e7624b1fc4d601036ab3f4f27942ef772887e876beff0301d261210bca97f", size = 1419762, upload-time = "2025-11-05T18:38:28.295Z" },
+ { url = "https://files.pythonhosted.org/packages/21/1b/0374a89ee27d152a5069c356c96b93afd1b94eae83f1e004b57eb6ce2f10/brotli-1.2.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:adedc4a67e15327dfdd04884873c6d5a01d3e3b6f61406f99b1ed4865a2f6d28", size = 1484494, upload-time = "2025-11-05T18:38:29.29Z" },
+ { url = "https://files.pythonhosted.org/packages/cf/57/69d4fe84a67aef4f524dcd075c6eee868d7850e85bf01d778a857d8dbe0a/brotli-1.2.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7a47ce5c2288702e09dc22a44d0ee6152f2c7eda97b3c8482d826a1f3cfc7da7", size = 1593302, upload-time = "2025-11-05T18:38:30.639Z" },
+ { url = "https://files.pythonhosted.org/packages/d5/3b/39e13ce78a8e9a621c5df3aeb5fd181fcc8caba8c48a194cd629771f6828/brotli-1.2.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:af43b8711a8264bb4e7d6d9a6d004c3a2019c04c01127a868709ec29962b6036", size = 1487913, upload-time = "2025-11-05T18:38:31.618Z" },
+ { url = "https://files.pythonhosted.org/packages/62/28/4d00cb9bd76a6357a66fcd54b4b6d70288385584063f4b07884c1e7286ac/brotli-1.2.0-cp312-cp312-win32.whl", hash = "sha256:e99befa0b48f3cd293dafeacdd0d191804d105d279e0b387a32054c1180f3161", size = 334362, upload-time = "2025-11-05T18:38:32.939Z" },
+ { url = "https://files.pythonhosted.org/packages/1c/4e/bc1dcac9498859d5e353c9b153627a3752868a9d5f05ce8dedd81a2354ab/brotli-1.2.0-cp312-cp312-win_amd64.whl", hash = "sha256:b35c13ce241abdd44cb8ca70683f20c0c079728a36a996297adb5334adfc1c44", size = 369115, upload-time = "2025-11-05T18:38:33.765Z" },
]
[[package]]
@@ -323,14 +319,14 @@ wheels = [
[[package]]
name = "click"
-version = "8.2.1"
+version = "8.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/46/61/de6cd827efad202d7057d93e0fed9294b96952e188f7384832791c7b2254/click-8.3.0.tar.gz", hash = "sha256:e7b8232224eba16f4ebe410c25ced9f7875cb5f3263ffc93cc3e8da705e229c4", size = 276943, upload-time = "2025-09-18T17:32:23.696Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" },
+ { url = "https://files.pythonhosted.org/packages/db/d3/9dcc0f5797f070ec8edf30fbadfb200e71d9db6b84d211e3b2085a7589a0/click-8.3.0-py3-none-any.whl", hash = "sha256:9b9f285302c6e3064f4330c05f05b81945b2a39544279343e6e7c5f27a9baddc", size = 107295, upload-time = "2025-09-18T17:32:22.42Z" },
]
[[package]]
@@ -359,49 +355,49 @@ wheels = [
[[package]]
name = "cryptography"
-version = "45.0.6"
+version = "45.0.7"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cffi", marker = "platform_python_implementation != 'PyPy'" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/d6/0d/d13399c94234ee8f3df384819dc67e0c5ce215fb751d567a55a1f4b028c7/cryptography-45.0.6.tar.gz", hash = "sha256:5c966c732cf6e4a276ce83b6e4c729edda2df6929083a952cc7da973c539c719", size = 744949, upload-time = "2025-08-05T23:59:27.93Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/a7/35/c495bffc2056f2dadb32434f1feedd79abde2a7f8363e1974afa9c33c7e2/cryptography-45.0.7.tar.gz", hash = "sha256:4b1654dfc64ea479c242508eb8c724044f1e964a47d1d1cacc5132292d851971", size = 744980, upload-time = "2025-09-01T11:15:03.146Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/8c/29/2793d178d0eda1ca4a09a7c4e09a5185e75738cc6d526433e8663b460ea6/cryptography-45.0.6-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:048e7ad9e08cf4c0ab07ff7f36cc3115924e22e2266e034450a890d9e312dd74", size = 7042702, upload-time = "2025-08-05T23:58:23.464Z" },
- { url = "https://files.pythonhosted.org/packages/b3/b6/cabd07410f222f32c8d55486c464f432808abaa1f12af9afcbe8f2f19030/cryptography-45.0.6-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:44647c5d796f5fc042bbc6d61307d04bf29bccb74d188f18051b635f20a9c75f", size = 4206483, upload-time = "2025-08-05T23:58:27.132Z" },
- { url = "https://files.pythonhosted.org/packages/8b/9e/f9c7d36a38b1cfeb1cc74849aabe9bf817990f7603ff6eb485e0d70e0b27/cryptography-45.0.6-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e40b80ecf35ec265c452eea0ba94c9587ca763e739b8e559c128d23bff7ebbbf", size = 4429679, upload-time = "2025-08-05T23:58:29.152Z" },
- { url = "https://files.pythonhosted.org/packages/9c/2a/4434c17eb32ef30b254b9e8b9830cee4e516f08b47fdd291c5b1255b8101/cryptography-45.0.6-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:00e8724bdad672d75e6f069b27970883179bd472cd24a63f6e620ca7e41cc0c5", size = 4210553, upload-time = "2025-08-05T23:58:30.596Z" },
- { url = "https://files.pythonhosted.org/packages/ef/1d/09a5df8e0c4b7970f5d1f3aff1b640df6d4be28a64cae970d56c6cf1c772/cryptography-45.0.6-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7a3085d1b319d35296176af31c90338eeb2ddac8104661df79f80e1d9787b8b2", size = 3894499, upload-time = "2025-08-05T23:58:32.03Z" },
- { url = "https://files.pythonhosted.org/packages/79/62/120842ab20d9150a9d3a6bdc07fe2870384e82f5266d41c53b08a3a96b34/cryptography-45.0.6-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1b7fa6a1c1188c7ee32e47590d16a5a0646270921f8020efc9a511648e1b2e08", size = 4458484, upload-time = "2025-08-05T23:58:33.526Z" },
- { url = "https://files.pythonhosted.org/packages/fd/80/1bc3634d45ddfed0871bfba52cf8f1ad724761662a0c792b97a951fb1b30/cryptography-45.0.6-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:275ba5cc0d9e320cd70f8e7b96d9e59903c815ca579ab96c1e37278d231fc402", size = 4210281, upload-time = "2025-08-05T23:58:35.445Z" },
- { url = "https://files.pythonhosted.org/packages/7d/fe/ffb12c2d83d0ee625f124880a1f023b5878f79da92e64c37962bbbe35f3f/cryptography-45.0.6-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:f4028f29a9f38a2025abedb2e409973709c660d44319c61762202206ed577c42", size = 4456890, upload-time = "2025-08-05T23:58:36.923Z" },
- { url = "https://files.pythonhosted.org/packages/8c/8e/b3f3fe0dc82c77a0deb5f493b23311e09193f2268b77196ec0f7a36e3f3e/cryptography-45.0.6-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ee411a1b977f40bd075392c80c10b58025ee5c6b47a822a33c1198598a7a5f05", size = 4333247, upload-time = "2025-08-05T23:58:38.781Z" },
- { url = "https://files.pythonhosted.org/packages/b3/a6/c3ef2ab9e334da27a1d7b56af4a2417d77e7806b2e0f90d6267ce120d2e4/cryptography-45.0.6-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:e2a21a8eda2d86bb604934b6b37691585bd095c1f788530c1fcefc53a82b3453", size = 4565045, upload-time = "2025-08-05T23:58:40.415Z" },
- { url = "https://files.pythonhosted.org/packages/31/c3/77722446b13fa71dddd820a5faab4ce6db49e7e0bf8312ef4192a3f78e2f/cryptography-45.0.6-cp311-abi3-win32.whl", hash = "sha256:d063341378d7ee9c91f9d23b431a3502fc8bfacd54ef0a27baa72a0843b29159", size = 2928923, upload-time = "2025-08-05T23:58:41.919Z" },
- { url = "https://files.pythonhosted.org/packages/38/63/a025c3225188a811b82932a4dcc8457a26c3729d81578ccecbcce2cb784e/cryptography-45.0.6-cp311-abi3-win_amd64.whl", hash = "sha256:833dc32dfc1e39b7376a87b9a6a4288a10aae234631268486558920029b086ec", size = 3403805, upload-time = "2025-08-05T23:58:43.792Z" },
- { url = "https://files.pythonhosted.org/packages/5b/af/bcfbea93a30809f126d51c074ee0fac5bd9d57d068edf56c2a73abedbea4/cryptography-45.0.6-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:3436128a60a5e5490603ab2adbabc8763613f638513ffa7d311c900a8349a2a0", size = 7020111, upload-time = "2025-08-05T23:58:45.316Z" },
- { url = "https://files.pythonhosted.org/packages/98/c6/ea5173689e014f1a8470899cd5beeb358e22bb3cf5a876060f9d1ca78af4/cryptography-45.0.6-cp37-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0d9ef57b6768d9fa58e92f4947cea96ade1233c0e236db22ba44748ffedca394", size = 4198169, upload-time = "2025-08-05T23:58:47.121Z" },
- { url = "https://files.pythonhosted.org/packages/ba/73/b12995edc0c7e2311ffb57ebd3b351f6b268fed37d93bfc6f9856e01c473/cryptography-45.0.6-cp37-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:ea3c42f2016a5bbf71825537c2ad753f2870191134933196bee408aac397b3d9", size = 4421273, upload-time = "2025-08-05T23:58:48.557Z" },
- { url = "https://files.pythonhosted.org/packages/f7/6e/286894f6f71926bc0da67408c853dd9ba953f662dcb70993a59fd499f111/cryptography-45.0.6-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:20ae4906a13716139d6d762ceb3e0e7e110f7955f3bc3876e3a07f5daadec5f3", size = 4199211, upload-time = "2025-08-05T23:58:50.139Z" },
- { url = "https://files.pythonhosted.org/packages/de/34/a7f55e39b9623c5cb571d77a6a90387fe557908ffc44f6872f26ca8ae270/cryptography-45.0.6-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2dac5ec199038b8e131365e2324c03d20e97fe214af051d20c49db129844e8b3", size = 3883732, upload-time = "2025-08-05T23:58:52.253Z" },
- { url = "https://files.pythonhosted.org/packages/f9/b9/c6d32edbcba0cd9f5df90f29ed46a65c4631c4fbe11187feb9169c6ff506/cryptography-45.0.6-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:18f878a34b90d688982e43f4b700408b478102dd58b3e39de21b5ebf6509c301", size = 4450655, upload-time = "2025-08-05T23:58:53.848Z" },
- { url = "https://files.pythonhosted.org/packages/77/2d/09b097adfdee0227cfd4c699b3375a842080f065bab9014248933497c3f9/cryptography-45.0.6-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:5bd6020c80c5b2b2242d6c48487d7b85700f5e0038e67b29d706f98440d66eb5", size = 4198956, upload-time = "2025-08-05T23:58:55.209Z" },
- { url = "https://files.pythonhosted.org/packages/55/66/061ec6689207d54effdff535bbdf85cc380d32dd5377173085812565cf38/cryptography-45.0.6-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:eccddbd986e43014263eda489abbddfbc287af5cddfd690477993dbb31e31016", size = 4449859, upload-time = "2025-08-05T23:58:56.639Z" },
- { url = "https://files.pythonhosted.org/packages/41/ff/e7d5a2ad2d035e5a2af116e1a3adb4d8fcd0be92a18032917a089c6e5028/cryptography-45.0.6-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:550ae02148206beb722cfe4ef0933f9352bab26b087af00e48fdfb9ade35c5b3", size = 4320254, upload-time = "2025-08-05T23:58:58.833Z" },
- { url = "https://files.pythonhosted.org/packages/82/27/092d311af22095d288f4db89fcaebadfb2f28944f3d790a4cf51fe5ddaeb/cryptography-45.0.6-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:5b64e668fc3528e77efa51ca70fadcd6610e8ab231e3e06ae2bab3b31c2b8ed9", size = 4554815, upload-time = "2025-08-05T23:59:00.283Z" },
- { url = "https://files.pythonhosted.org/packages/7e/01/aa2f4940262d588a8fdf4edabe4cda45854d00ebc6eaac12568b3a491a16/cryptography-45.0.6-cp37-abi3-win32.whl", hash = "sha256:780c40fb751c7d2b0c6786ceee6b6f871e86e8718a8ff4bc35073ac353c7cd02", size = 2912147, upload-time = "2025-08-05T23:59:01.716Z" },
- { url = "https://files.pythonhosted.org/packages/0a/bc/16e0276078c2de3ceef6b5a34b965f4436215efac45313df90d55f0ba2d2/cryptography-45.0.6-cp37-abi3-win_amd64.whl", hash = "sha256:20d15aed3ee522faac1a39fbfdfee25d17b1284bafd808e1640a74846d7c4d1b", size = 3390459, upload-time = "2025-08-05T23:59:03.358Z" },
- { url = "https://files.pythonhosted.org/packages/56/d2/4482d97c948c029be08cb29854a91bd2ae8da7eb9c4152461f1244dcea70/cryptography-45.0.6-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:705bb7c7ecc3d79a50f236adda12ca331c8e7ecfbea51edd931ce5a7a7c4f012", size = 3576812, upload-time = "2025-08-05T23:59:04.833Z" },
- { url = "https://files.pythonhosted.org/packages/ec/24/55fc238fcaa122855442604b8badb2d442367dfbd5a7ca4bb0bd346e263a/cryptography-45.0.6-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:826b46dae41a1155a0c0e66fafba43d0ede1dc16570b95e40c4d83bfcf0a451d", size = 4141694, upload-time = "2025-08-05T23:59:06.66Z" },
- { url = "https://files.pythonhosted.org/packages/f9/7e/3ea4fa6fbe51baf3903806a0241c666b04c73d2358a3ecce09ebee8b9622/cryptography-45.0.6-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:cc4d66f5dc4dc37b89cfef1bd5044387f7a1f6f0abb490815628501909332d5d", size = 4375010, upload-time = "2025-08-05T23:59:08.14Z" },
- { url = "https://files.pythonhosted.org/packages/50/42/ec5a892d82d2a2c29f80fc19ced4ba669bca29f032faf6989609cff1f8dc/cryptography-45.0.6-pp310-pypy310_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:f68f833a9d445cc49f01097d95c83a850795921b3f7cc6488731e69bde3288da", size = 4141377, upload-time = "2025-08-05T23:59:09.584Z" },
- { url = "https://files.pythonhosted.org/packages/e7/d7/246c4c973a22b9c2931999da953a2c19cae7c66b9154c2d62ffed811225e/cryptography-45.0.6-pp310-pypy310_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:3b5bf5267e98661b9b888a9250d05b063220dfa917a8203744454573c7eb79db", size = 4374609, upload-time = "2025-08-05T23:59:11.923Z" },
- { url = "https://files.pythonhosted.org/packages/78/6d/c49ccf243f0a1b0781c2a8de8123ee552f0c8a417c6367a24d2ecb7c11b3/cryptography-45.0.6-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:2384f2ab18d9be88a6e4f8972923405e2dbb8d3e16c6b43f15ca491d7831bd18", size = 3322156, upload-time = "2025-08-05T23:59:13.597Z" },
- { url = "https://files.pythonhosted.org/packages/61/69/c252de4ec047ba2f567ecb53149410219577d408c2aea9c989acae7eafce/cryptography-45.0.6-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:fc022c1fa5acff6def2fc6d7819bbbd31ccddfe67d075331a65d9cfb28a20983", size = 3584669, upload-time = "2025-08-05T23:59:15.431Z" },
- { url = "https://files.pythonhosted.org/packages/e3/fe/deea71e9f310a31fe0a6bfee670955152128d309ea2d1c79e2a5ae0f0401/cryptography-45.0.6-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:3de77e4df42ac8d4e4d6cdb342d989803ad37707cf8f3fbf7b088c9cbdd46427", size = 4153022, upload-time = "2025-08-05T23:59:16.954Z" },
- { url = "https://files.pythonhosted.org/packages/60/45/a77452f5e49cb580feedba6606d66ae7b82c128947aa754533b3d1bd44b0/cryptography-45.0.6-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:599c8d7df950aa68baa7e98f7b73f4f414c9f02d0e8104a30c0182a07732638b", size = 4386802, upload-time = "2025-08-05T23:59:18.55Z" },
- { url = "https://files.pythonhosted.org/packages/a3/b9/a2f747d2acd5e3075fdf5c145c7c3568895daaa38b3b0c960ef830db6cdc/cryptography-45.0.6-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:31a2b9a10530a1cb04ffd6aa1cd4d3be9ed49f7d77a4dafe198f3b382f41545c", size = 4152706, upload-time = "2025-08-05T23:59:20.044Z" },
- { url = "https://files.pythonhosted.org/packages/81/ec/381b3e8d0685a3f3f304a382aa3dfce36af2d76467da0fd4bb21ddccc7b2/cryptography-45.0.6-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:e5b3dda1b00fb41da3af4c5ef3f922a200e33ee5ba0f0bc9ecf0b0c173958385", size = 4386740, upload-time = "2025-08-05T23:59:21.525Z" },
- { url = "https://files.pythonhosted.org/packages/0a/76/cf8d69da8d0b5ecb0db406f24a63a3f69ba5e791a11b782aeeefef27ccbb/cryptography-45.0.6-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:629127cfdcdc6806dfe234734d7cb8ac54edaf572148274fa377a7d3405b0043", size = 3331874, upload-time = "2025-08-05T23:59:23.017Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/91/925c0ac74362172ae4516000fe877912e33b5983df735ff290c653de4913/cryptography-45.0.7-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:3be4f21c6245930688bd9e162829480de027f8bf962ede33d4f8ba7d67a00cee", size = 7041105, upload-time = "2025-09-01T11:13:59.684Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/63/43641c5acce3a6105cf8bd5baeceeb1846bb63067d26dae3e5db59f1513a/cryptography-45.0.7-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:67285f8a611b0ebc0857ced2081e30302909f571a46bfa7a3cc0ad303fe015c6", size = 4205799, upload-time = "2025-09-01T11:14:02.517Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/29/c238dd9107f10bfde09a4d1c52fd38828b1aa353ced11f358b5dd2507d24/cryptography-45.0.7-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:577470e39e60a6cd7780793202e63536026d9b8641de011ed9d8174da9ca5339", size = 4430504, upload-time = "2025-09-01T11:14:04.522Z" },
+ { url = "https://files.pythonhosted.org/packages/62/62/24203e7cbcc9bd7c94739428cd30680b18ae6b18377ae66075c8e4771b1b/cryptography-45.0.7-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:4bd3e5c4b9682bc112d634f2c6ccc6736ed3635fc3319ac2bb11d768cc5a00d8", size = 4209542, upload-time = "2025-09-01T11:14:06.309Z" },
+ { url = "https://files.pythonhosted.org/packages/cd/e3/e7de4771a08620eef2389b86cd87a2c50326827dea5528feb70595439ce4/cryptography-45.0.7-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:465ccac9d70115cd4de7186e60cfe989de73f7bb23e8a7aa45af18f7412e75bf", size = 3889244, upload-time = "2025-09-01T11:14:08.152Z" },
+ { url = "https://files.pythonhosted.org/packages/96/b8/bca71059e79a0bb2f8e4ec61d9c205fbe97876318566cde3b5092529faa9/cryptography-45.0.7-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:16ede8a4f7929b4b7ff3642eba2bf79aa1d71f24ab6ee443935c0d269b6bc513", size = 4461975, upload-time = "2025-09-01T11:14:09.755Z" },
+ { url = "https://files.pythonhosted.org/packages/58/67/3f5b26937fe1218c40e95ef4ff8d23c8dc05aa950d54200cc7ea5fb58d28/cryptography-45.0.7-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:8978132287a9d3ad6b54fcd1e08548033cc09dc6aacacb6c004c73c3eb5d3ac3", size = 4209082, upload-time = "2025-09-01T11:14:11.229Z" },
+ { url = "https://files.pythonhosted.org/packages/0e/e4/b3e68a4ac363406a56cf7b741eeb80d05284d8c60ee1a55cdc7587e2a553/cryptography-45.0.7-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:b6a0e535baec27b528cb07a119f321ac024592388c5681a5ced167ae98e9fff3", size = 4460397, upload-time = "2025-09-01T11:14:12.924Z" },
+ { url = "https://files.pythonhosted.org/packages/22/49/2c93f3cd4e3efc8cb22b02678c1fad691cff9dd71bb889e030d100acbfe0/cryptography-45.0.7-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:a24ee598d10befaec178efdff6054bc4d7e883f615bfbcd08126a0f4931c83a6", size = 4337244, upload-time = "2025-09-01T11:14:14.431Z" },
+ { url = "https://files.pythonhosted.org/packages/04/19/030f400de0bccccc09aa262706d90f2ec23d56bc4eb4f4e8268d0ddf3fb8/cryptography-45.0.7-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:fa26fa54c0a9384c27fcdc905a2fb7d60ac6e47d14bc2692145f2b3b1e2cfdbd", size = 4568862, upload-time = "2025-09-01T11:14:16.185Z" },
+ { url = "https://files.pythonhosted.org/packages/29/56/3034a3a353efa65116fa20eb3c990a8c9f0d3db4085429040a7eef9ada5f/cryptography-45.0.7-cp311-abi3-win32.whl", hash = "sha256:bef32a5e327bd8e5af915d3416ffefdbe65ed975b646b3805be81b23580b57b8", size = 2936578, upload-time = "2025-09-01T11:14:17.638Z" },
+ { url = "https://files.pythonhosted.org/packages/b3/61/0ab90f421c6194705a99d0fa9f6ee2045d916e4455fdbb095a9c2c9a520f/cryptography-45.0.7-cp311-abi3-win_amd64.whl", hash = "sha256:3808e6b2e5f0b46d981c24d79648e5c25c35e59902ea4391a0dcb3e667bf7443", size = 3405400, upload-time = "2025-09-01T11:14:18.958Z" },
+ { url = "https://files.pythonhosted.org/packages/63/e8/c436233ddf19c5f15b25ace33979a9dd2e7aa1a59209a0ee8554179f1cc0/cryptography-45.0.7-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:bfb4c801f65dd61cedfc61a83732327fafbac55a47282e6f26f073ca7a41c3b2", size = 7021824, upload-time = "2025-09-01T11:14:20.954Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/4c/8f57f2500d0ccd2675c5d0cc462095adf3faa8c52294ba085c036befb901/cryptography-45.0.7-cp37-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:81823935e2f8d476707e85a78a405953a03ef7b7b4f55f93f7c2d9680e5e0691", size = 4202233, upload-time = "2025-09-01T11:14:22.454Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/ac/59b7790b4ccaed739fc44775ce4645c9b8ce54cbec53edf16c74fd80cb2b/cryptography-45.0.7-cp37-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3994c809c17fc570c2af12c9b840d7cea85a9fd3e5c0e0491f4fa3c029216d59", size = 4423075, upload-time = "2025-09-01T11:14:24.287Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/56/d4f07ea21434bf891faa088a6ac15d6d98093a66e75e30ad08e88aa2b9ba/cryptography-45.0.7-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:dad43797959a74103cb59c5dac71409f9c27d34c8a05921341fb64ea8ccb1dd4", size = 4204517, upload-time = "2025-09-01T11:14:25.679Z" },
+ { url = "https://files.pythonhosted.org/packages/e8/ac/924a723299848b4c741c1059752c7cfe09473b6fd77d2920398fc26bfb53/cryptography-45.0.7-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:ce7a453385e4c4693985b4a4a3533e041558851eae061a58a5405363b098fcd3", size = 3882893, upload-time = "2025-09-01T11:14:27.1Z" },
+ { url = "https://files.pythonhosted.org/packages/83/dc/4dab2ff0a871cc2d81d3ae6d780991c0192b259c35e4d83fe1de18b20c70/cryptography-45.0.7-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:b04f85ac3a90c227b6e5890acb0edbaf3140938dbecf07bff618bf3638578cf1", size = 4450132, upload-time = "2025-09-01T11:14:28.58Z" },
+ { url = "https://files.pythonhosted.org/packages/12/dd/b2882b65db8fc944585d7fb00d67cf84a9cef4e77d9ba8f69082e911d0de/cryptography-45.0.7-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:48c41a44ef8b8c2e80ca4527ee81daa4c527df3ecbc9423c41a420a9559d0e27", size = 4204086, upload-time = "2025-09-01T11:14:30.572Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/fa/1d5745d878048699b8eb87c984d4ccc5da4f5008dfd3ad7a94040caca23a/cryptography-45.0.7-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:f3df7b3d0f91b88b2106031fd995802a2e9ae13e02c36c1fc075b43f420f3a17", size = 4449383, upload-time = "2025-09-01T11:14:32.046Z" },
+ { url = "https://files.pythonhosted.org/packages/36/8b/fc61f87931bc030598e1876c45b936867bb72777eac693e905ab89832670/cryptography-45.0.7-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:dd342f085542f6eb894ca00ef70236ea46070c8a13824c6bde0dfdcd36065b9b", size = 4332186, upload-time = "2025-09-01T11:14:33.95Z" },
+ { url = "https://files.pythonhosted.org/packages/0b/11/09700ddad7443ccb11d674efdbe9a832b4455dc1f16566d9bd3834922ce5/cryptography-45.0.7-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:1993a1bb7e4eccfb922b6cd414f072e08ff5816702a0bdb8941c247a6b1b287c", size = 4561639, upload-time = "2025-09-01T11:14:35.343Z" },
+ { url = "https://files.pythonhosted.org/packages/71/ed/8f4c1337e9d3b94d8e50ae0b08ad0304a5709d483bfcadfcc77a23dbcb52/cryptography-45.0.7-cp37-abi3-win32.whl", hash = "sha256:18fcf70f243fe07252dcb1b268a687f2358025ce32f9f88028ca5c364b123ef5", size = 2926552, upload-time = "2025-09-01T11:14:36.929Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/ff/026513ecad58dacd45d1d24ebe52b852165a26e287177de1d545325c0c25/cryptography-45.0.7-cp37-abi3-win_amd64.whl", hash = "sha256:7285a89df4900ed3bfaad5679b1e668cb4b38a8de1ccbfc84b05f34512da0a90", size = 3392742, upload-time = "2025-09-01T11:14:38.368Z" },
+ { url = "https://files.pythonhosted.org/packages/13/3e/e42f1528ca1ea82256b835191eab1be014e0f9f934b60d98b0be8a38ed70/cryptography-45.0.7-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:de58755d723e86175756f463f2f0bddd45cc36fbd62601228a3f8761c9f58252", size = 3572442, upload-time = "2025-09-01T11:14:39.836Z" },
+ { url = "https://files.pythonhosted.org/packages/59/aa/e947693ab08674a2663ed2534cd8d345cf17bf6a1facf99273e8ec8986dc/cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a20e442e917889d1a6b3c570c9e3fa2fdc398c20868abcea268ea33c024c4083", size = 4142233, upload-time = "2025-09-01T11:14:41.305Z" },
+ { url = "https://files.pythonhosted.org/packages/24/06/09b6f6a2fc43474a32b8fe259038eef1500ee3d3c141599b57ac6c57612c/cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:258e0dff86d1d891169b5af222d362468a9570e2532923088658aa866eb11130", size = 4376202, upload-time = "2025-09-01T11:14:43.047Z" },
+ { url = "https://files.pythonhosted.org/packages/00/f2/c166af87e95ce6ae6d38471a7e039d3a0549c2d55d74e059680162052824/cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:d97cf502abe2ab9eff8bd5e4aca274da8d06dd3ef08b759a8d6143f4ad65d4b4", size = 4141900, upload-time = "2025-09-01T11:14:45.089Z" },
+ { url = "https://files.pythonhosted.org/packages/16/b9/e96e0b6cb86eae27ea51fa8a3151535a18e66fe7c451fa90f7f89c85f541/cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:c987dad82e8c65ebc985f5dae5e74a3beda9d0a2a4daf8a1115f3772b59e5141", size = 4375562, upload-time = "2025-09-01T11:14:47.166Z" },
+ { url = "https://files.pythonhosted.org/packages/36/d0/36e8ee39274e9d77baf7d0dafda680cba6e52f3936b846f0d56d64fec915/cryptography-45.0.7-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:c13b1e3afd29a5b3b2656257f14669ca8fa8d7956d509926f0b130b600b50ab7", size = 3322781, upload-time = "2025-09-01T11:14:48.747Z" },
+ { url = "https://files.pythonhosted.org/packages/99/4e/49199a4c82946938a3e05d2e8ad9482484ba48bbc1e809e3d506c686d051/cryptography-45.0.7-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4a862753b36620af6fc54209264f92c716367f2f0ff4624952276a6bbd18cbde", size = 3584634, upload-time = "2025-09-01T11:14:50.593Z" },
+ { url = "https://files.pythonhosted.org/packages/16/ce/5f6ff59ea9c7779dba51b84871c19962529bdcc12e1a6ea172664916c550/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:06ce84dc14df0bf6ea84666f958e6080cdb6fe1231be2a51f3fc1267d9f3fb34", size = 4149533, upload-time = "2025-09-01T11:14:52.091Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/13/b3cfbd257ac96da4b88b46372e662009b7a16833bfc5da33bb97dd5631ae/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d0c5c6bac22b177bf8da7435d9d27a6834ee130309749d162b26c3105c0795a9", size = 4385557, upload-time = "2025-09-01T11:14:53.551Z" },
+ { url = "https://files.pythonhosted.org/packages/1c/c5/8c59d6b7c7b439ba4fc8d0cab868027fd095f215031bc123c3a070962912/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:2f641b64acc00811da98df63df7d59fd4706c0df449da71cb7ac39a0732b40ae", size = 4149023, upload-time = "2025-09-01T11:14:55.022Z" },
+ { url = "https://files.pythonhosted.org/packages/55/32/05385c86d6ca9ab0b4d5bb442d2e3d85e727939a11f3e163fc776ce5eb40/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:f5414a788ecc6ee6bc58560e85ca624258a55ca434884445440a810796ea0e0b", size = 4385722, upload-time = "2025-09-01T11:14:57.319Z" },
+ { url = "https://files.pythonhosted.org/packages/23/87/7ce86f3fa14bc11a5a48c30d8103c26e09b6465f8d8e9d74cf7a0714f043/cryptography-45.0.7-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:1f3d56f73595376f4244646dd5c5870c14c196949807be39e79e7bd9bac3da63", size = 3332908, upload-time = "2025-09-01T11:14:58.78Z" },
]
[[package]]
@@ -418,24 +414,23 @@ wheels = [
[[package]]
name = "curl-cffi"
-version = "0.7.4"
+version = "0.13.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
{ name = "cffi" },
- { name = "typing-extensions" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/d8/b6/81ea20376e1440a2bcb0f0574c158bccb0948621e437f5634b6fc210d2ba/curl_cffi-0.7.4.tar.gz", hash = "sha256:37a2c8ec77b9914b0c14c74f604991751948d9d5def58fcddcbe73e3b62111c1", size = 137276, upload-time = "2024-12-03T08:41:21.018Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/4e/3d/f39ca1f8fdf14408888e7c25e15eed63eac5f47926e206fb93300d28378c/curl_cffi-0.13.0.tar.gz", hash = "sha256:62ecd90a382bd5023750e3606e0aa7cb1a3a8ba41c14270b8e5e149ebf72c5ca", size = 151303, upload-time = "2025-08-06T13:05:42.988Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/d1/c7/f2133c98a9956baa720dc775ba43b2cf7bf22b0feb0f921aab9bbeb2b58c/curl_cffi-0.7.4-cp38-abi3-macosx_10_9_x86_64.whl", hash = "sha256:417f5264fa746d2680ebb20fbfbcfe5d77fa11a735548d9db6734e839a238e22", size = 5106509, upload-time = "2024-12-03T08:41:02.64Z" },
- { url = "https://files.pythonhosted.org/packages/29/e9/141ff25c5e35f4afc998cf60134df94e0a9157427da69d6ee1d2a045c554/curl_cffi-0.7.4-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:fb76b654fcf9f3e0400cf13be949e4fc525aeb0f9e2e90e61ae48d5bd8557d25", size = 2564082, upload-time = "2024-12-03T08:41:05.223Z" },
- { url = "https://files.pythonhosted.org/packages/66/c4/442094831e7017347e866809bfba29f116864a046478e013848f272ba7b7/curl_cffi-0.7.4-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bb9db59b164f2b6be65be62add5896a6fe125c52572aca3046caffbd7eb38f46", size = 5716431, upload-time = "2024-12-03T08:41:07.512Z" },
- { url = "https://files.pythonhosted.org/packages/99/95/6ac63d489167f712bdc14a2cfbe5df252a2e2e95c5b376ea37bda5646fa8/curl_cffi-0.7.4-cp38-abi3-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4593b120c8101b327e4e2d2c278652c5ef58c42dd39dc4586c2789e42a8bc8b1", size = 5521870, upload-time = "2024-12-03T08:41:09.786Z" },
- { url = "https://files.pythonhosted.org/packages/06/83/2de6b27ba8b3ac394252cadb8783f5c57219068489456d8bb58a180d4aa6/curl_cffi-0.7.4-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c4b5685fab3984aae559e6590a6434a7e34f5d615c562c29c1554a90fffbf0bd", size = 6076887, upload-time = "2024-12-03T08:41:11.464Z" },
- { url = "https://files.pythonhosted.org/packages/86/1d/29b2cf2b7c82c61aeff0076b02531b49420beb5fa89c5a0529f5c06480fe/curl_cffi-0.7.4-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:3f8c19b5ca979e806fcf4de24f606eff745c85b43e9e88956d1db3c07516cc4b", size = 6221911, upload-time = "2024-12-03T08:41:13.886Z" },
- { url = "https://files.pythonhosted.org/packages/1b/7e/a9ba49576373e26169e163878cbb8d4e02cfabf3694c686e22243c12f0dd/curl_cffi-0.7.4-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:9957464013b1f76b0e9259ab846fa60faef7ff08e96e7a1764dd63c83005b836", size = 6004845, upload-time = "2024-12-03T08:41:15.495Z" },
- { url = "https://files.pythonhosted.org/packages/c8/d3/79175cf310f0b1c7149e5a2f25cba997aec83a2bcedc85c744a6456e33af/curl_cffi-0.7.4-cp38-abi3-win32.whl", hash = "sha256:8e9019cf6996bf508e4a51751d7217f22d5902405878679a3ac4757159251741", size = 4188474, upload-time = "2024-12-03T08:41:18.112Z" },
- { url = "https://files.pythonhosted.org/packages/1c/86/6054fcc3fd28ec024ad36a667fa49a05b0c9caf26724186918b7c0ef8217/curl_cffi-0.7.4-cp38-abi3-win_amd64.whl", hash = "sha256:31a80d5ab1bc0f9d4bc0f98d91dc1a3ed4aa08566f21b76ecfde23ece08e0fa9", size = 3993713, upload-time = "2024-12-03T08:41:19.704Z" },
+ { url = "https://files.pythonhosted.org/packages/19/d1/acabfd460f1de26cad882e5ef344d9adde1507034528cb6f5698a2e6a2f1/curl_cffi-0.13.0-cp39-abi3-macosx_10_9_x86_64.whl", hash = "sha256:434cadbe8df2f08b2fc2c16dff2779fb40b984af99c06aa700af898e185bb9db", size = 5686337, upload-time = "2025-08-06T13:05:28.985Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/1c/cdb4fb2d16a0e9de068e0e5bc02094e105ce58a687ff30b4c6f88e25a057/curl_cffi-0.13.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:59afa877a9ae09efa04646a7d068eeea48915a95d9add0a29854e7781679fcd7", size = 2994613, upload-time = "2025-08-06T13:05:31.027Z" },
+ { url = "https://files.pythonhosted.org/packages/04/3e/fdf617c1ec18c3038b77065d484d7517bb30f8fb8847224eb1f601a4e8bc/curl_cffi-0.13.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d06ed389e45a7ca97b17c275dbedd3d6524560270e675c720e93a2018a766076", size = 7931353, upload-time = "2025-08-06T13:05:32.273Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/10/6f30c05d251cf03ddc2b9fd19880f3cab8c193255e733444a2df03b18944/curl_cffi-0.13.0-cp39-abi3-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b4e0de45ab3b7a835c72bd53640c2347415111b43421b5c7a1a0b18deae2e541", size = 7486378, upload-time = "2025-08-06T13:05:33.672Z" },
+ { url = "https://files.pythonhosted.org/packages/77/81/5bdb7dd0d669a817397b2e92193559bf66c3807f5848a48ad10cf02bf6c7/curl_cffi-0.13.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8eb4083371bbb94e9470d782de235fb5268bf43520de020c9e5e6be8f395443f", size = 8328585, upload-time = "2025-08-06T13:05:35.28Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/c1/df5c6b4cfad41c08442e0f727e449f4fb5a05f8aa564d1acac29062e9e8e/curl_cffi-0.13.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:28911b526e8cd4aa0e5e38401bfe6887e8093907272f1f67ca22e6beb2933a51", size = 8739831, upload-time = "2025-08-06T13:05:37.078Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/91/6dd1910a212f2e8eafe57877bcf97748eb24849e1511a266687546066b8a/curl_cffi-0.13.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:6d433ffcb455ab01dd0d7bde47109083aa38b59863aa183d29c668ae4c96bf8e", size = 8711908, upload-time = "2025-08-06T13:05:38.741Z" },
+ { url = "https://files.pythonhosted.org/packages/6d/e4/15a253f9b4bf8d008c31e176c162d2704a7e0c5e24d35942f759df107b68/curl_cffi-0.13.0-cp39-abi3-win_amd64.whl", hash = "sha256:66a6b75ce971de9af64f1b6812e275f60b88880577bac47ef1fa19694fa21cd3", size = 1614510, upload-time = "2025-08-06T13:05:40.451Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/0f/9c5275f17ad6ff5be70edb8e0120fdc184a658c9577ca426d4230f654beb/curl_cffi-0.13.0-cp39-abi3-win_arm64.whl", hash = "sha256:d438a3b45244e874794bc4081dc1e356d2bb926dcc7021e5a8fef2e2105ef1d8", size = 1365753, upload-time = "2025-08-06T13:05:41.879Z" },
]
[[package]]
@@ -468,6 +463,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/36/f4/c6e662dade71f56cd2f3735141b265c3c79293c109549c1e6933b0651ffc/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10", size = 16674, upload-time = "2025-05-10T17:42:49.33Z" },
]
+[[package]]
+name = "fastjsonschema"
+version = "2.19.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/ba/7f/cedf77ace50aa60c566deaca9066750f06e1fcf6ad24f254d255bb976dd6/fastjsonschema-2.19.1.tar.gz", hash = "sha256:e3126a94bdc4623d3de4485f8d468a12f02a67921315ddc87836d6e456dc789d", size = 372732, upload-time = "2023-12-28T14:02:06.823Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/9c/b9/79691036d4a8f9857e74d1728b23f34f583b81350a27492edda58d5604e1/fastjsonschema-2.19.1-py3-none-any.whl", hash = "sha256:3672b47bc94178c9f23dbb654bf47440155d4db9df5f7bc47643315f9c405cd0", size = 23388, upload-time = "2023-12-28T14:02:04.512Z" },
+]
+
[[package]]
name = "filelock"
version = "3.19.1"
@@ -477,6 +481,39 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/42/14/42b2651a2f46b022ccd948bca9f2d5af0fd8929c4eec235b8d6d844fbe67/filelock-3.19.1-py3-none-any.whl", hash = "sha256:d38e30481def20772f5baf097c122c3babc4fcdb7e14e57049eb9d88c6dc017d", size = 15988, upload-time = "2025-08-14T16:56:01.633Z" },
]
+[[package]]
+name = "fonttools"
+version = "4.60.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/4b/42/97a13e47a1e51a5a7142475bbcf5107fe3a68fc34aef331c897d5fb98ad0/fonttools-4.60.1.tar.gz", hash = "sha256:ef00af0439ebfee806b25f24c8f92109157ff3fac5731dc7867957812e87b8d9", size = 3559823, upload-time = "2025-09-29T21:13:27.129Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/26/70/03e9d89a053caff6ae46053890eba8e4a5665a7c5638279ed4492e6d4b8b/fonttools-4.60.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:9a52f254ce051e196b8fe2af4634c2d2f02c981756c6464dc192f1b6050b4e28", size = 2810747, upload-time = "2025-09-29T21:10:59.653Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/41/449ad5aff9670ab0df0f61ee593906b67a36d7e0b4d0cd7fa41ac0325bf5/fonttools-4.60.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c7420a2696a44650120cdd269a5d2e56a477e2bfa9d95e86229059beb1c19e15", size = 2346909, upload-time = "2025-09-29T21:11:02.882Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/18/e5970aa96c8fad1cb19a9479cc3b7602c0c98d250fcdc06a5da994309c50/fonttools-4.60.1-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee0c0b3b35b34f782afc673d503167157094a16f442ace7c6c5e0ca80b08f50c", size = 4864572, upload-time = "2025-09-29T21:11:05.096Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/20/9b2b4051b6ec6689480787d506b5003f72648f50972a92d04527a456192c/fonttools-4.60.1-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:282dafa55f9659e8999110bd8ed422ebe1c8aecd0dc396550b038e6c9a08b8ea", size = 4794635, upload-time = "2025-09-29T21:11:08.651Z" },
+ { url = "https://files.pythonhosted.org/packages/10/52/c791f57347c1be98f8345e3dca4ac483eb97666dd7c47f3059aeffab8b59/fonttools-4.60.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:4ba4bd646e86de16160f0fb72e31c3b9b7d0721c3e5b26b9fa2fc931dfdb2652", size = 4843878, upload-time = "2025-09-29T21:11:10.893Z" },
+ { url = "https://files.pythonhosted.org/packages/69/e9/35c24a8d01644cee8c090a22fad34d5b61d1e0a8ecbc9945ad785ebf2e9e/fonttools-4.60.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:0b0835ed15dd5b40d726bb61c846a688f5b4ce2208ec68779bc81860adb5851a", size = 4954555, upload-time = "2025-09-29T21:11:13.24Z" },
+ { url = "https://files.pythonhosted.org/packages/f7/86/fb1e994971be4bdfe3a307de6373ef69a9df83fb66e3faa9c8114893d4cc/fonttools-4.60.1-cp310-cp310-win32.whl", hash = "sha256:1525796c3ffe27bb6268ed2a1bb0dcf214d561dfaf04728abf01489eb5339dce", size = 2232019, upload-time = "2025-09-29T21:11:15.73Z" },
+ { url = "https://files.pythonhosted.org/packages/40/84/62a19e2bd56f0e9fb347486a5b26376bade4bf6bbba64dda2c103bd08c94/fonttools-4.60.1-cp310-cp310-win_amd64.whl", hash = "sha256:268ecda8ca6cb5c4f044b1fb9b3b376e8cd1b361cef275082429dc4174907038", size = 2276803, upload-time = "2025-09-29T21:11:18.152Z" },
+ { url = "https://files.pythonhosted.org/packages/ea/85/639aa9bface1537e0fb0f643690672dde0695a5bbbc90736bc571b0b1941/fonttools-4.60.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7b4c32e232a71f63a5d00259ca3d88345ce2a43295bb049d21061f338124246f", size = 2831872, upload-time = "2025-09-29T21:11:20.329Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/47/3c63158459c95093be9618794acb1067b3f4d30dcc5c3e8114b70e67a092/fonttools-4.60.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3630e86c484263eaac71d117085d509cbcf7b18f677906824e4bace598fb70d2", size = 2356990, upload-time = "2025-09-29T21:11:22.754Z" },
+ { url = "https://files.pythonhosted.org/packages/94/dd/1934b537c86fcf99f9761823f1fc37a98fbd54568e8e613f29a90fed95a9/fonttools-4.60.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5c1015318e4fec75dd4943ad5f6a206d9727adf97410d58b7e32ab644a807914", size = 5042189, upload-time = "2025-09-29T21:11:25.061Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/d2/9f4e4c4374dd1daa8367784e1bd910f18ba886db1d6b825b12edf6db3edc/fonttools-4.60.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e6c58beb17380f7c2ea181ea11e7db8c0ceb474c9dd45f48e71e2cb577d146a1", size = 4978683, upload-time = "2025-09-29T21:11:27.693Z" },
+ { url = "https://files.pythonhosted.org/packages/cc/c4/0fb2dfd1ecbe9a07954cc13414713ed1eab17b1c0214ef07fc93df234a47/fonttools-4.60.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ec3681a0cb34c255d76dd9d865a55f260164adb9fa02628415cdc2d43ee2c05d", size = 5021372, upload-time = "2025-09-29T21:11:30.257Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/d5/495fc7ae2fab20223cc87179a8f50f40f9a6f821f271ba8301ae12bb580f/fonttools-4.60.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f4b5c37a5f40e4d733d3bbaaef082149bee5a5ea3156a785ff64d949bd1353fa", size = 5132562, upload-time = "2025-09-29T21:11:32.737Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/fa/021dab618526323c744e0206b3f5c8596a2e7ae9aa38db5948a131123e83/fonttools-4.60.1-cp311-cp311-win32.whl", hash = "sha256:398447f3d8c0c786cbf1209711e79080a40761eb44b27cdafffb48f52bcec258", size = 2230288, upload-time = "2025-09-29T21:11:35.015Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/78/0e1a6d22b427579ea5c8273e1c07def2f325b977faaf60bb7ddc01456cb1/fonttools-4.60.1-cp311-cp311-win_amd64.whl", hash = "sha256:d066ea419f719ed87bc2c99a4a4bfd77c2e5949cb724588b9dd58f3fd90b92bf", size = 2278184, upload-time = "2025-09-29T21:11:37.434Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/f7/a10b101b7a6f8836a5adb47f2791f2075d044a6ca123f35985c42edc82d8/fonttools-4.60.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:7b0c6d57ab00dae9529f3faf187f2254ea0aa1e04215cf2f1a8ec277c96661bc", size = 2832953, upload-time = "2025-09-29T21:11:39.616Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/fe/7bd094b59c926acf2304d2151354ddbeb74b94812f3dc943c231db09cb41/fonttools-4.60.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:839565cbf14645952d933853e8ade66a463684ed6ed6c9345d0faf1f0e868877", size = 2352706, upload-time = "2025-09-29T21:11:41.826Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/ca/4bb48a26ed95a1e7eba175535fe5805887682140ee0a0d10a88e1de84208/fonttools-4.60.1-cp312-cp312-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:8177ec9676ea6e1793c8a084a90b65a9f778771998eb919d05db6d4b1c0b114c", size = 4923716, upload-time = "2025-09-29T21:11:43.893Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/9f/2cb82999f686c1d1ddf06f6ae1a9117a880adbec113611cc9d22b2fdd465/fonttools-4.60.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:996a4d1834524adbb423385d5a629b868ef9d774670856c63c9a0408a3063401", size = 4968175, upload-time = "2025-09-29T21:11:46.439Z" },
+ { url = "https://files.pythonhosted.org/packages/18/79/be569699e37d166b78e6218f2cde8c550204f2505038cdd83b42edc469b9/fonttools-4.60.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a46b2f450bc79e06ef3b6394f0c68660529ed51692606ad7f953fc2e448bc903", size = 4911031, upload-time = "2025-09-29T21:11:48.977Z" },
+ { url = "https://files.pythonhosted.org/packages/cc/9f/89411cc116effaec5260ad519162f64f9c150e5522a27cbb05eb62d0c05b/fonttools-4.60.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6ec722ee589e89a89f5b7574f5c45604030aa6ae24cb2c751e2707193b466fed", size = 5062966, upload-time = "2025-09-29T21:11:54.344Z" },
+ { url = "https://files.pythonhosted.org/packages/62/a1/f888221934b5731d46cb9991c7a71f30cb1f97c0ef5fcf37f8da8fce6c8e/fonttools-4.60.1-cp312-cp312-win32.whl", hash = "sha256:b2cf105cee600d2de04ca3cfa1f74f1127f8455b71dbad02b9da6ec266e116d6", size = 2218750, upload-time = "2025-09-29T21:11:56.601Z" },
+ { url = "https://files.pythonhosted.org/packages/88/8f/a55b5550cd33cd1028601df41acd057d4be20efa5c958f417b0c0613924d/fonttools-4.60.1-cp312-cp312-win_amd64.whl", hash = "sha256:992775c9fbe2cf794786fa0ffca7f09f564ba3499b8fe9f2f80bd7197db60383", size = 2267026, upload-time = "2025-09-29T21:11:58.852Z" },
+ { url = "https://files.pythonhosted.org/packages/c7/93/0dd45cd283c32dea1545151d8c3637b4b8c53cdb3a625aeb2885b184d74d/fonttools-4.60.1-py3-none-any.whl", hash = "sha256:906306ac7afe2156fcf0042173d6ebbb05416af70f6b370967b47f8f00103bbb", size = 1143175, upload-time = "2025-09-29T21:13:24.134Z" },
+]
+
[[package]]
name = "frozenlist"
version = "1.7.0"
@@ -603,20 +640,20 @@ wheels = [
[[package]]
name = "isort"
-version = "5.13.2"
+version = "7.0.0"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/87/f9/c1eb8635a24e87ade2efce21e3ce8cd6b8630bb685ddc9cdaca1349b2eb5/isort-5.13.2.tar.gz", hash = "sha256:48fdfcb9face5d58a4f6dde2e72a1fb8dcaf8ab26f95ab49fab84c2ddefb0109", size = 175303, upload-time = "2023-12-13T20:37:26.124Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/63/53/4f3c058e3bace40282876f9b553343376ee687f3c35a525dc79dbd450f88/isort-7.0.0.tar.gz", hash = "sha256:5513527951aadb3ac4292a41a16cbc50dd1642432f5e8c20057d414bdafb4187", size = 805049, upload-time = "2025-10-11T13:30:59.107Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/d1/b3/8def84f539e7d2289a02f0524b944b15d7c75dab7628bedf1c4f0992029c/isort-5.13.2-py3-none-any.whl", hash = "sha256:8ca5e72a8d85860d5a3fa69b8745237f2939afe12dbf656afbcb47fe72d947a6", size = 92310, upload-time = "2023-12-13T20:37:23.244Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/ed/e3705d6d02b4f7aea715a353c8ce193efd0b5db13e204df895d38734c244/isort-7.0.0-py3-none-any.whl", hash = "sha256:1bcabac8bc3c36c7fb7b98a76c8abb18e0f841a3ba81decac7691008592499c1", size = 94672, upload-time = "2025-10-11T13:30:57.665Z" },
]
[[package]]
name = "jsonpickle"
-version = "3.4.2"
+version = "4.1.1"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/eb/d9/05365407d3312653498001adcebe64a14024f7189691b728610209991c46/jsonpickle-3.4.2.tar.gz", hash = "sha256:2efa2778859b6397d5804b0a98d52cd2a7d9a70fcb873bc5a3ca5acca8f499ba", size = 314339, upload-time = "2024-11-06T07:48:25.479Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/e4/a6/d07afcfdef402900229bcca795f80506b207af13a838d4d99ad45abf530c/jsonpickle-4.1.1.tar.gz", hash = "sha256:f86e18f13e2b96c1c1eede0b7b90095bbb61d99fedc14813c44dc2f361dbbae1", size = 316885, upload-time = "2025-06-02T20:36:11.57Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/c0/a3/e610ae0feba3e7374da08ab6cc9bb76c8bfa84b4e502aa357bda0ef6dcae/jsonpickle-3.4.2-py3-none-any.whl", hash = "sha256:fd6c273278a02b3b66e3405db3dd2f4dbc8f4a4a3123bfcab3045177c6feb9c3", size = 46256, upload-time = "2024-11-06T07:48:22.923Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/73/04df8a6fa66d43a9fd45c30f283cc4afff17da671886e451d52af60bdc7e/jsonpickle-4.1.1-py3-none-any.whl", hash = "sha256:bb141da6057898aa2438ff268362b126826c812a1721e31cf08a6e142910dc91", size = 47125, upload-time = "2025-06-02T20:36:08.647Z" },
]
[[package]]
@@ -715,46 +752,34 @@ wheels = [
[[package]]
name = "marisa-trie"
-version = "1.2.1"
+version = "1.3.1"
source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "setuptools" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/31/15/9d9743897e4450b2de199ee673b50cb018980c4ced477d41cf91304a85e3/marisa_trie-1.2.1.tar.gz", hash = "sha256:3a27c408e2aefc03e0f1d25b2ff2afb85aac3568f6fa2ae2a53b57a2e87ce29d", size = 416124, upload-time = "2024-10-12T11:30:15.989Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/c5/e3/c9066e74076b90f9701ccd23d6a0b8c1d583feefdec576dc3e1bb093c50d/marisa_trie-1.3.1.tar.gz", hash = "sha256:97107fd12f30e4f8fea97790343a2d2d9a79d93697fe14e1b6f6363c984ff85b", size = 212454, upload-time = "2025-08-26T15:13:18.401Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/e4/83/ccf5b33f2123f3110705c608f8e0caa82002626511aafafc58f82e50d322/marisa_trie-1.2.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a2eb41d2f9114d8b7bd66772c237111e00d2bae2260824560eaa0a1e291ce9e8", size = 362200, upload-time = "2024-10-12T11:28:25.418Z" },
- { url = "https://files.pythonhosted.org/packages/9d/74/f7ce1fc2ee480c7f8ceadd9b992caceaba442a97e5e99d6aea00d3635a0b/marisa_trie-1.2.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:9e956e6a46f604b17d570901e66f5214fb6f658c21e5e7665deace236793cef6", size = 192309, upload-time = "2024-10-12T11:28:27.348Z" },
- { url = "https://files.pythonhosted.org/packages/e4/52/5dbbc13e57ce54c2ef0d04962d7d8f66edc69ed34310c734a2913199a581/marisa_trie-1.2.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bd45142501300e7538b2e544905580918b67b1c82abed1275fe4c682c95635fa", size = 174713, upload-time = "2024-10-12T11:28:28.912Z" },
- { url = "https://files.pythonhosted.org/packages/57/49/2580372f3f980aea95c23d05b2c1d3bbb9ee1ab8cfd441545153e44f1be7/marisa_trie-1.2.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a8443d116c612cfd1961fbf76769faf0561a46d8e317315dd13f9d9639ad500c", size = 1314808, upload-time = "2024-10-12T11:28:30.705Z" },
- { url = "https://files.pythonhosted.org/packages/5a/ba/e12a4d450f265414cc68df6a116a78beece72b95f774f04d29cd48e08d19/marisa_trie-1.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:875a6248e60fbb48d947b574ffa4170f34981f9e579bde960d0f9a49ea393ecc", size = 1346678, upload-time = "2024-10-12T11:28:33.106Z" },
- { url = "https://files.pythonhosted.org/packages/b2/81/8e130cb1eea741fd17694d821096f7ec9841f0e3d3c69b740257f5eeafa8/marisa_trie-1.2.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:746a7c60a17fccd3cfcfd4326926f02ea4fcdfc25d513411a0c4fc8e4a1ca51f", size = 1307254, upload-time = "2024-10-12T11:28:35.053Z" },
- { url = "https://files.pythonhosted.org/packages/d7/d0/3deb5ea2bf7e4d845339875dbb31f3c3f66c8d6568723db1d137fb08a91c/marisa_trie-1.2.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:e70869737cc0e5bd903f620667da6c330d6737048d1f44db792a6af68a1d35be", size = 2194712, upload-time = "2024-10-12T11:28:36.87Z" },
- { url = "https://files.pythonhosted.org/packages/9c/5f/b38d728dd30954816497b53425cfaddaf7b93ac0912db5911888f191b07a/marisa_trie-1.2.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:06b099dd743676dbcd8abd8465ceac8f6d97d8bfaabe2c83b965495523b4cef2", size = 2355625, upload-time = "2024-10-12T11:28:38.206Z" },
- { url = "https://files.pythonhosted.org/packages/7e/4f/61c0faa9ae9e53600a1b7a0c367bc9db1a4fdc625402ec232c755a05e094/marisa_trie-1.2.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:d2a82eb21afdaf22b50d9b996472305c05ca67fc4ff5a026a220320c9c961db6", size = 2290290, upload-time = "2024-10-12T11:28:40.148Z" },
- { url = "https://files.pythonhosted.org/packages/7c/7d/713b970fb3043248881ed776dbf4d54918398aa5dde843a38711d0d62c8f/marisa_trie-1.2.1-cp310-cp310-win32.whl", hash = "sha256:8951e7ce5d3167fbd085703b4cbb3f47948ed66826bef9a2173c379508776cf5", size = 130743, upload-time = "2024-10-12T11:28:41.31Z" },
- { url = "https://files.pythonhosted.org/packages/cc/94/3d619cc82c30daeacd18a88674f4e6540ebfb7b4b7752ca0552793be80cf/marisa_trie-1.2.1-cp310-cp310-win_amd64.whl", hash = "sha256:5685a14b3099b1422c4f59fa38b0bf4b5342ee6cc38ae57df9666a0b28eeaad3", size = 151891, upload-time = "2024-10-12T11:28:42.279Z" },
- { url = "https://files.pythonhosted.org/packages/4a/93/ffb01dfa22b6eee918e798e0bc3487427036c608aa4c065725f31aaf4104/marisa_trie-1.2.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ed3fb4ed7f2084597e862bcd56c56c5529e773729a426c083238682dba540e98", size = 362823, upload-time = "2024-10-12T11:28:43.983Z" },
- { url = "https://files.pythonhosted.org/packages/6d/1d/5c36500ac350c278c9bdfd88e17fa846fa4136d75597c167141ed973cdf2/marisa_trie-1.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0fe69fb9ffb2767746181f7b3b29bbd3454d1d24717b5958e030494f3d3cddf3", size = 192741, upload-time = "2024-10-12T11:28:45.536Z" },
- { url = "https://files.pythonhosted.org/packages/e8/04/87dd0840f3f720e511eba56193c02bf64d7d96df1ca9f6d19994f55154be/marisa_trie-1.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4728ed3ae372d1ea2cdbd5eaa27b8f20a10e415d1f9d153314831e67d963f281", size = 174995, upload-time = "2024-10-12T11:28:46.544Z" },
- { url = "https://files.pythonhosted.org/packages/c9/51/9e903a7e13b7593e2e675d0ec4c390ca076dc5df1c1a0d5e85a513b886a3/marisa_trie-1.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8cf4f25cf895692b232f49aa5397af6aba78bb679fb917a05fce8d3cb1ee446d", size = 1384728, upload-time = "2024-10-12T11:28:48.28Z" },
- { url = "https://files.pythonhosted.org/packages/e8/3f/7362a5ac60c2b0aad0f52cd57e7bd0c708f20d2660d8df85360f3d8f1c4b/marisa_trie-1.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7cca7f96236ffdbf49be4b2e42c132e3df05968ac424544034767650913524de", size = 1412620, upload-time = "2024-10-12T11:28:50.427Z" },
- { url = "https://files.pythonhosted.org/packages/1f/bc/aaa3eaf6875f78a204a8da9692d56e3a36f89997dad2c388628385614576/marisa_trie-1.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d7eb20bf0e8b55a58d2a9b518aabc4c18278787bdba476c551dd1c1ed109e509", size = 1361555, upload-time = "2024-10-12T11:28:51.603Z" },
- { url = "https://files.pythonhosted.org/packages/18/98/e11b5a6206c5d110f32adab37fa84a85410d684e9c731acdd5c9250e2ce4/marisa_trie-1.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b1ec93f0d1ee6d7ab680a6d8ea1a08bf264636358e92692072170032dda652ba", size = 2257717, upload-time = "2024-10-12T11:28:52.881Z" },
- { url = "https://files.pythonhosted.org/packages/d2/9d/6b4a40867875e738a67c5b29f83e2e490a66bd9067ace3dd9a5c497e2b7f/marisa_trie-1.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e2699255d7ac610dee26d4ae7bda5951d05c7d9123a22e1f7c6a6f1964e0a4e4", size = 2417044, upload-time = "2024-10-12T11:28:54.115Z" },
- { url = "https://files.pythonhosted.org/packages/fe/61/e25613c72f2931757334b8bcf6b501569ef713f5ee9c6c7688ec460bd720/marisa_trie-1.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:c484410911182457a8a1a0249d0c09c01e2071b78a0a8538cd5f7fa45589b13a", size = 2351960, upload-time = "2024-10-12T11:28:55.417Z" },
- { url = "https://files.pythonhosted.org/packages/19/0a/a90ccaf3eb476d13ec261f80c6c52defaf10ebc7f35eb2bcd7dfb533aef7/marisa_trie-1.2.1-cp311-cp311-win32.whl", hash = "sha256:ad548117744b2bcf0e3d97374608be0a92d18c2af13d98b728d37cd06248e571", size = 130446, upload-time = "2024-10-12T11:28:57.294Z" },
- { url = "https://files.pythonhosted.org/packages/fc/98/574b4e143e0a2f5f71af8716b6c4a8a46220f75a6e0847ce7d11ee0ba4aa/marisa_trie-1.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:436f62d27714970b9cdd3b3c41bdad046f260e62ebb0daa38125ef70536fc73b", size = 152037, upload-time = "2024-10-12T11:28:58.399Z" },
- { url = "https://files.pythonhosted.org/packages/4e/bf/8bd4ac8436b33fd46c9e1ffe3c2a131cd9744cc1649dbbe13308f744ef2b/marisa_trie-1.2.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:638506eacf20ca503fff72221a7e66a6eadbf28d6a4a6f949fcf5b1701bb05ec", size = 360041, upload-time = "2024-10-12T11:28:59.436Z" },
- { url = "https://files.pythonhosted.org/packages/ab/dd/4d3151e302e66ae387885f6ec265bd189e096b0c43c1379bfd9a3b9d2543/marisa_trie-1.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:de1665eaafefa48a308e4753786519888021740501a15461c77bdfd57638e6b4", size = 190520, upload-time = "2024-10-12T11:29:01.07Z" },
- { url = "https://files.pythonhosted.org/packages/00/28/ae5991c74fb90b173167a366a634c83445f948ad044d37287b478d6b457e/marisa_trie-1.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f713af9b8aa66a34cd3a78c7d150a560a75734713abe818a69021fd269e927fa", size = 174175, upload-time = "2024-10-12T11:29:02.516Z" },
- { url = "https://files.pythonhosted.org/packages/5a/6a/fbfa89a8680eaabc6847a6c421e65427c43182db0c4bdb60e1516c81c822/marisa_trie-1.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b2a7d00f53f4945320b551bccb826b3fb26948bde1a10d50bb9802fabb611b10", size = 1354995, upload-time = "2024-10-12T11:29:04.294Z" },
- { url = "https://files.pythonhosted.org/packages/9e/4c/2ba0b385e5f64ca4ddb0c10ec52ddf881bc4521f135948786fc339d1d6c8/marisa_trie-1.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98042040d1d6085792e8d0f74004fc0f5f9ca6091c298f593dd81a22a4643854", size = 1390989, upload-time = "2024-10-12T11:29:05.576Z" },
- { url = "https://files.pythonhosted.org/packages/6b/22/0791ed3045c91d0938345a86be472fc7c188b894f16c5dfad2ef31e7f882/marisa_trie-1.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6532615111eec2c79e711965ece0bc95adac1ff547a7fff5ffca525463116deb", size = 1328810, upload-time = "2024-10-12T11:29:07.522Z" },
- { url = "https://files.pythonhosted.org/packages/9d/7d/3f566e563abae6efce7fc311c63282a447c611739b3cd66c0e36077c86f8/marisa_trie-1.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:20948e40ab2038e62b7000ca6b4a913bc16c91a2c2e6da501bd1f917eeb28d51", size = 2230222, upload-time = "2024-10-12T11:29:09.374Z" },
- { url = "https://files.pythonhosted.org/packages/a5/0b/38fbb4611b5d1030242ddc2aa62e524438c8076e26f87395dbbf222dc62d/marisa_trie-1.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:66b23e5b35dd547f85bf98db7c749bc0ffc57916ade2534a6bbc32db9a4abc44", size = 2383620, upload-time = "2024-10-12T11:29:10.904Z" },
- { url = "https://files.pythonhosted.org/packages/ae/17/4553c63de29904d5d2521a24cad817bc7883cfa90506ab702ec4dae59a7b/marisa_trie-1.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6704adf0247d2dda42e876b793be40775dff46624309ad99bc7537098bee106d", size = 2329202, upload-time = "2024-10-12T11:29:12.266Z" },
- { url = "https://files.pythonhosted.org/packages/45/08/6307a630e63cd763fe77ac56516faa67fa9cd342060691e40fabc84be6b0/marisa_trie-1.2.1-cp312-cp312-win32.whl", hash = "sha256:3ad356442c2fea4c2a6f514738ddf213d23930f942299a2b2c05df464a00848a", size = 129652, upload-time = "2024-10-12T11:29:13.454Z" },
- { url = "https://files.pythonhosted.org/packages/a1/fe/67c357bfd92710d95a16b86e1453c663d565415d7f7838781c79ff7e1a7e/marisa_trie-1.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:f2806f75817392cedcacb24ac5d80b0350dde8d3861d67d045c1d9b109764114", size = 150845, upload-time = "2024-10-12T11:29:15.092Z" },
+ { url = "https://files.pythonhosted.org/packages/36/eb/c18113555950ea25c421a5e8f7f280a9d7e9198a072f89d33ae9a5725ead/marisa_trie-1.3.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7e957aa4251a8e70b9fe02a16b2d190f18787902da563cb7ba865508b8e8fb04", size = 172432, upload-time = "2025-08-26T15:11:51.329Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/98/6d3f507a7340697d25d53839e68b516d3d01a3714edf33d484896250189b/marisa_trie-1.3.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e5888b269e790356ce4525f3e8df1fe866d1497b7d7fb7548cfec883cb985288", size = 156327, upload-time = "2025-08-26T15:11:52.646Z" },
+ { url = "https://files.pythonhosted.org/packages/be/39/78d6def87a6effec6480ef1474d4cc81ef9845c78281ac5a6c07a6440744/marisa_trie-1.3.1-cp310-cp310-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8f81344d212cb41992340b0b8a67e375f44da90590b884204fd3fa5e02107df2", size = 1219155, upload-time = "2025-08-26T15:11:53.915Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/b4/3b60c26cb9a2c623f47eeed84cfa6ebd3f71c5bd95ef32ed526e4ac689dc/marisa_trie-1.3.1-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3715d779561699471edde70975e07b1de7dddb2816735d40ed16be4b32054188", size = 1239413, upload-time = "2025-08-26T15:11:55.655Z" },
+ { url = "https://files.pythonhosted.org/packages/21/ef/9c7fca5bf133bdb144317843881c8b0c74d2acb7fa209f793c29422e7669/marisa_trie-1.3.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:47631614c5243ed7d15ae0af8245fcc0599f5b7921fae2a4ae992afb27c9afbb", size = 2161737, upload-time = "2025-08-26T15:11:56.832Z" },
+ { url = "https://files.pythonhosted.org/packages/1c/03/d5f630498bf4b8baf2d6484651255f601e9fdc6d42a83288e8b2420ebc9b/marisa_trie-1.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ad82ab8a58562cf69e6b786debcc7638b28df12f9f1c7bcffb07efb5c1f09cbd", size = 2250038, upload-time = "2025-08-26T15:11:58.165Z" },
+ { url = "https://files.pythonhosted.org/packages/00/6b/c12f055dbb13d22b0f8e1f3da9cb734f581b516cc0e3c909e3f39368f676/marisa_trie-1.3.1-cp310-cp310-win32.whl", hash = "sha256:9f92d3577c72d5a97af5c8e3d98247b79c8ccfb64ebf611311dcf631b11e5604", size = 117232, upload-time = "2025-08-26T15:11:59.616Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/fd/988a19587c7bb8f03fb80e17335f75ca2d5538df4909727012b4bdff8f99/marisa_trie-1.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:a5a0a58ffe2a7eb3f870214c6df8f9a43ce768bd8fed883e6ba8c77645666b63", size = 143231, upload-time = "2025-08-26T15:12:00.52Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/bf/2f1fe6c9fcd2b509c6dfaaf26e35128947d6d3718d0b39510903c55b7bed/marisa_trie-1.3.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5ef045f694ef66079b4e00c4c9063a00183d6af7d1ff643de6ea5c3b0d9af01b", size = 174027, upload-time = "2025-08-26T15:12:01.434Z" },
+ { url = "https://files.pythonhosted.org/packages/a9/5a/de7936d58ed0de847180cee2b95143d420223c5ade0c093d55113f628237/marisa_trie-1.3.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:cbd28f95d5f30d9a7af6130869568e75bfd7ef2e0adfb1480f1f44480f5d3603", size = 158478, upload-time = "2025-08-26T15:12:02.429Z" },
+ { url = "https://files.pythonhosted.org/packages/48/cc/80611aadefcd0bcf8cd1795cb4643bb27213319a221ba04fe071da0b75cd/marisa_trie-1.3.1-cp311-cp311-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b173ec46d521308f7c97d96d6e05cf2088e0548f82544ec9a8656af65593304d", size = 1257535, upload-time = "2025-08-26T15:12:04.271Z" },
+ { url = "https://files.pythonhosted.org/packages/36/89/c4eeefb956318047036e6bdc572b6112b2059d595e85961267a90aa40458/marisa_trie-1.3.1-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:954fef9185f8a79441b4e433695116636bf66402945cfee404f8983bafa59788", size = 1275566, upload-time = "2025-08-26T15:12:05.874Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/63/d775a2fdfc4b555120381cd2aa6dff1845576bc14fb13796ae1b1e8dbaf7/marisa_trie-1.3.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ca644534f15f85bba14c412afc17de07531e79a766ce85b8dbf3f8b6e7758f20", size = 2199831, upload-time = "2025-08-26T15:12:07.175Z" },
+ { url = "https://files.pythonhosted.org/packages/50/aa/e5053927dc3cac77acc9b27f6f87e75c880f5d3d5eac9111fe13b1d8bf6f/marisa_trie-1.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:3834304fdeaa1c9b73596ad5a6c01a44fc19c13c115194704b85f7fbdf0a7b8e", size = 2283830, upload-time = "2025-08-26T15:12:08.319Z" },
+ { url = "https://files.pythonhosted.org/packages/71/3e/e314906d0de5b1a44780a23c79bb62a9aafd876e2a4e80fb34f58c721da4/marisa_trie-1.3.1-cp311-cp311-win32.whl", hash = "sha256:70b4c96f9119cfeb4dc6a0cf4afc9f92f0b002cde225bcd910915d976c78e66a", size = 117335, upload-time = "2025-08-26T15:12:09.776Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/2b/85623566621135de3d57497811f94679b4fb2a8f16148ef67133c2abab7a/marisa_trie-1.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:986eaf35a7f63c878280609ecd37edf8a074f7601c199acfec81d03f1ee9a39a", size = 143985, upload-time = "2025-08-26T15:12:10.988Z" },
+ { url = "https://files.pythonhosted.org/packages/3f/40/ee7ea61b88d62d2189b5c4a27bc0fc8d9c32f8b8dc6daf1c93a7b7ad34ac/marisa_trie-1.3.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:5b7c1e7fa6c3b855e8cfbabf38454d7decbaba1c567d0cd58880d033c6b363bd", size = 173454, upload-time = "2025-08-26T15:12:12.13Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/fc/58635811586898041004b2197a085253706ede211324a53ec01612a50e20/marisa_trie-1.3.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c12b44c190deb0d67655021da1f2d0a7d61a257bf844101cf982e68ed344f28d", size = 155305, upload-time = "2025-08-26T15:12:13.374Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/98/88ca0c98d37034a3237acaf461d210cbcfeb6687929e5ba0e354971fa3ed/marisa_trie-1.3.1-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9688c7b45f744366a4ef661e399f24636ebe440d315ab35d768676c59c613186", size = 1244834, upload-time = "2025-08-26T15:12:14.795Z" },
+ { url = "https://files.pythonhosted.org/packages/f3/5f/93b3e3607ccd693a768eafee60829cd14ea1810b75aa48e8b20e27b332c4/marisa_trie-1.3.1-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:99a00cab4cf9643a87977c87a5c8961aa44fff8d5dd46e00250135f686e7dedf", size = 1265148, upload-time = "2025-08-26T15:12:16.229Z" },
+ { url = "https://files.pythonhosted.org/packages/db/6e/051d7d25c7fb2b3df605c8bd782513ebbb33fddf3bae6cf46cf268cca89f/marisa_trie-1.3.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:83efc045fc58ca04c91a96c9b894d8a19ac6553677a76f96df01ff9f0405f53d", size = 2172726, upload-time = "2025-08-26T15:12:18.467Z" },
+ { url = "https://files.pythonhosted.org/packages/58/da/244d9d4e414ce6c73124cba4cc293dd140bf3b04ca18dec64c2775cca951/marisa_trie-1.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:0b9816ab993001a7854b02a7daec228892f35bd5ab0ac493bacbd1b80baec9f1", size = 2256104, upload-time = "2025-08-26T15:12:20.168Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/f1/1a36ecd7da6668685a7753522af89a19928ffc80f1cc1dbc301af216f011/marisa_trie-1.3.1-cp312-cp312-win32.whl", hash = "sha256:c785fd6dae9daa6825734b7b494cdac972f958be1f9cb3fb1f32be8598d2b936", size = 115624, upload-time = "2025-08-26T15:12:21.233Z" },
+ { url = "https://files.pythonhosted.org/packages/35/b2/aabd1c9f1c102aa31d66633ed5328c447be166e0a703f9723e682478fd83/marisa_trie-1.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:9868b7a8e0f648d09ffe25ac29511e6e208cc5fb0d156c295385f9d5dc2a138e", size = 138562, upload-time = "2025-08-26T15:12:22.632Z" },
]
[[package]]
@@ -855,7 +880,7 @@ wheels = [
[[package]]
name = "mypy"
-version = "1.17.1"
+version = "1.18.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "mypy-extensions" },
@@ -863,27 +888,27 @@ dependencies = [
{ name = "tomli", marker = "python_full_version < '3.11'" },
{ name = "typing-extensions" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/8e/22/ea637422dedf0bf36f3ef238eab4e455e2a0dcc3082b5cc067615347ab8e/mypy-1.17.1.tar.gz", hash = "sha256:25e01ec741ab5bb3eec8ba9cdb0f769230368a22c959c4937360efb89b7e9f01", size = 3352570, upload-time = "2025-07-31T07:54:19.204Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/c0/77/8f0d0001ffad290cef2f7f216f96c814866248a0b92a722365ed54648e7e/mypy-1.18.2.tar.gz", hash = "sha256:06a398102a5f203d7477b2923dda3634c36727fa5c237d8f859ef90c42a9924b", size = 3448846, upload-time = "2025-09-19T00:11:10.519Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/77/a9/3d7aa83955617cdf02f94e50aab5c830d205cfa4320cf124ff64acce3a8e/mypy-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:3fbe6d5555bf608c47203baa3e72dbc6ec9965b3d7c318aa9a4ca76f465bd972", size = 11003299, upload-time = "2025-07-31T07:54:06.425Z" },
- { url = "https://files.pythonhosted.org/packages/83/e8/72e62ff837dd5caaac2b4a5c07ce769c8e808a00a65e5d8f94ea9c6f20ab/mypy-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:80ef5c058b7bce08c83cac668158cb7edea692e458d21098c7d3bce35a5d43e7", size = 10125451, upload-time = "2025-07-31T07:53:52.974Z" },
- { url = "https://files.pythonhosted.org/packages/7d/10/f3f3543f6448db11881776f26a0ed079865926b0c841818ee22de2c6bbab/mypy-1.17.1-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c4a580f8a70c69e4a75587bd925d298434057fe2a428faaf927ffe6e4b9a98df", size = 11916211, upload-time = "2025-07-31T07:53:18.879Z" },
- { url = "https://files.pythonhosted.org/packages/06/bf/63e83ed551282d67bb3f7fea2cd5561b08d2bb6eb287c096539feb5ddbc5/mypy-1.17.1-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:dd86bb649299f09d987a2eebb4d52d10603224500792e1bee18303bbcc1ce390", size = 12652687, upload-time = "2025-07-31T07:53:30.544Z" },
- { url = "https://files.pythonhosted.org/packages/69/66/68f2eeef11facf597143e85b694a161868b3b006a5fbad50e09ea117ef24/mypy-1.17.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:a76906f26bd8d51ea9504966a9c25419f2e668f012e0bdf3da4ea1526c534d94", size = 12896322, upload-time = "2025-07-31T07:53:50.74Z" },
- { url = "https://files.pythonhosted.org/packages/a3/87/8e3e9c2c8bd0d7e071a89c71be28ad088aaecbadf0454f46a540bda7bca6/mypy-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:e79311f2d904ccb59787477b7bd5d26f3347789c06fcd7656fa500875290264b", size = 9507962, upload-time = "2025-07-31T07:53:08.431Z" },
- { url = "https://files.pythonhosted.org/packages/46/cf/eadc80c4e0a70db1c08921dcc220357ba8ab2faecb4392e3cebeb10edbfa/mypy-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ad37544be07c5d7fba814eb370e006df58fed8ad1ef33ed1649cb1889ba6ff58", size = 10921009, upload-time = "2025-07-31T07:53:23.037Z" },
- { url = "https://files.pythonhosted.org/packages/5d/c1/c869d8c067829ad30d9bdae051046561552516cfb3a14f7f0347b7d973ee/mypy-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:064e2ff508e5464b4bd807a7c1625bc5047c5022b85c70f030680e18f37273a5", size = 10047482, upload-time = "2025-07-31T07:53:26.151Z" },
- { url = "https://files.pythonhosted.org/packages/98/b9/803672bab3fe03cee2e14786ca056efda4bb511ea02dadcedde6176d06d0/mypy-1.17.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:70401bbabd2fa1aa7c43bb358f54037baf0586f41e83b0ae67dd0534fc64edfd", size = 11832883, upload-time = "2025-07-31T07:53:47.948Z" },
- { url = "https://files.pythonhosted.org/packages/88/fb/fcdac695beca66800918c18697b48833a9a6701de288452b6715a98cfee1/mypy-1.17.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e92bdc656b7757c438660f775f872a669b8ff374edc4d18277d86b63edba6b8b", size = 12566215, upload-time = "2025-07-31T07:54:04.031Z" },
- { url = "https://files.pythonhosted.org/packages/7f/37/a932da3d3dace99ee8eb2043b6ab03b6768c36eb29a02f98f46c18c0da0e/mypy-1.17.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:c1fdf4abb29ed1cb091cf432979e162c208a5ac676ce35010373ff29247bcad5", size = 12751956, upload-time = "2025-07-31T07:53:36.263Z" },
- { url = "https://files.pythonhosted.org/packages/8c/cf/6438a429e0f2f5cab8bc83e53dbebfa666476f40ee322e13cac5e64b79e7/mypy-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:ff2933428516ab63f961644bc49bc4cbe42bbffb2cd3b71cc7277c07d16b1a8b", size = 9507307, upload-time = "2025-07-31T07:53:59.734Z" },
- { url = "https://files.pythonhosted.org/packages/17/a2/7034d0d61af8098ec47902108553122baa0f438df8a713be860f7407c9e6/mypy-1.17.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:69e83ea6553a3ba79c08c6e15dbd9bfa912ec1e493bf75489ef93beb65209aeb", size = 11086295, upload-time = "2025-07-31T07:53:28.124Z" },
- { url = "https://files.pythonhosted.org/packages/14/1f/19e7e44b594d4b12f6ba8064dbe136505cec813549ca3e5191e40b1d3cc2/mypy-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1b16708a66d38abb1e6b5702f5c2c87e133289da36f6a1d15f6a5221085c6403", size = 10112355, upload-time = "2025-07-31T07:53:21.121Z" },
- { url = "https://files.pythonhosted.org/packages/5b/69/baa33927e29e6b4c55d798a9d44db5d394072eef2bdc18c3e2048c9ed1e9/mypy-1.17.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:89e972c0035e9e05823907ad5398c5a73b9f47a002b22359b177d40bdaee7056", size = 11875285, upload-time = "2025-07-31T07:53:55.293Z" },
- { url = "https://files.pythonhosted.org/packages/90/13/f3a89c76b0a41e19490b01e7069713a30949d9a6c147289ee1521bcea245/mypy-1.17.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:03b6d0ed2b188e35ee6d5c36b5580cffd6da23319991c49ab5556c023ccf1341", size = 12737895, upload-time = "2025-07-31T07:53:43.623Z" },
- { url = "https://files.pythonhosted.org/packages/23/a1/c4ee79ac484241301564072e6476c5a5be2590bc2e7bfd28220033d2ef8f/mypy-1.17.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:c837b896b37cd103570d776bda106eabb8737aa6dd4f248451aecf53030cdbeb", size = 12931025, upload-time = "2025-07-31T07:54:17.125Z" },
- { url = "https://files.pythonhosted.org/packages/89/b8/7409477be7919a0608900e6320b155c72caab4fef46427c5cc75f85edadd/mypy-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:665afab0963a4b39dff7c1fa563cc8b11ecff7910206db4b2e64dd1ba25aed19", size = 9584664, upload-time = "2025-07-31T07:54:12.842Z" },
- { url = "https://files.pythonhosted.org/packages/1d/f3/8fcd2af0f5b806f6cf463efaffd3c9548a28f84220493ecd38d127b6b66d/mypy-1.17.1-py3-none-any.whl", hash = "sha256:a9f52c0351c21fe24c21d8c0eb1f62967b262d6729393397b6f443c3b773c3b9", size = 2283411, upload-time = "2025-07-31T07:53:24.664Z" },
+ { url = "https://files.pythonhosted.org/packages/03/6f/657961a0743cff32e6c0611b63ff1c1970a0b482ace35b069203bf705187/mypy-1.18.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c1eab0cf6294dafe397c261a75f96dc2c31bffe3b944faa24db5def4e2b0f77c", size = 12807973, upload-time = "2025-09-19T00:10:35.282Z" },
+ { url = "https://files.pythonhosted.org/packages/10/e9/420822d4f661f13ca8900f5fa239b40ee3be8b62b32f3357df9a3045a08b/mypy-1.18.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:7a780ca61fc239e4865968ebc5240bb3bf610ef59ac398de9a7421b54e4a207e", size = 11896527, upload-time = "2025-09-19T00:10:55.791Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/73/a05b2bbaa7005f4642fcfe40fb73f2b4fb6bb44229bd585b5878e9a87ef8/mypy-1.18.2-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:448acd386266989ef11662ce3c8011fd2a7b632e0ec7d61a98edd8e27472225b", size = 12507004, upload-time = "2025-09-19T00:11:05.411Z" },
+ { url = "https://files.pythonhosted.org/packages/4f/01/f6e4b9f0d031c11ccbd6f17da26564f3a0f3c4155af344006434b0a05a9d/mypy-1.18.2-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f9e171c465ad3901dc652643ee4bffa8e9fef4d7d0eece23b428908c77a76a66", size = 13245947, upload-time = "2025-09-19T00:10:46.923Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/97/19727e7499bfa1ae0773d06afd30ac66a58ed7437d940c70548634b24185/mypy-1.18.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:592ec214750bc00741af1f80cbf96b5013d81486b7bb24cb052382c19e40b428", size = 13499217, upload-time = "2025-09-19T00:09:39.472Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/4f/90dc8c15c1441bf31cf0f9918bb077e452618708199e530f4cbd5cede6ff/mypy-1.18.2-cp310-cp310-win_amd64.whl", hash = "sha256:7fb95f97199ea11769ebe3638c29b550b5221e997c63b14ef93d2e971606ebed", size = 9766753, upload-time = "2025-09-19T00:10:49.161Z" },
+ { url = "https://files.pythonhosted.org/packages/88/87/cafd3ae563f88f94eec33f35ff722d043e09832ea8530ef149ec1efbaf08/mypy-1.18.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:807d9315ab9d464125aa9fcf6d84fde6e1dc67da0b6f80e7405506b8ac72bc7f", size = 12731198, upload-time = "2025-09-19T00:09:44.857Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/e0/1e96c3d4266a06d4b0197ace5356d67d937d8358e2ee3ffac71faa843724/mypy-1.18.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:776bb00de1778caf4db739c6e83919c1d85a448f71979b6a0edd774ea8399341", size = 11817879, upload-time = "2025-09-19T00:09:47.131Z" },
+ { url = "https://files.pythonhosted.org/packages/72/ef/0c9ba89eb03453e76bdac5a78b08260a848c7bfc5d6603634774d9cd9525/mypy-1.18.2-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1379451880512ffce14505493bd9fe469e0697543717298242574882cf8cdb8d", size = 12427292, upload-time = "2025-09-19T00:10:22.472Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/52/ec4a061dd599eb8179d5411d99775bec2a20542505988f40fc2fee781068/mypy-1.18.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1331eb7fd110d60c24999893320967594ff84c38ac6d19e0a76c5fd809a84c86", size = 13163750, upload-time = "2025-09-19T00:09:51.472Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/5f/2cf2ceb3b36372d51568f2208c021870fe7834cf3186b653ac6446511839/mypy-1.18.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:3ca30b50a51e7ba93b00422e486cbb124f1c56a535e20eff7b2d6ab72b3b2e37", size = 13351827, upload-time = "2025-09-19T00:09:58.311Z" },
+ { url = "https://files.pythonhosted.org/packages/c8/7d/2697b930179e7277529eaaec1513f8de622818696857f689e4a5432e5e27/mypy-1.18.2-cp311-cp311-win_amd64.whl", hash = "sha256:664dc726e67fa54e14536f6e1224bcfce1d9e5ac02426d2326e2bb4e081d1ce8", size = 9757983, upload-time = "2025-09-19T00:10:09.071Z" },
+ { url = "https://files.pythonhosted.org/packages/07/06/dfdd2bc60c66611dd8335f463818514733bc763e4760dee289dcc33df709/mypy-1.18.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:33eca32dd124b29400c31d7cf784e795b050ace0e1f91b8dc035672725617e34", size = 12908273, upload-time = "2025-09-19T00:10:58.321Z" },
+ { url = "https://files.pythonhosted.org/packages/81/14/6a9de6d13a122d5608e1a04130724caf9170333ac5a924e10f670687d3eb/mypy-1.18.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a3c47adf30d65e89b2dcd2fa32f3aeb5e94ca970d2c15fcb25e297871c8e4764", size = 11920910, upload-time = "2025-09-19T00:10:20.043Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/a9/b29de53e42f18e8cc547e38daa9dfa132ffdc64f7250e353f5c8cdd44bee/mypy-1.18.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d6c838e831a062f5f29d11c9057c6009f60cb294fea33a98422688181fe2893", size = 12465585, upload-time = "2025-09-19T00:10:33.005Z" },
+ { url = "https://files.pythonhosted.org/packages/77/ae/6c3d2c7c61ff21f2bee938c917616c92ebf852f015fb55917fd6e2811db2/mypy-1.18.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:01199871b6110a2ce984bde85acd481232d17413868c9807e95c1b0739a58914", size = 13348562, upload-time = "2025-09-19T00:10:11.51Z" },
+ { url = "https://files.pythonhosted.org/packages/4d/31/aec68ab3b4aebdf8f36d191b0685d99faa899ab990753ca0fee60fb99511/mypy-1.18.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a2afc0fa0b0e91b4599ddfe0f91e2c26c2b5a5ab263737e998d6817874c5f7c8", size = 13533296, upload-time = "2025-09-19T00:10:06.568Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/83/abcb3ad9478fca3ebeb6a5358bb0b22c95ea42b43b7789c7fb1297ca44f4/mypy-1.18.2-cp312-cp312-win_amd64.whl", hash = "sha256:d8068d0afe682c7c4897c0f7ce84ea77f6de953262b12d07038f4d296d547074", size = 9828828, upload-time = "2025-09-19T00:10:28.203Z" },
+ { url = "https://files.pythonhosted.org/packages/87/e3/be76d87158ebafa0309946c4a73831974d4d6ab4f4ef40c3b53a385a66fd/mypy-1.18.2-py3-none-any.whl", hash = "sha256:22a1748707dd62b58d2ae53562ffc4d7f8bcc727e8ac7cbc69c053ddc874d47e", size = 2352367, upload-time = "2025-09-19T00:10:15.489Z" },
]
[[package]]
@@ -928,11 +953,11 @@ wheels = [
[[package]]
name = "platformdirs"
-version = "4.3.8"
+version = "4.5.0"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/fe/8b/3c73abc9c759ecd3f1f7ceff6685840859e8070c4d947c93fae71f6a0bf2/platformdirs-4.3.8.tar.gz", hash = "sha256:3d512d96e16bcb959a814c9f348431070822a6496326a4be0911c40b5a74c2bc", size = 21362, upload-time = "2025-05-07T22:47:42.121Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/61/33/9611380c2bdb1225fdef633e2a9610622310fed35ab11dac9620972ee088/platformdirs-4.5.0.tar.gz", hash = "sha256:70ddccdd7c99fc5942e9fc25636a8b34d04c24b335100223152c2803e4063312", size = 21632, upload-time = "2025-10-08T17:44:48.791Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/fe/39/979e8e21520d4e47a0bbe349e2713c0aac6f3d853d0e5b34d76206c439aa/platformdirs-4.3.8-py3-none-any.whl", hash = "sha256:ff7059bb7eb1179e2685604f4aaf157cfd9535242bd23742eadc3c13542139b4", size = 18567, upload-time = "2025-05-07T22:47:40.376Z" },
+ { url = "https://files.pythonhosted.org/packages/73/cb/ac7874b3e5d58441674fb70742e6c374b28b0c7cb988d37d991cde47166c/platformdirs-4.5.0-py3-none-any.whl", hash = "sha256:e578a81bb873cbb89a41fcc904c7ef523cc18284b7e3b3ccf06aca1403b7ebd3", size = 18651, upload-time = "2025-10-08T17:44:47.223Z" },
]
[[package]]
@@ -945,7 +970,7 @@ wheels = [
[[package]]
name = "pre-commit"
-version = "3.8.0"
+version = "4.4.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cfgv" },
@@ -954,9 +979,9 @@ dependencies = [
{ name = "pyyaml" },
{ name = "virtualenv" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/64/10/97ee2fa54dff1e9da9badbc5e35d0bbaef0776271ea5907eccf64140f72f/pre_commit-3.8.0.tar.gz", hash = "sha256:8bb6494d4a20423842e198980c9ecf9f96607a07ea29549e180eef9ae80fe7af", size = 177815, upload-time = "2024-07-28T19:59:01.538Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/a6/49/7845c2d7bf6474efd8e27905b51b11e6ce411708c91e829b93f324de9929/pre_commit-4.4.0.tar.gz", hash = "sha256:f0233ebab440e9f17cabbb558706eb173d19ace965c68cdce2c081042b4fab15", size = 197501, upload-time = "2025-11-08T21:12:11.607Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/07/92/caae8c86e94681b42c246f0bca35c059a2f0529e5b92619f6aba4cf7e7b6/pre_commit-3.8.0-py2.py3-none-any.whl", hash = "sha256:9a90a53bf82fdd8778d58085faf8d83df56e40dfe18f45b19446e26bf1b3a63f", size = 204643, upload-time = "2024-07-28T19:58:59.335Z" },
+ { url = "https://files.pythonhosted.org/packages/27/11/574fe7d13acf30bfd0a8dd7fa1647040f2b8064f13f43e8c963b1e65093b/pre_commit-4.4.0-py2.py3-none-any.whl", hash = "sha256:b35ea52957cbf83dcc5d8ee636cbead8624e3a15fbfa61a370e42158ac8a5813", size = 226049, upload-time = "2025-11-08T21:12:10.228Z" },
]
[[package]]
@@ -1018,30 +1043,31 @@ wheels = [
[[package]]
name = "protobuf"
-version = "4.25.8"
+version = "6.33.0"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/df/01/34c8d2b6354906d728703cb9d546a0e534de479e25f1b581e4094c4a85cc/protobuf-4.25.8.tar.gz", hash = "sha256:6135cf8affe1fc6f76cced2641e4ea8d3e59518d1f24ae41ba97bcad82d397cd", size = 380920, upload-time = "2025-05-28T14:22:25.153Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/19/ff/64a6c8f420818bb873713988ca5492cba3a7946be57e027ac63495157d97/protobuf-6.33.0.tar.gz", hash = "sha256:140303d5c8d2037730c548f8c7b93b20bb1dc301be280c378b82b8894589c954", size = 443463, upload-time = "2025-10-15T20:39:52.159Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/45/ff/05f34305fe6b85bbfbecbc559d423a5985605cad5eda4f47eae9e9c9c5c5/protobuf-4.25.8-cp310-abi3-win32.whl", hash = "sha256:504435d831565f7cfac9f0714440028907f1975e4bed228e58e72ecfff58a1e0", size = 392745, upload-time = "2025-05-28T14:22:10.524Z" },
- { url = "https://files.pythonhosted.org/packages/08/35/8b8a8405c564caf4ba835b1fdf554da869954712b26d8f2a98c0e434469b/protobuf-4.25.8-cp310-abi3-win_amd64.whl", hash = "sha256:bd551eb1fe1d7e92c1af1d75bdfa572eff1ab0e5bf1736716814cdccdb2360f9", size = 413736, upload-time = "2025-05-28T14:22:13.156Z" },
- { url = "https://files.pythonhosted.org/packages/28/d7/ab27049a035b258dab43445eb6ec84a26277b16105b277cbe0a7698bdc6c/protobuf-4.25.8-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:ca809b42f4444f144f2115c4c1a747b9a404d590f18f37e9402422033e464e0f", size = 394537, upload-time = "2025-05-28T14:22:14.768Z" },
- { url = "https://files.pythonhosted.org/packages/bd/6d/a4a198b61808dd3d1ee187082ccc21499bc949d639feb948961b48be9a7e/protobuf-4.25.8-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:9ad7ef62d92baf5a8654fbb88dac7fa5594cfa70fd3440488a5ca3bfc6d795a7", size = 294005, upload-time = "2025-05-28T14:22:16.052Z" },
- { url = "https://files.pythonhosted.org/packages/d6/c6/c9deaa6e789b6fc41b88ccbdfe7a42d2b82663248b715f55aa77fbc00724/protobuf-4.25.8-cp37-abi3-manylinux2014_x86_64.whl", hash = "sha256:83e6e54e93d2b696a92cad6e6efc924f3850f82b52e1563778dfab8b355101b0", size = 294924, upload-time = "2025-05-28T14:22:17.105Z" },
- { url = "https://files.pythonhosted.org/packages/0c/c1/6aece0ab5209981a70cd186f164c133fdba2f51e124ff92b73de7fd24d78/protobuf-4.25.8-py3-none-any.whl", hash = "sha256:15a0af558aa3b13efef102ae6e4f3efac06f1eea11afb3a57db2901447d9fb59", size = 156757, upload-time = "2025-05-28T14:22:24.135Z" },
+ { url = "https://files.pythonhosted.org/packages/7e/ee/52b3fa8feb6db4a833dfea4943e175ce645144532e8a90f72571ad85df4e/protobuf-6.33.0-cp310-abi3-win32.whl", hash = "sha256:d6101ded078042a8f17959eccd9236fb7a9ca20d3b0098bbcb91533a5680d035", size = 425593, upload-time = "2025-10-15T20:39:40.29Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/c6/7a465f1825872c55e0341ff4a80198743f73b69ce5d43ab18043699d1d81/protobuf-6.33.0-cp310-abi3-win_amd64.whl", hash = "sha256:9a031d10f703f03768f2743a1c403af050b6ae1f3480e9c140f39c45f81b13ee", size = 436882, upload-time = "2025-10-15T20:39:42.841Z" },
+ { url = "https://files.pythonhosted.org/packages/e1/a9/b6eee662a6951b9c3640e8e452ab3e09f117d99fc10baa32d1581a0d4099/protobuf-6.33.0-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:905b07a65f1a4b72412314082c7dbfae91a9e8b68a0cc1577515f8df58ecf455", size = 427521, upload-time = "2025-10-15T20:39:43.803Z" },
+ { url = "https://files.pythonhosted.org/packages/10/35/16d31e0f92c6d2f0e77c2a3ba93185130ea13053dd16200a57434c882f2b/protobuf-6.33.0-cp39-abi3-manylinux2014_aarch64.whl", hash = "sha256:e0697ece353e6239b90ee43a9231318302ad8353c70e6e45499fa52396debf90", size = 324445, upload-time = "2025-10-15T20:39:44.932Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/eb/2a981a13e35cda8b75b5585aaffae2eb904f8f351bdd3870769692acbd8a/protobuf-6.33.0-cp39-abi3-manylinux2014_s390x.whl", hash = "sha256:e0a1715e4f27355afd9570f3ea369735afc853a6c3951a6afe1f80d8569ad298", size = 339159, upload-time = "2025-10-15T20:39:46.186Z" },
+ { url = "https://files.pythonhosted.org/packages/21/51/0b1cbad62074439b867b4e04cc09b93f6699d78fd191bed2bbb44562e077/protobuf-6.33.0-cp39-abi3-manylinux2014_x86_64.whl", hash = "sha256:35be49fd3f4fefa4e6e2aacc35e8b837d6703c37a2168a55ac21e9b1bc7559ef", size = 323172, upload-time = "2025-10-15T20:39:47.465Z" },
+ { url = "https://files.pythonhosted.org/packages/07/d1/0a28c21707807c6aacd5dc9c3704b2aa1effbf37adebd8caeaf68b17a636/protobuf-6.33.0-py3-none-any.whl", hash = "sha256:25c9e1963c6734448ea2d308cfa610e692b801304ba0908d7bfa564ac5132995", size = 170477, upload-time = "2025-10-15T20:39:51.311Z" },
]
[[package]]
name = "pycaption"
-version = "2.2.16"
+version = "2.2.19"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "beautifulsoup4" },
{ name = "cssutils" },
{ name = "lxml" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/cc/47/afeb8a2219d2f9b2aae74b355f3881a02f7aab11a2e6779b8c6a36c0c409/pycaption-2.2.16.tar.gz", hash = "sha256:aa36aa0909e6893d653eafddf643d4e201887b2ded4a4013e6c468bd45a80f50", size = 272837, upload-time = "2025-01-14T15:39:12.421Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/47/d1/b332cc006803e69cffe50862d36e081728974893519218abe7b9de63f7c1/pycaption-2.2.19.tar.gz", hash = "sha256:2c57d7c61eaf545e08838d4129c21954cbd39b6e960e414be1290c449d42942a", size = 111300, upload-time = "2025-09-30T07:15:24.438Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/0c/cb/ece3724946d4030e2ebfb8e0e7f3e8b2be504e2aa9f347e31f527c4edbff/pycaption-2.2.16-py3-none-any.whl", hash = "sha256:03b6bbf01f7d4f98d16905bd1a409d05443a8f7f6cb550a1aa70869b55cf5845", size = 287139, upload-time = "2025-01-14T15:39:09.579Z" },
+ { url = "https://files.pythonhosted.org/packages/35/c7/d13c57e5a3408df2e5d910853e957ec8e253b41ba531e0f32036c8321240/pycaption-2.2.19-py3-none-any.whl", hash = "sha256:7eb84a05d40bb80400689f9431d05d8b77dec6535938b419ebed2c9d67283a4f", size = 124970, upload-time = "2025-09-30T07:15:21.945Z" },
]
[[package]]
@@ -1101,6 +1127,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/f2/5f/af7da8e6f1e42b52f44a24d08b8e4c726207434e2593732d39e7af5e7256/pycryptodomex-3.23.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:14c37aaece158d0ace436f76a7bb19093db3b4deade9797abfc39ec6cd6cc2fe", size = 1806478, upload-time = "2025-05-17T17:23:26.066Z" },
]
+[[package]]
+name = "pyexecjs"
+version = "1.5.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "six" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/ba/8e/aedef81641c8dca6fd0fb7294de5bed9c45f3397d67fddf755c1042c2642/PyExecJS-1.5.1.tar.gz", hash = "sha256:34cc1d070976918183ff7bdc0ad71f8157a891c92708c00c5fbbff7a769f505c", size = 13344, upload-time = "2018-01-18T04:33:55.126Z" }
+
[[package]]
name = "pygments"
version = "2.19.2"
@@ -1121,13 +1156,15 @@ wheels = [
[[package]]
name = "pymediainfo"
-version = "6.1.0"
+version = "7.0.1"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/0f/ed/a02b18943f9162644f90354fe6445410e942c857dd21ded758f630ba41c0/pymediainfo-6.1.0.tar.gz", hash = "sha256:186a0b41a94524f0984d085ca6b945c79a254465b7097f2560dc0c04e8d1d8a5", size = 446466, upload-time = "2023-10-29T16:15:24.336Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/4d/80/80a6fb21005b81e30f6193d45cba13857df09f5d483e0551fa6fbb3aaeed/pymediainfo-7.0.1.tar.gz", hash = "sha256:0d5df59ecc615e24c56f303b8f651579c6accab7265715e5d429186d7ba21514", size = 441563, upload-time = "2025-02-12T14:33:15.038Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/20/47/675bac72d4ab5031f69179b1b8e617a57f766ba063b724a82023da06b9c9/pymediainfo-6.1.0-py3-none-macosx_10_15_x86_64.whl", hash = "sha256:69d5d3cac056c24a68a1129216aec82a222e804a35f88567c08585e8ae7ef2b3", size = 6322766, upload-time = "2023-10-29T16:23:45.953Z" },
- { url = "https://files.pythonhosted.org/packages/62/13/e3938e187a0c48057ffecc031d473b5739d13056d3eefb8b14b7652c27e1/pymediainfo-6.1.0-py3-none-win32.whl", hash = "sha256:19e6baeca3db71f12ed568f056de899fd02f380990fb10f43a26c32b67362dd3", size = 2762429, upload-time = "2023-10-29T16:39:38.457Z" },
- { url = "https://files.pythonhosted.org/packages/61/36/369234e2f568e11ed3195292cd89c60e7444346eb1dd25bf5d996dc5d78a/pymediainfo-6.1.0-py3-none-win_amd64.whl", hash = "sha256:f3f6bad666c65ac993dd8d64f45a2b26ba2acd50f9875f74cebb624dbf2f8da0", size = 2963468, upload-time = "2023-10-29T16:41:33.145Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/4a/d895646df3d3ff617b54d7f06a02ed9d6f5b86673030a543927310e0f7ed/pymediainfo-7.0.1-py3-none-macosx_10_10_universal2.whl", hash = "sha256:286f3bf6299be0997093254e0f371855bc5cf2aaf8641d19455a011e3ee3a84d", size = 6983332, upload-time = "2025-02-12T14:42:47.412Z" },
+ { url = "https://files.pythonhosted.org/packages/77/df/bc6b5a08e908c64a81f6ff169716d408ce7380ceff44e1eceb095f49e0dc/pymediainfo-7.0.1-py3-none-manylinux_2_27_aarch64.whl", hash = "sha256:3648e2379fa67bd02433d1e28c707df3a53834dd480680615a9fefd2266f1182", size = 5768082, upload-time = "2025-02-12T14:33:10.543Z" },
+ { url = "https://files.pythonhosted.org/packages/02/10/a9bc1446a48d3a15940eb1af79a71978f368f27e2cc86f9ec3ec2d206a20/pymediainfo-7.0.1-py3-none-manylinux_2_27_x86_64.whl", hash = "sha256:cde98112f1ce486589b17a12e5da42085faea996224f7c67fa45b8c1dca719c6", size = 6001553, upload-time = "2025-02-12T14:33:12.663Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/7f/c48f8514cb60c9ff9be81b6f383e73e66c7461ef854a1b62628e3c823f13/pymediainfo-7.0.1-py3-none-win32.whl", hash = "sha256:01bcaf82b72cefbf4b96f13b2547e1b2e0e734bab7173d7c33f7f01acc07c98b", size = 3125046, upload-time = "2025-02-12T15:04:39.89Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/26/9d50c2a330541bc36c0ea7ce29eeff5b0c35c2624139660df8bcfa9ae3ce/pymediainfo-7.0.1-py3-none-win_amd64.whl", hash = "sha256:13224fa7590e198763b8baf072e704ea81d334e71aa32a469091460e243893c7", size = 3271232, upload-time = "2025-02-12T15:07:13.672Z" },
]
[[package]]
@@ -1144,11 +1181,11 @@ wheels = [
[[package]]
name = "pymysql"
-version = "1.1.1"
+version = "1.1.2"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/b3/8f/ce59b5e5ed4ce8512f879ff1fa5ab699d211ae2495f1adaa5fbba2a1eada/pymysql-1.1.1.tar.gz", hash = "sha256:e127611aaf2b417403c60bf4dc570124aeb4a57f5f37b8e95ae399a42f904cd0", size = 47678, upload-time = "2024-05-21T11:03:43.722Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/f5/ae/1fe3fcd9f959efa0ebe200b8de88b5a5ce3e767e38c7ac32fb179f16a388/pymysql-1.1.2.tar.gz", hash = "sha256:4961d3e165614ae65014e361811a724e2044ad3ea3739de9903ae7c21f539f03", size = 48258, upload-time = "2025-08-24T12:55:55.146Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/0c/94/e4181a1f6286f545507528c78016e00065ea913276888db2262507693ce5/PyMySQL-1.1.1-py3-none-any.whl", hash = "sha256:4de15da4c61dc132f4fb9ab763063e693d521a80fd0e87943b9a453dd4c19d6c", size = 44972, upload-time = "2024-05-21T11:03:41.216Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/4c/ad33b92b9864cbde84f259d5df035a6447f91891f5be77788e2a3892bce3/pymysql-1.1.2-py3-none-any.whl", hash = "sha256:e6b1d89711dd51f8f74b1631fe08f039e7d76cf67a42a323d3178f0f25762ed9", size = 45300, upload-time = "2025-08-24T12:55:53.394Z" },
]
[[package]]
@@ -1191,7 +1228,7 @@ wheels = [
[[package]]
name = "pywidevine"
-version = "1.8.0"
+version = "1.9.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click" },
@@ -1202,9 +1239,9 @@ dependencies = [
{ name = "requests" },
{ name = "unidecode" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/99/12/6ff0e6ffa2711187ee629392396d7c18ae6ca8e2e576dcef2d636316d667/pywidevine-1.8.0.tar.gz", hash = "sha256:c14f3fe2864473416b9caa73d9a21251a02d72138e6d54d8c1a3f44b7a6b05c9", size = 76406, upload-time = "2023-12-22T11:13:12.556Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/52/b6/4855cb958892653029f3cafa8a4724d554b847de0a43a3808cea109b9e78/pywidevine-1.9.0.tar.gz", hash = "sha256:6742daf5fd797c5a4813eb1300efb3181ffcddd0c8c478ee28c7c536aa0e51b2", size = 75511, upload-time = "2025-10-27T09:13:15.909Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/41/9f/60f8a4c8e7767a8c34f5c42428662e03fa3e38ad18ba41fcc5370ee43263/pywidevine-1.8.0-py3-none-any.whl", hash = "sha256:1ecf029ce562789b18bbbd64604596d15645aadf413b255cf0fafc8d8b06659d", size = 70476, upload-time = "2023-12-22T11:13:10.84Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/e2/e692647f12008495654868e99c972696f2b8cb39637b507c409f5f99b4b6/pywidevine-1.9.0-py3-none-any.whl", hash = "sha256:70b5726abc2c3fe763f070da853b7c87d6aeb6131a27778743187258bf97e492", size = 70419, upload-time = "2025-10-27T09:13:14.303Z" },
]
[package.optional-dependencies]
@@ -1214,42 +1251,43 @@ serve = [
[[package]]
name = "pyyaml"
-version = "6.0.2"
+version = "6.0.3"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631, upload-time = "2024-08-06T20:33:50.674Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/05/8e/961c0007c59b8dd7729d542c61a4d537767a59645b82a0b521206e1e25c2/pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f", size = 130960, upload-time = "2025-09-25T21:33:16.546Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/9b/95/a3fac87cb7158e231b5a6012e438c647e1a87f09f8e0d123acec8ab8bf71/PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086", size = 184199, upload-time = "2024-08-06T20:31:40.178Z" },
- { url = "https://files.pythonhosted.org/packages/c7/7a/68bd47624dab8fd4afbfd3c48e3b79efe09098ae941de5b58abcbadff5cb/PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf", size = 171758, upload-time = "2024-08-06T20:31:42.173Z" },
- { url = "https://files.pythonhosted.org/packages/49/ee/14c54df452143b9ee9f0f29074d7ca5516a36edb0b4cc40c3f280131656f/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237", size = 718463, upload-time = "2024-08-06T20:31:44.263Z" },
- { url = "https://files.pythonhosted.org/packages/4d/61/de363a97476e766574650d742205be468921a7b532aa2499fcd886b62530/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b", size = 719280, upload-time = "2024-08-06T20:31:50.199Z" },
- { url = "https://files.pythonhosted.org/packages/6b/4e/1523cb902fd98355e2e9ea5e5eb237cbc5f3ad5f3075fa65087aa0ecb669/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed", size = 751239, upload-time = "2024-08-06T20:31:52.292Z" },
- { url = "https://files.pythonhosted.org/packages/b7/33/5504b3a9a4464893c32f118a9cc045190a91637b119a9c881da1cf6b7a72/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180", size = 695802, upload-time = "2024-08-06T20:31:53.836Z" },
- { url = "https://files.pythonhosted.org/packages/5c/20/8347dcabd41ef3a3cdc4f7b7a2aff3d06598c8779faa189cdbf878b626a4/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68", size = 720527, upload-time = "2024-08-06T20:31:55.565Z" },
- { url = "https://files.pythonhosted.org/packages/be/aa/5afe99233fb360d0ff37377145a949ae258aaab831bde4792b32650a4378/PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99", size = 144052, upload-time = "2024-08-06T20:31:56.914Z" },
- { url = "https://files.pythonhosted.org/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e", size = 161774, upload-time = "2024-08-06T20:31:58.304Z" },
- { url = "https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612, upload-time = "2024-08-06T20:32:03.408Z" },
- { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040, upload-time = "2024-08-06T20:32:04.926Z" },
- { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829, upload-time = "2024-08-06T20:32:06.459Z" },
- { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167, upload-time = "2024-08-06T20:32:08.338Z" },
- { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952, upload-time = "2024-08-06T20:32:14.124Z" },
- { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301, upload-time = "2024-08-06T20:32:16.17Z" },
- { url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638, upload-time = "2024-08-06T20:32:18.555Z" },
- { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850, upload-time = "2024-08-06T20:32:19.889Z" },
- { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980, upload-time = "2024-08-06T20:32:21.273Z" },
- { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873, upload-time = "2024-08-06T20:32:25.131Z" },
- { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302, upload-time = "2024-08-06T20:32:26.511Z" },
- { url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154, upload-time = "2024-08-06T20:32:28.363Z" },
- { url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223, upload-time = "2024-08-06T20:32:30.058Z" },
- { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542, upload-time = "2024-08-06T20:32:31.881Z" },
- { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164, upload-time = "2024-08-06T20:32:37.083Z" },
- { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611, upload-time = "2024-08-06T20:32:38.898Z" },
- { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591, upload-time = "2024-08-06T20:32:40.241Z" },
- { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338, upload-time = "2024-08-06T20:32:41.93Z" },
+ { url = "https://files.pythonhosted.org/packages/f4/a0/39350dd17dd6d6c6507025c0e53aef67a9293a6d37d3511f23ea510d5800/pyyaml-6.0.3-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:214ed4befebe12df36bcc8bc2b64b396ca31be9304b8f59e25c11cf94a4c033b", size = 184227, upload-time = "2025-09-25T21:31:46.04Z" },
+ { url = "https://files.pythonhosted.org/packages/05/14/52d505b5c59ce73244f59c7a50ecf47093ce4765f116cdb98286a71eeca2/pyyaml-6.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:02ea2dfa234451bbb8772601d7b8e426c2bfa197136796224e50e35a78777956", size = 174019, upload-time = "2025-09-25T21:31:47.706Z" },
+ { url = "https://files.pythonhosted.org/packages/43/f7/0e6a5ae5599c838c696adb4e6330a59f463265bfa1e116cfd1fbb0abaaae/pyyaml-6.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b30236e45cf30d2b8e7b3e85881719e98507abed1011bf463a8fa23e9c3e98a8", size = 740646, upload-time = "2025-09-25T21:31:49.21Z" },
+ { url = "https://files.pythonhosted.org/packages/2f/3a/61b9db1d28f00f8fd0ae760459a5c4bf1b941baf714e207b6eb0657d2578/pyyaml-6.0.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:66291b10affd76d76f54fad28e22e51719ef9ba22b29e1d7d03d6777a9174198", size = 840793, upload-time = "2025-09-25T21:31:50.735Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/1e/7acc4f0e74c4b3d9531e24739e0ab832a5edf40e64fbae1a9c01941cabd7/pyyaml-6.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9c7708761fccb9397fe64bbc0395abcae8c4bf7b0eac081e12b809bf47700d0b", size = 770293, upload-time = "2025-09-25T21:31:51.828Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/ef/abd085f06853af0cd59fa5f913d61a8eab65d7639ff2a658d18a25d6a89d/pyyaml-6.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:418cf3f2111bc80e0933b2cd8cd04f286338bb88bdc7bc8e6dd775ebde60b5e0", size = 732872, upload-time = "2025-09-25T21:31:53.282Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/15/2bc9c8faf6450a8b3c9fc5448ed869c599c0a74ba2669772b1f3a0040180/pyyaml-6.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5e0b74767e5f8c593e8c9b5912019159ed0533c70051e9cce3e8b6aa699fcd69", size = 758828, upload-time = "2025-09-25T21:31:54.807Z" },
+ { url = "https://files.pythonhosted.org/packages/a3/00/531e92e88c00f4333ce359e50c19b8d1de9fe8d581b1534e35ccfbc5f393/pyyaml-6.0.3-cp310-cp310-win32.whl", hash = "sha256:28c8d926f98f432f88adc23edf2e6d4921ac26fb084b028c733d01868d19007e", size = 142415, upload-time = "2025-09-25T21:31:55.885Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/fa/926c003379b19fca39dd4634818b00dec6c62d87faf628d1394e137354d4/pyyaml-6.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:bdb2c67c6c1390b63c6ff89f210c8fd09d9a1217a465701eac7316313c915e4c", size = 158561, upload-time = "2025-09-25T21:31:57.406Z" },
+ { url = "https://files.pythonhosted.org/packages/6d/16/a95b6757765b7b031c9374925bb718d55e0a9ba8a1b6a12d25962ea44347/pyyaml-6.0.3-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:44edc647873928551a01e7a563d7452ccdebee747728c1080d881d68af7b997e", size = 185826, upload-time = "2025-09-25T21:31:58.655Z" },
+ { url = "https://files.pythonhosted.org/packages/16/19/13de8e4377ed53079ee996e1ab0a9c33ec2faf808a4647b7b4c0d46dd239/pyyaml-6.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:652cb6edd41e718550aad172851962662ff2681490a8a711af6a4d288dd96824", size = 175577, upload-time = "2025-09-25T21:32:00.088Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/62/d2eb46264d4b157dae1275b573017abec435397aa59cbcdab6fc978a8af4/pyyaml-6.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:10892704fc220243f5305762e276552a0395f7beb4dbf9b14ec8fd43b57f126c", size = 775556, upload-time = "2025-09-25T21:32:01.31Z" },
+ { url = "https://files.pythonhosted.org/packages/10/cb/16c3f2cf3266edd25aaa00d6c4350381c8b012ed6f5276675b9eba8d9ff4/pyyaml-6.0.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:850774a7879607d3a6f50d36d04f00ee69e7fc816450e5f7e58d7f17f1ae5c00", size = 882114, upload-time = "2025-09-25T21:32:03.376Z" },
+ { url = "https://files.pythonhosted.org/packages/71/60/917329f640924b18ff085ab889a11c763e0b573da888e8404ff486657602/pyyaml-6.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b8bb0864c5a28024fac8a632c443c87c5aa6f215c0b126c449ae1a150412f31d", size = 806638, upload-time = "2025-09-25T21:32:04.553Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/6f/529b0f316a9fd167281a6c3826b5583e6192dba792dd55e3203d3f8e655a/pyyaml-6.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1d37d57ad971609cf3c53ba6a7e365e40660e3be0e5175fa9f2365a379d6095a", size = 767463, upload-time = "2025-09-25T21:32:06.152Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/6a/b627b4e0c1dd03718543519ffb2f1deea4a1e6d42fbab8021936a4d22589/pyyaml-6.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:37503bfbfc9d2c40b344d06b2199cf0e96e97957ab1c1b546fd4f87e53e5d3e4", size = 794986, upload-time = "2025-09-25T21:32:07.367Z" },
+ { url = "https://files.pythonhosted.org/packages/45/91/47a6e1c42d9ee337c4839208f30d9f09caa9f720ec7582917b264defc875/pyyaml-6.0.3-cp311-cp311-win32.whl", hash = "sha256:8098f252adfa6c80ab48096053f512f2321f0b998f98150cea9bd23d83e1467b", size = 142543, upload-time = "2025-09-25T21:32:08.95Z" },
+ { url = "https://files.pythonhosted.org/packages/da/e3/ea007450a105ae919a72393cb06f122f288ef60bba2dc64b26e2646fa315/pyyaml-6.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:9f3bfb4965eb874431221a3ff3fdcddc7e74e3b07799e0e84ca4a0f867d449bf", size = 158763, upload-time = "2025-09-25T21:32:09.96Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/33/422b98d2195232ca1826284a76852ad5a86fe23e31b009c9886b2d0fb8b2/pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196", size = 182063, upload-time = "2025-09-25T21:32:11.445Z" },
+ { url = "https://files.pythonhosted.org/packages/89/a0/6cf41a19a1f2f3feab0e9c0b74134aa2ce6849093d5517a0c550fe37a648/pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0", size = 173973, upload-time = "2025-09-25T21:32:12.492Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/23/7a778b6bd0b9a8039df8b1b1d80e2e2ad78aa04171592c8a5c43a56a6af4/pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28", size = 775116, upload-time = "2025-09-25T21:32:13.652Z" },
+ { url = "https://files.pythonhosted.org/packages/65/30/d7353c338e12baef4ecc1b09e877c1970bd3382789c159b4f89d6a70dc09/pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c", size = 844011, upload-time = "2025-09-25T21:32:15.21Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/9d/b3589d3877982d4f2329302ef98a8026e7f4443c765c46cfecc8858c6b4b/pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc", size = 807870, upload-time = "2025-09-25T21:32:16.431Z" },
+ { url = "https://files.pythonhosted.org/packages/05/c0/b3be26a015601b822b97d9149ff8cb5ead58c66f981e04fedf4e762f4bd4/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e", size = 761089, upload-time = "2025-09-25T21:32:17.56Z" },
+ { url = "https://files.pythonhosted.org/packages/be/8e/98435a21d1d4b46590d5459a22d88128103f8da4c2d4cb8f14f2a96504e1/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea", size = 790181, upload-time = "2025-09-25T21:32:18.834Z" },
+ { url = "https://files.pythonhosted.org/packages/74/93/7baea19427dcfbe1e5a372d81473250b379f04b1bd3c4c5ff825e2327202/pyyaml-6.0.3-cp312-cp312-win32.whl", hash = "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5", size = 137658, upload-time = "2025-09-25T21:32:20.209Z" },
+ { url = "https://files.pythonhosted.org/packages/86/bf/899e81e4cce32febab4fb42bb97dcdf66bc135272882d1987881a4b519e9/pyyaml-6.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b", size = 154003, upload-time = "2025-09-25T21:32:21.167Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/08/67bd04656199bbb51dbed1439b7f27601dfb576fb864099c7ef0c3e55531/pyyaml-6.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd", size = 140344, upload-time = "2025-09-25T21:32:22.617Z" },
]
[[package]]
name = "requests"
-version = "2.32.4"
+version = "2.32.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
@@ -1257,9 +1295,9 @@ dependencies = [
{ name = "idna" },
{ name = "urllib3" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/e1/0a/929373653770d8a0d7ea76c37de6e41f11eb07559b103b1c02cafb3f7cf8/requests-2.32.4.tar.gz", hash = "sha256:27d0316682c8a29834d3264820024b62a36942083d52caf2f14c0591336d3422", size = 135258, upload-time = "2025-06-09T16:43:07.34Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/7c/e4/56027c4a6b4ae70ca9de302488c5ca95ad4a39e190093d6c1a8ace08341b/requests-2.32.4-py3-none-any.whl", hash = "sha256:27babd3cda2a6d50b30443204ee89830707d396671944c998b5975b031ac2b2c", size = 64847, upload-time = "2025-06-09T16:43:05.728Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
]
[package.optional-dependencies]
@@ -1267,18 +1305,29 @@ socks = [
{ name = "pysocks" },
]
+[[package]]
+name = "rfc3339-validator"
+version = "0.1.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "six" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/28/ea/a9387748e2d111c3c2b275ba970b735e04e15cdb1eb30693b6b5708c4dbd/rfc3339_validator-0.1.4.tar.gz", hash = "sha256:138a2abdf93304ad60530167e51d2dfb9549521a836871b88d7f4695d0022f6b", size = 5513, upload-time = "2021-05-12T16:37:54.178Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl", hash = "sha256:24f6ec1eda14ef823da9e36ec7113124b39c04d50a4d3d3a3c2859577e7791fa", size = 3490, upload-time = "2021-05-12T16:37:52.536Z" },
+]
+
[[package]]
name = "rich"
-version = "13.9.4"
+version = "14.2.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "markdown-it-py" },
{ name = "pygments" },
- { name = "typing-extensions", marker = "python_full_version < '3.11'" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/ab/3a/0316b28d0761c6734d6bc14e770d85506c986c85ffb239e688eeaab2c2bc/rich-13.9.4.tar.gz", hash = "sha256:439594978a49a09530cff7ebc4b5c7103ef57baf48d5ea3184f21d9a2befa098", size = 223149, upload-time = "2024-11-01T16:43:57.873Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/fb/d2/8920e102050a0de7bfabeb4c4614a49248cf8d5d7a8d01885fbb24dc767a/rich-14.2.0.tar.gz", hash = "sha256:73ff50c7c0c1c77c8243079283f4edb376f0f6442433aecb8ce7e6d0b92d1fe4", size = 219990, upload-time = "2025-10-09T14:16:53.064Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/19/71/39c7c0d87f8d4e6c020a393182060eaefeeae6c01dab6a84ec346f2567df/rich-13.9.4-py3-none-any.whl", hash = "sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90", size = 242424, upload-time = "2024-11-01T16:43:55.817Z" },
+ { url = "https://files.pythonhosted.org/packages/25/7a/b0178788f8dc6cafce37a212c99565fa1fe7872c70c6c9c1e1a372d9d88f/rich-14.2.0-py3-none-any.whl", hash = "sha256:76bc51fe2e57d2b1be1f96c524b890b816e334ab4c1e45888799bfaab0021edd", size = 243393, upload-time = "2025-10-09T14:16:51.245Z" },
]
[[package]]
@@ -1342,35 +1391,37 @@ wheels = [
[[package]]
name = "ruff"
-version = "0.3.7"
+version = "0.14.4"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/ba/6a/5cdb9e5ae04210ddc5b7b6cf31aeca50654de595e73e59961ce1a662656c/ruff-0.3.7.tar.gz", hash = "sha256:d5c1aebee5162c2226784800ae031f660c350e7a3402c4d1f8ea4e97e232e3ba", size = 2164419, upload-time = "2024-04-12T03:54:39.76Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/df/55/cccfca45157a2031dcbb5a462a67f7cf27f8b37d4b3b1cd7438f0f5c1df6/ruff-0.14.4.tar.gz", hash = "sha256:f459a49fe1085a749f15414ca76f61595f1a2cc8778ed7c279b6ca2e1fd19df3", size = 5587844, upload-time = "2025-11-06T22:07:45.033Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/c5/59/8416cddfcc65b710d79374358a81632f2c4810326e8391d5a3c23f1cc422/ruff-0.3.7-py3-none-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:0e8377cccb2f07abd25e84fc5b2cbe48eeb0fea9f1719cad7caedb061d70e5ce", size = 16845547, upload-time = "2024-04-12T03:53:41.335Z" },
- { url = "https://files.pythonhosted.org/packages/94/db/79298ddaddad3ddb7799fe995d508c49c5f83dbcc1a0f88d672105776906/ruff-0.3.7-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:15a4d1cc1e64e556fa0d67bfd388fed416b7f3b26d5d1c3e7d192c897e39ba4b", size = 8634118, upload-time = "2024-04-12T03:53:46.24Z" },
- { url = "https://files.pythonhosted.org/packages/00/5c/bea349c531f50b8462470b49e5eff11a860f63b2796d8643d4e4e0722b64/ruff-0.3.7-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d28bdf3d7dc71dd46929fafeec98ba89b7c3550c3f0978e36389b5631b793663", size = 8282193, upload-time = "2024-04-12T03:53:49.73Z" },
- { url = "https://files.pythonhosted.org/packages/c2/5d/62593a1ec896c07a497fb653fa269595772abc15ce8306d6edda94aa3b54/ruff-0.3.7-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:379b67d4f49774ba679593b232dcd90d9e10f04d96e3c8ce4a28037ae473f7bb", size = 7655584, upload-time = "2024-04-12T03:53:53.012Z" },
- { url = "https://files.pythonhosted.org/packages/b7/b9/00ecf95ea51f82ab68430851d13266a892c60a23c5058604494d5e474bbf/ruff-0.3.7-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c060aea8ad5ef21cdfbbe05475ab5104ce7827b639a78dd55383a6e9895b7c51", size = 8843361, upload-time = "2024-04-12T03:53:57.06Z" },
- { url = "https://files.pythonhosted.org/packages/83/bb/94d0d8f9ae71f6a5384ed6bc2dfd3fd651148604b4aaec9bd44d0754ba1c/ruff-0.3.7-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:ebf8f615dde968272d70502c083ebf963b6781aacd3079081e03b32adfe4d58a", size = 9591014, upload-time = "2024-04-12T03:54:00.447Z" },
- { url = "https://files.pythonhosted.org/packages/d0/3e/df5317d2f3915cac6a34a88cfd7a7bf7ba8d96cb92b9acd42414ac0738fc/ruff-0.3.7-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d48098bd8f5c38897b03604f5428901b65e3c97d40b3952e38637b5404b739a2", size = 9277992, upload-time = "2024-04-12T03:54:04.646Z" },
- { url = "https://files.pythonhosted.org/packages/4c/5a/202bae9d5af45ea2a49f21998dc7f6a8cc0cc7269540043f6cba5dd45cdc/ruff-0.3.7-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:da8a4fda219bf9024692b1bc68c9cff4b80507879ada8769dc7e985755d662ea", size = 10179960, upload-time = "2024-04-12T03:54:08.378Z" },
- { url = "https://files.pythonhosted.org/packages/99/b2/4b0796f93d8bd7188e47c198407f2999579599cd5a11e1ed8a66ee18b4ac/ruff-0.3.7-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c44e0149f1d8b48c4d5c33d88c677a4aa22fd09b1683d6a7ff55b816b5d074f", size = 8853914, upload-time = "2024-04-12T03:54:11.72Z" },
- { url = "https://files.pythonhosted.org/packages/02/ae/7533335b669fa879d5a36d7bb7c3cdc96b4e7d49e9da71218d3bd0a24852/ruff-0.3.7-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:3050ec0af72b709a62ecc2aca941b9cd479a7bf2b36cc4562f0033d688e44fa1", size = 8177020, upload-time = "2024-04-12T03:54:15.136Z" },
- { url = "https://files.pythonhosted.org/packages/a8/55/92a6099ea0e49d500199bc169d83158719f9bf2e1b87f5a1b53210ba74d8/ruff-0.3.7-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:a29cc38e4c1ab00da18a3f6777f8b50099d73326981bb7d182e54a9a21bb4ff7", size = 7646460, upload-time = "2024-04-12T03:54:18.574Z" },
- { url = "https://files.pythonhosted.org/packages/2f/3e/7370e849c14a8461aee6c4f0c87a784f3f2a96ac542c1056fae982cd0504/ruff-0.3.7-py3-none-musllinux_1_2_i686.whl", hash = "sha256:5b15cc59c19edca917f51b1956637db47e200b0fc5e6e1878233d3a938384b0b", size = 8446755, upload-time = "2024-04-12T03:54:22.07Z" },
- { url = "https://files.pythonhosted.org/packages/3a/5e/acae79c630de116212cd4c3346a80f34fe2b421270fa76640cf1756a62e9/ruff-0.3.7-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:e491045781b1e38b72c91247cf4634f040f8d0cb3e6d3d64d38dcf43616650b4", size = 8899196, upload-time = "2024-04-12T03:54:25.357Z" },
- { url = "https://files.pythonhosted.org/packages/d3/27/05d3398f0f00742201518c3362d0046ef3a03a50a6c1f1632e9cf36f9a9e/ruff-0.3.7-py3-none-win32.whl", hash = "sha256:bc931de87593d64fad3a22e201e55ad76271f1d5bfc44e1a1887edd0903c7d9f", size = 7766434, upload-time = "2024-04-12T03:54:28.811Z" },
- { url = "https://files.pythonhosted.org/packages/d6/f4/cdc6a5350ce8c9741f3a79ceca912045204adf20e0b4222632664b3cbd1e/ruff-0.3.7-py3-none-win_amd64.whl", hash = "sha256:5ef0e501e1e39f35e03c2acb1d1238c595b8bb36cf7a170e7c1df1b73da00e74", size = 8650055, upload-time = "2024-04-12T03:54:32.454Z" },
- { url = "https://files.pythonhosted.org/packages/20/02/8ec400f495308b4a3833f34d344ccc853ebace7ea6dfd886813c2a6de3d8/ruff-0.3.7-py3-none-win_arm64.whl", hash = "sha256:789e144f6dc7019d1f92a812891c645274ed08af6037d11fc65fcbc183b7d59f", size = 8213066, upload-time = "2024-04-12T03:54:36.402Z" },
+ { url = "https://files.pythonhosted.org/packages/17/b9/67240254166ae1eaa38dec32265e9153ac53645a6c6670ed36ad00722af8/ruff-0.14.4-py3-none-linux_armv6l.whl", hash = "sha256:e6604613ffbcf2297cd5dcba0e0ac9bd0c11dc026442dfbb614504e87c349518", size = 12606781, upload-time = "2025-11-06T22:07:01.841Z" },
+ { url = "https://files.pythonhosted.org/packages/46/c8/09b3ab245d8652eafe5256ab59718641429f68681ee713ff06c5c549f156/ruff-0.14.4-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:d99c0b52b6f0598acede45ee78288e5e9b4409d1ce7f661f0fa36d4cbeadf9a4", size = 12946765, upload-time = "2025-11-06T22:07:05.858Z" },
+ { url = "https://files.pythonhosted.org/packages/14/bb/1564b000219144bf5eed2359edc94c3590dd49d510751dad26202c18a17d/ruff-0.14.4-py3-none-macosx_11_0_arm64.whl", hash = "sha256:9358d490ec030f1b51d048a7fd6ead418ed0826daf6149e95e30aa67c168af33", size = 11928120, upload-time = "2025-11-06T22:07:08.023Z" },
+ { url = "https://files.pythonhosted.org/packages/a3/92/d5f1770e9988cc0742fefaa351e840d9aef04ec24ae1be36f333f96d5704/ruff-0.14.4-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:81b40d27924f1f02dfa827b9c0712a13c0e4b108421665322218fc38caf615c2", size = 12370877, upload-time = "2025-11-06T22:07:10.015Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/29/e9282efa55f1973d109faf839a63235575519c8ad278cc87a182a366810e/ruff-0.14.4-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f5e649052a294fe00818650712083cddc6cc02744afaf37202c65df9ea52efa5", size = 12408538, upload-time = "2025-11-06T22:07:13.085Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/01/930ed6ecfce130144b32d77d8d69f5c610e6d23e6857927150adf5d7379a/ruff-0.14.4-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:aa082a8f878deeba955531f975881828fd6afd90dfa757c2b0808aadb437136e", size = 13141942, upload-time = "2025-11-06T22:07:15.386Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/46/a9c89b42b231a9f487233f17a89cbef9d5acd538d9488687a02ad288fa6b/ruff-0.14.4-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:1043c6811c2419e39011890f14d0a30470f19d47d197c4858b2787dfa698f6c8", size = 14544306, upload-time = "2025-11-06T22:07:17.631Z" },
+ { url = "https://files.pythonhosted.org/packages/78/96/9c6cf86491f2a6d52758b830b89b78c2ae61e8ca66b86bf5a20af73d20e6/ruff-0.14.4-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a9f3a936ac27fb7c2a93e4f4b943a662775879ac579a433291a6f69428722649", size = 14210427, upload-time = "2025-11-06T22:07:19.832Z" },
+ { url = "https://files.pythonhosted.org/packages/71/f4/0666fe7769a54f63e66404e8ff698de1dcde733e12e2fd1c9c6efb689cb5/ruff-0.14.4-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:95643ffd209ce78bc113266b88fba3d39e0461f0cbc8b55fb92505030fb4a850", size = 13658488, upload-time = "2025-11-06T22:07:22.32Z" },
+ { url = "https://files.pythonhosted.org/packages/ee/79/6ad4dda2cfd55e41ac9ed6d73ef9ab9475b1eef69f3a85957210c74ba12c/ruff-0.14.4-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:456daa2fa1021bc86ca857f43fe29d5d8b3f0e55e9f90c58c317c1dcc2afc7b5", size = 13354908, upload-time = "2025-11-06T22:07:24.347Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/60/f0b6990f740bb15c1588601d19d21bcc1bd5de4330a07222041678a8e04f/ruff-0.14.4-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:f911bba769e4a9f51af6e70037bb72b70b45a16db5ce73e1f72aefe6f6d62132", size = 13587803, upload-time = "2025-11-06T22:07:26.327Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/da/eaaada586f80068728338e0ef7f29ab3e4a08a692f92eb901a4f06bbff24/ruff-0.14.4-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:76158a7369b3979fa878612c623a7e5430c18b2fd1c73b214945c2d06337db67", size = 12279654, upload-time = "2025-11-06T22:07:28.46Z" },
+ { url = "https://files.pythonhosted.org/packages/66/d4/b1d0e82cf9bf8aed10a6d45be47b3f402730aa2c438164424783ac88c0ed/ruff-0.14.4-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:f3b8f3b442d2b14c246e7aeca2e75915159e06a3540e2f4bed9f50d062d24469", size = 12357520, upload-time = "2025-11-06T22:07:31.468Z" },
+ { url = "https://files.pythonhosted.org/packages/04/f4/53e2b42cc82804617e5c7950b7079d79996c27e99c4652131c6a1100657f/ruff-0.14.4-py3-none-musllinux_1_2_i686.whl", hash = "sha256:c62da9a06779deecf4d17ed04939ae8b31b517643b26370c3be1d26f3ef7dbde", size = 12719431, upload-time = "2025-11-06T22:07:33.831Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/94/80e3d74ed9a72d64e94a7b7706b1c1ebaa315ef2076fd33581f6a1cd2f95/ruff-0.14.4-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:5a443a83a1506c684e98acb8cb55abaf3ef725078be40237463dae4463366349", size = 13464394, upload-time = "2025-11-06T22:07:35.905Z" },
+ { url = "https://files.pythonhosted.org/packages/54/1a/a49f071f04c42345c793d22f6cf5e0920095e286119ee53a64a3a3004825/ruff-0.14.4-py3-none-win32.whl", hash = "sha256:643b69cb63cd996f1fc7229da726d07ac307eae442dd8974dbc7cf22c1e18fff", size = 12493429, upload-time = "2025-11-06T22:07:38.43Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/22/e58c43e641145a2b670328fb98bc384e20679b5774258b1e540207580266/ruff-0.14.4-py3-none-win_amd64.whl", hash = "sha256:26673da283b96fe35fa0c939bf8411abec47111644aa9f7cfbd3c573fb125d2c", size = 13635380, upload-time = "2025-11-06T22:07:40.496Z" },
+ { url = "https://files.pythonhosted.org/packages/30/bd/4168a751ddbbf43e86544b4de8b5c3b7be8d7167a2a5cb977d274e04f0a1/ruff-0.14.4-py3-none-win_arm64.whl", hash = "sha256:dd09c292479596b0e6fec8cd95c65c3a6dc68e9ad17b8f2382130f87ff6a75bb", size = 12663065, upload-time = "2025-11-06T22:07:42.603Z" },
]
[[package]]
-name = "setuptools"
-version = "80.9.0"
+name = "six"
+version = "1.17.0"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/18/5d/3bf57dcd21979b887f014ea83c24ae194cfcd12b9e0fda66b957c69d1fca/setuptools-80.9.0.tar.gz", hash = "sha256:f36b47402ecde768dbfafc46e8e4207b4360c654f1f3bb84475f0a28628fb19c", size = 1319958, upload-time = "2025-05-27T00:56:51.443Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" }
wheels = [
- { url = "https://files.pythonhosted.org/packages/a3/dc/17031897dae0efacfea57dfd3a82fdd2a2aeb58e0ff71b77b87e44edc772/setuptools-80.9.0-py3-none-any.whl", hash = "sha256:062d34222ad13e0cc312a4c02d73f059e86a4acbfbdea8f8f76b28c99f306922", size = 1201486, upload-time = "2025-05-27T00:56:49.664Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
]
[[package]]
@@ -1514,9 +1565,10 @@ wheels = [
[[package]]
name = "unshackle"
-version = "1.4.8"
+version = "2.1.0"
source = { editable = "." }
dependencies = [
+ { name = "aiohttp-swagger3" },
{ name = "appdirs" },
{ name = "brotli" },
{ name = "chardet" },
@@ -1525,6 +1577,7 @@ dependencies = [
{ name = "crccheck" },
{ name = "cryptography" },
{ name = "curl-cffi" },
+ { name = "fonttools" },
{ name = "httpx" },
{ name = "jsonpickle" },
{ name = "langcodes" },
@@ -1533,6 +1586,7 @@ dependencies = [
{ name = "protobuf" },
{ name = "pycaption" },
{ name = "pycryptodomex" },
+ { name = "pyexecjs" },
{ name = "pyjwt" },
{ name = "pymediainfo" },
{ name = "pymp4" },
@@ -1567,32 +1621,35 @@ dev = [
[package.metadata]
requires-dist = [
+ { name = "aiohttp-swagger3", specifier = ">=0.9.0,<1" },
{ name = "appdirs", specifier = ">=1.4.4,<2" },
{ name = "brotli", specifier = ">=1.1.0,<2" },
{ name = "chardet", specifier = ">=5.2.0,<6" },
{ name = "click", specifier = ">=8.1.8,<9" },
{ name = "construct", specifier = ">=2.8.8,<3" },
{ name = "crccheck", specifier = ">=1.3.0,<2" },
- { name = "cryptography", specifier = ">=45.0.0" },
- { name = "curl-cffi", specifier = ">=0.7.0b4,<0.8" },
+ { name = "cryptography", specifier = ">=45.0.0,<47" },
+ { name = "curl-cffi", specifier = ">=0.7.0b4,<0.14" },
+ { name = "fonttools", specifier = ">=4.0.0,<5" },
{ name = "httpx", specifier = ">=0.28.1,<0.29" },
- { name = "jsonpickle", specifier = ">=3.0.4,<4" },
+ { name = "jsonpickle", specifier = ">=3.0.4,<5" },
{ name = "langcodes", specifier = ">=3.4.0,<4" },
{ name = "lxml", specifier = ">=5.2.1,<7" },
{ name = "pproxy", specifier = ">=2.7.9,<3" },
- { name = "protobuf", specifier = ">=4.25.3,<5" },
+ { name = "protobuf", specifier = ">=4.25.3,<7" },
{ name = "pycaption", specifier = ">=2.2.6,<3" },
{ name = "pycryptodomex", specifier = ">=3.20.0,<4" },
+ { name = "pyexecjs", specifier = ">=1.5.1,<2" },
{ name = "pyjwt", specifier = ">=2.8.0,<3" },
- { name = "pymediainfo", specifier = ">=6.1.0,<7" },
+ { name = "pymediainfo", specifier = ">=6.1.0,<8" },
{ name = "pymp4", specifier = ">=1.4.0,<2" },
{ name = "pymysql", specifier = ">=1.1.0,<2" },
{ name = "pyplayready", specifier = ">=0.6.3,<0.7" },
{ name = "pysubs2", specifier = ">=1.7.0,<2" },
{ name = "pywidevine", extras = ["serve"], specifier = ">=1.8.0,<2" },
{ name = "pyyaml", specifier = ">=6.0.1,<7" },
- { name = "requests", extras = ["socks"], specifier = ">=2.31.0,<3" },
- { name = "rich", specifier = ">=13.7.1,<14" },
+ { name = "requests", extras = ["socks"], specifier = ">=2.32.5,<3" },
+ { name = "rich", specifier = ">=13.7.1,<15" },
{ name = "rlaphoenix-m3u8", specifier = ">=3.4.0,<4" },
{ name = "ruamel-yaml", specifier = ">=0.18.6,<0.19" },
{ name = "sortedcontainers", specifier = ">=2.4.0,<3" },
@@ -1604,12 +1661,12 @@ requires-dist = [
[package.metadata.requires-dev]
dev = [
- { name = "isort", specifier = ">=5.13.2,<6" },
+ { name = "isort", specifier = ">=5.13.2,<8" },
{ name = "mypy", specifier = ">=1.9.0,<2" },
{ name = "mypy-protobuf", specifier = ">=3.6.0,<4" },
- { name = "pre-commit", specifier = ">=3.7.0,<4" },
- { name = "ruff", specifier = "~=0.3.7" },
- { name = "types-protobuf", specifier = ">=4.24.0.20240408,<5" },
+ { name = "pre-commit", specifier = ">=3.7.0,<5" },
+ { name = "ruff", specifier = ">=0.3.7,<0.15" },
+ { name = "types-protobuf", specifier = ">=4.24.0.20240408,<7" },
{ name = "types-pymysql", specifier = ">=1.1.0.1,<2" },
{ name = "types-requests", specifier = ">=2.31.0.20240406,<3" },
{ name = "unshackle", editable = "." },