Introduction
Over a year ago, my quest for a new weekend project led me to an intriguing idea. Inspired by a cinephile friend's extensive collection of RAW HD media, and hindered by pandemic-induced supply chain issues when buying computer parts, I set out to leverage my cloud infrastructure skills to build a comprehensive, cloud-based media library that would be more manageable, accessible, and reliable than a traditional home setup. The solution needed to stream content from anywhere, on any device, and make adding content as easy as humanly possible.
The Conundrum of Cost
A valid concern arose: the cost of hosting media content in the cloud. The apprehension was well-founded, given the potential for expenses to spiral out of control, especially considering the hidden data egress charges that accompany many popular cloud platforms. The reality, however, is more nuanced. On closer cost analysis, cloud-based solutions often prove more economical than their hardware counterparts, particularly over a horizon of a few years, and the added flexibility, customization, and scalability only strengthen the case. You can also choose cloud providers with favorable data egress pricing, blunting the impact of marathon viewing sessions spanning several seasons. The one apparent drawback is the inability to indulge in gaming escapades like Forza on your cloud compute during off-hours, although that may change depending on the hardware you can attain in the cloud.
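To make the cost claim concrete, here is a back-of-envelope comparison. Every number (monthly VPS fee, hardware price, power cost) is an illustrative assumption, not a quote from any provider; plug in your own figures.

```python
# Rough cloud-vs-home cost comparison over a multi-year horizon.
# All prices below are illustrative assumptions, not real quotes.

def total_cost_vps(monthly_fee: float, months: int) -> float:
    """Cloud VPS: a flat monthly fee, no upfront hardware."""
    return monthly_fee * months

def total_cost_home(hardware: float, monthly_power: float, months: int) -> float:
    """Home server: upfront hardware plus ongoing electricity."""
    return hardware + monthly_power * months

if __name__ == "__main__":
    months = 36  # a three-year horizon
    vps = total_cost_vps(monthly_fee=30.0, months=months)       # assumed $30/mo
    home = total_cost_home(hardware=1200.0, monthly_power=12.0, months=months)
    print(f"VPS over {months} months:  ${vps:,.0f}")
    print(f"Home over {months} months: ${home:,.0f}")
```

With these assumed numbers the VPS comes out ahead, but the break-even point shifts quickly with local electricity prices and the hardware you already own.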
Defining the Requirements
My initial requirements were straightforward yet demanding. I sought a solution that seamlessly combined ease of use and management, global reliability, and cost-effectiveness. A fundamental prerequisite was that media data should never be stored on my personal hardware, a vital consideration for security and privacy. Furthermore, I wanted a workflow that would let me discover and add new content effortlessly, with the added convenience of doing so from my iOS/Android devices.
In essence, my quest to create a cloud-based media library was motivated by the convergence of technical prowess, the need for remote access, and prudent cost analysis. The result was a solution that harnessed cloud infrastructure to craft a digital haven for media enthusiasts. In the upcoming segments, we'll dive into the architecture, management, and global accessibility of the project, and the benefits of a cloud-based media library tailored to the demands of media-hungry friends and family in the digital era.
Picking the Right Tools
Creating a fully automated media server entails a multitude of tasks, one of the most crucial being media consolidation: the seamless acquisition, organization, and preparation of media content for an optimal user experience. Consider what should happen when a new episode of a TV show becomes available. Ideally, the episode is downloaded automatically, associated metadata like posters and fan art is collected, subtitles are integrated, files are sorted into the proper folders, the media library is updated, and the user receives a notification that the episode is ready to watch.
Here's a comprehensive breakdown of the services required to accomplish this:
- Automated Media Download and Organization:
- Sonarr: Automates TV show downloads and organization.
- Radarr: Automates movie downloads and organization.
- Readarr: Automates book downloads and organization.
- Automated Media Requests and Downloads:
- Overseerr: A platform for automating media requests and triggering content downloads.
- Put.io: Downloads torrents in the cloud and stores the resulting files.
- Prowlarr: Manages indexers and sources for torrents and NZBs.
- Media Streaming and Access:
- Plex: A renowned media streaming platform compatible with various devices.
- xTeVe: An IPTV proxy that manages live TV integration for Plex.
- E-book Management:
- Calibre Server: Manages eBook metadata and library.
- Calibre Web: Provides user access to eBooks and facilitates sending to Kindle devices.
- Administration and Backend:
- Portainer: Facilitates container orchestration and administration.
- Nginx Proxy Manager: Manages reverse proxy for SSL termination and load balancing.
- Let's Encrypt: Generates SSL certificates for secure connections.
- Datadog: Monitors and provides telemetry data for various services.
- Google Domains: Manages custom domain for the server.
- Jenkins: Handles updates and automation through pipelines.
- Filebrowser: Allows users to view and edit files on the server.
- Slack: Sends notifications via webhooks and integrations.
- PagerDuty: Incident response management.
In selecting the appropriate services, I opted for a combination that offered reliability, functionality, and compatibility. Among these, Plex emerged as the primary media streaming platform, catering to a variety of devices. Additionally, the open-source "Arr" helper services (Sonarr, Radarr, Readarr, Prowlarr) played a pivotal role, facilitating media requests, management, and more.
The journey toward constructing an automated media library infused my project with an element of experimentation. From choosing the right hosting solution to integrating services that catered to my requirements, the process was iterative. Stay tuned for the upcoming segments, where we'll delve into the technical intricacies and the seamless synergy between various services that gave rise to a fully functional cloud-based media library solution, ensuring accessibility, reliability, and an unparalleled media experience.
Building the platform to host the services
Having carefully selected the ideal tools and services for our media library project, the next pivotal step is translating these choices into a cohesive, reliable, secure, and maintainable solution suitable for cloud deployment. The journey toward this goal required meticulous planning and strategic implementation to ensure optimal performance and user experience.
Choosing the Right Hosting Solution
To embark on this implementation journey, I spent time evaluating various hosting solutions. These included industry giants like AWS, Google Cloud, and Digital Ocean. Additionally, the consideration of storing media content led to exploring services such as S3 and Wasabi. After comprehensive assessment, a key realization emerged: Virtual Private Servers (VPS) provided the most compelling solution.
A more detailed breakdown of the differences can be found here. VPS emerged as the preferred choice for a multitude of reasons, but primarily because of:
- Cost Optimization: VPS solutions offer an impressive balance between storage costs and performance. This optimization ensures efficient resource utilization, keeping expenses in check.
- Egress Network Traffic: An essential consideration, particularly when dealing with media streaming in high resolutions like 4K, is the potential for excessive egress network traffic charges. Remarkably, VPS providers typically do not levy additional fees for egress network traffic, ensuring cost predictability, especially crucial when inviting friends and family to share in the media experience.
When it comes to securing a Virtual Private Server (VPS) provider to accommodate your data storage and streaming requirements, sizing resources carefully is paramount. The ideal provider is the one whose available storage and compute align with your needs. A configuration featuring a 6-core processor and 10TB of storage proves advantageous, but your specific use case and projected demand will ultimately drive the decision.
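As a quick sanity check on a figure like 10TB, a rough sizing sketch helps. The per-title sizes below are assumptions; real files vary widely, especially for remuxes.

```python
# How many titles fit in a given amount of storage, using assumed
# average file sizes. These sizes are illustrative, not measurements.

SIZES_GB = {
    "4k_movie": 40,   # assumed average for a 4K remux
    "hd_movie": 10,   # assumed average for a 1080p encode
    "hd_episode": 2,  # assumed average for a 1080p TV episode
}

def titles_that_fit(total_tb: float, size_gb: float) -> int:
    """Whole titles that fit, using 1 TB = 1000 GB for simplicity."""
    return int(total_tb * 1000 // size_gb)
```

For example, 10TB holds roughly 250 4K movies or 5,000 HD episodes under these assumptions, which is plenty for a shared family library.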
Mapping Services to Cloud Infrastructure
Having established the hosting framework, the next phase involved mapping the chosen tools and services onto the cloud infrastructure. This required a thoughtful orchestration of components to ensure seamless interaction and optimal utilization. Which brings us to containerization.
What is containerization?
Containerization has revolutionized the way applications are developed, deployed, and managed, offering a streamlined approach to packaging, distributing, and running software applications. It's a technology that enables developers to encapsulate an application along with its dependencies, libraries, and configuration files into a single unit known as a container. This container can then be consistently deployed across various computing environments, be it development, testing, or production, without worrying about compatibility issues.
At its core, containerization addresses the challenges of software deployment by providing a lightweight, isolated, and reproducible environment for applications. The concept draws inspiration from shipping containers used in logistics, where goods are packed and shipped in standardized containers that can be easily transported and handled across different modes of transportation without requiring modification. Similarly, containerization standardizes the packaging of applications, making them portable and consistent across different infrastructure environments, such as local development machines, virtual machines, or cloud servers.
Key Aspects of Containerization
- Isolation: Containers offer process-level isolation, ensuring that applications run independently of each other. This isolation prevents conflicts between different applications and their dependencies, making it easier to manage and maintain software.
- Portability: Containers abstract away the underlying infrastructure, ensuring that applications can run consistently across various environments without modification. This portability simplifies the process of moving applications between development, testing, and production environments.
- Resource Efficiency: Containers share the host operating system's kernel, allowing them to use resources more efficiently than traditional virtual machines. This lightweight approach reduces overhead and increases the density of applications on a single physical or virtual host.
- Version Control: Containers can be versioned, allowing developers to manage and reproduce application states easily. This is particularly useful for maintaining consistent environments during development and troubleshooting.
- Dependency Management: Containers encapsulate an application's dependencies, eliminating the common "it works on my machine" problem. This ensures that applications run the same way regardless of the host environment.
In essence, containerization brings efficiency, consistency, and flexibility to modern application development and deployment. It empowers developers to focus on building and shipping applications, knowing that the deployment environment will remain consistent across various stages of the development lifecycle. The result is a more agile, scalable, and manageable approach to software development that aligns well with the demands of today's dynamic computing landscape.
Launch the service containers
To help your journey, I have published the code on my GitHub, linked below. Feel free to use and modify it as needed.
This process initiates the deployment of all services by fetching the Docker images for each, creating a virtual private network to enable seamless communication among containers, and configuring Nginx to act as a reverse proxy, facilitating networking with these containers.
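The deployment steps above can be sketched as follows. This is a minimal illustration, assuming the Docker CLI is installed; the container names, image names, and port mappings are examples rather than the exact set used in my repository.

```python
# Sketch: create a private Docker network, then start each service
# attached to it so the reverse proxy can reach containers by name.
import subprocess

NETWORK = "media-net"  # assumed network name

SERVICES = {
    # container name -> (image, "host:container" port mapping) -- examples
    "plex": ("plexinc/pms-docker", "32400:32400"),
    "overseerr": ("sctx/overseerr", "5055:5055"),
    "nginx-proxy-manager": ("jc21/nginx-proxy-manager", "443:443"),
}

def network_cmd(name: str) -> list[str]:
    """Command to create the shared container network."""
    return ["docker", "network", "create", name]

def run_cmd(name: str, image: str, ports: str, network: str = NETWORK) -> list[str]:
    """Command to launch one service on the shared network."""
    return ["docker", "run", "-d", "--name", name, "--network", network,
            "--restart", "unless-stopped", "-p", ports, image]

def deploy() -> None:
    subprocess.run(network_cmd(NETWORK), check=False)  # ok if it already exists
    for name, (image, ports) in SERVICES.items():
        subprocess.run(run_cmd(name, image, ports), check=True)
```

Calling `deploy()` on the VPS brings the stack up; in practice a docker-compose file expresses the same thing declaratively and is what my repository uses.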
Set up the services
Set up torrent download and shipping
You can set up a cron job that checks on a recurring basis for new torrent files generated by Sonarr, Radarr, Readarr, etc. If any are found, each one needs to be converted into a magnet link and handed to Put.io to download. You can use my script for this.
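As a sketch of what such a script does, the following watcher converts each .torrent file in a watch folder to a magnet link (the infohash is the SHA-1 of the bencoded info dictionary) and submits it to Put.io. The watch folder path and the `PUTIO_TOKEN` environment variable are assumptions, and the transfers endpoint should be verified against Put.io's v2 API documentation.

```python
# Convert .torrent files dropped by Sonarr/Radarr/Readarr into magnet
# links and submit them to Put.io. Paths and env var names are assumed.
import hashlib
import json
import os
import urllib.parse
import urllib.request
from pathlib import Path

WATCH_DIR = Path("/data/torrents")  # assumed Sonarr/Radarr output folder

def bdecode(data: bytes, i: int = 0):
    """Decode one bencoded value; return (value, end_offset)."""
    c = data[i:i + 1]
    if c == b"i":                      # integer: i<digits>e
        j = data.index(b"e", i)
        return int(data[i + 1:j]), j + 1
    if c == b"l":                      # list: l<items>e
        i, out = i + 1, []
        while data[i:i + 1] != b"e":
            v, i = bdecode(data, i)
            out.append(v)
        return out, i + 1
    if c == b"d":                      # dict: d<key><value>...e
        i, out = i + 1, {}
        while data[i:i + 1] != b"e":
            k, i = bdecode(data, i)
            v, i = bdecode(data, i)
            out[k] = v
        return out, i + 1
    j = data.index(b":", i)            # byte string: <len>:<bytes>
    n = int(data[i:j])
    return data[j + 1:j + 1 + n], j + 1 + n

def torrent_to_magnet(data: bytes) -> str:
    """Build a magnet URI from raw .torrent bytes."""
    if data[:1] != b"d":
        raise ValueError("not a bencoded dictionary")
    i = 1                              # skip the leading 'd'
    while data[i:i + 1] != b"e":
        key, i = bdecode(data, i)
        start = i
        val, i = bdecode(data, i)
        if key == b"info":             # infohash = SHA-1 of the raw info dict
            btih = hashlib.sha1(data[start:i]).hexdigest()
            name = val.get(b"name", b"").decode("utf-8", "replace")
            return f"magnet:?xt=urn:btih:{btih}&dn={urllib.parse.quote(name)}"
    raise ValueError("no info dict found")

def send_to_putio(magnet: str, token: str) -> None:
    """Submit the magnet as a new Put.io transfer (endpoint assumed)."""
    body = urllib.parse.urlencode({"url": magnet}).encode()
    req = urllib.request.Request(
        "https://api.put.io/v2/transfers/add",
        data=body,
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        json.load(resp)

if __name__ == "__main__":
    token = os.environ.get("PUTIO_TOKEN", "")  # assumed env var for the token
    if token:
        for t in WATCH_DIR.glob("*.torrent"):
            send_to_putio(torrent_to_magnet(t.read_bytes()), token)
            t.unlink()                 # remove so the next cron run is idempotent
```

Scheduled via cron, for example `*/5 * * * * /usr/bin/python3 /opt/scripts/ship_torrents.py` (the path is hypothetical), this gives the recurring check described above.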
Set up the user portal
Set up IPTV through Plex (xTeVe)
Set up notifications
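A minimal sketch of the notification step, assuming a Slack incoming webhook; the webhook URL is a placeholder for the one Slack issues when you create the integration.

```python
# Post a "new media available" message to a Slack incoming webhook.
import json
import urllib.request

def build_payload(title: str) -> dict:
    """Shape of a basic Slack incoming-webhook message."""
    return {"text": f"Now available on Plex: {title}"}

def notify(webhook_url: str, title: str) -> None:
    """Send the message; webhook_url comes from your Slack app config."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(build_payload(title)).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

The same hook point can fan out to PagerDuty for failures; in my setup, the download and library-update services call this after each successful import.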
Monitoring, Observability and Telemetry
Special thanks to SmartHomeBeginner for getting me started with Docker.