A scalable, cloud-native video transcoding system that automatically converts uploaded videos into adaptive bitrate HLS streams with multiple quality levels.
StreamForge is built with a microservices architecture designed for scalability and reliability:
- 🐳 **HLS Transcoder Service** (`hls-transcoder/`): Dockerized FFmpeg-based transcoding service
  - Converts videos to adaptive bitrate HLS streams
  - Generates multiple quality levels (240p, 360p, 720p)
  - Automatically uploads processed content to S3
- ⚡ **API Service** (`api/`): Node.js/TypeScript API server
  - SQS message consumer
  - Docker container orchestration
  - Process monitoring and logging
- 🎥 **Test Player** (`hls-transcoder/test-player.html`): Modern HTML5 video player with HLS.js
  - Quality selection controls
  - Adaptive bitrate streaming support
  - Real-time quality monitoring
1. **Video Upload**: User uploads a video to the S3 bucket
2. **Event Trigger**: S3 sends a notification to the SQS queue
3. **Message Processing**: The API service polls SQS for new messages
4. **Container Launch**: The API spawns a Docker container with the video key
5. **Transcoding**: FFmpeg converts the video into multiple HLS quality levels
6. **Upload**: Processed HLS files are uploaded back to S3
7. **Cleanup**: The container is removed and the SQS message is deleted
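The message-processing step hinges on decoding the S3 event notification carried in the SQS message body. A minimal sketch (the helper name `parseS3Records` and the `VideoJob` shape are illustrative, not from the repo):

```typescript
interface VideoJob {
  bucket: string;
  key: string;
}

// Illustrative helper: extract bucket/key pairs from an S3 event
// notification delivered via SQS. S3 URL-encodes object keys (spaces
// arrive as '+'), so decode before handing the key to the container.
function parseS3Records(messageBody: string): VideoJob[] {
  const event = JSON.parse(messageBody);
  // S3 sends an s3:TestEvent without a Records array when
  // notifications are first configured; skip it.
  if (!Array.isArray(event.Records)) return [];
  return event.Records
    .filter(
      (r: any) =>
        typeof r.eventName === "string" &&
        r.eventName.startsWith("ObjectCreated"),
    )
    .map((r: any) => ({
      bucket: r.s3.bucket.name,
      key: decodeURIComponent(String(r.s3.object.key).replace(/\+/g, " ")),
    }));
}
```

A consumer would call this once per received message and, per step 7, delete the message only after the transcode job succeeds.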
- Adaptive Bitrate Streaming: Multiple quality levels for optimal viewing
- HLS Protocol: Industry-standard HTTP Live Streaming
- Smart Quality Selection: Automatic quality switching based on bandwidth
- Segment-based Delivery: Efficient streaming with 4-second segments
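Quality switching itself happens client-side in HLS.js, but the core idea can be sketched as: pick the highest rendition whose total bitrate fits within a safety fraction of the measured bandwidth. The ladder values mirror this project's three renditions; the 0.8 safety factor is an illustrative assumption, not HLS.js's actual algorithm.

```typescript
interface Rendition {
  name: string;
  videoKbps: number;
  audioKbps: number;
}

// The three renditions produced by the transcoder.
const LADDER: Rendition[] = [
  { name: "240p", videoKbps: 400, audioKbps: 64 },
  { name: "360p", videoKbps: 800, audioKbps: 96 },
  { name: "720p", videoKbps: 2500, audioKbps: 128 },
];

// Pick the highest rendition whose total bitrate fits within a safety
// fraction of the measured bandwidth; fall back to the lowest otherwise.
function selectRendition(measuredKbps: number, safety = 0.8): Rendition {
  const budget = measuredKbps * safety;
  const fitting = LADDER.filter(r => r.videoKbps + r.audioKbps <= budget);
  return fitting.length > 0 ? fitting[fitting.length - 1] : LADDER[0];
}
```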
| Quality | Resolution | Video Bitrate | Audio Bitrate | Use Case |
|---|---|---|---|---|
| 240p | 426×240 | 400 kbps | 64 kbps | Mobile/slow connections |
| 360p | 640×360 | 800 kbps | 96 kbps | Standard mobile viewing |
| 720p | 1280×720 | 2.5 Mbps | 128 kbps | HD desktop viewing |
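For a ladder like this, the generated `master.m3u8` conceptually lists one `EXT-X-STREAM-INF` entry per rendition. The exact attributes depend on the FFmpeg run; the `BANDWIDTH` values below are illustrative (video + audio bitrate, ignoring container overhead):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=464000,RESOLUTION=426x240
240p/prog.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=896000,RESOLUTION=640x360
360p/prog.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2628000,RESOLUTION=1280x720
720p/prog.m3u8
```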
- Containerized Processing: Isolated, scalable transcoding jobs
- Queue-based Architecture: Handle high volumes with SQS
- Stateless Design: Easy horizontal scaling
- Cloud-native: AWS S3, SQS integration
- Error Handling: Comprehensive error catching and logging
- Process Timeout: 10-minute timeout protection
- Real-time Logging: Live transcoding progress monitoring
- Graceful Cleanup: Automatic container removal
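The 10-minute timeout can be implemented as a generic race between the job and a timer. A sketch (`withTimeout` is illustrative, not necessarily how the repo does it):

```typescript
// Illustrative timeout wrapper: rejects if the wrapped job does not
// settle within `ms` milliseconds, so a hung transcode container
// cannot block the queue consumer forever.
function withTimeout<T>(job: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`Timed out after ${ms} ms`)),
      ms,
    );
    job.then(
      value => { clearTimeout(timer); resolve(value); },
      err => { clearTimeout(timer); reject(err); },
    );
  });
}
```

Something like `withTimeout(runContainer(videoKey), 10 * 60 * 1000)` (where `runContainer` is a hypothetical function that awaits the Docker job) would reject on a hang, letting the cleanup path remove the container.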
For each processed video, the system creates:
```
s3://your-bucket/hls/video-name/
├── master.m3u8          # Master playlist (entry point)
├── 240p/
│   ├── prog.m3u8        # 240p playlist
│   ├── segment_000.ts   # Video segments
│   ├── segment_001.ts
│   └── ...
├── 360p/
│   ├── prog.m3u8        # 360p playlist
│   ├── segment_000.ts
│   └── ...
└── 720p/
    ├── prog.m3u8        # 720p playlist
    ├── segment_000.ts
    └── ...
```
Prerequisites:

- Docker & Docker Compose
- Node.js 18+
- AWS Account (S3, SQS access)
- FFmpeg (included in Docker image)
```bash
# Clone repository
git clone <repository-url>
cd streamforge

# Set up HLS Transcoder
cd hls-transcoder
cp .env.example .env
# Configure AWS credentials in .env

# Set up API
cd ../api
npm install
cp .env.example .env
# Configure SQS queue URL and AWS credentials
```

```bash
# Build the transcoder image
cd hls-transcoder
docker build -t hls-transcoder .
```
```bash
# Create S3 bucket
aws s3 mb s3://your-video-bucket

# Create SQS queue
aws sqs create-queue --queue-name video-processing-queue

# Set up S3 event notifications to SQS
# (Configure in AWS Console or via CLI)
```

```bash
# Start API service
cd api
npm run dev
# The API will automatically poll SQS and process videos
```

```bash
# Open test player
open hls-transcoder/test-player.html
# Enter your HLS master playlist URL:
# https://your-bucket.s3.region.amazonaws.com/hls/video-name/master.m3u8
```

HLS Transcoder (`.env`):

```
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
S3_BUCKET_NAME=your-video-bucket
AWS_REGION=us-east-1
```

API Service (`.env`):

```
SQS_QUEUE_URL=https://sqs.region.amazonaws.com/account/queue-name
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_REGION=us-east-1
```

The system uses optimized FFmpeg settings:
- Preset: `veryfast` for speed
- GOP Size: 48 frames for efficient streaming
- Segment Length: 4 seconds
- Format: HLS with VOD playlist type
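These settings map onto FFmpeg flags roughly as follows, shown for the 720p rendition. This is an illustrative command, not the repo's exact invocation (input name, output paths, and codec choices are assumptions):

```bash
ffmpeg -i input.mp4 \
  -vf scale=1280:720 \
  -c:v libx264 -preset veryfast -b:v 2500k \
  -g 48 -keyint_min 48 \
  -c:a aac -b:a 128k \
  -hls_time 4 -hls_playlist_type vod \
  -hls_segment_filename "720p/segment_%03d.ts" \
  720p/prog.m3u8
```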
Approximate transcoding times:

- 5-minute 1080p video: ~2-3 minutes
- 10-minute 720p video: ~3-4 minutes
- 30-minute 4K video: ~15-20 minutes
Recommended resources:

- CPU: 2+ cores
- Memory: 2GB+ RAM
- Storage: Temporary space for processing
- Network: High bandwidth for S3 uploads
This project is licensed under the MIT License - see the LICENSE file for details.