Setting Up Adaptive Streaming with Nginx

Recently I’ve been working on a system to smoothly stream live events for an organization. This was pretty new to me and, after a bunch of research, I found that Nginx with the RTMP module seemed to be a good choice. There were plenty of difficulties in setting everything up, and after several days of testing I arrived at a configuration that is worth a post.

Setting up Nginx and the RTMP module

First, let’s get Nginx set up. To use the RTMP module, we need to compile it into Nginx. The steps look something like this:

# Installing requirements in Ubuntu/Debian
apt-get install git gcc make libaio1 libpcre3-dev openssl libssl-dev ffmpeg -y

# Installing the same thing in RHEL/CentOS
yum install git gcc make libaio libaio-devel openssl openssl-devel pcre-devel ffmpeg -y

# Download nginx and nginx-rtmp-module
wget http://nginx.org/download/nginx-1.9.4.tar.gz
git clone https://github.com/arut/nginx-rtmp-module.git

# Compile nginx with nginx-rtmp and libaio
tar zxvf nginx-1.9.4.tar.gz
cd nginx-1.9.4

./configure --prefix=/usr/local/nginx --with-file-aio --add-module=/path/to/nginx-rtmp-module/
make
make install

# Link nginx
ln -s /usr/local/nginx/sbin/nginx /usr/bin/nginx

nginx # Start Nginx
nginx -s stop # Stop Nginx
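
Since a source-built Nginx does not come with a service script, you may also want to manage it as a service. Below is a minimal systemd unit as a sketch; the unit file path is my own choice and the binary/PID paths simply assume the --prefix used above:

# /etc/systemd/system/nginx.service (sketch; adjust paths to your build)
[Unit]
Description=Nginx with the RTMP module
After=network.target

[Service]
Type=forking
ExecStart=/usr/local/nginx/sbin/nginx
ExecReload=/usr/local/nginx/sbin/nginx -s reload
ExecStop=/usr/local/nginx/sbin/nginx -s stop
PIDFile=/usr/local/nginx/logs/nginx.pid

[Install]
WantedBy=multi-user.target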

Once everything is done, check whether Nginx has been compiled with the RTMP module properly.

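A quick way to confirm this is to print the configure arguments Nginx was built with; the exact grep below is just a suggestion:

# The RTMP module should appear in the configure arguments
nginx -V 2>&1 | grep -o nginx-rtmp-module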

If you can see that the RTMP module is included, you can go on to the next step. Before we proceed to configuring Nginx for live streaming, we should decide which resolutions to provide for the live streams and how much hardware power we have available.

Prerequisites

To convert a live stream into several variants for adaptive streaming, you need to make sure your server has enough CPU power for the workload. Otherwise, the live stream will suffer from continuous delays and/or the server will become unresponsive. I spun up some EC2 c3.large and c3.xlarge instances to test with and found that their compute-optimized CPUs can handle this workload with ease. Another thing worth mentioning is the I/O limit of the disks: if possible, storing the generated HLS fragments on a high-speed SSD helps maintain a smooth streaming experience.

CPU usage when using an EC2 c3.xlarge instance.
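If you want a rough sanity check of the hardware before going live, commands like the ones below can help; the 1 GB test size and the /mnt/hls path are only examples:

# Number of CPU cores available for transcoding
nproc

# Rough sequential write speed of the disk that will hold the HLS fragments
dd if=/dev/zero of=/mnt/hls/testfile bs=1M count=1024 oflag=direct
rm /mnt/hls/testfile

# Watch CPU usage while a test stream is being transcoded
top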

Then you also need to think about which resolutions you will offer for adaptive streaming. Generally, four to five variants are enough to provide fast loading across different network speeds and devices. Here’s my recommended list of variants for live streaming (the resulting master playlist is sketched after the list):

  1. 240p Low Definition stream at 288kbps
  2. 480p Standard Definition stream at 448kbps
  3. 540p Standard Definition stream at 1152kbps
  4. 720p High Definition stream at 2048kbps
  5. Source resolution, source bitrate
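
For reference, with the variants above (and the hls_variant entries in the configuration below), the player first downloads a master playlist that advertises one entry per bitrate, roughly like this sketch; mystream is a placeholder stream name and the exact layout depends on the nginx-rtmp version:

#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=288000
mystream_low.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=448000
mystream_mid.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1152000
mystream_high.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2048000
mystream_hd720.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=4096000
mystream_src.m3u8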

Configuring nginx for live streaming

Here is my own nginx.conf, with comments, which you can use as a reference.

worker_processes  auto;
events {
    # Allows up to 1024 connections, can be adjusted
    worker_connections  1024;
}

# RTMP configuration
rtmp {
    server {
        listen 1935; # Listen on standard RTMP port
        chunk_size 4000; 
        
        # This application is to accept incoming stream
        application live {
            live on; # Allows live input
            
            # Once a stream is received, transcode it for adaptive streaming
            # This single ffmpeg command takes the input and transforms
            # the source into 4 different streams with different bitrate
            # and quality. P.S. The scaling done here respects the aspect
            # ratio of the input.
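            # Note: libvo_aacenc has been removed from newer FFmpeg builds; if your
            # FFmpeg does not have it, replace "-c:a libvo_aacenc" below with the
            # native "-c:a aac" encoder.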
            exec ffmpeg -i rtmp://localhost/$app/$name -async 1 -vsync -1
                        -c:v libx264 -c:a libvo_aacenc -b:v 256k -b:a 32k -vf "scale=480:trunc(ow/a/2)*2" -tune zerolatency -preset veryfast -crf 23 -f flv rtmp://localhost/show/$name_low
                        -c:v libx264 -c:a libvo_aacenc -b:v 768k -b:a 96k -vf "scale=720:trunc(ow/a/2)*2" -tune zerolatency -preset veryfast -crf 23 -f flv rtmp://localhost/show/$name_mid
                        -c:v libx264 -c:a libvo_aacenc -b:v 1024k -b:a 128k -vf "scale=960:trunc(ow/a/2)*2" -tune zerolatency -preset veryfast -crf 23 -f flv rtmp://localhost/show/$name_high
                        -c:v libx264 -c:a libvo_aacenc -b:v 1920k -b:a 128k -vf "scale=1280:trunc(ow/a/2)*2" -tune zerolatency -preset veryfast -crf 23 -f flv rtmp://localhost/show/$name_hd720
                        -c copy -f flv rtmp://localhost/show/$name_src;
        }
        
        # This application is for splitting the stream into HLS fragments
        application show {
            live on; # Allows live input from above
            hls on; # Enable HTTP Live Streaming
            
            # Pointing this to an SSD is better as this involves lots of IO
            hls_path /mnt/hls/;
            
            # Instruct clients to adjust resolution according to bandwidth
            hls_variant _low BANDWIDTH=288000; # Low bitrate, sub-SD resolution
            hls_variant _mid BANDWIDTH=448000; # Medium bitrate, SD resolution
            hls_variant _high BANDWIDTH=1152000; # High bitrate, higher-than-SD resolution
            hls_variant _hd720 BANDWIDTH=2048000; # High bitrate, HD 720p resolution
            hls_variant _src BANDWIDTH=4096000; # Source bitrate, source resolution
        }
    }
}

http {
    # See http://licson.net/post/optimizing-nginx-for-large-file-delivery/ for more detail
    # This optimizes the server for HLS fragment delivery
    sendfile off;
    tcp_nopush on;
    aio on;
    directio 512;
    
    # HTTP server required to serve the player and HLS fragments
    server {
        listen 80;
        
        location / {
            root /path/to/web_player/;
        }
        
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            
            root /mnt/;
            add_header Cache-Control no-cache; # Prevent caching of HLS fragments
            add_header Access-Control-Allow-Origin *; # Allow web player to access our playlist
        }
    }
}

Then, configure your live encoder to use these settings to stream into the server:

  • RTMP Endpoint: rtmp://yourserver/live/
  • RTMP Stream Name: [Whatever name you like]

Finally, configure your player for live playback. The HLS URL would look like this:

http://yourserver/hls/[The stream name above].m3u8
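
Before pointing a real encoder at the server, you can sanity-test the whole chain with FFmpeg itself. The file name test.mp4 and the stream name test below are my own placeholders, and -c copy assumes the file is already H.264/AAC:

# Publish a local file to the "live" application as a test stream
# (-re feeds the input at its native frame rate)
ffmpeg -re -i test.mp4 -c copy -f flv rtmp://yourserver/live/test

# Play back the resulting adaptive HLS stream in another terminal
ffplay http://yourserver/hls/test.m3u8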

Recommended encoder settings for live events

If you can adjust the encoder, the following settings can help provide a better viewing experience (a matching ffmpeg command line is sketched after the list).

  • Full HD Resolution (1920×1080) is recommended
  • H.264 Main profile, with target bitrate of 4000Kbps, maximum 6000Kbps
  • 25fps, 2 second keyframe interval
  • AAC audio at 128Kbps, 44.1kHz sample rate
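
As a rough illustration only, and assuming FFmpeg is used as the encoder (the post itself does not prescribe one), the settings above translate to something like the following; the input name and the bufsize value are my own placeholders:

# Hypothetical encoder-side command approximating the settings above:
# 1080p, H.264 main profile, 4000k target / 6000k max, 25 fps, 2-second keyframes, AAC 128k @ 44.1 kHz
ffmpeg -i input -c:v libx264 -profile:v main -b:v 4000k -maxrate 6000k -bufsize 8000k \
       -r 25 -g 50 -keyint_min 50 -s 1920x1080 \
       -c:a aac -b:a 128k -ar 44100 \
       -f flv rtmp://yourserver/live/yourstream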

And that’s all! I hope you enjoy running live events with these techniques.

28 Replies to “Setting Up Adaptive Streaming with Nginx”

  1. Hi, I have Ubuntu 16.04, I followed the instructions but it does not work for me.

    The transcoding process with ffmpeg is running and I can see the chunks being generated in the directory, but when I try to play the stream with ffplay it does not work.

    My nginx.conf:
    worker_processes auto;
    events {
    # Allows up to 1024 connections, can be adjusted
    worker_connections 1024;
    }

    # RTMP configuration
    rtmp {
    server {
    listen 1935; # Listen on standard RTMP port
    chunk_size 4000;

    # This application is to accept incoming stream
    application live {
    live on; # Allows live input

    # Once receive stream, transcode for adaptive streaming
    # This single ffmpeg command takes the input and transforms
    # the source into 4 different streams with different bitrate
    # and quality. P.S. The scaling done here respects the aspect
    # ratio of the input.
    exec /usr/bin/ffmpeg -i rtmp://localhost:1935/$app/$name -async 1 -vsync -1
    -c:v libx264 -c:a aac -strict -2 -b:v 256k -b:a 32k -vf "scale=480:trunc(ow/a/2)*2" -tune zerolatency -preset veryfast -g 50 -keyint_min 26 -crf 23 -f flv rtmp://localhost:1935/show/$name_low
    -c:v libx264 -c:a aac -strict -2 -b:v 768k -b:a 96k -vf "scale=720:trunc(ow/a/2)*2" -tune zerolatency -preset veryfast -g 50 -keyint_min 26 -crf 23 -f flv rtmp://localhost:1935/show/$name_mid
    -c:v libx264 -c:a aac -strict -2 -b:v 1024k -b:a 128k -vf "scale=960:trunc(ow/a/2)*2" -tune zerolatency -preset veryfast -g 50 -keyint_min 26 -crf 23 -f flv rtmp://localhost:1935/show/$name_high
    -c copy -f flv rtmp://localhost/show/$name_src;
    }

    # This application is for splitting the stream into HLS fragments
    application show {
    live on; # Allows live input from above
    hls on; # Enable HTTP Live Streaming

    # Pointing this to an SSD is better as this involves lots of IO
    hls_path /mnt/test;

    # Instruct clients to adjust resolution according to bandwidth
    hls_variant _low BANDWIDTH=288000; # Low bitrate, sub-SD resolution
    hls_variant _mid BANDWIDTH=448000; # Medium bitrate, SD resolution
    hls_variant _high BANDWIDTH=1152000; # High bitrate, higher-than-SD resolution
    hls_variant _src BANDWIDTH=4096000; # Source bitrate, source resolution
    }
    }
    }

    http {
    # See http://licson.net/post/optimizing-nginx-for-large-file-delivery/ for more detail
    # This optimizes the server for HLS fragment delivery
    sendfile off;
    tcp_nopush on;
    aio on;
    directio 512;

    # HTTP server required to serve the player and HLS fragments
    server {
    listen 80;

    location / {
    root /usr/local/nginx/;
    }

    location /hls {
    types {
    application/vnd.apple.mpegurl m3u8;
    }

    root /mnt/;
    add_header Cache-Control no-cache; # Prevent caching of HLS fragments
    add_header Access-Control-Allow-Origin *; # Allow web player to access our playlist
    }

    1. The problem I’m having is that nginx does not serve the stream, or the directory where the HLS is generated is inaccessible. I have the firewall disabled and 777 permissions on the directory.

  2. I am getting an error, can anyone resolve it? Please reply.

    apt-get install git gcc make libaio1 libpcre3-dev openssl libssl-dev ffmpeg -y

    Reading package lists… Done
    Building dependency tree
    Reading state information… Done
    Package ffmpeg is not available, but is referred to by another package.
    This may mean that the package is missing, has been obsoleted, or
    is only available from another source

    Package libpcre3-dev is not available, but is referred to by another package.
    This may mean that the package is missing, has been obsoleted, or
    is only available from another source

    E: Package ‘libpcre3-dev’ has no installation candidate
    E: Unable to locate package libssl-dev
    E: Package ‘ffmpeg’ has no installation candidate

    1. Try running apt-get update -y first. That should solve your problem. Also, beware that in Ubuntu 14.04 and prior versions the ffmpeg package is not available and will need to be installed separately.

  3. Hey, this post is really helpful, thanks. I’m trying to implement DASH with nginx; are you planning to write an article on that too?

  4. I am using OBS as the encoder. The tmp/hls folder is empty; there are no m3u8 files in there. I tried to use aac, libfdk_aac and libvo_aacenc but it is still not working, please advise.

    1. Well, I just found this statement on the FFmpeg site:
      libvo_aacenc: VisualOn AAC encoding library. Support for this library has been removed. Use the native FFmpeg encoder instead: it provides better quality and supports more than 2 channels. I changed my FFmpeg command and it works now.

      https://trac.ffmpeg.org/wiki/Encode/AAC

  5. Hello,
    thanks for this great post….
    I have a question about live streaming. Is it possible to receive a live stream over RTMP and re-encode it so that I can attach different audio tracks to my broadcast?
    The final result that I wish to achieve is to be able to click a button on the webpage with the embedded streaming video and change the audio.
    Thanks M.

  6. How would you be able to have one input stream and then one output stream?

    Like, for example, Twitch, Hitbox, Livecoding, etc. do with one stream key and one endpoint, e.g. domain.com/username,

    to enable multiple users to livestream? 🙂

    Would be much appreciated.

      1. Yes but that would mean anyone who has the streamkey (via the m3u8) can just stream to my endpoint.

        I would love to be able to get away from that.

        Appreciate you answering though 🙂

  7. hi licson,

    I just want to know how you encrypt your HLS streaming and your RTMP server, so that nobody can stream to your server?

    Thanks.

        1. I figured out what my problem was. My system does not have the libvo_aacenc audio codec, so I changed that to aac in the ffmpeg command and then everything worked.

    1. The most common cause of the empty folder is that the GOP is not set correctly; try using -g 6 in ffmpeg to output a full frame every 6 frames. It should work.

  8. hi,

    How do you encrypt your HLS streaming and your RTMP server so that nobody can stream to your server? Also, for HLS streaming, how do you protect it?

    Thanks.

  9. Amazing tutorial, thank you! I tried to integrate it with video.js, but it did not go successfully. JWPlayer works perfectly with the m3u8 stream and the nginx.conf is okay, but video.js does not start the m3u8 (HLS) stream.

    method 1:
    It starts to load, but does not start the movie. Do you have any idea?

    videojs.options.flash.swf = "/video.js/dist/video-js.swf"

    var player = videojs('terrace');
    player.hls( 'http://ip/hls/terrace.m3u8' );

    method 2:
    Big “X” and “No compatible source was found for this video” message.

    video.js HLS Plugin Example

    videojs.options.flash.swf = "/video.js/dist/video-js.swf"

    var player = videojs('terrace');
    player.play();

    Do you have any advice on what is the solution?

  10. First of all, thank you for the tutorial.
    Is it possible to detect the bitrate of the original stream to decide which bitrates ffmpeg transcodes to? Maybe using an "if" statement in the application block.

  11. Which web player do you recommend that works on cellphones and tablets too?
    I tried JWPlayer, but HLS only works in the premium player.
    Or maybe HTML5 with DASH?

    1. For me, I would recommend using the Viblast Player for HLS playback everywhere. It can play back HLS content using Media Source Extensions and works well on phones too. Integration with video.js also makes it easy to set up.
