Today I’m excited to announce and demonstrate the new HTTP Live Streaming (HLS) output feature for Amazon Kinesis Video Streams (KVS). If you’re not already familiar with KVS, Jeff covered the release for AWS re:Invent in 2017. In short, Amazon Kinesis Video Streams is a service for securely capturing, processing, and storing video for analytics and machine learning – from one device or millions. Customers are using Kinesis Video with machine learning algorithms to power everything from home automation and smart cities to industrial automation and security.
After iterating on customer feedback, we’ve launched a number of features in the past few months, including a plugin for GStreamer, the popular open source multimedia framework, and Docker containers that make it easy to start streaming video to Kinesis. We could talk about each of those features at length, but today is all about the new HLS output feature! Fair warning: there are a few pictures of my incredibly messy office in this post.
HLS output is a new feature that allows customers to create HLS endpoints for their Kinesis Video Streams, which is handy for building custom UIs and tools that can play back live and on-demand video. The HLS-based playback capability is fully managed, so you don’t have to build any infrastructure to transmux the incoming media. You simply create a new streaming session, up to 5 (for now), with the new GetHLSStreamingSessionURL API and you’re off to the races. The great thing about HLS is that it’s already an industry standard and really easy to leverage in existing web players like JW Player, hls.js, Video.js, Google’s Shaka Player, or even to render natively in mobile apps with Android’s ExoPlayer and iOS’s AVFoundation. Let’s take a quick look at the API; feel free to skip to the walk-through below as well.
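Part of what makes HLS so easy to leverage is that the playlists players consume are plain text. As a rough illustration (the playlist below is a made-up minimal example, not actual Kinesis Video Streams output), here’s how little structure is involved:

```python
# A hypothetical, minimal HLS media playlist -- real playlists carry more tags.
SAMPLE_PLAYLIST = """\
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3
#EXTINF:2.000,
segment_1.ts
#EXTINF:2.000,
segment_2.ts
"""

def parse_segments(playlist: str):
    """Return (duration_seconds, uri) pairs from an HLS media playlist."""
    segments, duration = [], None
    for line in playlist.splitlines():
        if line.startswith("#EXTINF:"):
            # "#EXTINF:2.000," -> 2.0
            duration = float(line[len("#EXTINF:"):].rstrip(",").split(",")[0])
        elif line and not line.startswith("#"):
            # Non-comment, non-tag lines are segment URIs
            segments.append((duration, line))
    return segments

print(parse_segments(SAMPLE_PLAYLIST))
# -> [(2.0, 'segment_1.ts'), (2.0, 'segment_2.ts')]
```

A player simply fetches the playlist, downloads each referenced segment in order, and (for live streams) re-polls the playlist as new segments appear.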
Kinesis Video HLS Output API
The documentation covers this in more detail than we can go over in this blog post, but I’ll cover the broad components.
- Get an endpoint with the GetDataEndpoint API
- Use that endpoint to get an HLS streaming URL with the GetHLSStreamingSessionURL API
- Render the content in the HLS URL with whatever tools you want!
This is pretty easy in a Jupyter notebook with a quick bit of Python and boto3.
```python
import boto3

STREAM_NAME = "RandallDeepLens"

kvs = boto3.client("kinesisvideo")
# Grab the endpoint from GetDataEndpoint
endpoint = kvs.get_data_endpoint(
    APIName="GET_HLS_STREAMING_SESSION_URL",
    StreamName=STREAM_NAME
)['DataEndpoint']

# Grab the HLS stream URL from the endpoint
kvam = boto3.client("kinesis-video-archived-media", endpoint_url=endpoint)
url = kvam.get_hls_streaming_session_url(
    StreamName=STREAM_NAME,
    PlaybackMode="LIVE"
)['HLSStreamingSessionURL']
```
You can even visualize everything right away in Safari which can render HLS streams natively.
```python
from IPython.display import HTML
HTML(data='<video src="{0}" autoplay controls width="640" height="480"></video>'.format(url))
```
We can also stream video into Kinesis from an AWS DeepLens with a few lines of Python:

```python
import DeepLens_Kinesis_Video as dkv
import time

aws_access_key = "super_fake"
aws_secret_key = "even_more_fake"
region = "us-east-1"
stream_name = "RandallDeepLens"
retention = 1  # in minutes
wait_time_sec = 60 * 300  # the number of seconds to stream the data

# Will create the stream if it does not already exist
producer = dkv.createProducer(aws_access_key, aws_secret_key, "", region)
my_stream = producer.createStream(stream_name, retention)
my_stream.start()
time.sleep(wait_time_sec)
my_stream.stop()
```
How to use Kinesis Video Streams HLS Output Streams
We definitely need a Kinesis Video Stream, which we can create easily in the Kinesis Video Streams Console.
Now, we need to get some content into the stream. We have a few options here. Perhaps the easiest is the Docker container. I decided to take the more adventurous route and compile the GStreamer plugin locally on my Mac, following the scripts on GitHub. Be warned: compiling this plugin takes a while and can cause your computer to transform into a space heater.
With the freshly compiled GStreamer binaries, like gst-launch-1.0, and the kvssink plugin, we can stream directly from my MacBook’s webcam, or any other GStreamer source, into Kinesis Video Streams. I just use the kvssink output plugin and my data winds up in the video stream. There are a few parameters to configure here, so pay attention.
Here’s an example command that I ran to stream my MacBook’s webcam to Kinesis Video Streams:
```shell
gst-launch-1.0 autovideosrc ! videoconvert \
  ! video/x-raw,format=I420,width=640,height=480,framerate=30/1 \
  ! vtenc_h264_hw allow-frame-reordering=FALSE realtime=TRUE max-keyframe-interval=45 bitrate=500 \
  ! h264parse \
  ! video/x-h264,stream-format=avc,alignment=au,width=640,height=480,framerate=30/1 \
  ! kvssink stream-name="BlogStream" storage-size=1024 aws-region=us-west-2 log-config=kvslog
```
Now that we’re streaming some data into Kinesis Video Streams, I can use the getting-started sample static website to test my HLS stream with a few different video players. I just fill in my AWS credentials and ask it to start playing. The GetHLSStreamingSessionURL API supports a number of parameters, so you can play both on-demand segments and live streams from various timestamps.
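For example, an on-demand session over a specific time window can be requested by adding an HLSFragmentSelector to the call. The helper below is just a sketch that assembles the request arguments (the stream name and timestamps are placeholders I made up); you’d unpack the resulting dict into get_hls_streaming_session_url:

```python
from datetime import datetime

def build_on_demand_request(stream_name, start, end, expires=3600):
    """Assemble kwargs for an ON_DEMAND GetHLSStreamingSessionURL call."""
    return {
        "StreamName": stream_name,
        "PlaybackMode": "ON_DEMAND",
        "HLSFragmentSelector": {
            # Select fragments by the time KVS received them
            "FragmentSelectorType": "SERVER_TIMESTAMP",
            "TimestampRange": {
                "StartTimestamp": start,
                "EndTimestamp": end,
            },
        },
        "Expires": expires,  # lifetime of the session URL, in seconds
    }

# Hypothetical window: ten minutes of footage
params = build_on_demand_request(
    "RandallDeepLens",
    start=datetime(2018, 7, 1, 12, 0),
    end=datetime(2018, 7, 1, 12, 10),
)
# url = kvam.get_hls_streaming_session_url(**params)['HLSStreamingSessionURL']
```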
Data consumed from Kinesis Video Streams using HLS is charged at $0.0119 per GB in US East (N. Virginia) and US West (Oregon); pricing for other regions is available on the service pricing page. This feature is available now, in all regions where Kinesis Video Streams is available.
The Kinesis Video team told me they’re working hard on deeper integration with the AWS Media Services, like MediaLive, which will make it easier to serve Kinesis Video Streams content to larger audiences.
As always, let us know what you think on Twitter or in the comments. I’ve had a ton of fun playing around with this feature over the past few days and I’m excited to see customers build some new tools with it!