What's new in Azure Media Services video processing

Developers and media companies trust and rely on Azure Media Services for the ability to encode, protect, index, and deliver videos at scale. This week we are proud to announce several enhancements to Media Services, including the general availability of the new Azure Media Services v3 API, as well as updates to Azure Media Player.

Low-latency live streaming, 24-hour transcoding, CMAF, and a shiny new API (v3) ready for production

The Azure Media Services v3 API, announced at the Build conference in May 2018, provides a simplified development model, better integration with key Azure services such as Event Grid and Functions, and much more. The API is now generally available and comes with many exciting new features. You can begin migrating workloads built on the preview API over to production use today.

What’s new?

The new Media Services v3 API is a major milestone in the enhancement of the developer experience for Media Services customers. The new API provides a set of SDKs for .NET, .NET Core, Java, Go, Python, Ruby, and Node.js! In addition, the API includes support for the following key scenarios.

Low-latency live streaming with 24-hour transcoding

LiveEvent, the replacement for the Channel entity in the v2 API, now has several major service enhancements.

We often receive requests to lower the latency of live event streaming. Our new low-latency live streaming mode is available exclusively on the LiveEvent entity in the v3 API. It supports 8 seconds of end-to-end latency when used in combination with Azure Media Player’s new low-latency heuristic profile, or around 10 seconds with native HLS playback on an Apple iOS device. Simply configure your live encoder to use smaller, 1-second GOP sizes and you can quickly reduce overall latency when delivering content to small or medium-sized audiences. Note that end-to-end latency can vary with local network conditions or when a CDN caching layer is introduced, so test your exact configuration.

Looking forward, we will continue to make improvements to our low-latency solution. Last month we announced that we are joining the open source SRT Alliance to help improve low-latency live streaming to Azure with secure and reliable ingest to the cloud. As part of this announcement we have already begun work to add SRT ingest protocol support to our LiveEvent.

To use the new LowLatency feature, you can set the StreamOptionsFlag to LowLatency on the LiveEvent:

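Here is a minimal sketch using the v3 .NET SDK (Microsoft.Azure.Management.Media); the authenticated client, plus the resourceGroupName, accountName, and location variables, are assumed setup from the quickstarts:

// Sketch: create a LiveEvent with the LowLatency stream option enabled.
LiveEvent liveEvent = new LiveEvent(
    location: location,
    description: "Low-latency live event",
    input: new LiveEventInput(LiveEventInputProtocol.RTMP),
    streamOptions: new List<StreamOptionsFlag?> { StreamOptionsFlag.LowLatency });

liveEvent = await client.LiveEvents.CreateAsync(
    resourceGroupName, accountName, "myLowLatencyLiveEvent",
    liveEvent, autoStart: true);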

Once the stream is up and running, use the Azure Media Player Demo page and set the playback options to use the “Low Latency Heuristics Profile”.

Low Latency Heuristics Profile selection

Next, when streaming live video you have two options for long-duration events. If you need to provide linear (24x7x365) live streams, you should use an on-premises encoder with our “pass-through”, non-transcoding LiveEvent. If you require live encoding in the cloud, the v2 API limited you to 8 hours of running time; we are very pleased to announce that the new LiveEvent now supports live transcoding durations of up to a full 24 hours. The choice between the two is expressed through the LiveEvent’s encoding settings, as sketched below.
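A sketch, using the same assumed .NET SDK setup as above (verify the exact live-encoding enum value in your SDK version):

// Pass-through: your on-premises encoder produces all bitrates,
// suitable for linear (24x7x365) channels.
var passThrough = new LiveEventEncoding(encodingType: LiveEventEncodingType.None);

// Cloud live encoding: the service transcodes a single contribution feed,
// now supported for durations up to 24 hours per event.
var cloudEncoded = new LiveEventEncoding(encodingType: LiveEventEncodingType.Basic);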

Lastly, we have verified several updated RTMP(s)-based encoders including the latest releases from MediaExcel, Telestream Wirecast, Haivision KB, and Switcher Studio.

Easier development with Event Grid and Azure Resource Manager

To make your development experience easier across Azure solutions for media, we are offering more notifications for common operations through Azure Event Grid. You can now subscribe to state-change events from Job and JobOutput operations to better integrate your custom media applications. If you are creating custom workflows in a Transform, you can specify your own correlation data in the Job object. This correlation data can be extracted from the notifications received through Event Grid, which helps in building workflows for multi-tenant applications or for integration with third-party media asset management systems. When monitoring a live stream, you can use new events such as the live ingest heartbeat and connected or disconnected events from the upstream live encoder.
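For example, here is a sketch of attaching correlation data when submitting a Job (the transform, asset, and dictionary values are illustrative, and the same client setup as above is assumed):

// Sketch: tag a Job with correlation data that is echoed back in the
// Event Grid notifications for this Job's state changes.
Job job = await client.Jobs.CreateAsync(
    resourceGroupName, accountName, "myTransform", "myJob",
    new Job
    {
        Input = new JobInputHttp(files: new List<String> { "https://example.com/video.mp4" }),
        Outputs = new List<JobOutput> { new JobOutputAsset("myOutputAsset") },
        CorrelationData = new Dictionary<string, string>
        {
            { "tenantId", "contoso-001" },
            { "mamAssetId", "abc-123" }
        }
    });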

Subscribe to any Media Services event through code, Logic Apps, Functions, or via the Azure portal.

Create event subscription

With the transition over to Azure Resource Manager (ARM) for our v3 API, you get the following benefits when managing transforms, live events, DRM keys, streaming endpoints, and assets:

  1. Easier deployment using ARM templates (see the sketch after this list).
  2. Ability to apply role-based access control (RBAC).
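For instance, here is a sketch of declaring a Transform directly in an ARM template; the 2018-07-01 API version and the resource names shown are assumptions to verify against the published schema:

{
  "type": "Microsoft.Media/mediaServices/transforms",
  "apiVersion": "2018-07-01",
  "name": "myamsaccount/myTransform",
  "properties": {
    "outputs": [
      {
        "preset": {
          "@odata.type": "#Microsoft.Media.BuiltInStandardEncoderPreset",
          "presetName": "AdaptiveStreaming"
        }
      }
    ]
  }
}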

Simplified ingest and asset creation

Ingesting content into Media Services used to involve multiple steps, such as copying files to Azure Storage and creating Assets and AssetFiles. In the new API, you can simply point to an existing file in Azure Storage using a SAS URL, or ingest from any HTTP(S)-accessible URL.

// Point the Job input directly at an HTTP(S)-accessible source.
var input = new JobInputHttp(
                     baseUri: "https://nimbuscdn-nimbuspm.streaming.mediaservices.windows.net/2b533311-b215-4409-80af-529c3e853622/",
                     files: new List<String> {"Ignite-short.mp4"}
                     );

We have also simplified the creation of assets in Azure Blob Storage by allowing you to set the container name directly. You can then use the storage APIs to add files into the container. Existing v2 assets will continue to work in the new API, but assets created with v3 cannot be used from the v2 API.
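As a sketch (names assumed, same client setup as above), creating an asset with a container name of your choosing looks like this:

// Create an Asset backed by a blob container name we pick ourselves.
Asset asset = await client.Assets.CreateOrUpdateAsync(
    resourceGroupName, accountName, "myInputAsset",
    new Asset(container: "my-input-container"));

// Files can then be uploaded to "my-input-container" with the standard
// Azure Storage blob APIs.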

Streaming and Dynamic Packaging with MPEG CMAF

In the service, we have now released official support for the latest MPEG Common Media Application Format (CMAF) with ‘cbcs’ encryption. CMAF, officially known as MPEG-A Part 19 or ISO/IEC 23000-19, is a new multimedia file format that enables the storage and delivery of streaming media in a single encrypted, adaptive-bitrate format to a wide range of devices, including Apple iPhone, Android, and Windows. Streaming service providers benefit from this common format through improved interoperability, low-latency streaming, and increased CDN cache efficiency.

To use the new CMAF format, simply add the new “format=” tag to your streaming URLs and choose the appropriate manifest type: HLS (for iOS devices) or DASH (for Windows or Android devices).

For an MPEG-DASH manifest with CMAF-format content, use “format=mpd-time-cmaf” as shown below:

https://<<your-account-name>>.streaming.media.azure.net/<<locator-ID>>/<<manifest-name>>.ism/manifest(format=mpd-time-cmaf)

For an HLS manifest with CMAF-format content, use “format=m3u8-cmaf” as shown below:

https://<<your-account-name>>.streaming.media.azure.net/<<locator-ID>>/<<manifest-name>>.ism/manifest(format=m3u8-cmaf)

Manage Media Services through the Command Line

Finally, we have updated the Azure CLI 2.0 module for Media Services to include all features of the v3 API. We will be releasing the final Media Services CLI module on October 23, 2018 for download, or for use directly within the Cloud Shell. The CLI is designed to make scripting Media Services easy. Use it to query running Jobs, create Live Events, create custom Transforms, manage content keys, and more. The CLI module also includes support for Streaming Endpoints, content key policies, and dynamic manifest filters.

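For example (a sketch; the commands are from the az ams module, so check az ams --help in your CLI version for the exact syntax):

az ams live-event list --resource-group myResourceGroup --account-name myamsaccount
az ams job list --resource-group myResourceGroup --account-name myamsaccount --transform-name myTransform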

Try out the new API by following the quickstart tutorials in the Media Services v3 documentation.

Stay in touch!

Be sure to also check out Video Indexer, which moved to general availability last month. We’re eager to hear from you about these updates! You can ask questions in our MSDN Forum, submit a question on Stack Overflow, or add new ideas to the Azure Media Services and Video Indexer UserVoice sites. You can also reach us on Twitter @MSFTAzureMedia and @Video_Indexer.

Source: Azure Blog Feed
