# Thursday, March 18, 2021

GCast 107:

Live Streaming with Azure Media Services

Learn how to broadcast a live video event using Azure Media Services

Thursday, March 18, 2021 8:15:00 AM (GMT Standard Time, UTC+00:00)
# Thursday, March 4, 2021

GCast 106:

Audio Transcription and Captioning with Azure Media Services

Azure Media Services can analyze an audio or video file and transcribe speech into text. You can then take the generated files and provide synchronized captioning for your video.

Thursday, March 4, 2021 8:52:00 AM (GMT Standard Time, UTC+00:00)
# Thursday, February 25, 2021

GCast 105:

Analyzing a Video with Azure Media Services

Learn how to use Azure Media Services to apply Artificial Intelligence and Machine Learning to a video, performing analyses such as face detection, speech-to-text, object detection, and optical character recognition.

Thursday, February 25, 2021 8:52:00 AM (GMT Standard Time, UTC+00:00)
# Thursday, February 18, 2021

GCast 104:

Sharing a Video Online with Azure Media Services

Learn how to use Azure Media Services to share a video on the web for streaming and/or for downloading.

Thursday, February 18, 2021 8:51:00 AM (GMT Standard Time, UTC+00:00)
# Thursday, February 11, 2021

GCast 103:

Encode a Video with Azure Media Services

Learn how to use Azure Media Services to encode a video into multiple formats, including support for adaptive streaming.

Thursday, February 11, 2021 9:50:00 AM (GMT Standard Time, UTC+00:00)
# Tuesday, February 9, 2021

In previous articles, I showed how to use Azure Media Services (AMS) to work with video that you upload. In this article, I will show how to broadcast a live event using AMS.

Before you get started, you will need some streaming software. For the demo in this article, I used Wirecast from Telestream. Telestream offers a free version, which is good for learning and demos but not for production, as it places a watermark on all streaming videos.

You will need to create an Azure Media Services account, as described in this article.

After the Media Service is created, navigate to the Azure Portal and to your Azure Media Services account, as shown in Fig. 1.

ams01-OverviewBlade
Fig. 1

Then, select "Live streaming" from the left menu to open the "Live streaming" blade, as shown in Fig. 2.

ams02-LiveStreamingBlade
Fig. 2

Click the [Add live event] button (Fig. 3) to open the "Create live event" dialog, as shown in Fig. 4.

ams03-AddLiveEventButton
Fig. 3

ams04-CreateLiveEvent
Fig. 4

At the "Live event name" field, enter a name for your event.

You may optionally enter a description and change the Encoding, Input protocol, Input ID, or Static hostname prefix.

Check the "I have all the rights..." checkbox to indicate you are not streaming content owned by anyone other than yourself.

Click the [Review + create] button to display the summary page, as shown in Fig. 5.

ams05-CreateLiveEventConfirmation
Fig. 5

If any validation errors display, return to the "Basics" page, and correct them.

Click the [Create] button to create the live event.
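
This article uses the portal, but the same event can be created from a script. Below is a rough Azure CLI sketch, not taken from the article: the resource group, account, and event names are hypothetical placeholders, and the "--ips AllowAll" preview setting is an assumption you should tighten for production.

# Create an RTMP live event (hypothetical names throughout)
az ams live-event create \
    --resource-group MyResourceGroup \
    --account-name mymediasvcs \
    --name MyLiveEvent \
    --streaming-protocol RTMP \
    --ips AllowAll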

When the event is created, you will return to the "Live streaming" blade with your event listed, as shown in Fig. 6.

ams06-LiveStreamingBlade
Fig. 6

Click the event name link to display the event details, as shown in Fig. 7.

ams07-LiveEvent
Fig. 7

Click the [Start] button (Fig. 8) and click [Start] on the confirmation popup (Fig. 9) to start the event.

ams08-StartButton
Fig. 8

ams09-ConfirmStart
Fig. 9

When the event is started, the event details page will show information about the input, as shown in Fig. 10.

ams10-LiveEvent
Fig. 10

The "Input URL" textbox (Fig 11) displays a URL that you will need in your streaming software. Copy this URL and save it somewhere. You will need it in your streaming software.

ams11-InputUrl
Fig. 11
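
If you are scripting the broadcast instead, you can start the event and read back the ingest URL rather than copying it from the portal. A minimal sketch, assuming the same hypothetical names as above and that the CLI output exposes the ingest endpoints under "input.endpoints":

# Start the live event (billing for the event begins now)
az ams live-event start -g MyResourceGroup -a mymediasvcs -n MyLiveEvent

# Read the RTMP ingest URL to paste into your streaming software
az ams live-event show -g MyResourceGroup -a mymediasvcs -n MyLiveEvent \
    --query "input.endpoints[0].url" -o tsv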

For the next part, you will need the streaming software mentioned earlier. I used Wirecast from Telestream; the user interface of the free demo version is shown in Fig. 12.

ams12-Wirecast
Fig. 12

The following steps are specific to Wirecast, but other streaming software will have similar steps.

Click the [+] button on the first layer (Fig. 13) to open the "Add Shot" dialog, as shown in Fig. 14.

ams13-Layer
Fig. 13

ams14-AddLayer
Fig. 14

I chose to share the image captured by my webcam, but you can share screen captures or videos, if you like. The image you are capturing will be set as a "preview". Make this same layer broadcast live by clicking the "Live" button (Fig. 15).

ams15-GoButton
Fig. 15

Now, configure your streaming software to send its live video to your AMS Live Streaming event. Select Output | Output Settings... from the menu to open the Output dialog, as shown in Fig. 16.

ams16-OutputSettings
Fig. 16

Select "RTMP Server" from the "Destination" dropdown and click the [OK] button to open the "Output settings" dialog, as shown in Fig. 17.

ams17-OutputSettings
Fig. 17

In the "Address" text box, paste the Input URL that you copied from the AMS Live Stream event. Click the [OK] button to close the dialog.

To begin streaming, select Output | Start / Stop Broadcasting | Start All from the menu, as shown in Fig. 18.

ams18-StartOutput
Fig. 18

Your UI should look similar to Fig. 19.

ams19-Wirecast
Fig. 19

Return to the Azure Media Services live event. You should see a preview of what you are broadcasting from your streaming software, as shown in Fig. 20. Refresh the page if you do not see it. There may be a few seconds delay between what is captured and what is displayed.

ams20-LiveEvent
Fig. 20

Click the [+ Create an output] button (Fig. 21) to open the "Create an output" dialog with the "Create output" tab selected, as shown in Fig. 22.

ams21-CreateAnOutput
Fig. 21

ams22-CreateOutputDialog
Fig. 22

Verify the information on this tab; then, click the [Next: Add streaming locator] button to advance to the "Add streaming locator" tab, as shown in Fig. 23.

ams23-CreateOutput
Fig. 23

Verify the information on this tab; then, click the [Create] button to create a streaming locator and endpoint. You will return to the live event blade, as shown in Fig. 24.

ams24-StreamingEndpoint
Fig. 24

Click the [Start streaming endpoint] button, then click the confirmation [Start] button, as shown in Fig. 25.

ams25-StartStreamingEndpoint
Fig. 25

After the streaming endpoint is started, copy the "Streaming URL" textbox contents (Fig. 26). You will need this to create an output page for viewers to watch your live event.

ams26-StreamingUrl
Fig. 26
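
For readers who script their AMS resources, the output steps above map onto a handful of Azure CLI calls. This is only a sketch: the names are hypothetical, and the one-hour archive window and clear (unencrypted) streaming policy are assumptions to adjust for your scenario.

# Asset that will receive the live output archive
az ams asset create -g MyResourceGroup -a mymediasvcs -n MyLiveAsset

# Live output: records the event into the asset
az ams live-output create -g MyResourceGroup -a mymediasvcs \
    --live-event-name MyLiveEvent -n MyLiveOutput \
    --asset-name MyLiveAsset --archive-window-length PT1H

# Streaming locator and endpoint, then list the playback paths
az ams streaming-locator create -g MyResourceGroup -a mymediasvcs \
    -n MyLiveLocator --asset-name MyLiveAsset \
    --streaming-policy-name Predefined_ClearStreamingOnly
az ams streaming-endpoint start -g MyResourceGroup -a mymediasvcs -n default
az ams streaming-locator get-paths -g MyResourceGroup -a mymediasvcs -n MyLiveLocator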

Create and launch a web page with the HTML in Listing 1.

Listing 1:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="STREAMING_URL"
                type="application/vnd.ms-sstr+xml" />
    </video>
</body>
</html>
  

where STREAMING_URL is the Streaming URL you copied from the live event textbox above.

Listing 2 shows an example with the URL filled in.

Listing 2:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="https://dgtestams-usea.streaming.media.azure.net/45fb391c-8e10-4d41-a0ab-a03e50d57afd/cb4a49d9-93ad-4bb1-8894-c3f0a9fb7d43.ism/manifest"
                type="application/vnd.ms-sstr+xml" />
    </video>
</body>
</html>
  

With the live event running, your web page should display something similar to Fig. 27.

ams27-WebPage
Fig. 27

If this is published on the web, viewers will be able to watch your live stream from just about anywhere.

Be sure to stop your live event when you finish broadcasting in order to avoid unnecessary charges.
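
If you scripted the setup, the teardown is two commands. A sketch with the same hypothetical names; stopping the streaming endpoint also stops its charges:

az ams live-event stop -g MyResourceGroup -a mymediasvcs -n MyLiveEvent
az ams streaming-endpoint stop -g MyResourceGroup -a mymediasvcs -n default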

In this article, I showed you how to create a live streaming event using Azure Media Services.

Tuesday, February 9, 2021 8:03:00 AM (GMT Standard Time, UTC+00:00)
# Thursday, February 4, 2021

GCast 102:

Video Files and Azure Media Services

Learn the capabilities of Azure Media Services, how to create an Azure Media Services account, and how to add audio and video files as Assets in that account.

Thursday, February 4, 2021 9:45:00 AM (GMT Standard Time, UTC+00:00)
# Wednesday, February 3, 2021

In a previous article, I showed how to embed into a web page a video encoded with Azure Media Services (AMS).

In this article, I will show you how to add captions to that video.

In my last article, I showed you how to perform audio transcription with Azure Media Services using an Audio Transcription Job. Among other things, this generates a transcript.vtt file with speech-to-text data, listing anything spoken in the video, along with the time at which the words were spoken.

You can also generate this data by using the "Video and Audio Analyzer" job, as described in this article.

For this to work, the transcript.vtt file must be in the same folder as the video(s) playing on your web page. A simple way to do this is to download the file from its current container and upload it into the Encoded Video container.

Navigate to the Azure Portal and to your Azure Media Services account, as shown in Fig. 1.

ams01-OverviewBlade
Fig. 1

Then, select "Assets" from the left menu to open the "Assets" blade, as shown in Fig. 2.

ams02-AssetsBlade
Fig. 2

Select the Output Asset containing Audio Transcription or Analyzer data to display the Asset details page, as shown in Fig. 3.

ams03-AssetDetails
Fig. 3

Click the link next to the "Storage container" label (Fig. 4) to open the Storage Blob container associated with this asset, as shown in Fig. 5. This container should open in a new browser tab.

ams04-StorageContainerLink
Fig. 4

ams05-Container
Fig. 5

Click the "transcript.vtt" row to open the blob blade showing details of the transcript.vtt blob, as shown in Fig. 6.

ams06-VttBlobDetails
Fig. 6

Click the download button (Fig. 7) in the top toolbar and save the transcript.vtt file to your local disk. Note where you save this file.

ams07-DownloadButton
Fig. 7

Listing 1 shows a sample VTT file.

Listing 1

WEBVTT

NOTE duration:"00:00:11.0290000"

NOTE language:en-us

NOTE Confidence: 0.90088177

00:00:00.000 --> 00:00:04.956 
 This video is about Azure Media Services

NOTE Confidence: 0.90088177

00:00:04.956 --> 00:00:11.029 
 and Azure Media Services are. Awesome.
  

Navigate again to the "Assets" blade, as shown in Fig. 8.

ams08-AssetsBlade
Fig. 8

In the row of the Analyzer or Audio Transcription asset, click the link in the "Storage link" column to open the container associated with this asset, as shown in Fig. 9.

ams09-Container
Fig. 9

Click the upload button (Fig. 10) to open the "Upload blob" dialog, as shown in Fig. 11.

ams10-UploadButton
Fig. 10

ams11-UploadBlobDialog
Fig. 11

Click the "Select a file" field to open a file navigation dialog. Navigate to the older where you stored transcript.vtt and select this file. Then, click the [Upload]

When the dialog closes, you should return to the Container blade and transcript.vtt should now be listed, as shown in Fig. 12.

ams12-Container
Fig. 12
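
If you prefer to script this copy, the download and upload can be done with the Azure CLI storage commands. A sketch only: the storage account and container names below are hypothetical (AMS generates container names of the form "asset-<guid>"), and "--auth-mode login" assumes your signed-in identity has blob access.

# Download transcript.vtt from the analyzer output container...
az storage blob download --account-name mymediastorage --auth-mode login \
    --container-name asset-11111111-aaaa-bbbb-cccc-222222222222 \
    --name transcript.vtt --file transcript.vtt

# ...and upload it into the encoded video container
az storage blob upload --account-name mymediastorage --auth-mode login \
    --container-name asset-33333333-dddd-eeee-ffff-444444444444 \
    --name transcript.vtt --file transcript.vtt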

Click to open the asset containing the video(s) used to generate the VTT file, as shown in Fig. 13.

ams13-AssetDetails
Fig. 13

Start the Streaming Locator, if it is not already started. If you have not yet created a Streaming Locator, this article walks you through it.

Copy the Streaming URL and save it somewhere. It should begin with "https://" and end with "manifest".

As a reminder, Listing 2 shows the HTML to embed an AMS video in a web page. This is the code shown in this article.

Listing 2:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="STREAMING_URL_MANIFEST"
                type="application/vnd.ms-sstr+xml" />
    </video>
</body>
</html>

where STREAMING_URL_MANIFEST is replaced with the Streaming URL you copied from the video asset.

To add captions to this video, add a <track> tag inside the <video> tag, as shown in Listing 3:

Listing 3

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="STREAMING_URL_MANIFEST"
                type="application/vnd.ms-sstr+xml" />
        <track src="VTT_URL" label="english" kind="subtitles" srclang="en-us" default />
    </video>
</body>
</html>

where VTT_URL is replaced with a URL consisting of the same domain and folder as in the src attribute of the source tag, but with "transcript.vtt" as the file name.

Listing 4 shows an example using an Azure Media Services account that I have since deleted.

Listing 4:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="https://dgtestblogams-usea.streaming.media.azure.net/232493e2-8c99-41a0-bb09-5a0aea47de35/3b331fca-41e6-458c-8171-235ef3f76875.ism/manifest"
                type="application/vnd.ms-sstr+xml" />
        <track src="https://dgtestblogams-usea.streaming.media.azure.net/29a650b6-5c0a-4932-8efb-2b4bb4a81bf0/transcript.vtt" label="english" kind="subtitles" srclang="en-us" default />
    </video>
</body>
</html>
  

Add this HTML file to any web server and navigate to its URL using a web browser. You should see a page with your video embedded and with captions displaying at the bottom of the video, as shown in Fig. 14.

ams14-VideoWithCaptions
Fig. 14

In this article, I showed you how to include captions in an Azure Media Services video embedded in a web page.

Wednesday, February 3, 2021 9:07:00 AM (GMT Standard Time, UTC+00:00)
# Tuesday, February 2, 2021

In a previous article, I showed you how to use Azure Media Services (AMS) to analyze a video. Among other things, this analysis performs audio transcription, converting speech in your video to text. This outputs two files containing the text spoken in your video's audio track.

You may want to perform only audio transcription. If you are not interested in the other analysis output, it does not make sense to spend the time or compute analyzing a video for the other features. AMS allows you to perform only audio transcription and skip the rest of the analysis.

Navigate to the Azure Portal and to your Azure Media Services account, as shown in Fig. 1.

ams01-OverviewBlade
Fig. 1

Then, select "Assets" from the left menu to open the "Assets" blade, as shown in Fig. 2.

ams02-AssetsBlade
Fig. 2

Select the Input Asset you uploaded to display the Asset details page, as shown in Fig. 3.

ams03-AssetDetails
Fig. 3

Click the [Add job] button (Fig. 4) to display the "Create a job" dialog, as shown in Fig. 5.

ams04-AddJobButton
Fig. 4

ams05-CreateJob
Fig. 5

At the "Transform" field, select the "Create new" radio button.

At the "Transform name" textbox, enter a name to help you identify this Transform.

At the "Description" field, you may optionally enter some text to describe what this transform will do.

At the "Transform type" field, select the "Audio transcription" radio button.

At the "Analysis type" field, select the "Video and audio" radio button.

The "Automatic language detection" section allows you to either specify the audio language or allow AMS to figure this out. If you know the language, select the "No" radio button and select the language from the dropdown list. If you are unsure of the language, select the "Yes" radio button to allow AMS to infer it.

The "Configure Output" section allows you to specify where the generated output assets will be stored.

At the "Output asset name" field, enter a descriptive name for the output asset. AMS will suggest a name, but I prefer the name of the Input Asset, followed by "_AudioTranscription" or something more descriptive.

At the "Asset storage account" dropdown, select the Azure Storage Account in which to save a container and the blob files associated with the output asset.

At the "Job name" field, enter a descriptive name for this job. A descriptive name is helpful if you have many jobs running and want to identify this one.

At the "Job priority" dropdown, select the priority in which this job should run. The options are "High", "Low", and "Normal". I generally leave this as "Normal" unless I have a reason to change it. A High priority job will run before a Normal priority job, which will run before a Low priority job.

Click the [Create] button to create the job and queue it to be run.
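
The same transform and job can be created from the Azure CLI. A hedged sketch: the names are hypothetical, I am assuming the built-in "AudioAnalyzer" preset is what backs the portal's audio transcription option, and the trailing "=" in --output-assets marks an output asset with no label.

# Transform that runs only the audio analyzer
az ams transform create -g MyResourceGroup -a mymediasvcs \
    -n AudioTranscriptionTransform --preset AudioAnalyzer

# Output asset, then the job itself
az ams asset create -g MyResourceGroup -a mymediasvcs -n MyVideo_AudioTranscription
az ams job start -g MyResourceGroup -a mymediasvcs \
    --transform-name AudioTranscriptionTransform -n MyTranscriptionJob \
    --input-asset-name MyVideo --output-assets MyVideo_AudioTranscription=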

You can check the status of the job by selecting "Transforms + jobs" from the left menu to open the "Transforms + jobs" blade (Fig. 6) and expanding the job you just created (Fig. 7).

ams06-TransformsJobs
Fig. 6

ams07-ExpandJob
Fig. 7

The state column tells you whether the job is queued, running, or finished.

Click the name of the job to display details about the job, as shown in Fig. 8.

ams08-JobDetails
Fig. 8

After the job finishes, when you return to the "Assets" blade, you will see the new output Asset listed, as shown in Fig. 9.

ams09-AssetsBlade
Fig. 9

Click the name of the asset you just created to display the Asset Details blade, as shown in Fig. 10.

ams10-AudioTranscriptionAssetDetails
Fig. 10

Click the link to the left of "Storage container" to view the files in Blob storage, as shown in Fig. 11.

ams11-Container
Fig. 11

The speech-to-text output can be found in the files transcript.ttml and transcript.vtt. These two files contain the same information - words spoken in the video and times they were spoken - but they are in different standard formats.

Listing 1 shows a sample TTML file for a short video, while Listing 2 shows a VTT file for the same video.

Listing 1:

<?xml version="1.0" encoding="utf-8"?>
 <tt xml:lang="en-US" xmlns="http://www.w3.org/ns/ttml" xmlns:tts="http://www.w3.org/ns/ttml#styling" xmlns:ttm="http://www.w3.org/ns/ttml#metadata">
   <head>
     <metadata>
       <ttm:copyright>Copyright (c) 2013 Microsoft Corporation.  All rights reserved.</ttm:copyright>
     </metadata>
     <styling>
       <style xml:id="Style1" tts:fontFamily="proportionalSansSerif" tts:fontSize="0.8c" tts:textAlign="center" tts:color="white" />
     </styling>
     <layout>
       <region style="Style1" xml:id="CaptionArea" tts:origin="0c 12.6c" tts:extent="32c 2.4c" tts:backgroundColor="rgba(0,0,0,160)" tts:displayAlign="center" tts:padding="0.3c 0.5c" />
     </layout>
   </head>
   <body region="CaptionArea">
     <div>
       <!-- Confidence: 0.90088177 -->
       <p begin="00:00:00.000" end="00:00:07.080">This video is about Azure Media Services and Azure Media</p>

      <!-- Confidence: 0.90088177 -->
       <p begin="00:00:07.206" end="00:00:08.850">Services are.</p>

      <!-- Confidence: 0.935814 -->
       <p begin="00:00:08.850" end="00:00:11.029">Awesome.</p>
     </div>
   </body>
 </tt>
  

Listing 2:

WEBVTT

NOTE duration:"00:00:11.0290000"

NOTE language:en-us

NOTE Confidence: 0.90088177

00:00:00.000 --> 00:00:04.956 
This video is about Azure Media Services

NOTE Confidence: 0.90088177

00:00:04.956 --> 00:00:11.029 
and Azure Media Services are. Awesome.
  
Tuesday, February 2, 2021 9:24:00 AM (GMT Standard Time, UTC+00:00)
# Friday, January 22, 2021

In a previous article, I showed you how to upload an asset to an Azure Media Services (AMS) account. In this article, you will learn how to use Azure Media Services to analyze a video.

Navigate to the Azure Portal and to your Azure Media Services account, as shown in Fig. 1.

ams01-OverviewBlade
Fig. 1

Then, select "Assets" from the left menu to open the "Assets" blade, as shown in Fig. 2.

ams02-AssetsBlade
Fig. 2

Select the Input Asset you uploaded to display the Asset details page, as shown in Fig. 3.

ams03-AssetDetails
Fig. 3

Click the [Add job] button (Fig. 4) to display the "Create a job" dialog, as shown in Fig. 5.

ams04-AddJobButton
Fig. 4

ams05-CreateJobBlade
Fig. 5

At the "Transform" field, select the "Create new" radio button.

At the "Transform name" textbox, enter a name to help you identify this Transform.

At the "Description" field, you may optionally enter some text to describe what this transform will do.

At the "Transform type" field, select the "Video and audio analyzer" radio button.

At the "Analysis type" field, select the "Video and audio" radio button.

The "Automatic language detection" section allows you to either specify the audio language or allow AMS to figure this out. If you know the language, select the "No" radio button and select the language from the dropdown list. If you are unsure of the language, select the "Yes" radio button to allow AMS to infer it.

The "Configure Output" section allows you to specify where the generated output assets will be stored.

At the "Output asset name" field, enter a descriptive name for the output asset. AMS will suggest a name, but I prefer the name of the Input Asset, followed by "_Analysis" or something more descriptive.

At the "Asset storage account" dropdown, select the Azure Storage Account in which to save a container and the blob files associated with the output asset.

At the "Job name" field, enter a descriptive name for this job. A descriptive name is helpful if you have many jobs running and want to identify this one.

At the "Job priority" dropdown, select the priority in which this job should run. The options are "High", "Low", and "Normal". I generally leave this as "Normal" unless I have a reason to change it. A High priority job will run before a Normal priority job, which will run before a Low priority job.

Click the [Create] button to create the job and queue it to be run.
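
If you script your pipelines, the portal steps above correspond roughly to a transform built on the "VideoAnalyzer" preset plus a job. An Azure CLI sketch with hypothetical names; the last command polls the job state:

az ams transform create -g MyResourceGroup -a mymediasvcs \
    -n VideoAnalyzerTransform --preset VideoAnalyzer

az ams asset create -g MyResourceGroup -a mymediasvcs -n MyVideo_Analysis
az ams job start -g MyResourceGroup -a mymediasvcs \
    --transform-name VideoAnalyzerTransform -n MyAnalysisJob \
    --input-asset-name MyVideo --output-assets MyVideo_Analysis=

# Check the job state (e.g. Queued, Processing, Finished)
az ams job show -g MyResourceGroup -a mymediasvcs \
    --transform-name VideoAnalyzerTransform -n MyAnalysisJob --query state -o tsv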

You can check the status of the job by selecting "Transforms + jobs" from the left menu to open the "Transforms + jobs" blade (Fig. 6) and expanding the job you just created (Fig. 7).

ams06-TransformJobs
Fig. 6

ams07-ExpandJob
Fig. 7

The state column tells you whether the job is queued, running, or finished.

Click the name of the job to display details about the job, as shown in Fig. 8.

ams08-JobDetails
Fig. 8

After the job finishes, when you return to the "Assets" blade, you will see the new output Asset listed, as shown in Fig. 9.

ams09-AssetsBlade
Fig. 9

Click on the link in the "Storage link" column to view the files in Blob storage, as shown in Fig. 10.

ams10-Container
Fig. 10

AMS Analytics produces the following text files:

| File Name | Contents |
|---|---|
| annotations.json | A set of tags identifying objects and actions at various points throughout the video |
| contentmoderation.json | Information at time points throughout the video, indicating whether the video contains racy and/or adult content and should be reviewed |
| emotions.json | An analysis of emotions displayed on the faces in the video |
| faces.json | Details of each face detected in the video at various time points |
| insights.json | A file containing information on faces, OCR, and transcriptions at time points throughout the video |
| lid.json | Spoken languages detected at various time points throughout the video |
| metadata.json | Data about the video and audio tracks, such as format and size |
| ocr.json | The text of any words displayed on screen |
| rollingcredits.json | Information about rolling credits displayed, if any |
| transcript.ttml | A transcription of any spoken text in the video, in Timed Text Markup Language (TTML) format |
| transcript.vtt | A transcription of any spoken text in the video, in WebVTT format |

In addition, you will find thumbnail images taken from the video as JPG files or as a ZIP file containing multiple JPG files.

In this article, you learned how to use Azure Media Services to analyze an audio/video file.

Friday, January 22, 2021 9:45:00 AM (GMT Standard Time, UTC+00:00)
# Wednesday, January 20, 2021

In a previous article, I showed you how to use Azure Media Services to generate a Streaming Locator so that others can view and/or download your video.

In this article, I will show you how to create a web page that allows users to select the format and resolution in which they want to view your video. 

Navigate to the Azure Portal and to your Azure Media Services account, as shown in Fig. 1

ams01-OverviewBlade
Fig. 1

Then, select "Assets" from the left menu to open the "Assets" blade, as shown in Fig. 2.

ams02-AssetsBlade
Fig. 2

Select the Output Asset created by encoding your input video Asset to display the Asset details page, as shown in Fig. 3.

ams03-AdaptiveStreamingAsset
Fig. 3

Verify that the Streaming Locator exists and is running. Start it, if necessary.

Click the "View locator" link to display the "Streaming URLs" dialog, as shown in Fig. 4.

ams04-StreamingUrlsBlade
Fig. 4

Scroll down to the "SmoothStreaming" section shown in Fig. 5.

ams05-SmoothStreaming
Fig. 5

The SmoothStreaming URL points to a file named "manifest", which is an XML document with information on available encoded videos in this asset. A sample of such a document is in Listing 1.

Listing 1:

<?xml version="1.0" encoding="UTF-8"?> 
< SmoothStreamingMedia MajorVersion="2" MinorVersion="2" Duration="110720000" TimeScale="10000000"> 
    < StreamIndex Chunks="2" Type="audio" Url="QualityLevels({bitrate})/Fragments(aac_und_2_127999_2_1={start time})" QualityLevels="1" Language="und" Name="aac_und_2_127999_2_1"> 
        < QualityLevel AudioTag="255" Index="0" BitsPerSample="16" Bitrate="127999" FourCC="AACL" CodecPrivateData="1190" Channels="2" PacketSize="4" SamplingRate="48000" /> 
        <c t="0" d="60160000" /> 
        <c d="50560000" /> 
    </StreamIndex> 
    < StreamIndex Chunks="2" Type="video" Url="QualityLevels({bitrate})/Fragments(video={start time})" QualityLevels="4"> 
        < QualityLevel Index="0" Bitrate="2478258" FourCC="H264" MaxWidth="1024" MaxHeight="576" CodecPrivateData="000000016764001FACD94040049B0110000003001000000303C0F18319600000000168EBECB22C" /> 
        < QualityLevel Index="1" Bitrate="1154277" FourCC="H264" MaxWidth="640" MaxHeight="360" CodecPrivateData="000000016764001EACD940A02FF970110000030001000003003C0F162D960000000168EBECB22C" /> 
        < QualityLevel Index="2" Bitrate="731219" FourCC="H264" MaxWidth="480" MaxHeight="270" CodecPrivateData="0000000167640015ACD941E08FEB0110000003001000000303C0F162D9600000000168EBECB22C" /> 
        < QualityLevel Index="3" Bitrate="387314" FourCC="H264" MaxWidth="320" MaxHeight="180" CodecPrivateData="000000016764000DACD941419F9F0110000003001000000303C0F14299600000000168EBECB22C" /> 
        <c t="0" d="60000000" /> 
        <c d="50333333" /> 
    </StreamIndex> 
< /SmoothStreamingMedia>
  

Notice there are two <StreamIndex> tags: one for the audio and one for the video. The audio StreamIndex tag has only one <QualityLevel> child tag, indicating that there is only one audio option. The video StreamIndex tag has four <QualityLevel> child tags, indicating that there are four video options, each with a different size and bitrate.

We can add the SmoothStreaming manifest URL to an HTML <video> tag, as shown in Listing 2.

Listing 2:

<video
       id="vid1"
       class="azuremediaplayer amp-default-skin"
       autoplay
       controls
       width="848"
       height="480"
       data-setup='{"nativeControlsForTouch": false}'>
    <source
           src="https://dgtestams-usea.streaming.media.azure.net/77ec142c-e655-41a2-8ddb-a3e46168751a/WIN_20201215_14_28_08_Pro.ism/manifest"
           type="application/vnd.ms-sstr+xml" />
</video>
  

A full web page is shown in Listing 3:

Listing 3:

<html>
    <head>
        <link href="https://amp.azure.net/libs/amp/latest/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet" />
        <script src="https://amp.azure.net/libs/amp/latest/azuremediaplayer.min.js"></script>
    </head>
    <body>
        <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="848" height="480" data-setup='{"nativeControlsForTouch": false}'>
            <source src="https://dgtestams-usea.streaming.media.azure.net/77ec142c-e655-41a2-8ddb-a3e46168751a/WIN_20201215_14_28_08_Pro.ism/manifest" type="application/vnd.ms-sstr+xml" />
        </video>
    </body>
</html>
  
  

Fig. 6 shows the output of Listing 3 when viewed in a browser.

ams06-VideoTagInBrowser
Fig. 6

As you can see, clicking the "Quality" icon at the bottom right of the player allows the viewer to select the quality of the video. This is helpful if the user is on a low-bandwidth connection.

Note that you are charged extra when the Streaming Locator is running, so it is important to stop the Locator if you do not need it.

In this article, you learned how to use the SmoothStreaming URL to add your video to a web page.

Wednesday, January 20, 2021 8:03:00 AM (GMT Standard Time, UTC+00:00)
# Tuesday, January 19, 2021

In a previous article, I showed you how to use Azure Media Services to encode a video.

In this article, I will show you how to generate a URL, allowing others to view your encoded video online.

Navigate to the Azure Portal and to your Azure Media Services account, as shown in Fig. 1

ams01-OverviewBlade
Fig. 1

Then, select "Assets" from the left menu to open the "Assets" blade, as shown in Fig. 2.

ams02-AssetsBlade
Fig. 2

Select the Output Asset created by encoding your input video Asset to display the Asset details page, as shown in Fig. 3.

ams03-AdaptiveStreamingAsset
Fig. 3

Click the [New streaming locator] button (Fig. 4) to display the "Add streaming locator" dialog, as shown in Fig. 5.

ams04-NewStreamingLocatorButton
Fig. 4

ams05-AddStreamingLocatorBlade
Fig. 5

At the "Name" field, enter a descriptive name for this locator.

At the "Streaming policy" dropdown, select a desired Streaming Policy. A Streaming Policy define streaming protocols and encryption options. There are options to allow for streaming online or downloading and for adding encryption and Digital Rights Management. For this demo, I have selected "Predefined_DownloadAndClearStreaming". This allows users to view the video online and to download it; and it adds no encryption or DRM.

The flowchart in Fig. 6 is from the Microsoft Azure documentation and will help you decide which Streaming Policy is right for you.

ams06-StreamingPolicyFlowchart
Fig. 6

The Streaming Locator will not last forever: you must include an expiration date and time. By default, this is set to 100 years in the future. Change this if you want your video to be accessible for a shorter period of time.

Click the [Add] button to add the Streaming Locator.

The dialog will close and return you to the output Assets details page with the Streaming URL filled in, as shown in Fig. 7.

ams07-StartStreamingEndpoint
Fig. 7

The URL is still not accessible until you start the streaming endpoint. Click the [Start streaming endpoint] button and the [Start] button in the popup dialog to enable the endpoint URL, as shown in Fig. 8.

ams08-ConfirmStartStreamingEndpoint
Fig. 8
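
If you want to automate these locator steps, here is an Azure CLI sketch. The names are hypothetical, and the policy matches the one chosen above:

az ams streaming-locator create -g MyResourceGroup -a mymediasvcs \
    -n MyVideoLocator --asset-name MyVideo_AdaptiveStreaming \
    --streaming-policy-name Predefined_DownloadAndClearStreaming

az ams streaming-endpoint start -g MyResourceGroup -a mymediasvcs -n default

# List the streaming and download paths exposed by the locator
az ams streaming-locator get-paths -g MyResourceGroup -a mymediasvcs -n MyVideoLocator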

After doing this, the Streaming locator will show as "STREAMING" on the output Asset details page, as shown in Fig. 9. When the Streaming endpoint is started, you can preview the video from within the Asset details page.

ams09-ViewLocator
Fig. 9

Click the "View locator" link to display the "Streaming URLs" dialog. Select the "Streaming endpoint" from the dropdown, as shown in Fig. 10.

ams10-StreamingUrlsBlade
Fig. 10

Many URLs are displayed in this dialog. For the encoding I selected, the "Download" section contains a JPG URL and several MP4 URLs, as shown in Fig. 11.

ams11-Downloads
Fig. 11

You can paste the JPG URL into a browser to view a thumbnail image from the video. You can paste any of the MP4 URLs into a browser to view or download the full video. The MP4 URLs may differ by their resolution and encoding. Clues about which is which can be found in the name of each MP4. The JSON URLs provide metadata about the video and can be used in applications. For example,
https://dgtestams-usea.streaming.media.azure.net/77ec142c-e655-41a2-8ddb-a3e46168751a/WIN_20201215_14_28_08_Pro_1024x576_AACAudio_2478.mp4
contains a video that is 1024 pixels wide by 576 pixels high and was encoded with Advanced Audio Coding (AAC).

Try each of these in your browser to see the differences.

Note that you are charged extra when the Streaming Locator is running, so it is important to stop the Locator if you do not need it.

In this article, you learned how to create and start a Streaming Locator to make your Azure Media Services video available online.

Tuesday, January 19, 2021 9:46:00 AM (GMT Standard Time, UTC+00:00)
# Friday, January 15, 2021

In the last article, I showed you how to add Assets to an Azure Media Services (AMS) account. An Asset can point to an audio or video file, but you will want to encode that file to allow others to consume it. Encoding converts an audio or video file into formats that a variety of clients can play, and there are many encoding options.

Those who are consuming your media are not all using the same systems. They may have different devices, different clients, different connection speeds, and different software installed. You will want to consider the capabilities and configurations of your users when you decide how to encode your media. Fortunately, Azure Media Services gives you many options.

We use an AMS Job to encode media. A job accepts an input Asset and produces an output Asset. That output Asset may consist of one or more files stored in a single Azure Storage Blob Container.

To begin encoding, navigate to the Azure Portal and open an Azure Media Service, as shown in Fig. 1.

ams01-OverviewBlade
Fig. 1

Then, select "Assets" from the left menu to open the "Assets" blade, as shown in Fig. 2.

ams02-AssetsBlade
Fig. 2

See the following articles if you need help creating an AMS account or an AMS Asset.

Click the [Add job] button (Fig. 3) to display the "Create a job" dialog, as shown in Fig. 4.

ams03-AddJobButton
Fig. 3

ams04-CreateJobBlade
Fig. 4

The first thing you need to do is create or select a Transform. A Transform is a recipe for doing something, like encoding a video. It is used by a Job, which tells Azure to execute the steps in the Transform. I will assume this is your first time doing this and that you do not have any Transforms created, so you will need to create a new one; in the future, you may choose to re-use an existing Transform in a new Job.

At the "Transform" radio button, select "Create new".

At the "Transform name" textbox, enter a name to help you identify this Transform.

At the "Description" field, you may optionally enter some text to describe what this transform will do.

At the "Transform type" field, select the "Encoding" radio button.

At the "Built-in preset name" dropdown, you can select a desired encoding output appropriate for your audience. For this demo, select "Adaptive Streaming". This will create files in multiple formats that can be consumed by a variety of clients.

Next, we configure the settings for the output asset.

At the "Output asset name", enter a name to help you identify the output Asset that will be created. Azure will supply a default name, but I prefer to use something more readable, such as the Input Asset name, followed by the type of Transform.

At the "Asset storage account" dropdown, select the storage account in which to save the container and files associated with this asset.

At the "Job name" field, enter a name for this job to help you identify it later.

At the "Job priority" dropdown, select "Normal", "High", or "Low" priority, depending on whether you want this job to take precedence over other jobs. Unless I have a compelling reason, I leave this as the default "Normal".

Click the [Create] button to create and queue up the job.
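
For readers who prefer scripting, the same transform and job can be created with the Azure CLI. This is a sketch layered on the article, not part of it: the names are hypothetical, "AdaptiveStreaming" is the built-in preset selected above, and the trailing "=" marks an unlabeled output asset.

# Reusable transform with the built-in Adaptive Streaming preset
az ams transform create -g MyResourceGroup -a mymediasvcs \
    -n AdaptiveStreamingTransform --preset AdaptiveStreaming

# Output asset and encoding job (input asset "MyVideo" assumed to exist)
az ams asset create -g MyResourceGroup -a mymediasvcs -n MyVideo_AdaptiveStreaming
az ams job start -g MyResourceGroup -a mymediasvcs \
    --transform-name AdaptiveStreamingTransform -n MyEncodingJob \
    --input-asset-name MyVideo --output-assets MyVideo_AdaptiveStreaming=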

You can check the progress of the job by selecting "Transforms + jobs" in the left menu to display the "Transforms + jobs" blade, as shown in Fig. 5.

ams05-TransformJobsBlade
Fig. 5

Find the row with your Transform name (this is why it is important to give it an easily identifiable name). Expand it to see the Jobs using this Transform, as shown in Fig. 6.

ams06-TransformAndJobsBlade
Fig. 6

The state column tells you whether the job is queued, running, or finished.

From the "Transform + jobs" blade, you can click the name of the Transform to display more details about the Transform, as shown in Fig. 7 or click the name of the Job to display details about the job, as shown in Fig. 8.

ams07-TransformDetailsBlade
Fig. 7

ams08-JobDetailsBlade
Fig. 8

After the job finishes, when you return to the "Assets" blade, you will see the new output Asset listed, as shown in Fig. 9.

ams09-AssetsBlade
Fig. 9

Click on the link in the "Storage link" column to view the files in Blob storage, as shown in Fig. 10.

ams10-Container
Fig. 10

Note that there are multiple MP4 files, each with a different resolution. The name of each file indicates the resolution of the video. This allows users with smaller screens or slower bandwidth to select the optimum resolution for viewing.

The container also contains a thumbnail image and several text files with information describing the videos that client players can use.

In this article, you learned how to use Azure Media Services to encode a video. In the next article, I will show you how to share that video with others.

Friday, January 15, 2021 9:39:00 AM (GMT Standard Time, UTC+00:00)
# Wednesday, January 13, 2021

In the last article, I introduced Azure Media Services and showed how to create an Azure Media Services (AMS) account.

In this article, I will show you how to add video and/or audio assets to an Azure Media Services account. This is often the first step in sharing media online.

An Asset points to an Azure Storage Blob Container containing one or more files. These files contain either media or metadata about media. We distinguish between Input Assets (assets provided to AMS via a user or other external source) and Output Assets (assets produced by AMS jobs). Fig. 1 illustrates this relationship.

ams01-AssetsContainer
Fig. 1

Let's look at how to upload a video file from your local computer as an Asset, as illustrated in Fig. 2.

ams02-PublishDiagram
Fig. 2

Open the Azure Portal and navigate to the Azure Media Services account, as shown in Fig. 3.

ams03-OverviewBlade
Fig. 3

Select "Assets" in the left menu to open the "Assets" blade, as shown in Fig. 4.

ams04-AssetsBlade
Fig. 4

Click the [Upload] button (Fig. 5) to open the "Upload new assets" dialog, as shown in Fig. 6.

ams05-UploadButton
Fig. 5

ams06-UploadNewAsset
Fig. 6

At the "Storage account" dropdown, select the storage account in which you want to store the media file.

Click the "Upload files" icon and select the video file or files you want to upload.

More fields display for each file selected, as shown in Fig. 7.

ams07-UploadNewAsset-Completed
Fig. 7

Enter a name for each asset; then, click the [I agree and upload] button to begin uploading your video.

When the upload is complete, the asset will be listed, as shown in Fig. 8.

ams08-AssetsBladeWithAsset
Fig. 8

Click the link in the "Storage link" column to view the Storage Blob container and files associated with this asset, as shown in Fig. 9.

ams09-Container
Fig. 9
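
You can also create the asset and upload the file without the portal. A rough Azure CLI sketch; the names are hypothetical, and it assumes the asset's generated blob container is reported in the asset's "container" property:

# Create the (empty) input asset
az ams asset create -g MyResourceGroup -a mymediasvcs -n MyVideo

# Find the blob container AMS created for the asset, then upload into it
container=$(az ams asset show -g MyResourceGroup -a mymediasvcs \
    -n MyVideo --query container -o tsv)
az storage blob upload --account-name mymediastorage --auth-mode login \
    --container-name "$container" --name MyVideo.mp4 --file MyVideo.mp4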

In this article, you learned how to upload a video file to create an Azure Media Services asset. You will want to encode this asset so that others can view it. I will show how to encode it in the next article.

Wednesday, January 13, 2021 9:01:00 AM (GMT Standard Time, UTC+00:00)
# Tuesday, January 12, 2021

Streaming video online is an effective way to communicate to large numbers of people.

But there are challenges. You need to get your video online in a format that is accessible to others and make it available to your audience.

You may also want to provide closed captioning for hearing impaired users; analyze the contents of your audio and video; reduce latency with a Content Delivery Network; and secure your media appropriately.

Azure Media Services provides all these capabilities and does so in a highly scalable, fault-tolerant way.

The first step in using Azure Media Services is to create an Azure Media Services account. As with most services in Azure, you can create an Azure Media Services account in the Azure Portal by clicking the [Create a resource] button (Fig. 1); then search for and select "Media Services", as shown in Fig. 2.

ams01-CreateAResourceButton
Fig. 1

ams02-New
Fig. 2

The "Create media service account" dialog displays, as shown in Fig. 3.

ams03-CreateMediaServiceAccount
Fig. 3

At the "Subscription" dropdown, select the Subscription that will contain this Media Service Account. Most of you will have only one subscription.

At the "Resource group" field, select a Resource Group to contain this account or click the "Create new" link to create a new Resource Group to contain it. A Resource Group is a logical grouping of Azure resources, making it easier for you to manage them together.

At the "Media Services account name" field, enter a unique name for this account. This name must be between 3 and 24 characters in length and can contain only numbers and lowercase letters.

At the "Location" dropdown, select a location in which to store this service. When selecting a location, consider the location of your users and any potential legal issues.

At the "Storage Account" field, select an existing storage account from the dropdown or click the "Create a new storage account" link to create a new storage account. This storage account will hold all the assets for your service, including audio files, video files, and metadata files. Unless I have media files that already exist, I tend to prefer to keep all my Azure Media Services assets in their own storage account.

Click the [Review + create] button to display the summary page, as shown in Fig. 4.

ams04-Review
Fig. 4

Check the "I have all the rights to use the content/file" checkbox and click the [Create] button to begin creating you Azure Media Services Account.

When the deployment is complete, the confirmation shown in Fig. 5 displays.

ams05-DeploymentIsComplete
Fig. 5

Click the [Go to resource] button to navigate to the "Overview" blade of the Media Service account, as shown in Fig. 6.

ams06-OverviewBlade
Fig. 6

In this article, you learned the advantages of Azure Media Services and how to create an Azure Media Services account. In the next article, I will show you how to add media assets to this account.

Tuesday, January 12, 2021 9:57:00 AM (GMT Standard Time, UTC+00:00)
# Thursday, September 13, 2018

GCast 13:

Azure Media Services: Closed Captioning

Generate and add closed captioning to your video with Microsoft Azure Media Services.

Thursday, September 13, 2018 7:11:00 AM (GMT Daylight Time, UTC+01:00)
# Thursday, September 6, 2018

GCast 12:

Azure Media Services: Live Streaming

Learn how to Upload, Encode, and Share a Video using Azure Media Services.

Thursday, September 6, 2018 9:22:00 AM (GMT Daylight Time, UTC+01:00)
# Tuesday, August 28, 2018

A caption file can enhance a video file by displaying any dialog as text at the bottom of the video. This is helpful for people who are hard of hearing, for people who do not understand the language spoken in the video, and for those who want to play a video at low volume or muted.

Azure Media Services allows you to quickly create a caption file for your saved videos. The steps are:

  1. Create Azure Media Service
  2. Upload File
  3. Encode Asset
  4. Analyze Video
  5. Download VTT file
  6. Upload VTT file
  7. Share video

I described in detail how to perform steps 1-3 in this article.

Analyze Video

After you have encoded your video, you can use the "Analyze" function to generate the following files:

  • A caption file in the TTML format
  • A caption file in the SAMI format
  • A caption file in the WebVTT format

I have no statistics to back this up, but I see WebVTT used more than the other two formats, so I typically stick with it.

To run the "Analyze" function, open your Azure Media Service in the Azure Portal and open the "Assets" tag as shown in Fig. 1.

Fig01-AssetsBlade
Fig. 1

Click the asset corresponding to the encoded video to open its blade, as shown in Fig. 2.

Fig02-EncodedVideoAssetBlade
Fig. 2

Click the [Analyze] button (Fig. 3) to open the "Media Analytics" blade, as shown in Fig. 4.

Fig03-AnalyzeButton
Fig. 3

Fig04-MediaAnalyticsBlade
Fig. 4

Check the checkboxes next to the closed caption file formats (and other files) you wish to create; then click the [Create] button at the bottom to begin creating these files.

This creates and schedules a new job for this Azure Media Service. You can click the "Media Analytics job added" link (Fig. 5) at the top of the blade or you can open the "Jobs" blade for this Media Service and click the most recent job added. Either method will display the blade for this Job, as shown in Fig. 6.

Fig05-JobAdded
Fig. 5

Fig06-JobBlade
Fig. 6

Jobs run asynchronously in Azure Media Service, so you can continue to work or stay on the Job blade to monitor its status. The status changes from "Scheduled" to "Queued" to "Processing" to "Finished". While processing, the blade will display the percent complete.

When the job is finished, you will see a new asset in the "Assets" blade for the indexed video, as shown in Fig. 7.

Fig07-Assets
Fig. 7

Click this asset to open the blade for the Analytics files created by this job, as shown in Fig. 8.

Fig08-IndexedFileAsset
Fig. 8

To make these files available for download, you must publish them. To do so, click the [Publish] button (Fig. 9), which opens the "Publish the asset" blade, as shown in Fig. 10.

Fig09-PublishButton
Fig. 9

Fig10-PublishTheAssetBlade
Fig. 10

After you have published these files, you can download any or all of them. They are listed at the bottom of the Analytics files blade (Fig. 8).

Click the file with the ".vtt" extension to open a blade for this file, as shown in Fig. 11.

Fig11-VTTFileBlade
Fig. 11

The important information is the DOWNLOAD URL field. You can copy this value to your clipboard by clicking the icon to the right of this field.

Download VTT file

Use CURL to download this file. If CURL is not installed on your computer, you can install it from https://curl.haxx.se/download.html.

Open a command prompt and type

curl -o "fffff.vtt" "http://xxxxxxxxxxxxxxx"

where http://xxxxxxxxxxxxxxx is the DOWNLOAD URL copied from the blade; and fffff.vtt is the name of the local file you want to create when you download this file.

Verify that a new file was created.

Upload VTT file

Now upload this VTT file to the video asset. Open the Azure Media Services "Assets" blade and select the Asset for the originally uploaded video to open the blade for this video asset, as shown in Fig. 12.

Fig12-AssetBlade
Fig. 12

Click the [Upload captions] button (Fig. 13) to open the "Upload caption file" blade, as shown in Fig. 14.

Fig13-UploadCaptionsButton
Fig. 13

Fig14-UploadCaptionFileBlade
Fig. 14

Click the [Select File] icon (Fig. 15) and select the VTT file you downloaded with CURL.

Fig15-SelectFileIcon
Fig. 15

Share video

Now, you can share the video, using a Media Player.

A simple one to use is the Azure Media Player, available at https://ampdemo.azureedge.net/

Information on using this player with Azure Media Services is in this article: http://davidgiard.com/2018/08/21/UploadingEncodingAndSharingAVideoWithAzureMediaServices.aspx.

You will need the URL of the video to play, which you can get by selecting the Encoded Videos asset from the "Assets" blade to open the properties for this asset, as shown in Fig. 16.

Fig16-EncodedVideosAsset
Fig. 16

Copy the Streaming URL at the bottom of this blade under the "Published URLs" section. (NOTE: If nothing is in this section, you need to publish this asset.)

In Azure Media Player, copy this URL into the URL field, omitting the "http:", as shown in Fig. 17.

Fig17-AzureMediaPlayer
Fig. 17

The copied text should look something like this:

//dgtest2-usea.streaming.media.azure.net/d7d7eb93-9ef8-40fd-9c01-8253c9ca2126/Gratitude.ism/manifest

It will end with the name of your video asset, followed by ".ism/manifest".

Click the [Update Player] button and verify that your video plays properly.

If it plays properly, check the "Advanced Options" checkbox to display more options and scroll to the bottom, as shown in Fig. 18.

Fig18-AdvancedOptions
Fig. 18

Click the [Add Track] button to display new track information, as shown in Fig. 19.

Fig19-AddTrack
Fig. 19

At the "Kind" dropdown, select "Captions"

At the "Track Label" field, enter "English" or the name of the language in which the video is recorded. This text will display in the "Closed Captioning" control.

At the "Language" dropdown, select "English" or the language in which the video is recorded.

At the "WebVTT URL" field, enter the URL of the uploaded VTT file. This will be identical to the video URL, except that the name of the VTT file replaces the name of the video asset and its ".ism/manifest" suffix.

It should look similar to this:

//dgtest2-usea.streaming.media.azure.net/d7d7eb93-9ef8-40fd-9c01-8253c9ca2126/Gratitude.vtt

Fig. 20 shows these fields completed for my video.

Fig20-VTTTrack
Fig. 20

Click the [Update Player] button.

Now, when you play the video, you should see a "CLOSED CAPTIONING" icon at the bottom right. Click this to reveal the CLOSED CAPTIONING setting and select the caption language you just added (English in the case of the video in Fig. 21).

Fig21-ClosedCaptioning
Fig. 21

With the captions enabled, you should see text below your video whenever anyone is speaking, as shown in Fig. 22.

Fig22-VideoWithCaptions
Fig. 22

In this article, I showed you how to create closed captioning and share a video that includes this captioning.

Tuesday, August 28, 2018 9:38:00 AM (GMT Daylight Time, UTC+01:00)