# Thursday, February 11, 2021

GCast 103:

Encode a Video with Azure Media Services

Learn how to use Azure Media Services to encode a video into multiple formats, including support for adaptive streaming.

Thursday, February 11, 2021 9:50:00 AM (GMT Standard Time, UTC+00:00)
# Tuesday, February 9, 2021

In previous articles, I showed how to use Azure Media Services (AMS) to work with video that you upload. In this article, I will show how to broadcast a live event using AMS.

Before you get started, you will need some streaming software. For the demo in this article, I used Wirecast from Telestream. Telestream offers a free version, which is good for learning and demos but not for production, as it places a watermark on all streaming videos.

You will need to create an Azure Media Services account, as described in this article.

After the Media Service is created, navigate to the Azure Portal and to your Azure Media Services account, as shown in Fig. 1.

ams01-OverviewBlade
Fig. 1

Then, select "Live streaming" from the left menu to open the "Live streaming" blade, as shown in Fig. 2.

ams02-LiveStreamingBlade
Fig. 2

Click the [Add live event] button (Fig. 3) to open the "Create live event" dialog, as shown in Fig. 4.

ams03-AddLiveEventButton
Fig. 3

ams04-CreateLiveEvent
Fig. 4

At the "Live event name" field, enter a name for your event.

You may optionally enter a description and change the Encoding, Input protocol, Input ID, or Static hostname prefix.

Check the "I have all the rights..." checkbox to indicate you are not streaming content owned by anyone other than yourself.

Click the [Review + create] button to display the summary page, as shown in Fig. 5.

ams05-CreateLiveEventConfirmation
Fig. 5

If any validation errors display, return to the "Basics" page, and correct them.

Click the [Create] button to create the live event.

When the event is created, you will return to the "Live streaming" blade with your event listed, as shown in Fig. 6.

ams06-LiveStreamingBlade
Fig. 6

Click the event name link to display the event details, as shown in Fig. 7.

ams07-LiveEvent
Fig. 7

Click the [Start] button (Fig. 8) and click [Start] on the confirmation popup (Fig. 9) to start the event.

ams08-StartButton
Fig. 8

ams09-ConfirmStart
Fig. 9

When the event is started, the event details page will show information about the input, as shown in Fig. 10.

ams10-LiveEvent
Fig. 10

The "Input URL" textbox (Fig. 11) displays a URL that you will need in your streaming software. Copy this URL and save it somewhere.

ams11-InputUrl
Fig. 11
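For an RTMP-protocol event, the Input URL is an rtmp:// address with a stream key in its path. Before pasting it into your encoder, you can run a quick shape check; here is a minimal Python sketch (the host/path pattern and the sample URL are assumptions for illustration, not AMS guarantees):

```python
from urllib.parse import urlparse

def looks_like_rtmp_ingest_url(url: str) -> bool:
    """Rough sanity check for an RTMP ingest URL before pasting it into an encoder.
    The expected shape (rtmp scheme, host, stream-key path) is an assumption."""
    parsed = urlparse(url)
    return (
        parsed.scheme in ("rtmp", "rtmps")   # live event input protocol
        and bool(parsed.hostname)            # has a server address
        and parsed.path.strip("/") != ""     # has a stream key / path segment
    )

# Hypothetical example in the general shape the portal displays:
sample = "rtmp://myevent-myaccount-usea.channel.media.azure.net:1935/live/1a2b3c4d"
result = looks_like_rtmp_ingest_url(sample)
```

This only validates the URL's shape; it does not verify that the event exists or is running.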

For the next part, you will need the streaming software mentioned earlier. The user interface of the free demo version of Wirecast is shown in Fig. 12.

ams12-Wirecast
Fig. 12

The following steps are specific to Wirecast, but other streaming software will have similar steps.

Click the [+] button on the first layer (Fig. 13) to open the "Add Shot" dialog, as shown in Fig. 14.

ams13-Layer
Fig. 13

ams14-AddLayer
Fig. 14

I chose to share the image captured by my webcam, but you can share screen captures or videos, if you like. The image you are capturing will be set as a "preview". Make this same layer broadcast live by clicking the "Live" button (Fig. 15).

ams15-GoButton
Fig. 15

Now, configure your streaming software to send its live video to your AMS Live Streaming event. Select Output | Output Settings... from the menu to open the Output dialog, as shown in Fig. 16.

ams16-OutputSettings
Fig. 16

Select "RTMP Server" from the "Destination" dropdown and click the [OK] button to open the "Output settings" dialog, as shown in Fig. 17.

ams17-OutputSettings
Fig. 17

In the "Address" text box, paste the Input URL that you copied from the AMS Live Stream event. Click the [OK] button to close the dialog.

To begin streaming, select Output | Start / Stop Broadcasting | Start All from the menu, as shown in Fig. 18.

ams18-StartOutput
Fig. 18

Your UI should look similar to Fig. 19.

ams19-Wirecast
Fig. 19

Return to the Azure Media Services live event. You should see a preview of what you are broadcasting from your streaming software, as shown in Fig. 20. Refresh the page if you do not see it. There may be a delay of a few seconds between what is captured and what is displayed.

ams20-LiveEvent
Fig. 20

Click the [+ Create an output] button (Fig. 21) to open the "Create an output" dialog with the "Create output" tab selected, as shown in Fig. 22.

ams21-CreateAnOutput
Fig. 21

ams22-CreateOutputDialog
Fig. 22

Verify the information on this tab; then, click the [Next: Add streaming locator] button to advance to the "Add streaming locator" tab, as shown in Fig. 23.

ams23-CreateOutput
Fig. 23

Verify the information on this tab; then, click the [Create] button to create a streaming locator and endpoint. You will return to the live event blade, as shown in Fig. 24.

ams24-StreamingEndpoint
Fig. 24

Click the [Start streaming endpoint] button, then click the confirmation [Start] button, as shown in Fig. 25.

ams25-StartStreamingEndpoint
Fig. 25

After the streaming endpoint is started, copy the "Streaming URL" textbox contents (Fig. 26). You will need this to create an output page for viewers to watch your live event.

ams26-StreamingUrl
Fig. 26

Create and launch a web page with the HTML in Listing 1.

Listing 1:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="STREAMING_URL"
                type="application/vnd.ms-sstr+xml" />
    </video>
</body>
</html>
  

where STREAMING_URL is the Streaming URL you copied from the live event textbox above.
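If you generate this page often, the substitution can be scripted rather than done by hand. A minimal Python sketch, using an abbreviated version of the template above (the streaming URL shown is hypothetical; yours comes from the live event):

```python
HTML_TEMPLATE = """<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
</head>
<body>
    <video id="vid1" autoplay controls>
        <source src="STREAMING_URL" type="application/vnd.ms-sstr+xml" />
    </video>
</body>
</html>
"""

def fill_streaming_url(template: str, streaming_url: str) -> str:
    """Replace the STREAMING_URL placeholder with the URL copied from the portal."""
    return template.replace("STREAMING_URL", streaming_url)

# Hypothetical streaming URL in the shape AMS displays:
page = fill_streaming_url(
    HTML_TEMPLATE,
    "https://myaccount-usea.streaming.media.azure.net/abc123/video.ism/manifest",
)
```

Write the returned string to an .html file and serve it from any web server.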

Listing 2 shows an example with the URL filled in.

Listing 2:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="https://dgtestams-usea.streaming.media.azure.net/45fb391c-8e10-4d41-a0ab-a03e50d57afd/cb4a49d9-93ad-4bb1-8894-c3f0a9fb7d43.ism/manifest"
                type="application/vnd.ms-sstr+xml" />
    </video>
</body>
</html>
  

With the live event running, your web page should display something similar to Fig. 27.

ams27-WebPage
Fig. 27

If this is published on the web, viewers will be able to watch your live stream from just about anywhere.

Be sure to stop your live event when you finish broadcasting in order to avoid unnecessary charges.

In this article, I showed you how to create a live streaming event using Azure Media Services.

Tuesday, February 9, 2021 8:03:00 AM (GMT Standard Time, UTC+00:00)
# Monday, February 8, 2021

Episode 647

Kevin Pilch on gRPC

Kevin Pilch describes gRPC - an open source system for making remote calls across processes and/or machines - and the .NET Core implementation of this system.

https://docs.microsoft.com/en-us/aspnet/core/grpc/?view=aspnetcore-5.0
Monday, February 8, 2021 9:30:00 AM (GMT Standard Time, UTC+00:00)
# Sunday, February 7, 2021

2/7
Today I am grateful for the imagination of Jon Favreau.

2/6
Today I am grateful to learn from my teammates.

2/5
Today I am grateful to watch yesterday's snowstorm from my warm safe home.

2/4
Today I am grateful for a productive day yesterday.

2/3
Today I am grateful for:
-a chance to catch up with J for the first time in months
-Christina, who answered my questions about Photoshop and Lightroom

2/2
Today I am grateful for 3 years in my current home.

2/1
Today I am grateful to go to the gym yesterday for the first time in months.

1/31
Today I am grateful for online Mass.

1/30
Today I am grateful for home-cooked meals.

1/29
Today I am grateful to deliver my first presentation of the year last night.

1/28
Today I am grateful for technical support from co-workers.

1/27
Today I am grateful to be a guest yesterday on the Visual Studio Toolbox show.

1/26
Today I am grateful for all the interviews with interesting people I've scheduled for #TechnologyAndFriends

1/25
Today I am grateful for a visit to the Indianapolis Art Museum yesterday.

1/24
Today I am grateful for a walk around Zionsville and Carmel yesterday.

1/23
Today I am grateful for an upgraded hotel room.

1/22
Today I am grateful for funny memes.

1/21
Today I am grateful to see so much hope and optimism expressed by so many people about yesterday's historic events.

1/20
Today I am grateful for a call from my cousin Kevin last night.

1/19
Today I am grateful for my new drone.

1/18
Today I am grateful for the legacy of Rev. Dr. Martin Luther King, Jr, who understood that non-violence was not a weakness.

1/17
Today I am grateful for a much-needed mid-afternoon nap yesterday.

1/16
Today I am grateful to everyone who shared photos of their dogs yesterday.

1/15
Today I am grateful for dogs.

1/14
Today I am grateful to spend the past few evenings sorting through photographs and papers of my late mother.

1/13
Today I am grateful for lots of advice yesterday on how to cook pasta.

1/12
Today I am grateful to restaurants who have worked to make outdoor dining safer in Chicago winter.

1/11
Today I am grateful for my new cabinet/end table.

1/10
Today I am grateful for Indian food.

1/9
Today I am grateful for good health.

1/8
Today I am grateful for my new laptop.

1/7
Today I am grateful for dinner with Tim and Natale last night before they drive to Austin tomorrow.

1/6
Today I am grateful for a virtual lunch with Annie yesterday.

1/5
Today I am grateful for a regular exercise routine.

1/4
Today I am grateful to return to work after an extended vacation.

Sunday, February 7, 2021 3:03:11 PM (GMT Standard Time, UTC+00:00)
# Thursday, February 4, 2021

GCast 102:

Video Files and Azure Media Services

Learn the capabilities of Azure Media Services, how to create an Azure Media Services account, and how to add audio and video files as Assets in that account.

Thursday, February 4, 2021 9:45:00 AM (GMT Standard Time, UTC+00:00)
# Wednesday, February 3, 2021

In a previous article, I showed how to embed into a web page a video encoded with Azure Media Services (AMS).

In this article, I will show you how to add captions to that video.

In my last article, I showed you how to perform audio transcription with Azure Media Services using an Audio Transcription Job. Among other things, this generates a transcript.vtt file with speech-to-text data, listing anything spoken in the video, along with the time at which the words were spoken.

You can also generate this data by using the "Video and Audio Analyzer" job, as described in this article.

For this to work, the transcript.vtt file must be in the same folder as the video(s) playing on your web page. A simple way to do this is to download the file from its current container and upload it into the Encoded Video container.

Navigate to the Azure Portal and to your Azure Media Services account, as shown in Fig. 1.

ams01-OverviewBlade
Fig. 1

Then, select "Assets" from the left menu to open the "Assets" blade, as shown in Fig. 2.

ams02-AssetsBlade
Fig. 2

Select the Output Asset containing Audio Transcription or Analyzer data to display the Asset details page, as shown in Fig. 3.

ams03-AssetDetails
Fig. 3

Click the link next to the "Storage container" label (Fig. 4) to open the Storage Blob container associated with this asset, as shown in Fig. 5. This container should open in a new browser tab.

ams04-StorageContainerLink
Fig. 4

ams05-Container
Fig. 5

Click the "transcript.vtt" row to open the blob blade showing details of the transcript.vtt blob, as shown in Fig. 6.

ams06-VttBlobDetails
Fig. 6

Click the download button (Fig. 7) in the top toolbar and save the transcript.vtt file on your local disk. Note where you save this file.

ams07-DownloadButton
Fig. 7

Listing 1 shows a sample VTT file.

Listing 1

WEBVTT

NOTE duration:"00:00:11.0290000"

NOTE language:en-us

NOTE Confidence: 0.90088177

00:00:00.000 --> 00:00:04.956 
 This video is about Azure Media Services

NOTE Confidence: 0.90088177

00:00:04.956 --> 00:00:11.029 
 and Azure Media Services are. Awesome.
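If you want to post-process the transcript - for example, to index the spoken text - the cues are easy to extract. A minimal Python sketch against a sample shaped like Listing 1 (an illustration only, not a full WEBVTT parser):

```python
import re

# Matches a cue timing line such as "00:00:00.000 --> 00:00:04.956"
CUE_TIMING = re.compile(r"(\d{2}:\d{2}:\d{2}\.\d{3})\s*-->\s*(\d{2}:\d{2}:\d{2}\.\d{3})")

def parse_vtt(vtt_text: str):
    """Extract (start, end, text) cues from a WEBVTT transcript."""
    cues = []
    lines = vtt_text.splitlines()
    i = 0
    while i < len(lines):
        match = CUE_TIMING.search(lines[i])
        if match:
            text_parts = []
            i += 1
            # The cue payload runs until the next blank line
            while i < len(lines) and lines[i].strip():
                text_parts.append(lines[i].strip())
                i += 1
            cues.append((match.group(1), match.group(2), " ".join(text_parts)))
        else:
            i += 1
    return cues

SAMPLE = """WEBVTT

NOTE Confidence: 0.90088177

00:00:00.000 --> 00:00:04.956
 This video is about Azure Media Services

NOTE Confidence: 0.90088177

00:00:04.956 --> 00:00:11.029
 and Azure Media Services are. Awesome.
"""

cues = parse_vtt(SAMPLE)
```

Each tuple holds a cue's start time, end time, and caption text.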
  

Navigate again to the "Assets" blade, as shown in Fig. 8.

ams08-AssetsBlade
Fig. 8

In the row of the Analyzer or Audio Transcription asset, click the link in the "Storage link" column to open the container associated with this asset, as shown in Fig. 9.

ams09-Container
Fig. 9

Click the upload button (Fig. 10) to open the "Upload blob" dialog, as shown in Fig. 11.

ams10-UploadButton
Fig. 10

ams11-UploadBlobDialog
Fig. 11

Click the "Select a file" field to open a file navigation dialog. Navigate to the folder where you stored transcript.vtt and select this file. Then, click the [Upload] button.

When the dialog closes, you should return to the Container blade and transcript.vtt should now be listed, as shown in Fig. 12.

ams12-Container
Fig. 12

Open the asset containing the video(s) used to generate the VTT file, as shown in Fig. 13.

ams13-AssetDetails
Fig. 13

Start the Streaming Locator, if it is not already started. If you have not yet created a Streaming Locator, this article walks you through it.

Copy the Streaming URL and save it somewhere. It should begin with "https://" and end with "manifest".

As a reminder, Listing 2 shows the HTML to embed an AMS video in a web page. This is the code shown in this article.

Listing 2:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="STREAMING_URL_MANIFEST"
                type="application/vnd.ms-sstr+xml" />
    </video>
</body>
</html>
  

where STREAMING_URL_MANIFEST is replaced with the Streaming URL you copied from the video asset.

To add captions to this video, add a <track> tag inside the <video> tag, as shown in Listing 3:

Listing 3

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="STREAMING_URL_MANIFEST"
                type="application/vnd.ms-sstr+xml" />
        <track src="VTT_URL" label="english" kind="subtitles" srclang="en-us" default />
    </video>
</body>
</html>
  

where VTT_URL is replaced with a URL consisting of the same domain and folder as in the src attribute of the source tag, but with "transcript.vtt" as the file name.
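That substitution can be expressed in code. Here is a hedged Python sketch: by default it reuses the manifest URL's first path segment, but - as Listing 4 shows - the VTT may be published under a different streaming locator, so the function accepts an override (all URLs here are illustrative):

```python
from typing import Optional
from urllib.parse import urlparse, urlunparse

def transcript_url_from_manifest(manifest_url: str, locator: Optional[str] = None) -> str:
    """Build a transcript.vtt URL on the same streaming host as the manifest.
    By default this reuses the manifest's first path segment; pass `locator`
    when the VTT lives under a different streaming locator, as in Listing 4."""
    parsed = urlparse(manifest_url)
    first_segment = parsed.path.strip("/").split("/")[0]
    path = "/" + (locator or first_segment) + "/transcript.vtt"
    return urlunparse(parsed._replace(path=path))

# Hypothetical manifest URL in the shape AMS produces:
vtt_url = transcript_url_from_manifest(
    "https://myaccount-usea.streaming.media.azure.net/232493e2/video.ism/manifest"
)
```

Paste the returned URL into the track tag's src attribute.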

Listing 4 shows an example using an Azure Media Services account that I have since deleted.

Listing 4:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Azure Media Services Demo</title>
    <link href="https://amp.azure.net/libs/amp/2.3.6/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
    <script src="https://amp.azure.net/libs/amp/2.3.6/azuremediaplayer.min.js"></script>
</head>
<body>
    <h1>Video</h1>
    <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" data-setup='{"nativeControlsForTouch": false}'>
        <source src="https://dgtestblogams-usea.streaming.media.azure.net/232493e2-8c99-41a0-bb09-5a0aea47de35/3b331fca-41e6-458c-8171-235ef3f76875.ism/manifest"
                type="application/vnd.ms-sstr+xml" />
        <track src="https://dgtestblogams-usea.streaming.media.azure.net/29a650b6-5c0a-4932-8efb-2b4bb4a81bf0/transcript.vtt" label="english" kind="subtitles" srclang="en-us" default />
    </video>
</body>
</html>
  

Add this HTML file to any web server and navigate to its URL using a web browser. You should see a page with your video embedded and with captions displaying at the bottom of the video, as shown in Fig. 14.

ams14-VideoWithCaptions
Fig. 14

In this article, I showed you how to include captions in an Azure Media Services video embedded in a web page.

Wednesday, February 3, 2021 9:07:00 AM (GMT Standard Time, UTC+00:00)
# Tuesday, February 2, 2021

In a previous article, I showed you how to use Azure Media Services (AMS) to analyze a video. Among other things, this analysis performs audio transcription - speech-to-text - on your video. This outputs two files containing the text spoken in the video's audio track.

You may want to perform only audio transcription. If you are not interested in the other analysis output, it does not make sense to spend the time or compute analyzing a video for the other features. AMS allows you to perform only Audio Transcription and skip the rest of the analysis.

Navigate to the Azure Portal and to your Azure Media Services account, as shown in Fig. 1.

ams01-OverviewBlade
Fig. 1

Then, select "Assets" from the left menu to open the "Assets" blade, as shown in Fig. 2.

ams02-AssetsBlade
Fig. 2

Select the Input Asset you uploaded to display the Asset details page, as shown in Fig. 3.

ams03-AssetDetails
Fig. 3

Click the [Add job] button (Fig. 4) to display the "Create a job" dialog, as shown in Fig. 5.

ams04-AddJobButton
Fig. 4

ams05-CreateJob
Fig. 5

At the "Transform" field, select the "Create new" radio button.

At the "Transform name" textbox, enter a name to help you identify this Transform.

At the "Description" field, you may optionally enter some text to describe what this transform will do.

At the "Transform type" field, select the "Audio transcription" radio button.

At the "Analysis type" field, select the "Video and audio" radio button.

The "Automatic language detection" section allows you to either specify the audio language or allow AMS to figure this out. If you know the language, select the "No" radio button and select the language from the dropdown list. If you are unsure of the language, select the "Yes" radio button to allow AMS to infer it.

The "Configure Output" section allows you to specify where the generated output assets will be stored.

At the "Output asset name" field, enter a descriptive name for the output asset. AMS will suggest a name, but I prefer the name of the Input Asset, followed by "_AudioTranscription" or something more descriptive.

At the "Asset storage account" dropdown, select the Azure Storage Account in which to save a container and the blob files associated with the output asset.

At the "Job name" field, enter a descriptive name for this job. A descriptive name is helpful if you have many jobs running and want to identify this one.

At the "Job priority" dropdown, select the priority in which this job should run. The options are "High", "Low", and "Normal". I generally leave this as "Normal" unless I have a reason to change it. A High priority job will run before a Normal priority job, which will run before a Low priority job.

Click the [Create] button to create the job and queue it to be run.

You can check the status of the job by selecting "Transforms + jobs" from the left menu to open the "Transforms + jobs" blade (Fig. 6) and expanding the job you just created (Fig. 7).

ams06-TransformsJobs
Fig. 6

ams07-ExpandJob
Fig. 7

The "State" column tells you whether the job is queued, running, or finished.

Click the name of the job to display details about the job, as shown in Fig. 8.

ams08-JobDetails
Fig. 8

After the job finishes, when you return to the "Assets" blade, you will see the new output Asset listed, as shown in Fig. 9.

ams09-AssetsBlade
Fig. 9

Click the name of the asset you just created to display the Asset Details blade, as shown in Fig. 10.

ams10-AudioTranscriptionAssetDetails
Fig. 10

Click the link to the left of "Storage container" to view the files in Blob storage, as shown in Fig. 11.

ams11-Container
Fig. 11

The speech-to-text output can be found in the files transcript.ttml and transcript.vtt. These two files contain the same information - words spoken in the video and times they were spoken - but they are in different standard formats.

Listing 1 shows a sample TTML file for a short video, while Listing 2 shows a VTT file for the same video.

Listing 1:

<?xml version="1.0" encoding="utf-8"?>
<tt xml:lang="en-US" xmlns="http://www.w3.org/ns/ttml" xmlns:tts="http://www.w3.org/ns/ttml#styling" xmlns:ttm="http://www.w3.org/ns/ttml#metadata">
  <head>
    <metadata>
      <ttm:copyright>Copyright (c) 2013 Microsoft Corporation.  All rights reserved.</ttm:copyright>
    </metadata>
    <styling>
      <style xml:id="Style1" tts:fontFamily="proportionalSansSerif" tts:fontSize="0.8c" tts:textAlign="center" tts:color="white" />
    </styling>
    <layout>
      <region style="Style1" xml:id="CaptionArea" tts:origin="0c 12.6c" tts:extent="32c 2.4c" tts:backgroundColor="rgba(0,0,0,160)" tts:displayAlign="center" tts:padding="0.3c 0.5c" />
    </layout>
  </head>
  <body region="CaptionArea">
    <div>
      <!-- Confidence: 0.90088177 -->
      <p begin="00:00:00.000" end="00:00:07.080">This video is about Azure Media Services and Azure Media</p>

      <!-- Confidence: 0.90088177 -->
      <p begin="00:00:07.206" end="00:00:08.850">Services are.</p>

      <!-- Confidence: 0.935814 -->
      <p begin="00:00:08.850" end="00:00:11.029">Awesome.</p>
    </div>
  </body>
</tt>
  

Listing 2:

WEBVTT

NOTE duration:"00:00:11.0290000"

NOTE language:en-us

NOTE Confidence: 0.90088177

00:00:00.000 --> 00:00:04.956 
This video is about Azure Media Services

NOTE Confidence: 0.90088177

00:00:04.956 --> 00:00:11.029 
and Azure Media Services are. Awesome.
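Because the TTML variant is plain XML, its cues can be extracted with Python's standard library. A minimal sketch against a trimmed fragment shaped like Listing 1 (an illustration, not an official AMS parser):

```python
import xml.etree.ElementTree as ET

# TTML elements live in the W3C TTML namespace
TTML_P = "{http://www.w3.org/ns/ttml}p"

def parse_ttml(ttml_text: str):
    """Extract (begin, end, text) triples from the <p> caption elements."""
    root = ET.fromstring(ttml_text)
    return [
        (p.get("begin"), p.get("end"), "".join(p.itertext()).strip())
        for p in root.iter(TTML_P)
    ]

# A trimmed fragment in the shape of Listing 1:
SAMPLE = """<tt xml:lang="en-US" xmlns="http://www.w3.org/ns/ttml">
  <body>
    <div>
      <p begin="00:00:00.000" end="00:00:07.080">This video is about Azure Media Services and Azure Media</p>
      <p begin="00:00:07.206" end="00:00:08.850">Services are.</p>
      <p begin="00:00:08.850" end="00:00:11.029">Awesome.</p>
    </div>
  </body>
</tt>"""

cues = parse_ttml(SAMPLE)
```

The same triples could then be re-emitted in whatever caption format your player expects.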
  
Tuesday, February 2, 2021 9:24:00 AM (GMT Standard Time, UTC+00:00)
# Sunday, January 31, 2021

Ready Player Two by Ernest Cline continues the story of Wade Watts that began in the author's first novel "Ready Player One" - a novel of the near future that was a huge hit with middle-aged nerds.

Wade and his friends are now in control of the largest company in the world. This company controls the OASIS - a virtual world into which people can escape and live out their fantasies. Wade is rich and famous, but he is not happy. He has lost his girlfriend, has little contact with his friends, and is unpopular with the public - and he is to blame for most of his problems. His world turns even worse when a malevolent AI within the OASIS hacks the entire system, trapping millions of users inside and threatening to kill them unless Wade and his friends complete a quest to find the Seven Shards.

The hook of this novel is not dissimilar to the earlier one: young people must navigate a virtual world, showing off their video game proficiency and knowledge of late 20th-century pop culture to overcome a series of challenges.

There are a couple of key differences from the earlier novel: RP2 has a much darker tone than RP1, and its focus on pop culture is narrower. While RP1 bombarded the reader with an enormous number of 1980s and 1990s references, RP2 dives deeper into fewer topics, such as Tolkien's Middle Earth, the music of Prince, and the movies of John Hughes. The lines between good and evil are also less clearly drawn.

I appreciated that this book acknowledges a glaring logic hole in its predecessor. The earlier novel presented the creators of the OASIS as benevolent geniuses trying to build an alternate, better world than the dystopia outside. But it ignored one important corollary: as people spent more time escaping into virtual reality or supporting its infrastructure and economy, they neglected the real world even more, contributing to its accelerated deterioration. "Ready Player Two" raises this issue and highlights even more flaws in the OASIS creators. The addictive nature of the virtual world is intensified by the introduction of headsets that offer a more immersive experience by interacting directly with the user's brain waves.

With a film adaptation almost certain, it was difficult to read this book without envisioning how it would translate to a big screen with expensive CGI effects.

Ready Player Two presents a familiar formula that will resonate with Cline's fans. We can see the themes Cline is addressing: the corruption of wealth and power, the dangers of addiction, and the inevitable disappointment of hero worship. But the execution lacks the originality and charm of the first book. Still, I found it to be enjoyable escapism.

Sunday, January 31, 2021 8:07:53 PM (GMT Standard Time, UTC+00:00)
# Thursday, January 28, 2021

GCast 101:

Azure Resource Groups

What are the advantages of Azure Resource Groups? How do I create and manage a Resource Group?

Thursday, January 28, 2021 9:13:00 AM (GMT Standard Time, UTC+00:00)
# Monday, January 25, 2021

Episode 644

Dustin Campbell on Support for WinForms to the Visual Studio Designer

Dustin Campbell recently updated the tooling in Visual Studio to support WinForms and other legacy applications. He describes the challenges in doing so and how he and his team attacked them.

Monday, January 25, 2021 9:07:00 AM (GMT Standard Time, UTC+00:00)