In our last post, we looked at how to broadcast to an Amazon Interactive Video Service (Amazon IVS) live stream directly from a web browser instead of relying on third-party streaming software. Web Broadcast is a versatile tool for creating all-in-one solutions for your live streaming applications, and today we'll look at enhancing your application with screen sharing and canvas overlays.
But Why?
Many times our cameras and microphones are "good enough", but sometimes broadcasters need additional options. For example, think of an online conference or webinar. Isn't it better to see the presenter's slide deck or watch them demo an application instead of just watching them talk about their content? What about the most popular application for live streaming - gaming? Would you watch a stream of someone playing Fortnite if the video was only their webcam? Of course not! As you can see, being able to share a screen is a critical piece of the interactive, live streaming puzzle.
But there's another piece missing. Since the creation of television, graphic overlays have been used to enhance the viewer's experience. News broadcasts have scrolling tickers and branding graphics. Gaming streams overlay chat and game stats. Sports broadcasts incorporate player and team information, the current score, the play clock, and much more. Overlay graphics are a part of engaging video, and your live stream is no doubt improved by including them.
But How?
We've established the necessity for screen sharing and graphic overlays, but surely this kind of advanced functionality must be complex to implement with the Amazon IVS Web Broadcast SDK, right? Of course not - it's straightforward and doesn't take much code at all. Let's build on the web broadcast demo from the last post and add these features to that example.
Adding Screen Sharing
We'll start with screen sharing. First, we'll need to add a button to the UI that the broadcaster can click when they are ready to share their screen. We'll add this below the "Stream" button that we added in the last post. If you haven't read that post yet, I encourage you to do that now. You can also check out the full source for this demo on CodePen.
<button id="screenshare-btn" class="btn btn-outline-primary mb-3">Share Screen</button>
Next, let's add an event handler to capture button clicks.
document.getElementById('screenshare-btn').addEventListener('click', toggleScreenshare);
And define the toggleScreenshare() function that will be called on button click.
const toggleScreenshare = async (e) => {
  const screenshareBtn = e.currentTarget;
  if (!broadcastReady()) return;
  if (!window.isSharingScreen) {
    await shareScreen();
    if (!window.isBroadcasting) startBroadcast();
    screenshareBtn.innerHTML = 'Stop Screen Share';
    screenshareBtn.classList.remove('btn-outline-primary');
    screenshareBtn.classList.add('btn-danger');
    window.isSharingScreen = true;
  }
  else {
    screenshareBtn.innerHTML = 'Share Screen';
    screenshareBtn.classList.add('btn-outline-primary');
    screenshareBtn.classList.remove('btn-danger');
    window.isSharingScreen = false;
    await createVideoStream();
  }
};
In the toggleScreenshare() function above, we make sure that our application is ready to broadcast, and if so, we call the shareScreen() function and update the UI to reflect the current application state. If the user is already sharing their screen, we instead call createVideoStream(), which switches the video source for the stream back to the user's web camera. Let's look at shareScreen():
const shareScreen = async () => {
  if (window.broadcastClient && window.broadcastClient.getVideoInputDevice('camera1')) {
    window.broadcastClient.removeVideoInputDevice('camera1');
  }
  window.videoStream = await navigator.mediaDevices.getDisplayMedia();
  if (window.broadcastClient) {
    window.broadcastClient.addVideoInputDevice(window.videoStream, 'camera1', { index: 0 });
  }
};
This function looks very similar to the createVideoStream() function that we created in the last post. The shareScreen() function first removes any existing video input device from the broadcast client, gets a new media source, and then adds it to the broadcast client. But instead of calling getUserMedia() like we did with createVideoStream(), this time we use navigator.mediaDevices.getDisplayMedia(), which uses the Screen Capture API and allows the user to select their entire desktop, a specific window, or a single browser tab as the media source. The media source returned by getDisplayMedia() implements the MediaStream (docs) interface, which means we can use it anywhere a MediaStream is expected. If we check the Amazon IVS Web Broadcast SDK docs, we see that addVideoInputDevice() expects a MediaStream, so we're good to use this as a source for our broadcast!
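For reference, here is a rough sketch of what the createVideoStream() helper from the last post might look like. The 'camera1' name and the getUserMedia() constraints below are illustrative assumptions, so adapt them to match your own application.

// A minimal sketch of the webcam helper referenced above (based on the previous post's approach).
// The 'camera1' name and the getUserMedia() constraints are illustrative assumptions.
const createVideoStream = async () => {
  // Remove the current video source (camera or screen share), if one exists
  if (window.broadcastClient && window.broadcastClient.getVideoInputDevice('camera1')) {
    window.broadcastClient.removeVideoInputDevice('camera1');
  }
  // Capture the user's webcam instead of the screen
  window.videoStream = await navigator.mediaDevices.getUserMedia({
    video: { width: { ideal: 1280 }, height: { ideal: 720 } }
  });
  if (window.broadcastClient) {
    window.broadcastClient.addVideoInputDevice(window.videoStream, 'camera1', { index: 0 });
  }
};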
The broadcast client supports multiple video sources! If we wanted to, we could add a camera source alongside the screen share by giving each source a unique name and specifying a different index for each one. Using this approach, we could include the user's webcam on top of the screen share - perhaps in a lower corner of the screen - to create a more engaging experience.
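As a rough sketch of that idea (the 'camera2' name, layer index, and position and size values below are arbitrary assumptions; check the composition options in the SDK docs), a picture-in-picture camera layer could look something like this:

// A sketch of layering the user's webcam on top of the screen share.
// The 'camera2' name, index, and x/y/width/height values are assumptions.
const addCameraOverlay = async () => {
  const cameraStream = await navigator.mediaDevices.getUserMedia({ video: true });
  // Higher index values are composited on top of lower ones,
  // so this layer renders above the screen share at index 0
  window.broadcastClient.addVideoInputDevice(cameraStream, 'camera2', {
    index: 2,
    x: 940,      // lower-right corner of a hypothetical 1280x720 composition
    y: 520,
    width: 320,
    height: 180
  });
};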
If we ran our application at this point and clicked the "Share Screen" button, we would get prompted to select the display device:
After we select a display device, we can see the chosen display in the local preview.
And we can confirm that our screen is being broadcast to our live stream.
Adding Text and Graphic Overlays
The Web Broadcast SDK also gives us the ability to add text or images as overlays to our live stream. Let's add another button:
<button id="overlay-btn" class="btn btn-secondary">Overlay</button>
And another event listener.
document.getElementById('overlay-btn').addEventListener('click', showOverlay);
And define the showOverlay() function.
const showOverlay = () => {
  const preview = document.getElementById('broadcast-preview');
  // Create an offscreen canvas that matches the preview dimensions
  const overlay = document.createElement('canvas');
  overlay.width = preview.width;
  overlay.height = preview.height;
  overlay.style.display = 'none';
  let ctx = overlay.getContext('2d');
  // Draw a semi-transparent background bar for the lower third
  ctx.fillStyle = 'black';
  ctx.globalAlpha = 0.5;
  ctx.fillRect(0, overlay.height - 220, overlay.width, 220);
  ctx.globalAlpha = 1;
  // Draw the overlay text
  ctx.strokeStyle = 'black';
  ctx.lineWidth = 3;
  ctx.font = 'bold 120px Arial';
  ctx.fillStyle = 'white';
  ctx.fillText('Amazon IVS Web Broadcast', 30, overlay.height - 100);
  ctx.font = 'bold 40px Arial';
  ctx.fillStyle = 'white';
  ctx.fillText('Canvas Overlay Demo', 40, overlay.height - 40);
  document.querySelector('body').appendChild(overlay);
  // Add the canvas as an image source on a layer above the video (index 1)
  window.broadcastClient.addImageSource(overlay, 'overlay1', { index: 1 });
  // Remove the overlay after 10 seconds
  setTimeout(() => {
    window.broadcastClient.removeImage('overlay1');
    overlay.remove();
  }, 10000);
};
In this function, we're creating a new <canvas> element, adding some styles and text to it, appending it to the DOM, and finally adding it to our broadcast client via addImageSource() (docs). We then set a timer to remove the canvas overlay after 10 seconds. This effect is a nice way to add a "lower third" to your stream that includes information about the broadcaster or any other relevant details. Now if we click the "Overlay" button, we can see our lower third displayed in the preview.
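The same technique works for static images like a logo or watermark. Here is a hedged sketch (the image path, the 'logo1' name, and the composition values are placeholders; confirm the accepted image source types in the SDK docs):

// A sketch of adding a static image overlay, such as a logo watermark.
// The image path, 'logo1' name, and index are placeholder assumptions.
const showLogo = () => {
  const logo = new Image();
  logo.addEventListener('load', () => {
    // Add the image on a layer above the video source
    window.broadcastClient.addImageSource(logo, 'logo1', { index: 2 });
  });
  logo.src = '/images/logo.png';
};

You could remove it later with window.broadcastClient.removeImage('logo1'), just like the canvas overlay above.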
Hopefully, you can see just how powerful this functionality is for your live streams. It's certainly possible to add graphic overlays on top of the player on the client side, but by adding them at the broadcast source, they become a permanent part of the broadcast and will exist in any recorded versions of the live stream.
Try it out!
You can try out this demo on CodePen by opening it in a new tab and plugging in your own stream endpoint and stream key.
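If you need a quick refresher on where those values go, here is a rough sketch of the client setup from the last post; the ingest endpoint, stream key, and stream configuration below are placeholders to replace with your own.

// A rough sketch of the broadcast client setup covered in the previous post.
// Replace the ingest endpoint and stream key with your own channel's values.
window.broadcastClient = IVSBroadcastClient.create({
  streamConfig: IVSBroadcastClient.BASIC_LANDSCAPE,
  ingestEndpoint: 'YOUR-INGEST-ENDPOINT.global-contribute.live-video.net',
});
// Attach the local preview to the <canvas id="broadcast-preview"> element used above
window.broadcastClient.attachPreview(document.getElementById('broadcast-preview'));

const startBroadcast = () => {
  window.broadcastClient
    .startBroadcast('YOUR-STREAM-KEY')
    .then(() => { window.isBroadcasting = true; })
    .catch((err) => console.error(err));
};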
Summary
In this post, we learned how to enhance our live streams with screen sharing and canvas overlays. For further reading, please refer to the SDK docs.