How to Build a Live Video Streaming App Using the Agora React Native SDK


by Trevor Buntin, February 15th, 2022

Too Long; Didn't Read

Using the Agora Video SDK, we will create a live broadcasting app that can accommodate numerous broadcasters and entertain thousands of users. The Agora RTC SDK for React Native (v3.4.6 at the time of writing) is used for the following examples. The main logic is implemented in a LiveScreen.js file, and permissions are requested in a separate file named Permission.ts.




There are many aspects to building a flexible, premium, live video streaming app. For instance, maintaining low latency, balancing load, and managing thousands of users in the audience, all while preserving cross-platform compatibility, can be quite demanding.


However, there is a convenient way to make this happen using the Agora React Native SDK. In this article, we will create a live broadcasting app that can accommodate numerous broadcasters and entertain thousands of users by using the magic of the Agora Video SDK.


Requirements


We will make use of the Agora RTC SDK for React Native for the following examples. Version 3.4.6 of the SDK is used at the time of writing.

Client-side implementation

We add our Agora module:


yarn add react-native-agora


After doing that, follow the simple installation instructions here for Android and here for iOS. (On iOS, also make sure your Info.plist contains camera and microphone usage descriptions: NSCameraUsageDescription and NSMicrophoneUsageDescription.)


Go to your ios folder and run:


pod install


Now, we implement the live stream.


First, we need to sort out permissions. In a separate file named Permission.ts, we request camera and microphone permissions on Android:

import { PermissionsAndroid } from 'react-native';

// Requests camera and microphone access on Android.
// (iOS prompts automatically when the devices are first used.)
export default async function requestCameraAndAudioPermission() {
  try {
    const granted = await PermissionsAndroid.requestMultiple([
      PermissionsAndroid.PERMISSIONS.CAMERA,
      PermissionsAndroid.PERMISSIONS.RECORD_AUDIO,
    ]);
    if (
      granted['android.permission.RECORD_AUDIO'] ===
        PermissionsAndroid.RESULTS.GRANTED &&
      granted['android.permission.CAMERA'] ===
        PermissionsAndroid.RESULTS.GRANTED
    ) {
      console.log('You can use the cameras & mic');
    } else {
      console.log('Permission denied');
    }
  } catch (err) {
    console.warn(err);
  }
}



Now we import it in our main LiveScreen.js file, where our main logic will be implemented:

import React, { useEffect, useRef, useState } from 'react';
import {
  Platform,
  ActivityIndicator,
  StyleSheet,
  Dimensions,
  View,
  Text,
  TouchableOpacity,
} from 'react-native';
import RtcEngine, {
  ChannelProfile,
  ClientRole,
  RtcLocalView,
  RtcRemoteView,
} from 'react-native-agora';
import requestCameraAndAudioPermission from './Permission';

const SCREEN_HEIGHT = Dimensions.get('window').height;
const SCREEN_WIDTH = Dimensions.get('window').width;

export default function LiveScreen({ route }) {
  const isBroadcaster = route.params.type === 'create';
  const channelId = route.params.channel;

  const [joined, setJoined] = useState(false);

  const AgoraEngine = useRef();

  useEffect(() => {
    if (Platform.OS === 'android') requestCameraAndAudioPermission();
    // The broadcaster joins with uid 1 (the uid the audience's RtcRemoteView
    // subscribes to); 0 lets Agora assign a uid to each audience member.
    const uid = isBroadcaster ? 1 : 0;
    init().then(() =>
      AgoraEngine.current.joinChannel(null, channelId, null, uid),
    );
    return () => {
      AgoraEngine.current.destroy();
    };
  }, []);

  const init = async () => {
    AgoraEngine.current = await RtcEngine.create('Your App ID Here');
    AgoraEngine.current.enableVideo();
    AgoraEngine.current.setChannelProfile(ChannelProfile.LiveBroadcasting);
    if (isBroadcaster)
      AgoraEngine.current.setClientRole(ClientRole.Broadcaster);

    AgoraEngine.current.addListener(
      'JoinChannelSuccess',
      (channelId, uid, elapsed) => {
        console.log('JoinChannelSuccess', channelId, uid, elapsed);
        setJoined(true);
      },
    );
  };

  const onSwitchCamera = () => AgoraEngine.current.switchCamera();

  return (
    <View style={styles.container}>
      {!joined ? (
        <>
          <ActivityIndicator
            size={60}
            color="#222"
            style={styles.activityIndicator}
          />
          <Text style={styles.loadingText}>
            {'Joining Stream, Please Wait'}
          </Text>
        </>
      ) : (
        <>
          {isBroadcaster ? (
            <RtcLocalView.SurfaceView
              style={styles.fullscreen}
              channelId={channelId}
            />
          ) : (
            <RtcRemoteView.SurfaceView
              uid={1}
              style={styles.fullscreen}
              channelId={channelId}
            />
          )}
          <View style={styles.buttonContainer}>
            <TouchableOpacity style={styles.button} onPress={onSwitchCamera}>
              <Text style={styles.buttonText}>{'Switch Camera'}</Text>
            </TouchableOpacity>
          </View>
        </>
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  // Styles referenced in the JSX above; the layout values are illustrative.
  container: { flex: 1, alignItems: 'center', justifyContent: 'center' },
  activityIndicator: { marginBottom: 16 },
  loadingText: {
    fontSize: 18,
    color: '#222',
  },
  fullscreen: {
    width: SCREEN_WIDTH,
    height: SCREEN_HEIGHT,
  },
  buttonContainer: { position: 'absolute', bottom: 40, alignSelf: 'center' },
  button: {
    paddingHorizontal: 20,
    paddingVertical: 12,
    backgroundColor: '#222',
    borderRadius: 8,
  },
  buttonText: { color: '#fff', fontSize: 16 },
});


Let's work through the code we just wrote:


  • The LiveScreen receives props via the route: a channelId and a type. The channelId is a unique string for the channel to connect to, and the type can be either "create" or "join", to either start a broadcast or join one (a navigation sketch follows this list).
  • We acquired camera and microphone permissions from Android to send audio and video.
  • We initiated the Agora engine instance and set up all the necessary configurations.
  • We joined the channel with no authentication, using the channelId from the route prop.
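
For context, here is a minimal sketch of a screen that could supply those route params, assuming React Navigation; the HomeScreen component and the 'Live' route name are hypothetical, not part of the original code. This is also where the uuid package comes in, generating a unique channel ID for a new broadcast:

import React, { useState } from 'react';
import { View, Button, TextInput } from 'react-native';
import 'react-native-get-random-values'; // polyfill required by uuid on React Native
import { v4 as uuid } from 'uuid';

export default function HomeScreen({ navigation }) {
  const [channel, setChannel] = useState('');

  return (
    <View>
      {/* Broadcaster: generate a fresh channel ID and start a stream */}
      <Button
        title="Start a stream"
        onPress={() => navigation.navigate('Live', { type: 'create', channel: uuid() })}
      />
      {/* Audience: enter an existing channel ID and join it */}
      <TextInput
        value={channel}
        onChangeText={setChannel}
        placeholder="Channel ID"
      />
      <Button
        title="Join a stream"
        onPress={() => navigation.navigate('Live', { type: 'join', channel })}
      />
    </View>
  );
}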


(NB: The joinChannel function takes 4 arguments: an authentication token, the channel ID, optional info, and an optional UID. For a production app, you will need to fetch an authentication token generated by middleware hosted on the server side; a sketch follows.)
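
As a rough sketch of that production flow (the https://your.server/token endpoint and its response shape are assumptions, not a real service), the join could look like this:

// Hypothetical token endpoint; in production this is your own server
// running one of Agora's token-generator libraries.
async function fetchToken(channelId, uid) {
  const response = await fetch(
    `https://your.server/token?channel=${channelId}&uid=${uid}`,
  );
  const { token } = await response.json(); // assumed response shape
  return token;
}

// Then, instead of passing null as the first argument:
const token = await fetchToken(channelId, uid);
AgoraEngine.current.joinChannel(token, channelId, null, uid);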


  • We displayed the local view or the remote view based on who is using the app: the broadcaster or the audience.
  • We added a Switch Camera button to switch between the front and back cameras.


And that's all. You have a simple live stream app working in minutes.


Now, the next step can be to add advanced features such as:


Live streaming with video conferencing (many participants) and multiple audiences, embedded live chat, audience requests to join a stream, and more.
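
As a starting point for the many-participants case, here is a rough sketch that tracks every remote broadcaster with the SDK's membership events and renders one remote view per peer. The peerIds state is our own naming; the UserJoined and UserOffline events come from react-native-agora:

// Inside LiveScreen, alongside the existing state:
const [peerIds, setPeerIds] = useState([]);

// Inside init(), after the engine is created:
AgoraEngine.current.addListener('UserJoined', (uid, elapsed) => {
  // Track each remote broadcaster as they join the channel.
  setPeerIds((peers) => (peers.includes(uid) ? peers : [...peers, uid]));
});
AgoraEngine.current.addListener('UserOffline', (uid, reason) => {
  // Drop broadcasters that leave the channel.
  setPeerIds((peers) => peers.filter((id) => id !== uid));
});

// In the render, replace the single RtcRemoteView with one per peer:
{peerIds.map((uid) => (
  <RtcRemoteView.SurfaceView
    key={uid}
    uid={uid}
    channelId={channelId}
    style={styles.fullscreen}
  />
))}

Live chat and join requests are usually built on a separate messaging layer (for example, Agora RTM or your own backend) and are beyond the scope of this sketch.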