We needed to add Video Ads to the broadcasts in our React Native (Expo) app. After trying out a few (mostly outdated) implementations for React Native, I decided to implement the ads with the Google IMA SDK in native iOS and Android instead.
In part 1, I showed how to implement Video Ads in iOS.
https://hackernoon.com/how-to-implement-video-ads-in-react-native-apps
This time, let’s implement Video Ads in Android.
In Expo apps, expo-av can be used to play video and audio. However, expo-av does not support inline video ads, so it was not an option for the ads. Furthermore, expo-av is built on an older Android ExoPlayer version than the one used in Google’s IMA sample code. Because of that, expo-av had to be uninstalled, and all video and audio playback in the app had to be replaced with the native implementation.
For this article, we are going to use our simple Example app.
In part 1, I showed how to initialize the app and add React Navigation.
Starting from there, let’s create the Android folder by running:
expo run:android
We follow Google’s instructions for adding IMA to Android:
https://developers.google.com/interactive-media-ads/docs/sdks/android/client-side
In app/build.gradle:
defaultConfig {
...
//add below
multiDexEnabled true
}
...
dependencies {
...
//add
implementation 'androidx.multidex:multidex:2.0.1'
implementation 'androidx.appcompat:appcompat:1.3.1'
implementation 'com.google.android.exoplayer:exoplayer-core:2.15.1'
implementation 'com.google.android.exoplayer:exoplayer-ui:2.15.1'
implementation 'com.google.android.exoplayer:extension-ima:2.15.1'
implementation 'com.google.android.exoplayer:exoplayer-hls:2.15.1'
}
In app/src/main/AndroidManifest.xml:
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.sidelineaccess.bhs">
<!-- Required permissions for the IMA SDK -->
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.FOREGROUND_SERVICE"/>
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
...
<service android:name=".AudioPlayerService"/>
</application>
</manifest>
Add the following to app/src/main/res/values/strings.xml (used for the audio playback notification):
<string name="notification_channel">channel</string>
<string name="notification_channel_description">Audio Broadcast</string>
Add a layout file for the Video View: app/src/main/res/layout/video_view.xml
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/connections"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#000000" >
<com.google.android.exoplayer2.ui.PlayerView
android:id="@+id/player_view"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
</LinearLayout>
Add a layout file for the Audio Player: app/src/main/res/layout/audio_view.xml
Visually this is just a thin strip for the audio player, but since we are implementing Native UI Components, this layout is the counterpart to the React Native view.
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#DDDDDD">
</LinearLayout>
Add app/src/main/java/com/mascotmedia/Media/ReactNativePackage.java (new Java file)
In this file, we are returning the VideoViewManager and the AudioViewManager.
package com.mascotmedia.Media;
import com.facebook.react.ReactPackage;
import com.facebook.react.bridge.NativeModule;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.uimanager.ViewManager;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
public class ReactNativePackage implements ReactPackage {
@Override
public List<NativeModule> createNativeModules(ReactApplicationContext reactContext) {
return Collections.emptyList();
}
@Override
public List<ViewManager> createViewManagers(ReactApplicationContext reactContext) {
return Arrays.<ViewManager>asList(
//to be created
new VideoViewManager(reactContext),
new AudioViewManager(reactContext)
);
}
}
In app/src/main/java/com/mascotmedia/Media/MainApplication.java, add the new package to the list of returned packages.
@Override
protected List<ReactPackage> getPackages() {
@SuppressWarnings("UnnecessaryLocalVariable")
List<ReactPackage> packages = new PackageList(this).getPackages();
// Packages that cannot be autolinked yet can be added manually here
//add line below
packages.add(new ReactNativePackage());
return packages;
}
Add app/src/main/java/com/mascotmedia/Media/VideoViewManager.java (new Java file)
This ViewManager returns the view to React Native.
We are using Fragments instead of Views because the IMA sample code is based on Activities, and a Fragment (hosted in a FragmentActivity) gives us the lifecycle methods of an Activity that a plain View does not.
package com.mascotmedia.Media;
import android.view.Choreographer;
import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;
import androidx.fragment.app.Fragment;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.fragment.app.FragmentActivity;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReadableArray;
import com.facebook.react.common.MapBuilder;
import com.facebook.react.uimanager.annotations.ReactProp;
import com.facebook.react.uimanager.annotations.ReactPropGroup;
import com.facebook.react.uimanager.ViewGroupManager;
import com.facebook.react.uimanager.ThemedReactContext;
import java.util.Map;
public class VideoViewManager extends ViewGroupManager<FrameLayout> {
public static final String REACT_CLASS = "VideoViewManager";
public final int COMMAND_CREATE = 1;
public final int COMMAND_KILL = 2;
private String videoUrl;
private String adTagUrl;
private VideoFragment localVideoFragment;
ReactApplicationContext reactContext;
public VideoViewManager(ReactApplicationContext reactContext) {
this.reactContext = reactContext;
}
@Override
public String getName() {
return REACT_CLASS;
}
/**
* Return a FrameLayout which will later hold the Fragment
*/
@Override
public FrameLayout createViewInstance(ThemedReactContext reactContext) {
return new FrameLayout(reactContext);
}
/**
* Map the "create" and "kill" commands to integers
*/
@Nullable
@Override
public Map<String, Integer> getCommandsMap() {
return MapBuilder.of("create", COMMAND_CREATE,"kill", COMMAND_KILL);
}
/**
* Handle the "create" and "kill" commands (called from JS) and dispatch to createFragment / removeFragment
*/
@Override
public void receiveCommand(@NonNull FrameLayout root, String commandId, @Nullable ReadableArray args) {
super.receiveCommand(root, commandId, args);
int reactNativeViewId = args.getInt(0);
int commandIdInt = Integer.parseInt(commandId);
switch (commandIdInt) {
case COMMAND_CREATE:
createFragment(root, reactNativeViewId);
break;
case COMMAND_KILL:
removeFragment(root, reactNativeViewId);
break;
default: {}
}
}
@ReactProp(name = "videoUrl")
public void setVideoUrl(FrameLayout view, String value) {
videoUrl = value;
}
@ReactProp(name = "adTagUrl")
public void setAdTagUrl(FrameLayout view, String value) {
adTagUrl = value;
}
/**
* Replace your React Native view with a custom fragment
*/
public void createFragment(FrameLayout root, int reactNativeViewId) {
ViewGroup parentView = (ViewGroup) root.findViewById(reactNativeViewId).getParent();
setupLayout((ViewGroup) parentView);
final VideoFragment videoFragment = new VideoFragment();
videoFragment.setVideoUrl(videoUrl);
videoFragment.setAdTagUrl(adTagUrl);
videoFragment.setReactNativeViewId(reactNativeViewId);
videoFragment.setReactContext(reactContext);
localVideoFragment = videoFragment;
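// mount the fragment into the React-managed view, using the React node handle as the fragment container id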
FragmentActivity activity = (FragmentActivity) reactContext.getCurrentActivity();
activity.getSupportFragmentManager()
.beginTransaction()
.replace(reactNativeViewId, videoFragment, String.valueOf(reactNativeViewId))
.commit();
}
/**
* Remove the fragment
*/
public void removeFragment(FrameLayout root, int reactNativeViewId) {
FragmentActivity activity = (FragmentActivity) reactContext.getCurrentActivity();
VideoFragment videoFragment = (VideoFragment) activity.getSupportFragmentManager().findFragmentById(reactNativeViewId);
if(videoFragment != null) {
activity.getSupportFragmentManager()
.beginTransaction()
.remove(videoFragment)
.commit();
}
}
public void setupLayout(ViewGroup view) {
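// React Native does not measure or lay out children added from the native side, so we re-measure and re-lay them out on every frame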
Choreographer.getInstance().postFrameCallback(new Choreographer.FrameCallback() {
@Override
public void doFrame(long frameTimeNanos) {
manuallyLayoutChildren(view);
view.getViewTreeObserver().dispatchOnGlobalLayout();
Choreographer.getInstance().postFrameCallback(this);
}
});
}
/**
* Layout all children properly
*/
public void manuallyLayoutChildren(ViewGroup view) {
for (int i=0; i < view.getChildCount(); i++){
View child = view.getChildAt(i);
child.measure(
View.MeasureSpec.makeMeasureSpec(view.getMeasuredWidth(),
View.MeasureSpec.EXACTLY),
View.MeasureSpec.makeMeasureSpec(view.getMeasuredHeight(),
View.MeasureSpec.EXACTLY));
child.layout(0, 0, child.getMeasuredWidth(), child.getMeasuredHeight());
}
}
@Nullable
@Override
public Map getExportedCustomDirectEventTypeConstants() {
return super.getExportedCustomDirectEventTypeConstants();
}
@Override
public void onDropViewInstance(FrameLayout view) {
super.onDropViewInstance(view);
if(localVideoFragment!=null) localVideoFragment.onDestroy();
}
}
Add app/src/main/java/com/mascotmedia/Media/VideoFragment.java (new Java file)
The Fragment receives its parameters from the VideoViewManager and initializes the ExoPlayer to play the ads and the video content.
package com.mascotmedia.Media;
import android.app.PendingIntent;
import android.content.Intent;
import android.graphics.Bitmap;
import android.os.Binder;
import android.os.Bundle;
import android.os.IBinder;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import androidx.annotation.Nullable;
import androidx.fragment.app.Fragment;
import android.net.Uri;
import androidx.multidex.MultiDex;
import android.app.Service;
import com.facebook.react.bridge.Arguments;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.uimanager.events.RCTEventEmitter;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.Player;
import com.google.android.exoplayer2.ext.ima.ImaAdsLoader;
import com.google.android.exoplayer2.RenderersFactory;
import com.google.android.exoplayer2.source.DefaultMediaSourceFactory;
import com.google.android.exoplayer2.source.hls.DefaultHlsExtractorFactory;
import com.google.android.exoplayer2.source.hls.HlsExtractorFactory;
import com.google.android.exoplayer2.source.MediaSourceFactory;
import com.google.android.exoplayer2.source.ads.AdsLoader;
import com.google.android.exoplayer2.ui.PlayerNotificationManager;
import com.google.android.exoplayer2.ui.PlayerView;
import com.google.android.exoplayer2.upstream.DataSource;
import com.google.android.exoplayer2.upstream.DefaultDataSourceFactory;
import com.google.android.exoplayer2.util.Util;
public class VideoFragment extends Fragment {
private PlayerView playerView;
private SimpleExoPlayer player;
private ImaAdsLoader adsLoader;
private String videoUrl;
private String adTagUrl;
private int reactNativeViewId;
private ReactApplicationContext reactContext;
public void setVideoUrl(String videoUrl) {
this.videoUrl = videoUrl;
}
public void setAdTagUrl(String adTagUrl) {
this.adTagUrl = adTagUrl;
}
public void setReactNativeViewId(int reactNativeViewId) {
this.reactNativeViewId = reactNativeViewId;
}
public void setReactContext(ReactApplicationContext reactContext) {
this.reactContext = reactContext;
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup parent, Bundle savedInstanceState) {
return inflater.inflate(R.layout.video_view,parent,false);
}
private void releasePlayer() {
adsLoader.setPlayer(null);
playerView.setPlayer(null);
if(player!=null) player.release();
player = null;
}
@Override
public void onViewCreated(View view, Bundle savedInstanceState) {
super.onViewCreated(view, savedInstanceState);
// do any logic that should happen in an `onCreate` method
MultiDex.install(requireContext());
playerView = (PlayerView) view.findViewById(R.id.player_view);
// Create an AdsLoader.
adsLoader = new ImaAdsLoader.Builder(requireContext()).build();
}
private void initializePlayer() {
// Set up the factory for media sources, passing the ads loader and ad view providers.
DataSource.Factory dataSourceFactory =
new DefaultDataSourceFactory(requireContext(), Util.getUserAgent(requireContext(), getString(R.string.app_name)));
MediaSourceFactory mediaSourceFactory =
new DefaultMediaSourceFactory(dataSourceFactory)
//new DefaultHlsExtractorFactory(dataSourceFactory)
.setAdsLoaderProvider(unusedAdTagUri -> adsLoader)
.setAdViewProvider(playerView);
// Create a SimpleExoPlayer and set it as the player for content and ads.
player = new SimpleExoPlayer.Builder(requireContext()).setMediaSourceFactory(mediaSourceFactory).build();
playerView.setPlayer(player);
adsLoader.setPlayer(player);
MediaItem mediaItem;
// Create the MediaItem to play, specifying the content URI and ad tag URI.
//check for valid link in RN!
Uri contentUri = Uri.parse(videoUrl);
Uri adTagUri = null;
if (adTagUrl != null && !adTagUrl.equals("")) {
adTagUri = Uri.parse(adTagUrl);
mediaItem = new MediaItem.Builder().setUri(contentUri).setAdTagUri(adTagUri).build();
} else{
mediaItem = new MediaItem.Builder().setUri(contentUri).build();
}
// Prepare the content and ad to be played with the SimpleExoPlayer.
player.setMediaItem(mediaItem);
player.prepare();
// Set PlayWhenReady. If true, content and ads will autoplay.
player.setPlayWhenReady(true);
}
@Override
public void onPause() {
super.onPause();
if (playerView != null) {
playerView.onPause();
}
releasePlayer();
}
@Override
public void onResume() {
super.onResume();
initializePlayer();
if (playerView != null) {
playerView.onResume();
}
}
@Override
public void onDestroy() {
super.onDestroy();
if (playerView != null) {
playerView.onPause();
}
releasePlayer();
adsLoader.release();
}
}
Add app/src/main/java/com/mascotmedia/Media/AudioViewManager.java (new Java file).
In this example, we also show how to play audio with the native ExoPlayer (without ads). Because the audio has to keep playing when the app moves into the background (and remain controllable from the notification area), playback is handled in a foreground service. The service is started from the AudioFragment.
package com.mascotmedia.Media;
import android.view.Choreographer;
import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.fragment.app.FragmentActivity;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReadableArray;
import com.facebook.react.common.MapBuilder;
import com.facebook.react.uimanager.annotations.ReactProp;
import com.facebook.react.uimanager.annotations.ReactPropGroup;
import com.facebook.react.uimanager.ViewGroupManager;
import com.facebook.react.uimanager.ThemedReactContext;
import java.util.Map;
public class AudioViewManager extends ViewGroupManager<FrameLayout> {
public static final String REACT_CLASS = "AudioViewManager";
public final int COMMAND_CREATE = 1;
public final int COMMAND_KILL = 2;
public final int COMMAND_RESUME = 3;
public final int COMMAND_PAUSE = 4;
private String audioUrl;
private AudioFragment localAudioFragment;
ReactApplicationContext reactContext;
public AudioViewManager(ReactApplicationContext reactContext) {
this.reactContext = reactContext;
}
@Override
public String getName() {
return REACT_CLASS;
}
/**
* Return a FrameLayout which will later hold the Fragment
*/
@Override
public FrameLayout createViewInstance(ThemedReactContext reactContext) {
return new FrameLayout(reactContext);
}
/**
* Map the "create", "kill", "resume" and "pause" commands to integers
*/
@Nullable
@Override
public Map<String, Integer> getCommandsMap() {
return MapBuilder.of("create", COMMAND_CREATE,"kill", COMMAND_KILL, "resume", COMMAND_RESUME, "pause", COMMAND_PAUSE);
}
/**
* Handle the commands called from JS and dispatch to the corresponding methods
*/
@Override
public void receiveCommand(@NonNull FrameLayout root, String commandId, @Nullable ReadableArray args) {
super.receiveCommand(root, commandId, args);
int reactNativeViewId = args.getInt(0);
int commandIdInt = Integer.parseInt(commandId);
System.out.println("****AudioViewManager: receiveCommand: " + commandIdInt);
switch (commandIdInt) {
case COMMAND_CREATE:
createFragment(root, reactNativeViewId);
break;
case COMMAND_KILL:
removeFragment(root, reactNativeViewId);
break;
case COMMAND_RESUME:
resumeAudio(root, reactNativeViewId);
break;
case COMMAND_PAUSE:
pauseAudio(root, reactNativeViewId);
break;
default: {}
}
}
@ReactProp(name = "audioUrl")
public void setAudioUrl(FrameLayout view, String value) {
audioUrl = value;
System.out.println("****AudioViewManager: setAudioUrl: " + value);
}
/**
* Replace your React Native view with a custom fragment
*/
public void createFragment(FrameLayout root, int reactNativeViewId) {
ViewGroup parentView = (ViewGroup) root.findViewById(reactNativeViewId).getParent();
setupLayout((ViewGroup) parentView);
final AudioFragment audioFragment = new AudioFragment();
audioFragment.setAudioUrl(audioUrl);
audioFragment.setReactNativeViewId(reactNativeViewId);
audioFragment.setReactContext(reactContext);
localAudioFragment = audioFragment;
FragmentActivity activity = (FragmentActivity) reactContext.getCurrentActivity();
activity.getSupportFragmentManager()
.beginTransaction()
.replace(reactNativeViewId, audioFragment, String.valueOf(reactNativeViewId))
.commit();
}
/**
* Stop Playing and Remove the Fragment
*/
public void removeFragment(FrameLayout root, int reactNativeViewId) {
//System.out.println("****AudioViewManager: remove Fragment: " + reactNativeViewId);
FragmentActivity activity = (FragmentActivity) reactContext.getCurrentActivity();
AudioFragment audioFragment = (AudioFragment) activity.getSupportFragmentManager().findFragmentById(reactNativeViewId);
if(audioFragment != null) {
//System.out.println("****AudioViewManager: remove Fragment - found fragment");
activity.getSupportFragmentManager()
.beginTransaction()
.remove(audioFragment)
.commit();
}
}
/**
* Pause Playing
*/
public void pauseAudio(FrameLayout root, int reactNativeViewId) {
System.out.println("****AudioViewManager: pause Audio: " + reactNativeViewId);
FragmentActivity activity = (FragmentActivity) reactContext.getCurrentActivity();
AudioFragment audioFragment = (AudioFragment) activity.getSupportFragmentManager().findFragmentById(reactNativeViewId);
if(audioFragment != null) {
audioFragment.pausePlaying();
}
}
/**
* Resume Playing
*/
public void resumeAudio(FrameLayout root, int reactNativeViewId) {
System.out.println("****AudioViewManager: resume Audio: " + reactNativeViewId);
FragmentActivity activity = (FragmentActivity) reactContext.getCurrentActivity();
AudioFragment audioFragment = (AudioFragment) activity.getSupportFragmentManager().findFragmentById(reactNativeViewId);
if(audioFragment != null) {
audioFragment.resumePlaying();
}
}
public void setupLayout(ViewGroup view) {
Choreographer.getInstance().postFrameCallback(new Choreographer.FrameCallback() {
@Override
public void doFrame(long frameTimeNanos) {
manuallyLayoutChildren(view);
view.getViewTreeObserver().dispatchOnGlobalLayout();
Choreographer.getInstance().postFrameCallback(this);
}
});
}
/**
* Layout all children properly
*/
public void manuallyLayoutChildren(ViewGroup view) {
for (int i=0; i < view.getChildCount(); i++){
View child = view.getChildAt(i);
child.measure(
View.MeasureSpec.makeMeasureSpec(view.getMeasuredWidth(),
View.MeasureSpec.EXACTLY),
View.MeasureSpec.makeMeasureSpec(view.getMeasuredHeight(),
View.MeasureSpec.EXACTLY));
child.layout(0, 0, child.getMeasuredWidth(), child.getMeasuredHeight());
}
}
@Nullable
@Override
public Map getExportedCustomDirectEventTypeConstants() {
return super.getExportedCustomDirectEventTypeConstants();
}
@Override
public void onDropViewInstance(FrameLayout view) {
super.onDropViewInstance(view);
if(localAudioFragment!=null) localAudioFragment.onDestroy();
localAudioFragment = null;
}
}
Add app/src/main/java/com/mascotmedia/Media/AudioFragment.java (new Java file)
The AudioFragment starts and binds to the AudioPlayerService, and stops it when the fragment is destroyed.
package com.mascotmedia.Media;
import android.app.PendingIntent;
import android.app.Service;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.graphics.Bitmap;
import android.os.Binder;
import android.os.Bundle;
import android.os.IBinder;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.content.ServiceConnection;
import androidx.annotation.Nullable;
import androidx.fragment.app.Fragment;
import android.net.Uri;
import androidx.multidex.MultiDex;
import com.facebook.react.bridge.Arguments;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.uimanager.events.RCTEventEmitter;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.Player;
import com.google.android.exoplayer2.RenderersFactory;
import com.google.android.exoplayer2.source.DefaultMediaSourceFactory;
import com.google.android.exoplayer2.source.MediaSourceFactory;
import com.google.android.exoplayer2.ui.PlayerNotificationManager;
import com.google.android.exoplayer2.ui.PlayerView;
import com.google.android.exoplayer2.upstream.DataSource;
import com.google.android.exoplayer2.upstream.DefaultDataSourceFactory;
import com.google.android.exoplayer2.util.Util;
public class AudioFragment extends Fragment {
private SimpleExoPlayer player;
private PlayerNotificationManager playerNotificationManager;
private String audioUrl;
private int reactNativeViewId;
private ReactApplicationContext reactContext;
boolean bounded;
private AudioPlayerService audioPlayerService;
public void setAudioUrl(String audioUrl) {
this.audioUrl = audioUrl;
}
public void setReactNativeViewId(int reactNativeViewId) {
this.reactNativeViewId = reactNativeViewId;
}
public void setReactContext(ReactApplicationContext reactContext) {
this.reactContext = reactContext;
}
public void resumePlaying(){
audioPlayerService.resume();
}
public void pausePlaying(){
audioPlayerService.pause();
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup parent, Bundle savedInstanceState) {
return inflater.inflate(R.layout.audio_view,parent,false);
}
ServiceConnection mConnection = new ServiceConnection() {
@Override
public void onServiceConnected(ComponentName name, IBinder service) {
audioPlayerService = ((AudioPlayerService.AudioServiceBinder)service).getService();
}
@Override
public void onServiceDisconnected(ComponentName name) {
bounded = false;
audioPlayerService = null;
}
};
@Override
public void onViewCreated(View view, Bundle savedInstanceState) {
super.onViewCreated(view, savedInstanceState);
// do any logic that should happen in an `onCreate` method
MultiDex.install(requireContext());
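// bind to the AudioPlayerService so this fragment can control playback, then start it in the foreground so audio keeps playing when the app is backgrounded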
Intent intent = new Intent(requireContext(), AudioPlayerService.class);
intent.putExtra("audioUrl",audioUrl);
if(requireContext().bindService(intent, mConnection, Context.BIND_AUTO_CREATE)){
bounded = true;
} else{
System.out.println("****AudioFragment: could not bind service");
}
Util.startForegroundService(requireContext(),intent);
}
@Override
public void onPause() {
super.onPause();
// do any logic that should happen in an `onPause` method
// e.g.: customView.onPause();
}
@Override
public void onResume() {
super.onResume();
// do any logic that should happen in an `onResume` method
// e.g.: customView.onResume();
}
@Override
public void onDestroy() {
// do any logic that should happen in an `onDestroy` method
if(audioPlayerService != null) {
audioPlayerService.pause();
audioPlayerService.stopForeground(true);
audioPlayerService.stopSelf();
}
super.onDestroy();
}
}
Add app/src/main/java/com/mascotmedia/Media/AudioPlayerService.java (new Java file).
The AudioPlayerService posts the required notification and starts and stops the audio. It includes a Binder class through which the AudioFragment can access it.
package com.mascotmedia.Media;
import android.app.Notification;
import android.app.PendingIntent;
import android.app.Service;
import android.content.Context;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.Binder;
import android.os.IBinder;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.Player;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.source.DefaultMediaSourceFactory;
import com.google.android.exoplayer2.source.MediaSourceFactory;
import com.google.android.exoplayer2.ui.PlayerNotificationManager;
import com.google.android.exoplayer2.upstream.DataSource;
import com.google.android.exoplayer2.upstream.DefaultDataSourceFactory;
import com.google.android.exoplayer2.util.Util;
public class AudioPlayerService extends Service {
private SimpleExoPlayer player;
private PlayerNotificationManager playerNotificationManager;
private String audioUrl;
public void setAudioUrl(String audioUrl) {
this.audioUrl = audioUrl;
}
public class AudioServiceBinder extends Binder {
AudioPlayerService getService() {
return AudioPlayerService.this;
}
}
@Override
public void onCreate(){
super.onCreate();
final Context context = this;
player = new SimpleExoPlayer.Builder(context).build();
playerNotificationManager = new PlayerNotificationManager.Builder(this, 1, "channel").setMediaDescriptionAdapter(new PlayerNotificationManager.MediaDescriptionAdapter() {
@Override
public CharSequence getCurrentContentTitle(Player player) {
return "Audio Broadcast";
}
@Nullable
@Override
public PendingIntent createCurrentContentIntent(Player player) {
Intent intent = new Intent(context, MainActivity.class);
return PendingIntent.getActivity(context,0,intent, PendingIntent.FLAG_UPDATE_CURRENT);
}
@Nullable
@Override
public CharSequence getCurrentContentText(Player player) {
return "";
}
@Nullable
@Override
public Bitmap getCurrentLargeIcon(Player player, PlayerNotificationManager.BitmapCallback callback) {
Bitmap mIcon = BitmapFactory.decodeResource(context.getResources(),R.drawable.notification_icon);
return mIcon;
}
}).setNotificationListener(new PlayerNotificationManager.NotificationListener() {
@Override
public void onNotificationCancelled(int notificationId, boolean dismissedByUser) {
stopSelf();
}
@Override
public void onNotificationPosted(int notificationId, Notification notification, boolean ongoing) {
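// promote the service to the foreground as soon as the playback notification is available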
startForeground(notificationId,notification);
}
})
.setChannelNameResourceId(R.string.notification_channel)
.setChannelDescriptionResourceId(R.string.notification_channel_description)
.build();
playerNotificationManager.setPlayer(player);
}
@Override
public IBinder onBind(Intent intent) {
return mBinder;
}
private final IBinder mBinder = new AudioServiceBinder();
@Override
public void onDestroy() {
stopForeground(true);
playerNotificationManager.setPlayer(null);
if(player!=null) {
player.setPlayWhenReady(false);
player.release();
}
player = null;
super.onDestroy();
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
setAudioUrl(intent.getStringExtra("audioUrl"));
// Set the media item to be played.
player.setMediaItem(MediaItem.fromUri(audioUrl));
// Prepare the player.
player.prepare();
player.setPlayWhenReady(true);
return START_STICKY;
}
public void pause(){
player.setPlayWhenReady(false);
}
public void resume(){
player.setPlayWhenReady(true);
}
}
After seeing how the native code is implemented, let’s see how the React Native side uses it.
In the App.js file, we added the Audio Screen into the stack navigator:
<Stack.Screen name="Audio" component={AudioScreen} />
We added an Audio Button to the Home Screen:
<Button
title="Audio"
onPress={() => navigation.navigate('Audio')}
/>
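For context, here is a minimal sketch of how App.js can tie the navigator and the screens together. Treat it as a sketch under assumptions: the HomeScreen and VideoScreen imports, the file paths, and the use of createStackNavigator are carried over from part 1 rather than shown in this article; only the Audio entries are new here.
// App.js - minimal sketch; HomeScreen/VideoScreen and the file paths are assumptions from part 1
import * as React from "react";
import { NavigationContainer } from "@react-navigation/native";
import { createStackNavigator } from "@react-navigation/stack";
import HomeScreen from "./screens/HomeScreen";
import VideoScreen from "./screens/VideoScreen";
import AudioScreen from "./screens/AudioScreen";
const Stack = createStackNavigator();
export default function App() {
  return (
    <NavigationContainer>
      <Stack.Navigator initialRouteName="Home">
        <Stack.Screen name="Home" component={HomeScreen} />
        <Stack.Screen name="Video" component={VideoScreen} />
        <Stack.Screen name="Audio" component={AudioScreen} />
      </Stack.Navigator>
    </NavigationContainer>
  );
}
The Audio button from the snippet above lives inside HomeScreen and simply navigates to the "Audio" route.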
Here is our Audio Screen (AudioScreen.js):
It shows a button with a “play” or “pause” image that the user can press to pause or resume the audio.
import React, { useState, useRef } from "react";
import {
View,
Dimensions,
TouchableOpacity,
Image,
} from "react-native";
import AudioViewIos from "../components/AudioViewIos";
import AudioViewAndroid from "../components/AudioViewAndroid";
import Constants from "expo-constants";
const audioLink = "http://www.largesound.com/ashborytour/sound/AshboryBYU.mp3";
const width=Dimensions.get("window").width;
export default function AudioStreamModal() {
const ref = useRef(null);
const [playingAudio, setPlayingAudio] = useState(true);
async function toggleAudio() {
//we are playing right now, pause
if (playingAudio) {
if (ref.current) {
ref.current.pauseAudio();
setPlayingAudio(false);
}
//we are paused, play
} else {
if (ref.current) {
ref.current.resumeAudio();
setPlayingAudio(true);
}
}
}
return (
<View style={{ flex: 1, alignItems: "center", justifyContent: "center", paddingBottom:80 }}>
<TouchableOpacity onPress={toggleAudio}>
{playingAudio && (
<Image
style={{ width: 100, height: 100, alignSelf: "center" }}
resizeMode="cover"
source={require("../assets/video_pause_transparent.png")}
/>
)}
{!playingAudio && (
<Image
style={{ width: 100, height: 100, alignSelf: "center" }}
resizeMode="cover"
source={require("../assets/video_play_transparent.png")}
/>
)}
</TouchableOpacity>
{Constants.platform.android && (
<View
style={{
height: 3,
width: width * 0.64,
backgroundColor: "grey",
}}
>
<AudioViewAndroid ref={ref} url={audioLink} />
</View>
)}
{Constants.platform.ios && (
<View
style={{
height: 3,
width: width * 0.64,
backgroundColor: "grey",
}}
>
<AudioViewIos
ref={ref}
url={audioLink}
/>
</View>
)}
</View>
);
}
Here is the React Native Audio View for Android (AudioViewAndroid.js):
import React, {
useEffect,
useRef,
useImperativeHandle,
forwardRef,
} from "react";
import {
UIManager,
findNodeHandle,
} from "react-native";
import { requireNativeComponent } from "react-native";
const AudioViewManager = requireNativeComponent("AudioViewManager");
function AudioView(props, ref) {
const nativeRef = useRef(null);
useImperativeHandle(ref, () => ({
// methods connected to `ref`
resumeAudio: () => {
if (__DEV__) {
console.log("RN calling resumeAudio ");
}
resumeAudio();
},
pauseAudio: () => {
if (__DEV__) {
console.log("RN calling pauseAudio ");
}
pauseAudio();
}
}));
function createFragment() {
const viewId = findNodeHandle(nativeRef.current);
if (viewId != null) {
UIManager.dispatchViewManagerCommand(
viewId,
UIManager.AudioViewManager.Commands.create.toString(), // we are calling the 'create' command
[viewId]
);
}
}
function killFragment() {
const viewId = findNodeHandle(nativeRef.current);
if (viewId != null) {
UIManager.dispatchViewManagerCommand(
viewId,
UIManager.AudioViewManager.Commands.kill.toString(), // we are calling the 'kill' command
[viewId]
);
}
}
function resumeAudio() {
if (__DEV__) {
console.log("RN resumeAudio ");
}
const viewId = findNodeHandle(nativeRef.current);
if (viewId != null) {
UIManager.dispatchViewManagerCommand(
viewId,
UIManager.AudioViewManager.Commands.resume.toString(), // we are calling the 'resume' command
[viewId]
);
}
}
function pauseAudio() {
if (__DEV__) {
console.log("RN pauseAudio ");
}
const viewId = findNodeHandle(nativeRef.current);
if (viewId != null) {
UIManager.dispatchViewManagerCommand(
viewId,
UIManager.AudioViewManager.Commands.pause.toString(), // we are calling the 'pause' command
[viewId]
);
}
}
useEffect(() => {
if (__DEV__) {
console.log("RN AudioView: " + props.url);
}
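// small delay so the native view is attached before the 'create' command is dispatched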
setTimeout(() => {
createFragment();
}, 500);
return () => {
//kill the android side when leaving the view
killFragment();
};
}, []);
return (
<AudioViewManager
ref={nativeRef}
audioUrl={props.url}
/>
);
}
export default forwardRef(AudioView);
We are using forwardRef so that the pause and resume commands can be forwarded from the React Native Audio Screen to the Native UI Component.
What about the Video Component?
Here is our React Native Video Screen (VideoScreen.js):
import * as React from "react";
import { View, Text, Dimensions } from "react-native";
import Constants from "expo-constants";
import VideoViewIos from "../components/VideoViewIos";
import VideoViewAndroid from "../components/VideoViewAndroid";
const videoLink =
"https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/master.m3u8";
const adTagUrl =
"https://pubads.g.doubleclick.net/gampad/ads?sz=640x480&iu=/124319096/external/single_ad_samples&ciu_szs=300x250&impl=s&gdfp_req=1&env=vp&output=vast&unviewed_position_start=1&cust_params=deployment%3Ddevsite%26sample_ct%3Dlinear&correlator=";
const width = Dimensions.get("window").width;
export default function VideoScreen() {
return (
<View
style={{
flex: 1,
alignItems: "center",
justifyContent: "center",
paddingBottom: 80,
}}
>
{Constants.platform.ios && (
<VideoViewIos
videoUrl={videoLink}
adTagUrl={adTagUrl}
style={{
height: width * 0.56,
width: width,
backgroundColor: "grey",
}}
/>
)}
{Constants.platform.android && (
<View
style={{
height: width * 0.56,
width: width,
backgroundColor: "grey",
}}
>
<VideoViewAndroid
url={videoLink}
adTag={adTagUrl}
/>
</View>
)}
</View>
);
}
Here is the React Native Android video component (VideoViewAndroid.js). In the video component, we don’t need to send any commands to the native side apart from create and kill.
import React, { useEffect, useRef } from "react";
import { UIManager, findNodeHandle } from "react-native";
import { requireNativeComponent } from "react-native";
const VideoViewManager = requireNativeComponent("VideoViewManager");
const VideoView = ({ url, adTag }) => {
const nativeRef = useRef(null);
useEffect(() => {
if (__DEV__) {
console.log("RN VideoView: " + url);
}
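// small delay so the native view is attached before the 'create' command is dispatched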
setTimeout(() => {
createFragment();
}, 500);
return () => {
//kill the android side when leaving the view
killFragment();
};
}, []);
function createFragment() {
const viewId = findNodeHandle(nativeRef.current);
if (viewId != null) {
UIManager.dispatchViewManagerCommand(
viewId,
UIManager.VideoViewManager.Commands.create.toString(), // we are calling the 'create' command
[viewId]
);
}
}
function killFragment() {
const viewId = findNodeHandle(nativeRef.current);
if (viewId != null) {
UIManager.dispatchViewManagerCommand(
viewId,
UIManager.VideoViewManager.Commands.kill.toString(), // we are calling the 'kill' command
[viewId]
);
}
}
return (
<VideoViewManager
ref={nativeRef}
videoUrl={url}
adTagUrl={adTag}
/>
);
};
export default VideoView;
With that, we have video ads playing before video broadcasts in our React Native app on both iOS and Android, as well as audio broadcasts.
To check out the full implementation for iOS and Android, see the repository on GitHub:
https://github.com/cabhara/ReactNativeVideoAds
Expo
Using Native Components in React Native
https://reactnative.dev/docs/native-components-ios
https://reactnative.dev/docs/native-components-android
React Navigation
https://reactnavigation.org/docs
Google IMA
https://developers.google.com/interactive-media-ads/docs/sdks/ios/client-side
https://developers.google.com/interactive-media-ads/docs/sdks/android/client-side
https://developers.google.com/interactive-media-ads/docs/sdks/html5/client-side/tags