## What we're building

We'll see how to build a Flutter app for iOS/Android that allows users to view and share videos. In another post I showed how to do this with Publitio as our video storage API. In this tutorial we'll use Firebase Cloud Storage to host the videos instead. We'll also add client-side encoding and HLS support, so the client can stream the videos with adaptive bitrate.

## The stack

- Flutter - for making cross-platform apps.
- Firebase Cloud Firestore - for storing video metadata (URLs) and syncing between clients (without writing server code).
- Firebase Cloud Storage - for hosting the actual videos.
- FFmpeg - for running client-side video encoding.

## Why client-side encoding?

In most video workflows there will be a transcoding server or serverless cloud function that encodes the video into various resolutions and bitrates for optimal viewing on all devices and network speeds. If you don't want to use a transcoding server or API (which can be quite pricey), then depending on the kind of videos your app needs to upload and view, you can choose to forgo server-side transcoding altogether and encode the videos solely on the client. This will save considerable costs, but will put the burden of encoding videos on the clients.

Even if you do use some server-side transcoding solution, you'll probably want to perform minimal encoding on the client. The raw video sizes (especially on iOS) can be huge, and you don't want to be wasteful of the user's data plan, or force them to wait for WiFi unnecessarily.

## Video encoding primer

Here is a short primer on some terms we're going to use.

### X264 codec / encoder

This is the software used to encode the video into the H.264/MPEG-4 AVC format. There are many other codecs, but seeing as H.264 is currently the only format natively supported on both iOS and Android, this is what we'll use.

### Adaptive bitrate streaming

A method to encode the video into multiple bitrates (of varying quality), and each bitrate into many small chunks. This allows the player to choose which quality the next chunk will be, according to network speed. So if you go from WiFi to cellular data, your player can adapt the bitrate accordingly, without reloading the entire video.

### HLS streaming protocol

There are many streaming protocols, but the one that's natively supported on iOS and Android is Apple's HLS - HTTP Live Streaming. In HLS, the video is split into chunks in the form of .ts files, with .m3u8 playlist files pointing to the chunks. For each quality, or variant stream, there is a playlist file, and a master playlist to rule them all 💍.

## Configurations

### Installing flutter_ffmpeg

We'll use the flutter_ffmpeg package to run encoding jobs on iOS/Android. flutter_ffmpeg requires choosing a codec package, according to what you want to use. Here we'll use the min-gpl-lts package, as it contains the x264 codec and can be used in release builds.

Add the following to your android/build.gradle:

```gradle
ext {
    flutterFFmpegPackage = "min-gpl-lts"
}
```

And in your Podfile, replace this line:

```ruby
pod name, :path => File.join(symlink, 'ios')
```

with this:

```ruby
if name == 'flutter_ffmpeg'
    pod name + '/min-gpl-lts', :path => File.join(symlink, 'ios')
else
    pod name, :path => File.join(symlink, 'ios')
end
```
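With the packages in place, it's worth sanity-checking that the right FFmpeg binaries were actually linked. A minimal sketch, assuming the getFFmpegVersion and getPackageName config calls documented in the flutter_ffmpeg README (printFFmpegSetup is just a hypothetical helper name):

```dart
import 'package:flutter_ffmpeg/flutter_ffmpeg.dart';

// Print the FFmpeg version and the codec package that was linked in,
// to confirm min-gpl-lts is really the one loaded.
Future<void> printFFmpegSetup() async {
  final config = FlutterFFmpegConfig();
  final version = await config.getFFmpegVersion();
  final packageName = await config.getPackageName();
  print('FFmpeg $version, package: $packageName');
}
```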
### Cloud Storage configuration

If you've already set up Firebase in your project, as discussed in the previous post, then you just need to add the cloud_firestore package.

Now we need to configure public read access to the video files, so that we can access them without a token (see this comment). For this example I've added no authentication to keep things simple, so we'll allow public write access too, but this should be changed in a production app. So in the Firebase Console, go to Storage -> Rules and change it to:

```
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read;
      allow write;
    }
  }
}
```

## Stages in client-side video processing

This is the sequence of steps we'll have to do for each video:

1. Get the raw video path from image_picker
2. Get the aspect ratio
3. Generate a thumbnail using FFmpeg
4. Encode the raw video into HLS files
5. Upload the thumbnail jpg to Cloud Storage
6. Upload the encoded files to Cloud Storage
7. Save the video metadata to Cloud Firestore

Let's go over every step and see how to implement it.

## Encoding Provider

We'll create an EncodingProvider class that will encapsulate the encoding logic. The class will hold the flutter_ffmpeg instances needed:

```dart
class EncodingProvider {
  static final FlutterFFmpeg _encoder = FlutterFFmpeg();
  static final FlutterFFprobe _probe = FlutterFFprobe();
  static final FlutterFFmpegConfig _config = FlutterFFmpegConfig();
  ...
}
```

### Generating a thumbnail

We'll use the encoder to generate a thumbnail, which we'll save later to Cloud Storage. We're telling FFmpeg to take one frame (the -vframes option) out of videoPath (the -i option) with a size of width x height (the -s option). We check the result code to ensure the operation finished successfully.

```dart
static Future<String> getThumb(videoPath, width, height) async {
  assert(File(videoPath).existsSync());

  final String outPath = '$videoPath.jpg';
  final arguments =
      '-y -i $videoPath -vframes 1 -an -s ${width}x${height} -ss 1 $outPath';

  final int rc = await _encoder.execute(arguments);
  assert(rc == 0);
  assert(File(outPath).existsSync());

  return outPath;
}
```

### Getting video length and aspect ratio

We'll use FlutterFFprobe.getMediaInformation to calculate the aspect ratio (needed for the Flutter video player) and to get the video length (needed to calculate encoding progress):

```dart
static Future<Map<dynamic, dynamic>> getMediaInformation(String path) async {
  return await _probe.getMediaInformation(path);
}

static double getAspectRatio(Map<dynamic, dynamic> info) {
  final int width = info['streams'][0]['width'];
  final int height = info['streams'][0]['height'];
  final double aspect = height / width;
  return aspect;
}

static int getDuration(Map<dynamic, dynamic> info) {
  return info['duration'];
}
```

### Encoding

Now for the actual video encoding. For this example I used the parameters from this excellent HLS tutorial. We're creating two variant streams, one with a 2000k bitrate and one with a 365k bitrate. This will generate multiple fileSequence.ts files (video chunks) for each variant quality stream, and one playlistVariant.m3u8 file (playlist) per stream. It will also generate a master.m3u8 that lists all the playlistVariant.m3u8 files.

```dart
static Future<String> encodeHLS(videoPath, outDirPath) async {
  assert(File(videoPath).existsSync());

  final arguments = '-y -i $videoPath ' +
      '-preset ultrafast -g 48 -sc_threshold 0 ' +
      '-map 0:0 -map 0:1 -map 0:0 -map 0:1 ' +
      '-c:v:0 libx264 -b:v:0 2000k ' +
      '-c:v:1 libx264 -b:v:1 365k ' +
      '-c:a copy ' +
      '-var_stream_map "v:0,a:0 v:1,a:1" ' +
      '-master_pl_name master.m3u8 ' +
      '-f hls -hls_time 6 -hls_list_size 0 ' +
      '-hls_segment_filename "$outDirPath/%v_fileSequence_%d.ts" ' +
      '$outDirPath/%v_playlistVariant.m3u8';

  final int rc = await _encoder.execute(arguments);
  assert(rc == 0);

  return outDirPath;
}
```

Note: this is a simple encoding example, but the options are endless. For a complete list, see https://ffmpeg.org/ffmpeg-formats.html.
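Before uploading anything, it can be handy to inspect what the encoder actually produced. A quick sketch of a hypothetical debugging helper, using only dart:io — with the arguments above you should see master.m3u8, one %v_playlistVariant.m3u8 per variant, and the numbered .ts chunks:

```dart
import 'dart:io';

// Debugging aid: print every HLS artifact encodeHLS wrote, e.g.
// master.m3u8, 0_playlistVariant.m3u8, 0_fileSequence_0.ts, ...
void debugPrintHlsOutput(String outDirPath) {
  for (final entity in Directory(outDirPath).listSync()) {
    print(entity.path);
  }
}
```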
## Showing encoding progress

Encoding can take a long time, and it's important to show the user that something is happening. We'll use FFmpeg's enableStatisticsCallback to get the current encoded frame's time, and divide it by the video duration to get the progress. We'll then update the _progress state field, which is connected to a LinearProgressIndicator.

```dart
class _MyHomePageState extends State<MyHomePage> {
  double _progress = 0.0;
  ...

  @override
  void initState() {
    EncodingProvider.enableStatisticsCallback((int time,
        int size,
        double bitrate,
        double speed,
        int videoFrameNumber,
        double videoQuality,
        double videoFps) {
      if (_canceled) return;

      setState(() {
        _progress = time / _videoDuration;
      });
    });
    ...
    super.initState();
  }

  _getProgressBar() {
    return Container(
      padding: EdgeInsets.all(30.0),
      child: Column(
        crossAxisAlignment: CrossAxisAlignment.center,
        mainAxisAlignment: MainAxisAlignment.center,
        children: <Widget>[
          Container(
            margin: EdgeInsets.only(bottom: 30.0),
            child: Text(_processPhase),
          ),
          LinearProgressIndicator(
            value: _progress,
          ),
        ],
      ),
    );
  }
}
```

## Uploading the files

Now that encoding is done, we need to upload the files to Cloud Storage.

### Uploading a single file to Cloud Storage

Uploading to Cloud Storage is quite straightforward. We get a StorageReference to the path where we want the file to be stored with FirebaseStorage.instance.ref().child(folderName).child(fileName). Then we call ref.putFile(file), and listen to the event stream with _onUploadProgress, where we update the _progress state field like we did with the encoding. When the upload is done, await taskSnapshot.ref.getDownloadURL() will return the URL we can use to access the file.

```dart
Future<String> _uploadFile(filePath, folderName) async {
  final file = new File(filePath);
  final basename = p.basename(filePath);

  final StorageReference ref =
      FirebaseStorage.instance.ref().child(folderName).child(basename);
  StorageUploadTask uploadTask = ref.putFile(file);
  uploadTask.events.listen(_onUploadProgress);
  StorageTaskSnapshot taskSnapshot = await uploadTask.onComplete;
  String videoUrl = await taskSnapshot.ref.getDownloadURL();
  return videoUrl;
}

void _onUploadProgress(event) {
  if (event.type == StorageTaskEventType.progress) {
    final double progress =
        event.snapshot.bytesTransferred / event.snapshot.totalByteCount;
    setState(() {
      _progress = progress;
    });
  }
}
```
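One caveat worth noting: calling putFile without metadata lets Cloud Storage infer the content type, and some HLS players are picky about the MIME type of playlists. If you hit playback issues, you can set it explicitly. A sketch, assuming the StorageMetadata class that the firebase_storage plugin of this era accepted as an optional putFile argument:

```dart
// application/x-mpegURL is the conventional MIME type for .m3u8 playlists
// (video/MP2T for the .ts segments).
StorageUploadTask uploadTask = ref.putFile(
  file,
  StorageMetadata(contentType: 'application/x-mpegURL'),
);
```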
## Fixing the HLS files

Now we need to go over all the generated HLS files (.ts and .m3u8) and upload them into the Cloud Storage folder. But before we do, we need to fix the .m3u8 files so that they point to the correct URLs relative to their place in Cloud Storage. This is how the files are created on the client:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.760000,
1_fileSequence_0.ts
#EXT-X-ENDLIST
```

Notice the line 1_fileSequence_0.ts. This is the relative path to the .ts chunk in the playlist. But when we upload this to a folder, it's missing the folder name from the URL. It's also missing the ?alt=media query parameter, which is required to get the actual file from Firebase, and not just its metadata. This is how it should look:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.760000,
video4494%2F1_fileSequence_0.ts?alt=media
#EXT-X-ENDLIST
```

So we need a function that adds these two things to each .ts entry, and also to each .m3u8 entry in the master playlist:

```dart
void _updatePlaylistUrls(File file, String videoName) {
  final lines = file.readAsLinesSync();
  var updatedLines = List<String>();

  for (final String line in lines) {
    var updatedLine = line;
    if (line.contains('.ts') || line.contains('.m3u8')) {
      updatedLine = '$videoName%2F$line?alt=media';
    }
    updatedLines.add(updatedLine);
  }
  final updatedContents =
      updatedLines.reduce((value, element) => value + '\n' + element);

  file.writeAsStringSync(updatedContents);
}
```

## Uploading the HLS files

Finally we'll go through all the generated files and upload them, fixing the .m3u8 files as necessary:

```dart
Future<String> _uploadHLSFiles(dirPath, videoName) async {
  final videosDir = Directory(dirPath);

  var playlistUrl = '';

  final files = videosDir.listSync();
  int i = 1;
  for (FileSystemEntity file in files) {
    final fileName = p.basename(file.path);
    final fileExtension = getFileExtension(fileName);
    if (fileExtension == 'm3u8') _updatePlaylistUrls(file, videoName);

    setState(() {
      _processPhase = 'Uploading video file $i out of ${files.length}';
      _progress = 0.0;
    });

    final downloadUrl = await _uploadFile(file.path, videoName);

    if (fileName == 'master.m3u8') {
      playlistUrl = downloadUrl;
    }

    i++;
  }

  return playlistUrl;
}
```

This is how the uploaded files look in Cloud Storage:

## Saving metadata to Firestore

After we have the video's storage URL, we can save the metadata in Firestore, allowing us to share the videos instantly between users. As we saw in the previous post, saving the metadata to Firestore is easy:

```dart
await Firestore.instance.collection('videos').document().setData({
  'videoUrl': video.videoUrl,
  'thumbUrl': video.thumbUrl,
  'coverUrl': video.coverUrl,
  'aspectRatio': video.aspectRatio,
  'uploadedAt': video.uploadedAt,
  'videoName': video.videoName,
});
```
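The snippets pass around a VideoInfo object whose definition isn't shown in this post. For reference, a minimal sketch consistent with the fields saved above (the actual class in the repo may differ; the field types here are assumptions):

```dart
// Hypothetical model mirroring the Firestore document fields.
class VideoInfo {
  final String videoUrl;    // download URL of master.m3u8
  final String thumbUrl;    // download URL of the thumbnail jpg
  final String coverUrl;
  final double aspectRatio; // height / width
  final int uploadedAt;     // millisecondsSinceEpoch
  final String videoName;

  VideoInfo({
    this.videoUrl,
    this.thumbUrl,
    this.coverUrl,
    this.aspectRatio,
    this.uploadedAt,
    this.videoName,
  });
}
```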
This is how the video document looks in Firestore:

## Putting it all together

Now we'll put it all together in a processing function that goes through all the stages we've seen, updating the state to display the current job status (the rawVideoFile input comes from the image_picker output):

```dart
Future<void> _processVideo(File rawVideoFile) async {
  final String rand = '${new Random().nextInt(10000)}';
  final videoName = 'video$rand';
  final Directory extDir = await getApplicationDocumentsDirectory();
  final outDirPath = '${extDir.path}/Videos/$videoName';
  final videosDir = new Directory(outDirPath);
  videosDir.createSync(recursive: true);

  final rawVideoPath = rawVideoFile.path;
  final info = await EncodingProvider.getMediaInformation(rawVideoPath);
  final aspectRatio = EncodingProvider.getAspectRatio(info);

  setState(() {
    _processPhase = 'Generating thumbnail';
    _videoDuration = EncodingProvider.getDuration(info);
    _progress = 0.0;
  });
  final thumbFilePath =
      await EncodingProvider.getThumb(rawVideoPath, thumbWidth, thumbHeight);

  setState(() {
    _processPhase = 'Encoding video';
    _progress = 0.0;
  });
  final encodedFilesDir =
      await EncodingProvider.encodeHLS(rawVideoPath, outDirPath);

  setState(() {
    _processPhase = 'Uploading thumbnail to firebase storage';
    _progress = 0.0;
  });
  final thumbUrl = await _uploadFile(thumbFilePath, 'thumbnail');
  final videoUrl = await _uploadHLSFiles(encodedFilesDir, videoName);

  final videoInfo = VideoInfo(
    videoUrl: videoUrl,
    thumbUrl: thumbUrl,
    coverUrl: thumbUrl,
    aspectRatio: aspectRatio,
    uploadedAt: new DateTime.now().millisecondsSinceEpoch,
    videoName: videoName,
  );

  setState(() {
    _processPhase = 'Saving video metadata to cloud firestore';
    _progress = 0.0;
  });
  await FirebaseProvider.saveVideo(videoInfo);

  setState(() {
    _processPhase = '';
    _progress = 0.0;
    _processing = false;
  });
}
```

## Showing the video list

We saw previously how to listen to Firestore and display a ListView of videos. In short, we use snapshots().listen() to listen to the update stream, and ListView.builder() to create a list that reacts to changes in the stream, via the _videos state field.
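The listener itself was covered in the previous post; for completeness, here's a minimal sketch of it, assuming a hypothetical VideoInfo.fromJson factory that parses the document map:

```dart
void _listenToVideos() {
  // Rebuild the list whenever the 'videos' collection changes.
  Firestore.instance.collection('videos').snapshots().listen((snapshot) {
    final videos = snapshot.documents
        .map((doc) => VideoInfo.fromJson(doc.data)) // hypothetical factory
        .toList();
    setState(() {
      _videos = videos;
    });
  });
}
```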
For each video we display a Card containing a FadeInImage.memoryNetwork showing the video's thumbUrl, and next to it the videoName and uploadedAt fields. I've used the timeago plugin to display the upload time in a friendly way.

```dart
_getListView() {
  return ListView.builder(
      padding: const EdgeInsets.all(8),
      itemCount: _videos.length,
      itemBuilder: (BuildContext context, int index) {
        final video = _videos[index];
        return GestureDetector(
          onTap: () {
            Navigator.push(
              context,
              MaterialPageRoute(
                builder: (context) {
                  return Player(
                    video: video,
                  );
                },
              ),
            );
          },
          child: Card(
            child: new Container(
              padding: new EdgeInsets.all(10.0),
              child: Stack(
                children: <Widget>[
                  Row(
                    crossAxisAlignment: CrossAxisAlignment.start,
                    children: <Widget>[
                      Stack(
                        children: <Widget>[
                          Container(
                            width: thumbWidth.toDouble(),
                            height: thumbHeight.toDouble(),
                            child: Center(child: CircularProgressIndicator()),
                          ),
                          ClipRRect(
                            borderRadius: BorderRadius.circular(8.0),
                            child: FadeInImage.memoryNetwork(
                              placeholder: kTransparentImage,
                              image: video.thumbUrl,
                            ),
                          ),
                        ],
                      ),
                      Expanded(
                        child: Container(
                          margin: new EdgeInsets.only(left: 20.0),
                          child: Column(
                            crossAxisAlignment: CrossAxisAlignment.start,
                            mainAxisSize: MainAxisSize.max,
                            children: <Widget>[
                              Text("${video.videoName}"),
                              Container(
                                margin: new EdgeInsets.only(top: 12.0),
                                child: Text(
                                  'Uploaded ${timeago.format(new DateTime.fromMillisecondsSinceEpoch(video.uploadedAt))}',
                                ),
                              ),
                            ],
                          ),
                        ),
                      ),
                    ],
                  ),
                ],
              ),
            ),
          ),
        );
      });
}
```

And there we have it.

## Limitations

### Jank

Video encoding is a CPU-intensive job. Because Flutter is single-threaded, running FFmpeg encoding will cause jank (choppy UI) while it's running. The solution is of course to offload the encoding to a background process. Regrettably, I haven't found a way to do this easily with flutter_ffmpeg. If you have a working approach for long-running video encoding jobs in the background, please let me know! (To mitigate the jank effects, you can show an encoding progress bar and not allow any other use of the UI until it's done.)

### Premature termination

Another problem with long encoding/uploading jobs is that the OS can decide to close your app's process when it's minimized, before the job has completed. You have to manage the job status and restart/resume the job from where it stopped.

### Public access to stored videos

Serving HLS from Cloud Storage with this method requires public read access to all files. If you need authenticated access to the videos, you'll have to find a way to dynamically update the .m3u8 playlists with the Firebase token each time the client downloads them (because the token will be different).

### Caching

The Flutter video_player plugin doesn't currently support caching. You can try to use this fork, which supports caching (I use it in my app), but I haven't tested it with HLS. To use it, add this in pubspec.yaml:

```yaml
video_player:
  git:
    url: git://github.com/syonip/plugins.git
    ref: a669b59
    path: packages/video_player
```

## Wrap Up

I think this is a good DIY approach to hosting videos, for two main reasons:

- Minimal amount of coding - there's no server code to write or maintain.
- Very cheap - if you're working on a side project and want to start with zero cost, you can. Firebase's free plan has a 1 GB storage and 10 GB/month transfer limit, which is good to start with.

Thanks for reading! As always, full source code can be found on GitHub.

If you have any questions, please leave a comment!