
Live stream an ONVIF Camera on your Android app! 📱

Remy Virin (@remy), iOS dude

ONVIF (Open Network Video Interface Forum) is "a non-profit with the goal of facilitating the development and use of a global open standard for the interface of physical IP-based security products" — Wikipedia
Being able to control your house, open doors, view your living room in real time, and control the lights is a childhood dream! I was delighted to develop an app, and a dependency, to ease the development of ONVIF Android apps.
ONVIF's goal is to standardise how IP products (video surveillance cameras, alarms, doors, audio recorders…) communicate with each other. It publishes specifications that manufacturers have to conform to in order to be ONVIF-compliant. The idea behind these specifications is that connecting to these products works the same way everywhere: if you develop an app for streaming video from an ONVIF camera, it should work with every ONVIF camera.
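To make that concrete, here is a minimal, illustrative sketch (not the library's code) of what an ONVIF call looks like on the wire: every compliant device answers the same SOAP 1.2 requests over plain HTTP. The endpoint path in the comment is an assumption; real cameras vary.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// GetDeviceInformation is just an HTTP POST of a small SOAP 1.2 envelope.
fun deviceInformationEnvelope(): String = """
    <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope">
      <s:Body>
        <GetDeviceInformation xmlns="http://www.onvif.org/ver10/device/wsdl"/>
      </s:Body>
    </s:Envelope>
""".trimIndent()

// Hypothetical endpoint; many cameras serve it at /onvif/device_service.
fun callOnvif(endpoint: String): String {
    val conn = URL(endpoint).openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "application/soap+xml; charset=utf-8")
    conn.outputStream.use { it.write(deviceInformationEnvelope().toByteArray()) }
    return conn.inputStream.bufferedReader().use { it.readText() }
}
```

Because every manufacturer answers the same request, the same client code works against any ONVIF camera.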

Try it!

If you just want to try the demo project, you can download the sample project on Github. Open it with Android Studio and run it, that’s it!
You will be able to log in to any ONVIF camera and view its live stream:

Streaming from an ONVIF camera

How to connect to a camera and visualise its live stream on Android? 👨🏽‍💻

  • Create a new project in Android Studio
  • Add the ONVIF library's two implementation dependencies to your build.gradle (Module: app)
Once it's done, you can add the following code to connect to the camera and retrieve its information:
class MainActivity : AppCompatActivity(), OnvifListener {

    private lateinit var currentDevice: OnvifDevice

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        currentDevice = OnvifDevice("IP_ADDRESS:PORT", "login", "pwd")
        currentDevice.listener = this
        // Ask the camera for its services first (see the note below)
        currentDevice.getServices()
    }

    override fun requestPerformed(response: OnvifResponse) {
        Log.d("ONVIF", "Request ${response.request.type} performed.")
        Log.d("ONVIF", "Succeeded: ${response.success}, message: ${response.parsingUIMessage}")
        if (response.request.type == GetServices) {
            currentDevice.getDeviceInformation()
        }
    }
}
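As background, the login and password passed to the constructor are not sent in clear: the ONVIF specifications authenticate SOAP requests with a WS-Security UsernameToken digest. The library handles this for you; the sketch below only illustrates the digest computation.

```kotlin
import java.security.MessageDigest
import java.util.Base64

// WS-Security UsernameToken digest used by ONVIF:
// digest = Base64( SHA-1( nonce + created + password ) )
fun passwordDigest(nonce: ByteArray, created: String, password: String): String {
    val sha1 = MessageDigest.getInstance("SHA-1")
    sha1.update(nonce)
    sha1.update(created.toByteArray(Charsets.UTF_8))
    sha1.update(password.toByteArray(Charsets.UTF_8))
    return Base64.getEncoder().encodeToString(sha1.digest())
}
```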
To instantiate an ONVIF camera, we use the OnvifDevice class. It takes an OnvifListener, which defines the requestPerformed method. This method is called every time the camera answers a call we made (getDeviceInformation, getProfiles, getStreamUri).
Note: Calling getServices() before requesting the information is not mandatory, but strongly recommended. It retrieves the paths of the different web services you will call, and those paths can change from one camera to another. For instance, the GetDeviceInformation command is served at one path on a Dahua camera and at a different one on a Uniview camera.
If you implement new web service calls in your app, you need to parse their paths in this call.
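Here is a hypothetical sketch of what "parsing the path" means: a GetServices answer lists, for each web service namespace, the XAddr (URL) where that service is served. The XML shape below is simplified for illustration; a real response is a full SOAP envelope.

```kotlin
import java.net.URI

// Map each service namespace to the path part of its XAddr.
fun parseServicePaths(xml: String): Map<String, String> {
    val service = Regex("<Namespace>(.*?)</Namespace>\\s*<XAddr>(.*?)</XAddr>")
    return service.findAll(xml).associate { match ->
        val (namespace, xAddr) = match.destructured
        namespace to URI(xAddr).path // keep only the path part
    }
}
```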
To be able to see the live stream of the camera, we need to retrieve its media profiles (the different configurations available), select the one we want to play, and retrieve the corresponding stream URI.
Here is how we retrieve the profiles and the stream URI:
override fun requestPerformed(response: OnvifResponse) {
    Log.d("ONVIF", "Request ${response.request.type} performed.")
    Log.d("ONVIF", "Succeeded: ${response.success}, message: ${response.parsingUIMessage}")
    if (response.request.type == GetServices) {
        currentDevice.getDeviceInformation()
    } else if (response.request.type == GetDeviceInformation) {
        currentDevice.getProfiles()
    } else if (response.request.type == GetProfiles) {
        currentDevice.getStreamUri()
    } else if (response.request.type == GetStreamURI) {
        Log.d("ONVIF", "Stream URI retrieved: ${currentDevice.rtspURI}")
    }
}
Depending on your camera, your URI will look something like rtsp://IP_ADDRESS:PORT/… (the exact path varies from one manufacturer to another).
The URI does not use the http protocol but the rtsp protocol, which is expected: RTSP (Real Time Streaming Protocol) is designed for controlling real-time media streams. The problem is that Android's built-in media player doesn't handle RTSP well (it can work, but not with every codec, which is a problem when streaming from an ONVIF camera). Fortunately, VLC comes to the rescue!
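As a side note, an RTSP URI follows the generic URI syntax, so java.net.URI can pick it apart; the URI below is made up for illustration.

```kotlin
import java.net.URI

// Extract the pieces a player or a log line typically needs.
fun streamEndpoint(rtsp: String): Triple<String, Int, String> {
    val uri = URI(rtsp)
    return Triple(uri.host, uri.port, uri.path)
}
```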

How to read a RTSP stream on Android? 🎥

Add these lines to your build.gradle:
allprojects {
  repositories {
    maven { url 'https://jitpack.io' }
  }
}

dependencies {
  compile 'com.github.pedroSG94.vlc-example-streamplayer:pedrovlc:2.5.14'
}
Here is how you play the video with VlcVideoLibrary once you have the rtsp URI:
class StreamActivity : AppCompatActivity(), VlcListener {

    private var vlcVideoLibrary: VlcVideoLibrary? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_stream)

        val surfaceView = findViewById<SurfaceView>(R.id.surfaceView)
        vlcVideoLibrary = VlcVideoLibrary(this, this, surfaceView)

        // The RTSP URI retrieved earlier, passed in the launching intent here
        val rtspUri = intent.getStringExtra("rtspUri") ?: return
        vlcVideoLibrary?.play(rtspUri)
    }

    override fun onComplete() {
        Toast.makeText(this, "Playing", Toast.LENGTH_SHORT).show()
    }

    override fun onError() {
        Toast.makeText(this, "Error, make sure your endpoint is correct", Toast.LENGTH_SHORT).show()
    }
}
StreamActivity implements the two VlcListener methods (onComplete and onError). They are called by the VLC library to tell us whether the video loads without any problem.
Tip: You can also insert the login and password directly in the rtsp URI, in the form rtsp://login:password@IP_ADDRESS:PORT/… If you use the ONVIF dependency above, the URI saved in currentDevice.rtspURI already contains the login and password.
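If you build the URI yourself, a small hypothetical helper (not part of the library) can place the credentials in the user-info part of the URI:

```kotlin
import java.net.URI

// Rebuild an RTSP URI as rtsp://login:password@host:port/path.
fun withCredentials(rtspUri: String, login: String, password: String): String {
    val uri = URI(rtspUri)
    return URI(uri.scheme, "$login:$password", uri.host, uri.port,
               uri.path, uri.query, uri.fragment).toString()
}
```

Keep in mind that credentials embedded this way end up in logs and history, so avoid printing such URIs.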
If you want to learn how to create the same project on iOS, you can read this article.
Creating an ONVIF library was an exciting opportunity. As you can see in my previous posts, I really enjoy making my apps interact with hardware devices, whether it's a printer for iPhone, a camera, or even a door! I'm really looking forward to testing new ONVIF devices and developing apps that can be useful to everyone who owns them.
The code of this project is open source, you can download it on Github.
If you have any questions, I'll be happy to read them in the comments!

