5/08/2012

Ad-hoc camera sharing from Android to multiple parallel viewers on Android and Desktop -- part 2 of 2



As part of our ongoing "Ad-hoc Mobile Ecosystem" project we researched the “Ad-hoc camera sharing” use-case.

Due to our user-centric concept, where we try to minimize switches in a technical context, we decided to integrate ad-hoc real-time camera sharing from other mobile users into the Augmented Reality view as the next building block.

This is part two of two.

Results of the Review and Evaluation Steps

(1) SipDroid based steps:

  • First iteration of our GitHub RTSP-Camera project
  • Runs with the H263-1998 packetizer from SipDroid
  • Newly implemented feature: sending the same payload to a list of RTP/UDP clients in parallel (a sketch of this sender pattern follows the list)
  • First implementation of our "Android embedded RTSP Video Server" (part of the GitHub RTSP-Camera project)
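
The sender pattern itself is simple: one UDP socket, one payload, a loop over the client list. A minimal sketch of that pattern (class and method names are our own, not SipDroid's):

    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetSocketAddress;
    import java.net.SocketException;
    import java.util.List;
    import java.util.concurrent.CopyOnWriteArrayList;

    // Fans the identical RTP payload out to every subscribed RTP/UDP client.
    public class MultiClientRtpSender {
        private final DatagramSocket socket;
        private final List<InetSocketAddress> clients =
                new CopyOnWriteArrayList<InetSocketAddress>();

        public MultiClientRtpSender() throws SocketException {
            socket = new DatagramSocket(); // one local socket serves all clients
        }

        public void addClient(InetSocketAddress client)    { clients.add(client); }
        public void removeClient(InetSocketAddress client) { clients.remove(client); }

        // Called once per RTP packet produced by the packetizer.
        public void send(byte[] rtpPacket, int length) throws IOException {
            for (InetSocketAddress client : clients) {
                socket.send(new DatagramPacket(rtpPacket, length, client));
            }
        }
    }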

Result:

  • VLC is able to show the video stream with an estimated delay of 1.5 seconds
  • The Android Media API “VideoView” starts by buffering the incoming video stream, which takes 10-12 seconds!
  • There are no configuration options or tricks to circumvent this (see the sketch below); the delay is not acceptable for our use-case
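
For illustration, this is roughly how such a stream is consumed with VideoView inside an Activity; the URL is a placeholder. Note that the API exposes no buffer-size knob at all:

    // Hypothetical stream URL; VideoView offers no buffer-size parameter,
    // so the 10-12 s startup delay cannot be tuned away.
    final VideoView view = (VideoView) findViewById(R.id.video);
    view.setVideoURI(Uri.parse("rtsp://192.168.0.10:8086/stream"));
    view.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            view.start(); // fires only after the internal buffer has filled
        }
    });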

(2) SpyDroid based steps:

  • Next step: H264 encoding, to evaluate the VideoView delay
  • Integrated the H264 packetizer from SpyDroid (the fragmentation rule it implements is sketched after this list)
  • Adapted the H264 packetizer to our multi-client RTSP Video Server RTP/UDP sender pattern
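
For background: RFC 3984 requires H264 NAL units larger than the network MTU to be split into FU-A fragments. A minimal sketch of that rule, with our own naming (sendRtp is a hypothetical helper that wraps the payload in an RTP header):

    // Splits one H264 NAL unit (start code already stripped) into
    // RFC 3984 FU-A fragments that each fit into a single RTP packet.
    static final int MAX_PAYLOAD = 1400; // conservative MTU-derived limit

    void packetize(byte[] nalu) {
        if (nalu.length <= MAX_PAYLOAD) {
            sendRtp(nalu);             // fits: single NAL unit packet
            return;
        }
        int nalType = nalu[0] & 0x1F;
        byte fuIndicator = (byte) ((nalu[0] & 0xE0) | 28); // NAL type 28 = FU-A
        int offset = 1; // the original NAL header byte is not repeated
        while (offset < nalu.length) {
            int chunk = Math.min(MAX_PAYLOAD - 2, nalu.length - offset);
            byte fuHeader = (byte) nalType;
            if (offset == 1) fuHeader |= 0x80;                   // S (start) bit
            if (offset + chunk == nalu.length) fuHeader |= 0x40; // E (end) bit
            byte[] fragment = new byte[2 + chunk];
            fragment[0] = fuIndicator;
            fragment[1] = fuHeader;
            System.arraycopy(nalu, offset, fragment, 2, chunk);
            sendRtp(fragment);
            offset += chunk;
        }
    }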

Results:

  • VideoView's RTSP support does not include H264 decoding
  • VLC needs detailed “sprop-parameter-sets” in the SDP file to define SPS (Sequence Parameter Sets) & PPS (Picture Parameter Sets) according to level / profile / width / height / framerate (see the SDP sketch after this list)
  • Running VLC with debugging enabled is essential to understand the different errors:
    • VideoLAN\VLC\vlc.exe --extraintf=http:logger --verbose=6 --file-logging --logfile=vlc-log.txt
  • For example, a missing or wrong “sprop-parameter-sets” in the SDP ends up with a continuous:
    • [112aded0] packetizer_h264 packetizer warning: waiting for SPS/PPS
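
To illustrate what VLC expects, a sketch of how such an SDP could be assembled from the raw SPS/PPS NAL units (helper name and field values are illustrative):

    import android.util.Base64;

    // Builds a minimal SDP for an H264 RTP stream. VLC will not decode until
    // sprop-parameter-sets carries valid Base64-encoded SPS and PPS NAL units.
    static String buildSdp(byte[] sps, byte[] pps, String profileLevelId, int rtpPort) {
        String sprop = Base64.encodeToString(sps, Base64.NO_WRAP) + ","
                     + Base64.encodeToString(pps, Base64.NO_WRAP);
        return "v=0\r\n"
             + "o=- 0 0 IN IP4 127.0.0.1\r\n"
             + "s=RTSP-Camera\r\n"
             + "t=0 0\r\n"
             + "m=video " + rtpPort + " RTP/AVP 96\r\n"
             + "a=rtpmap:96 H264/90000\r\n"
             + "a=fmtp:96 packetization-mode=1;profile-level-id=" + profileLevelId
             + ";sprop-parameter-sets=" + sprop + "\r\n";
    }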

(3) IP-Webcam based steps:

  • Next step: MJPEG encoding, streamed through an HTTP server
  • GitHub HTTP-Camera project

Results:

  • Straightforward implementation as an alternative to the RTSP-stream-based approach, but with shortcomings regarding bandwidth and client support (the multipart response pattern is sketched after this list)
  • The HTML <img> tag in modern desktop browsers (Firefox, Chrome, …) does support streamed MJPEG rendering.
  • The HTML <img> tag in the Android WebView doesn't support streamed MJPEG rendering!
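
The HTTP side of the MJPEG approach boils down to one long-lived multipart/x-mixed-replace response. A sketch of the per-frame writing, with socket handling omitted and names of our choosing:

    import java.io.IOException;
    import java.io.OutputStream;

    // Streams MJPEG: one HTTP response, an endless sequence of JPEG parts.
    static void writeHttpHeader(OutputStream out) throws IOException {
        out.write(("HTTP/1.0 200 OK\r\n"
                 + "Content-Type: multipart/x-mixed-replace; boundary=frame\r\n"
                 + "\r\n").getBytes("US-ASCII"));
    }

    static void writeFrame(OutputStream out, byte[] jpeg) throws IOException {
        out.write(("--frame\r\n"
                 + "Content-Type: image/jpeg\r\n"
                 + "Content-Length: " + jpeg.length + "\r\n"
                 + "\r\n").getBytes("US-ASCII"));
        out.write(jpeg);
        out.write("\r\n".getBytes("US-ASCII"));
        out.flush(); // push each frame immediately; the browser repaints <img>
    }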

(4) IMSDroid based steps:

  • Test a streaming session between two Android phones against a locally installed OpenSIP server

Results:

  • Delay is under one second
  • Held back as a fallback in case the Orange code base fails

(5a) Orange RCS/IMS H263 steps:

We preferred Orange Labs' native encoding project over IMSDroid due to its maturity, code structure, and active maintenance.
  • Extracted the H263-2000 native encoder for RTSP-Camera
  • Adapted it to our multi-client RTSP Video Server RTP/UDP sender pattern
  • Extracted the H263-2000 native decoder for RTSP-Viewer

Results:

  • Smooth streaming playback in both VLC and our Android RTSP-Viewer app

(5b) Orange RCS/IMS H264 steps:

  • Extracted the H264 native encoder for RTSP-Camera
  • Adapted it to our multi-client RTSP Video Server RTP/UDP sender pattern
  • Extracted the H264 native decoder for RTSP-Viewer

Results:

The main difference between H264 and H263 is the need for SPS/PPS information. There are two options for sending these parameters:
  • out-of-band, which means either encoded in the SDP “sprop-parameter-sets” attribute or at the very beginning of the stream
  • in-band, which means with each key-frame (IDR)
The native H264 decoder from PV OpenCORE does not support SDP files, at least not through the JNI interface provided by Orange. It relies instead on SPS/PPS embedded as NAL units within the stream.

Switching the default “out-of-band” parameter handling to “in-band” was the result of a very deep debugging and learning session. Our use-case has to support “mid-stream” starting of the decoder, i.e. a client wants to join an already running stream. This differs from Orange's supported use-case of an explicitly initiated session between two partners sharing their view. The latter works with a one-time SPS/PPS package (out-of-band); our use-case failed with it.

Interestingly enough, the underlying PV OpenCORE code supports a parameter to switch exactly that handling, called “out_of_band_param_set”.

The obvious Internet research upfront turned up zero source-code snippets showing an example or project with the correct usage of “in-band” parameters.

Changing the logic on our own, namely
  1. encoding with continuous SPS/PPS NAL unit injection before each IDR NAL unit (sketched below) and
  2. bringing decoding with in-band SPS/PPS to life,
took two days of work.
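
The encoder-side half of that work reduces to one small rule: re-send the cached SPS and PPS immediately before every IDR access unit. A sketch under our own naming (sendNalu is a hypothetical helper):

    private static final int NAL_TYPE_IDR = 5; // coded slice of an IDR picture

    private byte[] sps, pps; // cached from the encoder's configuration output

    // Called for every encoded NAL unit (start code already stripped).
    void onNalUnit(byte[] nalu) {
        if ((nalu[0] & 0x1F) == NAL_TYPE_IDR) {
            // In-band parameter sets: a client joining mid-stream can
            // initialize its decoder from the very next IDR frame.
            sendNalu(sps);
            sendNalu(pps);
        }
        sendNalu(nalu);
    }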

In the end we found the right encoder logic to build the correct order of NAL units, and we found a bug deep down in the C++ decoding sources. This convinced us that the fruitless code-snippet research had a simple explanation: we are apparently the first to bring this use-case to life on top of this library.

At the same time it became clear that the Android Media APIs, based on the same Stagefright / OpenCORE libraries, are not able to support mid-stream decoding.

In our case, VLC and the Android RTSP-Viewer are now able to start rendering mid-stream in an H264 encoded stream.

Summary

We learned the pros & cons of all the different approaches through this project. Based on the lessons learned we are now able to estimate risk and effort for further steps accurately. Moreover, we are now able to discuss architectures and evaluate other solutions.

The confusion that still surrounds discussions of mobile video-conferencing on the net is now mostly cleared up for us.

A critical element of our development approach was the very challenging low-level debugging that this project required. For example, none of the extracted encoder solutions or TCP/UDP based communications worked instantly. Every evaluation and coding step needed serious debugging and a solid understanding of the communication definitions and protocol stack.

"Ad-hoc Expertise" has been proven again as a vital part of our core competencies.

We solved every single encoder / decoder extraction and its adaptation to our RTSP Video Server multi-client framework. But to get there we had to learn the following new technologies "on the fly":
  • RFCs (RTSP, H263/H264 over RTP, H263 and H264 de/encoding, ...),
  • TCP/UDP/RTP/H26x protocol analysis through Wireshark,
  • Native C++ library debugging based on the Android NDK

Our working proof-of-concept code skeleton for the “Ad-hoc video sharing” use-case (including parts of the intermediate steps, such as Android Media API based versus native JNI encoding/decoding) is provided for the benefit of all under GPLv3 on GitHub:
