Rapid advances in deep learning and the growing processing power of embedded SoCs have enabled many new use cases that require two or more cameras streaming synchronously. While Android offers a straightforward path to developing and shipping vision-enabled embedded products, the OS's original camera framework and APIs limited capabilities such as concurrent streaming and raw-frame extraction, which are essential for these applications.
These limitations were addressed with the introduction of the Camera2 API and the Camera HAL3 architecture, both of which are now fully operational on many Qualcomm Snapdragon SoCs. In addition, the Android Pie release made multiple streams easily accessible from user applications. Several of Intrinsyc's Open-Q System-on-Modules (SOMs) come with Android BSPs that support concurrent camera streaming via the Camera2 API, and an application adapted from a Google sample demonstrating this capability is included with those BSPs. Below is a video showing dual-camera simultaneous streaming using Android's Camera2 framework on Intrinsyc's Open-Q™ 626 Development Kit.
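The concurrent-streaming pattern described above can be sketched with the Camera2 API as follows. This is a minimal illustration, not the BSP sample itself: it assumes an Android context with the CAMERA permission already granted, and the camera IDs "0" and "1" are placeholders for whatever `CameraManager.getCameraIdList()` actually reports on the device.

```java
import android.content.Context;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

/**
 * Minimal sketch: opening two cameras concurrently via the Camera2 API.
 * Each camera gets its own CameraDevice; from onOpened() an application
 * would create a CaptureSession per device and issue repeating preview
 * requests, so both sensors stream independently at the same time.
 */
public class DualCameraOpener {

    public void openBoth(Context ctx, Handler handler) throws Exception {
        CameraManager manager =
                (CameraManager) ctx.getSystemService(Context.CAMERA_SERVICE);

        // Camera IDs "0" and "1" are illustrative; enumerate real IDs
        // with manager.getCameraIdList() in production code.
        for (String id : new String[] {"0", "1"}) {
            manager.openCamera(id, new CameraDevice.StateCallback() {
                @Override
                public void onOpened(CameraDevice camera) {
                    // Create a capture session and start a repeating
                    // preview request for this device here.
                }

                @Override
                public void onDisconnected(CameraDevice camera) {
                    camera.close();
                }

                @Override
                public void onError(CameraDevice camera, int error) {
                    camera.close();
                }
            }, handler);
        }
    }
}
```

Keeping one `CameraDevice` and one capture session per sensor is what lets the dual-ISP Snapdragon pipeline drive both streams simultaneously, as the demo application does.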
Qualcomm Snapdragon SoCs feature robust camera pipelines, with hardware IPs and software algorithms for real-time vision-based applications. Many of these SoCs also have dual ISPs, enabling multiple cameras to interface with the platform and stream at resolutions of 1080p or higher and frame rates of 30 fps or more.
This capability enables features such as stereo vision and depth perception, mixed reality, 360° vision, multi-camera SLAM, multi-focus industrial cameras, seamless zoom, and many other emerging use cases. Coupled with OpenCL, FastCV, and deep learning frameworks that leverage the heterogeneous computing power of the Arm-based Kryo CPU cores, Adreno GPU, and Hexagon DSP, Intrinsyc's Open-Q SOMs enable rapid development of advanced multi-camera devices.