Swift and Cameras and GStreamer

Now that I’ve got the basics of Swift, time to start learning how to use AVFoundation to take photos and videos.

  1. James' tutorial gives you the very basics, and here are the gotchas:
  2. If you plug your phone in for the first time while Xcode is running, it may not work properly. I had to quit Xcode, plug the phone in first, and then relaunch Xcode. It will take some time compiling libraries.

Getting GStreamer 1.4 to work with hardware acceleration and Swift. Wow, that's a challenge:

  1. Installation of GStreamer 1.4. It looks like it is just a .pkg file, so you install it, but where it puts the framework is unclear.
  2. The tutorials for GStreamer look a bit old, but what the heck, give them a try. Note that they are from gstreamer.com, which isn't the same as the open source project, so they are tuned for 0.10.
  3. An update of the tutorial to 1.x seems to have happened. I git cloned it, but it doesn't work when you open Xcode, and in fact the author says he never tested it. By spelunking through the Xcode setup, it looks like the tutorial is set up for iOS 6.1 (?!) and that the frameworks are in ~/Library/Developer/GStreamer, which is where a bunch of Xcode gunk lives, not in ~/Library/Frameworks. Also, this tutorial compiles for i386, which doesn't make much sense for iOS 🙂
  4. OpenGL in GStreamer on iOS. Basically, you can use Apple's EAGL, which allows OpenGL contexts on iOS, together with GStreamer, and it provides GPU support for OpenGL manipulations.
  5. Hardware encode and decode acceleration. Apparently with some release you can do this on iOS 8 and GStreamer 1.4, at least for video decode. It isn't clear how to get that working. I think it requires compiling from source, but it isn't clear where he put the `vtdec` and `vtenc_h264` plugins. This is in addition to OpenGL hardware acceleration.
  6. HTTP streaming. A long post I barely understand about streaming over HTTP vs. RTP, but essentially HTTP streaming is supported by Apple directly.
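To make the last point concrete, here's a minimal desktop-side sketch of what an HTTP/HLS playback pipeline looks like (assuming `gst-launch-1.0` from a GStreamer 1.x install; the stream URL is a placeholder). On iOS the same pipeline would be built in code rather than on the command line:

```shell
# playbin auto-plugs the HTTP source, demuxer, and (where available) the
# hardware decoder; the URI below is a placeholder, not a real stream.
PIPELINE='playbin uri=https://example.com/stream.m3u8'
echo "gst-launch-1.0 $PIPELINE"

# To actually test on a desktop GStreamer install:
#   gst-launch-1.0 playbin uri=https://example.com/stream.m3u8
```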

The biggest problem is the lack of documentation:

  1. You can install the .pkg from gstreamer.com, but that one is out of date, so you should install it from the open source site instead.
  2. It isn't clear where the .pkg actually installs, so it is hard to know where the .framework and .h files are. Looking in ~/Library/Frameworks and /Library/Frameworks didn't show anything, so I moved ~/Library/Developer/Gstreamer/gstreamer.framework to /Library/Frameworks, and Xcode seems to find it there.
  3. I then got this error, http://stackoverflow.com/questions/21985909/no-architectures-to-compile-for-only-active-arch-yes-active-arch-x86-64-valid, and the answer says to just flip Build Active Architecture Only to NO. This makes sense, as I'm running a cross-compiled application. Tutorial 1 works after this.
  4. It is also a little confusing how you run different applications in the same Xcode project, something I just learned how to do with `Product/Scheme` and then selecting what you want. Then, by selecting the GStreamer iOS Tutorials project and the individual Targets, you can change the architecture.
  5. However, the rest of the tutorials don't seem to work, with i386 problems. I set Build Active Architecture Only to NO and got a host of `Undefined symbols for architecture i386` errors. Stack Overflow explains that, in addition to adding frameworks at the top (project) level, you need to fix this in Build Phases when you have multiple schemes. With a single scheme, just adding the framework at the top works, but with multiple schemes you have to do it manually for the rest.
  6. Tutorial 2 needs something called VideoToolbox.framework. It isn't documented in the .h files, so it must be needed by GStreamer somewhere; with it added, this now works properly. It uses a delegate protocol and a separate Grand Central Dispatch thread to run the GStreamer pipeline, so conversion to Swift means we need to understand this, as the main work is done by a class `Gstreamerbackend`.
  7. The tutorial recommends using the GStreamer template, but when run on Xcode 6.2, this crashes Xcode!
  8. Tutorials 3, 4, and 5 just need UIKit.framework, VideoToolbox, and GStreamer, and they seem to work.
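As a sanity check, the architecture fiddling above can also be done from the command line; here's a sketch (the project and scheme names are hypothetical, substitute your own):

```shell
# ONLY_ACTIVE_ARCH=NO on the xcodebuild command line matches flipping
# "Build Active Architecture Only" to NO in the build settings.
CMD='xcodebuild -project GStreamerTutorials.xcodeproj -scheme "Tutorial 1" ONLY_ACTIVE_ARCH=NO build'
echo "$CMD"   # run this from the tutorial checkout
```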

So, with the tutorials working, on to figuring out how to use hardware acceleration on the GPU with GStreamer.

  1. This apparently works in the git build, according to Coaxion, but you have to build GStreamer yourself for iOS.
  2. The two plugins are called `vtdec` and `vtenc_h264`, which refer to the VideoToolbox APIs that allow hardware acceleration.
  3. These will eventually go into GStreamer 1.5.1, as ARM support isn't quite there yet; right now we are at the 1.4.5 stable release.
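Once you have a build, one way to check whether the VideoToolbox plugins actually made it in is `gst-inspect-1.0`. A sketch (it degrades gracefully if GStreamer isn't on your PATH at all):

```shell
# Report whether a GStreamer element is available; "missing" also covers
# the case where gst-inspect-1.0 itself isn't installed.
check_element() {
  if gst-inspect-1.0 "$1" >/dev/null 2>&1; then
    echo present
  else
    echo missing
  fi
}

for e in vtdec vtenc_h264; do
  echo "$e: $(check_element "$e")"
done
```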

OK, so how to get the latest GStreamer with Cerbero. First, make sure not to use a VPN; for whatever reason, certain download sites that Cerbero uses will not serve downloads to VPN-based addresses.

git clone git://anongit.freedesktop.org/gstreamer-sdk/cerbero

This lets you run their build facility without having to install it. I'm assuming it lives in ~/ws/git, and note this is different from the realm. The "cross" in the config name indicates the bootstrap is for iOS.

cd ~/ws/git/cerbero

cpan XML::Parser

Now you need to make sure that you can use multiple cores, as Cerbero by default builds single-core. As explained by <a href="https://tausiq.wordpress.com/2014/12/11/ios-gstreamer-framework-custom-build/">tausiq</a>, you need to add `allow-parallel-build = True` to your .cbc file:

echo "allow-parallel-build = True" >> config/cross-ios-universal.cbc

./cerbero-uninstalled -c config/cross-ios-universal.cbc bootstrap

./cerbero-uninstalled -c config/cross-ios-universal.cbc package gstreamer-1.0


At least for me, this seems to fail many times depending on how well the internet is holding up, so you will have to retry quite a bit before it works.
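A crude retry wrapper saves babysitting the build; this is just a sketch (the attempt count and delay are arbitrary):

```shell
# Re-run a command until it succeeds, up to a fixed number of attempts.
retry() {
  n=1
  max=10
  until "$@"; do
    if [ "$n" -ge "$max" ]; then
      echo "giving up after $max attempts" >&2
      return 1
    fi
    echo "attempt $n failed, retrying..." >&2
    n=$((n + 1))
    sleep 5
  done
}

# Usage:
#   retry ./cerbero-uninstalled -c config/cross-ios-universal.cbc bootstrap
#   retry ./cerbero-uninstalled -c config/cross-ios-universal.cbc package gstreamer-1.0
```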

But finally, this will create a gstreamer.pkg, which you should install; that creates a gstreamer.framework, which you can then add to the framework section of your application, and it should build. This takes quite a while.

As an aside, the above fails on vanilla OS X without XML::Parser installed:

XML::Parser... configure: error: XML::Parser perl module is required for intltool

So how do you get this? www.cpan.org explains that you need cpanm, but Macinstruct says not to. Cpanm doesn't install properly, by the way. But before the Parser, you first need the underlying expat library, which does the actual parsing.
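Putting that together, the order that worked for me conceptually is: install expat first, then the perl binding. A sketch, assuming Homebrew is how you get expat (adjust for MacPorts or a source build):

```shell
# expat is the underlying C parser library; XML::Parser is the perl
# binding that intltool requires on top of it.
install_xml_parser() {
  brew install expat   # C library (assumes Homebrew)
  cpan XML::Parser     # perl module required by intltool
}

# Uncomment to actually run:
# install_xml_parser
```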


So I tried another path, which was just to use the Objective-C framework directly from Swift.