My iOS app got approved and is now available for sale! I’ll be updating the web site today.
So chuffed with that, there’ll be plenty more updates to come!
Apple wanted a demo video as part of the review, so I knocked a quick one up yesterday. It takes you on a walkthrough of the app’s capabilities and shows how the projection mapping is done.
I’ll need to make a professional-looking one at some point.
Got rejected, but not because of the binary, which is good: they want a video of how to use it for the review, so I’m busy making that today. I was a bit scared they’d reject it straight away, as the application is a little unusual for an iOS device. I was worried that the fact it does nothing without a projector connected would get it instantly rejected, so being asked for a video of how it’s used is encouraging.
After a busy couple of weeks, I finished up the iOS app and it’s now been submitted for review. It’s my first app submission, so let’s see how it goes…
As well as the new dev blog, some changes will be coming to prepare for the iOS release of the Reality Augmenter.
Currently putting the finishing touches to the iOS version of the Reality Augmenter. It’s pretty much feature-complete and reliable; I’m just tweaking the user interface.
It’s taken a lot to get here. I wanted to share as much of the code as possible with the OSX version. I’d already stuck to an MVC design pattern, which meant most of the iOS work was rebuilding the view layer and rethinking how it would work in an iOS environment. I found storyboards very easy to pick up and a good way to work, and once I’d got the hang of tab and navigation controllers, the basic app came together very quickly.
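The sharing pattern described above can be sketched roughly like this (all names here are illustrative, not the app’s actual types): the model and controller know nothing about AppKit or UIKit, and each platform supplies its own view layer behind a small protocol.

```swift
// Shared model object: plain data, no UI framework dependencies.
struct ProjectionArea {
    var name: String
    var corners: [(Double, Double)]
}

// Implemented by an NSViewController subclass on OSX and a
// UITableViewController subclass on iOS.
protocol AreaListView: AnyObject {
    func reload(with names: [String])
}

// Shared controller logic: identical source on both platforms.
final class AreaListController {
    private(set) var areas: [ProjectionArea] = []
    weak var view: AreaListView?

    func add(_ area: ProjectionArea) {
        areas.append(area)
        view?.reload(with: areas.map { $0.name })
    }
}
```

Only the concrete `AreaListView` implementations differ per platform; everything above compiles unchanged in both targets.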
On the controller side, I’d implemented quite a few things with NSArrayControllers, which aren’t available on iOS. There wasn’t too much going on in them, though, so I could implement some shared classes between iOS and OSX that plugged into the original array controllers and provided the functionality I needed on iOS.
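A minimal stand-in for the NSArrayController duties the app actually relies on might look something like this — a sketch under my own assumptions about what those shared classes cover (content, ordering, selection), not the actual implementation:

```swift
import Foundation

// Hypothetical shared replacement for the slice of NSArrayController the
// app uses. On OSX it can sit alongside the real controller; on iOS it
// backs a table view directly.
final class SharedArrayController<Element> {
    private(set) var content: [Element] = []
    var selectionIndex: Int? = nil

    // Re-sorts content whenever the ordering changes, mirroring
    // NSArrayController's sort-descriptor behaviour in closure form.
    var sort: ((Element, Element) -> Bool)? {
        didSet { rearrange() }
    }

    var arrangedObjects: [Element] { return content }

    func add(_ object: Element) {
        content.append(object)
        rearrange()
    }

    func remove(at index: Int) {
        content.remove(at: index)
        // Clamp the selection so it never points past the end.
        if let sel = selectionIndex, sel >= content.count {
            selectionIndex = content.isEmpty ? nil : content.count - 1
        }
    }

    private func rearrange() {
        if let sort = sort { content.sort(by: sort) }
    }
}
```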
The toughest part was upgrading OpenGL. The OSX version originally used a legacy profile because of the Quartz Composer functionality. While there is a fixed-function pipeline in ES 1, I really wanted to be able to use later versions of ES, which meant upgrading the original code to the Core Profile. As a result, practically all of the OpenGL code is shared between OSX and iOS, with only different shaders covering the differences between Core and ES. I could probably work on a shader compiler to reduce the differences further, but for now I just have different shaders for different versions.
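One common way to keep per-version shaders manageable, sketched here with made-up shader code rather than the app’s real shaders, is to vary only the version preamble and attribute declarations while sharing the body:

```swift
// Core Profile 3.2 uses GLSL 150 (in/out qualifiers); GLSL ES 1.00 uses
// attribute/varying. Keeping the body restricted to constructs valid in
// both dialects means only the preamble differs.
enum GLTarget { case core32, es2 }

func vertexShaderSource(for target: GLTarget) -> String {
    let preamble: String
    switch target {
    case .core32:
        preamble = "#version 150\nin vec4 position;\nout vec2 uv;\n"
    case .es2:
        preamble = "#version 100\nattribute vec4 position;\nvarying vec2 uv;\n"
    }
    let body = """
    uniform mat4 mvp;
    void main() {
        uv = position.xy * 0.5 + 0.5;
        gl_Position = mvp * position;
    }
    """
    return preamble + body
}
```

This is effectively a hand-rolled version of the “shader compiler” idea mentioned above: the preamble swap could be generalised into a preprocessing pass.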
Of course, now there is Metal, and Swift has been taking off too. Once I complete the iOS version and migrate its new features back into the OSX version, I’ll be looking to switch the code base over to Swift and Metal.
I’m going to start posting development updates on my own site. You can find my previous development blog on Tumblr here.
Another update that’s been a long time coming. The main focus of this update is the underlying OpenGL code, which has been upgraded to support Core Profile 3.2. This was done for a variety of reasons: to enable better code sharing with a future iOS app, to better support new features coming soon, and generally to bring the code up to modern OpenGL standards. The fixed-function pipeline is gone, drawing has been simplified, and the shaders have been upgraded.
In addition, I’ve fixed the annoying hang or crash that could occur, mainly when adding or removing chop areas. A little more GPU memory is consumed when using views with previews; the output view has no such previews and should be used when optimal performance is required. Drawing to GUI views is now done only on the main thread, and the background rendering no longer tries to lock views, so it should be more performant. This means a bit more latency on the previews, but the full-screen projection areas still render at the maximum refresh rate with as little latency as possible.
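The threading split above can be sketched like this (names and structure are my own illustration, not the app’s code): the render thread never touches a view; it just decides whether a main-thread preview redraw needs scheduling, coalescing bursts of frames into one update. That coalescing is exactly where the extra preview latency comes from.

```swift
import Foundation

// Tracks whether a preview redraw is already queued on the main thread,
// so the background render loop never blocks on a view lock.
final class PreviewThrottle {
    private var pending = false
    private let lock = NSLock()

    // Called from the render thread for every frame. Returns true only
    // when a main-thread redraw should actually be dispatched; further
    // frames are absorbed until the redraw completes.
    func shouldScheduleRedraw() -> Bool {
        lock.lock(); defer { lock.unlock() }
        if pending { return false }
        pending = true
        return true
    }

    // Called on the main thread once the preview has been drawn.
    func redrawCompleted() {
        lock.lock(); defer { lock.unlock() }
        pending = false
    }
}
```

In use, the render loop would call `shouldScheduleRedraw()` each frame and, when it returns true, `DispatchQueue.main.async` a redraw that ends with `redrawCompleted()`. The full-screen output path bypasses this entirely and draws every frame.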
Using the Reality Augmenter to map custom visuals to my turntables and mixer:
You can get the QC compositions used here.
I got myself some all-white skinz for my Technics and mixer from 12inchskinz; they look amazing. I did a couple of quick QC files using shaders and masks to show them off:
I added the QC files here:
Really simple shader means it’s super quick.