iPhone

Looking for Work

For the first time in nearly seven years, I am now unemployed. Yesterday, along with several other people, I was laid off from my job at The Omni Group, and I’m now looking for new work. UPDATE: Here is a link to my resume PDF and my complete CV.

First of all, thank you to all of my friends and colleagues at Omni for being one of the best groups of people I’ve ever had the pleasure to work with. I’ll be forever grateful that taking a job at Omni is what got me out to the West Coast, and to the lovely city of Seattle. I learned a lot there, and they believed in my abilities enough to give me some truly interesting challenges, including professional development rather tangential to my video production role, in the interest of seeing it enrich my work in unexpected ways.

I won’t go into what happened at Omni in much detail (if you’re interested, my fellow layoff-ee Brent Simmons has more to say about that on his blog). Suffice it to say that with a spouse who is an essential employee at a bakery, I never suspected that an economic cascade leading to this layoff would be the way that the COVID-19 pandemic affected our family. Omni is doing everything they can to make this necessity a little less harsh, and I’m certain they never would have laid people off if it weren’t truly necessary for the company’s survival. I wish them the best, and hope that they come out of this stronger than ever.

But still, I am out of work. Do you have work? Hire me!

People probably know me best for my video production work — please see the output of my last seven years in The Omni Group’s video archives — but I have also done a lot of related development work, and would love to push my career in that direction. Here’s a quick list of some relevant skills:

  • My biggest area of experience: video production, post, editing, and motion graphics. 24 years and counting.
  • I’ve produced two long-running podcasts: The Optical, which I also host, and The Omni Show.
  • For the last 13 or so years, I’ve been working on my programming skillset.
    • I write scripts to automate my video production workflows in Python and JavaScript/ExtendScript.
    • I’ve built websites with a Django back end, and I have a working knowledge of JavaScript, HTML, CSS, and REST APIs.
    • Just about a month ago, I took a Swift + iOS Development Bootcamp at Big Nerd Ranch, so I’m continuing to refresh my skills.
    • I developed and published an app guide to Star Trek for the iPhone, for The Post Atomic Horror Podcast (no longer in the App Store).
    • I was a founding member of NSCoderNightDC, collaboratively learning Objective-C, Mac, and iOS development, and am a member of Xcoders in Seattle, keeping in touch with the local community of Mac and iOS developers.
    • I’m currently in the process of learning Unity and C#, so that I can build a virtual map of Scarecrow Video, the largest publicly-accessible film archive in the world, where I volunteer on a regular basis. I suppose Unity is useful for other things too.
    • For my own themed tiki bar space, I’ve become very familiar with Raspberry Pi and Arduino programming, to control lights, smoke, and (in progress) animatronics for an immersive themed experience.

Vertical Horizon

Ever since the iPhone first started shooting video, people have decried the use of the vertical orientation. Why would you do that? It looks so horrible! It’s unnatural! Hang on a moment while I pass judgement on you.

Stop it.

Let’s take a look at the history of film aspect ratios for a moment. Sure, the first film format was 4:3, just like our old TV sets — slightly wider than it is tall. In fact, TV cribbed the 4:3 ratio from film, and it wasn’t until TV started sucking away some of the film audience that the movies started to get wider and wider and wider.

The point being, aspect ratio is an artistic choice, and mostly a gimmick to get people back in theaters. None of those aspect ratios are “right” — not even 16:9, which was a compromise between many ratios for an acceptable film “fit” when TV stole widescreen back for itself (and pushed the movies into another 3D frenzy, which is a rant for another day). Even 9:16 (the iPhone’s vertical video ratio) is just another choice in a long line of choices.

And why shouldn’t you shoot video vertically? Apple’s own ads show people chatting on FaceTime with the camera held vertically. Our faces are vertical. There are tall buildings, and kids coming down playground slides. I argue that, sometimes, it’s a really good fit.

Most of the arguments against vertical video seem to boil down to one of two things. One is some pseudo-scientific mumbo-jumbo about how our eyes are set horizontally in our heads, so our natural field of vision is wider than it is tall, and we should obey that constraint. (Yes, art is all about obeying natural constraints, and conforming to convention.) The other is an argument that the way we share video now, via YouTube and AirPlaying to our Apple TVs, demands that wide ratio to fit the screen. This latter theory has some merit, but I would argue that the video sharing sites should accommodate multiple aspect ratios in the way they present videos, instead of letterboxing things inside a widescreen frame. (At least Vimeo and Flickr seem to handle this properly.)

That leads us to Horizon, a new app that uses the accelerometer in the phone to detect the angle of the phone while it’s shooting video, and automatically crop the video to a level 16:9 horizontal image.
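Horizon’s code isn’t public, but if you’re curious how an app even knows the phone’s angle, here’s a rough Swift sketch using CoreMotion’s sensor-fused device motion (the updateCrop hook is hypothetical, just to show where the angle would go):

```swift
import CoreMotion

let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0  // 60 Hz is plenty for video

// Sensor fusion gives a much steadier angle than raw accelerometer samples.
motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let attitude = motion?.attitude else { return }
    // Roll is the rotation around the axis pointing out of the screen --
    // the angle the crop must counter-rotate by to stay level.
    updateCrop(for: attitude.roll)  // radians
}

// Hypothetical hook into the cropping logic (sketched below).
func updateCrop(for roll: Double) { /* recompute the level crop */ }
```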

This is super clever, and certainly fills a need — sometimes you do want perfectly level horizontal video, damn the resolution (the crop in vertical mode has only 32% of the resolution of the full image). Probably most people just want a video that looks nice when they play it on their TV, or share it on YouTube. This will do that, and quite nicely.
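Where does a number like 32% come from? It’s just geometry: the biggest level 16:9 rectangle you can inscribe in a tilted frame. Here’s my own reconstruction of the math, not Horizon’s actual code:

```swift
import Foundation

/// The largest level (world-aligned) crop of a given aspect ratio that fits
/// inside a sensor frame tilted by `roll` radians. The crop, rotated by
/// -roll in sensor coordinates, must fit inside the frame, which yields two
/// linear constraints on the crop height.
func levelCropSize(frameWidth: Double, frameHeight: Double, roll: Double,
                   aspect: Double = 16.0 / 9.0) -> (width: Double, height: Double) {
    let c = abs(cos(roll))
    let s = abs(sin(roll))
    let height = min(frameWidth / (aspect * c + s),
                     frameHeight / (aspect * s + c))
    return (width: aspect * height, height: height)
}

// Phone held perfectly vertical (roll = 90 degrees), 1920 x 1080 frame:
let crop = levelCropSize(frameWidth: 1920, frameHeight: 1080, roll: .pi / 2)
// crop ≈ 1080 x 607.5, about 656K of the frame's 2,074K pixels: roughly 32%.
```

At 0° the same formula returns the full 1920 × 1080 frame, which is why you only pay the resolution penalty as you tilt away from landscape.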

The more interesting thing to me is how it enables a unique interface for the zoom function on the phone. Now you can use the angle of the phone to control the crop, instead of clumsily sliding your finger on the screen while you’re trying to hold the phone still.

It’s all about using the camera in creative and unique ways — there is no one “right” way to shoot video. Just a plethora of interesting decisions.

Fingers, Virtual and Otherwise


Recently, a few developers were having a discussion on ADN about a promo video for a new iPad app. Uli Kusterer raised the point that the video was missing something: fingers. It was hard for him to tell what was a tap, and what was something that was merely happening as normal in the app. Since the app in question is a game with a lot of animation and things popping in and out, it’s easy to see how things might get confusing.

Fingers are a really important part of an iOS app demonstration video. There are definitely a few ways to make that happen (and there was some speculation about how it might be done better in that ADN conversation). I’ll get to some techniques you might use later, but first, perhaps we should talk about why it’s so important.

Why Fingers?

When you take a screencast of a Mac, unless you’re doing something really fancy, your cursor appears in the video. The cursor is also known in certain circles as the pointer because, well, it points to things. You can even wave it around and gesture and point to things that you’re not even going to click on. It’s the digital extension of that digit on your hand that you point to things with all day, every day. Always pointing at things, you are.

The cool thing about pointing and gesturing at stuff is that the viewer’s eyes find it really useful, so they can anticipate. The human eye and brain are a pretty good team when it comes to following the motion of an object, and when they have that, it’s a lot easier to follow your cursor from one action to the next than if your cursor, say, suddenly jumped from one side of the screen to the other, and you had to search around to figure out where it had gone.

However, when you take a straight screencast of an iPhone or iPad app, your finger doesn’t show up on the screen. You can’t gesture to things. The viewer’s eye can’t track where you’re going next, so it’s really easy to get lost and frustrated, trying to figure out what’s going on.

Giving Your App the Finger

Now, when I talk about fingers in a video, I don’t necessarily mean a literal human finger. It could be a simple circle representing a finger pad. The important thing is that there’s some way for your viewer to see where you’re going, and be able to follow along.

Let’s take a look at some options for getting that in there. I’ll try to go from least to most complicated, and give you an idea of the pros and cons of each technique. I don’t think any one technique will be right for every app, so it’s good to know your options.

Run your app in the simulator and use SimFinger

  • Maybe you think SimFinger is creaky and old, but you know what? It still does what it needs to do, and does it well.
  • SimFinger replaces your Mac’s cursor with a simulated finger pad that changes from convex to concave when you click, representing a tap with your virtual finger.
  • It also gives you iPhone and iPad frames to put around your simulator screen, if you’re into that kind of thing.
  • Con: It’s the simulator, so it may not look and act exactly the same as the device, or may not be viable if you have features that don’t work well in the simulator. It will also be hard to use this for games with crazy fast multi-tapping action. Or any multi-tapping action at all, really. Your mouse is really only equivalent to one finger, or maybe two if you use the pinch mirroring features in the simulator.

Run your app in the simulator and use ScreenFlow

  • You can also do this kind of cursor replacement with ScreenFlow, and maybe you’re already using ScreenFlow to record anyway. In fact there’s a setting for the mouse pointer called Circle – Light that looks remarkably like Ye Olde SimFinger.
  • You can adjust the transparency of the finger cursor, so that you can always see what’s underneath.
  • Con: Still the simulator.

Shoot a video of a human finger actually using the live app on a device

  • You capture the video all in one go, so the shoot itself seems pretty easy.
  • Con: The shoot is actually terribly difficult, because the color temperature in the environment is very hard to match to the color temperature of the display. And even if you have a fancy bank of LED lights where you can dial up different light temps, it’s still really difficult to make the live app on the device look decent, because the contrast and brightness of your live scene are never going to match the screen very well.
  • Also, there’s a big LED light bank shining on your device, so: GLARE. No, you can’t keep the light off the screen. Your finger needs it.
  • Also also, the video you shoot of a screen is never going to look as clean and sharp as a raw screen recording.
  • Tip: You can maybe get away with it if you want to go for the silhouette effect: use very dim lighting or just accent lights on the real world, and adjust your camera for the best exposure of the screen.
  • Con: Your finger, and maybe the rest of your hand, obscures the screen, so it might still be hard to see what’s going on.

AirPlay to your Mac and record the screen with TouchPosé

  • TouchPosé is a set of classes you can add to your app to make semi-transparent circles appear when you touch the screen (a rough sketch of the idea follows this list).
  • Con: It only registers touches, so I’m not sure I can recommend it. There is nothing for the eye to track from one tap to the next, so I don’t think it’s much of an improvement. It might work for a live demo or a presentation on a projector, where you can explain to people what you’re doing in more detail, but in a fast-paced demo video, it’s easy to lose track of the little tap circles, especially if it’s a game or other app where random circles popping up could well be part of the app’s animation.
  • Caveat: I haven’t used this myself — only seen it in other folks’ videos. Perhaps there’s an option to customize it in such a way that would make it more clear?
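Curious how this kind of thing works under the hood? The core trick is small enough to sketch. This is an illustrative Swift version of the idea, mine rather than TouchPosé’s actual code (which is Objective-C): a UIWindow subclass that watches every event and draws a dot under each touch.

```swift
import UIKit

// Illustrative sketch of the technique, not TouchPosé itself: intercept
// events at the window level and show a translucent dot under each touch.
final class TouchIndicatorWindow: UIWindow {
    // Note: UIKit advises against retaining UITouch objects past the event;
    // a production version would track touches by identity instead.
    private var dots: [UITouch: UIView] = [:]

    override func sendEvent(_ event: UIEvent) {
        for touch in event.allTouches ?? [] {
            switch touch.phase {
            case .began:
                let dot = UIView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
                dot.layer.cornerRadius = 22
                dot.backgroundColor = UIColor.white.withAlphaComponent(0.5)
                dot.isUserInteractionEnabled = false
                dot.center = touch.location(in: self)
                addSubview(dot)
                dots[touch] = dot
            case .moved:
                dots[touch]?.center = touch.location(in: self)
            case .ended, .cancelled:
                dots[touch]?.removeFromSuperview()
                dots[touch] = nil
            default:
                break
            }
        }
        super.sendEvent(event)  // then let the app handle the event normally
    }
}
```

Swap that in as your app’s main window and every touch gets a visible dot in the AirPlay recording. It shares TouchPosé’s anticipation problem, but it shows how little code the basic effect takes.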

Use TouchPosé to create a skeleton for the video, and animate your own finger

  • You could add an animation on top of the TouchPosé-style video to help the viewer anticipate. The circles that appear let you know when a tap happened, and give you timings for when you need to animate your pointer overlay into that position.
  • The animated pointer could be anything — matching the TouchPosé circle exactly, just a simple SimFinger-style dot, or any other element you could move around. It could even be a cut-out photograph of a hand with a pointing finger, and you could make it semi-transparent so it doesn’t obscure the rest of the screen.
  • Con: A lot of work. Probably requires learning After Effects or Motion if you’re not already a video guy like me.

Shoot with green screen on the device and record the app separately; sync them up in editing and composite

  • Looks great if you do it right. You can balance the colors and luminance between the real shoot and the screencast, and create something that, while it never really exists that perfectly in nature, makes for a beautiful video.
  • Cons: Toil. It’s a royal pain in the ass to pull off. You need to figure out where your taps will be, and record them so that they sync up with the video. Not too bad if you’re just cutting away for a “real world” demo moment or two, but if you’re doing a full-blown app demo, it can be deadly.
  • A green screen on the device is hard to balance when you’re shooting, to get a good, clean key. You’ve got to get it bright enough that you can pull a good key from the green, but not so bright that the green spills onto the demonstration finger and half of your finger disappears along with the green.
  • You might consider just tapping on a black screen, and rotoscoping your hand to create a matte, but that can take a long time and look unnatural if you’re not skilled at it.
  • We used both of these techniques when I worked on the videos for Briefs recently. I’ll talk about the process in more detail in my next post.

Shoot with an opaque green screen overlay on the device running the actual app

  • Peter Hosey and Mike Lee came up with this idea during the aforementioned ADN discussion. I haven’t tried it, but I think it’s pretty darn clever.
  • The real trick is, what would you make the overlay out of? It needs to be opaque enough to give you a nice clean green, but thin enough that you can transmit touches to the device through the overlay.
  • Jose Vazquez suggested including a touch-transmitting green UIView on top of the real views of the app when viewed on the device, but removing that green from the AirPlay output that you would record; then you wouldn’t need anything physical at all (see the sketch after this list).
  • If you’re using a physical green material, it also might need to be thin enough, if you’re filming the live action off-axis, to not show any significant thickness, or edge, that would need to be removed in compositing. However, I think Mike and Peter were thinking it would be shot dead-on, and the fingers simply composited over the screencast — not composited back into the live action, like we did for Briefs.
  • You would then record both the camera shot of the live fingers and the app’s screen video (via AirPlay) simultaneously, and composite them together. You could even make the fingers semi-transparent to show the full app screen while you demo the app.
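I haven’t tried building this either, but Jose’s software-only version is easy enough to sketch. All the names here are mine, and it leans on the standard external-screen trick of the era: a window assigned to the second UIScreen renders only on the AirPlay display, not on the device.

```swift
import UIKit

// Hypothetical sketch of the software-only variant: the device shows solid
// green over the app (for the camera to key against), while a second window
// on the external (AirPlay) screen shows the real, green-free content.

func installGreenOverlay(on deviceWindow: UIWindow) {
    let green = UIView(frame: deviceWindow.bounds)
    green.backgroundColor = .green
    green.isUserInteractionEnabled = false  // touches pass through to the app
    green.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    deviceWindow.addSubview(green)
}

func makeCleanExternalWindow(content: UIViewController) -> UIWindow? {
    guard UIScreen.screens.count > 1 else { return nil }  // AirPlay display connected?
    let externalScreen = UIScreen.screens[1]
    let window = UIWindow(frame: externalScreen.bounds)
    window.screen = externalScreen        // render on the AirPlay display only
    window.rootViewController = content
    window.isHidden = false
    return window
}
```

From there it’s the same workflow: key the camera footage against the green, then composite the fingers over the clean AirPlay recording.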

All interesting ideas, with varying degrees of preparation, difficulty, and results. What method will you use on your next app video? I look forward to hearing your thoughts.

Post Atomic Horror Unofficial Star Trek™ Episode Guide

Hey, did I mention that I launched an iPhone app in the App Store a couple of weeks ago? I don’t think I did.

The Post Atomic Horror Unofficial Episode Guide is a fun and humorous guide to all of the Star Trek™ adventures featuring the original crew. It helps you keep track of which episodes you’ve watched, and offers easy access to the comedy review podcast, the Post Atomic Horror.

You don’t have to be a Trekkie to love this app: Let my good friends, podcastronauts Ron “AAlgar” Watt and Matt Rowbotham, be your guides to the incredible world of Trek — and listen to them joke while they watch the bad episodes, so you don’t have to.

Buy it now on the App Store!

This app contains written content from the Post Atomic Horror’s first book, The Post Atomic Horror Unofficial Episode Guide, volume one. The book also has bonus content, and artwork by Ramon Villalobos. Check it out!

PAH Guide app screenshot

How I learned Objective-C and Cocoa, and developed an iPhone app

As I just posted to 43 Things [a site that is sadly now dead — Ed.], I finally shipped my first public iPhone app, so maybe it’s time to look back at this journey and see how it’s gone so far.

It seems like there’s always more and better to learn, but I learned enough to actually ship an iOS app for sale in the App Store, so I’m calling this a win.

How I did it:

I can see several keys to my eventual success:

  • Stick with it. Even though I was learning in spare hours here and there, continuing to chip away at it certainly helped.
  • Find a support group. The assistance and encouragement of my local NSCoderNight group was invaluable. People really do want to help those who are trying to help themselves.
  • Read a lot. I read numerous books and web articles about the things I wanted to learn. It didn’t always sink in the first time, but reading other books and articles on the same subject, with different wording and a different perspective, really helped it sink in with repetition and context.
  • Take time off. I found the real way to boost my learning was to take some time off from my job to concentrate on learning, and learning alone, for a week or two.
  • Write an app that you’re passionate about. Even when I didn’t feel I knew enough to write an app yet, the act of writing an app forced me to learn what I needed to do to get the app done, and really sped things along. The three weeks I took off work to code all day and put my first app out into the world was the best boost to my programming knowledge so far.

Lessons & tips:

I think I covered most of those above. Still, my biggest piece of advice is to figure out what app you’re passionate about building, and build that. You’re not going to be nearly as engaged in learning if you’re just building some boring tutorial app.

Resources:

  • NSCoderNight — Find a group near you, and go to the meetings. Usually these are informal gatherings of people actually coding, and asking the occasional question. My local is NSCoderNightDC.
  • If you want a more formal meeting with presentations and such, CocoaHeads or Xcoders may be your thing.
  • The book that helped me most in learning Objective-C is Programming in Objective-C by Stephen Kochan. It was the first (and, at the time, only) book to teach Objective-C from scratch, instead of assuming you were learning it as an add-on to existing C knowledge. Since then, Objective-C Programming: The Big Nerd Ranch Guide has come out, and I would also recommend it, despite a few moments where it assumes knowledge that you might not have.
  • The book I found most helpful for learning Cocoa was Cocoa Programming for Mac OS X, from Aaron Hillegass of the Big Nerd Ranch, but if you’re going to concentrate on apps for iPhone and iPad, you might want to swap that out for iOS Programming: The Big Nerd Ranch Guide.
  • If you’re curious, you can check out my first publicly released app, the Post Atomic Horror Unofficial Episode Guide for Star Trek.

Websites

Websites with articles and tutorials I found invaluable (plus a few newer ones):

Books

Some other books that I read over the course of my learning (updated, plus a few new ones):

Update 2015-08-12

Obviously, a lot has happened since I wrote this post — iOS is about to hit version 9, the new Swift programming language has been introduced, and I moved across the country to be the video producer at a Mac/iOS software company in Seattle.

Sadly, the new job has left little time for me to keep up with my own side projects, but I still keep a close eye on the iOS and Mac development world, since I deal with it every day at work.

I’ve updated the links above, and here are a few more resources, if you’re looking to get started in Swift. That said, Objective-C is still an important skill to learn, since Swift is designed to interoperate with it, and that means there’s going to be a lot of Objective-C out there for years to come.

Swift Resources