Looking for Work

For the first time in nearly seven years, I am now unemployed. Yesterday, along with several other people, I was laid off from my job at The Omni Group, and I’m now looking for new work. UPDATE: Here is a link to my resume PDF and my complete CV.

First of all, thank you to all of my friends and colleagues at Omni for being one of the best groups of people I’ve ever had the pleasure to work with. I’ll be forever grateful that taking a job at Omni is what got me out to the West Coast, and to the lovely city of Seattle. I learned a lot there, and they believed in my abilities enough to give me some truly interesting challenges, including some professional development that was rather tangential to my scope of video production, in the interest of seeing it enrich my work in unexpected ways.

I won’t go into what happened at Omni in much detail (if you’re interested, my fellow layoff-ee Brent Simmons has more to say about that on his blog). Suffice it to say that with a spouse who is an essential employee at a bakery, I never suspected that an economic cascade leading to this layoff would be the way that the COVID-19 pandemic affected our family. Omni are doing everything they can to make this necessity a little less harsh, and I’m certain they never would have laid people off if it weren’t truly necessary for the company’s survival. I wish them the best, and hope that they come out of this stronger than ever.

But still, I am out of work. Do you have work? Hire me!

People probably know me best for my video production work — please see the output of my last seven years in The Omni Group’s video archives — but I have also done a lot of related development work, and would love to push my career in that direction. Here’s a quick list of some relevant skills:

  • My biggest area of experience: video production, post, editing, and motion graphics. 24 years and counting.
  • I’ve produced two long-running podcasts, The Optical, which I also host, and The Omni Show.
  • For the last 13 or so years, I’ve been working on my programming skillset.
    • I write scripts to automate my video production workflows in Python and JavaScript/ExtendScript
    • I’ve built websites with a Django back end, and I have a working knowledge of JavaScript, HTML, CSS, and REST APIs
    • Just about a month ago, I took a Swift + iOS Development Bootcamp at Big Nerd Ranch, so I’m continuing to refresh my skills
    • I developed and published an app guide to Star Trek for the iPhone, for The Post Atomic Horror Podcast (no longer in the App Store)
    • I was a founding member of NSCoderNightDC, collaboratively learning Objective-C, Mac, and iOS development, and am a member of Xcoders in Seattle, keeping in touch with the local community of Mac and iOS developers
    • I’m currently in the process of learning Unity and C#, so that I can build a virtual map of Scarecrow Video, the largest publicly-accessible film archive in the world, where I volunteer on a regular basis. I suppose Unity is useful for other things too.
    • For my own themed tiki bar space, I’ve become very familiar with Raspberry Pi and Arduino programming, to control lights, smoke, and (in progress) animatronics for an immersive themed experience.
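
As a flavor of the kind of workflow scripting mentioned above, here is a minimal sketch in Python. This is not any real project's code; the filenames, folder layout, and encoding settings are all illustrative (it assumes `ffmpeg` is on your PATH if you actually run it):

```python
# Build (and optionally run) ffmpeg commands to transcode a folder of
# camera originals into H.264 review copies. Everything here is a toy
# example, not production code.
from pathlib import Path
import subprocess

def review_copy_command(source: Path, out_dir: Path) -> list:
    """Return the ffmpeg argument list for one source clip."""
    destination = out_dir / (source.stem + "_review.mp4")
    return [
        "ffmpeg",
        "-i", str(source),      # input clip
        "-c:v", "libx264",      # H.264 video
        "-crf", "23",           # constant-quality target
        "-c:a", "aac",          # AAC audio
        str(destination),
    ]

def transcode_folder(folder: Path, out_dir: Path, dry_run: bool = True) -> list:
    """Collect commands for every .mov in a folder; run them unless dry_run."""
    commands = [review_copy_command(clip, out_dir)
                for clip in sorted(folder.glob("*.mov"))]
    if not dry_run:
        for command in commands:
            subprocess.run(command, check=True)
    return commands

cmd = review_copy_command(Path("interview_take3.mov"), Path("review"))
print(" ".join(cmd))
```

Building the argument list separately from running it makes a script like this easy to dry-run and sanity-check before letting it loose on a folder full of footage.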

Get an HBO Now login, even if you first subscribed on Apple TV

Frustratingly, it seems that when you sign up for HBO Now on the Apple TV, the subscription just gets associated with your iTunes account, and that’s it. No email confirmation. No password to sign in to HBO Now on other iOS devices or on the web.

It appears the best answer is to sign up on your iOS device first, but if you’ve already signed up on the Apple TV, here’s the workaround:

  1. Log into your iTunes account on your Mac or PC.
  2. Go to Account > View Account. You will be asked for your iTunes password again.
  3. Scroll down to the Settings section. You will see Subscriptions: on the left; on the right side next to that, click Manage.
  4. If you have many subscriptions, you might need to scroll down. Next to HBO Now, select Edit.
  5. Under Automatic Renewal select Off and then click Done. You may need to confirm in another dialog box, because iTunes really wants you to be sure.
  6. Download the HBO Now app for your iOS device (iPhone, iPad, or iPod touch) that uses the same iTunes account for purchases as your Apple TV. At the launch screen, select Start Your Free Trial With iTunes, then register with an email address and password. It doesn’t seem to need to be the same email you use for iTunes; since the purchase goes through iTunes, it’s automatically linked to the same iTunes account.
  7. Repeat steps 1–4, but this time the Edit page will show Renewal Options instead of Automatic Renewal. Click the button to Subscribe, and it will ask you to confirm. Once you do, the Automatic Renewal option will reappear, with On already selected.
  8. Enjoy HBO Now on all of your devices! If you want to log in to hbonow.com, use the email and password that you entered in step 6. Assuming both your iOS device and your Apple TV use the same iTunes account, it does not seem necessary to log out of HBO Now on your Apple TV.

Some steps came via this Apple Support Communities post.

Xcoders Seattle: How to App Video

These are the “show notes” from my How to App Video talk at Seattle Xcoders, on May 21, 2014 at 7:00 pm.

Slides, AV Script Format, and Videos

  • My Xcoders talk was recorded and is now available (and embedded above)!
  • My Slide Deck — not sure how helpful this is without the talk, but there it is. These are actually the slides from the previous presentation of this talk at Renaissance 2014, but very little has changed, aside from the addition of MouseTools, and covering both Mac and iOS instead of just iOS.
  • AV Format Markdown CSS (on GitHub) to use in BBEdit, Marked, and Ulysses
  • If you’re printing out the script from BBEdit, and want the shading to print behind the video side of the script, check out the instructions in this post.
  • At the moment, BBEdit doesn’t obey the CSS page-break styles when printing (those styles are there to avoid weird widow/orphan issues). It seems to be an Apple Web View bug, and they are working on a solution for that.
  • Final OmniOutliner View Settings “Support Short” video from The Omni Group
  • Final video from MartianCraft
  • Final spec spot, and a post on how we did some of the special effects.

Tools

More Bits

Renaissance 2014 — How to App Video

These are the “show notes” from my half of the Master Video talk at Renaissance 2014, on January 21, 2014 at 10:30 am.

Slides, AV Script Format, and Videos

Tools

More Bits

Vertical Horizon

Ever since the iPhone first started shooting video, people have decried the use of the vertical orientation. Why would you do that? It looks so horrible! It’s unnatural! Hang on a moment while I pass judgement on you.

Stop it.

Let’s take a look at the history of film aspect ratios for a moment. Sure, the first film format was 4:3, just like our old TV sets — slightly wider than it is tall. In fact, TV cribbed the 4:3 ratio from film, and it wasn’t until TV started sucking away some of the film audience that the movies started to get wider and wider and wider.

The point being, aspect ratio is an artistic choice, and mostly a gimmick to get people back in theaters. None of those aspect ratios are “right” — not even 16:9, which was a compromise between many ratios for an acceptable film “fit” when TV stole widescreen back for itself (and pushed the movies into another 3D frenzy, which is a rant for another day). Even 9:16 (the iPhone’s vertical video ratio) is just another choice in a long line of choices.

And why shouldn’t you shoot video vertically? Apple’s own ads show people chatting on FaceTime with the camera held vertically. Our faces are vertical. There are tall buildings, and kids coming down playground slides. I argue that, sometimes, it’s a really good fit.

Most of the arguments against vertical video seem to boil down to one of two things. One is some pseudo-scientific mumbo-jumbo about how our eyes are set horizontally in our heads, so our natural field of vision is wider than it is tall, and we should obey that constraint. (Yes, art is all about obeying natural constraints, and conforming to convention.) The other is an argument that the way that we share video now, via YouTube and AirPlaying to our Apple TVs, demands that wide ratio to fit the screen. This latter theory has some merit, but I would argue that the video sharing sites should accommodate multiple aspect ratios in the way they present the videos, instead of letterboxing things inside a widescreen frame. (At least Vimeo and Flickr seem to handle this properly.)

That leads us to Horizon, a new app that uses the accelerometer in the phone to detect the angle of the phone while it’s shooting video, and automatically crop the video to a level 16:9 horizontal image.

This is super clever, and certainly fills a need — sometimes you do want perfectly level horizontal video, damn the resolution (the crop in vertical mode has only 32% of the resolution of the full image). Probably most people just want a video that looks nice when they play it on their TV, or share it on YouTube. This will do that, and quite nicely.
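
That 32% figure is easy to verify. Assuming the phone records a 1080 × 1920 frame when held perfectly vertical, the widest level 16:9 crop Horizon can take is limited by the 1080-pixel width:

```python
# Vertical 1080p frame: 1080 wide, 1920 tall.
full_width, full_height = 1080, 1920

# The largest level 16:9 rectangle that fits is limited by the width.
crop_width = full_width
crop_height = crop_width * 9 / 16          # 607.5; rounded to 608 in practice

fraction = (crop_width * crop_height) / (full_width * full_height)
print(f"{fraction:.0%}")   # → 32%
```

Tilt the phone anywhere between vertical and horizontal and the usable crop only gets more complicated, but never better than the full horizontal frame.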

The more interesting thing to me is how it enables a unique interface for the zoom function on the phone. Now you can use the angle of the phone to control the crop, instead of clumsily sliding your finger on the screen while you’re trying to hold the phone still.

It’s all about using the camera in creative and unique ways — there is no one “right” way to shoot video. Just a plethora of interesting decisions.

Fingers, Virtual and Otherwise

Recently, a few developers were having a discussion on ADN about a promo video for a new iPad app. Uli Kusterer raised the point that the video was missing something: fingers. It was hard for him to tell what was a tap, and what was something that was merely happening as normal in the app. Since the app in question is a game with a lot of animation and things popping in and out, it’s easy to see how things might get confusing.

Fingers are a really important part of an iOS app demonstration video. There are definitely a few ways to make that happen (and there was some speculation about how it might be done better in that ADN conversation). I’ll get to some techniques you might use later, but first, perhaps we should talk about why it’s so important.

Why Fingers?

When you take a screencast of a Mac, unless you’re doing something really fancy, your cursor appears in the video. The cursor is also known in certain circles as the pointer because, well, it points to things. You can even wave it around and gesture and point to things that you’re not even going to click on. It’s the digital extension of that digit on your hand that you point to things with all day, every day. Always pointing at things, you are.

The cool thing about pointing and gesturing at stuff is that the viewer’s eyes find it really useful, so they can anticipate. The human eye and brain are a pretty good team when it comes to following the motion of an object, and when they have that, it’s a lot easier to follow your cursor from one action to the next than if your cursor, say, suddenly jumped from one side of the screen to the other, and you had to search around to figure out where it had gone.

However, when you take a straight screencast of an iPhone or iPad app, your finger doesn’t show up on the screen. You can’t gesture to things. The viewer’s eye can’t track where you’re going next, so it’s really easy to get lost and frustrated, trying to figure out what’s going on.

Giving Your App the Finger

Now, when I talk about fingers in a video, I don’t necessarily mean a literal human finger. It could be a simple circle representing a finger pad. The important thing is that there’s some way for your viewer to see where you’re going, and be able to follow along.

Let’s take a look at some options for getting that in there. I’ll try to go from least to most complicated, and give you an idea of the pros and cons of each technique. I don’t think any one technique will be right for every app, so it’s good to know your options.

Run your app in the simulator and use SimFinger

  • Maybe you think SimFinger is creaky and old, but you know what? It still does what it needs to do, and does it well.
  • SimFinger replaces your Mac’s cursor with a simulated finger pad that changes from convex to concave when you click, representing a tap with your virtual finger.
  • It also gives you iPhone and iPad frames to put around your simulator screen, if you’re into that kind of thing.
  • Con: It’s the simulator, so it may not look and act exactly the same as the device, or may not be viable if you have features that don’t work well in the simulator. It will also be hard to use this for games with crazy fast multi-tapping action. Or any multi-tapping action at all, really. Your mouse is really only equivalent to one finger, or maybe two if you use the pinch mirroring features in the simulator.

Run your app in the simulator and use ScreenFlow

  • You can also do this kind of cursor replacement with ScreenFlow, and maybe you’re already using ScreenFlow to record anyway. In fact there’s a setting for the mouse pointer called Circle – Light that looks remarkably like Ye Olde SimFinger.
  • You can adjust the transparency of the finger cursor, so that you can always see what’s underneath.
  • Con: Still the simulator.

Shoot a video of a human finger actually using the live app on a device.

  • You capture the video all in one go, so the shoot itself seems pretty easy.
  • Con: The shoot is actually terribly difficult because the color temperature in the environment is very hard to match to the color temperature of the display. And even if you have a fancy bank of LED lights where you can dial up different light temps, it’s still really difficult to make the live app on the device look decent because the contrast and brightness of your live scene is never going to match the screen very well.
  • Also, there’s a big LED light bank shining on your device, so: GLARE. No, you can’t keep the light off the screen. Your finger needs it.
  • Also also, the video you shoot of a screen is never going to look as clean and sharp as a raw screen recording.
  • Tip: You can maybe get away with it, if you want to go for the silhouette effect. Very dim lighting or just accent lights on the real world, and adjust your camera for best exposure of the screen.
  • Con: Your finger, and maybe the rest of your hand, obscures the screen, so it might still be hard to see what’s going on.

AirPlay to your Mac and record the screen with TouchPosé

  • TouchPosé is a set of classes you can add to your app to make semi-transparent circles appear when you touch the screen.
  • Con: It only registers touches, so I’m not sure I can recommend it. There is nothing for the eye to track from one tap to the next, so I don’t think it’s much of an improvement. Perhaps if you’re doing a live demo or a presentation somewhere on a projector, and you can explain to people what you’re doing in more detail, but in a fast-paced demo video, it’s easy to lose track of the little tap circles, especially if it’s a game or other app where random circles popping up could well be part of the app’s animation.
  • Caveat: I haven’t used this myself — only seen it in other folks’ videos. Perhaps there’s an option to customize it in such a way that would make it more clear?

Use TouchPosé to create a skeleton for the video, and animate your own finger.

  • You could add an animation on top of the TouchPosé style video, so that you can help the viewer anticipate. The circles that appear let you know when a tap happened, and give you timings for when you need to animate your pointer overlay into that position.
  • The animated pointer could be anything — matching the TouchPosé circle exactly, just a simple SimFinger-style dot, or any other element you could move around. It could even be a cut-out photograph of a hand with a pointing finger, and you could make it semi-transparent so it doesn’t obscure the rest of the screen.
  • Con: A lot of work. Probably requires learning After Effects or Motion if you’re not already a video guy like me.
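
If you do animate your own pointer, the heart of it is just eased interpolation between tap positions, timed so the overlay arrives at each spot just before the tap circle fires. A toy sketch of the math (the coordinates and timings here are made up; in After Effects or Motion you’d get this from keyframe easing rather than code):

```python
def smoothstep(t: float) -> float:
    """Ease-in/ease-out ramp: 0 → 1 with zero velocity at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def pointer_position(start, end, depart, arrive, now):
    """Pointer (x, y) at time `now`, gliding from `start` (leaving at
    time `depart`) to `end` (arriving at time `arrive`)."""
    t = smoothstep((now - depart) / (arrive - depart))
    return (start[0] + (end[0] - start[0]) * t,
            start[1] + (end[1] - start[1]) * t)

# Glide from one tap at (100, 200) to the next at (500, 400), leaving
# at 1.0 s and arriving at 2.0 s — just ahead of a tap at, say, 2.1 s.
print(pointer_position((100, 200), (500, 400), 1.0, 2.0, 1.5))  # → (300.0, 300.0)
```

The ease-in/ease-out motion is what lets the viewer’s eye lock on and anticipate, instead of the pointer teleporting from tap to tap.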

Shoot with green screen on the device and record the app separately; sync them up in editing and composite

  • Looks great if you do it right. You can balance the colors and luminance between the real shoot and the screencast, and create something that, while it never really exists that perfectly in nature, makes for a beautiful video.
  • Cons: Toil. It’s a royal pain in the ass to pull off. You need to figure out where your taps will be, and record them so that they sync up with the video. Not too bad if you’re just cutting away for a “real world” demo moment or two, but if you’re doing a full-blown app demo, it can be deadly.
  • A green screen on the device is hard to balance when you’re shooting, to get a good, clean key. You’ve got to get it bright enough that you can pull a good key from the green, but not so bright that the green spills onto the demonstration finger and half of your finger disappears along with the green.
  • You might consider just tapping on a black screen, and rotoscoping your hand to create a matte, but that can take a long time and look unnatural if you’re not skilled at it.
  • We did both of these techniques when I worked on the videos for Briefs recently. I’ll talk about the process in more detail in my next post.
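
For a sense of what “pulling a key” means at the pixel level, here’s a toy green-screen matte: a pixel goes transparent wherever green strongly dominates red and blue. Real keyers are vastly more sophisticated, and the strength value here is arbitrary, but it also shows why spill is a problem: a finger with green light bounced onto it ends up partially transparent.

```python
def green_key_alpha(r: float, g: float, b: float, strength: float = 2.0) -> float:
    """Toy chroma-key matte: 0.0 = fully keyed out (pure green screen),
    1.0 = fully opaque (foreground). Channels are 0.0–1.0."""
    dominance = g - max(r, b)          # how much green dominates
    alpha = 1.0 - strength * dominance # more dominance → more transparent
    return max(0.0, min(1.0, alpha))

print(green_key_alpha(0.1, 0.9, 0.1))  # bright green screen → 0.0
print(green_key_alpha(0.8, 0.6, 0.5))  # skin tone → 1.0
print(green_key_alpha(0.4, 0.7, 0.4))  # spill-contaminated finger → partial
```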

Shoot with an opaque green screen overlay on the device running the actual app

  • Peter Hosey and Mike Lee came up with this idea during the aforementioned ADN discussion. I haven’t tried it, but I think it’s pretty darn clever.
  • The real trick is, what would you make the overlay out of? It needs to be opaque enough to give you a nice clean green, but thin enough that you can transmit touches to the device through the overlay.
  • Jose Vazquez suggested including a touch-transmitting green UIView on top of the real views of the app, when viewed on the device, but removing that green for the AirPlay output that you would record, so then you wouldn’t need anything physical.
  • If you’re using a physical green material, it also might need to be thin enough, if you’re filming the live action off-axis, to not show any significant thickness, or edge, that would need to be removed in compositing. However, I think Mike and Peter were thinking it would be shot dead-on, and the fingers simply composited over the screencast — not composited back into the live action, like we did for Briefs.
  • You would then record both the camera shot of the live fingers and the app’s screen video (via AirPlay) simultaneously, and composite them together. You could even make the fingers semi-transparent to show the full app screen while you demo the app.
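
However you capture the two layers, the final composite is the standard “over” operation: each output pixel is the finger footage weighted by its matte, plus the screencast weighted by the remainder. Lowering the finger layer’s alpha is exactly what makes the hand semi-transparent. Per channel, with 0.0–1.0 values:

```python
def over(fg: float, bg: float, alpha: float) -> float:
    """Composite one channel of foreground over background."""
    return fg * alpha + bg * (1.0 - alpha)

# A mid-gray finger pixel at 60% opacity over a bright app pixel:
print(round(over(0.5, 1.0, 0.6), 3))  # → 0.7
```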

All interesting ideas, with varying degrees of preparation, difficulty, and results. What method will you use on your next app video? I look forward to hearing your thoughts.

Video and You

Several people have encouraged me recently, since I have feet in both the video production and Mac & iOS development camps, to blog about video production for developers.

On September 26, 2009, I gave a Blitz Talk at the C4[3] developer conference entitled Video and You, intended to give developers some quick guidelines and tips about developing promotional videos for their apps.

A lot has changed since then, but please enjoy this PDF version of the talk and slides as something small to tide you over while I work up some more modern musings on the subject.