smartfilming

Exploring the possibilities of video production with smartphones

#48 Is ProRes video recording coming to the next iPhone and is it a big deal? — 30. August 2021

ProRes logo and iPhone 12 Pro Max image: Apple.

One of the things that always surprised me about Apple’s mobile operating system iOS (and now also iPadOS) was the fact that it wasn’t able to work with Apple’s very own professional video codec ProRes. ProRes is a high-quality video codec that gives you a lot of flexibility for grading in post and is easy on the hardware while editing. Years ago I purchased the original Blackmagic Design Pocket Cinema Camera, which can record in ProRes, and I was really looking forward to having a very compact mobile video production combo with the BMPCC (which, unlike the later BMPCC 4K/6K, was actually pocketable) and an iPad running LumaFusion for editing. But no, iOS/iPadOS didn’t support ProRes on a system level, so LumaFusion couldn’t either. What a bummer.

Most of us will be familiar with video codecs like H.264 (AVC) and the more recent H.265 (HEVC), but while these have now become ubiquitous “all-in-one” codecs for capturing, editing and delivering video content, this wasn’t always so. Initially, H.264 was primarily meant to be a delivery codec for a finished edit. It was not supposed to be a common editing codec – and for good reason: The high compression rate required powerful hardware to decode the footage while editing. I can still remember how the legacy Final Cut Pro on my old Mac struggled with H.264 footage while having no problems with other, less compressed codecs. The huge advantage of H.264 as a capturing codec, however, is exactly that high compression, because it means you can record in high resolution and for a long time while still producing relatively small files – which was and still is crucial for mobile devices where storage is precious. ProRes is basically the opposite: You get huge file sizes for the same recording, but it’s less taxing on the editing hardware because it’s not as heavily compressed as H.264. From a quality standpoint, it captures more and better color information and is therefore more robust and flexible when you apply grading in post production.

Very recently, Mark Gurman published a Bloomberg article claiming (based on info from inside sources) that the next flagship iPhone will have the ability to capture video with the ProRes codec. This took me quite by surprise given the aforementioned fact that iOS/iPadOS doesn’t even “passively” support ProRes at this point. But if it turns out to be true, it’s quite a big deal – at least for a certain tribe among the mobile video creator crowd, namely the mobile filmmakers.

I’m not sure so-called “MoJos” (mobile journalists) producing short current news reports on smartphones would necessarily have to embrace ProRes as their new capture codec since their workflow usually involves a fast turn-around without spending significant time on extensive color grading, something that ProRes is made for. The lighter compression of ProRes might also not be such a big deal for them since recent iPhones and iPads can easily handle 4K multi-track editing of H.264/H.265 encoded footage. On the other hand, the downside of ProRes, very big file sizes, might actually play a role for MoJos since iPhones don’t support the use of SD cards as exchangeable and cheap external storage. Mobile filmmakers however might see this as a game-changer for their line of work, as they usually offload and back-up their dailies externally before going back on set and also spend a significant amount of time in post with grading later on.

Sure, if you are currently shooting with an app like Filmic Pro and use their “Filmic Extreme” bitrate, ProRes bitrates might not even shock you that much, but the difference to standard mobile video bitrates is quite extreme nonetheless. To be more precise, the ProRes codec is not a single standard but comes in different flavors (with increasing bitrate): ProRes Proxy, ProRes LT, ProRes 422 (the “422” indicates its chroma subsampling), ProRes 422 HQ, ProRes 4444, ProRes 4444 XQ. ProRes 422 can probably be regarded as the “standard” ProRes. If we look at target bitrates for 1080p FHD in this case, it’s 122 Mbit/s for 25fps and 245 Mbit/s for 50fps. Moving on to UHD/4K, things really get enormous with 492 Mbit/s for 25fps and 983 Mbit/s for 50fps. A 1-minute clip of ProRes 422 UHD 25fps footage would be 3.69 GB; a 1-minute clip of ProRes 422 UHD 50fps footage would be 7.37 GB. It’s easy to see why limited internal storage can quickly become a problem here if you shoot lots of video. So I personally would definitely consider it a great option to have but not exactly a must for every job and situation. Of course I would expect ProRes also to be supported for editing within the system from then on. For more info on the ProRes codec and its bitrates, check here.
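If you want to play with these numbers yourself, the arithmetic is simple enough to sketch in a few lines of Python (the target bitrates are Apple's published ProRes 422 figures as quoted above; the helper names are my own):

```python
# Rough ProRes 422 file-size estimator based on Apple's published
# target bitrates in Mbit/s for each resolution/frame-rate combo.
PRORES_422_TARGET_MBPS = {
    ("1080p", 25): 122,
    ("1080p", 50): 245,
    ("UHD", 25): 492,
    ("UHD", 50): 983,
}

def clip_size_gb(resolution: str, fps: int, seconds: float) -> float:
    """Approximate file size in GB (decimal) for a ProRes 422 clip."""
    mbps = PRORES_422_TARGET_MBPS[(resolution, fps)]
    megabits = mbps * seconds
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

print(round(clip_size_gb("UHD", 25, 60), 2))  # 3.69 (GB per minute)
print(round(clip_size_gb("UHD", 50, 60), 2))  # 7.37 (GB per minute)
```

Scale that up to a half-day shoot and it's obvious why fixed internal storage becomes the bottleneck.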

At this point the whole thing has however NOT been officially confirmed by Apple and remains (informed) speculation, and until recently I would have heavily doubted the probability of this actually happening. But the fact that Apple, totally out of the blue, introduced the option to record with a PAL frame rate in the native camera app earlier this year – something that by and large only video pros really care about – gives me confidence that Apple might actually pull this off for real, maybe in the hope of luring in well-known filmmakers who boost the iPhone’s reputation as a serious filmmaking tool. What do you guys think? Will it really happen and would it be a big deal for you?

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#46 Top tips for smartphone videography in the summer — 28. June 2021

Photo: Julia Volk via Pexels.com

It’s the dog days of summer again – well at least if you live in the northern hemisphere or near the equator. While many people will be happy to finally escape the long lockdown winter and are looking forward to meeting friends and family outside, intense sunlight and heat can also put extra stress on the body – and it makes for some obvious and less obvious challenges when doing videography. Here are some tips/ideas to tackle those challenges.

Icon: Alexandr Razdolyanskiy via The Noun Project

Find a good time/spot!
Generally, some of the problems mentioned later on can be avoided by picking the right spot and/or time for an outdoor shoot during the summertime. Maybe don’t set up your shot in the middle of a big open field where you and your phone are totally exposed to the full load of sunshine photons at high noon. Rather, try to shoot in the morning, late afternoon or early evening and also think about picking a spot in the shadows. Or choose a time when it’s slightly overcast. Of course it’s not always possible to freely choose time and spot, sometimes you just have to work in difficult conditions.

“Bum to the sun” – yes or no?
There’s a saying that you should turn your “bum to the sun” when shooting video. This definitely holds some truth, as pointing the lens directly towards the sun can cause multiple problems, including unwanted lens flare, underexposed faces or a blown-out background. You can however also create artistically interesting shots that way (silhouettes for instance), and the “bum to the sun” motto comes with problems of its own: If you are shooting away from the sun but the person you are filming is looking directly towards it, they could be blinded by the intense sunlight and squint their eyes, which doesn’t look very favorable. If the sun is low, you also might have your own shadow in the shot. So I think the saying is something to take into consideration but shouldn’t be adhered to exclusively and in every situation.

Check the sky!
Clouds can severely impact the amount of sunlight that reaches the ground. So if you have set up an interview or longer shot and locked the exposure at a given time when there isn’t a single cloud in front of the sun, there might be a nearby one crawling along already that will take away lots of light later on and give you an underexposed image at some point. Or vice versa. So either do your thing when there are no (fast moving) clouds in the vicinity of the sun or when the cloud cover will be fairly constant for the next minutes.

Use an ND filter!
As I pointed out in my last blog post The Smartphone Camera Exposure Paradox, a bright sunny day can create exposure problems with a smartphone if you want to work with the “recommended” (double the frame rate, for instance 1/50s at 25fps) or an otherwise acceptable shutter speed, because phones only have a fixed, wide-open aperture. Even with the lowest ISO setting, you will still have to use a (very) fast shutter speed that can make motion appear jerky. That’s why it’s good to have a neutral density (ND) filter in your kit, which reduces the amount of light that hits the sensor. There are two kinds of ND filters: fixed and variable. The latter lets you adjust the strength of the filtering effect. Unlike dedicated regular cameras, the lenses on smartphones don’t have a filter thread, so you either have to use some sort of case or rig with a filter thread or a clip-on ND filter.

Shoot LOG! (Well, maybe…)
Some 3rd party video recording apps and even a few native camera apps allow you to shoot with a LOG picture profile. A log profile distributes exposure and color along a logarithmic rather than a linear curve, compared to a “normal” non-log picture profile. By doing this you basically gain a bit more dynamic range (the range spanning between the brightest and darkest areas of an image), which can be very useful in high-contrast scenarios like a sunny day with extreme highlights and shadows. It also gives you more flexibility for grading in post to achieve the look you want. This however also comes with some extra work, as pure log footage can look rather dull/flat and might need grading to look “pretty” as a final result. It is possible though to apply so-called LUTs (simply put: a pre-defined set of grading parameters) to log footage to reduce or avoid time spent on manual grading.
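To get an intuition for what a log curve actually does, here is a toy example in Python. This is a generic logarithmic mapping of my own, not any camera maker's actual transfer function – real log profiles use carefully tuned constants:

```python
import math

def simple_log_encode(linear: float, a: float = 50.0) -> float:
    """Map a linear scene value in [0, 1] to a [0, 1] code value with a
    logarithmic curve. Purely illustrative; the constant 'a' controls
    how aggressively shadows are lifted."""
    return math.log(1 + a * linear) / math.log(1 + a)

# The log curve spends far more of the available code range on the
# shadows than a linear mapping would:
for v in (0.01, 0.1, 0.5, 1.0):
    print(f"linear {v:.2f} -> log {simple_log_encode(v):.2f}")
```

Note how a scene value of 0.1 already lands near the middle of the code range – that reallocation of precision toward the shadows is why log footage looks flat straight out of the camera but grades so flexibly.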

Get a white case!


Ever heard of the term “albedo”? It designates the amount of sunlight (or, if you want to be more precise, solar radiation) that is reflected by objects. Black objects reflect less and absorb more solar radiation (smaller albedo) than white objects (higher albedo). You can easily get a feeling for the difference by wearing a black or a white shirt on a sunny day. Similarly, if you expose a black or dark colored phone to intense sunlight, it will absorb more heat than a white or light colored phone and therefore be more prone to overheating. So if you do have a black or dark colored phone, it might be a good idea to get yourself a white case so more sunlight is reflected off of the device. Vice versa, if you have a white or light colored phone with a black case, take it off. Be aware though that a white case only reduces the absorption of “external heat” from solar radiation, not internal heat generated by the phone itself, something that particularly happens when you shoot in 4K/UHD, high frame rates or high bit rates. You should also take into consideration that a case that fits super tight might reduce the phone’s ability to dissipate internal heat. Ergo: A white phone (case) only offers some protection against the impact of direct solar radiation, not against internal heat produced by the phone itself or high ambient temperatures.

Maximize screen brightness!
This is pretty obvious. Of course bright conditions make it harder to see the screen and judge framing, exposure and focus so it’s good to crank up the screen brightness. Some camera apps let you switch on a feature that automatically maximizes screen brightness when using the app.

Get a power bank!
Maximizing screen brightness will significantly increase battery consumption though, so you should think about having a back-up power bank at hand – at least if you are going on a longer shoot. But most of us already have one or two, so this might not even be an additional purchase.

Use exposure/focus assistants of your camera app!
Analytical assistant tools in certain camera apps can be very helpful in bright conditions when it’s hard to see the screen. While very few native camera apps offer some limited assistance in this respect, it’s an area where dedicated 3rd party apps like Filmic Pro, mcpro24fps, ProTake, MoviePro, Mavis etc. can really shine (pardon the pun). For setting the correct exposure you can use Zebra (displays stripes on overexposed areas of the frame) or False Color (renders the image into solid colors identifying areas of under- and overexposure – usually blue for underexposure and red for overexposure). For setting the correct focus you can use Peaking (displays a colored outline on things in focus) and Magnification (digitally magnifies the image). Not all of the mentioned apps offer all of these tools. And there’s also a downside: Using these tools puts extra stress on your phone’s chipset, which also means more internal heat – so only use them when setting exposure and focus for a shot and turn them off once you are done.

Photo: Moondog Labs

Use a sun hood!
Another way to better see the screen in sunny weather is to use a sun hood. There are multiple generic smartphone sun hoods available online but also one from dedicated mobile camera gear company MoondogLabs. Watch out: SmallRig, a somewhat renowned accessory provider for independent videography and filmmaking has a sun hood for smartphones in its portfolio but it’s made for using smartphones as a secondary device with regular cameras or drones so there’s no cut-out for the lens or open back which renders it useless if you want to shoot with your phone. This cautionary advice also applies to other sun hoods for smartphones.

Photo: RollCallGames

Sweaty fingers?
An issue I encountered last summer while doing a bike tour, where I occasionally would stop to take some shots of interesting scenery along the road, was that sweaty hands/fingers can cause problems with a phone’s touch screen. Touches aren’t registered, or are registered in the wrong places. This can be quite annoying. Turns out there’s such a thing as “anti-sweat finger sleeves”, which were apparently invented for passionate mobile gamers. So I guess kudos to PUBG and Fortnite aficionados? There’s also another option: You can use a stylus or pen to navigate the touch screen. Users of the Samsung Galaxy Note series are clearly at an advantage here, as the stylus comes with the phone.

Photo: George Becker via Pexels.com

Don’t forget the water bottle!
Am I going to tell you to cool your phone with a refreshing shower of bottled drinking water? Despite the fact that many phones nowadays offer some level of water-resistance, the answer is no. I’m including this tip for two reasons: First, it’s always good to stay hydrated if you’re out in the sun – I have had numerous situations where I packed my gear bag with all kinds of stuff (most of which I didn’t need in the end) but forgot to include a bottle of water (which I desperately needed at some point). Secondly, you can use a water bottle as an emergency tripod in combination with a rubber band or hair tie as shown in workshops by Marc Settle and Bernhard Lill. So yes, don’t forget to bring a water bottle!

Got other tips for smartphone videography in the summertime? Let us know!

#45 The Smartphone Camera Exposure Paradox — 11. May 2021

Ask anyone about the weaknesses of smartphone cameras and you will surely find that people often point towards a phone’s low-light capabilities as its Achilles heel, or at least one of them. When you are outside during the day, it’s relatively easy to shoot some good-looking footage with your mobile device, even with budget phones. Once it’s darker or you’re indoors, things get more difficult. The reason for this is essentially that the image sensors in smartphones are still pretty small compared to those in DSLMs/DSLRs or professional video/cinema cameras. Bigger sensors can collect more photons (light) and produce better low-light images. A so-called “Full Frame” sensor in a DSLM like Sony’s Alpha 7-series has a surface area of 864 mm², a common 1/2.5” smartphone image sensor only 25 mm². So why not just put a huge sensor in a smartphone? While cameras in smartphones have undeniably become a very important factor, the phone is still very much a multi-purpose device and not a single-purpose one like a dedicated camera – for better or worse. That means there are many things to consider when building a phone. I doubt anyone would want a phone with a form factor that no longer fits in a pocket. And the flat form factor makes it difficult to build proper optics with larger sensors. Larger sensors also consume more power and produce more heat, not exactly something desirable.

If we are talking about smartphone photography from a tripod, some of the missing sensor size can be compensated for with long exposure times. The advancements in computational imaging and AI have also led to dedicated and often quite impressive photography “Night Modes” on smartphones. But very long shutter speeds aren’t really an option for video, as any movement appears extremely blurred – and while today’s chipsets can already handle supportive AI processing for photography, the more resource-intensive processing for videography is yet a bridge too far.
So despite the fact that latest developments signal that we’re about to experience a considerable bump in smartphone image sensor sizes (Sony and Samsung are about to release a 1-inch/almost 1-inch image sensor for phones), one could say that most/all smartphone cameras (still) have a problem with low-light conditions. But you know what? They also have a problem with the exact opposite – very bright conditions!
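Just to put the sensor areas quoted above into perspective, here is a rough back-of-the-envelope comparison in Python – strictly an all-else-equal calculation, which real cameras never quite are (pixel design, optics and processing all matter too):

```python
import math

# Light-gathering comparison from the sensor areas quoted above.
full_frame_mm2 = 864   # e.g. Sony Alpha 7-series full-frame sensor
phone_mm2 = 25         # common 1/2.5" smartphone image sensor

area_ratio = full_frame_mm2 / phone_mm2
stops = math.log2(area_ratio)  # each stop = doubling of collected light

print(f"{area_ratio:.1f}x the area, ~{stops:.1f} stops more light")
```

Roughly 35 times the area, or about five stops of light-gathering advantage – which is why computational tricks and longer exposures have to work so hard on a phone.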

If you know a little bit about how cameras work and how to set exposure manually, you have probably come across something called the “exposure triangle”. The exposure triangle contains the three basic parameters that let you set and adjust the exposure of a photo or video on a regular camera: shutter speed, aperture and ISO. In more general terms you could also say: time, size and sensitivity. Shutter speed signifies the amount of time that the still image or a single frame of video is exposed to light, for instance 1/50 of a second. The longer the shutter speed, the more light hits the sensor and the brighter the image will be. Aperture refers to the size of the iris opening through which the light passes before it hits the sensor (or, back in the day, the film strip); it’s commonly measured in f-stops, for instance f/2.0. The bigger the aperture (= the SMALLER the f-stop number), the more light reaches the sensor and the brighter the image will be. ISO (or “Gain” on some dedicated video cameras) finally refers to the sensitivity of the image sensor, for instance ISO 400. The higher the ISO, the brighter the image will be. Most of the time you want to keep the ISO as low as possible because higher sensitivity introduces more image noise.
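The give-and-take between the three parameters can be expressed in stops. Here is a small Python sketch of that relationship; the reference setting (1/50s, f/2.0, ISO 100) is just an arbitrary baseline I picked for illustration:

```python
import math

def exposure_stops(shutter_s: float, f_number: float, iso: int,
                   ref=(1 / 50, 2.0, 100)) -> float:
    """Exposure difference in stops relative to a reference setting.
    +1 stop means twice as bright, -1 stop means half as bright."""
    ref_shutter, ref_f, ref_iso = ref
    stops = math.log2(shutter_s / ref_shutter)   # longer shutter -> brighter
    stops += 2 * math.log2(ref_f / f_number)     # wider aperture -> brighter
    stops += math.log2(iso / ref_iso)            # higher ISO -> brighter
    return stops

# Halving the shutter speed costs one stop; doubling the ISO wins it back:
print(exposure_stops(1 / 100, 2.0, 200))  # 0.0 (the changes cancel out)
```

The aperture term is doubled because f-stops scale with the diameter of the opening while the collected light scales with its area.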

So what exactly is the problem with smartphone cameras in this respect? Well, unlike dedicated cameras, smartphones don’t have a variable aperture; it’s fixed and can’t be adjusted. OK, there actually have been a few phones with variable aperture: most notably, Samsung had one on the S4 Zoom (2013) and K Zoom (2014), introduced a dual-aperture approach with the S9/Note9 (2018), held on to it for the S10/Note10 (2019) but dropped it again for the S20/Note20 (2020). But as you can see from the very limited selection, this has been more of an experiment. The fixed aperture means that the exposure triangle for smartphone cameras only has two adjustable parameters: shutter speed and ISO.

Why is this problematic? When there’s movement in a video (either because something moves within the frame or the camera itself moves), we as an audience have become accustomed to a certain degree of motion blur, which is related to the shutter speed used. The rule of thumb applied here says: double the frame rate. So if you are shooting at 24fps, use a shutter speed of 1/48s; if you are shooting at 25fps, use 1/50s; 1/60s for 30fps etc. This suggestion is not set in stone and in my humble opinion you can deviate from it to a certain degree without it becoming too obvious for casual, non-pixel-peeping viewers – but if the shutter speed is very slow, everything begins to look like a drug-induced stream-of-consciousness experience, and if it’s very fast, things appear jerky and shutter speed becomes stutter speed. So with the aperture being fixed and the shutter speed set at a “recommended” value, you’re left with ISO as an adjustable exposure parameter. Reducing the sensitivity of the sensor is usually only technically possible down to an ISO between 50 and 100, which will still give you a (heavily) overexposed image on a sunny day outside.
So here’s our “paradox”: Too much available light can be just as much of an issue as too little when shooting with a smartphone.

What can we do about the two problems? Until significantly bigger smartphone image sensors or computational image enhancement for video arrive, the best way to tackle the low-light challenge is to provide your own additional lighting or look for more available light, be it natural or artificial. Depending on your situation, this might be relatively easy or downright impossible. If you are trying to capture an unlit building at night, you will most likely not have a sufficient number of ultra-bright floodlights at hand. If you are interviewing someone in a dimly lit room, a small LED might provide just enough light to keep the ISO at a level without too much image noise.

Clip-on variable ND filter

As for the too-much-light problem (which ironically gets even worse with bigger sensors setting out to remedy the low-light problems): Try to pick a less sun-drenched spot, shoot with a faster shutter speed if there is no or little action in the shot or – and this might be the most flexible solution – get yourself an ND (neutral density) filter that reduces the amount of light that passes through the lens. While some regular cameras have built-in ND filters, this feature has yet to appear in any smartphone, although OnePlus showcased a prototype phone last year that had something close to a proper ND filter, using a technology called “electrochromic glass” to hide the lens while still letting (less) light pass through (check out this XDA Developers article). So until this actually makes it to the market and proves to be effective, the filter has to be an external one that is either clipped on or screwed on if you use a dedicated case with a corresponding filter thread. You also have the choice between a variable and a non-variable (fixed density) ND filter. A variable ND filter lets you adjust the strength of its filtering effect, which is great for flexibility, but it also has some disadvantages like the possibility of cross-polarization. If you want to learn more about ND filters, I highly recommend checking out this superb in-depth article by Richard Lackey.
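Working out how strong an ND filter you need is a quick stops calculation. A sketch in Python – the example shutter speeds are hypothetical, just to illustrate the method:

```python
import math

def nd_stops_needed(metered_shutter: float, target_shutter: float) -> float:
    """How many stops of ND are needed to move from the shutter speed
    the light forces on you to the one you want, with aperture and ISO
    held constant. Each stop halves the light reaching the sensor."""
    return math.log2(target_shutter / metered_shutter)

# Say bright sun forces 1/1600s at base ISO, but we want 1/50s for 25fps:
stops = nd_stops_needed(1 / 1600, 1 / 50)
print(f"{stops:.0f} stops -> ND{2 ** round(stops)}")  # 5 stops -> ND32
```

ND filters are commonly labeled by their light-reduction factor (ND8 = 3 stops, ND32 = 5 stops, ND64 = 6 stops), so converting between stops and filter strength is just a power of two.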

So what’s the bigger issue for you personally? Low-light or high-light? 

#44 Split channels (dual mono) audio from the Rode Wireless Go II in LumaFusion — 4. May 2021

Rode just recently released the Wireless GO II, a very compact wireless audio system I wrote about in my last article. One of its cool features is that you can feed two transmitters into one receiver so you don’t need two audio inputs on your camera or smartphone to work with two external mic sources simultaneously. What’s even cooler is that you can record the two mics into separate channels of a video file with split track dual mono audio so you are able to access and mix them individually later on which can be very helpful if you need to make some volume adjustments or eliminate unwanted noise from one mic that would otherwise just be “baked in” with a merged track. There’s also the option to record a -12dB safety track into the second channel when you are using the GO II’s “merged mode” instead of the “split mode” – this can be a lifesaver when the audio of the original track clips because of loud input.
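For reference, the -12dB figure translates to linear amplitude like this – a generic dB conversion, nothing Rode-specific:

```python
def db_to_amplitude(db: float) -> float:
    """Convert a dB offset to a linear amplitude factor.
    0 dB = unchanged, -6 dB ~ half, -12 dB ~ a quarter."""
    return 10 ** (db / 20)

# The -12dB safety channel records at roughly a quarter of the
# amplitude, leaving plenty of headroom if the main channel clips:
print(round(db_to_amplitude(-12), 3))  # 0.251
```

In post you simply boost the safety channel by +12dB wherever the main channel clipped, trading a little extra noise for an intact waveform.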

If you use a regular camera like a DSLM, it’s basically a given that you can record in split track dual mono and it also isn’t rocket science to access the two individual channels on a lot of desktop editing software. If you are using the GO II with a smartphone and even want to finish the edit on mobile afterwards, it’s a bit more complicated.

First off, if you want to make use of split channels or the safety channel, you need to be able to record a video file with dual track audio, because only then do you have two channels at your disposal, two channels that are either used for mic 1 and mic 2 or mic 1+2 combined and the safety channel in the case of the Wireless Go II. Most smartphones and camera apps nowadays do support this though (if they support external mics in general). The next hurdle is that you need to use the digital input port of your phone, USB-C on an Android device or the Lightning port on an iPhone/iPad. If you use the 3.5mm headphone jack (or an adapter like the 3.5mm to Lightning with iOS devices), the input will either create single channel mono audio or send the same pre-mixed signal to both stereo channels. So you will need a USB-C to USB-C cable for Android devices (Rode is selling the SC-16 but I also made it work with another cable) and a USB-C to Lightning cable for iOS devices (here the Rode SC-15 seems to be the only compatible option) to connect the RX unit of the GO II to the mobile device. Unfortunately, such cables are not included with the GO II but have to be purchased separately. A quick note: Depending on what app you are using, you either need to explicitly choose an external mic as the audio input in the app’s settings or it just automatically detects the external mic.
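If you ever need to split the channels outside of a mobile app, the operation itself is trivial. Here's a sketch using only Python's standard library, assuming you have already extracted the audio from the video file as a 16-bit stereo WAV (the file names are placeholders):

```python
import wave

def split_stereo_wav(path: str, left_out: str, right_out: str) -> None:
    """Split a 16-bit stereo (dual-mono) WAV into two mono files,
    e.g. mic 1 and mic 2 from a Wireless GO II 'split mode' recording."""
    with wave.open(path, "rb") as src:
        assert src.getnchannels() == 2 and src.getsampwidth() == 2
        frames = src.readframes(src.getnframes())
        framerate = src.getframerate()

    # Interleaved 16-bit frames: [L0 L0 R0 R0 L1 L1 R1 R1 ...]
    left, right = bytearray(), bytearray()
    for i in range(0, len(frames), 4):
        left += frames[i:i + 2]
        right += frames[i + 2:i + 4]

    for out_path, data in ((left_out, left), (right_out, right)):
        with wave.open(out_path, "wb") as dst:
            dst.setnchannels(1)
            dst.setsampwidth(2)
            dst.setframerate(framerate)
            dst.writeframes(bytes(data))
```

A desktop NLE does the same de-interleaving under the hood when you choose "fill from left/right" style channel mappings.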

Once you have recorded a dual mono video file including separate channels and want to access them individually for adjustments, you also need the right editing software that allows you to do that. On desktop, it’s relatively easy with the common prosumer or pro video editing software (I personally use Final Cut Pro) but on mobile devices there’s currently only a single option: LumaFusion, so far only available for iPhone/iPad. I briefly thought that KineMaster (which is available for both Android and iOS) can do it as well because it has a panning feature for audio but it’s not implemented in a way that it can actually do what we need it to do in this scenario.

So how do you access the different channels in LumaFusion? It’s actually quite simple: You either double-tap your video clip in the timeline or tap the pen icon in the bottom toolbar while having the clip selected. Select the “Audio” tab (speaker icon) and find the “Configuration” option on the right. In the “Channels” section select either “Fill From Left” or “Fill From Right” to switch between the channels. If you need to use both channels at the same time and adjust/balance the mix you will have to detach the audio from the video clip (either triple-tap the clip or tap on the rectangular icon with an audio waveform), then duplicate the audio (rectangular icon with a +) and then set the channel configuration of one to “Fill From Left” and for the other to “Fill From Right”.

Here’s hoping that more video editing apps implement the ability to access individual audio tracks of a video file and that LumaFusion eventually makes it to Android.

#43 The Rode Wireless Go II review – Essential audio gear for everyone? — 20. April 2021

Australian microphone maker RØDE is an interesting company. For a long time, the main thing they had going for them was that they would provide an almost-as-good but relatively low-cost alternative to high-end brands like Sennheiser or AKG and their established microphones, thereby “democratizing” decent audio gear for the masses. Over the last years however, Rode grew from “mimicking” products of other companies to a highly innovative force, creating original products which others now mimicked in return. Rode was first to come out with a dedicated quality smartphone lavalier microphone (smartLav+) for instance and in 2019, the Wireless GO established another new microphone category: the ultra-compact wireless system with an inbuilt mic on the TX unit. It worked right out of the box with DSLMs/DSLRs, via a TRS-to-TRRS or USB-C cable with smartphones and via a 3.5mm-to-XLR adapter with pro camcorders. The Wireless GO became an instant runaway success and there’s much to love about it – seemingly small details like the clamp that doubles as a cold shoe mount are plain ingenuity. The Interview GO accessory even turns it into a super light-weight handheld reporter mic and you are also able to use it like a more traditional wireless system with a lavalier mic that plugs into the 3.5mm jack of the transmitter. But it wasn’t perfect (how could it be as a first generation product?). The flimsy attachable wind-screen became sort of a running joke among GO users (I had my fair share of trouble with it) and many envied the ability of the similar Saramonic Blink 500 series (B2, B4, B6) to have two transmitters go into a single receiver – albeit without the ability for split channels. Personally, I also had occasional problems with interference when using it with an XLR adapter on bigger cameras and a Zoom H5 audio recorder.

Now Rode has launched a successor, the Wireless GO II. Is it the perfect compact wireless system this time around?

The most obvious new thing about the GO II is that the kit comes with two TX units instead of just one – can you already tell where we’re headed with this? Let’s talk about it in a second. A first look at the Wireless GO II’s RX and TX units doesn’t really reveal anything new – apart from the fact that they are labelled “Wireless GO II”, the form factor of the little black square boxes is exactly the same. That’s both good and maybe partly bad, I guess. Good because, just like the original Wireless GO, it’s a very compact system; “partly bad” because I suppose some would have loved to see the TX unit become even smaller for standalone use as a clip-on with the internal mic rather than with an additional lavalier. But I suppose packing a mic and a transmitter into a single piece requires a certain size at this point in time. The internal mic also pretty much seems to be the same, which isn’t a bad thing per se – it’s quite good! I wasn’t able to make out a noticeable difference in my tests so far, but maybe the improvements are too subtle for me to notice – I’m not an audio guy. Oh wait, there is one new thing on the outside: a new twist mechanism for the windscreen – and this approach actually works really well and keeps the windscreen in place, even if you pull on it. For those of us who use it outdoors, this is a big relief.

But let’s talk about the new stuff “under the hood”, and let me tell you, there’s plenty! First of all, as hinted at before, you can now feed two transmitters into one receiver. This is perfect if you need to mic up two people for an interview. With the original Wireless GO, you had to use two receivers and an adapter cable to make it work with a single audio input.

It’s even better that you can choose between a “merged mode” and a “split mode”. “Merged mode” combines both TX sources into a single pre-mixed audio stream; “split mode” sends the two inputs into separate channels (left and right of a stereo file, so basically dual mono). “Split mode” is very useful because it allows you to access and adjust both channels individually afterwards – this comes in handy, for instance, if you have a two-person interview and one person coughs while the other is talking. If the two sources are pre-mixed (“merged mode”) into the same channel, you can’t eliminate the cough without also affecting the voice of the person talking. With the two sources in separate channels, you can simply mute the noisy channel for that moment in post. You switch between the two modes by pressing the dB button and the pairing button on the RX unit at the same time.
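To illustrate what “split mode” gives you in post, here’s a minimal Python sketch (my own illustration using only the standard library – nothing Rode-specific) that separates a 16-bit dual-mono WAV recording into two mono files, one per TX unit, so each speaker’s channel can be cleaned up independently:

```python
import struct
import wave

def split_dual_mono(src_path, left_path, right_path):
    """Split a 16-bit stereo ("split mode") WAV into two mono WAV
    files, one per TX unit, for independent editing."""
    with wave.open(src_path, "rb") as src:
        if src.getnchannels() != 2 or src.getsampwidth() != 2:
            raise ValueError("expected 16-bit stereo input")
        rate = src.getframerate()
        frames = src.readframes(src.getnframes())

    # Interleaved frames: L0 R0 L1 R1 ... as little-endian 16-bit samples
    samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
    channels = {left_path: samples[0::2], right_path: samples[1::2]}

    for path, data in channels.items():
        with wave.open(path, "wb") as out:
            out.setnchannels(1)
            out.setsampwidth(2)
            out.setframerate(rate)
            out.writeframes(struct.pack("<%dh" % len(data), *data))
```

A desktop NLE does the same thing for you when you “break apart” a stereo clip into dual mono – the point is simply that the two voices live in different channels of the same file.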

One thing you should be aware of when recording in split mode into a smartphone: this only works via the phone’s digital input port (USB-C on Android, Lightning on iPhone/iPad). If you use a TRS-to-TRRS cable and feed it into the 3.5mm headphone jack (or a 3.5mm adapter, like the one for the iPhone), the signal gets merged, as there is just one contact left on the plug for mic input – only allowing mono. If you want to use the GO II’s split channels feature with an iPhone, there’s currently only one reliable solution: Rode’s SC15 USB-C to Lightning cable, which unfortunately is a separate purchase (around 25 Euros). With Android it’s less restrictive. You can purchase the equivalent SC16 USB-C to USB-C cable from Rode (around 15 Euros), but I tested it with a more generic USB-C to USB-C cable (included with my Samsung T5 SSD drive) and it worked just fine. So if you happen to have a USB-C to USB-C cable around, try that first before buying something new. You should also consider that you need a video editing app that lets you access both channels separately if you want to adjust them individually. On desktop there are lots of options, but on mobile devices the only option is currently LumaFusion (I’m planning a dedicated blog post about this).

If you don’t need the extra functionality of “split mode” or the safety channel and are happy to use the system with your device’s 3.5mm port (or a corresponding adapter), be aware that you will still need a TRS-to-TRRS adapter (cable) like Rode’s own SC4 or SC7, because the included cable from Rode is TRS-to-TRS, which works fine with regular cameras (DSLMs/DSLRs) but not with smartphones, which have a TRRS headphone jack – well, if they still have one at all, that is. It may all look the same at first sight, but the devil is in the detail – or in this case, the contacts on the plug.

If you want to use the GO II with a camera or audio recorder that has XLR inputs, you will need a 3.5mm to XLR adapter like Rode’s own VXLR+ or VXLR Pro.

Along with the GO II, Rode released a desktop application called Rode Central which is available for free for Windows and macOS. It lets you activate and fine-tune additional features on the GO II when it’s connected to the computer. You can also access files from the onboard recording, a new feature I will talk about in a bit. A mobile app for Android and iOS is not yet available but apparently Rode is already working on it.

One brilliant new software feature is the ability to record a simultaneous -12dB safety track when in “merged mode”. It’s something Rode already implemented on the VideoMic NTG and it’s a lifesaver when you don’t know in advance how loud the sound source will be. If there’s a very loud moment in the main track and the audio clips, you can just use the safety track which at -12dB probably will not have clipped. The safety channel is however only available when recording in “merged mode” since it uses the second channel for the back-up. If you are using “split mode”, both channels are already filled and there’s no space for the safety track. It also means that if you are using the GO II with a smartphone, you will only be able to access the safety channel feature when using the digital input (USB-C or Lightning), not the 3.5mm headphone jack analogue input, because only then will you have two channels to record into at your disposal.
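For the curious, the -12dB figure translates directly into amplitude: decibels map to a linear factor via 10^(dB/20). A tiny sketch (illustrative arithmetic, not Rode’s implementation):

```python
def db_to_gain(db):
    """Convert a decibel value to a linear amplitude factor."""
    return 10 ** (db / 20)

# The safety track is recorded at roughly a quarter of the main
# track's amplitude, leaving plenty of headroom before clipping:
safety_factor = db_to_gain(-12)   # ~0.251
# In post you bring it back up by the same amount:
restore_factor = db_to_gain(12)   # ~3.98
```

So the safety track sits at roughly a quarter of the main track’s amplitude; if the main track clips, you boost the safety track by +12 dB in post and carry on.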

Another lifesaver is the new onboard recording capability, which basically turns the two TX units into tiny standalone field recorders, thanks to their internal mic and internal storage. The internal storage can hold up to 7 hours of uncompressed WAV audio (the 7 hours also correspond to the battery life, which probably isn’t a coincidence). This is very helpful when you run into a situation where the wireless connection is disturbed and the audio stream is affected by interference noise or even drop-outs.
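As a back-of-the-envelope check (my assumption: mono, 24-bit, 48 kHz uncompressed WAV – Rode doesn’t spell out the internal format here, so treat the numbers as illustrative), 7 hours of audio lands in the region of 3.6 GB per TX unit:

```python
# Storage estimate for the onboard recording.
# ASSUMPTION: mono, 24-bit, 48 kHz uncompressed WAV -- not a
# published spec, just a plausible format for this estimate.
SAMPLE_RATE = 48_000      # samples per second
BYTES_PER_SAMPLE = 3      # 24-bit
HOURS = 7

total_bytes = SAMPLE_RATE * BYTES_PER_SAMPLE * 3600 * HOURS
print(f"{total_bytes / 1e9:.2f} GB")  # -> 3.63 GB
```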

There are some further options you can adjust via the Rode Central app: you can now activate a more fine-grained gain control pad for the output of the RX unit. On the original GO, you only had three settings (low, medium, high); now you have a total of 11 (in 3 dB steps from -30 dB to 0 dB). You can also activate a reduced sensitivity for the input of the TX units when you know you are going to record something very loud. Furthermore, you can enable a power saver mode that dims the LEDs to preserve some additional battery life.
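A quick sanity check on those numbers – 3 dB steps from -30 dB to 0 dB do indeed yield 11 settings:

```python
# Output gain pad on the RX unit: 3 dB steps from -30 dB to 0 dB
pad_steps = list(range(-30, 1, 3))
print(len(pad_steps), pad_steps[0], pad_steps[-1])  # -> 11 -30 0
```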

Other improvements over the original GO include a wider transmission range (200m line-of-sight vs. 70m) and better shielding from RF interference.

One thing some people were hoping for in an updated version of the Wireless GO is the option to monitor the audio that goes into the receiver via a headphone output – sorry to say, that didn’t happen. But as long as you are using a camera or smartphone app that gives you live audio monitoring, this shouldn’t be too big of a deal.

Aside from the wireless system itself, the GO II comes with a TRS-to-TRS 3.5mm cable to connect it to regular cameras with a 3.5mm input, three USB-C to USB-A cables (for charging and connecting it to a desktop computer/laptop), three windscreens, and a pouch. The pouch isn’t that great in my opinion – I would have preferred a more robust case, but I guess it’s better than nothing at all. And as mentioned before: I would have loved to see a TRS-to-TRRS, USB-C to USB-C and/or USB-C to Lightning cable included to assure out-of-the-box compatibility with smartphones. Unlike some competitors, the kit doesn’t come with separate lavalier mics, so if you don’t want to use the internal mics of the transmitters, you will have to make an additional purchase unless you already own some. Rode offers the dedicated Lavalier GO for around 60 Euros. The price for the Wireless GO II is around 300 Euros.

So is the Rode Wireless GO II perfect? Not quite, but it’s pretty darn close. It builds upon an already amazingly compact and versatile wireless audio system and adds some incredible new features, so I can only recommend it for every mobile videomaker’s gear bag. If you want to compare it against a viable alternative, you could take a look at the Saramonic Blink 500 Pro B2, which is roughly the same price and comes with two lavalier microphones, or the Hollyland Lark 150.

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#41 Sharing VN project files between iPhone, iPad, Mac, Android (& Windows PC) — 23. March 2021

#41 Sharing VN project files between iPhone, iPad, Mac, Android (& Windows PC)

As I have pointed out in two of my previous blog posts (What’s the best free cross-platform mobile video editing app?, Best video editors / video editing apps for Android in 2021), VN is a free and very capable mobile video editor for Android and iPhone/iPad, and the makers recently also launched a desktop version for macOS. Project file sharing takes advantage of that and makes it possible to start your editing work on one device and finish it on another. So for instance, after having shot some footage on your iPhone, you can start editing right away using VN for iPhone, but transfer the whole project to your iMac or MacBook Pro later to have a bigger screen and mouse control. It’s also a great way to free up storage space on your phone, since you can archive projects in the cloud, on an external drive or computer, and delete them from your mobile device afterwards. Project sharing isn’t a one-way trick – it also works the other way around: you start a project using VN on your iMac or MacBook Pro and then transfer it to your iPhone or iPad because you have to go somewhere and want to continue your project while commuting. And it’s not all about Apple products either – you can also share from or to VN on Android smartphones and tablets (so basically every smartphone or tablet that’s not made by Apple). What about Windows? Yes, this is also possible, but you will need to install an Android emulator on your PC – I will not go into the details of that procedure in this article as I don’t own a PC to test it on. But you can check out a good tutorial on the VN site here.

Before you start sharing your VN projects, here’s some general info: To actively share a project file, you need to create a free account with VN. Right off the bat, you can share projects that don’t exceed 3 GB in size. There’s also a maximum limit of 100 project files per day but I suppose nobody will actually bump into that. To get rid of these limitations, VN will manually clear your account for unlimited sharing within a few days after filling out this short survey. For passive sharing, that is when someone sends you a project file, there are no limitations even when you are not logged in. As the sharing process is slightly different depending on which platforms/devices are involved I have decided to walk you through all nine combinations, starting with the one that will probably be the most common. 

Let me quickly explain two general things ahead which apply to all combinations so I don’t have to go into the details every time:

1) When creating a VN project file to share, you can do it as “Full” or “Simple”. “Full” will share the project file with all of its media (complete footage, music/sound fx, text), “Simple” will let you choose which video clips you actually want to include. Not including every video clip will result in a smaller project file that can be transferred faster.

2) You can also choose whether or not you want the project file to be “Readonly”. If you choose “Readonly”, saving or exporting will be denied – this can be helpful if you send it to someone else but don’t want this person to save changes or export the project.

All of the sharing combinations I will mention now are focused on local device-to-device sharing. Of course you can also use any cloud service to store/share VN project files and have them downloaded and opened remotely on another device that runs the VN application.

iPhone/iPad to Mac

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon at the bottom), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Now choose “AirDrop” and select your Mac. Make sure that AirDrop is activated on both devices.
  • Depending on your AirDrop settings you now have to accept the transfer on the receiving device or the transfer will start automatically. By default, the file will be saved in the “Downloads” folder of your Mac.
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.

iPhone/iPad to iPhone/iPad

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now choose “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Select the iPhone/iPad you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now select “Open Project”.

iPhone/iPad to Android

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the iOS/iPadOS share menu will pop up.
  • Now you need to transfer the project file from the iPhone/iPad to the Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both iPhone/iPad and Android.
  • So choose SendAnywhere from the share menu. A 6-digit code is generated.
  • Open SendAnywhere on your Android device, select the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. 
  • The Android “Open with” menu will open, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Finally choose “Open Project”.

Mac to iPhone/iPad

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac, right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select your iPhone or iPad. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now choose “Open Project”.

Mac to Mac

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select the Mac you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • By default the VN project file will be saved in the “Downloads” folder of the receiving Mac.
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.

Mac to Android

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and choose a way to send it to your Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both macOS and Android.
  • So using SendAnywhere on your Mac, drag the VN project file into the app. You will see a 6-digit code. Open SendAnywhere on your Android, choose the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then on the project file.
  • The Android “Open with” menu will pop up, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Choose “Open Project”.

Android to Mac

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the Android share sheet will pop up.
  • Now you need to transfer the project file from your Android device to your Mac. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and macOS.
  • So choose SendAnywhere from the share menu. A 6-digit code is generated.
  • Open SendAnywhere on your Mac, select the “Receive” tab and enter the code. Unless you have created a custom download folder for your preferred file transfer app, the VN project file will be saved to the “Downloads” folder on your Mac (or is available in your cloud storage).
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.

Android to Android

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • From the Android share sheet, choose Android’s integrated Wi-Fi sharing option Nearby Share (check this video on how to use Nearby Share if you are not familiar with it) and select the device you want to send it to. Make sure Nearby Share is activated on both devices.
  • After accepting the file on the second device, the transfer will start.
  • Once it is finished, choose “VN/Import to VN” from the pop-up menu. Importing into VN will start.
  • Finally choose “Open Project”.

Android to iPhone/iPad

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated. Afterwards, the Android share sheet menu will pop up.
  • Now you need to transfer the project file from the Android device to the iPhone/iPad. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and iPhone/iPad.
  • So choose SendAnywhere from the Share Sheet. A 6-digit code is generated.
  • Open SendAnywhere on your iPhone/iPad, select the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. Now tap on the share icon in the top right corner and choose VN from the list. The project file will be imported into VN.
  • Finally choose “Open Project”.

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

DISCLOSURE NOTE: This particular post was sponsored by VN. It was however researched and written all by myself.

#38 How to anonymize persons or objects in videos on a smartphone – new app makes things a lot easier! — 16. January 2021

#38 How to anonymize persons or objects in videos on a smartphone – new app makes things a lot easier!

There are times when – for reasons of privacy or even a person’s physical safety – you want to make certain parts of a frame in a video unrecognizable so as not to give away someone’s identity or the place where you shot the video. While it’s fairly easy to achieve something like that for a photograph, it’s a lot more challenging for video, for two reasons: 1) A person moving around within a shot, or a moving camera, constantly alters the location of the subject within the frame. 2) If the person talks, he or she might also be identifiable just by his/her voice. So are there any apps that help you anonymize persons or objects in videos when working on a smartphone?

KineMaster – the best so far

Up until recently, the best app for anonymizing persons and/or certain parts of a video in general was KineMaster, which I already praised in my last blog post about the best video editing apps on Android (it’s also available for iPhone/iPad). While it’s possible to use just about any video editor that allows for a resizable image layer (say, a plain black square or rectangle) on top of the main track to cover a face, KineMaster is the only one with a dedicated blur/mosaic tool for this use case. Many other video editing apps have a blur effect in their repertoire, but the problem is that this effect always affects the whole image and can’t be applied to only a part of the frame. KineMaster, on the other hand, allows its Gaussian Blur effect to be adjusted in size and position within the frame. To access this feature, scroll to the part of the timeline where you want to apply the effect but don’t select any of the clips! Now tap on the “Layer” button, choose “Effect”, then “Basic Effects”, then either “Gaussian Blur” or “Mosaic”. An effect layer gets added to the timeline which you can resize and position within the preview window. Even better: KineMaster also lets you keyframe this layer, which is incredibly important if the subject/object you want to anonymize is moving around the frame or if the camera is moving (thereby constantly altering the subject’s/object’s position within the frame). Keyframing means you can set “waypoints” for the effect’s area so that it automatically changes its position/size over time. You can access the keyframing feature by tapping on the key icon in the left sidebar. Keyframes have to be set manually, so it’s a bit of work, particularly if your subject/object is moving a lot. If you just have a static shot with the person not moving around much, you don’t have to bother with keyframing though.
And as if the adjustable blur/mosaic effect and support for keyframing weren’t good enough, KineMaster also gives you a tool to add an extra layer of privacy: you can alter voices. To access this feature, select a clip in the timeline and then scroll down the menu on the right to find “Voice Changer” – there’s a whole bunch of different effects. To be honest, most of them are rather cartoonish – I’m not sure you want your interviewee to sound like a chipmunk. But there are also a couple of voice changer effects that I think can be used in a professional context.
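Conceptually, KineMaster’s keyframed mosaic boils down to two simple operations: interpolating the effect region between your “waypoints” and averaging pixel blocks inside it. Here’s a rough, platform-agnostic Python sketch of both ideas (the function names are mine, not KineMaster’s; the image is a grayscale list of rows to keep it dependency-free):

```python
def lerp_region(keyframes, t):
    """Linearly interpolate an (x, y, w, h) effect region between
    keyframes, given as a sorted list of (time, (x, y, w, h))."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, r0), (t1, r1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return tuple(round(p + a * (q - p)) for p, q in zip(r0, r1))
    return keyframes[-1][1]

def mosaic(img, x, y, w, h, block=8):
    """Pixelate the rectangle (x, y, w, h) of a grayscale image
    (list of rows) by averaging each block of pixels, in place."""
    for by in range(y, y + h, block):
        for bx in range(x, x + w, block):
            ys = range(by, min(by + block, y + h))
            xs = range(bx, min(bx + block, x + w))
            avg = sum(img[j][i] for j in ys for i in xs) // (len(ys) * len(xs))
            for j in ys:
                for i in xs:
                    img[j][i] = avg
    return img
```

A real implementation would run this per frame at the interpolated region – which is exactly why manual keyframing gets tedious for fast-moving subjects.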

What happened to Censr?

As I indicated in the paragraph above, a moving subject (or a moving camera) makes anonymizing content within a video a lot harder. You can manually keyframe the blurred area to follow along in KineMaster but it would be much easier if that could be done via automatic tracking. Last summer, a closed beta version of an app called “Censr” was released on iOS, the app was able to automatically track and blur faces. It all looked quite promising (I saw some examples on Twitter) but the developer Sam Loeschen told me that “unfortunately, development on censr has for the most part stopped”.

PutMask – a new app with a killer feature!

But you know what? There actually is a smartphone app out there that can automatically track and pixelate faces in a video: it’s called PutMask and is currently only available for Android (there are plans for an iOS version). The app (released in July 2020) offers three ways of pixelating faces in videos: automatically via face-tracking, manually by following the subject with your finger on the touch-screen, and manually by keyframing. The keyframing option is the most cumbersome one but might be necessary when the other two don’t work well. The “swipe follow” option is the middle ground – not as time-consuming as keyframing, but manual action is still required. The most convenient approach is of course automatic face-tracking (you can even track multiple faces at the same time!) – and I have to say that in my tests, it worked surprisingly well!

Does it always work? No, there are definitely situations in which the feature struggles. If you are walking around and your face gets covered by something else (for instance because you are passing another person or an object like a tree), even for only a short moment, the tracking often loses you. It even lost me when I was walking around indoors and the lens flare from a light bulb on the ceiling created a visual “barrier” that I passed at some point. And although I would say that the app is generally well designed, some of the workflow steps and the nomenclature can be a bit confusing. Here’s an example: after choosing a video from your gallery, you can tap on “Detect Faces” to start a scanning process. The app will tell you how many faces it has found and will display a numbered square around each face. If you now tap on “Start Tracking”, the app tells you “At least select One filter”. But I couldn’t find a button or anything else indicating a “filter”. After some confusion I discovered that you need to tap once on the square that is placed over the face in the image – maybe by “filter” they actually mean you need to select at least one face? Now you can initiate the tracking. After the process is finished you can preview the tracking the app has done (and also dig deeper into the options to alter the amount of pixelation etc.), but to check the actual pixelated video you have to export your project first. While the navigation could/should be made clearer and more intuitive for certain actions, I was quite happy with the results in general. The biggest catch until recently was the maximum export resolution of 720p, but with the latest update released on 21 January 2021, 1080p is also supported. An additional feature that would be great to have in an app with a dedicated focus on privacy and anonymization is the ability to alter/distort the voice of a person, like you can in KineMaster.

There’s one last thing I should address: the app is free to download with all its core functionality, but you only get SD resolution and a watermark on export. For HD/FHD watermark-free export, you need to make an in-app purchase. The IAP procedure is without a doubt the weirdest I have ever encountered: the app tells you to purchase any one of a selection of different “characters” to receive the additional benefits. Initially, these “characters” are just names in boxes – “Simple Man”, “Happy Man”, “Metal-Head” etc. If you tap on a box, an animated character pops up. But only when you scroll down does it become clear that these “characters” represent different amounts of payment with which you support the developer. And if that wasn’t strange enough by itself, the amount you can donate goes up to a staggering 349.99 USD (the character “Dr. Plague”) – no kidding! At first, I had actually selected Dr. Plague because I thought it was the coolest-looking character of the bunch. Only when trying to go through with the IAP did I become aware of the fact that I was about to drop 350 bucks on the app! Seriously, this is nuts! I told the developer that I don’t think this is a good idea. Anyway, the amount of money you donate doesn’t affect your additional benefits, so you can just opt for the first character, the “Simple Man”, which costs 4.69€. I’m not sure why they would make things so confusing for users willing to pay, but other than that, PutMask is a great new app with a lot of potential – I will definitely keep an eye on it!

As always, if you have questions or comments, drop them below or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

Download PutMask on GooglePlay.

#36(0) The Insta360 One X2 – fun & frustration — 5. January 2021

#36(0) The Insta360 One X2 – fun & frustration

A couple of years ago, 360° (video) cameras burst onto the scene and seemed to be all the rage for a while. The initial excitement faded relatively quickly however when producers realized that this kind of video didn’t really resonate with the public as much as they had thought it would – at least in the form of immersive VR (Virtual Reality) content, for which you need extra hardware, hardware that most didn’t bother to get or didn’t get hooked on. From a creator’s side, 360 video also involved some extra and – dare I say – fairly tedious workflow steps to deliver the final product (I have one word for you: stitching). That’s not to say that this extraordinary form of video doesn’t have value or vanished into total obscurity – it just didn’t become a mainstream trend.

Among the companies that heavily invested in 360 cameras was Shenzhen-based Insta360. They offered a wide variety of different devices: Some standalone, some that were meant to be physically connected to smartphones. I actually got the Insta360 Air for Android devices and while it was not a bad product at all and fun for a short while, the process of connecting it to the phone’s USB port for use and then taking it off again to put the phone back in my pocket or use it for other things quickly sucked out the motivation to keep using it.

Repurposing 360 video

While continuing to develop new 360 cameras, Insta360 realized that 360 video could be utilized for something other than just regular 360 spherical video: overcapture and subsequent reframing for “traditional”, “flat” video. What does this mean in plain English? Well, the original spherical video that is captured is much bigger in terms of resolution/size than the one that you want as a final product (for instance classic 1920×1080), which gives you the freedom to choose your angle and perspective in post production and even create virtual camera movement and other cool effects. Insta360 by no means invented this idea but they were clever enough to shift their focus towards this use case. Add to that the marketing-gold feature of the “invisible selfie-stick” (taking advantage of a dual-lens 360 camera’s blindspot between its lenses), brilliant “Flow State” stabilization and a powerful mobile app (Android & iOS) full of tricks, and you end up with a significant popularity boost for your products!

The One X and the wait for a true successor

The one camera that really proved to be an instant and long-lasting success for Insta360 was the One X which was released in 2018. A very compact & slick form factor, ease of use and very decent image quality (except in low light) plus the clever companion app breathed some much-needed life into a fairly wrinkled and deflated 360 video camera balloon. In early 2020 (you know, the days when most of us still didn’t know there was a global pandemic at our doorstep), Insta360 surprised us by not releasing a direct successor to everybody’s darling (the One X) but the modular One R, a flexible and innovative but slightly clunky brother to the One X. It wasn’t until the end of October that Insta360 finally revealed the true successor to the One X, the One X2.

In the months prior to the announcement of the One X2, I had actually thought about getting the original One X (I wasn’t fully convinced by the One R) but it was sold out in most places and there were some things that bothered me about the camera. To my delight, Insta360 seemed to have addressed most of the issues that I (and obviously many others) had with the original One X: They improved the relatively poor battery life by making room for a bigger battery, they added the ability to connect an external mic (both wirelessly through Bluetooth and via the USB-C port), they included a better screen on which you could actually see things and change settings in bright sunlight, they gave you the option to stick on lens guards for protecting the delicate protruding lenses and they made it more rugged including an IPX8 waterproof certification (up to 10m) and a less flimsy thread for mounting it to a stick or tripod. All good then? Not quite. Just by looking at the spec sheet, people realized that there wasn’t any kind of upgrade in terms of video resolution or even just frame rates. It’s basically the same as the One X. It maxes out at 5.7k (5760×2880) at 30fps (with options for 25 and 24), 4k at 50fps and 3k at 100fps. The maximum bitrate is 125 Mbit/s. I’m sure quite a few folks had hoped for 8k (to get on par with the Kandao Qoocam 8K) or at the very least a 50/60fps option for 5.7k. Well, tough luck.
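To put that 125 Mbit/s maximum bitrate into perspective, here’s my own back-of-the-envelope math for how quickly the footage fills a memory card (a rough sketch that ignores audio and container overhead, so real files will be slightly larger):

```python
# Rough storage estimate for recording at the One X2's maximum bitrate.
# Assumption: a constant 125 Mbit/s video stream; audio and container
# overhead are ignored.
bitrate_mbit_per_s = 125
seconds_per_hour = 60 * 60
megabits_per_hour = bitrate_mbit_per_s * seconds_per_hour   # 450,000 Mbit
gigabytes_per_hour = megabits_per_hour / 8 / 1000           # bits -> bytes, MB -> GB
# roughly 56 GB of footage per hour, so a 128 GB microSD card
# holds a bit over two hours at maximum quality
```

That also hints at why 8K would have been a tough sell: roughly doubling the pixel count in each dimension would push both file sizes and the decoding load on mobile editing hardware considerably higher.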

While I can certainly understand some of the frustration about the fact that there hasn’t been any kind of bump in resolution or frame rates in 2 years, putting 8K into such a small device and also having the footage work for editing on today’s mobile devices probably wasn’t a step Insta360 was ready to take, because of the risk of a worse user experience despite the higher-resolution image quality. Personally, I wasn’t bothered too much by this since the other hardware improvements over the One X were good enough for me to go ahead and make the purchase. And this is where my own frustrations began…

Insta360 & me: It’s somewhat difficult…

While I was browsing the official Insta360 store to place my order for the One X2, I noticed a pop-up that said you could get 5% off your purchase if you signed up for their newsletter. They did exclude certain cameras and accessories but the One X2 was mentioned nowhere. So I thought, “Oh, great! This just comes at the right time!”, and signed up for the newsletter. After getting the discount code however, entering it during check-out always returned a “Code invalid” error message. I took to Twitter to ask them about this – no reply. I contacted their support by email and they eventually and rather flippantly told me something like “Oh, we just forgot to put the X2 on the exclusion list, sorry, it’s not eligible!”. Oh yes, me and the Insta360 support were off to a great start!

Wanting to equip myself with the (for me) most important accessories, I intended to purchase a pair of spare batteries and the microphone adapter (USB-C to 3.5mm). I could write a whole rant about how outrageous I find the fact that literally everyone seems to make proprietary USB-C to 3.5mm adapters that don’t work with other brands/products. E-waste galore! Anyway, there’s a USB-C to 3.5mm microphone adapter from Insta360 available for the One R and I thought, well, maybe at least within the Insta360 ecosystem there should be some cross-device compatibility. Hell no, they told me the microphone adapter for the One R doesn’t work with the One X2. Ok, so I need to purchase the more expensive new one for the X2 – swell! But wait, I can’t because while it’s listed in the Insta360 store, it’s not available yet. And neither are extra batteries. The next bummer. So I bought the Creator Kit including the “invisible” selfie-stick, a small tripod, a microSD card, a lens cap and a pair of lens guards.

A couple of weeks later, the package arrived – no problem, in the era of Covid I’m definitely willing to cut some slack in terms of delivery times, and the merchandise is sent from China so it has quite a way to Germany. I opened the package, took out the items and checked them to see if anything was broken. I noticed that one of the lens guards had a small blemish/scratch on it. I put them on the camera anyway, thinking maybe it wouldn’t really show in the footage. Well, it did. A bit annoying but stuff like that happens, a lemon. I contacted the support again. They wanted me to take a picture of the affected lens guard. Okay. I sent them the picture. They bluntly replied that I should just buy a new one from their store, basically insinuating that it was me who had damaged the lens guard. What terrible customer service! I suppose I would have mustered up some understanding for their behaviour if I had contacted them a couple of days or weeks later, after actually using the X2 for some time outdoors where stuff can quickly happen. But I got in touch with them the same day the delivery arrived and they should have been able to see that, since the delivery had a tracking number. Also, this item costs 25 bucks in the Insta360 store but probably only a few cents in production, and I wasn’t even asking about a pair but only one – why make such a fuss about it? So there was some back-and-forth and only after I threatened to return the whole package and asked for a complete refund did they finally agree to send me a replacement pair of lens guards at no extra cost. On a slightly positive note, they did arrive very quickly, only a couple of days later.

Is the Insta360 One X2 actually a good camera?

So what an excessive prelude I have written! What about the camera itself? I have to admit that for the most part, it’s been a lot of fun so far after using it for about a month. The design is rugged yet still beautifully simplistic and compact, the image quality in bright, sunny conditions is really good (if you don’t mind that slightly over-sharpened wide-angle look and that it’s still “only” 5.7k – remember this resolution is for the whole 360 image so it’s not equivalent to a 5.7k “flat” image), the stabilization is generally amazing (as long as the camera and its sensor are not exposed to extreme physical shakes which the software stabilization can’t compensate for) and the reframing feature in combination with the camera’s small size and weight gives you immense flexibility in creating very interesting and extraordinary shots.

Sure, it also has some weaknesses: Despite having a 5.7k 360 resolution, if you want to export as a regular flat video, you are limited to 1080p. If you need your final video to be in UHD/4K non-360 resolution, this camera is not for you. The relatively small sensor size (I wasn’t able to find out the exact size for the X2 but I assume it’s the same as the One X, 1/2.3″) makes low-light situations at night or indoors a challenge despite a (fixed) aperture of f/2.0 – even a heavily overcast daytime sky can prove less than ideal. Yes, a slightly bigger sensor compared to its predecessors would have been welcome. The noticeable amount of image noise that is introduced by auto-exposure in such dim conditions can be reduced by exposing manually (you can set shutter speed and ISO) but then of course you just might end up with an image that’s quite dark. The small sensor also doesn’t allow for any fancy “cinematic” bokeh, but in combination with the fixed focus it also has an upside that shouldn’t be underestimated for self-shooters: You don’t have to worry about a pulsating auto-focus or being out of focus as everything is always in focus. You can also shoot video in LOG (flatter image for more grading flexibility) and HDR (improved dynamic range in bright conditions) modes. Furthermore, there’s a dedicated non-360 video mode with a 150 degree field-of-view but except for the fact that you get a slight bump in resolution compared to flat reframed 360 video (1440p vs. 1080p) and smaller file sizes (you can also shoot your 5.7k in the H.265 codec to save space), I don’t see myself using this a lot as you lose all the flexibility in post.
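The 1080p limit for flat exports makes more sense once you do the pixel math: the 5760-pixel equirectangular width covers the full 360 degrees, so any single reframed view only sees a slice of that detail. A rough sketch (the 90 degree field of view for the reframed shot is my own illustrative assumption):

```python
# How much source detail actually lands in a reframed "flat" shot?
full_width_px = 5760                    # equirectangular width = 360 degrees
px_per_degree = full_width_px / 360     # 16 source pixels per degree of view
reframed_fov_degrees = 90               # assumed field of view of the reframe
source_px_in_view = px_per_degree * reframed_fov_degrees
# -> 1440 source pixels across a 90-degree view, which is in the same
# ballpark as the 1080p reframe limit and the 1440p flat-mode resolution
```

So upscaling a reframed view beyond roughly that width would only interpolate detail that was never captured.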

While it’s good that all the stitching is done automatically and the camera does a fairly good job, it’s not perfect and you should definitely familiarize yourself with where the (video) stitchline goes to avoid it in the areas where you capture important objects or persons, particularly faces. As a rule of thumb when filming yourself or others you should always have one of the two lenses pointed towards you/the person and not face the side of the camera. It’s fairly easy to do if you usually have the camera in the same position relative to yourself but becomes more tricky when you include elaborate camera movements (which you probably will as the X2 basically invites you to do this!).

Regarding the audio, the internal 4-mic ambisonic setup can produce good results for ambient sound, particularly if you have the camera close to the sound source, like when you have it on a stick pointing down and you are walking over fresh snow, dead leaves, gravel etc. For recording voices in good quality, you also need to be pretty close to the camera’s mics; having it on a fully extended selfie-stick isn’t ideal. If you want to use the X2 on an extended stick and talk to the camera, you should use an external mic, either one that is directly connected to the camera or plugged into an external recorder, then having to sync audio and video later in post. As I have mentioned before, the X2 now does offer support for external mics via the USB-C charging port with the right USB-C-to-3.5mm adapter and also via Bluetooth. Insta360 highlights in their marketing that you can use Apple’s AirPods (Pro) but you can also use other mics that work via Bluetooth. The audio sample rate of Bluetooth mics is currently limited to 16kHz by the standard but depending on the mic used you can get decent audio. I’ll probably write a separate article on using external mics with the X2 once my USB-C to 3.5mm adapter arrives. Wait, does the X2 shoot 360 photos as well? Of course it does, and they turn out quite decent, particularly with the new “Pure Shot” feature, and the stitching is better than in video mode. It’s no secret though that the X2 has a focus on video with all its abilities, and for those who mainly care about 360 photography for virtual tours etc., the offerings in the Ricoh Theta line will probably be the better choice.

The Insta360 mobile app

The Insta360 app (Android & iOS) might deserve its own article to get into detail but suffice it to say that while it can seem a bit overwhelming and cluttered occasionally and you also still experience glitches now and then, it’s very powerful and generally works well. Do note however that if you want to export in full 5.7k resolution as a 360 video you have to transfer the original files to a desktop computer and work with them in the (free) Insta360 Studio software (Windows/macOS) as export from the mobile app is limited to 4K. You should also be aware of the fact that neither the mobile app nor the desktop software works as a fully-fledged traditional video editor for immersive 360 video where you can have multiple clips on a timeline and arrange them for a story. In the mobile app, you do get such an editing environment (“Stories” – “My Stories” – “+ Create a story”) but while you can use your original spherical 360 footage here, you can only export the project as a (reframed) flat video (max resolution 2560×1440). If you need your export to be an actual 360 video with according metadata, you can only do this one clip at a time outside the “Stories” editing workspace. But as mentioned before, Insta360 focuses on the reframing of 360 video with its cameras and software, so not too many people might be bothered by that. One thing that really got on my nerves while editing within the app on an iPad: When you are connected to the X2 over WiFi, certain parts of the app that rely on a data connection don’t work, for instance you are not able to browse all the features of the shot lab (only those that have been cached before) or preview/download music tracks for the video. This is less of a problem on a phone where you still can have a mobile data connection while using a WiFi connection to the X2 (if you don’t mind using up mobile data) but on an iPad or any device that doesn’t have an alternative internet connection, it’s quite annoying. 
You have to download the clip, disconnect from the X2, re-connect to your home WiFi and only then download the track you want to use.

Who is the One X2 for?

Well, I’d say that it can be particularly useful for solo-shooters and solo-creators for several reasons: Most of all you don’t have to worry much about missing something important around you while shooting since you are capturing a 360 image and can choose the angle in post (reframing/keyframed reframing) if you export as a regular video. This can be extremely useful for scenarios where there’s a lot to see or happening around you, like if you are travel-vlogging from interesting locations or are reporting from within a crowd – or just generally if you want to do a piece-to-camera but also show the viewer what you are looking at in that very moment. Insta360’s software stabilization is brilliant and comparable to a gimbal and the “invisible” selfie-stick makes it look like someone else is filming you. The stick and the compact form of the camera also let you move the camera to places that seem impossible otherwise. With the right technique you can even do fake “drone” shots. Therefore it also makes sense to have the X2 in your tool kit just for special shots, even if you are neither a vlogger nor a journalist, nor interested in “true” 360 video.

A worthy upgrade from the One X / One R?

Should you upgrade if you have a One X or One R? Yes and no. If you are happy with the battery life of the One X or the form factor of the One R and were mainly hoping for improved image quality in terms of resolution / higher frame rates, then no, the One X2 does not do the trick, it’s more of a One X 1.5 in some ways. However, if you are bothered by some “peripheral” issues like poor battery life, very limited functionality of the screen/display, lack of external microphone support (One X) or the slightly clunky and cumbersome form factor / handling (One R) and you are happy with a 5.7k resolution, the X2 is definitely the better camera overall. If you have never owned a 360 (video) camera, this is a great place to start, despite its quirks – just be aware that Insta360’s support can be surprisingly cranky and poor in case you run into any issues.


#35 Using external microphones with iPhones when shooting video — 1. December 2020

#35 Using external microphones with iPhones when shooting video

I usually don’t follow the stats for my blog but when I recently did check on which articles have been the most popular so far, I noticed that one stuck out by a large margin: the one on using external microphones with Android devices. So I thought: if people are interested in that, why not make an equivalent for iOS, that is, for iPhones? So let’s jump right into it.

First things first: The Basics

A couple of basic things first: Every iPhone has a built-in microphone for recording video that, depending on the use case, might already be good enough if you can position the phone close to your talent/interviewee. Having your mic close to the sound source is key in every situation to get good audio! As a matter of fact, the iPhone has multiple internal mics and uses different ones for recording video (next to the lens/lenses) and pure audio (bottom part). When doing audio-only for radio etc., it’s relatively easy to get close to your subject and get good results. It’s not the best way when recording video though if you don’t want to shove your phone into someone’s face. In this case you can and should significantly improve the audio quality of your video by using an external mic connected to your iPhone – never forget that audio is very important! While the number of Android phone makers that support the use of external mics within their native camera app is slowly growing, there are still many (most?) Android devices out there that don’t support this for the camera app that comes with the phone (it’s possible with basically every Android device if you use 3rd party camera apps though!). You don’t have to worry about this when shooting with the native camera app of an iPhone. The native camera app will recognize a connected external mic automatically and use it as the audio input when recording video. When it comes to 3rd party video recording apps, many of them like Filmic Pro, MoviePro or Mavis support the use of external mics as well but with some of them you have to choose the audio input in the settings so definitely do some testing before using it the first time on a critical job. Although I’m looking at this from a videographer’s angle, most of what I am about to elaborate on also applies to recording with audio recording apps. And in the same way, when I say “iPhone”, I could just as well say “iPad” or “iPod Touch”. 
So there are basically three different ways of connecting an external mic to your iPhone: via the 3.5mm headphone jack, via the Lightning port and via Bluetooth (wireless).

3.5mm headphone jack & adapter

With all the differences between Android and iOS both in terms of hardware and software, the 3.5mm headphone jack was, for a while, a somewhat unifying factor – that was until Apple decided to drop the headphone jack for the iPhone 7 in 2016. This move became a widely debated topic, surely among the – let’s be honest – comparatively small community of mobile videographers and audio producers relying on connecting external mics to their phones, but also among more casual users because they couldn’t just plug their (often very expensive) headphones into their iPhone anymore. While the first group is definitely more relevant for readers of this blog, the second was undoubtedly responsible for putting the issue on the public debate map. Despite the considerable outcry, Apple never looked back. They did offer a Lightning-to-3.5mm adapter – but sold it separately. I’m sure they have been making a fortune since; don’t ask how many people had to buy it more than once because they lost, misplaced or broke the first one. A whole bunch of Android phone makers obviously thought Apple’s idea was a progressive step forward and started ditching the headphone jack as well, equipping their phones only with a USB-C port. Unlike with Apple however, the consumer still had the option of choosing a new phone that had a headphone jack, and in a rather surprising turn of events, some companies like Huawei and Google actually backtracked and re-introduced the headphone jack, at least for certain models. Anyway, if you happen to have an older iPhone (6s and earlier) you can still use the wide variety of external microphones that can be connected via the 3.5mm headphone jack (Rode smartLav+, iRig Mic, iRig Pre/iRig Pre 2 interface with XLR mics etc.) without worrying much about adapters and dongles. Just make sure that the mic you are using has a TRRS (three black rings) and not a TRS (two black rings) 3.5mm connector to ensure compatibility with smartphones (TRS is for DSLM/DSLR).

Lightning port

While most Android users probably still have fairly fresh memories of a different charging port standard (microUSB) from the one that is common now (USB-C), only seasoned iPhone aficionados will remember the days of the 30-pin connector that lasted until the iPhone 5 introduced the Lightning port as a new standard in 2012. And while microUSB mic solutions for Android could be counted on one hand and USB-C offerings took forever to become a reality, there were dedicated Lightning mics even before Apple decided to kill the headphone jack. The most prominent one and a veritable trailblazer was probably IK Multimedia’s iRig Mic HD and its successor, the iRig Mic HD 2. IK Multimedia’s successor to the iRig Pre, the iRig Pre HD, comes with a Lightning cable as well. But you can also find options from other well-known companies like Zoom (iQ6, iQ7), Shure (MV88/MV88+), Sennheiser (HandMic Digital, MKE 2 Digital), Rode (VideoMic Me-L), Samson (Go Mic Mobile) or Saramonic (Blink 500). The Saramonic Blink 500 comes in multiple variations, two of them specifically targeted at iOS users: the Blink 500 B3 with one transmitter and the B4 with two transmitters. The small receiver plugs right into the Lightning port and is therefore an intriguingly compact solution, particularly when using it with a gimbal. Saramonic also has the SmartRig Di and SmartRig+ Di audio interfaces that let you connect one or two XLR mics to your device. IK Multimedia offers two similar products with the iRig Pro and the iRig Pro Duo. Rode recently released the USB-C-to-Lightning patch cable SC15 which lets you use their VideoMic NTG (which comes with TRS/TRRS cables) with an iPhone. There’s also a Lightning connector version of the SC6 breakout box, the SC6-L, which lets you connect two smartLavs or TRRS mics to your phone. I have dropped lots of product names here so far but you know what?
Even if you don’t own any of them, you most likely already have an external mic at hand: Of course I’m talking about the headset that comes included with the iPhone! It can’t match the audio quality of other dedicated external mics but it’s quite solid and can come in handy when you have nothing else available. One thing you should keep in mind when using any kind of microphone connected via the iPhone’s Lightning port: unless you are using a special adapter with an additional charge-through port, you will not be able to charge your device at the same time like you can/could with older iOS devices that had a headphone jack.

Wireless/Bluetooth

I have mentioned quite a few wireless systems before (Rode Wireless Go, Saramonic Blink 500/Blink 500 Pro, Samson Go Mic Mobile) that I won’t list here (again) for one reason: While the TX/RX system of something like the Rode Wireless Go streams audio wirelessly between its units, the receiver unit (RX) needs to be connected to the iPhone via a cable or (in the case of the Blink 500) at least a connector. So strictly speaking it’s not really wireless when it comes to how the audio signal gets into the phone. Now, are there any ‘real’ wireless solutions out there? Yes, but the technology hasn’t evolved to a standard that can match wired or semi-wired solutions in terms of both quality and reliability. While there could be two ways of getting wireless audio into a phone (WiFi and Bluetooth), only one (Bluetooth) is currently in use for external microphones. This is unfortunate because the Bluetooth protocol that is used for sending audio back from an external accessory to the phone (the so-called Hands-Free Profile, HFP) is limited to a sample rate of 16kHz (probably because it was created with headset phone calls in mind). Professional broadcast audio usually has a sample rate of 44.1 or 48kHz. That doesn’t mean that there aren’t any situations in which using a Bluetooth mic with its 16kHz limitation can actually be good enough. The Instamic was primarily designed to be a standalone ultra-compact high-quality audio recorder which records 48/96kHz files to its internal 8GB storage but can also be used as a truly wireless Bluetooth mic in HFP mode. The 16kHz audio I got when recording with Filmic Pro (here’s a guide on how to use the Instamic with Filmic Pro) was surprisingly decent. This probably has to do with the fact that the Instamic’s mic capsules are high quality, unlike with most other Bluetooth mics. One perhaps unexpected option is to use Apple’s AirPods/AirPods Pro as a wireless Bluetooth mic input.
According to BBC Mobile Journalism trainer Marc Blank-Settle, the audio from the AirPods Pro is “good but not great”. He does however point out that in times of Covid-19, being able to connect to other people’s AirPods wirelessly can be a welcome trick to avoid close contact. Another interesting wireless solution comes from a company called Mikme. Their microphone/audio recorder works with a dedicated companion video recording app via Bluetooth and automatically syncs the quality audio (44.1, 48 or 96kHz) to the video after the recording has been stopped. By doing this, they work around the 16kHz Bluetooth limitation for live audio streaming. While the audio quality itself seems to be great, the somewhat awkward form factor and the fact that its best feature only works in their own video recording app but not in other camera apps like Filmic Pro are noticeable shortcomings (you CAN manually sync the Mikme’s audio files to your Filmic or other 3rd party app footage in a video editor). At least regarding the form factor, they have released a new version called the Mikme Pocket which is more compact and basically looks/works like a transmitter with a cabled clip-on lavalier mic. One more important tip that applies to all the aforementioned microphone solutions: If you are shooting outdoors, always have some sort of wind screen / wind muff for your microphone with you as even a light breeze can cause noticeable noise.
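If you want to check whether a recording actually came through at the 16kHz HFP rate or at full broadcast quality, you can inspect the file’s sample rate yourself. Here’s a minimal sketch in Python (using only the standard library’s `wave` module, so it only reads WAV files; for the MP4/MOV containers a camera app produces you would need a tool like ffprobe instead):

```python
import wave

def wav_sample_rate(path):
    """Return the sample rate in Hz of a WAV file.

    Handy for checking whether a Bluetooth-mic recording was captured
    at the 16 kHz Hands-Free Profile rate or at a broadcast-quality
    44.1/48 kHz.
    """
    with wave.open(path, "rb") as wf:
        return wf.getframerate()
```

A result of 16000 tells you the audio went through the Hands-Free Profile; 44100 or 48000 means you got the full-quality path.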

Micpocalypse soon?

Looking into the nearby future, some fear that Apple might be pulling another “feature kill” soon, dropping the Lightning port as well and thereby eliminating all physical connections to the iPhone. While there are no clear indications that this is actually imminent, Apple surely would be the prime suspect to push this into the market. If that really happens however, it will be a considerable blow to iPhone videographers as long as there’s no established high-quality and reliable wireless standard for external mics. Oh well, there’s always another mobile platform to go to if you’re not happy with iOS anymore 😉

To wrap things up, I have asked a couple of mobile journalists / content creators using iPhones what their favorite microphone solution is when recording video (or audio in general):

Wytse Vellinga (Mobile Storyteller at Omrop Fryslân, The Netherlands): “When I am out shooting with a smartphone I want high quality worry-free audio. That is why I prefer to use the well-known brands of microphones. Currently there are three microphones I use a lot. The Sennheiser MKE200, the Rode Wireless Go and the Mikme Pocket. The Sennheiser is the microphone that is on the phone constantly when taking shots and capturing the atmospheric sound and short sound bites from people. For longer interviews I use the wireless microphones from Mikme and Rode. They offer me freedom in shooting because I don’t have to worry about the cables.”

Philip Bromwell (Digital Native Content Editor at RTÉ, Ireland): “My current favourite is the Rode Wireless Go. Being wireless, it’s a very flexible option for recording interviews and gathering localised nat sound. It has proven to be reliable too, although the original windshield was a weakness (kept detaching).”

Nick Garnett (BBC Reporter, England & the world): “The mic I always come back to is the Shure MV88+ – not so much for video – but for audio work: it uses a non-proprietary cable – micro-USB to Lightning. It allows headphones to plug into the bottom and so I can use it for monitoring the studio when doing a live insert and the mic is so small it hides in my hand if I have to be discreet. For video work? Rode VideoMicro or the Boya clone. It’s a semi-rifle, it comes with a deadcat and an isolation mount and it costs €30 … absolute bargain.”

Neal Augenstein (Radio Reporter at WTOP Washington DC, USA): “If I’m just recording a one-on-one interview, I generally use the built-in microphone of the iPhone, with a foam windscreen. I’ve yet to find a microphone that so dramatically improves the sound that it merits carrying it around. In an instance where someone’s at a podium or if I’m shooting video, I love the Rode Wireless Go. Just clipping it on the podium, without having to run cable, it pairs automatically, and the sound is predictably good. The one drawback – the tiny windscreen is tough to keep on.”

Nico Piro (Special Correspondent for RAI, Italy & the world): “To record ambient audio (effects or natural sound, as you want to name it) I use a Rode VideoMic Go (light, no battery needed, perfect for both phones and cameras), even if I must say that the iPhone’s on-board mic performs well, too. For Facebook Live I use a handheld mic by Polsen, designed for mobile; it is reliable and has a great cardioid pickup pattern. When it comes to interviews, the Rode Wireless Go beats everything for its compact dimensions and low weight. When you are recording in big cities like New York and you are worried about radio interference, the good old cabled mics are always there to help, so Rode’s SmartLav+ is a very good option. I’m also using it for radio production and I am very sad that Rode stopped improving its Rode Rec app, which is still good but stuck in time when it comes to file sharing. Last but not least is the Instamic. It takes zero space and it is super versatile…if you use the native camera app, don’t forget to clap for sync!”

Bianca Maria Rathay (Freelance iPhone videographer, Germany): “My favorite external microphone for the iPhone is the RODE Wireless Go in combination with a SmartLav+ (though it works on its own also). The mic lets your interviewee walk around freely, works indoors as well as outdoors and has a full sound. Moreover it is easy to handle and monitor once you have all the necessary adapters in place and ready.”

Leonor Suarez (TV Journalist and News Editor at RTPA, Spain): “My favorite microphone solutions are: For interviews: the Rode RodeLink Filmmaker Kit. It is reliable, robust and has a good quality-to-price ratio. I’ve been using it for years with excellent results. For interviews on the go, unexpected situations or when other mics fail: the IK Multimedia iRig Mic Lav. Again, a good quality-to-price ratio. I always carry them with me in my bag and they have allowed me to record interviews, pieces to camera and unexpected stories. What I also love is that you can check the audio with headphones while recording.”

Marcel Anderwert (Mobile Journalist at SRF, Switzerland): “For more than a year, I have been shooting all my reports for Swiss TV with one of these two mics: Voice Technologies’ VT506Mobile (with its long cable) or the Rode Wireless Go, my favourite wireless mic solution. The VT506Mobile works with iOS and Android phones, it’s a super reliable lavalier and the sound quality for interviews is just great. Rode’s Wireless Go gives me more freedom of movement. And it can be used in three ways: as a small clip-on mic with inbuilt transmitter, with a plugged-in lavalier mic – and in combination with a simple adapter even as a handheld mic.”

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#34 Apple is about to give us 25fps in the iPhone’s native camera app (finally catching up to Windows Phones) — 17. November 2020

#34 Apple is about to give us 25fps in the iPhone’s native camera app (finally catching up to Windows Phones)

One of the things that has mostly remained a blind spot in video recording with the native camera app of a smartphone is the ability to shoot in PAL frame rates, i.e. 25/50fps. The native camera apps of smartphones usually record at a frame rate of 30/60fps. This is fine for many use cases but it’s not ideal under two circumstances: a) if you have to deliver your video for traditional professional broadcast in a PAL broadcast standard region (Europe, Australia, parts of Africa, Asia, South America etc.); b) if you have a multi-camera shoot with dedicated ‘regular’ cameras that only shoot 25/50fps. Sure, it’s relatively easy to capture in 25fps on your phone by using a 3rd party app like Filmic Pro or Protake, but it would still be a welcome addition to any native camera app as long as this silly global frame rate divide (don’t get me started on this!) continues to exist. There was actually a prominent example of a phone maker that offered 25fps as a recording option in their (quasi-)native camera app very early on: Nokia and later Microsoft on their Lumia phones running Windows Phone / Windows Mobile. But as we all know by now, Windows Phone / Windows Mobile never really stood a chance against Android and iOS (read about its potential here) and has all but disappeared from the smartphone market. When LG introduced its highly advanced manual video mode in the native camera app of the V10, I had high hopes they would include a 25/50fps frame rate option, as they were obviously aiming at more ambitious videographers. But no, the years have passed and current offerings from the Korean company like the G8X, V60 and Wing still don’t have it. It’s probably my only major gripe with LG’s otherwise outstanding flagship camera app. It was up to Sony to rekindle the flame, giving us 25fps natively in the pro camera app of the Xperia 1 II earlier this year.

And now, as spotted by BBC multimedia trainer Mark Robertson yesterday, Apple has added the option to record with a frame rate of 25fps in the native camera app in their latest iOS beta, 14.3. This is a pretty big deal and I honestly didn’t expect Apple to make that move. But of course it’s a more than welcome surprise! Robertson is using a new iPhone 12 Pro Max, but his colleague Marc Blank-Settle confirmed that the feature trickles down all the way to the much older iPhone 6s, provided you run the latest public beta version of iOS. The iPhone 6 and older models are excluded as they are not able to run iOS 14. While it’s not guaranteed that every beta feature makes it to the finish line for the final release, I consider it very likely in this case. So how do you set your iPhone’s native camera app to shoot video in 25fps? Go into your iPhone’s general settings, scroll down to “Camera” and then select “Record Video”. Now locate the “Show PAL Formats” toggle switch and activate it, then choose either “1080p HD at 25fps” or “4K at 25fps”. Unfortunately, there’s no 50fps option at the moment, but I’m pretty sure it will come at some point in the future. I recorded several clips with my iPhone SE 2020 and checked the frame rate via the MediaInfo app, which revealed a clean 25.000fps and CFR (Constant Frame Rate; smartphones usually record in VFR = Variable Frame Rate). What other implications does this have? Well, many people interested in this topic have been complaining about Apple’s own iOS editing app iMovie not supporting 25/50fps export. You can import and edit footage recorded at those frame rates no problem, but it will be converted to 30/60fps upon export. I believe there’s a good chance now that Apple will support 25/50fps export in a future update of iMovie, because why bother integrating this into the camera app when you can’t deliver in the same frame rate?
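If you prefer to verify frame rate and CFR/VFR on a desktop instead of in the MediaInfo app, here’s a minimal sketch using ffprobe (part of FFmpeg, assumed to be installed and on your PATH). Comparing ffprobe’s r_frame_rate and avg_frame_rate fields is only a rough heuristic for constant vs. variable frame rate, and the clip filename at the bottom is just a placeholder.

```python
import json
import subprocess
from fractions import Fraction

def parse_rate(rate: str) -> float:
    """Convert an ffprobe rate string like '25/1' or '30000/1001' to a float."""
    return float(Fraction(rate))

def is_constant(r_frame_rate: str, avg_frame_rate: str, tol: float = 0.001) -> bool:
    """Rough CFR heuristic: declared and averaged frame rates should match."""
    return abs(parse_rate(r_frame_rate) - parse_rate(avg_frame_rate)) < tol

def probe_clip(path: str) -> dict:
    """Ask ffprobe for the first video stream's frame-rate fields as JSON."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=r_frame_rate,avg_frame_rate",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    stream = json.loads(result.stdout)["streams"][0]
    return {
        "fps": round(parse_rate(stream["avg_frame_rate"]), 3),
        "cfr": is_constant(stream["r_frame_rate"], stream["avg_frame_rate"]),
    }

if __name__ == "__main__":
    # Hypothetical filename of a clip shot with the new 25fps setting.
    print(probe_clip("IMG_0001.MOV"))
```

A 25fps clip recorded with the new setting should report an fps of 25.0 with the CFR check passing, while typical VFR smartphone footage shows a slight mismatch between the two fields.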
Android phone makers in the meantime should pay heed and consider adding 25/50fps video recording to their native camera apps sooner rather than later. It may not be relevant for the majority of conventional smartphone users, but it doesn’t hurt either and can make certain “special interest” groups very happy!

As always, feel free to comment here or hit me up on Twitter @smartfilming. If you like this blog post, do consider signing up for my Telegram channel to get notified about new blog posts and also receive my Ten Telegram Takeaways newsletter including 10 interesting things that happened during the past four weeks in the world of mobile content creation/tech.


For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂