smartfilming

Exploring the possibilities of video production with smartphones

#49 What’s new and useful in iOS 15? (by Marc Blank-Settle) — 24 October 2021

Preface

So far, all the blog posts on smartfilming.blog have been written by myself. I’m happy that for the very first time I’m now hosting a guest post here. The article is by Marc Blank-Settle, who works for the BBC Academy as a smartphone trainer and is highly regarded as one of the top sources for everything “MoJo” (mobile journalism), particularly when it comes to iPhones and iOS. His yearly round-up of all the new features introduced with the latest version of Apple’s mobile operating system iOS has become a go-to for journalists and content creators. iOS 15 just came out, so without further ado I’ll leave you to Marc’s take on the new software for iPhones – and don’t forget to follow him on Twitter! – Florian – smartfilming.blog

Introduction

Doesn’t time fly? It’s already a year since I made a video looking at what was then the latest version of iOS, the operating system on iPhones.
It also means it’s a year since the equally traditional complaint that Apple gets ‘preferential treatment’ over Android, the operating system on around 70% of smartphones globally.
However, it remains the case that iPhones and iOS remain the dominant device for mobile journalism.
It’s also the case that, if the pattern of previous releases is repeated, this review of iOS 15 will be relevant to far more iPhone owners, far more quickly. iOS 14 came out on 16 September 2020; within a week it was running on more devices than Android 11 – released a week earlier – reached in almost a year.

iOS 14 got onto millions of users’ iPhones within weeks of its release.
The latest version of Android is adopted much more slowly than iOS.

14 or 15?

In addition to new features and functions, iOS 15 also contains bug fixes and security updates to protect your device against malware, spyware and viruses. But in a radical departure, users can for the first time get all these fixes and updates without taking the new version of iOS.
Up until now, the only way to get the latest protection was to get the latest software version. But now, you can stay on iOS 14 and only take the security updates included in iOS 15 and not the new features.
To do this, go to ‘Settings-General-Software Update-Automatic Updates’ and turn off the options you see here for downloading and installing iOS updates. When Apple releases security patches for iOS 14, you’ll see them in the Software Update menu instead of the iOS 15 updates.
For some users, especially with older devices, this strategy might be worth considering. If you’re still on the 6s or the original SE and your battery depletes quickly, the extra strain of iOS 15 might not be worth it for you. 
Or maybe after reading this review, you might just want to keep everything as it is in terms of how your device works, but you understandably want to take the bug fixes.

Which devices can get iOS 15?

If your iPhone is a 6s, original SE or newer, iOS 15 can be downloaded to it.
But not everything is coming to every iPhone which can download it: rather than frustrate users by giving their phone features it will struggle with, Apple have chosen simply not to make them available on older devices.
Most people won’t be aware, though, that their 7 or 8 Plus is missing out on new goodies – although you will, after reading this review.
The cut-off tends to be the iPhone X and older: if you have one of those devices, then there are about ten additions which you won’t get. Anything newer, and you’ll get everything, although there are a few extra things reserved just for the iPhone 12 series of 2020. When this review gets to those features which are only for certain phones, I’ll flag that up.
Additionally, some things which Apple highlighted in their big reveal of iOS 15 in June 2021 have been postponed and won’t in fact be available until 15.1 or later; these too will be flagged up.
Finally, this review reflects to a degree how I personally use my iPhone. I’m not a great user of Reminders or Notes so I won’t be able to do justice to any changes made for that or any other aspects of iOS which I myself neglect.

Mainstream mojo

Usually, my review of the new features for mobile journalists of the forthcoming version of iOS goes BIG on video, audio and photos – the mainstays of mojo.
But not this year, at least not quite to the same degree.
I’m not saying there’s nothing of interest to mobile journalists, or I wouldn’t have spent hours researching, writing, and putting this all together. But there’s certainly not as much as in its immediate predecessors, iOS 13 and 14.
From my perspective, there’s nothing new for videos, photos and audio creation using Apple’s in-built apps with a huge “wow” factor. The key word here is “creation”: iOS 15 doesn’t immediately permit anything radical in terms of how content is gathered. But there are clues of what third-party developers may be able to do to benefit users. 

Video bokeh

The first big change for video in iOS 15 could go some way to addressing one long-standing complaint about footage recorded on an iPhone – that too much is in focus, unlike the material from a ‘proper’ broadcast camera used in news, documentaries, wildlife programmes and so on.
Known as ‘bokeh’ or, more prosaically, ‘blurry background’, it’s the visual effect whereby the main subject of a video, such as an interviewee, is fully in focus while the background behind him or her is not.
It gives depth to shots and a blurred background means the viewer can concentrate on what is being said rather than wondering where the interview is being filmed. On an iPhone, all the footage tends to be in focus unless the subject of a shot is very close to the lens.
Due to the lack of the big image sensor needed to produce ‘natural’ bokeh, smartphones rely on software to artificially create and simulate the blurred background effect. Apple introduced this for photos with ‘Portrait Mode’ on the iPhone 7 Plus back in 2016, but it’s taken several years of advances to get it working on video – even if they were beaten to the punch by third-party apps like Focos Live.

A photo of me taken on the standard wide lens of the iPhone 11 Pro with no blur.
This photo from the iPhone 11 Pro in Portrait Mode shows the blurred background generated. 

FaceTime Portrait Mode

If you’re lucky enough to be able to afford a model from the iPhone 13 series, then you’ll have bokeh for video, albeit at 30fps – which suits the requirements for footage shot in North America but is not what’s needed for TV in the UK and much of the rest of the world.
But if you have an iPhone XS or newer, then iOS 15 does offer a Portrait Mode option on video on FaceTime, as well as in a few select apps which already offer a blurred background feature, such as Instagram, Snapchat and Zoom.
Open the app you want to use and then Control Centre; a new ‘effects’ tile is visible and once pressed, you can toggle ‘Portrait’ on or off.
Or you can do it straight from within FaceTime itself:

The icon in the top left can turn the blurred background on and off.

Bokeh video beyond FaceTime?

This could all get really interesting if developers of professional video filming apps like FilmicPro or MoviePro are able to bring this functionality into their apps, giving bokeh to iPhones at the preferred 25fps or even 50fps.
But if it can only be done with the 13 series and not these older models, then journalists unable to acquire the very latest devices won’t be able to benefit from this innovation fully.
As for how it could benefit journalists, adding depth of field to footage would help close the gap further with the results from ‘big’ cameras. Purists, though, may still rail against the artificial, computer-generated aspect and the fact that it can be adjusted in post. Equally, early results I’ve seen have on occasion been less than impressive, with the blur failing altogether or being inconsistent, especially around the edges of clothing and hair – not a failing of “big” cameras.

Audio options in FaceTime

The audio for FaceTime calls also has new features which may too get incorporated into other apps in the coming weeks. Available via Control Centre again, users will see a new ‘Mic Mode’ tile which when pressed gives three choices: standard, voice isolation and wide spectrum.
The first should need little explanation; the second tries to suppress ambient noise as best it can, to focus better on the person speaking; the last does the opposite, incorporating environmental sounds and other people speaking in the background, in case you want the person you’re on a FaceTime call with to be able to hear everything that’s happening in your surroundings.
Is this useful for journalists? While it’s never a bad thing for a speaker to have more clarity, the tests I’ve done indicate it’s of limited benefit but that could have been because there was too little or too much ambient noise where I was at the time.
My results echoed those of a colleague who tested it on the other end of a FaceTime call. We could hear the other person better with voice isolation on, although it sounded noticeably processed, almost artificial, in quality. Wide spectrum did indeed boost the background noise.
If there are several people on the same call, then Spatial Audio kicks in (again, not if your device is an iPhone X or older), where the audio sounds like it’s coming from where each person is on the call. This is another one where the clever work of independent developers, taking on the new features and pushing them further in their own apps, could be key.

Other FaceTime features

Before leaving FaceTime, a few other innovations it is getting in iOS 15 are worth mentioning even if they could be viewed less as ‘innovations’ and more ‘catching up with what’s been possible for a while on other cross-platform video calling apps like Zoom, Skype and Facebook Messenger’.
There’s a ‘mute alert’ for those enjoyable moments when someone speaks while their mic is muted. Also, users can now make FaceTime calls to PCs and Android devices, not just to those in the Apple ecosystem, with end-to-end encryption. You can also now invite anyone to a FaceTime call with a link.
One suggestion from Apple is to send a FaceTime link via WhatsApp, but I’m trying to get my head around why anyone would send a FaceTime web link via WhatsApp, encouraging someone to join a FaceTime call…when they could do a video or audio call on WhatsApp itself?
Finally, one big feature touted in Apple’s original ‘here’s what’s in iOS 15’ keynote event won’t be available from day one: Share Play, where you can share a video you’re watching with someone else so you can enjoy it together over FaceTime.

Video playback options

When playing a video embedded on a website, three dots in the bottom right corner signify further options including the new ability to increase the speed at which the video plays, up to two times faster. It can also be slowed down to half-speed if you feel that’s absolutely necessary.

The options for adjusting video playback speed.

Video editing tweaks

For those editing their videos within iOS itself, rather than in any 3rd party app or after transferring the footage to a Mac, one welcome tweak makes this job a bit easier. Previously, editing a video caused it to shrink on the screen; now, tapping the double-headed diagonal arrows will expand the video to full screen so you can see better what it looks like. You can even spread your fingers apart to expand the frame even more.

Videos were small when edited in iOS 14.
iOS 15 makes a video full screen for editing. 

EXIF data

There’s also more information available about videos as well as photos, as iOS 15 incorporates a feature long available via 3rd party apps – the EXIF data.
EXIF stands for Exchangeable Image File Format. Rather than needing to note down separately information about an image or video – such as camera exposure, date/time of capture, and even GPS location – it’s embedded in the media file itself.
Before iOS 15, there was a time-consuming workaround to see the EXIF data, involving transferring an image to Files and then another dozen taps; numerous third party apps could do it too.
But it is all now directly visible within the Photos app (still known to many as ‘the Camera Roll’). Tapping on the (i) under the photo or video, or simply swiping up on it, will show which lens was used, the resolution, the size, ISO, shutter speed, frame rate and more.

How EXIF data is displayed for photos in iOS 15.
How EXIF data is displayed for videos in iOS 15.

The benefits of EXIF data

In addition, it’ll show the name of the app if it was taken with a 3rd party app, and tapping that name will result in all the media captured with that app being shown. You can also access that material another way, by searching for the app’s name.
For journalists, knowing the file size of a video can be beneficial, as the size can give an indication of how long it might take to upload – always bearing in mind there are numerous other factors in play here, such as the speed of the connection.
It’s also worth pointing out that the file size of a video, along with other EXIF data, is already available for videos in the PNg library, by loading a video and tapping the (i).
Whether journalists can use the EXIF feature to verify the date and time when material was captured will depend on the method used to share it. WhatsApp strips the date and time from material, with the result that iOS only shows the date and time of receipt; if it’s uploaded to Dropbox, then downloaded and saved to Photos, then the metadata is retained and visible.
Finally, while the inbuilt EXIF data shows a lot of information, it doesn’t show everything. For example, with a video it omits the bitrate, which can be useful to know as it gives an indication of how much data is in the video – the higher the bitrate, generally the better the quality. Transferring the file to a Mac will reveal a lot more info besides.
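As a back-of-the-envelope check, the two figures the Photos app now surfaces – file size and duration – let you estimate both numbers discussed above yourself. Here’s a minimal sketch in Python; the sample values (a 380 MB, one-minute clip uploaded over a 10 Mbps connection) are hypothetical, purely for illustration:

```python
# Rough estimates from the file size and duration shown in the Photos app.
# Sample figures below are made up for illustration, not taken from a real clip.

def bitrate_mbps(file_size_mb: float, duration_s: float) -> float:
    """Average bitrate in megabits per second: megabytes x 8 bits, over duration."""
    return file_size_mb * 8 / duration_s

def upload_time_s(file_size_mb: float, upload_mbps: float) -> float:
    """Best-case upload time in seconds at a given connection speed."""
    return file_size_mb * 8 / upload_mbps

size_mb, duration_s = 380, 60                        # a one-minute, 380 MB clip
print(round(bitrate_mbps(size_mb, duration_s), 1))   # ~50.7 Mbps average bitrate
print(round(upload_time_s(size_mb, 10)))             # ~304 seconds over 10 Mbps
```

Real uploads will be slower than this best case, of course – connection speed fluctuates and there’s protocol overhead – but it’s a quick sanity check before you hit send.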
Another change relating to photos and videos has come about quite possibly as a direct result of how EXIF data is accessed. On a Live Photo, swiping up used to show the options for adjustments such as looping it or bouncing it back and forth. Now that swiping up reveals the EXIF data, the Live Photo adjustments are instead accessed from a drop-down in the top left corner.

Live Text

Live Text is another change for the XS and newer; if you have an iPhone X or older, you can just read on in envy. Android users will also be reading on with a wry smile as the ‘new’ Live Text feature has long been available on many Android devices.
Go to ‘settings-camera’ and you’ll see a new ‘Live Text’ option. If you don’t want to use it, turn the green light off; but otherwise, turn it on and you’re good to go.
On compatible iPhones, the device can now ‘read’ text in photos, whether ones taken months or years ago and already in the Photos app, or ones you’re about to take with the live camera. The text can be printed or handwritten too.
When you have the camera open, look on your screen to see if the live text icon appears.

If it doesn’t, you might need to move around until it does.
Once it’s visible, you’ll also get yellow brackets around the text that is now interactive. If the text you want to use isn’t within the brackets, move your phone around again until it is.
When ready, tap the Live Text icon and you’ll be able to select all or some of the text.

You’ll then be able to do things like copy it to paste into an email, or tap a phone number to call it, or start an email with the address in the ‘to’ field or even translate text into certain languages.
With photos already taken, the process can be even simpler, depending on the text in question which the phone can “see”. If there’s a phone number or email address, simply tap it to use it; if that doesn’t work, a gentle tap elsewhere on the screen should bring up the Live Text icon, which will make the text in the photo interactive to use as suggested above.
It also works with handwriting, within reason.

The process of using Live Text to scan some terrible handwriting.

When Live Text can be useful

How might journalists use this? It’ll depend on the text in question, but the possibilities are huge. In addition to calling phone numbers or using email addresses as already suggested, you could tap an address to get directions to it. If there’s a time and date, tap it to add it to your calendar.
Or you might have been given a document and you need to use the text from it. Use the Live Text option to scan the words and you can instantly drop the text into an email rather than laboriously typing it out yourself – once you’ve checked it hasn’t missed out any words, such as ‘not’ from ‘my client will be pleading not guilty to all the charges’.

Visual Lookup

Staying with new tricks that can be done with photos, your device should soon be able to give you more information about what is actually in them through ‘Visual Lookup’.
A very similar feature has already been available on iPhones and Androids via the Google Lens app, but it’s now being incorporated into iOS itself. Having said that, it only works on iPhones inside the USA and, even there, not on an iPhone X or older. But when it’s released beyond the borders of the USA, users with compatible devices should look for a small star on the (i) under photos.
That indicates that the feature is active and you can use it to identify a plant, a landmark or animal.

Visual Look Up correctly identifying Bath Abbey.

Finding photos

Another useful addition for Photos – and one which, unlike Visual Look Up, isn’t limited to certain iPhones in certain locations – is the ability for your iPhone to find text in your images.
This is via the Spotlight search option, which is activated by a quick swipe down on any screen of your device. Once you’ve updated to iOS 15, your iPhone quietly scans all your photos for text they contain; Spotlight can now search for the text you ask it to find, and it’ll display the results.
I’ve found this quietly impressive, with Spotlight always returning the photo with the required text. So if you know you have a photo of a document with someone’s name in it, this is an efficient way of finding it rather than trawling all your photos.

Voice Memos

I’m going to omit some other photo and video changes as I don’t think they’re very journalistic (such as the Memories feature for content in your camera roll) so I’ll end this mojo-centric section with a sentence or two about Voice Memos, the iPhone’s audio recording app which now has a feature to skip silences on playback.
It does an OK job from what I’ve found, even if it’s a rather blunt and artificial way of shortening a recording. It also doesn’t work at all as an editing tool, because when you share audio with the silences skipped (for example by AirDrop or email), the recipient gets the audio with all the silences back in, as if you’ve done nothing to it. But if the length of audio is still too much for you, then iOS 15 lets you play it back at up to twice the speed.

Changes to Safari

Muscle memory can be important for many smartphone users: you know where certain apps are on your screen or you know exactly where the ‘reply’ button is on your favourite social media app. But hang on to your hats as iOS 15 brings in a big change to Safari, the main web browser on iPhones, meaning your muscles may have to relearn everything.
The URL address bar – where you enter a website’s address or a search term – has moved and now defaults to being at the bottom of the screen. For as long as anyone can remember, it’s been at the top.

In iOS 14, the URL bar was at the top but in iOS 15 it is at the bottom.

The thinking is that with so many of us having phones with larger screens, the address bar at the top was a strain to reach given our hands haven’t grown to match.
So, moving it lower down the screen makes it easier to access. My wife is admittedly something of a small sample size, but when I showed her Safari on iOS 15 (yes, the evenings just whizz by at our house) she immediately spotted the repositioned address bar and commented on how much easier it made it to use.
But even after using the beta of iOS 15 for several months, my fingers still twitch automatically towards the top of my screen.
All hope is not lost though if you want to return to things as they were, as there are two ways to do this: either tap the aA on the address bar itself and then ‘Show Top Address Bar’, or navigate your way through ‘settings-safari-single tab’.

How to move the bar back to the top. 

That these options even exist is a concession by Apple as initial versions of Safari in iOS 15 had the bar at the bottom, like it or not. Such was the outcry that Apple moved enough to allow users to move the bar, even if the change wasn’t fully abandoned.
But the point remains that there’s nothing to tell users after they’ve upgraded that they can in fact return Safari to the top of the screen and I predict there’s going to be a lot of confusion over this change.
If you do like the new placement in Safari, you’ll gain another feature that’s missing when the bar is at the top: the option to swipe left and right between open tabs.

Tab groups in Safari

One new feature within Safari which many journalists could find useful is called a “tab group”.
Let’s say you’re working on a court case and you have numerous pages open relating to it; but you’re also planning a dinner party for friends and you have several pages of recipes open; and you’re also thinking ahead to a holiday and you’ve lots of hotel websites open. Instead of all these pages being jumbled up together, you can create a tab group and put only one set of pages into that group, not the others.
When you want to access just the court case pages, tap to open that group and they’ll all be accessible. It’s a bit like bookmarking a website but more efficient as all the tabs open as soon as you swap to the group.

Safari Extensions

Mac users have had extensions for Safari for years. These powerful little add-ons extend (hence the name) what can happen in Safari, and they’re now available for iOS. Once you’ve given an extension permission to interact with websites, how you use them to benefit your journalism will depend on the ones you install. 

Introducing Focus

Whether you just want to get on with your work or want to prevent phone calls interfering while you’re filming something, Do Not Disturb has long been a failsafe.
But now there’s a super-charged DND, known as Focus. It has replaced the DND tile in Control Centre and also within Settings.
Additionally, there’s a lot more you can do with it although it’s worth pointing out here that you don’t HAVE to use these new features. If DND was enough for you, just turn it on as before.
But for the more adventurous, you can do a lot more now as you’re presented with four default Focuses (Foci?) which can each be configured to your liking – and you can also make your own entirely new Focusessses.

The default Focus options all users have access to.

Once you’ve activated a Focus on one device, it syncs across to all devices with the same Apple ID. You could set up the “personal” one so friends and family can still send you notifications, while work would only let selected colleagues do that.
The fact you’re in a Focus can be shared with others so when they message you, the sender should understand why you’re not replying and that you’re not actually ignoring them. That may not be enough to placate and buy off a stressed output editor on all occasions though.

How a Focus can tell someone you’re silencing their notifications, and how they see that information.

Breaking through a Focus

If they really insist they need to be able to contact you at all times, you can tweak things so that their (but only their) notifications are allowed through. The same can be done with apps, as ones you choose can still send their notifications.
If someone hasn’t been put on that whitelist, then they have the option to tap “notify anyway” which bursts through a Focus – but it feels like this should only be used sparingly as doing it too often or unnecessarily could easily cause annoyance or offence.

The ‘notify anyway’ option could prove handy.

Time Sensitive notifications

Things can even be taken a stage further, in a potentially confusing way. There’s an additional setting called “time sensitive” where any app not on the allowed list is still allowed to send notifications marked as ‘time sensitive’, such as an appointment in your calendar. But, as the image below shows, when the first one of these comes through, you are offered control over whether you actually want these or not.

‘Time sensitive’ notifications can still be shown, even when in a Focus.

Focus and journalists

Where it can get really useful for journalists and others is the fact that with a Focus turned on, entire pages of apps can be temporarily hidden.
This means that if your iPhone is organised enough to have all personal or non-work apps on one screen and work ones on another, you can set up a Focus so that all the tempting personal apps just simply aren’t available to you on your device, leaving you to…focus on the work-related task in hand with the apps you do need access to.
But all is not lost – if temptation is too much to resist, all your apps are still accessible via the App Library.
There’s also a way that a Focus could provide a level of security for journalists caught in a tricky situation – although it’ll need a bit of forward planning and I can’t promise it’ll be 100% certain to keep you safe.
The scenario would be that you’re reporting from a location where police officers might be keen to have a look at your device. You could have a Focus called something bland like ‘DayTime’ and set it up such that when active, the screen on your iPhone which has all your reporting and communication apps, as well as your email and Photos, isn’t visible and instead your device only shows less problematic ones.
When you see someone in a uniform with a gun approaching, quickly activate DayTime and they’ll only initially see the innocuous apps. If someone with a bit more knowledge spends more time looking through your device, the truth may soon become apparent, especially as all your apps remain a swipe away in the App Library – so please don’t seek retribution on me once you’re eventually released from a tiny prison cell.
A Focus can also be triggered automatically based on location, with your device suggesting what it thinks is the most appropriate. This one was flagged up to me when my iPhone detected I’d come home after being out:

A geo-located prompt about a Focus.

For power-user journalists, you can even trigger a Focus when you open an app.
I’ve set my iPhone up to do this. Combined with a personal automation, which triggers things like putting it into Airplane Mode and increasing the screen brightness to 100%, this means that I only need to open FilmicPro and I can use the app to gather content knowing I shouldn’t get any interruptions.
Other options for triggering a Focus are ‘at a certain time’, so if you have regular planning meetings each morning for an hour from 0800, the particular Focus will automatically come on at that time for that long; or ‘at a location’ so it’ll be triggered when you arrive at work before deactivating once you leave.
If all of this seems too much, then just carry on using Do Not Disturb as before.

Notifications

Related to Focus are Notifications and these get tweaked too, with changes to how they look and also a new option of having them all delivered en masse at 8am and 6pm or other times to suit you. While I can see that some users may benefit from only seeing notifications at a particular time, it feels that news journalists in particular may need to know them a bit sooner than that.

Hide My Email

Each iOS release contains very technical bug fixes and security updates which take place in the background and over which you have no control. But others are more openly available to users and iOS 15 has its fair share of these. 
Giving out your email address to all and sundry, or using it when you sign up for apps or websites, may be something you’re totally fine with but it’d be understandable if many journalists would be less than comfortable doing this.
This is where ‘Hide My Email’ in iOS 15 could be useful, although it’s important to point out that it’s not available to all – only to users who pay for iCloud storage, through a new service called iCloud+. If you’re still using the free, basic level of 5GB storage then you can’t use Hide My Email, but if you pay, you’re automatically upgraded.
For those who are on iCloud+, you’ll get offered the option to use a randomly-generated email address which then links directly to your own one.

A randomly-generated email address via Hide My Email.

The app or retailer never gets to see who you really are, yet you receive their emails and are able to use their services. You can also actively create your own unique email address by going to ‘Settings – Apple ID – iCloud – HideMyEmail’ on your iPhone.
Staying with emails, but something which applies to all users not just those with iCloud+ is ‘Mail Privacy Protection’.
When you first open the default email app after updating to iOS 15, you’ll see this screen:

The options for Mail Privacy Protection.

Mail Privacy Protection

The intention here is to give users some privacy when it comes to how companies and advertisers track you when you interact with their emails. Usually, tracking pixels and other identifiers are sent when you open the email, passing on information about where you are, the time and your IP address.
With Mail Privacy Protection activated, your IP address is hidden and all content is loaded privately in the background, giving you an extra layer of privacy.

Private Relay

Back to a feature only available to those with iCloud+, but one which journalists may benefit from using: Private Relay. It’s available in iOS 15 even though Apple still describes it as being in “beta”.
Private Relay is like a Virtual Private Network (a VPN), in that it obscures your IP address so you’re able to browse sites which might otherwise be inaccessible to you for example ones restricted by geography or content. You can choose to increase your anonymity by setting it to use the country and time zone you’re in, or have it maintain your general location so you can still see things which are local like restaurants and shops. 
Private Relay isn’t quite as powerful as a VPN though, so don’t plan on using it to watch Netflix US. Instead, it encrypts your browsing on sites without that little padlock on the URL bar, as well as hiding your real IP address.
This means that the site you’re looking at won’t know it’s you and nor will Apple. It works like this: your traffic is sent to an Apple server and then the IP address, which can be used to locate you, is removed. Your request for a website is then sent to another server where it’s given a temporary IP address before going on to the website you’ve requested. This should mean websites can’t build up a profile about your browsing history and therefore build a profile of you more generally. There are more secure ways of doing all this and so if you do really need proper protection and anonymity, then I wouldn’t rely on Private Relay. Being in beta, it isn’t totally reliable which isn’t ideal given what it is trying to do. I’ve found that Private Relay doesn’t work at all on my home wifi and only functions on 4G.

Record App Activity

iOS 14 brought a change whereby a small green light would be visible when an app was using your camera, and an orange one when your mic was being used. iOS 15 goes further, with a new tool which will show which apps and sites are accessing a much wider range of features and data.
It’s somewhat buried, but you can find it in ‘Settings-Privacy-Record App Activity’ and then it needs to be turned on.
After a week of logging, you'll be able to access a summary of when your apps did what, as well as what those apps did with your data and the sites they subsequently contacted. This summary though is another feature which was showcased when iOS 15 was unveiled but has yet to make it into the hands of users.
Finally, a few changes which don’t fall easily into a particular category.

Longer Siri

If you’re the kind of journalist who likes to dictate your copy or script, then iOS 15 has removed the limit of how long Siri will listen to you before cutting out. It was capped at 60 seconds but now can keep going well beyond that. Tests I did showed Siri was fine at three minutes, although talking at speed did continue to be a challenge for it. It’s possible that this function could work as an automatic transcription service – open a note, turn on Siri and let it transcribe a speech or a press conference. I think it would be wise to also have a recording running at the same time, in case the transcription fails for some reason and also to give you the option to check it for accuracy.

Find My iPhone

Having a fully charged device is a prerequisite for any journalist, but if you're the type who occasionally lets theirs run fully down and then mislays it, there's renewed hope for you, as long as you have an iPhone 11 or newer (although not an SE 2020).
With iOS 15, you can still trace your device even when it's out of power, because ‘out of power’ now means something slightly different: your device remains in a very low-power state. This means any nearby iOS device can see the Bluetooth signal it emits and send back its location to help you find it.

Notify When Left Behind

Some people are fortunate enough to have more than one iPhone and if they’re careless enough to forget to take one with them, a new alert will flag that up on their other devices. Called ‘Notify When Left Behind’, it’ll push a notification to the device you have remembered to take with you, as long as it’s on the same Apple ID and you’ve set the service up within the Find My app.

The new “Notify When Left Behind” alert (I am very forgetful).

If and when you get this alert, go to ‘Settings-Apple ID-Find My’ and then into ‘Find My iPhone’, and ensure all three toggles are on, as this ensures the new feature is active. Remember though, you can't do this after the fact, so it might even be advisable to turn this setting on right now.

At-a-glance information

One frustration of iOS 14 for me was that when my device was in Do Not Disturb, iPhones with a notch like the X or 11 wouldn’t show the tell-tale crescent moon on the main screen. This meant I had no immediate visual confirmation of the status of my device. On devices without a notch, there was space for the moon.

A notch-less iPhone wouldn’t show the ‘Do Not Disturb’ crescent moon icon in iOS 14.

But in iOS 15, the icon is visible whichever Focus you’re in and I think that’s a useful improvement.

An iPhone in Work, Filming, Sleep and Do Not Disturb focus.

Bigger text where you want it

If you wanted larger text on previous versions of iOS, you either had to enable the feature for EVERYTHING on your device or not at all. iOS 15 lets you do it per app.
Go to Control Centre in Settings and enable the “text size” option. Now, when you’re in an app where you need to adjust the size, slide to open the Control Centre panel and then press and hold on the aA icon.
In the bottom left it’ll give the name of the app currently open under Control Centre, as well as showing a slider to increase or decrease the font size.

Conclusion

These are the useful and interesting changes I’ve found from beta testing iOS 15 over the last few months. You might find others you like (or dislike) based on how you yourself use your device after you’ve upgraded. Or you may feel, having read this, that you’re happy with what iOS 14 can do and you’ll be fine only taking the bug fixes offered by Apple. For the first time ever, that choice is open to you.

#45 The Smartphone Camera Exposure Paradox — 11. May 2021


Ask anyone about the weaknesses of smartphone cameras and you will surely find that people often point towards a phone's low-light capabilities as the, or at least one of its, Achilles heels. When you are outside during the day it's relatively easy to shoot some good-looking footage with your mobile device, even with budget phones. Once it's darker or you're indoors, things get more difficult. The reason for this is essentially that the image sensors in smartphones are still pretty small compared to those in DSLMs/DSLRs or professional video/cinema cameras. Bigger sensors can collect more photons (light) and produce better low-light images. A so-called “Full Frame” sensor in a DSLM like Sony's Alpha 7 series has a surface area of 864 mm²; a common 1/2.5” smartphone image sensor has only 25 mm².

So why not just put a huge sensor in a smartphone? While cameras in smartphones have undeniably become a very important factor, the phone is still very much a multi-purpose device and not a single-purpose one like a dedicated camera – for better or worse. That means there are many things to consider when building a phone. I doubt anyone would want a phone with a form factor that doesn't fit in a pocket, and the flat form factor makes it difficult to build proper optics around larger sensors. Larger sensors also consume more power and produce more heat, not exactly something desirable.

If we are talking about smartphone photography from a tripod, some of the missing sensor size can be compensated for with long exposure times. The advancements in computational imaging and AI have also led to dedicated and often quite impressive photography “Night Modes” on smartphones. But very long shutter speeds aren't really an option for video, as any movement appears extremely blurred. And while today's chipsets can already handle supportive AI processing for photography, the far more resource-intensive processing needed for videography is yet a bridge too far.
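To put the sensor-size gap into photographic terms, here's a rough back-of-envelope sketch in Python. It treats sensor surface area as a proxy for light gathering, which is a simplification (it ignores lens speed and pixel design), but it shows why the gap between 864 mm² and 25 mm² matters so much:

```python
import math

def sensor_advantage_stops(area_a_mm2, area_b_mm2):
    """Approximate light-gathering advantage of sensor A over sensor B in
    photographic stops (each stop doubles the light), assuming equal lens
    speed and comparable sensor tech - a deliberate simplification."""
    return math.log2(area_a_mm2 / area_b_mm2)

full_frame = 864    # mm², "Full Frame" sensor as in Sony's Alpha 7 series
phone_sensor = 25   # mm², common 1/2.5" smartphone sensor

print(round(sensor_advantage_stops(full_frame, phone_sensor), 1))  # 5.1
```

In other words, all else being equal, the full-frame camera has roughly a five-stop head start in the dark before the phone even starts raising its ISO.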
So despite the fact that the latest developments signal we're about to experience a considerable bump in smartphone image sensor sizes (Sony and Samsung are about to release 1-inch/almost-1-inch image sensors for phones), one could say that most if not all smartphone cameras (still) have a problem with low-light conditions. But you know what? They also have a problem with the exact opposite: very bright conditions!

If you know a little bit about how cameras work and how to set the exposure manually, you have probably come across something called the “exposure triangle”. The exposure triangle contains the three basic parameters that let you set and adjust the exposure of a photo or video on a regular camera: shutter speed, aperture and ISO. In more general terms you could also say: time, size and sensitivity. Shutter speed signifies the amount of time that the still image or a single frame of video is exposed to light, for instance 1/50 of a second. The longer the shutter speed, the more light hits the sensor and the brighter the image will be. Aperture refers to the size of the iris opening through which the light passes before it hits the sensor (or, way back when, the film strip); it's commonly measured in f-stops, for instance f/2.0. The bigger the aperture (= the SMALLER the f-stop number), the more light reaches the sensor and the brighter the image will be. ISO (or “gain” on some dedicated video cameras) finally refers to the sensitivity of the image sensor, for instance ISO 400. The higher the ISO, the brighter the image will be. Most of the time you want to keep the ISO as low as possible because higher sensitivity introduces more image noise.
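The neat thing about the triangle is that the three parameters trade off against each other in fixed steps. The standard exposure value formula makes this concrete; the little sketch below is just an illustration of that textbook formula, not anything smartphone-specific:

```python
import math

def exposure_value(f_stop, shutter_s, iso=100):
    """Standard exposure value: EV = log2(N² / t) - log2(ISO / 100).
    A one-stop change in any of the three parameters shifts EV by exactly 1."""
    return math.log2(f_stop ** 2 / shutter_s) - math.log2(iso / 100)

base = exposure_value(2.0, 1 / 50, iso=100)

# Doubling the exposure time (letting in more light) is exactly one stop:
print(round(exposure_value(2.0, 1 / 25, iso=100) - base, 6))   # -1.0
# Quadrupling the ISO is exactly two stops:
print(round(exposure_value(2.0, 1 / 50, iso=400) - base, 6))   # -2.0
```

That interchangeability is what a camera operator exploits when juggling the three values, and, as the next section shows, it's exactly what breaks down when one corner of the triangle is welded shut.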

So what exactly is the problem with smartphone cameras in this respect? Well, unlike dedicated cameras, smartphones don't have a variable aperture: it's fixed and can't be adjusted. OK, there actually have been a few phones with variable aperture. Most notably, Samsung had one on the S4 Zoom (2013) and K Zoom (2014), introduced a dual-aperture approach with the S9/Note9 (2018), held on to it for the S10/Note10 (2019) but dropped it again for the S20/Note20 (2020). As you can see from the very limited selection though, this has been more of an experiment.

The fixed aperture means that the exposure triangle for smartphone cameras only has two adjustable parameters: shutter speed and ISO. Why is this problematic? When there's movement in a video (either because something moves within the frame or the camera itself moves), we as an audience have become accustomed to a certain degree of motion blur, which is related to the shutter speed used. The rule of thumb applied here says: double the frame rate. So if you are shooting at 24fps, use a shutter speed of 1/48s; if you are shooting at 25fps, use a shutter speed of 1/50s; 1/60s for 30fps etc. This suggestion is not set in stone and in my humble opinion you can deviate from it to a certain degree without it becoming too obvious for casual, non-pixel-peeping viewers. But if the shutter speed is very slow, everything begins to look like a drug-induced stream-of-consciousness experience, and if it's very fast, things appear jerky and shutter speed becomes stutter speed.

So with the aperture being fixed and the shutter speed set at a “recommended” value, you're left with ISO as an adjustable exposure parameter. Reducing the sensitivity of the sensor is usually only technically possible down to an ISO between 50 and 100, which will still give you a (heavily) overexposed image on a sunny day outside.
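The “double the frame rate” rule of thumb boils down to one line of arithmetic:

```python
def recommended_shutter(fps):
    """'Double the frame rate' rule of thumb: shutter time = 1 / (2 × fps)."""
    return 1 / (2 * fps)

for fps in (24, 25, 30):
    print(f"{fps} fps -> 1/{2 * fps} s")
# 24 fps -> 1/48 s
# 25 fps -> 1/50 s
# 30 fps -> 1/60 s
```

Nothing more to it, but it's the value that gets locked in once you commit to natural-looking motion blur, which is why ISO ends up carrying the whole exposure load on a phone.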
So here’s our “paradox”: Too much available light can be just as much of an issue as too little when shooting with a smartphone.

What can we do about the two problems? Until significantly bigger smartphone image sensors or computational image enhancement for video arrive, the best way to tackle the low-light challenge is to provide your own additional lighting or look for more available light, be it natural or artificial. Depending on your situation, this might be relatively easy or downright impossible. If you are trying to capture an unlit building at night, you will most likely not have a sufficient number of ultra-bright floodlights at hand. If you are interviewing someone in a dimly lit room, a small LED might just provide enough light to keep the ISO at a level without too much image noise.

Clip-on variable ND filter

As for the too-much-light problem (which ironically gets even worse with the bigger sensors setting out to remedy the low-light problem): try to pick a less sun-drenched spot, shoot with a faster shutter speed if there is no or little action in the shot or, and this might be the most flexible solution, get yourself an ND (neutral density) filter that reduces the amount of light passing through the lens. While some regular cameras have built-in ND filters, this feature has yet to appear in any smartphone, although OnePlus showcased a prototype phone last year that had something close to a proper ND filter, using a technology called “electrochromic glass” to hide the lens while still letting (less) light pass through (check out this XDA Developers article). So until this actually makes it to market and proves to be effective, the filter has to be an external one that is either clipped on or screwed on if you use a dedicated case with a corresponding filter thread. You also have the choice between a variable and a non-variable (fixed density) ND filter. A variable ND filter lets you adjust the strength of its filtering effect, which is great for flexibility, but it also has some disadvantages like the possibility of cross-polarization. If you want to learn more about ND filters, I highly recommend checking out this superb in-depth article by Richard Lackey.
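How strong an ND filter do you need? ND filters are commonly labelled with a number that is simply 2 raised to the number of stops they cut (ND8 = 3 stops, ND64 = 6 stops). With hypothetical example figures, picking a filter is a one-line calculation:

```python
import math

# Hypothetical numbers: bright sun forces, say, a 1/3200 s shutter at base ISO,
# but the motion-blur rule of thumb calls for 1/50 s when shooting 25 fps.
wanted_shutter = 1 / 50
forced_shutter = 1 / 3200

stops_to_cut = math.log2(wanted_shutter / forced_shutter)
nd_number = 2 ** stops_to_cut   # common ND labelling: ND64 cuts 6 stops

print(f"Need about {stops_to_cut:.0f} stops, i.e. roughly an ND{nd_number:.0f} filter")
```

A variable ND typically spans a range of a few stops, which is exactly why it's the flexible choice when the light keeps changing.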

So what’s the bigger issue for you personally? Low-light or high-light? 

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#44 Split channels (dual mono) audio from the Rode Wireless Go II in LumaFusion — 4. May 2021


Rode just recently released the Wireless GO II, a very compact wireless audio system I wrote about in my last article. One of its cool features is that you can feed two transmitters into one receiver, so you don't need two audio inputs on your camera or smartphone to work with two external mic sources simultaneously. What's even cooler is that you can record the two mics into separate channels of a video file with split-track dual mono audio, so you are able to access and mix them individually later on. This can be very helpful if you need to make some volume adjustments or eliminate unwanted noise from one mic that would otherwise just be “baked in” with a merged track. There's also the option to record a -12dB safety track into the second channel when you are using the GO II's “merged mode” instead of the “split mode”; this can be a lifesaver when the audio of the original track clips because of loud input.

If you use a regular camera like a DSLM, it's basically a given that you can record in split-track dual mono, and it also isn't rocket science to access the two individual channels in a lot of desktop editing software. If you are using the GO II with a smartphone and want to finish the edit on mobile afterwards, it's a bit more complicated.

First off, if you want to make use of split channels or the safety channel, you need to be able to record a video file with dual track audio, because only then do you have two channels at your disposal, two channels that are either used for mic 1 and mic 2 or mic 1+2 combined and the safety channel in the case of the Wireless Go II. Most smartphones and camera apps nowadays do support this though (if they support external mics in general). The next hurdle is that you need to use the digital input port of your phone, USB-C on an Android device or the Lightning port on an iPhone/iPad. If you use the 3.5mm headphone jack (or an adapter like the 3.5mm to Lightning with iOS devices), the input will either create single channel mono audio or send the same pre-mixed signal to both stereo channels. So you will need a USB-C to USB-C cable for Android devices (Rode is selling the SC-16 but I also made it work with another cable) and a USB-C to Lightning cable for iOS devices (here the Rode SC-15 seems to be the only compatible option) to connect the RX unit of the GO II to the mobile device. Unfortunately, such cables are not included with the GO II but have to be purchased separately. A quick note: Depending on what app you are using, you either need to explicitly choose an external mic as the audio input in the app’s settings or it just automatically detects the external mic.
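To make the “two channels at your disposal” idea concrete, here's a hedged illustration, not how the GO II or any editing app works internally: assuming you've exported the recording's audio as a 16-bit stereo PCM WAV file (phone videos actually carry compressed AAC audio, so this is purely illustrative), a few lines of Python's standard wave module can pull the two mics apart. The function name and file paths are mine:

```python
import array
import wave

def split_stereo_wav(src_path, left_path, right_path):
    """Write the left and right channels of a 16-bit stereo WAV to two mono
    files - the same separation an editor performs when it lets you pick
    one channel of a dual mono recording."""
    with wave.open(src_path, "rb") as src:
        assert src.getnchannels() == 2 and src.getsampwidth() == 2
        framerate = src.getframerate()
        # 16-bit stereo PCM is interleaved: L, R, L, R, ...
        samples = array.array("h", src.readframes(src.getnframes()))
    for out_path, offset in ((left_path, 0), (right_path, 1)):
        with wave.open(out_path, "wb") as dst:
            dst.setnchannels(1)
            dst.setsampwidth(2)
            dst.setframerate(framerate)
            dst.writeframes(array.array("h", samples[offset::2]).tobytes())
```

In practice you'd of course do this inside your editor, but it shows why a genuine dual-channel recording keeps mic 1 and mic 2 (or the main and safety track) fully independent until you decide how to mix them.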

Once you have recorded a dual mono video file with separate channels and want to access them individually for adjustments, you also need the right editing software that allows you to do that. On desktop, it's relatively easy with the common prosumer or pro video editing software (I personally use Final Cut Pro), but on mobile devices there's currently only a single option: LumaFusion, so far only available for iPhone/iPad. I briefly thought that KineMaster (which is available for both Android and iOS) could do it as well because it has a panning feature for audio, but it's not implemented in a way that actually does what we need in this scenario.

So how do you access the different channels in LumaFusion? It’s actually quite simple: You either double-tap your video clip in the timeline or tap the pen icon in the bottom toolbar while having the clip selected. Select the “Audio” tab (speaker icon) and find the “Configuration” option on the right. In the “Channels” section select either “Fill From Left” or “Fill From Right” to switch between the channels. If you need to use both channels at the same time and adjust/balance the mix you will have to detach the audio from the video clip (either triple-tap the clip or tap on the rectangular icon with an audio waveform), then duplicate the audio (rectangular icon with a +) and then set the channel configuration of one to “Fill From Left” and for the other to “Fill From Right”.

Here’s hoping that more video editing apps implement the ability to access individual audio tracks of a video file and that LumaFusion eventually makes it to Android.


#41 Sharing VN project files between iPhone, iPad, Mac, Android (& Windows PC) — 23. March 2021


As I have pointed out in two of my previous blog posts (What's the best free cross-platform mobile video editing app?, Best video editors / video editing apps for Android in 2021), VN is a free and very capable mobile video editor for Android and iPhone/iPad, and the makers recently also launched a desktop version for macOS. Project file sharing takes advantage of that and makes it possible to start your editing work on one device and finish it on another. So for instance, after having shot some footage on your iPhone, you can start editing right away using VN for iPhone but transfer the whole project to your iMac or MacBook Pro later to have a bigger screen and mouse control. It's also a great way to free up storage space on your phone, since you can archive projects in the cloud, on an external drive or on a computer and delete them from your mobile device afterwards. Project sharing isn't a one-way trick, it also works the other way around: you start a project using VN on your iMac or MacBook Pro and then transfer it to your iPhone or iPad because you have to go somewhere and want to continue your project while commuting. And it's not all about Apple products either: you can also share from or to VN on Android smartphones and tablets (so basically every smartphone or tablet that's not made by Apple). What about Windows? Yes, this is also possible, but you will need to install an Android emulator on your PC; I won't go into the details of that procedure in this article as I don't own a PC to test with. But you can check out a good tutorial on the VN site here.

Before you start sharing your VN projects, here’s some general info: To actively share a project file, you need to create a free account with VN. Right off the bat, you can share projects that don’t exceed 3 GB in size. There’s also a maximum limit of 100 project files per day but I suppose nobody will actually bump into that. To get rid of these limitations, VN will manually clear your account for unlimited sharing within a few days after filling out this short survey. For passive sharing, that is when someone sends you a project file, there are no limitations even when you are not logged in. As the sharing process is slightly different depending on which platforms/devices are involved I have decided to walk you through all nine combinations, starting with the one that will probably be the most common. 

Let me quickly explain two general things ahead which apply to all combinations so I don’t have to go into the details every time:

1) When creating a VN project file to share, you can do it as “Full” or “Simple”. “Full” will share the project file with all of its media (complete footage, music/sound fx, text), “Simple” will let you choose which video clips you actually want to include. Not including every video clip will result in a smaller project file that can be transferred faster.

2) You can also choose whether or not you want the project file to be “Readonly”. If you choose “Readonly”, saving or exporting will be denied – this can be helpful if you send it to someone else but don’t want this person to save changes or export the project.

All of the sharing combinations I will mention now are focused on local device-to-device sharing. Of course you can also use any cloud service to store/share VN project files and have them downloaded and opened remotely on another device that runs the VN application.

iPhone/iPad to Mac

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon at the bottom), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Now choose “AirDrop” and select your Mac. Make sure that AirDrop is activated on both devices.
  • Depending on your AirDrop settings you now have to accept the transfer on the receiving device or the transfer will start automatically. By default, the file will be saved in the “Downloads” folder of your Mac.
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open project”.

iPhone/iPad to iPhone/iPad

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now choose “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Select the iPhone/iPad you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now select “Open project”.

iPhone/iPad to Android

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the iOS/iPadOS share menu will pop up.
  • Now you need to transfer the project file from the iPhone/iPad to the Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both iPhone/iPad and Android.
  • So choose SendAnywhere from the share menu. A 6-digit code is generated.
  • Open SendAnywhere on your Android device, select the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. 
  • The Android “Open with” menu will open, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Finally choose “Open Project”.

Mac to iPhone/iPad

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select your iPhone or iPad. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now choose “Open Project”.

Mac to Mac

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select the Mac you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • By default the VN project file will be saved in the “Downloads” folder of the receiving Mac.
  • Open VN on the receiving Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.

Mac to Android

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and choose a way to send it to your Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both macOS and Android.
  • So using SendAnywhere on your Mac, drag the VN project file into the app. You will see a 6-digit code. Open SendAnywhere on your Android, choose the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then on the project file.
  • The Android “Open with” menu will pop up, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Choose “Open Project”.

Android to Mac

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the Android share sheet will pop up.
  • Now you need to transfer the project file from your Android device to your Mac. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and macOS.
  • So choose SendAnywhere from the share menu. A 6-digit code is generated.
  • Open SendAnywhere on your Mac, select the “Receive” tab and enter the code.
  • Unless you have created a custom download folder for your preferred file transfer app, the VN project file will be saved to the “Downloads” folder on your Mac or will be available in your cloud storage.
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.

Android to Android

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • From the Android share sheet, choose Android’s integrated wifi sharing option Nearby Share (check this video on how to use Nearby Share if you are not familiar with it) and select the device you want to send it to. Make sure Nearby Share is activated on both devices.
  • After accepting the file on the second device, the transfer will start.
  • Once it is finished, choose “VN/Import to VN” from the pop up menu. Importing into VN will start. 
  • Finally choose “Open Project”.

Android to iPhone/iPad

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated. Afterwards, the Android share sheet menu will pop up.
  • Now you need to transfer the project file from the Android device to the iPhone/iPad. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and iPhone/iPad.
  • So choose SendAnywhere from the Share Sheet. A 6-digit code is generated.
  • Open SendAnywhere on your iPhone/iPad, select the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. Now tap on the share icon in the top right corner and choose VN from the list. The project file will be imported into VN.
  • Finally choose “Open Project”.


DISCLOSURE NOTE: This particular post was sponsored by VN. It was however researched and written all by myself.

#38 How to anonymize persons or objects in videos on a smartphone – new app makes things a lot easier! — 16. January 2021


There are times when, for reasons of privacy or even a person's physical safety, you want to make certain parts of a frame in a video unrecognizable so as not to give away someone's identity or the place where you shot the video. While it's fairly easy to achieve something like that for a photograph, it's a lot more challenging for video, for two reasons: 1) You might have a person moving around within a shot, or a moving camera which constantly alters the location of the subject within the frame. 2) If the person talks, he or she might also be identifiable just by his/her voice. So are there any apps that help you anonymize persons or objects in videos when working on a smartphone?

KineMaster – the best so far

Up until recently, the best app for anonymizing persons and/or certain parts of a video in general was KineMaster, which I already praised in my last blog post about the best video editing apps on Android (it's also available for iPhone/iPad). While it's possible to use just about any video editor that allows for a resizable image layer (let's say just a plain black square or rectangle) on top of the main track to cover a face, KineMaster is the only one with a dedicated blur/mosaic tool for this use case. Many other video editing apps have a blur effect in their repertoire, but the problem is that this effect always affects the whole image and can't be applied to only a part of the frame. KineMaster on the other hand allows its Gaussian Blur effect to be adjusted in size and position within the frame.

To access this feature, scroll to the part of the timeline where you want to apply the effect but don't select any of the clips! Now tap on the “Layer” button, choose “Effect”, then “Basic Effects”, then either “Gaussian Blur” or “Mosaic”. An effect layer gets added to the timeline which you can resize and position within the preview window. Even better: KineMaster also lets you keyframe this layer, which is incredibly important if the subject/object you want to anonymize is moving around the frame or if the camera is moving (thereby constantly altering the subject's/object's position within the frame). Keyframing means you can set “waypoints” for the effect's area to automatically change its position/size over time. You can access the keyframing feature by tapping on the key icon in the left sidebar. Keyframes have to be set manually, so it's a bit of work, particularly if your subject/object is moving a lot. If you just have a static shot with the person not moving around much, you don't have to bother with keyframing though.
And as if the adjustable blur/mosaic effect and support for keyframing weren’t good enough, KineMaster also gives you a tool to add an extra layer of privacy: you can alter voices. To access this feature, select a clip in the timeline and then scroll down the menu on the right to find “Voice Changer” – there’s a whole bunch of different effects. To be honest, most of them are rather cartoonish – I’m not sure you want your interviewee to sound like a chipmunk. But there are also a couple of voice changer effects that I think can be used in a professional context.

What happened to Censr?

As I indicated in the paragraph above, a moving subject (or a moving camera) makes anonymizing content within a video a lot harder. You can manually keyframe the blurred area to follow along in KineMaster, but it would be much easier if that could be done via automatic tracking. Last summer, a closed beta version of an iOS app called “Censr” that could automatically track and blur faces was released. It all looked quite promising (I saw some examples on Twitter) but the developer Sam Loeschen told me that “unfortunately, development on censr has for the most part stopped”.

PutMask – a new app with a killer feature!

But you know what? There actually is a smartphone app out there that can automatically track and pixelate faces in a video: it’s called PutMask and is currently only available for Android (there are plans for an iOS version). The app (released in July 2020) offers three ways of pixelating faces in videos: automatically via face-tracking, manually by following the subject with your finger on the touchscreen, and manually by keyframing. The keyframing option is the most cumbersome but might be necessary when the other two don’t work well. The “swipe follow” option is the middle ground – not as time-consuming as keyframing, but manual action is still required. The most convenient approach is of course automatic face-tracking (you can even track multiple faces at the same time!) – and I have to say that in my tests, it worked surprisingly well!

Does it always work? No, there are definitely situations in which the feature struggles. If you are walking around and your face gets covered by something else (for instance because you are passing another person or an object like a tree) even for only a short moment, the tracking often loses you. It even lost me when I was walking around indoors and the lens flare from the light bulb on the ceiling created a visual “barrier” which I passed at some point. And although I would say that the app is generally well-designed, some of the workflow steps and the nomenclature can be a bit confusing. Here’s an example: After choosing a video from your gallery, you can tap on “Detect Faces” to start a scanning process. The app will tell you how many faces it has found and will display a numbered square around each face. If you now tap on “Start Tracking”, the app tells you “At least select One filter”. But I couldn’t find any button or element labeled “filter”. After some confusion I discovered that you need to tap once on the square that is placed over the face in the image – maybe by “filter” they actually mean you need to select at least one face? Now you can initiate the tracking. After the process is finished you can preview the tracking that the app has done (and also dig deeper into the options to alter the amount of pixelation etc.), but to check the actual pixelated video you have to export your project first. While the navigation could/should be made clearer and more intuitive for certain actions, I was quite happy with the results in general. The biggest catch until recently was the maximum export resolution of 720p, but with the latest update released on 21 January 2021, 1080p is also supported. An additional feature that would be great to have in an app with a dedicated focus on privacy and anonymization is the ability to alter/distort the voice of a person, like you can in KineMaster.
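For the curious: the mosaic effect such apps apply over a tracked face is conceptually simple – the region is divided into blocks and each block is flattened to its average colour. A rough Python/NumPy sketch (an illustration of the general technique, not PutMask’s actual code):

```python
import numpy as np

def pixelate_region(frame, x, y, w, h, block=8):
    """Pixelate (mosaic) a rectangular region of an image by flattening
    each block x block cell to its mean colour - the effect anonymization
    apps render over a tracked face."""
    region = frame[y:y+h, x:x+w].astype(float)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            cell = region[by:by+block, bx:bx+block]
            cell[...] = cell.mean(axis=(0, 1))  # one colour per cell
    frame[y:y+h, x:x+w] = region.astype(frame.dtype)
    return frame

# A 32x32 grayscale test "frame" with a gradient; pixelate its centre
frame = np.arange(32 * 32, dtype=np.uint8).reshape(32, 32)
out = pixelate_region(frame.copy(), 8, 8, 16, 16, block=8)
```

The bigger the block size, the coarser (and more anonymous) the mosaic – which is essentially what the pixelation-amount slider in such apps controls.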

There’s one last thing I should address: The app is free to download with all its core functionality, but you only get SD resolution and a watermark on export. For HD/FHD watermark-free export, you need to make an in-app purchase. The IAP procedure is without a doubt the weirdest I have ever encountered: The app tells you to purchase any one of a selection of different “characters” to receive the additional benefits. Initially, these “characters” are just names in boxes: “Simple Man”, “Happy Man”, “Metal-Head” etc. If you tap on a box, an animated character pops up. But only when scrolling down does it become clear that these “characters” represent different amounts of payment with which you support the developer. And if that wasn’t strange enough by itself, the amount you can donate goes up to a staggering 349.99 USD (character “Dr. Plague”) – no kidding! At first, I had actually selected Dr. Plague because I thought it was the coolest-looking character of the bunch. Only when trying to go through with the IAP did I become aware of the fact that I was about to drop 350 bucks on the app! Seriously, this is nuts! I told the developer that I don’t think this is a good idea. Anyway, the amount of money you donate doesn’t affect your additional benefits, so you can just opt for the first character, the “Simple Man”, which costs you 4.69€. I’m not sure why they would want to make things so confusing for users willing to pay, but other than that, PutMask is a great new app with a lot of potential. I will definitely keep an eye on it!

As always, if you have questions or comments, drop them below or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

Download PutMask on Google Play.

#35 Using external microphones with iPhones when shooting video — 1. December 2020

#35 Using external microphones with iPhones when shooting video

I usually don’t follow the stats for my blog, but when I recently checked which articles have been the most popular so far, I noticed that one stuck out by a large margin: the one on using external microphones with Android devices. So I thought, if people are interested in that, why not make an equivalent for iOS, that is, for iPhones? So let’s jump right into it.

First things first: The Basics

A couple of basic things first: Every iPhone has a built-in microphone for recording video that, depending on the use case, might already be good enough if you can position the phone close to your talent/interviewee. Having your mic close to the sound source is key in every situation to get good audio! As a matter of fact, the iPhone has multiple internal mics and uses different ones for recording video (next to the lens/lenses) and pure audio (bottom part). When doing audio-only for radio etc., it’s relatively easy to get close to your subject and get good results. It’s not the best way when recording video though if you don’t want to shove your phone into someone’s face. In this case you can and should significantly improve the audio quality of your video by using an external mic connected to your iPhone – never forget that audio is very important! While the number of Android phone makers that support the use of external mics within their native camera app is slowly growing, there are still many (most?) Android devices out there that don’t support this for the camera app that comes with the phone (it’s possible with basically every Android device if you use 3rd party camera apps though!). You don’t have to worry about this when shooting with the native camera app of an iPhone. The native camera app will recognize a connected external mic automatically and use it as the audio input when recording video. When it comes to 3rd party video recording apps, many of them like Filmic Pro, MoviePro or Mavis support the use of external mics as well but with some of them you have to choose the audio input in the settings so definitely do some testing before using it the first time on a critical job. Although I’m looking at this from a videographer’s angle, most of what I am about to elaborate on also applies to recording with audio recording apps. And in the same way, when I say “iPhone”, I could just as well say “iPad” or “iPod Touch”. 
So there are basically three different ways of connecting an external mic to your iPhone: via the 3.5mm headphone jack, via the Lightning port and via Bluetooth (wireless).

3.5mm headphone jack & adapter

With all the differences between Android and iOS, both in terms of hardware and software, the 3.5mm headphone jack was, for a while, a somewhat unifying factor – that was until Apple decided to drop the headphone jack with the iPhone 7 in 2016. This move became a widely debated topic, surely among the – let’s be honest – comparatively small community of mobile videographers and audio producers relying on connecting external mics to their phones, but also among more casual users because they couldn’t just plug their (often very expensive) headphones into their iPhone anymore. While the first group is definitely more relevant for readers of this blog, the second was undoubtedly responsible for putting the issue on the public debate map. Despite the considerable outcry, Apple never looked back. They did offer a Lightning-to-3.5mm adapter – but sold it separately. I’m sure they have been making a fortune since; don’t ask how many people had to buy it more than once because they lost, misplaced or broke the first one. A whole bunch of Android phone makers obviously thought Apple’s idea was a progressive step forward and started ditching the headphone jack as well, equipping their phones only with a USB-C port. Unlike with Apple, however, consumers could still choose a new phone that had a headphone jack, and in a rather surprising turn of events, some companies like Huawei and Google actually backtracked and re-introduced the headphone jack, at least for certain models. Anyway, if you happen to have an older iPhone (6s and earlier) you can still use the wide variety of external microphones that can be connected via the 3.5mm headphone jack (Rode smartLav+, iRig Mic, iRig Pre/iRig Pre 2 interface with XLR mics etc.) without worrying much about adapters and dongles. Just make sure that the mic you are using has a TRRS (three black rings) and not a TRS (two black rings) 3.5mm connector to ensure compatibility with smartphones (TRS is for DSLM/DSLR cameras).

Lightning port

While most Android users probably still have fairly fresh memories of a different charging port standard (microUSB) from the one that is common now (USB-C), only seasoned iPhone aficionados will remember the days of the 30-pin connector that lasted until the iPhone 5 introduced the Lightning port as a new standard in 2012. And while microUSB mic solutions for Android could be counted on one hand and USB-C offerings took forever to become a reality, there were dedicated Lightning mics even before Apple decided to kill the headphone jack. The most prominent one and a veritable trailblazer was probably IK Multimedia’s iRig Mic HD and its successor, the iRig Mic HD 2. IK Multimedia’s successor to the iRigPre, the iRigPre HD comes with a Lightning cable as well. But you can also find options from other well-known companies like Zoom (iQ6, iQ7), Shure (MV88/MV88+), Sennheiser (HandMic Digital, MKE 2 Digital), Rode (Video Mic Me-L), Samson (Go Mic Mobile) or Saramonic (Blink 500). The Saramonic Blink 500 comes in multiple variations, two of them specifically targeted at iOS users: the Blink 500 B3 with one transmitter and the B4 with two transmitters. The small receiver plugs right into the Lightning port and is therefore an intriguingly compact solution, particularly when using it with a gimbal. Saramonic also has the SmartRig Di and SmartRig+ Di audio interfaces that let you connect one or two XLR mics to your device. IK Multimedia offers two similar products with the iRig Pro and the iRig Pro Duo. Rode recently released the USB-C-to-Lightning patch cable SC15 which lets you use their Video Mic NTG (which comes with TRS/TRRS cables) with an iPhone. There’s also a Lightning connector version of the SC6 breakout box, the SC6-L which lets you connect two smartLavs or TRRS mics to your phone. I have dropped lots of product names here so far but you know what? 
Even if you don’t own any of them, you most likely already have an external mic at hand: Of course I’m talking about the headset that comes included with the iPhone! It can’t match the audio quality of other dedicated external mics but it’s quite solid and can come in handy when you have nothing else available. One thing you should keep in mind when using any kind of microphone connected via the iPhone’s Lightning port: unless you are using a special adapter with an additional charge-through port, you will not be able to charge your device at the same time like you can/could with older iOS devices that had a headphone jack.

Wireless/Bluetooth

I have mentioned quite a few wireless systems before (Rode Wireless Go, Saramonic Blink 500/Blink 500 Pro, Samson Go Mic Mobile) that I won’t list here (again) for one reason: While the TX/RX system of something like the Rode Wireless Go streams audio wirelessly between its units, the receiver unit (RX) needs to be connected to the iPhone via a cable or (in the case of the Blink 500) at least a connector. So strictly speaking it’s not really wireless when it comes to how the audio signal gets into the phone. Now, are there any ‘real’ wireless solutions out there? Yes, but the technology hasn’t evolved to a standard that can match wired or semi-wired solutions in terms of both quality and reliability. While there could be two ways of wireless audio into a phone (wifi and Bluetooth), only one (Bluetooth) is currently in use for external microphones. This is unfortunate because the Bluetooth protocol that is used for sending audio back from an external accessory to the phone (the so-called Hands Free Profile, HFP) is limited to a sample rate of 16kHz (probably because it was created with headset phone calls in mind). Professional broadcast audio usually has a sample rate of 44.1 or 48kHz. That doesn’t mean that there aren’t any situations in which using a Bluetooth mic with its 16kHz limitation can actually be good enough. The Instamic was primarily designed to be a standalone ultra-compact high quality audio recorder which records 48/96 kHz files to its internal 8GB storage but can also be used as a truly wireless Bluetooth mic in HFP mode. The 16kHz audio I got when recording with Filmic Pro (here’s a guide on how to use the Instamic with Filmic Pro) was surprisingly decent. This probably has to do with the fact that the Instamic’s mic capsules are high quality unlike with most other Bluetooth mics. One maybe unexpected option is to use Apple’s AirPods/AirPods Pro as a wireless Bluetooth mic input. 
According to BBC Mobile Journalism trainer Marc Blank-Settle, the audio from the AirPods Pro is “good but not great”. He does however point out that in times of Covid-19, being able to connect to other people’s AirPods wirelessly can be a welcome trick to avoid close contact. Another interesting wireless solution comes from a company called Mikme. Their microphone/audio recorder works with a dedicated companion video recording app via Bluetooth and automatically syncs the quality audio (44.1, 48 or 96kHz) to the video after the recording has been stopped. By doing this, they work around the 16kHz Bluetooth limitation for live audio streaming. While the audio quality itself seems to be great, the somewhat awkward form factor and the fact that it only works with its best feature in their own video recording app but not other camera apps like Filmic Pro, are noticeable shortcomings (you CAN manually sync the Mikme’s audio files to your Filmic or other 3rd party app footage in a video editor). At least regarding the form factor they have released a new version called the Mikme Pocket which is more compact and basically looks/works like a transmitter with a cabled clip-on lavalier mic. One more important tip that applies to all the aforementioned microphone solutions: If you are shooting outdoors, always have some sort of wind screen / wind muff for your microphone with you as even a light breeze can cause noticeable noise.
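To put the 16kHz HFP limitation mentioned above into perspective: according to the Nyquist theorem, a digital signal can only represent audio frequencies up to half its sample rate. A quick illustration in Python:

```python
def nyquist_khz(sample_rate_khz):
    """Highest audio frequency a given sample rate can represent (Nyquist)."""
    return sample_rate_khz / 2

# Bluetooth HFP vs. common professional sample rates
for name, sr in [("Bluetooth HFP", 16), ("CD audio", 44.1), ("Broadcast", 48)]:
    print(f"{name}: {sr} kHz sampling -> audio content up to {nyquist_khz(sr)} kHz")
```

So HFP audio contains nothing above roughly 8kHz – fine for intelligible speech (which is what the profile was designed for), but well short of the full frequency range that 44.1/48kHz broadcast audio can carry.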

Micpocalypse soon?

Looking into the near future, some fear that Apple might pull another “feature kill” soon, dropping the Lightning port as well and thereby eliminating all physical connections to the iPhone. While there are no clear indications that this is actually imminent, Apple surely would be the prime suspect to push this into the market. If that really happens, however, it will be a considerable blow to iPhone videographers as long as there’s no established high-quality and reliable wireless standard for external mics. Oh well, there’s always another mobile platform to go to if you’re not happy with iOS anymore 😉

To wrap things up, I have asked a couple of mobile journalists / content creators using iPhones what their favorite microphone solution is when recording video (or audio in general):

Wytse Vellinga (Mobile Storyteller at Omrop Fryslân, The Netherlands): “When I am out shooting with a smartphone I want high quality worry-free audio. That is why I prefer to use the well-known brands of microphones. Currently there are three microphones I use a lot. The Sennheiser MKE200, the Rode Wireless Go and the Mikme Pocket. The Sennheiser is the microphone that is on the phone constantly when taking shots and capturing the atmospheric sound and short sound bites from people. For longer interviews I use the wireless microphones from Mikme and Rode. They offer me freedom in shooting because I don’t have to worry about the cables.”

Philip Bromwell (Digital Native Content Editor at RTÉ, Ireland): “My current favourite is the Rode Wireless Go. Being wireless, it’s a very flexible option for recording interviews and gathering localised nat sound. It has proven to be reliable too, although the original windshield was a weakness (kept detaching).”

Nick Garnett (BBC Reporter, England & the world): “The mic I always come back to is the Shure MV88+ – not so much for video – but for audio work: it uses a non-proprietary cable – micro-USB to Lightning. It allows headphones to plug into the bottom and so I can use it for monitoring the studio when doing a live insert, and the mic is so small it hides in my hand if I have to be discreet. For video work? Rode VideoMicro or the Boya clone. It’s a semi-rifle, it comes with a deadcat and an isolation mount and it costs €30 … absolute bargain.”

Neal Augenstein (Radio Reporter at WTOP Washington DC, USA): “If I’m just recording a one-on-one interview, I generally use the built-in microphone of the iPhone, with a foam windscreen. I’ve yet to find a microphone that so dramatically improves the sound that it merits carrying it around. In an instance where someone’s at a podium or if I’m shooting video, I love the Rode Wireless Go. Just clipping it on the podium, without having to run cable, it pairs automatically, and the sound is predictably good. The one drawback – the tiny windscreen is tough to keep on.”

Nico Piro (Special Correspondent for RAI, Italy & the world): “To record ambient audio (effects or natural as you want to name it) I use a Rode Video Mic Go (light, no battery needed, perfect for both phones and cameras) even if I must say that the iPhone’s on-board mic performs well, too. For Facebook live I use a handheld mic by Polsen, designed for mobile, it is reliable and has a great cardioid pickup pattern. When it comes to interviews, the Rode Wireless Go beats everything for its compact dimensions and low weight. When you are recording in big cites like New York and you are worried about radio interferences the good old cabled mics are always there to help, so Rode’s SmartLav+ is a very good option. I’m also using it for radio production and I am very sad that Rode stopped improving its Rode Rec app which is still good but stuck in time when it comes to file sharing. Last but not least is the Instamic. It takes zero space and it is super versatile…if you use native camera don’t forget to clap for sync!”

Bianca Maria Rathay (Freelance iPhone videographer, Germany): “My favorite external microphone for the iPhone is the RODE Wireless Go in combination with a SmartLav+ (though it works on its own also). The mic lets your interviewee walk around freely, works indoors as well as outdoors and has a full sound. Moreover it is easy to handle and monitor once you have all the necessary adapters in place and ready.”

Leonor Suarez (TV Journalist and News Editor at RTPA, Spain): “My favorite microphone solutions are: For interviews: Rode Rodelink Filmmaker Kit. It is reliable, robust and has a good quality-price relationship. I’ve been using it for years with excellent results. For interviews on the go, unexpected situations or when other mics fail: IK Multimedia iRig Mic Lav. Again, good quality-price relationship. I always carry them with me in my bag and they have allowed me to record interviews, pieces to camera and unexpected stories. What I also love is that you can check the audio with headphones while recording.”

Marcel Anderwert (Mobile Journalist at SRF, Switzerland): “For more than a year, I have been shooting all my reports for Swiss TV with one of these two mics: Voice Technologies’ VT506Mobile (with its long cable) or the Rode Wireless Go, my favourite wireless mic solution. The VT506Mobile works with iOS and Android phones, it’s a super reliable lavalier and the sound quality for interviews is just great. Rode’s Wireless Go gives me more freedom of movement. And it can be used in 3 ways: as a small clip-on mic with inbuilt transmitter, with a plugged-in lavalier mic – and in combination with a simple adapter even as a handheld mic.”

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#34 Apple is about to give us 25fps in the iPhone’s native camera app (finally catching up to Windows Phones) — 17. November 2020

#34 Apple is about to give us 25fps in the iPhone’s native camera app (finally catching up to Windows Phones)

One of the things that has mostly remained a blind spot in video recording with the native camera app of a smartphone is the ability to shoot in PAL frame rates, i.e. 25/50fps. The native camera apps of smartphones usually record with a frame rate of 30/60fps. This is fine for many use cases but it’s not ideal under two circumstances: a) if you have to deliver your video for traditional professional broadcast in a PAL broadcast standard region (Europe, Australia, parts of Africa, Asia, South America etc.); b) if you have a multi-camera shoot with dedicated ‘regular’ cameras that only shoot 25/50fps. Sure, it’s relatively easy to capture in 25fps on your phone by using a 3rd party app like Filmic Pro or Protake, but it would still be a welcome addition to any native camera app as long as this silly global frame rate divide (don’t get me started on this!) continues to exist. There was actually a prominent example of a phone maker that offered 25fps as a recording option in their (quasi-)native camera app very early on: Nokia, and later Microsoft, on their Lumia phones running Windows Phone / Windows Mobile. But as we all know by now, Windows Phone / Windows Mobile never really stood a chance against Android and iOS (read about its potential here) and has all but disappeared from the smartphone market. When LG introduced its highly advanced manual video mode in the native camera app of the V10, I had high hopes they would include a 25/50fps frame rate option as they were obviously aiming at more ambitious videographers. But no, the years have passed and current offerings from the Korean company like the G8X, V60 and Wing still don’t have it. It’s probably my only major gripe with LG’s otherwise outstanding flagship camera app. It was up to Sony to rekindle the flame, giving us 25fps natively in the pro camera app of the Xperia 1 II earlier this year.

And now, as spotted by BBC multimedia trainer Mark Robertson yesterday, Apple has added the option to record with a frame rate of 25fps in the native camera app in their latest iOS beta, 14.3. This is a pretty big deal and I honestly didn’t expect Apple to make that move. But of course this is a more than welcome surprise! Robertson is using a new iPhone 12 Pro Max, but his colleague Marc Blank-Settle also confirmed that this feature trickles down to the very old iPhone 6s – that is, if you run the latest public beta version of iOS. The iPhone 6 and older models are excluded as they are not able to run iOS 14. While it’s not guaranteed that all new beta features make it to the finish line for the final release, I consider it to be very likely. So how do you set your iPhone’s native camera app to shoot video in 25fps? Go into your iPhone’s general settings, scroll down to “Camera” and then select “Record Video”. Now locate the “Show PAL Formats” toggle switch and activate it, then choose either “1080p HD at 25fps” or “4K at 25fps”. Unfortunately, there’s no 50fps option at this moment; I’m pretty sure it will come at some point in the future though. I recorded several clips with my iPhone SE 2020 and tested the frame rate via the MediaInfo app, which revealed a clean 25.000fps and CFR (Constant Frame Rate; smartphones usually record in VFR = Variable Frame Rate). What other implications does this have? Well, many interested in this topic have been complaining about Apple’s own iOS editing app iMovie not supporting 25/50fps export. You can import and edit footage recorded in those frame rates no problem, but it will be converted to 30/60fps upon export. I believe that there’s a good chance now that Apple will support 25/50fps export in a future update of iMovie, because why bother integrating this into the camera app when you can’t deliver in the same frame rate?
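As a side note, the CFR/VFR distinction that MediaInfo reports boils down to whether every frame-to-frame interval is identical (exactly 40ms for 25fps). A toy Python check over a list of frame timestamps – a hypothetical helper for illustration, not how MediaInfo actually works:

```python
def is_cfr(timestamps_ms, tol_ms=0.5):
    """Constant frame rate: all frame-to-frame intervals are (nearly) equal."""
    gaps = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return max(gaps) - min(gaps) <= tol_ms

# A 25fps CFR clip: one frame every 40ms exactly
cfr = [i * 40.0 for i in range(10)]
# A VFR clip: intervals drift between ~33ms and ~40ms
vfr = [0.0, 33.3, 73.3, 106.6, 146.6]
print(is_cfr(cfr), is_cfr(vfr))  # True False
```

VFR is fine for playback on the phone itself but can cause drift and sync headaches in desktop editing software, which is why a clean constant 25fps is good news for broadcast workflows.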
Android phone makers in the meantime should pay heed and consider adding 25/50fps video recording to their native camera apps sooner rather than later. It may not be relevant for the majority of conventional smartphone users, but it also doesn’t hurt, and you can make certain “special interest” groups very happy!

As always, feel free to comment here or hit me up on the Twitter @smartfilming. If you like this blog post, do consider subscribing to my Telegram channel to get notified about new blog posts and also receive my Ten Telegram Takeaways newsletter including 10 interesting things that happened during the past four weeks in the world of mobile content creation/tech.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#12 Recording video with multiple cameras simultaneously on a smartphone (incl. Update 2021) — 17. April 2018

#12 Recording video with multiple cameras simultaneously on a smartphone (incl. Update 2021)

2017 marked the return of one of THE big pioneers in the history of mobile phones to the smartphone market: Nokia. It’s not really the same company from the days of feature and Windows phones anymore (a company named HMD Global has licensed the brand name for their phones) but that doesn’t mean we should just ignore it. After launching a bunch of affordable entry-level and lower end mid-range devices (Nokia 3, 5 & 6), the Nokia 8 was the first quasi-flagship phone following the brand’s reboot.

One special feature of the Nokia 8 was something the company called the „Bothie“ for marketing purposes, obviously trying to convince people that a new flavour of the all-too-common „selfie“ is in town. A „Bothie“ is a split-screened snapshot that is taken with both front and rear cameras AT THE SAME TIME, giving you two different perspectives of the very same moment. For instance the image of a person looking at something AND the image of the scenery the person is looking at. What’s more: this mode not only works for photo but also for video, meaning you can record a split-screened video with both front and rear cameras simultaneously. It turns out however that Nokia actually wasn’t the first company to include such a feature in a smartphone. As early as 2013 (Samsung Galaxy S4 and LG Optimus G Pro) other phone makers equipped some of their phones with similar modes – of course they were/are all using different names for the same feature so we can get jolly confused when talking about it!

Before giving a brief overview of how these modes have been implemented by each manufacturer, you might ask how such a feature can be useful in a more professional video production context. I’d say there are two main use cases for which this mode could be a great asset: piece-to-camera reporting and vlogging – obviously those two areas can heavily intersect. Imagine a mobile journalist reporting from an event, let’s say a protest rally – it’s much more interesting for the audience to see both the reporter elaborating on what’s happening and the rally itself instead of just one or the other. Traditionally one would have to use two separate cameras (or take different shots successively) and edit in post-production to achieve the same, but thanks to today’s smartphones having HD-capable video cameras on the front and the back, this can be done much more easily and quickly.
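Conceptually, the left/right split-screen composition these modes perform is straightforward: crop the centre of each camera’s frame and place the halves side by side. A rough Python/NumPy sketch (illustrative only – not any vendor’s actual implementation):

```python
import numpy as np

def split_screen(front, rear):
    """Compose a left/right split-screen frame from two equally sized
    camera frames, keeping the output at the input resolution."""
    h, w = front.shape[:2]
    # Take the centre half of each frame (this is why two full 16:9
    # images side by side would otherwise need extreme letter-boxing)
    left = front[:, w // 4: 3 * w // 4]
    right = rear[:, w // 4: 3 * w // 4]
    return np.hstack([left, right])

front = np.zeros((1080, 1920, 3), dtype=np.uint8)     # front camera frame (black)
rear = np.full((1080, 1920, 3), 255, dtype=np.uint8)  # rear camera frame (white)
frame = split_screen(front, rear)
print(frame.shape)  # (1080, 1920, 3)
```

A picture-in-picture mode like Samsung’s works the same way, except the secondary frame is scaled down and pasted into a corner of the main one instead of being cropped to a half.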

Samsung’s “Dual Camera“ (discontinued) and “Vlogger Mode”

Right along with its Korean rival LG, Samsung was the first phone maker to introduce a dual video recording feature with the Galaxy S4 in April of 2013. This mode has been on all following Samsung flagships of the S- and Note-Series so far but you might have to download it as a sort of „plug-in“ from within the native camera app (there’s a „+“ button to add more camera modes). Samsung’s take is a picture-in-picture approach, not a traditional split-screen where both parts have exactly the same size (there IS a split-screen option but it’s barely useful with two 16:9 images side-by-side, extreme letter-boxing). With Samsung’s „Dual Camera“, one image always is the main image while the secondary image from the other camera is embedded into it. You can resize the picture-in-picture though and move it around within the main image – you can also swap between cameras during the recording. The recorded video file can have a resolution of up to 1080p with a traditional aspect ratio of 16:9 or 9:16. One very cool thing about Samsung’s native camera app is that unlike most other Android phones’ native camera apps it supports the use of external mics via the 3.5mm headphone jack or USB port which is a tremendous advantage for having professional-grade audio. One catch: You can only record up to 5 minutes for a single clip. The feature made it to the S8 but was axed for the S9 and neither included with the S10 or S20. However, in January 2021, Samsung announced the new Galaxy S21 series (S21, S21+ and S21 Ultra) and with it the return of the Dual Camera functionality albeit under a new name: Vlogger Mode (as part of the “Director’s View mode”). But who cares what they call it, it’s great that you can now again record video with both front and back cameras at the same time! 
It will be interesting to see if Samsung (like LG with the LG Wing) has also included the option to save the footage of both lenses as separate files or if it’s “only” a split-screen single file like in the past. I will definitely keep an eye on it!

HTC’s “Split Capture” (discontinued)

HTC followed Samsung with a similar but slightly different feature (officially called “Split Capture”) on the HTC One M8, launched in March 2014. The recorded video was an equally sized left/right split-screen 1080p video with a 16:9/9:16 aspect ratio. HTC subsequently featured this mode in other phones like the Desire Eye and the One M9 but apparently ditched it after the M9 as the HTC 10 and more recent flagships like the U Ultra or U11 don’t seem to have it anymore.

LG’s “Snap Movie” (discontinued) / “Match Shot” (discontinued) / “Dual Recording”

In April 2013, LG introduced dual video recording with the LG Optimus G Pro (thanks to the user “Lal muan” for the info, see comments section!) and was, along with Samsung and its Galaxy S4, the first Android phone maker to do so. Two years later, they redefined what a native camera app on a smartphone can deliver in terms of pro video controls with the release of the LG V10. But not only did the V10 have a unique manual video mode, the app also boasted some more playful features. Among them was a mode called “Snap Movie” which basically invites you to create a short movie (one minute maximum) out of several short shots without having to muck around with an editing app. “Snap Movie” is not a dual camera mode per se, but one way of recording within this mode is to use a split-screen for simultaneously recording with both front and rear cameras. The image is recorded in 1080p with a 16:9 or 9:16 aspect ratio. Big catch: the one-minute limit applies here as well! Fast forward to 2017 and the V30: while the “Snap Movie” mode is gone (there’s something called “Snap Shot” but that’s a completely different thing), there’s now a “Match Shot” mode. With “Match Shot” you can record a split-screen image using both front and rear cameras at the same time. You also have the option to select between regular and wide-angle lenses before starting the recording, although the front camera actually only has one lens, so it’s most likely a software crop. Two good things about the new mode: you are no longer limited to one minute, and there’s also support for external mics. The recording format is a bit strange though, as it’s an 18:9 or 9:18 aspect ratio with a resolution of 2880×1440 (you can’t change the resolution at all).
The beyond-FHD resolution is great, but the rather non-standard aspect ratio (probably owed to the phone’s 18:9 display) is a bit annoying for watching on anything other than the phone itself, because the image will either get letter-boxed on certain platforms like YouTube (I guess it’s not that much of a problem for Twitter and Facebook as they are more flexible with aspect ratios) or you will have to perform a crop in a video editing app and re-export in the more common 16:9 ratio. Unfortunately, LG decided to ditch this feature with the V40 and the subsequent V-series models. I had actually thought it was gone for good, but the release of the LG Wing in October 2020 revived it with a new and very welcome twist: for the first time on an Android device, the mode, now dubbed “Dual Recording”, lets you save the two perspectives as separate video files instead of one pre-composed split-screen image. This is a huge deal as it gives you much more flexibility in post production!
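For the V30’s 2880×1440 files, both options (letterboxing on a 16:9 player, or cropping for a 16:9 re-export) come down to simple arithmetic. A small sketch; the function names are my own, purely illustrative:

```python
# Two options for an 18:9 (2880x1440) clip on a 16:9 platform:
# 1) let the player letterbox it, 2) centre-crop and re-export at 16:9.

def letterbox_bars(src_w, src_h, player_w, player_h):
    """Bar height (top and bottom each) when a wider-than-player video
    is fitted into the player frame by width."""
    scaled_h = src_h * player_w // src_w
    return (player_h - scaled_h) // 2

def center_crop_to_aspect(src_w, src_h, ar_w, ar_h):
    """Largest centre crop of (src_w, src_h) with aspect ratio ar_w:ar_h."""
    if src_w * ar_h > src_h * ar_w:       # source is too wide: trim width
        return (src_h * ar_w // ar_h, src_h)
    return (src_w, src_w * ar_h // ar_w)  # source is too tall: trim height

# 2880x1440 in a 1920x1080 (16:9) player: 60 px bars top and bottom.
bars = letterbox_bars(2880, 1440, 1920, 1080)
# Cropping 2880x1440 to 16:9 keeps the full height: 2560x1440.
crop = center_crop_to_aspect(2880, 1440, 16, 9)
```

So a 16:9 re-export from a Match Shot clip keeps the full 1440 pixels of height but trims 160 pixels off each side, which is worth knowing before framing the shot.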

Nokia’s “Dual Sight”

As should have become quite clear by now, Nokia’s “Bothie” feature introduced with the Nokia 8 last year is actually “old news”; HMD Global just made it an integral part of their marketing campaign for the device, unlike their predecessors. The mode’s proper name is “Dual Sight” and it’s pretty much like the one HTC had, meaning it’s an equally sized split-screen image in 16:9 or 9:16 aspect ratio with a resolution of 1080p. The Nokia 8 however DOES have one new trick up its sleeve: live streaming integration! You can use the “Dual Sight” feature not only for recording but also for live streaming video to Facebook and YouTube (not sure about Periscope), which can come in really handy for journalists and live vloggers. One probable shortcoming of this mode on the Nokia 8: while I’m not able to test it myself, I’m pretty sure that Nokia’s native camera app doesn’t support external mics (the Nokia 5 definitely doesn’t). If you do own a Nokia 8, please let me know if my assumption is correct. Nokia has kept the Dual Sight feature on its 7- and 8-series phones, with the latest addition being the Nokia 8.3 as of October 2020.

Huawei’s “Dual View”

I recently also discovered that Huawei has a mode called “Dual View” in the native camera app of the P30/P30 Pro. This works slightly differently from the modes mentioned above, as you can only use two of the rear cameras for the split-screen recording, not the front camera! While it’s good for certain situations, say an interview where you want both a close-up and a wide-angle image of the interviewee, the lack of front camera support makes it less useful for vlogging or reporting. I don’t think it should be a technical problem to add the ability to use the front camera as well, so there might be a chance that Huawei will improve things here. As with the dual recording modes on other phones, this one basically runs in full-auto, so you don’t have precise control over exposure. The resolution is a rather idiosyncratic 2336×1080 at 30fps. On the positive side: external mics via the headphone jack (yes, the P30 re-introduced this feature; the P30 Pro however doesn’t have it!) are supported!
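Out of curiosity, I checked what that idiosyncratic 2336×1080 actually corresponds to. A tiny helper (my own illustration, not anything from Huawei’s software) that reports the closest of a few common phone/video aspect ratios:

```python
# Which common aspect ratio is 2336x1080 closest to?
from fractions import Fraction

COMMON_RATIOS = {
    "16:9": Fraction(16, 9),
    "18:9": Fraction(18, 9),
    "19.5:9": Fraction(39, 18),  # 19.5:9 expressed with integers
    "21:9": Fraction(21, 9),
}

def closest_ratio(width, height):
    """Name of the common ratio nearest to width:height."""
    actual = Fraction(width, height)
    return min(COMMON_RATIOS, key=lambda name: abs(COMMON_RATIOS[name] - actual))
```

2336/1080 is about 2.163, which lands closest to 19.5:9 (about 2.167), so the file seems to roughly mirror the shape of the phone’s own display, much like the V30’s 18:9 files did.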

Oppo’s “Dual View”

In March 2021, Oppo, a really big player in the Chinese/Asian smartphone market that is also slowly establishing itself in other regions, released a bunch of new smartphones including the Find X3 series (Find X3 Pro, Find X3 Neo, Find X3 Lite) and the Reno 5 series (Reno 5, Reno 5 Pro). For the first time, Oppo phones now have the ability to record video with both front and rear cameras simultaneously. This is done in split-screen / picture-in-picture mode.

Dual Recording on iPhones with Filmic DoubleTake and other 3rd party apps

Some of you who have read the original blog post will have noticed that I even changed the title of the article for the 2020 update. The biggest reason is that I decided to include the iPhone due to major changes that came with the release of iOS 13 and the introduction of a bunch of new iPhones in fall 2019. Apple has provided an API for 3rd party app developers to use multiple cameras simultaneously when recording video (or taking photos). The feature is however NOT available in Apple’s own native camera app. This is very interesting because on Android it’s basically the other way around: dual camera video recording has been a proprietary feature of native camera apps on certain phones, and there’s no API for 3rd party developers to tap into (there’s something on the horizon, but more about that in a bit). The first iOS app to take advantage of multi-cam recording was Filmic Inc.’s DoubleTake; Filmic’s CTO Chris Cohen even got to present the app live on stage during the Apple event! DoubleTake lets you choose between a pair of cameras on your iPhone or iPad (front+rear, main rear+tele etc.) and record either a single split-screen/picture-in-picture video or two separate files. Particularly the fact that you can have two separate files is very useful, as it gives you more flexibility to do what you want with the two camera angles in post production. Less exciting is the fact that DoubleTake doesn’t give you any control over exposure, white balance etc.; it’s very bare-bones. Some might appreciate this simplistic approach, but as Filmic’s well-known and extremely advanced Filmic Pro app is very popular among ambitious videographers, I suppose others are craving at least a bit more manual control for DoubleTake, even though dialing in and adjusting a whole bunch of parameters for two shots at the same time is definitely a challenge.
Then again, for vlogging and on-the-go to-camera reporting, many might want to rely on auto controls anyway, since constantly readjusting and checking settings is not only inconvenient but could also fail to deliver the desired result. I suppose there are different subjective angles on this topic. So let’s cover some quick facts: frame rates are limited to 24, 25 and 30fps, resolution to 1920×1080. External mics are supported. The other thing to keep in mind when talking about DoubleTake (and basically all the other similar apps as well) is that it only works on relatively recent iOS devices running at least iOS 13. DoubleTake is currently compatible with: iPhone 11 Pro Max, 11 Pro, 11, XS Max, XS, XR, SE 2020, iPad Pro 2018/2020. As indicated, DoubleTake is not the only iOS app that jumped at the chance to offer multi-camera video recording. In contrast to Filmic’s separate app, MoviePro (another well-established pro video recording app) opted to integrate the functionality into its main app. There’s also a whole bunch of completely new apps that popped up in the wake of the API’s introduction (shout-out to Marc Blank-Settle for the collection): MixCam, Multicam Pro, Vlogger, Dualgram, DuoCam, MeWe Camera, Multicam Recording Dual Camera, DUBL Pro, GEMI, Dizzi, and many more to come, I would reckon.
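To make concrete why recording two separate files beats a baked-in split-screen: in post you can decide, segment by segment, which camera the viewer sees, like a simple edit decision list. A minimal sketch of that idea (the names and the EDL format are my own illustration, not taken from any real editing app):

```python
# A toy edit decision list (EDL): segments of (start, end, clip_name)
# decide which of the two separately recorded files is shown when.

def camera_at(edl, t):
    """Return which clip is shown at time t (in seconds), or None if
    no segment covers t."""
    for start, end, clip in edl:
        if start <= t < end:
            return clip
    return None

# Reporter on camera for the intro, cut to the rally, back to reporter.
edl = [
    (0, 5, "front_camera.mov"),
    (5, 20, "rear_camera.mov"),
    (20, 30, "front_camera.mov"),
]
```

With a pre-composed split-screen file, this kind of cut is impossible without cropping and quality loss; with two separate files it’s a routine timeline edit.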

Are Android 3rd party devs not invited to the dual cam party?

While Android (or at least some Android OEMs like Samsung, LG and HTC) beat Apple to the dual camera recording feature by half a decade, there has never been an official Android API for 3rd party app developers to tap into and make this feature available to a wider audience. There’s hope, however! When I consulted the official release notes for the latest Android version (Android 11), I noticed a very interesting paragraph headline: “Support for concurrent use of more than one camera”. There, you can read the following: “Android 11 adds APIs to query support for using more than one camera at a time, including both a front-facing and rear-facing camera.” While I’m no Android developer, this very much sounds like 3rd party app developers should soon be able to create dual camera video recording apps on Android. Given that this feature caused some serious compatibility fragmentation even on a platform as streamlined as iOS, it will most likely be only the more powerful Android devices that can pull it off. It will be interesting to see who will be first to release a dedicated dual video recording app on Android. Will it be an Android version of Filmic’s DoubleTake or a new kid on the block? I can’t wait!

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂