smartfilming

Exploring the possibilities of video production with smartphones

#42 Camera2 API Update 2021 – Android Pro Videography & Filmmaking — 15. April 2021

I’ve already written about Camera2 API in two previous blog posts (#6 & #10), but a couple of years have passed since then, so I felt like taking another look at the topic now that we’re in 2021.

Just in case you don’t have a clue what I’m talking about here: Camera2 API is a software component of Google’s mobile operating system Android (which basically runs on every smartphone today except Apple’s iPhones) that enables 3rd party camera apps (camera apps other than the one that’s already on your phone) to access more advanced camera functionality and controls, for instance setting a precise shutter speed value for correct exposure. Android phone makers need to implement Camera2 API in their version of Android, and not all of them do it fully. There are four different implementation levels: “Legacy”, “Limited”, “Full” and “Level 3”. “Legacy” basically means Camera2 API hasn’t been implemented at all and the phone uses the old, far more primitive Android Camera API; “Limited” signifies that some components of Camera2 API have been implemented, but not all; “Full” and “Level 3” indicate complete implementation in terms of video-related functionality – “Level 3” only adds the photography benefit of being able to shoot in RAW format. Android 3rd party camera apps like Filmic Pro, Protake, mcpro24fps, ProShot, Footej Camera 2 or Open Camera can only unleash their full potential if the phone has adequate Camera2 API support; Filmic Pro doesn’t even let you install the app in the first place if the phone lacks proper implementation. “Adequate”/“proper” can already mean “Limited” for certain phones, but you can only be sure with “Full” and “Level 3” devices. With some apps like Open Camera, Camera2 API is deactivated by default and you need to go into the settings to enable it before you can access things like shutter speed and ISO control.
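For the technically curious, here’s roughly what that looks like from an app developer’s perspective: the support level is a single value a 3rd party app reads from Android’s camera framework. A minimal Kotlin sketch (the helper function is hypothetical, purely for illustration):

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

// Translates the reported Camera2 hardware level of a given camera
// into the names used by apps like Camera2 Probe.
fun cameraSupportLevel(context: Context, cameraId: String): String {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    val characteristics = manager.getCameraCharacteristics(cameraId)
    return when (characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL)) {
        CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY -> "Legacy"
        CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED -> "Limited"
        CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_FULL -> "Full"
        CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_3 -> "Level 3"
        else -> "Unknown/External"
    }
}
```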

How do you know what Camera2 API support level a phone has? If you already own the phone, you can use an app like Camera2 Probe to check, but that obviously isn’t possible if you want to know before buying a new phone. Luckily, the developer of Camera2 Probe has set up a crowd-sourced list (users can submit their test results via the app, and they are automatically entered into the list) with the Camera2 API support levels of a massive number of different Android devices, currently over 3500! The list can be accessed here, and it’s great that you can even sort it by different parameters like phone brand or type a device name into the search bar.

It’s important to understand that there’s a separate Camera2 API support level for each camera on the phone, so the rear camera could report a different level than the selfie camera. The support level also doesn’t say anything about how many of the phone’s cameras have been made accessible to 3rd party apps. Auxiliary ultra wide-angle or telephoto lenses have become standard in many of today’s phones, but not all phone makers allow 3rd party camera apps to access the auxiliary camera(s). So when we talk about the Camera2 API support level of a device, most of the time we are referring to its main rear camera.
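Incidentally, whether an auxiliary lens is exposed at all is also something an app can only find out by asking the system: the camera framework simply doesn’t list cameras the phone maker keeps locked away. Another small Kotlin sketch along the same lines (again just an illustration, the function name is mine):

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

// Prints every camera the phone exposes to 3rd party apps and which way it faces.
// Auxiliary lenses the phone maker has not opened up simply won't appear here.
fun listExposedCameras(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (id in manager.cameraIdList) {
        val facing = when (manager.getCameraCharacteristics(id).get(CameraCharacteristics.LENS_FACING)) {
            CameraCharacteristics.LENS_FACING_FRONT -> "front"
            CameraCharacteristics.LENS_FACING_BACK -> "back"
            else -> "external"
        }
        println("Camera $id faces $facing")
    }
}
```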

Camera2 API was introduced with Android version 5 aka “Lollipop” in 2014, and it took phone makers a bit of time to implement it in their devices, so roughly speaking, only devices running at least Android 6 Marshmallow can be expected to have proper support. In the beginning, most phone makers only provided full Camera2 API support on their high-end flagship phones, but over the last few years the feature has trickled down to the mid-range segment and now even to a considerable number of entry-level devices (Nokia and Motorola are two companies that have been good about this if you’re on a tight budget).

I actually took the time to go through the Camera2 Probe list to provide some numbers on this development. Of course these are not 100% representative since not every single Android device on the planet has been included in the list but I think 3533 entries (as of 21 March 2021) make for a solid sample size.

Android version | Level 3 | Full | Limited | Legacy | Full/Level 3 %
Android 6       |       0 |   30 |      18 |    444 |            6.1
Android 7       |      82 |  121 |     113 |    559 |           23.2
Android 8       |     147 |  131 |     160 |    350 |           35.3
Android 9       |     145 |  163 |     139 |     69 |           59.7
Android 10      |     319 |  199 |     169 |     50 |           70.3
Android 11      |      72 |   28 |       8 |      2 |           90.9
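In case the last column looks cryptic: “Full/Level 3 %” is simply the share of listed models that offer at least “Full” support, i.e. (Level 3 + Full) divided by the total number of models for that Android version. A throwaway Kotlin snippet to show the arithmetic (Android 6 and Android 11 as examples):

```kotlin
// Share of phone models with at least "Full" Camera2 API support.
fun fullOrLevel3Percent(level3: Int, full: Int, limited: Int, legacy: Int): Double {
    val total = level3 + full + limited + legacy
    return (level3 + full) * 100.0 / total
}

fun main() {
    println("%.1f".format(fullOrLevel3Percent(0, 30, 18, 444))) // 6.1  (Android 6)
    println("%.1f".format(fullOrLevel3Percent(72, 28, 8, 2)))   // 90.9 (Android 11)
}
```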

I think it’s pretty obvious that the implementation of proper Camera2 API support in Android devices has taken massive steps forward with each iteration of the OS, and 100% coverage on new devices is within reach – maybe the upcoming Android 12 can already accomplish this mission?

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#41 Sharing VN project files between iPhone, iPad, Mac, Android (& Windows PC) — 23. March 2021

As I have pointed out in two of my previous blog posts (What’s the best free cross-platform mobile video editing app?, Best video editors / video editing apps for Android in 2021), VN is a free and very capable mobile video editor for Android and iPhone/iPad, and the makers recently also launched a desktop version for macOS. Project file sharing takes advantage of that and makes it possible to start your editing work on one device and finish it on another. So for instance, after having shot some footage on your iPhone, you can start editing right away using VN for iPhone and transfer the whole project to your iMac or MacBook Pro later to have a bigger screen and mouse control. It’s also a great way to free up storage space on your phone since you can archive projects in the cloud, on an external drive or on a computer and delete them from your mobile device afterwards. Project sharing isn’t a one-way trick; it also works the other way around: You start a project using VN on your iMac or MacBook Pro and then transfer it to your iPhone or iPad because you have to go somewhere and want to continue your project while commuting. And it’s not all about Apple products either, you can also share from or to VN on Android smartphones and tablets (so basically every smartphone or tablet that’s not made by Apple). What about Windows? Yes, this is also possible, but you will need to install an Android emulator on your PC, and I will not go into the details of the procedure in this article as I don’t own a PC to test it with. But you can check out a good tutorial on the VN site here.

Before you start sharing your VN projects, here’s some general info: To actively share a project file, you need to create a free account with VN. Right off the bat, you can share projects that don’t exceed 3 GB in size. There’s also a limit of 100 project files per day, but I suppose nobody will actually bump into that. To get rid of these limitations, fill out this short survey and VN will manually clear your account for unlimited sharing within a few days. For passive sharing, that is when someone sends you a project file, there are no limitations, even when you are not logged in. As the sharing process is slightly different depending on which platforms/devices are involved, I have decided to walk you through all nine combinations, starting with the one that will probably be the most common.

Let me quickly explain two general things ahead which apply to all combinations so I don’t have to go into the details every time:

1) When creating a VN project file to share, you can do it as “Full” or “Simple”. “Full” will share the project file with all of its media (complete footage, music/sound fx, text), “Simple” will let you choose which video clips you actually want to include. Not including every video clip will result in a smaller project file that can be transferred faster.

2) You can also choose whether or not you want the project file to be “Readonly”. If you choose “Readonly”, saving or exporting will be denied – this can be helpful if you send it to someone else but don’t want this person to save changes or export the project.

All of the sharing combinations I will mention now are focused on local device-to-device sharing. Of course you can also use any cloud service to store/share VN project files and have them downloaded and opened remotely on another device that runs the VN application.

iPhone/iPad to Mac

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon at the bottom), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Now choose “AirDrop” and select your Mac. Make sure that AirDrop is activated on both devices.
  • Depending on your AirDrop settings you now have to accept the transfer on the receiving device or the transfer will start automatically. By default, the file will be saved in the “Downloads” folder of your Mac.
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open project”.

iPhone/iPad to iPhone/iPad

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now choose “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Select the iPhone/iPad you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now select “Open project”

iPhone/iPad to Android

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the iOS/iPadOS share menu will pop up.
  • Now you need to transfer the project file from the iPhone/iPad to the Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both iPhone/iPad and Android.
  • So choose SendAnywhere from the share menu. A 6-digit code is generated.
  • Open SendAnywhere on your Android device, select the “Receive” tab and enter the code
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. 
  • The Android “Open with” menu will open, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Finally choose “Open Project”.

Mac to iPhone/iPad

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac, right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select your iPhone or iPad. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now choose “Open Project”.

Mac to Mac

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select the Mac you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • By default the VN project file will be saved in the “Downloads” folder of the receiving Mac.
  • Open VN on the receiving Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.

Mac to Android

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and choose a way to send it to your Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both macOS and Android.
  • So using SendAnywhere on your Mac, drag the VN project file into the app. You will see a 6-digit code. Open SendAnywhere on your Android, choose the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then on the project file.
  • The Android “Open with” menu will pop up, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Choose “Open Project”.

Android to Mac

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the Android share sheet will pop up.
  • Now you need to transfer the project file from your Android device to your Mac. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and macOS.
  • So choose SendAnywhere from the share menu. A 6-digit code is generated.
  • Receive the file on your Mac by entering the 6-digit code in SendAnywhere (or use whatever transfer method you prefer).
  • Unless you have created a custom download folder for your preferred file transfer app, the VN project file will be saved to the “Downloads” folder on your Mac (or will be available in your cloud storage if you went the cloud route).
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.

Android to Android

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • From the Android share sheet, choose Android’s integrated wifi sharing option Nearby Share (check this video on how to use Nearby Share if you are not familiar with it) and select the device you want to send it to. Make sure Nearby Share is activated on both devices.
  • After accepting the file on the second device, the transfer will start.
  • Once it is finished, choose “VN/Import to VN” from the pop up menu. Importing into VN will start. 
  • Finally choose “Open Project”.

Android to iPhone/iPad

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated. Afterwards, the Android share sheet menu will pop up.
  • Now you need to transfer the project file from the Android device to the iPhone/iPad. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and iPhone/iPad.
  • So choose SendAnywhere from the Share Sheet. A 6-digit code is generated.
  • Open SendAnywhere on your iPhone/iPad, select the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. Now tap on the share icon in the top right corner and choose VN from the list. The project file will be imported into VN.
  • Finally choose “Open Project”.

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

DISCLOSURE NOTE: This particular post was sponsored by VN. It was however researched and written all by myself.

#40 A whole new video editing experience on a phone! — 28. February 2021

Let’s be honest: Despite the fact that phone screens have become bigger and bigger over the last few years, they are still rather small for doing serious video editing on the go. No doubt, you CAN do video editing on your phone and achieve great results, particularly if you are using an app with a touch-friendly UI like KineMaster that was brilliantly designed for phone screens. But I’m confident just about every mobile video editor would appreciate some more screen real estate. Sure, you can use a tablet for editing, but tablets aren’t great devices for shooting, and if you want to do everything on one device, pretty much everyone would choose a phone, right?

While phone makers like Samsung, Huawei and Motorola are currently pioneering devices with foldable screens, those are still extremely expensive (between 1500 and 2000 bucks!) and also have to cope with some teething problems. LG, while not particularly successful in terms of sales figures in the recent past, has proven to be an innovative force in smartphone development for some years now. Not everything they throw at the market sticks, but let’s not forget that, for instance, the now widely popular and extremely useful wide-angle auxiliary lens was first seen on the LG G5 (rear camera) and LG V10 (front camera). I would also hate to be without an amazing manual video mode in a native camera app, something the V10 pioneered.

Instead of making a screen that folds, LG has introduced a series of phones that include (or at least have the option for) a Dual Screen case with a second, separate screen – basically making it look as if you were holding two phones next to each other. So the concept is that of a foldable PHONE, not a foldable SCREEN! The actual phone is inserted into the Dual Screen case, with a physical connection (initially pogo pins, then USB-C) establishing communication between the two devices. First came the V50 (April 2019), then the G8X (November 2019) and the V60 (March 2020), with the latest Dual Screen-compatible phone release being the LG Velvet (May 2020). As far as I know, the G8X (which I got new for just over 400€) is the only one of the bunch that comes with the Dual Screen included; for the other phones, the DS is an accessory that can be purchased separately or in a bundle with the phone. It’s important to note that the DS cases are all slightly different (LG refined the design over time) and only work with the phone they were designed for. It probably goes without saying that they don’t work with just any other Android phone – this is proprietary LG hardware.

The user experience of a foldable screen phone like the Samsung Galaxy Fold is quite different from that of the Dual Screen foldable phone approach. While an expanded foldable screen can give you more screen real estate for one app, the DS is primarily designed for multi-tasking with two apps running at the same time, one on the phone’s main screen and one on the Dual Screen. The DS is not really meant to use an app in an expanded view over both screens as there’s obviously a big gap/hinge between the two screens which is quite distracting in most cases. If apps were specifically customized, integrating the gap into their UI, this could be much less of a problem but with LG being a rather small player in the smartphone market, this hasn’t really happened so far. LG seems to have been quite aware of this and so they natively only allow a handful of apps (a bunch of Google apps and the Naver Whale browser) to be run in a wide view mode that spans across both screens.

Now, while having an app run across two separate screens might not make a lot of sense for many apps, there is one type of app that could actually be a perfect fit: video editors. On desktop, lots of professional video editors (I’m talking about the persons doing the editing) use a dual monitor set-up to have more screen real estate to organize their virtual workspace. One classic use case is that you have your timeline, media pool etc. on one screen and a big preview window on the second screen. It’s exactly this scenario that can be mimicked on LG’s Dual Screen phones like the G8X – but only with a particular app.

Why only with a particular app? Because the app’s UI needs to fit the Dual Screen in just the right way and currently, the only app that does that is PowerDirector. It’s not a perfect fit (one of the most obvious imperfections is the split playback button) but that’s to be expected since the app has not been optimized in any way for LG’s Dual Screen phones – considering this, it’s truly amazing HOW well Power Director’s UI falls into place on the G8X. The joy of having a big preview window on the top screen with the timeline and tool bars having their own space on the bottom screen (using the phone in landscape orientation) can hardly be overestimated in my opinion. It really feels like a whole new mobile video editing experience, and an extremely pleasant one for sure! 

But wait! Didn’t I mention that LG’s wide view mode is only available for a couple of apps natively? Yes indeed, and that’s why you need a 3rd party helper app that lets you run just about any app you want in wide mode. It’s called WideMode for LG and can be downloaded for free from the Google PlayStore. Once you have installed it, you can add a shortcut to the quick settings (accessible via the swipe-down notification shade) and switch to wide view whenever you want to. The app works really well in general (don’t blame the app maker for the fact that virtually no app has been optimized for this view!); occasionally, certain navigational actions cause the wide mode to just quit, but most of the time you can pick up the pattern of when that happens. In the case of PowerDirector for instance, you should only activate wide mode once you have opened your project and can see the timeline. If you activate wide view before that and select a project, you will get thrown out of the wide view mode. Also, if you’re done with your editing and want to export the project, tapping the share/export button will quit wide view and push the UI back onto just a single screen, but that’s not really problematic in my opinion. Still, I couldn’t help but daydream about how cool the app would be if Cyberlink decided to polish the UI for LG’s Dual Screen phones!

What about other video editing apps? KineMaster’s UI, while extremely good for single screen phones, is pretty terrible in wide view on the G8X. VN on the other hand works fairly well but can’t quite match PowerDirector. Interestingly, while VN doesn’t (yet) support landscape orientation in general, once you force it across both screens, it actually does work like that. The biggest annoyance is probably that the preview window is split between the two screens, with the lower quarter on the bottom screen. If you use VN in portrait orientation with wide mode, the preview window is cut in half and so is the timeline area. The UI of CapCut is pretty similar to that of VN, so it’s basically the same story here. Adobe Premiere Rush isn’t even available for any LG phones currently.

So is this the future of mobile video editing on smartphones? Yes and no. LG’s smartphone business has been struggling for a while, and recent news from the Korean company indicates they might be looking for an exit strategy, selling their mobile branch. This also means, however, that you can currently get great deals on powerful LG phones, so if you are on a budget but really intrigued by this opportunity for mobile video editing, it might just be the perfect time. The way PowerDirector’s UI is laid out should also make it great for phones with a foldable screen like the Galaxy Fold series, so if we assume that this type of phone will become more common and affordable in the near future, people doing a lot of video editing on the phone should definitely consider checking this out!

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#39 Should you buy a cheap Android phone? 10 things to consider! — 24. January 2021

One of the big reasons why Android has such an overwhelming dominance as a mobile operating system on a global scale (around 75% of smartphones worldwide run Android) is that you basically have a seamless price range from the very bottom to the very top – no matter your budget, there’s an Android phone that will fit it. This is generally a very good thing since it allows everyone on this planet to participate in mobile communication, not just those with deep pockets. But as many of us would agree, smartphones are not pure communication devices anymore; you can also use them to actively create content. In this respect, Android phones are bringing the power of storytelling to the people and could therefore be regarded as an invaluable asset in democratizing this mighty tool. But if you CAN get a (very) cheap Android phone, SHOULD you get one?

Of course, the definition of what one considers “cheap” depends highly on one’s individual background, so I won’t get into any concrete universal definitions here. In Germany, the cheapest Android phones start at around 50 Euro, I’d say. So what, in general, is the difference between a 50 Euro phone and a 1000 Euro Android phone? Let’s single out some points from the perspective of a mobile video creator:

1) Build quality

This can actually be surprisingly controversial. Sure, flagship phones have more premium build materials, but the move to shiny glass-covered backs has seen many an excited owner making a mess out of his or her new phone with a single drop. So better get a case if you consider yourself among those who occasionally drop their phone. The plasticky build of cheaper devices might look or at least feel less premium, but it can often take more abuse in various circumstances. As for the screen itself, more expensive phones tend to have a more robust protective layer, but that doesn’t always save you, and you can also get a pretty affordable add-on screen protector if you are worried about damaging your phone’s screen.

2) Software updates

Usually, more expensive phones get more updates and/or updates for a longer period, but there are exceptions. Nokia, for instance, is known to be very good with updates even on its budget phones, so it also depends on the phone maker. Are software updates important? Yes and no. Generally, new software versions (at least the big annual ones like Android 10, Android 11 etc.) introduce new features and optimizations. New features specifically relevant for videography are pretty rare, however (the last major ones were introduced with Android 5 in 2014 and then Android 11 in 2020), so it depends on whether the new features are actually helpful for what you want to get done and whether you are a tech-savvy person who always wants the latest updates to play around with. Security updates do matter, but ever since Google made it possible to distribute them separately from feature updates, they have also become more common on cheaper phones – mid-rangers and flagships still tend to receive more software updates, and for longer periods of time, though.

3) Expandable storage

The ability to easily and cheaply add additional storage to your phone via a microSD card has long been a major plus of the Android system when compared to Apple’s iPhones. More and more Android OEMs however have started eliminating this valuable feature from their new releases, Samsung being the latest with its flagship S21 series. Sure, they have increased the internal storage over time, you can easily get phones with 128, 256 or 512 GB these days, but in my opinion it would still be good to have the option for expandable storage – UHD/4K video can fill up your phone pretty fast if you are shooting a lot. Interestingly, it’s now easier to find support for microSD cards in cheaper phones. Actually, many/most of the entry-level phones (still) have it so if that’s important to you, you might want to have a look at the budget or mid-range segment of the Android phone market.

4) Removable battery

An even more exotic but dare I say “pro” feature that has become nearly extinct but was generally very useful for “power users” is the ability to (easily) swap out batteries in a phone. LG was the last major phone maker to include this in a flagship device with the V20 in late 2016, but over time, the practice of a non-removable battery has trickled down even to the (ultra) budget market. The few phones with exchangeable batteries that are left can, however, be found there; the last survivors include the Samsung XCover Pro, the Motorola Moto E6 and the Nokia 1.3. The only recent mid-range device including this feature seems to be the Fairphone 3/3+. Sure, power banks are an abundant accessory now and an easy way to juice up your phone while on the go – but the re-supply is incremental and sometimes it’s quite annoying to be tethered to an external device via cable while using the phone.

5) SoC/Processor

While the last two points were very much in favor of budget phones, the tide is about to turn. If you want to use your phone for more than just browsing the web, checking your messages or following your social media feeds, then your phone needs some decent processing power to keep things running smoothly. One of the toughest nuts to crack for a SoC (System-on-a-Chip) is editing high resolution video – even more so when it involves multiple tracks. So if you are planning on editing a lot of UHD/4K video with multiple layers on your phone, a budget device probably won’t cut it because processing power often is a watershed between cheaper and more expensive phones. That doesn’t mean however that you can’t do video editing at all on a budget smartphone. About two years ago I was really surprised how well Qualcomm’s Snapdragon 430/435 SoC did in terms of video editing, allowing for multiple layers of 1080p video in KineMaster on phones like the Nokia 5, Motorola Moto G5 or the LG Q6. Generally, the amount of layers and their resolution in video editing apps are dependent on the device’s chipset. Some apps like Adobe Premiere Rush aren’t even available for any budget phones because they are too demanding in terms of processing power. The SoC can definitely also have an influence on the video RECORDING capabilities in terms of available frame rates and resolution. If 1080p at a maximum of 30fps is good enough for what you do though, basically every phone has that covered these days, even the cheapest ones.

6) Camera

And while the video recording resolution can be an indicator for technical image quality, it surely isn’t the only one – actually other things are (way) more important: Lens quality, aperture size, sensor quality, processing algorithm. That’s why 1080p footage shot on one phone might look better than 1080p footage shot on another. And generally, that’s also an area in which (ultra) budget phones get left behind. Again, this doesn’t mean that you should never use an entry-level phone to shoot video – some of them can capture surprisingly decent footage and if you are “just” doing something for Facebook etc., the difference in image quality might not really be noticeable for the casual, non-pixel-peeping viewer. Also never forget that the content/the story is way more important than the image quality! You will reach/move more people with a good story shot on a cheap phone than with a mediocre story shot on a flagship phone, never mind the superior image quality of the camera.

7) Native camera app

Another aspect that can distinguish a cheap from a more expensive Android phone is the native camera app. Not so much in terms of the general UI and basic functionality but in terms of special modes and features. LG for instance has an absolutely outstanding manual video mode in the native camera app of its flagship lines, one that can rival a dedicated 3rd party app like Filmic Pro, but you don’t get it in their budget phones. The same goes for Sony and – to a lesser degree – Samsung, which at least gives you support for external mics down to its entry-level offerings. Other Android phone makers however have the same native camera app in all of their models, budget or flagship (Motorola for instance, unless they have recently changed something).

8) Camera2 API

I just mentioned 3rd party video recording apps, so let’s look at an even “nerdier” aspect: Usually, more expensive phones have better Camera2 API support. What’s Camera2 API? I have written a whole blog post about it, but in short, it’s basically the phone’s ability to give 3rd party camera apps access to manual control for certain more advanced imaging parameters like shutter speed, ISO, white balance etc. So this is important if you are planning to use such an app (like for instance Filmic Pro, ProTake or mcpro24fps) instead of the phone’s native camera app. While nowadays basically all (or almost all) flagship phones and many/most mid-range Android phones have proper Camera2 API support, there are also entry-level phones that are equipped with it, for instance some from Nokia and Motorola – it’s not that common yet however.
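If you want to see what “manual control” boils down to technically: before offering shutter speed and ISO sliders, a 3rd party app can check whether a camera advertises the manual sensor capability (which “Full” and “Level 3” devices do). A minimal Kotlin sketch of such a check (the function name is mine, just for illustration):

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

// Returns true if 3rd party apps may set shutter speed and ISO manually
// on the given camera.
fun supportsManualExposure(context: Context, cameraId: String): Boolean {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    val capabilities = manager.getCameraCharacteristics(cameraId)
        .get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES) ?: return false
    return CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_MANUAL_SENSOR in capabilities
}
```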

9) Headphone jack

Before wrapping things up I want to look at another aspect that is of major relevance if you want to record audio with external mics on your smartphone – be it as part of capturing video or just audio-only. Like the removable battery and expandable storage, the 3.5mm headphone jack is a feature that’s been fading away from smartphones over the last few years. Some Android OEMs are still holding on to it (for the most part), but many have eliminated it, relying solely on a single physical port (USB-C) and wireless technology (Bluetooth/WiFi). As with those other features, it’s curious that the 3.5mm headphone jack has mostly survived in budget phones. This makes a case for a very particular use scenario: If you “only” want to record audio (be it for an audio-only production or use as an external audio recorder with a lavalier on a video shoot), a budget phone can be an interesting option because you don’t have to care about the quality of the camera, nor (for the most part) about the chipset and its processing power, since audio processing is much less resource-hungry than video processing. The external-recorder-with-a-lavalier scenario is also a clever way to make use of an old phone if you have one buried in a drawer somewhere that’s only collecting dust.

10) Bonus tip!

What if you DO want higher processing power and camera quality, but are on a tight budget nonetheless? In that case, it can be helpful to look at older flagship models or mid-rangers. Once new Android phones are released, their price – not always but often – drops after a couple of months. If you compare the camera quality and processing power of a budget phone with an older flagship or potent mid-ranger you can often easily go back two or three years and still be on the better side with the “oldie”. Depending on what model/phone maker you choose and how far back you go, you might be stuck with an older version of Android but as indicated earlier on, this isn’t necessarily as bad as it sounds.

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#38 How to anonymize persons or objects in videos on a smartphone – new app makes things a lot easier! — 16. January 2021

There are times when – for reasons of privacy or even a person’s physical safety – you want to make certain parts of a frame in a video unrecognizable so as not to give away someone’s identity or the place where you shot the video. While it’s fairly easy to achieve something like that for a photograph, it’s a lot more challenging for video for two reasons: 1) You might have a person moving around within a shot or a moving camera, which constantly alters the location of the subject within the frame. 2) If the person talks, he or she might also be identifiable just by his/her voice. So are there any apps that help you anonymize persons or objects in videos when working on a smartphone?

KineMaster – the best so far

Up until recently, the best app for anonymizing persons and/or certain parts of a video in general was KineMaster, which I already praised in my last blog post about the best video editing apps on Android (it’s also available for iPhone/iPad). While it’s possible to use just about any video editor that allows for a resizable image layer (say a plain black square or rectangle) on top of the main track to cover a face, KineMaster is the only one with a dedicated blur/mosaic tool for this use case. Many other video editing apps have a blur effect in their repertoire, but the problem is that this effect always affects the whole image and can’t be applied to only a part of the frame. KineMaster on the other hand allows its Gaussian Blur effect to be adjusted in size and position within the frame.

To access this feature, scroll to the part of the timeline where you want to apply the effect but don’t select any of the clips! Now tap on the “Layer” button, choose “Effect”, then “Basic Effects”, then either “Gaussian Blur” or “Mosaic”. An effect layer gets added to the timeline which you can resize and position within the preview window. Even better: KineMaster also lets you keyframe this layer, which is incredibly important if the subject/object you want to anonymize is moving around the frame or if the camera is moving (thereby constantly altering the subject’s/object’s position within the frame). Keyframing means you can set “waypoints” for the effect’s area to automatically change its position/size over time. You can access the keyframing feature by tapping on the key icon in the left sidebar. Keyframes have to be set manually, so it’s a bit of work, particularly if your subject/object is moving a lot. If you just have a static shot with the person not moving around much, you don’t have to bother with keyframing though.

And as if the adjustable blur/mosaic effect and support for keyframing weren’t good enough, KineMaster also gives you a tool to add an extra layer of privacy: you can alter voices. To access this feature, select a clip in the timeline and then scroll down the menu on the right to find “Voice Changer” – there’s a whole bunch of different effects. To be honest, most of them are rather cartoonish – I’m not sure you want your interviewee to sound like a chipmunk. But there are also a couple of voice changer effects that I think can be used in a professional context.

What happened to Censr?

As I indicated in the paragraph above, a moving subject (or a moving camera) makes anonymizing content within a video a lot harder. You can manually keyframe the blurred area to follow along in KineMaster but it would be much easier if that could be done via automatic tracking. Last summer, a closed beta version of an app called “Censr” was released on iOS, the app was able to automatically track and blur faces. It all looked quite promising (I saw some examples on Twitter) but the developer Sam Loeschen told me that “unfortunately, development on censr has for the most part stopped”.

PutMask – a new app with a killer feature!

But you know what? There actually is a smartphone app out there that can automatically track and pixelate faces in a video: it’s called PutMask and currently only available for Android (there are plans for an iOS version). The app (released in July 2020) offers three ways of pixelating faces in videos: automatically by face-tracking, manually by following the subject with your finger on the touch-screen and manually by keyframing. The keyframing option is the most cumbersome one but might be necessary when the other two ways won’t work well. The “swipe follow” option is the middle-ground, not as time-consuming as keyframing but manual action is still required. The most convenient approach is of course automatic face-tracking (you can even track multiple faces at the same time!) – and I have to say that in my tests, it worked surprisingly well! 

Does it always work? No, there are definitely situations in which the feature struggles. If you are walking around and your face gets covered by something else (for instance because you are passing another person or an object like a tree) even for only a short moment, the tracking often loses you. It even lost me when I was walking around indoors and the lens flare from the light bulb on the ceiling created a visual “barrier” which I passed at some point.

And although I would say that the app is generally well-designed, some of the workflow steps and the nomenclature can be a bit confusing. Here’s an example: After choosing a video from your gallery, you can tap on “Detect Faces” to start a scanning process. The app will tell you how many faces it has found and will display a numbered square around each face. If you now tap on “Start Tracking”, the app tells you “At least select One filter”. But I couldn’t find a button or anything else indicating a “filter”. After some confusion I discovered that you need to tap once on the square that is placed over the face in the image – maybe by “filter” they actually mean you need to select at least one face? Now you can initiate the tracking. After the process is finished you can preview the tracking that the app has done (and also dig deeper into the options to alter the amount of pixelation etc.), but to check the actual pixelated video you have to export your project first. While the navigation could/should be improved for certain actions to make it clearer and more intuitive, I was quite happy with the results in general. The biggest catch until recently was the maximum export resolution of 720p, but with the latest update released on 21 January 2021, 1080p is also supported. An additional feature that would be great to have in an app with a dedicated focus on privacy and anonymization is the ability to alter/distort the voice of a person, like you can do in KineMaster.

There’s one last thing I should address: The app is free to download with all its core functionality but you only get SD resolution and a watermark on export. For HD/FHD watermark-free export, you need to make an in-app purchase. The IAP procedure is without a doubt the weirdest I have ever encountered: The app tells you to purchase any one of a selection of different “characters” to receive the additional benefits. Initially, these “characters” are just names in boxes, “Simple Man”, “Happy Man”, “Metal-Head” etc. If you tap on a box, an animated character pops up. But only when scrolling down it becomes clear that these “characters” represent different amounts of payment with which you support the developer. And if that wasn’t strange enough by itself, the amount you can donate goes up to a staggering 349.99 USD (Character Dr. Plague) – no kidding! At first, I had actually selected Dr. Plague because I thought it was the coolest looking character of the bunch. Only when trying to go through with the IAP did I become aware of the fact that I was about to drop 350 bucks on the app! Seriously, this is nuts! I told the developer that I don’t think this is a good idea. Anyway, the amount of money you donate doesn’t affect your additional benefits, so you can just opt for the first character, the “Simple Man”, which costs you 4.69€. I’m not sure why they would want to make things so confusing for users willing to pay but other than that, PutMask is a great new app with a lot of potential, I will definitely keep an eye on it!

As always, if you have questions or comments, drop them below or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

Download PutMask on GooglePlay.

#37 Best video editors / video editing apps for Android in 2021 — 10. January 2021

Ever since I started this blog, I wanted to write an article about my favorite video editing apps on Android but I could never decide on how to go about it, whether to write a separate in-depth article on each of them, a really long one on all of them or a more condensed one without too much detail or workflow explanations, more of an overview. So I recently figured there’s been enough pondering on this subject and I should just start writing something. The very basic common ground for all these mobile video editing apps mentioned here is that they allow you to combine multiple video clips into a timeline and arrange them in a desired order. Some might question the validity of editing video on such a relatively small screen as that of a smartphone (even though screen sizes have increased drastically over the last years). While it’s true that there definitely are limitations and I probably wouldn’t consider editing a feature-length movie that way, there’s also an undeniable fascination about the fact that it’s actually doable and can also be a lot of fun. I would even dare to say that it’s a charming throwback to the days before digital non-linear editing when the process of cutting and splicing actual film strips had a very tactile nature to it. But let’s get started…

KineMaster


When I got my first smartphone in 2013 and started looking for video editing apps in the Google PlayStore, I ran into a lot of frustration. There was a plethora of video editing apps, but almost none of them could do more than manipulate a single clip. Then, in late December, an app called KineMaster was released, and just by looking at the screenshots of the UI I could tell that this was the game changer I had been waiting for: a mobile video editing app that actually aspired to give you the proper feature set of (basic) desktop video editing software. Unlike some other (failed) attempts in that respect, the devs behind KineMaster realized that giving the user more advanced editing tools could become an unpleasant boomerang flying in their face if the controls weren’t touch-friendly on a small screen. If you ever had the questionable pleasure of using a video editing app called “Clesh” on Android (it’s long gone), you know what I’m talking about. To this day, I still think that KineMaster has one of the most beautiful and intuitive UIs of any mobile app. It really speaks to its ingenuity that despite the fact that the app has grown into a respectable mobile video editing powerhouse with many pro features, even total editing novices usually have no problem getting the hang of the basics within a couple of hours or even minutes.

While spearheading the mobile video editing revolution on Android, KineMaster dared to become one of the first major apps to drop the one-off payment model and pioneer a subscription. I had initially paid 2€ one-off for the pro version of the app to get rid of the watermark; now you had to pay 2 or 3€ a month (!). I know, “devs gotta eat”, and I’m all for paying a decent amount for good apps, but this was quite a shock, I have to admit. It needs to be pointed out that KineMaster is actually free to download with all its features (so you can test it fully and with no time limit before investing any money) – but you always get a KineMaster watermark in your exported video and the export resolution doesn’t include UHD/4K. If you are just doing home movies for your family, that might be fine, but if you do stuff in a professional or even just more ambitious environment, you probably want to get rid of the watermark. Years later, with every other app having jumped on the subscription bandwagon, I do feel that KineMaster is still one of the apps that are really worth it.

I already praised the UI/UX, so here are some of the important features: You get multiple video tracks (resolution and number are device-dependent) and other media layers (including support for PNG images with transparency), options for multiple frame rates including PAL (25/50), the ability to select between a wide variety of popular aspect ratios for projects (16:9, 9:16, 1:1, 2.35:1 etc.) and even duplicate the project with a different aspect ratio later (very useful if you want to share a video on multiple platforms). You can use keyframes to animate content, you have a very good title tool at hand, audio ducking, voice-over recording, basic grading tools and last but not least: the Asset Store. That’s the place where you can download all kinds of helpful assets for your edit: music, fonts, transitions, effects and most of all (animated) graphics (‘stickers’) that you can easily integrate into your project to make it pop without having to spend much time creating stuff from scratch. Depending on what you are doing, this can be a massive help! I also have to say that despite Android’s fragmentation with all its different phones and chipsets, KineMaster works astonishingly well across the board.

There are still things that could be improved (certain parts of the timeline editing process, media management, precise font sizes, audio waveforms for video clips, quick audio fades, project archives etc.), and development progress in the last one or two years seems to have slowed down, but it remains a/the top contender for the Android video editing crown, although way more challenged than in the past. Last note: KineMaster has recently released beta versions of two “helper” apps: VideoStabilizer for KineMaster and SpeedRamp for KineMaster. I personally wish they had integrated this functionality into the main app, but it’s definitely better than not having it at all.

PowerDirector


The first proper rival for KineMaster emerged about half a year later, in June 2014, with Cyberlink’s PowerDirector. Unlike KineMaster, PowerDirector was already an established name in the video editing world, at least on the consumer/prosumer level. In many ways, PowerDirector has a feature set roughly (though not completely) equal to that of KineMaster, with one key omission: the option to export in PAL frame rates (if you don’t need to export in 25/50fps, you can ignore this shortcoming). The UI is also good and pretty easy to learn. After KineMaster switched to the subscription model, PowerDirector did have one big factor in its favor: You could still get the full, watermark-free version of the app by making a single, quite reasonable payment – I think it was about 5€. That, however, changed eventually, and PowerDirector joined the ranks of apps that you can’t own anymore but only rent via a subscription to get access to all features and watermark-free export. Despite the fact that it’s slightly more expensive than KineMaster now, it’s still a viable and potent mobile video editor with some tricks up its sleeve.

It was for instance – until recently – the only mobile video editor with an integrated stabilization tool to tackle shaky footage. It’s also the only one with a dedicated de-noise feature for audio, and unlike KineMaster, it lets you mix your audio levels by track in addition to by individual clip. Furthermore, PowerDirector offers the ability to transfer projects from mobile to its desktop version via the Cyberlink Cloud, which can come in handy if you want to assemble a rough cut on the phone but do more in-depth work on a bigger screen with mouse control. Something rather annoying is the way in which the app tries to nudge, or dare I say shove, you towards a subscription. As I had bought the app before the introduction of the subscription model, I can still use all of its features and export without a watermark, but before getting to the edit workspace, the app bombards you with full-screen ads for its subscription service every single time – I really hate that. One last thing: There are a couple of special Android devices on which PowerDirector actually takes mobile video editing to another level, but that’s for a future article so stay tuned.

Adobe Premiere Rush


Even more so than Cyberlink, Adobe is a well-known name in the video editing business thanks to Premiere Pro (Windows/macOS). More than once I had asked myself why such a big player had missed the opportunity to get into the mobile editing game. Sure, they dipped their toes into the waters with Premiere Clip but after a mildly promising launch, the app’s development stagnated all too soon and was abandoned eventually – not that much of a loss as it was pretty basic. In 2018 however, Adobe bounced back onto the scene with a completely new app, Premiere Rush. This time, it looked like the video editing giant was ready to take the mobile platform seriously.

The app has a very solid set of advanced editing features and even some specialties that are quite unique/rare in the mobile editing environment: You can for instance expand the audio of a video clip without actually detaching it and risking it falling out of sync, very useful for J & L cuts. There’s also a dedicated button that activates multi-select for clips in the timeline, another great feature. What’s more, Rush has true timeline tracks for video. What do I mean by “true”? KineMaster and PowerDirector support video layers but you can’t just move a clip from the primary track to an upper/lower layer track and vice versa, which isn’t that much of a problem most of the time but sometimes it can be a nuisance. In Rush you can move your video clips up and down the tracks effortlessly. The “true tracks” also mean that you can easily disable/mute/lock a particular track and all the clips that are part of it. One of Rush’s marketed highlights is the auto-conform feature which is supposed to automatically adapt your edit to other aspect ratios using AI to frame the image in the (hopefully) best way. So for instance if you have a classic 16:9 edit, you can use this to get a 1:1 video for Instagram. This feature is reserved for premium subscribers but you can still manually alter the aspect ratio of your project in the free version. For a couple of months, the app was only available for iOS but premiered (pardon the pun!) on Android in May 2019. Like PowerDirector, you can use Adobe’s cloud to transfer project files to the desktop version of Rush (or even import into Premiere Pro) which is useful if the work is a bit more complex. It’s also possible to have projects automatically sync to the cloud (subscriber feature). Initially, the app had a very expensive subscription of around 10€ per month (and only three free exports to test) unless you were already an Adobe Creative Cloud subscriber, in which case you got it for free, but it has now become more affordable (4.89€ monthly or 33.99€ per year) and the basic version with most features including 1080p export (UHD/4K is a premium feature) is free and doesn’t even force a watermark on your footage – you do need to create a (free) account with Adobe though.

The app does have its quirks – how many of them are still teething troubles, I’m not sure. In my personal tests with a Google Pixel 3 and a Pocophone F1, export times were sometimes outrageously long, even for short 1080p projects. Both my test devices were powered by a Snapdragon 845 SoC which is a bit older but was a top flagship processor not too long ago and should easily handle 1080p video. Other editing apps didn’t have any problems rushing out (there goes another pun!) the same project on the same devices. This leads me to believe that the app’s export engine still needs some fine tuning and optimization. But maybe things are looking better on newer and even more powerful devices. Another head-scratcher was frame rate fidelity. While the export window gave me a “1080p Match Framerate” option as an alternative to “1080p 30fps”, surely indicating that it would keep the frame rate of the used clips, working with 25fps footage regularly resulted in a 30fps export. The biggest caveat with Rush though is that its availability on Android is VERY limited. If you have a recent flagship phone from Samsung, Google, Sony or OnePlus, you’re invited, otherwise you are out of luck – for the moment at least. For a complete list of currently supported Android devices check here.

VN


Ever since I started checking the Google PlayStore for interesting new apps on a regular basis, it rarely happens that I find a brilliant one that’s already been out for a very long time. It does happen on rare occasions however, and VN is the perfect case in point. VN had already been available for Android for almost two years (the PlayStore lists May 2018 as the release date) when it eventually popped up on my radar in March 2020 while doing a routine search for “video editors” on the PlayStore. VN is a very powerful video editor with a robust set of advanced tools and a UI that is clean, intuitive and easy to grasp. You get a multi-layer timeline, support for different aspect ratios including 16:9, 9:16, 1:1, 21:9, voice over recording, transparency with png graphics, keyframing for graphical objects (not audio though, but there’s the option for a quick fade in/out), basic exposure/color correction, a solid title tool, and export options for resolutions up to UHD/4K, frame rate (including PAL frame rates) and bitrate.

On top of that, VN is currently the only one of the advanced mobile video editing apps with a dedicated and very easy-to-use speed-ramping tool which can be helpful when manipulating a clip’s playback speed. It’s also great that you can move video clips up and down the tracks although it’s not as intuitive as Adobe Premiere Rush in that respect since you can’t just drag & drop but have to use the “Forward/Backward” button. But once you know how to do it, it’s very easy. While other apps might have a feature or two more, VN has a massive advantage: It’s completely free, no one-off payment, no subscription, no watermark. You do have to watch a 5-second full-screen ad when launching the app and delete a “Directed by” bumper clip from every project’s timeline, but it’s really not much of a bother in my opinion. In the past you had to create an account with VN but it’s not a requirement anymore. Will it stay free? When I talked to VN on Twitter some time ago, they told me that the app as such is supposed to remain free of charge but that they might at some point introduce certain premium features or content. VN recently launched a desktop version for macOS (no Windows yet) and the ability to transfer project files between iOS and macOS. While this is currently only possible within the Apple ecosystem (and does require that you register an account with VN), more cross-platform integration could be on the horizon. All in all, VN is an absolutely awesome and easily accessible mobile video editor widely available for most Android devices (Android 5.0 & up) – but do keep in mind that depending on the power of your phone’s chipset, the number of video layers and the supported editing/exporting resolution can vary.

CapCut

CapCut is somewhat similar to VN in terms of basic functionality (multiple video tracks, support for different frame rates including PAL, variety of aspect ratios etc.) and layout, but with a few additional nifty features that might come in handy depending on the use case. Like VN, it’s completely free without a watermark and you don’t have to create an account. CapCut was – following Cyberlink’s PowerDirector – the second advanced mobile video editing app to introduce a stabilization tool and it can even be adjusted to some degree.

Its unique standout double-feature however has to do with automatic speech-to-text/text-to-speech processing. As we all know, captions have become an integral part of video production for social media platforms as many if not most of us browse our feeds with the sound turned off, so captions can be a way to motivate users to watch a video even when it’s muted. While it’s no problem to manually create captions with the title tool in basically any video editing app, this can be very time-consuming and fiddly on a mobile device. So how about auto-generated captions? CapCut has you covered. It doesn’t work perfectly (you sometimes have to do some manual editing) and it’s currently only available in English, but it’s definitely a very cool feature that none of the other editors mentioned here can muster. Interestingly, it’s also possible to do it the other way around: You can let the app auto-generate a voice-over from a text layer. There are three different voices available: “American Male”, “American Female” and “British Female” (only English again). This can be useful if you quickly need to create a voice-over on the go and there’s no time or quiet place to do so or if you are not comfortable recording voice-overs with your own voice. Any cons? Generally, I would say that I prefer VN of the two because I like the design and UX of the timeline workspace better, it’s easier to navigate around, but that’s probably personal taste. What is an actual shortcoming however, if you are after the highest possible quality, is the fact that CapCut lacks support for UHD/4K export. Don’t get me wrong, you can import UHD/4K footage into the app and work with it but the export resolution is limited to 1080p and you also can’t adjust the bitrate. From a different angle, it should also be mentioned that CapCut is owned by Bytedance, the company behind the popular social video platform TikTok. While you don’t have to create an account for CapCut, you do have to agree to their T&Cs to use the app. So if you are very picky about who gets your data and kept your fingers off TikTok for that reason, you might want to take this into consideration.

Special mention (Motion Graphics): Alight Motion


Alight Motion is a pretty unique mobile app that doesn’t really have an equivalent at the moment. While you can also use it to stitch together a bunch of regular video clips filmed with your phone, this is not its main focus. The app is totally centered around creating advanced, multi-layered motion graphics projects – maybe think of it as a reduced mobile version of Adobe After Effects. Its power lies in the fact that you can manipulate and keyframe a wide range of parameters (for instance movement/position, size, color, shape etc.) on different types of layers to create complex and highly individual animations, spruced up with a variety of cool effects drawn from an extensive library. It takes some learning to unleash the enormous potential and power that lies within the app, and fiddling around with a heavy load of parameters and keyframes on a small(ish) touch screen can occasionally be a bit challenging, but the clever UI (designed by the same person who made KineMaster so much fun to use) makes the process basically as good and accessible as it can get on a mobile device. The developers also just added effect presets in a recent update which should make it easier for beginners who might be somewhat intimidated by manually keyframing parameters. Pre-designed templates for graphics and animations created by the dev team or other users will make things even more accessible in the future – some are already available but still too few to fully convince passionate users of apps such as the very popular but discontinued Legend. Alight Motion is definitely worth checking out as you can create amazing things with it (like explainer videos or animated info graphics), if you are willing to accept a small learning curve and invest some time. This is coming from someone who regularly throws in the towel trying to get the hang of Apple’s dedicated desktop motion graphics software Motion. Alight Motion has become the first application in this category in which I actually feel like I know what I’m doing – sort of at least. One very cool thing is that you can also use Alight Motion as a photo/still graphics editor since it lets you export the current timeline frame as a png, even with transparency! The app is free to download but to access certain features and export without a watermark you have to get a subscription which is currently around 28€ per year or 4.49€ on a monthly basis.
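In case “keyframing parameters” sounds abstract: under the hood it boils down to storing a few time/value pairs and interpolating between them on every frame. Here’s a purely conceptual sketch of that idea – my own illustration of the principle, not Alight Motion’s actual engine or project format:

```python
# Two keyframes for a layer's opacity and a linear interpolation between them.
keyframes = [(0.0, 0.0), (1.5, 100.0)]  # (time in seconds, opacity in percent)

def opacity_at(t: float) -> float:
    (t0, v0), (t1, v1) = keyframes
    if t <= t0:
        return v0
    if t >= t1:
        return v1
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)  # linear interpolation

for t in (0.0, 0.5, 1.0, 1.5):
    print(f"t={t:.1f}s -> opacity {opacity_at(t):.1f}%")
```

Every animated property in such an app is essentially a stack of these little curves evaluated per frame – which is also why a touch-friendly UI for managing them matters so much.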

Special mention (Automated Editing): Quik


Sometimes, things have to go quik-ly and you don’t have the time or ambition to assemble your clips manually. While I’m generally not a big fan of automated video editing processes, GoPro’s free Quik video editing app can come in handy at times. You just select a bunch of photos or videos, an animation style, your desired aspect ratio (16:9, 9:16, 1:1) and the app creates an automatic edit for you based on what it thinks are the best bits and pieces. In case you don’t like the results you have the option to change things around and select excerpts that you prefer – generally, manual control is rather limited though and it’s definitely not for more advanced edits. It’s also better suited for purely visual edits without important scenes relying on the original audio (like a person talking and saying something of interest). GoPro, who acquired the app in the past, is apparently working on a successor to Quik and will eventually pull this one from the Google PlayStore later in 2021, but here’s hoping that the “new Quik” will be just as useful and accessible.

Special mention (360 Video Editing): V360

While 360 video hasn’t exactly become mainstream, I don’t want to ignore it completely for this post. Owners of a 360 camera (like the Insta360 One X2 I wrote about recently) usually get a companion mobile app along with the hardware which also allows basic editing. In the case of the Insta360 app you actually get quite a range of tools but it’s more geared towards reframing and exporting as a traditional flat video. You can only export a single clip in true 360 format. So if you want to create a story with multiple 360 video clips and also export as true, immersive 360 video with the appropriate metadata for 360 playback, you need to use a 3rd party app. I have already mentioned V360 in one of my very early blog posts but I want to come back to it as the landscape hasn’t really changed since then. V360 gives you a set of basic editing tools to create a 360 video story with multiple clips. You can arrange the clips in the desired order, trim and split them, add music and titles/text. It’s rather basic but good for what it is, with a clean interface and exports in original resolution (at least up to 5.7k which I was able to test). The free version doesn’t allow you to add transition effects between the clips and has a V360 branded bumper clip at the end that you can only delete in the paid version which is 4.99€. There are two other solid 360 video editors (Collect and VeeR Editor) which are comparable and even offer some additional/different features but I personally like V360 best although it has to be said that the app hasn’t seen an update in over two years.

What’s on the horizon?

There’s one big name in mobile editing town that’s missing from the Android platform so far – of course I’m talking about LumaFusion. According to LumaTouch, the company behind LumaFusion, they are currently exploring an Android version and apparently have already hired some dedicated developers. I therefore suspect that despite the various challenges an app as demanding as LumaFusion will encounter in being ported to a different mobile operating system, we will see at least an early beta version in 2021. Furthermore, despite not having any concrete evidence, I assume that an Android version of Videoleap, another popular iOS-only video editor, might also be in the works. Not quite as advanced and feature-packed as LumaFusion, it’s pretty much on par in many respects with the current top dogs on Android. So while there definitely is competition, the app’s demands are certainly within what can be achieved on Android, and the fact that the developers have already brought other apps from their portfolio to Android indicates some interest in the platform.

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

Download KineMaster on GooglePlay
Download PowerDirector on GooglePlay
Download Adobe Premiere Rush on GooglePlay
Download VN on GooglePlay
Download CapCut on GooglePlay
Download Alight Motion on GooglePlay
Download Quik on GooglePlay
Download V360 on GooglePlay

#36(0) The Insta360 One X2 – fun & frustration — 5. January 2021

#36(0) The Insta360 One X2 – fun & frustration

A couple of years ago, 360° (video) cameras burst onto the scene and seemed to be all the rage for a while. The initial excitement faded relatively quickly however when producers realized that this kind of video didn’t really resonate as much as they thought it would with the public – at least in the form of immersive VR (Virtual Reality) content for which you need extra hardware, hardware that most didn’t bother to get or didn’t get hooked on. From a creator’s side, 360 video also involved some extra and – dare I say – fairly tedious workflow steps to deliver the final product (I have one word for you: stitching). That’s not to say that this extraordinary form of video doesn’t have value or vanished into total obscurity – it just didn’t become a mainstream trend.

Among the companies that heavily invested in 360 cameras was Shenzhen-based Insta360. They offered a wide variety of different devices: Some standalone, some that were meant to be physically connected to smartphones. I actually got the Insta360 Air for Android devices and while it was not a bad product at all and fun for a short while, the process of connecting it to the USB port of the phone when using it and then taking it off again when putting the phone back in your pocket or using it for other things quickly sucked out the motivation to keep using it.

Repurposing 360 video

While continuing to develop new 360 cameras, Insta360 realized that 360 video could be utilized for something other than just regular 360 spherical video: overcapture and subsequent reframing for “traditional”, “flat” video. What does this mean in plain English? Well, the original spherical video that is captured is much bigger in terms of resolution/size than the one that you want as a final product (for instance classic 1920×1080) which gives you the freedom to choose your angle and perspective in post production and even create virtual camera movement and other cool effects. Insta360 by no means invented this idea but they were clever enough to shift their focus towards this use case. Add to that the marketing gold feature of the “invisible selfie-stick” (taking advantage of a dual-lens 360 camera’s blindspot between its lenses), brilliant “Flow State” stabilization and a powerful mobile app (Android & iOS) full of tricks, and you end up with a significant popularity boost for your products!

The One X and the wait for a true successor

The one camera that really proved to be an instant and long-lasting success for Insta360 was the One X which was released in 2018. A very compact & slick form factor, ease of use and very decent image quality (except in low light) plus the clever companion app breathed some much-needed life into a fairly wrinkled and deflated 360 video camera balloon. In early 2020 (you know, the days when most of us still didn’t know there was a global pandemic at our doorstep), Insta360 surprised us by not releasing a direct successor to everybody’s darling (the One X) but the modular One R, a flexible and innovative but slightly clunky brother to the One X. It wasn’t until the end of October that Insta360 finally revealed the true successor to the One X, the One X2.

In the months prior to the announcement of the One X2, I had actually thought about getting the original One X (I wasn’t fully convinced by the One R) but it was sold out in most places and there were some things that bothered me about the camera. To my delight, Insta360 seemed to have addressed most of the issues that I (and obviously many others) had with the original One X: They improved the relatively poor battery life by making room for a bigger battery, they added the ability to connect an external mic (both wirelessly through Bluetooth and via the USB-C port), they included a better screen on which you could actually see things and change settings in bright sunlight, they gave you the option to stick on lens guards for protecting the delicate protruding lenses and they made it more rugged including an IPX8 waterproof certification (up to 10m) and a less flimsy thread for mounting it to a stick or tripod. All good then? Not quite. Just by looking at the spec sheet, people realized that there wasn’t any kind of upgrade in terms of video resolution or even just frame rates. It’s basically the same as the One X. It maxes out at 5.7k (5760×2880) at 30fps (with options for 25 and 24), 4k at 50fps and 3k at 100fps. The maximum bitrate is 125 Mbit/s. I’m sure quite a few folks had hoped for 8k (to get on par with the Kandao Qoocam 8K) or at the very least a 50/60fps option for 5.7k. Well, tough luck.

While I can certainly understand some of the frustration about the fact that there hasn’t been any kind of bump in resolution or frame rates in 2 years, putting 8K in such a small device and also having the footage work for editing on today’s mobile devices probably wasn’t a step Insta360 was ready to take because of the possibility of a worse user experience despite higher resolution image quality. Personally, I wasn’t bothered too much by this since the other hardware improvements over the One X were good enough for me to go ahead and make the purchase. And this is where my own frustrations began…

Insta360 & me: It’s somewhat difficult…

While I was browsing the official Insta360 store to place my order for the One X2, I noticed a pop-up that said you could get 5% off your purchase if you sign up for their newsletter. They did exclude certain cameras and accessories but the One X2 was mentioned nowhere. So I thought, “Oh, great! This just comes at the right time!”, and signed up for the newsletter. After getting the discount code however, entering it during the check-out always returned a “Code invalid” error message. I took to Twitter to ask them about this – no reply. I contacted their support by email and they eventually and rather flippantly told me something like “Oh, we just forgot to put the X2 on the exclusion list, sorry, it’s not eligible!”. Oh yes, me and the Insta360 support were off to a great start!

Wanting to equip myself with the (for me) most important accessories, I intended to purchase a pair of spare batteries and the microphone adapter (USB-C to 3.5mm). I could write a whole rant about how outrageous I find the fact that literally everyone seems to make proprietary USB-C to 3.5mm adapters that don’t work with other brands/products. E-waste galore! Anyway, there’s a USB-C to 3.5mm microphone adapter from Insta360 available for the One R and I thought, well, maybe at least within the Insta360 ecosystem there should be some cross-device compatibility. Hell no, they told me the microphone adapter for the One R doesn’t work with the One X2. Ok, so I need to purchase the more expensive new one for the X2 – swell! But wait, I can’t, because while it’s listed in the Insta360 store, it’s not available yet. And neither are extra batteries. The next bummer. So I bought the Creator Kit including the “invisible” selfie-stick, a small tripod, a microSD card, a lens cap and a pair of lens guards.

A couple of weeks later, the package arrived – no problem, in the era of Covid I’m definitely willing to cut some slack in terms of delivery times and the merchandise is sent from China so it has quite a way to go to Germany. I opened the package, took out the items and checked them to see if anything was broken. I noticed that one of the lens guards had a small blemish/scratch on it. I put them on the camera anyway thinking maybe it wouldn’t really show in the footage. Well, it did. A bit annoying but stuff like that happens, a lemon. I contacted the support again. They wanted me to take a picture of the affected lens guard. Okay. I sent them the picture. They blatantly replied that I should just buy a new one from their store, basically insinuating that it was me who damaged the lens guard. What terrible customer service! I suppose I would have mustered up some understanding for their behaviour if I had contacted them a couple of days or weeks later after actually using the X2 for some time outdoors where stuff can quickly happen. But I got in touch with them the same day the delivery arrived and they should have been able to see that since the delivery had a tracking number. Also, this item costs 25 bucks in the Insta360 store but probably only a few cents in production, and I wasn’t even asking about a pair but only one – why make such a fuss about it? So there was some back-and-forth and only after I threatened to return the whole package and asked for a complete refund did they finally agree to send me a replacement pair of lens guards at no extra cost. On a slightly positive note, they did arrive very quickly, only a couple of days later.

Is the Insta360 One X2 actually a good camera?

So what an excessive prelude I have written! What about the camera itself? I have to admit that for the most part, it’s been a lot of fun so far after using it for about a month. The design is rugged yet still beautifully simplistic and compact, the image quality in bright, sunny conditions is really good (if you don’t mind that slightly over-sharpened wide-angle look and that it’s still “only” 5.7k – remember this resolution is for the whole 360 image so it’s not equivalent to a 5.7k “flat” image), the stabilization is generally amazing (as long as the camera and its sensor are not exposed to extreme physical shakes which the software stabilization can’t compensate for) and the reframing feature in combination with the camera’s small size and weight gives you immense flexibility in creating very interesting and extraordinary shots.

Sure, it also has some weaknesses: Despite having a 5.7k 360 resolution, if you want to export as a regular flat video, you are limited to 1080p. If you need your final video to be in UHD/4K non-360 resolution, this camera is not for you. The relatively small sensor size (I wasn’t able to find out the exact size for the X2 but I assume it’s the same as the One X, 1/2.3″) makes low-light situations at night or indoors a challenge despite a (fixed) aperture of f/2.0 – even a heavily overcast daytime sky can prove less than ideal. Yes, a slightly bigger sensor compared to its predecessors would have been welcome. The noticeable amount of image noise that is introduced by auto-exposure in such dim conditions can be reduced by exposing manually (you can set shutter speed and ISO) but then of course you just might end up with an image that’s quite dark. The small sensor also doesn’t allow for any fancy “cinematic” bokeh but in combination with the fixed focus it also has an upside that shouldn’t be underestimated for self-shooters: You don’t have to worry about a pulsating auto-focus or being out of focus as everything is always in focus. You can also shoot video in LOG (flatter image for more grading flexibility) and HDR (improved dynamic range in bright conditions) modes. Furthermore, there’s a dedicated non-360 video mode with a 150 degree field-of-view but except for the fact that you get a slight bump in resolution compared to flat reframed 360 video (1440p vs. 1080p) and smaller file sizes (you can also shoot your 5.7k in the H.265 codec to save space), I don’t see myself using this a lot as you lose all the flexibility in post.
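To put that 1080p limit for flat exports into perspective, here’s a rough back-of-the-envelope sketch (my own approximation, nothing official from Insta360): the 5760 horizontal pixels of a 5.7k equirectangular frame are spread across the full 360 degrees, so a reframed flat crop only ever uses a slice of them. Assuming a simple proportional crop at the equator and ignoring projection distortion:

```python
# Rough approximation: how many "real" source pixels a reframed flat crop
# can pull from a 5.7k (5760x2880) equirectangular frame.
EQUI_WIDTH = 5760  # horizontal pixels covering the full 360 degrees

def source_width_for_fov(h_fov_deg: float) -> float:
    """Approximate source pixel width available for a crop with the given horizontal FOV."""
    return EQUI_WIDTH * h_fov_deg / 360.0

for fov in (80, 100, 120, 150):
    print(f"{fov:>3} deg horizontal FOV ~ {source_width_for_fov(fov):.0f} source pixels wide "
          f"(1920 needed for a clean 1080p frame)")
```

By this estimate, only crops of roughly 120 degrees horizontal field of view or wider actually contain 1920 pixels’ worth of detail, which is why the 1080p cap on flat exports is less of a bottleneck than the spec sheet might suggest.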

While it’s good that all the stitching is done automatically and the camera does a fairly good job, it’s not perfect and you should definitely familiarize yourself with where the (video) stitchline goes to avoid it in the areas where you capture important objects or persons, particularly faces. As a rule of thumb when filming yourself or others you should always have one of the two lenses pointed towards you/the person and not face the side of the camera. It’s fairly easy to do if you usually have the camera in the same position relative to yourself but becomes more tricky when you include elaborate camera movements (which you probably will as the X2 basically invites you to do this!).

Regarding the audio, the internal 4-mic ambisonic setup can produce good results for ambient sound, particularly if you have the camera close to the sound source, like when you have it on a stick pointing down and you are walking over fresh snow, dead leaves, gravel etc. For recording voices in good quality, you also need to be pretty close to the camera’s mics, having it on a fully extended selfie-stick isn’t ideal. If you want to use the X2 on an extended stick and talk to the camera you should use an external mic, either one that is directly connected to the camera or plugged into an external recorder, then having to sync audio and video later in post. As I have mentioned before, the X2 now does offer support for external mics via the USB-C charging port with the right USB-C-to-3.5mm adapter and also via Bluetooth. Insta360 highlights in their marketing that you can use Apple’s AirPods (Pro) but you can also use other mics that work via Bluetooth. The audio sample rate of Bluetooth mics is currently limited to 16kHz by the standard but depending on the used mic you can get decent audio. I’ll probably make a separate article on using external mics with the X2 once my USB-C to 3.5mm adapter arrives. Wait, does the X2 shoot 360 photos as well? Of course it does, they turn out quite decent, particularly with the new “Pure Shot” feature, and the stitching is better than in video mode. It’s no secret though that the X2 has a focus on video with all its abilities and for those that mainly care about 360 photography for virtual tours etc., the offerings in the Ricoh Theta line will probably be the better choice.

The Insta360 mobile app

The Insta360 app (Android & iOS) might deserve its own article to get into detail but suffice it to say that while it can seem a bit overwhelming and cluttered occasionally and you also still experience glitches now and then, it’s very powerful and generally works well. Do note however that if you want to export in full 5.7k resolution as a 360 video you have to transfer the original files to a desktop computer and work with them in the (free) Insta360 Studio software (Windows/macOS) as export from the mobile app is limited to 4K. You should also be aware of the fact that neither the mobile app nor the desktop software works as a fully-fledged traditional video editor for immersive 360 video where you can have multiple clips on a timeline and arrange them for a story. In the mobile app, you do get such an editing environment (“Stories” – “My Stories” – “+ Create a story”) but while you can use your original spherical 360 footage here, you can only export the project as a (reframed) flat video (max resolution 2560×1440). If you need your export to be an actual 360 video with according metadata, you can only do this one clip at a time outside the “Stories” editing workspace. But as mentioned before, Insta360 focuses on the reframing of 360 video with its cameras and software, so not too many people might be bothered by that. One thing that really got on my nerves while editing within the app on an iPad: When you are connected to the X2 over WiFi, certain parts of the app that rely on a data connection don’t work, for instance you are not able to browse all the features of the shot lab (only those that have been cached before) or preview/download music tracks for the video. This is less of a problem on a phone where you still can have a mobile data connection while using a WiFi connection to the X2 (if you don’t mind using up mobile data) but on an iPad or any device that doesn’t have an alternative internet connection, it’s quite annoying. You have to download the clip, then disconnect from the X2, re-connect to your home WiFi and then download the track to use.

Who is the One X2 for?

Well, I’d say that it can be particularly useful for solo-shooters and solo-creators for several reasons: Most of all you don’t have to worry much about missing something important around you while shooting since you are capturing a 360 image and can choose the angle in post (reframing/keyframed reframing) if you export as a regular video. This can be extremely useful for scenarios where there’s a lot to see or happening around you, like if you are travel-vlogging from interesting locations or are reporting from within a crowd – or just generally if you want to do a piece-to-camera but also show the viewer what you are seeing at that very moment. Insta360’s software stabilization is brilliant and comparable to a gimbal and the “invisible” selfie-stick makes it look like someone else is filming you. The stick and the compact form of the camera also let you move the camera to places that seem impossible otherwise. With the right technique you can even do fake “drone” shots. Therefore it also makes sense to have the X2 in your tool kit just for special shots, even if you are neither a vlogger nor a journalist and aren’t interested in “true” 360 video.

A worthy upgrade from the One X / One R?

Should you upgrade if you have a One X or One R? Yes and no. If you are happy with the battery life of the One X or the form factor of the One R and were mainly hoping for improved image quality in terms of resolution / higher frame rates, then no, the One X2 does not do the trick, it’s more of a One X 1.5 in some ways. However, if you are bothered by some “peripheral” issues like poor battery life, very limited functionality of the screen/display, lack of external microphone support (One X) or the slightly clunky and cumbersome form factor / handling (One R) and you are happy with a 5.7k resolution, the X2 is definitely the better camera overall. If you have never owned a 360 (video) camera, this is a great place to start, despite its quirks – just be aware that Insta360’s support can be surprisingly cranky and poor in case you run into any issues.

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#35 Using external microphones with iPhones when shooting video — 1. December 2020

#35 Using external microphones with iPhones when shooting video

I usually don’t follow the stats for my blog but when I recently did check on what articles have been the most popular so far, I noticed that a particular one stuck out by a large margin and that was the one on using external microphones with Android devices. So I thought if people seem to be interested in that, why not make an equivalent for iOS, that is for iPhones? So let’s jump right into it.

First things first: The Basics

A couple of basic things first: Every iPhone has a built-in microphone for recording video that, depending on the use case, might already be good enough if you can position the phone close to your talent/interviewee. Having your mic close to the sound source is key in every situation to get good audio! As a matter of fact, the iPhone has multiple internal mics and uses different ones for recording video (next to the lens/lenses) and pure audio (bottom part). When doing audio-only for radio etc., it’s relatively easy to get close to your subject and get good results. It’s not the best way when recording video though if you don’t want to shove your phone into someone’s face. In this case you can and should significantly improve the audio quality of your video by using an external mic connected to your iPhone – never forget that audio is very important! While the number of Android phone makers that support the use of external mics within their native camera app is slowly growing, there are still many (most?) Android devices out there that don’t support this for the camera app that comes with the phone (it’s possible with basically every Android device if you use 3rd party camera apps though!). You don’t have to worry about this when shooting with the native camera app of an iPhone. The native camera app will recognize a connected external mic automatically and use it as the audio input when recording video. When it comes to 3rd party video recording apps, many of them like Filmic Pro, MoviePro or Mavis support the use of external mics as well but with some of them you have to choose the audio input in the settings so definitely do some testing before using it the first time on a critical job. Although I’m looking at this from a videographer’s angle, most of what I am about to elaborate on also applies to recording with audio recording apps. And in the same way, when I say “iPhone”, I could just as well say “iPad” or “iPod Touch”. So there are basically three different ways of connecting an external mic to your iPhone: via the 3.5mm headphone jack, via the Lightning port and via Bluetooth (wireless).

3.5mm headphone jack & adapter

With all the differences between Android and iOS both in terms of hardware and software, the 3.5mm headphone jack was, for a while, a somewhat unifying factor – that was until Apple decided to drop the headphone jack for the iPhone 7 in 2016. This move became a widely debated topic, surely among the – let’s be honest – comparatively small community of mobile videographers and audio producers relying on connecting external mics to their phones but also among more casual users because they couldn’t just plug in their (often very expensive) headphones to their iPhone anymore. While the first group is definitely more relevant for readers of this blog, the second was undoubtedly responsible for putting the issue on the public debate map. Despite the considerable outcry, Apple never looked back. They did offer a Lightning-to-3.5mm adapter – but sold it separately. I’m sure they have been making a fortune since, don’t ask how many people had to buy it more than once because they lost, misplaced or broke the first one. A whole bunch of Android phone makers obviously thought Apple’s idea was a progressive step forward and started ditching the headphone jack as well, equipping their phones only with a USB-C port. Unlike with Apple however, consumers still had the option of choosing a new phone that kept the headphone jack and in a rather surprising turn of events, some companies like Huawei and Google actually backtracked and re-introduced the headphone jack, at least for certain models. Anyway, if you happen to have an older iPhone (6s and earlier) you can still use the wide variety of external microphones that can be connected via the 3.5mm headphone jack without worrying much about adapters and dongles.

Lightning port

While most Android users probably still have fairly fresh memories of a different charging port standard (microUSB) from the one that is common now (USB-C), only seasoned iPhone aficionados will remember the days of the 30-pin connector that lasted until the iPhone 5 introduced the Lightning port as a new standard in 2012. And while microUSB mic solutions for Android could be counted on one hand and USB-C offerings took forever to become a reality, there were dedicated Lightning mics even before Apple decided to kill the headphone jack. The most prominent one and a veritable trailblazer was probably IK Multimedia’s iRig Mic HD and its successor, the iRig Mic HD 2. IK Multimedia’s successor to the iRig Pre, the iRig Pre HD, comes with a Lightning cable as well. But you can also find options from other well-known companies like Zoom (iQ6, iQ7), Shure (MV88/MV88+), Sennheiser (HandMic Digital, MKE 2 Digital), Rode (VideoMic Me-L), Samson (Go Mic Mobile) or Saramonic (Blink 500). The Saramonic Blink 500 comes in multiple variations, two of them specifically targeted at iOS users: the Blink 500 B3 with one transmitter and the B4 with two transmitters. The small receiver plugs right into the Lightning port and is therefore an intriguingly compact solution, particularly when using it with a gimbal. Saramonic also has the SmartRig Di and SmartRig+ Di audio interfaces that let you connect one or two XLR mics to your device. IK Multimedia offers two similar products with the iRig Pro and the iRig Pro Duo. Rode recently released the USB-C-to-Lightning patch cable SC15 which lets you use their VideoMic NTG (which comes with TRS/TRRS cables) with an iPhone. There’s also a Lightning connector version of the SC6 breakout box, the SC6-L, which lets you connect two smartLavs or TRRS mics to your phone. I have dropped lots of product names here so far but you know what? Even if you don’t own any of them, you most likely already have an external mic at hand: Of course I’m talking about the headset that comes included with the iPhone! It can’t match the audio quality of other dedicated external mics but it’s quite solid and can come in handy when you have nothing else available. One thing you should keep in mind when using any kind of microphone connected via the iPhone’s Lightning port: unless you are using a special adapter with an additional charge-through port, you will not be able to charge your device at the same time like you can/could with older iOS devices that had a headphone jack.

Wireless/Bluetooth

I have mentioned quite a few wireless systems before (Rode Wireless Go, Saramonic Blink 500/Blink 500 Pro, Samson Go Mic Mobile) that I won’t list here (again) for one reason: While the TX/RX system of something like the Rode Wireless Go streams audio wirelessly between its units, the receiver unit (RX) needs to be connected to the iPhone via a cable or (in the case of the Blink 500) at least a connector. So strictly speaking it’s not really wireless when it comes to how the audio signal gets into the phone. Now, are there any ‘real’ wireless solutions out there? Yes, but the technology hasn’t evolved to a standard that can match wired or semi-wired solutions in terms of both quality and reliability. While there are in principle two ways of getting wireless audio into a phone (WiFi and Bluetooth), only one (Bluetooth) is currently in use for external microphones. This is unfortunate because the Bluetooth protocol that is used for sending audio back from an external accessory to the phone (the so-called Hands Free Profile, HFP) is limited to a sample rate of 16kHz (probably because it was created with headset phone calls in mind). Professional broadcast audio usually has a sample rate of 44.1 or 48kHz. That doesn’t mean that there aren’t any situations in which using a Bluetooth mic with its 16kHz limitation can actually be good enough. The Instamic was primarily designed to be a standalone ultra-compact high quality audio recorder which records 48/96 kHz files to its internal 8GB storage but can also be used as a truly wireless Bluetooth mic in HFP mode. The 16kHz audio I got when recording with Filmic Pro (here’s a guide on how to use the Instamic with Filmic Pro) was surprisingly decent. This probably has to do with the fact that the Instamic’s mic capsules are high quality unlike with most other Bluetooth mics. One maybe unexpected option is to use Apple’s AirPods/AirPods Pro as a wireless Bluetooth mic input. According to BBC Mobile Journalism trainer Marc Blank-Settle, the audio from the AirPods Pro is “good but not great”. He does however point out that in times of Covid-19, being able to connect to other people’s AirPods wirelessly can be a welcome trick to avoid close contact. Another interesting wireless solution comes from a company called Mikme. Their microphone/audio recorder works with a dedicated companion video recording app via Bluetooth and automatically syncs the quality audio (44.1, 48 or 96kHz) to the video after the recording has been stopped. By doing this, they work around the 16kHz Bluetooth limitation for live audio streaming. While the audio quality itself seems to be great, the somewhat awkward form factor and the fact that it only works with its best feature in their own video recording app but not other camera apps like Filmic Pro are noticeable shortcomings (you CAN manually sync the Mikme’s audio files to your Filmic or other 3rd party app footage in a video editor). At least regarding the form factor they have released a new version called the Mikme Pocket which is more compact and basically looks/works like a transmitter with a cabled clip-on lavalier mic. One more important tip that applies to all the aforementioned microphone solutions: If you are shooting outdoors, always have some sort of wind screen / wind muff for your microphone with you as even a light breeze can cause noticeable noise.
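If you ever want to double-check whether a recording actually came in at full bandwidth or was squeezed through the 16kHz HFP pipe, the file’s sample rate tells you right away. Here’s a minimal sketch using Python’s standard wave module, assuming a WAV file like one copied off a standalone recorder such as the Instamic (the filename is made up; for compressed audio inside a video container you’d reach for a tool like MediaInfo or ffprobe instead):

```python
import wave

# Check the sample rate of a WAV recording (hypothetical filename).
with wave.open("interview_take01.wav", "rb") as wav:
    rate = wav.getframerate()
    channels = wav.getnchannels()
    print(f"{rate} Hz, {channels} channel(s)")
    if rate <= 16000:
        print("Telephone-grade (HFP-limited) quality.")
    else:
        print("Full-bandwidth recording.")
```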

Micpocalypse soon?

Looking into the nearby future, some fear that Apple might be pulling another “feature kill” soon, dropping the Lightning port as well and thereby eliminating all physical connections to the iPhone. While there are no clear indications that this is actually imminent, Apple surely would be the prime suspect to push this into the market. If that really happens however, it will be a considerable blow to iPhone videographers as long as there’s no established high-quality and reliable wireless standard for external mics. Oh well, there’s always another mobile platform to go to if you’re not happy with iOS anymore 😉

To wrap things up, I have asked a couple of mobile journalists / content creators using iPhones what their favorite microphone solution is when recording video (or audio in general):

Wytse Vellinga (Mobile Storyteller at Omrop Fryslân, The Netherlands): “When I am out shooting with a smartphone I want high quality worry-free audio. That is why I prefer to use the well-known brands of microphones. Currently there are three microphones I use a lot. The Sennheiser MKE200, the Rode Wireless Go and the Mikme Pocket. The Sennheiser is the microphone that is on the phone constantly when taking shots and capturing the atmospheric sound and short sound bites from people. For longer interviews I use the wireless microphones from Mikme and Rode. They offer me freedom in shooting because I don’t have to worry about the cables.”

Philip Bromwell (Digital Native Content Editor at RTÉ, Ireland): “My current favourite is the Rode Wireless Go. Being wireless, it’s a very flexible option for recording interviews and gathering localised nat sound. It has proven to be reliable too, although the original windshield was a weakness (kept detaching).”

Nick Garnett (BBC Reporter, England & the world): “The mic I always come back to is the Shure MV88+ – not so much for video – but for audio work: it uses a non proprietary cable – micro usb to lightning. It allows headphones to plug into the bottom and so I can use it for monitoring the studio when doing a live insert and the mic is so small it hides in my hand if I have to be discreet. For video work? Rode VideoMicro or the Boya clone. It’s a semi-rifle, it comes with a deadcat and an isolation mount and it costs €30 … absolute bargain.”

Neal Augenstein (Radio Reporter at WTOP Washington DC, USA): “If I’m just recording a one-on-one interview, I generally use the built-in microphone of the iPhone, with a foam windscreen. I’ve yet to find a microphone that so dramatically improves the sound that it merits carrying it around. In an instance where someone’s at a podium or if I’m shooting video, I love the Rode Wireless Go. Just clipping it on the podium, without having to run cable, it pairs automatically, and the sound is predictably good. The one drawback – the tiny windscreen is tough to keep on.”

Nico Piro (Special Correspondent for RAI, Italy & the world): “To record ambient audio (effects or natural as you want to name it) I use a Rode Video Mic Go (light, no battery needed, perfect for both phones and cameras) even if I must say that the iPhone’s on-board mic performs well, too. For Facebook live I use a handheld mic by Polsen, designed for mobile, it is reliable and has a great cardioid pickup pattern. When it comes to interviews, the Rode Wireless Go beats everything for its compact dimensions and low weight. When you are recording in big cities like New York and you are worried about radio interferences the good old cabled mics are always there to help, so Rode’s SmartLav+ is a very good option. I’m also using it for radio production and I am very sad that Rode stopped improving its Rode Rec app which is still good but stuck in time when it comes to file sharing. Last but not least is the Instamic. It takes zero space and it is super versatile…if you use native camera don’t forget to clap for sync!”

Bianca Maria Rathay (Freelance iPhone videographer, Germany): “My favorite external microphone for the iPhone is the RODE Wireless Go in combination with a SmartLav+ (though it works on its own also). The mic lets your interviewee walk around freely, works indoors as well as outdoors and has a full sound. Moreover it is easy to handle and monitor once you have all the necessary adapters in place and ready.”

Leonor Suarez (TV Journalist and News Editor at RTPA, Spain): “My favorite microphone solutions are: For interviews: Rode Rodelink Filmmaker Kit. It is reliable, robust and has a good quality-price relationship. I’ve been using it for years with excellent results. For interviews on the go, unexpected situations or when other mics fail: IK Multimedia iRig Mic Lav. Again, good quality-price relationship. I always carry them with me in my bag and they have allowed me to record interviews, pieces to camera and unexpected stories. What I also love is that you can check the audio with headphones while recording.”

Marcel Anderwert (Mobile Journalist at SRF, Switzerland): “For more than a year, I have been shooting all my reports for Swiss TV with one of these two mics: Voice Technologies’ VT506Mobile (with its long cable) or the Rode Wireless Go, my favourite wireless mic solution. The VT506Mobile works with iOS and Android phones, it’s a super reliable lavalier and the sound quality for interviews is just great. Rode’s Wireless Go gives me more freedom of movement. And it can be used in 3 ways: As a small clip-on mic with inbuilt transmitter, with a plugged in lavalier mic – and in combination with a simple adapter even as a handheld mic.”

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#34 Apple is about to give us 25fps in the iPhone’s native camera app (finally catching up to Windows Phones) — 17. November 2020

#34 Apple is about to give us 25fps in the iPhone’s native camera app (finally catching up to Windows Phones)

One of the things that has mostly remained a blindspot in video recording with the native camera app of a smartphone, is the ability to shoot in PAL frame rates, i.e. 25/50fps. The native camera apps of smartphones usually record with a frame rate of 30/60 fps. This is fine for many use cases but it’s not ideal under two circumstances: a) if you have to deliver your video for traditional professional broadcast in a PAL broadcast standard region (Europe, Australia, parts of Africa, Asia, South America etc.) b) If you have a multi-camera shoot with dedicated ‘regular’ cameras that only shoot 25/50fps. Sure, it’s relatively easy to capture in 25fps on your phone by using a 3rd party app like Filmic Pro or Protake but it still would be a welcome addition to any native camera app as long as this silly global frame rate divide (don’t get me started on this!) continues to exist. There was actually a prominent example of a phone maker that offered 25fps as a recording option in their (quasi)-native camera app very early on: Nokia and later Microsoft on their Lumia phones running Windows Phone / Windows Mobile. But as we all know by now, Windows Phone / Windows Mobile never really stood a chance against Android and iOS (read about its potential here) and has all but disappeared from the smartphone market. When LG introduced its highly advanced manual video mode in the native camera app of the V10, I had high hopes they would include a 25/50fps frame rate option as they were obviously aiming at more ambitious videographers. But no, the years have passed and current offerings from the Korean company like the G8X, V60 and Wing still don’t have it. It’s probably my only major gripe with LG’s otherwise outstanding flagship camera app. It was up to Sony to rekindle the flame, giving us 25fps natively in the pro camera app of the Xperia 1 II earlier this year. 

And now, as spotted by BBC multimedia trainer Mark Robertson yesterday, Apple has added the option to record with a frame rate of 25fps in the native camera app on their latest iOS beta 14.3. This is a pretty big deal and I honestly didn’t expect Apple to make that move. But of course this is a more than welcome surprise! Robertson is using a new iPhone 12 Pro Max but his colleague Marc Blank-Settle also confirmed that this feature trickles down to the very old iPhone 6s, that is if you run the latest public beta version of iOS. The iPhone 6 and older models are excluded as they are not able to run iOS 14. While it’s not guaranteed that all new beta features make it to the finish line for the final release, I consider it to be very likely. So how do you set your iPhone’s native camera app to shoot video in 25fps? Go into your iPhone’s general settings, scroll down to “Camera” and then select “Record Video”. Now locate the “Show PAL Formats” toggle switch and activate it, then choose either “1080p HD at 25fps” or “4K at 25fps”. Unfortunately, there’s no 50fps option at this moment, I’m pretty sure it will come at some point in the future though. I recorded several clips with my iPhone SE 2020 and tested the frame rate via the MediaInfo app which revealed a clean 25.000fps and CFR (Constant Frame Rate, smartphones usually record in VFR = Variable Frame Rate). What other implications does this have? Well, many interested in this topic have been complaining about Apple’s own iOS editing app iMovie not supporting 25/50fps export. You can import and edit footage recorded in those frame rates no problem but it will be converted to 30/60fps upon export. I believe that there’s a good chance now that Apple will support 25/50fps export in a future update of iMovie because why bother integrating this into the camera app when you can’t deliver in the same frame rate? Android phone makers in the meantime should pay heed and consider adding 25/50fps video recording to their native camera apps sooner rather than later. It may not be relevant for the majority of conventional smartphone users but it also doesn’t hurt and you can make certain “special interest” groups very happy!
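If you’d rather verify the frame rate on a desktop than with the MediaInfo app, ffprobe (part of FFmpeg) reports it too. Here’s a small sketch along those lines – the filename is hypothetical, and comparing the declared against the averaged frame rate is only a rough indicator of constant frame rate, not a definitive test:

```python
import json
import subprocess

CLIP = "IMG_0342.MOV"  # hypothetical filename

# Ask ffprobe for the declared (r_frame_rate) and averaged (avg_frame_rate)
# frame rate of the first video stream; if both read e.g. "25/1", the clip
# is most likely constant frame rate.
out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=r_frame_rate,avg_frame_rate",
     "-of", "json", CLIP],
    capture_output=True, text=True, check=True,
).stdout

stream = json.loads(out)["streams"][0]
print("declared:", stream["r_frame_rate"], "| averaged:", stream["avg_frame_rate"])
```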

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#33 Auto-transcribe all your audio for free with Live Transcribe! — 26. October 2020

#33 Auto-transcribe all your audio for free with Live Transcribe!

While writing my last blog post about Google Recorder 2.0, I stumbled upon a hack that can also be utilized for another app from Google, one that currently understands over 70 languages, not only English: It’s called “Live Transcribe & Sound Notifications” and is available for pretty much every Android device. Have you always been looking for a tool that transcribes your audio recordings but doesn’t require an expensive subscription? Here’s what I like to think is a very useful and simple trick for achieving this on an Android phone. You will need the following things:

  • an Android device running at least Android 5.0 Lollipop (if your phone is less than 5 years old, you should be safe!)
  • the app Live Transcribe & Sound Notifications by Google (free download on the Google Play Store)
  • an internet connection (either mobile data or wifi)
  • a quiet environment

Let’s say you have recorded some audio like an interview, a meeting, a vox pop, a voice-over for video or even a podcast on your smartphone (look here for some good audio recorder apps) and would like to have a text transcription of it. If you’re reading this before making such a recording, do include a few seconds of silence before anyone starts talking, and make sure the recording is of good quality in terms of speech clarity; the reasons will become obvious soon.

Here’s how it works!

Live Transcribe can be used to transcribe speech/audio from over 70 languages.

Open Live Transcribe and check the input language displayed in the bottom toolbar (if the toolbar isn’t there, just tap somewhere on the screen). It needs to be the same as the language of the recording you want to have transcribed. If it’s a different one, tap on the gear icon, then on “More settings” and choose the correct language. Unlike Google Recorder, which I wrote about in my last article, Live Transcribe works with a vast number of languages, not only English. Also unlike Recorder however, Live Transcribe needs an active internet connection to transcribe; you can’t use it offline! If you are planning on pasting the transcription into a context with a white background later on, you should make sure that “Dark Theme” is disabled in Live Transcribe, otherwise you will be pasting white text onto a white background.

Leave the settings menu and check that Live Transcribe’s main screen says “Ready to transcribe” in the center. Now double-check that you are in a quiet environment, leave Live Transcribe and open the audio recording app. Locate the recording you want to have transcribed and start the playback of the file (do make sure the speaker volume is sufficient!), then quickly switch over to Live Transcribe. One way to do this is to use Android’s “Recent Apps” feature, which can be accessed by tapping on the square icon in the 3-button navigation bar – some Android phone makers use a different icon; Samsung for instance now has three vertical lines instead of a square. If you are using gesture navigation, swipe up from the bottom and hold. But you can also just leave the audio recording app and open Live Transcribe again without going into recent apps. The recording will keep playing, with Live Transcribe picking up the audio from the phone’s speaker(s) and doing its transcription thing as if someone was talking into the phone’s mic directly. This actually works! Don’t worry if you notice mistakes in the transcription, you can fix them later.

Once the recording and subsequently the transcription is finished, long-tap on any word, choose “Select transcription” and then “Copy”. You have now copied the whole transcription to the clipboard and can paste it anywhere you like: email, Google Docs etc. That’s also where you can correct any mistakes that Live Transcribe has made (within Live Transcribe, there’s no option for editing the transcription yet). Two more things: You can have Live Transcribe save your transcripts for three days (activate auto-save in the settings under “More settings”) and if you want to clear out the app’s transcription cache, you can also do this under “More settings”, then choose “Delete history”.

Can you do the same with video recordings?

When in recent apps view, tap the app’s icon to show a pop-up menu. This menu looks slightly different on different Android devices. LG G8X (center), Pixel 3 (right).
Active app windows of Live Transcribe and Google Photos on one screen using “Pop-up window” feature on the LG G8X.

What about video recordings? Could you have them transcribed via Live Transcribe as well? Basically yes, but it’s not quite as easy, at least if you want to do it using only one device (it’s very easy if you use a second device for playback). When you leave an app that’s playing back a video, the video (and with it its audio) will stop playing, so there’s nothing for Live Transcribe to listen to. You can work around this by using Android’s split-screen or multi-window feature to actively run more than one app at the same time.

On Android 7 and 8 you can access split-screen mode by long-pressing the square icon (recent apps) in the bottom navigation bar and selecting the app(s) you want to run in split-screen mode. Things have changed with Android 9 however. For one, gesture navigation was introduced as an alternative to the “old” 3-button navigation bar. So if you are using gesture navigation, you access recent apps by swiping up from the bottom and then holding. If you use the 3-button navigation, long-pressing the square icon doesn’t do anything anymore. Instead, just tap it once to access the recent apps view, tap on the app’s icon at the top of the window and you will get a pop-up menu. Depending on what Android phone you are using, the menu items will be slightly different, or at least they are named differently: On my LG G8X I get “App info”, “Multi window”, “Pop-up window” and “Pin app”, on my Pixel 3 I get “App info”, “Split screen”, “Freeform” and “Pause app”. The items you will want to choose to run two apps side by side are “Multi window” (G8X) / “Split screen” (Pixel 3), which will split the screen in half, or “Pop-up window” (G8X) / “Freeform” (Pixel 3), which will display the app(s) in a small, desktop-like window that you can move around freely. By doing this, you can play back a video clip and have Live Transcribe running at the same time. Of course you can also use this feature to have both Live Transcribe and the playback of an audio recording app on the same screen simultaneously, but for audio file transcriptions you don’t have to go the extra mile.

Can I do this on an iPhone as well?

Google Translate main interface on Android (top) and iOS (bottom).

Google has a whole range of apps for iOS, but unfortunately, Live Transcribe isn’t among them – it’s currently Android-only. But hey, maybe you have an older Android phone in your drawer that you could put to good use again? That being said, there is the possibility that Google will eventually release an iOS version of Live Transcribe or Apple will come up with an app that does something similar. I also thought of another way, using a Google app that is already available for iOS: Google Translate. Yes, it’s meant for translation and not transcription, but in the Android version you can also find a “Transcribe” button. Initially, using this will only give you a transcription in the translated language, but if you tap the cog wheel in the bottom left corner and choose “Show original text”, you will actually get a transcription of the original language which you can then copy and paste. When checking the iOS version of Translate though, I noticed that there is no “Transcribe” button. There is a “Voice” button (which in the Android version has been moved to the search bar) but this will only pick up a limited amount of input and is quite slow. There’s also no “Show original text” option. I suppose there’s a chance that Google will update its iOS version to match the Android version, but there are no guarantees. The Android version of Google Photos has had a pretty impressive video stabilization feature for quite a while now, something that is still missing from the iOS version. It might be a purely strategic thing and Google wants to give certain features only to users of its own mobile operating system, but it might also be for technical reasons, such as the core transcription engine being so deeply rooted in the Android system that it’s just not possible to tap into it on iOS, where Google is “just” a 3rd party app developer. Let’s see how things will turn out in the coming months.

If you have any questions or comments, leave them here or hit me up on the Twitter @smartfilming. Do also consider subscribing to my Telegram Newsletter to get notified about new blog posts and receive the new “Ten Takeaways Telegram” monthly bullet point recap of what happened in the world of mobile video creation during the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂