smartfilming

Exploring the possibilities of video production with smartphones

#46 Top tips for smartphone videography in the summer — 28. June 2021

Photo: Julia Volk via Pexels.com

It’s the dog days of summer again – well, at least if you live in the northern hemisphere or near the equator. While many people will be happy to finally escape the long lockdown winter and are looking forward to meeting friends and family outside, intense sunlight and heat can put extra stress on the body – and they create some obvious and less obvious challenges for videography. Here are some tips and ideas to tackle those challenges.

Icon: Alexandr Razdolyanskiy via The Noun Project

Find a good time/spot!
Generally, some of the problems mentioned later on can be avoided by picking the right spot and/or time for an outdoor shoot during the summer. Maybe don’t set up your shot in the middle of a big open field where you and your phone are totally exposed to the full load of sunshine photons at high noon. Instead, try to shoot in the morning, late afternoon or early evening, and think about picking a spot in the shade. Or choose a time when it’s slightly overcast. Of course it’s not always possible to freely choose time and spot – sometimes you just have to work in difficult conditions.

“Bum to the sun” – yes or no?
There’s a saying that you should turn your “bum to the sun” when shooting video. This definitely holds some truth, as pointing the lens directly towards the sun can cause multiple problems, including unwanted lens flare, underexposed faces or a blown-out background. You can, however, also create artistically interesting shots that way (silhouettes, for instance), and the “bum to the sun” motto comes with problems of its own: if you are shooting away from the sun but the person you are filming is looking directly towards it, they could be blinded by the intense sunlight and squint their eyes, which doesn’t look very flattering. If the sun is low, you also might have your own shadow in the shot. So I think the saying is something to take into consideration but shouldn’t be followed blindly in every situation.

Check the sky!
Clouds can severely impact the amount of sunlight that reaches the ground. So if you have set up an interview or a longer shot and locked the exposure at a moment when there isn’t a single cloud in front of the sun, a nearby cloud might already be creeping along that will take away lots of light later on and leave you with an underexposed image at some point. Or vice versa. So either do your thing when there are no (fast-moving) clouds in the vicinity of the sun, or when the cloud cover will stay fairly constant for the next few minutes.

Use an ND filter!
As I pointed out in my last blog post The Smartphone Camera Exposure Paradox, a bright sunny day can create exposure problems with a smartphone if you want to work with the “recommended” shutter speed (double the frame rate, for instance 1/50s at 25fps) or at least an acceptable one, because phones only have a fixed, wide-open aperture. Even with the lowest ISO setting, you will still have to use a (very) fast shutter speed that can make motion appear jerky. That’s why it’s good to have a neutral density (ND) filter in your kit, which reduces the amount of light that hits the sensor. There are two different kinds of ND filters: fixed and variable. The latter lets you adjust the strength of the filtering effect. Unlike the lenses on dedicated cameras, smartphone lenses don’t have a filter thread, so you either have to use some sort of case or rig with a filter thread or a clip-on ND filter.
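To pick a filter strength, you can estimate in stops how much light the ND needs to cut: each stop halves the light, so the ratio between your target shutter speed and the one the phone actually meters tells you the required strength. A minimal sketch (the metered 1/2000s is a made-up example value):

```python
import math

def nd_stops_needed(metered_shutter: float, target_shutter: float) -> float:
    """How many stops of light an ND filter must cut so that the
    slower target shutter speed gives the same exposure as the
    faster metered one, all else being equal.
    Shutter speeds are in seconds, e.g. 1/2000 -> 0.0005."""
    return math.log2(target_shutter / metered_shutter)

# Example: the phone meters 1/2000s at base ISO on a sunny day,
# but we want the "recommended" 1/50s for 25fps.
stops = nd_stops_needed(1/2000, 1/50)
print(round(stops, 2))  # 5.32 -> roughly an ND32 (5-stop) filter
```

An ND filter’s number doubles per stop (ND2 = 1 stop, ND4 = 2 stops, ND32 = 5 stops), so in this example an ND32 or ND64 would be a sensible choice.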

Shoot LOG! (Well, maybe…)
Some 3rd party video recording apps and even a few native camera apps allow you to shoot with a LOG picture profile. A log profile distributes exposure and color information along a logarithmic rather than a linear curve, compared to a “normal” non-log profile. By doing this you basically gain a bit more dynamic range (the range spanning between the brightest and darkest areas of an image), which can be very useful in high-contrast scenarios like a sunny day with extreme highlights and shadows. It also gives you more flexibility for grading in post to achieve the look you want. This however comes with some extra work, as pure log footage looks rather dull/flat and usually needs grading to look “pretty” as a final result. It is possible though to apply so-called LUTs (simply put: a pre-defined set of grading parameters) to log footage to reduce or even avoid manual grading.
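The basic idea of a log curve can be illustrated with a toy transfer function – this is purely illustrative and not any vendor’s actual log profile (real profiles each use their own carefully tuned curves):

```python
import math

def toy_log_encode(linear: float, a: float = 64.0) -> float:
    """Toy logarithmic transfer curve (NOT any real log profile):
    maps linear scene light in [0..1] to an encoded value in [0..1],
    lifting shadows and compressing highlights so more of the
    scene's range fits into the recorded signal."""
    return math.log1p(a * linear) / math.log1p(a)

for lin in (0.01, 0.1, 0.5, 1.0):
    print(f"linear {lin:>4} -> encoded {toy_log_encode(lin):.2f}")
```

Note how a deep shadow at 1% of scene light ends up well above 10% of the encoded range – that lifted, low-contrast look is exactly why ungraded log footage appears flat.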

Get a white case!

Ever heard of the term “albedo”? It designates the proportion of sunlight (or, to be more precise, solar radiation) that is reflected by an object. Black objects reflect less and absorb more solar radiation (lower albedo) than white objects (higher albedo). You can easily get a feeling for the difference by wearing a black or a white shirt on a sunny day. Similarly, if you expose a black or dark-colored phone to intense sunlight, it will absorb more heat than a white or light-colored phone and therefore be more prone to overheating. So if you have a black or dark-colored phone, it might be a good idea to get yourself a white case so more sunlight is reflected off the device. Vice versa, if you have a white or light-colored phone with a black case, take the case off. Be aware though that a white case only reduces the absorption of “external heat” from solar radiation, not the internal heat generated by the phone itself – something that particularly happens when you shoot in 4K/UHD or at high frame rates or bit rates. You should also consider that a case that fits super tight might reduce the phone’s ability to dissipate internal heat. Ergo: a white phone (or case) only offers some protection against direct solar radiation, not against internal heat produced by the phone itself or high ambient temperatures.

Maximize screen brightness!
This is pretty obvious. Of course bright conditions make it harder to see the screen and judge framing, exposure and focus so it’s good to crank up the screen brightness. Some camera apps let you switch on a feature that automatically maximizes screen brightness when using the app.

Get a power bank!
Maximizing screen brightness will significantly increase battery consumption though, so you should think about having a back-up power bank at hand – at least if you are going on a longer shoot. But most of us already have one or two, so this might not even be an additional purchase.

Use exposure/focus assistants of your camera app!
Analytical assistant tools in certain camera apps can be very helpful in bright conditions when it’s hard to see the screen. While very few native camera apps offer some limited assistance in this respect, it’s an area where dedicated 3rd party apps like Filmic Pro, mcpro24fps, ProTake, MoviePro, Mavis etc. can really shine (pardon the pun). For setting the correct exposure you can use Zebra (displays stripes on overexposed areas of the frame) or False Color (renders the image into solid colors identifying areas of under- and overexposure – usually blue for underexposure and red for overexposure). For setting the correct focus you can use Peaking (displays a colored outline on things in focus) and Magnification (digitally magnifies the image). Not all of the mentioned apps offer all of these tools. And there’s also a downside: using these tools puts extra stress on your phone’s chipset, which means more internal heat – so only use them when setting exposure and focus for the shot, and turn them off once you are done.
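Conceptually, zebra and false color are just per-pixel threshold checks on luma. A toy sketch with made-up thresholds (real apps use their own IRE cut-offs and more color bands):

```python
def false_color(luma: float) -> str:
    """Toy false-color classification of a pixel's luma (0-100 IRE).
    Thresholds are illustrative, not those of any specific app."""
    if luma <= 5:
        return "blue"    # crushed shadows / underexposed
    if luma >= 95:
        return "red"     # clipped highlights / overexposed
    return "grey"        # within usable range

def zebra(luma: float, threshold: float = 95.0) -> bool:
    """Zebra stripes flag any pixel at or above the threshold."""
    return luma >= threshold

print(false_color(2), false_color(50), false_color(98))
print(zebra(98), zebra(70))
```

Running either check over every pixel of every preview frame is what makes these tools computationally expensive – hence the extra heat mentioned above.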

Photo: Moondog Labs

Use a sun hood!
Another way to better see the screen in sunny weather is to use a sun hood. There are multiple generic smartphone sun hoods available online, and also one from dedicated mobile camera gear company Moondog Labs. Watch out: SmallRig, a somewhat renowned accessory provider for independent videography and filmmaking, has a sun hood for smartphones in its portfolio, but it’s made for using a smartphone as a secondary device with regular cameras or drones, so there’s no cut-out for the lens or open back – which renders it useless if you want to shoot with the phone itself. Check for this before buying any other smartphone sun hood as well.

Photo: RollCallGames

Sweaty fingers?
An issue I encountered last summer on a bike tour, where I occasionally stopped to take some shots of interesting scenery along the road, was that sweaty hands/fingers can cause problems with a phone’s touch screen: touches aren’t registered, or they register in the wrong places. This can be quite annoying. Turns out there’s such a thing as “anti-sweat finger sleeves”, which were apparently invented for passionate mobile gamers. So I guess kudos to PUBG and Fortnite aficionados? There’s also another option: you can use a stylus or pen to navigate the touch screen. Users of the Samsung Galaxy Note series are clearly at an advantage here as the stylus comes with the phone.

Photo: George Becker via Pexels.com

Don’t forget the water bottle!
Am I going to tell you to cool your phone with a refreshing shower of bottled drinking water? Despite the fact that many phones nowadays offer some level of water resistance, the answer is no. I’m including this tip for two reasons: first, it’s always good to stay hydrated if you’re out in the sun – I have had numerous situations where I packed my gear bag with all kinds of stuff (most of which I didn’t need in the end) but forgot to include a bottle of water (which I desperately needed at some point). Second, you can use a water bottle as an emergency tripod in combination with a rubber band or hair tie, as shown in workshops by Marc Settle and Bernhard Lill. So yes, don’t forget to bring a water bottle!

Got other tips for smartphone videography in the summertime? Let us know!

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#45 The Smartphone Camera Exposure Paradox — 11. May 2021

Ask anyone about the weaknesses of smartphone cameras and people will often point towards a phone’s low-light capabilities as its Achilles heel – or at least one of them. When you are outside during the day, it’s relatively easy to shoot good-looking footage with your mobile device, even with a budget phone. Once it’s darker or you’re indoors, things get more difficult. The reason is essentially that the image sensors in smartphones are still pretty small compared to those in DSLMs/DSLRs or professional video/cinema cameras. Bigger sensors can collect more photons (light) and produce better low-light images. A so-called “Full Frame” sensor in a DSLM like Sony’s Alpha 7 series has a surface area of 864 mm², while a common 1/2.5” smartphone image sensor has only about 25 mm².

So why not just put a huge sensor in a smartphone? While cameras have undeniably become a very important factor in smartphones, the phone is still very much a multi-purpose device and not a single-purpose one like a dedicated camera – for better or worse. That means there are many things to consider when building a phone. I doubt anyone would want a phone with a form factor that no longer fits in a pocket, and the flat form factor makes it difficult to build proper optics around larger sensors. Larger sensors also consume more power and produce more heat – not exactly desirable.

If we are talking about smartphone photography from a tripod, some of the missing sensor size can be compensated for with long exposure times. The advancements in computational imaging and AI have also led to dedicated and often quite impressive photography “Night Modes” on smartphones. But very long shutter speeds aren’t really an option for video, as any movement appears extremely blurred – and while today’s chipsets can already handle supportive AI processing for photography, the far more resource-intensive processing for videography is yet a bridge too far.
So despite the fact that the latest developments signal we’re about to experience a considerable bump in smartphone image sensor sizes (Sony and Samsung are about to release 1-inch or almost-1-inch image sensors for phones), one could say that most if not all smartphone cameras (still) have a problem with low-light conditions. But you know what? They also have a problem with the exact opposite – very bright conditions!
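To put the sensor-size gap into numbers, here’s a quick back-of-the-envelope calculation based on the surface areas mentioned above:

```python
import math

# Light-gathering advantage of a full-frame sensor (864 mm², per the
# figures above) over a common 1/2.5" phone sensor (~25 mm²),
# expressed in photographic stops (each stop doubles the light).
full_frame_mm2 = 864
phone_mm2 = 25

ratio = full_frame_mm2 / phone_mm2
stops = math.log2(ratio)
print(f"{ratio:.1f}x the area, ~{stops:.1f} stops more light")
```

Roughly 35 times the collecting area, or about five stops – which is why the low-light gap is so hard to close with optics alone.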

If you know a little bit about how cameras work and how to set exposure manually, you have probably come across something called the “exposure triangle”. The exposure triangle contains the three basic parameters that let you set and adjust the exposure of a photo or video on a regular camera: shutter speed, aperture and ISO. In more general terms you could also say: time, size and sensitivity. Shutter speed signifies the amount of time that the still image or a single frame of video is exposed to light, for instance 1/50 of a second. The longer the shutter speed, the more light hits the sensor and the brighter the image will be. Aperture refers to the size of the iris opening through which the light passes before it hits the sensor (or, way back when, the film strip); it’s commonly measured in f-stops, for instance f/2.0. The bigger the aperture (= the SMALLER the f-stop number), the more light reaches the sensor and the brighter the image will be. ISO (or “gain” on some dedicated video cameras) finally refers to the sensitivity of the image sensor, for instance ISO 400. The higher the ISO, the brighter the image will be. Most of the time you want to keep the ISO as low as possible because higher sensitivity introduces more image noise.
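The stop arithmetic behind the triangle can be written down directly: each doubling of shutter time or ISO adds one stop, and the f-number enters squared because it describes a diameter while light gathering scales with area. A small sketch (the reference values are arbitrary):

```python
import math

def exposure_stops(shutter_s: float, f_number: float, iso: float,
                   ref=(1/50, 2.0, 100)) -> float:
    """Relative exposure in stops versus a reference setting.
    Exposure scales linearly with shutter time and ISO, and
    inversely with the square of the f-number.
    Reference here is an arbitrary 1/50s, f/2.0, ISO 100."""
    ref_shutter, ref_f, ref_iso = ref
    return (math.log2(shutter_s / ref_shutter)
            + math.log2(iso / ref_iso)
            + 2 * math.log2(ref_f / f_number))

# Halving the shutter time costs one stop:
print(round(exposure_stops(1/100, 2.0, 100), 2))  # -1.0
# Stopping down from f/2.0 to f/2.8 costs about one stop:
print(round(exposure_stops(1/50, 2.8, 100), 2))   # -0.97
```

With a fixed aperture, the third term is a constant – which is the crux of the smartphone problem described next.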

So what exactly is the problem with smartphone cameras in this respect? Well, unlike dedicated cameras, smartphones don’t have a variable aperture: it’s fixed and can’t be adjusted. Ok, there actually have been a few phones with a variable aperture – most notably, Samsung had one on the S4 Zoom (2013) and K Zoom (2014), introduced a dual-aperture approach with the S9/Note9 (2018), held on to it for the S10/Note10 (2019), but dropped it again for the S20/Note20 (2020). As you can see from this very limited selection, though, it has been more of an experiment.

The fixed aperture means that the exposure triangle for smartphone cameras only has two adjustable parameters: shutter speed and ISO. Why is this problematic? When there’s movement in a video (either because something moves within the frame or the camera itself moves), we as an audience have become accustomed to a certain degree of motion blur, which is related to the shutter speed used. The rule of thumb applied here says: double the frame rate. So if you are shooting at 24fps, use a shutter speed of 1/48s; at 25fps, use 1/50s; 1/60s for 30fps, and so on. This suggestion is not set in stone and in my humble opinion you can deviate from it to a certain degree without it becoming too obvious for casual, non-pixel-peeping viewers – but if the shutter speed is very slow, everything begins to look like a drug-induced stream-of-consciousness experience, and if it’s very fast, things appear jerky and shutter speed becomes stutter speed. So with the aperture being fixed and the shutter speed set at a “recommended” value, you’re left with ISO as the only adjustable exposure parameter. Reducing the sensitivity of the sensor is usually only technically possible down to an ISO between 50 and 100, which will still give you a (heavily) overexposed image on a sunny day outside.
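The “double the frame rate” rule itself is trivial to express:

```python
def recommended_shutter(fps: float) -> float:
    """'Double the frame rate' rule of thumb (the 180-degree
    shutter rule): shutter time is 1 / (2 * frame rate)."""
    return 1 / (2 * fps)

for fps in (24, 25, 30, 60):
    print(f"{fps}fps -> 1/{int(round(1 / recommended_shutter(fps)))}s")
```

On a sunny day a phone at base ISO may want something like 1/2000s instead of these values – several stops too much light, with no aperture left to close.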
So here’s our “paradox”: Too much available light can be just as much of an issue as too little when shooting with a smartphone.

What can we do about the two problems? Until significantly bigger smartphone image sensors or computational image enhancement for video arrive, the best way to tackle the low-light challenge is to provide your own additional lighting or look for more available light, be it natural or artificial. Depending on your situation, this might be relatively easy or downright impossible. If you are trying to capture an unlit building at night, you will most likely not have a sufficient number of ultra-bright floodlights at hand. If you are interviewing someone in a dimly lit room, a small LED light might provide just enough light to keep the ISO at a level without too much image noise.

Clip-on variable ND filter

As for the too-much-light problem (which ironically gets even worse as bigger sensors set out to remedy the low-light problem): try to pick a less sun-drenched spot, shoot with a faster shutter speed if there is little or no action in the shot, or – and this might be the most flexible solution – get yourself an ND (neutral density) filter that reduces the amount of light passing through the lens. While some regular cameras have built-in ND filters, this feature has yet to appear in any smartphone, although OnePlus showcased a prototype phone last year that had something close to a proper ND filter, using a technology called “electrochromic glass” to hide the lens while still letting (less) light pass through (check out this XDA Developers article). So until this actually makes it to the market and proves to be effective, the filter has to be an external one that is either clipped on or screwed on if you use a dedicated case with a corresponding filter thread. You also have the choice between a variable and a non-variable (fixed density) ND filter. A variable ND filter lets you adjust the strength of its filtering effect, which is great for flexibility, but it also has some disadvantages like the possibility of cross-polarization artifacts. If you want to learn more about ND filters, I highly recommend checking out this superb in-depth article by Richard Lackey.

So what’s the bigger issue for you personally? Low-light or high-light? 

#44 Split channels (dual mono) audio from the Rode Wireless Go II in LumaFusion — 4. May 2021

Rode just recently released the Wireless GO II, a very compact wireless audio system I wrote about in my last article. One of its cool features is that you can feed two transmitters into one receiver, so you don’t need two audio inputs on your camera or smartphone to work with two external mic sources simultaneously. What’s even cooler is that you can record the two mics into separate channels of a video file with split-track dual mono audio, so you are able to access and mix them individually later on. This can be very helpful if you need to make volume adjustments or eliminate unwanted noise from one mic that would otherwise be “baked in” with a merged track. There’s also the option to record a -12dB safety track into the second channel when you are using the GO II’s “merged mode” instead of the “split mode” – this can be a lifesaver when the audio of the original track clips because of loud input.

If you use a regular camera like a DSLM, it’s basically a given that you can record split-track dual mono, and it also isn’t rocket science to access the two individual channels in most desktop editing software. If you are using the GO II with a smartphone and want to finish the edit on mobile afterwards, it’s a bit more complicated.

First off, if you want to make use of split channels or the safety channel, you need to be able to record a video file with dual-track audio, because only then do you have two channels at your disposal – used either for mic 1 and mic 2, or for mics 1+2 combined plus the safety channel in the case of the Wireless GO II. Most smartphones and camera apps nowadays do support this (if they support external mics in general). The next hurdle is that you need to use the digital input port of your phone: USB-C on an Android device or the Lightning port on an iPhone/iPad. If you use the 3.5mm headphone jack (or an adapter like the 3.5mm-to-Lightning one on iOS devices), the input will either create single-channel mono audio or send the same pre-mixed signal to both stereo channels. So you will need a USB-C to USB-C cable for Android devices (Rode sells the SC16, but I also made it work with another cable) and a USB-C to Lightning cable for iOS devices (here the Rode SC15 seems to be the only compatible option) to connect the RX unit of the GO II to the mobile device. Unfortunately, such cables are not included with the GO II but have to be purchased separately. A quick note: depending on the app you are using, you either need to explicitly choose the external mic as the audio input in the app’s settings, or the app automatically detects it.
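To make the channel layout concrete: a split-mode recording is simply an interleaved two-channel stream, and pulling the two mics apart means de-interleaving it. Here’s a minimal sketch for a 16-bit WAV using only Python’s standard library (the file paths are placeholders; in practice the source would be the audio track extracted from your video file):

```python
import wave

def split_dual_mono(src_path: str, left_path: str, right_path: str) -> None:
    """Split a 2-channel (dual mono) 16-bit WAV into two mono WAVs."""
    with wave.open(src_path, "rb") as src:
        assert src.getnchannels() == 2 and src.getsampwidth() == 2
        params = src.getparams()
        frames = src.readframes(src.getnframes())

    # Interleaved 16-bit frames: L0 R0 L1 R1 ... (2 bytes per sample)
    left = bytearray()
    right = bytearray()
    for i in range(0, len(frames), 4):
        left += frames[i:i + 2]
        right += frames[i + 2:i + 4]

    for path, data in ((left_path, left), (right_path, right)):
        with wave.open(path, "wb") as dst:
            dst.setnchannels(1)
            dst.setsampwidth(2)
            dst.setframerate(params.framerate)
            dst.writeframes(bytes(data))
```

This is essentially what an editor does internally when you pick one channel of a dual mono file – it just reads every other sample.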

Once you have recorded a dual mono video file with separate channels and want to access them individually for adjustments, you also need editing software that allows you to do that. On desktop, it’s relatively easy with common prosumer or pro video editing software (I personally use Final Cut Pro), but on mobile devices there’s currently only a single option: LumaFusion, so far only available for iPhone/iPad. I briefly thought that KineMaster (which is available for both Android and iOS) could do it as well because it has an audio panning feature, but it’s not implemented in a way that can actually do what we need in this scenario.

So how do you access the different channels in LumaFusion? It’s actually quite simple: you either double-tap your video clip in the timeline or tap the pen icon in the bottom toolbar while the clip is selected. Select the “Audio” tab (speaker icon) and find the “Configuration” option on the right. In the “Channels” section, select either “Fill From Left” or “Fill From Right” to switch between the channels. If you need to use both channels at the same time and adjust/balance the mix, you will have to detach the audio from the video clip (either triple-tap the clip or tap the rectangular icon with an audio waveform), then duplicate the audio (rectangular icon with a +) and set the channel configuration of one copy to “Fill From Left” and the other to “Fill From Right”.

Here’s hoping that more video editing apps implement the ability to access individual audio tracks of a video file and that LumaFusion eventually makes it to Android.

#43 The Rode Wireless Go II review – Essential audio gear for everyone? — 20. April 2021

Australian microphone maker RØDE is an interesting company. For a long time, the main thing they had going for them was that they provided an almost-as-good but relatively low-cost alternative to high-end brands like Sennheiser or AKG and their established microphones, thereby “democratizing” decent audio gear for the masses. Over the last few years, however, Rode grew from “mimicking” the products of other companies into a highly innovative force, creating original products which others now mimic in return. Rode was first to come out with a dedicated quality smartphone lavalier microphone (smartLav+), for instance, and in 2019 the Wireless GO established another new microphone category: the ultra-compact wireless system with an inbuilt mic on the TX unit. It worked right out of the box with DSLMs/DSLRs, via a TRS-to-TRRS or USB-C cable with smartphones, and via a 3.5mm-to-XLR adapter with pro camcorders. The Wireless GO became an instant runaway success, and there’s much to love about it – seemingly small details like the clamp that doubles as a cold shoe mount are plain ingenuity. The Interview GO accessory even turns it into a super lightweight handheld reporter mic, and you are also able to use it like a more traditional wireless system with a lavalier mic that plugs into the 3.5mm jack of the transmitter. But it wasn’t perfect (how could it be as a first-generation product?). The flimsy attachable windscreen became sort of a running joke among GO users (I had my fair share of trouble with it), and many envied the ability of the similar Saramonic Blink 500 series (B2, B4, B6) to feed two transmitters into a single receiver – albeit without the ability to split channels. Personally, I also had occasional problems with interference when using it with an XLR adapter on bigger cameras and a Zoom H5 audio recorder.

Now Rode has launched a successor, the Wireless GO II. Is it the perfect compact wireless system this time around?

The most obvious new thing about the GO II is that the kit comes with two TX units instead of just one – already know where we are headed with this? Let’s talk about it in a second. A first look at the Wireless GO II’s RX and TX units doesn’t really reveal anything new – apart from the fact that they are labeled “Wireless GO II”, the form factor of the little black square boxes is exactly the same. That’s both good and maybe partly bad, I guess. Good because, just like the original Wireless GO, it’s a very compact system; “partly bad” because I suppose some would have loved to see the TX unit be even smaller for use as a standalone clip-on with the internal mic rather than with an additional lavalier. But I suppose the fact that you have a mic and a transmitter in a single piece requires a certain size at this point in time. The internal mic also pretty much seems to be the same, which isn’t a bad thing per se – it’s quite good! I wasn’t able to make out a noticeable difference in my tests so far, but maybe the improvements are too subtle for me to notice – I’m not an audio guy. Oh wait, there is one new thing on the outside: a new twist mechanism for the windscreen – and this approach actually works really well and keeps the windscreen in place, even if you pull on it. For those of us who use it outdoors, this is a big relief.

But let’s talk about the new stuff “under the hood”, and let me tell you, there’s plenty! First of all, as hinted at before, you can now feed two transmitters into one receiver. This is perfect if you need to mic up two persons for an interview. With the original Wireless GO you had to use two receivers and an adapter cable to make it work with a single audio input.

It’s even better that you can choose between a “merged mode” and a “split mode”. The “merged mode” combines both TX sources into a single pre-mixed audio stream; “split mode” sends the two inputs into separate channels (left and right of a stereo mix, so basically dual mono). The “split mode” is very useful because it allows you to access and adjust both channels individually afterwards – this can come in handy, for instance, if you have a two-person interview and one person coughs while the other one is talking. If the two sources are pre-mixed into the same channel (“merged mode”), you will not be able to eliminate the cough without affecting the voice of the person talking. When you have the two sources in separate channels, you can just mute the noisy channel for that moment in post. You can switch between the two modes by pressing both the dB button and the pairing button on the RX unit at the same time.

One thing you should be aware of when using the split-channels mode to record into a smartphone: this only works with the digital input port of the phone (USB-C on Android, Lightning on iPhone/iPad). If you use a TRS-to-TRRS cable and feed it into the 3.5mm headphone jack (or a 3.5mm adapter, like the one for the iPhone), the signal gets merged, as there is just one contact left on the plug for mic input – only allowing mono. If you want to use the GO II’s split channels feature with an iPhone, there’s currently only one reliable solution: Rode’s SC15 USB-C to Lightning cable, which unfortunately is a separate purchase (around 25 Euros). With Android it’s less restrictive. You can purchase the equivalent SC16 USB-C to USB-C cable from Rode (around 15 Euros), but I tested it with a more generic USB-C to USB-C cable (included with my Samsung T5 SSD drive) and it worked just fine. So if you happen to have a USB-C to USB-C cable around, try that first before buying something new. You should also consider that you need video editing software that lets you access both channels separately if you want to adjust them individually. On desktop, there are lots of options, but on mobile devices the only option is currently LumaFusion (I’m planning a dedicated blog post about this).

If you don’t need the extra functionality of the “split mode” or the safety channel and are happy to use your device’s 3.5mm port (or a corresponding adapter), be aware that you will still need a TRS-to-TRRS adapter (cable) like Rode’s own SC4 or SC7, because the cable included by Rode is TRS-to-TRS, which works fine with regular cameras (DSLMs/DSLRs) but not with smartphones, which have a TRRS headphone jack – well, if they still have one at all, that is. It may all look the same at first sight, but the devil is in the detail, or in this case in the connectors of the plug.

If you want to use the GO II with a camera or audio recorder that has XLR inputs, you will need a 3.5mm to XLR adapter like Rode’s own VXLR+ or VXLR Pro.

Along with the GO II, Rode released a desktop application called Rode Central which is available for free for Windows and macOS. It lets you activate and fine-tune additional features on the GO II when it’s connected to the computer. You can also access files from the onboard recording, a new feature I will talk about in a bit. A mobile app for Android and iOS is not yet available but apparently Rode is already working on it.

One brilliant new software feature is the ability to record a simultaneous -12dB safety track when in “merged mode”. It’s something Rode already implemented on the VideoMic NTG, and it’s a lifesaver when you don’t know in advance how loud the sound source will be. If there’s a very loud moment in the main track and the audio clips, you can just use the safety track, which at -12dB probably will not have clipped. The safety channel is however only available when recording in “merged mode”, since it uses the second channel for the back-up. If you are using “split mode”, both channels are already filled and there’s no space for the safety track. It also means that if you are using the GO II with a smartphone, you will only be able to access the safety channel feature when using the digital input (USB-C or Lightning), not the analogue 3.5mm headphone jack input, because only then will you have two channels to record into.
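The -12dB figure translates into a simple amplitude ratio:

```python
def db_to_amplitude(db: float) -> float:
    """Convert a gain in decibels to a linear amplitude ratio
    (amplitude = 10 ** (dB / 20))."""
    return 10 ** (db / 20)

# A -12dB safety track records at about a quarter of the amplitude,
# so a peak that hits full scale (and clips) on the main channel
# lands around 0.25 on the safety channel, with headroom to spare.
print(round(db_to_amplitude(-12), 3))  # 0.251
```

In other words, the safety track can absorb peaks roughly four times louder than whatever clips the main track.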

Another lifesaver is the new onboard recording capability, which basically turns the two TX units into tiny standalone field recorders thanks to their internal mic and internal storage. The internal storage holds up to 7 hours of uncompressed wav audio (the 7 hours also correspond to the battery life, which probably isn’t a coincidence). This is very helpful when you run into a situation where the wireless connection is disrupted and the audio stream suffers from interference noise or even drop-outs.
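As a back-of-the-envelope check, 7 hours of uncompressed wav is a fair amount of data. The format parameters below (48 kHz, 24-bit, mono) are my assumptions for illustration – Rode doesn’t state them in this context – but they show the rough order of magnitude:

```python
# Rough storage estimate for 7 hours of uncompressed mono WAV.
# Sample rate and bit depth are assumed values, not Rode's published spec.
sample_rate = 48_000  # samples per second (assumed)
bit_depth = 24        # bits per sample (assumed)
channels = 1          # each TX unit records a single mic channel
hours = 7

bytes_total = sample_rate * (bit_depth // 8) * channels * hours * 3600
print(f"{bytes_total / 1e9:.2f} GB")  # prints "3.63 GB"
```

So a few gigabytes of internal flash are enough to cover the full battery runtime, which fits the “7 hours” figure nicely.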

There are some further options you can adjust in the Rode Central app: You can activate a more nuanced gain control pad for the output of the RX unit. On the original GO, you only had three settings (low, medium, high); now you have a total of 11 (in 3 dB steps from -30 dB to 0 dB). You can also activate a reduced sensitivity for the input of the TX units when you know that you are going to record something very loud. Furthermore, you can enable a power saver mode that dims the LEDs to preserve some additional battery life.
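The 11 output levels follow directly from the stated range – -30 dB to 0 dB in 3 dB increments:

```python
# Enumerate the RX output gain settings: -30 dB to 0 dB in 3 dB steps.
steps = list(range(-30, 1, 3))
print(steps)       # [-30, -27, -24, ..., -3, 0]
print(len(steps))  # 11 settings, up from 3 on the original GO
```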

Other improvements over the original GO include a wider transmission range (200m line-of-sight vs. 70m) and better shielding from RF interference.

One thing that some people were hoping for in an updated version of the Wireless GO is the option to monitor the audio that goes into the receiver via a headphone output. Sorry to say, that didn’t happen – but as long as you are using a camera or smartphone/smartphone app that gives you live audio monitoring, this shouldn’t be too big of a deal.

Aside from the wireless system itself, the GO II comes with a TRS-to-TRS 3.5mm cable to connect it to regular cameras with a 3.5mm input, three USB-C to USB-A cables (for charging and connecting it to a desktop computer/laptop), three windshields, and a pouch. The pouch isn’t that great in my opinion – I would have preferred a more robust case – but I guess it’s better than nothing at all. And as mentioned before: I would have loved to see a TRS-to-TRRS, USB-C to USB-C and/or USB-C to Lightning cable included to assure out-of-the-box compatibility with smartphones. Unlike some competitors, the kit doesn’t come with separate lavalier mics, so if you don’t want to use the internal mics of the transmitters, you will have to make an additional purchase unless you already have some. Rode offers the dedicated Lavalier GO for around 60 Euros. The price for the Wireless GO II is around 300 Euros.

So is the Rode Wireless GO II perfect? Not quite, but it’s pretty darn close. It builds upon an already amazingly compact and versatile wireless audio system and adds some incredible new features, so I can only recommend it for every mobile videomaker’s gear bag. If you want to compare it against a viable alternative, you could take a look at the Saramonic Blink 500 Pro B2, which is roughly the same price and comes with two lavalier microphones, or the Hollyland Lark 150.

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#42 Camera2 API Update 2021 – Android Pro Videography & Filmmaking — 15. April 2021

#42 Camera2 API Update 2021 – Android Pro Videography & Filmmaking

I’ve already written about Camera2 API in two previous blog posts (#6 & #10) but a couple of years have passed since then, and I felt like taking another look at the topic now that we’re in 2021.

Just in case you don’t have a clue what I’m talking about here: Camera2 API is a software component of Google’s mobile operating system Android (which basically runs on every smartphone today except Apple’s iPhones) that enables 3rd party camera apps (camera apps other than the one that’s already on your phone) to access more advanced functionality/controls of the camera, for instance setting a precise shutter speed value for correct exposure. Android phone makers need to implement Camera2 API into their version of Android and not all do it fully. There are four different implementation levels: “Legacy”, “Limited”, “Full” and “Level 3”. “Legacy” basically means Camera2 API hasn’t been implemented at all and the phone uses the old, way more primitive Android Camera API; “Limited” signifies that some components of Camera2 API have been implemented but not all; “Full” and “Level 3” indicate complete implementation in terms of video-related functionality, with “Level 3” adding only the photography benefit of shooting in RAW format. Android 3rd party camera apps like Filmic Pro, Protake, mcpro24fps, ProShot, Footej Camera 2 or Open Camera can only unleash their full potential if the phone has adequate Camera2 API support – Filmic Pro doesn’t even let you install the app in the first place if the phone lacks proper implementation. “Adequate”/“proper” can already mean “Limited” for certain phones, but you can only be sure with “Full” and “Level 3” devices. With some other apps like Open Camera, Camera2 API is deactivated by default and you need to go into the settings to enable it to access things like shutter speed and ISO control.
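A quirk worth knowing if you ever compare these levels programmatically: the integer constants Android uses for them are not in capability order (LEGACY is 2 while FULL is 1), so the documented way to compare them is by a defined ranking, not by raw value. A small sketch of that idea – in Python for illustration only; on an actual Android device you’d read the level from CameraCharacteristics in Java/Kotlin:

```python
# Android's integer constants for INFO_SUPPORTED_HARDWARE_LEVEL are not
# ordered by capability: LEGACY (2) is numerically above FULL (1).
LIMITED, FULL, LEGACY, LEVEL_3 = 0, 1, 2, 3

# Capability order from weakest to strongest.
RANKING = [LEGACY, LIMITED, FULL, LEVEL_3]

def at_least(device_level, required_level):
    """True if the device's level meets or exceeds the required one."""
    return RANKING.index(device_level) >= RANKING.index(required_level)

print(at_least(FULL, LIMITED))    # True  - a "Full" phone covers "Limited"
print(at_least(LEGACY, LIMITED))  # False - a "Legacy" phone does not
```

This mirrors the check an app like Filmic Pro effectively has to make before deciding whether a phone qualifies.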

How do you know what Camera2 API support level a phone has? If you already own the phone, you can use an app like Camera2 Probe to check, but if you want to factor this in before buying a new phone, that’s obviously not possible. Luckily, the developer of Camera2 Probe has set up a crowdsourced list (users can submit test results via the app, which are automatically entered into the list) with the Camera2 API support levels of a massive number of different Android devices – currently over 3500! The list can be accessed here, and you can even sort it by different parameters like phone brand or type a device name into the search bar.

It’s important to understand that there’s a Camera2 API support level for each camera on the phone. So the rear camera could have a different level than the selfie camera. The support level also doesn’t say anything about how many of the phone’s cameras have been made accessible to 3rd party apps. Auxiliary ultra wide-angle or telephoto lenses have become a common standard in many of today’s phones, but not all phone makers allow 3rd party camera apps to access the auxiliary camera(s). So when we talk about the Camera2 API support level of a device, most of the time we are referring to its main rear camera.

Camera2 API was introduced with Android version 5 aka “Lollipop” in 2014 and it took phone makers a bit of time to implement it into their devices, so roughly speaking, only devices running at least Android 6 Marshmallow are in a position to have proper support. In the beginning, most phone makers only provided full Camera2 API support for their high-end flagship phones, but over the last years the feature has trickled down to the mid-range segment and now even to a considerable number of entry-level devices (Nokia and Motorola are two companies that have been good about this if you’re on a tight budget).

I actually took the time to go through the Camera2 Probe list to provide some numbers on this development. Of course these are not 100% representative since not every single Android device on the planet is included in the list, but I think 3533 entries (as of 21 March 2021) make for a solid sample size.

Android version | Level 3 | Full | Limited | Legacy | Full/Level 3 %
Android 6 | 0 | 30 | 18 | 444 | 6.1
Android 7 | 82 | 121 | 113 | 559 | 23.2
Android 8 | 147 | 131 | 160 | 350 | 35.3
Android 9 | 145 | 163 | 139 | 69 | 59.7
Android 10 | 319 | 199 | 169 | 50 | 70.3
Android 11 | 72 | 28 | 8 | 2 | 90.9

I think it’s pretty obvious that the implementation of proper Camera2 API support in Android devices has taken massive steps forward with each iteration of the OS, and 100% coverage on new devices is just within reach – maybe the upcoming Android 12 can already accomplish this mission?
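The Full/Level 3 percentages can be reproduced from the raw counts listed above, in case you want to double-check or extend the tally:

```python
# Reproduce the Full/Level 3 share per Android version from the raw
# Camera2 Probe counts (Level 3, Full, Limited, Legacy), as listed above.
counts = {
    6:  (0,   30,  18,  444),
    7:  (82,  121, 113, 559),
    8:  (147, 131, 160, 350),
    9:  (145, 163, 139, 69),
    10: (319, 199, 169, 50),
    11: (72,  28,  8,   2),
}

for version, (l3, full, limited, legacy) in counts.items():
    share = 100 * (l3 + full) / (l3 + full + limited + legacy)
    print(f"Android {version}: {share:.1f}% Full/Level 3")
```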


#41 Sharing VN project files between iPhone, iPad, Mac, Android (& Windows PC) — 23. March 2021

#41 Sharing VN project files between iPhone, iPad, Mac, Android (& Windows PC)

As I have pointed out in two of my previous blog posts (What’s the best free cross-platform mobile video editing app?, Best video editors / video editing apps for Android in 2021), VN is a free and very capable mobile video editor for Android and iPhone/iPad, and the makers recently also launched a desktop version for macOS. Project file sharing takes advantage of that and makes it possible to start your editing work on one device and finish it on another. So for instance after having shot some footage on your iPhone, you can start editing right away using VN for iPhone, but transfer the whole project to your iMac or MacBook Pro later to have a bigger screen and mouse control. It’s also a great way to free up storage space on your phone since you can archive projects in the cloud, on an external drive or on a computer and delete them from your mobile device afterwards. Project sharing isn’t a one-way trick, it also works the other way around: You start a project using VN on your iMac or MacBook Pro and then transfer it to your iPhone or iPad because you have to go somewhere and want to continue editing while commuting. And it’s not all about Apple products either: you can also share from or to VN on Android smartphones and tablets (so basically every smartphone or tablet that’s not made by Apple). What about Windows? Yes, this is also possible, but you will need to install an Android emulator on your PC – I won’t go into the details of that procedure in this article as I don’t own a PC to test it on, but you can check out a good tutorial on the VN site here.

Before you start sharing your VN projects, here’s some general info: To actively share a project file, you need to create a free account with VN. Right off the bat, you can share projects that don’t exceed 3 GB in size. There’s also a maximum of 100 project files per day, but I suppose nobody will actually bump into that. To get rid of these limitations, the VN team will manually clear your account for unlimited sharing within a few days after you fill out this short survey. For passive sharing – that is, when someone sends you a project file – there are no limitations, even when you are not logged in. As the sharing process is slightly different depending on which platforms/devices are involved, I have decided to walk you through all nine combinations, starting with the one that will probably be the most common.

Let me quickly explain two general things ahead which apply to all combinations so I don’t have to go into the details every time:

1) When creating a VN project file to share, you can do it as “Full” or “Simple”. “Full” will share the project file with all of its media (complete footage, music/sound fx, text), “Simple” will let you choose which video clips you actually want to include. Not including every video clip will result in a smaller project file that can be transferred faster.

2) You can also choose whether or not you want the project file to be “Readonly”. If you choose “Readonly”, saving or exporting will be denied – this can be helpful if you send it to someone else but don’t want this person to save changes or export the project.

All of the sharing combinations I will mention now are focused on local device-to-device sharing. Of course you can also use any cloud service to store/share VN project files and have them downloaded and opened remotely on another device that runs the VN application.

iPhone/iPad to Mac

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon at the bottom), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Now choose “AirDrop” and select your Mac. Make sure that AirDrop is activated on both devices.
  • Depending on your AirDrop settings you now have to accept the transfer on the receiving device or the transfer will start automatically. By default, the file will be saved in the “Downloads” folder of your Mac.
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open project”.

iPhone/iPad to iPhone/iPad

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now choose “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Select the iPhone/iPad you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now select “Open project”.

iPhone/iPad to Android

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the iOS/iPadOS share menu will pop up.
  • Now you need to transfer the project file from the iPhone/iPad to the Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both iPhone/iPad and Android.
  • So choose SendAnywhere from the share menu. A 6-digit code is generated.
  • Open SendAnywhere on your Android device, select the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. 
  • The Android “Open with” menu will open, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Finally choose “Open Project”.

Mac to iPhone/iPad

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac, right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select your iPhone or iPad. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now choose “Open Project”.

Mac to Mac

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select the Mac you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • By default the VN project file will be saved in the “Downloads” folder of the receiving Mac.
  • Open VN on the receiving Mac and drag and drop the VN project file into the app.
  • Now select “Open project”.

Mac to Android

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and choose a way to send it to your Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both macOS and Android.
  • So using SendAnywhere on your Mac, drag the VN project file into the app. You will see a 6-digit code. Open SendAnywhere on your Android, choose the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then on the project file.
  • The Android “Open with” menu will pop up, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Choose “Open Project”.

Android to Mac

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the Android share sheet will pop up.
  • Now you need to transfer the project file from your Android device to your Mac. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and macOS.
  • So choose SendAnywhere from the share menu. A 6-digit code is generated.
  • Open SendAnywhere on your Mac, select the “Receive” tab and enter the code.
  • Unless you have created a custom download folder for your preferred file transfer app, the VN project file will be saved to the “Downloads” folder on your Mac or is available in your cloud storage.
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open project”.

Android to Android

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • From the Android share sheet, choose Android’s integrated wifi sharing option Nearby Share (check this video on how to use Nearby Share if you are not familiar with it) and select the device you want to send it to. Make sure Nearby Share is activated on both devices.
  • After accepting the file on the second device, the transfer will start.
  • Once it is finished, choose “VN/Import to VN” from the pop up menu. Importing into VN will start. 
  • Finally choose “Open Project”.

Android to iPhone/iPad

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated. Afterwards, the Android share sheet menu will pop up.
  • Now you need to transfer the project file from the Android device to the iPhone/iPad. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and iPhone/iPad.
  • So choose SendAnywhere from the Share Sheet. A 6-digit code is generated.
  • Open SendAnywhere on your iPhone/iPad, select the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. Now tap on the share icon in the top right corner and choose VN from the list. The project file will be imported into VN.
  • Finally choose “Open Project”.


DISCLOSURE NOTE: This particular post was sponsored by VN. It was however researched and written all by myself.

#40 A whole new video editing experience on a phone! — 28. February 2021

#40 A whole new video editing experience on a phone!

Let’s be honest: Despite the fact that phone screens have become increasingly bigger over the last years, they are still rather small for doing some serious video editing on the go. No doubt, you CAN do video editing on your phone and achieve great results, particularly if you are using an app with a touch-friendly UI like KineMaster that was brilliantly designed for phone screens. But I’m confident just about every mobile video editor would appreciate some more screen real estate. Sure, you can use a tablet for editing, but tablets aren’t great devices for shooting, and if you want to do everything on one device, pretty much everyone would choose a phone, right?

While phone makers like Samsung, Huawei and Motorola are currently pioneering devices with foldable screens, those are still extremely expensive (between 1500 and 2000 bucks!) and also have to cope with some teething problems. LG, while not particularly successful in terms of sales figures in the recent past, has proven to be an innovative force in smartphone development for some years now. Not everything they throw at the market sticks, but let’s not forget that, for instance, the now widely popular and extremely useful wide-angle auxiliary lens was first seen on the LG G5 (rear camera) and LG V10 (front camera). And the amazing manual video mode in a native camera app, pioneered by the V10, is something I’d hate to do without.

Instead of making a screen that folds, LG has introduced a series of phones that include (or at least have the option for) a Dual Screen case that adds a second, separate screen – basically making it look as if you were holding two phones next to each other. So the concept is that of a foldable PHONE, not a foldable SCREEN! The actual phone is inserted into the Dual Screen case, with a physical connection (initially pogo pins, then USB-C) establishing communication between the two devices. First came the V50 (April 2019), then the G8X (November 2019) and the V60 (March 2020), with the latest Dual Screen-compatible phone release being the LG Velvet (May 2020). As far as I know, the G8X (which I got new for just over 400€) is the only one of the bunch that comes with the Dual Screen included; for the other phones, the DS is an accessory that can be purchased separately or in a bundle with the phone. It’s important to note that the DS cases are all slightly different (LG refined the design over time) and only work with the phone they were designed for. It probably goes without saying that they don’t work with just any other Android phone – this is proprietary LG hardware.

The user experience of a foldable screen phone like the Samsung Galaxy Fold is quite different from that of the Dual Screen foldable phone approach. While an expanded foldable screen can give you more screen real estate for one app, the DS is primarily designed for multi-tasking with two apps running at the same time, one on the phone’s main screen and one on the Dual Screen. The DS is not really meant to use an app in an expanded view over both screens as there’s obviously a big gap/hinge between the two screens which is quite distracting in most cases. If apps were specifically customized, integrating the gap into their UI, this could be much less of a problem but with LG being a rather small player in the smartphone market, this hasn’t really happened so far. LG seems to have been quite aware of this and so they natively only allow a handful of apps (a bunch of Google apps and the Naver Whale browser) to be run in a wide view mode that spans across both screens.

Now, while having an app run across two separate screens might not make a lot of sense for many apps, there is one type of app that could actually be a perfect fit: video editors. On desktop, lots of professional video editors (I’m talking about the people doing the editing) use a dual monitor set-up to have more screen real estate to organize their virtual workspace. One classic use case is that you have your timeline, media pool etc. on one screen and a big preview window on the second screen. It’s exactly this scenario that can be mimicked on LG’s Dual Screen phones like the G8X – but only with a particular app.

Why only with a particular app? Because the app’s UI needs to fit the Dual Screen in just the right way, and currently, the only app that does that is PowerDirector. It’s not a perfect fit (one of the most obvious imperfections is the split playback button) but that’s to be expected since the app has not been optimized in any way for LG’s Dual Screen phones – considering this, it’s truly amazing HOW well PowerDirector’s UI falls into place on the G8X. The joy of having a big preview window on the top screen with the timeline and tool bars having their own space on the bottom screen (using the phone in landscape orientation) can hardly be overstated in my opinion. It really feels like a whole new mobile video editing experience, and an extremely pleasant one for sure!

But wait! Didn’t I mention that LG’s wide view mode is only natively available for a couple of apps? Yes indeed, and that’s why you need a 3rd party helper app that lets you run any app you want in wide mode. It’s called WideMode for LG and can be downloaded for free from the Google Play Store. Once you have installed it, you can add a shortcut to the quick settings (accessible via the swipe-down notification shade) and switch to wide view whenever you want. The app works really well in general (don’t blame the app maker for the fact that virtually no app has been optimized for this view!); occasionally, certain navigational actions cause the wide mode to just quit, but most of the time you can pick up the pattern of when that happens. In the case of PowerDirector for instance, you should only activate wide mode once you have opened your project and can see the timeline. If you activate wide view before that and select a project, you will get thrown out of wide view mode. Also, if you’re done with your editing and want to export the project, tapping the share/export button will quit wide view and push the UI back onto a single screen, but that’s not really problematic in my opinion. Still, I couldn’t help but daydream about how cool the app would be if Cyberlink decided to polish the UI for LG’s Dual Screen phones!

What about other video editing apps? KineMaster’s UI, while extremely good for single-screen phones, is pretty terrible in wide view on the G8X. VN on the other hand works fairly well but can’t quite match PowerDirector. Interestingly, while VN doesn’t (yet) support landscape orientation in general, once you force it across both screens, it actually does work like that. The biggest annoyance is probably that the preview window is split between the two screens, with the lower quarter on the bottom screen. If you use VN in portrait orientation with wide mode, the preview window is cut in half and so is the timeline area. The UI of CapCut is pretty similar to that of VN, so it’s basically the same there. Adobe Premiere Rush isn’t even available for any LG phones currently.

So is this the future of mobile video editing on smartphones? Yes and no. LG’s smartphone business has been struggling for a while and recent news from the Korean company indicates they might be looking for an exit strategy, selling their mobile branch. This also means however that you can currently get great deals on powerful LG phones, so if you are on a budget but really intrigued by this opportunity for mobile video editing, it might just be the perfect time. The way PowerDirector’s UI is laid out should also make it great for phones with a foldable screen like the Galaxy Fold series, so if we assume that this type of phone will become more common and affordable in the near future, people doing a lot of video editing on the phone should definitely consider checking this out!


#39 Should you buy a cheap Android phone? 10 things to consider! — 24. January 2021

#39 Should you buy a cheap Android phone? 10 things to consider!

One of the big reasons why Android has such overwhelming dominance as a mobile operating system on a global scale (around 75% of smartphones worldwide run Android) is that you basically have a seamless price range from the very bottom to the very top – no matter your budget, there’s an Android phone that will fit it. This is generally a very good thing since it allows everyone on this planet to participate in mobile communication, not just those with deep pockets. But as many of us would agree, smartphones are not pure communication devices anymore; you can also use them to actively create content. In this respect, Android phones are bringing the power of storytelling to the people and could therefore be regarded as an invaluable asset in democratizing this mighty tool. But if you CAN get a (very) cheap Android phone, SHOULD you get one?

Of course the definition of what one considers “cheap” highly depends on one’s individual background, so I won’t get into any concrete universal definitions here. In Germany, I’d say the cheapest Android phones start at around 50 Euro. So what, in general, is the difference between a 50 Euro phone and a 1000 Euro Android phone? Let’s single out some points from the perspective of a mobile video creator:

1) Build quality

This can actually be surprisingly controversial. Sure, flagship phones have more premium build materials, but the move to shiny glass-covered backs has seen many an excited owner make a mess of his or her new phone with a single drop. So better get a case if you consider yourself among those who occasionally drop their phone. The plasticky build of cheaper devices might look or at least feel less premium, but it can often take more abuse in various circumstances. As for the screen itself, more expensive phones tend to have a more robust protective layer, but that doesn’t always save you – and you can get a pretty affordable add-on screen protector if you are worried about damaging your phone’s screen.

2) Software updates

Usually, more expensive phones get more updates / updates for a longer period, but there are exceptions. Nokia for instance is known to be very good with updates even on their budget phones, so it also depends on the phone maker. Are software updates important? Yes and no. Generally, new software versions (at least the big annual ones like Android 10, Android 11 etc.) introduce new features and optimizations. New features specifically relevant for videography are however pretty rare (the last major ones were introduced with Android 5 in 2014 and then Android 11 in 2020), so it depends on whether the new features are actually helpful for what you want to get done and whether you are a tech-savvy person who always wants the latest updates to play around with. Security updates are important, though. Ever since Google made it possible to distribute them separately from feature updates, they have become more common on cheaper phones – mid-rangers and flagships still tend to receive more software updates and for longer periods of time, however.

3) Expandable storage

The ability to easily and cheaply add additional storage to your phone via a microSD card has long been a major plus of the Android system when compared to Apple’s iPhones. More and more Android OEMs however have started eliminating this valuable feature from their new releases, Samsung being the latest with its flagship S21 series. Sure, they have increased the internal storage over time, you can easily get phones with 128, 256 or 512 GB these days, but in my opinion it would still be good to have the option for expandable storage – UHD/4K video can fill up your phone pretty fast if you are shooting a lot. Interestingly, it’s now easier to find support for microSD cards in cheaper phones. Actually, many/most of the entry-level phones (still) have it so if that’s important to you, you might want to have a look at the budget or mid-range segment of the Android phone market.
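To put the “UHD/4K fills up your phone fast” point into perspective, here’s a quick back-of-the-envelope calculation. The bitrates are assumptions (roughly 100 Mbit/s is a common ballpark for 4K/30fps on recent phones, ~20 Mbit/s for 1080p) – actual figures vary by device and codec:

```python
# Rough storage math for video recording.
# Assumed bitrates (not tied to any specific phone): ~100 Mbit/s for 4K,
# ~20 Mbit/s for 1080p.
def minutes_per_gb(bitrate_mbps: float) -> float:
    """How many minutes of footage fit into 1 GB at a given video bitrate."""
    gb_per_second = bitrate_mbps / 8 / 1000  # Mbit/s -> MB/s -> GB/s
    return 1 / gb_per_second / 60

print(f"4K @ 100 Mbit/s: {minutes_per_gb(100):.1f} min per GB")   # ~1.3 min
print(f"1080p @ 20 Mbit/s: {minutes_per_gb(20):.1f} min per GB")  # ~6.7 min
```

So a 128 GB phone holds under three hours of 4K at that rate – which is exactly why a microSD slot is still worth having.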

4) Removable battery

An even more exotic but dare I say “pro” feature that has become nearly extinct, but was generally very useful for “power users”, is the ability to (easily) swap out a phone’s battery. LG was the last major phone maker to include this in a flagship device with the V20 in late 2016, and over time the shift to non-removable batteries has trickled down even to the (ultra) budget market. The few phones with exchangeable batteries that are left can however be found there; last survivors include the Samsung XCover Pro, the Motorola Moto E6 and the Nokia 1.3. The only recent mid-range device with this feature seems to be the Fairphone 3/3+. Sure, power banks are an abundant accessory now and an easy way to juice up your phone while on the go – but the re-supply is incremental and sometimes it’s quite annoying to be tethered to an external device via cable while using the phone.

5) SoC/Processor

While the last two points were very much in favor of budget phones, the tide is about to turn. If you want to use your phone for more than just browsing the web, checking your messages or following your social media feeds, then your phone needs some decent processing power to keep things running smoothly. One of the toughest nuts to crack for a SoC (System-on-a-Chip) is editing high resolution video – even more so when it involves multiple tracks. So if you are planning on editing a lot of UHD/4K video with multiple layers on your phone, a budget device probably won’t cut it, because processing power is often a watershed between cheaper and more expensive phones. That doesn’t mean however that you can’t do video editing at all on a budget smartphone. About two years ago I was really surprised by how well Qualcomm’s Snapdragon 430/435 SoC did in terms of video editing, allowing for multiple layers of 1080p video in KineMaster on phones like the Nokia 5, Motorola Moto G5 or the LG Q6. Generally, the number of layers and their resolution in video editing apps depend on the device’s chipset. Some apps like Adobe Premiere Rush aren’t even available for any budget phones because they are too demanding in terms of processing power. The SoC can also have an influence on the video RECORDING capabilities in terms of available frame rates and resolutions. If 1080p at a maximum of 30fps is good enough for what you do though, basically every phone has that covered these days, even the cheapest ones.

6) Camera

And while the video recording resolution can be an indicator of technical image quality, it surely isn’t the only one – actually, other things are (way) more important: lens quality, aperture size, sensor quality, processing algorithms. That’s why 1080p footage shot on one phone might look better than 1080p footage shot on another. And generally, that’s also an area in which (ultra) budget phones get left behind. Again, this doesn’t mean that you should never use an entry-level phone to shoot video – some of them can capture surprisingly decent footage, and if you are “just” doing something for Facebook etc., the difference in image quality might not really be noticeable to the casual, non-pixel-peeping viewer. Also, never forget that the content/the story is way more important than the image quality! You will reach/move more people with a good story shot on a cheap phone than with a mediocre story shot on a flagship phone, never mind the latter’s superior camera.

7) Native camera app

Another aspect that can distinguish a cheap from a more expensive Android phone is the native camera app. Not so much in terms of the general UI and basic functionality but in terms of special modes and features. LG for instance has an absolutely outstanding manual video mode in the native camera app of its flagship lines, one that can rival a dedicated 3rd party app like Filmic Pro, but you don’t get it in their budget phones. The same goes for Sony and – to a lesser degree – Samsung, which at least gives you support for external mics down to its entry-level offerings. Other Android phone makers however have the same native camera app in all of their models, budget or flagship (Motorola for instance, unless they have recently changed something).

8) Camera2 API

I just mentioned 3rd party video recording apps, so let’s look at an even “nerdier” aspect: Usually, more expensive phones have better Camera2 API support. What’s Camera2 API? I have written a whole blog post about it, but in short, it’s basically the phone’s ability to give 3rd party camera apps access to manual control for certain more advanced imaging parameters like shutter speed, ISO, white balance etc. So this is important if you are planning to use such an app (like for instance Filmic Pro, ProTake or mcpro24fps) instead of the phone’s native camera app. While nowadays basically all (or almost all) flagship phones and many/most mid-range Android phones have proper Camera2 API support, there are also entry-level phones that are equipped with it, for instance some from Nokia and Motorola – it’s not that common yet however.

9) Headphone jack

Before wrapping things up I want to look at another aspect that is of major relevance if you want to record audio with external mics on your smartphone – be it as part of capturing video or just audio-only. Like the removable battery and expandable storage, the 3.5mm headphone jack is a feature that has been fading away from smartphones over the last few years. Some Android OEMs are still holding on to it (for the most part) but many have eliminated it, relying solely on a single physical port (USB-C) and wireless technology (Bluetooth/WiFi). As with those other features, it’s curious that the 3.5mm headphone jack has mostly survived in budget phones. This makes a case for a very particular use scenario: If you “only” want to record audio (be it for an audio-only production or to use the phone as an external audio recorder with a lavalier on a video shoot), a budget phone can be an interesting option, because you don’t have to care about the quality of the camera and (for the most part) neither about the chipset and its processing power, since audio processing is much less resource-hungry than video processing. The external-recorder-with-a-lavalier scenario is also a clever way to make use of an old phone if you have one buried in a drawer somewhere, only collecting dust.

10) Bonus tip!

What if you DO want more processing power and better camera quality, but are on a tight budget nonetheless? In that case, it can be helpful to look at older flagship models or mid-rangers. Once new Android phones are released, their price – not always, but often – drops after a couple of months. If you compare the camera quality and processing power of a current budget phone with those of an older flagship or potent mid-ranger, you can often easily go back two or three years and still be better off with the “oldie”. Depending on what model/phone maker you choose and how far back you go, you might be stuck with an older version of Android, but as indicated earlier on, this isn’t necessarily as bad as it sounds.


#38 How to anonymize persons or objects in videos on a smartphone – new app makes things a lot easier! — 16. January 2021

#38 How to anonymize persons or objects in videos on a smartphone – new app makes things a lot easier!

There are times when – for reasons of privacy or even a person’s physical safety – you want to make certain parts of a frame in a video unrecognizable so as not to give away someone’s identity or the place where you shot the video. While it’s fairly easy to achieve something like that for a photograph, it’s a lot more challenging for video, for two reasons: 1) You might have a person moving around within a shot, or a moving camera, which constantly alters the location of the subject within the frame. 2) If the person talks, they might also be identifiable by their voice alone. So are there any apps that help you anonymize persons or objects in videos when working on a smartphone?

KineMaster – the best so far

Up until recently the best app for anonymizing persons and/or certain parts of a video in general was KineMaster which I already praised in my last blog about the best video editing apps on Android (it’s also available for iPhone/iPad). While it’s possible to use just any video editor that allows for a resizable image layer (let’s say just a plain black square or rectangle) on top of the main track to cover a face, KineMaster is the only one with a dedicated blur/mosaic tool for this use case. Many other video editing apps have a blur effect in their repertoire, but the problem is that this effect always affects the whole image and can’t be applied to only a part of the frame. KineMaster on the other hand allows its Gaussian Blur effect to be adjusted in size and position within the frame. To access this feature, scroll to the part of the timeline where you want to apply the effect but don’t select any of the clips! Now tap on the “Layer” button, choose “Effect”, then “Basic Effects”, then either “Gaussian Blur” or “Mosaic”. An effect layer gets added to the timeline which you can resize and position within the preview window. Even better: KineMaster also lets you keyframe this layer which is incredibly important if the subject/object you want to anonymize is moving around the frame or if the camera is moving (thereby constantly altering the subject’s/object’s position within the frame). Keyframing means you can set “waypoints” for the effect’s area to automatically change its position/size over time. You can access the keyframing feature by tapping on the key icon in the left sidebar. Keyframes have to be set manually so it’s a bit of work, particularly if your subject/object is moving a lot. If you just have a static shot with the person not moving around a lot, you don’t have to bother with keyframing though. 
And as if the adjustable blur/mosaic effect and support for keyframing weren’t good enough, KineMaster also gives you a tool to add an extra layer of privacy: you can alter voices. To access this feature, select a clip in the timeline and then scroll down the menu on the right to find “Voice Changer” – there’s a whole bunch of different effects. To be honest, most of them are rather cartoonish – I’m not sure you want your interviewee to sound like a chipmunk. But there are also a couple of voice changer effects that I think can be used in a professional context.
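For the technically curious: the keyframing described above essentially boils down to interpolating the effect’s position (and size) between your manually set waypoints. Here’s a minimal Python sketch of that idea – purely illustrative, not KineMaster’s actual implementation:

```python
# Linear interpolation between keyframes, the concept behind animating a
# blur region across a shot. (Illustrative sketch, not any app's real code.)
def interpolate(keyframes, t):
    """keyframes: list of (time, (x, y)) waypoints, sorted by time."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)  # fraction of the way between waypoints
            return tuple(a + f * (b - a) for a, b in zip(p0, p1))

# Blur region starts at (100, 200) and drifts to (300, 240) over 2 seconds:
kf = [(0.0, (100, 200)), (2.0, (300, 240))]
print(interpolate(kf, 1.0))  # halfway -> (200.0, 220.0)
```

The more the subject changes direction, the more waypoints you need – which is exactly why keyframing by hand gets tedious for a moving subject.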

What happened to Censr?

As I indicated in the paragraph above, a moving subject (or a moving camera) makes anonymizing content within a video a lot harder. You can manually keyframe the blurred area to follow along in KineMaster, but it would be much easier if that could be done via automatic tracking. Last summer, a closed beta version of an app called “Censr” that was able to automatically track and blur faces was released on iOS. It all looked quite promising (I saw some examples on Twitter) but the developer Sam Loeschen told me that “unfortunately, development on censr has for the most part stopped”.

PutMask – a new app with a killer feature!

But you know what? There actually is a smartphone app out there that can automatically track and pixelate faces in a video: it’s called PutMask and is currently only available for Android (there are plans for an iOS version). The app (released in July 2020) offers three ways of pixelating faces in videos: automatically via face-tracking, manually by following the subject with your finger on the touchscreen, and manually by keyframing. The keyframing option is the most cumbersome one but might be necessary when the other two won’t work well. The “swipe follow” option is the middle ground – not as time-consuming as keyframing, but manual action is still required. The most convenient approach is of course automatic face-tracking (you can even track multiple faces at the same time!) – and I have to say that in my tests, it worked surprisingly well!

Does it always work? No, there are definitely situations in which the feature struggles. If you are walking around and your face gets covered by something else (for instance because you are passing another person or an object like a tree) even for only a short moment, the tracking often loses you. It even lost me when I was walking around indoors and the lens flare from the light bulb on the ceiling created a visual “barrier” which I passed at some point. And although I would say that the app is generally well-designed, some of the workflow steps and the nomenclature can be a bit confusing. Here’s an example: After choosing a video from your gallery, you can tap on “Detect Faces” to start a scanning process. The app will tell you how many faces it has found and will display a numbered square around each face. If you now tap on “Start Tracking”, the app tells you “At least select One filter”. But I couldn’t find a button or anything else indicating a “filter”. After some confusion I discovered that you need to tap once on the square that is placed over the face in the image – maybe by “filter” they actually mean you need to select at least one face? Now you can initiate the tracking. After the process is finished you can preview the tracking that the app has done (and also dig deeper into the options to alter the amount of pixelation etc.), but to check the actual pixelated video you have to export your project first. While the navigation could/should be improved for certain actions to make it more clear and intuitive, I was quite happy with the results in general. The biggest catch until recently was the maximum export resolution of 720p, but with the latest update released on 21 January 2021, 1080p is also supported. An additional feature that would be great to have in an app with a dedicated focus on privacy and anonymization is the ability to alter/distort the voice of a person, like you can in KineMaster.
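By the way, while the face tracking is the hard part, the pixelation effect itself is conceptually simple: each block of pixels inside the selected region is replaced by its average, and the “amount of pixelation” is essentially the block size. A toy Python sketch on a grayscale grid (real apps of course do this on the GPU, per frame):

```python
# Conceptual sketch of a mosaic/pixelate effect: average each NxN block of
# pixels inside a region so detail there becomes unrecognizable.
# (Toy example on a 2D list of grayscale values, not any app's real code.)
def pixelate(img, x, y, w, h, block=2):
    """Return a copy of img with the (x, y, w, h) region mosaicked."""
    out = [row[:] for row in img]
    for by in range(y, y + h, block):
        for bx in range(x, x + w, block):
            cells = [(r, c) for r in range(by, min(by + block, y + h))
                            for c in range(bx, min(bx + block, x + w))]
            avg = sum(img[r][c] for r, c in cells) // len(cells)
            for r, c in cells:
                out[r][c] = avg
    return out

img = [[0, 10, 20, 30],
       [40, 50, 60, 70],
       [80, 90, 100, 110],
       [120, 130, 140, 150]]
# Mosaic the top-left 2x2 region with one 2x2 block:
print(pixelate(img, 0, 0, 2, 2)[0][:2])  # -> [25, 25]
```

A bigger `block` value means chunkier squares and stronger anonymization.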

There’s one last thing I should address: The app is free to download with all its core functionality, but you only get SD resolution and a watermark on export. For HD/FHD watermark-free export, you need to make an in-app purchase. The IAP procedure is without a doubt the weirdest I have ever encountered: The app tells you to purchase any one of a selection of different “characters” to receive the additional benefits. Initially, these “characters” are just names in boxes – “Simple Man”, “Happy Man”, “Metal-Head” etc. If you tap on a box, an animated character pops up. But only when scrolling down does it become clear that these “characters” represent different amounts of payment with which you support the developer. And if that wasn’t strange enough by itself, the amount you can donate goes up to a staggering 349.99 USD (character “Dr. Plague”) – no kidding! At first, I had actually selected Dr. Plague because I thought it was the coolest looking character of the bunch. Only when trying to go through with the IAP did I become aware of the fact that I was about to drop 350 bucks on the app! Seriously, this is nuts! I told the developer that I don’t think this is a good idea. Anyway, the amount of money you donate doesn’t affect your additional benefits, so you can just opt for the first character, the “Simple Man”, which costs you 4.69€. I’m not sure why they would want to make things so confusing for users willing to pay, but other than that, PutMask is a great new app with a lot of potential – I will definitely keep an eye on it!


Download PutMask on GooglePlay.

#37 Best video editors / video editing apps for Android in 2021 — 10. January 2021

#37 Best video editors / video editing apps for Android in 2021


Ever since I started this blog, I wanted to write an article about my favorite video editing apps on Android, but I could never decide how to go about it: whether to write a separate in-depth article on each of them, a really long one on all of them, or a more condensed overview without too much detail or workflow explanations. So I recently figured there had been enough pondering on this subject and I should just start writing. The basic common ground for all the mobile video editing apps mentioned here is that they allow you to combine multiple video clips in a timeline and arrange them in a desired order. Some might question the validity of editing video on such a relatively small screen as that of a smartphone (even though screen sizes have increased drastically over the last years). While it’s true that there definitely are limitations and I probably wouldn’t consider editing a feature-length movie that way, there’s also an undeniable fascination in the fact that it’s actually doable and can be a lot of fun. I would even dare to say that it’s a charming throwback to the days before digital non-linear editing, when the process of cutting and splicing actual film strips had a very tactile nature to it. But let’s get started…

KineMaster


When I got my first smartphone in 2013 and started looking for video editing apps in the Google PlayStore, I ran into a lot of frustration. There was a plethora of video editing apps, but almost none of them could do more than manipulate a single clip. Then, in late December, an app called KineMaster was released, and just by looking at the screenshots of the UI I could tell that this was the game changer I had been waiting for: a mobile video editing app that actually aspired to give you the proper feature set of a (basic) desktop video editing software. Unlike some other (failed) attempts in that respect, the devs behind KineMaster realized that giving the user more advanced editing tools could become an unpleasant boomerang flying in their face if the controls weren’t touch-friendly on a small screen. If you ever had the questionable pleasure of using a video editing app called “Clesh” on Android (it’s long gone), you know what I’m talking about. To this day, I still think that KineMaster has one of the most beautiful and intuitive UIs of any mobile app. It really speaks to its ingenuity that even though the app has grown into a respectable mobile video editing powerhouse with many pro features, even total editing novices usually have no problem getting the hang of the basics within a couple of hours or even minutes.

While spearheading the mobile video editing revolution on Android, KineMaster dared to become one of the first major apps to drop the one-off payment method and pioneer a subscription model. I had initially paid a one-off 2€ for the pro version of the app to get rid of the watermark; now you had to pay 2 or 3€ a month (!). I know, “devs gotta eat”, and I’m all for paying a decent amount for good apps, but this was quite a shock, I have to admit. It needs to be pointed out that KineMaster is actually free to download with all its features (so you can test it fully and with no time limit before investing any money) – but you always get a KineMaster watermark in your exported video and the export resolution doesn’t include UHD/4K. If you are just doing home movies for your family, that might be fine, but if you do stuff in a professional or even just more ambitious environment, you probably want to get rid of the watermark. Years later, with every other app having jumped on the subscription bandwagon, I do feel that KineMaster is still one of the apps that are really worth it. I already praised the UI/UX, so here are some of the important features: You get multiple video tracks (resolution and number are device-dependent) and other media layers (including support for png images with transparency), options for multiple frame rates including PAL (25/50), the ability to select between a wide variety of popular aspect ratios for projects (16:9, 9:16, 1:1, 2.35:1 etc.) and even duplicate a project with a different aspect ratio later (very useful if you want to share a video on multiple platforms). You can use keyframes to animate content, have a very good title tool at hand, audio ducking, voice over recording, basic grading tools and, last but not least, the Asset Store.
That’s the place where you can download all kinds of helpful assets for your edit: music, fonts, transitions, effects and most of all (animated) graphics (‘stickers’) that you can easily integrate into your project and make it pop without having to spend much time on creating stuff from scratch. Depending on what you are doing, this can be a massive help! I also have to say that despite Android’s fragmentation with all its different phones and chipsets, KineMaster works astonishingly well across the board.

There are still things that could be improved (certain parts of the timeline editing process, media management, precise font sizes, audio waveforms for video clips, quick audio fades, project archives etc.) and development progress in the last one or two years seems to have slowed down but it remains a/the top contender for the Android video editing crown, although way more challenged than in the past. Last note: KineMaster has recently released beta versions of two “helper” apps: VideoStabilizer for KineMaster and SpeedRamp for KineMaster. I personally wish they would have integrated this functionality into the main app but it’s definitely better than not having it at all.

PowerDirector


The first proper rival for KineMaster emerged about half a year later, in June 2014, with Cyberlink’s PowerDirector. Unlike KineMaster, PowerDirector was already an established name in the video editing world, at least on the consumer/prosumer level. In many ways, PowerDirector has a somewhat (yet not completely) comparable feature set to that of KineMaster, with one key missing option being the ability to export in PAL frame rates (if you don’t need to export in 25/50fps, you can ignore this shortcoming). The UI is also good and pretty easy to learn. After KineMaster switched to the subscription model, PowerDirector did have one big factor in its favor: You could still get the full, watermark-free version of the app by making a single, quite reasonable payment – I think it was about 5€. That, however, eventually changed, and PowerDirector joined the ranks of apps that you can’t own anymore but only rent via a subscription to have access to all features and watermark-free export. Despite the fact that it’s now slightly more expensive than KineMaster, it’s still a viable and potent mobile video editor with some tricks up its sleeve.

It was for instance – until recently – the only mobile video editor with an integrated stabilization tool to tackle shaky footage. It’s also the only one with a dedicated de-noise feature for audio, and unlike KineMaster it lets you mix your audio levels by track in addition to by individual clips. Furthermore, PowerDirector offers the ability to transfer projects from mobile to its desktop version via the Cyberlink Cloud, which can come in handy if you want to assemble a rough cut on the phone but do more in-depth work on a bigger screen with mouse control. Something rather annoying is the way in which the app tries to nudge – or dare I say shove – you towards a subscription. As I had bought the app before the introduction of the subscription model, I can still use all of its features and export without a watermark, but before getting to the edit workspace, the app bombards you with full-screen ads for its subscription service every single time – I really hate that. One last thing: There are a couple of special Android devices on which PowerDirector takes mobile video editing to another level, but that’s for a future article, so stay tuned.

Adobe Premiere Rush


Even more so than Cyberlink, Adobe is a well-known name in the video editing business thanks to Premiere Pro (Windows/macOS). More than once I had asked myself why such a big player had missed the opportunity to get into the mobile editing game. Sure, they dipped their toes into the waters with Premiere Clip but after a mildly promising launch, the app’s development stagnated all too soon and was abandoned eventually – not that much of a loss as it was pretty basic. In 2018 however, Adobe bounced back onto the scene with a completely new app, Premiere Rush. This time, it looked like the video editing giant was ready to take the mobile platform seriously.

The app has a very solid set of advanced editing features and even some specialties that are quite unique/rare in the mobile editing environment: You can for instance expand the audio of a video clip without actually detaching it and risking it going out of sync – very useful for J & L cuts. There’s also a dedicated button that activates multi-select for clips in the timeline, another great feature. What’s more, Rush has true timeline tracks for video. What do I mean by “true”? KineMaster and PowerDirector support video layers, but you can’t just move a clip from the primary track to an upper/lower layer track and vice versa – which isn’t much of a problem most of the time, but sometimes it can be a nuisance. In Rush you can move your video clips up and down the tracks effortlessly. “True tracks” also means that you can easily disable/mute/lock a particular track and all the clips that are part of it. One of Rush’s marketed highlights is the auto-conform feature, which is supposed to automatically adapt your edit to other aspect ratios, using AI to frame the image in the (hopefully) best way. So for instance if you have a classic 16:9 edit, you can use this to get a 1:1 video for Instagram. This feature is reserved for premium subscribers, but you can still manually alter the aspect ratio of your project in the free version. For a couple of months, the app was only available for iOS but premiered (pardon the pun!) on Android in May 2019. Like PowerDirector, you can use Adobe’s cloud to transfer project files to the desktop version of Rush (or even import them into Premiere Pro), which is useful if the work is a bit more complex. It’s also possible to have projects automatically sync to the cloud (subscriber feature).
Initially, the app had a very expensive subscription of around 10€ per month (and only three free exports to test) unless you were already an Adobe Creative Cloud subscriber, in which case you got it for free. But it has now become more affordable (4.89€ monthly or 33.99€ per year) and the basic version with most features including 1080p export (UHD/4K is a premium feature) is free and doesn’t even force a watermark on your footage – you do need to create a (free) account with Adobe though.
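To get an idea of what “conforming” an edit to another aspect ratio involves in the simplest case: without any AI framing, it boils down to taking the largest centered crop with the target ratio. Here’s a minimal Python sketch of that baseline – Rush’s auto-conform additionally shifts the crop window to follow the subject:

```python
# Baseline aspect-ratio conforming: largest centered crop with the target
# width/height ratio. (Illustrative sketch, not Adobe's implementation.)
def center_crop(width, height, target_ratio):
    """Return (x, y, w, h) of the largest centered crop with target_ratio."""
    if width / height > target_ratio:        # source too wide -> trim the sides
        w, h = int(height * target_ratio), height
    else:                                    # source too tall -> trim top/bottom
        w, h = width, int(width / target_ratio)
    return ((width - w) // 2, (height - h) // 2, w, h)

print(center_crop(1920, 1080, 1.0))   # 16:9 -> 1:1:  (420, 0, 1080, 1080)
print(center_crop(1920, 1080, 9/16))  # 16:9 -> 9:16: (656, 0, 607, 1080)
```

As the 9:16 example shows, a vertical crop throws away more than two thirds of a 16:9 frame – which is why smarter, subject-aware framing is worth paying attention to.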

The app does have its quirks – how many of them are still teething troubles, I’m not sure. In my personal tests with a Google Pixel 3 and a Pocophone F1, export times were sometimes outrageously long, even for short 1080p projects. Both my test devices were powered by a Snapdragon 845 SoC, which is a bit older but was a top flagship processor not too long ago and should easily handle 1080p video. Other editing apps didn’t have any problems rushing out (there goes another pun!) the same project on the same devices. This leads me to believe that the app’s export engine still needs some fine tuning and optimization. But maybe things look better on newer and even more powerful devices. Another head-scratcher was frame rate fidelity. While the export window gave me a “1080p Match Framerate” option as an alternative to “1080p 30fps”, surely indicating that it would keep the frame rate of the used clips, working with 25fps footage regularly resulted in a 30fps export. The biggest caveat with Rush though is that its availability on Android is VERY limited. If you have a recent flagship phone from Samsung, Google, Sony or OnePlus, you’re invited; otherwise you are out of luck – for the moment at least. For a complete list of currently supported Android devices check here.

VN


Ever since I started checking the Google PlayStore for interesting new apps on a regular basis, it has rarely happened that I found a brilliant one that had already been out for a very long time. It does happen on very rare occasions however, and VN is the perfect case in point. VN had already been available for Android for almost two years (the PlayStore lists May 2018 as the release date) when it eventually popped up on my radar in March 2020 during a routine search for “video editors” on the PlayStore. VN is a very powerful video editor with a robust set of advanced tools and a UI that is clean, intuitive and easy to grasp. You get a multi-layer timeline, support for different aspect ratios including 16:9, 9:16, 1:1 and 21:9, voice-over recording, transparency with png graphics, keyframing for graphical objects (not audio though, but there’s the option for a quick fade in/out), basic exposure/color correction, a solid title tool, and export options for resolution (up to UHD/4K), frame rate (including PAL frame rates) and bitrate.
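Since VN lets you set the export bitrate directly, it’s handy to know the quick arithmetic for estimating the resulting file size. A rough sketch (video stream only, ignoring audio and container overhead):

```python
def estimated_size_mb(bitrate_mbps, duration_s):
    """Rough export size in megabytes: video bitrate times duration,
    divided by 8 to convert megabits to megabytes."""
    return bitrate_mbps * duration_s / 8

# A 90-second clip exported at 16 Mbit/s:
print(estimated_size_mb(16, 90))  # → 180.0 (MB)
```

So doubling the bitrate doubles the file size at the same length – useful to keep in mind when a platform has upload limits.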

Beyond that, VN is currently the only one of the advanced mobile video editing apps with a dedicated and very easy-to-use speed-ramping tool, helpful for manipulating a clip’s playback speed. It’s also great that you can move video clips up and down the tracks, although it’s not as intuitive as Adobe Premiere Rush in that respect since you can’t just drag & drop but have to use the “Forward/Backward” button. But once you know how to do it, it’s very easy. While other apps might have a feature or two more, VN has a massive advantage: It’s completely free, no one-off payment, no subscription, no watermark. You do have to watch a 5-second full-screen ad when launching the app and delete a “Directed by” bumper clip from every project’s timeline, but it’s really not much of a bother in my opinion. In the past you had to create an account with VN but it’s not a requirement anymore. Will it stay free? When I talked to VN on Twitter some time ago, they told me that the app as such is supposed to remain free of charge but that they might at some point introduce certain premium features or content. VN recently launched a desktop version for macOS (no Windows yet) and the ability to transfer project files between iOS and macOS. While this is currently only possible within the Apple ecosystem (and does require that you register an account with VN), more cross-platform integration could be on the horizon. All in all, VN is an absolutely awesome and easily accessible mobile video editor widely available for most Android devices (Android 5.0 & up) – but do keep in mind that depending on the power of your phone’s chipset, the number of video layers and the supported editing/exporting resolution can vary.

CapCut

CapCut is somewhat similar to VN in terms of basic functionality (multiple video tracks, support for different frame rates including PAL, variety of aspect ratios etc.) and layout, but with a few additional nifty features that might come in handy depending on the use case. Like VN, it’s completely free without a watermark and you don’t have to create an account. CapCut was – following Cyberlink’s PowerDirector – the second advanced mobile video editing app to introduce a stabilization tool and it can even be adjusted to some degree.

Its unique standout double-feature however has to do with automatic speech-to-text/text-to-speech processing. As we all know, captions have become an integral part of video production for social media platforms, as many if not most of us browse our network feeds with the sound turned off, and captions can be a way to motivate users to watch a video even when it’s muted. While it’s no problem to manually create captions with the title tool in basically any video editing app, this can be very time-consuming and fiddly on a mobile device. So how about auto-generated captions? CapCut has you covered. It doesn’t work perfectly (you sometimes have to do some manual editing) and it’s currently only available in English, but it’s definitely a very cool feature that none of the other editors mentioned here can muster. Interestingly, it’s also possible to do it the other way around: You can let the app auto-generate a voice-over from a text layer. There are three different voices available: “American Male”, “American Female” and “British Female” (only English again). This can be useful if you quickly need to create a voice-over on the go and there’s no time or quiet place to do so, or if you are not comfortable recording voice-overs with your own voice. Any cons? Generally, I would say that I prefer VN of the two because I like the design and UX of the timeline workspace better and find it easier to navigate, but that’s probably personal taste. What is an actual shortcoming however, if you are after the highest possible quality, is the fact that CapCut lacks support for UHD/4K export. Don’t get me wrong, you can import UHD/4K footage into the app and work with it, but the export resolution is limited to 1080p and you also can’t adjust the bitrate. From a different angle, it should also be mentioned that CapCut is owned by Bytedance, the company behind the popular social video platform TikTok.
While you don’t have to create an account for CapCut, you do have to agree to their T&Cs to use the app. So if you are very picky about who gets your data and have kept your fingers off TikTok for that reason, you might want to take this into consideration.
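To illustrate how little is left to do once the speech recognition has produced timed text, here’s a sketch that formats timed cues as a SubRip (SRT) file, the simple sidecar caption format many platforms accept (a hypothetical helper, not CapCut’s internal code):

```python
def srt_timestamp(seconds):
    """Format seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def to_srt(cues):
    """cues: list of (start_s, end_s, text) tuples → SRT document."""
    blocks = []
    for i, (start, end, text) in enumerate(cues, 1):
        blocks.append(f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}")
    return "\n\n".join(blocks) + "\n"

print(to_srt([(0.0, 2.5, "Welcome back to the channel!")]))
```

The hard part an auto-caption feature solves is the speech recognition and word timing; the file format itself is trivial, which is why manual fixes to the generated text are usually quick.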

Special mention (Motion Graphics): Alight Motion


Alight Motion is a pretty unique mobile app that doesn’t really have an equivalent at the moment. While you can also use it to stitch together a bunch of regular video clips filmed with your phone, this is not its main focus. The app is totally centered around creating advanced, multi-layered motion graphics projects – think of it as a reduced mobile version of Adobe After Effects. Its power lies in the fact that you can manipulate and keyframe a wide range of parameters (for instance movement/position, size, color, shape etc.) on different types of layers to create complex and highly individual animations, spruced up with a variety of cool effects drawn from an extensive library. It takes some learning to unleash the enormous potential that lies within the app, and fiddling around with a heavy load of parameters and keyframes on a small(ish) touch screen can occasionally be a bit challenging, but the clever UI (designed by the same person who made KineMaster so much fun to use) makes the process basically as good and accessible as it can get on a mobile device. The developers also just added effect presets in a recent update, which should make things easier for beginners who might be somewhat intimidated by manually keyframing parameters. Pre-designed templates for graphics and animations created by the dev team or other users will make things even more accessible in the future – some are already available but still too few to fully convince passionate users of apps such as the very popular but discontinued Legend. Alight Motion is definitely worth checking out as you can create amazing things with it (like explainer videos or animated info graphics), if you are willing to accept a small learning curve and invest some time. This is coming from someone who regularly throws in the towel trying to get the hang of Apple’s dedicated desktop motion graphics software Motion.
Alight Motion has become the first application in this category in which I actually feel like I know what I’m doing – sort of, at least. One very cool thing is that you can also use Alight Motion as a photo/still graphics editor since it lets you export the current timeline frame as a png, even with transparency! The app is free to download but to access certain features and export without a watermark you have to get a subscription, which is currently around 28€ per year or 4.49€ per month.
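Mechanically, the keyframing at the heart of an app like Alight Motion boils down to interpolating a parameter between the values you pin at certain points in time. A minimal sketch using plain linear interpolation (motion graphics apps typically offer easing curves on top of this):

```python
def value_at(keyframes, t):
    """Linearly interpolate a keyframed parameter (e.g. position,
    opacity) at time t. keyframes: sorted list of (time, value)."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    # Past the last keyframe: hold the final value.
    return keyframes[-1][1]

# Fade a layer's opacity from 0% to 100% over the first second:
fade = [(0.0, 0.0), (1.0, 100.0)]
print(value_at(fade, 0.25))  # → 25.0
```

Every animated property in a project is conceptually one of these keyframe lists, evaluated for each rendered frame – which is also why effect presets and templates help: they ship the keyframes so you don’t have to place them by hand.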

Special mention (Automated Editing): Quik


Sometimes, things have to go quik-ly and you don’t have the time or ambition to assemble your clips manually. While I’m generally not a big fan of automated video editing processes, GoPro’s free Quik video editing app can come in handy at times. You just select a bunch of photos or videos, an animation style and your desired aspect ratio (16:9, 9:16, 1:1), and the app creates an automatic edit for you based on what it thinks are the best bits and pieces. In case you don’t like the results you have the option to change things around and select excerpts that you prefer – generally though, manual control is rather limited and it’s definitely not for more advanced edits. It’s also better suited for purely visual edits without important scenes relying on the original audio (like a person talking and saying something of interest). GoPro, which acquired the app a while back, is apparently working on a successor to Quik and will eventually pull this one from the Google PlayStore later in 2021, but here’s hoping that the “new Quik” will be just as useful and accessible.

Special mention (360 Video Editing): V360

While 360 video hasn’t exactly become mainstream, I don’t want to ignore it completely for this post. Owners of a 360 camera (like the Insta360 One X2 I wrote about recently) usually get a companion mobile app along with the hardware which also allows basic editing. In the case of the Insta360 app you actually get quite a range of tools, but it’s more geared towards reframing and exporting as a traditional flat video. You can only export a single clip in true 360 format. So if you want to create a story with multiple 360 video clips and also export it as true, immersive 360 video with the appropriate metadata for 360 playback, you need to use a 3rd party app. I have already mentioned V360 in one of my very early blog posts but I want to come back to it as the landscape hasn’t really changed since then. V360 gives you a set of basic editing tools to create a 360 video story with multiple clips. You can arrange the clips in the desired order, trim and split them, and add music and titles/text. It’s rather basic but good for what it is, with a clean interface and exports in original resolution (at least up to the 5.7K I was able to test). The free version doesn’t allow you to add transition effects between the clips and appends a V360-branded bumper clip at the end that you can only remove in the paid version, which costs 4.99€. There are two other solid 360 video editors (Collect and VeeR Editor) which are comparable and even offer some additional/different features, but I personally like V360 best, although it has to be said that the app hasn’t seen an update in over two years.
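For the curious: the “appropriate metadata for 360 playback” is, in Google’s widely adopted Spherical Video V1 convention, a small XML block embedded in the MP4 file that tells players to render the footage as an equirectangular sphere. A sketch of what that block contains (element names per Google’s spec; actually writing it into the file is best left to a dedicated tool such as Google’s spatial-media injector):

```python
def spherical_v1_xml(software="V360"):
    """Build the Spherical Video V1 metadata XML that players look for
    to enable 360° (equirectangular) playback. The software name is
    just an example; injecting the XML into the MP4's moov box is the
    injector tool's job."""
    return (
        '<?xml version="1.0"?>'
        '<rdf:SphericalVideo '
        'xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" '
        'xmlns:GSpherical="http://ns.google.com/videos/1.0/spherical/">'
        '<GSpherical:Spherical>true</GSpherical:Spherical>'
        '<GSpherical:Stitched>true</GSpherical:Stitched>'
        f'<GSpherical:StitchingSoftware>{software}</GSpherical:StitchingSoftware>'
        '<GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>'
        '</rdf:SphericalVideo>'
    )

print(spherical_v1_xml())
```

When an editor exports “true 360”, this (or the newer V2 equivalent) is essentially what it adds on top of an ordinary equirectangular video file – strip it and the same footage plays back as a distorted flat clip.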

What’s on the horizon?

There’s one big name in mobile editing town that’s still missing from the Android platform – of course I’m talking about LumaFusion. According to LumaTouch, the company behind LumaFusion, they are currently exploring an Android version and have apparently already hired some dedicated developers. I therefore suspect that despite the various challenges that porting such a demanding app to a different mobile operating system will entail, we will see at least an early beta version in 2021. Furthermore, despite not having any concrete evidence, I assume that an Android version of Videoleap, another popular iOS-only video editor, might also be in the works. While not quite as advanced and feature-packed as LumaFusion, it’s pretty much on par in many respects with the current top dogs on Android. So while there definitely is competition, the app’s demands are certainly within what can be achieved on Android, and the fact that the developers have already brought other apps from their portfolio to Android indicates some interest in the platform.

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

Download KineMaster on GooglePlay
Download PowerDirector on GooglePlay
Download Adobe Premiere Rush on GooglePlay
Download VN on GooglePlay
Download CapCut on GooglePlay
Download Alight Motion on GooglePlay
Download Quik on GooglePlay
Download V360 on GooglePlay