smartfilming

Exploring the possibilities of video production with smartphones

Welcome to smartfilming.blog! — 21. May 2021

If you want to learn how smartphones and other compact mobile cameras can be powerful and fascinating tools for videography, you have come to the right place! I’m covering a variety of aspects of this topic including mobile devices/cameras, operating systems, apps, accessories and the art of mobile videography, particularly what I like to call “phoneography”. This knowledge can be very useful for a whole range of professional and/or hobby videography enthusiasts and visual storytellers: mobile journalists, smart(phone) filmmakers, vloggers, YouTubers, social media content creators, business or NGO marketing experts, teachers, educators or hey, even if you’re “just” doing home movies for your family! Your phone is a mighty media production powerhouse – learn how to unleash and wield it, right here on smartfilming.blog!

Feel free to connect with me on other platforms (click on the icons):

For a complete list of all my blog articles, click here.

To get in touch via email, click here.

To donate to this cost & ad-free blog via PayPal, click here.

#48 Is ProRes video recording coming to the next iPhone and is it a big deal? — 30. August 2021

ProRes logo and iPhone12 Pro Max image: Apple.

One of the things that always surprised me about Apple’s mobile operating system iOS (and now also iPadOS) was the fact that it couldn’t work with Apple’s very own professional video codec ProRes. ProRes is a high-quality video codec that gives you a lot of flexibility for grading in post and is easy on the hardware while editing. Years ago I purchased the original Blackmagic Design Pocket Cinema Camera, which can record in ProRes, and I was really looking forward to having a very compact mobile video production combo with the BMPCC (which, unlike the later BMPCC 4K/6K, was actually pocketable) and an iPad running LumaFusion for editing. But no, iOS/iPadOS didn’t support ProRes on a system level, so LumaFusion couldn’t either. What a bummer.

Most of us will be familiar with video codecs like H.264 (AVC) and the more recent H.265 (HEVC), but while these have now become ubiquitous “all-in-one” codecs for capturing, editing and delivering video content, this wasn’t always so. Initially, H.264 was primarily meant to be a delivery codec for a finished edit. It was not supposed to be the common editing codec – and for good reason: the high compression rate required powerful hardware to decode the footage while editing. I can still remember how the legacy Final Cut Pro on my old Mac struggled with H.264 footage while having no problems with other, less compressed codecs. The huge advantage of H.264 as a capturing codec, however, is exactly that high compression: you can record in high resolution and for a long time while still getting relatively small file sizes, which was and still is crucial for mobile devices where storage is precious. ProRes is basically the opposite: you get huge file sizes for the same recording, but it’s less taxing on the editing hardware because it’s not as heavily compressed as H.264. From a quality standpoint, it captures more and better color information and is therefore more robust and flexible when you apply grading in post production.

Very recently, Mark Gurman published a Bloomberg article claiming (based on info from inside sources) that the next flagship iPhone will be able to capture video with the ProRes codec. This took me quite by surprise given the aforementioned fact that iOS/iPadOS doesn’t even “passively” support ProRes at this point, but if it turns out to be true, it’s quite a big deal – at least for a certain tribe among the mobile video creator crowd, namely the mobile filmmakers.

I’m not sure so-called “MoJos” (mobile journalists) producing short current news reports on smartphones would necessarily have to embrace ProRes as their new capture codec, since their workflow usually involves a fast turnaround without significant time spent on extensive color grading, the very thing ProRes is made for. The lighter compression of ProRes might also not be such a big deal for them since recent iPhones and iPads can easily handle 4K multi-track editing of H.264/H.265 encoded footage. On the other hand, the downside of ProRes – very big file sizes – might actually play a role for MoJos since iPhones don’t support SD cards as exchangeable and cheap external storage. Mobile filmmakers, however, might see this as a game-changer for their line of work, as they usually offload and back up their dailies externally before going back on set and also spend a significant amount of time on grading in post later on.

Sure, if you are currently shooting with an app like Filmic Pro and use their “Filmic Extreme” bitrate, ProRes bitrates might not shock you that much, but the difference to standard mobile video bitrates is quite extreme nonetheless. To be more precise, the ProRes codec is not a single standard but comes in different flavors (with increasing bitrate): ProRes Proxy, ProRes LT, ProRes 422 (the “422” indicates its chroma subsampling), ProRes 422 HQ, ProRes 4444 and ProRes 4444 XQ. ProRes 422 can probably be regarded as the “standard” ProRes. Its target bitrates for 1080p FHD are 122 Mbit/s for 25fps and 245 Mbit/s for 50fps. Moving on to UHD/4K, things get really enormous with 492 Mbit/s for 25fps and 983 Mbit/s for 50fps. A 1-minute clip of ProRes 422 UHD 25fps footage would be 3.69 GB; a 1-minute clip of ProRes 422 UHD 50fps would be 7.37 GB. It’s easy to see how limited internal storage can quickly become a problem here if you shoot lots of video. So I personally would consider it a great option to have but not exactly a must for every job and situation. Of course I would expect ProRes also to be supported for editing within the system from then on. For more info on the ProRes codec and its bitrates, check here.
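
If you want to sanity-check those numbers yourself, the arithmetic is simple: bitrate times duration, divided by 8 to get from bits to bytes. Here’s a minimal sketch (the bitrates are Apple’s published ProRes 422 targets, sizes in decimal gigabytes):

```kotlin
// Estimate recording size from a codec's target bitrate.
fun fileSizeGB(bitrateMbps: Double, seconds: Double): Double =
    bitrateMbps * seconds / 8.0 / 1000.0 // Mbit -> MByte -> GB (decimal)

fun main() {
    println(fileSizeGB(122.0, 60.0)) // ProRes 422 1080p25: ≈ 0.92 GB/min
    println(fileSizeGB(492.0, 60.0)) // ProRes 422 UHD 25fps: ≈ 3.69 GB/min
    println(fileSizeGB(983.0, 60.0)) // ProRes 422 UHD 50fps: ≈ 7.37 GB/min
}
```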

At this point, however, the whole thing is NOT officially confirmed by Apple but only (informed) speculation, and until recently I would have heavily doubted the probability of this actually happening. But the fact that Apple, totally out of the blue, introduced the option to record with a PAL frame rate in the native camera app earlier this year – something that by and large only video pros really care about – gives me confidence that Apple might actually pull this off for real, maybe in the hope of luring in well-known filmmakers who boost the iPhone’s reputation as a serious filmmaking tool. What do you guys think? Will it really happen and would it be a big deal for you?

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#47 Videomakers please stop doing this! — 30. July 2021

Photo: SHVETS production / Pexels

Ok, today I have something a little different from the usual blog fare around here: a quick and dirty rant, maybe just a little bit tongue-in-cheek. I beg your pardon. I will only shame the deed, not name any perpetrators. You will probably have come across it and either noticed it consciously or subconsciously. Most likely on YouTube. There’s also a good chance you might disagree with what I am about to say. So be it. Now what am I talking about? 

The act of vlogging has risen to big stardom in the wake of moderately fast internet and affordable cameras. It often involves a person directly addressing the camera to tell us something – a casual piece-to-camera so to speak. Addressing the camera basically means addressing us as an audience. Maybe they’re talking about a political topic, about lip-gloss, their ongoing travels to exotic places – or why you should/shouldn’t buy this new exciting smartphone that just came out etc. etc. This is all fine. Here’s looking at you kid, I can take it all day long if need be – well if you have something interesting to say anyway …

What really annoys me though is that an increasing number of creators (oh, that’s a fancy word these days!) feel the absolute need to cross-cut their into-the-camera shot with one from the side where they quite obviously do not look directly into the lens but way off. I can’t help it, I always find this extremely irritating – it makes me lose my focus on what’s being said. To me it feels like someone is talking to me, telling me something, looking me in the eye, and then at some point he or she just starts looking somewhere else while still talking to me. As if they spotted someone they know walking by and followed them with their eyes while continuing to talk to you. Don’t get me wrong, this technique can be used to great effect the other way round in movies, when a character breaks the so-called “fourth wall” and directly looks into the camera at some point. Marc Vernet has written an interesting article about it called “The Look at the Camera”. Some of the most memorable cases that come to mind for me personally would probably include “A Clockwork Orange”, “The Silence of the Lambs” and “American Beauty”. And no, your name doesn’t have to be Stanley Kubrick, Jonathan Demme or Sam Mendes to be allowed to do that.

But your reason for switching between having someone look directly into the camera and then past it in the subsequent shot should have purposeful artistic value; it shouldn’t just stem from an imagined need to have a different shot from a different angle because the main one is perceived as too boring to carry the whole video. Something that sort of works for me, if the videomaker absolutely wants a little bit of change in the visual composition, is to use the main camera shot but crop in – very easy if you shoot in 4K but deliver in 1080p. By doing this, you still keep the continuity of looking directly into the camera. It’s also possible to cut in b-roll where the talking person is not seen at all. And it’s a different story if there are other persons involved. But if it’s a single person, going back and forth between shots where the presenter looks into the camera and shots where they don’t, without good reason, is a nuisance – at least for me, at least at this point in time.

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#46 Top tips for smartphone videography in the summer — 28. June 2021

Photo: Julia Volk via Pexels.com

It’s the dog days of summer again – well, at least if you live in the northern hemisphere or near the equator. While many people will be happy to finally escape the long lockdown winter and are looking forward to meeting friends and family outside, intense sunlight and heat can put extra stress on the body – and they make for some obvious and less obvious challenges when doing videography. Here are some tips/ideas to tackle those challenges.

Icon: Alexandr Razdolyanskiy via The Noun Project

Find a good time/spot!
Generally, some of the problems mentioned below can be avoided by picking the right spot and/or time for an outdoor shoot during the summertime. Maybe don’t set up your shot in the middle of a big open field where you and your phone are totally exposed to the full load of sunshine photons at high noon. Rather, try to shoot in the morning, late afternoon or early evening, and also think about picking a spot in the shade. Or choose a time when it’s slightly overcast. Of course it’s not always possible to freely choose time and spot – sometimes you just have to work in difficult conditions.

“Bum to the sun” – yes or no?
There’s a saying that you should turn your “bum to the sun” when shooting video. This definitely holds some truth, as pointing the lens directly towards the sun can cause multiple problems, including unwanted lens flare, underexposed faces or a blown-out background. You can however also create artistically interesting shots that way (silhouettes, for instance), and the “bum to the sun” motto comes with problems of its own: if you are shooting away from the sun but the person you are filming is looking directly towards it, they could be blinded by the intense sunlight and squint their eyes, which doesn’t look very flattering. If the sun is low, you also might have your own shadow in the shot. So I think the saying is something to take into consideration but shouldn’t be adhered to exclusively and in every situation.

Check the sky!
Clouds can severely impact the amount of sunlight that reaches the ground. So if you have set up an interview or a longer shot and locked the exposure at a moment when there isn’t a single cloud in front of the sun, a nearby one might already be crawling along that will take away lots of light and leave you with an underexposed image at some point. Or vice versa. So either do your thing when there are no (fast moving) clouds in the vicinity of the sun, or when the cloud cover will stay fairly constant for the next few minutes.

Use an ND filter!
As I pointed out in my last blog post The Smartphone Camera Exposure Paradox, a bright sunny day can create exposure problems on a smartphone if you want to work with the “recommended” (double the frame rate, for instance 1/50s at 25fps) or at least an acceptable shutter speed, because phones only have a fixed, wide-open aperture. Even at the lowest ISO setting, you will still have to use a (very) fast shutter speed that can make motion appear jerky. That’s why it’s good to have a neutral density (ND) filter in your kit, which reduces the amount of light that hits the sensor. There are two different kinds of ND filters: fixed and variable. The latter lets you adjust the strength of the filtering effect. Unlike the lenses on dedicated cameras, smartphone lenses don’t have a filter thread, so you either have to use some sort of case or rig with a filter thread or a clip-on ND filter.
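
To put a number on how strong a filter you need: every halving of light is one “stop”, so the required ND strength is just the base-2 logarithm of the ratio between the shutter speed the bright light forces on you and the one you actually want. A small sketch – the 1/1600s reading is simply an assumed sunny-day value at base ISO:

```kotlin
import kotlin.math.ln

// Stops of ND needed to get from the shutter speed auto-exposure forces
// on you (at base ISO, fixed aperture) down to your target shutter speed.
fun ndStopsNeeded(forcedShutter: Double, targetShutter: Double): Double =
    ln(targetShutter / forcedShutter) / ln(2.0)

fun main() {
    // Assumed: bright sun forces 1/1600s; the 25fps rule of thumb wants 1/50s.
    val stops = ndStopsNeeded(1.0 / 1600.0, 1.0 / 50.0)
    println("%.0f stops -> an ND32 filter (2^5 = 32)".format(stops)) // 5 stops
}
```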

Get a white case!
Ever heard of the term “albedo”? It designates the amount of sunlight (or, if you want to be more precise, solar radiation) that is reflected by objects. Black objects reflect less and absorb more solar radiation (smaller albedo) than white objects (higher albedo). You can easily get a feeling for the difference by wearing a black or a white shirt on a sunny day. Similarly, if you expose a black or dark colored phone to intense sunlight, it will absorb more heat than a white or light colored phone and therefore be more prone to overheating. So if you do have a black or dark colored phone, it might be a good idea to get yourself a white case so more sunlight is reflected off of the device. Vice versa, if you have a white or light colored phone with a black case, take it off. Be aware though that a white case only reduces the absorption of “external heat” from solar radiation, not the internal heat generated by the phone itself, something that particularly happens when you shoot in 4K/UHD or at high frame rates or bitrates. You should also take into consideration that a case that fits super tight might reduce the phone’s ability to dissipate internal heat. Ergo: a white phone (case) only offers some protection against direct solar radiation, not against internal heat produced by the phone itself or high ambient temperatures.

Maximize screen brightness!
This is pretty obvious. Of course bright conditions make it harder to see the screen and judge framing, exposure and focus, so it’s good to crank up the screen brightness. Some camera apps let you switch on a feature that automatically maximizes screen brightness when using the app.

Get a power bank!
Maximizing screen brightness will significantly increase battery consumption though, so you should think about having a back-up power bank at hand – at least if you are going on a longer shoot. But most of us already have one or two, so this might not even be an additional purchase.

Use exposure/focus assistants of your camera app!
Analytical assistant tools in certain camera apps can be very helpful in bright conditions when it’s hard to see the screen. While very few native camera apps offer some limited assistance in this respect, it’s an area where dedicated 3rd party apps like Filmic Pro, mcpro24fps, ProTake, MoviePro, Mavis etc. can really shine (pardon the pun). For setting the correct exposure you can use Zebra (displays stripes on overexposed areas of the frame) or False Color (renders the image into solid colors identifying areas of under- and overexposure – usually blue for underexposure and red for overexposure). For setting the correct focus you can use Peaking (displays a colored outline on things in focus) and Magnification (digitally magnifies the image). Not all of the mentioned apps offer all of the mentioned tools. And there’s also a downside: using these tools puts extra stress on your phone’s chipset, which also means more internal heat – so only use them when setting exposure and focus for the shot, and turn them off once you are done.
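
In case you’re curious what a tool like Zebra actually does under the hood, the core idea is almost trivially simple: flag every pixel whose brightness exceeds a threshold. This is only a conceptual sketch (apps differ in thresholds and in how they draw the stripes), not how any particular app implements it:

```kotlin
// Conceptual zebra: mark pixels whose 8-bit luma exceeds a threshold.
// 235 is a commonly used zebra level; real apps then draw animated
// stripes over the flagged regions of the live preview.
fun zebraMask(luma: ByteArray, threshold: Int = 235): BooleanArray =
    BooleanArray(luma.size) { i -> (luma[i].toInt() and 0xFF) >= threshold }

fun main() {
    val pixels = byteArrayOf(12, 180.toByte(), 240.toByte(), 255.toByte())
    println(zebraMask(pixels).toList()) // [false, false, true, true]
}
```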

Photo: Moondog Labs

Use a sun hood!
Another way to better see the screen in sunny weather is to use a sun hood. There are multiple generic smartphone sun hoods available online, but also one from dedicated mobile camera gear company Moondog Labs. Watch out: SmallRig, a somewhat renowned accessory provider for independent videography and filmmaking, has a sun hood for smartphones in its portfolio, but it’s made for using a smartphone as a secondary device with regular cameras or drones, so there’s no cut-out for the lens or an open back – which renders it useless if you want to shoot with your phone. This cautionary advice also applies to other sun hoods for smartphones.

Photo: RollCallGames

Sweaty fingers?
An issue I encountered last summer on a bike tour, where I would occasionally stop to take some shots of interesting scenery along the road, was that sweaty hands/fingers can cause problems with a phone’s touch screen: touches aren’t registered, or are registered in the wrong places. This can be quite annoying. Turns out there’s such a thing as “anti-sweat finger sleeves”, which were apparently invented for passionate mobile gamers. So I guess kudos to PUBG and Fortnite aficionados? There’s also another option: you can use a stylus or pen to navigate the touch screen. Users of the Samsung Galaxy Note series are clearly at an advantage here as the stylus comes with the phone.

Photo: George Becker via Pexels.com

Don’t forget the water bottle!
Am I going to tell you to cool your phone with a refreshing shower of bottled drinking water? Despite the fact that many phones nowadays offer some level of water-resistance, the answer is no. I’m including this tip for two reasons: First, it’s always good to stay hydrated if you’re out in the sun – I have had numerous situations where I packed my gear bag with all kinds of stuff (most of which I didn’t need in the end) but forgot to include a bottle of water (which I desperately needed at some point). Second, you can use a water bottle as an emergency tripod in combination with a rubber band or hair tie, as shown in workshops by Marc Settle and Bernhard Lill. So yes, don’t forget to bring a water bottle!

Got other tips for smartphone videography in the summertime? Let us know!

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#45 The Smartphone Camera Exposure Paradox — 11. May 2021

Ask anyone about the weaknesses of smartphone cameras and you will surely find that people often point towards a phone’s low-light capabilities as its Achilles heel, or at least one of them. When you are outside during the day, it’s relatively easy to shoot some good-looking footage with your mobile device, even with budget phones. Once it’s darker or you’re indoors, things get more difficult. The reason for this is essentially that the image sensors in smartphones are still pretty small compared to those in DSLMs/DSLRs or professional video/cinema cameras. Bigger sensors can collect more photons (light) and produce better low-light images. A so-called “Full Frame” sensor in a DSLM like Sony’s Alpha 7 series has a surface area of 864 mm², while a common 1/2.5” smartphone image sensor has only 25 mm² – roughly a 35x difference in light-gathering area.

So why not just put a huge sensor in a smartphone? While cameras in smartphones have undeniably become a very important factor, the phone is still very much a multi-purpose device and not a single-purpose one like a dedicated camera – for better or worse. That means there are many things to consider when building a phone. I doubt anyone would want a phone with a form factor that doesn’t fit in your pocket, and the flat form factor makes it difficult to build proper optics for larger sensors. Larger sensors also consume more power and produce more heat, not exactly something desirable.

If we are talking about smartphone photography from a tripod, some of the missing sensor size can be compensated for with long exposure times. The advancements in computational imaging and AI have also led to dedicated and often quite impressive photography “Night Modes” on smartphones. But very long shutter speeds aren’t really an option for video, as any movement appears extremely blurred – and while today’s chipsets can already handle supportive AI processing for photography, the more resource-intensive videography is yet a bridge too far. So despite the fact that the latest developments signal that we’re about to experience a considerable bump in smartphone image sensor sizes (Sony and Samsung are about to release a 1-inch/almost-1-inch image sensor for phones), one could say that most/all smartphone cameras (still) have a problem with low-light conditions. But you know what? They also have a problem with the exact opposite – very bright conditions!
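
To make that size difference tangible in stops (each stop being a doubling of light), here’s the back-of-the-envelope math, all else being equal:

```kotlin
import kotlin.math.ln

fun main() {
    val fullFrame = 864.0 // mm², e.g. a Sony Alpha 7 series sensor
    val phone = 25.0      // mm², a common 1/2.5" smartphone sensor
    val ratio = fullFrame / phone   // ≈ 34.6x the light-gathering area
    val stops = ln(ratio) / ln(2.0) // ≈ 5.1 stops of difference
    println("area ratio: %.1fx ≈ %.1f stops".format(ratio, stops))
}
```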

If you know a little bit about how cameras work and how to set the exposure manually, you have probably come across something called the “exposure triangle”. The exposure triangle contains the three basic parameters that let you set and adjust the exposure of a photo or video on a regular camera: shutter speed, aperture and ISO. In more general terms you could also say: time, size and sensitivity. Shutter speed signifies the amount of time that the still image or a single frame of video is exposed to light, for instance 1/50 of a second. The longer the shutter speed, the more light hits the sensor and the brighter the image will be. Aperture refers to the size of the iris opening through which the light passes before it hits the sensor (or, way back when, the film strip); it’s commonly measured in f-stops, for instance f/2.0. The bigger the aperture (= the SMALLER the f-stop number), the more light reaches the sensor and the brighter the image will be. ISO (or “Gain” on some dedicated video cameras) finally refers to the sensitivity of the image sensor, for instance ISO 400. The higher the ISO, the brighter the image will be. Most of the time you want to keep the ISO as low as possible because higher sensitivity introduces more image noise.
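
For those who like to see the triangle as a formula: photographers condense aperture and shutter speed into a single exposure value, EV = log2(N²/t), where N is the f-number and t the shutter time in seconds – settings with the same EV produce the same image brightness at a fixed ISO, and one EV step equals one stop. A small sketch of that relationship:

```kotlin
import kotlin.math.ln

// Exposure value: EV = log2(N^2 / t), N = f-number, t = shutter time (s).
fun ev(fNumber: Double, shutterSeconds: Double): Double =
    ln(fNumber * fNumber / shutterSeconds) / ln(2.0)

fun main() {
    println(ev(2.0, 1.0 / 50.0))  // f/2.0 at 1/50s  -> ≈ 7.6
    println(ev(2.8, 1.0 / 100.0)) // f/2.8 at 1/100s -> ≈ 9.6 (two stops less light)
}
```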

So what exactly is the problem with smartphone cameras in this respect? Well, unlike dedicated cameras, smartphones don’t have a variable aperture – it’s fixed and can’t be adjusted. Ok, there actually have been a few phones with variable aperture: most notably, Samsung had one on the S4 Zoom (2013) and K Zoom (2014), introduced a dual aperture approach with the S9/Note9 (2018), held on to it for the S10/Note10 (2019) but dropped it again for the S20/Note20 (2020). But as you can see from the very limited selection, this has been more of an experiment. The fixed aperture means that the exposure triangle for smartphone cameras only has two adjustable parameters: shutter speed and ISO. Why is this problematic? When there’s movement in a video (either because something moves within the frame or the camera itself moves), we as an audience have become accustomed to a certain degree of motion blur, which is related to the shutter speed used. The rule of thumb applied here says: double the frame rate. So if you are shooting at 24fps, use a shutter speed of 1/48s; if you are shooting at 25fps, use 1/50s; 1/60s for 30fps etc. This suggestion is not set in stone, and in my humble opinion you can deviate from it to a certain degree without it becoming too obvious for casual, non-pixel-peeping viewers – but if the shutter speed is very slow, everything begins to look like a drug-induced stream-of-consciousness experience, and if it’s very fast, things appear jerky and shutter speed becomes stutter speed. So with the aperture being fixed and the shutter speed set at a “recommended” value, you’re left with ISO as the only adjustable exposure parameter. Reducing the sensitivity of the sensor is usually only technically possible down to an ISO between 50 and 100, which will still give you a (heavily) overexposed image on a sunny day outside. So here’s our “paradox”: too much available light can be just as much of an issue as too little when shooting with a smartphone.

What can we do about the two problems? Until significantly bigger smartphone image sensors or computational image enhancement for video arrive, the best way to tackle the low-light challenge is to provide your own additional lighting or look for more available light, be it natural or artificial. Depending on your situation, this might be relatively easy or downright impossible. If you are trying to capture an unlit building at night, you will most likely not have a sufficient amount of ultra-bright floodlights at hand. If you are interviewing someone in a dimly lit room, a small LED might just provide enough light to keep the ISO at a level without too much image noise.

Clip-on variable ND filter

As for the too-much-light problem (which ironically gets even worse with the bigger sensors setting out to remedy the low-light problems): try to pick a less sun-drenched spot, shoot with a faster shutter speed if there is no or little action in the shot or – and this might be the most flexible solution – get yourself an ND (neutral density) filter that reduces the amount of light that passes through the lens. While some regular cameras have built-in ND filters, this feature has yet to appear in any smartphone, although OnePlus showcased a prototype phone last year that had something close to a proper ND filter, using a technology called “electrochromic glass” to hide the lens while still letting (less) light pass through (check out this XDA Developers article). So until this actually makes it to the market and proves to be effective, the filter has to be an external one that is either clipped on or screwed on if you use a dedicated case with a corresponding filter thread. You also have the choice between a variable and a non-variable (fixed density) ND filter. A variable ND filter will let you adjust the strength of its filtering effect, which is great for flexibility, but it also has some disadvantages like the possibility of cross-polarization. If you want to learn more about ND filters, I highly recommend checking out this superb in-depth article by Richard Lackey.

So what’s the bigger issue for you personally? Low-light or high-light? 

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#44 Split channels (dual mono) audio from the Rode Wireless Go II in LumaFusion — 4. May 2021

Rode just recently released the Wireless GO II, a very compact wireless audio system I wrote about in my last article. One of its cool features is that you can feed two transmitters into one receiver, so you don’t need two audio inputs on your camera or smartphone to work with two external mic sources simultaneously. What’s even cooler is that you can record the two mics into separate channels of a video file with split-track dual mono audio, so you are able to access and mix them individually later on – very helpful if you need to make some volume adjustments or eliminate unwanted noise from one mic that would otherwise just be “baked in” with a merged track. There’s also the option to record a -12dB safety track into the second channel when you are using the GO II’s “merged mode” instead of the “split mode” – this can be a lifesaver when the audio of the original track clips because of loud input.

If you use a regular camera like a DSLM, it’s basically a given that you can record in split track dual mono and it also isn’t rocket science to access the two individual channels on a lot of desktop editing software. If you are using the GO II with a smartphone and even want to finish the edit on mobile afterwards, it’s a bit more complicated.

First off, if you want to make use of split channels or the safety channel, you need to be able to record a video file with dual-track audio, because only then do you have two channels at your disposal – two channels that are either used for mic 1 and mic 2, or for mics 1+2 combined plus the safety channel in the case of the Wireless GO II. Most smartphones and camera apps nowadays do support this (if they support external mics in general). The next hurdle is that you need to use the digital input port of your phone: USB-C on an Android device or the Lightning port on an iPhone/iPad. If you use the 3.5mm headphone jack (or an adapter like 3.5mm-to-Lightning with iOS devices), the input will either create single-channel mono audio or send the same pre-mixed signal to both stereo channels. So to connect the RX unit of the GO II to the mobile device, you will need a USB-C to USB-C cable for Android devices (Rode is selling the SC16 but I also made it work with another cable) and a USB-C to Lightning cable for iOS devices (here the Rode SC15 seems to be the only compatible option). Unfortunately, such cables are not included with the GO II but have to be purchased separately. A quick note: depending on what app you are using, you either need to explicitly choose an external mic as the audio input in the app’s settings or it just automatically detects the external mic.

Once you have recorded a dual mono video file with separate channels and want to access them individually for adjustments, you also need editing software that allows you to do that. On desktop, it’s relatively easy with the common prosumer or pro video editing software (I personally use Final Cut Pro), but on mobile devices there’s currently only a single option: LumaFusion, so far only available for iPhone/iPad. I briefly thought that KineMaster (which is available for both Android and iOS) could do it as well because it has a panning feature for audio, but it’s not implemented in a way that can actually do what we need in this scenario.

So how do you access the different channels in LumaFusion? It’s actually quite simple: you either double-tap your video clip in the timeline or tap the pen icon in the bottom toolbar with the clip selected. Select the “Audio” tab (speaker icon) and find the “Configuration” option on the right. In the “Channels” section, select either “Fill From Left” or “Fill From Right” to switch between the channels. If you need to use both channels at the same time and adjust/balance the mix, you will have to detach the audio from the video clip (either triple-tap the clip or tap on the rectangular icon with an audio waveform), then duplicate the audio (rectangular icon with a +) and then set the channel configuration of one copy to “Fill From Left” and of the other to “Fill From Right”.
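
If the naming confuses you: “Fill From Left” simply means the left channel is copied to both output channels, so you only hear that one mic. Conceptually it looks something like this – an illustrative sketch of the idea, not LumaFusion’s actual code:

```kotlin
// "Fill From Left" in concept: duplicate the left channel of an
// interleaved stereo buffer (L,R,L,R,...) onto both output channels.
fun fillFromLeft(stereo: ShortArray): ShortArray {
    require(stereo.size % 2 == 0) { "expects interleaved stereo samples" }
    val out = ShortArray(stereo.size)
    for (i in stereo.indices step 2) {
        out[i] = stereo[i]     // left stays left
        out[i + 1] = stereo[i] // left also replaces right
    }
    return out
}
```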

Here’s hoping that more video editing apps implement the ability to access individual audio tracks of a video file and that LumaFusion eventually makes it to Android.

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#43 The Rode Wireless Go II review – Essential audio gear for everyone? — 20. April 2021

Australian microphone maker RØDE is an interesting company. For a long time, the main thing they had going for them was that they provided an almost-as-good but relatively low-cost alternative to high-end brands like Sennheiser or AKG and their established microphones, thereby “democratizing” decent audio gear for the masses. Over the last years however, Rode grew from “mimicking” other companies’ products into a highly innovative force, creating original products which others now mimicked in return. Rode was first to come out with a dedicated quality smartphone lavalier microphone (the smartLav+), for instance, and in 2019 the Wireless GO established another new microphone category: the ultra-compact wireless system with an inbuilt mic on the TX unit. It worked right out of the box with DSLMs/DSLRs, via a TRS-to-TRRS or USB-C cable with smartphones, and via a 3.5mm-to-XLR adapter with pro camcorders. The Wireless GO became an instant runaway success and there’s much to love about it – seemingly small details like the clamp that doubles as a cold shoe mount are plain ingenuity. The Interview GO accessory even turns it into a super lightweight handheld reporter mic, and you are also able to use it like a more traditional wireless system with a lavalier mic that plugs into the 3.5mm jack of the transmitter. But it wasn’t perfect (how could it be as a first-generation product?). The flimsy attachable windscreen became sort of a running joke among GO users (I had my fair share of trouble with it), and many envied the similar Saramonic Blink 500 series (B2, B4, B6) for its ability to feed two transmitters into a single receiver – albeit without the ability for split channels. Personally, I also had occasional problems with interference when using it with an XLR adapter on bigger cameras and a Zoom H5 audio recorder.

Now Rode has launched a successor, the Wireless GO II. Is it the perfect compact wireless system this time around?

The most obvious new thing about the GO II is that the kit comes with two TX units instead of just one – already know where we are headed with this? Let’s talk about it in a second. A first look at the Wireless GO II’s RX and TX units doesn’t really reveal anything new – apart from the fact that they are labeled “Wireless GO II”, the form factor of the little black square boxes is exactly the same. That’s both good and maybe partly bad, I guess. Good because yes, just like the original Wireless GO, it’s a very compact system; “partly bad” because I suppose some would have loved to see the TX unit get even smaller for using it standalone as a clip-on with the internal mic rather than with an additional lavalier. But I suppose the fact that you have a mic and a transmitter in a single piece requires a certain size to function at this point in time. The internal mic also pretty much seems to be the same, which isn’t a bad thing per se – it’s quite good! I wasn’t able to make out a noticeable difference in my tests so far, but maybe the improvements are too subtle for me to notice – I’m not an audio guy. Oh wait, there is one new thing on the outside: a new twist-mechanism for the windscreen – and this approach actually works really well and keeps the windscreen in place, even if you pull on it. For those of us who use it outdoors, this is a big relief.

But let’s talk about the new stuff “under the hood”, and let me tell you, there’s plenty! First of all, as hinted at before, you can now feed two transmitters into one receiver. This is perfect if you need to mic up two persons for an interview. With the original Wireless GO you had to use two receivers and an adapter cable to make it work with a single audio input.

It’s even better that you can choose between a “merged mode” and a “split mode”. “Merged mode” combines both TX sources into a single pre-mixed audio stream; “split mode” sends the two inputs into separate channels (left and right of a stereo mix, so basically dual mono). “Split mode” is very useful because it allows you to access and adjust both channels individually afterwards – this can come in handy, for instance, if you have a two-person interview and one person coughs while the other one is talking. If the two sources are pre-mixed (“merged mode”) into the same channel, you will not be able to eliminate the cough without affecting the voice of the person talking. When you have the two sources in separate channels, you can just mute the noisy channel for that moment in post. You can switch between the two modes by pressing both the dB button and the pairing button on the RX unit at the same time.

One thing you should be aware of when using the split-channels mode recording into a smartphone: this only works with the digital input port of the phone (USB-C on Android, Lightning on iPhone/iPad). If you use a TRS-to-TRRS cable and feed it into the 3.5mm headphone jack (or a 3.5mm adapter, like the one for the iPhone), the signal gets merged, as there is just one contact left on the pin for mic input – only allowing mono. If you want to use the GO II’s split channels feature with an iPhone, there’s currently only one reliable solution: Rode’s SC15 USB-C to Lightning cable, which unfortunately is a separate purchase (around 25 Euros). With Android it’s less restrictive. You can purchase the equivalent SC16 USB-C to USB-C cable from Rode (around 15 Euros), but I tested it with a more generic USB-C to USB-C cable (included with my Samsung T5 SSD drive) and it worked just fine. So if you happen to have a USB-C to USB-C cable around, try this first before buying something new. You should also consider that you need video editing software that lets you access both channels separately if you want to adjust them individually. On desktop, there are lots of options, but on mobile devices the only option is currently LumaFusion (I’m planning a dedicated blog post about this).

If you don’t need the extra functionality of the “split mode” or the safety channel and are happy to use it with your device’s 3.5mm port (or a corresponding adapter), be aware that you will still need a TRS-to-TRRS adapter (cable) like Rode’s own SC4 or SC7, because the included one from Rode is TRS-to-TRS, which works fine for regular cameras (DSLMs/DSLRs) but not for smartphones, which have a TRRS headphone jack – well, if they still have one at all, that is. It may all look the same at first sight but the devil is in the detail, or in this case the connectors on the pin.

If you want to use the GO II with a camera or audio recorder that has XLR inputs, you will need a 3.5mm to XLR adapter like Rode’s own VXLR+ or VXLR Pro.

Along with the GO II, Rode released a desktop application called Rode Central which is available for free for Windows and macOS. It lets you activate and fine-tune additional features on the GO II when it’s connected to the computer. You can also access files from the onboard recording, a new feature I will talk about in a bit. A mobile app for Android and iOS is not yet available but apparently Rode is already working on it.

One brilliant new software feature is the ability to record a simultaneous -12dB safety track when in “merged mode”. It’s something Rode already implemented on the VideoMic NTG, and it’s a lifesaver when you don’t know in advance how loud the sound source will be. If there’s a very loud moment in the main track and the audio clips, you can just use the safety track, which at -12dB probably will not have clipped. The safety channel is however only available when recording in “merged mode”, since it uses the second channel for the back-up. If you are using “split mode”, both channels are already filled and there’s no space for the safety track. It also means that if you are using the GO II with a smartphone, you will only be able to access the safety channel feature when using the digital input (USB-C or Lightning), not the analogue 3.5mm headphone jack input, because only then do you have two channels to record into at your disposal.
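
In case you’re wondering what -12dB means in practice: decibels map to linear amplitude via 10^(dB/20), so the safety track runs at roughly a quarter of the main track’s amplitude – plenty of headroom for peaks that would clip the main one. A quick sketch of the conversion:

```kotlin
import kotlin.math.pow

// Convert a dB offset to a linear amplitude ratio: ratio = 10^(dB/20).
fun dbToAmplitude(db: Double): Double = 10.0.pow(db / 20.0)

fun main() {
    println(dbToAmplitude(-12.0)) // ≈ 0.251 -> about 1/4 of the amplitude
}
```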

Another lifesaver is the new onboard recording capability, which basically turns the two TX units into tiny standalone field recorders, thanks to their internal mic and internal storage. The internal storage is capable of recording up to 7 hours of uncompressed wav audio (the 7 hours also correspond to the battery life, which probably isn’t a coincidence). This is very helpful when you run into a situation where the wireless connection is disturbed and the audio stream suffers from interference noise or even drop-outs.

There are some further options you can adjust in the Rode Central app: you can now activate a more nuanced gain control pad for the output of the RX unit. On the original GO, you only had three different settings (low, medium, high); now you have a total of 11 (in 3 dB steps from -30 dB to 0 dB). You can also activate a reduced sensitivity for the input of the TX units when you know that you are going to record something very loud. Furthermore, you can enable a power saver mode that dims the LEDs to preserve some additional battery life.

Other improvements over the original GO include a wider transmission range (200m line-of-sight vs. 70m) and better shielding from RF interference.

One thing that some people were hoping for in an updated version of the Wireless GO is the option to monitor the audio that goes into the receiver via a headphone output – sorry to say that didn’t happen but as long as you are using a camera or smartphone/smartphone app that gives you live audio monitoring, this shouldn’t be too big of a deal.

Aside from the wireless system itself, the GO II comes with a TRS-to-TRS 3.5mm cable to connect it to regular cameras with a 3.5mm input, three USB-C to USB-A cables (for charging and connecting it to a desktop computer/laptop), three windscreens, and a pouch. The pouch isn’t that great in my opinion – I would have preferred a more robust case, but I guess it’s better than nothing at all. And as mentioned before: I would have loved to see a TRS-to-TRRS, USB-C to USB-C and/or USB-C to Lightning cable included to ensure out-of-the-box compatibility with smartphones. Unlike some competitors, the kit doesn’t come with separate lavalier mics, so if you don’t want to use the internal mics of the transmitters, you will have to make an additional purchase unless you already have some. Rode offers the dedicated Lavalier GO for around 60 Euros. The price for the Wireless GO II is around 300 Euros.

So is the Rode Wireless GO II perfect? Not quite, but it’s pretty darn close. It builds upon an already amazingly compact and versatile wireless audio system and adds some incredible new features, so I can only recommend it for every mobile videomaker’s gear bag. If you want to compare it against a viable alternative, you could take a look at the Saramonic Blink 500 Pro B2, which is roughly the same price and comes with two lavalier microphones, or the Hollyland Lark 150.

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#42 Camera2 API Update 2021 – Android Pro Videography & Filmmaking — 15. April 2021

I’ve already written about Camera2 API in two previous blog posts (#6 & #10), but a couple of years have passed since then, and I felt like taking another look at the topic now that we’re in 2021.

Just in case you don’t have a clue what I’m talking about here: Camera2 API is a software component of Google’s mobile operating system Android (which basically runs on every smartphone today except Apple’s iPhones) that enables 3rd party camera apps (camera apps other than the one that’s already on your phone) to access more advanced functionality/controls of the camera, for instance setting a precise shutter speed value for correct exposure. Android phone makers need to implement Camera2 API into their version of Android, and not all do it fully. There are four different implementation levels: “Legacy”, “Limited”, “Full” and “Level 3”. “Legacy” basically means Camera2 API hasn’t been implemented at all and the phone uses the old, way more primitive Android Camera API; “Limited” signifies that some components of the Camera2 API have been implemented but not all; “Full” and “Level 3” indicate complete implementation in terms of video-related functionality – “Level 3” only has the additional benefit for photography that you can shoot in RAW format. Android 3rd party camera apps like Filmic Pro, Protake, mcpro24fps, ProShot, Footej Camera 2 or Open Camera can only unleash their full potential if the phone has adequate Camera2 API support; Filmic Pro doesn’t even let you install the app in the first place if the phone doesn’t have proper implementation. “Adequate”/“proper” can already mean “Limited” for certain phones, but you can only be sure with “Full” and “Level 3” devices. With some other apps like Open Camera, Camera2 API is deactivated by default and you need to go into the settings to enable it to access things like shutter speed and ISO control.
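
If you’re comfortable with a bit of Android code, you can query the support level yourself – it’s exposed per camera through the CameraCharacteristics class. A minimal sketch, meant to be run inside an Android app:

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata

// Print the Camera2 hardware support level for every camera the system exposes.
fun logCamera2Levels(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (id in manager.cameraIdList) {
        val level = manager.getCameraCharacteristics(id)
            .get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL)
        val name = when (level) {
            CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY -> "Legacy"
            CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED -> "Limited"
            CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_FULL -> "Full"
            CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_3 -> "Level 3"
            else -> "Unknown ($level)"
        }
        println("Camera $id: $name")
    }
}
```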

How do you know which Camera2 API support level a phone has? If you already own the phone, you can use an app like Camera2 Probe to check, but if you want to consider this before buying a new phone, that’s of course not possible. Luckily, the developer of Camera2 Probe has set up a crowdsourced list (users can submit test results via the app, which are automatically entered into the list) with Camera2 API support levels for a massive number of different Android devices, currently over 3500! The list can be accessed here, and you can even sort it by different parameters like phone brand or type a device name into the search bar.

It’s important to understand that there’s a Camera2 API support level for each camera on the phone. So the rear camera could have a different one than the selfie camera. The support level also doesn’t say anything about how many of the phone’s cameras have been made accessible to 3rd party apps. Auxiliary ultra wide-angle or telephoto lenses have become a common standard in many of today’s phones, but not all phone makers allow 3rd party camera apps to access the auxiliary camera(s). So when we talk about the Camera2 API support level of a device, most of the time we are referring to its main rear camera.

Camera2 API was introduced with Android version 5 aka “Lollipop” in 2014, and it took phone makers a bit of time to implement it into their devices, so one could roughly say that only Android devices running at least Android 6 Marshmallow are actually in a position to have proper support. In the beginning, most phone makers only provided full Camera2 API support for their high-end flagship phones, but over the last years the feature has trickled down to the mid-range segment and now even to a considerable amount of entry-level devices (Nokia and Motorola are two companies that have been good in this regard if you’re on a tight budget).

I actually took the time to go through the Camera2 Probe list to provide some numbers on this development. Of course these are not 100% representative, since not every single Android device on the planet is included in the list, but I think 3533 entries (as of 21 March 2021) make for a solid sample size.

Android version | Level 3 | Full | Limited | Legacy | Full/Level 3 %
Android 6       |       0 |   30 |      18 |    444 |  6.1
Android 7       |      82 |  121 |     113 |    559 | 23.2
Android 8       |     147 |  131 |     160 |    350 | 35.3
Android 9       |     145 |  163 |     139 |     69 | 59.7
Android 10      |     319 |  199 |     169 |     50 | 70.3
Android 11      |      72 |   28 |       8 |      2 | 90.9

I think it’s pretty obvious that the implementation of proper Camera2 API support in Android devices has taken massive steps forward with each iteration of the OS, and 100% coverage on new devices is just within reach – maybe the upcoming Android 12 can already accomplish this mission?

As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂

#41 Sharing VN project files between iPhone, iPad, Mac, Android (& Windows PC) — 23. March 2021

As I have pointed out in two of my previous blog posts (What’s the best free cross-platform mobile video editing app?, Best video editors / video editing apps for Android in 2021), VN is a free and very capable mobile video editor for Android and iPhone/iPad, and the makers recently also launched a desktop version for macOS. Project file sharing takes advantage of that and makes it possible to start your editing work on one device and finish it on another. So for instance, after having shot some footage on your iPhone, you can start editing right away using VN for iPhone, but transfer the whole project to your iMac or MacBook Pro later to have a bigger screen and mouse control. It’s also a great way to free up storage space on your phone, since you can archive projects in the cloud, on an external drive or on a computer and delete them from your mobile device afterwards. Project sharing isn’t a one-way trick, it also works the other way around: you start a project using VN on your iMac or MacBook Pro and then transfer it to your iPhone or iPad because you have to go somewhere and want to continue your project while commuting. And it’s not all about Apple products either: you can also share from or to VN on Android smartphones and tablets (so basically every smartphone or tablet that’s not made by Apple). What about Windows? Yes, this is also possible, but you will need to install an Android emulator on your PC, and I will not go into the details of the procedure in this article as I don’t own a PC to test it on. But you can check out a good tutorial on the VN site here.

Before you start sharing your VN projects, here’s some general info: to actively share a project file, you need to create a free account with VN. Right off the bat, you can share projects that don’t exceed 3 GB in size. There’s also a maximum limit of 100 project files per day, but I suppose nobody will actually bump into that. To get rid of these limitations, the VN team will manually clear your account for unlimited sharing within a few days after you fill out this short survey. For passive sharing, that is when someone sends you a project file, there are no limitations, even when you are not logged in. As the sharing process is slightly different depending on which platforms/devices are involved, I have decided to walk you through all nine combinations, starting with the one that will probably be the most common.

Let me quickly explain two general things ahead which apply to all combinations so I don’t have to go into the details every time:

1) When creating a VN project file to share, you can do it as “Full” or “Simple”. “Full” will share the project file with all of its media (complete footage, music/sound fx, text), “Simple” will let you choose which video clips you actually want to include. Not including every video clip will result in a smaller project file that can be transferred faster.

2) You can also choose whether or not you want the project file to be “Readonly”. If you choose “Readonly”, saving or exporting will be denied – this can be helpful if you send it to someone else but don’t want this person to save changes or export the project.

All of the sharing combinations I will mention now are focused on local device-to-device sharing. Of course you can also use any cloud service to store/share VN project files and have them downloaded and opened remotely on another device that runs the VN application.

iPhone/iPad to Mac

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the iOS/iPadOS share menu will pop up.
  • Now choose “AirDrop” and select your Mac. Make sure that AirDrop is activated on both devices.
  • Depending on your AirDrop settings you now have to accept the transfer on the receiving device or the transfer will start automatically. By default, the file will be saved in the “Downloads” folder of your Mac.
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.

iPhone/iPad to iPhone/iPad

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • Now choose “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Select the iPhone/iPad you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now select “Open Project”.

iPhone/iPad to Android

  • Open VN on your iPhone/iPad.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the iOS/iPadOS share menu will pop up.
  • Now you need to transfer the project file from the iPhone/iPad to the Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both iPhone/iPad and Android.
  • So choose SendAnywhere from the share menu. A 6-digit code is generated.
  • Open SendAnywhere on your Android device, select the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. 
  • The Android “Open with” menu will open, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Finally choose “Open Project”.

Mac to iPhone/iPad

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Click on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac, right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select your iPhone or iPad. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • The project file will be imported into VN automatically.
  • Now choose “Open Project”.

Mac to Mac

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Click on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and right-click the file, hover over “Share” and then select “AirDrop”. Make sure that AirDrop is activated on both devices.
  • Now select the Mac you want to send it to. Depending on your AirDrop settings you now need to accept the transfer on the receiving device or the transfer will start automatically.
  • By default the VN project file will be saved in the “Downloads” folder of the receiving Mac.
  • Open VN on the receiving Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.

Mac to Android

  • Open VN on your Mac.
  • In the left side bar, click on “Projects”.
  • Click on the three dots below the thumbnail of the project you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Click on “Share”, the project file will be generated.
  • Now you have to select a save location for the VN project file.
  • Locate the exported project file on your Mac and choose a way to send it to your Android device. I have found that SendAnywhere is a very good tool for this, it’s free and available for both macOS and Android.
  • So using SendAnywhere on your Mac, drag the VN project file into the app. You will see a 6-digit code. Open SendAnywhere on your Android, choose the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then on the project file.
  • The Android “Open with” menu will pop up, locate and select “VN/Import to VN”, the project file will be imported into your VN app.
  • Choose “Open Project”.

Android to Mac

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the Android share sheet will pop up.
  • Now you need to transfer the project file from your Android device to your Mac. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and macOS (for a cable-based alternative, see the sketch below this list).
  • So choose SendAnywhere from the share sheet. A 6-digit code is generated.
  • Open SendAnywhere on your Mac, select the “Receive” tab and enter the code.
  • Unless you have set a custom download folder in your file transfer app, the VN project file will be saved to the “Downloads” folder on your Mac (if you used a cloud service instead, grab it from your cloud storage).
  • Open VN on your Mac and drag and drop the VN project file into the app.
  • Now select “Open Project”.
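
If you’re comfortable with the command line, a USB cable plus Google’s adb tool can replace the transfer app for this route: save the exported project file to the device’s storage from the share sheet first, then pull it over to the Mac. Below is a minimal sketch, assuming adb is installed and USB debugging is enabled on the phone – the remote path is an assumption about where the exported file might have landed, so adjust it accordingly:

  # Pull an exported VN project file from an Android device over USB with adb.
  # Assumes adb is installed and USB debugging is enabled; the remote path is
  # a hypothetical example of where the exported file was saved.
  import subprocess
  from pathlib import Path

  remote = "/sdcard/Download/my_project.vn"
  local_dir = Path.home() / "Downloads"

  subprocess.run(["adb", "pull", remote, str(local_dir)], check=True)
  print(f"Pulled {remote} to {local_dir} - now drag and drop it into VN.")

The same idea works in the opposite direction with “adb push” if you ever want to script the Mac to Android route.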

Android to Android

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated.
  • From the Android share sheet, choose Android’s integrated Wi-Fi sharing option Nearby Share (check this video on how to use Nearby Share if you are not familiar with it) and select the device you want to send it to. Make sure Nearby Share is activated on both devices.
  • After accepting the file on the second device, the transfer will start.
  • Once it is finished, choose “VN/Import to VN” from the pop-up menu and the import into VN will start.
  • Finally choose “Open Project”.

Android to iPhone/iPad

  • Open VN on your Android device.
  • On the VN Studio page (house icon in the bottom navigation bar), select the “Projects” tab.
  • Tap the three dots on the right side of the project that you want to share.
  • Select “Share VN Project”.
  • Choose either “Full” or “Simple”.
  • Choose whether or not you want the project file to be “Readonly”.
  • Tap on “Share”, the project file will be generated and the Android share sheet will pop up.
  • Now you need to transfer the project file from the Android device to the iPhone/iPad. I have found that SendAnywhere is a very good tool for this, it’s free and available for both Android and iPhone/iPad.
  • So choose SendAnywhere from the share sheet. A 6-digit code is generated.
  • Open SendAnywhere on your iPhone/iPad, select the “Receive” tab and enter the code.
  • After the transfer is completed, tap on the transfer entry and then select the VN project file. Now tap on the share icon in the top right corner and choose VN from the list. The project file will be imported into VN.
  • Finally choose “Open Project”.

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter featuring a personal selection of interesting things that happened in the world of mobile video in the last four weeks.

For an overview of all my blog posts click here.

DISCLOSURE NOTE: This particular post was sponsored by VN. It was, however, researched and written entirely by myself.

#40 A whole new video editing experience on a phone! — 28. February 2021

#40 A whole new video editing experience on a phone!

Let’s be honest: despite the fact that phone screens have become increasingly bigger over the last few years, they are still rather small for doing serious video editing on the go. No doubt, you CAN edit video on your phone and achieve great results, particularly if you are using an app with a touch-friendly UI like KineMaster that was brilliantly designed for phone screens. But I’m confident just about every mobile video editor would appreciate some more screen real estate. Sure, you can use a tablet for editing, but tablets aren’t great devices for shooting, and if you want to do everything on one device, pretty much everyone would choose a phone, right?

While phone makers like Samsung, Huawei and Motorola are currently pioneering devices with foldable screens, those are still extremely expensive (between 1500 and 2000 bucks!) and also have to cope with some teething problems. LG, while not particularly successful in terms of sales figures in the recent past, has proven to be an innovative force in smartphone development for some years now. Not everything they throw at the market sticks, but let’s not forget that, for instance, the now widely popular and extremely useful wide-angle auxiliary lens was first seen on the LG G5 (rear camera) and LG V10 (front camera). I would also hate to be without an amazing manual video mode in a native camera app, something the V10 pioneered.

Instead of making a screen that folds, LG has introduced a series of phones that include (or at least have the option for) a Dual Screen case that adds a second, separate screen – basically making it look as if you were holding two phones next to each other. So the concept is that of a foldable PHONE, not a foldable SCREEN! The actual phone is inserted into the Dual Screen case, with a physical connection (initially pogo pins, then USB-C) establishing communication between the two devices. First came the V50 (April 2019), then the G8X (November 2019) and the V60 (March 2020), with the latest Dual Screen-compatible phone release being the LG Velvet (May 2020). As far as I know, the G8X (which I got new for just over 400€) is the only one of the bunch that comes with the Dual Screen included; for the other phones, the DS is an accessory that can be purchased separately or in a bundle with the phone. It’s important to note that the DS cases are all slightly different (LG refined the design over time) and only work with the phone they were designed for. It probably goes without saying that they don’t work with just any other Android phone – this is proprietary LG hardware.

The user experience of a foldable screen phone like the Samsung Galaxy Fold is quite different from that of the Dual Screen foldable phone approach. While an expanded foldable screen can give you more screen real estate for one app, the DS is primarily designed for multi-tasking with two apps running at the same time, one on the phone’s main screen and one on the Dual Screen. The DS is not really meant to use an app in an expanded view over both screens as there’s obviously a big gap/hinge between the two screens which is quite distracting in most cases. If apps were specifically customized, integrating the gap into their UI, this could be much less of a problem but with LG being a rather small player in the smartphone market, this hasn’t really happened so far. LG seems to have been quite aware of this and so they natively only allow a handful of apps (a bunch of Google apps and the Naver Whale browser) to be run in a wide view mode that spans across both screens.

Now, while having an app run across two separate screens might not make a lot of sense for many apps, there is one type of app that could actually be a perfect fit: video editors. On the desktop, lots of professional video editors (I’m talking about the people doing the editing) use a dual-monitor set-up to have more screen real estate to organize their virtual workspace. One classic use case is having your timeline, media pool etc. on one screen and a big preview window on the second screen. It’s exactly this scenario that can be mimicked on LG’s Dual Screen phones like the G8X – but only with a particular app.

Why only with a particular app? Because the app’s UI needs to fit the Dual Screen in just the right way, and currently the only app that does that is PowerDirector. It’s not a perfect fit (one of the most obvious imperfections is the split playback button) but that’s to be expected since the app has not been optimized in any way for LG’s Dual Screen phones – considering this, it’s truly amazing HOW well PowerDirector’s UI falls into place on the G8X. The joy of having a big preview window on the top screen with the timeline and tool bars having their own space on the bottom screen (using the phone in landscape orientation) can hardly be overstated in my opinion. It really feels like a whole new mobile video editing experience, and an extremely pleasant one for sure!

But wait! Didn’t I mention that LG’s wide view mode is only available for a couple of apps natively? Yes indeed, and that’s why you need a 3rd party helper app that lets you run any app you want in wide mode. It’s called WideMode for LG and can be downloaded for free from the Google Play Store. Once you have installed it, you can add a shortcut to the quick settings (accessible via the swipe-down notification shade) and switch to wide view whenever you want to. The app works really well in general (don’t blame the app maker for the fact that virtually no app has been optimized for this view!). Occasionally, certain navigational actions cause the wide mode to just quit, but most of the time you can pick up the pattern of when that happens. In the case of PowerDirector for instance, you should only activate wide mode once you have opened your project and can see the timeline. If you activate wide view before that and select a project, you will get thrown out of the wide view mode. Also, if you’re done with your editing and want to export the project, tapping the share/export button will quit wide view and push the UI back onto a single screen, but that’s not really problematic in my opinion. Still, I couldn’t help but daydream about how cool the app would be if Cyberlink decided to polish the UI for LG’s Dual Screen phones!

What about other video editing apps? KineMaster’s UI, while extremely good for single-screen phones, is pretty terrible in wide view on the G8X. VN on the other hand works fairly well but can’t quite match PowerDirector. Interestingly, while VN doesn’t (yet) support landscape orientation in general, it actually does work in landscape once you force it across both screens. The biggest annoyance is probably that the preview window is split between the two screens, with the lower quarter on the bottom screen. If you use VN in portrait orientation with wide mode, the preview window is cut in half and so is the timeline area. The UI of CapCut is pretty similar to that of VN, so it’s basically the same story there. Adobe Premiere Rush isn’t even available for any LG phones currently.

So is this the future of video editing on smartphones? Yes and no. LG’s smartphone business has been struggling for a while and recent news from the Korean company indicates they might be looking for an exit strategy, selling off their mobile branch. This also means, however, that you can currently get great deals on powerful LG phones, so if you are on a budget but really intrigued by this opportunity for mobile video editing, it might just be the perfect time. The way PowerDirector’s UI is laid out should also make it great for phones with a foldable screen like the Galaxy Fold series, so if we assume that this type of phone will become more common and affordable in the near future, people doing a lot of video editing on the phone should definitely consider checking this out!

As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.

For an overview of all my blog posts click here.

I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂