Back in February I published a list with a wide selection of (potentially) useful Android apps for media production. Despite the fact that I mostly write for this blog in English now, the list was published in its German version first. I did promise an English version however, and I’ve been working on it ever since. The new English version is not just a translation, it’s actually an update with some apps having been kicked out and others added. And what occasion could be better to finally publish it than at the time MoJoFest is happening in Galway, Ireland? MoJoFest is an exciting 3-day conference (May 29th to 31st) about content creation with mobile devices, initiated and organized by former RTE Innovation Lead Glen Mulcahy. Check out their website and follow the hashtag #MoJoFest on Twitter! I’ll be giving a workshop/presentation about smartphone videography on Android devices on Thursday, May 31st, and as a precursor, I’ll upload the English version of my app list here. Please keep in mind that there might be some typos or even outdated information in it as the mobile world keeps spinning at an incredible pace and things can change quickly. This is also a highly subjective list and by no means “definitive” or “ultimate” – you may find that other apps which are not on the list suit you better for your work. If you think an app you know and love should absolutely be on this list or if you have new information about apps already on the list, please do contact me! But now without much further ado…
When using a headline like the one above, camera people usually refer to the idea that you should already think about the editing when shooting. This basically means two things: a) make sure you get a variety of different shots (wide shot, close-up, medium, special angle etc.) that will allow you to tell a visually interesting story but b) don’t overshoot – don’t take 20 different takes of a shot or record a gazillion hours of footage because it will cost you valuable time to sift through all that footage afterwards. That’s all good advice but in this article I’m actually talking about something different: a way to create a video story with different shots while only using the camera app – no editing software! In a way, this is rather trivial but I’m always surprised how many people don’t know about it, as it can be extremely helpful when things need to go super-fast. And let’s be honest, from mobile journalists to social media content producers, there’s an increasing number of jobs and situations to which this applies…
The feature that makes it possible to already edit a video package within the camera app itself while shooting is the ability to pause and resume a recording. The most common way to record a video clip is to hit the record button and then stop the recording once you’re finished. After stopping the recording, the app will quickly create/save the video clip so it’s available in the gallery / camera roll. Now you might not have noticed this, but many native camera apps have not only a „stop“ button while recording video but also one that will temporarily pause the recording without creating/saving the clip just yet. Instead, you can resume recording another shot into the very same clip you started before, basically creating an edit-on-the-go while shooting with no need to mess around with an editing app afterwards. So for instance, if you’re shooting the exterior of an interesting building, you can take a wide shot from the outside, then pause the recording, go closer, resume recording with a shot of the door, pause again and then go into the building to resume recording with a shot of the interior. When you finally decide to press the „stop“ button, the clip that is saved will already have three different shots in it. The term I would propose for this is „shediting“, obviously a portmanteau of „shooting“ and „editing“. But that’s just a spontaneous thought of mine – you can call this whatever you want of course.
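For those curious what this looks like under the hood: Android’s MediaRecorder API actually gained pause() and resume() methods with Android 7.0, which is what camera apps can build this on. As a rough illustration of the record/pause/resume/stop logic, here’s a toy model in Python – the class and its names are entirely my own for illustration, not any real camera API:

```python
class Recorder:
    """Toy model of a camera app's record/pause/resume/stop states.
    Each resumed take is appended to the same clip ('shediting')."""

    def __init__(self):
        self.state = "idle"
        self.takes = []          # shots accumulated into the current clip
        self.saved_clips = []    # finished clips, each a list of takes

    def record(self, shot):
        assert self.state in ("idle", "paused")
        self.state = "recording"
        self.takes.append(shot)

    def pause(self):
        assert self.state == "recording"
        self.state = "paused"    # note: the clip is NOT saved yet

    def stop(self):
        assert self.state in ("recording", "paused")
        self.saved_clips.append(self.takes)  # only now is the clip written
        self.takes = []
        self.state = "idle"

# Three shots of the building end up in one single saved clip:
r = Recorder()
r.record("wide shot exterior"); r.pause()
r.record("close-up of door"); r.pause()
r.record("interior"); r.stop()
print(r.saved_clips)  # [['wide shot exterior', 'close-up of door', 'interior']]
```

The key point the model captures: nothing is written to storage on pause(), only on stop() – which is exactly what makes shediting possible.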
What camera apps will let you do shediting? On Android, actually most of the native camera apps I have encountered so far. This includes phones from Samsung, LG, Sony, Motorola/Lenovo, Huawei/Honor, HTC, Xiaomi, BQ, Wileyfox and Wiko. The only two Android phone brands that didn’t have this feature in the phone’s native camera app were Nokia (as tested on the Nokia 5) and Nextbit with its Robin. As for 3rd party video recording apps on Android, things are not looking quite as positive. While Open Camera and Footej Camera do allow shediting, many others like Filmic Pro, Cinema FV-5, Cinema 4K, Lumio Cam and ProShot don’t have this feature. When looking at the other mobile platforms, Apple still doesn’t have this feature in the iOS native camera app and the only advanced 3rd party video recording app that will let you do it appears to be MoviePro. And while almost extinct, Lumia phones with Windows 10 Mobile / Windows Phone on the other hand do have this feature in the native camera app just like most Android phones.
Sure, shediting is only useful for certain projects and situations because once you leave the camera app, the clip will be saved anyway with no way to resume, and you can’t edit shots within the clip without heading over to an editing app after all. Still, I think it’s an interesting tool in a smartphone videographer’s kit that one should know about because it can make things easier and faster.
EDIT: After I had published this article I was asked on Twitter if the native camera app re-adjusts or lets you re-adjust focus and exposure after pausing the recording because that would indeed be crucial for its actual usefulness. I did test this with some native camera apps and they all re-adjusted / let you re-adjust focus and exposure in between takes. If you have a different experience, please let me know in the comments!
Xiaomi has been a really big name in China’s smartphone market for years, promising high-end specs and good build quality for a budget price tag – but only at the end of last year did they officially enter the global scene with the Mi A1. The Mi A1 is basically a revamped Mi 5X running stock Android software instead of Xiaomi’s custom Mi UI. It’s also part of Google’s Android One program which means it runs a ‚clean‘ Google version of Android that gets quicker and more frequent updates directly from Google. For a very budget-friendly 180€ (current online price in Europe) you get a slick looking phone with dual rear cameras, featuring a 2x optical zoom telephoto lens alongside the primary camera. Sounds like an incredible deal? Here are some thoughts about the Mi A1 regarding its use as a tool for media production, specifically video.
After spending a couple of days with the Mi A1, I would say that this phone is definitely a very interesting budget-choice for mobile photographers. The fact that you get dual rear cameras (the second one is a 2x optical zoom as mentioned before) at this price point is pretty amazing. The photo quality is quite good in decent lighting conditions (low light is problematic but that can be said of most smartphone cameras), you get a manual mode with advanced controls in the native camera app and the portrait mode feature does a surprisingly good job at creating that fancy Bokeh effect blurring the background to single out your on-screen talent. A lot of bang for the buck. Video – which I’m personally more interested in – is a slightly different story though.
Let’s start with a positive aspect: The Xiaomi Mi A1 lets you record in UHD/4K quality which is still a rarity for a budget phone in this price range. And hey, the footage looks quite good in my opinion, especially considering the fact that it’s coming from a (budget) smartphone. I have uploaded some sample footage on YouTube so see for yourself.
The video bitrate for UHD/4K hovers around 40 Mbps in the native app which is ok for a phone but the audio bitrate is a meager 96 Kbps (same in FHD) – so don’t expect full, rich sound. But this is only the beginning of a couple of disappointments when it comes to video: One of the Mi A1’s promising camera features, the 2x optical zoom lens, CANNOT be used in the video mode, only in the photo mode! What a bummer! This goes for both the native camera app and 3rd party apps.
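To put those numbers into perspective, here’s a quick back-of-the-envelope calculation of what they mean for storage (my own arithmetic, based on the bitrates above):

```python
def mb_per_minute(video_bps, audio_bps):
    """Approximate file size in megabytes for one minute of footage."""
    total_bits = (video_bps + audio_bps) * 60
    return total_bits / 8 / 1_000_000  # bits -> bytes -> MB

# ~40 Mbps video + 96 Kbps audio (the Mi A1's UHD/4K numbers)
size = mb_per_minute(40_000_000, 96_000)
print(round(size, 1))  # ~300.7 MB per minute of 4K footage
```

Also note how little the 96 Kbps audio contributes to the total – bumping it to a much richer 256 Kbps would barely change the file size, so the skimpy audio bitrate really isn’t about saving storage.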
Talking about 3rd party camera apps, it’s also a huge let-down that the Camera2 API support (what is Camera2 API?) is only „Legacy“ out of the box, even though the Mi A1 is part of Google’s Android One program. „Legacy“ means that third party camera apps can’t really tap into the new, more advanced camera controls that Google introduced with Android 5 in 2014, like precise exposure control over ISO and shutter speed. Due to this, you can’t install an app like Filmic Pro in the first place and other advanced camera apps like Cinema FV-5, ProShot, Lumio Cam, Cinema 4K, Footej Camera or Open Camera can’t really unleash their full potential. Interestingly, there seems to be a way to „unlock“ full Camera2 support via a special procedure without permanently rooting your device (look here) but even after doing so, Filmic Pro can’t be installed, probably because the Play Store keeps the device’s original Camera2 support information in its database to check if the app is compatible without actually probing the current state of the phone. This is just an educated guess however. Still, many of us might not feel comfortable messing around with our phones in that way and it’s a pity Xiaomi doesn’t provide this out of the box on the Mi A1.
Lackluster Camera2 API support can be remedied by a good native camera app but unlike with photos, there is no pro or manual mode for videos on the Mi A1 – it’s actually extremely limited. While you can lock the focus by tapping (there are two focus modes, tap-to-focus and continuous auto-focus), you are only able to adjust the auto-exposure within a certain range (EV), not lock it. There’s also no way to influence the white balance. Shooting in a higher frame rate (60fps)? Not possible, not even in 720p (there’s a not-too-bad 720p slow-motion feature though). Apropos frame rates: I noticed that while the regular frame rate is the usual 30fps, the native camera app reduces the fps to 24 (actually 23.98 to be precise) when shooting in low-light conditions to gain a little bit more light for each frame. That’s also the reason why I made two different YouTube videos with sample footage, so I was able to keep the original frame rate of the clips. I have experienced this behaviour of dropping the frame rate in low light in quite a few (native) camera apps on other phones as well and from the standpoint of a run-of-the-mill smartphone user taking video this is actually an acceptable compromise in my opinion (as long as you don’t go below 20fps) to help tackle the fact that most smartphone cameras still aren’t naturally nocturnal creatures. It can however be a problem for more dedicated smartphone videographers who want to edit their footage, as it’s not really good to have clips in one project that differ so much in terms of fps. 3rd party apps might help keep the fps more constant.
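The light gain from that frame rate drop is easy to quantify: the longest a shutter can stay open is one frame duration (1/fps), so going from 30 to 23.98fps buys roughly a quarter more light per frame – about a third of a photographic stop. A quick sketch of the arithmetic (my own, assuming the shutter is maxed out to the full frame duration in both cases):

```python
import math

def extra_stops(fps_high, fps_low):
    """EV (stops) gained by dropping the frame rate, assuming the shutter
    stays open for the full frame duration (1/fps) in both cases."""
    return math.log2(fps_high / fps_low)

gain = extra_stops(30, 23.98)
print(round(gain, 2))  # ~0.32 stops more light per frame
```

A third of a stop isn’t dramatic, but in near-darkness every bit helps – which explains why so many native camera apps quietly make this trade.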
And there are still two other big reasons to use a 3rd party app on the Mi A1 despite the lack of proper Camera2 API support: locking exposure and using an external microphone via the headphone jack (yes, there is one!). One more important shortcoming to talk about: It’s maybe not too surprising that there is no optical image stabilization (OIS) on a phone in this price range but given the fact that you can shoot 4K, I would have expected electronic image stabilization (EIS) at least when shooting in 1080p resolution. But there’s no EIS in 1080p, which means that you should put the phone on a tripod or use a gimbal most of the time to avoid getting shaky footage. With a bit of practice, however, you might pull off a decent handheld pan or tilt to avoid having only static shots.
So I’ve talked about the video capturing part, but what about editing video on the Mi A1? The phone sports a Snapdragon 625 which is a slightly dated but still quite capable mid-range chipset from Qualcomm. You can work with up to two layers (a total of three video tracks) of FHD video in KineMaster and PowerDirector (the two most advanced Android video editing apps) which will suffice for most users. Important note: DON’T run the hardware analysis test in KineMaster though! It’s a hardware probing procedure meant to better determine the device’s capabilities in terms of editing video in the app. While the device capability information originally says you can have two QHD (1440p) video layers, it will downgrade you to two 720p (!) layers after running the analysis – quite strange. Don’t worry though if your evil twin grabs your phone and runs the test anyway – you just have to uninstall and then reinstall KineMaster to get back to the original setting. I ran some quick tests with FHD 1080p layers and it worked fine, so just leave everything as is. Since the phone can shoot in UHD/4K resolution you might ask if you can edit this footage on the device. While you can’t edit 4K in KineMaster on the Mi A1 at all (when trying to import 4K footage the app will offer to import a transcoded QHD version of the clip to work with), you can import and work with UHD/4K in PowerDirector, but only as a single video track – layers are not possible.
So let’s wrap this up: Xiaomi’s first internationally available phone is a great budget option for mobile photographers but the video recording department is let down by a couple of things, making other options in this price range more appealing to the smartphone videographer if advanced manual controls and certain pro apps are of importance. As I pointed out though, it’s not all bad: It’s still hard to find a phone at that price that offers UHD/4K video recording – and the footage even looks pretty good in decent lighting conditions. So if you happen to have a Mi A1 – there’s no reason at all not to create cool video content with it – and if you pull off a nice video package you can be even prouder than someone with a flagship phone! 😉
Last year marked the return of one of THE big pioneers in the history of mobile phones to the smartphone market: Nokia. It’s not really the same company from the days of feature and Windows phones anymore (a company named HMD Global has licensed the brand name for their phones) but that doesn’t mean we should just ignore it. After launching a bunch of affordable entry-level and lower end mid-range devices (Nokia 3, 5 & 6), the Nokia 8 was the first quasi-flagship phone following the brand’s reboot.
One special feature of the Nokia 8 was something the company called the „Bothie“ for marketing purposes, obviously trying to convince people that a new flavour of the all-too-common „selfie“ is in town. A „Bothie“ is a split-screened snapshot that is taken with both front and rear cameras AT THE SAME TIME, giving you two different perspectives of the very same moment. For instance the image of a person looking at something AND the image of the scenery the person is looking at. What’s more: this mode not only works for photo but also for video, meaning you can record a split-screened video with both front and rear cameras simultaneously. It turns out however that Nokia actually wasn’t the first company to include such a feature in a smartphone: Samsung led the way as early as 2013 with the Galaxy S4, HTC (One M8) followed in 2014 and LG (V10) in 2015 – of course they all use different names for the same feature, so we can get jolly confused when talking about it!
Before giving a brief overview on how these modes have been implemented by each manufacturer, you might ask how such a feature can be useful for a more professional video production context. I’d say there are two main use cases for which this mode could be a great asset: piece-to-camera reporting and vlogging – obviously those two areas can heavily intersect. Imagine a mobile journalist reporting from an event, let’s say a protest rally – it’s much more interesting for the audience to see both the reporter elaborating on what’s happening and the rally itself instead of just one or the other. Traditionally one would have to have two separate cameras (or take different shots successively) and edit in post-production to achieve the same but thanks to today’s smartphones having HD video capable cameras on the front and the back, this can be done a lot easier and faster.
Samsung’s „Dual Camera“
As far as my research showed, Samsung was the first phone maker to introduce a dual video recording feature, with the Galaxy S4 in April of 2013. This mode has been on all following Samsung flagships of the S- and Note-series so far but you might have to download it as a sort of „plug-in“ from within the native camera app (there’s a „+“ button to add more camera modes). Samsung’s take is a picture-in-picture approach, not a traditional split-screen where both parts have exactly the same size (there IS a split-screen option but it’s barely useful with two 16:9 images side-by-side – extreme letter-boxing). With Samsung’s „Dual Camera“, one image is always the main image while the secondary image from the other camera is embedded into it. You can resize the picture-in-picture though and move it around within the main image – you can also swap between cameras during the recording. The recorded video file can have a resolution of up to 1080p with a traditional aspect ratio of 16:9 or 9:16. One very cool thing about Samsung’s native camera app is that unlike most other Android phones’ native camera apps it supports the use of external mics via the 3.5mm headphone jack or USB port, which is a tremendous advantage for getting professional-grade audio. One catch: You can only record up to 5 minutes in a single clip.
HTC’s „Split Capture“ (discontinued)
HTC followed Samsung with a similar but slightly different feature (officially called “Split Capture”) on the HTC One M8, launched in March 2014. The recorded video was an equally sized left/right split-screen 1080p video with a 16:9/9:16 aspect ratio. HTC subsequently featured this mode in other phones like the Desire Eye and the One M9 but apparently ditched it after the M9 as the HTC 10 and more recent flagships like the U Ultra or U11 don’t seem to have it anymore.
LG’s „Snap Movie“ / „Match Shot“
In 2015 LG redefined what a native camera app on a smartphone can deliver in terms of pro video controls with the release of the LG V10. But not only did the V10 have a unique manual video mode, the app also boasted some more playful features. Among them was a mode called „Snap Movie“ which basically invites you to create a short movie (maximum of one minute) out of short, different shots without having to muck around with an editing app. „Snap Movie“ is not a dual camera mode per se but one way of recording within this mode is to use a split-screen for simultaneously recording with both front and rear cameras. The image is recorded in 1080p with 16:9 or 9:16 aspect ratio. Big catch: You can only do so for a maximum of one minute! Flash forward to 2017 and the V30: While the „Snap Movie“ mode is gone (there’s something called „Snap Shot“ but that’s a completely different thing), there’s now a „Match Shot“ mode. With „Match Shot“ you can record a split-screen image using both front and rear cameras at the same time. You also have the option to select between regular and wide-angle lenses before starting the recording although the front camera actually only has one lens so it’s most likely a software crop. Two good things about the new mode: You are not limited to only one minute anymore and there’s also support for external mics. The recording format is a bit strange though as it’s an 18:9 or 9:18 aspect ratio with a resolution of 2880×1440 (you can’t change the resolution at all).
The beyond-FHD resolution is great but the rather non-standard aspect ratio (probably thanks to the 18:9 display of the phone) is a bit annoying for watching it on anything other than the phone itself because the image will either get letter-boxed on certain platforms like YouTube (I guess it’s not that much of a problem for Twitter and Facebook as they are more flexible with aspect ratios) or you will have to perform a crop in a video editing app and re-export in a more common 16:9 ratio. Hopefully LG will fine-tune this mode in the future, it would be nice to record in a standard 16:9 aspect ratio.
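In case you do want to crop that 2880×1440 „Match Shot“ footage down to a standard ratio in an editing app, the math is straightforward: keep the full height and trim the width. The little helper below is just my own sketch of the calculation:

```python
def crop_to_16_9(width, height):
    """Center-crop a wider-than-16:9 frame down to 16:9, keeping full height.
    Returns the new width, the (unchanged) height and the trimmed pixels."""
    target_w = height * 16 // 9
    trim = width - target_w
    return target_w, height, trim

w, h, trimmed = crop_to_16_9(2880, 1440)
print(w, h, trimmed)  # 2560 1440 320 -> a clean QHD 2560x1440 16:9 frame
```

So you conveniently land on QHD (2560×1440), still well above FHD – you only sacrifice 160 pixels on each side of the frame.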
Nokia’s „Dual Sight“
As has become quite clear, Nokia’s „Bothie“ feature introduced with the Nokia 8 last year is actually anything but new – HMD Global just made it an integral part of their marketing campaign for the device, unlike the makers before them. The mode’s proper name is „Dual Sight“ and it’s pretty much like the one HTC had, meaning it’s an equally sized split-screen image in 16:9 or 9:16 aspect ratio with a resolution of 1080p. The Nokia 8 however DOES have one new trick up its sleeve: live streaming integration! You can use the „Dual Sight“ feature not only for recording but also for live streaming video on Facebook and YouTube (not sure about Periscope) which can come in really handy for journalists and live vloggers. One probable shortcoming of this mode on the Nokia 8: while I’m not able to test it myself, I think it’s pretty safe to say that Nokia’s native camera app doesn’t have support for external mics (the Nokia 5 definitely doesn’t). If you do own a Nokia 8, please let me know if my assumption is correct. Last bit of info: The recently launched Nokia 8 Sirocco and Nokia 7 Plus also seem to have the „Dual Sight“ mode.
All mentioned instances of dual recording modes have two things in common: You don’t get (as much) manual control over the image as you would in a ‚regular‘ video – it mostly runs on auto. And: the feature only works in the native camera app of the phone; third party developers can’t access the API to include this functionality in their own apps. There are a bunch of apps on the Play Store that claim to do something similar but they only take pictures/videos successively with both cameras, not at the same time.
What do you think? Is it a useful feature? Do you know of any other phones that have a dual recording mode? Let me know in the comments below or hit me up on Twitter @smartfilming!
Back in 2016 Google made an iOS-exclusive app (weird, ain’t it?!) called Motion Stills. It focused on working with Apple’s newly introduced ‘Live Photos’ for the iPhone 6s. When you shoot a ‘Live Photo’, 1.5 seconds of video (with a low frame rate, mind you) and audio before and after pressing the shutter button are recorded. You can think of it as a GIF with sound. What Motion Stills does is let you record, stabilize, loop, speed up and/or combine ‘Live Photos’. In 2017, Google finally brought the app to Android. Now while some Android phone makers have introduced ‘Live Photo’-like equivalents, there’s no general Android equivalent as such yet and because of that the app works slightly differently on Android. Instead of ‘Live Photos’ you can shoot video clips with a maximum duration of 3 seconds (this also goes for pre-6s iPhones on iOS). There are also other shooting modes (Fast Forward, AR Mode) that are not limited to the 3 seconds but for this post I want to concentrate on the main mode, Motion Still.
When I first looked at the app, I didn’t really find it very useful. Recording 3-second-clips in a weird vertical format of 1080×1440 (720×960 on iOS)? A revamped Vine without the attached community? Some days later however I realized that Motion Stills actually could be an interesting and easy-to-use visual micro-storytelling tool, especially for teaching core elements of visual storytelling. The main reasons why I think it’s useful are:
a) it’s a single app for both shooting and editing (and it’s free!)
b) the process of adding clips to a storyboard is super-easy and intuitive and
c) being forced to shoot a maximum of 3 seconds per clip lets you concentrate on the essentials of a shot
So here’s a quick run-through of a possible scenario of how one might use the app for a quick story or say story-teaser: When covering a certain topic / location / object etc. you take a bunch of different 3-second-shots with Motion Stills (wide shot, close-up, detail etc. – 5-shot-rule anyone?) by pressing the record button. It might be good to include some sort of motion into at least some shots, either by shooting something where you already have motion because people or objects are moving or by moving the smartphone camera itself (‚dolly‘ shot, pan, tilt) when there is no intrinsic motion. Otherwise it might look a little bit too much like a stills slide show. Don’t worry too much about stabilization because Motion Stills automatically applies a stabilization effect afterwards and even without that, you might just be able to pull off a fairly stable shot for three seconds. After you have taken a bunch of shots, head over to the app’s internal gallery (bottom left corner on Android, swipe up on iOS) where all your recordings are saved and browse through the clips (they auto-play). If you tap a clip you can edit it in a couple of ways: You can turn off stabilization, mute the clip, apply a back-and-forth loop effect or speed it up. On iOS, you can also apply a motion tracking title (hope the Android version will get this feature soon as well!) What you can’t do is trim the clip. But you actually don’t have to go into edit mode at all if you’re happy with your clips as they are, you can create your story right from the gallery. And here’s the cool thing about that: Evoking a shade of Tinder, you can quickly add a clip to your project storyboard (which will appear at the bottom) by swiping a clip to the right or delete a clip from the gallery by swiping it to the left. If you want to rearrange clips in the storyboard, just long-press them and move them to the left or the right. 
If you want to delete a clip from the storyboard, long-press and drag it towards the center of the screen, a remove option will appear. In a certain way Google’s Motion Stills could be compared to Apple’s really good and more feature-rich Apple Clips app when it comes to creating a micro-story on the go really fast with a single app – but Apple Clips is – of course – only available for iOS. When you are finished putting together your micro-story in Motion Stills, you can play it back by tapping the play button and save/share it by tapping the share button. Once you get the hang of it, this is truly fast and intuitive – you can assemble a series of shots in no time.
That being said, there are a couple of limitations and shortcomings that shouldn’t be swept under the rug. Obviously, thanks to the 3-second limit per clip, the app isn’t really useful for interviewing people or any other kind of monologue/dialogue scenario. You might fit in some one-liners or exclamations but that’s about it. It’s also a bit unfortunate that the app doesn’t apply some kind of automatic audio transition between the clips. If you listen to the end result with the sound on, you will often notice rather unpleasant jumps/cracks in the audio at the edit points. While you could argue that because of the format the content will only be used for social media purposes, where people often just watch stuff without sound and won’t care much about the audio anyway, I still think this should be an added feature. But let’s get back to the format: While you have the option to export as a GIF if you are only exporting one clip, the end result of a series of clips (which is the use case I’m focusing on here) is an mp4 (mov on iOS) video file with the rather awkward resolution of 1080 by 1440 (Android) or 720 by 960 (iOS) – a 3:4 aspect ratio. This means that it will only be useful for social media platforms but hey, why ‚only‘, isn’t social media everything these days?! Another thing that might be regarded as a shortcoming (or not) is the fact that (at least on Android) you are pretty much boxed in with the app. You can’t import stuff and clips also don’t auto-save to the OS’s general gallery (you will have to export clips manually for that). But is that such a bad thing? I don’t think so, because a good part of the fun is doing everything with a single app: shooting, editing, exporting/publishing. So let’s finish with an actual shortcoming: While the app is available for Android, it’s not compatible with certain devices – mostly low-end devices / mid-rangers with rather weak chipsets.
And even if you can install it, some not-so-powerful devices like the Nokia 5 or Honor 6A (both rocking a Snapdragon 430) tend to struggle with the app when performing certain tasks. This doesn’t mean the app always runs 100% stable on flagships – I also ran into the occasional glitch while using it on a Samsung Galaxy S7 and an iPhone 6. Still, the app is free, so at least check it out – it can really be a lot of fun and useful for doing/learning visual (micro) storytelling! Download it on Google Play (Android devices) or the Apple App Store (iOS devices).
P.S.: Note that you can only work on one project at a time and don’t clear the app from your recent apps or wipe its cache before finishing/exporting it – otherwise the project (not the recorded clips) will be lost!
P.P.S.: Turn off the watermark in the settings!
So some time ago I made a blog post about the topic of Camera2 API on Android devices and why it is important if you are interested in doing more advanced videography on your smartphone. If you don’t have a clue about what Camera2 API is, please check out my previous article before continuing to read this. One of the things that my previous article suggested was that you need a device with „Full“ or „Level 3“ Camera2 API support built into the Android OS by the manufacturer of the phone to take advantage of pro video recording apps. If your device has only „Legacy“ or „Limited“ Camera2 API support then you are not able to even install an app like Filmic Pro. However, after recently getting an Honor 6A into my hands, I need to differentiate and clarify some things.
The Honor 6A is a budget device from Huawei’s sub-brand Honor and shows a „Limited“ Camera2 API support level when testing with a probing app like Camera2 probe. I do own a fair amount of different smartphones to test with, but I realized that before getting the Honor 6A, I had only had phones with either a „Legacy“ support level or „Full“/„Level 3“, none with the in-between „Limited“. And while the „Limited“ status does mean that you can’t install Filmic Pro at all and can’t activate the pro mode in Lumio Cam, other pro video recording apps are not that picky and salvage what the „Limited“ support level gives them over „Legacy“ (the worst Camera2 API support level out of the four mentioned) instead of blocking the installation altogether. The other pro video recording apps I am talking about are Open Camera, ProShot, Footej Camera and Vimojo. If your device has „Limited“ Camera2 API support that includes manual exposure etc., you will be able to use these features (for instance control of ISO and shutter speed) in the aforementioned apps. Please note that when using ProShot, Footej Camera or Vimojo, Camera2 API is automatically activated, whereas with Open Camera you will have to go into the settings, activate the usage of Camera2 API and restart the app.
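To make the hierarchy explicit: the four support levels form a ladder from least to most capable, and each app decides what to do based on where a device sits on it. Here’s a small sketch of that gating logic – the level order follows Google’s documentation, but the helper function is my own toy model, not a real Android API:

```python
# Camera2 hardware levels, ordered from least to most capable.
# (This explicit ordering matters: the raw Android constant values
# are NOT in capability order, so real apps need a ranking too.)
LEVELS = ["LEGACY", "LIMITED", "FULL", "LEVEL_3"]

def at_least(device_level, required_level):
    """True if the device's Camera2 support level meets the app's minimum."""
    return LEVELS.index(device_level) >= LEVELS.index(required_level)

# Filmic Pro-style gating (requires at least FULL) vs. an Open Camera-style
# app that makes opportunistic use of whatever LIMITED happens to offer:
print(at_least("LIMITED", "FULL"))     # False -> app refuses to install/run
print(at_least("LIMITED", "LIMITED"))  # True  -> manual ISO/shutter may work
```

In reality an app should additionally query individual capabilities (such as manual sensor control), since „Limited“ devices only have to support a subset of features – which is exactly why some apps can salvage something on them while others refuse to run at all.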
Anyway, this is very good news for all those rocking an Android device that has only „Limited“ Camera2 API support: Prominent examples of such devices would be fairly recent Huawei phones including their flagship P-series (starting with the P9 / P9 Lite & newer, same should go for their Honor sub-brand) and fairly recent iterations of Samsung’s mid-range A-series (A3, A5, A7) – possibly also the entry-level J-series (J3, J5, J7). You still can’t use FilmicPro on these devices, but other pro video recording apps come to the rescue and do give you more advanced controls.
P.S.: These findings are also of relevance to owners of Sony phones. As I explained in my first blog post about Camera2 API, FilmicPro has (still) blacklisted Sony phones (even those with „Full“ or „Level 3“ support level) because of severe problems that they encountered during testing.
One of the first steps when getting more serious about producing video content with a smartphone is to look at the more advanced video recording apps from 3rd party developers. Popular favorites like „FilmicPro“ (available for both Android and iOS) usually offer far more image composition controls, recording options and helpful pro features of the kind you find on dedicated video cameras than the native stock camera app provided by the maker of the smartphone does. While quite a few stock camera apps now actually have fairly advanced manual controls for shooting photos (the ability to set ISO and shutter speed might be the most prominent example), the video mode unfortunately and frustratingly is still almost always neglected, leaving the eager user with a bare minimum of controls and options. In 2015 however, LG introduced a game changer in this regard: the V10. For the first time in smartphone history, a phone maker (also) focused on a full-featured video recording mode: it included, among other things, the ability to set ISO and shutter speed, lock exposure, pull focus seamlessly, check audio levels via an audio level meter, adjust audio gain, set microphone directionality, use external microphones, alter the bit rate etc. etc. Sure, for certain users there were still some things missing that you could find in 3rd party apps, like the option to change the frame rate to 25fps if you’re delivering for a PAL broadcast, but that’s only for a very specific use case – in general, this move by LG was groundbreaking and a bold and important statement for video production on a smartphone. But what about other phone makers? How good are their native camera apps when it comes to advanced options and controls for recording video? Can they compete with dedicated 3rd party apps?
First off, let me tell you why in most cases, you DO want to have a 3rd party app for recording video (at least if you have an Android phone): external microphones. With the exception of LG, Samsung (and I’m told OnePlus) in their recent flagship lines (plus Apple in general), no stock camera app I have come across supports the use of external microphones when recording video. Having good audio in a video is really important in most cases and external microphones (connected via headphone jack, microUSB, USB-C or Lightning connector) can be a big help in achieving that goal.
So why would you use a stock camera app over a dedicated 3rd party app at all? Familiarity. I guess many of us use the native camera app of a smartphone when snapping casual, everyday photos and maybe also videos in non-professional situations. So why not build on that familiarity? Simplicity. The default UI of most native camera apps is pretty straight-forward and simple. Some might prefer this to the more complex UI featured in more advanced 3rd party apps. Affordability. You don’t have to spend a single extra penny. I’m generally an avid advocate of supporting excellent 3rd party app developers by paying for their apps, but others might not want to invest. The most important reason in my opinion however is: Stability/Reliability. This might not be true for every stock camera app on every phone (I think especially owners of Sony phones and lately the Essential Phone could beg to differ) but because the app was developed by the maker of the phone and is usually less complex than 3rd party apps, chances are good that it will run more stably and be less prone to (compatibility) bugs, especially considering the plethora of Android devices out there. The V10’s stock camera app, despite being rather complex, is rock-solid and hasn’t crashed on me once in almost 2 years now.
Over the last months I have taken a closer look at a whole lot of stock camera apps on smartphones from LG, Samsung, Apple, Huawei, Sony, Motorola/Lenovo, Nokia (both their older Windows Phone / Windows Mobile offerings AND their new Android handsets), HTC, Nextbit, BQ, Wiko and Google/Nexus. It goes without saying that I wasn’t able to inspect the stock camera apps on all the different phone models of a manufacturer. This is important to mention because some phone makers give their flagship models a more advanced camera app than their budget devices, while others offer the same native camera app across all (or at least most) of their device portfolio. Also, features might be added on newer models. So keep in mind, all I want to do is give a rough overview from my perspective and offer some thoughts on which phone makers are paying more attention to pro features in the video recording department.
The lowest common denominator for recording video in a stock camera app on a smartphone at the moment is that you will have a button to start recording in full-auto mode with a resolution of 1920×1080 (1080p) (1280×720 on some entry level or older devices) at a frame rate of 30fps. „full-auto“ basically means that exposure, focus and white balance (color temperature) will be set and adjusted automatically by the app depending on the situation and the algorithm / image processing routine. While this might sound like a convenient and good idea in general to get things done without much hassle, the auto-mode will not always produce the desired results because it’s not „smart“ enough to judge what’s important for you in the shot and therefore doesn’t get exposure, focus and/or white balance right. It might also change these parameters while recording when you don’t want them to, like for instance when you are panning the camera. Therefore one of the crucial features to get more control over the image is the ability to adjust and lock exposure, focus and white balance because if these parameters shift (too wildly/abruptly/randomly) while recording, it makes the video look amateurish. So let’s have a look at a couple of stock camera apps.
I’ve been spending quite some time in the last months doing research on what device could qualify as the cheapest budget Android phone that still has certain relevant pro specs for doing mobile video. While it might be up for discussion which specs are the most important (depending on who you ask), I have defined the following for my purposes: 1) decent camera that can record at least in FHD/1080p resolution, 2) proper Camera2 API support to run pro camera apps with manual controls like Filmic Pro (check out my last post about what Camera2 API is), 3) powerful enough chipset that allows the use of video layers in pro video editing apps like KineMaster and PowerDirector, 4) support for external microphones (preferably featuring a headphone jack as long as there are no good all-wireless solutions available).
The greatest obstacle in this turned out to be No. 2 on the list, proper Camera2 API support. Apart from Google’s (abandoned?) Nexus line which also includes a budget option with the Nexus 5X (currently retailing for around 250€), phone makers (so far) have only equipped their flagship phones with adequate Camera2 API support (meaning the hardware support level is either ‘Full’ or ‘Level 3’) while mid-range and entry-level devices are left behind.
Recently, I happened to come across a rather exotic Android phone, the Nextbit Robin. The Nextbit Robin is a crowdfunded phone that came out last year. Its most notable special feature was the included 100GB of cloud storage on top of the 32GB internal storage. While the crowdfunding campaign itself was successful and the phone was actually released, regular sales apparently have been somewhat underwhelming, as the phone’s price has dropped significantly. Originally selling for a mid-range price of 399$, it can now be snagged for around 150€ online (Amazon US even has it for 129$). As far as I know, it is now the cheapest Android device that checks all the aforementioned boxes regarding pro video features, INCLUDING full Camera2 API support! Sure, it has some shortcomings like mediocre battery life (the battery is also non-replaceable – but that’s unfortunately all too common these days) and the lack of a microSD storage option (which would have been more useful than the cloud thing). It also gets warm relatively quickly and it’s not the most rugged phone out there. But it does have a lot going for it otherwise: The camera appears to be reasonably good (of course not in the same league as the ones from Samsung’s or LG’s latest flagships), it even records video in UHD/4K – though it’s no low light champion. The Robin’s chipset is the Snapdragon 808, which has aged a bit, but in combination with 3GB of RAM it’s still a quite capable representative of Qualcomm’s 800-series and powerful enough to handle FHD video layers in editing apps like KineMaster and PowerDirector, which is essential if you want to do any kind of a/b-roll editing on your video project. It also features a 3.5mm headphone jack which makes it easy to use external microphones when recording video with apps that support external mics.
The most surprising thing however is that Nextbit implemented full Camera2 API support in its version of Android which means it can run Filmic Pro (quite well, too, from what I can tell so far!) and other advanced video recording apps like Lumio Cam and Cinema 4K with full manual controls like focus, shutter speed & ISO. One more thing: The Robin’s Android version is pretty much as up-to-date as it gets: While it has Android 6 Marshmallow out of the box, you can upgrade to 7.1.1 Nougat (the latest version is 7.1.2).
So should you buy it? If you don’t mind shelling out big bucks for one of the latest Android flagship phones and you really want the best camera and fastest chipset currently available, then maybe no. But if you are looking for an incredible deal that gives you a phone with a solid camera and a whole bunch of pro video specs at a super-low price, then look no further – you won’t find that kind of package for less at the moment.
This blog post is trying to shed some light on one of Android’s fragmentation corners – one that’s mainly relevant for people interested in more advanced photography and videography apps to take manual control over their image composition.
First off, I have to say that I’m not a coder / software expert at all so this comes from a layman’s point of view and I will – for obvious reasons – not dig too deep into the more technical aspects underneath the surface.
Now, what is an API? API stands for „application programming interface“. An operating system uses APIs to give (third party) developers tools and access to certain parts of the system to use them for their application. In reverse, this means that the maker of the operating system can also restrict access to certain parts of the system. To quote from Wikipedia: „In general terms, it is a set of clearly defined methods of communication between various software components. A good API makes it easier to develop a computer program by providing all the building blocks, which are then put together by the programmer.“ Now you know it.
Up to version 4.4 (KitKat) of Android, the standard API to access the camera functionality embedded in the OS was very limited. With version 5 (Lollipop), Google introduced the so-called Camera2 API to give camera app developers better access to more advanced controls of the camera, like manual exposure (ISO, shutter speed), focus, RAW capture etc. While the phone makers themselves are not necessarily fully dependent on Google’s new API, because they can customize their own version of the Android OS, third party app developers are to a large extent – they can only work with the tools they are given.
So does every Android device running Lollipop have the new Camera2 API? Yes and no. While Camera2 API has been the standard camera API since Android Lollipop, there are different levels of implementation of this API which vary between phone makers and devices. There are four levels of Camera2 implementation: Legacy, Limited, Full and Level 3. ‚Legacy‘ means that only the features of the old Camera1 API are available, ‚Limited‘ means that some features of the new API are available, ‚Full‘ means that all basic new features of Camera2 are available, and ‚Level 3‘ adds some bonus features like RAW capture on top of that.
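For illustration, here’s a small Python sketch of how such a support-level check works. The numeric values mirror Android’s real CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_* constants – note that they are not ordered by capability, since ‚Legacy‘, the weakest level, has the value 2 – but the helper function and its name are my own:

```python
# Values of CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_* in Android.
# Note they are NOT sorted by capability: LEGACY (the weakest) is 2, LIMITED is 0.
LIMITED, FULL, LEGACY, LEVEL_3 = 0, 1, 2, 3

# Capability ranking from weakest to strongest support level.
CAPABILITY_ORDER = [LEGACY, LIMITED, FULL, LEVEL_3]

def is_at_least(device_level: int, required_level: int) -> bool:
    """True if the device's hardware level meets the required level."""
    return CAPABILITY_ORDER.index(device_level) >= CAPABILITY_ORDER.index(required_level)

# A Filmic-Pro-style gate: require at least 'Full' support.
print(is_at_least(LEVEL_3, FULL))   # a 'Level 3' phone qualifies -> True
print(is_at_least(LIMITED, FULL))   # a 'Limited' phone does not -> False
```

This is also why an app like Open Camera can still make use of a ‚Limited‘ device: instead of gating the whole installation on ‚Full‘, it can check for individual capabilities and enable only what the device actually offers.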
Depending on the level of implementation, you can use those features in advanced image capturing apps – or not. An app like Filmic Pro can only be installed if the Camera2 support level is at least ‚Full‘ – otherwise you can only install the less feature-packed Filmic Plus. Lumio Cam on the other hand can be installed on most devices but you can only activate the pro mode with manual exposure and focus if the support level is at least ‚Full‘ again. So if you’re interested in using advanced third party apps for capturing photos or recording video with manual exposure controls etc. you want to have a device that at least has ‚Full‘ Camera2 API support.
But what devices have ‚Full‘ Camera2 support? Currently there are two main categories: Google hardware (phones) and (many/most) flagship phones that were released after Android Lollipop came out. Actually, it seems that the latter really only got going with Android 6 Marshmallow (I guess phone makers needed some time to figure out what this was all about ;)) It doesn’t come as a surprise that Google gives their own devices full support (Nexus & Pixel lines). That means even an almost ancient, pre-Lollipop device like the original Nexus 5 has received full support in the meantime (via OS update). Of course all Nexus phones after that (Nexus 6, Nexus 5X, Nexus 6P) are included and it goes without saying Google’s Pixel phones as well.
Now let’s head over to other smartphone manufacturers (so-called OEMs, Original Equipment Manufacturers) like Samsung, LG, HTC, Huawei, Sony, Lenovo/Motorola, OnePlus etc. Many of them offer at least the crucial ‚Full‘ support level on their flagships that came out with Android 6 Marshmallow installed, some already on the ones that came out with Android 5 Lollipop: Samsung with its S-series (S6, S6 Edge, S6 Edge Plus via update, S7, S7 Edge etc.), LG with its G-series (starting with the G4) and V-series (starting with the V10), HTC (starting with the HTC 10), Lenovo/Motorola (starting with the Moto Z), OnePlus (starting with the OnePlus 3/3T), and Sony (starting with the Xperia Z5 via update as far as I know). Sony however is a special case: Their Xperia series has been blacklisted by the developers of FilmicPro/Plus because of major issues that occurred with their devices – you can’t install their apps on a Sony phone at the moment. On the other hand, there are also a few major smartphone OEMs that have yet to offer full Camera2 support for their flagships, the most prominent black sheep being Huawei with its P & Mate series – even the brand new Huawei P10 with all its camera prowess has only limited support. The same goes – unsurprisingly – for Huawei’s budget brand Honor. Other OEMs that don’t offer full Camera2 support in their flagships include Asus (Zenfone 3) and Blackberry (KeyOne). Let’s hope that they will soon add this support and let’s also hope that proper support trickles down to the mid-range and maybe even entry-level phones of the Android universe.
Are you curious what Camera2 support level your phone has? You can use free apps from the Google Play Store, such as „Camera2 probe“, to test the level of Camera2 implementation on your device.
You can also find a (naturally incomplete) list of Android devices and their level of Camera2 API support here, created and maintained by the developer of the app „Camera2 probe“:
If you have a device that is not listed, you can help expanding the list by sending your device’s results (no personal data though) to the developer (there’s a special button at the bottom of the app).
For more in-depth information about Camera2 API, check out these sources:
Cameras that can produce spherical 360 video are becoming more affordable and widespread these days, slowly making their way into the mainstream. The recently released Android-smartphone-specific Insta360 Air clip-on camera has joined a bunch of other entry-level 360 cams like the Ricoh Theta S, the LG 360 Cam and Samsung’s Gear 360 to make this exciting new world of immersive visuals available to the crowd, while more avant-garde 360 aficionados get their fix with a GoPro Omni rig or Nokia’s 40,000€ Ozo. High-end 360 video solutions are still meant to be post-produced on a desktop machine, but the consumer variants are closely tied to mobile devices already. The Insta360 Air connects to the microUSB or USB-C port of an Android phone and records the footage directly to the device. The other three aforementioned entry-level 360 cams can – unlike the Insta360 Air – also be used as standalone cameras without a (physical) connection to the phone, but they all have companion apps that will help you get the best shooting experience and control via a wireless connection. Furthermore, they make it very easy to transfer the footage directly from the camera to the phone for instant sharing or editing. YouTube and Facebook are the two big social networks that already support interactive 360 videos natively; Vimeo has recently added this feature as well. But before sharing, it’s very likely you want to perform some edits on your footage or combine a couple of clips to tell a story. This brings us to the topic of how you can edit 360 video directly on your Android device.
Oh wait! Just hold your horses for a second! Before actually tackling the editing options I think it’s helpful to address two subjects first to better understand the idiosyncrasies of dealing with 360 video: stitching and metadata.
(Consumer) Camera technology is not (yet) at the point – at least as far as my knowledge goes – where you can record a spherical 360 image with only a single lens. To achieve a spherical 360 image, at least two lenses are used. These two lenses will give you two images which can be stored in a single file or in different files. Either way, to get one single image ready for spherical display (in the so-called “equirectangular” format) the two images need to be “stitched” together. The stitching can be done automatically by a software algorithm or it can be done manually in a specific editing program. When using a consumer 360 camera you will not have to bother with manual stitching as long as you transfer the files to the camera’s companion app which does the stitching for you automatically. You will only encounter “raw” un-stitched files if you pull the recorded files directly off the SD card without transferring them to the app first for stitching. Here are two screen grabs, one is un-stitched footage from the Gear 360, the other stitched footage from the Insta360 Air.
Important: Only stitched footage in equirectangular format will be displayed correctly as an interactive 360 video when you upload the file to YouTube or Facebook.
The other important thing needed to have the video displayed correctly as an interactive 360 video in a dedicated player is metadata. It’s data embedded in the video file that will “tell” the player that the file is a 360 video. I’ve used the term “interactive” repeatedly – what do I mean by it? It means that you can interactively change the perspective in the video, either by dragging your finger around the screen or by panning/tilting your device (making use of the phone’s gyroscope). If there’s no metadata in the file, the player will just display a flat, equirectangular video that you can’t interact with. And hallelujah, this finally sends us off to our actual topic – editing 360 video on Android – because, depending on which editing app you choose, the exported video either still has the 360 metadata in it – or not (in which case it will have to be re-injected).
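To make this less abstract: the metadata in question is, for the common “Spherical Video V1” format, a small piece of XML embedded inside the MP4 file’s video track. The Python sketch below builds such an XML snippet following Google’s public Spatial Media spec (the tag names and namespace are from that spec; the helper function and its name are my own, and a real injector also has to wrap this XML into the right MP4 box, which I’m skipping here):

```python
# Sketch of the "Spherical Video V1" XML that 360 metadata injectors embed
# into an MP4 file. Tag names and namespace follow Google's Spatial Media
# spec; actually writing it into the MP4 box structure is omitted.
SPHERICAL_XML = (
    '<?xml version="1.0"?>'
    '<rdf:SphericalVideo '
    'xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" '
    'xmlns:GSpherical="http://ns.google.com/videos/1.0/spherical/">'
    '<GSpherical:Spherical>true</GSpherical:Spherical>'
    '<GSpherical:Stitched>true</GSpherical:Stitched>'
    '<GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>'
    '</rdf:SphericalVideo>'
)

def looks_like_360(xml: str) -> bool:
    """Crude check a player could make: is the spherical flag present?"""
    return '<GSpherical:Spherical>true</GSpherical:Spherical>' in xml

print(looks_like_360(SPHERICAL_XML))  # True
```

When a player like YouTube’s finds this flag, it renders the equirectangular frame onto a sphere and lets you look around; without it, you just get the flat, stretched image.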
So there are basically three options to edit and produce 360 video on your Android device:
the camera’s companion app
a dedicated 360 video editing app
a regular video editing app + an app that re-injects metadata
When should you use which option?
You only want to trim the beginning/end of a single clip and/or add a filter. You don’t want to mess around with re-injecting metadata.
You want to build a story with multiple clips. You want to have more editing options & features like changing the default viewing angle, speed or add music/audio. You want to keep the metadata in the file.
You want to build a story with multiple clips. You want a timeline environment, not a storyboard. You want the full feature set of your regular Android video editing app including precise placement/length of titles, music, voice-overs, graphics, transitions etc. You want to work on multiple projects at the same time. You don’t mind losing a bit of vertical resolution. You don’t mind the “Black Hole Sun” syndrome. You don’t mind not having an ‘interactive’ preview. You don’t mind re-injecting metadata.
Using a 360 camera’s companion app
If you use the editing options of a 360 camera’s companion app, you will only be able to perform extremely basic edits when the end product should be an interactive 360 video. For instance, the companion app for the Insta360 Air only lets you add a filter from their selection, like black & white or some other Instagram-inspired ones. You can’t even trim the beginning or the end of the clip, which definitely would come in handy if you don’t intend to be in the shot. Unlike with the Insta360 Air app, you can do this kind of top & tail trimming in Samsung’s Gear 360 Manager app and Ricoh’s Theta+ Video. The latter also lets you add a filter and music before exporting. I can’t really say anything about the companion app for the LG 360 Cam as I neither have one nor know somebody who owns it. But I very much assume that it won’t go beyond the features discussed here. Btw, if you want to share to a network that does not (yet) natively support 360 video (like Twitter, Instagram, WhatsApp or Snapchat), you might want to transform the video into a “Tiny Planet” or “Magic Ball” format, which (most) companion apps let you do. But as this blog post is about ‘real’ interactive 360 video, I won’t go further into details here. The same goes for desktop editing software that is provided by the camera companies (like Insta360Studio or the Gear 360 Action Director) because we are focusing on mobile-only solutions.
Using a dedicated 360 video editing app
While Android users are often served less generously or belatedly with certain high-profile apps compared to Apple’s iOS users, they can actually be trailblazers when it comes to mobile 360 video editing! There are already two genuine 360 video editing apps in the Google Play Store (not a single one for iOS yet): Collect and V360. Both of them are still in beta (update: V360 has been officially released in the meantime) and relatively basic when compared to more advanced “regular” video editing apps, but they cover the basics pretty well and appear generally very promising at this early stage. The most important thing is that – unlike the companion apps – they actually let you build a story out of multiple clips. When compared to each other, Collect comes off as the more advanced and visually slightly slicker app with a couple more features but a minor drawback in the exporting process.
But let’s talk about V360 first; its plain simplicity may even make it a better choice for your very first 360 video edit. Upon firing up the app you can either multi-select a couple of 360 video clips or just select one and add other clips later. One very helpful thing is that there’s a slider button that, when activated, shows only 360 video clips, not your whole camera roll. When you’re done you’ll get to the storyboard (storyboard means each clip thumbnail has the same size no matter how long or short the video clip is). By swiping your finger around the preview area or moving your phone around you can explore the different parts of the 360 video. If you want to edit a clip you just tap on the pen icon below the storyboard: You can trim (top & tail), delete or duplicate the clip. There’s also an option to sort the clips (newest/oldest first) but that didn’t really work for me. If you want to rearrange the order of the clips in your story, you long-press the clip and then drag it to its new place in the storyboard. You also have the chance to add music or another audio clip to the storyboard. Keep in mind though that this audio will play through the whole video; you cannot have it come in or go out at a certain point. There is however an option to adjust the music volume for the whole video. By tapping on the speaker icon you can change the volume relation between the sound from the video and the audio clip in three steps. Upon export you will find that, fortunately, the resolution is the same as your source material and that the metadata is still in place, but also – a bit less enthusiastically – that a V360-branded outro has been added. Hopefully they will give you the chance to disable it with a future update. If you’re longing for something slightly more advanced, then you should check out Collect. After selecting your clips you will find that the idea of circularity is a clever UI theme for a 360 video editing app.
The thumbnails of the storyboard are circular and the preview window has a circular mask to help you imagine what the point-of-view will be like for the viewer when watching the video in VR mode with goggles. If you tap on one of the clips and enter the edit screen you will also find that the trim handles for the clip are built into a circle. Btw, don’t worry about the trim handles already having been moved without you doing anything – when adding the clips to the storyboard the app does a sort of quick “auto-edit”, but all of it is reversible. However, I’d prefer to have this as an option to enable rather than a default setting. While letting you add some audio to the story (which, just like in V360, plays through the whole video), Collect has a couple more features up its sleeve: You can add a color filter, change the speed (slow, normal, fast) of the video and – this could be the most important thing – change the default viewing angle so viewers initially look in the direction you want them to look when a new clip starts. If you don’t know what it’s for, it can be a bit confusing for beginners though. Another nice feature is the ability to add a custom watermark (a square PNG image with a maximum size of 1024×1024, transparency is supported) at the bottom of the image. While I am hoping that future updates will add a few more features like a basic title tool or the ability to switch to a timeline mode which gives you more control over the placement of audio tracks, the biggest flaw of this really cool app at the moment is that the resolution of the exported file is always 3840×2160. If you’re working with Gear 360 footage (which has a maximum resolution of 3840×1920), things are fine, but if you use footage from another camera with a lower resolution, like the Insta360 Air with its maximum video resolution of 2560×1280 on most phones, the image will get softer because of the upscaling.
It would be good to have the option to keep the source material’s original resolution when exporting. Like V360 the app preserves the metadata upon export. One more thing: It’s very cool that they integrated an in-app messenger-like service for giving feedback to the developer team. So speak your mind if you have suggestions!
One thing that both Collect and V360 are lacking is the ability to save/manage multiple projects at the same time. Right now, you have to finish one project before starting a new one. And while you can’t work on different projects at the same time in either of these apps, V360 does save your current project even if you leave the app or eliminate it from the background tasks. Collect on the other hand does save your project as long as you keep the app running in the background, if you clear out the background apps your project will be lost! This is definitely something that both apps (especially Collect) should improve upon.
Using a regular video editing app
The ability to save multiple projects and go back to them for adjustments later is (currently) one of the big advantages of using a ‘regular’ Android video editing app for 360 video. Also, if you want to use titles, place audio files including voice-overs at certain points, add transitions or just generally have the full feature set of a more advanced mobile editing app at hand, this is the better choice – it’s a slightly different workflow though and there are some caveats as well. By far the best two video editing apps on Android are KineMaster and PowerDirector, so I will only talk about these two champions here, although you might also be able to use another video editing app. While PowerDirector already supports 4K/UHD footage on powerful enough devices, KineMaster has just released a beta version that includes 4K/UHD footage support as well (again, the device – or more precisely its chipset – needs to be powerful enough to handle it), but the official release version is (for now) limited to FHD. While 4K/UHD still hasn’t exactly penetrated the mainstream as the standard resolution for ‘regular’ video, it’s a crucial point in the 360 video world because, spread over a vastly larger area than in a regular non-360 image, an FHD resolution only looks like SD at best. So if you want something that at least comes a bit closer to an HD (720p) feeling, 4K/UHD footage is needed. You also have to consider that the most common aspect ratio of 360 video is not 16:9 but 2:1 (or 18:9), so you will lose a bit of vertical resolution. Let’s have a look at what kind of footage you can import into KineMaster and PowerDirector (please note that less powerful devices may not support the highest resolutions).
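To put the “FHD looks like SD” point into numbers: an equirectangular frame spreads its horizontal pixels over the full 360 degrees of the sphere, while a regular video only has to cover the camera’s field of view. A quick back-of-the-envelope calculation (the ~65° FOV for a typical phone lens is my own assumed figure):

```python
def px_per_degree(width_px: int, fov_deg: float) -> float:
    """Horizontal pixels available per degree of viewing angle."""
    return width_px / fov_deg

# FHD-width 360 footage: 1920 px spread over the sphere's full 360 degrees.
spherical = px_per_degree(1920, 360)   # ~5.3 px per degree
# Regular FHD video from a phone lens with an assumed ~65 degree FOV.
regular = px_per_degree(1920, 65)      # ~29.5 px per degree

print(round(spherical, 1), round(regular, 1))
```

Since the viewer only ever sees a small slice of the sphere at a time, those ~5 pixels per degree are why FHD 360 footage feels closer to SD than HD – and why 4K/UHD matters so much more here than for regular video.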
KineMaster currently supports a maximum resolution of 1920×1080 (FHD; 4K/UHD support is in the pipeline as mentioned before) and a maximum frame rate of 30fps. This means you can import footage from the Ricoh Theta S (1920×1080, 30fps) in full but you will have to go for lower resolutions and some pixel loss with footage from the Insta360 Air (2560×1280 does not work, only 1920×960, both 30fps) and Gear 360 (3840×1920 and other lower resolutions don’t work, only 1920×960, 30fps). The video will appear in and export from KineMaster in a common FHD resolution of 1920×1080 (having to fill the vertical resolution from 960 to 1080) so there will be some black letter-boxing which eventually results in what I like to call the “Black Hole Sun” (of course paying homage to a certain tune …) syndrome when viewing it as an interactive 360 video: a small black circle at the top and bottom of the image. You can watch the sample video here (make sure to watch it in highest possible resolution). A quick warning for those usually producing PAL video content with a frame rate of 25fps (which KineMaster allows): Since the footage on these cameras can only be captured with 30fps, set the export frame rate in KineMaster‘s settings to 30 as well for the best result – and it’s the more ‘natural’ standard for platforms like YouTube and Facebook.
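The letterboxing behind the “Black Hole Sun” effect is easy to quantify – a minimal sketch of the arithmetic (the function name is mine):

```python
def letterbox_bars(source_h: int, canvas_h: int) -> int:
    """Height in pixels of each black bar when centering the source vertically."""
    return (canvas_h - source_h) // 2

# 1920x960 (2:1 equirectangular) footage on KineMaster's 1920x1080 canvas:
bar = letterbox_bars(960, 1080)
print(bar)  # 60 px of black at the top and at the bottom of the frame
```

Wrapped back onto a sphere, those two 60-pixel bars collapse into the small black circles at the zenith and nadir of the image – the “Black Hole Sun” you see when looking straight up or down in the player.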
If you are running PowerDirector on a device that supports 4K/UHD editing, you can import Insta360 Air footage shot in 2560×1280 (30fps), but you have to decide whether you want to export it downscaled to 1920×1080 (FHD) or upscaled to 3840×2160 (UHD). You can check my two sample videos (FHD & 4K, make sure to watch them in the highest possible resolution) to decide which option you like better quality-wise. The ability to import 4K/UHD footage in PowerDirector also lets you use Gear 360 footage at maximum resolution (3840×1920, 30fps), but as the regular UHD format is 3840×2160, your video will also suffer from the “Black Hole Sun” syndrome.
But let’s move on to the actual editing process using either PowerDirector or KineMaster. One thing that makes imagining the final product a bit more difficult than when using a dedicated 360 video editing app like Collect or V360 is the fact that the preview window will not display an interactive image that you can explore by swiping your finger on the screen or moving the device around like you would with the finished product in a 360 video player – all you see is the flat equirectangular image. So be ready for some trial & error work to find out how certain edits or the addition of titles/graphics will actually look in the end! That being said, having a precise timeline layout instead of a simple storyboard plus the full feature set of those two advanced mobile video editing apps will give you a lot more freedom and control to create the video your way. You can record voice-overs or add music tracks and place them at specific points, you can add titles (they actually work surprisingly well in a 360 environment, just pay attention to where you place them and don’t make them too big or it will be very hard to read them!) and graphics and exactly define their length, size & style, you can apply transitions instead of plain cuts etc. etc.
So you have created a super-sophisticated 360 masterpiece and joyfully sung Soundgarden’s “Black Hole Sun” the whole time – now you can just upload the video to YouTube or Facebook and get showered with Likes and Thumbs-Ups, right? Er … no. Because we’re pretty much coming full circle (absolutely no pun intended!) when I tell you that you mustn’t forget about the metadata! After exporting your video from a regular Android video editing app, the metadata is gone and it needs to be re-injected so that the video player on YouTube or Facebook will actually display the video as an interactive 360 video and not in a flat equirectangular form. So there’s a problem but, luckily, there’s also a fix: VRfix. This app is a one-trick pony and it will cost you a couple of bucks, but you should be thankful that it exists because otherwise there would be no happy ending for a mobile-only 360 video workflow when you have used PowerDirector or KineMaster to edit your video. After you have re-injected the 360 metadata into the video file with VRfix, you can finally upload the video to your 360 video platform of choice. If you want to know more about how VRfix works, check out their website.
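In case you’re curious what “360 metadata” actually is: injectors like VRfix (or Google’s open-source Spatial Media Metadata Injector on the desktop) embed a small XML payload, defined by Google’s Spherical Video V1 spec, into the MP4 container so that players know the file is an equirectangular 360 video. Here’s a sketch that only builds that payload; the actual injection into the MP4’s moov box is the part the tools handle for you:

```python
# Sketch of the Spherical Video V1 XML payload that 360 metadata
# injectors embed into an MP4 container. Building the XML is the
# easy part; writing it into the right moov box is what tools like
# VRfix or Google's Spatial Media Metadata Injector actually do.

def spherical_v1_xml(stitching_software: str = "unknown") -> str:
    """Build the Spherical Video V1 metadata payload."""
    return (
        '<?xml version="1.0"?>\n'
        '<rdf:SphericalVideo '
        'xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" '
        'xmlns:GSpherical="http://ns.google.com/videos/1.0/spherical/">\n'
        "  <GSpherical:Spherical>true</GSpherical:Spherical>\n"
        "  <GSpherical:Stitched>true</GSpherical:Stitched>\n"
        f"  <GSpherical:StitchingSoftware>{stitching_software}"
        "</GSpherical:StitchingSoftware>\n"
        "  <GSpherical:ProjectionType>equirectangular"
        "</GSpherical:ProjectionType>\n"
        "</rdf:SphericalVideo>"
    )

print(spherical_v1_xml("KineMaster"))
```

The `<GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>` tag is the key bit: it’s what tells the YouTube or Facebook player to wrap the flat frame around a sphere instead of showing it as-is.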
Oh my, this is my first English language blog post here and it has become quite a monster despite the fact that I only wanted to cover some general basics. Well, well. I do hope you will find it useful in some way. Please feel free to drop questions and other feedback in the comments or hit me up on Twitter (@smartfilming). If you happen to find any mistakes or incorrect information in my article you’re also more than welcome to let me know about it. In that regard I want to finish by saying thanks to a couple of people I consulted during the process of writing this blog post: Pipo Serrano (@piposerrano), Paul Gailey (@paulgailey), Kai Rüsberg (@mojonalist), Sarah Jones (@VirtualSarahJ), Sarah Redohl (@SarahRedohl) and the 360 Rumors Blog (@360rumorsblog).
Correction: an earlier version of this article stated that KineMaster does not support 4K/UHD footage.
Correction: an earlier version of this article stated that projects can’t be saved when using Collect or V360.