There are times when – for reasons of privacy or even a person’s physical safety – you want to make certain parts of a frame in a video unrecognizable so as not to give away someone’s identity or the place where you shot the video. While it’s fairly easy to achieve something like that for a photograph, it’s a lot more challenging for video for two reasons: 1) A person moving around within the shot – or a moving camera – constantly alters the subject’s location within the frame. 2) If the person talks, they might also be identifiable by voice alone. So are there any apps that help you anonymize persons or objects in videos when working on a smartphone?
KineMaster – the best so far
Up until recently, the best app for anonymizing persons and/or certain parts of a video in general was KineMaster, which I already praised in my last blog about the best video editing apps on Android (it’s also available for iPhone/iPad). While it’s possible to use just about any video editor that allows for a resizable image layer (say, a plain black square or rectangle) on top of the main track to cover a face, KineMaster is the only one with a dedicated blur/mosaic tool for this use case. Many other video editing apps have a blur effect in their repertoire, but the problem is that this effect always affects the whole image and can’t be applied to only a part of the frame. KineMaster, on the other hand, allows its Gaussian Blur effect to be adjusted in size and position within the frame. To access this feature, scroll to the part of the timeline where you want to apply the effect but don’t select any of the clips! Now tap on the “Layer” button, choose “Effect”, then “Basic Effects”, then either “Gaussian Blur” or “Mosaic”. An effect layer gets added to the timeline which you can resize and position within the preview window. Even better: KineMaster also lets you keyframe this layer, which is incredibly important if the subject/object you want to anonymize is moving around the frame or if the camera is moving (thereby constantly altering the subject’s/object’s position within the frame). Keyframing means you can set “waypoints” for the effect’s area to automatically change its position/size over time. You can access the keyframing feature by tapping on the key icon in the left sidebar. Keyframes have to be set manually, so it’s a bit of work, particularly if your subject/object is moving a lot. If you just have a static shot with the person not moving around much, you don’t have to bother with keyframing though.
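Conceptually, keyframing is nothing more than interpolation: for every frame between two waypoints, the editor computes an in-between position and size for the effect region. Here’s a minimal Python sketch of the idea (the function names and data layout are my own illustration, not KineMaster’s):

```python
def lerp(a, b, t):
    """Linear interpolation between values a and b at fraction t (0..1)."""
    return a + (b - a) * t

def region_at(keyframes, time):
    """Return the blur region (x, y, w, h) at a given time.

    keyframes: list of (time, (x, y, w, h)) tuples, sorted by time.
    Between two keyframes the region is interpolated linearly, which is
    essentially what happens when you set two waypoints in an editor.
    """
    if time <= keyframes[0][0]:
        return keyframes[0][1]
    if time >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, r0), (t1, r1) in zip(keyframes, keyframes[1:]):
        if t0 <= time <= t1:
            t = (time - t0) / (t1 - t0)
            return tuple(lerp(a, b, t) for a, b in zip(r0, r1))

# Two waypoints: the region drifts right and grows over 2 seconds.
kf = [(0.0, (100, 50, 80, 80)), (2.0, (300, 50, 120, 120))]
print(region_at(kf, 1.0))  # halfway: (200.0, 50.0, 100.0, 100.0)
```

Real editors also offer eased (non-linear) interpolation between waypoints, but the principle is the same.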
And as if the adjustable blur/mosaic effect and support for keyframing weren’t good enough, KineMaster also gives you a tool to add an extra layer of privacy: you can alter voices. To access this feature, select a clip in the timeline and then scroll down the menu on the right to find “Voice Changer” – there’s a whole bunch of different effects. To be honest, most of them are rather cartoonish – I’m not sure you want your interviewee to sound like a chipmunk. But there are also a couple of voice changer effects that I think can be used in a professional context.
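For the curious: the simplest way to make a voice unrecognizable is to shift its pitch. The crude numpy sketch below does this by resampling, which also changes the clip’s duration – proper voice changers use a phase vocoder so the timing stays intact. This is just an illustration of the principle, not how KineMaster does it:

```python
import numpy as np

def crude_pitch_shift(samples, factor):
    """Resample audio by `factor` to shift its pitch.

    factor > 1 raises the pitch (and shortens the clip), factor < 1
    lowers it (and lengthens the clip). Naive on purpose: real voice
    changers preserve duration with a phase vocoder.
    """
    n = len(samples)
    # Positions in the original signal that each output sample reads from.
    src = np.arange(0, n, factor)
    src = src[src < n - 1]
    return np.interp(src, np.arange(n), samples)

# A 440 Hz sine resampled by 0.8 plays back at ~352 Hz (a deeper "voice").
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
lowered = crude_pitch_shift(tone, 0.8)
print(len(tone), len(lowered))  # the lowered clip is ~25% longer
```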
What happened to Censr?
As I indicated in the paragraph above, a moving subject (or a moving camera) makes anonymizing content within a video a lot harder. You can manually keyframe the blurred area to follow along in KineMaster, but it would be much easier if that could be done via automatic tracking. Last summer, a closed beta version of an app called “Censr” was released on iOS that was able to automatically track and blur faces. It all looked quite promising (I saw some examples on Twitter) but the developer Sam Loeschen told me that “unfortunately, development on censr has for the most part stopped”.
PutMask – a new app with a killer feature!
But you know what? There actually is a smartphone app out there that can automatically track and pixelate faces in a video: it’s called PutMask and it’s currently only available for Android (there are plans for an iOS version). The app (released in July 2020) offers three ways of pixelating faces in videos: automatically via face-tracking, manually by following the subject with your finger on the touchscreen, and manually by keyframing. The keyframing option is the most cumbersome one but might be necessary when the other two don’t work well. The “swipe follow” option is the middle ground – not as time-consuming as keyframing, but manual action is still required. The most convenient approach is of course automatic face-tracking (you can even track multiple faces at the same time!) – and I have to say that in my tests, it worked surprisingly well!
Does it always work? No, there are definitely situations in which the feature struggles. If you are walking around and your face gets covered by something else (for instance because you are passing another person or an object like a tree) even for only a short moment, the tracking often loses you. It even lost me when I was walking around indoors and the lens flare from the light bulb on the ceiling created a visual “barrier” which I passed at some point. And although I would say that the app is generally well-designed, some of the workflow steps and the nomenclature can be a bit confusing. Here’s an example: After choosing a video from your gallery, you can tap on “Detect Faces” to start a scanning process. The app will tell you how many faces it has found and will display a numbered square around each face. If you now tap on “Start Tracking”, the app tells you “At least select One filter” – but I couldn’t find any button or element labeled “filter”. After some confusion I discovered that you need to tap once on the square that is placed over the face in the image; maybe by “filter” they actually mean you need to select at least one face? Now you can initiate the tracking. After the process is finished you can preview the tracking that the app has done (and also dig deeper into the options to alter the amount of pixelation etc.), but to check the actual pixelated video you have to export your project first. While the navigation could/should be made clearer and more intuitive for certain actions, I was quite happy with the results in general. The biggest catch until recently was the maximum export resolution of 720p, but with the latest update released on 21 January 2021, 1080p is also supported. An additional feature that would be great to have in an app with a dedicated focus on privacy and anonymization is the ability to alter/distort a person’s voice, like you can in KineMaster.
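As an aside, the mosaic effect itself is conceptually simple: divide the tracked region into blocks and flatten each block to its average value. A toy numpy sketch (my own code, not PutMask’s – a real implementation would run this per frame, with the box supplied by the face tracker or by your keyframes):

```python
import numpy as np

def pixelate(frame, box, block=16):
    """Pixelate a rectangular region of a grayscale frame (H x W array).

    box is (x, y, w, h); every block x block cell inside it is replaced
    by its mean value, which is essentially what a "mosaic" filter does.
    """
    x, y, w, h = box
    region = frame[y:y + h, x:x + w].astype(float)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            cell = region[by:by + block, bx:bx + block]
            cell[:] = cell.mean()  # flatten the cell to one value
    frame[y:y + h, x:x + w] = region.astype(frame.dtype)
    return frame

# A horizontal gradient: after pixelation the 32x32 box collapses to
# just a couple of flat blocks.
img = np.tile(np.arange(64, dtype=np.uint8), (64, 1))
out = pixelate(img.copy(), (0, 0, 32, 32))
print(len(np.unique(out[:32, :32])))  # only 2 distinct values remain
```

Note that blurring can sometimes be partially reversed, which is why a coarse mosaic (large blocks, i.e. heavy information loss) is generally the safer choice for anonymization.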
There’s one last thing I should address: The app is free to download with all its core functionality but you only get SD resolution and a watermark on export. For HD/FHD watermark-free export, you need to make an in-app purchase. The IAP procedure is without a doubt the weirdest I have ever encountered: The app tells you to purchase any one of a selection of different “characters” to receive the additional benefits. Initially, these “characters” are just names in boxes: “Simple Man”, “Happy Man”, “Metal-Head” etc. If you tap on a box, an animated character pops up. But only when scrolling down does it become clear that these “characters” represent different amounts of payment with which you support the developer. And if that wasn’t strange enough by itself, the amount you can donate goes up to a staggering 349.99 USD (character “Dr. Plague”) – no kidding! At first, I had actually selected Dr. Plague because I thought it was the coolest-looking character of the bunch. Only when trying to go through with the IAP did I become aware of the fact that I was about to drop 350 bucks on the app! Seriously, this is nuts! I told the developer that I don’t think this is a good idea. Anyway, the amount of money you donate doesn’t affect your additional benefits, so you can just opt for the first character, the “Simple Man”, which costs 4.69€. I’m not sure why they would want to make things so confusing for users willing to pay, but other than that, PutMask is a great new app with a lot of potential. I will definitely keep an eye on it!
As always, if you have questions or comments, drop them below or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.
I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂
Ever since I started this blog, I wanted to write an article about my favorite video editing apps on Android, but I could never decide how to go about it: a separate in-depth article on each of them, one really long piece on all of them, or a more condensed overview without too much detail or workflow explanation. So I recently figured there had been enough pondering on this subject and I should just start writing. The very basic common ground for all the mobile video editing apps mentioned here is that they allow you to combine multiple video clips into a timeline and arrange them in a desired order. Some might question the validity of editing video on a screen as relatively small as a smartphone’s (even though screen sizes have increased drastically over the past few years). While it’s true that there definitely are limitations and I probably wouldn’t consider editing a feature-length movie that way, there’s also an undeniable fascination in the fact that it’s actually doable and can be a lot of fun. I would even dare to say that it’s a charming throwback to the days before digital non-linear editing, when the process of cutting and splicing actual film strips had a very tactile nature to it. But let’s get started…
When I got my first smartphone in 2013 and started looking for video editing apps in the Google PlayStore, I ran into a lot of frustration. There was a plethora of video editing apps, but almost none of them could do more than manipulate a single clip. Then, in late December, an app called KineMaster was released, and just by looking at the screenshots of the UI I could tell that this was the game changer I had been waiting for: a mobile video editing app that actually aspired to give you the proper feature set of a (basic) desktop video editing software. Unlike some other (failed) attempts in that respect, the devs behind KineMaster realized that giving the user more advanced editing tools could become an unpleasant boomerang flying in their face if the controls weren’t touch-friendly on a small screen. If you ever had the questionable pleasure of using a video editing app called “Clesh” on Android (it’s long gone), you know what I’m talking about. To this day, I still think that KineMaster has one of the most beautiful and intuitive UIs of any mobile app. It really speaks to the app’s ingenuity that, despite having grown into a respectable mobile video editing powerhouse with many pro features, even total editing novices usually have no problem getting the hang of the basics within a couple of hours or even minutes.
While spearheading the mobile video editing revolution on Android, KineMaster dared to become one of the first major apps to drop the one-off payment method and pioneer a subscription model. I had initially paid 2€ one-off for the pro version of the app to get rid of the watermark; now you had to pay 2 or 3€ a month (!). I know, “devs gotta eat”, and I’m all for paying a decent amount for good apps, but this was quite a shock, I have to admit. It needs to be pointed out that KineMaster is actually free to download with all its features (so you can test it fully and with no time limit before investing any money) – but you always get a KineMaster watermark in your exported video and the export resolution doesn’t include UHD/4K. If you are just doing home movies for your family, that might be fine, but if you do stuff in a professional or even just more ambitious environment, you probably want to get rid of the watermark. Years later, with every other app having jumped on the subscription bandwagon, I do feel that KineMaster is still one of the apps that are really worth it. I already praised the UI/UX, so here are some of the important features: You get multiple video tracks (resolution and number are device-dependent) and other media layers (including support for PNG images with transparency), options for multiple frame rates including PAL (25/50), the ability to select between a wide variety of popular aspect ratios for projects (16:9, 9:16, 1:1, 2.35:1 etc.) and even duplicate a project with a different aspect ratio later (very useful if you want to share a video on multiple platforms). You can use keyframes to animate content, have a very good title tool at hand, audio ducking, voice-over recording, basic grading tools and, last but not least, the Asset Store.
That’s the place where you can download all kinds of helpful assets for your edit: music, fonts, transitions, effects and most of all (animated) graphics (‘stickers’) that you can easily integrate into your project and make it pop without having to spend much time on creating stuff from scratch. Depending on what you are doing, this can be a massive help! I also have to say that despite Android’s fragmentation with all its different phones and chipsets, KineMaster works astonishingly well across the board.
There are still things that could be improved (certain parts of the timeline editing process, media management, precise font sizes, audio waveforms for video clips, quick audio fades, project archives etc.) and development progress in the last one or two years seems to have slowed down but it remains a/the top contender for the Android video editing crown, although way more challenged than in the past. Last note: KineMaster has recently released beta versions of two “helper” apps: VideoStabilizer for KineMaster and SpeedRamp for KineMaster. I personally wish they would have integrated this functionality into the main app but it’s definitely better than not having it at all.
The first proper rival for KineMaster emerged about half a year later, in June 2014, with Cyberlink’s PowerDirector. Unlike KineMaster, PowerDirector was already an established name in the video editing world, at least on the consumer/prosumer level. In many ways, PowerDirector has a largely (yet not completely) comparable feature set to that of KineMaster, with one key missing option being support for exporting in PAL frame rates (if you don’t need to export in 25/50fps, you can ignore this shortcoming). The UI is also good and pretty easy to learn. After KineMaster switched to the subscription model, PowerDirector did have one big factor in its favor: You could still get the full, watermark-free version of the app by making a single, quite reasonable payment – I think it was about 5€. That, however, changed eventually, and PowerDirector joined the ranks of apps that you can’t own anymore but only rent via a subscription to access all features and watermark-free export. Despite the fact that it’s slightly more expensive than KineMaster now, it’s still a viable and potent mobile video editor with some tricks up its sleeve.
It was, for instance – until recently – the only mobile video editor with an integrated stabilization tool to tackle shaky footage. It’s also the only one with a dedicated de-noise feature for audio, and unlike KineMaster it lets you mix your audio levels by track in addition to by individual clips. Furthermore, PowerDirector offers the ability to transfer projects from mobile to its desktop version via the Cyberlink Cloud, which can come in handy if you want to assemble a rough cut on the phone but do more in-depth work on a bigger screen with mouse control. Something rather annoying is the way the app tries to nudge – or dare I say shove – you towards a subscription. As I had bought the app before the introduction of the subscription model, I can still use all of its features and export without a watermark, but before getting to the edit workspace, the app bombards you with full-screen ads for its subscription service every single time – I really hate that. One last thing: There are a couple of special Android devices on which PowerDirector takes mobile video editing to another level, but that’s for a future article, so stay tuned.
Adobe Premiere Rush
Even more so than Cyberlink, Adobe is a well-known name in the video editing business thanks to Premiere Pro (Windows/macOS). More than once I had asked myself why such a big player had missed the opportunity to get into the mobile editing game. Sure, they dipped their toes into the waters with Premiere Clip but after a mildly promising launch, the app’s development stagnated all too soon and was abandoned eventually – not that much of a loss as it was pretty basic. In 2018 however, Adobe bounced back onto the scene with a completely new app, Premiere Rush. This time, it looked like the video editing giant was ready to take the mobile platform seriously.
The app has a very solid set of advanced editing features and even some specialties that are quite unique/rare in the mobile editing environment: You can, for instance, expand the audio of a video clip without actually detaching it and risking losing sync – very useful for J & L cuts. There’s also a dedicated button that activates multi-select for clips in the timeline, another great feature. What’s more, Rush has true timeline tracks for video. What do I mean by “true”? KineMaster and PowerDirector support video layers, but you can’t just move a clip from the primary track to an upper/lower layer track and vice versa, which isn’t that much of a problem most of the time, but sometimes it can be a nuisance. In Rush you can move your video clips up and down the tracks effortlessly. The “true tracks” approach also means that you can easily disable/mute/lock a particular track and all the clips that are part of it. One of Rush’s marketed highlights is the auto-conform feature, which is supposed to automatically adapt your edit to other aspect ratios using AI to frame the image in the (hopefully) best way. So for instance, if you have a classic 16:9 edit, you can use this to get a 1:1 video for Instagram. This feature is reserved for premium subscribers, but you can still manually alter the aspect ratio of your project in the free version. For a couple of months, the app was only available for iOS but premiered (pardon the pun!) on Android in May 2019. Like with PowerDirector, you can use Adobe’s cloud to transfer project files to the desktop version of Rush (or even import into Premiere Pro), which is useful if the work is a bit more complex. It’s also possible to have projects automatically sync to the cloud (subscriber feature).
Initially, the app had a very expensive subscription of around 10€ per month (and only three free exports to test), unless you were already an Adobe Creative Cloud subscriber, in which case you got it for free. It has now become more affordable (4.89€ monthly or 33.99€ per year), and the basic version with most features including 1080p export (UHD/4K is a premium feature) is free and doesn’t even force a watermark on your footage – you do need to create a (free) account with Adobe though.
The app does have its quirks – how many of them are still teething troubles, I’m not sure. In my personal tests with a Google Pixel 3 and a Pocophone F1, export times were sometimes outrageously long, even for short 1080p projects. Both my test devices were powered by a Snapdragon 845 SoC, which is a bit older but was a top flagship processor not too long ago and should easily handle 1080p video. Other editing apps didn’t have any problems rushing out (there goes another pun!) the same project on the same devices. This leads me to believe that the app’s export engine still needs some fine-tuning and optimization. But maybe things are looking better on newer and even more powerful devices. Another head-scratcher was frame rate fidelity. While the export window gave me a “1080p Match Framerate” option as an alternative to “1080p 30fps”, surely indicating that it would keep the frame rate of the used clips, working with 25fps footage regularly resulted in a 30fps export. The biggest caveat with Rush, though, is that its availability on Android is VERY limited. If you have a recent flagship phone from Samsung, Google, Sony or OnePlus, you’re invited; otherwise you are out of luck – for the moment at least. For a complete list of currently supported Android devices check here.
Ever since I started checking the Google PlayStore for interesting new apps on a regular basis, it rarely happens that I find a brilliant one that’s already been out for a very long time. It does happen on very rare occasions however, and VN is the perfect case in point. VN had already been available for Android for almost two years (the PlayStore lists May 2018 as the release date) when it eventually popped up on my radar in March 2020 during a routine search for “video editors” on the PlayStore. VN is a very powerful video editor with a robust set of advanced tools and a UI that is clean, intuitive and easy to grasp. You get a multi-layer timeline, support for different aspect ratios including 16:9, 9:16, 1:1 and 21:9, voice-over recording, transparency with PNG graphics, keyframing for graphical objects (not audio though, but there’s the option for a quick fade in/out), basic exposure/color correction, a solid title tool, and export options for resolutions up to UHD/4K, frame rate (including PAL frame rates) and bitrate.
Beyond that, VN is currently the only one of the advanced mobile video editing apps with a dedicated and very easy-to-use speed-ramping tool, which can be helpful when manipulating a clip’s playback speed. It’s also great that you can move video clips up and down the tracks, although it’s not as intuitive as Adobe Premiere Rush in that respect since you can’t just drag & drop but have to use the “Forward/Backward” button. But once you know how to do it, it’s very easy. While other apps might have a feature or two more, VN has a massive advantage: It’s completely free – no one-off payment, no subscription, no watermark. You do have to watch a 5-second full-screen ad when launching the app and delete a “Directed by” bumper clip from every project’s timeline, but it’s really not much of a bother in my opinion. In the past you had to create an account with VN, but it’s not a requirement anymore. Will it stay free? When I talked to VN on Twitter some time ago, they told me that the app as such is supposed to remain free of charge but that they might at some point introduce certain premium features or content. VN recently launched a desktop version for macOS (no Windows yet) and the ability to transfer project files between iOS and macOS. While this is currently only possible within the Apple ecosystem (and does require that you register an account with VN), more cross-platform integration could be on the horizon. All in all, VN is an absolutely awesome and easily accessible mobile video editor widely available for most Android devices (Android 5.0 & up) – but do keep in mind that depending on the power of your phone’s chipset, the number of video layers and the supported editing/exporting resolution can vary.
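For those wondering what a speed-ramping tool actually computes: it is essentially a piecewise mapping from the output timeline back to the source timeline, evaluated once per output frame. A minimal sketch of that mapping (my own illustration, not VN’s code):

```python
def source_time(ramp, t_out):
    """Map an output timestamp to a source timestamp under a speed ramp.

    ramp is a list of (duration, speed) segments of the OUTPUT timeline:
    during each segment, the source material advances at `speed` times
    real time (0.5 = slow motion, 2.0 = fast forward).
    """
    t_src = 0.0
    for duration, speed in ramp:
        if t_out <= duration:
            return t_src + t_out * speed
        t_out -= duration
        t_src += duration * speed
    return t_src  # past the end of the ramp

# 1 s at normal speed, then 2 s of half-speed slow motion:
ramp = [(1.0, 1.0), (2.0, 0.5)]
print(source_time(ramp, 2.0))  # 1 s into the slow part -> source 1.5 s
```

A “ramped” curve in the app UI is the smooth version of this: instead of stepping between fixed speeds, the speed itself is interpolated over time.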
CapCut is somewhat similar to VN in terms of basic functionality (multiple video tracks, support for different frame rates including PAL, variety of aspect ratios etc.) and layout, but with a few additional nifty features that might come in handy depending on the use case. Like VN, it’s completely free without a watermark and you don’t have to create an account. CapCut was – following Cyberlink’s PowerDirector – the second advanced mobile video editing app to introduce a stabilization tool and it can even be adjusted to some degree.
Its unique standout double-feature, however, has to do with automatic speech-to-text/text-to-speech processing. As we all know, captions have become an integral part of video production for social media platforms, as many or most of us browse our network feeds with the sound turned off, and captions can be a way to motivate users to watch a video even when it’s muted. While it’s no problem to manually create captions with the title tool in basically any video editing app, this can be very time-consuming and fiddly on a mobile device. So how about auto-generated captions? CapCut has you covered. It doesn’t work perfectly (you sometimes have to do some manual editing) and it’s currently only available in English, but it’s definitely a very cool feature that none of the other editors mentioned here can muster. Interestingly, it’s also possible to do it the other way around: You can let the app auto-generate a voice-over from a text layer. There are three different voices available: “American Male”, “American Female” and “British Female” (only English again). This can be useful if you quickly need to create a voice-over on the go and there’s no time or quiet place to do so, or if you are not comfortable recording voice-overs with your own voice. Any cons? Of the two, I generally prefer VN because I like the design and UX of the timeline workspace better – it’s easier to navigate around, but that’s probably personal taste. What is an actual shortcoming, however, if you are after the highest possible quality, is that CapCut lacks support for UHD/4K export. Don’t get me wrong, you can import UHD/4K footage into the app and work with it, but the export resolution is limited to 1080p and you also can’t adjust the bitrate. From a different angle, it should also be mentioned that CapCut is owned by Bytedance, the company behind the popular social video platform TikTok.
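For context, auto-generated captions boil down to a timed text track once the speech recognition has done its job. Apps like CapCut typically burn the text into the image on export, but the widely supported SubRip (SRT) format illustrates the underlying data nicely (my own sketch of the format, not CapCut’s internals):

```python
def srt(captions):
    """Render (start, end, text) caption tuples as an SRT string.

    Times are in seconds. This is the plumbing an auto-captioning
    feature has to produce once speech recognition has segmented and
    transcribed the audio.
    """
    def stamp(s):
        ms = round(s * 1000)
        h, rem = divmod(ms, 3600_000)
        m, rem = divmod(rem, 60_000)
        sec, ms = divmod(rem, 1000)
        return f"{h:02}:{m:02}:{sec:02},{ms:03}"  # e.g. 00:00:01,500

    blocks = []
    for i, (start, end, text) in enumerate(captions, 1):
        blocks.append(f"{i}\n{stamp(start)} --> {stamp(end)}\n{text}")
    return "\n\n".join(blocks) + "\n"

print(srt([(0.0, 1.5, "Hello!"), (1.5, 4.0, "Captions help muted viewers.")]))
```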
While you don’t have to create an account for CapCut, you do have to agree to their T&Cs to use the app. So if you are very picky about who gets your data and kept your fingers off TikTok for that reason, you might want to take this into consideration.
Special mention (Motion Graphics): Alight Motion
Alight Motion is a pretty unique mobile app that doesn’t really have an equivalent at the moment. While you can also use it to stitch together a bunch of regular video clips filmed with your phone, this is not its main focus. The app is totally centered around creating advanced, multi-layered motion graphics projects – maybe think of it as a reduced mobile version of Adobe After Effects. Its power lies in the fact that you can manipulate and keyframe a wide range of parameters (for instance movement/position, size, color, shape etc.) on different types of layers to create complex and highly individual animations, spruced up with a variety of cool effects drawn from an extensive library. It takes some learning to unleash the enormous potential and power that lies within the app, and fiddling around with a heavy load of parameters and keyframes on a small(ish) touch screen can occasionally be a bit challenging, but the clever UI (designed by the same person who made KineMaster so much fun to use) makes the process basically as good and accessible as it can get on a mobile device. The developers also just added effect presets in a recent update, which should make things easier for beginners who might be somewhat intimidated by manually keyframing parameters. Pre-designed templates for graphics and animations created by the dev team or other users will make things even more accessible in the future – some are already available, but still too few to fully convince passionate users of apps such as the very popular but discontinued Legend. Alight Motion is definitely worth checking out, as you can create amazing things with it (like explainer videos or animated infographics) if you are willing to accept a small learning curve and invest some time. This is coming from someone who regularly throws in the towel trying to get the hang of Apple’s dedicated desktop motion graphics software Motion.
Alight Motion has become the first application in this category in which I actually feel like I know what I’m doing – sort of, at least. One very cool thing is that you can also use Alight Motion as a photo/still graphics editor, since it lets you export the current timeline frame as a PNG, even with transparency! The app is free to download, but to access certain features and export without a watermark you have to get a subscription, which is currently around 28€ per year or 4.49€ per month.
Special mention (Automated Editing): Quik
Sometimes, things have to go quik-ly and you don’t have the time or ambition to assemble your clips manually. While I’m generally not a big fan of automated video editing processes, GoPro’s free Quik video editing app can come in handy at times. You just select a bunch of photos or videos, an animation style and your desired aspect ratio (16:9, 9:16, 1:1), and the app creates an automatic edit for you based on what it thinks are the best bits and pieces. In case you don’t like the results, you have the option to change things around and select excerpts that you prefer – generally, manual control is rather limited though, and it’s definitely not for more advanced edits. It’s also better suited for purely visual edits without important scenes relying on the original audio (like a person talking and saying something of interest). GoPro, which acquired the app in the past, is apparently working on a successor to Quik and will eventually pull this one from the Google PlayStore later in 2021 – but here’s hoping that the “new Quik” will be just as useful and accessible.
Special mention (360 Video Editing): V360
While 360 video hasn’t exactly become mainstream, I don’t want to ignore it completely for this post. Owners of a 360 camera (like the Insta360 One X2 I wrote about recently) usually get a companion mobile app along with the hardware, which also allows basic editing. In the case of the Insta360 app you actually get quite a range of tools, but it’s more geared towards reframing and exporting as a traditional flat video. You can only export a single clip in true 360 format. So if you want to create a story with multiple 360 video clips and also export as true, immersive 360 video with the appropriate metadata for 360 playback, you need to use a 3rd-party app. I have already mentioned V360 in one of my very early blog posts, but I want to come back to it as the landscape hasn’t really changed since then. V360 gives you a set of basic editing tools to create a 360 video story with multiple clips. You can arrange the clips in the desired order, trim and split them, and add music and titles/text. It’s rather basic but good for what it is, with a clean interface and exports in original resolution (at least up to 5.7K, which I was able to test). The free version doesn’t allow you to add transition effects between the clips and has a V360-branded bumper clip at the end that you can only delete in the paid version, which costs 4.99€. There are two other solid 360 video editors (Collect and VeeR Editor) which are comparable and even offer some additional/different features, but I personally like V360 best, although it has to be said that the app hasn’t seen an update in over two years.
What’s on the horizon?
There’s one big name in mobile editing town that’s missing from the Android platform so far – of course I’m talking about LumaFusion. According to LumaTouch, the company behind LumaFusion, they are currently exploring an Android version and have apparently already hired some dedicated developers. I therefore suspect that despite the various challenges that such a demanding app as LumaFusion will encounter in being ported to a different mobile operating system, we will see at least an early beta version in 2021. Furthermore, despite not having any concrete evidence, I assume that an Android version of Videoleap, another popular iOS-only video editor, might also currently be in the works. Not quite as advanced and feature-packed as LumaFusion, it’s pretty much on par in many respects with the current top dogs on Android. So while there definitely is competition, I also assume that the app’s demands are certainly within what can be achieved on Android, and the fact that the developers have already brought other apps from their portfolio to Android indicates that they have some interest in the platform.
As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.
I am investing a lot of time and work in this blog and I’m even paying to keep it ad-free for an undistracted reading experience. If you find any of the content useful, please consider making a small donation via PayPal (click on the PayPal button below). It’s very much appreciated. Thank you! 🙂
Xiaomi has been a really big name in China’s smartphone market for years, promising high-end specs and good build quality for a budget price tag – but only at the end of last year did they officially enter the global scene with the Mi A1. The Mi A1 is basically a revamped Mi 5X running stock Android software instead of Xiaomi’s custom MIUI. It’s also part of Google’s Android One program, which means it runs a ‘clean’ Google version of Android that gets quicker and more frequent updates directly from Google. For a very budget-friendly 180€ (current online price in Europe) you get a slick looking phone with dual rear cameras, featuring a 2x optical zoom telephoto lens alongside the primary camera. Sounds like an incredible deal? Here are some thoughts about the Mi A1 regarding its use as a tool for media production, specifically video.
After spending a couple of days with the Mi A1, I would say that this phone is definitely a very interesting budget choice for mobile photographers. The fact that you get dual rear cameras (the second one being the 2x optical zoom mentioned before) at this price point is pretty amazing. The photo quality is quite good in decent lighting conditions (low light is problematic, but that can be said of most smartphone cameras), you get a manual mode with advanced controls in the native camera app, and the portrait mode feature does a surprisingly good job at creating that fancy bokeh effect, blurring the background to single out your on-screen talent. A lot of bang for the buck. Video – which I’m personally more interested in – is a slightly different story though.
Let’s start with a positive aspect: The Xiaomi Mi A1 lets you record in UHD/4K quality which is still a rarity for a budget phone in this price range. And hey, the footage looks quite good in my opinion, especially considering the fact that it’s coming from a (budget) smartphone. I have uploaded some sample footage on YouTube so see for yourself.
The video bitrate for UHD/4K hovers around 40 Mbps in the native app, which is okay for a phone, but the audio bitrate is a meager 96 kbps (same in FHD), so don’t expect full, rich sound. And this is only the first of a couple of disappointments when it comes to video: one of the Mi A1’s most promising camera features, the 2x optical zoom lens, CANNOT be used in video mode, only in photo mode! What a bummer! This goes for both the native camera app and 3rd party apps.
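If you’re wondering what those bitrates mean for your storage, here’s a quick back-of-the-envelope sketch. The numbers are the approximate ones I measured from my own test clips, so treat the result as a ballpark figure, not a spec:

```python
# Rough storage math for the Mi A1's UHD/4K clips (~40 Mbps video + 96 kbps audio).
# Bitrates vary from shot to shot, so this is only an approximation.
def megabytes_per_minute(video_mbps: float, audio_kbps: float) -> float:
    """Approximate file size in MB for one minute of footage."""
    total_bits_per_second = video_mbps * 1_000_000 + audio_kbps * 1_000
    return total_bits_per_second * 60 / 8 / 1_000_000

print(round(megabytes_per_minute(40, 96), 1))  # roughly 300.7 MB per minute
```

As you can see, a minute of 4K eats about 300 MB, so a 32 GB phone fills up faster than you might think.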
Talking about 3rd party camera apps, it’s also a huge let-down that the Camera2 API support (what is Camera2 API?) is only “Legacy” out of the box, even though the Mi A1 is part of Google’s Android One program. “Legacy” means that third party camera apps can’t really tap into the newer, more advanced camera controls that Google introduced with Android 5 in 2014, like precise exposure control over ISO and shutter speed. Because of this, you can’t install an app like Filmic Pro in the first place, and other advanced camera apps like Cinema FV-5, ProShot, Lumio Cam, Cinema 4K, Footej Camera or Open Camera can’t unleash their full potential. Interestingly, there seems to be a way to “unlock” full Camera2 support via a special procedure without permanently rooting your device (look here), but even after doing so, Filmic Pro can’t be installed, probably because the Play Store keeps the device’s original Camera2 support information in its database to check if the app is compatible, without actually probing the current state of the phone. This is just an educated guess however. Still, many of us might not feel comfortable messing around with our phones in that way, and it’s a pity Xiaomi doesn’t provide this out of the box on the Mi A1.
Lackluster Camera2 API support could be remedied by a good native camera app, but unlike with photos, there is no pro or manual mode for video on the Mi A1; it’s actually extremely limited. While you can lock the focus by tapping (there are two focus modes, tap-to-focus and continuous auto-focus), you are only able to adjust the auto-exposure within a certain range (EV), not lock it. There’s also no way to influence the white balance. Shooting at a higher frame rate (60fps)? Not possible, not even in 720p (there’s a not-too-bad 720p slow-motion feature though). Apropos frame rates: I noticed that while the regular frame rate is the usual 30fps, the native camera app reduces the fps to 24 (23.98 to be precise) when shooting in low-light conditions to gain a little more light for each frame. That’s also the reason why I made two different YouTube videos with sample footage, so I was able to keep the original frame rate of the clips. I have experienced this behaviour of dropping the frame rate in low light with quite a few (native) camera apps on other phones as well, and from the standpoint of a run-of-the-mill smartphone user taking video this is actually an acceptable compromise in my opinion (as long as you don’t go below 20fps) to help tackle the fact that most smartphone cameras still aren’t naturally nocturnal creatures. It can however be a problem for more dedicated smartphone videographers who want to edit their footage, as it’s not really good to have clips in one project that differ so much in terms of fps. 3rd party apps might help keep the fps more constant.
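To illustrate why a lower frame rate helps in low light: the longest possible exposure time per frame is the frame interval itself, so going from 30fps down to 23.98fps gives the sensor roughly 25% more light-gathering time per frame. A tiny sketch of the math (function name is just mine, for illustration):

```python
# Maximum per-frame exposure time is bounded by the frame interval (1/fps),
# so fewer frames per second means each frame can collect more light.
def max_shutter_ms(fps: float) -> float:
    """Longest possible per-frame exposure in milliseconds at a given fps."""
    return 1000.0 / fps

normal = max_shutter_ms(30)        # ~33.3 ms per frame
low_light = max_shutter_ms(23.98)  # ~41.7 ms per frame
print(f"{(low_light / normal - 1) * 100:.0f}% more light-gathering time")
```

In practice the camera also raises ISO, but the longer shutter is the cheaper, less noisy part of the trade-off.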
And there are still two other big reasons to use a 3rd party app on the Mi A1 despite the lack of proper Camera2 API support: locking exposure and using an external microphone via the headphone jack (yes, there is one!). One more important shortcoming to talk about: it’s maybe not too surprising that there is no optical image stabilization (OIS) on a phone in this price range, but given the fact that you can shoot 4K, I would have expected electronic image stabilization (EIS) at least when shooting in 1080p resolution. There’s no EIS in 1080p however, which means you should put the phone on a tripod or use a gimbal most of the time to avoid shaky footage. With a bit of practice you might still pull off a decent handheld pan or tilt to avoid having only static shots.
So I’ve talked about the video capturing part, what about editing video on the Mi A1? The phone sports a Snapdragon 625, a slightly dated but still quite capable mid-range chipset from Qualcomm. You can work with up to two layers (a total of three video tracks) of FHD video in KineMaster and PowerDirector (the two most advanced Android video editing apps), which will suffice for most users. Important note: DON’T run the hardware analysis test in KineMaster though! It’s a hardware probing procedure meant to better determine the device’s capabilities for editing video in the app. While the device capability information originally says you can have two QHD (1440p) video layers, it will downgrade you to two 720p (!) layers after running the analysis, which is quite strange. Don’t worry though if your evil twin grabs your phone and runs the test anyway: you just have to uninstall and then reinstall KineMaster to get back to the original setting. I ran some quick tests with FHD 1080p layers and it worked fine, so just leave everything as is. Since the phone can shoot in UHD/4K resolution you might ask if you can edit this footage on the device. While you can’t edit 4K in KineMaster on the Mi A1 at all (when trying to import 4K footage the app will offer to import a transcoded QHD version of the clip to work with), you can import and work with UHD/4K in PowerDirector, but only as a single video track; layers are not possible.
So let’s wrap this up: Xiaomi’s first internationally available phone is a great budget option for mobile photographers, but the video recording department is let down by a couple of things, which makes other options in this price range more appealing to the smartphone videographer if advanced manual controls and certain pro apps are of importance. As I pointed out though, it’s not all bad: it’s still hard to find a phone at this price that offers UHD/4K video recording, and the footage even looks pretty good in decent lighting conditions. So if you happen to have a Mi A1, there’s no reason at all not to create cool video content with it; if you pull off a nice video package you can even be prouder than someone with a flagship phone! 😉 If you have any questions or comments, please drop them below or find me on Twitter @smartfilming.
Cameras that can produce spherical 360 video are becoming more affordable and widespread these days, slowly making their way into the mainstream. The recently released Android-smartphone-specific Insta360 Air clip-on camera has joined a bunch of other entry-level 360 cams like the Ricoh Theta S, the LG 360 Cam and Samsung’s Gear 360 in making this exciting new world of immersive visuals available to the crowd, while more avant-garde 360 aficionados are getting their fix with a GoPro Omni rig or Nokia’s 40,000€ Ozo. High-end 360 video solutions are still meant to be post-produced on a desktop machine, but the consumer variants are closely tied to mobile devices already. The Insta360 Air connects to the microUSB or USB-C port of an Android phone and records the footage directly to the device. The other three aforementioned entry-level 360 cams can – unlike the Insta360 Air – also be used as standalone cameras without a (physical) connection to the phone, but they all have companion apps that give you the best shooting experience and control via a wireless connection. Furthermore, they make it very easy to directly transfer the footage from the camera to the phone for instant sharing or editing. YouTube and Facebook are the two big social networks that already support interactive 360 videos natively; Vimeo has recently added this feature as well. But before sharing, it’s very likely you want to perform some edits on your footage or combine a couple of clips to tell a story. This brings us to the topic of how you can edit 360 video directly on your Android device.
Oh wait! Just hold your horses for a second! Before actually tackling the editing options I think it’s helpful to address two subjects first to better understand the idiosyncrasies of dealing with 360 video: stitching and metadata.
(Consumer) Camera technology is not (yet) at the point – at least as far as my knowledge goes – where you can record a spherical 360 image with only a single lens. To achieve a spherical 360 image, at least two lenses are used. These two lenses will give you two images, which can be stored in a single file or in separate files. Either way, to get one single image ready for spherical display (in the so-called “equirectangular” format), the two images need to be “stitched” together. The stitching can be done automatically by a software algorithm or manually in a specific editing program. When using a consumer 360 camera you will not have to bother with manual stitching as long as you transfer the files to the camera’s companion app, which does the stitching for you automatically. You will only encounter “raw” un-stitched files if you pull the recorded files directly off the SD card without transferring them to the app first for stitching. Here are two screen grabs: one shows un-stitched footage from the Gear 360, the other stitched footage from the Insta360 Air.
Important: Only stitched footage in equirectangular format will be displayed correctly as an interactive 360 video when you upload the file to YouTube or Facebook.
The other important thing needed to have the video displayed correctly as an interactive 360 video in a dedicated player is metadata. This is data embedded in the video file that “tells” the player that the file is a 360 video. I’ve used the term “interactive” repeatedly – what do I mean by it? It means that you can interactively change the perspective in the video, either by dragging your finger around the screen or by panning/tilting your device (making use of the phone’s gyroscope). If there’s no metadata in the file, the player will just display a flat, equirectangular video that you can’t interact with. And hallelujah, this finally sends us off to our actual topic – editing 360 video on Android – because depending on which editing app you choose, the exported video either still has the 360 metadata in it or not (in which case it will have to be re-injected).
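For the technically curious, here’s a minimal Python sketch of what that metadata looks like at the container level. In the common “Spherical Video V1” convention, the 360 information sits in a uuid box with a well-known ID inside the MP4 file. Real files nest that box inside the video track rather than at the top level, so this toy walker (which only scans top-level boxes of a synthetic example) is purely an illustration of the container structure, not a production checker:

```python
import struct

# UUID identifying "Spherical Video V1" metadata (per Google's spherical video spec).
SPHERICAL_UUID = bytes.fromhex("ffcc8263f8554a938814587a02521fdd")

def top_level_boxes(data: bytes):
    """Yield (box_type, payload) for each top-level MP4 box in the byte stream."""
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack(">I4s", data[offset:offset + 8])
        if size < 8:  # malformed box, stop scanning
            break
        yield box_type.decode("ascii", "replace"), data[offset + 8:offset + size]
        offset += size

# Build a tiny fake "file": an ftyp box plus a uuid box carrying the spherical ID.
fake = struct.pack(">I4s", 12, b"ftyp") + b"isom"
fake += struct.pack(">I4s", 8 + len(SPHERICAL_UUID), b"uuid") + SPHERICAL_UUID

found = any(t == "uuid" and p.startswith(SPHERICAL_UUID)
            for t, p in top_level_boxes(fake))
print(found)  # True
```

A 360-aware player essentially does this kind of check: if the marker is there, it wraps the equirectangular frame around a sphere; if not, you get the flat, distorted image.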
So there are basically three options to edit and produce 360 video on your Android device:
1. the camera’s companion app
2. a dedicated 360 video editing app
3. a regular video editing app + an app that re-injects metadata
When should you use which option?
Option 1 (companion app): You only want to trim the beginning/end of a single clip and/or add a filter. You don’t want to mess around with re-injecting metadata.
Option 2 (dedicated 360 editor): You want to build a story with multiple clips. You want more editing options & features like changing the default viewing angle or speed, or adding music/audio. You want to keep the metadata in the file.
Option 3 (regular editor + metadata tool): You want to build a story with multiple clips. You want a timeline environment, not a storyboard. You want the full feature set of your regular Android video editing app including precise placement/length of titles, music, voice-overs, graphics, transitions etc. You want to work on multiple projects at the same time. You don’t mind losing a bit of vertical resolution. You don’t mind the “Black Hole Sun” syndrome. You don’t mind not having an ‘interactive’ preview. You don’t mind re-injecting metadata.
Using a 360 camera’s companion app
If you use the editing options of a 360 camera’s companion app, you will only be able to perform extremely basic edits when the end product should be an interactive 360 video. For instance, the companion app for the Insta360 Air only lets you add a filter from their selection, like black & white or some other Instagram-inspired ones. You can’t even trim the beginning or the end of the clip, which definitely would come in handy if you don’t intend to be in the shot. Unlike with the Insta360 Air app, you can do this kind of top & tail trimming in Samsung’s Gear 360 Manager app and Ricoh’s Theta+ Video. The latter also lets you add a filter and music before exporting. I can’t really say anything about the companion app for the LG 360 Cam as I neither have one nor know somebody who owns it, but I very much assume that it won’t go beyond the features discussed here. Btw, if you want to share to a network that does not (yet) natively support 360 video (like Twitter, Instagram, WhatsApp or Snapchat) you might want to transform the video into a “Tiny Planet” or “Magic Ball” format, which (most) companion apps let you do. But as this blog post is about ‘real’ interactive 360 video, I won’t go further into details here. The same goes for desktop editing software provided by the camera companies (like Insta360 Studio or the Gear 360 Action Director) because we are focusing on mobile-only solutions.
Using a dedicated 360 video editing app
While Android users are often served less generously or belatedly regarding certain high-profile apps compared to Apple’s iOS users, they can actually be trailblazers when it comes to mobile 360 video editing! There are already two genuine 360 video editing apps in Google’s Play Store (and not a single one for iOS yet): Collect and V360. Both of them are still in beta (update: V360 has been officially released in the meantime) and relatively basic when compared to more advanced “regular” video editing apps, but they cover the basics pretty well and appear very promising at this early stage. The most important thing is that – unlike the companion apps – they actually let you build a story out of multiple clips. Compared to each other, Collect comes off as the more advanced and visually slightly slicker app with a couple of extra features, but a minor drawback in the exporting process.
But let’s talk about V360 first; its plain simplicity may even make it a better choice for your very first 360 video edit. Upon firing up the app you can either multi-select a couple of 360 video clips or just select one and add other clips later. One very helpful thing is a slider button that, when activated, shows only 360 video clips, not your whole camera roll. When you’re done you’ll get to the storyboard (storyboard means each clip thumbnail has the same size no matter how long or short the video clip is). By swiping your finger around the preview area or moving your phone around you can explore the different parts of the 360 video. If you want to edit a clip you just tap on the pen icon below the storyboard: you can trim (top & tail), delete or duplicate the clip. There’s also an option to sort the clips (newest/oldest first), but that didn’t really work for me. If you want to rearrange the order of the clips in your story, you long-press a clip and then drag it to its new place in the storyboard. You also have the chance to add music or another audio clip to the storyboard. Keep in mind though that this audio will play through the whole video; you cannot have it come in or go out at a certain point. There is however an option to adjust the music volume for the whole video: by tapping on the speaker icon you can change the volume relation between the sound from the video and the audio clip in three steps. Upon export you will find that, fortunately, the resolution is the same as your source material and the metadata is still in place, but also – a bit less enthusiastically – that a V360-branded outro has been added. Hopefully they will give you the chance to disable it with a future update. If you’re longing for something slightly more advanced, then you should check out Collect. After selecting your clips you will find that the idea of circularity is a clever UI theme for a 360 video editing app.
The thumbnails of the storyboard are circular and the preview window has a circular mask to help you imagine what the point of view will be like for the viewer when watching the video in VR mode with goggles. If you tap on one of the clips and enter the edit screen you will also find that the trim handles for the clip are built into a circle. Btw, don’t worry about the trim handles already having been moved without you doing anything – when adding the clips to the storyboard the app does a sort of quick “auto-edit”, but all of it is reversible. However, I’d prefer to have this as an option to enable rather than a default setting. While letting you add some audio to the story (though just like V360 it plays through the whole video), Collect has a couple of more features up its sleeve: you can add a color filter, change the speed (slow, normal, fast) of the video and – possibly the most important thing – change the default viewing angle so viewers initially look in the direction you want them to look when a new clip starts. This feature can be a bit confusing for beginners who don’t know what it’s for though. Another nice feature is the ability to add a custom watermark (a square PNG image with a maximum size of 1024×1024, transparency is supported) at the bottom of the image. While I am hoping that future updates will add a few more features like a basic title tool or the ability to switch to a timeline mode which gives you more control over the placement of audio tracks, the biggest flaw of this really cool app at the moment is that the resolution of the exported file is always 3840×2160. If you’re working with Gear 360 footage (which has a maximum resolution of 3840×1920), things are fine, but if you use footage from another camera with lower resolution, like the Insta360 Air with its maximum video resolution of 2560×1280 on most phones, the image will get softer because of the upscaling.
It would be good to have the option to keep the source material’s original resolution when exporting. Like V360 the app preserves the metadata upon export. One more thing: It’s very cool that they integrated an in-app messenger-like service for giving feedback to the developer team. So speak your mind if you have suggestions!
One thing that both Collect and V360 are lacking is the ability to save/manage multiple projects at the same time. Right now, you have to finish one project before starting a new one. And while you can’t work on different projects at the same time in either of these apps, V360 does save your current project even if you leave the app or eliminate it from the background tasks. Collect on the other hand does save your project as long as you keep the app running in the background, if you clear out the background apps your project will be lost! This is definitely something that both apps (especially Collect) should improve upon.
Using a regular video editing app
The ability to save multiple projects and go back to them for adjustments later is (currently) one of the big advantages of using a ‘regular’ Android video editing app for 360 video. Also, if you want to use titles, place audio files including voice-overs at certain points, add transitions or just generally have the full feature set of a more advanced mobile editing app at hand, this is the better choice – it’s a slightly different workflow though and there are some caveats as well. By far the best two video editing apps on Android are KineMaster and PowerDirector, so I will only talk about these two champions here, although you might also be able to use another video editing app. While PowerDirector already supports 4K/UHD footage on powerful enough devices, KineMaster has just released a beta version that includes 4K/UHD footage support as well (again, the device – or more precisely its chipset – needs to be powerful enough to handle it), but the official release version is (for now) limited to FHD. While 4K/UHD still hasn’t exactly penetrated the mainstream as the standard resolution for ‘regular’ video, it’s a crucial point in the 360 video world because, spread over a vastly larger area than a regular non-360 image, an FHD resolution only looks like SD at best. So if you want something that at least comes a bit closer to an HD (720p) feeling, 4K/UHD footage is needed. You also have to consider that the most common aspect ratio of 360 video is not 16:9 but 2:1 (or 18:9), so you will lose a bit of vertical resolution. Let’s have a look at what kind of footage you can import into KineMaster and PowerDirector (please note that less powerful devices may not support the highest resolutions).
KineMaster currently supports a maximum resolution of 1920×1080 (FHD; 4K/UHD support is in the pipeline as mentioned before) and a maximum frame rate of 30fps. This means you can import footage from the Ricoh Theta S (1920×1080, 30fps) in full, but you will have to go for lower resolutions and some pixel loss with footage from the Insta360 Air (2560×1280 does not work, only 1920×960, both 30fps) and Gear 360 (3840×1920 and other higher resolutions don’t work, only 1920×960, 30fps). The video will appear in and export from KineMaster in a common FHD resolution of 1920×1080 (having to fill the vertical resolution from 960 to 1080), so there will be some black letterboxing, which eventually results in what I like to call the “Black Hole Sun” syndrome (of course paying homage to a certain tune …) when viewing it as an interactive 360 video: a small black circle at the top and bottom of the image. You can watch the sample video here (make sure to watch it in the highest possible resolution). A quick warning for those usually producing PAL video content with a frame rate of 25fps (which KineMaster allows): since the footage from these cameras can only be captured at 30fps, set the export frame rate in KineMaster’s settings to 30 as well for the best result – it’s also the more ‘natural’ standard for platforms like YouTube and Facebook.
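The letterboxing math behind the “Black Hole Sun” syndrome is easy to sketch. Fitting a 2:1 equirectangular clip into a 16:9 project leaves black bars above and below the image, and on the sphere those bars collapse into the small black circles at the poles. A quick calculation (my own helper, just for illustration):

```python
# When a 2:1 equirectangular clip is fitted into a 16:9 project by width,
# the leftover vertical space becomes black bars, which wrap into black
# circles at the top and bottom poles of the 360 sphere.
def letterbox_bars(frame_w: int, frame_h: int, clip_w: int, clip_h: int) -> int:
    """Height in pixels of each black bar (top and bottom) after
    scaling the clip to the full frame width."""
    scaled_h = clip_h * frame_w // clip_w
    return (frame_h - scaled_h) // 2

print(letterbox_bars(1920, 1080, 1920, 960))    # 60 px bars in a KineMaster FHD project
print(letterbox_bars(3840, 2160, 3840, 1920))   # 120 px bars for Gear 360 footage in UHD
```

So with a 1920×960 clip in a 1920×1080 project you get 60-pixel bars; the same 2:1-into-16:9 mismatch at UHD doubles that to 120 pixels.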
If you are running PowerDirector on a device that supports 4K/UHD editing, you can import Insta360 Air footage shot in 2560×1280 (30fps) but have to decide whether you want to export it downscaled to 1920×1080 (FHD) or upscaled to 3840×2160 (UHD). You can check my two sample videos (FHD & 4K; make sure to watch them in the highest possible resolution) to decide which option you like better quality-wise. The ability to import 4K/UHD footage in PowerDirector also lets you use Gear 360 footage at maximum resolution (3840×1920, 30fps), but as the regular UHD format is 3840×2160, your video will also suffer from the “Black Hole Sun” syndrome.
But let’s move on to the actual editing process using either PowerDirector or KineMaster. One thing that makes imagining the final product a bit more difficult than when using a dedicated 360 video editing app like Collect or V360 is the fact that the preview window will not display an interactive image that you can explore by swiping your finger on the screen or moving the device around like you would with the finished product in a 360 video player – all you see is the flat equirectangular image. So be ready for some trial & error work to find out how certain edits or the addition of titles/graphics will actually look in the end! That being said, having a precise timeline layout instead of a simple storyboard, plus the full feature set of those two advanced mobile video editing apps, will give you a lot more freedom and control to create the video your way. You can record voice-overs or add music tracks and place them at specific points, you can add titles (they actually work surprisingly well in a 360 environment, just pay attention to where you place them and don’t make them too big or they will be very hard to read!) and graphics and exactly define their length, size & style, you can apply transitions instead of plain cuts, etc.
So you have created a super-sophisticated 360 masterpiece and joyfully sung Soundgarden’s “Black Hole Sun” the whole time – now you can just upload the video to YouTube or Facebook and get showered with Likes and Thumbs-Ups, right? Er … no. Because we’re pretty much coming full circle (absolutely no pun intended!) when I tell you that you mustn’t forget about the metadata! After exporting your video from a regular Android video editing app, the metadata is gone and needs to be re-injected so that the video player on YouTube or Facebook will actually display the video as an interactive 360 video and not in flat equirectangular form. So there’s a problem, but thankfully there’s also a fix: VRfix. This app is a one-trick pony and will cost you a couple of bucks, but you should be thankful it exists, because otherwise there would be no happy ending for a mobile-only 360 video workflow when you have used PowerDirector or KineMaster to edit your video. After you have re-injected the 360 metadata into the video file with VRfix, you can finally upload the video to your 360 video platform of choice. If you want to know more about how VRfix works, check out their website.
Oh my, this is my first English language blog post here and it has become quite a monster despite the fact that I only wanted to cover some general basics. Well, well. I do hope you will find it useful in some way. Please feel free to drop questions and other feedback in the comments or hit me up on Twitter (@smartfilming). If you happen to find any mistakes or incorrect information in my article you’re also more than welcome to let me know about it. In that regard I want to finish by saying thanks to a couple of people I consulted during the process of writing this blog post: Pipo Serrano (@piposerrano), Paul Gailey (@paulgailey), Kai Rüsberg (@mojonalist), Sarah Jones (@VirtualSarahJ), Sarah Redohl (@SarahRedohl) and the 360 Rumors Blog (@360rumorsblog).