One of the big reasons why Android has such an overwhelming dominance as a mobile operating system on a global scale (around 75% of smartphones worldwide run Android) is that it offers a basically seamless price range from the very bottom to the very top – no matter your budget, there’s an Android phone that will fit it. This is generally a very good thing since it allows everyone on this planet to participate in mobile communication, not just those with deep pockets. But as many of us would agree, smartphones are not pure communication devices anymore – you can also use them to actively create content. In this respect, Android phones are bringing the power of storytelling to the people and could therefore be regarded as an invaluable asset in democratizing this mighty tool. But if you CAN get a (very) cheap Android phone, SHOULD you get one?
Of course the definition of what one considers “cheap” highly depends on one’s individual background, so I won’t get into any concrete universal definitions here. In Germany, I’d say the cheapest Android phones start at around 50 Euro. So what, in general, is the difference between a 50 Euro phone and a 1000 Euro Android phone? Let’s single out some points from the perspective of a mobile video creator:
1) Build quality
This can actually be surprisingly controversial. Sure, flagship phones use more premium build materials, but the move to shiny glass-covered backs has seen many an excited owner make a mess of his or her new phone with a single drop. So better get a case if you consider yourself among those who occasionally drop their phone. The plasticky build of cheaper devices might look or at least feel less premium, but these devices can often take more abuse. As for the screen itself, more expensive phones tend to have a more robust protective layer, but that doesn’t always save you – and you can also get a pretty affordable add-on screen protector if you are worried about damaging your phone’s screen.
2) Software updates
Usually, more expensive phones get more updates and/or updates for a longer period. But there are exceptions: Nokia, for instance, is known to be very good with updates even on its budget phones, so it also depends on the phone maker. Are software updates important? Yes and no. Generally, new software versions (at least the big annual ones like Android 10, Android 11 etc.) introduce new features and optimizations. New features specifically relevant for videography are pretty rare however (the last major ones were introduced with Android 5 in 2014 and then Android 11 in 2020), so it depends on whether the new features are actually helpful for what you want to get done and whether you are a tech-savvy person who always wants the latest updates to play around with. Security updates are important though, and ever since Google made it possible to distribute them separately from feature updates, they have become more common on cheaper phones as well – mid-rangers and flagships still tend to receive more software updates and for longer periods of time however.
3) Expandable storage
The ability to easily and cheaply add additional storage to your phone via a microSD card has long been a major plus of the Android system when compared to Apple’s iPhones. More and more Android OEMs however have started eliminating this valuable feature from their new releases, Samsung being the latest with its flagship S21 series. Sure, internal storage has increased over time – you can easily get phones with 128, 256 or 512 GB these days – but in my opinion it would still be good to have the option of expandable storage: UHD/4K video can fill up your phone pretty fast if you are shooting a lot. Interestingly, it’s now easier to find support for microSD cards in cheaper phones. Many, if not most, entry-level phones (still) have it, so if that’s important to you, you might want to have a look at the budget or mid-range segment of the Android phone market.
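To put that storage pressure into perspective, here’s some quick back-of-the-envelope math. The ~42 Mbit/s bitrate is my own assumption for typical 4K30 recording on phones – actual bitrates vary by device and codec:

```python
# Back-of-the-envelope storage math for UHD/4K recording.
# The 42 Mbit/s default bitrate is an assumed typical value for 4K30
# recording on phones; actual bitrates vary by device and codec.

def recording_minutes(storage_gb, bitrate_mbit_s=42):
    """Minutes of video that fit into the given storage (decimal GB)."""
    megabytes_per_minute = bitrate_mbit_s / 8 * 60  # Mbit/s -> MB/min
    return storage_gb * 1000 / megabytes_per_minute

print(round(recording_minutes(128)))  # -> 406 (minutes, i.e. under 7 hours)
print(round(recording_minutes(32)))   # -> 102
```

So even a 128 GB phone holds less than seven hours of 4K footage under these assumptions – a 32 GB budget device fills up in well under two.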
4) Removable battery
An even more exotic but dare I say “pro” feature that has become nearly extinct, yet was generally very useful for “power users”, is the ability to (easily) swap out a phone’s battery. LG was the last major phone maker to include this in a flagship device with the V20 in late 2016, but over time, non-removable batteries have trickled down even to the (ultra) budget market. The few phones with exchangeable batteries that are left can however be found there – last survivors include the Samsung XCover Pro, the Motorola Moto E6 and the Nokia 1.3. The only recent mid-range device with this feature seems to be the Fairphone 3/3+. Sure, power banks are an abundant accessory now and an easy way to juice up your phone while on the go – but the re-supply is incremental and it can be quite annoying to be tethered to an external device via cable while using the phone.
5) Processing power
While the last two points were very much in favor of budget phones, the tide is about to turn. If you want to use your phone for more than just browsing the web, checking your messages or following your social media feeds, then your phone needs some decent processing power to keep things running smoothly. One of the toughest nuts to crack for a SoC (System-on-a-Chip) is editing high-resolution video – even more so when it involves multiple tracks. So if you are planning on editing a lot of UHD/4K video with multiple layers on your phone, a budget device probably won’t cut it, because processing power is often a watershed between cheaper and more expensive phones. That doesn’t mean however that you can’t do video editing at all on a budget smartphone. About two years ago I was really surprised how well Qualcomm’s Snapdragon 430/435 SoC did in terms of video editing, allowing for multiple layers of 1080p video in KineMaster on phones like the Nokia 5, Motorola Moto G5 or LG Q6. Generally, the number of layers and their resolution in video editing apps depend on the device’s chipset. Some apps like Adobe Premiere Rush aren’t even available for budget phones because they are too demanding in terms of processing power. The SoC can definitely also influence the video RECORDING capabilities in terms of available frame rates and resolutions. If 1080p at a maximum of 30fps is good enough for what you do though, basically every phone has that covered these days, even the cheapest ones.
6) Image quality
And while the video recording resolution can be an indicator of technical image quality, it surely isn’t the only one – other things are actually (way) more important: lens quality, aperture size, sensor quality and the processing algorithm. That’s why 1080p footage shot on one phone might look better than 1080p footage shot on another. And generally, that’s also an area in which (ultra) budget phones get left behind. Again, this doesn’t mean that you should never use an entry-level phone to shoot video – some of them can capture surprisingly decent footage, and if you are “just” doing something for Facebook etc., the difference in image quality might not really be noticeable for the casual, non-pixel-peeping viewer. Also, never forget that the content/the story is way more important than the image quality! You will reach and move more people with a good story shot on a cheap phone than with a mediocre story shot on a flagship phone, never mind the latter’s superior camera.
7) Native camera app
Another aspect that can distinguish a cheap from a more expensive Android phone is the native camera app – not so much in terms of the general UI and basic functionality, but in terms of special modes and features. LG for instance has an absolutely outstanding manual video mode in the native camera app of its flagship lines, one that can rival a dedicated 3rd party app like Filmic Pro, but you don’t get it on their budget phones. The same goes for Sony and – to a lesser degree – Samsung, which at least offers support for external mics down to its entry-level offerings. Other Android phone makers however ship the same native camera app in all of their models, budget or flagship (Motorola for instance, unless they have recently changed something).
8) Camera2 API
I just mentioned 3rd party video recording apps, so let’s look at an even “nerdier” aspect: usually, more expensive phones have better Camera2 API support. What’s the Camera2 API? I have written a whole blog post about it, but in short, it’s basically the phone’s ability to give 3rd party camera apps manual control over certain more advanced imaging parameters like shutter speed, ISO, white balance etc. This is important if you are planning to use such an app (for instance Filmic Pro, ProTake or mcpro24fps) instead of the phone’s native camera app. While nowadays basically all flagship phones and many, if not most, mid-range Android phones have proper Camera2 API support, there are also entry-level phones that are equipped with it, for instance some from Nokia and Motorola – it’s not that common yet however.
9) Headphone jack
Before wrapping things up I want to look at another aspect that is of major relevance if you want to record audio with external mics on your smartphone – be it as part of capturing video or audio-only. Like the removable battery and expandable storage, the 3.5mm headphone jack is a feature that has been fading away from smartphones over the last years. Some Android OEMs are still holding on to it (for the most part) but many have eliminated it, relying solely on a single physical port (USB-C) and wireless technology (Bluetooth/WiFi). As with those other features, it’s curious that the 3.5mm headphone jack has mostly survived in budget phones. This makes a case for a very particular use scenario: if you “only” want to record audio (be it for an audio-only production or to use the phone as an external audio recorder with a lavalier on a video shoot), a budget phone can be an interesting option, because you don’t have to care about the quality of the camera and (for the most part) neither about the chipset and its processing power, since audio processing is much less resource-hungry than video processing. The external-recorder-with-a-lavalier scenario is also a clever way to make use of an old phone if you have one buried in a drawer somewhere collecting dust.
10) Bonus tip!
What if you DO want higher processing power and camera quality but are on a tight budget nonetheless? In that case, it can be helpful to look at older flagship models or mid-rangers. Once new Android phones are released, their price – not always, but often – drops after a couple of months. If you compare the camera quality and processing power of a budget phone with those of an older flagship or potent mid-ranger, you can often easily go back two or three years and still be better off with the “oldie”. Depending on what model/phone maker you choose and how far back you go, you might be stuck with an older version of Android, but as indicated earlier, this isn’t necessarily as bad as it sounds.
As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.
There are times when – for reasons of privacy or even a person’s physical safety – you want to make certain parts of a frame in a video unrecognizable so as not to give away someone’s identity or the place where you shot the video. While it’s fairly easy to achieve something like that for a photograph, it’s a lot more challenging for video, for two reasons: 1) You might have a person moving around within a shot, or a moving camera, which constantly alters the subject’s location within the frame. 2) If the person talks, he or she might also be identifiable just by his/her voice. So are there any apps that help you anonymize persons or objects in videos when working on a smartphone?
KineMaster – the best so far
Up until recently, the best app for anonymizing persons and/or certain parts of a video in general was KineMaster, which I already praised in my last blog post about the best video editing apps on Android (it’s also available for iPhone/iPad). While it’s possible to use any video editor that allows for a resizable image layer (say, a plain black square or rectangle) on top of the main track to cover a face, KineMaster is the only one with a dedicated blur/mosaic tool for this use case. Many other video editing apps have a blur effect in their repertoire, but the problem is that this effect always affects the whole image and can’t be applied to only a part of the frame. KineMaster on the other hand allows its Gaussian Blur effect to be adjusted in size and position within the frame. To access this feature, scroll to the part of the timeline where you want to apply the effect, but don’t select any of the clips! Now tap on the “Layer” button, choose “Effect”, then “Basic Effects”, then either “Gaussian Blur” or “Mosaic”. An effect layer gets added to the timeline which you can resize and position within the preview window. Even better: KineMaster also lets you keyframe this layer, which is incredibly important if the subject/object you want to anonymize is moving around the frame or if the camera is moving (thereby constantly altering the subject’s/object’s position within the frame). Keyframing means you can set “waypoints” for the effect’s area to automatically change its position/size over time. You can access the keyframing feature by tapping on the key icon in the left sidebar. Keyframes have to be set manually, so it’s a bit of work, particularly if your subject/object is moving a lot. If you just have a static shot with the person not moving around much, you don’t have to bother with keyframing though.
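For the technically curious, the two ideas at work here – a mosaic effect confined to a region and keyframed “waypoints” moving that region over time – can be sketched in a few lines of Python with NumPy. This is my own minimal illustration of the general technique, not KineMaster’s actual implementation; the function names and the simple linear interpolation are assumptions:

```python
import numpy as np

def mosaic_region(frame, x, y, w, h, block=8):
    """Pixelate a rectangular region by averaging block-sized tiles."""
    region = frame[y:y + h, x:x + w].astype(float)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = region[by:by + block, bx:bx + block]
            tile[:] = tile.mean(axis=(0, 1))  # flatten the tile to its mean
    frame[y:y + h, x:x + w] = region.astype(frame.dtype)
    return frame

def lerp_keyframes(k0, k1, t):
    """Linearly interpolate the effect region between two keyframes (0 <= t <= 1)."""
    return tuple(round(a + (b - a) * t) for a, b in zip(k0, k1))
```

In a real pipeline you would run `mosaic_region` on every frame, feeding it a position from `lerp_keyframes` so the mosaic follows the subject between the manually set waypoints.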
And as if the adjustable blur/mosaic effect and support for keyframing weren’t good enough, KineMaster also gives you a tool to add an extra layer of privacy: you can alter voices. To access this feature, select a clip in the timeline and then scroll down the menu on the right to find “Voice Changer” – there’s a whole bunch of different effects. To be honest, most of them are rather cartoonish – I’m not sure you want your interviewee to sound like a chipmunk. But there are also a couple of voice changer effects that I think can be used in a professional context.
What happened to Censr?
As indicated above, a moving subject (or a moving camera) makes anonymizing content within a video a lot harder. You can manually keyframe the blurred area in KineMaster to follow along, but it would be much easier if that could be done via automatic tracking. Last summer, a closed beta of an app called “Censr” was released on iOS that was able to automatically track and blur faces. It all looked quite promising (I saw some examples on Twitter), but developer Sam Loeschen told me that “unfortunately, development on censr has for the most part stopped”.
PutMask – a new app with a killer feature!
But you know what? There actually is a smartphone app out there that can automatically track and pixelate faces in a video: it’s called PutMask and currently only available for Android (there are plans for an iOS version). The app (released in July 2020) offers three ways of pixelating faces in videos: automatically by face-tracking, manually by following the subject with your finger on the touch-screen and manually by keyframing. The keyframing option is the most cumbersome one but might be necessary when the other two ways won’t work well. The “swipe follow” option is the middle-ground, not as time-consuming as keyframing but manual action is still required. The most convenient approach is of course automatic face-tracking (you can even track multiple faces at the same time!) – and I have to say that in my tests, it worked surprisingly well!
Does it always work? No, there are definitely situations in which the feature struggles. If you are walking around and your face gets covered by something else even for a short moment (for instance because you are passing another person or an object like a tree), the tracking often loses you. It even lost me when I was walking around indoors and the lens flare from a light bulb on the ceiling created a visual “barrier” that I passed at some point.

And although I would say that the app is generally well-designed, some of the workflow steps and the nomenclature can be a bit confusing. Here’s an example: after choosing a video from your gallery, you can tap on “Detect Faces” to start a scanning process. The app will tell you how many faces it has found and display a numbered square around each face. If you now tap on “Start Tracking”, the app tells you “At least select One filter” – but I couldn’t find any button indicating a “filter”. After some confusion I discovered that you need to tap once on the square placed over the face in the image; maybe by “filter” they actually mean you need to select at least one face? Now you can initiate the tracking. After the process is finished you can preview the tracking the app has done (and also dig deeper into the options to alter the amount of pixelation etc.), but to check the actual pixelated video you have to export your project first.

While the navigation could/should be improved for certain actions to make it clearer and more intuitive, I was quite happy with the results in general. The biggest catch until recently was the maximum export resolution of 720p, but with the latest update, released on 21 January 2021, 1080p is also supported. One additional feature that would be great to have in an app with a dedicated focus on privacy and anonymization is the ability to alter/distort a person’s voice, like you can do in KineMaster.
There’s one last thing I should address: the app is free to download with all its core functionality, but you only get SD resolution and a watermark on export. For 720p watermark-free export, you need to make an in-app purchase. The IAP procedure is without a doubt the weirdest I have ever encountered: the app tells you to purchase any one of a selection of different “characters” to receive the additional benefits. Initially, these “characters” are just names in boxes – “Simple Man”, “Happy Man”, “Metal-Head” etc. If you tap on a box, an animated character pops up. But only when scrolling down does it become clear that these “characters” represent different amounts of payment with which you support the developer. And if that wasn’t strange enough by itself, the amount you can donate goes up to a staggering 349.99 USD (character “Dr. Plague”) – no kidding! At first, I had actually selected Dr. Plague because I thought it was the coolest-looking character of the bunch. Only when trying to go through with the IAP did I become aware that I was about to drop 350 bucks on the app! Seriously, this is nuts! I told the developer that I don’t think this is a good idea. Anyway, the amount of money you donate doesn’t affect your additional benefits, so you can just opt for the first character, the “Simple Man”, which costs 4.69€. I’m not sure why they would make things so confusing for users willing to pay, but other than that, PutMask is a great new app with a lot of potential – I will definitely keep an eye on it!
As always, if you have questions or comments, drop them below or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.
Ever since I started this blog, I have wanted to write an article about my favorite video editing apps on Android, but I could never decide how to go about it: whether to write a separate in-depth article on each of them, one really long piece on all of them, or a more condensed overview without too much detail or workflow explanation. I recently figured there had been enough pondering on the subject and I should just start writing. The basic common ground for all the mobile video editing apps mentioned here is that they let you combine multiple video clips in a timeline and arrange them in a desired order. Some might question the validity of editing video on a screen as relatively small as a smartphone’s (even though screen sizes have increased drastically over the last years). While it’s true that there definitely are limitations and I probably wouldn’t consider editing a feature-length movie that way, there’s also an undeniable fascination in the fact that it’s actually doable – and it can be a lot of fun. I would even dare to say that it’s a charming throwback to the days before digital non-linear editing, when the process of cutting and splicing actual film strips had a very tactile nature to it. But let’s get started…
KineMaster
When I got my first smartphone in 2013 and started looking for video editing apps in the Google PlayStore, I ran into a lot of frustration. There was a plethora of video editing apps, but almost none of them could do more than manipulate a single clip. Then, in late December, an app called KineMaster was released, and just by looking at the screenshots of the UI I could tell that this was the game changer I had been waiting for: a mobile video editing app that actually aspired to give you the proper feature set of a (basic) desktop video editing software. Unlike some other (failed) attempts in that respect, the devs behind KineMaster realized that giving the user more advanced editing tools could become an unpleasant boomerang flying in their face if the controls weren’t touch-friendly on a small screen. If you ever had the questionable pleasure of using a video editing app called “Clesh” on Android (it’s long gone), you know what I’m talking about. To this day, I still think that KineMaster has one of the most beautiful and intuitive UIs of any mobile app. It really speaks to the app’s ingenuity that, despite having grown into a respectable mobile video editing powerhouse with many pro features, even total editing novices usually have no problem getting the hang of the basics within a couple of hours or even minutes.
While spearheading the mobile video editing revolution on Android, KineMaster also dared to become one of the first major apps to drop the one-off payment model and pioneer a subscription model. I had initially paid a one-off 2€ for the pro version of the app to get rid of the watermark; now you had to pay 2 or 3€ a month (!). I know, “devs gotta eat”, and I’m all for paying a decent amount for good apps, but this was quite a shock, I have to admit. It needs to be pointed out that KineMaster is actually free to download with all its features (so you can test it fully and with no time limit before investing any money) – but you always get a KineMaster watermark in your exported video and the export resolution doesn’t include UHD/4K. If you are just doing home movies for your family, that might be fine, but if you do stuff in a professional or even just more ambitious environment, you probably want to get rid of the watermark. Years later, with every other app having jumped on the subscription bandwagon, I do feel that KineMaster is still one of the apps that are really worth it. I already praised the UI/UX, so here are some of the important features: you get multiple video tracks (resolution and number are device-dependent) and other media layers (including support for PNG images with transparency), options for multiple frame rates including PAL (25/50), the ability to select from a wide variety of popular aspect ratios for projects (16:9, 9:16, 1:1, 2.35:1 etc.) and even duplicate a project with a different aspect ratio later (very useful if you want to share a video on multiple platforms), keyframes to animate content, a very good title tool, audio ducking, voice-over recording, basic grading tools and, last but not least, the Asset Store.
That’s the place where you can download all kinds of helpful assets for your edit: music, fonts, transitions, effects and most of all (animated) graphics (‘stickers’) that you can easily integrate into your project and make it pop without having to spend much time on creating stuff from scratch. Depending on what you are doing, this can be a massive help! I also have to say that despite Android’s fragmentation with all its different phones and chipsets, KineMaster works astonishingly well across the board.
There are still things that could be improved (certain parts of the timeline editing process, media management, precise font sizes, audio waveforms for video clips, quick audio fades, project archives etc.) and development progress seems to have slowed down over the last one or two years, but KineMaster remains a top contender – if not the top contender – for the Android video editing crown, although it’s way more challenged than in the past. Last note: KineMaster has recently released beta versions of two “helper” apps, VideoStabilizer for KineMaster and SpeedRamp for KineMaster. I personally wish they had integrated this functionality into the main app, but it’s definitely better than not having it at all.
PowerDirector
The first proper rival for KineMaster emerged about half a year later, in June 2014, with Cyberlink’s PowerDirector. Unlike KineMaster, PowerDirector was already an established name in the video editing world, at least on the consumer/prosumer level. In many ways, PowerDirector’s feature set is somewhat (yet not completely) equal to KineMaster’s, one key missing option being export in PAL frame rates (if you don’t need to export in 25/50fps, you can ignore this shortcoming). The UI is also good and pretty easy to learn. After KineMaster switched to the subscription model, PowerDirector did have one big factor in its favor: you could still get the full, watermark-free version of the app by making a single, quite reasonable payment – I think it was about 5€. That, however, eventually changed, and PowerDirector joined the ranks of apps that you can’t own anymore but only rent via a subscription to get access to all features and watermark-free export. Despite being slightly more expensive than KineMaster now, it’s still a viable and potent mobile video editor with some tricks up its sleeve.
It was for instance – until recently – the only mobile video editor with an integrated stabilization tool to tackle shaky footage. It’s also the only one with a dedicated de-noise feature for audio, and unlike KineMaster it lets you mix audio levels by track in addition to by individual clip. Furthermore, PowerDirector offers the ability to transfer projects from mobile to its desktop version via the Cyberlink Cloud, which can come in handy if you want to assemble a rough cut on the phone but do more in-depth work on a bigger screen with mouse control. Something rather annoying is the way the app tries to nudge – or dare I say shove – you towards a subscription. As I bought the app before the introduction of the subscription model, I can still use all of its features and export without a watermark, but before getting to the edit workspace, the app bombards me with full-screen ads for its subscription service every single time – I really hate that. One last thing: there are a couple of special Android devices on which PowerDirector takes mobile video editing to another level, but that’s for a future article, so stay tuned.
Adobe Premiere Rush
Even more so than Cyberlink, Adobe is a well-known name in the video editing business thanks to Premiere Pro (Windows/macOS). More than once I had asked myself why such a big player had missed the opportunity to get into the mobile editing game. Sure, they dipped their toes into the water with Premiere Clip, but after a mildly promising launch, the app’s development stagnated all too soon and it was eventually abandoned – not much of a loss, as it was pretty basic. In 2018 however, Adobe bounced back onto the scene with a completely new app, Premiere Rush. This time, it looked like the video editing giant was ready to take the mobile platform seriously.
The app has a very solid set of advanced editing features and even some specialties that are quite unique/rare in the mobile editing environment: you can for instance expand the audio of a video clip without actually detaching it and risking it going out of sync – very useful for J & L cuts. There’s also a dedicated button that activates multi-select for clips in the timeline, another great feature. What’s more, Rush has true timeline tracks for video. What do I mean by “true”? KineMaster and PowerDirector support video layers, but you can’t just move a clip from the primary track to an upper/lower layer track and vice versa – this isn’t much of a problem most of the time, but sometimes it can be a nuisance. In Rush you can move your video clips up and down the tracks effortlessly. True tracks also mean that you can easily disable/mute/lock a particular track and all the clips that are part of it. One of Rush’s marketed highlights is the auto-conform feature, which is supposed to automatically adapt your edit to other aspect ratios, using AI to frame the image in the (hopefully) best way. So if you have a classic 16:9 edit, for instance, you can use this to get a 1:1 video for Instagram. This feature is reserved for premium subscribers, but you can still manually alter the aspect ratio of your project in the free version. For a couple of months, the app was only available for iOS, but it premiered (pardon the pun!) on Android in May 2019. Like PowerDirector, you can use Adobe’s cloud to transfer project files to the desktop version of Rush (or even import them into Premiere Pro), which is useful if the work is a bit more complex. It’s also possible to have projects automatically sync to the cloud (a subscriber feature).
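The geometry behind such aspect-ratio conversion is simple even without the AI part. Here is a sketch of a naive center crop to a new aspect ratio – my own illustration of the general technique, not Adobe’s actual algorithm, which additionally tries to pick the best framing:

```python
def center_crop_to_aspect(width, height, target_w, target_h):
    """Return (x, y, w, h) of a centered crop matching the target aspect ratio."""
    target = target_w / target_h
    if width / height > target:       # source too wide -> trim the sides
        new_w = round(height * target)
        return ((width - new_w) // 2, 0, new_w, height)
    new_h = round(width / target)     # source too tall -> trim top/bottom
    return (0, (height - new_h) // 2, width, new_h)

# A 16:9 full-HD frame reframed to 1:1 for Instagram
print(center_crop_to_aspect(1920, 1080, 1, 1))  # -> (420, 0, 1080, 1080)
```

What an “AI framing” feature adds on top is essentially choosing a smarter `x`/`y` offset per shot (e.g. centered on a detected face) instead of always cropping from the middle.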
Initially, the app had a very expensive subscription of around 10€ per month (with only three free exports to test) unless you were already an Adobe Creative Cloud subscriber, in which case you got it for free. It has now become more affordable (4.89€ monthly or 33.99€ per year), and the basic version with most features including 1080p export (UHD/4K is a premium feature) is free and doesn’t even force a watermark on your footage – you do need to create a (free) account with Adobe though.
The app does have its quirks – how many of them are still teething troubles, I’m not sure. In my personal tests with a Google Pixel 3 and a Pocophone F1, export times were sometimes outrageously long, even for short 1080p projects. Both my test devices are powered by a Snapdragon 845 SoC, which is a bit older but was a top flagship processor not too long ago and should easily handle 1080p video. Other editing apps had no problem rushing out (there goes another pun!) the same project on the same devices. This leads me to believe that the app’s export engine still needs some fine-tuning and optimization. But maybe things look better on newer, even more powerful devices. Another head-scratcher was frame rate fidelity: while the export window gave me a “1080p Match Framerate” option as an alternative to “1080p 30fps”, surely indicating that it would keep the frame rate of the used clips, working with 25fps footage regularly resulted in a 30fps export. The biggest caveat with Rush though is that its availability on Android is VERY limited. If you have a recent flagship phone from Samsung, Google, Sony or OnePlus, you’re invited; otherwise you are out of luck – for the moment at least. For a complete list of currently supported Android devices check here.
VN
Ever since I started checking the Google Play Store for interesting new apps on a regular basis, it has rarely happened that I find a brilliant one that's already been out for a very long time. It does happen on very rare occasions however, and VN is the perfect case in point. VN had already been available for Android for almost two years (the Play Store lists May 2018 as the release date) when it eventually popped up on my radar in March 2020 during a routine search for "video editors". VN is a very powerful video editor with a robust set of advanced tools and a UI that is clean, intuitive and easy to grasp. You get a multi-layer timeline, support for different aspect ratios (including 16:9, 9:16, 1:1 and 21:9), voice-over recording, transparency with PNG graphics, keyframing for graphical objects (not audio, though there's an option for a quick fade in/out), basic exposure/color correction, a solid title tool, and export options for resolution up to UHD/4K, frame rate (including PAL frame rates) and bitrate.
VN is also currently the only one of the advanced mobile video editing apps with a dedicated and very easy-to-use speed-ramping tool, which is helpful when manipulating a clip's playback speed. It's also great that you can move video clips up and down the tracks, although it's not as intuitive as Adobe Premiere Rush in that respect since you can't just drag & drop but have to use the "Forward/Backward" button. But once you know how to do it, it's very easy. While other apps might have a feature or two more, VN has a massive advantage: It's completely free – no one-off payment, no subscription, no watermark. You do have to watch a 5-second full-screen ad when launching the app and delete a "Directed by" bumper clip from every project's timeline, but it's really not much of a bother in my opinion. In the past you had to create an account with VN but that's not a requirement anymore. Will it stay free? When I talked to VN on Twitter some time ago, they told me that the app as such is supposed to remain free of charge but that they might at some point introduce certain premium features or content. VN recently launched a desktop version for macOS (no Windows yet) and the ability to transfer project files between iOS and macOS. While this is currently only possible within the Apple ecosystem (and does require that you register an account with VN), more cross-platform integration could be on the horizon. All in all, VN is an absolutely awesome and easily accessible mobile video editor widely available for most Android devices (Android 5.0 & up) – but do keep in mind that depending on the power of your phone's chipset, the number of video layers and the supported editing/exporting resolution can vary.
CapCut is somewhat similar to VN in terms of basic functionality (multiple video tracks, support for different frame rates including PAL, variety of aspect ratios etc.) and layout, but with a few additional nifty features that might come in handy depending on the use case. Like VN, it’s completely free without a watermark and you don’t have to create an account. CapCut was – following Cyberlink’s PowerDirector – the second advanced mobile video editing app to introduce a stabilization tool and it can even be adjusted to some degree.
Its unique standout double-feature however has to do with automatic speech-to-text/text-to-speech processing. As we all know, captions have become an integral part of video production for social media platforms: many if not most of us browse our network feeds with the sound turned off, so captions can be a way to motivate users to watch a video even when it's muted. While it's no problem to manually create captions with the title tool in basically any video editing app, this can be very time-consuming and fiddly on a mobile device. So how about auto-generated captions? CapCut has you covered. It doesn't work perfectly (you sometimes have to do some manual editing) and it's currently only available in English, but it's definitely a very cool feature that none of the other editors mentioned here can muster. Interestingly, it's also possible to do it the other way around: You can let the app auto-generate a voice-over from a text layer. There are three different voices available: "American Male", "American Female" and "British Female" (English only again). This can be useful if you quickly need to create a voice-over on the go and there's no time or quiet place to record one, or if you are not comfortable recording voice-overs with your own voice. Any cons? Generally, I prefer VN of the two because I like the design and UX of the timeline workspace better and find it easier to navigate, but that's probably personal taste. An actual shortcoming however, if you are after the highest possible quality, is that CapCut lacks support for UHD/4K export. Don't get me wrong, you can import UHD/4K footage into the app and work with it, but the export resolution is limited to 1080p and you also can't adjust the bitrate. From a different angle, it should also be mentioned that CapCut is owned by Bytedance, the company behind the popular social video platform TikTok.
While you don't have to create an account for CapCut, you do have to agree to their T&Cs to use the app. So if you are very picky about who gets your data and have kept your fingers off TikTok for that reason, you might want to take this into consideration.
Special mention (Motion Graphics): Alight Motion
Alight Motion is a pretty unique mobile app that doesn't really have an equivalent at the moment. While you can also use it to stitch together a bunch of regular video clips filmed with your phone, this is not its main focus. The app is centered around creating advanced, multi-layered motion graphics projects – think of it as a reduced mobile version of Adobe After Effects. Its power lies in the fact that you can manipulate and keyframe a wide range of parameters (for instance movement/position, size, color and shape) on different types of layers to create complex and highly individual animations, spruced up with a variety of cool effects drawn from an extensive library. It takes some learning to unleash the enormous potential that lies within the app, and fiddling around with a heavy load of parameters and keyframes on a small(ish) touch screen can occasionally be a bit challenging, but the clever UI (designed by the same person who made KineMaster so much fun to use) makes the process basically as good and accessible as it can get on a mobile device. A recent update also added effect presets, which should make things easier for beginners who might be somewhat intimidated by manually keyframing parameters. Pre-designed templates for graphics and animations created by the dev team or other users will make things even more accessible in the future – some are already available but still too few to fully convince passionate users of apps such as the very popular but discontinued Legend. Alight Motion is definitely worth checking out as you can create amazing things with it (like explainer videos or animated infographics), if you are willing to accept a small learning curve and invest some time. This is coming from someone who regularly throws in the towel trying to get the hang of Apple's dedicated desktop motion graphics software Motion.
Alight Motion has become the first application in this category in which I actually feel like I know what I'm doing – sort of, at least. One very cool thing is that you can also use Alight Motion as a photo/still graphics editor since it lets you export the current timeline frame as a PNG, even with transparency! The app is free to download, but to access certain features and export without a watermark you have to get a subscription, currently around 28€ per year or 4.49€ per month.
Special mention (Automated Editing): Quik
Sometimes, things have to go quik-ly and you don't have the time or ambition to assemble your clips manually. While I'm generally not a big fan of automated video editing, GoPro's free Quik app can come in handy at times. You just select a bunch of photos or videos, an animation style and your desired aspect ratio (16:9, 9:16, 1:1), and the app creates an automatic edit for you based on what it thinks are the best bits and pieces. In case you don't like the result, you have the option to change things around and select excerpts that you prefer – generally though, manual control is rather limited and it's definitely not for more advanced edits. It's also better suited for purely visual edits without important scenes relying on the original audio (like a person saying something of interest). GoPro, which acquired the app some time ago, is apparently working on a successor to Quik and will eventually pull this one from the Google Play Store later in 2021, but here's hoping that the "new Quik" will be just as useful and accessible.
Special mention (360 Video Editing): V360
While 360 video hasn't exactly become mainstream, I don't want to ignore it completely in this post. Owners of a 360 camera (like the Insta360 One X2 I wrote about recently) usually get a companion mobile app along with the hardware which also allows basic editing. In the case of the Insta360 app you actually get quite a range of tools, but it's geared more towards reframing and exporting as a traditional flat video – you can only export a single clip in true 360 format. So if you want to create a story with multiple 360 video clips and also export as true, immersive 360 video with the appropriate metadata for 360 playback, you need a 3rd-party app. I already mentioned V360 in one of my very early blog posts but I want to come back to it as the landscape hasn't really changed since then. V360 gives you a set of basic editing tools to create a 360 video story from multiple clips. You can arrange the clips in the desired order, trim and split them, and add music and titles/text. It's rather basic but good for what it is, with a clean interface and exports in original resolution (at least up to 5.7k, which is what I was able to test). The free version doesn't allow you to add transition effects between clips and puts a V360-branded bumper clip at the end that you can only remove in the paid version (4.99€). There are two other solid 360 video editors (Collect and VeeR Editor) which are comparable and even offer some additional/different features, but I personally like V360 best, although it has to be said that the app hasn't seen an update in over two years.
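A quick aside on what "appropriate metadata" means here: players like YouTube detect 360 video via XMP tags defined in Google's spherical video metadata spec (the GSpherical namespace). If you want to sanity-check whether an exported file carries them, a crude heuristic (my own sketch, not a proper MP4/XMP parser) is to scan the file for that marker:

```python
def looks_like_360_video(path: str) -> bool:
    """Heuristic check: scan a video file for the 'GSpherical' XMP
    marker (from Google's spherical video metadata spec) that players
    use to detect equirectangular 360 video. This is not a real MP4
    parser, just a quick sanity check on an exported file."""
    with open(path, "rb") as f:
        return b"GSpherical" in f.read()
```

If the marker is missing from an export that should be true 360, Google's free Spatial Media Metadata Injector can add the tags after the fact.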
What’s on the horizon?
There's one big name in mobile editing town that's still missing from the Android platform – of course I'm talking about LumaFusion. According to LumaTouch, the company behind LumaFusion, they are currently exploring an Android version and have apparently already hired some dedicated developers. I therefore suspect that despite the various challenges an app as demanding as LumaFusion will encounter in a port to a different mobile operating system, we will see at least an early beta version in 2021. Furthermore, while I don't have any concrete evidence, I assume that an Android version of Videoleap, another popular iOS-only video editor, might also be in the works. Not quite as advanced and feature-packed as LumaFusion, it's pretty much on par in many respects with the current top dogs on Android. So while there definitely is competition, the app's demands are certainly within what can be achieved on Android, and the fact that the developers have already brought other apps from their portfolio to Android indicates some interest in the platform.
As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.
A couple of years ago, 360° (video) cameras burst onto the scene and seemed to be all the rage for a while. The initial excitement faded relatively quickly however when producers realized that this kind of video didn't resonate with the public as much as they had thought it would – at least in the form of immersive VR (Virtual Reality) content, for which you need extra hardware that most people didn't bother to get or didn't get hooked on. From a creator's side, 360 video also involved some extra and – dare I say – fairly tedious workflow steps to deliver the final product (I have one word for you: stitching). That's not to say that this extraordinary form of video doesn't have value or vanished into total obscurity – it just didn't become a mainstream trend.
Among the companies that invested heavily in 360 cameras was Shenzhen-based Insta360. They offered a wide variety of devices: some standalone, some meant to be physically connected to a smartphone. I actually got the Insta360 Air for Android devices, and while it was not a bad product at all and fun for a short while, having to connect it to the phone's USB port for every use and take it off again before putting the phone back in my pocket quickly sapped my motivation to keep using it.
Repurposing 360 video
While continuing to develop new 360 cameras, Insta360 realized that 360 video could be utilized for more than just regular spherical video: overcapture and subsequent reframing for "traditional", "flat" video. What does this mean in plain English? Well, the original spherical video that is captured is much bigger in terms of resolution/size than the one you want as a final product (for instance classic 1920×1080), which gives you the freedom to choose your angle and perspective in post production and even create virtual camera movement and other cool effects. Insta360 by no means invented this idea but they were clever enough to shift their focus towards this use case. Add to that the marketing gold of the "invisible selfie-stick" (taking advantage of a dual-lens 360 camera's blind spot between its lenses), brilliant "Flow State" stabilization and a powerful mobile app (Android & iOS) full of tricks, and you end up with a significant popularity boost for your products!
The One X and the wait for a true successor
The one camera that proved to be an instant and long-lasting success for Insta360 was the One X, released in 2018. A very compact & slick form factor, ease of use and very decent image quality (except in low light), plus the clever companion app, breathed some much-needed life into a fairly wrinkled and deflated 360 video camera balloon. In early 2020 (you know, the days when most of us still didn't know there was a global pandemic at our doorstep), Insta360 surprised us by releasing not a direct successor to everybody's darling but the modular One R, a flexible and innovative yet slightly clunky brother to the One X. It wasn't until the end of October that Insta360 finally revealed the true successor to the One X: the One X2.
In the months prior to the announcement of the One X2, I had actually thought about getting the original One X (I wasn't fully convinced by the One R) but it was sold out in most places and there were some things that bothered me about the camera. To my delight, Insta360 seemed to have addressed most of the issues that I (and obviously many others) had with the original One X: They improved the relatively poor battery life by making room for a bigger battery, added the ability to connect an external mic (both wirelessly via Bluetooth and via the USB-C port), included a better screen on which you can actually see things and change settings in bright sunlight, gave you the option to stick on lens guards to protect the delicate protruding lenses, and made it more rugged, including IPX8 waterproofing (down to 10m) and a less flimsy thread for mounting it on a stick or tripod. All good then? Not quite. Just by looking at the spec sheet, people realized that there wasn't any upgrade in terms of video resolution or even just frame rates. It's basically the same as the One X: it maxes out at 5.7k (5760×2880) at 30fps (with options for 25 and 24), 4k at 50fps and 3k at 100fps. The maximum bitrate is 125 Mbit/s. I'm sure quite a few folks had hoped for 8k (to get on par with the Kandao Qoocam 8K) or at the very least a 50/60fps option for 5.7k. Well, tough luck.
While I can certainly understand some of the frustration about the fact that there hasn't been any bump in resolution or frame rates in two years, putting 8K into such a small device and also having the footage work for editing on today's mobile devices probably wasn't a step Insta360 was ready to take, given the risk of a worse user experience despite the higher-resolution image. Personally, I wasn't bothered too much by this since the other hardware improvements over the One X were good enough for me to go ahead and make the purchase. And this is where my own frustrations began…
Insta360 & me: It’s somewhat difficult…
While I was browsing the official Insta360 store to place my order for the One X2, I noticed a pop-up saying you could get 5% off your purchase if you signed up for their newsletter. Certain cameras and accessories were excluded, but the One X2 was mentioned nowhere. So I thought, "Oh, great! This comes just at the right time!", and signed up for the newsletter. However, entering the discount code during check-out always returned a "Code invalid" error message. I took to Twitter to ask them about this – no reply. I contacted their support by email and they eventually and rather flippantly told me something like "Oh, we just forgot to put the X2 on the exclusion list, sorry, it's not eligible!". Oh yes, me and Insta360 support were off to a great start!
Wanting to equip myself with the (for me) most important accessories, I intended to purchase a pair of spare batteries and the microphone adapter (USB-C to 3.5mm). I could write a whole rant about how outrageous I find the fact that literally everyone seems to make proprietary USB-C to 3.5mm adapters that don't work with other brands/products. E-waste galore! Anyway, there's a USB-C to 3.5mm microphone adapter from Insta360 available for the One R, and I thought that at least within the Insta360 ecosystem there should be some cross-device compatibility. Hell no, they told me, the microphone adapter for the One R doesn't work with the One X2. Ok, so I need to purchase the more expensive new one for the X2 – swell! But wait, I can't, because while it's listed in the Insta360 store, it's not available yet. And neither are extra batteries. The next bummer. So I bought the Creator Kit including the "invisible" selfie-stick, a small tripod, a microSD card, a lens cap and a pair of lens guards.
A couple of weeks later, the package arrived – no problem, in the era of Covid I'm definitely willing to cut some slack in terms of delivery times, and the merchandise is sent from China so it has quite a way to travel to Germany. I opened the package, took out the items and checked whether anything was broken. I noticed that one of the lens guards had a small blemish/scratch on it. I put them on the camera anyway, thinking maybe it wouldn't really show in the footage. Well, it did. A bit annoying, but stuff like that happens – a lemon. I contacted the support again. They wanted me to take a picture of the affected lens guard. Okay. I sent them the picture. They flatly replied that I should just buy a new one from their store, basically insinuating that it was me who had damaged the lens guard. What terrible customer service! I suppose I would have mustered some understanding for their behaviour if I had contacted them days or weeks later, after actually using the X2 outdoors where stuff can quickly happen. But I got in touch with them the same day the delivery arrived, and they should have been able to verify that since the delivery had a tracking number. Also, this item costs 25 bucks in the Insta360 store but probably only a few cents in production, and I wasn't even asking about a pair but only one – why make such a fuss about it? So there was some back-and-forth, and only after I threatened to return the whole package and ask for a complete refund did they finally agree to send me a replacement pair of lens guards at no extra cost. On a slightly positive note, the replacements did arrive very quickly, only a couple of days later.
Is the Insta360 One X2 actually a good camera?
So what an excessive prelude I have written! What about the camera itself? I have to admit that after using it for about a month, it's been a lot of fun for the most part. The design is rugged yet still beautifully simplistic and compact, the image quality in bright, sunny conditions is really good (if you don't mind the slightly over-sharpened wide-angle look and that it's still "only" 5.7k – remember this resolution covers the whole 360 image, so it's not equivalent to a 5.7k "flat" image), the stabilization is generally amazing (as long as the camera and its sensor are not exposed to extreme physical shakes, which the software stabilization can't compensate for) and the reframing feature in combination with the camera's small size and weight gives you immense flexibility in creating very interesting and extraordinary shots.
Sure, it also has some weaknesses: Despite the 5.7k 360 resolution, if you want to export as a regular flat video you are limited to 1080p. If you need your final video in UHD/4K non-360 resolution, this camera is not for you. The relatively small sensor (I wasn't able to find the exact size for the X2 but I assume it's the same as the One X's 1/2.3″) makes low-light situations at night or indoors a challenge despite a fixed f/2.0 aperture – even a heavily overcast daytime sky can prove less than ideal. Yes, a slightly bigger sensor compared to its predecessors would have been welcome. The noticeable image noise introduced by auto-exposure in such dim conditions can be reduced by exposing manually (you can set shutter speed and ISO), but then of course you might just end up with an image that's quite dark. The small sensor also doesn't allow for any fancy "cinematic" bokeh, but in combination with the fixed focus this has an upside that shouldn't be underestimated for self-shooters: You don't have to worry about a pulsating auto-focus or being out of focus, as everything is always in focus. You can also shoot video in LOG (flatter image for more grading flexibility) and HDR (improved dynamic range in bright conditions) modes. Furthermore, there's a dedicated non-360 video mode with a 150-degree field of view, but apart from a slight bump in resolution compared to flat reframed 360 video (1440p vs. 1080p) and smaller file sizes (you can also shoot your 5.7k in the H.265 codec to save space), I don't see myself using it a lot as you lose all the flexibility in post.
While it's good that all the stitching is done automatically and the camera does a fairly good job, it's not perfect, and you should definitely familiarize yourself with where the (video) stitch line runs so you can keep it away from important objects or persons, particularly faces. As a rule of thumb, when filming yourself or others you should always have one of the two lenses pointed towards you/the person rather than the side of the camera. That's fairly easy if the camera usually stays in the same position relative to you, but it becomes trickier when you include elaborate camera movements (which you probably will, as the X2 practically invites you to!).
Regarding audio, the internal 4-mic ambisonic setup can produce good results for ambient sound, particularly if you have the camera close to the sound source, like when it's on a stick pointing down while you walk over fresh snow, dead leaves, gravel etc. For recording voices in good quality you also need to be pretty close to the camera's mics, so having it on a fully extended selfie-stick isn't ideal. If you want to use the X2 on an extended stick and talk to the camera, you should use an external mic – either one that is directly connected to the camera or one plugged into an external recorder, which means syncing audio and video later in post. As mentioned before, the X2 now supports external mics via the USB-C charging port (with the right USB-C-to-3.5mm adapter) and also via Bluetooth. Insta360 highlights in their marketing that you can use Apple's AirPods (Pro), but other Bluetooth mics work as well. The audio sample rate of Bluetooth mics is currently limited to 16kHz by the standard, but depending on the mic you can still get decent audio. I'll probably write a separate article on using external mics with the X2 once my USB-C to 3.5mm adapter arrives. Wait, does the X2 shoot 360 photos as well? Of course it does. They turn out quite decent, particularly with the new "Pure Shot" feature, and the stitching is better than in video mode. It's no secret though that the X2 focuses on video, and for those who mainly care about 360 photography for virtual tours etc., the offerings in Ricoh's Theta line will probably be the better choice.
The Insta360 mobile app
The Insta360 app (Android & iOS) might deserve its own article to get into detail but suffice it to say that while it can seem a bit overwhelming and cluttered occasionally and you also still experience glitches now and then, it’s very powerful and generally works well. Do note however that if you want to export in full 5.7k resolution as a 360 video you have to transfer the original files to a desktop computer and work with them in the (free) Insta360 Studio software (Windows/macOS) as export from the mobile app is limited to 4K. You should also be aware of the fact that neither the mobile app nor the desktop software works as a fully-fledged traditional video editor for immersive 360 video where you can have multiple clips on a timeline and arrange them for a story. In the mobile app, you do get such an editing environment (“Stories” – “My Stories” – “+ Create a story”) but while you can use your original spherical 360 footage here, you can only export the project as a (reframed) flat video (max resolution 2560×1440). If you need your export to be an actual 360 video with according metadata, you can only do this one clip at a time outside the “Stories” editing workspace. But as mentioned before, Insta360 focuses on the reframing of 360 video with its cameras and software, so not too many people might be bothered by that. One thing that really got on my nerves while editing within the app on an iPad: When you are connected to the X2 over WiFi, certain parts of the app that rely on a data connection don’t work, for instance you are not able to browse all the features of the shot lab (only those that have been cached before) or preview/download music tracks for the video. This is less of a problem on a phone where you still can have a mobile data connection while using a WiFi connection to the X2 (if you don’t mind using up mobile data) but on an iPad or any device that doesn’t have an alternative internet connection, it’s quite annoying. 
You have to download the clip first, then disconnect from the X2, re-connect to your home WiFi and only then download the track you want to use.
Who is the One X2 for?
Well, I'd say that it can be particularly useful for solo-shooters and solo-creators for several reasons: Most of all, you don't have to worry much about missing something important around you while shooting, since you are capturing a 360 image and can choose the angle in post (reframing/keyframed reframing) if you export as a regular video. This can be extremely useful for scenarios where there's a lot to see or happening around you, like if you are travel-vlogging from interesting locations or reporting from within a crowd – or just generally if you want to do a piece-to-camera while also showing the viewer what you are looking at in the same moment. Insta360's software stabilization is brilliant and comparable to a gimbal, and the "invisible" selfie-stick makes it look like someone else is filming you. The stick and the compact form of the camera also let you move the camera to places that seem impossible otherwise. With the right technique you can even do fake "drone" shots. It therefore also makes sense to have the X2 in your tool kit just for special shots, even if you are neither a vlogger nor a journalist, nor interested in "true" 360 video.
A worthy upgrade from the One X / One R?
Should you upgrade if you have a One X or One R? Yes and no. If you are happy with the battery life of the One X or the form factor of the One R and were mainly hoping for improved image quality in terms of resolution / higher frame rates, then no, the One X2 does not do the trick, it’s more of a One X 1.5 in some ways. However, if you are bothered by some “peripheral” issues like poor battery life, very limited functionality of the screen/display, lack of external microphone support (One X) or the slightly clunky and cumbersome form factor / handling (One R) and you are happy with a 5.7k resolution, the X2 is definitely the better camera overall. If you have never owned a 360 (video) camera, this is a great place to start, despite its quirks – just be aware that Insta360’s support can be surprisingly cranky and poor in case you run into any issues.
As always, if you have questions or comments, drop them here or hit me up on Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.
I usually don't follow the stats for my blog, but when I recently checked which articles have been the most popular so far, I noticed that one stuck out by a large margin: the piece on using external microphones with Android devices. So I thought, if people are interested in that, why not write an equivalent for iOS, that is, for iPhones? Let's jump right into it.
First things first: The Basics
A couple of basic things first: Every iPhone has a built-in microphone for recording video that, depending on the use case, might already be good enough if you can position the phone close to your talent/interviewee. Having your mic close to the sound source is key in every situation if you want good audio! As a matter of fact, the iPhone has multiple internal mics and uses different ones for recording video (next to the lens/lenses) and pure audio (bottom part). When doing audio-only for radio etc., it's relatively easy to get close to your subject and get good results. That doesn't work as well for video though, unless you want to shove your phone into someone's face. In this case you can and should significantly improve the audio quality of your video by using an external mic connected to your iPhone – never forget that audio is very important! While the number of Android phone makers that support external mics in their native camera app is slowly growing, there are still many (most?) Android devices out there that don't support this in the camera app that comes with the phone (though it's possible with basically every Android device if you use a 3rd-party camera app!). You don't have to worry about this when shooting with the native camera app of an iPhone: it will recognize a connected external mic automatically and use it as the audio input when recording video. As for 3rd-party video recording apps, many of them, like Filmic Pro, MoviePro or Mavis, support external mics as well, but with some of them you have to choose the audio input in the settings, so definitely do some testing before using one on a critical job for the first time. Although I'm looking at this from a videographer's angle, most of what I'm about to elaborate on also applies to recording with audio recording apps. And in the same way, when I say "iPhone", I could just as well say "iPad" or "iPod Touch".
So there are basically three different ways of connecting an external mic to your iPhone: via the 3.5mm headphone jack, via the Lightning port and via Bluetooth (wireless).
3.5mm headphone jack & adapter
With all the differences between Android and iOS both in terms of hardware and software, the 3.5mm headphone jack was, for a while, a somewhat unifying factor – that was until Apple decided to drop the headphone jack for the iPhone 7 in 2016. This move became a widely debated topic, surely among the – let’s be honest – comparatively small community of mobile videographers and audio producers relying on connecting external mics to their phones but also among more casual users because they couldn’t just plug in their (often very expensive) headphones to their iPhone anymore. While the first group is definitely more relevant for readers of this blog, the second was undoubtedly responsible for putting the issue on the public debate map. Despite the considerable outcry, Apple never looked back. They did offer a Lightning-to-3.5mm adapter – but sold it separately. I’m sure they have been making a fortune since; don’t ask how many people had to buy it more than once because they lost, misplaced or broke the first one. A whole bunch of Android phone makers obviously thought Apple’s idea was a progressive step forward and started ditching the headphone jack as well, equipping their phones only with a USB-C port. Unlike with Apple however, consumers could still opt for a new phone that had a headphone jack and in a rather surprising turn of events, some companies like Huawei and Google actually backtracked and re-introduced the headphone jack, at least for certain models. Anyway, if you happen to have an older iPhone (6s and earlier) you can still use the wide variety of external microphones that can be connected via the 3.5mm headphone jack without worrying much about adapters and dongles.
Lightning port
While most Android users probably still have fairly fresh memories of a different charging port standard (microUSB) from the one that is common now (USB-C), only seasoned iPhone aficionados will remember the days of the 30-pin connector that lasted until the iPhone 5 introduced the Lightning port as a new standard in 2012. And while microUSB mic solutions for Android could be counted on one hand and USB-C offerings took forever to become a reality, there were dedicated Lightning mics even before Apple decided to kill the headphone jack. The most prominent one and a veritable trailblazer was probably IK Multimedia’s iRig Mic HD and its successor, the iRig Mic HD 2. IK Multimedia’s successor to the iRig Pre, the iRig Pre HD, comes with a Lightning cable as well. But you can also find options from other well-known companies like Zoom (iQ6, iQ7), Shure (MV88/MV88+), Sennheiser (HandMic Digital, MKE 2 Digital), Rode (VideoMic Me-L), Samson (Go Mic Mobile) or Saramonic (Blink 500). The Saramonic Blink 500 comes in multiple variations, two of them specifically targeted at iOS users: the Blink 500 B3 with one transmitter and the B4 with two transmitters. The small receiver plugs right into the Lightning port and is therefore an intriguingly compact solution, particularly when using it with a gimbal. Saramonic also has the SmartRig Di and SmartRig+ Di audio interfaces that let you connect one or two XLR mics to your device. IK Multimedia offers two similar products with the iRig Pro and the iRig Pro Duo. Rode recently released the USB-C-to-Lightning patch cable SC15 which lets you use their VideoMic NTG (which comes with TRS/TRRS cables) with an iPhone. There’s also a Lightning connector version of the SC6 breakout box, the SC6-L, which lets you connect two smartLavs or other TRRS mics to your phone. I have dropped lots of product names here so far but you know what?
Even if you don’t own any of them, you most likely already have an external mic at hand: Of course I’m talking about the headset that comes included with the iPhone! It can’t match the audio quality of other dedicated external mics but it’s quite solid and can come in handy when you have nothing else available. One thing you should keep in mind when using any kind of microphone connected via the iPhone’s Lightning port: unless you are using a special adapter with an additional charge-through port, you will not be able to charge your device at the same time, as you could with older iOS devices that had a headphone jack.
Bluetooth (wireless)
I have mentioned quite a few wireless systems before (Rode Wireless Go, Saramonic Blink 500/Blink 500 Pro, Samson Go Mic Mobile) that I won’t list here (again) for one reason: While the TX/RX system of something like the Rode Wireless Go streams audio wirelessly between its units, the receiver unit (RX) needs to be connected to the iPhone via a cable or (in the case of the Blink 500) at least a connector. So strictly speaking it’s not really wireless when it comes to how the audio signal gets into the phone. Now, are there any ‘real’ wireless solutions out there? Yes, but the technology hasn’t evolved to a standard that can match wired or semi-wired solutions in terms of both quality and reliability. While there are in principle two ways of getting wireless audio into a phone (wifi and Bluetooth), only one (Bluetooth) is currently in use for external microphones. This is unfortunate because the Bluetooth protocol that is used for sending audio back from an external accessory to the phone (the so-called Hands-Free Profile, HFP) is limited to a sample rate of 16kHz (probably because it was created with headset phone calls in mind). Professional broadcast audio usually has a sample rate of 44.1 or 48kHz. That doesn’t mean that there aren’t any situations in which using a Bluetooth mic with its 16kHz limitation can actually be good enough. The Instamic was primarily designed to be a standalone ultra-compact high quality audio recorder which records 48/96kHz files to its internal 8GB storage but can also be used as a truly wireless Bluetooth mic in HFP mode. The 16kHz audio I got when recording with Filmic Pro (here’s a guide on how to use the Instamic with Filmic Pro) was surprisingly decent. This probably has to do with the fact that the Instamic’s mic capsules are high quality, unlike those of most other Bluetooth mics. One maybe unexpected option is to use Apple’s AirPods/AirPods Pro as a wireless Bluetooth mic input.
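To put that 16kHz limitation into perspective: by the Nyquist theorem, audio sampled at a given rate can only contain frequencies up to half that rate. Here’s a quick back-of-the-envelope sketch in Python (purely illustrative, not tied to any particular app or mic):

```python
# Nyquist: a signal sampled at sample_rate_hz can only represent
# frequencies up to half the sample rate.
def max_representable_hz(sample_rate_hz: int) -> float:
    return sample_rate_hz / 2

# Bluetooth HFP audio (16 kHz) vs. broadcast-standard audio (48 kHz)
hfp_ceiling = max_representable_hz(16_000)        # 8000.0 Hz
broadcast_ceiling = max_representable_hz(48_000)  # 24000.0 Hz

print(f"HFP audio tops out at {hfp_ceiling:.0f} Hz")
print(f"48 kHz audio reaches {broadcast_ceiling:.0f} Hz")
```

In other words, everything above 8kHz is simply gone from an HFP recording, which is a big part of why Bluetooth mics sound duller than their wired counterparts.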
According to BBC Mobile Journalism trainer Marc Blank-Settle, the audio from the AirPods Pro is “good but not great”. He does however point out that in times of Covid-19, being able to connect to other people’s AirPods wirelessly can be a welcome trick to avoid close contact. Another interesting wireless solution comes from a company called Mikme. Their microphone/audio recorder works with a dedicated companion video recording app via Bluetooth and automatically syncs the high-quality audio (44.1, 48 or 96kHz) to the video after the recording has been stopped. By doing this, they work around the 16kHz Bluetooth limitation for live audio streaming. While the audio quality itself seems to be great, the somewhat awkward form factor and the fact that its best feature only works in their own video recording app, not in other camera apps like Filmic Pro, are noticeable shortcomings (you CAN manually sync the Mikme’s audio files to your Filmic or other 3rd party app footage in a video editor). As for the form factor, they have since released a new version called the Mikme Pocket which is more compact and basically looks/works like a transmitter with a cabled clip-on lavalier mic. One more important tip that applies to all the aforementioned microphone solutions: If you are shooting outdoors, always have some sort of wind screen / wind muff for your microphone with you as even a light breeze can cause noticeable noise.
Looking into the near future, some fear that Apple might be pulling another “feature kill” soon, dropping the Lightning port as well and thereby eliminating all physical connections to the iPhone. While there are no clear indications that this is actually imminent, Apple surely would be the prime suspect to push this into the market. If that really happens however, it will be a considerable blow to iPhone videographers as long as there’s no established high-quality and reliable wireless standard for external mics. Oh well, there’s always another mobile platform to go to if you’re not happy with iOS anymore 😉
To wrap things up, I have asked a couple of mobile journalists / content creators using iPhones what their favorite microphone solution is when recording video (or audio in general):
Wytse Vellinga (Mobile Storyteller at Omrop Fryslân, The Netherlands): “When I am out shooting with a smartphone I want high quality worry-free audio. That is why I prefer to use the well-known brands of microphones. Currently there are three microphones I use a lot. The Sennheiser MKE200, the Rode Wireless Go and the Mikme Pocket. The Sennheiser is the microphone that is on the phone constantly when taking shots and capturing the atmospheric sound and short sound bites from people. For longer interviews I use the wireless microphones from Mikme and Rode. They offer me freedom in shooting because I don’t have to worry about the cables.”
Philip Bromwell (Digital Native Content Editor at RTÉ, Ireland): “My current favourite is the Rode Wireless Go. Being wireless, it’s a very flexible option for recording interviews and gathering localised nat sound. It has proven to be reliable too, although the original windshield was a weakness (kept detaching).”
Nick Garnett (BBC Reporter, England & the world): “The mic I always come back to is the Shure MV88+ – not so much for video – but for audio work: it uses a non-proprietary cable – micro-USB to Lightning. It allows headphones to plug into the bottom and so I can use it for monitoring the studio when doing a live insert and the mic is so small it hides in my hand if I have to be discreet. For video work? Rode VideoMicro or the Boya clone. It’s a semi-rifle, it comes with a deadcat and an isolation mount and it costs €30 … absolute bargain.”
Neal Augenstein (Radio Reporter at WTOP Washington DC, USA): “If I’m just recording a one-on-one interview, I generally use the built-in microphone of the iPhone, with a foam windscreen. I’ve yet to find a microphone that so dramatically improves the sound that it merits carrying it around. In an instance where someone’s at a podium or if I’m shooting video, I love the Rode Wireless Go. Just clipping it on the podium, without having to run cable, it pairs automatically, and the sound is predictably good. The one drawback – the tiny windscreen is tough to keep on.”
Nico Piro (Special Correspondent for RAI, Italy & the world): “To record ambient audio (effects or natural as you want to name it) I use a Rode VideoMic GO (light, no battery needed, perfect for both phones and cameras) even if I must say that the iPhone’s on-board mic performs well, too. For Facebook live I use a handheld mic by Polsen, designed for mobile, it is reliable and has a great cardioid pickup pattern. When it comes to interviews, the Rode Wireless Go beats everything for its compact dimensions and low weight. When you are recording in big cities like New York and you are worried about radio interference the good old cabled mics are always there to help, so Rode’s SmartLav+ is a very good option. I’m also using it for radio production and I am very sad that Rode stopped improving its Rode Rec app which is still good but stuck in time when it comes to file sharing. Last but not least is the Instamic. It takes zero space and it is super versatile…if you use native camera don’t forget to clap for sync!”
Bianca Maria Rathay (Freelance iPhone videographer, Germany): “My favorite external microphone for the iPhone is the RODE Wireless Go in combination with a SmartLav+ (though it works on its own also). The mic lets your interviewee walk around freely, works indoors as well as outdoors and has a full sound. Moreover it is easy to handle and monitor once you have all the necessary adapters in place and ready.”
Leonor Suarez (TV Journalist and News Editor at RTPA, Spain): “My favorite microphone solutions are: For interviews: Rode Rodelink Filmmaker Kit. It is reliable, robust and has a good quality-price relationship. I’ve been using it for years with excellent results. For interviews on the go, unexpected situations or when other mics fail: IK Multimedia iRig Mic Lav. Again, good quality-price relationship. I always carry them with me in my bag and they have allowed me to record interviews, pieces to camera and unexpected stories. What I also love is that you can check the audio with headphones while recording.”
Marcel Anderwert (Mobile Journalist at SRF, Switzerland): “For more than a year, I have been shooting all my reports for Swiss TV with one of these two mics: Voice Technologies’ VT506Mobile (with its long cable) or the Rode Wireless Go, my favourite wireless mic solution. The VT506Mobile works with iOS and Android phones, it’s a super reliable lavalier and the sound quality for interviews is just great. Rode’s Wireless Go gives me more freedom of movement. And it can be used in 3 ways: As a small clip-on mic with inbuilt transmitter, with a plugged in lavalier mic – and in combination with a simple adapter even as a handheld mic.”
As always, if you have questions or comments, drop them here or hit me up on the Twitter @smartfilming. If you like this article, also consider subscribing to my free Telegram channel (t.me/smartfilming) to get notified about new blog posts and receive the monthly Ten Telegram Takeaways newsletter about important things that happened in the world of mobile video.
One of the things that has mostly remained a blind spot in video recording with the native camera app of a smartphone is the ability to shoot in PAL frame rates, i.e. 25/50fps. The native camera apps of smartphones usually record with a frame rate of 30/60fps. This is fine for many use cases but it’s not ideal under two circumstances: a) if you have to deliver your video for traditional professional broadcast in a PAL broadcast standard region (Europe, Australia, parts of Africa, Asia, South America etc.) b) if you have a multi-camera shoot with dedicated ‘regular’ cameras that only shoot 25/50fps. Sure, it’s relatively easy to capture in 25fps on your phone by using a 3rd party app like Filmic Pro or Protake but it still would be a welcome addition to any native camera app as long as this silly global frame rate divide (don’t get me started on this!) continues to exist. There was actually a prominent example of a phone maker that offered 25fps as a recording option in their (quasi-)native camera app very early on: Nokia and later Microsoft on their Lumia phones running Windows Phone / Windows Mobile. But as we all know by now, Windows Phone / Windows Mobile never really stood a chance against Android and iOS (read about its potential here) and has all but disappeared from the smartphone market. When LG introduced its highly advanced manual video mode in the native camera app of the V10, I had high hopes they would include a 25/50fps frame rate option as they were obviously aiming at more ambitious videographers. But no, the years have passed and current offerings from the Korean company like the G8X, V60 and Wing still don’t have it. It’s probably my only major gripe with LG’s otherwise outstanding flagship camera app. It was up to Sony to rekindle the flame, giving us 25fps natively in the pro camera app of the Xperia 1 II earlier this year.
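For anyone wondering why the 25/30fps divide is such a nuisance: the frame durations of the two standards don’t line up, so conforming 30fps material to a 25fps timeline forces the editor to drop or blend frames. A small sketch of the arithmetic (my own illustration, not tied to any particular editing app):

```python
from fractions import Fraction

# Exact duration of one frame in milliseconds for a given frame rate.
def frame_duration_ms(fps: int) -> Fraction:
    return Fraction(1000, fps)

d25 = frame_duration_ms(25)  # 40 ms per frame
d30 = frame_duration_ms(30)  # 33 1/3 ms per frame

# The durations only line up every 200 ms: 5 frames at 25 fps vs.
# 6 frames at 30 fps. Conforming 30 fps footage to a 25 fps timeline
# therefore has to discard (or blend) 1 frame out of every 6.
print(f"25 fps: {float(d25):.2f} ms/frame")
print(f"30 fps: {float(d30):.2f} ms/frame")
```

That regularly discarded frame is what produces the subtle stutter you see in cross-standard conversions, which is why shooting in the target frame rate from the start is so much cleaner.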
As always, feel free to comment here or hit me up on the Twitter @smartfilming. If you like this blog post, do consider subscribing to my Telegram channel to get notified about new blog posts and also receive my Ten Telegram Takeaways newsletter including 10 interesting things that happened during the past four weeks in the world of mobile content creation/tech.
While writing my last blog post about Google Recorder 2.0, I stumbled upon a hack that can also be utilized for another app from Google, one that currently understands over 70 languages, not only English: It’s called “Live Transcribe & Sound Notifications” and is available for pretty much every Android device. Have you always been looking for a tool that transcribes your audio recordings but doesn’t require an expensive subscription? Here’s what I like to think is a very useful and simple trick for achieving this on an Android phone. You will need the following things:
Android device running at least Android 5.0 Lollipop (if your phone is less than 5 years old, you should be safe!)
an internet connection (either mobile data or wifi)
a quiet environment
Let’s say you have recorded some audio like an interview, a meeting, a vox pop, a voice-over for video or even a podcast on your smartphone (look here for some good audio recorder apps) and would like to have a text transcription of it. If you read this before making such a recording, do include a few seconds of silence before having someone talk in the recording and it’s also important that the recording is of good quality in terms of speech clarity, the reasons will become obvious soon.
Here’s how it works!
Open Live Transcribe and check the input language displayed in the bottom toolbar (if the toolbar isn’t there, just tap on the screen somewhere). It needs to be the same as the recording you want to have transcribed. If it’s a different one, tap on the gear icon and then on “More settings”. Choose the correct language. Unlike Google Recorder which I wrote about in my last article, Live Transcribe works with a vast number of languages, not only English. Also unlike Recorder however, Live Transcribe needs an active internet connection to transcribe, you can’t use it offline!

If you are planning on pasting the transcription into a context with a white background later on, you should make sure that “Dark Theme” is disabled in Live Transcribe. Otherwise you will be pasting white text onto a white background.

Leave the settings menu and check that Live Transcribe’s main screen says “Ready to transcribe” in the center. Now double-check that you are in a quiet environment, leave Live Transcribe and open the audio recording app. Locate the recording you want to have transcribed and start the playback of the file (do make sure the speaker volume is sufficient!), then quickly switch over to Live Transcribe.

One way to do this is to use Android’s “Recent Apps” feature which can be accessed by tapping on the square icon in the 3-button navigation bar – some Android phone makers use a different icon, Samsung for instance now has three vertical lines instead of a square. If you are using gesture navigation, swipe up from the bottom and hold. But you can also just leave the audio recording app and open Live Transcribe again without going into recent apps.

The recording will keep playing with Live Transcribe picking up the audio from the phone’s speaker(s) and doing its transcription thing as if someone was talking into the phone’s mic directly. This actually works! Don’t worry if you notice mistakes in the transcription, you can fix them later.
Once the recording and subsequently the transcription is finished, long-tap on any word, choose “Select transcription” and then “Copy”. You have now copied the whole transcription to the clipboard and can paste it anywhere you like: email, Google Docs etc. That’s also where you are now able to correct any mistakes that Live Transcribe has made (within Live Transcribe, there’s no option for editing the transcription yet). Two more things: You can have Live Transcribe save your transcripts for three days (activate it in the settings or activate auto-save under “More settings”) and if you want to clear out the app’s transcription cache, you can also do this under “More settings”, then choose “Delete history”.
Can you do the same with video recordings?
What about video recordings? Could you have them transcribed via Live Transcribe as well? Basically yes, but it’s not quite as easy. That’s if you want to do it using only one device (it’s very easy if you use a second device for playback). When you leave an app that’s playing back a video, the video (and with it its audio) will stop playing so there’s nothing for Live Transcribe to listen to. You can work around this by using Android’s split-screen or multi-window feature to actively run more than one app at the same time. On Android 7 and 8 you are able to access split-screen apps by long-pressing the square icon (recent apps) in the bottom navigation bar and select the app(s) you want to run in split-screen mode. Things have changed with Android 9 however. For one, gesture navigation was introduced as an alternative to the “old” 3-button-navigation bar. So if you are using gesture navigation, you access recent apps by swiping up from the bottom and then hold. If you use the 3-button-navigation, long pressing the square icon doesn’t do anything anymore. Instead, just tap it once to access the recent apps view, tap on the app’s icon at the top of the window and you will get a pop-up menu. Depending on what Android phone you are using the menu will have slightly different items, or at least they are named differently: On my LG G8X I get “App info”, “Multi window”, “Pop-up window” and “Pin app”, on my Pixel 3 I get “App info”, “Split screen”, “Freeform” and “Pause app”. The items you will want to choose to run two apps side by side are “Multi window” (G8X) / “Split screen” (Pixel 3) which will split the screen in half or “Pop-up window” (G8X) / “Freeform” (Pixel 3) which will display the app(s) in a small, desktop-like window that you can move around freely. By doing this, you can playback a video clip and have Live Transcribe running at the same time. 
Of course you can also use this feature to have both Live Transcribe and the playback of an audio recording app on the same screen simultaneously but for audio file transcriptions, you don’t have to go the extra mile.
Can I do this on an iPhone as well?
Google has a whole range of apps for iOS, but unfortunately, Live Transcribe isn’t among them – it’s currently Android-only. But hey, maybe you have an older Android phone in your drawer that you could put to good use again? That being said, there is the possibility that Google will eventually release an iOS version of Live Transcribe or Apple will come up with an app that does something similar. I also thought of another way, using a Google app that is already available for iOS: Google Translate. Yes, it’s meant for translation and not transcription but in the Android version, you can also find a “Transcribe” button. Initially, using this will only give you a transcription of the translated language but if you tap the cog wheel in the bottom left corner and choose “Show original text”, you will actually get a transcription of the original language which you can then copy and paste. When checking the iOS version of Translate though, I noticed that there is no “Transcribe” button. There is a “Voice” button (which in the Android version has been moved to the search bar) but this will only pick up a limited amount of input and is quite slow. There’s also no “Show original text” option. I suppose there might be a chance that Google will update its iOS version to match the Android version but there are no guarantees. The Android version of Google Photos has had a pretty impressive video stabilization feature for quite a while now, something that is still missing from the iOS version. It might be a purely strategic thing and Google wants to give certain features only to users of its own mobile operating system, but it might also be for technical reasons like that the core transcription engine is deeply rooted in the Android system and it’s just not possible to tap into this on iOS where Google is “just” a 3rd party app developer. Let’s see how things will turn out in the coming months.
If you have any questions or comments, leave them here or hit me up on the Twitter @smartfilming. Do also consider subscribing to my Telegram Newsletter to get notified about new blog posts and receive the new “Ten Takeaways Telegram” monthly bullet point recap of what happened in the world of mobile video creation during the last four weeks.
Not too long ago, I wrote an article about my favorite audio recorder apps for Android. One of the apps I included was Google Recorder. Officially, the app is only available for Pixel phones but can be sideloaded to a range of other Android devices. Google Recorder has a unique place among audio recording apps because of one killer feature: it transcribes audio into text – offline and for free. This can be extremely useful for a lot of people, particularly journalists. With the launch of the new Pixel 5 / Pixel 4a 5G, Google has introduced version 2.0 of Recorder and it packs some really exciting new features and improvements!
Edit the transcript
As good as Google’s voice recognition and subsequent transcription works, it occasionally makes mistakes. Before version 2.0, you weren’t able to make any kind of edits to the transcription within the app (it was possible to export the text and then make corrections). With the update you can now edit your transcript, however only one word at a time.
Edit your recording
Another new feature of version 2.0 is that you can now edit the audio recording itself by cropping/trimming (cut off something at the beginning/end) or removing a part in the middle. You can actually also do this by removing words from the transcript and it will automatically cut the audio file accordingly! You can access this feature by tapping on the scissors icon in the top right corner when having a recording selected. This particular feature can also come in very handy to bypass a limitation of another new feature which I will talk about in a second.
Create a video with waveforms and captions
Quite possibly the coolest new feature of version 2.0 is the ability to create a video with waveforms and captions from your audio file. This is very useful for sharing audio snippets or teasers on social networks where everything is primarily focused on visual impressions. I was even more delighted to find that you can customize a couple of things for the video: You can choose whether you want the waveforms plus captions or only the waveform. You can also select the aspect ratio of the video (square, portrait, landscape) and the color theme (dark/light). This is great! One thing they could have added is an option to choose a photo as a background image for the video. You will also notice that there are two watermarks at the bottom (the Recorder app logo and a “Recorded on Pixel” branding), unfortunately there’s no way to hide them before exporting. You can however use a separate video editing app to crop the image or place a black/white layer over the bottom part to cover it up. One last thing to mention: You can only create videos from clips that have a maximum length of 60 seconds. So for longer recordings you need to cut out a chunk via the editing tool, save it as a copy and then create your video from this excerpt. The export resolution of the video is 1080×1080 for square, 720×1280 for portrait and 1280×720 for landscape, all at 30fps.
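Because of that 60-second limit, a longer recording has to be exported in several passes. Just to make the cut points concrete, here’s a tiny sketch (function name and logic are my own illustration, not part of the app):

```python
# Split a recording of total_s seconds into chunks of at most limit_s
# seconds -- mirroring the manual cut/save-as-copy/export steps the
# Recorder app requires for videos longer than 60 seconds.
def chunk_bounds(total_s: float, limit_s: float = 60.0):
    bounds, start = [], 0.0
    while start < total_s:
        end = min(start + limit_s, total_s)
        bounds.append((start, end))
        start = end
    return bounds

# A 150-second interview needs three separate exports.
print(chunk_bounds(150.0))  # [(0.0, 60.0), (60.0, 120.0), (120.0, 150.0)]
```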
Perfect? Not quite!
Two shortcomings that I already pointed out in my other blog post and that unfortunately haven’t been improved with version 2.0: Google Recorder is still limited to English. I’m sure though that support for other languages will be coming soon because Google’s own Live Transcribe app, which I think uses the very same engine for voice recognition and transcription, is already polyglot. The second minor setback concerns its potential use in a professional (broadcast) environment: The app only records with a sample rate of 32kHz. It’s not a problem for professional use per se because I think it’s fair to say that you can also call it a “professional” tool when you “just” use the transcription for your work. But if you want to use the audio recording as such (say for broadcast radio), the sample rate doesn’t match the usual standards of 44.1/48 kHz. If Google Recorder allowed importing audio files from outside the app, this limitation could be circumvented but you can only use files recorded within the app – and I don’t think this is going to change soon as Google probably wants the user experience to be as easy as possible and importing files from other apps might not fit the bill. Ease of use is probably also the reason for not being able to customize anything in terms of recording quality. The sample rate of 32kHz should however be just fine for less “official” formats like podcasts or social media / the web. I have also thought of a hack to record in higher quality but still take advantage of Google Recorder’s features: Record your audio with another app that gives you a higher sample rate (for instance ShurePlus Motiv) and then play it back on your phone while simultaneously recording with Google Recorder. Google Recorder picks up the playback from your phone’s speaker and treats it as if you were talking into the mic. This actually works quite well but of course you need to be in a quiet environment.
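On a related note, if you’re ever unsure what sample rate a file actually has, the header of a WAV file states it directly. A small stdlib-only Python sketch (the generated silent clip merely stands in for a real 32kHz recording; everything here is my own illustration):

```python
import io
import wave

# The usual broadcast sample rates vs. Google Recorder's fixed 32 kHz.
BROADCAST_RATES = (44_100, 48_000)

def sample_rate_of(wav_bytes: bytes) -> int:
    """Read the sample rate from a WAV file's header."""
    with wave.open(io.BytesIO(wav_bytes), "rb") as wf:
        return wf.getframerate()

# Build a one-second silent 32 kHz mono 16-bit file as a stand-in.
buf = io.BytesIO()
with wave.open(buf, "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)       # 16-bit samples
    wf.setframerate(32_000)
    wf.writeframes(b"\x00\x00" * 32_000)

rate = sample_rate_of(buf.getvalue())
print(rate, rate in BROADCAST_RATES)  # 32000 False
```

The same header check works on any WAV export, so you can quickly verify whether a recording meets a broadcaster’s delivery spec before handing it over.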
If you want to use the app’s ability to create a video with waveforms and captions but incorporate the original audio and not the lower quality re-recording, export the re-recording as a video file, then import the video into a video editing app that lets you exchange the audio with the original higher quality recording.
For which devices is Google Recorder available?
Officially, Google Recorder is only available for Google’s own Pixel phones, excluding the very first Pixel (XL). These are: Pixel 2 (XL), Pixel 3 (XL), Pixel 3a, Pixel 4 (XL), Pixel 4a, Pixel 4a 5G and Pixel 5. If you have used version 1 but can’t find the new features, you need to update the app. So are you totally out of luck if you don’t own a Google Pixel? Not quite! It’s actually possible to sideload the app to a whole range of other Android phones running Android 9 or newer (version 1 of Google Recorder) or Android 10 or newer (version 2). However, while the app can be installed on other Android devices that – in theory – should be able to run it, not all do so without problems in reality. I have had excellent results with phones from LG (V30 running version 1 of Google Recorder and now the G8X running version 2) where the app seems to work flawlessly. It also works well on the Huawei P30. On the other hand, the OnePlus 3 only does the recording part, not the transcription. And the Xiaomi Pocophone F1 lets you install and open the app but the moment you try to start a recording, the app crashes. Bottom line: Your mileage will vary with non-Pixel devices and if you’re about to buy a new phone and want to make sure Google Recorder 2.0 runs with all relevant features, you should get one of the recent Pixel phones. If you have a non-Pixel phone that theoretically should be able to run the app by sideloading it, just give it a go, you might be lucky!
How can I sideload the app and is it safe?
Unlike with Apple’s iOS, Android lets you sideload apps to your device. Sideloading basically means you can install apps from other sources than the official app store, in the case of Android the official app store is Google’s Play Store. When you download an Android app from outside the Play Store, you will get an apk file that you can then open and install. For security reasons, installs from other sources than the Play Store are disabled by default on Android and the system will give you a warning when trying to install an apk file. You can override this protective layer though by allowing certain apps (in most cases it will be the browser which you used to download the apk file) to perform installs from so-called “unknown sources”. I highly recommend only downloading and installing apk files from sites you trust. Personally I have only downloaded apks from XDA Developers and APKMirror so far. Now if you want to rush over to APKMirror and get the Google Recorder 2.0 apk, there’s one more hoop you have to jump through, at least for the moment: The download is provided not as a single apk file but as an “apk bundle”, this is a different way of packaging an app to reduce the file size. But while Android can handle installing single-file apks out of the box, you need an extra app to install apk bundles. I used APKMirror’s own APKMirror Installer which you can download as a regular app from the Google Play Store. After downloading both the APKMirror Installer and the Google Recorder 2.0 apk bundle onto your Android device, open APKMirror Installer, tap “Browse files” and select the Google Recorder 2.0 apk bundle (it has an .apkm file extension). Choose “Install package” and you’re finally done!
To wrap it up: With the 2.0 update, Google has immensely improved its fascinating Recorder app and made it an even more powerful tool for recording, auto-transcribing and sharing audio – one that might be a decisive factor for choosing a Google Pixel over any other phone, be it Android or iPhone. What’s your experience with Google Recorder? Have you used it? If you have sideloaded it onto a non-Pixel device, how does it work? Let me know in the comments or hit me up on Twitter @smartfilming. If you like this article, do consider subscribing to my Telegram Newsletter to get notified about new blog posts.
I’ve been thinking about getting my first full-frame DSLM camera for some time now, and there are a whole lot of very tempting offerings out there. None, however, ticked all the boxes that matter most to me – excellent autofocus, great battery life, no recording limit and a price tag of around 2k. Very recently, Sony announced the Alpha 7c, its smallest full-frame camera so far. While the A7c recycles a lot of established components from earlier Sony cameras and received quite a bit of flak for that (same sensor! no 4K60! no 10bit!), it does include some minor improvements over the Alpha 7 III that might actually be a major deal for some: a fully articulating screen, eye-tracking autofocus for video and unlimited recording. On the other hand, reviewers found that the in-body image stabilization via sensor shift (IBIS) was curiously worse than that of the A7 III.
While watching some A7c-related videos on YouTube a few days ago, I stumbled upon a very interesting video by Gordon Laing:
He reveals that the A7c has a “hidden” feature that relates to video stabilization. I say “hidden” because Sony, for whatever reason, didn’t bother to mention it at all when promoting its latest camera release, focusing entirely on the small form factor. Sony’s A7c has a built-in gyroscope that records metadata about the camera’s orientation in 3D space while filming – basically, every shake you make leaves a metadata trace in the file. This metadata can be used by Sony’s free desktop software Catalyst Browse to correct the shakes and stabilize the footage in post. As you can see in Gordon Laing’s video, the results are very impressive, almost gimbal-like! This was also picked up by other YouTubers like Camera Conspiracies and Lens Library. Sure, it’s another step in post production (and the software seems to take its time processing footage), but the prospect of not having to pack and balance a gimbal and instead becoming even more mobile is very promising in my opinion.
Now how does this relate to smartphone videography? As you might know, all modern smartphones (unlike most traditional cameras) have gyro sensors in them; the most basic thing they’re good for is controlling the screen’s orientation (portrait or landscape) based on how you’re holding your phone. Why not take advantage of this in a more advanced way and record gyro metadata when capturing video? Google already has a pretty amazing free software stabilization feature in the Android version of Photos (many still don’t know about it!) but I’m quite sure it is not (yet) based on recorded gyro metadata. While it might not be easy for a third-party app like Filmic Pro to siphon the gyro metadata off the sensor, it should generally be possible. And what’s more: Since the smartphone is not only a camera but also a computer that runs software, the post stabilization process (it might be too much for a processor to handle in real time while shooting!) could be done on the very same device, unlike when shooting on a DSLM like the A7c. Of course this would also mean we need some sort of mobile Catalyst Browse app for Android and iOS, but maybe pro mobile video editing apps like LumaFusion or KineMaster could make this happen in the near future? It will require powerful processors, but I think I’m hardly exaggerating when I say that many modern flagship phones are more powerful than a lot of desktop computers. I’m not a software developer, so maybe I’m asking too much (at least right now), but I sure think it’s worth a thought, well actually more than just one!
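To make the idea a bit more concrete, here’s a toy, single-axis Python sketch of what gyro-based post stabilization does conceptually: integrate the gyro’s angular velocity into the camera’s motion path, smooth that path into a “virtual gimbal” path, and counter-rotate each frame by the difference. The function name and the simple moving-average smoothing are my own illustration; Catalyst Browse and any real implementation work in full 3D, handle rolling shutter and use far more sophisticated path smoothing:

```python
def stabilize_angles(gyro_rates, dt, window=9):
    """Toy single-axis sketch of gyro-based post stabilization.

    gyro_rates: angular velocity samples in rad/s, one per frame
    dt: time between frames in seconds
    Returns per-frame counter-rotation angles (rad) that an editor
    would apply to each frame (with some cropping) to cancel shake.
    """
    # 1) Integrate angular velocity into the camera's orientation path.
    angles, angle = [], 0.0
    for rate in gyro_rates:
        angle += rate * dt
        angles.append(angle)
    # 2) Smooth that path with a centered moving average to get the
    #    intended "virtual gimbal" camera path.
    half = window // 2
    smoothed = []
    for i in range(len(angles)):
        lo, hi = max(0, i - half), min(len(angles), i + half + 1)
        smoothed.append(sum(angles[lo:hi]) / (hi - lo))
    # 3) The correction is the difference between the smooth path and
    #    the real (shaky) path, applied as a counter-rotation per frame.
    return [s - a for s, a in zip(smoothed, angles)]
```

Note what falls out of this: a perfectly steady pan (constant angular velocity) produces a linear path, which the smoothing leaves untouched, so intentional camera moves survive while the jitter around them is cancelled.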
What do you think? Would this be something you’re interested in? How do you like the results? Let me know in the comments or hit me up on Twitter @smartfilming. If you like this blog post, do consider signing up for my Telegram Newsletter where you will be notified about new blog posts.
One of the things I really like about Apple’s ecosystem is the cross-platform integration of a feature called “AirDrop” which lets you transfer (big) files quickly, wirelessly and offline between Apple devices that are close to each other, be it Mac, iPhone or iPad. This is extremely helpful for video files which, as we all know, can get pretty heavy these days, particularly if you record in UHD/4K. Shooting on an iPhone and then transferring the footage to an iPad for editing on a bigger screen is a pretty popular workflow. Android, on the other hand, had something called “WiFi Direct” relatively early in its career, but it never got picked up consistently by phone makers, who preferred to introduce their own proprietary file transfer solutions which of course only worked between devices of the same brand. So for quite a while I resorted to third-party apps like Feem and Send Anywhere that also worked cross-platform between mobile and desktop – Android, iOS, macOS and Windows. As for Android-to-Android wireless file transfers, Google introduced an app called “Files Go” (today Files by Google) in late 2017 which was primarily a file explorer but could also share files offline to another device by creating a WiFi Direct connection. While the app ventured somewhat close to becoming a system resource in that it came pre-installed on many new phones as part of Google’s app portfolio, it was hard to deny that Apple’s AirDrop was more easily accessible.
Google is finally giving Android proper wireless file sharing
Enter Nearby Share: Recently, Google started rolling out a new Android feature called “Nearby Share” that should soon be available on all Android devices that sport at least Android 6 Marshmallow (Android 6 was released in 2015, we’re now at Android 11). Nearby Share allows for fast wireless sharing of files to other nearby Android devices, even offline (that is, without using an internet connection). The feature is distributed automatically via the Google Play Services app (which comes pre-installed on basically all Android devices) so you don’t need to download anything. Nearby Share is integrated into the Android system; it’s not a separate app. As of now, roughly 90% of my own Android devices (and believe me, I own quite a few!) have already received Nearby Share.
Does your Android device have it already?
And here’s how to check if you have it: On your Android device, go into “Settings”, then select “Google” and then “Device connections”. You should now find an option called “Nearby Share” (not to be confused with something called “Nearby”!). To use it, you need to activate it by switching the slider to “On”. If you haven’t yet activated Location and Bluetooth, it will ask you to do so because that’s how it looks for and finds other devices. There are also a couple of options: You can customize the name of your device (the name under which it will be visible to other devices). You can select between three different “Device visibility” settings (All contacts, Some contacts, Hidden) and you can choose by which means transfers are made (Data, Wi-Fi only or Without Internet). Regarding the last bit, I personally always switch to “Without Internet” so it uses the fast peer-to-peer WiFi Direct protocol and doesn’t consume any mobile data when not connected to regular WiFi. Before actually initiating the first file transfer I suggest one more thing (it’s not strictly necessary though): Add Nearby Share to your Quick Settings. Quick Settings is the bunch of settings directly accessible when pulling down the notification shade from the top of the screen. It’s not exactly the same on all Android devices, but there’s usually a small pen icon in the Quick Settings which lets you add or remove certain items. Scroll down to find two intertwined horizontal lines (Nearby Share) and drag the icon to the main Quick Settings. The reason I recommend this is that you can then easily make your device visible to others for Nearby Share or turn the feature on when it’s off. Long-pressing the Nearby Share icon will also take you straight into the settings for Nearby Share without clicking and scrolling through the general settings.
How does it work?
So how does a file transfer via Nearby Share actually work? Keep in mind that Nearby Share is for sharing to physically nearby devices, not to someone on the other side of the globe!
Assuming you want to transfer one or multiple video files, locate the file(s) in your phone’s Gallery app (the native Gallery app or Google Photos). Select the one(s) you need and then tap the share button.
Now look for the Nearby Share icon on the share sheet and select it. If you are using Google Photos as your Gallery app it will give you three options, select “Actual size”. Your sharing device will immediately start looking for devices that are close by and have Nearby Share activated (it usually doesn’t have to be opened).
On your receiving device you will get a prompt: “Device nearby is sharing. Tap to become visible” (if it doesn’t appear, open Nearby Share from the Quick Settings on the receiving device). After doing so, your receiving device will pop up on the radar of the sharing device.
Select your receiving device and tap “Accept” on the receiving device itself. The file transfer will start and you are done. Your transferred files will be available in the “Download” folder of your Gallery app.
Is it any good?
So far, Nearby Share has worked really well for me and it makes transferring big files to other Android devices so much easier. It’s a bit of a shame that, unlike with phones, there aren’t too many powerful Android tablets out there to make a phone-tablet workflow a tempting proposition. These days it’s basically only Samsung that offers a tablet with flagship specs for video editing. The biggest shortcoming for me though is that it’s currently only available between Android devices and doesn’t build a bridge to desktop/laptop computers or iOS. This isn’t exactly a surprise: While Apple produces both mobile and desktop/laptop hardware with its own software, Google doesn’t really. “Laptops” is debatable because Google has Chromebook devices like the Pixelbook / Pixelbook Go and Nearby Share is supposed to roll out for ChromeOS as well, but I would assume most of us still associate “laptop” with devices running Windows, Linux or macOS. There’s actual hope though: Google is apparently planning to make Nearby Share part of its Chrome browser, thereby opening up a whole new sharing world with the option to share to iOS, macOS, Windows and Linux. And even in its current state, Nearby Share can be very helpful in many situations, for instance when multiple phoneographers are in the field and you want to collect the footage on one device afterwards for editing, or when, as a journalist, you talk to a person who filmed something interesting on his/her phone and wants to share it with you.
Does your Android device have Nearby Share? Have you used it already? How does it work for you? Let me know in the comments or hit me up on Twitter @smartfilming. You might also want to have a look at Google’s own blog post about Nearby Share. If you like this blog, please consider subscribing to my Telegram Newsletter which will notify you when new posts are released.