Before captioning videos for convenient social media consumption on the go became all the rage, everyone agreed that good audio was an essential element of good video, possibly even more important than the image quality. Many will agree that it still is, despite the recent captioning vogue. So how do you get good audio for video? Let’s make it simple: get as close to the sound source as possible with your mic! In most cases you will get better audio with a cheap mic close to the sound source than with a super-expensive mic that’s (too) far away (notice: it won’t work at all, however, if you use a carrot). And how do you get as close as possible to the audio source? An external mic (as opposed to the internal mic of your phone) will be a big help, since you probably don’t want to shove your phone/camera into someone’s face.

But can you work with external mics on Android devices? Yes, you can. And the good news is that it basically works with EVERY Android phone or tablet! Of course I have not tested it on every single Android device on the planet, but so far I have not encountered a single one that didn’t support it, and believe me, I have had dozens so far! In most cases, however, you will have to use third-party apps, since most native camera apps don’t support external mics. There are a few exceptions, like many Samsung phones and the (recent) flagships of LG and Sony, but with other Android phone makers your only chance to use an external mic for better audio while recording video is a third-party video recording app like FilmicPro, Cinema FV-5, Open Camera, Cinema 4K or Footej Camera. If you are into video live streaming: popular platforms like Facebook, Periscope, Instagram or YouTube also support external mics in their mobile apps. Important: some apps will automatically detect a connected external mic, while with others you will have to go into the settings and choose the external mic as the audio input.
In general, it’s recommended to connect the mic before launching the app, as the app might not detect the mic correctly if you only plug it in afterwards. But how can you connect an external mic to an Android device? There are four basic options that I will briefly elaborate on: 1) via the 3.5mm headphone jack, 2) via the microUSB port, 3) via the USB-C port, 4) via a wireless/Bluetooth connection.
3.5mm headphone jack
The most common wired solution for connecting an external mic to your Android device is (was?) the 3.5mm headphone jack, the port where you would usually plug in your headphones to listen to music. For a long time this was one of THE universal things about a smartphone, be it an Android, an iPhone or even a Windows Phone. In the past couple of years however, more and more phone makers have been following Apple’s lead in ditching the headphone jack (starting with the iPhone 7) in an attempt to push further towards a slick, enclosed unibody design, leaving the phone with only one physical port, the one that is primarily there for charging. Of course this move also has to do with the rise of Bluetooth headphones.

Anyway, if you’re lucky enough to still have a phone with a 3.5mm headphone jack, you have a range of options for connecting different types of external mics. There are two general routes: using a mic with a dedicated TRRS 3.5mm connector, or using another mic with an adapter. What’s TRRS, you may ask? It has to do with the number of conductors on the 3.5mm pin. You might have encountered mics with a similar-looking 3.5mm pin for ‚regular‘ cameras. But while they look similar, they only have THREE conductors on the pin (TRS), not FOUR (TRRS). Smartphones use the TRRS standard, so a TRS 3.5mm pin won’t work unless you use an adapter like the Rode SC4.

But you know what? There’s a good chance you already own an external mic without even knowing: the headphones that came with your phone. Yes, you heard that right! They usually have an inbuilt mic for making/receiving phone calls, and if you have them connected to the headphone jack while using a video recording app that supports external mics, this is an easy and cheap way to improve your audio. You might be surprised how decent the sound quality can be!
Of course the headphone cable is usually not very long, so if you are doing a piece to camera or an interview it will be hard to keep the cable out of the frame. The sound quality of dedicated TRRS 3.5mm mics also often trumps that of the headset. Still, I think it’s great to have this option at hand. Dedicated TRRS headphone jack mics include the original iRig Mic (handheld) by IK Multimedia, several lavalier/lapel mics (like the Rode smartLav+, the Aputure a.Lav, the Tonor Dual Headed Lapel and the Boya BY-M1) and the Rode VideoMic Me (directional shotgun-type). One might also count the Rode VideoMicro (directional shotgun-type) as a dedicated TRRS mic because you can exchange the TRS cable it comes with (to work with ‚regular‘ cameras like DSLRs etc.) for a TRRS cable (Rode SC7, sold separately). Other than that, you can connect basically any XLR mic by using IK Multimedia’s iRig Pre adapter/converter box, which has a female XLR input on one side and a male 3.5mm pin cable on the other. XLR is the most common professional audio connection standard.
microUSB

Just as the 3.5mm headphone jack used to be a common standard on phones, so was the microUSB port on Android devices for charging the battery. The change from microUSB to USB-C as the preferred „power port“ in recent years very much, but not exclusively, coincided with the trend of dropping the headphone jack. There are hardly any new phones coming out these days that still have a microUSB port (the most recent ones were all in the budget segment). And as the 3.5mm headphone jack was a universal standard at the time, the lack of dedicated microUSB mics didn’t really come as much of a surprise. Actually, the only mic like this that I ever encountered and used was IK Multimedia’s iRig Mic HD-A, an improved digital version of the original iRig Mic which featured a microUSB connector instead of a 3.5mm pin. One thing you also had to pay attention to when connecting accessories to a microUSB port was USB-OTG support (OTG stands for „On-the-go“). In simplistic terms, USB-OTG support means that you can use the USB port for things other than just charging, for instance as an audio input. Not all Android devices support USB-OTG.
USB-C

USB-OTG support is also relevant when talking about USB-C, the new USB connection standard for Android devices (it has recently also been introduced on Apple’s iPad Pro series, so one might speculate on whether Apple will eventually make the switch for all its devices). One of the very practical things that makes USB-C better than microUSB is that the connector is reversible: it fits into the port either way up, unlike microUSB, which usually meant you tried to plug it in the wrong way first. The other good thing is that USB-OTG has become a more common feature on Android devices over time, so chances are relatively high that your device supports it if you purchased it in the last two years or so. It’s still not a given on Android devices though, so if you plan to use USB-C mics you should check the phone’s spec sheet first.

The introduction of dedicated USB-C mics has been very slow. The first one to my knowledge was the Samson Go Mic Mobile wireless system launched in 2017, which included a USB-C connection cable for the receiver unit along with cables for 3.5mm, microUSB and the Lightning port (the latter is the standard on most Apple devices). Boya has recently added two USB-C mics to their portfolio (a directional shotgun-type and a lavalier) and Saramonic has a USB-C-to-XLR adapter cable, so there are finally at least some options. For a great overview of USB-C mics, check out this blog post by Neil Philip Sheppard on smartphonefilmpro.com. One more thing: while quite a few phone makers include a USB-C-to-3.5mm adapter with their phones (which would let you use 3.5mm mics), these tend to be proprietary, meaning you can’t use them with other phone brands, and if you lose yours you will have to purchase from the same brand again rather than use a third-party adapter. Yes, very annoying, I know.
In general, USB-C mics don’t yet seem to work as universally across Android devices and apps as their headphone jack buddies, so if you plan to use a USB-C mic I would recommend doing thorough testing before taking it on an important job.
Wireless / Bluetooth
All the aforementioned external mic solutions have in common that they involve some kind of wired connection to the phone – even the wireless Samson Go Mic Mobile system or Rode’s RodeLink wireless kit (which can be used with a TRS-to-TRRS adapter), as the receiver unit has to be plugged into the phone. Of course it would be fantastic to have the audio go directly and wirelessly from a mic (transmitter) into the phone without a separate receiver unit attached to it. And in theory it should be very much possible, because modern phones have two protocols allowing for wireless data transfer: WiFi and Bluetooth. So far only Bluetooth has been used for this; I’m sure there’s a technical reason I don’t know about why the WiFi route isn’t feasible (yet). A bunch of potential Bluetooth mics have been around for some time but they usually still need a receiver unit, and the audio quality and reliability haven’t been quite up to the task so far. Bluetooth headphones/headsets with an internal mic are another possible option. Here’s a short test I did using the inbuilt headset mic of my (rather cheap) Bluetooth headphones.
It’s not too bad in my opinion and might suffice for certain tasks, but you definitely notice the quality difference to a good wired external mic. Apparently, Bluetooth audio that goes directly into the phone is currently limited to a sample rate of 8kHz by the Android system (according to one of FilmicPro’s engineers), which doesn’t provide the grounds for great audio quality. There’s also a mic called the Instamic that is basically a self-contained mini audio recorder in the form of a somewhat bigger lavalier with internal storage. It also offers a live audio-streaming mode directly to the phone (with noticeably diminished quality compared to the internally recorded audio), but depending on your job, the quality might still not be good enough and can’t match that of wired connections. You also often get a slight delay between video and audio that increases the farther you move away from the device. And unlike with wired external mics, only a few video recording apps on Android actually accept Bluetooth as an external audio input as of today; the ones I know about are FilmicPro and Cinema FV-5.

So while the limitations of Bluetooth mics might still be too big for most professional work at this time, it should become a viable option in the near future. As a matter of fact, there recently was a Kickstarter campaign for a Bluetooth transmitter called BAM! that can be attached to any XLR mic and streams the audio directly to the phone in good quality – unfortunately the campaign didn’t reach its funding goal. Let’s hope it’s not the end of the story, since smartphone development is probably headed towards a design with no physical ports at all (wireless charging is already here!) and then wireless is the only way to go, for better or worse! If you have questions or comments, feel free to drop a line!
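To put that 8kHz figure into perspective, here’s a tiny Python sketch (my own illustration, not tied to any Android API) applying the Nyquist theorem: a given sample rate can only represent frequencies up to half its value.

```python
def nyquist_limit_hz(sample_rate_hz: float) -> float:
    """Highest audio frequency a given sample rate can represent (Nyquist)."""
    return sample_rate_hz / 2

# Android's current Bluetooth mic path vs the usual sample rate for video audio
bluetooth_limit = nyquist_limit_hz(8_000)
video_limit = nyquist_limit_hz(48_000)

print(f"8 kHz Bluetooth input: nothing above {bluetooth_limit:.0f} Hz survives")
print(f"48 kHz wired input: captures frequencies up to {video_limit:.0f} Hz")
```

That 4kHz ceiling is roughly landline-telephone territory, which matches how Bluetooth headset mics tend to sound.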
A little more than six months ago I bid my LG V10 goodbye into retirement. The V10 was the first flagship smartphone I had purchased, and I had done so for a very specific reason: LG had redefined what a stock/native camera app on a smartphone can offer in terms of pro video controls. While many other phone makers were including advanced manual controls for photography in their camera apps, video had been shamelessly ignored. With the introduction of the V-series in late 2015, LG offered avid smartphone videographers a feature pack in the native camera app that could otherwise only be found in dedicated 3rd party apps like FilmicPro. While LG’s smartphone sales can’t really compete with those of Samsung, Huawei and such, the V-series fortunately didn’t just vanish after the V10 but was succeeded by the V20, V30, V35 and V40. As I don’t see the need to upgrade my phone on an annual basis, I went for the V30. It carried over the useful dual rear cameras from the V20 and introduced new features like a LOG profile, Point Zoom and CineVideo. After spending six months with the V30, what is there to say about the device as a videography tool?
Hardware features: Lost & Found
Well, first off, let’s get the big thing out of the way that bothered me the most before I even bought the V30: abandoning the removable battery. LG was basically the last major phone maker to offer an exchangeable battery on a flagship with the V20, so kudos for that, but they eventually ditched it for the V30. I somewhat get the idea that a unibody design without removable parts might make the device look slicker and even has a practical upside when it comes to water and dust resistance (yes, you CAN submerge the V30 without a case thanks to the IP68 rating). But apart from the concerning fact that this is a considerable ecological issue, because it makes it likely that you will just buy a new phone when battery life starts to falter, it also does away with the „power management security net“ and fosters the fear of running out of power. Especially when using such a device extensively for professional purposes, a back-up battery that lets you go from 0 to 100% in a matter of seconds feels very comfortable to have around. Sure, external batteries a.k.a. power banks are a common thing by now, but they are not quite as compact and fast at getting the recharging job done. While dropping the removable battery is unfortunate, it’s an all-too-common thing: LG is only following the rest of the pack, as nowadays you can hardly find a phone that still has this feature. Furthermore, I have to say that I was pleasantly surprised by the V30’s battery life. It’s much better than the V10’s, and in some tests with very long recordings the phone only consumed around 30% of the battery when recording continuously for almost two hours. I just hope the battery doesn’t degrade too fast over time.
Speaking of useful features that it’s en vogue to give the sack: LG is still holding on to the 3.5mm headphone jack, which will make a lot of people happy as it’s still a very easy and universal way to attach external mics for better sound quality (or do audio monitoring). The V30 also has a USB-C port which can be used for connecting external mics as well, but as of now there are hardly any USB-C mics out there to make use of it. One very clever and useful exception is the Samson Go Mic Mobile wireless system, which comes with a whole bunch of connecting cables, including a USB-C one. One day in the (hopefully not so distant) future, truly wireless audio solutions sending high-quality audio directly from the mic to the phone’s video recording app might replace wired solutions, but the current state of quality and reliability in that area isn’t yet up to the task as far as I can see. As for the internal mic, there are actually two, so the LG V30 is one of only a few phones that record stereo natively. This is very useful if you are capturing a soundscape or if you have sound sources moving around. And to tell the whole story, the V30 actually has a third internal mic: the phone’s earpiece kicks in as a life saver in very loud environments (like, say, a rock concert) to avoid distorted audio. Another useful feature that the V30 fortunately kept is support for external storage via microSD card; popping in a 128 or 256GB card is a pretty cheap way to get more space for media and apps on your phone.
Three and a Half Cameras
Let’s continue our inspection of the V30’s hardware and take a look at what might be considered the most important thing for a phone, at least when talking about videography: the camera(s). While the V10 had a somewhat peculiar lens set-up with a single rear camera but dual front cameras, the V20 flipped this around, which I personally find more useful if you’re not a selfie-vlogger. Dual rear cameras have become all the rage in the last couple of years, almost a must-have on flagship phones and even some mid-rangers (unless your name is Google Pixel). Not all secondary rear cams are created equal though. Some are only for shooting nice portrait shots with a blurred background, some feature a monochrome sensor for black & white photography with better low-light performance and dynamic range, and some have a different focal length than the main camera, going either for telephoto (zoom) or wide-angle. For smartphone videographers, only the last two options are actually helpful. And while I have been known to whine about the lack of optical zoom on smartphones in the past, I do have to say that from a practical standpoint, wide-angle seems like the best, most versatile choice after all. Especially if you find yourself indoors backed up against a wall, having a wide-angle is just incredibly helpful to fit more of the scenery into the shot. And the ability to shoot two very different images from a single point without having to move around is fantastic. So while a wide-angle secondary camera is actually a rare choice in the market, LG can only be applauded for going down this route. And after the V20 showed a noticeable amount of barrel distortion on the wide-angle, the V30’s 12mm secondary rear cam has become more refined in that respect. It now also has a much wider aperture than the V20’s (f/1.9 vs f/2.4 – a smaller f-number is better), which helps in low-light.
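How much does the jump from f/2.4 to f/1.9 actually buy you? Light gathering scales with the inverse square of the f-number, so a quick back-of-the-envelope calculation (my own, not an LG figure) looks like this:

```python
import math

def light_gain(f_old: float, f_new: float) -> float:
    """How many times more light the smaller f-number lets through."""
    return (f_old / f_new) ** 2

gain = light_gain(2.4, 1.9)   # V20 wide-angle vs V30 wide-angle
stops = math.log2(gain)       # the same gain expressed in exposure stops

print(f"f/1.9 gathers about {gain:.2f}x the light of f/2.4 (~{stops:.2f} stops)")
```

So the V30’s wide-angle gets roughly two thirds of a stop more light than the V20’s, a noticeable but not dramatic low-light advantage.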
There are three limitations when using the wide-angle, however. The first might actually be useful in certain situations: since the focus is fixed and there is no auto-focus hunting, you are guaranteed not to get unexpected, sudden focus shifts. A fixed focus would be a serious problem for the main camera, but for the wide-angle, it’s ok. The second limitation is a real one though: no OIS (optical image stabilization) and no EIS (electronic image stabilization) either. The third one is the biggest: the V30’s new „LG-Cine Log“ profile (more about that later) is not available for the wide-angle camera, only for the main snapper.
The 30mm main rear camera has OIS (plus an option for additional EIS called „Steady recording“ – not available for UHD/4K though), laser auto-focus, an f/1.6 aperture and the ability to record in LG-Cine Log. Both rear cameras let you record in UHD/4K (but only up to 30fps; 60fps is only available at FHD resolution, 120fps only at 720p in slow-motion mode) and you can switch between them with a single tap, even while recording. The colors of the two rear cameras don’t match 100% if you take a really close look, but they are close enough for most purposes, I’d say. Now while the main rear camera seems to be excellent for low-light with its wide f/1.6 aperture, the relatively small size of the image sensor (1/3.1“ with 1.0µm pixel size) unfortunately diminishes this advantage. With very few exceptions (especially when it comes to video), all smartphones still struggle in low-light situations, so it would be wrong to single out LG here. I would classify the V30’s low-light performance as solid, but not as good as one might have expected given the promising aperture of the main cam.
What about the selfie camera? Well, I was already a little suspicious when I saw the tiny camera hole on the front. As it turns out, not only did LG scrap one front camera compared to my old V10, but the actual quality of the footage isn’t really better than the V10’s from two years ago as far as I could see. That’s a bit of a disappointment for sure, but personally I don’t care too much as I rarely use the front camera. As for resolution, you can shoot FHD at 30fps, which is the solid standard, but that’s it – no UHD/4K or higher frame rates. Another note: while there isn’t a second front-facing camera, you still get the option to switch between a wider and a narrower field of view – as there’s no second lens, this is done by a software crop of the image.
Before moving on to the software side of things, a few words about another very important hardware aspect: the chipset. The LG V30 is equipped with a capable Snapdragon 835 that not only lets you shoot video in UHD/4K resolution (although only up to 30fps) but also edit it. Importing footage into Android’s two best video editors, KineMaster and PowerDirector, reveals that you can even have a second video track when working with UHD/4K footage in those apps, which is excellent news. For those interested in creating Augmented Reality (AR) enriched video: the V30 is compatible with Google’s ARCore, and the Snapdragon 835 has enough muscle to let you use an app like “Just a line”, which lets you draw/doodle in AR space. There isn’t too much around in this category yet though.
The King of Manual Video Controls
But while good cameras and powerful chipsets can also be found on other (Android) phones, the unique selling point of the V-series has always been its focus on videography, with all the manual controls and features you get in the native camera app. I’ve already talked about that regarding the V10 when discussing native camera apps on smartphones in an earlier post (I still owe you the second part of that article, what a shame!) but there have been some significant additions since the V10, so it’s worth going into detail again.

Let’s have a look at the interface of the manual video mode: on the far left at the bottom of the screen you find an audio level meter, which reassures you that there’s actually audio coming in from the mic(s) and also helps you make sure the audio isn’t too loud (peaking). No other native camera app on a smartphone has that – you can only find it in advanced 3rd party apps like FilmicPro, Cinema FV-5 etc. To the right there’s information on what resolution, frame rate and bitrate you are currently using. Next is a button with a microphone icon which opens a transparent panel overlay with some advanced audio controls audiophiles will love: you get to change the input gain, activate a low cut filter or set a limiter. While recording, you even get live audio waveforms when this panel is open, which gives you even more precise visual information about the incoming audio than the audio level meter. From this panel you can also apply a wind noise filter and select an external mic if one is connected via the headphone jack (edit: unfortunately it doesn’t seem to support mics connected via the USB-C port like I originally wrote in this post!). At this point I would also like to mention that the app even allows for audio monitoring via cabled or Bluetooth headphones.
There’s a small delay on the live audio, so listening to it over extended periods of time can be irritating, but it’s definitely good for quickly checking the audio for possible unwanted sonic interference. The next button is for white balance: you can switch between auto mode and a Kelvin scale that ranges from 2300 to 7500K. No presets are available though. Next in line is focus. Again, you can switch between auto-focus and manual focus. When you choose manual focus mode you get to enjoy another staggering feature for a native camera app: focus peaking. Focus peaking adds a colored overlay to the areas of the frame that are in focus and is therefore incredibly helpful for getting the focus right. It can usually only be found on professional „big“ cameras. Focus peaking can be switched on or off when using manual focus on the V30. One shortcoming: you can only use focus peaking BEFORE starting the recording, which makes fancy rack focus action while filming still a bit of a gamble. The only Android app that allows focus peaking even while recording is FilmicPro. The EV button lets you adjust the exposure value without having to set precise values for ISO and shutter speed, but as there’s no option to lock the exposure in that case, I find it fairly useless. On to the two real exposure parameters: ISO and shutter speed. The ISO ranges from 50 to 3200, the shutter speed from 1/25s to 1/4000s. One crucial improvement over the V10 regarding the shutter speed is that you can now select „PAL“ shutter speeds, most importantly 1/50s. This is important because in Europe and some other regions many artificial light sources flicker at the mains frequency of 50 Hertz, which causes ugly banding effects in your footage if you are not shooting with a shutter speed that matches this frequency.
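The 50 Hertz logic is easy to sketch: lights on 50 Hz mains pulse in brightness 100 times per second, so any exposure time that spans a whole number of those pulses averages the flicker out. A small Python check (my own illustration of the rule, not anything LG documents):

```python
def is_flicker_safe(shutter_denominator: int, mains_hz: int = 50) -> bool:
    """True if a shutter speed of 1/denominator s avoids mains-light flicker."""
    pulse_hz = 2 * mains_hz  # light intensity pulses at twice the mains frequency
    # 1/denominator spans whole pulses exactly when denominator divides pulse_hz
    return pulse_hz % shutter_denominator == 0

for denom in (25, 50, 60, 100, 125):
    verdict = "safe" if is_flicker_safe(denom) else "may band/flicker"
    print(f"1/{denom}s on 50 Hz mains: {verdict}")
```

Which is exactly why 1/50s matters so much in PAL regions, and why the 1/60s-style shutter speeds common on NTSC-oriented devices cause trouble there.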
The last thing you find on the far right of the bottom control panel is what I like to call the „panic button“ and it’s a very cool feature: If you ever find yourself lost fiddling with all the manual controls but need to quickly start recording all of a sudden you can just push the „A“ with a circling arrow around it and everything goes back to auto: white balance, focus, exposure.
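For the curious: the kind of reading the audio level meter mentioned earlier shows can be sketched in a few lines. This is a generic peak-metering formula (dBFS, decibels relative to digital full scale), not LG’s actual implementation:

```python
import math

def peak_dbfs(samples: list[float]) -> float:
    """Peak level in dBFS for samples normalized to the -1.0..1.0 range."""
    peak = max(abs(s) for s in samples)
    if peak == 0.0:
        return float("-inf")  # digital silence
    return 20 * math.log10(peak)

quiet = [0.01, -0.02, 0.015]  # hypothetical low-level signal
hot = [0.7, -0.98, 0.95]      # hypothetical signal dangerously close to clipping

print(f"quiet signal peaks at {peak_dbfs(quiet):.1f} dBFS")
print(f"hot signal peaks at {peak_dbfs(hot):.1f} dBFS (0 dBFS = clipping)")
```

A limiter, like the one in LG’s audio panel, simply prevents the signal from ever reaching that 0 dBFS ceiling.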
But not only the control panel of the main recording interface is stuffed with controls and features; there’s more to find in the settings section, which you can access by tapping the cog wheel at the bottom of the left side bar. The first option at the top of the list is the frame rate. And it’s here that I and some other folks miss a particular something: PAL frame rates, PAL being the broadcast standard in Europe and some other regions of the world. Normally you couldn’t really blame a smartphone for not having the option to shoot in 25 or 50fps in the native camera app (the only phones that ever did at least 25fps were Nokia’s/Microsoft’s Lumia phones), but with all the amazing bells and whistles in terms of pro videography controls on the V-series, it’s a real shame that LG didn’t pay attention to that as well. Truth be told, this option will only be of serious relevance to a certain group of videographers: those who shoot for PAL broadcast and/or use their phone in combination with a ‚regular‘ camera that only shoots PAL frame rates. If you don’t belong to this category, you can be perfectly happy with the options at hand: 1, 2, 24, 30 and 60fps (60fps is not available when shooting UHD/4K or LOG). Still, for the highly unlikely case that someone from LG reads this blog: PLEASE do add the option to shoot in 25/50fps! How hard can it be? I hope there’s a golden future ahead where regional frame rates are a thing of the past, but that future might still be a bit too far away to just ignore the present. Yes, you can use 3rd party apps to shoot in 25fps on the V30, but if LG gives us a native camera app this good with manual video controls and the idea that this is a serious videography tool, why be ignorant in that particular area?

Next in the settings list is bitrate. Yes, you heard that right: you can adjust the bitrate. Another feature that can otherwise only be found in advanced 3rd party apps.
You can choose between three different settings: high, medium and low. The bitrates depend on the selected resolution and frame rate and – upon closer inspection – turn out to be not as high as some power users would have liked. The maximum you get is 52 Mbit/s when shooting in UHD/4K; the „high“ option in 1080p at 30fps is 24 Mbit/s. Still, it’s nice to have some control over the bitrate at all in a native camera app. Below the bitrate option there’s another very interesting feature that will excite every audiophile: you can toggle on „HiFi recording“, which pushes the audio bitrate for video to a crazy 2400 Kbit/s (24-bit PCM stereo) while the regular set-up is 156 Kbit/s (AAC) and no other smartphone I have encountered exceeded 320 Kbit/s. If you want to edit your footage on the phone, be warned that not every video editing app supports PCM audio (KineMaster and PowerDirector do though) – and neither does Twitter’s video player, by the way.
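Both sets of numbers are easy to sanity-check. A minute of footage at a given video bitrate, and the uncompressed PCM audio bitrate (assuming a 48 kHz sample rate, which the quoted ~2400 Kbit/s figure roughly matches), work out as follows:

```python
def mb_per_minute(mbit_per_s: float) -> float:
    """Approximate storage consumed per minute of footage, in megabytes."""
    return mbit_per_s * 60 / 8  # 60 seconds, 8 bits per byte

def pcm_kbit_per_s(sample_rate: int, bit_depth: int, channels: int) -> float:
    """Bitrate of uncompressed PCM audio in kbit/s."""
    return sample_rate * bit_depth * channels / 1000

print(f"UHD/4K 'high' (52 Mbit/s): ~{mb_per_minute(52):.0f} MB per minute")
print(f"1080p30 'high' (24 Mbit/s): ~{mb_per_minute(24):.0f} MB per minute")
# 24-bit stereo PCM at an assumed 48 kHz: close to the quoted HiFi figure
print(f"HiFi audio: ~{pcm_kbit_per_s(48_000, 24, 2):.0f} kbit/s vs 156 kbit/s AAC")
```

So even at the maximum setting, a UHD/4K minute stays under 400 MB, which also puts the value of a big microSD card into perspective.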
What the LOG!?!
But let’s move on to the big new feature that LG introduced to the V-series with the V30: LG-Cine Log. What’s „log“? I won’t and I can’t go into the details of this but let’s just say it is a special shooting profile that applies certain processing to the image which will give you a better dynamic/tonal range and generally allows more flexibility in post production when you want to create a specific look for your footage. It’s a feature usually only found on professional cinema cameras and calls for a certain amount of post production (grading/coloring) because the „raw“ footage usually looks rather dull and pale. So if your workflow includes a fast turnaround you probably shouldn’t use the LOG profile. It’s a very cool feature though, I absolutely love it, not least because the regular footage might be considered over-sharpened and over-saturated, an unfortunate habit of many/most smartphones as they are trying to satisfy what they deem the crowd’s taste. And while I’d say that the V30’s non-LOG image quality is a tad behind Google’s recent Pixel phones, Samsung’s S9/S9 Plus/Note 9 and the latest iPhones, the native LOG profile makes up for that in my opinion as you can really create stunning footage with it and have immense flexibility in post production. However it can’t be denied that shooting LOG probably is only of interest to a certain group of videographers. But hey, if any smartphone should have the ability to shoot LOG in the native app, it should be the V30! Two things to keep in mind when using LG-Cine Log: You can’t use the wide-angle lens and you can only shoot up to 30fps. Here’s a “show reel” of footage shot in LG-Cine Log on the V30 (graded in FCPX).
And here are two shorter videos with LG V30 LOG footage, one “raw” like it is originally recorded, the other with minor grading applied.
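To get an intuition for why log footage looks flat but grades so well, consider how an 8-bit file distributes its code values across exposure stops. With linear encoding, every stop down gets half the codes of the one above; a log curve spreads them evenly. The numbers below are a generic illustration, not LG-Cine Log’s actual curve:

```python
CODES = 256  # 8-bit output
STOPS = 8    # assumed dynamic range to squeeze into the file

def linear_codes_per_stop(stop: int) -> float:
    """Codes a linear encoding spends on stop N below clipping (N=0 is brightest)."""
    upper = 2.0 ** -stop
    lower = 2.0 ** -(stop + 1)
    return (upper - lower) * (CODES - 1)

log_codes_per_stop = (CODES - 1) / STOPS  # a log curve gives each stop an equal share

for stop in range(4):
    print(f"stop -{stop + 1}: linear ~{linear_codes_per_stop(stop):5.1f} codes, "
          f"log ~{log_codes_per_stop:.1f} codes")
```

Linear encoding burns half its codes on the brightest stop while the shadows starve; the log curve trades some highlight precision for far more usable shadow detail, which is exactly the flexibility you feel in grading.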
And as I already talked about bitrates earlier on, it’s particularly unfortunate with regard to shooting LOG that the bitrates can’t be bumped up to higher levels. One last thing: when using the LOG profile you will find a button in the top right corner of the main interface that lets you toggle a LUT (a so-called ‚Look-Up Table‘) on and off. Again, I don’t really want to get into the specifics here, but suffice it to say that this gives you a preview of what the graded result of your LOG footage COULD look like; it is NOT recording that preview! The image that is recorded is ALWAYS the one you see when the LUT is toggled OFF!
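Conceptually, a 1D LUT is just a lookup with interpolation over pixel values. A minimal sketch (the five-point curve here is invented for illustration, not one of LG’s LUTs):

```python
def apply_lut(value: float, lut: list[float]) -> float:
    """Map a 0..1 pixel value through a 1D LUT with linear interpolation."""
    pos = value * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

contrast_lut = [0.0, 0.15, 0.5, 0.85, 1.0]  # made-up gentle contrast curve

for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"input {v:.2f} -> output {apply_lut(v, contrast_lut):.3f}")
```

The phone applies something like this to the preview only; the file on disk keeps the untouched log values, which is why the recorded image is always the LUT-off one.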
Let’s wrap up the settings menu with a quick look at some other features: Bright Mode and HDR can’t be used in the manual video recording mode (only in auto mode), which renders them useless for me. Steady Recording is an additional (software-powered) stabilizing option that crops the frame and can’t be used when recording in UHD/4K. Tracking Focus tracks a person or object moving about the frame, which can be useful in certain situations. It doesn’t always work perfectly but it’s worth trying out. Covered Lens gives you a warning when you (accidentally) cover part of the wide-angle camera’s image. This can indeed be helpful, as I have occasionally found myself inserting my pinkie into the frame without intending to, because the wide-angle has a really wide angle. On the right-hand side of the settings menu you can activate a timer (3 or 10 seconds) and select a resolution. Resolution ranges from 720p to UHD/4K and offers three different aspect ratios (16:9, 18:9 and 21:9 – the latter two are only available up to 1080p). 21:9 is interesting because the ultra-widescreen format gives you a certain „cinematic“ effect. If you combine that with the according frame rate (24fps) and the LOG profile, you are setting the stage for that sweet silver screen look. And for those of you interested in creating vertical video content: you can also shoot vertically with all features & manual controls. Manual mode is not available when using the front camera though – a little bummer.
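The three aspect ratios translate into frame widths by simple arithmetic (shown here at a 1080-pixel height; the exact pixel counts the device writes may be rounded slightly differently):

```python
def width_for_ratio(height: int, ratio_w: int, ratio_h: int) -> int:
    """Frame width for a given height and aspect ratio."""
    return height * ratio_w // ratio_h

for rw, rh in ((16, 9), (18, 9), (21, 9)):
    print(f"{rw}:{rh} at 1080 px high -> {width_for_ratio(1080, rw, rh)}x1080")
```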
More fun with shooting modes…
The manual video mode is outstanding but what about any other interesting video modes in the native camera app? There’s one particular mode that was also first introduced with the V30 and got a lot of attention before the phone’s release: CineVideo. The mode actually bundles two separate features together – and the bundling left me somewhat confused. One aspect of the CineVideo mode is that you can apply a couple of slick “cinematic” filters (some are even calling them LUTs, though I’m not sure that’s correct) to your image. But while you get control over the strength of the filter and the vignetting that comes with it, that’s basically it. Yes, you do get some very rudimentary exposure value control but you can’t lock the exposure or set specific values for ISO and shutter speed, which is really unfortunate and dramatically reduces the mode’s usefulness. The other feature in the CineVideo mode is Digital Point Zoom. You can choose a point within the frame and smoothly zoom in by using a virtual slider. Yes, the zoom is only digital but to my surprise the quality loss isn’t all that bad and even when fully zoomed in, the image can still be considered acceptable. So it’s a real shame that LG restricted this feature to the CineVideo mode – it would have been very cool to have it in the manual video mode as well. There you can also zoom digitally by using the common two-finger zoom gesture but the zoom will be very abrupt because there’s no slider. And you also can’t zoom in to an off-center point of the frame like you can with the Digital Point Zoom.
So one small general gripe I have with modes and features on the V30 is that certain useful things are only available in certain modes / in certain settings and not in others which can be a little frustrating at times.
“Popout” is another fairly interesting mode as it uses both the main and the wide-angle camera simultaneously to create a picture-in-picture video with two different views from the same camera standpoint. The cool thing is that you can apply some effects to the wide-angle image: Fisheye, Black & White, Vignette and Lens Blur. You can even combine several or all of them at the same time. On top of that you can also change the layout of the picture-in-picture to have a circle instead of a rectangle, or have three segments of which the top and the bottom are filled by the wide-angle camera while the middle one is filled by the main camera. It’s more of a fun mode and I don’t use it often but it can come in handy when you’re trying to create something more playful, for instance a short social media video.
The simultaneous use of two cameras gets even more interesting with the “Match Shot” mode. This is a fantastic feature for vloggers and mobile journalists reporting as a one-(wo)man band – I have already mentioned this mode in my blog post #12: It creates a split-screen recording using both the front and a rear camera simultaneously, which means you can basically show yourself AND your own point of view at the same time. This is just super cool if you are doing an on-the-scene piece-to-camera for a news report or some travel vlogging. For each screen segment you can choose between the regular view and the wide-angle so you have some flexibility there as well. Best of all: external mics are even supported! Some downsides on the other hand: The aspect ratio is fixed to 18:9 (the resolution of 2880×1440 is good though, so one can crop to 16:9 in post), the frame rate is only 24fps and everything runs on auto, no manual controls. Still, it’s an amazing feature with great potential and it’s a real pity that LG has apparently ditched this mode again on the V40. Here’s a video (not mine) with the Match Shot mode in action:
If you are into square video and doing super-short teasers for longer content you might find some use for the “Grid Shot” mode, which lets you shoot four very short clips of a maximum of 3 seconds each and assembles them into a split-screen square video (resolution: 1440×1440) playing back all four clips at the same time.
The last interesting mode for video is “Slo-Mo”. You get slow motion at 240fps – but only in 720p and with barely any manual controls. It’s nice to have but it’s definitely not LG’s strong suit – Apple and Samsung offer much better quality here in their flagship phones.
Camera2 API & 3rd party apps
So with the V30’s native camera app being so amazing, is there any need at all for 3rd party apps? Yes and no, or as we like to say in German: Jein. The biggest reason for using a 3rd party app is probably the frame rate: As mentioned before, the native app does not offer any PAL standard frame rates (25/50fps), which might be important to some users. Other than that, the only app that can actually beat LG’s native camera app when it comes to features and controls is FilmicPro, which gives you among other things focus peaking during recording, a waveform monitor and false color analytics to check exposure in difficult situations, the ability to shoot at higher bitrates and the option to use the more efficient (but not yet fully mass-market compatible) HEVC/H.265 codec instead of the standard AVC/H.264. But as I have pointed out in an earlier blog post, the availability of advanced manual video controls in 3rd party apps on Android devices very much depends on how well the phone maker has implemented the so-called Camera2 API (if you want to learn more about it, check out my two blog posts about it here and here). Without proper implementation, 3rd party app developers can’t access or make use of certain controls. So how’s the Camera2 API support on the V30? Well, it’s a mixed bag. It does have the highest support level (“Level 3”) for both rear cameras (only “Limited” for the front camera) so theoretically things should be fine, but apparently LG overlooked a small bug that affects focusing in 3rd party camera apps. Sometimes the focus gets stuck and you have to quit and re-launch the app. While I first experienced this with FilmicPro, it also happened with other 3rd party apps, so it seems to be a more general issue and not related to FilmicPro alone. Let’s hope LG can fix this nuisance with a software update.
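By the way, to illustrate why the HEVC option mentioned above matters: HEVC’s efficiency gain over AVC is commonly quoted at roughly 40–50% bitrate savings for comparable quality (the exact figure depends heavily on content, encoder and settings). A rough back-of-the-envelope comparison for a 10-minute clip, with bitrates that are my own illustrative assumptions rather than measured values:

```python
# Assumed bitrates for visually similar quality (illustrative only;
# real savings vary with content, encoder and settings).
AVC_MBPS = 48   # H.264 at a typical smartphone UHD bitrate
HEVC_MBPS = 28  # H.265 at roughly the often-quoted ~40% saving

def clip_size_gb(mbps: float, minutes: float) -> float:
    """Approximate file size in gigabytes for a clip at a given bitrate."""
    return mbps * 1e6 * minutes * 60 / 8 / 1e9  # bits -> gigabytes

print(f"AVC:  {clip_size_gb(AVC_MBPS, 10):.2f} GB")   # 3.60 GB
print(f"HEVC: {clip_size_gb(HEVC_MBPS, 10):.2f} GB")  # 2.10 GB
```

A sizeable difference on a phone where storage fills up fast – the trade-off being, as noted, that HEVC playback support isn’t universal yet.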
A positive aspect of LG’s Camera2 implementation on the other hand is the fact that 3rd party camera apps do get access to the secondary rear camera, something other Android phone makers are less welcoming about. So far, only FilmicPro and ProShot have actually integrated this as a feature though. In the case of FilmicPro this means that there is a way to shoot in LOG profile with the wide-angle lens after all! A word about frame rates: The ability to shoot in 25fps is one major reason for some to use 3rd party camera apps. Using the V30 with FilmicPro in 25fps has been mostly consistent and reliable so far (occasionally you do get 24.93fps or something else that’s not 100% on the spot) but you don’t get the higher PAL frame rate of 50fps (something very few Android handsets seem to allow at this point). And neither do you get 60fps, which is available in the native app, so LG still keeps some shackles on the API here for 3rd party apps. Surprisingly though, you can shoot at the even higher slow-motion frame rate of 120fps (up to FHD). So I’d say slow-motion capability comes out as a tie between the native camera app and FilmicPro: The native camera app lets you record at 240fps using the slow-motion mode but only in 720p, while with FilmicPro you “only” get a frame rate of 120fps but a higher resolution (1080p).
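To put that occasional 24.93fps deviation into perspective, here’s a quick sanity check of how far such a clip drifts when conformed to true 25fps material over a longer recording (pure arithmetic, no device specifics assumed):

```python
def drift_seconds(actual_fps: float, target_fps: float, minutes: float) -> float:
    """How much shorter (negative) or longer a clip plays back when its
    frames are conformed to the target frame rate on the timeline."""
    frames = actual_fps * minutes * 60
    return frames / target_fps - minutes * 60

# A 10-minute clip captured at 24.93fps but conformed to 25fps:
print(round(drift_seconds(24.93, 25.0, 10), 2))  # → -1.68
```

So a 10-minute take comes out about 1.7 seconds short – irrelevant for standalone clips, but worth knowing if you ever need to sync long takes against a separate audio recorder or a second camera.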
In the long run…
Before concluding this rather detailed inspection of the V30 I would like to address one more aspect: maximum recording length. While quite a few smartphone videographers usually take relatively short clips and don’t really care if there’s a limit of say 20 minutes for a single video, for others it’s really important to know about. Android used to have a single file size limit of around 4GB (more precisely 2^32 bytes, i.e. 4.29 GB in decimal units – which matches the well-known 4 GiB file size limit of the FAT32 format), but many phone makers were able to get rid of that with their own version of the Android OS (Sony, Huawei, Nokia, BQ, HTC for instance). Unfortunately, LG isn’t among them. That being said, LG vastly improved things compared to the V10. On the V10, the recording would stop upon reaching the file size limit and you would have to manually restart the recording. Not a good thing if you were using the phone as an alternative angle for a longer event while having your focus on the main camera, or if you really needed every second of the recording. With the V30 you don’t have to manually restart the recording anymore, it basically records continuously for as long as battery and storage allow. In the background however, the clip is chopped up into chunks of 4.29 GB and you lose a very short segment in between (I’d say it’s around 2 seconds maybe). It might not be the ideal solution for certain jobs but it’s definitely better than having to restart manually. After all, some might even argue that in case of file corruption it’s better not to have a single file. Of course the ideal solution would then be a spliced clip that can be seamlessly reassembled afterwards without dropping a single frame.
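If you want to estimate how often that chunking (and the tiny gap that comes with it) will hit your recording, it’s simple arithmetic. The bitrates below are assumptions on my part for illustration – check the actual bitrate of your own clips:

```python
CHUNK_BYTES = 2**32  # 4.29 GB in decimal units = 4 GiB, the classic FAT32 limit

def chunk_minutes(video_mbps: float, audio_kbps: float = 256) -> float:
    """Minutes of recording per 4.29 GB chunk at the given bitrates.
    Both bitrate values are illustrative assumptions, not measured ones."""
    bits_per_sec = video_mbps * 1e6 + audio_kbps * 1e3
    return CHUNK_BYTES * 8 / bits_per_sec / 60

print(round(chunk_minutes(48), 1))  # → 11.9 (roughly every ~12 minutes)
```

So at an assumed ~48 Mbps UHD bitrate you’d lose a short segment roughly every twelve minutes – at lower FHD bitrates correspondingly less often.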
So, in the end, is the LG V30 a smartphone videographer’s dream machine? For the most part I’d say yes. Its focus on videography is absolutely unique in the smartphone market, and the range of advanced pro tools for shooting video that is available right out of the box – without having to bother with 3rd party apps that might have certain quirks thanks to Android’s fragmentation – is utterly brilliant. The native camera app has been rock-solid in terms of reliability; it hasn’t crashed on me once so far. It’s not quite perfect though: Especially when taking into account that this phone was made for (professional) videographers, it’s a bit puzzling that LG didn’t bother to include PAL frame rates in its native camera app. I’m not an expert on this but I’d say it shouldn’t have been too much of a problem technically. Maybe they just didn’t care? Who knows… This leaves me with two wishes: a) Please, LG, go the extra inch and include PAL frame rates in the native camera app with a software update and b) to all you other smartphone makers out there: please follow LG’s example in paying more attention to your phone’s native camera app in terms of advanced manual video controls. Thank you.
Back in February I published a list with a wide selection of (potentially) useful Android apps for media production. Despite the fact that I mostly write this blog in English now, the list was published in its German version first. I did promise an English version however and I’ve been working on it ever since. The new English version is not just a translation, it’s actually an update with some apps having been kicked out and others added. And what occasion could be better to finally publish it than while MoJoFest is happening in Galway, Ireland. MoJoFest is an exciting 3-day conference (May 29th to 31st) about content creation with mobile devices, initiated and organized by former RTE Innovation Lead Glen Mulcahy. Check out their website and follow the hashtag #MoJoFest on Twitter! I’ll be giving a workshop/presentation about smartphone videography on Android devices on Thursday, May 31st, and as a precursor, I’ll upload the English version of my app list here. Please keep in mind that there might be some typos or even outdated information in it as the mobile world keeps spinning at an incredible pace and things can change quickly. This is also a highly subjective list and by no means “definitive” or “ultimate”; you may find that other apps which are not on the list suit you better for your work. If you think an app you know and love should absolutely be on this list or if you have new information about apps already on the list, please do contact me! But now, without much further ado…
When using a headline like the one above, camera people usually refer to the idea that you should already think about the editing when shooting. This basically means two things: a) make sure you get a variety of different shots (wide shot, close-up, medium, special angle etc.) that will allow you to tell a visually interesting story but b) don’t overshoot – don’t take 20 different takes of a shot or record a gazillion hours of footage because it will cost you valuable time to sift through all that footage afterwards. That’s all good advice but in this article I’m actually talking about something different: a way to create a video story with different shots while only using the camera app – no editing software! In a way, this is rather trivial but I’m always surprised how many people don’t know about it, as it can be extremely helpful when things need to go super-fast. And let’s be honest, from mobile journalists to social media content producers, there’s an increasing number of jobs and situations to which this applies…
The feature that makes it possible to edit a video package within the camera app itself while shooting is the ability to pause and resume a recording. The most common way to record a video clip is to hit the record button and then stop the recording once you’re finished. After stopping the recording the app will quickly create/save the video clip to be available in the gallery / camera roll. Now you might not have noticed this, but many native camera apps don’t just have a “stop” button while recording video but also one that will temporarily pause the recording without creating/saving the clip yet. Instead, you can resume recording another shot into the very same clip you started before, basically creating an edit on the go while shooting, with no need to mess around with an editing app afterwards. So for instance, if you’re shooting the exterior of an interesting building, you can take a wide shot from the outside, then pause the recording, go closer, resume recording with a shot of the door, pause again and then go into the building to resume recording with a shot of the interior. When you finally decide to press the “stop” button, the clip that is saved will already have three different shots in it. The term I would propose for this is “shediting”, obviously a portmanteau of “shooting” and “editing”. But that’s just a spontaneous thought of mine – you can of course call this whatever you want.
What camera apps will let you do shediting? On Android, actually most of the native camera apps I have encountered so far. This includes phones from Samsung, LG, Sony, Motorola/Lenovo, Huawei/Honor, HTC, Xiaomi, BQ, Wileyfox and Wiko. The only two Android phone brands that didn’t have this feature in the phone’s native camera app were Nokia (as tested on the Nokia 5) and Nextbit with its Robin. As for 3rd party video recording apps on Android, things are not looking quite as positive. While Open Camera and Footej Camera do allow shediting, many others like Filmic Pro, Cinema FV-5, Cinema 4K, Lumio Cam and ProShot don’t have this feature. When looking at the other mobile platforms, Apple still doesn’t have this feature in the iOS native camera app and the only advanced 3rd party video recording app that will let you do it appears to be MoviePro. And while almost extinct, Lumia phones with Windows 10 Mobile / Windows Phone on the other hand do have this feature in the native camera app just like most Android phones.
Sure, shediting is only useful for certain projects and situations because once you leave the camera app, the clip will be saved anyway with no possibility to resume, and you can’t edit shots within the clip without heading over to an editing app after all. Still, I think it’s an interesting tool in a smartphone videographer’s kit that one should know about because it can make things easier and faster.
EDIT: After I had published this article I was asked on Twitter if the native camera app re-adjusts or lets you re-adjust focus and exposure after pausing the recording because that would indeed be crucial for its actual usefulness. I did test this with some native camera apps and they all re-adjusted / let you re-adjust focus and exposure in between takes. If you have a different experience, please let me know in the comments!
Xiaomi has been a really big name in China’s smartphone market for years, promising high-end specs and good build quality for a budget price tag – but only at the end of last year did they officially enter the global scene with the Mi A1. The Mi A1 is basically a revamped Mi 5X running stock Android software instead of Xiaomi’s custom Mi UI. It’s also part of Google’s Android One program which means it runs a “clean” Google version of Android that gets quicker and more frequent updates directly from Google. For a very budget-friendly 180€ (current online price in Europe) you get a slick looking phone with dual rear cameras, featuring a 2x optical zoom telephoto lens alongside the primary camera. Sounds like an incredible deal? Here are some thoughts about the Mi A1 regarding its use as a tool for media production, specifically video.
After spending a couple of days with the Mi A1, I would say that this phone is definitely a very interesting budget choice for mobile photographers. The fact that you get dual rear cameras (the second one is a 2x optical zoom as mentioned before) at this price point is pretty amazing. The photo quality is quite good in decent lighting conditions (low light is problematic but that can be said of most smartphone cameras), you get a manual mode with advanced controls in the native camera app and the portrait mode does a surprisingly good job at creating that fancy bokeh effect, blurring the background to single out your on-screen talent. A lot of bang for the buck. Video – which I’m personally more interested in – is a slightly different story though.
Let’s start with a positive aspect: The Xiaomi Mi A1 lets you record in UHD/4K quality which is still a rarity for a budget phone in this price range. And hey, the footage looks quite good in my opinion, especially considering the fact that it’s coming from a (budget) smartphone. I have uploaded some sample footage on YouTube so see for yourself.
The video bitrate for UHD/4K hovers around 40 Mbps in the native app which is ok for a phone but the audio bitrate is a meager 96 Kbps (same in FHD) – so don’t expect full, rich sound. But this is only the beginning of a couple of disappointments when it comes to video: One of the Mi A1’s promising camera features, the 2x optical zoom lens, CANNOT be used in the video mode, only in the photo mode! What a bummer! This goes for both the native camera app and 3rd party apps.
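As for those bitrate numbers, it’s easy to translate them into practical storage use (simple arithmetic on the figures above):

```python
def mb_per_minute(video_mbps: float, audio_kbps: float) -> float:
    """Megabytes written per minute of recording at the given bitrates."""
    bits_per_min = (video_mbps * 1e6 + audio_kbps * 1e3) * 60
    return bits_per_min / 8 / 1e6

# Mi A1 UHD/4K: ~40 Mbps video + 96 Kbps audio
print(round(mb_per_minute(40, 96)))  # → 301, so ~300 MB per minute
```

It also makes plain how lopsided the split is: the 96 Kbps audio track accounts for well under 1% of the data – there would have been plenty of headroom for richer sound.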
Talking about 3rd party camera apps, it’s also a huge let-down that the Camera2 API support (what is Camera2 API?) is only “Legacy” out of the box, even though the Mi A1 is part of Google’s Android One program. “Legacy” means that third party camera apps can’t really tap into the newer, more advanced camera controls that Google introduced with Android 5 in 2014, like precise exposure control over ISO and shutter speed. Because of this, you can’t install an app like Filmic Pro in the first place and other advanced camera apps like Cinema FV-5, ProShot, Lumio Cam, Cinema 4K, Footej Camera or Open Camera can’t really unleash their full potential. Interestingly, there seems to be a way to “unlock” full Camera2 support via a special procedure without permanently rooting your device (look here) but even after doing so, Filmic Pro can’t be installed – probably because the Play Store keeps the device’s original Camera2 support information in its database to check if the app is compatible, without actually probing the current state of the phone. This is just an educated guess however. Still, many of us might not feel comfortable messing around with our phones in that way and it’s a pity Xiaomi doesn’t provide this out of the box on the Mi A1.
Lackluster Camera2 API support can be remedied by a good native camera app, but unlike with photos, there is no pro or manual mode for videos on the Mi A1 – it’s actually extremely limited. While you can lock the focus by tapping (there are two focus modes, tap-to-focus and continuous auto-focus), you are only able to adjust the auto-exposure within a certain range (EV), not lock it. There’s also no way to influence the white balance. Shooting at a higher frame rate (60fps)? Not possible, not even in 720p (there’s a not-too-bad 720p slow-motion feature though). Apropos frame rates: I noticed that while the regular frame rate is the usual 30fps, the native camera app reduces the fps to 24 (actually 23.98 to be precise) when shooting under low-light conditions to gain a little bit more light for each frame. That’s also the reason why I made two different YouTube videos with sample footage, so I was able to keep the original frame rate of the clips. I have experienced this behaviour of dropping the frame rate in low light in quite a few (native) camera apps on other phones as well, and from the standpoint of a run-of-the-mill smartphone user taking video this is actually an acceptable compromise in my opinion (as long as you don’t go below 20fps) to help tackle the fact that most smartphone cameras still aren’t naturally nocturnal creatures. It can however be a problem for more dedicated smartphone videographers who want to edit their footage, as it’s not really good to have clips in one project that differ so much in terms of fps. 3rd party apps might help keep the fps more constant.
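The reasoning behind that frame rate drop is simple exposure math: at a lower frame rate the shutter can stay open longer for each frame. Assuming the shutter opens for (almost) the full frame duration – a simplification, real apps keep some margin – the maximum gain looks like this:

```python
def max_exposure_gain(fps_high: float, fps_low: float) -> float:
    """Factor by which per-frame exposure time can grow when dropping the
    frame rate (simplified: shutter time ~= full frame duration)."""
    return fps_high / fps_low

print(round(max_exposure_gain(30, 23.98), 2))  # → 1.25
```

So dropping from 30fps to 23.98fps buys the sensor up to roughly 25% more light per frame – about a third of a stop, which is exactly the “little bit more light” the app is after.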
And there are still two other big reasons to use a 3rd party app on the Mi A1 despite the lack of proper Camera2 API support: locking exposure and using an external microphone via the headphone jack (yes, there is one!). One more important shortcoming to talk about: It’s maybe not too surprising that there is no optical image stabilization (OIS) on a phone in this price range, but given the fact that you can shoot 4K, I would have expected electronic image stabilization (EIS) at least when shooting in 1080p resolution. But there’s no EIS in 1080p, which means that you should put the phone on a tripod or use a gimbal most of the time to avoid getting shaky footage. With a bit of practice, however, you might pull off a decent handheld pan or tilt to avoid having only static shots.
So I’ve talked about the video capturing part, but what about editing video on the Mi A1? The phone sports a Snapdragon 625, a slightly dated but still quite capable mid-range chipset from Qualcomm. You can work with up to two layers (a total of three video tracks) of FHD video in KineMaster and PowerDirector (the two most advanced Android video editing apps), which will suffice for most users. Important note: DON’T run the hardware analysis test in KineMaster though! It’s a hardware probing procedure meant to better determine the device’s capabilities in terms of editing video in the app. While the device capability information originally says you can have two QHD (1440p) video layers, it will downgrade you to two 720p (!) layers after running the analysis – quite strange. Don’t worry though if your evil twin grabs your phone and runs the test anyway – you just have to uninstall and then reinstall KineMaster to get back to the original setting. I ran some quick tests with FHD 1080p layers and it worked fine, so just leave everything as is. Since the phone can shoot in UHD/4K resolution you might ask if you can edit this footage on the device. While you can’t edit 4K in KineMaster on the Mi A1 at all (when trying to import 4K footage the app will offer to import a transcoded QHD version of the clip to work with), you can import and work with UHD/4K in PowerDirector, but only as a single video track – layers are not possible.
So let’s wrap this up: Xiaomi’s first internationally available phone is a great budget option for mobile photographers, but the video recording department is let down by a couple of things, which makes other options in this price range more appealing to the smartphone videographer if advanced manual controls and certain pro apps are of importance. As I pointed out though, it’s not all bad: It’s still hard to find a phone at that price that offers UHD/4K video recording – and the footage even looks pretty good in decent lighting conditions. So if you happen to have a Mi A1, there’s no reason at all not to create cool video content with it – and if you pull off a nice video package you can be even prouder than someone with a flagship phone! 😉
Back in 2016 Google made an iOS-exclusive app (weird, ain’t it?!) called Motion Stills. It focused on working with Apple’s newly introduced ‘Live Photos’ for the iPhone 6s. When you shoot a ‘Live Photo’, 1.5 seconds of video (with a low frame rate, mind you) and audio before and after pressing the shutter button are recorded. You can think of it as a GIF with sound. What Motion Stills does is let you record, stabilize, loop, speed up and/or combine ‘Live Photos’. In 2017, Google finally brought the app to Android. Now while some Android phone makers have introduced ‘Live Photo’-like equivalents, there’s no general Android equivalent as such yet and because of that the app works slightly differently on Android. Instead of ‘Live Photos’ you can shoot video clips with a maximum duration of 3 seconds (this also goes for pre-6s iPhones on iOS). There are also other shooting modes (Fast Forward, AR Mode) that are not limited to the 3 seconds but for this post I want to concentrate on the main mode, Motion Still.
When I first looked at the app, I didn’t really find it very useful. Recording 3-second-clips in a weird vertical format of 1080×1440 (720×960 on iOS)? A revamped Vine without the attached community? Some days later however I realized that Motion Stills actually could be an interesting and easy-to-use visual micro-storytelling tool, especially for teaching core elements of visual storytelling. The main reasons why I think it’s useful are:
a) it’s a single app for both shooting and editing (and it’s free!)
b) the process of adding clips to a storyboard is super-easy and intuitive and
c) being forced to shoot only a maximum of 3 seconds lets you concentrate on the essentials of a shot
So here’s a quick run-through of a possible scenario of how one might use the app for a quick story or, say, a story teaser: When covering a certain topic / location / object etc. you take a bunch of different 3-second shots with Motion Stills (wide shot, close-up, detail etc. – 5-shot rule anyone?) by pressing the record button. It might be good to include some sort of motion in at least some shots, either by shooting something where you already have motion because people or objects are moving, or by moving the smartphone camera itself (“dolly” shot, pan, tilt) when there is no intrinsic motion. Otherwise it might look a little too much like a stills slide show. Don’t worry too much about stabilization because Motion Stills automatically applies a stabilization effect afterwards and even without that, you might just be able to pull off a fairly stable shot for three seconds. After you have taken a bunch of shots, head over to the app’s internal gallery (bottom left corner on Android, swipe up on iOS) where all your recordings are saved and browse through the clips (they auto-play). If you tap a clip you can edit it in a couple of ways: You can turn off stabilization, mute the clip, apply a back-and-forth loop effect or speed it up. On iOS, you can also apply a motion tracking title (I hope the Android version will get this feature soon as well!). What you can’t do is trim the clip. But you actually don’t have to go into edit mode at all if you’re happy with your clips as they are – you can create your story right from the gallery. And here’s the cool thing about that: Evoking a shade of Tinder, you can quickly add a clip to your project storyboard (which will appear at the bottom) by swiping the clip to the right, or delete a clip from the gallery by swiping it to the left. If you want to rearrange clips in the storyboard, just long-press them and move them to the left or the right.
If you want to delete a clip from the storyboard, long-press and drag it towards the center of the screen, a remove option will appear. In a certain way Google’s Motion Stills could be compared to Apple’s really good and more feature-rich Apple Clips app when it comes to creating a micro-story on the go really fast with a single app – but Apple Clips is – of course – only available for iOS. When you are finished putting together your micro-story in Motion Stills, you can play it back by tapping the play button and save/share it by tapping the share button. Once you get the hang of it, this is truly fast and intuitive – you can assemble a series of shots in no time.
That being said, there are a couple of limitations and shortcomings that shouldn’t be swept under the rug. Obviously, thanks to the 3-second-limit per clip, the app isn’t really useful for interviewing people or any other kind of monologue/dialogue scenario. You might fit in some one liners or exclamations but that’s about it. It’s also a bit unfortunate that the app doesn’t apply some kind of automatic audio-transition between the clips. If you listen to the end result with the sound on, you will often notice rather unpleasant jumps/cracks in the audio at the edit points. While you could argue that because of the format content will only be used for social media purposes where people often just watch stuff without sound and will not care much about the audio anyway, I still think this should be an added feature. But let’s get back to the format: While you have the option to export as a GIF if you are only exporting one clip, the end result of a series of clips (which is the use case I’m focusing on here) is an mp4 (mov on iOS) video file with the rather awkward resolution of 1080 by 1440 (Android) or 720 by 960 (iOS) – a 3:4 aspect ratio. This means that it will only be useful for social media platforms but hey, why ‚only‘, isn’t social media everything these days?! Another thing that might be regarded as a shortcoming or not is the fact that (at least on Android) you are pretty much boxed in with the app. You can’t import stuff and clips also don’t auto-save to the OS’s general Gallery (you will have to export clips manually for that). But is that such a bad thing? I don’t think so because a good part of the fun is doing everything with a single app: shooting, editing, exporting/publishing. So let’s finish this with an actual shortcoming: While the app is available for Android, it’s not compatible with certain devices – mostly low-end devices / mid-rangers with rather weak chipsets. 
And even if you can install it, some not-so-powerful devices like the Nokia 5 or Honor 6A (both rocking a Snapdragon 430) tend to struggle with the app when performing certain tasks. This doesn’t mean the app always runs 100% stable on flagships either – I also ran into the occasional glitch while using it on a Samsung S7 and an iPhone 6. Still, the app is free, so at least check it out – it can really be a lot of fun and useful for doing/learning visual (micro) storytelling! Download it on Google Play (Android devices) or the Apple App Store (Apple devices).
P.S.: Note that you can only work on one project at a time and don’t clear the app from your app cache before finishing/exporting it – otherwise the project (not the recorded clips) will be lost!
P.P.S.: Turn off the watermark in the settings!
One of the first steps when getting more serious about producing video content with a smartphone is to look at the more advanced video recording apps from 3rd party developers. Popular favorites like “FilmicPro” (available for both Android and iOS) usually offer way more image composition controls, recording options and helpful pro features of the kind you find on dedicated video cameras than the native stock camera app provided by the maker of the smartphone. While quite a few stock camera apps now actually have fairly advanced manual controls for shooting photos (the ability to set ISO and shutter speed might be the most prominent example), the video mode unfortunately and frustratingly is still almost always neglected, leaving the eager user with a bare minimum of controls and options. In 2015 however, LG introduced a game changer in this regard: the V10. For the first time in smartphone history, a phone maker (also) focused on a full-featured video recording mode: it included among other things the ability to set ISO and shutter speed, lock exposure, pull focus seamlessly, check audio levels via an audio level meter, adjust audio gain, set microphone directionality, use external microphones, alter the bitrate etc. etc. Sure, for certain users there were still some things missing that you could find in 3rd party apps, like the option to change the frame rate to 25fps if you’re delivering for a PAL broadcast, but that’s only for a very specific use case – in general, this move by LG was groundbreaking and a bold and important statement for video production on a smartphone. But what about other phone makers? How good are their native camera apps when it comes to advanced options and controls for recording video? Can they compete with dedicated 3rd party apps?
First off, let me tell you why in most cases, you DO want to have a 3rd party app for recording video (at least if you have an Android phone): external microphones. With the exception of LG, Samsung (and I’m told OnePlus) in their recent flagship lines (plus Apple in general), no stock camera app I have come across supports the use of external microphones when recording video. Having good audio in a video is really important in most cases and external microphones (connected via headphone jack, microUSB, USB-C or Lightning connector) can be a big help in achieving that goal.
So why would you use a stock camera app over a dedicated 3rd party app at all? Familiarity. I guess many of us use the native camera app of a smartphone when snapping casual, everyday photos and maybe also videos in non-professional situations. So why not build on that familiarity? Simplicity. The default UI of most native camera apps is pretty straightforward and simple. Some might prefer this to the more complex UI featured in more advanced 3rd party apps. Affordability. You don’t have to spend a single extra penny. I’m generally an avid advocate of supporting excellent 3rd party app developers by paying for their apps but others might not want to invest. The most important reason in my opinion however is: Stability/Reliability. This might not be true for every stock camera app on every phone (I think especially owners of Sony phones and lately the Essential Phone could beg to differ) but because the app was developed by the maker of the phone and is usually less complex than 3rd party apps, chances are good that it will run more stably and be less prone to (compatibility) bugs, especially when you consider the plethora of Android devices out there. The V10’s stock camera app, despite being rather complex, is rock-solid and hasn’t crashed on me once in almost 2 years now.
Over the last months I have taken a closer look at a whole lot of stock camera apps on smartphones from LG, Samsung, Apple, Huawei, Sony, Motorola/Lenovo, Nokia (both their older Windows Phone / Windows Mobile offerings AND their new Android handsets), HTC, Nextbit, BQ, Wiko and Google/Nexus. It goes without saying that I wasn’t able to inspect the stock camera apps on all the different phone models of a manufacturer. This is important to mention because some phone makers give their flagship models a more advanced camera app than their budget devices while others offer the same native camera app across all (or at least most) of their device portfolio. Also, features might be added on newer models. So keep in mind, all I want to do is give a rough overview from my perspective and offer some thoughts on which phone makers are paying more attention to pro features in the video recording department.
The lowest common denominator for recording video in a stock camera app on a smartphone at the moment is that you will have a button to start recording in full-auto mode with a resolution of 1920×1080 (1080p) (1280×720 on some entry-level or older devices) at a frame rate of 30fps. "Full-auto" basically means that exposure, focus and white balance (color temperature) will be set and adjusted automatically by the app depending on the situation and the algorithm / image processing routine. While this might sound convenient and like a good idea in general to get things done without much hassle, the auto mode will not always produce the desired results because it’s not "smart" enough to judge what’s important for you in the shot and therefore doesn’t always get exposure, focus and/or white balance right. It might also change these parameters while recording when you don’t want it to, for instance when you are panning the camera. Therefore one of the crucial features for getting more control over the image is the ability to adjust and lock exposure, focus and white balance, because if these parameters shift (too wildly/abruptly/randomly) while recording, it makes the video look amateurish. So let’s have a look at a couple of stock camera apps.
I’ve been spending quite some time over the last months doing research on what device could qualify as the cheapest budget Android phone that still has certain relevant pro specs for doing mobile video. While it might be up for discussion which specs are the most important (depending on who you ask), I have defined the following for my purposes: 1) a decent camera that can record at least in FHD/1080p resolution, 2) proper Camera2 API support to run pro camera apps with manual controls like Filmic Pro (check out my last post about what Camera2 API is), 3) a chipset powerful enough to allow the use of video layers in pro video editing apps like KineMaster and PowerDirector, 4) support for external microphones (preferably featuring a headphone jack as long as there are no good all-wireless solutions available).
The greatest obstacle in this turned out to be No. 2 on the list, proper Camera2 API support. Apart from Google’s (abandoned?) Nexus line which also includes a budget option with the Nexus 5X (currently retailing for around 250€), phone makers (so far) have only equipped their flagship phones with adequate Camera2 API support (meaning the hardware support level is either ‘Full’ or ‘Level 3’) while mid-range and entry-level devices are left behind.
Recently, I happened to come across a rather exotic Android phone, the Nextbit Robin. The Nextbit Robin is a crowdfunded phone that came out last year. Its most notable special feature was the included 100GB of cloud storage on top of the 32GB of internal storage. While the crowdfunding campaign itself was successful and the phone was actually released, regular sales apparently have been somewhat underwhelming as the phone’s price has dropped significantly. Originally selling for a mid-range price of $399, it can now be snagged for around 150€ online (Amazon US even has it for $129). As far as I know, it is now the cheapest Android device that checks all the aforementioned boxes regarding pro video features, INCLUDING full Camera2 API support! Sure, it has some shortcomings like mediocre battery life (the battery is also non-replaceable – but that’s unfortunately all too common these days) and the lack of a microSD storage option (which would have been more useful than the cloud thing). It also gets warm relatively quickly and it’s not the most rugged phone out there. But it does have a lot going for it otherwise: The camera appears to be reasonably good (of course not in the same league as the ones in Samsung’s or LG’s latest flagships) and it even records video in UHD/4K – though it’s no low-light champion. The Robin’s chipset is the Snapdragon 808 which has aged a bit but, in combination with 3GB of RAM, is still a quite capable representative of Qualcomm’s 800-series and powerful enough to handle FHD video layers in editing apps like KineMaster and PowerDirector – essential if you want to do any kind of a/b-roll editing on your video project. It also features a 3.5mm headphone jack which makes it easy to use external microphones when recording video with apps that support external mics.
The most surprising thing however is that Nextbit implemented full Camera2 API support in its version of Android which means it can run Filmic Pro (quite well, too, from what I can tell so far!) and other advanced video recording apps like Lumio Cam and Cinema 4K with full manual controls like focus, shutter speed & ISO. One more thing: The Robin’s Android version is pretty much as up-to-date as it gets: While it has Android 6 Marshmallow out of the box, you can upgrade to 7.1.1 Nougat (the latest version is 7.1.2).
So should you buy it? If you don’t mind shelling out big bucks for one of the latest Android flagship phones and you really want the best camera and fastest chipset currently available, then maybe not. But if you are looking for an incredible deal that gives you a phone with a solid camera and a whole bunch of pro video specs at a super-low price, then look no further – you won’t find that kind of package for less at the moment.
This blog post is trying to shed some light into one of Android’s fragmentation corners – one that’s mainly relevant for people interested in more advanced photography and videography apps to take manual control over their image composition.
First off, I have to say that I’m not a coder / software expert at all so this comes from a layman’s point of view and I will – for obvious reasons – not dig too deep into the more technical aspects underneath the surface.
Now, what is an API? API stands for "application programming interface". An operating system uses APIs to give (third party) developers tools and access to certain parts of the system to use for their applications. In reverse, this means that the maker of the operating system can also restrict access to certain parts of the system. To quote from Wikipedia: "In general terms, it is a set of clearly defined methods of communication between various software components. A good API makes it easier to develop a computer program by providing all the building blocks, which are then put together by the programmer." Now you know.
Up to version 4.4 (KitKat) of Android, the standard API to access the camera functionality embedded in the OS was very limited. With version 5 (Lollipop), Google introduced the so-called Camera2 API to give camera app developers better access to more advanced controls of the camera, like manual exposure (ISO, shutter speed), focus, RAW capture etc. While the phone makers themselves are not necessarily fully dependent on Google’s new API, because they can customize their own version of the Android OS, third party app developers are to a large extent – they can only work with the tools they are given.
So does every Android device running Lollipop have the new Camera2 API? Yes and no. While Camera2 API has been the standard camera API since Android Lollipop, there are different levels of implementation of this API which vary between different phone makers and devices. There are four different levels of Camera2 implementation: Legacy, Limited, Full and Level 3. 'Legacy' means that only the features from the old Camera1 API are available, 'Limited' means that some features of the new API are available, 'Full' means that all basic new features of Camera2 are available and 'Level 3' adds some bonus features like RAW capture on top of that.
Depending on the level of implementation, you can use those features in advanced image capturing apps – or not. An app like Filmic Pro can only be installed if the Camera2 support level is at least ‚Full‘ – otherwise you can only install the less feature-packed Filmic Plus. Lumio Cam on the other hand can be installed on most devices but you can only activate the pro mode with manual exposure and focus if the support level is at least ‚Full‘ again. So if you’re interested in using advanced third party apps for capturing photos or recording video with manual exposure controls etc. you want to have a device that at least has ‚Full‘ Camera2 API support.
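The gating logic described above boils down to a simple ordered comparison. Here's a small Python sketch of it (the function and list names are my own for illustration, not from any real app – real Android apps query the hardware level via the Camera2 API itself):

```python
# The four Camera2 hardware support levels, ordered from least to most capable.
CAMERA2_LEVELS = ["legacy", "limited", "full", "level_3"]

def meets_requirement(device_level, required_level):
    """Return True if the device's Camera2 support level is at least
    as capable as the minimum level an app requires."""
    return CAMERA2_LEVELS.index(device_level) >= CAMERA2_LEVELS.index(required_level)

# Filmic Pro and Lumio Cam's pro mode need at least 'full':
print(meets_requirement("level_3", "full"))   # True  -- flagship with bonus features
print(meets_requirement("limited", "full"))   # False -- typical mid-ranger
```

So a 'Level 3' phone can run everything a 'Full' phone can, plus the RAW-capture extras.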
But what devices have 'Full' Camera2 support? Currently there are two main categories: Google hardware (phones) and (many/most) flagship phones that were released after Android Lollipop came out. Actually, it seems that the latter really only got going with Android 6 Marshmallow (I guess phone makers needed some time to figure out what this was all about ;)) It doesn’t come as a surprise that Google gives their own devices full support (Nexus & Pixel lines). That means even an almost ancient, pre-Lollipop device like the original Nexus 5 has received full support in the meantime (via OS update). Of course all Nexus phones after that (Nexus 6, Nexus 5X, Nexus 6P) are included and, it goes without saying, Google’s Pixel phones as well.
Now let’s head over to other smartphone manufacturers (so-called OEMs, Original Equipment Manufacturers) like Samsung, LG, HTC, Huawei, Sony, Lenovo/Motorola, OnePlus etc. Many of them offer at least the crucial 'Full' support level on their flagships that came out with Android 6 Marshmallow installed, some already on the ones that came out with Android 5 Lollipop: Samsung with its S-series (S6, S6 Edge, S6 Edge Plus via update, S7, S7 Edge etc.), LG with its G-series (starting with the G4) and V-series (starting with the V10), HTC (starting with the HTC 10), Lenovo/Motorola (starting with the Moto Z), OnePlus (starting with the OnePlus 3/3T), and Sony (starting with the Xperia Z5 via update as far as I know). Sony however is a special case: Their Xperia series has been blacklisted by the developers of FilmicPro/Plus because of major issues that occurred with their devices – you can’t install their apps on a Sony phone at the moment. On the other hand, there are also a few major smartphone OEMs that have yet to offer full Camera2 support for their flagships, the most prominent black sheep being Huawei with its P & Mate series – even the brand new Huawei P10 with all its camera prowess has only limited support. The same goes – unsurprisingly – for Huawei’s budget brand Honor. Other OEMs that don’t offer full Camera2 support in their flagships include Asus (Zenfone 3) and Blackberry (KeyOne). Let’s hope that they will soon add this support and let’s also hope that proper support trickles down to the mid-range and maybe even entry-level phones of the Android universe.
Are you curious what Camera2 support level your phone has? You can use free apps from the Google Play Store like Camera2 Probe to test the level of Camera2 implementation on your device.
You can also find a (naturally incomplete) list of Android devices and their level of Camera2 API support here, created and maintained by the developer of the app "Camera2 Probe":
If you have a device that is not listed, you can help expand the list by sending your device’s results (no personal data though) to the developer (there’s a special button at the bottom of the app).
For more in-depth information about Camera2 API, check out these sources:
Cameras that can produce spherical 360 video are becoming more affordable and widespread these days, slowly making their way into the mainstream. The recently released Android-smartphone-specific Insta360 Air clip-on camera has joined a bunch of other entry-level 360 cams like the Ricoh Theta S, the LG 360 Cam and Samsung’s Gear 360 to make this exciting new world of immersive visuals available to the crowd, while more avant-garde 360 aficionados are getting their fix with a GoPro Omni rig or Nokia’s 40,000€ Ozo. High-end 360 video solutions are still meant to be post-produced on a desktop machine but the consumer variants are closely tied to mobile devices already. The Insta360 Air connects to the microUSB or USB-C port of an Android phone and records the footage directly to the device. The other three aforementioned entry-level 360 cams can – unlike the Insta360 Air – also be used as standalone cameras without a (physical) connection to the phone but they all have companion apps that will help you get the best shooting experience and control via a wireless connection. Furthermore, they make it very easy to transfer the footage directly from the camera to the phone for instant sharing or editing. YouTube and Facebook are the two big social networks that already support interactive 360 videos natively; Vimeo has recently added this feature as well. But before sharing, it’s very likely you’ll want to perform some edits on your footage or combine a couple of clips to tell a story. This brings us to the topic of how you can edit 360 video directly on your Android device.
Oh wait! Just hold your horses for a second! Before actually tackling the editing options I think it’s helpful to address two subjects first to better understand the idiosyncrasies of dealing with 360 video: stitching and metadata.
(Consumer) Camera technology is not (yet) at the point – at least as far as my knowledge goes – where you can record a spherical 360 image with only a single lens. To achieve a spherical 360 image, at least two lenses are used. These two lenses will give you two images which can be stored in a single file or in different files. Either way, to get one single image ready for spherical display (in the so-called “equirectangular” format) the two images need to be “stitched” together. The stitching can be done automatically by a software algorithm or it can be done manually in a specific editing program. When using a consumer 360 camera you will not have to bother with manual stitching as long as you transfer the files to the camera’s companion app which does the stitching for you automatically. You will only encounter “raw” un-stitched files if you pull the recorded files directly off the SD card without transferring them to the app first for stitching. Here are two screen grabs, one is un-stitched footage from the Gear 360, the other stitched footage from the Insta360 Air.
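In case you wonder what "equirectangular" actually means geometrically: the format simply unwraps the full sphere onto a 2:1 rectangle, with longitude running along the width and latitude along the height. A small illustrative Python sketch of that mapping (the function name is my own):

```python
def equirect_to_angles(x, y, width, height):
    """Map a pixel position in an equirectangular frame to spherical angles.
    Longitude runs -180..180 degrees across the width,
    latitude 90 (zenith) down to -90 (nadir) along the height."""
    lon = (x / width) * 360.0 - 180.0
    lat = 90.0 - (y / height) * 180.0
    return lon, lat

# The center of a 3840x1920 Gear 360 frame is straight ahead (0, 0) ...
print(equirect_to_angles(1920, 960, 3840, 1920))  # (0.0, 0.0)
# ... while the whole top edge collapses into the point directly above the camera.
print(equirect_to_angles(1920, 0, 3840, 1920))    # (0.0, 90.0)
```

That top/bottom edge behavior is also why any letterboxing at the top and bottom of the frame ends up as circles at the poles in the 360 player.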
Important: Only stitched footage in equirectangular format will be displayed correctly as an interactive 360 video when you upload the file to YouTube or Facebook.
The other important thing for having the video displayed correctly as an interactive 360 video in a dedicated player is metadata. It’s data embedded in the video file that will "tell" the player that the file is a 360 video. I’ve used the term "interactive" repeatedly – what do I mean by it? It means that you can interactively change the perspective in the video, either by dragging your finger around the screen or by panning/tilting your device (making use of the phone’s gyroscope). If there’s no metadata in the file, the player will just display a flat, equirectangular video that you can’t interact with. And hallelujah, this finally sends us off to our actual topic – editing 360 video on Android – because, depending on what editing app you choose, the exported video either still has the 360 metadata in it – or not (in which case it will have to be re-injected).
So there are basically three options to edit and produce 360 video on your Android device:
1) the camera’s companion app
2) a dedicated 360 video editing app
3) a regular video editing app + an app that re-injects metadata
When should you use which option?
Option 1: You only want to trim the beginning/end of a single clip and/or add a filter. You don’t want to mess around with re-injecting metadata.
Option 2: You want to build a story with multiple clips. You want more editing options & features like changing the default viewing angle or speed, or adding music/audio. You want to keep the metadata in the file.
Option 3: You want to build a story with multiple clips. You want a timeline environment, not a storyboard. You want the full feature set of your regular Android video editing app including precise placement/length of titles, music, voice-overs, graphics, transitions etc. You want to work on multiple projects at the same time. You don’t mind losing a bit of vertical resolution. You don’t mind the “Black Hole Sun” syndrome. You don’t mind not having an ‘interactive’ preview. You don’t mind re-injecting metadata.
Using a 360 camera’s companion app
If you use the editing options of a 360 camera’s companion app, you will only be able to perform extremely basic edits when the end product should be an interactive 360 video. For instance, the companion app for the Insta360 Air only lets you add a filter from their selection, like black & white or some other Instagram-inspired ones. You can’t even trim the beginning or the end of the clip, which definitely would come in handy if you don’t intend to be in the shot. Unlike with the Insta360 Air app, you can do this kind of top & tail trimming in Samsung’s Gear 360 Manager app and Ricoh’s Theta+ Video. The latter also lets you add a filter and music before exporting. I can’t really say anything about the companion app for the LG 360 Cam as I neither have one nor know somebody who owns it. But I very much assume that it won’t go beyond the features discussed here. Btw, if you want to share to a network that does not (yet) natively support 360 video (like Twitter, Instagram, WhatsApp or Snapchat) you might want to transform the video into a “Tiny Planet” or “Magic Ball” format, which (most) companion apps let you do. But as this blog post is about ‘real’ interactive 360 video, I won’t go further into details here. The same goes for desktop editing software that is provided by the camera companies (like Insta360Studio or the Gear 360 Action Director) because we are focusing on mobile-only solutions.
Using a dedicated 360 video editing app
While Android users are often served less generously or belatedly with certain high-profile apps compared to Apple’s iOS users, they can actually be trailblazers when it comes to mobile 360 video editing! There are already two genuine 360 video editing apps in the Google Play Store (not a single one for iOS yet): Collect and V360. Both of them are still in beta (update: V360 has been officially released in the meantime) and relatively basic when compared to more advanced “regular” video editing apps but they cover the basics pretty well and appear generally very promising at this early stage. The most important thing is that – unlike the companion apps – they actually let you build a story out of multiple clips. When compared to each other, Collect comes off as the more advanced and visually slightly slicker app with a couple more features but a minor drawback in the exporting process.
But let’s talk about V360 first; its plain simplicity may even make it a better choice for your very first 360 video edit. Upon firing up the app you can either multi-select a couple of 360 video clips or just select one and add other clips later. One very helpful thing is that there’s a slider button that, when activated, shows only 360 video clips, not your whole camera roll. When you’re done you’ll get to the storyboard (storyboard means each clip thumbnail has the same size no matter how long or short the video clip is). By swiping your finger around the preview area or moving your phone around you can explore the different parts of the 360 video. If you want to edit a clip you just tap on the pen icon below the storyboard: You can trim (top & tail), delete or duplicate the clip. There’s also an option to sort the clips (newest/oldest first) but that didn’t really work for me. If you want to rearrange the order of the clips in your story you long-press a clip and then drag it to its new place in the storyboard. You also have the chance to add music or another audio clip to the storyboard. Keep in mind though that this audio will play through the whole video; you cannot have it come in or go out at a certain point. There is however an option to adjust the music volume for the whole video. By tapping on the speaker icon you can change the volume relation between the sound from the video and the audio clip in three steps. Upon export you will find that fortunately the resolution is the same as your source material and the metadata is still in place, but also – a bit less enthusiastically – that a V360-branded outro has been added. Hopefully they will give you the chance to disable it with a future update. If you’re longing for something slightly more advanced then you should check out Collect. After selecting your clips you will find that the idea of circularity is a clever UI theme for a 360 video editing app.
The thumbnails of the storyboard are circular and the preview window has a circular mask to help you imagine what the point-of-view will be like for the viewer when watching the video in VR mode with goggles. If you tap on one of the clips and enter the edit screen you will also find that the trim handles for the clip are built into a circle. Btw don’t worry about the trim handles already having been moved without you doing anything – when adding the clips to the storyboard the app does sort of a quick “auto-edit” but all of it is reversible. However I’d prefer to have this as an option to enable rather than a default setting. While letting you add some audio to the story (but just like V360 it plays through the whole video), Collect has a couple of more features up its sleeve: You can add a color filter, change the speed (slow, normal, fast) of the video and – that could be the most important thing – change the default viewing angle so viewers initially look into the direction you want them to look when a new clip starts. If you don’t know what it’s for I assume it can be a bit confusing for beginners though. Another nice feature is the ability to add a custom watermark (a square PNG image with maximum size of 1024×1024, transparency is supported) at the bottom of the image. While I am hoping that future updates will add a few more features like a basic title tool or the ability to switch to a timeline mode which gives you more control over the placement of audio tracks, the biggest flaw of this really cool app at the moment is that the resolution of the exported file is always 3840×2160. If you’re working with Gear 360 footage (which has a maximum resolution of 3840×1920), things are fine but if you use footage from another camera with lower resolution like the Insta360 Air that has a maximum video resolution of 2560×1280 on most phones, the image will get softer because of the upscaling. 
It would be good to have the option to keep the source material’s original resolution when exporting. Like V360 the app preserves the metadata upon export. One more thing: It’s very cool that they integrated an in-app messenger-like service for giving feedback to the developer team. So speak your mind if you have suggestions!
One thing that both Collect and V360 are lacking is the ability to save/manage multiple projects at the same time. Right now, you have to finish one project before starting a new one. And while you can’t work on different projects at the same time in either of these apps, V360 does save your current project even if you leave the app or eliminate it from the background tasks. Collect on the other hand does save your project as long as you keep the app running in the background, if you clear out the background apps your project will be lost! This is definitely something that both apps (especially Collect) should improve upon.
Using a regular video editing app
The ability to save multiple projects and go back to them for adjustments later is (currently) one of the big advantages of using a ‘regular’ Android video editing app for 360 video. Also, if you want to use titles, place audio files including voice-overs at certain points, add transitions or just generally have the full feature set of a more advanced mobile editing app at hand, this is the better choice – it’s a slightly different workflow though and there are some caveats as well. By far the two best video editing apps on Android are KineMaster and PowerDirector, so I will only talk about these two champions here although you might also be able to use another video editing app. While PowerDirector already supports 4K/UHD footage on powerful enough devices, KineMaster has just released a beta version that includes 4K/UHD footage support as well (again, the device – or more precisely its chipset – needs to be powerful enough to handle it) but the official release version is (for now) limited to FHD. While 4K/UHD still hasn’t exactly penetrated the mainstream as the standard resolution for ‘regular’ video, it’s a crucial point in the 360 video world: spread over a vastly larger area than in a regular non-360 image, an FHD resolution only looks like SD at best. So if you want something that at least comes a bit closer to an HD (720p) feeling, 4K/UHD footage is needed. You also have to consider that the most common aspect ratio of 360 video is not 16:9 but 2:1 (or 18:9) so you will lose a bit of vertical resolution. Let’s have a look at what kind of footage you can import into KineMaster and PowerDirector (please note that less powerful devices may not support the highest resolutions).
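To put some numbers behind the "FHD looks like SD" claim, here's a back-of-the-envelope Python calculation. It assumes the viewer sees roughly a 90° horizontal slice of the 360° sphere at any moment (that 90° figure is my own assumption – actual player fields of view vary):

```python
def visible_pixels(frame_width, fov_degrees=90):
    """Approximate horizontal pixel count actually visible in a viewport
    that shows fov_degrees out of a full 360-degree equirectangular frame."""
    return frame_width * fov_degrees / 360.0

print(visible_pixels(1920))  # 480.0 -- FHD 360 footage: SD territory
print(visible_pixels(3840))  # 960.0 -- UHD 360 footage: still only near-720p
```

So even 4K/UHD source material only gets you into rough 720p territory for what the viewer actually sees.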
KineMaster currently supports a maximum resolution of 1920×1080 (FHD; 4K/UHD support is in the pipeline as mentioned before) and a maximum frame rate of 30fps. This means you can import footage from the Ricoh Theta S (1920×1080, 30fps) in full but you will have to go for lower resolutions and some pixel loss with footage from the Insta360 Air (2560×1280 does not work, only 1920×960, both 30fps) and Gear 360 (3840×1920 and other lower resolutions don’t work, only 1920×960, 30fps). The video will appear in and export from KineMaster in a common FHD resolution of 1920×1080 (having to fill the vertical resolution from 960 to 1080) so there will be some black letter-boxing which eventually results in what I like to call the “Black Hole Sun” (of course paying homage to a certain tune …) syndrome when viewing it as an interactive 360 video: a small black circle at the top and bottom of the image. You can watch the sample video here (make sure to watch it in highest possible resolution). A quick warning for those usually producing PAL video content with a frame rate of 25fps (which KineMaster allows): Since the footage on these cameras can only be captured with 30fps, set the export frame rate in KineMaster‘s settings to 30 as well for the best result – and it’s the more ‘natural’ standard for platforms like YouTube and Facebook.
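The "Black Hole Sun" effect can actually be quantified. When 1920×960 footage is padded to 1920×1080, each black bar is 60 pixels tall, and since the frame height spans 180° of latitude, a rough Python estimate of the cap size looks like this (a simplification – the exact appearance depends on the player's projection):

```python
def polar_cap_degrees(content_height, frame_height):
    """Approximate angular radius of the black polar cap that appears when
    2:1 equirectangular footage is letterboxed into a taller frame.
    The full frame height maps to 180 degrees of latitude."""
    bar_height = (frame_height - content_height) / 2
    return bar_height / frame_height * 180.0

print(polar_cap_degrees(960, 1080))   # 10.0 -- 1920x960 footage in an FHD frame
print(polar_cap_degrees(1920, 2160))  # 10.0 -- Gear 360 footage in a UHD frame
```

Interestingly the cap comes out the same (roughly a 10° radius around each pole) in both the FHD and the UHD case, since the proportions of the letterboxing are identical.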
If you are running PowerDirector on a device that supports 4K/UHD editing you can import Insta360 Air footage shot in 2560×1280 (30fps) but have to decide whether you want to export it downscaled to 1920×1080 (FHD) or upscaled to 3840×2160 (UHD). You can check my two sample videos (FHD & 4K, make sure to watch them in the highest possible resolution) to decide which option you like better quality-wise. The ability to import 4K/UHD footage in PowerDirector also lets you use Gear 360 footage at maximum resolution (3840×1920, 30fps) but as the regular UHD format is 3840×2160, your video will also suffer from the “Black Hole Sun” syndrome.
But let’s move on to the actual editing process using either PowerDirector or KineMaster. One thing that makes imagining the final product a bit more difficult than when using a dedicated 360 video editing app like Collect or V360 is the fact that the preview window will not display an interactive image that you can explore by swiping your finger on the screen or moving the device around like you would with the finished product in a 360 video player – all you see is the flat equirectangular image. So be ready for some trial & error to find out how certain edits or the addition of titles/graphics will actually look in the end! That being said, having a precise timeline layout instead of a simple storyboard plus the full feature set of those two advanced mobile video editing apps will give you a lot more freedom and control to create the video your way. You can record voice-overs or add music tracks and place them at specific points, you can add titles (they actually work surprisingly well in a 360 environment, just pay attention to where you place them and don’t make them too big or they will be very hard to read!) and graphics and exactly define their length, size & style, you can apply transitions instead of plain cuts etc.
So you have created a super-sophisticated 360 masterpiece and joyfully sung Soundgarden’s “Black Hole Sun” the whole time – now you can just upload the video to YouTube or Facebook and get showered with Likes and Thumbs-Ups, right? Er … no. Because we’re pretty much coming full circle (absolutely no pun intended!) when I tell you you mustn’t forget about the metadata! After exporting your video from a regular Android video editing app, the metadata is gone and it needs to be re-injected so that the video player on YouTube or Facebook will actually display the video as an interactive 360 video and not in a flat equirectangular form. So there’s a problem but, alas, there’s also a fix: VRfix. This app is a one-trick pony and it will cost you a couple of bucks but you should be thankful that it exists because otherwise, there would be no happy ending for a mobile-only 360 video workflow when you have used PowerDirector or KineMaster to edit your video. After you have re-injected the 360 metadata into the video file with VRfix, you can finally upload the video to your 360 video platform of choice. If you want to know more about how VRfix works, check out their website.
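In case you're wondering what that re-injected metadata actually looks like: for YouTube and Facebook it follows Google's Spherical Video V1 spec, an XMP/RDF snippet embedded in the MP4 file. The Python sketch below only builds the payload string (tag names follow my reading of Google's spatial-media spec – actually writing it into the correct box of the MP4 container is the part that tools like VRfix handle for you):

```python
def spherical_xmp(stitching_software="unknown"):
    """Build the XMP/RDF fragment that marks an MP4 as a stitched,
    equirectangular 360 video (Spherical Video V1 metadata)."""
    return (
        '<rdf:SphericalVideo '
        'xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" '
        'xmlns:GSpherical="http://ns.google.com/videos/1.0/spherical/">'
        '<GSpherical:Spherical>true</GSpherical:Spherical>'
        '<GSpherical:Stitched>true</GSpherical:Stitched>'
        '<GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>'
        f'<GSpherical:StitchingSoftware>{stitching_software}</GSpherical:StitchingSoftware>'
        '</rdf:SphericalVideo>'
    )

xmp = spherical_xmp("KineMaster")
print("equirectangular" in xmp)  # True
```

It's really just a handful of flags telling the player "this is a stitched, equirectangular sphere" – which is why losing them on export flattens the video.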
Oh my, this is my first English language blog post here and it has become quite a monster despite the fact that I only wanted to cover some general basics. Well, well. I do hope you will find it useful in some way. Please feel free to drop questions and other feedback in the comments or hit me up on Twitter (@smartfilming). If you happen to find any mistakes or incorrect information in my article you’re also more than welcome to let me know about it. In that regard I want to finish by saying thanks to a couple of people I consulted during the process of writing this blog post: Pipo Serrano (@piposerrano), Paul Gailey (@paulgailey), Kai Rüsberg (@mojonalist), Sarah Jones (@VirtualSarahJ), Sarah Redohl (@SarahRedohl) and the 360 Rumors Blog (@360rumorsblog).
Update: an earlier version of this article said that KineMaster does not support 4K/UHD footage.
Update: an earlier version of this article said that projects can’t be saved when using Collect or V360.