As I pointed out in one of my very first blog posts here (in German), smartphone videography still comes with a whole bunch of limitations (although some of them are slowly but surely going away or have at least been mitigated). Yet one central aspect of the fascinating philosophy behind phoneography (that’s the term I now prefer for referring to content creation with smartphones in general) has always been one of “can do” instead of “can’t do” despite the shortcomings. The spirit of overcoming obvious obstacles, going the extra mile to get something done, trailblazing new forms of storytelling despite not having all the bells and whistles of a whole multi-device or multi-person production environment seems to be a key factor. With this in mind I always found it a bit irritating and slightly “treacherous” to this philosophy when people proclaimed that video editing apps without the ability to have a second video track in the editing timeline are not suitable for storytelling. “YOU HAVE TO HAVE A VIDEO EDITOR WITH AT LEAST TWO VIDEO TRACKS!” Bam! If you are just starting out creating your first videos you might easily be discouraged if you hear such a statement from a seasoned video producer. Now let me just make one thing clear before digging a little deeper: I’m not saying having two (or multiple) video tracks in a video editing app as opposed to just one isn’t useful. It most definitely is. It enables you to do things you can’t or can’t easily do otherwise. However, and I can’t stress this enough, it is by no means a prerequisite for phoneography storytelling – in my very humble opinion, that is.
I can see why someone would support the idea of two video tracks being a must for creating certain types of videography work. For instance, it could be based on the traditional concept of a news report or documentary featuring one or more persons talking (most often as part of an interview), where you don’t want the talking person to occupy the frame the whole time but still keep the statement going. This can help in many ways: On a very basic level, it works as a means of visual variety to reduce the amount of “talking heads” air time. It can also help to cover up some unwanted visual distraction, like when another person stops to look at the interviewee or the camera. But it can also exemplify something that the person is talking about, creating a meaningful connection. If you are interviewing the director of a theater play who talks about the upcoming premiere, you could insert a short clip showing the theater building from the outside, a clip of a poster announcing the premiere or a clip of actors playing a scene during the rehearsal while the director is still talking. The way you do it is by adding the so-called “b-roll” clip as a layer on top of the primary clip in the timeline of the editing app (usually muting the audio of the b-roll or at least reducing its volume). Without a second video track it can be difficult or even impossible to pull off this mix of video from one clip with the audio from another. But let’s stop here for a moment: Is this really the ONLY legitimate way to tell a story? Sure, as I just pointed out, it does have merit and can be a helpful tool – but I strongly believe that it’s also possible to tell a good story without this “trick” – and therefore without the need for a second video track. Here are some ideas:
Most of us have probably come across the strange acronym WYSIWYG: “What you see is what you get” – it’s a concept from software UI design meaning that the preview you get in a (text/website/CMS) editor closely resembles how things will actually look once published. If you mark a word as bold in the editor and it instantly appears bold, that’s WYSIWYG. If you have to punch in code like <b>bold</b> into your text editing interface to make the published end result bold, that’s not WYSIWYG. So I dare to steal this bizarre acronym in a slightly altered version and context: WYSIWYH – “What you see is what you hear” – meaning that your video clips always keep their original sound. So in the case of an interview as described before, using a video editing app with only one video track, you would either present the interview in one piece (if it’s not very long) or cut it into smaller chunks with “b-roll” footage in between rather than overlaid (if you don’t want the questions included). Sure, it will look or feel a bit different, not “traditional”, but is that bad? Can’t it still be a good video story? One fairly technical problem we might encounter here is getting smooth audio transitions between clips when the audio levels of the two clips are very different. Video editing apps usually don’t have audio-only cross-fades (WHY is that, I ask!), and a cross-fade involving both audio AND video might not be the transition of choice since most of the time you want to use a plain cut. There are ways to work around this, however, or you can just accept it as a stylistic choice for this way of storytelling.
Another very interesting approach that results in a much easier edit without the need for a second video track (or even much editing at all) but requires more pre-planning for the shoot is the one-shot method. In contrast to what many one-man-band video journalists do (using a tripod with a static camera), this means you need to be an active camera operator at the same time to catch different visual aspects of the scene. This probably also calls for some sort of stabilization solution like phone-internal OIS/EIS, a rig, a gimbal or at least a steady hand and some practice. Journalist Kai Rüsberg has been an advocate of this style and collected some good tips here (the blog post is in German but Google Translate should help you get the gist). As a matter of fact, there’s even a small selection of notable feature films created in such a (risky) manner, among them “Russian Ark” (2002) and “Victoria” (2015). One other thing we need to take into consideration: if any questions are being asked, the interviewer’s voice will be “on air”, so the audio should be good enough for this as well. I personally think that this style can be (if done right!) quite fascinating and more visually immersive than an edited package of static separate shots, but it poses some challenges and might not be suited for everybody and every job/situation. Still, doing something like that might just expand your storytelling capabilities by trying something different. A one-track video editing app will suffice to add some text, titles, narration, fade in/out etc.
A unique amalgam of the traditional multi-clip approach and the one-shot method is a technique I called “shediting” in an earlier blog post. It involves a feature that is present in many native and some 3rd party camera apps: By pausing the recording instead of stopping it in between shots, you can cram a whole bunch of different shots into a single clip. Just like the one-shot method, this can save you lots of time in the edit (sometimes things need to go really fast!) but requires more elaborate planning and comes with a certain risk. It also usually means that everything needs to be filmed within a very compact time frame and in one location/area because in most cases you can’t close the app or let the phone go to sleep without actually stopping the recording. Nonetheless, I find this to be an extremely underrated and widely unknown “hack” to piece together a package on the go! Do yourself a favor and try to tell a short video story that way!
A way to tackle rough audio transitions (or bad/challenging sound in general) while also creating a sense of continuity between clips is to use a voice-over narration in post-production. Most mobile editors offer this option directly within the app, and even if you happen to come across one that doesn’t (or one that, like Videoshop, hides it behind a paywall), you can easily record a voice-over in a separate audio recording app and import the audio into your video editor – although it’s a bit more of a hassle if you need to redo it when the timing isn’t quite right. One example could be splicing your interview into several clips in the timeline and adding “b-roll” footage with a voice-over in between. Of course you should see to it that the voice-over is somewhat meaningful and doesn’t just repeat information or give away the gist / key argument of an upcoming statement by the interviewee. You could, however, build/rephrase an actual question into the voice-over. Instead of having the original question “What challenges did you experience during the rehearsal process?” in the footage, you record a voice-over saying “During the rehearsal process director XY faced several challenges both on and off the stage…” for the insert clip, followed by the director’s answer to the question. It might also help in such a situation to let the voice-over begin at the end of the previous clip and flow into the subsequent one to cover up an obvious change in the ambient sound between the clips. Of course, depending on the footage, the story and the situation, this might not always work perfectly.
Finally, with more and more media content being consumed muted on smartphones “on the go” in public, one can also think about using text and titles as an important narrative tool, particularly if there’s no interview involved (of course a subtitled interview would also be just fine!). This only works, however, if your editing app has an adequate title tool – nothing too fancy, but at least covering the basics like control over fonts, size, position, color etc. (looking at you, iMovie for iOS!). Unlike adding a second video track, titles don’t tax the processor very much, so even ultra-budget phones will be able to handle them.
Now, do you still remember the second part of this article’s title, the one in parentheses? I have just gone to great lengths to explain why I think it’s not always necessary to use a video editing app with at least two video tracks to create a video story with your phone, so why would I now be saying that after all it doesn’t really matter that much anymore? Well, if you look back a whole bunch of years (say around 2013/2014) when the phoneography movement really started to gather momentum, the idea of having two video tracks in a video editing app was not only a theoretical question for app developers thinking about how advanced they WANTED their app to be. It was also very much a plain technical consideration, particularly on Android, where the processing power of devices ranged from quite weak to quite powerful. Processing multiple video streams in HD resolution simultaneously was no small feat at the time for a mobile processor – to a small degree this might even still be true today. This meant that not only was there a (very) limited selection of video editing apps able to handle more than one video track at the same time, but even when an app like KineMaster or PowerDirector generally supported the use of multiple video tracks, this feature was only available on certain devices, excluding phones and tablets with very basic processors that weren’t up to the task. This has very much changed over the last few years with SoCs (system-on-a-chip) becoming more and more powerful, at least when it comes to handling video footage in FHD 1080p resolution as opposed to UHD/4K! Sure, I bet there’s still a handful of (old) budget Android devices out there that can’t handle two tracks of HD video in an editing app, but for the most part, the ability to use at least two video tracks is not really tied to technical constraints anymore – if the app developers want their app to have multi-track editing, they should be able to integrate it.
And you can definitely see that an increasing number of video editing apps have (added) this feature – one that’s really good, cross-platform and free without watermark is VN which I wrote about in an earlier article.
So, despite having argued that two video tracks in an editing app are not an absolute prerequisite for producing a good video story on your phone, the fact that nowadays many apps and basically all devices support this feature very much reduces the potential conflict that could arise from such an opinion. I do hope however that the mindset of the phoneography movement continues to be one of “can do” instead of “can’t do”, exploring new ways of storytelling, not just producing traditional formats with new “non-traditional” devices.
As usual, feel free to drop a comment or get in touch on Twitter @smartfilming. If you like this blog, consider signing up for my Telegram channel t.me/smartfilming.
I’m a big fan of advanced mobile video editing apps like ‘KineMaster’ (Android & iOS) or ‘LumaFusion’ (iOS-only) and I’m very supportive of the idea that one should pay for such powerful media creation tools. However, there might be instances when it’s just not possible for one reason or another to do that. So I have always kept an eye on mobile video editing apps that tick all the following boxes: 1) they should be free to download and use 2) if there are different versions, the free version should not include a watermark 3) they should be fairly advanced (for instance include the ability to have a second video track) and user-friendly 4) they should be cross-platform (Android and iOS) and 5) they should handle/export at least 1080p resolution with 25/30fps. I eventually ditched one other prerequisite: that you don’t have to create an account to use the app. To be honest, if you want an app that really ticks all the boxes, there isn’t much around. Actually, up until recently I would have only been able to point to a single one: ‘VlogIt’. And even that could have been considered a cheat by strict standards because while VlogIt doesn’t put a watermark on the exported video, it does add a branded bumper outro. I’m not too much of a fan of the app’s UI though, and it’s limited to a 16:9 project aspect ratio. Another theoretical contender was the relatively new ‘Adobe Premiere Rush’ but its availability for Android devices is still extremely limited and you only get three free exports before you have to commit to a paid subscription. So things were looking pretty sobering until last weekend.
While routinely browsing the Google Play Store for new video editing apps, I came across an app named ‘VN’. The provided screen grabs looked somewhat promising and I downloaded the app. After launching it, I was greeted with a splash screen that prompted me to log in or create an account. I seriously considered deleting the app again. I’m at a point where I really don’t want to sign up for the 3478th service, particularly not before even being able to try out the app. Curiosity however got the better of me and in hindsight, I’m glad it did.
First things first: VN isn’t really new. It has apparently been around for about two years according to the release date in the Play Store, but the relatively low number of downloads compared to other popular free video editing apps indicates that not too many people seem to have noticed it. VN is integrated into a video sharing community (where you can post videos to their platform and follow other users), which can seem a bit annoying if you only want to use the app to save the finished project to the device and share it to your platform of choice. You don’t have to share the video to VN’s community though; it’s possible to just export it to the Gallery (Android) or Camera Roll (iOS) and save it locally on the device.
With that out of the way, I have to say I was very impressed with VN’s feature set after taking it out for a spin. While it’s not quite as advanced as LumaFusion or KineMaster, it comes surprisingly close for a free app, covering a wide range of dedicated functions for serious video editing while at the same time sporting a visually pleasing and generally user-friendly UI.
VN has a classic video editor timeline layout and is able to handle multiple tracks of video (important for b-roll editing, for instance), audio and other visual elements like titles, photos and graphics. In terms of graphics it’s also important to note that it supports PNG files with alpha channel (for instance to include brand logos). You can also record a voice-over into the timeline as an audio track, and external microphones are supported for this as well. Another big win for VN is the variety of project aspect ratios available: 21:9, 16:9, 4:3, 1:1, 3:4, 9:16 and even ‘Round’ which is basically a masked square format.
One area where VN really needs to improve (at least on Android) is handling audio transitions between video clips. There are multiple ways to achieve this but none is included at the moment: 1) it’s not possible to detach the audio of a video clip to make J- and L-cuts 2) while visual elements can be keyframed, audio can’t – so no audio ducking / automation is possible 3) while quick fade in/out buttons are conveniently available for audio-only clips (music, voice-overs etc.), this is not available for the integrated audio of video clips in the Android version (it is on iOS) 4) no audio-only cross-fade is included in the transitions. In combination, these critical points make it very hard to avoid rough audio transitions between video clips in the Android version at the moment; the iOS version is slightly better. I suppose the fade in/out buttons for video audio will be added to the Android version eventually.
Talking about audio: at least in the Android version, voice-overs recorded within the app itself don’t sound very good (I have tested on two devices so far), as if they were recorded at a low bitrate or sample rate – but I’m sure this can be fixed with an update. Also, you can’t boost the audio in the Android version while on iOS you can. A slightly annoying thing in both versions: just like in many other video editors featuring video overlays, added b-roll footage doesn’t fill the whole frame but comes in slightly scaled down. So if you want it to cover up the frame of the video clip on the primary track seamlessly, you have to scale it manually, which is not only an extra step but also carries the risk of accidentally moving the image off-center. I get that this default setting is useful if you want to use the overlay video as a picture-in-picture, but it’s not ideal for b-roll style editing. It would also be nice to have a visual audio level meter when playing back the timeline.
Other than that, VN continues to provide you with lots of useful editing options like speed ramping, nice title templates, filters, basic grading and various visual effects. One very clever UI function: when long-pressing a video clip in the timeline to rearrange the order of the clips, the app automatically squeezes the clip into a compact square storyboard thumbnail and only transitions back to the original timeline view after you release the clip into its new place. This makes it much easier to rearrange clips quickly. VN also gives you a variety of professional options on export, not only resolution but also frame rate (24/25/30/50/60) and bitrate. And it’s watermark-free! And available for both Android and iOS! On iOS it even seems that you can use it without having to create an account first. I have only been testing it for about a week now and it’s quite possible that I will come across (more) bugs or shortcomings, but so far I can conclude that this is a fantastic app, both easy to use and powerful. So is it the best free-without-watermark cross-platform mobile video editing app?
A couple of days after discovering VN, I took a second look at another app, one that I tested about a year ago when it was still in beta but somehow lost track of it over the months. It’s called ‘Feelmatic’ and is available for both Android and iOS and similar to VN (at least when looking at the Android version), you have to create an account for their video sharing platform/community.
Feelmatic also covers a lot of important features for advanced mobile video editing. It’s a bit more basic than VN, lacking some of its “bells & whistles”, but depending on the job you need to get done, that might not be a big deal. One might even see it positively as a more focused approach, with a toolbar that lets you see all elements at a glance without having to swipe and scroll around, going down the option rabbit hole. It might be easier to grasp for users who are completely new to video editing. When I first tested the app last year it didn’t have the ability to add a video overlay, but it does now. Better yet, and unlike VN, the video overlay fully covers up the clip in the primary track by default. Feelmatic lets you record a voice-over within the app and supports the use of external mics for that. Just like with VN, however, creating a smooth audio mix can be a problem, as there’s no audio keyframing, no audio-only transitions and no fade in/out buttons. I consider this to be one of two crucial points to improve in Feelmatic. The other is the extremely limited number of available aspect ratios: 16:9 is all there is (unless I’ve missed something) – no option for vertical or square. You can bring in footage in other aspect ratios but it will be fitted into a 16:9 frame and exported as such.
Feelmatic also has two slightly special toolbar elements, one is called ‘Logo’ which basically invites you to add an alpha channel png file as a brand/broadcaster logo and gives you a choice of four common default positions within the frame. The other one is ‘Subtitle’ which adds text including a half-transparent background for better legibility at the bottom of the frame. This is great for actual subtitles/captions but as far as I could tell, there are no other title options like say for an intro. This is a bit too bare bones for my taste.
The UI is generally good and focused with one minor shortcoming: the toolbar is located in the middle of the screen which makes reaching it in one-hand operation a bit more difficult, at least on bigger phones. If the toolbar were located at the bottom beneath the timeline, accessibility would be better.
The process of getting your project out of the app is a bit more cumbersome than with VN (you have to select a category for your video even if you don’t want to publish it on the Feelmatic platform, for instance) but it is possible. That being said, you do get a solid set of export settings including video and audio bitrate. The video bitrate however maxes out at 10 Mbit/s and the audio bitrate at 128 kbit/s, which isn’t exactly great. And there are even more limitations: resolution is limited to 1080p (no UHD/4K) and the frame rate to a maximum of 30fps. While on iOS this does at least include 25fps as well, the Android version only supports 24 and 30fps, which is disappointing because other editing apps on Android like KineMaster, VN or CuteCut don’t have a problem with exporting 25fps.
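If you’re wondering what those maximum export bitrates mean in practice, here’s a quick back-of-the-envelope sketch (my own illustration, not from the app’s documentation; real files will differ a bit due to container overhead and encoder behavior):

```python
# Rough file-size estimate for a clip exported at Feelmatic's maximum
# bitrates: 10 Mbit/s video + 128 kbit/s audio (container overhead ignored).

def estimated_size_mb(duration_s: float,
                      video_mbps: float = 10.0,
                      audio_kbps: float = 128.0) -> float:
    """Approximate file size in megabytes for a given clip duration."""
    video_bits = video_mbps * 1_000_000 * duration_s
    audio_bits = audio_kbps * 1_000 * duration_s
    return (video_bits + audio_bits) / 8 / 1_000_000  # bits -> megabytes

# A 3-minute (180 s) package at max settings:
print(round(estimated_size_mb(180), 1))  # prints 227.9
```

So even a short news-style package stays comfortably in the low hundreds of megabytes, which is fine for 1080p but explains why the 10 Mbit/s cap rules out decent UHD/4K anyway.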
So while I think that Feelmatic is actually a pretty solid and interesting video editing app with great potential that’s definitely worth checking out, VN is more powerful in terms of features and its export process is less cumbersome. You should definitely check out both apps if you are into mobile video editing – unless you are worried about their business model. If you don’t mind a watermark on the exported video or paying for a subscription, KineMaster is still the best and most compatible option available for both major mobile platforms. Let me know what you think in the comments or on Twitter @smartfilming.
Before captioning videos for convenient social media consumption on the go became all the rage, everyone agreed that having good audio was an essential element of good video, possibly even more important than the image quality. Many will agree that it actually still is, despite the captioning vogue of late. So how do you get good audio for video? Let’s make it simple: Get as close to the sound source as possible with your mic! In most cases you will get better audio with a cheap mic close to the sound source than with a super-expensive mic that’s (too) far away (note: it won’t work at all, however, if you use a carrot). And how do you get as close as possible to the audio source? An external mic (as opposed to the internal mic of your phone) will be a big help since you probably don’t want to shove your phone/camera into someone’s face. But can you work with external mics on Android devices? Yes, you can. And the good news is that it basically works with EVERY Android phone or tablet! Of course I have not tested it on every single Android device on the planet, but so far I have not encountered a single one that didn’t support it – and believe me, I have had dozens so far! In most cases you will however have to use third-party apps since most native camera apps don’t support the use of external mics. There are a few exceptions, like many Samsung phones and the (recent) flagships of LG and Sony, but with other Android phone makers your only chance to use an external mic for better audio while recording video is a third-party video recording app like FilmicPro, Cinema FV-5, Open Camera, Cinema 4K or Footej Camera. If you are into video live streaming: Popular platforms like Facebook, Periscope, Instagram or YouTube also support the use of external mics in their mobile apps. Important: Some apps will automatically detect a connected external mic while with others you will have to go into the settings and choose the external mic as the audio input.
In general, it’s recommended to connect the mic before launching the app, as sometimes the app might not correctly detect the mic if you only plug it in afterwards. But how can you connect an external mic to an Android device? There are four basic options that I will briefly elaborate on: 1) via the 3.5mm headphone jack 2) via the microUSB port 3) via the USB-C port 4) via a wireless connection / Bluetooth.
3.5mm headphone jack
The most common wired solution for connecting an external mic to your Android device is (was?) the 3.5mm headphone jack, the port where you would usually plug in your headphones to listen to music. For a long time this was one of THE universal things about a smartphone, be it an Android, an iPhone or even a Windows Phone. In the past couple of years however, more and more phone makers have been following Apple’s lead in ditching the headphone jack (starting with the iPhone 7) in an attempt to further push a slick, enclosed unibody design, leaving the phone with only one physical port, the one that is primarily there to charge the phone. Of course this move also has to do with the rise of Bluetooth headphones. Anyway, if you’re lucky enough to still have a phone with a 3.5mm headphone jack, you have a range of options for connecting different types of external mics. There are two general routes: using a mic with a dedicated TRRS 3.5mm connector or using another mic with an adapter. What’s TRRS, you may ask? Well, it has to do with the number of conductors on the 3.5mm pin. You might have encountered mics with a similar-looking 3.5mm pin made for ‘regular’ cameras. But while they look similar, they only have THREE conductors on the pin (TRS), not FOUR (TRRS). Smartphones use the TRRS standard, so a TRS 3.5mm pin won’t work unless you use an adapter like the Rode SC4. But you know what? There’s a good chance you already own an external mic without even knowing it: the headphones that came with your phone. Yes, you heard that right! They usually have an inbuilt mic for making/receiving phone calls, and if you have your headphones connected to the headphone jack while using a video recording app that supports external mics, this is an easy and cheap way to improve your audio. You might be surprised how decent the sound quality can be!
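For illustration, here’s a little sketch of the two plug layouts, tip to sleeve (general background knowledge, not from any mic’s manual: virtually all current phones use the CTIA/AHJ wiring, while the older OMTP standard swaps the ground and mic conductors):

```python
# 3.5mm plug layouts, listed from tip to sleeve.
# TRS (3 conductors) is what mics for 'regular' cameras use:
TRS = ("tip: left audio", "ring: right audio", "sleeve: ground")

# TRRS (4 conductors) is what smartphones expect; this is the
# CTIA/AHJ wiring used by virtually all current phones.
# (The older OMTP standard swaps ground and microphone.)
TRRS_CTIA = ("tip: left audio", "ring 1: right audio",
             "ring 2: ground", "sleeve: microphone")

print(len(TRS), len(TRRS_CTIA))  # prints 3 4
```

That fourth conductor carrying the mic signal is exactly why a plain TRS mic stays silent on a phone until you put an adapter like the Rode SC4 in between.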
Of course, the headphone cable is usually not very long, so if you are doing a piece to camera or an interview it will be hard to keep the cable out of the frame – and the sound quality of dedicated TRRS 3.5mm mics often trumps that of the headset – but I think it’s still great to have this option at hand. Dedicated TRRS headphone jack mics include the original iRig Mic (handheld) by IK Multimedia, several lavalier/lapel mics (like the Rode smartLav+, the Aputure a.Lav, the Tonor Dual Headed Lapel and the Boya BY-M1) and the Rode VideoMic Me (directional shotgun-type). One might also count the Rode VideoMicro (directional shotgun-type) as a dedicated TRRS mic because you can exchange the TRS cable it comes with (to work with ‘regular’ cameras like DSLRs etc.) for a TRRS cable (Rode SC7, sold separately). Other than that, you can connect basically any XLR mic by using IK Multimedia’s iRig Pre adapter/converter box, which has a female XLR input on one side and a male 3.5mm pin cable on the other. XLR is the most common professional audio connection standard.
microUSB
Just as the 3.5mm headphone jack used to be a common standard on phones, so was the microUSB port on Android devices for charging the phone’s battery. The change from microUSB to USB-C as the preferred “power port” in the last few years very much (but not exclusively) coincided with the trend of dropping the headphone jack. There are hardly any new phones coming out these days that still have a microUSB port (the most recent ones were all in the budget segment). And as the 3.5mm headphone jack was a universal standard at the time, the lack of dedicated microUSB mics didn’t really come as too much of a surprise. Actually, the only mic like this that I ever encountered and used was IK Multimedia’s iRig Mic HD-A, an improved digital version of the original iRig Mic which featured a microUSB connector instead of a 3.5mm pin. One thing you also had to pay attention to when connecting accessories to a microUSB port was the question of USB-OTG support (OTG stands for “On-The-Go”). In simple terms, USB-OTG support means that you can use the USB port for things other than charging – for instance as an audio input. Not all Android devices support USB-OTG.
USB-C
USB-OTG support is also relevant when talking about USB-C, the new USB connection standard for Android devices (it has recently also been introduced on Apple’s iPad Pro series, so one might speculate on whether Apple will eventually make the switch for all its devices). One of the very practical things that makes USB-C better than microUSB is that the connector is reversible and fits into the port either way up – unlike microUSB, which usually meant you tried to plug it in the wrong way first. The other good thing is that over time USB-OTG has become a more common feature on Android devices, so chances are relatively high that your device supports it if you have purchased it in the last two years or so. It’s still not a definitive standard on Android devices though, so if you plan to use USB-C mics you should check the phone’s spec sheet first. The introduction of dedicated USB-C mics has been very slow; the first one to my knowledge was the Samson Go Mic Mobile wireless system launched in 2017, which included a USB-C connection cable for the receiver unit along with cables for 3.5mm, microUSB and the Lightning port (the latter is the standard on most Apple devices). Boya has recently added two USB-C mics to their portfolio (a directional shotgun-type and a lavalier) and Saramonic has a USB-C-to-XLR adapter cable, so there are finally at least some options. For a great overview of USB-C mics, check out this blog post by Neil Philip Sheppard on smartphonefilmpro.com. One more thing: While quite a few phone makers include a USB-C-to-3.5mm adapter with their phones if they have a USB-C port (which would let you use 3.5mm mics), these tend to be proprietary, meaning you can’t use them with other phone brands – and if you lose yours, you will have to purchase one from the same brand again rather than use a third-party adapter. Yes, very annoying, I know.
In general, USB-C mics don’t yet seem to work as universally across Android devices and apps as their headphone jack buddies, so if you plan to use a USB-C mic, I would recommend doing thorough testing before using it on an important job.
Wireless / Bluetooth
All the aforementioned external mic solutions have in common that they involve some kind of wired connection to the phone – even the wireless Samson Go Mic Mobile system and Rode’s RodeLink wireless kit (which can be utilized by connecting a TRS-to-TRRS adapter to it) require their receiver unit to be plugged into the phone. Of course it would be fantastic to have the audio go directly and wirelessly from a mic (transmitter) into the phone without a separate receiver unit attached to it. And in theory it should be very much possible, because modern phones have two protocols allowing for wireless data transfer: WiFi and Bluetooth. So far only Bluetooth has been used for this; I’m sure there’s a technical reason I don’t know about why the WiFi route might not be feasible (yet). A bunch of potential Bluetooth mics have been around for some time, but they usually still need a receiver unit, and the audio quality and reliability haven’t been quite up to the task so far. Bluetooth headphones/headsets with an internal mic are another possible option. Here’s a short test I did using the inbuilt headset mic of my (rather cheap) Bluetooth headphones.
It’s not too bad in my opinion and might suffice for certain tasks, but you definitely notice the quality difference compared to a good wired external mic. Apparently, Bluetooth audio that goes directly into the phone is currently limited to a sample rate of 8kHz by the Android system (according to one of FilmicPro’s engineers), which doesn’t provide the grounds for great quality audio. There’s also a mic called the Instamic that is basically a self-contained mini audio recorder in the form of a somewhat bigger lavalier with internal storage. It also offers a live audio-streaming mode directly to the phone (with noticeably diminished quality compared to the internally recorded audio), but depending on your job, the quality might still not be good enough and can’t match that of wired connections. You also often get a slight delay between video and audio that increases the farther you get away from the device. And unlike with wired external mics, only a few video recording apps on Android actually accept Bluetooth as an external audio input as of today – the ones I know about are FilmicPro and Cinema FV-5. So while the limitations of Bluetooth mics might still be too big for most professional work at this time, it should become a viable option in the near future. As a matter of fact, there recently was a Kickstarter campaign for a Bluetooth transmitter called BAM! that can be attached to any XLR mic and streams the audio directly to the phone in good quality – unfortunately the campaign didn’t reach its funding goal. Let’s hope it’s not the end of the story, since smartphone development is probably headed towards a design with no physical ports at all (wireless charging is already here!) and then wireless is the only way to go, for better or worse! If you have questions or comments, feel free to drop a line!
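To put that 8kHz figure into perspective, here’s a minimal Python sketch. By the Nyquist theorem, a digital audio stream can only represent frequencies up to half its sample rate; the 48kHz comparison value is my assumption of a typical video production sample rate, not something stated by FilmicPro’s engineers.

```python
def max_representable_frequency(sample_rate_hz: float) -> float:
    """Nyquist theorem: a digital audio stream can only represent
    frequencies up to half its sample rate."""
    return sample_rate_hz / 2

# Android's current direct-Bluetooth limit (per the figure above):
print(max_representable_frequency(8_000))   # 4000.0 -> telephone-like band
# A typical wired/video production sample rate, for comparison:
print(max_representable_frequency(48_000))  # 24000.0 -> full audible range
```

A 4kHz ceiling is roughly telephone bandwidth, which is why direct Bluetooth audio sounds noticeably thinner than a wired mic.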
A little more than six months ago I bid my LG V10 goodbye and sent it into retirement. The V10 was the first flagship smartphone I had purchased, and I had done so for a very specific reason: LG had redefined what a stock/native camera app on a smartphone can offer in terms of pro video controls. While many other phone makers were including advanced manual controls for photography in their camera apps, video had been shamelessly ignored. With the introduction of the V-series in late 2015, LG offered avid smartphone videographers a feature pack in the native camera app that could otherwise only be found in dedicated 3rd party apps like FilmicPro. While LG’s smartphone sales can’t really compete with those of Samsung, Huawei and such, the V-series fortunately didn’t just vanish after the V10 but was succeeded by the V20, V30, V35 and V40. As I don’t see the need to upgrade my phone on an annual basis, I went for the V30. It took over the useful dual rear cameras from the V20 and introduced new features like a LOG profile, Point Zoom and CineVideo. After spending six months with the V30, what is there to say about the device as a videography tool?
Hardware features: Lost & Found
Well, first off, let’s get the big thing out of the way that bothered me the most before I even bought the V30: abandoning the removable battery. With the V20, LG was basically the last major phone maker to offer an exchangeable battery on a flagship, so kudos for that, but they eventually ditched it for the V30. I somewhat get the idea that a unibody design without removable parts might just make the device look slicker, and it even has a practical upside when it comes to water and dust resistance (yes, you CAN submerge the V30 without a case thanks to the IP68 rating). But apart from the concerning fact that this is a considerable ecological issue – it makes it likely that you will just buy a new phone when battery life starts to falter – it also does away with the "power management security net" and fosters the fear of running out of power with your phone. Especially when using such a device extensively for professional purposes, a back-up battery that lets you go from 0 to 100% in a matter of seconds is just very comfortable to have around. Sure, external batteries a.k.a. power banks are a common thing by now, but they are not quite as compact and fast in getting the recharging job done. While dropping the removable battery is unfortunate, it’s an all-too-common thing: LG only follows the rest of the pack, as nowadays you can hardly find a phone that still has this feature. Furthermore, I have to say that I was pleasantly surprised by the V30’s battery life. It’s much better than the V10’s, and in some tests with very long recordings, the phone only consumed around 30% of the battery while recording continuously for almost two hours. I just hope the battery doesn’t degrade too fast over time.
Speaking of useful features that are en vogue to get the sack: LG is still holding on to the 3.5mm headphone jack, which will make a lot of people happy as it’s still a very easy and universal way to attach external mics for better sound quality (or do audio monitoring). The V30 also has a USB-C port which can be used for connecting external mics as well, but as of now there are hardly any USB-C mics out there to make use of that. One very clever and useful exception is the Samson Go Mic Mobile wireless system which comes with a whole bunch of connecting cables, including a USB-C one. One day in the (hopefully not so distant) future, truly wireless audio solutions sending high-quality audio directly from the mic to the phone’s video recording app might replace wired solutions, but the current state of quality and reliability in that area isn’t yet up to the task as far as I can see. As for the internal mic, there are actually two, making the LG V30 one of only a few phones that record stereo natively. This is very useful if you are capturing a soundscape or if you have sound sources moving around. And to tell the whole story, the V30 actually has a third internal mic: the phone’s earpiece kicks in as a life saver in very loud environments (like, say, a rock concert) to avoid distorted audio. Another useful feature that the V30 fortunately kept is support for external storage via microSD card; popping in a 128 or 256GB card is a pretty cheap way to have more space for media and apps on your phone.
Three and a Half Cameras
Let’s continue our inspection of the V30’s hardware and take a look at what might be considered the most important part of a phone when talking about videography: the camera(s). While the V10 had a somewhat peculiar lens set-up with a single rear camera but dual front cameras, the V20 flipped this around, which I personally find more useful if you’re not a selfie-vlogger. Dual rear cameras have become all the rage in the last couple of years, almost a must-have on flagship phones and even some mid-rangers (unless your name is Google Pixel). Not all secondary rear cams are created equal though. Some are only for shooting nice portrait shots with a blurred background, some feature a monochrome sensor for black-and-white photography with better low-light performance and dynamic range, and some have a different focal length than the main camera, going either for telephoto (zoom) or wide-angle. For smartphone videographers, only the last two options are actually helpful. And while I have been known for whining about the lack of optical zoom on smartphones in the past, I do have to say that from a practical standpoint, wide-angle seems like the best, most versatile choice after all. Especially if you find yourself indoors backed up against a wall, having a wide-angle is just incredibly helpful to fit more of the scenery into the shot. And the ability to shoot two very different images from a single point without having to move around is fantastic. So while the wide-angle secondary camera is actually a rare choice in the market, LG can only be applauded for going down this route. And after the V20 had shown a noticeable amount of barrel distortion on the wide-angle, the V30’s 12mm secondary rear cam is more refined in that respect. It now also has a much wider aperture than the V20’s (f/1.9 vs f/2.4 – a smaller f-number means a wider aperture), which helps in low light.
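As a rough illustration of what that aperture jump means: the amount of light gathered scales with the inverse square of the f-number, so the step from f/2.4 to f/1.9 can be sketched like this (a back-of-the-envelope calculation that ignores sensor and lens differences):

```python
import math

def relative_light(f_old: float, f_new: float) -> float:
    """Light gathered scales with the inverse square of the f-number,
    so the wider aperture's advantage is (f_old / f_new) squared."""
    return (f_old / f_new) ** 2

gain = relative_light(2.4, 1.9)
print(f"f/1.9 gathers about {gain:.2f}x the light of f/2.4")
print(f"that's roughly {math.log2(gain):.1f} stops")
```

That works out to roughly 1.6x the light, or about two-thirds of a stop – noticeable, but no low-light miracle on its own.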
There are three limitations when using the wide-angle, however. The first one might actually be of use in certain situations: the fact that there’s a fixed focus and therefore no adjusting auto-focus guarantees that there are no unexpected and sudden focus shifts. A fixed focus might be a serious problem for the main camera, but for the wide-angle, it’s ok. The second limitation is a real one though: no OIS (optical image stabilization) and no EIS (electronic image stabilization) either. The third one is the biggest: the V30’s new "LG-Cine Log" profile (more about that later) is not available for the wide-angle camera, only for the main snapper.
The 30mm main rear camera has OIS (plus the option for additional EIS called "Steady recording" – not available for UHD/4K though), laser auto-focus, an f/1.6 aperture and the ability to record in LG-Cine Log. Both rear cameras let you record in UHD/4K (but only up to 30fps; 60fps is only available at FHD resolution, 120fps only at 720p in slow-motion mode) and you can switch between them with a single tap, even while recording. The colors of the two rear cameras don’t match 100% if you take a really close look, but they are close enough for most purposes I’d say. Now while the main rear camera seems to be excellent for low-light with its wide f/1.6 aperture, the relatively small size of the image sensor (1/3.1" with 1.0µm pixel size) unfortunately diminishes this advantage. With very few exceptions (especially when it comes to video), all smartphones still struggle in low-light situations, so it would be wrong to single out LG for that. I would classify the V30’s low-light performance as solid, but not as good as one could have expected with regard to the promising aperture of the main cam.
What about the selfie camera? Well, I was already a little bit suspicious when I saw the tiny camera hole on the front. As it turns out, not only did LG scrap one front camera compared to my old V10, but the actual quality of the footage isn’t really better than the V10’s from two years ago as far as I could see. That’s a bit of a disappointment for sure, but personally I don’t care too much as I rarely use the front camera. As for resolution, you can shoot FHD at 30fps, which is the solid standard, but that’s it – no UHD/4K or higher frame rates. Another note: while there isn’t a second front-facing camera, you still get the option to switch between a wider and a narrower field of view – as there’s no second lens, this is done by a software crop of the image.
Before moving on to the software side of things, a few words about one other very important hardware aspect: the chipset. The LG V30 is equipped with a capable Snapdragon 835 that not only lets you shoot video in UHD/4K resolution (although only up to 30fps in UHD/4K) but also edit it. Importing footage into Android’s two best video editors KineMaster and PowerDirector reveals that you can even have a second video track when working with UHD/4K footage in those apps which is excellent news. For those interested in creating Augmented Reality (AR) enriched video: The V30 is compatible with Google’s ARCore and the Snapdragon 835 has enough muscle to let you use an app like “Just a line” for instance which lets you draw/doodle in AR space. There isn’t too much around in this category yet though.
The King of Manual Video Controls
But while good cameras and powerful chipsets can also be found on other (Android) phones, the unique selling point of the V-series has always been its focus on videography, with all the manual controls and features you get in the native camera app. I’ve already talked about that regarding the V10 when discussing native camera apps on smartphones in an earlier post (I still owe you the second part of that article, what a shame!), but there have been some significant additions since the V10, so it’s worth going into detail again. Let’s have a look at the interface of the manual video mode: on the far left at the bottom of the screen you find an audio level meter which reassures you that there’s actually audio coming in from the mic(s) and also helps you make sure the audio isn’t too loud (peaking). No other native camera app on a smartphone has that – you can only find it in advanced 3rd party apps like FilmicPro, Cinema FV-5 etc. To the right there’s information on what resolution, frame rate and bitrate you are currently using. Next is a button with a microphone icon which opens up a transparent panel overlay with some advanced audio controls that audiophiles will love: you get to change the input gain, activate a low cut filter or set a limiter. While recording, you even get live audio waveforms when this panel is open, giving you even more precise, visualized information about the incoming audio than the audio level meter. From this panel you can also apply a wind noise filter and select an external mic if there is one connected via the headphone jack (edit: unfortunately it doesn’t seem to support mics connected via the USB-C port like I originally wrote in this post!). At this point I would like to mention that the app even allows for audio monitoring via cabled or Bluetooth headphones.
There’s a small delay to the live audio, so listening to it over extended periods of time can be irritating, but it’s definitely good for quickly checking the audio for possible unwanted sonic interference. The next button is for white balance: you can switch between auto mode and a Kelvin scale that ranges from 2300 to 7500K. No presets are available though. Next in line is focus. Again, you can switch between auto-focus and manual focus. When you choose manual focus mode you get to enjoy another staggering feature for a native camera app: focus peaking. Focus peaking adds a colored overlay to the areas of the frame that are in focus and is therefore incredibly helpful for getting the focus right. It can usually only be found on professional "big" cameras. Focus peaking can be switched on or off when using manual focus on the V30. One shortcoming: you can only use focus peaking BEFORE starting the recording, which makes fancy rack focus action while filming still a bit of a gamble. The only Android app that allows focus peaking even while recording is FilmicPro. The EV button lets you adjust the exposure value without having to set precise values for ISO and shutter speed, but as there’s no option to lock the exposure in that case, I find it fairly useless. On to the two real exposure parameters: ISO and shutter speed. The ISO ranges between 50 and 3200, shutter speed between 1/25s and 1/4000s. One crucial improvement over the V10 regarding the shutter speed is that you can now select "PAL" shutter speeds, most importantly 1/50s. This is important because in Europe and some other regions, many artificial light sources run on 50 Hertz mains electricity and flicker at twice that frequency, which causes ugly banding effects in your footage if you are not shooting with a shutter speed that matches this flicker.
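The underlying arithmetic is simple enough to sketch: lights on AC mains flicker at twice the mains frequency (two brightness peaks per cycle), and an exposure time that spans a whole number of flicker periods averages the flicker out. A minimal check (the helper name is mine, not LG’s):

```python
def is_flicker_safe(shutter_denominator: int, mains_hz: int) -> bool:
    """True if an exposure of 1/shutter_denominator seconds spans a
    whole number of flicker periods. AC-powered lights flicker at
    twice the mains frequency (two intensity peaks per cycle)."""
    flicker_hz = 2 * mains_hz
    return flicker_hz % shutter_denominator == 0

# Under European 50Hz mains (lights flicker at 100Hz):
print(is_flicker_safe(50, 50))    # True  -> 1/50s spans two flicker periods
print(is_flicker_safe(100, 50))   # True  -> 1/100s spans exactly one
print(is_flicker_safe(60, 50))    # False -> 1/60s produces banding
# Under 60Hz mains (e.g. North America), it's the other way around:
print(is_flicker_safe(60, 60))    # True
print(is_flicker_safe(50, 60))    # False
```

Note that very fast shutters like 1/4000s come out as unsafe too, which matches reality: they sample only a slice of a flicker cycle.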
The last thing you find on the far right of the bottom control panel is what I like to call the "panic button" and it’s a very cool feature: If you ever find yourself lost fiddling with all the manual controls but need to quickly start recording all of a sudden you can just push the "A" with a circling arrow around it and everything goes back to auto: white balance, focus, exposure.
But the control panel of the main recording interface isn’t the only place stuffed with controls and features; there’s more to find in the settings section, which you can access by tapping the cog wheel at the bottom of the left side bar. The first option you find here at the top of the list is the frame rate. And it’s here that I and some other folks miss a particular something: PAL frame rates, PAL being the broadcast standard in Europe and some other regions of the world. Normally you couldn’t really blame a smartphone for not having the option to shoot at 25 or 50fps in the native camera app (the only phones that ever offered at least 25fps were Nokia’s/Microsoft’s Lumia phones), but with all the amazing bells and whistles in terms of pro videography controls on the V-series, it’s a real shame that LG didn’t pay attention to that as well. Truth be told, this option will only be of serious relevance to a certain group of videographers: those who shoot for PAL broadcast and/or use their phone in combination with a 'regular' camera that only shoots PAL frame rates. If you don’t belong in this category, you can be perfectly happy with the options at hand: 1, 2, 24, 30 and 60fps (60fps is not available when shooting UHD/4K or LOG). Still, in the highly unlikely case that someone from LG reads this blog: PLEASE do add the option to shoot in 25/50fps! How hard can it be? I hope there’s a golden future ahead where regional frame rates are a thing of the past, but that future might still be a bit too far away to just ignore the present. Yes, you can use 3rd party apps to shoot in 25fps on the V30, but if LG gives us a native camera app this good with manual video controls and the idea that this is a serious videography tool, why be ignorant in that particular area? Next in the settings list is bitrate. Yes, you heard that right: you can adjust the bitrate. Another feature that can otherwise only be found in advanced 3rd party apps.
You can choose between three different settings: high, medium and low. The bitrates depend on the selected resolution and frame rate and – upon closer inspection – turn out to be not as high as some power users would have liked. The maximum you get is 52 Mbit/s when shooting in UHD/4K; the "high" option at 1080p/30fps is 24 Mbit/s. Still, it’s nice to have some control over the bitrate at all in a native camera app. Below the bitrate option there’s another very interesting feature that will excite every audiophile: you can toggle on "HiFi recording", which pushes the audio bitrate for video to a crazy 2400 Kbit/s (24-bit PCM stereo), while the regular set-up is 156 Kbit/s (AAC) and no other smartphone I have encountered exceeded 320 Kbit/s. If you want to edit your footage on the phone, be warned that not every video editing app supports PCM audio (KineMaster and PowerDirector do though) – and neither does Twitter’s video player, by the way.
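Those audio numbers roughly check out if you assume a 48kHz sample rate (the actual rate of the HiFi mode isn’t stated anywhere I’ve seen, so treat that as an assumption): uncompressed PCM bitrate is simply sample rate times bit depth times channel count.

```python
def pcm_bitrate_kbps(sample_rate_hz: int, bit_depth: int, channels: int) -> float:
    """Uncompressed PCM bitrate: sample rate x bit depth x channels."""
    return sample_rate_hz * bit_depth * channels / 1000

# Assuming 48kHz, 24-bit stereo for the V30's "HiFi recording" mode:
hifi = pcm_bitrate_kbps(48_000, 24, 2)
print(f"{hifi:.0f} Kbit/s")  # 2304 Kbit/s, in the ballpark of the quoted ~2400
# ...versus the regular 156 Kbit/s AAC track, i.e. roughly 15x the data.
```

The small gap to the quoted ~2400 Kbit/s would then come down to container overhead or rounding in LG’s spec.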
What the LOG!?!
But let’s move on to the big new feature that LG introduced to the V-series with the V30: LG-Cine Log. What’s "log"? I won’t and can’t go into all the details here, but let’s just say it is a special shooting profile that applies certain processing to the image which gives you a better dynamic/tonal range and generally allows more flexibility in post production when you want to create a specific look for your footage. It’s a feature usually only found on professional cinema cameras and calls for a certain amount of post production (grading/coloring) because the "raw" footage usually looks rather dull and pale. So if your workflow includes a fast turnaround, you probably shouldn’t use the LOG profile. It’s a very cool feature though, and I absolutely love it – not least because the regular footage might be considered over-sharpened and over-saturated, an unfortunate habit of many if not most smartphones as they try to satisfy what they deem the crowd’s taste. And while I’d say that the V30’s non-LOG image quality is a tad behind Google’s recent Pixel phones, Samsung’s S9/S9 Plus/Note 9 and the latest iPhones, the native LOG profile makes up for that in my opinion, as you can really create stunning footage with it and have immense flexibility in post production. However, it can’t be denied that shooting LOG is probably only of interest to a certain group of videographers. But hey, if any smartphone should have the ability to shoot LOG in the native app, it should be the V30! Two things to keep in mind when using LG-Cine Log: you can’t use the wide-angle lens and you can only shoot up to 30fps. Here’s a "show reel" of footage shot in LG-Cine Log on the V30 (graded in FCPX).
And here are two shorter videos with LG V30 LOG footage, one “raw” like it is originally recorded, the other with minor grading applied.
And as I already talked about bitrates earlier on, it’s particularly unfortunate with regard to shooting LOG that the bitrates can’t be bumped up to higher levels. One last thing: when using the LOG profile you will find a button in the top right corner of the main interface that lets you toggle a LUT (a so-called "Look-Up-Table") on and off. Again, I don’t really want to get into the specifics here, but suffice it to say that this gives you a preview of what the graded result of your LOG footage COULD look like – it is NOT recording that preview! The image that is recorded is ALWAYS the one you see when the LUT is toggled OFF!
Let’s wrap up the settings menu with a quick look at some other features: Bright Mode and HDR can’t be used in the manual video recording mode (only in auto mode), which renders them useless for me. Steady Recording is an additional (software-powered) stabilizing option that crops the frame and can’t be used when recording in UHD/4K. Tracking Focus tracks a person or object moving about the frame, which can be useful in certain situations. It doesn’t always work perfectly but it’s worth trying out. Covered Lens gives you a warning when you (accidentally) cover part of the wide-angle camera’s image. This can indeed be helpful, as I have occasionally found myself inserting my pinkie into the frame without intending to, because the wide-angle has a really wide angle. On the right hand side of the settings menu you can activate a timer (3 or 10 seconds) and select a resolution. Resolution ranges between 720p and UHD/4K and offers three different aspect ratios (16:9, 18:9 and 21:9 – the latter two are only available up to 1080p). 21:9 is interesting because the ultra-widescreen format gives you a certain "cinematic" effect. If you combine that with the matching frame rate (24fps) and the LOG profile, you are setting the stage for that sweet silver screen look. And for those of you interested in creating vertical video content: you can also shoot vertically with all features and manual controls. Manual mode is not available when using the front camera though – a little bummer.
More fun with shooting modes…
The manual video mode is outstanding, but what about other interesting video modes in the native camera app? There’s one particular mode that was also first introduced with the V30 and got a lot of attention before the phone’s release: CineVideo. It actually bundles two separate features together – and that bundling left me somewhat confused. One part of the CineVideo mode is that you can apply a couple of slick "cinematic" filters (some even call them LUTs, though I’m not sure that’s correct) to your image. But while you get control over the strength of the filter and the vignetting that comes with it, that’s basically it. Yes, you do get some very rudimentary exposure value control, but you can’t lock the exposure or set specific values for ISO and shutter speed, which is really unfortunate and dramatically reduces the usefulness. The other feature in the CineVideo mode is Digital Point Zoom. You can choose a point within the frame and smoothly zoom in by using a virtual slider. Yes, the zoom is only digital, but to my surprise the quality loss isn’t all that bad, and even when fully zoomed in, the image can still be considered acceptable. So it’s a real shame that LG restricted this feature to the CineVideo mode – it would have been very cool to have it in the manual video mode as well. There you can also zoom digitally by using the common two-finger pinch gesture, but the zoom will be very abrupt because there’s no slider. And you also can’t zoom in on an off-center point of the frame like you can with Digital Point Zoom.
So one small general gripe I have with modes and features on the V30 is that certain useful things are only available in certain modes / in certain settings and not in others which can be a little frustrating at times.
"Popout" is another fairly interesting mode as it uses both the main and the wide-angle camera simultaneously to create a picture-in-picture video with two different views from the same camera standpoint. The cool thing is that you can apply some effects to the wide-angle image: Fisheye, Black&White, Vignette and Lens Blur. You can even combine some or all of them at the same time. On top of that you can also change the layout of the picture-in-picture: a circle instead of a rectangle, or three segments of which the top and the bottom are filled by the wide-angle camera while the middle one is filled by the main camera. It’s more of a fun mode and I don’t use it often, but it can come in handy when you try to create something more playful, for instance for a short social media video.
The simultaneous use of two cameras gets even more interesting with the "Match Shot" mode. This is a fantastic feature for vloggers and mobile journalists reporting as a one-(wo)man band – I have already mentioned this mode in my blog post #12: it creates a split-screen recording using both the front and a rear camera simultaneously, which means you can basically show yourself AND your own point of view at the same time. This is just super cool if you are doing an on-the-scene piece-to-camera for a news report or some travel vlogging. For each screen segment you can choose between the regular view and a wide-angle, so you have some flexibility there as well. Best of all: external mics are even supported! Some downsides on the other hand: the aspect ratio is fixed to 18:9 (the resolution of 2880×1440 is good though, so one can crop to 16:9 in post), the frame rate is only 24fps and everything runs on auto, no manual controls. Still, it’s an amazing feature with great potential and it’s a real pity that LG has apparently ditched this mode again on the V40. Here’s a video (not mine) with the Match Shot mode in action:
If you are into square video and doing super-short teasers for longer content, you might find some use for the "Grid Shot" mode which lets you shoot four very short clips of a maximum of 3 seconds each and assembles them into a split-screen square video (resolution: 1440×1440) playing back all four clips at the same time.
The last interesting mode for video is "Slo-Mo". You get slow motion at 240fps – but only in 720p and with barely any manual controls. It’s nice to have, but it’s definitely not LG’s strong suit – Apple and Samsung offer much better quality here in their flagship phones.
Camera2 API & 3rd party apps
So with the V30’s native camera app being so amazing, is there any need at all for 3rd party apps? Yes and no, or as we like to say in German: Jein. The biggest reason for using a 3rd party app is probably the frame rate: as mentioned before, the native app does not offer any PAL standard frame rates (25/50fps), which might be important to some users. Other than that, the only app that can actually beat LG’s native camera app when it comes to features and controls is FilmicPro, which gives you, among other things, focus peaking during recording, a waveform monitor and false color analytics to check exposure in difficult situations, the ability to shoot at higher bitrates and the option to use the more efficient (but not yet fully mass-market compatible) HEVC/H.265 codec instead of the standard AVC/H.264. But as I have pointed out in an earlier blog post, the ability to have advanced manual video controls in 3rd party apps on Android devices very much depends on how well the phone maker has implemented the so-called Camera2 API (if you want to learn more about it, check out my two blog posts about it here and here). Without proper implementation, 3rd party app developers can’t access or make use of certain controls. So how’s the Camera2 API support on the V30? Well, it’s a mixed bag. It does have the highest support level ("Level 3") for both rear cameras (only "Limited" though for the front camera), so theoretically things should be fine, but apparently LG overlooked a small bug that affects focusing in 3rd party camera apps. Sometimes the focus gets stuck and you have to quit and re-launch the app. While I first experienced this with FilmicPro, it also happened with other 3rd party apps, so it seems to be a more general issue and not related only to FilmicPro. Let’s hope LG can fix this nuisance with a software update.
A positive aspect of LG’s Camera2 implementation on the other hand is the fact that 3rd party camera apps do get access to the secondary rear camera, something other Android phone makers are less welcoming about. So far, only FilmicPro and ProShot have actually integrated this as a feature though. In the case of FilmicPro this means that there is a way to shoot in LOG profile with the wide-angle lens after all! A word about frame rates: the ability to shoot in 25fps is one major reason for some to use 3rd party camera apps. Using the V30 with FilmicPro in 25fps has been mostly consistent and reliable so far (occasionally you do get 24.93fps or something else not 100% on the spot), but you don’t get the higher PAL frame rate option of 50fps (something very few Android handsets seem to allow at this point). And neither do you get 60fps, which is available in the native app, so LG still keeps some shackles on the API here for 3rd party apps. Surprisingly though, you can shoot at the even higher slow-motion frame rate of 120fps (up to FHD). So I’d say slow-motion capability comes out as a tie between the native camera app and FilmicPro: the native camera app lets you record at 240fps using the slow-motion mode but only in 720p, while with FilmicPro you "only" get a frame rate of 120fps but a higher resolution (1080p).
In the long run…
Before concluding this rather detailed inspection of the V30 I would like to address one more aspect: maximum recording length. While quite a few smartphone videographers usually take relatively short clips and don’t really care if there’s a limit of, say, 20 minutes for a single video, for others it’s really important to know about this. Android used to have a single file size limit of around 4GB – more precisely 4.29 GB, which corresponds to 2^32 bytes (4 GiB) and matches the maximum file size of the well-known FAT32 format – but many phone makers were able to get rid of that with their own version of the Android OS (Sony, Huawei, Nokia, BQ, HTC for instance). Unfortunately, LG isn’t among them. That being said, LG vastly improved things compared to the V10. On the V10, the recording would stop upon reaching the file size limit and you would have to manually restart the recording. Not a good thing if you were using the phone as an alternative angle for a longer event while having your focus on the main camera, or if you really needed every second of the recording. With the V30 you don’t have to manually restart the recording anymore; it basically records continuously for as long as battery and storage allow. In the background however, the clip is chopped up into chunks of 4.29 GB and you lose a very short segment in between (around 2 seconds, I’d say). It might not be the ideal solution for certain jobs but it’s definitely better than having to restart manually. After all, some might even argue that in case of file corruption it’s better not to have a single file. Of course, the ideal solution would then be a spliced clip that can be seamlessly reassembled afterwards without dropping a single frame.
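For planning long recordings it helps to know roughly how long one of those chunks lasts. A quick back-of-the-envelope sketch (4.29GB is 2^32 bytes; the bitrates are the ones quoted earlier plus a small allowance for audio, so take the results as rough estimates):

```python
CHUNK_BYTES = 2**32  # 4,294,967,296 bytes, i.e. the 4.29GB chunk size

def chunk_minutes(total_bitrate_mbit_s: float, chunk_bytes: int = CHUNK_BYTES) -> float:
    """Duration of one file chunk at a combined video+audio bitrate."""
    return chunk_bytes * 8 / (total_bitrate_mbit_s * 1_000_000) / 60

# UHD/4K at 52 Mbit/s video plus ~0.2 Mbit/s AAC audio:
print(f"UHD/4K: ~{chunk_minutes(52.2):.0f} min per 4.29GB chunk")
# FHD/30 at the 'high' 24 Mbit/s setting plus audio:
print(f"FHD:    ~{chunk_minutes(24.2):.0f} min per 4.29GB chunk")
```

In other words, at UHD/4K you hit a chunk boundary (and lose those ~2 seconds) roughly every 11 minutes, while at FHD it happens only about every 24 minutes.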
So, in the end, is the LG V30 a smartphone videographer’s dream machine? For the most part I’d say yes: its focus on videography is absolutely unique in the smartphone market, and the range of advanced pro tools for shooting video that is available right out of the box, without having to bother with 3rd party apps that might have certain quirks thanks to Android’s fragmentation, is utterly brilliant. The native camera app has been rock-solid in terms of reliability; it hasn’t crashed on me once so far. It’s not quite perfect though: Especially when taking into account that this phone was made for (professional) videographers, it’s a bit puzzling that LG didn’t bother to include PAL frame rates in its native camera app. I’m not an expert on this but I’d say it shouldn’t have been too much of a problem technically to do so. Maybe they just didn’t care? Who knows… This leaves me with two wishes: a) Please, LG, go the extra inch and include PAL frame rates in the native camera app with a software update and b) to all you other smartphone makers out there: please follow LG’s example in paying more attention to your phone’s native camera app in terms of advanced manual video controls. Thank you.
Back in February I published a list with a wide selection of (potentially) useful Android apps for media production. Despite the fact that I mostly write for this blog in English now, the list was published in its German version first. I did promise an English version however and I’ve been working on it ever since. The new English version is not just a translation, it’s actually an update with some apps having been kicked out and others added. And what occasion could be better to finally publish it than at the time MoJoFest is happening in Galway, Ireland. MoJoFest is an exciting 3-day conference (May 29th to 31st) about content creation with mobile devices, initiated and organized by former RTE Innovation Lead Glen Mulcahy. Check out their website and follow the hashtag #MoJoFest on Twitter! I’ll be giving a workshop/presentation about smartphone videography on Android devices on Thursday, May 31st, and as a precursor, I’ll upload the English version of my app list here. Please keep in mind that there might be some typos or even outdated information in it as the mobile world keeps spinning at an incredible pace and things can change quickly. This is also a highly subjective list and by no means “definitive” or “ultimate”, you may find that other apps which are not on the list suit you better for your work. If you think an app you know and love should absolutely be on this list or if you have new information about apps already on the list, please do contact me! But now without much further ado…
When using a headline like the one above, camera people usually refer to the idea that you should already think about the editing when shooting. This basically means two things: a) make sure you get a variety of different shots (wide shot, close-up, medium, special angle etc) that will allow you to tell a visually interesting story but b) don’t overshoot – don’t take 20 different takes of a shot or record a gazillion hours of footage because it will cost you valuable time to sift through all that footage afterwards. That’s all good advice but in this article I’m actually talking about something different, I’m talking about a way to create a video story with different shots while only using the camera app – no editing software! In a way, this is rather trivial but I’m always surprised how many people don’t know about it as this can be extremely helpful when things need to go super-fast. And let’s be honest, from mobile journalists to social media content producers, there’s an increasing number of jobs and situations to which this applies…
The feature that makes it possible to already edit a video package within the camera app itself while shooting is the ability to pause and resume a recording. The most common way to record a video clip is to hit the record button and then stop the recording once you’re finished. After stopping the recording the app will quickly create/save the video clip to be available in the gallery / camera roll. Now you might not have noticed this but many native camera apps do not only have a “stop” button while recording video but also one that will temporarily pause the recording without already creating/saving the clip. Instead, you can resume recording another shot into the very same clip you started before, basically creating an edit-on-the-go while shooting with no need to mess around with an editing app afterwards. So for instance, if you’re shooting the exterior of an interesting building, you can take a wide shot from the outside, then pause the recording, go closer, resume recording with a shot of the door, pause again and then go into the building to resume recording with a shot of the interior. When you finally decide to press the “stop” button, the clip that is saved will already have three different shots in it. The term I would propose for this is “shediting”, obviously a portmanteau of “shooting” and “editing”. But that’s just some spontaneous thought of mine – you can call this what you want of course.
What camera apps will let you do shediting? On Android, actually most of the native camera apps I have encountered so far. This includes phones from Samsung, LG, Sony, Motorola/Lenovo, Huawei/Honor, HTC, Xiaomi, BQ, Wileyfox and Wiko. The only two Android phone brands that didn’t have this feature in the phone’s native camera app were Nokia (as tested on the Nokia 5) and Nextbit with its Robin. As for 3rd party video recording apps on Android, things are not looking quite as positive. While Open Camera and Footej Camera do allow shediting, most others don’t have this feature. FilmicPro (Android & iOS) meanwhile doesn’t have a “pause” button but you can basically achieve the same thing by activating a feature called “Stitch Recorded Footage” in the settings under “Device”. There’s also MoviePro on iOS which lets you do this trick. Apple however still doesn’t have this feature in the iOS native camera app at this point. And while almost extinct, Lumia phones with Windows 10 Mobile / Windows Phone on the other hand do have this feature in the native camera app just like most Android phones.
EDIT: After I had published this article I was asked on Twitter if the native camera app re-adjusts or lets you re-adjust focus and exposure after pausing the recording because that would indeed be crucial for its actual usefulness. I did test this with some native camera apps and they all re-adjusted / let you re-adjust focus and exposure in between takes. If you have a different experience, please let me know in the comments!
Sure, shediting is only useful for certain projects and situations because once you leave the camera app, the clip will be saved anyway without the possibility to resume, and you can’t edit individual shots within the clip without heading over to an editing app after all. Still, I think it’s an interesting tool in a smartphone videographer’s kit that one should know about because it can make things easier and faster.
Xiaomi has been a really big name in China’s smartphone market for years, promising high-end specs and good build quality for a budget price tag – but only at the end of last year did they officially enter the global scene with the Mi A1. The Mi A1 is basically a revamped Mi 5X running stock Android software instead of Xiaomi’s custom Mi UI. It’s also part of Google’s Android One program which means it runs a ‘clean’ Google version of Android that gets quicker and more frequent updates directly from Google. For a very budget-friendly 180€ (current online price in Europe) you get a slick looking phone with dual rear cameras, featuring a 2x optical zoom telephoto lens alongside the primary camera. Sounds like an incredible deal? Here are some thoughts about the Mi A1 regarding its use as a tool for media production, specifically video.
After spending a couple of days with the Mi A1, I would say that this phone is definitely a very interesting budget choice for mobile photographers. The fact that you get dual rear cameras (the second one is a 2x optical zoom as mentioned before) at this price point is pretty amazing. The photo quality is quite good in decent lighting conditions (low light is problematic but that can be said of most smartphone cameras), you get a manual mode with advanced controls in the native camera app and the portrait mode feature does a surprisingly good job at creating that fancy Bokeh effect blurring the background to single out your on-screen talent. A lot of bang for the buck. Video – which I’m personally more interested in – is a slightly different story though.
Let’s start with a positive aspect: The Xiaomi Mi A1 lets you record in UHD/4K quality which is still a rarity for a budget phone in this price range. And hey, the footage looks quite good in my opinion, especially considering the fact that it’s coming from a (budget) smartphone. I have uploaded some sample footage on YouTube so see for yourself.
The video bitrate for UHD/4K hovers around 40 Mbps in the native app which is ok for a phone but the audio bitrate is a meager 96 Kbps (same in FHD) – so don’t expect full, rich sound. But this is only the beginning of a couple of disappointments when it comes to video: One of the Mi A1’s promising camera features, the 2x optical zoom lens, CANNOT be used in the video mode, only in the photo mode! What a bummer! This goes for both the native camera app and 3rd party apps.
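Those bitrate figures translate directly into storage use, which puts them in perspective; here's a quick sketch based purely on the numbers quoted above:

```python
def megabytes_per_minute(video_mbps: float, audio_kbps: float) -> float:
    """Approximate storage consumed per minute of recording."""
    total_bits_per_second = video_mbps * 1_000_000 + audio_kbps * 1_000
    return total_bits_per_second * 60 / 8 / 1_000_000

# Mi A1 in UHD/4K: ~40 Mbps video + 96 Kbps audio
print(round(megabytes_per_minute(40, 96)))  # 301, i.e. roughly 300 MB per minute
```

Notice how little the audio contributes: the 96 Kbps track is about 0.2% of the total stream, which is exactly why bumping it up to a fuller-sounding rate would cost Xiaomi next to nothing in file size.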
Talking about 3rd party camera apps, it’s also a huge let-down that the Camera2 API support (what is Camera2 API?) is only “Legacy” out of the box, even though the Mi A1 is part of Google’s Android One program. “Legacy” means that third party camera apps can’t really tap into the new, more advanced camera controls that Google introduced with Android 5 in 2014, like precise exposure control over ISO and shutter speed. Due to this, you can’t install an app like Filmic Pro in the first place and other advanced camera apps like Cinema FV-5, ProShot, Lumio Cam, Cinema 4K, Footej Camera or Open Camera can’t really unleash their full potential. Interestingly, there seems to be a way to “unlock” full Camera2 support via a special procedure without permanently rooting your device (look here) but even after doing so, Filmic Pro can’t be installed, probably because the PlayStore keeps the device’s original Camera2 support information in its database to check if the app is compatible without actually probing the current state of the phone. This is just an educated guess however. Still, many of us might not feel comfortable messing around with their phone in that way and it’s a pity Xiaomi doesn’t provide this out-of-the-box on the Mi A1.
Lackluster Camera2 API support can be remedied by a good native camera app but unlike with photos, there is no pro or manual mode for videos on the Mi A1, it’s actually extremely limited. While you can lock the focus by tapping (there are two focus modes, tap-to-focus and continuous auto-focus), you are only able to adjust the auto-exposure within a certain range (EV), not lock it. There’s also no way to influence the white balance. Shooting in a higher frame rate (60fps)? Not possible, not even in 720p (there’s a not-too-bad 720p slow-motion feature though). Apropos frame rates: I noticed that while the regular frame rate is the usual 30fps, the native camera app reduces the fps to 24 (actually 23.98 to be precise) when shooting under low-light conditions to gain a little bit more light for each frame. That’s also the reason why I made two different YouTube videos with sample footage so I was able to keep the original frame rate of the clips. I have experienced this behaviour of dropping the frame rate in low-light in quite a few (native) camera apps on other phones as well and from the standpoint of a run-of-the-mill smartphone user taking video this is actually an acceptable compromise in my opinion (as long as you don’t go below 20fps) to help tackle the fact that most smartphone cameras still aren’t naturally nocturnal creatures. It can however be a problem for more dedicated smartphone videographers that want to edit their footage as it’s not really good to have clips in one project that differ so much in terms of fps. 3rd party apps might help keep the fps more constant.
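The reasoning behind that low-light frame-rate drop is simple: fewer frames per second means a longer maximum shutter time per frame, so each frame can gather more light. A quick sketch of the arithmetic (assuming the shutter stays open for the full frame interval, i.e. a 360-degree shutter, which real apps only approximate):

```python
def max_exposure_ms(fps: float) -> float:
    """Longest possible per-frame exposure time at a given frame rate,
    assuming a full 360-degree shutter."""
    return 1000.0 / fps

# Dropping from 30fps to 23.98fps stretches the frame interval:
gain = max_exposure_ms(23.98) / max_exposure_ms(30)
print(round(gain, 2))  # 1.25 -> roughly 25% more light per frame
```

A quarter more light per frame is a meaningful gain for a tiny sensor, which is why so many native camera apps quietly make this trade.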
And there are still two other big reasons to use a 3rd party app on the Mi A1 despite the lack of proper Camera2 API support: locking exposure and using an external microphone via the headphone jack (yes, there is one!). One more important shortcoming to talk about: It’s not too surprising maybe that there is no optical image stabilization (OIS) on a phone in this price range but given the fact that you can shoot 4K, I would have expected electronic image stabilization (EIS) at least when shooting in 1080p resolution. But there’s no EIS in 1080p which means that you should put the phone on a tripod or use a gimbal most of the time to avoid getting shaky footage. With a bit of practice you might pull off a decent handheld pan or tilt however to avoid having only static shots.
So I’ve talked about the video capturing part, what about editing video on the Mi A1? The phone sports a Snapdragon 625 which is a slightly dated but still quite capable mid-ranger chipset from Qualcomm. You can work with up to two layers (total of three video tracks) of FHD video in KineMaster and PowerDirector (the two most advanced Android video editing apps) which will suffice for most users. Important note: DON’T run the hardware analysis test in KineMaster though! It’s a hardware probing procedure meant to better determine the device’s capabilities in terms of editing video in the app. While the device capability information originally says you can have two QHD (1440p) video layers, it will downgrade you to two 720p (!) layers after running the analysis – quite strange. Don’t worry though if your evil twin grabs your phone and runs the test anyway – you just have to uninstall and then reinstall KineMaster to get back to the original setting. I ran some quick tests with FHD 1080p layers and it worked fine so just leave everything as is. Since the phone can shoot in UHD/4K resolution you might ask if you can edit this footage on the device. While you can’t edit 4K in KineMaster on the Mi A1 at all (when trying to import 4K footage the app will offer you to import a transcoded QHD version of the clip to work with) you can import and work with UHD/4K in PowerDirector, but only as a single video track, layers are not possible.
So let’s wrap this up: Xiaomi’s first internationally available phone is a great budget option for mobile photographers but the video recording department is let down by a couple of things which makes other options in this price range more appealing to the smartphone videographer if advanced manual controls and certain pro apps are of importance. As I pointed out though, it’s not all bad: It’s still hard to find a phone for that price that offers UHD/4K video recording – and the footage even looks pretty good in decent lighting conditions. So if you happen to have a Mi A1 – there’s no reason at all to not create cool video content with it – if you achieve a nice video package you can even be more proud than someone with a flagship phone! 😉
Back in 2016 Google made an iOS-exclusive app (weird, ain’t it?!) called Motion Stills. It focused on working with Apple’s newly introduced ‘Live Photos’ for the iPhone 6s. When you shoot a ‘Live Photo’, 1.5 seconds of video (with a low frame rate mind you) and audio before and after pressing the shutter button is recorded. You can think of it as a GIF with sound. What Motion Stills does is that it lets you record, stabilize, loop, speed-up and/or combine ‘Live Photos’. In 2017, Google finally brought the app to Android. Now while some Android phone makers have introduced ‘Live Photo’-like equivalents, there’s no general Android equivalent as such yet and because of that the app works slightly differently on Android. Instead of ‘Live Photos’ you can shoot video clips with a maximum duration of 3 seconds (this also goes for pre-6s iPhones on iOS). There are also other shooting modes (Fast Forward, AR Mode) that are not limited to the 3 seconds but for this post I want to concentrate on the main mode Motion Still.
When I first looked at the app, I didn’t really find it very useful. Recording 3-second-clips in a weird vertical format of 1080×1440 (720×960 on iOS)? A revamped Vine without the attached community? Some days later however I realized that Motion Stills actually could be an interesting and easy-to-use visual micro-storytelling tool, especially for teaching core elements of visual storytelling. The main reasons why I think it’s useful are:
a) it’s a single app for both shooting and editing (and it’s free!)
b) the process of adding clips to a storyboard is super-easy and intuitive and
c) being forced to shoot only a maximum of 3 seconds lets you concentrate on the essentials of a shot
So here’s a quick run-through of a possible scenario of how one might use the app for a quick story or say story-teaser: When covering a certain topic / location / object etc. you take a bunch of different 3-second-shots with Motion Stills (wide shot, close-up, detail etc. – 5-shot-rule anyone?) by pressing the record button. It might be good to include some sort of motion into at least some shots, either by shooting something where you already have motion because people or objects are moving or by moving the smartphone camera itself (‘dolly’ shot, pan, tilt) when there is no intrinsic motion. Otherwise it might look a little bit too much like a stills slide show. Don’t worry too much about stabilization because Motion Stills automatically applies a stabilization effect afterwards and even without that, you might just be able to pull off a fairly stable shot for three seconds. After you have taken a bunch of shots, head over to the app’s internal gallery (bottom left corner on Android, swipe up on iOS) where all your recordings are saved and browse through the clips (they auto-play). If you tap a clip you can edit it in a couple of ways: You can turn off stabilization, mute the clip, apply a back-and-forth loop effect or speed it up. On iOS, you can also apply a motion tracking title (hope the Android version will get this feature soon as well!). What you can’t do is trim the clip. But you actually don’t have to go into edit mode at all if you’re happy with your clips as they are, you can create your story right from the gallery. And here’s the cool thing about that: Evoking a shade of Tinder, you can quickly add a clip to your project storyboard (which will appear at the bottom) by swiping a clip to the right or delete a clip from the gallery by swiping it to the left. If you want to rearrange clips in the storyboard, just long-press them and move them to the left or the right.
If you want to delete a clip from the storyboard, long-press and drag it towards the center of the screen, a remove option will appear. In a certain way Google’s Motion Stills could be compared to Apple’s really good and more feature-rich Apple Clips app when it comes to creating a micro-story on the go really fast with a single app – but Apple Clips is – of course – only available for iOS. When you are finished putting together your micro-story in Motion Stills, you can play it back by tapping the play button and save/share it by tapping the share button. Once you get the hang of it, this is truly fast and intuitive – you can assemble a series of shots in no time.
That being said, there are a couple of limitations and shortcomings that shouldn’t be swept under the rug. Obviously, thanks to the 3-second-limit per clip, the app isn’t really useful for interviewing people or any other kind of monologue/dialogue scenario. You might fit in some one-liners or exclamations but that’s about it. It’s also a bit unfortunate that the app doesn’t apply some kind of automatic audio-transition between the clips. If you listen to the end result with the sound on, you will often notice rather unpleasant jumps/cracks in the audio at the edit points. While you could argue that because of the format content will only be used for social media purposes where people often just watch stuff without sound and will not care much about the audio anyway, I still think this should be an added feature. But let’s get back to the format: While you have the option to export as a GIF if you are only exporting one clip, the end result of a series of clips (which is the use case I’m focusing on here) is an mp4 (mov on iOS) video file with the rather awkward resolution of 1080 by 1440 (Android) or 720 by 960 (iOS) – a 3:4 aspect ratio. This means that it will only be useful for social media platforms but hey, why ‘only’, isn’t social media everything these days?! Another thing that might be regarded as a shortcoming or not is the fact that (at least on Android) you are pretty much boxed in with the app. You can’t import stuff and clips also don’t auto-save to the OS’s general Gallery (you will have to export clips manually for that). But is that such a bad thing? I don’t think so because a good part of the fun is doing everything with a single app: shooting, editing, exporting/publishing. So let’s finish this with an actual shortcoming: While the app is available for Android, it’s not compatible with certain devices – mostly low-end devices / mid-rangers with rather weak chipsets.
And even if you can install it, some not-so-powerful devices like the Nokia 5 or Honor 6A (both rocking a Snapdragon 430) tend to struggle with the app when performing certain tasks. This doesn’t mean the app always runs 100% stable on flagships – I also ran into the occasional glitch while using it on a Samsung S7 and an iPhone 6. Still, the app is free, so at least check it out, it can really be a lot of fun and useful to do/learn visual (micro) storytelling! Download it on GooglePlay (Android devices) or the Apple App Store (Apple devices).
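Coming back to the audible jumps at the edit points mentioned above: what an automatic audio transition would amount to is a short crossfade at each cut. Here's a minimal, purely illustrative sketch of a linear crossfade between two mono clips represented as sample lists (no real audio I/O; in practice you'd use an audio library):

```python
def crossfade(a, b, overlap):
    """Linearly crossfade the tail of clip a into the head of clip b.

    a, b: lists of audio samples; overlap: number of samples to blend.
    """
    mixed = []
    for i in range(overlap):
        t = i / overlap  # fade position, 0.0 -> 1.0
        mixed.append(a[len(a) - overlap + i] * (1 - t) + b[i] * t)
    return a[:-overlap] + mixed + b[overlap:]

# Two 'clips' at different constant levels: instead of an abrupt jump
# (which you hear as a click), the joined result ramps down smoothly.
print(crossfade([1.0] * 6, [0.0] * 6, 4))
```

Even a crossfade of just a few milliseconds at each cut would be enough to get rid of those clicks, so it's a feature Google could add without changing how the app feels.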
P.S.: Note that you can only work on one project at a time and don’t clear the app’s cache before finishing/exporting it – otherwise the project (not the recorded clips) will be lost!
P.P.S.: Turn off the watermark in the settings!
One of the first steps when getting more serious about producing video content with a smartphone is to look at the more advanced video recording apps from 3rd party developers. Popular favorites like “FilmicPro” (available for both Android and iOS) usually offer way more image composition controls, recording options and helpful pro features of the kind you find on dedicated video cameras than the native stock camera app provided by the maker of the smartphone. While quite a few stock camera apps now actually have fairly advanced manual controls when shooting photos (ability to set ISO and shutter speed might be the most prominent example), the video mode unfortunately and frustratingly is still almost always neglected, leaving the eager user with a bare minimum of controls and options. In 2015 however, LG introduced a game changer in this regard: the V10. For the first time in smartphone history, a phone maker (also) focused on a full featured video recording mode: it included among other things the ability to set ISO and shutter speed, lock exposure, pull focus seamlessly, check audio levels via an audio level meter, adjust audio gain, set microphone directionality, use external microphones, alter the bit rate etc. etc. Sure, for certain users there were still some things missing that you could find in 3rd party apps like the option to change the frame rate to 25fps if you’re delivering for a PAL broadcast but that’s only for a very specific use case – in general, this move by LG was groundbreaking and a bold and important statement for video production on a smartphone. But what about other phone makers? How good are their native camera apps when it comes to advanced options and controls for recording video? Can they compete with dedicated 3rd party apps?
First off, let me tell you why in most cases, you DO want to have a 3rd party app for recording video (at least if you have an Android phone): external microphones. With the exception of LG, Samsung (and I’m told OnePlus) in their recent flagship lines (plus Apple in general), no stock camera app I have come across supports the use of external microphones when recording video. Having good audio in a video is really important in most cases and external microphones (connected via headphone jack, microUSB, USB-C or Lightning connector) can be a big help in achieving that goal.
So why would you use a stock camera app over a dedicated 3rd party app at all? Familiarity. I guess many of us use the native camera app of a smartphone when snapping casual, everyday photos and maybe also videos in non-professional situations. So why not build on that familiarity? Simplicity. The default UI of most native camera apps is pretty straight-forward and simple. Some might prefer this to a more complex UI featured in more advanced 3rd party apps. Affordability. You don’t have to spend a single extra penny for it. I’m generally an avid advocate of supporting excellent 3rd party app developers by paying for their apps but others might not want to invest. The most important reason in my opinion however is: Stability/Reliability. This might not be true for every stock camera app on every phone (I think especially owners of Sony phones and lately the Essential Phone could beg to differ) but because of the fact that the app was developed by the maker of the phone and is usually less complex than 3rd party apps, chances are good that it will run more stable and is less prone to (compatibility) bugs, especially when you consider the plethora of Android devices out there. The V10’s stock camera app, despite being rather complex, is rock-solid and hasn’t crashed on me once in almost 2 years now.
Over the last months I have taken a closer look at a whole lot of stock camera apps on smartphones from LG, Samsung, Apple, Huawei, Sony, Motorola/Lenovo, Nokia (both their older Windows Phone / Windows Mobile offerings AND their new Android handsets), HTC, Nextbit, BQ, Wiko and Google/Nexus. It goes without saying that I wasn’t able to inspect stock camera apps on all the different phone models of a manufacturer. This is important to say because some phone makers give their flagship models a more advanced camera app than their budget devices while others offer the same native camera app across all (or at least most) of their device portfolio. Also, features might be added on newer models. So keep in mind, all I want to do is to give a rough overview from my perspective and offer some thoughts on which phone makers are paying more attention to pro features in the video recording department.
The lowest common denominator for recording video in a stock camera app on a smartphone at the moment is that you will have a button to start recording in full-auto mode with a resolution of 1920×1080 (1080p) (1280×720 on some entry level or older devices) at a frame rate of 30fps. “Full-auto” basically means that exposure, focus and white balance (color temperature) will be set and adjusted automatically by the app depending on the situation and the algorithm / image processing routine. While this might sound like a convenient and good idea in general to get things done without much hassle, the auto-mode will not always produce the desired results because it’s not “smart” enough to judge what’s important for you in the shot and therefore doesn’t get exposure, focus and/or white balance right. It might also change these parameters while recording when you don’t want them to, like for instance when you are panning the camera. Therefore one of the crucial features to get more control over the image is the ability to adjust and lock exposure, focus and white balance because if these parameters shift (too wildly/abruptly/randomly) while recording, it makes the video look amateurish. So let’s have a look at a couple of stock camera apps.
I’ve been spending quite some time in the last months doing research on what device could qualify as the cheapest budget Android phone that still has certain relevant pro specs for doing mobile video. While it might be open to discussion what specs are the most important (depending on who you ask), I have defined the following for my purposes: 1) decent camera that can record at least in FHD/1080p resolution, 2) proper Camera2 API support to run pro camera apps with manual controls like Filmic Pro (check out my last post about what Camera2 API is), 3) powerful enough chipset that allows the use of video layers in pro video editing apps like KineMaster and PowerDirector, 4) support for external microphones (preferably featuring a headphone jack as long as there are no good all-wireless solutions available).
The greatest obstacle in this turned out to be No. 2 on the list, proper Camera2 API support. Apart from Google’s (abandoned?) Nexus line which also includes a budget option with the Nexus 5X (currently retailing for around 250€), phone makers (so far) have only equipped their flagship phones with adequate Camera2 API support (meaning the hardware support level is either ‘Full’ or ‘Level 3’) while mid-range and entry-level devices are left behind.
Recently, I happened to come across a rather exotic Android phone, the Nextbit Robin. The Nextbit Robin is a crowdfunded phone that came out last year. Its most notable special feature was the included 100GB of cloud storage on top of the 32GB internal storage. While the crowdfunding campaign itself was successful and the phone was actually released, regular sales apparently have been somewhat underwhelming as the phone’s price has dropped significantly. Originally selling for a mid-range price of 399$, it can now be snagged for around 150€ online (Amazon US even has it for 129$). As far as I know, it is now the cheapest Android device that checks all the aforementioned boxes regarding pro video features, INCLUDING full Camera2 API support! Sure, it has some shortcomings like mediocre battery life (the battery is also non-replaceable – but that’s unfortunately all too common these days) and the lack of a microSD storage option (would have been more useful than the cloud thing). It also gets warm relatively quickly and it’s not the most rugged phone out there. But it does have a lot going for it otherwise: The camera appears to be reasonably good (of course not in the same league as the ones from Samsung’s or LG’s latest flagships), it even records video in UHD/4K – though it’s no low light champion. The Robin’s chipset is the Snapdragon 808 which has aged a bit but in combination with 3GB of RAM, it’s still a quite capable representative of Qualcomm’s 800-series and powerful enough to handle FHD video layers in editing apps like KineMaster and PowerDirector which is essential if you want to do any kind of a/b-roll editing on your video project. It also features a 3.5mm headphone jack which makes it easy to use external microphones when recording video with apps that support external mics.
The most surprising thing however is that Nextbit implemented full Camera2 API support in its version of Android which means it can run Filmic Pro (quite well, too, from what I can tell so far!) and other advanced video recording apps like Lumio Cam and Cinema 4K with full manual controls like focus, shutter speed & ISO. One more thing: The Robin’s Android version is pretty much as up-to-date as it gets: While it has Android 6 Marshmallow out of the box, you can upgrade to 7.1.1 Nougat (the latest version is 7.1.2).
So should you buy it? If you don’t mind shelling out big bucks for one of the latest Android flagship phones and you really want the best camera and fastest chipset currently available, then maybe not. But if you are looking for an incredible deal that gives you a phone with a solid camera and a whole bunch of pro video specs at a super-low price, then look no further – you won’t find that kind of package for less at the moment.