
Scary Fast event shot on iPhone 15 Pro Max: Impressive, or a cheat?

Apple’s proud boast that the entire Scary Fast event was shot on the iPhone 15 Pro Max has led to a lot of discussion and debate about what that really means, and whether or not it’s a big deal.

There are those suggesting it really does mean the camera is something special, while others suggest that it’s at best meaningless, and at worst a cheat …

The Verge’s Jess Weatherbed was one of the naysayers.

Still images and a video reveal that (unsurprisingly) a great deal of fancy equipment — from drones, gimbals, dollies, industrial set lighting, and other recording accessories — is still required to make iPhone footage look this good […]

It’s a neat way to promote the recording quality of iPhone cameras, but it’s not like everyday folks can recreate these kinds of results at home unless they happen to own a shedload of ludicrously expensive equipment. The gear shown in the “Scary Fast” behind-the-scenes footage is fairly standard for big studio productions, but Apple’s implication with these so-called “shot on iPhone” promotions is that anyone can do it if only they buy the newest iPhone.

Daring Fireball’s John Gruber took the opposite side of the debate.

I saw a few folks mocking Apple for this on Mastodon and Threads, too. This is ridiculous. Do these people think that previous Apple keynote films were shot with just a single camera person wielding something like a $40K RED cinema camera and no crew, no lighting, no cranes? That the iPhone “needs help” that traditional cinema cameras do not? I mean, guess what, they used professional microphones too.

The whole point is that an iPhone 15 Pro camera is so good that it can fit right in on a high-budget commercial film shoot, and produce world-class results. There’s no implication that a casual user can get results like this by just hitting the shutter button in the iPhone Camera app.

Personally, I fall somewhere between the two views.

There’s one cheat Apple didn’t use

First, there is one cheat that is often used on ‘Shot on iPhone’ video footage, and that’s the use of high-end lenses.

You can buy adapters that let you mount DSLR or even full-on cinema lenses on an iPhone. The light still has to pass through the built-in lens, of course, but external glass lets you achieve things the built-in lenses simply can’t, like anamorphic perspectives and really shallow depth of field (DoF).

Since Apple showed us behind-the-scenes footage, and we saw no sign of external lenses, we can be reasonably confident that the company didn’t use that particular cheat.

But set control is a very good workaround

Very often, you want shallow DoF because there are distractions in the background. When you or I are filming on the street, or in a public area, there are other people around, trash cans, litter, all kinds of distracting elements.

A big-budget production, in contrast, has complete control of the set. You can either build the background you need, or simply take complete control of it.

Clear out all the people, ensure there’s nothing untidy to clutter up the scene, add a bit of dry ice, and voila: a setting in which everything being in focus is just fine.

Lighting is everything

When I introduced a friend to studio photography, he was amazed that camera settings were set-and-forget. Manual mode, ISO 100, f/11, 1/125th, done.

All the control is achieved with the lights. High-key, low-key, upbeat, moody, colorful, monochromatic, background visible, background invisible – lighting is the route to achieving the exact look you want.

The same is true with video. But more than this, studio or cinema lighting enables you to overcome weaknesses in a sensor. Almost any modern camera phone does just fine in bright light; it’s in low-light conditions that we see the difference between a high-end sensor and a mediocre one. Throw in the kind of lighting rigs we see in the BTS video and it’s no surprise that the iPhone sensor can cope.

Now, you might argue that this was a Halloween theme, and a lot of it was pretty dimly lit – and that’s true. But the presenters are very well-lit indeed, and it’s them we’re looking at. If we turn our attention to the low-lit background, that has nothing like the same clarity or sharpness. Those areas are as muddy and noisy as we’d expect from low-light footage shot on a smartphone sensor – and bear in mind this is the result after professional editors have done their very best to clean it up.

Camera angles, movement, and transitions do a lot

The other factor to bear in mind is that a lot of heavy lifting was done by all the rest of the kit we got to see in the BTS videos. Dollies, jibs, gimbals, drones, you name it.

Plus some absolutely superb editing, with Apple’s trademark transitions.

The result was a lot of camera movement, a lot of angles, and a lot of fast-paced transitions to capture our attention, so we don’t have much time to take in the wider framing and notice the noise and muddiness in the background areas.

Is Apple pulling a fast one?

The debate essentially comes down to whether Apple is trying to give the impression that the iPhone 15 Pro Max camera is so good that anyone can get these types of results.

There are those who say: No, of course not. The very fact that Apple chose to share the BTS video acknowledges that you need all the extra kit to get these results. You’d use exactly the same additional kit if you were shooting with Arri movie cameras.


That much is true. But … you could take your Alexa Mini into an ordinary room, lit by window light alone, without all those fancy lighting rigs, and still get very good results. The results from the iPhone wouldn’t even come close. And it’s that latter situation where most of us would find ourselves shooting.

So while Apple does acknowledge that it used movie-grade kit, at the same time the implied message is ‘See? The iPhone can get similar results to movie cameras’ – when that is, in most circumstances, very much not the case.

You decide

Ultimately, this comes down to opinion. What Apple did was real, and used exactly the same kit it has used with cinema cameras in the past. At the same time, those are the least-challenging conditions for any camera – and there are a lot of weaknesses in the images that simply flashed past unnoticed.

Is it a cheat? Let us know your view in the comments.
