
The 7 most egregious fibs Apple told about the iPhone XS camera today – TechCrunch



Apple always brings a bit of hyperbole to its events, and today's iPhone XS announcement was no exception. Nowhere was it more blatant than in the introduction of the devices' "new" camera features. Nobody doubts that iPhones take great pictures, so why bother stretching the truth? My hypothesis is that they can't help themselves.

To be clear, I have no doubt they have made updates that make a good camera better. But whatever those improvements are, they were obscured today by breathless hype that was frequently questionable and occasionally just plain wrong. Admittedly, to fill out this article I had to get a bit pedantic, but honestly, some of these are quite striking.

"The most famous camera in the world"

There are a lot of iPhones out there, to be sure. But defining the iPhone as a single camera spanning a decade, which Apple seems to be doing, is a dubious way to count. By that standard Samsung would almost certainly come out ahead, since you could lump all its Galaxy phones together over the same decade, and they have definitely outsold Apple in that period. Going further, if you argued that a standard camera stack and a common Sony or Samsung sensor constituted a "camera," the iPhone would probably be outnumbered 10 to 1 by Android phones.

Is the iPhone one of the most popular cameras in the world? Sure. Is it the most popular camera in the world? You'd have to slice things pretty thin and say that in this or that year, this or that model outsold any other single model. The point is that this is a very squishy metric, and plenty of companies could claim it depending on how they pick and interpret the numbers. As usual, Apple didn't show its work here, so we may as well coin a term and call this an educated bluff.

"A new extraordinary double camera system"

As Phil would later explain, much of the novelty comes from improvements to the sensor and the image processor. But since he called the system new, backed by an exploded view of the camera hardware, we may as well treat it as a claim about the hardware too.

It isn't really clear what in the hardware is actually different from the iPhone X. If you look at the specifications, they're nearly identical:

If I told you these were different cameras, would you believe me? Same f-numbers, no reason to think the image stabilization is different or better, and so on. It would not be unreasonable to assume that, optically speaking, these are the same cameras as before. Again, not that there was anything wrong with them: they're fabulous optics. But showing components that appear to be identical and calling them different is misleading.

Given Apple's style, if there were any real changes to the lenses or OIS, they would have said something. It isn't trivial to improve those things, and they would take credit if they had.

The sensor is of course extremely important, and it is improved: the 1.4-micrometer pixel pitch on the wide-angle main camera is larger than the 1.22-micrometer pitch on the X. Since the megapixel counts are similar, we can probably conclude that the "bigger" sensor is a consequence of this different pixel pitch, not of any real change in form factor. It is certainly larger, but the larger pixel pitch, which helps with sensitivity, is what has actually improved; the increased dimensions are just a consequence of that.
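A quick back-of-the-envelope calculation makes the point. This is just a sketch, assuming both sensors are roughly 4032 x 3024 (the usual 12-megapixel resolution, not a figure Apple quoted); the dimensions then follow directly from the pitch:

```python
# Rough sanity check: at a fixed 12 MP resolution (assumed 4032 x 3024 here),
# a larger pixel pitch implies a proportionally larger sensor, all else equal.
width_px, height_px = 4032, 3024  # assumed sensor resolution, not an Apple spec

for name, pitch_um in [("iPhone X", 1.22), ("iPhone XS", 1.4)]:
    width_mm = width_px * pitch_um / 1000
    height_mm = height_px * pitch_um / 1000
    print(f"{name}: ~{width_mm:.2f} x {height_mm:.2f} mm "
          f"({width_mm * height_mm:.1f} mm^2)")
```

Under those assumptions the XS sensor area comes out roughly 30 percent larger, purely as a consequence of the wider pitch.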

"2x faster sensor … for better image quality"

It isn't at all clear what is meant by this, beyond being there "to exploit all this technology." Is it the readout rate? Is the processor faster, since that is probably what would produce better image quality (more horsepower to compute colors, better encoding, and so on)? "Fast" can also refer to light gathering: is that what's faster?

I don't think it's an accident that this was simply thrown out there unspecified. Apple likes big, simple numbers and doesn't want to play the spec game the way the others do. But in my opinion this crosses the line from simplification into misleading. This, at least, is something Apple or some detailed third-party testing can clarify.

"What is completely new is to connect ISPs with that neural engine, to use them together."

Now, this was a bit of sleight of hand on Phil's part. Presumably the novelty is that Apple has better integrated the image-processing pipeline between the traditional image signal processor, which does the workhorse stuff like autofocus and color, and the "neural engine," which does the face detection.

It may be new for Apple, but this sort of thing has been standard in many cameras for years. Both phones and interchangeable-lens systems such as DSLRs use face and eye detection, some of it based on neural models, to guide autofocus or exposure. This (and the problems stemming from it) goes back years and years. I remember point-and-shoots that had it but unfortunately couldn't detect people with dark skin, or people who were frowning.

It has gotten much better (Apple's depth-detection hardware probably helps a lot), but the idea of hooking a face-tracking system, whatever fancy name you give it, into the image-capture process is old hat. Whatever is in the XS may be the best version yet, but it probably isn't "completely new" even for Apple, let alone the rest of photography.
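For anyone curious what "connecting" face detection to the capture pipeline means in practice, here is a minimal sketch of the general idea, using OpenCV's stock face detector as a stand-in for a "neural engine." It is emphatically not Apple's pipeline, just the decades-old pattern:

```python
# Minimal sketch of the general idea (not Apple's pipeline): a face detector
# feeds the camera's auto-exposure and autofocus, so metering and focus are
# weighted toward faces instead of the frame center.
import cv2  # OpenCV's bundled Haar cascade stands in for a "neural engine"

def face_weighted_roi(frame_bgr):
    """Return the region of interest (x, y, w, h) that AE/AF should prioritize."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        # Meter and focus on the largest detected face.
        return max(faces, key=lambda f: f[2] * f[3])
    # Fall back to a center-weighted region when no face is found.
    h, w = gray.shape
    return (w // 4, h // 4, w // 2, h // 2)
```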

"We have a brand new feature that we call smart HDR."

Apple's new feature has existed on Google's Pixel phones for a while. Many cameras now keep a frame buffer running, essentially taking photos in the background while the app is open, then using the most recent ones when you press the button. And Google, among others, had the idea of using these invisible frames as raw material for an HDR shot.
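Here is a toy sketch of that general approach, under loose assumptions; it is a simplification of the idea, not Apple's or Google's actual pipeline. Frames accumulate in a rolling buffer while the viewfinder is open, and pressing the shutter merges what is already there rather than triggering a fresh capture:

```python
# Toy sketch of the general approach (not Apple's or Google's actual pipeline):
# keep a rolling buffer of recent frames, and on shutter press merge them into
# a single result with more usable dynamic range and less noise.
from collections import deque
import numpy as np

BUFFER_SIZE = 8
frame_buffer = deque(maxlen=BUFFER_SIZE)  # filled continuously while the app is open

def on_new_frame(frame: np.ndarray, exposure_time: float) -> None:
    """Called for every preview frame; nothing is saved to disk yet."""
    frame_buffer.append((frame.astype(np.float32), exposure_time))

def on_shutter_press() -> np.ndarray:
    """Merge the buffered frames instead of taking a single fresh exposure."""
    # Normalize each frame by its exposure so bright and dark captures line up,
    # then average to suppress noise and recover shadow/highlight detail.
    radiance = [frame / exposure for frame, exposure in frame_buffer]
    merged = np.mean(radiance, axis=0)
    return (np.clip(merged / merged.max(), 0, 1) * 255).astype(np.uint8)
```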

Apple's method is probably different, and it may even be better, but fundamentally it's the same thing. Again, "brand new" for iPhone users, but well established among flagship Android devices.

"This is what you should not do, right, take a picture in the sun, because you're about to turn off the exposure."

I'm not saying you should shoot directly into the sun, but it's not unusual to include the sun in a shot. Tucked into a corner like this, for example, it can produce some nice flares. And these days it won't blow out the exposure, because almost every camera's auto-exposure algorithm is center-weighted, or smart enough to shift toward faces, for example.

When the sun is in your shot, the problem isn't blown-out highlights so much as a lack of dynamic range, caused by the big difference between the exposure needed for the sunlit background and the foreground in shadow. This is, as Phil says, one of the best applications of HDR: a well-balanced exposure can make sure you keep the shadow detail as well as the highlights.

Oddly, in the picture he chose here, the shadow detail is mostly lost; you just see a bit of noise there. And you don't need HDR to freeze those drops of water; that's a shutter-speed thing, really. It's still a nice shot, I just don't think it's an example of what Phil is talking about.

"You can adjust the depth of field … this was not possible in photography of any type of camera."

This simply isn't true. You can already do it on the Galaxy S9, and it's rolling out in Google Photos as well. Lytro was doing something similar years and years ago, if we're counting "any type of camera." Will Apple's version be better? Probably; it looks great. Has it never been possible before? Not even close. I feel a little bad that nobody told Phil. He's out there without the facts.
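For context, here is a toy sketch of how post-capture depth-of-field adjustment generally works when a depth map is stored alongside the photo. It is not Apple's algorithm, and the function names and parameters are illustrative; the basic idea is just to blur each pixel according to its distance from the chosen focal plane:

```python
# Toy sketch of the general idea (not Apple's algorithm): with a depth map
# saved next to the photo, the "aperture" can be re-simulated after the fact
# by blurring each pixel in proportion to its distance from the focal plane.
import numpy as np
from scipy.ndimage import gaussian_filter

def refocus(image: np.ndarray, depth: np.ndarray,
            focus_depth: float, aperture: float) -> np.ndarray:
    """image: HxWx3 float array; depth: HxW array in the same units as focus_depth.
    aperture: larger values simulate a wider aperture (shallower depth of field)."""
    # Precompute a small stack of progressively blurred versions of the image.
    blur_levels = [gaussian_filter(image, sigma=(s, s, 0)) for s in range(6)]
    # Pick a blur level per pixel based on its distance from the focal plane.
    blur_index = np.clip((np.abs(depth - focus_depth) * aperture).astype(int), 0, 5)
    out = np.empty_like(image)
    for level, blurred in enumerate(blur_levels):
        mask = blur_index == level
        out[mask] = blurred[mask]
    return out
```

Dragging the aperture slider after the shot just re-runs something like this with a new focus_depth or aperture value; nothing about the capture itself changes.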

Well, those are the biggest ones. There were plenty of other, shall we say, embellishments at the event, but that's par for the course at any big company's launch. I just felt these couldn't go unanswered. I have nothing against the iPhone camera; I use one myself. But boy, they're going overboard with these claims. Somebody has to say it, since clearly nobody inside Apple will.

Check out the rest of our Apple event coverage here:

iPhone Event 2018 coverage

