iOS 26.0.1 cameras are blurry on iPhone 17 Pro Max and older iPhones.

I have been in contact with Apple senior support for over 3 days about the camera quality. When I take photos of a book page (non-macro, normal 1x), only the area where the camera focuses is good quality; the rest of the text, the edges of the photo, and the content of the book are blurry and smudgy. I have only owned my iPhone 17 Pro Max for 5 days, and my six-year-old iPhone 11 Pro Max on iOS 18.5 outperforms today's iPhone, with much sharper photos and better clarity across the whole frame.

Not even senior support could tell me whether it was a hardware or a software issue. They booked me in at an official Apple reseller, and even there they "couldn't see a problem" with the iPhone 17 Pro Max, though it was pretty obvious. Then I tried the cameras on the demo iPhones at both the official Apple reseller and a retail store, and they all produced the same photos I got. Even the employees' older iPhones (15 Pro Max and 13 Pro Max) had the same issue once they were upgraded to iOS 26. So please either fix the issue as soon as possible, or recall the devices and refund them.


The Apple senior support team also tried tweaking some camera settings while I screen-shared my iPhone 17 Pro Max with them, and they told me to factory reset the phone. Absolutely nothing changed.


I can provide photo evidence of how blurry the images are at the edges and how smudgy the text is. I would really like confirmation of whether the camera on the new 2025 device is supposed to be this bad, so that I can return it, as it didn't meet my expectations at all.

Posted on Oct 6, 2025 8:41 AM

Question marked as Top-ranking reply

Posted on Oct 6, 2025 2:18 PM

Hi, this is a very simple issue involving a concept called depth of field (DoF). DoF can be expressed as a mathematical formula. Rather than post a lot of math, you can see the formula and how it works here:


https://en.wikipedia.org/wiki/Depth_of_field


The concept, though, is fairly simple. There is a zone of acceptable sharpness that extends both behind and in front of the exact point you focus on. The farther objects are from that exact point of focus, the blurrier they become.
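The zone described above follows directly from the DoF formula in the linked article. Here is a minimal Python sketch of the standard thin-lens approximation; the focal length, f-number, and circle of confusion below are illustrative assumptions roughly resembling a phone main camera, not Apple's published specs.

```python
# Illustrative depth-of-field calculation using the standard thin-lens
# approximation. All lens parameters are assumptions for the sake of example.

def dof_limits(f_mm: float, n: float, c_mm: float, s_mm: float):
    """Return (near_limit, far_limit) of acceptable focus in millimetres.

    f_mm: focal length, n: f-number, c_mm: circle of confusion,
    s_mm: focus distance.
    """
    h = f_mm ** 2 / (n * c_mm) + f_mm  # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float("inf")
    return near, far

# Hypothetical values: 6.9 mm lens at f/1.8, 0.003 mm circle of confusion,
# focused on a book page 300 mm away.
near, far = dof_limits(6.9, 1.8, 0.003, 300)
print(f"sharp zone: {near:.0f} mm to {far:.0f} mm (~{far - near:.0f} mm deep)")
```

With these assumed numbers, the sharp zone when focused at 30 cm is only about 2 cm deep, which is why a page curving down toward the spine can drift visibly out of focus.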


You’re attempting to photograph a flat page in a book, but it’s not perfectly flat. Do you see the dip in the center of the book, along the spine? You’ll achieve better results using a ½″ or ⅝″ polished plate-glass sheet to hold the book flatter. Not perfect, but better than what you’re currently doing.


Why did the DoF change between older cameras and newer cameras? Apple made a design decision to improve the camera and the resulting images for the average photographer. One of the changes was a larger sensor and a different lens design for the 24mm (1x) lens. With the new parameters, the DoF became narrower (fewer objects in focus both in front of and behind the point of focus). This is why newer models produce images with less DoF when focused close. The newer models do produce better, sharper images with the 24mm lens for the shots the average photographer takes, such as portraits, seascapes, sunsets, and sports.


Your options are to return the iPhone and purchase an older model, purchase a camera better suited for flat reproduction work, or modify your current technique and equipment for better results.



72 replies



Nov 9, 2025 4:44 PM in response to PlsFixMyProblem

I have shot professionally, understand depth of field, know how to set up focus test charts, and have done so. The problem is not depth of field, which, as has been noted in this thread, is enormous given the small sensor size. The issue is that the lens of the main camera of the 17 Pro (Max, in my case) is *very* soft as you approach the edges, and the images become "smudgy." The softness kicks in much closer to the center than is normal even for full-size camera lenses. The text reproduction examples that have been posted here illustrate it quite well.


The softness of the main camera lens has been well documented, for example in the Lux review. This softness actually endows the landscape and travel photos showcased in that review with a dreamy quality, so if that's your use case you may be pleased with it. It approximates the rendering of a vintage lens. Portraits will also have a certain softness to them, which can often be desirable. However, clinical applications, like book scanning or business document processing, are going to suffer. The main camera of my 17 Pro Max is virtually useless for this; I get identical results to the flat page examples that have been posted above. Even photographing a restaurant dish is tricky, as the outer parts of the plate are rendered consistently soft.


There are a few workarounds that will result in acceptable quality for applications that require critical sharpness:


  • Get close and use the ultra-wide camera, which is sharp edge to edge and will produce more than acceptable images for business or scanning purposes.
  • Use the 2x setting. This produces 12 MP images from a center crop of the main camera, which is its sharpest area. There may still be some softness at the edges of the image.
  • Step back far enough to use the 4x camera, which is slightly soft, but not nearly as much as the main camera.


This is different from the example posted by @patrick_photography. That one is a well-documented glitch with the iPhone's computational wizardry, also discussed in the Lux review (look for the photo with the caption "Processing makes curious mistakes at times").


That's where it's at. Presumably the computational issues will get solved at some point. The lens softness in the main camera will not go away, so you have to decide whether you can live with it or not.


Examples:


Main camera, full photo. Note the smudginess in the text closer to the edges.


100% crop of the above. Smudgy text:


100% crop of same photo taken with ultra-wide-camera. No smudginess:



[Edited by Moderator]

Oct 6, 2025 4:18 PM in response to PlsFixMyProblem

Your two sample photos illustrate part of the principles of DoF. You photographed two flat objects, and the flat objects are consistently sharp from edge to center.


Your original two subjects were not flat, and they illustrated how DoF affects objects that are not in the plane of focus.


The last two images do not show a DoF issue. Photos taken under identical conditions, using a tripod to eliminate camera motion, illustrate that one lens is sharper than the other. It’s a matter of lens design, sensor design, and choice of materials. At the given distance, the iPhone 11 Pro Max is sharper than the 17 Pro Max.


But software can’t significantly alter the sharpness of a lens. You would need to put a different lens on the 17 Pro Max to achieve more resolution and sharper images at that distance.


So again, your options are to return the iPhone 17 Pro Max and purchase an older model, purchase a camera better suited for flat reproduction work, or modify your current technique and equipment for better results. Unfortunately, improving technique alone will not achieve the results you’re hoping for.



Oct 27, 2025 4:55 PM in response to Senator48

As someone who’s been involved with iPhones since 2007 and photography in general for almost 50 years, don’t expect much change from Apple. Every year a small segment of the iPhone camera community voices its concerns about the over-processing of images. I’ve witnessed this since about the iPhone 7.


The solution is simple, and it won’t come from Apple. Instead, look to third-party camera apps and their developers. Apps such as Halide, Reeflex, Leica LUX, and ProCamera will capture minimally processed RAW files with no computational photography. The ones I mentioned all charge a subscription fee or a high lifetime purchase fee. However, there is one free app, FotoGear, that captures RAW files with little to no processing. But if Apple hasn’t changed in almost 10 years, I wouldn’t hold out for them to start with the iPhone 17 Pro models.


If you want to switch, you might consider some of the Chinese models or the Google Pixel 10 models. The Pixel does computational photography, but most photographers feel it does less than Apple or Samsung. Overall, the consensus is that Samsung does the most computational photography, and it’s notorious for digital hallucinations.

Oct 24, 2025 11:06 PM in response to PlsFixMyProblem

Hi all, I'm new to this forum after looking for a thread about iPhone 17 pro camera focus problems.


I take thousands of photos of buildings to use in Reality Scan to generate 3D models. I switched to using the iPhone 11 for this mostly because all the photos would be in focus, not something guaranteed with a traditional camera.


I recently upgraded from the iPhone 14 Pro to the iPhone 17 Pro for the three 48mp cameras and am now having severe focus issues with up to 30% of the photos out of focus.


After extensive testing (and more to be done) I think Jeff Donald may have the right answer but for all the wrong reasons!


If you look at this photo on Flickr:


https://www.flickr.com/gp/padraiccollins/0d9yu37473


This image shows severe depth-of-field blur (with absolutely nothing tack sharp), and it is completely software-generated. I didn't ask it to do this, and in a series of photos taken while walking around the building, some are completely in focus and some are completely out of focus. Indeed, the center of the "false" depth of field can be anywhere on the focal plane, to the point that it sometimes looks like a rolling-shutter effect: half the image in focus and the rest out of focus, unrelated to the depth of the subject.


This is computational photography at its very worst. Software bug, or software feature?


I believe at the heart of this is the "Fusion Focus" system interfering with my photography, and I think this is what PlsFixMyProblem is also seeing. I get the feeling the software is using all the sensors (motion, LiDAR, all three camera sensors, etc.). That might be great in theory, but it is currently out of control.


Today, under the Camera app settings, I switched the "Fusion Camera" option to "24mm only," and I am hoping this solves some of the issues, but an "off" option might be even better.


It would be useful to have a technical explanation of exactly what is going on "under the hood" for this and other features (prioritise faster shooting?) so we can take back control of the cameras. The camera focus on the iPhone used to be sharp and reliable in conditions beyond those of a traditional camera.


For Jeff to suggest the depth of field is hard-wired and one should go back to an older iPhone is strange. The iPhone's depth of field is completely computational and should present little challenge to give users some control. Melding multiple images from different sensors when one just wants a sharp image is not the way to go!




[Edited by Moderator]

Oct 26, 2025 6:17 AM in response to Jeff Donald

Jeff, I'm sort of lost for words.


"Depth of Field will be governed by Laws of Physics and not Apple Engineers."


As Google will tell you, due to the iPhone's smaller sensor, its effective depth of field is much larger than that of an equivalent shot on a large-sensor camera. The depth of field shown in the image I posted is impossible under the laws of physics on an iPhone; it is completely generated by software.
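The larger-DoF claim can be sanity-checked with the usual equivalence rule of thumb: multiplying a lens's f-number by the sensor's crop factor gives the full-frame aperture that renders roughly the same depth of field. The crop factor below is an assumed, illustrative value for a phone main sensor, not a measured one.

```python
# Rule-of-thumb DoF equivalence: a lens at f-number N on a sensor with crop
# factor X gives roughly the depth of field of f/(N * X) on full frame.
# Both values below are hypothetical, for illustration only.

def equivalent_aperture(f_number: float, crop_factor: float) -> float:
    """Full-frame f-number with approximately the same depth of field."""
    return f_number * crop_factor

eq = equivalent_aperture(1.8, 7.0)  # assumed f/1.8 lens, ~7x crop factor
print(f"full-frame DoF equivalent: f/{eq:.1f}")
```

An effective aperture in the f/12 range yields very deep optical depth of field, which is why strong background blur on a phone is almost always synthesized in software rather than produced by the lens.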


I've taken tens of thousands of photos with various iPhones. My focusing issues are not related to inexperienced use of the camera. Using the same process and techniques, an iPhone 14 Pro produces 99% sharp images, and this has dropped to under 70% on the iPhone 17 Pro, depending on the colour of the subject.


This isn't about holding the camera steady. Describing a studio process for taking sharp images is not a real-world situation, where there are a number of changing variables.


The thing is, I don't want depth of field. I want a sharp image across the whole photo. If I'm ten feet from a building, why is the iPhone presenting the image as if it were a product shot from a studio?


It's not about debating the merits of computational photography; something is "broken" in the iPhone 17 Pro's focus system and how it then computes the image. The thousands of photographs I've taken in the last two days suggest the Fusion Focus system is getting confused, particularly if the subject matter is predominantly red.


Nothing to do with an old and shaky grip.


Nov 10, 2025 12:43 PM in response to Mauricio Drelichman

Two further considerations:


1) When capturing 48 megapixel images, it is very likely that the sensor resolution is exceeding the resolving power of the lens. This happens with big fat DSLR lenses mounted on full-frame 50 MP camera bodies; no wonder it is an issue with puny phone lenses.


2) When shooting at 48-megapixel resolution, camera shake will be *extremely* noticeable. Again, we get this all the time when shooting on pro bodies with the finest glass available. High-resolution sensors are unforgiving in showing every imperfection. Because of the physics of trembling human hands (essentially a random shake around a central point), camera shake is more noticeable along the edges than in the center of an image.
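Consideration 1) can be put in rough numbers: a sensor's Nyquist limit is 1/(2 x pixel pitch), while an ideal lens's diffraction cutoff is about 1/(wavelength x f-number), both in line pairs per mm. The pixel pitch and f-number here are assumptions for illustration, not published specs.

```python
# Rough arithmetic behind point 1): compare the finest detail the sensor can
# record with the hard limit of an ideal (diffraction-limited) lens.
# Pixel pitch and f-number are illustrative assumptions.

pixel_pitch_mm = 0.0011   # assumed 1.1 micron pixels on a 48 MP phone sensor
f_number = 1.8            # assumed aperture
wavelength_mm = 0.00055   # green light, 550 nm

nyquist = 1 / (2 * pixel_pitch_mm)                   # sensor's Nyquist limit
diffraction_cutoff = 1 / (wavelength_mm * f_number)  # ideal-lens hard limit

print(f"sensor Nyquist:     {nyquist:.0f} lp/mm")
print(f"diffraction cutoff: {diffraction_cutoff:.0f} lp/mm")
```

Even though the assumed sensor's roughly 455 lp/mm demand sits below the ideal diffraction cutoff, real lenses with field aberrations fall far short of the ideal toward the corners, so visible edge softness at 48 MP is unsurprising.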


Actions you can take:


• If 48 megapixel resolution is not necessary, shoot 24 megapixel images. This will reduce visible smudging when viewed at 100% magnification.

• When shooting with the main camera at close range, try to increase available light in order to increase shutter speeds and minimize camera shake. I have had much better results scanning books / documents when under a floodlight, indicating that shutter speed matters.

Oct 12, 2025 11:12 AM in response to PlsFixMyProblem

I have to correct my previous post. Though I'm still not convinced about the quality of the camera sensor (at least in low-light conditions), I found a simple and reasonable explanation for the blur. If you look at the file size of a 12 MP picture, it is around 1.2-1.4 MB, which is roughly 30-40% more compression than the typical 2.0-2.1 MB under previous iOS versions. Higher compression means more discarded data and more interpolation between data points, which causes the blur effect. Play with the settings, select the different RAW modes, and you'll see the difference. Also, keep pushing Apple to reduce the compression. ;)

Nov 9, 2025 6:17 PM in response to Mauricio Drelichman

Let’s look at what the Lux blog has to say about the 1X lens (24mm equivalent).


What's beginning to get very old is its lack of close focusing. Its new sibling camera in iPhone Air focuses a whole 5 cm (that's basically 2 inches) closer, and it's very noticeable. For most users, arms-length photography is an extremely common use case: think objects you hold, a dish of food or an iced matcha, your pet; you probably take photos at this distance every day. And if you do, you'll have encountered your iPhone switching, at times rapidly, between the ultra wide 'macro' lens and the regular main camera — one of which produces nice natural bokeh and has far higher image quality. It's been several years of this now, and it's time to call it out as a serious user experience annoyance that I hope can be fixed in the future. This is, incidentally, one of the reasons why our app Halide does not auto-switch lenses.


No criticism of the sharpness or resolution of the lens, but rather of its inability to focus close. Yes, older models' 1x lenses focus closer and produce sharper images. This is the issue that started this thread. The use of the fusion camera for macro and near-macro has anomalies. Your photographs are an example.


Nowhere in the article does the author criticize the 1x lens. In fact, this is what is said about the 1x lens.


”I find the focal lengths ideal for day-to-day use and the main camera especially is sharp and responsive. Its image quality isn't getting old (yet).”


The main lens is the sharpest of the three, and no one disputes that. Its issue is how close it focuses. This contributes to the fusion camera having to rely heavily on the ultra-wide-angle 0.5x lens.


Where are you getting that the main lens is soft? If it's the smudgy images, I suspect you were too close, inside the minimum focus distance. How far were you from the subject?


Dec 4, 2025 8:50 AM in response to freddy2013

I got the replacement device and can confirm my issue of blurry pictures has been resolved!

There must have been some issue with the hardware OIS or focusing mechanism.


I can also confirm what others have stated about soft edges. The image for me is now sharp in the center and does blur out toward the edges of the frame in all directions, vertical and horizontal, which is probably intentional, a result of the software processing and DoF. This, to me, is not a problem.


From past experience with real cameras, mainly Panasonic Lumix: when I ran DoF chart tests in the past, the in-focus area spanned the entire horizontal width of the frame at a given distance. I don't recall softness toward the outer edges of the frame.


So if you recently purchased and want to go down the route I went: you will have to call Apple, and they will ship you a replacement and charge you a $29 order-processing fee. They will also place a hold on your credit card for the full amount, and the hold should be released once you return the other device.

Oct 8, 2025 1:55 AM in response to PlsFixMyProblem

Hi, hello. I'm from Germany, and sorry for my bad English.


Perhaps you could try taking a photo of a book page not with the normal Camera app, but with the document scanner instead. You'll find this tool under Files / three dots at top right / Scan Documents.


This tool uses different camera settings and is ideal for book pages and documents. You'll need 2 or 3 attempts to get the alignment right, but then it should be fine, in my opinion.


Best regards from Germany

Dec 2, 2025 7:44 AM in response to PlsFixMyProblem

After a full reset/reinstall and full system diagnostics performed by an Apple Genius Bar tech, the scans returned nothing wrong; only the actual picture-taking showed the problem. They offered to replace the camera module, since they too were able to replicate the blurry photos in the back room. I declined having a brand-new $1,700 phone cracked open. I called Apple customer service and they are sending me a replacement phone; I will come back here and update. I recently went on a trip to Disney and 90% of the pictures are blurry.
