Send it to Tim Cook's email. It worked for me for fixing a DisplayPort DSC bug. After Catalina, later macOS releases lost the ability to drive monitors at higher than 60Hz refresh rates.
Apple support tortured me with all kinds of diagnostics, then marked it WontFix a few weeks later. I wrote the email and it got fixed in Sonoma :)
Fucking with DP 1.4 was how they managed to drive the ProDisplay XDR.
If your monitor could downgrade to DP 1.2, you got better refresh rates than on 1.4 (mine could do 95Hz SDR and 60Hz HDR over 1.4, but if my monitor said it could only do 1.2, that went to 120/95 on Big Sur and above; on Catalina it could do 144Hz HDR).
I would be absolutely unsurprised if their fix was to lie to the monitor during negotiation (if it was non-Apple) and say the GPU only supported 1.2, and I would also be unsurprised to learn that this is related to the current issue.
Ahh, true, I now have 120Hz tops, but it's fine, which is why I said fixed :) I now recall that on Catalina I had the full 144Hz and VRR options! The monitor is a Dell G3223Q via a CalDigit TS4 DP.
I was using two 27" LG 27GM950-Bs (IIRC) that could do up to 165Hz and VRR on a 2019 cheesegrater Mac Pro; it wasn't the cables, or the monitors, or the card.
People at the time were trying to figure out the math of "How did Apple manage to make 6K HDR work over that bandwidth?" and the answer was simply "by completely fucking the DP 1.4 DSC spec" (it was broken in Big Sur, which was released at the same time). The ProDisplay XDR worked great (for added irony, I ended up with one about a year later), but at the cost of Apple saying "we don't care how much money you've spent on your display hardware if you didn't spend it with us" (which tracks perfectly with, I think, Craig Federighi spending so much time and effort shooting down iMessage on Android and RCS for a long time saying, quote, "It would remove obstacles towards iPhone families being able to give their kids Android phones").
I thought I was going crazy when my new M4 seemed "fuzzier" on my external 4Ks. I tried replicating settings from my old MacBook to no avail.
I wonder if Apple is doing this on purpose except for their own displays.
It's a bit nit-picky on my part, but this bizarre world of macOS resolution/scaling handling vs. other operating systems (including Windows 11, for crying out loud) is one of my biggest gripes with using Apple hardware.
I remember having to work hard to make my non-Apple display look 'right' years ago on an Intel-based Mac due to weirdness with scaling and resolutions that a Windows laptop didn't even flinch at handling. It was a mix of hardware limitations and a lack of options to address the resolutions and refresh rates I had available over a Thunderbolt dock, things I shouldn't have to think about.
I honestly hope they finally fix this. I would love it if they allowed sub-pixel text rendering options again too.
Props to the author for putting in what looks like a ton of work trying to navigate this issue; it's a shame they have to go to these lengths to even have their case considered.
I went to hell and back trying to get PIP/PBP monitors on my 57" G9 ultrawide to work with my M2 Pro. I ended up having to use a powered HDMI dongle, a DisplayLink cable, and DisplayPort, with 3 virtual monitors via BetterDisplay. Allowing resolutions outside of macOS's limitations (a setting in BD) is what did the trick. I don't envy OP. Having 5120x1440 at LoDPI was the worst, just ever so slightly too fuzzy but a perfect UI size, but I eventually got a steady 10240x2880 @ 120Hz with HDR. I literally laughed out loud when I read the title of the thread. Poor guy.
Thanks, it was a good portion of my weekend bashing my head against the keyboard trying to figure out what was going on and if there was a workaround I could use (there isn't that I've found).
Sadly I have the issue on a new M5 Air. I have a 60Hz 4K work monitor and two high-refresh 4K gaming displays. The 60Hz one pairs fine with either gaming monitor, but with the two gaming ones together, one just doesn't get recognized. I spent way too long trying new cables before realizing it's a bandwidth limitation.
I use a 4K 32'' Asus ProArt monitor and didn't notice any difference between my M2 Pro and my M4 Pro. I will admit my eyesight is not the best anymore but I think I would notice given I'm a bit allergic to blurry monitors.
Anyway I will run the diagnostic commands and see what I get.
I'm sure you've already given this a crack via some other technique (I just Cmd-F for it and didn't find) but I have had monitors with confusing EDIDs before that MacOS didn't handle well and the "screenresolution" CLI app https://github.com/jhford/screenresolution always let me set an arbitrary one. It was the only way to get some monitors to display at 100 Hz for me and worked very well for that since the resolution is mostly sticky.
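If anyone wants to script it, here's a rough sketch of how you might drive that tool. Caveat: the `WIDTHxHEIGHTxDEPTH@HZ` mode-string format here is an assumption based on how I remember the tool working, so verify the exact syntax against `screenresolution get` on your own machine first:

```python
import shutil
import subprocess

def set_mode_cmd(width: int, height: int, depth: int = 32, hz: int = 100) -> list[str]:
    # Mode-string format is assumed; check `screenresolution get` output
    # for the exact syntax your version of the tool expects.
    return ["screenresolution", "set", f"{width}x{height}x{depth}@{hz}"]

cmd = set_mode_cmd(2560, 1440, hz=100)
print(" ".join(cmd))  # screenresolution set 2560x1440x32@100

# Only actually run it on a Mac that has the tool installed:
if shutil.which("screenresolution"):
    subprocess.run(cmd, check=True)
```

Handy to stick in a login script since, as noted, the resolution is mostly sticky anyway.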
Thanks for the feedback. I'll try to take some photos; it's not an easy thing to do accurately without a good camera setup, but I'll reply here after work if I get something set up and added to the post.
These are the ideal work/coding resolutions and sizes for macOS that I would suggest, if you are going down this rabbit hole:
24 inch 1080p
24 inch 4k (2x scaling)
27 inch 1440p
27 inch 5k (2x scaling)
32 inch 6k (2x scaling)
Other sizes are going to either look bizarre or you’ll have to deal with fractional scaling.
Given that 4k is common in 27/32 inches and those are cheap displays, these kinds of problems are expected. In the past I personally refused to accept that 27 inch 4k is as bad as people say, got one myself, and regretted buying it. Get the correct size and scaling and your life will be peaceful.
I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions.
A 32" 4k display at a fractional scaling of 1.5 (150%) is fine for my day-to-day work (Excel, VS Code, Word, web browsing, Teams, etc.). It delivers sharp enough text at an effective resolution of 2560x1440 px. There are many 32" 4k displays that are affordable and good enough for office workers. I work in a brightly lit room, so I find that monitor brightness (over 350 nits) is the most important monitor feature for me, over text sharpness, color accuracy, or refresh rate.
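To put numbers on that (just arithmetic, assuming the stated 32" diagonal):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from a panel's pixel dimensions and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

physical = ppi(3840, 2160, 32)    # the panel's real pixel density
effective = ppi(2560, 1440, 32)   # apparent UI density at 150% scaling

print(round(physical))   # ~138 ppi
print(round(effective))  # ~92 ppi, i.e. UI elements sized like a 32" 1440p panel
```

So at 150% you get UI sized like plain 1440p, drawn with roughly 1.5x the linear pixel density, which is why it reads as "sharp enough" rather than Retina-sharp.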
For me, 16-27" at 4k is fine, but as you go up to 32" I'd be wanting 5k or 6k ideally, as the difference is quite noticeable for text (even when HiDPI scaling is working, and across operating systems).
This is not a normal retina configuration. This is a highly unusual configuration where the framebuffer is much larger than the screen resolution and gets scaled down. Obviously it sucks if it used to work and now it doesn't but almost no one wants this which probably explains why Apple doesn't care.
In my case it's a standard LG UltraFine 4K monitor plugged into a standard 16" M5 MacBook Pro via standard Thunderbolt (over USB-C). Not sure what's not normal about this? I've confirmed it with other monitors and M5 MacBook Pros as well.
This is what us proles on third-party monitors have to do to make text look halfway decent. My LG DualUps (~140ppi if I recall) run at 4x of a scaled resolution to arrive at roughly what would be pixel-doubled 109ppi, which is the only pixel density the UI looks halfway decent at.
It's also why, when you put your Mac into a "More Space" resolution on the built-in or first-party displays, it warns you this could hurt performance: that's exactly what the OS does to give you more space without making text unreadable aliased fuzz. It renders the "apparent" resolution pixel-doubled and scales it down, which provides a modicum of sub-pixel anti-aliasing's effect.
But that's at some 85% of 2880x2560. I have a 4K portable display on a monitor arm I run at, as you suggest, 1080p at 2x. But if you have a 4K display and want to use all 4K, you have the crappy choice between making everything look terrible, or wasting 2x the GPU cycles and memory on rendering a 4x framebuffer and scaling it down.
To be frank, it's kind of embarrassing if an entry-level Windows laptop with a decent integrated GPU handles this without much effort.
Apple is free to make its own choices on priority, but I'm disappointed when something that's considered the pinnacle of creative platforms sporting one of the most advanced consumer processors available can't handle a slightly different resolution.
I don't know why this was downvoted; I agree that this is a highly unusual configuration. Why render to a frame buffer with 2x the pixels in each direction vs. the actual display, only to then scale the whole thing down by 2x in each direction?
Yeah. I don't get it. If you've got a 3840x2160 display, intended use on macOS as a 1920x1080@2x display, what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?
(I use my M4 Mac with 4K displays, and 5120x2880 (2560x1440@2x) buffers. That sort of thing does work, though if you sit closer than I do then you can see the non-integer scaling. Last time I tried a 3840x2160 buffer (1920x1080@2x), that worked. I am still on macOS Sequoia though.)
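For anyone wondering what those buffer sizes actually cost, the back-of-the-envelope arithmetic (assuming a 4-bytes-per-pixel framebuffer) looks like this:

```python
PANEL = (3840, 2160)  # physical 4K panel

def backing_buffer(looks_like: tuple[int, int]):
    """macOS renders HiDPI at 2x the 'looks like' size, then downscales to the panel."""
    bw, bh = looks_like[0] * 2, looks_like[1] * 2
    scale = bw / PANEL[0]         # buffer width relative to panel width
    mib = bw * bh * 4 / 2**20     # one full-screen 32-bit framebuffer
    return (bw, bh), scale, mib

for looks_like in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    buf, scale, mib = backing_buffer(looks_like)
    print(f"{looks_like} -> {buf}, {scale:.2f}x panel, {mib:.0f} MiB")

# (1920, 1080) -> (3840, 2160), 1.00x panel, 32 MiB   (maps 1:1, no downscale)
# (2560, 1440) -> (5120, 2880), 1.33x panel, 56 MiB   (non-integer downscale)
# (3840, 2160) -> (7680, 4320), 2.00x panel, 127 MiB  (4x the pixels of 1:1)
```

The 1.33x case is where the "sit close and you can see it" softness comes from; the 2.00x case is the configuration being debated, where the buffer is four times the pixel count of a 1:1 mode.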
> what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?
Text rendering looks noticeably better rendered at 2x and scaled down. Apple's 1x font antialiasing is not ideal.
Especially in Catalyst/SwiftUI apps that often don't bother to align drawing to round points, Apple's HiDPI downscaling has some magic in it that their regular text rendering doesn't.
Yes but Apple got to drop subpixel anti-aliasing support because this workaround is "good enough" for all of their built-in displays and overpriced mediocre external ones, so we all get to suffer having to render 4x the pixels than we need.
Yes, I would actually be surprised to learn that mode is available on any system. I’ve never seen that anywhere, though I only have a M1 Pro and an M4 Pro (and various Intel Macs).
You’re rendering to a framebuffer exactly 2x the size of your display and then scaling it down by exactly half to the physical display? Why not just use a 1x mode then!? The 1.75x limit of framebuffer to physical screen size makes perfect sense. Any more than that and you should just use the 1x mode, it will look better and perform way better!
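If that 1.75x cap is real (it's the limit described above, not something I've verified against any Apple documentation), you can sketch which "looks like" modes a 4K panel would keep under it:

```python
PANEL_W = 3840  # 4K panel width

# Some common "looks like" widths a HiDPI mode picker might offer on a 4K panel
candidates = [1280, 1920, 2560, 3008, 3360, 3840]

for w in candidates:
    ratio = (w * 2) / PANEL_W    # 2x backing-buffer width vs. panel width
    verdict = "offered" if ratio <= 1.75 else "dropped"
    print(f"{w}@2x  buffer/panel = {ratio:.2f}  {verdict}")

# Only 3840@2x (the 7680-wide buffer) exceeds 1.75, at exactly 2.00
```

Under that rule, every fractional mode survives and only the exact-2x-of-panel mode gets cut, which matches the behavior people in this thread are describing.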
Yeah, I'm not sure what the point of this article is, really, or maybe I'm misunderstanding something? There's no such thing as 4K HiDPI on a 4K monitor; that would be 2160p @ 2x on an 8K monitor. 4K at 100% scaling looks terrible in general across every OS.
Well, it sounds like a real issue, but the diagnosis is AI slop. You can see, for example, how it takes the paragraph quoted from waydabber (attributing the issue to dynamic resource allocation) and expands it into a whole section without really understanding it. The section is in fact self-contradictory: it first claims that the DCP firmware implements framebuffer allocation, then almost immediately goes on to say it's actually the GPU driver and "the DCP itself is not the bottleneck". Similar confusion throughout the rest of the post.
TFA doesn't say -- does anyone know if this applies to 5k and 6k monitors? On my 5k display on a M4 Max, I see the default resolution in system settings is 2560x1440.
If the theory about framebuffer pre-allocation strategy is to hold any water, I would think that 5k and 6k devices would suffer too, maybe even more. Given that you can attach 2x 5k monitors, the pre-allocation strategy as described would need to account for that.
I believe it will; it won't be until you push up to an 8k display that you'll get the old level of scaling back (could be wrong though, as I don't have a way to test this).
And prior to Apple’s re-entry into the display market, everybody internally was likely on 2x HiDPI LG UltraFine displays or integrated displays on iMacs and MacBooks.
Fractional scaling (and lately, even 1x scaling “normal”) displays really are not much of a consideration for them, even if they’re popular. 2x+ integer scaling HiDPI is the main target.
- 24" you need 4K.
- 27" you need 5K.
- 32" you need 6K.
Windows subpixel anti-aliasing (ClearType) manages a lot better with lower pixel density. Since Windows still has a commanding market share in enterprise, you might be right about the industry standard for HiDPI, but for Apple-specific usage, not really.
Totally agree with those resolution suggestions. Personally I have a 32" 4k; I wanted a 5k or 6k back then (just too expensive), but now I wish I had just got a 27", which is better suited to 4k. Regardless, it was a LOT better on the M2 Max with HiDPI working.
This still baffles me. Never mind Windows; I can get sub-pixel font rendering with the ability to fine-tune it on virtually any major Linux distro since around 2010.
Meanwhile, Apple had this but dropped it in 2018, allegedly under the assumption of "HiDPI everywhere" Retina or Retina-like displays. Which would be great... except "everywhere" turned out to be "very specific monitors supporting specific resolutions".
Tbh I'm not even sure what the issue is here. I have a personal M1 MacBook, a work M4, and a 4K display. I don't see any issues or differences between them on my display; the M4 seems to be outputting a 4K image just fine.
The article could just be AI slop, since it just contains hyper-in-depth debugging without articulating what the problem is.
Right, I just went through all of the scale options on my M4 with a 4K monitor and none of them rendered blurry. Might be a very situational bug; it doesn't seem as widespread as the title makes it out to be.
https://egpu.io/forums/mac-setup/4k144hz-no-longer-available...
I have a 32:9 Ultrawide I would love to use on macOS but the text looks awful on it.
They've got a good thing going, but they keep finding ways to alienate people.
As an article, it is not 100% coherent, but there is valid data and a real problem that is clear.
The article doesn't mention it.
They’re likely all on Studio Displays.
https://bjango.com/articles/macexternaldisplays/
Tim Apple's Apple has been fu#$%&ing me again...