Atkinson Dithering (2021)

(beyondloom.com)

236 points | by jdblair 337 days ago

21 comments

  • mungoman2 337 days ago
    To avoid changing the overall brightness, like in the examples, it is important to work on linearized values, or to adjust the running error accordingly. It's not correct to work on the encoded sRGB values of the image directly. This is a very common mistake in blog articles about DIY image filtering.
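
    Roughly, in Python (a sketch using the standard sRGB transfer curves; the I/O and the diffusion loop itself are left out):

      import numpy as np

      def srgb_to_linear(s):
          # Decode sRGB-encoded values in [0, 1] to linear light.
          return np.where(s <= 0.04045, s / 12.92, ((s + 0.055) / 1.055) ** 2.4)

      def linear_to_srgb(l):
          # Encode linear-light values in [0, 1] back to sRGB.
          return np.where(l <= 0.0031308, 12.92 * l, 1.055 * l ** (1 / 2.4) - 0.055)

      # Dither in linear light, not on the raw bytes:
      #   gray = srgb_to_linear(pixels / 255.0)
      #   ...run the error-diffusion loop on gray...
      #   out = linear_to_srgb(dithered)
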
    • virtualritz 337 days ago
      Not working in linearized space is a common error in pretty much any new OSS graphics project done by people whose background is not in computer graphics that I came across in the last decade.

      I think in the old days you got CG know-how from a few books, or you went to comp.graphics.algorithms, where more experienced people would gladly explain stuff to you.

      Today people watch YT videos and read blog posts produced by people who also lack these basics.

      It's like an error that gets accumulated. But no dithering will help diffuse it. ;)

      • leni536 337 days ago
        > Not working in linearized space is a common error in pretty much any new OSS graphics project done by people whose background is not in computer graphics that I came across in the last decade.

        This still happens in mature software as well, including contemporary web browsers.

        Just open this image in your favorite browser and zoom out:

        http://www.ericbrasseur.org/gamma-1.0-or-2.2.png

      • ghusbands 337 days ago
        I assure you that people made the same mistake plenty in the old days, too. Most programmers who come across an array of values assume they know what the values mean, and that stands true across time. Many paint packages and other programs still get many transforms/effects wrong due to it, and in fact some people seem to prefer some effects when not linearized.
        • bandrami 337 days ago
          > Most programmers who come across an array of values assume they know what the values mean

          Oh yeah, let me add from the audio synthesis world that this disease is prevalent here too

      • crazygringo 337 days ago
        In my experience it's been the exact opposite -- back in the "old days" most programmers simply didn't know about linearized space, which is why even Adobe Photoshop does plenty of incorrect calculations that way. And because there wasn't any internet, there was nobody to tell you otherwise.

        These days you can at least find references to it when you look things up.

        • GuB-42 336 days ago
          I can confirm: when I started computer graphics, I had absolutely no idea about linear space. I never had a formal education though; mostly random tutorials, demoscene stuff, things like that.

          I think one of the reasons is that in the "old days", in many cases, performance mattered more than correctness. Models were, overall, very wrong, but they were fast and gave recognizable results, which was more than enough. And working in gamma space as if it were linear saved time and wasn't that bad. That gamma space somehow matched CRT monitors' response curve was an added bonus (one less operation to do).

          But things have changed: with modern, ridiculously powerful GPUs, people are not content with just recognizable shapes; we want some degree of realism and physical correctness. Messing up color space in the age of HDR is not acceptable, especially considering that gamma correction is now considered a trivial operation.

        • dahart 336 days ago
          Not knowing about linear space means that people were using linear by default, right? That’s what I would assume. Early games and all the graphics I was exposed to up through college all used linear RGB, but just didn’t call it that, and of course RGB isn’t a real color space anyway. Most people didn’t know about gamma correction, and otherwise almost nobody converted into non-linear spaces or tried to differentiate RGB from something else. Color geeks at Pixar and Cornell and other places were working with non-linear colors, but I guess most people writing code to display pixels in the 70s & 80s weren’t thinking color spaces at all, they just plugged in some RGB values.
          • Etherlord87 335 days ago
            According to Wikipedia, sRGB is a standard created in 1996, so yeah, it just wasn't used earlier. However, at the end of the millennium you could create software that opens an image file saved in sRGB and unknowingly apply some algorithm, like dithering, without converting it to linear space first.
            • dahart 335 days ago
              There was gamma correction and other perceptually uniform-ish color spaces before 1996 and before sRGB. I was taught about the CIE set of color spaces xyY/XYZ/LAB/LUV in school and used them to write renderers before I’d ever heard of sRGB. And yes, exactly right: before they know better, a lot of people will grab something non-linear and start doing linear math on it accidentally. It’s still one of the most common color mistakes to this day, I think, but it was definitely more common before sRGB. People sometimes forget basic alpha-blend compositing needs linearized colors, so it’s a common cause of fringe artifacts. Things have gotten much better though, and quickly. A lot of game studios 20 years ago didn’t have much in the way of color management, and it’s ubiquitous now.
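
              To make the alpha-blend point concrete, a toy one-channel example (a sketch, not any particular library's API):

                decode = lambda s: s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4
                encode = lambda l: 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

                # 50% blend of white (1.0) over black (0.0) on one channel:
                naive = 0.5 * 1.0 + 0.5 * 0.0                          # 0.5, blended in encoded space
                right = encode(0.5 * decode(1.0) + 0.5 * decode(0.0))  # ~0.735, blended in linear light

                # The encoded-space average comes out too dark, which is
                # exactly the fringe artifact at anti-aliased edges.
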
        • virtualritz 336 days ago
          That is software targeting Mac and Windows. Adobe has been notoriously inept at getting color right, except for print.

          Already in the old days there was Digital Fusion (now integrated as 'Fusion' into DaVinci Resolve, I think it was e.g. used on "Independence Day") and Wavefront Composer (SGI/Irix, later ported to Windows NT but I may misremember).

          Also depends where "the old days" start. I got into CG around 1994 and then "the bible" was "Computer Graphics: Principles and Practice" by Foley et al.

          And the aforementioned newsgroup, and also comp.graphics.rendering(.renderman)

          Software that was written in VFX facilities and then became OSS didn't suffer from this as most color computations happened in f32/float, not u8/char per-channel and colors were expected to be input linearly.

          Often the DCC apps didn't do the de-gamma though. So there was an issue at the user interface layer.

          But in the early 2000s the problem was understood by most people working professionally in CGI for the big screen, and all studios I worked at had proper color pipelines, some more sophisticated than others.

          As far as OSS 3D renderers go, there were Aqsis and Pixie.

          Krita was linear from the beginning, AFAIR. I.e. I recall using it for look development/texture paint on "Hellboy II" -- that was 2007 though.

      • zamadatix 337 days ago
        Since there are plenty of knowledgeable folks here (you included) I'll pitch a naive question in hopes of learning some more:

        Beyond efficiency, is there any reason to avoid bringing everything into "some wide gamut linear space using doubles to represent each channel" for the computations and then converting back to the desired color space for any final output or export? Are there other or alternative things you can do to meaningfully increase the final quality, too or instead?

        • dahart 336 days ago
          Most film production already does close to what you describe, they convert to linear in order to do editing/rendering/grading/compositing, and then convert to the desired output color space. One place to start learning about film color handling is ACES: https://en.wikipedia.org/wiki/Academy_Color_Encoding_System

          Outside of stylistic choices, I think the only technical reasons to use fewer bits are space & bandwidth efficiency, and meeting the requirements of a given output device or file format.

          There are reasons to avoid doubles, just because they’re so big. 64 bits is unnecessary and wasteful for almost all color handling. Doubles are slow on most GPUs, where a lot of image processing has moved. 16 bits per channel is usually way more than enough for basic capture & display, especially if the output color range matches the input color range, i.e. little to no editing needed. (That ACES page says “Most ACES compliant image files are encoded in 16-bit half-floats, thus allowing ACES OpenEXR files to encode 30 stops of scene information.”) Even 32 bit floats is vast overkill for most things, but offers a much wider safety net in terms of using very small or very large ranges, and reduces the possibility of clipping error or visible degradation from quantization and rounding error when converting multiple times.

          Note while a lot of cameras offer things like HDR outputs and high bit rate RAW, even the best photo cameras in the world are getting around 8 effective bits per channel signal-to-noise ratio. (I’m getting this from the SNR measurements on dxomark.com) 8 bits per channel also happens to be close to the limits of human perception, when handled carefully, i.e., not linear but a more perceptually uniform color space.

        • nullc 336 days ago
          In doing so you may introduce banding artifacts by destroying the existing dithering in smooth areas of images.

          You could re-dither the output, but the required amount of dither to eliminate banding artifacts is great enough to be obvious and often annoying.

        • jiggawatts 336 days ago
          Doubles are way overkill. Using something like 16 bit integers per channel is adequate even for HDR.
        • adgjlsfhk1 337 days ago
          Nope! The easiest way to do this is to always load images as linearized Float32 color.
      • weinzierl 336 days ago
        "I think in the old days you got CG know how from a few books or you went to comp.graphics.algrithms where more experienced people would gladly explain stuff to you."

        I think you also could hardly avoid being annoyed if you got it wrong, because the dynamic range of display devices was much smaller.

      • nullc 336 days ago
        Eh, because of the viewer's contrast sensitivity function, scaling in linear space can give worse-looking results. I'd say there are four reasons that processing is so often done in a non-linear space:

        1. Since our 'raw' formats are non-linear, processing in that space is what happens when you don't know otherwise.

        2. It's much more computationally efficient to not convert into linear and back out again.

        3. Given the low bit depth of our image data, going in and out of linear space and doing even minimal processing can easily produce banding artifacts in smooth areas (see the sketch at the end of this comment).

        4. Due to the human CSF, scaling in e.g. sRGB can give results that preserve the apparent structure in the image better, while a linear scale can look bad by comparison. sRGB levels also more closely match perceived levels, so thresholds based on sRGB ratios will work more consistently across brightness levels.

        I'm sure plenty of people have seen internet comments about linear processing, went and implemented it, found the results looked worse for reasons they didn't understand, and abandoned it (and plenty of others who didn't notice it looked worse and crapped up their code without knowing it. :) )
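
        As a quick sketch of point 3 (assuming 8-bit data and the standard sRGB curves):

          import numpy as np

          def srgb_to_linear(s):
              return np.where(s <= 0.04045, s / 12.92, ((s + 0.055) / 1.055) ** 2.4)

          def linear_to_srgb(l):
              return np.where(l <= 0.0031308, 12.92 * l, 1.055 * l ** (1 / 2.4) - 0.055)

          codes = np.arange(256) / 255.0
          # Round-trip through an 8-bit *linear* buffer and back:
          lin8 = np.round(srgb_to_linear(codes) * 255) / 255
          back = np.round(linear_to_srgb(lin8) * 255)
          print(len(np.unique(back)))  # well under 256: dark codes collapse, i.e. banding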

    • pornel 337 days ago
      With a caveat: when dithering in color, you need both linear RGB for diffusion/color blending and a perceptual color space for the palette search. Distance in sRGB is more closely aligned with perception of color than distance in linearized RGB.
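
      A sketch of that split (illustrative names; a real implementation might use a proper ΔE in a Lab-like space rather than raw sRGB distance):

        def nearest_index(pixel_srgb, palette_srgb):
            # Palette search in encoded sRGB, as a cheap perceptual proxy.
            return min(range(len(palette_srgb)),
                       key=lambda i: sum((a - b) ** 2
                                         for a, b in zip(pixel_srgb, palette_srgb[i])))

        # Inside the diffusion loop (pseudocode, names are illustrative):
        #   want = original_linear[y][x] + error[y][x]
        #   i = nearest_index(linear_to_srgb(want), palette_srgb)
        #   output[y][x] = palette_linear[i]
        #   diffuse(want - palette_linear[i])  # the error stays in linear RGB
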
    • douchescript 336 days ago
      Please explain what you mean; Atkinson dither loses 1/4 of the error to gain contrast and detail, rendering the top whites washed out. What's your recommended formula for converting the image to gray before dithering?
  • JKCalhoun 337 days ago
    I fell in love with the original Macintosh display. It was so crisp, the black and white pixels so beautiful (especially if you had come off the typical computer hooked up to a TV). Combine that with the beautiful dithering and I almost shunned color when it came along.

    (I believe I've read that Atkinson's dithering didn't come about until scanner software was needed for the Macintosh and so he wrote his own dithering code.)

    • marssaxman 337 days ago
      I have fond memories of the charming pixelated art featured in the classic Macintosh game "Glider", written by someone with an oddly familiar name...

      That 512x342 monochrome world was really kind of special. I used to spend hours carefully tweaking every pixel in my little 32x32 program icons, trying to make them fit the overall aesthetic.

      • JKCalhoun 337 days ago
        Thanks, and yeah, pixels mattered then.

        When I made Glider in color for the first commercial release, I used only the 16 colors of the Macintosh palette - I guess to keep the performance up, memory footprint down.

        There was a good deal of experimenting with hand-dithering to get more muted colors — like maybe a checkerboard pattern of brown & grey to get a mustier-looking wood.

        • LocalH 337 days ago
          I remember the original splash screen ;)
      • jrussino 337 days ago
        Wow I had totally forgotten about this game! Glider (I think I may have played "Glider PRO") was so much fun. I found this youtube video of the game and the title screen music just gave me a wave of nostalgia:

        https://www.youtube.com/watch?v=V4QP76Om7sQ

    • kccqzy 337 days ago
      I fell in love with the display of Playdate (https://play.date/), a handheld gaming device that very much inherits the ethos of the original Macintosh display. I'm not even a gamer and I hardly play games and yet I love that screen.
  • WoodenChair 337 days ago
    This was the article that inspired me to study the MacPaint format. I ended up writing first a Python program that can do Atkinson dithering of a modern image and then save it as a MacPaint file for display on a retro Mac. That code is a chapter in my upcoming book. I then added 9 more 1-bit dithering algorithms and turned it into a Swift/AppKit app so it's easy to use if you want to Atkinson dither your own images and transfer them over to your retro Macs:

    https://apps.apple.com/us/app/retro-dither-b-w-is-beautiful/...

  • garaetjjte 337 days ago
    Ditherpunk - The article I wish I had about monochrome image dithering: https://surma.dev/things/ditherpunk/
    • m12k 337 days ago
      Also, here's an example of another 3D dithering technique, inspired by Obra Dinn: https://mastodon.gamedev.place/@runevision/11050883717334359...
    • rikroots 337 days ago
      I still have that Ditherpunk article bookmarked! I leaned on its insights heavily when I was building the dither (reduce palette) filter[1] for my canvas library.

      "Return of the Obra Dinn" looks fantastic. I keep meaning to purchase/play that game but the intention keeps slipping down my ToDo list while I'm distracted.

      [1] Demo on CodePen (will ask to use your device's camera) - https://codepen.io/kaliedarik/pen/OJOaOZz

    • Rant423 336 days ago
      THIS is the best resource on dithering I've read (except maybe some gritty details in forum posts by the author of Obra Dinn)
    • DonHopkins 337 days ago
      Here's an example of iterative error diffusion dithering, procedural circuit bending, and mis-using Micropolis (Open Source SimCity) tiles to render cellular automata (dithered heat diffusion, and Eco (Anneal + Life + Brian's Brain)):

      Micropolis Web Space Inventory Cellular Automata Music 1

      https://youtu.be/BBVyCpmVQew?t=291

      Micropolis Web is the browser based version of Micropolis (open source SimCity), that uses WebAssembly, WebGL, and SvelteKit. Based on the original SimCity Classic code, designed by Will Wright, ported by Don Hopkins. This first video has music by Juho Hietala, Blamstrain, and the Space Inventory Cellular Automata is performed by Don Hopkins.

      https://MicropolisWeb.com (tap the "space" bar a few times, even though it warns you not to)

      The error diffusion dithering is most noticeable when there's not a lot of heat change in the system (strong enough heating or cooling rotates the tiles towards the extreme, then they wrap around, producing chaotic boiling lava lamp blobs).

      Without the error diffusion dithering, the heat diffusion produces much more geometric less organic patterns (sharp diagonal triangular crystals that melt away quickly, instead of fuzzy dithered curvy gradients that smoothly organically diffuse and stay around for a long time).

      Strictly speaking it's not actually a "cellular automaton" because of the error diffusion: information travels further than one cell locally each frame -- the leftover energy can "quantum tunnel" in the direction of scanning (serpentine left/right / right/left) into cells downstream arbitrarily far away. So when you draw in one part of the image, the dither fingers in all parts of the image wiggle in response. A true cellular automaton has no "action at a distance" and the flow of information respects the "speed of light" (one cell or so per frame, depending on the radius of the neighborhood).

      https://en.wikipedia.org/wiki/Circuit_bending

      > Circuit bending is the creative, chance-based customization of the circuits within electronic devices such as low-voltage, battery-powered guitar effects, children's toys and digital synthesizers to create new musical or visual instruments and sound generators.

      > Emphasizing spontaneity and randomness, the techniques of circuit bending have been commonly associated with noise music, though many more conventional contemporary musicians and musical groups have been known to experiment with "bent" instruments. Circuit bending usually involves dismantling the machine and adding components such as switches and potentiometers that alter the circuit.

  • NelsonMinar 337 days ago
    These days dithering is a lost technique other than as an arty effect. It's a shame, digital video playback would really benefit from it as a way to eliminate posterization.
    • dahart 337 days ago
      No, not entirely! Dithering is still very important for print media and for displays. I’m talking about dithering from 16 or 32 bits per channel, e.g. floating point and/or HDR, down to 8 bits per channel.

      The dithering techniques like Floyd-Steinberg and Atkinson that take 8 or 4 bits per channel down to 1 bit per channel are definitely anachronistic, but not dithering where the goal is 8 bits per channel.

      I’ve made very expensive mistakes printing poster-sized art without using dithering, and you can end up with visible color-banding in gradients in the print that isn’t visible on a display. This is why we still need and still have dithering. This is the reason that Photoshop quietly dithers by default when you convert from 16 or 32 bits per channel down to 8 bits per channel.

      • btbuildem 337 days ago
        I've definitely seen color banding on video playback across multiple streaming services. Dithering would've been super helpful in resolving that.
        • quietbritishjim 337 days ago
          If there wasn't the bandwidth to include full colour depth information then there definitely wasn't the bandwidth for the high-frequency pixel data for dithering.
          • Sesse__ 337 days ago
              You can dither the decoded data to the output display, though (assuming you're either in a float codec, or 10-bit or higher). It won't solve all problems, but it's not completely unheard of either (e.g. I believe JPEG-XL does it on encode).
          • dahart 337 days ago
            Lossy compression is definitely anti-dither, that’s true. I think a lot of the banding you see in streaming is actually bad color handling and/or miscalibrated devices… very dark colors are especially prone to banding and it’s often due to TVs being turned up brighter than they should be, and sometimes due to compression algorithms ignoring color space & gamma.
            • adgjlsfhk1 337 days ago
              What you generally want, rather than lossy compression on top of dithering, is to use lossy compression of higher bit depth data; then the decoder can add the dithering to represent the extra bits.
              • dahart 336 days ago
                Sure, of course, technically, but that circles back to @quietbritishjim’s point: if you have the bandwidth for, say, 8 bits per channel, then just display it without dithering. Dithering would reduce the quality. If you don’t have the bandwidth for 8 bits per channel, then dithering won’t help, it’s already too low and dithering will make it lower. In other words, dithering always reduces the color resolution of the signal, so when a compression constraint is involved, it’s pretty difficult to find a scenario where dithering makes sense to use. This is why dithering isn’t used for streaming, it’s mainly used to meet the input color resolution constraints of various media or devices, or for stylistic reasons.
    • prewett 337 days ago
      Well, it's more niche, but it's still necessary. For instance, if you write a printer driver you'll need a good dithering algorithm, since you only get pigment on/off (even in CMYK). In fact, for multi-channel dither you do not want a normal error-diffusion algorithm, since it does not give good results for low values of a channel (for instance, rgb(255, 255, 2)).

      There appears to be active research on new dithering techniques. I ran across libdither [1], which implements more dithering algorithms than you can imagine.

      [1] https://github.com/robertkist/libdither

    • SaberTail 337 days ago
      Also e-ink displays. For most of the ebooks I've read recently, the images end up posterized to the point of near illegibility.
      • ddingus 337 days ago
        Definitely! The better ones have really high resolution, which complements this dither technique perfectly.
    • zamadatix 337 days ago
      Dithering is usually applied as a pre- or post-processing step around encoding/decoding, because accurately encoding the dither itself actually takes more bits than just encoding without banding would in the first place. Nowadays the easiest way to sidestep banding is to use >8 bit encoding and a codec with good psycho-visual enhancements & perceptual quantization.
      • Sharlin 337 days ago
        Of course, as a post-processing step, yes. Dithering is not uncommon in games to avoid posterization of gradients (which can happen even with 24-bit colors without any extra quantization), so it would be pretty natural for media players to do it as well. And doing it in the player would mean older videos without fancy 10-bit PQ stuff would benefit as well.
        • tempoponet 337 days ago
          I'd be curious to know more about how this is done in games.

          I have noticed that if you get up close to the surface of a car in a lot of modern racing games it has a very noisy/sparkly shader. I know some paint does look like this, but I always suspected this was to prevent banding by creating sub-pixel noise.

    • dvh 337 days ago
      Chrome uses dithering for canvas gradients, or any other operations really. Firefox looks much worse without it.
    • semireg 337 days ago
      It’s still used every day in low-DPI label printing where dithering and pixel alignment are very important.
      • Tommix11 337 days ago
        Dithering is also very important in the retro computer demo scene. I've seen some outrageously great dithering in C64 demo art.
        • makapuf 337 days ago
          Also everywhere you want smooth gradients, in any application. Search for gradient rendering without banding, for example: dithering will almost certainly be mentioned.
    • Etherlord87 335 days ago
      In Blender, dithering is enabled by default. First of all, it's not exactly true that 8 bits per channel make banding unnoticeable - it's just very hard to notice, and depends on the range of colors (darker/brighter, reddish/bluish...). Also, if you dither the gradient in your image and the image is further processed, the dither will help avoid banding after processing. Of course if you want to process an image, don't save it in just 8 bits per channel - but sometimes you don't know the image will be processed, or choosing the format is not up to you.
    • 01HNNWZ0MV43FF 337 days ago
      I assume those 10- and 12-bit codecs do it on playback; it's just that free video services bit-starve the encoders.
      • adgjlsfhk1 337 days ago
        Especially with AV1, 10 bit color can lead to a cleaner image with fewer bits than 8 bit. The quantization tables mean that you don't have to store the 10 bit info for the high frequency parts, but just having it available makes things like I frames work better (because randomly shifting blocky output is a worse predictor than the underlying smooth signal).
    • doublesocket 337 days ago
      We've very recently been testing multiple dithering techniques for graphics we send as bitmaps to a TIJ printer that's used as a dynamic labeller.
    • SomeoneFromCA 336 days ago
      dwm_lut, a widely used tool, uses dithering.
  • lukko 337 days ago
    Thanks for sharing this.

    I once used Floyd-Steinberg dithering to make 3D voxel prints from brain MRI scans [0]. You just convert the scan to full white and black values to represent different inks, and it means you don't have to do any segmentation and can represent fine structures much more accurately.

    May be interesting to try with Atkinson dithering too, although the loss of detail may be an issue.

    [0] https://www.lukehale.com/voxelprinting/

  • feverzsj 337 days ago
    Error diffusion is hard to parallelize. It's also not stable, meaning a moving pixel may look different from frame to frame. But it usually gives the best results for static images.
  • londons_explore 337 days ago
    I would like to see the ideas of dithering applied to voting systems.

    Ie. Imagine a country with hundreds of elected officials, each of which represents a town or city. Each official is part of a party.

    A dithering-like system could be used during the vote so that the country as a whole is fairly represented, and most towns also are represented by who the majority of their population wants.

    It would work by, during an election, whenever a candidate is chosen for a location, transferring any votes for other parties to neighbouring towns and cities. That is done repeatedly until every town's seat is filled and nearly every voter's vote has an impact (if not in their local town, then it gets to help a candidate of the same party nearbyish).

    • crazygringo 337 days ago
      That's basically proportional representation with added steps:

      https://en.wikipedia.org/wiki/Proportional_representation

      It's extremely common in Europe, and there are a lot of different precise methods for it. But the point is exactly what you're describing -- every vote has an impact.

      I've always been baffled that not only has the idea never taken off in the US, virtually nobody except political scientists seems to be even aware of it.

    • garaetjjte 337 days ago
      Just sum party votes and calculate the number of seats for each party at the national level, then assign seats to each party based on regional results.
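
      For example, with the largest-remainder method (a sketch; fittingly, it's a one-dimensional cousin of error diffusion, with the fractional remainders deciding the leftover seats):

        def apportion(votes, seats):
            # Largest-remainder apportionment of seats at the national level.
            total = sum(votes.values())
            quota = {p: v * seats / total for p, v in votes.items()}
            alloc = {p: int(q) for p, q in quota.items()}
            by_remainder = sorted(quota, key=lambda p: quota[p] - alloc[p], reverse=True)
            for p in by_remainder[:seats - sum(alloc.values())]:
                alloc[p] += 1
            return alloc

        print(apportion({"A": 48000, "B": 31000, "C": 21000}, 10))
        # {'A': 5, 'B': 3, 'C': 2}
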
      • pacaro 337 days ago
        Or cut to the chase and use sortition, which will provide statistically correct representation without even needing a vote.
    • dahart 337 days ago
      This is what (for example) congress and congressional districts are supposed to help with, but gerrymandering prevents it from working as intended. I’m kinda guessing that adding a dithering algorithm might only change the gerrymandering without fixing the problem, but I’m certainly curious as to what that might look like and whether there are ways to prevent gerrymandering, with or without vote dithering. I’m guessing it would be extremely difficult to get anything with randomness in it passed as law.

      First we need to get “one person, one vote” to be the actual goal (https://en.wikipedia.org/wiki/One_man,_one_vote). In the US, the electoral college was specifically designed to not have one person, one vote as the primary goal, and we haven’t been able to change it yet. For presidential elections, we don’t really need dithering so much as simple majority winner, plus run-off vote counting.

      Maybe run-off voting already is a type of vote dithering?

      • londons_explore 336 days ago
        > difficult to get anything with randomness in it passed as law.

        You don't need randomness. Dithering looks random, but is fully deterministic.

        It does suffer from the 'butterfly effect' - a few extra votes in one place can change the assignment of a lot of nearby seats.

        • dahart 336 days ago
          Some dithering algorithms are deterministic, not all of them. The simplest & most basic dithering you can do is random, and that works quite well when targeting 8 bits per channel.

          The reason I brought it up is that deterministic dithering can definitely be gamed by gerrymandering, so if you want to avoid that you might need to introduce randomness, but then you will have an even harder time getting people to buy in than with a deterministic algorithm.

  • tiffanyh 337 days ago
    Am I the only person who thinks Floyd-Steinberg dithering is superior in clarity and detail?

    The Atkinson dithering makes the image appear overexposed/blown-out (not true to the original image).

    • TapamN 337 days ago
      The point of Atkinson is that it's much faster than Floyd-Steinberg on old CPUs. Floyd-Steinberg requires multiplying most of the error terms by a small number (3, 5, 7) and doing a right shift by 4; Atkinson just does a right shift by 3. On the original Macintosh's Motorola 68000, I could see Atkinson being more than twice as fast.
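
      For reference, the per-pixel arithmetic of the two kernels in integer form (a sketch; err is the quantization error at the current pixel, tuples are (dx, dy, amount)):

        err = 100  # example quantization error at the current pixel

        # Floyd-Steinberg: weights 7/16, 3/16, 5/16, 1/16 -- multiplies plus shifts
        fs = [( 1, 0, err * 7 >> 4), (-1, 1, err * 3 >> 4),
              ( 0, 1, err * 5 >> 4), ( 1, 1, err     >> 4)]

        # Atkinson: six neighbors each get 1/8 -- one shift, reused
        # (2/8 of the error is simply dropped, which lifts the contrast)
        e = err >> 3
        atkinson = [(1, 0, e), (2, 0, e), (-1, 1, e),
                    (0, 1, e), (1, 1, e), (0, 2, e)]
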
      • crazygringo 336 days ago
        Was that actually the goal, or was it just a happy side effect? (I don't know the history.)

        Was a 2x speedup for dithering even important at the time, especially if it involved a sacrifice of quality? It's not like dithering images was something people did regularly, in the first couple generations of Macs.

        You'd dither a few images for a game or something you were building, that you were lucky to get from a scanner. It was a pretty rare thing. Speed wasn't really relevant, as far as I remember.

        • TapamN 336 days ago
          I could have sworn I read on folklore.org that it was for speed, but I'm not finding it. You do have a point. Maybe Atkinson thought his dithering was more elegant without the multiplies or preferred how it looked on his test images.

          I only used a B&W Mac a few times, but I do remember Windows 3.1 doing on-the-fly ordered dithering when running with palettized color (and being very surprised at NOT seeing the dithering on the blue gradient of a setup program once I started using high color modes). Windows 1.0 apparently was capable of doing it as well.

          • crazygringo 336 days ago
            > Windows 3.1 doing on-the-fly ordered dithering when running with palettized color

            Do you have a source for that? That's very much the opposite of what I remember. If you had 16 colors or even 256 colors, I don't remember anything in the Windows UX being dithered. Like I don't think you could pass an RGB color to GDI to draw a light pink line and it would dither it for you.

            The only dithering I remember was indeed the background of blue gradients in Setup, and I always assumed that was custom code. After all, it's not like GDI had commands for gradients either.

      • mordae 336 days ago
        It's also a lot less local.
    • pacaro 337 days ago
      As another comment mentioned, the dither is being done naively w.r.t. the color space. Handling RGB (or gray) values as linear values is usually wrong.
    • quietbritishjim 337 days ago
      I think Atkinson might have the edge if you were looking at it on a blurry CRT instead of a modern LCD/LED screen.
    • douchescript 336 days ago
      I prefer Atkinson dithering. I think it preserves more detail when the resolution is very low. For higher resolutions Floyd-Steinberg is better though.
    • crazygringo 337 days ago
      Completely agreed.

      I don't get the appeal of Atkinson dithering at all -- it makes the noise more "clumpy" or "textured" and thereby inherently reduces the amount of detail you can perceive. I don't think that's something subjective.

      And if you want the "richer contrast" that the article describes Atkinson as providing, then easy -- just increase the contrast of the grayscale image before dithering. Then you actually have control over the final contrast -- it can be whatever you want! But you won't lose detail the way Atkinson does.

    • aldto 337 days ago
      I agree that the area above the nostrils appears blown out, but I prefer the eyes in the Atkinson version. So neither algorithm is superior to me.
  • nullc 337 days ago
    This PhD thesis is the best technical coverage of dither science that I've seen:

    https://uwspace.uwaterloo.ca/bitstream/handle/10012/3867/the...

    Among the things it covers is the design of a noise shaping filter with a more symmetrical response than the Floyd-Steinberg one.

  • bryanthompson 337 days ago
    One really nice use case for dithering that I've found is for building graphics for 8-bit (Pico-8 and Picotron) games and toys.

    I made a Ruby script that can take a graphic and scale it to whatever size; then it uses a closest-color match to substitute colors from the _very_ limited Pico* palette and applies dithering to make it look attractive. I like Floyd-Steinberg the most, but have played with Atkinson and am still feeling around a bit.

  • moribvndvs 337 days ago
    I never really paid attention to the grid-like artifacts[0] that FSD and some other dithering algorithms cause when a dithered image is scaled way down (as is the case of the thumbnails in this article).

    [0] https://en.m.wikipedia.org/wiki/Moir%C3%A9_pattern

  • OnlyMortal 337 days ago
    This just reminded me of the “secret” bitmap of the dev team hidden in the OS.

    I recall using Macsbug to show it.

    • joezydeco 337 days ago
      G 41D89A (Mac SE)
      • OnlyMortal 323 days ago
        I recall opening a Plus and seeing engraved signatures.
  • jd3 337 days ago
    I remember creating an X-Face[0] using Atkinson Dithering for my SeaMonkey add-on MessageFaces[1]!

    You can create one online here[2], but it doesn't seem to support Atkinson for whatever reason.

    [0]: https://en.wikipedia.org/wiki/X-Face

    [1]: https://github.com/JohnDDuncanIII/messagefaces

    [2]: https://www.dairiki.org/xface/

  • AceJohnny2 337 days ago
    Why is the Floyd-Steinberg error diffusion kernel so uneven? What's the output like if the error is distributed equally among the bottom-right pixels?

    And this is a naive question, but could one construct a kernel that diffuses the error across all surrounding pixels, not just the bottom+right? I get that this will cause recursion difficulties as error bounces back-and-forth between neighboring pixels, but is that resolvable?

    • chowells 337 days ago
      Bottom+right is purely a performance optimization, when processing top-to-bottom, left-to-right. You only ever push the error terms to pixels you haven't processed yet, so the whole algorithm remains a single linear pass. That was the standard approach for handling images at the time. Everything was stored as lines of raster data, rather than the blocks modern compressed formats prefer, and there wasn't parallel processing available to speed things up in ways that make linear data dependency awkward.
      • nullc 337 days ago
        It doesn't just prevent you from having to make two passes, it prevents you from having to make potentially infinite passes!
    • DonHopkins 337 days ago
      Instead of a top/bottom, left/right scan, you can perform a serpentine scan that goes top/bottom but alternates left/right and right/left each row, zig-zagging as it goes down the page.

      That way the errors get spread out more evenly, and you don't get 45 degree diagonal flowing artifacts down and to the right.
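
      A sketch of the traversal (on right-to-left rows, the kernel's x-offsets have to be mirrored):

        def serpentine(width, height):
            # Visit rows alternately left-to-right and right-to-left.
            for y in range(height):
                xs = range(width) if y % 2 == 0 else reversed(range(width))
                d = 1 if y % 2 == 0 else -1
                for x in xs:
                    yield x, y, d  # diffuse error to (x + d * dx, y + dy)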

      The In-Laws (1979): Getting off the plane in Tijuana:

      https://www.youtube.com/watch?v=A2_w-QCWpS0

      "Serpentine! Serpentine!!!"

  • tiffanyh 337 days ago
    More examples (and algorithms) can be found here:

    https://brucebcampbell.wordpress.com/wp-content/uploads/2013...

  • leni536 337 days ago
    There is implementation variance in whether you apply the dithering consistently left to right on each row or alternate directions. Floyd-Steinberg definitely benefits from the latter approach.

    Also whether you apply the dithering in a linear colorspace.

  • GuB-42 336 days ago
    Just curious, does the website name "Beyond Loom" have anything to do with the video game "Loom" (1990)?
  • antirez 337 days ago
    Floyd-Steinberg looks a lot better IMHO.
    • KerrAvon 337 days ago
      Results will vary by image; I'm not sure the single comparison in TFA is the best possible source for comparison.
  • kibwen 337 days ago
    Dithering can still be a striking aesthetic choice. Low Tech Magazine is my go-to example: https://solar.lowtechmagazine.com/

    From their About page:

    "Thus, instead of using full-colour high-resolution images, we chose to convert all images to black and white, with four levels of grey in-between. These black-and-white images are then coloured according to the pertaining content category via the browser’s native image manipulation capacities. Compressed through this dithering plugin, images featured in the articles add much less load to the content: compared to the old website, the images are roughly ten times less resource-intensive."

    • flobosg 337 days ago
      See also the video game Return of the Obra Dinn (https://en.wikipedia.org/wiki/Return_of_the_Obra_Dinn) as well as the developer log entry about dithering (https://forums.tigsource.com/index.php?topic=40832.msg136374...).
      • magicalhippo 337 days ago
        Was about to mention this game. Found it through HN and loved it. The dithering is used very well in the game and fits the overall gameplay.

        An interesting aspect about the game is how it required making the dithering stable to changing viewpoints, not something typical dithering algorithms care about.

    • Rygian 337 days ago
      > compared to the old website, the images are roughly ten times less resource-intensive.

      Does that account for the repeated post processing done by every client?

      • blacksmith_tb 337 days ago
        My sense is they're trying to reduce request size/processing/reading from disk for their solar-powered server, more than reduce the whole world's energy use.
        • thih9 337 days ago
          Their core goal is, paraphrasing, helping design a sustainable society[1]. Reducing the whole world's energy use is aligned with that - more than reducing load on a single server.

          That being said, I doubt the post processing adds much in this case.

          [1]: "Low-tech Magazine questions the belief in technological progress and highlights the potential of past knowledge and technologies when it comes to designing a sustainable society" - https://www.patreon.com/lowtechmagazine/about

      • makapuf 337 days ago
        Decoding a black-and-white compressed image using a two-color palette is essentially free. You have to map each bit to a color anyway, so you might as well choose #ff00ff instead of #111111 for "black".
    • omoikane 337 days ago
      Unfortunately, if the browser window is not at the right size, some of those dithered images get scaled with visible moiré patterns. If they wanted a dithered look that works at different resolutions, it might have been better to serve grayscale images and filter them on the client side.
    • xattt 337 days ago
      Dithered content stands out because it’s an outlier in a world of smooth gradients. We’d be clamouring for the opposite if everything was dithered.
    • mjcohen 337 days ago
      Thanks. Low Tech Magazine is fascinating.
    • calebm 337 days ago
      Lichtenstein would agree.