Chromium Has Merged JpegXL

(chromium-review.googlesource.com)

315 points | by thunderbong 11 hours ago

97 comments

  • out_of_protocol 6 hours ago

    https://cloudinary.com/blog/jpeg-xl-and-the-pareto-front

    Oldie-but-goodie article with charts comparing WebP, JPEG XL, AVIF, JPEG, etc. AVIF is SLOW.

    • charcircuit 9 minutes ago

      This doesn't use hardware-accelerated decoders and encoders.

    • xnx 2 hours ago

      > This consolidates JPEG XL’s position as the best image codec currently available, for both lossless and lossy compression, across the quality range but in particular for high quality to visually lossless quality. It is Pareto-optimal across a wide range of speed settings.

      Wow. Nice. Big improvement if JPEG and PNG can be replaced by one codec.

      • adgjlsfhk1 20 minutes ago

        The part I'm more excited about is all the image-like data (and bundles of image-like data) that until JPEG XL didn't have any good codec, and usually ended up stored as folders of images. One clear example of this is PBR in Blender and friends (e.g. a combo of normal map, roughness, color, metalness, etc.).

      • lambdaone 2 hours ago

        And even better if someone can implement the whole massive spec securely...

        • fsflover 7 minutes ago

          This is probably impossible, and also not needed. If you really care about security, choose security through compartmentalization instead of security through correctness, which never works.

          Works for me with Qubes OS.

  • adzm 8 hours ago

    jxl-rs (https://github.com/libjxl/jxl-rs) is the underlying implementation. It's relatively new, but Rust certainly calms security fears. This library wasn't really an option last time this came around in Chromium.

    • quikoa 8 hours ago

      Didn't Google refuse to add JPEG XL because they claimed there wasn't enough interest? I don't think they refused out of security concerns, but maybe I'm misremembering.

      • pixelesque 8 hours ago

        Google argued that largely duplicating what AVIF provided (I know JPEG XL does support a bit more, but from most users' perspectives they're largely right) while being written in an unsafe language was not what they wanted in terms of increasing the attack surface.

        • Dylan16807 an hour ago

          > largely duplicating what AVIF provided

          That's not a great bar, since both of them showed up around the same time. And importantly, JXL hits many use cases that AVIF doesn't.

          > while being written in an unsafe language

          They put little emphasis on that part when they were rejecting JXL. If they wanted to call for a safer implementation they could have done that.

        • adzm 8 hours ago

          And it really was the right move at the time, imo. JXL, however, now has better implementations and better momentum in the wider ecosystem, rather than being just another image format that gets put into Chrome and becomes a de facto standard.

          • _ache_ 7 hours ago

            I can confirm. I found multiple problems in the "official" cjxl encoder back in 2023, in contrast to the webp2 (cwp2) implementation, where I could not find any bug or error.

            If the encoder has obvious problems, that's not a big deal in itself, but it doesn't bode well for the decoder.

            • adzm 6 hours ago

              There was CVE-2023-0645 in libjxl that year too, and several since.

          • Dylan16807 an hour ago

            Forcing other companies to override them is a way to prove momentum, but it's not a good way to prove momentum.

          • klglrksbjkt 7 hours ago

            • actionfromafar 5 hours ago

              Hahaha perfect! Can't believe I never heard this story before.

      • gcr 5 hours ago

        Google refused to merge JPEG XL as a strategy play to promote AVIF, which was in use by other teams (I think Photos?). Internally, Chrome engineers were supportive of JXL but were overridden by leadership.

        • zaphar 4 hours ago

          Do you have actual sources for this? Because the other comments about how the newer library removes most of the concerns explain this better than unsubstantiated speculation about promoting AVIF.

          • themerone 3 hours ago

            If you look at the issue tracker, the creator of WebP killed it because of untrue claims that there was no interest in it and no advantages over existing formats.

            Concerns about the implementation only came up after years of pushback forced Google to reconsider.

            • magicalist an hour ago

              > If you look at the issue tracker, the creator of WebP killed it because of untrue claims

              I think for most modern software it's difficult to name a single creator, but if you had to for WebP, it would be hard to argue that it's anyone but Jyrki Alakuijala, who is in fact one of the co-creators of JPEG XL and the person backing the long-term support of the Rust jxl-rs implementation. So I'm not even going to ask for a source here, because it's just not true.

    • rafram 3 hours ago

      > It's relatively new, but Rust certainly calms security fears.

      https://github.com/search?q=repo%3Alibjxl%2Fjxl-rs%20unsafe&...

      • dubi_steinkek 2 hours ago

        That looks pretty good to me. Every `unsafe` function has clearly stated safety requirements, and every `unsafe` block justifies why the requirements are met.
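
        In concrete terms, the convention looks something like this (a minimal sketch of the pattern, not code from jxl-rs; the names are mine):

          /// Returns the element at `index` without bounds checking.
          ///
          /// # Safety
          /// `index` must be less than `data.len()`.
          unsafe fn get_unchecked_demo(data: &[u32], index: usize) -> u32 {
              // SAFETY: the caller guarantees `index < data.len()`,
              // which is exactly what `get_unchecked` requires.
              unsafe { *data.get_unchecked(index) }
          }

          fn main() {
              let coeffs = [3u32, 1, 4, 1, 5];
              // SAFETY: 2 < coeffs.len(), so the contract above is met.
              assert_eq!(unsafe { get_unchecked_demo(&coeffs, 2) }, 4);
          }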

      • 12_throw_away an hour ago

        So, I had no reason to use "unsafe" for a very long time, and had developed a bit of an aversion to it. Then I actually needed to use it, first to interface with some C code, and then to deal with a device's mmap'd memory as raw `&[u8]`s.

        And my discovery (which basically anyone could have told me beforehand) was that... "unsafe" Rust is not really that different from regular Rust. It lets you dereference pointers (which is not a particularly unusual operation in many other languages) and call some functions that need extra care. Usually the presence of "unsafe" really just means that you needed to interface with foreign functions or hardware or something.
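
        The mmap'd-memory case above, for instance, boils down to something like this (a minimal sketch; the function and names are mine, with an array standing in for a real mmap() result):

          use std::slice;

          /// Views `len` bytes of device memory as a byte slice.
          ///
          /// # Safety
          /// `ptr` must be non-null and valid for reads of `len` bytes, and
          /// the memory must not be mutated while the slice is alive.
          unsafe fn mapped_bytes<'a>(ptr: *const u8, len: usize) -> &'a [u8] {
              // SAFETY: forwarded to the caller's contract above.
              unsafe { slice::from_raw_parts(ptr, len) }
          }

          fn main() {
              // Stand-in for a region returned by mmap().
              let backing = [0xDE_u8, 0xAD, 0xBE, 0xEF];
              // SAFETY: the pointer comes from a live array of exactly this length.
              let view = unsafe { mapped_bytes(backing.as_ptr(), backing.len()) };
              assert_eq!(view, &[0xDE_u8, 0xAD, 0xBE, 0xEF]);
          }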

        This is all to say: implying that the mere presence of an "unsafe" keyword is a sign that code is insecure is very, very silly.

    • WhereIsTheTruth 3 hours ago

      > Rust certainly calms security fears

      No. Memory safety is not security. Rust's memory guarantees eliminate some issues, but they also create a dangerous overconfidence: devs treat the compiler as a security audit and skip the hard work of threat modeling.

      A vigilant C programmer who manually validates everything and uses the tools at their disposal is less risky than a complacent Rust programmer who blindly trusts the language.

      • rkangel 3 hours ago

        > A vigilant C programmer who manually validates everything and uses the tools at their disposal is less risky than a complacent Rust programmer who blindly trusts the language

        I agree with this. But for a component whose job is to parse data and produce pixels, the security worries I have are memory ones. It's not implementing a permissions model or anything where design and logic are really important. The security holes an image codec would introduce are the sort where a buffer overrun gives an execution primitive (etc.).

        • lambdaone 2 hours ago

          Rust programmers are far more likely to have the vigilant mindset than C programmers, or they wouldn't be using Rust.

          You can get an awful lot done very quickly in C if you aren't bothered about security - and traditionally, most of the profession has done exactly that.

      • estebank 2 hours ago

        > A vigilant C programmer who manually validates everything and uses the tools at their disposal is less risky than a complacent Rust programmer who blindly trusts the language

        What about compared to a vigilant Rust programmer who also manually validates everything and uses the tools at their disposal?

  • bla3 5 hours ago

    It's a shame that JpegXL doesn't have a freely available spec.

    • master-lincoln 3 hours ago

      You could also say it's a sham to have non-public standards

    • Latitude7973 5 hours ago

      In general terms, it is a shame that thousands of ISO, IEC, etc. specifications and documents are behind a paywall.

      • bionhoward 4 hours ago

        Yes! The paywalled SQL documents are a big annoyance

  • LtdJorge 8 hours ago

    I've recently compared WebP and AVIF with the reference encoders (and rav1e for lossy AVIF), and for similar quality, WebP is almost instant while AVIF takes more than 20 seconds (1 MP image).

    JXL is not yet widely supported, so I can't really use it (for video game maps), but I hope its performance turns out similar to WebP's, with better quality, in the future.

    • adzm 7 hours ago

      You have to adjust the cpu-used (speed) parameter, not just quality, for AVIF. Though it can indeed be slow, it should not be that slow, especially for a 1 MP image. The defaults lean toward a slower, more CPU-intensive setting for some reason. I have modest infrastructure that generates 2 MP AVIF in a hundred ms or so.
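
      For example (a minimal sketch, shelling out to avifenc from Rust; libaom calls the setting cpu-used, avifenc exposes it as --speed, and the filenames here are made up):

        use std::process::Command;

        fn main() -> std::io::Result<()> {
            // --speed trades encode time for compression efficiency:
            // 10 is fastest, 0 is slowest/best.
            let status = Command::new("avifenc")
                .args(["--speed", "8", "input.png", "output.avif"])
                .status()?;
            assert!(status.success(), "avifenc failed");
            Ok(())
        }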

      • LtdJorge 7 hours ago

        I tested both WebP and AVIF with maximum CPU usage/effort. I have not tried the faster settings because I wanted the highest quality for small size, but for similar quality WebP blew AVIF out of the water.

        I also have both compiled with -O3 and -march=znver2 in GCC (same for rav1e's RUSTFLAGS) through my Gentoo profile.

        • adzm 6 hours ago

          Maximum CPU effort between those two libs is not really comparable, though. But quality is subjective, and it sounds like WebP worked best for you! Just saying, there is little benefit in using the max CPU settings for AVIF. That's like comparing max CPU settings on zip vs xz!

    • fishgoesblub 2 hours ago

      rav1e has not had an actual update in performance or quality in years, since its funding got dropped. Use an encoder like aom or SVT-AV1.

  • hbn 43 minutes ago

    So is this another image format I'll download and be unable to use without converting, because nothing supports it, a la .webp?

  • jakkos 10 hours ago

    I've been hearing about fights over JPEG XL and WebP (and AVIF?) for years, but don't know much about it.

    From a quick look at various "benchmarks", JPEG XL seems to be flat-out better than WebP in both compression speed and size, so why has there been such reluctance from Chromium to adopt it? Are there WebP benefits I'm missing?

    My only experience with WebP has been downloading what is nominally a `.png` file but then being told "WebP is not supported" by some software when I try to open it.

    • jmillikin 9 hours ago

      Most of the code in WebP and AVIF is shared with VP8/AV1, which means if your browser supports contemporary video codecs then it also gets pretty good lossy image codecs for free. JPEG-XL is a separate codebase, so it's far more effort to implement and merely providing better compression might not be worth it absent other considerations. The continued widespread use of JPEG is evidence that many web publishers don't care that much about squeezing out a few bytes.

      Also, from a security perspective, the reference implementation of JPEG-XL isn't great. It's over a hundred kLoC of C++, and given the public support for memory safety by both Google and Mozilla, it would be extremely embarrassing if a security vulnerability in libjxl led to a zero-click zero-day in either Chrome or Firefox.

      The timing is probably a sign that Chrome considers the Rust implementation of JPEG-XL to be mature enough (or at least heading in that direction) to start kicking the tires.

      • latexr 8 hours ago

        > The continued widespread use of JPEG is evidence that many web publishers don't care that much about squeezing out a few bytes.

        I agree with the second part (useless hero images at the top of every post demonstrate it), but not necessarily the first. JPEG is supported pretty much everywhere images are, and it’s the de facto default format for pictures. Most people won’t even know what format they’re using, let alone that they could compress it or use another one. In the words of Hank Hill:

        > Do I look like I know what a JPEG is? I just want a picture of a god dang hot dog.

        https://www.youtube.com/watch?v=EvKTOHVGNbg

        • jmillikin 7 hours ago

          I'm not (only) talking about the general population, but major sites. As a quick sanity check, the following sites are serving images with the `image/jpeg` content type:

          * CNN (cnn.com): News-related photos on their front page

          * Reddit (www.reddit.com): User-provided images uploaded to their internal image hosting

          * Amazon (amazon.com): Product categories on the front page (product images are in WebP)

          I wouldn't expect to see a lot of WebP on personal homepages or old-style forums, but if bandwidth costs were a meaningful budget line item then I would expect to see ~100% adoption of WebP or AVIF for any image that gets recompressed by a publishing pipeline.

          • vlovich123 6 hours ago

            It’s subsidized by cheap CDN rates and dominated by video demand.

          • ascorbic 5 hours ago

            Any site that uses a frontend framework or CMS will probably serve WebP at the very least.

    • coppsilgold 8 hours ago

      JPEG XL has progressive decoding

      https://www.youtube.com/watch?v=UphN1_7nP8U

    • jacobp100 9 hours ago

      JPEG XL and AVIF are comparable formats. Google argued you only needed one, and that each additional format is a security liability.

      • londons_explore 7 hours ago

        And more importantly, an additional format is a commitment to maintain support forever, not only for you, but for future people who implement a web browser.

        I can completely see why the default answer to "should we add x" should be no unless there is a really good reason.

    • out_of_protocol 9 hours ago

      - avif is better at low bpp (low-quality images), terrible in lossless

      - jxl is better at high bpp, best in lossless mode

    • speps 10 hours ago

      It was an issue with the main JPEG XL library being unmaintained and possibly open to security flaws. Some people got together and wrote a new one in Rust, which then became an acceptable choice for a secure browser.

      • a-french-anon 9 hours ago

        Unmaintained? You must be mistaken; libjxl was getting a healthy stream of commits.

        The issue was the use of C++ instead of Rust or WUFFS (which Chromium uses for a lot of formats).

      • spider-mario 3 hours ago

        It’s largely the same people.

    • rdsubhas 7 hours ago

      > various "benchmarks", JPEG XL seems to be flat-out better than WebP

      The decode speed benchmarks are misleading. WebP has been hardware-accelerated since 2013 on Android and 2020 on Apple devices. Due to existing hardware capabilities, real users will _always_ experience better performance and battery life with WebP.

      JXL is more about future-proofing: bit depth, wide-gamut HDR, progressive decoding, animation, transparency, etc.

      JXL does flat-out beat AVIF (the image codec, not video) today. AVIF also pretty much doesn't have hardware decoding in modern phones yet. It makes more sense to invest NOW in JXL than in AVIF.

      For what people use today, there is unfortunately no significant case for beating WebP and its existing momentum. The size vs. perceptual quality tradeoffs are not significantly different. For users, things will get worse (worse decode speeds and battery life due to the lack of hardware decode) before they get better. That can take many years – because hey, more features in JXL also means translating them to hardware die space will take more time. Just the software side of things is only now picking up.

      But for what we all need – it's really necessary to start the JXL journey now.

      • Dylan16807 an hour ago

        > Due to existing hardware capabilities, real users will _always_ experience better performance and battery life with WebP

        Extra data transfer costs performance and battery life too.

    • 3OCSzk 4 hours ago

      1 black pixel of .webp is smaller than 1 black pixel of .jxl, which is in turn smaller than 1 black pixel of .png,

      so webp > jpegxl > png
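
      For the PNG/WebP end of that claim you can check it yourself; a sketch using the Rust image crate (assuming the 0.25-style API; JPEG XL isn't available there, so that leg is left out):

        use image::codecs::png::PngEncoder;
        use image::codecs::webp::WebPEncoder;
        use image::{ExtendedColorType, ImageEncoder};

        fn main() {
            // One black RGB pixel.
            let pixel = [0u8, 0, 0];

            let mut png = Vec::new();
            PngEncoder::new(&mut png)
                .write_image(&pixel, 1, 1, ExtendedColorType::Rgb8)
                .unwrap();

            let mut webp = Vec::new();
            WebPEncoder::new_lossless(&mut webp)
                .write_image(&pixel, 1, 1, ExtendedColorType::Rgb8)
                .unwrap();

            // At this size the numbers mostly measure container overhead.
            println!("png: {} bytes, webp: {} bytes", png.len(), webp.len());
        }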

    • archerx 8 hours ago

      Google created WebP, and that is why they give it unjustified preferential treatment and have been trying to unreasonably force it down the internet's throat.

      • adzm 7 hours ago

        WebP gave me alpha transparency with lossy images, which came in handy at the time. It was also not bogged down by patents and licensing. Plus, like others said, if you support VP8 video you pretty much already have a WebP codec; same with AV1 and AVIF.

        • archerx 6 hours ago

          Lossy PNGs with transparency exist.

          • WhitneyLand 2 hours ago

            PNG as a format is not lossy; it uses DEFLATE.

            What you're referring to is pngquant, which uses dithering and reduces colors to allow the PNG to compress to a smaller size.

            So the "loss" happens independently of the format.

          • adzm 5 hours ago

            Do you mean lossless? PNGs are not lossy. A large photo with an alpha channel in a lossless PNG could easily be 20x the size of a lossy WebP.

            • Semaphor 4 hours ago

              PNGs can of course be made lossy. They aren't great at it, but depending on the image it can be good enough.

            • archerx 3 hours ago

              No, I meant lossy. This is the library I use: https://pngquant.org/

      • breppp 7 hours ago

        Unjustified preferential treatment over JPEG XL, a format Google also created?

        • archerx 7 hours ago

          They helped create JPEG XL, but they are not the sole owner like they are with WebP. There is a difference.

          • breppp 6 hours ago

            A better argument might be that Chrome protects its own work versus that of a research group in Google Switzerland. However, as others mentioned, the security implications of another unsafe binary parser in a browser are hardly worth it.

            • archerx 6 hours ago

              • breppp 4 hours ago

                Which only strengthens my argument: WebP seemed like some ad hoc pet project in Chrome, and that ended like most unsafe binary parsers do, with critical vulnerabilities.

                • magicalist an hour ago

                  > WebP seemed like some ad hoc pet project in Chrome

                  FWIW, WebP came from the same "research group in Google Switzerland" that later developed JPEG XL.

                  • breppp 24 minutes ago

                    I now see that WebP lossless is definitely from there, but the WebP base format looks like it was acquired from a US startup. Was the image format also adapted by the Swiss group?

      • MrDOS 6 hours ago

        You're getting downvoted, but you're not wrong. If anyone else had come up with it, it would have been ignored completely. I don't think it's as bad as some people make it out to be, but it's not really that compelling for end users, either. As other folks in the thread have pointed out, WebP is basically the static image format that you get "for free" when you've already got a VP8 video decoder.

        The funny thing is all the places where Google's own ecosystem has ignored WebP. E.g., Go's golang.org/x/image module has a WebP decoder, but all of the encoders you'll find are cgo bindings to libwebp.

        • archerx 6 hours ago

          I've noticed Hacker News is more about feelings than facts lately, which is a shame.

  • viktorcode 9 hours ago

    Does anyone know if their implementation supports animations? This is a feature missing from Apple's.

    • Latitude7973 4 hours ago

      Yes, but it's not recommended: it does not have inter-frame compression, so it is significantly less efficient than just having a regular video file and slapping 'gif' on it.

      • 112233 2 hours ago

        Do you know of a video format that supports progressive decoding?

    • adzm 7 hours ago

      According to the Chrome Platform Status page, yes! https://chromestatus.com/feature/5114042131808256

      > - Progressive decoding for improved perceived loading performance
      > - Support for wide color gamut, HDR, and high bit depth
      > - Animation support

    • nar001 6 hours ago

      It does. I just tried it in Canary, and the JXL test page also showed animations.

    • actionfromafar 5 hours ago

      What, isn't this the cue for someone to explain that it's ironic that WebP is really a video format that makes a bad image format, and that now we have symmetry: JPEG XL is a good image format that makes a bad video format? :-D

      (I don't know if any of this is true, but it sounds funny...)

  • Findecanor 3 hours ago

    From my (limited) understanding, there is still a lot shared between JPEG and JPEG XL.

    I wonder if this new implementation could be extended to incorporate support for the older JPEG format, and whether the total code size could then be reduced.

  • carra 6 hours ago

    Thanks, but just as with WebP, I'll try to stick to regular JPEGs whenever possible. Not all programs I use accept these formats, and for a common user JPEG + PNG should cover most needs. Maybe add GIF to the list for simple animations, while more complex ones can be videos instead of images.

    • striking 6 hours ago

      "JPEG XL" is a bit of a misnomer, as it's not just "JPEG with more bits". It supports lossless encoding of existing content at a smaller file size than PNG, and it lets you transcode existing JPEGs recoverably for a ~20% space saving. Its lossy encoding doesn't look nearly as ugly and artifacted as JPEG's, it supports wide gamut and HDR, and it delivers images progressively, so you get a decent preview with as little as 15% of the image loaded and no additional client-side effort (from https://jpegxl.info/).

      It is at least a very good transcoding target for the web, but it genuinely replaces many other formats in a way where the original source file can more or less be regenerated.

      • AlienRobot 2 hours ago

        Honestly, I don't like how WebP and now JPEG XL support both a lossless and a lossy mode.

        Let's say you want to store images losslessly. That means you won't tolerate loss of data, which means you don't want to risk a codec that will compress the image lossily if you forget to enable a setting.

        With PNG there is no way to accidentally make it lossy, which feels a lot safer for cases where you want lossless compression.

        • striking 2 hours ago

          Taking a look at the reference codec package at https://gitlab.com/wg1/jpeg-xl, they note:

          > Specifically for JPEG files, the default cjxl behavior is to apply lossless recompression and the default djxl behavior is to reconstruct the original JPEG file (when the extension of the output file is .jpg).

          You're right, however, that you do need to be careful and use the reference codec package for this, as tools like ImageMagick create loss during the decoding of the JPEG into pixels (https://github.com/ImageMagick/ImageMagick/discussions/6046) and ImageMagick sets quality to 92 by default. But perhaps that's something we can change.
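
          That default round trip looks something like this (a minimal sketch shelling out to the reference tools; it assumes cjxl and djxl are on PATH, and the filenames are made up). For pixel inputs like PNG you would pass -d 0 explicitly to force lossless:

            use std::process::Command;

            fn main() -> std::io::Result<()> {
                // Default cjxl behavior for a .jpg input: lossless JPEG recompression.
                let status = Command::new("cjxl")
                    .args(["input.jpg", "recompressed.jxl"])
                    .status()?;
                assert!(status.success(), "cjxl failed");

                // Default djxl behavior for a .jpg output: reconstruct the original JPEG.
                let status = Command::new("djxl")
                    .args(["recompressed.jxl", "roundtrip.jpg"])
                    .status()?;
                assert!(status.success(), "djxl failed");

                // The round trip should be bit-exact.
                assert_eq!(std::fs::read("input.jpg")?, std::fs::read("roundtrip.jpg")?);
                Ok(())
            }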

    • Sammi 6 hours ago

      You can really treat WebP as a universally available format in 2026. It is an old, boring, and safe format to use now.

      Browser support for WebP is excellent now. The last browser to add it was Safari 14, on September 16, 2020: https://caniuse.com/webp

      It got into Windows 10 1809 in October 2018, and into macOS Big Sur in November 2020.

      Wikipedia has a great list of popular software that supports it: https://en.wikipedia.org/wiki/WebP#Graphics_software

      • carra 3 hours ago

        Unfortunately, being universal implies much more than just good browser support. There are quite a few image processing programs without WebP or JPEG XL support. I'm using Windows 11, and the default image viewer can't even open WebP... Also, keep in mind that due to subscription models there are many people stuck with older Photoshop versions too.

        • spider-mario 3 hours ago

          • carra 3 hours ago

            Thanks, I know about this and other workarounds. My point is, if it were truly universal, you should not need anything! I bet most regular users will never even know this exists.

          • majora2007 2 hours ago

            I never knew about this either, and it's been very frustrating, as I've been converting my manga library over to WebP (the savings are insane) and any spot checking opens Edge.

            Edit: After reading the comments, this doesn't seem to open in the Photos app.

      • Y-bar 6 hours ago

        WebP can be really annoying once you hit certain encoding edge cases.

        One customer of mine (fashion) has over 700k images in their DAM, and about 0.5% cannot be converted to WebP at all using libwebp. They can be converted to JPEG, PNG, and AVIF without problem.

        • jdiff 5 hours ago

          Just out of curiosity, what's the problem libwebp has with them? I wasn't aware of cases where any image format would just cross its arms and refuse point-blank like that.

          • Y-bar 5 hours ago

            We have never been able to resolve it beyond knowing this:

            Certain pixel colour combinations in the source image appear to trip the algorithm to such a degree that the encoder will only produce a black image.

            We know this because we have been able to encode the images by (in pure frustration) manually brute-forcing it: moving a black square across different locations in the source image and trying to encode again. Suddenly it will work.

            Images are pretty much always exported from Adobe, often smaller than 3000x3000 pixels. Images from the same camera, same size, same photo session, and same export batch will work, and then suddenly one out of a few hundred may come out black, and only the WebP one, not the other formats; the rest of the photos work in all formats.

            A more mathematically inclined colleague tried to have a look at the implementation once, but was unable to figure it out because they could apparently not find a good written spec on how the encoder is supposed to work.

          • tr45872267 5 hours ago

            WebP has a maximum pixel dimension of 16383 x 16383.[0]

            [0] https://developers.google.com/speed/webp/faq#what_is_the_max...
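
            So a pre-flight dimension check before encoding is worth having; a trivial sketch (the 16383 cap is from the FAQ above, and the names are mine):

              /// Maximum WebP dimension per side, per the FAQ linked above.
              const WEBP_MAX_DIM: u32 = 16383;

              /// Pre-flight check before handing an image to a WebP encoder.
              fn fits_in_webp(width: u32, height: u32) -> bool {
                  width <= WEBP_MAX_DIM && height <= WEBP_MAX_DIM
              }

              fn main() {
                  assert!(fits_in_webp(3000, 3000)); // a typical studio export: fine
                  assert!(!fits_in_webp(20000, 4000)); // needs tiling or another format
              }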

    • ashirviskas 6 hours ago

      You should never use GIF anymore; it is super inefficient. Just use video, which is 5x to 10x more efficient:

      https://web.dev/articles/replace-gifs-with-videos

      • jdiff 5 hours ago

        There are odd cases where it still has uses. When I was a teacher, some of the gamifying tools didn't allow video embeds without a subscription, but I wanted to make some "what 3D operation is shown here" questions with various tools in Blender. GIF sizes were pretty comparable to video for largely static, less-than-a-second loops, and likely had slightly higher quality with care taken to reduce color palette usage.

        But I fully realize there are vanishingly few cases with similar constraints.

        • ascorbic 5 hours ago

          For those you can often use animated WebP, or even APNG. They both have close to universal support and are usually much smaller.

          • tylertyler 2 hours ago

            If you need animated images in emails or text messages, GIF is the only supported format that will play the animation. Because of the size restrictions in these messaging systems, the inefficient compression of GIFs is a major issue.

            • prmoustache an hour ago

              I am not sure "need" is the right word here.

          • adzm 4 hours ago

            AVIF works here also. Discord started supporting it for custom emoji.

  • einpoklum 5 hours ago

    Unfortunately, with Chromium dropping support for Manifest V2 extensions, and through that dropping proper support for uBlock Origin, I'm moving away from it. Not that that's easy, of course...