Yet another fatwā (or, the case against Lagarith)

I seem to be spending a lot of my time on this here blawg yelling at people and telling them what they shouldn’t be doing or shouldn’t be using. I see no reason to discontinue this, since it’s low-effort writing, is likely to piss at least someone off, and is in everyone’s best interest.

Here, then, is the Grand Mufti’s ruling on the use of Lagarith: don’t.

The reasons are as many as there are grains of sand in the desert, but the most prominent ones are:

Lagarith is slow

If you are using a lossless codec, chances are you are doing it because your original source is so slow to decode and/or filter that it bottlenecks the encoder (or because someone told you that’s how things are done, in which case you should seriously reevaluate your encoding practices). Using a lossless codec that is itself slow to decode is therefore extremely dumb. Take care to use the fastest codec available, so that the encoder is bottlenecked as little as possible (and don’t use uncompressed video, because then you’ll bottleneck on disk read speed instead).

Some of you will undoubtedly hurf a blurf about how Lagarith compresses better, but if you’re already blowing 20 GB of disk space on a lossless file that you’ll only keep for a few days at most, why would another few GB matter? If you want good compression there are better alternatives available anyway.

If you were going to say “eh, the difference isn’t that big”, shut up before you embarrass yourself. On encoding, FFmpeg’s HuffYUV encoder is about 160% faster while compressing about 10% worse (the latter number being source-dependent, of course). On decoding, Lagarith isn’t quite as far behind, with FFhuff “only” about 150% faster. The comparison gets even more unfavorable once you look at lossless h264, which (while being annoyingly slow to seek, something that may or may not matter to you) is not only faster on both encoding and decoding but also compresses better, even on its fastest settings (except in intra-only mode, which is still faster but compresses worse than FFhuff).
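If you want to sanity-check decode speed yourself rather than take my word for it, one way (a sketch, with a placeholder filename) is ffmpeg’s `-benchmark` flag combined with the null muxer, which discards the output so you time decoding alone:

```shell
# Time decoding only: the null muxer throws the frames away,
# and -benchmark prints CPU and elapsed time when the run ends.
# (lossless.avi is a placeholder for your own intermediate file)
ffmpeg -benchmark -i lossless.avi -f null -
```

Run it once per codec on the same source material and compare the reported times.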

Lagarith is buggy

Not much needs to be said about this one. Both the encoder and the decoder have had numerous bugs in the past, and some surely remain. The decoder is known to randomly corrupt frames in the decoded output for no particular reason (happens rarely, but still often enough to be annoying). There have also been multiple funny issues with x64 versions.

Lagarith is a dumb format

Floating-point arithmetic in a lossless video encoder. Nuff said.
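For anyone who needs “nuff said” spelled out: a lossless codec must reproduce bits exactly, and floating-point addition is not associative, so any change in evaluation order (compiler, SIMD path, platform) can change the output. A minimal illustration using awk, which does its arithmetic in IEEE doubles:

```shell
# Regrouping the same three constants yields two different doubles.
lhs=$(awk 'BEGIN { printf "%.17g", (0.1 + 0.2) + 0.3 }')
rhs=$(awk 'BEGIN { printf "%.17g", 0.1 + (0.2 + 0.3) }')
echo "$lhs vs $rhs"   # the two printed values differ in the last digits
```

Integer arithmetic has no such problem, which is why sane lossless formats stick to it.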

So what should I use instead?

FFmpeg’s HuffYUV encoder (use “plane” as the predictor), as featured in ffmpeg itself, mencoder and FFdshow (and FFvfw), or lossless h264, depending on your requirements.
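In ffmpeg terms, the two recommendations look roughly like this (filenames are placeholders, and exact option names can vary between ffmpeg versions):

```shell
# FFmpeg's HuffYUV variant with the plane predictor:
ffmpeg -i input.avi -c:v ffvhuff -pred plane intermediate.avi

# Lossless h264 via x264: -qp 0 disables quantization entirely.
ffmpeg -i input.avi -c:v libx264 -qp 0 -preset ultrafast intermediate.mkv
```

Pick the former if seeking speed matters, the latter if disk space does.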


If you’re using a shitty NLE like Adobe After Effects or whatever, use whatever the fuck you want that works, because you’re most likely fucked anyway. It may also be forgivable to use Lagarith for RGBA (the performance numbers above apply to YV12 only), since not many other lossless codecs support it.


Sources

Mostly DeathWolf’s lossless codec test and personal experience.

Comments (7)

  1. JEEB wrote:

    For RGBA there’s UTVideo, to bring up one lossless encoder/decoder that might even get to the hurf durf ffmpeg.

    Thursday, June 24, 2010 at 18:11 #
  2. RandomAnon wrote:

    OT, but the fact that ffmpeg converts every lossless RGBA shit internally to yv12 in ffv1/ffhuff is lulz, and I don’t sense that they are going to fix that in the near future.

    Friday, June 25, 2010 at 07:41 #
  3. MasterCJ wrote:

    I was kind of sadfacing until you mentioned the RGBA aspect, as that’s the only reason I can think of to use Lagarith, and it’s a fairly compelling reason. As soon as I hit that paragraph I was happy though, so flame on.

    Wednesday, June 30, 2010 at 17:52 #
  4. PC wrote:

    This post made me think about trying HuffYUV again for my video editing stuffs, to see if the issues I had in the past were just a one-time thing. They weren’t. It does perform better (it’s nice to actually be able to preview everything properly) but lol clips that randomly become either completely or partially unreadable after a while.

    Of course, it is a case of NLE software being shitty and all, and it’s not like Lagarith hasn’t given me issues (random corrupted frames and all).

    Thursday, July 1, 2010 at 22:43 #
  5. Richard Berg wrote:

    PC: that is entirely expected. Welcome to bit rot. If you have a memory chip or HDD that’s prone to flip bits, a Huffman-coded bitstream will become partially unreadable, by design. Lossless coding is pretty much the exact opposite of redundancy.

    Sunday, July 11, 2010 at 00:40 #
  6. roger wrote:

    So has utvideo ever made its way to ffmpeg yet?

    Thursday, September 15, 2011 at 00:41 #
  7. Fredo wrote:

    Except of course Lagarith makes VASTLY smaller files (and if it’s slow, it’s because you have an old computer).

    Sunday, July 29, 2012 at 16:53 #