Is there a faster lossy compression than JPEG?

#1
Is there a compression algorithm that is faster than JPEG yet well supported? I know about JPEG 2000, but from what I've heard it's not really that much faster.

Edit: for compressing.

Edit 2: It should run on 32-bit Linux, and ideally it should be in C or C++.

#2
In what context? On a PC or a portable device?

From my experience, you've got JPEG, JPEG2000, PNG, and... uh, that's about it for "well-supported" image types in a broad context (lossy or not!)

(Hooray that GIF is on its way out.)

#3
I think wavelet-based compression algorithms are in general slower than the ones using DCT. Maybe you should take a look at the JPEG XR and WebP formats.

#4
Do you have MMX/SSE2 instructions available on your target architecture? If so, you might try libjpeg-turbo. Alternatively, can you compress the images with something like `zlib` and then offload the actual reduction to another machine? Is it imperative that the actual lossy compression of the images take place on the embedded device itself?
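
Just to illustrate, here is a rough, untested sketch of what that might look like through the TurboJPEG API that ships with libjpeg-turbo, assuming an RGB buffer already in memory (the dimensions and pixel data are placeholders):

```c
/* Minimal sketch: compress an in-memory RGB buffer with the TurboJPEG API
 * from libjpeg-turbo. Buffer contents and dimensions are placeholders;
 * link with -lturbojpeg. */
#include <stdio.h>
#include <stdlib.h>
#include <turbojpeg.h>

int main(void)
{
    const int width = 640, height = 480;
    unsigned char *rgb = calloc((size_t)width * height * 3, 1); /* your pixels here */

    tjhandle tj = tjInitCompress();
    unsigned char *jpeg = NULL;   /* tjCompress2 allocates the output buffer */
    unsigned long jpegSize = 0;

    /* 4:2:0 subsampling, quality 75, fast (less accurate) DCT */
    if (tjCompress2(tj, rgb, width, 0 /* pitch: 0 = width*3 */, height, TJPF_RGB,
                    &jpeg, &jpegSize, TJSAMP_420, 75, TJFLAG_FASTDCT) != 0) {
        fprintf(stderr, "compress failed: %s\n", tjGetErrorStr());
        return 1;
    }
    printf("compressed %d bytes of RGB into %lu bytes of JPEG\n",
           width * height * 3, jpegSize);

    tjFree(jpeg);
    tjDestroy(tj);
    free(rgb);
    return 0;
}
```

`TJFLAG_FASTDCT` trades a little accuracy for speed, which sounds like what you're after.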

#5
JPEG2000 isn't faster at all. Is it encoding or decoding that's not fast enough with JPEG? You could probably get a lot faster by doing only 4x4 FDCT and IDCT on JPEG.

It's hard to find much documentation on IJG libjpeg, but if you use it, try lowering the quality setting; that might make it faster. There also seems to be a fast FDCT option.

Someone mentioned libjpeg-turbo, which uses SIMD instructions and is compatible with the regular libjpeg. If that's an option for you, I think you should try it.
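
To make those two knobs concrete, here is a minimal, untested sketch with the plain IJG libjpeg API; the grayscale input and its dimensions are just placeholders:

```c
/* Sketch of the knobs mentioned above with the IJG libjpeg API:
 * lowering the quality and selecting the fast integer DCT.
 * Assumes a grayscale buffer of known size; link with -ljpeg. */
#include <stdio.h>
#include <jpeglib.h>

void write_jpeg(const unsigned char *gray, int width, int height,
                int quality, FILE *out)
{
    struct jpeg_compress_struct cinfo;
    struct jpeg_error_mgr jerr;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_compress(&cinfo);
    jpeg_stdio_dest(&cinfo, out);

    cinfo.image_width = width;
    cinfo.image_height = height;
    cinfo.input_components = 1;
    cinfo.in_color_space = JCS_GRAYSCALE;

    jpeg_set_defaults(&cinfo);
    jpeg_set_quality(&cinfo, quality, TRUE); /* lower = smaller, and a bit faster */
    cinfo.dct_method = JDCT_IFAST;           /* the "fast FDCT option" */

    jpeg_start_compress(&cinfo, TRUE);
    while (cinfo.next_scanline < cinfo.image_height) {
        JSAMPROW row = (JSAMPROW)&gray[cinfo.next_scanline * width];
        jpeg_write_scanlines(&cinfo, &row, 1);
    }
    jpeg_finish_compress(&cinfo);
    jpeg_destroy_compress(&cinfo);
}
```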

#6
You could simply resize the image to a smaller one if you don't require the full image fidelity. Averaging every 2x2 block into a single pixel will reduce the size to 1/4 very quickly.
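
Something like this (untested, grayscale for simplicity) is all it takes:

```c
/* 2x2 box-filter downscale of a grayscale image, as described above:
 * each output pixel is the rounded average of a 2x2 block, quartering
 * the number of pixels. */
void downscale_2x2(const unsigned char *src, int width, int height,
                   unsigned char *dst /* (width/2) * (height/2) bytes */)
{
    int ow = width / 2, oh = height / 2;
    for (int y = 0; y < oh; ++y) {
        for (int x = 0; x < ow; ++x) {
            int sum = src[(2 * y) * width + 2 * x]
                    + src[(2 * y) * width + 2 * x + 1]
                    + src[(2 * y + 1) * width + 2 * x]
                    + src[(2 * y + 1) * width + 2 * x + 1];
            dst[y * ow + x] = (unsigned char)((sum + 2) / 4);
        }
    }
}
```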

#7
JPEG encoding and decoding should be *extremely* fast. You'll have a hard time finding a faster algorithm. If it's slow, your problem is probably not the format but a bad implementation of the encoder. Try the encoder from `libavcodec` in the `ffmpeg` project.
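
If it helps, here is a rough, untested sketch of encoding one frame with `libavcodec`'s MJPEG encoder; it uses the newer send/receive API, so treat it as an outline rather than a drop-in, and note that the frame contents are left as placeholders:

```c
/* Rough sketch: encode a single frame as JPEG with libavcodec's MJPEG
 * encoder via the send/receive API. The frame planes are allocated but
 * not filled with real pixels here; link with -lavcodec -lavutil. */
#include <stdio.h>
#include <libavcodec/avcodec.h>

int main(void)
{
    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MJPEG);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);

    ctx->width     = 640;
    ctx->height    = 480;
    ctx->pix_fmt   = AV_PIX_FMT_YUVJ420P;    /* full-range YUV 4:2:0 for MJPEG */
    ctx->time_base = (AVRational){1, 25};
    if (avcodec_open2(ctx, codec, NULL) < 0)
        return 1;

    AVFrame *frame = av_frame_alloc();
    frame->format = ctx->pix_fmt;
    frame->width  = ctx->width;
    frame->height = ctx->height;
    av_frame_get_buffer(frame, 0);           /* fill frame->data[0..2] with your pixels */

    AVPacket *pkt = av_packet_alloc();
    if (avcodec_send_frame(ctx, frame) == 0 &&
        avcodec_receive_packet(ctx, pkt) == 0)
        printf("encoded one JPEG of %d bytes\n", pkt->size);

    av_packet_free(&pkt);
    av_frame_free(&frame);
    avcodec_free_context(&ctx);
    return 0;
}
```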