
Are there any compression algorithms outside of entropy encodings that yield a significant compression ratio for memoryless sources (not Markov sources, but i.i.d. sources)?

More concretely: if a million ASCII characters are each drawn independently from the same probability distribution (not necessarily uniform, but with every ASCII character having non-zero probability), what compression algorithms outside of entropy encoders (such as arithmetic coding or Huffman coding) would yield decent compression ratios?
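
For concreteness, here is a minimal sketch of the setup. The skewed distribution is only an assumed example; the point is that the Shannon entropy of the source is the bits-per-symbol floor that any lossless compressor has to respect on average, and you can compare it with what a general-purpose LZ-based coder actually achieves on i.i.d. data:

    import math
    import random
    import zlib

    # Assumed example distribution over the 128 ASCII symbols:
    # one symbol is very likely, the rest share the remaining mass.
    probs = [0.5] + [0.5 / 127] * 127

    # Shannon entropy H(X) in bits/symbol: the average-rate limit
    # for any lossless code on an i.i.d. (memoryless) source.
    entropy = -sum(p * math.log2(p) for p in probs)

    # Draw a million i.i.d. samples and see what a general-purpose
    # dictionary coder (zlib = LZ77 + Huffman) does with them.
    rng = random.Random(0)
    data = bytes(rng.choices(range(128), weights=probs, k=1_000_000))
    compressed = zlib.compress(data, level=9)

    print(f"entropy bound : {entropy:.3f} bits/symbol")
    print(f"zlib output   : {8 * len(compressed) / len(data):.3f} bits/symbol")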

asked Jun 3, 2016 at 21:00
Why have you rejected entropy encoders? – Commented Jun 16, 2017 at 21:28

1 Answer


You've eliminated entropy encoders, which are exactly the coders designed for the type of data you described.

A memoryless Markov source is a contradiction in terms.

The best possible compression for the specific generated list of characters might just be the program that was used to generate that deterministic data in the first place.
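
As a toy illustration of that last point (purely hypothetical, assuming the "source" is really a seeded pseudo-random generator with a known distribution), the generator plus its seed is an exact representation of the whole sequence and is dramatically smaller than the data it reproduces:

    import random

    def generate(seed, n=1_000_000):
        # The "program that generates the data": a seeded PRNG drawing
        # i.i.d. ASCII symbols from a fixed (assumed) skewed distribution.
        probs = [0.5] + [0.5 / 127] * 127
        rng = random.Random(seed)
        return bytes(rng.choices(range(128), weights=probs, k=n))

    data = generate(seed=42)

    # If the data really was produced this way, the pair (program, seed)
    # is a complete lossless "compressed" form of all n bytes.
    assert generate(seed=42) == data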

answered Jun 16, 2017 at 19:38