Had an idea - looking for a math buff to tell me if it's possible with today's technology.

Adrian Chadd adrian at creative.net.au
Fri May 20 02:39:41 UTC 2011

On Thu, May 19, 2011, Warren Kumari wrote:

> > Just wanted to say yes, this is entirely what I meant.  Of course the
> > smaller the file the more pointless it gets but still...  If the file was
> > 1GB instead of just 7 bytes I'm wondering if a regular old workstation could
> > put it back together in any reasonable amount of time with the equation.
> While many folk have said "You've just invented compression", I'm going to be a little more specific -- "Wavelet compression".

Well, yes. There are other types of function-driven compression besides
dictionary-driven compression (which is itself just function-driven
compression :-), e.g. iterated function systems.

The problem is finding a method that works for a variety of data. From what I
understand, (lossless) wavelet compression isn't fantastic for arbitrary types
of data.
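(Not from the original thread, but a quick way to see the "variety of data" problem: a general-purpose lossless compressor does well on structured input and gains essentially nothing on random input, whose entropy is already maximal. A minimal sketch using Python's zlib:)

```python
import os
import zlib

# 1 MB of incompressible (random) bytes vs. 1 MB of highly
# structured bytes (a repeating 3-byte pattern).
random_data = os.urandom(1_000_000)
structured_data = b"abc" * 333_334  # ~1 MB

random_c = zlib.compress(random_data)
structured_c = zlib.compress(structured_data)

# The random input barely shrinks (it may even grow slightly),
# while the structured input collapses to a few kilobytes.
print(len(random_c), len(structured_c))
```

The exact sizes vary by compression level, but the gap is dramatic: no lossless scheme can shrink all inputs, only inputs with exploitable structure.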

I'd suggest the original poster pull up some introductory literature on
information theory and compression techniques in general. Heck, even the
Wikipedia article on lossless compression is a good starting point.

I think once the original poster understands some of the basics of information
theory and coding as they relate to representing, say, 1 GB from 7 bytes as
given above, they may be better equipped to ask more specific (and useful!)
questions.
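(An addition for illustration, not from the original post: the core obstacle is a simple counting, or pigeonhole, argument. There are only 2^56 distinct 7-byte strings, but 2^(8×10^9) distinct 1 GB files, so no lossless scheme can give every 1 GB file its own 7-byte representation. A sketch of the arithmetic:)

```python
# Pigeonhole argument: count distinct strings of each size in log2
# (counting them directly would be an ~2.4-billion-digit number).
bits_in_7_bytes = 8 * 7        # 56 bits -> 2**56 possible outputs
bits_in_1gb_file = 8 * 10**9   # 8e9 bits -> 2**(8e9) possible inputs

# Shortfall: each 7-byte code would have to stand in for
# 2**(8e9 - 56) different 1 GB files on average.
shortfall_bits = bits_in_1gb_file - bits_in_7_bytes
print(shortfall_bits)  # 7999999944
```

Any scheme that appears to beat this must either be lossy, or be exploiting shared context (e.g. a function both sides already agree on), which is just the codebook moved elsewhere.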



More information about the NANOG mailing list