Had an idea - looking for a math buff to tell me if it's possible with today's technology.
aredridel at nbtsc.org
Wed May 18 15:35:17 CDT 2011
On Wednesday, May 18, 2011 at 2:18 PM, Dorn Hetzel wrote:
> On Wed, May 18, 2011 at 4:07 PM, Landon Stewart <lstewart at superb.net> wrote:
> > Let's say you had a file that was 1,000,000,000 characters consisting of
> > 8,000,000,000 bits. What if instead of transferring that file through the
> > interwebs you transmitted a mathematical equation to tell a computer on the
> > other end how to *construct* that file. First you'd feed the file into a
> > cruncher of some type to reduce the pattern of 8,000,000,000 bits into an
> > equation somehow. Sure this would take time, I realize that. The equation
> > would then be transmitted to the other computer where it would use its
> > mad-math-skillz to *figure out the answer* which would theoretically be the
> > same pattern of bits. Thus the same file would emerge on the other end.
> > The real question here is how long would it take for a regular computer to
> > do this kind of math?
> The real question is whether this is possible. And the short answer is no,
> at least not in general.
Exactly. What you run up against is that you can strip out extraneous information and compress redundant information, but if what you actually have is dense information, you're not gonna do any better.
So it's easy to compress a billion bytes of JSON or XML significantly; not so much a billion bytes of an already tightly coded movie.
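That difference is easy to see with any general-purpose compressor. A quick sketch using Python's zlib, comparing highly redundant JSON-ish text against random bytes standing in for already tightly coded data:

```python
import os
import zlib

# Highly redundant data: the same JSON-ish line repeated over and over.
redundant = b'{"key": "value", "flag": true}\n' * 32000

# Dense data: random bytes, a stand-in for an already tightly coded movie.
dense = os.urandom(len(redundant))

small = zlib.compress(redundant)
big = zlib.compress(dense)

print(len(redundant), len(small))  # redundant input shrinks dramatically
print(len(dense), len(big))        # random input doesn't shrink at all
```

The redundant input collapses to a small fraction of its size, while the random input comes out at least as large as it went in (zlib adds a few bytes of framing overhead).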