> I am not saying that any mathematical formula or metaphysical
> phenomenon will compress 16 into 12 bits, but also, I always
> remember that the word that least crossed the minds of people like
> Einstein, Edison, von Braun, Copernicus, Dumont, Bell, Schottky,
> Da Vinci, and so many others no less important, was "impossible".
Sorry Wagner, wrongly recalled. Einstein for example pointed out just
that; it is *impossible* for an object to travel faster than the speed
of light. What set these people apart was not a dogma that "nothing is
impossible", but a deep understanding of the nature of possibilities and
impossibilities.
> When PC hard disks were expensive and programs needed more and more
> space, we invented the doublespace, doubledisk or other software
> solution to compress data. It was a fantastic technique,
It was dicky. There's no recovery from a compressed volume with a bad
sector.
> it was "standard" in some Microsoft DOS versions,
If it was, you had to remove it immediately!
> But then, everyone forgot it when HD manufacturers found an almost
> "impossible" way to reduce HD prices.
*Everyone* forgot it after the first crash!
Byron A Jeff wrote:
> V90 and X2 56K modems require special equipment on the ISP end IIRC
We are talking exclusively of digital exchanges here (the vast
majority of them nowadays); analogue connections have no such
capability.
It appears the problem is that the ADC end of the connection cannot
cope with the output from a conventional modem of this speed, so you
need to be able to feed the digital datastream directly to the exchange
(which, if I understand correctly, runs at 64 kilobits per second, so
there is *no way* to push more data down a standard telephone line).
The SLIC copes with this at 56K, converts it to analog for the line
and your modem recovers it. You can thus receive data at (maybe) 56K
from the ISP, but of course can't *send* it back at that rate.
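Those figures can be sanity-checked with a little arithmetic. This is a simplification: real V.90 modems lose some capacity to robbed-bit signalling and quantisation spacing, so "7 usable bits per sample" is only an approximation of why 56K rather than 64K comes out of the downstream channel.

```python
# Back-of-the-envelope check of the PCM channel limits described above.
SAMPLE_RATE = 8000      # telephone exchange PCM samples per second
BITS_PER_SAMPLE = 8     # one octet per sample on the digital trunk

trunk_rate = SAMPLE_RATE * BITS_PER_SAMPLE  # the 64 kbit/s digital channel
v90_downstream = SAMPLE_RATE * 7            # ~7 usable bits/sample downstream

print(trunk_rate)       # 64000
print(v90_downstream)   # 56000
```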
> The only way to get out of this is with compression. ... However if
> symbols occur with different frequency, then you can encode
> frequently occurring symbols with fewer than 12 bits and others
> with more than 12 bits.
Compression is statistical. Random data cannot be compressed. Just
talking round in circles apropos the original problem - you can encode
four decimal digits into 12 bits iff (if and only if) you already have
the knowledge that a certain proportion of states (5904 to be exact)
will *never* occur.
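The counting argument above is easy to verify: four decimal digits have 10000 combinations, 12 bits index only 4096 of them, so exactly 10000 - 4096 = 5904 combinations would have to be known never to occur. A minimal sketch:

```python
import math

combinations = 10 ** 4     # four decimal digits: 0000..9999
codes_12bit = 2 ** 12      # distinct values representable in 12 bits

# A fixed-length code covering all 10000 combinations needs 14 bits.
bits_needed = math.ceil(math.log2(combinations))

print(codes_12bit)                  # 4096
print(combinations - codes_12bit)   # 5904 values that could never be encoded
print(bits_needed)                  # 14
```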
Paul B. Webster VK2BZC wrote:
> Sorry Wagner, wrongly recalled. Einstein for example pointed out just
> that; it is *impossible* for an object to travel faster than the speed
> of light...
This may be the reason why nobody invested considerable amounts of $ in
it: somebody important said it was "impossible". It was one of those
famous "mistakes", but he was smart to say "object", because it involves
mass; in his theory it requires an infinite amount of energy to do that,
but this energy is relative to the comparative speed of other particles.
So, is it "impossible" to isolate an electron and stop its movement, and
then accelerate it again to the speed of light, because it would take an
infinite amount of energy in both steps, is that right? Is this why a
domestic-size cold fusion reactor is "impossible" as a way to generate
energy? So the impossibility is in the theory.
By the way, none of those persons were perfect, and one also said the
Sun was the center of the universe. But at least they tried, and that is
why they are remembered, mostly because of the pressure around them "to
not do it".
Isn't there something named "muon", a particle in the atom that travels
faster than the electron? I don't recall it exactly.
Remember Josephson technology? Everybody said it was "impossible" to
make electrons travel faster than their normal speed in an electric
conductor.
Remember Dick Tracy and his portable phone, and then Star Trek with
warp engines, folding the speed of light; both were a vision, a dream to
pursue.
The "impossible" never existed before somebody thought about it; it
doesn't exist in the universe, only in our minds.
"Impossible" was going to the Moon, or terraforming Mars within the
next 400 yrs.
The "impossible" word needs to be rewritten as "difficult", so I agree
that it is
"very difficult" to compress 10000 combinations in only 12 bits.
We will talk about that again in 500 years. :)
bit : 2 states, 3 letters
trit: 3 states, 4 letters
They can't even make things compatible anymore, and want to travel at
the speed of light... tsk tsk... I think it is easier to teach a kid to
do math counting in binary on the fingers of both hands, up to 1024
values, instead of the usual decimal ten.
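As an aside on those trits: each one carries log2(3) ≈ 1.585 bits of information, so the 10000 four-digit combinations that need 14 bits would fit comfortably in 9 trits (3^9 = 19683). A quick check:

```python
import math

# Information content of one ternary digit, in bits.
bits_per_trit = math.log2(3)

# How many trits would hold all 10000 four-digit combinations?
trits_needed = math.ceil(math.log(10_000, 3))

print(round(bits_per_trit, 3))   # 1.585
print(trits_needed)              # 9
print(3 ** trits_needed)         # 19683, comfortably >= 10000
```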
Since nobody found a solution for the subject, that's enough talk about
it.
The "impossible" word needs to be rewritten as "difficult", so I
agree that it is "very difficult" to compress 10000 combinations
in only 12 bits.
We will talk about that again in 500 years. :)
Look, this isn't some real-world poorly-understood physical phenomenon
that we're talking about. This is as close as you're likely to get to
PURE mathematics - a science defined by its own assumptions, and
entirely abstract. You'll be able to fit 10000 combinations in 12 bits
about the same time you'll be able to make 2 times 2 equal 5, sometimes.
Want to talk about fitting 10000 combinations into 12 memory cells?
Fine! But that's an entirely different animal. (And you can do it
today, using Intel's "StrataFlash" technology.)
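The distinction matters because a memory *cell* need not hold one bit: a multi-level cell like StrataFlash stores one of four charge levels (2 bits) per cell, so 12 such cells distinguish 4^12 states, vastly more than the 10000 needed. A quick check, assuming 2 bits per cell:

```python
# Multi-level flash: four charge levels = 2 bits per cell,
# so n cells distinguish 4**n states.
levels_per_cell = 4
cells = 12

states = levels_per_cell ** cells
print(states)             # 16777216
print(states >= 10_000)   # True: 12 cells easily cover 10000 combinations
```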
> The "impossible" word needs to be rewritten as "difficult", so I agree
> that it is
> "very difficult" to compress 10000 combinations in only 12 bits.
> We will talk about that again in 500 years. :)
The difference is that we don't "observe" bits and bytes and "make models"
that clear up how we think bits behave, but could be in error. We have
DEFINED the bits, what they are and how to calculate with them. Then
we built computers and microcontrollers to implement that mathematical
bit universe physically.
As long as the implementation does not break the definition (and that
we try to avoid by testing the chips and considering those "damaged" that
don't behave like spec'ed), there is no possibility to overcome the
mathematical limitations. Those are INTENTIONALLY designed in! (While
it is always desirable to have more memory, it seldom is to have
more memory that doesn't work in a deterministic fashion in return.)
If at some time we happen to build "computers" based on things we just
FIND laying around somewhere and think we know enough about them to
make them part of a useful machine (eg living bio structures similar to
human brains), things may change. We'd THEN not have "defined" the
behaviour but EXAMINED it, and may have made mistakes or not OBSERVED
everything relevant.
With such a (future) "machine", yes, we might possibly become surprised
about unknown and hidden "features" or "bugs" that were previously
considered "impossible" to happen.
But that's not a PIC with 12 bits of spare RAM left to store 14 bits'
worth of information :-)