Newsgroups: sci.crypt
Path: cactus.org!cs.utexas.edu!uunet!mnemosyne.cs.du.edu!nyx10!colin
From: colin@nyx10.cs.du.edu (Colin Plumb)

Subject: Re: Block Mixing Transformations
Message-ID: <1994Mar16.205036.1806@mnemosyne.cs.du.edu>
X-Disclaimer: Nyx is a public access Unix system run by the University
+             of Denver for the Denver community.  The University has neither
+             control over nor responsibility for the opinions of users.
Keywords: DES replacement, Large blocks
Sender: usenet@mnemosyne.cs.du.edu (netnews admin account)
Organization: Nyx, Public Access Unix at U. of Denver Math/CS dept.
References: <1994Mar13.051515.27175@cactus.org> <1994Mar16.010710.
+           4943@mnemosyne.cs.du.edu> <1994Mar16.093035.1330@cactus.org>
Date: Wed, 16 Mar 94 20:50:36 GMT
Lines: 276

In article <1994Mar16.093035.1330@cactus.org>,
Terry Ritter wrote:
>
> In <1994Mar16.010710.4943@mnemosyne.cs.du.edu> colin@nyx10.cs.du.edu
> (Colin Plumb) writes:
>
>
>>Okay, time to come clean: I get a bit annoyed when I see a fancily-formatted
>>"report" that might fool some people when such glaring points as this are
>>wide open.  It seems enormously pretentious and makes me think poorly
>>of the author.  Sorry if it's unkind, but it's true that a part of
>>my mind is saying "I wish Terry would quit posting his `clever' ideas
>>until he learns thing one about cryptanalysis."
>
> Frankly, I think this says more about Colin Plumb than Terry Ritter.
>
> When Plumb publishes a practical cryptosystem which he can *prove*
> secure, then I'm sure the rest of us will take some notice.  Until
> then, we all must learn to deal with--and depend upon--not only
> proposals, but actual fielded systems of unknown strength.
>
> Frankly, I think the *practice* of Science--and especially
> cryptography--is benefitted more from the exposure of reasonable
> wrong approaches than right ones.  Nobody comes up with only great
> ideas; to publish only after a great one is finally found and
> confirmed is to hide what the work of Science actually is.  The
> work is the process, not the result.  There will always be another
> result.
>
> And, sci.crypt is not a refereed publication.  I do not expect
> to treat it like one.

I didn't mean to be insulting.  I was feeling a little bit frustrated and
trying to express it without being vicious.  I read my words again and
I still think I hedged them enough that it's not a personal attack.
Am I wrong?


>>Yes, it's called polyalphabetic substitution and ciphertext-only attacks
>>are trivial.
>
> Well, the term "polyalphabetic substitution" is normally applied
> to stream ciphers instead of block ciphers.  To apply it here,
> we must already have come to the conclusion that the mixing is
> so ineffective that individual characters are enciphered--through
> the entire cipher--by independent substitutions.  This seems
> unlikely, but in any case it has not been shown.  Thus, the
> term is inappropriate.

I was just pointing out that key size is NOT a good measure of cipher
strength.  As long as the best attack is brute force, 128 bits is
enough for a good many years, and 256 bits should be sufficient for a
very long time indeed.  Key size in excess of this is a waste of
storage space, and ciphers with much larger key sizes can still be
easy to break.  I get people asking all the time whether RSA is
stronger than IDEA because its key size is so much larger.  The
answer, of course, is no; I believe it was Paul Leyland who estimated
that a 128-bit IDEA key is about as much work to break as a 3100-bit
RSA key.  (Using presently-known algorithms.)
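
To put "a good many years" in perspective, here is a quick sketch (my
own illustration, not Leyland's calculation; the trial rate is an
arbitrary assumption):

    #include <stdio.h>
    #include <math.h>

    /* Rough brute-force cost; an illustration only.  The assumed
     * rate of 10^12 key trials per second is generous. */
    int main(void)
    {
        double rate = 1e12;              /* assumed key trials/second */
        double year = 365.25 * 86400.0;  /* seconds per year */
        int bits;

        for (bits = 56; bits <= 256; bits += 8) {
            /* On average you search half the keyspace. */
            double t = pow(2.0, bits - 1) / rate / year;
            printf("%3d-bit key: ~%.1e years\n", bits, t);
        }
        return 0;
    }

Even at that rate, a 128-bit keyspace costs on the order of 10^18
years to search, which is why a larger key only buys anything once
there is an attack better than brute force.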

> The whole point of my proposal is to separate the mixing from the
> strength.  A cipher with extremely fast, weak mixing may well be
> preferable to the slow form Plumb proposed, which turns out in fact
> to be 20-50 times slower. (!!!!)

As I said, I didn't realize you were changing the width of the mixing;
small-width multiplication modulo a polynomial can be optimized easily.
I made a wrong assumption based on your discussion of FFT patterns.
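
For instance, here is the sort of small-width optimization I had in
mind: GF(2^8) multiplication by shift-and-XOR (a sketch of mine; the
modulus x^8+x^4+x^3+x+1 is just one convenient irreducible degree-8
polynomial, not necessarily the one in Terry's design):

    #include <stdio.h>
    #include <stdint.h>

    /* Multiply in GF(2^8): polynomials over GF(2) reduced modulo
     * x^8+x^4+x^3+x+1.  The constant 0x1b is what x^8 reduces to,
     * i.e. x^4+x^3+x+1. */
    static uint8_t gf_mul(uint8_t a, uint8_t b)
    {
        uint8_t r = 0;
        while (b) {
            if (b & 1)
                r ^= a;          /* add (XOR) this multiple of a */
            b >>= 1;
            /* a *= x, reducing whenever a degree-8 term appears */
            a = (uint8_t)((a << 1) ^ ((a & 0x80) ? 0x1b : 0x00));
        }
        return r;
    }

    int main(void)
    {
        printf("0x57 * 0x83 = 0x%02x\n", gf_mul(0x57, 0x83));
        return 0;
    }

At 8 bits you can go further and precompute log/antilog tables, which
is why narrow multiplies are so cheap.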

> Think about it:  Using the fast mixing I proposed, one can afford
> to make a block cipher with perhaps 20 times as many stages--and 20
> times as many unique substitution tables--as one with Plumb's
> "improved" mixer.  I would say that there is at least a reasonable
> chance that the cipher with the fast mixer--having up to 20 times
> as many stages--would be a stronger cipher.  And if it could be
> better analyzed for strength, use fewer stages and operate faster
> as well, so much the better.

And I said that I didn't think cascading stages of your mixer would
help that much.  Thinking about it, I can actually see the contrary
view, but my basic idea was that an 8-bit S-box can be represented
as a system of boolean equations of degree 8.  The mixing network
is linear.  So three stages of S-boxes make for a degree-24 system
of equations.  Big and awkward, to be sure, and I wish I knew more
about how much work that is to solve, but it doesn't seem ridiculous.
(Note that the fact that these are over GF(2) simplifies things
greatly; I know that solving a degree-24 system of equations over the
reals is a royal pain in the ass.)
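
To make "a system of boolean equations" concrete: each output bit of
an S-box is an XOR of ANDs of input bits (its algebraic normal form),
and a fast Moebius transform reads those equations straight out of the
table.  A sketch, using an arbitrary little 3-bit S-box of my own
invention:

    #include <stdio.h>
    #include <stdint.h>

    #define N    3               /* input bits; 8 for a real S-box */
    #define SIZE (1 << N)

    int main(void)
    {
        /* An arbitrary 3-bit example, not from any real cipher. */
        static const uint8_t sbox[SIZE] = { 6, 5, 1, 0, 3, 2, 7, 4 };
        uint8_t anf[SIZE];
        int bit, i, j, k, first;

        for (bit = 0; bit < N; bit++) {
            /* Extract the truth table of this output bit. */
            for (i = 0; i < SIZE; i++)
                anf[i] = (uint8_t)((sbox[i] >> bit) & 1);

            /* Fast Moebius transform, in place: afterwards anf[m] is
             * 1 iff the AND of the input bits named by m appears in
             * the XOR sum for this output bit. */
            for (i = 1; i < SIZE; i <<= 1)
                for (j = 0; j < SIZE; j++)
                    if (j & i)
                        anf[j] ^= anf[j & ~i];

            printf("y%d =", bit);
            for (first = 1, j = 0; j < SIZE; j++) {
                if (!anf[j])
                    continue;
                printf(first ? " " : " + ");   /* "+" means XOR */
                first = 0;
                if (j == 0)
                    printf("1");
                for (k = 0; k < N; k++)
                    if (j & (1 << k))
                        printf("x%d", k);
            }
            printf("\n");
        }
        return 0;
    }

For an 8-bit S-box the same transform just runs over 256 entries per
output bit.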

>>This is immediately recognizable as Pascal's triangle, but serves to
>>illustrate the fact that a 1-bit change in the input is slow to propagate.
>
> One of the characteristics I mentioned for the fast mixing function
> was the fact that a single bit-change in *either* input block would
> produce at least (and probably) a bit change in *both* output
> blocks.  A modest propagation, true, but it rolls through 61 mixing
> operations per stage.  The reason for cascading the mixings is to
> spread the various changes throughout the field of substitution
> inputs.

For that claim, I agree you achieved it, although I'd make it simpler
by choosing the reducing polynomial to be x^n+1, so you just
exclusive-or, rotate, and merge into the two inputs.
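
Concretely, something like the following sketch.  I am assuming a
mixer of the shape X = 3a + 2b, Y = 2a + 3b over GF(2)[x], which is
how I read the proposal; the choice of 32-bit blocks and of x^32+1 as
the modulus is mine.  Mod x^32+1, multiplying by x is a one-bit
rotate, so the whole thing is a few XORs and rotates:

    #include <stdio.h>
    #include <stdint.h>

    static uint32_t rotl(uint32_t v)
    {
        return (v << 1) | (v >> 31);
    }

    /* X = 3a + 2b, Y = 2a + 3b over GF(2)[x] mod x^32+1.  "2a" is
     * a*x, i.e. rotl(a); "3a" is a*(x+1), i.e. rotl(a)^a.  The matrix
     * [[3,2],[2,3]] squares to the identity over GF(2)[x], so this
     * routine is its own inverse. */
    static void mix(uint32_t *a, uint32_t *b)
    {
        uint32_t x = rotl(*a) ^ *a ^ rotl(*b);
        uint32_t y = rotl(*a) ^ rotl(*b) ^ *b;
        *a = x;
        *b = y;
    }

    int main(void)
    {
        uint32_t a = 0, b = 1;          /* one-bit difference */
        mix(&a, &b);
        printf("mixed:   %08lx %08lx\n", (unsigned long)a, (unsigned long)b);
        mix(&a, &b);                    /* self-inverse: undoes it */
        printf("unmixed: %08lx %08lx\n", (unsigned long)a, (unsigned long)b);
        return 0;
    }

Note that the one-bit input difference shows up in both outputs, which
is the propagation property claimed.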

It is a valid argument that it makes no sense to mix at a granularity
finer than the width of your S-boxes, so all you really need is for a
change to either input byte to affect both output bytes.

(Where the number 61 comes from is a mystery to me.  I thought this was
a discussion of a cipher building block; the number of times it
gets used in any given cipher is a different issue.  And in any case,
it's the number of *levels* of mixing that is relevant to thoroughness.)

>>Cryptography is harder than cryptanalysis.
>
> Not in my experience.

A good code is one that is resistant to cryptanalysis.  If we define
cryptography to be the design of good codes (I agree that any boy scout
can construct a bad one), then it is defined in terms of cryptanalysis.
You can't do it without understanding cryptanalysis.

I agree that practical cryptanalysis is an extremely painstaking job,
and that demonstrating (through theoretical arguments) resistance to a
given cryptanalytic attack may take less time than conducting the
attack would if the cipher were not resistant, but you still have to
understand the attack.  Maybe you can have a non-constructive
understanding of an attack which lets you know when it applies without
knowing how to apply it, but I believe the difference is small at
present.

>>Cryptanalysis (of modern ciphers)
>>is extremely difficult.  I'm not qualified to do either, although I'm
>>slowly working my way through the cryptanalytic literature in an attempt
>>to develop a modicum of skill.
>
> Interesting, then, that Plumb feels so capable of judgement.

Success at cryptanalysis is obvious.  While special skills help,
their lack does not reduce the value of a successful attack.  Failure
is also obvious.  This clear visibility of success and failure is why
the difficulty of cryptanalysis is generally appreciated.

When I say I'm not qualified to do cryptanalysis, I mean that if you
want a cipher attacked, I'm not a good person to ask; I can only
break the most trivial of ciphers.  However, if I can break it,
you can be assured it's a bad cipher.

>>If you look at papers proposing ciphers, you'll see that the onus is
>>on the proposer to show that a number of known methods of cryptanalysis
>>are ineffective.
>
> If someone were to actually read my Block Mixing article, s/he
> would see that it was concerned with the existence of such
> structures, and with one fast structure in particular.
>
> That structure works.  It reversibly mixes without expansion.
> The article is not in error.  I did expect the mixing to be
> stronger than it turned out to be, but "strength" was specifically
> disclaimed as an issue.
>
> The article was not concerned with overall cryptosystems, other
> than to explain why one might want a Block Mixing structure in
> the first place, and to propose some cipher constructs which
> might be worth investigating.

And I observed that your proposed structures are not worth much.
Those are the "clever" ideas I object to.  Your block mixer achieves
its stated goals and is, I think, a useful contribution.
However, your attempts to use it in a block cipher suggested to me
that you didn't understand its limitations.

> Since the article was specifically limited to detailed discussion
> of the *mixing*, it does seem a little odd that it would be
> criticized for not being a proven cipher.

You're right; my criticisms were directed towards your block cipher
suggestions and not your mixer per se.  My criticisms of the mixer
were based on "if you want to use it in that structure, you need
a stronger mixer."  I sort of got the issue turned on its head.

>>"Show" does not mean "I couldn't do it", but "Here
>>is a proof that nobody can do it."
>
> Ah, then we must *have* practical cryptosystems which are proven
> secure.  Which ones would those be, exactly?

I don't think we have any, except the one-time pad, that are proven
secure.  The next best are probably the BBS random-number generator
and other public-key systems provably equivalent to factoring.
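
For reference, BBS is just iterated squaring modulo a Blum integer,
with low-order bits as output.  A toy sketch (the tiny primes and seed
are mine, purely for illustration; the security argument needs a
modulus far too large to factor):

    #include <stdio.h>
    #include <stdint.h>

    /* Toy Blum-Blum-Shub generator.  p and q must be primes that are
     * 3 mod 4; these small values only demonstrate the mechanics. */
    int main(void)
    {
        uint64_t p = 499, q = 547;       /* both congruent to 3 mod 4 */
        uint64_t n = p * q;
        uint64_t x = 399 * 399 % n;      /* seed, squared mod n */
        int i;

        for (i = 0; i < 32; i++) {
            x = x * x % n;               /* x <- x^2 mod n */
            putchar('0' + (int)(x & 1)); /* output the low bit */
        }
        putchar('\n');
        return 0;
    }

The point is that predicting its output has been shown to be as hard
as factoring n; nothing comparable is known for conventional block
ciphers.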

There are a great many ciphers provably (or with very good arguments)
resistant to specific attacks, such as linear and differential
cryptanalysis.  That is about the best that can be expected: you
examine several attacks, and argue that they are not effective.

I know of no theory which applies to *all* attacks and provides a useful
lower bound on the work to break (obviously Shannon's theorem
applies, but computing power isn't a factor there).

> Postings to sci.crypt are not, and should not be considered to be,
> scientific publications.  We need room to speculate, room to fail
> without career implications, and room to consider alternative
> approaches that your PhD advisor may not like.  One of these may
> lead to better solutions.  Or not.

> We have a lot of people laboring in secret because they are afraid
> to say anything and get the kind of response we see from Plumb.
> We are all poorer for it; those who read because we don't read the
> new ideas, and those who don't write, because it is through error
> that they learn more than they know.

Huh?  I was harsh?  I sure didn't mean to be.  I thought I was improving
the signal-to-noise ratio by demonstrating avenues of cryptanalysis.
As the FAQ says, it's hard work, which is why so many pet ciphers that
get proposed here go unanalyzed, even though they are weak.

So many things here offer no opportunity for learning, because the
success or failure of a proposed cipher is never evaluated.  I don't
think "hey, this is rubbish, and here's why" is a bad thing.  I
did think most of my observations were obvious enough that you could
have spotted them yourself before announcing the ideas to the world.

>>One cipher that might be worth exploring is one based on an FFT-like mixing
>>pattern (all butterflies the same size), with S-boxes after each round of the
>>FFT.  But if you were to stick to 8-bit mixing, you could merge the mixing
>>function with the previous S-box by having each of the two input S-boxes
>>to the mixer emit 16 bits, which are XORed and the two halves used as outputs. 
>>And that would permit much more complex mixing functions with no increase
>>in cost.
>
> It would certainly be a different cipher.  But it would be moving in
> the opposite direction from my proposal, which specifically intends
> to separate the concepts of "strength" and "mixing" each into their
> own particular mechanism.

Pardon me if I misunderstood, but I thought that if a mixing operation
did not enhance the strength of a cipher, it should be omitted.  (Like
DES's initial and final permutations.)

Separating out the nonlinearity, maybe.  Separating (non-linear)
substitution from (linear) permutation is very common, and your mixing
is a variant on the permutation theme.  But that separation need only
be for analysis purposes; in a software implementation, they can be
combined for efficiency.
I was pointing out that given a certain implementation choice, there
are a large number of equally efficient permutation functions.
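
Here is the combination I meant, as a sketch of mine (the table filler
below is a placeholder, not a real keyed S-box): each input byte
indexes a table that emits 16 bits, the two results are XORed, and the
halves become the two outputs, so whatever mixing is folded into the
tables comes free with the lookups.

    #include <stdio.h>
    #include <stdint.h>

    /* Two 8-bit-in, 16-bit-out S-boxes.  How they are filled is up
     * to the key schedule; this filler just keeps the example
     * self-contained and is NOT a usable S-box. */
    static uint16_t sbox_a[256], sbox_b[256];

    static void demo_fill(void)
    {
        int i;
        for (i = 0; i < 256; i++) {
            sbox_a[i] = (uint16_t)((i * 0x0101) ^ 0x5aa5);
            sbox_b[i] = (uint16_t)((i * 0x0103) ^ 0x3cc3);
        }
    }

    /* One combined substitute-and-mix step: two bytes in, two out.
     * The 16-bit entries absorb what would otherwise be a separate
     * mixing layer. */
    static void sub_mix(uint8_t *x, uint8_t *y)
    {
        uint16_t t = (uint16_t)(sbox_a[*x] ^ sbox_b[*y]);
        *x = (uint8_t)(t >> 8);          /* high half */
        *y = (uint8_t)(t & 0xff);        /* low half */
    }

    int main(void)
    {
        uint8_t a = 0x12, b = 0x34;
        demo_fill();
        sub_mix(&a, &b);
        printf("%02x %02x\n", a, b);
        return 0;
    }

Whether the combined 16-bit map is invertible, as a block cipher
needs, depends entirely on how the tables are built; a real design has
to arrange that.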

> One of the major problems with modern cipher design is the inability
> to *measure* "strength."  It is very easy to say "this mixer is
> nonlinear so it must be stronger."  It is easy to *say*, but not so
> easy to prove.  And very few academic papers even begin to approach
> the idea that "stronger" may not be worth the expense, if it takes
> 20 times as long.

Obviously, you have to compare the strength of such a system with that
of the weaker system iterated 20 times, and choose whichever is
stronger.

>>But remember, these are all off-the-cuff ideas; if I were to start work
>>now and not find any holes after 6 months of effort, it might be worth
>>considering using them to protect data.
>
> I certainly am not using any of the many schemes I suggested in
> the article to protect data.
>
> However, I have not seen a sufficient problem with it to prevent
> continued investigation in that direction.

Maybe his opinion has changed by now, but last year Eli Biham was
nervous about IDEA because it only had 8 rounds; he would have
preferred 12 or 16.  He also criticized many large-block ciphers
because they do not give things enough time to mix thoroughly.  Now,
illustrating how your proposed mixer can mix n bytes in log2(n) rounds
is useful for addressing this concern.  (Although I have yet to see a
convincing argument, except for efficiency, for using larger block
sizes, and I haven't seen a cipher that didn't sacrifice either
efficiency or security when going to a larger block size.)
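
The log2(n) claim is easy to see in code.  A sketch of the FFT-like
pattern (mine; it reuses the self-inverse 3a+2b, 2a+3b mixer on bytes
mod x^8+1, and in a real cipher a layer of S-boxes would sit between
the levels):

    #include <stdio.h>
    #include <stdint.h>

    #define N 16                 /* block size in bytes, a power of 2 */

    static uint8_t rotl8(uint8_t v)
    {
        return (uint8_t)((v << 1) | (v >> 7));
    }

    /* Pairwise mixer: X = 3a + 2b, Y = 2a + 3b over GF(2)[x] mod
     * x^8+1; self-inverse, so it is its own unmixer. */
    static void mix2(uint8_t *a, uint8_t *b)
    {
        uint8_t x = (uint8_t)(rotl8(*a) ^ *a ^ rotl8(*b));
        uint8_t y = (uint8_t)(rotl8(*a) ^ rotl8(*b) ^ *b);
        *a = x;
        *b = y;
    }

    /* FFT-like butterfly pattern: after the level with stride 2^k,
     * every byte has mixed with a partner 2^k away, so a change in
     * any one input byte reaches all N bytes in log2(N) levels. */
    static void mix_block(uint8_t blk[N])
    {
        int stride, i;
        for (stride = 1; stride < N; stride <<= 1)
            for (i = 0; i < N; i++)
                if ((i & stride) == 0)
                    mix2(&blk[i], &blk[i + stride]);
    }

    int main(void)
    {
        uint8_t blk[N] = { 1 };          /* a single one-bit change */
        int i;
        mix_block(blk);
        for (i = 0; i < N; i++)
            printf("%02x ", blk[i]);     /* every byte is now nonzero */
        printf("\n");
        return 0;
    }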
-- 
	-Colin