SEC webpages inaccessible due to Firefox blocking servers with weak DH ciphers

Alexander Maassen outsider at scarynet.org
Fri Jul 17 20:50:12 UTC 2015


(Sorry Michael for the duplicate, forgot to press reply all :P)

No problem with making the web more secure, but in cases like this I think it
would have been better if you could set this behaviour per site, the same as
with 'invalid/self-signed certs'. And in some cases, vendors use weak ciphers
because they also use fewer resources. Everyone who has a DRAC knows
about its sluggish performance.
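For anyone hitting this right now, the usual stopgap (browser-wide, not per
site, unfortunately) is to disable the DHE suites in about:config so Firefox
negotiates a non-DH key exchange with the device instead. If I remember right
the prefs are:

```
// about:config (or user.js) -- turning these off stops Firefox from
// offering DHE suites, so it falls back to plain RSA key exchange
// instead of tripping over the server's weak DH group.
// NOTE: applies to every site, not just the DRAC.
user_pref("security.ssl3.dhe_rsa_aes_128_sha", false);
user_pref("security.ssl3.dhe_rsa_aes_256_sha", false);
```

Obviously that weakens things everywhere, which is exactly why a per-site
exception would be the better design.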

Another drawback of the DRACs is that they are HTTPS only, and you cannot
turn that behaviour off. I guess the only options there would be to build
your own interface on top of the telnet/SNMP interface (which is probably
less secure than SSLv3), or some form of SSLv3 <-> strong cipher proxy.
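Something like nginx can act as that cipher-bridging proxy in a few lines.
A minimal sketch, assuming the DRAC answers on 192.0.2.10:443 (hypothetical
address) and that your nginx/OpenSSL build still permits SSLv3 on the
upstream side:

```nginx
# Browser-facing side: modern TLS only.
server {
    listen 8443 ssl;
    ssl_certificate     /etc/nginx/proxy.crt;
    ssl_certificate_key /etc/nginx/proxy.key;
    ssl_protocols       TLSv1.2;

    location / {
        # DRAC-facing side: speak the old protocol the DRAC requires.
        proxy_pass          https://192.0.2.10;
        proxy_ssl_protocols SSLv3;
        proxy_ssl_verify    off;   # DRAC certs are usually self-signed
    }
}
```

You'd point the browser at https://proxy-host:8443 and keep the DRAC itself
off any network the browser can reach directly.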

And needing to replace hardware that works perfectly fine for its intended
purpose, just because a browser refuses to connect to it and denies you the
option to make exceptions, sounds just like the well-known error 'Not enough
money spent on hardware'.

On Fri, July 17, 2015 9:14 pm, Michael O Holstein wrote:
>>making 99% of the web secure is better than keeping an old 1% working
>
> A fine idea, unless for $reason your application is among the 1% ..
> never mind the arrogance of the "I'm sorry Dave" sort of attitude.
>
> As an example .. we have a vendor who, in the current release (last 3
> months) still requires "weak" ciphers in authentication responses. That
> was mostly okay until another vendor (with more sense) wanted to auth
> the same way but only permitted strong ciphers.
>
> My $0.02
>
> Michael Holstein
> Cleveland State University


More information about the NANOG mailing list