On 28/05/17 23:56, Leon Fauster wrote:
On 28.05.2017 at 12:16, Robert Moskowitz rgm@htt-consult.com wrote:
On 05/28/2017 04:24 AM, Tony Mountifield wrote:
In article 792718e8-f403-1dea-367d-977b157af82c@htt-consult.com, Robert Moskowitz rgm@htt-consult.com wrote:
On 05/26/2017 08:35 PM, Leon Fauster wrote:
drops back to 30! for a few minutes. Sigh.
EPEL: yum install haveged
WOW!!!
installed, enabled, and started.
Entropy jumped from ~130 bits to ~2000 bits
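For anyone wanting to reproduce the numbers quoted above: the kernel exposes its current entropy estimate through procfs, so a one-liner shows the effect of installing haveged (values in the low hundreds on an idle VM are common without an entropy daemon).

```shell
# Print the kernel's current entropy pool estimate, in bits.
# Before haveged this is often ~100-200; after, typically 1000+.
cat /proc/sys/kernel/random/entropy_avail
```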
thanks
Note to anyone running a web server, or creating certs: you need entropy. Without it your keys are weak and attackable, and probably already known to attackers.
Interesting. I just did a quick check of the various servers I support, and noticed that all the CentOS 5 and 6 systems report entropy in the low hundreds of bits, but the CentOS 4 systems and the one old FC3 system all report over 3000 bits.
Since they were all pretty much stock installs, what difference between the versions might explain what I observed?
This is partly why so many certs found in the U of Mich study are weak and factorable. So many systems have inadequate entropy for generating the key pairs used in TLS certs. Worst are certs created in the firstboot process, where at times there is no entropy, yet firstboot still creates its certs.
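A hypothetical guard against the firstboot scenario described above: poll the kernel's entropy estimate and only proceed with key generation once it crosses a threshold. THRESHOLD and MAX_TRIES are illustrative values of my own choosing, not anything firstboot actually does.

```shell
# Wait until the pool reports at least THRESHOLD bits before
# generating key material; give up after MAX_TRIES so we never hang.
THRESHOLD=128    # illustrative, not a firstboot default
MAX_TRIES=30
tries=0
while [ "$(cat /proc/sys/kernel/random/entropy_avail)" -lt "$THRESHOLD" ] \
      && [ "$tries" -lt "$MAX_TRIES" ]; do
    sleep 1
    tries=$((tries + 1))
done
echo "entropy_avail=$(cat /proc/sys/kernel/random/entropy_avail) after $tries tries"
```

If the loop times out, a real script should refuse to generate keys rather than fall through, which is exactly the failure mode the study observed.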
/var/lib/random-seed and $HOME/.rnd are approaches to mitigate this scenario.
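A rough sketch of what the random-seed mitigation amounts to: the init scripts save pool-sized bytes at shutdown and feed them back into /dev/urandom at the next boot, so the pool never starts from zero. The paths and sizes below are demo stand-ins, not the distribution's actual values.

```shell
# Demo of the save/restore cycle behind /var/lib/random-seed.
SEED=/tmp/random-seed-demo   # stand-in for /var/lib/random-seed
POOL_BYTES=512               # 4096-bit pool on CentOS-era kernels
# "Shutdown" half: capture seed material for the next boot.
dd if=/dev/urandom of="$SEED" bs="$POOL_BYTES" count=1 2>/dev/null
chmod 600 "$SEED"            # seed material must not be world-readable
# "Boot" half (needs root on the real files, so shown as a comment):
# cat "$SEED" > /dev/urandom
ls -l "$SEED"
```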
-- LF
So there are mitigations. The real question is: why hasn't Red Hat made these mitigations the default in their enterprise products? Maybe there are other influences we are unaware of, but it seems like a huge hole. With the advent of SSL/TLS being mandated by Google et al., every device needs access to entropy.
CentOS mailing list CentOS@centos.org https://lists.centos.org/mailman/listinfo/centos