
Re: distributing SSH keys in a cluster environment



Martin F Krafft said on Fri, Oct 29, 2004 at 07:03:02PM +0200:
> As far as I can tell, there remains one problem: we use SSH
> hostbased authentication between the nodes, and while I finally got
> that to work, every machine gets a new host key on every
> reinstallation, requiring the global database to be updated. Of
> course, ssh-keyscan makes that easy, but people *will* forget to
> call it, and I refuse to automate the process because there is
> almost no intrusion detection going on, so that it would be trivial
> to gain access to the cluster with a laptop. As it stands,
> I kept the attack vector small with respect to the data stored on
> the cluster, physical security is good, and the whole thing is
> behind a fascist firewall anyway.
> 
> So what can I do about these SSH keys?
 
Very little.  I would use cfengine to push your ssh keys from your cfengine
host right after FAI.  You could, I suppose, let the nodes run FAI and
generate new keys, then have the master scp their correct keys out (ignoring
the temporary keys) and kick sshd.  This would avoid transmitting your private
keys in the clear, but wouldn't prevent someone from spoofing your newly
installed cluster node.
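A rough sketch of that scp-and-kick step, assuming the master keeps each
node's canonical key pair under a directory like /srv/hostkeys/<node>/ (the
key directory layout, node names, and init script path here are illustrative
assumptions, not details from the original mail):

```shell
#!/bin/sh
# Hypothetical sketch: after a node finishes FAI with a throwaway host key,
# push its canonical key pair back from the master and restart sshd.
push_hostkeys() {
    keydir=$1; shift
    for node in "$@"; do
        # Host checking is disabled for this one push, since the node is
        # currently presenting its temporary post-install key.
        $RUN scp -o StrictHostKeyChecking=no \
            "$keydir/$node/ssh_host_rsa_key" \
            "$keydir/$node/ssh_host_rsa_key.pub" \
            "root@$node:/etc/ssh/"
        $RUN ssh -o StrictHostKeyChecking=no "root@$node" \
            "chmod 600 /etc/ssh/ssh_host_rsa_key && /etc/init.d/ssh restart"
    done
}

RUN=echo   # dry run: print the commands; set RUN="" to really push
push_hostkeys /srv/hostkeys node01 node02
```

As noted, the window where host checking is off is exactly the spoofing
opportunity described above.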

However, I think this is your best shot for an unattended installation where
you care about the host keys.
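For reference, the ssh-keyscan step mentioned in the quoted mail can be
wrapped so a failed scan never clobbers the existing database; the output
path and node names below are illustrative assumptions:

```shell
#!/bin/sh
# Sketch: rebuild a global ssh_known_hosts from the nodes' current host keys.
refresh_known_hosts() {
    out=$1; shift
    # Write to a temp file first so a failed or partial scan never
    # truncates the live known_hosts database.
    ssh-keyscan -t rsa "$@" > "$out.new" && mv "$out.new" "$out"
}

# e.g. refresh_known_hosts /etc/ssh/ssh_known_hosts node01 node02 node03
```

The resulting file still has to be distributed to every node, e.g. via
cfengine, and running this automatically has exactly the trust problem
Martin describes.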

FYI: I use systemimager, which is rsync-based, so I just end up putting the
same ssh host key on every sim node in the cluster.  Since I don't care
whether node42 is spoofing node21 or not, this works well for me.
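That shared-key setup amounts to staging one key pair into the golden image
before the nodes rsync it; a minimal sketch, where the image path and key
names are illustrative assumptions:

```shell
#!/bin/sh
# Hypothetical sketch: install a single shared SSH host key pair into a
# systemimager-style golden image so every node comes up with the same
# identity (accepting that any node can then impersonate any other).
stage_shared_hostkey() {
    key=$1 image=$2
    mkdir -p "$image/etc/ssh"
    cp "$key" "$key.pub" "$image/etc/ssh/"
    # Private host keys must not be world-readable.
    chmod 600 "$image/etc/ssh/${key##*/}"
}

# e.g. stage_shared_hostkey /srv/keys/ssh_host_rsa_key \
#          /var/lib/systemimager/images/cluster
```

With this, the global known_hosts database never needs updating after a
reinstall, which is the whole point of the trade-off.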

M

Attachment: pgp3GdspuyLo2.pgp
Description: PGP signature

