SSH Issue After Migrating AMI to New Region
Hello,
I recently created a new AMI from the StarCluster AMI using the "starcluster
createimage" utility as described here:
http://web.mit.edu/stardev/cluster/docs/0.92rc2/manual/create_new_ami.html
The image works as expected when run from an S3 bucket in the us-east-1
region. However, after I migrated the AMI to an S3 bucket in the us-west-1
region via the "ec2-migrate-image" utility, there is an issue with ssh:
a single instance or a cluster starts up without any problem, but ssh
never seems to come online.
(test)nolan@nolan-mac:~ starcluster start ibicluster
...
>>> Waiting for cluster to come up... (updating every 30s)
>>> Waiting for instances to activate...
>>> Waiting for all nodes to be in a 'running' state...
2/2 |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||| 100%
>>> Waiting for SSH to come up on all nodes...
0/2 | | 0%
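To narrow it down, I also poked at the us-west-1 side by hand with boto.
Here's a rough sketch of the checks I ran (the region is from my setup and
credentials come from my environment, so adjust as needed). Since keypairs
and security groups are region-specific, I wanted to confirm both survived
the migration and that port 22 is actually reachable on the nodes:

import socket

import boto.ec2

# Credentials are picked up from AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY.
conn = boto.ec2.connect_to_region('us-west-1')

# Keypairs are region-specific; the key named in the cluster config
# must also exist in us-west-1.
print [kp.name for kp in conn.get_all_key_pairs()]

# Check which security groups actually open port 22.
for sg in conn.get_all_security_groups():
    opens_22 = any(str(rule.from_port) == '22' for rule in sg.rules)
    print sg.name, 'opens port 22:', opens_22

# Probe port 22 on each running instance directly, bypassing StarCluster.
for reservation in conn.get_all_instances():
    for inst in reservation.instances:
        print inst.id, inst.state, inst.ip_address
        if inst.state == 'running' and inst.ip_address:
            try:
                socket.create_connection((inst.ip_address, 22), timeout=5).close()
                print '  port 22 reachable'
            except (socket.error, socket.timeout):
                print '  port 22 NOT reachable'

My thinking is that if the raw probe succeeds while StarCluster still hangs,
the problem is on the StarCluster/ssh-key side rather than the networking.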
I am using a clone of StarCluster from github, Python 2.7.1 (EPD), and
boto 2.0b4. I tried the recommendation to remove all security groups
starting with '@ec.' suggested here:
https://github.com/jtriley/StarCluster/issues/14,
but no luck yet.
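In case I missed some, this is roughly how I did that cleanup with boto
rather than the console (sketch only; EC2 refuses to delete a group that is
still attached to running instances, hence the try/except):

import boto.ec2
import boto.exception

conn = boto.ec2.connect_to_region('us-west-1')

# Remove leftover '@'-prefixed cluster security groups, per the
# recommendation in the github issue above.
for sg in conn.get_all_security_groups():
    if sg.name.startswith('@'):
        try:
            sg.delete()
            print 'deleted', sg.name
        except boto.exception.EC2ResponseError as e:
            print 'could not delete', sg.name, '-', e.error_message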
Any ideas?
Thanks,
Nolan