ssh Issue After Migrating AMI to New Region
I recently created a new AMI from the StarCluster AMI using the "starcluster
createimage" utility as described here:
The created image works as expected when run from an S3 bucket in the
us-east-1 region. However, after I migrated the AMI to an S3 bucket in the
us-west-1 region via the "ec2-migrate-image" utility, there is an issue with
ssh: starting a single instance or a cluster works fine, but ssh never seems
to come online.
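For reference, the migration step was along these lines. This is only a sketch: the key paths, bucket names, and manifest name below are placeholders, and the exact flags may differ between ec2-ami-tools versions.

```shell
# Hypothetical ec2-migrate-image invocation (ec2-ami-tools).
# All paths, bucket names, and the manifest name are placeholders.
ec2-migrate-image \
  --private-key pk-XXXX.pem \
  --cert cert-XXXX.pem \
  --owner-akid "$AWS_ACCESS_KEY_ID" \
  --owner-sak "$AWS_SECRET_ACCESS_KEY" \
  --bucket my-us-east-bucket \
  --destination-bucket my-us-west-bucket \
  --manifest image.manifest.xml \
  --location us-west-1

# The migrated manifest then has to be registered in the destination region,
# e.g. with the EC2 API tools:
# ec2-register my-us-west-bucket/image.manifest.xml --region us-west-1
```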
(test)nolan@nolan-mac:~$ starcluster start ibicluster
>>> Waiting for cluster to come up... (updating every 30s)
>>> Waiting for instances to activate...
>>> Waiting for all nodes to be in a 'running' state...
>>> Waiting for SSH to come up on all nodes...
0/2 | |
I am using a clone of StarCluster from GitHub, EPD Python 2.7.1, and
boto 2.0b4. I tried the suggestion from
https://github.com/jtriley/StarCluster/issues/14 to remove all security
groups starting with '@ec.', but no luck yet.
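In case it helps, this is roughly how I approached the security-group cleanup. A minimal sketch assuming boto 2.x; the '@sc-' prefix and the region are assumptions, and the actual deletion calls are shown only as comments:

```python
# Minimal sketch: pick out security-group names matching the StarCluster-style
# prefix, so they can be deleted in the target region with boto 2.x.
def stale_groups(names, prefix="@sc-"):
    # The "@sc-" prefix is an assumption about how the groups are named.
    """Return the group names that start with the given prefix."""
    return [n for n in names if n.startswith(prefix)]

# Illustration only (requires AWS credentials, boto 2.x):
# import boto.ec2
# conn = boto.ec2.connect_to_region("us-west-1")
# names = [g.name for g in conn.get_all_security_groups()]
# for name in stale_groups(names):
#     conn.delete_security_group(name)
```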
Received on Wed Aug 10 2011 - 02:46:47 EDT