kubernetes, ibm-cloud-private

SSH to master node in local IBM Cloud Private cluster?


I have installed a local IBM Cloud Private Kubernetes cluster based on this guide:

https://github.com/IBM/deploy-ibm-cloud-private/blob/master/docs/deploy-vagrant.md

and it is up and running successfully. Now I need to SSH to the master node and copy the configure-registry-cert.sh file to my host machine, based on:

https://www.ibm.com/support/knowledgecenter/en/SSBS6K_1.2.0/manage_images/using_docker_cli.html

But how do I SSH to the master node? From my host machine I have tried with:

$ ssh admin@192.168.27.100

and specified admin as the password (I use admin/admin when I log in to the web interface running at https://192.168.27.100:8443/console), but that does not work.

This page:

https://www.ibm.com/support/knowledgecenter/SSBS6K_2.1.0.3/installing/ssh_keys.html

describes that private/public keys need to be configured during installation, but that was never an option. Based on the Vagrant guide, you just run vagrant up, and after approximately 20 minutes everything is up and running.

Any ideas?


Solution

  • You can SSH to the master node with a Vagrant command. You cannot use the ICP administrator username and password (admin/admin by default) to log in to the master node, because that account is an ICP console user, not an OS user.

    From the 'Deploy IBM Cloud Private beta using Vagrant' link, we can see the commands below (a short sketch of copying the file back to your host follows the list).

    ...............

    IBM Cloud Private Vagrant Commands

    install: vagrant up

    stop: vagrant halt (DO NOT USE! USE vagrant suspend TO HALT VM)

    start: vagrant up

    uninstall: vagrant destroy

    login to master node: vagrant ssh

    suspend: vagrant suspend

    resume: vagrant resume

    ...............

    https://github.com/IBM/deploy-ibm-cloud-private/blob/master/docs/deploy-vagrant.md
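
    For completeness, here is a minimal sketch of one way to log in and then pull configure-registry-cert.sh back to the host. It assumes you run the commands from the directory containing the Vagrantfile, that the Vagrant machine name is default (check with vagrant status), and it uses a placeholder path for the script's location on the master node, which you need to replace with the actual path from the ICP documentation:

    # log in to the master node as the vagrant OS user (key-based, no password prompt)
    $ vagrant ssh

    # from the host: export Vagrant's SSH settings and reuse them with scp
    $ vagrant ssh-config > vagrant-ssh.cfg
    $ scp -F vagrant-ssh.cfg default:/<path-on-master>/configure-registry-cert.sh .

    Alternatively, if the default /vagrant synced folder is enabled in the Vagrantfile, you can copy the file into /vagrant from inside the VM and it will show up next to the Vagrantfile on your host.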