
I created a three-node multi-machine Vagrant environment and am having trouble SSH'ing from one Vagrant VM to another.

Here is the Vagrantfile:

Vagrant.configure("2") do |config| 
  config.vm.box = "centos/7"

  config.vm.define "master" do |master|
    master.vm.hostname = "master.local" 
    master.vm.network "private_network", type: "dhcp"
  end 

  config.vm.define "node1" do |node1|
     node1.vm.hostname = "node1.local" 
     node1.vm.network "private_network", type: "dhcp" 
  end 

  config.vm.define "node2" do |node2|
    node2.vm.hostname = "node2.local" 
    node2.vm.network "private_network", type: "dhcp" 
  end  
end 

The hosts file (same on each node):

$ cat /etc/hosts
172.28.128.3    master.local    master
172.28.128.4    node1.local     node1
172.28.128.5    node2.local     node2

I can ping back and forth all day between any of the machines, but I cannot SSH from one Vagrant VM to another. The typical error message is (from node1 to master):

[vagrant@node1.local] $ ssh vagrant@172.28.128.3
Permission denied (publickey,gssapi-keyex,gssapi-with-mic) 

SSH is running and the port is open.

The firewall is not running.

I am sure this has to do with SSH keys; I readily admit I am not an expert.
What am I doing wrong here, folks?

HBach
  • Updated. Yes, from one vm to the other in a vagrant environment. It doesn't matter from which vm to another, something is not correct. – HBach Apr 25 '17 at 16:26
  • You need vagrant's private key within the machine; I can't remember where it is stored on the vagrant host, but the vagrant user is configured to allow only key-based access. (Using the default key is not recommended outside of tests, obviously.) – Tensibai Apr 25 '17 at 16:50
  • @Tensibai One could also log in without the keys via ssh vagrant@host and use vagrant as the password. – 030 Apr 25 '17 at 16:53
  • Please add the output of ip a of all boxes to the question and check whether the IP addresses (172.28.128.3-5) are available when the boxes are down. – 030 Apr 25 '17 at 17:04
  • @030 according to the message in the question, centos boxes, unlike Ubuntu ones, don't set a vagrant password, allowing only key authentication – Tensibai Apr 25 '17 at 17:10
  • @Tensibai, I am using the latest available box, centos/7 (virtualbox, 1703.01). – HBach Apr 25 '17 at 19:37
  • @Tensibai, The private key is stored at ~/VirtualBox VMs/nifi/.vagrant/machines/master/virtualbox/private_key, modifying the path accordingly for each vagrant vm. The authorized keys file is set in /etc/ssh/sshd_config: AuthorizedKeysFile .ssh/authorized_keys. I tried copying the private_key for master as well as node1 into the .ssh/authorized_keys file on each host respectively, restarting sshd on both instances, and tried to ssh to the other vm, but no go. We may be on the right track, but not certain. – HBach Apr 25 '17 at 19:37
  • @030, the IPs are fine. This is not the issue. – HBach Apr 25 '17 at 19:38
  • @HBach The authorized_keys file stores the public keys that are allowed to log in; by default ssh uses $HOME/.ssh/id_rsa (IIRC) as the private key. You can specify another key with -i, but the key has to be owned by the user running the ssh client and have mode 0600 (read/write for owner only). – Tensibai Apr 25 '17 at 19:46
  • Often adding one or (progressively) more -v arguments to ssh helps figure out what's going on by increasing verbosity, especially when comparing with a working setup: ssh -v ..., then ssh -vv ..., etc. Permission denied could mean a bunch of things... – Dan Cornilescu Apr 26 '17 at 03:35
  • @HBach, do not copy the private key for authorized_keys. Copy the public key instead. – Isaac A. Nugroho Sep 21 '17 at 16:14
  • did you solve this issue? – 030 Sep 28 '17 at 13:43
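
Putting the comments above together, a minimal sketch of what they suggest (the IP and machine names are from this question; the exact private-key location on the host depends on where vagrant up was run, so treat the paths as assumptions):

# On the host: find the private key Vagrant generated for the target machine
vagrant ssh-config master            # the IdentityFile line shows the key path

# Copy that file into node1 (e.g. via scp or the synced folder), then inside node1:
chmod 600 ~/master_key               # ssh refuses keys readable by group/others
ssh -i ~/master_key vagrant@172.28.128.3

# If it still fails, increase verbosity to see which auth methods are offered/rejected
ssh -vv -i ~/master_key vagrant@172.28.128.3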

3 Answers


The following Vagrantfile addresses this problem.

You can get all the supporting key files, along with this Vagrantfile, at https://github.com/malyabee/IaaC/tree/master/ansible_lab
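
The Vagrantfile assumes a pre-generated key pair named ansible_lab / ansible_lab.pub sitting next to it (that is what the linked repository provides). If you are not cloning the repository, something along these lines would create the pair; the file names just have to match what the provisioning scripts below reference:

ssh-keygen -t rsa -b 2048 -f ansible_lab -N ""   # creates ./ansible_lab and ./ansible_lab.pub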

$commonscript = <<-SCRIPT
sudo yum update -y
sudo yum install python2 epel-release -y
sudo yum install -y ansible
# Make the other lab machines resolvable by name
sudo echo "192.168.22.10   ansiblecontroller.example.com   ansiblecontroller" >> /etc/hosts
sudo echo "192.168.22.11   node01.example.com              node01" >> /etc/hosts
sudo echo "192.168.22.12   node02.example.com              node02" >> /etc/hosts
SCRIPT

$nodescript = <<-SCRIPT
# Authorize the lab public key so the controller can SSH in as the vagrant user
cat /vagrant/ansible_lab.pub >> /home/vagrant/.ssh/authorized_keys
SCRIPT

$ansiblescript = <<-SCRIPT
sudo yum install ansible -y
# Install the matching private key as the vagrant user's default identity
sudo cp /vagrant/ansible_lab /home/vagrant/.ssh/id_rsa
sudo chmod 400 /home/vagrant/.ssh/id_rsa
sudo chown vagrant:vagrant /home/vagrant/.ssh/id_rsa
SCRIPT

Vagrant.configure("2") do |config|
  config.vm.provision "shell", inline: "echo Hello"

  config.vm.define "ansiblecontroller" do |ansiblecontroller|
    ansiblecontroller.vm.box = "centos/7"
    ansiblecontroller.vm.provider "virtualbox" do |v|
      v.memory = 512
      v.cpus = 1
    end
    ansiblecontroller.vm.network "private_network", ip: "192.168.22.10", virtualbox__intnet: "mynetwork01"
    ansiblecontroller.vm.hostname = "ansiblecontroller.example.com"
    # Installing required packages for ansible controller node
    ansiblecontroller.vm.provision "shell", inline: $commonscript
    ansiblecontroller.vm.provision "shell", inline: $ansiblescript
  end

  config.vm.define "node01" do |node01|
    node01.vm.box = "centos/7"
    node01.vm.provider "virtualbox" do |v|
      v.memory = 512
      v.cpus = 1
    end
    node01.vm.network "private_network", ip: "192.168.22.11", virtualbox__intnet: "mynetwork01"
    node01.vm.hostname = "node01.example.com"
    # Installing required packages for  node01
    node01.vm.provision "shell", inline: $commonscript
    node01.vm.provision "shell", inline: $nodescript
  end
  config.vm.define "node02" do |node02|
    node02.vm.box = "centos/7"
    node02.vm.provider "virtualbox" do |v|
      v.memory = 512
      v.cpus = 1
    end
    node02.vm.network "private_network", ip: "192.168.22.12", virtualbox__intnet: "mynetwork01"
    node02.vm.hostname = "node02.example.com"
    # Installing required packages for node02
    node02.vm.provision "shell", inline: $commonscript
    node02.vm.provision "shell", inline: $nodescript
  end
end
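
With the key pair in place, bringing the lab up and hopping between boxes would look roughly like this (a usage sketch; the hostnames are the ones set in the Vagrantfile above):

vagrant up
vagrant ssh ansiblecontroller
# inside the controller, the provisioned ~/.ssh/id_rsa is picked up automatically:
ssh vagrant@node01.example.com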
MalyaBee

According to the docs one should use:

vagrant ssh [name|id]

If there is a single node, use vagrant ssh; in a multi-node setup, specify the name or id of the VM, e.g. vagrant ssh box1.

If one would like to SSH between boxes, one could create an SSH key pair, provision the private key to each box, and add the public key to each box's authorized_keys file (a sketch follows the documentation example below).

https://www.vagrantup.com/docs/provisioning/file.html

Vagrant.configure("2") do |config|
  # ... other configuration

  config.vm.provision "file", source: "~/.gitconfig", destination: ".gitconfig"
end
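
Concretely, one could generate a key pair on the host, push both halves to every box with the file provisioner shown above, and let a shell provisioner wire them up inside each guest. A sketch of the guest-side commands (the file names lab_key / lab_key.pub are illustrative assumptions, not from the docs):

# after the file provisioner has copied lab_key and lab_key.pub into /home/vagrant:
mkdir -p /home/vagrant/.ssh
mv /home/vagrant/lab_key /home/vagrant/.ssh/id_rsa
cat /home/vagrant/lab_key.pub >> /home/vagrant/.ssh/authorized_keys
chmod 600 /home/vagrant/.ssh/id_rsa
chown -R vagrant:vagrant /home/vagrant/.ssh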

030

Try this link. You need to execute one of the following (a sketch for this setup follows the list):

  • ssh -i <path/to/private_key> vagrant@<ip>
  • ssh -o PreferredAuthentications=password user@<server-ip> (if you haven't disabled password-based authentication)
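
For the question's setup, the first option corresponds to the key-based sketches above. The second only applies once password authentication is enabled on the target box; the error in the question (no password method offered) suggests it is off by default on centos/7. Roughly, as a hedged sketch run on the target VM:

# enable password logins on the target box (master), then restart sshd:
sudo sed -i 's/^PasswordAuthentication no/PasswordAuthentication yes/' /etc/ssh/sshd_config
sudo systemctl restart sshd
sudo passwd vagrant        # give the vagrant user a known password

# then, from node1:
ssh -o PreferredAuthentications=password vagrant@172.28.128.3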