# ssh tunneling through multiple servers



## anlag (Mar 5, 2005)

I need to move a large amount of data from a machine which I can only access via two other connection steps. So, I first need to connect to server A, then from there to B, and from there to C, where the data is. To make it just a little bit nastier, I have different usernames on my local machine, on A/B and on C.

I thought I'd do this using scp via a set of nested ssh tunnels, but so far I can't quite get it right. I've tried:

> ssh -f -L6969:localhost:22 [email protected] ssh -f -L6969:localhost:22 [email protected] ssh -f -N -L6969:localhost:22 [email protected]

This executes without error; however, closer examination in the form of

> scp -P 6969 localhost:/something

...shows the tunnel only reaches A, or at least that is the filesystem I get access to.

I manage single-step ssh tunnels just fine, and I suppose I could chain the tunnels manually at each step of the way, but it should be possible to get this working with one command from the local machine too, and I would be grateful if someone could help me out. Cheers.
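For reference, a likely culprit in the chained command above: every hop's -L points at port 22, so the very first forward terminates at A's own sshd and the chain never goes further. Only the final hop should target port 22; the intermediate forwards need to target the next hop's tunnel port. A corrected single-command version might look like this (untested sketch; the nested quoting makes each inner ssh run on the previous hop, and it assumes key-based authentication, since the inner sshs have no tty for password prompts):

> ssh -f -L6969:localhost:6969 [email protected] "ssh -f -L6969:localhost:6969 [email protected] 'ssh -f -N -L6969:localhost:22 [email protected]'"

Each -L6969 here binds port 6969 on the machine where that particular ssh runs, so reusing the same port number on every hop is fine.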


----------



## lotuseclat79 (Sep 12, 2003)

Use sudo if you are not root; you need root access to bind services to ports under 1024.

-- Tom


----------



## anlag (Mar 5, 2005)

I don't think that's the problem; trying just the first step

> ssh -f -N -L6969:localhost:22 [email protected]

and then a test transfer

> scp -P 6969 [email protected]:/some/path .

works fine, without root or sudo access. As far as I can tell I'm not binding anything to port 22; I'm binding to 6969.

Any other suggestions?
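If it helps to see exactly what is bound where, the listeners can be checked on each hop (this is the Linux netstat form; flags differ on other systems):

> netstat -tln | grep 6969

A line bound to 127.0.0.1:6969 confirms the forward is up on that machine, and also shows that no privileged port (and hence no root) is involved.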


----------



## anlag (Mar 5, 2005)

It occurred to me that only the last step of the tunnel should point to port 22, and that's why my initial attempt only connected to node A.

With that in mind I took a step back and tried creating the tunnel step by step.

On my local machine:

> ssh -f -N -L6969:localhost:6969 [email protected]

On A:

> ssh -f -N -L6969:localhost:6969 [email protected]

And on B:

> ssh -f -N -L6969:localhost:22 [email protected]


Now, having done this, if I execute the following on A:

> scp -P 6969 [email protected]:/some/path .

it indeed works, in that the specified file is transferred from node C. So port 6969 on A tunnels to 6969 on B, and 6969 on B to 22 on C. BUT, when I execute the same command on my local machine, it asks me for the password for [email protected], which I don't have and should not need due to the nature of the setup.

I don't understand why it works fine from A and not from my local workstation, when the setup is exactly the same but with just one additional step on the way.
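One thing worth checking, offered as a guess: the username given to scp applies to whatever machine sits at the far end of the tunnel. From the local machine that endpoint is now C, so the login name should be the account on C (call it userC), not userAB:

> scp -P 6969 userC@localhost:/some/path .

(userC here is a placeholder for whatever the actual account name on node C is.)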


----------



## lotuseclat79 (Sep 12, 2003)

Why not userAB on node C?

Also, are your account identities the same on all the nodes involved, or do they differ from node to node?

-- Tom


----------



## lotuseclat79 (Sep 12, 2003)

Hi anlag,

One of the many websites I keep a daily watch on has what may be a good solution for you, if you can adapt it to fit your multiple-server situation. It does have an ssh example, but it is not adapted for intermediate servers. Here is the link to the article:
Linux Tip: super-fast network file copy.

Let us know how you manage to solve this problem, as feedback for other users who may face a similar problem in the future.

-- Tom


----------



## sniper11 (Nov 16, 2008)

You could try putting the ssh commands in your .profile file on B, so that when you log in to B via ssh it immediately connects on to C.
Then, if you don't want to connect to C, just stop the operation with Ctrl-C.

However, this is just a workaround. If you find out how to do it with a single command, let us know.
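For anyone finding this later, another way to get a single command from the local machine is ssh's ProxyCommand with netcat, instead of chained -L forwards. A sketch, assuming nc is installed on A and B, using entries in ~/.ssh/config on the local machine (userC is a placeholder for the account name on C):

```
Host serverB.com
    ProxyCommand ssh [email protected] nc %h %p

Host serverC.com
    User userC
    ProxyCommand ssh [email protected] nc %h %p
```

With that in place, `scp [email protected]:/some/path .` should connect straight through, each hop's ssh automatically tunneling the next.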


----------

