I want to connect to a remote server and execute a process on it from Python, and I want to be able to get the return code and stderr (if any) of that process. Has anyone done anything like this before? I have done it with ssh, but I want to do it from a Python script.
Cheers.
- Duplicate: stackoverflow.com/questions/536370/… – S.Lott, Jun 3, 2009
- Example: stromberg.dnsalias.org/~strombrg/looper. This is Microsoft-licensed open source due to an acquisition. It works well on Linux. – user1277476, Sep 27, 2014
3 Answers
Use the SSH module paramiko, which was created for exactly this purpose, instead of subprocess. Here's an example:
from paramiko import SSHClient
client = SSHClient()
client.load_system_host_keys()
client.connect("hostname", username="user")
stdin, stdout, stderr = client.exec_command('program')
print "stderr: ", stderr.readlines()
print "pwd: ", stdout.readlines()
UPDATE: This example originally used the ssh module, but that module is now deprecated; paramiko is the maintained module that provides SSH functionality in Python.
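The question also asks for the return code; paramiko exposes it as the channel's exit status. A minimal sketch, reusing the stdout handle from the example above (the call blocks until the remote command finishes):
# Exit status of the remote command.
exit_code = stdout.channel.recv_exit_status()
print("exit code:", exit_code)

client.close()  # close the SSH session when done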
Well, you can call ssh from Python...
import subprocess

# Blocks until ssh exits; ssh propagates the remote program's exit status.
ret = subprocess.call(["ssh", "user@host", "program"])

# Or, capturing stderr as well:
prog = subprocess.Popen(["ssh", "user@host", "program"], stderr=subprocess.PIPE)
errdata = prog.communicate()[1]  # prog.returncode then holds the exit status
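On newer Python versions, subprocess.run collects the return code and stderr in one call; a minimal sketch in the same spirit as the answer above (the text keyword needs Python 3.7+):
import subprocess

# ssh exits with the remote program's status (or 255 on connection failure).
result = subprocess.run(
    ["ssh", "user@host", "program"],
    stderr=subprocess.PIPE,
    text=True,  # decode stderr to str instead of bytes
)
print("return code:", result.returncode)
print("stderr:", result.stderr)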
5 Comments
- …subprocess. Even at the time this answer was posted, the former version of the ssh module (called Paramiko) existed.
Maybe if you want to wrap the nuts and bolts of the ssh calls, you could use Fabric. This library is geared towards deployment and server management, but it could also be useful for this kind of problem.
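As a rough illustration of the Fabric route, here is a minimal sketch using the Fabric 2.x Connection API; the host, user and command are placeholders, and warn=True just keeps a non-zero exit status from raising an exception:
from fabric import Connection

# Open an SSH connection and run the command without raising on failure.
conn = Connection("host", user="user")
result = conn.run("program", hide=True, warn=True)

print("return code:", result.return_code)
print("stderr:", result.stderr)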
Also have a look at Celery. It implements a task queue for Python/Django on top of various brokers. It may be overkill for your problem, but if you are going to call more functions on multiple machines, it will save you a lot of headache managing your connections.
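If you do go the Celery route, the shape of it is roughly the sketch below; the broker/backend URLs and the task body are placeholders, and a worker has to be running for the task to execute:
# tasks.py: start a worker with: celery -A tasks worker
from celery import Celery
import subprocess

app = Celery("tasks",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/1")

@app.task
def run_program():
    # Run the command on whatever machine the worker lives on and
    # hand back (return code, stderr) to the caller.
    done = subprocess.run(["program"], stderr=subprocess.PIPE, text=True)
    return done.returncode, done.stderr

# Caller side:
#   result = run_program.delay()
#   code, err = result.get(timeout=60)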