
I have a Python script on my AWS EC2 instance that does some job. I need to trigger that script whenever a new file lands in a particular S3 bucket.

My idea was to add a Lambda trigger to that bucket which in turn triggers the script on the EC2 instance, but I failed to get it working.

How can I achieve this according to my plan, or is there another workaround for this problem?

asked Sep 9, 2019 at 7:07
  • S3 has the option to use an SQS queue to handle an event — in your case, whenever a new object is created. You can then create a consumer service (boto3) on your EC2 instance that triggers your internal Python script whenever it receives a new message from SQS. Commented Sep 9, 2019 at 7:22
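The consumer the comment describes could be sketched roughly like this — a small long-polling loop on the EC2 instance that reads S3 event notifications from SQS and launches the local script. The queue URL, region, and script path are placeholder assumptions, not values from the question.

```python
import json
import subprocess

# Placeholders -- substitute your own queue URL and script path
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/new-object-queue"
SCRIPT = "/home/ubuntu/job.py"

def object_from_message(body):
    """Extract (bucket, key) from the S3 event JSON carried in an SQS message body."""
    record = json.loads(body)["Records"][0]
    return record["s3"]["bucket"]["name"], record["s3"]["object"]["key"]

def poll_forever():
    import boto3  # imported lazily: only needed on the EC2 consumer itself
    sqs = boto3.client("sqs", region_name="us-east-1")
    while True:
        resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                                   MaxNumberOfMessages=1,
                                   WaitTimeSeconds=20)  # long polling
        for msg in resp.get("Messages", []):
            bucket, key = object_from_message(msg["Body"])
            # Run the existing job script with the new object as its arguments
            subprocess.run(["python3", SCRIPT, bucket, key], check=True)
            # Delete only after the script succeeded, so failed messages are retried
            sqs.delete_message(QueueUrl=QUEUE_URL,
                               ReceiptHandle=msg["ReceiptHandle"])
```

Deleting the message only after the script exits successfully means SQS will redeliver the event (after the visibility timeout) if the script crashes, which gives you free retries.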

3 Answers


As suggested in the comment, SNS or SQS is a better fit than a Lambda function here: with SNS or SQS you get direct one-to-one communication between S3 and the EC2 instance, so why add an extra layer of Lambda?

Although all three can subscribe to the S3 event, Lambda adds an extra hop and also involves SSH, which is costly in terms of time (S3 event delivery + event processing + SSH to EC2).


Using Lambda:

When the Lambda is triggered, it SSHes into the EC2 instance and runs the script. One big advantage of Lambda is that you can run any type of script and you do not need a server kept up and running, as you do with SQS and SNS consumers. You can explore these examples: ssh-ec2-lambda/ and scheduling-ssh-jobs-using-aws-lambda — the second example is similar, except you trigger on the S3 event instead of a schedule.

SNS:

If multiple instances are supposed to run the job script, then SNS is the better choice: it fans the S3 event out to every subscriber, which matches your use case.
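Wiring the bucket to an SNS topic could be sketched like this. The bucket name and topic ARN are placeholders, and the topic's access policy must separately allow `s3.amazonaws.com` to publish to it.

```python
def notification_config(topic_arn, suffix=None):
    """Build the S3 notification configuration that publishes ObjectCreated events to SNS."""
    cfg = {"TopicArn": topic_arn, "Events": ["s3:ObjectCreated:*"]}
    if suffix:
        # Optionally restrict to files with a given extension
        cfg["Filter"] = {"Key": {"FilterRules": [{"Name": "suffix", "Value": suffix}]}}
    return {"TopicConfigurations": [cfg]}

def apply_to_bucket(bucket, topic_arn, region="us-east-1"):
    import boto3  # only needed where AWS credentials are configured
    s3 = boto3.client("s3", region_name=region)
    s3.put_bucket_notification_configuration(
        Bucket=bucket,  # placeholder bucket name goes here
        NotificationConfiguration=notification_config(topic_arn))
```

The same `put_bucket_notification_configuration` call takes `QueueConfigurations` instead of `TopicConfigurations` if you point the bucket at SQS rather than SNS.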


SQS:

If only one instance is supposed to run the script, then SQS is the better fit for handling the event.


answered Sep 9, 2019 at 8:09


  • I am not sure why your approach did not work, because it is absolutely possible — I have done this myself following this blog: aws blog
  • This git repository has code to trigger a Lambda whenever a file with a specific extension is uploaded to the bucket (Terraform).
  • You can access the EC2 instance from the Lambda, as shown in the blog above, using tags.
  • Hope this all helps you.
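Looking the instance up by tag, as the third bullet suggests, might look roughly like this (the tag key and value are hypothetical examples, not from the linked blog):

```python
def tag_filters(tag_key, tag_value):
    """Build describe_instances filters matching a tag plus the running state."""
    return [
        {"Name": "tag:{}".format(tag_key), "Values": [tag_value]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]

def private_ip_by_tag(tag_key, tag_value, region="us-east-1"):
    """Return the private IP of the first running instance carrying the tag, or None."""
    import boto3  # only needed where AWS credentials are configured
    ec2 = boto3.client("ec2", region_name=region)
    resp = ec2.describe_instances(Filters=tag_filters(tag_key, tag_value))
    for reservation in resp["Reservations"]:
        for inst in reservation["Instances"]:
            return inst["PrivateIpAddress"]
    return None
```

Resolving the target by tag instead of hard-coding an instance ID keeps the Lambda working if the instance is ever replaced.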
answered Sep 9, 2019 at 7:57



I managed it with the help of a blog I found online; I have lost the link, but I still have the code.

import time
import boto3
import paramiko

def lambda_handler(event, context):
    # Hard-coded keys are kept here as in the original blog, but prefer
    # giving the Lambda an IAM execution role instead
    ec2 = boto3.resource('ec2', region_name='us-east-1',
                         aws_access_key_id='XXXXXXXXXXXXXXXXXXXX',
                         aws_secret_access_key='XXXXXXXXXXXXXXXXXXXX')
    instance_id = 'XXXXXXXXXXXXXXXX'
    instance = ec2.Instance(instance_id)
    # Start the instance
    instance.start()
    # Give the instance some time to start completely
    # time.sleep(60)
    # Connect to S3; we use it to fetch the .pem key file of the EC2 instance
    s3_client = boto3.client('s3',
                             aws_access_key_id='XXXXXXXXXXXXXXXXXXXX',
                             aws_secret_access_key='XXXXXXXXXXXXXXXXXXXX')
    # Download the private key file from a secure S3 bucket
    # and save it inside the /tmp/ folder of the Lambda environment
    bucket_name = ''
    key_name = ''
    key_location = ''
    s3_client.download_file(bucket_name, key_name, key_location)
    # Allow a few seconds for the download to complete
    time.sleep(10)
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    privkey = paramiko.RSAKey.from_private_key_file(key_location)
    # username is most likely 'ec2-user', 'root' or 'ubuntu'
    # depending on your EC2 AMI
    ssh.connect(instance.private_ip_address, 22, username='ubuntu', pkey=privkey)
    commands = []
    for command in commands:
        print("Executing {}".format(command))
        stdin, stdout, stderr = ssh.exec_command(command)
        stdin.flush()
        data = stdout.read().splitlines()
        for line in data:
            print(line)
    ssh.close()
    return 'Success'

Now just zip the paramiko library along with the function code. I will update the answer if I find the blog again.
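The handler above ignores its `event` argument, so a small helper like this (hypothetical, not from the lost blog) could pull the bucket and key out of the S3 trigger payload and feed them to the remote command:

```python
def object_from_event(event):
    """Return (bucket, key) from the first record of an S3 trigger event."""
    record = event["Records"][0]
    return record["s3"]["bucket"]["name"], record["s3"]["object"]["key"]

# Inside lambda_handler, the command list could then be built, e.g.:
# bucket, key = object_from_event(event)
# commands = ["python3 /home/ubuntu/job.py {} {}".format(bucket, key)]
```

The `/home/ubuntu/job.py` path is a placeholder for wherever the script lives on the instance.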

answered Jun 9, 2020 at 8:00
