@slapula slapula commented Aug 18, 2017

Please feel free to close if this is deemed unnecessary.

This is a great toolkit; however, we've been using other tools to deploy/manage Lambda configurations. I found that I simply need this toolkit to upload the zipped file to S3 instead of actually deploying the Lambda itself. I tried to stay within patterns already established in this repo, so this PR may look a little copy/paste-y. Provided that the user adds the destination S3 bucket (and the optional key) to `config.yaml`, he or she can just execute `lambda upload` to build the zip file and send it off to S3.

Any feedback would be much appreciated.

@jmeekhof jmeekhof self-assigned this Aug 21, 2017
@jmeekhof
Collaborator

@slapula Looks good. Does upload also deploy the function? Or would it be necessary to go into the console after the upload to deploy the function?

@slapula
Contributor Author

slapula commented Aug 21, 2017

@jmeekhof This simply uploads the zip to S3. In our case, deploying the Lambda happens elsewhere (CloudFormation or Terraform, for example). We just need the zip file in a specific S3 bucket so that the appropriate tool can grab it and take it from there. I imagine others may handle their Lambda code similarly.
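For context, the upload path described here boils down to building the keyword arguments for S3's `put_object` call, mirroring the kwargs dict in this PR. A minimal sketch (the bucket name, config keys, and helper name below are hypothetical, not from the PR):

```python
# Sketch of how the PR assembles put_object arguments: the bucket comes
# from config.yaml, and the S3 key falls back to '<function name>.zip'
# when 's3_key' is not configured. Names here are illustrative only.
def build_upload_kwargs(cfg, func_name, byte_stream):
    return {
        'Bucket': cfg['bucket_name'],
        'Key': cfg.get('s3_key', '{}.zip'.format(func_name)),
        'Body': byte_stream,
    }

cfg = {'bucket_name': 'my-lambda-artifacts'}  # hypothetical config values
kwargs = build_upload_kwargs(cfg, 'my_function', b'zip file bytes')
```

The resulting dict would then be passed to a boto3 S3 client's `put_object(**kwargs)`, leaving the actual deploy to whatever tool reads the bucket.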

@jmeekhof
Collaborator

I've run into situations where the code size is above what can be uploaded directly, so I see the value in, and the need for, using S3 to upload our functions. That said, without deploying the function after the upload, this feels like a slight departure from how python-lambda currently operates.

@nficano Can you weigh in here? I like the functionality, but I feel like we should deploy at the same time.

@nficano
Owner

nficano commented Aug 29, 2017

Sorry, I was out of the country last week. I actually love this; it solves the problem of bundles being over 100 MB.

aws_secret_access_key = cfg.get('aws_secret_access_key')

account_id = get_account_id(aws_access_key_id, aws_secret_access_key)
role = get_role_name(account_id, cfg.get('role', 'lambda_basic_execution'))
Owner

Is it safe to assume that the role for execution will be the same one used for uploading bundles to S3? Maybe this should be configured independently.

Contributor Author

Yeah, I agree. I tweaked this role when I was testing, but it really should be something like `basic_s3_upload`.
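One way the independent role could look in `config.yaml` (a sketch only; these key names are hypothetical and not part of the PR):

```yaml
# config.yaml (hypothetical keys, illustrating separate roles)
role: lambda_basic_execution   # execution role attached to the function
s3_upload_role: basic_s3_upload  # separate role used only for pushing bundles to S3
```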

"""Upload a function to AWS S3."""

print('Uploading your new Lambda function')
byte_stream = read(path_to_zip_file, binary_file=True)
Owner

this should use a context manager, e.g.:

with open(path_to_zip_file, 'rb') as fh:
    # do work with the file

)
kwargs = {
'Bucket': '{}'.format(buck_name),
'Key': cfg.get('s3_key', '{}.zip'.format(func_name)),
Owner

this should be generated programmatically, maybe as a unix timestamp plus the md5 of the zip file. The configuration should allow for the path to be specified.

e.g.:

# this is untested
s3_key_prefix = cfg.get('s3_key_prefix', 'dist/')
checksum = hashlib.md5(byte_stream).hexdigest()
timestamp = str(int(time.time()))
filename = '{prefix}{checksum}-{ts}.zip'.format(
    prefix=s3_key_prefix, checksum=checksum, ts=timestamp)

@slapula
Contributor Author

slapula commented Sep 3, 2017

@nficano @jmeekhof This is ready for a second pass. Thanks for the feedback guys 😄 👍

import botocore
import pip
import yaml
import md5
Owner

IIRC, the md5 lib is deprecated; you should use hashlib.
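For reference, the `md5` module was removed in Python 3 in favor of `hashlib`. A minimal sketch of hashing the zip bytes that way (the sample bytes here are illustrative):

```python
import hashlib

# hashlib.md5 replaces the removed md5 module; hexdigest() yields a
# 32-character hex string that is safe to embed in an S3 key name.
byte_stream = b'example zip contents'  # stand-in for the bundle bytes
checksum = hashlib.md5(byte_stream).hexdigest()
```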

@nficano
Owner

nficano commented Sep 5, 2017

Only other suggestion is to update the README; other than that, LGTM.

@slapula
Contributor Author

slapula commented Sep 5, 2017

@nficano Cool! I've made the suggested changes. I think I may try to implement S3-based Lambda deploys here, since that would be the logical next step for this functionality. If DO and GitHub plan on hosting Hacktoberfest again, I may shoot to have it ready by then.
