diff --git a/.gitignore b/.gitignore index 477144c..1131799 100644 --- a/.gitignore +++ b/.gitignore @@ -28,3 +28,9 @@ node_modules # Users Environment Variables .lock-wscript .DS_Store + +# These will be generated by the gulp script so they don't need to be pushed to GitHub +LambdaTest.zip +dist/ +admin.env +.env diff --git a/README.md b/README.md index a5b9b30..ecb411d 100644 --- a/README.md +++ b/README.md @@ -64,14 +64,16 @@ something to S3 on each execution cycle you could rack up the bill! ## Create and Test Your Own AWS Lambda Function * [Create a Lambda function inline](#hello-world-example-inline) -* [Create a Lambda function using a .zip folder](#hello-world-example-.zip) +* [Create a Lambda function using a .zip folder](#hello-world-example-zip) * [Create a Lambda function using the AWS API Gateway](#hello-world-example-api-gateway) * [Trigger a Lambda function using an event from DynamoDB](#triggering-a-lambda-function-using-an-event-from-dynamodb) * [Trigger a Lambda function using the Simple Notification System](#trigger-a-lambda-function-using-the-simple-notification-system) -* [Testing Lambda Functions](#testing-lambda-functions) -* [Deploying Lambda Functions using Gulp](#deploying-lambda-functions-using -gulp) * [Continuous Integration using Codeship](#continuous-integration-using-codeship) +* [Testing Lambda Functions](#testing-lambda-functions) +* [Upload Lambda Function to S3 and deploy to Lambda](#upload-your-lambda-function-to-an-s3-bucket-and-automatically-deploy-it-to-lambda-bash-script-example) +* [Deploying Lambda Functions using Gulp](#deploying-lambda-functions-using-gulp) * [Versioning and Aliasing Lambda Functions](#versioning-and-aliasing-lambda-functions) +* [Create an API with GET/POST Methods using Lambda functions to retrieve/update records from a DynamoDB table](#create-an-api-with-getpost-methods-that-uses-lambda-functions-to-retrieveupdate-records-from-a-dynamodb-table) ### 'HELLO WORLD!'
Example (inline) @@ -267,6 +269,196 @@ see: http://docs.aws.amazon.com/apigateway/latest/developerguide/how-to-api-keys
+### Create an API with GET/POST Methods that uses Lambda functions to retrieve/update records from a DynamoDB table + +1. First we'll need to create a table in DynamoDB. Go to the DynamoDB console and then click the 'Create Table' button. Give your table a name _(call it something relevant to the type of data your DynamoDB table will hold)_. We've called ours 'Users'. The 'Primary key' is made up of a 'Partition key' _(hash key)_ and an optional 'Sort key'. _(The partition key is used to partition data across hosts for scalability and availability)_: + + + ![create table](https://cloud.githubusercontent.com/assets/5912647/12557398/7114929c-c382-11e5-9c48-5c2bf15649ac.png) + + ![table name](https://cloud.githubusercontent.com/assets/12450298/12714300/c9a4e152-c8cb-11e5-8c35-370393cef70e.png) + + For 'Table settings' just check the 'Use default settings' checkbox and then click the blue 'Create' button: + + ![table setup](https://cloud.githubusercontent.com/assets/12450298/12714466/db3a51d0-c8cc-11e5-882f-a3b09df203a4.png) + +2. Once the table is created, click on the 'Alarms' tab and then delete the basic alarms if they have been created: + + ![alarms](https://cloud.githubusercontent.com/assets/12450298/12714608/9da7b6ea-c8cd-11e5-8b5c-f09f94d3e66a.png) + + Then click on the 'Capacity' tab and then specify the 'Read' and 'Write' capacity units as 3 each and then click 'Save': + + ![capacity](https://cloud.githubusercontent.com/assets/12450298/12714552/5fe19b1e-c8cd-11e5-919a-780c3bb06316.png) + +3. Next we will have to create a policy that allows your AWS functions to access Cloudwatch logs as well as the table you just created. Go to the IAM console, select 'Roles' and then 'Create new role'. 
We've called ours 'APIGatewayLambdaExecRole': + + ![create role](https://cloud.githubusercontent.com/assets/12450298/12714889/11c25804-c8cf-11e5-8b32-e01f9673b8cf.png) + + Select the 'AWS Lambda' role: + + ![lambda role](https://cloud.githubusercontent.com/assets/12450298/12714963/651140f6-c8cf-11e5-87f5-f547605f757a.png) + + And then click 'Next step' to skip the 'Attach Policy' section: + + ![skip attach policy](https://cloud.githubusercontent.com/assets/12450298/12714986/8de42822-c8cf-11e5-9fc8-9aad5ed4b799.png) + + In the 'Review' section click the blue 'Create Role' button to finish: + + ![review role](https://cloud.githubusercontent.com/assets/12450298/12715013/bcb3bc1c-c8cf-11e5-8fce-37f32546d0b5.png) + + Click on the title of the role you just created then click the down arrow for 'Inline Policies'. Follow the link to create an inline policy: + + ![inline policies](https://cloud.githubusercontent.com/assets/12450298/12715091/385b678e-c8d0-11e5-8006-1d65487b933e.png) + + Click on the 'Custom Policy' radio button and then click 'Select': + + ![custom policy](https://cloud.githubusercontent.com/assets/12450298/12715150/857ad6e4-c8d0-11e5-9688-c6237746e742.png) + + Give your custom policy a name _(we've called ours 'LogAndDynamoDBAccess')_ and then enter the following in the 'Policy Document' section. **Make sure your "Resource" at the bottom is set to the ARN of your table and the second "SID" is set to "_YourTableName_DynamoDBReadWrite"**. 
_(the ARN can be found in your 'Table details' by going to your DynamoDB console and clicking on your table.)_: + + ``` + { + "Version": "2012-10-17", + "Statement": [ + { + "Sid": "AccessCloudwatchLogs", + "Action": [ + "logs:*" + ], + "Effect": "Allow", + "Resource": "arn:aws:logs:*:*:*" + }, + { + "Sid": "UsersDynamoDBReadWrite", + "Effect": "Allow", + "Action": [ + "dynamodb:DeleteItem", + "dynamodb:GetItem", + "dynamodb:PutItem", + "dynamodb:UpdateItem" + ], + "Resource": [ + "arn:aws:dynamodb:eu-west-1:655240720487:table/Users" + ] + } + ] + } + ``` + +4. Now we need to create the Lambda functions for adding and retrieving data to and from the table _(we'll be creating our functions in a text editor, zipping them up and then uploading them to Lambda. Follow the instructions in the previous 'HELLO WORLD!' .zip example on how to do this)_: + + Create a new ```.js``` file that will contain our first Lambda function. This function will GET information from the DynamoDB table. We've called the file ```getUserInfo.js```. Here is the code: + + ```JavaScript + var AWS = require('aws-sdk'); + var DOC = require('dynamodb-doc'); + var dynamo = new DOC.DynamoDB(); + + exports.handler = function(event, context) { + var callback = function(err, data) { + if (err) { + console.log('error on getUserInfo: ', err); + context.done('Unable to retrieve user information', null); + } else { + if(data.Item && data.Item.users) { + context.done(null, data.Item.users); + } else { + context.done(null, {}); + } + } + }; + + dynamo.getItem({TableName:"Users", Key:{username:"default"}}, callback); + }; + ``` + Zip up the file and then upload it to Lambda: + + ```zip -r getUserInfo.zip getUserInfo.js``` + + ![getuserinfo](https://cloud.githubusercontent.com/assets/12450298/12716616/ceb37a6a-c8d9-11e5-80be-54ebf8b9754d.png) + + For the Role, select the one we created earlier. 
Then click 'Next' and then 'Create function': + + ![role](https://cloud.githubusercontent.com/assets/12450298/12716846/6d4dcd82-c8db-11e5-9b01-3dccc12d8fa5.png) + + Click 'Test' to test the function. The results should return an empty object ```{}```. + + Create a second ```.js``` file that will contain our second Lambda function. This function will UPDATE information in our DynamoDB table. We've called the file ```updateUserInfo.js```. Here is the code: + + ```JavaScript + var AWS = require('aws-sdk'); + var DOC = require('dynamodb-doc'); + var dynamo = new DOC.DynamoDB(); + + exports.handler = function(event, context) { + var item = { username:"default", + users: event.users || {} + }; + + var callback = function(err, data) { + if (err) { + console.log(err); + context.fail('unable to update users at this time'); + } else { + console.log(data); + context.done(null, data); + } + }; + + dynamo.putItem({TableName:"Users", Item:item}, callback); + }; + ``` + Again zip up the file and then upload it to Lambda: + ```zip -r updateUserInfo.zip updateUserInfo.js``` + + Follow the same steps as the previous function to create the second one, giving it the same role. They should both now appear in your functions section: + + ![functions](https://cloud.githubusercontent.com/assets/12450298/12717241/7e1805bc-c8de-11e5-9c0c-9974a961cef7.png) + + Test the function with a sample event relevant to your data. We created the following sample event: + ``` + { + "users": [ + { + "id": 1, + "name": "John Smith", + "location": "London" + + } + ] + } + ``` + You should see an empty object ```{}``` just like the first function. + Go back to the GetUserInfo function and then click 'Test' again. You should now see a returned result with the object in your sample event like this: + + ``` + [ + { + "id": 1, + "location": "London", + "name": "John Smith" + } + ] + ``` +5. We're going to have to create one more Lambda function.
It essentially does nothing but it is required by the OPTIONS method for CORS _(Cross Origin Resource Sharing, a mechanism that allows restricted resources on a web page to be requested from another domain)_. The function is as follows: + + ```JavaScript + exports.handler = function(event, context) { + context.succeed(''); + }; + ``` + Upload it just like the previous Lambda functions: + + ![noop](https://cloud.githubusercontent.com/assets/12450298/12744540/be1404a0-c98c-11e5-8a7b-a0dfb74bc6f1.png) + +6. Next go to the Amazon API Gateway console and create a new API by clicking 'Create API'. Give it a name; we've called our API 'SecureUsers': + + ![api gateway](https://cloud.githubusercontent.com/assets/12450298/12744749/cd30dd9a-c98d-11e5-97ce-217fe7adf74f.png) + + + + + ### Triggering a Lambda function using an event from DynamoDB Lambda functions can be set up to be triggered by events from other AWS services like DynamoDB tables. This can be used to build applications that react to data modifications. @@ -544,12 +736,6 @@ NB: Using the JSON Message Generator option it is possible to format messages d More info and an example can be found [here](https://aws.amazon.com/blogs/compute/continuous-integration-deployment-for-aws-lambda-functions-with-jenkins-and-grunt-part-1/) -### Deploying Lambda Functions using Gulp - - Gulp can be used to automate the packaging and deployment of Lambda functions. - - More info on setting up gulp with aws-lambda can be found [here](https://medium.com/@AdamRNeary/a-gulp-workflow-for-amazon-lambda-61c2afd723b6#.4rfsrda09) - ### Continuous Integration using Codeship After writing your tests, the next step is to set up Continuous Integration (CI) for your Lambda Functions so every time you push up your code to GitHub, the tests are run and the code is deployed to AWS if the tests pass. This example goes through how to set up CI using Codeship. @@ -623,7 +809,8 @@ Some initial set up of your project repo is required.
This involves having a lam "lambda:UpdateFunctionCode", "lambda:UpdateFunctionConfiguration", "lambda:InvokeFunction", - "lambda:GetFunction" + "lambda:GetFunction", + "lambda:CreateFunction" ], "Resource": [ "YOUR_LAMBDA_FUNCTION_ARN_HERE" @@ -702,6 +889,475 @@ For more information have a look at the Codeship documentation: * [Integrating AWS Lambda with Codeship](https://blog.codeship.com/integrating-aws-lambda-with-codeship/) * [Deployment to AWS Lambda](https://codeship.com/documentation/continuous-deployment/deployment-to-aws-lambda/) +### Upload Your Lambda Function to an S3 Bucket and Automatically Deploy it to Lambda (bash script example) +In this example we will build a script that will execute the necessary steps to upload a Lambda function to S3, where it can be stored, and then automatically deploy it to Lambda. + +We will be writing our own bash script that will involve the use of some of the AWS CLI commands. Follow these instructions on how to get set up with the AWS CLI on your local machine: + +1. If you haven't already done so, set up an account with AWS **[here](http://aws.amazon.com/)**. + +2. You'll then need to get your 'access key ID' and 'secret access key' by doing the following: + * Open the IAM console + * In the navigation pane choose 'Users' + * Click your IAM username + * Click 'Security Credentials' and then 'Create Access Key' + * To see your access key, choose 'Show User Security Credentials'. Your credentials will look something like this: + Access Key ID: AKIAIOSFODNN7EXAMPLE + Secret Access Key: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY + * Click 'Download Credentials' and store them in a secure location + +3. Install the AWS CLI via a method of your choice **[here](http://docs.aws.amazon.com/cli/latest/userguide/installing.html)**. + +4. Once it's installed you have to configure it. Type ```aws configure``` in the command line.
You should see something like this: + ```bash + $ aws configure + AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE + AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY + Default region name [None]: eu-west-1 + Default output format [None]: ENTER + ``` + Enter your AWS access key, secret access key and region, then press enter on the last option. + You should now be good to go! + +5. Next write a Lambda function in your text editor if you haven't already. Check out our **[previous example](https://github.com/dwyl/learn-aws-lambda#hello-world-example-zip)** up until step 4 _(we'll be automating the zipping in this example)_. + +6. Once you've done this you'll want to create a new S3 bucket that will store all of your uploaded Lambda functions. Click on the S3 console on the AWS Management Console window: + + ![s3 console](https://cloud.githubusercontent.com/assets/12450298/12646827/32f97802-c5ca-11e5-84a8-b49e2cd0e929.png) + + Click the 'Create Bucket' button. Give your S3 Bucket a name and select its region. We've called ours 'lambda-function-container': + + ![s3 create bucket](https://cloud.githubusercontent.com/assets/12450298/12646889/8f342590-c5ca-11e5-8e2f-e2cb2bccf04d.png) + +7. Next you'll want to write a bash script that will perform 3 commands. The first is to create your deployment package (a .ZIP file containing your lambda function and its dependencies). The second will upload the deployment package to your newly created S3 Bucket. The third will deploy your Lambda function from S3. + + To do so create a new file, call it whatever you want and save it as a ```.sh``` file. We've called ours 'lambda-upload-create.sh'.
The 3 commands require variables as input, which is why we've included the ```echo``` & ```read``` bash commands in order to temporarily save these inputs: + + ![echo and read](https://cloud.githubusercontent.com/assets/12450298/12647320/7f9fdd52-c5cc-11e5-98d0-dfd68b6a8caf.png) + + We tried to have as few variable inputs as possible so that it reduces the margin for error when typing it into the command line. These are followed by our zip and AWS CLI commands: + + The first command (zip) takes two inputs, the name of the zip file you want to create and the names of the files you want to zip up _(in our case it's going to be upload and upload.js, seeing as we have no dependencies)_: + ```bash + zip -r "$ZipFileName.zip" $FilesToBeZipped + ``` + + The upload command 'put-object' takes three inputs: the name of the bucket, the key (the file path of the zip file) and the body (the same as the key in this case). + ```bash + aws s3api put-object --bucket $BucketName --key "./$ZipFileName.zip" --body "./$ZipFileName.zip" + ``` + + The deployment command 'create-function' takes five inputs: the function name, which can be anything you like; the runtime, which in our case is nodejs; the role, which is the ARN of an IAM role you have used/created in the IAM console; the code, which consists of the bucket name that you're deploying from and the key (the file path of the zip); and finally the description of your function, which is optional. Note that ```"$Description"``` is quoted so the description can contain spaces. + ```bash + aws lambda create-function --function-name $FunctionName --runtime nodejs \ + --role $Role --handler "$ZipFileName.handler" \ + --code S3Bucket="$BucketName",S3Key="./$ZipFileName.zip" \ + --description "$Description" + ``` +8.
Let's create the script in our ```package.json``` that will run the ```.sh``` file we just created: + + ![script link](https://cloud.githubusercontent.com/assets/12450298/12648830/b1af1d4c-c5d3-11e5-91af-0d32691b7764.png) + + In order to be able to run our script we have to make it **_executable_**. Type this command into your terminal: + + ```chmod +x (filenameOfScript.sh)``` + +9. One final step before we can run our script: go back to AWS and open the IAM console, because you need to add some policies that enable you to perform certain methods like 'create-function' or 'put-object'. + + Click on 'Groups' and then select 'Create Group'. We've made a 'Public' group, click on it once you've created it: + ![create group](https://cloud.githubusercontent.com/assets/12450298/12649316/06ac3a80-c5d6-11e5-8b05-d624d507a12e.png) + + Click on the 'Attach Policy' button and then select 'IAMFullAccess' from the list: + ![Attach policy](https://cloud.githubusercontent.com/assets/12450298/12649326/11a19a3e-c5d6-11e5-82e3-5136640fdeb4.png) + + Click on 'Create Group Policy' in the Inline Policies section: + ![inline policy](https://cloud.githubusercontent.com/assets/12450298/12649339/1c916cb2-c5d6-11e5-8109-34f51f860d5a.png) + + Select 'Custom Policy' and then press the 'Select' button: + ![custom policy](https://cloud.githubusercontent.com/assets/12450298/12649566/095d28ba-c5d7-11e5-812d-97ea278cb285.png) + + Create your custom policy. We've included the necessary effects, actions and resources to have complete access. Then click 'Apply Policy': + ![create custom policy](https://cloud.githubusercontent.com/assets/12450298/12649574/0f1dbcd8-c5d7-11e5-864e-d9e04b80882f.png) + + Once your group has been created you'll need to add a user to it. Any user who is added to that group will have the same permissions.
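The custom policy in the screenshot above isn't reproduced as text, so here is a hedged reconstruction of the statements the script's commands need. The action names are real IAM actions, but the statement itself is our guess, not a copy of the screenshot; scope `Resource` to your own ARNs rather than `"*"`:

```javascript
// Assumed group policy for the upload script (not copied from the screenshot).
// s3:PutObject backs `aws s3api put-object`, lambda:CreateFunction backs
// `aws lambda create-function`, and iam:PassRole lets you hand the execution
// role to the newly created function.
var scriptPolicy = {
  Version: '2012-10-17',
  Statement: [{
    Sid: 'UploadScriptAccess',
    Effect: 'Allow',
    Action: ['s3:PutObject', 'lambda:CreateFunction', 'iam:PassRole'],
    Resource: '*' // tighten this to your bucket/function/role ARNs
  }]
};

console.log(JSON.stringify(scriptPolicy, null, 2)); // paste into the policy editor
```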
If you haven't created a user you can do that here: + ![create user](https://cloud.githubusercontent.com/assets/12450298/12649893/73b3e590-c5d8-11e5-9cec-88bee1ac5c4d.png) + + Go back to the group you just created and then click 'Add Users to Group' and then select a user to add. The user should be the one that has the access key id and secret access key assigned to it that you're using for the AWS CLI. + ![add users](https://cloud.githubusercontent.com/assets/12450298/12650158/9c0df796-c5d9-11e5-91da-dc4f45d22c98.png) + + We should now be able to take our script for a spin! + +10. In the command line, run the script in your ```package.json```. Ours is as follows: + ```$ npm run upload``` + + This should prompt the ```echo``` and ```read``` commands first: + ```bash + Enter the name of the files you wish to zip (eg. lambdaFunction.js node_modules): upload.js + Enter the name of the output zip file (eg. lambdaFunction): upload + Enter the name of the s3 bucket you wish to upload to: lambda-function-container + Enter the name of your lambda function: Upload + Enter the ARN of the role you wish to implement: arn:aws:iam::655240711487:role/lambda_basic_execution + ``` + + After you've hit enter it should return this: + ```bash + adding: upload.js (deflated 17%) + { + "ETag": "\"519e9cfc9a2ee33412ba813c82f33a56fa3\"" + } + { + "CodeSha256": "nbYYHfHKyYSlb09Dpw7vf7wB93F+9V8XEmaTBU=", + "FunctionName": "Upload", + "CodeSize": 249, + "MemorySize": 128, + "FunctionArn": "arn:aws:lambda:eu-west-1:655240711487:function:Upload", + "Version": "$LATEST", + "Role": "arn:aws:iam::655240711487:role/lambda_basic_execution", + "Timeout": 3, + "LastModified": "2016-01-28T13:31:28.627+0000", + "Handler": "upload.handler", + "Runtime": "nodejs", + "Description": "Bash Script Tutorial" + } + ``` + +11. Go to S3 to check if the deployment package has been uploaded. 
You should see your ```.ZIP``` file: + + ![s3 uploaded](https://cloud.githubusercontent.com/assets/12450298/12650714/bcb0664e-c5db-11e5-9d08-a2bf2f2c32ff.png) + +12. Go to Lambda to check if your Lambda function has been enabled: + + ![lambda enabled](https://cloud.githubusercontent.com/assets/12450298/12650757/ee4b886e-c5db-11e5-8505-08b3b4bc0958.png) + + That's all! You should now be able to upload and deploy a Lambda function with a single bash script. + +### Deploying Lambda Functions using Gulp + + [Gulp](https://github.com/gulpjs/gulp/blob/master/docs/API.md) can be used to automate the zipping, deployment and testing of Lambda functions on AWS. The Codeship deployment script can then be reduced to a single command `gulp deploy`! + + The syntax to create a new Gulp task is: + + ```js + gulp.task('name of task', function() { + return //gulp functions to run + }) + ``` + + There are many plugins for performing actions like retrieving, moving and zipping files. These actions are also chainable. + + We will go through a simple gulp script with tasks for each of the steps involved. + + 1. Require in all the relevant modules and files. We'll be using the aws-sdk to deploy and invoke the lambda function. We also need to read in the `package.json` file in order to add the node modules to the zip file. + + ```js + var AWS = require('aws-sdk'); + var gulp = require('gulp'); + var zip = require('gulp-zip'); + var install = require('gulp-install'); + var runSequence = require('run-sequence'); + var fs = require('fs'); + + var packageJson = require('./package.json'); + ``` + + 2. Declare Constants.
+ + ```js + var region = 'eu-west-1'; //AWS region + var functionName = 'LambdaTest'; + var outputName = 'LambdaTest.zip'; //name to be given to output zip file + + // the ARN of the execution role to be given to the lambda function - change this to a role from your account + var IAMRole = 'arn:aws:iam::685330956565:role/lambda_basic_execution'; + + // the paths of the files to be added to the zip folder + var filesToPack = ['./lambda-testing/functions/LambdaTest.js']; + + ``` + + **Make sure the IAM role is changed to the ARN of a role from your AWS account and the region is set to the AWS region you want to deploy the Lambda function to!** + + 3. Create an archive folder and add the project files + + ```js + gulp.task('js', function () { + return gulp.src(filesToPack, {base: './lambda-testing/functions'}) + .pipe(gulp.dest('dist/')); + }); + ``` + + `gulp.src` takes an array of file paths as the first argument and an options object as the second. If you specify a base file path in the options, only the folders/files after the base are copied, i.e. in this case, only the LambdaTest.js file is copied into the archive folder (`dist`). + + 4. Add the node modules to the archive folder + + ```js + gulp.task('node-modules', function () { + return gulp.src('./package.json') + .pipe(gulp.dest('dist/')) + .pipe(install({production: true})); + }); + ``` + + In this task, the `package.json` file is copied to the archive folder and the 'gulp-install' module is used to do an `npm install --production` of all the listed dependencies. + + 5. Zip up the archive folder and save it. + + ```js + gulp.task('zip', function () { + return gulp.src(['dist/**', '!dist/package.json']) + .pipe(zip(outputName)) + .pipe(gulp.dest('./')); + }); + ``` + + All the files in the dist folder apart from the `package.json` file are zipped up using the 'gulp-zip' module and saved in the root of the project folder. + + 6. Upload the zip file to AWS.
If the function already exists, update it, otherwise create a new Function. + + We can create an 'upload' task with gulp + + ```js + gulp.task('upload', function() {}) + ``` + + Inside the function we first have to do a bit of set up: + + ```js + AWS.config.region = region; // this is set to eu-west-1 from the constants declared in step 1 + var lambda = new AWS.Lambda(); + var zipFile = './' + outputName; // the outputName has also been set in step 1 + ``` + + First we need to check if the function already exists on AWS before deciding whether to create a function or update a function. + + ```js + lambda.getFunction({ FunctionName: functionName }, function(err, data) { + if (err) createFunction(); + else updateFunction(); + }); + ``` + + We also need a function to retrieve the saved zip file in order to pass it in as a parameter in our create function command. + + ```js + function getZipFile (callback) { + fs.readFile(zipFile, function (err, data) { + if (err) console.log(err); + else { + callback(data); + } + }); + } + ``` + The `getZipFile` function takes a callback which gets called with the file data if the file is read successfully. + + Using the aws-sdk we can then define a function to create a new Lambda function from this zip file. 
+ + ```js + function createFunction () { + + getZipFile(function (data) { + var params = { + Code: { + ZipFile: data // buffer with the zip file data + }, + FunctionName: functionName, // functionName was set in the constants in step 1 + Handler: 'LambdaTest.handler', // need to set this as the name of our lambda function file is LambdaTest.js + Role: IAMRole, // IAMRole was set in the constants in step 1 + Runtime: 'nodejs' + }; + + lambda.createFunction (params, function (err, data) { + if (err) console.error(err); + else console.log('Function ' + functionName + ' has been created.'); + }); + }); + + } + ``` + Similarly we can also define `updateFunction`: + + ```js + function updateFunction () { + + getZipFile(function (data) { + var params = { + FunctionName: functionName, + ZipFile: data + }; + + lambda.updateFunctionCode(params, function(err, data) { + if (err) console.error(err); + else console.log('Function ' + functionName + ' has been updated.'); + }); + }); + } + ``` + + 7. Invoke the function with a test event to check the live version is working as expected. + + We have to first get the function to make sure it exists and only invoke it if there isn't an error. + + In the parameters for invoking the function, a JSON object can be specified as the 'Payload' and the 'InvocationType' can be specified as 'RequestResponse' if you want to get a response body. + + ```js + gulp.task('test-invoke', function() { + var lambda = new AWS.Lambda(); + + var params = { + FunctionName: functionName, + InvocationType: 'RequestResponse', + LogType: 'Tail', + Payload: '{ "key1" : "name" }' + }; + + lambda.getFunction({ FunctionName: functionName }, function(err, data) { + if (err) console.log("Function" + functionName + "not found", err); + else invokeFunction(); + }); + + function invokeFunction() { + lambda.invoke(params, function(err, data) { + if (err) console.log(err, err.stack); + else console.log(data); + }) + } + }) + ``` + + 8. 
Create a deployment task that runs all the above tasks in series in the correct order. + + The `runSequence` module takes a comma-separated list of gulp task names or a list of arrays with gulp tasks, and ends with a callback. The tasks are run in the order they are specified. To run two tasks in parallel specify them in the same array. + + ```js + gulp.task('deploy', function (callback) { + return runSequence( + ['js', 'node-modules'], + ['zip'], + ['upload'], + ['test-invoke'], + callback + ); + }); + ``` + + **In the AWS console you can only view functions by region, so if you can't see the function after it has been created, check you're looking at the correct region (in the dropdown menu in the top right of the console)** + + ![AWSregion](https://cloud.githubusercontent.com/assets/5912647/12677661/75d12846-c692-11e5-878d-990487be9910.png) + + 9. Add the deployment script to Codeship or your package.json + + In Codeship just add `gulp deploy` to your Deployment script and you're good to go! + + **Note: Make sure the Access Policy of the Codeship User in the IAM console on AWS has permissions for all the actions you're trying to execute, i.e. getting, creating, updating and invoking lambda functions.** + +#### Upload to S3 and Deploy to Lambda With Gulp +Here we will implement the previous example of uploading a Lambda function to S3 and then deploying it from the bucket. Instead of using a bash script we can use Gulp. We can make some small adjustments to the Gulp example that we just created in order to deploy from S3. This is a continuation from that so please check it out before you look at this one: + +1. We're going to want to create a new Gulp task that will upload our zip file. We've called our task 'upload-to-s3' and we've included it just after our zip task. + + The first thing we do is create a new S3 instance using the AWS SDK because we'll need to access some S3 methods.
We then declare the path of the zip file we want to upload, to be used as the S3 Key. Next we create a 'getZipFile' function that reads the file to provide the 'Body' parameter and wraps the call to the 'putObject' method. Finally we call the S3 method, which takes params _(an object with 'Bucket', 'Key' and 'Body' within it)_ and a callback that handles errors and is called when a response from the service is returned. + ```JavaScript + gulp.task('upload-to-s3', function () { + var s3 = new AWS.S3(); + var zipFilePath = './' + outputName; + getZipFile(function (data) { + var params = { + Bucket: 'lambda-function-container', + Key: zipFilePath, + Body: data + }; + s3.putObject(params, function(err, data) { + if (err) console.log('Object upload unsuccessful!'); + else console.log('Object ' + outputName + ' was uploaded!'); + }); + }); + function getZipFile (next) { + fs.readFile(zipFilePath, function (err, data) { + if (err) console.log(err); + else { + next(data); + } + }); + } + }); + ``` +2. Next we need to add our new task to the list of tasks in our runSequence that we've already created. We want it to come after zipping but before our 'upload' task: + + ```JavaScript + gulp.task('deploy', function (callback) { + return runSequence( + ['js', 'node-modules'], + ['zip'], + ['upload-to-s3'], + ['upload'], + ['test-invoke'], + callback + ); + }); + ``` + +3. In order for our Lambda function to be deployed from S3, we're going to have to adjust our 'createFunction' & 'updateFunction' Lambda methods that we created previously.
+ + We needed to change the 'Code' parameter from ```ZipFile: data``` to: + ```JavaScript + function createFunction () { + var params = { + Code: { + S3Bucket: bucketName, + S3Key: zipFile + }, + FunctionName: functionName, + Handler: 'LambdaTest.handler', + Role: IAMRole, + Runtime: 'nodejs' + }; + + lambda.createFunction (params, function (err, data) { + if (err) console.error("CREATE ERROR", err); + else console.log('Function ' + functionName + ' has been created.'); + }); + + } + ``` + We then needed to do the same with our 'updateFunction': + ```JavaScript + function updateFunction () { + var params = { + FunctionName: functionName, + S3Bucket: bucketName, + S3Key: zipFile + }; + + lambda.updateFunctionCode(params, function(err, data) { + if (err) console.error(err); + else console.log('Function ' + functionName + ' has been updated.'); + }); + } + ``` +4. Because we added some more AWS methods, we'll need to update our policy so that it supports this. Go to your IAM console and add the necessary methods. Here's ours: + + ![policy](https://cloud.githubusercontent.com/assets/12450298/12679928/e3ee941a-c69e-11e5-9e39-4ea1dcf95fda.png) + We included a 'getObject' method to check if the object had been uploaded already. + ```JavaScript + function checkObject (fn) { + var params = { + Bucket: bucketName, + Key: zipFile + }; + s3.getObject(params, function (err, data) { + if (err) console.log('BUCKET ERROR', err); + else fn(); + }); + } + ``` +5. Your script should be good to go! Once you've run it go to your S3 and Lambda consoles to check if your Lambda function has been uploaded and deployed: + + ![uploaded](https://cloud.githubusercontent.com/assets/12450298/12680122/0be2f64a-c6a0-11e5-91e4-c452adf3e766.png) + + ![deployed](https://cloud.githubusercontent.com/assets/12450298/12680144/2241d87a-c6a0-11e5-8e15-2c5fc32e3470.png) + + ### Versioning and Aliasing Lambda Functions Multiple versions of a Lambda function can be running at the same time on AWS. 
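Versions and aliases are addressed by qualified ARNs, formed by appending the version number or alias name to the function's ARN. A tiny sketch of how these are composed (the account ID below is made up):

```javascript
// A "qualified" ARN is the function ARN with ":<version>" or ":<alias>"
// appended. The account ID and function name here are made-up examples.
function qualifiedArn(functionArn, qualifier) {
  return functionArn + ':' + qualifier;
}

var base = 'arn:aws:lambda:eu-west-1:123456789012:function:LambdaTest';
console.log(qualifiedArn(base, '1'));    // a specific published version
console.log(qualifiedArn(base, 'PROD')); // an alias pointing at some version
```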
Each one has a unique ARN. This allows different versions to be used in different stages of the development workflow e.g. development, beta, staging, production etc. Versions are immutable. @@ -733,6 +1389,375 @@ Using aliases means that calling event source doesn't have to know the specific Walkthrough of implementing [versioning](http://docs.aws.amazon.com/lambda/latest/dg/versioning-aliases-walkthrough1.html) and [aliasing](http://docs.aws.amazon.com/lambda/latest/dg/versioning-aliases-walkthrough1.html) using the AWS CLI on the AWS Lambda docs. +### Create an API with GET/POST Methods that uses Lambda functions to retrieve/update records from a DynamoDB table + + 1. First we'll need to create a table in DynamoDB. Go to the DynamoDB console and then click the 'Create Table' button. Give your table a name _(call it something relevant to the type of data your DynamoDB table will hold)_. We've called ours 'Users'. The 'Primary key' is made up of a 'Partition key' _(hash key)_ and an optional 'Sort key'. _(The partition key is used to partition data across hosts for scalability and availability)_: + + + ![create table](https://cloud.githubusercontent.com/assets/5912647/12557398/7114929c-c382-11e5-9c48-5c2bf15649ac.png) + + ![table name](https://cloud.githubusercontent.com/assets/12450298/12714300/c9a4e152-c8cb-11e5-8c35-370393cef70e.png) + + For 'Table settings' just check the 'Use default settings' checkbox and then click the blue 'Create' button: + + ![table setup](https://cloud.githubusercontent.com/assets/12450298/12714466/db3a51d0-c8cc-11e5-882f-a3b09df203a4.png) + +2. 
Once the table is created, click on the 'Alarms' tab and then delete the basic alarms if they have been created: + + ![alarms](https://cloud.githubusercontent.com/assets/12450298/12714608/9da7b6ea-c8cd-11e5-8b5c-f09f94d3e66a.png) + + Then click on the 'Capacity' tab and then specify the 'Read' and 'Write' capacity units as 3 each and then click 'Save': + + ![capacity](https://cloud.githubusercontent.com/assets/12450298/12714552/5fe19b1e-c8cd-11e5-919a-780c3bb06316.png) + +3. Next we will have to create a policy that allows your AWS functions to access Cloudwatch logs as well as the table you just created. Go to the IAM console, select 'Roles' and then 'Create new role'. We've called ours 'APIGatewayLambdaExecRole': + + ![create role](https://cloud.githubusercontent.com/assets/12450298/12714889/11c25804-c8cf-11e5-8b32-e01f9673b8cf.png) + + Select the 'AWS Lambda' role: + + ![lambda role](https://cloud.githubusercontent.com/assets/12450298/12714963/651140f6-c8cf-11e5-87f5-f547605f757a.png) + + And then click 'Next step' to skip the 'Attach Policy' section: + + ![skip attach policy](https://cloud.githubusercontent.com/assets/12450298/12714986/8de42822-c8cf-11e5-9fc8-9aad5ed4b799.png) + + In the 'Review' section click the blue 'Create Role' button to finish: + + ![review role](https://cloud.githubusercontent.com/assets/12450298/12715013/bcb3bc1c-c8cf-11e5-8fce-37f32546d0b5.png) + + Click on the title of the role you just created then click the down arrow for 'Inline Policies'. 
Follow the link to create an inline policy: + + ![inline policies](https://cloud.githubusercontent.com/assets/12450298/12715091/385b678e-c8d0-11e5-8006-1d65487b933e.png) + + Click on the 'Custom Policy' radio button and then click 'Select': + + ![custom policy](https://cloud.githubusercontent.com/assets/12450298/12715150/857ad6e4-c8d0-11e5-9688-c6237746e742.png) + + Give your custom policy a name _(we've called ours 'LogAndDynamoDBAccess')_ and then enter the following in the 'Policy Document' section. **Make sure your "Resource" at the bottom is set to the ARN of your table and the second "SID" is set to "_YourTableName_DynamoDBReadWrite"**. _(the ARN can be found in your 'Table details' by going to your DynamoDB console and clicking on your table.)_: + + ``` + { + "Version": "2012-10-17", + "Statement": [ + { + "Sid": "AccessCloudwatchLogs", + "Action": [ + "logs:*" + ], + "Effect": "Allow", + "Resource": "arn:aws:logs:*:*:*" + }, + { + "Sid": "UsersDynamoDBReadWrite", + "Effect": "Allow", + "Action": [ + "dynamodb:DeleteItem", + "dynamodb:GetItem", + "dynamodb:PutItem", + "dynamodb:UpdateItem" + ], + "Resource": [ + "arn:aws:dynamodb:eu-west-1:655240720487:table/Users" + ] + } + ] + } + ``` + + 4. Now we need to create the Lambda functions for adding and retrieving data to and from the table _(we'll be creating our functions in a text editor, zipping them up and then uploading them to Lambda. Follow the instructions in the previous 'HELLO WORLD!' .zip example on how to do this)_: + + Create a new ```.js``` file that will contain our first Lambda function. This function will GET information from the DynamoDB table. We've called the file ```getUserInfo.js```. 
Here is the code:
+
+ ```JavaScript
+ var AWS = require('aws-sdk');
+ var DOC = require('dynamodb-doc');
+ var dynamo = new DOC.DynamoDB();
+
+ exports.handler = function(event, context) {
+   var callback = function(err, data) {
+     if (err) {
+       console.log('error on getUserInfo: ', err);
+       context.done('Unable to retrieve user information', null);
+     } else {
+       if(data.Item && data.Item.users) {
+         context.done(null, data.Item.users);
+       } else {
+         context.done(null, {});
+       }
+     }
+   };
+
+   dynamo.getItem({TableName:"Users", Key:{username:"default"}}, callback);
+ };
+ ```
+ Zip up the file and then upload it to Lambda:
+
+ ```zip -r getUserInfo.zip getUserInfo.js```
+
+ ![getuserinfo](https://cloud.githubusercontent.com/assets/12450298/12716616/ceb37a6a-c8d9-11e5-80be-54ebf8b9754d.png)
+
+ For the Role, select the one we created earlier. Then click 'Next' and then 'Create function':
+
+ ![role](https://cloud.githubusercontent.com/assets/12450298/12716846/6d4dcd82-c8db-11e5-9b01-3dccc12d8fa5.png)
+
+ Click 'Test' to test the function. The results should return an empty object ```{}```.
+
+ Create a second ```.js``` file that will contain our second Lambda function. This function will UPDATE information in our DynamoDB table. We've called the file ```updateUserInfo.js```.
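Before looking at the update function: the branching inside getUserInfo's callback can be checked without touching DynamoDB by pulling it out as a pure function. A small sketch — the `toResponse` name is invented here for illustration, it is not part of the deployed handler:

```JavaScript
// Sketch of getUserInfo's callback logic as a pure function, so the three
// branches (error / users present / users missing) can be exercised locally.
function toResponse (err, data) {
  if (err) return { error: 'Unable to retrieve user information' };
  if (data.Item && data.Item.users) return data.Item.users;
  return {};
}

console.log(toResponse(null, {}));                               // {}
console.log(toResponse(null, { Item: { users: [{ id: 1 }] } })); // [ { id: 1 } ]
```

Extracting decision logic like this is a cheap way to unit test a handler while leaving the AWS calls at the edges.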
Here is the code:
+
+ ```JavaScript
+ var AWS = require('aws-sdk');
+ var DOC = require('dynamodb-doc');
+ var dynamo = new DOC.DynamoDB();
+
+ exports.handler = function(event, context) {
+   var item = { username:"default",
+                users: event.users || {}
+   };
+
+   var callback = function(err, data) {
+     if (err) {
+       console.log(err);
+       context.fail('unable to update users at this time');
+     } else {
+       console.log(data);
+       context.done(null, data);
+     }
+   };
+
+   dynamo.putItem({TableName:"Users", Item:item}, callback);
+ };
+ ```
+ Again zip up the file and then upload it to Lambda:
+ ```zip -r updateUserInfo.zip updateUserInfo.js```
+
+ Follow the same steps as the previous function to create the second one, giving it the same role. They should both now appear in your functions section:
+
+ ![functions](https://cloud.githubusercontent.com/assets/12450298/12717241/7e1805bc-c8de-11e5-9c0c-9974a961cef7.png)
+
+ Test the function with a sample event relevant to your data. We created the following sample event:
+ ```
+ {
+   "users": [
+     {
+       "id": 1,
+       "name": "John Smith",
+       "location": "London"
+     }
+   ]
+ }
+ ```
+ You should see an empty object ```{}``` just like the first function.
+ Go back to the GetUserInfo function and then click 'Test' again. You should now see a returned result with the object in your sample event like this:
+
+ ```
+ [
+   {
+     "id": 1,
+     "location": "London",
+     "name": "John Smith"
+   }
+ ]
+ ```
+ 5. We're going to have to create one more Lambda function. It essentially does nothing, but it is required by the OPTIONS method for CORS _(Cross Origin Resource Sharing, a mechanism that allows restricted resources on a web page to be requested from a domain other than the one the resource originated from)_. The function is as follows:
+
+ ```JavaScript
+ exports.handler = function(event, context) {
+   context.succeed('');
+ }
+ ```
+ Upload it just like the previous Lambda functions:
+
+ ![noop](https://cloud.githubusercontent.com/assets/12450298/12744540/be1404a0-c98c-11e5-8a7b-a0dfb74bc6f1.png)
+
+ 6.
Next go to the Amazon API Gateway console and create a new API by clicking 'Create API'. Give it a name; we've called our API 'SecureUsers':
+
+ ![api gateway](https://cloud.githubusercontent.com/assets/12450298/12744749/cd30dd9a-c98d-11e5-97ce-217fe7adf74f.png)
+
+ Click on the 'Create Resource' button and then give your resource a name. We've called ours 'Users':
+
+ ![create resource button](https://cloud.githubusercontent.com/assets/12450298/12849024/2d7ae61c-cc15-11e5-8e92-1cefb9cc7bee.png)
+
+ Click 'Create Resource' again to confirm it:
+
+ ![create resource config](https://cloud.githubusercontent.com/assets/12450298/12849056/5e7c7082-cc15-11e5-87cc-51d921af1bd7.png)
+
+7. On the left hand side, click the endpoint you just created. Ours is ```/users```. Then click 'Create Method' and set it to 'GET':
+
+ ![GET](https://cloud.githubusercontent.com/assets/12450298/12849342/1d95f8ca-cc17-11e5-894b-3896f83d3f2f.png)
+
+ Select the 'Lambda function' radio button, assign it to the GET function we created earlier, then press 'Save':
+
+ ![assign GET function](https://cloud.githubusercontent.com/assets/12450298/12849623/87651974-cc18-11e5-8e88-ebf4f2b3c39d.png)
+
+ Click 'Test'. You should see an empty object ```{}``` in the response body:
+
+ ![GET test](https://cloud.githubusercontent.com/assets/12450298/12849531/f5d2f0ee-cc17-11e5-8162-cde17cdab2dc.png)
+
+8. Repeat the previous step, but instead of a 'GET', set the method to 'POST':
+
+ ![POST](https://cloud.githubusercontent.com/assets/12450298/12849673/cf1cf82c-cc18-11e5-8c8c-edac7bc0d39d.png)
+
+ Click 'Test', but this time add some details in the request body. We've added two users:
+
+ ```json
+ {"users": [ {"id": 1, "name": "Peter", "surname": "Smith"},
+             {"id": 2, "name": "John", "surname": "Walsh"}
+   ]
+ }
+ ```
+
+9. Go back to your 'GET' method and then click 'Test' again.
You should now be able to see that the table has been updated with the details you tested your 'POST' request with.
+
+ ![GET test 2](https://cloud.githubusercontent.com/assets/12450298/12849902/ebfa3602-cc19-11e5-92f6-ffa21320fd20.png)
+
+
+All done! You can now set up Lambda functions, invoked through the API Gateway, that manipulate information in an AWS DynamoDB table!
+
+
+### The Serverless Framework _(previously known as JAWS)_
+**_Serverless_** is a framework that allows you to build applications that run on AWS Lambda and AWS API Gateway.
+
+
+#### Why use it?
+This framework makes it extremely easy to deploy Lambda functions and API Gateway endpoints from your local machine. It also makes it _super_ easy to test your functions locally. If you want to know more about it, check out the [Serverless repo](https://github.com/serverless/serverless).
+
+We'll go through the steps to set up the Serverless example project 'serverless-starter'. If you haven't checked out our previous examples on how to create and deploy a Lambda function, we strongly recommend doing that before you get stuck into this one.
+
+1. First create an AWS account if you haven't done so already. Follow the steps in our previous examples in order to do this.
+
+2. Go to your IAM console. We're going to have to create a new 'User' that our serverless application will be able to use, and attach the necessary policy in order to initialize it. Go to the 'Users' tab and then click 'Create New Users'. Give your user a name and then click 'Create'. We've called ours 'serverless-admin':
+
+ ![serverless user](https://cloud.githubusercontent.com/assets/12450298/12822479/2d889d44-cb60-11e5-8c89-9420378d6be6.png)
+
+ Once you've created your user, click show credentials. Make a note of them and then download them.
_(keep them safe...we'll need them later)_
+
+ ![credentials](https://cloud.githubusercontent.com/assets/12450298/12822483/31bfc428-cb60-11e5-81f0-2990ec32ca41.png)
+
+ Click on your newly created user and then click the blue 'Attach Policy' button. Select the 'AdministratorAccess' policy and then click the 'Attach Policy' button again to attach it.
+
+ ![attach policy](https://cloud.githubusercontent.com/assets/12450298/12822486/35574a98-cb60-11e5-8cd7-2e2f06ab5c41.png)
+
+ Here's the summary of our serverless-admin user:
+
+ ![user summary](https://cloud.githubusercontent.com/assets/12450298/12822489/39388622-cb60-11e5-900d-bda80d95cd5f.png)
+
+3. Next you're going to have to install the AWS CLI if you haven't already done so. You can do so via any of the methods **[here](http://docs.aws.amazon.com/cli/latest/userguide/installing.html)**. Once you've done that, type the ```aws configure``` command into the command line. Use the Access Key ID and Secret Access Key from the user you just set up, select your region and then press enter for the last option:
+
+ ```bash
+ $ aws configure
+ AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
+ AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
+ Default region name [None]: eu-west-1
+ Default output format [None]: ENTER
+ ```
+ _(note: you'll still need to reconfigure your AWS CLI, even if you've done so previously)_
+
+
+4. Now you should be good to get started with Serverless. Run this command to install it globally:
+
+ ```$ npm install serverless -g```
+
+5. It's time to set up your project. For demonstration purposes we've chosen to run [serverless-starter](https://github.com/serverless/serverless-starter), which is a boilerplate example provided by Serverless. In future we'll show you how to set up your own project, but for now this is all you need to get used to the framework.
Run this command in your terminal: + + ```$ serverless project install serverless-starter``` + + You'll see this screen and then be prompted to insert some data: + + ![project data](https://cloud.githubusercontent.com/assets/12450298/12822470/2854a2be-cb60-11e5-9822-11913fdd98e1.png) + + Enter the name of the bucket you'd like Serverless to back up your Lambda functions with and an email that alarms can be sent to. Select your region and then select the default AWS profile: + + ```bash + Serverless: Enter a universally unique project bucket name: (serverless-starter-nkatqu-ejl6y9.com) learn-serverless + Serverless: Enter an email to use for AWS alarms: (me@serverless-starter-nkatqu.com) example@email.com + Serverless: Select a region for your project: + us-east-1 + us-west-2 + > eu-west-1 + ap-northeast-1 + Serverless: Select an AWS profile for your project: + > default + Serverless: Creating stage "dev"... + Serverless: Creating region "eu-west-1" in stage "dev"... + Serverless: Creating your project bucket on S3: serverless.eu-west-1.learn-serverless... + Serverless: Deploying resources to stage "dev" in region "eu-west-1" via Cloudformation (~3 minutes)... + Serverless: No resource updates are to be performed. + Serverless: Successfully created region "eu-west-1" within stage "dev" + Serverless: Successfully created stage "dev" + Serverless: Installing nodejs dependencies for component: restApi... + serverless-helpers-js@0.0.3 node_modules/serverless-helpers-js + └── dotenv@1.2.0 + Serverless: Successfully initialized project "serverless-starter" + Serverless: Successfully installed project "serverless-starter" + ``` +6. You should now be able to see the serverless-starter files in your directory: + + ![serverless-starter files](https://cloud.githubusercontent.com/assets/12450298/12822495/3ea624c0-cb60-11e5-87e5-4335faa9320a.png) + +7. Click on it. Let's have a look at what's inside. Click on the 'restApi' folder. 
In there you should see lib, multi and single directories: + + ![inside the files](https://cloud.githubusercontent.com/assets/12450298/12822502/43a7e9c2-cb60-11e5-8bd6-8fa5a0cf963b.png) + + ```lib``` - holds code shared across all of your functions + ```multi``` - this is a Serverless module that contains multiple Lambda functions + ```single``` - this is another Serverless module containing one Lambda function with multiple endpoints + + In the ```multi``` directory click on the ```create``` directory and then the ```s-function.json``` file. This file contains endpoint configuration information for that Lambda function. + + ![endpoint](https://cloud.githubusercontent.com/assets/12450298/12822510/4c8b6e38-cb60-11e5-80fe-2bc093f7f955.png) + + In the ```single``` directory click on the ```all``` directory and then navigate to its ```s-function.json``` file. Here you can see that a single function has been configured with multiple endpoints. (GET, POST, PUT, DELETE) + + ![endpoints](https://cloud.githubusercontent.com/assets/12450298/12822513/4f450b5c-cb60-11e5-990e-a5d175233442.png) + +8. Next we're going to run our Lambda functions locally. Type the following command in your command line. The third piece is the route to your function. We're testing the 'all' function and so ours is: + + ```$ serverless function restApi/single/all``` + + You should then see the function run and return the results: + + ``` + Serverless: Running restApi/single/all... + Serverless: ----------------- + Serverless: Success! - This Response Was Returned: + Serverless: {"message":"Your Serverless function ran successfully via the 'GET' method!"} + ``` + _Note: The function will take the event from the **event.json** file so you can configure whatever your payload is there. Our test function expects an object with a 'httpMethod' key so our event.json file looks like this:_ + + ```json + { + "httpMethod": "GET" + } + ``` + +9. 
Once we are happy with our Lambda functions and API Gateway endpoints, we can deploy them from the command line using the ```$ serverless dash deploy``` command. Use the up and down arrow keys to navigate the list and ```enter``` to select the functions and endpoints you want to deploy; the selected ones show up in yellow. Then move down to 'Deploy' and press ```enter```:
+
+ ![deploy serverless](https://cloud.githubusercontent.com/assets/12450298/12822528/5bd60a7e-cb60-11e5-9f55-460af2b6132c.png)
+
+ You'll then see this in your terminal:
+
+ ![deploying in process](https://cloud.githubusercontent.com/assets/12450298/12822531/5f22124a-cb60-11e5-8297-868c0cd250d2.png)
+
+10. We can then go to AWS and check out the S3, Lambda and API Gateway consoles to see if everything has been deployed correctly:
+
+ API Gateway
+ ![Api deploy](https://cloud.githubusercontent.com/assets/12450298/12822535/624c67fe-cb60-11e5-8120-9f74d933994d.png)
+
+ S3
+ ![S3 deploy](https://cloud.githubusercontent.com/assets/12450298/12822544/6ccb1ca2-cb60-11e5-8d1c-97e8e8224717.png)
+
+ Lambda
+ ![lambda deploy](https://cloud.githubusercontent.com/assets/12450298/12822547/70f21056-cb60-11e5-90c8-e2a3fd4aa457.png)
+
+_(The Serverless framework automates a lot of the processes that we have covered in previous examples, such as uploading to S3 and deploying to Lambda)_
+
+
+ ## Further Reading
+ Walkthrough Custom Events:
diff --git a/api-gateway-dynamoDB-example/getUserInfo.js b/api-gateway-dynamoDB-example/getUserInfo.js
new file mode 100644
index 0000000..f4a2d74
--- /dev/null
+++ b/api-gateway-dynamoDB-example/getUserInfo.js
@@ -0,0 +1,20 @@
+var AWS = require('aws-sdk');
+var DOC = require('dynamodb-doc');
+var dynamo = new DOC.DynamoDB();
+
+exports.handler = function(event, context) {
+  var callback = function(err, data) {
+    if (err) {
+      console.log('error on getUserInfo: ', err);
+      context.done('Unable to retrieve user information', null);
+    } else {
+      if(data.Item
&& data.Item.users) {
+        context.done(null, data.Item.users);
+      } else {
+        context.done(null, {});
+      }
+    }
+  };
+
+  dynamo.getItem({TableName:"Users", Key:{username:"default"}}, callback);
+};
diff --git a/api-gateway-dynamoDB-example/getUserInfo.zip b/api-gateway-dynamoDB-example/getUserInfo.zip
new file mode 100644
index 0000000..084bd01
Binary files /dev/null and b/api-gateway-dynamoDB-example/getUserInfo.zip differ
diff --git a/api-gateway-dynamoDB-example/no-op.js b/api-gateway-dynamoDB-example/no-op.js
new file mode 100644
index 0000000..3a8b308
--- /dev/null
+++ b/api-gateway-dynamoDB-example/no-op.js
@@ -0,0 +1,3 @@
+exports.handler = function(event, context) {
+  context.succeed('');
+}
diff --git a/api-gateway-dynamoDB-example/no-op.zip b/api-gateway-dynamoDB-example/no-op.zip
new file mode 100644
index 0000000..ad1d35f
Binary files /dev/null and b/api-gateway-dynamoDB-example/no-op.zip differ
diff --git a/api-gateway-dynamoDB-example/updateUserInfo.js b/api-gateway-dynamoDB-example/updateUserInfo.js
new file mode 100644
index 0000000..0f554e0
--- /dev/null
+++ b/api-gateway-dynamoDB-example/updateUserInfo.js
@@ -0,0 +1,21 @@
+var AWS = require('aws-sdk');
+var DOC = require('dynamodb-doc');
+var dynamo = new DOC.DynamoDB();
+
+exports.handler = function(event, context) {
+  var item = { username:"default",
+               users: event.users || {}
+  };
+
+  var callback = function(err, data) {
+    if (err) {
+      console.log(err);
+      context.fail('unable to update users at this time');
+    } else {
+      console.log(data);
+      context.done(null, data);
+    }
+  };
+
+  dynamo.putItem({TableName:"Users", Item:item}, callback);
+};
diff --git a/api-gateway-dynamoDB-example/updateUserInfo.zip b/api-gateway-dynamoDB-example/updateUserInfo.zip
new file mode 100644
index 0000000..0b2c4dd
Binary files /dev/null and b/api-gateway-dynamoDB-example/updateUserInfo.zip differ
diff --git a/dist/LambdaTest.js b/dist/LambdaTest.js
new file mode 100644
index 0000000..0084da5
--- /dev/null
+++
b/dist/LambdaTest.js @@ -0,0 +1,6 @@ +console.log('Loading event'); +console.log('Loading event again'); + +exports.handler = function(event, context) { + context.succeed(event.key1); // SUCCESS with message +}; diff --git a/package.json b/dist/package.json similarity index 79% rename from package.json rename to dist/package.json index 010888f..4928bae 100644 --- a/package.json +++ b/dist/package.json @@ -4,6 +4,7 @@ "description": "\"How-to Guide for AWS Lambda, DynamoDB, SNS & API Gateway\"", "main": "index.js", "scripts": { + "upload": "./lambda-upload-create.sh", "test": "./node_modules/lab/bin/lab ./lambda-testing/test/index.js -m 0 -c -a code -r lcov -o lcov.info -r console -o stdout", "coverage": "./node_modules/lab/bin/lab -r html -o ./test/coverage.html" }, @@ -26,6 +27,12 @@ "devDependencies": { "aws-lambda-mock-context": "^1.1.0", "code": "^2.0.1", - "lab": "^7.2.0" + "lab": "^7.2.0", + "archiver": "^0.21.0", + "aws-sdk": "^2.2.29", + "gulp": "^3.9.0", + "gulp-install": "^0.6.0", + "gulp-zip": "^3.0.2", + "run-sequence": "^1.1.5" } } diff --git a/gulpfile.js b/gulpfile.js new file mode 100644 index 0000000..8bfd363 --- /dev/null +++ b/gulpfile.js @@ -0,0 +1,164 @@ +var AWS = require('aws-sdk'); +var gulp = require('gulp'); +var zip = require('gulp-zip'); +var install = require('gulp-install'); +var runSequence = require('run-sequence'); +var fs = require('fs'); + +var packageJson = require('./package.json'); + +//constants +var region = 'eu-west-1'; +var functionName = 'LambdaTest'; +var outputName = 'LambdaTest.zip'; + +var IAMRole = 'arn:aws:iam::655240720487:role/lambda_basic_execution'; +var filesToPack = ['./lambda-testing/functions/LambdaTest.js']; + +/** + * Adds the project files to the archive folder. + */ +gulp.task('js', function () { + return gulp.src(filesToPack, {base: './lambda-testing/functions'}) + .pipe(gulp.dest('dist/')); +}); + +/** + * This task will copy all the required dependencies to + * the dist folder. 
+ */ +gulp.task('node-mods', function () { + return gulp.src('./package.json') + .pipe(gulp.dest('dist/')) + .pipe(install({production: true})); +}); + +/** + * Create an archive based on the dest folder. + */ + +gulp.task('zip', function () { + return gulp.src(['dist/**', '!dist/package.json']) + .pipe(zip(outputName)) + .pipe(gulp.dest('./')); +}); + +/** +* Upload deployment package to S3 (lambda function file + dependencies) +*/ +gulp.task('upload-to-s3', function () { + var s3 = new AWS.S3(); + var zipFilePath = './' + outputName; + getZipFile(function (data) { + var params = { + Bucket: 'lambda-function-container', + Key: zipFilePath, + Body: data + }; + s3.putObject(params, function(err, data) { + if (err) console.log('Object upload unsuccessful!'); + else console.log('Object ' + outputName + ' was uploaded!'); + }); + }); + function getZipFile (next) { + fs.readFile(zipFilePath, function (err, data) { + if (err) console.log(err); + else { + next(data); + } + }); + } +}); + +/** + * update or create the lambda functon + */ +gulp.task('upload', function() { + AWS.config.region = region; + var lambda = new AWS.Lambda(); + var s3 = new AWS.S3(); + var zipFile = './' + outputName; + var bucketName = 'lambda-function-container'; + + lambda.getFunction({ FunctionName: functionName }, function(err, data) { + if (err) checkObject(createFunction); + else checkObject(updateFunction); + }); + function checkObject (fn) { + var params = { + Bucket: bucketName, + Key: zipFile + }; + s3.getObject(params, function (err, data) { + if (err) console.log('BUCKET ERROR', err); + else fn(); + }); + } + function createFunction () { + var params = { + Code: { + S3Bucket: bucketName, + S3Key: zipFile + }, + FunctionName: functionName, + Handler: 'LambdaTest.handler', + Role: IAMRole, + Runtime: 'nodejs' + }; + + lambda.createFunction (params, function (err, data) { + if (err) console.error("CREATE ERROR", err); + else console.log('Function ' + functionName + ' has been created.'); + 
});
+
+  }
+
+  function updateFunction () {
+    var params = {
+      FunctionName: functionName,
+      S3Bucket: bucketName,
+      S3Key: zipFile
+    };
+
+    lambda.updateFunctionCode(params, function(err, data) {
+      if (err) console.error(err);
+      else console.log('Function ' + functionName + ' has been updated.');
+    });
+  }
+
+});
+
+gulp.task('test-invoke', function() {
+  var lambda = new AWS.Lambda();
+
+  var params = {
+    FunctionName: functionName,
+    InvocationType: 'RequestResponse',
+    LogType: 'Tail',
+    Payload: '{ "key1" : "name" }'
+  };
+
+  lambda.getFunction({ FunctionName: functionName }, function(err, data) {
+    if (err) console.log("FUNCTION NOT FOUND", err);
+    else invokeFunction();
+  });
+
+  function invokeFunction() {
+    lambda.invoke(params, function(err, data) {
+      if (err) console.log(err, err.stack);
+      else console.log(data);
+    })
+  }
+})
+
+
+gulp.task('deploy', function (callback) {
+  return runSequence(
+    ['js', 'node-mods'],
+    ['zip'],
+    ['upload-to-s3'],
+    ['upload'],
+    ['test-invoke'],
+    callback
+  );
+});
diff --git a/lambda-testing/functions/DynamoDBLambdaTest.js b/lambda-testing/functions/DynamoDBLambdaTest.js
new file mode 100644
index 0000000..fb3ecfa
--- /dev/null
+++ b/lambda-testing/functions/DynamoDBLambdaTest.js
@@ -0,0 +1,9 @@
+exports.handler = function(event, context) {
+  //console.log('Received event:', JSON.stringify(event, null, 2));
+  event.Records.forEach(function(record) {
+    console.log(record.eventID);
+    console.log(record.eventName);
+    console.log('DynamoDB Record: %j', record.dynamodb);
+  });
+  context.succeed(event.Records.length);
+};
diff --git a/lambda-testing/functions/lambdaTest.js b/lambda-testing/functions/lambdaTest.js
index 0084da5..438f5f7 100644
--- a/lambda-testing/functions/lambdaTest.js
+++ b/lambda-testing/functions/lambdaTest.js
@@ -1,6 +1,7 @@
-console.log('Loading event');
-console.log('Loading event again');
-
 exports.handler = function(event, context) {
-  context.succeed(event.key1); // SUCCESS with message
+
if(event.key1) { + context.succeed(event.key1); + } else { + context.fail("no key1"); + } }; diff --git a/lambda-testing/test/ava.test.js b/lambda-testing/test/ava.test.js new file mode 100644 index 0000000..1564e75 --- /dev/null +++ b/lambda-testing/test/ava.test.js @@ -0,0 +1,30 @@ +'use strict'; +var test = require('ava'); +var context = require('aws-lambda-mock-context'); +var promisify = require('aws-lambda-pify'); +var createDynamoDBEvent = require('./utils/eventCreators').createDynamoDBEvent; + +/** + Create mock event and context objects +**/ + +var ctx = context(); +var testEvent = { key1: 'name' } +var testDynamoDBEvent = createDynamoDBEvent(); + +/** + Promisify handlers and pass in mock context +**/ + +var LambdaTest = promisify(require('../functions/LambdaTest.js').handler, ctx); +var DynamoDBLambdaTest = promisify(require('../functions/DynamoDBLambdaTest.js').handler, ctx) + +test('LambdaTest', async t => { + t.is( await LambdaTest(testEvent), 'name') +}) + +test('DynamoDBLambdaTest', async t => { + t.is( await DynamoDBLambdaTest(testDynamoDBEvent), 3) +}) + +console.log('Tests took', process.uptime(), "seconds to run"); diff --git a/lambda-testing/test/callbacks.test.js b/lambda-testing/test/callbacks.test.js new file mode 100644 index 0000000..7cb3aa5 --- /dev/null +++ b/lambda-testing/test/callbacks.test.js @@ -0,0 +1,55 @@ +'use strict'; + +var Code = require('code'); +var Lab = require('lab'); +var lab = exports.lab = Lab.script(); +var describe = lab.experiment; +var expect = Code.expect; +var it = lab.test; +var createDynamoDBEvent = require('./utils/eventCreators').createDynamoDBEvent; + +/** + Handlers +**/ +var LambdaTest = require('../functions/LambdaTest.js').handler +var DynamoDBLambdaTest = require('../functions/DynamoDBLambdaTest.js').handler + +/** + Create mock event and context objects +**/ +var contextCreator = require('./utils/mockContext.js'); +var testEvent = { key1: 'name' } +var testDynamoDBEvent = createDynamoDBEvent(); + 
+describe('LambdaTest', function(){ + it("LambdaTest: returns value when given event with key1 property", function(done) { + + function test(result){ + expect(result).to.equal("name") + done(); + } + var context = contextCreator(test); + LambdaTest(testEvent, context); + }) + it("LambdaTest: returns error when given empty event", function(done) { + function test(error){ + expect(error).to.equal("no key1") + done(); + } + var context = contextCreator(test); + LambdaTest({}, context); + }) +}) + +describe('DynamoDB Triggered Lambda Test', function(){ + it("DynamoDBTest: returns number of records in the event", function(done) { + + function test(result){ + expect(result).to.equal(3) + done(); + } + var context = contextCreator(test); + + DynamoDBLambdaTest(testDynamoDBEvent, context); + }) +}) diff --git a/lambda-testing/test/data.json b/lambda-testing/test/data.json deleted file mode 100644 index 53bed0d..0000000 --- a/lambda-testing/test/data.json +++ /dev/null @@ -1,3 +0,0 @@ -{ - "key1": "name" -} diff --git a/lambda-testing/test/index.js b/lambda-testing/test/index.js deleted file mode 100644 index e0fde27..0000000 --- a/lambda-testing/test/index.js +++ /dev/null @@ -1,40 +0,0 @@ -'use strict'; - -var context = require('aws-lambda-mock-context'); -var Code = require('code'); -var Lab = require('lab'); -var lab = exports.lab = Lab.script(); -var describe = lab.experiment; -var expect = Code.expect; -var it = lab.test; - -var lambdaToTest = require('../functions/LambdaTest.js'); - -// creating context object -var ctx = context(); -// text event object -var testEvent = require('./data.json'); - -var response = null; -var error = null; - -describe('Test a simple Lambda function', function(){ - it("Capture response", function(done) { - lambdaToTest.handler(testEvent, ctx); - //capture the response or errors - ctx.Promise - .then(function(resp) { - response = resp; - done(); - }) - .catch(function(err) { - error = err; - done(); - }) - }) - - it("Check response", 
function(done) { - expect(response).to.equal('name'); - done(); - }) -}) diff --git a/lambda-testing/test/promises.test.js b/lambda-testing/test/promises.test.js new file mode 100644 index 0000000..5fdbb3e --- /dev/null +++ b/lambda-testing/test/promises.test.js @@ -0,0 +1,53 @@ +'use strict'; + +var context = require('aws-lambda-mock-context'); +var promisify = require('aws-lambda-pify'); +var Code = require('code'); +var Lab = require('lab'); +var lab = exports.lab = Lab.script(); +var describe = lab.experiment; +var expect = Code.expect; +var it = lab.test; +var createDynamoDBEvent = require('./utils/eventCreators').createDynamoDBEvent; + +/** + Create mock event and context objects +**/ +var ctx = context(); +var testEvent = { key1: 'name' } +var testDynamoDBEvent = createDynamoDBEvent(); + +/** + Promisify handlers and pass in mock context +**/ +var LambdaTest = promisify(require('../functions/LambdaTest.js').handler, ctx); +var DynamoDBLambdaTest = promisify(require('../functions/DynamoDBLambdaTest.js').handler, ctx) + + +describe('LambdaTest', function(){ + it("LambdaTest: returns value when given event with key1 property", function(done) { + LambdaTest(testEvent) + .then(function(response) { + expect(response).to.equal('name'); + done(); + }) + }) + it("LambdaTest: returns error when given empty event", function(done) { + LambdaTest({}) + .then() + .catch(function(error){ + expect(error).to.equal('no key1'); + done(); + }) + }) +}) + +describe('DynamoDB Triggered Lambda Test', function(){ + it("DynamoDBTest: returns number of records in the event", function(done) { + DynamoDBLambdaTest(testDynamoDBEvent) + .then(function(response) { + expect(response).to.equal(3); // three records in the event + done(); + }) + }) +}) diff --git a/lambda-testing/test/runner.js b/lambda-testing/test/runner.js index e486bce..ad9a93a 100644 --- a/lambda-testing/test/runner.js +++ b/lambda-testing/test/runner.js @@ -1,3 +1 @@ 'use strict'; - -require('./index.js'); diff --git 
a/lambda-testing/test/utils/eventCreators.js b/lambda-testing/test/utils/eventCreators.js new file mode 100644 index 0000000..0f37a8c --- /dev/null +++ b/lambda-testing/test/utils/eventCreators.js @@ -0,0 +1,117 @@ +/** + + Helper functions to create mock event objects for testing + +**/ + +/** +@param {object} - options for records e.g. + { + awsRegion: defaults to eu-west-1 + events: [] array of objects e.g. { type: 'INSERT', number: 2 }, + eventSourceARN: defaults to "arn:aws:dynamodb:us-west-2:account-id:table/ExampleTableWithStream/stream/2015-06-27T00:48:05.899" + } +**/ + +function createDynamoDBEvent(options){ + + /** + Add code to look at options and return appropriate event object. + **/ + + + return { + "Records": [ + { + "eventID": "1", + "eventVersion": "1.0", + "dynamodb": { + "Keys": { + "Id": { + "N": "101" + } + }, + "NewImage": { + "Message": { + "S": "New item!" + }, + "Id": { + "N": "101" + } + }, + "StreamViewType": "NEW_AND_OLD_IMAGES", + "SequenceNumber": "111", + "SizeBytes": 26 + }, + "awsRegion": "us-west-2", + "eventName": "INSERT", + "eventSourceARN": "arn:aws:dynamodb:us-west-2:account-id:table/ExampleTableWithStream/stream/2015-06-27T00:48:05.899", + "eventSource": "aws:dynamodb" + }, + { + "eventID": "2", + "eventVersion": "1.0", + "dynamodb": { + "OldImage": { + "Message": { + "S": "New item!" 
+ }, + "Id": { + "N": "101" + } + }, + "SequenceNumber": "222", + "Keys": { + "Id": { + "N": "101" + } + }, + "SizeBytes": 59, + "NewImage": { + "Message": { + "S": "This item has changed" + }, + "Id": { + "N": "101" + } + }, + "StreamViewType": "NEW_AND_OLD_IMAGES" + }, + "awsRegion": "us-west-2", + "eventName": "MODIFY", + "eventSourceARN": "arn:aws:dynamodb:us-west-2:account-id:table/ExampleTableWithStream/stream/2015-06-27T00:48:05.899", + "eventSource": "aws:dynamodb" + }, + { + "eventID": "3", + "eventVersion": "1.0", + "dynamodb": { + "Keys": { + "Id": { + "N": "101" + } + }, + "SizeBytes": 38, + "SequenceNumber": "333", + "OldImage": { + "Message": { + "S": "This item has changed" + }, + "Id": { + "N": "101" + } + }, + "StreamViewType": "NEW_AND_OLD_IMAGES" + }, + "awsRegion": "us-west-2", + "eventName": "REMOVE", + "eventSourceARN": "arn:aws:dynamodb:us-west-2:account-id:table/ExampleTableWithStream/stream/2015-06-27T00:48:05.899", + "eventSource": "aws:dynamodb" + } + ] + } +} + +module.exports = { + createDynamoDBEvent: createDynamoDBEvent +} diff --git a/lambda-testing/test/utils/mockContext.js b/lambda-testing/test/utils/mockContext.js new file mode 100644 index 0000000..f93a632 --- /dev/null +++ b/lambda-testing/test/utils/mockContext.js @@ -0,0 +1,41 @@ +'use strict'; +/** + Mock Context object for testing. 
Takes a callback function which will be called with the value passed into + the context method + + Example usage in a test: + + var contextCreator = require('./utils/mockContext.js'); + var testEvent = { key1: 'value1'} + + describe('LambdaTest', function(){ + it("LambdaTest: returns value when given event with key1 property", function(done) { + + function test(result){ + expect(result).to.equal("value1") + done(); + } + + var context = contextCreator(test); + + Handler(testEvent, context); + }) + }) + +**/ + +module.exports = function(cb) { + return { + succeed: function (result) { + console.log('succeed: ' + result); + cb(result); + }, + fail: function (error) { + console.log('fail: ' + error); + cb(error); + }, + done: function (result) { + cb(result); + } + } +}; diff --git a/serverless-starter/.gitignore b/serverless-starter/.gitignore new file mode 100644 index 0000000..f897b5a --- /dev/null +++ b/serverless-starter/.gitignore @@ -0,0 +1,40 @@ +# Logs +logs +*.log +npm-debug.log + +# Runtime data +pids +*.pid +*.seed +dist + +# Directory for instrumented libs generated by jscoverage/JSCover +lib-cov + +# Coverage directory used by tools like istanbul +coverage + +# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files) +.grunt + +# node-waf configuration +.lock-wscript + +# Compiled binary addons (http://nodejs.org/api/addons.html) +build/Release + +# Dependency directory +# https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git +node_modules + +#IDE Stuff +**/.idea + +#OS STUFF +.DS_Store +.tmp + +#SERVERLESS STUFF +admin.env +.env \ No newline at end of file diff --git a/serverless-starter/.npmignore b/serverless-starter/.npmignore new file mode 100644 index 0000000..52438f3 --- /dev/null +++ b/serverless-starter/.npmignore @@ -0,0 +1,41 @@ +# Logs +logs +*.log +npm-debug.log + +# Runtime data +pids +*.pid +*.seed +dist + +# Directory for instrumented libs generated by jscoverage/JSCover 
+lib-cov + +# Coverage directory used by tools like istanbul +coverage + +# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files) +.grunt + +# node-waf configuration +.lock-wscript + +# Compiled binary addons (http://nodejs.org/api/addons.html) +build/Release + +# Dependency directory +# https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git +node_modules + +#IDE Stuff +**/.idea + +#OS STUFF +.DS_Store +.tmp + +#SERVERLESS STUFF +admin.env +.env +_meta \ No newline at end of file diff --git a/serverless-starter/README.md b/serverless-starter/README.md new file mode 100644 index 0000000..c450d09 --- /dev/null +++ b/serverless-starter/README.md @@ -0,0 +1,33 @@ +#Serverless Starter + +A boilerplate for new Serverless Projects. This is full of useful examples and we add more on a regular basis. + +##Install + +Make sure you have the [Serverless Framework](http://www.serverless.com) installed and you're using Node V4 +``` +npm install serverless -g +``` + +Install the project using Serverless: +``` +serverless project install serverless-starter +``` +Go into each project component and install its dependencies via npm: +``` +npm install +``` +Deploy your functions and endpoints: +``` +serverless dash deploy +``` + +##Includes + +This project contains the following: + +* **Multi:** A Serverless module with multiple functions each containing a single endpoint +* **Single:** A Serverless module with a single function that uses multiple endpoints. 
+* **Optimizer Plugin:** Each function is automatically optimized via the [serverless-optimizer-plugin](https://www.github.com/serverless/serverless-optimizer-plugin) +* **Templates:** Templates are used to reduce configuration syntax +* **REST API Parameters:** The Multi/Show function endpoint gives an example of how to accept a path parameter diff --git a/serverless-starter/_meta/resources/s-resources-cf-dev-euwest1.json b/serverless-starter/_meta/resources/s-resources-cf-dev-euwest1.json new file mode 100644 index 0000000..4af5278 --- /dev/null +++ b/serverless-starter/_meta/resources/s-resources-cf-dev-euwest1.json @@ -0,0 +1,64 @@ +{ + "AWSTemplateFormatVersion": "2010-09-09", + "Description": "The AWS CloudFormation template for this Serverless application's resources outside of Lambdas and Api Gateway", + "Resources": { + "IamRoleLambda": { + "Type": "AWS::IAM::Role", + "Properties": { + "AssumeRolePolicyDocument": { + "Version": "2012-10-17", + "Statement": [ + { + "Effect": "Allow", + "Principal": { + "Service": [ + "lambda.amazonaws.com" + ] + }, + "Action": [ + "sts:AssumeRole" + ] + } + ] + }, + "Path": "/" + } + }, + "IamPolicyLambda": { + "Type": "AWS::IAM::Policy", + "Properties": { + "PolicyName": "dev-serverless-starter-lambda", + "PolicyDocument": { + "Version": "2012-10-17", + "Statement": [ + { + "Effect": "Allow", + "Action": [ + "logs:CreateLogGroup", + "logs:CreateLogStream", + "logs:PutLogEvents" + ], + "Resource": "arn:aws:logs:eu-west-1:*:*" + } + ] + }, + "Roles": [ + { + "Ref": "IamRoleLambda" + } + ] + } + } + }, + "Outputs": { + "IamRoleArnLambda": { + "Description": "ARN of the lambda IAM role", + "Value": { + "Fn::GetAtt": [ + "IamRoleLambda", + "Arn" + ] + } + } + } +} \ No newline at end of file diff --git a/serverless-starter/_meta/variables/s-variables-common.json b/serverless-starter/_meta/variables/s-variables-common.json new file mode 100644 index 0000000..fa858ff --- /dev/null +++
b/serverless-starter/_meta/variables/s-variables-common.json @@ -0,0 +1,6 @@ +{ + "project": "serverless-starter", + "projectBucket": "serverless.eu-west-1.serverless-demo", + "domain": "serverless-demo", + "notificationEmail": "jackacarlisle@gmail.com" +} \ No newline at end of file diff --git a/serverless-starter/_meta/variables/s-variables-dev-euwest1.json b/serverless-starter/_meta/variables/s-variables-dev-euwest1.json new file mode 100644 index 0000000..b2ff15d --- /dev/null +++ b/serverless-starter/_meta/variables/s-variables-dev-euwest1.json @@ -0,0 +1,3 @@ +{ + "region": "eu-west-1" +} \ No newline at end of file diff --git a/serverless-starter/_meta/variables/s-variables-dev.json b/serverless-starter/_meta/variables/s-variables-dev.json new file mode 100644 index 0000000..8f706c2 --- /dev/null +++ b/serverless-starter/_meta/variables/s-variables-dev.json @@ -0,0 +1,3 @@ +{ + "stage": "dev" +} \ No newline at end of file diff --git a/serverless-starter/package.json b/serverless-starter/package.json new file mode 100644 index 0000000..58357ef --- /dev/null +++ b/serverless-starter/package.json @@ -0,0 +1,18 @@ +{ + "name": "serverless-starter", + "version": "0.1.0", + "description": "A Serverless Project and its Serverless Plugin dependencies.", + "author": { + "name": "me" + }, + "license": "MIT", + "private": false, + "repository": { + "type": "git", + "url": "git://github.com/" + }, + "dependencies": { + "serverless-optimizer-plugin": "^1.0.0" + }, + "scripts": {} +} \ No newline at end of file diff --git a/serverless-starter/restApi/lib/index.js b/serverless-starter/restApi/lib/index.js new file mode 100644 index 0000000..989a051 --- /dev/null +++ b/serverless-starter/restApi/lib/index.js @@ -0,0 +1,35 @@ +/** + * Lib + */ + +// Single - All +module.exports.singleAll = function(event, cb) { + + var response = { + message: 'Your Serverless function ran successfully via the \'' + + event.httpMethod + + '\' method!' 
+ }; + + return cb(null, response); +}; + +// Multi - Create +module.exports.multiCreate = function(event, cb) { + + var response = { + message: 'Your Serverless function \'multi/create\' ran successfully!' + }; + + return cb(null, response); +}; + +// Multi - Show +module.exports.multiShow = function(event, cb) { + + var response = { + message: 'Your Serverless function \'multi/show\' ran successfully with the following ID \'' + event.pathId + '\'!' + }; + + return cb(null, response); +}; \ No newline at end of file diff --git a/serverless-starter/restApi/multi/create/event.json b/serverless-starter/restApi/multi/create/event.json new file mode 100644 index 0000000..9e26dfe --- /dev/null +++ b/serverless-starter/restApi/multi/create/event.json @@ -0,0 +1 @@ +{} \ No newline at end of file diff --git a/serverless-starter/restApi/multi/create/handler.js b/serverless-starter/restApi/multi/create/handler.js new file mode 100644 index 0000000..bef188c --- /dev/null +++ b/serverless-starter/restApi/multi/create/handler.js @@ -0,0 +1,22 @@ +'use strict'; + +/** + * Serverless Module: Lambda Handler + * - Your lambda functions should be a thin wrapper around your own separate + * modules, to keep your code testable, reusable and AWS independent + * - 'serverless-helpers-js' module is required for Serverless ENV var support. 
Hopefully, AWS will add ENV support to Lambda soon :) + */ + +// Require Serverless ENV vars +var ServerlessHelpers = require('serverless-helpers-js').loadEnv(); + +// Require Logic +var lib = require('../../lib'); + +// Lambda Handler +module.exports.handler = function(event, context) { + + lib.multiCreate(event, function(error, response) { + return context.done(error, response); + }); +}; \ No newline at end of file diff --git a/serverless-starter/restApi/multi/create/s-function.json b/serverless-starter/restApi/multi/create/s-function.json new file mode 100644 index 0000000..b2184da --- /dev/null +++ b/serverless-starter/restApi/multi/create/s-function.json @@ -0,0 +1,34 @@ +{ + "name": "create", + "handler": "multi/create/handler.handler", + "runtime": "nodejs", + "timeout": 6, + "memorySize": 1024, + "custom": { + "excludePatterns": [], + "envVars": [], + "optimize": true + }, + "endpoints": [ + { + "path": "multi/create", + "method": "GET", + "authorizationType": "none", + "apiKeyRequired": false, + "requestParameters": {}, + "requestTemplates": "$${apiRequestTemplate}", + "responses": { + "400": { + "statusCode": "400" + }, + "default": { + "statusCode": "200", + "responseParameters": {}, + "responseModels": {}, + "responseTemplates": {}, + "application/json": "" + } + } + } + ] +} \ No newline at end of file diff --git a/serverless-starter/restApi/multi/s-module.json b/serverless-starter/restApi/multi/s-module.json new file mode 100644 index 0000000..cbc5e72 --- /dev/null +++ b/serverless-starter/restApi/multi/s-module.json @@ -0,0 +1,14 @@ +{ + "name": "multi", + "version": "0.0.1", + "profile": "aws-v0.1.1", + "location": "https://github.com/...", + "author": "", + "description": "A Serverless Module", + "runtime": "nodejs", + "custom": {}, + "cloudFormation": { + "resources": {}, + "lambdaIamPolicyDocumentStatements": [] + } +} \ No newline at end of file diff --git a/serverless-starter/restApi/multi/show/event.json 
b/serverless-starter/restApi/multi/show/event.json new file mode 100644 index 0000000..9e26dfe --- /dev/null +++ b/serverless-starter/restApi/multi/show/event.json @@ -0,0 +1 @@ +{} \ No newline at end of file diff --git a/serverless-starter/restApi/multi/show/handler.js b/serverless-starter/restApi/multi/show/handler.js new file mode 100644 index 0000000..e4b6d39 --- /dev/null +++ b/serverless-starter/restApi/multi/show/handler.js @@ -0,0 +1,22 @@ +'use strict'; + +/** + * Serverless Module: Lambda Handler + * - Your lambda functions should be a thin wrapper around your own separate + * modules, to keep your code testable, reusable and AWS independent + * - 'serverless-helpers-js' module is required for Serverless ENV var support. Hopefully, AWS will add ENV support to Lambda soon :) + */ + +// Require Serverless ENV vars +var ServerlessHelpers = require('serverless-helpers-js').loadEnv(); + +// Require Logic +var lib = require('../../lib'); + +// Lambda Handler +module.exports.handler = function(event, context) { + + lib.multiShow(event, function(error, response) { + return context.done(error, response); + }); +}; \ No newline at end of file diff --git a/serverless-starter/restApi/multi/show/s-function.json b/serverless-starter/restApi/multi/show/s-function.json new file mode 100644 index 0000000..0be8795 --- /dev/null +++ b/serverless-starter/restApi/multi/show/s-function.json @@ -0,0 +1,34 @@ +{ + "name": "show", + "handler": "multi/show/handler.handler", + "runtime": "nodejs", + "timeout": 6, + "memorySize": 256, + "custom": { + "excludePatterns": [], + "envVars": [], + "optimize": true + }, + "endpoints": [ + { + "path": "multi/show/{id}", + "method": "GET", + "authorizationType": "none", + "apiKeyRequired": false, + "requestParameters": {}, + "requestTemplates": "$${apiRequestTemplate}", + "responses": { + "400": { + "statusCode": "400" + }, + "default": { + "statusCode": "200", + "responseParameters": {}, + "responseModels": {}, + "responseTemplates": {}, + 
"application/json": "" + } + } + } + ] +} \ No newline at end of file diff --git a/serverless-starter/restApi/multi/show/s-templates.json b/serverless-starter/restApi/multi/show/s-templates.json new file mode 100644 index 0000000..4e82c6a --- /dev/null +++ b/serverless-starter/restApi/multi/show/s-templates.json @@ -0,0 +1,7 @@ +{ + "apiRequestTemplate": { + "application/json": { + "pathId": "$input.params('id')" + } + } +} \ No newline at end of file diff --git a/serverless-starter/restApi/package.json b/serverless-starter/restApi/package.json new file mode 100644 index 0000000..725b4a2 --- /dev/null +++ b/serverless-starter/restApi/package.json @@ -0,0 +1,17 @@ +{ + "name": "", + "version": "0.0.1", + "description": "Serverless module dependencies", + "author": "me", + "license": "MIT", + "private": true, + "repository": { + "type": "git", + "url": "git://github.com/" + }, + "keywords": [], + "devDependencies": {}, + "dependencies": { + "serverless-helpers-js": "~0.0.3" + } +} \ No newline at end of file diff --git a/serverless-starter/restApi/s-component.json b/serverless-starter/restApi/s-component.json new file mode 100644 index 0000000..74afc60 --- /dev/null +++ b/serverless-starter/restApi/s-component.json @@ -0,0 +1,5 @@ +{ + "name": "restApi", + "runtime": "nodejs", + "custom": {} +} \ No newline at end of file diff --git a/serverless-starter/restApi/single/all/event.json b/serverless-starter/restApi/single/all/event.json new file mode 100644 index 0000000..b654c39 --- /dev/null +++ b/serverless-starter/restApi/single/all/event.json @@ -0,0 +1,3 @@ +{ + "httpMethod": "GET" +} diff --git a/serverless-starter/restApi/single/all/handler.js b/serverless-starter/restApi/single/all/handler.js new file mode 100644 index 0000000..5c76139 --- /dev/null +++ b/serverless-starter/restApi/single/all/handler.js @@ -0,0 +1,22 @@ +'use strict'; + +/** + * Serverless Module: Lambda Handler + * - Your lambda functions should be a thin wrapper around your own separate + * 
modules, to keep your code testable, reusable and AWS independent + * - 'serverless-helpers-js' module is required for Serverless ENV var support. Hopefully, AWS will add ENV support to Lambda soon :) + */ + +// Require Serverless ENV vars +var ServerlessHelpers = require('serverless-helpers-js').loadEnv(); + +// Require Logic +var lib = require('../../lib'); + +// Lambda Handler +module.exports.handler = function(event, context) { + + lib.singleAll(event, function(error, response) { + return context.done(error, response); + }); +}; \ No newline at end of file diff --git a/serverless-starter/restApi/single/all/s-function.json b/serverless-starter/restApi/single/all/s-function.json new file mode 100644 index 0000000..d659acd --- /dev/null +++ b/serverless-starter/restApi/single/all/s-function.json @@ -0,0 +1,94 @@ +{ + "name": "all", + "handler": "single/all/handler.handler", + "runtime": "nodejs", + "timeout": 6, + "memorySize": 1024, + "custom": { + "excludePatterns": [], + "envVars": [], + "optimize": true + }, + "endpoints": [ + { + "path": "single/all", + "method": "GET", + "authorizationType": "none", + "apiKeyRequired": false, + "requestParameters": {}, + "requestTemplates": "$${apiRequestTemplate}", + "responses": { + "400": { + "statusCode": "400" + }, + "default": { + "statusCode": "200", + "responseParameters": {}, + "responseModels": {}, + "responseTemplates": {}, + "application/json": "" + } + } + }, + { + "path": "single/all", + "method": "PUT", + "authorizationType": "none", + "apiKeyRequired": false, + "requestParameters": {}, + "requestTemplates": "$${apiRequestTemplate}", + "responses": { + "400": { + "statusCode": "400" + }, + "default": { + "statusCode": "200", + "responseParameters": {}, + "responseModels": {}, + "responseTemplates": {}, + "application/json": "" + } + } + }, + { + "path": "single/all", + "method": "POST", + "authorizationType": "none", + "apiKeyRequired": false, + "requestParameters": {}, + "requestTemplates": 
"$${apiRequestTemplate}", + "responses": { + "400": { + "statusCode": "400" + }, + "default": { + "statusCode": "200", + "responseParameters": {}, + "responseModels": {}, + "responseTemplates": {}, + "application/json": "" + } + } + }, + { + "path": "single/all", + "method": "DELETE", + "authorizationType": "none", + "apiKeyRequired": false, + "requestParameters": {}, + "requestTemplates": "$${apiRequestTemplate}", + "responses": { + "400": { + "statusCode": "400" + }, + "default": { + "statusCode": "200", + "responseParameters": {}, + "responseModels": {}, + "responseTemplates": {}, + "application/json": "" + } + } + } + ] +} \ No newline at end of file diff --git a/serverless-starter/restApi/single/all/s-templates.json b/serverless-starter/restApi/single/all/s-templates.json new file mode 100644 index 0000000..4e82c6a --- /dev/null +++ b/serverless-starter/restApi/single/all/s-templates.json @@ -0,0 +1,7 @@ +{ + "apiRequestTemplate": { + "application/json": { + "pathId": "$input.params('id')" + } + } +} \ No newline at end of file diff --git a/serverless-starter/restApi/single/s-module.json b/serverless-starter/restApi/single/s-module.json new file mode 100644 index 0000000..f059220 --- /dev/null +++ b/serverless-starter/restApi/single/s-module.json @@ -0,0 +1,14 @@ +{ + "name": "single", + "version": "0.0.1", + "profile": "aws-v0.1.1", + "location": "https://github.com/...", + "author": "", + "description": "A Serverless Module", + "runtime": "nodejs", + "custom": {}, + "cloudFormation": { + "resources": {}, + "lambdaIamPolicyDocumentStatements": [] + } +} \ No newline at end of file diff --git a/serverless-starter/s-project.json b/serverless-starter/s-project.json new file mode 100644 index 0000000..1ec4aae --- /dev/null +++ b/serverless-starter/s-project.json @@ -0,0 +1,76 @@ +{ + "name": "serverless-starter", + "version": "0.0.1", + "profile": "serverless-v0.1.1", + "location": "https://github.com/...", + "author": "", + "description": "A starter boilerplate 
for new Serverless Projects using primarily nodejs", + "custom": {}, + "plugins": [ + "serverless-optimizer-plugin" + ], + "cloudFormation": { + "AWSTemplateFormatVersion": "2010-09-09", + "Description": "The AWS CloudFormation template for this Serverless application's resources outside of Lambdas and Api Gateway", + "Resources": { + "IamRoleLambda": { + "Type": "AWS::IAM::Role", + "Properties": { + "AssumeRolePolicyDocument": { + "Version": "2012-10-17", + "Statement": [ + { + "Effect": "Allow", + "Principal": { + "Service": [ + "lambda.amazonaws.com" + ] + }, + "Action": [ + "sts:AssumeRole" + ] + } + ] + }, + "Path": "/" + } + }, + "IamPolicyLambda": { + "Type": "AWS::IAM::Policy", + "Properties": { + "PolicyName": "${stage}-${project}-lambda", + "PolicyDocument": { + "Version": "2012-10-17", + "Statement": [ + { + "Effect": "Allow", + "Action": [ + "logs:CreateLogGroup", + "logs:CreateLogStream", + "logs:PutLogEvents" + ], + "Resource": "arn:aws:logs:${region}:*:*" + } + ] + }, + "Roles": [ + { + "Ref": "IamRoleLambda" + } + ] + } + } + }, + "Outputs": { + "IamRoleArnLambda": { + "Description": "ARN of the lambda IAM role", + "Value": { + "Fn::GetAtt": [ + "IamRoleLambda", + "Arn" + ] + } + } + } + } +} \ No newline at end of file diff --git a/serverless-starter/s-templates.json b/serverless-starter/s-templates.json new file mode 100644 index 0000000..c62dd0f --- /dev/null +++ b/serverless-starter/s-templates.json @@ -0,0 +1,12 @@ +{ + "apiRequestTemplate": { + "application/json": { + "httpMethod": "$context.httpMethod", + "body": "$input.json('$')", + "queryParams" : "$input.params().querystring", + "headerParams" : "$input.params().header", + "headerParamNames" : "$input.params().header.keySet()", + "contentTypeValue" : "$input.params().header.get('Content-Type')" + } + } +} \ No newline at end of file diff --git a/upload-deploy-example/lambda-upload-create.sh b/upload-deploy-example/lambda-upload-create.sh new file mode 100755 index 0000000..e6fd4bb --- 
/dev/null +++ b/upload-deploy-example/lambda-upload-create.sh @@ -0,0 +1,41 @@ +#!/bin/bash + +# Bash script that zips a deployment package and uploads it to a specified S3 bucket, +# then deploys the function to Lambda (Node.js runtime). + +echo -n "Enter the name of the files you wish to zip (eg. lambdaFunction.js node_modules): " +read FilesToBeZipped +echo -n "Enter the name of the output zip file (eg. lambdaFunction): " +read ZipFileName +echo -n "Enter the name of the s3 bucket you wish to upload to: " +read BucketName +echo -n "Enter the name of your lambda function: " +read FunctionName +echo -n "Enter a description of your function: " +read Description +echo -n "Enter the ARN of the IAM role you wish to use: " +read Role + +zip -r "$ZipFileName.zip" $FilesToBeZipped + +aws s3api put-object --bucket "$BucketName" --key "./$ZipFileName.zip" --body "./$ZipFileName.zip" + +aws lambda create-function --function-name "$FunctionName" --runtime nodejs \ +--role "$Role" --handler "$ZipFileName.handler" \ +--code S3Bucket="$BucketName",S3Key="./$ZipFileName.zip" \ +--description "$Description" + + + + + + + + +# echo -n "Enter the object version (eg. 111111): " +# read ObjectVersion + +# TODO: add default values for the variables +# TODO: check whether the object version already exists +# TODO: decide whether the role is pre-created or should be created by the script +# NOTE: if you get 'permission denied', make the script executable (chmod +x) diff --git a/upload-deploy-example/upload.js b/upload-deploy-example/upload.js new file mode 100644 index 0000000..5ccfc0b --- /dev/null +++ b/upload-deploy-example/upload.js @@ -0,0 +1,3 @@ +exports.handler = function(event, context) { + context.succeed(event.key1 + ' ' + event.key2); +}; diff --git a/upload-deploy-example/upload.zip b/upload-deploy-example/upload.zip new file mode 100644 index 0000000..ad076f2 Binary files /dev/null and b/upload-deploy-example/upload.zip differ