From 673c3e2f6bf0a43bb9f07d7e3d1d7a58d952da45 Mon Sep 17 00:00:00 2001 From: Mohamed Elasmar <71043312+moelasmar@users.noreply.github.com> Date: Wed, 18 Aug 2021 15:27:12 -0700 Subject: [PATCH] cdk-support-dev branch merged to develop branch (#3146) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * chore: Use BUILD_TAG and JENKINS_URL to identify Jenkins env (#2805) * chore: Use BUILD_TAG instead of JENKINS_URL to identify Jenkins env * Keep JENKINS_URL * fix: Java Gradle 7 Build Test Data Issue (#2816) * Updated Deprecated compile to implementation * Updated Tabs to Spaces * Updated for Kotlin * Updated More Tabs to Spaces * Request for Comments: Auto Create ECR Repos in Guided Deploy (#2675) * Added Auto Create ECR Design Doc * Updated Format * Addressed feedback * fix(bug): Pass boto_session to SAM Translator library (#2759) When validating the SAM template, SAM CLI requires credentials to get the IAM Managed Policies. SAM Translator also requires the region in order to figure out the partition. Previously, SAM Translator assumed this to be on the Env but SAM CLI could get this information from a command line argument or a profile. This commit passes the boto_session into the SAM Translator lib (v1.35.0 or later), so that SAM Translator can figure out the partition from the information passed to SAM CLI. 
Co-authored-by: Jacob Fuss Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> * feat: add option --container-host and --container-host-interface to sam local commands (#2806) * chore: bump version to 1.23.0 (#2824) Co-authored-by: Xia Zhao * refactor: Extract git-clone functionality out of InitTemplates class (#2821) * [Refactor] extract git-clone functionality out of InitTemplates class to its own class * apply review comments * typo * apply review comments * chore: add command line options to pyinstaller build script (#2829) * chore: add command line options to pyinstaller build script * Update quotes * fix the dist folder name * update logs * trigger appveyor build * ignoring temp dirs used by dotnet (#2839) Co-authored-by: Slava Senchenko * chore: Add GitHub actions to automate our issues workflow (#2521) * add github actions to automate our github issue workflow * reformat * update name format * update response message to be more precise * updated with the correct sam bot login name * updated with the correct token name * updated label name and bot name Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> * Point numpy version to <1.20.3 (#2868) * Point numpy version to <1.19 to avoid PEP 517 failure * Update integ test python requirements which contain numpy * Fixing to numpy 1.20.2 * Revert "Fixing to numpy 1.20.2" This reverts commit a03f4d77e4b1588ecc3d0cbbe0f4c7c80ef60571. 
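The boto_session change described above lets the translator derive the AWS partition from the region carried on the session (supplied via a CLI argument or profile) instead of assuming it is on the environment. A minimal sketch of that region-to-partition mapping; `partition_for_region` is a hypothetical helper for illustration only, since the real lookup consults botocore's endpoint data:

```python
def partition_for_region(region_name: str) -> str:
    """Map an AWS region to its partition, as a translator library can
    once it is handed the session's region. Illustrative sketch only;
    the real implementation uses botocore endpoint data."""
    if region_name.startswith("us-gov-"):
        return "aws-us-gov"
    if region_name.startswith("cn-"):
        return "aws-cn"
    return "aws"

# Region comes from a command line argument or profile, not the Env:
print(partition_for_region("cn-north-1"))  # aws-cn
print(partition_for_region("us-east-1"))   # aws
```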
* Fixing numpy version to <1.20.3 * chore: Overhaul the development guide (#2827) * Validate default template.json (#2855) Issue: https://github.com/aws/aws-sam-cli/issues/2355 Added integration tests for `validate` command Co-authored-by: Slava Senchenko Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * fix: package/deploy failure when Location/TemplateURL is virtual host S3 URL (#2785) * feat: Supports uncompressing local layer zips in sam local (#2877) * refactor: refactor logs command library (#2862) * refactor logs command library * re-organize due to click usage * address comments * adding pylint disable for console consumer * make pylint happy with python 3.6 Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> * sam init - Enable --app-template argument for Image package-type (#2864) * Enable --app-template argument for Image package-type while generating a new SAM project using 'sam init' * Fix the exception message * normalize paths in UT to pass on windows * normalize project-template local path * fix: Ignore `.aws-sam` in sam build cache checksum (#2881) * feat: Allow dir_checksum() to accept an ignore_list * feat: Ignore .aws-sam when calculating cache md5 * fix: Fix crash when nested CFN stack has dict TemplateURL (unresolved intrinsics) (#2879) * fix: Fix crash when nested CFN stack has dict TemplateURL * Interactive flow question default answer from toml (#2850) * get questions' default answers from toml * make black happy * add more docs * rename question's attribute 'default_from_toml' to 'defaultFromToml' and rename 'valueof' to 'key' and add some docs * Add preload_value * Allow passing a toml file to interactive flow run() * Update related classes to utilize preload value context object * Update test * Add missing docstring * Remove samconfig change * Rename extra_context to context because it is a required field now * Remove toml logics from this PR * Update comment Co-authored-by: Sam Liu Co-authored-by: _sam 
<3804518+aahung@users.noreply.github.com> * Don't use layer.name in LayerBuildDefinition.__str__ (#2873) * Watchdog error (#2902) * chore: updating version of watchdog. Co-authored-by: Tarun Mall * chore: Update aws_lambda_builders to 1.4.0 (#2903) * chore: Update aws_lambda_builders to 1.4.0 * Update integration tests for new maven behavior * Add integ test for PEP 600 tags * chore: Adds missing unit tests for LayerBuildDefinition in build_graph (#2883) * Adds missing unit tests for LayerBuildDefinition in build_graph * fix black formatting * fix: Build graph tests using assertTrue instead of assertEqual + added assertions (#2909) * samconfig debug level logging fixed; documentation updated (#2891) * samconfig debug level logging fixed; documentation updated * integration tests fix * help text improved Co-authored-by: Slava Senchenko * chore: update to aws-sam-translator 1.36.0 (#2917) * Revert "samconfig debug level logging fixed; documentation updated (#2891)" (#2918) This reverts commit 2a13a69822660538c478118125eef50d0164995a. * chore: bump version to 1.24.0 (#2919) * fix: Windows default validate template integration test (#2924) * Enabled ability to provide tags as list in samconfig.toml file (#2912) * Enabled ability to provide tags as list in samconfig.toml file * Removed trailing white spaces and reformatted code * Added integration test for tags as list deploy command * Added integration test for tags as string from samconfig.toml Co-authored-by: Mohamed Elasmar <71043312+moelasmar@users.noreply.github.com> Co-authored-by: Wing Fung Lau <4760060+hawflau@users.noreply.github.com> * fix: Add configparser to PyInstaller hiddenimports to resolve dependency issue from botocore (#2932) * Revert "Enabled ability to provide tags as list in samconfig.toml file (#2912)" (#2933) This reverts commit 104b5e5c528ef7e1ad0e83a5ba42316836a21e83. 
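The tags-as-list change above (#2912) means samconfig.toml may supply deploy tags either as one space-delimited string or as a TOML array of "Key=Value" entries. A hedged sketch of the normalization a deploy path could apply; `normalize_tags` is a hypothetical helper name, not the CLI's actual function:

```python
from typing import List, Union

def normalize_tags(tags: Union[str, List[str]]) -> List[str]:
    """Accept tags from samconfig.toml as either a single
    space-delimited string or a list of "Key=Value" entries,
    returning a flat list. Illustrative sketch only."""
    if isinstance(tags, str):
        return tags.split()
    # A list entry may itself hold several space-delimited pairs.
    return [pair for entry in tags for pair in str(entry).split()]

print(normalize_tags("stage=prod team=core"))      # ['stage=prod', 'team=core']
print(normalize_tags(["stage=prod", "team=core"]))  # ['stage=prod', 'team=core']
```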
* chore: bump version to 1.24.1 (#2938) * chore: Update requests to 2.25.1 to remove the locking on urllib3 to 1.25 (#2929) * Updating tomlkit version as we need the fix for the data-loss bug during copy() method use on Table object (#2939) * Updating tomlkit version as we need the fix for the data-loss bug during copy() method use on Table object * Fixing types for tomlkit * Adding integration test for tomlkit not able to parse boolean issue. * Updating THIRD-PARTY-LICENSES file. * Parameterizing integ test filename Co-authored-by: Tarun Mall Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * test: Fix the integration validate tests on Windows (#2940) * ci: Pin boto3-stubs to 1.17.90 due to a bug in 1.17.91 (#2942) * resolve pseudo region in build and deploy commands (#2884) * resolve pseudo region from command argument or envvar if available * Revert "resolve pseudo region from command argument or envvar if available" This reverts commit abc0b2b62526f517dd633186861087fefb0f8b6e. * pass the aws-region to the BuildContext, DeployContext and Deploy command * Add integration tests * Make black happy * Temporarily skip SAR build INTEGRATION TEST till we figure out the credential issue * skip SAR tests when no credentials are available * Use the constant IntrinsicsSymbolTable.AWS_REGION instead of the string 'AWS::Region' * expand build SAR integration tests to four (all combinations of use-container and us-east-2 region) * refactoring, merge stack_names and stack_names_with_regions together Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * fix: Catch more errors when building an image (#2857) * chore: fix canary/integration test issue (#2945) * feat: Allow tags as list input from samconfig.toml file (#2956) * Enabled ability to provide tags as list in samconfig.toml file * Removed trailing white spaces and reformatted code * Added integration test for tags as list deploy command * Added integration test for tags as string from samconfig.toml * Fixed 
AppVeyor error by removing s3 info Co-authored-by: Mohamed Elasmar <71043312+moelasmar@users.noreply.github.com> Co-authored-by: Wing Fung Lau <4760060+hawflau@users.noreply.github.com> * fix: Deploy integration tests for toml tags as a list (#2965) * chore: Increase awareness of same file warning during package (#2946) * chore: increase awareness of same file warning during package * fix formatting & grammar Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> * fix: Allow the base64Encoded field in REST Api, skip validation of unknown fields and validate missing statusCode for Http Api (#2941) * fix API Gateway emulator: - skip validating non-allowed fields for Http Api Gateway, as it always skips unknown fields - add base64Encoded as an allowed field for Rest Api gateway - base64 decoding will always be done for Http API gateway if the lambda response isBase64Encoded is true, regardless of the content-type - validate if statusCode is missing in the case of Http API and payload version 1.0 * - accept "true", "True", "false", "False" as valid isBase64Encoded values. 
- Validate other isBase64Encoded values - add more integration && unit test cases * fix lint && black issues * use smaller image to test Base64 response * fix: pass a copy of environment variables to keep the cache valid (#2943) * fix: pass a copy of environment variables to keep the cache valid * add integ tests * update docs * make black happy Co-authored-by: Qingchuan Ma <69653965+qingchm@users.noreply.github.com> * fix: Skip build of Docker image if ImageUri is a valid ECR URL (#2934) (#2935) * Add condition to managed bucket policy (#2999) * Update appveyor.yml to do docker login on both dockerhub and Public ECR (#3005) (#3006) Co-authored-by: Wing Fung Lau <4760060+hawflau@users.noreply.github.com> * chore: bump version to 1.25.0 (#3007) Co-authored-by: Sriram Madapusi Vasudevan <3770774+sriram-mv@users.noreply.github.com> * temp: reduce python testing matrix (#3008) * temp: disable testing against python 3.8, and enabled 3.7 (#3009) * temp: disable testing against python 3.8, and enabled 3.7 * temp: disable testing against python 3.8, and enabled 3.7 & 3.6 * fix: enable all runtimes in python testing matrix (#3011) * revert: enable all runtimes in python testing matrix * fix indentation for yml * chore: update to aws-sam-translator 1.37.0 (#3019) * chore: bump version to 1.26.0 (#3020) * chore: Improved --resolve-s3 option documentation and deployment without s3 error messages (#2983) * Improve documentation on --resolve-s3 option and improve s3 failure messages * Changed indentation for integration test on s3 error message * Fixed a typo in description * Improve spacing on help text for resolve-s3 option * feature: new SAM command supporting CDK, sam init, sam package, sam deploy (#2994) * Cdk support package and deploy (#352) * Refactor project type click option * Refactor IAC helper * Update callbacks handling --cdk-app and --template * Add methods for stack in iac interface; Update CFN plugin to link image assets * Refactor option validations and 
update package cli interface * Update commands to include iac option validations * Fix iac validation * sam package for CDK * sam package & deploy for CDK * Update option validations to deal with guided deploy * Update test for guided deploy for CDK * Upgrade lambda builder * chore: Update aws_lambda_builders to 1.4.0 (#2903) * chore: Update aws_lambda_builders to 1.4.0 * Update integration tests for new maven behavior * Add integ test for PEP 600 tags * Update asset parameter after package * Update iac cdk unit tests * Update iac cdk unit tests * resolve PR comments * resolve PR comments Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> Co-authored-by: Mohamed Elasmar * SAM CLI CDK init flow: (#344) * SAM CLI CDK init flow: interactive and non-interactive * fix current test cases * black reformat * Allow clone from non-master branch * trigger tests * Resolve comments * Resolve comments, fix cdk runtime list, and improve docstring and error message * fix pylint * fix pylint * Update exception name for CDK project errors * Trigger appveyor * ci: Pin boto3-stubs to 1.17.90 due to a bug in 1.17.91 (#2942) * black reformat * Cdk support package and deploy fix (#2996) * Fix --resolve-s3 --s3-bucket validation under guided flow * Fix package resource assets * Add debug * Trigger test with debug * restart docker service in linux * revert - restart docker service in linux * Update appveyor.yml to log into ECR * Revert "Update appveyor.yml to log into ECR" This reverts commit e948298f1279c973fb8b596d39942afb18a32626. 
* Update appveyor.yml to log into Public ECR * Update appveyor.yml to explicitly specify server for logging in dockerhub * Disable python3.7, 3.6 to run integ test without pull limitation * fix rapid version regex * Update regex * fix integ test options * fix parsing the Lambda Function Image Uri * try fixing another integ test issue * resolve the resources assets * fix two log diff errors * Fix recognizing assets in CFN project * Fix artifact_exporter unit test * Fix handling packageable resources in Metadata * Fix handling of Metadata resource in artifact exporter * Fix integ test - test_deploy_without_stack_name * Handling missing stack_name in iac_validator * Add more tests * Improve package regression log * Increase rerun number on two flaky tests test_all_containers_are_initialized_before_any_invoke/test_no_new_created_containers_after_lambda_function_invoke * Fix handling of multiple assets in one resource * Fix handling of Metadata section * enable integration test for python 3.6 * enable integration test for python 3.7 * kick off tests Co-authored-by: Wing Fung Lau <4760060+hawflau@users.noreply.github.com> Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> Co-authored-by: Mohamed Elasmar * chore: bump SAM CLI version * Merge cdk develop branch into beta release branch (#3047) * fix: interactive creating CDK project won't direct to the correct resource (#3044) Co-authored-by: Wing Fung Lau <4760060+hawflau@users.noreply.github.com> Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> Co-authored-by: Mohamed Elasmar * feat: Add SAM Pipeline commands (#3085) * sam pipeline bootstrap (#2811) * two-stages-pipeline plugin * typos * add docstring * make mypy happy * removing swap file * delete the two_stages_pipeline plugin as the pipeline-bootstrap command took over its responsibility * remove 'get_template_function_runtimes' function as the decision was made not to process the SAM template during pipeline init, which was the only place the function was used * sam pipeline bootstrap command * move the pipelineconfig.toml file to .aws-sam * UX - rewriting Co-authored-by: Chris Rehn * UX improvements * make black happy * apply review comments * UX - rewriting Co-authored-by: Chris Rehn * refactor * 
Apply review comments * use the Python way of array element assignment * Update samcli/lib/pipeline/bootstrap/stage.py Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * apply review comments * typo * read using utf-8 * create and use a safe version of the save_config method * apply review comments * rename _get_command_name to _get_command_names * don't save generated ARNs for now, will save during init * Revert "don't save generated ARNs for now, will save during init" This reverts commit d184e164022d9560131c62a826436edbc93da189. * Notify the user to periodically rotate the IAM credentials * typo * Use AES instead of KMS for S3 SSE * rename Ecr to ECR and Iam to IAM * Grant lambda service explicit permissions to the ECR instead of relying on granting these permissions ad hoc while creating the container images Co-authored-by: Chris Rehn Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * sam pipeline init command (#2831) * sam pipeline init command * apply review comments * apply review comments * display a message that we have successfully created the pipeline configuration file(s). * doc typo * Let 'sam pipeline init' prefill pipeline's infrastructure resources… (#2894) * Let 'sam pipeline init' prefill pipeline's infrastructure resources' values from 'sam pipeline bootstrap' results. * save bootstrapped stage region * make black happy * exclude non-dict keys from samconfig.get_env_names method. * Rename the pipeline 'Stage' concept to 'Environment' (#2908) * Rename the pipeline 'Stage' concept to 'Environment' * typo * Rename --environment-name argument to --environment * Sam pipelines ux rename ecr repo to image repository (#2910) * Rename ecr-repo to image-repository * UT Fixes * typo * typo * feat: Support creating pipeline files directly into . without hooks (#2911) * feat: Support creating pipeline files directly into . 
without hooks * Integration test for pipeline init and pipeline bootstrap (#2841) * Expose Environment._get_stack_name for integ test to predict stack name * Add integ test for pipeline bootstrap * Add init integ test * small UX improvements: (#2914) * small UX improvements: 1. show a message when the user cancels a bootstrapping command. 2. Don't prompt for CI/CD provider or provider templates if there is only one choice. 3. Make PipelineFileAlreadyExistsError a UserError. 4. use the Colored class instead of fg='color' when prompting a colored message. 5. Fix a bug where we were not allowing empty responses for non-required questions. * Fix Integration Test: We now don't ask the user to select a provider's pipeline template if there is only one * Add docs for PipelineFileAlreadyExistsError * make black happy * Sam pipelines s3 security (#2975) * Deny non-HTTPS requests for the artifacts S3 bucket * enable bucket server-side logging * add integration tests for artifacts bucket SSL-only requests and access logging * typo * Ensure the ArtifactsLoggingBucket denies non-SSL requests (#2976) * Sam pipelines ux round 3 (#2979) * rename customer facing message 'CI/CD provider' to 'CI/CD system' * add a note about what 'Environment Name' is during the pipeline bootstrap guided context * Apply suggestions from code review typo Co-authored-by: Chris Rehn Co-authored-by: Chris Rehn * let pipeline IAM user assume only IAM roles tagged with Role=pipeline-execution-role (#2982) * Adding AWS_ prefix to displayed output. 
(#2993) Co-authored-by: Tarun Mall * Add region to pipeline bootstrap interactive flow (#2997) * Ask AWS region in bootstrap interactive flow * Read default region from boto session first * Fix a unit test * Inform write to pipelineconfig.toml at the end of bootstrap (#3002) * Print info about pipelineconfig.toml after resources are bootstrapped * Update samcli/commands/pipeline/bootstrap/cli.py Co-authored-by: Chris Rehn Co-authored-by: Chris Rehn * List detected env names in pipeline init when prompting to input the env name (#3000) * Allow question.question to be resolved using a key path * Pass the list of env names message (environment_names_message) into pipeline init interactive flow context * Update samcli/commands/pipeline/init/interactive_init_flow.py Co-authored-by: Chris Rehn * Fix unit test (trigger pr builds) * Fix integ test * Fix integ test Co-authored-by: Chris Rehn * Adding account id to bootstrap message. (#2998) * Adding account id to bootstrap message. * adding docstring * Addressing PR comments. * Adding unit tests. * Fixing unit tests. Co-authored-by: Tarun Mall * Cfn creds fix (#3014) * Removing pipeline user creds from cfn output. This maintains the same user experience. 
Co-authored-by: Tarun Mall * Ux bootstrap revamp 20210706 (#3021) * Add intro paragraph to bootstrap * Add switch account prompt * Revamp stage definition prompt * Revamp existing resources prompt * Revamp security prompt * Allow answers to be changed later * Add exit message for bootstrap * Add exit message for bootstrap (1) * Add indentation to review values * Add "Below is the summary of the answers:" * Sweep pylint errors * Update unit tests * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/cli.py Co-authored-by: Chris Rehn * Update unit tests * Add bold to other literals Co-authored-by: Chris Rehn * Adding account condition for CFN execution role. (#3027) Co-authored-by: Tarun Mall * pipeline UX revamp 20210707 (#3031) * Allow running bootstrap inside pipeline init * Select account credential source within bootstrap * Add bootstrap decorations within pipeline init * Removing ip range option from bootstrap. (#3036) * Removing ip range option from bootstrap. * Fixing unit test from UX PR. Co-authored-by: Tarun Mall * Fix toml file incorrect read/write in init --bootstrap (#3037) * Temporarily removing account fix. 
(#3038) Co-authored-by: Tarun Mall * Rename environment to stage (#3040) * Improve account source selection (#3042) * Fixing various cosmetic UX issues with pipeline workflow. (#3046) * Fixing credential to credentials * Forcing text color to yellow. * Adding new line after stage diagram. * Adding extra line after checking bootstrap message. * Renaming config -> configuration * account source -> credential source * Removing old message. * Fixing indentation in list. * Fixing a bunch of indentation. * fixing f-string Co-authored-by: Tarun Mall * Auto skip questions if stage detected (#3045) * Autofill question if default value is present * Allow using an index to select stage names (#3051) * Updating message when bootstrap stages are missing. (#3058) * Updating message when bootstrap stages are missing. * Fixing indentation Co-authored-by: Tarun Mall * Fixing bootstrap integ tests. (#3061) * Fixing bootstrap integ tests. * Cleaning up some integ tests. * Using environment variables when running integ test on CI. * Using expression instead of full loop. * Adding instruction to use default profile on local. Co-authored-by: Tarun Mall * Fix bootstrap test region (#3064) * Fix bootstrap region in integ test * Fix regions in non-interactive mode as well * Add more pipeline init integ test (#3065) * Fix existing pipeline init integ test * Add more pipeline init integ tests * Config file bug (#3066) * Validating config file after bootstrap stack creation. * Validating config file after bootstrap. 
Co-authored-by: Tarun Mall * Fix pipeline init integ test because pipelineconfig file exists (#3067) * Make stage name randomized to avoid race condition among multiple canary runs (#3078) * Load number of stages from pipeline template (#3059) * Load number of stages from templates * Rename variable and add debug log * Add encoding to open() * Allow roles with Tag aws-sam-pipeline-codebuild-service-role to assume PipelineExecutionRole (#2950) * pipeline init UX: Ask to confirm when file exists (#3079) * Ask to confirm overriding if files already exist, or save to another directory * Add doc links (#3087) * Adding accidentally removed tests back. (#3088) Co-authored-by: Tarun Mall Co-authored-by: elbayaaa <72949274+elbayaaa@users.noreply.github.com> Co-authored-by: Chris Rehn Co-authored-by: Ahmed Elbayaa Co-authored-by: Tarun Co-authored-by: Tarun Mall * chore: bump aws-lambda-builder version to 1.5.0 (#3086) * chore: update to aws-sam-translator 1.38.0 (#3073) * ci: Update expected Jenkins file in pipeline integ test (#3090) * chore: Refine pipeline help text and update unit test (#3091) * Update --bucket help text * Update --stage help text * Update help text * Update help text * Update help text * Update help text * Update help text * Update jenkins generated files * Update some intro texts * Remove trailing spaces * Clearing pipeline integ test buckets with versioned objects. (#3094) * Clearing pipeline integ test buckets with versioned objects. * Fixing black formatting. Co-authored-by: Tarun Mall * Fixing bug in bucket cleanup. (#3096) Co-authored-by: Tarun Mall * Deleting bucket (#3097) Co-authored-by: Tarun Mall * Revert "temp: disable testing against python 3.8, and enabled 3.7 (#3009)" (#3098) This reverts commit fe832185be09acb199b2a09ad73bf59e1553d131. Co-authored-by: Tarun Mall * chore: bump SAM CLI version to 1.27.0 (#3101) * Add pipeline to pyinstaller (#3103) * Adding pipeline to pyinstaller. 
Co-authored-by: Tarun Mall * Including stage resource yaml in pip. (#3106) * Including stage resource yaml in pip. * Bumping patch version Co-authored-by: Tarun Mall * ci: Speed up unit test by caching the git clone (#3060) * ci: Speed up unit test by caching the git clone * Revert "Revert "temp: disable testing against python 3.8, and enabled 3.7"" (#3102) This reverts commit 1916bfa354b5d2612bd1bf9efd54a77e2bc66ff6. Revert "Revert "temp: disable testing against python 3.8, and enabled 3.7 (#3009)" (#3098)" (#3102) Co-authored-by: Tarun Mall * fix: fixing pipeline init integration test. (#3123) * fix: fixing pipeline init integration test so that it don't break every time we update our template. * black formatting. * cleaning up not needed file. Co-authored-by: Tarun Mall * chore: upgrade pylint to 2.9.0 (#3119) * chore: fix pylint failures in python3.9 * chore: fix pylint failures in python3.9 * chore: bump pylint version to 2.9.0 * fix typo * Add disabling reasons on new rules * fix: integration test case related to recent fix on ruby (#3124) * fix: add dockerhub default login server, improve logs to check docker pull limitation (#3137) * fix: add sample payload for 'sam local generate-event stepfunctions error' (#3043) * add sample payload for 'sam local generate-event stepfunctions error' * add better default for error * chore: Use BUILD_TAG and JENKINS_URL to identify Jenkins env (#2805) * chore: Use BUILD_TAG instead of JENKINS_URL to identify Jenkins env * Keep JENKINS_URL * Request for Comments: Auto Create ECR Repos in Guided Deploy (#2675) * Added Auto Create ECR Design Doc * Updated Format * Addressed feedback * fix(bug): Pass boto_session to SAM Translator library (#2759) When validating the sam template, SAM CLI requires credentials to get the IAM Manged Policies. SAM Translator also requires the region in order to figure out the parition. 
Previously, SAM Translator assumed this to be on the Env but SAM CLI could get this information from a command line argument or a profile. This commit passes the boto_session into the SAM Translator lib (v1.35.0 or later), so that SAM Translator can figure out the partition from the information passed to SAM CLI. Co-authored-by: Jacob Fuss Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> * feat: add option --container-host and --container-host-interface to sam local commands (#2806) * chore: bump version to 1.23.0 (#2824) Co-authored-by: Xia Zhao * refactor: Extract git-clone functionality out of InitTemplates class (#2821) * [Refactor] extract git-clone functionality out of InitTemplates class to its own class * apply review comments * typo * apply review comments * ignoring temp dirs used by dotnet (#2839) Co-authored-by: Slava Senchenko * chore: Add GitHub actions to automate our issues workflow (#2521) * add github actions to automate our github issue workflow * reformat * update name format * update response message to be more precise * updated with the correct sam bot login name * updated with the correct token name * updated label name and bot name Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> * Point numpy version to <1.20.3 (#2868) * Point numpy version to <1.19 to avoid PEP 317 failure * Update integ test python requirements which contain numpy * Fixing to numpy 1.20.2 * Revert "Fixing to numpy 1.20.2" This reverts commit a03f4d77e4b1588ecc3d0cbbe0f4c7c80ef60571. 
* Fixing numpy version to <1.20.3 * chore: Overhaul the development guide (#2827) * Validate default template.json (#2855) Issue: https://github.com/aws/aws-sam-cli/issues/2355 Added integration tests for `validate` command Co-authored-by: Slava Senchenko Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * fix: package/deploy failure when Location/TemplateURL is virtual host S3 URL (#2785) * feat: Supports uncompression local layer zips in sam local (#2877) * refactor: refactor logs command library (#2862) * refactor logs command library * re-organize due to click usage * address comments * adding pylint disable for console consumer * make pylint happy with python 3.6 Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> * sam init - Enable --app-template argument for Image package-type (#2864) * Enable --app-template argument for Image package-type while generating a new SAM project using 'sam init' * Fix the exception message * normalize pathes in UT to pass on windows * normalize project-template local path * fix: Ignore `.aws-sam` in sam build cache checksum (#2881) * feat: Allow dir_checksum() to accept a ignore_list * feat: Ignore .aws-sam when calculate cache md5 * fix: Fix crash when nested CFN stack has dict TemplateURL (unresolved intrinsics) (#2879) * fix: Fix crash when nested CFN stack has dict TemplateURL * Interactive flow question default answer from toml (#2850) * get questions' default answers from toml * make black happy * add more docs * rename question's attribute 'default_from_toml' to 'defaultFromToml' and rename 'valueof' to 'key' and add some docs * Add preload_value * Allow to pass toml file to interactive flow run() * Update related classes to utilize proload value context object * Update test * Add missing docstring * Remove samconfig change * Rename extra_context to context because it is required field now * Remove toml logics from this PR * Update comment Co-authored-by: Sam Liu Co-authored-by: _sam 
<3804518+aahung@users.noreply.github.com> * Don't use layer.name in LayerBuildDefinition.__str__ (#2873) * Watchdog error (#2902) * chore: updating version of watchdog. Co-authored-by: Tarun Mall * chore: Adds missing unit tests for LayerBuildDefinition in build_graph (#2883) * Adds missing unit tests for LayerBuildDefinition in build_graph * fix black formatting * fix: Build graph tests using assertTrue instead of assertEqual + added assertions (#2909) * samconfig debug level logging fixed; documentation updated (#2891) * samconfig debug level logging fixed; documentation updated * integration tests fix * help text improved Co-authored-by: Slava Senchenko * chore: update to aws-sam-translator 1.36.0 (#2917) * Revert "samconfig debug level logging fixed; documentation updated (#2891)" (#2918) This reverts commit 2a13a69822660538c478118125eef50d0164995a. * chore: bump version to 1.24.0 (#2919) * fix: Windows default validate template integration test (#2924) * Enabled ability to provide tags as list in samconfig.toml file (#2912) * Enabled ability to provide tags as list in samconfig.toml file * Removed trailing white spaces and reformatted code * Added integration test for tags as list deploy command * Added integration test for tags as string from samconfig.toml Co-authored-by: Mohamed Elasmar <71043312+moelasmar@users.noreply.github.com> Co-authored-by: Wing Fung Lau <4760060+hawflau@users.noreply.github.com> * fix: Add configparser to PyInstaller hiddenimports to resolve dependency issue from botocore (#2932) * Revert "Enabled ability to provide tags as list in samconfig.toml file (#2912)" (#2933) This reverts commit 104b5e5c528ef7e1ad0e83a5ba42316836a21e83. 
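For context on the tags-as-list change (#2912, reverted in #2933 and re-landed as #2956 below): samconfig.toml may now carry deploy tags either as the historical space-delimited string or as a TOML array. A minimal normalization sketch, assuming hypothetical names (`normalize_tags` and the tag values are illustrative, not SAM CLI's actual tomlkit-based implementation):

```python
# Both shapes that samconfig.toml may now contain for deploy tags
# (values are illustrative, not taken from the PR):
tags_as_string = "project=demo stage=dev"     # original string form
tags_as_list = ["project=demo", "stage=dev"]  # array form allowed after #2956


def normalize_tags(tags):
    """Collapse a TOML array of 'key=value' entries into the single
    space-delimited string the deploy command historically expected;
    pass a plain string through unchanged."""
    if isinstance(tags, (list, tuple)):
        return " ".join(str(tag) for tag in tags)
    return tags


# Both forms normalize to the same value the deploy command consumes.
print(normalize_tags(tags_as_list))    # project=demo stage=dev
print(normalize_tags(tags_as_string))  # project=demo stage=dev
```

The list form avoids quoting headaches when individual tag values contain spaces, which is why the integration tests below cover both shapes.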
* chore: bump version to 1.24.1 (#2938) * chore: Update requests to 2.25.1 to remove the locking on urllib3 to 1.25 (#2929) * Updating tomlkit version as we need the fix for the data-loss bug when using the copy() method on a Table object (#2939) * Updating tomlkit version as we need the fix for the data-loss bug when using the copy() method on a Table object * Fixing types for tomlkit * Adding integration test for the tomlkit not-able-to-parse-boolean issue. * Updating THIRD-PARTY-LICENSES file. * Parameterizing integ test filename Co-authored-by: Tarun Mall Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * test: Fix the integration validate tests on Windows (#2940) * resolve pseudo region in build and deploy commands (#2884) * resolve pseudo region from command argument or envvar if available * Revert "resolve pseudo region from command argument or envvar if available" This reverts commit abc0b2b62526f517dd633186861087fefb0f8b6e. * pass the aws-region to the BuildContext, DeployContext and Deploy command * Add integration tests * Make black happy * Temporarily skip SAR build integration test till we figure out the credential issue * skip SAR tests when no credentials are available * Use the constant IntrinsicsSymbolTable.AWS_REGION instead of the string 'AWS::Region' * expand build SAR integration tests to four (all combinations of use-container and us-east-2 region) * refactoring, merge stack_names and stack_names_with_regions together Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * fix: Catch more errors when building an image (#2857) * chore: fix canary/integration test issue (#2945) * feat: Allow tags as list input from samconfig.toml file (#2956) * Enabled ability to provide tags as list in samconfig.toml file * Removed trailing white spaces and reformatted code * Added integration test for tags as list deploy command * Added integration test for tags as string from samconfig.toml * Fixed AppVeyor error by removing s3 info Co-authored-by: Mohamed Elasmar
<71043312+moelasmar@users.noreply.github.com> Co-authored-by: Wing Fung Lau <4760060+hawflau@users.noreply.github.com> * fix: Deploy integration tests for toml tags as a list (#2965) * chore: Increase awareness of same file warning during package (#2946) * chore: increase awareness of same file warning during package * fix formatting & grammar Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> * fix: Allow the base64Encoded field in REST Api, skip validation of unknown fields and validate missing statusCode for Http Api (#2941) * fix API Gateway emulator: - skip validating the non-allowed fields for Http Api Gateway, as it always skips the unknown fields - add base64Encoded as an allowed field for Rest Api gateway - base64 decoding will always be done for Http API gateway if the lambda response isBase64Encoded is true regardless of the content-type - validate if statusCode is missing in case of Http API, and payload version 1.0 * - accept "true", "True", "false", "False" as valid isBase64Encoded values.
- Validate on other isBase64Encoded Values - add more integration && unit test cases * fix lint && black issues * use smaller image to test Base64 response * fix: pass copy of environment variables for keeping cache valid (#2943) * fix: pass copy of environment variables for keeping cache valid * add integ tests * update docs * make black happy Co-authored-by: Qingchuan Ma <69653965+qingchm@users.noreply.github.com> * fix: Skip build of Docker image if ImageUri is a valid ECR URL (#2934) (#2935) * Add condition to managed bucket policy (#2999) * chore: bump version to 1.25.0 (#3007) Co-authored-by: Sriram Madapusi Vasudevan <3770774+sriram-mv@users.noreply.github.com> * temp: reduce python testing matrix (#3008) * temp: disable testing against python 3.8, and enabled 3.7 (#3009) * temp: disable testing against python 3.8, and enabled 3.7 * temp: disable testing against python 3.8, and enabled 3.7 & 3.6 * chore: update to aws-sam-translator 1.37.0 (#3019) * chore: bump version to 1.26.0 (#3020) * chore: Improved --resolve-s3 option documentation and deployment without s3 error messages (#2983) * Improve documentation on --resolve-s3 option and improve s3 failure messages * Changed indentation for integration test on s3 error message * Fixed a typo in description * Improve spacing on help text for resolve-s3 option * feat: Add SAM Pipeline commands (#3085) * sam pipeline bootstrap (#2811) * two-stages-pipeline plugin * typos * add docstring * make mypy happy * removing swap file * delete the two_stages_pipeline plugin as the pipeline-bootstrap command took over its responsibility * remove 'get_template_function_runtimes' function as the decision is made to not process the SAM template during pipeline init which was the only place we use the function * sam pipeline bootstrap command * move the pipelineconfig.toml file to .aws-sam * UX - rewriting Co-authored-by: Chris Rehn * UX improvements * make black happy * apply review comments * UX - rewriting Co-authored-by: 
Chris Rehn * refactor * Apply review comments * use python way of array element assignment * Update samcli/lib/pipeline/bootstrap/stage.py Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * apply review comments * typo * read using utf-8 * create and use a safe version of the save_config method * apply review comments * rename _get_command_name to _get_command_names * don't save generated ARNs for now, will save during init * Revert "don't save generated ARNs for now, will save during init" This reverts commit d184e164022d9560131c62a826436edbc93da189. * Notify the user to periodically rotate the IAM credentials * typo * Use AES instead of KMS for S3 SSE * rename Ecr to ECR and Iam to IAM * Grant lambda service explicit permissions to the ECR instead of relying on granting these permissions ad hoc while creating the container images Co-authored-by: Chris Rehn Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> * sam pipeline init command (#2831) * sam pipeline init command * apply review comments * apply review comments * display a message that we have successfully created the pipeline configuration file(s). * doc typo * Let 'sam pipeline init' prefill pipeline's infrastructure resources… (#2894) * Let 'sam pipeline init' prefill pipeline's infrastructure resources' values from 'sam pipeline bootstrap' results. * save bootstrapped stage region * make black happy * exclude non-dict keys from samconfig.get_env_names method. * Rename the pipeline 'Stage' concept to 'Environment' (#2908) * Rename the pipeline 'Stage' concept to 'Environment' * typo * Rename --environment-name argument to --environment * Sam pipelines ux rename ecr repo to image repository (#2910) * Rename ecr-repo to image-repository * UT Fixes * typo * typo * feat: Support creating pipeline files directly into . without hooks (#2911) * feat: Support creating pipeline files directly into .
without hooks * Integration test for pipeline init and pipeline bootstrap (#2841) * Expose Environment._get_stack_name for integ test to predict stack name * Add integ test for pipeline bootstrap * Add init integ test * small UX improvements: (#2914) * small UX improvements: 1. show a message when the user cancels a bootstrapping command. 2. Don't prompt for CI/CD provider or provider templates if there is only one choice. 3. Make PipelineFileAlreadyExistsError a UserError. 4. use the Colored class instead of fg='color' when prompting a colored message. 5. Fix a bug where we were not allowing empty responses for non-required questions. * Fix Integration Test: We now don't ask the user to select a provider's pipeline template if there is only one * Add docs for PipelineFileAlreadyExistsError * make black happy * Sam pipelines s3 security (#2975) * Deny non-HTTPS requests for the artifacts S3 bucket * enable bucket server-side logging * add integration tests for artifacts bucket SSL-only requests and access logging * typo * Ensure the ArtifactsLoggingBucket denies non-SSL requests (#2976) * Sam pipelines ux round 3 (#2979) * rename customer-facing message 'CI/CD provider' to 'CI/CD system' * add a note about what 'Environment Name' is during the pipeline bootstrap guided context * Apply suggestions from code review typo Co-authored-by: Chris Rehn Co-authored-by: Chris Rehn * let pipeline IAM user assume only IAM roles tagged with Role=pipeline-execution-role (#2982) * Adding AWS_ prefix to displayed output.
(#2993) Co-authored-by: Tarun Mall * Add region to pipeline bootstrap interactive flow (#2997) * Ask AWS region in bootstrap interactive flow * Read default region from boto session first * Fix a unit test * Inform write to pipelineconfig.toml at the end of bootstrap (#3002) * Print info about pipelineconfig.toml after resources are bootstrapped * Update samcli/commands/pipeline/bootstrap/cli.py Co-authored-by: Chris Rehn Co-authored-by: Chris Rehn * List detected env names in pipeline init when prompting to input the env name (#3000) * Allow question.question to be resolved using a key path * Pass the list of env names message (environment_names_message) into pipeline init interactive flow context * Update samcli/commands/pipeline/init/interactive_init_flow.py Co-authored-by: Chris Rehn * Fix unit test (trigger pr builds) * Fix integ test * Fix integ test Co-authored-by: Chris Rehn * Adding account id to bootstrap message. (#2998) * Adding account id to bootstrap message. * adding docstring * Addressing PR comments. * Adding unit tests. * Fixing unit tests. Co-authored-by: Tarun Mall * Cfn creds fix (#3014) * Removing pipeline user creds from cfn output. This maintains the same user experience.
Co-authored-by: Tarun Mall * Ux bootstrap revamp 20210706 (#3021) * Add intro paragraph to bootstrap * Add switch account prompt * Revamp stage definition prompt * Revamp existing resources prompt * Revamp security prompt * Allow answers to be changed later * Add exit message for bootstrap * Add exit message for bootstrap (1) * Add indentation to review values * Add "Below is the summary of the answers:" * Sweep pylint errors * Update unit tests * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/guided_context.py Co-authored-by: Chris Rehn * Update samcli/commands/pipeline/bootstrap/cli.py Co-authored-by: Chris Rehn * Update unit tests * Add bold to other literals Co-authored-by: Chris Rehn * Adding account condition for CFN execution role. (#3027) Co-authored-by: Tarun Mall * pipeline UX revamp 20210707 (#3031) * Allow running bootstrap inside pipeline init * Select account credential source within bootstrap * Add bootstrap decorations within pipeline init * Removing ip range option from bootstrap. (#3036) * Removing ip range option from bootstrap. * Fixing unit test from UX PR. Co-authored-by: Tarun Mall * Fix toml file incorrect read/write in init --bootstrap (#3037) * Temporarily removing account fix. 
(#3038) Co-authored-by: Tarun Mall * Rename environment to stage (#3040) * Improve account source selection (#3042) * Fixing various cosmetic UX issues with pipeline workflow. (#3046) * Fixing credential to credentials * Forcing text color to yellow. * Adding new line after stage diagram. * Adding extra line after checking bootstrap message. * Renaming config -> configuration * account source -> credential source * Removing old message. * Fixing indentation in list. * Fixing a bunch of indentation. * fixing f-string Co-authored-by: Tarun Mall * Auto skip questions if stage detected (#3045) * Autofill question if a default value is present * Allow using an index to select stage names (#3051) * Updating message when bootstrap stages are missing. (#3058) * Updating message when bootstrap stages are missing. * Fixing indentation Co-authored-by: Tarun Mall * Fixing bootstrap integ tests. (#3061) * Fixing bootstrap integ tests. * Cleaning up some integ tests. * Using environment variables when running integ tests on CI. * Using expression instead of full loop. * Adding instruction to use default profile on local. Co-authored-by: Tarun Mall * Fix bootstrap test region (#3064) * Fix bootstrap region in integ test * Fix regions in non-interactive mode as well * Add more pipeline init integ tests (#3065) * Fix existing pipeline init integ test * Add more pipeline init integ tests * Config file bug (#3066) * Validating config file after bootstrap stack creation. * Validating config file after bootstrap.
Co-authored-by: Tarun Mall * Fix pipeline init integ test because the pipelineconfig file exists (#3067) * Make stage name randomized to avoid race conditions among multiple canary runs (#3078) * Load number of stages from pipeline template (#3059) * Load number of stages from templates * Rename variable and add debug log * Add encoding to open() * Allow roles with Tag aws-sam-pipeline-codebuild-service-role to assume PipelineExecutionRole (#2950) * pipeline init UX: Ask to confirm when file exists (#3079) * Ask to confirm overriding if files already exist, or save to another directory * Add doc links (#3087) * Adding accidentally removed tests back. (#3088) Co-authored-by: Tarun Mall Co-authored-by: elbayaaa <72949274+elbayaaa@users.noreply.github.com> Co-authored-by: Chris Rehn Co-authored-by: Ahmed Elbayaa Co-authored-by: Tarun Co-authored-by: Tarun Mall * chore: bump aws-lambda-builders version to 1.5.0 (#3086) * chore: update to aws-sam-translator 1.38.0 (#3073) * ci: Update expected Jenkins file in pipeline integ test (#3090) * chore: Refine pipeline help text and update unit test (#3091) * Update --bucket help text * Update --stage help text * Update help text * Update help text * Update help text * Update help text * Update help text * Update jenkins generated files * Update some intro texts * Remove trailing spaces * Clearing pipeline integ test buckets with versioned objects. (#3094) * Clearing pipeline integ test buckets with versioned objects. * Fixing black formatting. Co-authored-by: Tarun Mall * Fixing bug in bucket cleanup. (#3096) Co-authored-by: Tarun Mall * Deleting bucket (#3097) Co-authored-by: Tarun Mall * chore: bump SAM CLI version to 1.27.0 (#3101) * Add pipeline to pyinstaller (#3103) * Adding pipeline to pyinstaller. Co-authored-by: Tarun Mall * Including stage resource yaml in pip. (#3106) * Including stage resource yaml in pip.
* Bumping patch version Co-authored-by: Tarun Mall * ci: Speed up unit test by caching the git clone (#3060) * ci: Speed up unit test by caching the git clone * Revert "Revert "temp: disable testing against python 3.8, and enabled 3.7"" (#3102) This reverts commit 1916bfa354b5d2612bd1bf9efd54a77e2bc66ff6. Revert "Revert "temp: disable testing against python 3.8, and enabled 3.7 (#3009)" (#3098)" (#3102) Co-authored-by: Tarun Mall * fix: fixing pipeline init integration test. (#3123) * fix: fixing pipeline init integration test so that it doesn't break every time we update our template. * black formatting. * cleaning up an unneeded file. Co-authored-by: Tarun Mall * chore: upgrade pylint to 2.9.0 (#3119) * chore: fix pylint failures in python3.9 * chore: fix pylint failures in python3.9 * chore: bump pylint version to 2.9.0 * fix typo * Add disabling reasons on new rules * fix: integration test case related to recent fix on ruby (#3124) * fix: add dockerhub default login server, improve logs to check docker pull limitation (#3137) * fix: add sample payload for 'sam local generate-event stepfunctions error' (#3043) * add sample payload for 'sam local generate-event stepfunctions error' * add better default for error * fix conflicts * chore: removed unused code which was using pre-defined managed policy… (#3030) * chore: removed unused code which was using a pre-defined managed policy list in a sam translator wrapper, but the code path is not used.
* make black * feat(public-ecr): Download Emulation images (#3152) Co-authored-by: Jacob Fuss Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> * resolve PR comments * fix(integ): Use images that are in public ecr (#3162) Co-authored-by: Jacob Fuss * Add ECR credentials for windows test (#3160) * Add ECR credentials for windows test * Remove the dockerhub env vars * fix(integ): Fix Invalid image tag errors (#3163) Co-authored-by: Jacob Fuss * Install aws cli in the windows test jobs (#3164) * Add ECR credentials for windows test * Remove the dockerhub env vars * install aws cli in the windows test jobs * fix(integ): Add missing image to have deploy integ tests work (#3165) Co-authored-by: Jacob Fuss * chore: Update dateparser to 1.0, update TestBuildCommand_PythonFunctions_Images test requirement (#3172) * chore: Update dateparser to 1.0 * Move public ECR limited test cases to Canary tests * Python39 support for samcli (#3173) * Python39 support for samcli (#354) * Python39 support for samcli * Updated reproducible-linux.txt and lambda_build_container.py for test purposes * Revert files after testing * updated integ test * updated appveyor * updated to appveyor * Update python3.9 appveyor config * update windows python3.9 executable path * update appveyor * fix lint and windows python appveyor script * bump version of lambda-builders to 1.6.0 Co-authored-by: jonife <79116465+jonife@users.noreply.github.com> * chore: bump SAM CLI version to 1.28.0 (#3174) * skip CDK build integration test cases during make-pr because of the ECR limitation * run black Co-authored-by: _sam <3804518+aahung@users.noreply.github.com> Co-authored-by: Cosh_ Co-authored-by: Jacob Fuss <32497805+jfuss@users.noreply.github.com> Co-authored-by: Jacob Fuss Co-authored-by: Mathieu Grandis <73313235+mgrandis@users.noreply.github.com> Co-authored-by: Xia Zhao <78883180+xazhao@users.noreply.github.com> Co-authored-by: Xia Zhao Co-authored-by: elbayaaa
<72949274+elbayaaa@users.noreply.github.com> Co-authored-by: Raymond Wang <14915548+wchengru@users.noreply.github.com> Co-authored-by: Slava Senchenko Co-authored-by: Slava Senchenko Co-authored-by: mingkun2020 <68391979+mingkun2020@users.noreply.github.com> Co-authored-by: Wing Fung Lau <4760060+hawflau@users.noreply.github.com> Co-authored-by: Mehmet Nuri Deveci <5735811+mndeveci@users.noreply.github.com> Co-authored-by: Sam Liu Co-authored-by: Arturo García <5125146+asgarciap@users.noreply.github.com> Co-authored-by: Tarun Co-authored-by: Tarun Mall Co-authored-by: Qingchuan Ma <69653965+qingchm@users.noreply.github.com> Co-authored-by: hnnasit <84355507+hnnasit@users.noreply.github.com> Co-authored-by: Renato Valenzuela <37676028+valerena@users.noreply.github.com> Co-authored-by: Alexis Facques Co-authored-by: Sriram Madapusi Vasudevan <3770774+sriram-mv@users.noreply.github.com> Co-authored-by: Chris Rehn Co-authored-by: Ahmed Elbayaa Co-authored-by: Ruperto Torres <86501267+torresxb1@users.noreply.github.com> Co-authored-by: jonife <79116465+jonife@users.noreply.github.com> --- .github/workflows/close-stale-issues.yml | 53 ++ .github/workflows/closed-issue-message.yml | 18 + .github/workflows/need-attention-label.yml | 22 + .gitignore | 6 +- .pylintrc | 10 +- DEVELOPMENT_GUIDE.md | 281 ++++++--- MANIFEST.in | 1 + appveyor-windows-build-python.yml | 9 +- appveyor-windows.yml | 7 +- appveyor.yml | 77 ++- designs/auto-create-ecr.md | 324 ++++++++++ installer/assets/THIRD-PARTY-LICENSES | 78 +-- installer/pyinstaller/hook-samcli.py | 4 + mypy.ini | 2 +- requirements/base.txt | 12 +- requirements/pre-dev.txt | 2 +- requirements/reproducible-linux.txt | 87 +-- samcli/__init__.py | 2 +- samcli/cli/command.py | 1 + samcli/cli/context.py | 4 +- samcli/cli/types.py | 5 + samcli/commands/_utils/options.py | 4 +- samcli/commands/build/build_constants.py | 8 + samcli/commands/build/build_context.py | 20 +- samcli/commands/build/command.py | 9 +- 
samcli/commands/deploy/command.py | 4 +- samcli/commands/deploy/deploy_context.py | 8 +- samcli/commands/deploy/guided_context.py | 20 +- samcli/commands/exceptions.py | 19 + samcli/commands/init/__init__.py | 22 +- samcli/commands/init/init_templates.py | 149 +---- .../local/cli_common/invoke_context.py | 17 +- samcli/commands/local/cli_common/options.py | 18 +- .../local/generate_event/event_generation.py | 2 +- samcli/commands/local/invoke/cli.py | 8 + samcli/commands/local/lib/local_lambda.py | 16 +- samcli/commands/local/start_api/cli.py | 8 + samcli/commands/local/start_lambda/cli.py | 8 + samcli/commands/logs/command.py | 19 +- samcli/commands/logs/console_consumers.py | 18 + samcli/commands/logs/logs_context.py | 48 +- samcli/commands/package/command.py | 4 +- samcli/commands/package/exceptions.py | 3 +- .../logs => commands/pipeline}/__init__.py | 0 .../commands/pipeline/bootstrap}/__init__.py | 0 samcli/commands/pipeline/bootstrap/cli.py | 245 ++++++++ .../pipeline/bootstrap/guided_context.py | 249 ++++++++ samcli/commands/pipeline/external_links.py | 8 + samcli/commands/pipeline/init/__init__.py | 0 samcli/commands/pipeline/init/cli.py | 51 ++ .../pipeline/init/interactive_init_flow.py | 482 +++++++++++++++ .../init/pipeline_templates_manifest.py | 61 ++ samcli/commands/pipeline/pipeline.py | 21 + .../validate/lib/sam_template_validator.py | 11 +- samcli/commands/validate/validate.py | 11 +- samcli/lib/bootstrap/bootstrap.py | 36 +- samcli/lib/build/app_builder.py | 2 +- samcli/lib/build/build_graph.py | 38 +- samcli/lib/build/build_strategy.py | 11 +- samcli/lib/build/workflow_config.py | 2 + samcli/lib/config/samconfig.py | 14 +- samcli/lib/cookiecutter/exceptions.py | 4 +- samcli/lib/cookiecutter/interactive_flow.py | 37 +- .../cookiecutter/interactive_flow_creator.py | 38 +- samcli/lib/cookiecutter/processor.py | 2 +- samcli/lib/cookiecutter/question.py | 173 +++++- samcli/lib/cookiecutter/template.py | 28 +- samcli/lib/deploy/deployer.py | 2 + 
.../event-mapping.json | 11 +- .../stepfunctions/StepFunctionsError.json | 5 +- samcli/lib/iac/utils/helpers.py | 2 +- samcli/lib/logs/event.py | 72 --- samcli/lib/logs/fetcher.py | 145 ----- samcli/lib/logs/formatter.py | 181 ------ samcli/lib/observability/__init__.py | 0 samcli/lib/observability/cw_logs/__init__.py | 0 .../lib/observability/cw_logs/cw_log_event.py | 40 ++ .../cw_logs/cw_log_formatters.py | 94 +++ .../cw_logs/cw_log_group_provider.py} | 0 .../observability/cw_logs/cw_log_puller.py | 111 ++++ .../observability_info_puller.py | 143 +++++ samcli/lib/package/artifact_exporter.py | 15 +- samcli/lib/package/ecr_utils.py | 2 +- samcli/lib/package/packageable_resources.py | 4 +- samcli/lib/package/s3_uploader.py | 2 +- samcli/lib/package/utils.py | 39 +- samcli/lib/pipeline/__init__.py | 0 samcli/lib/pipeline/bootstrap/__init__.py | 0 samcli/lib/pipeline/bootstrap/resource.py | 138 +++++ samcli/lib/pipeline/bootstrap/stage.py | 330 ++++++++++ .../pipeline/bootstrap/stage_resources.yaml | 358 +++++++++++ samcli/lib/providers/sam_base_provider.py | 28 + samcli/lib/providers/sam_function_provider.py | 114 +++- samcli/lib/providers/sam_stack_provider.py | 10 +- .../lib/samlib/default_managed_policies.json | 372 ------------ samcli/lib/samlib/wrapper.py | 51 +- samcli/lib/schemas/schemas_aws_config.py | 3 +- samcli/lib/telemetry/cicd.py | 20 +- samcli/lib/utils/colors.py | 4 + samcli/lib/utils/defaults.py | 8 + samcli/lib/utils/git_repo.py | 170 ++++++ samcli/lib/utils/hash.py | 14 +- .../lib/utils/managed_cloudformation_stack.py | 94 ++- samcli/lib/utils/profile.py | 10 + samcli/local/apigw/local_apigw_service.py | 71 ++- samcli/local/common/runtime_template.py | 8 +- samcli/local/docker/container.py | 20 +- samcli/local/docker/lambda_container.py | 8 + samcli/local/docker/lambda_debug_settings.py | 4 + samcli/local/docker/lambda_image.py | 16 +- samcli/local/lambdafn/runtime.py | 66 +- samcli/yamlhelper.py | 10 +- setup.py | 1 + 
.../lib/models/all_policy_templates.yaml | 6 + .../lib/models/api_request_model.yaml | 9 + .../models/api_request_model_openapi_3.yaml | 12 + .../lib/models/api_with_apikey_required.yaml | 8 + .../api_with_apikey_required_openapi_3.yaml | 8 + .../lib/models/api_with_auth_all_maximum.yaml | 52 +- .../api_with_auth_all_maximum_openapi_3.yaml | 52 +- .../lib/models/api_with_auth_all_minimum.yaml | 18 + .../api_with_auth_all_minimum_openapi.yaml | 18 + .../lib/models/api_with_auth_no_default.yaml | 18 + .../api_with_aws_account_blacklist.yaml | 6 + .../api_with_aws_account_whitelist.yaml | 13 + ...api_with_cors_and_auth_preflight_auth.yaml | 7 + ...cors_and_conditions_no_definitionbody.yaml | 7 + .../api_with_cors_and_only_methods.yaml | 5 + .../api_with_cors_no_definitionbody.yaml | 7 + ...efault_aws_iam_auth_and_no_auth_route.yaml | 14 + ...h_if_conditional_with_resource_policy.yaml | 7 + .../models/api_with_method_aws_iam_auth.yaml | 26 + .../validate/lib/models/api_with_mode.yaml | 22 + .../lib/models/api_with_open_api_version.yaml | 5 + .../models/api_with_open_api_version_2.yaml | 5 + .../lib/models/api_with_path_parameters.yaml | 6 + .../lib/models/api_with_resource_policy.yaml | 7 + ..._with_resource_policy_global_implicit.yaml | 15 + .../lib/models/api_with_resource_refs.yaml | 5 + .../models/api_with_source_vpc_blacklist.yaml | 5 + .../models/api_with_source_vpc_whitelist.yaml | 10 + ...pi_with_swagger_and_openapi_with_auth.yaml | 5 + .../api_with_swagger_authorizer_none.yaml | 141 +++++ .../lib/models/api_with_usageplans.yaml | 7 + ...th_usageplans_shared_attributes_three.yaml | 102 ++++ ...with_usageplans_shared_attributes_two.yaml | 75 +++ ...th_usageplans_shared_no_side_effect_1.yaml | 61 ++ ...th_usageplans_shared_no_side_effect_2.yaml | 34 ++ ...oyment_preference_alarms_intrinsic_if.yaml | 23 + .../models/function_with_mq_virtual_host.yaml | 19 + ...plicit_api_deletion_policy_precedence.yaml | 32 + .../layer_deletion_policy_precedence.yaml | 18 + 
.../state_machine_with_xray_policies.yaml | 22 + .../models/state_machine_with_xray_role.yaml | 10 + .../version_deletion_policy_precedence.yaml | 19 + .../lib/test_sam_template_validator.py | 16 +- .../integration/buildcmd/build_integ_base.py | 4 + tests/integration/buildcmd/test_build_cmd.py | 117 +++- .../buildcmd/test_cdk_build_cmd.py | 24 +- tests/integration/deploy/deploy_integ_base.py | 5 + .../integration/deploy/test_deploy_command.py | 182 ++++-- .../schemas/test_init_with_schemas_command.py | 26 +- tests/integration/init/test_init_command.py | 3 + .../invoke/test_integration_cli_images.py | 19 + .../local/invoke/test_integrations_cli.py | 20 +- .../local/start_api/test_start_api.py | 53 +- .../local/start_lambda/test_start_lambda.py | 4 +- .../package/test_package_command_image.py | 19 +- tests/integration/pipeline/__init__.py | 0 tests/integration/pipeline/base.py | 157 +++++ .../pipeline/test_bootstrap_command.py | 380 ++++++++++++ .../integration/pipeline/test_init_command.py | 301 ++++++++++ .../buildcmd/PyLayer/requirements.txt | 3 +- .../buildcmd/PyLayerMake/requirements.txt | 3 +- .../testdata/buildcmd/Python/requirements.txt | 3 +- .../buildcmd/PythonImage/requirements.txt | 3 +- ...s-application-with-application-id-map.yaml | 17 + .../layers/local-zip-layer-template.yml | 19 + .../testdata/invoke/template_image.yaml | 10 + .../aws-lambda-function-image-and-api.yaml | 2 +- .../package/aws-lambda-function-image.yaml | 2 +- .../aws-serverless-function-image.yaml | 2 +- .../ChildStackX/ChildStackY/template.yaml | 2 +- .../ChildStackX/template.yaml | 2 +- .../package/deep-nested-image/template.yaml | 2 +- .../samconfig-read-boolean-tomlkit.toml | 6 + .../testdata/package/samconfig-tags-list.toml | 10 + .../package/samconfig-tags-string.toml | 5 + .../custom_template/cookiecutter.json | 4 + .../pipeline/custom_template/metadata.json | 3 + .../pipeline/custom_template/questions.json | 7 + .../{{cookiecutter.outputDir}}/weather | 1 + 
.../testdata/start_api/binarydata.gif | Bin 1951 -> 49 bytes .../start_api/image_package_type/main.py | 2 +- tests/integration/testdata/start_api/main.py | 36 +- .../testdata/start_api/swagger-template.yaml | 48 ++ .../validate/default_json/template.json | 15 + .../validate/default_yaml/template.yaml | 10 + .../validate/multiple_files/template.json | 15 + .../validate/multiple_files/template.yaml | 10 + .../with_build/.aws-sam/build/template.yaml | 10 + .../validate/with_build/template.json | 15 + tests/integration/validate/__init__.py | 0 .../validate/test_validate_command.py | 75 +++ tests/testing_utils.py | 9 + tests/unit/cli/test_types.py | 6 + tests/unit/commands/_utils/test_options.py | 13 +- tests/unit/commands/_utils/test_template.py | 8 +- .../commands/buildcmd/test_build_context.py | 11 +- tests/unit/commands/buildcmd/test_command.py | 3 + .../commands/deploy/test_deploy_context.py | 10 +- .../commands/deploy/test_guided_context.py | 9 +- tests/unit/commands/init/test_cli.py | 382 ++++++++++-- tests/unit/commands/init/test_templates.py | 75 +-- .../local/cli_common/test_invoke_context.py | 83 +++ tests/unit/commands/local/invoke/test_cli.py | 20 + .../commands/local/lib/test_local_lambda.py | 72 ++- .../local/lib/test_sam_function_provider.py | 171 ++++-- .../unit/commands/local/start_api/test_cli.py | 7 + .../commands/local/start_lambda/test_cli.py | 7 + tests/unit/commands/logs/test_command.py | 43 +- .../commands/logs/test_console_consumers.py | 15 + tests/unit/commands/logs/test_logs_context.py | 24 +- tests/unit/commands/pipeline/__init__.py | 0 .../commands/pipeline/bootstrap/__init__.py | 0 .../commands/pipeline/bootstrap/test_cli.py | 276 +++++++++ .../pipeline/bootstrap/test_guided_context.py | 231 +++++++ tests/unit/commands/pipeline/init/__init__.py | 0 tests/unit/commands/pipeline/init/test_cli.py | 22 + .../init/test_initeractive_init_flow.py | 566 ++++++++++++++++++ .../init/test_pipeline_templates_manifest.py | 82 +++ 
.../unit/commands/samconfig/test_samconfig.py | 17 + .../lib/test_sam_template_validator.py | 21 +- tests/unit/commands/validate/test_cli.py | 9 +- tests/unit/lib/bootstrap/test_bootstrap.py | 34 +- .../unit/lib/build_module/test_build_graph.py | 146 ++++- .../lib/build_module/test_build_strategy.py | 9 +- .../lib/cookiecutter/test_interactive_flow.py | 24 + tests/unit/lib/cookiecutter/test_question.py | 113 +++- tests/unit/lib/cookiecutter/test_template.py | 19 +- tests/unit/lib/logs/test_fetcher.py | 255 -------- tests/unit/lib/logs/test_formatter.py | 164 ----- tests/unit/lib/observability/__init__.py | 0 .../lib/observability/cw_logs/__init__.py | 0 .../cw_logs/test_cw_log_event.py} | 44 +- .../cw_logs/test_cw_log_formatters.py | 120 ++++ .../cw_logs/test_cw_log_group_provider.py} | 2 +- .../cw_logs/test_cw_log_puller.py | 322 ++++++++++ .../test_observability_info_puller.py | 50 ++ .../lib/package/test_artifact_exporter.py | 6 +- tests/unit/lib/package/test_utils.py | 44 ++ tests/unit/lib/pipeline/__init__.py | 0 tests/unit/lib/pipeline/bootstrap/__init__.py | 0 .../pipeline/bootstrap/test_environment.py | 425 +++++++++++++ .../lib/pipeline/bootstrap/test_resource.py | 81 +++ tests/unit/lib/samconfig/test_samconfig.py | 34 +- tests/unit/lib/telemetry/test_cicd.py | 10 + tests/unit/lib/utils/test_git_repo.py | 194 ++++++ tests/unit/lib/utils/test_hash.py | 33 +- .../test_managed_cloudformation_stack.py | 21 +- tests/unit/lib/utils/test_osutils.py | 2 +- .../local/apigw/test_local_apigw_service.py | 441 ++++++++++++-- tests/unit/local/docker/test_container.py | 14 +- .../local/docker/test_lambda_container.py | 3 +- .../docker/test_lambda_debug_settings.py | 1 + tests/unit/local/docker/test_lambda_image.py | 75 ++- tests/unit/local/lambdafn/test_runtime.py | 36 +- 267 files changed, 10960 insertions(+), 2324 deletions(-) create mode 100644 .github/workflows/close-stale-issues.yml create mode 100644 .github/workflows/closed-issue-message.yml create mode 100644 
.github/workflows/need-attention-label.yml create mode 100644 designs/auto-create-ecr.md create mode 100644 samcli/commands/build/build_constants.py create mode 100644 samcli/commands/logs/console_consumers.py rename samcli/{lib/logs => commands/pipeline}/__init__.py (100%) rename {tests/unit/lib/logs => samcli/commands/pipeline/bootstrap}/__init__.py (100%) create mode 100644 samcli/commands/pipeline/bootstrap/cli.py create mode 100644 samcli/commands/pipeline/bootstrap/guided_context.py create mode 100644 samcli/commands/pipeline/external_links.py create mode 100644 samcli/commands/pipeline/init/__init__.py create mode 100644 samcli/commands/pipeline/init/cli.py create mode 100644 samcli/commands/pipeline/init/interactive_init_flow.py create mode 100644 samcli/commands/pipeline/init/pipeline_templates_manifest.py create mode 100644 samcli/commands/pipeline/pipeline.py delete mode 100644 samcli/lib/logs/event.py delete mode 100644 samcli/lib/logs/fetcher.py delete mode 100644 samcli/lib/logs/formatter.py create mode 100644 samcli/lib/observability/__init__.py create mode 100644 samcli/lib/observability/cw_logs/__init__.py create mode 100644 samcli/lib/observability/cw_logs/cw_log_event.py create mode 100644 samcli/lib/observability/cw_logs/cw_log_formatters.py rename samcli/lib/{logs/provider.py => observability/cw_logs/cw_log_group_provider.py} (100%) create mode 100644 samcli/lib/observability/cw_logs/cw_log_puller.py create mode 100644 samcli/lib/observability/observability_info_puller.py create mode 100644 samcli/lib/pipeline/__init__.py create mode 100644 samcli/lib/pipeline/bootstrap/__init__.py create mode 100644 samcli/lib/pipeline/bootstrap/resource.py create mode 100644 samcli/lib/pipeline/bootstrap/stage.py create mode 100644 samcli/lib/pipeline/bootstrap/stage_resources.yaml delete mode 100644 samcli/lib/samlib/default_managed_policies.json create mode 100644 samcli/lib/utils/defaults.py create mode 100644 samcli/lib/utils/git_repo.py create mode 
100644 samcli/lib/utils/profile.py create mode 100644 tests/functional/commands/validate/lib/models/api_with_mode.yaml create mode 100644 tests/functional/commands/validate/lib/models/api_with_swagger_authorizer_none.yaml create mode 100644 tests/functional/commands/validate/lib/models/api_with_usageplans_shared_attributes_three.yaml create mode 100644 tests/functional/commands/validate/lib/models/api_with_usageplans_shared_attributes_two.yaml create mode 100644 tests/functional/commands/validate/lib/models/api_with_usageplans_shared_no_side_effect_1.yaml create mode 100644 tests/functional/commands/validate/lib/models/api_with_usageplans_shared_no_side_effect_2.yaml create mode 100644 tests/functional/commands/validate/lib/models/function_with_deployment_preference_alarms_intrinsic_if.yaml create mode 100644 tests/functional/commands/validate/lib/models/function_with_mq_virtual_host.yaml create mode 100644 tests/functional/commands/validate/lib/models/implicit_api_deletion_policy_precedence.yaml create mode 100644 tests/functional/commands/validate/lib/models/layer_deletion_policy_precedence.yaml create mode 100644 tests/functional/commands/validate/lib/models/state_machine_with_xray_policies.yaml create mode 100644 tests/functional/commands/validate/lib/models/state_machine_with_xray_role.yaml create mode 100644 tests/functional/commands/validate/lib/models/version_deletion_policy_precedence.yaml create mode 100644 tests/integration/pipeline/__init__.py create mode 100644 tests/integration/pipeline/base.py create mode 100644 tests/integration/pipeline/test_bootstrap_command.py create mode 100644 tests/integration/pipeline/test_init_command.py create mode 100644 tests/integration/testdata/buildcmd/aws-serverless-application-with-application-id-map.yaml create mode 100644 tests/integration/testdata/invoke/layers/local-zip-layer-template.yml create mode 100644 tests/integration/testdata/package/samconfig-read-boolean-tomlkit.toml create mode 100644 
tests/integration/testdata/package/samconfig-tags-list.toml create mode 100644 tests/integration/testdata/package/samconfig-tags-string.toml create mode 100644 tests/integration/testdata/pipeline/custom_template/cookiecutter.json create mode 100644 tests/integration/testdata/pipeline/custom_template/metadata.json create mode 100644 tests/integration/testdata/pipeline/custom_template/questions.json create mode 100644 tests/integration/testdata/pipeline/custom_template/{{cookiecutter.outputDir}}/weather create mode 100644 tests/integration/testdata/validate/default_json/template.json create mode 100644 tests/integration/testdata/validate/default_yaml/template.yaml create mode 100644 tests/integration/testdata/validate/multiple_files/template.json create mode 100644 tests/integration/testdata/validate/multiple_files/template.yaml create mode 100644 tests/integration/testdata/validate/with_build/.aws-sam/build/template.yaml create mode 100644 tests/integration/testdata/validate/with_build/template.json create mode 100644 tests/integration/validate/__init__.py create mode 100644 tests/integration/validate/test_validate_command.py create mode 100644 tests/unit/commands/logs/test_console_consumers.py create mode 100644 tests/unit/commands/pipeline/__init__.py create mode 100644 tests/unit/commands/pipeline/bootstrap/__init__.py create mode 100644 tests/unit/commands/pipeline/bootstrap/test_cli.py create mode 100644 tests/unit/commands/pipeline/bootstrap/test_guided_context.py create mode 100644 tests/unit/commands/pipeline/init/__init__.py create mode 100644 tests/unit/commands/pipeline/init/test_cli.py create mode 100644 tests/unit/commands/pipeline/init/test_initeractive_init_flow.py create mode 100644 tests/unit/commands/pipeline/init/test_pipeline_templates_manifest.py delete mode 100644 tests/unit/lib/logs/test_fetcher.py delete mode 100644 tests/unit/lib/logs/test_formatter.py create mode 100644 tests/unit/lib/observability/__init__.py create mode 100644 
tests/unit/lib/observability/cw_logs/__init__.py rename tests/unit/lib/{logs/test_event.py => observability/cw_logs/test_cw_log_event.py} (51%) create mode 100644 tests/unit/lib/observability/cw_logs/test_cw_log_formatters.py rename tests/unit/lib/{logs/test_provider.py => observability/cw_logs/test_cw_log_group_provider.py} (78%) create mode 100644 tests/unit/lib/observability/cw_logs/test_cw_log_puller.py create mode 100644 tests/unit/lib/observability/test_observability_info_puller.py create mode 100644 tests/unit/lib/package/test_utils.py create mode 100644 tests/unit/lib/pipeline/__init__.py create mode 100644 tests/unit/lib/pipeline/bootstrap/__init__.py create mode 100644 tests/unit/lib/pipeline/bootstrap/test_environment.py create mode 100644 tests/unit/lib/pipeline/bootstrap/test_resource.py create mode 100644 tests/unit/lib/utils/test_git_repo.py diff --git a/.github/workflows/close-stale-issues.yml b/.github/workflows/close-stale-issues.yml new file mode 100644 index 0000000000..3da877a732 --- /dev/null +++ b/.github/workflows/close-stale-issues.yml @@ -0,0 +1,53 @@ +name: Close stale issues + +# Controls when the action will run. +on: + schedule: + # Uses UTC so it runs at 10PM PDT + - cron: "0 6 * * *" + +jobs: + cleanup: + runs-on: ubuntu-latest + name: Stale issue job + steps: + - uses: aws-actions/stale-issue-cleanup@v3 + with: + issue-types: issues + + # Setting messages to an empty string will cause the automation to skip + # that category + ancient-issue-message: | + This issue has not received any attention in 1 year. + If you want to keep this issue open, please leave a comment below and auto-close will be canceled. + stale-issue-message: | + This issue has not received a response in 14 days. + If you want to keep this issue open, please leave a comment below and auto-close will be canceled. + stale-pr-message: | + This PR has not received a response in 14 days. 
+ If you want to keep this PR open, please leave a comment below and auto-close will be canceled. + + # These labels are required + stale-issue-label: blocked/close-if-inactive + exempt-issue-labels: no-autoclose, stage/needs-attention + stale-pr-label: blocked/close-if-inactive + exempt-pr-labels: no-autoclose, type/feature + response-requested-label: blocked/more-info-needed + + # Keep this set so that a label is applied when issues are closed for staleness + closed-for-staleness-label: stage/closed-for-inactivity + + # Issue timing + days-before-stale: 14 + days-before-close: 7 + days-before-ancient: 365 + + # If you don't want to mark an issue as being ancient based on a + # threshold of "upvotes", you can set this here. An "upvote" is + # the total number of +1, heart, hooray, and rocket reactions + # on an issue. + minimum-upvotes-to-exempt: 10 + + # A repo-scoped token is needed here so that this action can trigger other GitHub Actions + repo-token: ${{ secrets.STALE_BOT_PERSONAL_TOKEN }} + diff --git a/.github/workflows/closed-issue-message.yml b/.github/workflows/closed-issue-message.yml new file mode 100644 index 0000000000..5128523727 --- /dev/null +++ b/.github/workflows/closed-issue-message.yml @@ -0,0 +1,18 @@ +name: Closed issue message + +on: + issues: + types: [ closed ] +jobs: + auto_comment: + runs-on: ubuntu-latest + steps: + - uses: aws-actions/closed-issue-message@v1 + with: + # These inputs are both required + repo-token: "${{ secrets.GITHUB_TOKEN }}" + message: | + ### ⚠️COMMENT VISIBILITY WARNING⚠️ + Comments on closed issues are hard for our team to see. + If you need more assistance, please either tag a team member or open a new issue that references this one. + If you wish to keep having a conversation with other community members under this issue, feel free to do so.
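The issue-timing settings in `close-stale-issues.yml` above compose: an issue is labeled stale after `days-before-stale` days without a response, then closed `days-before-close` days later. A small Python sketch of that arithmetic follows; it is an illustration only, not part of the action itself:

```python
from datetime import date, timedelta
from typing import Tuple

def staleness_dates(last_activity: date,
                    days_before_stale: int = 14,
                    days_before_close: int = 7) -> Tuple[date, date]:
    """When an issue would be labeled stale and then closed, per the workflow's timing."""
    stale_on = last_activity + timedelta(days=days_before_stale)
    closed_on = stale_on + timedelta(days=days_before_close)
    return stale_on, closed_on

# An issue last active on Aug 1 goes stale on Aug 15 and closes on Aug 22.
print(staleness_dates(date(2021, 8, 1)))
```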
diff --git a/.github/workflows/need-attention-label.yml b/.github/workflows/need-attention-label.yml new file mode 100644 index 0000000000..a0261dcd4b --- /dev/null +++ b/.github/workflows/need-attention-label.yml @@ -0,0 +1,22 @@ +name: Add need attention label + +on: + issue_comment: + types: [created, edited] + +jobs: + apply-label: + runs-on: ubuntu-latest + steps: + - uses: actions/github-script@v3 + # skip comments made by our bot, 'aws-sam-cli-stale-bot' + if: github.event.sender.login != 'aws-sam-cli-stale-bot' + with: + github-token: ${{secrets.GITHUB_TOKEN}} + script: | + github.issues.addLabels({ + issue_number: context.issue.number, + owner: context.repo.owner, + repo: context.repo.repo, + labels: ['stage/needs-attention'] + }) diff --git a/.gitignore b/.gitignore index deb14508b0..af1e99aeed 100644 --- a/.gitignore +++ b/.gitignore @@ -305,6 +305,7 @@ venv/ ENV/ env.bak/ venv.bak/ +venv-update-reproducible-requirements/ # Spyder project settings .spyderproject @@ -389,8 +390,11 @@ tests/integration/testdata/buildcmd/Dotnetcore2.0/bin tests/integration/testdata/buildcmd/Dotnetcore2.0/obj tests/integration/testdata/buildcmd/Dotnetcore2.1/bin tests/integration/testdata/buildcmd/Dotnetcore2.1/obj +tests/integration/testdata/buildcmd/Dotnetcore3.1/bin +tests/integration/testdata/buildcmd/Dotnetcore3.1/obj +tests/integration/testdata/invoke/credential_tests/inprocess/dotnet/STS/obj # End of https://www.gitignore.io/api/osx,node,macos,linux,python,windows,pycharm,intellij,sublimetext,visualstudiocode # Installer build folder -.build \ No newline at end of file +.build diff --git a/.pylintrc b/.pylintrc index d2bae96690..2960a0e0b0 100644 --- a/.pylintrc +++ b/.pylintrc @@ -23,7 +23,8 @@ load-plugins= pylint.extensions.docparams, # Parameter documentation checker # Use multiple processes to speed up Pylint. -jobs=4 +# Pylint has a bug with multithreading; turn it off. +jobs=1 # Allow loading of arbitrary C extensions.
Extensions are imported into the # active Python interpreter and may run arbitrary code. @@ -92,6 +93,13 @@ disable= # W9018, # "%s" differing in parameter type documentation # W9019, # "%s" useless ignored parameter documentation # W9020, # "%s" useless ignored parameter type documentation + # Constant name style warnings. We will remove them one by one. + C0103, # Class constant name "%s" doesn't conform to UPPER_CASE naming style ('([^\\W\\da-z][^\\Wa-z]*|__.*__)$' pattern) (invalid-name) + # New recommendations, disable for now to avoid introducing behaviour changes. + R1729, # Use a generator instead '%s' (use-a-generator) + R1732, # Consider using 'with' for resource-allocating operations (consider-using-with) + # This applies to CPython only. + I1101, # Module 'math' has no 'pow' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member) [REPORTS] diff --git a/DEVELOPMENT_GUIDE.md b/DEVELOPMENT_GUIDE.md index b399682ae2..994c1e3220 100644 --- a/DEVELOPMENT_GUIDE.md +++ b/DEVELOPMENT_GUIDE.md @@ -1,5 +1,4 @@ -DEVELOPMENT GUIDE -================= +# AWS SAM CLI Development Guide **Welcome hacker!** @@ -8,11 +7,8 @@ development environment, IDEs, tests, coding practices, or anything that will help you be more productive. If you found something is missing or inaccurate, update this guide and send a Pull Request. -**Note**: `pyenv` currently only supports macOS and Linux. If you are a -Windows users, consider using [pipenv](https://docs.pipenv.org/). +## 1-Click Ready to Hack IDE (this section might be outdated, to be verified) -1-Click Ready to Hack IDE -------------------------- For setting up a local development environment, we recommend using Gitpod - a service that allows you to spin up an in-browser Visual Studio Code-compatible editor, with everything set up and ready to go for development on this project. 
Just click the button below to create your private workspace: [![Gitpod ready-to-code](https://img.shields.io/badge/Gitpod-ready--to--code-blue?logo=gitpod)](https://gitpod.io/#https://github.com/awslabs/aws-sam-cli) @@ -21,125 +17,216 @@ This will start a new Gitpod workspace, and immediately kick off a build of the Gitpod is free for 50 hours per month - make sure to stop your workspace when you're done (you can always resume it later, and it won't need to run the build again). -Environment Setup ------------------ +## Environment Setup +### 1. Prerequisites (Python Virtual Environment) -### 1. Install Python Versions +AWS SAM CLI is mainly written in Python 3, and we support Python 3.6, 3.7 and 3.8, +so you need a Python environment with one of those versions. -We support 3.6 and 3.7 versions. Our CI/CD pipeline is setup to run -unit tests against both Python versions. So make sure you test it -with both versions before sending a Pull Request. -See [Unit testing with multiple Python versions](#unit-testing-with-multiple-python-versions). +Having a dedicated Python virtual environment ensures it won't "pollute" or get "polluted" +by other Python packages. Here we introduce two ways of setting up a Python virtual environment: +(1) Python's built-in [`venv`](https://docs.python.org/3/tutorial/venv.html) and (2) [`pyenv`](https://github.com/pyenv/pyenv). -[pyenv](https://github.com/pyenv/pyenv) is a great tool to -easily setup multiple Python versions. - -> Note: For Windows, type -> `export PATH="/c/Users//.pyenv/libexec:$PATH"` to add pyenv to -> your path. - -1. Install PyEnv - - `curl -L https://github.com/pyenv/pyenv-installer/raw/master/bin/pyenv-installer | bash` -2. `pyenv install 3.6.8` -3. `pyenv install 3.7.2` -4. Make Python versions available in the project: - `pyenv local 3.6.8 3.7.2` - -### 2.
Install Additional Tooling -#### Black -We format our code using [Black](https://github.com/python/black) and verify the source code is black compliant -in Appveyor during PRs. Black will be installed automatically with `make init`. - -After installing, you can run our formatting through our Makefile by `make black` or integrating Black directly in your favorite IDE (instructions -can be found [here](https://black.readthedocs.io/en/stable/editor_integration.html)) - -##### (workaround) Integrating Black directly in your favorite IDE -Since black is installed in virtualenv, when you follow [this instruction](https://black.readthedocs.io/en/stable/editor_integration.html), `which black` might give you this +**Note**: `pyenv` currently only supports macOS and Linux. If you are a +Windows user, consider using [pyenv-win](https://github.com/pyenv-win/pyenv-win). -```bash -(samcli37) $ where black -/Users//.pyenv/shims/black +| | `venv` | `pyenv` | +| -- | -------- | ------------ | +| Pick it if you want ... | Easy setup | To develop and test SAM CLI in different Python versions | + + +#### `venv` setup + +```sh +python3 -m venv .venv # one time setup: create a virtual environment in directory .venv +source .venv/bin/activate # activate the virtual environment ``` +#### `pyenv` setup -However, IDEs such PyChaim (using FileWatcher) will have a hard time invoking `/Users//.pyenv/shims/black` -and this will happen: +Install `pyenv` and the [`pyenv-virtualenv` plugin](https://github.com/pyenv/pyenv-virtualenv). +On macOS, with [Homebrew](https://brew.sh/): + +```sh +brew install pyenv +brew install pyenv-virtualenv ``` -pyenv: black: command not found -The `black' command exists in these Python versions: - 3.7.2/envs/samcli37 - samcli37 -``` +or use [pyenv-installer](https://github.com/pyenv/pyenv-installer) and git: -A simple workaround is to use `/Users//.pyenv/versions/samcli37/bin/black` -instead of `/Users//.pyenv/shims/black`.
+```sh +curl https://pyenv.run | bash # https://github.com/pyenv/pyenv-installer +exec $SHELL # restart your shell so the path changes take effect +git clone https://github.com/pyenv/pyenv-virtualenv.git $(pyenv root)/plugins/pyenv-virtualenv +exec $SHELL # restart your shell to enable pyenv-virtualenv +``` + +Next, set up a virtual environment and activate it: -#### Pre-commit -If you don't wish to manually run black on each pr or install black manually, we have integrated black into git hooks through [pre-commit](https://pre-commit.com/). -After installing pre-commit, run `pre-commit install` in the root of the project. This will install black for you and run the black formatting on -commit. +```sh +# Assuming you want to develop AWS SAM CLI in Python 3.8.9 +pyenv install 3.8.9 # install Python 3.8.9 using pyenv +pyenv virtualenv 3.8.9 samcli38 # create a virtual environment using 3.8.9 named "samcli38" +pyenv activate samcli38 # activate the virtual environment +``` -### 3. Activate Virtualenv +### 2. Initialize dependencies and make `samdev` available in `$PATH` -Virtualenv allows you to install required libraries outside of the -Python installation. A good practice is to setup a different virtualenv -for each project. [pyenv](https://github.com/pyenv/pyenv) comes with a -handy plugin that can create virtualenv. +Clone the AWS SAM CLI repository to your local machine if you haven't done that yet. -Depending on the python version, the following commands would change to -be the appropriate python version. +```sh +# Using SSH +git clone git@github.com:aws/aws-sam-cli.git +``` +or +```sh +# Using HTTPS +git clone https://github.com/aws/aws-sam-cli.git +``` -1. `pyenv virtualenv 3.7.2 samcli37` -2. `pyenv activate samcli37` for Python3.7 +(make sure you have the virtual environment activated) -### 4.
Install dev version of SAM CLI +```sh +cd aws-sam-cli +make init # this will make a `samdev` command available in $PATH +``` -We will install a development version of SAM CLI from source into the -virtualenv for you to try out the CLI as you make changes. We will -install in a command called `samdev` to keep it separate from a global -SAM CLI installation, if any. +Now you can verify whether the dev AWS SAM CLI is available: -1. Activate Virtualenv: `pyenv activate samcli37` -2. Install dev CLI: `make init` -3. Make sure installation succeeded: `which samdev` +```sh +samdev --version # this will print something like "SAM CLI, version x.xx.x" +``` -### 5. (Optional) Install development version of SAM Transformer +#### Try Making a Change to AWS SAM CLI (Optional) -If you want to run the latest version of [SAM -Transformer](https://github.com/awslabs/serverless-application-model/), -you can clone it locally and install it in your pyenv. This is useful if -you want to validate your templates against any new, unreleased SAM -features ahead of time. +```sh +# Change the AWS SAM CLI version to 123.456.789 +echo '__version__ = "123.456.789"' >> samcli/__init__.py +samdev --version # this will print "SAM CLI, version 123.456.789" +``` + +### 3. (Optional) Install development version of SAM Transformer -This step is optional and will use the specified version of -aws-sam-transformer from PyPi by default. +If you want to run the latest version of [SAM Transformer](https://github.com/aws/serverless-application-model/) +or work on it at the same time, you can clone it locally and install it in your virtual environment. +This is useful if you want to validate your templates against any new, unreleased SAM features ahead of time.
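The `echo ... >> samcli/__init__.py` trick above works because Python executes module code top to bottom, so the appended second `__version__` assignment simply rebinds the name. A minimal, self-contained illustration (not tied to the real samcli package):

```python
# Simulate a module that assigns __version__ twice; the later assignment wins,
# which is why appending one line is enough to change the reported version.
module_source = '__version__ = "1.0.0"\n__version__ = "123.456.789"\n'
namespace = {}
exec(module_source, namespace)
print(namespace["__version__"])  # 123.456.789
```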
-``cd ~/projects (cd into the directory where you usually place projects)`` +```sh +# Make sure you are not inside the AWS SAM CLI repository -``git clone https://github.com/awslabs/serverless-application-model/`` +# clone the AWS SAM repo +git clone git@github.com:aws/serverless-application-model.git +# or using HTTPS: git clone https://github.com/aws/serverless-application-model.git -``git checkout develop `` +cd serverless-application-model +``` -Install the SAM Transformer in editable mode so that all changes you make to the SAM Transformer locally are immediately picked up for SAM CLI. +Make sure you are in the same virtual environment as the one you are using with SAM CLI. +```sh +source /.venv/bin/activate # if you chose to use venv to set up the virtual environment +# or +pyenv activate samcli38 # if you chose to use pyenv to set up the virtual environment +``` -``pip install -e . `` +Install the SAM Transformer in editable mode so that +all changes you make to the SAM Transformer locally are immediately picked up for SAM CLI. + +```sh +pip install -e . +``` Move back to your SAM CLI directory and re-run init. If necessary, open requirements/base.txt and replace the version number of aws-sam-translator with the ``version number`` specified in your local version of `serverless-application-model/samtranslator/__init__.py` -``cd ../aws-sam-cli`` - -``make init`` +```sh +# Make sure you are back in your SAM CLI directory +make init +``` + +## Making a Pull Request + +The above demonstrates how to set up the environment, which is enough +to play with the AWS SAM CLI source code. However, if you want to +contribute to the repository, there are a few more things to consider. + +### Make Sure AWS SAM CLI Works in Multiple Python Versions + +We support versions 3.6, 3.7 and 3.8. Our CI/CD pipeline is set up to run +unit tests against all Python versions. So make sure you test it +with all versions before sending a Pull Request.
+See [Unit testing with multiple Python versions](#unit-testing-with-multiple-python-versions). -Running Tests ------------- +If you chose to use `pyenv` in the previous section, setting up a +different Python version should be easy: -### Unit testing with one Python version +(assuming you are in virtual environment `samcli38`) + +```sh +# Your shell prompt should now look like "(samcli38) $" +pyenv deactivate samcli38 # "(samcli38)" will disappear +pyenv install 3.7.10 # one time setup +pyenv virtualenv 3.7.10 samcli37 # one time setup +pyenv activate samcli37 +# Your shell prompt should now look like "(samcli37) $" + +# You can verify the version of Python +python --version # Python 3.7.10 + +make init # one time setup, this will make a `samdev` command available in $PATH +``` + +### Format Python Code + +We format our code using [Black](https://github.com/python/black) and verify the source code is +black compliant in AppVeyor during PRs. Black will be installed automatically with `make init`. + +There are generally 3 options to make sure your change is compliant with our formatting standard: + +#### (Option 1) Run `make black` + +```sh +make black +``` + +#### (Option 2) Integrating Black directly in your favorite IDE + +Since black is installed in a virtualenv, when you follow [this instruction](https://black.readthedocs.io/en/stable/editor_integration.html), `which black` might give you this + +``` +/Users//.pyenv/shims/black +``` + +However, IDEs such as PyCharm (using FileWatcher) will have a hard time +invoking `/Users//.pyenv/shims/black` +and this will happen: + +``` +pyenv: black: command not found + +The `black' command exists in these Python versions: + 3.8.9/envs/samcli38 + samcli38 +``` + +A simple workaround is to use `/Users//.pyenv/versions/samcli38/bin/black` +instead of `/Users//.pyenv/shims/black`.
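The workaround above amounts to replacing the shim path with the version-specific path. Assuming the standard pyenv directory layout (`$PYENV_ROOT/versions/<env>/bin`), the substitution can be sketched as follows; the function name and example home directory are hypothetical:

```python
import os

def unshimmed_path(tool: str, pyenv_root: str, env_name: str) -> str:
    # pyenv shims live under $PYENV_ROOT/shims and dispatch at run time;
    # the concrete binary for a given virtualenv lives under versions/<env>/bin.
    return os.path.join(pyenv_root, "versions", env_name, "bin", tool)

# Hypothetical home directory, matching the guide's example env name:
print(unshimmed_path("black", "/Users/me/.pyenv", "samcli38"))
```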
+After installing pre-commit, run `pre-commit install` in the root of the project. This will install black for you and run the black formatting on commit. + +### Do a Local PR Check + +This command runs the AWS SAM CLI code through various checks, including +lint, formatting, unit tests, functional tests, and so on. +```sh +make pr +``` -If you're trying to do a quick run, it's ok to use the current python version. Run `make pr`. +We also suggest running `make pr` in all Python versions. -### Unit testing with multiple Python versions +#### Unit Testing with Multiple Python Versions (Optional) Currently, SAM CLI only supports Python3 versions (see setup.py for exact versions). For the most part, code that works in Python3.6 will work in Python3.7. You only run into problems if you are
When you make a code change, use the @@ -230,8 +316,7 @@ following framework to decide the kinds of tests to write: calling AWS APIs, spinning up Docker containers, mutating files etc. -Design Document ---------------- +### Design Document A design document is a written description of the feature/capability you are building. We have a [design document diff --git a/MANIFEST.in b/MANIFEST.in index 7e0654b49f..64f107c2b6 100644 --- a/MANIFEST.in +++ b/MANIFEST.in @@ -5,6 +5,7 @@ include requirements/pre-dev.txt include requirements/dev.txt include samcli/local/rapid/aws-lambda-rie recursive-include samcli/lib/init/templates * +recursive-include samcli/lib/pipeline * recursive-include samcli/lib *.json recursive-include samcli/lib/generated_sample_events *.json prune tests diff --git a/appveyor-windows-build-python.yml b/appveyor-windows-build-python.yml index 25e3825331..287371f953 100644 --- a/appveyor-windows-build-python.yml +++ b/appveyor-windows-build-python.yml @@ -36,10 +36,12 @@ init: install: # Make sure the temp directory exists for Python to use. 
- # Install python3.8 + # Install python3.8 and python3.9 - "choco install chocolatey-core.extension --version 1.3.3 --force -y" - "choco install python3 --version 3.8.0" + - "choco install python3 --version 3.9.0" - "C:\\Python38\\python.exe -m pip freeze" + - "C:\\Python39\\python.exe -m pip freeze" - "refreshenv" # To run Nodejs workflow integ tests @@ -50,12 +52,14 @@ install: - "cdk --version" - ps: "mkdir -Force D:\\tmp" - - "SET PATH=%PYTHON_HOME%;%PATH%;C:\\Python36-x64;C:\\Python27-x64;C:\\Python38" + - "SET PATH=%PYTHON_HOME%;%PATH%;C:\\Python36-x64;C:\\Python27-x64;C:\\Python38;C:\\Python39" - "echo %PYTHON_HOME%" - "echo %PATH%" - "python --version" # Check if python3.8 exists on the image - "C:\\Python38\\python.exe --version" + # Check if python3.9 exists on the image + - "C:\\Python39\\python.exe --version" # Upgrade setuptools, wheel and virtualenv - "python -m pip install --upgrade setuptools wheel virtualenv" @@ -66,6 +70,7 @@ install: # python is python3.7 - "python -m pip install --upgrade pip" - "C:\\Python38\\python.exe -m pip install --upgrade pip" + - "C:\\Python39\\python.exe -m pip install --upgrade pip" # Create new virtual environment with chosen python version and activate it - "python -m virtualenv venv" diff --git a/appveyor-windows.yml b/appveyor-windows.yml index df6bf5dcb7..cf790f2977 100644 --- a/appveyor-windows.yml +++ b/appveyor-windows.yml @@ -73,6 +73,9 @@ install: # Actually install SAM CLI's dependencies - "pip install -e \".[dev]\"" + # Install aws cli + - "pip install awscli" + # Switch to Docker Linux containers - ps: Switch-DockerLinux @@ -85,8 +88,8 @@ install: test_script: # Reactivate virtualenv before running tests - ps: " - If ((Test-Path env:BY_CANARY) -And (Test-Path env:DOCKER_USER) -And (Test-Path env:DOCKER_PASS)){ - echo Logging in Docker Hub; echo $env:DOCKER_PASS | docker login --username $env:DOCKER_USER --password-stdin; + If (Test-Path env:BY_CANARY){ + echo Logging in Public ECR; aws ecr-public 
get-login-password --region us-east-1 | docker login --username AWS --password-stdin public.ecr.aws; }" - "git --version" - "venv\\Scripts\\activate" diff --git a/appveyor.yml b/appveyor.yml index 6d6a7d2630..66aa43360f 100644 --- a/appveyor.yml +++ b/appveyor.yml @@ -1,7 +1,7 @@ version: 1.0.{build} image: - Ubuntu - - Visual Studio 2017 + - Visual Studio 2019 environment: AWS_DEFAULT_REGION: us-east-1 @@ -9,15 +9,16 @@ environment: matrix: - # - PYTHON_HOME: "C:\\Python36-x64" - # PYTHON_VERSION: '3.6' - # PYTHON_ARCH: '64' - # NOSE_PARAMETERIZED_NO_WARN: 1 - # INSTALL_PY_37_PIP: 1 - # INSTALL_PY_38_PIP: 1 - # AWS_S3: 'AWS_S3_36' - # AWS_ECR: 'AWS_ECR_36' - # APPVEYOR_CONSOLE_DISABLE_PTY: true + - PYTHON_HOME: "C:\\Python36-x64" + PYTHON_VERSION: '3.6' + PYTHON_ARCH: '64' + NOSE_PARAMETERIZED_NO_WARN: 1 + INSTALL_PY_37_PIP: 1 + INSTALL_PY_38_PIP: 1 + INSTALL_PY_39_PIP: 1 + AWS_S3: 'AWS_S3_36' + AWS_ECR: 'AWS_ECR_36' + APPVEYOR_CONSOLE_DISABLE_PTY: true - PYTHON_HOME: "C:\\Python37-x64" PYTHON_VERSION: '3.7' @@ -26,26 +27,40 @@ environment: NOSE_PARAMETERIZED_NO_WARN: 1 INSTALL_PY_36_PIP: 1 INSTALL_PY_38_PIP: 1 + INSTALL_PY_39_PIP: 1 AWS_S3: 'AWS_S3_37' AWS_ECR: 'AWS_ECR_37' APPVEYOR_CONSOLE_DISABLE_PTY: true - #- PYTHON_HOME: "C:\\Python38-x64" - # PYTHON_VERSION: '3.8' - # PYTHON_ARCH: '64' - # RUN_SMOKE: 1 - # NOSE_PARAMETERIZED_NO_WARN: 1 - # INSTALL_PY_36_PIP: 1 - # INSTALL_PY_37_PIP: 1 - # AWS_S3: 'AWS_S3_38' - # AWS_ECR: 'AWS_ECR_38' - # APPVEYOR_CONSOLE_DISABLE_PTY: true + - PYTHON_HOME: "C:\\Python38-x64" + PYTHON_VERSION: '3.8' + PYTHON_ARCH: '64' + RUN_SMOKE: 1 + NOSE_PARAMETERIZED_NO_WARN: 1 + INSTALL_PY_36_PIP: 1 + INSTALL_PY_37_PIP: 1 + INSTALL_PY_39_PIP: 1 + AWS_S3: 'AWS_S3_38' + AWS_ECR: 'AWS_ECR_38' + APPVEYOR_CONSOLE_DISABLE_PTY: true + + - PYTHON_HOME: "C:\\Python39-x64" + PYTHON_VERSION: '3.9' + PYTHON_ARCH: '64' + RUN_SMOKE: 1 + NOSE_PARAMETERIZED_NO_WARN: 1 + INSTALL_PY_36_PIP: 1 + INSTALL_PY_37_PIP: 1 + INSTALL_PY_38_PIP: 1 + AWS_S3: 
'AWS_S3_39' + AWS_ECR: 'AWS_ECR_39' + APPVEYOR_CONSOLE_DISABLE_PTY: true for: - matrix: only: - - image: Visual Studio 2017 + - image: Visual Studio 2019 install: - "choco install nodejs-lts -y --force" @@ -117,6 +132,8 @@ for: - image: Ubuntu install: + # apt repo for python3.9 installation + - sh: "sudo add-apt-repository ppa:deadsnakes/ppa" # AppVeyor's apt-get cache might be outdated, and the package could potentially be 404. - sh: "sudo apt-get -y update" @@ -153,16 +170,20 @@ for: - sh: "sudo apt-get -y install python2.7" - sh: "sudo apt-get -y install python3.7" - sh: "sudo apt-get -y install python3.8" + - sh: "sudo apt-get -y install python3.9" - sh: "which python3.8" - - sh: "which python3.6" - sh: "which python3.7" + - sh: "which python3.6" + - sh: "which python3.9" - sh: "which python2.7" - - sh: "PATH=$PATH:/usr/bin/python3.8:/usr/bin/python3.7" + - sh: "PATH=$PATH:/usr/bin/python3.9:/usr/bin/python3.8:/usr/bin/python3.7" - sh: "curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py" - sh: "sudo apt-get -y install python3-distutils" + - sh: "sudo apt-get -y install python3.9-distutils" + - ps: "If ($env:INSTALL_PY_39_PIP) {python3.9 get-pip.py --user}" - ps: "If ($env:INSTALL_PY_38_PIP) {python3.8 get-pip.py --user}" - ps: "If ($env:INSTALL_PY_37_PIP) {python3.7 get-pip.py --user}" - ps: "If ($env:INSTALL_PY_36_PIP) {python3.6 get-pip.py --user}" @@ -174,7 +195,7 @@ for: # Pre-dev Tests - "pip install -e \".[pre-dev]\"" - "pylint --rcfile .pylintrc samcli" - + # Dev Tests - "pip install -e \".[dev]\"" #TODO revert the percentage to 94 @@ -183,15 +204,13 @@ for: - "mypy setup.py samcli tests" - "pytest -n 4 tests/functional" - # Runs only in Linux, logging docker hub when running canary and docker cred is available - - sh: " - if [[ -n $BY_CANARY ]] && [[ -n $DOCKER_USER ]] && [[ -n $DOCKER_PASS ]]; - then echo Logging in Docker Hub; echo $DOCKER_PASS | docker login --username $DOCKER_USER --password-stdin registry-1.docker.io; - fi" + # Runs only in 
Linux, logging in to Public ECR when running canary and creds are available - sh: " if [[ -n $BY_CANARY ]]; then echo Logging in Public ECR; aws ecr-public get-login-password --region us-east-1 | docker login --username AWS --password-stdin public.ecr.aws; fi" + + - sh: "pytest -vv tests/integration" - sh: "pytest -vv tests/regression" - sh: "black --check setup.py tests samcli" diff --git a/designs/auto-create-ecr.md b/designs/auto-create-ecr.md new file mode 100644 index 0000000000..f74d80e95b --- /dev/null +++ b/designs/auto-create-ecr.md @@ -0,0 +1,324 @@ +Auto Create ECR Repos in Guided Deploy +==================================== + + +What is the problem? +-------------------- + +With the release of Lambda Container Images Support in SAM CLI, customers today have to specify an ECR repo URI where images will be uploaded by SAM CLI after being built. This means that customers need to have pre-created resources (in this case, ECR repos) ready to go, so that they can supply them during the deploy process. This introduces friction and a break in the seamless workflow that `sam deploy --guided` normally offers, with customers having to figure out how to create an ECR repo and how to find the correct ECR URI to specify. + +Current flow for deploying a template with an image-based function: + +1. Create ECR repo: `aws ecr create-repository --repository-name myecr_repo` + +2. Deploy with SAM CLI guided: `sam deploy --guided` + +3. Specify image repo: `Image Repository for HelloWorldFunction []: 12345.dkr.ecr.us-west-2.amazonaws.com/helloworldfunctionrepo` + +What will be changed? +--------------------- + +When deploying with guided, SAM CLI will prompt for the option to auto-create ECR repos for image-based functions. +The auto-created ECR repos will reside in a companion stack that gets deployed along with the actual stack. + +During each guided deploy, the functions and repos will be synced.
+ + +Each function without an image repo specified will have a corresponding repo created in the companion stack. +If a function is deleted from the template and has an auto-created image repo previously associated with it, the auto-created image repo will also be removed. + + + +There will be an escape hatch to use repos not managed by SAM CLI by specifying `--image-repositories` or changing `samconfig.toml`. + +Success criteria for the change +------------------------------- + +* No context switching from SAM CLI to creating resources using other tools. + +Out-of-Scope +------------ + +* SAM CLI will not manage the lifecycle (creation and deletion) of the auto-created resources outside of SAM CLI. For auto-created image repos, modifications to the functions through other means, such as the console, will not modify the associated image repo until the next deployment with SAM CLI. +* Auto-creating repos only concerns the guided experience. Repos are assumed to be provided in CI/CD situations. However, the `--resolve-image-repos` option will be added for auto-creating repos without going through guided. + +User Experience Walkthrough +--------------------------- + +`sam deploy --guided` + +**Creating New Repos** + + +``` +Configuring SAM deploy +====================== +Looking for config file [samconfig.toml] : Not found + +Setting default arguments for 'sam deploy' +========================================= +Stack Name [sam-app]: images-app +AWS Region [us-east-1]: us-east-2 +#Shows you resources changes to be deployed and require a 'Y' to initiate deploy +Confirm changes before deploy [y/N]: y +#SAM needs permission to be able to create roles to connect to the resources in your template +Allow SAM CLI IAM role creation [Y/n]: y +Save arguments to configuration file [Y/n]: y +SAM configuration file [samconfig.toml]: +SAM configuration environment [default]: +Looking for resources needed for deployment: + S3 bucket: Found!
 (aws-sam-cli-managed-default-samclisourcebucket-abcdef) + Image repositories: Not found. + #Managed repositories will be deleted when their functions are removed from the template and deployed + Create managed ECR repositories for all functions? [Y/n]: Y + Creating the required resources... + Successfully created resources for deployment! + +Successfully saved arguments to config file! + #Running 'sam deploy' for future deployments will use the parameters saved above. + #The above parameters can be changed by modifying samconfig.toml + #Learn more about samconfig.toml syntax at + #https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-config.html + +Uploading to abcdefg/7dd33d96eafcae1086a1356df982d38e 539284 / 539284.0 (100.00%) + +Deploying with the following values +=================================== +Stack name : test-stack +Region : us-east-2 +Confirm changeset : True +Deployment s3 bucket : aws-sam-cli-managed-default-samclisourcebucket-abcdef +Image repositories : [ + {"helloWorldFunction1":"12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction1-abcde"}, + {"helloWorldFunction2":"12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction2-abcde"}, + {"helloWorldFunction3":"12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction3-abcde"} +] +Capabilities : ["CAPABILITY_IAM"] +Parameter overrides : {} +Signing profiles : {} + +Initiating deployment +``` + +**Deleting Unreferenced Repos** + +``` +Configuring SAM deploy +====================== + + Looking for config file [samconfig.toml] : Found + Reading default arguments : Success + + Setting default arguments for 'sam deploy' + ========================================= + Stack Name [test-stack]: + AWS Region [us-east-2]: + #Shows you resources changes to be deployed and require a 'Y' to initiate deploy + Confirm changes before deploy [Y/n]: y + #SAM needs permission to be able to create roles to connect to the resources in your template + Allow SAM CLI IAM role
creation [Y/n]: y + HelloWorldFunction4 may not have authorization defined, Is this okay? [y/N]: y + HelloWorldFunction5 may not have authorization defined, Is this okay? [y/N]: y + Save arguments to configuration file [Y/n]: y + SAM configuration file [samconfig.toml]: + SAM configuration environment [default]: + + Looking for resources needed for deployment: Found! + + Managed S3 bucket: aws-sam-cli-managed-default-samclisourcebucket-abcdef + A different default S3 bucket can be set in samconfig.toml + + Image repositories: Not found. + #Managed repositories will be deleted when their functions are removed from the template and deployed + Create managed ECR repositories for all functions? [Y/n]: y + Checking for unreferenced ECR repositories to clean-up: 2 found + 12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction1-abcde + 12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction2-abcde + Delete the unreferenced repositories listed above when deploying? [y/N]: y + + helloworldfunction4:python3.8-v1 to be pushed to 12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction4-abcde:helloworldfunction4-7bfff073dfcf-python3.8-v1 + helloworldfunction5:python3.8-v1 to be pushed to 12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction5-abcde:helloworldfunction5-7bfff073dfcf-python3.8-v1 + + + Saved arguments to config file + Running 'sam deploy' for future deployments will use the parameters saved above. 
+ The above parameters can be changed by modifying samconfig.toml + Learn more about samconfig.toml syntax at + https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-config.html + + + Deploying with following values + =============================== + Stack name : auto-ecr-guided-test + Region : us-west-2 + Confirm changeset : True + Deployment image repository : + { + "HelloWorldFunction3": "12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction3-abcde", + "HelloWorldFunction4": "12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction4-abcde", + "HelloWorldFunction5": "12345.dkr.ecr.us-east-2.amazonaws.com/helloworldfunction5-abcde" + } + Deployment s3 bucket : aws-sam-cli-managed-default-samclisourcebucket-abcde + Capabilities : ["CAPABILITY_IAM"] + Parameter overrides : {} + Signing Profiles : {} +``` + +--resolve-image-repos +--------------------------- +This new option will have the exact same behavior as running guided and confirming all image-repo-related prompts. + +To be more specific, this option will: +1. Create image repos for functions that are not linked to image repos via `--image-repositories` +2. Delete auto-created image repos that no longer have functions associated with them + +Destructive operations are done for the following reasons: +1. This keeps the behavior consistent with the guided experience. In guided, SAM CLI will abort deployment if deletion of auto-created image repos is denied. +2. For UX, this avoids the repo/function mapping reaching an invalid state in which orphaned image repos exist. Supporting that case would also require tracking which repos should be kept, making the sync less robust. +3. From a security perspective, keeping old image repos increases the impact radius of an information leak. A customer might expect a sensitive image repo to be deleted as soon as the function itself is removed, as in the guided experience.
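The create/delete sync described above can be sketched as a small planning step. This is an illustrative sketch only — `plan_repo_sync` is a hypothetical helper, not part of SAM CLI:

```python
def plan_repo_sync(image_function_ids, managed_repos):
    """Plan the companion-stack repo sync for one deploy.

    image_function_ids: logical IDs of image-based functions without a
    user-specified repo. managed_repos: mapping of function logical ID to
    the auto-created repo URI from the previous deployment.
    """
    # Functions with no repo yet get one created in the companion stack.
    to_create = [f for f in image_function_ids if f not in managed_repos]
    # Auto-created repos whose function was removed from the template
    # are deleted on the next deploy.
    to_delete = {f: uri for f, uri in managed_repos.items()
                 if f not in image_function_ids}
    return to_create, to_delete
```

Guided deploy would then prompt before creating `to_create` and deleting `to_delete`, while `--resolve-image-repos` would apply both without the guided prompts.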
+ +Implementation +============== + +CLI Changes +----------- + +* Add new prompt for guided deploy. + * `Create managed ECR repositories for all functions? [Y/n]: y` +* Add `--resolve-image-repos` to non-guided deploy + + +### Breaking Change + +* Not a breaking change. + +Design +------ + +**Companion Stack Naming Scheme** +``` +#Escaped StackName with only common acceptable characters +escaped_stack_name = re.sub(r"[^a-z0-9]", "", stack_name.lower()) +#Escaped LambdaName with only common acceptable characters +escaped_lambda_logical_id = re.sub(r"[^a-z0-9]", "", lambda_logical_id.lower()) +#MD5 of the original StackName. +stack_md5 = hash.str_checksum(stack_name) +#MD5 of the original LambdaName +function_md5 = hash.str_checksum(lambda_logical_id) + +#MD5 is used to avoid two different Lambda Functions having the same escaped name +#For example: Helloworld and HELLO-WORLD +repo_logical_id = + lambda_logical_id[:52] + function_md5[:8] + "Repo" + #52 + 8 + 4 = 64 max char + +repo_name = + escaped_stack_name + stack_md5[:8] + "/" + escaped_lambda_logical_id + function_md5[:8] + "repo" + #128 + 8 + 1 + 64 + 8 + 4 = 213 max char + +companion_stack_name = + stack_name[:104] + "-" + stack_md5[:8] + "-" + "CompanionStack" + #104 + 1 + 8 + 1 + 14 = 128 max char + +repo_output_logical_id = + lambda_logical_id[:52] + function_md5[:8] + "Out" + #52 + 8 + 3 = 63 max char + +Example: + Input: + Customer Stack Name: Hello-World-Stack + Function 1 Logical ID: TestFunction01 + Function 2 Logical ID: AnotherTestFunction02 + Output: + Companion Stack Name: Hello-World-Stack-925976eb-CompanionStack + Function 1 Repo Logical ID: TestFunction0150919004Repo + Function 1 Repo Name: helloworldstack925976eb/testfunction0150919004repo + Function 2 Repo Logical ID: AnotherTestFunction025c2cfd8cRepo + Function 2 Repo Name: helloworldstack925976eb/anothertestfunction025c2cfd8crepo +``` + +**Companion Stack Structure** +``` +AWSTemplateFormatVersion : '2010-09-09' +Transform:
 AWS::Serverless-2016-10-31 +Description: AWS SAM CLI Managed ECR Repo Stack +Metadata: + SamCliInfo: 1.18.0 + CompanionStackname: Hello-World-Stack-925976eb-CompanionStack + +Resources: + + TestFunction0150919004Repo: + Type: AWS::ECR::Repository + Properties: + RepositoryName: helloworldstack925976eb/testfunction0150919004repo + Tags: + - Key: ManagedStackSource + Value: AwsSamCli + - Key: AwsSamCliCompanionStack + Value: Hello-World-Stack-925976eb-CompanionStack + + RepositoryPolicyText: + Version: "2012-10-17" + Statement: + - + Sid: AllowLambdaSLR + Effect: Allow + Principal: + Service: + - "lambda.amazonaws.com" + Action: + - "ecr:GetDownloadUrlForLayer" + - "ecr:GetRepositoryPolicy" + - "ecr:BatchGetImage" + + AnotherTestFunction025c2cfd8cRepo: + Type: AWS::ECR::Repository + Properties: + RepositoryName: helloworldstack925976eb/anothertestfunction025c2cfd8crepo + Tags: + - Key: ManagedStackSource + Value: AwsSamCli + - Key: AwsSamCliCompanionStack + Value: Hello-World-Stack-925976eb-CompanionStack + + RepositoryPolicyText: + Version: "2012-10-17" + Statement: + - + Sid: AllowLambdaSLR + Effect: Allow + Principal: + Service: + - "lambda.amazonaws.com" + Action: + - "ecr:GetDownloadUrlForLayer" + - "ecr:GetRepositoryPolicy" + - "ecr:BatchGetImage" + +Outputs: + + TestFunction0150919004Out: + Value: !Sub ${AWS::AccountId}.dkr.ecr.${AWS::Region}.${AWS::URLSuffix}/${TestFunction0150919004Repo} + + AnotherTestFunction025c2cfd8cOut: + Value: !Sub ${AWS::AccountId}.dkr.ecr.${AWS::Region}.${AWS::URLSuffix}/${AnotherTestFunction025c2cfd8cRepo} +``` + +Documentation Changes +===================== +* New option `--resolve-image-repos`. This option will auto-create/delete repos without the need to go through the guided experience.
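The naming scheme above can be exercised with a short Python sketch. Assumptions: `hash.str_checksum` is taken to be an MD5 hex digest (per the "MD5 of the original StackName" comment in the scheme), and `companion_names` is a hypothetical helper name for this illustration:

```python
import hashlib
import re


def str_checksum(text):
    # Stand-in for SAM CLI's hash.str_checksum, assumed here to be
    # the MD5 hex digest of the string.
    return hashlib.md5(text.encode("utf-8")).hexdigest()


def companion_names(stack_name, lambda_logical_id):
    # Escape to the lowercase-alphanumeric set ECR accepts, as in the design.
    escaped_stack = re.sub(r"[^a-z0-9]", "", stack_name.lower())
    escaped_function = re.sub(r"[^a-z0-9]", "", lambda_logical_id.lower())
    stack_md5 = str_checksum(stack_name)
    function_md5 = str_checksum(lambda_logical_id)
    return {
        # 52 + 8 + 4 = 64 max char
        "repo_logical_id": lambda_logical_id[:52] + function_md5[:8] + "Repo",
        # 128 + 8 + 1 + 64 + 8 + 4 = 213 max char
        "repo_name": (escaped_stack + stack_md5[:8] + "/"
                      + escaped_function + function_md5[:8] + "repo"),
        # 104 + 1 + 8 + 1 + 14 = 128 max char
        "companion_stack_name": (stack_name[:104] + "-" + stack_md5[:8]
                                 + "-CompanionStack"),
        # 52 + 8 + 3 = 63 max char
        "repo_output_logical_id": lambda_logical_id[:52] + function_md5[:8] + "Out",
    }
```

The MD5 suffixes keep two logical IDs that escape to the same string (e.g. `Helloworld` and `HELLO-WORLD`) from colliding, while the truncation limits keep the generated names inside CloudFormation's and ECR's length constraints.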
+ +Open Issues +============ +https://github.com/aws/aws-sam-cli/issues/2447 + +Task Breakdown +============== + +- \[x\] Send a Pull Request with this design document +- \[ \] Build Companion Stack Manager +- \[ \] Change Deploy CLI +- \[ \] Unit tests +- \[ \] Functional Tests +- \[ \] Integration tests +- \[ \] Update documentation diff --git a/installer/assets/THIRD-PARTY-LICENSES b/installer/assets/THIRD-PARTY-LICENSES index 10b3966e4a..303e08f469 100644 --- a/installer/assets/THIRD-PARTY-LICENSES +++ b/installer/assets/THIRD-PARTY-LICENSES @@ -1,17 +1,17 @@ -** arrow; version 0.15.5 -- https://pypi.org/project/arrow/ -** aws-lambda-builders; version 1.1.0 -- +** arrow; version 1.0.3 -- https://pypi.org/project/arrow/ +** aws-lambda-builders; version 1.4.0 -- https://pypi.org/project/aws-sam-translator/ -** aws-sam-translator; version 1.28.1 -- +** aws-sam-translator; version 1.36.0 -- https://pypi.org/project/aws-sam-translator/ -** boto3; version 1.14.27 -- https://pypi.org/project/boto3/ -** botocore; version 1.17.27 -- https://github.com/boto/botocore +** boto3; version 1.17.33 -- https://pypi.org/project/boto3/ +** botocore; version 1.20.33 -- https://github.com/boto/botocore ** docker; version 4.2.2 -- https://pypi.org/project/docker/ -** Importlib-metadata; version 1.6.0 -- +** Importlib-metadata; version 4.0.1 -- https://importlib-metadata.readthedocs.io/en/latest/ -** python-request; version 2.23.0 -- -https://pypi.python.org/pypi/requests/2.23.0 -** s3transfer; version 0.3.3 -- https://github.com/boto/s3transfer -** serverlessrepo; version 0.1.9 -- https://pypi.org/project/serverlessrepo/ +** python-request; version 2.25.1 -- +https://pypi.python.org/pypi/requests/2.25.1 +** s3transfer; version 0.3.6 -- https://github.com/boto/s3transfer +** serverlessrepo; version 0.1.10 -- https://pypi.org/project/serverlessrepo/ Apache License @@ -345,7 +345,7 @@ limitations under the License. 
------ -** text_unidecode; version 1.2 -- https://github.com/kmike/text-unidecode/ +** text_unidecode; version 1.3 -- https://github.com/kmike/text-unidecode/ Mikhail Korobov The "Artistic License" @@ -504,7 +504,7 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ------ -** dateparser; version 0.7.4 -- https://pypi.org/project/dateparser/ +** dateparser; version 0.7.6 -- https://pypi.org/project/dateparser/ Copyright (c) 2014, Scrapinghub All rights reserved. @@ -535,7 +535,7 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ------ -** dateutil; version 2.8.0 -- https://github.com/dateutil/dateutil/tree/2.8.0 +** dateutil; version 2.8.1 -- https://github.com/dateutil/dateutil/tree/2.8.0 Copyright 2017- Paul Ganssle Copyright 2017- dateutil contributors (see AUTHORS file) @@ -594,7 +594,7 @@ The above BSD License Applies to all code, even that also covered by Apache ------ -** cookiecutter; version 1.6.0 -- https://pypi.org/project/cookiecutter/ +** cookiecutter; version 1.7.2 -- https://pypi.org/project/cookiecutter/ Copyright (c) 2013-2017, Audrey Roy All rights reserved. @@ -630,7 +630,7 @@ POSSIBILITY OF SUCH DAMAGE. ------ -** idna; version 2.9 -- https://github.com/kjd/idna +** idna; version 2.10 -- https://github.com/kjd/idna Copyright (c) 2013-2020, Kim Davies. All rights reserved. Redistribution and use in source and binary forms, with or without @@ -697,11 +697,11 @@ SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
------ -** click; version 7.1.1 -- https://click.palletsprojects.com/en/7.x/ +** click; version 7.1.2 -- https://click.palletsprojects.com/en/7.x/ Copyright 2014 Pallets -** Flask; version 1.0.4 -- https://pypi.org/project/Flask/ +** Flask; version 1.1.2 -- https://pypi.org/project/Flask/ Copyright 2010 Pallets -** jinja2; version 2.11.1 -- https://github.com/pallets/jinja +** jinja2; version 2.11.3 -- https://github.com/pallets/jinja Copyright 2007 Pallets ** Werkzeug; version 1.0.1 -- https://pypi.org/project/Werkzeug/ Copyright 2007 Pallets @@ -1411,7 +1411,7 @@ Apache License 2.0 ------ -** websocket-client; version 0.57.0 -- +** websocket-client; version 0.58.0 -- https://github.com/websocket-client/websocket-client Copyright (C) 2010 Hiroki Ohtani(liris) @@ -2464,7 +2464,7 @@ That's all there is to it! ------ -** chevron; version 0.13.1 -- https://pypi.org/project/chevron/ +** chevron; version 0.14.0 -- https://pypi.org/project/chevron/ Copyright (c) 2014 Noah Morrison ** future; version 0.18.2 -- https://pypi.org/project/future/ Copyright (c) 2013-2016 Python Charmers Pty Ltd, Australia @@ -2474,24 +2474,24 @@ Copyright (c) 2015 Raphael Pierzina Copyright (c) 2013 Julian Berman ** poyo; version 0.5.0 -- https://pypi.org/project/poyo/ Copyright (c) 2015 Raphael Pierzina -** pyrsistent; version 0.16.0 -- https://github.com/tobgu/pyrsistent +** pyrsistent; version 0.17.3 -- https://github.com/tobgu/pyrsistent Copyright (c) 2019 Tobias Gustafsson ** python-six; version 1.15.0 -- https://github.com/JioCloud/python-six Copyright (c) 2010-2020 Benjamin Peterson -** python-urllib3; version 1.25.8 -- https://github.com/shazow/urllib3 +** python-urllib3; version 1.26.5 -- https://github.com/shazow/urllib3 Copyright (c) 2008-2020 Andrey Petrov and contributors (see CONTRIBUTORS.txt) -** pytz; version 2019.3 -- https://pypi.org/project/pytz/ +** pytz; version 2021.1 -- https://pypi.org/project/pytz/ Copyright (c) 2003-2019 Stuart Bishop -** pyyaml; version 5.3.1 -- 
http://pyyaml.org/ +** pyyaml; version 5.4.1 -- http://pyyaml.org/ Copyright (c) 2006 Kirill Simonov -** setuptools; version 46.1.3 -- +** setuptools; version 54.2.0 -- https://github.com/pypa/setuptools/tree/v46.1.3 Copyright (C) 2016 Jason R Coombs -** tomlkit; version 0.5.8 -- https://pypi.org/project/tomlkit/ +** tomlkit; version 0.7.2 -- https://pypi.org/project/tomlkit/ Copyright (c) 2018 Sébastien Eustace -** tzlocal; version 2.0.0 -- https://pypi.org/project/tzlocal/ +** tzlocal; version 2.1.0 -- https://pypi.org/project/tzlocal/ Copyright 2011-2017 Lennart Regebro -** zipp; version 3.1.0 -- https://github.com/jaraco/zipp +** zipp; version 3.4.1 -- https://github.com/jaraco/zipp Copyright Jason R. Coombs Permission is hereby granted, free of charge, to any person obtaining a copy @@ -2501,20 +2501,20 @@ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: -The above copyright notice and this permission notice shall be included in all -copies or substantial portions of the Software. +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, -OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE -SOFTWARE. +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. 
------ -** jmespath; version 0.9.5 -- https://pypi.org/project/jmespath/ +** jmespath; version 0.10.0 -- https://pypi.org/project/jmespath/ Copyright (c) 2013 Amazon.com, Inc. or its affiliates. All Rights Reserved Copyright (c) 2013 Amazon.com, Inc. or its affiliates. All Rights Reserved @@ -2540,9 +2540,9 @@ IN THE SOFTWARE. ------ -** attrs; version 19.3.0 -- https://pypi.org/project/attrs/ +** attrs; version 20.3.0 -- https://pypi.org/project/attrs/ Copyright (c) 2015 Hynek Schlawack -** wheel; version 0.34.2 -- https://github.com/pypa/wheel +** wheel; version 0.36.2 -- https://github.com/pypa/wheel "wheel" copyright (c) 2012-2014 Daniel Holth and contributors. @@ -2570,7 +2570,7 @@ SOFTWARE. ------ -** certifi; version 2020.4.5.1 -- https://github.com/certifi/python-certifi/ +** certifi; version 2020.12.05 -- https://github.com/certifi/python-certifi/ (c) 1999 VeriSign, Inc. (c) 2007 GeoTrust Inc. (c) 2006 VeriSign, Inc. @@ -2939,7 +2939,7 @@ Copyright (C) 1995-1998 Eric Young (eay@cryptsoft.com) ------ -** python; version 3.6.x, 3.7.x -- https://www.python.org/ +** python; version 3.6.x, 3.7.x, 3.8.x -- https://www.python.org/ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation. All rights reserved. @@ -3124,7 +3124,7 @@ THIS SOFTWARE. ------ -** regex; version 2020.4.4 -- https://pypi.org/project/regex/ +** regex; version 2021.3.17 -- https://pypi.org/project/regex/ Copyright 2019 Matthew Barnett ** Matthew Barnett - https://bitbucket.org/mrabarnett/mrab-regex/src/default/ @@ -3364,4 +3364,4 @@ PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS -SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
\ No newline at end of file +SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/installer/pyinstaller/hook-samcli.py b/installer/pyinstaller/hook-samcli.py index d526b4cc55..72a939a9b2 100644 --- a/installer/pyinstaller/hook-samcli.py +++ b/installer/pyinstaller/hook-samcli.py @@ -13,10 +13,14 @@ "samcli.commands.deploy", "samcli.commands.logs", "samcli.commands.publish", + "samcli.commands.pipeline.pipeline", + "samcli.commands.pipeline.init", + "samcli.commands.pipeline.bootstrap", # default hidden import 'pkg_resources.py2_warn' is added # since pyInstaller 4.0. "pkg_resources.py2_warn", "aws_lambda_builders.workflows", + "configparser", ] datas = ( hooks.collect_data_files("samcli") diff --git a/mypy.ini b/mypy.ini index 8a7c51866f..cbab8f7fa6 100644 --- a/mypy.ini +++ b/mypy.ini @@ -59,7 +59,7 @@ ignore_missing_imports=True ignore_missing_imports=True # progressive add typechecks and these modules already complete the process, let's keep them clean -[mypy-samcli.commands.build,samcli.lib.build.*,samcli.commands.local.cli_common.invoke_context,samcli.commands.local.lib.local_lambda,samcli.lib.providers.*,samcli.lib.iac.cdk.*] +[mypy-samcli.commands.build,samcli.lib.build.*,samcli.commands.local.cli_common.invoke_context,samcli.commands.local.lib.local_lambda,samcli.lib.providers.*,samcli.lib.utils.git_repo.py,samcli.lib.cookiecutter.*,samcli.lib.pipeline.*,samcli.commands.pipeline.*,samcli.lib.iac.cdk.*] disallow_untyped_defs=True disallow_incomplete_defs=True diff --git a/requirements/base.txt b/requirements/base.txt index e52f6d63e1..fd9af35722 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -6,12 +6,12 @@ boto3~=1.14 jmespath~=0.10.0 PyYAML~=5.3 cookiecutter~=1.7.2 -aws-sam-translator==1.35.0 +aws-sam-translator==1.38.0 #docker minor version updates can include breaking changes. Auto update micro version only. 
docker~=4.2.0 -dateparser~=0.7 -requests==2.23.0 +dateparser~=1.0 +requests==2.25.1 serverlessrepo==0.1.10 -aws_lambda_builders==1.4.0 -tomlkit==0.7.0 -watchdog==0.10.3 +aws_lambda_builders==1.6.0 +tomlkit==0.7.2 +watchdog==2.1.2 diff --git a/requirements/pre-dev.txt b/requirements/pre-dev.txt index 287259350b..346a9e3622 100644 --- a/requirements/pre-dev.txt +++ b/requirements/pre-dev.txt @@ -1 +1 @@ -pylint~=2.6.0 \ No newline at end of file +pylint~=2.9.0 diff --git a/requirements/reproducible-linux.txt b/requirements/reproducible-linux.txt index abaf26ae99..75f8b8a0e3 100644 --- a/requirements/reproducible-linux.txt +++ b/requirements/reproducible-linux.txt @@ -1,5 +1,5 @@ # -# This file is autogenerated by pip-compile +# This file is autogenerated by pip-compile with python 3.7 # To update, run: # # pip-compile --allow-unsafe --generate-hashes --output-file=requirements/reproducible-linux.txt @@ -12,15 +12,15 @@ attrs==20.3.0 \ --hash=sha256:31b2eced602aa8423c2aea9c76a724617ed67cf9513173fd3a4f03e3a929c7e6 \ --hash=sha256:832aa3cde19744e49938b91fea06d69ecb9e649c93ba974535d08ad92164f700 # via jsonschema -aws-lambda-builders==1.4.0 \ - --hash=sha256:3f885433bb71bae653b520e3cf4c31fe5f5b977cb770d42c631af155cd60fd2b \ - --hash=sha256:5d4e4ecb3d3290f0eec1f62b7b0d9d6b91160ae71447d95899eede392d05f75f \ - --hash=sha256:d32f79cf67b189a7598793f69797f284b2eb9a9fada562175b1e854187f95aed +aws-lambda-builders==1.6.0 \ + --hash=sha256:068840f92520dd37524f464c61e4fd3224c185dda40cd941ddeb0354edcc069b \ + --hash=sha256:2d232e14397519d8e75d4cc1523a31a255c053f97fd784b5513b81fc6c6c5492 \ + --hash=sha256:3e26edb75e78e1420d573cf4bcda1acd424f88a12e754bff971fe8b9bb4545a3 # via aws-sam-cli (setup.py) -aws-sam-translator==1.35.0 \ - --hash=sha256:2f8904fd4a631752bc441a8fd928c444ed98ceb86b94d25ed7b84982e2eff1cd \ - --hash=sha256:5cf7faab3566843f3b44ef1a42a9c106ffb50809da4002faab818076dcc7bff8 \ - --hash=sha256:c35075e7e804490d6025598ed4878ad3ab8668e37cafb7ae75120b1c37a6d212 
+aws-sam-translator==1.38.0 \ + --hash=sha256:0ecadda9cf5ab2318f57f1253181a2151e4c53cd35d21717a923c075a5a65cb6 \ + --hash=sha256:dc6b816bb5cfd9709299f9b263fc0cf5ae60aca4166d1c90413ece651f1556bb \ + --hash=sha256:ee7c7c5e44ec67202622ca877140545496527ffcc45da3beeda966f007443a88 # via aws-sam-cli (setup.py) binaryornot==0.4.4 \ --hash=sha256:359501dfc9d40632edc9fac890e19542db1a287bbcfa58175b66658392018061 \ @@ -64,9 +64,9 @@ cookiecutter==1.7.2 \ --hash=sha256:430eb882d028afb6102c084bab6cf41f6559a77ce9b18dc6802e3bc0cc5f4a30 \ --hash=sha256:efb6b2d4780feda8908a873e38f0e61778c23f6a2ea58215723bcceb5b515dac # via aws-sam-cli (setup.py) -dateparser==0.7.6 \ - --hash=sha256:7552c994f893b5cb8fcf103b4cd2ff7f57aab9bfd2619fdf0cf571c0740fd90b \ - --hash=sha256:e875efd8c57c85c2d02b238239878db59ff1971f5a823457fcc69e493bf6ebfa +dateparser==1.0.0 \ + --hash=sha256:159cc4e01a593706a15cd4e269a0b3345edf3aef8bf9278a57dac8adf5bf1e4a \ + --hash=sha256:17202df32c7a36e773136ff353aa3767e987f8b3e27374c39fd21a30a803d6f8 # via aws-sam-cli (setup.py) docker==4.2.2 \ --hash=sha256:03a46400c4080cb6f7aa997f881ddd84fef855499ece219d75fbdb53289c17ab \ @@ -80,18 +80,14 @@ idna==2.10 \ --hash=sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6 \ --hash=sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0 # via requests -importlib-metadata==3.7.3 \ - --hash=sha256:742add720a20d0467df2f444ae41704000f50e1234f46174b51f9c6031a1bd71 \ - --hash=sha256:b74159469b464a99cb8cc3e21973e4d96e05d3024d337313fedb618a6e86e6f4 +importlib-metadata==4.0.1 \ + --hash=sha256:8c501196e49fb9df5df43833bdb1e4328f64847763ec8a50703148b73784d581 \ + --hash=sha256:d7eb1dea6d6a6086f8be21784cc9e3bcfa55872b52309bc5fad53a8ea444465d # via jsonschema itsdangerous==1.1.0 \ --hash=sha256:321b033d07f2a4136d3ec762eac9f16a10ccd60f53c0c91af90217ace7ba1f19 \ --hash=sha256:b12271b2047cb23eeb98c8b5622e2e5c5e9abd9784a153e9d8ef9cb4dd09d749 # via flask -jinja2-time==0.2.0 \ - 
--hash=sha256:d14eaa4d315e7688daa4969f616f226614350c48730bfa1692d2caebd8c90d40 \ - --hash=sha256:d3eab6605e3ec8b7a0863df09cc1d23714908fa61aa6986a845c20ba488b4efa - # via cookiecutter jinja2==2.11.3 \ --hash=sha256:03e47ad063331dd6a3f04a43eddca8a966a26ba0c5b7207a9a9e4e08f1b29419 \ --hash=sha256:a6d58433de0ae800347cab1fa3043cebbabe8baa9d29e668f1c768cb87a333c6 @@ -99,6 +95,10 @@ jinja2==2.11.3 \ # cookiecutter # flask # jinja2-time +jinja2-time==0.2.0 \ + --hash=sha256:d14eaa4d315e7688daa4969f616f226614350c48730bfa1692d2caebd8c90d40 \ + --hash=sha256:d3eab6605e3ec8b7a0863df09cc1d23714908fa61aa6986a845c20ba488b4efa + # via cookiecutter jmespath==0.10.0 \ --hash=sha256:b85d0567b8666149a93172712e68920734333c0ce7e89b78b3e987f71e5ed4f9 \ --hash=sha256:cdf6525904cc597730141d61b36f2e4b8ecc257c420fa2f4549bac2c2d0cb72f @@ -166,9 +166,6 @@ markupsafe==1.1.1 \ # via # cookiecutter # jinja2 -pathtools==0.1.2 \ - --hash=sha256:7c35c5421a39bb82e58018febd90e3b6e5db34c5443aaaf742b3f33d4655f1c0 - # via watchdog poyo==0.5.0 \ --hash=sha256:3e2ca8e33fdc3c411cd101ca395668395dd5dc7ac775b8e809e3def9f9fe041a \ --hash=sha256:e26956aa780c45f011ca9886f044590e2d8fd8b61db7b1c1cf4e0869f48ed4dd @@ -268,9 +265,9 @@ regex==2021.3.17 \ --hash=sha256:f5d0c921c99297354cecc5a416ee4280bd3f20fd81b9fb671ca6be71499c3fdf \ --hash=sha256:f85d6f41e34f6a2d1607e312820971872944f1661a73d33e1e82d35ea3305e14 # via dateparser -requests==2.23.0 \ - --hash=sha256:43999036bfa82904b6af1d99e4882b560e5e2c68e5c4b0aa03b655f3d7d73fee \ - --hash=sha256:b3f43d496c6daba4493e7c431722aeb7dbc6288f52a6e04e7b6023b0247817e6 +requests==2.25.1 \ + --hash=sha256:27973dd4a904a4f13b263a19c866c13b92a39ed1c964655f025f3f8d3d75b804 \ + --hash=sha256:c210084e36a42ae6b9219e00e48287def368a26d03a048ddad7bfee44f75871e # via # aws-sam-cli (setup.py) # cookiecutter @@ -299,14 +296,14 @@ text-unidecode==1.3 \ --hash=sha256:1311f10e8b895935241623731c2ba64f4c455287888b18189350b67134a822e8 \ 
--hash=sha256:bad6603bb14d279193107714b288be206cac565dfa49aa5b105294dd5c4aab93 # via python-slugify -tomlkit==0.7.0 \ - --hash=sha256:6babbd33b17d5c9691896b0e68159215a9387ebfa938aa3ac42f4a4beeb2b831 \ - --hash=sha256:ac57f29693fab3e309ea789252fcce3061e19110085aa31af5446ca749325618 +tomlkit==0.7.2 \ + --hash=sha256:173ad840fa5d2aac140528ca1933c29791b79a374a0861a80347f42ec9328117 \ + --hash=sha256:d7a454f319a7e9bd2e249f239168729327e4dd2d27b17dc68be264ad1ce36754 # via aws-sam-cli (setup.py) -typing-extensions==3.7.4.3 \ - --hash=sha256:7cb407020f00f7bfc3cb3e7881628838e69d8f3fcab2f64742a5e76b2f841918 \ - --hash=sha256:99d4073b617d30288f569d3f13d2bd7548c3a7e4c8de87db09a9d29bb3a4a60c \ - --hash=sha256:dafc7639cde7f1b6e1acc0f457842a83e722ccca8eef5270af2d74792619a89f +typing-extensions==3.10.0.0 \ + --hash=sha256:0ac0f89795dd19de6b97debb0c6af1c70987fd80a2d62d1958f7e56fcc31b497 \ + --hash=sha256:50b6f157849174217d0656f99dc82fe932884fb250826c18350e159ec6cdf342 \ + --hash=sha256:779383f6086d90c99ae41cf0ff39aac8a7937a9283ce0a414e5dd782f4c94a84 # via # arrow # importlib-metadata @@ -314,14 +311,30 @@ tzlocal==2.1 \ --hash=sha256:643c97c5294aedc737780a49d9df30889321cbe1204eac2c2ec6134035a92e44 \ --hash=sha256:e2cb6c6b5b604af38597403e9852872d7f534962ae2954c7f35efcb1ccacf4a4 # via dateparser -urllib3==1.25.11 \ - --hash=sha256:8d7eaa5a82a1cac232164990f04874c594c9453ec55eef02eab885aa02fc17a2 \ - --hash=sha256:f5321fbe4bf3fefa0efd0bfe7fb14e90909eb62a48ccda331726b4319897dd5e +urllib3==1.26.5 \ + --hash=sha256:753a0374df26658f99d826cfe40394a686d05985786d946fbe4165b5148f5a7c \ + --hash=sha256:a7acd0977125325f516bda9735fa7142b909a8d01e8b2e4c8108d0984e6e0098 # via # botocore # requests -watchdog==0.10.3 \ - --hash=sha256:4214e1379d128b0588021880ccaf40317ee156d4603ac388b9adcf29165e0c04 +watchdog==2.1.2 \ + --hash=sha256:0237db4d9024859bea27d0efb59fe75eef290833fd988b8ead7a879b0308c2db \ + --hash=sha256:104266a778906ae0e971368d368a65c4cd032a490a9fca5ba0b78c6c7ae11720 \ + 
--hash=sha256:188145185c08c73c56f1478ccf1f0f0f85101191439679b35b6b100886ce0b39 \ + --hash=sha256:1a62a4671796dc93d1a7262286217d9e75823c63d4c42782912d39a506d30046 \ + --hash=sha256:255a32d44bbbe62e52874ff755e2eefe271b150e0ec240ad7718a62a7a7a73c4 \ + --hash=sha256:3d6405681471ebe0beb3aa083998c4870e48b57f8afdb45ea1b5957cc5cf1014 \ + --hash=sha256:4b219d46d89cfa49af1d73175487c14a318a74cb8c5442603fd13c6a5b418c86 \ + --hash=sha256:581e3548159fe7d2a9f377a1fbcb41bdcee46849cca8ab803c7ac2e5e04ec77c \ + --hash=sha256:58ebb1095ee493008a7789d47dd62e4999505d82be89fc884d473086fccc6ebd \ + --hash=sha256:598d772beeaf9c98d0df946fbabf0c8365dd95ea46a250c224c725fe0c4730bc \ + --hash=sha256:668391e6c32742d76e5be5db6bf95c455fa4b3d11e76a77c13b39bccb3a47a72 \ + --hash=sha256:6ef9fe57162c4c361692620e1d9167574ba1975ee468b24051ca11c9bba6438e \ + --hash=sha256:91387ee2421f30b75f7ff632c9d48f76648e56bf346a7c805c0a34187a93aab4 \ + --hash=sha256:a42e6d652f820b2b94cd03156c62559a2ea68d476476dfcd77d931e7f1012d4a \ + --hash=sha256:a6471517315a8541a943c00b45f1d252e36898a3ae963d2d52509b89a50cb2b9 \ + --hash=sha256:d34ce2261f118ecd57eedeef95fc2a495fc4a40b3ed7b3bf0bd7a8ccc1ab4f8f \ + --hash=sha256:edcd9ef3fd460bb8a98eb1fcf99941e9fd9f275f45f1a82cb1359ec92975d647 # via aws-sam-cli (setup.py) websocket-client==0.58.0 \ --hash=sha256:44b5df8f08c74c3d82d28100fdc81f4536809ce98a17f0757557813275fbb663 \ diff --git a/samcli/__init__.py b/samcli/__init__.py index 166aea32fd..ceb4acdbf7 100644 --- a/samcli/__init__.py +++ b/samcli/__init__.py @@ -2,4 +2,4 @@ SAM CLI version """ -__version__ = "1.22.0.dev202107140310" +__version__ = "1.28.0.dev202107140310" diff --git a/samcli/cli/command.py b/samcli/cli/command.py index 384529f78b..c135400586 100644 --- a/samcli/cli/command.py +++ b/samcli/cli/command.py @@ -21,6 +21,7 @@ "samcli.commands.deploy", "samcli.commands.logs", "samcli.commands.publish", + "samcli.commands.pipeline.pipeline", # We intentionally do not expose the `bootstrap` command for now. 
We might open it up later # "samcli.commands.bootstrap", ] diff --git a/samcli/cli/context.py b/samcli/cli/context.py index b340801203..2aa6ccf7a8 100644 --- a/samcli/cli/context.py +++ b/samcli/cli/context.py @@ -4,7 +4,7 @@ import logging import uuid -from typing import Optional, cast +from typing import Optional, cast, List import boto3 import botocore @@ -223,7 +223,7 @@ def _refresh_session(self): raise CredentialsError(str(ex)) from ex -def get_cmd_names(cmd_name, ctx): +def get_cmd_names(cmd_name, ctx) -> List[str]: """ Given the click core context, return a list representing all the subcommands passed to the CLI diff --git a/samcli/cli/types.py b/samcli/cli/types.py index 00a52e2077..7aa571b67b 100644 --- a/samcli/cli/types.py +++ b/samcli/cli/types.py @@ -208,6 +208,11 @@ def convert(self, value, param, ctx): if value == ("",): return result + # if value comes in via samconfig file and is a list, it is parsed to string. + if isinstance(value, list): + if not value: + return result + value = " ".join(value) # if value comes in a via configuration file, it will be a string. So we should still convert it. 
value = (value,) if not isinstance(value, tuple) else value diff --git a/samcli/commands/_utils/options.py b/samcli/commands/_utils/options.py index 712da5dcc1..778fa74c0d 100644 --- a/samcli/commands/_utils/options.py +++ b/samcli/commands/_utils/options.py @@ -14,7 +14,7 @@ from samcli.commands._utils.custom_options.option_nargs import OptionNargs from samcli.lib.iac.interface import ProjectTypes -_TEMPLATE_OPTION_DEFAULT_VALUE = "template.[yaml|yml]" +_TEMPLATE_OPTION_DEFAULT_VALUE = "template.[yaml|yml|json]" DEFAULT_STACK_NAME = "sam-app" LOG = logging.getLogger(__name__) @@ -33,7 +33,7 @@ def get_or_default_template_file_name(ctx, param, provided_value, include_build) """ original_template_path = os.path.abspath(provided_value) - search_paths = ["template.yaml", "template.yml"] + search_paths = ["template.yaml", "template.yml", "template.json"] if include_build: search_paths.insert(0, os.path.join(".aws-sam", "build", "template.yaml")) diff --git a/samcli/commands/build/build_constants.py b/samcli/commands/build/build_constants.py new file mode 100644 index 0000000000..4ecc728e21 --- /dev/null +++ b/samcli/commands/build/build_constants.py @@ -0,0 +1,8 @@ +""" +Contains Build command related constants values +""" + +import os + +DEFAULT_BUILD_DIR = os.path.join(".aws-sam", "build") +DEFAULT_CACHE_DIR = os.path.join(".aws-sam", "cache") diff --git a/samcli/commands/build/build_context.py b/samcli/commands/build/build_context.py index 56b6100705..878eb3efcc 100644 --- a/samcli/commands/build/build_context.py +++ b/samcli/commands/build/build_context.py @@ -4,18 +4,19 @@ import logging import os -import shutil -from typing import Optional, List import pathlib +import shutil +from typing import Dict, Optional, List from samcli.lib.iac.interface import IacPlugin, Project +from samcli.commands.build.exceptions import InvalidBuildDirException, MissingBuildMethodException +from samcli.lib.intrinsic_resolver.intrinsics_symbol_table import IntrinsicsSymbolTable from 
samcli.lib.providers.provider import ResourcesToBuildCollector, Stack, Function, LayerVersion -from samcli.lib.providers.sam_stack_provider import SamLocalStackProvider -from samcli.local.docker.manager import ContainerManager from samcli.lib.providers.sam_function_provider import SamFunctionProvider from samcli.lib.providers.sam_layer_provider import SamLayerProvider +from samcli.lib.providers.sam_stack_provider import SamLocalStackProvider +from samcli.local.docker.manager import ContainerManager from samcli.local.lambdafn.exceptions import ResourceNotFound -from samcli.commands.build.exceptions import InvalidBuildDirException, MissingBuildMethodException LOG = logging.getLogger(__name__) @@ -47,6 +48,7 @@ def __init__( container_env_var: Optional[dict] = None, container_env_var_file: Optional[str] = None, build_images: Optional[dict] = None, + aws_region: Optional[str] = None, ) -> None: self._resource_identifier = resource_identifier @@ -61,6 +63,10 @@ def __init__( self._clean = clean self._use_container = use_container self._parameter_overrides = parameter_overrides + # Override certain CloudFormation pseudo-parameters based on values provided by customer + self._global_parameter_overrides: Optional[Dict] = None + if aws_region: + self._global_parameter_overrides = {IntrinsicsSymbolTable.AWS_REGION: aws_region} self._docker_network = docker_network self._skip_pull_image = skip_pull_image self._mode = mode @@ -79,7 +85,9 @@ def __init__( def __enter__(self) -> "BuildContext": self._stacks, remote_stack_full_paths = SamLocalStackProvider.get_stacks( - self._project.stacks, parameter_overrides=self._parameter_overrides + self._project.stacks, + parameter_overrides=self._parameter_overrides, + global_parameter_overrides=self._global_parameter_overrides, ) if remote_stack_full_paths: diff --git a/samcli/commands/build/command.py b/samcli/commands/build/command.py index d5c93d0584..0e9165ad33 100644 --- a/samcli/commands/build/command.py +++ 
b/samcli/commands/build/command.py @@ -17,6 +17,7 @@ cdk_click_options, ) from samcli.cli.main import pass_context, common_options as cli_framework_options, aws_creds_options, print_cmdline_args +from samcli.commands.build.build_constants import DEFAULT_BUILD_DIR, DEFAULT_CACHE_DIR from samcli.lib.build.exceptions import BuildInsideContainerError from samcli.lib.iac.interface import IacPlugin, Project from samcli.lib.iac.utils.helpers import inject_iac_plugin @@ -29,9+30,6 @@ LOG = logging.getLogger(__name__) -DEFAULT_BUILD_DIR = os.path.join(".aws-sam", "build") -DEFAULT_CACHE_DIR = os.path.join(".aws-sam", "cache") - HELP_TEXT = """ Use this command to build your AWS Lambda Functions source code to generate artifacts that target AWS Lambda's execution environment.\n @@ -43,7 +41,7 @@ \b Supported Runtimes ------------------ -1. Python 2.7, 3.6, 3.7, 3.8 using PIP\n +1. Python 2.7, 3.6, 3.7, 3.8, 3.9 using PIP\n 2. Nodejs 14.x, 12.x, 10.x, 8.10, 6.10 using NPM\n 3. Ruby 2.5 using Bundler\n 4. Java 8, Java 11 using Gradle and Maven\n @@ -220,6 +218,7 @@ def cli( # All logic must be implemented in the ``do_cli`` method.
This helps with easy unit testing mode = _get_mode_value_from_envvar("SAM_BUILD_MODE", choices=["debug"]) do_cli( + ctx, resource_logical_id, template_file, base_dir, @@ -244,6 +243,7 @@ def cli( def do_cli( # pylint: disable=too-many-locals, too-many-statements + click_ctx, function_identifier: Optional[str], template: str, base_dir: Optional[str], @@ -307,6 +307,7 @@ def do_cli( # pylint: disable=too-many-locals, too-many-statements container_env_var=processed_env_vars, container_env_var_file=container_env_var_file, build_images=processed_build_images, + aws_region=click_ctx.region, iac=iac, project=project, ) as ctx: diff --git a/samcli/commands/deploy/command.py b/samcli/commands/deploy/command.py index 8209dd060a..6062c0a2a3 100644 --- a/samcli/commands/deploy/command.py +++ b/samcli/commands/deploy/command.py @@ -158,7 +158,9 @@ "--resolve-s3", required=False, is_flag=True, - help="Automatically resolve s3 bucket for non-guided deployments." + help="Automatically resolve s3 bucket for non-guided deployments. " + "Enabling this option will also create a managed default s3 bucket for you. " + "If you do not provide a --s3-bucket value, the managed bucket will be used. 
" "Do not use --s3-guided parameter with this option.", ) @metadata_override_option diff --git a/samcli/commands/deploy/deploy_context.py b/samcli/commands/deploy/deploy_context.py index e5a8566421..362f87305a 100644 --- a/samcli/commands/deploy/deploy_context.py +++ b/samcli/commands/deploy/deploy_context.py @@ -17,7 +17,7 @@ import logging import os -from typing import Dict, List +from typing import Dict, List, Optional import boto3 import click @@ -32,6 +32,7 @@ ) from samcli.lib.deploy.deployer import Deployer from samcli.lib.iac.interface import Stack +from samcli.lib.intrinsic_resolver.intrinsics_symbol_table import IntrinsicsSymbolTable from samcli.lib.package.s3_uploader import S3Uploader from samcli.lib.providers.sam_stack_provider import SamLocalStackProvider from samcli.lib.utils.botoconfig import get_boto_config_with_user_agent @@ -82,6 +83,10 @@ def __init__( self.s3_prefix = s3_prefix self.kms_key_id = kms_key_id self.parameter_overrides = parameter_overrides + # Override certain CloudFormation pseudo-parameters based on values provided by customer + self.global_parameter_overrides: Optional[Dict] = None + if region: + self.global_parameter_overrides = {IntrinsicsSymbolTable.AWS_REGION: region} self.capabilities = capabilities self.no_execute_changeset = no_execute_changeset self.role_arn = role_arn @@ -217,6 +222,7 @@ def deploy( stacks, _ = SamLocalStackProvider.get_stacks( [iac_stack], parameter_overrides=sanitize_parameter_overrides(self.parameter_overrides), + global_parameter_overrides=self.global_parameter_overrides, normalize_resource_metadata=False, ) auth_required_per_resource = auth_per_resource(stacks) diff --git a/samcli/commands/deploy/guided_context.py b/samcli/commands/deploy/guided_context.py index e108b9e44b..50b6c5fd5d 100644 --- a/samcli/commands/deploy/guided_context.py +++ b/samcli/commands/deploy/guided_context.py @@ -6,12 +6,12 @@ from typing import Dict, Any, List import click -from botocore.session import get_session -from 
click.types import FuncParamType -from click import prompt from click import confirm +from click import prompt +from click.types import FuncParamType from samcli.commands._utils.options import _space_separated_list_func_type, DEFAULT_STACK_NAME +from samcli.commands.deploy.auth_utils import auth_per_resource from samcli.commands.deploy.code_signer_utils import ( signer_config_per_function, extract_profile_name_and_owner_from_existing, @@ -20,16 +20,17 @@ ) from samcli.commands.deploy.exceptions import GuidedDeployFailedError from samcli.commands.deploy.guided_config import GuidedConfig -from samcli.commands.deploy.auth_utils import auth_per_resource from samcli.commands.deploy.utils import sanitize_parameter_overrides -from samcli.lib.config.samconfig import DEFAULT_ENV, DEFAULT_CONFIG_FILE_NAME from samcli.lib.bootstrap.bootstrap import manage_stack +from samcli.lib.config.samconfig import DEFAULT_ENV, DEFAULT_CONFIG_FILE_NAME +from samcli.lib.intrinsic_resolver.intrinsics_symbol_table import IntrinsicsSymbolTable from samcli.lib.package.ecr_utils import is_ecr_url from samcli.lib.package.image_utils import tag_translation, NonLocalImageException, NoImageFoundException from samcli.lib.providers.provider import Stack from samcli.lib.providers.sam_function_provider import SamFunctionProvider from samcli.lib.providers.sam_stack_provider import SamLocalStackProvider from samcli.lib.utils.colors import Colored +from samcli.lib.utils.defaults import get_default_aws_region from samcli.lib.utils.packagetype import IMAGE LOG = logging.getLogger(__name__) @@ -120,7 +121,7 @@ def guided_prompts(self): """ default_stack_name = self.stack_name - default_region = self.region or get_session().get_config_variable("region") or "us-east-1" + default_region = self.region or get_default_aws_region() default_capabilities = self.capabilities[0] or ("CAPABILITY_IAM",) default_config_env = self.config_env or DEFAULT_ENV default_config_file = self.config_file or DEFAULT_CONFIG_FILE_NAME 
@@ -139,12 +140,15 @@ def guided_prompts(self): ) self._get_iac_stack(stack_name) region = prompt(f"\t{self.start_bold}AWS Region{self.end_bold}", default=default_region, type=click.STRING) + global_parameter_overrides = {IntrinsicsSymbolTable.AWS_REGION: region} parameter_override_keys = self._iac_stack.get_overrideable_parameters() input_parameter_overrides = self.prompt_parameters( parameter_override_keys, self.parameter_overrides_from_cmdline, self.start_bold, self.end_bold ) stacks, _ = SamLocalStackProvider.get_stacks( - [self._iac_stack], parameter_overrides=sanitize_parameter_overrides(input_parameter_overrides) + [self._iac_stack], + parameter_overrides=sanitize_parameter_overrides(input_parameter_overrides), + global_parameter_overrides=global_parameter_overrides, ) image_repositories = self.prompt_image_repository(stacks) @@ -326,7 +330,7 @@ def prompt_image_repository(self, stacks: List[Stack]): if isinstance(self.image_repositories, dict) else "" or self.image_repository, ) - if not is_ecr_url(image_repositories.get(resource_id)): + if resource_id not in image_repositories or not is_ecr_url(str(image_repositories[resource_id])): raise GuidedDeployFailedError( f"Invalid Image Repository ECR URI: {image_repositories.get(resource_id)}" ) diff --git a/samcli/commands/exceptions.py b/samcli/commands/exceptions.py index 7b8f253609..a27f4872cf 100644 --- a/samcli/commands/exceptions.py +++ b/samcli/commands/exceptions.py @@ -59,3 +59,22 @@ class ContainersInitializationException(UserException): """ Exception class when SAM is not able to initialize any of the lambda functions containers """ + + +class PipelineTemplateCloneException(UserException): + """ + Exception class when unable to download pipeline templates from a Git repository during `sam pipeline init` + """ + + +class AppPipelineTemplateManifestException(UserException): + """ + Exception class when SAM is not able to parse the "manifest.yaml" file located in the SAM pipeline templates + Git repo: 
"github.com/aws/aws-sam-cli-pipeline-init-templates.git + """ + + +class AppPipelineTemplateMetadataException(UserException): + """ + Exception class when SAM is not able to parse the "metadata.json" file located in the SAM pipeline templates + """ diff --git a/samcli/commands/init/__init__.py b/samcli/commands/init/__init__.py index 8eabc705f7..c7eedb9888 100644 --- a/samcli/commands/init/__init__.py +++ b/samcli/commands/init/__init__.py @@ -179,7 +179,7 @@ def wrapped(*args, **kwargs): default=None, help="Lambda Image of your app", cls=Mutex, - not_required=["location", "app_template", "runtime"], + not_required=["location", "runtime"], ) @click.option( "-d", @@ -198,7 +198,7 @@ def wrapped(*args, **kwargs): help="Identifier of the managed application template you want to use. " "If not sure, call 'sam init' without options for an interactive workflow.", cls=Mutex, - not_required=["location", "base_image"], + not_required=["location"], ) @click.option( "--no-input", @@ -278,7 +278,6 @@ def do_cli( extra_context, project_type, cdk_language, - auto_clone=True, ): """ Implementation of the ``cli`` method @@ -297,19 +296,20 @@ def do_cli( image_bool = name and pt_explicit and base_image if location or zip_bool or image_bool: # need to turn app_template into a location before we generate - templates = InitTemplates(no_interactive, auto_clone) + templates = InitTemplates(no_interactive) if package_type == IMAGE and image_bool: base_image, runtime = _get_runtime_from_image(base_image) options = templates.init_options( project_type, cdk_language, package_type, runtime, base_image, dependency_manager ) - if len(options) == 1: - app_template = options[0].get("appTemplate") - elif len(options) > 1: - raise LambdaImagesTemplateException( - "Multiple lambda image application templates found. " - "This should not be possible, please raise an issue." 
- ) + if not app_template: + if len(options) == 1: + app_template = options[0].get("appTemplate") + elif len(options) > 1: + raise LambdaImagesTemplateException( + "Multiple lambda image application templates found. " + "Please specify one using the --app-template parameter." + ) if app_template and not location: location = templates.location_from_app_template( diff --git a/samcli/commands/init/init_templates.py b/samcli/commands/init/init_templates.py index 4cd025593a..c20bb96c95 100644 --- a/samcli/commands/init/init_templates.py +++ b/samcli/commands/init/init_templates.py @@ -4,12 +4,8 @@ import itertools import json -import os import logging -import platform -import shutil -import subprocess - +import os from pathlib import Path from typing import Dict @@ -18,12 +14,13 @@ from samcli.cli.main import global_cfg from samcli.commands.exceptions import UserException, AppTemplateUpdateException from samcli.lib.iac.interface import ProjectTypes -from samcli.lib.utils import osutils -from samcli.lib.utils.osutils import rmtree_callback -from samcli.local.common.runtime_template import RUNTIME_DEP_TEMPLATE_MAPPING, get_local_lambda_images_location +from samcli.lib.utils.git_repo import GitRepo, CloneRepoException, CloneRepoUnstableStateException from samcli.lib.utils.packagetype import IMAGE +from samcli.local.common.runtime_template import RUNTIME_DEP_TEMPLATE_MAPPING, get_local_lambda_images_location LOG = logging.getLogger(__name__) +APP_TEMPLATES_REPO_URL = "https://github.com/aws/aws-sam-cli-app-templates" +APP_TEMPLATES_REPO_NAME = "aws-sam-cli-app-templates" class InvalidInitTemplateError(UserException): @@ -35,15 +32,10 @@ class CDKProjectInvalidInitConfigError(UserException): class InitTemplates: - def __init__(self, no_interactive=False, auto_clone=True): - self._repo_url = "https://github.com/aws/aws-sam-cli-app-templates" - self._repo_branch = "cdk-template" - self._repo_name = "aws-sam-cli-app-templates" - self._temp_repo_name = 
"TEMP-aws-sam-cli-app-templates" - self.repo_path = None - self.clone_attempted = False + def __init__(self, no_interactive=False): self._no_interactive = no_interactive - self._auto_clone = auto_clone + # TODO [melasmar] Remove branch after CDK templates become GA + self._git_repo: GitRepo = GitRepo(url=APP_TEMPLATES_REPO_URL, branch="cdk-template") def prompt_for_location(self, project_type, cdk_language, package_type, runtime, base_image, dependency_manager): """ @@ -99,7 +91,7 @@ def prompt_for_location(self, project_type, cdk_language, package_type, runtime, if template_md.get("init_location") is not None: return (template_md["init_location"], template_md["appTemplate"]) if template_md.get("directory") is not None: - return (os.path.join(self.repo_path, template_md["directory"]), template_md["appTemplate"]) + return os.path.join(self._git_repo.local_path, template_md["directory"]), template_md["appTemplate"] raise InvalidInitTemplateError("Invalid template. This should not be possible, please raise an issue.") def location_from_app_template( @@ -111,7 +103,7 @@ def location_from_app_template( if template.get("init_location") is not None: return template["init_location"] if template.get("directory") is not None: - return os.path.join(self.repo_path, template["directory"]) + return os.path.normpath(os.path.join(self._git_repo.local_path, template["directory"])) raise InvalidInitTemplateError("Invalid template. This should not be possible, please raise an issue.") except StopIteration as ex: msg = "Can't find application template " + app_template + " - check valid values in interactive init." 
@@ -124,9 +116,18 @@ def _check_app_template(entry: Dict, app_template: str) -> bool: return bool(entry["appTemplate"] == app_template) def init_options(self, project_type, cdk_language, package_type, runtime, base_image, dependency_manager): - if not self.clone_attempted: - self._clone_repo() - if self.repo_path is None: + if not self._git_repo.clone_attempted: + shared_dir: Path = global_cfg.config_dir + try: + self._git_repo.clone(clone_dir=shared_dir, clone_name=APP_TEMPLATES_REPO_NAME, replace_existing=True) + except CloneRepoUnstableStateException as ex: + raise AppTemplateUpdateException(str(ex)) from ex + except (OSError, CloneRepoException): + # If can't clone, try using an old clone from a previous run if already exist + expected_previous_clone_local_path: Path = shared_dir.joinpath(APP_TEMPLATES_REPO_NAME) + if expected_previous_clone_local_path.exists(): + self._git_repo.local_path = expected_previous_clone_local_path + if self._git_repo.local_path is None: return self._init_options_from_bundle(project_type, cdk_language, package_type, runtime, dependency_manager) return self._init_options_from_manifest( project_type, cdk_language, package_type, runtime, base_image, dependency_manager @@ -135,7 +136,7 @@ def init_options(self, project_type, cdk_language, package_type, runtime, base_i def _init_options_from_manifest( self, project_type, cdk_language, package_type, runtime, base_image, dependency_manager ): - manifest_path = os.path.join(self.repo_path, "manifest.json") + manifest_path = os.path.join(self._git_repo.local_path, "manifest.json") with open(str(manifest_path)) as fp: body = fp.read() manifest_body = json.loads(body) @@ -179,110 +180,6 @@ def _init_options_from_bundle(project_type, cdk_language, package_type, runtime, ) raise InvalidInitTemplateError(msg) - @staticmethod - def _shared_dir_check(shared_dir: Path) -> bool: - try: - shared_dir.mkdir(mode=0o700, parents=True, exist_ok=True) - return True - except OSError as ex: - 
LOG.warning("WARN: Unable to create shared directory.", exc_info=ex) - return False - - def _clone_repo(self): - if not self._auto_clone: - return # Unit test escape hatch - # check if we have templates stored already - shared_dir = global_cfg.config_dir - if not self._shared_dir_check(shared_dir): - # Nothing we can do if we can't access the shared config directory, use bundled. - return - expected_path = os.path.normpath(os.path.join(shared_dir, self._repo_name)) - if self._template_directory_exists(expected_path): - self._overwrite_existing_templates(expected_path) - else: - # simply create the app templates repo - self._clone_new_app_templates(shared_dir, expected_path) - self.clone_attempted = True - - def _overwrite_existing_templates(self, expected_path: str): - self.repo_path = expected_path - # workflow to clone a copy to a new directory and overwrite - with osutils.mkdir_temp(ignore_errors=True) as tempdir: - try: - expected_temp_path = os.path.normpath(os.path.join(tempdir, self._repo_name)) - LOG.info("\nCloning app templates from %s", self._repo_url) - subprocess.check_output( - # TODO: [UPDATEME] wchengru: We should remove --branch option when making CDK support GA. - [self._git_executable(), "clone", "--branch", self._repo_branch, self._repo_url, self._repo_name], - cwd=tempdir, - stderr=subprocess.STDOUT, - ) - # Now we need to delete the old repo and move this one. 
- self._replace_app_templates(expected_temp_path, expected_path) - self.repo_path = expected_path - except OSError as ex: - LOG.warning("WARN: Could not clone app template repo.", exc_info=ex) - except subprocess.CalledProcessError as clone_error: - output = clone_error.output.decode("utf-8") - if "not found" in output.lower(): - click.echo("WARN: Could not clone app template repo.") - - @staticmethod - def _replace_app_templates(temp_path: str, dest_path: str) -> None: - try: - LOG.debug("Removing old templates from %s", dest_path) - shutil.rmtree(dest_path, onerror=rmtree_callback) - LOG.debug("Copying templates from %s to %s", temp_path, dest_path) - shutil.copytree(temp_path, dest_path, ignore=shutil.ignore_patterns("*.git")) - except (OSError, shutil.Error) as ex: - # UNSTABLE STATE - # it's difficult to see how this scenario could happen except weird permissions, user will need to debug - raise AppTemplateUpdateException( - "Unstable state when updating app templates. " - "Check that you have permissions to create/delete files in the AWS SAM shared directory " - "or file an issue at https://github.com/awslabs/aws-sam-cli/issues" - ) from ex - - def _clone_new_app_templates(self, shared_dir, expected_path): - with osutils.mkdir_temp(ignore_errors=True) as tempdir: - expected_temp_path = os.path.normpath(os.path.join(tempdir, self._repo_name)) - try: - LOG.info("\nCloning app templates from %s", self._repo_url) - subprocess.check_output( - [self._git_executable(), "clone", self._repo_url], - cwd=tempdir, - stderr=subprocess.STDOUT, - ) - shutil.copytree(expected_temp_path, expected_path, ignore=shutil.ignore_patterns("*.git")) - self.repo_path = expected_path - except OSError as ex: - LOG.warning("WARN: Can't clone app repo, git executable not found", exc_info=ex) - except subprocess.CalledProcessError as clone_error: - output = clone_error.output.decode("utf-8") - if "not found" in output.lower(): - click.echo("WARN: Could not clone app template repo.") - - 
@staticmethod - def _template_directory_exists(expected_path: str) -> bool: - path = Path(expected_path) - return path.exists() - - @staticmethod - def _git_executable() -> str: - execname = "git" - if platform.system().lower() == "windows": - options = [execname, "{}.cmd".format(execname), "{}.exe".format(execname), "{}.bat".format(execname)] - else: - options = [execname] - for name in options: - try: - subprocess.Popen([name], stdout=subprocess.PIPE, stderr=subprocess.PIPE) - # No exception. Let's pick this - return name - except OSError as ex: - LOG.debug("Unable to find executable %s", name, exc_info=ex) - raise OSError("Cannot find git, was looking at executables: {}".format(options)) - def is_dynamic_schemas_template( self, project_type, cdk_language, package_type, app_template, runtime, base_image, dependency_manager ): diff --git a/samcli/commands/local/cli_common/invoke_context.py b/samcli/commands/local/cli_common/invoke_context.py index cad942a2fb..fead83114e 100644 --- a/samcli/commands/local/cli_common/invoke_context.py +++ b/samcli/commands/local/cli_common/invoke_context.py @@ -9,7 +9,7 @@ from pathlib import Path from typing import Dict, List, Optional, IO, cast, Tuple, Any -import samcli.lib.utils.osutils as osutils +from samcli.lib.utils import osutils from samcli.lib.iac.interface import IacPlugin, Project from samcli.lib.providers.provider import Stack, Function from samcli.lib.providers.sam_stack_provider import SamLocalStackProvider @@ -78,6 +78,8 @@ def __init__( warm_container_initialization_mode: Optional[str] = None, debug_function: Optional[str] = None, shutdown: bool = False, + container_host: Optional[str] = None, + container_host_interface: Optional[str] = None, ) -> None: """ Initialize the context @@ -124,6 +126,10 @@ def __init__( option is enabled shutdown bool Optional. If True, perform a SHUTDOWN event when tearing down containers. Default False. + container_host string + Optional. 
Host of locally emulated Lambda container + container_host_interface string + Optional. Interface that Docker host binds ports to """ self._template_file = template_file self._function_identifier = function_identifier @@ -151,6 +157,9 @@ def __init__( self._aws_profile = aws_profile self._shutdown = shutdown + self._container_host = container_host + self._container_host_interface = container_host_interface + self._containers_mode = ContainersMode.COLD self._containers_initializing_mode = ContainersInitializationMode.LAZY @@ -251,7 +260,9 @@ def _initialize_all_functions_containers(self) -> None: def initialize_function_container(function: Function) -> None: function_config = self.local_lambda_runner.get_invoke_config(function) - self.lambda_runtime.run(None, function_config, self._debug_context) + self.lambda_runtime.run( + None, function_config, self._debug_context, self._container_host, self._container_host_interface + ) try: async_context = AsyncContext() @@ -335,6 +346,8 @@ def local_lambda_runner(self) -> LocalLambdaRunner: aws_region=self._aws_region, env_vars_values=self._env_vars_value, debug_context=self._debug_context, + container_host=self._container_host, + container_host_interface=self._container_host_interface, ) return self._local_lambda_runner diff --git a/samcli/commands/local/cli_common/options.py b/samcli/commands/local/cli_common/options.py index e5d19413c1..528c5772d3 100644 --- a/samcli/commands/local/cli_common/options.py +++ b/samcli/commands/local/cli_common/options.py @@ -48,7 +48,23 @@ def local_common_options(f): default=False, help="If set, will emulate a shutdown event after the invoke completes, " "in order to test extension handling of shutdown behavior.", - ) + ), + click.option( + "--container-host", + default="localhost", + show_default=True, + help="Host of locally emulated Lambda container. " + "This option is useful when the container runs on a different host than SAM CLI. 
" + "For example, if you want to run SAM CLI in a Docker container on macOS, " + "use this option with host.docker.internal", + ), + click.option( + "--container-host-interface", + default="127.0.0.1", + show_default=True, + help="IP address of the host network interface that container ports should bind to. " + "Use 0.0.0.0 to bind to all interfaces.", + ), ] # Reverse the list to maintain ordering of options in help text printed with --help diff --git a/samcli/commands/local/generate_event/event_generation.py b/samcli/commands/local/generate_event/event_generation.py index 68571afef8..bd053a87ab 100644 --- a/samcli/commands/local/generate_event/event_generation.py +++ b/samcli/commands/local/generate_event/event_generation.py @@ -6,7 +6,7 @@ import click -import samcli.lib.generated_sample_events.events as events +from samcli.lib.generated_sample_events import events from samcli.cli.cli_config_file import TomlProvider, configuration_option from samcli.cli.options import debug_option from samcli.lib.telemetry.metric import track_command diff --git a/samcli/commands/local/invoke/cli.py b/samcli/commands/local/invoke/cli.py index 7ae8364a38..01c7ac808b 100644 --- a/samcli/commands/local/invoke/cli.py +++ b/samcli/commands/local/invoke/cli.py @@ -80,6 +80,8 @@ def cli( parameter_overrides, config_file, config_env, + container_host, + container_host_interface, cdk_app, cdk_context, project_type: str, @@ -109,6 +111,8 @@ def cli( force_image_build, shutdown, parameter_overrides, + container_host, + container_host_interface, project_type, iac, project, @@ -134,6 +138,8 @@ def do_cli( # pylint: disable=R0914 force_image_build, shutdown, parameter_overrides, + container_host, + container_host_interface, project_type: str, iac: IacPlugin, project: Project, @@ -179,6 +185,8 @@ def do_cli( # pylint: disable=R0914 aws_region=ctx.region, aws_profile=ctx.profile, shutdown=shutdown, + container_host=container_host, + container_host_interface=container_host_interface, iac=iac, 
project=project, ) as context: diff --git a/samcli/commands/local/lib/local_lambda.py b/samcli/commands/local/lib/local_lambda.py index ef54b00984..ddc0bebafc 100644 --- a/samcli/commands/local/lib/local_lambda.py +++ b/samcli/commands/local/lib/local_lambda.py @@ -43,6 +43,8 @@ def __init__( aws_region: Optional[str] = None, env_vars_values: Optional[Dict[Any, Any]] = None, debug_context: Optional[DebugContext] = None, + container_host: Optional[str] = None, + container_host_interface: Optional[str] = None, ) -> None: """ Initializes the class @@ -55,6 +57,8 @@ def __init__( :param string aws_region: Optional. AWS Region to use. :param dict env_vars_values: Optional. Dictionary containing values of environment variables. :param DebugContext debug_context: Optional. Debug context for the function (includes port, args, and path). + :param string container_host: Optional. Host of locally emulated Lambda container + :param string container_host_interface: Optional. Interface that Docker host binds ports to """ self.local_runtime = local_runtime @@ -66,6 +70,8 @@ def __init__( self.debug_context = debug_context self._boto3_session_creds: Optional[Dict[str, str]] = None self._boto3_region: Optional[str] = None + self.container_host = container_host + self.container_host_interface = container_host_interface def invoke( self, @@ -120,7 +126,15 @@ def invoke( # Invoke the function try: - self.local_runtime.invoke(config, event, debug_context=self.debug_context, stdout=stdout, stderr=stderr) + self.local_runtime.invoke( + config, + event, + debug_context=self.debug_context, + stdout=stdout, + stderr=stderr, + container_host=self.container_host, + container_host_interface=self.container_host_interface, + ) except ContainerResponseException: # NOTE(sriram-mv): This should still result in a exit code zero to avoid regressions. 
LOG.info("No response from invoke container for %s", function.name) diff --git a/samcli/commands/local/start_api/cli.py b/samcli/commands/local/start_api/cli.py index f1947038b3..3c88668775 100644 --- a/samcli/commands/local/start_api/cli.py +++ b/samcli/commands/local/start_api/cli.py @@ -88,6 +88,8 @@ def cli( warm_containers, shutdown, debug_function, + container_host, + container_host_interface, project_type, cdk_context, cdk_app, @@ -119,6 +121,8 @@ def cli( warm_containers, shutdown, debug_function, + container_host, + container_host_interface, project_type, iac, project, @@ -146,6 +150,8 @@ def do_cli( # pylint: disable=R0914 warm_containers, shutdown, debug_function, + container_host, + container_host_interface, project_type, iac, project, @@ -189,6 +195,8 @@ def do_cli( # pylint: disable=R0914 warm_container_initialization_mode=warm_containers, debug_function=debug_function, shutdown=shutdown, + container_host=container_host, + container_host_interface=container_host_interface, iac=iac, project=project, ) as invoke_context: diff --git a/samcli/commands/local/start_lambda/cli.py b/samcli/commands/local/start_lambda/cli.py index 6ff7af27fd..f4974fad77 100644 --- a/samcli/commands/local/start_lambda/cli.py +++ b/samcli/commands/local/start_lambda/cli.py @@ -100,6 +100,8 @@ def cli( warm_containers, shutdown, debug_function, + container_host, + container_host_interface, cdk_context, project_type, cdk_app, @@ -130,6 +132,8 @@ def cli( warm_containers, shutdown, debug_function, + container_host, + container_host_interface, iac, project, ) # pragma: no cover @@ -155,6 +159,8 @@ def do_cli( # pylint: disable=R0914 warm_containers, shutdown, debug_function, + container_host, + container_host_interface, iac: IacPlugin, project: Project, ): @@ -196,6 +202,8 @@ def do_cli( # pylint: disable=R0914 warm_container_initialization_mode=warm_containers, debug_function=debug_function, shutdown=shutdown, + container_host=container_host, + 
container_host_interface=container_host_interface, iac=iac, project=project, ) as invoke_context: diff --git a/samcli/commands/logs/command.py b/samcli/commands/logs/command.py index 03723c08bc..7042970a3a 100644 --- a/samcli/commands/logs/command.py +++ b/samcli/commands/logs/command.py @@ -111,24 +111,13 @@ def do_cli(function_name, stack_name, filter_pattern, tailing, start_time, end_t filter_pattern=filter_pattern, start_time=start_time, end_time=end_time, - # output_file is not yet supported by CLI - output_file=None, ) as context: if tailing: - events_iterable = context.fetcher.tail( - context.log_group_name, filter_pattern=context.filter_pattern, start=context.start_time - ) + context.fetcher.tail(start_time=context.start_time, filter_pattern=context.filter_pattern) else: - events_iterable = context.fetcher.fetch( - context.log_group_name, + context.fetcher.load_time_period( + start_time=context.start_time, + end_time=context.end_time, filter_pattern=context.filter_pattern, - start=context.start_time, - end=context.end_time, ) - - formatted_events = context.formatter.do_format(events_iterable) - - for event in formatted_events: - # New line is not necessary. 
It is already in the log events sent by CloudWatch - click.echo(event, nl=False) diff --git a/samcli/commands/logs/console_consumers.py b/samcli/commands/logs/console_consumers.py new file mode 100644 index 0000000000..2f77e34ab0 --- /dev/null +++ b/samcli/commands/logs/console_consumers.py @@ -0,0 +1,18 @@ +""" +Consumers that will print out events to console +""" + +import click + +from samcli.lib.observability.cw_logs.cw_log_event import CWLogEvent +from samcli.lib.observability.observability_info_puller import ObservabilityEventConsumer + + +class CWConsoleEventConsumer(ObservabilityEventConsumer[CWLogEvent]): + """ + Consumer implementation that writes each consumed event to the console + """ + + # pylint: disable=R0201 + def consume(self, event: CWLogEvent): + click.echo(event.message, nl=False) diff --git a/samcli/commands/logs/logs_context.py b/samcli/commands/logs/logs_context.py index 668cffb66d..5504895a70 100644 --- a/samcli/commands/logs/logs_context.py +++ b/samcli/commands/logs/logs_context.py @@ -3,13 +3,21 @@ """ import logging + import boto3 import botocore from samcli.commands.exceptions import UserException -from samcli.lib.logs.fetcher import LogsFetcher -from samcli.lib.logs.formatter import LogsFormatter, LambdaLogMsgFormatters, JSONMsgFormatter, KeywordHighlighter -from samcli.lib.logs.provider import LogGroupProvider +from samcli.commands.logs.console_consumers import CWConsoleEventConsumer +from samcli.lib.observability.cw_logs.cw_log_formatters import ( + CWColorizeErrorsFormatter, + CWJsonFormatter, + CWKeywordHighlighterFormatter, + CWPrettyPrintFormatter, +) +from samcli.lib.observability.cw_logs.cw_log_group_provider import LogGroupProvider +from samcli.lib.observability.cw_logs.cw_log_puller import CWLogPuller +from samcli.lib.observability.observability_info_puller import ObservabilityEventConsumerDecorator from samcli.lib.utils.colors import Colored from samcli.lib.utils.time import to_utc, parse_date @@ -97,26 +105,20 @@
def __exit__(self, *args): @property def fetcher(self): - return LogsFetcher(self._logs_client) - - @property - def formatter(self): - """ - Creates and returns a Formatter capable of nicely formatting Lambda function logs - - Returns - ------- - LogsFormatter - """ - formatter_chain = [ - LambdaLogMsgFormatters.colorize_errors, - # Format JSON "before" highlighting the keywords. Otherwise, JSON will be invalid from all the - # ANSI color codes and fail to pretty print - JSONMsgFormatter.format_json, - KeywordHighlighter(self._filter_pattern).highlight_keywords, - ] - - return LogsFormatter(self.colored, formatter_chain) + return CWLogPuller( + logs_client=self._logs_client, + consumer=ObservabilityEventConsumerDecorator( + mappers=[ + CWColorizeErrorsFormatter(self.colored), + CWJsonFormatter(), + CWKeywordHighlighterFormatter(self.colored, self._filter_pattern), + CWPrettyPrintFormatter(self.colored), + ], + consumer=CWConsoleEventConsumer(), + ), + cw_log_group=self.log_group_name, + resource_name=self._function_name, + ) @property def start_time(self): diff --git a/samcli/commands/package/command.py b/samcli/commands/package/command.py index 33f242648c..d5e730e744 100644 --- a/samcli/commands/package/command.py +++ b/samcli/commands/package/command.py @@ -116,7 +116,9 @@ def resources_and_properties_help_string(): "--resolve-s3", required=False, is_flag=True, - help="Automatically resolve s3 bucket for non-guided deployments." + help="Automatically resolve s3 bucket for non-guided deployments. " + "Enabling this option will also create a managed default s3 bucket for you. " + "If you do not provide a --s3-bucket value, the managed bucket will be used. 
" "Do not use --s3-guided parameter with this option.", ) @click.option("--stack-name", required=False, help="The stack name to package") diff --git a/samcli/commands/package/exceptions.py b/samcli/commands/package/exceptions.py index a650f62843..af549058e9 100644 --- a/samcli/commands/package/exceptions.py +++ b/samcli/commands/package/exceptions.py @@ -124,7 +124,8 @@ class BucketNotSpecifiedError(UserException): def __init__(self, **kwargs): self.kwargs = kwargs - message_fmt = "\nS3 Bucket not specified, use --s3-bucket to specify a bucket name or run sam deploy --guided" + message_fmt = "\nS3 Bucket not specified, use --s3-bucket to specify a bucket name, or use --resolve-s3 \ +to create a managed default bucket, or run sam deploy --guided" super().__init__(message=message_fmt.format(**self.kwargs)) diff --git a/samcli/lib/logs/__init__.py b/samcli/commands/pipeline/__init__.py similarity index 100% rename from samcli/lib/logs/__init__.py rename to samcli/commands/pipeline/__init__.py diff --git a/tests/unit/lib/logs/__init__.py b/samcli/commands/pipeline/bootstrap/__init__.py similarity index 100% rename from tests/unit/lib/logs/__init__.py rename to samcli/commands/pipeline/bootstrap/__init__.py diff --git a/samcli/commands/pipeline/bootstrap/cli.py b/samcli/commands/pipeline/bootstrap/cli.py new file mode 100644 index 0000000000..4c32ebc9b3 --- /dev/null +++ b/samcli/commands/pipeline/bootstrap/cli.py @@ -0,0 +1,245 @@ +""" +CLI command for "pipeline bootstrap", which sets up the required pipeline infrastructure resources +""" +import os +from textwrap import dedent +from typing import Any, Dict, List, Optional + +import click + +from samcli.cli.cli_config_file import configuration_option, TomlProvider +from samcli.cli.main import pass_context, common_options, aws_creds_options, print_cmdline_args +from samcli.lib.config.samconfig import SamConfig +from samcli.lib.pipeline.bootstrap.stage import Stage +from samcli.lib.telemetry.metric import track_command
+from samcli.lib.utils.colors import Colored +from samcli.lib.utils.version_checker import check_newer_version +from .guided_context import GuidedContext +from ..external_links import CONFIG_AWS_CRED_ON_CICD_URL + +SHORT_HELP = "Generates the required AWS resources to connect your CI/CD system." + +HELP_TEXT = """ +This command generates the required AWS infrastructure resources to connect to your CI/CD system. +This step must be run for each deployment stage in your pipeline, prior to running the sam pipeline init command. +""" + +PIPELINE_CONFIG_DIR = os.path.join(".aws-sam", "pipeline") +PIPELINE_CONFIG_FILENAME = "pipelineconfig.toml" + + +@click.command("bootstrap", short_help=SHORT_HELP, help=HELP_TEXT, context_settings=dict(max_content_width=120)) +@configuration_option(provider=TomlProvider(section="parameters")) +@click.option( + "--interactive/--no-interactive", + is_flag=True, + default=True, + help="Disable interactive prompting for bootstrap parameters, and fail if any required arguments are missing.", +) +@click.option( + "--stage", + help="The name of the corresponding deployment stage. " + "It is used as a suffix for the created AWS infrastructure resources.", + required=False, +) +@click.option( + "--pipeline-user", + help="The Amazon Resource Name (ARN) of the IAM user having its access key ID and secret access key " + "shared with the CI/CD system. It is used to grant this IAM user permission to access the " + "corresponding AWS account. If not provided, the command will create one along with the access " + "key ID and secret access key credentials.", + required=False, +) +@click.option( + "--pipeline-execution-role", + help="The ARN of the IAM role to be assumed by the pipeline user to operate on this stage.
" + "Provide it only if you want to use your own role, otherwise this command will create one.", + required=False, +) +@click.option( + "--cloudformation-execution-role", + help="The ARN of the IAM role to be assumed by the AWS CloudFormation service while deploying the " + "application's stack. Provide only if you want to use your own role, otherwise the command will create one.", + required=False, +) +@click.option( + "--bucket", + help="The ARN of the Amazon S3 bucket to hold the AWS SAM artifacts.", + required=False, +) +@click.option( + "--create-image-repository/--no-create-image-repository", + is_flag=True, + default=False, + help="If set to true and no ECR image repository is provided, this command will create an ECR image repository " + "to hold the container images of Lambda functions having an Image package type.", +) +@click.option( + "--image-repository", + help="The ARN of an Amazon ECR image repository to hold the container images of Lambda functions or " + "layers that have a package type of Image. If provided, the --create-image-repository option is ignored.
" + "If not provided and --create-image-repository is specified, the command will create one.", + required=False, +) +@click.option( + "--confirm-changeset/--no-confirm-changeset", + default=True, + is_flag=True, + help="Prompt to confirm if the resources are to be deployed.", +) +@common_options +@aws_creds_options +@pass_context +@track_command +@check_newer_version +@print_cmdline_args +def cli( + ctx: Any, + interactive: bool, + stage: Optional[str], + pipeline_user: Optional[str], + pipeline_execution_role: Optional[str], + cloudformation_execution_role: Optional[str], + bucket: Optional[str], + create_image_repository: bool, + image_repository: Optional[str], + confirm_changeset: bool, + config_file: Optional[str], + config_env: Optional[str], +) -> None: + """ + `sam pipeline bootstrap` command entry point + """ + do_cli( + region=ctx.region, + profile=ctx.profile, + interactive=interactive, + stage_name=stage, + pipeline_user_arn=pipeline_user, + pipeline_execution_role_arn=pipeline_execution_role, + cloudformation_execution_role_arn=cloudformation_execution_role, + artifacts_bucket_arn=bucket, + create_image_repository=create_image_repository, + image_repository_arn=image_repository, + confirm_changeset=confirm_changeset, + config_file=config_file, + config_env=config_env, + ) # pragma: no cover + + +def do_cli( + region: Optional[str], + profile: Optional[str], + interactive: bool, + stage_name: Optional[str], + pipeline_user_arn: Optional[str], + pipeline_execution_role_arn: Optional[str], + cloudformation_execution_role_arn: Optional[str], + artifacts_bucket_arn: Optional[str], + create_image_repository: bool, + image_repository_arn: Optional[str], + confirm_changeset: bool, + config_file: Optional[str], + config_env: Optional[str], + standalone: bool = True, +) -> None: + """ + implementation of `sam pipeline bootstrap` command + """ + if not pipeline_user_arn: + pipeline_user_arn = _load_saved_pipeline_user_arn() + + if interactive: + if standalone: +
click.echo( + dedent( + """\ + + sam pipeline bootstrap generates the required AWS infrastructure resources to connect + to your CI/CD system. This step must be run for each deployment stage in your pipeline, + prior to running the sam pipeline init command. + + We will ask for [1] stage definition, [2] account details, and + [3] references to existing resources in order to bootstrap these pipeline resources. + """ + ), + ) + + guided_context = GuidedContext( + profile=profile, + stage_name=stage_name, + pipeline_user_arn=pipeline_user_arn, + pipeline_execution_role_arn=pipeline_execution_role_arn, + cloudformation_execution_role_arn=cloudformation_execution_role_arn, + artifacts_bucket_arn=artifacts_bucket_arn, + create_image_repository=create_image_repository, + image_repository_arn=image_repository_arn, + region=region, + ) + guided_context.run() + stage_name = guided_context.stage_name + pipeline_user_arn = guided_context.pipeline_user_arn + pipeline_execution_role_arn = guided_context.pipeline_execution_role_arn + cloudformation_execution_role_arn = guided_context.cloudformation_execution_role_arn + artifacts_bucket_arn = guided_context.artifacts_bucket_arn + create_image_repository = guided_context.create_image_repository + image_repository_arn = guided_context.image_repository_arn + region = guided_context.region + profile = guided_context.profile + + if not stage_name: + raise click.UsageError("Missing required parameter '--stage'") + + environment: Stage = Stage( + name=stage_name, + aws_profile=profile, + aws_region=region, + pipeline_user_arn=pipeline_user_arn, + pipeline_execution_role_arn=pipeline_execution_role_arn, + cloudformation_execution_role_arn=cloudformation_execution_role_arn, + artifacts_bucket_arn=artifacts_bucket_arn, + create_image_repository=create_image_repository, + image_repository_arn=image_repository_arn, + ) + + bootstrapped: bool = environment.bootstrap(confirm_changeset=confirm_changeset) + + if bootstrapped: + 
environment.print_resources_summary() + + environment.save_config_safe( + config_dir=PIPELINE_CONFIG_DIR, filename=PIPELINE_CONFIG_FILENAME, cmd_names=_get_bootstrap_command_names() + ) + + click.secho( + dedent( + f"""\ + View the definition in {os.path.join(PIPELINE_CONFIG_DIR, PIPELINE_CONFIG_FILENAME)}, + run sam pipeline bootstrap to generate another set of resources, or proceed to + sam pipeline init to create your pipeline configuration file. + """ + ) + ) + + if not environment.pipeline_user.is_user_provided: + click.secho( + dedent( + f"""\ + Before running {Colored().bold("sam pipeline init")}, we recommend first setting up AWS credentials + in your CI/CD account. Read more about how to do so with your provider in + {CONFIG_AWS_CRED_ON_CICD_URL}. + """ + ) + ) + + +def _load_saved_pipeline_user_arn() -> Optional[str]: + samconfig: SamConfig = SamConfig(config_dir=PIPELINE_CONFIG_DIR, filename=PIPELINE_CONFIG_FILENAME) + if not samconfig.exists(): + return None + config: Dict[str, str] = samconfig.get_all(cmd_names=_get_bootstrap_command_names(), section="parameters") + return config.get("pipeline_user") + + +def _get_bootstrap_command_names() -> List[str]: + return ["pipeline", "bootstrap"] diff --git a/samcli/commands/pipeline/bootstrap/guided_context.py b/samcli/commands/pipeline/bootstrap/guided_context.py new file mode 100644 index 0000000000..a7f1f89b08 --- /dev/null +++ b/samcli/commands/pipeline/bootstrap/guided_context.py @@ -0,0 +1,249 @@ +""" +An interactive flow that prompts the user for required information to bootstrap the AWS account of an environment +with the required infrastructure +""" +import os +import sys +from textwrap import dedent +from typing import Optional, List, Tuple, Callable + +import click +from botocore.credentials import EnvProvider + +from samcli.commands.exceptions import CredentialsError +from samcli.commands.pipeline.external_links import CONFIG_AWS_CRED_DOC_URL +from samcli.lib.bootstrap.bootstrap import
get_current_account_id +from samcli.lib.utils.colors import Colored + +from samcli.lib.utils.defaults import get_default_aws_region +from samcli.lib.utils.profile import list_available_profiles + + +class GuidedContext: + def __init__( + self, + profile: Optional[str] = None, + stage_name: Optional[str] = None, + pipeline_user_arn: Optional[str] = None, + pipeline_execution_role_arn: Optional[str] = None, + cloudformation_execution_role_arn: Optional[str] = None, + artifacts_bucket_arn: Optional[str] = None, + create_image_repository: bool = False, + image_repository_arn: Optional[str] = None, + region: Optional[str] = None, + ) -> None: + self.profile = profile + self.stage_name = stage_name + self.pipeline_user_arn = pipeline_user_arn + self.pipeline_execution_role_arn = pipeline_execution_role_arn + self.cloudformation_execution_role_arn = cloudformation_execution_role_arn + self.artifacts_bucket_arn = artifacts_bucket_arn + self.create_image_repository = create_image_repository + self.image_repository_arn = image_repository_arn + self.region = region + self.color = Colored() + + def _prompt_account_id(self) -> None: + profiles = list_available_profiles() + click.echo("The following AWS credential sources are available to use:") + click.echo( + dedent( + f"""\ + To learn more about configuring AWS credentials, visit the link below: + {CONFIG_AWS_CRED_DOC_URL}\ + """ + ) + ) + has_env_creds = os.getenv(EnvProvider.ACCESS_KEY) and os.getenv(EnvProvider.SECRET_KEY) + click.echo(f"\t1 - Environment variables{' (not available)' if not has_env_creds else ''}") + for i, profile in enumerate(profiles): + click.echo(f"\t{i + 2} - {profile} (named profile)") + click.echo("\tq - Quit and configure AWS credentials") + answer = click.prompt( + "Select a credential source to associate with this stage", + show_choices=False, + show_default=False, + type=click.Choice((["1"] if has_env_creds else []) + [str(i + 2) for i in range(len(profiles))] + ["q"]), + ) + if answer == "q":
+ sys.exit(0) + elif answer == "1": + # by default, env variable has higher precedence + # https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-envvars.html#envvars-list + self.profile = None + else: + self.profile = profiles[int(answer) - 2] + + try: + account_id = get_current_account_id(self.profile) + click.echo(self.color.green(f"Associated account {account_id} with stage {self.stage_name}.")) + except CredentialsError as ex: + click.echo(f"{self.color.red(ex.message)}\n") + self._prompt_account_id() + + def _prompt_stage_name(self) -> None: + click.echo( + "Enter a name for this stage. This will be referenced later when you use the sam pipeline init command:" + ) + self.stage_name = click.prompt( + "Stage name", + default=self.stage_name, + type=click.STRING, + ) + + def _prompt_region_name(self) -> None: + self.region = click.prompt( + "Enter the region in which you want these resources to be created", + type=click.STRING, + default=get_default_aws_region(), + ) + + def _prompt_pipeline_user(self) -> None: + self.pipeline_user_arn = click.prompt( + "Enter the pipeline IAM user ARN if you have previously created one, or we will create one for you", + default="", + type=click.STRING, + ) + + def _prompt_pipeline_execution_role(self) -> None: + self.pipeline_execution_role_arn = click.prompt( + "Enter the pipeline execution role ARN if you have previously created one, " + "or we will create one for you", + default="", + type=click.STRING, + ) + + def _prompt_cloudformation_execution_role(self) -> None: + self.cloudformation_execution_role_arn = click.prompt( + "Enter the CloudFormation execution role ARN if you have previously created one, " + "or we will create one for you", + default="", + type=click.STRING, + ) + + def _prompt_artifacts_bucket(self) -> None: + self.artifacts_bucket_arn = click.prompt( + "Please enter the artifact bucket ARN for your Lambda function. 
" + "If you do not have a bucket, we will create one for you", + default="", + type=click.STRING, + ) + + def _prompt_image_repository(self) -> None: + if click.confirm("Does your application contain any IMAGE type Lambda functions?"): + self.image_repository_arn = click.prompt( + "Please enter the ECR image repository ARN(s) for your Image type function(s). " + "If you do not yet have a repository, we will create one for you", + default="", + type=click.STRING, + ) + self.create_image_repository = not bool(self.image_repository_arn) + else: + self.create_image_repository = False + + def _get_user_inputs(self) -> List[Tuple[str, Callable[[], None]]]: + return [ + (f"Account: {get_current_account_id(self.profile)}", self._prompt_account_id), + (f"Stage name: {self.stage_name}", self._prompt_stage_name), + (f"Region: {self.region}", self._prompt_region_name), + ( + f"Pipeline user ARN: {self.pipeline_user_arn}" + if self.pipeline_user_arn + else "Pipeline user: [to be created]", + self._prompt_pipeline_user, + ), + ( + f"Pipeline execution role ARN: {self.pipeline_execution_role_arn}" + if self.pipeline_execution_role_arn + else "Pipeline execution role: [to be created]", + self._prompt_pipeline_execution_role, + ), + ( + f"CloudFormation execution role ARN: {self.cloudformation_execution_role_arn}" + if self.cloudformation_execution_role_arn + else "CloudFormation execution role: [to be created]", + self._prompt_cloudformation_execution_role, + ), + ( + f"Artifacts bucket ARN: {self.artifacts_bucket_arn}" + if self.artifacts_bucket_arn + else "Artifacts bucket: [to be created]", + self._prompt_artifacts_bucket, + ), + ( + f"ECR image repository ARN: {self.image_repository_arn}" + if self.image_repository_arn + else f"ECR image repository: [{'to be created' if self.create_image_repository else 'skipped'}]", + self._prompt_image_repository, + ), + ] + + def run(self) -> None: # pylint: disable=too-many-branches + """ + Runs an interactive questionnaire to prompt the
user for the ARNs of the AWS resources (infrastructure) required + for the pipeline to work. Users can provide all, none or some resources' ARNs and leave the remaining empty + and they will be created by the bootstrap command + """ + click.secho(self.color.bold("[1] Stage definition")) + if self.stage_name: + click.echo(f"Stage name: {self.stage_name}") + else: + self._prompt_stage_name() + click.echo() + + click.secho(self.color.bold("[2] Account details")) + self._prompt_account_id() + click.echo() + + if not self.region: + self._prompt_region_name() + + if self.pipeline_user_arn: + click.echo(f"Pipeline IAM user ARN: {self.pipeline_user_arn}") + else: + self._prompt_pipeline_user() + click.echo() + + click.secho(self.color.bold("[3] Reference application build resources")) + + if self.pipeline_execution_role_arn: + click.echo(f"Pipeline execution role ARN: {self.pipeline_execution_role_arn}") + else: + self._prompt_pipeline_execution_role() + + if self.cloudformation_execution_role_arn: + click.echo(f"CloudFormation execution role ARN: {self.cloudformation_execution_role_arn}") + else: + self._prompt_cloudformation_execution_role() + + if self.artifacts_bucket_arn: + click.echo(f"Artifacts bucket ARN: {self.artifacts_bucket_arn}") + else: + self._prompt_artifacts_bucket() + + if self.image_repository_arn: + click.echo(f"ECR image repository ARN: {self.image_repository_arn}") + else: + self._prompt_image_repository() + click.echo() + + # Ask customers to confirm the inputs + click.secho(self.color.bold("[4] Summary")) + while True: + inputs = self._get_user_inputs() + click.secho("Below is the summary of the answers:") + for i, (text, _) in enumerate(inputs): + click.secho(f"\t{i + 1} - {text}") + edit_input = click.prompt( + text="Press enter to confirm the values above, or select an item to edit the value", + default="0", + show_choices=False, + show_default=False, + type=click.Choice(["0"] + [str(i + 1) for i in range(len(inputs))]), + ) +
click.echo() + if int(edit_input): + inputs[int(edit_input) - 1][1]() + click.echo() + else: + break diff --git a/samcli/commands/pipeline/external_links.py b/samcli/commands/pipeline/external_links.py new file mode 100644 index 0000000000..77301ebb1b --- /dev/null +++ b/samcli/commands/pipeline/external_links.py @@ -0,0 +1,8 @@ +""" +The module to store external links. Put them in a centralized place so that we can verify their +validity automatically. +""" +CONFIG_AWS_CRED_DOC_URL = "https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-files.html" + +_SAM_DOC_PREFIX = "https://docs.aws.amazon.com/serverless-application-model/latest/developerguide" +CONFIG_AWS_CRED_ON_CICD_URL = _SAM_DOC_PREFIX + "/serverless-generating-example-ci-cd-others.html" diff --git a/samcli/commands/pipeline/init/__init__.py b/samcli/commands/pipeline/init/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/samcli/commands/pipeline/init/cli.py b/samcli/commands/pipeline/init/cli.py new file mode 100644 index 0000000000..a7223398c9 --- /dev/null +++ b/samcli/commands/pipeline/init/cli.py @@ -0,0 +1,51 @@ +""" +CLI command for "pipeline init" command +""" +from typing import Any, Optional + +import click + +from samcli.cli.cli_config_file import configuration_option, TomlProvider +from samcli.cli.main import pass_context, common_options as cli_framework_options +from samcli.commands.pipeline.init.interactive_init_flow import InteractiveInitFlow +from samcli.lib.telemetry.metric import track_command + +SHORT_HELP = "Generates a CI/CD pipeline configuration file." +HELP_TEXT = """ +This command generates a pipeline configuration file that your CI/CD system can use to deploy +serverless applications using AWS SAM. + +Before using sam pipeline init, you must bootstrap the necessary resources for each stage in your pipeline. 
+You can do this by running sam pipeline init --bootstrap to be guided through the setup and configuration +file generation process, or refer to resources you have previously created with the sam pipeline bootstrap command. +""" + + +@click.command("init", help=HELP_TEXT, short_help=SHORT_HELP) +@configuration_option(provider=TomlProvider(section="parameters")) +@click.option( + "--bootstrap", + is_flag=True, + default=False, + help="Enable interactive mode that walks the user through creating necessary AWS infrastructure resources.", +) +@cli_framework_options +@pass_context +@track_command # pylint: disable=R0914 +def cli(ctx: Any, config_env: Optional[str], config_file: Optional[str], bootstrap: bool) -> None: + """ + `sam pipeline init` command entry point + """ + + # Currently we support interactive mode only, i.e. the user doesn't provide the required arguments during the call, + # so we call do_cli without any arguments. This will change after supporting the non-interactive mode. + do_cli(bootstrap) + + +def do_cli(bootstrap: bool) -> None: + """ + implementation of `sam pipeline init` command + """ + # TODO non-interactive mode + init_flow = InteractiveInitFlow(bootstrap) + init_flow.do_interactive() diff --git a/samcli/commands/pipeline/init/interactive_init_flow.py b/samcli/commands/pipeline/init/interactive_init_flow.py new file mode 100644 index 0000000000..d4e989ebfa --- /dev/null +++ b/samcli/commands/pipeline/init/interactive_init_flow.py @@ -0,0 +1,482 @@ +""" +Interactive flow that prompts the user for a pipeline template (cookiecutter template) and uses it to generate a +pipeline configuration file +""" +import json +import logging +import os +from json import JSONDecodeError +from pathlib import Path +from textwrap import dedent +from typing import Dict, List, Tuple + +import click + +from samcli.cli.main import global_cfg +from samcli.commands.exceptions import ( + AppPipelineTemplateMetadataException, + PipelineTemplateCloneException, +) +from
samcli.lib.config.samconfig import SamConfig +from samcli.lib.cookiecutter.interactive_flow import InteractiveFlow +from samcli.lib.cookiecutter.interactive_flow_creator import InteractiveFlowCreator +from samcli.lib.cookiecutter.question import Choice +from samcli.lib.cookiecutter.template import Template +from samcli.lib.utils import osutils +from samcli.lib.utils.colors import Colored +from samcli.lib.utils.git_repo import GitRepo, CloneRepoException +from .pipeline_templates_manifest import Provider, PipelineTemplateMetadata, PipelineTemplatesManifest +from ..bootstrap.cli import ( + do_cli as do_bootstrap, + PIPELINE_CONFIG_DIR, + PIPELINE_CONFIG_FILENAME, + _get_bootstrap_command_names, +) + +LOG = logging.getLogger(__name__) +shared_path: Path = global_cfg.config_dir +APP_PIPELINE_TEMPLATES_REPO_URL = "https://github.com/aws/aws-sam-cli-pipeline-init-templates.git" +APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME = "aws-sam-cli-app-pipeline-templates" +CUSTOM_PIPELINE_TEMPLATE_REPO_LOCAL_NAME = "custom-pipeline-template" +SAM_PIPELINE_TEMPLATE_SOURCE = "AWS Quick Start Pipeline Templates" +CUSTOM_PIPELINE_TEMPLATE_SOURCE = "Custom Pipeline Template Location" + + +class InteractiveInitFlow: + def __init__(self, allow_bootstrap: bool): + self.allow_bootstrap = allow_bootstrap + self.color = Colored() + + def do_interactive(self) -> None: + """ + An interactive flow that prompts the user for pipeline template (cookiecutter template) location, downloads it, + runs its specific questionnaire then generates the pipeline config file + based on the template and user's responses + """ + click.echo( + dedent( + """\ + + sam pipeline init generates a pipeline configuration file that your CI/CD system + can use to deploy serverless applications using AWS SAM. + We will guide you through the process to bootstrap resources for each stage, + then walk through the details necessary for creating the pipeline config file. 
+ + Please ensure you are in the root folder of your SAM application before you begin. + """ + ) + ) + + click.echo("Select a pipeline structure template to get started:") + pipeline_template_source_question = Choice( + key="pipeline-template-source", + text="Select template", + options=[SAM_PIPELINE_TEMPLATE_SOURCE, CUSTOM_PIPELINE_TEMPLATE_SOURCE], + is_required=True, + ) + source = pipeline_template_source_question.ask() + if source == CUSTOM_PIPELINE_TEMPLATE_SOURCE: + generated_files = self._generate_from_custom_location() + else: + generated_files = self._generate_from_app_pipeline_templates() + click.secho(Colored().green("Successfully created the pipeline configuration file(s):")) + for file in generated_files: + click.secho(Colored().green(f"\t- {file}")) + + def _generate_from_app_pipeline_templates( + self, + ) -> List[str]: + """ + Prompts the user to choose a pipeline template from SAM's predefined set of pipeline templates hosted in the git + repository aws/aws-sam-cli-pipeline-init-templates.git, + downloads it locally, then generates the pipeline configuration file from the selected pipeline template. + Finally, returns the list of generated files.
+ """ + pipeline_templates_local_dir: Path = _clone_app_pipeline_templates() + pipeline_templates_manifest: PipelineTemplatesManifest = _read_app_pipeline_templates_manifest( + pipeline_templates_local_dir + ) + # The manifest contains multiple pipeline-templates so select one + selected_pipeline_template_metadata: PipelineTemplateMetadata = _prompt_pipeline_template( + pipeline_templates_manifest + ) + selected_pipeline_template_dir: Path = pipeline_templates_local_dir.joinpath( + selected_pipeline_template_metadata.location + ) + return self._generate_from_pipeline_template(selected_pipeline_template_dir) + + def _generate_from_custom_location( + self, + ) -> List[str]: + """ + Prompts the user for a custom pipeline template location, downloads locally, + then generates the pipeline config file and return the list of generated files + """ + pipeline_template_git_location: str = click.prompt("Template Git location") + if os.path.exists(pipeline_template_git_location): + return self._generate_from_pipeline_template(Path(pipeline_template_git_location)) + + with osutils.mkdir_temp(ignore_errors=True) as tempdir: + tempdir_path = Path(tempdir) + pipeline_template_local_dir: Path = _clone_pipeline_templates( + pipeline_template_git_location, tempdir_path, CUSTOM_PIPELINE_TEMPLATE_REPO_LOCAL_NAME + ) + return self._generate_from_pipeline_template(pipeline_template_local_dir) + + def _prompt_run_bootstrap_within_pipeline_init(self, stage_names: List[str], number_of_stages: int) -> bool: + """ + Prompt bootstrap if `--bootstrap` flag is provided. Return True if bootstrap process is executed. + """ + if not stage_names: + click.echo("[!] None detected in this account.") + else: + click.echo( + Colored().yellow( + f"Only {len(stage_names)} stage(s) were detected, " + f"fewer than what the template requires: {number_of_stages}." + ) + ) + click.echo() + + if self.allow_bootstrap: + if click.confirm( + "Do you want to go through stage setup process now? 
If you choose no, " + "you can still reference other bootstrapped resources." + ): + click.secho( + dedent( + """\ + + For each stage, we will ask for [1] stage definition, [2] account details, and [3] + reference application build resources in order to bootstrap these pipeline + resources. + + We recommend using an individual AWS account profiles for each stage in your + pipeline. You can set these profiles up using [little bit of info on how to do + this/docs]. + """ + ) + ) + + click.echo(Colored().bold(f"\nStage {len(stage_names) + 1} Setup\n")) + do_bootstrap( + region=None, + profile=None, + interactive=True, + stage_name=None, + pipeline_user_arn=None, + pipeline_execution_role_arn=None, + cloudformation_execution_role_arn=None, + artifacts_bucket_arn=None, + create_image_repository=False, + image_repository_arn=None, + confirm_changeset=True, + config_file=None, + config_env=None, + standalone=False, + ) + return True + else: + click.echo( + dedent( + """\ + To set up stage(s), please quit the process using Ctrl+C and use one of the following commands: + sam pipeline init --bootstrap To be guided through the stage and config file creation process. + sam pipeline bootstrap To specify details for an individual stage. + """ + ) + ) + click.prompt( + "To reference stage resources bootstrapped in a different account, press enter to proceed", default="" + ) + return False + + def _generate_from_pipeline_template(self, pipeline_template_dir: Path) -> List[str]: + """ + Generates a pipeline config file from a given pipeline template local location + and return the list of generated files. 
+ """ + pipeline_template: Template = _initialize_pipeline_template(pipeline_template_dir) + number_of_stages = (pipeline_template.metadata or {}).get("number_of_stages") + if not number_of_stages: + LOG.debug("Cannot find number_of_stages from template's metadata, set to default 2.") + number_of_stages = 2 + click.echo(f"You are using the {number_of_stages}-stage pipeline template.") + _draw_stage_diagram(number_of_stages) + while True: + click.echo("Checking for existing stages...\n") + stage_names, bootstrap_context = _load_pipeline_bootstrap_resources() + if len(stage_names) < number_of_stages and self._prompt_run_bootstrap_within_pipeline_init( + stage_names, number_of_stages + ): + # the customers just went through the bootstrap process, + # refresh the pipeline bootstrap resources and see whether bootstrap is still needed + continue + break + + context: Dict = pipeline_template.run_interactive_flows(bootstrap_context) + with osutils.mkdir_temp() as generate_dir: + LOG.debug("Generating pipeline files into %s", generate_dir) + context["outputDir"] = "." # prevent cookiecutter from generating a sub-folder + pipeline_template.generate_project(context, generate_dir) + return _copy_dir_contents_to_cwd(generate_dir) + + +def _load_pipeline_bootstrap_resources() -> Tuple[List[str], Dict[str, str]]: + section = "parameters" + context: Dict = {} + + config = SamConfig(PIPELINE_CONFIG_DIR, PIPELINE_CONFIG_FILENAME) + if not config.exists(): + context[str(["stage_names_message"])] = "" + return [], context + + # config.get_stage_names() will return the list of + # bootstrapped stage names and "default" which is used to store shared values + # we don't want to include "default" here. 
+ stage_names = [stage_name for stage_name in config.get_stage_names() if stage_name != "default"] + for index, stage in enumerate(stage_names): + for key, value in config.get_all(_get_bootstrap_command_names(), section, stage).items(): + context[str([stage, key])] = value + # create an index alias for each stage name + # so that if customers type "1," it is equivalent to the first stage name + context[str([str(index + 1), key])] = value + + # pre-load the list of stage names detected from pipelineconfig.toml + stage_names_message = ( + "Here are the stage names detected " + + f"in {os.path.join(PIPELINE_CONFIG_DIR, PIPELINE_CONFIG_FILENAME)}:\n" + + "\n".join([f"\t{index + 1} - {stage_name}" for index, stage_name in enumerate(stage_names)]) + ) + context[str(["stage_names_message"])] = stage_names_message + + return stage_names, context + + +def _copy_dir_contents_to_cwd(source_dir: str) -> List[str]: + """ + Copy the contents of source_dir into the current cwd. + If existing files are encountered, ask for confirmation. 
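The context-key convention above is easy to miss: each value is stored under the `str()` of a `[stage, key]` list, and every stage also gets a 1-based index alias so typing "1" resolves to the first stage name. A small sketch of that convention (the `build_context` helper name is illustrative):

```python
from typing import Dict


def build_context(stages: Dict[str, Dict[str, str]]) -> Dict[str, str]:
    """Flatten per-stage values into keys of the form str([stage, key]),
    adding a str([index, key]) alias for each stage."""
    context: Dict[str, str] = {}
    for index, (stage, values) in enumerate(stages.items()):
        for key, value in values.items():
            context[str([stage, key])] = value
            # Index alias: answering "1" is equivalent to the first stage name.
            context[str([str(index + 1), key])] = value
    return context


ctx = build_context({"dev": {"pipeline_user": "arn:aws:iam::111111111111:user/ci"}})
print(ctx[str(["1", "pipeline_user"])])
```

Since `str(["dev", "pipeline_user"])` is just the literal string `"['dev', 'pipeline_user']"`, the interactive-flow questions can reference these keys as plain strings.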
+ If not confirmed, all files will be written to + .aws-sam/pipeline/generated-files/ + """ + file_paths: List[str] = [] + existing_file_paths: List[str] = [] + for root, _, files in os.walk(source_dir): + for filename in files: + file_path = Path(root, filename) + target_file_path = Path(".").joinpath(file_path.relative_to(source_dir)) + LOG.debug("Verify %s does not exist", target_file_path) + if target_file_path.exists(): + existing_file_paths.append(str(target_file_path)) + file_paths.append(str(target_file_path)) + if existing_file_paths: + click.echo("\nThe following files already exist:") + for existing_file_path in existing_file_paths: + click.echo(f"\t- {existing_file_path}") + if not click.confirm("Do you want to override them?"): + target_dir = str(Path(PIPELINE_CONFIG_DIR, "generated-files")) + osutils.copytree(source_dir, target_dir) + click.echo(f"All files are saved to {target_dir}.") + return [str(Path(target_dir, path)) for path in file_paths] + LOG.debug("Copy contents of %s to cwd", source_dir) + osutils.copytree(source_dir, ".") + return file_paths + + +def _clone_app_pipeline_templates() -> Path: + """ + clone aws/aws-sam-cli-pipeline-init-templates.git Git repo to the local machine in SAM shared directory. + Returns: + the local directory path where the repo is cloned. 
+ """ + try: + return _clone_pipeline_templates( + repo_url=APP_PIPELINE_TEMPLATES_REPO_URL, + clone_dir=shared_path, + clone_name=APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME, + ) + except PipelineTemplateCloneException: + # If can't clone app pipeline templates, try using an old clone from a previous run if already exist + expected_previous_clone_local_path: Path = shared_path.joinpath(APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME) + if expected_previous_clone_local_path.exists(): + click.echo("Unable to download updated app pipeline templates, using existing ones") + return expected_previous_clone_local_path + raise + + +def _clone_pipeline_templates(repo_url: str, clone_dir: Path, clone_name: str) -> Path: + """ + clone a given pipeline templates' Git repo to the user machine inside the given clone_dir directory + under the given clone name. For example, if clone_name is "custom-pipeline-template" then the location to clone + to is "/clone/dir/path/custom-pipeline-template/" + + Parameters: + repo_url: the URL of the Git repo to clone + clone_dir: the local parent directory to clone to + clone_name: The folder name to give to the created clone inside clone_dir + + Returns: + Path to the local clone + """ + try: + repo: GitRepo = GitRepo(repo_url) + clone_path: Path = repo.clone(clone_dir, clone_name, replace_existing=True) + return clone_path + except (OSError, CloneRepoException) as ex: + raise PipelineTemplateCloneException(str(ex)) from ex + + +def _read_app_pipeline_templates_manifest(pipeline_templates_dir: Path) -> PipelineTemplatesManifest: + """ + parse and return the manifest yaml file located in the root directory of the SAM pipeline templates folder: + + Parameters: + pipeline_templates_dir: local directory of SAM pipeline templates + + Raises: + AppPipelineTemplateManifestException if the manifest is not found, ill-formatted or missing required keys + + Returns: + The manifest of the pipeline templates + """ + manifest_path: Path = 
pipeline_templates_dir.joinpath("manifest.yaml")
+    return PipelineTemplatesManifest(manifest_path)
+
+
+def _prompt_pipeline_template(pipeline_templates_manifest: PipelineTemplatesManifest) -> PipelineTemplateMetadata:
+    """
+    Prompts the user with a list of the available CI/CD systems along with their associated app pipeline templates
+    and asks to choose one of them
+
+    Parameters:
+        pipeline_templates_manifest: A manifest file listing the available providers and the associated pipeline
+            templates
+
+    Returns:
+        The manifest (a section in the pipeline_templates_manifest) of the chosen pipeline template
+    """
+    provider = _prompt_cicd_provider(pipeline_templates_manifest.providers)
+    provider_pipeline_templates: List[PipelineTemplateMetadata] = [
+        t for t in pipeline_templates_manifest.templates if t.provider == provider.id
+    ]
+    selected_template_manifest: PipelineTemplateMetadata = _prompt_provider_pipeline_template(
+        provider_pipeline_templates
+    )
+    return selected_template_manifest
+
+
+def _prompt_cicd_provider(available_providers: List[Provider]) -> Provider:
+    """
+    Prompts the user with a list of the available CI/CD systems to choose from
+
+    Parameters:
+        available_providers: List of available CI/CD systems such as Jenkins, Gitlab and CircleCI
+
+    Returns:
+        The chosen provider
+    """
+    if len(available_providers) == 1:
+        return available_providers[0]
+
+    question_to_choose_provider = Choice(
+        key="provider", text="CI/CD system", options=[p.display_name for p in available_providers], is_required=True
+    )
+    chosen_provider_display_name = question_to_choose_provider.ask()
+    return next(p for p in available_providers if p.display_name == chosen_provider_display_name)
+
+
+def _prompt_provider_pipeline_template(
+    provider_available_pipeline_templates_metadata: List[PipelineTemplateMetadata],
+) -> PipelineTemplateMetadata:
+    """
+    Prompts the user with a list of the available pipeline templates to choose from
+
+    Parameters:
+        provider_available_pipeline_templates_metadata: List of
available pipeline templates manifests + + Returns: + The chosen pipeline template manifest + """ + if len(provider_available_pipeline_templates_metadata) == 1: + return provider_available_pipeline_templates_metadata[0] + question_to_choose_pipeline_template = Choice( + key="pipeline-template", + text="Which pipeline template would you like to use?", + options=[t.display_name for t in provider_available_pipeline_templates_metadata], + ) + chosen_pipeline_template_display_name = question_to_choose_pipeline_template.ask() + return next( + t + for t in provider_available_pipeline_templates_metadata + if t.display_name == chosen_pipeline_template_display_name + ) + + +def _initialize_pipeline_template(pipeline_template_dir: Path) -> Template: + """ + Initialize a pipeline template from a given pipeline template (cookiecutter template) location + + Parameters: + pipeline_template_dir: The local location of the pipeline cookiecutter template + + Returns: + The initialized pipeline's cookiecutter template + """ + interactive_flow = _get_pipeline_template_interactive_flow(pipeline_template_dir) + metadata = _get_pipeline_template_metadata(pipeline_template_dir) + return Template(location=str(pipeline_template_dir), interactive_flows=[interactive_flow], metadata=metadata) + + +def _get_pipeline_template_metadata(pipeline_template_dir: Path) -> Dict: + """ + Load the metadata from the file metadata.json located in the template directory, + raise an exception if anything wrong. 
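The `metadata.json` handling described above boils down to: the file must exist, must parse as JSON, and must contain a JSON object. A hedged, stdlib-only sketch of that validation (the `load_template_metadata` name is illustrative, not the SAM CLI API; the real code raises `AppPipelineTemplateMetadataException` instead of the builtins used here):

```python
import json
from pathlib import Path


def load_template_metadata(template_dir: Path) -> dict:
    """Load metadata.json from a template directory, accepting only a JSON object."""
    metadata_path = template_dir / "metadata.json"
    if not metadata_path.exists():
        raise FileNotFoundError(f"Cannot find metadata file {metadata_path}")
    with open(metadata_path, "r", encoding="utf-8") as f:
        # json.load raises json.JSONDecodeError on ill-formatted content.
        metadata = json.load(f)
    if not isinstance(metadata, dict):
        # A JSON array or scalar at the top level is rejected outright.
        raise ValueError(f"Invalid content found in {metadata_path}")
    return metadata
```

Rejecting non-dict top-level JSON matters because the metadata is later merged with `.get("number_of_stages")`-style lookups that assume a mapping.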
+ """ + metadata_path = Path(pipeline_template_dir, "metadata.json") + if not metadata_path.exists(): + raise AppPipelineTemplateMetadataException(f"Cannot find metadata file {metadata_path}") + try: + with open(metadata_path, "r", encoding="utf-8") as file: + metadata = json.load(file) + if isinstance(metadata, dict): + return metadata + raise AppPipelineTemplateMetadataException(f"Invalid content found in {metadata_path}") + except JSONDecodeError as ex: + raise AppPipelineTemplateMetadataException(f"Invalid JSON found in {metadata_path}") from ex + + +def _get_pipeline_template_interactive_flow(pipeline_template_dir: Path) -> InteractiveFlow: + """ + A pipeline template defines its own interactive flow (questionnaire) in a JSON file named questions.json located + in the root directory of the template. This questionnaire defines a set of questions to prompt to the user and + use the responses as the cookiecutter context + + Parameters: + pipeline_template_dir: The local location of the pipeline cookiecutter template + + Raises: + QuestionsNotFoundException: if the pipeline template is missing questions.json file. + QuestionsFailedParsingException: if questions.json file is ill-formatted or missing required keys. 
+ + Returns: + The interactive flow + """ + flow_definition_path: Path = pipeline_template_dir.joinpath("questions.json") + return InteractiveFlowCreator.create_flow(str(flow_definition_path)) + + +def _lines_for_stage(stage_index: int) -> List[str]: + return [ + " _________ ", + "| |", + f"| Stage {stage_index} |", + "|_________|", + ] + + +def _draw_stage_diagram(number_of_stages: int) -> None: + delimiters = [" ", " ", "->", " "] + stage_lines = [_lines_for_stage(i + 1) for i in range(number_of_stages)] + for i, delimiter in enumerate(delimiters): + click.echo(delimiter.join([stage_lines[stage_i][i] for stage_i in range(number_of_stages)])) + click.echo("") diff --git a/samcli/commands/pipeline/init/pipeline_templates_manifest.py b/samcli/commands/pipeline/init/pipeline_templates_manifest.py new file mode 100644 index 0000000000..8249e14d85 --- /dev/null +++ b/samcli/commands/pipeline/init/pipeline_templates_manifest.py @@ -0,0 +1,61 @@ +""" +Represents a manifest that lists the available SAM pipeline templates. 
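The `_lines_for_stage`/`_draw_stage_diagram` pair above builds each stage as a four-row ASCII box and joins the rows with per-row delimiters so an `->` arrow lands between the middle rows of adjacent boxes. A self-contained sketch of that join (box padding is reconstructed here, since the patch text collapsed the interior whitespace):

```python
from typing import List


def lines_for_stage(stage_index: int) -> List[str]:
    # Each row is exactly 11 characters wide so rows align when joined.
    return [
        " _________ ",
        "|         |",
        f"| Stage {stage_index} |",
        "|_________|",
    ]


def stage_diagram(number_of_stages: int) -> str:
    # One delimiter per row; only the third row carries the "->" arrow.
    delimiters = ["  ", "  ", "->", "  "]
    stage_lines = [lines_for_stage(i + 1) for i in range(number_of_stages)]
    rows = [
        delimiter.join(stage_lines[s][row] for s in range(number_of_stages))
        for row, delimiter in enumerate(delimiters)
    ]
    return "\n".join(rows)


print(stage_diagram(2))
```

Joining row-by-row (rather than box-by-box) is what lets the boxes sit side by side in terminal output.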
+Example:
+    providers:
+      - displayName: Jenkins
+        id: jenkins
+      - displayName: Gitlab CI/CD
+        id: gitlab
+      - displayName: Github Actions
+        id: github-actions
+    templates:
+      - displayName: jenkins-two-environments-pipeline
+        provider: Jenkins
+        location: templates/cookiecutter-jenkins-two-environments-pipeline
+      - displayName: gitlab-two-environments-pipeline
+        provider: Gitlab
+        location: templates/cookiecutter-gitlab-two-environments-pipeline
+      - displayName: Github-Actions-two-environments-pipeline
+        provider: Github Actions
+        location: templates/cookiecutter-github-actions-two-environments-pipeline
+"""
+from pathlib import Path
+from typing import Dict, List
+
+import yaml
+
+from samcli.commands.exceptions import AppPipelineTemplateManifestException
+from samcli.yamlhelper import parse_yaml_file
+
+
+class Provider:
+    """CI/CD system such as Jenkins, Gitlab and GitHub-Actions"""
+
+    def __init__(self, manifest: Dict) -> None:
+        self.id: str = manifest["id"]
+        self.display_name: str = manifest["displayName"]
+
+
+class PipelineTemplateMetadata:
+    """The metadata of a given pipeline template"""
+
+    def __init__(self, manifest: Dict) -> None:
+        self.display_name: str = manifest["displayName"]
+        self.provider: str = manifest["provider"]
+        self.location: str = manifest["location"]
+
+
+class PipelineTemplatesManifest:
+    """The metadata of the available CI/CD systems and the pipeline templates"""
+
+    def __init__(self, manifest_path: Path) -> None:
+        try:
+            manifest: Dict = parse_yaml_file(file_path=str(manifest_path))
+            self.providers: List[Provider] = list(map(Provider, manifest["providers"]))
+            self.templates: List[PipelineTemplateMetadata] = list(map(PipelineTemplateMetadata, manifest["templates"]))
+        except (FileNotFoundError, KeyError, TypeError, yaml.YAMLError) as ex:
+            raise AppPipelineTemplateManifestException(
+                "SAM pipeline templates manifest file is not found or ill-formatted. This could happen if the file "
+                f"{manifest_path} got deleted or modified. "
+ "If you believe this is not the case, please file an issue at https://github.com/aws/aws-sam-cli/issues" + ) from ex diff --git a/samcli/commands/pipeline/pipeline.py b/samcli/commands/pipeline/pipeline.py new file mode 100644 index 0000000000..2d8df4463e --- /dev/null +++ b/samcli/commands/pipeline/pipeline.py @@ -0,0 +1,21 @@ +""" +Command group for "pipeline" suite commands. It provides common CLI arguments, template parsing capabilities, +setting up stdin/stdout etc +""" + +import click + +from .bootstrap.cli import cli as bootstrap_cli +from .init.cli import cli as init_cli + + +@click.group() +def cli() -> None: + """ + Manage the continuous delivery of the application + """ + + +# Add individual commands under this group +cli.add_command(bootstrap_cli) +cli.add_command(init_cli) diff --git a/samcli/commands/validate/lib/sam_template_validator.py b/samcli/commands/validate/lib/sam_template_validator.py index ddac94a82b..4e34071b9f 100644 --- a/samcli/commands/validate/lib/sam_template_validator.py +++ b/samcli/commands/validate/lib/sam_template_validator.py @@ -7,6 +7,7 @@ from samtranslator.public.exceptions import InvalidDocumentException from samtranslator.parser import parser from samtranslator.translator.translator import Translator +from boto3.session import Session from samcli.lib.utils.packagetype import ZIP from samcli.yamlhelper import yaml_dump @@ -16,7 +17,7 @@ class SamTemplateValidator: - def __init__(self, sam_template, managed_policy_loader): + def __init__(self, sam_template, managed_policy_loader, profile=None, region=None): """ Construct a SamTemplateValidator @@ -40,6 +41,7 @@ def __init__(self, sam_template, managed_policy_loader): self.sam_template = sam_template self.managed_policy_loader = managed_policy_loader self.sam_parser = parser.Parser() + self.boto3_session = Session(profile_name=profile, region_name=region) def is_valid(self): """ @@ -53,7 +55,12 @@ def is_valid(self): """ managed_policy_map = 
self.managed_policy_loader.load() - sam_translator = Translator(managed_policy_map=managed_policy_map, sam_parser=self.sam_parser, plugins=[]) + sam_translator = Translator( + managed_policy_map=managed_policy_map, + sam_parser=self.sam_parser, + plugins=[], + boto_session=self.boto3_session, + ) self._replace_local_codeuri() diff --git a/samcli/commands/validate/validate.py b/samcli/commands/validate/validate.py index 3f12ccb30f..0eb8c4d6c9 100644 --- a/samcli/commands/validate/validate.py +++ b/samcli/commands/validate/validate.py @@ -7,6 +7,8 @@ from botocore.exceptions import NoCredentialsError import click +from samtranslator.translator.arn_generator import NoRegionFound + from samcli.cli.main import pass_context, common_options as cli_framework_options, aws_creds_options, print_cmdline_args from samcli.commands._utils.options import template_option_without_build from samcli.lib.telemetry.metric import track_command @@ -49,13 +51,20 @@ def do_cli(ctx, template): sam_template = _read_sam_file(template) iam_client = boto3.client("iam") - validator = SamTemplateValidator(sam_template, ManagedPolicyLoader(iam_client)) + validator = SamTemplateValidator( + sam_template, ManagedPolicyLoader(iam_client), profile=ctx.profile, region=ctx.region + ) try: validator.is_valid() except InvalidSamDocumentException as e: click.secho("Template provided at '{}' was invalid SAM Template.".format(template), bg="red") raise InvalidSamTemplateException(str(e)) from e + except NoRegionFound as no_region_found_e: + raise UserException( + "AWS Region was not found. Please configure your region through a profile or --region option", + wrapped_from=no_region_found_e.__class__.__name__, + ) from no_region_found_e except NoCredentialsError as e: raise UserException( "AWS Credentials are required. 
Please configure your credentials.", wrapped_from=e.__class__.__name__ diff --git a/samcli/lib/bootstrap/bootstrap.py b/samcli/lib/bootstrap/bootstrap.py index 81c30c7748..a9a590dc7f 100644 --- a/samcli/lib/bootstrap/bootstrap.py +++ b/samcli/lib/bootstrap/bootstrap.py @@ -4,32 +4,51 @@ import json import logging +from typing import Optional + +import boto3 +from botocore.exceptions import ClientError + from samcli import __version__ from samcli.cli.global_config import GlobalConfig -from samcli.commands.exceptions import UserException -from samcli.lib.utils.managed_cloudformation_stack import manage_stack as manage_cloudformation_stack +from samcli.commands.exceptions import UserException, CredentialsError +from samcli.lib.utils.managed_cloudformation_stack import StackOutput, manage_stack as manage_cloudformation_stack SAM_CLI_STACK_NAME = "aws-sam-cli-managed-default" LOG = logging.getLogger(__name__) def manage_stack(profile, region): - outputs = manage_cloudformation_stack( + outputs: StackOutput = manage_cloudformation_stack( profile=None, region=region, stack_name=SAM_CLI_STACK_NAME, template_body=_get_stack_template() ) - try: - bucket_name = next(o for o in outputs if o["OutputKey"] == "SourceBucket")["OutputValue"] - except StopIteration as ex: + bucket_name = outputs.get("SourceBucket") + if bucket_name is None: msg = ( "Stack " + SAM_CLI_STACK_NAME + " exists, but is missing the managed source bucket key. " "Failing as this stack was likely not created by the AWS SAM CLI." 
) - raise UserException(msg) from ex + raise UserException(msg) # This bucket name is what we would write to a config file return bucket_name +def get_current_account_id(profile: Optional[str] = None): + """Returns account ID based on used AWS credentials.""" + session = boto3.Session(profile_name=profile) # type: ignore + sts_client = session.client("sts") + try: + caller_identity = sts_client.get_caller_identity() + except ClientError as ex: + if ex.response["Error"]["Code"] == "InvalidClientTokenId": + raise CredentialsError("Cannot identify account due to invalid configured credentials.") from ex + raise CredentialsError("Cannot identify account based on configured credentials.") from ex + if "Account" not in caller_identity: + raise CredentialsError("Cannot identify account based on configured credentials.") + return caller_identity["Account"] + + def _get_stack_template(): gc = GlobalConfig() info = {"version": __version__, "installationId": gc.installation_id if gc.installation_id else "unknown"} @@ -73,6 +92,9 @@ def _get_stack_template(): - "/*" Principal: Service: serverlessrepo.amazonaws.com + Condition: + StringEquals: + aws:SourceAccount: !Ref AWS::AccountId Outputs: SourceBucket: diff --git a/samcli/lib/build/app_builder.py b/samcli/lib/build/app_builder.py index 38af1f254a..789c7d3965 100644 --- a/samcli/lib/build/app_builder.py +++ b/samcli/lib/build/app_builder.py @@ -24,7 +24,7 @@ from samcli.lib.providers.provider import ResourcesToBuildCollector, Function, get_full_path, Stack, LayerVersion from samcli.lib.providers.sam_base_provider import SamBaseProvider from samcli.lib.utils.colors import Colored -import samcli.lib.utils.osutils as osutils +from samcli.lib.utils import osutils from samcli.lib.utils.packagetype import IMAGE, ZIP from samcli.lib.utils.stream_writer import StreamWriter from samcli.local.docker.lambda_build_container import LambdaBuildContainer diff --git a/samcli/lib/build/build_graph.py b/samcli/lib/build/build_graph.py index 
836a412a86..ceda957ffb 100644 --- a/samcli/lib/build/build_graph.py +++ b/samcli/lib/build/build_graph.py @@ -4,7 +4,7 @@ import logging from pathlib import Path -from typing import Tuple, List, Any, Optional, Dict +from typing import Tuple, List, Any, Optional, Dict, cast from uuid import uuid4 import tomlkit @@ -33,7 +33,7 @@ def _function_build_definition_to_toml_table( function_build_definition: "FunctionBuildDefinition", -) -> tomlkit.items.Table: +) -> tomlkit.api.Table: """ Converts given function_build_definition into toml table representation @@ -44,7 +44,7 @@ def _function_build_definition_to_toml_table( Returns ------- - tomlkit.items.Table + tomlkit.api.Table toml table of FunctionBuildDefinition """ toml_table = tomlkit.table() @@ -63,7 +63,7 @@ def _function_build_definition_to_toml_table( return toml_table -def _toml_table_to_function_build_definition(uuid: str, toml_table: tomlkit.items.Table) -> "FunctionBuildDefinition": +def _toml_table_to_function_build_definition(uuid: str, toml_table: tomlkit.api.Table) -> "FunctionBuildDefinition": """ Converts given toml table into FunctionBuildDefinition instance @@ -71,7 +71,7 @@ def _toml_table_to_function_build_definition(uuid: str, toml_table: tomlkit.item ---------- uuid: str key of the function toml_table instance - toml_table: tomlkit.items.Table + toml_table: tomlkit.api.Table function build definition as toml table Returns @@ -91,7 +91,7 @@ def _toml_table_to_function_build_definition(uuid: str, toml_table: tomlkit.item return function_build_definition -def _layer_build_definition_to_toml_table(layer_build_definition: "LayerBuildDefinition") -> tomlkit.items.Table: +def _layer_build_definition_to_toml_table(layer_build_definition: "LayerBuildDefinition") -> tomlkit.api.Table: """ Converts given layer_build_definition into toml table representation @@ -102,7 +102,7 @@ def _layer_build_definition_to_toml_table(layer_build_definition: "LayerBuildDef Returns ------- - tomlkit.items.Table + 
tomlkit.api.Table toml table of LayerBuildDefinition """ toml_table = tomlkit.table() @@ -119,7 +119,7 @@ def _layer_build_definition_to_toml_table(layer_build_definition: "LayerBuildDef return toml_table -def _toml_table_to_layer_build_definition(uuid: str, toml_table: tomlkit.items.Table) -> "LayerBuildDefinition": +def _toml_table_to_layer_build_definition(uuid: str, toml_table: tomlkit.api.Table) -> "LayerBuildDefinition": """ Converts given toml table into LayerBuildDefinition instance @@ -127,7 +127,7 @@ def _toml_table_to_layer_build_definition(uuid: str, toml_table: tomlkit.items.T ---------- uuid: str key of the toml_table instance - toml_table: tomlkit.items.Table + toml_table: tomlkit.api.Table layer build definition as toml table Returns @@ -136,10 +136,10 @@ def _toml_table_to_layer_build_definition(uuid: str, toml_table: tomlkit.items.T LayerBuildDefinition of given toml table """ layer_build_definition = LayerBuildDefinition( - toml_table[LAYER_NAME_FIELD], - toml_table[CODE_URI_FIELD], - toml_table[BUILD_METHOD_FIELD], - toml_table[COMPATIBLE_RUNTIMES_FIELD], + toml_table.get(LAYER_NAME_FIELD), + toml_table.get(CODE_URI_FIELD), + toml_table.get(BUILD_METHOD_FIELD), + toml_table.get(COMPATIBLE_RUNTIMES_FIELD), toml_table.get(SOURCE_MD5_FIELD, ""), dict(toml_table.get(ENV_VARS_FIELD, {})), ) @@ -266,7 +266,10 @@ def _read(self) -> None: document = {} try: txt = self._filepath.read_text() - document = tomlkit.loads(txt) + # .loads() returns a TOMLDocument, + # and it behaves like a standard dictionary according to https://github.com/sdispater/tomlkit. + # in tomlkit 0.7.2, the types are broken (tomlkit#128, #130, #134) so here we convert it to Dict. 
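The `cast(Dict, tomlkit.loads(txt))` above is a typing-only workaround: `tomlkit.loads()` returns a `TOMLDocument` that already behaves like a dict, but in tomlkit 0.7.x the annotations do not say so, so the code `cast()`s it for mypy's benefit. `typing.cast` is a no-op at runtime, which this stdlib-only sketch demonstrates (`FakeTomlDocument` is a stand-in for `TOMLDocument`, used here so the example runs without tomlkit installed):

```python
from typing import Dict, cast


class FakeTomlDocument(dict):
    """Stand-in for tomlkit's TOMLDocument, which is dict-like at runtime."""


raw = FakeTomlDocument({"function_build_definitions": {}})

# cast() only changes the static type; the object is untouched at runtime.
document = cast(Dict, raw)

print(document.get("function_build_definitions"))
print(document is raw)
```

The same reasoning applies to the `cast(tomlkit.items.Item, ...)` calls in `_write`: they satisfy the checker without altering what is serialized.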
+ document = cast(Dict, tomlkit.loads(txt)) except OSError: LOG.debug("No previous build graph found, generating new one") function_build_definitions_table = document.get(BuildGraph.FUNCTION_BUILD_DEFINITIONS, []) @@ -304,8 +307,9 @@ def _write(self) -> None: # create toml document and add build definitions document = tomlkit.document() document.add(tomlkit.comment("This file is auto generated by SAM CLI build command")) - document.add(BuildGraph.FUNCTION_BUILD_DEFINITIONS, function_build_definitions_table) - document.add(BuildGraph.LAYER_BUILD_DEFINITIONS, layer_build_definitions_table) + # we need to cast `Table` to `Item` because of tomlkit#135. + document.add(BuildGraph.FUNCTION_BUILD_DEFINITIONS, cast(tomlkit.items.Item, function_build_definitions_table)) + document.add(BuildGraph.LAYER_BUILD_DEFINITIONS, cast(tomlkit.items.Item, layer_build_definitions_table)) if not self._filepath.exists(): open(self._filepath, "a+").close() @@ -351,7 +355,7 @@ def __init__( def __str__(self) -> str: return ( f"LayerBuildDefinition({self.name}, {self.codeuri}, {self.source_md5}, {self.uuid}, " - f"{self.build_method}, {self.compatible_runtimes}, {self.env_vars}, {self.layer.name})" + f"{self.build_method}, {self.compatible_runtimes}, {self.env_vars})" ) def __eq__(self, other: Any) -> bool: diff --git a/samcli/lib/build/build_strategy.py b/samcli/lib/build/build_strategy.py index 829946b6bf..258101ba2d 100644 --- a/samcli/lib/build/build_strategy.py +++ b/samcli/lib/build/build_strategy.py @@ -5,6 +5,7 @@ import pathlib import shutil from abc import abstractmethod, ABC +from copy import deepcopy from typing import Callable, Dict, List, Any, Optional, cast from samcli.commands.build.exceptions import MissingBuildMethodException @@ -114,6 +115,10 @@ def build_single_function_definition(self, build_definition: FunctionBuildDefini LOG.debug("Building to following folder %s", single_build_dir) + # we should create a copy and pass it down, otherwise additional env vars like 
LAMBDA_BUILDERS_LOG_LEVEL + # will make cache invalid all the time + container_env_vars = deepcopy(build_definition.env_vars) + # when a function is passed here, it is ZIP function, codeuri and runtime are not None result = self._build_function( build_definition.get_function_name(), @@ -123,7 +128,7 @@ def build_single_function_definition(self, build_definition: FunctionBuildDefini build_definition.get_handler_name(), single_build_dir, build_definition.metadata, - build_definition.env_vars, + container_env_vars, ) function_build_results[single_full_path] = result @@ -214,7 +219,7 @@ def build_single_function_definition(self, build_definition: FunctionBuildDefini return self._delegate_build_strategy.build_single_function_definition(build_definition) code_dir = str(pathlib.Path(self._base_dir, cast(str, build_definition.codeuri)).resolve()) - source_md5 = dir_checksum(code_dir) + source_md5 = dir_checksum(code_dir, ignore_list=[".aws-sam"]) cache_function_dir = pathlib.Path(self._cache_dir, build_definition.uuid) function_build_results = {} @@ -253,7 +258,7 @@ def build_single_layer_definition(self, layer_definition: LayerBuildDefinition) Builds single layer definition with caching """ code_dir = str(pathlib.Path(self._base_dir, cast(str, layer_definition.codeuri)).resolve()) - source_md5 = dir_checksum(code_dir) + source_md5 = dir_checksum(code_dir, ignore_list=[".aws-sam"]) cache_function_dir = pathlib.Path(self._cache_dir, layer_definition.uuid) layer_build_result = {} diff --git a/samcli/lib/build/workflow_config.py b/samcli/lib/build/workflow_config.py index 146b6bfa77..bec974cc25 100644 --- a/samcli/lib/build/workflow_config.py +++ b/samcli/lib/build/workflow_config.py @@ -151,6 +151,7 @@ def get_layer_subfolder(build_workflow: str) -> str: "python3.6": "python", "python3.7": "python", "python3.8": "python", + "python3.9": "python", "nodejs4.3": "nodejs", "nodejs6.10": "nodejs", "nodejs8.10": "nodejs", @@ -210,6 +211,7 @@ def get_workflow_config( "python3.6": 
BasicWorkflowSelector(PYTHON_PIP_CONFIG), "python3.7": BasicWorkflowSelector(PYTHON_PIP_CONFIG), "python3.8": BasicWorkflowSelector(PYTHON_PIP_CONFIG), + "python3.9": BasicWorkflowSelector(PYTHON_PIP_CONFIG), "nodejs10.x": BasicWorkflowSelector(NODEJS_NPM_CONFIG), "nodejs12.x": BasicWorkflowSelector(NODEJS_NPM_CONFIG), "nodejs14.x": BasicWorkflowSelector(NODEJS_NPM_CONFIG), diff --git a/samcli/lib/config/samconfig.py b/samcli/lib/config/samconfig.py index 996ac5f648..5af1c0080a 100644 --- a/samcli/lib/config/samconfig.py +++ b/samcli/lib/config/samconfig.py @@ -41,6 +41,12 @@ def __init__(self, config_dir, filename=None): """ self.filepath = Path(config_dir, filename or DEFAULT_CONFIG_FILE_NAME) + def get_stage_names(self): + self._read() + if isinstance(self.document, dict): + return [stage for stage, value in self.document.items() if isinstance(value, dict)] + return [] + def get_all(self, cmd_names, section, env=DEFAULT_ENV): """ Gets a value from the configuration file for the given environment, command and section @@ -153,6 +159,10 @@ def sanity_check(self): def exists(self): return self.filepath.exists() + def _ensure_exists(self): + self.filepath.parent.mkdir(parents=True, exist_ok=True) + self.filepath.touch() + def path(self): return str(self.filepath) @@ -183,8 +193,8 @@ def _read(self): def _write(self): if not self.document: return - if not self.exists(): - open(self.filepath, "a+").close() + + self._ensure_exists() current_version = self._version() if self._version() else SAM_CONFIG_VERSION try: diff --git a/samcli/lib/cookiecutter/exceptions.py b/samcli/lib/cookiecutter/exceptions.py index af19364811..5d379228d8 100644 --- a/samcli/lib/cookiecutter/exceptions.py +++ b/samcli/lib/cookiecutter/exceptions.py @@ -4,8 +4,8 @@ class CookiecutterErrorException(Exception): fmt = "An unspecified error occurred" - def __init__(self, **kwargs): - msg = self.fmt.format(**kwargs) + def __init__(self, **kwargs): # type: ignore + msg: str = self.fmt.format(**kwargs) 
Exception.__init__(self, msg) self.kwargs = kwargs diff --git a/samcli/lib/cookiecutter/interactive_flow.py b/samcli/lib/cookiecutter/interactive_flow.py index 486e8c4d30..95ce846dc0 100644 --- a/samcli/lib/cookiecutter/interactive_flow.py +++ b/samcli/lib/cookiecutter/interactive_flow.py @@ -1,6 +1,10 @@ """A flow of questions to be asked to the user in an interactive way.""" -from typing import Any, Dict, Optional +from typing import Any, Dict, Optional, List, Tuple + +import click + from .question import Question +from ..utils.colors import Colored class InteractiveFlow: @@ -19,6 +23,7 @@ def __init__(self, questions: Dict[str, Question], first_question_key: str): self._questions: Dict[str, Question] = questions self._first_question_key: str = first_question_key self._current_question: Optional[Question] = None + self._color = Colored() def advance_to_next_question(self, current_answer: Optional[Any] = None) -> Optional[Question]: """ @@ -40,7 +45,10 @@ def advance_to_next_question(self, current_answer: Optional[Any] = None) -> Opti self._current_question = self._questions.get(next_question_key) if next_question_key else None return self._current_question - def run(self, context: Dict) -> Dict: + def run( + self, + context: Dict, + ) -> Dict: """ starts the flow, collects user's answers to the question and return a new copy of the passed context with the answers appended to the copy @@ -49,14 +57,33 @@ def run(self, context: Dict) -> Dict: ---------- context: Dict The cookiecutter context before prompting this flow's questions + The context can be used to provide default values, and support both str keys and List[str] keys. 
- Returns: A new copy of the context with user's answers added to the copy such that each answer is - associated to the key of the corresponding question + Returns + ------- + A new copy of the context with user's answers added to the copy such that each answer is + associated to the key of the corresponding question """ context = context.copy() + answers: List[Tuple[str, Any]] = [] + question = self.advance_to_next_question() while question: - answer = question.ask() + answer = question.ask(context=context) context[question.key] = answer + answers.append((question.key, answer)) question = self.advance_to_next_question(answer) + + # print summary + click.echo(self._color.bold("SUMMARY")) + click.echo("We will generate a pipeline config file based on the following information:") + + for question_key, answer in answers: + if answer is None: + # ignore unanswered questions + continue + + question = self._questions[question_key] + click.echo(f"\t{question.text}: {self._color.underline(str(answer))}") + return context diff --git a/samcli/lib/cookiecutter/interactive_flow_creator.py b/samcli/lib/cookiecutter/interactive_flow_creator.py index d1a227f1c8..b3552d4065 100644 --- a/samcli/lib/cookiecutter/interactive_flow_creator.py +++ b/samcli/lib/cookiecutter/interactive_flow_creator.py @@ -17,7 +17,7 @@ class QuestionsFailedParsingException(UserException): class InteractiveFlowCreator: @staticmethod - def create_flow(flow_definition_path: str, extra_context: Optional[Dict] = None): + def create_flow(flow_definition_path: str, extra_context: Optional[Dict] = None) -> InteractiveFlow: """ This method parses the given json/yaml file to create an InteractiveFLow. It expects the file to define a list of questions. 
It parses the questions and adds them to the flow in the same order they are defined @@ -42,6 +42,19 @@ def create_flow(flow_definition_path: str, extra_context: Optional[Dict] = None) "True": "key of the question to jump to if the user answered 'Yes'", "False": "key of the question to jump to if the user answered 'No'", } + "default": "default_answer", + # the default value can also be loaded from cookiecutter context + # with a key path whose items can themselves be resolved from cookiecutter context. + "default": { + "keyPath": [ + { + "valueOf": "key-of-another-question" + }, + "pipeline_user" + ] + } + # assuming the answer of "key-of-another-question" is "ABC", + # the default value will be loaded from cookiecutter context with key "['ABC', 'pipeline_user']" }, ... ] @@ -63,18 +76,21 @@ def _load_questions( questions: Dict[str, Question] = {} questions_definition = InteractiveFlowCreator._parse_questions_definition(flow_definition_path, extra_context) - for question in questions_definition.get("questions"): - q = QuestionFactory.create_question_from_json(question) - if not first_question_key: - first_question_key = q.key - elif previous_question and not previous_question.default_next_question_key: - previous_question.set_default_next_question_key(q.key) - questions[q.key] = q - previous_question = q - return questions, first_question_key + try: + for question in questions_definition.get("questions", []): + q = QuestionFactory.create_question_from_json(question) + if not first_question_key: + first_question_key = q.key + elif previous_question and not previous_question.default_next_question_key: + previous_question.set_default_next_question_key(q.key) + questions[q.key] = q + previous_question = q + return questions, first_question_key + except (KeyError, ValueError, AttributeError, TypeError) as ex: + raise QuestionsFailedParsingException(f"Failed to parse questions: {str(ex)}") from ex @staticmethod - def _parse_questions_definition(file_path, extra_context:
Optional[Dict] = None): + def _parse_questions_definition(file_path: str, extra_context: Optional[Dict] = None) -> Dict: """ Read the questions definition file, do variable substitution, parse it as JSON/YAML diff --git a/samcli/lib/cookiecutter/processor.py b/samcli/lib/cookiecutter/processor.py index 5994c77949..4f34df06f8 100644 --- a/samcli/lib/cookiecutter/processor.py +++ b/samcli/lib/cookiecutter/processor.py @@ -9,7 +9,7 @@ class Processor(ABC): """ @abstractmethod - def run(self, context: Dict): + def run(self, context: Dict) -> Dict: """ the processing logic of this processor diff --git a/samcli/lib/cookiecutter/question.py b/samcli/lib/cookiecutter/question.py index 71c30d98da..4fad0ea020 100644 --- a/samcli/lib/cookiecutter/question.py +++ b/samcli/lib/cookiecutter/question.py @@ -1,6 +1,8 @@ """ This module represents the questions to ask to the user to fulfill the cookiecutter context. """ +from abc import ABC, abstractmethod from enum import Enum -from typing import Any, Dict, List, Optional, Type +from typing import Any, Dict, List, Optional, Type, Union + import click @@ -13,7 +15,18 @@ class QuestionKind(Enum): default = "default" -class Question: +class Promptable(ABC): + """ + Abstract class that Question, Info, Choice and Confirm implement. + These classes need to implement their own prompt() method to prompt differently. + """ + + @abstractmethod + def prompt(self, text: str, default_answer: Optional[Any]) -> Any: + pass + + +class Question(Promptable): """ A question to be prompted to the user in an interactive flow where the response is used to fulfill the cookiecutter context. @@ -26,8 +39,10 @@ class Question: The text to prompt to the user _required: bool Whether the user must provide an answer for this question or not.
- _default_answer: Optional[str] - A default answer that is suggested to the user + _default_answer: Optional[Union[str, Dict]] + A default answer that is suggested to the user, + it can be directly provided (a string) + or resolved from cookiecutter context (a Dict, in the form of {"keyPath": [...,]}) _next_question_map: Optional[Dict[str, str]] A simple branching mechanism, it refers to what is the next question to ask the user if he answered a particular answer to this question. this map is in the form of {answer: next-question-key}. this @@ -48,14 +63,16 @@ def __init__( self, key: str, text: str, - default: Optional[str] = None, + default: Optional[Union[str, Dict]] = None, is_required: Optional[bool] = None, + allow_autofill: Optional[bool] = None, next_question_map: Optional[Dict[str, str]] = None, default_next_question_key: Optional[str] = None, ): self._key = key self._text = text self._required = is_required + self._allow_autofill = allow_autofill self._default_answer = default # if it is an optional question, set an empty default answer to prevent click from keep asking for an answer if not self._required and self._default_answer is None: @@ -64,31 +81,57 @@ def __init__( self._default_next_question_key = default_next_question_key @property - def key(self): + def key(self) -> str: return self._key @property - def text(self): + def text(self) -> str: return self._text @property - def default_answer(self): - return self._default_answer + def default_answer(self) -> Optional[Any]: + return self._resolve_default_answer() @property - def required(self): + def required(self) -> Optional[bool]: return self._required @property - def next_question_map(self): + def next_question_map(self) -> Dict[str, str]: return self._next_question_map @property - def default_next_question_key(self): + def default_next_question_key(self) -> Optional[str]: return self._default_next_question_key - def ask(self) -> Any: - return click.prompt(text=self._text, 
default=self._default_answer) + def ask(self, context: Optional[Dict] = None) -> Any: + """ + Prompt the user with this question + + Parameters + ---------- + context + The cookiecutter context dictionary containing previous questions' answers and default values + + Returns + ------- + The user-provided answer. + """ + resolved_default_answer = self._resolve_default_answer(context) + + # skip the question and directly use the default value if autofill is allowed. + if resolved_default_answer is not None and self._allow_autofill: + return resolved_default_answer + + # if it is an optional question with no default answer, + # set an empty default answer to prevent click from repeatedly asking for an answer + if not self._required and resolved_default_answer is None: + resolved_default_answer = "" + + return self.prompt(self._resolve_text(context), resolved_default_answer) + + def prompt(self, text: str, default_answer: Optional[Any]) -> Any: + return click.prompt(text=text, default=default_answer) def get_next_question_key(self, answer: Any) -> Optional[str]: # _next_question_map is a Dict[str(answer), str(next question key)] @@ -96,18 +139,97 @@ def get_next_question_key(self, answer: Any) -> Optional[str]: answer = str(answer) return self._next_question_map.get(answer, self._default_next_question_key) - def set_default_next_question_key(self, next_question_key): + def set_default_next_question_key(self, next_question_key: str) -> None: self._default_next_question_key = next_question_key + def _resolve_key_path(self, key_path: List, context: Dict) -> List[str]: + """ + key_path is a list whose elements are str or Dict. + When an element is a dict, in the form of { "valueOf": question_key }, + it refers to the answer to another question.
+ _resolve_key_path() will replace such dict with the actual question answer + + Parameters + ---------- + key_path + The key_path list containing str and dict + context + The cookiecutter context containing answers to previous answered questions + Returns + ------- + The key_path list containing only str + """ + resolved_key_path: List[str] = [] + for unresolved_key in key_path: + if isinstance(unresolved_key, str): + resolved_key_path.append(unresolved_key) + elif isinstance(unresolved_key, dict): + if "valueOf" not in unresolved_key: + raise KeyError(f'Missing key "valueOf" in question default keyPath element "{unresolved_key}".') + query_question_key: str = unresolved_key.get("valueOf", "") + if query_question_key not in context: + raise KeyError( + f'Invalid question key "{query_question_key}" referenced ' + f"in default answer of question {self.key}" + ) + resolved_key_path.append(context[query_question_key]) + else: + raise ValueError(f'Invalid value "{unresolved_key}" in key path') + return resolved_key_path + + def _resolve_value_from_expression(self, expression: Any, context: Optional[Dict] = None) -> Optional[Any]: + """ + a question may have a value provided directly as string or number value + or indirectly from cookiecutter context using a key path + + Parameters + ---------- + context + Cookiecutter context used to resolve values. + + Raises + ------ + KeyError + When an expression depends on the answer to a non-existent question + ValueError + The expression is malformed + + Returns + ------- + Optional value, it might be resolved from cookiecutter context using specified key path. 
+ + """ + if isinstance(expression, dict): + context = context if context else {} + + # load value using key path from cookiecutter + if "keyPath" not in expression: + raise KeyError(f'Missing key "keyPath" in "{expression}".') + unresolved_key_path = expression.get("keyPath", []) + if not isinstance(unresolved_key_path, list): + raise ValueError(f'Invalid expression "{expression}" in question {self.key}') + + return context.get(str(self._resolve_key_path(unresolved_key_path, context))) + return expression + + def _resolve_text(self, context: Optional[Dict] = None) -> str: + resolved_text = self._resolve_value_from_expression(self._text, context) + if resolved_text is None: + raise ValueError(f"Cannot resolve value from expression: {self._text}") + return str(resolved_text) + + def _resolve_default_answer(self, context: Optional[Dict] = None) -> Optional[Any]: + return self._resolve_value_from_expression(self._default_answer, context) + class Info(Question): - def ask(self) -> None: - return click.echo(message=self._text) + def prompt(self, text: str, default_answer: Optional[Any]) -> Any: + return click.echo(message=text) class Confirm(Question): - def ask(self) -> bool: - return click.confirm(text=self._text) + def prompt(self, text: str, default_answer: Optional[Any]) -> Any: + return click.confirm(text=text) class Choice(Question): @@ -118,25 +240,27 @@ def __init__( options: List[str], default: Optional[str] = None, is_required: Optional[bool] = None, + allow_autofill: Optional[bool] = None, next_question_map: Optional[Dict[str, str]] = None, default_next_question_key: Optional[str] = None, ): if not options: raise ValueError("No defined options") self._options = options - super().__init__(key, text, default, is_required, next_question_map, default_next_question_key) + super().__init__(key, text, default, is_required, allow_autofill, next_question_map, default_next_question_key) - def ask(self) -> str: - click.echo(self._text) + def prompt(self, text: str, 
default_answer: Optional[Any]) -> Any: + click.echo(text) for index, option in enumerate(self._options): click.echo(f"\t{index + 1} - {option}") options_indexes = self._get_options_indexes(base=1) choices = list(map(str, options_indexes)) choice = click.prompt( text="Choice", - default=self._default_answer, + default=default_answer, show_choices=False, type=click.Choice(choices), + show_default=default_answer is not None, ) return self._options[int(choice) - 1] @@ -145,7 +269,6 @@ def _get_options_indexes(self, base: int = 0) -> List[int]: class QuestionFactory: - question_classes: Dict[QuestionKind, Type[Question]] = { QuestionKind.info: Info, QuestionKind.choice: Choice, @@ -160,6 +283,7 @@ def create_question_from_json(question_json: Dict) -> Question: options = question_json.get("options") default = question_json.get("default") is_required = question_json.get("isRequired") + allow_autofill = question_json.get("allowAutofill") next_question_map = question_json.get("nextQuestion") default_next_question = question_json.get("defaultNextQuestion") kind_str = question_json.get("kind") @@ -171,6 +295,7 @@ def create_question_from_json(question_json: Dict) -> Question: "text": text, "default": default, "is_required": is_required, + "allow_autofill": allow_autofill, "next_question_map": next_question_map, "default_next_question_key": default_next_question, } diff --git a/samcli/lib/cookiecutter/template.py b/samcli/lib/cookiecutter/template.py index c7d643bb43..46b851985e 100644 --- a/samcli/lib/cookiecutter/template.py +++ b/samcli/lib/cookiecutter/template.py @@ -3,15 +3,17 @@ values of the context and how to generate a project from the given template and provided context """ import logging -from typing import Any, Dict, List, Optional +from typing import Dict, List, Optional + from cookiecutter.exceptions import RepositoryNotFound, UnknownRepoType from cookiecutter.main import cookiecutter + from samcli.commands.exceptions import UserException from 
samcli.lib.init.arbitrary_project import generate_non_cookiecutter_project +from .exceptions import GenerateProjectFailedError, InvalidLocationError, PreprocessingError, PostprocessingError from .interactive_flow import InteractiveFlow from .plugin import Plugin from .processor import Processor -from .exceptions import GenerateProjectFailedError, InvalidLocationError, PreprocessingError, PostprocessingError LOG = logging.getLogger(__name__) @@ -41,6 +43,8 @@ class Template: An optional series of plugins to be plugged in. A plugin defines its own interactive_flow, preprocessor and postprocessor. A plugin is a sub-set of the template, if there is a common behavior among multiple templates, it is better to be extracted to a plugin that can then be plugged in to each of these templates. + metadata: Optional[Dict] + An optional dictionary with extra information about the template Methods ------- @@ -61,6 +65,7 @@ def __init__( preprocessors: Optional[List[Processor]] = None, postprocessors: Optional[List[Processor]] = None, plugins: Optional[List[Plugin]] = None, + metadata: Optional[Dict] = None, ): """ Initialize the class @@ -84,6 +89,8 @@ def __init__( An optional series of plugins to be plugged in. A plugin defines its own interactive_flow, preprocessor and postprocessor. A plugin is a sub-set of the template, if there is a common behavior among multiple templates, it is better to be extracted to a plugin that can then be plugged in to each of these templates. 
+ metadata: Optional[Dict] + An optional dictionary with extra information about the template """ self._location = location self._interactive_flows = interactive_flows or [] @@ -97,8 +104,9 @@ def __init__( self._preprocessors.append(plugin.preprocessor) if plugin.postprocessor: self._postprocessors.append(plugin.postprocessor) + self.metadata = metadata - def run_interactive_flows(self) -> Dict: + def run_interactive_flows(self, context: Optional[Dict] = None) -> Dict: """ prompt the user a series of questions' flows and gather the answers to create the cookiecutter context. The questions are identified by keys. If multiple questions, whether within the same flow or across @@ -112,14 +120,14 @@ def run_interactive_flows(self) -> Dict: A Dictionary in the form of {question.key: answer} representing user's answers to the flows' questions """ try: - context: Dict[str, Any] = {} + context = context if context else {} for flow in self._interactive_flows: context = flow.run(context) return context except Exception as e: raise UserException(str(e), wrapped_from=e.__class__.__name__) from e - def generate_project(self, context: Dict): + def generate_project(self, context: Dict, output_dir: str) -> None: """ Generates a project based on this cookiecutter template and the given context. 
The context is first processed and manipulated by a series of preprocessors (if any), then the project is generated and finally @@ -129,6 +137,8 @@ ---------- context: Dict the cookiecutter context to fulfill the values of cookiecutter.json keys + output_dir: str + the directory where the project will be generated Raise: ------ @@ -144,7 +154,13 @@ try: LOG.debug("Baking a new template with cookiecutter with all parameters") - cookiecutter(template=self._location, output_dir=".", no_input=True, extra_context=context) + cookiecutter( + template=self._location, + output_dir=output_dir, + no_input=True, + extra_context=context, + overwrite_if_exists=True, + ) except RepositoryNotFound as e: # cookiecutter.json is not found in the template. Let's just clone it directly without # using cookiecutter and call it done. diff --git a/samcli/lib/deploy/deployer.py b/samcli/lib/deploy/deployer.py index 8aae03425e..540535def2 100644 --- a/samcli/lib/deploy/deployer.py +++ b/samcli/lib/deploy/deployer.py @@ -84,6 +84,7 @@ def __init__(self, cloudformation_client, changeset_prefix="samcli-deploy"): self.max_attempts = 3 self.deploy_color = DeployColor() + # pylint: disable=inconsistent-return-statements def has_stack(self, stack_name): """ Checks if a CloudFormation stack with given name exists @@ -113,6 +114,7 @@ def has_stack(self, stack_name): if "Stack with id {0} does not exist".format(stack_name) in str(e): LOG.debug("Stack with id %s does not exist", stack_name) return False + return None except botocore.exceptions.BotoCoreError as e: # If there are credentials, environment errors, # catch that and throw a deploy failed error.
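The explicit `return None` added to `has_stack` above makes the method's tri-state contract visible instead of relying on an implicit fall-through (hence the accompanying `pylint: disable=inconsistent-return-statements`). A minimal sketch of that contract, using a stubbed describe call and a stand-in exception rather than the real boto3 client (the helper names here are illustrative, not SAM CLI code):

```python
from typing import Callable, Optional


class ClientError(Exception):
    """Stand-in for botocore.exceptions.ClientError."""


def has_stack(describe_stacks: Callable[[str], None], stack_name: str) -> Optional[bool]:
    # True:  the describe call succeeded, so the stack exists.
    # False: the service explicitly reports the stack does not exist.
    # None:  a ClientError we cannot classify (previously an implicit fall-through).
    try:
        describe_stacks(stack_name)
        return True
    except ClientError as e:
        if "Stack with id {0} does not exist".format(stack_name) in str(e):
            return False
        return None


def missing(name: str) -> None:
    raise ClientError("Stack with id {0} does not exist".format(name))


def throttled(name: str) -> None:
    raise ClientError("Rate exceeded")


assert has_stack(lambda name: None, "app") is True
assert has_stack(missing, "app") is False
assert has_stack(throttled, "app") is None
```

Callers can then distinguish "definitely absent" from "could not determine", which an implicit `None` return obscured.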
diff --git a/samcli/lib/generated_sample_events/event-mapping.json b/samcli/lib/generated_sample_events/event-mapping.json index 4a12ee5d90..e72721bacb 100644 --- a/samcli/lib/generated_sample_events/event-mapping.json +++ b/samcli/lib/generated_sample_events/event-mapping.json @@ -992,7 +992,16 @@ "error": { "filename": "StepFunctionsError", "help": "Generates an AWS StepFunctions Error Event", - "tags": {} + "tags": { + "error": { + "type": "string", + "default": "ErrorName" + }, + "cause": { + "type": "string", + "default": "This is the cause of the error." + } + } } } } diff --git a/samcli/lib/generated_sample_events/events/stepfunctions/StepFunctionsError.json b/samcli/lib/generated_sample_events/events/stepfunctions/StepFunctionsError.json index 9e26dfeeb6..313d614c28 100644 --- a/samcli/lib/generated_sample_events/events/stepfunctions/StepFunctionsError.json +++ b/samcli/lib/generated_sample_events/events/stepfunctions/StepFunctionsError.json @@ -1 +1,4 @@ -{} \ No newline at end of file +{ + "Error": "{{{error}}}", + "Cause": "{{{cause}}}" +} \ No newline at end of file diff --git a/samcli/lib/iac/utils/helpers.py b/samcli/lib/iac/utils/helpers.py index c929524acc..57d9e2e7af 100644 --- a/samcli/lib/iac/utils/helpers.py +++ b/samcli/lib/iac/utils/helpers.py @@ -30,7 +30,7 @@ def get_iac_plugin(project_type, command_params, with_build): lookup_paths = [] if with_build: - from samcli.commands.build.command import DEFAULT_BUILD_DIR + from samcli.commands.build.build_constants import DEFAULT_BUILD_DIR # is this correct? 
--build-dir is only used for "build" (for writing) # but with_build is True for "local" commands only diff --git a/samcli/lib/logs/event.py b/samcli/lib/logs/event.py deleted file mode 100644 index 0c05232d33..0000000000 --- a/samcli/lib/logs/event.py +++ /dev/null @@ -1,72 +0,0 @@ -""" -Represents CloudWatch Log Event -""" - -import logging - -from samcli.lib.utils.time import timestamp_to_iso - -LOG = logging.getLogger(__name__) - - -class LogEvent: - """ - Data object representing a CloudWatch Log Event - """ - - log_group_name = None - log_stream_name = None - timestamp = None - message = None - - def __init__(self, log_group_name, event_dict): - """ - Creates instance of the class - - Parameters - ---------- - log_group_name : str - The log group name - event_dict : dict - Dict of log event data returned by CloudWatch Logs API. - https://docs.aws.amazon.com/AmazonCloudWatchLogs/latest/APIReference/API_FilteredLogEvent.html - """ - - self.log_group_name = log_group_name - - if not event_dict: - # If event is empty, just use default values for properties. We don't raise an error here because - # this class is a data wrapper to the `events_dict`. It doesn't try to be smart. - return - - self.log_stream_name = event_dict.get("logStreamName") - self.message = event_dict.get("message", "") - - self.timestamp_millis = event_dict.get("timestamp") - - # Convert the timestamp from epoch to readable ISO timestamp, easier for formatting.
- if self.timestamp_millis: - self.timestamp = timestamp_to_iso(int(self.timestamp_millis)) - - def __eq__(self, other): - - if not isinstance(other, LogEvent): - return False - - return ( - self.log_group_name == other.log_group_name - and self.log_stream_name == other.log_stream_name - and self.timestamp == other.timestamp - and self.message == other.message - ) - - def __repr__(self): # pragma: no cover - # Used to print pretty diff when testing - return str( - { - "log_group_name": self.log_group_name, - "log_stream_name": self.log_stream_name, - "message": self.message, - "timestamp": self.timestamp, - } - ) diff --git a/samcli/lib/logs/fetcher.py b/samcli/lib/logs/fetcher.py deleted file mode 100644 index c6709fe28e..0000000000 --- a/samcli/lib/logs/fetcher.py +++ /dev/null @@ -1,145 +0,0 @@ -""" -Filters & fetches logs from CloudWatch Logs -""" - -import time -import logging - -from samcli.lib.utils.time import to_timestamp, to_datetime -from .event import LogEvent - - -LOG = logging.getLogger(__name__) - - -class LogsFetcher: - """ - Fetch logs from a CloudWatch Logs group with the ability to scope to a particular time, filter by - a pattern, and in the future possibly multiplex from from multiple streams together. - """ - - def __init__(self, cw_client=None): - """ - Initialize the fetcher - - Parameters - ---------- - cw_client - CloudWatch Logs Client from AWS SDK - """ - self.cw_client = cw_client - - def fetch(self, log_group_name, start=None, end=None, filter_pattern=None): - """ - Fetch logs from all streams under the given CloudWatch Log Group and yields in the output. Optionally, caller - can filter the logs using a pattern or a start/end time. - - Parameters - ---------- - log_group_name : string - Name of CloudWatch Logs Group to query. - start : datetime.datetime - Optional start time for logs. - end : datetime.datetime - Optional end time for logs. - filter_pattern : str - Expression to filter the logs by. 
This is passed directly to CloudWatch, so any expression supported by - CloudWatch Logs API is supported here. - - Yields - ------ - - samcli.lib.logs.event.LogEvent - Object containing the information from each log event returned by CloudWatch Logs - """ - - kwargs = {"logGroupName": log_group_name, "interleaved": True} - - if start: - kwargs["startTime"] = to_timestamp(start) - - if end: - kwargs["endTime"] = to_timestamp(end) - - if filter_pattern: - kwargs["filterPattern"] = filter_pattern - - while True: - LOG.debug("Fetching logs from CloudWatch with parameters %s", kwargs) - result = self.cw_client.filter_log_events(**kwargs) - - # Several events will be returned. Yield one at a time - for event in result.get("events", []): - yield LogEvent(log_group_name, event) - - # Keep iterating until there are no more logs left to query. - next_token = result.get("nextToken", None) - kwargs["nextToken"] = next_token - if not next_token: - break - - def tail(self, log_group_name, start=None, filter_pattern=None, max_retries=1000, poll_interval=0.3): - """ - ** This is a long blocking call ** - - Fetches logs from CloudWatch logs similar to the ``fetch`` method, but instead of stopping after all logs have - been fetched, this method continues to poll CloudWatch for new logs. So this essentially simulates the - ``tail -f`` bash command. - - If no logs are available, then it keep polling for ``timeout`` number of seconds before exiting. This method - polls CloudWatch at around ~3 Calls Per Second to stay below the 5TPS limit. - - Parameters - ---------- - log_group_name : str - Name of CloudWatch Logs Group to query. - start : datetime.datetime - Optional start time for logs. Defaults to '5m ago' - filter_pattern : str - Expression to filter the logs by. This is passed directly to CloudWatch, so any expression supported by - CloudWatch Logs API is supported here. 
- max_retries : int - When logs are not available, this value determines the number of times to retry fetching logs before giving - up. This counter is reset every time new logs are available. - poll_interval : float - Number of fractional seconds wait before polling again. Defaults to 300milliseconds. - If no new logs available, this method will stop polling after ``max_retries * poll_interval`` seconds - - Yields - ------ - samcli.lib.logs.event.LogEvent - Object containing the information from each log event returned by CloudWatch Logs - """ - - # On every poll, startTime of the API call is the timestamp of last record observed - latest_event_time = 0 # Start of epoch - if start: - latest_event_time = to_timestamp(start) - - counter = max_retries - while counter > 0: - - LOG.debug("Tailing logs from %s starting at %s", log_group_name, str(latest_event_time)) - - has_data = False - counter -= 1 - events_itr = self.fetch(log_group_name, start=to_datetime(latest_event_time), filter_pattern=filter_pattern) - - # Find the timestamp of the most recent log event. - for event in events_itr: - has_data = True - - if event.timestamp_millis > latest_event_time: - latest_event_time = event.timestamp_millis - - # Yield the event back so it behaves similar to ``fetch`` - yield event - - # This poll fetched logs. Reset the retry counter and set the timestamp for next poll - if has_data: - counter = max_retries - latest_event_time += 1 # one extra millisecond to fetch next log event - - # We already fetched logs once. Sleep for some time before querying again. - # This also helps us scoot under the TPS limit for CloudWatch API call. 
- time.sleep(poll_interval) diff --git a/samcli/lib/logs/formatter.py b/samcli/lib/logs/formatter.py deleted file mode 100644 index 6e21619f36..0000000000 --- a/samcli/lib/logs/formatter.py +++ /dev/null @@ -1,181 +0,0 @@ -""" -Format log events produced by CloudWatch Logs -""" - -import json -import functools - - -class LogsFormatter: - """ - Formats log messages returned by CloudWatch Logs service. - """ - - def __init__(self, colored, formatter_chain=None): - # the docstring contains an example function which contains another docstring, - # pylint is confused so disable it for this method. - # pylint: disable=missing-param-doc,differing-param-doc,differing-type-doc,redundant-returns-doc - """ - - ``formatter_chain`` is a list of methods that can format an event. Each method must take an - ``samcli.lib.logs.event.LogEvent`` object as input and return the same object back. This allows us to easily - chain formatter methods one after another. This class will apply all the formatters from this list on each - log event. - - After running the formatter chain, this class will convert the event object to string by appending - the timestamp to message. To skip all custom formatting and simply convert event to string, you can leave - the ``formatter_chain`` list empty. - - Formatter Method - ================ - Formatter method needs to accept two arguments at a minimum: ``event`` and ``colored``. It can make - modifications to the contents of ``event`` and must return the same object. - - Example: - .. code-block:: python - - def my_formatter(event, colored): - \""" - Example of a custom log formatter - - Parameters - ---------- - event : samcli.lib.logs.event.LogEvent - Log event to format - - colored : samcli.lib.utils.colors.Colored - Instance of ``Colored`` object to add colors to the message - - Returns - ------- - samcli.lib.logs.event.LogEvent - Object representing the log event that has been formatted. It could be the same event object passed - via input. 
- \""" - - # Do your formatting - - return event - - Parameters - ---------- - colored : samcli.lib.utils.colors.Colored - Used to add color to the string when pretty printing. Colors are useful only when pretty printing on a - Terminal. To turn off coloring, set the appropriate property when instantiating the - ``samcli.lib.utils.colors.Colored`` class. - - formatter_chain : List[str] - list of formatter methods - """ - - self.colored = colored - self.formatter_chain = formatter_chain or [] - - # At end of the chain, pretty print the Event object as string. - self.formatter_chain.append(LogsFormatter._pretty_print_event) - - def do_format(self, event_iterable): - """ - Formats the given CloudWatch Logs Event dictionary as necessary and returns an iterable that will - return the formatted string. This can be used to parse and format the events based on context - ie. In Lambda Function logs, a formatter may wish to color the "ERROR" keywords red, - or highlight a filter keyword separately etc. - - This method takes an iterable as input and returns an iterable. It does not immediately format the event. - Instead, it sets up the formatter chain appropriately and returns the iterable. Actual formatting happens - only when the iterable is used by the caller. - - Parameters - ---------- - event_iterable : iterable of samcli.lib.logs.event.LogEvent - Iterable that returns an object containing information about each log event. - - Returns - ------- - iterable of string - Iterable that returns a formatted event as a string. 
- """ - - for operation in self.formatter_chain: - - # Make sure the operation has access to certain basic objects like colored - partial_op = functools.partial(operation, colored=self.colored) - event_iterable = map(partial_op, event_iterable) - - return event_iterable - - @staticmethod - def _pretty_print_event(event, colored): - """ - Basic formatter to convert an event object to string - """ - event.timestamp = colored.yellow(event.timestamp) - event.log_stream_name = colored.cyan(event.log_stream_name) - - return " ".join([event.log_stream_name, event.timestamp, event.message]) - - -class LambdaLogMsgFormatters: - """ - Format logs printed by AWS Lambda functions. - - This class is a collection of static methods that can be used within a formatter chain. - """ - - @staticmethod - def colorize_errors(event, colored): - """ - Highlights some commonly known Lambda error cases in red: - - Nodejs process crashes - - Lambda function timeouts - """ - - nodejs_crash_msg = "Process exited before completing request" - timeout_msg = "Task timed out" - - if nodejs_crash_msg in event.message or timeout_msg in event.message: - event.message = colored.red(event.message) - - return event - - -class KeywordHighlighter: - """ - Highlight certain keywords in the log line - """ - - def __init__(self, keyword=None): - self.keyword = keyword - - def highlight_keywords(self, event, colored): - """ - Highlight the keyword in the log statement by drawing an underline - """ - if self.keyword: - highlight = colored.underline(self.keyword) - event.message = event.message.replace(self.keyword, highlight) - - return event - - -class JSONMsgFormatter: - """ - Pretty print JSONs within a message - """ - - @staticmethod - def format_json(event, colored): - """ - If the event message is a JSON string, then pretty print the JSON with 2 indents and sort the keys. 
This makes - it very easy to visually parse and search JSON data - """ - - try: - if event.message.startswith("{"): - msg_dict = json.loads(event.message) - event.message = json.dumps(msg_dict, indent=2) - except Exception: - # Skip if the event message was not JSON - pass - - return event diff --git a/samcli/lib/observability/__init__.py b/samcli/lib/observability/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/samcli/lib/observability/cw_logs/__init__.py b/samcli/lib/observability/cw_logs/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/samcli/lib/observability/cw_logs/cw_log_event.py b/samcli/lib/observability/cw_logs/cw_log_event.py new file mode 100644 index 0000000000..49b9a4e889 --- /dev/null +++ b/samcli/lib/observability/cw_logs/cw_log_event.py @@ -0,0 +1,40 @@ +""" +CloudWatch log event type +""" +from typing import Optional + +from samcli.lib.observability.observability_info_puller import ObservabilityEvent + + +class CWLogEvent(ObservabilityEvent[dict]): + """ + An event class which represents a Cloud Watch log + """ + + def __init__(self, cw_log_group: str, event: dict, resource_name: Optional[str] = None): + """ + Parameters + ---------- + cw_log_group : str + Name of the CloudWatch log group + event : dict + Event dictionary of the CloudWatch log event + resource_name : Optional[str] + Resource name that is related to this CloudWatch log event + """ + self.cw_log_group = cw_log_group + self.message: str = event.get("message", "") + self.log_stream_name: str = event.get("logStreamName", "") + timestamp: int = event.get("timestamp", 0) + super().__init__(event, timestamp, resource_name) + + def __eq__(self, other): + if not isinstance(other, CWLogEvent): + return False + + return ( + self.cw_log_group == other.cw_log_group + and self.log_stream_name == other.log_stream_name + and self.timestamp == other.timestamp + and self.message == other.message + ) diff --git 
a/samcli/lib/observability/cw_logs/cw_log_formatters.py b/samcli/lib/observability/cw_logs/cw_log_formatters.py new file mode 100644 index 0000000000..f0d35a18a6 --- /dev/null +++ b/samcli/lib/observability/cw_logs/cw_log_formatters.py @@ -0,0 +1,94 @@ +""" +Contains all mappers (formatters) for CloudWatch logs +""" +import json +import logging +from json import JSONDecodeError + +from samcli.lib.observability.cw_logs.cw_log_event import CWLogEvent +from samcli.lib.observability.observability_info_puller import ObservabilityEventMapper +from samcli.lib.utils.colors import Colored +from samcli.lib.utils.time import timestamp_to_iso + +LOG = logging.getLogger(__name__) + + +class CWKeywordHighlighterFormatter(ObservabilityEventMapper[CWLogEvent]): + """ + Mapper implementation which will highlight given keywords in CloudWatch logs + """ + + def __init__(self, colored: Colored, keyword=None): + """ + Parameters + ---------- + colored : Colored + Colored class that will be used to highlight the keywords in log event + keyword : str + Keyword that will be highlighted + """ + self._keyword = keyword + self._colored = colored + + def map(self, event: CWLogEvent) -> CWLogEvent: + if self._keyword: + highlight = self._colored.underline(self._keyword) + event.message = event.message.replace(self._keyword, highlight) + + return event + + +class CWColorizeErrorsFormatter(ObservabilityEventMapper[CWLogEvent]): + """ + Mapper implementation which will colorize some pre-defined error messages + """ + + # couple of pre-defined error messages for lambda functions which will be colorized when getting the logs + NODEJS_CRASH_MESSAGE = "Process exited before completing request" + TIMEOUT_MSG = "Task timed out" + + def __init__(self, colored: Colored): + self._colored = colored + + def map(self, event: CWLogEvent) -> CWLogEvent: + if ( + CWColorizeErrorsFormatter.NODEJS_CRASH_MESSAGE in event.message + or CWColorizeErrorsFormatter.TIMEOUT_MSG in event.message + ): + event.message = 
self._colored.red(event.message) + return event + + +class CWJsonFormatter(ObservabilityEventMapper[CWLogEvent]): + """ + Mapper implementation which will auto indent the input if the input is a JSON object + """ + + # pylint: disable=R0201 + # Pylint recommends converting this method to a static one but we want it to stay as it is + # since formatters/mappers are combined in an array of ObservabilityEventMapper class + def map(self, event: CWLogEvent) -> CWLogEvent: + try: + if event.message.startswith("{"): + msg_dict = json.loads(event.message) + event.message = json.dumps(msg_dict, indent=2) + except JSONDecodeError as err: + LOG.debug("Can't decode string (%s) as JSON. Error (%s)", event.message, err) + + return event + + +class CWPrettyPrintFormatter(ObservabilityEventMapper[CWLogEvent]): + """ + Mapper implementation which will format given CloudWatch log event into string with coloring + log stream name and timestamp + """ + + def __init__(self, colored: Colored): + self._colored = colored + + def map(self, event: CWLogEvent) -> CWLogEvent: + timestamp = self._colored.yellow(timestamp_to_iso(int(event.timestamp))) + log_stream_name = self._colored.cyan(event.log_stream_name) + event.message = f"{log_stream_name} {timestamp} {event.message}" + return event diff --git a/samcli/lib/logs/provider.py b/samcli/lib/observability/cw_logs/cw_log_group_provider.py similarity index 100% rename from samcli/lib/logs/provider.py rename to samcli/lib/observability/cw_logs/cw_log_group_provider.py diff --git a/samcli/lib/observability/cw_logs/cw_log_puller.py b/samcli/lib/observability/cw_logs/cw_log_puller.py new file mode 100644 index 0000000000..e7d8b7fb10 --- /dev/null +++ b/samcli/lib/observability/cw_logs/cw_log_puller.py @@ -0,0 +1,111 @@ +""" +CloudWatch log event puller implementation +""" +import logging +import time +from datetime import datetime +from typing import Optional, Any + +from samcli.lib.observability.cw_logs.cw_log_event import CWLogEvent +from 
samcli.lib.observability.observability_info_puller import ObservabilityPuller, ObservabilityEventConsumer +from samcli.lib.utils.time import to_timestamp, to_datetime + +LOG = logging.getLogger(__name__) + + +class CWLogPuller(ObservabilityPuller): + """ + Puller implementation that can pull events from CloudWatch log group + """ + + def __init__( + self, + logs_client: Any, + consumer: ObservabilityEventConsumer, + cw_log_group: str, + resource_name: Optional[str] = None, + max_retries: int = 1000, + poll_interval: int = 1, + ): + """ + Parameters + ---------- + logs_client: Any + boto3 logs client instance + consumer : ObservabilityEventConsumer + Consumer instance that will process pulled events + cw_log_group : str + CloudWatch log group name + resource_name : Optional[str] + Optional parameter to assign a resource name for each event. + max_retries: int + Optional parameter to set maximum retries when tailing. Default value is 1000 + poll_interval: int + Optional parameter to define sleep interval between pulling new log events when tailing. Default value is 1 + """ + self.logs_client = logs_client + self.consumer = consumer + self.cw_log_group = cw_log_group + self.resource_name = resource_name + self._max_retries = max_retries + self._poll_interval = poll_interval + self.latest_event_time = 0 + self.had_data = False + + def tail(self, start_time: Optional[datetime] = None, filter_pattern: Optional[str] = None): + if start_time: + self.latest_event_time = to_timestamp(start_time) + + counter = self._max_retries + while counter > 0: + LOG.debug("Tailing logs from %s starting at %s", self.cw_log_group, str(self.latest_event_time)) + + counter -= 1 + self.load_time_period(to_datetime(self.latest_event_time), filter_pattern=filter_pattern) + + # This poll fetched logs. 
Reset the retry counter and set the timestamp for next poll + if self.had_data: + counter = self._max_retries + self.latest_event_time += 1 # one extra millisecond to fetch next log event + self.had_data = False + + # We already fetched logs once. Sleep for some time before querying again. + # This also helps us scoot under the TPS limit for CloudWatch API call. + time.sleep(self._poll_interval) + + def load_time_period( + self, + start_time: Optional[datetime] = None, + end_time: Optional[datetime] = None, + filter_pattern: Optional[str] = None, + ): + kwargs = {"logGroupName": self.cw_log_group, "interleaved": True} + + if start_time: + kwargs["startTime"] = to_timestamp(start_time) + + if end_time: + kwargs["endTime"] = to_timestamp(end_time) + + if filter_pattern: + kwargs["filterPattern"] = filter_pattern + + while True: + LOG.debug("Fetching logs from CloudWatch with parameters %s", kwargs) + result = self.logs_client.filter_log_events(**kwargs) + + # Several events will be returned. Yield one at a time + for event in result.get("events", []): + self.had_data = True + cw_event = CWLogEvent(self.cw_log_group, event, self.resource_name) + + if cw_event.timestamp > self.latest_event_time: + self.latest_event_time = cw_event.timestamp + + self.consumer.consume(cw_event) + + # Keep iterating until there are no more logs left to query. 
+ next_token = result.get("nextToken", None) + kwargs["nextToken"] = next_token + if not next_token: + break diff --git a/samcli/lib/observability/observability_info_puller.py b/samcli/lib/observability/observability_info_puller.py new file mode 100644 index 0000000000..b6d6f2b906 --- /dev/null +++ b/samcli/lib/observability/observability_info_puller.py @@ -0,0 +1,143 @@ +""" +Interfaces and generic implementations for observability events (like CW logs) +""" +import logging +from abc import ABC, abstractmethod +from datetime import datetime +from typing import List, Optional, Generic, TypeVar, Any + +LOG = logging.getLogger(__name__) + +# Generic type for the internal observability event +InternalEventType = TypeVar("InternalEventType") + + +class ObservabilityEvent(Generic[InternalEventType]): + """ + Generic class that represents an observability event + This keeps some common fields for filtering or sorting later on + """ + + def __init__(self, event: InternalEventType, timestamp: int, resource_name: Optional[str] = None): + """ + Parameters + ---------- + event : InternalEventType + Actual event object. This can be any type with generic definition (dict, str etc.) + timestamp : int + Timestamp of the event + resource_name : Optional[str] + Resource name related to this event. This is optional since not all events are connected to a single resource + """ + self.event = event + self.timestamp = timestamp + self.resource_name = resource_name + + +# Generic type for identifying different ObservabilityEvent +ObservabilityEventType = TypeVar("ObservabilityEventType", bound=ObservabilityEvent) + + +class ObservabilityPuller(ABC): + """ + Interface definition for pulling observability information.
+ """ + + @abstractmethod + def tail(self, start_time: Optional[datetime] = None, filter_pattern: Optional[str] = None): + """ + Parameters + ---------- + start_time : Optional[datetime] + Optional parameter to tail information from earlier time + filter_pattern : Optional[str] + Optional parameter to filter events with given string + """ + + @abstractmethod + def load_time_period( + self, + start_time: Optional[datetime] = None, + end_time: Optional[datetime] = None, + filter_pattern: Optional[str] = None, + ): + """ + Parameters + ---------- + start_time : Optional[datetime] + Optional parameter to load events from certain date time + end_time : Optional[datetime] + Optional parameter to load events until certain date time + filter_pattern : Optional[str] + Optional parameter to filter events with given string + """ + + +# pylint: disable=fixme +# fixme add ABC parent class back once we bump the pylint to a version 2.8.2 or higher +class ObservabilityEventMapper(Generic[ObservabilityEventType]): + """ + Interface definition to map/change any event to another object + This could be used by highlighting certain parts or formatting events before logging into console + """ + + @abstractmethod + def map(self, event: ObservabilityEventType) -> Any: + """ + Parameters + ---------- + event : ObservabilityEventType + Event object that will be mapped/converted to another event or any object + + Returns + ------- + Any + Return converted type + """ + + +class ObservabilityEventConsumer(Generic[ObservabilityEventType]): + """ + Consumer interface, which will consume any event. + An example is to output event into console. + """ + + @abstractmethod + def consume(self, event: ObservabilityEventType): + """ + Parameters + ---------- + event : ObservabilityEvent + Event that will be consumed + """ + + +class ObservabilityEventConsumerDecorator(ObservabilityEventConsumer): + """ + A decorator implementation for consumer, which can have mappers and decorated consumer within. 
+ Unlike a plain consumer, this will first process the events through the mappers which have been + provided, and then pass them to the actual consumer + """ + + def __init__(self, mappers: List[ObservabilityEventMapper], consumer: ObservabilityEventConsumer): + """ + Parameters + ---------- + mappers : List[ObservabilityEventMapper] + List of event mappers which will be used to process events before passing to consumer + consumer : ObservabilityEventConsumer + Actual consumer which will handle the events after they are processed by mappers + """ + super().__init__() + self._mappers = mappers + self._consumer = consumer + + def consume(self, event: ObservabilityEvent): + """ + See Also ObservabilityEventConsumerDecorator and ObservabilityEventConsumer + """ + for mapper in self._mappers: + LOG.debug("Calling mapper (%s) for event (%s)", mapper, event) + event = mapper.map(event) + LOG.debug("Calling consumer (%s) for event (%s)", self._consumer, event) + self._consumer.consume(event) diff --git a/samcli/lib/package/artifact_exporter.py b/samcli/lib/package/artifact_exporter.py index 4fc268af49..658a4112c7 100644 --- a/samcli/lib/package/artifact_exporter.py +++ b/samcli/lib/package/artifact_exporter.py @@ -37,7 +37,13 @@ ) from samcli.lib.package.s3_uploader import S3Uploader from samcli.lib.package.uploaders import Uploaders -from samcli.lib.package.utils import is_local_folder, mktempfile, is_s3_url, is_local_file, make_abs_path +from samcli.lib.package.utils import ( + is_local_folder, + make_abs_path, + is_local_file, + mktempfile, + is_s3_url, +) from samcli.lib.utils.packagetype import ZIP from samcli.yamlhelper import yaml_dump from samcli.lib.iac.interface import Stack as IacStack, IacPlugin, Resource as IacResource, DictSectionItem, S3Asset @@ -86,12 +92,7 @@ def do_export(self, resource, parent_dir): return template_path = asset.source_path - if ( - template_path is None - or is_s3_url(template_path) - or
template_path.startswith(self.uploader.s3.meta.endpoint_url) - or template_path.startswith("https://s3.amazonaws.com/") - ): + if template_path is None or is_s3_url(template_path): # Nothing to do return diff --git a/samcli/lib/package/ecr_utils.py b/samcli/lib/package/ecr_utils.py index 6186d24099..f4bedc4a27 100644 --- a/samcli/lib/package/ecr_utils.py +++ b/samcli/lib/package/ecr_utils.py @@ -6,5 +6,5 @@ from samcli.lib.package.regexpr import ECR_URL -def is_ecr_url(url): +def is_ecr_url(url: str) -> bool: return bool(re.match(ECR_URL, url)) if url else False diff --git a/samcli/lib/package/packageable_resources.py b/samcli/lib/package/packageable_resources.py index d41b175647..ea87371f33 100644 --- a/samcli/lib/package/packageable_resources.py +++ b/samcli/lib/package/packageable_resources.py @@ -19,7 +19,7 @@ copy_to_temp_dir, upload_local_artifacts, upload_local_image_artifacts, - is_s3_url, + is_s3_protocol_url, is_path_value_valid, ) @@ -557,7 +557,7 @@ def include_transform_export_handler(template_dict, uploader, parent_dir): return template_dict include_location = template_dict.get("Parameters", {}).get("Location", None) - if not include_location or not is_path_value_valid(include_location) or is_s3_url(include_location): + if not include_location or not is_path_value_valid(include_location) or is_s3_protocol_url(include_location): # `include_location` is either empty, or not a string, or an S3 URI return template_dict diff --git a/samcli/lib/package/s3_uploader.py b/samcli/lib/package/s3_uploader.py index ce0dbf63dc..303890622a 100644 --- a/samcli/lib/package/s3_uploader.py +++ b/samcli/lib/package/s3_uploader.py @@ -85,7 +85,7 @@ def upload(self, file_name: str, remote_path: str) -> str: # Check if a file with same data exists if not self.force_upload and self.file_exists(remote_path): - LOG.debug("File with same data is already exists at %s. 
" "Skipping upload", remote_path) + LOG.info("File with same data already exists at %s, skipping upload", remote_path) return self.make_url(remote_path) try: diff --git a/samcli/lib/package/utils.py b/samcli/lib/package/utils.py index a0f6961bf8..f8d97e3f3b 100644 --- a/samcli/lib/package/utils.py +++ b/samcli/lib/package/utils.py @@ -4,6 +4,7 @@ import logging import os import platform +import re import shutil import tempfile import uuid @@ -23,6 +24,29 @@ LOG = logging.getLogger(__name__) +# https://docs.aws.amazon.com/AmazonS3/latest/dev-retired/UsingBucket.html +_REGION_PATTERN = r"[a-zA-Z0-9-]+" +_DOT_AMAZONAWS_COM_PATTERN = r"\.amazonaws\.com" +_S3_URL_REGEXS = [ + # Path-Style (and ipv6 dualstack) + # - https://s3.Region.amazonaws.com/bucket-name/key name + # - https://s3.amazonaws.com/bucket-name/key name (old, without region) + # - https://s3.dualstack.us-west-2.amazonaws.com/... + re.compile(rf"http(s)?://s3(.dualstack)?(\.{_REGION_PATTERN})?{_DOT_AMAZONAWS_COM_PATTERN}/.+/.+"), + # Virtual Hosted-Style (including two legacies) + # https://docs.aws.amazon.com/AmazonS3/latest/userguide/VirtualHosting.html + # - Virtual Hosted-Style: https://bucket-name.s3.Region.amazonaws.com/key name + # - Virtual Hosted-Style (Legacy: a dash between S3 and the Region): https://bucket-name.s3-Region.amazonaws.com/... + # - Virtual Hosted-Style (Legacy Global Endpoint): https://my-bucket.s3.amazonaws.com/... 
+ re.compile(rf"http(s)?://.+\.s3((.|-){_REGION_PATTERN})?{_DOT_AMAZONAWS_COM_PATTERN}/.+"), + # S3 access point: + # - https://AccessPointName-AccountId.s3-accesspoint.region.amazonaws.com + re.compile(rf"http(s)?://.+-\d+\.s3-accesspoint\.{_REGION_PATTERN}{_DOT_AMAZONAWS_COM_PATTERN}/.+/.+"), + # S3 protocol URL: + # - s3://bucket-name/key-name + re.compile(r"s3://.+/.+"), +] + def is_path_value_valid(path): return isinstance(path, str) @@ -34,7 +58,10 @@ def make_abs_path(directory, path): return path -def is_s3_url(url): +def is_s3_protocol_url(url): + """ + Check whether url is a valid path in the form of "s3://..." + """ try: S3Uploader.parse_s3_url(url) return True @@ -42,6 +69,14 @@ def is_s3_url(url): return False +def is_s3_url(url: str) -> bool: + """ + Check whether a URL is a S3 access URL + specified at https://docs.aws.amazon.com/AmazonS3/latest/dev-retired/UsingBucket.html + """ + return any(regex.match(url) for regex in _S3_URL_REGEXS) + + def is_local_folder(path): return is_path_value_valid(path) and os.path.isdir(path) @@ -123,7 +158,7 @@ def upload_local_artifacts( # Build the root directory and upload to S3 local_path = parent_dir - if is_s3_url(local_path): + if is_s3_protocol_url(local_path): # A valid CloudFormation template will specify artifacts as S3 URLs. 
# This check is supporting the case where your resource does not + # refer to local artifacts diff --git a/samcli/lib/pipeline/__init__.py b/samcli/lib/pipeline/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/samcli/lib/pipeline/bootstrap/__init__.py b/samcli/lib/pipeline/bootstrap/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/samcli/lib/pipeline/bootstrap/resource.py b/samcli/lib/pipeline/bootstrap/resource.py new file mode 100644 index 0000000000..a7b39dd965 --- /dev/null +++ b/samcli/lib/pipeline/bootstrap/resource.py @@ -0,0 +1,138 @@ +""" Represents AWS resource""" +from typing import Optional + + +class ARNParts: + """ + Decompose a given ARN into its parts https://docs.aws.amazon.com/general/latest/gr/aws-arns-and-namespaces.html + + Attributes + ---------- + partition: str + the partition part (aws, aws-cn or aws-us-gov) of the ARN + service: str + the service part (S3, IAM, ECR, ...etc) of the ARN + region: str + the AWS region part (us-east-1, eu-west-1, ...etc) of the ARN + account_id: str + the account-id part of the ARN + resource_id: str + the resource-id part of the ARN + """ + + partition: str + service: str + region: str + account_id: str + resource_id: str + + def __init__(self, arn: str) -> None: + parts = arn.split(":") + try: + [_, self.partition, self.service, self.region, self.account_id, self.resource_id] = parts + except ValueError as ex: + raise ValueError(f"Invalid ARN ({arn})") from ex + + +class Resource: + """ + Represents an AWS resource + + Attributes + ---------- + arn: str + the ARN of the resource + comment: str + the comment of the resource + is_user_provided: bool + True if the user provided the ARN of the resource during the initialization.
It indicates whether this pipeline- + resource is provided by the user or created by SAM during `sam pipeline bootstrap` + + Methods + ------- + name(self) -> Optional[str]: + extracts and returns the resource name from its ARN + """ + + def __init__(self, arn: Optional[str], comment: Optional[str]) -> None: + self.arn: Optional[str] = arn + self.comment: Optional[str] = comment + self.is_user_provided: bool = bool(arn) + + def name(self) -> Optional[str]: + """ + extracts and returns the resource name from its ARN + Raises + ------ + ValueError if the ARN is invalid + """ + if not self.arn: + return None + arn_parts: ARNParts = ARNParts(arn=self.arn) + return arn_parts.resource_id + + +class IAMUser(Resource): + """ + Represents an AWS IAM User resource + Attributes + ---------- + access_key_id: Optional[str] + holds the AccessKeyId of the credential of this IAM user, if any. + secret_access_key: Optional[str] + holds the SecretAccessKey of the credential of this IAM user, if any. + """ + + def __init__( + self, + arn: Optional[str], + comment: Optional[str], + access_key_id: Optional[str] = None, + secret_access_key: Optional[str] = None, + ) -> None: + self.access_key_id: Optional[str] = access_key_id + self.secret_access_key: Optional[str] = secret_access_key + super().__init__(arn=arn, comment=comment) + + +class S3Bucket(Resource): + """ + Represents an AWS S3Bucket resource + Attributes + ---------- + kms_key_arn: Optional[str] + The ARN of the KMS key used in encrypting this S3Bucket, if any. 
+ """ + + def __init__(self, arn: Optional[str], comment: Optional[str], kms_key_arn: Optional[str] = None) -> None: + self.kms_key_arn: Optional[str] = kms_key_arn + super().__init__(arn=arn, comment=comment) + + +class ECRImageRepository(Resource): + """ Represents an AWS ECR image repository resource """ + + def __init__(self, arn: Optional[str], comment: Optional[str]) -> None: + super().__init__(arn=arn, comment=comment) + + def get_uri(self) -> Optional[str]: + """ + extracts and returns the URI of the given ECR image repository from its ARN + see https://docs.aws.amazon.com/AmazonECR/latest/userguide/Registries.html + Raises + ------ + ValueError if the ARN is invalid + """ + if not self.arn: + return None + arn_parts: ARNParts = ARNParts(self.arn) + # ECR's resource_id contains the resource-type("resource") which is excluded from the URL + # from docs: https://docs.aws.amazon.com/AmazonECR/latest/userguide/security_iam_service-with-iam.html + # ECR's ARN: arn:${Partition}:ecr:${Region}:${Account}:repository/${Repository-name} + if not arn_parts.resource_id.startswith("repository/"): + raise ValueError(f"Invalid ECR ARN ({self.arn}), can't extract the URL from it.") + i = len("repository/") + repo_name = arn_parts.resource_id[i:] + return f"{arn_parts.account_id}.dkr.ecr.{arn_parts.region}.amazonaws.com/{repo_name}" diff --git a/samcli/lib/pipeline/bootstrap/stage.py b/samcli/lib/pipeline/bootstrap/stage.py new file mode 100644 index 0000000000..d98081237b --- /dev/null +++ b/samcli/lib/pipeline/bootstrap/stage.py @@ -0,0 +1,330 @@ +""" Application Environment """ +import json +import os +import pathlib +import re +from itertools import chain +from typing import Dict, List, Optional, Tuple + +import boto3 +import click + +from samcli.lib.config.samconfig import SamConfig +from samcli.lib.utils.colors import Colored +from samcli.lib.utils.managed_cloudformation_stack import manage_stack, StackOutput +from samcli.lib.pipeline.bootstrap.resource import 
Resource, IAMUser, ECRImageRepository + +CFN_TEMPLATE_PATH = str(pathlib.Path(os.path.dirname(__file__))) +STACK_NAME_PREFIX = "aws-sam-cli-managed" +STAGE_RESOURCES_STACK_NAME_SUFFIX = "pipeline-resources" +STAGE_RESOURCES_CFN_TEMPLATE = "stage_resources.yaml" +PIPELINE_USER = "pipeline_user" +PIPELINE_EXECUTION_ROLE = "pipeline_execution_role" +CLOUDFORMATION_EXECUTION_ROLE = "cloudformation_execution_role" +ARTIFACTS_BUCKET = "artifacts_bucket" +ECR_IMAGE_REPOSITORY = "image_repository" +REGION = "region" + + +class Stage: + """ + Represents an application stage: Beta, Gamma, Prod ...etc + + Attributes + ---------- + name: str + The name of the environment + aws_profile: Optional[str] + The named AWS profile (in user's machine) of the AWS account to deploy this environment to. + aws_region: Optional[str] + The AWS region to deploy this environment to. + pipeline_user: IAMUser + The IAM User having its AccessKeyId and SecretAccessKey credentials shared with the CI/CD system + pipeline_execution_role: Resource + The IAM role assumed by the pipeline-user to get access to the AWS account and execute the + CloudFormation stack. + cloudformation_execution_role: Resource + The IAM role assumed by the CloudFormation service to execute the CloudFormation stack. + artifacts_bucket: Resource + The S3 bucket to hold the SAM build artifacts of the application's CFN template. + create_image_repository: bool + A boolean flag that determines whether the user wants to create an ECR image repository or not + image_repository: ECRImageRepository + The ECR image repository to hold the container images of Lambda functions with the Image package type + + Methods: + -------- + did_user_provide_all_required_resources(self) -> bool: + checks if all of the environment's required resources (pipeline_user, pipeline_execution_role, + cloudformation_execution_role, artifacts_bucket and image_repository) are provided by the user.
+ bootstrap(self, confirm_changeset: bool = True) -> bool: + deploys the CFN template ./stage_resources.yaml to the AWS account identified by the aws_profile and + aws_region member fields. If aws_profile is not provided, it falls back to the default boto3 credential + resolution. Note that the ./stage_resources.yaml template accepts the ARNs of already existing resources (if + any) as parameters and will skip the creation of those resources but will use the ARNs to set the proper + permissions of other missing resources (resources created by the template) + save_config(self, config_dir: str, filename: str, cmd_names: List[str]): + saves the artifacts bucket name, ECR image repository URI and ARNs of pipeline_user, pipeline_execution_role and + cloudformation_execution_role to the "pipelineconfig.toml" file so that they can be auto-filled during + the `sam pipeline init` command. + print_resources_summary(self) -> None: + prints to the screen (console) the ARNs of the created and provided resources.
+ """ + + def __init__( + self, + name: str, + aws_profile: Optional[str] = None, + aws_region: Optional[str] = None, + pipeline_user_arn: Optional[str] = None, + pipeline_execution_role_arn: Optional[str] = None, + cloudformation_execution_role_arn: Optional[str] = None, + artifacts_bucket_arn: Optional[str] = None, + create_image_repository: bool = False, + image_repository_arn: Optional[str] = None, + ) -> None: + self.name: str = name + self.aws_profile: Optional[str] = aws_profile + self.aws_region: Optional[str] = aws_region + self.pipeline_user: IAMUser = IAMUser(arn=pipeline_user_arn, comment="Pipeline IAM user") + self.pipeline_execution_role: Resource = Resource( + arn=pipeline_execution_role_arn, comment="Pipeline execution role" + ) + self.cloudformation_execution_role: Resource = Resource( + arn=cloudformation_execution_role_arn, comment="CloudFormation execution role" + ) + self.artifacts_bucket: Resource = Resource(arn=artifacts_bucket_arn, comment="Artifact bucket") + self.create_image_repository: bool = create_image_repository + self.image_repository: ECRImageRepository = ECRImageRepository( + arn=image_repository_arn, comment="ECR image repository" + ) + self.color = Colored() + + def did_user_provide_all_required_resources(self) -> bool: + """Check if the user provided all of the environment resources or not""" + return all(resource.is_user_provided for resource in self._get_resources()) + + def _get_non_user_provided_resources_msg(self) -> str: + resource_comments = chain.from_iterable( + [ + [] if self.pipeline_user.is_user_provided else [self.pipeline_user.comment], + [] if self.pipeline_execution_role.is_user_provided else [self.pipeline_execution_role.comment], + [] + if self.cloudformation_execution_role.is_user_provided + else [self.cloudformation_execution_role.comment], + [] if self.artifacts_bucket.is_user_provided else [self.artifacts_bucket.comment], + [] + if self.image_repository.is_user_provided or not self.create_image_repository 
+ else [self.image_repository.comment], + ] + ) + return "\n".join([f"\t- {comment}" for comment in resource_comments]) + + def bootstrap(self, confirm_changeset: bool = True) -> bool: + """ + Deploys the CFN template (./stage_resources.yaml) which deploys: + * Pipeline IAM User + * Pipeline execution IAM role + * CloudFormation execution IAM role + * Artifacts' S3 Bucket + * ECR image repository + to the AWS account associated with the given environment. It will not redeploy the stack if it already exists. + This CFN template accepts the ARNs of the resources as parameters and will not create a resource that is already + provided; this way we conditionally create a resource only if the user didn't provide it. + + THIS METHOD UPDATES THE STATE OF THE CALLING INSTANCE (self); IT WILL SET THE VALUES OF THE RESOURCE ATTRIBUTES. + + Parameters + ---------- + confirm_changeset: bool + if set to False, the stage_resources.yaml CFN template will be deployed directly; otherwise, + the user will be prompted for confirmation + + Returns True if bootstrapped, otherwise False + """ + + if self.did_user_provide_all_required_resources(): + click.secho( + self.color.yellow(f"\nAll required resources for the {self.name} environment exist, skipping creation.") + ) + return True + + missing_resources_msg: str = self._get_non_user_provided_resources_msg() + click.echo( + f"This will create the following required resources for the '{self.name}' environment: \n" + f"{missing_resources_msg}" + ) + if confirm_changeset: + confirmed: bool = click.confirm("Should we proceed with the creation?") + if not confirmed: + click.secho(self.color.red("Canceling pipeline bootstrap creation.")) + return False + + environment_resources_template_body = Stage._read_template(STAGE_RESOURCES_CFN_TEMPLATE) + output: StackOutput = manage_stack( + stack_name=self._get_stack_name(), + region=self.aws_region, + profile=self.aws_profile, + template_body=environment_resources_template_body, + parameter_overrides={ +
"PipelineUserArn": self.pipeline_user.arn or "", + "PipelineExecutionRoleArn": self.pipeline_execution_role.arn or "", + "CloudFormationExecutionRoleArn": self.cloudformation_execution_role.arn or "", + "ArtifactsBucketArn": self.artifacts_bucket.arn or "", + "CreateImageRepository": "true" if self.create_image_repository else "false", + "ImageRepositoryArn": self.image_repository.arn or "", + }, + ) + + pipeline_user_secret_sm_id = output.get("PipelineUserSecretKey") + + self.pipeline_user.arn = output.get("PipelineUser") + if pipeline_user_secret_sm_id: + ( + self.pipeline_user.access_key_id, + self.pipeline_user.secret_access_key, + ) = Stage._get_pipeline_user_secret_pair(pipeline_user_secret_sm_id, self.aws_profile, self.aws_region) + self.pipeline_execution_role.arn = output.get("PipelineExecutionRole") + self.cloudformation_execution_role.arn = output.get("CloudFormationExecutionRole") + self.artifacts_bucket.arn = output.get("ArtifactsBucket") + self.image_repository.arn = output.get("ImageRepository") + return True + + @staticmethod + def _get_pipeline_user_secret_pair( + secret_manager_arn: str, profile: Optional[str], region: Optional[str] + ) -> Tuple[str, str]: + """ + Helper method to fetch pipeline user's AWS Credentials from secrets manager. + SecretString need to be in following JSON format: + { + "aws_access_key_id": "AWSSECRETACCESSKEY123", + "aws_secret_access_key": "mYSuperSecretDummyKey" + } + Parameters + ---------- + secret_manager_arn: + ARN of secret manager entry which holds pipeline user key. + profile: + The named AWS profile (in user's machine) of the AWS account to deploy this environment to. + region: + The AWS region to deploy this environment to. + + Returns tuple of aws_access_key_id and aws_secret_access_key. 
+ + """ + session = boto3.Session(profile_name=profile, region_name=region if region else None) # type: ignore + secrets_manager_client = session.client("secretsmanager") + response = secrets_manager_client.get_secret_value(SecretId=secret_manager_arn) + secret_string = response["SecretString"] + secret_json = json.loads(secret_string) + return secret_json["aws_access_key_id"], secret_json["aws_secret_access_key"] + + @staticmethod + def _read_template(template_file_name: str) -> str: + template_path: str = os.path.join(CFN_TEMPLATE_PATH, template_file_name) + with open(template_path, "r", encoding="utf-8") as fp: + template_body = fp.read() + return template_body + + def save_config(self, config_dir: str, filename: str, cmd_names: List[str]) -> None: + """ + save the Artifacts bucket name, ECR image repository URI and ARNs of pipeline_user, pipeline_execution_role and + cloudformation_execution_role to the given filename and directory. + + Parameters + ---------- + config_dir: str + the directory of the toml file to save to + filename: str + the name of the toml file to save to + cmd_names: List[str] + nested command name to scope the saved configs to inside the toml file + + Raises + ------ + ValueError: if the artifacts_bucket or ImageRepository ARNs are invalid + """ + + samconfig: SamConfig = SamConfig(config_dir=config_dir, filename=filename) + + if self.pipeline_user.arn: + samconfig.put(cmd_names=cmd_names, section="parameters", key=PIPELINE_USER, value=self.pipeline_user.arn) + + # Computing Artifacts bucket name and ECR image repository URL may through an exception if the ARNs are wrong + # Let's swallow such an exception to be able to save the remaining resources + try: + artifacts_bucket_name: Optional[str] = self.artifacts_bucket.name() + except ValueError: + artifacts_bucket_name = "" + try: + image_repository_uri: Optional[str] = self.image_repository.get_uri() or "" + except ValueError: + image_repository_uri = "" + + environment_specific_configs: 
Dict[str, Optional[str]] = { + PIPELINE_EXECUTION_ROLE: self.pipeline_execution_role.arn, + CLOUDFORMATION_EXECUTION_ROLE: self.cloudformation_execution_role.arn, + ARTIFACTS_BUCKET: artifacts_bucket_name, + # even if image repository is None, we want to save it as an empty string + # so that the pipeline init command can pick it up + ECR_IMAGE_REPOSITORY: image_repository_uri, + REGION: self.aws_region, + } + + for key, value in environment_specific_configs.items(): + if value is not None: + samconfig.put( + cmd_names=cmd_names, + section="parameters", + key=key, + value=value, + env=self.name, + ) + + samconfig.flush() + + def save_config_safe(self, config_dir: str, filename: str, cmd_names: List[str]) -> None: + """ + A safe version of the save_config method that doesn't raise any exceptions + """ + try: + self.save_config(config_dir, filename, cmd_names) + except Exception: + pass + + def _get_resources(self) -> List[Resource]: + resources = [ + self.pipeline_user, + self.pipeline_execution_role, + self.cloudformation_execution_role, + self.artifacts_bucket, + ] + if self.create_image_repository or self.image_repository.arn: # Image Repository is optional + resources.append(self.image_repository) + return resources + + def print_resources_summary(self) -> None: + """Prints to the screen (console) the ARNs of the created and provided resources.""" + + provided_resources = [] + created_resources = [] + for resource in self._get_resources(): + if resource.is_user_provided: + provided_resources.append(resource) + else: + created_resources.append(resource) + + if created_resources: + click.secho(self.color.green("The following resources were created in your account:")) + for resource in created_resources: + click.secho(self.color.green(f"\t- {resource.comment}")) + + if not self.pipeline_user.is_user_provided: + click.secho(self.color.green("Pipeline IAM user credential:")) + click.secho(self.color.green(f"\tAWS_ACCESS_KEY_ID: {self.pipeline_user.access_key_id}")) +
click.secho(self.color.green(f"\tAWS_SECRET_ACCESS_KEY: {self.pipeline_user.secret_access_key}")) + + def _get_stack_name(self) -> str: + sanitized_stage_name: str = re.sub("[^0-9a-zA-Z]+", "-", self.name) + return f"{STACK_NAME_PREFIX}-{sanitized_stage_name}-{STAGE_RESOURCES_STACK_NAME_SUFFIX}" diff --git a/samcli/lib/pipeline/bootstrap/stage_resources.yaml b/samcli/lib/pipeline/bootstrap/stage_resources.yaml new file mode 100644 index 0000000000..bcc5e94423 --- /dev/null +++ b/samcli/lib/pipeline/bootstrap/stage_resources.yaml @@ -0,0 +1,358 @@ +AWSTemplateFormatVersion: '2010-09-09' +Transform: AWS::Serverless-2016-10-31 + +Parameters: + PipelineUserArn: + Type: String + PipelineExecutionRoleArn: + Type: String + CloudFormationExecutionRoleArn: + Type: String + ArtifactsBucketArn: + Type: String + CreateImageRepository: + Type: String + Default: false + AllowedValues: [true, false] + ImageRepositoryArn: + Type: String + +Conditions: + MissingPipelineUser: !Equals [!Ref PipelineUserArn, ""] + MissingPipelineExecutionRole: !Equals [!Ref PipelineExecutionRoleArn, ""] + MissingCloudFormationExecutionRole: !Equals [!Ref CloudFormationExecutionRoleArn, ""] + MissingArtifactsBucket: !Equals [!Ref ArtifactsBucketArn, ""] + ShouldHaveImageRepository: !Or [!Equals [!Ref CreateImageRepository, "true"], !Not [!Equals [!Ref ImageRepositoryArn, ""]]] + MissingImageRepository: !And [!Condition ShouldHaveImageRepository, !Equals [!Ref ImageRepositoryArn, ""]] + +Resources: + PipelineUser: + Type: AWS::IAM::User + Condition: MissingPipelineUser + Properties: + Tags: + - Key: ManagedStackSource + Value: AwsSamCli + Policies: + - PolicyName: AssumeRoles + PolicyDocument: + Version: "2012-10-17" + Statement: + - Effect: Allow + Action: + - "sts:AssumeRole" + Resource: "*" + Condition: + StringEquals: + aws:ResourceTag/Role: pipeline-execution-role + + PipelineUserAccessKey: + Type: AWS::IAM::AccessKey + Condition: MissingPipelineUser + Properties: + Serial: 1 + Status: Active + 
UserName: !Ref PipelineUser + + PipelineUserSecretKey: + Type: AWS::SecretsManager::Secret + Condition: MissingPipelineUser + Properties: + SecretString: !Sub '{"aws_access_key_id": "${PipelineUserAccessKey}", "aws_secret_access_key": "${PipelineUserAccessKey.SecretAccessKey}"}' + + CloudFormationExecutionRole: + Type: AWS::IAM::Role + Condition: MissingCloudFormationExecutionRole + Properties: + Tags: + - Key: ManagedStackSource + Value: AwsSamCli + AssumeRolePolicyDocument: + Version: 2012-10-17 + Statement: + - Effect: Allow + Principal: + Service: cloudformation.amazonaws.com + Action: + - 'sts:AssumeRole' + Policies: + - PolicyName: GrantCloudFormationFullAccess + PolicyDocument: + Version: 2012-10-17 + Statement: + - Effect: Allow + Action: '*' + Resource: '*' + + PipelineExecutionRole: + Type: AWS::IAM::Role + Condition: MissingPipelineExecutionRole + Properties: + Tags: + - Key: ManagedStackSource + Value: AwsSamCli + - Key: Role + Value: pipeline-execution-role + AssumeRolePolicyDocument: + Version: 2012-10-17 + Statement: + - Effect: Allow + Principal: + AWS: + - Fn::If: + - MissingPipelineUser + - !GetAtt PipelineUser.Arn + - !Ref PipelineUserArn + Action: + - 'sts:AssumeRole' + - Effect: Allow + Principal: + # Allow roles with tag Role=aws-sam-pipeline-codebuild-service-role to assume this role. + # This is required when CodePipeline is the CI/CD system of choice. 
+ AWS: + - !If + - MissingPipelineUser + - !Ref AWS::AccountId + - !Select [4, !Split [':', !Ref PipelineUserArn]] + Action: + - 'sts:AssumeRole' + Condition: + StringEquals: + aws:PrincipalTag/Role: aws-sam-pipeline-codebuild-service-role + + ArtifactsBucket: + Type: AWS::S3::Bucket + Condition: MissingArtifactsBucket + DeletionPolicy: "Retain" + Properties: + Tags: + - Key: ManagedStackSource + Value: AwsSamCli + LoggingConfiguration: + DestinationBucketName: + !Ref ArtifactsLoggingBucket + LogFilePrefix: "artifacts-logs" + VersioningConfiguration: + Status: Enabled + BucketEncryption: + ServerSideEncryptionConfiguration: + - ServerSideEncryptionByDefault: + SSEAlgorithm: AES256 + + ArtifactsBucketPolicy: + Type: AWS::S3::BucketPolicy + Condition: MissingArtifactsBucket + Properties: + Bucket: !Ref ArtifactsBucket + PolicyDocument: + Statement: + - Effect: "Deny" + Action: "s3:*" + Principal: "*" + Resource: + - !Join [ '',[ !GetAtt ArtifactsBucket.Arn, '/*' ] ] + - !GetAtt ArtifactsBucket.Arn + Condition: + Bool: + aws:SecureTransport: false + - Effect: "Allow" + Action: + - 's3:GetObject*' + - 's3:PutObject*' + - 's3:GetBucket*' + - 's3:List*' + Resource: + - !Join ['',[!GetAtt ArtifactsBucket.Arn, '/*']] + - !GetAtt ArtifactsBucket.Arn + Principal: + AWS: + - Fn::If: + - MissingPipelineExecutionRole + - !GetAtt PipelineExecutionRole.Arn + - !Ref PipelineExecutionRoleArn + - Fn::If: + - MissingCloudFormationExecutionRole + - !GetAtt CloudFormationExecutionRole.Arn + - !Ref CloudFormationExecutionRoleArn + + ArtifactsLoggingBucket: + Type: AWS::S3::Bucket + Condition: MissingArtifactsBucket + DeletionPolicy: "Retain" + Properties: + AccessControl: "LogDeliveryWrite" + Tags: + - Key: ManagedStackSource + Value: AwsSamCli + VersioningConfiguration: + Status: Enabled + BucketEncryption: + ServerSideEncryptionConfiguration: + - ServerSideEncryptionByDefault: + SSEAlgorithm: AES256 + + ArtifactsLoggingBucketPolicy: + Type: AWS::S3::BucketPolicy + Condition: 
MissingArtifactsBucket + Properties: + Bucket: !Ref ArtifactsLoggingBucket + PolicyDocument: + Statement: + - Effect: "Deny" + Action: "s3:*" + Principal: "*" + Resource: + - !Join [ '',[ !GetAtt ArtifactsLoggingBucket.Arn, '/*' ] ] + - !GetAtt ArtifactsLoggingBucket.Arn + Condition: + Bool: + aws:SecureTransport: false + + PipelineExecutionRolePermissionPolicy: + Type: AWS::IAM::Policy + Condition: MissingPipelineExecutionRole + Properties: + PolicyName: PipelineExecutionRolePermissions + PolicyDocument: + Version: 2012-10-17 + Statement: + - Effect: Allow + Action: 'iam:PassRole' + Resource: + Fn::If: + - MissingCloudFormationExecutionRole + - !GetAtt CloudFormationExecutionRole.Arn + - !Ref CloudFormationExecutionRoleArn + - Effect: Allow + Action: + - "cloudformation:CreateChangeSet" + - "cloudformation:DescribeChangeSet" + - "cloudformation:ExecuteChangeSet" + - "cloudformation:DescribeStackEvents" + - "cloudformation:DescribeStacks" + - "cloudformation:GetTemplateSummary" + - "cloudformation:DescribeStackResource" + Resource: '*' + - Effect: Allow + Action: + - 's3:GetObject*' + - 's3:PutObject*' + - 's3:GetBucket*' + - 's3:List*' + Resource: + Fn::If: + - MissingArtifactsBucket + - - !Join [ '',[ !GetAtt ArtifactsBucket.Arn, '/*' ] ] + - !GetAtt ArtifactsBucket.Arn + - - !Join [ '',[ !Ref ArtifactsBucketArn, '/*' ] ] + - !Ref ArtifactsBucketArn + - Fn::If: + - ShouldHaveImageRepository + - Effect: "Allow" + Action: "ecr:GetAuthorizationToken" + Resource: "*" + - !Ref AWS::NoValue + - Fn::If: + - ShouldHaveImageRepository + - Effect: "Allow" + Action: + - "ecr:GetDownloadUrlForLayer" + - "ecr:BatchGetImage" + - "ecr:BatchCheckLayerAvailability" + - "ecr:PutImage" + - "ecr:InitiateLayerUpload" + - "ecr:UploadLayerPart" + - "ecr:CompleteLayerUpload" + Resource: + Fn::If: + - MissingImageRepository + - !GetAtt ImageRepository.Arn + - !Ref ImageRepositoryArn + - !Ref AWS::NoValue + Roles: + - !Ref PipelineExecutionRole + + ImageRepository: + Type: 
AWS::ECR::Repository + Condition: MissingImageRepository + Properties: + RepositoryPolicyText: + Version: "2012-10-17" + Statement: + - Sid: LambdaECRImageRetrievalPolicy + Effect: Allow + Principal: + Service: lambda.amazonaws.com + Action: + - "ecr:GetDownloadUrlForLayer" + - "ecr:BatchGetImage" + - "ecr:GetRepositoryPolicy" + - "ecr:SetRepositoryPolicy" + - "ecr:DeleteRepositoryPolicy" + - Sid: AllowPushPull + Effect: Allow + Principal: + AWS: + - Fn::If: + - MissingPipelineExecutionRole + - !GetAtt PipelineExecutionRole.Arn + - !Ref PipelineExecutionRoleArn + - Fn::If: + - MissingCloudFormationExecutionRole + - !GetAtt CloudFormationExecutionRole.Arn + - !Ref CloudFormationExecutionRoleArn + Action: + - "ecr:GetDownloadUrlForLayer" + - "ecr:BatchGetImage" + - "ecr:BatchCheckLayerAvailability" + - "ecr:PutImage" + - "ecr:InitiateLayerUpload" + - "ecr:UploadLayerPart" + - "ecr:CompleteLayerUpload" + +Outputs: + PipelineUser: + Description: ARN of the Pipeline IAM User + Value: + Fn::If: + - MissingPipelineUser + - !GetAtt PipelineUser.Arn + - !Ref PipelineUserArn + + PipelineUserSecretKey: + Description: AWS Access Key and Secret Key of pipeline user. 
+ Condition: MissingPipelineUser + Value: !Ref PipelineUserSecretKey + + CloudFormationExecutionRole: + Description: ARN of the IAM Role(CloudFormationExecutionRole) + Value: + Fn::If: + - MissingCloudFormationExecutionRole + - !GetAtt CloudFormationExecutionRole.Arn + - !Ref CloudFormationExecutionRoleArn + + PipelineExecutionRole: + Description: ARN of the IAM Role(PipelineExecutionRole) + Value: + Fn::If: + - MissingPipelineExecutionRole + - !GetAtt PipelineExecutionRole.Arn + - !Ref PipelineExecutionRoleArn + + ArtifactsBucket: + Description: ARN of the Artifacts bucket + Value: + Fn::If: + - MissingArtifactsBucket + - !GetAtt ArtifactsBucket.Arn + - !Ref ArtifactsBucketArn + + ImageRepository: + Description: ARN of the ECR image repository + Condition: ShouldHaveImageRepository + Value: + Fn::If: + - MissingImageRepository + - !GetAtt ImageRepository.Arn + - !Ref ImageRepositoryArn diff --git a/samcli/lib/providers/sam_base_provider.py b/samcli/lib/providers/sam_base_provider.py index 29623adb18..a135cbd8f5 100644 --- a/samcli/lib/providers/sam_base_provider.py +++ b/samcli/lib/providers/sam_base_provider.py @@ -11,6 +11,8 @@ from samcli.lib.intrinsic_resolver.intrinsics_symbol_table import IntrinsicsSymbolTable from samcli.lib.samlib.resource_metadata_normalizer import ResourceMetadataNormalizer from samcli.lib.samlib.wrapper import SamTranslatorWrapper +from samcli.lib.package.ecr_utils import is_ecr_url + LOG = logging.getLogger(__name__) @@ -35,6 +37,11 @@ class SamBaseProvider: SERVERLESS_LAYER: "ContentUri", } + IMAGE_PROPERTY_KEYS = { + LAMBDA_FUNCTION: "Code", + SERVERLESS_FUNCTION: "ImageUri", + } + def get(self, name: str) -> Optional[Any]: """ Given name of the function, this method must return the Function object @@ -89,6 +96,17 @@ def _is_s3_location(location: Optional[Union[str, Dict]]) -> bool: isinstance(location, str) and location.startswith("s3://") ) + @staticmethod + def _is_ecr_uri(location: Optional[Union[str, Dict]]) -> bool: + """ + the 
input could be: + - ImageUri of Serverless::Function + - Code of Lambda::Function + """ + return location is not None and is_ecr_url( + str(location.get("ImageUri", "")) if isinstance(location, dict) else location + ) + @staticmethod def _warn_code_extraction(resource_type: str, resource_name: str, code_property: str) -> None: LOG.warning( @@ -99,6 +117,16 @@ def _warn_code_extraction(resource_type: str, resource_name: str, code_property: code_property, ) + @staticmethod + def _warn_imageuri_extraction(resource_type: str, resource_name: str, image_property: str) -> None: + LOG.warning( + "The resource %s '%s' has specified ECR registry image for %s. " + "It will not be built and SAM CLI does not support invoking it locally.", + resource_type, + resource_name, + image_property, + ) + @staticmethod def _extract_lambda_function_imageuri(resource_properties: Dict, code_property_key: str) -> Optional[str]: """ diff --git a/samcli/lib/providers/sam_function_provider.py b/samcli/lib/providers/sam_function_provider.py index 720f61793c..0f4bd3a9d7 100644 --- a/samcli/lib/providers/sam_function_provider.py +++ b/samcli/lib/providers/sam_function_provider.py @@ -130,51 +130,101 @@ def _extract_functions( resource_properties["Metadata"] = resource_metadata if resource_type in [SamFunctionProvider.SERVERLESS_FUNCTION, SamFunctionProvider.LAMBDA_FUNCTION]: + resource_package_type = resource_properties.get("PackageType", ZIP) + + image_property_key = SamBaseProvider.IMAGE_PROPERTY_KEYS[resource_type] code_property_key = SamBaseProvider.CODE_PROPERTY_KEYS[resource_type] - assets = resource.assets or [] - code_asset_uri = None - for asset in assets: - if isinstance(asset, S3Asset) and asset.source_property == code_property_key: - code_asset_uri = asset.source_path - break - if SamBaseProvider._is_s3_location(code_asset_uri or resource_properties.get(code_property_key)): + + code_asset_uri = SamFunctionProvider.get_code_asset_uri(code_property_key, resource) + if 
resource_package_type == ZIP and SamBaseProvider._is_s3_location( + code_asset_uri or resource_properties.get(code_property_key) + ): # CodeUri can be a dictionary of S3 Bucket/Key or a S3 URI, neither of which are supported if not ignore_code_extraction_warnings: SamFunctionProvider._warn_code_extraction(resource_type, name, code_property_key) continue - if resource_type == SamFunctionProvider.SERVERLESS_FUNCTION: - layers = SamFunctionProvider._parse_layer_info( - stack, - resource_properties.get("Layers", []), - use_raw_codeuri, - ignore_code_extraction_warnings=ignore_code_extraction_warnings, - ) - function = SamFunctionProvider._convert_sam_function_resource( - stack, - name, - resource, - layers, - use_raw_codeuri, - ) - result[function.full_path] = function + image_asset_uri = SamFunctionProvider.get_image_asset_uri(image_property_key, resource) + if resource_package_type == IMAGE and SamBaseProvider._is_ecr_uri( + image_asset_uri or resource_properties.get(image_property_key) + ): + # ImageUri can be an ECR uri, which is not supported + if not ignore_code_extraction_warnings: + SamFunctionProvider._warn_imageuri_extraction(resource_type, name, image_property_key) + continue - elif resource_type == SamFunctionProvider.LAMBDA_FUNCTION: - layers = SamFunctionProvider._parse_layer_info( - stack, - resource_properties.get("Layers", []), - use_raw_codeuri, - ignore_code_extraction_warnings=ignore_code_extraction_warnings, - ) - function = SamFunctionProvider._convert_lambda_function_resource( - stack, name, resource, layers, use_raw_codeuri - ) + function = SamFunctionProvider.build_lambda_function( + ignore_code_extraction_warnings, + name, + resource, + resource_properties, + resource_type, + stack, + use_raw_codeuri, + ) + + if function: result[function.full_path] = function # We don't care about other resource types. 
Just ignore them return result + @staticmethod + def get_image_asset_uri(image_property_key, resource): + assets = resource.assets or [] + image_asset_uri = None + for asset in assets: + if isinstance(asset, ImageAsset) and asset.source_property == image_property_key: + image_asset_uri = asset.source_local_image if asset.source_local_image else asset.source_path + if isinstance(image_asset_uri, str): + image_asset_uri = cast(Optional[str], image_asset_uri) + else: + image_asset_uri = cast(Optional[str], image_asset_uri.get("ImageUri", None)) + break + return image_asset_uri + + @staticmethod + def get_code_asset_uri(code_property_key, resource): + assets = resource.assets or [] + code_asset_uri = None + for asset in assets: + if isinstance(asset, S3Asset) and asset.source_property == code_property_key: + code_asset_uri = asset.source_path + break + return code_asset_uri + + @staticmethod + def build_lambda_function( + ignore_code_extraction_warnings, name, resource, resource_properties, resource_type, stack, use_raw_codeuri + ): + function = None + if resource_type == SamFunctionProvider.SERVERLESS_FUNCTION: + layers = SamFunctionProvider._parse_layer_info( + stack, + resource_properties.get("Layers", []), + use_raw_codeuri, + ignore_code_extraction_warnings=ignore_code_extraction_warnings, + ) + function = SamFunctionProvider._convert_sam_function_resource( + stack, + name, + resource, + layers, + use_raw_codeuri, + ) + elif resource_type == SamFunctionProvider.LAMBDA_FUNCTION: + layers = SamFunctionProvider._parse_layer_info( + stack, + resource_properties.get("Layers", []), + use_raw_codeuri, + ignore_code_extraction_warnings=ignore_code_extraction_warnings, + ) + function = SamFunctionProvider._convert_lambda_function_resource( + stack, name, resource, layers, use_raw_codeuri + ) + return function + @staticmethod def _convert_sam_function_resource( stack: Stack, diff --git a/samcli/lib/providers/sam_stack_provider.py b/samcli/lib/providers/sam_stack_provider.py 
index 7ddcea27b0..fba7e20598 100644 --- a/samcli/lib/providers/sam_stack_provider.py +++ b/samcli/lib/providers/sam_stack_provider.py @@ -184,9 +184,15 @@ def _convert_cfn_stack_resource( if isinstance(asset, S3Asset) and asset.source_property == "TemplateURL": asset_location = asset.source_property - template_url = asset_location or resource_properties.get("TemplateURL", "") + template_url = asset_location or resource_properties.get("TemplateURL") - if not isinstance(template_url, str) or SamLocalStackProvider.is_remote_url(template_url): + if isinstance(template_url, dict): + # This happens when TemplateURL has unresolvable intrinsic functions + # and it usually happens in CDK generated template files (#2832). + raise RemoteStackLocationNotSupported() + + template_url = cast(str, template_url) + if SamLocalStackProvider.is_remote_url(template_url): raise RemoteStackLocationNotSupported() if template_url.startswith("file://"): template_url = unquote(urlparse(template_url).path) diff --git a/samcli/lib/samlib/default_managed_policies.json b/samcli/lib/samlib/default_managed_policies.json deleted file mode 100644 index 011382b900..0000000000 --- a/samcli/lib/samlib/default_managed_policies.json +++ /dev/null @@ -1,372 +0,0 @@ -{ - "SecurityAudit": "arn:aws:iam::aws:policy/SecurityAudit", - "AWSElasticBeanstalkMulticontainerDocker": "arn:aws:iam::aws:policy/AWSElasticBeanstalkMulticontainerDocker", - "AWSGreengrassResourceAccessRolePolicy": "arn:aws:iam::aws:policy/service-role/AWSGreengrassResourceAccessRolePolicy", - "AmazonS3ReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess", - "AmazonEC2RoleforDataPipelineRole": "arn:aws:iam::aws:policy/service-role/AmazonEC2RoleforDataPipelineRole", - "AWSElasticBeanstalkService": "arn:aws:iam::aws:policy/service-role/AWSElasticBeanstalkService", - "AWSQuickSightIoTAnalyticsAccess": "arn:aws:iam::aws:policy/AWSQuickSightIoTAnalyticsAccess", - "AWSElasticBeanstalkCustomPlatformforEC2Role": 
"arn:aws:iam::aws:policy/AWSElasticBeanstalkCustomPlatformforEC2Role", - "AWSStepFunctionsConsoleFullAccess": "arn:aws:iam::aws:policy/AWSStepFunctionsConsoleFullAccess", - "AWSCloudTrailFullAccess": "arn:aws:iam::aws:policy/AWSCloudTrailFullAccess", - "NetworkAdministrator": "arn:aws:iam::aws:policy/job-function/NetworkAdministrator", - "AWSCodePipelineApproverAccess": "arn:aws:iam::aws:policy/AWSCodePipelineApproverAccess", - "AWSDirectConnectReadOnlyAccess": "arn:aws:iam::aws:policy/AWSDirectConnectReadOnlyAccess", - "AmazonMobileAnalyticsFinancialReportAccess": "arn:aws:iam::aws:policy/AmazonMobileAnalyticsFinancialReportAccess", - "AWSDeepLensLambdaFunctionAccessPolicy": "arn:aws:iam::aws:policy/AWSDeepLensLambdaFunctionAccessPolicy", - "AutoScalingFullAccess": "arn:aws:iam::aws:policy/AutoScalingFullAccess", - "AmazonLexRunBotsOnly": "arn:aws:iam::aws:policy/AmazonLexRunBotsOnly", - "AmazonEC2RoleforAWSCodeDeploy": "arn:aws:iam::aws:policy/service-role/AmazonEC2RoleforAWSCodeDeploy", - "AWSLambdaReplicator": "arn:aws:iam::aws:policy/aws-service-role/AWSLambdaReplicator", - "CloudWatchEventsReadOnlyAccess": "arn:aws:iam::aws:policy/CloudWatchEventsReadOnlyAccess", - "CloudWatchActionsEC2Access": "arn:aws:iam::aws:policy/CloudWatchActionsEC2Access", - "ViewOnlyAccess": "arn:aws:iam::aws:policy/job-function/ViewOnlyAccess", - "AmazonECSTaskExecutionRolePolicy": "arn:aws:iam::aws:policy/service-role/AmazonECSTaskExecutionRolePolicy", - "AmazonMacieServiceRole": "arn:aws:iam::aws:policy/service-role/AmazonMacieServiceRole", - "ResourceGroupsandTagEditorFullAccess": "arn:aws:iam::aws:policy/ResourceGroupsandTagEditorFullAccess", - "AmazonESReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonESReadOnlyAccess", - "IAMReadOnlyAccess": "arn:aws:iam::aws:policy/IAMReadOnlyAccess", - "AWSCloud9User": "arn:aws:iam::aws:policy/AWSCloud9User", - "AmazonMachineLearningRealTimePredictionOnlyAccess": "arn:aws:iam::aws:policy/AmazonMachineLearningRealTimePredictionOnlyAccess", - 
"AWSCloud9ServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSCloud9ServiceRolePolicy", - "AmazonMachineLearningCreateOnlyAccess": "arn:aws:iam::aws:policy/AmazonMachineLearningCreateOnlyAccess", - "AmazonRoute53AutoNamingFullAccess": "arn:aws:iam::aws:policy/AmazonRoute53AutoNamingFullAccess", - "AWSXrayFullAccess": "arn:aws:iam::aws:policy/AWSXrayFullAccess", - "AWSElasticBeanstalkWebTier": "arn:aws:iam::aws:policy/AWSElasticBeanstalkWebTier", - "AWSConfigRoleForOrganizations": "arn:aws:iam::aws:policy/service-role/AWSConfigRoleForOrganizations", - "AmazonRDSFullAccess": "arn:aws:iam::aws:policy/AmazonRDSFullAccess", - "AWSIoTLogging": "arn:aws:iam::aws:policy/service-role/AWSIoTLogging", - "AWSConfigRole": "arn:aws:iam::aws:policy/service-role/AWSConfigRole", - "AWSStorageGatewayReadOnlyAccess": "arn:aws:iam::aws:policy/AWSStorageGatewayReadOnlyAccess", - "AWSCodeDeployDeployerAccess": "arn:aws:iam::aws:policy/AWSCodeDeployDeployerAccess", - "AmazonWorkMailReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonWorkMailReadOnlyAccess", - "AmazonSNSRole": "arn:aws:iam::aws:policy/service-role/AmazonSNSRole", - "AWSImportExportFullAccess": "arn:aws:iam::aws:policy/AWSImportExportFullAccess", - "AmazonAppStreamServiceAccess": "arn:aws:iam::aws:policy/service-role/AmazonAppStreamServiceAccess", - "AWSGlueConsoleFullAccess": "arn:aws:iam::aws:policy/AWSGlueConsoleFullAccess", - "ComprehendReadOnly": "arn:aws:iam::aws:policy/ComprehendReadOnly", - "AmazonMachineLearningBatchPredictionsAccess": "arn:aws:iam::aws:policy/AmazonMachineLearningBatchPredictionsAccess", - "AWSEnhancedClassicNetworkingMangementPolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSEnhancedClassicNetworkingMangementPolicy", - "SecretsManagerReadWrite": "arn:aws:iam::aws:policy/SecretsManagerReadWrite", - "AmazonMQFullAccess": "arn:aws:iam::aws:policy/AmazonMQFullAccess", - "AWSIoTConfigReadOnlyAccess": "arn:aws:iam::aws:policy/AWSIoTConfigReadOnlyAccess", - 
"AWSElasticLoadBalancingClassicServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSElasticLoadBalancingClassicServiceRolePolicy", - "SystemAdministrator": "arn:aws:iam::aws:policy/job-function/SystemAdministrator", - "AmazonEC2ContainerRegistryFullAccess": "arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryFullAccess", - "CloudWatchEventsInvocationAccess": "arn:aws:iam::aws:policy/service-role/CloudWatchEventsInvocationAccess", - "AmazonECSServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AmazonECSServiceRolePolicy", - "AmazonCognitoPowerUser": "arn:aws:iam::aws:policy/AmazonCognitoPowerUser", - "ElastiCacheServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/ElastiCacheServiceRolePolicy", - "AWSCloudFormationReadOnlyAccess": "arn:aws:iam::aws:policy/AWSCloudFormationReadOnlyAccess", - "ApplicationAutoScalingForAmazonAppStreamAccess": "arn:aws:iam::aws:policy/service-role/ApplicationAutoScalingForAmazonAppStreamAccess", - "AWSCloud9EnvironmentMember": "arn:aws:iam::aws:policy/AWSCloud9EnvironmentMember", - "AmazonLexFullAccess": "arn:aws:iam::aws:policy/AmazonLexFullAccess", - "AmazonElastiCacheFullAccess": "arn:aws:iam::aws:policy/AmazonElastiCacheFullAccess", - "AWSBatchServiceRole": "arn:aws:iam::aws:policy/service-role/AWSBatchServiceRole", - "AmazonKinesisAnalyticsFullAccess": "arn:aws:iam::aws:policy/AmazonKinesisAnalyticsFullAccess", - "AWSApplicationAutoscalingDynamoDBTablePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSApplicationAutoscalingDynamoDBTablePolicy", - "AmazonElasticFileSystemFullAccess": "arn:aws:iam::aws:policy/AmazonElasticFileSystemFullAccess", - "AmazonEC2ContainerServiceAutoscaleRole": "arn:aws:iam::aws:policy/service-role/AmazonEC2ContainerServiceAutoscaleRole", - "AWSCodeDeployReadOnlyAccess": "arn:aws:iam::aws:policy/AWSCodeDeployReadOnlyAccess", - "AWSHealthFullAccess": "arn:aws:iam::aws:policy/AWSHealthFullAccess", - "AmazonDynamoDBReadOnlyAccess": 
"arn:aws:iam::aws:policy/AmazonDynamoDBReadOnlyAccess", - "AmazonVPCFullAccess": "arn:aws:iam::aws:policy/AmazonVPCFullAccess", - "AmazonEC2RoleforSSM": "arn:aws:iam::aws:policy/service-role/AmazonEC2RoleforSSM", - "AmazonMobileAnalyticsNon-financialReportAccess": "arn:aws:iam::aws:policy/AmazonMobileAnalyticsNon-financialReportAccess", - "AmazonSageMakerReadOnly": "arn:aws:iam::aws:policy/AmazonSageMakerReadOnly", - "AWSCloudHSMReadOnlyAccess": "arn:aws:iam::aws:policy/AWSCloudHSMReadOnlyAccess", - "AmazonCognitoDeveloperAuthenticatedIdentities": "arn:aws:iam::aws:policy/AmazonCognitoDeveloperAuthenticatedIdentities", - "AmazonRDSEnhancedMonitoringRole": "arn:aws:iam::aws:policy/service-role/AmazonRDSEnhancedMonitoringRole", - "AWSCodeBuildReadOnlyAccess": "arn:aws:iam::aws:policy/AWSCodeBuildReadOnlyAccess", - "AWSDirectoryServiceReadOnlyAccess": "arn:aws:iam::aws:policy/AWSDirectoryServiceReadOnlyAccess", - "AmazonCloudDirectoryReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonCloudDirectoryReadOnlyAccess", - "CloudSearchReadOnlyAccess": "arn:aws:iam::aws:policy/CloudSearchReadOnlyAccess", - "AWSElementalMediaStoreReadOnly": "arn:aws:iam::aws:policy/AWSElementalMediaStoreReadOnly", - "LexChannelPolicy": "arn:aws:iam::aws:policy/aws-service-role/LexChannelPolicy", - "AWSSSOServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSSSOServiceRolePolicy", - "AWSConfigRulesExecutionRole": "arn:aws:iam::aws:policy/service-role/AWSConfigRulesExecutionRole", - "AmazonChimeUserManagement": "arn:aws:iam::aws:policy/AmazonChimeUserManagement", - "IAMFullAccess": "arn:aws:iam::aws:policy/IAMFullAccess", - "IAMUserChangePassword": "arn:aws:iam::aws:policy/IAMUserChangePassword", - "AWSKeyManagementServicePowerUser": "arn:aws:iam::aws:policy/AWSKeyManagementServicePowerUser", - "AWSMobileHub_FullAccess": "arn:aws:iam::aws:policy/AWSMobileHub_FullAccess", - "AWSLambdaReplicatorInternal": "arn:aws:iam::aws:policy/aws-service-role/AWSLambdaReplicatorInternal", - 
"AWSOpsWorksCMServiceRole": "arn:aws:iam::aws:policy/service-role/AWSOpsWorksCMServiceRole", - "AmazonZocaloFullAccess": "arn:aws:iam::aws:policy/AmazonZocaloFullAccess", - "AWSCertificateManagerFullAccess": "arn:aws:iam::aws:policy/AWSCertificateManagerFullAccess", - "AmazonEC2ReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonEC2ReadOnlyAccess", - "AmazonDynamoDBFullAccess": "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess", - "DAXServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/DAXServiceRolePolicy", - "AWSApplicationAutoscalingSageMakerEndpointPolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSApplicationAutoscalingSageMakerEndpointPolicy", - "AmazonWorkSpacesApplicationManagerAdminAccess": "arn:aws:iam::aws:policy/AmazonWorkSpacesApplicationManagerAdminAccess", - "AmazonElasticMapReduceforEC2Role": "arn:aws:iam::aws:policy/service-role/AmazonElasticMapReduceforEC2Role", - "AmazonSNSReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonSNSReadOnlyAccess", - "CloudHSMServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/CloudHSMServiceRolePolicy", - "AWSApplicationAutoscalingAppStreamFleetPolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSApplicationAutoscalingAppStreamFleetPolicy", - "IAMUserSSHKeys": "arn:aws:iam::aws:policy/IAMUserSSHKeys", - "AmazonVPCCrossAccountNetworkInterfaceOperations": "arn:aws:iam::aws:policy/AmazonVPCCrossAccountNetworkInterfaceOperations", - "AmazonFreeRTOSFullAccess": "arn:aws:iam::aws:policy/AmazonFreeRTOSFullAccess", - "AmazonInspectorReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonInspectorReadOnlyAccess", - "AmazonEMRCleanupPolicy": "arn:aws:iam::aws:policy/aws-service-role/AmazonEMRCleanupPolicy", - "IAMSelfManageServiceSpecificCredentials": "arn:aws:iam::aws:policy/IAMSelfManageServiceSpecificCredentials", - "AWSQuicksightAthenaAccess": "arn:aws:iam::aws:policy/service-role/AWSQuicksightAthenaAccess", - "AmazonEC2ContainerServiceRole": "arn:aws:iam::aws:policy/service-role/AmazonEC2ContainerServiceRole", - 
"AmazonMechanicalTurkReadOnly": "arn:aws:iam::aws:policy/AmazonMechanicalTurkReadOnly", - "AmazonEC2ReportsAccess": "arn:aws:iam::aws:policy/AmazonEC2ReportsAccess", - "AWSElementalMediaStoreFullAccess": "arn:aws:iam::aws:policy/AWSElementalMediaStoreFullAccess", - "AWSBatchServiceEventTargetRole": "arn:aws:iam::aws:policy/service-role/AWSBatchServiceEventTargetRole", - "AWSCodeDeployFullAccess": "arn:aws:iam::aws:policy/AWSCodeDeployFullAccess", - "CloudWatchAgentServerPolicy": "arn:aws:iam::aws:policy/CloudWatchAgentServerPolicy", - "AWSApplicationAutoscalingRDSClusterPolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSApplicationAutoscalingRDSClusterPolicy", - "AmazonVPCReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonVPCReadOnlyAccess", - "AmazonRoute53DomainsReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonRoute53DomainsReadOnlyAccess", - "AWSElasticBeanstalkReadOnlyAccess": "arn:aws:iam::aws:policy/AWSElasticBeanstalkReadOnlyAccess", - "AWSStepFunctionsReadOnlyAccess": "arn:aws:iam::aws:policy/AWSStepFunctionsReadOnlyAccess", - "AWSSupportAccess": "arn:aws:iam::aws:policy/AWSSupportAccess", - "GreengrassOTAUpdateArtifactAccess": "arn:aws:iam::aws:policy/service-role/GreengrassOTAUpdateArtifactAccess", - "AmazonEC2ContainerServiceforEC2Role": "arn:aws:iam::aws:policy/service-role/AmazonEC2ContainerServiceforEC2Role", - "AmazonKinesisVideoStreamsReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonKinesisVideoStreamsReadOnlyAccess", - "AWSMarketplaceFullAccess": "arn:aws:iam::aws:policy/AWSMarketplaceFullAccess", - "AWSOpsWorksCloudWatchLogs": "arn:aws:iam::aws:policy/AWSOpsWorksCloudWatchLogs", - "AWSStepFunctionsFullAccess": "arn:aws:iam::aws:policy/AWSStepFunctionsFullAccess", - "AmazonSQSReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonSQSReadOnlyAccess", - "AmazonElasticMapReduceRole": "arn:aws:iam::aws:policy/service-role/AmazonElasticMapReduceRole", - "AWSOpsWorksCMInstanceProfileRole": "arn:aws:iam::aws:policy/AWSOpsWorksCMInstanceProfileRole", - 
"AWSMarketplaceRead-only": "arn:aws:iam::aws:policy/AWSMarketplaceRead-only", - "AWSWAFFullAccess": "arn:aws:iam::aws:policy/AWSWAFFullAccess", - "AmazonSQSFullAccess": "arn:aws:iam::aws:policy/AmazonSQSFullAccess", - "AmazonMobileAnalyticsFullAccess": "arn:aws:iam::aws:policy/AmazonMobileAnalyticsFullAccess", - "AutoScalingConsoleFullAccess": "arn:aws:iam::aws:policy/AutoScalingConsoleFullAccess", - "QuickSightAccessForS3StorageManagementAnalyticsReadOnly": "arn:aws:iam::aws:policy/service-role/QuickSightAccessForS3StorageManagementAnalyticsReadOnly", - "AmazonESCognitoAccess": "arn:aws:iam::aws:policy/AmazonESCognitoAccess", - "AWSAppSyncSchemaAuthor": "arn:aws:iam::aws:policy/AWSAppSyncSchemaAuthor", - "AWSIoTConfigAccess": "arn:aws:iam::aws:policy/AWSIoTConfigAccess", - "AWSAppSyncPushToCloudWatchLogs": "arn:aws:iam::aws:policy/service-role/AWSAppSyncPushToCloudWatchLogs", - "AWSDataPipeline_PowerUser": "arn:aws:iam::aws:policy/AWSDataPipeline_PowerUser", - "AWSStorageGatewayFullAccess": "arn:aws:iam::aws:policy/AWSStorageGatewayFullAccess", - "AmazonElasticTranscoderReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonElasticTranscoderReadOnlyAccess", - "AmazonSSMServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AmazonSSMServiceRolePolicy", - "AWSMigrationHubSMSAccess": "arn:aws:iam::aws:policy/service-role/AWSMigrationHubSMSAccess", - "AWSCertificateManagerReadOnly": "arn:aws:iam::aws:policy/AWSCertificateManagerReadOnly", - "AWSLambdaKinesisExecutionRole": "arn:aws:iam::aws:policy/service-role/AWSLambdaKinesisExecutionRole", - "AWSDeepLensServiceRolePolicy": "arn:aws:iam::aws:policy/service-role/AWSDeepLensServiceRolePolicy", - "AWSApplicationDiscoveryAgentAccess": "arn:aws:iam::aws:policy/AWSApplicationDiscoveryAgentAccess", - "AWSServiceCatalogAdminFullAccess": "arn:aws:iam::aws:policy/AWSServiceCatalogAdminFullAccess", - "AdministratorAccess": "arn:aws:iam::aws:policy/AdministratorAccess", - "CloudSearchFullAccess": 
"arn:aws:iam::aws:policy/CloudSearchFullAccess", - "AmazonInspectorServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AmazonInspectorServiceRolePolicy", - "AmazonGlacierFullAccess": "arn:aws:iam::aws:policy/AmazonGlacierFullAccess", - "AWSConnector": "arn:aws:iam::aws:policy/AWSConnector", - "FMSServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/FMSServiceRolePolicy", - "AWSXrayWriteOnlyAccess": "arn:aws:iam::aws:policy/AWSXrayWriteOnlyAccess", - "AmazonMachineLearningRoleforRedshiftDataSource": "arn:aws:iam::aws:policy/service-role/AmazonMachineLearningRoleforRedshiftDataSource", - "AmazonSESReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonSESReadOnlyAccess", - "AmazonAppStreamFullAccess": "arn:aws:iam::aws:policy/AmazonAppStreamFullAccess", - "AWSMobileHub_ReadOnly": "arn:aws:iam::aws:policy/AWSMobileHub_ReadOnly", - "AWSEC2FleetServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSEC2FleetServiceRolePolicy", - "AWSElementalMediaPackageFullAccess": "arn:aws:iam::aws:policy/AWSElementalMediaPackageFullAccess", - "AWSQuickSightListIAM": "arn:aws:iam::aws:policy/service-role/AWSQuickSightListIAM", - "AmazonAPIGatewayPushToCloudWatchLogs": "arn:aws:iam::aws:policy/service-role/AmazonAPIGatewayPushToCloudWatchLogs", - "AmazonDynamoDBFullAccesswithDataPipeline": "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccesswithDataPipeline", - "AmazonAppStreamReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonAppStreamReadOnlyAccess", - "AWSGlueServiceRole": "arn:aws:iam::aws:policy/service-role/AWSGlueServiceRole", - "AWSServiceCatalogEndUserFullAccess": "arn:aws:iam::aws:policy/AWSServiceCatalogEndUserFullAccess", - "AmazonElasticTranscoderRole": "arn:aws:iam::aws:policy/service-role/AmazonElasticTranscoderRole", - "AWSMarketplaceGetEntitlements": "arn:aws:iam::aws:policy/AWSMarketplaceGetEntitlements", - "SupportUser": "arn:aws:iam::aws:policy/job-function/SupportUser", - "AWSIoTOTAUpdate": "arn:aws:iam::aws:policy/service-role/AWSIoTOTAUpdate", - 
"AutoScalingServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AutoScalingServiceRolePolicy", - "AmazonZocaloReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonZocaloReadOnlyAccess", - "AmazonPollyFullAccess": "arn:aws:iam::aws:policy/AmazonPollyFullAccess", - "AWSQuickSightDescribeRDS": "arn:aws:iam::aws:policy/service-role/AWSQuickSightDescribeRDS", - "AWSCloudHSMRole": "arn:aws:iam::aws:policy/service-role/AWSCloudHSMRole", - "AWSGlueServiceNotebookRole": "arn:aws:iam::aws:policy/service-role/AWSGlueServiceNotebookRole", - "CloudWatchEventsServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/CloudWatchEventsServiceRolePolicy", - "AmazonRedshiftReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonRedshiftReadOnlyAccess", - "AWSLambdaInvocation-DynamoDB": "arn:aws:iam::aws:policy/AWSLambdaInvocation-DynamoDB", - "AmazonRekognitionServiceRole": "arn:aws:iam::aws:policy/service-role/AmazonRekognitionServiceRole", - "AWSQuickSightDescribeRedshift": "arn:aws:iam::aws:policy/service-role/AWSQuickSightDescribeRedshift", - "AmazonAPIGatewayAdministrator": "arn:aws:iam::aws:policy/AmazonAPIGatewayAdministrator", - "AWSCodeBuildDeveloperAccess": "arn:aws:iam::aws:policy/AWSCodeBuildDeveloperAccess", - "AmazonEC2FullAccess": "arn:aws:iam::aws:policy/AmazonEC2FullAccess", - "AWSCodeCommitPowerUser": "arn:aws:iam::aws:policy/AWSCodeCommitPowerUser", - "AmazonRoute53AutoNamingReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonRoute53AutoNamingReadOnlyAccess", - "AWSCodeDeployRole": "arn:aws:iam::aws:policy/service-role/AWSCodeDeployRole", - "ServerMigrationConnector": "arn:aws:iam::aws:policy/ServerMigrationConnector", - "CloudWatchLogsReadOnlyAccess": "arn:aws:iam::aws:policy/CloudWatchLogsReadOnlyAccess", - "PowerUserAccess": "arn:aws:iam::aws:policy/PowerUserAccess", - "AWSMarketplaceManageSubscriptions": "arn:aws:iam::aws:policy/AWSMarketplaceManageSubscriptions", - "AmazonRDSReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonRDSReadOnlyAccess", - 
"ServiceCatalogEndUserAccess": "arn:aws:iam::aws:policy/ServiceCatalogEndUserAccess", - "AWSOpsWorksRole": "arn:aws:iam::aws:policy/service-role/AWSOpsWorksRole", - "AmazonSSMReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonSSMReadOnlyAccess", - "AWSCodePipelineCustomActionAccess": "arn:aws:iam::aws:policy/AWSCodePipelineCustomActionAccess", - "AmazonRoute53AutoNamingRegistrantAccess": "arn:aws:iam::aws:policy/AmazonRoute53AutoNamingRegistrantAccess", - "AmazonMachineLearningReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonMachineLearningReadOnlyAccess", - "AmazonESFullAccess": "arn:aws:iam::aws:policy/AmazonESFullAccess", - "AWSConfigUserAccess": "arn:aws:iam::aws:policy/AWSConfigUserAccess", - "DatabaseAdministrator": "arn:aws:iam::aws:policy/job-function/DatabaseAdministrator", - "CloudWatchLogsFullAccess": "arn:aws:iam::aws:policy/CloudWatchLogsFullAccess", - "AmazonSSMAutomationRole": "arn:aws:iam::aws:policy/service-role/AmazonSSMAutomationRole", - "AmazonMachineLearningFullAccess": "arn:aws:iam::aws:policy/AmazonMachineLearningFullAccess", - "AmazonSageMakerFullAccess": "arn:aws:iam::aws:policy/AmazonSageMakerFullAccess", - "ServerMigrationServiceRole": "arn:aws:iam::aws:policy/service-role/ServerMigrationServiceRole", - "AWSLambdaVPCAccessExecutionRole": "arn:aws:iam::aws:policy/service-role/AWSLambdaVPCAccessExecutionRole", - "AmazonSESFullAccess": "arn:aws:iam::aws:policy/AmazonSESFullAccess", - "AmazonTranscribeReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonTranscribeReadOnlyAccess", - "AmazonKinesisAnalyticsReadOnly": "arn:aws:iam::aws:policy/AmazonKinesisAnalyticsReadOnly", - "AmazonRekognitionReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonRekognitionReadOnlyAccess", - "AmazonInspectorFullAccess": "arn:aws:iam::aws:policy/AmazonInspectorFullAccess", - "AmazonElasticMapReduceforAutoScalingRole": "arn:aws:iam::aws:policy/service-role/AmazonElasticMapReduceforAutoScalingRole", - "AWSImportExportReadOnlyAccess": 
"arn:aws:iam::aws:policy/AWSImportExportReadOnlyAccess", - "AmazonMQReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonMQReadOnlyAccess", - "AWSElasticBeanstalkEnhancedHealth": "arn:aws:iam::aws:policy/service-role/AWSElasticBeanstalkEnhancedHealth", - "AmazonGuardDutyServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AmazonGuardDutyServiceRolePolicy", - "AWSAppSyncAdministrator": "arn:aws:iam::aws:policy/AWSAppSyncAdministrator", - "AWSCodeBuildAdminAccess": "arn:aws:iam::aws:policy/AWSCodeBuildAdminAccess", - "AWSServiceRoleForEC2ScheduledInstances": "arn:aws:iam::aws:policy/aws-service-role/AWSServiceRoleForEC2ScheduledInstances", - "AWSDeviceFarmFullAccess": "arn:aws:iam::aws:policy/AWSDeviceFarmFullAccess", - "ComprehendFullAccess": "arn:aws:iam::aws:policy/ComprehendFullAccess", - "AWSAgentlessDiscoveryService": "arn:aws:iam::aws:policy/AWSAgentlessDiscoveryService", - "CloudWatchReadOnlyAccess": "arn:aws:iam::aws:policy/CloudWatchReadOnlyAccess", - "AmazonElasticMapReduceReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonElasticMapReduceReadOnlyAccess", - "CloudWatchAgentAdminPolicy": "arn:aws:iam::aws:policy/CloudWatchAgentAdminPolicy", - "AWSCodeStarServiceRole": "arn:aws:iam::aws:policy/service-role/AWSCodeStarServiceRole", - "AmazonMacieSetupRole": "arn:aws:iam::aws:policy/service-role/AmazonMacieSetupRole", - "AWSLambdaENIManagementAccess": "arn:aws:iam::aws:policy/service-role/AWSLambdaENIManagementAccess", - "AWSOpsWorksInstanceRegistration": "arn:aws:iam::aws:policy/AWSOpsWorksInstanceRegistration", - "AWSDirectoryServiceFullAccess": "arn:aws:iam::aws:policy/AWSDirectoryServiceFullAccess", - "CloudWatchEventsBuiltInTargetExecutionAccess": "arn:aws:iam::aws:policy/service-role/CloudWatchEventsBuiltInTargetExecutionAccess", - "AWSIoTFullAccess": "arn:aws:iam::aws:policy/AWSIoTFullAccess", - "ServiceCatalogAdminReadOnlyAccess": "arn:aws:iam::aws:policy/ServiceCatalogAdminReadOnlyAccess", - "AWSGreengrassFullAccess": 
"arn:aws:iam::aws:policy/AWSGreengrassFullAccess", - "AWSCodeCommitFullAccess": "arn:aws:iam::aws:policy/AWSCodeCommitFullAccess", - "AlexaForBusinessGatewayExecution": "arn:aws:iam::aws:policy/AlexaForBusinessGatewayExecution", - "AWSMigrationHubFullAccess": "arn:aws:iam::aws:policy/AWSMigrationHubFullAccess", - "AWSMarketplaceMeteringFullAccess": "arn:aws:iam::aws:policy/AWSMarketplaceMeteringFullAccess", - "CloudFrontFullAccess": "arn:aws:iam::aws:policy/CloudFrontFullAccess", - "AmazonAthenaFullAccess": "arn:aws:iam::aws:policy/AmazonAthenaFullAccess", - "AWSCodeDeployRoleForLambda": "arn:aws:iam::aws:policy/service-role/AWSCodeDeployRoleForLambda", - "AWSElasticLoadBalancingServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSElasticLoadBalancingServiceRolePolicy", - "AmazonKinesisFullAccess": "arn:aws:iam::aws:policy/AmazonKinesisFullAccess", - "AWSApplicationAutoscalingEMRInstanceGroupPolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSApplicationAutoscalingEMRInstanceGroupPolicy", - "AWSArtifactAccountSync": "arn:aws:iam::aws:policy/service-role/AWSArtifactAccountSync", - "AWSBatchFullAccess": "arn:aws:iam::aws:policy/AWSBatchFullAccess", - "AmazonRoute53FullAccess": "arn:aws:iam::aws:policy/AmazonRoute53FullAccess", - "AWSTrustedAdvisorServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSTrustedAdvisorServiceRolePolicy", - "AWSCodePipelineFullAccess": "arn:aws:iam::aws:policy/AWSCodePipelineFullAccess", - "AmazonWorkMailFullAccess": "arn:aws:iam::aws:policy/AmazonWorkMailFullAccess", - "TranslateReadOnly": "arn:aws:iam::aws:policy/TranslateReadOnly", - "AmazonDMSVPCManagementRole": "arn:aws:iam::aws:policy/service-role/AmazonDMSVPCManagementRole", - "AWSCodeCommitReadOnly": "arn:aws:iam::aws:policy/AWSCodeCommitReadOnly", - "CloudWatchEventsFullAccess": "arn:aws:iam::aws:policy/CloudWatchEventsFullAccess", - "AWSDataPipelineRole": "arn:aws:iam::aws:policy/service-role/AWSDataPipelineRole", - "AmazonMobileAnalyticsWriteOnlyAccess": 
"arn:aws:iam::aws:policy/AmazonMobileAnalyticsWriteOnlyAccess", - "AWSLambdaFullAccess": "arn:aws:iam::aws:policy/AWSLambdaFullAccess", - "APIGatewayServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/APIGatewayServiceRolePolicy", - "DataScientist": "arn:aws:iam::aws:policy/job-function/DataScientist", - "AmazonLexReadOnly": "arn:aws:iam::aws:policy/AmazonLexReadOnly", - "AWSDataPipeline_FullAccess": "arn:aws:iam::aws:policy/AWSDataPipeline_FullAccess", - "AWSCloud9Administrator": "arn:aws:iam::aws:policy/AWSCloud9Administrator", - "AmazonRDSServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AmazonRDSServiceRolePolicy", - "AmazonMachineLearningManageRealTimeEndpointOnlyAccess": "arn:aws:iam::aws:policy/AmazonMachineLearningManageRealTimeEndpointOnlyAccess", - "AutoScalingReadOnlyAccess": "arn:aws:iam::aws:policy/AutoScalingReadOnlyAccess", - "AWSApplicationAutoscalingECSServicePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSApplicationAutoscalingECSServicePolicy", - "AmazonECS_FullAccess": "arn:aws:iam::aws:policy/AmazonECS_FullAccess", - "AmazonMechanicalTurkFullAccess": "arn:aws:iam::aws:policy/AmazonMechanicalTurkFullAccess", - "AmazonS3FullAccess": "arn:aws:iam::aws:policy/AmazonS3FullAccess", - "AmazonKinesisFirehoseReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonKinesisFirehoseReadOnlyAccess", - "AWSCodeStarFullAccess": "arn:aws:iam::aws:policy/AWSCodeStarFullAccess", - "AmazonElasticsearchServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AmazonElasticsearchServiceRolePolicy", - "AmazonWorkSpacesAdmin": "arn:aws:iam::aws:policy/AmazonWorkSpacesAdmin", - "AmazonMechanicalTurkCrowdReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonMechanicalTurkCrowdReadOnlyAccess", - "AmazonRoute53DomainsFullAccess": "arn:aws:iam::aws:policy/AmazonRoute53DomainsFullAccess", - "AmazonEC2ContainerRegistryReadOnly": "arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryReadOnly", - "AmazonRekognitionFullAccess": 
"arn:aws:iam::aws:policy/AmazonRekognitionFullAccess", - "AWSLambdaRole": "arn:aws:iam::aws:policy/service-role/AWSLambdaRole", - "AWSApplicationAutoscalingEC2SpotFleetRequestPolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSApplicationAutoscalingEC2SpotFleetRequestPolicy", - "CloudFrontReadOnlyAccess": "arn:aws:iam::aws:policy/CloudFrontReadOnlyAccess", - "AmazonCloudDirectoryFullAccess": "arn:aws:iam::aws:policy/AmazonCloudDirectoryFullAccess", - "AWSIoTRuleActions": "arn:aws:iam::aws:policy/service-role/AWSIoTRuleActions", - "AmazonEC2SpotFleetTaggingRole": "arn:aws:iam::aws:policy/service-role/AmazonEC2SpotFleetTaggingRole", - "AWSLambdaDynamoDBExecutionRole": "arn:aws:iam::aws:policy/service-role/AWSLambdaDynamoDBExecutionRole", - "AmazonElastiCacheReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonElastiCacheReadOnlyAccess", - "AWSMigrationHubDiscoveryAccess": "arn:aws:iam::aws:policy/service-role/AWSMigrationHubDiscoveryAccess", - "AWSLambdaExecute": "arn:aws:iam::aws:policy/AWSLambdaExecute", - "AWSIoTDataAccess": "arn:aws:iam::aws:policy/AWSIoTDataAccess", - "AmazonChimeReadOnly": "arn:aws:iam::aws:policy/AmazonChimeReadOnly", - "AlexaForBusinessDeviceSetup": "arn:aws:iam::aws:policy/AlexaForBusinessDeviceSetup", - "AmazonRedshiftFullAccess": "arn:aws:iam::aws:policy/AmazonRedshiftFullAccess", - "AmazonDRSVPCManagement": "arn:aws:iam::aws:policy/AmazonDRSVPCManagement", - "AWSAccountUsageReportAccess": "arn:aws:iam::aws:policy/AWSAccountUsageReportAccess", - "VMImportExportRoleForAWSConnector": "arn:aws:iam::aws:policy/service-role/VMImportExportRoleForAWSConnector", - "AWSDirectConnectFullAccess": "arn:aws:iam::aws:policy/AWSDirectConnectFullAccess", - "AutoScalingNotificationAccessRole": "arn:aws:iam::aws:policy/service-role/AutoScalingNotificationAccessRole", - "AmazonElasticMapReduceFullAccess": "arn:aws:iam::aws:policy/AmazonElasticMapReduceFullAccess", - "AmazonEC2ContainerServiceFullAccess": 
"arn:aws:iam::aws:policy/AmazonEC2ContainerServiceFullAccess", - "AmazonCognitoReadOnly": "arn:aws:iam::aws:policy/AmazonCognitoReadOnly", - "AWSApplicationDiscoveryServiceFullAccess": "arn:aws:iam::aws:policy/AWSApplicationDiscoveryServiceFullAccess", - "AmazonDMSRedshiftS3Role": "arn:aws:iam::aws:policy/service-role/AmazonDMSRedshiftS3Role", - "AmazonSSMAutomationApproverAccess": "arn:aws:iam::aws:policy/AmazonSSMAutomationApproverAccess", - "AWSMobileHub_ServiceUseOnly": "arn:aws:iam::aws:policy/service-role/AWSMobileHub_ServiceUseOnly", - "AmazonAPIGatewayInvokeFullAccess": "arn:aws:iam::aws:policy/AmazonAPIGatewayInvokeFullAccess", - "ReadOnlyAccess": "arn:aws:iam::aws:policy/ReadOnlyAccess", - "DynamoDBReplicationServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/DynamoDBReplicationServiceRolePolicy", - "AmazonSSMMaintenanceWindowRole": "arn:aws:iam::aws:policy/service-role/AmazonSSMMaintenanceWindowRole", - "AmazonGuardDutyFullAccess": "arn:aws:iam::aws:policy/AmazonGuardDutyFullAccess", - "AWSWAFReadOnlyAccess": "arn:aws:iam::aws:policy/AWSWAFReadOnlyAccess", - "AutoScalingConsoleReadOnlyAccess": "arn:aws:iam::aws:policy/AutoScalingConsoleReadOnlyAccess", - "AmazonRoute53ReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonRoute53ReadOnlyAccess", - "AmazonGuardDutyReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonGuardDutyReadOnlyAccess", - "AmazonElasticFileSystemReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonElasticFileSystemReadOnlyAccess", - "AmazonEC2ContainerRegistryPowerUser": "arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser", - "AWSElasticBeanstalkFullAccess": "arn:aws:iam::aws:policy/AWSElasticBeanstalkFullAccess", - "AmazonSSMFullAccess": "arn:aws:iam::aws:policy/AmazonSSMFullAccess", - "Billing": "arn:aws:iam::aws:policy/job-function/Billing", - "AWSElasticBeanstalkServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSElasticBeanstalkServiceRolePolicy", - "LexBotPolicy": 
"arn:aws:iam::aws:policy/aws-service-role/LexBotPolicy", - "AmazonDMSCloudWatchLogsRole": "arn:aws:iam::aws:policy/service-role/AmazonDMSCloudWatchLogsRole", - "AWSOrganizationsServiceTrustPolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSOrganizationsServiceTrustPolicy", - "AmazonEC2SpotFleetAutoscaleRole": "arn:aws:iam::aws:policy/service-role/AmazonEC2SpotFleetAutoscaleRole", - "AWSIoTThingsRegistration": "arn:aws:iam::aws:policy/service-role/AWSIoTThingsRegistration", - "IsengardControllerPolicy": "arn:aws:iam::aws:policy/aws-service-role/IsengardControllerPolicy", - "AmazonRDSDirectoryServiceAccess": "arn:aws:iam::aws:policy/service-role/AmazonRDSDirectoryServiceAccess", - "AmazonElasticTranscoderJobsSubmitter": "arn:aws:iam::aws:policy/AmazonElasticTranscoderJobsSubmitter", - "AWSCodePipelineReadOnlyAccess": "arn:aws:iam::aws:policy/AWSCodePipelineReadOnlyAccess", - "AWSEC2SpotFleetServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSEC2SpotFleetServiceRolePolicy", - "AlexaForBusinessReadOnlyAccess": "arn:aws:iam::aws:policy/AlexaForBusinessReadOnlyAccess", - "AWSOpsWorksRegisterCLI": "arn:aws:iam::aws:policy/AWSOpsWorksRegisterCLI", - "CloudWatchFullAccess": "arn:aws:iam::aws:policy/CloudWatchFullAccess", - "AmazonEC2ContainerServiceEventsRole": "arn:aws:iam::aws:policy/service-role/AmazonEC2ContainerServiceEventsRole", - "AWSAccountActivityAccess": "arn:aws:iam::aws:policy/AWSAccountActivityAccess", - "AmazonGlacierReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonGlacierReadOnlyAccess", - "AWSCloudHSMFullAccess": "arn:aws:iam::aws:policy/AWSCloudHSMFullAccess", - "AWSMigrationHubDMSAccess": "arn:aws:iam::aws:policy/service-role/AWSMigrationHubDMSAccess", - "AWSAppSyncInvokeFullAccess": "arn:aws:iam::aws:policy/AWSAppSyncInvokeFullAccess", - "RDSCloudHsmAuthorizationRole": "arn:aws:iam::aws:policy/service-role/RDSCloudHsmAuthorizationRole", - "AmazonTranscribeFullAccess": "arn:aws:iam::aws:policy/AmazonTranscribeFullAccess", - 
"AmazonChimeFullAccess": "arn:aws:iam::aws:policy/AmazonChimeFullAccess", - "AmazonKinesisReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonKinesisReadOnlyAccess", - "AmazonKinesisVideoStreamsFullAccess": "arn:aws:iam::aws:policy/AmazonKinesisVideoStreamsFullAccess", - "AWSEC2SpotServiceRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AWSEC2SpotServiceRolePolicy", - "AWSResourceGroupsReadOnlyAccess": "arn:aws:iam::aws:policy/AWSResourceGroupsReadOnlyAccess", - "AWSLambdaBasicExecutionRole": "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole", - "AWSCloudTrailReadOnlyAccess": "arn:aws:iam::aws:policy/AWSCloudTrailReadOnlyAccess", - "AWSXrayReadOnlyAccess": "arn:aws:iam::aws:policy/AWSXrayReadOnlyAccess", - "AWSPriceListServiceFullAccess": "arn:aws:iam::aws:policy/AWSPriceListServiceFullAccess", - "AmazonRedshiftServiceLinkedRolePolicy": "arn:aws:iam::aws:policy/aws-service-role/AmazonRedshiftServiceLinkedRolePolicy", - "AWSOpsWorksFullAccess": "arn:aws:iam::aws:policy/AWSOpsWorksFullAccess", - "AmazonElasticTranscoderFullAccess": "arn:aws:iam::aws:policy/AmazonElasticTranscoderFullAccess", - "AWSElasticBeanstalkWorkerTier": "arn:aws:iam::aws:policy/AWSElasticBeanstalkWorkerTier", - "AWSLambdaReadOnlyAccess": "arn:aws:iam::aws:policy/AWSLambdaReadOnlyAccess", - "AmazonSNSFullAccess": "arn:aws:iam::aws:policy/AmazonSNSFullAccess", - "AlexaForBusinessFullAccess": "arn:aws:iam::aws:policy/AlexaForBusinessFullAccess", - "SimpleWorkflowFullAccess": "arn:aws:iam::aws:policy/SimpleWorkflowFullAccess", - "ResourceGroupsandTagEditorReadOnlyAccess": "arn:aws:iam::aws:policy/ResourceGroupsandTagEditorReadOnlyAccess", - "AmazonKinesisFirehoseFullAccess": "arn:aws:iam::aws:policy/AmazonKinesisFirehoseFullAccess", - "AmazonMacieFullAccess": "arn:aws:iam::aws:policy/AmazonMacieFullAccess", - "AmazonEC2SpotFleetRole": "arn:aws:iam::aws:policy/service-role/AmazonEC2SpotFleetRole", - "AmazonPollyReadOnlyAccess": "arn:aws:iam::aws:policy/AmazonPollyReadOnlyAccess", - 
    "AWSElementalMediaPackageReadOnly": "arn:aws:iam::aws:policy/AWSElementalMediaPackageReadOnly",
    "AmazonMechanicalTurkCrowdFullAccess": "arn:aws:iam::aws:policy/AmazonMechanicalTurkCrowdFullAccess"
-}
\ No newline at end of file
diff --git a/samcli/lib/samlib/wrapper.py b/samcli/lib/samlib/wrapper.py
index 622e18af7a..08a52a7523 100644
--- a/samcli/lib/samlib/wrapper.py
+++ b/samcli/lib/samlib/wrapper.py
@@ -8,13 +8,10 @@
 """
 import copy
-import os
-import json
-
 import functools
 from typing import Dict
 
-import boto3
+from samtranslator.model import ResourceTypeResolver, sam_resources
 
 # SAM Translator Library Internal module imports #
 from samtranslator.model.exceptions import (
@@ -23,22 +20,15 @@
     InvalidResourceException,
     InvalidEventException,
 )
-from samtranslator.validator.validator import SamTemplateValidator
-from samtranslator.model import ResourceTypeResolver, sam_resources
 from samtranslator.plugins import LifeCycleEvents
-from samtranslator.translator.translator import prepare_plugins, Translator
-from samtranslator.translator.managed_policy_translator import ManagedPolicyLoader
-from samtranslator.parser.parser import Parser
+from samtranslator.translator.translator import prepare_plugins
+from samtranslator.validator.validator import SamTemplateValidator
 
 from samcli.commands.validate.lib.exceptions import InvalidSamDocumentException
 from .local_uri_plugin import SupportLocalUriPlugin
 
 
 class SamTranslatorWrapper:
-
-    _thisdir = os.path.dirname(os.path.abspath(__file__))
-    _DEFAULT_MANAGED_POLICIES_FILE = os.path.join(_thisdir, "default_managed_policies.json")
-
     def __init__(self, sam_template, parameter_values=None, offline_fallback=True):
         """
@@ -83,45 +73,10 @@ def run_plugins(self, convert_local_uris=True):
 
         return template_copy
 
-    def __translate(self, parameter_values):
-        """
-        This method is unused and a Work In Progress
-        """
-
-        template_copy = self.template
-
-        sam_parser = Parser()
-        sam_translator = Translator(
-            managed_policy_map=self.__managed_policy_map(),
-            sam_parser=sam_parser,
-            # Default plugins are already initialized within the Translator
-            plugins=self.extra_plugins,
-        )
-
-        return sam_translator.translate(sam_template=template_copy, parameter_values=parameter_values)
-
     @property
     def template(self):
         return copy.deepcopy(self._sam_template)
 
-    def __managed_policy_map(self):
-        """
-        This method is unused and a Work In Progress
-        """
-        try:
-            iam_client = boto3.client("iam")
-            return ManagedPolicyLoader(iam_client).load()
-        except Exception as ex:
-
-            if self._offline_fallback:
-                # If offline flag is set, then fall back to the list of default managed policies
-                # This should be sufficient for most cases
-                with open(self._DEFAULT_MANAGED_POLICIES_FILE, "r") as fp:
-                    return json.load(fp)
-
-            # Offline is not enabled. So just raise the exception
-            raise ex
-
 
 class _SamParserReimplemented:
     """
diff --git a/samcli/lib/schemas/schemas_aws_config.py b/samcli/lib/schemas/schemas_aws_config.py
index 433cc11dc4..cfb557d1c4 100644
--- a/samcli/lib/schemas/schemas_aws_config.py
+++ b/samcli/lib/schemas/schemas_aws_config.py
@@ -73,8 +73,7 @@ def _get_aws_region_choice(available_regions_name, region):
     click.echo("# Partial list of AWS regions")
     click.echo("#")
 
-    for cli_display_region in cli_display_regions:
-        msg = cli_display_regions[cli_display_region]
+    for msg in cli_display_regions.values():
         click.echo("# " + msg)
 
     region_choice = click.prompt(f"Region [{region}]", type=str, show_choices=False)
diff --git a/samcli/lib/telemetry/cicd.py b/samcli/lib/telemetry/cicd.py
index 70d3b6e77a..a65aa7b0ad 100644
--- a/samcli/lib/telemetry/cicd.py
+++ b/samcli/lib/telemetry/cicd.py
@@ -46,9 +46,25 @@ def _is_codeship(environ: Mapping) -> bool:
     return ci_name == "codeship"
 
 
+def _is_jenkins(environ: Mapping) -> bool:
+    """
+    Use environ to determine whether it is running in Jenkins.
+ According to the doc, + https://www.jenkins.io/doc/book/pipeline/jenkinsfile/#working-with-your-jenkinsfile + > BUILD_TAG + > String of jenkins-${JOB_NAME}-${BUILD_NUMBER}. + > ... + > JENKINS_URL + > Full URL of Jenkins, such as https://example.com:port/jenkins/ + > (NOTE: only available if Jenkins URL set in "System Configuration") + + Here firstly check JENKINS_URL's presence, if not, then fallback to check BUILD_TAG starts with "jenkins" + """ + return "JENKINS_URL" in environ or environ.get("BUILD_TAG", "").startswith("jenkins-") + + _ENV_VAR_OR_CALLABLE_BY_PLATFORM: Dict[CICDPlatform, Union[str, Callable[[Mapping], bool]]] = { - # https://www.jenkins.io/doc/book/pipeline/jenkinsfile/#using-environment-variables - CICDPlatform.Jenkins: "JENKINS_URL", + CICDPlatform.Jenkins: _is_jenkins, # https://docs.gitlab.com/ee/ci/variables/predefined_variables.html CICDPlatform.GitLab: "GITLAB_CI", # https://docs.github.com/en/actions/reference/environment-variables diff --git a/samcli/lib/utils/colors.py b/samcli/lib/utils/colors.py index 84e3cbdbd7..84767f0fec 100644 --- a/samcli/lib/utils/colors.py +++ b/samcli/lib/utils/colors.py @@ -58,6 +58,10 @@ def underline(self, msg): """Underline the input""" return click.style(msg, underline=True) if self.colorize else msg + def bold(self, msg): + """Bold the input""" + return click.style(msg, bold=True) if self.colorize else msg + def _color(self, msg, color): """Internal helper method to add colors to input""" kwargs = {"fg": color} diff --git a/samcli/lib/utils/defaults.py b/samcli/lib/utils/defaults.py new file mode 100644 index 0000000000..4a07b113ac --- /dev/null +++ b/samcli/lib/utils/defaults.py @@ -0,0 +1,8 @@ +""" +Contains helpers for providing default values +""" +from botocore.session import get_session + + +def get_default_aws_region() -> str: + return get_session().get_config_variable("region") or "us-east-1" diff --git a/samcli/lib/utils/git_repo.py b/samcli/lib/utils/git_repo.py new file mode 100644 index 
0000000000..78b4bf23e6 --- /dev/null +++ b/samcli/lib/utils/git_repo.py @@ -0,0 +1,170 @@ +""" Manage Git repo """ + +import logging +import os +import platform +import shutil +import subprocess +from pathlib import Path + +# import check_output alone so that it can be patched without affecting +# other parts of subprocess. +from subprocess import check_output +from typing import Optional + +from samcli.lib.utils import osutils +from samcli.lib.utils.osutils import rmtree_callback + +LOG = logging.getLogger(__name__) + + +class CloneRepoException(Exception): + """ + Exception raised when cloning the repo fails. + """ + + +class CloneRepoUnstableStateException(CloneRepoException): + """ + Exception raised when the repo clone enters an unstable state. + """ + + +class GitRepo: + """ + Class for managing a Git repo; currently it supports cloning only + + Attributes + ---------- + url: str + The URL of this Git repository, example "https://github.com/aws/aws-sam-cli" + local_path: Path + The path of the last local clone of this Git repository. Can be used in conjunction with clone_attempted + to avoid unnecessary multiple cloning of the repository. + clone_attempted: bool + Whether an attempt to clone this Git repository has taken place. Can be used in conjunction with local_path + to avoid unnecessary multiple cloning of the repository. + + Methods + ------- + clone(self, clone_dir: Path, clone_name, replace_existing=False) -> Path: + Creates a local clone of this Git repository (more details in the method documentation). + """ + + # TODO: [UPDATEME] melasmar: We should remove branch when making CDK support GA.
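The `local_path`/`clone_attempted` contract described in the docstring above can be sketched in isolation. The class and names below are hypothetical stand-ins for illustration, not part of this diff:

```python
# Hypothetical sketch of the "attempt once, then reuse" contract described
# by the GitRepo docstring above. Names here are illustrative only.
class CachedCloner:
    def __init__(self):
        self.local_path = None        # path of the last local clone, if any
        self.clone_attempted = False  # True once a clone has been tried

    def get(self, do_clone):
        # Only the first call actually clones; later calls reuse the result,
        # even if the first attempt failed (local_path stays None).
        if not self.clone_attempted:
            try:
                self.local_path = do_clone()
            finally:
                self.clone_attempted = True
        return self.local_path
```

A second call never re-runs `do_clone`, mirroring how callers are meant to check `clone_attempted` together with `local_path` to avoid cloning the repository multiple times.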
+ def __init__(self, url: str, branch: Optional[str] = None) -> None: + self.url: str = url + self.local_path: Optional[Path] = None + self.clone_attempted: bool = False + self.branch = branch + + @staticmethod + def _ensure_clone_directory_exists(clone_dir: Path) -> None: + try: + clone_dir.mkdir(mode=0o700, parents=True, exist_ok=True) + except OSError as ex: + LOG.warning("WARN: Unable to create clone directory.", exc_info=ex) + raise + + @staticmethod + def _git_executable() -> str: + if platform.system().lower() == "windows": + executables = ["git", "git.cmd", "git.exe", "git.bat"] + else: + executables = ["git"] + + for executable in executables: + try: + subprocess.Popen([executable], stdout=subprocess.PIPE, stderr=subprocess.PIPE) + # No exception. Let's pick this + return executable + except OSError as ex: + LOG.debug("Unable to find executable %s", executable, exc_info=ex) + + raise OSError("Cannot find git, was looking at executables: {}".format(executables)) + + def clone(self, clone_dir: Path, clone_name: str, replace_existing: bool = False) -> Path: + """ + Creates a local clone of this Git repository. + This method differs from the standard Git clone in the following ways: + 1. It accepts the path to clone into as a clone_dir (the parent directory to clone in) and a clone_name (the + name of the local folder) instead of accepting the full path (the join of both) in one parameter + 2. It removes the "*.git" files/directories so the clone is no longer a Git repository + 3.
It has the option to replace the local folder (destination) if it already exists + + Parameters + ---------- + clone_dir: Path + The directory to create the local clone inside + clone_name: str + The dirname of the local clone + replace_existing: bool + Whether to replace the current local clone directory if it already exists + + Returns + ------- + The path of the created local clone + + Raises + ------ + OSError: + On file-management errors such as being unable to mkdir, copytree, or rmtree + CloneRepoException: + On general errors, for example if an error occurred while running `git clone`, + or if the local clone already exists and replace_existing is not set + CloneRepoUnstableStateException: + On reaching an unstable state; for example, with the replace_existing flag set, this can happen + if the current local clone was removed but copying the new one from the temp location to the destination failed + """ + + GitRepo._ensure_clone_directory_exists(clone_dir=clone_dir) + # clone to a temp dir, then move to the destination (repo_local_path) + with osutils.mkdir_temp(ignore_errors=True) as tempdir: + try: + temp_path = os.path.normpath(os.path.join(tempdir, clone_name)) + git_executable: str = GitRepo._git_executable() + LOG.info("\nCloning from %s", self.url) + # TODO: [UPDATEME] wchengru: We should remove the --branch option when making CDK support GA.
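The branch-aware command assembly that follows can be sketched as a small pure function. The helper name below is hypothetical; the argument layout matches the `command_list` built in the diff:

```python
# Hypothetical helper mirroring the command_list assembly in GitRepo.clone:
# a plain "git clone" by default, "git clone --branch <branch>" when a branch is set.
def build_clone_command(git_executable, url, clone_name, branch=None):
    if branch is not None:
        return [git_executable, "clone", "--branch", branch, url, clone_name]
    return [git_executable, "clone", url, clone_name]
```

Keeping the branch handling in one place like this also makes it easy to delete later, which is what the TODO above anticipates once CDK support is GA.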
+ command_list = [git_executable, "clone", self.url, clone_name] + if self.branch is not None: + command_list = [git_executable, "clone", "--branch", self.branch, self.url, clone_name] + check_output( + command_list, + cwd=tempdir, + stderr=subprocess.STDOUT, + ) + self.local_path = self._persist_local_repo(temp_path, clone_dir, clone_name, replace_existing) + return self.local_path + except OSError as ex: + LOG.warning("WARN: Could not clone repo %s", self.url, exc_info=ex) + raise + except subprocess.CalledProcessError as clone_error: + output = clone_error.output.decode("utf-8") + if "not found" in output.lower(): + LOG.warning("WARN: Could not clone repo %s", self.url, exc_info=clone_error) + raise CloneRepoException(output) from clone_error + finally: + self.clone_attempted = True + + @staticmethod + def _persist_local_repo(temp_path: str, dest_dir: Path, dest_name: str, replace_existing: bool) -> Path: + dest_path = os.path.normpath(dest_dir.joinpath(dest_name)) + try: + if Path(dest_path).exists(): + if not replace_existing: + raise CloneRepoException(f"Cannot clone to {dest_path}; directory already exists") + LOG.debug("Removing old repo at %s", dest_path) + shutil.rmtree(dest_path, onerror=rmtree_callback) + + LOG.debug("Copying from %s to %s", temp_path, dest_path) + # TODO: consider not removing the .git files/directories + shutil.copytree(temp_path, dest_path, ignore=shutil.ignore_patterns("*.git")) + return Path(dest_path) + except (OSError, shutil.Error) as ex: + # UNSTABLE STATE + # it's difficult to see how this scenario could happen except with unusual permissions; the user will need to debug + raise CloneRepoUnstableStateException( + "Unstable state when updating repo.
" + f"Check that you have permissions to create/delete files in {dest_dir} directory " + "or file an issue at https://github.com/aws/aws-sam-cli/issues" + ) from ex diff --git a/samcli/lib/utils/hash.py b/samcli/lib/utils/hash.py index 6800883eff..0d245218ec 100644 --- a/samcli/lib/utils/hash.py +++ b/samcli/lib/utils/hash.py @@ -4,6 +4,7 @@ import os import hashlib import logging +from typing import List, Optional BLOCK_SIZE = 4096 LOG = logging.getLogger(__name__) @@ -39,25 +40,32 @@ def file_checksum(file_name: str) -> str: return md5.hexdigest() -def dir_checksum(directory: str, followlinks: bool = True) -> str: +def dir_checksum(directory: str, followlinks: bool = True, ignore_list: Optional[List[str]] = None) -> str: """ Parameters ---------- directory : A directory with an absolute path followlinks: Follow symbolic links through the given directory + ignore_list: The list of file/directory names to ignore in checksum Returns ------- md5 checksum of the directory. """ + ignore_set = set(ignore_list or []) md5_dir = hashlib.md5() files = list() # Walk through given directory and find all directories and files. - for dirpath, _, filenames in os.walk(directory, followlinks=followlinks): + for dirpath, dirnames, filenames in os.walk(directory, followlinks=followlinks): + # > When topdown is True, the caller can modify the dirnames list in-place + # > (perhaps using del or slice assignment) and walk() will only recurse + # > into the subdirectories whose names remain in dirnames + # > https://docs.python.org/library/os.html#os.walk + dirnames[:] = [dirname for dirname in dirnames if dirname not in ignore_set] # Go through every file in the directory and sub-directory. - for filepath in [os.path.join(dirpath, filename) for filename in filenames]: + for filepath in [os.path.join(dirpath, filename) for filename in filenames if filename not in ignore_set]: # Look at filename and contents. # Encode file's checksum to be utf-8 and bytes. 
files.append(filepath) diff --git a/samcli/lib/utils/managed_cloudformation_stack.py b/samcli/lib/utils/managed_cloudformation_stack.py index 25973fbc8b..29d148a7d9 100644 --- a/samcli/lib/utils/managed_cloudformation_stack.py +++ b/samcli/lib/utils/managed_cloudformation_stack.py @@ -1,20 +1,17 @@ """ Bootstrap's user's development environment by creating cloud resources required by SAM CLI """ - import logging +from collections.abc import Collection +from typing import cast, Dict, List, Optional, Union import boto3 - import click - from botocore.config import Config from botocore.exceptions import ClientError, BotoCoreError, NoRegionError, NoCredentialsError, ProfileNotFound from samcli.commands.exceptions import UserException, CredentialsError, RegionError - -SAM_CLI_STACK_PREFIX = "aws-sam-cli-managed-" LOG = logging.getLogger(__name__) @@ -25,10 +22,45 @@ def __init__(self, ex): super().__init__(message=message_fmt.format(ex=self.ex)) -def manage_stack(profile, region, stack_name, template_body): +class StackOutput: + def __init__(self, stack_output: List[Dict[str, str]]): + self._stack_output: List[Dict[str, str]] = stack_output + + def get(self, key) -> Optional[str]: + try: + return next(o for o in self._stack_output if o.get("OutputKey") == key).get("OutputValue") + except StopIteration: + return None + + +def manage_stack( + region: Optional[str], + stack_name: str, + template_body: str, + profile: Optional[str] = None, + parameter_overrides: Optional[Dict[str, Union[str, List[str]]]] = None, +) -> StackOutput: + """ + get or create a CloudFormation stack + + Parameters + ---------- + region: str + AWS region for the CloudFormation stack + stack_name: str + CloudFormation stack name + template_body: str + CloudFormation template's content + profile: Optional[str] + AWS named profile for the AWS account + parameter_overrides: Optional[Dict[str, Union[str, List[str]]]] + Values of template parameters, if any. 
+ + Returns: Stack output section (list of OutputKey/OutputValue pairs) + """ try: if profile: - session = boto3.Session(profile_name=profile, region_name=region if region else None) + session = boto3.Session(profile_name=profile, region_name=region if region else None) # type: ignore cloudformation_client = session.client("cloudformation") else: cloudformation_client = boto3.client( @@ -51,32 +83,41 @@ def manage_stack(profile, region, stack_name, template_body): "Error Setting Up Managed Stack Client: Unable to resolve a region. " "Please provide a region via the --region parameter or by the AWS_REGION environment variable." ) from ex - return _create_or_get_stack(cloudformation_client, stack_name, template_body) + return _create_or_get_stack(cloudformation_client, stack_name, template_body, parameter_overrides) -def _create_or_get_stack(cloudformation_client, stack_name, template_body): +# TODO: Add _update_stack to handle the case when stack parameter values change +def _create_or_get_stack( + cloudformation_client, + stack_name: str, + template_body: str, + parameter_overrides: Optional[Dict[str, Union[str, List[str]]]] = None, +) -> StackOutput: try: ds_resp = cloudformation_client.describe_stacks(StackName=stack_name) stacks = ds_resp["Stacks"] stack = stacks[0] click.echo("\n\tLooking for resources needed for deployment: Found!") - _check_sanity_of_stack(stack, stack_name) - return stack["Outputs"] + _check_sanity_of_stack(stack) + stack_outputs = cast(List[Dict[str, str]], stack["Outputs"]) + return StackOutput(stack_outputs) except ClientError: click.echo("\n\tLooking for resources needed for deployment: Not found.") try: stack = _create_stack( - cloudformation_client, stack_name, template_body + cloudformation_client, stack_name, template_body, parameter_overrides ) # exceptions are not captured from subcommands - _check_sanity_of_stack(stack, stack_name) - return stack["Outputs"] + _check_sanity_of_stack(stack) + stack_outputs =
cast(List[Dict[str, str]], stack["Outputs"]) + return StackOutput(stack_outputs) except (ClientError, BotoCoreError) as ex: LOG.debug("Failed to create managed resources", exc_info=ex) raise ManagedStackError(str(ex)) from ex -def _check_sanity_of_stack(stack, stack_name): +def _check_sanity_of_stack(stack): + stack_name = stack.get("StackName") tags = stack.get("Tags", None) outputs = stack.get("Outputs", None) @@ -112,15 +153,23 @@ def _check_sanity_of_stack(stack, stack_name): raise UserException(msg) from ex -def _create_stack(cloudformation_client, stack_name, template_body): +def _create_stack( + cloudformation_client, + stack_name: str, + template_body: str, + parameter_overrides: Optional[Dict[str, Union[str, List[str]]]] = None, +): click.echo("\tCreating the required resources...") change_set_name = "InitialCreation" + parameters = _generate_stack_parameters(parameter_overrides) change_set_resp = cloudformation_client.create_change_set( StackName=stack_name, TemplateBody=template_body, Tags=[{"Key": "ManagedStackSource", "Value": "AwsSamCli"}], ChangeSetType="CREATE", ChangeSetName=change_set_name, # this must be unique for the stack, but we only create so that's fine + Capabilities=["CAPABILITY_IAM"], + Parameters=parameters, ) stack_id = change_set_resp["StackId"] change_waiter = cloudformation_client.get_waiter("change_set_create_complete") @@ -134,3 +183,16 @@ def _create_stack(cloudformation_client, stack_name, template_body): stacks = ds_resp["Stacks"] click.echo("\tSuccessfully created!") return stacks[0] + + +def _generate_stack_parameters( + parameter_overrides: Optional[Dict[str, Union[str, List[str]]]] = None +) -> List[Dict[str, str]]: + parameters = [] + if parameter_overrides: + for key, value in parameter_overrides.items(): + if isinstance(value, Collection) and not isinstance(value, str): + # Assumption: values don't include commas or spaces. Need to refactor to handle such a case if needed. 
+ value = ",".join(value) + parameters.append({"ParameterKey": key, "ParameterValue": value}) + return parameters diff --git a/samcli/lib/utils/profile.py b/samcli/lib/utils/profile.py new file mode 100644 index 0000000000..47d0242eee --- /dev/null +++ b/samcli/lib/utils/profile.py @@ -0,0 +1,10 @@ +""" +Module for aws profile related helpers +""" +from typing import List, cast + +from botocore.session import Session + + +def list_available_profiles() -> List[str]: + return cast(List[str], Session().available_profiles) diff --git a/samcli/local/apigw/local_apigw_service.py b/samcli/local/apigw/local_apigw_service.py index cc2684c200..5a6d397d54 100644 --- a/samcli/local/apigw/local_apigw_service.py +++ b/samcli/local/apigw/local_apigw_service.py @@ -333,7 +333,7 @@ def _request_handler(self, **kwargs): ) else: (status_code, headers, body) = self._parse_v1_payload_format_lambda_output( - lambda_response, self.api.binary_media_types, request + lambda_response, self.api.binary_media_types, request, route.event_type ) except LambdaResponseParseException as ex: LOG.error("Invalid lambda response received: %s", ex) @@ -379,13 +379,14 @@ def get_request_methods_endpoints(flask_request): # Consider moving this out to its own class. 
Logic is starting to get dense and looks messy @jfuss @staticmethod - def _parse_v1_payload_format_lambda_output(lambda_output: str, binary_types, flask_request): + def _parse_v1_payload_format_lambda_output(lambda_output: str, binary_types, flask_request, event_type): """ Parses the output from the Lambda Container :param str lambda_output: Output from Lambda Invoke :param binary_types: list of binary types :param flask_request: flask request object + :param event_type: determines the route event type :return: Tuple(int, dict, str, bool) """ # pylint: disable-msg=too-many-statements @@ -397,6 +398,9 @@ def _parse_v1_payload_format_lambda_output(lambda_output: str, binary_types, fla if not isinstance(json_output, dict): raise LambdaResponseParseException(f"Lambda returned {type(json_output)} instead of dict") + if event_type == Route.HTTP and json_output.get("statusCode") is None: + raise LambdaResponseParseException(f"Invalid API Gateway Response Key: statusCode is not in {json_output}") + status_code = json_output.get("statusCode") or 200 headers = LocalApigwService._merge_response_headers( json_output.get("headers") or {}, json_output.get("multiValueHeaders") or {} @@ -405,7 +409,8 @@ def _parse_v1_payload_format_lambda_output(lambda_output: str, binary_types, fla body = json_output.get("body") if body is None: LOG.warning("Lambda returned empty body!") - is_base_64_encoded = json_output.get("isBase64Encoded") or False + + is_base_64_encoded = LocalApigwService.get_base_64_encoded(event_type, json_output) try: status_code = int(status_code) @@ -422,8 +427,10 @@ def _parse_v1_payload_format_lambda_output(lambda_output: str, binary_types, fla f"Non null response bodies should be able to convert to string: {body}" ) from ex - invalid_keys = LocalApigwService._invalid_apig_response_keys(json_output) - if invalid_keys: + invalid_keys = LocalApigwService._invalid_apig_response_keys(json_output, event_type) + # HTTP API Gateway just skips the disallowed lambda response
fields, but Rest API Gateway fails on + # the disallowed fields + if event_type == Route.API and invalid_keys: raise LambdaResponseParseException(f"Invalid API Gateway Response Keys: {invalid_keys} in {json_output}") # If the customer doesn't define Content-Type default to application/json @@ -432,17 +439,51 @@ def _parse_v1_payload_format_lambda_output(lambda_output: str, binary_types, fla headers["Content-Type"] = "application/json" try: - if LocalApigwService._should_base64_decode_body(binary_types, flask_request, headers, is_base_64_encoded): + # HTTP API Gateway decodes the lambda response only if the isBase64Encoded field in the response is True, + # regardless of the response content-type + # Rest API Gateway uses the response content-type and the API's configured BinaryMediaTypes to decide + # whether to decode the response + if (event_type == Route.HTTP and is_base_64_encoded) or ( + event_type == Route.API + and LocalApigwService._should_base64_decode_body( + binary_types, flask_request, headers, is_base_64_encoded + ) + ): body = base64.b64decode(body) except ValueError as ex: LambdaResponseParseException(str(ex)) return status_code, headers, body + @staticmethod + def get_base_64_encoded(event_type, json_output): + # The following behaviour is undocumented and based on experimentation + # HTTP API Gateway checks the lambda response for the isBase64Encoded field and ignores base64Encoded + # Rest API Gateway checks the base64Encoded field first; if it does not exist, it checks isBase64Encoded + + if event_type == Route.API and json_output.get("base64Encoded") is not None: + is_base_64_encoded = json_output.get("base64Encoded") + field_name = "base64Encoded" + elif json_output.get("isBase64Encoded") is not None: + is_base_64_encoded = json_output.get("isBase64Encoded") + field_name = "isBase64Encoded" + else: + is_base_64_encoded = False + field_name = "isBase64Encoded" + + if isinstance(is_base_64_encoded, str) and is_base_64_encoded in ["true",
"True", "false", "False"]: + is_base_64_encoded = is_base_64_encoded in ["true", "True"] + elif not isinstance(is_base_64_encoded, bool): + raise LambdaResponseParseException( + f"Invalid API Gateway Response Key: {is_base_64_encoded} is not a valid {field_name}" + ) + + return is_base_64_encoded + @staticmethod + def _parse_v2_payload_format_lambda_output(lambda_output: str, binary_types, flask_request): """ - Parses the output from the Lambda Container + Parses the output from the Lambda Container. V2 Payload Format means that the event_type is only HTTP :param str lambda_output: Output from Lambda Invoke :param binary_types: list of binary types @@ -487,21 +528,15 @@ def _parse_v2_payload_format_lambda_output(lambda_output: str, binary_types, fla f"Non null response bodies should be able to convert to string: {body}" ) from ex - # API Gateway only accepts statusCode, body, headers, and isBase64Encoded in - # a response shape. - # Don't check the response keys when inferring a response, see - # https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-develop-integrations-lambda.html#http-api-develop-integrations-lambda.v2. - invalid_keys = LocalApigwService._invalid_apig_response_keys(json_output) - if "statusCode" in json_output and invalid_keys: - raise LambdaResponseParseException(f"Invalid API Gateway Response Keys: {invalid_keys} in {json_output}") - # If the customer doesn't define Content-Type default to application/json if "Content-Type" not in headers: LOG.info("No Content-Type given.
Defaulting to 'application/json'.") headers["Content-Type"] = "application/json" try: - if LocalApigwService._should_base64_decode_body(binary_types, flask_request, headers, is_base_64_encoded): + # HTTP API Gateway decodes the lambda response only if the isBase64Encoded field in the response is True, + # regardless of the response content-type + if is_base_64_encoded: # Note(xinhol): here in this method we change the type of the variable body multiple times # and confused mypy, we might want to avoid this and use multiple variables here. body = base64.b64decode(body) # type: ignore @@ -511,8 +546,10 @@ def _parse_v2_payload_format_lambda_output(lambda_output: str, binary_types, fla return status_code, headers, body @staticmethod - def _invalid_apig_response_keys(output): + def _invalid_apig_response_keys(output, event_type): allowable = {"statusCode", "body", "headers", "multiValueHeaders", "isBase64Encoded", "cookies"} + if event_type == Route.API: + allowable.add("base64Encoded") invalid_keys = output.keys() - allowable return invalid_keys diff --git a/samcli/local/common/runtime_template.py b/samcli/local/common/runtime_template.py index 265406a9d5..cb9bfe4ade 100644 --- a/samcli/local/common/runtime_template.py +++ b/samcli/local/common/runtime_template.py @@ -34,7 +34,7 @@ RUNTIME_DEP_TEMPLATE_MAPPING = { "python": [ { - "runtimes": ["python3.8", "python3.7", "python3.6", "python2.7"], + "runtimes": ["python3.9", "python3.8", "python3.7", "python3.6", "python2.7"], "dependency_manager": "pip", "init_location": os.path.join(_templates, "cookiecutter-aws-sam-hello-python"), "build": True, @@ -98,6 +98,7 @@ def get_local_lambda_images_location(mapping, runtime): RUNTIME_TO_DEPENDENCY_MANAGERS = { + "python3.9": ["pip"], "python3.8": ["pip"], "python3.7": ["pip"], "python3.6": ["pip"], @@ -139,7 +140,7 @@ def get_local_lambda_images_location(mapping, runtime): INIT_RUNTIMES = [ # latest of each runtime version "nodejs14.x", - "python3.8", + "python3.9", "ruby2.7",
"go1.x", "java11", @@ -148,6 +149,7 @@ def get_local_lambda_images_location(mapping, runtime): "nodejs12.x", "nodejs10.x", # older python runtimes + "python3.8", "python3.7", "python3.6", "python2.7", @@ -164,6 +166,7 @@ def get_local_lambda_images_location(mapping, runtime): "amazon/nodejs14.x-base", "amazon/nodejs12.x-base", "amazon/nodejs10.x-base", + "amazon/python3.9-base", "amazon/python3.8-base", "amazon/python3.7-base", "amazon/python3.6-base", @@ -188,4 +191,5 @@ def get_local_lambda_images_location(mapping, runtime): "python3.7": "Python36", "python3.6": "Python36", "python3.8": "Python36", + "python3.9": "Python36", } diff --git a/samcli/local/docker/container.py b/samcli/local/docker/container.py index fbaf237224..a43e485513 100644 --- a/samcli/local/docker/container.py +++ b/samcli/local/docker/container.py @@ -38,7 +38,7 @@ class Container: _STDOUT_FRAME_TYPE = 1 _STDERR_FRAME_TYPE = 2 RAPID_PORT_CONTAINER = "8080" - URL = "http://localhost:{port}/2015-03-31/functions/{function_name}/invocations" + URL = "http://{host}:{port}/2015-03-31/functions/{function_name}/invocations" # Set connection timeout to 1 sec to support the large input. RAPID_CONNECTION_TIMEOUT = 1 @@ -55,6 +55,8 @@ def __init__( docker_client=None, container_opts=None, additional_volumes=None, + container_host="localhost", + container_host_interface="127.0.0.1", ): """ Initializes the class with given configuration. This does not automatically create or run the container. @@ -71,6 +73,8 @@ def __init__( :param docker_client: Optional, a docker client to replace the default one loaded from env :param container_opts: Optional, a dictionary containing the container options :param additional_volumes: Optional list of additional volumes + :param string container_host: Optional. Host of locally emulated Lambda container + :param string container_host_interface: Optional. 
Interface that Docker host binds ports to """ self._image = image @@ -96,6 +100,10 @@ def __init__( # selecting the first free port in a range that's not ephemeral. self._start_port_range = 5000 self._end_port_range = 9000 + + self._container_host = container_host + self._container_host_interface = container_host_interface + try: self.rapid_port_host = find_free_port(start=self._start_port_range, end=self._end_port_range) except NoFreePortsError as ex: @@ -150,11 +158,14 @@ def create(self): if self._env_vars: kwargs["environment"] = self._env_vars - kwargs["ports"] = {self.RAPID_PORT_CONTAINER: ("127.0.0.1", self.rapid_port_host)} + kwargs["ports"] = {self.RAPID_PORT_CONTAINER: (self._container_host_interface, self.rapid_port_host)} if self._exposed_ports: kwargs["ports"].update( - {container_port: ("127.0.0.1", host_port) for container_port, host_port in self._exposed_ports.items()} + { + container_port: (self._container_host_interface, host_port) + for container_port, host_port in self._exposed_ports.items() + } ) if self._entrypoint: @@ -266,8 +277,9 @@ def wait_for_http_response(self, name, event, stdout): # TODO(sriram-mv): `aws-lambda-rie` is in a mode where the function_name is always "function" # NOTE(sriram-mv): There is a connection timeout set on the http call to `aws-lambda-rie`, however there is not # a read time out for the response received from the server. 
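The `host`/`port` substitution into `Container.URL` can be illustrated with a short sketch. The URL template is copied from this diff; the helper function and the sample values are assumptions for illustration:

```python
# URL template copied from Container.URL in this diff; invoke_url is a
# hypothetical helper showing how host/port/function_name are substituted.
URL = "http://{host}:{port}/2015-03-31/functions/{function_name}/invocations"

def invoke_url(host="localhost", port=8080, function_name="function"):
    """Build the local Lambda (aws-lambda-rie) invocation endpoint."""
    return URL.format(host=host, port=port, function_name=function_name)
```

With `--container-host host.docker.internal`, the same template yields an endpoint reachable when SAM CLI itself runs inside a container.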
+ resp = requests.post( - self.URL.format(port=self.rapid_port_host, function_name="function"), + self.URL.format(host=self._container_host, port=self.rapid_port_host, function_name="function"), data=event.encode("utf-8"), timeout=(self.RAPID_CONNECTION_TIMEOUT, None), ) diff --git a/samcli/local/docker/lambda_container.py b/samcli/local/docker/lambda_container.py index 87d79f1b44..98f519729b 100644 --- a/samcli/local/docker/lambda_container.py +++ b/samcli/local/docker/lambda_container.py @@ -45,6 +45,8 @@ def __init__( memory_mb=128, env_vars=None, debug_options=None, + container_host=None, + container_host_interface=None, ): """ Initializes the class @@ -74,6 +76,10 @@ def __init__( Optional. Dictionary containing environment variables passed to container debug_options DebugContext Optional. Contains container debugging info (port, debugger path) + container_host string + Optional. Host of locally emulated Lambda container + container_host_interface + Optional. Interface that Docker host binds ports to """ if not Runtime.has_value(runtime) and not packagetype == IMAGE: raise ValueError("Unsupported Lambda runtime {}".format(runtime)) @@ -119,6 +125,8 @@ def __init__( env_vars=env_vars, container_opts=additional_options, additional_volumes=additional_volumes, + container_host=container_host, + container_host_interface=container_host_interface, ) @staticmethod diff --git a/samcli/local/docker/lambda_debug_settings.py b/samcli/local/docker/lambda_debug_settings.py index 6bdb7d8a12..9f8ce2fe7d 100644 --- a/samcli/local/docker/lambda_debug_settings.py +++ b/samcli/local/docker/lambda_debug_settings.py @@ -156,6 +156,10 @@ def get_debug_settings(debug_port, debug_args_list, _container_env_vars, runtime entry + ["/var/lang/bin/python3.8"] + debug_args_list + ["/var/runtime/bootstrap.py"], container_env_vars=_container_env_vars, ), + Runtime.python39.value: lambda: DebugSettings( + entry + ["/var/lang/bin/python3.9"] + debug_args_list + ["/var/runtime/bootstrap.py"], + 
container_env_vars=_container_env_vars, + ), } try: return entrypoint_mapping[runtime]() diff --git a/samcli/local/docker/lambda_image.py b/samcli/local/docker/lambda_image.py index a282cac23d..0c31b20289 100644 --- a/samcli/local/docker/lambda_image.py +++ b/samcli/local/docker/lambda_image.py @@ -30,6 +30,7 @@ class Runtime(Enum): python36 = "python3.6" python37 = "python3.7" python38 = "python3.8" + python39 = "python3.9" ruby25 = "ruby2.5" ruby27 = "ruby2.7" java8 = "java8" @@ -54,7 +55,7 @@ def has_value(cls, value): class LambdaImage: _LAYERS_DIR = "/opt" - _INVOKE_REPO_PREFIX = "amazon/aws-sam-cli-emulation-image" + _INVOKE_REPO_PREFIX = "public.ecr.aws/sam/emulation" _SAM_CLI_REPO_NAME = "samcli/lambda" _RAPID_SOURCE_PATH = Path(__file__).parent.joinpath("..", "rapid").resolve() @@ -229,11 +230,20 @@ def set_item_permission(tar_info): with create_tarball(tar_paths, tar_filter=tar_filter) as tarballfile: try: resp_stream = self.docker_client.api.build( - fileobj=tarballfile, custom_context=True, rm=True, tag=docker_tag, pull=not self.skip_pull_image + fileobj=tarballfile, + custom_context=True, + rm=True, + tag=docker_tag, + pull=not self.skip_pull_image, + decode=True, ) - for _ in resp_stream: + for log in resp_stream: stream_writer.write(".") stream_writer.flush() + if "error" in log: + stream_writer.write("\n") + LOG.exception("Failed to build Docker Image") + raise ImageBuildException("Error building docker image: {}".format(log["error"])) stream_writer.write("\n") except (docker.errors.BuildError, docker.errors.APIError) as ex: stream_writer.write("\n") diff --git a/samcli/local/lambdafn/runtime.py b/samcli/local/lambdafn/runtime.py index 759d02fd79..af42876001 100644 --- a/samcli/local/lambdafn/runtime.py +++ b/samcli/local/lambdafn/runtime.py @@ -1,20 +1,21 @@ """ Classes representing a local Lambda runtime """ - +import copy import os import shutil import tempfile import signal import logging import threading -from typing import Optional +from 
typing import Optional, Union, Dict from samcli.local.docker.lambda_container import LambdaContainer from samcli.lib.utils.file_observer import LambdaFunctionObserver from samcli.lib.utils.packagetype import ZIP from samcli.lib.telemetry.metric import MetricName, capture_parameter from .zip import unzip +from ...lib.providers.provider import LayerVersion from ...lib.utils.stream_writer import StreamWriter LOG = logging.getLogger(__name__) @@ -44,7 +45,7 @@ def __init__(self, container_manager, image_builder): self._image_builder = image_builder self._temp_uncompressed_paths_to_be_cleaned = [] - def create(self, function_config, debug_context=None): + def create(self, function_config, debug_context=None, container_host=None, container_host_interface=None): """ Create a new Container for the passed function, then store it in a dictionary using the function name, so it can be retrieved later and used in the other functions. Make sure to use the debug_context only @@ -56,6 +57,8 @@ def create(self, function_config, debug_context=None): Configuration of the function to create a new Container for it. 
debug_context DebugContext Debugging context for the function (includes port, args, and path) + container_host string + Host of locally emulated Lambda container Returns ------- @@ -66,6 +69,7 @@ def create(self, function_config, debug_context=None): env_vars = function_config.env_vars.resolve() code_dir = self._get_code_dir(function_config.code_abs_path) + layers = [self._unarchived_layer(layer) for layer in function_config.layers] container = LambdaContainer( function_config.runtime, function_config.imageuri, @@ -73,11 +77,13 @@ def create(self, function_config, debug_context=None): function_config.packagetype, function_config.imageconfig, code_dir, - function_config.layers, + layers, self._image_builder, memory_mb=function_config.memory, env_vars=env_vars, debug_options=debug_context, + container_host=container_host, + container_host_interface=container_host_interface, ) try: # create the container. @@ -88,7 +94,7 @@ def create(self, function_config, debug_context=None): LOG.debug("Ctrl+C was pressed. Aborting container creation") raise - def run(self, container, function_config, debug_context): + def run(self, container, function_config, debug_context, container_host=None, container_host_interface=None): """ Find the created container for the passed Lambda function, then using the ContainerManager run this container. @@ -102,6 +108,11 @@ def run(self, container, function_config, debug_context): Configuration of the function to run its created container. debug_context DebugContext Debugging context for the function (includes port, args, and path) + container_host string + Host of locally emulated Lambda container + container_host_interface string + Optional. 
Interface that Docker host binds ports to + Returns ------- Container @@ -109,7 +120,7 @@ def run(self, container, function_config, debug_context): """ if not container: - container = self.create(function_config, debug_context) + container = self.create(function_config, debug_context, container_host, container_host_interface) if container.is_running(): LOG.info("Lambda function '%s' is already running", function_config.name) @@ -132,6 +143,8 @@ def invoke( debug_context=None, stdout: Optional[StreamWriter] = None, stderr: Optional[StreamWriter] = None, + container_host=None, + container_host_interface=None, ): """ Invoke the given Lambda function locally. @@ -150,13 +163,17 @@ def invoke( StreamWriter that receives stdout text from container. :param samcli.lib.utils.stream_writer.StreamWriter stderr: Optional. StreamWriter that receives stderr text from container. + :param string container_host: Optional. + Host of locally emulated Lambda container + :param string container_host_interface: Optional. + Interface that Docker host binds ports to :raises Keyboard """ timer = None container = None try: # Start the container. This call returns immediately after the container starts - container = self.create(function_config, debug_context) + container = self.create(function_config, debug_context, container_host, container_host_interface) container = self.run(container, function_config, debug_context) # Setup appropriate interrupt - timeout or Ctrl+C - before function starts executing. # @@ -235,9 +252,9 @@ def signal_handler(sig, frame): timer.start() return timer - def _get_code_dir(self, code_path): + def _get_code_dir(self, code_path: str) -> str: """ - Method to get a path to a directory where the Lambda function code is available. This directory will + Method to get a path to a directory where the function/layer code is available. This directory will be mounted directly inside the Docker container. 
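The hunks above thread the two new options unchanged down the call chain (`invoke` → `create` → `LambdaContainer`). A minimal sketch of that pass-through pattern, using hypothetical stand-in classes (`FakeContainer`, `FakeRuntime`) rather than the real SAM CLI types:

```python
# Stand-ins illustrating how container_host / container_host_interface flow
# from invoke() down to the container object. These classes are hypothetical;
# the real code uses LambdaContainer and the runtime class from samcli.
class FakeContainer:
    def __init__(self, container_host=None, container_host_interface=None):
        # Defaults mirror the usual local-emulation values; the real
        # LambdaContainer uses these when asking Docker to bind ports.
        self.container_host = container_host or "localhost"
        self.container_host_interface = container_host_interface or "127.0.0.1"

class FakeRuntime:
    def create(self, container_host=None, container_host_interface=None):
        return FakeContainer(container_host, container_host_interface)

    def invoke(self, container_host=None, container_host_interface=None):
        # invoke() simply forwards both options to create(), as in the patch
        return self.create(container_host, container_host_interface)

container = FakeRuntime().invoke(container_host="host.docker.internal")
```

A non-default `container_host` is useful, for example, when Docker runs somewhere other than the local machine and `localhost` would not reach the emulated function.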
This method handles a few different cases for ``code_path``: @@ -259,13 +276,34 @@ def _get_code_dir(self, code_path): """ if code_path and os.path.isfile(code_path) and code_path.endswith(self.SUPPORTED_ARCHIVE_EXTENSIONS): - decompressed_dir = _unzip_file(code_path) + decompressed_dir: str = _unzip_file(code_path) self._temp_uncompressed_paths_to_be_cleaned += [decompressed_dir] return decompressed_dir LOG.debug("Code %s is not a zip/jar file", code_path) return code_path + def _unarchived_layer(self, layer: Union[str, Dict, LayerVersion]) -> Union[str, Dict, LayerVersion]: + """ + If the layer's content uri points to a supported local archive file, use self._get_code_dir() to + un-archive it and so that it can be mounted directly inside the Docker container. + Parameters + ---------- + layer + a str, dict or a LayerVersion object representing a layer + + Returns + ------- + as it is (if no archived file is identified) + or a LayerVersion with ContentUri pointing to an unarchived directory + """ + if isinstance(layer, LayerVersion) and isinstance(layer.codeuri, str): + unarchived_layer = copy.deepcopy(layer) + unarchived_layer.codeuri = self._get_code_dir(layer.codeuri) + return unarchived_layer if unarchived_layer.codeuri != layer.codeuri else layer + + return layer + def _clean_decompressed_paths(self): """ Clean the temporary decompressed code dirs @@ -301,7 +339,7 @@ def __init__(self, container_manager, image_builder): super().__init__(container_manager, image_builder) - def create(self, function_config, debug_context=None): + def create(self, function_config, debug_context=None, container_host=None, container_host_interface=None): """ Create a new Container for the passed function, then store it in a dictionary using the function name, so it can be retrieved later and used in the other functions. 
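The `_unarchived_layer` helper added above deep-copies a layer and swaps its archive URI for an extracted directory, leaving the original object untouched. A rough standalone sketch of that pattern, where `SimpleLayer` is a stand-in for samcli's `LayerVersion` and the extraction is reduced to a plain unzip:

```python
import copy
import os
import tempfile
import zipfile

class SimpleLayer:
    """Stand-in for samcli's LayerVersion: just carries a code URI."""
    def __init__(self, codeuri):
        self.codeuri = codeuri

def unarchived_layer(layer):
    # If the code URI points at a local zip, extract it and return a copy of
    # the layer pointing at the extracted directory; otherwise pass through.
    if isinstance(layer.codeuri, str) and os.path.isfile(layer.codeuri) and layer.codeuri.endswith(".zip"):
        extracted_dir = tempfile.mkdtemp()
        with zipfile.ZipFile(layer.codeuri) as zf:
            zf.extractall(extracted_dir)
        unarchived = copy.deepcopy(layer)  # original layer object is untouched
        unarchived.codeuri = extracted_dir
        return unarchived
    return layer

# Demo: build a throwaway layer zip, then un-archive it
workdir = tempfile.mkdtemp()
zip_path = os.path.join(workdir, "layer.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("python/util.py", "VALUE = 1\n")
result = unarchived_layer(SimpleLayer(zip_path))
```

Deep-copying before mutating is what lets the patch compare `unarchived_layer.codeuri != layer.codeuri` and return the original object when nothing changed.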
Make sure to use the debug_context only @@ -313,6 +351,10 @@ def create(self, function_config, debug_context=None): Configuration of the function to create a new Container for it. debug_context DebugContext Debugging context for the function (includes port, args, and path) + container_host string + Host of locally emulated Lambda container + container_host_interface string + Interface that Docker host binds ports to Returns ------- @@ -336,7 +378,7 @@ def create(self, function_config, debug_context=None): ) debug_context = None - container = super().create(function_config, debug_context) + container = super().create(function_config, debug_context, container_host, container_host_interface) self._containers[function_config.name] = container self._observer.watch(function_config) diff --git a/samcli/yamlhelper.py b/samcli/yamlhelper.py index 49bb544fda..b88de31948 100644 --- a/samcli/yamlhelper.py +++ b/samcli/yamlhelper.py @@ -18,7 +18,7 @@ # pylint: disable=too-many-ancestors import json -from typing import Dict, Optional +from typing import cast, Dict, Optional from botocore.compat import OrderedDict import yaml @@ -116,20 +116,20 @@ def _dict_constructor(loader, node): return OrderedDict(loader.construct_pairs(node)) -def yaml_parse(yamlstr): +def yaml_parse(yamlstr) -> Dict: """Parse a yaml string""" try: # PyYAML doesn't support json as well as it should, so if the input # is actually just json it is better to parse it with the standard # json parser. 
- return json.loads(yamlstr, object_pairs_hook=OrderedDict) + return cast(Dict, json.loads(yamlstr, object_pairs_hook=OrderedDict)) except ValueError: yaml.SafeLoader.add_constructor(yaml.resolver.BaseResolver.DEFAULT_MAPPING_TAG, _dict_constructor) yaml.SafeLoader.add_multi_constructor("!", intrinsics_multi_constructor) - return yaml.safe_load(yamlstr) + return cast(Dict, yaml.safe_load(yamlstr)) -def parse_yaml_file(file_path, extra_context: Optional[Dict] = None): +def parse_yaml_file(file_path, extra_context: Optional[Dict] = None) -> Dict: """ Read the file, do variable substitution, parse it as JSON/YAML diff --git a/setup.py b/setup.py index dc50cea29a..eddc9b0140 100644 --- a/setup.py +++ b/setup.py @@ -71,6 +71,7 @@ def read_version(): "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", + "Programming Language :: Python :: 3.9", "Topic :: Internet", "Topic :: Software Development :: Build Tools", "Topic :: Utilities", diff --git a/tests/functional/commands/validate/lib/models/all_policy_templates.yaml b/tests/functional/commands/validate/lib/models/all_policy_templates.yaml index 5216a65f2c..c46cfaecb9 100644 --- a/tests/functional/commands/validate/lib/models/all_policy_templates.yaml +++ b/tests/functional/commands/validate/lib/models/all_policy_templates.yaml @@ -168,3 +168,9 @@ Resources: - EventBridgePutEventsPolicy: EventBusName: name + + - AcmGetCertificatePolicy: + CertificateArn: arn + + - Route53ChangeResourceRecordSetsPolicy: + HostedZoneId: test diff --git a/tests/functional/commands/validate/lib/models/api_request_model.yaml b/tests/functional/commands/validate/lib/models/api_request_model.yaml index 4dc0c5f423..5c1d96b073 100644 --- a/tests/functional/commands/validate/lib/models/api_request_model.yaml +++ b/tests/functional/commands/validate/lib/models/api_request_model.yaml @@ -15,6 +15,15 @@ Resources: RequestModel: Model: User Required: true + AnyPath: + Type: Api + 
Properties: + RestApiId: HtmlApi + Path: /any + Method: any + RequestModel: + Model: User + Required: true HtmlApi: Type: AWS::Serverless::Api diff --git a/tests/functional/commands/validate/lib/models/api_request_model_openapi_3.yaml b/tests/functional/commands/validate/lib/models/api_request_model_openapi_3.yaml index 2e9a7d26d2..69e003ebdb 100644 --- a/tests/functional/commands/validate/lib/models/api_request_model_openapi_3.yaml +++ b/tests/functional/commands/validate/lib/models/api_request_model_openapi_3.yaml @@ -27,6 +27,18 @@ Resources: Path: /iam Auth: Authorizer: AWS_IAM + AnyIam: + Type: Api + Properties: + RequestModel: + Model: User + Required: true + RestApiId: + Ref: HtmlApi + Method: any + Path: /any/iam + Auth: + Authorizer: AWS_IAM HtmlApi: diff --git a/tests/functional/commands/validate/lib/models/api_with_apikey_required.yaml b/tests/functional/commands/validate/lib/models/api_with_apikey_required.yaml index 4ae8e52680..27dfe9a720 100644 --- a/tests/functional/commands/validate/lib/models/api_with_apikey_required.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_apikey_required.yaml @@ -19,3 +19,11 @@ Resources: Method: get Auth: ApiKeyRequired: true + MyApiWithApiKeyRequiredAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithoutAuth + Path: /any/ApiKeyRequiredTrue + Method: any + Auth: + ApiKeyRequired: true diff --git a/tests/functional/commands/validate/lib/models/api_with_apikey_required_openapi_3.yaml b/tests/functional/commands/validate/lib/models/api_with_apikey_required_openapi_3.yaml index e3140b5945..bd962b7709 100644 --- a/tests/functional/commands/validate/lib/models/api_with_apikey_required_openapi_3.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_apikey_required_openapi_3.yaml @@ -20,3 +20,11 @@ Resources: Method: get Auth: ApiKeyRequired: true + MyApiWithApiKeyRequiredAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithoutAuth + Path: /any/ApiKeyRequiredTrue + Method: any 
+ Auth: + ApiKeyRequired: true diff --git a/tests/functional/commands/validate/lib/models/api_with_auth_all_maximum.yaml b/tests/functional/commands/validate/lib/models/api_with_auth_all_maximum.yaml index 831425e6da..67e3f4a8eb 100644 --- a/tests/functional/commands/validate/lib/models/api_with_auth_all_maximum.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_auth_all_maximum.yaml @@ -50,7 +50,7 @@ Resources: Context: - Authorization4 ReauthorizeEvery: 0 - + MyFunction: Type: AWS::Serverless::Function Properties: @@ -66,6 +66,14 @@ Resources: Method: get Auth: Authorizer: NONE + WithNoAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/noauth + Method: any + Auth: + Authorizer: NONE WithCognitoMultipleUserPoolsAuthorizer: Type: Api Properties: @@ -74,6 +82,14 @@ Resources: Method: post Auth: Authorizer: MyCognitoAuthMultipleUserPools + WithCognitoMultipleUserPoolsAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/cognitomultiple + Method: any + Auth: + Authorizer: MyCognitoAuthMultipleUserPools WithLambdaTokenAuthorizer: Type: Api Properties: @@ -82,7 +98,15 @@ Resources: Method: get Auth: Authorizer: MyLambdaTokenAuth - WithLambdaTokenAuthorizer: + WithLambdaTokenAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/lambdatoken + Method: any + Auth: + Authorizer: MyLambdaTokenAuth + WithLambdaTokenNoneAuthorizer: Type: Api Properties: RestApiId: !Ref MyApi @@ -90,6 +114,14 @@ Resources: Method: patch Auth: Authorizer: MyLambdaTokenAuthNoneFunctionInvokeRole + WithLambdaTokenNoneAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/lambdatokennone + Method: any + Auth: + Authorizer: MyLambdaTokenAuthNoneFunctionInvokeRole WithLambdaRequestAuthorizer: Type: Api Properties: @@ -98,9 +130,23 @@ Resources: Method: delete Auth: Authorizer: MyLambdaRequestAuth + WithLambdaRequestAuthorizerAnyMethod: + Type: Api + Properties: + 
RestApiId: !Ref MyApi + Path: /any/lambdarequest + Method: any + Auth: + Authorizer: MyLambdaRequestAuth WithDefaultAuthorizer: Type: Api Properties: RestApiId: !Ref MyApi Path: /users - Method: put \ No newline at end of file + Method: put + WithDefaultAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/default + Method: any \ No newline at end of file diff --git a/tests/functional/commands/validate/lib/models/api_with_auth_all_maximum_openapi_3.yaml b/tests/functional/commands/validate/lib/models/api_with_auth_all_maximum_openapi_3.yaml index 0012f8bc14..5c8d3597eb 100644 --- a/tests/functional/commands/validate/lib/models/api_with_auth_all_maximum_openapi_3.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_auth_all_maximum_openapi_3.yaml @@ -51,7 +51,7 @@ Resources: Context: - Authorization4 ReauthorizeEvery: 0 - + MyFunction: Type: AWS::Serverless::Function Properties: @@ -67,6 +67,14 @@ Resources: Method: get Auth: Authorizer: NONE + WithNoAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/noauth + Method: any + Auth: + Authorizer: NONE WithCognitoMultipleUserPoolsAuthorizer: Type: Api Properties: @@ -75,6 +83,14 @@ Resources: Method: post Auth: Authorizer: MyCognitoAuthMultipleUserPools + WithCognitoMultipleUserPoolsAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/cognitomultiple + Method: any + Auth: + Authorizer: MyCognitoAuthMultipleUserPools WithLambdaTokenAuthorizer: Type: Api Properties: @@ -83,7 +99,15 @@ Resources: Method: get Auth: Authorizer: MyLambdaTokenAuth - WithLambdaTokenAuthorizer: + WithLambdaTokenAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/lambdatoken + Method: any + Auth: + Authorizer: MyLambdaTokenAuth + WithLambdaTokenNoneAuthorizer: Type: Api Properties: RestApiId: !Ref MyApi @@ -91,6 +115,14 @@ Resources: Method: patch Auth: Authorizer: MyLambdaTokenAuthNoneFunctionInvokeRole + 
WithLambdaTokenNoneAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/lambdatokennone + Method: any + Auth: + Authorizer: MyLambdaTokenAuthNoneFunctionInvokeRole WithLambdaRequestAuthorizer: Type: Api Properties: @@ -99,9 +131,23 @@ Resources: Method: delete Auth: Authorizer: MyLambdaRequestAuth + WithLambdaRequestAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/lambdarequest + Method: any + Auth: + Authorizer: MyLambdaRequestAuth WithDefaultAuthorizer: Type: Api Properties: RestApiId: !Ref MyApi Path: /users - Method: put \ No newline at end of file + Method: put + WithDefaultAuthorizerAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApi + Path: /any/default + Method: any \ No newline at end of file diff --git a/tests/functional/commands/validate/lib/models/api_with_auth_all_minimum.yaml b/tests/functional/commands/validate/lib/models/api_with_auth_all_minimum.yaml index f6eda0af2c..399df76126 100644 --- a/tests/functional/commands/validate/lib/models/api_with_auth_all_minimum.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_auth_all_minimum.yaml @@ -51,18 +51,36 @@ Resources: RestApiId: !Ref MyApiWithCognitoAuth Method: get Path: /cognito + CognitoAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithCognitoAuth + Method: any + Path: /any/cognito LambdaToken: Type: Api Properties: RestApiId: !Ref MyApiWithLambdaTokenAuth Method: get Path: /lambda-token + LambdaTokenAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaTokenAuth + Method: any + Path: /any/lambda-token LambdaRequest: Type: Api Properties: RestApiId: !Ref MyApiWithLambdaRequestAuth Method: get Path: /lambda-request + LambdaRequestAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaRequestAuth + Method: any + Path: /any/lambda-request MyUserPool: Type: AWS::Cognito::UserPool Properties: diff --git 
a/tests/functional/commands/validate/lib/models/api_with_auth_all_minimum_openapi.yaml b/tests/functional/commands/validate/lib/models/api_with_auth_all_minimum_openapi.yaml index 486bd1250f..bfa377bbbf 100644 --- a/tests/functional/commands/validate/lib/models/api_with_auth_all_minimum_openapi.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_auth_all_minimum_openapi.yaml @@ -54,18 +54,36 @@ Resources: RestApiId: !Ref MyApiWithCognitoAuth Method: get Path: /cognito + CognitoAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithCognitoAuth + Method: any + Path: /any/cognito LambdaToken: Type: Api Properties: RestApiId: !Ref MyApiWithLambdaTokenAuth Method: get Path: /lambda-token + LambdaTokenAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaTokenAuth + Method: any + Path: /any/lambda-token LambdaRequest: Type: Api Properties: RestApiId: !Ref MyApiWithLambdaRequestAuth Method: get Path: /lambda-request + LambdaRequestAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaRequestAuth + Method: any + Path: /any/lambda-request MyUserPool: Type: AWS::Cognito::UserPool Properties: diff --git a/tests/functional/commands/validate/lib/models/api_with_auth_no_default.yaml b/tests/functional/commands/validate/lib/models/api_with_auth_no_default.yaml index 85d591b06e..3f3900386c 100644 --- a/tests/functional/commands/validate/lib/models/api_with_auth_no_default.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_auth_no_default.yaml @@ -48,18 +48,36 @@ Resources: RestApiId: !Ref MyApiWithCognitoAuth Method: get Path: /cognito + CognitoAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithCognitoAuth + Method: any + Path: /any/cognito LambdaToken: Type: Api Properties: RestApiId: !Ref MyApiWithLambdaTokenAuth Method: get Path: /lambda-token + LambdaTokenAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaTokenAuth + Method: any + Path: /any/lambda-token LambdaRequest: Type: 
Api Properties: RestApiId: !Ref MyApiWithLambdaRequestAuth Method: get Path: /lambda-request + LambdaRequestAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaRequestAuth + Method: any + Path: /any/lambda-request MyUserPool: Type: AWS::Cognito::UserPool Properties: diff --git a/tests/functional/commands/validate/lib/models/api_with_aws_account_blacklist.yaml b/tests/functional/commands/validate/lib/models/api_with_aws_account_blacklist.yaml index b93e63d9b6..19b51412a9 100644 --- a/tests/functional/commands/validate/lib/models/api_with_aws_account_blacklist.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_aws_account_blacklist.yaml @@ -23,3 +23,9 @@ Resources: Properties: Method: Put Path: /get + Any: + Type: Api + Properties: + Method: any + Path: /any + diff --git a/tests/functional/commands/validate/lib/models/api_with_aws_account_whitelist.yaml b/tests/functional/commands/validate/lib/models/api_with_aws_account_whitelist.yaml index c69a9b64f3..ff55cbae2b 100644 --- a/tests/functional/commands/validate/lib/models/api_with_aws_account_whitelist.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_aws_account_whitelist.yaml @@ -26,3 +26,16 @@ Resources: ] Method: Put Path: /get + Any: + Type: Api + Properties: + Auth: + ResourcePolicy: + AwsAccountWhitelist: [ + "12345" + ] + AwsAccountBlacklist: [ + "67890" + ] + Method: any + Path: /any diff --git a/tests/functional/commands/validate/lib/models/api_with_cors_and_auth_preflight_auth.yaml b/tests/functional/commands/validate/lib/models/api_with_cors_and_auth_preflight_auth.yaml index e984428c15..1fb222b890 100644 --- a/tests/functional/commands/validate/lib/models/api_with_cors_and_auth_preflight_auth.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_cors_and_auth_preflight_auth.yaml @@ -24,6 +24,13 @@ Resources: Method: post RestApiId: !Ref ServerlessApi + AnyHtml: + Type: Api + Properties: + Path: /any + Method: any + RestApiId: !Ref ServerlessApi 
+ ServerlessApi: Type: AWS::Serverless::Api diff --git a/tests/functional/commands/validate/lib/models/api_with_cors_and_conditions_no_definitionbody.yaml b/tests/functional/commands/validate/lib/models/api_with_cors_and_conditions_no_definitionbody.yaml index 6070b112d9..5075726ae7 100644 --- a/tests/functional/commands/validate/lib/models/api_with_cors_and_conditions_no_definitionbody.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_cors_and_conditions_no_definitionbody.yaml @@ -48,6 +48,13 @@ Resources: Path: / Method: post + AnyHtml: + Type: Api + Properties: + RestApiId: !Ref ExplicitApi + Path: /any + Method: any + ExplicitApi: Type: AWS::Serverless::Api diff --git a/tests/functional/commands/validate/lib/models/api_with_cors_and_only_methods.yaml b/tests/functional/commands/validate/lib/models/api_with_cors_and_only_methods.yaml index 1ee2d92883..724de43017 100644 --- a/tests/functional/commands/validate/lib/models/api_with_cors_and_only_methods.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_cors_and_only_methods.yaml @@ -16,4 +16,9 @@ Resources: Properties: Path: / Method: get + AnyHtml: + Type: Api + Properties: + Path: /any + Method: any diff --git a/tests/functional/commands/validate/lib/models/api_with_cors_no_definitionbody.yaml b/tests/functional/commands/validate/lib/models/api_with_cors_no_definitionbody.yaml index f8b7bcd522..7d496c2f9b 100644 --- a/tests/functional/commands/validate/lib/models/api_with_cors_no_definitionbody.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_cors_no_definitionbody.yaml @@ -27,6 +27,13 @@ Resources: Path: / Method: post + AnyHtml: + Type: Api + Properties: + RestApiId: !Ref ExplicitApi + Path: /any + Method: any + ExplicitApi: Type: AWS::Serverless::Api diff --git a/tests/functional/commands/validate/lib/models/api_with_default_aws_iam_auth_and_no_auth_route.yaml b/tests/functional/commands/validate/lib/models/api_with_default_aws_iam_auth_and_no_auth_route.yaml 
index 8bad587889..d3d69d577c 100644 --- a/tests/functional/commands/validate/lib/models/api_with_default_aws_iam_auth_and_no_auth_route.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_default_aws_iam_auth_and_no_auth_route.yaml @@ -19,6 +19,12 @@ Resources: RestApiId: !Ref MyApiWithAwsIamAuth Path: / Method: post + MyApiWithAwsIamAuthAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithAwsIamAuth + Path: /any/iam + Method: any MyApiWithNoAuth: Type: Api Properties: @@ -27,3 +33,11 @@ Resources: Method: get Auth: Authorizer: 'NONE' + MyApiWithNoAuthAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithAwsIamAuth + Path: /any/none + Method: any + Auth: + Authorizer: 'NONE' diff --git a/tests/functional/commands/validate/lib/models/api_with_if_conditional_with_resource_policy.yaml b/tests/functional/commands/validate/lib/models/api_with_if_conditional_with_resource_policy.yaml index 3ffecb9b74..cfbc74ec1e 100644 --- a/tests/functional/commands/validate/lib/models/api_with_if_conditional_with_resource_policy.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_if_conditional_with_resource_policy.yaml @@ -50,5 +50,12 @@ Resources: Ref: ExplicitApi Path: /three Method: put + AnyHtml: + Type: Api + Properties: + RestApiId: + Ref: ExplicitApi + Path: /any + Method: any \ No newline at end of file diff --git a/tests/functional/commands/validate/lib/models/api_with_method_aws_iam_auth.yaml b/tests/functional/commands/validate/lib/models/api_with_method_aws_iam_auth.yaml index 8a1c8c6da2..16c06dc43e 100644 --- a/tests/functional/commands/validate/lib/models/api_with_method_aws_iam_auth.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_method_aws_iam_auth.yaml @@ -37,3 +37,29 @@ Resources: Auth: Authorizer: AWS_IAM InvokeRole: CALLER_CREDENTIALS + MyApiWithAwsIamAuthAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithoutAuth + Path: /any/one + Method: any + Auth: + Authorizer: AWS_IAM + 
MyApiWithAwsIamAuthAndCustomInvokeRoleAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithoutAuth + Path: /any/two + Method: any + Auth: + Authorizer: AWS_IAM + InvokeRole: arn:aws:iam::123:role/AUTH_AWS_IAM + MyApiWithAwsIamAuthAndDefaultInvokeRoleAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithoutAuth + Path: /any/three + Method: any + Auth: + Authorizer: AWS_IAM + InvokeRole: CALLER_CREDENTIALS diff --git a/tests/functional/commands/validate/lib/models/api_with_mode.yaml b/tests/functional/commands/validate/lib/models/api_with_mode.yaml new file mode 100644 index 0000000000..8df0693af4 --- /dev/null +++ b/tests/functional/commands/validate/lib/models/api_with_mode.yaml @@ -0,0 +1,22 @@ +Resources: + Function: + Type: AWS::Serverless::Function + Properties: + CodeUri: s3://sam-demo-bucket/member_portal.zip + Handler: index.gethtml + Runtime: nodejs12.x + Events: + GetHtml: + Type: Api + Properties: + RestApiId: Api + Path: / + Method: get + + Api: + Type: AWS::Serverless::Api + Properties: + StageName: Prod + DefinitionUri: s3://sam-demo-bucket/webpage_swagger.json + Description: my description + Mode: overwrite diff --git a/tests/functional/commands/validate/lib/models/api_with_open_api_version.yaml b/tests/functional/commands/validate/lib/models/api_with_open_api_version.yaml index 1ffd32bd6a..7efa33f629 100644 --- a/tests/functional/commands/validate/lib/models/api_with_open_api_version.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_open_api_version.yaml @@ -16,6 +16,11 @@ Resources: Properties: Path: / Method: get + AnyHtml: + Type: Api + Properties: + Path: /any + Method: any ExplicitApi: Type: AWS::Serverless::Api Properties: diff --git a/tests/functional/commands/validate/lib/models/api_with_open_api_version_2.yaml b/tests/functional/commands/validate/lib/models/api_with_open_api_version_2.yaml index 688344e032..52e6530326 100644 ---
a/tests/functional/commands/validate/lib/models/api_with_open_api_version_2.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_open_api_version_2.yaml @@ -16,6 +16,11 @@ Resources: Properties: Path: / Method: get + AnyHtml: + Type: Api + Properties: + Path: /any + Method: any ExplicitApi: Type: AWS::Serverless::Api Properties: diff --git a/tests/functional/commands/validate/lib/models/api_with_path_parameters.yaml b/tests/functional/commands/validate/lib/models/api_with_path_parameters.yaml index ac79e312c5..e1799d3e70 100644 --- a/tests/functional/commands/validate/lib/models/api_with_path_parameters.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_path_parameters.yaml @@ -12,6 +12,12 @@ Resources: RestApiId: HtmlApi Path: /{prameter}/resources Method: get + AnyHtml: + Type: Api + Properties: + RestApiId: HtmlApi + Path: /any/{prameter}/resources + Method: any HtmlApi: Type: AWS::Serverless::Api diff --git a/tests/functional/commands/validate/lib/models/api_with_resource_policy.yaml b/tests/functional/commands/validate/lib/models/api_with_resource_policy.yaml index fb9071db25..2c34783842 100644 --- a/tests/functional/commands/validate/lib/models/api_with_resource_policy.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_resource_policy.yaml @@ -37,5 +37,12 @@ Resources: Ref: ExplicitApi Path: /three Method: put + AnyHtml: + Type: Api + Properties: + RestApiId: + Ref: ExplicitApi + Path: /any + Method: any \ No newline at end of file diff --git a/tests/functional/commands/validate/lib/models/api_with_resource_policy_global_implicit.yaml b/tests/functional/commands/validate/lib/models/api_with_resource_policy_global_implicit.yaml index d3599c73c4..613f67dc10 100644 --- a/tests/functional/commands/validate/lib/models/api_with_resource_policy_global_implicit.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_resource_policy_global_implicit.yaml @@ -21,3 +21,18 @@ Resources: Action: 'execute-api:blah', 
Resource: ['execute-api:/*/*/*'] }] + AddItemAnyMethod: + Type: Api + Properties: + Path: /any/add + Method: any + Auth: + ResourcePolicy: + CustomStatements: [{ + Action: 'execute-api:Invoke', + Resource: ['execute-api:/*/*/*'] + }, + { + Action: 'execute-api:blah', + Resource: ['execute-api:/*/*/*'] + }] diff --git a/tests/functional/commands/validate/lib/models/api_with_resource_refs.yaml b/tests/functional/commands/validate/lib/models/api_with_resource_refs.yaml index 3381677ef2..e84845cbba 100644 --- a/tests/functional/commands/validate/lib/models/api_with_resource_refs.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_resource_refs.yaml @@ -21,6 +21,11 @@ Resources: Properties: Path: /html Method: GET + GetHtmlAnyMethod: + Type: Api + Properties: + Path: /any/html + Method: any Outputs: ImplicitApiDeployment: diff --git a/tests/functional/commands/validate/lib/models/api_with_source_vpc_blacklist.yaml b/tests/functional/commands/validate/lib/models/api_with_source_vpc_blacklist.yaml index 65073bdede..6315a79314 100644 --- a/tests/functional/commands/validate/lib/models/api_with_source_vpc_blacklist.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_source_vpc_blacklist.yaml @@ -23,4 +23,9 @@ Resources: Properties: Method: Put Path: /get + ApiAnyMethod: + Type: Api + Properties: + Method: any + Path: /any/get diff --git a/tests/functional/commands/validate/lib/models/api_with_source_vpc_whitelist.yaml b/tests/functional/commands/validate/lib/models/api_with_source_vpc_whitelist.yaml index 1cacf39415..f67ea34d8a 100644 --- a/tests/functional/commands/validate/lib/models/api_with_source_vpc_whitelist.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_source_vpc_whitelist.yaml @@ -31,11 +31,21 @@ Resources: Properties: Method: Put Path: /get + ApiAnyMethod: + Type: Api + Properties: + Method: any + Path: /any/get Fetch: Type: Api Properties: Method: Post Path: /fetch + FetchAnyMethod: + Type: Api + Properties: + 
Method: any + Path: /any/fetch MyApi: Type: AWS::Serverless::Api diff --git a/tests/functional/commands/validate/lib/models/api_with_swagger_and_openapi_with_auth.yaml b/tests/functional/commands/validate/lib/models/api_with_swagger_and_openapi_with_auth.yaml index af30762da9..1b796e449b 100644 --- a/tests/functional/commands/validate/lib/models/api_with_swagger_and_openapi_with_auth.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_swagger_and_openapi_with_auth.yaml @@ -24,6 +24,11 @@ Resources: Properties: Path: / Method: get + GetHtmlAnyMethod: + Type: Api + Properties: + Path: /any + Method: any ExplicitApi: Type: AWS::Serverless::Api diff --git a/tests/functional/commands/validate/lib/models/api_with_swagger_authorizer_none.yaml b/tests/functional/commands/validate/lib/models/api_with_swagger_authorizer_none.yaml new file mode 100644 index 0000000000..98173772ec --- /dev/null +++ b/tests/functional/commands/validate/lib/models/api_with_swagger_authorizer_none.yaml @@ -0,0 +1,141 @@ +Resources: + MyApiWithCognitoAuth: + Type: "AWS::Serverless::Api" + Properties: + StageName: Prod + Auth: + Authorizers: + MyCognitoAuth: + UserPoolArn: !GetAtt MyUserPool.Arn + DefaultAuthorizer: MyCognitoAuth + + MyApiWithLambdaTokenAuth: + Type: "AWS::Serverless::Api" + Properties: + StageName: Prod + Auth: + Authorizers: + MyLambdaTokenAuth: + FunctionArn: !GetAtt MyAuthFn.Arn + DefaultAuthorizer: MyLambdaTokenAuth + + MyApiWithLambdaRequestAuth: + Type: "AWS::Serverless::Api" + Properties: + StageName: Prod + DefinitionBody: + swagger: 2.0 + info: + version: '1.0' + title: !Ref AWS::StackName + schemes: + - https + paths: + "/lambda-request": + get: + x-amazon-apigateway-integration: + httpMethod: POST + type: aws_proxy + uri: !Sub arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${MyFn.Arn}/invocations + passthroughBehavior: when_no_match + responses: {} + Auth: + Authorizers: + MyLambdaRequestAuth: + FunctionPayloadType: REQUEST + 
FunctionArn: !GetAtt MyAuthFn.Arn + Identity: + Headers: + - Authorization1 + DefaultAuthorizer: MyLambdaRequestAuth + + MyAuthFn: + Type: AWS::Serverless::Function + Properties: + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Handler: index.handler + Runtime: nodejs8.10 + + MyFn: + Type: AWS::Serverless::Function + Properties: + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Handler: index.handler + Runtime: nodejs8.10 + Events: + Cognito: + Type: Api + Properties: + RestApiId: !Ref MyApiWithCognitoAuth + Method: get + Auth: + Authorizer: NONE + Path: /cognito + CognitoAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithCognitoAuth + Method: any + Auth: + Authorizer: NONE + Path: /any/cognito + LambdaToken: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaTokenAuth + Method: get + Auth: + Authorizer: NONE + Path: /lambda-token + LambdaTokenAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaTokenAuth + Method: any + Auth: + Authorizer: NONE + Path: /any/lambda-token + LambdaRequest: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaRequestAuth + Auth: + Authorizer: NONE + Method: get + Path: /lambda-request + LambdaRequestAnyMethod: + Type: Api + Properties: + RestApiId: !Ref MyApiWithLambdaRequestAuth + Auth: + Authorizer: NONE + Method: any + Path: /any/lambda-request + + MyUserPool: + Type: AWS::Cognito::UserPool + Properties: + UserPoolName: UserPoolName + Policies: + PasswordPolicy: + MinimumLength: 8 + UsernameAttributes: + - email + Schema: + - AttributeDataType: String + Name: email + Required: false \ No newline at end of file diff --git a/tests/functional/commands/validate/lib/models/api_with_usageplans.yaml b/tests/functional/commands/validate/lib/models/api_with_usageplans.yaml index 836d98648b..41b08e493d 100644 --- 
a/tests/functional/commands/validate/lib/models/api_with_usageplans.yaml +++ b/tests/functional/commands/validate/lib/models/api_with_usageplans.yaml @@ -63,6 +63,13 @@ Resources: Ref: MyApiOne Method: get Path: /path/one + ApiKeyAnyMethod: + Type: Api + Properties: + RestApiId: + Ref: MyApiOne + Method: any + Path: /any/path/one MyFunctionTwo: Type: AWS::Serverless::Function diff --git a/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_attributes_three.yaml b/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_attributes_three.yaml new file mode 100644 index 0000000000..aed811ca0a --- /dev/null +++ b/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_attributes_three.yaml @@ -0,0 +1,102 @@ +Globals: + Api: + Auth: + ApiKeyRequired: true + UsagePlan: + CreateUsagePlan: SHARED + +Conditions: + C1: + Fn::Equals: + - test + - test + C2: + Fn::Equals: + - test + - test + +Resources: + MyApiOne: + Type: AWS::Serverless::Api + Condition: C1 + UpdateReplacePolicy: Delete + Properties: + StageName: Prod + + MyApiTwo: + Type: AWS::Serverless::Api + Condition: C2 + UpdateReplacePolicy: Snapshot + Properties: + StageName: Prod + + MyApiThree: + Type: AWS::Serverless::Api + Properties: + StageName: Prod + + MyFunctionOne: + Type: AWS::Serverless::Function + Properties: + Handler: index.handler + Runtime: nodejs12.x + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Events: + ApiKey: + Type: Api + Properties: + RestApiId: + Ref: MyApiOne + Method: get + Path: /path/one + + MyFunctionTwo: + Type: AWS::Serverless::Function + Properties: + Handler: index.handler + Runtime: nodejs12.x + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Events: + ApiKey: + Type: Api + Properties: + RestApiId: + Ref: MyApiTwo + Method: get + Path: /path/two + + 
MyFunctionThree: + Type: AWS::Serverless::Function + Properties: + Handler: index.handler + Runtime: nodejs12.x + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Events: + ApiKey: + Type: Api + Properties: + RestApiId: + Ref: MyApiThree + Method: get + Path: /path/three \ No newline at end of file diff --git a/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_attributes_two.yaml b/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_attributes_two.yaml new file mode 100644 index 0000000000..36c5bab657 --- /dev/null +++ b/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_attributes_two.yaml @@ -0,0 +1,75 @@ +Globals: + Api: + Auth: + ApiKeyRequired: true + UsagePlan: + CreateUsagePlan: SHARED + +Conditions: + C1: + Fn::Equals: + - test + - test + C2: + Fn::Equals: + - test + - test + +Resources: + MyApiOne: + Type: AWS::Serverless::Api + DeletionPolicy: Delete + Condition: C1 + Properties: + StageName: Prod + + MyApiTwo: + Type: AWS::Serverless::Api + DeletionPolicy: Retain + Condition: C2 + Properties: + StageName: Prod + + MyFunctionOne: + Type: AWS::Serverless::Function + Properties: + Handler: index.handler + Runtime: nodejs12.x + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Events: + ApiKey: + Type: Api + Properties: + RestApiId: + Ref: MyApiOne + Method: get + Path: /path/one + + MyFunctionTwo: + Type: AWS::Serverless::Function + Properties: + Handler: index.handler + Runtime: nodejs12.x + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Events: + ApiKey: + Type: Api + Properties: + RestApiId: + Ref: MyApiTwo + Method: get + Path: /path/two \ No newline at end of file diff --git 
a/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_no_side_effect_1.yaml b/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_no_side_effect_1.yaml new file mode 100644 index 0000000000..f05fe7511b --- /dev/null +++ b/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_no_side_effect_1.yaml @@ -0,0 +1,61 @@ +Globals: + Api: + Auth: + ApiKeyRequired: true + UsagePlan: + CreateUsagePlan: SHARED + +Resources: + MyApiOne: + Type: AWS::Serverless::Api + Properties: + StageName: Prod + + MyApiTwo: + Type: AWS::Serverless::Api + Properties: + StageName: Prod + + MyFunctionOne: + Type: AWS::Serverless::Function + Properties: + Handler: index.handler + Runtime: nodejs12.x + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Events: + ApiKey: + Type: Api + Properties: + RestApiId: + Ref: MyApiOne + Method: get + Path: /path/one + + MyFunctionTwo: + Type: AWS::Serverless::Function + Properties: + Handler: index.handler + Runtime: nodejs12.x + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Events: + ApiKey: + Type: Api + Properties: + RestApiId: + Ref: MyApiTwo + Method: get + Path: /path/two diff --git a/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_no_side_effect_2.yaml b/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_no_side_effect_2.yaml new file mode 100644 index 0000000000..857e387692 --- /dev/null +++ b/tests/functional/commands/validate/lib/models/api_with_usageplans_shared_no_side_effect_2.yaml @@ -0,0 +1,34 @@ +Globals: + Api: + Auth: + ApiKeyRequired: true + UsagePlan: + CreateUsagePlan: SHARED + +Resources: + MyApiFour: + Type: AWS::Serverless::Api + Properties: + StageName: Prod + + MyFunctionFour: + Type: AWS::Serverless::Function + Properties: + Handler: index.handler + 
Runtime: nodejs12.x + InlineCode: | + exports.handler = async (event) => { + return { + statusCode: 200, + body: JSON.stringify(event), + headers: {} + } + } + Events: + ApiKey: + Type: Api + Properties: + RestApiId: + Ref: MyApiFour + Method: get + Path: /path/four diff --git a/tests/functional/commands/validate/lib/models/function_with_deployment_preference_alarms_intrinsic_if.yaml b/tests/functional/commands/validate/lib/models/function_with_deployment_preference_alarms_intrinsic_if.yaml new file mode 100644 index 0000000000..f392f10628 --- /dev/null +++ b/tests/functional/commands/validate/lib/models/function_with_deployment_preference_alarms_intrinsic_if.yaml @@ -0,0 +1,23 @@ +Conditions: + MyCondition: + Fn::Equals: + - true + - false +Resources: + MinimalFunction: + Type: "AWS::Serverless::Function" + Properties: + CodeUri: s3://sam-demo-bucket/hello.zip + Handler: hello.handler + Runtime: python2.7 + AutoPublishAlias: live + DeploymentPreference: + Type: Linear10PercentEvery3Minutes + Alarms: + Fn::If: + - MyCondition + - - Alarm1 + - Alarm2 + - Alarm3 + - - Alarm1 + - Alarm5 diff --git a/tests/functional/commands/validate/lib/models/function_with_mq_virtual_host.yaml b/tests/functional/commands/validate/lib/models/function_with_mq_virtual_host.yaml new file mode 100644 index 0000000000..b5d2c62085 --- /dev/null +++ b/tests/functional/commands/validate/lib/models/function_with_mq_virtual_host.yaml @@ -0,0 +1,19 @@ +Resources: + MQFunction: + Type: 'AWS::Serverless::Function' + Properties: + CodeUri: s3://sam-demo-bucket/queues.zip + Handler: queue.mq_handler + Runtime: python2.7 + Events: + MyMQQueue: + Type: MQ + Properties: + Broker: arn:aws:mq:us-east-2:123456789012:broker:MyBroker:b-1234a5b6-78cd-901e-2fgh-3i45j6k178l9 + Queues: + - "Queue1" + SourceAccessConfigurations: + - Type: BASIC_AUTH + URI: arn:aws:secretsmanager:us-west-2:123456789012:secret:my-path/my-secret-name-1a2b3c + - Type: VIRTUAL_HOST + URI: vhost_name \ No newline at end of file diff 
--git a/tests/functional/commands/validate/lib/models/implicit_api_deletion_policy_precedence.yaml b/tests/functional/commands/validate/lib/models/implicit_api_deletion_policy_precedence.yaml new file mode 100644 index 0000000000..643b9ac477 --- /dev/null +++ b/tests/functional/commands/validate/lib/models/implicit_api_deletion_policy_precedence.yaml @@ -0,0 +1,32 @@ +Resources: + RestApiFunction: + Type: AWS::Serverless::Function + DeletionPolicy: Delete + UpdateReplacePolicy: Retain + Properties: + CodeUri: s3://sam-demo-bucket/todo_list.zip + Handler: index.restapi + Runtime: nodejs12.x + Policies: AmazonDynamoDBFullAccess + Events: + GetHtml: + Type: Api + Properties: + Path: /{proxy+} + Method: any + + GetHtmlFunction: + Type: AWS::Serverless::Function + DeletionPolicy: Retain + UpdateReplacePolicy: Retain + Properties: + CodeUri: s3://sam-demo-bucket/todo_list.zip + Handler: index.gethtml + Runtime: nodejs12.x + Policies: AmazonDynamoDBReadOnlyAccess + Events: + GetHtml: + Type: Api + Properties: + Path: /{proxy++} + Method: any diff --git a/tests/functional/commands/validate/lib/models/layer_deletion_policy_precedence.yaml b/tests/functional/commands/validate/lib/models/layer_deletion_policy_precedence.yaml new file mode 100644 index 0000000000..a967ed6212 --- /dev/null +++ b/tests/functional/commands/validate/lib/models/layer_deletion_policy_precedence.yaml @@ -0,0 +1,18 @@ +Resources: + MinimalLayer: + Type: 'AWS::Serverless::LayerVersion' + DeletionPolicy: Delete + Properties: + ContentUri: s3://sam-demo-bucket/layer.zip + RetentionPolicy: Retain + + MinimalLayer2: + Type: 'AWS::Serverless::LayerVersion' + DeletionPolicy: Delete + Properties: + ContentUri: s3://sam-demo-bucket/layer.zip + + MinimalLayer3: + Type: 'AWS::Serverless::LayerVersion' + Properties: + ContentUri: s3://sam-demo-bucket/layer.zip \ No newline at end of file diff --git a/tests/functional/commands/validate/lib/models/state_machine_with_xray_policies.yaml 
b/tests/functional/commands/validate/lib/models/state_machine_with_xray_policies.yaml new file mode 100644 index 0000000000..719d5874ab --- /dev/null +++ b/tests/functional/commands/validate/lib/models/state_machine_with_xray_policies.yaml @@ -0,0 +1,22 @@ +Resources: + MyFunction: + Type: "AWS::Serverless::Function" + Properties: + CodeUri: s3://sam-demo-bucket/hello.zip + Handler: hello.handler + Runtime: python2.7 + + StateMachine: + Type: AWS::Serverless::StateMachine + Properties: + Name: MyBasicStateMachine + Type: STANDARD + DefinitionUri: s3://sam-demo-bucket/my-state-machine.asl.json + Tracing: + Enabled: true + Policies: + - Version: "2012-10-17" + Statement: + - Effect: Allow + Action: lambda:InvokeFunction + Resource: !GetAtt MyFunction.Arn diff --git a/tests/functional/commands/validate/lib/models/state_machine_with_xray_role.yaml b/tests/functional/commands/validate/lib/models/state_machine_with_xray_role.yaml new file mode 100644 index 0000000000..f5e56e7294 --- /dev/null +++ b/tests/functional/commands/validate/lib/models/state_machine_with_xray_role.yaml @@ -0,0 +1,10 @@ +Resources: + StateMachine: + Type: AWS::Serverless::StateMachine + Properties: + Name: MyStateMachineWithXRayTracing + Type: STANDARD + DefinitionUri: s3://sam-demo-bucket/my-state-machine.asl.json + Role: arn:aws:iam::123456123456:role/service-role/SampleRole + Tracing: + Enabled: true diff --git a/tests/functional/commands/validate/lib/models/version_deletion_policy_precedence.yaml b/tests/functional/commands/validate/lib/models/version_deletion_policy_precedence.yaml new file mode 100644 index 0000000000..bf868f9a6e --- /dev/null +++ b/tests/functional/commands/validate/lib/models/version_deletion_policy_precedence.yaml @@ -0,0 +1,19 @@ +Resources: + MinimalFunction: + Type: 'AWS::Serverless::Function' + Properties: + CodeUri: s3://sam-demo-bucket/hello.zip + Handler: hello.handler + Runtime: python2.7 + AutoPublishAlias: live + VersionDescription: sam-testing + + 
MinimalFunction2: + Type: 'AWS::Serverless::Function' + DeletionPolicy: Delete + Properties: + CodeUri: s3://sam-demo-bucket/hello.zip + Handler: hello.handler + Runtime: python2.7 + AutoPublishAlias: live + VersionDescription: sam-testing \ No newline at end of file diff --git a/tests/functional/commands/validate/lib/test_sam_template_validator.py b/tests/functional/commands/validate/lib/test_sam_template_validator.py index 2dae4a5075..8d79b836ea 100644 --- a/tests/functional/commands/validate/lib/test_sam_template_validator.py +++ b/tests/functional/commands/validate/lib/test_sam_template_validator.py @@ -36,7 +36,7 @@ def test_valid_template(self): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"PolicyName": "FakePolicy"} - validator = SamTemplateValidator(template, managed_policy_mock) + validator = SamTemplateValidator(template, managed_policy_mock, region="us-east-1") # Should not throw an exception validator.is_valid() @@ -56,7 +56,7 @@ def test_invalid_template(self): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"PolicyName": "FakePolicy"} - validator = SamTemplateValidator(template, managed_policy_mock) + validator = SamTemplateValidator(template, managed_policy_mock, region="us-east-1") with self.assertRaises(InvalidSamDocumentException): validator.is_valid() @@ -76,7 +76,7 @@ def test_valid_template_with_local_code_for_function(self): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"PolicyName": "FakePolicy"} - validator = SamTemplateValidator(template, managed_policy_mock) + validator = SamTemplateValidator(template, managed_policy_mock, region="us-east-1") # Should not throw an exception validator.is_valid() @@ -93,7 +93,7 @@ def test_valid_template_with_local_code_for_layer_version(self): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"PolicyName": "FakePolicy"} - validator = SamTemplateValidator(template, managed_policy_mock) + validator = 
SamTemplateValidator(template, managed_policy_mock, region="us-east-1") # Should not throw an exception validator.is_valid() @@ -113,7 +113,7 @@ def test_valid_template_with_local_code_for_api(self): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"PolicyName": "FakePolicy"} - validator = SamTemplateValidator(template, managed_policy_mock) + validator = SamTemplateValidator(template, managed_policy_mock, region="us-east-1") # Should not throw an exception validator.is_valid() @@ -133,7 +133,7 @@ def test_valid_template_with_DefinitionBody_for_api(self): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"PolicyName": "FakePolicy"} - validator = SamTemplateValidator(template, managed_policy_mock) + validator = SamTemplateValidator(template, managed_policy_mock, region="us-east-1") # Should not throw an exception validator.is_valid() @@ -165,7 +165,7 @@ def test_valid_template_with_s3_object_passed(self): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"PolicyName": "FakePolicy"} - validator = SamTemplateValidator(template, managed_policy_mock) + validator = SamTemplateValidator(template, managed_policy_mock, region="us-east-1") # Should not throw an exception validator.is_valid() @@ -187,7 +187,7 @@ def test_valid_api_request_model_template(self, template_path): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"PolicyName": "FakePolicy"} - validator = SamTemplateValidator(template, managed_policy_mock) + validator = SamTemplateValidator(template, managed_policy_mock, region="us-east-1") # Should not throw an exception validator.is_valid() diff --git a/tests/integration/buildcmd/build_integ_base.py b/tests/integration/buildcmd/build_integ_base.py index 915235985c..f8649404e9 100644 --- a/tests/integration/buildcmd/build_integ_base.py +++ b/tests/integration/buildcmd/build_integ_base.py @@ -73,6 +73,7 @@ def get_command_list( container_env_var=None, container_env_var_file=None, 
build_image=None, + region=None, ): command_list = [self.cmd, "build"] @@ -118,6 +119,9 @@ def get_command_list( if build_image: command_list += ["--build-image", build_image] + if region: + command_list += ["--region", region] + return command_list def verify_docker_container_cleanedup(self, runtime): diff --git a/tests/integration/buildcmd/test_build_cmd.py b/tests/integration/buildcmd/test_build_cmd.py index 7fdc30c2bd..9908d1eef0 100644 --- a/tests/integration/buildcmd/test_build_cmd.py +++ b/tests/integration/buildcmd/test_build_cmd.py @@ -1,39 +1,45 @@ -import shutil -import sys -import os import logging +import os import random -from unittest import skipIf +import shutil +import sys from pathlib import Path -from parameterized import parameterized, parameterized_class +from unittest import skipIf import pytest +from parameterized import parameterized, parameterized_class from samcli.lib.utils import osutils -from .build_integ_base import ( - BuildIntegBase, - DedupBuildIntegBase, - CachedBuildIntegBase, - BuildIntegRubyBase, - NestedBuildIntegBase, - IntrinsicIntegBase, -) from tests.testing_utils import ( IS_WINDOWS, RUNNING_ON_CI, + RUNNING_TEST_FOR_MASTER_ON_CI, + RUN_BY_CANARY, CI_OVERRIDE, run_command, SKIP_DOCKER_TESTS, SKIP_DOCKER_MESSAGE, ) +from .build_integ_base import ( + BuildIntegBase, + DedupBuildIntegBase, + CachedBuildIntegBase, + BuildIntegRubyBase, + NestedBuildIntegBase, + IntrinsicIntegBase, +) LOG = logging.getLogger(__name__) TIMEOUT = 420 # 7 mins +# SAR tests require credentials. This is to skip running the test where credentials are not available. 
+SKIP_SAR_TESTS = RUNNING_ON_CI and RUNNING_TEST_FOR_MASTER_ON_CI and not RUN_BY_CANARY + @skipIf( - ((IS_WINDOWS and RUNNING_ON_CI) and not CI_OVERRIDE), + # Hits public ECR pull limitation, move it to canary tests + ((not RUN_BY_CANARY) or (IS_WINDOWS and RUNNING_ON_CI) and not CI_OVERRIDE), "Skip build tests on windows when running in CI unless overridden", ) class TestBuildCommand_PythonFunctions_Images(BuildIntegBase): @@ -49,7 +55,7 @@ class TestBuildCommand_PythonFunctions_Images(BuildIntegBase): FUNCTION_LOGICAL_ID_IMAGE = "ImageFunction" - @parameterized.expand([("3.6", False), ("3.7", False), ("3.8", False)]) + @parameterized.expand([("3.6", False), ("3.7", False), ("3.8", False), ("3.9", False)]) @pytest.mark.flaky(reruns=3) def test_with_default_requirements(self, runtime, use_container): overrides = { @@ -91,6 +97,7 @@ class TestBuildCommand_PythonFunctions(BuildIntegBase): ("python3.6", "Python", False), ("python3.7", "Python", False), ("python3.8", "Python", False), + ("python3.9", "Python", False), # numpy 1.20.3 (in PythonPEP600/requirements.txt) only support python 3.7+ ("python3.7", "PythonPEP600", False), ("python3.8", "PythonPEP600", False), @@ -98,6 +105,7 @@ class TestBuildCommand_PythonFunctions(BuildIntegBase): ("python3.6", "Python", "use_container"), ("python3.7", "Python", "use_container"), ("python3.8", "Python", "use_container"), + ("python3.9", "Python", "use_container"), ] ) @pytest.mark.flaky(reruns=3) @@ -1127,8 +1135,8 @@ def test_with_wrong_builder_specified_python_runtime(self, use_container): # runtime is chosen based off current python version. runtime = self._get_python_version() - # BuildMethod is set to the ruby2.7, this should cause failure. - overrides = {"Runtime": runtime, "CodeUri": "Provided", "Handler": "main.handler", "BuildMethod": "ruby2.7"} + # BuildMethod is set to the java8, this should cause failure. 
+ overrides = {"Runtime": runtime, "CodeUri": "Provided", "Handler": "main.handler", "BuildMethod": "java8"} manifest_path = os.path.join(self.test_data_path, "Provided", "requirements.txt") cmdlist = self.get_command_list( @@ -1182,12 +1190,12 @@ class TestBuildWithDedupBuilds(DedupBuildIntegBase): ), (False, "Java/gradlew", "aws.example.Hello::myHandler", "aws.example.SecondFunction::myHandler", "java8"), (False, "Node", "main.lambdaHandler", "main.secondLambdaHandler", "nodejs14.x"), - (False, "Python", "main.first_function_handler", "main.second_function_handler", "python3.8"), + (False, "Python", "main.first_function_handler", "main.second_function_handler", "python3.9"), (False, "Ruby", "app.lambda_handler", "app.second_lambda_handler", "ruby2.5"), # container (True, "Java/gradlew", "aws.example.Hello::myHandler", "aws.example.SecondFunction::myHandler", "java8"), (True, "Node", "main.lambdaHandler", "main.secondLambdaHandler", "nodejs14.x"), - (True, "Python", "main.first_function_handler", "main.second_function_handler", "python3.8"), + (True, "Python", "main.first_function_handler", "main.second_function_handler", "python3.9"), (True, "Ruby", "app.lambda_handler", "app.second_lambda_handler", "ruby2.5"), ] ) @@ -1305,12 +1313,12 @@ class TestBuildWithCacheBuilds(CachedBuildIntegBase): ), (False, "Java/gradlew", "aws.example.Hello::myHandler", "aws.example.SecondFunction::myHandler", "java8"), (False, "Node", "main.lambdaHandler", "main.secondLambdaHandler", "nodejs14.x"), - (False, "Python", "main.first_function_handler", "main.second_function_handler", "python3.8"), + (False, "Python", "main.first_function_handler", "main.second_function_handler", "python3.9"), (False, "Ruby", "app.lambda_handler", "app.second_lambda_handler", "ruby2.5"), # container (True, "Java/gradlew", "aws.example.Hello::myHandler", "aws.example.SecondFunction::myHandler", "java8"), (True, "Node", "main.lambdaHandler", "main.secondLambdaHandler", "nodejs14.x"), - (True, "Python", 
"main.first_function_handler", "main.second_function_handler", "python3.8"), + (True, "Python", "main.first_function_handler", "main.second_function_handler", "python3.9"), (True, "Ruby", "app.lambda_handler", "app.second_lambda_handler", "ruby2.5"), ] ) @@ -1341,6 +1349,36 @@ def test_cache_build(self, use_container, code_uri, function1_handler, function2 expected_messages, command_result, self._make_parameter_override_arg(overrides) ) + @skipIf(SKIP_DOCKER_TESTS, SKIP_DOCKER_MESSAGE) + def test_cached_build_with_env_vars(self): + """ + Build 2 times to verify that second time hits the cached build + """ + overrides = { + "FunctionCodeUri": "Python", + "Function1Handler": "main.first_function_handler", + "Function2Handler": "main.second_function_handler", + "FunctionRuntime": "python3.8", + } + cmdlist = self.get_command_list( + use_container=True, parameter_overrides=overrides, cached=True, container_env_var="FOO=BAR" + ) + + LOG.info("Running Command (cache should be invalid): %s", cmdlist) + command_result = run_command(cmdlist, cwd=self.working_dir) + self.assertTrue( + "Cache is invalid, running build and copying resources to function build definition" + in command_result.stderr.decode("utf-8") + ) + + LOG.info("Re-Running Command (valid cache should exist): %s", cmdlist) + command_result_with_cache = run_command(cmdlist, cwd=self.working_dir) + + self.assertTrue( + "Valid cache found, copying previously built resources from function build definition" + in command_result_with_cache.stderr.decode("utf-8") + ) + @skipIf( ((IS_WINDOWS and RUNNING_ON_CI) and not CI_OVERRIDE), @@ -1361,12 +1399,12 @@ class TestParallelBuilds(DedupBuildIntegBase): ), (False, "Java/gradlew", "aws.example.Hello::myHandler", "aws.example.SecondFunction::myHandler", "java8"), (False, "Node", "main.lambdaHandler", "main.secondLambdaHandler", "nodejs14.x"), - (False, "Python", "main.first_function_handler", "main.second_function_handler", "python3.8"), + (False, "Python", 
"main.first_function_handler", "main.second_function_handler", "python3.9"), (False, "Ruby", "app.lambda_handler", "app.second_lambda_handler", "ruby2.5"), # container (True, "Java/gradlew", "aws.example.Hello::myHandler", "aws.example.SecondFunction::myHandler", "java8"), (True, "Node", "main.lambdaHandler", "main.secondLambdaHandler", "nodejs14.x"), - (True, "Python", "main.first_function_handler", "main.second_function_handler", "python3.8"), + (True, "Python", "main.first_function_handler", "main.second_function_handler", "python3.9"), (True, "Ruby", "app.lambda_handler", "app.second_lambda_handler", "ruby2.5"), ] ) @@ -2014,3 +2052,36 @@ def test_functions_layers_with_s3_codeuri(self): [""], command_result, # there is only one stack ) + + +@skipIf(SKIP_SAR_TESTS, "Skip SAR tests") +class TestBuildSAR(BuildIntegBase): + template = "aws-serverless-application-with-application-id-map.yaml" + + @parameterized.expand( + [ + ("use_container", "us-east-2"), + ("use_container", "eu-west-1"), + ("use_container", None), + (False, "us-east-2"), + (False, "eu-west-1"), + (False, None), + ] + ) + @pytest.mark.flaky(reruns=3) + def test_sar_application_with_location_resolved_from_map(self, use_container, region): + if use_container and SKIP_DOCKER_TESTS: + self.skipTest(SKIP_DOCKER_MESSAGE) + + cmdlist = self.get_command_list(use_container=use_container, region=region) + LOG.info("Running Command: %s", cmdlist) + LOG.info(self.working_dir) + process_execute = run_command(cmdlist, cwd=self.working_dir) + + if region == "us-east-2": # Success [the !FindInMap contains an entry for use-east-2 region only] + self.assertEqual(process_execute.process.returncode, 0) + else: + # Using other regions or the default SAM CLI region (us-east-1, in case if None region given) + # will fail the build as there is no mapping + self.assertEqual(process_execute.process.returncode, 1) + self.assertIn("Property \\'ApplicationId\\' cannot be resolved.", str(process_execute.stderr)) diff --git 
a/tests/integration/buildcmd/test_cdk_build_cmd.py b/tests/integration/buildcmd/test_cdk_build_cmd.py index 4d857b4e8e..0e41ed97d6 100644 --- a/tests/integration/buildcmd/test_cdk_build_cmd.py +++ b/tests/integration/buildcmd/test_cdk_build_cmd.py @@ -1,7 +1,9 @@ +from unittest import skipIf + from .build_integ_base import CdkBuildIntegPythonBase, CdkBuildIntegNodejsBase from distutils.dir_util import copy_tree from pathlib import Path -from tests.testing_utils import run_command +from tests.testing_utils import run_command, RUN_BY_CANARY import logging import requests import os @@ -9,6 +11,10 @@ LOG = logging.getLogger(__name__) +@skipIf( + not RUN_BY_CANARY, + "Skip build tests that are not running on canaries", +) class TestBuildWithCDKPluginNestedStacks(CdkBuildIntegPythonBase): def test_cdk_nested_build(self): project_name = "cdk-example-multiple-stacks-01" @@ -32,6 +38,10 @@ def test_cdk_nested_build(self): ) +@skipIf( + not RUN_BY_CANARY, + "Skip build tests that are not running on canaries", +) class TestBuildWithCDKPluginWithApiGateway(CdkBuildIntegNodejsBase): def test_cdk_apigateway(self): project_name = "cdk-example-rest-api-gateway" @@ -46,6 +56,10 @@ def test_cdk_apigateway(self): ) +@skipIf( + not RUN_BY_CANARY, + "Skip build tests that are not running on canaries", +) class TestBuildWithCDKPluginWithApiCorsLambda(CdkBuildIntegPythonBase): def test_cdk_api_cors_lambda(self): project_name = "api-cors-lambda" @@ -58,6 +72,10 @@ def test_cdk_api_cors_lambda(self): self.verify_invoke_built_function("ApiCorsLambdaStack/ApiCorsLambda", expected, self.working_dir) +@skipIf( + not RUN_BY_CANARY, + "Skip build tests that are not running on canaries", +) class TestBuildWithCDKLayer(CdkBuildIntegPythonBase): def test_cdk_layer(self): project_name = "cdk-example-layer" @@ -71,6 +89,10 @@ def test_cdk_layer(self): self.verify_invoke_built_function("CdkExampleLayerStack/lambda-function", expected, self.working_dir) +@skipIf( + not RUN_BY_CANARY, + "Skip build 
tests that are not running on canaries", +) class TestBuildWithCDKVariousOptions(CdkBuildIntegPythonBase): project_name = "api-cors-lambda" diff --git a/tests/integration/deploy/deploy_integ_base.py b/tests/integration/deploy/deploy_integ_base.py index fee12c9268..a0dfcdd0ab 100644 --- a/tests/integration/deploy/deploy_integ_base.py +++ b/tests/integration/deploy/deploy_integ_base.py @@ -30,6 +30,7 @@ def get_deploy_command_list( template_file=None, s3_prefix=None, capabilities=None, + capabilities_list=None, force_upload=False, notification_arns=None, fail_on_empty_changeset=None, @@ -61,6 +62,10 @@ def get_deploy_command_list( command_list = command_list + ["--image-repositories", str(image_repositories)] if capabilities: command_list = command_list + ["--capabilities", str(capabilities)] + elif capabilities_list: + command_list.append("--capabilities") + for capability in capabilities_list: + command_list.append(str(capability)) if parameter_overrides: command_list = command_list + ["--parameter-overrides", str(parameter_overrides)] if role_arn: diff --git a/tests/integration/deploy/test_deploy_command.py b/tests/integration/deploy/test_deploy_command.py index d942d09ead..6188b24062 100644 --- a/tests/integration/deploy/test_deploy_command.py +++ b/tests/integration/deploy/test_deploy_command.py @@ -1,20 +1,22 @@ import os import shutil import tempfile -import uuid import time +import uuid +from pathlib import Path from unittest import skipIf import boto3 import docker +from botocore.config import Config from parameterized import parameterized -from samcli.lib.config.samconfig import DEFAULT_CONFIG_FILE_NAME from samcli.lib.bootstrap.bootstrap import SAM_CLI_STACK_NAME +from samcli.lib.config.samconfig import DEFAULT_CONFIG_FILE_NAME from tests.integration.deploy.deploy_integ_base import DeployIntegBase from tests.integration.package.package_integ_base import PackageIntegBase from tests.testing_utils import RUNNING_ON_CI, RUNNING_TEST_FOR_MASTER_ON_CI, 
RUN_BY_CANARY
-from tests.testing_utils import CommandResult, run_command, run_command_with_input
+from tests.testing_utils import run_command, run_command_with_input

 # Deploy tests require credentials and CI/CD will only add credentials to the env if the PR is from the same repo.
 # This is to restrict package tests to run outside of CI/CD, when the branch is not master or tests are not run by Canary
@@ -30,15 +32,14 @@ class TestDeploy(PackageIntegBase, DeployIntegBase):
     def setUpClass(cls):
         cls.docker_client = docker.from_env()
         cls.local_images = [
-            ("alpine", "latest"),
-            # below 3 images are for test_deploy_nested_stacks()
-            ("python", "3.9-slim"),
-            ("python", "3.8-slim"),
-            ("python", "3.7-slim"),
+            ("public.ecr.aws/sam/emulation-python3.8", "latest"),
         ]
         # setup some images locally by pulling them.
         for repo, tag in cls.local_images:
             cls.docker_client.api.pull(repository=repo, tag=tag)
+            cls.docker_client.api.tag(f"{repo}:{tag}", "emulation-python3.8", tag="latest")
+            cls.docker_client.api.tag(f"{repo}:{tag}", "emulation-python3.8-2", tag="latest")
+
         # setup signing profile arn & name
         cls.signing_profile_name = os.environ.get("AWS_SIGNING_PROFILE_NAME")
         cls.signing_profile_version_arn = os.environ.get("AWS_SIGNING_PROFILE_VERSION_ARN")
@@ -48,16 +49,21 @@ def setUpClass(cls):
     def setUp(self):
         self.cf_client = boto3.client("cloudformation")
         self.sns_arn = os.environ.get("AWS_SNS")
-        self.stack_names = []
+        self.stacks = []
         time.sleep(CFN_SLEEP)
         super().setUp()

     def tearDown(self):
         shutil.rmtree(os.path.join(os.getcwd(), ".aws-sam", "build"), ignore_errors=True)
-        for stack_name in self.stack_names:
+        for stack in self.stacks:
             # because of the termination protection, do not delete aws-sam-cli-managed-default stack
+            stack_name = stack["name"]
             if stack_name != SAM_CLI_STACK_NAME:
-                self.cf_client.delete_stack(StackName=stack_name)
+                region = stack.get("region")
+                cf_client = (
+                    self.cf_client if not region else boto3.client("cloudformation", config=Config(region_name=region))
+                )
+                cf_client.delete_stack(StackName=stack_name)
         super().tearDown()

     @parameterized.expand(["aws-serverless-function.yaml"])
@@ -73,7 +79,7 @@ def test_package_and_deploy_no_s3_bucket_all_args(self, template_file):
         self.assertEqual(package_process.process.returncode, 0)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Deploy and only show changeset.
         deploy_command_list_no_execute = self.get_deploy_command_list(
@@ -114,7 +120,7 @@ def test_no_package_and_deploy_with_s3_bucket_all_args(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(
@@ -140,7 +146,7 @@ def test_no_package_and_deploy_with_s3_bucket_all_args_image_repository(self, te
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(
@@ -167,7 +173,7 @@ def test_no_package_and_deploy_with_s3_bucket_all_args_image_repositories(self,
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(
@@ -194,7 +200,7 @@ def test_no_package_and_deploy_with_s3_bucket_and_no_confirm_changeset(self, tem
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = "a" + str(uuid.uuid4()).replace("-", "")[:10]
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(
@@ -225,7 +231,7 @@ def test_deploy_no_redeploy_on_same_built_artifacts(self, template_file):
         run_command(build_command_list)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})
         # Should result in a zero exit code.
         deploy_command_list = self.get_deploy_command_list(
             stack_name=stack_name,
@@ -259,7 +265,7 @@ def test_no_package_and_deploy_with_s3_bucket_all_args_confirm_changeset(self, t
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(
@@ -383,7 +389,7 @@ def test_deploy_with_s3_bucket_switch_region(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(
@@ -440,7 +446,7 @@ def test_deploy_twice_with_no_fail_on_empty_changeset(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         kwargs = {
             "template_file": template_path,
@@ -479,7 +485,7 @@ def test_deploy_twice_with_fail_on_empty_changeset(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         kwargs = {
@@ -515,7 +521,7 @@ def test_deploy_twice_with_fail_on_empty_changeset(self, template_file):
     def test_deploy_inline_no_package(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)
         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         deploy_command_list = self.get_deploy_command_list(
             template_file=template_path, stack_name=stack_name, capabilities="CAPABILITY_IAM"
@@ -523,12 +529,26 @@ def test_deploy_inline_no_package(self, template_file):
         deploy_process_execute = run_command(deploy_command_list)
         self.assertEqual(deploy_process_execute.process.returncode, 0)

+    @parameterized.expand([("aws-serverless-inline.yaml", "samconfig-read-boolean-tomlkit.toml")])
+    def test_deploy_with_toml_config(self, template_file, config_file):
+        template_path = self.test_data_path.joinpath(template_file)
+        config_path = self.test_data_path.joinpath(config_file)
+
+        stack_name = self._method_to_stack_name(self.id())
+        self.stacks.append({"name": stack_name})
+
+        deploy_command_list = self.get_deploy_command_list(
+            template_file=template_path, stack_name=stack_name, config_file=config_path, capabilities="CAPABILITY_IAM"
+        )
+        deploy_process_execute = run_command(deploy_command_list)
+        self.assertEqual(deploy_process_execute.process.returncode, 0)
+
     @parameterized.expand(["aws-serverless-function.yaml"])
     def test_deploy_guided_zip(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(template_file=template_path, guided=True)
@@ -539,7 +559,7 @@ def test_deploy_guided_zip(self, template_file):

         # Deploy should succeed with a managed stack
         self.assertEqual(deploy_process_execute.process.returncode, 0)
-        self.stack_names.append(SAM_CLI_STACK_NAME)
+        self.stacks.append({"name": SAM_CLI_STACK_NAME})
         # Remove samconfig.toml
         os.remove(self.test_data_path.joinpath(DEFAULT_CONFIG_FILE_NAME))

@@ -548,7 +568,7 @@ def test_deploy_guided_image(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(template_file=template_path, guided=True)
@@ -559,7 +579,7 @@ def test_deploy_guided_image(self, template_file):

         # Deploy should succeed with a managed stack
         self.assertEqual(deploy_process_execute.process.returncode, 0)
-        self.stack_names.append(SAM_CLI_STACK_NAME)
+        self.stacks.append({"name": SAM_CLI_STACK_NAME})
         # Remove samconfig.toml
         os.remove(self.test_data_path.joinpath(DEFAULT_CONFIG_FILE_NAME))

@@ -568,7 +588,7 @@ def test_deploy_guided_set_parameter(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(template_file=template_path, guided=True)
@@ -579,7 +599,7 @@ def test_deploy_guided_set_parameter(self, template_file):

         # Deploy should succeed with a managed stack
         self.assertEqual(deploy_process_execute.process.returncode, 0)
-        self.stack_names.append(SAM_CLI_STACK_NAME)
+        self.stacks.append({"name": SAM_CLI_STACK_NAME})
         # Remove samconfig.toml
         os.remove(self.test_data_path.joinpath(DEFAULT_CONFIG_FILE_NAME))

@@ -588,7 +608,7 @@ def test_deploy_guided_set_capabilities(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(template_file=template_path, guided=True)
@@ -599,7 +619,7 @@ def test_deploy_guided_set_capabilities(self, template_file):
         )
         # Deploy should succeed with a managed stack
         self.assertEqual(deploy_process_execute.process.returncode, 0)
-        self.stack_names.append(SAM_CLI_STACK_NAME)
+        self.stacks.append({"name": SAM_CLI_STACK_NAME})
         # Remove samconfig.toml
         os.remove(self.test_data_path.joinpath(DEFAULT_CONFIG_FILE_NAME))

@@ -608,7 +628,7 @@ def test_deploy_guided_capabilities_default(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(template_file=template_path, guided=True)
@@ -619,7 +639,7 @@ def test_deploy_guided_capabilities_default(self, template_file):
         )
         # Deploy should succeed with a managed stack
         self.assertEqual(deploy_process_execute.process.returncode, 0)
-        self.stack_names.append(SAM_CLI_STACK_NAME)
+        self.stacks.append({"name": SAM_CLI_STACK_NAME})
         # Remove samconfig.toml
         os.remove(self.test_data_path.joinpath(DEFAULT_CONFIG_FILE_NAME))

@@ -628,7 +648,7 @@ def test_deploy_guided_set_confirm_changeset(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(template_file=template_path, guided=True)
@@ -639,7 +659,7 @@ def test_deploy_guided_set_confirm_changeset(self, template_file):

         # Deploy should succeed with a managed stack
         self.assertEqual(deploy_process_execute.process.returncode, 0)
-        self.stack_names.append(SAM_CLI_STACK_NAME)
+        self.stacks.append({"name": SAM_CLI_STACK_NAME})
         # Remove samconfig.toml
         os.remove(self.test_data_path.joinpath(DEFAULT_CONFIG_FILE_NAME))

@@ -648,7 +668,7 @@ def test_deploy_with_no_s3_bucket_set_resolve_s3(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         deploy_command_list = self.get_deploy_command_list(
             template_file=template_path,
@@ -676,6 +696,44 @@ def test_deploy_with_invalid_config(self, template_file, config_file):
         self.assertEqual(deploy_process_execute.process.returncode, 1)
         self.assertIn("Error reading configuration: Unexpected character", str(deploy_process_execute.stderr))

+    @parameterized.expand([("aws-serverless-function.yaml", "samconfig-tags-list.toml")])
+    def test_deploy_with_valid_config_tags_list(self, template_file, config_file):
+        stack_name = self._method_to_stack_name(self.id())
+        self.stacks.append({"name": stack_name})
+        template_path = self.test_data_path.joinpath(template_file)
+        config_path = self.test_data_path.joinpath(config_file)
+
+        deploy_command_list = self.get_deploy_command_list(
+            template_file=template_path,
+            stack_name=stack_name,
+            config_file=config_path,
+            s3_prefix="integ_deploy",
+            s3_bucket=self.s3_bucket.name,
+            capabilities="CAPABILITY_IAM",
+        )
+
+        deploy_process_execute = run_command(deploy_command_list)
+        self.assertEqual(deploy_process_execute.process.returncode, 0)
+
+    @parameterized.expand([("aws-serverless-function.yaml", "samconfig-tags-string.toml")])
+    def test_deploy_with_valid_config_tags_string(self, template_file, config_file):
+        stack_name = self._method_to_stack_name(self.id())
+        self.stacks.append({"name": stack_name})
+        template_path = self.test_data_path.joinpath(template_file)
+        config_path = self.test_data_path.joinpath(config_file)
+
+        deploy_command_list = self.get_deploy_command_list(
+            template_file=template_path,
+            stack_name=stack_name,
+            config_file=config_path,
+            s3_prefix="integ_deploy",
+            s3_bucket=self.s3_bucket.name,
+            capabilities="CAPABILITY_IAM",
+        )
+
+        deploy_process_execute = run_command(deploy_command_list)
+        self.assertEqual(deploy_process_execute.process.returncode, 0)
+
     @parameterized.expand([(True, True, True), (False, True, False), (False, False, True), (True, False, True)])
     def test_deploy_with_code_signing_params(self, should_sign, should_enforce, will_succeed):
         """
@@ -695,7 +753,7 @@ def test_deploy_with_code_signing_params(self, should_sign, should_enforce, will
                 "AWS_SIGNING_PROFILE_NAME and AWS_SIGNING_PROFILE_VERSION_ARN environment variables"
             )

-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         signing_profiles_param = None
         if should_sign:
@@ -728,6 +786,56 @@ def test_deploy_with_code_signing_params(self, should_sign, should_enforce, will
         else:
            self.assertEqual(deploy_process_execute.process.returncode, 1)

+    @parameterized.expand(
+        [
+            ("aws-serverless-application-with-application-id-map.yaml", None, False),
+            ("aws-serverless-application-with-application-id-map.yaml", "us-east-2", True),
+        ]
+    )
+    def test_deploy_sar_with_location_from_map(self, template_file, region, will_succeed):
+        template_path = Path(__file__).resolve().parents[1].joinpath("testdata", "buildcmd", template_file)
+        stack_name = self._method_to_stack_name(self.id())
+        self.stacks.append({"name": stack_name, "region": region})
+
+        # The default region (us-east-1) has no entry in the map
+        deploy_command_list = self.get_deploy_command_list(
+            template_file=template_path,
+            stack_name=stack_name,
+            capabilities_list=["CAPABILITY_IAM", "CAPABILITY_AUTO_EXPAND"],
+            region=region,  # the !FindInMap has an entry for the us-east-2 region only
+        )
+        deploy_process_execute = run_command(deploy_command_list)
+
+        if will_succeed:
+            self.assertEqual(deploy_process_execute.process.returncode, 0)
+        else:
+            self.assertEqual(deploy_process_execute.process.returncode, 1)
+            self.assertIn("Property \\'ApplicationId\\' cannot be resolved.", str(deploy_process_execute.stderr))
+
+    @parameterized.expand(
+        [
+            ("aws-serverless-application-with-application-id-map.yaml", None, False),
+            ("aws-serverless-application-with-application-id-map.yaml", "us-east-2", True),
+        ]
+    )
+    def test_deploy_guided_sar_with_location_from_map(self, template_file, region, will_succeed):
+        template_path = Path(__file__).resolve().parents[1].joinpath("testdata", "buildcmd", template_file)
+        stack_name = self._method_to_stack_name(self.id())
+        self.stacks.append({"name": stack_name, "region": region})
+
+        # Package and Deploy in one go without confirming change set.
+        deploy_command_list = self.get_deploy_command_list(template_file=template_path, guided=True)
+
+        deploy_process_execute = run_command_with_input(
+            deploy_command_list, f"{stack_name}\n{region}\n\nN\nCAPABILITY_IAM CAPABILITY_AUTO_EXPAND\nN\n".encode()
+        )
+
+        if will_succeed:
+            self.assertEqual(deploy_process_execute.process.returncode, 0)
+        else:
+            self.assertEqual(deploy_process_execute.process.returncode, 1)
+            self.assertIn("Property \\'ApplicationId\\' cannot be resolved.", str(deploy_process_execute.stderr))
+
     @parameterized.expand(
         [os.path.join("deep-nested", "template.yaml"), os.path.join("deep-nested-image", "template.yaml")]
     )
@@ -735,7 +843,7 @@ def test_deploy_nested_stacks(self, template_file):
         template_path = self.test_data_path.joinpath(template_file)

         stack_name = self._method_to_stack_name(self.id())
-        self.stack_names.append(stack_name)
+        self.stacks.append({"name": stack_name})

         # Package and Deploy in one go without confirming change set.
         deploy_command_list = self.get_deploy_command_list(
diff --git a/tests/integration/init/schemas/test_init_with_schemas_command.py b/tests/integration/init/schemas/test_init_with_schemas_command.py
index 2efa40635f..501d111f78 100644
--- a/tests/integration/init/schemas/test_init_with_schemas_command.py
+++ b/tests/integration/init/schemas/test_init_with_schemas_command.py
@@ -21,7 +21,7 @@ def test_init_interactive_with_event_bridge_app_aws_registry(self):
         # 1: SAM type project
         # 1: AWS Quick Start Templates
         # 1: Zip Packagetype
-        # 14: Java runtime
+        # 15: Java runtime
         # 1: dependency manager maven
         # eb-app-maven: response to name
         # 3: select event-bridge app from scratch
@@ -33,7 +33,7 @@ def test_init_interactive_with_event_bridge_app_aws_registry(self):
 1
 1
 1
-14
+15
 1
 eb-app-maven
 3
@@ -59,7 +59,7 @@ def test_init_interactive_with_event_bridge_app_partner_registry(self):
         # 1: SAM type project
         # 1: AWS Quick Start Templates
         # 1: Zip Packagetype
-        # 14: Java runtime
+        # 15: Java runtime
         # 1: dependency manager maven
         # eb-app-maven: response to name
         # 3: select event-bridge app from scratch
@@ -71,7 +71,7 @@ def test_init_interactive_with_event_bridge_app_partner_registry(self):
 1
 1
 1
-14
+15
 1
 eb-app-maven
 3
@@ -108,7 +108,7 @@ def test_init_interactive_with_event_bridge_app_pagination(self):
         # 1: SAM type project
         # 1: AWS Quick Start Templates
         # 1: Zip Packagetype
-        # 14: Java Runtime
+        # 15: Java Runtime
         # 1: dependency manager maven
         # eb-app-maven: response to name
         # 3: select event-bridge app from scratch
@@ -150,7 +150,7 @@ def test_init_interactive_with_event_bridge_app_customer_registry(self):
         # 1: SAM type project
         # 1: AWS Quick Start Templates
         # 1: Zip Packagetype
-        # 14: Java Runtime
+        # 15: Java Runtime
         # 1: dependency manager maven
         # eb-app-maven: response to name
         # 3: select event-bridge app from scratch
@@ -162,7 +162,7 @@ def test_init_interactive_with_event_bridge_app_customer_registry(self):
 1
 1
 1
-14
+15
 1
 eb-app-maven
 3
@@ -199,7 +199,7 @@ def test_init_interactive_with_event_bridge_app_aws_schemas_python(self):
         # 1: SAM type project
         # 1: AWS Quick Start Templates
         # 1: Zip Packagetype
-        # 9: Python 3.7
+        # 10: Python 3.7
         # eb-app-python37: response to name
         # 3: select event-bridge app from scratch
         # Y: Use default profile
@@ -210,7 +210,7 @@ def test_init_interactive_with_event_bridge_app_aws_schemas_python(self):
 1
 1
 1
-9
+10
 eb-app-python37
 3
 Y
@@ -233,7 +233,7 @@ def test_init_interactive_with_event_bridge_app_non_default_profile_selection(se
         # 1: SAM type project
         # 1: AWS Quick Start Templates
         # 1: Zip Packagetype
-        # 9: Python 3.7
+        # 10: Python 3.7
         # eb-app-python37: response to name
         # 3: select event-bridge app from scratch
         # N: Use default profile
@@ -246,7 +246,7 @@ def test_init_interactive_with_event_bridge_app_non_default_profile_selection(se
 1
 1
 1
-9
+10
 eb-app-python37
 3
 N
@@ -273,7 +273,7 @@ def test_init_interactive_with_event_bridge_app_non_supported_schemas_region(sel
         # 1: SAM type project
         # 1: AWS Quick Start Templates
         # 1: Zip Pacakgetype
-        # 9: Python 3.7
+        # 10: Python 3.7
         # eb-app-python37: response to name
         # 3: select event-bridge app from scratch
         # Y: Use default profile
@@ -284,7 +284,7 @@ def test_init_interactive_with_event_bridge_app_non_supported_schemas_region(sel
 1
 1
 1
-9
+10
 eb-app-python37
 3
 Y
diff --git a/tests/integration/init/test_init_command.py b/tests/integration/init/test_init_command.py
index e7ce6ab27f..a5405f4b9e 100644
--- a/tests/integration/init/test_init_command.py
+++ b/tests/integration/init/test_init_command.py
@@ -1,3 +1,6 @@
+"""
+Integration tests for init command
+"""
 from unittest import TestCase

 from parameterized import parameterized
diff --git a/tests/integration/local/invoke/test_integration_cli_images.py b/tests/integration/local/invoke/test_integration_cli_images.py
index 917694c69d..ba22d59604 100644
--- a/tests/integration/local/invoke/test_integration_cli_images.py
+++ b/tests/integration/local/invoke/test_integration_cli_images.py
@@ -380,3 +380,22 @@ def test_sam_template_file_env_var_set(self):

         process_stdout = stdout.strip()
         self.assertEqual(process_stdout.decode("utf-8"), '"Hello world"')
+
+    def test_invoke_with_error_during_image_build(self):
+        command_list = self.get_command_list(
+            "ImageDoesntExistFunction", template_path=self.template_path, event_path=self.event_path
+        )
+
+        process = Popen(command_list, stderr=PIPE)
+        try:
+            _, stderr = process.communicate(timeout=TIMEOUT)
+        except TimeoutExpired:
+            process.kill()
+            raise
+
+        process_stderr = stderr.strip()
+        self.assertRegex(
+            process_stderr.decode("utf-8"),
+            "Error: Error building docker image: pull access denied for non-existing-image",
+        )
+        self.assertEqual(process.returncode, 1)
diff --git a/tests/integration/local/invoke/test_integrations_cli.py b/tests/integration/local/invoke/test_integrations_cli.py
index a9bbf4d824..a3d6206c40 100644
--- a/tests/integration/local/invoke/test_integrations_cli.py
+++ b/tests/integration/local/invoke/test_integrations_cli.py
@@ -13,7 +13,7 @@
 from tests.integration.local.invoke.layer_utils import LayerUtils
 from .invoke_integ_base import InvokeIntegBase
-from tests.testing_utils import IS_WINDOWS, RUNNING_ON_CI, RUNNING_TEST_FOR_MASTER_ON_CI, RUN_BY_CANARY
+from tests.testing_utils import IS_WINDOWS, RUNNING_ON_CI, RUNNING_TEST_FOR_MASTER_ON_CI, RUN_BY_CANARY, run_command

 # Layers tests require credentials and Appveyor will only add credentials to the env if the PR is from the same repo.
 # This is to restrict layers tests to run outside of Appveyor, when the branch is not master and tests are not run by Canary.
@@ -884,6 +884,24 @@ def test_caching_two_layers_with_layer_cache_env_set(self):
         self.assertEqual(2, len(os.listdir(str(self.layer_cache))))


+@skipIf(SKIP_LAYERS_TESTS, "Skip layers tests in Appveyor only")
+class TestLocalZipLayerVersion(InvokeIntegBase):
+    template = Path("layers", "local-zip-layer-template.yml")
+
+    def test_local_zip_layers(
+        self,
+    ):
+        command_list = self.get_command_list(
+            "OneLayerVersionServerlessFunction",
+            template_path=self.template_path,
+            no_event=True,
+        )
+
+        execute = run_command(command_list)
+        self.assertEqual(0, execute.process.returncode)
+        self.assertEqual('"Layer1"', execute.stdout.decode())
+
+
 @skipIf(SKIP_LAYERS_TESTS, "Skip layers tests in Appveyor only")
 class TestLayerVersionThatDoNotCreateCache(InvokeIntegBase):
     template = Path("layers", "layer-template.yml")
diff --git a/tests/integration/local/start_api/test_start_api.py b/tests/integration/local/start_api/test_start_api.py
index 8aa146b254..0ddb8d5a31 100644
--- a/tests/integration/local/start_api/test_start_api.py
+++ b/tests/integration/local/start_api/test_start_api.py
@@ -1,3 +1,4 @@
+import base64
 import uuid
 import random

@@ -382,14 +383,14 @@ def test_valid_v2_lambda_integer_response(self):

     @pytest.mark.flaky(reruns=3)
     @pytest.mark.timeout(timeout=600, method="thread")
-    def test_invalid_v2_lambda_response(self):
+    def test_v2_lambda_response_skip_unexpected_fields(self):
         """
         Patch Request to a path that was defined as ANY in SAM through AWS::Serverless::Function Events
         """
         response = requests.get(self.url + "/invalidv2response", timeout=300)

-        self.assertEqual(response.status_code, 502)
-        self.assertEqual(response.json(), {"message": "Internal server error"})
+        self.assertEqual(response.status_code, 200)
+        self.assertEqual(response.json(), {"hello": "world"})

     @pytest.mark.flaky(reruns=3)
     @pytest.mark.timeout(timeout=600, method="thread")
@@ -538,6 +539,48 @@ def test_binary_response(self):
         self.assertEqual(response.headers.get("Content-Type"), "image/gif")
         self.assertEqual(response.content, expected)

+    @pytest.mark.flaky(reruns=3)
+    @pytest.mark.timeout(timeout=600, method="thread")
+    def test_non_decoded_binary_response(self):
+        """
+        Binary data is returned correctly
+        """
+        expected = base64.b64encode(self.get_binary_data(self.binary_data_file))
+
+        response = requests.get(self.url + "/nondecodedbase64response", timeout=300)
+
+        self.assertEqual(response.status_code, 200)
+        self.assertEqual(response.headers.get("Content-Type"), "image/gif")
+        self.assertEqual(response.content, expected)
+
+    @pytest.mark.flaky(reruns=3)
+    @pytest.mark.timeout(timeout=600, method="thread")
+    def test_decoded_binary_response_base64encoded_field(self):
+        """
+        Binary data is returned correctly
+        """
+        expected = self.get_binary_data(self.binary_data_file)
+
+        response = requests.get(self.url + "/decodedbase64responsebas64encoded", timeout=300)
+
+        self.assertEqual(response.status_code, 200)
+        self.assertEqual(response.headers.get("Content-Type"), "image/gif")
+        self.assertEqual(response.content, expected)
+
+    @pytest.mark.flaky(reruns=3)
+    @pytest.mark.timeout(timeout=600, method="thread")
+    def test_decoded_binary_response_base64encoded_field_is_priority(self):
+        """
+        Binary data is returned correctly
+        """
+        expected = base64.b64encode(self.get_binary_data(self.binary_data_file))
+
+        response = requests.get(self.url + "/decodedbase64responsebas64encodedpriority", timeout=300)
+
+        self.assertEqual(response.status_code, 200)
+        self.assertEqual(response.headers.get("Content-Type"), "image/gif")
+        self.assertEqual(response.content, expected)
+

 class TestStartApiWithSwaggerHttpApis(StartApiIntegBaseClass):
     template_path = "/testdata/start_api/swagger-template-http-api.yaml"
@@ -1709,7 +1752,7 @@ class TestWarmContainersInitialization(TestWarmContainersBaseClass):
     mode_env_variable = str(uuid.uuid4())
     parameter_overrides = {"ModeEnvVariable": mode_env_variable}

-    @pytest.mark.flaky(reruns=5)
+    @pytest.mark.flaky(reruns=3)
     @pytest.mark.timeout(timeout=600, method="thread")
     def test_all_containers_are_initialized_before_any_invoke(self):
         initiated_containers = self.count_running_containers()
@@ -1723,7 +1766,7 @@ class TestWarmContainersMultipleInvoke(TestWarmContainersBaseClass):
     mode_env_variable = str(uuid.uuid4())
     parameter_overrides = {"ModeEnvVariable": mode_env_variable}

-    @pytest.mark.flaky(reruns=5)
+    @pytest.mark.flaky(reruns=3)
     @pytest.mark.timeout(timeout=600, method="thread")
     def test_no_new_created_containers_after_lambda_function_invoke(self):
         initiated_containers_before_invoking_any_function = self.count_running_containers()
diff --git a/tests/integration/local/start_lambda/test_start_lambda.py b/tests/integration/local/start_lambda/test_start_lambda.py
index a35eceb1a1..ca4a00b5cf 100644
--- a/tests/integration/local/start_lambda/test_start_lambda.py
+++ b/tests/integration/local/start_lambda/test_start_lambda.py
@@ -271,7 +271,7 @@ class TestWarmContainersInitialization(TestWarmContainersBaseClass):
     mode_env_variable = str(uuid.uuid4())
     parameter_overrides = {"ModeEnvVariable": mode_env_variable}

-    @pytest.mark.flaky(reruns=5)
+    @pytest.mark.flaky(reruns=3)
     @pytest.mark.timeout(timeout=600, method="thread")
     def test_all_containers_are_initialized_before_any_invoke(self):
         initiated_containers = self.count_running_containers()
@@ -285,7 +285,7 @@ class TestWarmContainersMultipleInvoke(TestWarmContainersBaseClass):
     mode_env_variable = str(uuid.uuid4())
     parameter_overrides = {"ModeEnvVariable": mode_env_variable}

-    @pytest.mark.flaky(reruns=5)
+    @pytest.mark.flaky(reruns=3)
     @pytest.mark.timeout(timeout=600, method="thread")
     def test_no_new_created_containers_after_lambda_function_invoke(self):
diff --git a/tests/integration/package/test_package_command_image.py b/tests/integration/package/test_package_command_image.py
index 8534c68e07..cedab01bd0 100644
--- a/tests/integration/package/test_package_command_image.py
+++ b/tests/integration/package/test_package_command_image.py
@@ -27,15 +27,14 @@ class TestPackageImage(PackageIntegBase):
     def setUpClass(cls):
         cls.docker_client = docker.from_env()
         cls.local_images = [
-            ("alpine", "latest"),
-            # below 3 images are for test_package_with_deep_nested_template_image()
-            ("python", "3.9-slim"),
-            ("python", "3.8-slim"),
-            ("python", "3.7-slim"),
+            ("public.ecr.aws/sam/emulation-python3.8", "latest"),
         ]
         # setup some images locally by pulling them.
         for repo, tag in cls.local_images:
             cls.docker_client.api.pull(repository=repo, tag=tag)
+            cls.docker_client.api.tag(f"{repo}:{tag}", "emulation-python3.8", tag="latest")
+            cls.docker_client.api.tag(f"{repo}:{tag}", "emulation-python3.8-2", tag="latest")
+
         super(TestPackageImage, cls).setUpClass()

     def setUp(self):
@@ -204,16 +203,10 @@ def test_package_with_deep_nested_template_image(self):
             raise
         process_stderr = stderr.strip().decode("utf-8")

-        # there are in total 3 function images and 2 child template file to upload
-        # verify both child templates are uploaded
-        uploads = re.findall(r"\.template", process_stderr)
-        self.assertEqual(len(uploads), 2)
-
         # verify all function images are pushed
         images = [
-            ("python", "3.9-slim"),
-            ("python", "3.8-slim"),
-            ("python", "3.7-slim"),
+            ("emulation-python3.8", "latest"),
+            ("emulation-python3.8-2", "latest"),
         ]
         for image, tag in images:
             # check string like this:
diff --git a/tests/integration/pipeline/__init__.py b/tests/integration/pipeline/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/tests/integration/pipeline/base.py b/tests/integration/pipeline/base.py
new file mode 100644
index 0000000000..e776613f33
--- /dev/null
+++ b/tests/integration/pipeline/base.py
@@ -0,0 +1,157 @@
+import os
+import shutil
+import logging
+import uuid
+from pathlib import Path
+from typing import List, Optional, Set, Tuple, Any
+from unittest import TestCase
+from unittest.mock import Mock
+
+import boto3
+import botocore.exceptions
+from botocore.exceptions import ClientError
+
+from samcli.lib.pipeline.bootstrap.stage import Stage
+
+
+class PipelineBase(TestCase):
+    def base_command(self):
+        command = "sam"
+        if os.getenv("SAM_CLI_DEV"):
+            command = "samdev"
+
+        return command
+
+
+class InitIntegBase(PipelineBase):
+    generated_files: List[Path] = []
+
+    @classmethod
+    def setUpClass(cls) -> None:
+        # we need to compare the whole generated template, which is
+        # larger than normal diff size limit
+        cls.maxDiff = None
+
+    def setUp(self) -> None:
+        super().setUp()
+        self.generated_files = []
+
+    def tearDown(self) -> None:
+        for generated_file in self.generated_files:
+            if generated_file.is_dir():
+                shutil.rmtree(generated_file, ignore_errors=True)
+            elif generated_file.exists():
+                generated_file.unlink()
+        super().tearDown()
+
+    def get_init_command_list(self, with_bootstrap=False):
+        command_list = [self.base_command(), "pipeline", "init"]
+        if with_bootstrap:
+            command_list.append("--bootstrap")
+        return command_list
+
+
+class BootstrapIntegBase(PipelineBase):
+    region = "us-east-1"
+    stack_names: List[str]
+    cf_client: Any
+    randomized_stage_suffix: str
+
+    @classmethod
+    def setUpClass(cls):
+        cls.cf_client = boto3.client("cloudformation", region_name=cls.region)
+        cls.randomized_stage_suffix = uuid.uuid4().hex[-6:]
+
+    def setUp(self):
+        self.stack_names = []
+        super().setUp()
+        shutil.rmtree(os.path.join(os.getcwd(), ".aws-sam", "pipeline"), ignore_errors=True)
+
+    def tearDown(self):
+        for stack_name in self.stack_names:
+            self._cleanup_s3_buckets(stack_name)
+            self.cf_client.delete_stack(StackName=stack_name)
+        shutil.rmtree(os.path.join(os.getcwd(), ".aws-sam", "pipeline"), ignore_errors=True)
+        super().tearDown()
+
+    def _cleanup_s3_buckets(self, stack_name):
+        try:
+            stack_resources = self.cf_client.describe_stack_resources(StackName=stack_name)
+            buckets = [
+                resource
+                for resource in stack_resources["StackResources"]
+                if resource["ResourceType"] == "AWS::S3::Bucket"
+            ]
+            session = boto3.session.Session()
+            s3_client = session.resource("s3")
+            for bucket in buckets:
+                bucket = s3_client.Bucket(bucket.get("PhysicalResourceId"))
+                bucket.object_versions.delete()
+                bucket.delete()
+        except botocore.exceptions.ClientError:
+            """No need to fail in cleanup"""
+
+    def get_bootstrap_command_list(
+        self,
+        no_interactive: bool = False,
+        stage_name: Optional[str] = None,
+        profile_name: Optional[str] = None,
+        region: Optional[str] = None,
+        pipeline_user: Optional[str] = None,
+        pipeline_execution_role: Optional[str] = None,
+        cloudformation_execution_role: Optional[str] = None,
+        bucket: Optional[str] = None,
+        create_image_repository: bool = False,
+        image_repository: Optional[str] = None,
+        no_confirm_changeset: bool = False,
+    ):
+        command_list = [self.base_command(), "pipeline", "bootstrap"]
+
+        if no_interactive:
+            command_list += ["--no-interactive"]
+        if stage_name:
+            command_list += ["--stage", stage_name]
+        if profile_name:
+            command_list += ["--profile", profile_name]
+        if region:
+            command_list += ["--region", region]
+        if pipeline_user:
+            command_list += ["--pipeline-user", pipeline_user]
+        if pipeline_execution_role:
+            command_list += ["--pipeline-execution-role", pipeline_execution_role]
+        if cloudformation_execution_role:
+            command_list += ["--cloudformation-execution-role", cloudformation_execution_role]
+        if bucket:
+            command_list += ["--bucket", bucket]
+        if create_image_repository:
+            command_list += ["--create-image-repository"]
+        if image_repository:
+            command_list += ["--image-repository", image_repository]
+        if no_confirm_changeset:
+            command_list += ["--no-confirm-changeset"]
+
+        return command_list
+
+    def _extract_created_resource_logical_ids(self, stack_name: str) -> List[str]:
+        response = self.cf_client.describe_stack_resources(StackName=stack_name)
+        return [resource["LogicalResourceId"] for resource in response["StackResources"]]
+
+    def _stack_exists(self, stack_name) -> bool:
+        try:
+            self.cf_client.describe_stacks(StackName=stack_name)
+            return True
+        except ClientError as ex:
+            if "does not exist" in ex.response.get("Error", {}).get("Message", ""):
+                return False
+            raise ex
+
+    def _get_stage_and_stack_name(self, suffix: str = "") -> Tuple[str, str]:
+        # Method expects method name which can be a full path. Eg: test.integration.test_bootstrap_command.method_name
+        method_name = self.id().split(".")[-1]
+        stage_name = method_name.replace("_", "-") + suffix + "-" + self.randomized_stage_suffix
+
+        mock_env = Mock()
+        mock_env.name = stage_name
+        stack_name = Stage._get_stack_name(mock_env)
+
+        return stage_name, stack_name
diff --git a/tests/integration/pipeline/test_bootstrap_command.py b/tests/integration/pipeline/test_bootstrap_command.py
new file mode 100644
index 0000000000..0cf7741c5c
--- /dev/null
+++ b/tests/integration/pipeline/test_bootstrap_command.py
@@ -0,0 +1,380 @@
+from unittest import skipIf
+
+from parameterized import parameterized
+
+from samcli.commands.pipeline.bootstrap.cli import PIPELINE_CONFIG_FILENAME, PIPELINE_CONFIG_DIR
+from samcli.lib.config.samconfig import SamConfig
+from tests.integration.pipeline.base import BootstrapIntegBase
+from tests.testing_utils import (
+    run_command_with_input,
+    RUNNING_ON_CI,
+    RUNNING_TEST_FOR_MASTER_ON_CI,
+    RUN_BY_CANARY,
+    run_command,
+    run_command_with_inputs,
+)
+import boto3
+from botocore.exceptions import ClientError
+
+# bootstrap tests require credentials and CI/CD will only add credentials to the env if the PR is from the same repo.
+# This is to restrict tests to run outside of CI/CD, when the branch is not master or tests are not run by Canary
+SKIP_BOOTSTRAP_TESTS = RUNNING_ON_CI and RUNNING_TEST_FOR_MASTER_ON_CI and not RUN_BY_CANARY
+
+# In order to run bootstrap integration test locally make sure your test account is configured as `default` account.
+CREDENTIAL_PROFILE = "2" if not RUN_BY_CANARY else "1"
+
+CFN_OUTPUT_TO_CONFIG_KEY = {
+    "ArtifactsBucket": "artifacts_bucket",
+    "CloudFormationExecutionRole": "cloudformation_execution_role",
+    "PipelineExecutionRole": "pipeline_execution_role",
+    "PipelineUser": "pipeline_user",
+}
+
+
+@skipIf(SKIP_BOOTSTRAP_TESTS, "Skip bootstrap tests in CI/CD only")
+class TestBootstrap(BootstrapIntegBase):
+    @parameterized.expand([("create_image_repository",), (False,)])
+    def test_interactive_with_no_resources_provided(self, create_image_repository):
+        stage_name, stack_name = self._get_stage_and_stack_name()
+        self.stack_names = [stack_name]
+
+        bootstrap_command_list = self.get_bootstrap_command_list()
+
+        inputs = [
+            stage_name,
+            CREDENTIAL_PROFILE,
+            self.region,  # region
+            "",  # pipeline user
+            "",  # Pipeline execution role
+            "",  # CloudFormation execution role
+            "",  # Artifacts bucket
+            "y" if create_image_repository else "N",  # Should we create ECR repo
+        ]
+
+        if create_image_repository:
+            inputs.append("")  # Create image repository
+
+        inputs.append("")  # Confirm summary
+        inputs.append("y")  # Create resources
+
+        bootstrap_process_execute = run_command_with_inputs(bootstrap_command_list, inputs)
+
+        self.assertEqual(bootstrap_process_execute.process.returncode, 0)
+        stdout = bootstrap_process_execute.stdout.decode()
+        # make sure pipeline user's credential is printed
+        self.assertIn("ACCESS_KEY_ID", stdout)
+        self.assertIn("SECRET_ACCESS_KEY", stdout)
+
+        common_resources = {
+            "PipelineUser",
+            "PipelineUserAccessKey",
+            "PipelineUserSecretKey",
+            "CloudFormationExecutionRole",
+            "PipelineExecutionRole",
+            "ArtifactsBucket",
+            "ArtifactsLoggingBucket",
+            "ArtifactsLoggingBucketPolicy",
+            "ArtifactsBucketPolicy",
+            "PipelineExecutionRolePermissionPolicy",
+        }
+        if create_image_repository:
+            self.assertSetEqual(
+                {
+                    *common_resources,
+                    "ImageRepository",
+                },
+                set(self._extract_created_resource_logical_ids(stack_name)),
+            )
+            CFN_OUTPUT_TO_CONFIG_KEY["ImageRepository"] = "image_repository"
+            self.validate_pipeline_config(stack_name, stage_name, list(CFN_OUTPUT_TO_CONFIG_KEY.keys()))
+            del CFN_OUTPUT_TO_CONFIG_KEY["ImageRepository"]
+        else:
+            self.assertSetEqual(common_resources, set(self._extract_created_resource_logical_ids(stack_name)))
+            self.validate_pipeline_config(stack_name, stage_name)
+
+    @parameterized.expand([("create_image_repository",), (False,)])
+    def test_non_interactive_with_no_resources_provided(self, create_image_repository):
+        stage_name, stack_name = self._get_stage_and_stack_name()
+        self.stack_names = [stack_name]
+
+        bootstrap_command_list = self.get_bootstrap_command_list(
+            no_interactive=True,
+            create_image_repository=create_image_repository,
+            no_confirm_changeset=True,
+            region=self.region,
+        )
+
+        bootstrap_process_execute = run_command(bootstrap_command_list)
+
+        self.assertEqual(bootstrap_process_execute.process.returncode, 2)
+        stderr = bootstrap_process_execute.stderr.decode()
+        self.assertIn("Missing required parameter", stderr)
+
+    def test_interactive_with_all_required_resources_provided(self):
+        stage_name, stack_name = self._get_stage_and_stack_name()
+        self.stack_names = [stack_name]
+
+        bootstrap_command_list = self.get_bootstrap_command_list()
+
+        inputs = [
+            stage_name,
+            CREDENTIAL_PROFILE,
+            self.region,  # region
+            "arn:aws:iam::123:user/user-name",  # pipeline user
+            "arn:aws:iam::123:role/role-name",  # Pipeline execution role
+            "arn:aws:iam::123:role/role-name",  # CloudFormation execution role
+            "arn:aws:s3:::bucket-name",  # Artifacts bucket
+            "N",  # Should we create ECR repo, 3 - specify one
+            "",
+        ]
+
+        bootstrap_process_execute = run_command_with_inputs(bootstrap_command_list, inputs)
+
+        self.assertEqual(bootstrap_process_execute.process.returncode, 0)
+        stdout = bootstrap_process_execute.stdout.decode()
+        self.assertIn("skipping creation", stdout)
+
+    def test_no_interactive_with_all_required_resources_provided(self):
+        stage_name, 
stack_name = self._get_stage_and_stack_name() + self.stack_names = [stack_name] + + bootstrap_command_list = self.get_bootstrap_command_list( + no_interactive=True, + stage_name=stage_name, + pipeline_user="arn:aws:iam::123:user/user-name", # pipeline user + pipeline_execution_role="arn:aws:iam::123:role/role-name", # Pipeline execution role + cloudformation_execution_role="arn:aws:iam::123:role/role-name", # CloudFormation execution role + bucket="arn:aws:s3:::bucket-name", # Artifacts bucket + image_repository="arn:aws:ecr:::repository/repo-name", # ecr repo + region=self.region, + ) + + bootstrap_process_execute = run_command(bootstrap_command_list) + + self.assertEqual(bootstrap_process_execute.process.returncode, 0) + stdout = bootstrap_process_execute.stdout.decode() + self.assertIn("skipping creation", stdout) + + def validate_pipeline_config(self, stack_name, stage_name, cfn_keys_to_check=None): + # Get output values from cloudformation + if cfn_keys_to_check is None: + cfn_keys_to_check = list(CFN_OUTPUT_TO_CONFIG_KEY.keys()) + response = self.cf_client.describe_stacks(StackName=stack_name) + stacks = response["Stacks"] + self.assertTrue(len(stacks) > 0) # in case stack name is invalid + stack_outputs = stacks[0]["Outputs"] + output_values = {} + for value in stack_outputs: + output_values[value["OutputKey"]] = value["OutputValue"] + + # Get values saved in config file + config = SamConfig(PIPELINE_CONFIG_DIR, PIPELINE_CONFIG_FILENAME) + config_values = config.get_all(["pipeline", "bootstrap"], "parameters", stage_name) + config_values = {**config_values, **config.get_all(["pipeline", "bootstrap"], "parameters")} + + for key in CFN_OUTPUT_TO_CONFIG_KEY: + if key not in cfn_keys_to_check: + continue + value = CFN_OUTPUT_TO_CONFIG_KEY[key] + cfn_value = output_values[key] + config_value = config_values[value] + if key == "ImageRepository": + self.assertEqual(cfn_value.split("/")[-1], config_value.split("/")[-1]) + else: + 
self.assertTrue(cfn_value.endswith(config_value) or cfn_value == config_value) + + @parameterized.expand([("confirm_changeset",), (False,)]) + def test_no_interactive_with_some_required_resources_provided(self, confirm_changeset: bool): + stage_name, stack_name = self._get_stage_and_stack_name() + self.stack_names = [stack_name] + + bootstrap_command_list = self.get_bootstrap_command_list( + no_interactive=True, + stage_name=stage_name, + pipeline_user="arn:aws:iam::123:user/user-name", # pipeline user + pipeline_execution_role="arn:aws:iam::123:role/role-name", # Pipeline execution role + # CloudFormation execution role missing + bucket="arn:aws:s3:::bucket-name", # Artifacts bucket + image_repository="arn:aws:ecr:::repository/repo-name", # ecr repo + no_confirm_changeset=not confirm_changeset, + region=self.region, + ) + + inputs = [ + "y", # proceed + ] + + bootstrap_process_execute = run_command_with_inputs(bootstrap_command_list, inputs if confirm_changeset else []) + + self.assertEqual(bootstrap_process_execute.process.returncode, 0) + stdout = bootstrap_process_execute.stdout.decode() + self.assertIn("Successfully created!", stdout) + self.assertIn("CloudFormationExecutionRole", self._extract_created_resource_logical_ids(stack_name)) + + def test_interactive_cancelled_by_user(self): + stage_name, stack_name = self._get_stage_and_stack_name() + self.stack_names = [stack_name] + + bootstrap_command_list = self.get_bootstrap_command_list() + + inputs = [ + stage_name, + CREDENTIAL_PROFILE, + self.region, # region + "arn:aws:iam::123:user/user-name", # pipeline user + "arn:aws:iam::123:role/role-name", # Pipeline execution role + "", # CloudFormation execution role + "arn:aws:s3:::bucket-name", # Artifacts bucket + "N", # Do you have Lambda with package type Image + "", + "", # Create resources confirmation + ] + + bootstrap_process_execute = run_command_with_inputs(bootstrap_command_list, inputs) + + 
self.assertEqual(bootstrap_process_execute.process.returncode, 0) + stdout = bootstrap_process_execute.stdout.decode() + self.assertTrue(stdout.strip().endswith("Canceling pipeline bootstrap creation.")) + self.assertFalse(self._stack_exists(stack_name)) + + def test_interactive_with_some_required_resources_provided(self): + stage_name, stack_name = self._get_stage_and_stack_name() + self.stack_names = [stack_name] + + bootstrap_command_list = self.get_bootstrap_command_list() + + inputs = [ + stage_name, + CREDENTIAL_PROFILE, + self.region, # region + "arn:aws:iam::123:user/user-name", # pipeline user + "arn:aws:iam::123:role/role-name", # Pipeline execution role + "", # CloudFormation execution role + "arn:aws:s3:::bucket-name", # Artifacts bucket + "N", # Do you have Lambda with package type Image + "", + "y", # Create resources confirmation + ] + + bootstrap_process_execute = run_command_with_inputs(bootstrap_command_list, inputs) + + self.assertEqual(bootstrap_process_execute.process.returncode, 0) + stdout = bootstrap_process_execute.stdout.decode() + self.assertIn("Successfully created!", stdout) + # make sure the not provided resource is the only resource created. + self.assertIn("CloudFormationExecutionRole", self._extract_created_resource_logical_ids(stack_name)) + self.validate_pipeline_config(stack_name, stage_name) + + def test_interactive_pipeline_user_only_created_once(self): + """ + Create 3 stages, only the first stage resource stack creates + a pipeline user, and the remaining two share the same pipeline user. 
+ """ + stage_names = [] + for suffix in ["1", "2", "3"]: + stage_name, stack_name = self._get_stage_and_stack_name(suffix) + stage_names.append(stage_name) + self.stack_names.append(stack_name) + + bootstrap_command_list = self.get_bootstrap_command_list() + + for i, stage_name in enumerate(stage_names): + inputs = [ + stage_name, + CREDENTIAL_PROFILE, + self.region, # region + *([""] if i == 0 else []), # pipeline user + "arn:aws:iam::123:role/role-name", # Pipeline execution role + "arn:aws:iam::123:role/role-name", # CloudFormation execution role + "arn:aws:s3:::bucket-name", # Artifacts bucket + "N", # Should we create ECR repo, 3 - specify one + "", + "y", # Create resources confirmation + ] + + bootstrap_process_execute = run_command_with_input( + bootstrap_command_list, ("\n".join(inputs) + "\n").encode() + ) + + self.assertEqual(bootstrap_process_execute.process.returncode, 0) + stdout = bootstrap_process_execute.stdout.decode() + + # Only first environment creates pipeline user + if i == 0: + self.assertIn("The following resources were created in your account:", stdout) + resources = self._extract_created_resource_logical_ids(self.stack_names[i]) + self.assertTrue("PipelineUser" in resources) + self.assertTrue("PipelineUserAccessKey" in resources) + self.assertTrue("PipelineUserSecretKey" in resources) + self.validate_pipeline_config(self.stack_names[i], stage_name) + else: + self.assertIn("skipping creation", stdout) + + @parameterized.expand([("ArtifactsBucket",), ("ArtifactsLoggingBucket",)]) + def test_bootstrapped_buckets_accept_ssl_requests_only(self, bucket_logical_id): + stage_name, stack_name = self._get_stage_and_stack_name() + self.stack_names = [stack_name] + + bootstrap_command_list = self.get_bootstrap_command_list( + stage_name=stage_name, no_interactive=True, no_confirm_changeset=True, region=self.region + ) + + bootstrap_process_execute = run_command(bootstrap_command_list) + + 
self.assertEqual(bootstrap_process_execute.process.returncode, 0) + + stack_resources = self.cf_client.describe_stack_resources(StackName=stack_name) + bucket = next( + resource + for resource in stack_resources["StackResources"] + if resource["LogicalResourceId"] == bucket_logical_id + ) + bucket_name = bucket["PhysicalResourceId"] + bucket_key = "any/testing/key.txt" + testing_data = b"any testing binary data" + + s3_ssl_client = boto3.client("s3", region_name=self.region) + s3_non_ssl_client = boto3.client("s3", use_ssl=False, region_name=self.region) + + # Assert SSL requests are accepted + s3_ssl_client.put_object(Body=testing_data, Bucket=bucket_name, Key=bucket_key) + res = s3_ssl_client.get_object(Bucket=bucket_name, Key=bucket_key) + retrieved_data = res["Body"].read() + self.assertEqual(retrieved_data, testing_data) + + # Assert non-SSL requests are denied + with self.assertRaises(ClientError) as error: + s3_non_ssl_client.get_object(Bucket=bucket_name, Key=bucket_key) + self.assertEqual( + str(error.exception), "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied" + ) + + def test_bootstrapped_artifacts_bucket_has_server_access_log_enabled(self): + stage_name, stack_name = self._get_stage_and_stack_name() + self.stack_names = [stack_name] + + bootstrap_command_list = self.get_bootstrap_command_list( + stage_name=stage_name, no_interactive=True, no_confirm_changeset=True, region=self.region + ) + + bootstrap_process_execute = run_command(bootstrap_command_list) + + self.assertEqual(bootstrap_process_execute.process.returncode, 0) + + stack_resources = self.cf_client.describe_stack_resources(StackName=stack_name) + artifacts_bucket = next( + resource + for resource in stack_resources["StackResources"] + if resource["LogicalResourceId"] == "ArtifactsBucket" + ) + artifacts_bucket_name = artifacts_bucket["PhysicalResourceId"] + artifacts_logging_bucket = next( + resource + for resource in stack_resources["StackResources"] + 
if resource["LogicalResourceId"] == "ArtifactsLoggingBucket" + ) + artifacts_logging_bucket_name = artifacts_logging_bucket["PhysicalResourceId"] + + s3_client = boto3.client("s3", region_name=self.region) + res = s3_client.get_bucket_logging(Bucket=artifacts_bucket_name) + self.assertEqual(artifacts_logging_bucket_name, res["LoggingEnabled"]["TargetBucket"]) diff --git a/tests/integration/pipeline/test_init_command.py b/tests/integration/pipeline/test_init_command.py new file mode 100644 index 0000000000..32706f3fe2 --- /dev/null +++ b/tests/integration/pipeline/test_init_command.py @@ -0,0 +1,301 @@ +import os.path +import shutil +from pathlib import Path +from textwrap import dedent +from typing import List +from unittest import skipIf + +from parameterized import parameterized + +from samcli.cli.main import global_cfg +from samcli.commands.pipeline.bootstrap.cli import PIPELINE_CONFIG_DIR, PIPELINE_CONFIG_FILENAME +from samcli.commands.pipeline.init.interactive_init_flow import APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME +from tests.integration.pipeline.base import InitIntegBase, BootstrapIntegBase +from tests.integration.pipeline.test_bootstrap_command import SKIP_BOOTSTRAP_TESTS, CREDENTIAL_PROFILE +from tests.testing_utils import run_command_with_inputs + +QUICK_START_JENKINS_INPUTS_WITHOUT_AUTO_FILL = [ + "1", # quick start + "1", # jenkins, this depends on the template repo. 
+ "", + "credential-id", + "main", + "template.yaml", + "test", + "test-stack", + "test-pipeline-execution-role", + "test-cfn-execution-role", + "test-bucket", + "test-ecr", + "us-east-2", + "prod", + "prod-stack", + "prod-pipeline-execution-role", + "prod-cfn-execution-role", + "prod-bucket", + "prod-ecr", + "us-west-2", +] +SHARED_PATH: Path = global_cfg.config_dir +EXPECTED_JENKINS_FILE_PATH = Path( + SHARED_PATH, APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME, "tests", "testfile_jenkins", "expected" +) + + +class TestInit(InitIntegBase): + """ + Here we use Jenkins template for testing + """ + + def setUp(self) -> None: + # make sure there is no pipelineconfig.toml, otherwise the autofill could affect the question flow + pipelineconfig_file = Path(PIPELINE_CONFIG_DIR, PIPELINE_CONFIG_FILENAME) + if pipelineconfig_file.exists(): + pipelineconfig_file.unlink() + + def tearDown(self) -> None: + super().tearDown() + shutil.rmtree(PIPELINE_CONFIG_DIR, ignore_errors=True) + + def test_quick_start(self): + generated_jenkinsfile_path = Path("Jenkinsfile") + self.generated_files.append(generated_jenkinsfile_path) + + init_command_list = self.get_init_command_list() + init_process_execute = run_command_with_inputs(init_command_list, QUICK_START_JENKINS_INPUTS_WITHOUT_AUTO_FILL) + + self.assertEqual(init_process_execute.process.returncode, 0) + self.assertTrue(Path("Jenkinsfile").exists()) + + with open(EXPECTED_JENKINS_FILE_PATH, "r") as expected, open(generated_jenkinsfile_path, "r") as output: + self.assertEqual(expected.read(), output.read()) + + def test_failed_when_generated_file_already_exist_override(self): + generated_jenkinsfile_path = Path("Jenkinsfile") + generated_jenkinsfile_path.touch() # the file now pre-exists + self.generated_files.append(generated_jenkinsfile_path) + + init_command_list = self.get_init_command_list() + init_process_execute = run_command_with_inputs( + init_command_list, [*QUICK_START_JENKINS_INPUTS_WITHOUT_AUTO_FILL, "y"] + ) + + 
self.assertEqual(init_process_execute.process.returncode, 0) + self.assertTrue(Path("Jenkinsfile").exists()) + + with open(EXPECTED_JENKINS_FILE_PATH, "r") as expected, open(generated_jenkinsfile_path, "r") as output: + self.assertEqual(expected.read(), output.read()) + + def test_failed_when_generated_file_already_exist_not_override(self): + generated_jenkinsfile_path = Path("Jenkinsfile") + generated_jenkinsfile_path.touch() # the file now pre-exists + self.generated_files.append(generated_jenkinsfile_path) + + init_command_list = self.get_init_command_list() + init_process_execute = run_command_with_inputs( + init_command_list, [*QUICK_START_JENKINS_INPUTS_WITHOUT_AUTO_FILL, ""] + ) + + self.assertEqual(init_process_execute.process.returncode, 0) + + with open(EXPECTED_JENKINS_FILE_PATH, "r") as expected, open( + os.path.join(".aws-sam", "pipeline", "generated-files", "Jenkinsfile"), "r" + ) as output: + self.assertEqual(expected.read(), output.read()) + + # also check the Jenkinsfile is not overridden + self.assertEqual("", open("Jenkinsfile", "r").read()) + + def test_custom_template(self): + generated_file = Path("weather") + self.generated_files.append(generated_file) + + custom_template_path = Path(__file__).parent.parent.joinpath(Path("testdata", "pipeline", "custom_template")) + inputs = ["2", str(custom_template_path), "", "Rainy"] # custom template + + init_command_list = self.get_init_command_list() + init_process_execute = run_command_with_inputs(init_command_list, inputs) + + self.assertEqual(init_process_execute.process.returncode, 0) + + self.assertTrue(generated_file.exists()) + + with open(generated_file, "r") as f: + self.assertEqual("Rainy\n", f.read()) + + @parameterized.expand([("with_bootstrap",), (False,)]) + def test_with_pipelineconfig_has_all_stage_values(self, with_bootstrap): + generated_jenkinsfile_path = Path("Jenkinsfile") + self.generated_files.append(generated_jenkinsfile_path) + + Path(PIPELINE_CONFIG_DIR).mkdir(parents=True, 
exist_ok=True) + pipelineconfig_path = Path(PIPELINE_CONFIG_DIR, PIPELINE_CONFIG_FILENAME) + with open(pipelineconfig_path, "w") as f: + f.write( + dedent( + """\ + version = 0.1 + [default] + [default.pipeline_bootstrap] + [default.pipeline_bootstrap.parameters] + pipeline_user = "arn:aws:iam::123:user/aws-sam-cli-managed-test-pipeline-res-PipelineUser-123" + + [test] + [test.pipeline_bootstrap] + [test.pipeline_bootstrap.parameters] + pipeline_execution_role = "test-pipeline-execution-role" + cloudformation_execution_role = "test-cfn-execution-role" + artifacts_bucket = "test-bucket" + image_repository = "test-ecr" + region = "us-east-2" + + [prod] + [prod.pipeline_bootstrap] + [prod.pipeline_bootstrap.parameters] + pipeline_execution_role = "prod-pipeline-execution-role" + cloudformation_execution_role = "prod-cfn-execution-role" + artifacts_bucket = "prod-bucket" + image_repository = "prod-ecr" + region = "us-west-2" + """ + ) + ) + + inputs = [ + "1", # quick start + "1", # jenkins, this depends on the template repo. 
+ "credential-id", + "main", + "template.yaml", + "1", + "test-stack", + "2", + "prod-stack", + ] + + init_command_list = self.get_init_command_list(with_bootstrap) + init_process_execute = run_command_with_inputs(init_command_list, inputs) + + self.assertEqual(init_process_execute.process.returncode, 0) + self.assertTrue(Path("Jenkinsfile").exists()) + + with open(EXPECTED_JENKINS_FILE_PATH, "r") as expected, open(generated_jenkinsfile_path, "r") as output: + self.assertEqual(expected.read(), output.read()) + + +@skipIf(SKIP_BOOTSTRAP_TESTS, "Skip bootstrap tests in CI/CD only") +class TestInitWithBootstrap(BootstrapIntegBase): + generated_files: List[Path] = [] + + def setUp(self): + super().setUp() + self.command_list = [self.base_command(), "pipeline", "init", "--bootstrap"] + generated_jenkinsfile_path = Path("Jenkinsfile") + self.generated_files.append(generated_jenkinsfile_path) + + def tearDown(self) -> None: + for generated_file in self.generated_files: + if generated_file.is_dir(): + shutil.rmtree(generated_file, ignore_errors=True) + elif generated_file.exists(): + generated_file.unlink() + super().tearDown() + + def test_without_stages_in_pipeline_config(self): + stage_names = [] + for suffix in ["1", "2"]: + stage_name, stack_name = self._get_stage_and_stack_name(suffix) + stage_names.append(stage_name) + self.stack_names.append(stack_name) + + inputs = [ + "1", # quick start + "1", # jenkins, this depends on the template repo. + "y", # Do you want to go through stage setup process now? + stage_names[0], + CREDENTIAL_PROFILE, + self.region, + "", # pipeline user + "", # Pipeline execution role + "", # CloudFormation execution role + "", # Artifacts bucket + "N", # no ECR repo + "", # Confirm summary + "y", # Create resources + "y", # Do you want to go through stage setup process now? 
+ stage_names[1], + CREDENTIAL_PROFILE, + self.region, + "", # pipeline user + "", # Pipeline execution role + "", # CloudFormation execution role + "", # Artifacts bucket + "N", # no ECR repo + "", # Confirm summary + "y", # Create resources + "credential-id", + "main", + "template.yaml", + "1", + "test-stack", + "2", + "prod-stack", + ] + init_process_execute = run_command_with_inputs(self.command_list, inputs) + self.assertEqual(init_process_execute.process.returncode, 0) + self.assertIn("Here are the stage names detected", init_process_execute.stdout.decode()) + self.assertIn(stage_names[0], init_process_execute.stdout.decode()) + self.assertIn(stage_names[1], init_process_execute.stdout.decode()) + + def test_with_one_stages_in_pipeline_config(self): + stage_names = [] + for suffix in ["1", "2"]: + stage_name, stack_name = self._get_stage_and_stack_name(suffix) + stage_names.append(stage_name) + self.stack_names.append(stack_name) + + bootstrap_command_list = self.get_bootstrap_command_list() + + inputs = [ + stage_names[0], + CREDENTIAL_PROFILE, + self.region, # region + "", # pipeline user + "", # Pipeline execution role + "", # CloudFormation execution role + "", # Artifacts bucket + "N", # no + "", # Confirm summary + "y", # Create resources + ] + + bootstrap_process_execute = run_command_with_inputs(bootstrap_command_list, inputs) + + self.assertEqual(bootstrap_process_execute.process.returncode, 0) + + inputs = [ + "1", # quick start + "1", # jenkins, this depends on the template repo. + "y", # Do you want to go through stage setup process now? 
+ stage_names[1], + CREDENTIAL_PROFILE, + self.region, + "", # Pipeline execution role + "", # CloudFormation execution role + "", # Artifacts bucket + "N", # no ECR repo + "", # Confirm summary + "y", # Create resources + "credential-id", + "main", + "template.yaml", + "1", + "test-stack", + "2", + "prod-stack", + ] + init_process_execute = run_command_with_inputs(self.command_list, inputs) + self.assertEqual(init_process_execute.process.returncode, 0) + self.assertIn("Here are the stage names detected", init_process_execute.stdout.decode()) + self.assertIn(stage_names[0], init_process_execute.stdout.decode()) + self.assertIn(stage_names[1], init_process_execute.stdout.decode()) diff --git a/tests/integration/testdata/buildcmd/PyLayer/requirements.txt b/tests/integration/testdata/buildcmd/PyLayer/requirements.txt index bf8549f936..ce4af48039 100644 --- a/tests/integration/testdata/buildcmd/PyLayer/requirements.txt +++ b/tests/integration/testdata/buildcmd/PyLayer/requirements.txt @@ -1,6 +1,7 @@ # These are some hard packages to build. Using them here helps us verify that building works on various platforms -numpy~=1.15 +# NOTE: Fixing to <1.20.3 as numpy1.20.3 started to use a new wheel naming convention (PEP 600) +numpy<1.20.3 # `cryptography` has a dependency on `pycparser` which, for some reason doesn't build inside a Docker container. # Turning this off until we resolve this issue: https://github.com/awslabs/aws-lambda-builders/issues/29 # cryptography~=2.4 diff --git a/tests/integration/testdata/buildcmd/PyLayerMake/requirements.txt b/tests/integration/testdata/buildcmd/PyLayerMake/requirements.txt index bf8549f936..ce4af48039 100644 --- a/tests/integration/testdata/buildcmd/PyLayerMake/requirements.txt +++ b/tests/integration/testdata/buildcmd/PyLayerMake/requirements.txt @@ -1,6 +1,7 @@ # These are some hard packages to build. 
Using them here helps us verify that building works on various platforms -numpy~=1.15 +# NOTE: Fixing to <1.20.3 as numpy1.20.3 started to use a new wheel naming convention (PEP 600) +numpy<1.20.3 # `cryptography` has a dependency on `pycparser` which, for some reason doesn't build inside a Docker container. # Turning this off until we resolve this issue: https://github.com/awslabs/aws-lambda-builders/issues/29 # cryptography~=2.4 diff --git a/tests/integration/testdata/buildcmd/Python/requirements.txt b/tests/integration/testdata/buildcmd/Python/requirements.txt index bf8549f936..ce4af48039 100644 --- a/tests/integration/testdata/buildcmd/Python/requirements.txt +++ b/tests/integration/testdata/buildcmd/Python/requirements.txt @@ -1,6 +1,7 @@ # These are some hard packages to build. Using them here helps us verify that building works on various platforms -numpy~=1.15 +# NOTE: Fixing to <1.20.3 as numpy1.20.3 started to use a new wheel naming convention (PEP 600) +numpy<1.20.3 # `cryptography` has a dependency on `pycparser` which, for some reason doesn't build inside a Docker container. # Turning this off until we resolve this issue: https://github.com/awslabs/aws-lambda-builders/issues/29 # cryptography~=2.4 diff --git a/tests/integration/testdata/buildcmd/PythonImage/requirements.txt b/tests/integration/testdata/buildcmd/PythonImage/requirements.txt index bf8549f936..ce4af48039 100644 --- a/tests/integration/testdata/buildcmd/PythonImage/requirements.txt +++ b/tests/integration/testdata/buildcmd/PythonImage/requirements.txt @@ -1,6 +1,7 @@ # These are some hard packages to build. Using them here helps us verify that building works on various platforms -numpy~=1.15 +# NOTE: Fixing to <1.20.3 as numpy1.20.3 started to use a new wheel naming convention (PEP 600) +numpy<1.20.3 # `cryptography` has a dependency on `pycparser` which, for some reason doesn't build inside a Docker container. 
# Turning this off until we resolve this issue: https://github.com/awslabs/aws-lambda-builders/issues/29 # cryptography~=2.4 diff --git a/tests/integration/testdata/buildcmd/aws-serverless-application-with-application-id-map.yaml b/tests/integration/testdata/buildcmd/aws-serverless-application-with-application-id-map.yaml new file mode 100644 index 0000000000..e50d00fbd3 --- /dev/null +++ b/tests/integration/testdata/buildcmd/aws-serverless-application-with-application-id-map.yaml @@ -0,0 +1,17 @@ +AWSTemplateFormatVersion: "2010-09-09" +Transform: AWS::Serverless-2016-10-31 + +Mappings: + MappingExample: + us-east-2: + ApplicationId: arn:aws:serverlessrepo:us-east-1:077246666028:applications/hello-world + +Resources: + MyApplication: + Type: AWS::Serverless::Application + Properties: + Location: + ApplicationId: !FindInMap [ MappingExample, !Ref AWS::Region, ApplicationId ] + SemanticVersion: 1.0.4 + Parameters: + IdentityNameParameter: AnyValue \ No newline at end of file diff --git a/tests/integration/testdata/invoke/layers/local-zip-layer-template.yml b/tests/integration/testdata/invoke/layers/local-zip-layer-template.yml new file mode 100644 index 0000000000..466ff9791a --- /dev/null +++ b/tests/integration/testdata/invoke/layers/local-zip-layer-template.yml @@ -0,0 +1,19 @@ +AWSTemplateFormatVersion : '2010-09-09' +Transform: AWS::Serverless-2016-10-31 +Description: A hello world application. + +Resources: + LayerOne: + Type: AWS::Lambda::LayerVersion + Properties: + Content: ../layer_zips/layer1.zip + + OneLayerVersionServerlessFunction: + Type: AWS::Serverless::Function + Properties: + Handler: layer-main.one_layer_hanlder + Runtime: python3.6 + CodeUri: . 
+ Timeout: 20 + Layers: + - !Ref LayerOne diff --git a/tests/integration/testdata/invoke/template_image.yaml b/tests/integration/testdata/invoke/template_image.yaml index 85bf24bb72..7bc106c7cb 100644 --- a/tests/integration/testdata/invoke/template_image.yaml +++ b/tests/integration/testdata/invoke/template_image.yaml @@ -203,3 +203,13 @@ Resources: ImageConfig: Command: - main.echo_event + + ImageDoesntExistFunction: + Type: AWS::Serverless::Function + Properties: + FunctionName: func-name + PackageType: Image + ImageUri: non-existing-image:v1 + ImageConfig: + Command: + - main.echo_event diff --git a/tests/integration/testdata/package/aws-lambda-function-image-and-api.yaml b/tests/integration/testdata/package/aws-lambda-function-image-and-api.yaml index 73e95a14d4..38cc761756 100644 --- a/tests/integration/testdata/package/aws-lambda-function-image-and-api.yaml +++ b/tests/integration/testdata/package/aws-lambda-function-image-and-api.yaml @@ -7,7 +7,7 @@ Resources: Type: AWS::Lambda::Function Properties: PackageType: Image - Code: alpine:latest + Code: "emulation-python3.8:latest" Role: Fn::GetAtt: - "LambdaExecutionRole" diff --git a/tests/integration/testdata/package/aws-lambda-function-image.yaml b/tests/integration/testdata/package/aws-lambda-function-image.yaml index 9aee7c0c3a..11f4c681fd 100644 --- a/tests/integration/testdata/package/aws-lambda-function-image.yaml +++ b/tests/integration/testdata/package/aws-lambda-function-image.yaml @@ -7,7 +7,7 @@ Resources: Type: AWS::Lambda::Function Properties: PackageType: Image - Code: alpine:latest + Code: emulation-python3.8:latest Role: Fn::GetAtt: - "LambdaExecutionRole" diff --git a/tests/integration/testdata/package/aws-serverless-function-image.yaml b/tests/integration/testdata/package/aws-serverless-function-image.yaml index d4cce7863f..f5864bebb5 100644 --- a/tests/integration/testdata/package/aws-serverless-function-image.yaml +++ b/tests/integration/testdata/package/aws-serverless-function-image.yaml 
@@ -7,7 +7,7 @@ Resources: Type: AWS::Serverless::Function Properties: PackageType: Image - ImageUri: alpine:latest + ImageUri: emulation-python3.8:latest Events: HelloWorld: Type: Api diff --git a/tests/integration/testdata/package/deep-nested-image/ChildStackX/ChildStackY/template.yaml b/tests/integration/testdata/package/deep-nested-image/ChildStackX/ChildStackY/template.yaml index c17e15143f..2fc93a6bfe 100644 --- a/tests/integration/testdata/package/deep-nested-image/ChildStackX/ChildStackY/template.yaml +++ b/tests/integration/testdata/package/deep-nested-image/ChildStackX/ChildStackY/template.yaml @@ -7,4 +7,4 @@ Resources: Type: AWS::Serverless::Function Properties: PackageType: Image - ImageUri: python:3.9-slim \ No newline at end of file + ImageUri: emulation-python3.8:latest \ No newline at end of file diff --git a/tests/integration/testdata/package/deep-nested-image/ChildStackX/template.yaml b/tests/integration/testdata/package/deep-nested-image/ChildStackX/template.yaml index 89b70fe0ea..0c26cd70e5 100644 --- a/tests/integration/testdata/package/deep-nested-image/ChildStackX/template.yaml +++ b/tests/integration/testdata/package/deep-nested-image/ChildStackX/template.yaml @@ -7,7 +7,7 @@ Resources: Type: AWS::Serverless::Function Properties: PackageType: Image - ImageUri: python:3.7-slim + ImageUri: emulation-python3.8-2:latest ChildStackY: Type: AWS::Serverless::Application diff --git a/tests/integration/testdata/package/deep-nested-image/template.yaml b/tests/integration/testdata/package/deep-nested-image/template.yaml index e5fdd116a2..c464e77194 100644 --- a/tests/integration/testdata/package/deep-nested-image/template.yaml +++ b/tests/integration/testdata/package/deep-nested-image/template.yaml @@ -7,7 +7,7 @@ Resources: Type: AWS::Serverless::Function Properties: PackageType: Image - ImageUri: python:3.8-slim + ImageUri: emulation-python3.8:latest ChildStackX: Type: AWS::Serverless::Application diff --git 
a/tests/integration/testdata/package/samconfig-read-boolean-tomlkit.toml b/tests/integration/testdata/package/samconfig-read-boolean-tomlkit.toml new file mode 100644 index 0000000000..b17c6498a8 --- /dev/null +++ b/tests/integration/testdata/package/samconfig-read-boolean-tomlkit.toml @@ -0,0 +1,6 @@ +version = 0.1 + +[default.global.parameters] + +[default.deploy.parameters] +confirm_changeset = false \ No newline at end of file diff --git a/tests/integration/testdata/package/samconfig-tags-list.toml b/tests/integration/testdata/package/samconfig-tags-list.toml new file mode 100644 index 0000000000..3e3a6c43cd --- /dev/null +++ b/tests/integration/testdata/package/samconfig-tags-list.toml @@ -0,0 +1,10 @@ +version = 0.1 +[default] +[default.deploy] +[default.deploy.parameters] +tags = [ + "stage=int", + "company:application=awesome-service", + "company:department=engineering", + "company:staff=12" + ] \ No newline at end of file diff --git a/tests/integration/testdata/package/samconfig-tags-string.toml b/tests/integration/testdata/package/samconfig-tags-string.toml new file mode 100644 index 0000000000..e66df4ac06 --- /dev/null +++ b/tests/integration/testdata/package/samconfig-tags-string.toml @@ -0,0 +1,5 @@ +version = 0.1 +[default] +[default.deploy] +[default.deploy.parameters] +tags = "stage=int company:application=awesome-service company:department=engineering company:staff=12" \ No newline at end of file diff --git a/tests/integration/testdata/pipeline/custom_template/cookiecutter.json b/tests/integration/testdata/pipeline/custom_template/cookiecutter.json new file mode 100644 index 0000000000..c02b7caed1 --- /dev/null +++ b/tests/integration/testdata/pipeline/custom_template/cookiecutter.json @@ -0,0 +1,4 @@ +{ + "outputDir": "aws-sam-pipeline", + "weather": "" +} \ No newline at end of file diff --git a/tests/integration/testdata/pipeline/custom_template/metadata.json b/tests/integration/testdata/pipeline/custom_template/metadata.json new file mode 
100644 index 0000000000..689fe297f8 --- /dev/null +++ b/tests/integration/testdata/pipeline/custom_template/metadata.json @@ -0,0 +1,3 @@ +{ + "number_of_stages": 0 +} diff --git a/tests/integration/testdata/pipeline/custom_template/questions.json b/tests/integration/testdata/pipeline/custom_template/questions.json new file mode 100644 index 0000000000..a0fe2167bf --- /dev/null +++ b/tests/integration/testdata/pipeline/custom_template/questions.json @@ -0,0 +1,7 @@ +{ + "questions": [{ + "key": "weather", + "question": "How is the weather today?", + "default": "Sunny" + }] +} \ No newline at end of file diff --git a/tests/integration/testdata/pipeline/custom_template/{{cookiecutter.outputDir}}/weather b/tests/integration/testdata/pipeline/custom_template/{{cookiecutter.outputDir}}/weather new file mode 100644 index 0000000000..3501ffd0ae --- /dev/null +++ b/tests/integration/testdata/pipeline/custom_template/{{cookiecutter.outputDir}}/weather @@ -0,0 +1 @@ +{{cookiecutter.weather}} diff --git a/tests/integration/testdata/start_api/binarydata.gif b/tests/integration/testdata/start_api/binarydata.gif index 855b4041793a49335cf6d1b66d8c1e5059daf60f..3f40c2073daf9743db59e7bec58cf90e8f6d3fbc 100644 Binary files a/tests/integration/testdata/start_api/binarydata.gif and b/tests/integration/testdata/start_api/binarydata.gif differ
[...] + def base_command(self) -> str: + return "samdev" if os.getenv("SAM_CLI_DEV") else "sam" + + def command_list( + self, + template_file: Optional[Path] = None, + profile: Optional[str] = None, + region: Optional[str] = None, + config_file: Optional[Path] = None, + ) -> List[str]: + command_list = [self.base_command(), "validate"] + if template_file: + command_list += ["--template-file", str(template_file)] + if profile: + command_list += ["--profile", profile] + if region: + command_list += ["--region", region] + if config_file: + command_list += ["--config_file", str(config_file)] + return command_list + + @parameterized.expand( + [ + ("default_yaml", TemplateFileTypes.YAML), # project with template.yaml + ("default_json", TemplateFileTypes.JSON), # project with template.json + ("multiple_files", TemplateFileTypes.YAML), # project with both template.yaml and template.json + ( + "with_build", + TemplateFileTypes.JSON, + ), # project with template.json and standard build directory .aws-sam/build/template.yaml + ] + ) + def test_default_template_file_choice(self, relative_folder: str, expected_file: TemplateFileTypes): + test_data_path = Path(__file__).resolve().parents[2] / "integration" / "testdata" / "validate" + process_dir = test_data_path / relative_folder + command_result = run_command(self.command_list(), cwd=str(process_dir)) + pattern = self.patterns[expected_file] # type: ignore + output =
command_result.stdout.decode("utf-8") + self.assertEqual(command_result.process.returncode, 0) + self.assertRegex(output, pattern) diff --git a/tests/testing_utils.py b/tests/testing_utils.py index d0514dfb9a..05afd44c86 100644 --- a/tests/testing_utils.py +++ b/tests/testing_utils.py @@ -5,6 +5,7 @@ import shutil from collections import namedtuple from subprocess import Popen, PIPE, TimeoutExpired +from typing import List IS_WINDOWS = platform.system().lower() == "windows" RUNNING_ON_CI = os.environ.get("APPVEYOR", False) @@ -50,6 +51,14 @@ def run_command_with_input(command_list, stdin_input, cwd=None, env=None, timeou raise +def run_command_with_inputs( + command_list: List[str], inputs: List[str], cwd=None, env=None, timeout=TIMEOUT +) -> CommandResult: + return run_command_with_input( + command_list=command_list, stdin_input=("\n".join(inputs) + "\n").encode(), cwd=cwd, env=env, timeout=timeout + ) + + class FileCreator(object): def __init__(self): self.rootdir = tempfile.mkdtemp() diff --git a/tests/unit/cli/test_types.py b/tests/unit/cli/test_types.py index d4769dc0db..353e48eaca 100644 --- a/tests/unit/cli/test_types.py +++ b/tests/unit/cli/test_types.py @@ -228,6 +228,12 @@ def test_must_fail_on_invalid_format(self, input): {"a": "012345678901234567890123456789", "c": "012345678901234567890123456789"}, ), (("",), {}), + # list as input + ([], {}), + ( + ["stage=int", "company:application=awesome-service", "company:department=engineering"], + {"stage": "int", "company:application": "awesome-service", "company:department": "engineering"}, + ), ] ) def test_successful_parsing(self, input, expected): diff --git a/tests/unit/commands/_utils/test_options.py b/tests/unit/commands/_utils/test_options.py index 2660d1451a..7b1350122c 100644 --- a/tests/unit/commands/_utils/test_options.py +++ b/tests/unit/commands/_utils/test_options.py @@ -55,7 +55,18 @@ def test_must_return_yml_extension(self, os_mock): def test_must_return_yaml_extension(self, os_mock): expected 
= "template.yaml" - os_mock.path.exists.return_value = True + os_mock.path.exists.side_effect = lambda file_name: file_name == expected + os_mock.path.abspath.return_value = "absPath" + + result = get_or_default_template_file_name(None, None, _TEMPLATE_OPTION_DEFAULT_VALUE, include_build=False) + self.assertEqual(result, "absPath") + os_mock.path.abspath.assert_called_with(expected) + + @patch("samcli.commands._utils.options.os") + def test_must_return_json_extension(self, os_mock): + expected = "template.json" + + os_mock.path.exists.side_effect = lambda file_name: file_name == expected os_mock.path.abspath.return_value = "absPath" result = get_or_default_template_file_name(None, None, _TEMPLATE_OPTION_DEFAULT_VALUE, include_build=False) diff --git a/tests/unit/commands/_utils/test_template.py b/tests/unit/commands/_utils/test_template.py index 07bae1c728..576094bbcc 100644 --- a/tests/unit/commands/_utils/test_template.py +++ b/tests/unit/commands/_utils/test_template.py @@ -1,12 +1,10 @@ -import os import copy +import os +from unittest import TestCase +from unittest.mock import patch, mock_open, MagicMock -import jmespath import yaml from botocore.utils import set_value_from_jmespath - -from unittest import TestCase -from unittest.mock import patch, mock_open, MagicMock from parameterized import parameterized, param from samcli.commands._utils.resources import AWS_SERVERLESS_FUNCTION, AWS_SERVERLESS_API, RESOURCES_WITH_LOCAL_PATHS diff --git a/tests/unit/commands/buildcmd/test_build_context.py b/tests/unit/commands/buildcmd/test_build_context.py index d36deb68cf..265d68404d 100644 --- a/tests/unit/commands/buildcmd/test_build_context.py +++ b/tests/unit/commands/buildcmd/test_build_context.py @@ -61,6 +61,7 @@ def test_must_setup_context( mode="buildmode", cached=False, cache_dir="cache_dir", + aws_region="any_aws_region", iac=iac, project=project, ) @@ -85,7 +86,11 @@ def test_must_setup_context( self.assertTrue(function1 in resources_to_build.functions) 
self.assertTrue(layer1 in resources_to_build.layers) - get_buildable_stacks_mock.assert_called_once_with([iac_stack], parameter_overrides={"overrides": "value"}) + get_buildable_stacks_mock.assert_called_once_with( + [iac_stack], + parameter_overrides={"overrides": "value"}, + global_parameter_overrides={"AWS::Region": "any_aws_region"}, + ) SamFunctionProviderMock.assert_called_once_with([stack], False) pathlib_mock.Path.assert_called_once_with("template_file") setup_build_dir_mock.assert_called_with("build_dir", True) @@ -424,7 +429,9 @@ def test_must_return_many_functions_to_build( resources_to_build = context.resources_to_build self.assertEqual(resources_to_build.functions, [func1, func2]) self.assertEqual(resources_to_build.layers, [layer1]) - get_buildable_stacks_mock.assert_called_once_with([iac_stack], parameter_overrides={"overrides": "value"}) + get_buildable_stacks_mock.assert_called_once_with( + [iac_stack], parameter_overrides={"overrides": "value"}, global_parameter_overrides=None + ) SamFunctionProviderMock.assert_called_once_with([stack], False) pathlib_mock.Path.assert_called_once_with("template_file") setup_build_dir_mock.assert_called_with("build_dir", True) diff --git a/tests/unit/commands/buildcmd/test_command.py b/tests/unit/commands/buildcmd/test_command.py index 9c5df81330..a0e759f6a3 100644 --- a/tests/unit/commands/buildcmd/test_command.py +++ b/tests/unit/commands/buildcmd/test_command.py @@ -54,6 +54,7 @@ def test_must_succeed_build(self, os_mock, move_template_mock, ApplicationBuilde project = Mock() do_cli( + ctx_mock, "function_identifier", "template", "base_dir", @@ -133,6 +134,7 @@ def test_must_catch_known_exceptions(self, exception, wrapped_exception, Applica with self.assertRaises(UserException) as ctx: do_cli( + ctx_mock, "function_identifier", "template", "base_dir", @@ -170,6 +172,7 @@ def test_must_catch_function_not_found_exception(self, ApplicationBuilderMock, B with self.assertRaises(UserException) as ctx: do_cli( + 
ctx_mock, "function_identifier", "template", "base_dir", diff --git a/tests/unit/commands/deploy/test_deploy_context.py b/tests/unit/commands/deploy/test_deploy_context.py index 96af48047a..9e4cb4ff4a 100644 --- a/tests/unit/commands/deploy/test_deploy_context.py +++ b/tests/unit/commands/deploy/test_deploy_context.py @@ -1,6 +1,6 @@ """Test sam deploy command""" from unittest import TestCase -from unittest.mock import patch, MagicMock, Mock +from unittest.mock import ANY, patch, MagicMock, Mock import tempfile from samcli.lib.deploy.deployer import Deployer @@ -27,7 +27,7 @@ def setUp(self): notification_arns=[], fail_on_empty_changeset=False, tags={"a": "b"}, - region=None, + region="any-aws-region", profile=None, confirm_changeset=False, signing_profiles=None, @@ -149,3 +149,9 @@ def test_template_valid_execute_changeset_with_parameters( self.deploy_command_context.deployer.create_and_wait_for_changeset.call_args[1]["parameter_values"], [{"ParameterKey": "a", "ParameterValue": "b"}, {"ParameterKey": "c", "UsePreviousValue": True}], ) + patched_get_buildable_stacks.assert_called_once_with( + ANY, + parameter_overrides={"a": "b"}, + normalize_resource_metadata=False, + global_parameter_overrides={"AWS::Region": "any-aws-region"}, + ) diff --git a/tests/unit/commands/deploy/test_guided_context.py b/tests/unit/commands/deploy/test_guided_context.py index a4fc260204..6b67b9ffff 100644 --- a/tests/unit/commands/deploy/test_guided_context.py +++ b/tests/unit/commands/deploy/test_guided_context.py @@ -122,6 +122,9 @@ def test_guided_prompts_check_defaults_non_public_resources_zips( call(f"\t{self.gc.start_bold}Capabilities{self.gc.end_bold}", default=["CAPABILITY_IAM"], type=ANY), ] self.assertEqual(expected_prompt_calls, patched_prompt.call_args_list) + patched_get_buildable_stacks.assert_called_once_with( + ANY, parameter_overrides={}, global_parameter_overrides={"AWS::Region": ANY} + ) @patch("samcli.commands.deploy.guided_context.prompt") 
@patch("samcli.commands.deploy.guided_context.confirm") @@ -781,7 +784,7 @@ def test_guided_prompts_with_code_signing( expected_code_sign_calls = expected_code_sign_calls * (number_of_functions + number_of_layers) self.assertEqual(expected_code_sign_calls, patched_code_signer_prompt.call_args_list) - @patch("samcli.commands.deploy.guided_context.get_session") + @patch("samcli.commands.deploy.guided_context.get_default_aws_region") @patch("samcli.commands.deploy.guided_context.prompt") @patch("samcli.commands.deploy.guided_context.confirm") @patch("samcli.commands.deploy.guided_context.manage_stack") @@ -798,7 +801,7 @@ def test_guided_prompts_check_default_config_region( patched_manage_stack, patched_confirm, patched_prompt, - patched_get_session, + patched_get_default_aws_region, ): project_mock = self.gc._project project_mock.reset_mock() @@ -817,7 +820,7 @@ def test_guided_prompts_check_default_config_region( patched_confirm.side_effect = [True, False, True, True, ""] patched_signer_config_per_function.return_value = ({}, {}) patched_manage_stack.return_value = "managed_s3_stack" - patched_get_session.return_value.get_config_variable.return_value = "default_config_region" + patched_get_default_aws_region.return_value = "default_config_region" # setting the default region to None self.gc.region = None self.gc.guided_prompts() diff --git a/tests/unit/commands/init/test_cli.py b/tests/unit/commands/init/test_cli.py index 874819003f..b4462c1842 100644 --- a/tests/unit/commands/init/test_cli.py +++ b/tests/unit/commands/init/test_cli.py @@ -1,3 +1,9 @@ +import os +import shutil +import subprocess +import tempfile +from pathlib import Path +from typing import Dict, Any from unittest import TestCase from unittest.mock import patch, ANY @@ -5,23 +11,27 @@ import click from click.testing import CliRunner -from samcli.commands.init.init_templates import InitTemplates +from samcli.commands.exceptions import UserException from samcli.commands.init import cli as init_cmd 
from samcli.commands.init import do_cli as init_cli from samcli.lib.iac.interface import ProjectTypes +from samcli.commands.init.init_templates import InitTemplates, APP_TEMPLATES_REPO_URL from samcli.lib.init import GenerateProjectFailedError -from samcli.commands.exceptions import UserException +from samcli.lib.utils import osutils +from samcli.lib.utils.git_repo import GitRepo from samcli.lib.utils.packagetype import IMAGE, ZIP class MockInitTemplates: - def __init__(self, no_interactive=False, auto_clone=True): - self._repo_url = "https://github.com/awslabs/aws-sam-cli-app-templates.git" - self._repo_name = "aws-sam-cli-app-templates" - self.repo_path = "repository" - self.clone_attempted = True + def __init__(self, no_interactive=False): self._no_interactive = no_interactive - self._auto_clone = auto_clone + # TODO [melasmar] remove branch after CDK templates are GA + self._git_repo: GitRepo = GitRepo( + url=APP_TEMPLATES_REPO_URL, + branch="cdk-template", + ) + self._git_repo.clone_attempted = True + self._git_repo.local_path = Path("repository") class TestCli(TestCase): @@ -41,9 +51,42 @@ def setUp(self): self.extra_context = '{"project_name": "testing project", "runtime": "python3.6"}' self.extra_context_as_json = {"project_name": "testing project", "runtime": "python3.6"} - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") + # setup cache for clone, so that if `git clone` is called multiple times on the same URL, + # only one clone will happen.
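The clone cache described in the comment above memoizes `git clone` per URL and copies the cached checkout into each caller's working directory. A standalone sketch of that memoization idea (hypothetical names, not the test class's actual `check_output_mock`; the git runner is injectable so it can be stubbed):

```python
import shutil
import subprocess
import tempfile
from pathlib import Path
from typing import Dict

# url -> path of the single checkout made for that url
_CLONE_CACHE: Dict[str, Path] = {}


def cached_clone(url: str, branch: str, dest: Path, runner=subprocess.check_output) -> Path:
    """Clone `url` at most once per process; later calls copy the cached checkout to `dest`."""
    if url not in _CLONE_CACHE:
        workdir = Path(tempfile.mkdtemp())
        # `runner` defaults to subprocess.check_output but can be replaced in tests.
        runner(["git", "clone", "--branch", branch, url, "repo"], cwd=str(workdir))
        _CLONE_CACHE[url] = workdir / "repo"
    shutil.copytree(str(_CLONE_CACHE[url]), str(dest))
    return dest
```

Repeated calls with the same URL then hit the cache instead of the network, which is the speed-up the test class is after.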
+ clone_cache: Dict[str, Path] + patcher: Any + + @classmethod + def setUpClass(cls) -> None: + # Make (url -> directory) cache to avoid cloning the same thing twice + cls.clone_cache = {} + cls.patcher = patch("samcli.lib.utils.git_repo.check_output", side_effect=cls.check_output_mock) + cls.patcher.start() + + @classmethod + def tearDownClass(cls) -> None: + cls.patcher.stop() + for _, directory in cls.clone_cache.items(): + shutil.rmtree(directory.parent) + + @classmethod + def check_output_mock(cls, commands, cwd, stderr): + # TODO remove --branch once CDK is GA + git_executable, _, url, _, branch, clone_name = commands + if url not in cls.clone_cache: + tempdir = tempfile.NamedTemporaryFile(delete=False).name + subprocess.check_output( + [git_executable, "clone", "--branch", branch, url, clone_name], + cwd=tempdir, + stderr=stderr, + ) + cls.clone_cache[url] = Path(tempdir, clone_name) + + osutils.copytree(str(cls.clone_cache[url]), str(Path(cwd, clone_name))) + + @patch("samcli.lib.utils.git_repo.GitRepo.clone") @patch("samcli.commands.init.init_generator.generate_project") - def test_init_cli(self, generate_project_patch, sd_mock): + def test_init_cli(self, generate_project_patch, git_repo_clone_mock): # GIVEN generate_project successfully created a project # WHEN a project name has been passed init_cli( @@ -60,7 +103,6 @@ def test_init_cli(self, generate_project_patch, sd_mock): app_template=self.app_template, no_input=self.no_input, extra_context=None, - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -79,9 +121,9 @@ def test_init_cli(self, generate_project_patch, sd_mock): self.extra_context_as_json, ) - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") + @patch("samcli.lib.utils.git_repo.GitRepo.clone") @patch("samcli.commands.init.init_generator.generate_project") - def test_init_image_cli(self, generate_project_patch, sd_mock): + def test_init_image_cli(self, generate_project_patch, 
git_repo_clone_mock): # GIVEN generate_project successfully created a project # WHEN a project name has been passed init_cli( @@ -98,7 +140,6 @@ def test_init_image_cli(self, generate_project_patch, sd_mock): app_template=None, no_input=self.no_input, extra_context=None, - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -117,9 +158,9 @@ def test_init_image_cli(self, generate_project_patch, sd_mock): {"runtime": "nodejs12.x", "project_name": "testing project"}, ) - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") + @patch("samcli.lib.utils.git_repo.GitRepo.clone") @patch("samcli.commands.init.init_generator.generate_project") - def test_init_image_java_cli(self, generate_project_patch, sd_mock): + def test_init_image_java_cli(self, generate_project_patch, git_repo_clone_mock): # GIVEN generate_project successfully created a project # WHEN a project name has been passed init_cli( @@ -136,7 +177,6 @@ def test_init_image_java_cli(self, generate_project_patch, sd_mock): app_template=None, no_input=self.no_input, extra_context=None, - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -155,8 +195,8 @@ def test_init_image_java_cli(self, generate_project_patch, sd_mock): {"runtime": "java11", "project_name": "testing project"}, ) - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") - def test_init_fails_invalid_template(self, sd_mock): + @patch("samcli.lib.utils.git_repo.GitRepo.clone") + def test_init_fails_invalid_template(self, git_repo_clone_mock): # WHEN an unknown app template is passed in # THEN an exception should be raised with self.assertRaises(UserException): @@ -174,13 +214,12 @@ def test_init_fails_invalid_template(self, sd_mock): app_template="wrong-and-bad", no_input=self.no_input, extra_context=None, - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") - def 
test_init_fails_invalid_dep_mgr(self, sd_mock): + @patch("samcli.lib.utils.git_repo.GitRepo.clone") + def test_init_fails_invalid_dep_mgr(self, git_repo_clone_mock): # WHEN an unknown app template is passed in # THEN an exception should be raised with self.assertRaises(UserException): @@ -198,14 +237,13 @@ def test_init_fails_invalid_dep_mgr(self, sd_mock): app_template=self.app_template, no_input=self.no_input, extra_context=None, - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") + @patch("samcli.lib.utils.git_repo.GitRepo.clone") @patch("samcli.commands.init.init_generator.generate_project") - def test_init_cli_generate_project_fails(self, generate_project_patch, sd_mock): + def test_init_cli_generate_project_fails(self, generate_project_patch, git_repo_clone_mock): # GIVEN generate_project fails to create a project generate_project_patch.side_effect = GenerateProjectFailedError( project=self.name, provider_error="Something wrong happened" @@ -228,7 +266,6 @@ def test_init_cli_generate_project_fails(self, generate_project_patch, sd_mock): app_template=None, no_input=self.no_input, extra_context=None, - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -243,9 +280,9 @@ def test_init_cli_generate_project_fails(self, generate_project_patch, sd_mock): self.no_input, ) - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") + @patch("samcli.lib.utils.git_repo.GitRepo.clone") @patch("samcli.commands.init.init_generator.generate_project") - def test_init_cli_generate_project_image_fails(self, generate_project_patch, sd_mock): + def test_init_cli_generate_project_image_fails(self, generate_project_patch, git_repo_clone_mock): # GIVEN generate_project fails to create a project generate_project_patch.side_effect = GenerateProjectFailedError( project=self.name, provider_error="Something wrong happened" @@ -268,7 +305,6 @@ def 
test_init_cli_generate_project_image_fails(self, generate_project_patch, sd_ app_template=None, no_input=self.no_input, extra_context=None, - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -301,7 +337,6 @@ def test_init_cli_with_extra_context_parameter_not_passed(self, generate_project app_template=self.app_template, no_input=self.no_input, extra_context=None, - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -337,7 +372,6 @@ def test_init_cli_with_extra_context_parameter_passed(self, generate_project_pat app_template=self.app_template, no_input=self.no_input, extra_context='{"schema_name":"events", "schema_type":"aws"}', - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -373,7 +407,6 @@ def test_init_cli_with_extra_context_not_overriding_default_parameter(self, gene app_template=self.app_template, no_input=self.no_input, extra_context='{"project_name": "my_project", "runtime": "java8", "schema_name":"events", "schema_type": "aws"}', - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -409,7 +442,6 @@ def test_init_cli_with_extra_context_input_as_wrong_json_raises_exception(self): app_template=self.app_template, no_input=self.no_input, extra_context='{"project_name", "my_project", "runtime": "java8", "schema_name":"events", "schema_type": "aws"}', - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -432,7 +464,6 @@ def test_init_cli_must_set_default_context_when_location_is_provided(self, gener app_template=None, no_input=None, extra_context='{"schema_name":"events", "schema_type": "aws"}', - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -468,7 +499,6 @@ def test_init_cli_must_only_set_passed_project_name_when_location_is_provided(se app_template=None, no_input=None, extra_context='{"schema_name":"events", "schema_type": "aws"}', - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -504,7 +534,6 
@@ def test_init_cli_must_only_set_passed_runtime_when_location_is_provided(self, g app_template=None, no_input=None, extra_context='{"schema_name":"events", "schema_type": "aws"}', - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -542,7 +571,6 @@ def test_init_cli_with_extra_context_parameter_passed_as_escaped(self, generate_ # fmt: off extra_context='{\"schema_name\":\"events\", \"schema_type\":\"aws\"}', # fmt: on - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -691,7 +719,7 @@ def test_init_cli_int_with_image_app_template( # 1: Project type: SAM # 1: AWS Quick Start Templates # 2: Package type - Image - # 13: Java8 base image + # 14: Java8 base image # 1: dependency manager maven # test-project: response to name @@ -699,7 +727,7 @@ def test_init_cli_int_with_image_app_template( 1 1 2 -13 +14 1 test-project """ @@ -1158,7 +1186,6 @@ def test_init_passes_dynamic_event_bridge_template(self, generate_project_patch, app_template="eventBridge-schema-app", no_input=self.no_input, extra_context=None, - auto_clone=False, project_type=ProjectTypes.CFN, cdk_language=None, ) @@ -1176,9 +1203,9 @@ def test_init_passes_dynamic_event_bridge_template(self, generate_project_patch, self.extra_context_as_json, ) - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") + @patch("samcli.lib.utils.git_repo.GitRepo._ensure_clone_directory_exists") @patch("samcli.commands.init.init_generator.generate_project") - def test_init_cli_int_from_location(self, generate_project_patch, sd_mock): + def test_init_cli_int_from_location(self, generate_project_patch, cd_mock): # WHEN the user follows interactive init prompts # 1: Project type: SAM @@ -1208,9 +1235,9 @@ def test_init_cli_int_from_location(self, generate_project_patch, sd_mock): None, ) - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") + @patch("samcli.lib.utils.git_repo.GitRepo._ensure_clone_directory_exists") 
@patch("samcli.commands.init.init_generator.generate_project") - def test_init_cli_int_cdk_project_from_location(self, generate_project_patch, sd_mock): + def test_init_cli_int_cdk_project_from_location(self, generate_project_patch, cd_mock): # WHEN the user follows interactive init prompts # 2: Project type: CDK @@ -1240,9 +1267,9 @@ def test_init_cli_int_cdk_project_from_location(self, generate_project_patch, sd None, ) - @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check") + @patch("samcli.lib.utils.git_repo.GitRepo._ensure_clone_directory_exists") @patch("samcli.commands.init.init_generator.generate_project") - def test_init_cli_no_package_type(self, generate_project_patch, sd_mock): + def test_init_cli_no_package_type(self, generate_project_patch, cd_mock): # WHEN the user follows interactive init prompts # 1: Project type: SAM @@ -1279,3 +1306,266 @@ def test_init_cli_no_package_type(self, generate_project_patch, sd_mock): True, ANY, ) + + @patch.object(InitTemplates, "__init__", MockInitTemplates.__init__) + @patch("samcli.commands.init.init_templates.InitTemplates._init_options_from_manifest") + def test_init_cli_image_pool_with_base_image_having_multiple_managed_template_but_no_app_template_provided( + self, + init_options_from_manifest_mock, + ): + init_options_from_manifest_mock.return_value = [ + { + "directory": "python3.8-image/cookiecutter-aws-sam-hello-python-lambda-image", + "displayName": "Hello World Lambda Image Example", + "dependencyManager": "pip", + "appTemplate": "hello-world-lambda-image", + "packageType": "Image", + }, + { + "directory": "python3.8-image/cookiecutter-ml-apigw-pytorch", + "displayName": "PyTorch Machine Learning Inference API", + "dependencyManager": "pip", + "appTemplate": "ml-apigw-pytorch", + "packageType": "Image", + }, + ] + with self.assertRaises(UserException): + init_cli( + ctx=self.ctx, + no_interactive=self.no_interactive, + pt_explicit=self.pt_explicit, + package_type="Image", + 
+                base_image="amazon/python3.8-base",
+                dependency_manager="pip",
+                app_template=None,
+                name=self.name,
+                output_dir=self.output_dir,
+                location=None,
+                runtime=None,
+                no_input=self.no_input,
+                extra_context=self.extra_context,
+                project_type=ProjectTypes.CFN,
+                cdk_language=None,
+            )
+
+    @patch.object(InitTemplates, "__init__", MockInitTemplates.__init__)
+    @patch("samcli.commands.init.init_templates.InitTemplates._init_options_from_manifest")
+    def test_init_cli_image_pool_with_base_image_having_multiple_managed_template_and_provided_app_template_not_matching_any_managed_templates(
+        self,
+        init_options_from_manifest_mock,
+    ):
+        init_options_from_manifest_mock.return_value = [
+            {
+                "directory": "python3.8-image/cookiecutter-aws-sam-hello-python-lambda-image",
+                "displayName": "Hello World Lambda Image Example",
+                "dependencyManager": "pip",
+                "appTemplate": "hello-world-lambda-image",
+                "packageType": "Image",
+            },
+            {
+                "directory": "python3.8-image/cookiecutter-ml-apigw-pytorch",
+                "displayName": "PyTorch Machine Learning Inference API",
+                "dependencyManager": "pip",
+                "appTemplate": "ml-apigw-pytorch",
+                "packageType": "Image",
+            },
+        ]
+        with self.assertRaises(UserException):
+            init_cli(
+                ctx=self.ctx,
+                no_interactive=self.no_interactive,
+                pt_explicit=self.pt_explicit,
+                package_type="Image",
+                base_image="amazon/python3.8-base",
+                dependency_manager="pip",
+                app_template="Not-ml-apigw-pytorch",  # different value than appTemplates shown in the manifest above
+                name=self.name,
+                output_dir=self.output_dir,
+                location=None,
+                runtime=None,
+                no_input=self.no_input,
+                extra_context=self.extra_context,
+                project_type=ProjectTypes.CFN,
+                cdk_language=None,
+            )
+
+    @patch.object(InitTemplates, "__init__", MockInitTemplates.__init__)
+    @patch("samcli.commands.init.init_templates.InitTemplates._init_options_from_manifest")
+    @patch("samcli.commands.init.init_generator.generate_project")
+    def test_init_cli_image_pool_with_base_image_having_multiple_managed_template_with_matching_app_template_provided(
+        self,
+        generate_project_patch,
+        init_options_from_manifest_mock,
+    ):
+        init_options_from_manifest_mock.return_value = [
+            {
+                "directory": "python3.8-image/cookiecutter-aws-sam-hello-python-lambda-image",
+                "displayName": "Hello World Lambda Image Example",
+                "dependencyManager": "pip",
+                "appTemplate": "hello-world-lambda-image",
+                "packageType": "Image",
+            },
+            {
+                "directory": "python3.8-image/cookiecutter-ml-apigw-pytorch",
+                "displayName": "PyTorch Machine Learning Inference API",
+                "dependencyManager": "pip",
+                "appTemplate": "ml-apigw-pytorch",
+                "packageType": "Image",
+            },
+        ]
+        init_cli(
+            ctx=self.ctx,
+            no_interactive=True,
+            pt_explicit=True,
+            package_type="Image",
+            base_image="amazon/python3.8-base",
+            dependency_manager="pip",
+            app_template="ml-apigw-pytorch",  # same value as one appTemplate in the manifest above
+            name=self.name,
+            output_dir=None,
+            location=None,
+            runtime=None,
+            no_input=None,
+            extra_context=None,
+            project_type=ProjectTypes.CFN,
+            cdk_language=None,
+        )
+        generate_project_patch.assert_called_once_with(
+            ProjectTypes.CFN,
+            os.path.normpath("repository/python3.8-image/cookiecutter-ml-apigw-pytorch"),  # location
+            "Image",  # package_type
+            "python3.8",  # runtime
+            "pip",  # dependency_manager
+            self.output_dir,
+            self.name,
+            True,  # no_input
+            ANY,
+        )
+
+    @patch.object(InitTemplates, "__init__", MockInitTemplates.__init__)
+    @patch("samcli.commands.init.init_templates.InitTemplates._init_options_from_manifest")
+    @patch("samcli.commands.init.init_generator.generate_project")
+    def test_init_cli_image_pool_with_base_image_having_one_managed_template_does_not_need_app_template_argument(
+        self,
+        generate_project_patch,
+        init_options_from_manifest_mock,
+    ):
+        init_options_from_manifest_mock.return_value = [
+            {
+                "directory": "python3.8-image/cookiecutter-ml-apigw-pytorch",
+                "displayName": "PyTorch Machine Learning Inference API",
+                "dependencyManager": "pip",
+                "appTemplate": "ml-apigw-pytorch",
+                "packageType": "Image",
+            },
+        ]
+        init_cli(
+            ctx=self.ctx,
+            no_interactive=True,
+            pt_explicit=True,
+            package_type="Image",
+            base_image="amazon/python3.8-base",
+            dependency_manager="pip",
+            app_template=None,
+            name=self.name,
+            output_dir=None,
+            location=None,
+            runtime=None,
+            no_input=None,
+            extra_context=None,
+            project_type=ProjectTypes.CFN,
+            cdk_language=None,
+        )
+        generate_project_patch.assert_called_once_with(
+            ProjectTypes.CFN,
+            os.path.normpath("repository/python3.8-image/cookiecutter-ml-apigw-pytorch"),  # location
+            "Image",  # package_type
+            "python3.8",  # runtime
+            "pip",  # dependency_manager
+            self.output_dir,
+            self.name,
+            True,  # no_input
+            ANY,
+        )
+
+    @patch.object(InitTemplates, "__init__", MockInitTemplates.__init__)
+    @patch("samcli.commands.init.init_templates.InitTemplates._init_options_from_manifest")
+    @patch("samcli.commands.init.init_generator.generate_project")
+    def test_init_cli_image_pool_with_base_image_having_one_managed_template_with_provided_app_template_matching_the_managed_template(
+        self,
+        generate_project_patch,
+        init_options_from_manifest_mock,
+    ):
+        init_options_from_manifest_mock.return_value = [
+            {
+                "directory": "python3.8-image/cookiecutter-ml-apigw-pytorch",
+                "displayName": "PyTorch Machine Learning Inference API",
+                "dependencyManager": "pip",
+                "appTemplate": "ml-apigw-pytorch",
+                "packageType": "Image",
+            },
+        ]
+        init_cli(
+            ctx=self.ctx,
+            no_interactive=True,
+            pt_explicit=True,
+            package_type="Image",
+            base_image="amazon/python3.8-base",
+            dependency_manager="pip",
+            app_template="ml-apigw-pytorch",  # same value as appTemplate indicated in the manifest above
+            name=self.name,
+            output_dir=None,
+            location=None,
+            runtime=None,
+            no_input=None,
+            extra_context=None,
+            project_type=ProjectTypes.CFN,
+            cdk_language=None,
+        )
+        generate_project_patch.assert_called_once_with(
+            ProjectTypes.CFN,
+            os.path.normpath("repository/python3.8-image/cookiecutter-ml-apigw-pytorch"),  # location
+            "Image",  # package_type
+            "python3.8",  # runtime
+            "pip",  # dependency_manager
+            self.output_dir,
+            self.name,
+            True,  # no_input
+            ANY,
+        )
+
+    @patch.object(InitTemplates, "__init__", MockInitTemplates.__init__)
+    @patch("samcli.commands.init.init_templates.InitTemplates._init_options_from_manifest")
+    @patch("samcli.commands.init.init_generator.generate_project")
+    def test_init_cli_image_pool_with_base_image_having_one_managed_template_with_provided_app_template_not_matching_the_managed_template(
+        self,
+        generate_project_patch,
+        init_options_from_manifest_mock,
+    ):
+        init_options_from_manifest_mock.return_value = [
+            {
+                "directory": "python3.8-image/cookiecutter-ml-apigw-pytorch",
+                "displayName": "PyTorch Machine Learning Inference API",
+                "dependencyManager": "pip",
+                "appTemplate": "ml-apigw-pytorch",
+                "packageType": "Image",
+            },
+        ]
+        with (self.assertRaises(UserException)):
+            init_cli(
+                ctx=self.ctx,
+                no_interactive=True,
+                pt_explicit=True,
+                package_type="Image",
+                base_image="amazon/python3.8-base",
+                dependency_manager="pip",
+                app_template="NOT-ml-apigw-pytorch",  # different value than appTemplate shown in the manifest above
+                name=self.name,
+                output_dir=None,
+                location=None,
+                runtime=None,
+                no_input=None,
+                extra_context=None,
+                project_type=ProjectTypes.CFN,
+                cdk_language=None,
+            )
diff --git a/tests/unit/commands/init/test_templates.py b/tests/unit/commands/init/test_templates.py
index 869fe0b907..6d7d967a26 100644
--- a/tests/unit/commands/init/test_templates.py
+++ b/tests/unit/commands/init/test_templates.py
@@ -1,25 +1,22 @@
 import json
 import subprocess
-import click
-
-from unittest.mock import mock_open, patch, PropertyMock, MagicMock
+from pathlib import Path
 from re import search
 from unittest import TestCase
 
 from samcli.lib.iac.interface import ProjectTypes
-from samcli.lib.utils.packagetype import IMAGE, ZIP
-
-from pathlib import Path
+from unittest.mock import mock_open, patch, PropertyMock, MagicMock
 
 from samcli.commands.init.init_templates import InitTemplates
+from samcli.lib.utils.packagetype import IMAGE, ZIP
 
 
 class TestTemplates(TestCase):
-    @patch("subprocess.check_output")
-    @patch("samcli.commands.init.init_templates.InitTemplates._git_executable")
-    @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check")
+    @patch("samcli.lib.utils.git_repo.check_output")
+    @patch("samcli.lib.utils.git_repo.GitRepo._git_executable")
+    @patch("samcli.lib.utils.git_repo.GitRepo._ensure_clone_directory_exists")
     @patch("shutil.copytree")
-    def test_location_from_app_template_zip(self, subprocess_mock, git_exec_mock, sd_mock, copy_mock):
+    def test_location_from_app_template_zip(self, subprocess_mock, git_exec_mock, cd_mock, copy_mock):
         it = InitTemplates(True)
         manifest = {
@@ -37,18 +34,18 @@ def test_location_from_app_template_zip(self, subprocess_mock, git_exec_mock, sd
         m = mock_open(read_data=manifest_json)
         with patch("samcli.cli.global_config.GlobalConfig.config_dir", new_callable=PropertyMock) as mock_cfg:
-            mock_cfg.return_value = "/tmp/test-sam"
+            mock_cfg.return_value = Path("/tmp/test-sam")
             with patch("samcli.commands.init.init_templates.open", m):
                 location = it.location_from_app_template(
                     ProjectTypes.CFN, None, ZIP, "ruby2.5", None, "bundler", "hello-world"
                 )
                 self.assertTrue(search("mock-ruby-template", location))
 
-    @patch("subprocess.check_output")
-    @patch("samcli.commands.init.init_templates.InitTemplates._git_executable")
-    @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check")
+    @patch("samcli.lib.utils.git_repo.check_output")
+    @patch("samcli.lib.utils.git_repo.GitRepo._git_executable")
+    @patch("samcli.lib.utils.git_repo.GitRepo._ensure_clone_directory_exists")
     @patch("shutil.copytree")
-    def test_location_from_app_template_image(self, subprocess_mock, git_exec_mock, sd_mock, copy_mock):
+    def test_location_from_app_template_image(self, subprocess_mock, git_exec_mock, cd_mock, copy_mock):
         it = InitTemplates(True)
         manifest = {
@@ -66,63 +63,37 @@ def test_location_from_app_template_image(self, subprocess_mock, git_exec_mock,
         m = mock_open(read_data=manifest_json)
         with patch("samcli.cli.global_config.GlobalConfig.config_dir", new_callable=PropertyMock) as mock_cfg:
-            mock_cfg.return_value = "/tmp/test-sam"
+            mock_cfg.return_value = Path("/tmp/test-sam")
             with patch("samcli.commands.init.init_templates.open", m):
                 location = it.location_from_app_template(
                     ProjectTypes.CFN, None, IMAGE, None, "ruby2.5-image", "bundler", "hello-world-lambda-image"
                 )
                 self.assertTrue(search("mock-ruby-image-template", location))
 
-    @patch("samcli.commands.init.init_templates.InitTemplates._git_executable")
+    @patch("samcli.lib.utils.git_repo.GitRepo._git_executable")
     @patch("click.prompt")
-    @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check")
-    def test_fallback_options(self, git_exec_mock, prompt_mock, sd_mock):
+    @patch("samcli.lib.utils.git_repo.GitRepo._ensure_clone_directory_exists")
+    def test_fallback_options(self, git_exec_mock, prompt_mock, cd_mock):
         prompt_mock.return_value = "1"
-        with patch("subprocess.check_output", new_callable=MagicMock) as mock_sub:
+        with patch("samcli.lib.utils.git_repo.check_output", new_callable=MagicMock) as mock_sub:
             with patch("samcli.cli.global_config.GlobalConfig.config_dir", new_callable=PropertyMock) as mock_cfg:
                 mock_sub.side_effect = OSError("Fail")
-                mock_cfg.return_value = "/tmp/test-sam"
+                mock_cfg.return_value = Path("/tmp/test-sam")
                 it = InitTemplates(True)
                 location, app_template = it.prompt_for_location(ProjectTypes.CFN, None, ZIP, "ruby2.5", None, "bundler")
                 self.assertTrue(search("cookiecutter-aws-sam-hello-ruby", location))
                 self.assertEqual("hello-world", app_template)
 
-    @patch("samcli.commands.init.init_templates.InitTemplates._git_executable")
+    @patch("samcli.lib.utils.git_repo.GitRepo._git_executable")
     @patch("click.prompt")
-    @patch("samcli.commands.init.init_templates.InitTemplates._shared_dir_check")
-    def test_fallback_process_error(self, git_exec_mock, prompt_mock, sd_mock):
+    @patch("samcli.lib.utils.git_repo.GitRepo._ensure_clone_directory_exists")
+    def test_fallback_process_error(self, git_exec_mock, prompt_mock, cd_mock):
         prompt_mock.return_value = "1"
-        with patch("subprocess.check_output", new_callable=MagicMock) as mock_sub:
+        with patch("samcli.lib.utils.git_repo.check_output", new_callable=MagicMock) as mock_sub:
            with patch("samcli.cli.global_config.GlobalConfig.config_dir", new_callable=PropertyMock) as mock_cfg:
                 mock_sub.side_effect = subprocess.CalledProcessError("fail", "fail", "not found".encode("utf-8"))
-                mock_cfg.return_value = "/tmp/test-sam"
+                mock_cfg.return_value = Path("/tmp/test-sam")
                 it = InitTemplates(True)
                 location, app_template = it.prompt_for_location(ProjectTypes.CFN, None, ZIP, "ruby2.5", None, "bundler")
                 self.assertTrue(search("cookiecutter-aws-sam-hello-ruby", location))
                 self.assertEqual("hello-world", app_template)
-
-    def test_git_executable_windows(self):
-        with patch("platform.system", new_callable=MagicMock) as mock_platform:
-            mock_platform.return_value = "Windows"
-            with patch("subprocess.Popen", new_callable=MagicMock) as mock_popen:
-                it = InitTemplates(True)
-                executable = it._git_executable()
-                self.assertEqual(executable, "git")
-
-    def test_git_executable_fails(self):
-        with patch("subprocess.Popen", new_callable=MagicMock) as mock_popen:
-            mock_popen.side_effect = OSError("fail")
-            it = InitTemplates(True)
-            with self.assertRaises(OSError):
-                executable = it._git_executable()
-
-    def test_shared_dir_check(self):
-        it = InitTemplates(True, False)
-        shared_dir_mock = MagicMock()
-        self.assertTrue(it._shared_dir_check(shared_dir_mock))
-
-    def test_shared_dir_failure(self):
-        it = InitTemplates(True, False)
-        shared_dir_mock = MagicMock()
-        shared_dir_mock.mkdir.side_effect = OSError("fail")
-        self.assertFalse(it._shared_dir_check(shared_dir_mock))
diff --git a/tests/unit/commands/local/cli_common/test_invoke_context.py b/tests/unit/commands/local/cli_common/test_invoke_context.py
index 770aed4b18..5e7591823d 100644
--- a/tests/unit/commands/local/cli_common/test_invoke_context.py
+++ b/tests/unit/commands/local/cli_common/test_invoke_context.py
@@ -571,6 +571,8 @@ def test_must_create_runner(
             env_vars_values=ANY,
             aws_profile="profile",
             aws_region="region",
+            container_host=None,
+            container_host_interface=None,
         )
 
         result = self.context.local_lambda_runner
@@ -644,6 +646,87 @@ def test_must_create_runner_using_warm_containers(
             env_vars_values=ANY,
             aws_profile="profile",
             aws_region="region",
+            container_host=None,
+            container_host_interface=None,
+        )
+
+        result = self.context.local_lambda_runner
+        self.assertEqual(result, runner_mock)
+        # assert that lambda runner is created only one time, and the cached version used in the second call
+        self.assertEqual(LocalLambdaMock.call_count, 1)
+
+    @patch("samcli.commands.local.cli_common.invoke_context.LambdaImage")
+    @patch("samcli.commands.local.cli_common.invoke_context.LayerDownloader")
+    @patch("samcli.commands.local.cli_common.invoke_context.LambdaRuntime")
+    @patch("samcli.commands.local.cli_common.invoke_context.LocalLambdaRunner")
+    @patch("samcli.commands.local.cli_common.invoke_context.SamFunctionProvider")
+    def test_must_create_runner_with_container_host_option(
+        self, SamFunctionProviderMock, LocalLambdaMock, LambdaRuntimeMock, download_layers_mock, lambda_image_patch
+    ):
+        runtime_mock = Mock()
+        LambdaRuntimeMock.return_value = runtime_mock
+
+        runner_mock = Mock()
+        LocalLambdaMock.return_value = runner_mock
+
+        download_mock = Mock()
+        download_layers_mock.return_value = download_mock
+
+        image_mock = Mock()
+        lambda_image_patch.return_value = image_mock
+
+        iac_mock = Mock()
+        project_mock = Mock()
+
+        cwd = "cwd"
+        self.context = InvokeContext(
+            template_file="template_file",
+            function_identifier="id",
+            env_vars_file="env_vars_file",
+            docker_volume_basedir="volumedir",
+            docker_network="network",
+            log_file="log_file",
+            skip_pull_image=True,
+            force_image_build=True,
+            debug_ports=[1111],
+            debugger_path="path-to-debugger",
+            debug_args="args",
+            aws_profile="profile",
+            aws_region="region",
+            container_host="abcdef",
+            container_host_interface="192.168.100.101",
+            iac=iac_mock,
+            project=project_mock,
+        )
+        self.context.get_cwd = Mock()
+        self.context.get_cwd.return_value = cwd
+
+        self.context._get_stacks = Mock()
+        self.context._get_stacks.return_value = [Mock()]
+        self.context._get_env_vars_value = Mock()
+        self.context._setup_log_file = Mock()
+        self.context._get_debug_context = Mock(return_value=None)
+
+        container_manager_mock = Mock()
+        container_manager_mock.is_docker_reachable = PropertyMock(return_value=True)
+        self.context._get_container_manager = Mock(return_value=container_manager_mock)
+
+        with self.context:
+            result = self.context.local_lambda_runner
+            self.assertEqual(result, runner_mock)
+
+            LambdaRuntimeMock.assert_called_with(container_manager_mock, image_mock)
+            lambda_image_patch.assert_called_once_with(download_mock, True, True)
+            LocalLambdaMock.assert_called_with(
+                local_runtime=runtime_mock,
+                function_provider=ANY,
+                cwd=cwd,
+                debug_context=None,
+                env_vars_values=ANY,
+                aws_profile="profile",
+                aws_region="region",
+                container_host="abcdef",
+                container_host_interface="192.168.100.101",
            )
 
         result = self.context.local_lambda_runner
diff --git a/tests/unit/commands/local/invoke/test_cli.py b/tests/unit/commands/local/invoke/test_cli.py
index e450d68033..0352b88fe9 100644
--- a/tests/unit/commands/local/invoke/test_cli.py
+++ b/tests/unit/commands/local/invoke/test_cli.py
@@ -41,6 +41,8 @@ def setUp(self):
         self.shutdown = False
         self.region_name = "region"
         self.profile = "profile"
+        self.container_host = "localhost"
+        self.container_host_interface = "127.0.0.1"
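Aside: the mock parameter renames in these hunks (e.g. `sd_mock` → `cd_mock` alongside the `_shared_dir_check` → `_ensure_clone_directory_exists` patch target) work because stacked `@patch` decorators inject their mocks bottom-up: the decorator nearest the test method supplies the first mock argument after `self`. A minimal stdlib-only sketch of that convention, patching `json` purely for illustration rather than any SAM CLI module:

```python
from unittest import TestCase
from unittest.mock import patch
import json


class TestPatchOrder(TestCase):
    @patch("json.dumps")  # outermost decorator -> SECOND mock argument
    @patch("json.loads")  # innermost decorator -> FIRST mock argument
    def test_mocks_arrive_bottom_up(self, loads_mock, dumps_mock):
        loads_mock.return_value = {"mocked": True}
        dumps_mock.return_value = "{}"
        # json.loads/json.dumps are replaced by the mocks for the duration of the test
        self.assertEqual(json.loads("anything"), {"mocked": True})
        self.assertEqual(json.dumps({"x": 1}), "{}")
```

Keeping the parameter names aligned with the patch targets, as this PR does, is what prevents silently asserting against the wrong mock.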
 
     @patch("samcli.commands.local.cli_common.invoke_context.InvokeContext")
     @patch("samcli.commands.local.invoke.cli._get_event")
@@ -77,6 +79,8 @@ def test_cli_must_setup_context_and_invoke(self, get_event_mock, InvokeContextMo
             layer_cache_basedir=self.layer_cache_basedir,
             force_image_build=self.force_image_build,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             project_type="CFN",
             iac=iac_mock,
             project=project_mock,
@@ -100,6 +104,8 @@ def test_cli_must_setup_context_and_invoke(self, get_event_mock, InvokeContextMo
             shutdown=self.shutdown,
             aws_region=self.region_name,
             aws_profile=self.profile,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             iac=iac_mock,
             project=project_mock,
         )
@@ -142,6 +148,8 @@ def test_cli_must_invoke_with_no_event(self, get_event_mock, InvokeContextMock):
             layer_cache_basedir=self.layer_cache_basedir,
             force_image_build=self.force_image_build,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             project_type="CFN",
             iac=iac_mock,
             project=project_mock,
@@ -165,6 +173,8 @@ def test_cli_must_invoke_with_no_event(self, get_event_mock, InvokeContextMock):
             shutdown=self.shutdown,
             aws_region=self.region_name,
             aws_profile=self.profile,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             iac=iac_mock,
             project=project_mock,
         )
@@ -219,6 +229,8 @@ def test_must_raise_user_exception_on_function_not_found(
             layer_cache_basedir=self.layer_cache_basedir,
             force_image_build=self.force_image_build,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             project_type="CFN",
             iac=Mock(),
             project=Mock(),
@@ -274,6 +286,8 @@ def test_must_raise_user_exception_on_function_local_invoke_image_not_found_for_
             layer_cache_basedir=self.layer_cache_basedir,
             force_image_build=self.force_image_build,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             project_type="CFN",
             iac=Mock(),
             project=Mock(),
@@ -327,6 +341,8 @@ def test_must_raise_user_exception_on_invalid_sam_template(
             layer_cache_basedir=self.layer_cache_basedir,
             force_image_build=self.force_image_build,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             project_type="CFN",
             iac=Mock(),
             project=Mock(),
@@ -368,6 +384,8 @@ def test_must_raise_user_exception_on_invalid_env_vars(self, get_event_mock, Inv
             layer_cache_basedir=self.layer_cache_basedir,
             force_image_build=self.force_image_build,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             project_type="CFN",
             iac=Mock(),
             project=Mock(),
@@ -423,6 +441,8 @@ def test_must_raise_user_exception_on_function_no_free_ports(
             layer_cache_basedir=self.layer_cache_basedir,
             force_image_build=self.force_image_build,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             project_type="CFN",
             iac=Mock(),
             project=Mock(),
diff --git a/tests/unit/commands/local/lib/test_local_lambda.py b/tests/unit/commands/local/lib/test_local_lambda.py
index 83486b72ae..c5d9a2aee7 100644
--- a/tests/unit/commands/local/lib/test_local_lambda.py
+++ b/tests/unit/commands/local/lib/test_local_lambda.py
@@ -523,7 +523,13 @@ def test_must_work(self):
         self.local_lambda.invoke(name, event, stdout, stderr)
 
         self.runtime_mock.invoke.assert_called_with(
-            invoke_config, event, debug_context=None, stdout=stdout, stderr=stderr
+            invoke_config,
+            event,
+            debug_context=None,
+            stdout=stdout,
+            stderr=stderr,
+            container_host=None,
+            container_host_interface=None,
         )
 
     def test_must_work_packagetype_ZIP(self):
@@ -541,7 +547,13 @@ def test_must_work_packagetype_ZIP(self):
         self.local_lambda.invoke(name, event, stdout, stderr)
 
         self.runtime_mock.invoke.assert_called_with(
-            invoke_config, event, debug_context=None, stdout=stdout, stderr=stderr
+            invoke_config,
+            event,
+            debug_context=None,
+            stdout=stdout,
+            stderr=stderr,
+            container_host=None,
+            container_host_interface=None,
         )
 
     def test_must_raise_if_no_privilege(self):
@@ -614,7 +626,13 @@ def test_works_if_imageuri_and_Image_packagetype(self):
         self.local_lambda.get_invoke_config.return_value = invoke_config
         self.local_lambda.invoke(name, event, stdout, stderr)
         self.runtime_mock.invoke.assert_called_with(
-            invoke_config, event, debug_context=None, stdout=stdout, stderr=stderr
+            invoke_config,
+            event,
+            debug_context=None,
+            stdout=stdout,
+            stderr=stderr,
+            container_host=None,
+            container_host_interface=None,
         )
 
     def test_must_raise_if_imageuri_not_found(self):
@@ -630,6 +648,53 @@ def test_must_raise_if_imageuri_not_found(self):
             self.local_lambda.invoke(name, event, stdout, stderr)
 
 
+class TestLocalLambda_invoke_with_container_host_option(TestCase):
+    def setUp(self):
+        self.runtime_mock = Mock()
+        self.function_provider_mock = Mock()
+        self.cwd = "/my/current/working/directory"
+        self.debug_context = None
+        self.aws_profile = "myprofile"
+        self.aws_region = "region"
+        self.env_vars_values = {}
+        self.container_host = "localhost"
+        self.container_host_interface = "127.0.0.1"
+
+        self.local_lambda = LocalLambdaRunner(
+            self.runtime_mock,
+            self.function_provider_mock,
+            self.cwd,
+            env_vars_values=self.env_vars_values,
+            debug_context=self.debug_context,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
+        )
+
+    def test_must_work(self):
+        name = "name"
+        event = "event"
+        stdout = "stdout"
+        stderr = "stderr"
+        function = Mock(functionname="name")
+        invoke_config = "config"
+
+        self.function_provider_mock.get_all.return_value = [function]
+        self.local_lambda.get_invoke_config = Mock()
+        self.local_lambda.get_invoke_config.return_value = invoke_config
+
+        self.local_lambda.invoke(name, event, stdout, stderr)
+
+        self.runtime_mock.invoke.assert_called_with(
+            invoke_config,
+            event,
+            debug_context=None,
+            stdout=stdout,
+            stderr=stderr,
+            container_host="localhost",
+            container_host_interface="127.0.0.1",
+        )
+
+
 class TestLocalLambda_is_debugging(TestCase):
     def setUp(self):
         self.runtime_mock = Mock()
@@ -652,7 +717,6 @@ def test_must_be_on(self):
         self.assertTrue(self.local_lambda.is_debugging())
 
     def test_must_be_off(self):
-
         self.local_lambda = LocalLambdaRunner(
             self.runtime_mock,
             self.function_provider_mock,
diff --git a/tests/unit/commands/local/lib/test_sam_function_provider.py b/tests/unit/commands/local/lib/test_sam_function_provider.py
index 89558c6227..3b3a75bc34 100644
--- a/tests/unit/commands/local/lib/test_sam_function_provider.py
+++ b/tests/unit/commands/local/lib/test_sam_function_provider.py
@@ -64,10 +64,6 @@ class TestSamFunctionProviderEndToEnd(TestCase):
                 "Handler": "index.handler",
             },
         },
-        "SamFunc4": {
-            "Type": "AWS::Serverless::Function",
-            "Properties": {"ImageUri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/myrepo", "PackageType": IMAGE},
-        },
         "SamFuncWithFunctionNameOverride": {
             "Type": "AWS::Serverless::Function",
             "Properties": {
@@ -77,6 +73,29 @@ class TestSamFunctionProviderEndToEnd(TestCase):
                 "Handler": "index.handler",
             },
         },
+        "SamFuncWithImage1": {
+            "Type": "AWS::Serverless::Function",
+            "Properties": {
+                "PackageType": IMAGE,
+            },
+            "Metadata": {"DockerTag": "tag", "DockerContext": "./image", "Dockerfile": "Dockerfile"},
+        },
+        "SamFuncWithImage2": {
+            "Type": "AWS::Serverless::Function",
+            "Properties": {
+                "ImageUri": "image:tag",
+                "PackageType": IMAGE,
+            },
+            "Metadata": {"DockerTag": "tag", "DockerContext": "./image", "Dockerfile": "Dockerfile"},
+        },
+        "SamFuncWithImage3": {
+            # ImageUri is unsupported ECR location
+            "Type": "AWS::Serverless::Function",
+            "Properties": {
+                "ImageUri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/myrepo:myimage",
+                "PackageType": IMAGE,
+            },
+        },
         "LambdaFunc1": {
             "Type": "AWS::Lambda::Function",
             "Properties": {
@@ -85,21 +104,37 @@ class TestSamFunctionProviderEndToEnd(TestCase):
                 "Handler": "index.handler",
             },
         },
-        "LambdaFuncWithInlineCode": {
+        "LambdaFuncWithImage1": {
             "Type": "AWS::Lambda::Function",
             "Properties": {
-                "Code": {"ZipFile": "testcode"},
-                "Runtime": "nodejs4.3",
-                "Handler": "index.handler",
+                "PackageType": IMAGE,
             },
+            "Metadata": {"DockerTag": "tag", "DockerContext": "./image", "Dockerfile": "Dockerfile"},
         },
-        "LambdaFunc2": {
+        "LambdaFuncWithImage2": {
+            "Type": "AWS::Lambda::Function",
+            "Properties": {
+                "Code": {"ImageUri": "image:tag"},
+                "PackageType": IMAGE,
+            },
+            "Metadata": {"DockerTag": "tag", "DockerContext": "./image", "Dockerfile": "Dockerfile"},
+        },
+        "LambdaFuncWithImage3": {
+            # ImageUri is unsupported ECR location
             "Type": "AWS::Lambda::Function",
             "Properties": {
                 "Code": {"ImageUri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/myrepo"},
                 "PackageType": IMAGE,
             },
         },
+        "LambdaFuncWithInlineCode": {
+            "Type": "AWS::Lambda::Function",
+            "Properties": {
+                "Code": {"ZipFile": "testcode"},
+                "Runtime": "nodejs4.3",
+                "Handler": "index.handler",
+            },
+        },
         "LambdaFuncWithLocalPath": {
             "Type": "AWS::Lambda::Function",
             "Properties": {"Code": "./some/path/to/code", "Runtime": "nodejs4.3", "Handler": "index.handler"},
@@ -251,11 +286,11 @@ def setUp(self):
         ("SamFunc2", None),  # codeuri is a s3 location, ignored
         ("SamFunc3", None),  # codeuri is a s3 location, ignored
         (
-            "SamFunc4",
+            "SamFuncWithImage1",
             Function(
-                function_id="SamFunc4",
-                name="SamFunc4",
-                functionname="SamFunc4",
+                function_id="SamFuncWithImage1",
+                name="SamFuncWithImage1",
+                functionname="SamFuncWithImage1",
                 runtime=None,
                 handler=None,
                 codeuri=".",
@@ -266,14 +301,47 @@ def setUp(self):
                 layers=[],
                 events=None,
                 inlinecode=None,
-                imageuri="123456789012.dkr.ecr.us-east-1.amazonaws.com/myrepo",
+                imageuri=None,
                 imageconfig=None,
                 packagetype=IMAGE,
-                metadata=None,
+                metadata={
+                    "DockerTag": "tag",
+                    "DockerContext": os.path.join("image"),
+                    "Dockerfile": "Dockerfile",
+                },
                 codesign_config_arn=None,
                 stack_path="",
             ),
         ),
+        (
+            "SamFuncWithImage2",
+            Function(
+                function_id="SamFuncWithImage2",
+                name="SamFuncWithImage2",
+                functionname="SamFuncWithImage2",
+                runtime=None,
+                handler=None,
+                codeuri=".",
+                memory=None,
+                timeout=None,
+                environment=None,
+                rolearn=None,
+                layers=[],
+                events=None,
+                inlinecode=None,
+                imageuri="image:tag",
+                imageconfig=None,
+                packagetype=IMAGE,
+                metadata={
+                    "DockerTag": "tag",
+                    "DockerContext": os.path.join("image"),
+                    "Dockerfile": "Dockerfile",
+                },
+                codesign_config_arn=None,
+                stack_path="",
+            ),
+        ),
+        ("SamFuncWithImage3", None),  # imageuri is ecr location, ignored
         (
             "SamFuncWithFunctionNameOverride-x",
             Function(
@@ -300,35 +368,39 @@ def setUp(self):
         ),
         ("LambdaFunc1", None),  # codeuri is a s3 location, ignored
         (
-            "LambdaFuncWithInlineCode",
+            "LambdaFuncWithImage1",
             Function(
-                function_id="LambdaFuncWithInlineCode",
-                name="LambdaFuncWithInlineCode",
-                functionname="LambdaFuncWithInlineCode",
-                runtime="nodejs4.3",
-                handler="index.handler",
-                codeuri=None,
+                function_id="LambdaFuncWithImage1",
+                name="LambdaFuncWithImage1",
+                functionname="LambdaFuncWithImage1",
+                runtime=None,
+                handler=None,
+                codeuri=".",
                 memory=None,
                 timeout=None,
                 environment=None,
                 rolearn=None,
                 layers=[],
                 events=None,
-                metadata=None,
-                inlinecode="testcode",
-                codesign_config_arn=None,
+                metadata={
+                    "DockerTag": "tag",
+                    "DockerContext": os.path.join("image"),
+                    "Dockerfile": "Dockerfile",
+                },
+                inlinecode=None,
                 imageuri=None,
                 imageconfig=None,
-                packagetype=ZIP,
+                packagetype=IMAGE,
+                codesign_config_arn=None,
                 stack_path="",
             ),
         ),
         (
-            "LambdaFunc2",
+            "LambdaFuncWithImage2",
             Function(
-                function_id="LambdaFunc2",
-                name="LambdaFunc2",
-                functionname="LambdaFunc2",
+                function_id="LambdaFuncWithImage2",
+                name="LambdaFuncWithImage2",
+                functionname="LambdaFuncWithImage2",
                 runtime=None,
                 handler=None,
                 codeuri=".",
@@ -338,15 +410,44 @@ def setUp(self):
                 rolearn=None,
                 layers=[],
                 events=None,
-                metadata=None,
+                metadata={
+                    "DockerTag": "tag",
+                    "DockerContext": os.path.join("image"),
+                    "Dockerfile": "Dockerfile",
+                },
                 inlinecode=None,
-                imageuri="123456789012.dkr.ecr.us-east-1.amazonaws.com/myrepo",
+                imageuri="image:tag",
                 imageconfig=None,
                 packagetype=IMAGE,
                 codesign_config_arn=None,
                 stack_path="",
             ),
         ),
+        ("LambdaFuncWithImage3", None),  # imageuri is a ecr location, ignored
+        (
+            "LambdaFuncWithInlineCode",
+            Function(
+                function_id="LambdaFuncWithInlineCode",
+                name="LambdaFuncWithInlineCode",
+                functionname="LambdaFuncWithInlineCode",
+                runtime="nodejs4.3",
+                handler="index.handler",
+                codeuri=None,
+                memory=None,
+                timeout=None,
+                environment=None,
+                rolearn=None,
+                layers=[],
+                events=None,
+                metadata=None,
+                inlinecode="testcode",
+                codesign_config_arn=None,
+                imageuri=None,
+                imageconfig=None,
+                packagetype=ZIP,
+                stack_path="",
+            ),
+        ),
         (
             "LambdaFuncWithLocalPath",
             Function(
@@ -507,11 +608,13 @@ def test_get_all_must_return_all_functions(self):
         result = {posixpath.join(f.stack_path, f.name) for f in self.provider.get_all()}
         expected = {
             "SamFunctions",
+            "SamFuncWithImage1",
+            "SamFuncWithImage2",
             "SamFuncWithInlineCode",
-            "SamFunc4",
             "SamFuncWithFunctionNameOverride",
+            "LambdaFuncWithImage1",
+            "LambdaFuncWithImage2",
             "LambdaFuncWithInlineCode",
-            "LambdaFunc2",
             "LambdaFuncWithLocalPath",
             "LambdaFuncWithFunctionNameOverride",
             "LambdaFuncWithCodeSignConfig",
diff --git a/tests/unit/commands/local/start_api/test_cli.py b/tests/unit/commands/local/start_api/test_cli.py
index 7df8ccb583..b8ec24ed2d 100644
--- a/tests/unit/commands/local/start_api/test_cli.py
+++ b/tests/unit/commands/local/start_api/test_cli.py
@@ -47,6 +47,9 @@ def setUp(self):
         self.port = 123
         self.static_dir = "staticdir"
 
+        self.container_host = "localhost"
+        self.container_host_interface = "127.0.0.1"
+
         self.iac = Mock()
         self.project = Mock()
 
@@ -85,6 +88,8 @@ def test_cli_must_setup_context_and_start_service(self, local_api_service_mock,
             warm_container_initialization_mode=self.warm_containers,
             debug_function=self.debug_function,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             iac=self.iac,
             project=self.project,
         )
@@ -193,6 +198,8 @@ def call_cli(self):
             warm_containers=self.warm_containers,
             debug_function=self.debug_function,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             project_type="CFN",
             iac=self.iac,
             project=self.project,
diff --git a/tests/unit/commands/local/start_lambda/test_cli.py b/tests/unit/commands/local/start_lambda/test_cli.py
index 7a6ce366fc..ee45ff67d3 100644
--- a/tests/unit/commands/local/start_lambda/test_cli.py
+++ b/tests/unit/commands/local/start_lambda/test_cli.py
@@ -40,6 +40,9 @@ def setUp(self):
         self.host = "host"
         self.port = 123
 
+        self.container_host = "localhost"
+        self.container_host_interface = "127.0.0.1"
+
         self.iac = Mock()
         self.project = Mock()
 
@@ -77,6 +80,8 @@ def test_cli_must_setup_context_and_start_service(self, local_lambda_service_moc
             warm_container_initialization_mode=self.warm_containers,
             debug_function=self.debug_function,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             iac=self.iac,
             project=self.project,
         )
@@ -162,6 +167,8 @@ def call_cli(self):
             warm_containers=self.warm_containers,
             debug_function=self.debug_function,
             shutdown=self.shutdown,
+            container_host=self.container_host,
+            container_host_interface=self.container_host_interface,
             iac=self.iac,
             project=self.project,
         )
diff --git a/tests/unit/commands/logs/test_command.py b/tests/unit/commands/logs/test_command.py
index b895428f19..3a48600ae0 100644
--- a/tests/unit/commands/logs/test_command.py
+++ b/tests/unit/commands/logs/test_command.py
@@ -1,5 +1,5 @@
 from unittest import TestCase
-from unittest.mock import Mock, patch, call
+from unittest.mock import Mock, patch
 
 from samcli.commands.logs.command import do_cli
 
@@ -13,67 +13,46 @@ def setUp(self):
         self.start_time = "start"
         self.end_time = "end"
 
-    @patch("samcli.commands.logs.command.click")
     @patch("samcli.commands.logs.logs_context.LogsCommandContext")
-    def test_without_tail(self, LogsCommandContextMock, click_mock):
+    def test_without_tail(self, logs_command_context_mock):
         tailing = False
-        events_iterable = [1, 2, 3]
-        formatted_events = [4, 5, 6]
 
         context_mock = Mock()
-        LogsCommandContextMock.return_value.__enter__.return_value = context_mock
-
-        context_mock.fetcher.fetch.return_value = events_iterable
-        context_mock.formatter.do_format.return_value = formatted_events
+        logs_command_context_mock.return_value.__enter__.return_value = context_mock
 
         do_cli(self.function_name, self.stack_name, self.filter_pattern, tailing, self.start_time, self.end_time)
 
-        LogsCommandContextMock.assert_called_with(
+        logs_command_context_mock.assert_called_with(
             self.function_name,
             stack_name=self.stack_name,
             filter_pattern=self.filter_pattern,
             start_time=self.start_time,
             end_time=self.end_time,
-            output_file=None,
         )
 
-        context_mock.fetcher.fetch.assert_called_with(
-            context_mock.log_group_name,
+        context_mock.fetcher.load_time_period.assert_called_with(
             filter_pattern=context_mock.filter_pattern,
-            start=context_mock.start_time,
-            end=context_mock.end_time,
+            start_time=context_mock.start_time,
+            end_time=context_mock.end_time,
         )
 
-        context_mock.formatter.do_format.assert_called_with(events_iterable)
-        click_mock.echo.assert_has_calls([call(v, nl=False) for v in formatted_events])
-
-    @patch("samcli.commands.logs.command.click")
     @patch("samcli.commands.logs.logs_context.LogsCommandContext")
-    def test_with_tailing(self, LogsCommandContextMock, click_mock):
+    def test_with_tailing(self, logs_command_context_mock):
         tailing = True
-        events_iterable = [1, 2, 3]
-        formatted_events = [4, 5, 6]
 
         context_mock = Mock()
-        LogsCommandContextMock.return_value.__enter__.return_value = context_mock
-
-        context_mock.fetcher.tail.return_value = events_iterable
-        context_mock.formatter.do_format.return_value = formatted_events
+        logs_command_context_mock.return_value.__enter__.return_value = context_mock
 
         do_cli(self.function_name, self.stack_name, self.filter_pattern, tailing, self.start_time, self.end_time)
 
-        LogsCommandContextMock.assert_called_with(
+        logs_command_context_mock.assert_called_with(
             self.function_name,
             stack_name=self.stack_name,
             filter_pattern=self.filter_pattern,
             start_time=self.start_time,
             end_time=self.end_time,
-            output_file=None,
         )
 
         context_mock.fetcher.tail.assert_called_with(
-            context_mock.log_group_name, filter_pattern=context_mock.filter_pattern, start=context_mock.start_time
+            filter_pattern=context_mock.filter_pattern, start_time=context_mock.start_time
         )
-
-        context_mock.formatter.do_format.assert_called_with(events_iterable)
-        click_mock.echo.assert_has_calls([call(v, nl=False) for v in formatted_events])
diff --git a/tests/unit/commands/logs/test_console_consumers.py b/tests/unit/commands/logs/test_console_consumers.py
new file mode 100644
index 0000000000..ab824ca769
--- /dev/null
+++ b/tests/unit/commands/logs/test_console_consumers.py
@@ -0,0 +1,15 @@
+from unittest import TestCase
+from unittest.mock import patch, Mock
+
+from samcli.commands.logs.console_consumers import CWConsoleEventConsumer
+
+
+class TestCWConsoleEventConsumer(TestCase):
+    def setUp(self):
+        self.consumer = CWConsoleEventConsumer()
+
+    @patch("samcli.commands.logs.console_consumers.click")
+    def test_consume_with_event(self, patched_click):
+        event = Mock()
+        self.consumer.consume(event)
+
+        patched_click.echo.assert_called_with(event.message, nl=False)
diff --git a/tests/unit/commands/logs/test_logs_context.py b/tests/unit/commands/logs/test_logs_context.py
index fe37d4e1c7..abcd792b27 100644
--- a/tests/unit/commands/logs/test_logs_context.py
+++ b/tests/unit/commands/logs/test_logs_context.py
@@ -1,11 +1,11 @@
-import botocore.session
-from botocore.stub import Stubber
-
 from unittest import TestCase
 from unittest.mock import Mock, patch, ANY
 
-from samcli.commands.logs.logs_context import LogsCommandContext
+import botocore.session
+from botocore.stub import Stubber
+
 from samcli.commands.exceptions import UserException
+from samcli.commands.logs.logs_context import LogsCommandContext
 
 
 class TestLogsCommandContext(TestCase):
@@ -30,13 +30,6 @@ def test_basic_properties(self):
         self.assertEqual(self.context.filter_pattern, self.filter_pattern)
         self.assertIsNone(self.context.output_file_handle)  # before setting context handle will be null
 
-    @patch("samcli.commands.logs.logs_context.LogsFetcher")
-    def test_fetcher_property(self, LogsFetcherMock):
-        LogsFetcherMock.return_value = Mock()
-
-        self.assertEqual(self.context.fetcher, LogsFetcherMock.return_value)
-        LogsFetcherMock.assert_called_with(self.context._logs_client)
-
     @patch("samcli.commands.logs.logs_context.Colored")
     def test_colored_property(self, ColoredMock):
         ColoredMock.return_value = Mock()
@@ -61,15 +54,6 @@ def test_colored_property_without_output_file(self, ColoredMock):
         self.assertEqual(ctx.colored, ColoredMock.return_value)
         ColoredMock.assert_called_with(colorize=True)  # Must enable colors
 
-    @patch("samcli.commands.logs.logs_context.LogsFormatter")
-    @patch("samcli.commands.logs.logs_context.Colored")
-    def test_formatter_property(self, ColoredMock, LogsFormatterMock):
-        LogsFormatterMock.return_value = Mock()
-        ColoredMock.return_value = Mock()
-
-        self.assertEqual(self.context.formatter, LogsFormatterMock.return_value)
-        LogsFormatterMock.assert_called_with(ColoredMock.return_value, ANY)
-
     @patch("samcli.commands.logs.logs_context.LogGroupProvider")
     @patch.object(LogsCommandContext, "_get_resource_id_from_stack")
     def test_log_group_name_property_with_stack_name(self, get_resource_id_mock, LogGroupProviderMock):
diff --git a/tests/unit/commands/pipeline/__init__.py b/tests/unit/commands/pipeline/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/tests/unit/commands/pipeline/bootstrap/__init__.py b/tests/unit/commands/pipeline/bootstrap/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/tests/unit/commands/pipeline/bootstrap/test_cli.py b/tests/unit/commands/pipeline/bootstrap/test_cli.py
new file mode 100644
index 0000000000..649fbbdf32
--- /dev/null
+++ b/tests/unit/commands/pipeline/bootstrap/test_cli.py
@@ -0,0 +1,276 @@
+from unittest import TestCase
+from unittest.mock import patch, Mock
+
+import click
+from click.testing import CliRunner
+
+from samcli.commands.pipeline.bootstrap.cli import (
+    _load_saved_pipeline_user_arn,
+    _get_bootstrap_command_names,
+    PIPELINE_CONFIG_FILENAME,
+    PIPELINE_CONFIG_DIR,
+)
+from samcli.commands.pipeline.bootstrap.cli import cli as bootstrap_cmd
+from samcli.commands.pipeline.bootstrap.cli import do_cli as bootstrap_cli
+
+ANY_REGION = "ANY_REGION"
+ANY_PROFILE = "ANY_PROFILE"
+ANY_STAGE_NAME = "ANY_STAGE_NAME"
+ANY_PIPELINE_USER_ARN = "ANY_PIPELINE_USER_ARN"
+ANY_PIPELINE_EXECUTION_ROLE_ARN = "ANY_PIPELINE_EXECUTION_ROLE_ARN"
+ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN = "ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN"
+ANY_ARTIFACTS_BUCKET_ARN = "ANY_ARTIFACTS_BUCKET_ARN"
+ANY_IMAGE_REPOSITORY_ARN = "ANY_IMAGE_REPOSITORY_ARN"
+ANY_ARN = "ANY_ARN"
+ANY_CONFIG_FILE = "ANY_CONFIG_FILE"
+ANY_CONFIG_ENV = "ANY_CONFIG_ENV"
+PIPELINE_BOOTSTRAP_COMMAND_NAMES = ["pipeline", "bootstrap"]
+
+
+class TestCli(TestCase):
+    def setUp(self) -> None:
+        self.cli_context = {
+            "region": ANY_REGION,
+            "profile": ANY_PROFILE,
+            "interactive": True,
+            "stage_name": ANY_STAGE_NAME,
+            "pipeline_user_arn": ANY_PIPELINE_USER_ARN,
+            "pipeline_execution_role_arn": ANY_PIPELINE_EXECUTION_ROLE_ARN,
+            "cloudformation_execution_role_arn": ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN,
+            "artifacts_bucket_arn": ANY_ARTIFACTS_BUCKET_ARN,
"create_image_repository": True, + "image_repository_arn": ANY_IMAGE_REPOSITORY_ARN, + "confirm_changeset": True, + "config_file": ANY_CONFIG_FILE, + "config_env": ANY_CONFIG_ENV, + } + + @patch("samcli.commands.pipeline.bootstrap.cli.do_cli") + def test_bootstrap_command_default_argument_values(self, do_cli_mock): + runner: CliRunner = CliRunner() + runner.invoke(bootstrap_cmd) + # Test the defaults are as following: + # interactive -> True + # create_image_repository -> False + # confirm_changeset -> True + # region, profile, stage_name and all ARNs are None + do_cli_mock.assert_called_once_with( + region=None, + profile=None, + interactive=True, + stage_name=None, + pipeline_user_arn=None, + pipeline_execution_role_arn=None, + cloudformation_execution_role_arn=None, + artifacts_bucket_arn=None, + create_image_repository=False, + image_repository_arn=None, + confirm_changeset=True, + config_file="default", + config_env="samconfig.toml", + ) + + @patch("samcli.commands.pipeline.bootstrap.cli.do_cli") + def test_bootstrap_command_flag_arguments(self, do_cli_mock): + runner: CliRunner = CliRunner() + runner.invoke(bootstrap_cmd, args=["--interactive", "--no-create-image-repository", "--confirm-changeset"]) + args, kwargs = do_cli_mock.call_args + self.assertTrue(kwargs["interactive"]) + self.assertFalse(kwargs["create_image_repository"]) + self.assertTrue(kwargs["confirm_changeset"]) + + runner.invoke(bootstrap_cmd, args=["--no-interactive", "--create-image-repository", "--no-confirm-changeset"]) + args, kwargs = do_cli_mock.call_args + self.assertFalse(kwargs["interactive"]) + self.assertTrue(kwargs["create_image_repository"]) + self.assertFalse(kwargs["confirm_changeset"]) + + @patch("samcli.commands.pipeline.bootstrap.cli.do_cli") + def test_bootstrap_command_with_different_arguments_combination(self, do_cli_mock): + runner: CliRunner = CliRunner() + runner.invoke( + bootstrap_cmd, + args=["--no-interactive", "--stage", "environment1", "--bucket", "bucketARN"], + 
) + args, kwargs = do_cli_mock.call_args + self.assertFalse(kwargs["interactive"]) + self.assertEqual(kwargs["stage_name"], "environment1") + self.assertEqual(kwargs["artifacts_bucket_arn"], "bucketARN") + + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + @patch("samcli.commands.pipeline.bootstrap.cli._load_saved_pipeline_user_arn") + @patch("samcli.commands.pipeline.bootstrap.cli.Stage") + @patch("samcli.commands.pipeline.bootstrap.cli.GuidedContext") + def test_bootstrapping_normal_interactive_flow( + self, guided_context_mock, environment_mock, load_saved_pipeline_user_arn_mock, get_command_names_mock + ): + # setup + gc_instance = Mock() + guided_context_mock.return_value = gc_instance + environment_instance = Mock() + environment_mock.return_value = environment_instance + load_saved_pipeline_user_arn_mock.return_value = ANY_PIPELINE_USER_ARN + self.cli_context["interactive"] = True + self.cli_context["pipeline_user_arn"] = None + get_command_names_mock.return_value = PIPELINE_BOOTSTRAP_COMMAND_NAMES + + # trigger + bootstrap_cli(**self.cli_context) + + # verify + load_saved_pipeline_user_arn_mock.assert_called_once() + gc_instance.run.assert_called_once() + environment_instance.bootstrap.assert_called_once_with(confirm_changeset=True) + environment_instance.print_resources_summary.assert_called_once() + environment_instance.save_config_safe.assert_called_once_with( + config_dir=PIPELINE_CONFIG_DIR, + filename=PIPELINE_CONFIG_FILENAME, + cmd_names=PIPELINE_BOOTSTRAP_COMMAND_NAMES, + ) + + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + @patch("samcli.commands.pipeline.bootstrap.cli._load_saved_pipeline_user_arn") + @patch("samcli.commands.pipeline.bootstrap.cli.Stage") + @patch("samcli.commands.pipeline.bootstrap.cli.GuidedContext") + def test_bootstrap_will_not_try_loading_pipeline_user_if_already_provided( + self, guided_context_mock, environment_mock, load_saved_pipeline_user_arn_mock, 
get_command_names_mock + ): + bootstrap_cli(**self.cli_context) + load_saved_pipeline_user_arn_mock.assert_not_called() + + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + @patch("samcli.commands.pipeline.bootstrap.cli._load_saved_pipeline_user_arn") + @patch("samcli.commands.pipeline.bootstrap.cli.Stage") + @patch("samcli.commands.pipeline.bootstrap.cli.GuidedContext") + def test_bootstrap_will_try_loading_pipeline_user_if_not_provided( + self, guided_context_mock, environment_mock, load_saved_pipeline_user_arn_mock, get_command_names_mock + ): + self.cli_context["pipeline_user_arn"] = None + bootstrap_cli(**self.cli_context) + load_saved_pipeline_user_arn_mock.assert_called_once() + + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + @patch("samcli.commands.pipeline.bootstrap.cli._load_saved_pipeline_user_arn") + @patch("samcli.commands.pipeline.bootstrap.cli.Stage") + @patch("samcli.commands.pipeline.bootstrap.cli.GuidedContext") + def test_stage_name_is_required_to_be_provided_in_case_of_non_interactive_mode( + self, guided_context_mock, environment_mock, load_saved_pipeline_user_arn_mock, get_command_names_mock + ): + self.cli_context["interactive"] = False + self.cli_context["stage_name"] = None + with self.assertRaises(click.UsageError): + bootstrap_cli(**self.cli_context) + + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + @patch("samcli.commands.pipeline.bootstrap.cli._load_saved_pipeline_user_arn") + @patch("samcli.commands.pipeline.bootstrap.cli.Stage") + @patch("samcli.commands.pipeline.bootstrap.cli.GuidedContext") + def test_stage_name_is_not_required_to_be_provided_in_case_of_interactive_mode( + self, guided_context_mock, environment_mock, load_saved_pipeline_user_arn_mock, get_command_names_mock + ): + self.cli_context["interactive"] = True + self.cli_context["stage_name"] = None + bootstrap_cli(**self.cli_context) # No exception is thrown + + 
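The two stage-name tests above exercise a single guard in `do_cli`: a non-interactive run must receive a stage name up front, while an interactive run may prompt for it later. A minimal stdlib-only sketch of that guard pattern, with hypothetical names and `ValueError` standing in for `click.UsageError` (this is not the SAM CLI implementation):

```python
# Hypothetical sketch of the guard the tests above exercise;
# ValueError stands in for click.UsageError.
from typing import Optional


def bootstrap(interactive: bool, stage_name: Optional[str]) -> str:
    if not interactive and stage_name is None:
        # Non-interactive runs have no chance to prompt, so fail fast.
        raise ValueError("Missing required parameter '--stage'")
    if stage_name is None:
        stage_name = "prompted-stage"  # an interactive run would prompt here
    return stage_name


# Interactive mode tolerates a missing stage name ...
assert bootstrap(interactive=True, stage_name=None) == "prompted-stage"
# ... non-interactive mode does not.
try:
    bootstrap(interactive=False, stage_name=None)
except ValueError as err:
    assert "--stage" in str(err)
```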
@patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + @patch("samcli.commands.pipeline.bootstrap.cli._load_saved_pipeline_user_arn") + @patch("samcli.commands.pipeline.bootstrap.cli.Stage") + @patch("samcli.commands.pipeline.bootstrap.cli.GuidedContext") + def test_guided_context_will_be_enabled_or_disabled_based_on_the_interactive_mode( + self, guided_context_mock, environment_mock, load_saved_pipeline_user_arn_mock, get_command_names_mock + ): + gc_instance = Mock() + guided_context_mock.return_value = gc_instance + self.cli_context["interactive"] = False + bootstrap_cli(**self.cli_context) + gc_instance.run.assert_not_called() + self.cli_context["interactive"] = True + bootstrap_cli(**self.cli_context) + gc_instance.run.assert_called_once() + + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + @patch("samcli.commands.pipeline.bootstrap.cli._load_saved_pipeline_user_arn") + @patch("samcli.commands.pipeline.bootstrap.cli.Stage") + @patch("samcli.commands.pipeline.bootstrap.cli.GuidedContext") + def test_bootstrapping_will_confirm_before_creating_the_resources_unless_the_user_choose_not_to( + self, guided_context_mock, environment_mock, load_saved_pipeline_user_arn_mock, get_command_names_mock + ): + environment_instance = Mock() + environment_mock.return_value = environment_instance + self.cli_context["confirm_changeset"] = False + bootstrap_cli(**self.cli_context) + environment_instance.bootstrap.assert_called_once_with(confirm_changeset=False) + environment_instance.bootstrap.reset_mock() + self.cli_context["confirm_changeset"] = True + bootstrap_cli(**self.cli_context) + environment_instance.bootstrap.assert_called_once_with(confirm_changeset=True) + + @patch("samcli.commands.pipeline.bootstrap.cli.SamConfig") + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + def test_load_saved_pipeline_user_arn_will_read_from_the_correct_file( + self, get_command_names_mock, 
sam_config_mock + ): + # setup + get_command_names_mock.return_value = PIPELINE_BOOTSTRAP_COMMAND_NAMES + sam_config_instance_mock = Mock() + sam_config_mock.return_value = sam_config_instance_mock + sam_config_instance_mock.exists.return_value = False + + # trigger + _load_saved_pipeline_user_arn() + + # verify + sam_config_mock.assert_called_once_with(config_dir=PIPELINE_CONFIG_DIR, filename=PIPELINE_CONFIG_FILENAME) + + @patch("samcli.commands.pipeline.bootstrap.cli.SamConfig") + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + def test_load_saved_pipeline_user_arn_will_return_non_if_the_pipeline_toml_file_is_not_found( + self, get_command_names_mock, sam_config_mock + ): + # setup + get_command_names_mock.return_value = PIPELINE_BOOTSTRAP_COMMAND_NAMES + sam_config_instance_mock = Mock() + sam_config_mock.return_value = sam_config_instance_mock + sam_config_instance_mock.exists.return_value = False + + # trigger + pipeline_user_arn = _load_saved_pipeline_user_arn() + + # verify + self.assertIsNone(pipeline_user_arn) + + @patch("samcli.commands.pipeline.bootstrap.cli.SamConfig") + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + def test_load_saved_pipeline_user_arn_will_return_non_if_the_pipeline_toml_file_does_not_contain_pipeline_user( + self, get_command_names_mock, sam_config_mock + ): + # setup + get_command_names_mock.return_value = PIPELINE_BOOTSTRAP_COMMAND_NAMES + sam_config_instance_mock = Mock() + sam_config_mock.return_value = sam_config_instance_mock + sam_config_instance_mock.exists.return_value = True + sam_config_instance_mock.get_all.return_value = {"non-pipeline_user-key": "any_value"} + + # trigger + pipeline_user_arn = _load_saved_pipeline_user_arn() + + # verify + self.assertIsNone(pipeline_user_arn) + + @patch("samcli.commands.pipeline.bootstrap.cli.SamConfig") + @patch("samcli.commands.pipeline.bootstrap.cli._get_bootstrap_command_names") + def 
test_load_saved_pipeline_user_arn_returns_the_pipeline_user_arn_from_the_pipeline_toml_file( + self, get_command_names_mock, sam_config_mock + ): + # setup + get_command_names_mock.return_value = PIPELINE_BOOTSTRAP_COMMAND_NAMES + sam_config_instance_mock = Mock() + sam_config_mock.return_value = sam_config_instance_mock + sam_config_instance_mock.exists.return_value = True + sam_config_instance_mock.get_all.return_value = {"pipeline_user": ANY_PIPELINE_USER_ARN} + + # trigger + pipeline_user_arn = _load_saved_pipeline_user_arn() + + # verify + self.assertEqual(pipeline_user_arn, ANY_PIPELINE_USER_ARN) diff --git a/tests/unit/commands/pipeline/bootstrap/test_guided_context.py b/tests/unit/commands/pipeline/bootstrap/test_guided_context.py new file mode 100644 index 0000000000..c4c11e9792 --- /dev/null +++ b/tests/unit/commands/pipeline/bootstrap/test_guided_context.py @@ -0,0 +1,231 @@ +from unittest import TestCase +from unittest.mock import patch, Mock, ANY + +from parameterized import parameterized + +from samcli.commands.pipeline.bootstrap.guided_context import GuidedContext + +ANY_STAGE_NAME = "ANY_STAGE_NAME" +ANY_PIPELINE_USER_ARN = "ANY_PIPELINE_USER_ARN" +ANY_PIPELINE_EXECUTION_ROLE_ARN = "ANY_PIPELINE_EXECUTION_ROLE_ARN" +ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN = "ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN" +ANY_ARTIFACTS_BUCKET_ARN = "ANY_ARTIFACTS_BUCKET_ARN" +ANY_IMAGE_REPOSITORY_ARN = "ANY_IMAGE_REPOSITORY_ARN" +ANY_ARN = "ANY_ARN" +ANY_REGION = "us-east-2" + + +class TestGuidedContext(TestCase): + @patch("samcli.commands.pipeline.bootstrap.guided_context.get_current_account_id") + @patch("samcli.commands.pipeline.bootstrap.guided_context.click") + @patch("samcli.commands.pipeline.bootstrap.guided_context.GuidedContext._prompt_account_id") + def test_guided_context_will_not_prompt_for_fields_that_are_already_provided( + self, prompt_account_id_mock, click_mock, account_id_mock + ): + account_id_mock.return_value = "1234567890" + 
click_mock.confirm.return_value = False + click_mock.prompt = Mock(return_value="0") + gc: GuidedContext = GuidedContext( + stage_name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + pipeline_execution_role_arn=ANY_PIPELINE_EXECUTION_ROLE_ARN, + cloudformation_execution_role_arn=ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN, + artifacts_bucket_arn=ANY_ARTIFACTS_BUCKET_ARN, + create_image_repository=True, + image_repository_arn=ANY_IMAGE_REPOSITORY_ARN, + region=ANY_REGION, + ) + gc.run() + # there should be only two prompts: + # 1. which account to use (mocked in _prompt_account_id(), not contributing to count) + # 2. what values customers want to change + prompt_account_id_mock.assert_called_once() + click_mock.prompt.assert_called_once() + + @patch("samcli.commands.pipeline.bootstrap.guided_context.get_current_account_id") + @patch("samcli.commands.pipeline.bootstrap.guided_context.click") + @patch("samcli.commands.pipeline.bootstrap.guided_context.GuidedContext._prompt_account_id") + def test_guided_context_will_prompt_for_fields_that_are_not_provided( + self, prompt_account_id_mock, click_mock, account_id_mock + ): + account_id_mock.return_value = "1234567890" + click_mock.confirm.return_value = False + click_mock.prompt = Mock(return_value="0") + gc: GuidedContext = GuidedContext( + image_repository_arn=ANY_IMAGE_REPOSITORY_ARN # Exclude ECR repo, it has its own detailed test below + ) + gc.run() + prompt_account_id_mock.assert_called_once() + self.assertTrue(self.did_prompt_text_like("Stage Name", click_mock.prompt)) + self.assertTrue(self.did_prompt_text_like("Pipeline IAM user", click_mock.prompt)) + self.assertTrue(self.did_prompt_text_like("Pipeline execution role", click_mock.prompt)) + self.assertTrue(self.did_prompt_text_like("CloudFormation execution role", click_mock.prompt)) + self.assertTrue(self.did_prompt_text_like("Artifact bucket", click_mock.prompt)) + self.assertTrue(self.did_prompt_text_like("region", click_mock.prompt)) + 
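Assertions like `did_prompt_text_like` work by scanning a mock's recorded calls instead of pinning exact arguments, which keeps the tests resilient to wording changes in the prompts. A self-contained sketch of that inspection pattern (a standalone reconstruction that mirrors the helper defined later in this file, not the SAM CLI source):

```python
# Check whether any recorded call to a mocked prompt contained a given
# substring, whether it was passed positionally or via the `text` kwarg.
from unittest.mock import Mock


def did_prompt_text_like(txt: str, prompt_mock: Mock) -> bool:
    txt = txt.lower()
    for recorded_call in prompt_mock.call_args_list:
        args, kwargs = recorded_call  # mock.call unpacks into (args, kwargs)
        text = args[0].lower() if args else kwargs.get("text", "").lower()
        if txt in text:
            return True
    return False


prompt = Mock(return_value="0")
prompt("Stage Name for this bootstrap")   # positional prompt text
prompt(text="Artifact bucket ARN")        # keyword prompt text

assert did_prompt_text_like("stage name", prompt)
assert did_prompt_text_like("artifact bucket", prompt)
assert not did_prompt_text_like("image repository", prompt)
```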
@patch("samcli.commands.pipeline.bootstrap.guided_context.get_current_account_id") + @patch("samcli.commands.pipeline.bootstrap.guided_context.click") + @patch("samcli.commands.pipeline.bootstrap.guided_context.GuidedContext._prompt_account_id") + def test_guided_context_will_not_prompt_for_not_provided_image_repository_if_no_image_repository_is_required( + self, prompt_account_id_mock, click_mock, account_id_mock + ): + account_id_mock.return_value = "1234567890" + # ECR Image Repository choices: + # 1 - No, My SAM Template won't include lambda functions of Image package-type + # 2 - Yes, I need a help creating one + # 3 - I already have an ECR image repository + gc_without_ecr_info: GuidedContext = GuidedContext( + stage_name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + pipeline_execution_role_arn=ANY_PIPELINE_EXECUTION_ROLE_ARN, + cloudformation_execution_role_arn=ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN, + artifacts_bucket_arn=ANY_ARTIFACTS_BUCKET_ARN, + ) + + self.assertIsNone(gc_without_ecr_info.image_repository_arn) + + click_mock.confirm.return_value = False # the user chose to not CREATE an ECR Image repository + click_mock.prompt.side_effect = [None, "0"] + gc_without_ecr_info.run() + self.assertIsNone(gc_without_ecr_info.image_repository_arn) + self.assertFalse(gc_without_ecr_info.create_image_repository) + self.assertFalse(self.did_prompt_text_like("Please enter the ECR image repository", click_mock.prompt)) + + click_mock.confirm.return_value = True # the user chose to CREATE an ECR Image repository + click_mock.prompt.side_effect = [None, None, "0"] + gc_without_ecr_info.run() + self.assertIsNone(gc_without_ecr_info.image_repository_arn) + self.assertTrue(gc_without_ecr_info.create_image_repository) + self.assertTrue(self.did_prompt_text_like("Please enter the ECR image repository", click_mock.prompt)) + + click_mock.confirm.return_value = True # the user already has a repo + click_mock.prompt.side_effect = [None, 
ANY_IMAGE_REPOSITORY_ARN, "0"] + gc_without_ecr_info.run() + self.assertFalse(gc_without_ecr_info.create_image_repository) + self.assertTrue( + self.did_prompt_text_like("Please enter the ECR image repository", click_mock.prompt) + ) # we've asked about it + self.assertEqual(gc_without_ecr_info.image_repository_arn, ANY_IMAGE_REPOSITORY_ARN) + + @staticmethod + def did_prompt_text_like(txt, click_prompt_mock): + txt = txt.lower() + for kall in click_prompt_mock.call_args_list: + args, kwargs = kall + if args: + text = args[0].lower() + else: + text = kwargs.get("text", "").lower() + if txt in text: + return True + return False + + +class TestGuidedContext_prompt_account_id(TestCase): + @patch("samcli.commands.pipeline.bootstrap.guided_context.get_current_account_id") + @patch("samcli.commands.pipeline.bootstrap.guided_context.click") + @patch("samcli.commands.pipeline.bootstrap.guided_context.os.getenv") + @patch("samcli.commands.pipeline.bootstrap.guided_context.list_available_profiles") + def test_prompt_account_id_can_display_profiles_and_environment( + self, list_available_profiles_mock, getenv_mock, click_mock, get_current_account_id_mock + ): + getenv_mock.return_value = "not None" + list_available_profiles_mock.return_value = ["profile1", "profile2"] + click_mock.prompt.return_value = "1" # select environment variable + get_current_account_id_mock.return_value = "account_id" + + guided_context_mock = Mock() + GuidedContext._prompt_account_id(guided_context_mock) + + click_mock.prompt.assert_called_once_with( + ANY, show_choices=False, show_default=False, type=click_mock.Choice(["1", "2", "3", "q"]) + ) + + @patch("samcli.commands.pipeline.bootstrap.guided_context.get_current_account_id") + @patch("samcli.commands.pipeline.bootstrap.guided_context.click") + @patch("samcli.commands.pipeline.bootstrap.guided_context.os.getenv") + @patch("samcli.commands.pipeline.bootstrap.guided_context.list_available_profiles") + def 
test_prompt_account_id_wont_show_environment_option_when_it_doesnt_exist( + self, list_available_profiles_mock, getenv_mock, click_mock, get_current_account_id_mock + ): + getenv_mock.return_value = None + list_available_profiles_mock.return_value = ["profile1", "profile2"] + click_mock.prompt.return_value = "1" # mocked selection; not validated because click is mocked + get_current_account_id_mock.return_value = "account_id" + + guided_context_mock = Mock() + GuidedContext._prompt_account_id(guided_context_mock) + + click_mock.prompt.assert_called_once_with( + ANY, show_choices=False, show_default=False, type=click_mock.Choice(["2", "3", "q"]) + ) + + @patch("samcli.commands.pipeline.bootstrap.guided_context.get_current_account_id") + @patch("samcli.commands.pipeline.bootstrap.guided_context.click") + @patch("samcli.commands.pipeline.bootstrap.guided_context.os.getenv") + @patch("samcli.commands.pipeline.bootstrap.guided_context.list_available_profiles") + def test_prompt_account_id_select_environment_unset_self_profile( + self, list_available_profiles_mock, getenv_mock, click_mock, get_current_account_id_mock + ): + getenv_mock.return_value = "not None" + list_available_profiles_mock.return_value = ["profile1", "profile2"] + click_mock.prompt.return_value = "1" # select environment variable + get_current_account_id_mock.return_value = "account_id" + + guided_context_mock = Mock() + GuidedContext._prompt_account_id(guided_context_mock) + + self.assertEqual(None, guided_context_mock.profile) + + @parameterized.expand( + [ + ( + "2", + "profile1", + ), + ( + "3", + "profile2", + ), + ] + ) + @patch("samcli.commands.pipeline.bootstrap.guided_context.get_current_account_id") + @patch("samcli.commands.pipeline.bootstrap.guided_context.click") + @patch("samcli.commands.pipeline.bootstrap.guided_context.os.getenv") + @patch("samcli.commands.pipeline.bootstrap.guided_context.list_available_profiles") + def test_prompt_account_id_select_profile_set_profile_to_its_name( + self, + profile_selection, 
expected_profile, + list_available_profiles_mock, + getenv_mock, + click_mock, + get_current_account_id_mock, + ): + getenv_mock.return_value = "not None" + list_available_profiles_mock.return_value = ["profile1", "profile2"] + click_mock.prompt.return_value = profile_selection + get_current_account_id_mock.return_value = "account_id" + + guided_context_mock = Mock() + GuidedContext._prompt_account_id(guided_context_mock) + + self.assertEqual(expected_profile, guided_context_mock.profile) + + @patch("samcli.commands.pipeline.bootstrap.guided_context.sys.exit") + @patch("samcli.commands.pipeline.bootstrap.guided_context.get_current_account_id") + @patch("samcli.commands.pipeline.bootstrap.guided_context.click") + @patch("samcli.commands.pipeline.bootstrap.guided_context.os.getenv") + @patch("samcli.commands.pipeline.bootstrap.guided_context.list_available_profiles") + def test_prompt_account_id_select_quit( + self, list_available_profiles_mock, getenv_mock, click_mock, get_current_account_id_mock, exit_mock + ): + getenv_mock.return_value = "not None" + list_available_profiles_mock.return_value = ["profile1", "profile2"] + click_mock.prompt.return_value = "q" # quit + get_current_account_id_mock.return_value = "account_id" + + guided_context_mock = Mock() + GuidedContext._prompt_account_id(guided_context_mock) + + exit_mock.assert_called_once_with(0) diff --git a/tests/unit/commands/pipeline/init/__init__.py b/tests/unit/commands/pipeline/init/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/unit/commands/pipeline/init/test_cli.py b/tests/unit/commands/pipeline/init/test_cli.py new file mode 100644 index 0000000000..2e7cd0699b --- /dev/null +++ b/tests/unit/commands/pipeline/init/test_cli.py @@ -0,0 +1,22 @@ +from unittest import TestCase +from unittest.mock import patch + +from click.testing import CliRunner + +from samcli.commands.pipeline.init.cli import cli as init_cmd +from samcli.commands.pipeline.init.cli import do_cli as 
init_cli + + +class TestCli(TestCase): + @patch("samcli.commands.pipeline.init.cli.do_cli") + def test_cli_default_flow(self, do_cli_mock): + runner: CliRunner = CliRunner() + runner.invoke(init_cmd) + # Currently we support the interactive mode only, i.e. we don't accept any command arguments, + # instead we ask the user about the required arguments in an interactive way + do_cli_mock.assert_called_once_with(False) # Called without arguments + + @patch("samcli.commands.pipeline.init.cli.InteractiveInitFlow.do_interactive") + def test_do_cli(self, do_interactive_mock): + init_cli(False) + do_interactive_mock.assert_called_once_with() # Called without arguments diff --git a/tests/unit/commands/pipeline/init/test_initeractive_init_flow.py b/tests/unit/commands/pipeline/init/test_initeractive_init_flow.py new file mode 100644 index 0000000000..2cdaacc91e --- /dev/null +++ b/tests/unit/commands/pipeline/init/test_initeractive_init_flow.py @@ -0,0 +1,566 @@ +import json +import shutil +import tempfile +from unittest import TestCase +from unittest.mock import patch, Mock, call +import os +from pathlib import Path + +from parameterized import parameterized + +from samcli.commands.exceptions import AppPipelineTemplateMetadataException +from samcli.commands.pipeline.init.interactive_init_flow import ( + InteractiveInitFlow, + PipelineTemplateCloneException, + APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME, + shared_path, + CUSTOM_PIPELINE_TEMPLATE_REPO_LOCAL_NAME, + _prompt_cicd_provider, + _prompt_provider_pipeline_template, + _get_pipeline_template_metadata, + _copy_dir_contents_to_cwd, +) +from samcli.commands.pipeline.init.pipeline_templates_manifest import AppPipelineTemplateManifestException +from samcli.lib.utils.git_repo import CloneRepoException +from samcli.lib.cookiecutter.interactive_flow_creator import QuestionsNotFoundException + + +class TestInteractiveInitFlow(TestCase): + 
@patch("samcli.commands.pipeline.init.interactive_init_flow._read_app_pipeline_templates_manifest") + @patch("samcli.commands.pipeline.init.interactive_init_flow._prompt_pipeline_template") + @patch("samcli.commands.pipeline.init.interactive_init_flow.InteractiveInitFlow._generate_from_pipeline_template") + @patch("samcli.commands.pipeline.init.interactive_init_flow.shared_path") + @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone") + @patch("samcli.lib.cookiecutter.question.click") + def test_app_pipeline_templates_clone_fail_when_an_old_clone_exists( + self, + click_mock, + clone_mock, + shared_path_mock, + generate_from_pipeline_template_mock, + select_pipeline_template_mock, + read_app_pipeline_templates_manifest_mock, + ): + # setup + clone_mock.side_effect = CloneRepoException # clone fail + app_pipeline_templates_path_mock = Mock() + selected_pipeline_template_path_mock = Mock() + pipeline_templates_manifest_mock = Mock() + shared_path_mock.joinpath.return_value = app_pipeline_templates_path_mock + app_pipeline_templates_path_mock.exists.return_value = True # An old clone exists + app_pipeline_templates_path_mock.joinpath.return_value = selected_pipeline_template_path_mock + read_app_pipeline_templates_manifest_mock.return_value = pipeline_templates_manifest_mock + click_mock.prompt.return_value = "1" # App pipeline templates + + # trigger + InteractiveInitFlow(allow_bootstrap=False).do_interactive() + + # verify + clone_mock.assert_called_once_with( + shared_path_mock, APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME, replace_existing=True + ) + app_pipeline_templates_path_mock.exists.assert_called_once() + read_app_pipeline_templates_manifest_mock.assert_called_once_with(app_pipeline_templates_path_mock) + select_pipeline_template_mock.assert_called_once_with(pipeline_templates_manifest_mock) + generate_from_pipeline_template_mock.assert_called_once_with(selected_pipeline_template_path_mock) + + 
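The clone-failure tests in this file pin down one fallback behavior: when refreshing the app pipeline-templates repository fails, the flow proceeds with a previously cloned local copy if one exists, and only surfaces the clone error when there is no local copy at all. A stdlib-only sketch of that refresh-or-fall-back pattern (names are hypothetical stand-ins; the real flow uses `GitRepo.clone` and `PipelineTemplateCloneException`):

```python
# Hedged sketch of a refresh-or-fall-back cache, mirroring the behavior
# the surrounding tests assert. All names are hypothetical stand-ins.
import tempfile
from pathlib import Path
from typing import Callable


class CloneError(RuntimeError):
    """Stand-in for PipelineTemplateCloneException."""


def fresh_or_cached_templates(clone: Callable[[], Path], cached: Path) -> Path:
    try:
        return clone()  # try to refresh the templates checkout
    except CloneError:
        if cached.exists():
            return cached  # a stale local copy is better than nothing
        raise  # no fallback available: surface the failure


def failing_clone() -> Path:
    raise CloneError("network unreachable")


with tempfile.TemporaryDirectory() as tmp:
    cached = Path(tmp)  # plays the role of an "old clone" that exists
    assert fresh_or_cached_templates(failing_clone, cached) == cached

missing = Path("/nonexistent/pipeline-templates")
try:
    fresh_or_cached_templates(failing_clone, missing)
except CloneError:
    pass  # no old clone to fall back on, so the error propagates
```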
@patch("samcli.commands.pipeline.init.interactive_init_flow.shared_path") + @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone") + @patch("samcli.lib.cookiecutter.question.click") + def test_app_pipeline_templates_clone_fail_when_no_old_clone_exist(self, click_mock, clone_mock, shared_path_mock): + # setup + clone_mock.side_effect = CloneRepoException # clone fail + app_pipeline_templates_path_mock = Mock() + shared_path_mock.joinpath.return_value = app_pipeline_templates_path_mock + app_pipeline_templates_path_mock.exists.return_value = False # No old clone exists + click_mock.prompt.return_value = "1" # App pipeline templates + + # trigger + with self.assertRaises(PipelineTemplateCloneException): + InteractiveInitFlow(allow_bootstrap=False).do_interactive() + + @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone") + @patch("samcli.commands.pipeline.init.interactive_init_flow.click") + @patch("samcli.lib.cookiecutter.question.click") + def test_custom_pipeline_template_clone_fail(self, question_click_mock, init_click_mock, clone_mock): + # setup + clone_mock.side_effect = CloneRepoException # clone fail + question_click_mock.prompt.return_value = "2" # Custom pipeline templates + init_click_mock.prompt.return_value = ( + "https://github.com/any-custom-pipeline-template-repo.git" # Custom pipeline template repo URL + ) + + # trigger + with self.assertRaises(PipelineTemplateCloneException): + InteractiveInitFlow(allow_bootstrap=False).do_interactive() + + @patch("samcli.commands.pipeline.init.interactive_init_flow._read_app_pipeline_templates_manifest") + @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone") + @patch("samcli.lib.cookiecutter.question.click") + def test_app_pipeline_templates_with_invalid_manifest( + self, click_mock, clone_mock, read_app_pipeline_templates_manifest_mock + ): + # setup + app_pipeline_templates_path_mock = Mock() + clone_mock.return_value = 
app_pipeline_templates_path_mock + read_app_pipeline_templates_manifest_mock.side_effect = AppPipelineTemplateManifestException("") + click_mock.prompt.return_value = "1" # App pipeline templates + + # trigger + with self.assertRaises(AppPipelineTemplateManifestException): + InteractiveInitFlow(allow_bootstrap=False).do_interactive() + + @patch("samcli.commands.pipeline.init.interactive_init_flow.SamConfig") + @patch("samcli.commands.pipeline.init.interactive_init_flow.osutils") + @patch("samcli.lib.cookiecutter.template.cookiecutter") + @patch("samcli.commands.pipeline.init.interactive_init_flow.InteractiveFlowCreator.create_flow") + @patch("samcli.commands.pipeline.init.interactive_init_flow.PipelineTemplatesManifest") + @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone") + @patch("samcli.commands.pipeline.init.interactive_init_flow._copy_dir_contents_to_cwd") + @patch("samcli.commands.pipeline.init.interactive_init_flow._get_pipeline_template_metadata") + @patch("samcli.lib.cookiecutter.question.click") + def test_generate_pipeline_configuration_file_from_app_pipeline_template_happy_case( + self, + click_mock, + _get_pipeline_template_metadata_mock, + _copy_dir_contents_to_cwd_mock, + clone_mock, + PipelineTemplatesManifest_mock, + create_interactive_flow_mock, + cookiecutter_mock, + osutils_mock, + samconfig_mock, + ): + # setup + any_app_pipeline_templates_path = Path( + os.path.normpath(shared_path.joinpath(APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME)) + ) + clone_mock.return_value = any_app_pipeline_templates_path + jenkins_template_location = "some/location" + jenkins_template_mock = Mock( + display_name="Jenkins pipeline template", location=jenkins_template_location, provider="jenkins" + ) + pipeline_templates_manifest_mock = Mock( + providers=[ + Mock(id="gitlab", display_name="Gitlab"), + Mock(id="jenkins", display_name="Jenkins"), + ], + templates=[jenkins_template_mock], + ) + PipelineTemplatesManifest_mock.return_value = 
pipeline_templates_manifest_mock
+        cookiecutter_output_dir_mock = "/tmp/any/dir2"
+        osutils_mock.mkdir_temp.return_value.__enter__ = Mock(return_value=cookiecutter_output_dir_mock)
+        interactive_flow_mock = Mock()
+        create_interactive_flow_mock.return_value = interactive_flow_mock
+        cookiecutter_context_mock = {"key": "value"}
+        interactive_flow_mock.run.return_value = cookiecutter_context_mock
+        config_file = Mock()
+        samconfig_mock.return_value = config_file
+        config_file.exists.return_value = True
+        config_file.get_stage_names.return_value = ["testing", "prod"]
+        config_file.get_all.return_value = {"pipeline_execution_role": "arn:aws:iam::123456789012:role/execution-role"}
+
+        click_mock.prompt.side_effect = [
+            "1",  # App pipeline templates
+            "2",  # choose "Jenkins" when prompted for CI/CD system. (See pipeline_templates_manifest_mock, Jenkins is the 2nd provider)
+            "1",  # choose "Jenkins pipeline template" when prompted for pipeline template
+        ]
+        _get_pipeline_template_metadata_mock.return_value = {"number_of_stages": 2}
+
+        # trigger
+        InteractiveInitFlow(allow_bootstrap=False).do_interactive()
+
+        # verify
+        osutils_mock.mkdir_temp.assert_called()  # cookiecutter project is generated to temp
+        expected_cookiecutter_template_location = any_app_pipeline_templates_path.joinpath(jenkins_template_location)
+        clone_mock.assert_called_once_with(shared_path, APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME, replace_existing=True)
+        PipelineTemplatesManifest_mock.assert_called_once()
+        create_interactive_flow_mock.assert_called_once_with(
+            str(expected_cookiecutter_template_location.joinpath("questions.json"))
+        )
+        interactive_flow_mock.run.assert_called_once_with(
+            {
+                str(["testing", "pipeline_execution_role"]): "arn:aws:iam::123456789012:role/execution-role",
+                str(["1", "pipeline_execution_role"]): "arn:aws:iam::123456789012:role/execution-role",
+                str(["prod", "pipeline_execution_role"]): "arn:aws:iam::123456789012:role/execution-role",
+                str(["2", "pipeline_execution_role"]): "arn:aws:iam::123456789012:role/execution-role",
+                str(["stage_names_message"]): "Here are the stage names detected "
+                f'in {os.path.join(".aws-sam", "pipeline", "pipelineconfig.toml")}:\n\t1 - testing\n\t2 - prod',
+            }
+        )
+        cookiecutter_mock.assert_called_once_with(
+            template=str(expected_cookiecutter_template_location),
+            output_dir=cookiecutter_output_dir_mock,
+            no_input=True,
+            extra_context=cookiecutter_context_mock,
+            overwrite_if_exists=True,
+        )
+
+    @patch("samcli.commands.pipeline.init.interactive_init_flow._read_app_pipeline_templates_manifest")
+    @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone")
+    @patch("samcli.lib.cookiecutter.question.click")
+    def test_generate_pipeline_configuration_file_when_pipeline_template_missing_questions_file(
+        self, click_mock, clone_mock, read_app_pipeline_templates_manifest_mock
+    ):
+        # setup
+        any_app_pipeline_templates_path = shared_path.joinpath(APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME)
+        clone_mock.return_value = any_app_pipeline_templates_path
+        jenkins_template_location = "some/location"
+        jenkins_template_mock = Mock(
+            display_name="Jenkins pipeline template", location=jenkins_template_location, provider="jenkins"
+        )
+        pipeline_templates_manifest_mock = Mock(
+            providers=[
+                Mock(id="gitlab", display_name="Gitlab"),
+                Mock(id="jenkins", display_name="Jenkins"),
+            ],
+            templates=[jenkins_template_mock],
+        )
+        read_app_pipeline_templates_manifest_mock.return_value = pipeline_templates_manifest_mock
+
+        click_mock.prompt.side_effect = [
+            "1",  # App pipeline templates
+            "2",  # choose "Jenkins" when prompted for CI/CD system.
(See pipeline_templates_manifest_mock, Jenkins is the 2nd provider) + "1", # choose "Jenkins pipeline template" when prompt for pipeline template + ] + + # trigger + with self.assertRaises(QuestionsNotFoundException): + InteractiveInitFlow(allow_bootstrap=False).do_interactive() + + @patch("samcli.commands.pipeline.init.interactive_init_flow.os") + @patch("samcli.commands.pipeline.init.interactive_init_flow.osutils") + @patch("samcli.commands.pipeline.init.interactive_init_flow.InteractiveInitFlow._generate_from_pipeline_template") + @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone") + @patch("samcli.commands.pipeline.init.interactive_init_flow.click") + @patch("samcli.lib.cookiecutter.question.click") + def test_generate_pipeline_configuration_file_from_custom_local_existing_path_will_not_do_git_clone( + self, + questions_click_mock, + init_click_mock, + clone_mock, + generate_from_pipeline_template_mock, + osutils_mock, + os_mock, + ): + # setup + local_pipeline_templates_path = "/any/existing/local/path" + os_mock.path.exists.return_value = True + questions_click_mock.prompt.return_value = "2" # Custom pipeline templates + init_click_mock.prompt.return_value = local_pipeline_templates_path # git repo path + # trigger + InteractiveInitFlow(allow_bootstrap=False).do_interactive() + + # verify + osutils_mock.mkdir_temp.assert_not_called() + clone_mock.assert_not_called() + generate_from_pipeline_template_mock.assert_called_once_with(Path(local_pipeline_templates_path)) + + @patch("samcli.commands.pipeline.init.interactive_init_flow.osutils") + @patch("samcli.lib.cookiecutter.template.cookiecutter") + @patch("samcli.commands.pipeline.init.interactive_init_flow.InteractiveFlowCreator.create_flow") + @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone") + @patch("samcli.commands.pipeline.init.interactive_init_flow.click") + @patch("samcli.commands.pipeline.init.interactive_init_flow._copy_dir_contents_to_cwd") + 
@patch("samcli.commands.pipeline.init.interactive_init_flow._get_pipeline_template_metadata") + @patch("samcli.lib.cookiecutter.question.click") + def test_generate_pipeline_configuration_file_from_custom_remote_pipeline_template_happy_case( + self, + questions_click_mock, + _get_pipeline_template_metadata_mock, + _copy_dir_contents_to_cwd_mock, + init_click_mock, + clone_mock, + create_interactive_flow_mock, + cookiecutter_mock, + osutils_mock, + ): + # setup + any_temp_dir = "/tmp/any/dir" + cookiecutter_output_dir_mock = "/tmp/any/dir2" + osutils_mock.mkdir_temp.return_value.__enter__ = Mock(side_effect=[any_temp_dir, cookiecutter_output_dir_mock]) + osutils_mock.mkdir_temp.return_value.__exit__ = Mock() + any_custom_pipeline_templates_path = Path(os.path.join(any_temp_dir, CUSTOM_PIPELINE_TEMPLATE_REPO_LOCAL_NAME)) + clone_mock.return_value = any_custom_pipeline_templates_path + interactive_flow_mock = Mock() + create_interactive_flow_mock.return_value = interactive_flow_mock + cookiecutter_context_mock = {"key": "value"} + interactive_flow_mock.run.return_value = cookiecutter_context_mock + _copy_dir_contents_to_cwd_mock.return_value = ["file1"] + + questions_click_mock.prompt.return_value = "2" # Custom pipeline templates + init_click_mock.prompt.return_value = "https://github.com/any-custom-pipeline-template-repo.git" + _get_pipeline_template_metadata_mock.return_value = {"number_of_stages": 2} + + # trigger + InteractiveInitFlow(allow_bootstrap=False).do_interactive() + + # verify + # Custom templates are cloned to temp; cookiecutter project is generated to temp + osutils_mock.mkdir_temp.assert_called() + clone_mock.assert_called_once_with( + Path(any_temp_dir), CUSTOM_PIPELINE_TEMPLATE_REPO_LOCAL_NAME, replace_existing=True + ) + create_interactive_flow_mock.assert_called_once_with( + str(any_custom_pipeline_templates_path.joinpath("questions.json")) + ) + interactive_flow_mock.run.assert_called_once() + cookiecutter_mock.assert_called_once_with( + 
template=str(any_custom_pipeline_templates_path),
+            output_dir=cookiecutter_output_dir_mock,
+            no_input=True,
+            extra_context=cookiecutter_context_mock,
+            overwrite_if_exists=True,
+        )
+
+    @patch("samcli.lib.cookiecutter.question.click")
+    def test_prompt_cicd_provider_will_not_prompt_if_the_list_of_providers_has_only_one_provider(self, click_mock):
+        gitlab_provider = Mock(id="gitlab", display_name="Gitlab CI/CD")
+        providers = [gitlab_provider]
+
+        chosen_provider = _prompt_cicd_provider(providers)
+        click_mock.prompt.assert_not_called()
+        self.assertEqual(chosen_provider, gitlab_provider)
+
+        jenkins_provider = Mock(id="jenkins", display_name="Jenkins")
+        providers.append(jenkins_provider)
+        click_mock.prompt.return_value = "2"
+        chosen_provider = _prompt_cicd_provider(providers)
+        click_mock.prompt.assert_called_once()
+        self.assertEqual(chosen_provider, jenkins_provider)
+
+    @patch("samcli.lib.cookiecutter.question.click")
+    def test_prompt_provider_pipeline_template_will_not_prompt_if_the_list_of_templates_has_only_one_template(
+        self, click_mock
+    ):
+        template1 = Mock(display_name="anyName1", location="anyLocation1", provider="a provider")
+        template2 = Mock(display_name="anyName2", location="anyLocation2", provider="a provider")
+        templates = [template1]
+
+        chosen_template = _prompt_provider_pipeline_template(templates)
+        click_mock.prompt.assert_not_called()
+        self.assertEqual(chosen_template, template1)
+
+        templates.append(template2)
+        click_mock.prompt.return_value = "2"
+        chosen_template = _prompt_provider_pipeline_template(templates)
+        click_mock.prompt.assert_called_once()
+        self.assertEqual(chosen_template, template2)
+
+    def test_get_pipeline_template_metadata_can_load(self):
+        with tempfile.TemporaryDirectory() as dir:
+            metadata = {"number_of_stages": 2}
+            with open(Path(dir, "metadata.json"), "w") as f:
+                json.dump(metadata, f)
+            self.assertEqual(metadata, _get_pipeline_template_metadata(dir))
+
+    def 
test_get_pipeline_template_metadata_not_exist(self):
+        with tempfile.TemporaryDirectory() as dir:
+            with self.assertRaises(AppPipelineTemplateMetadataException):
+                _get_pipeline_template_metadata(dir)
+
+    @parameterized.expand(
+        [
+            ('["not_a_dict"]',),
+            ("not a json",),
+        ]
+    )
+    def test_get_pipeline_template_metadata_not_valid(self, metadata_str):
+        with tempfile.TemporaryDirectory() as dir:
+            with open(Path(dir, "metadata.json"), "w") as f:
+                f.write(metadata_str)
+            with self.assertRaises(AppPipelineTemplateMetadataException):
+                _get_pipeline_template_metadata(dir)
+
+
+class TestInteractiveInitFlowWithBootstrap(TestCase):
+    @patch("samcli.commands.pipeline.init.interactive_init_flow.SamConfig")
+    @patch("samcli.commands.pipeline.init.interactive_init_flow.osutils")
+    @patch("samcli.lib.cookiecutter.template.cookiecutter")
+    @patch("samcli.commands.pipeline.init.interactive_init_flow.InteractiveFlowCreator.create_flow")
+    @patch(
+        "samcli.commands.pipeline.init.interactive_init_flow.InteractiveInitFlow._prompt_run_bootstrap_within_pipeline_init"
+    )
+    @patch("samcli.commands.pipeline.init.interactive_init_flow.PipelineTemplatesManifest")
+    @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone")
+    @patch("samcli.commands.pipeline.init.interactive_init_flow._copy_dir_contents_to_cwd")
+    @patch("samcli.commands.pipeline.init.interactive_init_flow._get_pipeline_template_metadata")
+    @patch("samcli.lib.cookiecutter.question.click")
+    def test_with_bootstrap_but_answer_no(
+        self,
+        click_mock,
+        _get_pipeline_template_metadata_mock,
+        _copy_dir_contents_to_cwd_mock,
+        clone_mock,
+        PipelineTemplatesManifest_mock,
+        _prompt_run_bootstrap_within_pipeline_init_mock,
+        create_interactive_flow_mock,
+        cookiecutter_mock,
+        osutils_mock,
+        samconfig_mock,
+    ):
+        # setup
+        any_app_pipeline_templates_path = Path(
+            os.path.normpath(shared_path.joinpath(APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME))
+        )
+        clone_mock.return_value = 
any_app_pipeline_templates_path + jenkins_template_location = "some/location" + jenkins_template_mock = Mock( + display_name="Jenkins pipeline template", location=jenkins_template_location, provider="jenkins" + ) + pipeline_templates_manifest_mock = Mock( + providers=[ + Mock(id="gitlab", display_name="Gitlab"), + Mock(id="jenkins", display_name="Jenkins"), + ], + templates=[jenkins_template_mock], + ) + PipelineTemplatesManifest_mock.return_value = pipeline_templates_manifest_mock + cookiecutter_output_dir_mock = "/tmp/any/dir2" + osutils_mock.mkdir_temp.return_value.__enter__ = Mock(return_value=cookiecutter_output_dir_mock) + interactive_flow_mock = Mock() + create_interactive_flow_mock.return_value = interactive_flow_mock + cookiecutter_context_mock = {"key": "value"} + interactive_flow_mock.run.return_value = cookiecutter_context_mock + config_file = Mock() + samconfig_mock.return_value = config_file + config_file.exists.return_value = True + config_file.get_stage_names.return_value = ["testing"] + config_file.get_all.return_value = {"pipeline_execution_role": "arn:aws:iam::123456789012:role/execution-role"} + _get_pipeline_template_metadata_mock.return_value = {"number_of_stages": 2} + + click_mock.prompt.side_effect = [ + "1", # App pipeline templates + "2", + # choose "Jenkins" when prompt for CI/CD system. 
(See pipeline_templates_manifest_mock, Jenkins is the 2nd provider) + "1", # choose "Jenkins pipeline template" when prompt for pipeline template + ] + + _prompt_run_bootstrap_within_pipeline_init_mock.return_value = False # not to bootstrap + + # trigger + InteractiveInitFlow(allow_bootstrap=True).do_interactive() + + # verify + _prompt_run_bootstrap_within_pipeline_init_mock.assert_called_once_with(["testing"], 2) + + @parameterized.expand( + [ + ([["testing"], ["testing", "prod"]], [call(["testing"], 2)]), + ([[], ["testing"], ["testing", "prod"]], [call([], 2), call(["testing"], 2)]), + ] + ) + @patch("samcli.commands.pipeline.init.interactive_init_flow.SamConfig") + @patch("samcli.commands.pipeline.init.interactive_init_flow.osutils") + @patch("samcli.lib.cookiecutter.template.cookiecutter") + @patch("samcli.commands.pipeline.init.interactive_init_flow.InteractiveFlowCreator.create_flow") + @patch( + "samcli.commands.pipeline.init.interactive_init_flow.InteractiveInitFlow._prompt_run_bootstrap_within_pipeline_init" + ) + @patch("samcli.commands.pipeline.init.interactive_init_flow.PipelineTemplatesManifest") + @patch("samcli.commands.pipeline.init.interactive_init_flow.GitRepo.clone") + @patch("samcli.commands.pipeline.init.interactive_init_flow._copy_dir_contents_to_cwd") + @patch("samcli.commands.pipeline.init.interactive_init_flow._get_pipeline_template_metadata") + @patch("samcli.lib.cookiecutter.question.click") + def test_with_bootstrap_answer_yes( + self, + get_stage_name_side_effects, + _prompt_run_bootstrap_expected_calls, + click_mock, + _get_pipeline_template_metadata_mock, + _copy_dir_contents_to_cwd_mock, + clone_mock, + PipelineTemplatesManifest_mock, + _prompt_run_bootstrap_within_pipeline_init_mock, + create_interactive_flow_mock, + cookiecutter_mock, + osutils_mock, + samconfig_mock, + ): + # setup + any_app_pipeline_templates_path = Path( + os.path.normpath(shared_path.joinpath(APP_PIPELINE_TEMPLATES_REPO_LOCAL_NAME)) + ) + 
clone_mock.return_value = any_app_pipeline_templates_path + jenkins_template_location = "some/location" + jenkins_template_mock = Mock( + display_name="Jenkins pipeline template", location=jenkins_template_location, provider="jenkins" + ) + pipeline_templates_manifest_mock = Mock( + providers=[ + Mock(id="gitlab", display_name="Gitlab"), + Mock(id="jenkins", display_name="Jenkins"), + ], + templates=[jenkins_template_mock], + ) + PipelineTemplatesManifest_mock.return_value = pipeline_templates_manifest_mock + cookiecutter_output_dir_mock = "/tmp/any/dir2" + osutils_mock.mkdir_temp.return_value.__enter__ = Mock(return_value=cookiecutter_output_dir_mock) + interactive_flow_mock = Mock() + create_interactive_flow_mock.return_value = interactive_flow_mock + cookiecutter_context_mock = {"key": "value"} + interactive_flow_mock.run.return_value = cookiecutter_context_mock + config_file = Mock() + samconfig_mock.return_value = config_file + config_file.exists.return_value = True + config_file.get_stage_names.side_effect = get_stage_name_side_effects + config_file.get_all.return_value = {"pipeline_execution_role": "arn:aws:iam::123456789012:role/execution-role"} + _get_pipeline_template_metadata_mock.return_value = {"number_of_stages": 2} + + click_mock.prompt.side_effect = [ + "1", # App pipeline templates + "2", + # choose "Jenkins" when prompt for CI/CD system. 
(See pipeline_templates_manifest_mock, Jenkins is the 2nd provider) + "1", # choose "Jenkins pipeline template" when prompt for pipeline template + ] + + _prompt_run_bootstrap_within_pipeline_init_mock.return_value = True # to bootstrap + + # trigger + InteractiveInitFlow(allow_bootstrap=True).do_interactive() + + # verify + _prompt_run_bootstrap_within_pipeline_init_mock.assert_has_calls(_prompt_run_bootstrap_expected_calls) + + +class TestInteractiveInitFlow_copy_dir_contents_to_cwd(TestCase): + def tearDown(self) -> None: + if Path("file").exists(): + Path("file").unlink() + shutil.rmtree(os.path.join(".aws-sam", "pipeline"), ignore_errors=True) + + @patch("samcli.commands.pipeline.init.interactive_init_flow.click.confirm") + def test_copy_dir_contents_to_cwd_no_need_override(self, confirm_mock): + with tempfile.TemporaryDirectory() as source: + confirm_mock.return_value = True + Path(source, "file").touch() + Path(source, "file").write_text("hi") + file_paths = _copy_dir_contents_to_cwd(source) + confirm_mock.assert_not_called() + self.assertEqual("hi", Path("file").read_text(encoding="utf-8")) + self.assertEqual([str(Path(".", "file"))], file_paths) + + @patch("samcli.commands.pipeline.init.interactive_init_flow.click.confirm") + def test_copy_dir_contents_to_cwd_override(self, confirm_mock): + with tempfile.TemporaryDirectory() as source: + confirm_mock.return_value = True + Path(source, "file").touch() + Path(source, "file").write_text("hi") + Path("file").touch() + file_paths = _copy_dir_contents_to_cwd(source) + confirm_mock.assert_called_once() + self.assertEqual("hi", Path("file").read_text(encoding="utf-8")) + self.assertEqual([str(Path(".", "file"))], file_paths) + + @patch("samcli.commands.pipeline.init.interactive_init_flow.click.confirm") + def test_copy_dir_contents_to_cwd_not_override(self, confirm_mock): + with tempfile.TemporaryDirectory() as source: + confirm_mock.return_value = False + Path(source, "file").touch() + Path(source, 
"file").write_text("hi")
+            Path("file").touch()
+            file_paths = _copy_dir_contents_to_cwd(source)
+            confirm_mock.assert_called_once()
+            self.assertEqual("", Path("file").read_text(encoding="utf-8"))
+            self.assertEqual([str(Path(".aws-sam", "pipeline", "generated-files", "file"))], file_paths)
diff --git a/tests/unit/commands/pipeline/init/test_pipeline_templates_manifest.py b/tests/unit/commands/pipeline/init/test_pipeline_templates_manifest.py
new file mode 100644
index 0000000000..d35541c3f6
--- /dev/null
+++ b/tests/unit/commands/pipeline/init/test_pipeline_templates_manifest.py
@@ -0,0 +1,82 @@
+from unittest import TestCase
+import os
+from pathlib import Path
+from samcli.commands.pipeline.init.pipeline_templates_manifest import (
+    Provider,
+    PipelineTemplatesManifest,
+    PipelineTemplateMetadata,
+    AppPipelineTemplateManifestException,
+)
+from samcli.lib.utils import osutils
+
+INVALID_YAML_MANIFEST = """
+providers:
+- Jenkins with wrong indentation
+"""
+
+MISSING_KEYS_MANIFEST = """
+NotProviders:
+  - Jenkins
+Templates:
+  - NotName: jenkins-two-environments-pipeline
+    provider: Jenkins
+    location: templates/cookiecutter-jenkins-two-environments-pipeline
+"""
+
+VALID_MANIFEST = """
+providers:
+  - displayName: Jenkins
+    id: jenkins
+  - displayName: Gitlab CI/CD
+    id: gitlab
+  - displayName: Github Actions
+    id: github-actions
+templates:
+  - displayName: jenkins-two-environments-pipeline
+    provider: jenkins
+    location: templates/cookiecutter-jenkins-two-environments-pipeline
+  - displayName: gitlab-two-environments-pipeline
+    provider: gitlab
+    location: templates/cookiecutter-gitlab-two-environments-pipeline
+  - displayName: Github-Actions-two-environments-pipeline
+    provider: github-actions
+    location: templates/cookiecutter-github-actions-two-environments-pipeline
+"""
+
+
+class TestCli(TestCase):
+    def test_manifest_file_not_found(self):
+        non_existing_path = Path(os.path.normpath("/any/non/existing/manifest.yaml"))
+        with 
self.assertRaises(AppPipelineTemplateManifestException):
+            PipelineTemplatesManifest(manifest_path=non_existing_path)
+
+    def test_invalid_yaml_manifest_file(self):
+        with osutils.mkdir_temp(ignore_errors=True) as tempdir:
+            manifest_path = os.path.normpath(os.path.join(tempdir, "manifest.yaml"))
+            with open(manifest_path, "w", encoding="utf-8") as fp:
+                fp.write(INVALID_YAML_MANIFEST)
+            with self.assertRaises(AppPipelineTemplateManifestException):
+                PipelineTemplatesManifest(manifest_path=Path(manifest_path))
+
+    def test_manifest_missing_required_keys(self):
+        with osutils.mkdir_temp(ignore_errors=True) as tempdir:
+            manifest_path = os.path.normpath(os.path.join(tempdir, "manifest.yaml"))
+            with open(manifest_path, "w", encoding="utf-8") as fp:
+                fp.write(MISSING_KEYS_MANIFEST)
+            with self.assertRaises(AppPipelineTemplateManifestException):
+                PipelineTemplatesManifest(manifest_path=Path(manifest_path))
+
+    def test_manifest_happy_case(self):
+        with osutils.mkdir_temp(ignore_errors=True) as tempdir:
+            manifest_path = os.path.normpath(os.path.join(tempdir, "manifest.yaml"))
+            with open(manifest_path, "w", encoding="utf-8") as fp:
+                fp.write(VALID_MANIFEST)
+            manifest = PipelineTemplatesManifest(manifest_path=Path(manifest_path))
+            self.assertEqual(len(manifest.providers), 3)
+            gitlab_provider: Provider = next(p for p in manifest.providers if p.id == "gitlab")
+            self.assertEqual(gitlab_provider.display_name, "Gitlab CI/CD")
+            self.assertEqual(len(manifest.templates), 3)
+            gitlab_template: PipelineTemplateMetadata = next(t for t in manifest.templates if t.provider == "gitlab")
+            self.assertEqual(gitlab_template.display_name, "gitlab-two-environments-pipeline")
+            self.assertEqual(gitlab_template.provider, "gitlab")
+            self.assertEqual(gitlab_template.location, "templates/cookiecutter-gitlab-two-environments-pipeline")
diff --git a/tests/unit/commands/samconfig/test_samconfig.py b/tests/unit/commands/samconfig/test_samconfig.py
index 4b27b01827..2477f21625 100644 
--- a/tests/unit/commands/samconfig/test_samconfig.py +++ b/tests/unit/commands/samconfig/test_samconfig.py @@ -142,6 +142,7 @@ def test_build(self, get_iac_plugin_mock, do_cli_mock): self.assertIsNone(result.exception) do_cli_mock.assert_called_with( + ANY, "foo", str(Path(os.getcwd(), "mytemplate.yaml")), "basedir", @@ -201,6 +202,7 @@ def test_build_with_container_env_vars(self, get_iac_plugin_mock, do_cli_mock): self.assertIsNone(result.exception) do_cli_mock.assert_called_with( + ANY, "foo", str(Path(os.getcwd(), "mytemplate.yaml")), "basedir", @@ -261,6 +263,7 @@ def test_build_with_build_images(self, get_iac_plugin_mock, do_cli_mock): self.assertIsNone(result.exception) do_cli_mock.assert_called_with( + ANY, "foo", str(Path(os.getcwd(), "mytemplate.yaml")), "basedir", @@ -345,6 +348,8 @@ def test_local_invoke(self, get_iac_plugin_mock, do_cli_mock): True, True, {"Key": "Value", "Key2": "Value2"}, + "localhost", + "127.0.0.1", "CFN", iac_mock, project_mock, @@ -415,6 +420,8 @@ def test_local_start_api(self, get_iac_plugin_mock, do_cli_mock): None, False, None, + "localhost", + "127.0.0.1", "CFN", iac_mock, project_mock, @@ -483,6 +490,8 @@ def test_local_start_lambda(self, get_iac_plugin_mock, do_cli_mock): None, False, None, + "localhost", + "127.0.0.1", iac_mock, project_mock, ) @@ -928,6 +937,10 @@ def test_override_with_cli_params(self, get_iac_plugin_mock, do_cli_mock): "--shutdown", "--parameter-overrides", "A=123 C=D E=F12! 
G=H", + "--container-host", + "localhost", + "--container-host-interface", + "127.0.0.1", ], ) @@ -957,6 +970,8 @@ def test_override_with_cli_params(self, get_iac_plugin_mock, do_cli_mock): None, True, None, + "localhost", + "127.0.0.1", iac_mock, project_mock, ) @@ -1056,6 +1071,8 @@ def test_override_with_cli_params_and_envvars(self, get_iac_plugin_mock, do_cli_ None, False, None, + "localhost", + "127.0.0.1", iac_mock, project_mock, ) diff --git a/tests/unit/commands/validate/lib/test_sam_template_validator.py b/tests/unit/commands/validate/lib/test_sam_template_validator.py index b8ce1e3bed..a269278b93 100644 --- a/tests/unit/commands/validate/lib/test_sam_template_validator.py +++ b/tests/unit/commands/validate/lib/test_sam_template_validator.py @@ -9,9 +9,10 @@ class TestSamTemplateValidator(TestCase): + @patch("samcli.commands.validate.lib.sam_template_validator.Session") @patch("samcli.commands.validate.lib.sam_template_validator.Translator") @patch("samcli.commands.validate.lib.sam_template_validator.parser") - def test_is_valid_returns_true(self, sam_parser, sam_translator): + def test_is_valid_returns_true(self, sam_parser, sam_translator, boto_session_patch): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"policy": "SomePolicy"} template = {"a": "b"} @@ -19,24 +20,29 @@ def test_is_valid_returns_true(self, sam_parser, sam_translator): parser = Mock() sam_parser.Parser.return_value = parser + boto_session_mock = Mock() + boto_session_patch.return_value = boto_session_mock + translate_mock = Mock() translate_mock.translate.return_value = {"c": "d"} sam_translator.return_value = translate_mock - validator = SamTemplateValidator(template, managed_policy_mock) + validator = SamTemplateValidator(template, managed_policy_mock, profile="profile", region="region") # Should not throw an Exception validator.is_valid() + boto_session_patch.assert_called_once_with(profile_name="profile", region_name="region") 
sam_translator.assert_called_once_with( - managed_policy_map={"policy": "SomePolicy"}, sam_parser=parser, plugins=[] + managed_policy_map={"policy": "SomePolicy"}, sam_parser=parser, plugins=[], boto_session=boto_session_mock ) translate_mock.translate.assert_called_once_with(sam_template=template, parameter_values={}) sam_parser.Parser.assert_called_once() + @patch("samcli.commands.validate.lib.sam_template_validator.Session") @patch("samcli.commands.validate.lib.sam_template_validator.Translator") @patch("samcli.commands.validate.lib.sam_template_validator.parser") - def test_is_valid_raises_exception(self, sam_parser, sam_translator): + def test_is_valid_raises_exception(self, sam_parser, sam_translator, boto_session_patch): managed_policy_mock = Mock() managed_policy_mock.load.return_value = {"policy": "SomePolicy"} template = {"a": "b"} @@ -44,6 +50,9 @@ def test_is_valid_raises_exception(self, sam_parser, sam_translator): parser = Mock() sam_parser.Parser.return_value = parser + boto_session_mock = Mock() + boto_session_patch.return_value = boto_session_mock + translate_mock = Mock() translate_mock.translate.side_effect = InvalidDocumentException([Exception("message")]) sam_translator.return_value = translate_mock @@ -54,8 +63,10 @@ def test_is_valid_raises_exception(self, sam_parser, sam_translator): validator.is_valid() sam_translator.assert_called_once_with( - managed_policy_map={"policy": "SomePolicy"}, sam_parser=parser, plugins=[] + managed_policy_map={"policy": "SomePolicy"}, sam_parser=parser, plugins=[], boto_session=boto_session_mock ) + + boto_session_patch.assert_called_once_with(profile_name=None, region_name=None) translate_mock.translate.assert_called_once_with(sam_template=template, parameter_values={}) sam_parser.Parser.assert_called_once() diff --git a/tests/unit/commands/validate/test_cli.py b/tests/unit/commands/validate/test_cli.py index 5f9cac7a9a..d354c69327 100644 --- a/tests/unit/commands/validate/test_cli.py +++ 
b/tests/unit/commands/validate/test_cli.py @@ -1,5 +1,6 @@ from unittest import TestCase from unittest.mock import Mock, patch +from collections import namedtuple from botocore.exceptions import NoCredentialsError @@ -8,6 +9,8 @@ from samcli.commands.validate.lib.exceptions import InvalidSamDocumentException from samcli.commands.validate.validate import do_cli, _read_sam_file +ctx_mock = namedtuple("ctx", ["profile", "region"]) + class TestValidateCli(TestCase): @patch("samcli.commands.validate.validate.click") @@ -46,7 +49,7 @@ def test_template_fails_validation(self, read_sam_file_patch, click_patch, templ template_valiadator.return_value = is_valid_mock with self.assertRaises(InvalidSamTemplateException): - do_cli(ctx=None, template=template_path) + do_cli(ctx=ctx_mock(profile="profile", region="region"), template=template_path) @patch("samcli.commands.validate.lib.sam_template_validator.SamTemplateValidator") @patch("samcli.commands.validate.validate.click") @@ -60,7 +63,7 @@ def test_no_credentials_provided(self, read_sam_file_patch, click_patch, templat template_valiadator.return_value = is_valid_mock with self.assertRaises(UserException): - do_cli(ctx=None, template=template_path) + do_cli(ctx=ctx_mock(profile="profile", region="region"), template=template_path) @patch("samcli.commands.validate.lib.sam_template_validator.SamTemplateValidator") @patch("samcli.commands.validate.validate.click") @@ -73,4 +76,4 @@ def test_template_passes_validation(self, read_sam_file_patch, click_patch, temp is_valid_mock.is_valid.return_value = True template_valiadator.return_value = is_valid_mock - do_cli(ctx=None, template=template_path) + do_cli(ctx=ctx_mock(profile="profile", region="region"), template=template_path) diff --git a/tests/unit/lib/bootstrap/test_bootstrap.py b/tests/unit/lib/bootstrap/test_bootstrap.py index 8094a404c0..e62ad26a5c 100644 --- a/tests/unit/lib/bootstrap/test_bootstrap.py +++ b/tests/unit/lib/bootstrap/test_bootstrap.py @@ -1,23 +1,45 @@ from 
unittest import TestCase -from unittest.mock import patch +from unittest.mock import patch, MagicMock -from samcli.commands.exceptions import UserException -from samcli.lib.bootstrap.bootstrap import manage_stack +from samcli.commands.exceptions import UserException, CredentialsError +from samcli.lib.bootstrap.bootstrap import manage_stack, StackOutput, get_current_account_id class TestBootstrapManagedStack(TestCase): @patch("samcli.lib.bootstrap.bootstrap.manage_cloudformation_stack") def test_stack_missing_bucket(self, manage_cfn_stack_mock): - manage_cfn_stack_mock.return_value = [] + manage_cfn_stack_mock.return_value = StackOutput(stack_output=[]) with self.assertRaises(UserException): manage_stack("testProfile", "fakeRegion") - manage_cfn_stack_mock.return_value = [{"OutputKey": "NotSourceBucket", "OutputValue": "AnyValue"}] + manage_cfn_stack_mock.return_value = StackOutput( + stack_output=[{"OutputKey": "NotSourceBucket", "OutputValue": "AnyValue"}] + ) with self.assertRaises(UserException): manage_stack("testProfile", "fakeRegion") @patch("samcli.lib.bootstrap.bootstrap.manage_cloudformation_stack") def test_manage_stack_happy_case(self, manage_cfn_stack_mock): expected_bucket_name = "BucketName" - manage_cfn_stack_mock.return_value = [{"OutputKey": "SourceBucket", "OutputValue": expected_bucket_name}] + manage_cfn_stack_mock.return_value = StackOutput( + stack_output=[{"OutputKey": "SourceBucket", "OutputValue": expected_bucket_name}] + ) actual_bucket_name = manage_stack("testProfile", "fakeRegion") self.assertEqual(actual_bucket_name, expected_bucket_name) + + @patch("samcli.lib.bootstrap.bootstrap.boto3") + def test_get_current_account_id(self, boto3_mock): + session_mock = boto3_mock.Session.return_value = MagicMock() + sts_mock = MagicMock() + sts_mock.get_caller_identity.return_value = {"Account": 1234567890} + session_mock.client.return_value = sts_mock + account_id = get_current_account_id() + self.assertEqual(account_id, 1234567890) + + 
@patch("samcli.lib.bootstrap.bootstrap.boto3") + def test_get_current_account_id_missing_id(self, boto3_mock): + session_mock = boto3_mock.Session.return_value = MagicMock() + sts_mock = MagicMock() + sts_mock.get_caller_identity.return_value = {} + session_mock.client.return_value = sts_mock + with self.assertRaises(CredentialsError): + get_current_account_id() diff --git a/tests/unit/lib/build_module/test_build_graph.py b/tests/unit/lib/build_module/test_build_graph.py index ff898a4619..73c79267a8 100644 --- a/tests/unit/lib/build_module/test_build_graph.py +++ b/tests/unit/lib/build_module/test_build_graph.py @@ -26,7 +26,7 @@ InvalidBuildGraphException, LayerBuildDefinition, ) -from samcli.lib.providers.provider import Function +from samcli.lib.providers.provider import Function, LayerVersion from samcli.lib.utils import osutils from samcli.lib.utils.packagetype import ZIP @@ -47,10 +47,13 @@ def generate_function( layers="layers", events="events", codesign_config_arn="codesign_config_arn", - metadata={}, + metadata=None, inlinecode=None, stack_path="", ): + if metadata is None: + metadata = {} + return Function( name, name, @@ -74,6 +77,21 @@ def generate_function( ) +def generate_layer( + arn="arn:aws:lambda:region:account-id:layer:layer-name:1", + codeuri="codeuri", + compatible_runtimes=None, + metadata=None, + stack_path="", +): + if compatible_runtimes is None: + compatible_runtimes = ["runtime"] + if metadata is None: + metadata = {} + + return LayerVersion("", arn, codeuri, compatible_runtimes, metadata, stack_path) + + class TestConversionFunctions(TestCase): def test_function_build_definition_to_toml_table(self): build_definition = FunctionBuildDefinition( @@ -152,8 +170,11 @@ def test_toml_table_to_layer_build_definition(self): class TestBuildGraph(TestCase): CODEURI = "hello_world_python/" + LAYER_CODEURI = "sum_layer/" + LAYER_NAME = "SumLayer" ZIP = ZIP RUNTIME = "python3.8" + LAYER_RUNTIME = "nodejs12.x" METADATA = {"Test": "hello", "Test2": 
"world"} UUID = "3c1c254e-cd4b-4d94-8c74-7ab870b36063" LAYER_UUID = "7dnc257e-cd4b-4d94-8c74-7ab870b3abc3" @@ -176,10 +197,10 @@ class TestBuildGraph(TestCase): [layer_build_definitions] [layer_build_definitions.{LAYER_UUID}] - layer_name = "SumLayer" - codeuri = "sum_layer/" - build_method = "nodejs12.x" - compatible_runtimes = ["nodejs12.x"] + layer_name = "{LAYER_NAME}" + codeuri = "{LAYER_CODEURI}" + build_method = "{LAYER_RUNTIME}" + compatible_runtimes = ["{LAYER_RUNTIME}"] source_md5 = "{SOURCE_MD5}" layer = "SumLayer" [layer_build_definitions.{LAYER_UUID}.env_vars] @@ -198,6 +219,7 @@ def test_should_instantiate_first_time(self): self.assertEqual( build_graph1.get_function_build_definitions(), build_graph2.get_function_build_definitions() ) + self.assertEqual(build_graph1.get_layer_build_definitions(), build_graph2.get_layer_build_definitions()) def test_should_instantiate_first_time_and_update(self): with osutils.mkdir_temp() as temp_base_dir: @@ -206,7 +228,7 @@ def test_should_instantiate_first_time_and_update(self): # create a build graph and persist it build_graph1 = BuildGraph(str(build_dir)) - build_definition1 = FunctionBuildDefinition( + function_build_definition1 = FunctionBuildDefinition( TestBuildGraph.RUNTIME, TestBuildGraph.CODEURI, TestBuildGraph.ZIP, @@ -217,7 +239,22 @@ def test_should_instantiate_first_time_and_update(self): function1 = generate_function( runtime=TestBuildGraph.RUNTIME, codeuri=TestBuildGraph.CODEURI, metadata=TestBuildGraph.METADATA ) - build_graph1.put_function_build_definition(build_definition1, function1) + build_graph1.put_function_build_definition(function_build_definition1, function1) + layer_build_definition1 = LayerBuildDefinition( + TestBuildGraph.LAYER_NAME, + TestBuildGraph.LAYER_CODEURI, + TestBuildGraph.LAYER_RUNTIME, + [TestBuildGraph.LAYER_RUNTIME], + TestBuildGraph.SOURCE_MD5, + TestBuildGraph.ENV_VARS, + ) + layer1 = generate_layer( + compatible_runtimes=[TestBuildGraph.RUNTIME], + 
codeuri=TestBuildGraph.LAYER_CODEURI, + metadata=TestBuildGraph.METADATA, + ) + build_graph1.put_layer_build_definition(layer_build_definition1, layer1) + build_graph1.clean_redundant_definitions_and_update(True) # read previously persisted graph and compare @@ -225,10 +262,17 @@ def test_should_instantiate_first_time_and_update(self): self.assertEqual( len(build_graph1.get_function_build_definitions()), len(build_graph2.get_function_build_definitions()) ) + self.assertEqual( + len(build_graph1.get_layer_build_definitions()), len(build_graph2.get_layer_build_definitions()) + ) self.assertEqual( list(build_graph1.get_function_build_definitions())[0], list(build_graph2.get_function_build_definitions())[0], ) + self.assertEqual( + list(build_graph1.get_layer_build_definitions())[0], + list(build_graph2.get_layer_build_definitions())[0], + ) def test_should_read_existing_build_graph(self): with osutils.mkdir_temp() as temp_base_dir: @@ -239,13 +283,20 @@ def test_should_read_existing_build_graph(self): build_graph_path.write_text(TestBuildGraph.BUILD_GRAPH_CONTENTS) build_graph = BuildGraph(str(build_dir)) - for build_definition in build_graph.get_function_build_definitions(): - self.assertEqual(build_definition.codeuri, TestBuildGraph.CODEURI) - self.assertEqual(build_definition.runtime, TestBuildGraph.RUNTIME) - self.assertEqual(build_definition.packagetype, TestBuildGraph.ZIP) - self.assertEqual(build_definition.metadata, TestBuildGraph.METADATA) - self.assertEqual(build_definition.source_md5, TestBuildGraph.SOURCE_MD5) - self.assertEqual(build_definition.env_vars, TestBuildGraph.ENV_VARS) + for function_build_definition in build_graph.get_function_build_definitions(): + self.assertEqual(function_build_definition.codeuri, TestBuildGraph.CODEURI) + self.assertEqual(function_build_definition.runtime, TestBuildGraph.RUNTIME) + self.assertEqual(function_build_definition.packagetype, TestBuildGraph.ZIP) + self.assertEqual(function_build_definition.metadata, 
TestBuildGraph.METADATA) + self.assertEqual(function_build_definition.source_md5, TestBuildGraph.SOURCE_MD5) + self.assertEqual(function_build_definition.env_vars, TestBuildGraph.ENV_VARS) + + for layer_build_definition in build_graph.get_layer_build_definitions(): + self.assertEqual(layer_build_definition.name, TestBuildGraph.LAYER_NAME) + self.assertEqual(layer_build_definition.codeuri, TestBuildGraph.LAYER_CODEURI) + self.assertEqual(layer_build_definition.build_method, TestBuildGraph.LAYER_RUNTIME) + self.assertEqual(layer_build_definition.compatible_runtimes, [TestBuildGraph.LAYER_RUNTIME]) + self.assertEqual(layer_build_definition.env_vars, TestBuildGraph.ENV_VARS) def test_functions_should_be_added_existing_build_graph(self): with osutils.mkdir_temp() as temp_base_dir: @@ -266,15 +317,17 @@ def test_functions_should_be_added_existing_build_graph(self): TestBuildGraph.ENV_VARS, ) function1 = generate_function( - runtime=TestBuildGraph.RUNTIME, codeuri=TestBuildGraph.CODEURI, metadata=TestBuildGraph.METADATA + runtime=TestBuildGraph.RUNTIME, + codeuri=TestBuildGraph.CODEURI, + metadata=TestBuildGraph.METADATA, ) build_graph.put_function_build_definition(build_definition1, function1) - self.assertTrue(len(build_graph.get_function_build_definitions()), 1) - for build_definition in build_graph.get_function_build_definitions(): - self.assertTrue(len(build_definition.functions), 1) - self.assertTrue(build_definition.functions[0], function1) - self.assertEqual(build_definition.uuid, TestBuildGraph.UUID) + build_definitions = build_graph.get_function_build_definitions() + self.assertEqual(len(build_definitions), 1) + self.assertEqual(len(build_definitions[0].functions), 1) + self.assertEqual(build_definitions[0].functions[0], function1) + self.assertEqual(build_definitions[0].uuid, TestBuildGraph.UUID) build_definition2 = FunctionBuildDefinition( "another_runtime", @@ -286,7 +339,56 @@ def test_functions_should_be_added_existing_build_graph(self): ) function2 = 
generate_function(name="another_function") build_graph.put_function_build_definition(build_definition2, function2) - self.assertTrue(len(build_graph.get_function_build_definitions()), 2) + + build_definitions = build_graph.get_function_build_definitions() + self.assertEqual(len(build_definitions), 2) + self.assertEqual(len(build_definitions[1].functions), 1) + self.assertEqual(build_definitions[1].functions[0], function2) + + def test_layers_should_be_added_existing_build_graph(self): + with osutils.mkdir_temp() as temp_base_dir: + build_dir = Path(temp_base_dir, ".aws-sam", "build") + build_dir.mkdir(parents=True) + + build_graph_path = Path(build_dir.parent, "build.toml") + build_graph_path.write_text(TestBuildGraph.BUILD_GRAPH_CONTENTS) + + build_graph = BuildGraph(str(build_dir)) + + build_definition1 = LayerBuildDefinition( + TestBuildGraph.LAYER_NAME, + TestBuildGraph.LAYER_CODEURI, + TestBuildGraph.LAYER_RUNTIME, + [TestBuildGraph.LAYER_RUNTIME], + TestBuildGraph.SOURCE_MD5, + TestBuildGraph.ENV_VARS, + ) + layer1 = generate_layer( + compatible_runtimes=[TestBuildGraph.RUNTIME], + codeuri=TestBuildGraph.LAYER_CODEURI, + metadata=TestBuildGraph.METADATA, + ) + build_graph.put_layer_build_definition(build_definition1, layer1) + + build_definitions = build_graph.get_layer_build_definitions() + self.assertEqual(len(build_definitions), 1) + self.assertEqual(build_definitions[0].layer, layer1) + self.assertEqual(build_definitions[0].uuid, TestBuildGraph.LAYER_UUID) + + build_definition2 = LayerBuildDefinition( + "another_layername", + "another_codeuri", + "another_runtime", + ["another_runtime"], + "another_source_md5", + {"env_vars": "value2"}, + ) + layer2 = generate_layer(arn="arn:aws:lambda:region:account-id:layer:another-layer-name:1") + build_graph.put_layer_build_definition(build_definition2, layer2) + + build_definitions = build_graph.get_layer_build_definitions() + self.assertEqual(len(build_definitions), 2) + self.assertEqual(build_definitions[1].layer, 
layer2) class TestBuildDefinition(TestCase): diff --git a/tests/unit/lib/build_module/test_build_strategy.py b/tests/unit/lib/build_module/test_build_strategy.py index 7e9902a172..1fae5b7962 100644 --- a/tests/unit/lib/build_module/test_build_strategy.py +++ b/tests/unit/lib/build_module/test_build_strategy.py @@ -1,3 +1,4 @@ +from copy import deepcopy from unittest import TestCase from unittest.mock import Mock, patch, MagicMock, call, ANY @@ -218,11 +219,15 @@ def test_build_single_function_definition_image_functions_with_same_metadata(sel function2.name = "Function2" function2.full_path = "Function2" function2.packagetype = IMAGE - build_definition = FunctionBuildDefinition("3.7", "codeuri", IMAGE, {}) + build_definition = FunctionBuildDefinition("3.7", "codeuri", IMAGE, {}, env_vars={"FOO": "BAR"}) # since they have the same metadata, they are put into the same build_definition. build_definition.functions = [function1, function2] - result = default_build_strategy.build_single_function_definition(build_definition) + with patch("samcli.lib.build.build_strategy.deepcopy", wraps=deepcopy) as patched_deepcopy: + result = default_build_strategy.build_single_function_definition(build_definition) + + patched_deepcopy.assert_called_with(build_definition.env_vars) + # both of the function name should show up in results self.assertEqual(result, {"Function": built_image, "Function2": built_image}) diff --git a/tests/unit/lib/cookiecutter/test_interactive_flow.py b/tests/unit/lib/cookiecutter/test_interactive_flow.py index ed52626451..47ed0ec2b6 100644 --- a/tests/unit/lib/cookiecutter/test_interactive_flow.py +++ b/tests/unit/lib/cookiecutter/test_interactive_flow.py @@ -1,3 +1,4 @@ +from pathlib import Path from unittest import TestCase from unittest.mock import patch from samcli.lib.cookiecutter.interactive_flow import InteractiveFlow @@ -49,3 +50,26 @@ def test_run(self, mock_3rd_q, mock_2nd_q, mock_1st_q): mock_3rd_q.assert_called_once() 
self.assertEqual(expected_context, actual_context) self.assertIsNot(actual_context, initial_context) # shouldn't modify the input, it should copy and return new + + @patch.object(Question, "ask") + @patch.object(Confirm, "ask") + @patch.object(Choice, "ask") + def test_run_with_preloaded_default_values(self, mock_3rd_q, mock_2nd_q, mock_1st_q): + + mock_1st_q.return_value = "answer1" + mock_2nd_q.return_value = False + mock_3rd_q.return_value = "option1" + + initial_context = {"key": "value", "['beta', 'bootstrap', 'x']": "y"} + + actual_context = self.flow.run(initial_context) + + mock_1st_q.assert_called_once() + mock_2nd_q.assert_called_once() + mock_3rd_q.assert_called_once() + + self.assertEqual( + {"1st": "answer1", "2nd": False, "3rd": "option1", "['beta', 'bootstrap', 'x']": "y", "key": "value"}, + actual_context, + ) + self.assertIsNot(actual_context, initial_context) # shouldn't modify the input, it should copy and return new diff --git a/tests/unit/lib/cookiecutter/test_question.py b/tests/unit/lib/cookiecutter/test_question.py index e59a76b782..2db7055357 100644 --- a/tests/unit/lib/cookiecutter/test_question.py +++ b/tests/unit/lib/cookiecutter/test_question.py @@ -1,5 +1,9 @@ +from typing import List, Union, Dict from unittest import TestCase -from unittest.mock import ANY, patch +from unittest.mock import ANY, patch, Mock + +from parameterized import parameterized + from samcli.lib.cookiecutter.question import Question, QuestionKind, Choice, Confirm, Info, QuestionFactory @@ -23,6 +27,19 @@ def setUp(self): key=self._ANY_KEY, default=self._ANY_ANSWER, is_required=True, + allow_autofill=False, + next_question_map=self._ANY_NEXT_QUESTION_MAP, + default_next_question_key=self._ANY_DEFAULT_NEXT_QUESTION_KEY, + ) + + def get_question_with_default_from_cookiecutter_context_using_keypath( + self, key_path: List[Union[str, Dict]] + ) -> Question: + return Question( + text=self._ANY_TEXT, + key=self._ANY_KEY, + default={"keyPath": key_path}, + 
is_required=True, next_question_map=self._ANY_NEXT_QUESTION_MAP, default_next_question_key=self._ANY_DEFAULT_NEXT_QUESTION_KEY, ) @@ -61,10 +78,90 @@ def test_get_next_question_key(self): @patch("samcli.lib.cookiecutter.question.click") def test_ask(self, mock_click): mock_click.prompt.return_value = self._ANY_ANSWER - answer = self.question.ask() + answer = self.question.ask({}) self.assertEqual(answer, self._ANY_ANSWER) mock_click.prompt.assert_called_once_with(text=self.question.text, default=self.question.default_answer) + @patch("samcli.lib.cookiecutter.question.click") + def test_ask_resolves_from_cookiecutter_context(self, mock_click): + # Setup + expected_default_value = Mock() + previous_question_key = "this is a question" + previous_question_answer = "this is an answer" + context = { + "['x', 'this is an answer']": expected_default_value, + previous_question_key: previous_question_answer, + } + question = self.get_question_with_default_from_cookiecutter_context_using_keypath( + ["x", {"valueOf": previous_question_key}] + ) + + # Trigger + question.ask(context=context) + + # Verify + mock_click.prompt.assert_called_once_with(text=self.question.text, default=expected_default_value) + + @patch("samcli.lib.cookiecutter.question.click") + def test_ask_resolves_from_cookiecutter_context_non_exist_key_path(self, mock_click): + # Setup + context = {} + question = self.get_question_with_default_from_cookiecutter_context_using_keypath(["y"]) + + # Trigger + question.ask(context=context) + + # Verify + mock_click.prompt.assert_called_once_with(text=self.question.text, default=None) + + def test_ask_resolves_from_cookiecutter_context_non_exist_question_key(self): + # Setup + expected_default_value = Mock() + previous_question_key = "this is a question" + previous_question_answer = "this is an answer" + context = { + "['x', 'this is an answer']": expected_default_value, + previous_question_key: previous_question_answer, + } + question = 
self.get_question_with_default_from_cookiecutter_context_using_keypath( + ["x", {"valueOf": "non_exist_question_key"}] + ) + + # Trigger + with self.assertRaises(KeyError): + question.ask(context=context) + + @parameterized.expand([("this should have been a list"), ([1],), ({},)]) + def test_ask_resolves_from_cookiecutter_context_with_key_path_not_a_list(self, key_path): + # Setup + context = {} + question = self.get_question_with_default_from_cookiecutter_context_using_keypath(key_path) + + # Trigger + with self.assertRaises(ValueError): + question.ask(context=context) + + @parameterized.expand([({"keyPath123": Mock()},), ({"keyPath": [{"valueOf123": Mock()}]},)]) + def test_ask_resolves_from_cookiecutter_context_with_default_object_missing_keys(self, default_object): + # Setup + context = {} + question = self.get_question_with_default_from_cookiecutter_context_using_keypath([]) + question._default_answer = default_object + + # Trigger + with self.assertRaises(KeyError): + question.ask(context=context) + + def test_question_allow_autofill_with_default_value(self): + q = Question(text=self._ANY_TEXT, key=self._ANY_KEY, is_required=True, allow_autofill=True, default="123") + self.assertEqual("123", q.ask()) + + @patch("samcli.lib.cookiecutter.question.click") + def test_question_allow_autofill_without_default_value(self, click_mock): + answer_mock = click_mock.prompt.return_value = Mock() + q = Question(text=self._ANY_TEXT, key=self._ANY_KEY, is_required=True, allow_autofill=True) + self.assertEqual(answer_mock, q.ask()) + class TestChoice(TestCase): def setUp(self): @@ -99,10 +196,14 @@ def test_get_options_indexes_with_different_bases(self): @patch("samcli.lib.cookiecutter.question.click") def test_ask(self, mock_click, mock_choice): mock_click.prompt.return_value = 2 - answer = self.question.ask() + answer = self.question.ask({}) self.assertEqual(answer, TestQuestion._ANY_OPTIONS[1]) # we deduct one from user's choice (base 1 vs base 0)
mock_click.prompt.assert_called_once_with( - text="Choice", default=self.question.default_answer, show_choices=False, type=ANY + text="Choice", + default=self.question.default_answer, + show_choices=False, + type=ANY, + show_default=self.question.default_answer is not None, ) mock_choice.assert_called_once_with(["1", "2", "3"]) @@ -112,7 +213,7 @@ class TestInfo(TestCase): def test_ask(self, mock_click): q = Info(text=TestQuestion._ANY_TEXT, key=TestQuestion._ANY_KEY) mock_click.echo.return_value = None - answer = q.ask() + answer = q.ask({}) self.assertIsNone(answer) mock_click.echo.assert_called_once_with(message=q.text) @@ -122,7 +223,7 @@ class TestConfirm(TestCase): def test_ask(self, mock_click): q = Confirm(text=TestQuestion._ANY_TEXT, key=TestQuestion._ANY_KEY) mock_click.confirm.return_value = True - answer = q.ask() + answer = q.ask({}) self.assertTrue(answer) mock_click.confirm.assert_called_once_with(text=q.text) diff --git a/tests/unit/lib/cookiecutter/test_template.py b/tests/unit/lib/cookiecutter/test_template.py index edb7412f59..318939f46b 100644 --- a/tests/unit/lib/cookiecutter/test_template.py +++ b/tests/unit/lib/cookiecutter/test_template.py @@ -114,11 +114,16 @@ def test_generate_project(self, mock_preprocessor, mock_postprocessor, mock_inte postprocessors=[mock_postprocessor], ) mock_preprocessor.run.return_value = self._ANY_PROCESSOR_CONTEXT - t.generate_project(context=self._ANY_INTERACTIVE_FLOW_CONTEXT) + output_dir = Mock() + t.generate_project(context=self._ANY_INTERACTIVE_FLOW_CONTEXT, output_dir=output_dir) mock_interactive_flow.run.assert_not_called() mock_preprocessor.run.assert_called_once_with(self._ANY_INTERACTIVE_FLOW_CONTEXT) mock_cookiecutter.assert_called_with( - template=self._ANY_LOCATION, output_dir=".", no_input=True, extra_context=self._ANY_PROCESSOR_CONTEXT + template=self._ANY_LOCATION, + output_dir=output_dir, + no_input=True, + extra_context=self._ANY_PROCESSOR_CONTEXT, + overwrite_if_exists=True, ) 
mock_postprocessor.run.assert_called_once_with(self._ANY_PROCESSOR_CONTEXT) @@ -127,7 +132,7 @@ def test_generate_project_preprocessors_exceptions(self, mock_preprocessor): t = Template(location=self._ANY_LOCATION, preprocessors=[mock_preprocessor]) with self.assertRaises(PreprocessingError): mock_preprocessor.run.side_effect = Exception("something went wrong") - t.generate_project({}) + t.generate_project({}, Mock()) @patch("samcli.lib.cookiecutter.template.cookiecutter") @patch("samcli.lib.cookiecutter.processor") @@ -135,7 +140,7 @@ def test_generate_project_postprocessors_exceptions(self, mock_postprocessor, mo t = Template(location=self._ANY_LOCATION, postprocessors=[mock_postprocessor]) with self.assertRaises(PostprocessingError): mock_postprocessor.run.side_effect = Exception("something went wrong") - t.generate_project({}) + t.generate_project({}, Mock()) @patch("samcli.lib.cookiecutter.template.generate_non_cookiecutter_project") @patch("samcli.lib.cookiecutter.template.cookiecutter") @@ -143,13 +148,13 @@ def test_generate_project_cookiecutter_exceptions(self, mock_cookiecutter, mock_ t = Template(location=self._ANY_LOCATION) with self.assertRaises(InvalidLocationError): mock_cookiecutter.side_effect = UnknownRepoType() - t.generate_project({}) + t.generate_project({}, Mock()) mock_cookiecutter.reset_mock() with self.assertRaises(GenerateProjectFailedError): mock_cookiecutter.side_effect = Exception("something went wrong") - t.generate_project({}) + t.generate_project({}, Mock()) mock_cookiecutter.reset_mock() # if the provided template is not a cookiecutter template, we generate a non cookiecutter template mock_cookiecutter.side_effect = RepositoryNotFound() - t.generate_project({}) + t.generate_project({}, Mock()) mock_generate_non_cookiecutter_project.assert_called_once() diff --git a/tests/unit/lib/logs/test_fetcher.py b/tests/unit/lib/logs/test_fetcher.py deleted file mode 100644 index c0b634c008..0000000000 --- a/tests/unit/lib/logs/test_fetcher.py 
+++ /dev/null @@ -1,255 +0,0 @@ -import copy -import datetime -import botocore.session - -from unittest import TestCase -from unittest.mock import Mock, patch, call, ANY -from botocore.stub import Stubber - -from samcli.lib.logs.fetcher import LogsFetcher -from samcli.lib.logs.event import LogEvent -from samcli.lib.utils.time import to_timestamp, to_datetime - - -class TestLogsFetcher_fetch(TestCase): - def setUp(self): - - real_client = botocore.session.get_session().create_client("logs", region_name="us-east-1") - self.client_stubber = Stubber(real_client) - self.fetcher = LogsFetcher(real_client) - - self.log_group_name = "name" - self.stream_name = "stream name" - self.timestamp = to_timestamp(datetime.datetime.utcnow()) - - self.mock_api_response = { - "events": [ - { - "eventId": "id1", - "ingestionTime": 0, - "logStreamName": self.stream_name, - "message": "message 1", - "timestamp": self.timestamp, - }, - { - "eventId": "id2", - "ingestionTime": 0, - "logStreamName": self.stream_name, - "message": "message 2", - "timestamp": self.timestamp, - }, - ] - } - - self.expected_events = [ - LogEvent( - self.log_group_name, - { - "eventId": "id1", - "ingestionTime": 0, - "logStreamName": self.stream_name, - "message": "message 1", - "timestamp": self.timestamp, - }, - ), - LogEvent( - self.log_group_name, - { - "eventId": "id2", - "ingestionTime": 0, - "logStreamName": self.stream_name, - "message": "message 2", - "timestamp": self.timestamp, - }, - ), - ] - - def test_must_fetch_logs_for_log_group(self): - expected_params = {"logGroupName": self.log_group_name, "interleaved": True} - - # Configure the stubber to return the configured response. 
The stubber also verifies - # that input params were provided as expected - self.client_stubber.add_response("filter_log_events", self.mock_api_response, expected_params) - - with self.client_stubber: - events_iterable = self.fetcher.fetch(self.log_group_name) - - actual_result = list(events_iterable) - self.assertEqual(self.expected_events, actual_result) - - def test_must_fetch_logs_with_all_params(self): - pattern = "foobar" - start = datetime.datetime.utcnow() - end = datetime.datetime.utcnow() - - expected_params = { - "logGroupName": self.log_group_name, - "interleaved": True, - "startTime": to_timestamp(start), - "endTime": to_timestamp(end), - "filterPattern": pattern, - } - - self.client_stubber.add_response("filter_log_events", self.mock_api_response, expected_params) - - with self.client_stubber: - events_iterable = self.fetcher.fetch(self.log_group_name, start=start, end=end, filter_pattern=pattern) - - actual_result = list(events_iterable) - self.assertEqual(self.expected_events, actual_result) - - def test_must_paginate_using_next_token(self): - """Make three API calls, first two returns a nextToken and last does not.""" - token = "token" - expected_params = {"logGroupName": self.log_group_name, "interleaved": True} - expected_params_with_token = {"logGroupName": self.log_group_name, "interleaved": True, "nextToken": token} - - mock_response_with_token = copy.deepcopy(self.mock_api_response) - mock_response_with_token["nextToken"] = token - - # Call 1 returns a token. Also when first call is made, token is **not** passed as API params - self.client_stubber.add_response("filter_log_events", mock_response_with_token, expected_params) - - # Call 2 returns a token - self.client_stubber.add_response("filter_log_events", mock_response_with_token, expected_params_with_token) - - # Call 3 DOES NOT return a token. This will terminate the loop. 
- self.client_stubber.add_response("filter_log_events", self.mock_api_response, expected_params_with_token) - - # Same data was returned in each API call - expected_events_result = self.expected_events + self.expected_events + self.expected_events - - with self.client_stubber: - events_iterable = self.fetcher.fetch(self.log_group_name) - - actual_result = list(events_iterable) - self.assertEqual(expected_events_result, actual_result) - - -class TestLogsFetcher_tail(TestCase): - def setUp(self): - - self.fetcher = LogsFetcher(Mock()) - - self.log_group_name = "name" - self.filter_pattern = "pattern" - - self.start_time = to_datetime(10) - self.max_retries = 3 - self.poll_interval = 1 - - self.mock_events1 = [ - LogEvent(self.log_group_name, {"timestamp": 11}), - LogEvent(self.log_group_name, {"timestamp": 12}), - ] - self.mock_events2 = [ - LogEvent(self.log_group_name, {"timestamp": 13}), - LogEvent(self.log_group_name, {"timestamp": 14}), - ] - self.mock_events_empty = [] - - @patch("samcli.lib.logs.fetcher.time") - def test_must_tail_logs_with_single_data_fetch(self, time_mock): - - self.fetcher.fetch = Mock() - - self.fetcher.fetch.side_effect = [ - self.mock_events1, - # Return empty data for `max_retries` number of polls - self.mock_events_empty, - self.mock_events_empty, - self.mock_events_empty, - ] - - expected_fetch_calls = [ - # First fetch returns data - call(ANY, start=self.start_time, filter_pattern=self.filter_pattern), - # Three empty fetches - call(ANY, start=to_datetime(13), filter_pattern=self.filter_pattern), - call(ANY, start=to_datetime(13), filter_pattern=self.filter_pattern), - call(ANY, start=to_datetime(13), filter_pattern=self.filter_pattern), - ] - - # One per poll - expected_sleep_calls = [call(self.poll_interval) for i in expected_fetch_calls] - - result_itr = self.fetcher.tail( - self.log_group_name, - start=self.start_time, - filter_pattern=self.filter_pattern, - max_retries=self.max_retries, - poll_interval=self.poll_interval, - ) - 
- self.assertEqual(self.mock_events1, list(result_itr)) - self.assertEqual(expected_fetch_calls, self.fetcher.fetch.call_args_list) - self.assertEqual(expected_sleep_calls, time_mock.sleep.call_args_list) - - @patch("samcli.lib.logs.fetcher.time") - def test_must_tail_logs_with_multiple_data_fetches(self, time_mock): - - self.fetcher.fetch = Mock() - - self.fetcher.fetch.side_effect = [ - self.mock_events1, - # Just one empty fetch - self.mock_events_empty, - # This fetch returns data - self.mock_events2, - # Return empty data for `max_retries` number of polls - self.mock_events_empty, - self.mock_events_empty, - self.mock_events_empty, - ] - - expected_fetch_calls = [ - # First fetch returns data - call(ANY, start=self.start_time, filter_pattern=self.filter_pattern), - # This fetch was empty - call(ANY, start=to_datetime(13), filter_pattern=self.filter_pattern), - # This fetch returned data - call(ANY, start=to_datetime(13), filter_pattern=self.filter_pattern), - # Three empty fetches - call(ANY, start=to_datetime(15), filter_pattern=self.filter_pattern), - call(ANY, start=to_datetime(15), filter_pattern=self.filter_pattern), - call(ANY, start=to_datetime(15), filter_pattern=self.filter_pattern), - ] - - # One per poll - expected_sleep_calls = [call(self.poll_interval) for i in expected_fetch_calls] - - result_itr = self.fetcher.tail( - self.log_group_name, - start=self.start_time, - filter_pattern=self.filter_pattern, - max_retries=self.max_retries, - poll_interval=self.poll_interval, - ) - - self.assertEqual(self.mock_events1 + self.mock_events2, list(result_itr)) - self.assertEqual(expected_fetch_calls, self.fetcher.fetch.call_args_list) - self.assertEqual(expected_sleep_calls, time_mock.sleep.call_args_list) - - @patch("samcli.lib.logs.fetcher.time") - def test_without_start_time(self, time_mock): - - self.fetcher.fetch = Mock() - - self.fetcher.fetch.return_value = self.mock_events_empty - - expected_fetch_calls = [ - # Three empty fetches, all with default 
start time - call(ANY, start=to_datetime(0), filter_pattern=ANY), - call(ANY, start=to_datetime(0), filter_pattern=ANY), - call(ANY, start=to_datetime(0), filter_pattern=ANY), - ] - - result_itr = self.fetcher.tail( - self.log_group_name, - filter_pattern=self.filter_pattern, - max_retries=self.max_retries, - poll_interval=self.poll_interval, - ) - - self.assertEqual([], list(result_itr)) - self.assertEqual(expected_fetch_calls, self.fetcher.fetch.call_args_list) diff --git a/tests/unit/lib/logs/test_formatter.py b/tests/unit/lib/logs/test_formatter.py deleted file mode 100644 index b30fd49c71..0000000000 --- a/tests/unit/lib/logs/test_formatter.py +++ /dev/null @@ -1,164 +0,0 @@ -import json - -from unittest import TestCase -from unittest.mock import Mock, patch, call -from parameterized import parameterized - -from samcli.lib.logs.formatter import LogsFormatter, LambdaLogMsgFormatters, KeywordHighlighter, JSONMsgFormatter -from samcli.lib.logs.event import LogEvent - - -class TestLogsFormatter_pretty_print_event(TestCase): - def setUp(self): - self.colored_mock = Mock() - self.group_name = "group name" - self.stream_name = "stream name" - self.message = "message" - self.event_dict = {"timestamp": 1, "message": self.message, "logStreamName": self.stream_name} - - def test_must_serialize_event(self): - colored_timestamp = "colored timestamp" - colored_stream_name = "colored stream name" - self.colored_mock.yellow.return_value = colored_timestamp - self.colored_mock.cyan.return_value = colored_stream_name - - event = LogEvent(self.group_name, self.event_dict) - - expected = " ".join([colored_stream_name, colored_timestamp, self.message]) - result = LogsFormatter._pretty_print_event(event, self.colored_mock) - - self.assertEqual(expected, result) - self.colored_mock.yellow.has_calls() - self.colored_mock.cyan.assert_called_with(self.stream_name) - - -def _passthru_formatter(event, colored): - return event - - -class TestLogsFormatter_do_format(TestCase): - def 
setUp(self): - self.colored_mock = Mock() - - # Set formatter chain method to return the input unaltered. - self.chain_method1 = Mock(wraps=_passthru_formatter) - self.chain_method2 = Mock(wraps=_passthru_formatter) - self.chain_method3 = Mock(wraps=_passthru_formatter) - - self.formatter_chain = [self.chain_method1, self.chain_method2, self.chain_method3] - - @patch.object(LogsFormatter, "_pretty_print_event", wraps=_passthru_formatter) - def test_must_map_formatters_sequentially(self, pretty_print_mock): - - events_iterable = [1, 2, 3] - expected_result = [1, 2, 3] - expected_call_order = [ - call(1, colored=self.colored_mock), - call(2, colored=self.colored_mock), - call(3, colored=self.colored_mock), - ] - - formatter = LogsFormatter(self.colored_mock, self.formatter_chain) - - result_iterable = formatter.do_format(events_iterable) - self.assertEqual(list(result_iterable), expected_result) - - self.chain_method1.assert_has_calls(expected_call_order) - self.chain_method2.assert_has_calls(expected_call_order) - self.chain_method3.assert_has_calls(expected_call_order) - pretty_print_mock.assert_has_calls(expected_call_order) # Pretty Printer must always be called - - @patch.object(LogsFormatter, "_pretty_print_event", wraps=_passthru_formatter) - def test_must_work_without_formatter_chain(self, pretty_print_mock): - - events_iterable = [1, 2, 3] - expected_result = [1, 2, 3] - expected_call_order = [ - call(1, colored=self.colored_mock), - call(2, colored=self.colored_mock), - call(3, colored=self.colored_mock), - ] - - # No formatter chain. - formatter = LogsFormatter(self.colored_mock) - - result_iterable = formatter.do_format(events_iterable) - self.assertEqual(list(result_iterable), expected_result) - - # Pretty Print is always called, even if there are no other formatters in the chain. 
- pretty_print_mock.assert_has_calls(expected_call_order) - self.chain_method1.assert_not_called() - self.chain_method2.assert_not_called() - self.chain_method3.assert_not_called() - - -class TestLambdaLogMsgFormatters_colorize_crashes(TestCase): - @parameterized.expand( - [ - "Task timed out", - "Something happened. Task timed out. Something else happend", - "Process exited before completing request", - ] - ) - def test_must_color_crash_messages(self, input_msg): - color_result = "colored messaage" - colored = Mock() - colored.red.return_value = color_result - event = LogEvent("group_name", {"message": input_msg}) - - result = LambdaLogMsgFormatters.colorize_errors(event, colored) - self.assertEqual(result.message, color_result) - colored.red.assert_called_with(input_msg) - - def test_must_ignore_other_messages(self): - colored = Mock() - event = LogEvent("group_name", {"message": "some msg"}) - - result = LambdaLogMsgFormatters.colorize_errors(event, colored) - self.assertEqual(result.message, "some msg") - colored.red.assert_not_called() - - -class TestKeywordHighlight_highlight_keyword(TestCase): - def test_must_highlight_all_keywords(self): - input_msg = "this keyword some keyword other keyword" - keyword = "keyword" - color_result = "colored" - expected_msg = "this colored some colored other colored" - - colored = Mock() - colored.underline.return_value = color_result - event = LogEvent("group_name", {"message": input_msg}) - - result = KeywordHighlighter(keyword).highlight_keywords(event, colored) - self.assertEqual(result.message, expected_msg) - colored.underline.assert_called_with(keyword) - - def test_must_ignore_if_keyword_is_absent(self): - colored = Mock() - input_msg = "this keyword some keyword other keyword" - event = LogEvent("group_name", {"message": input_msg}) - - result = KeywordHighlighter().highlight_keywords(event, colored) - self.assertEqual(result.message, input_msg) - colored.underline.assert_not_called() - - -class 
TestJSONMsgFormatter_format_json(TestCase): - def test_must_pretty_print_json(self): - data = {"a": "b"} - input_msg = '{"a": "b"}' - expected_msg = json.dumps(data, indent=2) - - event = LogEvent("group_name", {"message": input_msg}) - - result = JSONMsgFormatter.format_json(event, None) - self.assertEqual(result.message, expected_msg) - - @parameterized.expand(["this is not json", '{"not a valid json"}']) - def test_ignore_non_json(self, input_msg): - - event = LogEvent("group_name", {"message": input_msg}) - - result = JSONMsgFormatter.format_json(event, None) - self.assertEqual(result.message, input_msg) diff --git a/tests/unit/lib/observability/__init__.py b/tests/unit/lib/observability/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/unit/lib/observability/cw_logs/__init__.py b/tests/unit/lib/observability/cw_logs/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/unit/lib/logs/test_event.py b/tests/unit/lib/observability/cw_logs/test_cw_log_event.py similarity index 51% rename from tests/unit/lib/logs/test_event.py rename to tests/unit/lib/observability/cw_logs/test_cw_log_event.py index c093edf0e2..62968f0d71 100644 --- a/tests/unit/lib/logs/test_event.py +++ b/tests/unit/lib/observability/cw_logs/test_cw_log_event.py @@ -1,9 +1,9 @@ from unittest import TestCase -from samcli.lib.logs.event import LogEvent +from samcli.lib.observability.cw_logs.cw_log_event import CWLogEvent -class TestLogEvent(TestCase): +class TestCWLogEvent(TestCase): def setUp(self): self.group_name = "log group name" self.stream_name = "stream name" @@ -12,44 +12,56 @@ def setUp(self): self.timestamp_str = "2018-07-06T13:09:54" def test_must_extract_fields_from_event(self): - event = LogEvent( + event = CWLogEvent( self.group_name, {"timestamp": self.timestamp, "logStreamName": self.stream_name, "message": self.message} ) - self.assertEqual(event.log_group_name, self.group_name) + self.assertEqual(event.cw_log_group, 
self.group_name) self.assertEqual(event.log_stream_name, self.stream_name) self.assertEqual(event.message, self.message) - self.assertEqual(self.timestamp_str, event.timestamp) + self.assertEqual(self.timestamp, event.timestamp) def test_must_ignore_if_some_fields_are_empty(self): - event = LogEvent(self.group_name, {"logStreamName": "stream name"}) + event = CWLogEvent(self.group_name, {"logStreamName": "stream name"}) - self.assertEqual(event.log_group_name, self.group_name) + self.assertEqual(event.cw_log_group, self.group_name) self.assertEqual(event.log_stream_name, self.stream_name) self.assertEqual(event.message, "") - self.assertIsNone(event.timestamp) + self.assertEqual(event.timestamp, 0) def test_must_ignore_if_event_is_empty(self): - event = LogEvent(self.group_name, {}) + event = CWLogEvent(self.group_name, {}) - self.assertEqual(event.log_group_name, self.group_name) - self.assertIsNone(event.log_stream_name) - self.assertIsNone(event.message) - self.assertIsNone(event.timestamp) + self.assertEqual(event.cw_log_group, self.group_name) + self.assertEqual(event.log_stream_name, "") + self.assertEqual(event.message, "") + self.assertEqual(event.timestamp, 0) def test_check_for_equality(self): - event = LogEvent( + event = CWLogEvent( self.group_name, {"timestamp": self.timestamp, "logStreamName": self.stream_name, "message": self.message} ) - other = LogEvent( + other = CWLogEvent( self.group_name, {"timestamp": self.timestamp, "logStreamName": self.stream_name, "message": self.message} ) self.assertEqual(event, other) + def test_check_for_inequality(self): + event = CWLogEvent( + self.group_name, + {"timestamp": self.timestamp + 1, "logStreamName": self.stream_name, "message": self.message}, + ) + + other = CWLogEvent( + self.group_name, {"timestamp": self.timestamp, "logStreamName": self.stream_name, "message": self.message} + ) + + self.assertNotEqual(event, other) + def test_check_for_equality_with_other_data_types(self): - event = 
LogEvent(self.group_name, {}) + event = CWLogEvent(self.group_name, {}) other = "this is not an event" self.assertNotEqual(event, other) diff --git a/tests/unit/lib/observability/cw_logs/test_cw_log_formatters.py b/tests/unit/lib/observability/cw_logs/test_cw_log_formatters.py new file mode 100644 index 0000000000..f864ff1fe7 --- /dev/null +++ b/tests/unit/lib/observability/cw_logs/test_cw_log_formatters.py @@ -0,0 +1,120 @@ +import json +from unittest import TestCase +from unittest.mock import Mock + +from parameterized import parameterized + +from samcli.lib.observability.cw_logs.cw_log_event import CWLogEvent +from samcli.lib.observability.cw_logs.cw_log_formatters import ( + CWPrettyPrintFormatter, + CWColorizeErrorsFormatter, + CWKeywordHighlighterFormatter, + CWJsonFormatter, +) + + +class TestCWPrettyPrintFormatter(TestCase): + def setUp(self): + self.colored = Mock() + self.pretty_print_formatter = CWPrettyPrintFormatter(self.colored) + self.group_name = "group name" + self.stream_name = "stream name" + self.message = "message" + self.event_dict = {"timestamp": 1, "message": self.message, "logStreamName": self.stream_name} + + def test_must_serialize_event(self): + colored_timestamp = "colored timestamp" + colored_stream_name = "colored stream name" + self.colored.yellow.return_value = colored_timestamp + self.colored.cyan.return_value = colored_stream_name + + event = CWLogEvent(self.group_name, self.event_dict) + + expected = " ".join([colored_stream_name, colored_timestamp, self.message]) + result = self.pretty_print_formatter.map(event) + + self.assertEqual(expected, result.message) + self.colored.yellow.assert_called() + self.colored.cyan.assert_called_with(self.stream_name) + + +class TestCWColorizeErrorsFormatter(TestCase): + def setUp(self): + self.colored = Mock() + self.formatter = CWColorizeErrorsFormatter(self.colored) + + @parameterized.expand( + [ + "Task timed out", + "Something happened. Task timed out. 
Something else happened", + "Process exited before completing request", + ] + ) + def test_must_color_crash_messages(self, input_msg): + color_result = "colored message" + self.colored.red.return_value = color_result + event = CWLogEvent("group_name", {"message": input_msg}) + + result = self.formatter.map(event) + self.assertEqual(result.message, color_result) + self.colored.red.assert_called_with(input_msg) + + def test_must_ignore_other_messages(self): + event = CWLogEvent("group_name", {"message": "some msg"}) + + result = self.formatter.map(event) + self.assertEqual(result.message, "some msg") + self.colored.red.assert_not_called() + + +class TestCWKeywordHighlighterFormatter(TestCase): + def setUp(self): + self.colored = Mock() + + def test_must_highlight_all_keywords(self): + input_msg = "this keyword some keyword other keyword" + keyword = "keyword" + color_result = "colored" + expected_msg = "this colored some colored other colored" + + formatter = CWKeywordHighlighterFormatter(self.colored, keyword) + + self.colored.underline.return_value = color_result + event = CWLogEvent("group_name", {"message": input_msg}) + + result = formatter.map(event) + self.assertEqual(result.message, expected_msg) + self.colored.underline.assert_called_with(keyword) + + def test_must_ignore_if_keyword_is_absent(self): + input_msg = "this keyword some keyword other keyword" + event = CWLogEvent("group_name", {"message": input_msg}) + + formatter = CWKeywordHighlighterFormatter(self.colored) + + result = formatter.map(event) + self.assertEqual(result.message, input_msg) + self.colored.underline.assert_not_called() + + +class TestCWJsonFormatter(TestCase): + def setUp(self): + self.formatter = CWJsonFormatter() + + def test_must_pretty_print_json(self): + data = {"a": "b"} + input_msg = '{"a": "b"}' + expected_msg = json.dumps(data, indent=2) + + event = CWLogEvent("group_name", {"message": input_msg}) + + result = self.formatter.map(event) + self.assertEqual(result.message, 
expected_msg) + + @parameterized.expand(["this is not json", '{"not a valid json"}']) + def test_ignore_non_json(self, input_msg): + + event = CWLogEvent("group_name", {"message": input_msg}) + + result = self.formatter.map(event) + self.assertEqual(result.message, input_msg) diff --git a/tests/unit/lib/logs/test_provider.py b/tests/unit/lib/observability/cw_logs/test_cw_log_group_provider.py similarity index 78% rename from tests/unit/lib/logs/test_provider.py rename to tests/unit/lib/observability/cw_logs/test_cw_log_group_provider.py index 59da01928c..295ad6d898 100644 --- a/tests/unit/lib/logs/test_provider.py +++ b/tests/unit/lib/observability/cw_logs/test_cw_log_group_provider.py @@ -1,6 +1,6 @@ from unittest import TestCase -from samcli.lib.logs.provider import LogGroupProvider +from samcli.lib.observability.cw_logs.cw_log_group_provider import LogGroupProvider class TestLogGroupProvider_for_lambda_function(TestCase): diff --git a/tests/unit/lib/observability/cw_logs/test_cw_log_puller.py b/tests/unit/lib/observability/cw_logs/test_cw_log_puller.py new file mode 100644 index 0000000000..98f4e6d3de --- /dev/null +++ b/tests/unit/lib/observability/cw_logs/test_cw_log_puller.py @@ -0,0 +1,322 @@ +import copy +from datetime import datetime +from unittest import TestCase +from unittest.mock import Mock, call, patch, ANY + +import botocore.session +from botocore.stub import Stubber + +from samcli.lib.observability.cw_logs.cw_log_event import CWLogEvent +from samcli.lib.observability.cw_logs.cw_log_puller import CWLogPuller +from samcli.lib.utils.time import to_timestamp, to_datetime + + +class TestCWLogPuller_load_time_period(TestCase): + def setUp(self): + self.log_group_name = "name" + self.stream_name = "stream name" + self.timestamp = to_timestamp(datetime.utcnow()) + + real_client = botocore.session.get_session().create_client("logs", region_name="us-east-1") + self.client_stubber = Stubber(real_client) + self.consumer = Mock() + self.fetcher = 
CWLogPuller(real_client, self.consumer, self.log_group_name) + + self.mock_api_response = { + "events": [ + { + "eventId": "id1", + "ingestionTime": 0, + "logStreamName": self.stream_name, + "message": "message 1", + "timestamp": self.timestamp, + }, + { + "eventId": "id2", + "ingestionTime": 0, + "logStreamName": self.stream_name, + "message": "message 2", + "timestamp": self.timestamp, + }, + ] + } + + self.expected_events = [ + CWLogEvent( + self.log_group_name, + { + "eventId": "id1", + "ingestionTime": 0, + "logStreamName": self.stream_name, + "message": "message 1", + "timestamp": self.timestamp, + }, + ), + CWLogEvent( + self.log_group_name, + { + "eventId": "id2", + "ingestionTime": 0, + "logStreamName": self.stream_name, + "message": "message 2", + "timestamp": self.timestamp, + }, + ), + ] + + def test_must_fetch_logs_for_log_group(self): + expected_params = {"logGroupName": self.log_group_name, "interleaved": True} + + # Configure the stubber to return the configured response. 
The stubber also verifies + # that input params were provided as expected + self.client_stubber.add_response("filter_log_events", self.mock_api_response, expected_params) + + with self.client_stubber: + self.fetcher.load_time_period() + + call_args = [args[0] for (args, _) in self.consumer.consume.call_args_list] + for event in self.expected_events: + self.assertIn(event, call_args) + + def test_must_fetch_logs_with_all_params(self): + pattern = "foobar" + start = datetime.utcnow() + end = datetime.utcnow() + + expected_params = { + "logGroupName": self.log_group_name, + "interleaved": True, + "startTime": to_timestamp(start), + "endTime": to_timestamp(end), + "filterPattern": pattern, + } + + self.client_stubber.add_response("filter_log_events", self.mock_api_response, expected_params) + + with self.client_stubber: + self.fetcher.load_time_period(start_time=start, end_time=end, filter_pattern=pattern) + + call_args = [args[0] for (args, _) in self.consumer.consume.call_args_list] + for event in self.expected_events: + self.assertIn(event, call_args) + + def test_must_paginate_using_next_token(self): + """Make three API calls; the first two return a nextToken and the last does not.""" + token = "token" + expected_params = {"logGroupName": self.log_group_name, "interleaved": True} + expected_params_with_token = {"logGroupName": self.log_group_name, "interleaved": True, "nextToken": token} + + mock_response_with_token = copy.deepcopy(self.mock_api_response) + mock_response_with_token["nextToken"] = token + + # Call 1 returns a token. Also, on the first call the token is **not** passed in the API params + self.client_stubber.add_response("filter_log_events", mock_response_with_token, expected_params) + + # Call 2 returns a token + self.client_stubber.add_response("filter_log_events", mock_response_with_token, expected_params_with_token) + + # Call 3 DOES NOT return a token. This will terminate the loop. 
+ self.client_stubber.add_response("filter_log_events", self.mock_api_response, expected_params_with_token) + + # Same data was returned in each API call + expected_events_result = self.expected_events + self.expected_events + self.expected_events + + with self.client_stubber: + self.fetcher.load_time_period() + + call_args = [args[0] for (args, _) in self.consumer.consume.call_args_list] + for event in expected_events_result: + self.assertIn(event, call_args) + + +class TestCWLogPuller_tail(TestCase): + def setUp(self): + self.log_group_name = "name" + self.filter_pattern = "pattern" + self.start_time = to_datetime(10) + self.max_retries = 3 + self.poll_interval = 1 + + real_client = botocore.session.get_session().create_client("logs", region_name="us-east-1") + self.client_stubber = Stubber(real_client) + self.consumer = Mock() + self.fetcher = CWLogPuller( + real_client, + self.consumer, + self.log_group_name, + max_retries=self.max_retries, + poll_interval=self.poll_interval, + ) + + self.mock_api_empty_response = {"events": []} + self.mock_api_response_1 = { + "events": [ + { + "timestamp": 11, + }, + { + "timestamp": 12, + }, + ] + } + self.mock_api_response_2 = { + "events": [ + { + "timestamp": 13, + }, + { + "timestamp": 14, + }, + ] + } + + self.mock_events1 = [ + CWLogEvent(self.log_group_name, {"timestamp": 11}), + CWLogEvent(self.log_group_name, {"timestamp": 12}), + ] + self.mock_events2 = [ + CWLogEvent(self.log_group_name, {"timestamp": 13}), + CWLogEvent(self.log_group_name, {"timestamp": 14}), + ] + self.mock_events_empty = [] + + @patch("samcli.lib.observability.cw_logs.cw_log_puller.time") + def test_must_tail_logs_with_single_data_fetch(self, time_mock): + expected_params = { + "logGroupName": self.log_group_name, + "interleaved": True, + "startTime": 10, + "filterPattern": self.filter_pattern, + } + expected_params_second_try = { + "logGroupName": self.log_group_name, + "interleaved": True, + "startTime": 13, + "filterPattern": 
self.filter_pattern, + } + + # first successful return + self.client_stubber.add_response("filter_log_events", self.mock_api_response_1, expected_params) + # 3 empty returns as the number of max retries + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params_second_try) + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params_second_try) + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params_second_try) + + with patch.object( + self.fetcher, "load_time_period", wraps=self.fetcher.load_time_period + ) as patched_load_time_period: + with self.client_stubber: + self.fetcher.tail( + start_time=self.start_time, + filter_pattern=self.filter_pattern, + ) + + expected_load_time_period_calls = [ + # First fetch returns data + call(self.start_time, filter_pattern=self.filter_pattern), + # Three empty fetches + call(to_datetime(13), filter_pattern=self.filter_pattern), + call(to_datetime(13), filter_pattern=self.filter_pattern), + call(to_datetime(13), filter_pattern=self.filter_pattern), + ] + + # One per poll + expected_sleep_calls = [call(self.poll_interval) for _ in expected_load_time_period_calls] + + consumer_call_args = [args[0] for (args, _) in self.consumer.consume.call_args_list] + + self.assertEqual(self.mock_events1, consumer_call_args) + self.assertEqual(expected_sleep_calls, time_mock.sleep.call_args_list) + self.assertEqual(expected_load_time_period_calls, patched_load_time_period.call_args_list) + + @patch("samcli.lib.observability.cw_logs.cw_log_puller.time") + def test_must_tail_logs_with_multiple_data_fetches(self, time_mock): + expected_params = { + "logGroupName": self.log_group_name, + "interleaved": True, + "startTime": 10, + "filterPattern": self.filter_pattern, + } + expected_params_second_try = { + "logGroupName": self.log_group_name, + "interleaved": True, + "startTime": 13, + "filterPattern": self.filter_pattern, 
+ } + expected_params_third_try = { + "logGroupName": self.log_group_name, + "interleaved": True, + "startTime": 15, + "filterPattern": self.filter_pattern, + } + + self.client_stubber.add_response("filter_log_events", self.mock_api_response_1, expected_params) + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params_second_try) + self.client_stubber.add_response("filter_log_events", self.mock_api_response_2, expected_params_second_try) + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params_third_try) + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params_third_try) + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params_third_try) + + expected_load_time_period_calls = [ + # First fetch returns data + call(self.start_time, filter_pattern=self.filter_pattern), + # This fetch was empty + call(to_datetime(13), filter_pattern=self.filter_pattern), + # This fetch returned data + call(to_datetime(13), filter_pattern=self.filter_pattern), + # Three empty fetches + call(to_datetime(15), filter_pattern=self.filter_pattern), + call(to_datetime(15), filter_pattern=self.filter_pattern), + call(to_datetime(15), filter_pattern=self.filter_pattern), + ] + + # One per poll + expected_sleep_calls = [call(self.poll_interval) for _ in expected_load_time_period_calls] + + with patch.object( + self.fetcher, "load_time_period", wraps=self.fetcher.load_time_period + ) as patched_load_time_period: + with self.client_stubber: + self.fetcher.tail(start_time=self.start_time, filter_pattern=self.filter_pattern) + + expected_consumer_call_args = [args[0] for (args, _) in self.consumer.consume.call_args_list] + + self.assertEqual(self.mock_events1 + self.mock_events2, expected_consumer_call_args) + self.assertEqual(expected_load_time_period_calls, patched_load_time_period.call_args_list) + 
self.assertEqual(expected_sleep_calls, time_mock.sleep.call_args_list) + + @patch("samcli.lib.observability.cw_logs.cw_log_puller.time") + def test_without_start_time(self, time_mock): + expected_params = { + "logGroupName": self.log_group_name, + "interleaved": True, + "startTime": 0, + "filterPattern": self.filter_pattern, + } + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params) + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params) + self.client_stubber.add_response("filter_log_events", self.mock_api_empty_response, expected_params) + + expected_load_time_period_calls = [ + # Three empty fetches, all with default start time + call(to_datetime(0), filter_pattern=ANY), + call(to_datetime(0), filter_pattern=ANY), + call(to_datetime(0), filter_pattern=ANY), + ] + + # One per poll + expected_sleep_calls = [call(self.poll_interval) for _ in expected_load_time_period_calls] + + with patch.object( + self.fetcher, "load_time_period", wraps=self.fetcher.load_time_period + ) as patched_load_time_period: + with self.client_stubber: + self.fetcher.tail( + filter_pattern=self.filter_pattern, + ) + + expected_consumer_call_args = [args[0] for (args, _) in self.consumer.consume.call_args_list] + + self.assertEqual([], expected_consumer_call_args) + self.assertEqual(expected_load_time_period_calls, patched_load_time_period.call_args_list) + self.assertEqual(expected_sleep_calls, time_mock.sleep.call_args_list) diff --git a/tests/unit/lib/observability/test_observability_info_puller.py b/tests/unit/lib/observability/test_observability_info_puller.py new file mode 100644 index 0000000000..3fbbb9fe34 --- /dev/null +++ b/tests/unit/lib/observability/test_observability_info_puller.py @@ -0,0 +1,50 @@ +from unittest import TestCase +from unittest.mock import Mock + +from parameterized import parameterized, param + +from samcli.lib.observability.observability_info_puller import 
ObservabilityEventConsumerDecorator + + +class TestObservabilityEventConsumerDecorator(TestCase): + def test_decorator(self): + actual_consumer = Mock() + event = Mock() + + consumer_decorator = ObservabilityEventConsumerDecorator([], actual_consumer) + consumer_decorator.consume(event) + + actual_consumer.consume.assert_called_with(event) + + def test_decorator_with_mapper(self): + actual_consumer = Mock() + event = Mock() + mapped_event = Mock() + mapper = Mock() + mapper.map.return_value = mapped_event + + consumer_decorator = ObservabilityEventConsumerDecorator([mapper], actual_consumer) + consumer_decorator.consume(event) + + mapper.map.assert_called_with(event) + actual_consumer.consume.assert_called_with(mapped_event) + + @parameterized.expand( + [ + param([Mock()]), + param([Mock(), Mock()]), + param([Mock(), Mock(), Mock()]), + ] + ) + def test_decorator_with_mappers(self, mappers): + actual_consumer = Mock() + event = Mock() + for mapper in mappers: + mapper.map.return_value = event + + consumer_decorator = ObservabilityEventConsumerDecorator(mappers, actual_consumer) + consumer_decorator.consume(event) + + actual_consumer.consume.assert_called_with(event) + for mapper in mappers: + mapper.map.assert_called_with(event) diff --git a/tests/unit/lib/package/test_artifact_exporter.py b/tests/unit/lib/package/test_artifact_exporter.py index 85e3e22701..202007ebff 100644 --- a/tests/unit/lib/package/test_artifact_exporter.py +++ b/tests/unit/lib/package/test_artifact_exporter.py @@ -24,7 +24,7 @@ ) from samcli.lib.package.utils import make_abs_path from samcli.lib.package.packageable_resources import ( - is_s3_url, + is_s3_protocol_url, is_local_file, upload_local_artifacts, Resource, @@ -223,10 +223,10 @@ def test_is_s3_url(self): self._assert_is_invalid_s3_url(url) def _assert_is_valid_s3_url(self, url): - self.assertTrue(is_s3_url(url), "{0} should be valid".format(url)) + self.assertTrue(is_s3_protocol_url(url), "{0} should be valid".format(url)) def 
_assert_is_invalid_s3_url(self, url): - self.assertFalse(is_s3_url(url), "{0} should be valid".format(url)) + self.assertFalse(is_s3_protocol_url(url), "{0} should be invalid".format(url)) def test_parse_s3_url(self): diff --git a/tests/unit/lib/package/test_utils.py b/tests/unit/lib/package/test_utils.py new file mode 100644 index 0000000000..2907d7c479 --- /dev/null +++ b/tests/unit/lib/package/test_utils.py @@ -0,0 +1,44 @@ +from unittest import TestCase + +from parameterized import parameterized + +from samcli.lib.package import utils + + +class TestPackageUtils(TestCase): + @parameterized.expand( + [ + # path like + "https://s3.us-west-2.amazonaws.com/bucket-name/some/path/object.html", + "http://s3.amazonaws.com/bucket-name/some/path/object.html", + "https://s3.dualstack.us-west-2.amazonaws.com/bucket-name/some/path/object.html", + # virtual host + "http://bucket-name.s3.us-west-2.amazonaws.com/some/path/object.html", + "https://bucket-name.s3-us-west-2.amazonaws.com/some/path/object.html", + "https://bucket-name.s3.amazonaws.com/some/path/object.html", + # access point + "https://access-name-123456.s3-accesspoint.us-west-2.amazonaws.com/some/path/object.html", + "http://access-name-899889.s3-accesspoint.us-east-1.amazonaws.com/some/path/object.html", + # s3:// + "s3://bucket-name/path/to/object", + ] + ) + def test_is_s3_url(self, url): + self.assertTrue(utils.is_s3_url(url)) + + @parameterized.expand( + [ + # path like + "https://s3.$region.amazonaws.com/bucket-name/some/path/object.html", # invalid region + "https://s3.amazonaws.com/object.html", # no bucket + # virtual host + "https://bucket-name.s3-us-west-2.amazonaws.com/", # no object + # access point + "https://access-name.s3-accesspoint.us-west-2.amazonaws.com/some/path/object.html", # no account id + # s3:// + "s3://bucket-name", # no object + "s3:://bucket-name", # typo + ] + ) + def test_is_not_s3_url(self, url): + self.assertFalse(utils.is_s3_url(url)) diff --git a/tests/unit/lib/pipeline/__init__.py 
b/tests/unit/lib/pipeline/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/unit/lib/pipeline/bootstrap/__init__.py b/tests/unit/lib/pipeline/bootstrap/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/unit/lib/pipeline/bootstrap/test_environment.py b/tests/unit/lib/pipeline/bootstrap/test_environment.py new file mode 100644 index 0000000000..9a12f2be15 --- /dev/null +++ b/tests/unit/lib/pipeline/bootstrap/test_environment.py @@ -0,0 +1,425 @@ +from unittest import TestCase +from unittest.mock import Mock, patch, call, MagicMock + +from samcli.lib.pipeline.bootstrap.stage import Stage + +ANY_STAGE_NAME = "ANY_STAGE_NAME" +ANY_PIPELINE_USER_ARN = "ANY_PIPELINE_USER_ARN" +ANY_PIPELINE_EXECUTION_ROLE_ARN = "ANY_PIPELINE_EXECUTION_ROLE_ARN" +ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN = "ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN" +ANY_ARTIFACTS_BUCKET_ARN = "ANY_ARTIFACTS_BUCKET_ARN" +ANY_IMAGE_REPOSITORY_ARN = "ANY_IMAGE_REPOSITORY_ARN" +ANY_ARN = "ANY_ARN" + + +class TestStage(TestCase): + def test_stage_name_is_the_only_required_field_to_initialize_a_stage(self): + stage: Stage = Stage(name=ANY_STAGE_NAME) + self.assertEqual(stage.name, ANY_STAGE_NAME) + self.assertIsNone(stage.aws_profile) + self.assertIsNone(stage.aws_region) + self.assertIsNotNone(stage.pipeline_user) + self.assertIsNotNone(stage.pipeline_execution_role) + self.assertIsNotNone(stage.cloudformation_execution_role) + self.assertIsNotNone(stage.artifacts_bucket) + self.assertIsNotNone(stage.image_repository) + + with self.assertRaises(TypeError): + Stage() + + def test_did_user_provide_all_required_resources_when_not_all_resources_are_provided(self): + stage: Stage = Stage(name=ANY_STAGE_NAME) + self.assertFalse(stage.did_user_provide_all_required_resources()) + stage: Stage = Stage(name=ANY_STAGE_NAME, pipeline_user_arn=ANY_PIPELINE_USER_ARN) + self.assertFalse(stage.did_user_provide_all_required_resources()) + stage: Stage = Stage( 
name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + pipeline_execution_role_arn=ANY_PIPELINE_EXECUTION_ROLE_ARN, + ) + self.assertFalse(stage.did_user_provide_all_required_resources()) + stage: Stage = Stage( + name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + pipeline_execution_role_arn=ANY_PIPELINE_EXECUTION_ROLE_ARN, + cloudformation_execution_role_arn=ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN, + ) + self.assertFalse(stage.did_user_provide_all_required_resources()) + stage: Stage = Stage( + name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + pipeline_execution_role_arn=ANY_PIPELINE_EXECUTION_ROLE_ARN, + cloudformation_execution_role_arn=ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN, + artifacts_bucket_arn=ANY_ARTIFACTS_BUCKET_ARN, + create_image_repository=True, + ) + self.assertFalse(stage.did_user_provide_all_required_resources()) + + def test_did_user_provide_all_required_resources_ignore_image_repository_if_it_is_not_required(self): + stage: Stage = Stage( + name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + pipeline_execution_role_arn=ANY_PIPELINE_EXECUTION_ROLE_ARN, + cloudformation_execution_role_arn=ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN, + artifacts_bucket_arn=ANY_ARTIFACTS_BUCKET_ARN, + create_image_repository=False, + ) + self.assertTrue(stage.did_user_provide_all_required_resources()) + + def test_did_user_provide_all_required_resources_when_image_repository_is_required(self): + stage: Stage = Stage( + name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + pipeline_execution_role_arn=ANY_PIPELINE_EXECUTION_ROLE_ARN, + cloudformation_execution_role_arn=ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN, + artifacts_bucket_arn=ANY_ARTIFACTS_BUCKET_ARN, + create_image_repository=True, + ) + self.assertFalse(stage.did_user_provide_all_required_resources()) + stage: Stage = Stage( + name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + pipeline_execution_role_arn=ANY_PIPELINE_EXECUTION_ROLE_ARN, + 
cloudformation_execution_role_arn=ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN, + artifacts_bucket_arn=ANY_ARTIFACTS_BUCKET_ARN, + create_image_repository=True, + image_repository_arn=ANY_IMAGE_REPOSITORY_ARN, + ) + self.assertTrue(stage.did_user_provide_all_required_resources()) + + @patch("samcli.lib.pipeline.bootstrap.stage.Stage._get_pipeline_user_secret_pair") + @patch("samcli.lib.pipeline.bootstrap.stage.click") + @patch("samcli.lib.pipeline.bootstrap.stage.manage_stack") + def test_did_user_provide_all_required_resources_returns_false_if_the_stage_was_initialized_without_any_of_the_resources_even_if_fulfilled_after_bootstrap( + self, manage_stack_mock, click_mock, pipeline_user_secret_pair_mock + ): + # setup + stack_output = Mock() + pipeline_user_secret_pair_mock.return_value = ("id", "secret") + stack_output.get.return_value = ANY_ARN + manage_stack_mock.return_value = stack_output + stage: Stage = Stage(name=ANY_STAGE_NAME) + + self.assertFalse(stage.did_user_provide_all_required_resources()) + + stage.bootstrap(confirm_changeset=False) + # After bootstrapping, all the resources should be fulfilled + self.assertEqual(ANY_ARN, stage.pipeline_user.arn) + self.assertEqual(ANY_ARN, stage.pipeline_execution_role.arn) + self.assertEqual(ANY_ARN, stage.cloudformation_execution_role.arn) + self.assertEqual(ANY_ARN, stage.artifacts_bucket.arn) + self.assertEqual(ANY_ARN, stage.image_repository.arn) + + # although all of the resources got fulfilled, `did_user_provide_all_required_resources` should return false + # as these resources are not provided by the user + self.assertFalse(stage.did_user_provide_all_required_resources()) + + @patch("samcli.lib.pipeline.bootstrap.stage.click") + @patch("samcli.lib.pipeline.bootstrap.stage.manage_stack") + @patch.object(Stage, "did_user_provide_all_required_resources") + def test_bootstrap_will_not_deploy_the_cfn_template_if_all_resources_are_already_provided( + self, did_user_provide_all_required_resources_mock, manage_stack_mock, 
click_mock + ): + did_user_provide_all_required_resources_mock.return_value = True + stage: Stage = Stage(name=ANY_STAGE_NAME) + stage.bootstrap(confirm_changeset=False) + manage_stack_mock.assert_not_called() + + @patch("samcli.lib.pipeline.bootstrap.stage.Stage._get_pipeline_user_secret_pair") + @patch("samcli.lib.pipeline.bootstrap.stage.click") + @patch("samcli.lib.pipeline.bootstrap.stage.manage_stack") + def test_bootstrap_will_confirm_before_deploying_unless_confirm_changeset_is_disabled( + self, manage_stack_mock, click_mock, pipeline_user_secret_pair_mock + ): + click_mock.confirm.return_value = False + pipeline_user_secret_pair_mock.return_value = ("id", "secret") + stage: Stage = Stage(name=ANY_STAGE_NAME) + stage.bootstrap(confirm_changeset=False) + click_mock.confirm.assert_not_called() + manage_stack_mock.assert_called_once() + manage_stack_mock.reset_mock() + stage.bootstrap(confirm_changeset=True) + click_mock.confirm.assert_called_once() + manage_stack_mock.assert_not_called() # As the user chose not to confirm + + @patch("samcli.lib.pipeline.bootstrap.stage.click") + @patch("samcli.lib.pipeline.bootstrap.stage.manage_stack") + def test_bootstrap_will_not_deploy_the_cfn_template_if_the_user_did_not_confirm( + self, manage_stack_mock, click_mock + ): + click_mock.confirm.return_value = False + stage: Stage = Stage(name=ANY_STAGE_NAME) + stage.bootstrap(confirm_changeset=True) + manage_stack_mock.assert_not_called() + + @patch("samcli.lib.pipeline.bootstrap.stage.Stage._get_pipeline_user_secret_pair") + @patch("samcli.lib.pipeline.bootstrap.stage.click") + @patch("samcli.lib.pipeline.bootstrap.stage.manage_stack") + def test_bootstrap_will_deploy_the_cfn_template_if_the_user_did_confirm( + self, manage_stack_mock, click_mock, pipeline_user_secret_pair_mock + ): + click_mock.confirm.return_value = True + pipeline_user_secret_pair_mock.return_value = ("id", "secret") + stage: Stage = Stage(name=ANY_STAGE_NAME) + stage.bootstrap(confirm_changeset=True) 
+ manage_stack_mock.assert_called_once() + + @patch("samcli.lib.pipeline.bootstrap.stage.Stage._get_pipeline_user_secret_pair") + @patch("samcli.lib.pipeline.bootstrap.stage.click") + @patch("samcli.lib.pipeline.bootstrap.stage.manage_stack") + def test_bootstrap_will_pass_arns_of_all_user_provided_resources_and_empty_strings_for_other_resources_to_the_cfn_stack( + self, manage_stack_mock, click_mock, pipeline_user_secret_pair_mock + ): + click_mock.confirm.return_value = True + pipeline_user_secret_pair_mock.return_value = ("id", "secret") + stage: Stage = Stage( + name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + artifacts_bucket_arn=ANY_ARTIFACTS_BUCKET_ARN, + create_image_repository=True, + image_repository_arn=ANY_IMAGE_REPOSITORY_ARN, + ) + stage.bootstrap() + manage_stack_mock.assert_called_once() + args, kwargs = manage_stack_mock.call_args_list[0] + expected_parameter_overrides = { + "PipelineUserArn": ANY_PIPELINE_USER_ARN, + "PipelineExecutionRoleArn": "", + "CloudFormationExecutionRoleArn": "", + "ArtifactsBucketArn": ANY_ARTIFACTS_BUCKET_ARN, + "CreateImageRepository": "true", + "ImageRepositoryArn": ANY_IMAGE_REPOSITORY_ARN, + } + self.assertEqual(expected_parameter_overrides, kwargs["parameter_overrides"]) + + @patch("samcli.lib.pipeline.bootstrap.stage.Stage._get_pipeline_user_secret_pair") + @patch("samcli.lib.pipeline.bootstrap.stage.click") + @patch("samcli.lib.pipeline.bootstrap.stage.manage_stack") + def test_bootstrap_will_fulfill_all_resource_arns( + self, manage_stack_mock, click_mock, pipeline_user_secret_pair_mock + ): + # setup + pipeline_user_secret_pair_mock.return_value = ("id", "secret") + stack_output = Mock() + stack_output.get.return_value = ANY_ARN + manage_stack_mock.return_value = stack_output + stage: Stage = Stage(name=ANY_STAGE_NAME) + click_mock.confirm.return_value = True + + # verify resources' ARNs are empty + self.assertIsNone(stage.pipeline_user.arn) + 
self.assertIsNone(stage.pipeline_execution_role.arn) + self.assertIsNone(stage.cloudformation_execution_role.arn) + self.assertIsNone(stage.artifacts_bucket.arn) + + # trigger + stage.bootstrap() + + # verify + manage_stack_mock.assert_called_once() + self.assertEqual(ANY_ARN, stage.pipeline_user.arn) + self.assertEqual(ANY_ARN, stage.pipeline_execution_role.arn) + self.assertEqual(ANY_ARN, stage.cloudformation_execution_role.arn) + self.assertEqual(ANY_ARN, stage.artifacts_bucket.arn) + + @patch("samcli.lib.pipeline.bootstrap.stage.SamConfig") + def test_save_config_escapes_none_resources(self, samconfig_mock): + cmd_names = ["any", "commands"] + samconfig_instance_mock = Mock() + samconfig_mock.return_value = samconfig_instance_mock + stage: Stage = Stage(name=ANY_STAGE_NAME) + + empty_ecr_call = call( + cmd_names=cmd_names, + section="parameters", + env=ANY_STAGE_NAME, + key="image_repository", + value="", + ) + + expected_calls = [] + self.trigger_and_assert_save_config_calls( + stage, cmd_names, expected_calls + [empty_ecr_call], samconfig_instance_mock.put + ) + + stage.pipeline_user.arn = ANY_PIPELINE_USER_ARN + expected_calls.append( + call(cmd_names=cmd_names, section="parameters", key="pipeline_user", value=ANY_PIPELINE_USER_ARN) + ) + self.trigger_and_assert_save_config_calls( + stage, cmd_names, expected_calls + [empty_ecr_call], samconfig_instance_mock.put + ) + + stage.pipeline_execution_role.arn = ANY_PIPELINE_EXECUTION_ROLE_ARN + expected_calls.append( + call( + cmd_names=cmd_names, + section="parameters", + env=ANY_STAGE_NAME, + key="pipeline_execution_role", + value=ANY_PIPELINE_EXECUTION_ROLE_ARN, + ), + ) + self.trigger_and_assert_save_config_calls( + stage, cmd_names, expected_calls + [empty_ecr_call], samconfig_instance_mock.put + ) + + stage.cloudformation_execution_role.arn = ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN + expected_calls.append( + call( + cmd_names=cmd_names, + section="parameters", + env=ANY_STAGE_NAME, + 
key="cloudformation_execution_role", + value=ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN, + ), + ) + self.trigger_and_assert_save_config_calls( + stage, cmd_names, expected_calls + [empty_ecr_call], samconfig_instance_mock.put + ) + + stage.artifacts_bucket.arn = "arn:aws:s3:::artifact_bucket_name" + expected_calls.append( + call( + cmd_names=cmd_names, + section="parameters", + env=ANY_STAGE_NAME, + key="artifacts_bucket", + value="artifact_bucket_name", + ), + ) + self.trigger_and_assert_save_config_calls( + stage, cmd_names, expected_calls + [empty_ecr_call], samconfig_instance_mock.put + ) + + stage.image_repository.arn = "arn:aws:ecr:us-east-2:111111111111:repository/image_repository_name" + expected_calls.append( + call( + cmd_names=cmd_names, + section="parameters", + env=ANY_STAGE_NAME, + key="image_repository", + value="111111111111.dkr.ecr.us-east-2.amazonaws.com/image_repository_name", + ) + ) + self.trigger_and_assert_save_config_calls(stage, cmd_names, expected_calls, samconfig_instance_mock.put) + + def trigger_and_assert_save_config_calls(self, stage, cmd_names, expected_calls, samconfig_put_mock): + stage.save_config(config_dir="any_config_dir", filename="any_pipeline.toml", cmd_names=cmd_names) + self.assertEqual(len(expected_calls), samconfig_put_mock.call_count) + samconfig_put_mock.assert_has_calls(expected_calls, any_order=True) + samconfig_put_mock.reset_mock() + + @patch("samcli.lib.pipeline.bootstrap.stage.boto3") + def test_getting_pipeline_user_credentials(self, boto3_mock): + sm_client_mock = MagicMock() + sm_client_mock.get_secret_value.return_value = { + "SecretString": '{"aws_access_key_id": "AccessKeyId", "aws_secret_access_key": "SuperSecretKey"}' + } + session_mock = MagicMock() + session_mock.client.return_value = sm_client_mock + boto3_mock.Session.return_value = session_mock + + (key, secret) = Stage._get_pipeline_user_secret_pair("dummy_arn", None, "dummy-region") + self.assertEqual(key, "AccessKeyId") + self.assertEqual(secret, 
"SuperSecretKey") + sm_client_mock.get_secret_value.assert_called_once_with(SecretId="dummy_arn") + + @patch("samcli.lib.pipeline.bootstrap.stage.SamConfig") + def test_save_config_ignores_exceptions_thrown_while_calculating_artifacts_bucket_name(self, samconfig_mock): + samconfig_instance_mock = Mock() + samconfig_mock.return_value = samconfig_instance_mock + stage: Stage = Stage(name=ANY_STAGE_NAME, artifacts_bucket_arn="invalid ARN") + # calling artifacts_bucket.name() during save_config() will raise a ValueError exception, we need to make sure + # this exception is swallowed so that other configs can be safely saved to the pipelineconfig.toml file + stage.save_config(config_dir="any_config_dir", filename="any_pipeline.toml", cmd_names=["any", "commands"]) + + @patch("samcli.lib.pipeline.bootstrap.stage.SamConfig") + def test_save_config_ignores_exceptions_thrown_while_calculating_image_repository_uri(self, samconfig_mock): + samconfig_instance_mock = Mock() + samconfig_mock.return_value = samconfig_instance_mock + stage: Stage = Stage(name=ANY_STAGE_NAME, image_repository_arn="invalid ARN") + # calling image_repository.get_uri() during save_config() will raise a ValueError exception, we need to make + # sure this exception is swallowed so that other configs can be safely saved to the pipelineconfig.toml file + stage.save_config(config_dir="any_config_dir", filename="any_pipeline.toml", cmd_names=["any", "commands"]) + + @patch.object(Stage, "save_config") + def test_save_config_safe(self, save_config_mock): + save_config_mock.side_effect = Exception + stage: Stage = Stage(name=ANY_STAGE_NAME) + stage.save_config_safe(config_dir="any_config_dir", filename="any_pipeline.toml", cmd_names=["commands"]) + save_config_mock.assert_called_once_with("any_config_dir", "any_pipeline.toml", ["commands"]) + + @patch("samcli.lib.pipeline.bootstrap.stage.click") + def test_print_resources_summary_when_no_resources_provided_by_the_user(self, click_mock): + stage: Stage = 
Stage(name=ANY_STAGE_NAME) + stage.print_resources_summary() + self.assert_summary_has_a_message_like("The following resources were created in your account", click_mock.secho) + + @patch("samcli.lib.pipeline.bootstrap.stage.click") + def test_print_resources_summary_when_all_resources_are_provided_by_the_user(self, click_mock): + stage: Stage = Stage( + name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + pipeline_execution_role_arn=ANY_PIPELINE_EXECUTION_ROLE_ARN, + cloudformation_execution_role_arn=ANY_CLOUDFORMATION_EXECUTION_ROLE_ARN, + artifacts_bucket_arn=ANY_ARTIFACTS_BUCKET_ARN, + create_image_repository=True, + image_repository_arn=ANY_IMAGE_REPOSITORY_ARN, + ) + stage.print_resources_summary() + self.assert_summary_does_not_have_a_message_like( + "The following resources were created in your account", click_mock.secho + ) + + @patch("samcli.lib.pipeline.bootstrap.stage.click") + def test_print_resources_summary_when_some_resources_are_provided_by_the_user(self, click_mock): + stage: Stage = Stage( + name=ANY_STAGE_NAME, + pipeline_user_arn=ANY_PIPELINE_USER_ARN, + artifacts_bucket_arn=ANY_ARTIFACTS_BUCKET_ARN, + create_image_repository=True, + image_repository_arn=ANY_IMAGE_REPOSITORY_ARN, + ) + stage.print_resources_summary() + self.assert_summary_has_a_message_like("The following resources were created in your account", click_mock.secho) + + @patch("samcli.lib.pipeline.bootstrap.stage.click") + def test_print_resources_summary_prints_the_credentials_of_the_pipeline_user_iff_not_provided_by_the_user( + self, click_mock + ): + stage_with_provided_pipeline_user: Stage = Stage(name=ANY_STAGE_NAME, pipeline_user_arn=ANY_PIPELINE_USER_ARN) + stage_with_provided_pipeline_user.print_resources_summary() + self.assert_summary_does_not_have_a_message_like("AWS_ACCESS_KEY_ID", click_mock.secho) + self.assert_summary_does_not_have_a_message_like("AWS_SECRET_ACCESS_KEY", click_mock.secho) + click_mock.secho.reset_mock() + + 
stage_without_provided_pipeline_user: Stage = Stage(name=ANY_STAGE_NAME) + stage_without_provided_pipeline_user.print_resources_summary() + self.assert_summary_has_a_message_like("AWS_ACCESS_KEY_ID", click_mock.secho) + self.assert_summary_has_a_message_like("AWS_SECRET_ACCESS_KEY", click_mock.secho) + + def assert_summary_has_a_message_like(self, msg, click_secho_mock): + self.assertTrue( + self.does_summary_have_a_message_like(msg, click_secho_mock), + msg=f'stage resources summary does not include "{msg}" which is unexpected', + ) + + def assert_summary_does_not_have_a_message_like(self, msg, click_secho_mock): + self.assertFalse( + self.does_summary_have_a_message_like(msg, click_secho_mock), + msg=f'stage resources summary includes "{msg}" which is unexpected', + ) + + @staticmethod + def does_summary_have_a_message_like(msg, click_secho_mock): + msg = msg.lower() + for kall in click_secho_mock.call_args_list: + args, kwargs = kall + if args: + message = args[0].lower() + else: + message = kwargs.get("message", "").lower() + if msg in message: + return True + return False diff --git a/tests/unit/lib/pipeline/bootstrap/test_resource.py b/tests/unit/lib/pipeline/bootstrap/test_resource.py new file mode 100644 index 0000000000..f7dcab50f2 --- /dev/null +++ b/tests/unit/lib/pipeline/bootstrap/test_resource.py @@ -0,0 +1,81 @@ +from unittest import TestCase + +from samcli.lib.pipeline.bootstrap.resource import ARNParts, Resource, IAMUser, ECRImageRepository + +VALID_ARN = "arn:partition:service:region:account-id:resource-id" +INVALID_ARN = "ARN" + + +class TestArnParts(TestCase): + def test_arn_parts_of_valid_arn(self): + arn_parts: ARNParts = ARNParts(arn=VALID_ARN) + self.assertEqual(arn_parts.partition, "partition") + self.assertEqual(arn_parts.service, "service") + self.assertEqual(arn_parts.region, "region") + self.assertEqual(arn_parts.account_id, "account-id") + self.assertEqual(arn_parts.resource_id, "resource-id") + + def 
test_arn_parts_of_invalid_arn(self): + with self.assertRaises(ValueError): + invalid_arn = "invalid_arn" + ARNParts(arn=invalid_arn) + + +class TestResource(TestCase): + def test_resource(self): + resource = Resource(arn=VALID_ARN, comment="") + self.assertEqual(resource.arn, VALID_ARN) + self.assertTrue(resource.is_user_provided) + self.assertEqual(resource.name(), "resource-id") + + resource = Resource(arn=INVALID_ARN, comment="") + self.assertEqual(resource.arn, INVALID_ARN) + self.assertTrue(resource.is_user_provided) + with self.assertRaises(ValueError): + resource.name() + + resource = Resource(arn=None, comment="") + self.assertIsNone(resource.arn) + self.assertFalse(resource.is_user_provided) + self.assertIsNone(resource.name()) + + +class TestIAMUser(TestCase): + def test_create_iam_user(self): + user: IAMUser = IAMUser(arn=VALID_ARN, comment="user") + self.assertEqual(user.arn, VALID_ARN) + self.assertEqual(user.comment, "user") + self.assertIsNone(user.access_key_id) + self.assertIsNone(user.secret_access_key) + + user = IAMUser( + arn=INVALID_ARN, + access_key_id="any_access_key_id", + secret_access_key="any_secret_access_key", + comment="user", + ) + self.assertEqual(user.arn, INVALID_ARN) + self.assertEqual(user.comment, "user") + self.assertEqual(user.access_key_id, "any_access_key_id") + self.assertEqual(user.secret_access_key, "any_secret_access_key") + + +class TestECRImageRepository(TestCase): + def test_get_uri_with_valid_ecr_arn(self): + valid_ecr_arn = "arn:partition:service:region:account-id:repository/repository-name" + repo: ECRImageRepository = ECRImageRepository(arn=valid_ecr_arn, comment="ecr") + self.assertEqual(repo.get_uri(), "account-id.dkr.ecr.region.amazonaws.com/repository-name") + self.assertEqual("ecr", repo.comment) + + def test_get_uri_with_invalid_ecr_arn(self): + repo = ECRImageRepository(arn=INVALID_ARN, comment="ecr") + with self.assertRaises(ValueError): + repo.get_uri() + + def 
test_get_uri_with_valid_aws_arn_that_is_invalid_ecr_arn(self): + ecr_arn_missing_repository_prefix = ( + "arn:partition:service:region:account-id:repository-name-without-repository/-prefix" + ) + repo = ECRImageRepository(arn=ecr_arn_missing_repository_prefix, comment="ecr") + with self.assertRaises(ValueError): + repo.get_uri() diff --git a/tests/unit/lib/samconfig/test_samconfig.py b/tests/unit/lib/samconfig/test_samconfig.py index 74c9ee9661..42017d5490 100644 --- a/tests/unit/lib/samconfig/test_samconfig.py +++ b/tests/unit/lib/samconfig/test_samconfig.py @@ -1,11 +1,11 @@ import os from pathlib import Path - from unittest import TestCase from samcli.lib.config.exceptions import SamConfigVersionException +from samcli.lib.config.samconfig import SamConfig, DEFAULT_CONFIG_FILE_NAME, DEFAULT_GLOBAL_CMDNAME, DEFAULT_ENV from samcli.lib.config.version import VERSION_KEY, SAM_CONFIG_VERSION -from samcli.lib.config.samconfig import SamConfig, DEFAULT_CONFIG_FILE_NAME, DEFAULT_GLOBAL_CMDNAME +from samcli.lib.utils import osutils class TestSamConfig(TestCase): @@ -27,14 +27,25 @@ def _check_config_file(self): self.assertTrue(self.samconfig.sanity_check()) self.assertEqual(SAM_CONFIG_VERSION, self.samconfig.document.get(VERSION_KEY)) - def _update_samconfig(self, cmd_names, section, key, value, env): - self.samconfig.put(cmd_names=cmd_names, section=section, key=key, value=value, env=env) + def _update_samconfig(self, cmd_names, section, key, value, env=None): + if env: + self.samconfig.put(cmd_names=cmd_names, section=section, key=key, value=value, env=env) + else: + self.samconfig.put(cmd_names=cmd_names, section=section, key=key, value=value) self.samconfig.flush() self._check_config_file() def test_init(self): self.assertEqual(self.samconfig.filepath, Path(self.config_dir, DEFAULT_CONFIG_FILE_NAME)) + def test_get_stage_names(self): + self.assertEqual(self.samconfig.get_stage_names(), []) + self._update_samconfig(cmd_names=["myCommand"], section="mySection", 
key="port", value=5401, env="stage1") + self._update_samconfig(cmd_names=["myCommand"], section="mySection", key="port", value=5401, env="stage2") + self.assertEqual(self.samconfig.get_stage_names(), ["stage1", "stage2"]) + self._update_samconfig(cmd_names=["myCommand"], section="mySection", key="port", value=5401) + self.assertEqual(self.samconfig.get_stage_names(), ["stage1", "stage2", DEFAULT_ENV]) + def test_param_overwrite(self): self._update_samconfig(cmd_names=["myCommand"], section="mySection", key="port", value=5401, env="myEnv") self.assertEqual( @@ -195,3 +206,18 @@ def test_write_config_file_non_standard_version(self): self.samconfig.put(cmd_names=["local", "start", "api"], section="parameters", key="skip_pull_image", value=True) self.samconfig.sanity_check() self.assertEqual(self.samconfig.document.get(VERSION_KEY), 0.2) + + def test_write_config_file_will_create_the_file_if_not_exist(self): + with osutils.mkdir_temp(ignore_errors=True) as tempdir: + non_existing_dir = os.path.join(tempdir, "non-existing-dir") + non_existing_file = "non-existing-file" + samconfig = SamConfig(config_dir=non_existing_dir, filename=non_existing_file) + + self.assertFalse(samconfig.exists()) + + samconfig.flush() + self.assertFalse(samconfig.exists()) # nothing to write, no need to create the file + + samconfig.put(cmd_names=["any", "command"], section="any-section", key="any-key", value="any-value") + samconfig.flush() + self.assertTrue(samconfig.exists()) diff --git a/tests/unit/lib/telemetry/test_cicd.py b/tests/unit/lib/telemetry/test_cicd.py index b380beb969..382076bbcd 100644 --- a/tests/unit/lib/telemetry/test_cicd.py +++ b/tests/unit/lib/telemetry/test_cicd.py @@ -9,6 +9,7 @@ class TestCICD(TestCase): @parameterized.expand( [ + (CICDPlatform.Jenkins, "BUILD_TAG", "jenkins-jobname-123"), (CICDPlatform.Jenkins, "JENKINS_URL", Mock()), (CICDPlatform.GitLab, "GITLAB_CI", Mock()), (CICDPlatform.GitHubAction, "GITHUB_ACTION", Mock()), @@ -26,3 +27,12 @@ class 
TestCICD(TestCase): ) def test_is_cicd_platform(self, cicd_platform, env_var, env_var_value): self.assertTrue(_is_cicd_platform(cicd_platform, {env_var: env_var_value})) + + @parameterized.expand( + [ + (CICDPlatform.Jenkins, "BUILD_TAG", "not-jenkins-"), + (CICDPlatform.CodeShip, "CI_NAME", "not-CodeShip"), + ] + ) + def test_is_not_cicd_platform(self, cicd_platform, env_var, env_var_value): + self.assertFalse(_is_cicd_platform(cicd_platform, {env_var: env_var_value})) diff --git a/tests/unit/lib/utils/test_git_repo.py b/tests/unit/lib/utils/test_git_repo.py new file mode 100644 index 0000000000..c1ea286e92 --- /dev/null +++ b/tests/unit/lib/utils/test_git_repo.py @@ -0,0 +1,194 @@ +import subprocess +from pathlib import Path +from unittest import TestCase +from unittest.mock import patch, MagicMock, ANY, call +import os +from samcli.lib.utils.git_repo import GitRepo, rmtree_callback, CloneRepoException, CloneRepoUnstableStateException + +REPO_URL = "REPO URL" +REPO_NAME = "REPO NAME" +CLONE_DIR = os.path.normpath("/tmp/local/clone/dir") +EXPECTED_DEFAULT_CLONE_PATH = os.path.normpath(os.path.join(CLONE_DIR, REPO_NAME)) + + +class TestGitRepo(TestCase): + def setUp(self): + self.repo = GitRepo(url=REPO_URL) + self.local_clone_dir = MagicMock() + self.local_clone_dir.joinpath.side_effect = lambda sub_dir: os.path.normpath(os.path.join(CLONE_DIR, sub_dir)) + + def test_ensure_clone_directory_exists(self): + self.repo._ensure_clone_directory_exists(self.local_clone_dir) # No exception is thrown + self.local_clone_dir.mkdir.assert_called_once_with(mode=0o700, parents=True, exist_ok=True) + + def test_ensure_clone_directory_exists_fail(self): + self.local_clone_dir.mkdir.side_effect = OSError + with self.assertRaises(OSError): + self.repo._ensure_clone_directory_exists(self.local_clone_dir) + + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_git_executable_not_windows(self, mock_platform, 
mock_popen): + mock_platform.return_value = "Not Windows" + executable = self.repo._git_executable() + self.assertEqual(executable, "git") + + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_git_executable_windows(self, mock_platform, mock_popen): + mock_platform.return_value = "Windows" + executable = self.repo._git_executable() + self.assertEqual(executable, "git") + + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + def test_git_executable_fails(self, mock_popen): + mock_popen.side_effect = OSError("fail") + with self.assertRaises(OSError): + self.repo._git_executable() + + @patch("samcli.lib.utils.git_repo.Path.exists") + @patch("samcli.lib.utils.git_repo.shutil") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_happy_case(self, platform_mock, popen_mock, check_output_mock, shutil_mock, path_exist_mock): + path_exist_mock.return_value = False + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME) + self.local_clone_dir.mkdir.assert_called_once_with(mode=0o700, parents=True, exist_ok=True) + popen_mock.assert_called_once_with(["git"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) + check_output_mock.assert_has_calls( + [ + call( + ["git", "clone", self.repo.url, REPO_NAME], + cwd=ANY, + stderr=subprocess.STDOUT, + ) + ] + ) + shutil_mock.rmtree.assert_not_called() + shutil_mock.copytree.assert_called_with(ANY, EXPECTED_DEFAULT_CLONE_PATH, ignore=ANY) + shutil_mock.ignore_patterns.assert_called_with("*.git") + + @patch("samcli.lib.utils.git_repo.Path.exists") + @patch("samcli.lib.utils.git_repo.shutil") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_create_new_local_repo( + self, platform_mock, popen_mock, 
check_output_mock, shutil_mock, path_exist_mock + ): + path_exist_mock.return_value = False + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME) + shutil_mock.rmtree.assert_not_called() + shutil_mock.copytree.assert_called_with(ANY, EXPECTED_DEFAULT_CLONE_PATH, ignore=ANY) + shutil_mock.ignore_patterns.assert_called_with("*.git") + + @patch("samcli.lib.utils.git_repo.Path.exists") + @patch("samcli.lib.utils.git_repo.shutil") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_replace_current_local_repo_if_replace_existing_flag_is_set( + self, platform_mock, popen_mock, check_output_mock, shutil_mock, path_exist_mock + ): + path_exist_mock.return_value = True + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME, replace_existing=True) + self.local_clone_dir.mkdir.assert_called_once_with(mode=0o700, parents=True, exist_ok=True) + shutil_mock.rmtree.assert_called_with(EXPECTED_DEFAULT_CLONE_PATH, onerror=rmtree_callback) + shutil_mock.copytree.assert_called_with(ANY, EXPECTED_DEFAULT_CLONE_PATH, ignore=ANY) + shutil_mock.ignore_patterns.assert_called_with("*.git") + + @patch("samcli.lib.utils.git_repo.Path.exists") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_fail_if_current_local_repo_exists_and_replace_existing_flag_is_not_set( + self, platform_mock, popen_mock, check_output_mock, path_exist_mock + ): + path_exist_mock.return_value = True + with self.assertRaises(CloneRepoException): + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME) # replace_existing=False by default + + @patch("samcli.lib.utils.git_repo.shutil") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + 
@patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_attempt_is_set_to_true_after_clone(self, platform_mock, popen_mock, check_output_mock, shutil_mock): + self.assertFalse(self.repo.clone_attempted) + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME) + self.assertTrue(self.repo.clone_attempted) + + @patch("samcli.lib.utils.git_repo.shutil") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_attempt_is_set_to_true_even_if_clone_failed( + self, platform_mock, popen_mock, check_output_mock, shutil_mock + ): + check_output_mock.side_effect = subprocess.CalledProcessError("fail", "fail", "not found".encode("utf-8")) + self.assertFalse(self.repo.clone_attempted) + try: + with self.assertRaises(CloneRepoException): + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME) + except: + pass + self.assertTrue(self.repo.clone_attempted) + + @patch("samcli.lib.utils.git_repo.shutil") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_failed_to_create_the_clone_directory( + self, platform_mock, popen_mock, check_output_mock, shutil_mock + ): + self.local_clone_dir.mkdir.side_effect = OSError + try: + with self.assertRaises(OSError): + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME) + except: + pass + self.local_clone_dir.mkdir.assert_called_once_with(mode=0o700, parents=True, exist_ok=True) + popen_mock.assert_not_called() + check_output_mock.assert_not_called() + shutil_mock.assert_not_called() + + @patch("samcli.lib.utils.git_repo.shutil") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_when_the_subprocess_fail(self, 
platform_mock, popen_mock, check_output_mock, shutil_mock): + check_output_mock.side_effect = subprocess.CalledProcessError("fail", "fail", "any reason".encode("utf-8")) + with self.assertRaises(CloneRepoException): + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME) + + @patch("samcli.lib.utils.git_repo.LOG") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_when_the_git_repo_not_found(self, platform_mock, popen_mock, check_output_mock, log_mock): + check_output_mock.side_effect = subprocess.CalledProcessError("fail", "fail", "not found".encode("utf-8")) + try: + with self.assertRaises(CloneRepoException): + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME) + except Exception: + pass + log_mock.warning.assert_called() + + @patch("samcli.lib.utils.git_repo.Path.exists") + @patch("samcli.lib.utils.git_repo.shutil") + @patch("samcli.lib.utils.git_repo.check_output") + @patch("samcli.lib.utils.git_repo.subprocess.Popen") + @patch("samcli.lib.utils.git_repo.platform.system") + def test_clone_when_failed_to_move_cloned_repo_from_temp_to_final_destination( + self, platform_mock, popen_mock, check_output_mock, shutil_mock, path_exist_mock + ): + path_exist_mock.return_value = True + shutil_mock.copytree.side_effect = OSError + try: + with self.assertRaises(CloneRepoUnstableStateException): + self.repo.clone(clone_dir=self.local_clone_dir, clone_name=REPO_NAME, replace_existing=True) + except Exception: + pass + shutil_mock.rmtree.assert_called_once_with(EXPECTED_DEFAULT_CLONE_PATH, onerror=rmtree_callback) + shutil_mock.copytree.assert_called_once_with(ANY, EXPECTED_DEFAULT_CLONE_PATH, ignore=ANY) + shutil_mock.ignore_patterns.assert_called_once_with("*.git") diff --git a/tests/unit/lib/utils/test_hash.py b/tests/unit/lib/utils/test_hash.py index 1f16bb393e..388b3c96da 100644 --- 
a/tests/unit/lib/utils/test_hash.py +++ b/tests/unit/lib/utils/test_hash.py @@ -40,11 +40,11 @@ def test_dir_hash_independent_of_file_order(self): mockwalk.return_value = [ ( self.temp_dir, - (), - ( + [], + [ file1.name, file2.name, - ), + ], ), ] dir_checksums["first"] = dir_checksum(self.temp_dir) @@ -53,11 +53,11 @@ def test_dir_hash_independent_of_file_order(self): mockwalk.return_value = [ ( self.temp_dir, - (), - ( + [], + [ file2.name, file1.name, - ), + ], ), ] dir_checksums["second"] = dir_checksum(self.temp_dir) @@ -73,6 +73,27 @@ def test_dir_hash_same_contents_diff_file_per_directory(self): checksum_after = dir_checksum(os.path.dirname(_file.name)) self.assertNotEqual(checksum_before, checksum_after) + def test_dir_hash_with_ignore_list(self): + _file = tempfile.NamedTemporaryFile(delete=False, dir=self.temp_dir) + _file.write(b"Testfile") + _file.close() + + dir_path = os.path.dirname(_file.name) + checksum_before = dir_checksum(dir_path) + + # add a file to .aws-sam/ + aws_sam_dir_path = os.path.join(dir_path, ".aws-sam") + os.mkdir(aws_sam_dir_path) + _new_file = tempfile.NamedTemporaryFile(delete=False, dir=aws_sam_dir_path) + _new_file.write(b"dummy") + _new_file.close() + + checksum_after = dir_checksum(os.path.dirname(_file.name)) + self.assertNotEqual(checksum_before, checksum_after) + + checksum_after_with_ignore_list = dir_checksum(os.path.dirname(_file.name), ignore_list=[".aws-sam"]) + self.assertEqual(checksum_before, checksum_after_with_ignore_list) + def test_dir_cyclic_links(self): _file = tempfile.NamedTemporaryFile(delete=False, dir=self.temp_dir) _file.write(b"Testfile") diff --git a/tests/unit/lib/utils/test_managed_cloudformation_stack.py b/tests/unit/lib/utils/test_managed_cloudformation_stack.py index 9f1ea0915a..fd21b792f1 100644 --- a/tests/unit/lib/utils/test_managed_cloudformation_stack.py +++ b/tests/unit/lib/utils/test_managed_cloudformation_stack.py @@ -21,19 +21,28 @@ def _stubbed_cf_client(self): def 
test_session_missing_profile(self, boto_mock): boto_mock.side_effect = ProfileNotFound(profile="test-profile") with self.assertRaises(CredentialsError): - manage_stack("test-profile", "fake-region", SAM_CLI_STACK_NAME, _get_stack_template()) + manage_stack( + profile="test-profile", + region="fake-region", + stack_name=SAM_CLI_STACK_NAME, + template_body=_get_stack_template(), + ) @patch("boto3.client") def test_client_missing_credentials(self, boto_mock): boto_mock.side_effect = NoCredentialsError() with self.assertRaises(CredentialsError): - manage_stack(None, "fake-region", SAM_CLI_STACK_NAME, _get_stack_template()) + manage_stack( + profile=None, region="fake-region", stack_name=SAM_CLI_STACK_NAME, template_body=_get_stack_template() + ) @patch("boto3.client") def test_client_missing_region(self, boto_mock): boto_mock.side_effect = NoRegionError() with self.assertRaises(RegionError): - manage_stack(None, "fake-region", SAM_CLI_STACK_NAME, _get_stack_template()) + manage_stack( + profile=None, region="fake-region", stack_name=SAM_CLI_STACK_NAME, template_body=_get_stack_template() + ) def test_new_stack(self): stub_cf, stubber = self._stubbed_cf_client() @@ -47,6 +56,8 @@ def test_new_stack(self): "Tags": [{"Key": "ManagedStackSource", "Value": "AwsSamCli"}], "ChangeSetType": "CREATE", "ChangeSetName": "InitialCreation", + "Capabilities": ["CAPABILITY_IAM"], + "Parameters": [], } ccs_resp = {"Id": "id", "StackId": "aws-sam-cli-managed-default"} stubber.add_response("create_change_set", ccs_resp, ccs_params) @@ -151,6 +162,8 @@ def test_change_set_creation_fails(self): "Tags": [{"Key": "ManagedStackSource", "Value": "AwsSamCli"}], "ChangeSetType": "CREATE", "ChangeSetName": "InitialCreation", + "Capabilities": ["CAPABILITY_IAM"], + "Parameters": [], } stubber.add_client_error("create_change_set", service_error_code="ClientError", expected_params=ccs_params) stubber.activate() @@ -171,6 +184,8 @@ def test_change_set_execution_fails(self): "Tags": [{"Key": 
"ManagedStackSource", "Value": "AwsSamCli"}], "ChangeSetType": "CREATE", "ChangeSetName": "InitialCreation", + "Capabilities": ["CAPABILITY_IAM"], + "Parameters": [], } ccs_resp = {"Id": "id", "StackId": "aws-sam-cli-managed-default"} stubber.add_response("create_change_set", ccs_resp, ccs_params) diff --git a/tests/unit/lib/utils/test_osutils.py b/tests/unit/lib/utils/test_osutils.py index fbcdb10a81..d65dac1436 100644 --- a/tests/unit/lib/utils/test_osutils.py +++ b/tests/unit/lib/utils/test_osutils.py @@ -7,7 +7,7 @@ from unittest import TestCase from unittest.mock import patch -import samcli.lib.utils.osutils as osutils +from samcli.lib.utils import osutils class Test_mkdir_temp(TestCase): diff --git a/tests/unit/local/apigw/test_local_apigw_service.py b/tests/unit/local/apigw/test_local_apigw_service.py index dc785936a5..a6ab380f7d 100644 --- a/tests/unit/local/apigw/test_local_apigw_service.py +++ b/tests/unit/local/apigw/test_local_apigw_service.py @@ -226,8 +226,11 @@ def test_request_handler_returns_process_stdout_when_making_response(self, lambd make_response_mock = Mock() request_mock.return_value = ("test", "test") self.api_service.service_response = make_response_mock + current_route = Mock() self.api_service._get_current_route = MagicMock() - self.api_service._get_current_route.methods = [] + self.api_service._get_current_route.return_value = current_route + current_route.methods = [] + current_route.event_type = Route.API self.api_service._construct_v_1_0_event = Mock() @@ -249,7 +252,7 @@ def test_request_handler_returns_process_stdout_when_making_response(self, lambd lambda_output_parser_mock.get_lambda_output.assert_called_with(ANY) # Make sure the parse method is called only on the returned response and not on the raw data from stdout - parse_output_mock.assert_called_with(lambda_response, ANY, ANY) + parse_output_mock.assert_called_with(lambda_response, ANY, ANY, Route.API) # Make sure the logs are written to stderr 
self.stderr.write.assert_called_with(lambda_logs) @@ -507,69 +510,105 @@ def test_merge_does_not_duplicate_values(self): class TestServiceParsingV1PayloadFormatLambdaOutput(TestCase): - def test_default_content_type_header_added_with_no_headers(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_default_content_type_header_added_with_no_headers(self, event_type): lambda_output = ( '{"statusCode": 200, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' ) (_, headers, _) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) self.assertIn("Content-Type", headers) self.assertEqual(headers["Content-Type"], "application/json") - def test_default_content_type_header_added_with_empty_headers(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_default_content_type_header_added_with_empty_headers(self, event_type): lambda_output = ( '{"statusCode": 200, "headers":{}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' ) (_, headers, _) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) self.assertIn("Content-Type", headers) self.assertEqual(headers["Content-Type"], "application/json") - def test_custom_content_type_header_is_not_modified(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_custom_content_type_header_is_not_modified(self, event_type): lambda_output = ( '{"statusCode": 200, "headers":{"Content-Type": "text/xml"}, "body": "{}", ' '"isBase64Encoded": false}' ) (_, headers, _) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + 
lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) self.assertIn("Content-Type", headers) self.assertEqual(headers["Content-Type"], "text/xml") - def test_custom_content_type_multivalue_header_is_not_modified(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_custom_content_type_multivalue_header_is_not_modified(self, event_type): lambda_output = ( '{"statusCode": 200, "multiValueHeaders":{"Content-Type": ["text/xml"]}, "body": "{}", ' '"isBase64Encoded": false}' ) (_, headers, _) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) self.assertIn("Content-Type", headers) self.assertEqual(headers["Content-Type"], "text/xml") - def test_multivalue_headers(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_multivalue_headers(self, event_type): lambda_output = ( '{"statusCode": 200, "multiValueHeaders":{"X-Foo": ["bar", "42"]}, ' '"body": "{\\"message\\":\\"Hello from Lambda\\"}", "isBase64Encoded": false}' ) (_, headers, _) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) self.assertEqual(headers, Headers({"Content-Type": "application/json", "X-Foo": ["bar", "42"]})) - def test_single_and_multivalue_headers(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_single_and_multivalue_headers(self, event_type): lambda_output = ( '{"statusCode": 200, "headers":{"X-Foo": "foo", "X-Bar": "bar"}, ' '"multiValueHeaders":{"X-Foo": ["bar", "42"]}, ' @@ -577,7 +616,7 @@ def test_single_and_multivalue_headers(self): ) (_, headers, _) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], 
flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) self.assertEqual( @@ -587,29 +626,54 @@ def test_single_and_multivalue_headers(self): def test_extra_values_raise(self): lambda_output = ( '{"statusCode": 200, "headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' - '"isBase64Encoded": false, "another_key": "some value"}' + '"isBase64Encoded": false, "base64Encoded": false, "another_key": "some value"}' ) with self.assertRaises(LambdaResponseParseException): LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=Route.API ) - def test_parse_returns_correct_tuple(self): + def test_extra_values_skipped_http_api(self): + lambda_output = ( + '{"statusCode": 200, "headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' + '"isBase64Encoded": false, "another_key": "some value"}' + ) + + (status_code, headers, body) = LocalApigwService._parse_v1_payload_format_lambda_output( + lambda_output, binary_types=[], flask_request=Mock(), event_type=Route.HTTP + ) + self.assertEqual(status_code, 200) + self.assertEqual(headers, Headers({"Content-Type": "application/json"})) + self.assertEqual(body, '{"message":"Hello from Lambda"}') + + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_parse_returns_correct_tuple(self, event_type): lambda_output = ( '{"statusCode": 200, "headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' ) (status_code, headers, body) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) self.assertEqual(status_code, 200) self.assertEqual(headers, Headers({"Content-Type": "application/json"})) self.assertEqual(body, '{"message":"Hello from Lambda"}') 
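The REST/HTTP API split exercised by `test_extra_values_raise` and `test_extra_values_skipped_http_api` above can be sketched as a small validator. This is only an illustration inferred from the tests: `ALLOWED_V1_KEYS`, `check_extra_keys`, and the `"Api"`/`"HttpApi"` strings are hypothetical names, not SAM CLI's actual constants or implementation.

```python
import json

# Hypothetical key set inferred from the tests; the real validation in
# samcli.local.apigw.local_apigw_service differs in detail.
ALLOWED_V1_KEYS = {"statusCode", "headers", "multiValueHeaders", "body",
                   "isBase64Encoded", "base64Encoded", "cookies"}

def check_extra_keys(lambda_output: str, event_type: str) -> dict:
    """REST API (Route.API) rejects unknown response keys; HTTP API skips them."""
    output = json.loads(lambda_output)
    extra = set(output) - ALLOWED_V1_KEYS
    if extra and event_type == "Api":
        raise ValueError(f"Unsupported keys in Lambda output: {sorted(extra)}")
    return output
```

The design point the tests pin down: the same v1-format parser must stay strict for REST API emulation while tolerating extra fields when emulating HTTP API's more lenient payload handling.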
- def test_parse_raises_when_invalid_mimetype(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_parse_raises_when_invalid_mimetype(self, event_type): lambda_output = ( '{"statusCode": 200, "headers": {\\"Content-Type\\": \\"text\\"}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' @@ -617,11 +681,92 @@ def test_parse_raises_when_invalid_mimetype(self): with self.assertRaises(LambdaResponseParseException): LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) + @parameterized.expand( + [ + param("isBase64Encoded", True, True), + param("base64Encoded", True, True), + param("isBase64Encoded", False, False), + param("base64Encoded", False, False), + param("isBase64Encoded", "True", True), + param("base64Encoded", "True", True), + param("isBase64Encoded", "true", True), + param("base64Encoded", "true", True), + param("isBase64Encoded", "False", False), + param("base64Encoded", "False", False), + param("isBase64Encoded", "false", False), + param("base64Encoded", "false", False), + ] + ) @patch("samcli.local.apigw.local_apigw_service.LocalApigwService._should_base64_decode_body") - def test_parse_returns_decodes_base64_to_binary(self, should_decode_body_patch): + def test_parse_returns_decodes_base64_to_binary_for_rest_api( + self, encoded_field_name, encoded_response_value, encoded_parsed_value, should_decode_body_patch + ): + should_decode_body_patch.return_value = True + + binary_body = b"011000100110100101101110011000010111001001111001" # binary in binary + base64_body = base64.b64encode(binary_body).decode("utf-8") + lambda_output = { + "statusCode": 200, + "headers": {"Content-Type": "application/octet-stream"}, + "body": base64_body, + encoded_field_name: encoded_response_value, + } + + flask_request_mock = Mock() + (status_code, headers, body) = 
LocalApigwService._parse_v1_payload_format_lambda_output( + json.dumps(lambda_output), binary_types=["*/*"], flask_request=flask_request_mock, event_type=Route.API + ) + + should_decode_body_patch.assert_called_with( + ["*/*"], flask_request_mock, Headers({"Content-Type": "application/octet-stream"}), encoded_parsed_value + ) + + self.assertEqual(status_code, 200) + self.assertEqual(headers, Headers({"Content-Type": "application/octet-stream"})) + self.assertEqual(body, binary_body) + + @parameterized.expand( + [ + param("isBase64Encoded", 0), + param("base64Encoded", 0), + param("isBase64Encoded", 1), + param("base64Encoded", 1), + param("isBase64Encoded", -1), + param("base64Encoded", -1), + param("isBase64Encoded", 10), + param("base64Encoded", 10), + param("isBase64Encoded", "TRue"), + param("base64Encoded", "TRue"), + param("isBase64Encoded", "Any Value"), + param("base64Encoded", "Any Value"), + ] + ) + @patch("samcli.local.apigw.local_apigw_service.LocalApigwService._should_base64_decode_body") + def test_parse_raise_exception_invalid_base64_encoded( + self, encoded_field_name, encoded_response_value, should_decode_body_patch + ): + should_decode_body_patch.return_value = True + + binary_body = b"011000100110100101101110011000010111001001111001" # binary in binary + base64_body = base64.b64encode(binary_body).decode("utf-8") + lambda_output = { + "statusCode": 200, + "headers": {"Content-Type": "application/octet-stream"}, + "body": base64_body, + encoded_field_name: encoded_response_value, + } + + flask_request_mock = Mock() + with self.assertRaises(LambdaResponseParseException): + LocalApigwService._parse_v1_payload_format_lambda_output( + json.dumps(lambda_output), binary_types=["*/*"], flask_request=flask_request_mock, event_type=Route.API + ) + + @patch("samcli.local.apigw.local_apigw_service.LocalApigwService._should_base64_decode_body") + def test_parse_base64Encoded_field_is_priority(self, should_decode_body_patch): 
should_decode_body_patch.return_value = True binary_body = b"011000100110100101101110011000010111001001111001" # binary in binary @@ -631,17 +776,136 @@ def test_parse_returns_decodes_base64_to_binary(self, should_decode_body_patch): "headers": {"Content-Type": "application/octet-stream"}, "body": base64_body, "isBase64Encoded": False, + "base64Encoded": True, } + flask_request_mock = Mock() (status_code, headers, body) = LocalApigwService._parse_v1_payload_format_lambda_output( - json.dumps(lambda_output), binary_types=["*/*"], flask_request=Mock() + json.dumps(lambda_output), binary_types=["*/*"], flask_request=flask_request_mock, event_type=Route.API + ) + + should_decode_body_patch.assert_called_with( + ["*/*"], flask_request_mock, Headers({"Content-Type": "application/octet-stream"}), True ) self.assertEqual(status_code, 200) self.assertEqual(headers, Headers({"Content-Type": "application/octet-stream"})) self.assertEqual(body, binary_body) - def test_status_code_not_int(self): + @parameterized.expand( + [ + param(True, True), + param(False, False), + param("True", True), + param("true", True), + param("False", False), + param("false", False), + ] + ) + def test_parse_returns_decodes_base64_to_binary_for_http_api(self, encoded_response_value, encoded_parsed_value): + binary_body = b"011000100110100101101110011000010111001001111001" # binary in binary + base64_body = base64.b64encode(binary_body).decode("utf-8") + lambda_output = { + "statusCode": 200, + "headers": {"Content-Type": "application/octet-stream"}, + "body": base64_body, + "isBase64Encoded": encoded_response_value, + } + + (status_code, headers, body) = LocalApigwService._parse_v1_payload_format_lambda_output( + json.dumps(lambda_output), binary_types=["*/*"], flask_request=Mock(), event_type=Route.HTTP + ) + + self.assertEqual(status_code, 200) + self.assertEqual(headers, Headers({"Content-Type": "application/octet-stream"})) + self.assertEqual(body, binary_body if encoded_parsed_value else 
base64_body) + + @parameterized.expand( + [ + param(0), + param(1), + param(-1), + param(10), + param("TRue"), + param("Any Value"), + ] + ) + def test_parse_raise_exception_invalid_base64_encoded_for_http_api(self, encoded_response_value): + + binary_body = b"011000100110100101101110011000010111001001111001" # binary in binary + base64_body = base64.b64encode(binary_body).decode("utf-8") + lambda_output = { + "statusCode": 200, + "headers": {"Content-Type": "application/octet-stream"}, + "body": base64_body, + "isBase64Encoded": encoded_response_value, + } + + flask_request_mock = Mock() + with self.assertRaises(LambdaResponseParseException): + LocalApigwService._parse_v1_payload_format_lambda_output( + json.dumps(lambda_output), binary_types=["*/*"], flask_request=flask_request_mock, event_type=Route.API + ) + + @parameterized.expand( + [ + param(True), + param(False), + param("True"), + param("true"), + param("False"), + param("false"), + param(0), + param(1), + param(-1), + param(10), + param("TRue"), + param("Any Value"), + ] + ) + def test_parse_skip_base_64_encoded_field_http_api(self, encoded_response_value): + binary_body = b"011000100110100101101110011000010111001001111001" # binary in binary + base64_body = base64.b64encode(binary_body).decode("utf-8") + lambda_output = { + "statusCode": 200, + "headers": {"Content-Type": "application/octet-stream"}, + "body": base64_body, + "base64Encoded": encoded_response_value, + } + + (status_code, headers, body) = LocalApigwService._parse_v1_payload_format_lambda_output( + json.dumps(lambda_output), binary_types=["*/*"], flask_request=Mock(), event_type=Route.HTTP + ) + + self.assertEqual(status_code, 200) + self.assertEqual(headers, Headers({"Content-Type": "application/octet-stream"})) + self.assertEqual(body, base64_body) + + def test_parse_returns_does_not_decodes_base64_to_binary_for_http_api(self): + binary_body = b"011000100110100101101110011000010111001001111001" # binary in binary + base64_body = 
base64.b64encode(binary_body).decode("utf-8") + lambda_output = { + "statusCode": 200, + "headers": {"Content-Type": "application/octet-stream"}, + "body": base64_body, + "isBase64Encoded": False, + } + + (status_code, headers, body) = LocalApigwService._parse_v1_payload_format_lambda_output( + json.dumps(lambda_output), binary_types=["*/*"], flask_request=Mock(), event_type=Route.HTTP + ) + + self.assertEqual(status_code, 200) + self.assertEqual(headers, Headers({"Content-Type": "application/octet-stream"})) + self.assertEqual(body, base64_body) + + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_status_code_not_int(self, event_type): lambda_output = ( '{"statusCode": "str", "headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' @@ -649,21 +913,33 @@ def test_status_code_not_int(self): with self.assertRaises(LambdaResponseParseException): LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) - def test_status_code_int_str(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_status_code_int_str(self, event_type): lambda_output = ( '{"statusCode": "200", "headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' ) (status_code, _, _) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) self.assertEqual(status_code, 200) - def test_status_code_negative_int(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_status_code_negative_int(self, event_type): lambda_output = ( '{"statusCode": -1, "headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' @@ 
-671,10 +947,39 @@ def test_status_code_negative_int(self): with self.assertRaises(LambdaResponseParseException): LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) - def test_status_code_negative_int_str(self): + def test_status_code_is_none_http_api(self): + lambda_output = ( + '{"headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' + ) + + with self.assertRaises(LambdaResponseParseException): + LocalApigwService._parse_v1_payload_format_lambda_output( + lambda_output, binary_types=[], flask_request=Mock(), event_type=Route.HTTP + ) + + def test_status_code_is_none_rest_api(self): + lambda_output = ( + '{"headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' + ) + + (status_code, headers, body) = LocalApigwService._parse_v1_payload_format_lambda_output( + lambda_output, binary_types=[], flask_request=Mock(), event_type=Route.API + ) + + self.assertEqual(status_code, 200) + self.assertEqual(headers, Headers({"Content-Type": "application/json"})) + self.assertEqual(body, '{"message":"Hello from Lambda"}') + + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_status_code_negative_int_str(self, event_type): lambda_output = ( '{"statusCode": "-1", "headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false}' @@ -682,44 +987,68 @@ def test_status_code_negative_int_str(self): with self.assertRaises(LambdaResponseParseException): LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) - def test_lambda_output_list_not_dict(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def 
test_lambda_output_list_not_dict(self, event_type): lambda_output = "[]" with self.assertRaises(LambdaResponseParseException): LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) - def test_lambda_output_not_json_serializable(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_lambda_output_not_json_serializable(self, event_type): lambda_output = "some str" with self.assertRaises(LambdaResponseParseException): LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) - def test_properties_are_null(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_properties_are_null(self, event_type): lambda_output = '{"statusCode": 0, "headers": null, "body": null, ' '"isBase64Encoded": null}' (status_code, headers, body) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) self.assertEqual(status_code, 200) self.assertEqual(headers, Headers({"Content-Type": "application/json"})) self.assertEqual(body, None) - def test_cookies_is_not_raise(self): + @parameterized.expand( + [ + param(Route.API), + param(Route.HTTP), + ] + ) + def test_cookies_is_not_raise(self, event_type): lambda_output = ( '{"statusCode": 200, "headers":{}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false, "cookies":{}}' ) (_, headers, _) = LocalApigwService._parse_v1_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() + lambda_output, binary_types=[], flask_request=Mock(), event_type=event_type ) @@ -761,16 +1090,19 @@ def 
test_custom_content_type_header_is_not_modified(self): self.assertIn("Content-Type", headers) self.assertEqual(headers["Content-Type"], "text/xml") - def test_extra_values_raise(self): + def test_extra_values_skipped(self): lambda_output = ( '{"statusCode": 200, "headers": {}, "body": "{\\"message\\":\\"Hello from Lambda\\"}", ' '"isBase64Encoded": false, "another_key": "some value"}' ) - with self.assertRaises(LambdaResponseParseException): - LocalApigwService._parse_v2_payload_format_lambda_output( - lambda_output, binary_types=[], flask_request=Mock() - ) + (status_code, headers, body) = LocalApigwService._parse_v2_payload_format_lambda_output( + lambda_output, binary_types=[], flask_request=Mock() + ) + + self.assertEqual(status_code, 200) + self.assertEqual(headers, Headers({"Content-Type": "application/json"})) + self.assertEqual(body, '{"message":"Hello from Lambda"}') def test_parse_returns_correct_tuple(self): lambda_output = ( @@ -797,10 +1129,7 @@ def test_parse_raises_when_invalid_mimetype(self): lambda_output, binary_types=[], flask_request=Mock() ) - @patch("samcli.local.apigw.local_apigw_service.LocalApigwService._should_base64_decode_body") - def test_parse_returns_decodes_base64_to_binary(self, should_decode_body_patch): - should_decode_body_patch.return_value = True - + def test_parse_returns_does_not_decodes_base64_to_binary(self): binary_body = b"011000100110100101101110011000010111001001111001" # binary in binary base64_body = base64.b64encode(binary_body).decode("utf-8") lambda_output = { @@ -814,6 +1143,24 @@ def test_parse_returns_decodes_base64_to_binary(self, should_decode_body_patch): json.dumps(lambda_output), binary_types=["*/*"], flask_request=Mock() ) + self.assertEqual(status_code, 200) + self.assertEqual(headers, Headers({"Content-Type": "application/octet-stream"})) + self.assertEqual(body, base64_body) + + def test_parse_returns_decodes_base64_to_binary(self): + binary_body = b"011000100110100101101110011000010111001001111001" # 
binary in binary + base64_body = base64.b64encode(binary_body).decode("utf-8") + lambda_output = { + "statusCode": 200, + "headers": {"Content-Type": "application/octet-stream"}, + "body": base64_body, + "isBase64Encoded": True, + } + + (status_code, headers, body) = LocalApigwService._parse_v2_payload_format_lambda_output( + json.dumps(lambda_output), binary_types=["*/*"], flask_request=Mock() + ) + self.assertEqual(status_code, 200) self.assertEqual(headers, Headers({"Content-Type": "application/octet-stream"})) self.assertEqual(body, binary_body) diff --git a/tests/unit/local/docker/test_container.py b/tests/unit/local/docker/test_container.py index a853bfd5bb..3c9b18b89f 100644 --- a/tests/unit/local/docker/test_container.py +++ b/tests/unit/local/docker/test_container.py @@ -65,6 +65,8 @@ def setUp(self): self.env_vars = {"key": "value"} self.container_opts = {"container": "opts"} self.additional_volumes = {"/somepath": {"blah": "blah value"}} + self.container_host = "localhost" + self.container_host_interface = "127.0.0.1" self.mock_docker_client = Mock() self.mock_docker_client.containers = Mock() @@ -138,6 +140,8 @@ def test_must_create_container_including_all_optional_values(self): docker_client=self.mock_docker_client, container_opts=self.container_opts, additional_volumes=self.additional_volumes, + container_host=self.container_host, + container_host_interface=self.container_host_interface, ) container_id = container.create() @@ -153,7 +157,7 @@ def test_must_create_container_including_all_optional_values(self): use_config_proxy=True, environment=self.env_vars, ports={ - container_port: ("127.0.0.1", host_port) + container_port: (self.container_host_interface, host_port) for container_port, host_port in {**self.exposed_ports, **self.always_exposed_ports}.items() }, entrypoint=self.entrypoint, @@ -513,12 +517,18 @@ def setUp(self): self.cmd = ["cmd"] self.working_dir = "working_dir" self.host_dir = "host_dir" + self.container_host = "localhost" 
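The container test above asserts that ports are now bound on a configurable interface (`--container-host-interface`) rather than a hard-coded `127.0.0.1`. A minimal sketch of that mapping, with illustrative names (`port_bindings` is not SAM CLI's API):

```python
def port_bindings(exposed_ports: dict, host_interface: str = "127.0.0.1") -> dict:
    """Bind each container port on the given host interface.

    Mirrors the shape the test asserts:
    ports={container_port: (host_interface, host_port), ...}
    """
    return {container_port: (host_interface, host_port)
            for container_port, host_port in exposed_ports.items()}
```

For example, passing `"0.0.0.0"` exposes the local Lambda endpoint on all interfaces, which is what the new `--container-host-interface` option enables.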
self.mock_docker_client = Mock() self.mock_docker_client.containers = Mock() self.mock_docker_client.containers.get = Mock() self.container = Container( - self.image, self.cmd, self.working_dir, self.host_dir, docker_client=self.mock_docker_client + self.image, + self.cmd, + self.working_dir, + self.host_dir, + docker_client=self.mock_docker_client, + container_host=self.container_host, ) self.container.id = "someid" diff --git a/tests/unit/local/docker/test_lambda_container.py b/tests/unit/local/docker/test_lambda_container.py index ed5759ac13..cce093b463 100644 --- a/tests/unit/local/docker/test_lambda_container.py +++ b/tests/unit/local/docker/test_lambda_container.py @@ -21,6 +21,7 @@ Runtime.python38.value, Runtime.python36.value, Runtime.python27.value, + Runtime.python39.value, ] RUNTIMES_WITH_DEBUG_ENV_VARS_ONLY = [ @@ -454,7 +455,7 @@ def test_must_skip_if_port_is_not_given(self): class TestLambdaContainer_get_image(TestCase): def test_must_return_build_image(self): - expected = "amazon/aws-sam-cli-emulation-image-foo:rapid-x.y.z" + expected = "public.ecr.aws/sam/emulation-foo:rapid-x.y.z" image_builder = Mock() image_builder.build.return_value = expected diff --git a/tests/unit/local/docker/test_lambda_debug_settings.py b/tests/unit/local/docker/test_lambda_debug_settings.py index 1eadb6e6f5..59329186cb 100644 --- a/tests/unit/local/docker/test_lambda_debug_settings.py +++ b/tests/unit/local/docker/test_lambda_debug_settings.py @@ -19,6 +19,7 @@ Runtime.python36, Runtime.python37, Runtime.python38, + Runtime.python39, ] diff --git a/tests/unit/local/docker/test_lambda_image.py b/tests/unit/local/docker/test_lambda_image.py index f7fd9043d5..d8b2653c75 100644 --- a/tests/unit/local/docker/test_lambda_image.py +++ b/tests/unit/local/docker/test_lambda_image.py @@ -98,7 +98,7 @@ def test_building_image_with_no_layers(self): self.assertEqual( lambda_image.build("python3.6", ZIP, None, []), - f"amazon/aws-sam-cli-emulation-image-python3.6:rapid-{version}", + 
f"public.ecr.aws/sam/emulation-python3.6:rapid-{version}", ) @patch("samcli.local.docker.lambda_image.LambdaImage._build_image") @@ -149,7 +149,7 @@ def test_force_building_image_that_doesnt_already_exists( generate_docker_image_version_patch.assert_called_once_with(["layers1"], "python3.6") docker_client_mock.images.get.assert_called_once_with("samcli/lambda:image-version") build_image_patch.assert_called_once_with( - "amazon/aws-sam-cli-emulation-image-python3.6:latest", + "public.ecr.aws/sam/emulation-python3.6:latest", "samcli/lambda:image-version", ["layers1"], stream=stream, @@ -179,7 +179,7 @@ def test_not_force_building_image_that_doesnt_already_exists( generate_docker_image_version_patch.assert_called_once_with(["layers1"], "python3.6") docker_client_mock.images.get.assert_called_once_with("samcli/lambda:image-version") build_image_patch.assert_called_once_with( - "amazon/aws-sam-cli-emulation-image-python3.6:latest", + "public.ecr.aws/sam/emulation-python3.6:latest", "samcli/lambda:image-version", ["layers1"], stream=stream, @@ -249,7 +249,12 @@ def test_build_image(self, generate_dockerfile_patch, path_patch, uuid_patch, cr handle.write.assert_called_with("Dockerfile content") path_patch.assert_called_once_with("cached layers", "dockerfile_uuid") docker_client_mock.api.build.assert_called_once_with( - fileobj=tarball_fileobj, rm=True, tag="docker_tag", pull=False, custom_context=True + fileobj=tarball_fileobj, + rm=True, + tag="docker_tag", + pull=False, + custom_context=True, + decode=True, ) docker_full_path_mock.unlink.assert_called_once() @@ -292,7 +297,60 @@ def test_build_image_fails_with_BuildError( handle.write.assert_called_with("Dockerfile content") path_patch.assert_called_once_with("cached layers", "dockerfile_uuid") docker_client_mock.api.build.assert_called_once_with( - fileobj=tarball_fileobj, rm=True, tag="docker_tag", pull=False, custom_context=True + fileobj=tarball_fileobj, + rm=True, + tag="docker_tag", + pull=False, + 
custom_context=True, + decode=True, + ) + + docker_full_path_mock.unlink.assert_not_called() + + @patch("samcli.local.docker.lambda_image.create_tarball") + @patch("samcli.local.docker.lambda_image.uuid") + @patch("samcli.local.docker.lambda_image.Path") + @patch("samcli.local.docker.lambda_image.LambdaImage._generate_dockerfile") + def test_build_image_fails_with_BuildError_from_output( + self, generate_dockerfile_patch, path_patch, uuid_patch, create_tarball_patch + ): + uuid_patch.uuid4.return_value = "uuid" + generate_dockerfile_patch.return_value = "Dockerfile content" + + docker_full_path_mock = Mock() + docker_full_path_mock.exists.return_value = False + path_patch.return_value = docker_full_path_mock + + docker_client_mock = Mock() + docker_client_mock.api.build.return_value = [{"stream": "Some text"}, {"error": "Problem in the build!"}] + layer_downloader_mock = Mock() + layer_downloader_mock.layer_cache = "cached layers" + + tarball_fileobj = Mock() + create_tarball_patch.return_value.__enter__.return_value = tarball_fileobj + + layer_version1 = Mock() + layer_version1.codeuri = "somevalue" + layer_version1.name = "name" + + dockerfile_mock = Mock() + m = mock_open(dockerfile_mock) + with patch("samcli.local.docker.lambda_image.open", m): + with self.assertRaisesRegexp(ImageBuildException, "Problem in the build!"): + LambdaImage(layer_downloader_mock, True, False, docker_client=docker_client_mock)._build_image( + "base_image", "docker_tag", [layer_version1] + ) + + handle = m() + handle.write.assert_called_with("Dockerfile content") + path_patch.assert_called_once_with("cached layers", "dockerfile_uuid") + docker_client_mock.api.build.assert_called_once_with( + fileobj=tarball_fileobj, + rm=True, + tag="docker_tag", + pull=False, + custom_context=True, + decode=True, ) docker_full_path_mock.unlink.assert_not_called() @@ -334,6 +392,11 @@ def test_build_image_fails_with_ApiError( handle.write.assert_called_with("Dockerfile content") 
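The new `decode=True` argument makes docker-py yield parsed dicts from the build stream, and `test_build_image_fails_with_BuildError_from_output` checks that a chunk carrying an `"error"` key fails the build. A hedged sketch of that scan (the function name is illustrative; the real code raises `ImageBuildException`):

```python
def raise_on_build_error(build_log) -> None:
    """Scan a decoded docker build log; fail on the first 'error' chunk."""
    for chunk in build_log:
        # With decode=True each chunk is a dict such as {"stream": "..."}
        # or {"error": "..."}; only the latter indicates a failed build.
        if "error" in chunk:
            raise RuntimeError(chunk["error"])
```

Without `decode=True` the stream yields raw JSON bytes, so the error would have to be parsed out manually; decoding up front is what makes this check a one-liner.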
path_patch.assert_called_once_with("cached layers", "dockerfile_uuid") docker_client_mock.api.build.assert_called_once_with( - fileobj=tarball_fileobj, rm=True, tag="docker_tag", pull=False, custom_context=True + fileobj=tarball_fileobj, + rm=True, + tag="docker_tag", + pull=False, + custom_context=True, + decode=True, ) docker_full_path_mock.unlink.assert_called_once() diff --git a/tests/unit/local/lambdafn/test_runtime.py b/tests/unit/local/lambdafn/test_runtime.py index d109039f8e..8e0c9c0e89 100644 --- a/tests/unit/local/lambdafn/test_runtime.py +++ b/tests/unit/local/lambdafn/test_runtime.py @@ -81,6 +81,8 @@ def test_must_create_lambda_container(self, LambdaContainerMock): debug_options=debug_options, env_vars=self.env_var_value, memory_mb=self.DEFAULT_MEMORY, + container_host=None, + container_host_interface=None, ) # Run the container and get results self.manager_mock.create.assert_called_with(container) @@ -161,7 +163,7 @@ def test_must_create_container_first_if_passed_container_is_none(self): create_mock.return_value = container self.runtime.run(None, self.func_config, debug_context=debug_options) - create_mock.assert_called_with(self.func_config, debug_options) + create_mock.assert_called_with(self.func_config, debug_options, None, None) self.manager_mock.run.assert_called_with(container) def test_must_skip_run_running_container(self): @@ -269,6 +271,8 @@ def test_must_run_container_and_wait_for_result(self, LambdaContainerMock): debug_options=debug_options, env_vars=self.env_var_value, memory_mb=self.DEFAULT_MEMORY, + container_host=None, + container_host_interface=None, ) # Run the container and get results @@ -510,6 +514,30 @@ def test_must_return_a_valid_file(self, unzip_file_mock, shutil_mock, os_mock): shutil_mock.rmtree.assert_not_called() +class TestLambdaRuntime_unarchived_layer(TestCase): + def setUp(self): + self.manager_mock = Mock() + self.layer_downloader = Mock() + self.runtime = LambdaRuntime(self.manager_mock, self.layer_downloader) + + 
@parameterized.expand([(LayerVersion("", arn="arn", codeuri="file.zip"),)]) + @patch("samcli.local.lambdafn.runtime.LambdaRuntime._get_code_dir") + def test_unarchived_layer(self, layer, get_code_dir_mock): + new_url = get_code_dir_mock.return_value = Mock() + result = self.runtime._unarchived_layer(layer) + self.assertNotEqual(layer, result) + self.assertEqual(new_url, result.codeuri) + + @parameterized.expand( + [("arn",), (LayerVersion("", arn="arn", codeuri="folder"),), ({"Name": "hi", "Version": "x.y.z"},)] + ) + @patch("samcli.local.lambdafn.runtime.LambdaRuntime._get_code_dir") + def test_unarchived_layer_not_local_archive_file(self, layer, get_code_dir_mock): + get_code_dir_mock.side_effect = lambda x: x # directly return the input + result = self.runtime._unarchived_layer(layer) + self.assertEqual(layer, result) + + class TestWarmLambdaRuntime_invoke(TestCase): DEFAULT_MEMORY = 128 @@ -595,6 +623,8 @@ def test_must_run_container_then_wait_for_result_and_container_not_stopped( debug_options=debug_options, env_vars=self.env_var_value, memory_mb=self.DEFAULT_MEMORY, + container_host=None, + container_host_interface=None, ) # Run the container and get results @@ -672,6 +702,8 @@ def test_must_create_non_cached_container(self, LambdaContainerMock, LambdaFunct debug_options=debug_options, env_vars=self.env_var_value, memory_mb=self.DEFAULT_MEMORY, + container_host=None, + container_host_interface=None, ) self.manager_mock.create.assert_called_with(container) @@ -737,6 +769,8 @@ def test_must_ignore_debug_options_if_function_name_is_not_debug_function( debug_options=None, env_vars=self.env_var_value, memory_mb=self.DEFAULT_MEMORY, + container_host=None, + container_host_interface=None, ) self.manager_mock.create.assert_called_with(container) # validate that the created container got cached
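The `_unarchived_layer` contract exercised in `TestLambdaRuntime_unarchived_layer` above can be sketched as follows. This is a simplified stand-in, assuming only the behavior the tests pin down: a local `.zip` codeuri is replaced with an extracted directory, while everything else (plain strings, directories, ARNs, dict layer specs) passes through unchanged. `unarchived_codeuri` and `extract_to_dir` are hypothetical names.

```python
def unarchived_codeuri(codeuri, extract_to_dir):
    """Return an extracted directory for local .zip layers, else the input.

    extract_to_dir is a callable standing in for LambdaRuntime._get_code_dir.
    """
    if isinstance(codeuri, str) and codeuri.endswith(".zip"):
        # Only local archive files get a new (unpacked) code location.
        return extract_to_dir(codeuri)
    return codeuri
```

This matches the two test cases: `LayerVersion(codeuri="file.zip")` gets a new codeuri, while `"arn"`, a folder codeuri, and a `{"Name": ..., "Version": ...}` spec are returned as-is.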