| Interface | Description |
|---|---|
| AlexaSkillDeployActionProps | (experimental) Construction properties of the Alexa deploy Action. |
| CloudFormationCreateReplaceChangeSetActionProps | (experimental) Properties for the CloudFormationCreateReplaceChangeSetAction. |
| CloudFormationCreateUpdateStackActionProps | (experimental) Properties for the CloudFormationCreateUpdateStackAction. |
| CloudFormationDeleteStackActionProps | (experimental) Properties for the CloudFormationDeleteStackAction. |
| CloudFormationExecuteChangeSetActionProps | (experimental) Properties for the CloudFormationExecuteChangeSetAction. |
| CodeBuildActionProps | (experimental) Construction properties of the CodeBuild build CodePipeline action. |
| CodeCommitSourceActionProps | (experimental) Construction properties of the CodeCommit source CodePipeline Action. |
| CodeCommitSourceVariables | (experimental) The CodePipeline variables emitted by the CodeCommit source Action. |
| CodeDeployEcsContainerImageInput | (experimental) Configuration for replacing a placeholder string in the ECS task definition template file with an image URI. |
| CodeDeployEcsDeployActionProps | (experimental) Construction properties of the CodeDeploy ECS deploy CodePipeline Action. |
| CodeDeployServerDeployActionProps | (experimental) Construction properties of the CodeDeploy server deploy CodePipeline Action. |
| CodeStarConnectionsSourceActionProps | (experimental) Construction properties for CodeStarConnectionsSourceAction. |
| EcrSourceActionProps | (experimental) Construction properties of EcrSourceAction. |
| EcrSourceVariables | (experimental) The CodePipeline variables emitted by the ECR source Action. |
| EcsDeployActionProps | (experimental) Construction properties of EcsDeployAction. |
| GitHubSourceActionProps | (experimental) Construction properties of the GitHub source action. |
| GitHubSourceVariables | (experimental) The CodePipeline variables emitted by the GitHub source Action. |
| IJenkinsProvider | (experimental) A Jenkins provider. |
| IJenkinsProvider.Jsii$Default | Internal default implementation for IJenkinsProvider. |
| JenkinsActionProps | (experimental) Construction properties of JenkinsAction. |
| JenkinsProviderAttributes | (experimental) Properties for importing an existing Jenkins provider. |
| JenkinsProviderProps | |
| LambdaInvokeActionProps | (experimental) Construction properties of the Lambda invoke CodePipeline Action. |
| ManualApprovalActionProps | (experimental) Construction properties of the ManualApprovalAction. |
| S3DeployActionProps | (experimental) Construction properties of the S3 deploy Action. |
| S3SourceActionProps | (experimental) Construction properties of the S3 source Action. |
| S3SourceVariables | (experimental) The CodePipeline variables emitted by the S3 source Action. |
| ServiceCatalogDeployActionBeta1Props | (experimental) Construction properties of the ServiceCatalog deploy CodePipeline Action. |
| StepFunctionsInvokeActionProps | (experimental) Construction properties of the StepFunctions Invoke Action. |
| Enum | Description |
|---|---|
| CodeBuildActionType | (experimental) The type of the CodeBuild action that determines its CodePipeline Category: Build or Test. |
| CodeCommitTrigger | (experimental) How the CodeCommit source Action should detect changes. |
| GitHubTrigger | (experimental) If and how the GitHub source action should be triggered. |
| JenkinsActionType | (experimental) The type of the Jenkins Action that determines its CodePipeline Category: Build or Test. |
| S3Trigger | (experimental) How the S3 source Action should detect changes. |
---
This package contains Actions that can be used in a CodePipeline.
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_codepipeline;
import software.amazon.awscdk.aws_codepipeline_actions;
To use a CodeCommit Repository in a CodePipeline:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_codecommit;
Repository repo = new Repository(this, "Repo", new RepositoryProps());
Object pipeline = Pipeline.Builder.create(this, "MyPipeline")
.pipelineName("MyPipeline")
.build();
Object sourceOutput = new Artifact();
Object sourceAction = CodeCommitSourceAction.Builder.create()
.actionName("CodeCommit")
.repository(repo)
.output(sourceOutput)
.build();
pipeline.addStage(Map.of(
"stageName", "Source",
"actions", asList(sourceAction)));
If you want to use an existing role for the "on commit" event rule, you can specify the role object in the eventRole property:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
IRole eventRole = iam.Role.fromRoleArn(this, "Event-role", "roleArn");
Object sourceAction = CodeCommitSourceAction.Builder.create()
.actionName("CodeCommit")
.repository(repo)
.output(new Artifact())
.eventRole(eventRole)
.build();
If you want to clone the entire CodeCommit repository (only available for CodeBuild actions),
you can set the codeBuildCloneOutput property to true:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object sourceOutput = new Artifact();
Object sourceAction = CodeCommitSourceAction.Builder.create()
.actionName("CodeCommit")
.repository(repo)
.output(sourceOutput)
.codeBuildCloneOutput(true)
.build();
Object buildAction = CodeBuildAction.Builder.create()
.actionName("CodeBuild")
.project(project)
.input(sourceOutput)// The build action must use the CodeCommitSourceAction output as input.
.outputs(asList(new Artifact()))
.build();
The CodeCommit source action emits variables:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object sourceAction = CodeCommitSourceAction.Builder.create()
// ...
.variablesNamespace("MyNamespace")
.build();
// later:
CodeBuildAction.Builder.create()
// ...
.environmentVariables(Map.of(
"COMMIT_ID", Map.of(
"value", sourceAction.variables.getCommitId())))
.build();
If you want to use a GitHub repository as the source, you must create:

- A GitHub Access Token, with scopes repo and admin:repo_hook.
- A Secrets Manager Secret with the value of the GitHub Access Token. Pick whatever name you want (for example my-github-token).

This token can be stored either as Plaintext or as a Secret key/value.
If you stored the token as Plaintext,
set cdk.SecretValue.secretsManager('my-github-token') as the value of oauthToken.
If you stored it as a Secret key/value,
you must set cdk.SecretValue.secretsManager('my-github-token', { jsonField : 'my-github-token' }) as the value of oauthToken.

To use GitHub as the source of a CodePipeline:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
// Read the secret from Secrets Manager
Object sourceOutput = new Artifact();
Object sourceAction = GitHubSourceAction.Builder.create()
.actionName("GitHub_Source")
.owner("awslabs")
.repo("aws-cdk")
.oauthToken(cdk.SecretValue.secretsManager("my-github-token"))
.output(sourceOutput)
.branch("develop")
.build();
pipeline.addStage(Map.of(
"stageName", "Source",
"actions", asList(sourceAction)));
The GitHub source action emits variables:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object sourceAction = GitHubSourceAction.Builder.create()
// ...
.variablesNamespace("MyNamespace")
.build();
// later:
CodeBuildAction.Builder.create()
// ...
.environmentVariables(Map.of(
"COMMIT_URL", Map.of(
"value", sourceAction.variables.getCommitUrl())))
.build();
CodePipeline can use a BitBucket Git repository as a source:
Note: you have to manually connect CodePipeline through the AWS Console with your BitBucket account.
This is a one-time operation for a given AWS account in a given region.
The simplest way to do that is to either start creating a new CodePipeline,
or edit an existing one, while being logged in to BitBucket.
Choose BitBucket as the source,
and grant CodePipeline permissions to your BitBucket account.
Copy & paste the Connection ARN that you get in the console,
or use the codestar-connections list-connections AWS CLI operation
to find it.
After that, you can safely abort creating or editing the pipeline -
the connection has already been created.
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object sourceOutput = new Artifact();
Object sourceAction = CodeStarConnectionsSourceAction.Builder.create()
.actionName("BitBucket_Source")
.owner("aws")
.repo("aws-cdk")
.output(sourceOutput)
.connectionArn("arn:aws:codestar-connections:us-east-1:123456789012:connection/12345678-abcd-12ab-34cdef5678gh")
.build();
You can also use the CodeStarConnectionsSourceAction to connect to GitHub, in the same way
(you just have to select GitHub as the source when creating the connection in the console).
To use an S3 Bucket as a source in CodePipeline:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_s3;
Bucket sourceBucket = new Bucket(this, "MyBucket", new BucketProps()
.versioned(true));
Object pipeline = new Pipeline(this, "MyPipeline");
Object sourceOutput = new Artifact();
Object sourceAction = S3SourceAction.Builder.create()
.actionName("S3Source")
.bucket(sourceBucket)
.bucketKey("path/to/file.zip")
.output(sourceOutput)
.build();
pipeline.addStage(Map.of(
"stageName", "Source",
"actions", asList(sourceAction)));
The region of the action will be determined by the region the bucket itself is in. When using a newly created bucket, that region will be taken from the stack the bucket belongs to; for an imported bucket, you can specify the region explicitly:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
IBucket sourceBucket = s3.Bucket.fromBucketAttributes(this, "SourceBucket", new BucketAttributes()
.bucketName("my-bucket")
.region("ap-southeast-1"));
By default, the Pipeline will poll the Bucket to detect changes.
You can change that behavior to use CloudWatch Events by setting the trigger
property to S3Trigger.EVENTS (it's S3Trigger.POLL by default).
If you do that, make sure the source Bucket is part of an AWS CloudTrail Trail -
otherwise, the CloudWatch Events will not be emitted,
and your Pipeline will not react to changes in the Bucket.
You can do it through the CDK:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_cloudtrail;
String key = "some/key.zip";
Trail trail = new Trail(this, "CloudTrail");
trail.addS3EventSelector(asList(new S3EventSelector()
.bucket(sourceBucket)
.objectPrefix(key)), new AddEventSelectorOptions()
.readWriteType(cloudtrail.ReadWriteType.getWRITE_ONLY()));
Object sourceAction = S3SourceAction.Builder.create()
.actionName("S3Source")
.bucketKey(key)
.bucket(sourceBucket)
.output(sourceOutput)
.trigger(codepipeline_actions.S3Trigger.getEVENTS())
.build();
The S3 source action emits variables:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object sourceAction = S3SourceAction.Builder.create()
// ...
.variablesNamespace("MyNamespace")
.build();
// later:
CodeBuildAction.Builder.create()
// ...
.environmentVariables(Map.of(
"VERSION_ID", Map.of(
"value", sourceAction.variables.getVersionId())))
.build();
To use an ECR Repository as a source in a Pipeline:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_ecr;
Object pipeline = new Pipeline(this, "MyPipeline");
Object sourceOutput = new Artifact();
Object sourceAction = EcrSourceAction.Builder.create()
.actionName("ECR")
.repository(ecrRepository)
.imageTag("some-tag")// optional, default: 'latest'
.output(sourceOutput)
.build();
pipeline.addStage(Map.of(
"stageName", "Source",
"actions", asList(sourceAction)));
The ECR source action emits variables:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object sourceAction = EcrSourceAction.Builder.create()
// ...
.variablesNamespace("MyNamespace")
.build();
// later:
CodeBuildAction.Builder.create()
// ...
.environmentVariables(Map.of(
"IMAGE_URI", Map.of(
"value", sourceAction.variables.getImageUri())))
.build();
Example of a CodeBuild Project used in a Pipeline, alongside CodeCommit:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_codebuild;
import software.amazon.awscdk.aws_codecommit;
Repository repository = new Repository(this, "MyRepository", new RepositoryProps()
.repositoryName("MyRepository"));
PipelineProject project = new PipelineProject(this, "MyProject");
Object sourceOutput = new Artifact();
Object sourceAction = CodeCommitSourceAction.Builder.create()
.actionName("CodeCommit")
.repository(repository)
.output(sourceOutput)
.build();
Object buildAction = CodeBuildAction.Builder.create()
.actionName("CodeBuild")
.project(project)
.input(sourceOutput)
.outputs(asList(new Artifact()))// optional
.executeBatchBuild(true)
.build();
Pipeline.Builder.create(this, "MyPipeline")
.stages(asList(Map.of(
"stageName", "Source",
"actions", asList(sourceAction)), Map.of(
"stageName", "Build",
"actions", asList(buildAction))))
.build();
The default category of the CodeBuild Action is Build;
if you want a Test Action instead,
override the type property:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object testAction = CodeBuildAction.Builder.create()
.actionName("IntegrationTest")
.project(project)
.input(sourceOutput)
.type(codepipeline_actions.CodeBuildActionType.getTEST())
.build();
When you want to have multiple inputs and/or outputs for a Project used in a
Pipeline, instead of using the secondarySources and secondaryArtifacts
properties of the Project class, you need to use the extraInputs and
outputs properties of the CodeBuild CodePipeline
Actions. Example:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object sourceOutput1 = new Artifact();
Object sourceAction1 = CodeCommitSourceAction.Builder.create()
.actionName("Source1")
.repository(repository1)
.output(sourceOutput1)
.build();
Object sourceOutput2 = new Artifact("source2");
Object sourceAction2 = CodeCommitSourceAction.Builder.create()
.actionName("Source2")
.repository(repository2)
.output(sourceOutput2)
.build();
Object buildAction = CodeBuildAction.Builder.create()
.actionName("Build")
.project(project)
.input(sourceOutput1)
.extraInputs(asList(sourceOutput2))
.outputs(asList(
new Artifact("artifact1"), // for better buildspec readability - see below
new Artifact("artifact2")))
.build();
Note: when a CodeBuild Action in a Pipeline has more than one output, it
only uses the secondary-artifacts field of the buildspec, never the
primary output specification directly under artifacts. Because of that, it
pays to explicitly name all output artifacts of that Action, like we did
above, so that you know what name to use in the buildspec.
Example buildspec for the above project:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object project = PipelineProject.Builder.create(this, "MyProject")
.buildSpec(codebuild.BuildSpec.fromObject(Map.of(
"version", "0.2",
"phases", Map.of(
"build", Map.of(
"commands", asList())),
"artifacts", Map.of(
"secondary-artifacts", Map.of(
"artifact1", Map.of(),
"artifact2", Map.of())))))
.build();
The CodeBuild action emits variables. Unlike many other actions, the variables are not static, but dynamic, defined in the buildspec, in the 'exported-variables' subsection of the 'env' section. Example:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object buildAction = CodeBuildAction.Builder.create()
.actionName("Build1")
.input(sourceOutput)
.project(PipelineProject.Builder.create(this, "Project")
.buildSpec(codebuild.BuildSpec.fromObject(Map.of(
"version", "0.2",
"env", Map.of(
"exported-variables", asList("MY_VAR")),
"phases", Map.of(
"build", Map.of(
"commands", "export MY_VAR=\"some value\"")))))
.build())
.variablesNamespace("MyNamespace")
.build();
// later:
CodeBuildAction.Builder.create()
// ...
.environmentVariables(Map.of(
"MyVar", Map.of(
"value", buildAction.variable("MY_VAR"))))
.build();
In order to use Jenkins Actions in the Pipeline,
you first need to create a JenkinsProvider:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object jenkinsProvider = JenkinsProvider.Builder.create(this, "JenkinsProvider")
.providerName("MyJenkinsProvider")
.serverUrl("http://my-jenkins.com:8080")
.version("2")
.build();
If you've registered a Jenkins provider in a different CDK app, or outside the CDK (in the CodePipeline AWS Console, for example), you can import it:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object jenkinsProvider = codepipeline_actions.JenkinsProvider.import(this, "JenkinsProvider", Map.of(
"providerName", "MyJenkinsProvider",
"serverUrl", "http://my-jenkins.com:8080",
"version", "2"));
Note that a Jenkins provider (identified by the tuple of provider name, category (Build or Test), and version) must always be registered in the given account and AWS region before it can be used in CodePipeline.
With a JenkinsProvider,
we can create a Jenkins Action:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object buildAction = JenkinsAction.Builder.create()
.actionName("JenkinsBuild")
.jenkinsProvider(jenkinsProvider)
.projectName("MyProject")
.type(codepipeline_actions.JenkinsActionType.getBUILD())
.build();
This module contains Actions that allow you to deploy to CloudFormation from AWS CodePipeline.
For example, the following code fragment defines a pipeline that automatically deploys a CloudFormation template directly from a CodeCommit repository, with a manual approval step in between to confirm the changes:
example Pipeline to deploy CloudFormation
See the AWS documentation for more details about using CloudFormation in CodePipeline.
This package contains the following CloudFormation actions:

- CloudFormationCreateReplaceChangeSetAction - creates a change set for the given stack, replacing any existing change set with the same name.
- CloudFormationExecuteChangeSetAction - executes a previously created change set.
- CloudFormationCreateUpdateStackAction - creates the stack if it doesn't exist, or updates it if it does (the existing stack will not be destroyed and recreated, unless it is in a failed state and replaceOnFailure
is set to true, in which case it will be destroyed and recreated).
- CloudFormationDeleteStackAction - deletes the stack.
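The elided example above can be sketched roughly as follows: create a change set from a template in the source artifact, pause for a manual approval, then execute the change set. This is an uncompiled sketch in the same style as the other examples; the stack name, change-set name, and template file name are illustrative, and it assumes a pipeline and a sourceOutput artifact as in the earlier examples.

```java
// Uncompiled sketch: deploy a CloudFormation template from the source artifact,
// with a manual approval between preparing and executing the changes.
Object prepareChanges = CloudFormationCreateReplaceChangeSetAction.Builder.create()
        .actionName("PrepareChanges")
        .stackName("MyStack")          // illustrative stack name
        .changeSetName("MyChangeSet")  // illustrative change set name
        .adminPermissions(true)
        .templatePath(sourceOutput.atPath("template.yaml"))  // illustrative file name
        .runOrder(1)
        .build();
Object approveChanges = ManualApprovalAction.Builder.create()
        .actionName("ApproveChanges")
        .runOrder(2)
        .build();
Object executeChanges = CloudFormationExecuteChangeSetAction.Builder.create()
        .actionName("ExecuteChanges")
        .stackName("MyStack")
        .changeSetName("MyChangeSet")
        .runOrder(3)
        .build();
pipeline.addStage(Map.of(
    "stageName", "Deploy",
    "actions", asList(prepareChanges, approveChanges, executeChanges)));
```

Using runOrder sequences the three actions within the single Deploy stage, so the change set is never executed before it has been approved.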
If you want to deploy your Lambda through CodePipeline,
and you don't use assets (for example, because your CDK code and Lambda code are separate),
you can use a special Lambda Code class, CfnParametersCode.
Note that your Lambda must be in a different Stack than your Pipeline.
The Lambda itself will be deployed, alongside the entire Stack it belongs to,
using a CloudFormation CodePipeline Action. Example:
Example of deploying a Lambda through CodePipeline
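The elided example can be sketched as follows, under these assumptions: lambdaStack is the separate Stack holding the Lambda, cdkBuildOutput is an artifact containing the synthesized template, and lambdaBuildOutput is an artifact containing the zipped Lambda code (all names illustrative). This is an uncompiled sketch, not the original example.

```java
// Uncompiled sketch: a Lambda whose code is supplied through CloudFormation
// parameters, deployed via a CloudFormation CodePipeline Action.
// The Lambda lives in its own Stack (lambdaStack), separate from the pipeline.
CfnParametersCode lambdaCode = lambda.Code.fromCfnParameters();
new Function(lambdaStack, "Lambda", new FunctionProps()
        .code(lambdaCode)
        .handler("index.handler")
        .runtime(lambda.Runtime.getNODEJS_12_X()));

// In the pipeline stack: deploy lambdaStack, wiring the location of the
// built Lambda code into the template's parameters.
CloudFormationCreateUpdateStackAction.Builder.create()
        .actionName("Lambda_CFN_Deploy")
        .templatePath(cdkBuildOutput.atPath("LambdaStack.template.json"))  // illustrative file name
        .stackName("LambdaStackDeployedName")  // illustrative stack name
        .adminPermissions(true)
        .parameterOverrides(lambdaCode.assign(lambdaBuildOutput.getS3Location()))
        .extraInputs(asList(lambdaBuildOutput))
        .build();
```

The key piece is CfnParametersCode.assign, which maps the artifact's S3 location onto the template parameters that fromCfnParameters created.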
If you want to update stacks in a different account,
pass the account property when creating the action:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
CloudFormationCreateUpdateStackAction.Builder.create()
// ...
.account("123456789012")
.build();
This will create a new stack, called <PipelineStackName>-support-123456789012, in your App,
that will contain the role that the pipeline will assume in account 123456789012 before executing this action.
This support stack will automatically be deployed before the stack containing the pipeline.
You can also pass a role explicitly when creating the action -
in that case, the account property is ignored,
and the action will operate in the same account the role belongs to:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.PhysicalName;
// in stack for account 123456789012...
Role actionRole = new Role(otherAccountStack, "ActionRole", new RoleProps()
.assumedBy(new AccountPrincipal(pipelineAccount))
// the role has to have a physical name set
.roleName(PhysicalName.getGENERATE_IF_NEEDED()));
// in the pipeline stack...
CloudFormationCreateUpdateStackAction.Builder.create()
// ...
.role(actionRole)
.build();
To use CodeDeploy for EC2/on-premise deployments in a Pipeline:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_codedeploy;
Object pipeline = Pipeline.Builder.create(this, "MyPipeline")
.pipelineName("MyPipeline")
.build();
// add the source and build Stages to the Pipeline...
Object deployAction = CodeDeployServerDeployAction.Builder.create()
.actionName("CodeDeploy")
.input(buildOutput)
.deploymentGroup(deploymentGroup)
.build();
pipeline.addStage(Map.of(
"stageName", "Deploy",
"actions", asList(deployAction)));
To use CodeDeploy for blue-green Lambda deployments in a Pipeline:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
CfnParametersCode lambdaCode = lambda.Code.fromCfnParameters();
Function func = new Function(lambdaStack, "Lambda", new FunctionProps()
.code(lambdaCode)
.handler("index.handler")
.runtime(lambda.Runtime.getNODEJS_12_X()));
// used to make sure each CDK synthesis produces a different Version
Version version = func.addVersion("NewVersion");
Alias alias = new Alias(lambdaStack, "LambdaAlias", new AliasProps()
.aliasName("Prod")
.version(version));
LambdaDeploymentGroup.Builder.create(lambdaStack, "DeploymentGroup")
.alias(alias)
.deploymentConfig(codedeploy.LambdaDeploymentConfig.getLINEAR_10PERCENT_EVERY_1MINUTE())
.build();
Then, you need to create your Pipeline Stack,
where you will define your Pipeline,
and deploy the lambdaStack using a CloudFormation CodePipeline Action
(see above for a complete example).
CodePipeline can deploy an ECS service. The deploy Action receives one input Artifact which contains the image definition file:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object deployStage = pipeline.addStage(Map.of(
"stageName", "Deploy",
"actions", asList(
EcsDeployAction.Builder.create()
.actionName("DeployAction")
.service(service)
// if your file is called imagedefinitions.json,
// use the `input` property,
// and leave out the `imageFile` property
.input(buildOutput)
// if your file name is _not_ imagedefinitions.json,
// use the `imageFile` property,
// and leave out the `input` property
.imageFile(buildOutput.atPath("imageDef.json"))
.deploymentTimeout(cdk.Duration.minutes(60))
.build())));
The idiomatic CDK way of deploying an ECS application is to have your Dockerfiles and your CDK code in the same source code repository, leveraging Docker Assets, and use the CDK Pipelines module.
However, if you want to deploy a Docker application whose source code is kept in a separate version control repository than the CDK code,
you can use the TagParameterContainerImage class from the ECS module.
Here's an example:
example ECS pipeline for an application in a separate source code repository
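The elided example can be sketched as follows; it is an uncompiled sketch, and ecrRepository, cdkBuildOutput, and imageTag (the tag produced by the application's Docker build action) are assumed names, not part of the original.

```java
// Uncompiled sketch: reference an ECR image whose tag is a CloudFormation
// parameter, filled in by the pipeline at deploy time.
TagParameterContainerImage tagParameterContainerImage =
        new TagParameterContainerImage(ecrRepository);
// ...use tagParameterContainerImage as the image of a container
// in the service's TaskDefinition, then, in the pipeline:
CloudFormationCreateUpdateStackAction.Builder.create()
        .actionName("CFN_Deploy")
        .stackName("MyServiceStack")  // illustrative stack name
        .templatePath(cdkBuildOutput.atPath("MyServiceStack.template.json"))  // illustrative file name
        .adminPermissions(true)
        .parameterOverrides(Map.of(
            tagParameterContainerImage.getTagParameterName(), imageTag))
        .build();
```

This lets the application repository's build push a new image and pass only its tag to the CDK-defined service, keeping the two repositories decoupled.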
To use an S3 Bucket as a deployment target in CodePipeline:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Bucket targetBucket = new Bucket(this, "MyBucket", new BucketProps());
Object pipeline = new Pipeline(this, "MyPipeline");
Object deployAction = S3DeployAction.Builder.create()
.actionName("S3Deploy")
.stage(deployStage)
.bucket(targetBucket)
.input(sourceOutput)
.build();
Object deployStage = pipeline.addStage(Map.of(
"stageName", "Deploy",
"actions", asList(deployAction)));
There is currently no native support in CodePipeline for invalidating a CloudFront cache after deployment. One workaround is to add another build step after the deploy step, and use the AWS CLI to invalidate the cache:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
// Create a Cloudfront Web Distribution
Object distribution = Distribution.Builder.create(this, "Distribution").build();
// Create the build project that will invalidate the cache
Object invalidateBuildProject = PipelineProject.Builder.create(this, "InvalidateProject")
.buildSpec(codebuild.BuildSpec.fromObject(Map.of(
"version", "0.2",
"phases", Map.of(
"build", Map.of(
"commands", asList("aws cloudfront create-invalidation --distribution-id ${CLOUDFRONT_ID} --paths \"/*\""))))))
.environmentVariables(Map.of(
"CLOUDFRONT_ID", Map.of("value", distribution.getDistributionId())))
.build();
// Add Cloudfront invalidation permissions to the project
String distributionArn = String.format("arn:aws:cloudfront::%s:distribution/%s", this.account, distribution.getDistributionId());
invalidateBuildProject.addToRolePolicy(new PolicyStatement(new PolicyStatementProps()
.resources(asList(distributionArn))
.actions(asList("cloudfront:CreateInvalidation"))));
// Create the pipeline (here only the S3 deploy and Invalidate cache build)
Pipeline.Builder.create(this, "Pipeline")
.stages(asList(Map.of(
"stageName", "Deploy",
"actions", asList(
S3DeployAction.Builder.create()
.actionName("S3Deploy")
.bucket(deployBucket)
.input(deployInput)
.runOrder(1)
.build(),
CodeBuildAction.Builder.create()
.actionName("InvalidateCache")
.project(invalidateBuildProject)
.input(deployInput)
.runOrder(2)
.build()))))
.build();
You can deploy to Alexa using CodePipeline with the following Action:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
// Read the secrets from ParameterStore
Object clientId = cdk.SecretValue.secretsManager("AlexaClientId");
Object clientSecret = cdk.SecretValue.secretsManager("AlexaClientSecret");
Object refreshToken = cdk.SecretValue.secretsManager("AlexaRefreshToken");
// Add deploy action
AlexaSkillDeployAction.Builder.create()
.actionName("DeploySkill")
.runOrder(1)
.input(sourceOutput)
.clientId(clientId.toString())
.clientSecret(clientSecret)
.refreshToken(refreshToken)
.skillId("amzn1.ask.skill.12345678-1234-1234-1234-123456789012")
.build();
If you need manifest overrides you can specify them as parameterOverridesArtifact in the action:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_cloudformation;
// Deploy some CFN change set and store output
Object executeOutput = new Artifact("CloudFormation");
Object executeChangeSetAction = CloudFormationExecuteChangeSetAction.Builder.create()
.actionName("ExecuteChangesTest")
.runOrder(2)
.stackName(stackName)
.changeSetName(changeSetName)
.outputFileName("overrides.json")
.output(executeOutput)
.build();
// Provide CFN output as manifest overrides
AlexaSkillDeployAction.Builder.create()
.actionName("DeploySkill")
.runOrder(1)
.input(sourceOutput)
.parameterOverridesArtifact(executeOutput)
.clientId(clientId.toString())
.clientSecret(clientSecret)
.refreshToken(refreshToken)
.skillId("amzn1.ask.skill.12345678-1234-1234-1234-123456789012")
.build();
You can deploy a CloudFormation template to an existing Service Catalog product with the following Action:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object serviceCatalogDeployAction = ServiceCatalogDeployActionBeta1.Builder.create()
.actionName("ServiceCatalogDeploy")
.templatePath(cdkBuildOutput.atPath("Sample.template.json"))
.productVersionName("Version - " + new Date())
.productType("CLOUD_FORMATION_TEMPLATE")
.productVersionDescription("This is a version from the pipeline with a new description.")
.productId("prod-XXXXXXXX")
.build();
This package contains an Action that stops the Pipeline until someone manually clicks the approve button:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object manualApprovalAction = ManualApprovalAction.Builder.create()
.actionName("Approve")
.notificationTopic(new Topic(this, "Topic"))// optional
.notifyEmails(asList("some_email@example.com"))// optional
.additionalInformation("additional info")
.build();
approveStage.addAction(manualApprovalAction);
If the notificationTopic has not been provided,
but notifyEmails were,
a new SNS Topic will be created
(and accessible through the notificationTopic property of the Action).
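A minimal sketch of using that auto-created Topic, assuming the EmailSubscription class from the aws-sns-subscriptions module and an existing approveStage (uncompiled, in the same style as the other examples):

```java
// Uncompiled sketch: let the action create the Topic from notifyEmails,
// then add a further subscription to the auto-created Topic.
Object approvalAction = ManualApprovalAction.Builder.create()
        .actionName("Approve")
        .notifyEmails(asList("some_email@example.com"))
        .build();
approveStage.addAction(approvalAction);
// the Topic only exists once the action has been added to a stage:
approvalAction.getNotificationTopic()
        .addSubscription(new EmailSubscription("another_email@example.com"));
```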
This module contains an Action that allows you to invoke a Lambda function in a Pipeline:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_lambda;
Object pipeline = new Pipeline(this, "MyPipeline");
Object lambdaAction = LambdaInvokeAction.Builder.create()
.actionName("Lambda")
.lambda(fn)
.build();
pipeline.addStage(Map.of(
"stageName", "Lambda",
"actions", asList(lambdaAction)));
The Lambda Action can have up to 5 inputs, and up to 5 outputs:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
Object lambdaAction = LambdaInvokeAction.Builder.create()
.actionName("Lambda")
.inputs(asList(sourceOutput, buildOutput))
.outputs(asList(
new Artifact("Out1"),
new Artifact("Out2")))
.lambda(fn)
.build();
The Lambda invoke action emits variables.
Unlike many other actions, the variables are not static,
but dynamic: they are defined by the function calling the PutJobSuccessResult
API with the outputVariables property filled with the map of variables.
Example:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_lambda;
Object lambdaInvokeAction = LambdaInvokeAction.Builder.create()
.actionName("Lambda")
.lambda(new Function(this, "Func", new FunctionProps()
.runtime(lambda.Runtime.getNODEJS_12_X())
.handler("index.handler")
.code(lambda.Code.fromInline("\n const AWS = require('aws-sdk');\n\n exports.handler = async function(event, context) {\n const codepipeline = new AWS.CodePipeline();\n await codepipeline.putJobSuccessResult({\n jobId: event['CodePipeline.job'].id,\n outputVariables: {\n MY_VAR: \"some value\",\n },\n }).promise();\n }\n "))))
.variablesNamespace("MyNamespace")
.build();
// later:
CodeBuildAction.Builder.create()
// ...
.environmentVariables(Map.of(
"MyVar", Map.of(
"value", lambdaInvokeAction.variable("MY_VAR"))))
.build();
See the AWS documentation on how to write a Lambda function invoked from CodePipeline.
This module contains an Action that allows you to invoke a Step Function in a Pipeline:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_stepfunctions;
Object pipeline = new Pipeline(this, "MyPipeline");
Pass startState = new Pass(stack, "StartState");
StateMachine simpleStateMachine = new StateMachine(stack, "SimpleStateMachine", new StateMachineProps()
.definition(startState));
Object stepFunctionAction = StepFunctionsInvokeAction.Builder.create()
.actionName("Invoke")
.stateMachine(simpleStateMachine)
.stateMachineInput(codepipeline_actions.StateMachineInput.literal(Map.of("IsHelloWorldExample", true)))
.build();
pipeline.addStage(Map.of(
"stageName", "StepFunctions",
"actions", asList(stepFunctionAction)));
The StateMachineInput can be created with one of 2 static factory methods:
literal, which takes an arbitrary map as its only argument, or filePath:
// Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import software.amazon.awscdk.aws_stepfunctions;
Object pipeline = new Pipeline(this, "MyPipeline");
Object inputArtifact = new Artifact();
Pass startState = new Pass(stack, "StartState");
StateMachine simpleStateMachine = new StateMachine(stack, "SimpleStateMachine", new StateMachineProps()
.definition(startState));
Object stepFunctionAction = StepFunctionsInvokeAction.Builder.create()
.actionName("Invoke")
.stateMachine(simpleStateMachine)
.stateMachineInput(codepipeline_actions.StateMachineInput.filePath(inputArtifact.atPath("assets/input.json")))
.build();
pipeline.addStage(Map.of(
"stageName", "StepFunctions",
"actions", asList(stepFunctionAction)));
See the AWS documentation on the CodePipeline action structure reference for more information.
Copyright © 2021. All rights reserved.