2022/12/21 - AWS Transfer Family - 3 updated api methods
Changes: This release adds support for Decrypt as a workflow step type.
{'OnExceptionSteps': {'DecryptStepDetails': {'DestinationFileLocation': {'EfsFileLocation': {'FileSystemId': 'string', 'Path': 'string'}, 'S3FileLocation': {'Bucket': 'string', 'Key': 'string'}}, 'Name': 'string', 'OverwriteExisting': 'TRUE | FALSE', 'SourceFileLocation': 'string', 'Type': 'PGP'}, 'Type': {'DECRYPT'}}, 'Steps': {'DecryptStepDetails': {'DestinationFileLocation': {'EfsFileLocation': {'FileSystemId': 'string', 'Path': 'string'}, 'S3FileLocation': {'Bucket': 'string', 'Key': 'string'}}, 'Name': 'string', 'OverwriteExisting': 'TRUE | FALSE', 'SourceFileLocation': 'string', 'Type': 'PGP'}, 'Type': {'DECRYPT'}}}
Allows you to create a workflow with specified steps and step details that the workflow invokes after a file transfer completes. After creating a workflow, you can associate it with any transfer server by specifying the workflow-details field in the CreateServer and UpdateServer operations.
See also: AWS API Documentation
Request Syntax
client.create_workflow( Description='string', Steps=[ { 'Type': 'COPY'|'CUSTOM'|'TAG'|'DELETE'|'DECRYPT', 'CopyStepDetails': { 'Name': 'string', 'DestinationFileLocation': { 'S3FileLocation': { 'Bucket': 'string', 'Key': 'string' }, 'EfsFileLocation': { 'FileSystemId': 'string', 'Path': 'string' } }, 'OverwriteExisting': 'TRUE'|'FALSE', 'SourceFileLocation': 'string' }, 'CustomStepDetails': { 'Name': 'string', 'Target': 'string', 'TimeoutSeconds': 123, 'SourceFileLocation': 'string' }, 'DeleteStepDetails': { 'Name': 'string', 'SourceFileLocation': 'string' }, 'TagStepDetails': { 'Name': 'string', 'Tags': [ { 'Key': 'string', 'Value': 'string' }, ], 'SourceFileLocation': 'string' }, 'DecryptStepDetails': { 'Name': 'string', 'Type': 'PGP', 'SourceFileLocation': 'string', 'OverwriteExisting': 'TRUE'|'FALSE', 'DestinationFileLocation': { 'S3FileLocation': { 'Bucket': 'string', 'Key': 'string' }, 'EfsFileLocation': { 'FileSystemId': 'string', 'Path': 'string' } } } }, ], OnExceptionSteps=[ { 'Type': 'COPY'|'CUSTOM'|'TAG'|'DELETE'|'DECRYPT', 'CopyStepDetails': { 'Name': 'string', 'DestinationFileLocation': { 'S3FileLocation': { 'Bucket': 'string', 'Key': 'string' }, 'EfsFileLocation': { 'FileSystemId': 'string', 'Path': 'string' } }, 'OverwriteExisting': 'TRUE'|'FALSE', 'SourceFileLocation': 'string' }, 'CustomStepDetails': { 'Name': 'string', 'Target': 'string', 'TimeoutSeconds': 123, 'SourceFileLocation': 'string' }, 'DeleteStepDetails': { 'Name': 'string', 'SourceFileLocation': 'string' }, 'TagStepDetails': { 'Name': 'string', 'Tags': [ { 'Key': 'string', 'Value': 'string' }, ], 'SourceFileLocation': 'string' }, 'DecryptStepDetails': { 'Name': 'string', 'Type': 'PGP', 'SourceFileLocation': 'string', 'OverwriteExisting': 'TRUE'|'FALSE', 'DestinationFileLocation': { 'S3FileLocation': { 'Bucket': 'string', 'Key': 'string' }, 'EfsFileLocation': { 'FileSystemId': 'string', 'Path': 'string' } } } }, ], Tags=[ { 'Key': 'string', 'Value': 'string' }, ] )
string
A textual description for the workflow.
list
[REQUIRED]
Specifies the details for the steps that are in the specified workflow.
The TYPE specifies which of the following actions is being taken for this step.
COPY : Copy the file to another location.
CUSTOM : Perform a custom step with a Lambda function target.
DECRYPT : Decrypt a file that was encrypted before it was uploaded.
DELETE : Delete the file.
TAG : Add a tag to the file.
Note
Currently, copying and tagging are supported only on S3.
For file location, you specify either the S3 bucket and key, or the EFS file system ID and path.
(dict) --
The basic building block of a workflow.
Type (string) --
Currently, the following step types are supported.
COPY : Copy the file to another location.
CUSTOM : Perform a custom step with a Lambda function target.
DECRYPT : Decrypt a file that was encrypted before it was uploaded.
DELETE : Delete the file.
TAG : Add a tag to the file.
CopyStepDetails (dict) --
Details for a step that performs a file copy.
Consists of the following values:
A description
An S3 location for the destination of the file copy.
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
Name (string) --
The name of the step, used as an identifier.
DestinationFileLocation (dict) --
Specifies the location for the file being copied. Only applicable for Copy type workflow steps. Use ${Transfer:username} in this field to parametrize the destination prefix by username.
S3FileLocation (dict) --
Specifies the details for the S3 file being copied.
Bucket (string) --
Specifies the S3 bucket for the customer input file.
Key (string) --
The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
EfsFileLocation (dict) --
Reserved for future use.
FileSystemId (string) --
The identifier of the file system, assigned by Amazon EFS.
Path (string) --
The pathname for the folder being used by a workflow.
OverwriteExisting (string) --
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
CustomStepDetails (dict) --
Details for a step that invokes a Lambda function.
Consists of the Lambda function name, target, and timeout (in seconds).
Name (string) --
The name of the step, used as an identifier.
Target (string) --
The ARN for the Lambda function that is being called.
TimeoutSeconds (integer) --
Timeout, in seconds, for the step.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
DeleteStepDetails (dict) --
Details for a step that deletes the file.
Name (string) --
The name of the step, used as an identifier.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
TagStepDetails (dict) --
Details for a step that creates one or more tags.
You specify one or more tags: each tag contains a key/value pair.
Name (string) --
The name of the step, used as an identifier.
Tags (list) --
Array that contains from 1 to 10 key/value pairs.
(dict) --
Specifies the key-value pair that is assigned to a file during the execution of a Tagging step.
Key (string) -- [REQUIRED]
The name assigned to the tag that you create.
Value (string) -- [REQUIRED]
The value that corresponds to the key.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
DecryptStepDetails (dict) --
Details for a step that decrypts an encrypted file.
Name (string) --
The name of the step, used as an identifier.
Type (string) -- [REQUIRED]
The type of encryption used. Currently, only PGP encryption is supported.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
OverwriteExisting (string) --
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
DestinationFileLocation (dict) -- [REQUIRED]
Specifies the location for the file being decrypted.
S3FileLocation (dict) --
Specifies the details for the S3 file being copied.
Bucket (string) --
Specifies the S3 bucket for the customer input file.
Key (string) --
The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
EfsFileLocation (dict) --
Reserved for future use.
FileSystemId (string) --
The identifier of the file system, assigned by Amazon EFS.
Path (string) --
The pathname for the folder being used by a workflow.
list
Specifies the steps (actions) to take if errors are encountered during execution of the workflow.
Note
For custom steps, the Lambda function needs to send FAILURE to the callback API to kick off the exception steps. Additionally, if the Lambda does not send SUCCESS before it times out, the exception steps are executed.
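The callback described in the note above is the SendWorkflowStepState operation. A minimal sketch of a custom-step Lambda handler follows; the event field names (token, serviceMetadata.executionDetails) are assumptions based on the documented custom-step invocation payload, so verify them against your actual event.

```python
def execution_ids(event):
    # Pull the workflow and execution identifiers out of the custom-step
    # invocation payload (field names assumed; verify against your event).
    details = event['serviceMetadata']['executionDetails']
    return details['workflowId'], details['executionId']

def lambda_handler(event, context):
    # Report SUCCESS back to Transfer Family; send 'FAILURE' instead to
    # trigger the workflow's exception steps. Requires AWS credentials,
    # so the call is only made when this handler actually runs in Lambda.
    import boto3
    workflow_id, execution_id = execution_ids(event)
    boto3.client('transfer').send_workflow_step_state(
        WorkflowId=workflow_id,
        ExecutionId=execution_id,
        Token=event['token'],
        Status='SUCCESS',
    )
```

If the handler raises, or never calls SendWorkflowStepState before TimeoutSeconds elapses, the step times out and the exception steps run.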
(dict) --
The basic building block of a workflow.
Type (string) --
Currently, the following step types are supported.
COPY : Copy the file to another location.
CUSTOM : Perform a custom step with a Lambda function target.
DECRYPT : Decrypt a file that was encrypted before it was uploaded.
DELETE : Delete the file.
TAG : Add a tag to the file.
CopyStepDetails (dict) --
Details for a step that performs a file copy.
Consists of the following values:
A description
An S3 location for the destination of the file copy.
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
Name (string) --
The name of the step, used as an identifier.
DestinationFileLocation (dict) --
Specifies the location for the file being copied. Only applicable for Copy type workflow steps. Use ${Transfer:username} in this field to parametrize the destination prefix by username.
S3FileLocation (dict) --
Specifies the details for the S3 file being copied.
Bucket (string) --
Specifies the S3 bucket for the customer input file.
Key (string) --
The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
EfsFileLocation (dict) --
Reserved for future use.
FileSystemId (string) --
The identifier of the file system, assigned by Amazon EFS.
Path (string) --
The pathname for the folder being used by a workflow.
OverwriteExisting (string) --
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
CustomStepDetails (dict) --
Details for a step that invokes a Lambda function.
Consists of the Lambda function name, target, and timeout (in seconds).
Name (string) --
The name of the step, used as an identifier.
Target (string) --
The ARN for the Lambda function that is being called.
TimeoutSeconds (integer) --
Timeout, in seconds, for the step.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
DeleteStepDetails (dict) --
Details for a step that deletes the file.
Name (string) --
The name of the step, used as an identifier.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
TagStepDetails (dict) --
Details for a step that creates one or more tags.
You specify one or more tags: each tag contains a key/value pair.
Name (string) --
The name of the step, used as an identifier.
Tags (list) --
Array that contains from 1 to 10 key/value pairs.
(dict) --
Specifies the key-value pair that is assigned to a file during the execution of a Tagging step.
Key (string) -- [REQUIRED]
The name assigned to the tag that you create.
Value (string) -- [REQUIRED]
The value that corresponds to the key.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
DecryptStepDetails (dict) --
Details for a step that decrypts an encrypted file.
Name (string) --
The name of the step, used as an identifier.
Type (string) -- [REQUIRED]
The type of encryption used. Currently, only PGP encryption is supported.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
OverwriteExisting (string) --
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
DestinationFileLocation (dict) -- [REQUIRED]
Specifies the location for the file being decrypted.
S3FileLocation (dict) --
Specifies the details for the S3 file being copied.
Bucket (string) --
Specifies the S3 bucket for the customer input file.
Key (string) --
The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
EfsFileLocation (dict) --
Reserved for future use.
FileSystemId (string) --
The identifier of the file system, assigned by Amazon EFS.
Path (string) --
The pathname for the folder being used by a workflow.
list
Key-value pairs that can be used to group and search for workflows. Tags are metadata attached to workflows for any purpose.
(dict) --
Creates a key-value pair for a specific resource. Tags are metadata that you can use to search for and group a resource for various purposes. You can apply tags to servers, users, and roles. A tag key can take more than one value. For example, to group servers for accounting purposes, you might create a tag called Group and assign the values Research and Accounting to that group.
Key (string) -- [REQUIRED]
The name assigned to the tag that you create.
Value (string) -- [REQUIRED]
Contains one or more values that you assigned to the key name you create.
dict
Response Syntax
{ 'WorkflowId': 'string' }
Response Structure
(dict) --
WorkflowId (string) --
A unique identifier for the workflow.
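As a sketch of the new step type, a DECRYPT step can be passed to create_workflow like any other step. The bucket, prefix, and step name below are hypothetical; the API call requires AWS credentials, so it is kept inside a function that is not invoked here.

```python
def build_decrypt_step(name, bucket, prefix):
    # Assemble a DECRYPT workflow step dict matching the request syntax.
    # PGP is currently the only supported encryption type.
    return {
        'Type': 'DECRYPT',
        'DecryptStepDetails': {
            'Name': name,
            'Type': 'PGP',
            'SourceFileLocation': '${original.file}',
            'OverwriteExisting': 'FALSE',
            'DestinationFileLocation': {
                'S3FileLocation': {'Bucket': bucket, 'Key': prefix},
            },
        },
    }

def create_decrypt_workflow(bucket):
    # Requires AWS credentials and transfer:CreateWorkflow permission.
    import boto3
    client = boto3.client('transfer')
    response = client.create_workflow(
        Description='Decrypt PGP-encrypted uploads',
        Steps=[build_decrypt_step('decrypt-upload', bucket, 'decrypted/')],
    )
    return response['WorkflowId']
```

The returned WorkflowId is what you then reference in the workflow-details field of CreateServer or UpdateServer.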
{'Execution': {'Results': {'OnExceptionSteps': {'StepType': {'DECRYPT'}}, 'Steps': {'StepType': {'DECRYPT'}}}}}
You can use DescribeExecution to check the details of the execution of the specified workflow.
See also: AWS API Documentation
Request Syntax
client.describe_execution( ExecutionId='string', WorkflowId='string' )
string
[REQUIRED]
A unique identifier for the execution of a workflow.
string
[REQUIRED]
A unique identifier for the workflow.
dict
Response Syntax
{ 'WorkflowId': 'string', 'Execution': { 'ExecutionId': 'string', 'InitialFileLocation': { 'S3FileLocation': { 'Bucket': 'string', 'Key': 'string', 'VersionId': 'string', 'Etag': 'string' }, 'EfsFileLocation': { 'FileSystemId': 'string', 'Path': 'string' } }, 'ServiceMetadata': { 'UserDetails': { 'UserName': 'string', 'ServerId': 'string', 'SessionId': 'string' } }, 'ExecutionRole': 'string', 'LoggingConfiguration': { 'LoggingRole': 'string', 'LogGroupName': 'string' }, 'PosixProfile': { 'Uid': 123, 'Gid': 123, 'SecondaryGids': [ 123, ] }, 'Status': 'IN_PROGRESS'|'COMPLETED'|'EXCEPTION'|'HANDLING_EXCEPTION', 'Results': { 'Steps': [ { 'StepType': 'COPY'|'CUSTOM'|'TAG'|'DELETE'|'DECRYPT', 'Outputs': 'string', 'Error': { 'Type': 'PERMISSION_DENIED'|'CUSTOM_STEP_FAILED'|'THROTTLED'|'ALREADY_EXISTS'|'NOT_FOUND'|'BAD_REQUEST'|'TIMEOUT'|'INTERNAL_SERVER_ERROR', 'Message': 'string' } }, ], 'OnExceptionSteps': [ { 'StepType': 'COPY'|'CUSTOM'|'TAG'|'DELETE'|'DECRYPT', 'Outputs': 'string', 'Error': { 'Type': 'PERMISSION_DENIED'|'CUSTOM_STEP_FAILED'|'THROTTLED'|'ALREADY_EXISTS'|'NOT_FOUND'|'BAD_REQUEST'|'TIMEOUT'|'INTERNAL_SERVER_ERROR', 'Message': 'string' } }, ] } } }
Response Structure
(dict) --
WorkflowId (string) --
A unique identifier for the workflow.
Execution (dict) --
The structure that contains the details of the workflow's execution.
ExecutionId (string) --
A unique identifier for the execution of a workflow.
InitialFileLocation (dict) --
A structure that describes the Amazon S3 or EFS file location. This is the file location when the execution begins: if the file is being copied, this is the initial (as opposed to destination) file location.
S3FileLocation (dict) --
Specifies the S3 details for the file being used, such as bucket, ETag, and so forth.
Bucket (string) --
Specifies the S3 bucket that contains the file being used.
Key (string) --
The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
VersionId (string) --
Specifies the file version.
Etag (string) --
The entity tag is a hash of the object. The ETag reflects changes only to the contents of an object, not its metadata.
EfsFileLocation (dict) --
Specifies the Amazon EFS identifier and the path for the file being used.
FileSystemId (string) --
The identifier of the file system, assigned by Amazon EFS.
Path (string) --
The pathname for the folder being used by a workflow.
ServiceMetadata (dict) --
A container object for the session details that are associated with a workflow.
UserDetails (dict) --
The Server ID (ServerId ), Session ID (SessionId ) and user (UserName ) make up the UserDetails .
UserName (string) --
A unique string that identifies a user account associated with a server.
ServerId (string) --
The system-assigned unique identifier for a Transfer server instance.
SessionId (string) --
The system-assigned unique identifier for a session that corresponds to the workflow.
ExecutionRole (string) --
The IAM role associated with the execution.
LoggingConfiguration (dict) --
The IAM logging role associated with the execution.
LoggingRole (string) --
The Amazon Resource Name (ARN) of the Identity and Access Management (IAM) role that allows a server to turn on Amazon CloudWatch logging for Amazon S3 or Amazon EFS events. When set, you can view user activity in your CloudWatch logs.
LogGroupName (string) --
The name of the CloudWatch logging group for the Transfer Family server to which this workflow belongs.
PosixProfile (dict) --
The full POSIX identity, including user ID (Uid ), group ID (Gid ), and any secondary groups IDs (SecondaryGids ), that controls your users' access to your Amazon EFS file systems. The POSIX permissions that are set on files and directories in your file system determine the level of access your users get when transferring files into and out of your Amazon EFS file systems.
Uid (integer) --
The POSIX user ID used for all EFS operations by this user.
Gid (integer) --
The POSIX group ID used for all EFS operations by this user.
SecondaryGids (list) --
The secondary POSIX group IDs used for all EFS operations by this user.
(integer) --
Status (string) --
The status of the execution. Can be in progress, completed, exception encountered, or handling the exception.
Results (dict) --
A structure that describes the execution results. This includes a list of the steps along with the details of each step, error type and message (if any), and the OnExceptionSteps structure.
Steps (list) --
Specifies the details for the steps that are in the specified workflow.
(dict) --
Specifies the following details for the step: error (if any), outputs (if any), and the step type.
StepType (string) --
One of the available step types.
COPY : Copy the file to another location.
CUSTOM : Perform a custom step with a Lambda function target.
DECRYPT : Decrypt a file that was encrypted before it was uploaded.
DELETE : Delete the file.
TAG : Add a tag to the file.
Outputs (string) --
The values for the key/value pair applied as a tag to the file. Only applicable if the step type is TAG .
Error (dict) --
Specifies the details for an error, if it occurred during execution of the specified workflow step.
Type (string) --
Specifies the error type.
ALREADY_EXISTS : occurs for a copy step, if the overwrite option is not selected and a file with the same name already exists in the target location.
BAD_REQUEST : a general bad request: for example, a step that attempts to tag an EFS file returns BAD_REQUEST , as only S3 files can be tagged.
CUSTOM_STEP_FAILED : occurs when the custom step provided a callback that indicates failure.
INTERNAL_SERVER_ERROR : a catch-all error that can occur for a variety of reasons.
NOT_FOUND : occurs when a requested entity, for example a source file for a copy step, does not exist.
PERMISSION_DENIED : occurs if your policy does not contain the correct permissions to complete one or more of the steps in the workflow.
TIMEOUT : occurs when the execution times out.
Note
You can set the TimeoutSeconds for a custom step, anywhere from 1 second to 1800 seconds (30 minutes).
THROTTLED : occurs if you exceed the new execution refill rate of one workflow per second.
Message (string) --
Specifies the descriptive message that corresponds to the ErrorType .
OnExceptionSteps (list) --
Specifies the steps (actions) to take if errors are encountered during execution of the workflow.
(dict) --
Specifies the following details for the step: error (if any), outputs (if any), and the step type.
StepType (string) --
One of the available step types.
COPY : Copy the file to another location.
CUSTOM : Perform a custom step with a Lambda function target.
DECRYPT : Decrypt a file that was encrypted before it was uploaded.
DELETE : Delete the file.
TAG : Add a tag to the file.
Outputs (string) --
The values for the key/value pair applied as a tag to the file. Only applicable if the step type is TAG .
Error (dict) --
Specifies the details for an error, if it occurred during execution of the specified workflow step.
Type (string) --
Specifies the error type.
ALREADY_EXISTS : occurs for a copy step, if the overwrite option is not selected and a file with the same name already exists in the target location.
BAD_REQUEST : a general bad request: for example, a step that attempts to tag an EFS file returns BAD_REQUEST , as only S3 files can be tagged.
CUSTOM_STEP_FAILED : occurs when the custom step provided a callback that indicates failure.
INTERNAL_SERVER_ERROR : a catch-all error that can occur for a variety of reasons.
NOT_FOUND : occurs when a requested entity, for example a source file for a copy step, does not exist.
PERMISSION_DENIED : occurs if your policy does not contain the correct permissions to complete one or more of the steps in the workflow.
TIMEOUT : occurs when the execution times out.
Note
You can set the TimeoutSeconds for a custom step, anywhere from 1 second to 1800 seconds (30 minutes).
THROTTLED : occurs if you exceed the new execution refill rate of one workflow per second.
Message (string) --
Specifies the descriptive message that corresponds to the ErrorType .
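Putting the response structure above to use, one way to surface errors from an execution is to scan both Steps and OnExceptionSteps for Error entries. This is a sketch under the documented response shape; the helper names are hypothetical, and the API call needs AWS credentials so it is kept inside an uncalled function.

```python
def failed_steps(results):
    # Collect (StepType, ErrorType) pairs for every step or exception
    # step in an execution's Results structure that reported an error.
    failed = []
    for step in results.get('Steps', []) + results.get('OnExceptionSteps', []):
        if 'Error' in step:
            failed.append((step['StepType'], step['Error']['Type']))
    return failed

def check_execution(workflow_id, execution_id):
    # Requires AWS credentials and transfer:DescribeExecution permission.
    import boto3
    client = boto3.client('transfer')
    resp = client.describe_execution(
        WorkflowId=workflow_id, ExecutionId=execution_id,
    )
    execution = resp['Execution']
    return execution['Status'], failed_steps(execution.get('Results', {}))
```

A Status of EXCEPTION or HANDLING_EXCEPTION paired with the collected error types tells you which step failed and why (for example, a COPY step with ALREADY_EXISTS when the overwrite flag is FALSE).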
{'Workflow': {'OnExceptionSteps': {'DecryptStepDetails': {'DestinationFileLocation': {'EfsFileLocation': {'FileSystemId': 'string', 'Path': 'string'}, 'S3FileLocation': {'Bucket': 'string', 'Key': 'string'}}, 'Name': 'string', 'OverwriteExisting': 'TRUE | FALSE', 'SourceFileLocation': 'string', 'Type': 'PGP'}, 'Type': {'DECRYPT'}}, 'Steps': {'DecryptStepDetails': {'DestinationFileLocation': {'EfsFileLocation': {'FileSystemId': 'string', 'Path': 'string'}, 'S3FileLocation': {'Bucket': 'string', 'Key': 'string'}}, 'Name': 'string', 'OverwriteExisting': 'TRUE | FALSE', 'SourceFileLocation': 'string', 'Type': 'PGP'}, 'Type': {'DECRYPT'}}}}
Describes the specified workflow.
See also: AWS API Documentation
Request Syntax
client.describe_workflow( WorkflowId='string' )
string
[REQUIRED]
A unique identifier for the workflow.
dict
Response Syntax
{ 'Workflow': { 'Arn': 'string', 'Description': 'string', 'Steps': [ { 'Type': 'COPY'|'CUSTOM'|'TAG'|'DELETE'|'DECRYPT', 'CopyStepDetails': { 'Name': 'string', 'DestinationFileLocation': { 'S3FileLocation': { 'Bucket': 'string', 'Key': 'string' }, 'EfsFileLocation': { 'FileSystemId': 'string', 'Path': 'string' } }, 'OverwriteExisting': 'TRUE'|'FALSE', 'SourceFileLocation': 'string' }, 'CustomStepDetails': { 'Name': 'string', 'Target': 'string', 'TimeoutSeconds': 123, 'SourceFileLocation': 'string' }, 'DeleteStepDetails': { 'Name': 'string', 'SourceFileLocation': 'string' }, 'TagStepDetails': { 'Name': 'string', 'Tags': [ { 'Key': 'string', 'Value': 'string' }, ], 'SourceFileLocation': 'string' }, 'DecryptStepDetails': { 'Name': 'string', 'Type': 'PGP', 'SourceFileLocation': 'string', 'OverwriteExisting': 'TRUE'|'FALSE', 'DestinationFileLocation': { 'S3FileLocation': { 'Bucket': 'string', 'Key': 'string' }, 'EfsFileLocation': { 'FileSystemId': 'string', 'Path': 'string' } } } }, ], 'OnExceptionSteps': [ { 'Type': 'COPY'|'CUSTOM'|'TAG'|'DELETE'|'DECRYPT', 'CopyStepDetails': { 'Name': 'string', 'DestinationFileLocation': { 'S3FileLocation': { 'Bucket': 'string', 'Key': 'string' }, 'EfsFileLocation': { 'FileSystemId': 'string', 'Path': 'string' } }, 'OverwriteExisting': 'TRUE'|'FALSE', 'SourceFileLocation': 'string' }, 'CustomStepDetails': { 'Name': 'string', 'Target': 'string', 'TimeoutSeconds': 123, 'SourceFileLocation': 'string' }, 'DeleteStepDetails': { 'Name': 'string', 'SourceFileLocation': 'string' }, 'TagStepDetails': { 'Name': 'string', 'Tags': [ { 'Key': 'string', 'Value': 'string' }, ], 'SourceFileLocation': 'string' }, 'DecryptStepDetails': { 'Name': 'string', 'Type': 'PGP', 'SourceFileLocation': 'string', 'OverwriteExisting': 'TRUE'|'FALSE', 'DestinationFileLocation': { 'S3FileLocation': { 'Bucket': 'string', 'Key': 'string' }, 'EfsFileLocation': { 'FileSystemId': 'string', 'Path': 'string' } } } }, ], 'WorkflowId': 'string', 'Tags': [ { 'Key': 'string', 'Value': 'string' }, ] } }
Response Structure
(dict) --
Workflow (dict) --
The structure that contains the details of the workflow.
Arn (string) --
Specifies the unique Amazon Resource Name (ARN) for the workflow.
Description (string) --
Specifies the text description for the workflow.
Steps (list) --
Specifies the details for the steps that are in the specified workflow.
(dict) --
The basic building block of a workflow.
Type (string) --
Currently, the following step types are supported.
COPY : Copy the file to another location.
CUSTOM : Perform a custom step with a Lambda function target.
DECRYPT : Decrypt a file that was encrypted before it was uploaded.
DELETE : Delete the file.
TAG : Add a tag to the file.
CopyStepDetails (dict) --
Details for a step that performs a file copy.
Consists of the following values:
A description
An S3 location for the destination of the file copy.
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
Name (string) --
The name of the step, used as an identifier.
DestinationFileLocation (dict) --
Specifies the location for the file being copied. Only applicable for Copy type workflow steps. Use ${Transfer:username} in this field to parametrize the destination prefix by username.
S3FileLocation (dict) --
Specifies the details for the S3 file being copied.
Bucket (string) --
Specifies the S3 bucket for the customer input file.
Key (string) --
The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
EfsFileLocation (dict) --
Reserved for future use.
FileSystemId (string) --
The identifier of the file system, assigned by Amazon EFS.
Path (string) --
The pathname for the folder being used by a workflow.
OverwriteExisting (string) --
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
CustomStepDetails (dict) --
Details for a step that invokes a Lambda function.
Consists of the Lambda function name, target, and timeout (in seconds).
Name (string) --
The name of the step, used as an identifier.
Target (string) --
The ARN for the Lambda function that is being called.
TimeoutSeconds (integer) --
Timeout, in seconds, for the step.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
DeleteStepDetails (dict) --
Details for a step that deletes the file.
Name (string) --
The name of the step, used as an identifier.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
TagStepDetails (dict) --
Details for a step that creates one or more tags.
You specify one or more tags: each tag contains a key/value pair.
Name (string) --
The name of the step, used as an identifier.
Tags (list) --
Array that contains from 1 to 10 key/value pairs.
(dict) --
Specifies the key-value pair that is assigned to a file during the execution of a Tagging step.
Key (string) --
The name assigned to the tag that you create.
Value (string) --
The value that corresponds to the key.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
DecryptStepDetails (dict) --
Details for a step that decrypts an encrypted file.
Name (string) --
The name of the step, used as an identifier.
Type (string) --
The type of encryption used. Currently, only PGP encryption is supported.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
OverwriteExisting (string) --
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
DestinationFileLocation (dict) --
Specifies the location for the file being decrypted.
S3FileLocation (dict) --
Specifies the details for the S3 file being copied.
Bucket (string) --
Specifies the S3 bucket for the customer input file.
Key (string) --
The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
EfsFileLocation (dict) --
Reserved for future use.
FileSystemId (string) --
The identifier of the file system, assigned by Amazon EFS.
Path (string) --
The pathname for the folder being used by a workflow.
OnExceptionSteps (list) --
Specifies the steps (actions) to take if errors are encountered during execution of the workflow.
(dict) --
The basic building block of a workflow.
Type (string) --
Currently, the following step types are supported.
COPY : Copy the file to another location.
CUSTOM : Perform a custom step with a Lambda function target.
DECRYPT : Decrypt a file that was encrypted before it was uploaded.
DELETE : Delete the file.
TAG : Add a tag to the file.
CopyStepDetails (dict) --
Details for a step that performs a file copy.
Consists of the following values:
A description
An S3 location for the destination of the file copy.
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
Name (string) --
The name of the step, used as an identifier.
DestinationFileLocation (dict) --
Specifies the location for the file being copied. Only applicable for Copy type workflow steps. Use ${Transfer:username} in this field to parametrize the destination prefix by username.
S3FileLocation (dict) --
Specifies the details for the S3 file being copied.
Bucket (string) --
Specifies the S3 bucket for the customer input file.
Key (string) --
The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
EfsFileLocation (dict) --
Reserved for future use.
FileSystemId (string) --
The identifier of the file system, assigned by Amazon EFS.
Path (string) --
The pathname for the folder being used by a workflow.
OverwriteExisting (string) --
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
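A COPY step entry can be sketched as follows; the bucket is a placeholder, and the ${Transfer:username} variable parametrizes the destination prefix per user, as described above:

```python
# Sketch of a COPY step entry; the bucket name is a placeholder.
# ${Transfer:username} parametrizes the destination prefix by username.
copy_step = {
    'Type': 'COPY',
    'CopyStepDetails': {
        'Name': 'copy-to-archive',
        'DestinationFileLocation': {
            'S3FileLocation': {
                'Bucket': 'example-archive-bucket',
                'Key': 'archive/${Transfer:username}/',
            },
        },
        'OverwriteExisting': 'FALSE',          # default: do not overwrite
        'SourceFileLocation': '${previous.file}',
    },
}
```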
CustomStepDetails (dict) --
Details for a step that invokes a Lambda function.
Consists of the Lambda function name, target, and timeout (in seconds).
Name (string) --
The name of the step, used as an identifier.
Target (string) --
The ARN for the Lambda function that is being called.
TimeoutSeconds (integer) --
Timeout, in seconds, for the step.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
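A CUSTOM step entry can be sketched as below; the Lambda ARN is an illustrative placeholder:

```python
# Sketch of a CUSTOM step entry; the Lambda ARN is a placeholder.
custom_step = {
    'Type': 'CUSTOM',
    'CustomStepDetails': {
        'Name': 'scan-file',
        'Target': 'arn:aws:lambda:us-east-1:123456789012:function:example-scan',
        'TimeoutSeconds': 60,                  # how long the step may run
        'SourceFileLocation': '${previous.file}',
    },
}
```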
DeleteStepDetails (dict) --
Details for a step that deletes the file.
Name (string) --
The name of the step, used as an identifier.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
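A DELETE step entry is the simplest of the step types; for example, one that removes the originally uploaded file after earlier steps have produced their outputs:

```python
# Sketch of a DELETE step entry that removes the original upload.
delete_step = {
    'Type': 'DELETE',
    'DeleteStepDetails': {
        'Name': 'remove-original',
        'SourceFileLocation': '${original.file}',  # delete the uploaded file itself
    },
}
```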
TagStepDetails (dict) --
Details for a step that creates one or more tags.
You specify one or more tags: each tag contains a key/value pair.
Name (string) --
The name of the step, used as an identifier.
Tags (list) --
Array that contains from 1 to 10 key/value pairs.
(dict) --
Specifies the key-value pair that is assigned to a file during the execution of a Tagging step.
Key (string) --
The name assigned to the tag that you create.
Value (string) --
The value that corresponds to the key.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
DecryptStepDetails (dict) --
Details for a step that decrypts an encrypted file.
Name (string) --
The name of the step, used as an identifier.
Type (string) --
The type of encryption used. Currently, PGP is the only supported value.
SourceFileLocation (string) --
Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
Enter ${previous.file} to use the previous file as the input. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
Enter ${original.file} to use the originally-uploaded file location as input for this step.
OverwriteExisting (string) --
A flag that indicates whether or not to overwrite an existing file of the same name. The default is FALSE .
DestinationFileLocation (dict) --
Specifies the location for the file being decrypted. Only applicable for Decrypt type workflow steps.
S3FileLocation (dict) --
Specifies the details for the S3 file being copied.
Bucket (string) --
Specifies the S3 bucket for the customer input file.
Key (string) --
The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object.
EfsFileLocation (dict) --
Reserved for future use.
FileSystemId (string) --
The identifier of the file system, assigned by Amazon EFS.
Path (string) --
The pathname for the folder being used by a workflow.
WorkflowId (string) --
A unique identifier for the workflow.
Tags (list) --
Key-value pairs that can be used to group and search for workflows. Tags are metadata attached to workflows for any purpose.
(dict) --
Creates a key-value pair for a specific resource. Tags are metadata that you can use to search for and group a resource for various purposes. You can apply tags to servers, users, and roles. A tag key can take more than one value. For example, to group servers for accounting purposes, you might create a tag called Group and assign the values Research and Accounting to that group.
Key (string) --
The name assigned to the tag that you create.
Value (string) --
Contains one or more values that you assigned to the key name you create.
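Putting the pieces together, a full create_workflow call can be sketched as below. All names, buckets, and tag values are illustrative placeholders, and the boto3 call itself is commented out because it requires valid AWS credentials and a configured Transfer Family environment:

```python
# Sketch of the keyword arguments for a full create_workflow call.
# All names, buckets, and tag values are illustrative placeholders.
workflow_kwargs = {
    'Description': 'Decrypt uploads, then tag them',
    'Steps': [
        {
            'Type': 'DECRYPT',
            'DecryptStepDetails': {
                'Name': 'decrypt-upload',
                'Type': 'PGP',
                'SourceFileLocation': '${original.file}',
                'OverwriteExisting': 'FALSE',
                'DestinationFileLocation': {
                    'S3FileLocation': {'Bucket': 'example-bucket', 'Key': 'decrypted/'},
                },
            },
        },
        {
            'Type': 'TAG',
            'TagStepDetails': {
                'Name': 'tag-status',
                'Tags': [{'Key': 'Status', 'Value': 'Decrypted'}],
                'SourceFileLocation': '${previous.file}',
            },
        },
    ],
    'OnExceptionSteps': [
        {
            'Type': 'DELETE',
            'DeleteStepDetails': {
                'Name': 'cleanup',
                'SourceFileLocation': '${original.file}',
            },
        },
    ],
    'Tags': [{'Key': 'Group', 'Value': 'Research'}],
}

# Requires AWS credentials; uncomment to create the workflow:
# import boto3
# client = boto3.client('transfer')
# response = client.create_workflow(**workflow_kwargs)
# workflow_id = response['WorkflowId']
```

The returned WorkflowId can then be passed in the WorkflowDetails field of CreateServer or UpdateServer to associate the workflow with a transfer server.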