MWAA **** Client ====== class MWAA.Client A low-level client representing AmazonMWAA This section contains the Amazon Managed Workflows for Apache Airflow (MWAA) API reference documentation. For more information, see What is Amazon MWAA?. **Endpoints** * "api.airflow.{region}.amazonaws.com" - This endpoint is used for environment management. * CreateEnvironment * DeleteEnvironment * GetEnvironment * ListEnvironments * ListTagsForResource * TagResource * UntagResource * UpdateEnvironment * "env.airflow.{region}.amazonaws.com" - This endpoint is used to operate the Airflow environment. * CreateCliToken * CreateWebLoginToken * InvokeRestApi **Regions** For a list of supported regions, see Amazon MWAA endpoints and quotas in the *Amazon Web Services General Reference*. import boto3 client = boto3.client('mwaa') These are the available methods: * can_paginate * close * create_cli_token * create_environment * create_web_login_token * delete_environment * get_environment * get_paginator * get_waiter * invoke_rest_api * list_environments * list_tags_for_resource * publish_metrics * tag_resource * untag_resource * update_environment Paginators ========== Paginators are available on a client instance via the "get_paginator" method. For more detailed instructions and examples on the usage of paginators, see the paginators user guide. The available paginators are: * ListEnvironments MWAA / Paginator / ListEnvironments ListEnvironments **************** class MWAA.Paginator.ListEnvironments paginator = client.get_paginator('list_environments') paginate(**kwargs) Creates an iterator that will paginate through responses from "MWAA.Client.list_environments()". See also: AWS API Documentation **Request Syntax** response_iterator = paginator.paginate( PaginationConfig={ 'MaxItems': 123, 'PageSize': 123, 'StartingToken': 'string' } ) Parameters: **PaginationConfig** (*dict*) -- A dictionary that provides parameters to control pagination. 
* **MaxItems** *(integer) --* The total number of items to return. If the total number of items available is more than the value specified in max-items then a "NextToken" will be provided in the output that you can use to resume pagination. * **PageSize** *(integer) --* The size of each page. * **StartingToken** *(string) --* A token to specify where to start paginating. This is the "NextToken" from a previous response. Return type: dict Returns: **Response Syntax** { 'Environments': [ 'string', ], } **Response Structure** * *(dict) --* * **Environments** *(list) --* Returns a list of Amazon MWAA environments. * *(string) --* MWAA / Client / get_environment get_environment *************** MWAA.Client.get_environment(**kwargs) Describes an Amazon Managed Workflows for Apache Airflow (MWAA) environment. See also: AWS API Documentation **Request Syntax** response = client.get_environment( Name='string' ) Parameters: **Name** (*string*) -- **[REQUIRED]** The name of the Amazon MWAA environment. For example, "MyMWAAEnvironment". 
Return type: dict Returns: **Response Syntax** { 'Environment': { 'Name': 'string', 'Status': 'CREATING'|'CREATE_FAILED'|'AVAILABLE'|'UPDATING'|'DELETING'|'DELETED'|'UNAVAILABLE'|'UPDATE_FAILED'|'ROLLING_BACK'|'CREATING_SNAPSHOT'|'PENDING'|'MAINTENANCE', 'Arn': 'string', 'CreatedAt': datetime(2015, 1, 1), 'WebserverUrl': 'string', 'ExecutionRoleArn': 'string', 'ServiceRoleArn': 'string', 'KmsKey': 'string', 'AirflowVersion': 'string', 'SourceBucketArn': 'string', 'DagS3Path': 'string', 'PluginsS3Path': 'string', 'PluginsS3ObjectVersion': 'string', 'RequirementsS3Path': 'string', 'RequirementsS3ObjectVersion': 'string', 'StartupScriptS3Path': 'string', 'StartupScriptS3ObjectVersion': 'string', 'AirflowConfigurationOptions': { 'string': 'string' }, 'EnvironmentClass': 'string', 'MaxWorkers': 123, 'NetworkConfiguration': { 'SubnetIds': [ 'string', ], 'SecurityGroupIds': [ 'string', ] }, 'LoggingConfiguration': { 'DagProcessingLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG', 'CloudWatchLogGroupArn': 'string' }, 'SchedulerLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG', 'CloudWatchLogGroupArn': 'string' }, 'WebserverLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG', 'CloudWatchLogGroupArn': 'string' }, 'WorkerLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG', 'CloudWatchLogGroupArn': 'string' }, 'TaskLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG', 'CloudWatchLogGroupArn': 'string' } }, 'LastUpdate': { 'Status': 'SUCCESS'|'PENDING'|'FAILED', 'CreatedAt': datetime(2015, 1, 1), 'Error': { 'ErrorCode': 'string', 'ErrorMessage': 'string' }, 'Source': 'string', 'WorkerReplacementStrategy': 'FORCED'|'GRACEFUL' }, 'WeeklyMaintenanceWindowStart': 'string', 'Tags': { 'string': 'string' }, 'WebserverAccessMode': 'PRIVATE_ONLY'|'PUBLIC_ONLY', 'MinWorkers': 123, 'Schedulers': 123, 
'WebserverVpcEndpointService': 'string', 'DatabaseVpcEndpointService': 'string', 'CeleryExecutorQueue': 'string', 'EndpointManagement': 'CUSTOMER'|'SERVICE', 'MinWebservers': 123, 'MaxWebservers': 123 } } **Response Structure** * *(dict) --* * **Environment** *(dict) --* An object containing all available details about the environment. * **Name** *(string) --* The name of the Amazon MWAA environment. For example, "MyMWAAEnvironment". * **Status** *(string) --* The status of the Amazon MWAA environment. Valid values: * "CREATING" - Indicates the request to create the environment is in progress. * "CREATING_SNAPSHOT" - Indicates the request to update environment details, or upgrade the environment version, is in progress and Amazon MWAA is creating a storage volume snapshot of the Amazon RDS database cluster associated with the environment. A database snapshot is a backup created at a specific point in time. Amazon MWAA uses snapshots to recover environment metadata if the process to update or upgrade an environment fails. * "CREATE_FAILED" - Indicates the request to create the environment failed, and the environment could not be created. * "AVAILABLE" - Indicates the request was successful and the environment is ready to use. * "PENDING" - Indicates the request was successful, but the process to create the environment is paused until you create the required VPC endpoints in your VPC. After you create the VPC endpoints, the process resumes. * "UPDATING" - Indicates the request to update the environment is in progress. * "ROLLING_BACK" - Indicates the request to update environment details, or upgrade the environment version, failed and Amazon MWAA is restoring the environment using the latest storage volume snapshot. * "DELETING" - Indicates the request to delete the environment is in progress. * "DELETED" - Indicates the request to delete the environment is complete, and the environment has been deleted. 
* "UNAVAILABLE" - Indicates the request failed, but the environment did not return to its previous state and is not stable. * "UPDATE_FAILED" - Indicates the request to update the environment failed, and the environment was restored to its previous state successfully and is ready to use. * "MAINTENANCE" - Indicates that the environment is undergoing maintenance. Depending on the type of work Amazon MWAA is performing, your environment might become unavailable during this process. After all operations are done, your environment will return to its status prior to maintenance operations. We recommend reviewing our troubleshooting guide for a list of common errors and their solutions. For more information, see Amazon MWAA troubleshooting. * **Arn** *(string) --* The Amazon Resource Name (ARN) of the Amazon MWAA environment. * **CreatedAt** *(datetime) --* The day and time the environment was created. * **WebserverUrl** *(string) --* The Apache Airflow *web server* host name for the Amazon MWAA environment. For more information, see Accessing the Apache Airflow UI. * **ExecutionRoleArn** *(string) --* The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access Amazon Web Services resources in your environment. For example, "arn:aws:iam::123456789:role/my-execution-role". For more information, see Amazon MWAA Execution role. * **ServiceRoleArn** *(string) --* The Amazon Resource Name (ARN) for the service-linked role of the environment. For more information, see Amazon MWAA Service-linked role. * **KmsKey** *(string) --* The KMS encryption key used to encrypt the data in your environment. * **AirflowVersion** *(string) --* The Apache Airflow version on your environment. Valid values: "1.10.12", "2.0.2", "2.2.2", "2.4.3", "2.5.1", "2.6.3", "2.7.2", "2.8.1", "2.9.2", "2.10.1", and "2.10.3". * **SourceBucketArn** *(string) --* The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. 
For example, "arn:aws:s3:::my-airflow-bucket-unique-name". For more information, see Create an Amazon S3 bucket for Amazon MWAA. * **DagS3Path** *(string) --* The relative path to the DAGs folder in your Amazon S3 bucket. For example, "s3://mwaa-environment/dags". For more information, see Adding or updating DAGs. * **PluginsS3Path** *(string) --* The relative path to the file in your Amazon S3 bucket. For example, "s3://mwaa-environment/plugins.zip". For more information, see Installing custom plugins. * **PluginsS3ObjectVersion** *(string) --* The version of the "plugins.zip" file in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: "3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo" For more information, see Installing custom plugins. * **RequirementsS3Path** *(string) --* The relative path to the "requirements.txt" file in your Amazon S3 bucket. For example, "s3://mwaa-environment/requirements.txt". For more information, see Installing Python dependencies. * **RequirementsS3ObjectVersion** *(string) --* The version of the "requirements.txt" file on your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: "3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo" For more information, see Installing Python dependencies. * **StartupScriptS3Path** *(string) --* The relative path to the startup shell script in your Amazon S3 bucket. For example, "s3://mwaa-environment/startup.sh". Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. 
You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script. * **StartupScriptS3ObjectVersion** *(string) --* The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: "3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo" For more information, see Using a startup script. * **AirflowConfigurationOptions** *(dict) --* A list of key-value pairs containing the Apache Airflow configuration options attached to your environment. For more information, see Apache Airflow configuration options. * *(string) --* * *(string) --* * **EnvironmentClass** *(string) --* The environment class type. Valid values: "mw1.micro", "mw1.small", "mw1.medium", "mw1.large", "mw1.xlarge", and "mw1.2xlarge". For more information, see Amazon MWAA environment class. * **MaxWorkers** *(integer) --* The maximum number of workers that run in your environment. For example, "20". * **NetworkConfiguration** *(dict) --* Describes the VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA. * **SubnetIds** *(list) --* A list of subnet IDs. For more information, see About networking on Amazon MWAA. * *(string) --* * **SecurityGroupIds** *(list) --* A list of security group IDs. For more information, see Security in your VPC on Amazon MWAA. * *(string) --* * **LoggingConfiguration** *(dict) --* The Apache Airflow logs published to CloudWatch Logs. * **DagProcessingLogs** *(dict) --* The Airflow DAG processing logs published to CloudWatch Logs and the log level. * **Enabled** *(boolean) --* Indicates whether the Apache Airflow log type (e.g. 
"DagProcessingLogs") is enabled. * **LogLevel** *(string) --* The Apache Airflow log level for the log type (e.g. "DagProcessingLogs"). * **CloudWatchLogGroupArn** *(string) --* The Amazon Resource Name (ARN) for the CloudWatch Logs group where the Apache Airflow log type (e.g. "DagProcessingLogs") is published. For example, "arn:aws:logs:us-east-1:123456789012:log-group:airflow-MyMWAAEnvironment-MwaaEnvironment-DAGProcessing:*". * **SchedulerLogs** *(dict) --* The Airflow scheduler logs published to CloudWatch Logs and the log level. * **Enabled** *(boolean) --* Indicates whether the Apache Airflow log type (e.g. "DagProcessingLogs") is enabled. * **LogLevel** *(string) --* The Apache Airflow log level for the log type (e.g. "DagProcessingLogs"). * **CloudWatchLogGroupArn** *(string) --* The Amazon Resource Name (ARN) for the CloudWatch Logs group where the Apache Airflow log type (e.g. "DagProcessingLogs") is published. For example, "arn:aws:logs:us-east-1:123456789012:log-group:airflow-MyMWAAEnvironment-MwaaEnvironment-DAGProcessing:*". * **WebserverLogs** *(dict) --* The Airflow web server logs published to CloudWatch Logs and the log level. * **Enabled** *(boolean) --* Indicates whether the Apache Airflow log type (e.g. "DagProcessingLogs") is enabled. * **LogLevel** *(string) --* The Apache Airflow log level for the log type (e.g. "DagProcessingLogs"). * **CloudWatchLogGroupArn** *(string) --* The Amazon Resource Name (ARN) for the CloudWatch Logs group where the Apache Airflow log type (e.g. "DagProcessingLogs") is published. For example, "arn:aws:logs:us-east-1:123456789012:log-group:airflow-MyMWAAEnvironment-MwaaEnvironment-DAGProcessing:*". * **WorkerLogs** *(dict) --* The Airflow worker logs published to CloudWatch Logs and the log level. * **Enabled** *(boolean) --* Indicates whether the Apache Airflow log type (e.g. "DagProcessingLogs") is enabled. * **LogLevel** *(string) --* The Apache Airflow log level for the log type (e.g. 
"DagProcessingLogs"). * **CloudWatchLogGroupArn** *(string) --* The Amazon Resource Name (ARN) for the CloudWatch Logs group where the Apache Airflow log type (e.g. "DagProcessingLogs") is published. For example, "arn:aws:logs:us-east-1:123456789012:log-group:airflow-MyMWAAEnvironment-MwaaEnvironment-DAGProcessing:*". * **TaskLogs** *(dict) --* The Airflow task logs published to CloudWatch Logs and the log level. * **Enabled** *(boolean) --* Indicates whether the Apache Airflow log type (e.g. "DagProcessingLogs") is enabled. * **LogLevel** *(string) --* The Apache Airflow log level for the log type (e.g. "DagProcessingLogs"). * **CloudWatchLogGroupArn** *(string) --* The Amazon Resource Name (ARN) for the CloudWatch Logs group where the Apache Airflow log type (e.g. "DagProcessingLogs") is published. For example, "arn:aws:logs:us-east-1:123456789012:log-group:airflow-MyMWAAEnvironment-MwaaEnvironment-DAGProcessing:*". * **LastUpdate** *(dict) --* The status of the last update on the environment. * **Status** *(string) --* The status of the last update on the environment. * **CreatedAt** *(datetime) --* The day and time of the last update on the environment. * **Error** *(dict) --* The error that was encountered during the last update of the environment. * **ErrorCode** *(string) --* The error code that corresponds to the error with the last update. * **ErrorMessage** *(string) --* The error message that corresponds to the error code. * **Source** *(string) --* The source of the last update to the environment. Includes internal processes by Amazon MWAA, such as an environment maintenance update. * **WorkerReplacementStrategy** *(string) --* The worker replacement strategy used in the last update of the environment. * **WeeklyMaintenanceWindowStart** *(string) --* The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time that weekly maintenance updates are scheduled. For example: "TUE:03:30". 
* **Tags** *(dict) --* The key-value tag pairs associated to your environment. For example, ""Environment": "Staging"". For more information, see Tagging Amazon Web Services resources. * *(string) --* * *(string) --* * **WebserverAccessMode** *(string) --* The Apache Airflow *web server* access mode. For more information, see Apache Airflow access modes. * **MinWorkers** *(integer) --* The minimum number of workers that run in your environment. For example, "2". * **Schedulers** *(integer) --* The number of Apache Airflow schedulers that run in your Amazon MWAA environment. * **WebserverVpcEndpointService** *(string) --* The VPC endpoint for the environment's web server. * **DatabaseVpcEndpointService** *(string) --* The VPC endpoint for the environment's Amazon RDS database. * **CeleryExecutorQueue** *(string) --* The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC. * **EndpointManagement** *(string) --* Defines whether the VPC endpoints configured for the environment are created, and managed, by the customer or by Amazon MWAA. If set to "SERVICE", Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set to "CUSTOMER", you must create, and manage, the VPC endpoints in your VPC. * **MinWebservers** *(integer) --* The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for "MaxWebservers" when you interact with your Apache Airflow environment using Apache Airflow REST API, or the Apache Airflow CLI. As the transaction-per-second rate, and the network load, decrease, Amazon MWAA disposes of the additional web servers, and scales down to the number set in "MinWebservers". 
Valid values: For environments larger than mw1.micro, accepts values from "2" to "5". Defaults to "2" for all environment sizes except mw1.micro, which defaults to "1". * **MaxWebservers** *(integer) --* The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for "MaxWebservers" when you interact with your Apache Airflow environment using Apache Airflow REST API, or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in "MaxWebservers". As TPS rates decrease Amazon MWAA disposes of the additional web servers, and scales down to the number set in "MinWebservers". Valid values: For environments larger than mw1.micro, accepts values from "2" to "5". Defaults to "2" for all environment sizes except mw1.micro, which defaults to "1". **Exceptions** * "MWAA.Client.exceptions.ResourceNotFoundException" * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" MWAA / Client / get_paginator get_paginator ************* MWAA.Client.get_paginator(operation_name) Create a paginator for an operation. Parameters: **operation_name** (*string*) -- The operation name. This is the same name as the method name on the client. For example, if the method name is "create_foo", and you'd normally invoke the operation as "client.create_foo(**kwargs)", if the "create_foo" operation can be paginated, you can use the call "client.get_paginator("create_foo")". Raises: **OperationNotPageableError** -- Raised if the operation is not pageable. You can use the "client.can_paginate" method to check if an operation is pageable. Return type: "botocore.paginate.Paginator" Returns: A paginator object. 
MWAA / Client / delete_environment delete_environment ****************** MWAA.Client.delete_environment(**kwargs) Deletes an Amazon Managed Workflows for Apache Airflow (Amazon MWAA) environment. See also: AWS API Documentation **Request Syntax** response = client.delete_environment( Name='string' ) Parameters: **Name** (*string*) -- **[REQUIRED]** The name of the Amazon MWAA environment. For example, "MyMWAAEnvironment". Return type: dict Returns: **Response Syntax** {} **Response Structure** * *(dict) --* **Exceptions** * "MWAA.Client.exceptions.ResourceNotFoundException" * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" MWAA / Client / can_paginate can_paginate ************ MWAA.Client.can_paginate(operation_name) Check if an operation can be paginated. Parameters: **operation_name** (*string*) -- The operation name. This is the same name as the method name on the client. For example, if the method name is "create_foo", and you'd normally invoke the operation as "client.create_foo(**kwargs)", if the "create_foo" operation can be paginated, you can use the call "client.get_paginator("create_foo")". Returns: "True" if the operation can be paginated, "False" otherwise. MWAA / Client / invoke_rest_api invoke_rest_api *************** MWAA.Client.invoke_rest_api(**kwargs) Invokes the Apache Airflow REST API on the webserver with the specified inputs. To learn more, see Using the Apache Airflow REST API See also: AWS API Documentation **Request Syntax** response = client.invoke_rest_api( Name='string', Path='string', Method='GET'|'PUT'|'POST'|'PATCH'|'DELETE', QueryParameters={...}|[...]|123|123.4|'string'|True|None, Body={...}|[...]|123|123.4|'string'|True|None ) Parameters: * **Name** (*string*) -- **[REQUIRED]** The name of the Amazon MWAA environment. For example, "MyMWAAEnvironment". * **Path** (*string*) -- **[REQUIRED]** The Apache Airflow REST API endpoint path to be called. 
For example, "/dags/123456/clearTaskInstances". For more information, see Apache Airflow API * **Method** (*string*) -- **[REQUIRED]** The HTTP method used for making Airflow REST API calls. For example, "POST". * **QueryParameters** (*document*) -- Query parameters to be included in the Apache Airflow REST API call, provided as a JSON object. * **Body** (*document*) -- The request body for the Apache Airflow REST API call, provided as a JSON object. Return type: dict Returns: **Response Syntax** { 'RestApiStatusCode': 123, 'RestApiResponse': {...}|[...]|123|123.4|'string'|True|None } **Response Structure** * *(dict) --* * **RestApiStatusCode** *(integer) --* The HTTP status code returned by the Apache Airflow REST API call. * **RestApiResponse** (*document*) -- The response data from the Apache Airflow REST API call, provided as a JSON object. **Exceptions** * "MWAA.Client.exceptions.RestApiClientException" * "MWAA.Client.exceptions.AccessDeniedException" * "MWAA.Client.exceptions.ResourceNotFoundException" * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" * "MWAA.Client.exceptions.RestApiServerException" MWAA / Client / create_environment create_environment ****************** MWAA.Client.create_environment(**kwargs) Creates an Amazon Managed Workflows for Apache Airflow (Amazon MWAA) environment. 
See also: AWS API Documentation **Request Syntax** response = client.create_environment( Name='string', ExecutionRoleArn='string', SourceBucketArn='string', DagS3Path='string', NetworkConfiguration={ 'SubnetIds': [ 'string', ], 'SecurityGroupIds': [ 'string', ] }, PluginsS3Path='string', PluginsS3ObjectVersion='string', RequirementsS3Path='string', RequirementsS3ObjectVersion='string', StartupScriptS3Path='string', StartupScriptS3ObjectVersion='string', AirflowConfigurationOptions={ 'string': 'string' }, EnvironmentClass='string', MaxWorkers=123, KmsKey='string', AirflowVersion='string', LoggingConfiguration={ 'DagProcessingLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' }, 'SchedulerLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' }, 'WebserverLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' }, 'WorkerLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' }, 'TaskLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' } }, WeeklyMaintenanceWindowStart='string', Tags={ 'string': 'string' }, WebserverAccessMode='PRIVATE_ONLY'|'PUBLIC_ONLY', MinWorkers=123, Schedulers=123, EndpointManagement='CUSTOMER'|'SERVICE', MinWebservers=123, MaxWebservers=123 ) Parameters: * **Name** (*string*) -- **[REQUIRED]** The name of the Amazon MWAA environment. For example, "MyMWAAEnvironment". * **ExecutionRoleArn** (*string*) -- **[REQUIRED]** The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, "arn:aws:iam::123456789:role/my-execution-role". For more information, see Amazon MWAA Execution role. 
* **SourceBucketArn** (*string*) -- **[REQUIRED]** The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, "arn:aws:s3:::my-airflow-bucket-unique-name". For more information, see Create an Amazon S3 bucket for Amazon MWAA. * **DagS3Path** (*string*) -- **[REQUIRED]** The relative path to the DAGs folder on your Amazon S3 bucket. For example, "dags". For more information, see Adding or updating DAGs. * **NetworkConfiguration** (*dict*) -- **[REQUIRED]** The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA. * **SubnetIds** *(list) --* A list of subnet IDs. For more information, see About networking on Amazon MWAA. * *(string) --* * **SecurityGroupIds** *(list) --* A list of security group IDs. For more information, see Security in your VPC on Amazon MWAA. * *(string) --* * **PluginsS3Path** (*string*) -- The relative path to the "plugins.zip" file on your Amazon S3 bucket. For example, "plugins.zip". If specified, then the "plugins.zip" version is required. For more information, see Installing custom plugins. * **PluginsS3ObjectVersion** (*string*) -- The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works. * **RequirementsS3Path** (*string*) -- The relative path to the "requirements.txt" file on your Amazon S3 bucket. For example, "requirements.txt". If specified, then a version is required. For more information, see Installing Python dependencies. * **RequirementsS3ObjectVersion** (*string*) -- The version of the "requirements.txt" file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works. 
* **StartupScriptS3Path** (*string*) -- The relative path to the startup shell script in your Amazon S3 bucket. For example, "s3://mwaa-environment/startup.sh". Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script. * **StartupScriptS3ObjectVersion** (*string*) -- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: "3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo" For more information, see Using a startup script. * **AirflowConfigurationOptions** (*dict*) -- A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options. * *(string) --* * *(string) --* * **EnvironmentClass** (*string*) -- The environment class type. Valid values: "mw1.micro", "mw1.small", "mw1.medium", "mw1.large", "mw1.xlarge", and "mw1.2xlarge". For more information, see Amazon MWAA environment class. * **MaxWorkers** (*integer*) -- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the "MaxWorkers" field. For example, "20". When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify in "MinWorkers". * **KmsKey** (*string*) -- The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. 
You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment. * **AirflowVersion** (*string*) -- The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (Amazon MWAA). Valid values: "1.10.12", "2.0.2", "2.2.2", "2.4.3", "2.5.1", "2.6.3", "2.7.2", "2.8.1", "2.9.2", "2.10.1", and "2.10.3". * **LoggingConfiguration** (*dict*) -- Defines the Apache Airflow logs to send to CloudWatch Logs. * **DagProcessingLogs** *(dict) --* Publishes Airflow DAG processing logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). * **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. * **SchedulerLogs** *(dict) --* Publishes Airflow scheduler logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). * **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. * **WebserverLogs** *(dict) --* Publishes Airflow web server logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). * **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. * **WorkerLogs** *(dict) --* Publishes Airflow worker logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). * **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. 
* **TaskLogs** *(dict) --* Publishes Airflow task logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). * **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. * **WeeklyMaintenanceWindowStart** (*string*) -- The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: "DAY:HH:MM". For example: "TUE:03:30". You can specify a start time in 30 minute increments only. * **Tags** (*dict*) -- The key-value tag pairs you want to associate to your environment. For example, ""Environment": "Staging"". For more information, see Tagging Amazon Web Services resources. * *(string) --* * *(string) --* * **WebserverAccessMode** (*string*) -- Defines the access mode for the Apache Airflow *web server*. For more information, see Apache Airflow access modes. * **MinWorkers** (*integer*) -- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the "MaxWorkers" field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the worker count you specify in the "MinWorkers" field. For example, "2". * **Schedulers** (*integer*) -- The number of Apache Airflow schedulers to run in your environment. Valid values: * v2 - For environments larger than mw1.micro, accepts values from "2" to "5". Defaults to "2" for all environment sizes except mw1.micro, which defaults to "1". * v1 - Accepts "1". * **EndpointManagement** (*string*) -- Defines whether the VPC endpoints configured for the environment are created, and managed, by the customer or by Amazon MWAA. If set to "SERVICE", Amazon MWAA will create and manage the required VPC endpoints in your VPC. 
If set to "CUSTOMER", you must create, and manage, the VPC endpoints for your VPC. If you choose to create an environment in a shared VPC, you must set this value to "CUSTOMER". In a shared VPC deployment, the environment will remain in "PENDING" status until you create the VPC endpoints. If you do not take action to create the endpoints within 72 hours, the status will change to "CREATE_FAILED". You can delete the failed environment and create a new one. * **MinWebservers** (*integer*) -- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for "MaxWebservers" when you interact with your Apache Airflow environment using the Apache Airflow REST API, or the Apache Airflow CLI. As the transaction-per-second rate, and the network load, decrease, Amazon MWAA disposes of the additional web servers, and scales down to the number set in "MinWebservers". Valid values: For environments larger than mw1.micro, accepts values from "2" to "5". Defaults to "2" for all environment sizes except mw1.micro, which defaults to "1". * **MaxWebservers** (*integer*) -- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for "MaxWebservers" when you interact with your Apache Airflow environment using the Apache Airflow REST API, or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in "MaxWebservers". As TPS rates decrease, Amazon MWAA disposes of the additional web servers, and scales down to the number set in "MinWebservers". Valid values: For environments larger than mw1.micro, accepts values from "2" to "5". 
Defaults to "2" for all environment sizes except mw1.micro, which defaults to "1". Return type: dict Returns: **Response Syntax** { 'Arn': 'string' } **Response Structure** * *(dict) --* * **Arn** *(string) --* The Amazon Resource Name (ARN) returned in the response for the environment. **Exceptions** * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" MWAA / Client / list_environments list_environments ***************** MWAA.Client.list_environments(**kwargs) Lists the Amazon Managed Workflows for Apache Airflow (MWAA) environments. See also: AWS API Documentation **Request Syntax** response = client.list_environments( NextToken='string', MaxResults=123 ) Parameters: * **NextToken** (*string*) -- Retrieves the next page of the results. * **MaxResults** (*integer*) -- The maximum number of results to retrieve per page. For example, "5" environments per page. Return type: dict Returns: **Response Syntax** { 'Environments': [ 'string', ], 'NextToken': 'string' } **Response Structure** * *(dict) --* * **Environments** *(list) --* Returns a list of Amazon MWAA environments. * *(string) --* * **NextToken** *(string) --* Retrieves the next page of the results. **Exceptions** * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" MWAA / Client / create_web_login_token create_web_login_token ********************** MWAA.Client.create_web_login_token(**kwargs) Creates a web login token for the Airflow Web UI. To learn more, see Creating an Apache Airflow web login token. See also: AWS API Documentation **Request Syntax** response = client.create_web_login_token( Name='string' ) Parameters: **Name** (*string*) -- **[REQUIRED]** The name of the Amazon MWAA environment. For example, "MyMWAAEnvironment". 
Return type: dict Returns: **Response Syntax** { 'WebToken': 'string', 'WebServerHostname': 'string', 'IamIdentity': 'string', 'AirflowIdentity': 'string' } **Response Structure** * *(dict) --* * **WebToken** *(string) --* An Airflow web server login token. * **WebServerHostname** *(string) --* The Airflow web server hostname for the environment. * **IamIdentity** *(string) --* The name of the IAM identity creating the web login token. This might be an IAM user, or an assumed or federated identity. For example, "assumed-role/Admin/your-name". * **AirflowIdentity** *(string) --* The user name of the Apache Airflow identity creating the web login token. **Exceptions** * "MWAA.Client.exceptions.AccessDeniedException" * "MWAA.Client.exceptions.ResourceNotFoundException" * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" MWAA / Client / list_tags_for_resource list_tags_for_resource ********************** MWAA.Client.list_tags_for_resource(**kwargs) Lists the key-value tag pairs associated to the Amazon Managed Workflows for Apache Airflow (MWAA) environment. For example, ""Environment": "Staging"". See also: AWS API Documentation **Request Syntax** response = client.list_tags_for_resource( ResourceArn='string' ) Parameters: **ResourceArn** (*string*) -- **[REQUIRED]** The Amazon Resource Name (ARN) of the Amazon MWAA environment. For example, "arn:aws:airflow:us-east-1:123456789012:environment/MyMWAAEnvironment". Return type: dict Returns: **Response Syntax** { 'Tags': { 'string': 'string' } } **Response Structure** * *(dict) --* * **Tags** *(dict) --* The key-value tag pairs associated to your environment. For more information, see Tagging Amazon Web Services resources. 
* *(string) --* * *(string) --* **Exceptions** * "MWAA.Client.exceptions.ResourceNotFoundException" * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" MWAA / Client / update_environment update_environment ****************** MWAA.Client.update_environment(**kwargs) Updates an Amazon Managed Workflows for Apache Airflow (MWAA) environment. See also: AWS API Documentation **Request Syntax** response = client.update_environment( Name='string', ExecutionRoleArn='string', AirflowConfigurationOptions={ 'string': 'string' }, AirflowVersion='string', DagS3Path='string', EnvironmentClass='string', LoggingConfiguration={ 'DagProcessingLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' }, 'SchedulerLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' }, 'WebserverLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' }, 'WorkerLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' }, 'TaskLogs': { 'Enabled': True|False, 'LogLevel': 'CRITICAL'|'ERROR'|'WARNING'|'INFO'|'DEBUG' } }, MaxWorkers=123, MinWorkers=123, MaxWebservers=123, MinWebservers=123, WorkerReplacementStrategy='FORCED'|'GRACEFUL', NetworkConfiguration={ 'SecurityGroupIds': [ 'string', ] }, PluginsS3Path='string', PluginsS3ObjectVersion='string', RequirementsS3Path='string', RequirementsS3ObjectVersion='string', Schedulers=123, SourceBucketArn='string', StartupScriptS3Path='string', StartupScriptS3ObjectVersion='string', WebserverAccessMode='PRIVATE_ONLY'|'PUBLIC_ONLY', WeeklyMaintenanceWindowStart='string' ) Parameters: * **Name** (*string*) -- **[REQUIRED]** The name of your Amazon MWAA environment. For example, "MyMWAAEnvironment". * **ExecutionRoleArn** (*string*) -- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access Amazon Web Services resources in your environment. 
For example, "arn:aws:iam::123456789:role/my-execution-role". For more information, see Amazon MWAA Execution role. * **AirflowConfigurationOptions** (*dict*) -- A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options. * *(string) --* * *(string) --* * **AirflowVersion** (*string*) -- The Apache Airflow version for your environment. To upgrade your environment, specify a newer version of Apache Airflow supported by Amazon MWAA. Before you upgrade an environment, make sure your requirements, DAGs, plugins, and other resources used in your workflows are compatible with the new Apache Airflow version. For more information about updating your resources, see Upgrading an Amazon MWAA environment. Valid values: "1.10.12", "2.0.2", "2.2.2", "2.4.3", "2.5.1", "2.6.3", "2.7.2", "2.8.1", "2.9.2", "2.10.1", and "2.10.3". * **DagS3Path** (*string*) -- The relative path to the DAGs folder on your Amazon S3 bucket. For example, "dags". For more information, see Adding or updating DAGs. * **EnvironmentClass** (*string*) -- The environment class type. Valid values: "mw1.micro", "mw1.small", "mw1.medium", "mw1.large", "mw1.xlarge", and "mw1.2xlarge". For more information, see Amazon MWAA environment class. * **LoggingConfiguration** (*dict*) -- The Apache Airflow log types to send to CloudWatch Logs. * **DagProcessingLogs** *(dict) --* Publishes Airflow DAG processing logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). * **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. * **SchedulerLogs** *(dict) --* Publishes Airflow scheduler logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). 
* **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. * **WebserverLogs** *(dict) --* Publishes Airflow web server logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). * **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. * **WorkerLogs** *(dict) --* Publishes Airflow worker logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). * **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. * **TaskLogs** *(dict) --* Publishes Airflow task logs to CloudWatch Logs. * **Enabled** *(boolean) --* **[REQUIRED]** Indicates whether to enable the Apache Airflow log type (e.g. "DagProcessingLogs"). * **LogLevel** *(string) --* **[REQUIRED]** Defines the Apache Airflow log level (e.g. "INFO") to send to CloudWatch Logs. * **MaxWorkers** (*integer*) -- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the "MaxWorkers" field. For example, "20". When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify in "MinWorkers". * **MinWorkers** (*integer*) -- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the "MaxWorkers" field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the worker count you specify in the "MinWorkers" field. For example, "2". 
* **MaxWebservers** (*integer*) -- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for "MaxWebservers" when you interact with your Apache Airflow environment using the Apache Airflow REST API, or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in "MaxWebservers". As TPS rates decrease, Amazon MWAA disposes of the additional web servers, and scales down to the number set in "MinWebservers". Valid values: For environments larger than mw1.micro, accepts values from "2" to "5". Defaults to "2" for all environment sizes except mw1.micro, which defaults to "1". * **MinWebservers** (*integer*) -- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for "MaxWebservers" when you interact with your Apache Airflow environment using the Apache Airflow REST API, or the Apache Airflow CLI. As the transaction-per-second rate, and the network load, decrease, Amazon MWAA disposes of the additional web servers, and scales down to the number set in "MinWebservers". Valid values: For environments larger than mw1.micro, accepts values from "2" to "5". Defaults to "2" for all environment sizes except mw1.micro, which defaults to "1". * **WorkerReplacementStrategy** (*string*) -- The worker replacement strategy to use when updating the environment. You can select one of the following strategies: * **Forced -** Stops and replaces Apache Airflow workers without waiting for tasks to complete before an update. * **Graceful -** Allows Apache Airflow workers to complete running tasks for up to 12 hours during an update before they're stopped and replaced. 
* **NetworkConfiguration** (*dict*) -- The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA. * **SecurityGroupIds** *(list) --* **[REQUIRED]** A list of security group IDs. A security group must be attached to the same VPC as the subnets. For more information, see Security in your VPC on Amazon MWAA. * *(string) --* * **PluginsS3Path** (*string*) -- The relative path to the "plugins.zip" file on your Amazon S3 bucket. For example, "plugins.zip". If specified, then the plugins.zip version is required. For more information, see Installing custom plugins. * **PluginsS3ObjectVersion** (*string*) -- The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a "plugins.zip" file is updated. For more information, see How S3 Versioning works. * **RequirementsS3Path** (*string*) -- The relative path to the "requirements.txt" file on your Amazon S3 bucket. For example, "requirements.txt". If specified, then a file version is required. For more information, see Installing Python dependencies. * **RequirementsS3ObjectVersion** (*string*) -- The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a "requirements.txt" file is updated. For more information, see How S3 Versioning works. * **Schedulers** (*integer*) -- The number of Apache Airflow schedulers to run in your Amazon MWAA environment. * **SourceBucketArn** (*string*) -- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, "arn:aws:s3:::my-airflow-bucket-unique-name". For more information, see Create an Amazon S3 bucket for Amazon MWAA. * **StartupScriptS3Path** (*string*) -- The relative path to the startup shell script in your Amazon S3 bucket. For example, "s3://mwaa-environment/startup.sh". 
Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script. * **StartupScriptS3ObjectVersion** (*string*) -- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: "3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo" For more information, see Using a startup script. * **WebserverAccessMode** (*string*) -- The Apache Airflow *Web server* access mode. For more information, see Apache Airflow access modes. * **WeeklyMaintenanceWindowStart** (*string*) -- The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: "DAY:HH:MM". For example: "TUE:03:30". You can specify a start time in 30 minute increments only. Return type: dict Returns: **Response Syntax** { 'Arn': 'string' } **Response Structure** * *(dict) --* * **Arn** *(string) --* The Amazon Resource Name (ARN) of the Amazon MWAA environment. For example, "arn:aws:airflow:us-east-1:123456789012:environment/MyMWAAEnvironment". **Exceptions** * "MWAA.Client.exceptions.ResourceNotFoundException" * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" MWAA / Client / untag_resource untag_resource ************** MWAA.Client.untag_resource(**kwargs) Removes key-value tag pairs associated to your Amazon Managed Workflows for Apache Airflow (MWAA) environment. For example, ""Environment": "Staging"". 
See also: AWS API Documentation **Request Syntax** response = client.untag_resource( ResourceArn='string', tagKeys=[ 'string', ] ) Parameters: * **ResourceArn** (*string*) -- **[REQUIRED]** The Amazon Resource Name (ARN) of the Amazon MWAA environment. For example, "arn:aws:airflow:us-east-1:123456789012:environment/MyMWAAEnvironment". * **tagKeys** (*list*) -- **[REQUIRED]** The key-value tag pair you want to remove. For example, ""Environment": "Staging"". * *(string) --* Return type: dict Returns: **Response Syntax** {} **Response Structure** * *(dict) --* **Exceptions** * "MWAA.Client.exceptions.ResourceNotFoundException" * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" MWAA / Client / get_waiter get_waiter ********** MWAA.Client.get_waiter(waiter_name) Returns an object that can wait for some condition. Parameters: **waiter_name** (*str*) -- The name of the waiter to get. See the waiters section of the service docs for a list of available waiters. Returns: The specified waiter object. Return type: "botocore.waiter.Waiter" MWAA / Client / publish_metrics publish_metrics *************** MWAA.Client.publish_metrics(**kwargs) **Internal only**. Publishes environment health metrics to Amazon CloudWatch. Danger: This operation is deprecated and may not function as expected. This operation should not be used going forward and is only kept for the purpose of backwards compatibility. 
See also: AWS API Documentation **Request Syntax** response = client.publish_metrics( EnvironmentName='string', MetricData=[ { 'MetricName': 'string', 'Timestamp': datetime(2015, 1, 1), 'Dimensions': [ { 'Name': 'string', 'Value': 'string' }, ], 'Value': 123.0, 'Unit': 'Seconds'|'Microseconds'|'Milliseconds'|'Bytes'|'Kilobytes'|'Megabytes'|'Gigabytes'|'Terabytes'|'Bits'|'Kilobits'|'Megabits'|'Gigabits'|'Terabits'|'Percent'|'Count'|'Bytes/Second'|'Kilobytes/Second'|'Megabytes/Second'|'Gigabytes/Second'|'Terabytes/Second'|'Bits/Second'|'Kilobits/Second'|'Megabits/Second'|'Gigabits/Second'|'Terabits/Second'|'Count/Second'|'None', 'StatisticValues': { 'SampleCount': 123, 'Sum': 123.0, 'Minimum': 123.0, 'Maximum': 123.0 } }, ] ) Parameters: * **EnvironmentName** (*string*) -- **[REQUIRED]** **Internal only**. The name of the environment. * **MetricData** (*list*) -- **[REQUIRED]** **Internal only**. Publishes metrics to Amazon CloudWatch. To learn more about the metrics published to Amazon CloudWatch, see Amazon MWAA performance metrics in Amazon CloudWatch. * *(dict) --* **Internal only**. Collects Apache Airflow metrics. To learn more about the metrics published to Amazon CloudWatch, see Amazon MWAA performance metrics in Amazon CloudWatch. * **MetricName** *(string) --* **[REQUIRED]** **Internal only**. The name of the metric. * **Timestamp** *(datetime) --* **[REQUIRED]** **Internal only**. The time the metric data was received. * **Dimensions** *(list) --* **Internal only**. The dimensions associated with the metric. * *(dict) --* **Internal only**. Represents the dimensions of a metric. To learn more about the metrics published to Amazon CloudWatch, see Amazon MWAA performance metrics in Amazon CloudWatch. * **Name** *(string) --* **[REQUIRED]** **Internal only**. The name of the dimension. * **Value** *(string) --* **[REQUIRED]** **Internal only**. The value of the dimension. * **Value** *(float) --* **Internal only**. The value for the metric. 
* **Unit** *(string) --* **Internal only**. The unit used to store the metric. * **StatisticValues** *(dict) --* **Internal only**. The statistical values for the metric. * **SampleCount** *(integer) --* **Internal only**. The number of samples used for the statistic set. * **Sum** *(float) --* **Internal only**. The sum of values for the sample set. * **Minimum** *(float) --* **Internal only**. The minimum value of the sample set. * **Maximum** *(float) --* **Internal only**. The maximum value of the sample set. Return type: dict Returns: **Response Syntax** {} **Response Structure** * *(dict) --* **Exceptions** * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException" MWAA / Client / close close ***** MWAA.Client.close() Closes underlying endpoint connections. MWAA / Client / create_cli_token create_cli_token **************** MWAA.Client.create_cli_token(**kwargs) Creates a CLI token for the Airflow CLI. To learn more, see Creating an Apache Airflow CLI token. See also: AWS API Documentation **Request Syntax** response = client.create_cli_token( Name='string' ) Parameters: **Name** (*string*) -- **[REQUIRED]** The name of the Amazon MWAA environment. For example, "MyMWAAEnvironment". Return type: dict Returns: **Response Syntax** { 'CliToken': 'string', 'WebServerHostname': 'string' } **Response Structure** * *(dict) --* * **CliToken** *(string) --* An Airflow CLI login token. * **WebServerHostname** *(string) --* The Airflow web server hostname for the environment. **Exceptions** * "MWAA.Client.exceptions.ResourceNotFoundException" MWAA / Client / tag_resource tag_resource ************ MWAA.Client.tag_resource(**kwargs) Associates key-value tag pairs to your Amazon Managed Workflows for Apache Airflow (MWAA) environment. 
See also: AWS API Documentation **Request Syntax** response = client.tag_resource( ResourceArn='string', Tags={ 'string': 'string' } ) Parameters: * **ResourceArn** (*string*) -- **[REQUIRED]** The Amazon Resource Name (ARN) of the Amazon MWAA environment. For example, "arn:aws:airflow:us-east-1:123456789012:environment/MyMWAAEnvironment". * **Tags** (*dict*) -- **[REQUIRED]** The key-value tag pairs you want to associate to your environment. For example, ""Environment": "Staging"". For more information, see Tagging Amazon Web Services resources. * *(string) --* * *(string) --* Return type: dict Returns: **Response Syntax** {} **Response Structure** * *(dict) --* **Exceptions** * "MWAA.Client.exceptions.ResourceNotFoundException" * "MWAA.Client.exceptions.ValidationException" * "MWAA.Client.exceptions.InternalServerException"