
How to Automate AWS WorkMail Backups for Multiple Users

By Muhammad Bilal · 15 min read · 2024/05/27

TL;DR

AWS WorkMail is a powerful email and calendar service that combines security, reliability, and ease of use. By leveraging AWS WorkMail, organizations can focus on their core activities without worrying about the complexities of managing email infrastructure. In this tutorial, I will walk you through automating Amazon WorkMail backups for multiple users, handling up to 10 concurrent mailbox export jobs.

AWS WorkMail is a powerful email and calendar service that combines security, reliability, and ease of use. Its integration with other AWS services and compatibility with existing email clients make it a compelling choice for businesses that want to streamline email and calendar management while ensuring security and compliance. By leveraging AWS WorkMail, organizations can focus on their core activities without worrying about the complexities of managing email infrastructure. Setting up a reliable backup system for your AWS WorkMail is essential to keeping your email safe and easy to restore. In this tutorial, I will walk you through automating AWS WorkMail backups for multiple users, handling up to 10 concurrent mailbox export jobs. We will use AWS services and Python to build a robust backup solution.


Ah, I remember the days when I ran a cPanel mail server and later tinkered with my own mail server. Back in the early 2010s, those single-instance mail servers with limited IP access ran smoothly, but as more and more people around the world began abusing such servers, spam and marketing mail flooded in, and keeping your own messages out of the recipient's spam folder became genuinely hard work.


The choice between AWS WorkMail, a self-hosted mail server, and a cPanel mail server depends on several factors, including budget, technical expertise, scalability needs, and security requirements.


  • AWS WorkMail is ideal for businesses seeking a scalable, secure, managed email solution with minimal maintenance overhead.
  • A self-hosted mail server suits organizations with in-house technical expertise and specific customization needs, but it involves higher costs and more maintenance effort.
  • A cPanel mail server offers a balanced approach for small and medium-sized businesses that want a user-friendly interface, are willing to take on some maintenance, and benefit from bundled hosting services.


My main concerns were the following:

Security

  • AWS WorkMail: Provides built-in security features, including encryption in transit and at rest, integration with AWS Key Management Service (KMS), and compliance with various regulatory standards.
  • Self-hosted mail server: Security depends entirely on the administrator's expertise. Encryption, spam filtering, and regular security updates must all be configured to guard against threats.
  • cPanel mail server: Offers security features, but the responsibility for implementing and maintaining them rests with the user. cPanel provides SSL/TLS configuration, spam filtering, and antivirus tools, but proper setup and regular updates are essential.


Then AWS came to the rescue. Since I moved to AWS in 2015, everything has been a breeze, except for the one time I needed to back up my AWS account in order to migrate to a new one. I scoured the internet for a reliable solution, because as things stand, AWS WorkMail does not offer a straightforward way to back up your email to your local machine or to S3. That is understandable from a security-compliance standpoint, but I still wish AWS shipped a GUI or tool for it. While browsing I did come across some paid tools, so I decided to go down the road of building my own. After rigorous testing, and in my experience with AWS generally, 9 out of 10 problems occur when one service cannot talk to another because of a policy or IAM role issue.


This is my contribution to everyone who is struggling to back up their AWS WorkMail accounts.


How Does the Backup Work?


We create an IAM role in your account and grant it permission to export from AWS WorkMail, then attach a policy to the same role to allow it access to S3. We then create a KMS key and grant the IAM role access to it. The S3 bucket, in turn, also needs to grant access to the IAM role for everything to work.
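As a sanity check before touching the CLI, the trust relationship described above can be sketched in Python; the account ID below is a made-up placeholder:

```python
import json

def build_trust_policy(account_id: str) -> dict:
    # Allows the WorkMail export service to assume the role,
    # scoped to this account via the sts:ExternalId condition.
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "export.workmail.amazonaws.com"},
            "Action": "sts:AssumeRole",
            "Condition": {"StringEquals": {"sts:ExternalId": account_id}},
        }],
    }

policy = build_trust_policy("123456789012")
print(json.dumps(policy, indent=2))
```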

Step 1: Configure the AWS CLI

First, make sure the AWS CLI is installed and configured with the appropriate credentials. Open a terminal and run:

```shell
aws configure
```

Follow the prompts to set your AWS access key ID, secret access key, default region, and output format.

Step 2: Set Up the Required AWS Resources

We need to create an IAM role, a policy, an S3 bucket, and a KMS key. Save the following bash script as setup_workmail_export.sh:

```shell
#!/bin/bash

# Configuration
ROLE_NAME="WorkMailExportRole"
POLICY_NAME="workmail-export"
S3_BUCKET_NAME="s3.bucket.name"
AWS_REGION="your-region"
ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)

# Create Trust Policy
cat <<EOF > trust-policy.json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "export.workmail.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "$ACCOUNT_ID"
        }
      }
    }
  ]
}
EOF

# Create IAM Role
aws iam create-role --role-name $ROLE_NAME --assume-role-policy-document file://trust-policy.json

# Create IAM Policy
cat <<EOF > role-policy.json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::$S3_BUCKET_NAME",
        "arn:aws:s3:::$S3_BUCKET_NAME/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "kms:Decrypt",
        "kms:GenerateDataKey"
      ],
      "Resource": "*"
    }
  ]
}
EOF

# Attach the Policy to the Role
aws iam put-role-policy --role-name $ROLE_NAME --policy-name $POLICY_NAME --policy-document file://role-policy.json

# Create S3 Bucket
aws s3api create-bucket --bucket $S3_BUCKET_NAME --region $AWS_REGION --create-bucket-configuration LocationConstraint=$AWS_REGION

# Create Key Policy
cat <<EOF > key-policy.json
{
  "Version": "2012-10-17",
  "Id": "workmail-export-key",
  "Statement": [
    {
      "Sid": "Enable IAM User Permissions",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::$ACCOUNT_ID:root"
      },
      "Action": "kms:*",
      "Resource": "*"
    },
    {
      "Sid": "Allow administration of the key",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::$ACCOUNT_ID:role/$ROLE_NAME"
      },
      "Action": [
        "kms:Create*",
        "kms:Describe*",
        "kms:Enable*",
        "kms:List*",
        "kms:Put*",
        "kms:Update*",
        "kms:Revoke*",
        "kms:Disable*",
        "kms:Get*",
        "kms:Delete*",
        "kms:ScheduleKeyDeletion",
        "kms:CancelKeyDeletion"
      ],
      "Resource": "*"
    }
  ]
}
EOF

# Create the KMS Key and get the Key ID and ARN using Python for JSON parsing
KEY_METADATA=$(aws kms create-key --policy file://key-policy.json)
KEY_ID=$(python3 -c "import sys, json; print(json.load(sys.stdin)['KeyMetadata']['KeyId'])" <<< "$KEY_METADATA")
KEY_ARN=$(python3 -c "import sys, json; print(json.load(sys.stdin)['KeyMetadata']['Arn'])" <<< "$KEY_METADATA")

# Update S3 Bucket Policy
cat <<EOF > s3_bucket_policy.json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::$ACCOUNT_ID:role/$ROLE_NAME"
      },
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::$S3_BUCKET_NAME",
        "arn:aws:s3:::$S3_BUCKET_NAME/*"
      ]
    }
  ]
}
EOF

# Apply the Bucket Policy
aws s3api put-bucket-policy --bucket $S3_BUCKET_NAME --policy file://s3_bucket_policy.json

# Clean up temporary files
rm trust-policy.json role-policy.json key-policy.json s3_bucket_policy.json

# Display the variables required for the backup script
cat <<EOF
Setup complete. Here are the variables required for the backup script:

organization_id = 'your-organization-id'
user_id = 'your-user-id'
s3_bucket_name = '$S3_BUCKET_NAME'
s3_prefix = 'workmail-backup/'
region = '$AWS_REGION'
kms_key_arn = '$KEY_ARN'
role_name = '$ROLE_NAME'
account_id = '$ACCOUNT_ID'
EOF
```

Make the script executable and run it:

```shell
chmod +x setup_workmail_export.sh
./setup_workmail_export.sh
```
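Heredoc-generated policy documents like the ones above are easy to break with a missing comma or brace. A quick way to validate any of the generated JSON files is to round-trip them through Python's json module, which raises an error on malformed input; the policy text here is a trimmed-down stand-in:

```python
import json

# A minimal policy document; swap in the contents of any of the
# generated files (trust-policy.json, role-policy.json, ...).
policy_text = """
{
  "Version": "2012-10-17",
  "Statement": [
    {"Effect": "Allow", "Action": "s3:*", "Resource": "*"}
  ]
}
"""

policy = json.loads(policy_text)  # raises json.JSONDecodeError on a typo
print(policy["Version"])  # → 2012-10-17
```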

Step 3: Write the Backup Script

Now let's write the Python script that exports mailboxes in batches of 10. Save the following script as workmail_export.py:

```python
import boto3
import json
import time
from datetime import datetime

# Configuration
organization_id = 'your-organization-id'
user_emails = {
    'user-id-1': '[email protected]',
    'user-id-2': '[email protected]',
    'user-id-3': '[email protected]',
    'user-id-4': '[email protected]',
    'user-id-5': '[email protected]',
    'user-id-6': '[email protected]',
    'user-id-7': '[email protected]',
    'user-id-8': '[email protected]',
    'user-id-9': '[email protected]',
    'user-id-10': '[email protected]',
    'user-id-11': '[email protected]',
    'user-id-12': '[email protected]'
    # Add more user ID to email mappings as needed
}
s3_bucket_name = 's3.bucket.name'
region = 'your-region'
kms_key_arn = 'arn:aws:kms:your-region:your-account-id:key/your-key-id'
role_name = 'WorkMailExportRole'
account_id = 'your-account-id'

# Get the current date
current_date = datetime.now().strftime('%Y-%m-%d')

# Set the S3 prefix with the date included
s3_prefix_base = f'workmail-backup/{current_date}/'

# Initialize AWS clients
workmail = boto3.client('workmail', region_name=region)
sts = boto3.client('sts', region_name=region)

def start_export_job(entity_id, user_email):
    client_token = str(time.time())  # Unique client token
    role_arn = f"arn:aws:iam::{account_id}:role/{role_name}"
    s3_prefix = f"{s3_prefix_base}{user_email}/"
    try:
        response = workmail.start_mailbox_export_job(
            ClientToken=client_token,
            OrganizationId=organization_id,
            EntityId=entity_id,
            Description='Backup job',
            RoleArn=role_arn,
            KmsKeyArn=kms_key_arn,
            S3BucketName=s3_bucket_name,
            S3Prefix=s3_prefix
        )
        return response['JobId']
    except Exception as e:
        print(f"Failed to start export job for {entity_id}: {e}")
        return None

def check_job_status(job_id):
    while True:
        try:
            response = workmail.describe_mailbox_export_job(
                OrganizationId=organization_id,
                JobId=job_id
            )
            print(f"Full Response: {response}")  # Log full response for debugging
            state = response.get('State', 'UNKNOWN')
            print(f"Job State: {state}")
            if state in ['COMPLETED', 'FAILED']:
                break
        except Exception as e:
            print(f"Error checking job status for {job_id}: {e}")
        time.sleep(30)  # Wait for 30 seconds before checking again
    return state

def export_mailboxes_in_batches(user_emails, batch_size=10):
    user_ids = list(user_emails.keys())
    for i in range(0, len(user_ids), batch_size):
        batch = user_ids[i:i + batch_size]
        job_ids = []
        for user_id in batch:
            user_email = user_emails[user_id]
            job_id = start_export_job(user_id, user_email)
            if job_id:
                print(f"Started export job for {user_email} with Job ID: {job_id}")
                job_ids.append((user_email, job_id))
        for user_email, job_id in job_ids:
            state = check_job_status(job_id)
            if state == 'COMPLETED':
                print(f"Export job for {user_email} completed successfully")
            else:
                print(f"Export job for {user_email} failed with state: {state}")

def main():
    export_mailboxes_in_batches(user_emails)

if __name__ == "__main__":
    main()
```

Replace the placeholders with your actual AWS WorkMail organization ID, user-ID-to-email mappings, S3 bucket name, region, KMS key ARN, role name, and AWS account ID.
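To make the resulting bucket layout concrete, here is a small helper mirroring how the script builds each user's S3 prefix; the email address is a placeholder:

```python
from datetime import datetime

def s3_prefix_for(user_email: str, when: datetime) -> str:
    # Mirrors the script's layout: workmail-backup/<YYYY-MM-DD>/<email>/
    return f"workmail-backup/{when.strftime('%Y-%m-%d')}/{user_email}/"

prefix = s3_prefix_for("user@example.com", datetime(2024, 5, 27))
print(prefix)  # → workmail-backup/2024-05-27/user@example.com/
```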


Step 4: Run the Backup Script

Make sure you have Boto3 installed:

```shell
pip install boto3
```

Then, run the Python script:

```shell
python workmail_export.py
```
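The 10-job concurrency cap comes from slicing the user list into groups before starting jobs; the batching itself is plain list slicing, sketched here in isolation:

```python
def batches(items, size=10):
    # Yield successive slices of at most `size` items.
    for i in range(0, len(items), size):
        yield items[i:i + size]

ids = [f"user-id-{n}" for n in range(1, 13)]  # 12 users, as in the example
groups = list(batches(ids, 10))
print([len(g) for g in groups])  # → [10, 2]
```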


Importing Mailboxes

To restore mailboxes from the exported archives, a companion script uses start_mailbox_import_job in the same batched fashion:

```python
import boto3
import json
import time

# Configuration
organization_id = 'your-organization-id'
user_import_data = {
    'user-id-1': '[email protected]',
    'user-id-2': '[email protected]',
    'user-id-3': '[email protected]',
    'user-id-4': '[email protected]',
    'user-id-5': '[email protected]',
    'user-id-6': '[email protected]',
    'user-id-7': '[email protected]',
    'user-id-8': '[email protected]',
    'user-id-9': '[email protected]',
    'user-id-10': '[email protected]',
    'user-id-11': '[email protected]',
    'user-id-12': '[email protected]'
    # Add more user ID to email mappings as needed
}
s3_bucket_name = 's3.bucket.name'
s3_object_prefix = 'workmail-backup/'  # Prefix for S3 objects (folders)
region = 'your-region'
role_name = 'WorkMailImportRole'
account_id = 'your-account-id'

# Initialize AWS clients
workmail = boto3.client('workmail', region_name=region)
sts = boto3.client('sts', region_name=region)

def start_import_job(entity_id, user_email):
    client_token = str(time.time())  # Unique client token
    role_arn = f"arn:aws:iam::{account_id}:role/{role_name}"
    s3_object_key = f"{s3_object_prefix}{user_email}/export.zip"
    try:
        response = workmail.start_mailbox_import_job(
            ClientToken=client_token,
            OrganizationId=organization_id,
            EntityId=entity_id,
            Description='Import job',
            RoleArn=role_arn,
            S3BucketName=s3_bucket_name,
            S3ObjectKey=s3_object_key
        )
        return response['JobId']
    except Exception as e:
        print(f"Failed to start import job for {entity_id}: {e}")
        return None

def check_job_status(job_id):
    while True:
        try:
            response = workmail.describe_mailbox_import_job(
                OrganizationId=organization_id,
                JobId=job_id
            )
            state = response.get('State', 'UNKNOWN')
            print(f"Job State: {state}")
            if state in ['COMPLETED', 'FAILED']:
                break
        except Exception as e:
            print(f"Error checking job status for {job_id}: {e}")
        time.sleep(30)  # Wait for 30 seconds before checking again
    return state

def import_mailboxes_in_batches(user_import_data, batch_size=10):
    user_ids = list(user_import_data.keys())
    for i in range(0, len(user_ids), batch_size):
        batch = user_ids[i:i + batch_size]
        job_ids = []
        for user_id in batch:
            user_email = user_import_data[user_id]
            job_id = start_import_job(user_id, user_email)
            if job_id:
                print(f"Started import job for {user_email} with Job ID: {job_id}")
                job_ids.append((user_email, job_id))
        for user_email, job_id in job_ids:
            state = check_job_status(job_id)
            if state == 'COMPLETED':
                print(f"Import job for {user_email} completed successfully")
            else:
                print(f"Import job for {user_email} failed with state: {state}")

def main():
    import_mailboxes_in_batches(user_import_data)

if __name__ == "__main__":
    main()
```
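For reference, the import script derives each user's archive key as below. Note that it assumes each export archive has been stored (or renamed) as export.zip under the user's folder; the prefix and email here are placeholders:

```python
def import_object_key(prefix: str, user_email: str) -> str:
    # The import script expects each archive at <prefix><email>/export.zip.
    return f"{prefix}{user_email}/export.zip"

key = import_object_key("workmail-backup/", "user@example.com")
print(key)  # → workmail-backup/user@example.com/export.zip
```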


By following these steps, you have set up an automated system for backing up your AWS WorkMail mailboxes that handles up to 10 concurrent export jobs. This solution keeps your email securely stored in an S3 bucket, organized by user email and date, and encrypted with a KMS key. It gives your organization's email data a robust and scalable backup strategy.


GitHub repository: https://github.com/th3n00bc0d3r/AWS-WorkMail-Backup


Feel free to connect with me on LinkedIn, and you can always buy me a coffee.