I have some logs in CloudWatch, and every day new logs come in. I want to keep today's and yesterday's logs in CloudWatch itself, but logs older than 2 days have to be moved to S3.
I have tried using the code below to export CloudWatch Logs to S3:
import boto3
import collections

region = 'us-east-1'

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    response = s3.create_export_task(
        taskName='export_task',
        logGroupName='/aws/lambda/test2',
        logStreamNamePrefix='2016/11/29/',
        fromTime=1437584472382,
        to=1437584472402,
        destination='prudhvi1234',
        destinationPrefix='AWS'
    )
    print response
When I run this, I get the following error:
'S3' object has no attribute 'create_export_task': AttributeError
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 10, in lambda_handler
    response = s3.create_export_task(
AttributeError: 'S3' object has no attribute 'create_export_task'
What might the mistake be?
client = boto3.client('logs')
You are exporting CloudWatch Logs, not S3 objects, so create_export_task has to be called on the CloudWatch Logs client rather than the S3 client; that is why you get the AttributeError. Then:
response = client.create_export_task(
    taskName='export_task',
    logGroupName='/aws/lambda/test2',
    logStreamNamePrefix='2016/11/29/',
    fromTime=1437584472382,
    to=1437584472402,
    destination='prudhvi1234',
    destinationPrefix='AWS'
)
Check out the boto3 documentation for create_export_task for more information.
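If you want to handle the "logs older than 2 days" part rather than hardcoding fromTime and to, here is a minimal sketch. It assumes the Lambda runs once a day, reuses the log group and bucket names from your question, and that the bucket policy already allows CloudWatch Logs to write to the bucket; create_export_task expects epoch milliseconds.

import boto3
from datetime import datetime, timedelta, timezone

logs = boto3.client('logs')

def lambda_handler(event, context):
    # Midnight UTC today, used as the reference point for the export window.
    today = datetime.now(timezone.utc).replace(hour=0, minute=0, second=0, microsecond=0)
    start = today - timedelta(days=2)   # beginning of the day two days ago
    end = today - timedelta(days=1)     # beginning of yesterday, so the last two days stay in CloudWatch

    response = logs.create_export_task(
        taskName='export_task',
        logGroupName='/aws/lambda/test2',   # log group from the question
        fromTime=int(start.timestamp() * 1000),
        to=int(end.timestamp() * 1000),
        destination='prudhvi1234',          # S3 bucket from the question
        destinationPrefix='AWS'
    )
    return response

Two things to keep in mind: only one export task can be active per account at a time, so if you export several log groups you have to wait for each task to finish before starting the next; and the export only copies the events to S3. If you also want CloudWatch to drop them afterwards, you can set a retention policy on the log group, for example logs.put_retention_policy(logGroupName='/aws/lambda/test2', retentionInDays=3).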