Export table from Aurora PostgreSQL to Amazon S3

In this blog post I discuss how to export a 100 GB non-partitioned table from Aurora PostgreSQL to Amazon S3. I walk you through two approaches you can use to export the data: first using aws_s3, a PostgreSQL extension that Aurora PostgreSQL provides, and then using the AWS Glue service. The post also … Continue reading Export table from Aurora PostgreSQL to Amazon S3
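As a quick taste of the aws_s3 route, here is a minimal sketch (not the full walkthrough from the post) that calls aws_s3.query_export_to_s3 from Python. The host, database, bucket, table, and region names are placeholders, and it assumes the aws_s3 extension is installed and the cluster has an IAM role that allows writing to the bucket.

# Sketch: export a table with the aws_s3 extension (all names are placeholders).
# Assumes CREATE EXTENSION aws_s3 CASCADE; has been run and the cluster has an
# IAM role attached that allows s3:PutObject on the target bucket.
import psycopg2

conn = psycopg2.connect(host="my-aurora-cluster.cluster-xyz.us-east-1.rds.amazonaws.com",
                        dbname="mydb", user="myuser", password="mypassword")

export_sql = """
    SELECT * FROM aws_s3.query_export_to_s3(
        'SELECT * FROM big_table',                                    -- query to export
        aws_commons.create_s3_uri('my-bucket', 'exports/big_table', 'us-east-1'),
        options := 'format csv'                                       -- COPY-style options
    );
"""

with conn, conn.cursor() as cur:
    cur.execute(export_sql)
    rows_uploaded, files_uploaded, bytes_uploaded = cur.fetchone()
    print(f"rows={rows_uploaded} files={files_uploaded} bytes={bytes_uploaded}")

conn.close()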


Aurora MySQL – Export data to S3

Using SELECT INTO OUTFILE S3 you can query data from an Aurora MySQL DB cluster and save it directly into text files stored in an S3 bucket. 1. Create an IAM policy for S3. { "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": [ "s3:DeleteObject", "s3:GetBucketLocation", "s3:GetObject", "s3:ListBucket", "s3:ListBucketMultipartUploads", "s3:PutObject" ], "Resource": [ "arn:aws:s3:::bucket-name", … Continue reading Aurora MySQL – Export data to S3
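Separate from the IAM policy steps in the post, here is a rough sketch of issuing SELECT INTO OUTFILE S3 from Python. The host, bucket, and table names are placeholders, and it assumes the cluster already has an IAM role for S3 writes associated (for example via the aws_default_s3_role cluster parameter).

# Sketch: run SELECT INTO OUTFILE S3 from Python (names are placeholders).
# Assumes an IAM role with S3 write access is associated with the cluster and
# the database user has the privilege to write to S3.
import pymysql

conn = pymysql.connect(host="my-aurora-mysql.cluster-xyz.us-east-1.rds.amazonaws.com",
                       user="admin", password="mypassword", database="mydb")

export_sql = """
    SELECT *
    FROM orders
    INTO OUTFILE S3 's3-us-east-1://my-bucket/exports/orders'
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\\n'
    OVERWRITE ON
"""

with conn.cursor() as cur:
    cur.execute(export_sql)   # Aurora writes the result set directly to S3
conn.close()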

Usecase with RDS Snapshot Export to S3

AWS recently announced the "Amazon RDS Snapshot Export to S3" feature, which lets you export Amazon Relational Database Service (Amazon RDS) or Amazon Aurora snapshots to Amazon S3 as Apache Parquet, an efficient open columnar storage format for analytics. I had a use case to refresh Athena tables daily with the full data set in Account B (us-east-1) … Continue reading Usecase with RDS Snapshot Export to S3
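For illustration, a snapshot export can be started with the boto3 start_export_task API; every identifier, ARN, and bucket name below is a placeholder rather than the actual setup from the post.

# Sketch: kick off an RDS snapshot export to S3 with boto3 (placeholder values).
import boto3

rds = boto3.client("rds", region_name="us-east-1")

response = rds.start_export_task(
    ExportTaskIdentifier="daily-export-2020-01-01",
    SourceArn="arn:aws:rds:us-east-1:123456789012:snapshot:rds:mydb-2020-01-01",
    S3BucketName="my-export-bucket",
    S3Prefix="exports/mydb",
    IamRoleArn="arn:aws:iam::123456789012:role/rds-s3-export-role",
    KmsKeyId="arn:aws:kms:us-east-1:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab",
)
print(response["Status"])   # export runs asynchronously; Parquet files land under the S3 prefix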

PostgreSQL – CPU Utilization and Index

One of the production Aurora PostgreSQL instances, running on a db.r4.16xlarge instance (64 vCPU and 488 GB), was reporting high CPU utilization spiking up to 100%. With such issues, one of the first things to do is look for the SQLs with high shared buffer hits. I have built a small tool called pgsnap which is something … Continue reading PostgreSQL – CPU Utilization and Index
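pgsnap aside, a quick, generic way to spot the SQLs with high shared buffer hits is to query pg_stat_statements directly; here is a rough sketch, where the connection details are placeholders, the extension must be installed and preloaded, and the total_time column applies to PostgreSQL 12 and earlier (use total_exec_time on 13+).

# Sketch: top queries by shared buffer hits from pg_stat_statements (placeholder connection).
import psycopg2

conn = psycopg2.connect(host="my-aurora-cluster.cluster-xyz.us-east-1.rds.amazonaws.com",
                        dbname="mydb", user="myuser", password="mypassword")

top_by_buffers = """
    SELECT queryid,
           calls,
           round(total_time::numeric, 2) AS total_ms,
           shared_blks_hit,
           shared_blks_read,
           left(query, 80) AS query_snippet
    FROM pg_stat_statements
    ORDER BY shared_blks_hit DESC
    LIMIT 10;
"""

with conn, conn.cursor() as cur:
    cur.execute(top_by_buffers)
    for row in cur.fetchall():
        print(row)

conn.close()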