Migrating business-critical applications to a new environment can be difficult and expensive. The short duration of maintenance windows often dictates the use of costly tools to perform change data capture (CDC) from the source to the target database so that the switchover happens as quickly as possible. Amazon Web Services recently introduced the Database Migration Service (DMS), which supports the migration of databases from on-premises to the cloud with CDC support. This session explains how DMS provides a simple and cost-effective way to migrate business-critical applications to Amazon Web Services. It also covers how DMS enables new workloads for analytics, dev/test, and heterogeneous database migrations.
Migrating Databases to AWS for Business Critical Applications and Analytics
2. Customers Want to Migrate to AWS, but…
They can’t afford long periods of application downtime
Tools that enable minimal downtime are expensive
It seems too complex and expensive to migrate
They still need a copy of the data on-premises
They want to migrate to an open source database
Sending large volumes of data to AWS requires an expensive international network link
They don’t have the skills inside their organization
3. Traditional Approach to Migrate to AWS
1. Create your AWS account
2. Set up your Virtual Private Cloud (VPC) in AWS
3. Connect to AWS with a VPN or Direct Connect
4. Shut down and back up your database
5. Transmit the backup to S3
6. Configure an EC2 instance with the DB software
7. Restore the backup
8. Configure EC2 instances for the application
9. Switch the users to use AWS
4. Traditional Approach to Migrate to AWS
Steps 4-9 could take a week or more!
6. Start your first migration in 10 minutes or less
Keep your apps running during the migration
Replicate within, to, or from Amazon EC2 or RDS
Move data to the same or a different database engine
7. DMS Console
Console (or API) controlled
Set up replication instances and tasks
Use as many or as few as you want, even against the same database instance
Choose the power/speed/cost of your migration
• t2.micro – c4.4xlarge
Choose the tables you want
On-prem → RDS/EC2, EC2 ↔ EC2, RDS ↔ RDS, RDS/EC2 → on-prem
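As a sketch of what "console (or API) controlled" means, the same replication instance can be defined programmatically. The identifier, class, and storage values below are illustrative assumptions; with boto3 and credentials configured, these parameters would be passed to the real DMS call `create_replication_instance`:

```python
# Hypothetical replication-instance configuration (names/sizes are assumptions).
params = {
    "ReplicationInstanceIdentifier": "demo-migration",  # hypothetical name
    "ReplicationInstanceClass": "dms.t2.micro",         # smallest class on this slide
    "AllocatedStorage": 50,                             # GB of GP2 storage
    "MultiAZ": False,
}

def describe(p):
    """Return a one-line summary of the chosen instance configuration."""
    return (f"{p['ReplicationInstanceIdentifier']}: "
            f"{p['ReplicationInstanceClass']}, {p['AllocatedStorage']} GB")

print(describe(params))
# With credentials configured, you would then call:
#   boto3.client("dms").create_replication_instance(**params)
```

The network call is left commented out so the sketch runs without an AWS account.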
8. Keep Your Apps Running During the Migration
[Diagram: application users at the customer premises connect over the Internet/VPN to AWS]
Start a replication instance
Connect to source and target databases
Select tables, schemas, or databases
Let AWS DMS create tables, load data, and keep them in sync
Switch applications over to the target at your convenience
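The "select tables, schemas, or databases" step is expressed as a table-mapping JSON document attached to the migration task. A minimal sketch follows; the `hr` schema name is a hypothetical placeholder, while the rule structure follows the DMS selection-rule format:

```python
import json

# Table-mapping document: include every table in the (hypothetical) "hr" schema.
table_mappings = json.dumps({
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-hr",
        "object-locator": {"schema-name": "hr", "table-name": "%"},
        "rule-action": "include",
    }]
})

parsed = json.loads(table_mappings)
print(parsed["rules"][0]["rule-action"])
```

The `%` wildcard matches all table names; additional rules can exclude or rename objects.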
9. Load is Table by Table
[Diagram: the replication instance copies each table from the source to the target]
10. Change Data Capture (CDC) and Apply
[Diagram: the replication instance captures transactions t1 and t2 on the source and applies the changes to the target after the bulk load]
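Conceptually, changes that arrive while the bulk load is running are captured and applied afterwards, in order. A toy illustration of that sequencing (not DMS internals):

```python
# Toy model: buffer changes captured during the bulk load, apply them after.
target = {}
buffered_changes = []

def bulk_load(rows):
    """Copy the initial snapshot into the target."""
    target.update(rows)

def capture(key, value):
    """Record a change that arrives while the load is in progress."""
    buffered_changes.append((key, value))

def apply_changes():
    """Replay buffered changes in capture order."""
    for key, value in buffered_changes:
        target[key] = value

bulk_load({"t1": "old", "t2": "old"})
capture("t1", "new")   # an update arrives mid-load
apply_changes()        # applied once the bulk load completes
print(target)
```

After the apply phase, the target reflects both the snapshot and the in-flight update.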
15. Sources for AWS Database Migration Service
Customers can use the following databases as a source for data migration using AWS DMS:
On-premises and Amazon EC2 instance databases:
• Oracle Database 10g – 12c
• Microsoft SQL Server 2005 – 2014
• MySQL 5.5 – 5.7
• MariaDB (MySQL-compatible data source)
• PostgreSQL 9.4 – 9.5
Amazon RDS instance databases:
• Oracle Database 11g – 12c
• Microsoft SQL Server 2008 R2 and 2012 (CDC operations are not supported yet)
• MySQL versions 5.5 – 5.7
• MariaDB (MySQL-compatible data source)
• PostgreSQL 9.4 – 9.5 (CDC operations are not supported yet)
• Amazon Aurora (MySQL-compatible data source)
16. Targets for AWS Database Migration Service
Customers can use the following databases as a target for data replication using AWS DMS:
On-premises and Amazon EC2 instance databases:
• Oracle Database 10g – 12c
• Microsoft SQL Server 2005 – 2014
• MySQL 5.5 – 5.7
• MariaDB (MySQL-compatible data source)
• PostgreSQL 9.3 – 9.5
Amazon RDS instance databases:
• Oracle Database 11g – 12c
• Microsoft SQL Server 2008 R2 and 2012
• MySQL 5.5 – 5.7
• MariaDB (MySQL-compatible data source)
• PostgreSQL 9.3 – 9.5
• Amazon Aurora (MySQL-compatible data source)
Amazon Redshift
17. AWS Database Migration Service Pricing
T2 instances for development and periodic data migration tasks
C4 instances for large databases and minimizing migration time
T2 pricing starts at $0.018 per hour for t2.micro
C4 pricing starts at $0.154 per hour for c4.large
50 GB of GP2 storage included with T2 instances
100 GB of GP2 storage included with C4 instances
Data transfer inbound and within an AZ is free
Data transfer across AZs starts at $0.01 per GB
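At these rates, a rough cost for a small ongoing task is easy to estimate. As an example, one week of continuous replication on t2.micro, compute only; the same-AZ (free transfer) placement is an assumption:

```python
# Back-of-envelope weekly cost on t2.micro, using the rates on this slide.
hours = 7 * 24             # one week of continuous replication
t2_micro_rate = 0.018      # USD per hour (from the slide)
cross_az_gb = 0            # assume source and target in the same AZ
total = round(hours * t2_micro_rate + cross_az_gb * 0.01, 2)
print(f"${total}")  # → $3.02
```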
18. Migrate 5 TB in 33 Hours!
"In our testing, under mostly ideal conditions, we were able to migrate 5 TB of relatively evenly distributed data from a database on Amazon EC2 to a database on Amazon RDS in about 33 hours. The data included 4 large (250 GB) tables, a huge (1 TB) table, 1,000 small to moderately sized tables, 3 tables which contained LOBs varying between 25 GB and 75 GB, and 10,000 very small tables."
— DMS Documentation
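That benchmark implies a useful planning number. A quick sketch, where the binary TB-to-GB conversion (1 TB = 1024 GB) is our assumption, not stated in the quote:

```python
# Throughput implied by the quoted benchmark: 5 TB migrated in 33 hours.
tb = 5
hours = 33
gb_per_hour = round(tb * 1024 / hours, 1)                 # ≈ 155.2 GB/hour
mb_per_sec = round(tb * 1024 * 1024 / (hours * 3600), 1)  # ≈ 44.1 MB/s
print(gb_per_hour, mb_per_sec)
```

Real throughput depends heavily on instance class, table shapes, and LOB handling; the quote itself notes "mostly ideal conditions".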
21. New Use Cases with DMS
Migration of business critical applications
Migration from EC2-Classic to VPC
Cheap read replicas for Oracle
Read replicas on other engines
Cross region read replicas for Oracle and SQL Server
Analytics in the Cloud
Dev/Test and Production environment sync.
24. Data Ingestion with AWS
AWS Import/Export Disk – ship your hard disks to AWS
AWS Import/Export Snowball – a secure storage appliance with up to 80 TB that AWS ships to you
Amazon S3 Transfer Acceleration – use the AWS edge locations nearest to you to transfer data over Amazon's optimized network up to 300% faster. You only need to pay for a local network connection!
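The appliance options exist because wide-area links are slow relative to disk capacity. A back-of-envelope comparison for Snowball's 80 TB against a 1 Gbps link; both the decimal units (1 TB = 10**12 bytes) and the fully saturated sustained rate are assumptions:

```python
# How long would Snowball's 80 TB take over a saturated 1 Gbps link?
capacity_bytes = 80 * 10**12   # 80 TB, decimal units
link_bps = 1 * 10**9           # 1 Gbps
transfer_days = round(capacity_bytes * 8 / link_bps / 86400, 1)
print(transfer_days)  # → 7.4 days, ignoring protocol overhead and contention
```

In practice shared links rarely sustain full rate, so shipping the appliance often wins by an even wider margin.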
25. AWS Schema Conversion Tool
The AWS Schema Conversion Tool helps automate many database schema and code conversion tasks when migrating from Oracle and SQL Server to open source database engines.
Features
Oracle and SQL Server schema conversion to MySQL/Aurora/MariaDB and PostgreSQL
Database Migration Assessment report for choosing the best target engine
Code browser that highlights places where manual edits are required
Secure connections to your databases with SSL
26. SCT helps with converting tables, views, & code
Sequences
User Defined Types
Synonyms
Packages
Stored Procedures
Functions
Triggers
Schemas
Tables
Indexes
Views
27. SCT can tell you how hard the migration will be
1. Connect SCT to the source and target databases.
2. Run the Assessment Report.
3. Read the Executive Summary.
4. Follow the detailed instructions.
28. Pricing and Terms and Conditions
$0 for a software license
Allowed Use
Use SCT to migrate database schemas to Amazon RDS, Amazon Redshift, or Amazon EC2-based databases
To use SCT to migrate schemas to other destinations, contact AWS for special pricing
Pricing
Free software license for active AWS customers with accounts in good standing
30. Thomas Publishing
Thomas Publishing has been in business for over a century, connecting buyers and suppliers across all industrial sectors and evolving from an industrial trade print publisher into industry's most respected group of digital-friendly businesses.
The company previously ran homegrown applications on a single, monolithic Oracle database. With a growing user base, performance declined as licensing costs increased.
Working with Apps Associates, Thomas Publishing championed a proof-of-concept project to migrate to the cloud. They leveraged the Schema Conversion Tool to convert their Oracle schema to Amazon Aurora and used DMS to migrate the data.
The proof of concept was successful, and they are moving the remainder of their applications and data to AWS.
31. Pegasystems
Pegasystems, whose customers include many of the world's most sophisticated and successful enterprises, develops strategic applications for sales, marketing, service, and operations.
Pegasystems used DMS to migrate customers from their legacy cloud environment (Oracle) to their new cloud 2.1 environment using RDS PostgreSQL.
They experienced better availability and performance. The new 2.1 environment is built using AWS best practices and services such as ELB, Auto Scaling, and of course RDS Multi-AZ to remove single points of failure in the architecture.
They also experienced cost savings by moving from Oracle to RDS PostgreSQL.
34. Customers Want to Migrate to AWS, but…
They can’t afford any application downtime
Tools that enable minimal downtime are expensive
It seems too complex to migrate successfully
They still need a copy of the data on-premises
They want to migrate to an open source database
Sending large volumes of data to AWS requires an expensive international network link
They don’t have the skills inside their organization
42. Now You Can Migrate Your Business-Critical Applications to AWS!