Affects Version/s: None
Fix Version/s: None
Component/s: CUBRID Migration Toolkit
Recently, many customers have asked to use CMT; however, its performance and reliability are not good enough.
In enterprise environments in particular, a single database is often around 200 GB, so CMT should be positioned as a tool for large-scale data migration. To verify this, please prepare simulated test data with various data types: 2,000+ tables, more than 1,000,000,000 records, and complex indexes. The total database size should be around 200 GB.
The first goal is "easier to map data types and to edit the conditions that select which data is transferred."
- The default column mapping table should have smart logic. For example, columns holding only "Y"/"N" values should not be converted to char(3); they should map to char(1) or varchar(3) instead. (A heuristic sketch follows this list.)
- Mapping information should be exportable to and importable from Excel so that users can review it with DBAs or senior developers.
- The final report should compare source and target so that users can see what has been done and what remains to be done.
The second goal is "as fast as a simple, temporarily coded migration program."
- Support SQL format in addition to XML, because XML parsing takes too long. Any DB can export its data to *.sql format.
- Estimate how long the migration will take and display it; do not leave users waiting with no indication of progress.
- The multi-threaded option should work reliably to reduce execution time (see the sketch after this list).
- Pick the fastest way to insert data, whether offline (loaddb) or online (multi-threaded).
The most common user patterns are:
When a professional DBA is in the company:
dump data from source DB --> create DB and schema --> migrate data --> build indexes --> done
When a developer operates alone:
dump data from source DB --> run CMT to create the DB, migrate data, and build indexes --> done
dump data from source DB --> create DB, schema, and indexes --> run CMT to migrate data --> done
(Users sometimes create indexes before migration, which is bad for performance; see the sketch below.)