Oracle Database Migration Assistant

Oracle Database Migration Assistant for Unicode: Frequently Asked Questions

Updated: June 2024

  • 1. Can I use the DMU if my database version is not listed in the supported configurations?

    To use the DMU with a database older than 11.2.0.3, a database patch must be installed to add the server-side migration functionality that the DMU requires. The patch is available only for the supported configurations. If your database version is not in the list, consider upgrading to one of the supported releases. Note that all database releases since 11.2.0.3 are supported on all platforms except IBM z/OS and Fujitsu BS2000.

  • 2. Which JDK version is required to run the DMU client?

    DMU 23.1 requires JDK 11. DMU 2.1 requires JDK 6 or 7.

  • 3. Can I run the DMU client remotely to migrate the database?

    While the DMU can be run remotely from a different machine, it is highly recommended to run it locally on the same host as the database server to be migrated, for better performance and reduced network overhead. It is also recommended to run the database in dedicated server mode during the migration.

  • 4. What are the hardware requirements for a database server to migrate a database with the DMU?

    Migration to Unicode is a resource-intensive operation, especially for large databases. It is recommended to perform the migration on sufficiently configured hardware to maximize the migration throughput. The exact hardware requirements vary depending on the size of the database and the target downtime window, but it is advisable to have a minimum of 8 CPU cores and 16 GB of memory for migrating databases of 100 GB or larger.

  • 5. What are the hardware requirements for running the DMU client?

    Oracle recommends that you run the DMU client on the same host as the database server to be migrated to minimize network overhead. If you must run the DMU client on a different host, the minimum hardware requirements are a 2 GHz CPU and 4 GB of RAM. Avoid running other jobs in the same environment if they consume significant system resources.

  • 6. I set the JDK location incorrectly when starting the DMU. How can I change the JDK location?

    The DMU stores the java executable path in ~/.dmu_jdk on Unix-based platforms. If you remove this file, the tool asks for the full path again. If the java executable is found in PATH, or if JAVA_HOME is defined, the DMU does not prompt the user. On Microsoft Windows, you can edit the JDK location in the file dmu\dmu\bin\dmu32.conf (DMU with 32-bit JDK) or dmu\dmu\bin\dmu64.conf (DMU with 64-bit JDK) under the DMU installation folder. Look for the keyword SetJavaHome.
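
    For example, the directive in dmu\dmu\bin\dmu64.conf (or dmu/dmu/bin/dmu.conf on Unix-like systems) might look like the following; the JDK path shown is illustrative, so substitute the location of your own installation:

       SetJavaHome C:\Java\jdk-11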

  • 7. What is the recommended strategy for cleansing invalid data?

    Invalid data is often the result of storing data in an encoding different from the database character set, using the pass-through configuration that bypasses client/server character set conversion. To migrate such data correctly, you must identify the actual encoding used for this data. The DMU character set tagging feature allows you to analyze columns containing invalid data by re-rendering them in different character sets. After the actual character set is confirmed and tagged to the column, it is used in all subsequent data scanning and conversion operations. If you believe all data in the database is stored in a different character set (for example, WE8MSWIN1252 data in a WE8ISO8859P1 database), you can set it in the "Assumed Database Character Set" field on the database properties tab.

    It is also possible that the invalid data is caused by application bugs or by binary data stored in character columns. See Chapter 6 of the User's Guide for more cleansing scenarios.
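
    To see the declared database character set that the pass-through configuration bypasses, you can run the following query in SQL*Plus or SQL Developer:

       SELECT value
         FROM nls_database_parameters
        WHERE parameter = 'NLS_CHARACTERSET';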

  • 8. What is the recommended strategy for cleansing data expansion issues?

    When migrating non-ASCII data to Unicode, the resulting data may expand in size because of its multi-byte representation in Unicode. Data expansion can manifest either as "exceed column limit" issues or as "exceed data type limit" issues. For "exceed column limit" issues, the options are:

    • Lengthen the column
    • Change the length semantics of the column from bytes to characters
    • Shorten the stored values manually
    • Allow DMU to truncate the values during conversion
    • Edit the value to replace characters that expand in conversion
    • Migrate to a larger data type

    For "exceed data type limit" issues, the options are:

    • Migrate to a larger data type
    • In Oracle Database 12c, enable extended data types, which raise the VARCHAR2 limit to 32767 bytes
    • Shorten the stored values manually
    • Allow DMU to truncate the values during conversion
    • Edit the value to replace characters that expand in conversion

    Cleansing actions that change column definitions may not be suitable for production environments, since they typically require corresponding updates in the application code. For such cases, the DMU provides scheduled cleansing actions, so that the changes can be saved in the DMU repository and executed later, during the conversion phase, as part of the downtime window. To define a scheduled cleansing action, select “Schedule Column Modification…” from the cleansing editor context menu on the target column.
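
    For illustration, the first two "exceed column limit" options correspond to simple DDL statements like the following; the table and column names are hypothetical, and a scheduled cleansing action applies an equivalent change for you during the conversion phase:

       -- Lengthen a hypothetical column C from 20 to 60 bytes
       ALTER TABLE t MODIFY (c VARCHAR2(60));

       -- Or switch the column from byte to character length semantics
       ALTER TABLE t MODIFY (c VARCHAR2(20 CHAR));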

  • 9. What are the DMU conversion criteria for the data dictionary?

    In general, this release of the DMU does not support converting convertible data in data dictionary tables, except for the following, which the DMU does convert:

    • CLOB columns – this is necessary only in a single-byte database
    • Binary XML token manager tables, with names like XDB.X$QN% and XDB.X$NM%
    • PL/SQL source code: text of CREATE PROCEDURE, CREATE FUNCTION, CREATE PACKAGE, CREATE PACKAGE BODY, CREATE TYPE BODY, CREATE TRIGGER, and CREATE LIBRARY; type specifications (CREATE TYPE) are not converted
    • View definitions: text of CREATE VIEW
    • The columns:
      • SYS.SCHEDULER$_JOB.NLS_ENV – NLS environment for Database Scheduler jobs (DBMS_SCHEDULER)
      • SYS.SCHEDULER$_PROGRAM.NLS_ENV – NLS environment for Database Scheduler job programs (DBMS_SCHEDULER)
      • SYS.JOB$.NLS_ENV – NLS environment for legacy jobs (DBMS_JOB)
      • CTXSYS.DR$INDEX_VALUE.IXV_VALUE – attribute values of Oracle Text policies
      • over 50 different columns in SYS, SYSTEM, and CTXSYS schemas that contain user comments for various database objects

    The PL/SQL source code and the view source text are kept in multiple tables. The DMU checks the following columns when processing the source code and view definitions:

    • SYS.VIEW$.TEXT – view definition text
    • SYS.SOURCE$.SOURCE – PL/SQL and Java source code
    • SYS.ARGUMENT$.PROCEDURE$ – PL/SQL argument definitions: procedure name
    • SYS.ARGUMENT$.ARGUMENT – PL/SQL argument definitions: argument name
    • SYS.ARGUMENT$.DEFAULT$ – PL/SQL argument definitions: default value
    • SYS.PROCEDUREINFO$.PROCEDURENAME – names of procedures and functions declared in packages
    • SYS.IDL_CHAR$.PIECE – internal representation of PL/SQL
    • SYS.PLSCOPE_IDENTIFIER$.SYMREP – internal representation of PL/SQL; this table is new in Oracle Database 11g

    The DMU does not report convertible character data in the tables and columns listed above as a convertibility issue. Any convertible data in the remaining tables and columns of the data dictionary is flagged as a convertibility issue in scan reports and on the Migration Status tab. The database conversion step cannot be started before the flagged data is removed. Cleansing operations are not allowed on data dictionary tables.
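
    As a rough way to locate non-ASCII text in PL/SQL source outside of the DMU, you can compare each source line with its ASCIISTR form. This is a heuristic sketch only: ASCIISTR also escapes backslashes, so lines containing a backslash appear as false positives.

       SELECT owner, name, type, line
         FROM dba_source
        WHERE text <> ASCIISTR(text);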

  • 10. How to deal with convertible data in AWR tables (WRI$_%, WRH$_%, WRR$_%)?

    The SYS schema contains a number of tables with names beginning with WRI$_, WRH$_, and WRR$_, which comprise the Automatic Workload Repository (AWR). In addition to historical object statistics, this repository stores snapshots of vital system statistics, such as those visible in various fixed views, for example, V$SYSSTAT and V$SQLAREA.

    If non-ASCII characters are used in object names or in SQL statements, for example in character literals or comments, they may be captured into the AWR tables. The DMU scan reports such characters as convertible data dictionary content, which prevents conversion of the database. To remove this data completely, recreate the Automatic Workload Repository by logging into SQL*Plus with SYSDBA privileges and running:

       SQL> @?/rdbms/admin/catnoawr.sql
       SQL> @?/rdbms/admin/catawr.sql
       SQL> execute dbms_swrf_internal.register_local_dbid;

    As the catawr.sql script is not present in Oracle Database versions 10.2.0.4 and earlier, Oracle recommends that you install the Oracle Database patch set 10.2.0.5 before purging AWR contents.

  • 11. Why do I get warnings for modifying columns under the Oracle E-Business Suite schemas?

    Modifying the structures of Oracle E-Business Suite schemas is not supported, as it may cause the Oracle applications to malfunction. You should modify such columns only if the affected table is a custom table created by you or if you have been advised to do so by Oracle Support.

  • 12. I just removed the characters causing invalid binary representation in my CLOB data cells in the cleansing editor. Why are they still highlighted as exceptions?

    This is expected behavior, because rescanning large data types on the client side can be very expensive. The cleansing editor filtering and highlighting for CLOB, LONG, and XMLType columns are based on the most recent scan results for the table, which do not change until you rescan the table or column.

  • 13. How does the DMU report data cells that exhibit both invalid binary representation and size expansion issues?

    The DMU classifies each data cell under only one of the scan result categories. Values that have invalid binary representation issues are classified only as such, even if their post-conversion lengths also exceed the column or data type limit. The user is generally expected to resolve the invalid data issues and rescan the data before attempting conversion. Because the DMU also allows you to ignore the invalid data issues and force the conversion of a column, be aware that forcefully converted values with invalid binary representation may additionally be truncated. You can compare the value of the Maximum Post-conversion Length property of the column with the column and data type length limits to see whether truncation will take place.

  • 14. Why do I sometimes get performance warnings when applying filters in the cleansing editor?

    The DMU can identify rows with a given type of convertibility issue either by using the rowids collected in the migration repository or by analyzing the column values dynamically on the client as they are fetched from the database. With the latter approach, when only a small percentage of rows in the table meet the filter criteria, the DMU may have to fetch and analyze many rows before it finds enough to fill the cleansing editor. From a performance standpoint, it is recommended to scan the table to pre-collect the rowid information and to enable the “Use Scan Log to Filter Data” button on the cleansing toolbar before applying the filters.

  • 15. The migration status tab says "The current setting rules out CTAS conversion method for tables with Row Movement disabled". What does it mean?

    For tables with a relatively large percentage of convertible data, the DMU can assign the "Copy data using CREATE TABLE AS SELECT" conversion method, which provides a significant performance advantage. The CTAS conversion method is not enabled by default because it may not preserve the rowids of the rows in the table. If your applications do not store rowids, you can set the "Consider CTAS with Row Movement Disabled" parameter to "Yes" on the database properties tab so that the DMU will assign the CTAS conversion method for optimal conversion performance.
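
    To check which tables currently have row movement disabled, or to enable it for a table whose rowids your applications do not depend on, you can use standard SQL; the table name in the ALTER statement is a placeholder:

       SELECT owner, table_name
         FROM dba_tables
        WHERE row_movement = 'DISABLED';

       ALTER TABLE scott.emp ENABLE ROW MOVEMENT;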

  • 16. How do I monitor the table-level conversion progress in the conversion phase?

    You can monitor the table-level conversion progress through the "View Table Conversion Progress" link on the Conversion Progress tab. It displays the completion percentage for each table, based on the status from the V$SESSION_LONGOPS view, along with the execution status of individual SQL statements.
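
    If you prefer to watch the underlying statistics directly in the database, a generic query against V$SESSION_LONGOPS such as the following shows the progress of long-running operations (this is a standard monitoring query, not specific to the DMU):

       SELECT sid, target, sofar, totalwork,
              ROUND(100 * sofar / totalwork, 1) AS pct_done
         FROM v$session_longops
        WHERE totalwork > 0
          AND sofar <> totalwork;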

  • 17. I got ORA-12721 when altering the database character set to Unicode in the conversion phase. How can I diagnose the offending sessions?

    The SQL statement ALTER DATABASE CHARACTER SET, which the DMU uses in the conversion phase, succeeds only if the session executing the statement is the only user session logged into the database. Therefore, before starting the conversion, the DMU warns you about any user sessions logged into the database that are not opened by the DMU itself. You may use the following SQL statement in SQL*Plus or SQL Developer to find out the details of the offending sessions:

       SELECT sid, serial#, username, status,
              osuser, machine, process, program
         FROM v$session
        WHERE username IS NOT NULL
          AND program <> 'Database Migration Assistant for Unicode';

    At this point, you can still disconnect the offending sessions and resume the conversion.
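
    For example, to disconnect an offending session identified by the query above, you can run the following as SYSDBA; substitute the actual SID and SERIAL# values for the placeholders:

       ALTER SYSTEM DISCONNECT SESSION '130,41' IMMEDIATE;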

  • 18. A DMU scan fails on some tables with "ORA-29913: error in executing ODCIEXTTABLEOPEN callout". How to resolve this?

    The error indicates that the DMU was unable to read an external table. This can happen if the data files of the external table are not available in the expected location or are in the wrong format. Because external tables are often used to load data from external sources only at certain points in time, it is not uncommon for the data files to be unavailable. You can either make sure the data files exist and retry the scan, or ignore these errors, as they will not block any subsequent migration operations.
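
    To find out which data files an external table expects and in which directory, you can query the standard data dictionary views:

       SELECT t.owner, t.table_name,
              l.directory_name, l.location
         FROM dba_external_tables t
         JOIN dba_external_locations l
           ON l.owner = t.owner
          AND l.table_name = t.table_name;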

  • 19. Why does DMU report Unicode replacement characters as invalid in the validation mode?

    The value of the property "Report U+FFFD as an invalid character" on the Scanning sub-tab of the Database Properties tab determines how the DMU interprets the Unicode default replacement character U+FFFD (the byte sequence 0xEF 0xBF 0xBD in AL32UTF8 and UTF8). If the property value is "Yes", the character is treated as invalid data. This is the default behavior, because the presence of this character in data usually indicates that the data is the result of character set conversion of some input that was not properly tagged with its real character set. If you use the character U+FFFD for some internal processing purposes and you do not want the DMU to report it as invalid, change the property value to "No".
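
    To locate occurrences of U+FFFD in a particular column yourself, you can search for the character with UNISTR, as in the following sketch. The table and column names are hypothetical, and this assumes a database character set, such as AL32UTF8 in validation mode, that can represent the character:

       SELECT ROWID, c
         FROM t
        WHERE INSTR(c, UNISTR('\FFFD')) > 0;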

  • 20. I have fixed some data issues. How come the DMU status icons still show the affected objects as not ready for conversion?

    The DMU determines the data readiness status based on the results of the most recent scan. Please be sure to rescan the affected database objects after applying cleansing actions to see the effect of the changes and verify that the data issues have been resolved successfully.

    If you cleansed a table outside of the DMU, for example by lengthening a column in SQL Developer or SQL*Plus, and the changes are not reflected in the DMU, then refresh the DMU repository by clicking "Refresh DMU Repository..." in the Migration menu.

  • 21. Does the DMU have a rollback feature?

    The DMU does not offer a conversion rollback feature per se, but it comes with built-in conversion error handling, so that if the conversion process is interrupted by an error condition, it is possible to resolve the issue and resume the conversion. If you really need to roll back your database to the state before the conversion, you can restore from a backup or use the Flashback Database feature.

    Note: The Flashback Database feature has not been tested to work across the ALTER DATABASE CHARACTER SET (ADBCS) statement. While the design of the feature should not conflict with ADBCS, it is recommended that you restore from a backup if ADBCS has already been performed during the conversion process, that is, if the query SELECT value FROM nls_database_parameters WHERE parameter='NLS_CHARACTERSET' shows the target character set. If you have no backup available and are forced to try Flashback Database after ADBCS, make sure you restart the instance immediately after the flashback command. See the Oracle Database Backup and Recovery User's Guide for more information on FLASHBACK DATABASE and its requirements prior to starting the conversion process.
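
    If you plan to rely on Flashback Database, a common precaution is to create a guaranteed restore point before starting the conversion; the restore point name below is illustrative:

       CREATE RESTORE POINT before_dmu_conversion
         GUARANTEE FLASHBACK DATABASE;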

  • 22. Some DMU operations appear to hang or take unusually long on my database. What could cause this?

    If you notice that a DMU operation, especially one that is normally fast, such as refreshing the repository or calculating the split threshold before scanning, takes unusually long or appears to hang, verify that the database initialization parameter optimizer_features_enable is not set to an old database version. For example, the value 9.2.0 of this parameter is known to cause certain internal queries in the DMU to use suboptimal execution plans.
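
    You can check the current value in SQL*Plus and, if necessary, reset the parameter to its default for your release. The reset takes effect after the instance is restarted; the syntax shown is for recent releases, and older releases may require an explicit SID clause:

       SQL> show parameter optimizer_features_enable
       SQL> alter system reset optimizer_features_enable scope = spfile;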

  • 23. When starting, the DMU consistently picks up an older version of the JDK. How to specify the location of the correct JDK?

    When starting on a Unix-like operating system, the DMU looks for a JDK in the following locations in turn:

    • the location specified in the SetJavaHome directive in the file dmu/dmu/bin/dmu.conf
    • $APP_JAVA_HOME
    • $OIDE_JAVA_HOME
    • dmu/jdk
    • dmu/../jdk
    • $JAVA_HOME
    • /usr/java/jdk1.6.* (with highest version)
    • the location specified previously by the user and stored in the file $HOME/.dmu_jdk
    • the location requested from the user.

    When starting on a Microsoft Windows operating system, the DMU looks for a JDK in the following locations in turn:

    • the location specified in the SetJavaHome directive in the file dmu\dmu\bin\dmu32.conf if started through dmu32.exe or dmuW32.exe or in the file dmu\dmu\bin\dmu64.conf if started through dmu64.exe or dmuW64.exe
    • dmu\..\jdk
    • the location requested from the user; the location is stored under SetJavaHome in dmu32.conf or dmu64.conf correspondingly.

    In each of the locations, the DMU verifies the presence of the files bin/java and jre/bin/java (java.exe on Windows).

    If any of the locations above contains an incorrect version of the JDK, this version may be picked up by the DMU, resulting in an error. To solve the issue, modify the appropriate configuration file (dmu/dmu/bin/dmu.conf on Unix-like systems; dmu32.conf or dmu64.conf on Windows) and set the path of the correct JDK installation in the SetJavaHome directive.

  • 24. Is the downtime needed to convert a database with the DMU directly proportional to the size of the database?

    It is generally not possible to extrapolate the migration downtime window from the database size alone. Other important factors include the readiness of the data for conversion, the percentage of the data that needs to be converted (non-character data types and 7-bit ASCII data in non-CLOB columns need no conversion), and the size of CLOB columns (CLOBs are typically more expensive to convert than CHAR/VARCHAR2 columns).

    To get a more precise estimate of the conversion time, the best approach is to take a clone of the database to be migrated and run the clone through the end-to-end migration process in a controlled test environment. This gives a good idea of what types of data convertibility issues need to be dealt with prior to the data conversion and what conversion time window to expect.

  • 25. The DMU reports invalid representation data in the table SYS.BOOTSTRAP$ in an Oracle Database 12.1.0.2 PDB. How to handle this data?

    This problem is reported in bug #19533216. In Oracle Database 12.1.0.2, a PDB may contain one or more rows with binary data in the data dictionary column SYS.BOOTSTRAP$.SQL_TEXT. The DMU 2.0 reports this data as having invalid representation, which prevents character set conversion of the PDB. The presence of this data is not considered a bug from the Oracle Database perspective.

    To solve the problem, upgrade the DMU to the most current version. DMU 2.1 or later contains a fix for this issue.