Teradata Plug-in for Eclipse 16.00.02.00
Teradata Plug-in for Eclipse Documentation 16.00.02.00
About this download

The Teradata Plug-in for Eclipse is based on the Eclipse plug-in framework and Data Tools Platform (DTP). Teradata Plug-in for Eclipse enables Eclipse DTP to operate with the Teradata Database and value-added features.

► NOTE: This version requires Java Runtime Environment 1.8.

NOTE: The Neon JEE Eclipse bundle is missing the EMF Client Platform (ECP) 1.9.0 packages. Before installing the Teradata Plug-in, you must:
     -  Install the EMF Client Platform 1.9.0 by going to the Eclipse Help menu, selecting Install New Software... and entering the EMF ECP 1.9.x Update Site: http://download.eclipse.org/ecp/releases/releases_19/.

Note: If you have a previous version of EMF ECP installed, you should remove it from Eclipse before installing the new version.

To install the new version:

     -  Press the Add... button to add the location of the EMF ECP 1.9.0 Update Site.

     -  Choose this site and install the "ECP SDK 3.x" in the "All SDKs" category. Press Next to install EMF ECP.

     -  Before letting Eclipse restart to finish the installation, remove the conflicting
        plug-in: locate your Eclipse directory and delete the file

            plugins\org.eclipse.emf.ecp.view.swt.layout_<versionNumber>.jar

     -  Restart Eclipse (File > Restart).

     -  Install the Teradata Plug-in for Eclipse.
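On Linux or macOS, the jar-removal step above can be sketched from a shell. This is a sketch only; ECLIPSE_HOME below is a placeholder for your actual Eclipse install directory.

```shell
# Sketch, assuming a standard Eclipse layout; ECLIPSE_HOME is a placeholder
# for your actual Eclipse install directory.
ECLIPSE_HOME="$HOME/eclipse"

# Remove the conflicting ECP layout plug-in (any version) before restarting Eclipse.
rm -f "$ECLIPSE_HOME"/plugins/org.eclipse.emf.ecp.view.swt.layout_*.jar
```

On Windows, delete the same file under the plugins folder of your Eclipse directory using File Explorer.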

To install the Teradata Plug-in for Eclipse via the Eclipse Update Site process, bring up Eclipse and run the Software Updates process.

  • From the main menu, select Help then Install New Software...
  • In the Work with field, select the Add... button.
  • Enter Teradata Update Site in the Name field and the Teradata Plug-in for Eclipse URL (http://downloads.teradata.com/download/cdn/tools/tdide/tdide-16.00.02.00/update) as the Location.
  • Click OK.

 

  • Select all of the Teradata Plug-in for Eclipse features to install.
  • Click Next to proceed with the installation.

 

To install from a local Teradata Plug-in for Eclipse Update Site, download the zip file locally. Refer to the Installation Guide for further instructions on installing Teradata Plug-in for Eclipse.

Teradata Plug-in for Eclipse contains open source components. A package containing the source code and licenses for these components is also available above for download. This package does not contain proprietary Teradata source code.

For help getting started using the Teradata Plug-in, refer to the Getting Started with Teradata Plug-in for Eclipse article.

Readme

Teradata Plug-in for Eclipse 16.00.02 release information.

Supported Platforms:
=============================
     Windows Vista  - Microsoft Windows Vista Enterprise, 32-bit and 64-bit
     Windows Server - Microsoft Windows Server 2003, 32-bit and 64-bit
                    - Microsoft Windows Server 2008, 32-bit and 64-bit
     Windows 2000   - Microsoft Windows 2000, 32-bit
     Windows 7      - Microsoft Windows 7, 32-bit and 64-bit
     Windows 8      - Microsoft Windows 8 and 8.1, 32-bit and 64-bit
     Windows 10     - Microsoft Windows 10, 32-bit and 64-bit
     Apple Mac      - OS X 10.7, 10.8, 10.9, 10.10, and 10.11

Dependent Eclipse Software:
=============================
     Eclipse IDE for Java JEE Developers Neon (Eclipse 4.6)

     NOTE: The Neon JEE Eclipse bundles are missing the EMF Client Platform (ECP) 1.9.0 packages.
     Before installing the Teradata Plug-in, you must:
     -  Install the EMF Client Platform 1.9.0 by going to the Eclipse Help menu,
        selecting Install New Software... and entering the EMF ECP 1.9.x Update Site.
        (Note that if you have a previous version of EMF ECP installed, you should remove
        it from Eclipse before installing the new version.)
     To install the new version:

     -  Press the Add... button to add the location of the EMF ECP 1.9.0 Update Site.
        (http://download.eclipse.org/ecp/releases/releases_19)

     -  Choose this site and install the "ECP SDK 3.x" in the "All SDKs" category.
        Press Next to install EMF ECP.     
     
     -  Before letting Eclipse restart to finish the installation, remove the conflicting
        plug-in: locate your Eclipse directory and delete the file

            plugins\org.eclipse.emf.ecp.view.swt.layout_<versionNumber>.jar

     -  Restart Eclipse (File > Restart).
     -  Install the Teradata Plug-in for Eclipse.


Supported Teradata Databases Versions:
======================================
     Teradata Database 14.00
     Teradata Database 14.10
     Teradata Database 15.00
     Teradata Database 15.10
     Teradata Database 16.00

Supported Aster Databases Versions:
======================================
     Aster Database 6.00
     Aster Database 6.10
     Aster Database 6.20
     Aster Database 7.0 (Aster on Hadoop)

Required Software:
=============================
     - Java Runtime Environment (JRE) version 1.8
     - Mac OS X 10.7 or greater does not provide the Apple JRE. Users must
       install Oracle's JDK (not the JRE) 1.8.

Install Instructions:
=============================
     Once the dependent software has been downloaded, Teradata Plug-in for Eclipse can
     be installed and configured. Follow this procedure to install Teradata Plug-in for
     Eclipse 16.00.02 using the Eclipse Update Site install procedure.

     To install Teradata Plug-in for Eclipse 16.00.02:
     1. From the main menu, click Help and then click Install New Software….
     2. Click the Add... button.
     3. Type "Teradata Update Site" in the Name field and the Teradata Plug-in for
        Eclipse update site URL as the Location:
        http://downloads.teradata.com/download/cdn/tools/tdide/tdide-16.00.02.00/update
     4. Click OK to add the site.
     5. Select each of the Teradata Plug-in for Eclipse components, and then click Next.
     6. When the Feature License Agreement appears, select I accept the terms in the
        license agreement, and then click Next. A list of features appears with the feature
        version, size, and Eclipse install location.
     7. Verify that the information is correct, and then click Finish to continue the
        installation. A warning message appears regarding unsigned JAR content.
     8. Click OK to continue.
     9. When a message appears asking if you want to restart Eclipse, click Yes.
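As a scripted alternative to the UI steps above, Eclipse installs can also be driven by Eclipse's p2 director application. The sketch below only composes and prints the command (ECLIPSE_HOME is a placeholder for your Eclipse install directory); run the printed -list form first to see the actual feature IDs the Teradata update site offers, then install them with -installIU.

```shell
# Sketch only: scripted install via Eclipse's p2 director application.
# ECLIPSE_HOME is a placeholder for your Eclipse install directory.
ECLIPSE_HOME="$HOME/eclipse"
REPO="http://downloads.teradata.com/download/cdn/tools/tdide/tdide-16.00.02.00/update"

# -list prints the installable units the update site offers; replace it with
# "-installIU <feature.id>" (IDs taken from that listing) to install.
CMD="$ECLIPSE_HOME/eclipse -nosplash -application org.eclipse.equinox.p2.director -repository $REPO -list"
echo "$CMD"
```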

Support Notes:
====================================
Problem - Object lists displayed in the Data Source Explorer are shown in the wrong sort order.
Resolution - In the eclipse.ini file (located in your install directory), add the following option to the list of -vmargs:
     -Djava.util.Arrays.useLegacyMergeSort=true
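For context, the tail of eclipse.ini would then look something like the following. The surrounding -vmargs values are illustrative only (your file will differ); the last line is the addition, and it must appear after the -vmargs marker.

```ini
-vmargs
-Dosgi.requiredJavaVersion=1.8
-Xms256m
-Xmx1024m
-Djava.util.Arrays.useLegacyMergeSort=true
```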



Bug Fixes
====================================

The following bugs were fixed in 16.00.02:

IDE-17881     FastLoad with Unicode Pass Through needs to pass in RUNSTARTUP parameter
IDE-18240     Disable SSL from Teradata Studio connection to hadoop through Knox


The following bugs were fixed in 16.00.01:

IDE-14512    TD Studio does not display details under the Teradata Labs tab
IDE-17501    In SQL Editor star (*) is showing in editor tab even SQL save. Refer SQL.PNG
IDE-17590    Presto enabled in Connection Profile if Knox selected
IDE-17591    Cloudera Existing Connection Profile - Properties Page - Test Connection uses HortonworksConnection
IDE-17617    Data Transfer: The process just hangs since the update to 15.10.02 15.12.00. Exporting few hundred rows and the process ran for hours.
IDE-17712    Help table of contents missing
IDE-17722    Copy table TD to TD showing 0 rows transferred when actually transferred 65533
IDE-17743    Load table data always uses FastLoad protocol, even if only 45 rows
IDE-17888    SQL History SQL statement has lost the leading comment - Regression
 
The following bugs were fixed in 16.00.00:

IDE-12705    Linux and Amazon Linux --> Data Sort/Filter, Export All and ‘Refresh table from Database’ toolbar options are not displayed
IDE-12818    Redhat 7 and Amazon Redhat - Copy object Wizard --> Generated SQL is not displayed in the SQL Summary Page in Copy object Wizard
IDE-12819    Redhat 7 and Amazon Redhat - Format the Result set --> Format the Result set is not working in Redhat 7 and Amazon Redhat
IDE-14695    Row Count on more than 15 tables causes "Response limit exceeded."
IDE-14791    Result Set Viewer preference error message
IDE-15075    Transfer View Refresh is not refreshing after dropping table in Admin Perspective
IDE-15209    Hadoop Refactoring --> Confirm Dialog message box is not displayed when clicking on OK button after updating the Hadoop Profile
                  name in the Properties dialog and updated Hadoop Profile name in the Navigator/DSE with entered name
IDE-15549    Project Explorer 'Execute SQL Files' does nothing when press OK
IDE-15806    Property form is not closed when the table is dropped
IDE-15824    Preconfigured Data Sources mechanism does not support JDBC HTTP Transport Mode property
IDE-15878    Error when using default editor with previously imported Hadoop profile
IDE-15935    Set Root Database for Navigator --> Perform the case insensitive validation for Root Database
IDE-15937    JOIN/HASH --> Hash/Join Index is displayed without ‘Database name’ prefix when index dragged and dropped on SQL Editor
IDE-15959    Drop - Delete Database.../Drop Database... --> Menu options for Drop/Delete are not displayed when selecting the Database/User
IDE-16067    Result set Format settings lost on cancel operation
IDE-16070    Aster Smartloader gets incorrect nullable information
IDE-16156    Teradata ViewVX is not being used when loading Table Size information (DBC.TableSizeVX)
IDE-16225    Getting NullPointException for Sample Contents on empty table from Hadoop system.
IDE-16233    Certify CDH 5.8 with Studio Testing --> Unhandled event loop exception is thrown when clicking on ‘Toggle Result Set display as
                  Text or Grid’ toolbar button in the ‘Teradata Result Set Viewer’ after selecting the Sample Contents option for a Hadoop table
IDE-16302    Edit Data… - Teradata --> Unhandled event loop exception is thrown when clicking on ‘OK’ Button in the Filter/Sort Data dialog
IDE-16440    Smartload table returned null right after create statement successfully ran in Aster
IDE-16441    Only check for EXPLAIN for Teradata
IDE-16517    Japanese Localization Bugs in the 15.12 release
IDE-16604    Studio with ASTER on setting MAROWS is generating a pop up
IDE-16755    Aster --> Schemas are not displayed in the OLV when double clicked on database in the Navigator tab
IDE-16756    Aster --> Inform the user with proper message (or) Disable the ‘Add a Schema’ toolbar option as we cannot(other than login DB)
                   create the schema directly under the ‘XX DB’ using ‘Create Schema’ form
IDE-16774    Studio with ASTER treating "?" mark in reset function as query parameter
IDE-16814    Teradata/Aster --> 'Row Count' tooltip is displayed incorrectly for Views
IDE-16815    Spelling --> Message is displayed incorrectly in the ‘Drop Confirmation’ message box
IDE-16833    Need to update wording on notification dialog
IDE-16886    CDH 5.8 testing with TDCH 1.5 --> Unhandled event loop exception is thrown when clicking on Finish button after clicking on
                  ‘Select None --> Select All --> Reset’ buttons when trying to transfer Teradata table to Hadoop
IDE-16896    Teradata Studio 15.12 - Open SQL File does not work.
IDE-16915    Label name is displayed as ‘Database’ instead of ‘Schema’ in the ‘Transfer Hadoop Table to Aster Table’ wizard
IDE-16930    Unhandled event loop exception is thrown if the destination type contains unsupported data type as distribute by hash key
                   when transferring a Hadoop table to Aster
IDE-16931    Transfer Hadoop table to Aster -->Distribute by hash clause is not displayed when selecting hash key as other than first column
                   after unselecting the first column in the Transfer Column field
IDE-16996    Error Dialogue loops when timeout Teradata Connection and attempt copy table
IDE-17008    In SQL Editor, 'Connection Profile' Drop down is displayed as SmartLoad(CDH) instead of SmartLoad(HDP).
IDE-17055    Cloudera connection with SmartLoader connection option should show WebHDFS properties dialog, not TDCH
IDE-17092    In JDBC Connection profile, every field has Key board shortcut key other than 'Cancel' button and 'LDAP Security enabled' check box.
IDE-17366    Externalize SQL Formatter preference and routine editor
IDE-17369    Foreign Servers --> Unhandled event loop exception is thrown when pressing the 'Alt+Tab'
IDE-17397    Load Data and Export Data have Error Dialogue loops when connection got timed out


Improvements
====================================

The following improvements were made in 16.00.02:

IDE-17476     Support for Cloudera 5.9
IDE-17620    Create Aster Execution Engine Connections
IDE-17621    Aster Execution Engine Create table support
IDE-18179    Migrate Aster 7.0 connection profiles


The following improvements were made in 16.00.01:

IDE-10476    Unable to logon to Teradata DB using SSO/Kerberos in Studio v 15.0
IDE-14432    Support Kerberized cluster for Hortonworks and Cloudera
IDE-17725    Propagate WebHDFS username to WebHCAT username for new Hadoop Connection Profile

The following improvements were made in 16.00.00:

IDE-14769
    mm/dd/yyyy hh:mm:ss timestamp format is not recognized in external data transfer
IDE-14237    Provide a preference for setting number of decimal places on Float values displayed in the Result set viewer
IDE-11792    Result Set Viewer preference to specify numeric format pattern
IDE-14670    Provide option to choose the date format in the Result Set Viewer
IDE-14678    Provide option to display binary data in the Result Set Viewer with dashes
IDE-14679    Provide option to display BigInt and Decimal(16)+ values as Strings for Excel
IDE-13583    Result Set Viewer - add preference for displaying TIME values
IDE-14325    Provide direct string substitution for parameters in SQL Editor like SQL Assistant does
IDE-11839    Specify the location of the history file.
IDE-12922    See Triggers with table objects in the Object List Viewer
IDE-12570    Ability to "show all objects" for a DB/User
IDE-12571    Display the references for a table
IDE-14657    See the list of roles that a user is in.
IDE-14665    Show the list of statistics for tables
IDE-12082    Preference to use Columns VX or custom view.
IDE-14732    SmartLoad and FastLoad: add option to ignore null rows


Hadoop Configuration Support:
=============================
Teradata Studio provides an option to transfer data to and from Hadoop systems using its Smart Loader for
Hadoop feature. The Smart Loader for Hadoop uses the Teradata Connector for Hadoop (TDCH), which is
installed on the Hadoop node, to provide the data transfer to and from the Hadoop system. Teradata Studio
requires TDCH version 1.5.1. The following are the system requirements of TDCH 1.5.1:

    Supported Teradata Database versions:

        Teradata Database 14.00
        Teradata Database 14.10
        Teradata Database 15.00
        Teradata Database 15.10
        Teradata Database 16.00

    Supported Hortonworks Data Platform (HDP) versions:

        HDP 2.3 (Hadoop 2.7.1, Hive 1.2.1, HBase 1.1.1)
        HDP 2.4 (Hadoop 2.7.1, Hive 1.2.1, HBase 1.1.2)
        HDP 2.5 (Hadoop 2.7.3, Hive 1.2.1, HBase 1.1.2)

    Supported Cloudera Hadoop (CDH) versions:
           CDH 5.4, 5.7, 5.8


Teradata Connector for Hadoop Setup:
====================================
TDCH 1.5.1 must be installed and configured on the Hadoop system. The Teradata Plug-in for Eclipse's Smart Loader
for Hadoop uses Oozie to submit the data transfer workflow on the Hadoop system. Please follow these instructions
to configure the Hadoop System and create the Oozie workflow files.

1) If the script is not already on your Hadoop system, download and install the TDCH (version 1.5.1) onto your Hadoop system.
2) Navigate to the TDCH scripts folder in the TDCH install directory (default location: /usr/lib/tdch/1.5/scripts)
3) Execute the configureOozie.sh script as root user, providing the locations of your Hadoop services.
    
The usage of the configureOozie.sh script is as follows (this is all a single line):

    Usage: ./configureOozie.sh nn=nameNodeHost [nnHA=fs.default.value] [rm=resourceManagerHost] [oozie=oozieHost] [webhcat=webHCatalogHost] [webhdfs=webHDFSHost]
        [nnPort=nameNodePortNum] [rmPort=resourceManagerPortNum] [ooziePort=ooziePortNum] [webhcatPort=webhcatPortNum] [webhdfsPort=webhdfsPortNum]
        [hiveClientMetastorePort=hiveClientMetastorePortNum] [kerberosRealm=kerberosRealm] [hiveMetaStore=hiveMetaStoreHost]
        [hiveMetaStoreKerberosPrincipal=hiveMetaStoreKerberosPrincipal]

(The parameters are entered on a single line. The parameters surrounded by [ ] are optional. The "[" and "]" are not part of the command.)
(Note: the Job Tracker in HDP 1.x is now the Resource Manager in HDP 2.x)

    nn - The Name Node host name (required)
    nnHA - If the name node is HA (High Availability), specify the fs.defaultFS value found in core-site.xml
    rm - The Resource Manager host name (uses nn parameter value if omitted)
    oozie - The Oozie host name (uses nn parameter value if omitted)
    webhcat - The WebHCatalog host name (uses nn parameter value if omitted)
    webhdfs - The WebHDFS host name (uses nn parameter value if omitted)
    nnPort - The Name node port number (8020 if omitted)
    rmPort - The Resource Manager port number (8050 if omitted)
    ooziePort - The Oozie port number (11000 if omitted)
    webhcatPort - The WebHCatalog port number (50111 if omitted)
    webhdfsPort - The WebHDFS port number (50070 if omitted)
    hiveClientMetastorePort - The URI port for hive client to connect to metastore server (9083 if omitted)
    kerberosRealm - name of the Kerberos realm
    hiveMetaStore - The Hive Metastore host name (uses nn parameter value if omitted)
    hiveMetaStoreKerberosPrincipal - The service principal for the metastore thrift server (hive/_HOST if omitted)

The port numbers are HDP’s defaults. So, if the system being set up has all the services
hosted on a single system on the default ports, only the nn parameter is needed.
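For example, the hypothetical invocation below configures a cluster where Oozie runs on a separate edge node and all other services live on the name node host at default ports. The hostnames are placeholders, and the sketch composes and prints the command rather than executing it, since configureOozie.sh exists only on the Hadoop system.

```shell
# Hypothetical invocation; hadoop-nn.example.com and hadoop-edge.example.com
# are placeholder hostnames. Run from /usr/lib/tdch/1.5/scripts as root
# on the Hadoop system.
CMD="./configureOozie.sh nn=hadoop-nn.example.com oozie=hadoop-edge.example.com"
echo "$CMD"
```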

The script will exit with an error message if the TDCH is not in its expected location.
Otherwise the script will display a message indicating the parameters values. A sample
message is as follows:

The following is the specification of the Hadoop services used by the Oozie workflows:
{
        "Distribution":"HDP",
        "DistributionVersion":"2.4",
        "TeradataConnectorForHadoopVersion":"1.5.1",
        "WebHCatalog":"hostname",
        "WebHCatalogPort":50111,
        "WebHDFS":"hostname",
        "WebHDFSPort":50070,
        "JobTracker":"hostname",
        "JobTrackerPort":8050,
        "NameNode":"hostname",
        "NameNodePort":8020,
        "NameNodeHA":"fs.defaultFS",
        "NameNodeHAConfigured":true,
        "Oozie":"hostname",
        "OoziePort":11000,
        "HiveClientMetastorePort":9083
        "HiveMetaStoreKerberosPrincipal":"hive/_HOST",
        "KerberosRealm":"",
        "HiveMetaStore":"hostname"
}

** You must also make sure the Teradata IDE/Studio client machine can access the Hadoop system services (the hostnames and ports provided to the configureOozie script).
Thus, you may need to add the Hadoop services' host names and IP addresses to your hosts file or DNS service.
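For example, a hosts-file entry pairs an IP address with a hostname, one service host per line. The addresses and hostnames below are placeholders; on Windows the file is C:\Windows\System32\drivers\etc\hosts, and on Linux/macOS it is /etc/hosts.

```text
10.0.0.10    hadoop-nn.example.com
10.0.0.11    hadoop-edge.example.com
```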


HORTONWORKS SELF-SIGNED CERTIFICATES:
=====================================

The certificate used by the Knox server needs to be added to the Java runtime's certificate store. You can save the certificate with a web browser.
For example, with Chrome, you can:
•    Enter the Knox server:port in the address bar. It will say that the connection is not private
•    Click Advanced, then click on the Proceed to site link
•    Click on the lock in the address bar and select Details
•    Click View certificate
•    Select the Details tab in the resulting dialog and click the Copy to file... button
•    In the resulting Certificate Export Wizard, save the certificate as Base-64 encoded

(Other browsers have similar methods to get to the Certificate Export Wizard)

Alternatively, on the Knox server, run the command:

     keytool -export -alias gateway-identity -rfc -file knox.crt -keystore <path to gateway.jks keystore (eg. /usr/lib/knox/data/security/keystore/gateway.jks)>

To install the certificate into your Java Runtime certificate store, run the command:
     %JDK_HOME%\bin\keytool.exe -importcert -alias "TDH240 Knox self-signed certificate" -file cert_location/<filename>.txt -keystore %JRE_HOME%\lib\security\cacerts
Where %JDK_HOME% is an environment variable with the location of a JDK and %JRE_HOME% is the location of the JRE used to run Studio.

The keytool.exe will ask for the password to the certificate store. It is "changeit" unless you have changed it.