
The Teradata Connector for Hadoop (Command Line Edition) provides bi-directional data movement between Teradata and Hadoop in a command line environment.
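For example, a typical command-line invocation looks like the sketch below (a minimal illustration; the system name, credentials, and table names are hypothetical, and the exact options vary by connector version). This one would import a Teradata table into Hive:

hadoop com.teradata.hadoop.tool.TeradataImportTool \
  -url jdbc:teradata://tdsystem/database=testdb \
  -username testuser -password testpassword \
  -jobtype hive -fileformat textfile \
  -sourcetable example_td_table \
  -targettable example_hive_table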

I am not able to use the enclosedby and escapedby arguments in the Teradata Hadoop connector. I get the following error when I pass these arguments. Here I am trying to set enclosedby to a double quote and escapedby to a forward slash. The error goes away when I remove the enclosedby and escapedby arguments.
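For reference, a minimal sketch of how these arguments are usually quoted on a Unix shell so that the double quote and slash reach TDCH intact (the connection details and other options are hypothetical, and whether this clears the error depends on the TDCH version in use):

hadoop com.teradata.hadoop.tool.TeradataExportTool \
  -url jdbc:teradata://tdsystem/database=testdb \
  -username testuser -password testpassword \
  -jobtype hive -fileformat textfile \
  -enclosedby '"' \
  -escapedby '/'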

I have a Hive table that is too big for my Teradata database, but if I can exclude some of the columns, it should be OK. Since I don't want to have a duplicate copy of the table with fewer columns on my Hadoop server, I have two choices for import: 1) use a view to filter out the columns, or 2) use a query to filter out the columns.
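If the view route works out, a minimal sketch of option 1 (hypothetical database, table, and column names; whether the connector accepts a view or a query as its source is worth confirming on a small sample first) is to define the narrower view in Hive and point the transfer at it instead of the full table:

hive -e "CREATE VIEW mydb.example_narrow AS
         SELECT col1, col2, col3 FROM mydb.example_wide;"
# then use -sourcetable example_narrow in the TDCH job instead of example_wide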

Hi All,
 
I was looking to download Teradata Loom, but it seems it is not available for download. Can someone confirm whether this is still available or has been discontinued?
http://www.teradata.com/download-teradata-loom/?LangType=1033&LangSelect=true
 
Please reply!
 
Thanks
Kavin

Hello,
I am using the Teradata version of PrestoDB, 141t, to connect to different data sources, and one of my data sources is Teradata, but there isn't any documentation about the connector. Does PrestoDB have the ability to connect to Teradata? If so, how could I set up my connector?
 
Thanks, Siva. 

Presto is an open source distributed SQL engine, originally developed by Facebook for their massive Hadoop data warehouse.

Hi,
I am trying to read data from Teradata DB in a MapReduce job using the JDBC Driver. I get the following exception/stack trace:
 
Provided connection string: jdbc:teradata://teradata/database=dbname
 

java.lang.ExceptionInInitializerError: null

Hi,
I have TD Studio 15.10, TD DB 15.0, HDP 2.0 and Aster 6.0. They are all connected well in TD Studio and I can see the tables.
I'm doing some tests on Aster based on an HDP table, but when I execute the query:
 
SELECT * FROM load_from_hcatalog
(USING server('192.168.100.131')
port('9083')
username('root')

Today's analytic environments incorporate multiple technologies and systems. Teradata QueryGrid™ Teradata-to-Hadoop allows you to access data and processing on Hadoop from your Teradata data warehouse.
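As a flavor of what this looks like in practice (a minimal sketch, assuming a foreign server named hdp_server has already been defined for the Hadoop cluster, run here through a BTEQ heredoc; the credentials and table name are hypothetical):

bteq <<'EOF'
.LOGON tdsystem/dbuser,dbpassword
/* query a Hive table in place from the Teradata side */
SELECT * FROM web_clicks@hdp_server;
.LOGOFF
EOF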

Hi,
Are you looking for Big Data/Hadoop with Data Analytics & Data Science training?
I am a one-stop source for all of your technical requirements.
I am an expert in providing training on the Data Science/Data Analytics/Big Data/Hadoop components below, such as:
HDFS & Map/Reduce

This training course is designed to provide the knowledge and skills to become a successful Hadoop administrator. It starts with the fundamental concepts of Apache Hadoop and covers topics such as the internals of YARN, MapReduce, and HDFS, and best practices for preparing and maintaining Apache Hadoop in production.

 

Teradata Studio 15.10 is now available for download. Teradata Studio is an administration tool for creating and administering database objects. It can be run on multiple operating system platforms, such as Windows, Linux, and Mac OS X.

Hi, we have a Sqoop job that exports data to Teradata hourly. Sometimes the job fails with an untranslatable-character error. We would like to know if there is any Java function to check whether a string is translatable by Teradata. This is of high priority. Please let us know if there are any questions.
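As a rough command-line way to spot such strings (not the Java function asked for; this assumes the target columns use the Teradata LATIN character set, which is close to ISO-8859-1), the same encodability idea that a Java CharsetEncoder canEncode() test would use can be tried with iconv:

# exits non-zero (and prints an error) if the string contains characters
# that cannot be represented in ISO-8859-1
printf '%s' "$candidate_string" | iconv -f UTF-8 -t ISO-8859-1 > /dev/null \
  && echo "translatable" || echo "untranslatable"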

Hi all,
I am trying to export data from a Hive table to a Teradata table. Below is the command:

hadoop com.teradata.hadoop.tool.TeradataExportTool \
  -url jdbc:teradata://testsystem/database=testdb \
  -username testuser \
  -password testpassword \
  -jobtype hive \
  -fileformat textfile

Hello All,
 
I am currently in the process of developing a complete end-to-end solution that will migrate objects/data from Teradata to Hadoop and vice versa.
 
When I researched, I found the articles on TERADATA CONNECTORS FOR HADOOP (COMMAND LINE INTERFACE) and their respective tutorial document.
 
 

I am trying to connect MapR Hadoop and Teradata using TDCH. I have successfully installed TDCH on MapR Hadoop.
Now I want to create a new Connection Profile, but I am unable to see the Hadoop System option in Teradata Studio 15. Please tell me how to get it.

The Hadoop market is forecast to grow to $50.2 billion in the next 6 years. This is the best time to take advantage and get trained in Hadoop. Register now: http://j.mp/hadoop-bangalore-training

Course starts on: 2015-01-13
and ends on: 2015-01-16
Venue: Ibis Hotel, Bangalore
Contact:

This article describes how to use Teradata QueryGrid to execute a Mahout machine learning algorithm on a Hadoop cluster based on data sourced from the Teradata Integrated Data Warehouse.

Good Evening,
 

Does TD SQL-H support reading ORC File tables?

OSSCube provides Hadoop Training.

 

Hadoop Enthusiasts! Get ready for our upcoming Cloudera Apache Hadoop development training in Bangalore on 27 Oct 2014

 

Registration: http://j.mp/hadoop-bangalore-training

 

Regards,                         

OSSCube India Marketing Team 

This webinar provides an overview of the Teradata Enterprise Access for Hadoop solutions.

The Unified Data Architecture enables Teradata, Aster and Hadoop to deliver unparalleled value.

Hi,
I have a couple of questions regarding the Hadoop connectors:

  • What connector do I need to integrate Hadoop and Teradata using an ETL tool like Talend or Informatica PowerCenter?
  • Can I use all three connectors side by side and deploy them in a sandbox environment?

 Regards,
Joseph

4Trainings provides a Hadoop Online Training course taught by excellent, experienced IT professional trainers who have more than 10 years of real-time experience. Our trainers are dedicated professionals, so the best quality output will be delivered. Our professionals cover each and every topic in Hadoop Online Training.

Hello,
I downloaded the Aster v5 and v6 VM images available from the download section and have been able to replicate the error on both. I'm logged in as aster.
I have a single-node Hadoop 2.2.0 cluster with Hive 0.12 (WebHCat running on 50111). I can ping both machines and connectivity is good.

Hi All,
I am using teradata-connector-1.2.1 with the following versions:
->Teradata 14.0
->HDP 2.0 (Hadoop 2.1.0.2.0.5.0-67)
->Hive 0.11

I am trying to import data from Teradata tables to Hive.
 
/*Created table and inserted a row in teradata*/
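The DDL from the post is cut off here; as a minimal stand-in with hypothetical names, run through BTEQ, the setup could look like:

bteq <<'EOF'
.LOGON testsystem/testuser,testpassword
CREATE TABLE testdb.example1 (c1 INTEGER, c2 VARCHAR(100));
INSERT INTO testdb.example1 VALUES (1, 'hello');
.LOGOFF
EOF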

Processes for integrating security controls were underdeveloped in early versions of Hadoop because data protection needs in its ecosystem were not well-understood.

Exporting data from a Teradata table to a Hive table using Studio (v14.10.01.201310271220) fails silently. Nothing happens because a bad parameter is passed within the TDCH command that is launched.

This session dives into the new Hadoop architectural constructs called YARN and Tez.

In some areas, big data tools, such as Hadoop and HBase, overlap with relational databases and data warehouses. Putting the hyperbole aside, how should you choose when both systems provide analytic value?

The Analytical Ecosystem can be quite complex. It usually consists of multiple managed servers containing instances of databases, ETL servers, infrastructure services, and application servers. Monitoring and managing applications in this environment can be a very challenging task. Knowing at any moment the state of your hardware, how your applications are doing, how many jobs have finished successfully, how many have failed, and why they failed are the types of questions database administrators typically ask themselves. Now, with the addition of Hadoop infrastructure components within an ecosystem, monitoring has become even harder. Unity Ecosystem Manager helps users answer those questions and perform any necessary maintenance tasks.

Get a quick start on Teradata Studio and Teradata Studio Express with these usage videos. The overview and connection videos apply to both Teradata Studio and Teradata Studio Express. The others apply to Teradata Studio only: Create Database, Create Table, Move Space, Smart Loader, Copy Objects, and Transfer Data.

Hi Masters,

 

This is a Hadoop/Teradata scenario. I'm looking for advice on a data migration from Hadoop to Teradata. The source data is supposed to come in the form of delimited flat files. What are my best options for migrating the data with the best performance?

 

 

Module:    Thinking at Scale: Introduction to Hadoop

You know your data is big – you found Hadoop. What implications must you consider when working at this scale? This lecture addresses common challenges and general best practices for scaling with your data.

 

Module: MapReduce and HDFS

 

MAGNIFIC Trainings is a pioneer in offering real-time, hands-on training from industry experts in HADOOP BIG DATA online TRAINING.

We cover HADOOP technologies like HADOOP, ADOBE, BIG DATA, analytics training, and SAP through WebEx, TeamViewer & Skype to serve requests coming from all over the world.

 

The Teradata Connector for Hadoop (TDCH) provides scalable, high performance bi-directional data movement between the Teradata database system and Hadoop system.

I am trying to load data from Hadoop to Teradata using TPT.
The delimiter in these files is the default Hadoop delimiter (Ctrl-A, hex value 01).
Does TPT support this delimiter, and how do I specify it?
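For what it's worth, a hedged sketch of one common way to handle this (attribute support varies by TPT version, so treat it as an assumption to verify): the DataConnector file-reader operator can take the delimiter in hexadecimal rather than as a literal character, after which the job is submitted as usual:

# in the TPT script's DataConnector (producer) operator definition:
#   VARCHAR TextDelimiterHex = '01'
# then submit the job normally (script and job names are hypothetical):
tbuild -f load_from_hadoop.tpt my_ctrl_a_load_job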

Behind the Big Data hype and buzzword storm are some fundamental additions to the analytic landscape.

Hi,
I am trying to connect Aster and HCatalog on Hortonworks using the load_from_hcatalog function.
But every time I try to fire this query, it throws an error saying the following:

The Teradata Connector for Hadoop (TDCH) is a map-reduce application that supports high-performance parallel bi-directional data movement between Teradata systems and various Hadoop ecosystem components.

Teradata Studio provides a Smart Loader for Hadoop feature that allows users to transfer data from Teradata to Hadoop, Hadoop to Teradata, and Hadoop to Aster. When transferring between Teradata and Hadoop, the Hadoop Smart Loader uses the Teradata Connector for Hadoop MapReduce Java classes as the underlying technology for data movement.

Teradata Studio 14.02 is now available for download. Teradata Studio is an administration tool for creating and administering database objects. It can be run on multiple operating system platforms, such as Windows, Linux, and Mac OS X.

I have a ~250-node Hadoop cluster containing a large data set that I want to move to Teradata as quickly as possible. The target Teradata system has ~100 (recent-generation) nodes.

Hadoop MapReduce programmers often find that it is more convenient and productive to have direct access from their MapReduce programs to data stored in a RDBMS such as Teradata Enterprise Data Warehouse (EDW) because:

  1. There is no benefit to exporting relational data into a flat file.
  2. There is no need to upload the file into the Hadoop Distributed File System (HDFS).
  3. There is no need to change and rerun the scripts/commands in the first two steps when they need to use different tables/columns in their MapReduce programs.

Hadoop systems [1], sometimes called Map Reduce, can coexist with the Teradata Data Warehouse, allowing each subsystem to be used for its core strength when solving business problems. Integrating the Teradata Database with Hadoop turns out to be straightforward using existing Teradata utilities and SQL capabilities. There are a few options for directly integrating data from a Hadoop Distributed File System (HDFS) with a Teradata Enterprise Data Warehouse (EDW), including using SQL and FastLoad. This document focuses on using a Table Function UDF to both access and load HDFS data into the Teradata EDW. In our examples, there is historical data already in the Teradata EDW, presumably derived from HDFS for trend analysis. We will show examples where the Table Function UDF approach is used to perform inserts or joins from HDFS with the data warehouse.
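As a flavor of the approach (a minimal sketch using a hypothetical table UDF named hdfs_read and hypothetical object names; the document's actual UDF and its arguments may differ):

bteq <<'EOF'
.LOGON tdsystem/dbuser,dbpassword
/* insert rows read from an HDFS file into an existing EDW table */
INSERT INTO sales_history
SELECT * FROM TABLE (hdfs_read('hdfs://namenode:8020/data/sales/2013.txt')) AS t;
.LOGOFF
EOF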