
Beyond typical database functions, Aster offers advanced analytical functionality.  For instance, the Aster nPath function performs regular pattern matching over a sequence of rows.
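nPath's actual SQL-MR syntax is not shown here, but the core idea — reducing each row in an ordered sequence to a symbol and matching a regular pattern over the result — can be sketched in plain Python (the clickstream data and symbol mapping are hypothetical, purely for illustration):

```python
import re

def match_path(rows, classify, pattern):
    """Reduce each row to a one-character symbol, then match an ordinary
    regular expression over the symbol string -- roughly the idea behind
    nPath's SYMBOLS and PATTERN clauses (illustrative analogy only)."""
    symbols = "".join(classify(r) for r in rows)
    return [m.span() for m in re.finditer(pattern, symbols)]

# Hypothetical clickstream: find a search followed by one or more page
# views and ending in a purchase (pattern "S V+ P").
clicks = ["search", "view", "view", "purchase", "view"]
sym = lambda r: {"search": "S", "view": "V", "purchase": "P"}.get(r, "O")
spans = match_path(clicks, sym, r"SV+P")
```

The real nPath function does this in parallel inside the database over partitioned, ordered rows; the sketch only conveys the pattern-matching idea.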

Oracle Essbase was first launched in 1992. Since then, many other OLAP tools have come and gone. Oracle Essbase has thrived and become a leading OLAP tool by continuously adding new features, adapting to new environments, and taking advantage of new technologies. Oracle Essbase XOLAP and Oracle Essbase Studio are examples of the latest improvements.

The OBIEE Sample Application is a fully functioning OBIEE environment available on a VM.  It is self-contained in that demo applications can be run against data stored on the VM.  It is also possible to modify the VM so that external data can be accessed and shown in a demo or used for testing.  This article shows how to make that modification.

The Teradata Aster Discovery Platform is engineered for all new and old data types, offering integrated analytics and revolutionary ways of analyzing data. It’s the industry’s most transformative solution for turning big data insights into outstanding customer experiences, advancements in customer service and record-breaking marketing campaigns. Teradata Aster Discovery Portfolio provides a suite of ready-to-use functions applied from a familiar SQL interface for fast and easy discovery of business insights from big data.

Teradata QueryGrid provides a means to access Hortonworks Hadoop® data from a Teradata system.  QueryGrid uses SQL to access and join other systems' data with the enterprise data.  QueryGrid is flexible in that Hadoop, Aster, or other databases can be accessed from the integrated data warehouse.

Oracle BI Enterprise Edition (OBIEE) 11g Configuration and Integration with Fuzzy Logix DB Lytix™ On Teradata

This article describes how to integrate Oracle BI Enterprise Edition (OBIEE) 11g with Fuzzy Logix DB Lytix™ 1.x on Teradata to enable transparent OBIEE end user access to the advanced analytic functions from Fuzzy Logix.  Fuzzy Logix's DB Lytix™ 1.x includes libraries of in-database models that run deep inside Teradata systems.  Use of these models allows execution of analytics within the database, leveraging the performance, parallelism, and scalability of the Teradata system while dramatically improving the performance of analytic results and simplifying the integration of analytics into existing reporting and analytic applications.

Teradata ADS Generator, part of the Teradata Warehouse Miner family of products, was built to support both comprehensive data profiling and analytic data set generation for Teradata customers.  Neither the data profiling nor the analytic data set generation capabilities of the product require any movement of data outside of the warehouse.

Teradata Warehouse Miner provides an array of data profiling and mining functions ranging from data exploration and transformation to analytic model development and deployment that are performed directly in the Teradata Database. While many data mining solutions require analysts to extract data samples to build and run analytic models, Teradata Warehouse Miner allows you to analyze detailed data without moving it out of the data warehouse, thus streamlining the data mining process.

Teradata Warehouse Miner, and its derivative products Teradata Analytic Data Set (ADS) Generator, Teradata Data Set Builder for SAS and Teradata Profiler, can provide both a framework for the data mining process and a significant savings of time and resources in implementing the process.

An updated version of the teradataR package is now available here.

Enhancements in this new 1.0.1 release include:

How do I find the system, error, or PE log files for Aprimo Relationship Manager (formerly Teradata Relationship Manager)?

One of the biggest benefits of Query Banding is the ability to pass parameters into an SQL script.
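A query band is a semicolon-delimited string of name=value pairs applied with Teradata's `SET QUERY_BAND ... FOR SESSION` statement. As a minimal sketch of the parameter-passing idea, here is how an application might build and later parse such a string in Python (the parameter names are hypothetical):

```python
def build_query_band(params):
    """Build the 'name=value;' string used in SET QUERY_BAND ... FOR SESSION."""
    return "".join(f"{k}={v};" for k, v in params.items())

def parse_query_band(band):
    """Recover the name/value pairs from a query band string."""
    return dict(pair.split("=", 1) for pair in band.split(";") if pair)

# Hypothetical parameters an SQL script might consume.
band = build_query_band({"AppName": "NightlyETL", "Region": "EMEA"})
sql = f"SET QUERY_BAND = '{band}' FOR SESSION;"
```

Inside the database, SQL can then read individual values back out of the session's query band, which is what makes the mechanism useful for parameterizing scripts.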

Establishing an associative security lookup table between a userid and its allowable company codes can provide easy, controlled access to the base data tables.
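Such a lookup amounts to joining the base table to the security table on company code and filtering by the current user. A minimal in-memory sketch of that join (all table contents and column names here are hypothetical):

```python
# Hypothetical associative security table: (userid, allowable company code).
security = {
    ("alice", "C100"), ("alice", "C200"),
    ("bob", "C200"),
}

# Hypothetical base data rows keyed by company code.
base_rows = [
    {"company": "C100", "amount": 10},
    {"company": "C200", "amount": 20},
    {"company": "C300", "amount": 30},
]

def rows_for(userid):
    """Equivalent of joining base data to the security table on company
    code, restricted to the companies the user is allowed to see."""
    allowed = {code for (user, code) in security if user == userid}
    return [row for row in base_rows if row["company"] in allowed]
```

In SQL terms this corresponds to an inner join against the security table with the userid supplied as a predicate, so users never see companies outside their entitlement.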

Please note that we are no longer supporting teradataR, as Teradata has decided to focus on its partnership with Revolution for R integration with Teradata.

R is an open source language for statistical computing and graphics. It provides a wide variety of statistical techniques (linear and nonlinear modeling, classical statistical tests, time-series analysis, classification, clustering) and graphical techniques, and is highly extensible. This free package is designed to allow users of R to interact with a Teradata database.  Users can run many statistical functions directly against the Teradata system without having to extract the data into memory.

In this post, we will cover some more advanced logging topics in TRM, including PE logging, SQL output, user clicks, authentication attempts, and startup logging.

This article describes the basics of TRM's V6 logging in the application server. This includes log file configuration, versioning, location and content.

Here are a couple of helpful links in Teradata Relationship Manager (TRM).  These allow you to see all of the security privileges and user groups assigned to the logged in user, as well as a report of the TRM version and detailed environmental information.

Right out of the box, custom processing engine tasks may be created without having to write new Java components. These tasks are application components that implement common functionality. Several base processing engine tasks are themselves built on these common tasks.

A Processing Engine Extension represents a mapping of customized processing engine tasks, properties, and dependencies whose purpose is to change or extend the processing flow of the base processing engine capabilities defined in the base definition files.

In parts 1 and 2, we looked at how to load and retrieve large objects using Teradata BLOBs and CLOBs. In part 1, I put forward my unbreakable golden rule that you shouldn’t simply store Object Models as Large Objects. Rather, always map your objects’ attributes to columns in a table; don’t just serialize a bunch of classes and store them in a BLOB.

In part 3, I present another of my unbreakable golden rules, which is "Rules were made to be broken". The focus of this article is how to serialize and persist (store) an Object Model into a BLOB on Teradata and reinstate it.
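The contrast between the two approaches — mapping attributes to columns versus serializing the whole object into a BLOB — can be sketched in Python, using `pickle` as a stand-in for whatever serializer an application actually uses (the `Customer` class is hypothetical):

```python
import pickle

class Customer:
    """Hypothetical domain object."""
    def __init__(self, name, tier):
        self.name = name
        self.tier = tier

# The golden rule: map attributes to columns, so the data stays
# queryable and indexable inside the database.
def to_row(c):
    return {"name": c.name, "tier": c.tier}

# The exception: serialize the whole object into bytes destined
# for a BLOB column, opaque to SQL but trivial to reinstate.
def to_blob(c):
    return pickle.dumps(c)

def from_blob(blob):
    return pickle.loads(blob)
```

The column-mapped form supports WHERE clauses and joins; the BLOB form only supports round-tripping the object, which is exactly why it should be the exception rather than the rule.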

Expect 
"It will not do to leave a live dragon out of your plans if you live near one." ~ The Hobbit

No matter what you do, no matter how long you prepare, your plan will begin to slowly, or not so slowly, unravel before your eyes.  Plans fail; that is the one constant you can actually plan for.

What in the world do you mean?  Agile is a spotlight?  That's ridiculous.  It's all about daily stand up meetings, burndown charts, cards on walls, and other fun stuff like that, right?  If I do those things, I am Agile, right?  

WRONG!

Come on, doing all of that stuff has to count for something!!!

In Part 1, we looked at what Large Objects are and some tactics that we could employ to load them into Teradata. As I stated back in part 1, loading large objects is great, but by itself is pretty useless. In this article, we will look at extracting Binary Large Objects from Teradata for inclusion in a web application.

But before we get to that, we need to know something about the large objects we will be dealing with.
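One thing a web application typically must know before serving a BLOB is its content type. A common approach is to inspect the leading "magic" bytes of the data; a minimal sketch (the signature table is deliberately abbreviated):

```python
# Well-known file signatures ("magic bytes") for a few common image types.
SIGNATURES = {
    b"\xff\xd8\xff": "image/jpeg",
    b"\x89PNG\r\n\x1a\n": "image/png",
    b"GIF87a": "image/gif",
    b"GIF89a": "image/gif",
}

def sniff_content_type(blob, default="application/octet-stream"):
    """Guess a MIME type from the leading bytes of a BLOB so the web
    tier can set an appropriate Content-Type header."""
    for magic, mime in SIGNATURES.items():
        if blob.startswith(magic):
            return mime
    return default
```

Alternatively, the content type can simply be stored in its own column alongside the BLOB when the object is loaded, which avoids sniffing altogether.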

Last week we concluded by challenging you to consider how a Connection Pool contributes to the performance of Web and Web service applications. Further, we suggested that this week we would expand on this thought by exploring what you can do, during the development of an Active Integration application, to achieve improved performance by collaborating with the Database on Workload Management.
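The performance benefit of a connection pool comes from paying the session-creation cost once and reusing connections across requests. A toy sketch of the mechanism (the `make_conn` factory is hypothetical; real applications would use their driver's or container's pooling):

```python
import queue

class ConnectionPool:
    """Toy pool: connections are created once up front and reused, so
    each request avoids the cost of opening a fresh database session."""
    def __init__(self, make_conn, size):
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(make_conn())

    def acquire(self):
        return self._pool.get()    # blocks until a connection is free

    def release(self, conn):
        self._pool.put(conn)
```

A fixed-size pool also caps the number of concurrent sessions the database sees, which is one of the ways an application can cooperate with workload management.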

Until now, nothing that we have discussed has been dramatically different from any other Spring Framework / DAO / Web services / JEE / POJO tutorial. However, there is method to this madness: it was necessary to introduce the Teradata Masters out there to some new Java concepts while leading the Java Masters into the Teradata fold. While this has taken a reasonable number of weeks to accomplish, it should now be safe for the entire readership to enter the world of Teradata Workload Management and Query Banding together.

Lately there have been quite a few questions on the various Teradata forums relating to the use of Large Objects. Mostly these revolve around how to get Large Objects into the database. So it seems there is a bit of mystery surrounding Large Objects.

For myself, the mystery is just who is “Large” and to what does he (or she) “Object”? Perhaps "Large" objects to the veil of mystery being lifted which is what this series of articles is about - working with Large Objects.

In part 1, we will look at getting those Large Objects of yours into Teradata. One way or another we will get them in, kicking and screaming if need be (although it will be relatively painless, I promise), and we will do this despite any "Objections" "Large" may have.

Later in part 2, we will drag those Large Objects back out and use them in a web application.