
Hey all,

I have a TARA job which backs up everything under DBC, excluding DBC itself. The following is the relevant part of the ARC script, which works:

 

.
.
.
ARCHIVE DATA TABLES ("DBC") ALL,
EXCLUDE ("DBC"),
RELEASE LOCK,
.
.
.

 

Hello,
 
I have a backup which was taken using ABU 14.10 that consists of 4 data files (4-node configuration); the total size is about 1.5 TB. I want to restore this backup on another server which also has ABU 14.10 installed, but it is a 2-node configuration. Is there a way to restore it (using the COPY arcmain command) on the 2-node machine?
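For reference, a COPY-style restore script of the kind you mention might look like the sketch below; the logon, database names, and structure are placeholders, not taken from your job (COPY, unlike a straight RESTORE, is generally the route when the target configuration differs from the source):

-----
LOGON DBCID/UID,PW;
COPY DATA TABLES (TARGETDB) (FROM (SOURCEDB)),
RELEASE LOCK;
LOGOFF;
-----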

Hi All,
 
Our backup failed due to a "no room in database" error.
The offending database had a max perm of 54 MB and current perm of 10 MB.
My question: how much space, at minimum, is required for creating online logs during online archiving?
 
Thank you
Anup

I am getting the warning message below after the restore job completes; only if I run the BUILD command can I get my data back after the restore. Please help me with this.

 04:49:21  *** Warning ARC1244:NO BUILD automatically enabled. Run 

                     BUILD after all restore jobs.
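For reference, the BUILD the warning asks for can be run as its own ARC step once all restore jobs have finished; a minimal sketch, with a placeholder logon and database name:

-----
LOGON DBCID/UID,PW;
BUILD DATA TABLES (MYDATABASE) ALL,
RELEASE LOCK;
LOGOFF;
-----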

 

Thanks in Advance.

 

Ashok

Hi,
I ran a restore using DSA, and received an error in the postscript.  Where can I find the DSA logs so that I can further investigate my error?  I'm guessing it's on the BAR media server, but I need the directory path.
 
Thanks!

Hi.
 
Is there any limitation in the use of Online Backup in conjunction with Incremental Backup (AFTER JOURNAL)?
 
I'm working with a client that currently runs a weekly online backup job. Now, for just a few databases, this customer will need to implement incremental backups (AFTER JOURNAL).
 

Teradata Data Stream Architecture (DSA) is a new product offering for Teradata Backup and Restore, providing significant performance and usability improvements over existing backup and restore functionality.

Does the Teradata DBMS have a command-line function for creating offline backups, something roughly equivalent to "pg_dump" for Postgres or "mysqldump" for MySQL?

Or perhaps a GUI-based backup, like right-clicking a database name in SQL Server Management Studio?
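For context, the closest native equivalent is the arcmain command-line utility driven by an ARC script; a minimal sketch, with placeholder logon, database name, and output path:

-----
arcmain < backup.arc

where backup.arc contains:

LOGON DBCID/UID,PW;
ARCHIVE DATA TABLES (MYDATABASE) ALL,
RELEASE LOCK,
FILEDEF=(tddumps,/var/tddumps/mydatabase.out);
LOGOFF;
-----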

Thank you.

Hello.
I run a number of simple programs in SQL Assistant that insert rows into tables for predictive modeling (I'm a user, not a DBA).  Before each insert I make a copy of the target table as a backup.   The backup table's name is the target table's name with a "_bu" suffix.
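For what it's worth, that pre-insert copy can be done in a single statement with Teradata's CREATE TABLE ... AS; a sketch with placeholder names:

-----
CREATE TABLE mydb.target_table_bu AS mydb.target_table WITH DATA;
-----

WITH DATA copies the rows as well as the structure; WITH NO DATA would copy only the structure.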

I regularly perform the following type of backup to get a database-level backup so that I can lay down these databases first in a full restore. I can then kick off multiple restore jobs on different media servers to restore different tables in the same database, concurrently. We have found this to be a very efficient way to utilize all four of our m

Hello,
 
Thank you in advance for your time.
I want to create a table with the same structure as an existing one.
The problem with the CREATE TABLE command is that it does not take foreign keys into consideration (in my case).
(
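One common workaround, since CREATE TABLE ... AS reproduces only the basic structure: capture the complete DDL with SHOW TABLE, edit the table name, and resubmit it. A sketch with placeholder names:

-----
SHOW TABLE mydb.source_table;

/* Take the returned DDL, change the table name, and resubmit it;
   the DDL includes any REFERENCES (foreign key) clauses. */
-----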

I need the ability to backup a select number of tables daily that would be the building blocks for a restore and ELT run if there were any issues.  I also would like to do a full backup on the weekend of the set of tables that make up the base data for all time, these are the tables that all the reporting and other data objects are based on.
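Both pieces can be expressed as ARC scripts; a sketch of the daily job over a fixed table list, with placeholder logon, object names, and file definition:

-----
LOGON DBCID/UID,PW;
ARCHIVE DATA TABLES
(MYDB.BASETABLE1),
(MYDB.BASETABLE2),
RELEASE LOCK,
FILEDEF=(daily,/backups/daily.%UEN%.out);
LOGOFF;
-----

The weekend full backup would be a similar script archiving the whole database (ARCHIVE DATA TABLES (MYDB) ALL).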

Hi all, 
We are experiencing very slow backup performance through NetBackup: less than 5 MB/sec on a single stream.
The NetBackup and network configurations are being checked. Can you please advise what we can look at on the Teradata side, such as specific logs? Would increasing the response buffer size help at all?

Hi all,

I am not seeing any information on this, so I thought I'd ask the smart minds here ;-). Are Teradata backups consistent as of the backup start time or the end time?

 

Thanks,

Hi all,

 

I am new to Teradata (spinning up a POC) and have thus far not been able to locate the information I am seeking, so I apologize in advance for such basic questions, but I would greatly appreciate some assistance.

 

Fellows,

I am working on a process to identify backups within our environment.

To answer this, I designed the query below, but it only tells me which tables have identical structure.

It does not answer the big question.

What is the best practice for backing up all Teradata table structures and view structures?

Can it be done without using a stored procedure or UDF?

 

Is there a simpler way than submitting the following statements for every database?
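One stored-procedure-free approach is a BTEQ script that captures SHOW TABLE / SHOW VIEW output to a file; a sketch with placeholder logon, object names, and path (in practice the object list would be generated from DBC.TablesV):

-----
.LOGON DBCID/UID,PW
.EXPORT FILE = /backups/ddl_backup.sql
SHOW TABLE mydb.my_table;
SHOW VIEW mydb.my_view;
.EXPORT RESET
.LOGOFF
-----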

Hello folks,

I am reviewing some documentation and have done some research on the ability to protect my data using the Archive utility. My source for the documentation is here: http://www.info.teradata.com/edownload.cfm?itemid=101680007

For my situation, I need to have a few datasets (7 at most) to recover from. So given the following Archive script...

-----
LOGON DBCID/UID,PW;
ARCHIVE DATA TABLES (MYONLINEDATA) ALL,
RELEASE LOCK,
FILEDEF=(tddumps,/var/tddumps/dump.%UEN%.out);
LOGOFF;
-----

Hello All,
We are using NetBackup and TARA for backups.
An ETL job failed with RDBMS code 7595: "Operation not allowed: an online archive logging is currently active on Database." My investigation:
while the backup job is running on a database, it holds a lock on that database until the backup for that database completes. For example, the CAPACC01 database takes around 50 minutes to
back up; during this time, ETL jobs cannot make any changes to tables in that database and abort with the 7595 error. Once the backup job completes, the same ETL jobs run fine.

Here in EIM, the challenge was that the backup window was growing steadily. When it reached 11 hours, it became an issue: with continuous ETL loads and so many report users, it became nearly impossible to accommodate such a large backup window. Here, ETL pulls nearly real-time data from the legacy systems and puts it into the DW.

On investigation, it was found that a single table in the database takes almost 9 hours to back up. The size of the table was 6+ TB. We then planned to implement incremental backup for that big table.

The primary goals of a successful backup strategy are to ensure backup integrity, minimize backup and restore time, and reduce impact on the Teradata system. This presentation covers the connection, configuration, and tuning of the BAR hardware and software components; protecting and validating backups; and how different backup strategies can be applied to reduce system impact.

We get a lot of questions about BAR. Yes, we here at eBay went our own way a number of years ago, and it was one of the more successful things we’ve done – I have not even thought about them in quite some time. I’ll explore a bit about what we’ve done then wade through some of the architectural definitions and issues.

By introducing new, powerful portlets, adding and enhancing administrative capabilities, and achieving product internationalization, the Teradata Viewpoint 13.0.1 release continues to deliver state-of-the-art system management and monitoring tools, all within your web-browser.