
When comparing CPU consumption between DBQLogTbl.AMPCPUTime and DBC.Acctg per User/Hour/Day, we notice a difference in the CPU time recorded: sometimes a net difference of 1-2%, with Acctg coming out higher or lower than DBQL.
What causes this difference? Is it related to the query type or to internal system resources?
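For context, the comparison was done with queries along these lines (a sketch only; it assumes the standard DBC.DBQLogTbl and DBC.Acctg columns, and the exact grouping columns may differ by release and site):

```sql
-- DBQL side: AMP CPU per user, day, and hour (assumed standard DBQLogTbl columns)
SELECT UserName,
       CAST(StartTime AS DATE)      AS LogDate,
       EXTRACT(HOUR FROM StartTime) AS LogHour,
       SUM(AMPCPUTime)              AS DBQL_CPU
FROM DBC.DBQLogTbl
GROUP BY 1, 2, 3;

-- Acctg side: accumulated CPU per user (assumed standard Acctg columns)
SELECT UserName,
       SUM(CpuTime) AS Acctg_CPU
FROM DBC.Acctg
GROUP BY 1;
```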

Hi, I have put together a dashboard showing the AMPCPUTime consumed by our various workloads, based on the DBQL log table column AMPCPUTime. I want to add idle time to this so that management gets an accurate picture of what is going on in the system.

Is there any way to measure the total AMPCPUTime consumed by an MLOAD/FLOAD/FEXP job after it completes? The values from DBQL (the sum of AMPCPUTime for a particular LSN) do not seem right.
I also tried DBC.Acctg; there too, the values seem low.
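This is the kind of query I tried (a sketch; the LSN and date are hypothetical placeholders, and it assumes the load job was logged to the standard DBC.DBQLogTbl):

```sql
-- Total AMP CPU for one load job, tying its DBQL rows together by LSN
SELECT SUM(AMPCPUTime) AS TotalAMPCPU
FROM DBC.DBQLogTbl
WHERE LSN = 12345                                -- hypothetical LSN of the MLOAD/FLOAD/FEXP job
  AND CAST(StartTime AS DATE) = DATE '2012-01-15';  -- hypothetical run date
```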

Hi experts,

I understand that the CPU utilized by a query can be determined by adding AMPCPUTime and ParserCPUTime. If I get a value of 100, does this mean the CPU was processing the query for 100 seconds, or is the number expressed in some other unit?

According to my understanding, the value obtained by subtracting StartTime from FirstRespTime should equal AMPCPUTime, but this doesn't seem to hold. Can anyone please help me understand how all these values are related and calculated?
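For reference, this is roughly the query I am looking at (a sketch; it assumes standard DBC.DBQLogTbl columns, and note the interval subtraction gives wall-clock elapsed time, which is a different quantity from CPU time):

```sql
-- Per-query totals: CPU time versus elapsed wall-clock time
SELECT QueryID,
       AMPCPUTime + ParserCPUTime AS TotalCPU,
       (FirstRespTime - StartTime) HOUR(4) TO SECOND AS ElapsedTime  -- wall-clock, not CPU
FROM DBC.DBQLogTbl;
```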