Email Rowcount using Global variables derived from Audit

Hi

I have a requirement to email the row count and the sum of an amount column, along with the job status. The row count should be reported only if the job succeeds.

As suggested in one of the postings on this forum, I created a global variable in the job, defined an output local variable in the dataflow, and assigned the global variable to it. Then, using the audit feature, I counted the rows at the Query and at the target.

I made the audit fail every time (by using the rule $Count < 0). On audit failure, in the custom function, I assigned the count to the local variable and, in turn, to the global variable.

When I tried to print/email the value of the global variable, all I got was NULL. I am not sure what I am missing.

I can send the email from the audit failure function using smtp_to, but I don't think it's a good idea to send the email before the job completes.
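(For reference, smtp_to takes the recipient list, the subject, the message body, and the number of trace/error log lines to append. A minimal sketch; the address and the $GV_ROW_COUNT variable are placeholders, not my actual job:

# placeholder recipient and variable; trailing 0, 0 = append no trace/error log lines
smtp_to('etl.admin@example.com', 'Job ' || job_name() || ' row count', 'Row count: ' || $GV_ROW_COUNT, 0, 0);

)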

I also tried using gen_row_num and previous_row_value, but I am not sure how to pass the value out of the dataflow.

The last option I have is using scripts to count and sum the source and target. In that case I am not even using the DI features; it would be just a SQL script.

Is there a better method to report the row count and sum?

My Data Services version is XI 3.1 and Designer 12.1.0.0

Thanks
Poongs


Poongs (BOB member since 2009-07-20)

If I understand your problem correctly, you can use the table properties (attribute: Total_Number_Of_Rows_Processed).

You can reference this attribute of the table like this:
DATASTORE_NAME.OWNER_NAME.TABLE_NAME.Total_Number_Of_Rows_Processed
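For example, in a script after the dataflow you could assign it to a global variable and print it. A minimal sketch following the syntax above (DS_DWH, DBO and FACT_SALES are placeholder names for your own datastore, owner and table):

# placeholder datastore/owner/table names
$GV_ROW_COUNT = DS_DWH.DBO.FACT_SALES.Total_Number_Of_Rows_Processed;
print('Rows processed: ' || $GV_ROW_COUNT);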


ganeshxp :us: (BOB member since 2008-07-17)

Hi Ganesh

Thanks for your reply.

But I also need to report/email the sum of the amount column in the table.

How can I report that?

I have these options:
1. Audit feature - but I am not sure how to pass the value out of the dataflow.
2. Previous_row_value in another Query transform - again, I am not sure how the value can be passed out of the dataflow to a script that sends the email along with the job status.
3. Having one more script to count and sum the fields and report them through email.

Thanks
Poongs


Poongs (BOB member since 2009-07-20)

I have designed similar functionality for my ETL jobs - sending out row counts, sums and last-date information by e-mail at job completion.

I opted for your option 3. At the end of the job I added a workflow to email a job report summary; in it I placed a script task that called a number of custom functions to gather the required data and compile the e-mail.

(The custom functions in turn would query specific ETL job log tables in the DWH and run counts and sums over key fact tables and certain key dimensions).
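A minimal sketch of what such a report script could look like (the datastore, table, column and address names are placeholders, not my actual implementation; sql(), job_name() and smtp_to() are standard DS functions):

# gather a row count and a sum from a key fact table (placeholder names)
$GV_FACT_COUNT = sql('DS_DWH', 'select count(*) from FACT_SALES');
$GV_FACT_SUM   = sql('DS_DWH', 'select sum(AMOUNT) from FACT_SALES');

# compile and send the report
$GV_BODY = 'Job ' || job_name() || ' completed. ' ||
           'Row count: ' || $GV_FACT_COUNT || '; ' ||
           'Amount sum: ' || $GV_FACT_SUM;
smtp_to('etl.admins@example.com', 'ETL job report: ' || job_name(), $GV_BODY, 0, 0);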

Some additional but not required information. Feel free to skip :slight_smile:

All my workflows used a Try/Catch approach; when an error was encountered, the Catch block would call a custom error-handling function. This function would do a number of things, including populating some global variables to indicate that an error had happened and in which workflow, and writing this information into an error log table in the data warehouse.

Based on these flags, downstream workflows would or would not be executed, depending on which workflows had been affected by the error. The reasoning was that while certain areas would be negatively impacted, other unrelated areas of the data warehouse would still need to be populated. This all depended on business requirements.

(Example: in our case, it was a business requirement to load all P&L impacting transactional data even if the dimensions themselves were not completely loaded e.g. loan arrears data, mortgage account balances, bond trading transactions vs customer, loan or asset dimension data. The incorrect dimension data would be corrected during the weekend or at another point in time).

Based on the global variables indicating the occurrence of an error, the ETL process summary workflow would use the “successful load”, “load with low priority errors” or “critical error during load” e-mail template. If an error had been detected, the workflow would also call custom functions to query the error log tables and retrieve a list of all affected workflows.
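As a rough illustration of that flag-based template selection (the flag and variable names here are made up for the example, not my actual implementation):

# hypothetical flags, set earlier by the custom error-handling function
if ($GV_CRITICAL_ERROR = 1)
begin
    $GV_SUBJECT = 'Critical error during load: ' || job_name();
end
else
begin
    if ($GV_LOW_PRIORITY_ERROR = 1)
    begin
        $GV_SUBJECT = 'Load with low priority errors: ' || job_name();
    end
    else
    begin
        $GV_SUBJECT = 'Successful load: ' || job_name();
    end
end
smtp_to($GV_ADMIN_LIST, $GV_SUBJECT, $GV_BODY, 0, 0);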

Originally we were using the standard email functionality to send parts of the log file, but we found that we wanted more control over the layout and content of these emails. Also, the emails themselves were designed to warn the ETL administrators that an error had happened; they would then log into the Management Console to investigate the problem.

Sending chunks of the log file was just not going to be very helpful, especially when you try reading those e-mails on a Blackberry screen.


ErikR :new_zealand: (BOB member since 2007-01-10)

Erik

That was good information, thank you. We just started using BO in the last 3 months; there is still much more to explore.

I used the query below in a script after the dataflow to report the count and sum, and it worked. For this I also used the audit feature. (The right-hand values of the first two predicates and the label were left out of the original post; I have marked them as placeholders to fill in.)

select aud_inf.value
from al_history hist, al_lang lang,
     al_audit aud, al_audit_info aud_inf
where hist.service = '<your job name>'       -- placeholder
  and lang.name = '<your dataflow name>'     -- placeholder
  and aud_inf.audit_key = aud.object_key
  and aud.history_key = hist.object_key
  and aud.df_lang_key = lang.object_key
  and aud_inf.label = '<your audit label>'   -- placeholder
  and aud_inf.audit_key = (select max(aud_inf1.audit_key)
                           from al_audit_info aud_inf1
                           where aud_inf.label = aud_inf1.label)
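To pull that value into a job variable, the same kind of query can be wrapped in a sql() call against a datastore pointing at the DS repository. A simplified sketch (DS_REPO and $GV_AUDIT_LABEL are placeholder names; in a sql() string, {$var} substitutes the quoted value of a variable):

# $GV_AUDIT_LABEL holds the audit label to look up; DS_REPO points at the repository
$GV_AUDIT_VALUE = sql('DS_REPO', 'select aud_inf.value from al_audit_info aud_inf where aud_inf.label = {$GV_AUDIT_LABEL} and aud_inf.audit_key = (select max(audit_key) from al_audit_info where label = {$GV_AUDIT_LABEL})');
print('Audit value: ' || $GV_AUDIT_VALUE);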

I saw this in one of the posts. I am using SQL transforms for a couple of ETL jobs; for those, maybe I will do a select count/sum from the source and target tables and report it through the email in the script.

Thanks
Poongs


Poongs (BOB member since 2009-07-20)

https://wiki.sdn.sap.com/wiki/display/BOBJ/Audit+Points

My version is at the bottom of the page above.


Werner Daehn :de: (BOB member since 2004-12-17)