BusinessObjects Board

DI/DS Improvements & Suggestions

While using the Web-based Management Console and looking at the Batch Job status, I’d like to have a hover-over menu on the Job Name, with options for:

  • Show History - this would link you over to the Operational Job Execution History page (/odreports/odJobList.do) with the results already filtered to that particular job

  • Auto Documentation - this would link you over to the Auto Doc section (/smartdoc.jsp) with that job already chosen

It would save some clicks.


dnewton :us: (BOB member since 2004-01-30)

Flat files: a template flat file, analogous to a template table or template XML

Flat files: ability to automatically ignore columns to the right of those defined in the flat file format


dlaplant :us: (BOB member since 2006-03-27)

Just asking how close these features get:

https://wiki.sdn.sap.com:443/wiki/display/BOBJ/How+to+create+a+flat+file+format+from+a+Query

https://wiki.sdn.sap.com:443/wiki/display/BOBJ/Errorhandling+in+file+formats (Scenario 3)


Werner Daehn :de: (BOB member since 2004-12-17)

That works. Thanks.


dlaplant :us: (BOB member since 2006-03-27)

Idea: Some of the Table Compare options should be settable through global variables, to allow a single dataflow to handle both incremental loading and full re-loads.

In particular, if the “Detect Deleted Rows” and “Comparison Method” options could also read a variable, you could have a single dataflow that performs both an incremental load (no deletion detection; row-by-row comparison) and a full re-load (deletion detection; cached comparison table).

Without this, you need two separate dataflows.
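To illustrate the idea, here is a minimal Python sketch (the option names and load-mode values are invented for illustration, not actual DI syntax) of how one variable could drive both behaviors:

```python
def table_compare_options(load_type):
    """Pick Table Compare settings from a single load-mode variable."""
    if load_type == "INCREMENTAL":
        # incremental load: no deletion detection, row-by-row comparison
        return {"detect_deleted_rows": False, "comparison_method": "row_by_row"}
    if load_type == "FULL":
        # full re-load: detect deleted rows, cached comparison table
        return {"detect_deleted_rows": True, "comparison_method": "cached_comparison_table"}
    raise ValueError("unknown load type: " + load_type)
```

A single dataflow could then read the variable at run time instead of hard-coding the two option sets into two separate dataflows.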


dnewton :us: (BOB member since 2004-01-30)

Create, and include in the DI installation, a BO universe for reporting on the DI repository. This would seem like a slam-dunk integration point between these two BO products…


dnewton :us: (BOB member since 2004-01-30)

The Merge transform should have a radio button for “Merge All” versus “Merge Distinct”, similar to UNION ALL versus UNION in SQL itself.

Yes, you can add a Query transform after the Merge with the Distinct checkbox checked, but a radio-button choice would make for a simpler dataflow design (fewer steps) and would make it clearer to a new DI user what is going on.
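For readers new to the distinction, a small Python sketch (not DI code) of Merge All versus Merge Distinct:

```python
def merge(rows_a, rows_b, distinct=False):
    """Concatenate two row sets, optionally de-duplicating.

    distinct=False behaves like UNION ALL; distinct=True like UNION.
    """
    merged = list(rows_a) + list(rows_b)
    if not distinct:
        return merged
    seen, out = set(), []
    for row in merged:
        if row not in seen:  # keep the first occurrence only
            seen.add(row)
            out.append(row)
    return out
```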


dnewton :us: (BOB member since 2004-01-30)

My wishlist :

  1. To be able to get back into the designer for a lookup_ext() function once it has been implemented. It is a pain to decode the whole function manually, because one cannot enter designer mode again.

  2. To have more flexibility in the lookup_ext() function, such as support for ‘AND’/‘OR’ (or both) in the conditions box. As far as I know, all the conditions in the box are joined with ‘AND’, which limits the usability of the function.

  3. To have a small “page _ of _” box at the beginning/end of the Administrator page for jumping straight to a page, rather than clicking the next-page button that many times. :hb:

  4. To support (+) notation, and/or a per-join outer join box, in the Query transform. If 5 tables are to be joined and the query requires an outer join on only 2 of them, one is still forced to apply the outer join across all the tables/queries.


ds41user :india: (BOB member since 2007-09-03)

  1. Include more functions in the built-in function library. BO XI R2 WebI has wonderful functions, and DI seems to fall short of comparably useful ones.

ds41user :india: (BOB member since 2007-09-03)

  1. I never use lookup_ext in a mapping but as a new-function-call, for exactly that reason.

https://wiki.sdn.sap.com:443/wiki/display/BOBJ/lookup_ext

  1. Use two lookups for the OR. If the condition should be col1 = 1 OR col1 = 2, you can have two lookups, one per condition, and in the next step decide which of the two you actually take via an ifthenelse(). Not perfect, I know, but with ORs you would need to support brackets as well, and then the GUI gets really complex.
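The two-lookup workaround can be sketched in Python (plain functions standing in for lookup_ext() and ifthenelse(); the table and column names below are made up):

```python
def lookup(table, key_col, key_val, return_col):
    """Return return_col of the first row where key_col equals key_val."""
    for row in table:
        if row.get(key_col) == key_val:
            return row.get(return_col)
    return None

def lookup_or(table, key_col, val_a, val_b, return_col):
    """Emulate 'key_col = val_a OR key_col = val_b' with two lookups."""
    hit_a = lookup(table, key_col, val_a, return_col)
    hit_b = lookup(table, key_col, val_b, return_col)
    # the ifthenelse() step: take the first lookup that found something
    return hit_a if hit_a is not None else hit_b
```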

  2. Wait one or two months and then DS 12.2 supports that!

  3. True.

  4. True. Can you list the most important ones for us?


Werner Daehn :de: (BOB member since 2004-12-17)

A new transform, something different from History Preserving, more like an “Existing Column Preserve”, that does the following:

  • If the data in the target table has a value, leave it alone
  • If the data in the target table is null, then fill in the incoming value

In this transform, you could click on each column and the tab/dialog area below would have options for:

  • Normal (pass-through)
  • Preserve if not null
  • User-specified condition (and you could put in your own WHERE-like criteria)

This is useful for columns where you always want to preserve a value, like an “initial load date”, even if the row is later updated (rather than inserted). But you want to have the same dataflow, and not have to resort to bringing in the target table as a source.

In fact, now that I think of it, this could be added into the Table Compare transform. Since you already have to read the entire target table’s row for that transform to work, you could have another tab in the table compare that lets you specify this column-by-column behavior.
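A Python sketch of the per-column rules described above (the mode names are invented for illustration):

```python
def preserve_existing(target_row, incoming_row, modes):
    """Merge one incoming row into one target row, column by column.

    Modes: "normal" passes the incoming value through;
    "preserve_if_not_null" keeps the target value unless it is None;
    a callable is a user-specified condition returning True to preserve.
    """
    out = {}
    for col, incoming in incoming_row.items():
        mode = modes.get(col, "normal")
        existing = target_row.get(col)
        if mode == "preserve_if_not_null":
            out[col] = existing if existing is not None else incoming
        elif callable(mode):
            out[col] = existing if mode(existing, incoming) else incoming
        else:  # "normal" pass-through
            out[col] = incoming
    return out
```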


dnewton :us: (BOB member since 2004-01-30)

How about the ability to have multiple instances of the Designer open at the same time, with each one pointing to a different repository?

For example, I’m in the development environment and want to go look at something in the production repository. Currently I have to exit Designer and restart it, point it to the prod system, look at what I need to (assuming that CRS hasn’t kicked in and I can remember what I was after), then lather-rinse-repeat to go back to my development environment.


nitrobrother :us: (BOB member since 2004-09-08)

How about a “View Where Used” ability for variables! Right now, is there a way to find this?


urk :us: (BOB member since 2005-11-29)

That would be fantastic. We recently tried to clean up our warehouse load because there were tons of variables defined, some of them old, but it was hard to find out where they were used.

Can you enter a new Message with SAP support and suggest this?


dnewton :us: (BOB member since 2004-01-30)

I had a requirement to compute an aggregated fact table, and only show the Top customers or Top products in the result. This was a minor pain to do, as it required several Query steps in succession:

  • Aggregate the measures
  • Sort by the customer/product and then by the Revenue, in DESC order
  • Add a “gen_row_num_by_group” column
  • Then filter to keep just the top 10.

If there was a “Top” transform, it could do all of the steps above. Just specify which columns are the dimensions, which are the measures, and whether you want the highest or lowest values, and how big the TOP should be. It could do all of these in one step.
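The four steps above can be sketched in Python (the column names are placeholders):

```python
from collections import defaultdict

def top_n_per_group(rows, group_col, dim_col, measure_col, n=10, highest=True):
    """Aggregate a measure, rank rows within each group, keep the top n."""
    # step 1: aggregate the measure per (group, dimension) pair
    totals = defaultdict(float)
    for row in rows:
        totals[(row[group_col], row[dim_col])] += row[measure_col]
    # step 2: sort by group, then by the measure (descending for "top")
    sign = -1.0 if highest else 1.0
    ordered = sorted(totals.items(), key=lambda kv: (kv[0][0], sign * kv[1]))
    # steps 3-4: number rows within each group (like gen_row_num_by_group)
    # and keep only the first n per group
    out, counts = [], defaultdict(int)
    for (group, dim), total in ordered:
        counts[group] += 1
        if counts[group] <= n:
            out.append((group, dim, total))
    return out
```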


dnewton :us: (BOB member since 2004-01-30)

KattaUR wrote: 
How about "View Where Used" ability for variables!!! right now is there a way to find it? 
That would be fantastic. We recently tried to clean up our warehouse load because there were tons of variables defined, some of them old, but it was hard to find out where they were used. 

Can you enter a new Message with SAP support and suggest this?

I made a suggestion to SAP support about this; hopefully they will make this feature available in a future release.


urk :us: (BOB member since 2005-11-29)

Here is the reply from SAP support:

We are aware of this limitation; customers have asked for a "where used" functionality for variables in the past. Unfortunately, this is not a small feature to implement, so it is not possible to put it into an intermediate release; it needs to be part of a major release. (It requires changes in the way we store variables in the repository: unlike what we do with jobs, dataflows and workflows, for variables we don't store the relationship between the variable and the places where it is used. So a "where used" today is not possible - or would require a full scan of the textual representation of a job.)
 
As a workaround you could query our repository and scan the AL_LANG and AL_LANGTEXT tables for the variable name. But I realize this is not a user-friendly solution at all ;-(
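A sketch of that workaround in Python, using sqlite3 as a stand-in for the repository database. The AL_LANGTEXT column names used here (PARENT_OBJID, TEXT_VALUE) are assumptions; verify them against your repository version before relying on this.

```python
import sqlite3

def find_variable_usage(conn, var_name):
    """Crude 'where used': scan each object's stored text for a variable name.

    Column names are assumed, not confirmed against a real repository.
    """
    cur = conn.execute(
        "SELECT DISTINCT PARENT_OBJID FROM AL_LANGTEXT WHERE TEXT_VALUE LIKE ?",
        ("%" + var_name + "%",),
    )
    return [row[0] for row in cur.fetchall()]

# demo against an in-memory stand-in table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE AL_LANGTEXT (PARENT_OBJID INTEGER, TEXT_VALUE TEXT)")
conn.execute("INSERT INTO AL_LANGTEXT VALUES (1, 'uses $G_LOAD_DATE here')")
conn.execute("INSERT INTO AL_LANGTEXT VALUES (2, 'no variables here')")
```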

urk :us: (BOB member since 2005-11-29)

The facility to lock a datastore so that it won’t be overwritten by an ATL import - or at least an “Are you sure?” prompt.


gpn01 :uk: (BOB member since 2004-11-30)

My apologies if this was already mentioned before!

  • Job chaining:

The ability to call jobs on the same or different job servers from other jobs - with success/fail/waiting status and associated error alerting/reporting options.

This allows you to have master/child jobs; for some of the EDWs I’ve built, it just isn’t possible to stuff everything into a single monster job.

You can already do this using command-line execution, but it is quite a bit of work to develop a framework around it, and it would be very useful if Data Services offered this “natively”.

  • Job, Workflow, Dataflow Process Execution metadata

There is a lot of metadata stored in the DI/DS repository when jobs are executed, but I would like the ability to capture and store that metadata elsewhere - for instance, in an ETL process schema in your data warehouse environment.

The execution data is available through the Web Admin interface, but I also store this data in my data warehouses for reporting on data-loading performance, etc.

You can already store this information by setting it up manually in your workflow designs, but it would be nice if this were already built in.

It would also be great if the error description could be exposed from within Data Services. Right now, I can capture that an error happened (as part of a Try/Catch), but it’s tricky to get the error message/description itself - you have to dive into the log file and retrieve it, which is a pretty messy affair.

Yes, I know that’s all in the Admin interface, but it’s part of the overall process-logging schema. I always want to know which records were inserted/updated/deleted under which process execution, and if errors happened, I also want my reports to show the error descriptions without having to give DS Admin monitoring access to the consumers of the daily loading reports (data stewards and division-level BAs).


ErikR :new_zealand: (BOB member since 2007-01-10)

Since it’s been mentioned a couple of times: The ability to schedule Profiling jobs in an easy way.


dnewton :us: (BOB member since 2004-01-30)