My mistake on this one - there is already a built-in mechanism for uploading all the ABAPs to SAP, without manually using the SAP GUI.
Not sure in which version this was introduced, but it dramatically cuts down the time it takes to load the ABAPs.
Just make sure the upload details are set correctly in the datastore (i.e. package, request ID, task ID) and then right-click on the R3 dataflows and choose the generate ABAP option.
Best of all, you can highlight multiple flows and load the ABAPs in a single go.
I didn’t know this option existed until I went looking for ways to avoid manually loading the ABAPs. In the past I’ve gone into the SAP GUI and manually loaded the ABAP code that SDS generates.
I’m using v3.2 - don’t know when this new option was introduced though.
Under the R3 datastore there is a section called ‘Upload attributes’, which is basically just the same details you would give when creating a program - the package, request ID, task ID, status, and application.
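To give an idea of what goes in there, here are some illustrative values (these are made up - your Basis team will supply the real package and transport):

Package:    Z_RAPIDMART   (or $TMP for local, non-transportable objects)
Request ID: DEVK900123    (an open workbench transport request)
Task ID:    DEVK900124    (a task under that request)

Status and application are the same values you would pick when creating the program manually in SE38.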
Once you have that in place, just right-click on the R3 flows for which you want to upload the ABAP. There is an option ‘Generate ABAP Code…’.
This option opens a panel in which you select the directory where the generated ABAP will be created, along with a check box for uploading the ABAP.
I don’t know how it works under the hood, but it then goes away and uploads the code.
From previous experience, it took a single person more than three days to get the 400-plus ABAPs loaded for all the rapidmarts. When I tried this option they were all done in less than an hour.
Something to watch out for with the automated upload process.
SDS global variables used in R3 flows normally get converted to ABAP parameters. For example, $G_DEFAULT_TEXT would become $PARAM1 in the ABAP.
If I generate and upload the ABAP from SDS without having the Job open in which the flow is a child object, I get an error message - ‘Referencing undeclared variable’.
The problem is that the program still gets uploaded, but the SDS variable does not get converted to an ABAP parameter. The SDS variable is referenced literally in the ABAP, which causes it to fail at run-time.
If I then regenerate the ABAP with the Job open, everything works fine and the SDS variable is correctly converted in the ABAP code.
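A rough sketch of the difference in the generated code (the SELECT and the table/field names are made up for illustration - only the variable handling matters):

* Generated with the Job open: the global variable becomes a
* placeholder that SDS fills in with the variable's value at run-time.
SELECT * FROM COOI WHERE SGTXT = $PARAM1.

* Generated without the Job open: the SDS variable name is left
* in the code as-is, which is not valid ABAP and fails at run-time.
SELECT * FROM COOI WHERE SGTXT = $G_DEFAULT_TEXT.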
When you say ‘R3 Datastore’ do you mean an SAP Applications datastore?
My reason for asking is that I’ve tried your technique on an SAP BW source datastore and I don’t get the options that you describe, which is a shame as it sounds like really useful functionality.
Yes, when I said R3 datastore I meant SAP Applications datastore.
Under the advanced options on the datastore configuration there is a section called ‘Upload Attributes’…although I am not sure in which version this was first introduced, as I used to load the programs manually.
This saves so much time, especially when deploying the rapidmarts, so it’s worth a look.
I had a similar post running in parallel, and thanks to ganeshxp who pointed me to this post.
dcstevenson, the feature you have mentioned is very cool. However, I am getting a similar error on the global variable, but in my case the program is not uploaded.
I tried opening the job which has this ABAP dataflow in Designer and tried Generate ABAP with upload checked. It still fails with the following error:
RES-020107: |DATAFLOW ZAWCOOI,DF_R3_COOI,SAP_DS,ETR_TADIR_INTERFACE:6
Referencing undeclared variable <$G_DEFAULT_TEXT>. Please declare the variable in the context it is being used in.
The ABAP program for ABAP data flow <DF_R3_COOI> (datastore <SAP_DS>) was not uploaded: error code <TR_TADIR_INTERFACE:6>.
A global variable is defined in a job, so when you open the ABAP dataflow from the object library without having a job opened first, Designer does not know whether that variable is a valid one or not. So the solution for you is simple: open the job first, then the ABAP dataflow.
For me the consequence is that global variables should be repo global, not job global.
As per your instruction, I opened the job in Designer and then opened the ABAP data flow.
Unfortunately, I am still getting the same error when I try to Generate and Upload from the ABAP dataflow.