I wouldn’t go that far. The first question is why the DataServices process consumes so much memory, assuming it actually does.
Either you cache large amounts of data deliberately, because you have to for performance reasons, or you cache it accidentally, in which case the job would be even faster without the caching.
I have a similar situation: we have more than 80 scheduled jobs and we are getting this error. We are processing master jobs that each run for more than 3 hours.
We will not be able to make job-level changes right now. We did find a parameter in the DSConfig.txt file:
MAX_64BIT_PROCESS_VM_IN_MB=4096
Can we increase the virtual memory limit in this parameter to a higher value, say by another 1 GB?
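For reference, a change of that size would presumably look like the two lines below in DSConfig.txt. The [AL_Engine] section name is an assumption about where the engine parameters sit in that file, and the new value is simply 4096 MB + 1024 MB = 5120 MB; back up DSConfig.txt before editing it.

[AL_Engine]
MAX_64BIT_PROCESS_VM_IN_MB=5120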
The only way to address this without changing the current jobs is to increase the amount of memory available on the machine.
However, if a job falls over due to memory usage when you normally have 16 GB available and you double that to 32 GB, who is to say that 32 GB will be enough?
A job consuming too much memory needs to be redesigned.
And with, for example, the use of a Data_Transfer transform (transfer type set to Table), there are quick-and-dirty workarounds available.
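To illustrate the general idea behind that workaround outside of Data Services: rather than holding a large intermediate result set in memory, you stage it to a database table in chunks and let the next step read from that table. The sketch below is only a hedged analogy in Python with sqlite3, not the Data_Transfer transform itself; the table name, function name, and chunk size are made up for illustration.

import sqlite3

CHUNK_SIZE = 10_000  # illustrative number of rows kept in memory at any one time

def stage_rows(conn, rows):
    # Write an iterable of (id, value) rows to a staging table in chunks,
    # so the full intermediate result never has to fit in memory at once.
    conn.execute("CREATE TABLE IF NOT EXISTS stage_intermediate (id INTEGER, value TEXT)")
    buffer = []
    for row in rows:
        buffer.append(row)
        if len(buffer) >= CHUNK_SIZE:
            conn.executemany("INSERT INTO stage_intermediate VALUES (?, ?)", buffer)
            conn.commit()
            buffer.clear()
    if buffer:  # flush the final partial chunk
        conn.executemany("INSERT INTO stage_intermediate VALUES (?, ?)", buffer)
        conn.commit()

conn = sqlite3.connect("staging.db")
stage_rows(conn, ((i, f"row-{i}") for i in range(100_000)))
# Downstream steps then select from stage_intermediate in the database
# instead of pulling everything back into the engine's memory.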