
Re: How to duplicate 1000 records without a time out?

Hi Thomas,

I might have misread your requirement here... :-)

Where do you need the "backup" data to end up and how are you doing it currently?

The fastest way to get data out of Remedy is with an SQL query directly on the DB, but even that may time out if there is too much data.

At the very least, you will need a two-phase approach: a staging form of some kind that kicks off an external export process, so the "export" work happens outside of Remedy and avoids the timeout issue.
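To illustrate that pattern, here is a minimal Python sketch of the external side of such an export, processing rows in small chunks so no single fetch runs long enough to time out. The table name, columns, and the use of an in-memory SQLite database are all stand-ins for illustration; a real export would connect to the actual AR System database (or use the Remedy APIs):

```python
import csv
import io
import sqlite3

CHUNK_SIZE = 100  # keep each fetch small so no single call runs long

def export_in_chunks(conn, table, out):
    """Stream all rows of `table` to `out` as CSV, one chunk at a time."""
    cur = conn.execute(f"SELECT * FROM {table}")
    writer = csv.writer(out)
    writer.writerow([d[0] for d in cur.description])  # header row
    exported = 0
    while True:
        rows = cur.fetchmany(CHUNK_SIZE)
        if not rows:
            break
        writer.writerows(rows)
        exported += len(rows)
    return exported

# Demo: a fake source table standing in for the Remedy form's DB table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE T100 (entry_id TEXT, short_desc TEXT)")
conn.executemany(
    "INSERT INTO T100 VALUES (?, ?)",
    [(f"INC{i:09d}", f"Record {i}") for i in range(1000)],
)
buf = io.StringIO()
count = export_in_chunks(conn, "T100", buf)
print(count)  # number of data rows exported
```

The point of `fetchmany` here is that the export never holds all 1000 rows in one call, which is the same idea as keeping the heavy work off the user's synchronous Remedy transaction.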

Best Regards,

On Tue, Oct 1, 2019 at 9:46 AM Theo Fondse <theo.fondse@gmail.com> wrote:
Hi Thomas,

Yes, this could be done by using a staging form.
You need a 3-phase approach.

  • Phase 1 submits ONLY the EntryID of the source record to the staging form and returns a confirmation message to the user.
  • Phase 2 is done by an escalation running on the staging form. It triggers workflow that collects the rest of the field data to be duplicated and then does a Push Fields action to create the new duplicate record on the source form.
  • Phase 3 is done by archiving that deletes the record from the staging form overnight.
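The three phases above can be sketched in plain Python. In Remedy this would of course be done with active links, an escalation, and archiving policy; the dictionaries and function names below are illustrative stand-ins, not real AR System APIs:

```python
# Stand-ins for the source form and the staging form.
source_form = {f"INC{i:09d}": {"Short Description": f"Record {i}"}
               for i in range(1000)}
staging_form = {}

def phase1_submit(entry_ids):
    """Phase 1: push ONLY the EntryIDs to staging; return to the user at once."""
    for eid in entry_ids:
        staging_form[eid] = {"processed": False}
    return f"{len(entry_ids)} records queued for duplication."

def phase2_escalation():
    """Phase 2: the escalation collects the full field data for each staged
    EntryID and pushes a new duplicate record onto the source form."""
    for i, (eid, row) in enumerate(staging_form.items()):
        if row["processed"]:
            continue
        fields = dict(source_form[eid])   # collect the rest of the field data
        source_form[f"DUP{i:09d}"] = fields  # push-fields: create the duplicate
        row["processed"] = True

def phase3_archive():
    """Phase 3: overnight archiving deletes the processed staging records."""
    for eid in [e for e, r in staging_form.items() if r["processed"]]:
        del staging_form[eid]

msg = phase1_submit(list(source_form))  # user sees this message immediately
phase2_escalation()                     # runs later, server-side
phase3_archive()                        # runs overnight
```

The key property is that phase 1 does almost no work per record, so the user's transaction returns quickly; all 1000 duplications happen asynchronously in phase 2.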

Best Regards,

On Tue, Oct 1, 2019 at 9:10 AM Thomas Miskiewicz <tmiskiew@gmail.com> wrote:
Hi Listers

A user clicks a button to take a backup of 1000 records. Is there a way to achieve that using filter workflow without causing a timeout for the user?

Thank you

ARSList mailing list