Failed to remove session 1e13f3d6-f4d1-4d23-94cb-9f85ca6a7fdd for user testera because actions are still running for this session.

Hi,

We are currently testing a new import procedure with a large amount of data (200,000 rows) and a couple of web service calls per row. We use the Java action executeMicroflowAsUser_2 from Community Commons to import the data in batches. This is done in the Process Queue in a single QueuedAction. Creating a separate QueuedAction for each import row does not work: the actions are created and placed in the queue, but the Process Queue becomes very slow.

When we run an import, we keep getting a warning with the text 'Failed to remove session '1e13f3d6-f4d1-4d23-94cb-9f85ca6a7fdd' for user 'testera' because actions are still running for this session. Client access has been disabled. Session will be attempted to be removed again in 300 seconds.' It looks like this delays the import procedure, since the first batch finishes in 1:15 minutes and the last one in 4:33 minutes.

Can anyone help us with this issue? Or does someone have a good solution for processing large amounts of data?
asked
2 answers

As a first measure when processing large amounts of data, I use offset/limit on the retrieve so the runtime (MBS) memory stays under control. To avoid building up memory on the database server (in the default behaviour the whole import is one transaction that is committed or rolled back in full at the end), you can use EndTransaction, also available in Community Commons, to set 'hard' commit points along the way (if your transaction scope allows this). Alternatively, you might move to a more asynchronous setup using the Process Queue module, or is that already in place? That is not clear to me from your question.
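To make this concrete, here is a minimal sketch of a Mendix Java action that combines both suggestions: retrieving in offset/limit batches and committing each batch in its own transaction. It assumes the Mendix 7+ Core API (Core.retrieveXPathQuery with amount/offset, Core.microflowCall, and the transaction methods on IContext that Community Commons' StartTransaction/EndTransaction wrap); the entity MyModule.ImportRow and the microflow MyModule.ProcessRow are hypothetical placeholders for your own model.

```java
import java.util.Collections;
import java.util.List;

import com.mendix.core.Core;
import com.mendix.systemwideinterfaces.core.IContext;
import com.mendix.systemwideinterfaces.core.IMendixObject;

public class BatchedImport {

    private static final int BATCH_SIZE = 1000;

    // Processes all ImportRow objects in offset/limit batches, committing each
    // batch separately so the database server never has to keep one huge
    // transaction open for the whole 200,000-row import.
    public static void run(IContext context) throws Exception {
        int offset = 0;
        while (true) {
            // Offset/limit retrieve: only one batch is in runtime memory at a time.
            // Assumes rows are not deleted during processing, so the offset stays valid.
            List<IMendixObject> batch = Core.retrieveXPathQuery(context,
                    "//MyModule.ImportRow",                    // hypothetical entity
                    BATCH_SIZE, offset, Collections.emptyMap());
            if (batch.isEmpty())
                break;

            context.startTransaction();                        // cf. CommunityCommons StartTransaction
            try {
                for (IMendixObject row : batch)
                    Core.microflowCall("MyModule.ProcessRow")  // hypothetical per-row microflow
                        .withParam("ImportRow", row)
                        .execute(context);
                context.endTransaction();                      // 'hard' commit point per batch
            } catch (Exception e) {
                context.rollbackTransAction();                 // undo only the failed batch
                throw e;
            }
            offset += BATCH_SIZE;
        }
    }
}
```

If you prefer to stay in the microflow editor, the same commit points can be set with the Community Commons StartTransaction/EndTransaction Java actions around each batch instead of custom Java.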

answered

Hi Rens,

We also ran into issues with asynchronous processing. It is important to have the latest version of Process Queue, as it includes fixes for session issues; check the release notes: https://appstore.home.mendix.com/link/app/393/Mendix/Process-Queue


If you want to have multiple objects processed in a single queued action, you can use the QueuedAction object.

1. Create an association between the QueuedAction and the entity that you are processing in the queue.

2. Then, before scheduling the process, associate the batch of objects that you want to process with the QueuedAction.

           Use $Limit and $Offset to get the objects you want to process in batches, just as Remco suggests.

3. In the processing microflow, retrieve the objects over the association from the QueuedAction and process them, as shown in the sketch below.
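For illustration, a minimal sketch of step 3 as a Java action; in practice you would normally do this with a retrieve-over-association activity in the processing microflow itself. The entity MyModule.ImportRow, the association MyModule.ImportRow_QueuedAction, and the microflow MyModule.ProcessRow are hypothetical names, and the sketch assumes the Mendix 7+ Core API.

```java
import java.util.List;

import com.mendix.core.Core;
import com.mendix.systemwideinterfaces.core.IContext;
import com.mendix.systemwideinterfaces.core.IMendixObject;

public class ProcessQueuedBatch {

    // Step 3: retrieve the objects that were associated with this QueuedAction
    // before it was scheduled (step 2) and process them one by one.
    public static void process(IContext context, IMendixObject queuedAction) throws Exception {
        // Retrieve over the (hypothetical) association ImportRow_QueuedAction.
        List<IMendixObject> batch = Core.retrieveXPathQuery(context,
                "//MyModule.ImportRow[MyModule.ImportRow_QueuedAction = "
                        + queuedAction.getId().toLong() + "]");

        for (IMendixObject row : batch) {
            Core.microflowCall("MyModule.ProcessRow")   // hypothetical per-row microflow
                .withParam("ImportRow", row)
                .execute(context);
        }
    }
}
```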

Hope this helps,

Andrej

answered