JPPF Issue Tracker
JPPF (jppf)
July 16, 2019
bug_report_tiny.png 02:42  Bug report JPPF-596 - Outdated dependency - net.jpountz.lz4 should be replaced with org.lz4.lz4-java
lolocohen : Issue closed
July 15, 2019
bug_report_tiny.png 12:51  Bug report JPPF-596 - Outdated dependency - net.jpountz.lz4 should be replaced with org.lz4.lz4-java
zorba128 : Issue created
The referenced library is dead - it was moved to another namespace a long time ago.
This causes some conflicts with more up-to-date libraries.

https://mvnrepository.com/artifact/net.jpountz.lz4/lz4
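
For reference, the change amounts to swapping the artifact coordinates (the versions shown are illustrative; use the latest org.lz4 release):

net.jpountz.lz4:lz4 (last published as 1.3.0)  ->  org.lz4:lz4-java (e.g. 1.6.0)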
feature_request_tiny.png 00:55  Feature request JPPF-589 - Docker images for JPPF components
lolocohen : Issue closed
task_tiny.png 00:20  Task JPPF-595 - Remove Android and .Net sections of the documentation
lolocohen : Issue closed
July 14, 2019
icon_milestone.png 18:12 JPPF 4.0
A new milestone has been reached
task_tiny.png 12:05  Task JPPF-595 - Remove Android and .Net sections of the documentation
lolocohen : Issue created
Since we are no longer developing / maintaining the Android and .Net integrations, we propose to remove them from the documentation altogether. The source code will be kept in the repository for now.
July 09, 2019
bug_report_tiny.png 05:23  Bug report JPPF-594 - Deadlock in the driver
lolocohen : Issue closed
July 08, 2019
bug_report_tiny.png 23:27  Bug report JPPF-594 - Deadlock in the driver
lolocohen : Issue created
During a run of our Jenkins-automated tests on v6.1, the following deadlock was detected and captured:
Deadlock detected

- thread id 26 "JobScheduler" is waiting to lock java.util.concurrent.locks.ReentrantLock$NonfairSync@58c56e8e which is held by thread id 21 "JPPF-0007"
- thread id 21 "JPPF-0007" is waiting to lock java.util.concurrent.locks.ReentrantLock$NonfairSync@1964bdfb which is held by thread id 26 "JobScheduler"

Stack trace information for the threads listed above

"JobScheduler" - 26 - state: WAITING - blocked count: 5 - blocked time: 4 - wait count: 7 - wait time: 20220
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.locks.ReentrantLock$NonfairSync@58c56e8e
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireQueued(AbstractQueuedSynchronizer.java:870)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquire(AbstractQueuedSynchronizer.java:1199)
at java.util.concurrent.locks.ReentrantLock$NonfairSync.lock(ReentrantLock.java:209)
at java.util.concurrent.locks.ReentrantLock.lock(ReentrantLock.java:285)
at org.jppf.server.protocol.AbstractServerJobBase.getTaskCount(AbstractServerJobBase.java:112)
at org.jppf.server.queue.JPPFPriorityQueue.nextBundle(JPPFPriorityQueue.java:224)
at org.jppf.server.nio.nodeserver.async.AsyncJobScheduler.prepareJobDispatch(AsyncJobScheduler.java:163)
at org.jppf.server.nio.nodeserver.async.AsyncJobScheduler.dispatch(AsyncJobScheduler.java:121)
- locked java.util.LinkedHashSet@1cc4404f
at org.jppf.server.nio.nodeserver.async.AsyncJobScheduler.run(AsyncJobScheduler.java:73)
at java.lang.Thread.run(Thread.java:745)

Locked ownable synchronizers:
- java.util.concurrent.locks.ReentrantLock$NonfairSync@1964bdfb

"JPPF-0007" - 21 - state: WAITING - blocked count: 2 - blocked time: 3 - wait count: 70 - wait time: 20813
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.locks.ReentrantLock$NonfairSync@1964bdfb
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireQueued(AbstractQueuedSynchronizer.java:870)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquire(AbstractQueuedSynchronizer.java:1199)
at java.util.concurrent.locks.ReentrantLock$NonfairSync.lock(ReentrantLock.java:209)
at java.util.concurrent.locks.ReentrantLock.lock(ReentrantLock.java:285)
at org.jppf.server.queue.JPPFPriorityQueue.removeBundle(JPPFPriorityQueue.java:263)
at org.jppf.server.queue.RemoveBundleAction.run(RemoveBundleAction.java:62)
at org.jppf.server.protocol.AbstractServerJob.done(AbstractServerJob.java:333)
at org.jppf.server.protocol.AbstractServerJob.setSubmissionStatus(AbstractServerJob.java:418)
at org.jppf.server.protocol.BundleCompletionListener.lambda$bundleEnded$0(BundleCompletionListener.java:83)
at org.jppf.server.protocol.BundleCompletionListener$$Lambda$29/877663927.run(Unknown Source)
at org.jppf.server.protocol.BundleCompletionListener.bundleEnded(BundleCompletionListener.java:85)
at org.jppf.server.protocol.ServerTaskBundleClient.bundleEnded(ServerTaskBundleClient.java:425)
at org.jppf.server.nio.client.JobEntry.jobEnded(JobEntry.java:73)
at org.jppf.server.nio.client.AsyncClientMessageHandler.jobResultsSent(AsyncClientMessageHandler.java:124)
at org.jppf.server.nio.client.CompletionListener.taskCompleted(CompletionListener.java:62)
at org.jppf.server.protocol.ServerTaskBundleClient.fireTasksCompleted(ServerTaskBundleClient.java:416)
at org.jppf.server.protocol.ServerTaskBundleClient.resultReceived(ServerTaskBundleClient.java:270)
- locked org.jppf.server.protocol.ServerTaskBundleClient@1846b48a
at org.jppf.server.protocol.ServerJob.lambda$handleCancelledTasks$3(ServerJob.java:300)
at org.jppf.server.protocol.ServerJob$$Lambda$33/445459200.accept(Unknown Source)
at org.jppf.utils.collections.CollectionMap.forEach(CollectionMap.java:155)
at org.jppf.server.protocol.ServerJob.handleCancelledTasks(ServerJob.java:300)
at org.jppf.server.protocol.ServerJob.cancel(ServerJob.java:316)
at org.jppf.server.nio.client.AsyncClientContext.cancelJobOnClose(AsyncClientContext.java:238)
at org.jppf.server.nio.client.AsyncClientContext.lambda$cancelJobsOnClose$0(AsyncClientContext.java:209)
at org.jppf.server.nio.client.AsyncClientContext$$Lambda$31/1598015078.accept(Unknown Source)
at java.util.concurrent.ConcurrentHashMap.forEach(ConcurrentHashMap.java:1597)
at org.jppf.server.nio.client.AsyncClientContext.cancelJobsOnClose(AsyncClientContext.java:208)
at org.jppf.server.nio.client.AsyncClientContext.handleException(AsyncClientContext.java:87)
at org.jppf.server.nio.client.AsyncClientMessageHandler.jobReceived(AsyncClientMessageHandler.java:73)
at org.jppf.server.nio.client.AsyncClientMessageReader.handleMessage(AsyncClientMessageReader.java:69)
at org.jppf.server.nio.client.AsyncClientMessageReader$$Lambda$22/2144344426.execute(Unknown Source)
at org.jppf.nio.NioMessageReader$HandlingTask.run(NioMessageReader.java:134)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Locked ownable synchronizers:
- java.util.concurrent.locks.ReentrantLock$NonfairSync@58c56e8e
- java.util.concurrent.ThreadPoolExecutor$Worker@5b1d2887

--------------------------------------------------------------------------------

"main" - 1 - state: WAITING - blocked count: 2 - blocked time: 0 - wait count: 1 - wait time: 21142
at java.lang.Object.wait(Native Method)
- waiting on java.lang.Object@55a86678
at java.lang.Object.wait(Object.java:502)
at org.jppf.server.JPPFDriver.main(JPPFDriver.java:189)

"Reference Handler" - 2 - state: WAITING - blocked count: 5 - blocked time: 0 - wait count: 3 - wait time: 21043
at java.lang.Object.wait(Native Method)
- waiting on java.lang.ref.Reference$Lock@270a3ee0
at java.lang.Object.wait(Object.java:502)
at java.lang.ref.Reference.tryHandlePending(Reference.java:191)
at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153)

"Finalizer" - 3 - state: WAITING - blocked count: 4 - blocked time: 3 - wait count: 4 - wait time: 21030
at java.lang.Object.wait(Native Method)
- waiting on java.lang.ref.ReferenceQueue$Lock@74993a7e
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:209)

"Signal Dispatcher" - 4 - state: RUNNABLE - blocked count: 0 - blocked time: 0 - wait count: 0 - wait time: 0

"Attach Listener" - 5 - state: RUNNABLE - blocked count: 0 - blocked time: 0 - wait count: 0 - wait time: 0

"LauncherSocket-34298" - 12 - state: RUNNABLE - blocked count: 0 - blocked time: 0 - wait count: 0 - wait time: 0 - in native code
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.net.SocketInputStream.read(SocketInputStream.java:224)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at org.jppf.process.LauncherListener.run(LauncherListener.java:83)

"DeadlockChecker" - 13 - state: WAITING - blocked count: 1 - blocked time: 0 - wait count: 4 - wait time: 20353
at java.lang.Object.wait(Native Method)
- waiting on java.util.TaskQueue@1e60d97c
at java.lang.Object.wait(Object.java:502)
at java.util.TimerThread.mainLoop(Timer.java:526)
at java.util.TimerThread.run(Timer.java:505)

"ClientClassServer" - 14 - state: RUNNABLE - blocked count: 2 - blocked time: 0 - wait count: 4 - wait time: 0 - in native code
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll0(Native Method)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll(WindowsSelectorImpl.java:296)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.access$400(WindowsSelectorImpl.java:278)
at sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:159)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked sun.nio.ch.WindowsSelectorImpl@6bf33465
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:101)
at org.jppf.nio.NioServer.run(NioServer.java:256)

"JPPF-0001" - 15 - state: WAITING - blocked count: 0 - blocked time: 0 - wait count: 1 - wait time: 0
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.SynchronousQueue$TransferStack@7f397481
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:458)
at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
at java.util.concurrent.SynchronousQueue.take(SynchronousQueue.java:924)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

"JPPF-0002" - 16 - state: WAITING - blocked count: 5 - blocked time: 0 - wait count: 32 - wait time: 20018
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.SynchronousQueue$TransferStack@7f397481
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:458)
at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
at java.util.concurrent.SynchronousQueue.take(SynchronousQueue.java:924)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

"JPPF-0003" - 17 - state: WAITING - blocked count: 0 - blocked time: 0 - wait count: 1 - wait time: 0
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.SynchronousQueue$TransferStack@7f397481
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:458)
at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
at java.util.concurrent.SynchronousQueue.take(SynchronousQueue.java:924)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

"JPPF-0004" - 18 - state: WAITING - blocked count: 0 - blocked time: 0 - wait count: 1 - wait time: 0
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.SynchronousQueue$TransferStack@7f397481
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:458)
at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
at java.util.concurrent.SynchronousQueue.take(SynchronousQueue.java:924)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

"JPPF-0005" - 19 - state: WAITING - blocked count: 0 - blocked time: 0 - wait count: 5 - wait time: 18590
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.locks.ReentrantLock$NonfairSync@1964bdfb
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireQueued(AbstractQueuedSynchronizer.java:870)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquire(AbstractQueuedSynchronizer.java:1199)
at java.util.concurrent.locks.ReentrantLock$NonfairSync.lock(ReentrantLock.java:209)
at java.util.concurrent.locks.ReentrantLock.lock(ReentrantLock.java:285)
at org.jppf.server.queue.JPPFPriorityQueue.getJob(JPPFPriorityQueue.java:345)
at org.jppf.server.nio.client.AsyncClientContext.cancelJobOnClose(AsyncClientContext.java:233)
at org.jppf.server.nio.client.AsyncClientContext.lambda$cancelJobsOnClose$0(AsyncClientContext.java:209)
at org.jppf.server.nio.client.AsyncClientContext$$Lambda$31/1598015078.accept(Unknown Source)
at java.util.concurrent.ConcurrentHashMap.forEach(ConcurrentHashMap.java:1597)
at org.jppf.server.nio.client.AsyncClientContext.cancelJobsOnClose(AsyncClientContext.java:208)
at org.jppf.server.nio.client.AsyncClientContext.handleException(AsyncClientContext.java:87)
at org.jppf.server.nio.client.AsyncClientMessageHandler.jobReceived(AsyncClientMessageHandler.java:73)
at org.jppf.server.nio.client.AsyncClientMessageReader.handleMessage(AsyncClientMessageReader.java:69)
at org.jppf.server.nio.client.AsyncClientMessageReader$$Lambda$22/2144344426.execute(Unknown Source)
at org.jppf.nio.NioMessageReader$HandlingTask.run(NioMessageReader.java:134)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Locked ownable synchronizers:
- java.util.concurrent.ThreadPoolExecutor$Worker@42f30e0a

"JPPF-0006" - 20 - state: WAITING - blocked count: 3 - blocked time: 0 - wait count: 80 - wait time: 20742
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.locks.ReentrantLock$NonfairSync@1964bdfb
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireQueued(AbstractQueuedSynchronizer.java:870)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquire(AbstractQueuedSynchronizer.java:1199)
at java.util.concurrent.locks.ReentrantLock$NonfairSync.lock(ReentrantLock.java:209)
at java.util.concurrent.locks.ReentrantLock.lock(ReentrantLock.java:285)
at org.jppf.server.queue.JPPFPriorityQueue.addBundle(JPPFPriorityQueue.java:104)
at org.jppf.server.nio.client.AsyncClientMessageHandler.jobReceived(AsyncClientMessageHandler.java:89)
at org.jppf.server.nio.client.AsyncClientMessageReader.handleMessage(AsyncClientMessageReader.java:69)
at org.jppf.server.nio.client.AsyncClientMessageReader$$Lambda$22/2144344426.execute(Unknown Source)
at org.jppf.nio.NioMessageReader$HandlingTask.run(NioMessageReader.java:134)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Locked ownable synchronizers:
- java.util.concurrent.ThreadPoolExecutor$Worker@24273305

"JPPF-0007" - 21 - state: WAITING - blocked count: 2 - blocked time: 3 - wait count: 70 - wait time: 20813
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.locks.ReentrantLock$NonfairSync@1964bdfb
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireQueued(AbstractQueuedSynchronizer.java:870)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquire(AbstractQueuedSynchronizer.java:1199)
at java.util.concurrent.locks.ReentrantLock$NonfairSync.lock(ReentrantLock.java:209)
at java.util.concurrent.locks.ReentrantLock.lock(ReentrantLock.java:285)
at org.jppf.server.queue.JPPFPriorityQueue.removeBundle(JPPFPriorityQueue.java:263)
at org.jppf.server.queue.RemoveBundleAction.run(RemoveBundleAction.java:62)
at org.jppf.server.protocol.AbstractServerJob.done(AbstractServerJob.java:333)
at org.jppf.server.protocol.AbstractServerJob.setSubmissionStatus(AbstractServerJob.java:418)
at org.jppf.server.protocol.BundleCompletionListener.lambda$bundleEnded$0(BundleCompletionListener.java:83)
at org.jppf.server.protocol.BundleCompletionListener$$Lambda$29/877663927.run(Unknown Source)
at org.jppf.server.protocol.BundleCompletionListener.bundleEnded(BundleCompletionListener.java:85)
at org.jppf.server.protocol.ServerTaskBundleClient.bundleEnded(ServerTaskBundleClient.java:425)
at org.jppf.server.nio.client.JobEntry.jobEnded(JobEntry.java:73)
at org.jppf.server.nio.client.AsyncClientMessageHandler.jobResultsSent(AsyncClientMessageHandler.java:124)
at org.jppf.server.nio.client.CompletionListener.taskCompleted(CompletionListener.java:62)
at org.jppf.server.protocol.ServerTaskBundleClient.fireTasksCompleted(ServerTaskBundleClient.java:416)
at org.jppf.server.protocol.ServerTaskBundleClient.resultReceived(ServerTaskBundleClient.java:270)
- locked org.jppf.server.protocol.ServerTaskBundleClient@1846b48a
at org.jppf.server.protocol.ServerJob.lambda$handleCancelledTasks$3(ServerJob.java:300)
at org.jppf.server.protocol.ServerJob$$Lambda$33/445459200.accept(Unknown Source)
at org.jppf.utils.collections.CollectionMap.forEach(CollectionMap.java:155)
at org.jppf.server.protocol.ServerJob.handleCancelledTasks(ServerJob.java:300)
at org.jppf.server.protocol.ServerJob.cancel(ServerJob.java:316)
at org.jppf.server.nio.client.AsyncClientContext.cancelJobOnClose(AsyncClientContext.java:238)
at org.jppf.server.nio.client.AsyncClientContext.lambda$cancelJobsOnClose$0(AsyncClientContext.java:209)
at org.jppf.server.nio.client.AsyncClientContext$$Lambda$31/1598015078.accept(Unknown Source)
at java.util.concurrent.ConcurrentHashMap.forEach(ConcurrentHashMap.java:1597)
at org.jppf.server.nio.client.AsyncClientContext.cancelJobsOnClose(AsyncClientContext.java:208)
at org.jppf.server.nio.client.AsyncClientContext.handleException(AsyncClientContext.java:87)
at org.jppf.server.nio.client.AsyncClientMessageHandler.jobReceived(AsyncClientMessageHandler.java:73)
at org.jppf.server.nio.client.AsyncClientMessageReader.handleMessage(AsyncClientMessageReader.java:69)
at org.jppf.server.nio.client.AsyncClientMessageReader$$Lambda$22/2144344426.execute(Unknown Source)
at org.jppf.nio.NioMessageReader$HandlingTask.run(NioMessageReader.java:134)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Locked ownable synchronizers:
- java.util.concurrent.locks.ReentrantLock$NonfairSync@58c56e8e
- java.util.concurrent.ThreadPoolExecutor$Worker@5b1d2887

"JPPF-0008" - 22 - state: RUNNABLE - blocked count: 1 - blocked time: 2 - wait count: 25 - wait time: 20783
at sun.management.ThreadImpl.dumpThreads0(Native Method)
at sun.management.ThreadImpl.dumpAllThreads(ThreadImpl.java:454)
at org.jppf.management.diagnostics.ThreadDump.&lt;init&gt;(ThreadDump.java:48)
at org.jppf.management.diagnostics.Diagnostics.threadDump(Diagnostics.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:71)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:275)
at com.sun.jmx.mbeanserver.StandardMBeanIntrospector.invokeM2(StandardMBeanIntrospector.java:112)
at com.sun.jmx.mbeanserver.StandardMBeanIntrospector.invokeM2(StandardMBeanIntrospector.java:46)
at com.sun.jmx.mbeanserver.MBeanIntrospector.invokeM(MBeanIntrospector.java:237)
at com.sun.jmx.mbeanserver.PerInterface.invoke(PerInterface.java:138)
at com.sun.jmx.mbeanserver.MBeanSupport.invoke(MBeanSupport.java:252)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
at org.jppf.jmxremote.nio.JMXMessageReader.handleRequest(JMXMessageReader.java:123)
at org.jppf.jmxremote.nio.JMXMessageReader.handleMessage(JMXMessageReader.java:98)
at org.jppf.jmxremote.nio.JMXMessageReader.access$100(JMXMessageReader.java:41)
at org.jppf.jmxremote.nio.JMXMessageReader$HandlingTask.run(JMXMessageReader.java:339)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Locked ownable synchronizers:
- java.util.concurrent.ThreadPoolExecutor$Worker@46f5f779

"NodeClassServer" - 23 - state: RUNNABLE - blocked count: 8 - blocked time: 0 - wait count: 4 - wait time: 0 - in native code
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll0(Native Method)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll(WindowsSelectorImpl.java:296)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.access$400(WindowsSelectorImpl.java:278)
at sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:159)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked sun.nio.ch.WindowsSelectorImpl@6825b89
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:101)
at org.jppf.nio.NioServer.run(NioServer.java:256)

"ClientJobServer" - 24 - state: RUNNABLE - blocked count: 0 - blocked time: 0 - wait count: 16 - wait time: 22 - in native code
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll0(Native Method)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll(WindowsSelectorImpl.java:296)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.access$400(WindowsSelectorImpl.java:278)
at sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:159)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked sun.nio.ch.WindowsSelectorImpl@c3cc50e
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at org.jppf.nio.StatelessNioServer.run(StatelessNioServer.java:126)

"NodeJobServer" - 25 - state: RUNNABLE - blocked count: 0 - blocked time: 0 - wait count: 7 - wait time: 10 - in native code
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll0(Native Method)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll(WindowsSelectorImpl.java:296)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.access$400(WindowsSelectorImpl.java:278)
at sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:159)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked sun.nio.ch.WindowsSelectorImpl@e1b0f79
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:101)
at org.jppf.nio.StatelessNioServer.run(StatelessNioServer.java:126)

"JobScheduler" - 26 - state: WAITING - blocked count: 5 - blocked time: 4 - wait count: 7 - wait time: 20220
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.locks.ReentrantLock$NonfairSync@58c56e8e
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireQueued(AbstractQueuedSynchronizer.java:870)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquire(AbstractQueuedSynchronizer.java:1199)
at java.util.concurrent.locks.ReentrantLock$NonfairSync.lock(ReentrantLock.java:209)
at java.util.concurrent.locks.ReentrantLock.lock(ReentrantLock.java:285)
at org.jppf.server.protocol.AbstractServerJobBase.getTaskCount(AbstractServerJobBase.java:112)
at org.jppf.server.queue.JPPFPriorityQueue.nextBundle(JPPFPriorityQueue.java:224)
at org.jppf.server.nio.nodeserver.async.AsyncJobScheduler.prepareJobDispatch(AsyncJobScheduler.java:163)
at org.jppf.server.nio.nodeserver.async.AsyncJobScheduler.dispatch(AsyncJobScheduler.java:121)
- locked java.util.LinkedHashSet@1cc4404f
at org.jppf.server.nio.nodeserver.async.AsyncJobScheduler.run(AsyncJobScheduler.java:73)
at java.lang.Thread.run(Thread.java:745)

Locked ownable synchronizers:
- java.util.concurrent.locks.ReentrantLock$NonfairSync@1964bdfb

"Acceptor" - 27 - state: RUNNABLE - blocked count: 0 - blocked time: 0 - wait count: 27 - wait time: 61 - in native code
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll0(Native Method)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll(WindowsSelectorImpl.java:296)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.access$400(WindowsSelectorImpl.java:278)
at sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:159)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked sun.nio.ch.WindowsSelectorImpl@72706db6
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:101)
at org.jppf.nio.StatelessNioServer.run(StatelessNioServer.java:126)

"JMXRemote-1" - 29 - state: RUNNABLE - blocked count: 0 - blocked time: 0 - wait count: 23 - wait time: 13
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll0(Native Method)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.poll(WindowsSelectorImpl.java:296)
at sun.nio.ch.WindowsSelectorImpl$SubSelector.access$400(WindowsSelectorImpl.java:278)
at sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:159)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked sun.nio.ch.WindowsSelectorImpl@7d2b3f6
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:101)
at org.jppf.nio.StatelessNioServer.run(StatelessNioServer.java:126)

"StatsEventDispatcher-0001" - 30 - state: WAITING - blocked count: 0 - blocked time: 0 - wait count: 59 - wait time: 20913
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@64e410e9
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

"NodeChannels-0001" - 34 - state: WAITING - blocked count: 2 - blocked time: 3 - wait count: 3 - wait time: 20227
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@575d0676
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

"JobManager-0001" - 37 - state: WAITING - blocked count: 2 - blocked time: 0 - wait count: 7 - wait time: 19701
at sun.misc.Unsafe.park(Native Method)
- waiting on java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@5da56d06
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1067)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
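
Reading the two deadlocked stacks together: "JobScheduler" enters JPPFPriorityQueue.nextBundle() holding the queue lock (@1964bdfb), then waits for a per-job lock (@58c56e8e) in getTaskCount(), while "JPPF-0007" enters removeBundle() holding that per-job lock and waiting for the queue lock - an inverted lock acquisition order. Below is a minimal reduction of the pattern, with illustrative names rather than the actual JPPF code:

import java.util.concurrent.locks.ReentrantLock;

public class LockOrderDemo {
  private final ReentrantLock queueLock = new ReentrantLock(); // stands in for @1964bdfb
  private final ReentrantLock jobLock = new ReentrantLock();   // stands in for @58c56e8e

  // the "JobScheduler" path: queue lock first, then job lock
  void nextBundle() {
    queueLock.lock();
    try {
      jobLock.lock(); // blocks forever if removeBundle() holds jobLock and wants queueLock
      try { /* compute the task count */ } finally { jobLock.unlock(); }
    } finally { queueLock.unlock(); }
  }

  // the "JPPF-0007" path: job lock first, then queue lock - the inverted order
  void removeBundle() {
    jobLock.lock();
    try {
      queueLock.lock();
      try { /* remove the bundle */ } finally { queueLock.unlock(); }
    } finally { jobLock.unlock(); }
  }
}

The usual remedies are to acquire both locks in a single global order everywhere, or to use tryLock() with back-off on the inner lock.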
icon_milestone.png 17:57 JPPF 2.5.5
A new milestone has been reached
icon_milestone.png 04:52 JPPF 6.0.3
A new milestone has been reached
July 02, 2019
icon_milestone.png 09:33 JPPF 3.3
A new milestone has been reached
June 30, 2019
bug_report_tiny.png 07:39  Bug report JPPF-593 - Various typos in the documentation
lolocohen : Issue closed
June 29, 2019
bug_report_tiny.png 04:21  Bug report JPPF-593 - Various typos in the documentation
lolocohen : Issue created
We need to fix a number of user-reported errors in the documentation:

- chapter 2.5.2, here under "Job SLA":

nuumber should be number

- chapter 3.4.2 Remove one "the" in the sentence:

In other words, the results are always in the same order as the tasks in the the job.

- chapter 4.1.12 There is an "r" missing in the word "interruptible" in the sentence:

Tasks that do not extend AbstractTask, such as Callable, Runnable, Pojo tasks or tasks annotated with @JPPFRunnable, will need to implement the Interruptibility interface to override the interuptible flag, as in this example:

- chapter 4.4, the yellow box, under Note 2

There is a space missing after the comma: from each task's perspective,the data provider

- chapter 4.5.1.2, first sentence: eligble channels should be eligible channels

- chapter 4.5.1.4 Second word should be each, not aach

- chapter 4.5.2.10, the yellow box, Note 1:

usally should be usually

- chapter 4.9.1, third word:

heirarchy should be hierarchy

- chapter 4.11, yellow box:

hanldle should be handle

- chapter 4.11.4.1 The text block under the source:

overriden should be overridden

- chapter 4.12, delete one "the" in the sentence:

extensions to the the job SLA specifying if and how the persistence of each job should be handled

- chapter 5.4.1, yellow box, note 2: explicitely should be explicitly

- chapter 5.4.5 ollload should be offload, attahed should be attached

- chapter 5.5.1.1.3 exppressed should be expressed, expressd should be expressed
June 28, 2019
task_tiny.png 05:02  Task JPPF-592 - Security scans
lolocohen : Issue created
We propose to add security scans as part of the JPPF build, including:
* dependency checks (with [https://jeremylong.github.io/DependencyCheck/dependency-check-ant/index.html dependency-check for Ant]) - see the sketch after this list
* Docker image scans (maybe [https://github.com/coreos/clair/blob/master/Documentation/running-clair.md Clair]?)
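
For the first point, here is a rough sketch of the Ant integration; the element and attribute names are from memory and should be verified against the linked dependency-check-ant documentation:

<taskdef resource="dependency-check-taskdefs.properties">
  <classpath path="lib/dependency-check-ant.jar"/>
</taskdef>
<dependency-check projectname="JPPF" reportoutputdirectory="reports" failBuildOnCVSS="8">
  <fileset dir="lib">
    <include name="**/*.jar"/>
  </fileset>
</dependency-check>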

June 11, 2019
May 14, 2019
task_tiny.png 07:43  Task JPPF-591 - Come up with a lightweight test suite for day-to-day development purposes
lolocohen : Issue closed
May 13, 2019
task_tiny.png 09:03  Task JPPF-591 - Come up with a lightweight test suite for day-to-day development purposes
lolocohen : Issue created
Currently, the full suite of automated tests takes 15 to 25 minutes to run, depending on the hardware. We propose to reduce it to a more reasonable set of tests that would take much less time, while still providing meaningful coverage.

We also propose to run this lightweight suite as part of the Travis build setup on GitHub, in addition to making it available locally on the command line. The current CI build with Jenkins would still run the full suite.
May 12, 2019
bug_report_tiny.png 12:21  Bug report JPPF-590 - Failures in v6.1 in multi-server topology tests
lolocohen : Issue closed
May 10, 2019
feature_request_tiny.png 20:21  Feature request JPPF-583 - Tasks dependencies within a job
lolocohen : Issue closed
May 05, 2019
bug_report_tiny.png 08:16  Bug report JPPF-590 - Failures in v6.1 in multi-server topology tests
lolocohen : Issue created
The Jenkins build is showing intermittent failures for some of our automated tests on multi-server topologies. I'm not sure yet what the problem is exactly, but it seems to occur frequently enough (1 time out of 5) to warrant a dedicated bug report.
Here are the failures I've been seeing recently, for which a set of logs is attached to this defect:

build 221:
test: test.org.jppf.server.peer.TestMultiServer.testTopologyMonitoring()

java.lang.Exception: test timed out after 10000 milliseconds
at java.lang.ThreadGroup.threadTerminated(ThreadGroup.java:942)
at java.lang.Thread.exit(Thread.java:755)
build 227:
Test: test.org.jppf.server.peer.TestMultiServerSetup.testSetup()

java.lang.Exception: test timed out after 15000 milliseconds
at sun.misc.Unsafe.getObject(Native Method)
at java.io.ObjectStreamClass$FieldReflector.getObjFieldValues(ObjectStreamClass.java:2094)
at java.io.ObjectStreamClass.getObjFieldValues(ObjectStreamClass.java:1296)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1539)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.jppf.jmxremote.message.JMXRequest.writeObject(JMXRequest.java:102)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1028)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.jppf.serialization.DefaultJavaSerialization.serialize(DefaultJavaSerialization.java:29)
at org.jppf.utils.ObjectSerializerImpl.serialize(ObjectSerializerImpl.java:79)
at org.jppf.io.IOHelper.serializeDataToMemory(IOHelper.java:274)
at org.jppf.io.IOHelper.serializeData(IOHelper.java:255)
at org.jppf.io.IOHelper.serializeData(IOHelper.java:241)
at org.jppf.jmxremote.nio.JMXContext.offerJmxMessage(JMXContext.java:121)
at org.jppf.jmxremote.message.JMXMessageHandler.sendMessage(JMXMessageHandler.java:197)
at org.jppf.jmxremote.message.JMXMessageHandler.receiveResponse(JMXMessageHandler.java:122)
at org.jppf.jmxremote.message.JMXMessageHandler.sendRequestWithResponse(JMXMessageHandler.java:107)
at org.jppf.jmxremote.JPPFMBeanServerConnection.invoke(JPPFMBeanServerConnection.java:238)
at org.jppf.management.JMXConnectionWrapper.invoke(JMXConnectionWrapper.java:162)
at org.jppf.management.JMXDriverConnectionWrapper.nbIdleNodes(JMXDriverConnectionWrapper.java:224)
at test.org.jppf.test.setup.AbstractNonStandardSetup.lambda$awaitNbIdleNodes$2(AbstractNonStandardSetup.java:415)
at test.org.jppf.test.setup.AbstractNonStandardSetup$$Lambda$17/1147147352.evaluateWithException(Unknown Source)
at org.jppf.utils.concurrent.ConcurrentUtils$ConditionFalseOnException.evaluate(ConcurrentUtils.java:246)
at org.jppf.utils.concurrent.ConcurrentUtils.awaitCondition(ConcurrentUtils.java:96)
at test.org.jppf.test.setup.AbstractNonStandardSetup.awaitNbIdleNodes(AbstractNonStandardSetup.java:415)
at test.org.jppf.test.setup.AbstractNonStandardSetup.awaitNbIdleNodes(AbstractNonStandardSetup.java:400)
at test.org.jppf.test.setup.AbstractNonStandardSetup.checkPeers(AbstractNonStandardSetup.java:371)
at test.org.jppf.test.setup.AbstractNonStandardSetup.checkPeers(AbstractNonStandardSetup.java:340)
at test.org.jppf.test.setup.AbstractNonStandardSetup.checkPeers(AbstractNonStandardSetup.java:329)
at test.org.jppf.server.peer.TestMultiServerSetup.testSetup(TestMultiServerSetup.java:49)
May 01, 2019
icon_build.png 10:00 JPPF 6.0.3
New version released
April 20, 2019
bug_report_tiny.png 11:10  Bug report JPPF-588 - Concurrent operations with DefaultFilePersistence job persistence result in exceptions
lolocohen : Issue closed
feature_request_tiny.png 07:22  Feature request JPPF-589 - Docker images for JPPF components
lolocohen : Issue created
We propose to add Docker images for the JPPF drivers, nodes and web admin console. The configuration of a JPPF grid with Docker should allow any kind of JPPF topology, including multi-server topologies.

Another objective is to provide ready-to-use tools and configurations to run JPPF grids within a Docker swarm/Kubernetes infrastructure.
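
As a hypothetical sketch of the first objective, a minimal compose file could look like this; the image names and environment variable are placeholders, not published JPPF artifacts, while 11111 is the default JPPF server port:

version: "3"
services:
  driver:
    image: jppf/driver:6.2        # placeholder image name
    ports:
      - "11111:11111"
  node:
    image: jppf/node:6.2          # placeholder image name
    environment:
      JPPF_SERVER_HOST: driver    # placeholder, would map to jppf.server.host
    depends_on:
      - driver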
bug_report_tiny.png 07:08  Bug report JPPF-588 - Concurrent operations with DefaultFilePersistence job persistence result in exceptions
lolocohen : Issue created
In a recently failed test of the file-based job persistence, I could find the following pattern in the driver log, where multiple threads are performing 'store' and 'delete' operations:
2019-04-19 08:07:34,852 [DEBUG][JPPF-0007][org.jppf.server.queue.PersistenceHandler.storeResults(116)] persisting 5 results for job ServerJob[id=2, job uuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23, name=testSimplePersistedJob, submissionStatus=EXECUTING, status=EXECUTING, taskCount=0, nbBundles=1, jobExpired=false, pending=false, suspended=false]
...
2019-04-19 08:07:34,852 [DEBUG][JPPF-0007][org.jppf.job.persistence.impl.DefaultFilePersistence.store(97)] storing [PersistenceInfoImpl[type=TASK_RESULT, taskPosition=5, job=testSimplePersistedJob, jobUuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23], PersistenceInfoImpl[type=TASK_RESULT, taskPosition=8, job=testSimplePersistedJob, jobUuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23], PersistenceInfoImpl[type=TASK_RESULT, taskPosition=6, job=testSimplePersistedJob, jobUuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23], PersistenceInfoImpl[type=TASK_RESULT, taskPosition=9, job=testSimplePersistedJob, jobUuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23], PersistenceInfoImpl[type=TASK_RESULT, taskPosition=7, job=testSimplePersistedJob, jobUuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23]]
2019-04-19 08:07:34,852 [DEBUG][JPPF-0005][org.jppf.server.job.JPPFJobManager.jobEnded(195)] jobId 'testSimplePersistedJob' ended
2019-04-19 08:07:34,853 [DEBUG][JPPF-0005][org.jppf.server.queue.PersistenceHandler.deleteJob(141)] removing job 0282BA15-CE62-4C5D-AF33-5F92CFB2BB23 from persistence store
2019-04-19 08:07:34,853 [DEBUG][JPPF-0005][org.jppf.job.persistence.impl.DefaultFilePersistence.deleteJob(168)] deleting job with uuid = 0282BA15-CE62-4C5D-AF33-5F92CFB2BB23
2019-04-19 08:07:34,854 [DEBUG][JPPF-0003][org.jppf.server.queue.PersistenceHandler.deleteJob(141)] removing job 0282BA15-CE62-4C5D-AF33-5F92CFB2BB23 from persistence store
2019-04-19 08:07:34,854 [DEBUG][JPPF-0003][org.jppf.job.persistence.impl.DefaultFilePersistence.deleteJob(168)] deleting job with uuid = 0282BA15-CE62-4C5D-AF33-5F92CFB2BB23
...
2019-04-19 08:07:34,858 [DEBUG][JPPF-0007][org.jppf.server.protocol.ServerJob.lambda$postResultsReceived$0(189)] received results for ServerTaskBundleClient[id=2, pendingTasks=0, cancelled=false, done=true, job=JPPFTaskBundle[name=testSimplePersistedJob, uuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23, initialTaskCount=10, taskCount=10, bundleUuid=null, uuidPath=TraversalList[position=0, list=[4F5DEEB3-FEDE-4A18-A48F-9EE05CF06D94, d1]], nodeBundleId=null]; strategy=NodeResults]
2019-04-19 08:07:34,858 [DEBUG][JPPF-0007][org.jppf.server.protocol.ServerJob.taskCompleted(245)] requeue = false for bundle ServerTaskBundleNode[id=4, name=testSimplePersistedJob, uuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23, initialTaskCount=10, taskCount=5, cancelled=false, requeued=false, channel=AsyncNodeContext[uuid=n2, peer=false, ssl=false, local=false, offline=false, maxJobs=1, jobEntries=0, sendQueue size=0, interestOps=1, executionStatus=EXECUTING]], job = ServerJob[id=2, job uuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23, name=testSimplePersistedJob, submissionStatus=ENDED, status=DONE, taskCount=0, nbBundles=0, jobExpired=false, pending=false, suspended=false]
2019-04-19 08:07:34,859 [DEBUG][JPPF-0007][org.jppf.server.queue.PersistenceHandler.deleteJob(141)] removing job 0282BA15-CE62-4C5D-AF33-5F92CFB2BB23 from persistence store
2019-04-19 08:07:34,859 [DEBUG][JPPF-0007][org.jppf.job.persistence.impl.DefaultFilePersistence.deleteJob(168)] deleting job with uuid = 0282BA15-CE62-4C5D-AF33-5F92CFB2BB23
2019-04-19 08:07:34,860 [WARN ][JPPF-0003][org.jppf.utils.DeleteFileVisitor.visitFile(82)] error trying to delete file 'persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\result-7.tmp': java.nio.file.NoSuchFileException: persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\result-7.tmp
2019-04-19 08:07:34,860 [ERROR][JPPF-0007][org.jppf.server.queue.PersistenceHandler.deleteJob(145)] error deleting persistent job 0282BA15-CE62-4C5D-AF33-5F92CFB2BB23 : org.jppf.job.persistence.JobPersistenceException: java.nio.file.AccessDeniedException: persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\header.data
at org.jppf.job.persistence.impl.DefaultFilePersistence.deleteJob(DefaultFilePersistence.java:172)
at org.jppf.server.queue.PersistenceHandler.deleteJob(PersistenceHandler.java:143)
at org.jppf.server.queue.PersistenceHandler.deleteJob(PersistenceHandler.java:132)
at org.jppf.server.queue.RemoveBundleAction.run(RemoveBundleAction.java:63)
at org.jppf.server.protocol.AbstractServerJob.done(AbstractServerJob.java:333)
at org.jppf.server.protocol.AbstractServerJob.setSubmissionStatus(AbstractServerJob.java:418)
at org.jppf.server.protocol.ServerJob.taskCompleted(ServerJob.java:253)
at org.jppf.server.protocol.ServerJob.postResultsReceived(ServerJob.java:191)
at org.jppf.server.protocol.ServerJob.resultsReceived(ServerJob.java:142)
at org.jppf.server.protocol.ServerTaskBundleNode.resultsReceived(ServerTaskBundleNode.java:198)
at org.jppf.server.nio.nodeserver.async.AsyncNodeMessageHandler.processResults(AsyncNodeMessageHandler.java:337)
at org.jppf.server.nio.nodeserver.async.AsyncNodeMessageHandler.process(AsyncNodeMessageHandler.java:274)
at org.jppf.server.nio.nodeserver.async.AsyncNodeMessageHandler.resultsReceived(AsyncNodeMessageHandler.java:183)
at org.jppf.server.nio.nodeserver.async.AsyncNodeMessageReader.handleMessage(AsyncNodeMessageReader.java:73)
at org.jppf.nio.NioMessageReader$HandlingTask.run(NioMessageReader.java:134)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.nio.file.AccessDeniedException: persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\header.data
at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:83)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:102)
at sun.nio.fs.WindowsFileSystemProvider.implDelete(WindowsFileSystemProvider.java:269)
at sun.nio.fs.AbstractFileSystemProvider.delete(AbstractFileSystemProvider.java:103)
at java.nio.file.Files.delete(Files.java:1126)
at org.jppf.utils.DeleteFileVisitor.visitFile(DeleteFileVisitor.java:78)
at org.jppf.utils.DeleteFileVisitor.visitFile(DeleteFileVisitor.java:30)
at java.nio.file.Files.walkFileTree(Files.java:2670)
at java.nio.file.Files.walkFileTree(Files.java:2742)
at org.jppf.job.persistence.impl.DefaultFilePersistence.deleteJob(DefaultFilePersistence.java:170)
... 17 more
2019-04-19 08:07:34,861 [DEBUG][JPPF-0007][org.jppf.server.protocol.ServerJob.taskCompleted(254)] submissionStatus=ENDED, clientBundles=0 for ServerJob[id=2, job uuid=0282BA15-CE62-4C5D-AF33-5F92CFB2BB23, name=testSimplePersistedJob, submissionStatus=ENDED, status=DONE, taskCount=0, nbBundles=0, jobExpired=false, pending=false, suspended=false]
2019-04-19 08:07:34,861 [DEBUG][JPPF-0007][org.jppf.server.nio.nodeserver.async.AsyncNodeMessageHandler.processResults(338)] updated stats for AsyncNodeContext[uuid=n2, peer=false, ssl=false, local=false, offline=false, maxJobs=1, jobEntries=0, sendQueue size=0, interestOps=1, executionStatus=EXECUTING]
2019-04-19 08:07:34,860 [WARN ][JPPF-0005][org.jppf.utils.DeleteFileVisitor.visitFile(82)] error trying to delete file 'persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\header.data': java.nio.file.NoSuchFileException: persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\header.data
2019-04-19 08:07:34,862 [WARN ][JPPF-0005][org.jppf.utils.DeleteFileVisitor.visitFile(82)] error trying to delete file 'persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\result-0.data': java.nio.file.NoSuchFileException: persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\result-0.data
2019-04-19 08:07:34,862 [WARN ][JPPF-0005][org.jppf.utils.DeleteFileVisitor.visitFile(82)] error trying to delete file 'persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\result-1.data': java.nio.file.NoSuchFileException: persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\result-1.data
2019-04-19 08:07:34,862 [WARN ][JPPF-0005][org.jppf.utils.DeleteFileVisitor.visitFile(82)] error trying to delete file 'persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\result-2.data': java.nio.file.NoSuchFileException: persistence\0282BA15-CE62-4C5D-AF33-5F92CFB2BB23\result-2.data
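
The NoSuchFileException warnings and the AccessDeniedException suggest several threads racing to delete the same job directory. One possible direction, illustrative only and not the actual fix, is to make the delete visitor tolerant of concurrent deletions:

// inside the file visitor, replace Files.delete(file) with:
Files.deleteIfExists(file); // returns false, instead of throwing, when the file is already gone

Note that on Windows a file still open in another thread can raise AccessDeniedException regardless, so serializing store/delete operations per job uuid may also be needed.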
April 14, 2019
enhancement_tiny.png 05:56  Enhancement JPPF-587 - Ability to create JPPFSchedule instances using java.time.* APIs
lolocohen : Issue created
The class [https://www.jppf.org/javadoc/6.2/index.html?org/jppf/scheduling/JPPFSchedule.html JPPFSchedule] is used to specify the start or expiration schedule of a job, as well as the expiration schedule of a task. It currently has 2 basic constructors, one that takes an epoch time in millis, the other that takes a string which represents a date, along with a SimpleDateFormat-compliant format to parse it.

We propose to extend this class to enable building JPPFSchedule objects based on the classes in java.time.*, such as ZonedDateTime, Duration, etc.
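
A minimal sketch of what this could look like; the at() helper below is hypothetical, and relies only on the existing constructor which, per the description above, takes an epoch time in millis:

import java.time.ZonedDateTime;
import org.jppf.scheduling.JPPFSchedule;

public class Schedules {
  // hypothetical factory bridging java.time to the existing long-based constructor
  public static JPPFSchedule at(final ZonedDateTime dateTime) {
    return new JPPFSchedule(dateTime.toInstant().toEpochMilli());
  }
}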
April 13, 2019
enhancement_tiny.png 20:19  Enhancement JPPF-586 - Cache of class/resource definitions in the client
lolocohen : Issue closed
April 12, 2019
task_tiny.png 13:13  Task JPPF-584 - Refactor the distributed class loader into the asynchronous Nio communication model
lolocohen : Issue closed
April 08, 2019
enhancement_tiny.png 07:36  Enhancement JPPF-586 - Cache of class/resource definitions in the client
lolocohen : Issue created
When a job is dispatched to multiple nodes in parallel, this can result in the same class loading request being issued to the same client in parallel or in sequence. This would happen when identical requests are forwarded to the same client before the first response is received by the server, and therefore before it can be added to the server-side cache. It could be worthwhile, from a performance perspective, to use a cache of class definitions, such that identical requests (same client-side class loader and same resource path) only result in a single lookup in the classpath.

To this effect, we propose to implement a cache in the client, as follows:
* an identity hash map whose keys are class loaders
* the values are hash maps where the key is a path in the classpath and the value is the byte[] for the resource located at that path. These could be implemented as [https://www.jppf.org/javadoc/6.2/index.html?org/jppf/utils/collections/SoftReferenceValuesMap.html SoftReferenceValuesMap]s to avoid out-of-memory conditions due to the cache
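
A minimal sketch of the proposed structure (class and method names are illustrative, not an existing JPPF API):

import java.util.HashMap;
import java.util.IdentityHashMap;
import java.util.Map;

public class ClassDefinitionCache {
  // identity semantics: two distinct class loader instances never share entries
  private final Map<ClassLoader, Map<String, byte[]>> cache = new IdentityHashMap<>();

  public synchronized byte[] get(final ClassLoader loader, final String path) {
    final Map<String, byte[]> definitions = cache.get(loader);
    return (definitions == null) ? null : definitions.get(path);
  }

  public synchronized void put(final ClassLoader loader, final String path, final byte[] definition) {
    // per the proposal, a SoftReferenceValuesMap would replace HashMap here,
    // so that entries can be reclaimed under memory pressure
    cache.computeIfAbsent(loader, cl -> new HashMap<>()).put(path, definition);
  }
}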
March 31, 2019
bug_report_tiny.png 09:43  Bug report JPPF-585 - The 6.1 tutorial still uses JPPFClient.submitJob() and blocking job attribute
lolocohen : Issue closed
bug_report_tiny.png 08:42  Bug report JPPF-585 - The 6.1 tutorial still uses JPPFClient.submitJob() and blocking job attribute
lolocohen : Issue created
The online and offline docs still show, in the tutorial, code snippets that use the "blocking" job attribute, which is now deprecated, as well as job submission with JPPFClient.submitJob(), which is also deprecated and replaced with submit() and submitAsync().
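
For illustration, the updated snippets would follow this pattern (MyTask stands for any task class; this assumes the 6.1 signatures for submit() and submitAsync()):

try (JPPFClient client = new JPPFClient()) {
  JPPFJob job = new JPPFJob();
  job.add(new MyTask());                       // MyTask is a placeholder task class
  // before (deprecated): List<Task<?>> results = client.submitJob(job);
  List<Task<?>> results = client.submit(job);  // blocking submission
  // non-blocking alternative: client.submitAsync(job);
}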
March 30, 2019
task_tiny.png 07:19  Task JPPF-584 - Refactor the distributed class loader into the asynchronous Nio communication model
lolocohen : Issue created
This is about refactoring the distributed class loader communication model into the more efficient and scalable model introduced in JPPF 6.1 (see feature request JPPF-549 and feature request JPPF-564). This includes both driver/node and driver/client communication channels.

These are the last components to switch to the new model. Once it is done, we expect a number of benefits:
* we will be able to get rid of the old nio code, which should reduce the maintenance burden
* this should also increase the performance, simply because we will remove parts of the code inherited from the old model, which are still present but not used in the new model
* increased performance and scalability, because the new nio model is more efficient
March 28, 2019
task_tiny.png 20:13  Task JPPF-572 - Performance, endurance and stress testing
lolocohen : Issue closed
icon_build.png 10:00 JPPF 6.1
New version released
March 22, 2019
feature_request_tiny.png 08:31  Feature request JPPF-583 - Tasks dependencies within a job
lolocohen : Issue created
We propose to enable dependencies between tasks in the same job. For instance, we envision the ability to express that task A depends on the completion of tasks B and C, which each depend on the completion of task D (diamond dependency graph):
Dependencies:

    B
   / \
  A   D
   \ /
    C
This implies a number of challenges, including but not limited to:
* decide how to schedule and parallelize the execution of the tasks: a task with dependencies cannot be scheduled before its dependencies have completed
* handle failures / cancellation of tasks on which other tasks depend
* provide an expressive and intuitive API to specify the dependencies
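
As a purely hypothetical illustration of the last point, the API could read along these lines (dependsOn() does not exist in JPPF; it only sketches the intent for the diamond graph above):

MyTask a = new MyTask("A"), b = new MyTask("B"), c = new MyTask("C"), d = new MyTask("D");
b.dependsOn(d);      // B -> D
c.dependsOn(d);      // C -> D
a.dependsOn(b, c);   // A -> B and A -> C
job.add(a); job.add(b); job.add(c); job.add(d);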
March 21, 2019
task_tiny.png 19:31  Task JPPF-529 - Explore usage of .Net Core instead of Visual Studio for the .Net bridge
lolocohen : Issue closed
feature_request_tiny.png 19:29  Feature request JPPF-177 - Write documentation on logging
lolocohen : Issue closed
feature_request_tiny.png 19:27  Feature request JPPF-111 - Implement a recipe for Cloudify
lolocohen : Issue closed
feature_request_tiny.png 19:25  Feature request JPPF-9 - Change the JPPF configuration based on observable behavior
lolocohen : Issue closed
feature_request_tiny.png 18:59  Feature request JPPF-429 - Use existing node connections for heartbeat-based connection checks
lolocohen : Issue closed