Deadlock issue in CompressionUtils #104
Thanks for a thorough bug report - I will be looking into it soon.
The implementation needs to support huge files, so a plain byte array is too simple - but the idea is good. I think it is possible to avoid the piped streams by utilizing CXF's CachedOutputStream and do exactly what you are suggesting, while still supporting huge files.
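For illustration, the spill-to-disk idea behind CXF's CachedOutputStream can be sketched with plain JDK streams: buffer in memory up to a threshold, then transparently switch to a temp file. This is a hypothetical analog, not the CXF class itself - class and method names here are made up.

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

/** Minimal spill-to-disk stream, analogous in spirit to CXF's CachedOutputStream. */
public class SpillingOutputStream extends OutputStream {
    private final int threshold;
    private ByteArrayOutputStream memory = new ByteArrayOutputStream();
    private OutputStream current;
    private File spillFile;

    public SpillingOutputStream(int threshold) {
        this.threshold = threshold;
        this.current = memory;
    }

    @Override
    public void write(int b) throws IOException {
        maybeSpill(1);
        current.write(b);
    }

    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        maybeSpill(len);
        current.write(b, off, len);
    }

    /** Once the in-memory buffer would exceed the threshold, move it to a temp file. */
    private void maybeSpill(int incoming) throws IOException {
        if (memory != null && memory.size() + incoming > threshold) {
            spillFile = File.createTempFile("payload", ".tmp");
            spillFile.deleteOnExit();
            FileOutputStream fos = new FileOutputStream(spillFile);
            memory.writeTo(fos);
            memory = null;
            current = fos;
        }
    }

    public boolean isSpilled() {
        return spillFile != null;
    }

    @Override
    public void close() throws IOException {
        current.close();
    }
}
```

No second thread is involved, so there is nothing to deadlock, and huge payloads never live fully in memory.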
I have a fix for this that also supports large files, without using piped streams. It will be in the 4.1.8 release.
Ok, thanks.
I just pushed the code now - you can take a look at it in the hotfix-4.1.8 branch.
Just performed a test with the new code. |
A deadlock can occur when sending to multiple receivers of which at least one has an invalid endpoint.
For example, with 2 receivers:
Now send 30 documents to those 2 receivers at random (this can also be done sequentially).
A deadlock occurs and no other documents can be sent anymore.
This is caused by the CompressionUtils code (AS4 4.1.5):
This code tries to avoid keeping the full payload in memory, but there are several issues with this approach.
If the other side of the piped stream is no longer receiving (e.g. in some cases when the send fails), then this side of the piped stream will block forever.
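This blocking behavior can be reproduced with the JDK's piped streams alone: once the pipe's fixed internal buffer (1024 bytes by default) is full and the reader never drains it, write() blocks forever. A minimal demo (hypothetical class name, not from the project's code):

```java
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class PipeBlockDemo {
    /** Returns true if the writer thread is still stuck in write() after the timeout. */
    public static boolean writerBlocks(long timeoutMillis) throws Exception {
        PipedInputStream in = new PipedInputStream();   // nobody ever reads from this
        PipedOutputStream out = new PipedOutputStream(in);
        Thread writer = new Thread(() -> {
            try {
                // The pipe's internal buffer is 1024 bytes by default; once it is
                // full, write() blocks until a reader drains it - which never happens.
                out.write(new byte[4096]);
            } catch (IOException e) {
                // not reached in this demo: no reader thread ever dies, so the
                // pipe is never marked broken
            }
        });
        writer.setDaemon(true);  // do not keep the JVM alive because of the stuck thread
        writer.start();
        writer.join(timeoutMillis);
        return writer.isAlive(); // still alive -> still blocked in write()
    }
}
```

This is exactly the situation when the CXF side stops consuming after a failed send: the zip thread is parked in write() and the attachment is never completed.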
Note that to trigger this behavior there has to be some load: the zip Runnable needs to start before CXF starts consuming the stream.
Also note 2 things about the GZIPOutputStream:
The piped stream is passed around as an instance variable in the CXF attachment. There is currently no code that guarantees it is always closed properly.
A simple solution here is to just do the compression in memory, avoiding the complex threading and piped streams:
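The in-memory approach could look roughly like this - a minimal sketch using the JDK's GZIPOutputStream, not the project's actual code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.GZIPOutputStream;

public final class InMemoryCompression {
    /** Compresses the payload fully in memory - no threads, no piped streams to leak or block. */
    public static byte[] gzip(byte[] payload) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(buffer)) {
            gzip.write(payload);
        } // try-with-resources guarantees the stream is finished and closed
        return buffer.toByteArray();
    }

    /** The compressed bytes can then be handed to the attachment as a plain stream. */
    public static InputStream asAttachmentStream(byte[] payload) throws IOException {
        return new ByteArrayInputStream(gzip(payload));
    }
}
```

The trade-off, as noted above, is that the full compressed payload lives in memory, which is why a CachedOutputStream-style buffer is preferable for huge files.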
Stack traces from blocked threads: