• 0

    posted a message on LibCompress
    :Decompress calls :DecompressHuffman by looking at the header byte in the compressed string.
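    As a rough sketch of what that dispatch looks like conceptually (the specific header byte values below are an assumption for illustration, not constants taken from the library source):

    -- Illustrative only: the header byte values are assumed, not the library's real constants.
    local function decompress(lib, data)
        local header = data:sub(1, 1)
        if header == "\001" then         -- data was stored uncompressed
            return data:sub(2)
        elseif header == "\003" then     -- data was Huffman-compressed
            return lib:DecompressHuffman(data)
        end
        return nil, "unknown compression header"
    end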

    The error you experience is apparently a mystery to many. No one knows exactly what triggers it. Because of this, I am hard pressed to fix it if the problem is indeed located in LibCompress. Common addons that can trigger this error are ArkInventory and CowTip; other addons may do it as well. The error itself does not point to ArkInventory or CowTip, but disabling those addons usually prevents the error or makes it much less frequent. LibCompress MAY be a trigger for this error, but despite your error message, it may not be the real problem.

    The error does seem to be related to the number of table entries, but apparently not on its own, as addons seem to cause problems for each other. The problem is not related to the amount of free memory and mostly occurs when addon memory usage is 80 MB and up.

    LibHuffman can be changed to use fewer table entries, but this will cost some performance.

    It is not the job of LibCompress to encode the datastream to fit whatever purpose the developer needs. But a separate encoding addon/library could be written to encode/decode strings so they are safe for transmission and/or safe for storage. _I_ don't know which values are safe for storage, but from my work on GuildAds, I can check which values are safe for transmission (and this list changes depending on whether you use SendChatMessage or SendAddonMessage). Encoding/decoding for storage/SendChatMessage/SendAddonMessage is something that has to be handled, but not by LibCompress (as encoding != compressing).
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    With Galmok_Save holding your table, I ran these commands (macro):

    /run a=LibStub:GetLibrary("LibCompress")
    /run b=LibStub:GetLibrary("AceSerializer-3.0")
    /run c=b:Serialize(Galmok_Save)
    /run d=a:CompressHuffman(c)
    /run e=a:Decompress(d)

    No errors and c is equal to e.

    Your problem is not directly due to LibCompress. It just triggered a general problem. You could simply have run out of memory or even have experienced a memory error.
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    Could you please provide me with the lines necessary to load AceSerializer-3.0 and convert the table to a string? Google finds only this thread mentioning AceSerializer...
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I would appreciate the string that causes a WoW C error. Even if you can't use the addon for your purpose, I would like to know if there is a bug/flaw in the implementation.

    I was thinking about making the compression run in smaller parts and issue a callback when the data has been compressed. But the problem is that the player may log off before the data has been compressed, and the addon that uses LibCompress would have to store the uncompressed data somewhere in between.

    Most of the Huffman compressor and decompressor can be broken into smaller parts (as small as wanted, but it will get much slower).
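    A hedged sketch of that idea in Lua, using a coroutine so one slice of work runs per frame or timer tick; compressAsync and the per-chunk step below are illustrative placeholders, not part of LibCompress:

    -- Sketch only: the chunk step is a stand-in for one slice of the real Huffman work.
    local function compressAsync(input, chunkSize, onDone)
        local co = coroutine.create(function()
            local parts = {}
            for i = 1, #input, chunkSize do
                parts[#parts + 1] = input:sub(i, i + chunkSize - 1) -- placeholder for per-chunk compression
                coroutine.yield()                                   -- hand control back between chunks
            end
            onDone(table.concat(parts))
        end)
        return function()                 -- call once per frame/timer tick
            if coroutine.status(co) == "dead" then return true end
            coroutine.resume(co)
            return coroutine.status(co) == "dead"
        end
    end

    -- Usage: drive the returned stepper until it reports completion.
    local step = compressAsync(("a"):rep(100000), 4096, function(result)
        print("done, output length:", #result)
    end)
    repeat until step()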
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I just tried this simple macro in WoW:

    /run a=LibStub:GetLibrary("LibCompress")
    /run for i=10000,30000,1000 do ChatFrame1:AddMessage(i); r=string.rep("a",i); c=a:CompressHuffman(r); d=a:Decompress(c); end

    No errors.

    Tried up to a string length of 100000 and still no problem. Side note: this uses approximately 3 MB of memory to compress and decompress (the memory is freed again).

    I can't reproduce the problem.
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I haven't debugged it yet, but according to your report, the low-level (C) error occurs when trying to add a new entry to a table. The only two things that should be able to cause this error are running out of memory or WoW Lua tables having a low limit on the number of keys. My Huffman codec was created, debugged and stress-tested using an external Lua interpreter, and only the functionality was tested in WoW.

    What was the size of the string you compressed (and then tried to decompress)?
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    The line that errors is:

    table_insert(uncompressed, symbol)

    Basically that means the table 'uncompressed' has grown too large.

    How large may tables be in WoW?

    Is the limit based in entries (number of keys) or total data in the table?

    If the limit is a certain number of entries, then I can solve it fairly easily, but it will cost a bit of performance in the decompression.
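    One possible shape of that workaround, sketched under the assumption that periodically flushing the symbol table into string chunks avoids whatever entry limit is being hit; the names and the limit value are illustrative:

    local MAX_ENTRIES = 4096              -- assumed cap, not a known WoW limit
    local chunks, symbols, n = {}, {}, 0

    local function emit(symbol)           -- called for every decoded symbol
        n = n + 1
        symbols[n] = symbol
        if n >= MAX_ENTRIES then          -- flush before the table grows large
            chunks[#chunks + 1] = table.concat(symbols, "", 1, n)
            symbols, n = {}, 0
        end
    end

    local function finish()               -- called once decoding is complete
        if n > 0 then
            chunks[#chunks + 1] = table.concat(symbols, "", 1, n)
        end
        return table.concat(chunks)
    end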
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    Quote from Allara »

    Has anyone been using this lately? Is it working well? I'm working on a new add-on which needs to send a fairly large amount of data between users (about 20-30K). This takes a long time to send using AceComm-3.0 due to the message size limit. I've brought in LibCompress, which is able to compress the data to an average of about 65% of the original size. The data is actually a table which has been sent through AceSerialize-3.0. The compression appears to be working, but decompression yields an error on line 522 (in the DecompressHuffman function):

    I tried switching to just CompressLZW instead of Compress, and everything works just fine (but the LZW algorithm compresses it far less than Huffman did). Is anyone still working on this? I can provide more code for testing if needed.

    Also: did the Huffman algorithm here ever get updated to escape \000 characters?


    Will you be able to provide the data you tried to compress? Send it to [email]galmok@gmail.com[/email] if possible. But looking at the error, the problem may not be easy to solve (not a Lua error, but a C error).
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I have now committed my changes, but can't see them in FishEye yet. How long is the delay on FishEye?
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    Quote from jjsheets »

    I'd much rather develop LibCompress on wowace (at least until curseforge gets stable).

    I will be happy to commit any changes to LibCompress for you until you get your SVN access. Just send me a PM with the changed files or patches, and a commit message.


    That won't be necessary. It seems I have received my SVN login and can now begin work on LibCompress.
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I haven't received any response to my SVN query, so I'll just make my own library and publish it on curse.com. May I include the LZW code and reuse the LibCompress name?
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    Problem fixed; a new version to test is at:

    http://pastey.net/87960

    The reason was that I was putting too many bits into a variable (35 bits when there was only room for 32). This was a problem only in the decompressor, which has been reworked. The problem can happen in the compressor as well, but now I am checking for it (in the only place where it can happen, I think). If it does happen, the compressor returns the uncompressed string with a header indicating it is uncompressed (this can still be fed to the decompressor).
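    For illustration, the kind of guard described above might look like this (the names and the flush strategy are assumptions, not the library's actual code):

    local bitfield, bitcount, out = 0, 0, {}

    local function writeBits(code, length)
        if bitcount + length > 32 then            -- would overflow the working variable
            return nil, "bit overflow"            -- caller falls back to storing the data uncompressed
        end
        bitfield = bitfield + code * 2^bitcount   -- append the new code above the bits already held
        bitcount = bitcount + length
        while bitcount >= 8 do                    -- flush whole bytes to keep the buffer short
            local byte = bitfield % 256
            out[#out + 1] = string.char(byte)
            bitfield = (bitfield - byte) / 256
            bitcount = bitcount - 8
        end
        return true
    end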
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I'll have a look at it.
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I don't think this is what a compression algorithm should do. Some addons have their own channel encoding, and us messing with the datastream would not be optimal. Keep the two things separate, I say.

    I haven't any wowace SVN access yet, but may get it eventually.
    Posted in: Libraries
  • 0

    posted a message on LibCompress
    I have no problems with the suggested IDs.

    I cannot guarantee that the compressed Huffman data does not contain \000 byte values. The Huffman algorithm does not operate on byte values as such, but packs words of varying bit-length into bytes for storage, which can result in all sorts of byte values. It is up to a channel encoder to make sure all byte values can be transmitted over a given communication channel or stored in SavedVariables files.

    I would guess that escaping all 0-byte values would make the data grow by about 1-2%.
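    As an illustration of what such a channel encoder could do (this is not part of LibCompress; the escape bytes chosen here are arbitrary):

    -- Escape "\000" (and the escape byte itself) so the output contains no zero bytes.
    local function encodeForStorage(data)
        data = data:gsub("\001", "\001\002")  -- escape the escape byte first
        data = data:gsub("%z",   "\001\003")  -- then escape zero bytes
        return data
    end

    local function decodeFromStorage(data)
        data = data:gsub("\001\003", "\000")
        data = data:gsub("\001\002", "\001")
        return data
    end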
    Posted in: Libraries