Blob.lua - binary serialization library

grump
Party member
Posts: 947
Joined: Sat Jul 22, 2017 7:43 pm

Blob.lua - binary serialization library

Post by grump »

Blob.lua is a small LuaJIT library for binary data parsing and serialization. Its intended use is to parse and generate arbitrary binary data, and to serialize Lua tables for efficient storage or transmission.

I wrote this library mainly to implement a portable and compact file format for my font generator. You can use it to implement game saving and loading, for reading and writing all kinds of binary data formats or for efficient serialization of data over a network. It supports all basic types and nested tables, has dedicated functions to read and write native C signed/unsigned integer and floating point types, and handles both little and big endian data.

Code: Select all

local Blob = require('Blob')

local gamestate = { level = 3, hp = 100, inventory = { 'sword', 'potion' } } -- any kind of data

local output = Blob()
output:write(gamestate)
local bin = output:string()
-- bin now contains a binary representation of gamestate that can be written to a file or sent over a network

local input = Blob(bin)
local state = input:read()
-- state now contains the data from the original gamestate table
License: MIT

Github project
Last edited by grump on Thu Dec 02, 2021 5:08 pm, edited 1 time in total.
grump
Party member
Posts: 947
Joined: Sat Jul 22, 2017 7:43 pm

Re: Blob.lua - binary serialization library

Post by grump »

I updated the library with pack/unpack functionality that works basically the same as string.pack/unpack in Lua 5.3 (with some differences).

Instead of writing long chains of read*/write* instructions, you can now use a format string that describes the binary data format in a concise manner.

Code: Select all

-- before
local u8 = blob:readU8()
local u16 = blob:readU16()
local f32 = blob:readFloat()
local str = blob:readString()

-- now
local u8, u16, f32, str = blob:unpack('BHfs')
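For writing, the same format letters work in the other direction (a sketch; I'm assuming the writer side exposes a matching pack method, as shown for BlobWriter later in this thread):

Code: Select all

-- hypothetical counterpart to the unpack call above
blob:pack('BHfs', u8, u16, f32, str)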
GitHub
grump
Party member
Posts: 947
Joined: Sat Jul 22, 2017 7:43 pm

Re: Blob.lua - binary serialization library

Post by grump »

The lib has been updated to version 2.0.1. The new version has a simplified and cleaner API.
There is now a moonscript port available, which will eventually replace the Lua implementation.

Updated API documentation for Blob.lua
API documentation for moonblob

moonscript example:

Code: Select all

writer = BlobWriter!
writer\write({
	data: 'example'
	nested:
		foo: 'bar'
		answer: 42
})

with writer
	\u32(0xdeadbeef)
	\f32(42.23)

reader = BlobReader(writer\tostring!)
tbl = reader\read!
beef, float = reader\u32!, reader\f32!
Salvakiya
Prole
Posts: 1
Joined: Sat Nov 03, 2018 7:14 am

Re: Blob.lua - binary serialization library

Post by Salvakiya »

How is the performance of this in comparison to love.data.pack/unpack? and what are the benefits of using this over pack/unpack? I presume I could use this to serialize data to send over network... but is it more efficient than love.data.pack?
grump
Party member
Posts: 947
Joined: Sat Jul 22, 2017 7:43 pm

Re: Blob.lua - binary serialization library

Post by grump »

When I created this lib, love.data.pack/unpack did not exist yet.

Some benefits:
  • It supports more data types than pack/unpack. BlobWriter/BlobReader can serialize tables and 64 bit integers, both through dedicated read/write functions and the pack/unpack functions. LÖVE's pack/unpack can't handle tables or integers > 2^52.
  • Pack/unpack is limited to the format string syntax, which can become tedious and a serious limitation, depending on your use case. This library can be used like file I/O, and provides read/write functions for each supported data type, while also supporting the pack/unpack syntax.
  • BlobWriter uses an efficient storage scheme for strings and (optionally) 32 bit integers. It has a special size-optimized 32 bit integer type (vs32/vu32) that is used internally to encode string lengths.
  • Pack/unpack uses native data sizes, which makes cross-platform data exchange harder; Blob.lua uses fixed data sizes.
  • It's much faster in general.
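To illustrate the vs32/vu32 point (a sketch; I'm assuming the writer exposes vu32/vs32 methods alongside the fixed-size ones - check the API docs for the exact names):

Code: Select all

local writer = BlobWriter()
writer:u32(7)   -- fixed size: always 4 bytes
writer:vu32(7)  -- variable size: small values take fewer bytes
writer:vs32(-7) -- signed variant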
Performance in comparison to table.concat:

Code: Select all

local t1 = love.timer.getTime()
local blob = BlobWriter(nil, 2 ^ 20)

-- create a 10 MiB string of random bytes
for i = 1, 10 * 2 ^ 20 do
	blob:u8(love.math.random(0, 255))
end

local str = blob:tostring()
local t2 = love.timer.getTime()
print(1000 * (t2 - t1), 'ms')
vs.

Code: Select all

local t1 = love.timer.getTime()
local tmp = {}

-- create a 10 MiB string of random bytes
for j = 1, 10 * 2 ^ 20 do
	tmp[#tmp + 1] = string.char(love.math.random(0, 255))
end

local str = table.concat(tmp)
local t2 = love.timer.getTime()
print(1000 * (t2 - t1), 'ms')
BlobWriter: 83.615149000252 ms
table.concat: 2581.4122849988 ms


Performance comparison when not being limited by table.concat:

Code: Select all

local t1 = love.timer.getTime()

local blob = BlobWriter()
for j = 1, 1e6 do
	blob:u32(j):u8(0)
end

local t2 = love.timer.getTime()
print(1000 * (t2 - t1), 'ms')
vs.

Code: Select all

local t1 = love.timer.getTime()

-- note: data is discarded here, while it's retained with BlobWriter
for j = 1, 1e6 do
	love.data.pack('string', 'I4B', j, 0)
end

local t2 = love.timer.getTime()
print(1000 * (t2 - t1), 'ms')
BlobWriter: 8.5714660017402 ms
love.data.pack: 281.65941900079 ms


In direct comparison, BlobWriter:pack is slower than love.data.pack.

Code: Select all

local t1 = love.timer.getTime()

local blob = BlobWriter()
for j = 1, 1e6 do
	blob:pack('LB', j, 0)
end

local t2 = love.timer.getTime()
print(1000 * (t2 - t1), 'ms')
BlobWriter:pack: 714.23170400521 ms (vs. the ~282 ms of love.data.pack from above)
But you can of course combine love.data.pack with BlobWriter.

Note that the comparison is flawed: it does not account for the cost of buffer concatenation, which is already built into BlobWriter:pack. Also, BlobWriter:pack does not support all features of love.data.pack.
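A sketch of combining the two (the raw-bytes method name is an assumption on my part - check the API docs for how to append a pre-built string):

Code: Select all

local blob = BlobWriter()
for j = 1, 1e6 do
	-- let love.data.pack do the encoding, let BlobWriter do the buffering
	blob:raw(love.data.pack('string', 'I4B', j, 0))
end
local str = blob:tostring()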
nikki93
Prole
Posts: 12
Joined: Mon Mar 19, 2018 9:52 am

Re: Blob.lua - binary serialization library

Post by nikki93 »

Nice! How does it compare to bitser https://github.com/gvx/bitser? I am using that along with a slight modification (https://github.com/expo/ghost-multi/com ... b6a1abe3ee) to handle entity references by id (to deserialize to existing entity on receiving end) for a multiplayer sync library (https://github.com/expo/ghost-multi/blo ... r/sync.lua) and am highly interested in fast + compact binary serialization.
grump
Party member
Posts: 947
Joined: Sat Jul 22, 2017 7:43 pm

Re: Blob.lua - binary serialization library

Post by grump »

nikki93 wrote: Mon Nov 05, 2018 11:35 pm Nice! How does it compare to bitser https://github.com/gvx/bitser?
I quickly made a test program to compare the speed of binser, bitser and Blob. Blob came out faster in almost all tests.

Tests include: a sequence of 64k numbers (sequentialManyNumbers), 64k uint32_t (sequentialManyU32), 32 numbers (sequentialFewNumbers), 32 uint8_t (sequentialFewU8), and a deeply nested table (deepTable).

[Benchmark chart: serialization/deserialization times for binser, bitser and Blob across the listed tests]

According to these results, serialization is pretty fast, but deserialization needs work. Take the results with a grain of salt though - these are not real life examples and I might even be using binser/bitser wrong.
I am using that along with a slight modification (https://github.com/expo/ghost-multi/com ... b6a1abe3ee) to handle entity references by id (to deserialize to existing entity on receiving end) for a multiplayer sync library (https://github.com/expo/ghost-multi/blo ... r/sync.lua) and am highly interested in fast + compact binary serialization.
Well, it's quite fast. Compactness depends on how you use it - if you know how many bits your values require and use it accordingly, it can be pretty much optimal with regards to data size. Just throwing tables at it won't do that. There's still room for optimization though.

Oh, and it doesn't have fancy class support like bitser. It supports simple values and nested tables, anything else will make it croak.
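For example (a hedged sketch of the limitation described above; the exact error message depends on the library):

Code: Select all

local writer = BlobWriter()
writer:write({ answer = 42, nested = { ok = true } }) -- fine: plain values, nested table
-- writer:write({ fn = function() end }) -- unsupported type, raises an error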
Attachments
blob_benchmark.love
(39.54 KiB) Downloaded 744 times
nikki93
Prole
Posts: 12
Joined: Mon Mar 19, 2018 9:52 am

Re: Blob.lua - binary serialization library

Post by nikki93 »

Awesome! Will experiment with this myself. Thanks so much for the benchmark example! It helps a lot.
MissDanish
Citizen
Posts: 65
Joined: Wed Mar 07, 2018 11:21 pm

Re: Blob.lua - binary serialization library

Post by MissDanish »

how do I save and load a simple table? I find the documentation a bit confusing. It just gives me a bytecode error

my code:

Code: Select all

local saveListPath = "saves/savelist.save"
local saves = {}
local currentSave = ""

function save_add(ID)
  local save = {
    ID = ID,
    data = {}
  }
  table.insert(saves,save)
end

function saves_load()
  if love.filesystem.getInfo(saveListPath) ~= nil then
    saves = BlobReader(love.filesystem.read(saveListPath))
  end
end

function saves_save()
  save_add("bah")

  local saveList = BlobWriter()
  saveList:write(saves)
  love.filesystem.createDirectory("saves")
  love.filesystem.write(saveListPath, saveList:tostring())
end

grump
Party member
Posts: 947
Joined: Sat Jul 22, 2017 7:43 pm

Re: Blob.lua - binary serialization library

Post by grump »

MissDanish wrote: Wed Nov 28, 2018 6:12 am how do I save and load a simple table? I find the documentation a bit confusing
The README has read and write examples right at the top, and there's a (moonscript) file I/O example in the 'examples' directory.

I just added a minimal example that shows how to read and write tables in LÖVE. Hope that helps.
my code:

Code: Select all

local saveListPath = "saves/savelist.save"
local saves = {}
local currentSave = ""

function save_add(ID)
  local save = {
    ID = ID,
    data = {}
  }
  table.insert(saves,save)
end

function saves_load()
  if love.filesystem.getInfo(saveListPath) ~= nil then
    saves = BlobReader(love.filesystem.read(saveListPath))
  end
end

function saves_save()
  save_add("bah")

  local saveList = BlobWriter()
  saveList:write(saves)
  love.filesystem.createDirectory("saves")
  love.filesystem.write(saveListPath, saveList:tostring())
end

love.filesystem.read returns data and size; you can't pass the call result directly to BlobReader(), because it expects the optional second parameter to be byte order, not size.

This will work:

Code: Select all

function saves_load()
  if love.filesystem.getInfo(saveListPath) ~= nil then
    local data = love.filesystem.read(saveListPath)
    saves = BlobReader(data):read()
  end
end
The save code is fine as is and should work as expected.