Protobuf to TypeScript transpiler

`protons` is a high performance implementation of Protocol Buffers v3. It transpiles `.proto` files to TypeScript and supports `BigInt`s for 64 bit types.
The `protons` module contains the code to compile `.proto` files to `.ts` files, while `protons-runtime` contains the code that serializes/deserializes messages to/from `Uint8Array`s during application execution. Please ensure you declare them as the correct type of dependencies - `protons` is only needed at build time, `protons-runtime` at run time:
$ npm install --save-dev protons
$ npm install --save protons-runtime
First, generate your `.ts` files:
$ protons ./path/to/foo.proto ./path/to/output.ts
Then run tsc over them as normal:
$ tsc
In your code import the generated classes and use them to transform to/from bytes:
import { Foo } from './foo.js'
const foo = {
  message: 'hello world'
}
const encoded = Foo.encode(foo)
const decoded = Foo.decode(encoded)
console.info(decoded.message)
// 'hello world'
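The `foo.proto` behind this example isn't shown here; assuming it declares a single `string message` field, the generated module exports `Foo` both as a TypeScript interface and as a value carrying `encode`/`decode`, so the round trip is fully typed:

```ts
// Assumed proto definition for the example above (illustrative only):
//
//   syntax = "proto3";
//   message Foo {
//     string message = 1;
//   }

import { Foo } from './foo.js'

// Foo can be used as a type annotation as well as a value
const foo: Foo = { message: 'hello world' }
const bytes: Uint8Array = Foo.encode(foo)
const decoded: Foo = Foo.decode(bytes)

console.info(decoded.message === foo.message) // true
```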
This module uses the internal reader/writer from protobuf.js as it is highly optimised and there's no point reinventing the wheel. It does have one or two differences:

- Supports proto3 semantics only
- All 64 bit values are represented as `BigInt`s and not `Long`s (e.g. `int64`, `uint64`, `sint64`, etc.)
- Unset `optional` fields are set on the deserialized object forms as `undefined` instead of the default values
- `singular` fields set to default values are not serialized, and are set to default values when deserialized if not set - protobuf.js diverges from the language guide around this feature
- `map` fields can have keys of any type - protobuf.js only supports strings
- `map` fields are deserialized as ES6 `Map`s - protobuf.js uses `Object`s
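To make those differences concrete, here is a sketch of what decoding looks like for a hypothetical message exercising 64 bit, `optional` and `map` fields (the message, field names and import path are illustrative, not part of this module):

```ts
// Hypothetical proto used only to illustrate the points above:
//
//   syntax = "proto3";
//   message Example {
//     int64 big = 1;                  // 64 bit value -> bigint
//     optional string note = 2;       // unset optional -> undefined
//     map<uint32, string> labels = 3; // non-string keys, decoded as a Map
//   }

import { Example } from './example.js'

const decoded = Example.decode(Example.encode({
  big: 5n,
  labels: new Map([[1, 'one']])
}))

console.info(typeof decoded.big)            // 'bigint'
console.info(decoded.note)                  // undefined - the optional field was not set
console.info(decoded.labels instanceof Map) // true
console.info(decoded.labels.get(1))         // 'one'
```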
To protect decoders from malicious payloads, it's possible to limit the maximum number of elements in repeated/map fields. You can either do this at compile time by using the protons.options extension:
message MyMessage {
  // repeatedField cannot have more than 10 entries
  repeated uint32 repeatedField = 1 [(protons.options).limit = 10];

  // stringMap cannot have more than 10 keys
  map<string, string> stringMap = 2 [(protons.options).limit = 10];
}
Or at runtime by passing an options object to the `.decode` function of your message:
const message = MyMessage.decode(buf, {
  limits: {
    repeatedField: 10,
    stringMap: 10
  }
})
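When a limit is exceeded, decoding throws instead of allocating unbounded memory, so untrusted input can be handled defensively. A minimal sketch (`untrustedBuf` is a placeholder, and the exact error thrown isn't covered here):

```ts
try {
  const message = MyMessage.decode(untrustedBuf, {
    limits: {
      repeatedField: 10,
      stringMap: 10
    }
  })
  // message is within the configured bounds and safe to process
} catch (err) {
  // the payload contained more than 10 repeatedField entries or stringMap keys
  console.error('rejected oversized payload', err)
}
```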
Sub messages with repeating elements can be limited in a similar way:
message SubMessage {
  repeated uint32 repeatedField = 1;
}

message MyMessage {
  SubMessage message = 1;
}
const message = MyMessage.decode(buf, {
  limits: {
    message: {
      repeatedField: 5 // the SubMessage cannot have more than 5 repeatedField entries
    }
  }
})
Sub messages defined in repeating elements can be limited by appending `$` to the field name in the runtime limit options:
message SubMessage {
  repeated uint32 repeatedField = 1;
}

message MyMessage {
  repeated SubMessage messages = 1;
}
const message = MyMessage.decode(buf, {
  limits: {
    messages: 5, // max 5x SubMessages
    messages$: {
      repeatedField: 5 // no SubMessage can have more than 5 repeatedField entries
    }
  }
})
Repeating fields in map entries can be limited by appending `$value` to the field name in the runtime limit options:
message SubMessage {
  repeated uint32 repeatedField = 1;
}

message MyMessage {
  map<string, SubMessage> messages = 1;
}
const message = MyMessage.decode(buf, {
  limits: {
    messages: 5, // max 5x SubMessages in the map
    messages$value: {
      repeatedField: 5 // no SubMessage in the map can have more than 5 repeatedField entries
    }
  }
})
By default, 64 bit types are deserialized as `BigInt`s. Sometimes this is undesirable for performance or code legibility reasons, so it's possible to override the JavaScript type that 64 bit fields will deserialize to:
message MyMessage {
  int64 bigintField = 1;
  int64 numberField = 2 [jstype = JS_NUMBER];
  int64 stringField = 3 [jstype = JS_STRING];
}
const message = MyMessage.decode(buf)
console.info(typeof message.bigintField) // bigint
console.info(typeof message.numberField) // number
console.info(typeof message.stringField) // string
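Assuming the encode side mirrors these types (which is what the generated interface implies), values are also supplied in the chosen representation when encoding. A small sketch:

```ts
const buf = MyMessage.encode({
  bigintField: 9007199254740993n,  // BigInt keeps full 64 bit precision
  numberField: 42,                 // plain number - beware values above Number.MAX_SAFE_INTEGER
  stringField: '9007199254740993'  // string form of the 64 bit value
})

const decoded = MyMessage.decode(buf)
console.info(decoded.stringField) // '9007199254740993'
```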
Some features are missing (`OneOf`s, etc.) because they have not been needed so far in ipfs/libp2p. If these features are important to you, please open PRs implementing them, along with tests comparing the generated bytes to protobuf.js and pbjs.
Licensed under either of
- Apache 2.0 (LICENSE-APACHE / http://www.apache.org/licenses/LICENSE-2.0)
- MIT (LICENSE-MIT / http://opensource.org/licenses/MIT)
Contributions welcome! Please check out the issues.
Also see our contributing document for more information on how we work, and about contributing in general.
Please be aware that all interactions related to this repo are subject to the IPFS Code of Conduct.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.