`ts-proto` transforms your `.proto` files into strongly-typed, idiomatic TypeScript files!
- ts-proto
- Overview
- QuickStart
- Goals
- Example Types
- Highlights
- Auto-Batching / N+1 Prevention
- Usage
- Sponsors
- Development
- Assumptions
- Todo
- OneOf Handling
- Default values and unset fields
- Well-Known Types
- Number Types
- Current Status of Optional Values
ts-proto generates TypeScript types from protobuf schemas.
I.e. given a `person.proto` schema like:

```proto
message Person {
  string name = 1;
}
```

ts-proto will generate a `person.ts` file like:

```ts
interface Person {
  name: string;
}

const Person = {
  encode(person): Writer { ... }
  decode(reader): Person { ... }
  toJSON(person): unknown { ... }
  fromJSON(data): Person { ... }
};
```

It also knows about services and will generate types for them as well, i.e.:

```ts
export interface PingService {
  ping(request: PingRequest): Promise<PingResponse>;
}
```

It will also generate client implementations of `PingService`; currently Twirp, grpc-web, grpc-js and nestjs are supported.
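For example, wiring a generated client to a transport might look like this (a sketch; `rpc` is a transport you implement, as described in the Usage section below, and the `PingRequest` field name is illustrative):

```ts
// Sketch: `PingServiceClientImpl` follows ts-proto's generated naming pattern;
// `rpc` is your transport implementation (see the "Usage" section).
const client = new PingServiceClientImpl(rpc);
// The request field name here is illustrative, not from a real schema.
const response: PingResponse = await client.ping({ payload: "ping!" });
```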
- `npm install ts-proto`
- `protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto --ts_proto_out=. ./simple.proto`
  - (Note that the output parameter name, `ts_proto_out`, is named based on the suffix of the plugin's name, i.e. the "ts_proto" suffix in the `--plugin=./node_modules/.bin/protoc-gen-ts_proto` parameter becomes the `_out` prefix, per `protoc`'s CLI conventions.)
  - On Windows, use `protoc --plugin=protoc-gen-ts_proto=".\\node_modules\\.bin\\protoc-gen-ts_proto.cmd" --ts_proto_out=. ./simple.proto` (see #93)
  - Ensure you're using a modern `protoc` (see installation instructions for your platform), i.e. `protoc` v3.0.0 doesn't support the `_opt` flag
This will generate *.ts source files for the given *.proto types.
If you want to package these source files into an npm package to distribute to clients, just run `tsc` on them as usual to generate the `.js`/`.d.ts` files, and deploy the output as a regular npm package.
If you're using Buf, pass `strategy: all` in your `buf.gen.yaml` file (docs).

```yaml
version: v1
plugins:
  - name: ts
    out: ../gen/ts
    strategy: all
    path: ../node_modules/ts-proto/protoc-gen-ts_proto
```

To prevent `buf push` from reading irrelevant `.proto` files, configure `buf.yaml` like so:

```yaml
build:
  excludes: [node_modules]
```

You can also use the official plugin published to the Buf Registry.

```yaml
version: v1
plugins:
  - plugin: buf.build/community/stephenh-ts-proto
    out: ../gen/ts
    opt:
      - outputServices=...
      - useExactTypes=...
```

If you're using a modern TS setup with either `esModuleInterop` or running in an ESM environment, you'll need to pass `ts_proto_opt`s of:

- `esModuleInterop=true` if using `esModuleInterop` in your `tsconfig.json`, and
- `importSuffix=.js` if executing the generated ts-proto code in an ESM environment
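For example, an illustrative invocation that combines both options: `protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto --ts_proto_out=. --ts_proto_opt=esModuleInterop=true --ts_proto_opt=importSuffix=.js ./simple.proto`.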
In terms of the code that ts-proto generates, the general goals are:
- Idiomatic TypeScript/ES6 types
  - ts-proto is a clean break from either the built-in Google/Java-esque JS code of `protoc` or the "make `.d.ts` files the `*.js` comments" approach of `protobufjs`
  - (Technically the `protobufjs/minimal` package is used for actually reading/writing bytes.)
- TypeScript-first output
- Interfaces over classes
  - As much as possible, types are just interfaces, so you can work with messages just like regular hashes/data structures.
- Only supports codegen, i.e. a `*.proto`-to-`*.ts` workflow; currently no runtime reflection/loading of dynamic `.proto` files
Note that ts-proto is not an out-of-the-box RPC framework; instead it's more of a swiss-army knife (as witnessed by its many config options) that lets you build exactly the RPC framework you'd like on top of it (i.e. one that best integrates with your company's protobuf ecosystem; for better or worse, protobuf RPC is still a somewhat fragmented ecosystem).
If you'd like an out-of-the-box RPC framework built on top of ts-proto, there are a few examples:
(Note for potential contributors, if you develop other frameworks/mini-frameworks, or even blog posts/tutorials, on using ts-proto, we're happy to link to them.)
The generated types are "just data", i.e.:

```ts
export interface Simple {
  name: string;
  age: number;
  createdAt: Date | undefined;
  child: Child | undefined;
  state: StateEnum;
  grandChildren: Child[];
  coins: number[];
}
```

Along with `encode`/`decode` factory methods:

```ts
export const Simple = {
  create(baseObject?: DeepPartial<Simple>): Simple {
    ...
  },

  encode(message: Simple, writer: Writer = Writer.create()): Writer {
    ...
  },

  decode(reader: Reader, length?: number): Simple {
    ...
  },

  fromJSON(object: any): Simple {
    ...
  },

  fromPartial(object: DeepPartial<Simple>): Simple {
    ...
  },

  toJSON(message: Simple): unknown {
    ...
  },
};
```

This allows idiomatic TS/JS usage like:

```ts
const bytes = Simple.encode({ name: ..., age: ..., ... }).finish();
const simple = Simple.decode(Reader.create(bytes));
const { name, age } = simple;
```

Which can dramatically ease integration when converting to/from other layers without creating a class and calling the right getters/setters.
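For example, `fromPartial` (shown above) fills in defaults for any omitted fields, so you can build a full message from a sparse literal (a sketch using the `Simple` type above):

```ts
// fromPartial fills in proto3 defaults for every field left out of the literal.
const simple = Simple.fromPartial({ name: "foo" });
// => name: "foo", age: 0, grandChildren: [], coins: [], etc.
```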
- A poor man's attempt at "please give us back optional types"

  The canonical protobuf wrapper types, i.e. `google.protobuf.StringValue`, are mapped as optional values, i.e. `string | undefined`, which means for primitives we can kind of pretend the protobuf type system has optional types.

  (Update: ts-proto now also supports the proto3 `optional` keyword.)

- Timestamps are mapped as `Date`

  (Configurable with the `useDate` parameter.)

- `fromJSON`/`toJSON` use the proto3 canonical JSON encoding format (e.g. timestamps are ISO strings), unlike `protobufjs`.

- ObjectIds can be mapped as `mongodb.ObjectId`

  (Configurable with the `useMongoObjectId` parameter.)
(Note: this is currently only supported by the Twirp clients.)
If you're using ts-proto's clients to call backend micro-services, similar to the N+1 problem in SQL applications, it is easy for micro-service clients to (when serving an individual request) inadvertently trigger multiple separate RPC calls for "get book 1", "get book 2", "get book 3", that should really be batched into a single "get books [1, 2, 3]" (assuming the backend supports a batch-oriented RPC method).
ts-proto can help with this, and essentially auto-batch your individual "get book" calls into batched "get books" calls.
For ts-proto to do this, you need to implement your service's RPC methods with the batching convention of:
- A method name of `Batch<OperationName>`
- The `Batch<OperationName>` input type has a single repeated field (i.e. `repeated string ids = 1`)
- The `Batch<OperationName>` output type has either:
  - A single repeated field (i.e. `repeated Foo foos = 1`) where the output order is the same as the input `ids` order, or
  - A map of the input to an output (i.e. `map<string, Entity> entities = 1;`)
When ts-proto recognizes methods of this pattern, it will automatically create a "non-batch" version of `<OperationName>` for the client, i.e. `client.Get<OperationName>`, that takes a single id and returns a single result.

This provides the client code with the illusion that it can make individual `Get<OperationName>` calls (which is generally preferable/easier when implementing the client's business logic), but the actual implementation that ts-proto provides will end up making `Batch<OperationName>` calls to the backend service.
You also need to enable the useContext=true build-time parameter, which gives all client methods a Go-style ctx parameter, with a getDataLoaders method that lets ts-proto cache/resolve request-scoped DataLoaders, which provide the fundamental auto-batch detection/flushing behavior.
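For example, given a hypothetical `BookService` whose `BatchGetBooks` RPC follows the batching convention above (all names below are illustrative, not generated output), client code might look like:

```ts
// A sketch, not generated output: assumes a BookService whose BatchGetBooks
// RPC follows the batching convention, so ts-proto synthesizes GetBook.
interface Book {
  id: string;
  title: string;
}

// With useContext=true, every client method takes a Go-style ctx first;
// the ctx shape is simplified here.
interface Context {
  getDataLoaders(): unknown;
}

interface BookServiceClient {
  GetBook(ctx: Context, id: string): Promise<Book>;
  BatchGetBooks(ctx: Context, request: { ids: string[] }): Promise<{ books: Book[] }>;
}

async function renderShelf(ctx: Context, client: BookServiceClient): Promise<Book[]> {
  // These read as three individual RPCs, but the DataLoader-backed client
  // flushes them as a single BatchGetBooks({ ids: ["1", "2", "3"] }) call.
  return Promise.all([
    client.GetBook(ctx, "1"),
    client.GetBook(ctx, "2"),
    client.GetBook(ctx, "3"),
  ]);
}
```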
See the batching.proto file and related tests for examples/more details.
But the net effect is that ts-proto can provide SQL-/ORM-style N+1 prevention for client calls, which can be critical especially in high-volume / highly-parallel implementations like GraphQL front-end gateways calling backend micro-services.
ts-proto is a protoc plugin, so you run it by (either directly in your project, or more likely in your mono-repo schema pipeline, i.e. like Ibotta or Namely):
- Add `ts-proto` to your `package.json`
- Run `npm install` to download it
- Invoke `protoc` with a `plugin` parameter like:

```shell
protoc --plugin=node_modules/ts-proto/protoc-gen-ts_proto ./batching.proto -I.
```

ts-proto can also be invoked with Gradle using the protobuf-gradle-plugin:
```groovy
protobuf {
  plugins {
    // `ts` can be replaced by any unused plugin name, e.g. `tsproto`
    ts {
      path = 'path/to/plugin'
    }
  }

  // This section only needed if you provide plugin options
  generateProtoTasks {
    all().each { task ->
      task.plugins {
        // Must match plugin ID declared above
        ts {
          option 'foo=bar'
        }
      }
    }
  }
}
```

Generated code will be placed in the Gradle build directory.
- With `--ts_proto_opt=globalThisPolyfill=true`, ts-proto will include a polyfill for globalThis.

  Defaults to `false`, i.e. we assume `globalThis` is available.

- With `--ts_proto_opt=context=true`, the services will have a Go-style `ctx` parameter, which is useful for tracing/logging/etc. if you're not using node's `async_hooks` api due to performance reasons.

- With `--ts_proto_opt=forceLong=long`, all 64-bit numbers will be parsed as instances of `Long` (using the long library).

  With `--ts_proto_opt=forceLong=string`, all 64-bit numbers will be output as strings.

  With `--ts_proto_opt=forceLong=bigint`, all 64-bit numbers will be output as `BigInt`s. This option still uses the `long` library to encode/decode internally within `protobuf.js`, but then converts to/from `BigInt`s in the ts-proto-generated code.

  The default behavior is `forceLong=number`, which will internally still use the `long` library to encode/decode values on the wire (so you will still see a `util.Long = Long` line in your output), but will convert the `long` values to `number` automatically for you. Note that a runtime error is thrown if, while doing this conversion, a 64-bit value is larger than can be correctly stored as a `number`.

- With `--ts_proto_opt=esModuleInterop=true`, the output changes to be `esModuleInterop`-compliant.

  Specifically the `Long` imports will be generated as `import Long from 'long'` instead of `import * as Long from 'long'`.

- With `--ts_proto_opt=env=node` or `browser` or `both`, ts-proto will make environment-specific assumptions in your output. This defaults to `both`, which makes no environment-specific assumptions.

  Using `node` changes the types of `bytes` from `Uint8Array` to `Buffer` for easier integration with the node ecosystem, which generally uses `Buffer`.

  Currently `browser` doesn't have any specific behavior other than being "not `node`". It probably will soon/at some point.
- With `--ts_proto_opt=useOptionals=messages` (for message fields) or `--ts_proto_opt=useOptionals=all` (for message and scalar fields), fields are declared as optional keys, e.g. `field?: Message` instead of the default `field: Message | undefined`.

  ts-proto defaults to `useOptionals=none` because it:

  - Prevents typos when initializing messages, and
  - Provides the most consistent API to readers
  - Ensures production messages are properly initialized with all fields.

  For typo prevention, optional fields make it easy for extra fields to slip into a message (until we get Exact Types), i.e.:

  ```ts
  interface SomeMessage {
    firstName: string;
    lastName: string;
  }

  // Declared with a typo
  const data = { firstName: "a", lastTypo: "b" };

  // With useOptionals=none, this correctly fails to compile; if `lastName` was optional, it would not
  const message: SomeMessage = { ...data };
  ```

  For a consistent API, if `SomeMessage.lastName` is optional `lastName?`, then readers have to check two empty conditions: a) is `lastName` `undefined` (b/c it was created in-memory and left unset), or b) is `lastName` empty string (b/c we read `SomeMessage` off the wire and, per the proto3 spec, initialized `lastName` to empty string)?

  For ensuring proper initialization, if later `SomeMessage.middleInitial` is added, but it's marked as optional `middleInitial?`, you may have many call sites in production code that should now be passing `middleInitial` to create a valid `SomeMessage`, but are not.

  So, between typo-prevention, reader inconsistency, and proper initialization, ts-proto recommends using `useOptionals=none` as the "most safe" option.

  All that said, this approach does require writers/creators to set every field (although `fromPartial` and `create` are meant to address this), so if you still want to have optional keys, you can set `useOptionals=messages` or `useOptionals=all`.

  (See this issue and this issue for discussions on `useOptional`.)
- With `--ts_proto_opt=exportCommonSymbols=false`, utility types like `DeepPartial` and `protobufPackage` won't be `export`ed.

  This should make it possible to create barrel imports of the generated output, i.e. `import * from ./foo` and `import * from ./bar`.

  Note that if you have the same message name used in multiple `*.proto` files, you will still get import conflicts.

- With `--ts_proto_opt=oneof=unions`, `oneof` fields will be generated as ADTs.

  See the "OneOf Handling" section.

- With `--ts_proto_opt=unrecognizedEnumName=<NAME>`, enums will contain a key `<NAME>` with the value of the `unrecognizedEnumValue` option.

  Defaults to `UNRECOGNIZED`.

- With `--ts_proto_opt=unrecognizedEnumValue=<NUMBER>`, enums will contain a key provided by the `unrecognizedEnumName` option with the value of `<NUMBER>`.

  Defaults to `-1`.

- With `--ts_proto_opt=unrecognizedEnum=false`, enums will not contain an unrecognized enum key and value as provided by the `unrecognizedEnumName` and `unrecognizedEnumValue` options.

- With `--ts_proto_opt=removeEnumPrefix=true`, generated enums will have the enum name removed from members.

  `FooBar.FOO_BAR_BAZ = "FOO_BAR_BAZ"` will generate `FooBar.BAZ = "FOO_BAR_BAZ"`.

- With `--ts_proto_opt=lowerCaseServiceMethods=true`, the method names of service methods will be lowered/camel-case, i.e. `service.findFoo` instead of `service.FindFoo`.

- With `--ts_proto_opt=snakeToCamel=false`, fields will be kept snake case in both the message keys and the `toJSON`/`fromJSON` methods.

  `snakeToCamel` can also be set as a `_`-delimited list of strings (comma is reserved as the flag delimiter), i.e. `--ts_proto_opt=snakeToCamel=keys_json`, where including `keys` will make message keys be camel case and including `json` will make JSON keys be camel case.

  Empty string, i.e. `snakeToCamel=`, will keep both message keys and `JSON` keys as snake case (it is the same as `snakeToCamel=false`).

  Note that to use the `json_name` attribute, you'll have to use the `json` value.

  The default behavior is `keys_json`, i.e. both will be camel cased, and `json_name` will be used if set.
- With `--ts_proto_opt=outputEncodeMethods=false`, the `Message.encode` and `Message.decode` methods for working with protobuf-encoded/binary data will not be output.

  This is useful if you want "only types".

- With `--ts_proto_opt=outputJsonMethods=false`, the `Message.fromJSON` and `Message.toJSON` methods for working with JSON-coded data will not be output.

  This is also useful if you want "only types".

- With `--ts_proto_opt=outputJsonMethods=to-only` and `--ts_proto_opt=outputJsonMethods=from-only`, you will be able to export only one of the `Message.toJSON` and `Message.fromJSON` methods.

  This is useful if you're using ts-proto just to `encode` or `decode` and not for both.

- With `--ts_proto_opt=outputPartialMethods=false`, the `Message.fromPartial` and `Message.create` methods for accepting partially-formed objects/object literals will not be output.

- With `--ts_proto_opt=stringEnums=true`, the generated enum types will be string-based instead of int-based.

  This is useful if you want "only types" and are using a gRPC REST Gateway configured to serialize enums as strings.

  (Requires `outputEncodeMethods=false`.)

- With `--ts_proto_opt=outputClientImpl=false`, the client implementations, i.e. `FooServiceClientImpl`, that implement the client-side (in Twirp, see next option for `grpc-web`) RPC interfaces will not be output.

- With `--ts_proto_opt=outputClientImpl=grpc-web`, the client implementations, i.e. `FooServiceClientImpl`, will use the @improbable-eng/grpc-web library at runtime to send grpc messages to a grpc-web backend.

  (Note that this only uses the grpc-web runtime, you don't need to use any of their generated code, i.e. the ts-proto output replaces their `ts-protoc-gen` output.)

  You'll need to add the `@improbable-eng/grpc-web` package and a transport to your project's `package.json`; see the `integration/grpc-web` directory for a working example. Also see #504 for integrating with grpc-web-devtools.
- With `--ts_proto_opt=returnObservable=true`, the return type of service methods will be `Observable<T>` instead of `Promise<T>`.

- With `--ts_proto_opt=addGrpcMetadata=true`, the last argument of service methods will accept the grpc `Metadata` type, which contains additional information with the call (i.e. access tokens/etc.).

  (Requires `nestJs=true`.)

- With `--ts_proto_opt=addNestjsRestParameter=true`, the last argument of service methods will be a rest parameter with type `any`. This way you can use custom decorators you could normally use in nestjs.

  (Requires `nestJs=true`.)

- With `--ts_proto_opt=nestJs=true`, the defaults will change to generate NestJS protobuf-friendly types & service interfaces that can be used in both the client-side and server-side of NestJS protobuf implementations. See the nestjs readme for more information and implementation examples.

  Specifically `outputEncodeMethods`, `outputJsonMethods`, and `outputClientImpl` will all be false, `lowerCaseServiceMethods` will be true and `outputServices` will be ignored.

  Note that `addGrpcMetadata`, `addNestjsRestParameter` and `returnObservable` will still be false.

- With `--ts_proto_opt=useDate=false`, fields of type `google.protobuf.Timestamp` will not be mapped to type `Date` in the generated types. See Timestamp for more details.

- With `--ts_proto_opt=useMongoObjectId=true`, fields of a type called ObjectId, where the message is constructed to have one field called `value` that is a string, will be mapped to type `mongodb.ObjectId` in the generated types. This will require your project to install the mongodb npm package. See ObjectId for more details.
- With `--ts_proto_opt=outputSchema=true`, meta typings will be generated that can later be used in other code generators.

- With `--ts_proto_opt=outputTypeAnnotations=true`, each message will be given a `$type` field containing its fully-qualified name. You can use `--ts_proto_opt=outputTypeAnnotations=static-only` to omit it from the `interface` declaration.

- With `--ts_proto_opt=outputTypeRegistry=true`, the type registry will be generated that can be used to resolve message types by fully-qualified name. Also, each message will be given a `$type` field containing its fully-qualified name.

- With `--ts_proto_opt=outputServices=grpc-js`, ts-proto will output service definitions and server / client stubs in grpc-js format.

- With `--ts_proto_opt=outputServices=generic-definitions`, ts-proto will output generic (framework-agnostic) service definitions. These definitions contain descriptors for each method with links to request and response types, which allows you to generate server and client stubs at runtime, and also generate strong types for them at compile time. An example of a library that uses this approach is nice-grpc.

- With `--ts_proto_opt=outputServices=nice-grpc`, ts-proto will output server and client stubs for nice-grpc. This should be used together with generic definitions, i.e. you should specify two options: `outputServices=nice-grpc,outputServices=generic-definitions`.

- With `--ts_proto_opt=metadataType=Foo@./some-file`, ts-proto adds a generic (framework-agnostic) metadata field to the generic service definition.

- With `--ts_proto_opt=outputServices=generic-definitions,outputServices=default`, ts-proto will output both generic definitions and interfaces. This is useful if you want to rely on the interfaces, but also have some reflection capabilities at runtime.

- With `--ts_proto_opt=outputServices=false`, or `=none`, ts-proto will output NO service definitions.

- With `--ts_proto_opt=useAbortSignal=true`, the generated services will accept an `AbortSignal` to cancel RPC calls.

- With `--ts_proto_opt=useAsyncIterable=true`, the generated services will use `AsyncIterable` instead of `Observable`.

- With `--ts_proto_opt=emitImportedFiles=false`, ts-proto will not emit `google/protobuf/*` files unless you explicitly add files to `protoc` like this: `protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto my_message.proto google/protobuf/duration.proto`
- With `--ts_proto_opt=fileSuffix=<SUFFIX>`, ts-proto will emit generated files using the specified suffix. A `helloworld.proto` file with `fileSuffix=.pb` would be generated as `helloworld.pb.ts`. This is common behavior in other protoc plugins and provides a way to quickly glob all the generated files.

- With `--ts_proto_opt=importSuffix=<SUFFIX>`, ts-proto will emit file imports using the specified suffix. An import of `helloworld.ts` with `importSuffix=.js` would generate `import "helloworld.js"`. The default is to import without a file extension. Supported by TypeScript 4.7.x and up.

- With `--ts_proto_opt=enumsAsLiterals=true`, the generated enum types will be enum-ish objects with `as const`.

- With `--ts_proto_opt=useExactTypes=false`, the generated `fromPartial` and `create` methods will not use Exact types.

  The default behavior is `useExactTypes=true`, which makes `fromPartial` and `create` use an Exact type for their argument to make TypeScript reject any unknown properties.

- With `--ts_proto_opt=unknownFields=true`, all unknown fields will be parsed and output as arrays of buffers.

- With `--ts_proto_opt=onlyTypes=true`, only types will be emitted, and imports for `long` and `protobufjs/minimal` will be excluded.

  This is the same as setting `outputJsonMethods=false,outputEncodeMethods=false,outputClientImpl=false,nestJs=false`.

- With `--ts_proto_opt=usePrototypeForDefaults=true`, the generated code will wrap new objects with `Object.create`.

  This allows code to do hazzer checks to detect when default values have been applied, which, due to proto3's behavior of not putting default values on the wire, is typically only useful for interacting with proto2 messages.

  When enabled, default values are inherited from a prototype, and so code can use `Object.keys(message).includes("someField")` to detect if `someField` was actually decoded or not.

  Note that, as indicated, this means `Object.keys` will not include set-by-default fields, so if you have code that iterates over message keys in a generic fashion, it will have to also iterate over keys inherited from the prototype.
- With `--ts_proto_opt=useJsonName=true`, `json_name` defined in protofiles will be used instead of message field names.

- With `--ts_proto_opt=useJsonWireFormat=true`, the generated code will reflect the JSON representation of Protobuf messages.

  Requires `onlyTypes=true`. Implies `useDate=string` and `stringEnums=true`. This option is to generate types that can be directly used with marshalling/unmarshalling Protobuf messages serialized as JSON. You may also want to set `useOptionals=all`, as gRPC gateways are not required to send default values for scalar values.

- With `--ts_proto_opt=useNumericEnumForJson=true`, the JSON converter (`toJSON`) will encode enum values as int, rather than a string literal.

- With `--ts_proto_opt=initializeFieldsAsUndefined=false`, all optional field initializers will be omitted from the generated base instances.

- With `--ts_proto_opt=Mgoogle/protobuf/empty.proto=./google3/protobuf/empty` ('M' means 'importMapping', similar to protoc-gen-go), the generated code import path for `./google/protobuf/empty.ts` will reflect the overridden value:

  - `Mfoo/bar.proto=@myorg/some-lib` will map `foo/bar.proto` imports into `import ... from '@myorg/some-lib'`.
  - `Mfoo/bar.proto=./some/local/lib` will map `foo/bar.proto` imports into `import ... from './some/local/lib'`.
  - `Mfoo/bar.proto=some-modules/some-lib` will map `foo/bar.proto` imports into `import ... from 'some-modules/some-lib'`.
  - Note: Uses are accumulated, so multiple values are expected in the form of `--ts_proto_opt=M... --ts_proto_opt=M...` (one `ts_proto_opt` per mapping).
  - Note: Proto files that match mapped imports will not be generated.

- With `--ts_proto_opt=useMapType=true`, the generated code for protobuf `map<key_type, value_type>` will become `Map<key_type, value_type>`, which uses the JavaScript Map type.

  The default behavior is `useMapType=false`, which makes it generate the code for protobuf `map<key_type, value_type>` with the key-value pair like `{ [key: key_type]: value_type }`.

- With `--ts_proto_opt=useReadonlyTypes=true`, the generated types will be declared as immutable using typescript's `readonly` modifier.

- With `--ts_proto_opt=useSnakeTypeName=false`, snake casing will be removed from types.

  Example Protobuf:

  ```proto
  message Box {
    message Element {
      message Image {
        enum Alignment {
          LEFT = 1;
          CENTER = 2;
          RIGHT = 3;
        }
      }
    }
  }
  ```

  By default this option is enabled, which would generate a type of `Box_Element_Image_Alignment`. By disabling this option, the type that is generated would be `BoxElementImageAlignment`.

- With `--ts_proto_opt=outputExtensions=true`, the generated code will include proto2 extensions.

  Extension encode/decode methods are compliant with the `outputEncodeMethods` option, and if `unknownFields=true`, the `setExtension` and `getExtension` methods will be created for extendable messages, also compliant with `outputEncodeMethods` (`setExtension = encode`, `getExtension = decode`).

- With `--ts_proto_opt=outputIndex=true`, index files will be generated based on the proto package namespaces.

  This will disable `exportCommonSymbols` to avoid name collisions on the common symbols.

- With `--ts_proto_opt=emitDefaultValues=json-methods`, the generated `toJSON` method will emit scalars like `0` and `""` as json fields.
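For example, using the `Foo` message from the "Default values and unset fields" section below:

```ts
// Default behavior omits default values from the JSON output:
Foo.toJSON({ bar: "" }); // => { }
// With emitDefaultValues=json-methods, defaults are emitted:
Foo.toJSON({ bar: "" }); // => { bar: '' }
```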
We have a great way of working together with nestjs. ts-proto generates interfaces and decorators for your controller and client. For more information see the nestjs readme.
If you want to run ts-proto on every change of a proto file, you'll need to use a tool like chokidar-cli and use it as a script in `package.json`:

```json
"proto:generate": "protoc --ts_proto_out=. ./<proto_path>/<proto_name>.proto --ts_proto_opt=esModuleInterop=true",
"proto:watch": "chokidar \"**/*.proto\" -c \"npm run proto:generate\""
```

ts-proto is RPC framework agnostic - how you transmit your data to and from your data source is up to you. The generated client implementations all expect an `rpc` parameter, whose type is defined like this:
```ts
interface Rpc {
  request(service: string, method: string, data: Uint8Array): Promise<Uint8Array>;
}
```

If you're working with gRPC, a simple implementation could look like this:

```ts
import { ChannelCredentials, Client } from "@grpc/grpc-js";

// grpc-js's callback type for unary calls, declared here for self-containment
type UnaryCallback<ResponseType> = (err: Error | null, value?: ResponseType) => void;

const conn = new Client("localhost:8765", ChannelCredentials.createInsecure());

type RpcImpl = (service: string, method: string, data: Uint8Array) => Promise<Uint8Array>;

const sendRequest: RpcImpl = (service, method, data) => {
  // Conventionally in gRPC, the request path looks like
  //   "package.names.ServiceName/MethodName",
  // we therefore construct such a string
  const path = `/${service}/${method}`;

  return new Promise((resolve, reject) => {
    // makeUnaryRequest transmits the result (and error) with a callback
    // transform this into a promise!
    const resultCallback: UnaryCallback<any> = (err, res) => {
      if (err) {
        return reject(err);
      }
      resolve(res);
    };

    function passThrough(argument: any) {
      return argument;
    }

    // Using passThrough as the serialize and deserialize functions
    conn.makeUnaryRequest(path, passThrough, passThrough, data, resultCallback);
  });
};

const rpc: Rpc = { request: sendRequest };
```

Kudos to our sponsors:
- ngrok funded ts-proto's initial grpc-web support.
If you need ts-proto customizations or priority support for your company, you can ping me via email.
This section describes how to contribute directly to ts-proto, i.e. it's not required for running ts-proto in protoc or using the generated TypeScript.
Requirements
Setup
The commands below assume you have Docker installed. To use a local copy of protoc without docker, use commands suffixed with `:local`. If you are using OS X, install coreutils: `brew install coreutils`.
- Check out the repository for the latest code.
- Run `yarn install` to install the dependencies.
- Run `yarn build:test` to generate the test files.

  This runs the following commands:

  - `proto2bin` — Converts integration test `.proto` files to `.bin`.
  - `bin2ts` — Runs `ts-proto` on the `.bin` files to generate `.ts` files.
  - `proto2pbjs` — Generates a reference implementation using `pbjs` for testing compatibility.

- Run `yarn test`
Workflow

- Add/update an integration test for your use case
  - Either find an existing `integration/*` test that is close enough to your use case, e.g. has a `parameters.txt` that matches the `ts_proto_opt` params necessary to reproduce your use case
  - If creating a new integration test:
    - Make a new `integration/your-new-test/parameters.txt` with the necessary `ts_proto_opt` params
    - Create a minimal `integration/your-new-test/your-new-test.proto` schema to reproduce your use case
  - After any changes to `your-new-test.proto`, or an existing `integration/*.proto` file, run `yarn proto2bin`
    - You can also leave `yarn watch` running, and it should "just do the right thing"
  - Add/update an `integration/your-new-test/some-test.ts` unit test, even if it's as trivial as just making sure the generated code compiles
- Modify the `ts-proto` code generation logic:
  - Most important logic is found in src/main.ts.
  - After any changes to `src/*.ts` files, run `yarn bin2ts` to re-codegen all integration tests
    - Or `yarn bin2ts your-new-test` to re-codegen a specific test
    - Again, leaving `yarn watch` running should "just do the right thing"
  - Run `yarn test` to verify your changes pass all existing tests
- Commit and submit a PR
  - Run `yarn format` to format the typescript files.
  - Make sure to `git add` all of the `*.proto`, `*.bin`, and `*.ts` files in `integration/your-new-test`
    - Sometimes checking in generated code is frowned upon, but given ts-proto's main job is to generate code, seeing the codegen diffs in PRs is helpful
Dockerized Protoc
The repository includes a dockerized version of protoc, which is configured in docker-compose.yml.
It can be useful in case you want to manually invoke the plugin with a known version of protoc.
Usage:

```bash
# Include the protoc alias in your shell.
. aliases.sh

# Run protoc as usual. The ts-proto directory is available in /ts-proto.
protoc --plugin=/ts-proto/protoc-gen-ts_proto --ts_proto_out=./output -I=./protos ./protoc/*.proto

# Or use the ts-protoc alias which specifies the plugin path for you.
ts-protoc --ts_proto_out=./output -I=./protos ./protoc/*.proto
```

- All paths must be relative paths within the current working directory of the host. `../` is not allowed
- Within the docker container, the absolute path to the project root is `/ts-proto`
- The container mounts the current working directory in `/host`, and sets it as its working directory.
- Once `aliases.sh` is sourced, you can use the `protoc` command in any folder.
- TS/ES6 module name is the proto package
- Support the string-based encoding of duration in `fromJSON`/`toJSON`
- Make `oneof=unions` the default behavior in 2.0
- Probably change `forceLong` default in 2.0, should default to `forceLong=long`
- Make `esModuleInterop=true` the default in 2.0
By default, ts-proto models oneof fields "flatly" in the message, e.g. a message like:
```proto
message Foo {
  oneof either_field {
    string field_a = 1;
    string field_b = 2;
  }
}
```

Will generate a `Foo` type with two fields: `field_a: string | undefined;` and `field_b: string | undefined`.
With this output, you'll have to check both `if (object.field_a)` and `if (object.field_b)`, and if you set one, you'll have to remember to unset the other.
Instead, we recommend using the `oneof=unions` option, which will change the output to be an Abstract Data Type/ADT like:

```ts
interface YourMessage {
  eitherField?: { $case: "field_a"; field_a: string } | { $case: "field_b"; field_b: string };
}
```

As this will automatically enforce only one of `field_a` or `field_b` "being set" at a time, because the values are stored in the `eitherField` field that can only have a single value at a time.

(Note that `eitherField` is optional b/c `oneof` in Protobuf means "at most one field" is set, and does not mean one of the fields must be set.)
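For example, reading and writing the union shown above might look like this (a sketch based on the `YourMessage` shape):

```ts
// A sketch using the YourMessage shape above (oneof=unions output).
const message: YourMessage = {
  eitherField: { $case: "field_a", field_a: "hello" },
};

// Narrow on $case to read the value in a type-safe way.
if (message.eitherField?.$case === "field_a") {
  console.log(message.eitherField.field_a); // "hello"
}

// Setting field_b replaces the whole eitherField value, so field_a
// cannot accidentally stay set at the same time.
message.eitherField = { $case: "field_b", field_b: "world" };
```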
In ts-proto's currently-unscheduled 2.x release, oneof=unions will become the default behavior.
In core Protobuf (and so also ts-proto), values that are unset or equal to the default value are not sent over the wire.
For example, the default value of a message is undefined. Primitive types take their natural default value, e.g. string is '', number is 0, etc.
Protobuf chose/enforces this behavior because it enables forward compatibility, as primitive fields will always have a value, even when omitted by outdated agents.
This is good, but it also means default and unset values cannot be distinguished in ts-proto fields; it's just fundamentally how Protobuf works.
If you need primitive fields where you can detect set/unset, see Wrapper Types.
Encode / Decode
ts-proto follows the Protobuf rules, and always returns default values for unset fields when decoding, while omitting them from the output when serializing in binary format.
syntax = "proto3";
message Foo {
string bar = 1;
}protobufBytes; // assume this is an empty Foo object, in protobuf binary format
Foo.decode(protobufBytes); // => { bar: '' }Foo.encode({ bar: "" }); // => { }, writes an empty Foo object, in protobuf binary formatfromJSON / toJSON
Reading JSON will also initialize the default values. Since senders may either omit unset fields, or set them to the default value, use fromJSON to normalize the input.
Foo.fromJSON({}); // => { bar: '' }
Foo.fromJSON({ bar: "" }); // => { bar: '' }
Foo.fromJSON({ bar: "baz" }); // => { bar: 'baz' }When writing JSON, ts-proto normalizes messages by omitting unset fields and fields set to their default values.
Foo.toJSON({}); // => { }
Foo.toJSON({ bar: undefined }); // => { }
Foo.toJSON({ bar: "" }); // => { } - note: omitting the default value, as expected
Foo.toJSON({ bar: "baz" }); // => { bar: 'baz' }Protobuf comes with several predefined message definitions, called "Well-Known Types". Their interpretation is defined by the Protobuf specification, and libraries are expected to convert these messages to corresponding native types in the target language.
ts-proto currently automatically converts these messages to their corresponding native types.
- `google.protobuf.BoolValue` ⇆ `boolean`
- `google.protobuf.BytesValue` ⇆ `Uint8Array`
- `google.protobuf.DoubleValue` ⇆ `number`
- `google.protobuf.FieldMask` ⇆ `string[]`
- `google.protobuf.FloatValue` ⇆ `number`
- `google.protobuf.Int32Value` ⇆ `number`
- `google.protobuf.Int64Value` ⇆ `number`
- `google.protobuf.ListValue` ⇆ `any[]`
- `google.protobuf.UInt32Value` ⇆ `number`
- `google.protobuf.UInt64Value` ⇆ `number`
- `google.protobuf.StringValue` ⇆ `string`
- `google.protobuf.Value` ⇆ `any` (i.e. `number | string | boolean | null | array | object`)
- `google.protobuf.Struct` ⇆ `{ [key: string]: any }`
Wrapper Types are messages containing a single primitive field, and can be imported in `.proto` files with `import "google/protobuf/wrappers.proto"`.

Since these are messages, their default value is `undefined`, allowing you to distinguish unset primitives from their default values, when using Wrapper Types.

ts-proto generates these fields as `<primitive> | undefined`.
For example:

```proto
// Protobuf
syntax = "proto3";

import "google/protobuf/wrappers.proto";

message ExampleMessage {
  google.protobuf.StringValue name = 1;
}
```

```ts
// TypeScript
interface ExampleMessage {
  name: string | undefined;
}
```

When encoding a message, the primitive value is converted back to its corresponding wrapper type:

```ts
ExampleMessage.encode({ name: "foo" }); // => { name: { value: 'foo' } }, in binary
```

When calling `toJSON`, the value is not converted, because wrapper types are idiomatic in JSON:

```ts
ExampleMessage.toJSON({ name: "foo" }); // => { name: 'foo' }
```

Protobuf's language and types are not sufficient to represent all possible JSON values, since JSON may contain values whose type is unknown in advance. For this reason, Protobuf offers several additional types to represent arbitrary JSON values.
These are called Struct Types, and can be imported in `.proto` files with `import "google/protobuf/struct.proto"`.

- `google.protobuf.Value` ⇆ `any`
  - This is the most general type, and can represent any JSON value (i.e. `number | string | boolean | null | array | object`).
- `google.protobuf.ListValue` ⇆ `any[]`
  - To represent a JSON array
- `google.protobuf.Struct` ⇆ `{ [key: string]: any }`
  - To represent a JSON object
ts-proto automatically converts back and forth between these Struct Types and their corresponding JSON types.
Example:

```proto
// Protobuf
syntax = "proto3";

import "google/protobuf/struct.proto";

message ExampleMessage {
  google.protobuf.Value anything = 1;
}
```

```ts
// TypeScript
interface ExampleMessage {
  anything: any | undefined;
}
```

Encoding a JSON value embedded in a message converts it to a Struct Type:

```ts
ExampleMessage.encode({ anything: { name: "hello" } });
/* Outputs the following structure, encoded in protobuf binary format:
{
  anything: Value {
    structValue = Struct {
      fields = [
        MapEntry {
          key = "name",
          value = Value {
            stringValue = "hello"
          }
        }
      ]
    }
  }
}*/

ExampleMessage.encode({ anything: true });
/* Outputs the following structure encoded in protobuf binary format:
{
  anything: Value {
    boolValue = true
  }
}*/
```

The representation of `google.protobuf.Timestamp` is configurable by the `useDate` flag.
| Protobuf well-known type | Default/`useDate=true` | `useDate=false` | `useDate=string` |
| --- | --- | --- | --- |
| `google.protobuf.Timestamp` | `Date` | `{ seconds: number, nanos: number }` | `string` |
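For example, with the default `useDate=true` (a sketch using the `Simple` message from the "Example Types" section, whose `createdAt` field is a `google.protobuf.Timestamp`):

```ts
// Timestamp fields are plain JS Dates in the generated types...
const simple = Simple.fromPartial({ createdAt: new Date("2020-01-01T00:00:00Z") });
// ...and toJSON serializes them using the proto3 canonical ISO-8601 string format.
Simple.toJSON(simple); // => { ..., createdAt: '2020-01-01T00:00:00.000Z' }
```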
Numbers are by default assumed to be plain JavaScript `number`s.

This is fine for Protobuf types like `int32` and `float`, but 64-bit types like `int64` can't be 100% represented by JavaScript's `number` type, because `int64` can have larger/smaller values than `number`.

ts-proto's default configuration (which is `forceLong=number`) is to still use `number` for 64-bit fields, and then throw an error if a value (at runtime) is larger than `Number.MAX_SAFE_INTEGER`.

If you expect to use 64-bit / higher-than-`MAX_SAFE_INTEGER` values, then you can use the ts-proto `forceLong` option, which uses the long npm package to support the entire range of 64-bit values.
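For example, with `forceLong=long`, a hypothetical message with an `int64 count = 1;` field would be generated with a `Long` field (a sketch, assuming the long package; the message shape is illustrative):

```ts
import Long from "long";

// Illustrative shape of a generated message with `int64 count = 1`
// under forceLong=long.
interface Counter {
  count: Long;
}

const counter: Counter = {
  // Long represents the full 64-bit range, beyond Number.MAX_SAFE_INTEGER.
  count: Long.fromString("9223372036854775807"),
};

counter.count.toString(); // => "9223372036854775807"
```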
The protobuf number types map to JavaScript types based on the forceLong config option:
| Protobuf number types | Default/`forceLong=number` | `forceLong=long` | `forceLong=string` |
| --- | --- | --- | --- |
| double | number | number | number |
| float | number | number | number |
| int32 | number | number | number |
| int64 | number\* | Long | string |
| uint32 | number | number | number |
| uint64 | number\* | Unsigned Long | string |
| sint32 | number | number | number |
| sint64 | number\* | Long | string |
| fixed32 | number | number | number |
| fixed64 | number\* | Unsigned Long | string |
| sfixed32 | number | number | number |
| sfixed64 | number\* | Long | string |

Where (\*) indicates they might throw an error at runtime.
- Required primitives: use as-is, i.e. `string name = 1`.
- Optional primitives: use wrapper types, i.e. `StringValue name = 1`.
- Required messages: not available
- Optional messages: use as-is, i.e. `SubMessage message = 1`.