diff --git a/README.md b/README.md
index 9e0ea22..c5b65d8 100644
--- a/README.md
+++ b/README.md
@@ -1,307 +1,5 @@
-gRPC-Mate - An enterprise ready micro service project base on [gRPC-Java](https://github.com/grpc/grpc-java)
+gRPC-Mate - An enterprise-ready microservice project based on [gRPC](https://grpc.io/)
 ========================================
-gRPC-Mate demostrate best practice for gRPC based micro service.
-[![Build Status](https://travis-ci.org/email2liyang/grpc-mate.svg?branch=master)](https://travis-ci.org/email2liyang/grpc-mate)
-[![Code Coverage Status](https://s3.amazonaws.com/assets.coveralls.io/badges/coveralls_94.svg)](https://coveralls.io/github/email2liyang/grpc-mate?branch=master)
-[![Gitter chat](https://badges.gitter.im/grpc-mate/gitter.png)](https://gitter.im/grpc-mate/Lobby)
-
-* [Grpc best practice](#grpc-best-practice)
-  * [Simple RPC](#simple-rpc)
-  * [Server streaming](#server-streaming)
-  * [Client streaming](#client-streaming)
-  * [Bi-directional streaming](#bi-directional-streaming)
-  * [Interceptors](#interceptors)
-  * [Transfer Large File](#transfer-large-file)
-  * [Restful endpoint](#restful-endpoint)
-* [Promethues integration](#promethues-integration)
-* [Kubernetes Deployment](#kubernetes-deployment)
-* [Gradle multiple builds best practice](#gradle-best-practice)
-* [Mockito best practice](#mockito-best-practice)
-* [Junit best practice](#junit-best-practice)
-* [Proto buffer best practice](#proto-buffer-best-practice)
-* [Docker best practice](#docker-best-practice)
-* [Quality control best practice](#quality-control-best-practice)
-  * [CheckStyle](#checkstyle)
-  * [FindBug](#findbug)
-  * [Jacoco](#jacoco)
-
-### Demo Script
-the project will demonstrate an online store search service including
-
-* Create elasticsearch index with alias
-* Uploading products into Elasticsearch (client streaming)
-* Downloading products from Elasticsearch (server streaming)
-* Search products from elasticsearch (simple RPC)
-* Calculate
products score (bi-directional streaming) -### Grpc best practice -* elastic search communicate - * use JsonFormat.Printer to convert proto buffer message into json - * use JsonFormat.Parser to parse json into proto buffer -#### Simple RPC -* [sample](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java#L23) -* we could use JsonFormat.Parser to convert es document into protobuf message -```java - Product.Builder builder = Product.newBuilder(); - jsonParser.merge(hit.getSourceAsString(), builder); - responseBuilder.addProducts(builder.build()); -``` -#### Server streaming -* [sample](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java#L39) -* with server streaming , user could pass PublishSubject to dao layer to connect the real data with ResponseObserver -```java -PublishSubject productPublishSubject = PublishSubject.create(); - Disposable disposable = productPublishSubject - .doOnNext(product -> responseObserver.onNext(product)) - .doOnComplete(() -> responseObserver.onCompleted()) - .doOnError(t -> responseObserver.onError(t)) - .subscribe(); - productDao.downloadProducts(request, productPublishSubject); - disposable.dispose(); -``` -#### Client streaming -* [sample](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductUpdateService.java#L29) -* use [RxStreamObserver](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/service/RxStreamObserver.java) to connect grpc StreamObserver and [rxJava](https://github.com/ReactiveX/RxJava) so that in grpc service, we could use rx style programming -```java -PublishSubject publishSubject = PublishSubject.create(); - publishSubject - .doOnNext(product -> { - log.info("saving product - {} ", product); - productDao.upsertProduct(product); - }) 
- .doOnError(t -> responseObserver.onError(t)) - .doOnComplete(() -> { - responseObserver.onNext(UploadProductResponse.newBuilder().build()); - responseObserver.onCompleted(); - }) - .subscribe(); -``` -#### Bi-directional streaming -* [sample](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java#L49) -* use grpc's InProcessServer to test grpc service -#### Interceptors -* [ClientInterceptor](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/test/java/io/datanerd/es/service/CallerInterceptor.java) -* [ServerInterceptor](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceInterceptor.java) -#### Transfer Large File -* grpc is not designed to transfer large files, but we could leverage stream api to transfer any size of data in binary stream -* see protobuf definition below we could use stream api to transfer any size of data in any format -```proto -message DataChunk { - bytes data = 1; -} -rpc DownloadProductImage(DownloadProductImageRequest) returns(stream DataChunk){ -} -``` -* [Server Side](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java#L125-L145) -* [Client Side](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/test/java/io/datanerd/es/service/ProductReadServiceTest.java#L196-L243) -#### Restful endpoint -* use [grpc-gateway](https://github.com/grpc-ecosystem/grpc-gateway) to bridge grpc service to restful endpoint -* stream is not supported in http 1.1 -* define a sample grpc service like below -``` -service EchoService { - rpc Echo (EchoRequest) returns (EchoResponse) { - option (google.api.http) = { - post: "/grpc/api/v1/echo" - body: "*" - }; - } -} - -message EchoRequest { - string ping = 1; -} - -message EchoResponse { - string pong = 2; -} -``` -* use 
grpc-gateway to generate an reverse proxy. -* start grpc-gateway server see details from https://github.com/email2liyang/grpc-mate/tree/master/grpc-gateway -* test the rest api -```bash -curl -XPOST localhost:7070/grpc/api/v1/echo -d '{"ping":"hello"}' -{"pong":"hello"}% -``` -### Promethues integration -* use [Auto Value](https://github.com/google/auto/tree/master/value) to define the value class with builder, see [Metric.java](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/metrics/Metric.java) -* use [CounterFactory.java](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/metrics/CounterFactory.java) to normalize Prometheus Counter's path and instance -* use CounterFactory to create counter and use the counter to record service metrics see [ProductReadService.java](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java) -* use [NanoHttpD](https://github.com/NanoHttpd/nanohttpd) based [HttpServer.java](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/main/java/io/datanerd/es/server/HttpServer.java) to serve metrics and grpc health info -### Kubernetes Deployment -* [sample](https://github.com/email2liyang/grpc-mate/tree/master/elasticsearch-service/deployment) -* use property file to manage system property and add the system property to configmap, so it's easy to debug program locally by specify the property file from system env. 
-``` -kubectl create configmap cluster-config --from-file=data_nerd.properties --namespace=prod -``` -* mount property from configmap in deploymnet yaml file -``` -volumes: - - name: config-volume - configMap: - name: cluster-config - items: - - key: data_nerd.properties - path: data_nerd.properties -``` -* service will seldom get redeployed after first deployment - -### Gradle Best Practice -* add gradle wrapper, so that it can be run anywhere - -``` -task wrapper(type: Wrapper) { - gradleVersion = '4.0' -} - -> gradle wrapper -``` -* remove auto generated classes in clean task - -```groovy -clean { - doLast { - // remove auto-generated files on clean - delete "${projectDir}/src/generated" - } -} -``` -* we force gradle to detect version conflict on build - -```groovy -subprojects { - apply plugin: 'java' - - configurations.all { - resolutionStrategy { - // fail eagerly on version conflict (includes transitive dependencies) - // e.g. multiple different versions of the same dependency (group and name are equal) - failOnVersionConflict() - } - } -} -``` -* show error log in console make it easier to debug build failure in travis-ci -```groovy -test { - testLogging { - // set options for log level LIFECYCLE - events "failed" - exceptionFormat "full" - - // remove standard output/error logging from --info builds - // by assigning only 'failed' and 'skipped' events - info.events = ["failed", "skipped"] - } -} -``` -### Mockito best practice -* use Mockito to mock dao method in service test, so that we do not launch docker container to provide ES env -* use Guice to inject any mocked instance into the dependency graph in unit test -```java -productDao = mock(ProductDao.class); - injector = Guice.createInjector( - Modules.override(new ElasticSearchModule()) - .with(binder -> { - binder.bind(ProductDao.class).toInstance(productDao); - }) - ); -``` -### Junit best practice -* use [testcontainers-java](https://github.com/testcontainers/testcontainers-java), we could launch 
any docker image to support any env related class -* it's convenient to use JUnit Rule and ClassRule with docker container for test see [TransportClientProviderTest.java](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/test/java/io/datanerd/es/guice/TransportClientProviderTest.java) for more details -```java - @ClassRule - public static final GenericContainer esContainer = - new GenericContainer("email2liyang/elasticsearch-unit-image:5.4.3") - .withExposedPorts(9200,9300); -``` -* user can use Guice Modules.override() method to override any default configuration in test -```java -MapConfiguration memoryParams = new MapConfiguration(new HashMap<>()); - memoryParams.setProperty(CONFIG_ES_CLUSTER_HOST,ip); - memoryParams.setProperty(CONFIG_ES_CLUSTER_PORT,transportPort); - memoryParams.setProperty(CONFIG_ES_CLUSTER_NAME,"elasticsearch"); - Injector injector = Guice.createInjector( - Modules.override(new ElasticSearchModule()).with( - binder -> { - binder.bind(Configuration.class).toProvider(() -> memoryParams); - } - ) - ); -``` -* use toProvider(()->xxx); to avoid dedicated provider logic to execute -* use GrpcServerRule with Junit Rule to start a mock grpc server to test grpc, see [EchoServiceTest](https://github.com/email2liyang/grpc-mate/blob/master/elasticsearch-service/src/test/java/io/datanerd/es/service/EchoServiceTest.java) - - - -### Proto buffer best practice -* define all proto file in top level of project for larger organization, it's a good idea to store all protobuffer file into a dedicated git repository, then checkout the proto buffer repository as a git submodule, then we could have single place to define all the grpc service and message to share across projects -* define Makefile to generate java code , then it's easy to detect any issue for proto buffer definition. 
-``` -clean: - mkdir -p java_generated && rm -rf java_generated/* -gen: clean - protoc --java_out=java_generated *.proto -> make gen -``` -* it's good idea to use proto buffer message as value object to pass value among different layer of the application, then the developers do not need to care about marshalling/unmarshalling in different layer. let protobuffer to handle it in a reliable and fast way. -* we could use JsonFormat.Printer and JsonFormat.Parser to serialize/deserialize proto buffer message into/from json to communicate with elasticsearch, as elastic search only support json format of data as it's document -* it's good idea to define common message in a separate proto file, so that it can be used in multiple proto files by import -* it's good idea to define package name and set multiple_files to true so that the generated java file has better package name -``` -option java_package = "io.datanerd.generated.common"; -option java_multiple_files = true; -``` -* [proto buffer name best practice](https://developers.google.com/protocol-buffers/docs/style) - * use CamelCase (with an initial capital) for message names - * use CamelCase (with an initial capital) for grpc service name - * use underscore_separated_names for field names - * use CamelCase (with an initial capital) for enum type names and CAPITALS_WITH_UNDERSCORES for value names -```proto -service ProductUpdateService { - //upload product into elastic search , make it so that we could search on it - //used to demo client side stream - rpc UploadProduct (stream Product) returns (UploadProductResponse) { - - } -} - - -message UploadProductResponse { - enum ResultStatus { - SUCCESS = 0; - FAILED = 1; - } - ResultStatus result_status = 1; -} -``` -### Docker best practice -* we can use docker to simulate external service (e.g elasticsearch) in unit test - * in this demo project , we will an [elasitcsearch image](https://github.com/email2liyang/elasticsearch-unit-image) for unit test purpose only - * user 
can download it by command ```make pull_image``` to get latest test image - -### Quality control best practice -#### CheckStyle -* apply [Google Java Style] (http://checkstyle.sourceforge.net/google_style.html) -* user can exclude any file from checkstyle(e.g: grpc generated java file) by adding it to gradle/google_checks_suppressions.xml -#### FindBugs -* user can exclude any file from findbugs(e.g: grpc generated java file) by adding it to findbugs_exclude_filter.xml -#### Jacoco -* Jacoco related tasks are not bind to check and test task, we can bind jacoco related tasks to test by -```groovy - test.finalizedBy(jacocoTestReport,jacocoTestCoverageVerification) -``` -* use can add multiple rules in jacocoTestCoverageVerification -* user can exclude any package from jacoco report in afterEvaluate config -```groovy - afterEvaluate { - classDirectories = files(classDirectories.files.collect { - fileTree(dir: it, - exclude: ['**/generated/**', - 'com/google/**']) - }) - } -``` -* Line coverage ratio on package level is the most meaningful standard on code coverage perspective -* Jacoco will work with Junit out of box, for TestNG, it need extra config to make jacoco to work. 
\ No newline at end of file
+* for Java-based projects, please check out [grpc-mate-java](https://github.com/email2liyang/grpc-mate/tree/master/grpc-mate-java)
+* for Python-based projects, please check out [grpc-mate-python](https://github.com/email2liyang/grpc-mate/tree/master/grpc-mate-python)
diff --git a/elasticsearch-service/src/main/proto/google b/elasticsearch-service/src/main/proto/google
deleted file mode 120000
index 74e284b..0000000
--- a/elasticsearch-service/src/main/proto/google
+++ /dev/null
@@ -1 +0,0 @@
-../../../../protobuffers/google
\ No newline at end of file
diff --git a/elasticsearch-service/src/main/proto/product_common.proto b/elasticsearch-service/src/main/proto/product_common.proto
deleted file mode 120000
index e127a00..0000000
--- a/elasticsearch-service/src/main/proto/product_common.proto
+++ /dev/null
@@ -1 +0,0 @@
-../../../../protobuffers/product_common.proto
\ No newline at end of file
diff --git a/elasticsearch-service/src/main/proto/product_search_engine.proto b/elasticsearch-service/src/main/proto/product_search_engine.proto
deleted file mode 120000
index 7ed75a7..0000000
--- a/elasticsearch-service/src/main/proto/product_search_engine.proto
+++ /dev/null
@@ -1 +0,0 @@
-../../../../protobuffers/product_search_engine.proto
\ No newline at end of file
diff --git a/.travis.yml b/grpc-mate-java/.travis.yml
similarity index 100%
rename from .travis.yml
rename to grpc-mate-java/.travis.yml
diff --git a/LICENSE b/grpc-mate-java/LICENSE
similarity index 100%
rename from LICENSE
rename to grpc-mate-java/LICENSE
diff --git a/Makefile b/grpc-mate-java/Makefile
similarity index 100%
rename from Makefile
rename to grpc-mate-java/Makefile
diff --git a/grpc-mate-java/README.md b/grpc-mate-java/README.md
new file mode 100644
index 0000000..c2f62b1
--- /dev/null
+++ b/grpc-mate-java/README.md
@@ -0,0 +1,307 @@
+gRPC-Mate - An enterprise-ready microservice project based on [gRPC-Java](https://github.com/grpc/grpc-java)
+========================================
+gRPC-Mate demonstrates best practices for gRPC-based microservices.
+
+[![Build Status](https://travis-ci.org/email2liyang/grpc-mate.svg?branch=master)](https://travis-ci.org/email2liyang/grpc-mate)
+[![Code Coverage Status](https://s3.amazonaws.com/assets.coveralls.io/badges/coveralls_94.svg)](https://coveralls.io/github/email2liyang/grpc-mate?branch=master)
+[![Gitter chat](https://badges.gitter.im/grpc-mate/gitter.png)](https://gitter.im/grpc-mate/Lobby)
+
+* [Grpc best practice](#grpc-best-practice)
+  * [Simple RPC](#simple-rpc)
+  * [Server streaming](#server-streaming)
+  * [Client streaming](#client-streaming)
+  * [Bi-directional streaming](#bi-directional-streaming)
+  * [Interceptors](#interceptors)
+  * [Transfer Large File](#transfer-large-file)
+  * [Restful endpoint](#restful-endpoint)
+* [Prometheus integration](#prometheus-integration)
+* [Kubernetes Deployment](#kubernetes-deployment)
+* [Gradle multiple builds best practice](#gradle-best-practice)
+* [Mockito best practice](#mockito-best-practice)
+* [Junit best practice](#junit-best-practice)
+* [Proto buffer best practice](#proto-buffer-best-practice)
+* [Docker best practice](#docker-best-practice)
+* [Quality control best practice](#quality-control-best-practice)
+  * [CheckStyle](#checkstyle)
+  * [FindBugs](#findbugs)
+  * [Jacoco](#jacoco)
+
+### Demo Script
+the project demonstrates an online store search service, including:
+
+* Create an elasticsearch index with an alias
+* Upload products into Elasticsearch (client streaming)
+* Download products from Elasticsearch (server streaming)
+* Search products in Elasticsearch (simple RPC)
+* Calculate products score (bi-directional streaming)
+### Grpc best practice
+* elasticsearch communication
+  * use JsonFormat.Printer to convert a protobuf message into JSON
+  * use JsonFormat.Parser to parse JSON into a protobuf message
+#### Simple RPC
+* 
[sample](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java#L23)
+* we can use JsonFormat.Parser to convert an ES document into a protobuf message
+```java
+  Product.Builder builder = Product.newBuilder();
+  jsonParser.merge(hit.getSourceAsString(), builder);
+  responseBuilder.addProducts(builder.build());
+```
+#### Server streaming
+* [sample](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java#L39)
+* with server streaming, the user can pass a PublishSubject to the DAO layer to connect the real data with the ResponseObserver
+```java
+PublishSubject productPublishSubject = PublishSubject.create();
+  Disposable disposable = productPublishSubject
+      .doOnNext(product -> responseObserver.onNext(product))
+      .doOnComplete(() -> responseObserver.onCompleted())
+      .doOnError(t -> responseObserver.onError(t))
+      .subscribe();
+  productDao.downloadProducts(request, productPublishSubject);
+  disposable.dispose();
+```
+#### Client streaming
+* [sample](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductUpdateService.java#L29)
+* use [RxStreamObserver](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/RxStreamObserver.java) to connect the gRPC StreamObserver with [RxJava](https://github.com/ReactiveX/RxJava) so that we can use Rx-style programming in the gRPC service
+```java
+PublishSubject publishSubject = PublishSubject.create();
+  publishSubject
+      .doOnNext(product -> {
+        log.info("saving product - {} ", product);
+        productDao.upsertProduct(product);
+      })
+      .doOnError(t -> responseObserver.onError(t))
+      .doOnComplete(() -> {
+        responseObserver.onNext(UploadProductResponse.newBuilder().build());
+        responseObserver.onCompleted();
+      })
+      
.subscribe();
+```
+#### Bi-directional streaming
+* [sample](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java#L49)
+* use gRPC's InProcessServer to test the gRPC service
+#### Interceptors
+* [ClientInterceptor](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/CallerInterceptor.java)
+* [ServerInterceptor](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceInterceptor.java)
+#### Transfer Large File
+* gRPC is not designed for transferring large files, but we can leverage the streaming API to transfer data of any size as a binary stream
+* with the protobuf definition below, the streaming API can transfer data of any size in any format
+```proto
+message DataChunk {
+  bytes data = 1;
+}
+rpc DownloadProductImage(DownloadProductImageRequest) returns(stream DataChunk){
+}
+```
+* [Server Side](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java#L125-L145)
+* [Client Side](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/ProductReadServiceTest.java#L196-L243)
+#### Restful endpoint
+* use [grpc-gateway](https://github.com/grpc-ecosystem/grpc-gateway) to bridge the gRPC service to a RESTful endpoint
+* streaming is not supported over HTTP 1.1
+* define a sample gRPC service like the one below
+```
+service EchoService {
+  rpc Echo (EchoRequest) returns (EchoResponse) {
+    option (google.api.http) = {
+      post: "/grpc/api/v1/echo"
+      body: "*"
+    };
+  }
+}
+
+message EchoRequest {
+  string ping = 1;
+}
+
+message EchoResponse {
+  string pong = 2;
+}
+```
+* use grpc-gateway to generate a reverse proxy.
+* start the grpc-gateway server; see details at https://github.com/email2liyang/grpc-mate/tree/master/grpc-gateway
+* test the REST API
+```bash
+curl -XPOST localhost:7070/grpc/api/v1/echo -d '{"ping":"hello"}'
+{"pong":"hello"}%
+```
+### Prometheus integration
+* use [Auto Value](https://github.com/google/auto/tree/master/value) to define a value class with a builder, see [Metric.java](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/metrics/Metric.java)
+* use [CounterFactory.java](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/metrics/CounterFactory.java) to normalize the Prometheus Counter's path and instance
+* use CounterFactory to create counters and use them to record service metrics, see [ProductReadService.java](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java)
+* use the [NanoHttpD](https://github.com/NanoHttpd/nanohttpd)-based [HttpServer.java](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/HttpServer.java) to serve metrics and gRPC health info
+### Kubernetes Deployment
+* [sample](https://github.com/email2liyang/grpc-mate/tree/master/elasticsearch-service/deployment)
+* use a property file to manage system properties and add them to a configmap, so it's easy to debug the program locally by specifying the property file from a system env.
+```
+kubectl create configmap cluster-config --from-file=data_nerd.properties --namespace=prod
+```
+* mount the properties from the configmap in the deployment yaml file
+```
+volumes:
+  - name: config-volume
+    configMap:
+      name: cluster-config
+      items:
+        - key: data_nerd.properties
+          path: data_nerd.properties
+```
+* the service will seldom be redeployed after the first deployment
+
+### Gradle Best Practice
+* add the gradle wrapper, so that the build can be run anywhere
+
+```
+task wrapper(type: Wrapper) {
+    gradleVersion = '4.0'
+}
+
+> gradle wrapper
+```
+* remove auto-generated classes in the clean task
+
+```groovy
+clean {
+  doLast {
+    // remove auto-generated files on clean
+    delete "${projectDir}/src/generated"
+  }
+}
+```
+* force gradle to detect version conflicts on build
+
+```groovy
+subprojects {
+  apply plugin: 'java'
+
+  configurations.all {
+    resolutionStrategy {
+      // fail eagerly on version conflict (includes transitive dependencies)
+      // e.g. multiple different versions of the same dependency (group and name are equal)
+      failOnVersionConflict()
+    }
+  }
+}
+```
+* show error logs in the console to make it easier to debug build failures in travis-ci
+```groovy
+test {
+  testLogging {
+    // set options for log level LIFECYCLE
+    events "failed"
+    exceptionFormat "full"
+
+    // remove standard output/error logging from --info builds
+    // by assigning only 'failed' and 'skipped' events
+    info.events = ["failed", "skipped"]
+  }
+}
+```
+### Mockito best practice
+* use Mockito to mock DAO methods in service tests, so that we do not need to launch a docker container to provide an ES environment
+* use Guice to inject any mocked instance into the dependency graph in unit tests
+```java
+productDao = mock(ProductDao.class);
+  injector = Guice.createInjector(
+      Modules.override(new ElasticSearchModule())
+          .with(binder -> {
+            binder.bind(ProductDao.class).toInstance(productDao);
+          })
+  );
+```
+### Junit best practice
+* with [testcontainers-java](https://github.com/testcontainers/testcontainers-java), we can launch 
any docker image to support any env-dependent class
+* it's convenient to use JUnit Rule and ClassRule with a docker container for tests; see [TransportClientProviderTest.java](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/guice/TransportClientProviderTest.java) for more details
+```java
+  @ClassRule
+  public static final GenericContainer esContainer =
+      new GenericContainer("email2liyang/elasticsearch-unit-image:5.4.3")
+          .withExposedPorts(9200,9300);
+```
+* the user can use Guice's Modules.override() method to override any default configuration in tests
+```java
+MapConfiguration memoryParams = new MapConfiguration(new HashMap<>());
+  memoryParams.setProperty(CONFIG_ES_CLUSTER_HOST,ip);
+  memoryParams.setProperty(CONFIG_ES_CLUSTER_PORT,transportPort);
+  memoryParams.setProperty(CONFIG_ES_CLUSTER_NAME,"elasticsearch");
+  Injector injector = Guice.createInjector(
+      Modules.override(new ElasticSearchModule()).with(
+          binder -> {
+            binder.bind(Configuration.class).toProvider(() -> memoryParams);
+          }
+      )
+  );
+```
+* use toProvider(() -> xxx) to avoid executing dedicated provider logic
+* use GrpcServerRule as a JUnit Rule to start an in-process gRPC server for tests, see [EchoServiceTest](https://github.com/email2liyang/grpc-mate/blob/master/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/EchoServiceTest.java)
+
+
+
+### Proto buffer best practice
+* define all proto files at the top level of the project; for a larger organization, it's a good idea to store all protobuf files in a dedicated git repository and check it out as a git submodule, so there is a single place to define all the gRPC services and messages shared across projects
+* define a Makefile to generate Java code, so it's easy to detect any issue in the proto definitions.
+```
+clean:
+	mkdir -p java_generated && rm -rf java_generated/*
+gen: clean
+	protoc --java_out=java_generated *.proto
+> make gen
+```
+* it's a good idea to use protobuf messages as value objects to pass values among the different layers of the application; developers then do not need to care about marshalling/unmarshalling in each layer, and protobuf handles it in a reliable and fast way.
+* we can use JsonFormat.Printer and JsonFormat.Parser to serialize/deserialize protobuf messages to/from JSON to communicate with elasticsearch, as elasticsearch only supports JSON documents
+* it's a good idea to define common messages in a separate proto file, so they can be used in multiple proto files via import
+* it's a good idea to define java_package and set java_multiple_files to true so that the generated Java files have a better package name
+```
+option java_package = "io.datanerd.generated.common";
+option java_multiple_files = true;
+```
+* [proto buffer name best practice](https://developers.google.com/protocol-buffers/docs/style)
+  * use CamelCase (with an initial capital) for message names
+  * use CamelCase (with an initial capital) for gRPC service names
+  * use underscore_separated_names for field names
+  * use CamelCase (with an initial capital) for enum type names and CAPITALS_WITH_UNDERSCORES for value names
+```proto
+service ProductUpdateService {
+  // upload products into elasticsearch so that we can search on them
+  // used to demo client-side streaming
+  rpc UploadProduct (stream Product) returns (UploadProductResponse) {
+
+  }
+}
+
+
+message UploadProductResponse {
+  enum ResultStatus {
+    SUCCESS = 0;
+    FAILED = 1;
+  }
+  ResultStatus result_status = 1;
+}
+```
+### Docker best practice
+* we can use docker to simulate an external service (e.g. elasticsearch) in unit tests
+  * in this demo project, we use an [elasticsearch image](https://github.com/email2liyang/elasticsearch-unit-image) for unit-test purposes only
+  * the user 
can download it with the command `make pull_image` to get the latest test image
+
+### Quality control best practice
+#### CheckStyle
+* apply [Google Java Style](http://checkstyle.sourceforge.net/google_style.html)
+* the user can exclude any file from checkstyle (e.g. gRPC-generated Java files) by adding it to gradle/google_checks_suppressions.xml
+#### FindBugs
+* the user can exclude any file from findbugs (e.g. gRPC-generated Java files) by adding it to findbugs_exclude_filter.xml
+#### Jacoco
+* Jacoco-related tasks are not bound to the check and test tasks; we can bind them to test with
+```groovy
+  test.finalizedBy(jacocoTestReport,jacocoTestCoverageVerification)
+```
+* the user can add multiple rules in jacocoTestCoverageVerification
+* the user can exclude any package from the jacoco report in the afterEvaluate config
+```groovy
+  afterEvaluate {
+    classDirectories = files(classDirectories.files.collect {
+      fileTree(dir: it,
+          exclude: ['**/generated/**',
+                    'com/google/**'])
+    })
+  }
+```
+* Line coverage ratio at the package level is the most meaningful measure from a code-coverage perspective
+* Jacoco works with JUnit out of the box; for TestNG, extra configuration is needed to make jacoco work.
\ No newline at end of file diff --git a/build.gradle b/grpc-mate-java/build.gradle similarity index 100% rename from build.gradle rename to grpc-mate-java/build.gradle diff --git a/elasticsearch-service/build.gradle b/grpc-mate-java/elasticsearch-service/build.gradle similarity index 100% rename from elasticsearch-service/build.gradle rename to grpc-mate-java/elasticsearch-service/build.gradle diff --git a/elasticsearch-service/deployment/Dockerfile b/grpc-mate-java/elasticsearch-service/deployment/Dockerfile similarity index 100% rename from elasticsearch-service/deployment/Dockerfile rename to grpc-mate-java/elasticsearch-service/deployment/Dockerfile diff --git a/elasticsearch-service/deployment/data_nerd.properties b/grpc-mate-java/elasticsearch-service/deployment/data_nerd.properties similarity index 100% rename from elasticsearch-service/deployment/data_nerd.properties rename to grpc-mate-java/elasticsearch-service/deployment/data_nerd.properties diff --git a/elasticsearch-service/deployment/deploy.sh b/grpc-mate-java/elasticsearch-service/deployment/deploy.sh similarity index 100% rename from elasticsearch-service/deployment/deploy.sh rename to grpc-mate-java/elasticsearch-service/deployment/deploy.sh diff --git a/elasticsearch-service/deployment/deployment.yaml b/grpc-mate-java/elasticsearch-service/deployment/deployment.yaml similarity index 100% rename from elasticsearch-service/deployment/deployment.yaml rename to grpc-mate-java/elasticsearch-service/deployment/deployment.yaml diff --git a/elasticsearch-service/deployment/service.yaml b/grpc-mate-java/elasticsearch-service/deployment/service.yaml similarity index 100% rename from elasticsearch-service/deployment/service.yaml rename to grpc-mate-java/elasticsearch-service/deployment/service.yaml diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/dao/ProductDao.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/dao/ProductDao.java similarity index 100% rename from 
elasticsearch-service/src/main/java/io/datanerd/es/dao/ProductDao.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/dao/ProductDao.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/guice/ConfigurationProvider.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/guice/ConfigurationProvider.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/guice/ConfigurationProvider.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/guice/ConfigurationProvider.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/guice/Constants.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/guice/Constants.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/guice/Constants.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/guice/Constants.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/guice/ElasticSearchModule.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/guice/ElasticSearchModule.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/guice/ElasticSearchModule.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/guice/ElasticSearchModule.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/guice/TransportClientProvider.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/guice/TransportClientProvider.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/guice/TransportClientProvider.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/guice/TransportClientProvider.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/metrics/CounterFactory.java 
b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/metrics/CounterFactory.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/metrics/CounterFactory.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/metrics/CounterFactory.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/metrics/Metric.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/metrics/Metric.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/metrics/Metric.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/metrics/Metric.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/server/GrpcServer.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/GrpcServer.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/server/GrpcServer.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/GrpcServer.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/server/HttpServer.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/HttpServer.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/server/HttpServer.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/HttpServer.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceInterceptor.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceInterceptor.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceInterceptor.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceInterceptor.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceLauncher.java 
b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceLauncher.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceLauncher.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/server/ServiceLauncher.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/service/EchoService.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/EchoService.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/service/EchoService.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/EchoService.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductImageSeeker.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductImageSeeker.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/service/ProductImageSeeker.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductImageSeeker.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductReadService.java diff --git a/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductUpdateService.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductUpdateService.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/service/ProductUpdateService.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/ProductUpdateService.java diff --git 
a/elasticsearch-service/src/main/java/io/datanerd/es/service/RxStreamObserver.java b/grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/RxStreamObserver.java similarity index 100% rename from elasticsearch-service/src/main/java/io/datanerd/es/service/RxStreamObserver.java rename to grpc-mate-java/elasticsearch-service/src/main/java/io/datanerd/es/service/RxStreamObserver.java diff --git a/grpc-mate-java/elasticsearch-service/src/main/proto/google b/grpc-mate-java/elasticsearch-service/src/main/proto/google new file mode 120000 index 0000000..8aa56a3 --- /dev/null +++ b/grpc-mate-java/elasticsearch-service/src/main/proto/google @@ -0,0 +1 @@ +../../../../../protobuffers/google \ No newline at end of file diff --git a/grpc-mate-java/elasticsearch-service/src/main/proto/grpc_mate/product_common.proto b/grpc-mate-java/elasticsearch-service/src/main/proto/grpc_mate/product_common.proto new file mode 120000 index 0000000..ba4615a --- /dev/null +++ b/grpc-mate-java/elasticsearch-service/src/main/proto/grpc_mate/product_common.proto @@ -0,0 +1 @@ +../../../../../../protobuffers/grpc_mate/product_common.proto \ No newline at end of file diff --git a/grpc-mate-java/elasticsearch-service/src/main/proto/grpc_mate/product_search_engine.proto b/grpc-mate-java/elasticsearch-service/src/main/proto/grpc_mate/product_search_engine.proto new file mode 120000 index 0000000..c6d84ec --- /dev/null +++ b/grpc-mate-java/elasticsearch-service/src/main/proto/grpc_mate/product_search_engine.proto @@ -0,0 +1 @@ +../../../../../../protobuffers/grpc_mate/product_search_engine.proto \ No newline at end of file diff --git a/elasticsearch-service/src/main/resources/elasticsearch/product_mappings.json b/grpc-mate-java/elasticsearch-service/src/main/resources/elasticsearch/product_mappings.json similarity index 100% rename from elasticsearch-service/src/main/resources/elasticsearch/product_mappings.json rename to 
grpc-mate-java/elasticsearch-service/src/main/resources/elasticsearch/product_mappings.json diff --git a/elasticsearch-service/src/main/resources/elasticsearch/product_settings.json b/grpc-mate-java/elasticsearch-service/src/main/resources/elasticsearch/product_settings.json similarity index 100% rename from elasticsearch-service/src/main/resources/elasticsearch/product_settings.json rename to grpc-mate-java/elasticsearch-service/src/main/resources/elasticsearch/product_settings.json diff --git a/elasticsearch-service/src/main/resources/logback.xml b/grpc-mate-java/elasticsearch-service/src/main/resources/logback.xml similarity index 100% rename from elasticsearch-service/src/main/resources/logback.xml rename to grpc-mate-java/elasticsearch-service/src/main/resources/logback.xml diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/TestConstant.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/TestConstant.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/TestConstant.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/TestConstant.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/dao/ProductDaoTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/dao/ProductDaoTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/dao/ProductDaoTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/dao/ProductDaoTest.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/guice/ConfigurationProviderTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/guice/ConfigurationProviderTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/guice/ConfigurationProviderTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/guice/ConfigurationProviderTest.java diff --git 
a/elasticsearch-service/src/test/java/io/datanerd/es/guice/ElasticSearchModuleTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/guice/ElasticSearchModuleTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/guice/ElasticSearchModuleTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/guice/ElasticSearchModuleTest.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/guice/TransportClientProviderTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/guice/TransportClientProviderTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/guice/TransportClientProviderTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/guice/TransportClientProviderTest.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/metrics/CounterFactoryTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/metrics/CounterFactoryTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/metrics/CounterFactoryTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/metrics/CounterFactoryTest.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/metrics/MetricTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/metrics/MetricTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/metrics/MetricTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/metrics/MetricTest.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/service/CallerInterceptor.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/CallerInterceptor.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/service/CallerInterceptor.java rename 
to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/CallerInterceptor.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/service/EchoServiceTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/EchoServiceTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/service/EchoServiceTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/EchoServiceTest.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/service/ProductReadServiceTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/ProductReadServiceTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/service/ProductReadServiceTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/ProductReadServiceTest.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/service/ProductUpdateServiceTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/ProductUpdateServiceTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/service/ProductUpdateServiceTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/ProductUpdateServiceTest.java diff --git a/elasticsearch-service/src/test/java/io/datanerd/es/service/RxStreamObserverTest.java b/grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/RxStreamObserverTest.java similarity index 100% rename from elasticsearch-service/src/test/java/io/datanerd/es/service/RxStreamObserverTest.java rename to grpc-mate-java/elasticsearch-service/src/test/java/io/datanerd/es/service/RxStreamObserverTest.java diff --git a/elasticsearch-service/src/test/resources/Large_Scaled_Forest_Lizard.jpg b/grpc-mate-java/elasticsearch-service/src/test/resources/Large_Scaled_Forest_Lizard.jpg 
similarity index 100% rename from elasticsearch-service/src/test/resources/Large_Scaled_Forest_Lizard.jpg rename to grpc-mate-java/elasticsearch-service/src/test/resources/Large_Scaled_Forest_Lizard.jpg diff --git a/elasticsearch-service/src/test/resources/logback-test.xml b/grpc-mate-java/elasticsearch-service/src/test/resources/logback-test.xml similarity index 100% rename from elasticsearch-service/src/test/resources/logback-test.xml rename to grpc-mate-java/elasticsearch-service/src/test/resources/logback-test.xml diff --git a/elasticsearch-service/src/test/resources/test.properties b/grpc-mate-java/elasticsearch-service/src/test/resources/test.properties similarity index 100% rename from elasticsearch-service/src/test/resources/test.properties rename to grpc-mate-java/elasticsearch-service/src/test/resources/test.properties diff --git a/gradle/checkstyle.gradle b/grpc-mate-java/gradle/checkstyle.gradle similarity index 100% rename from gradle/checkstyle.gradle rename to grpc-mate-java/gradle/checkstyle.gradle diff --git a/gradle/findbugs.gradle b/grpc-mate-java/gradle/findbugs.gradle similarity index 100% rename from gradle/findbugs.gradle rename to grpc-mate-java/gradle/findbugs.gradle diff --git a/gradle/findbugs_exclude_filter.xml b/grpc-mate-java/gradle/findbugs_exclude_filter.xml similarity index 100% rename from gradle/findbugs_exclude_filter.xml rename to grpc-mate-java/gradle/findbugs_exclude_filter.xml diff --git a/gradle/google_checks.xml b/grpc-mate-java/gradle/google_checks.xml similarity index 100% rename from gradle/google_checks.xml rename to grpc-mate-java/gradle/google_checks.xml diff --git a/gradle/google_checks_suppressions.xml b/grpc-mate-java/gradle/google_checks_suppressions.xml similarity index 100% rename from gradle/google_checks_suppressions.xml rename to grpc-mate-java/gradle/google_checks_suppressions.xml diff --git a/gradle/jacoco.gradle b/grpc-mate-java/gradle/jacoco.gradle similarity index 100% rename from gradle/jacoco.gradle 
rename to grpc-mate-java/gradle/jacoco.gradle diff --git a/gradle/wrapper/gradle-wrapper.jar b/grpc-mate-java/gradle/wrapper/gradle-wrapper.jar similarity index 100% rename from gradle/wrapper/gradle-wrapper.jar rename to grpc-mate-java/gradle/wrapper/gradle-wrapper.jar diff --git a/gradle/wrapper/gradle-wrapper.properties b/grpc-mate-java/gradle/wrapper/gradle-wrapper.properties similarity index 100% rename from gradle/wrapper/gradle-wrapper.properties rename to grpc-mate-java/gradle/wrapper/gradle-wrapper.properties diff --git a/gradlew b/grpc-mate-java/gradlew similarity index 100% rename from gradlew rename to grpc-mate-java/gradlew diff --git a/gradlew.bat b/grpc-mate-java/gradlew.bat similarity index 100% rename from gradlew.bat rename to grpc-mate-java/gradlew.bat diff --git a/grpc-gateway/README.md b/grpc-mate-java/grpc-gateway/README.md similarity index 100% rename from grpc-gateway/README.md rename to grpc-mate-java/grpc-gateway/README.md diff --git a/grpc-gateway/build.sh b/grpc-mate-java/grpc-gateway/build.sh similarity index 100% rename from grpc-gateway/build.sh rename to grpc-mate-java/grpc-gateway/build.sh diff --git a/grpc-gateway/src/grpc-mate-gateway/main.go b/grpc-mate-java/grpc-gateway/src/grpc-mate-gateway/main.go similarity index 100% rename from grpc-gateway/src/grpc-mate-gateway/main.go rename to grpc-mate-java/grpc-gateway/src/grpc-mate-gateway/main.go diff --git a/helloworld-service/build.gradle b/grpc-mate-java/helloworld-service/build.gradle similarity index 100% rename from helloworld-service/build.gradle rename to grpc-mate-java/helloworld-service/build.gradle diff --git a/helloworld-service/deployment/Dockerfile b/grpc-mate-java/helloworld-service/deployment/Dockerfile similarity index 100% rename from helloworld-service/deployment/Dockerfile rename to grpc-mate-java/helloworld-service/deployment/Dockerfile diff --git a/helloworld-service/deployment/deployment.yaml b/grpc-mate-java/helloworld-service/deployment/deployment.yaml 
similarity index 100% rename from helloworld-service/deployment/deployment.yaml rename to grpc-mate-java/helloworld-service/deployment/deployment.yaml diff --git a/helloworld-service/deployment/endpoints/Makefile b/grpc-mate-java/helloworld-service/deployment/endpoints/Makefile similarity index 100% rename from helloworld-service/deployment/endpoints/Makefile rename to grpc-mate-java/helloworld-service/deployment/endpoints/Makefile diff --git a/helloworld-service/deployment/endpoints/api_service.yaml b/grpc-mate-java/helloworld-service/deployment/endpoints/api_service.yaml similarity index 100% rename from helloworld-service/deployment/endpoints/api_service.yaml rename to grpc-mate-java/helloworld-service/deployment/endpoints/api_service.yaml diff --git a/helloworld-service/deployment/service.yaml b/grpc-mate-java/helloworld-service/deployment/service.yaml similarity index 100% rename from helloworld-service/deployment/service.yaml rename to grpc-mate-java/helloworld-service/deployment/service.yaml diff --git a/helloworld-service/src/main/java/io/datanerd/hello/server/GrpcServer.java b/grpc-mate-java/helloworld-service/src/main/java/io/datanerd/hello/server/GrpcServer.java similarity index 100% rename from helloworld-service/src/main/java/io/datanerd/hello/server/GrpcServer.java rename to grpc-mate-java/helloworld-service/src/main/java/io/datanerd/hello/server/GrpcServer.java diff --git a/helloworld-service/src/main/java/io/datanerd/hello/server/ServiceLauncher.java b/grpc-mate-java/helloworld-service/src/main/java/io/datanerd/hello/server/ServiceLauncher.java similarity index 100% rename from helloworld-service/src/main/java/io/datanerd/hello/server/ServiceLauncher.java rename to grpc-mate-java/helloworld-service/src/main/java/io/datanerd/hello/server/ServiceLauncher.java diff --git a/helloworld-service/src/main/java/io/datanerd/hello/service/GreeterService.java b/grpc-mate-java/helloworld-service/src/main/java/io/datanerd/hello/service/GreeterService.java 
similarity index 100% rename from helloworld-service/src/main/java/io/datanerd/hello/service/GreeterService.java rename to grpc-mate-java/helloworld-service/src/main/java/io/datanerd/hello/service/GreeterService.java diff --git a/grpc-mate-java/helloworld-service/src/main/proto/grpc_mate/helloworld.proto b/grpc-mate-java/helloworld-service/src/main/proto/grpc_mate/helloworld.proto new file mode 120000 index 0000000..0c17c5a --- /dev/null +++ b/grpc-mate-java/helloworld-service/src/main/proto/grpc_mate/helloworld.proto @@ -0,0 +1 @@ +../../../../../../protobuffers/grpc_mate/helloworld.proto \ No newline at end of file diff --git a/helloworld-service/src/main/resources/logback.xml b/grpc-mate-java/helloworld-service/src/main/resources/logback.xml similarity index 100% rename from helloworld-service/src/main/resources/logback.xml rename to grpc-mate-java/helloworld-service/src/main/resources/logback.xml diff --git a/helloworld-service/src/test/java/io/datanerd/hello/service/GreeterServiceTest.java b/grpc-mate-java/helloworld-service/src/test/java/io/datanerd/hello/service/GreeterServiceTest.java similarity index 100% rename from helloworld-service/src/test/java/io/datanerd/hello/service/GreeterServiceTest.java rename to grpc-mate-java/helloworld-service/src/test/java/io/datanerd/hello/service/GreeterServiceTest.java diff --git a/settings.gradle b/grpc-mate-java/settings.gradle similarity index 100% rename from settings.gradle rename to grpc-mate-java/settings.gradle diff --git a/shippable.yml b/grpc-mate-java/shippable.yml similarity index 100% rename from shippable.yml rename to grpc-mate-java/shippable.yml diff --git a/grpc-mate-python/Makefile b/grpc-mate-python/Makefile new file mode 100644 index 0000000..77650d4 --- /dev/null +++ b/grpc-mate-python/Makefile @@ -0,0 +1,24 @@ +freeze: + pipenv lock -r > requirements.txt + +protoc: + rm -fR grpc_mate/* && \ + rm -fR google/* && \ + python -m grpc_tools.protoc -Iprotobuffers --python_out=. --grpc_python_out=. 
protobuffers/grpc_mate/*.proto protobuffers/google/api/*.proto && \ + touch grpc_mate/__init__.py + touch google/__init__.py + touch google/api/__init__.py + +pytest: + pytest --grpc-fake-server + +style: + pycodestyle service data_store server + +clean: + find data_store/ google/ grpc_mate/ server/ service/ -name "__pycache__" -type d -exec rm -r "{}" \; +tar: + mkdir -p target/ + rm -fR target/* + tar cvf grpc-mate-python.tar data_store/ google/ grpc_mate/ server/ service/ requirements.txt + mv grpc-mate-python.tar target/ \ No newline at end of file diff --git a/grpc-mate-python/Pipfile b/grpc-mate-python/Pipfile new file mode 100644 index 0000000..1770c83 --- /dev/null +++ b/grpc-mate-python/Pipfile @@ -0,0 +1,21 @@ +[[source]] +name = "pypi" +url = "https://pypi.org/simple" +verify_ssl = true + +[dev-packages] +faker = "*" +v = {editable = true,version = "*"} +pytest-grpc = "*" +pytest-cov = "*" +pycodestyle = "*" + +[packages] +grpcio = "*" +grpcio-tools = "*" +protobuf = "*" +pyyaml = "*" +sqlalchemy = "*" + +[requires] +python_version = "3.6" diff --git a/grpc-mate-python/Pipfile.lock b/grpc-mate-python/Pipfile.lock new file mode 100644 index 0000000..4082d41 --- /dev/null +++ b/grpc-mate-python/Pipfile.lock @@ -0,0 +1,352 @@ +{ + "_meta": { + "hash": { + "sha256": "0423ccbd63cd67e04d83068c55f39fd64cf34bad648381fed0dbcfed2fd8c2e1" + }, + "pipfile-spec": 6, + "requires": { + "python_version": "3.6" + }, + "sources": [ + { + "name": "pypi", + "url": "https://pypi.org/simple", + "verify_ssl": true + } + ] + }, + "default": { + "grpcio": { + "hashes": [ + "sha256:0419ae5a45f49c7c40d9ae77ae4de9442431b7822851dfbbe56ee0eacb5e5654", + "sha256:1e8631eeee0fb0b4230aeb135e4890035f6ef9159c2a3555fa184468e325691a", + "sha256:24db2fa5438f3815a4edb7a189035051760ca6aa2b0b70a6a948b28bfc63c76b", + "sha256:2adb1cdb7d33e91069517b41249622710a94a1faece1fed31cd36904e4201cde", + "sha256:2cd51f35692b551aeb1fdeb7a256c7c558f6d78fcddff00640942d42f7aeba5f", + 
"sha256:3247834d24964589f8c2b121b40cd61319b3c2e8d744a6a82008643ef8a378b1", + "sha256:3433cb848b4209717722b62392e575a77a52a34d67c6730138102abc0a441685", + "sha256:39671b7ff77a962bd745746d9d2292c8ed227c5748f16598d16d8631d17dd7e5", + "sha256:40a0b8b2e6f6dd630f8b267eede2f40a848963d0f3c40b1b1f453a4a870f679e", + "sha256:40f9a74c7aa210b3e76eb1c9d56aa8d08722b73426a77626967019df9bbac287", + "sha256:423f76aa504c84cb94594fb88b8a24027c887f1c488cf58f2173f22f4fbd046c", + "sha256:43bd04cec72281a96eb361e1b0232f0f542b46da50bcfe72ef7e5a1b41d00cb3", + "sha256:43e38762635c09e24885d15e3a8e374b72d105d4178ee2cc9491855a8da9c380", + "sha256:4413b11c2385180d7de03add6c8845dd66692b148d36e27ec8c9ef537b2553a1", + "sha256:4450352a87094fd58daf468b04c65a9fa19ad11a0ac8ac7b7ff17d46f873cbc1", + "sha256:49ffda04a6e44de028b3b786278ac9a70043e7905c3eea29eed88b6524d53a29", + "sha256:4a38c4dde4c9120deef43aaabaa44f19186c98659ce554c29788c4071ab2f0a4", + "sha256:50b1febdfd21e2144b56a9aa226829e93a79c354ef22a4e5b013d9965e1ec0ed", + "sha256:559b1a3a8be7395ded2943ea6c2135d096f8cc7039d6d12127110b6496f251fe", + "sha256:5de86c182667ec68cf84019aa0d8ceccf01d352cdca19bf9e373725204bdbf50", + "sha256:5fc069bb481fe3fad0ba24d3baaf69e22dfa6cc1b63290e6dfeaf4ac1e996fb7", + "sha256:6a19d654da49516296515d6f65de4bbcbd734bc57913b21a610cfc45e6df3ff1", + "sha256:7535b3e52f498270e7877dde1c8944d6b7720e93e2e66b89c82a11447b5818f5", + "sha256:7c4e495bcabc308198b8962e60ca12f53b27eb8f03a21ac1d2d711d6dd9ecfca", + "sha256:8a8fc4a0220367cb8370cedac02272d574079ccc32bffbb34d53aaf9e38b5060", + "sha256:8b008515e067232838daca020d1af628bf6520c8cc338bf383284efe6d8bd083", + "sha256:8d1684258e1385e459418f3429e107eec5fb3d75e1f5a8c52e5946b3f329d6ea", + "sha256:8eb5d54b87fb561dc2e00a5c5226c33ffe8dbc13f2e4033a412bafb7b37b194d", + "sha256:94cdef0c61bd014bb7af495e21a1c3a369dd0399c3cd1965b1502043f5c88d94", + "sha256:9d9f3be69c7a5e84c3549a8c4403fa9ac7672da456863d21e390b2bbf45ccad1", + "sha256:9fb6fb5975a448169756da2d124a1beb38c0924ff6c0306d883b6848a9980f38", 
+ "sha256:a5eaae8700b87144d7dfb475aa4675e500ff707292caba3deff41609ddc5b845", + "sha256:aaeac2d552772b76d24eaff67a5d2325bc5205c74c0d4f9fbe71685d4a971db2", + "sha256:bb611e447559b3b5665e12a7da5160c0de6876097f62bf1d23ba66911564868e", + "sha256:bc0d41f4eb07da8b8d3ea85e50b62f6491ab313834db86ae2345be07536a4e5a", + "sha256:bf51051c129b847d1bb63a9b0826346b5f52fb821b15fe5e0d5ef86f268510f5", + "sha256:c948c034d8997526011960db54f512756fb0b4be1b81140a15b4ef094c6594a4", + "sha256:d435a01334157c3b126b4ee5141401d44bdc8440993b18b05e2f267a6647f92d", + "sha256:d46c1f95672b73288e08cdca181e14e84c6229b5879561b7b8cfd48374e09287", + "sha256:d5d58309b42064228b16b0311ff715d6c6e20230e81b35e8d0c8cfa1bbdecad8", + "sha256:dc6e2e91365a1dd6314d615d80291159c7981928b88a4c65654e3fefac83a836", + "sha256:e0dfb5f7a39029a6cbec23affa923b22a2c02207960fd66f109e01d6f632c1eb", + "sha256:eb4bf58d381b1373bd21d50837a53953d625d1693f1b58fed12743c75d3dd321", + "sha256:ebb211a85248dbc396b29320273c1ffde484b898852432613e8df0164c091006", + "sha256:ec759ece4786ae993a5b7dc3b3dead6e9375d89a6c65dfd6860076d2eb2abe7b", + "sha256:f55108397a8fa164268238c3e69cc134e945d1f693572a2f05a028b8d0d2b837", + "sha256:f6c706866d424ff285b85a02de7bbe5ed0ace227766b2c42cbe12f3d9ea5a8aa", + "sha256:f8370ad332b36fbad117440faf0dd4b910e80b9c49db5648afd337abdde9a1b6" + ], + "index": "pypi", + "version": "==1.25.0" + }, + "grpcio-tools": { + "hashes": [ + "sha256:007c075eb9611379fa8f520a1865b9afd850469495b0e4a46e1349b2dc1744ce", + "sha256:02ae9708bdd3f329b1abe1ee16b1d768b2dd7a036a8a57e342d08ee8ca054cec", + "sha256:2f10226bfea4f947de355008b14fb4711c85fc1121570833a96f0e2cd8de580f", + "sha256:314354c7321c84a6e176a99afe1945c933b8a38b4f837255c8decfef8d07f24e", + "sha256:406b530c283a2bb804a10ee97928290b0b60788cd114ddfce0faa681cccfe4b8", + "sha256:49e7682e505e6a1d35459dae1d8a616a08d5cfa6f05de00235aff2e15786af14", + "sha256:4a5c2b38078fc4b949e4e70f7e25cb80443d1ee9a648ce4223aa3c040a0d3b9b", + 
"sha256:4b40291d67a1fecb5170ed9ec32016e2ae07908a8fa143d2d37311b2bcbeb2c5", + "sha256:4b72b04cba6ecd1940d6eda07886f80fe71fb2e669f1095ebab58b1eb17a53fa", + "sha256:4cc95d5fddebb9348fafcc4c0147745882794ded7cfd5282b2aa158596c77a8a", + "sha256:4ce0261bd4426482a96467ed9ad8411417a6932c331a5bb35aa1907f618f34f6", + "sha256:5226371a2b569c62be0d0590ccff7bbb9566762f243933efbd4b695f9f108cd5", + "sha256:52aab4cbab10683f8830420c0b55ccdc6344702b4a0940913d71fe928dd731c9", + "sha256:532a19419535a92a1b621222f70d6da7624151fe69afa4a1063be56e7a2b884a", + "sha256:5a8d44add097e0a3a7c27e66a8ed0aa2fd561cda77381e818cf7862d4ad0f629", + "sha256:64f6027887e32a938f00b2344c337c6d4f7c4cf157ec2e84b1dd6b6fddad8e50", + "sha256:651b0441e8d8f302b44fb50397fe73dcd5e61b790533438e690055abdef3b234", + "sha256:67d12ec4548dd2b1f15c9e3a953c8f48d8c3441c2d8bd143fc3af95a1c041c2b", + "sha256:6c029341132a0e64cbd2dba1dda9a125e06a798b9ec864569afdecce626dd5d5", + "sha256:6e64214709f37b347875ac83cfed4e9cfd287f255dab2836521f591620412c40", + "sha256:6f70fc9a82a0145296358720cf24f83a657a745e8b51ec9564f4c9e678c5b872", + "sha256:6fb4739eb5eef051945b16b3c434d08653ea05f0313cf88495ced5d9db641745", + "sha256:79b5b1c172dafb0e76aa95bf572d4c7afc0bf97a1669b2228a0bc151071c4666", + "sha256:7d02755480cec3c0222f35397e810bfaf4cf9f2bf2e626f7f6efc1d40fffb7fa", + "sha256:818f2b8168760cf16e66fe85894a37afcff5378a64939549663a371216618498", + "sha256:834564c2fba02c31179af081bd80aada8dfdcca52c80e241353f6063b6154bd2", + "sha256:8b17347a90a14386641ffe57743bbb01a16a7149c95905364d3c8091ad377bd8", + "sha256:902e13dbaca9733e4668928967b301526197ecffacb8c7a0acc0c7045de8836f", + "sha256:988014c714ca654b3b7ca9f4dabfe487b00e023bfdd9eaf1bb0fed82bf8c4255", + "sha256:9a83d39e198cbed5d093f43790b92945ab74140357ec00e53ae13b421489ffb7", + "sha256:ac7649cff7354d2f04ebe2872f786a1d07547deded61f3d39036ebb569de91bc", + "sha256:b013d93bc6dc5c7bf3642bf30e673daee46f9a4984fbd9588a9cda1071278414", + "sha256:b02701d40f1ccf16bc8c46f56bdbf89e03110bd8fd570c854e72299ce2920c35", 
+ "sha256:b0ef0da2eec959def8ba508b2a763c492f1fb989446a422d1456ac17dc1b19f4", + "sha256:bb8264ccf8ff904a1a396dc757ac1560b24f270b90e7dabb0ae3f637cb351bb3", + "sha256:bbfb58f5c0aa27b599141bb5eacaf8116b55ad89bc5a2c3afd5e965d840ad341", + "sha256:c1a482fdd8952a7f0098f78161a4deef8a500e54babef302548cd9f1e326d42c", + "sha256:c40efc662fa037898488e31756242af68a8ab5729f939bc8c9ba259bc32e7d6a", + "sha256:c5ad07adae3fe62761bc662c554c2734203f0f700616fc58138b852a7ef5e40e", + "sha256:c765512cb5cb4afaf652837b8cc69229dee14c8e92f15a6ea0f4dfd646902dd2", + "sha256:c871f5a89012ae44d9233305d74dfdd2059a78f0cb0303d38a4b6a562c6f9ba7", + "sha256:cc950fb17c1172d0c0129e8c6e787206e7ef8c24a8e39005f8cc297e9faa4f9a", + "sha256:d3619b43009a5c82cb7ef11847518236140d7ffdcc6600e1a151b8b49350693a", + "sha256:dc17a8a8b39cb37380d927d4669882af4ccc7d3ee298a15a3004f4b18ecd2ac3", + "sha256:eab3684ce9dec3a934a36ba79e8435210d07c50906425ab157eeb4b14503a925", + "sha256:f258b32dffd27ef1eb5f5f01ebb115dfad07677b0510b41f786c511a62ded033", + "sha256:f550c94728b67a7eeddc35b03c99552f2d7aac09c52935ad4b0552d0843fd03c", + "sha256:f7fc690a517c8f3765796ed005bb3273895a985a8593977291bad24568e018e3" + ], + "index": "pypi", + "version": "==1.25.0" + }, + "protobuf": { + "hashes": [ + "sha256:0265379852b9e1f76af6d3d3fe4b3c383a595cc937594bda8565cf69a96baabd", + "sha256:29bd1ed46b2536ad8959401a2f02d2d7b5a309f8e97518e4f92ca6c5ba74dbed", + "sha256:3175d45698edb9a07c1a78a1a4850e674ce8988f20596580158b1d0921d0f057", + "sha256:34a7270940f86da7a28be466ac541c89b6dbf144a6348b9cf7ac6f56b71006ce", + "sha256:38cbc830a4a5ba9956763b0f37090bfd14dd74e72762be6225de2ceac55f4d03", + "sha256:665194f5ad386511ac8d8a0bd57b9ab37b8dd2cd71969458777318e774b9cd46", + "sha256:839bad7d115c77cdff29b488fae6a3ab503ce9a4192bd4c42302a6ea8e5d0f33", + "sha256:934a9869a7f3b0d84eca460e386fba1f7ba2a0c1a120a2648bc41fadf50efd1c", + "sha256:aecdf12ef6dc7fd91713a6da93a86c2f2a8fe54840a3b1670853a2b7402e77c9", + 
"sha256:c4e90bc27c0691c76e09b5dc506133451e52caee1472b8b3c741b7c912ce43ef", + "sha256:c65d135ea2d85d40309e268106dab02d3bea723db2db21c23ecad4163ced210b", + "sha256:c98dea04a1ff41a70aff2489610f280004831798cb36a068013eed04c698903d", + "sha256:d9049aa194378a426f0b2c784e2054565bf6f754d20fcafdee7102a6250556e8", + "sha256:e028fee51c96de4e81924484c77111dfdea14010ecfc906ea5b252209b0c4de6", + "sha256:e84ad26fb50091b1ea676403c0dd2bd47663099454aa6d88000b1dafecab0941", + "sha256:e88a924b591b06d0191620e9c8aa75297b3111066bb09d49a24bae1054a10c13" + ], + "index": "pypi", + "version": "==3.11.1" + }, + "pyyaml": { + "hashes": [ + "sha256:0e7f69397d53155e55d10ff68fdfb2cf630a35e6daf65cf0bdeaf04f127c09dc", + "sha256:2e9f0b7c5914367b0916c3c104a024bb68f269a486b9d04a2e8ac6f6597b7803", + "sha256:35ace9b4147848cafac3db142795ee42deebe9d0dad885ce643928e88daebdcc", + "sha256:38a4f0d114101c58c0f3a88aeaa44d63efd588845c5a2df5290b73db8f246d15", + "sha256:483eb6a33b671408c8529106df3707270bfacb2447bf8ad856a4b4f57f6e3075", + "sha256:4b6be5edb9f6bb73680f5bf4ee08ff25416d1400fbd4535fe0069b2994da07cd", + "sha256:7f38e35c00e160db592091751d385cd7b3046d6d51f578b29943225178257b31", + "sha256:8100c896ecb361794d8bfdb9c11fce618c7cf83d624d73d5ab38aef3bc82d43f", + "sha256:c0ee8eca2c582d29c3c2ec6e2c4f703d1b7f1fb10bc72317355a746057e7346c", + "sha256:e4c015484ff0ff197564917b4b4246ca03f411b9bd7f16e02a2f586eb48b6d04", + "sha256:ebc4ed52dcc93eeebeae5cf5deb2ae4347b3a81c3fa12b0b8c976544829396a4" + ], + "index": "pypi", + "version": "==5.2" + }, + "six": { + "hashes": [ + "sha256:1f1b7d42e254082a9db6279deae68afb421ceba6158efa6131de7b3003ee93fd", + "sha256:30f610279e8b2578cab6db20741130331735c781b56053c59c4076da27f06b66" + ], + "version": "==1.13.0" + }, + "sqlalchemy": { + "hashes": [ + "sha256:afa5541e9dea8ad0014251bc9d56171ca3d8b130c9627c6cb3681cff30be3f8a" + ], + "index": "pypi", + "version": "==1.3.11" + } + }, + "develop": { + "attrs": { + "hashes": [ + 
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c", + "sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72" + ], + "version": "==19.3.0" + }, + "coverage": { + "hashes": [ + "sha256:08907593569fe59baca0bf152c43f3863201efb6113ecb38ce7e97ce339805a6", + "sha256:0be0f1ed45fc0c185cfd4ecc19a1d6532d72f86a2bac9de7e24541febad72650", + "sha256:141f08ed3c4b1847015e2cd62ec06d35e67a3ac185c26f7635f4406b90afa9c5", + "sha256:19e4df788a0581238e9390c85a7a09af39c7b539b29f25c89209e6c3e371270d", + "sha256:23cc09ed395b03424d1ae30dcc292615c1372bfba7141eb85e11e50efaa6b351", + "sha256:245388cda02af78276b479f299bbf3783ef0a6a6273037d7c60dc73b8d8d7755", + "sha256:331cb5115673a20fb131dadd22f5bcaf7677ef758741312bee4937d71a14b2ef", + "sha256:386e2e4090f0bc5df274e720105c342263423e77ee8826002dcffe0c9533dbca", + "sha256:3a794ce50daee01c74a494919d5ebdc23d58873747fa0e288318728533a3e1ca", + "sha256:60851187677b24c6085248f0a0b9b98d49cba7ecc7ec60ba6b9d2e5574ac1ee9", + "sha256:63a9a5fc43b58735f65ed63d2cf43508f462dc49857da70b8980ad78d41d52fc", + "sha256:6b62544bb68106e3f00b21c8930e83e584fdca005d4fffd29bb39fb3ffa03cb5", + "sha256:6ba744056423ef8d450cf627289166da65903885272055fb4b5e113137cfa14f", + "sha256:7494b0b0274c5072bddbfd5b4a6c6f18fbbe1ab1d22a41e99cd2d00c8f96ecfe", + "sha256:826f32b9547c8091679ff292a82aca9c7b9650f9fda3e2ca6bf2ac905b7ce888", + "sha256:93715dffbcd0678057f947f496484e906bf9509f5c1c38fc9ba3922893cda5f5", + "sha256:9a334d6c83dfeadae576b4d633a71620d40d1c379129d587faa42ee3e2a85cce", + "sha256:af7ed8a8aa6957aac47b4268631fa1df984643f07ef00acd374e456364b373f5", + "sha256:bf0a7aed7f5521c7ca67febd57db473af4762b9622254291fbcbb8cd0ba5e33e", + "sha256:bf1ef9eb901113a9805287e090452c05547578eaab1b62e4ad456fcc049a9b7e", + "sha256:c0afd27bc0e307a1ffc04ca5ec010a290e49e3afbe841c5cafc5c5a80ecd81c9", + "sha256:dd579709a87092c6dbee09d1b7cfa81831040705ffa12a1b248935274aee0437", + "sha256:df6712284b2e44a065097846488f66840445eb987eb81b3cc6e4149e7b6982e1", + 
"sha256:e07d9f1a23e9e93ab5c62902833bf3e4b1f65502927379148b6622686223125c", + "sha256:e2ede7c1d45e65e209d6093b762e98e8318ddeff95317d07a27a2140b80cfd24", + "sha256:e4ef9c164eb55123c62411f5936b5c2e521b12356037b6e1c2617cef45523d47", + "sha256:eca2b7343524e7ba246cab8ff00cab47a2d6d54ada3b02772e908a45675722e2", + "sha256:eee64c616adeff7db37cc37da4180a3a5b6177f5c46b187894e633f088fb5b28", + "sha256:ef824cad1f980d27f26166f86856efe11eff9912c4fed97d3804820d43fa550c", + "sha256:efc89291bd5a08855829a3c522df16d856455297cf35ae827a37edac45f466a7", + "sha256:fa964bae817babece5aa2e8c1af841bebb6d0b9add8e637548809d040443fee0", + "sha256:ff37757e068ae606659c28c3bd0d923f9d29a85de79bf25b2b34b148473b5025" + ], + "version": "==4.5.4" + }, + "faker": { + "hashes": [ + "sha256:202ad3b2ec16ae7c51c02904fb838831f8d2899e61bf18db1e91a5a582feab11", + "sha256:92c84a10bec81217d9cb554ee12b3838c8986ce0b5d45f72f769da22e4bb5432" + ], + "index": "pypi", + "version": "==3.0.0" + }, + "importlib-metadata": { + "hashes": [ + "sha256:073a852570f92da5f744a3472af1b61e28e9f78ccf0c9117658dc32b15de7b45", + "sha256:d95141fbfa7ef2ec65cfd945e2af7e5a6ddbd7c8d9a25e66ff3be8e3daf9f60f" + ], + "markers": "python_version < '3.8'", + "version": "==1.3.0" + }, + "more-itertools": { + "hashes": [ + "sha256:b84b238cce0d9adad5ed87e745778d20a3f8487d0f0cb8b8a586816c7496458d", + "sha256:c833ef592a0324bcc6a60e48440da07645063c453880c9477ceb22490aec1564" + ], + "version": "==8.0.2" + }, + "packaging": { + "hashes": [ + "sha256:28b924174df7a2fa32c1953825ff29c61e2f5e082343165438812f00d3a7fc47", + "sha256:d9551545c6d761f3def1677baf08ab2a3ca17c56879e70fecba2fc4dde4ed108" + ], + "version": "==19.2" + }, + "pluggy": { + "hashes": [ + "sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0", + "sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d" + ], + "version": "==0.13.1" + }, + "py": { + "hashes": [ + "sha256:64f65755aee5b381cea27766a3a147c3f15b9b6b9ac88676de66ba2ae36793fa", + 
"sha256:dc639b046a6e2cff5bbe40194ad65936d6ba360b52b3c3fe1d08a82dd50b5e53" + ], + "version": "==1.8.0" + }, + "pycodestyle": { + "hashes": [ + "sha256:95a2219d12372f05704562a14ec30bc76b05a5b297b21a5dfe3f6fac3491ae56", + "sha256:e40a936c9a450ad81df37f549d676d127b1b66000a6c500caa2b085bc0ca976c" + ], + "index": "pypi", + "version": "==2.5.0" + }, + "pyparsing": { + "hashes": [ + "sha256:20f995ecd72f2a1f4bf6b072b63b22e2eb457836601e76d6e5dfcd75436acc1f", + "sha256:4ca62001be367f01bd3e92ecbb79070272a9d4964dce6a48a82ff0b8bc7e683a" + ], + "version": "==2.4.5" + }, + "pytest": { + "hashes": [ + "sha256:63344a2e3bce2e4d522fd62b4fdebb647c019f1f9e4ca075debbd13219db4418", + "sha256:f67403f33b2b1d25a6756184077394167fe5e2f9d8bdaab30707d19ccec35427" + ], + "version": "==5.3.1" + }, + "pytest-cov": { + "hashes": [ + "sha256:cc6742d8bac45070217169f5f72ceee1e0e55b0221f54bcf24845972d3a47f2b", + "sha256:cdbdef4f870408ebdbfeb44e63e07eb18bb4619fae852f6e760645fa36172626" + ], + "index": "pypi", + "version": "==2.8.1" + }, + "pytest-grpc": { + "hashes": [ + "sha256:28d75d2eea55518327289690053679b15ae867e54e7dff184c36316766b745ce", + "sha256:6884dea2279c874be59dccc25d69aa93cf7e00e1636dbe23b84144a8f81095fa" + ], + "index": "pypi", + "version": "==0.7.0" + }, + "python-dateutil": { + "hashes": [ + "sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c", + "sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a" + ], + "version": "==2.8.1" + }, + "six": { + "hashes": [ + "sha256:1f1b7d42e254082a9db6279deae68afb421ceba6158efa6131de7b3003ee93fd", + "sha256:30f610279e8b2578cab6db20741130331735c781b56053c59c4076da27f06b66" + ], + "version": "==1.13.0" + }, + "text-unidecode": { + "hashes": [ + "sha256:1311f10e8b895935241623731c2ba64f4c455287888b18189350b67134a822e8", + "sha256:bad6603bb14d279193107714b288be206cac565dfa49aa5b105294dd5c4aab93" + ], + "version": "==1.3" + }, + "v": { + "hashes": [ + 
"sha256:2d5a8f79a36aaebe62ef2c7068e3ec7f86656078202edabfdbf74715dc822d36", + "sha256:cd6b6b20b4a611f209c88bcdfb7211321f85662efb2bdd53a7b40314d0a84618" + ], + "index": "pypi", + "version": "==0.0.0" + }, + "wcwidth": { + "hashes": [ + "sha256:3df37372226d6e63e1b1e1eda15c594bca98a22d33a23832a90998faa96bc65e", + "sha256:f4ebe71925af7b40a864553f761ed559b43544f8f71746c2d756c7fe788ade7c" + ], + "version": "==0.1.7" + }, + "zipp": { + "hashes": [ + "sha256:3718b1cbcd963c7d4c5511a8240812904164b7f381b647143a89d3b98f9bcd8e", + "sha256:f06903e9f1f43b12d371004b4ac7b06ab39a44adc747266928ae6debfa7b3335" + ], + "version": "==0.6.0" + } + } +} diff --git a/grpc-mate-python/README.md b/grpc-mate-python/README.md new file mode 100644 index 0000000..12a295a --- /dev/null +++ b/grpc-mate-python/README.md @@ -0,0 +1,14 @@ +gRPC-Mate - An enterprise ready micro service project base on [gRPC](https://github.com/grpc/grpc) +======================================== +gRPC-Mate demostrate best practice for gRPC based micro service. 
+ +* [how to set up a python development env](https://www.vipmind.me/programing/python/set-up-python-development-env-with-pyenv-and-pipenv.html) +* [how to bootstrap a simple gRPC server](https://www.vipmind.me/programing/python/setup-grpc-server-project-in-python.html) +* [how to write unit tests for a gRPC servicer](https://www.vipmind.me/programing/python/write-unit-test-for-grpc-with-pytest-and-pytest-grpc.html) +* [how to use SQLAlchemy to persist data](https://www.vipmind.me/programing/python/sqlalchemy-makes-python-orm-easy.html) +* [how to use protobuf enums in python](https://www.vipmind.me/programing/python/understand-protobuf-enum-in-python.html) +* [how to do client streaming](https://www.vipmind.me/programing/python/how-to-do-grpc-client-stream-upload.html) +* [how to do server streaming](https://www.vipmind.me/programing/python/how-to-do-grpc-server-stream.html) +* [how to output a large binary stream via data chunks](https://www.vipmind.me/programing/python/how-to-output-large-binary-stream-via-data-chunk-in-grpc.html) +* [how to configure python logging in a gRPC server](https://www.vipmind.me/programing/python/how-to-config-python-log-in-grpc-server.html) + diff --git a/grpc-mate-python/container_config/deploy.yaml b/grpc-mate-python/container_config/deploy.yaml new file mode 100644 index 0000000..f7f3b04 --- /dev/null +++ b/grpc-mate-python/container_config/deploy.yaml @@ -0,0 +1,65 @@ +apiVersion: extensions/v1beta1 +kind: Deployment +metadata: + name: grpc-mate-python +spec: + replicas: 1 + template: + metadata: + labels: + app: grpc-mate-python + spec: + imagePullSecrets: + - name: face-staging-docker-registry + containers: + - name: grpc-mate-python + image: us.gcr.io/face-staging/grpc-mate-python:1.0.0 + imagePullPolicy: Always + resources: + requests: + memory: 128Mi + readinessProbe: + exec: + command: + - /bin/bash + - -c + - ps -ef | grep server | grep -v "grep" + initialDelaySeconds: 8 + timeoutSeconds: 10 + livenessProbe: + exec: + command: + - /bin/bash + -
-c + - ps -ef | grep server | grep -v "grep" + initialDelaySeconds: 60 + timeoutSeconds: 10 + ports: + - name: grpc + containerPort: 8080 + env: + - name: GOOGLE_APPLICATION_CREDENTIALS + value: /etc/appconfig/face-prod-ops.json + volumeMounts: + - name: google-face-prod-ops-service-account-key + mountPath: /etc/appconfig + - name: esp + image: gcr.io/endpoints-release/endpoints-runtime:1 + args: [ + "--http_port=9000", + "--backend=grpc://127.0.0.1:8080", + "--service=greeter.endpoints.face-prod.cloud.goog", + "--version=2019-12-03r0", + "--service_account_key=/etc/nginx/creds/face-prod-ops.json" + ] + ports: + - name: http + containerPort: 9000 + volumeMounts: + - mountPath: /etc/nginx/creds + name: google-face-prod-ops-service-account-key + readOnly: true + volumes: + - name: google-face-prod-ops-service-account-key + secret: + secretName: google-face-prod-ops-service-account-key diff --git a/grpc-mate-python/container_config/endpoints/Makefile b/grpc-mate-python/container_config/endpoints/Makefile new file mode 100644 index 0000000..a7b510f --- /dev/null +++ b/grpc-mate-python/container_config/endpoints/Makefile @@ -0,0 +1,18 @@ +build: clean + protoc --include_imports \ + --proto_path=../../protobuffers \ + --descriptor_set_out helloworld.pb \ + ../../protobuffers/grpc_mate/helloworld.proto + +config_list: + rm ../../.python-version + gcloud endpoints configs list --service greeter.endpoints.face-prod.cloud.goog + echo "3.6.8" > ../../.python-version + +clean: + rm -f *.pb + +update_prod_spec: build + rm ../../.python-version + gcloud endpoints services deploy helloworld.pb endpoint.yaml + echo "3.6.8" > ../../.python-version \ No newline at end of file diff --git a/grpc-mate-python/container_config/endpoints/endpoint.yaml b/grpc-mate-python/container_config/endpoints/endpoint.yaml new file mode 100644 index 0000000..5a6775f --- /dev/null +++ b/grpc-mate-python/container_config/endpoints/endpoint.yaml @@ -0,0 +1,26 @@ +# The configuration schema is defined by 
service.proto file +# https://github.com/googleapis/googleapis/blob/master/google/api/service.proto +type: google.api.Service +config_version: 3 + +# +# Name of the service config +# +name: greeter.endpoints.face-prod.cloud.goog + +# +# API title for user interface (Google Cloud Console). +# +title: gRPC Mate Python API + +apis: +- name: Greeter + +endpoints: +- name: greeter.endpoints.face-prod.cloud.goog + allow_cors: true + +usage: + rules: + - selector: "*" + allow_unregistered_calls: true \ No newline at end of file diff --git a/grpc-mate-python/container_image/Dockerfile b/grpc-mate-python/container_image/Dockerfile new file mode 100644 index 0000000..d95c54b --- /dev/null +++ b/grpc-mate-python/container_image/Dockerfile @@ -0,0 +1,21 @@ +FROM centos:7 + +RUN yum update -y +RUN yum install -y sudo curl wget unzip +RUN yum install -y yum-utils +RUN yum install -y python36 python36-libs python36-devel python36-pip + +RUN mkdir /app +WORKDIR /app + +COPY grpc-mate-python.tar /app +RUN tar xvf /app/grpc-mate-python.tar -C /app +RUN pip3 install -r /app/requirements.txt +ENV PYTHONPATH=. + +EXPOSE 8080 + +CMD ["python3","server/server.py"] +#COPY entrypoint.sh /entrypoint.sh +#RUN chmod +x /entrypoint.sh +#ENTRYPOINT ["/entrypoint.sh"] \ No newline at end of file diff --git a/grpc-mate-python/container_image/Makefile b/grpc-mate-python/container_image/Makefile new file mode 100644 index 0000000..fc5b287 --- /dev/null +++ b/grpc-mate-python/container_image/Makefile @@ -0,0 +1,16 @@ +build: + rm -f grpc-mate-python.tar + cp ../target/grpc-mate-python.tar . + docker build -t us.gcr.io/face-staging/grpc-mate-python:1.0.0 .
+ +push: + docker push us.gcr.io/face-staging/grpc-mate-python:1.0.0 + +run: + docker run --name grpc-mate-python -p 8080:8080 -d us.gcr.io/face-staging/grpc-mate-python:1.0.0 + +rm: + docker rm -f grpc-mate-python + +shell: run + docker exec -it grpc-mate-python bash \ No newline at end of file diff --git a/grpc-mate-python/container_image/entrypoint.sh b/grpc-mate-python/container_image/entrypoint.sh new file mode 100755 index 0000000..0c382cc --- /dev/null +++ b/grpc-mate-python/container_image/entrypoint.sh @@ -0,0 +1,2 @@ +#!/usr/bin/env bash +while true; do sleep 30; done; \ No newline at end of file diff --git a/grpc-mate-python/data_store/__init__.py b/grpc-mate-python/data_store/__init__.py new file mode 100644 index 0000000..9d08207 --- /dev/null +++ b/grpc-mate-python/data_store/__init__.py @@ -0,0 +1,8 @@ +import os + +from sqlalchemy import create_engine +from sqlalchemy.orm import sessionmaker + +db_url = os.getenv('db_url', 'sqlite:///:memory:') +engine = create_engine(db_url, echo=True) +Session = sessionmaker(bind=engine) diff --git a/grpc-mate-python/data_store/db.py b/grpc-mate-python/data_store/db.py new file mode 100644 index 0000000..37d7d35 --- /dev/null +++ b/grpc-mate-python/data_store/db.py @@ -0,0 +1,16 @@ +from data_store import Session +from contextlib import contextmanager + + +@contextmanager +def session_scope(): + """Provide a transactional scope around a series of operations.""" + session = Session() + try: + yield session + session.commit() + except: + session.rollback() + raise + finally: + session.close() diff --git a/grpc-mate-python/data_store/models.py b/grpc-mate-python/data_store/models.py new file mode 100644 index 0000000..05014b3 --- /dev/null +++ b/grpc-mate-python/data_store/models.py @@ -0,0 +1,13 @@ +from sqlalchemy import Column, SMALLINT, Integer, String, DECIMAL +from sqlalchemy.ext.declarative import declarative_base + +Base = declarative_base() + + +class DBProduct(Base): + __tablename__ = 'products' + 
product_id = Column(Integer, primary_key=True) + product_name = Column(String(200)) + product_price = Column(DECIMAL(10, 2)) + product_status = Column(SMALLINT) + category = Column(String(50)) diff --git a/grpc-mate-python/google/__init__.py b/grpc-mate-python/google/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/grpc-mate-python/google/api/__init__.py b/grpc-mate-python/google/api/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/grpc-mate-python/google/api/annotations_pb2.py b/grpc-mate-python/google/api/annotations_pb2.py new file mode 100644 index 0000000..e72a7f8 --- /dev/null +++ b/grpc-mate-python/google/api/annotations_pb2.py @@ -0,0 +1,46 @@ +# -*- coding: utf-8 -*- +# Generated by the protocol buffer compiler. DO NOT EDIT! +# source: google/api/annotations.proto + +import sys +_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1')) +from google.protobuf import descriptor as _descriptor +from google.protobuf import message as _message +from google.protobuf import reflection as _reflection +from google.protobuf import symbol_database as _symbol_database +# @@protoc_insertion_point(imports) + +_sym_db = _symbol_database.Default() + + +from google.api import http_pb2 as google_dot_api_dot_http__pb2 +from google.protobuf import descriptor_pb2 as google_dot_protobuf_dot_descriptor__pb2 + + +DESCRIPTOR = _descriptor.FileDescriptor( + name='google/api/annotations.proto', + package='google.api', + syntax='proto3', + serialized_options=_b('\n\016com.google.apiB\020AnnotationsProtoP\001ZAgoogle.golang.org/genproto/googleapis/api/annotations;annotations\242\002\004GAPI'), + serialized_pb=_b('\n\x1cgoogle/api/annotations.proto\x12\ngoogle.api\x1a\x15google/api/http.proto\x1a google/protobuf/descriptor.proto:E\n\x04http\x12\x1e.google.protobuf.MethodOptions\x18\xb0\xca\xbc\" 
\x01(\x0b\x32\x14.google.api.HttpRuleBn\n\x0e\x63om.google.apiB\x10\x41nnotationsProtoP\x01ZAgoogle.golang.org/genproto/googleapis/api/annotations;annotations\xa2\x02\x04GAPIb\x06proto3') + , + dependencies=[google_dot_api_dot_http__pb2.DESCRIPTOR,google_dot_protobuf_dot_descriptor__pb2.DESCRIPTOR,]) + + +HTTP_FIELD_NUMBER = 72295728 +http = _descriptor.FieldDescriptor( + name='http', full_name='google.api.http', index=0, + number=72295728, type=11, cpp_type=10, label=1, + has_default_value=False, default_value=None, + message_type=None, enum_type=None, containing_type=None, + is_extension=True, extension_scope=None, + serialized_options=None, file=DESCRIPTOR) + +DESCRIPTOR.extensions_by_name['http'] = http +_sym_db.RegisterFileDescriptor(DESCRIPTOR) + +http.message_type = google_dot_api_dot_http__pb2._HTTPRULE +google_dot_protobuf_dot_descriptor__pb2.MethodOptions.RegisterExtension(http) + +DESCRIPTOR._options = None +# @@protoc_insertion_point(module_scope) diff --git a/grpc-mate-python/google/api/annotations_pb2_grpc.py b/grpc-mate-python/google/api/annotations_pb2_grpc.py new file mode 100644 index 0000000..a894352 --- /dev/null +++ b/grpc-mate-python/google/api/annotations_pb2_grpc.py @@ -0,0 +1,3 @@ +# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! +import grpc + diff --git a/grpc-mate-python/google/api/http_pb2.py b/grpc-mate-python/google/api/http_pb2.py new file mode 100644 index 0000000..fdf6238 --- /dev/null +++ b/grpc-mate-python/google/api/http_pb2.py @@ -0,0 +1,236 @@ +# -*- coding: utf-8 -*- +# Generated by the protocol buffer compiler. DO NOT EDIT! 
+# source: google/api/http.proto + +import sys +_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1')) +from google.protobuf import descriptor as _descriptor +from google.protobuf import message as _message +from google.protobuf import reflection as _reflection +from google.protobuf import symbol_database as _symbol_database +# @@protoc_insertion_point(imports) + +_sym_db = _symbol_database.Default() + + + + +DESCRIPTOR = _descriptor.FileDescriptor( + name='google/api/http.proto', + package='google.api', + syntax='proto3', + serialized_options=_b('\n\016com.google.apiB\tHttpProtoP\001ZAgoogle.golang.org/genproto/googleapis/api/annotations;annotations\370\001\001\242\002\004GAPI'), + serialized_pb=_b('\n\x15google/api/http.proto\x12\ngoogle.api\"+\n\x04Http\x12#\n\x05rules\x18\x01 \x03(\x0b\x32\x14.google.api.HttpRule\"\xea\x01\n\x08HttpRule\x12\x10\n\x08selector\x18\x01 \x01(\t\x12\r\n\x03get\x18\x02 \x01(\tH\x00\x12\r\n\x03put\x18\x03 \x01(\tH\x00\x12\x0e\n\x04post\x18\x04 \x01(\tH\x00\x12\x10\n\x06\x64\x65lete\x18\x05 \x01(\tH\x00\x12\x0f\n\x05patch\x18\x06 \x01(\tH\x00\x12/\n\x06\x63ustom\x18\x08 \x01(\x0b\x32\x1d.google.api.CustomHttpPatternH\x00\x12\x0c\n\x04\x62ody\x18\x07 \x01(\t\x12\x31\n\x13\x61\x64\x64itional_bindings\x18\x0b \x03(\x0b\x32\x14.google.api.HttpRuleB\t\n\x07pattern\"/\n\x11\x43ustomHttpPattern\x12\x0c\n\x04kind\x18\x01 \x01(\t\x12\x0c\n\x04path\x18\x02 \x01(\tBj\n\x0e\x63om.google.apiB\tHttpProtoP\x01ZAgoogle.golang.org/genproto/googleapis/api/annotations;annotations\xf8\x01\x01\xa2\x02\x04GAPIb\x06proto3') +) + + + + +_HTTP = _descriptor.Descriptor( + name='Http', + full_name='google.api.Http', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='rules', full_name='google.api.Http.rules', index=0, + number=1, type=11, cpp_type=10, label=3, + has_default_value=False, default_value=[], + message_type=None, enum_type=None, containing_type=None, + is_extension=False, 
extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=37, + serialized_end=80, +) + + +_HTTPRULE = _descriptor.Descriptor( + name='HttpRule', + full_name='google.api.HttpRule', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='selector', full_name='google.api.HttpRule.selector', index=0, + number=1, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='get', full_name='google.api.HttpRule.get', index=1, + number=2, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='put', full_name='google.api.HttpRule.put', index=2, + number=3, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='post', full_name='google.api.HttpRule.post', index=3, + number=4, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='delete', full_name='google.api.HttpRule.delete', index=4, + number=5, type=9, cpp_type=9, label=1, + 
has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='patch', full_name='google.api.HttpRule.patch', index=5, + number=6, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='custom', full_name='google.api.HttpRule.custom', index=6, + number=8, type=11, cpp_type=10, label=1, + has_default_value=False, default_value=None, + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='body', full_name='google.api.HttpRule.body', index=7, + number=7, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='additional_bindings', full_name='google.api.HttpRule.additional_bindings', index=8, + number=11, type=11, cpp_type=10, label=3, + has_default_value=False, default_value=[], + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + _descriptor.OneofDescriptor( + name='pattern', full_name='google.api.HttpRule.pattern', + index=0, containing_type=None, fields=[]), + ], + serialized_start=83, + serialized_end=317, +) + + +_CUSTOMHTTPPATTERN = 
_descriptor.Descriptor( + name='CustomHttpPattern', + full_name='google.api.CustomHttpPattern', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='kind', full_name='google.api.CustomHttpPattern.kind', index=0, + number=1, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='path', full_name='google.api.CustomHttpPattern.path', index=1, + number=2, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=319, + serialized_end=366, +) + +_HTTP.fields_by_name['rules'].message_type = _HTTPRULE +_HTTPRULE.fields_by_name['custom'].message_type = _CUSTOMHTTPPATTERN +_HTTPRULE.fields_by_name['additional_bindings'].message_type = _HTTPRULE +_HTTPRULE.oneofs_by_name['pattern'].fields.append( + _HTTPRULE.fields_by_name['get']) +_HTTPRULE.fields_by_name['get'].containing_oneof = _HTTPRULE.oneofs_by_name['pattern'] +_HTTPRULE.oneofs_by_name['pattern'].fields.append( + _HTTPRULE.fields_by_name['put']) +_HTTPRULE.fields_by_name['put'].containing_oneof = _HTTPRULE.oneofs_by_name['pattern'] +_HTTPRULE.oneofs_by_name['pattern'].fields.append( + _HTTPRULE.fields_by_name['post']) +_HTTPRULE.fields_by_name['post'].containing_oneof = _HTTPRULE.oneofs_by_name['pattern'] +_HTTPRULE.oneofs_by_name['pattern'].fields.append( + _HTTPRULE.fields_by_name['delete']) +_HTTPRULE.fields_by_name['delete'].containing_oneof = 
_HTTPRULE.oneofs_by_name['pattern'] +_HTTPRULE.oneofs_by_name['pattern'].fields.append( + _HTTPRULE.fields_by_name['patch']) +_HTTPRULE.fields_by_name['patch'].containing_oneof = _HTTPRULE.oneofs_by_name['pattern'] +_HTTPRULE.oneofs_by_name['pattern'].fields.append( + _HTTPRULE.fields_by_name['custom']) +_HTTPRULE.fields_by_name['custom'].containing_oneof = _HTTPRULE.oneofs_by_name['pattern'] +DESCRIPTOR.message_types_by_name['Http'] = _HTTP +DESCRIPTOR.message_types_by_name['HttpRule'] = _HTTPRULE +DESCRIPTOR.message_types_by_name['CustomHttpPattern'] = _CUSTOMHTTPPATTERN +_sym_db.RegisterFileDescriptor(DESCRIPTOR) + +Http = _reflection.GeneratedProtocolMessageType('Http', (_message.Message,), { + 'DESCRIPTOR' : _HTTP, + '__module__' : 'google.api.http_pb2' + # @@protoc_insertion_point(class_scope:google.api.Http) + }) +_sym_db.RegisterMessage(Http) + +HttpRule = _reflection.GeneratedProtocolMessageType('HttpRule', (_message.Message,), { + 'DESCRIPTOR' : _HTTPRULE, + '__module__' : 'google.api.http_pb2' + # @@protoc_insertion_point(class_scope:google.api.HttpRule) + }) +_sym_db.RegisterMessage(HttpRule) + +CustomHttpPattern = _reflection.GeneratedProtocolMessageType('CustomHttpPattern', (_message.Message,), { + 'DESCRIPTOR' : _CUSTOMHTTPPATTERN, + '__module__' : 'google.api.http_pb2' + # @@protoc_insertion_point(class_scope:google.api.CustomHttpPattern) + }) +_sym_db.RegisterMessage(CustomHttpPattern) + + +DESCRIPTOR._options = None +# @@protoc_insertion_point(module_scope) diff --git a/grpc-mate-python/google/api/http_pb2_grpc.py b/grpc-mate-python/google/api/http_pb2_grpc.py new file mode 100644 index 0000000..a894352 --- /dev/null +++ b/grpc-mate-python/google/api/http_pb2_grpc.py @@ -0,0 +1,3 @@ +# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! 
+import grpc + diff --git a/grpc-mate-python/grpc_mate/__init__.py b/grpc-mate-python/grpc_mate/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/grpc-mate-python/grpc_mate/helloworld_pb2.py b/grpc-mate-python/grpc_mate/helloworld_pb2.py new file mode 100644 index 0000000..d0afdfa --- /dev/null +++ b/grpc-mate-python/grpc_mate/helloworld_pb2.py @@ -0,0 +1,136 @@ +# -*- coding: utf-8 -*- +# Generated by the protocol buffer compiler. DO NOT EDIT! +# source: grpc_mate/helloworld.proto + +import sys +_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1')) +from google.protobuf import descriptor as _descriptor +from google.protobuf import message as _message +from google.protobuf import reflection as _reflection +from google.protobuf import symbol_database as _symbol_database +# @@protoc_insertion_point(imports) + +_sym_db = _symbol_database.Default() + + +from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2 + + +DESCRIPTOR = _descriptor.FileDescriptor( + name='grpc_mate/helloworld.proto', + package='', + syntax='proto3', + serialized_options=_b('\n io.datanerd.generated.helloworldP\001'), + serialized_pb=_b('\n\x1agrpc_mate/helloworld.proto\x1a\x1cgoogle/api/annotations.proto\"\x1c\n\x0cHelloRequest\x12\x0c\n\x04name\x18\x01 \x01(\t\"\x1d\n\nHelloReply\x12\x0f\n\x07message\x18\x01 \x01(\t2S\n\x07Greeter\x12H\n\x08SayHello\x12\r.HelloRequest\x1a\x0b.HelloReply\" \x82\xd3\xe4\x93\x02\x1a\"\x15/api/v1/greeter/hello:\x01*B$\n io.datanerd.generated.helloworldP\x01\x62\x06proto3') + , + dependencies=[google_dot_api_dot_annotations__pb2.DESCRIPTOR,]) + + + + +_HELLOREQUEST = _descriptor.Descriptor( + name='HelloRequest', + full_name='HelloRequest', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='name', full_name='HelloRequest.name', index=0, + number=1, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + 
message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=60, + serialized_end=88, +) + + +_HELLOREPLY = _descriptor.Descriptor( + name='HelloReply', + full_name='HelloReply', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='message', full_name='HelloReply.message', index=0, + number=1, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=90, + serialized_end=119, +) + +DESCRIPTOR.message_types_by_name['HelloRequest'] = _HELLOREQUEST +DESCRIPTOR.message_types_by_name['HelloReply'] = _HELLOREPLY +_sym_db.RegisterFileDescriptor(DESCRIPTOR) + +HelloRequest = _reflection.GeneratedProtocolMessageType('HelloRequest', (_message.Message,), { + 'DESCRIPTOR' : _HELLOREQUEST, + '__module__' : 'grpc_mate.helloworld_pb2' + # @@protoc_insertion_point(class_scope:HelloRequest) + }) +_sym_db.RegisterMessage(HelloRequest) + +HelloReply = _reflection.GeneratedProtocolMessageType('HelloReply', (_message.Message,), { + 'DESCRIPTOR' : _HELLOREPLY, + '__module__' : 'grpc_mate.helloworld_pb2' + # @@protoc_insertion_point(class_scope:HelloReply) + }) +_sym_db.RegisterMessage(HelloReply) + + +DESCRIPTOR._options = None + +_GREETER = _descriptor.ServiceDescriptor( + name='Greeter', + full_name='Greeter', + file=DESCRIPTOR, + index=0, + serialized_options=None, + serialized_start=121, 
+ serialized_end=204, + methods=[ + _descriptor.MethodDescriptor( + name='SayHello', + full_name='Greeter.SayHello', + index=0, + containing_service=None, + input_type=_HELLOREQUEST, + output_type=_HELLOREPLY, + serialized_options=_b('\202\323\344\223\002\032\"\025/api/v1/greeter/hello:\001*'), + ), +]) +_sym_db.RegisterServiceDescriptor(_GREETER) + +DESCRIPTOR.services_by_name['Greeter'] = _GREETER + +# @@protoc_insertion_point(module_scope) diff --git a/grpc-mate-python/grpc_mate/helloworld_pb2_grpc.py b/grpc-mate-python/grpc_mate/helloworld_pb2_grpc.py new file mode 100644 index 0000000..256ecc5 --- /dev/null +++ b/grpc-mate-python/grpc_mate/helloworld_pb2_grpc.py @@ -0,0 +1,46 @@ +# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! +import grpc + +from grpc_mate import helloworld_pb2 as grpc__mate_dot_helloworld__pb2 + + +class GreeterStub(object): + """The greeting service definition. + """ + + def __init__(self, channel): + """Constructor. + + Args: + channel: A grpc.Channel. + """ + self.SayHello = channel.unary_unary( + '/Greeter/SayHello', + request_serializer=grpc__mate_dot_helloworld__pb2.HelloRequest.SerializeToString, + response_deserializer=grpc__mate_dot_helloworld__pb2.HelloReply.FromString, + ) + + +class GreeterServicer(object): + """The greeting service definition. 
+ """ + + def SayHello(self, request, context): + """Sends a greeting + """ + context.set_code(grpc.StatusCode.UNIMPLEMENTED) + context.set_details('Method not implemented!') + raise NotImplementedError('Method not implemented!') + + +def add_GreeterServicer_to_server(servicer, server): + rpc_method_handlers = { + 'SayHello': grpc.unary_unary_rpc_method_handler( + servicer.SayHello, + request_deserializer=grpc__mate_dot_helloworld__pb2.HelloRequest.FromString, + response_serializer=grpc__mate_dot_helloworld__pb2.HelloReply.SerializeToString, + ), + } + generic_handler = grpc.method_handlers_generic_handler( + 'Greeter', rpc_method_handlers) + server.add_generic_rpc_handlers((generic_handler,)) diff --git a/grpc-mate-python/grpc_mate/product_common_pb2.py b/grpc-mate-python/grpc_mate/product_common_pb2.py new file mode 100644 index 0000000..1c49cc3 --- /dev/null +++ b/grpc-mate-python/grpc_mate/product_common_pb2.py @@ -0,0 +1,166 @@ +# -*- coding: utf-8 -*- +# Generated by the protocol buffer compiler. DO NOT EDIT! 
+# source: grpc_mate/product_common.proto + +import sys +_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1')) +from google.protobuf.internal import enum_type_wrapper +from google.protobuf import descriptor as _descriptor +from google.protobuf import message as _message +from google.protobuf import reflection as _reflection +from google.protobuf import symbol_database as _symbol_database +# @@protoc_insertion_point(imports) + +_sym_db = _symbol_database.Default() + + + + +DESCRIPTOR = _descriptor.FileDescriptor( + name='grpc_mate/product_common.proto', + package='', + syntax='proto3', + serialized_options=_b('\n\034io.datanerd.generated.commonP\001Z\010datanerd'), + serialized_pb=_b('\n\x1egrpc_mate/product_common.proto\"\x84\x01\n\x07Product\x12\x12\n\nproduct_id\x18\x01 \x01(\x03\x12\x14\n\x0cproduct_name\x18\x02 \x01(\t\x12\x15\n\rproduct_price\x18\x03 \x01(\x01\x12&\n\x0eproduct_status\x18\x04 \x01(\x0e\x32\x0e.ProductStatus\x12\x10\n\x08\x63\x61tegory\x18\x05 \x01(\t\"\x19\n\tDataChunk\x12\x0c\n\x04\x64\x61ta\x18\x01 \x01(\x0c**\n\rProductStatus\x12\x0b\n\x07InStock\x10\x00\x12\x0c\n\x08OutStock\x10\x01\x42*\n\x1cio.datanerd.generated.commonP\x01Z\x08\x64\x61tanerdb\x06proto3') +) + +_PRODUCTSTATUS = _descriptor.EnumDescriptor( + name='ProductStatus', + full_name='ProductStatus', + filename=None, + file=DESCRIPTOR, + values=[ + _descriptor.EnumValueDescriptor( + name='InStock', index=0, number=0, + serialized_options=None, + type=None), + _descriptor.EnumValueDescriptor( + name='OutStock', index=1, number=1, + serialized_options=None, + type=None), + ], + containing_type=None, + serialized_options=None, + serialized_start=196, + serialized_end=238, +) +_sym_db.RegisterEnumDescriptor(_PRODUCTSTATUS) + +ProductStatus = enum_type_wrapper.EnumTypeWrapper(_PRODUCTSTATUS) +InStock = 0 +OutStock = 1 + + + +_PRODUCT = _descriptor.Descriptor( + name='Product', + full_name='Product', + filename=None, + file=DESCRIPTOR, + containing_type=None, + 
fields=[ + _descriptor.FieldDescriptor( + name='product_id', full_name='Product.product_id', index=0, + number=1, type=3, cpp_type=2, label=1, + has_default_value=False, default_value=0, + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='product_name', full_name='Product.product_name', index=1, + number=2, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='product_price', full_name='Product.product_price', index=2, + number=3, type=1, cpp_type=5, label=1, + has_default_value=False, default_value=float(0), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='product_status', full_name='Product.product_status', index=3, + number=4, type=14, cpp_type=8, label=1, + has_default_value=False, default_value=0, + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='category', full_name='Product.category', index=4, + number=5, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=35, + serialized_end=167, +) + + +_DATACHUNK = _descriptor.Descriptor( + name='DataChunk', + 
full_name='DataChunk', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='data', full_name='DataChunk.data', index=0, + number=1, type=12, cpp_type=9, label=1, + has_default_value=False, default_value=_b(""), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=169, + serialized_end=194, +) + +_PRODUCT.fields_by_name['product_status'].enum_type = _PRODUCTSTATUS +DESCRIPTOR.message_types_by_name['Product'] = _PRODUCT +DESCRIPTOR.message_types_by_name['DataChunk'] = _DATACHUNK +DESCRIPTOR.enum_types_by_name['ProductStatus'] = _PRODUCTSTATUS +_sym_db.RegisterFileDescriptor(DESCRIPTOR) + +Product = _reflection.GeneratedProtocolMessageType('Product', (_message.Message,), { + 'DESCRIPTOR' : _PRODUCT, + '__module__' : 'grpc_mate.product_common_pb2' + # @@protoc_insertion_point(class_scope:Product) + }) +_sym_db.RegisterMessage(Product) + +DataChunk = _reflection.GeneratedProtocolMessageType('DataChunk', (_message.Message,), { + 'DESCRIPTOR' : _DATACHUNK, + '__module__' : 'grpc_mate.product_common_pb2' + # @@protoc_insertion_point(class_scope:DataChunk) + }) +_sym_db.RegisterMessage(DataChunk) + + +DESCRIPTOR._options = None +# @@protoc_insertion_point(module_scope) diff --git a/grpc-mate-python/grpc_mate/product_common_pb2_grpc.py b/grpc-mate-python/grpc_mate/product_common_pb2_grpc.py new file mode 100644 index 0000000..a894352 --- /dev/null +++ b/grpc-mate-python/grpc_mate/product_common_pb2_grpc.py @@ -0,0 +1,3 @@ +# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! 
+import grpc + diff --git a/grpc-mate-python/grpc_mate/product_search_engine_pb2.py b/grpc-mate-python/grpc_mate/product_search_engine_pb2.py new file mode 100644 index 0000000..9745ea8 --- /dev/null +++ b/grpc-mate-python/grpc_mate/product_search_engine_pb2.py @@ -0,0 +1,487 @@ +# -*- coding: utf-8 -*- +# Generated by the protocol buffer compiler. DO NOT EDIT! +# source: grpc_mate/product_search_engine.proto + +import sys +_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1')) +from google.protobuf import descriptor as _descriptor +from google.protobuf import message as _message +from google.protobuf import reflection as _reflection +from google.protobuf import symbol_database as _symbol_database +# @@protoc_insertion_point(imports) + +_sym_db = _symbol_database.Default() + + +from grpc_mate import product_common_pb2 as grpc__mate_dot_product__common__pb2 +from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2 + + +DESCRIPTOR = _descriptor.FileDescriptor( + name='grpc_mate/product_search_engine.proto', + package='', + syntax='proto3', + serialized_options=_b('\n\030io.datanerd.generated.esP\001Z\010datanerd'), + serialized_pb=_b('\n%grpc_mate/product_search_engine.proto\x1a\x1egrpc_mate/product_common.proto\x1a\x1cgoogle/api/annotations.proto\"|\n\x15UploadProductResponse\x12:\n\rresult_status\x18\x01 \x01(\x0e\x32#.UploadProductResponse.ResultStatus\"\'\n\x0cResultStatus\x12\x0b\n\x07SUCCESS\x10\x00\x12\n\n\x06\x46\x41ILED\x10\x01\"1\n\x1b\x44ownloadProductImageRequest\x12\x12\n\nproduct_id\x18\x01 \x01(\x03\"+\n\x17\x44ownloadProductsRequest\x12\x10\n\x08\x63\x61tegory\x18\x01 \x01(\t\"8\n\x15SearchProductsRequest\x12\x10\n\x08key_word\x18\x01 \x01(\t\x12\r\n\x05limit\x18\x02 \x01(\x05\"4\n\x16SearchProductsResponse\x12\x1a\n\x08products\x18\x01 \x03(\x0b\x32\x08.Product\"I\n\x1d\x43\x61lculateProductScoreResponse\x12\x19\n\x07product\x18\x01 \x01(\x0b\x32\x08.Product\x12\r\n\x05score\x18\x02 
\x01(\x03\"\x1b\n\x0b\x45\x63hoRequest\x12\x0c\n\x04ping\x18\x01 \x01(\t\"\x1c\n\x0c\x45\x63hoResponse\x12\x0c\n\x04pong\x18\x02 \x01(\t2M\n\x14ProductUpdateService\x12\x35\n\rUploadProduct\x12\x08.Product\x1a\x16.UploadProductResponse\"\x00(\x01\x32\xa4\x02\n\x12ProductReadService\x12:\n\x10\x44ownloadProducts\x12\x18.DownloadProductsRequest\x1a\x08.Product\"\x00\x30\x01\x12\x43\n\x0eSearchProducts\x12\x16.SearchProductsRequest\x1a\x17.SearchProductsResponse\"\x00\x12G\n\x15\x43\x61lculateProductScore\x12\x08.Product\x1a\x1e.CalculateProductScoreResponse\"\x00(\x01\x30\x01\x12\x44\n\x14\x44ownloadProductImage\x12\x1c.DownloadProductImageRequest\x1a\n.DataChunk\"\x00\x30\x01\x32P\n\x0b\x45\x63hoService\x12\x41\n\x04\x45\x63ho\x12\x0c.EchoRequest\x1a\r.EchoResponse\"\x1c\x82\xd3\xe4\x93\x02\x16\"\x11/grpc/api/v1/echo:\x01*B&\n\x18io.datanerd.generated.esP\x01Z\x08\x64\x61tanerdb\x06proto3') + , + dependencies=[grpc__mate_dot_product__common__pb2.DESCRIPTOR,google_dot_api_dot_annotations__pb2.DESCRIPTOR,]) + + + +_UPLOADPRODUCTRESPONSE_RESULTSTATUS = _descriptor.EnumDescriptor( + name='ResultStatus', + full_name='UploadProductResponse.ResultStatus', + filename=None, + file=DESCRIPTOR, + values=[ + _descriptor.EnumValueDescriptor( + name='SUCCESS', index=0, number=0, + serialized_options=None, + type=None), + _descriptor.EnumValueDescriptor( + name='FAILED', index=1, number=1, + serialized_options=None, + type=None), + ], + containing_type=None, + serialized_options=None, + serialized_start=188, + serialized_end=227, +) +_sym_db.RegisterEnumDescriptor(_UPLOADPRODUCTRESPONSE_RESULTSTATUS) + + +_UPLOADPRODUCTRESPONSE = _descriptor.Descriptor( + name='UploadProductResponse', + full_name='UploadProductResponse', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='result_status', full_name='UploadProductResponse.result_status', index=0, + number=1, type=14, cpp_type=8, label=1, + has_default_value=False, 
default_value=0, + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + _UPLOADPRODUCTRESPONSE_RESULTSTATUS, + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=103, + serialized_end=227, +) + + +_DOWNLOADPRODUCTIMAGEREQUEST = _descriptor.Descriptor( + name='DownloadProductImageRequest', + full_name='DownloadProductImageRequest', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='product_id', full_name='DownloadProductImageRequest.product_id', index=0, + number=1, type=3, cpp_type=2, label=1, + has_default_value=False, default_value=0, + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=229, + serialized_end=278, +) + + +_DOWNLOADPRODUCTSREQUEST = _descriptor.Descriptor( + name='DownloadProductsRequest', + full_name='DownloadProductsRequest', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='category', full_name='DownloadProductsRequest.category', index=0, + number=1, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=280, + serialized_end=323, +) + + 
+_SEARCHPRODUCTSREQUEST = _descriptor.Descriptor( + name='SearchProductsRequest', + full_name='SearchProductsRequest', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='key_word', full_name='SearchProductsRequest.key_word', index=0, + number=1, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='limit', full_name='SearchProductsRequest.limit', index=1, + number=2, type=5, cpp_type=1, label=1, + has_default_value=False, default_value=0, + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=325, + serialized_end=381, +) + + +_SEARCHPRODUCTSRESPONSE = _descriptor.Descriptor( + name='SearchProductsResponse', + full_name='SearchProductsResponse', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='products', full_name='SearchProductsResponse.products', index=0, + number=1, type=11, cpp_type=10, label=3, + has_default_value=False, default_value=[], + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=383, + serialized_end=435, +) + + +_CALCULATEPRODUCTSCORERESPONSE = _descriptor.Descriptor( + name='CalculateProductScoreResponse', + 
full_name='CalculateProductScoreResponse', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='product', full_name='CalculateProductScoreResponse.product', index=0, + number=1, type=11, cpp_type=10, label=1, + has_default_value=False, default_value=None, + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + _descriptor.FieldDescriptor( + name='score', full_name='CalculateProductScoreResponse.score', index=1, + number=2, type=3, cpp_type=2, label=1, + has_default_value=False, default_value=0, + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=437, + serialized_end=510, +) + + +_ECHOREQUEST = _descriptor.Descriptor( + name='EchoRequest', + full_name='EchoRequest', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='ping', full_name='EchoRequest.ping', index=0, + number=1, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=512, + serialized_end=539, +) + + +_ECHORESPONSE = _descriptor.Descriptor( + name='EchoResponse', + full_name='EchoResponse', + filename=None, + file=DESCRIPTOR, + containing_type=None, + fields=[ + _descriptor.FieldDescriptor( + name='pong', full_name='EchoResponse.pong', index=0, + 
number=2, type=9, cpp_type=9, label=1, + has_default_value=False, default_value=_b("").decode('utf-8'), + message_type=None, enum_type=None, containing_type=None, + is_extension=False, extension_scope=None, + serialized_options=None, file=DESCRIPTOR), + ], + extensions=[ + ], + nested_types=[], + enum_types=[ + ], + serialized_options=None, + is_extendable=False, + syntax='proto3', + extension_ranges=[], + oneofs=[ + ], + serialized_start=541, + serialized_end=569, +) + +_UPLOADPRODUCTRESPONSE.fields_by_name['result_status'].enum_type = _UPLOADPRODUCTRESPONSE_RESULTSTATUS +_UPLOADPRODUCTRESPONSE_RESULTSTATUS.containing_type = _UPLOADPRODUCTRESPONSE +_SEARCHPRODUCTSRESPONSE.fields_by_name['products'].message_type = grpc__mate_dot_product__common__pb2._PRODUCT +_CALCULATEPRODUCTSCORERESPONSE.fields_by_name['product'].message_type = grpc__mate_dot_product__common__pb2._PRODUCT +DESCRIPTOR.message_types_by_name['UploadProductResponse'] = _UPLOADPRODUCTRESPONSE +DESCRIPTOR.message_types_by_name['DownloadProductImageRequest'] = _DOWNLOADPRODUCTIMAGEREQUEST +DESCRIPTOR.message_types_by_name['DownloadProductsRequest'] = _DOWNLOADPRODUCTSREQUEST +DESCRIPTOR.message_types_by_name['SearchProductsRequest'] = _SEARCHPRODUCTSREQUEST +DESCRIPTOR.message_types_by_name['SearchProductsResponse'] = _SEARCHPRODUCTSRESPONSE +DESCRIPTOR.message_types_by_name['CalculateProductScoreResponse'] = _CALCULATEPRODUCTSCORERESPONSE +DESCRIPTOR.message_types_by_name['EchoRequest'] = _ECHOREQUEST +DESCRIPTOR.message_types_by_name['EchoResponse'] = _ECHORESPONSE +_sym_db.RegisterFileDescriptor(DESCRIPTOR) + +UploadProductResponse = _reflection.GeneratedProtocolMessageType('UploadProductResponse', (_message.Message,), { + 'DESCRIPTOR' : _UPLOADPRODUCTRESPONSE, + '__module__' : 'grpc_mate.product_search_engine_pb2' + # @@protoc_insertion_point(class_scope:UploadProductResponse) + }) +_sym_db.RegisterMessage(UploadProductResponse) + +DownloadProductImageRequest = 
_reflection.GeneratedProtocolMessageType('DownloadProductImageRequest', (_message.Message,), { + 'DESCRIPTOR' : _DOWNLOADPRODUCTIMAGEREQUEST, + '__module__' : 'grpc_mate.product_search_engine_pb2' + # @@protoc_insertion_point(class_scope:DownloadProductImageRequest) + }) +_sym_db.RegisterMessage(DownloadProductImageRequest) + +DownloadProductsRequest = _reflection.GeneratedProtocolMessageType('DownloadProductsRequest', (_message.Message,), { + 'DESCRIPTOR' : _DOWNLOADPRODUCTSREQUEST, + '__module__' : 'grpc_mate.product_search_engine_pb2' + # @@protoc_insertion_point(class_scope:DownloadProductsRequest) + }) +_sym_db.RegisterMessage(DownloadProductsRequest) + +SearchProductsRequest = _reflection.GeneratedProtocolMessageType('SearchProductsRequest', (_message.Message,), { + 'DESCRIPTOR' : _SEARCHPRODUCTSREQUEST, + '__module__' : 'grpc_mate.product_search_engine_pb2' + # @@protoc_insertion_point(class_scope:SearchProductsRequest) + }) +_sym_db.RegisterMessage(SearchProductsRequest) + +SearchProductsResponse = _reflection.GeneratedProtocolMessageType('SearchProductsResponse', (_message.Message,), { + 'DESCRIPTOR' : _SEARCHPRODUCTSRESPONSE, + '__module__' : 'grpc_mate.product_search_engine_pb2' + # @@protoc_insertion_point(class_scope:SearchProductsResponse) + }) +_sym_db.RegisterMessage(SearchProductsResponse) + +CalculateProductScoreResponse = _reflection.GeneratedProtocolMessageType('CalculateProductScoreResponse', (_message.Message,), { + 'DESCRIPTOR' : _CALCULATEPRODUCTSCORERESPONSE, + '__module__' : 'grpc_mate.product_search_engine_pb2' + # @@protoc_insertion_point(class_scope:CalculateProductScoreResponse) + }) +_sym_db.RegisterMessage(CalculateProductScoreResponse) + +EchoRequest = _reflection.GeneratedProtocolMessageType('EchoRequest', (_message.Message,), { + 'DESCRIPTOR' : _ECHOREQUEST, + '__module__' : 'grpc_mate.product_search_engine_pb2' + # @@protoc_insertion_point(class_scope:EchoRequest) + }) +_sym_db.RegisterMessage(EchoRequest) + +EchoResponse = 
_reflection.GeneratedProtocolMessageType('EchoResponse', (_message.Message,), { + 'DESCRIPTOR' : _ECHORESPONSE, + '__module__' : 'grpc_mate.product_search_engine_pb2' + # @@protoc_insertion_point(class_scope:EchoResponse) + }) +_sym_db.RegisterMessage(EchoResponse) + + +DESCRIPTOR._options = None + +_PRODUCTUPDATESERVICE = _descriptor.ServiceDescriptor( + name='ProductUpdateService', + full_name='ProductUpdateService', + file=DESCRIPTOR, + index=0, + serialized_options=None, + serialized_start=571, + serialized_end=648, + methods=[ + _descriptor.MethodDescriptor( + name='UploadProduct', + full_name='ProductUpdateService.UploadProduct', + index=0, + containing_service=None, + input_type=grpc__mate_dot_product__common__pb2._PRODUCT, + output_type=_UPLOADPRODUCTRESPONSE, + serialized_options=None, + ), +]) +_sym_db.RegisterServiceDescriptor(_PRODUCTUPDATESERVICE) + +DESCRIPTOR.services_by_name['ProductUpdateService'] = _PRODUCTUPDATESERVICE + + +_PRODUCTREADSERVICE = _descriptor.ServiceDescriptor( + name='ProductReadService', + full_name='ProductReadService', + file=DESCRIPTOR, + index=1, + serialized_options=None, + serialized_start=651, + serialized_end=943, + methods=[ + _descriptor.MethodDescriptor( + name='DownloadProducts', + full_name='ProductReadService.DownloadProducts', + index=0, + containing_service=None, + input_type=_DOWNLOADPRODUCTSREQUEST, + output_type=grpc__mate_dot_product__common__pb2._PRODUCT, + serialized_options=None, + ), + _descriptor.MethodDescriptor( + name='SearchProducts', + full_name='ProductReadService.SearchProducts', + index=1, + containing_service=None, + input_type=_SEARCHPRODUCTSREQUEST, + output_type=_SEARCHPRODUCTSRESPONSE, + serialized_options=None, + ), + _descriptor.MethodDescriptor( + name='CalculateProductScore', + full_name='ProductReadService.CalculateProductScore', + index=2, + containing_service=None, + input_type=grpc__mate_dot_product__common__pb2._PRODUCT, + output_type=_CALCULATEPRODUCTSCORERESPONSE, + 
serialized_options=None, + ), + _descriptor.MethodDescriptor( + name='DownloadProductImage', + full_name='ProductReadService.DownloadProductImage', + index=3, + containing_service=None, + input_type=_DOWNLOADPRODUCTIMAGEREQUEST, + output_type=grpc__mate_dot_product__common__pb2._DATACHUNK, + serialized_options=None, + ), +]) +_sym_db.RegisterServiceDescriptor(_PRODUCTREADSERVICE) + +DESCRIPTOR.services_by_name['ProductReadService'] = _PRODUCTREADSERVICE + + +_ECHOSERVICE = _descriptor.ServiceDescriptor( + name='EchoService', + full_name='EchoService', + file=DESCRIPTOR, + index=2, + serialized_options=None, + serialized_start=945, + serialized_end=1025, + methods=[ + _descriptor.MethodDescriptor( + name='Echo', + full_name='EchoService.Echo', + index=0, + containing_service=None, + input_type=_ECHOREQUEST, + output_type=_ECHORESPONSE, + serialized_options=_b('\202\323\344\223\002\026\"\021/grpc/api/v1/echo:\001*'), + ), +]) +_sym_db.RegisterServiceDescriptor(_ECHOSERVICE) + +DESCRIPTOR.services_by_name['EchoService'] = _ECHOSERVICE + +# @@protoc_insertion_point(module_scope) diff --git a/grpc-mate-python/grpc_mate/product_search_engine_pb2_grpc.py b/grpc-mate-python/grpc_mate/product_search_engine_pb2_grpc.py new file mode 100644 index 0000000..9fbf202 --- /dev/null +++ b/grpc-mate-python/grpc_mate/product_search_engine_pb2_grpc.py @@ -0,0 +1,186 @@ +# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT! +import grpc + +from grpc_mate import product_common_pb2 as grpc__mate_dot_product__common__pb2 +from grpc_mate import product_search_engine_pb2 as grpc__mate_dot_product__search__engine__pb2 + + +class ProductUpdateServiceStub(object): + # missing associated documentation comment in .proto file + pass + + def __init__(self, channel): + """Constructor. + + Args: + channel: A grpc.Channel. 
+ """ + self.UploadProduct = channel.stream_unary( + '/ProductUpdateService/UploadProduct', + request_serializer=grpc__mate_dot_product__common__pb2.Product.SerializeToString, + response_deserializer=grpc__mate_dot_product__search__engine__pb2.UploadProductResponse.FromString, + ) + + +class ProductUpdateServiceServicer(object): + # missing associated documentation comment in .proto file + pass + + def UploadProduct(self, request_iterator, context): + """upload product into elastic search , make it so that we could search on it + used to demo client side stream + """ + context.set_code(grpc.StatusCode.UNIMPLEMENTED) + context.set_details('Method not implemented!') + raise NotImplementedError('Method not implemented!') + + +def add_ProductUpdateServiceServicer_to_server(servicer, server): + rpc_method_handlers = { + 'UploadProduct': grpc.stream_unary_rpc_method_handler( + servicer.UploadProduct, + request_deserializer=grpc__mate_dot_product__common__pb2.Product.FromString, + response_serializer=grpc__mate_dot_product__search__engine__pb2.UploadProductResponse.SerializeToString, + ), + } + generic_handler = grpc.method_handlers_generic_handler( + 'ProductUpdateService', rpc_method_handlers) + server.add_generic_rpc_handlers((generic_handler,)) + + +class ProductReadServiceStub(object): + # missing associated documentation comment in .proto file + pass + + def __init__(self, channel): + """Constructor. + + Args: + channel: A grpc.Channel. 
+ """ + self.DownloadProducts = channel.unary_stream( + '/ProductReadService/DownloadProducts', + request_serializer=grpc__mate_dot_product__search__engine__pb2.DownloadProductsRequest.SerializeToString, + response_deserializer=grpc__mate_dot_product__common__pb2.Product.FromString, + ) + self.SearchProducts = channel.unary_unary( + '/ProductReadService/SearchProducts', + request_serializer=grpc__mate_dot_product__search__engine__pb2.SearchProductsRequest.SerializeToString, + response_deserializer=grpc__mate_dot_product__search__engine__pb2.SearchProductsResponse.FromString, + ) + self.CalculateProductScore = channel.stream_stream( + '/ProductReadService/CalculateProductScore', + request_serializer=grpc__mate_dot_product__common__pb2.Product.SerializeToString, + response_deserializer=grpc__mate_dot_product__search__engine__pb2.CalculateProductScoreResponse.FromString, + ) + self.DownloadProductImage = channel.unary_stream( + '/ProductReadService/DownloadProductImage', + request_serializer=grpc__mate_dot_product__search__engine__pb2.DownloadProductImageRequest.SerializeToString, + response_deserializer=grpc__mate_dot_product__common__pb2.DataChunk.FromString, + ) + + +class ProductReadServiceServicer(object): + # missing associated documentation comment in .proto file + pass + + def DownloadProducts(self, request, context): + """download product by category + used to demo server side stream + """ + context.set_code(grpc.StatusCode.UNIMPLEMENTED) + context.set_details('Method not implemented!') + raise NotImplementedError('Method not implemented!') + + def SearchProducts(self, request, context): + """search product and return all matched products + used to demo simple grpc call + """ + context.set_code(grpc.StatusCode.UNIMPLEMENTED) + context.set_details('Method not implemented!') + raise NotImplementedError('Method not implemented!') + + def CalculateProductScore(self, request_iterator, context): + """calcualte each proeuct sore based on simple rule + used to demo 
+    bi directional stream
+    """
+    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
+    context.set_details('Method not implemented!')
+    raise NotImplementedError('Method not implemented!')
+
+  def DownloadProductImage(self, request, context):
+    # missing associated documentation comment in .proto file
+    pass
+    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
+    context.set_details('Method not implemented!')
+    raise NotImplementedError('Method not implemented!')
+
+
+def add_ProductReadServiceServicer_to_server(servicer, server):
+  rpc_method_handlers = {
+      'DownloadProducts': grpc.unary_stream_rpc_method_handler(
+          servicer.DownloadProducts,
+          request_deserializer=grpc__mate_dot_product__search__engine__pb2.DownloadProductsRequest.FromString,
+          response_serializer=grpc__mate_dot_product__common__pb2.Product.SerializeToString,
+      ),
+      'SearchProducts': grpc.unary_unary_rpc_method_handler(
+          servicer.SearchProducts,
+          request_deserializer=grpc__mate_dot_product__search__engine__pb2.SearchProductsRequest.FromString,
+          response_serializer=grpc__mate_dot_product__search__engine__pb2.SearchProductsResponse.SerializeToString,
+      ),
+      'CalculateProductScore': grpc.stream_stream_rpc_method_handler(
+          servicer.CalculateProductScore,
+          request_deserializer=grpc__mate_dot_product__common__pb2.Product.FromString,
+          response_serializer=grpc__mate_dot_product__search__engine__pb2.CalculateProductScoreResponse.SerializeToString,
+      ),
+      'DownloadProductImage': grpc.unary_stream_rpc_method_handler(
+          servicer.DownloadProductImage,
+          request_deserializer=grpc__mate_dot_product__search__engine__pb2.DownloadProductImageRequest.FromString,
+          response_serializer=grpc__mate_dot_product__common__pb2.DataChunk.SerializeToString,
+      ),
+  }
+  generic_handler = grpc.method_handlers_generic_handler(
+      'ProductReadService', rpc_method_handlers)
+  server.add_generic_rpc_handlers((generic_handler,))
+
+
+class EchoServiceStub(object):
+  # missing associated documentation comment in .proto file
+  pass
+
+  def __init__(self, channel):
+    """Constructor.
+
+    Args:
+      channel: A grpc.Channel.
+    """
+    self.Echo = channel.unary_unary(
+        '/EchoService/Echo',
+        request_serializer=grpc__mate_dot_product__search__engine__pb2.EchoRequest.SerializeToString,
+        response_deserializer=grpc__mate_dot_product__search__engine__pb2.EchoResponse.FromString,
+    )
+
+
+class EchoServiceServicer(object):
+  # missing associated documentation comment in .proto file
+  pass
+
+  def Echo(self, request, context):
+    # missing associated documentation comment in .proto file
+    pass
+    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
+    context.set_details('Method not implemented!')
+    raise NotImplementedError('Method not implemented!')
+
+
+def add_EchoServiceServicer_to_server(servicer, server):
+  rpc_method_handlers = {
+      'Echo': grpc.unary_unary_rpc_method_handler(
+          servicer.Echo,
+          request_deserializer=grpc__mate_dot_product__search__engine__pb2.EchoRequest.FromString,
+          response_serializer=grpc__mate_dot_product__search__engine__pb2.EchoResponse.SerializeToString,
+      ),
+  }
+  generic_handler = grpc.method_handlers_generic_handler(
+      'EchoService', rpc_method_handlers)
+  server.add_generic_rpc_handlers((generic_handler,))
diff --git a/grpc-mate-python/images/python-grpc.png b/grpc-mate-python/images/python-grpc.png
new file mode 100644
index 0000000..3d3d2f8
Binary files /dev/null and b/grpc-mate-python/images/python-grpc.png differ
diff --git a/grpc-mate-python/protobuffers/google b/grpc-mate-python/protobuffers/google
new file mode 120000
index 0000000..f4559ed
--- /dev/null
+++ b/grpc-mate-python/protobuffers/google
@@ -0,0 +1 @@
+../../protobuffers/google
\ No newline at end of file
diff --git a/grpc-mate-python/protobuffers/grpc_mate b/grpc-mate-python/protobuffers/grpc_mate
new file mode 120000
index 0000000..8141156
--- /dev/null
+++ b/grpc-mate-python/protobuffers/grpc_mate
@@ -0,0 +1 @@
+../../protobuffers/grpc_mate
\ No newline at end of file
diff --git a/grpc-mate-python/requirements.txt b/grpc-mate-python/requirements.txt
new file mode 100644
index 0000000..1852465
--- /dev/null
+++ b/grpc-mate-python/requirements.txt
@@ -0,0 +1,7 @@
+-i https://pypi.org/simple
+grpcio-tools==1.24.3
+grpcio==1.24.3
+protobuf==3.10.0
+pyyaml==5.1.2
+six==1.12.0
+sqlalchemy==1.3.10
diff --git a/grpc-mate-python/server/__init__.py b/grpc-mate-python/server/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/grpc-mate-python/server/logging.yaml b/grpc-mate-python/server/logging.yaml
new file mode 100644
index 0000000..3882351
--- /dev/null
+++ b/grpc-mate-python/server/logging.yaml
@@ -0,0 +1,23 @@
+version: 1
+formatters:
+  simple:
+    format: '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
+handlers:
+  console:
+    class: logging.StreamHandler
+    level: DEBUG
+    formatter: simple
+    stream: ext://sys.stdout
+loggers:
+  __main__:
+    level: DEBUG
+    handlers: [console]
+    propagate: no
+  service.greeter_servicer:
+    level: DEBUG
+    handlers: [console]
+    propagate: no
+root:
+  level: INFO
+  handlers: [console]
+  propagate: yes
\ No newline at end of file
diff --git a/grpc-mate-python/server/server.py b/grpc-mate-python/server/server.py
new file mode 100644
index 0000000..d54e221
--- /dev/null
+++ b/grpc-mate-python/server/server.py
@@ -0,0 +1,36 @@
+import logging.config
+from concurrent import futures
+from pathlib import Path
+
+import grpc
+import yaml
+
+import grpc_mate.helloworld_pb2_grpc
+import grpc_mate.product_search_engine_pb2_grpc
+from service.greeter_servicer import GreeterServicer
+from service.product_read_servicer import ProductReadServiceServicer
+from service.product_update_servicer import ProductUpdateServiceServicer
+
+# Create a custom logger
+with Path(__file__).resolve().parent.joinpath('logging.yaml').open('r') as f:
+    config = yaml.safe_load(f.read())
+    logging.config.dictConfig(config)
+
+logger = logging.getLogger(__name__)
+
+
+def serve():
+    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
+    grpc_mate.helloworld_pb2_grpc.add_GreeterServicer_to_server(GreeterServicer(), server)
+    grpc_mate.product_search_engine_pb2_grpc.add_ProductUpdateServiceServicer_to_server(ProductUpdateServiceServicer(),
+                                                                                        server)
+    grpc_mate.product_search_engine_pb2_grpc.add_ProductReadServiceServicer_to_server(ProductReadServiceServicer(),
+                                                                                      server)
+    server.add_insecure_port('[::]:8080')
+    server.start()
+    logger.debug('grpc server started at port 8080')
+    server.wait_for_termination()
+
+
+if __name__ == '__main__':
+    serve()
diff --git a/grpc-mate-python/service/__init__.py b/grpc-mate-python/service/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/grpc-mate-python/service/greeter_servicer.py b/grpc-mate-python/service/greeter_servicer.py
new file mode 100644
index 0000000..b1b07dc
--- /dev/null
+++ b/grpc-mate-python/service/greeter_servicer.py
@@ -0,0 +1,13 @@
+import logging
+
+import grpc_mate.helloworld_pb2
+import grpc_mate.helloworld_pb2_grpc
+
+logger = logging.getLogger(__name__)
+
+
+class GreeterServicer(grpc_mate.helloworld_pb2_grpc.GreeterServicer):
+
+    def SayHello(self, request, context):
+        logger.debug(f"get request {request.name}")
+        return grpc_mate.helloworld_pb2.HelloReply(message=f"hello {request.name}")
diff --git a/grpc-mate-python/service/product_read_servicer.py b/grpc-mate-python/service/product_read_servicer.py
new file mode 100644
index 0000000..a9711d2
--- /dev/null
+++ b/grpc-mate-python/service/product_read_servicer.py
@@ -0,0 +1,53 @@
+import logging
+from pathlib import Path
+
+import grpc_mate.product_search_engine_pb2_grpc
+from data_store.db import session_scope
+from data_store.models import DBProduct
+from grpc_mate.product_common_pb2 import Product, DataChunk
+from grpc_mate.product_search_engine_pb2 import SearchProductsResponse, CalculateProductScoreResponse
+
+logger = logging.getLogger(__name__)
+
+
+def db_product_to_protobuf_product(db_product):
+    # copy the db product to grpc product
+    protobuf_product = Product()
+    for k in protobuf_product.DESCRIPTOR.fields_by_name:
+        setattr(protobuf_product, k, db_product.__dict__[k])
+    return protobuf_product
+
+
+class ProductReadServiceServicer(grpc_mate.product_search_engine_pb2_grpc.ProductReadServiceServicer):
+    def DownloadProducts(self, request, context):
+        with session_scope() as session:
+            result = session.query(DBProduct) \
+                .filter(DBProduct.category == request.category) \
+                .all()
+            for product in result:
+                yield db_product_to_protobuf_product(product)
+
+    def SearchProducts(self, request, context):
+        with session_scope() as session:
+            result = session.query(DBProduct) \
+                .filter(DBProduct.product_name.like(f'%{request.key_word}%')) \
+                .order_by(DBProduct.product_id.asc()) \
+                .limit(limit=request.limit) \
+                .all()
+            products = list(map(db_product_to_protobuf_product, result))
+            return SearchProductsResponse(products=products)
+
+    def CalculateProductScore(self, request_iterator, context):
+        for product in request_iterator:
+            yield CalculateProductScoreResponse(product=product, score=int(product.product_price * 2))
+
+    def DownloadProductImage(self, request, context):
+        chunk_size = 1024
+        image_path = Path(__file__).resolve().parent.parent.joinpath('images/python-grpc.png')
+
+        with image_path.open('rb') as f:
+            while True:
+                chunk = f.read(chunk_size)
+                if not chunk:
+                    break
+                yield DataChunk(data=chunk)
diff --git a/grpc-mate-python/service/product_update_servicer.py b/grpc-mate-python/service/product_update_servicer.py
new file mode 100644
index 0000000..5cb137d
--- /dev/null
+++ b/grpc-mate-python/service/product_update_servicer.py
@@ -0,0 +1,21 @@
+import logging
+
+import grpc_mate.product_search_engine_pb2_grpc
+from data_store.db import session_scope
+from data_store.models import DBProduct
+from grpc_mate.product_search_engine_pb2 import UploadProductResponse
+
+logger = logging.getLogger(__name__)
+
+
+class ProductUpdateServiceServicer(grpc_mate.product_search_engine_pb2_grpc.ProductUpdateServiceServicer):
+    def UploadProduct(self, request_iterator, context):
+        with session_scope() as session:
+            for product in request_iterator:
+                db_product = DBProduct()
+                for k in product.DESCRIPTOR.fields_by_name:
+                    setattr(db_product, k, getattr(product, k))
+                db_product.product_id = None
+                session.add(db_product)
+
+        return UploadProductResponse(result_status=UploadProductResponse.SUCCESS)
diff --git a/grpc-mate-python/tests/__init__.py b/grpc-mate-python/tests/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/grpc-mate-python/tests/data_store/__init__.py b/grpc-mate-python/tests/data_store/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/grpc-mate-python/tests/data_store/test_models.py b/grpc-mate-python/tests/data_store/test_models.py
new file mode 100644
index 0000000..58a169f
--- /dev/null
+++ b/grpc-mate-python/tests/data_store/test_models.py
@@ -0,0 +1,34 @@
+from decimal import Decimal
+
+import pytest
+from faker import Faker
+
+from data_store import engine
+from data_store.db import session_scope
+from data_store.models import Base, DBProduct
+from grpc_mate.product_common_pb2 import InStock
+
+
+@pytest.fixture(autouse=True, scope='function')
+def create_schema():
+    if engine.url.__str__() == 'sqlite:///:memory:':
+        Base.metadata.create_all(engine)
+    yield None
+    Base.metadata.drop_all(engine)
+
+
+def test_db_products():
+    faker = Faker()
+    product = DBProduct(product_name=faker.name(),
+                        product_price=Decimal(faker.random_int() / 100),
+                        product_status=InStock,
+                        category=faker.name())
+    with session_scope() as session:
+        session.add(product)
+        my_product = session.query(DBProduct).one()
+        session.expunge(my_product)
+    assert my_product.product_id is not None
+    assert my_product.product_name == product.product_name
+    assert my_product.product_price == product.product_price
+    assert my_product.product_status == product.product_status
+    assert my_product.category == product.category
diff --git a/grpc-mate-python/tests/service/__init__.py b/grpc-mate-python/tests/service/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/grpc-mate-python/tests/service/test_greeter_servicer.py b/grpc-mate-python/tests/service/test_greeter_servicer.py
new file mode 100644
index 0000000..d93882e
--- /dev/null
+++ b/grpc-mate-python/tests/service/test_greeter_servicer.py
@@ -0,0 +1,40 @@
+import grpc
+import pytest
+from grpc_mate.helloworld_pb2 import HelloRequest
+
+
+@pytest.fixture(scope='module')
+def grpc_add_to_server():
+    from grpc_mate.helloworld_pb2_grpc import add_GreeterServicer_to_server
+
+    return add_GreeterServicer_to_server
+
+
+@pytest.fixture(scope='module')
+def grpc_servicer():
+    from service.greeter_servicer import GreeterServicer
+
+    return GreeterServicer()
+
+
+@pytest.fixture(scope='module')
+def grpc_stub_cls(grpc_channel):
+    from grpc_mate.helloworld_pb2_grpc import GreeterStub
+
+    return GreeterStub
+
+
+def test_SayHello(grpc_stub):
+    hello_request = HelloRequest(name='ivan')
+    response = grpc_stub.SayHello(hello_request)
+
+    assert response.message == f'hello {hello_request.name}'
+
+
+def integration_test_SayHello():
+    from grpc_mate.helloworld_pb2_grpc import GreeterStub
+    channel = grpc.insecure_channel('localhost:8080')
+    stub = GreeterStub(channel)
+    hello_request = HelloRequest(name='local')
+    response = stub.SayHello(hello_request)
+    assert response.message == f'hello {hello_request.name}'
diff --git a/grpc-mate-python/tests/service/test_product_read_servicer.py b/grpc-mate-python/tests/service/test_product_read_servicer.py
new file mode 100644
index 0000000..4162582
--- /dev/null
+++ b/grpc-mate-python/tests/service/test_product_read_servicer.py
@@ -0,0 +1,123 @@
+import filecmp
+import os
+from decimal import Decimal
+from pathlib import Path
+
+import pytest
+from faker import Faker
+
+from data_store import engine
+from data_store.db import session_scope
+from data_store.models import Base, DBProduct
+from grpc_mate.product_common_pb2 import InStock
+from grpc_mate.product_search_engine_pb2 import SearchProductsRequest, DownloadProductsRequest, \
+    DownloadProductImageRequest
+
+
+@pytest.fixture(scope='module')
+def grpc_add_to_server():
+    from grpc_mate.product_search_engine_pb2_grpc import add_ProductReadServiceServicer_to_server
+    return add_ProductReadServiceServicer_to_server
+
+
+@pytest.fixture(scope='module')
+def grpc_servicer():
+    from service.product_read_servicer import ProductReadServiceServicer
+
+    return ProductReadServiceServicer()
+
+
+@pytest.fixture(scope='module')
+def grpc_stub_cls(grpc_channel):
+    from grpc_mate.product_search_engine_pb2_grpc import ProductReadServiceStub
+
+    return ProductReadServiceStub
+
+
+@pytest.fixture(autouse=True, scope='function')
+def create_schema():
+    if engine.url.__str__() == 'sqlite:///:memory:':
+        Base.metadata.create_all(engine)
+    yield None
+    Base.metadata.drop_all(engine)
+
+
+def test_SearchProducts_none_exist(grpc_stub):
+    faker = Faker()
+    keyword = faker.name()
+
+    response = grpc_stub.SearchProducts(SearchProductsRequest(key_word=keyword, limit=2))
+    assert len(response.products) == 0
+
+
+def test_SearchProducts_exist(grpc_stub):
+    faker = Faker()
+    keyword = faker.name()
+    product = DBProduct(product_name=keyword,
+                        product_price=Decimal(faker.random_int() / 100),
+                        product_status=InStock,
+                        category=faker.name())
+    # save to db
+    with session_scope() as session:
+        session.add(product)
+
+    response = grpc_stub.SearchProducts(SearchProductsRequest(key_word=keyword, limit=2))
+    assert len(response.products) == 1
+    assert keyword in response.products[0].product_name
+
+
+def test_SearchProducts_limit(grpc_stub):
+    faker = Faker()
+    keyword = faker.name()
+    # save to db
+    with session_scope() as session:
+        for idx in range(5):
+            product = DBProduct(product_name=f'{keyword}_{idx}',
+                                product_price=Decimal(faker.random_int() / 100),
+                                product_status=InStock,
+                                category=faker.name())
+            session.add(product)
+
+    response = grpc_stub.SearchProducts(SearchProductsRequest(key_word=keyword, limit=2))
+    assert len(response.products) == 2
+    assert keyword in response.products[0].product_name
+    assert keyword in response.products[1].product_name
+
+
+def test_DownloadProducts_exist(grpc_stub):
+    faker = Faker()
+    category = faker.name()
+    # save to db
+    with session_scope() as session:
+        for idx in range(5):
+            product = DBProduct(product_name=f'{faker.name()}_{idx}',
+                                product_price=Decimal(faker.random_int() / 100),
+                                product_status=InStock,
+                                category=category)
+            session.add(product)
+    result = grpc_stub.DownloadProducts(DownloadProductsRequest(category=category))
+
+    # assert we have 5 items
+    assert len(list(result)) == 5
+
+
+def test_DownloadProducts_none_exist(grpc_stub):
+    faker = Faker()
+    category = faker.name()
+    result = grpc_stub.DownloadProducts(DownloadProductsRequest(category=category))
+
+    # assert we have 0 items
+    assert len(list(result)) == 0
+
+
+def test_DownloadProductImage(grpc_stub):
+    faker = Faker()
+    target_image_file = faker.file_name(category=None, extension='png')
+    data_chunks = grpc_stub.DownloadProductImage(DownloadProductImageRequest(product_id=1))
+    with open(target_image_file, 'wb') as f:
+        for chunk in data_chunks:
+            f.write(chunk.data)
+
+    original_image_file = Path(__file__).resolve().parent.parent.parent.joinpath('images/python-grpc.png')
+    assert filecmp.cmp(original_image_file, target_image_file)
+    os.remove(target_image_file)
diff --git a/grpc-mate-python/tests/service/test_product_update_servicer.py b/grpc-mate-python/tests/service/test_product_update_servicer.py
new file mode 100644
index 0000000..3bc2a95
--- /dev/null
+++ b/grpc-mate-python/tests/service/test_product_update_servicer.py
@@ -0,0 +1,71 @@
+import pytest
+
+from data_store import engine
+from data_store.models import Base, DBProduct
+from data_store.db import session_scope
+from grpc_mate.product_common_pb2 import Product, InStock
+from grpc_mate.product_search_engine_pb2 import UploadProductResponse
+
+
+@pytest.fixture(scope='module')
+def grpc_add_to_server():
+    from grpc_mate.product_search_engine_pb2_grpc import add_ProductUpdateServiceServicer_to_server
+    return add_ProductUpdateServiceServicer_to_server
+
+
+@pytest.fixture(scope='module')
+def grpc_servicer():
+    from service.product_update_servicer import ProductUpdateServiceServicer
+
+    return ProductUpdateServiceServicer()
+
+
+@pytest.fixture(scope='module')
+def grpc_stub_cls(grpc_channel):
+    from grpc_mate.product_search_engine_pb2_grpc import ProductUpdateServiceStub
+
+    return ProductUpdateServiceStub
+
+
+@pytest.fixture(autouse=True, scope='function')
+def create_schema():
+    if engine.url.__str__() == 'sqlite:///:memory:':
+        Base.metadata.create_all(engine)
+    yield None
+    Base.metadata.drop_all(engine)
+
+
+def test_UploadProduct_insert_one(grpc_stub):
+    products = [
+        Product(product_name='product_name_1', product_price=1.0, product_status=InStock, category='category_1')]
+    grpc_stub.UploadProduct(iter(products))
+    with session_scope() as session:
+        rows = session.query(DBProduct).count()
+        assert rows == 1
+        my_product = session.query(DBProduct).one()
+        product = products[0]
+        assert my_product.product_id is not None
+        assert my_product.product_name == product.product_name
+        assert my_product.product_price == product.product_price
+        assert my_product.product_status == product.product_status
+        assert my_product.category == product.category
+
+
+def test_UploadProduct_insert_two(grpc_stub):
+    products = [
+        Product(product_name='product_name_1', product_price=1.0, product_status=InStock, category='category_1'),
+        Product(product_name='product_name_2', product_price=2.0, product_status=InStock, category='category_2')]
+    grpc_stub.UploadProduct(iter(products))
+    with session_scope() as session:
+        rows = session.query(DBProduct).count()
+        assert rows == 2
+
+def test_UploadProductResponse_enum():
+    """
+    see https://developers.google.com/protocol-buffers/docs/reference/python-generated#enum on how to use enum
+    see https://github.com/protocolbuffers/protobuf/blob/master/python/google/protobuf/internal/enum_type_wrapper.py
+    for method in all enums
+    :return:
+    """
+    upload_product_response = UploadProductResponse(result_status=UploadProductResponse.SUCCESS)
+    assert upload_product_response.result_status == UploadProductResponse.ResultStatus.Value('SUCCESS')
diff --git a/grpc-mate-python/tox.ini b/grpc-mate-python/tox.ini
new file mode 100644
index 0000000..3597c0e
--- /dev/null
+++ b/grpc-mate-python/tox.ini
@@ -0,0 +1,9 @@
+[pycodestyle]
+ignore = E722
+max-line-length = 120
+statistics = True
+
+[tool:pytest]
+
+[pytest]
+addopts = --cov=service/ --cov=data_store/ --cov-fail-under=90
\ No newline at end of file
diff --git a/helloworld-service/src/main/proto/helloworld.proto b/helloworld-service/src/main/proto/helloworld.proto
deleted file mode 120000
index 8a729af..0000000
--- a/helloworld-service/src/main/proto/helloworld.proto
+++ /dev/null
@@ -1 +0,0 @@
-../../../../protobuffers/helloworld.proto
\ No newline at end of file
diff --git a/protobuffers/Makefile b/protobuffers/Makefile
index 16bfb73..4f99eec 100644
--- a/protobuffers/Makefile
+++ b/protobuffers/Makefile
@@ -1,4 +1,4 @@
 clean:
 	mkdir -p java_generated && rm -rf java_generated/*
 gen: clean
-	protoc --java_out=java_generated *.proto
\ No newline at end of file
+	protoc --java_out=java_generated grpc_mate/*.proto
diff --git a/protobuffers/helloworld.proto b/protobuffers/grpc_mate/helloworld.proto
similarity index 64%
rename from protobuffers/helloworld.proto
rename to protobuffers/grpc_mate/helloworld.proto
index f03ed12..689d973
--- a/protobuffers/helloworld.proto
+++ b/protobuffers/grpc_mate/helloworld.proto
@@ -2,10 +2,17 @@ syntax = "proto3";
 option java_package = "io.datanerd.generated.helloworld";
 option java_multiple_files = true;
 
+import "google/api/annotations.proto";
+
 // The greeting service definition.
 service Greeter {
   // Sends a greeting
-  rpc SayHello (HelloRequest) returns (HelloReply) {}
+  rpc SayHello (HelloRequest) returns (HelloReply) {
+    option (google.api.http) = {
+      post: "/api/v1/greeter/hello"
+      body: "*"
+    };
+  }
 }
 
 // The request message containing the user's name.
diff --git a/protobuffers/product_common.proto b/protobuffers/grpc_mate/product_common.proto
similarity index 100%
rename from protobuffers/product_common.proto
rename to protobuffers/grpc_mate/product_common.proto
diff --git a/protobuffers/product_search_engine.proto b/protobuffers/grpc_mate/product_search_engine.proto
similarity index 97%
rename from protobuffers/product_search_engine.proto
rename to protobuffers/grpc_mate/product_search_engine.proto
index 8837173..f24f8c7
--- a/protobuffers/product_search_engine.proto
+++ b/protobuffers/grpc_mate/product_search_engine.proto
@@ -1,7 +1,7 @@
 syntax = "proto3";
 option java_package = "io.datanerd.generated.es";
 option java_multiple_files = true;
-import "product_common.proto";
+import "grpc_mate/product_common.proto";
 import "google/api/annotations.proto";
 //make sure this can be used in grpc gateway
 option go_package = "datanerd";