Could not parse example input in TFX_Pipeline_for_Bert_Preprocessing #67
@hanneshapke maybe you can help me?
Hi @mironnn, the tf.Example data structure is not intuitive. The data structure needs to be serialized and then base64-encoded before it is submitted to TF Serving's REST API. Let me know if you have any questions regarding the example. It works with the output of the latest BERT pipeline version and requires a recent version of TF Serving.
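For illustration, a minimal client-side sketch of those steps. It assumes the exported signature takes a single serialized-tf.Example string input and that the text feature inside the Example is named `text`; both names are assumptions, so check the exported signature with `saved_model_cli show --dir <model_dir> --all` first.

```python
import base64
import json

import requests
import tensorflow as tf

def serialize_example(text):
    # Wrap the raw string in a tf.Example and serialize it to bytes.
    # The feature name "text" is an assumption; it must match the
    # feature spec the model's serving graph parses.
    return tf.train.Example(
        features=tf.train.Features(feature={
            "text": tf.train.Feature(
                bytes_list=tf.train.BytesList(value=[text.encode("utf-8")]))
        })
    ).SerializeToString()

# TF Serving's REST API expects binary strings as {"b64": <base64 text>}.
example_b64 = base64.b64encode(
    serialize_example("You are very good person")).decode("utf-8")
payload = {"instances": [{"b64": example_b64}]}

response = requests.post(
    "http://localhost:8501/v1/models/my_model:predict",
    data=json.dumps(payload))
print(response.json())
```

If the signature has more than one named input, each instance must key the b64 dict by input name instead of passing it bare.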
In the coming days, I will publish a pipeline which doesn't require the tf.Example conversion on the client side.
@hanneshapke Thank you so much for your time and for the provided example. Yeah, that would be great! It would be very interesting to see how to make requests with raw text directly to TF Serving, without client-side preparation (serialization).
Hi @mironnn, here is the Colab version of the BERT pipeline which exports a model for simple REST requests:

@rcrowe-google I think the TFX docs should mention the export of Keras models without the tf.Example dependency. I am happy to update the existing TFX documentation. Do you mind pointing me in the right direction where additional comments would be most appropriate?

@mironnn Let me know if you have any questions. I think we can close this issue.
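For readers finding this later, a minimal sketch of the export pattern (not the notebook's actual code: the toy model and the input name `text` are placeholders, and in the real pipeline the BERT preprocessing has to live inside the saved graph):

```python
import tensorflow as tf

# Toy stand-in for the trained Keras model; it just returns string lengths.
# In the real pipeline, tokenization/preprocessing is part of the graph.
inputs = tf.keras.Input(shape=(), dtype=tf.string, name="text")
outputs = tf.keras.layers.Lambda(lambda t: tf.strings.length(t))(inputs)
model = tf.keras.Model(inputs, outputs)

# Serving signature that accepts raw strings instead of serialized
# tf.Examples, so clients can POST plain text.
@tf.function(input_signature=[
    tf.TensorSpec(shape=[None], dtype=tf.string, name="text")])
def serve_raw_text(text):
    return {"predictions": model(text)}

tf.saved_model.save(
    model, "/tmp/exported_model",
    signatures={"serving_default": serve_raw_text})
```

With such a signature, the plain-text request from the original post (`curl -d '{"instances": ["You are very good person"]}' ...`) should work as-is.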
@hanneshapke Thanks for the offer! I think the best place to document this would be in https://github.com/tensorflow/tfx/blob/master/docs/guide/keras.md. I would also be interested in including your Colab, probably under https://github.com/tensorflow/tfx/tree/master/docs/tutorials/serving.
@joeliedtke for reference
@rcrowe-google Thank you for your reply. I'll make those PRs tomorrow. I will ping you and @joeliedtke when the PRs are ready for review.
+1 to @hanneshapke. Could we revise the demonstration of serving_input_fn() for model export so that it does not receive 1-D bytes tensors and call parse_examples() in the serving graph, but simply receives a flat list of raw tensors? It's fine to receive serialized tf.Examples as input in the training input_fn, but this characteristic doesn't have to carry over to serving_input_fn(), and doing so with a Keras model causes non-intuitive behavior like this issue and tensorflow/tfx#1885.
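To make the distinction concrete, here is a sketch of the tf.Example-based signature under discussion (the toy model and the feature name `text` are placeholders, not the pipeline's actual code). Parsing inside the serving graph is what creates the `ParseExample/ParseExampleV2` node seen in the error message in this issue; any request body that is not a serialized tf.Example fails at that node:

```python
import tensorflow as tf

# Same toy stand-in model as in the raw-text sketch above.
inputs = tf.keras.Input(shape=(), dtype=tf.string, name="text")
outputs = tf.keras.layers.Lambda(lambda t: tf.strings.length(t))(inputs)
model = tf.keras.Model(inputs, outputs)

@tf.function(input_signature=[
    tf.TensorSpec(shape=[None], dtype=tf.string, name="examples")])
def serve_tf_examples(serialized):
    # Parsing in the serving graph forces every client to serialize
    # (and, over REST, base64-encode) its requests.
    feature_spec = {"text": tf.io.FixedLenFeature([], tf.string)}
    parsed = tf.io.parse_example(serialized, feature_spec)
    return {"predictions": model(parsed["text"])}
```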
@hanneshapke is there an example for gRPC? I have b64-encoded data but can't find the right format to make the prediction over the protocol.
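For reference, a hedged sketch of the equivalent gRPC call, assuming the signature input is named `examples` (verify with `saved_model_cli`). Note that over gRPC the serialized bytes go into the TensorProto directly; the `{"b64": ...}` wrapping is a REST-only convention, so don't base64-encode here:

```python
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

# Build and serialize a tf.Example; the feature name "text" is an assumption.
serialized = tf.train.Example(
    features=tf.train.Features(feature={
        "text": tf.train.Feature(
            bytes_list=tf.train.BytesList(value=[b"You are very good person"]))
    })
).SerializeToString()

channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "my_model"
request.model_spec.signature_name = "serving_default"
# Raw serialized bytes, no base64 wrapping, as a 1-D string tensor.
request.inputs["examples"].CopyFrom(
    tf.make_tensor_proto([serialized], dtype=tf.string))

response = stub.Predict(request, timeout=10.0)
print(response.outputs)
```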
Hi, could you please advise where I'm wrong?
I don't have much experience and I'm trying to figure out how it works.
I tried to use the model built with your TFX_Pipeline_for_Bert_Preprocessing.ipynb, but when I try to serve it via TF Serving I receive "error": "Could not parse example input, value: 'You are very good person'\n\t [[{{node ParseExample/ParseExampleV2}}]]".
My steps:
curl -d '{"instances": ["You are very good person"]}' -X POST --output - http://localhost:8501/v1/models/my_model:predict
and receive:
{ "error": "Could not parse example input, value: 'You are very good person'\n\t [[{{node ParseExample/ParseExampleV2}}]]" }
So I assume that the model is trained with a tensor as input. Also, at the end of your notebook there is a test that tries the model's "serving_default" signature, and there we also feed a tensor to the model.
How can I pass raw text in a request to TF Serving? Should TF Serving convert the string to a tensor?
Could you please advise where I'm wrong? I've spent more than a week trying to solve this.