This repository was archived by the owner on Nov 5, 2022. It is now read-only.

Could not parse example input in TFX_Pipeline_for_Bert_Preprocessing #67

@mironnn

Description


Hi, could you please advise where I'm going wrong?
I don't have much experience and I'm trying to figure out how this works.

I tried to use the model built with your TFX_Pipeline_for_Bert_Preprocessing.ipynb, but when I try to serve it via TF Serving I receive: "error": "Could not parse example input, value: 'You are very good person'\n\t [[{{node ParseExample/ParseExampleV2}}]]"

My steps:

  1. Download the TFX_Pipeline_for_Bert_Preprocessing.ipynb notebook locally
  2. Change the "/content/..." folder to "/tmp/..."
  3. Change the dataset version from "0.1.0" to "1.0.0", since only 1.0.0 is available
  4. Install dependencies and build the model locally
  5. Run TF Serving via Docker and load the already-built model
  6. Make a request: curl -d '{"instances": ["You are very good person"]}' -X POST --output - http://localhost:8501/v1/models/my_model:predict
    Receive: { "error": "Could not parse example input, value: 'You are very good person'\n\t [[{{node ParseExample/ParseExampleV2}}]]" }
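The `ParseExample` node in the error suggests the exported signature expects serialized tf.train.Example protos, not plain strings. If that is the case, one workaround is to serialize an Example and send it base64-encoded via TF Serving's `{"b64": ...}` REST convention. Below is a stdlib-only sketch that hand-encodes a minimal Example with a single bytes feature (normally you would use `tf.train.Example(...).SerializeToString()`); the feature name "text" is an assumption here, so check the notebook's feature spec for the real name.

```python
import base64
import json

def _field(field_number: int, payload: bytes) -> bytes:
    # Encode a length-delimited protobuf field (wire type 2).
    # All payloads here are < 128 bytes, so each varint fits in one byte.
    return bytes([field_number << 3 | 2, len(payload)]) + payload

def serialize_example(feature_name: str, text: str) -> bytes:
    # Hand-encode tf.train.Example{features{feature{name: bytes_list{value}}}}.
    bytes_list = _field(1, text.encode("utf-8"))            # BytesList.value
    feature = _field(1, bytes_list)                         # Feature.bytes_list
    map_entry = _field(1, feature_name.encode("utf-8")) + _field(2, feature)
    features = _field(1, map_entry)                         # Features.feature
    return _field(1, features)                              # Example.features

# "text" is a hypothetical feature name -- replace with the model's actual one.
example = serialize_example("text", "You are very good person")
payload = json.dumps(
    {"instances": [{"b64": base64.b64encode(example).decode("ascii")}]}
)
print(payload)
```

The printed JSON could then be POSTed to the same `:predict` endpoint in place of the raw-string body.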

So I assume the model was trained to take a tensor as input. Also, at the end of your notebook there is a test that exercises the model's "serving_default" signature, and there we also feed a tensor to the model.

How can I pass raw text in the request to TF Serving? Should TF Serving convert the string to a tensor?
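TF Serving itself won't convert the string; the conversion has to live in the exported serving signature. A minimal sketch of the idea, under the assumption that you can re-export the model: wrap it in a signature whose input is a batch of raw strings (`tf.string`). The toy module below just returns string lengths in place of the real BERT model, but the signature shape is the point.

```python
import tensorflow as tf

class ToyModel(tf.Module):
    # Accepts a batch of raw strings; a real export would run the
    # preprocessing + BERT model here instead of tf.strings.length.
    @tf.function(input_signature=[tf.TensorSpec([None], tf.string, name="text")])
    def serve(self, text):
        return {"scores": tf.strings.length(text)}

model = ToyModel()
# "/tmp/raw_text_model/1" is an arbitrary example export path.
tf.saved_model.save(model, "/tmp/raw_text_model/1",
                    signatures={"serving_default": model.serve})
```

With a signature like this, the original request body `{"instances": ["You are very good person"]}` would be accepted as-is, because the serving graph no longer contains a `ParseExample` op.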

Could you please advise where I'm going wrong? I've spent more than a week trying to solve this.
