Freezing the graph and then reloading for inference #26
I think I am on the right track here; at the very least it returns an answer, though given my very small number of training iterations I won't know until it is fully trained whether it's the right answer. To start, I gave the input_data tensor a name:
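Roughly like this; the dtype and shape below are just assumptions standing in for whatever the training script defines, and the only part that matters is the explicit `name` argument:

```python
# Naming the placeholder lets it be retrieved by name ("input_data:0") after
# the graph is frozen. batchSize and maxSeqLength are assumptions standing in
# for the training script's own constants.
input_data = tf.placeholder(tf.int32, [batchSize, maxSeqLength], name='input_data')
```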
Then, when I printed the value of the prediction op, I could see its name, so I can now write out the graph using "add" as my output name:

```python
output_graph_def = graph_util.convert_variables_to_constants(sess, sess.graph.as_graph_def(), ["add"])
with gfile.FastGFile("frozen_graph.pb", 'wb') as f:
    f.write(output_graph_def.SerializeToString())
```

Now, when I want to pass new sentences in later on, I can restore the frozen graph and feed the new input like this. It at least doesn't crash, but I'm not sure whether it is a complete replica of the testing script that uses the checkpoint files. In particular, am I right about what I think are the crucial input and output tensors?

```python
inputText = FLAGS.sentence
tf.reset_default_graph()
graph = load_graph("frozen_graph.pb")  # load_graph wraps tf.import_graph_def
with tf.Session(graph=graph) as sess:
    ...
```
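For completeness, here is a minimal self-contained sketch of what the `load_graph` helper and the `sess.run` step could look like, assuming the names used above (`input_data` as the placeholder, `add` as the output op) and that the sentence has already been converted into the integer id matrix the model was trained on; the batch and sequence sizes are assumptions, not values taken from the repo:

```python
import numpy as np
import tensorflow as tf

def load_graph(pb_path):
    """Read a frozen GraphDef from disk and import it into a fresh tf.Graph."""
    with tf.gfile.GFile(pb_path, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    graph = tf.Graph()
    with graph.as_default():
        # name="" keeps the original tensor names (no "import/" prefix)
        tf.import_graph_def(graph_def, name="")
    return graph

graph = load_graph("frozen_graph.pb")
input_tensor = graph.get_tensor_by_name("input_data:0")
output_tensor = graph.get_tensor_by_name("add:0")

# input_ids stands in for the preprocessed sentence: integer word indices
# padded to the same [batch_size, max_seq_length] shape used in training
# (24 and 250 here are assumptions).
input_ids = np.zeros((24, 250), dtype=np.int32)

with tf.Session(graph=graph) as sess:
    scores = sess.run(output_tensor, feed_dict={input_tensor: input_ids})
    print(scores)  # raw output of the "add" op for the whole batch
```

Whether "add" really is the final prediction op depends on how the last layer was built, so it is worth checking the op name printed during training before freezing.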
Hmm, I'm actually not too familiar with saving into .pb files, but the general approach you have of naming the variables, using f.write, then load_graph, and then sess.run seems right to me. Will let you know if I find any more info on this.
Thanks, that would be great.
@camer314 Is it possible to share the frozen model (.pb)? |
Hello,
I want to freeze your model into a .pb file and then later reload it when I want to pass in my own sentences. I am familiar with image training, where operations are named (e.g. 'input' and 'final_result') and I feed the necessary tensors.
However, in your model there are no named operations, and the testing script does a lot of heavy lifting with restoration and loading of the vector files.
Is it possible to just load a .pb file and pass in a sentence? If so, what do I feed that sentence into, and which tensor do I need to read for the result?
Basically, I am trying to get this running on TF Mobile, so I am only using the Java/C# library. For all my other models I have simply frozen to .pb, restored, fed the correct input tensor, and read the results; I want to do the same here, but I am not sure what the absolute minimum requirement is.
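For context, the part of the testing script that cannot be folded into the .pb is the text preprocessing: the sentence has to be mapped to the same integer word indices used at training time before it is fed to the input tensor. A rough sketch of that mapping, where the word-list file name, the sequence length, and the unknown-word index are all assumptions about the repo rather than confirmed values:

```python
import numpy as np

# Assumptions: the training vocabulary was saved as 'wordsList.npy', the
# placeholder length is max_seq_length, and out-of-vocabulary words map to a
# reserved "unknown" index. All three stand in for the repo's actual values.
words_list = [w.decode('UTF-8') if isinstance(w, bytes) else w
              for w in np.load('wordsList.npy')]
word_to_id = {w: i for i, w in enumerate(words_list)}
max_seq_length = 250              # assumption: must match the frozen input shape
unknown_id = len(words_list) - 1  # assumption: reserved unknown-word index

def sentence_to_ids(sentence):
    """Map a raw sentence to the fixed-length integer vector the graph expects."""
    ids = np.zeros(max_seq_length, dtype=np.int32)
    for i, word in enumerate(sentence.lower().split()[:max_seq_length]):
        ids[i] = word_to_id.get(word, unknown_id)
    return ids
```

On TF Mobile this same mapping has to be reproduced in the Java/C# layer, since it lives outside the frozen graph.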