Serve the model with TensorFlow Serving? #138
After reading a lot of documentation and googling around, I came up with this solution: I can specify the `.get_next()` tensors as input tensors and feed them for prediction. Something like this (I have summarized the iterator-creation code here):

```python
# tensorrec iterators structure
row_index = np.array([sparse_matrix.row], dtype=np.int64)
col_index = np.array([sparse_matrix.col], dtype=np.int64)
values = np.array([sparse_matrix.data], dtype=np.float32)
n_dim_0 = np.array([sparse_matrix.shape[0]], dtype=np.int64)
n_dim_1 = np.array([sparse_matrix.shape[1]], dtype=np.int64)

# create dataset
tensor_slices = (row_index, col_index, values, n_dim_0, n_dim_1)
dataset = tf.data.Dataset.from_tensor_slices(tensor_slices)

# initialize it (make_initializer returns an op that must be kept and run)
self.tf_user_feature_initializer = self.tf_user_feature_iterator.make_initializer(dataset)

# get_next op
self.user_get_next = self.tf_user_feature_iterator.get_next()

# .....
# ... train model ...
# .....

# inputs for saving the model with saved_model_builder
inputs = {
    'user_row': tf.saved_model.utils.build_tensor_info(self.user_get_next[0]),
    'user_col': tf.saved_model.utils.build_tensor_info(self.user_get_next[1]),
    'user_values': tf.saved_model.utils.build_tensor_info(self.user_get_next[2]),
    ...
}
```

and I also set:

```python
builder.add_meta_graph_and_variables(
    ...
    clear_devices=True,
)
```

The complete code is here. It is working fine with the tests that I have done, but I am wondering whether it is a good idea; it may cause problems that I am not aware of. Can you help me with this? Thanks in advance.
Hey @bluemmb -- I like this approach! I don't see anything that jumps out as a problem at serving time. Did you run into issues getting this to work that we should call out? There definitely needs to be a more elegant way to do this within the TensorRec API. I'll tackle that as part of the general TF 2.0 refactor I'm working on.
Hey @jfkirk, thanks :) Aside from the problem above (which is working fine so far with the solution I have written), the other important problem is that TensorFlow Serving does not seem to support `py_func` officially. I am using the IBM Watson Machine Learning environment to run the served model, and it seems they have customized it to run `py_func` (Watson only supports TensorFlow 1.11). But it might be good to replace the `py_func`. I am still working on it and will share anything useful that I find here.
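For context on why `py_func` is a serving problem: it wraps an arbitrary Python callable that is not serialized into the SavedModel, so a plain TensorFlow Serving binary has nothing to execute. The usual fix is to re-express the Python logic with native ops. As a hedged, TF-free illustration of that rewriting pattern (function names are hypothetical; NumPy stands in for graph ops):

```python
import numpy as np

def relu_python(xs):
    # Per-element Python logic -- the kind of thing tf.py_func wraps.
    return [x if x > 0 else 0.0 for x in xs]

def relu_vectorized(xs):
    # The same logic as a single array operation -- the kind of rewrite
    # that lets you drop py_func in favor of native ops (tf.maximum here).
    return np.maximum(np.asarray(xs), 0.0)

data = [-1.0, 0.5, 2.0]
assert relu_python(data) == list(relu_vectorized(data))
```

Whether TensorRec's particular `py_func` usage can be rewritten this way depends on what the wrapped Python code does; this only sketches the general strategy.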
@bluemmb @jfkirk Have you found any alternative to replace the use of `py_func`? This part generates an error when trying to use a model with TensorFlow Serving...
Thanks in advance!
Hi @mpradilla. Unfortunately, no. Because I had no problem serving it as I described above, I didn't actually put time into it. But I'm not good enough with TensorFlow to say whether it is possible or not; @jfkirk should guide us on this.
I am new to the world of TensorFlow, and this library is really good and understandable. Thanks @jfkirk.
I have read almost the whole codebase and I know the general steps it takes to build the model and the predictions. But where I am stuck is: how can we serve the trained model with TensorFlow Serving?
Actually, I don't know how to determine the inputs for serving purposes.
Any input on this would be great.
Thanks in advance.