TensorFlow 1 Session.run is taking too much time to embed a sentence using the Universal Sentence Encoder

Using TensorFlow with a Flask REST API.

How can I reduce the time spent in session.run?

I am using TF 1/2 inside the REST API itself; instead of using TensorFlow Serving, I load the model directly on my server.

I have tried both TensorFlow 1 and TensorFlow 2.

TensorFlow 1 takes too much time.

TensorFlow 2 does not even return the vectors for the text.

In TensorFlow 1, initialising takes 2-4 seconds and session.run takes 5-8 seconds, and the time keeps increasing as I keep sending requests.

TensorFlow 1:

import time

import tensorflow.compat.v1 as tfo
import tensorflow_hub as hub

tfo.disable_eager_execution()

module_url = "https://tfhub.dev/google/universal-sentence-encoder-qa/3"
# Import the Universal Sentence Encoder's TF Hub module
embed = hub.Module(module_url)


def convert_text_to_vector(text):
    # Compute a representation for each message, showing various lengths supported.
    try:
        # text = "qwerty" or ["qwerty"]
        if isinstance(text, str):
            text = [text]
        with tfo.Session() as session:
            t_time = time.time()
            session.run([tfo.global_variables_initializer(), tfo.tables_initializer()])
            m_time = time.time()
            message_embeddings = session.run(embed(text))
            vector_array = message_embeddings.tolist()[0]
        return vector_array
    except Exception as err:
        raise Exception(str(err))
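For comparison, here is a minimal, untested sketch of the same TF1 flow with the graph, session, and initializers built once at startup and only the forward pass executed per call. In the snippet above, a new Session is created and the initializers are re-run on every request, and calling embed(text) inside the function keeps adding new ops to the default graph, which is likely why the latency grows over time (the call pattern embed(...) mirrors the code above):

import tensorflow.compat.v1 as tfo
import tensorflow_hub as hub

tfo.disable_eager_execution()

module_url = "https://tfhub.dev/google/universal-sentence-encoder-qa/3"
embed = hub.Module(module_url)

# Build the graph once: a string placeholder that is fed at request time.
text_input = tfo.placeholder(dtype=tfo.string, shape=[None])
embedding_op = embed(text_input)

# One long-lived session; the expensive initializers run a single time.
session = tfo.Session()
session.run([tfo.global_variables_initializer(), tfo.tables_initializer()])


def convert_text_to_vector(text):
    # Only the forward pass runs per request.
    if isinstance(text, str):
        text = [text]
    message_embeddings = session.run(embedding_op, feed_dict={text_input: text})
    return message_embeddings.tolist()[0]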

TensorFlow 2:

It gets stuck at vector_array = embedding_fn(text).

import tensorflow as tf
import tensorflow_hub as hub

module_url = "https://tfhub.dev/google/universal-sentence-encoder-qa/3"
embedding_fn = hub.load(module_url)


@tf.function
def convert_text_to_vector(text):
    try:
        # text = ["qwerty"]
        vector_array = embedding_fn(text)
        return vector_array
    except Exception as err:
        raise Exception(str(err))
2 Answers

For the TensorFlow 2 version, I made a few corrections. Basically, I followed the example for the Universal Sentence Encoder module that you are using.

import tensorflow as tf
import tensorflow_hub as hub
import numpy as np

module_url = "https://tfhub.dev/google/universal-sentence-encoder-qa/3"
embedding_fn = hub.load(module_url)


@tf.function
def convert_text_to_vector(text):
    try:
        # Use the module's 'question_encoder' signature instead of calling it directly
        vector_array = embedding_fn.signatures['question_encoder'](
            tf.constant(text))
        return vector_array['outputs']
    except Exception as err:
        raise Exception(str(err))


### run the function
vector = convert_text_to_vector(['is this helpful ?'])
print(vector.shape)
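Since embedding_fn is loaded once at import time, repeated calls reuse the already-loaded model. A small follow-up sketch (the sentences below are just hypothetical inputs) showing a batched call and conversion to plain Python lists, e.g. for returning them from an API:

# Batch several sentences in one call; the encoder returns one vector per sentence.
sentences = ["how do I reset my password?", "what is the refund policy?"]  # hypothetical inputs
vectors = convert_text_to_vector(sentences)
print(vectors.shape)                    # (number of sentences, embedding size)
vector_list = vectors.numpy().tolist()  # plain lists, ready for JSON serialisation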
from flask import Flask, request, jsonify
# werkzeug / flask_restplus imports kept from the original setup (not used by this route)
from werkzeug.utils import cached_property
from flask_restplus import Api, Resource

import tensorflow as tf
import tensorflow_hub as hub

# Load the module once at startup so each request only pays for the forward pass
module_url = "https://tfhub.dev/google/universal-sentence-encoder-qa/3"
embedding_fn = hub.load(module_url)

app = Flask(__name__)


@app.route('/embedding', methods=['POST'])
def entry_point():
    args = request.get_json(force=True)
    if not args or not args.get("text"):
        return jsonify({"error": "missing 'text' field"}), 400
    text_term = args.get("text")
    if isinstance(text_term, str):
        text_term = [text_term]
    vectors = convert_text_to_vector(text_term)
    # Convert the tensor to plain Python lists so it can be serialised as JSON
    return jsonify({"vectors": vectors.numpy().tolist()})


@tf.function
def convert_text_to_vector(text):
    try:
        vector_array = embedding_fn.signatures['question_encoder'](tf.constant(text))
        return vector_array['outputs']
    except Exception as err:
        raise Exception(str(err))


if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000, debug=True)

"""
----- requirements.txt ----
flask-restplus==0.13.0
Flask==1.1.1
Werkzeug==0.15.5
tensorboard==2.2.2
tensorboard-plugin-wit==1.6.0.post3
tensorflow==2.2.0
tensorflow-estimator==2.2.0
tensorflow-hub==0.8.0
tensorflow-text==2.2.1
"""
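For completeness, a hedged client-side sketch that exercises the /embedding endpoint above, assuming the handler accepts a JSON body with a "text" field and returns the embeddings under a "vectors" key (as in the version shown here) and that the requests package is installed:

import requests

# Hypothetical client call against the Flask service above.
resp = requests.post(
    "http://localhost:5000/embedding",
    json={"text": "is this helpful ?"},
)
resp.raise_for_status()
vectors = resp.json()["vectors"]
print(len(vectors), len(vectors[0]))  # number of sentences, embedding size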
