
Anyone using the Tensorflow functional API from inside q(embedpy)?

krish240574
New Contributor II

Hello,

I'm trying to write this code inside q:

def get_model(input_shape, time2vec_dim = 3):
    inp = Input(input_shape)
    x = inp

    time_embedding = keras.layers.TimeDistributed(Time2Vec(time2vec_dim - 1))(x)  # /*******************/

Equivalent q code is :

lyr:.p.import`keras.layers

x:lyr[`:Input;<;65;16]  /create Input(65,16)

time_embedding:lyr[`:TimeDistributed;<;.p.eval("Time2Vec(2)")] / Time2Vec is my own class, defined in a .p file

Now, how do I pass the final (x) from the Python line above (marked /******************/)?

I tried time_embedding[x], but it gives me a "rank" error.

What is the semantic meaning of the (x) passed in the functional API? Code-wise, it seems that we're passing (x) to an object of TimeDistributed, which doesn't quite make sense. Could someone please shed light on this?
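For what it's worth, the trailing "(x)" works because Keras layers are callable objects: constructing the layer and applying it to a tensor are two separate steps. A minimal plain-Python analogy (no Keras required; Layer here is a made-up stand-in, not a real Keras class):

```python
class Layer:
    def __init__(self, units):
        # step 1: construct the layer object and store its configuration
        self.units = units

    def __call__(self, x):
        # step 2: a real Keras layer would build weights here and return
        # an output tensor; we just tag the input to show the call happened
        return ("output", self.units, x)

layer = Layer(3)       # construct the layer
y = layer("tensor")    # apply it to an input: this is what the "(x)" does

# the one-liner form used in the functional API is the same two steps fused:
assert y == Layer(3)("tensor")
```

So `TimeDistributed(Time2Vec(2))(x)` first builds a layer object, then calls it on `x` to wire it into the model graph.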

Thanks,

Krishna

1 ACCEPTED SOLUTION

rocuinneagain
Contributor II

You could load the function into q rather than redefine it line by line.

 

Example file.p

def get_model(input_shape, time2vec_dim = 3):
  inp = Input(input_shape)
  x = inp
  time_embedding = keras.layers.TimeDistributed(Time2Vec(time2vec_dim - 1))(x)

Load it and pull in the function:

q)\l file.p
q)get_model:.p.get`get_model
q)get_model[(65;16)]

Another example of this: https://github.com/rianoc/qparquet 

 

Looking to replicate an example from https://keras.io/api/layers/recurrent_layers/time_distributed/ 

q)lyr:.p.import`keras.layers
q)x:lyr[`:Input;`shape pykw (10, 128, 128, 3)]
q)conv_2d_layer:lyr[`:Conv2D;64;(3;3)]
q)time_embedding:lyr[`:TimeDistributed;conv_2d_layer]
q)outputs:time_embedding[x]
q)print outputs[`:shape]
(None, 10, 126, 126, 64)
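As a side note on why the spatial dimensions shrink from 128 to 126 in that output: Conv2D defaults to 'valid' padding and stride 1, so each 3x3 kernel trims the edges. The standard output-size formula can be checked in plain Python (a sketch of the arithmetic, not embedPy code):

```python
def conv_output_size(input_size, kernel_size, stride=1, padding=0):
    # valid-padding convolution: floor((n + 2p - k) / s) + 1
    return (input_size + 2 * padding - kernel_size) // stride + 1

print(conv_output_size(128, 3))  # 126, matching the shape printed above
```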

 


2 REPLIES

Thanks a ton for the detailed replies, @rocuinneagain. I see that I was making a couple of errors:

1. Using all the TensorFlow function calls with "<" included, i.e. intending them to return q objects. I'm much better off leaving out the "<" and using them as default embedPy objects, as mentioned here:

https://code.kx.com/q/ml/embedpy/userguide/#function-calls 

2. Doing a .p.eval"Time2Vec(2)", which causes the following warning:

WARNING:tensorflow:AutoGraph could not transform <bound method Time2Vec.call of <__main__.Time2Vec object at 0x7fbf09ad6160>> and will run it as-is.
Cause: Unable to locate the source code of <bound method Time2Vec.call of <__main__.Time2Vec object at 0x7fbf09ad6160>>. Note that functions defined in certain environments, like the interactive Python shell, do not expose their source code. If that is the case, you should define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.experimental.do_not_convert. Original error: source code not available
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert

 

I avoided the warning by using the following instead:

(.p.get`Time2Vec)[2]

It seems strange that .p.eval should cause that warning when .p.get`Time2Vec does not.
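The difference is less strange than it looks: .p.eval compiles its argument from a string, and functions compiled from strings have no backing source file, which is exactly what the AutoGraph warning complains about. The analogous situation can be reproduced in plain Python (a sketch; this does not use embedPy itself):

```python
import inspect

# Compile a function from a string, as .p.eval effectively does.
ns = {}
exec("def f(): return 1", ns)

# AutoGraph needs the function's source; for string-compiled code
# inspect.getsource fails because there is no file to read from.
try:
    inspect.getsource(ns["f"])
    source_found = True
except OSError:
    source_found = False

print(source_found)  # False: no source available, hence the warning
```

A class loaded from a .p file and fetched with .p.get has a real source file behind it, so AutoGraph can find its code and stays quiet.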


I'll read the documentation again.

 

Cheers, 

Krishna