Struggling with tf.gradients and dimension mismatch in TensorFlow
I am currently implementing a loss function and am struggling to compute the second derivatives of the model output.
```python
def loss(model, t_interior, S_interior, t_terminal, S_terminal):
    '''
    Args:
        model:      DGM model object
        t_interior: sampled time points in the interior of the function's domain
        S_interior: sampled space points in the interior of the function's domain
        t_terminal: sampled time points at terminal point (vector of terminal times)
        S_terminal: sampled space points at terminal time
    '''
    # Loss term #1: PDE
    # compute function value and derivatives at current sampled points
    V = model(t_interior, S_interior)
    V_t = tf.gradients(V, t_interior)[0]
    V_S = tf.gradients(V, S_interior)[0]

    sumterm = 0
    for i in range(space_dim):
        for j in range(space_dim):
            temp1 = tf.transpose(tf.gather_nd(tf.transpose(S_interior), [[i]]))
            temp2 = tf.gather_nd(tf.transpose(S_interior), [[i]])
            temp3 = tf.gather_nd(tf.transpose(V_S), [[i]])  # ith column of V_S, but as row vector
            V_SS = tf.gradients(tf.transpose(temp3), tf.transpose(temp2))[0]  # second derivative of V with respect to x_i and x_j
            print('temp3: ', temp3)
            print('V_SS: ', V_SS)
            sumterm += tf.multiply(rho, tf.multiply(tf.matmul(sig(temp1), sig(temp2)), V_SS))

    diff_V = V_t + tf.matmul(mu(S_interior), V_S) + 0.5 * sumterm - r * V
```
What I was trying to do with V_SS is to compute all the second derivatives of V with respect to S_interior, but the result just has dimension None and cannot be used in sumterm. S_interior should have shape (1000, 4), but it actually has shape (?, 4) because of how it is fed later in the code. For background, this code is taken from alialaradi/DeepGalerkinMethod, and I am trying to increase the spatial dimension from 1 to 4. Thanks in advance for any help, and let me know if anything is unclear.
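For comparison, here is a minimal, self-contained sketch of what I am ultimately trying to get: all per-sample second derivatives of a scalar output with respect to a (?, 4) input, by differentiating each column of the first derivative again with tf.gradients. The function f, the placeholder S, and all names here are just illustrative stand-ins for V = model(t, S) and S_interior, and I am assuming TF1-style graph mode as in the repo:

```python
# Minimal sketch (not the actual DGM model): TF1-style graph mode assumed,
# f and S are illustrative stand-ins for V = model(t, S) and S_interior.
import numpy as np
import tensorflow as tf

space_dim = 4
S = tf.placeholder(tf.float32, shape=[None, space_dim])   # plays the role of S_interior
f = tf.reduce_sum(tf.sin(S), axis=1, keepdims=True)       # plays the role of V, shape (?, 1)

# first derivatives: dV/dS, shape (?, space_dim)
f_S = tf.gradients(f, S)[0]

# second derivatives: differentiate each column of f_S with respect to S again
rows = []
for i in range(space_dim):
    # gradient of the i-th first derivative w.r.t. all inputs -> i-th Hessian row, shape (?, space_dim)
    rows.append(tf.gradients(f_S[:, i], S)[0])
f_SS = tf.stack(rows, axis=1)                              # shape (?, space_dim, space_dim)

with tf.Session() as sess:
    S_val = np.random.rand(1000, space_dim).astype(np.float32)
    print(sess.run(f_SS, feed_dict={S: S_val}).shape)      # (1000, 4, 4)
```

In my actual loss function I would then expect to pick out the (i, j) entry as something like f_SS[:, i, j] inside the double loop, but I am not sure whether that is the right way to do it with the (?, 4) placeholder.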