
I'm building a neural network with a Lambda layer whose function works on a NumPy array.

import cv2
import numpy as np
import tensorflow as tf

@tf.function
def getEdge(tensor):
    batchLines = []
    for a in tensor:
        array = a.numpy()  # this is the line that fails
        gray = cv2.cvtColor(array, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150, apertureSize=3)

        lines = cv2.HoughLines(edges, 1, np.pi / 180, 200).reshape(127, 2)
        batchLines.append(lines[:, 1][:60])
    return batchLines

The first layers of the model are:

inputs = keras.Input(shape=(600,600,3))

perspective = layers.Lambda(getEdge)(inputs)
batchNorm0 = layers.BatchNormalization()(perspective)

dense1 = layers.Dense(60, activation="relu")(batchNorm0)
batchNorm1 = layers.BatchNormalization()(dense1)

dense2 = layers.Dense(60, activation="relu")(batchNorm1)
batchNorm2 = layers.BatchNormalization()(dense2)

dense3 = layers.Dense(64, activation="relu")(batchNorm2)
batchNorm3 = layers.BatchNormalization()(dense3)

wheights = layers.Dense(nbDecsriptors, activation="softmax")(batchNorm3)

merged = layers.Concatenate()([inputs, wheights])

describtors = layers.Lambda(describe)(merged)

However, when I run this cell (I'm using a Kaggle notebook) I get an error at a.numpy(): it says 'Tensor' object has no attribute 'numpy'.
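
For reference, here is a minimal sketch, separate from the model above, that reproduces the same error: .numpy() only exists on eager tensors, and inside @tf.function (or while Keras traces the Lambda layer to build the graph) the tensors are symbolic, so the attribute is missing.

import tensorflow as tf

# Eager execution (the TF 2.x default): the tensor holds concrete values,
# so .numpy() works.
eager_tensor = tf.constant([[1.0, 2.0]])
print(eager_tensor.numpy())

@tf.function
def graph_mode(x):
    # During tracing, x is a symbolic Tensor with no values yet, so this
    # raises: AttributeError: 'Tensor' object has no attribute 'numpy'
    return x.numpy()

graph_mode(eager_tensor)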
