How to share weights across different layers in Tensorflow?
A lot of papers and approaches talk about weights sharing across different layers in Tensorflow 1. I need to share the weights of Conv2D layers. How can I do it?
TensorFlow 1 has a mechanism called variable scope, which lets you create variables once and share them across the graph.
You just need to open a scope with the same name around the different layers.
import tensorflow as tf

# Two inputs that should be processed by the same convolution weights.
input_1 = tf.placeholder(tf.float32, [None, 32, 32, 3])
input_2 = tf.placeholder(tf.float32, [None, 32, 32, 3])

with tf.variable_scope('scope1'):
    out_1 = tf.layers.conv2d(input_1, 16, [3, 3], name='conv3x3')
with tf.variable_scope('scope1', reuse=True):
    out_2 = tf.layers.conv2d(input_2, 16, [3, 3], name='conv3x3')
You have to set reuse=True on the second scope so that it reuses the existing weights instead of creating new ones. Without reuse=True, TensorFlow raises a ValueError complaining that the variable already exists.
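As an alternative to setting reuse=True by hand, you can pass reuse=tf.AUTO_REUSE, which creates the variables on first use and reuses them afterwards, so both scope blocks can look identical. The sketch below uses the tf.compat.v1 aliases so it also runs on TensorFlow 2 installations; the input shapes are just placeholders for illustration. Listing the global variables afterwards confirms that only one kernel and one bias were created:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# Hypothetical inputs; any two tensors with the same channel count work.
input_1 = tf.placeholder(tf.float32, [None, 32, 32, 3])
input_2 = tf.placeholder(tf.float32, [None, 32, 32, 3])

# AUTO_REUSE: create variables on first call, reuse them on later calls.
with tf.variable_scope('scope1', reuse=tf.AUTO_REUSE):
    out_1 = tf.layers.conv2d(input_1, 16, [3, 3], name='conv3x3')
with tf.variable_scope('scope1', reuse=tf.AUTO_REUSE):
    out_2 = tf.layers.conv2d(input_2, 16, [3, 3], name='conv3x3')

# Both conv ops read the same variables: one kernel and one bias in total.
conv_vars = [v for v in tf.global_variables() if 'conv3x3' in v.name]
print([v.name for v in conv_vars])
```

If two sets of weights had been created, the list would contain four variables instead of two.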