
How to share weights across different layers in Tensorflow?

A lot of papers and approaches talk about sharing weights across different layers in TensorFlow 1. I need to share the weights of Conv2D layers. How can I do that?

1 Answer

TensorFlow 1 has a mechanism called variable scopes, which lets you create variables inside a scope and reuse them later. To share weights, open two scopes with the same name and give the layers the same name inside each scope.

import tensorflow as tf

# Placeholders standing in for the two inputs you want to run
# through the same convolution.
input_1 = tf.placeholder(tf.float32, [None, 28, 28, 3])
input_2 = tf.placeholder(tf.float32, [None, 28, 28, 3])

with tf.variable_scope('scope1'):
    out_1 = tf.layers.conv2d(input_1, 16, [3, 3], name='conv3x3')

with tf.variable_scope('scope1', reuse=True):
    out_2 = tf.layers.conv2d(input_2, 16, [3, 3], name='conv3x3')

You must pass reuse=True on the second scope to share the weights. Without it, TensorFlow raises a ValueError complaining that the variable already exists (a duplicate).
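You can check that the weights really are shared by listing the graph's variables: despite two conv2d calls, only one kernel and one bias should exist. A minimal sketch, using the tf.compat.v1 shim so it also runs on a TensorFlow 2 install (the input shapes are arbitrary placeholders):

```python
import tensorflow.compat.v1 as tf  # TF1-style API via the compat shim

tf.disable_eager_execution()  # variable scopes require graph mode

# Two separate inputs that should go through the *same* convolution.
input_1 = tf.placeholder(tf.float32, [None, 28, 28, 3])
input_2 = tf.placeholder(tf.float32, [None, 28, 28, 3])

with tf.variable_scope('scope1'):
    out_1 = tf.layers.conv2d(input_1, 16, [3, 3], name='conv3x3')

with tf.variable_scope('scope1', reuse=True):
    out_2 = tf.layers.conv2d(input_2, 16, [3, 3], name='conv3x3')

# Only one kernel and one bias exist in the graph,
# even though conv2d was called twice.
for v in tf.global_variables():
    print(v.name)
```

As an aside, passing reuse=tf.AUTO_REUSE instead of reuse=True makes the scope create the variables on first use and reuse them on every later call, which avoids having to write two separate scope blocks.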
