Path: blob/master/Custom and Distributed Training with Tensorflow/Week 3 - Graph Mode/C2_W3_Lab_2-graphs-for-complex-code.ipynb
Autograph: Graphs for complex code
In this ungraded lab, you'll go through some of the scenarios from the lesson *Creating graphs for complex code*.
Imports
As you saw in the lectures, seemingly simple functions can sometimes be difficult to write in graph mode. Fortunately, Autograph generates this complex graph code for us.
Here is a function that does some multiplication and addition.
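A minimal sketch of such a cell (the function name and arguments are assumptions, not the lab's exact code):

```python
import tensorflow as tf

@tf.function
def multiply_and_add(a, b):
    # AutoGraph traces these eager-style ops into a graph on the first call
    return a * b + b

result = multiply_and_add(tf.constant(2.0), tf.constant(3.0))
# result is a tf.Tensor holding 2*3 + 3 = 9.0
```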
Here is a function that checks if the sign of a number is positive or not.
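A sketch of a sign-checking function, assuming string return values like the lecture's example. AutoGraph converts the Python `if`/`else` into a `tf.cond` in the generated graph:

```python
import tensorflow as tf

@tf.function
def sign(x):
    # AutoGraph rewrites this branch as tf.cond when x is a Tensor
    if x > 0:
        return 'Positive'
    else:
        return 'Negative or zero'

print(sign(tf.constant(2)))
print(sign(tf.constant(-2)))
```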
Here is another function that includes a while loop.
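A sketch of a function with a while loop (the shrink-with-`tanh` body is an assumption). AutoGraph turns the Python `while` into a `tf.while_loop` because the condition depends on a tensor:

```python
import tensorflow as tf

@tf.function
def shrink(x):
    # the loop condition depends on a Tensor, so AutoGraph
    # compiles this into a tf.while_loop
    while tf.reduce_sum(x) > 1:
        x = tf.tanh(x)
    return x

result = shrink(tf.ones([5]))
```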
Here is a function that uses a for loop and an if statement.
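A sketch of a for-loop-plus-`if` function, modeled on the standard AutoGraph example of summing the even numbers in a tensor (the name `sum_even` is an assumption):

```python
import tensorflow as tf

@tf.function
def sum_even(items):
    s = 0
    # AutoGraph converts iteration over a Tensor into a graph loop,
    # and the if/continue into graph-mode control flow
    for c in items:
        if c % 2 > 0:
            continue
        s += c
    return s

result = sum_even(tf.constant([10, 12, 15, 20]))
# 10 + 12 + 20 = 42
```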
Print statements
Tracing also behaves differently in graph mode. First, here is a function (not decorated with `@tf.function` yet) that prints the value of the input parameter. `f(2)` is called in a for loop 5 times, and then `f(3)` is called.
If you were to decorate this function with `@tf.function` and run it, notice that the print statement only appears once for `f(2)` even though it is called in a loop. The Python `print` runs only while the function is being traced, and tracing happens just once per distinct input, so repeated calls with the same argument reuse the cached graph without printing.
Now compare `print` to `tf.print`. `tf.print` is graph-aware and will run as expected in loops.
Try running the same code where `tf.print()` is added in addition to the regular `print`.
Note how `tf.print` behaves compared to `print` in graph mode.
Avoid defining variables inside the function
This function (not decorated yet) defines a `tf.Variable` `v` and adds the input `x` to it.
Here, it runs fine.
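A sketch of that cell (the function name is an assumption). In eager mode, creating a fresh variable on every call is allowed:

```python
import tensorflow as tf

def add_to_v(x):
    v = tf.Variable(1.0)  # fine in eager mode: a fresh variable per call
    v.assign(v + x)
    return v

result = add_to_v(tf.constant(2.0))
# result holds 1.0 + 2.0 = 3.0
```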
Now decorate the function with `@tf.function`.
The cell below will throw an error because the `tf.Variable` is created within the function: each trace would try to create it again. The graph-mode function should only contain operations, not the creation of new variables.
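A sketch of the failing cell, with the error caught so the cell itself still runs (the function name is an assumption; in current TensorFlow this raises a `ValueError` about singleton variables):

```python
import tensorflow as tf

@tf.function
def add_to_v(x):
    v = tf.Variable(1.0)  # creating a variable inside a tf.function fails
    v.assign(v + x)
    return v

try:
    add_to_v(tf.constant(2.0))
except ValueError as err:
    # tf.function only supports singleton variables created on the first call
    print("Error:", type(err).__name__)
```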
To get around the error above, simply move `v = tf.Variable(1.0)` to the top of the cell, before the `@tf.function` decorator.
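A sketch of the fixed cell: the variable is created once outside the function, and the traced graph only contains operations that use it:

```python
import tensorflow as tf

v = tf.Variable(1.0)  # created once, outside the decorated function

@tf.function
def add_to_v(x):
    v.assign(v + x)  # the graph only contains ops that read/update v
    return v

result = add_to_v(tf.constant(2.0))
# v now holds 1.0 + 2.0 = 3.0
```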