GitHub Repository: tensorflow/docs-l10n
Path: blob/master/site/zh-cn/tutorials/customization/custom_layers.ipynb
Kernel: Python 3
#@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

Custom layers

We recommend using tf.keras as a high-level API for building neural networks. That said, most TensorFlow APIs are usable with eager execution.

import tensorflow as tf
print(tf.config.list_physical_devices('GPU'))
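
As a minimal illustration (this cell is not part of the original notebook), eager execution is enabled by default in TensorFlow 2, so operations run immediately and return concrete values rather than building a graph:

# Minimal illustration (not from the original notebook): eager execution
# is on by default in TensorFlow 2, so ops evaluate immediately.
print(tf.executing_eagerly())                  # True
print(tf.add(tf.constant(1), tf.constant(2)))  # tf.Tensor(3, shape=(), dtype=int32)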

Layers: common sets of useful operations

Most of the time when writing code for machine learning models you want to operate at a higher level of abstraction than individual operations and manipulation of individual variables.

Many machine learning models are expressible as the composition and stacking of relatively simple layers. TensorFlow provides a set of many common layers, as well as easy ways for you to write your own application-specific layers, either from scratch or as the composition of existing layers.

TensorFlow includes the full Keras API in the tf.keras package, and the Keras layers are very useful when building your own models.

# In the tf.keras.layers package, layers are objects. To construct a layer,
# simply construct the object. Most layers take as a first argument the number
# of output dimensions / channels.
layer = tf.keras.layers.Dense(100)
# The number of input dimensions is often unnecessary, as it can be inferred
# the first time the layer is used, but it can be provided if you want to
# specify it manually, which is useful in some complex models.
layer = tf.keras.layers.Dense(10, input_shape=(None, 5))

The full list of pre-existing layers can be seen in the documentation. It includes Dense (a fully-connected layer), Conv2D, LSTM, BatchNormalization, Dropout, and many others.
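
As a quick sketch (the argument values below are arbitrary examples, not from the original notebook), each of these layers is constructed the same way: instantiate the class with its configuration.

# Illustrative constructions of some of the layers listed above;
# the parameter values are arbitrary examples.
conv = tf.keras.layers.Conv2D(filters=32, kernel_size=(3, 3))
lstm = tf.keras.layers.LSTM(units=64)
bn = tf.keras.layers.BatchNormalization()
dropout = tf.keras.layers.Dropout(rate=0.5)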

# To use a layer, simply call it.
layer(tf.zeros([10, 5]))
# Layers have many useful methods. For example, you can inspect all variables
# in a layer using `layer.variables` and trainable variables using
# `layer.trainable_variables`. In this case a fully-connected layer
# will have variables for weights and biases.
layer.variables
# The variables are also accessible through nice accessors
layer.kernel, layer.bias

Implementing custom layers

The best way to implement your own layer is extending the tf.keras.layers.Layer class and implementing:

  1. __init__, where you can do all input-independent initialization

  2. build, where you know the shapes of the input tensors and can do the rest of the initialization

  3. call, where you do the forward computation

Note that you do not have to wait until build is called to create your variables; you can also create them in __init__. However, the advantage of creating them in build is that it enables late variable creation based on the shape of the inputs the layer will operate on. Creating variables in __init__, on the other hand, means that the shapes required to create the variables must be specified explicitly (see the contrasting sketch after the example below).

class MyDenseLayer(tf.keras.layers.Layer):
  def __init__(self, num_outputs):
    super(MyDenseLayer, self).__init__()
    self.num_outputs = num_outputs

  def build(self, input_shape):
    self.kernel = self.add_weight("kernel",
                                  shape=[int(input_shape[-1]),
                                         self.num_outputs])

  def call(self, inputs):
    return tf.matmul(inputs, self.kernel)

layer = MyDenseLayer(10)
_ = layer(tf.zeros([10, 5]))  # Calling the layer builds it.
print([var.name for var in layer.trainable_variables])
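
For contrast, here is a minimal sketch of the same layer creating its kernel in __init__ instead of build. The class name EagerDenseLayer and its num_inputs argument are illustrative assumptions, not part of the original tutorial; the point is that the input dimension must now be fixed at construction time.

# A sketch, assuming the input dimension is known up front.
# `EagerDenseLayer` and `num_inputs` are illustrative names only.
class EagerDenseLayer(tf.keras.layers.Layer):
  def __init__(self, num_inputs, num_outputs):
    super(EagerDenseLayer, self).__init__()
    # The full shape must be specified here, since `build` is not used.
    self.kernel = self.add_weight("kernel",
                                  shape=[num_inputs, num_outputs])

  def call(self, inputs):
    return tf.matmul(inputs, self.kernel)

layer = EagerDenseLayer(5, 10)  # input dimension fixed at construction time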

Overall, code is easier to read and maintain if it uses standard layers whenever possible, since other readers will be familiar with the behavior of standard layers. If you want to use a layer that is not present in tf.keras.layers, consider filing a GitHub issue or, better yet, sending us a pull request!

Models: composing layers

Many interesting layer-like things in machine learning models are implemented by composing existing layers. For example, each residual block in a ResNet is a composition of convolutions, batch normalizations, and a shortcut. Layers can be nested inside other layers.

Typically you inherit from keras.Model when you need the model methods it provides: Model.fit, Model.evaluate, and Model.save (see Custom Keras layers and models for details).
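
As a minimal sketch of what inheriting from keras.Model buys you (the model, data, and hyperparameters below are illustrative assumptions, not from the original notebook), a subclassed model can be compiled, trained, and evaluated directly:

# A sketch: because this subclasses keras.Model, it gets compile/fit/
# evaluate/save for free. All names and values here are illustrative.
class TinyModel(tf.keras.Model):
  def __init__(self):
    super(TinyModel, self).__init__()
    self.dense = tf.keras.layers.Dense(1)

  def call(self, inputs):
    return self.dense(inputs)

model = TinyModel()
model.compile(optimizer='sgd', loss='mse')
model.fit(tf.zeros([4, 3]), tf.zeros([4, 1]), epochs=1, verbose=0)
model.evaluate(tf.zeros([4, 3]), tf.zeros([4, 1]), verbose=0)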

One other feature provided by keras.Model (as opposed to keras.layers.Layer) is that, in addition to tracking variables, a keras.Model also tracks its internal layers, making them easier to inspect.

For example, here is a ResNet block:

class ResnetIdentityBlock(tf.keras.Model):
  def __init__(self, kernel_size, filters):
    super(ResnetIdentityBlock, self).__init__(name='')
    filters1, filters2, filters3 = filters

    self.conv2a = tf.keras.layers.Conv2D(filters1, (1, 1))
    self.bn2a = tf.keras.layers.BatchNormalization()

    self.conv2b = tf.keras.layers.Conv2D(filters2, kernel_size, padding='same')
    self.bn2b = tf.keras.layers.BatchNormalization()

    self.conv2c = tf.keras.layers.Conv2D(filters3, (1, 1))
    self.bn2c = tf.keras.layers.BatchNormalization()

  def call(self, input_tensor, training=False):
    x = self.conv2a(input_tensor)
    x = self.bn2a(x, training=training)
    x = tf.nn.relu(x)

    x = self.conv2b(x)
    x = self.bn2b(x, training=training)
    x = tf.nn.relu(x)

    x = self.conv2c(x)
    x = self.bn2c(x, training=training)

    x += input_tensor
    return tf.nn.relu(x)

block = ResnetIdentityBlock(1, [1, 2, 3])
_ = block(tf.zeros([1, 2, 3, 3]))
block.layers
len(block.variables)
block.summary()

Much of the time, however, models which compose many layers simply call one layer after the other. This can be done in very little code using tf.keras.Sequential:

my_seq = tf.keras.Sequential([tf.keras.layers.Conv2D(1, (1, 1),
                                                     input_shape=(None, None, 3)),
                              tf.keras.layers.BatchNormalization(),
                              tf.keras.layers.Conv2D(2, 1, padding='same'),
                              tf.keras.layers.BatchNormalization(),
                              tf.keras.layers.Conv2D(3, (1, 1)),
                              tf.keras.layers.BatchNormalization()])
my_seq(tf.zeros([1, 2, 3, 3]))
my_seq.summary()

Next steps

Now you can go back to the previous notebook and adapt the linear regression example to use layers and models so that it is better structured.