How can the functional API be used to handle residual connections in Python?
Keras ships as part of the TensorFlow package. It can be accessed using the following lines of code.
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
The Keras functional API helps create models that are more flexible than those built with the Sequential API. The functional API can handle models with non-linear topology, share layers between branches, and work with multiple inputs and outputs. A deep learning model is usually a directed acyclic graph (DAG) of layers, and the functional API helps build this graph of layers.
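As a quick illustration of these ideas, the short sketch below (separate from the toy ResNet that follows; the layer sizes and names are purely illustrative) builds a two-input model that shares a single Dense layer and produces one output, something the Sequential API cannot express.

from tensorflow import keras
from tensorflow.keras import layers

# Two separate inputs (illustrative shapes)
input_a = keras.Input(shape=(16,), name="input_a")
input_b = keras.Input(shape=(16,), name="input_b")

# One Dense layer shared by both branches (shared weights)
shared_dense = layers.Dense(8, activation="relu")
encoded_a = shared_dense(input_a)
encoded_b = shared_dense(input_b)

# Merge the two branches and produce a single output
merged = layers.concatenate([encoded_a, encoded_b])
output = layers.Dense(1, activation="sigmoid")(merged)

tiny_model = keras.Model([input_a, input_b], output, name="tiny_shared_layer_model")
tiny_model.summary()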
The following code is run on Google Colaboratory. Google Colab, or Colaboratory, helps run Python code in the browser, requires zero configuration, and gives free access to GPUs (Graphics Processing Units). Colaboratory is built on top of Jupyter Notebook. The code snippet is given below.
Example
print("Toy ResNet model for CIFAR10") print("Layers generated for model") inputs = keras.Input(shape=(32, 32, 3), name="img") x = layers.Conv2D(32, 3, activation="relu")(inputs) x = layers.Conv2D(64, 3, activation="relu")(x) block_1_output = layers.MaxPooling2D(3)(x) x = layers.Conv2D(64, 3, activation="relu", padding="same")(block_1_output) x = layers.Conv2D(64, 3, activation="relu", padding="same")(x) block_2_output = layers.add([x, block_1_output]) x = layers.Conv2D(64, 3, activation="relu", padding="same")(block_2_output) x = layers.Conv2D(64, 3, activation="relu", padding="same")(x) block_3_output = layers.add([x, block_2_output]) x = layers.Conv2D(64, 3, activation="relu")(block_3_output) x = layers.GlobalAveragePooling2D()(x) x = layers.Dense(256, activation="relu")(x) x = layers.Dropout(0.5)(x) outputs = layers.Dense(10)(x) model = keras.Model(inputs, outputs, name="toy_resnet") print("More information about the model") model.summary()
Code credit: https://www.tensorflow.org/guide/keras/functional
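As an optional extra step (not shown in the output below), the graph of layers can also be rendered to an image with keras.utils.plot_model; this requires the pydot and graphviz packages, and the file name here is illustrative.

# Render the DAG of layers to a PNG file (requires pydot and graphviz)
keras.utils.plot_model(model, "toy_resnet.png", show_shapes=True)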
Output
Toy ResNet model for CIFAR10
Layers generated for model
More information about the model
Model: "toy_resnet"
____________________________________________________________________________________________
Layer (type)                    Output Shape         Param #   Connected to
============================================================================================
img (InputLayer)                [(None, 32, 32, 3)]  0
____________________________________________________________________________________________
conv2d_32 (Conv2D)              (None, 30, 30, 32)   896       img[0][0]
____________________________________________________________________________________________
conv2d_33 (Conv2D)              (None, 28, 28, 64)   18496     conv2d_32[0][0]
____________________________________________________________________________________________
max_pooling2d_8 (MaxPooling2D)  (None, 9, 9, 64)     0         conv2d_33[0][0]
____________________________________________________________________________________________
conv2d_34 (Conv2D)              (None, 9, 9, 64)     36928     max_pooling2d_8[0][0]
____________________________________________________________________________________________
conv2d_35 (Conv2D)              (None, 9, 9, 64)     36928     conv2d_34[0][0]
____________________________________________________________________________________________
add_12 (Add)                    (None, 9, 9, 64)     0         conv2d_35[0][0]
                                                               max_pooling2d_8[0][0]
____________________________________________________________________________________________
conv2d_36 (Conv2D)              (None, 9, 9, 64)     36928     add_12[0][0]
____________________________________________________________________________________________
conv2d_37 (Conv2D)              (None, 9, 9, 64)     36928     conv2d_36[0][0]
____________________________________________________________________________________________
add_13 (Add)                    (None, 9, 9, 64)     0         conv2d_37[0][0]
                                                               add_12[0][0]
____________________________________________________________________________________________
conv2d_38 (Conv2D)              (None, 7, 7, 64)     36928     add_13[0][0]
____________________________________________________________________________________________
global_average_pooling2d_1 (Glo (None, 64)           0         conv2d_38[0][0]
____________________________________________________________________________________________
dense_40 (Dense)                (None, 256)          16640     global_average_pooling2d_1[0][0]
____________________________________________________________________________________________
dropout_2 (Dropout)             (None, 256)          0         dense_40[0][0]
____________________________________________________________________________________________
dense_41 (Dense)                (None, 10)           2570      dropout_2[0][0]
============================================================================================
Total params: 223,242
Trainable params: 223,242
Non-trainable params: 0
____________________________________________________________________________________________
Explanation
The functional API makes it easy to work with models that have multiple inputs and outputs.
It also handles non-linear connection topologies with ease.
The layers in this model are not connected sequentially, so the Sequential API cannot build it.
This is where residual connections come in.
A toy ResNet model for CIFAR10 is built to demonstrate this; a possible way to train it is sketched below.
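To go one step further and actually train the toy ResNet on CIFAR10, the model could be compiled and fit roughly as follows (this part is not in the original snippet; the optimizer, batch size, and number of epochs are illustrative choices).

# Load and normalize the CIFAR10 dataset
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train = x_train.astype("float32") / 255.0
x_test = x_test.astype("float32") / 255.0
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# The final Dense(10) layer has no softmax, so train with from_logits=True
model.compile(
   optimizer=keras.optimizers.RMSprop(1e-3),
   loss=keras.losses.CategoricalCrossentropy(from_logits=True),
   metrics=["acc"],
)
model.fit(x_train, y_train, batch_size=64, epochs=1, validation_split=0.2)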