TensorFlow / Python: Transfer Learning with Fine-Tuning of Pre-trained Models

This machine learning approach takes a neural network model pre-trained on a large dataset and adapts it to a specific task. The process involves:

  1. Transfer Learning: In deep learning, neural networks trained on massive datasets, such as ImageNet for images or the large text corpora used to train models like GPT-3 for natural language processing, have been shown to learn useful, general-purpose representations. Instead of training a model from scratch, transfer learning uses these pre-trained representations as a starting point.
  2. Fine-Tuning: Fine-tuning takes a pre-trained model and adapts it to a specific task. This is done by unfreezing one or more layers of the pre-trained model and allowing their weights to be updated while training on a new, smaller, task-specific dataset.
  3. Fine-Tuning Example:
    • Imagine you have a convolutional neural network (CNN) that was trained on a large image dataset to classify general objects, such as dogs and cats.
    • Now you want to use that model for a specific task, such as classifying different dog breeds. Instead of training a model from scratch, you can fine-tune the pre-trained model.
    • To do so, you would unfreeze one or more layers of the model (usually the deeper layers, closest to the output) and train them on a smaller dataset of images of different dog breeds.
    • The pre-trained model has already learned useful features, such as edges, textures, and shapes, that are relevant to breed classification. Fine-tuning adjusts those features so they become specific to the dog breeds in question.
  4. Benefits of Fine-Tuning:
    • Fine-tuning usually requires less training data and compute time than training from scratch.
    • It lets you build on state-of-the-art pre-trained models, which is especially useful in computer vision, natural language processing, and more. (A minimal sketch of this workflow follows this list.)
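
The sketch below illustrates the typical freeze / train-head / unfreeze workflow described above. It is only illustrative and not part of the examples later in this post: the 'dog_breeds/' directory, the 10 breed classes, and the training hyperparameters are assumptions.

import tensorflow as tf

# 1. Load a pre-trained backbone without its classification head and freeze it.
base_model = tf.keras.applications.MobileNetV2(
    weights='imagenet', include_top=False, input_shape=(224, 224, 3))
base_model.trainable = False

# 2. Add a new classification head for the specific task.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1./127.5, offset=-1, input_shape=(224, 224, 3)),  # MobileNetV2 expects inputs in [-1, 1]
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation='softmax')  # 10 hypothetical dog breeds
])

# 3. Train only the new head on the small, task-specific dataset (assumed directory layout: one sub-folder per breed).
train_ds = tf.keras.utils.image_dataset_from_directory(
    'dog_breeds/', image_size=(224, 224), batch_size=32)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.fit(train_ds, epochs=5)

# 4. Fine-tuning proper: unfreeze the top of the backbone and keep training
#    with a much lower learning rate so the pre-trained weights are only nudged.
base_model.trainable = True
for layer in base_model.layers[:-20]:
    layer.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.fit(train_ds, epochs=5)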

In the example below, we test the pre-trained MobileNetV2 model without any fine-tuning. None of the pre-trained model's layers are adjusted on a new dataset; we simply use the pre-trained model to make predictions on the flowers dataset, which lets us see how it performs on that data without any additional training.

import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras.preprocessing.image import ImageDataGenerator
import matplotlib.pyplot as plt
import numpy as np

# URL of the pre-trained MobileNetV2 model
model_url = "https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification/4"

# Flowers dataset directory
data_root = tf.keras.utils.get_file(
    'flower_photos',
    'https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz',
    untar=True)

# Create an image data generator for the flowers dataset
image_data_generator = ImageDataGenerator(rescale=1/255)
image_data = image_data_generator.flow_from_directory(data_root, target_size=(224, 224), batch_size=1)

# Load the MobileNetV2 model from TensorFlow Hub
model = hub.load(model_url)

# Counter to limit the run to 10 images
count = 0
num_images = 10  # Number of images to display

# Download and read the ImageNet label file once, outside the prediction loop
labels_url = "https://storage.googleapis.com/download.tensorflow.org/data/ImageNetLabels.txt"
labels_path = tf.keras.utils.get_file("ImageNetLabels.txt", labels_url)
with open(labels_path) as file:
    labels = file.read().splitlines()

# Lists to store the images and their predicted labels
images = []
classifications = []

# Process flower images and collect the model's predictions
for batch in image_data:
    image = batch[0]  # Image array of the batch (shape: 1 x 224 x 224 x 3)
    predictions = model(image)  # Run inference

    # Pick the highest-scoring ImageNet class for this image
    top_label_index = tf.argmax(predictions, axis=-1)
    top_label = labels[int(top_label_index[0])]

    images.append(image[0])
    classifications.append(top_label)

    count += 1
    if count >= num_images:
        break  # Stop after processing the desired number of images

# Build a grid of images with their predicted labels
fig, axes = plt.subplots(2, 5, figsize=(10, 4))
axes = axes.ravel()

for i in range(num_images):
    axes[i].imshow(images[i])
    axes[i].set_title(classifications[i])
    axes[i].axis('off')

plt.tight_layout()
plt.show()

Below is an example of how to adapt a pre-trained model in TensorFlow. In this case, we use the MobileNetV2 architecture pre-trained on ImageNet (from tf.keras.applications) for image classification and add a new classification head for a specific flower-classification task:

import tensorflow as tf
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Download the flowers dataset
data_root = tf.keras.utils.get_file(
    'flower_photos',
    'https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz',
    untar=True)

# Create an image data generator for the flowers dataset, holding out 20% for validation
image_data_generator = ImageDataGenerator(rescale=1/255, validation_split=0.2)
batch_size = 32

train_data = image_data_generator.flow_from_directory(
    data_root,
    target_size=(224, 224),
    batch_size=batch_size,
    subset='training'
)

validation_data = image_data_generator.flow_from_directory(
    data_root,
    target_size=(224, 224),
    batch_size=batch_size,
    subset='validation'
)

# Load the pre-trained MobileNetV2 architecture without its top classification layers
base_model = MobileNetV2(weights='imagenet', include_top=False, input_shape=(224, 224, 3))

# Add new classification layers for the flower classes
x = GlobalAveragePooling2D()(base_model.output)
x = Dense(128, activation='relu')(x)
output = Dense(5, activation='softmax')(x)  # 5 flower classes in this dataset

# Build the new model with the added classification head
model = Model(inputs=base_model.input, outputs=output)

# Freeze the layers of the base model
for layer in base_model.layers:
    layer.trainable = False

# Compile the model
model.compile(optimizer=Adam(learning_rate=0.001), loss='categorical_crossentropy', metrics=['accuracy'])

# Train the model
model.fit(train_data, validation_data=validation_data, epochs=10)

This example downloads the flowers dataset, loads the pre-trained MobileNetV2 architecture, and adds classification layers for the five flower classes. Note that because every layer of the base model is frozen, the code above is really the feature-extraction stage of transfer learning: only the new head is trained. Fine-tuning proper would then unfreeze some of the top layers of the base model and continue training with a lower learning rate, as sketched below.
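
A minimal sketch of that fine-tuning step, continuing from the model defined above, could look like the following; the number of unfrozen layers, the learning rate, and the number of extra epochs are assumptions, not values from the original example:

# Unfreeze the top of the base model; keep the earliest layers frozen
base_model.trainable = True
for layer in base_model.layers[:-30]:
    layer.trainable = False

# Recompile with a much lower learning rate so the pre-trained weights are only nudged
model.compile(optimizer=Adam(learning_rate=1e-5),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Continue training for a few more epochs
model.fit(train_data, validation_data=validation_data, epochs=5)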
