Transfer Learning Techniques

By Bill Sharlow

Day 6 of our TensorFlow Deep Learning Framework Setup

Welcome to Day 6 of our 10-Day DIY TensorFlow Deep Learning Framework Setup series! Today, we’re exploring a powerful technique in the deep learning arsenal—Transfer Learning. Transfer learning involves leveraging pre-trained models to boost performance on a specific task, even with limited data.

Understanding Transfer Learning

Transfer learning takes advantage of the knowledge gained by a model on one task and applies it to another related task. This is particularly valuable when you have a limited dataset for your target task, as it allows your model to benefit from the feature representations learned on a larger, pre-existing dataset.
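
As a minimal sketch of the pattern (illustrative only, using tf.keras.applications.MobileNetV2 as the pre-trained base; the hands-on example below uses TensorFlow Hub instead):

import tensorflow as tf

# Load a base model pre-trained on ImageNet, without its original classifier
base = tf.keras.applications.MobileNetV2(input_shape=(224, 224, 3),
                                         include_top=False,
                                         weights='imagenet',
                                         pooling='avg')
base.trainable = False  # freeze the learned feature representations

# Only this new head is trained on the target task
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(10, activation='softmax')
])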

Hands-On Transfer Learning with TensorFlow

In this example, we’ll use a pre-trained model from TensorFlow Hub for image classification. We’ll train a new classification head on top of its frozen features using a smaller dataset, and then optionally fine-tune the pre-trained weights themselves, showcasing the power of transfer learning.

import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras import layers, models
from tensorflow.keras.datasets import cifar10
from tensorflow.keras.utils import to_categorical

# Load and preprocess the CIFAR-10 dataset
(train_images, train_labels), (test_images, test_labels) = cifar10.load_data()
train_images = train_images.astype('float32') / 255
test_images = test_images.astype('float32') / 255
train_labels = to_categorical(train_labels)
test_labels = to_categorical(test_labels)

# Load a pre-trained MobileNetV2 feature extractor from TensorFlow Hub
# (the feature_vector variant outputs image embeddings rather than class logits)
feature_extractor_url = "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4"
feature_extractor_layer = hub.KerasLayer(feature_extractor_url,
                                         input_shape=(224, 224, 3),
                                         trainable=False)  # freeze the pre-trained weights

# Build a new model on top of the pre-trained model
model = models.Sequential([
    feature_extractor_layer,
    layers.Dense(128, activation='relu'),
    layers.Dropout(0.3),
    layers.Dense(10, activation='softmax')
])

# Compile the model
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Hold out 20% of the training data for validation
val_images, val_labels = train_images[40000:], train_labels[40000:]
train_images, train_labels = train_images[:40000], train_labels[:40000]

# Resize images to the 224x224 input expected by the pre-trained model,
# one batch at a time with tf.data; resizing the whole dataset up front
# would require tens of gigabytes of memory
def resize(images, labels):
    return tf.image.resize(images, (224, 224)), labels

train_ds = tf.data.Dataset.from_tensor_slices((train_images, train_labels)).batch(64).map(resize)
val_ds = tf.data.Dataset.from_tensor_slices((val_images, val_labels)).batch(64).map(resize)
test_ds = tf.data.Dataset.from_tensor_slices((test_images, test_labels)).batch(64).map(resize)

# Train the model
model.fit(train_ds, validation_data=val_ds, epochs=5)

# Evaluate the model on the test set
test_loss, test_acc = model.evaluate(test_ds)
print('\nTest Accuracy:', test_acc)

In this script:

  • We load and preprocess the CIFAR-10 dataset, holding out part of the training set for validation.
  • A pre-trained MobileNetV2 feature extractor from TensorFlow Hub serves as the frozen base of the network.
  • We build a new model on top of it, adding a small densely connected classification head.
  • The model is compiled, trained on images resized to 224x224 on the fly via tf.data, and evaluated on the test set.
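
The script above keeps the pre-trained weights frozen, which is pure feature extraction. To fine-tune those weights as well, a common second stage is to unfreeze the extractor and continue training briefly at a much lower learning rate. Here is a sketch, continuing from the script above (depending on your tensorflow_hub version, you may instead need to pass trainable=True when constructing the hub.KerasLayer):

# Unfreeze the pre-trained layers for fine-tuning
feature_extractor_layer.trainable = True

# Recompile with a low learning rate so the pre-trained weights
# are gently adjusted rather than overwritten
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

model.fit(train_ds, validation_data=val_ds, epochs=2)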

Experiment and Adapt

Experiment with different pre-trained models and adapt them to your specific tasks. Transfer learning provides a powerful shortcut to harness the capabilities of state-of-the-art models without starting from scratch.
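
Swapping in a different backbone is often just a matter of changing the Hub URL and matching its expected input size. A sketch (the ResNet handle below is illustrative; check tfhub.dev for the exact model and version you want):

# A different pre-trained backbone; the rest of the script stays the same
feature_extractor_url = "https://tfhub.dev/google/imagenet/resnet_v2_50/feature_vector/4"
feature_extractor_layer = hub.KerasLayer(feature_extractor_url,
                                         input_shape=(224, 224, 3),
                                         trainable=False)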

What’s Next?

Transfer learning has opened new horizons for your deep learning endeavors. In the upcoming days, we’ll explore techniques for optimizing model performance and ensuring your models reach their full potential.

Stay tuned for Day 7: Optimizing Model Performance, where we’ll delve into hyperparameter tuning and regularization strategies. Happy coding!
