AIML Deep Learning and Reinforcement Learning (BAI701) 7th Sem
Experiments

1. Design and implement a neural-based network for generating word embeddings for words in a document corpus.
2. Write a program to demonstrate the working of a deep neural network for a classification task.
3. Design and implement a Convolutional Neural Network (CNN) for classification of an image dataset.
4. Build and demonstrate an autoencoder network using neural layers for data compression on an image dataset.
5. Design and implement a deep learning network for classification of textual documents.
6. Design and implement a deep learning network for forecasting time series data.
7. Write a program to use pre-trained models to classify a given image dataset.
8. Simple Grid World Problem: Design a custom 2D grid world where the agent navigates from a start position to a goal, avoiding obstacles. Environment: custom grid (easily implemented in Python).
Course outcomes (Course Skill Set):
At the end of the course, the student will be able to:
• CO1: Demonstrate the implementation of deep learning techniques
• CO2: Examine various deep learning techniques for solving real-world problems
• CO3: Design and implement research-oriented scenarios using deep learning techniques in a team
1. Design and implement a neural-based network for generating word embeddings for words in a document corpus.
import numpy as np
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Flatten, Dense

# Sample corpus
corpus = [
    'Deep learning networks are powerful',
    'Networks learn patterns from data',
    'Word embeddings capture semantic information'
]

# Tokenize corpus
tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)
sequences = tokenizer.texts_to_sequences(corpus)
word_index = tokenizer.word_index

# Pad sequences
data = pad_sequences(sequences, padding='post')

# Create embedding model
vocab_size = len(word_index) + 1
embed_dim = 8
model = Sequential()
model.add(Embedding(input_dim=vocab_size, output_dim=embed_dim, input_length=data.shape[1]))
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))  # dummy target for example

model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()

# For demonstration, mock targets
labels = np.random.randint(0, 2, size=(data.shape[0], 1))
model.fit(data, labels, epochs=10)
Output :
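The experiment asks for the embeddings themselves, and those live in the Embedding layer's weight matrix. A minimal sketch of pulling them out, using a hypothetical three-word vocabulary in place of the tokenizer's word_index:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding

# Hypothetical miniature vocabulary; in the program above this comes
# from tokenizer.word_index (index 0 is reserved for padding)
word_index = {'deep': 1, 'learning': 2, 'networks': 3}
vocab_size = len(word_index) + 1
embed_dim = 8

model = Sequential([Embedding(input_dim=vocab_size, output_dim=embed_dim)])
_ = model(np.zeros((1, 3), dtype='int32'))  # call once so weights are built

# One row of length embed_dim per vocabulary index
embeddings = model.layers[0].get_weights()[0]
print(embeddings.shape)  # (4, 8)

# The (learned, after training) vector for a specific word
vec = embeddings[word_index['deep']]
print(vec.shape)  # (8,)
```

After the fit() call above, each row of this matrix is the trained embedding for the corresponding word.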
2. Write a program to demonstrate the working of a deep neural network for a classification task.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Load data
X, y = load_iris(return_X_y=True)
y = to_categorical(y, 3)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

model = Sequential([
    Dense(32, activation='relu', input_shape=(4,)),
    Dense(64, activation='relu'),
    Dense(3, activation='softmax')
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=5, validation_data=(X_test, y_test))
Output :
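Once trained, the network is used for inference with predict(), which returns one probability per class. A small self-contained sketch on the same Iris data (fewer layers and a fixed split here, both chosen for brevity):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

X, y = load_iris(return_X_y=True)
y_cat = to_categorical(y, 3)
X_train, X_test, y_train, y_test = train_test_split(X, y_cat, test_size=0.2, random_state=0)

model = Sequential([
    Input(shape=(4,)),
    Dense(32, activation='relu'),
    Dense(3, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=20, verbose=0)

# predict() returns class probabilities; argmax picks the label
probs = model.predict(X_test[:1], verbose=0)
pred = int(np.argmax(probs, axis=1)[0])
print(pred)  # a class index in {0, 1, 2}
```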
3. Design and implement a Convolutional Neural Network (CNN) for classification of an image dataset.
import numpy as np
from tensorflow.keras.datasets import mnist
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# Load and preprocess data
(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = X_train[..., np.newaxis] / 255.0
X_test = X_test[..., np.newaxis] / 255.0
y_train = to_categorical(y_train, 10)
y_test = to_categorical(y_test, 10)

model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(64, activation='relu'),
    Dense(10, activation='softmax')
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=5, validation_data=(X_test, y_test))
Output:
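The Flatten layer in this network receives 13x13x32 values: a 3x3 valid convolution shrinks each spatial side of the 28x28 input by 2, and 2x2 pooling halves it (integer division). The arithmetic can be checked directly:

```python
def conv_out(size, kernel, stride=1, padding=0):
    # Output side length of a convolution ('valid' padding by default)
    return (size + 2 * padding - kernel) // stride + 1

def pool_out(size, pool):
    # Output side length of non-overlapping pooling
    return size // pool

side = conv_out(28, 3)    # 28 -> 26 after the 3x3 conv
side = pool_out(side, 2)  # 26 -> 13 after 2x2 max-pooling
flat = side * side * 32   # 32 feature maps, flattened
print(side, flat)  # 13 5408
```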
4. Build and demonstrate an autoencoder network using neural layers for data compression on an image dataset.
import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Dense, Flatten, Reshape

# Load MNIST data and normalize
(X_train, _), (X_test, _) = mnist.load_data()
X_train = X_train.astype('float32') / 255.0
X_test = X_test.astype('float32') / 255.0

# Autoencoder architecture
encoding_dim = 32  # Bottleneck size
input_img = Input(shape=(28, 28))
x = Flatten()(input_img)
encoded = Dense(encoding_dim, activation='relu')(x)
decoded = Dense(784, activation='sigmoid')(encoded)
decoded = Reshape((28, 28))(decoded)

# Build the autoencoder model
autoencoder = Model(inputs=input_img, outputs=decoded)
autoencoder.compile(optimizer='adam', loss='mse')

# Train the autoencoder
autoencoder.fit(X_train, X_train,
                epochs=10,
                batch_size=256,
                shuffle=True,
                validation_data=(X_test, X_test))

# Optional: visualize a few reconstructions
decoded_imgs = autoencoder.predict(X_test[:10])

# Plot original and reconstructed images
n = 10
plt.figure(figsize=(20, 4))
for i in range(n):
    # Original
    ax = plt.subplot(2, n, i + 1)
    plt.imshow(X_test[i], cmap='gray')
    plt.title("Original")
    plt.axis("off")
    # Reconstructed
    ax = plt.subplot(2, n, i + 1 + n)
    plt.imshow(decoded_imgs[i], cmap='gray')
    plt.title("Reconstructed")
    plt.axis("off")
plt.tight_layout()
plt.show()
Output :
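To actually use the autoencoder for compression, the bottleneck activations are read out through a second Model that shares the same layers. A sketch with the same architecture, run on random stand-in images so it needs no training or downloads:

```python
import numpy as np
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Dense, Flatten, Reshape

encoding_dim = 32
input_img = Input(shape=(28, 28))
x = Flatten()(input_img)
encoded = Dense(encoding_dim, activation='relu')(x)
decoded = Reshape((28, 28))(Dense(784, activation='sigmoid')(encoded))

autoencoder = Model(inputs=input_img, outputs=decoded)

# The encoder reuses the same layer objects, so it shares any trained weights
encoder = Model(inputs=input_img, outputs=encoded)

codes = encoder.predict(np.random.rand(5, 28, 28), verbose=0)
print(codes.shape)  # (5, 32): each 784-pixel image reduced to 32 numbers
```

At 784 inputs to 32 code values, the bottleneck gives roughly a 24.5x reduction.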
5. Design and implement a deep learning network for classification of textual documents.
import numpy as np
from tensorflow.keras.datasets import imdb
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Flatten, Dense
from tensorflow.keras.preprocessing.sequence import pad_sequences

# 1. Load IMDB dataset (binary sentiment classification)
vocab_size = 10000  # top 10,000 words
maxlen = 200        # maximum review length
(X_train, y_train), (X_test, y_test) = imdb.load_data(num_words=vocab_size)

# 2. Pad sequences to ensure equal input length
X_train = pad_sequences(X_train, maxlen=maxlen)
X_test = pad_sequences(X_test, maxlen=maxlen)

# 3. Define the deep learning model
model = Sequential()
model.add(Embedding(input_dim=vocab_size, output_dim=32, input_length=maxlen))
model.add(Flatten())
model.add(Dense(64, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# 4. Compile the model
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# 5. Display model summary
model.summary()

# 6. Train the model
model.fit(X_train, y_train, epochs=5, batch_size=128, validation_split=0.2)

# 7. Evaluate on test data
loss, accuracy = model.evaluate(X_test, y_test)
print(f"\nTest Accuracy: {accuracy:.4f}")
Output :
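imdb.load_data() returns each review as a list of word indices, and Keras reserves indices 0-2 for padding, start-of-sequence, and unknown tokens, so decoding back to text shifts the dictionary indices by 3. Illustrated here with a hypothetical miniature word index standing in for imdb.get_word_index():

```python
# Hypothetical miniature word index (the real one comes from imdb.get_word_index())
word_index = {'the': 1, 'movie': 2, 'was': 3, 'great': 4}

# Dataset integers = dictionary index + 3, because 0-2 are reserved
reverse_index = {i + 3: w for w, i in word_index.items()}
reverse_index.update({0: '<pad>', 1: '<start>', 2: '<unk>'})

encoded_review = [1, 4, 5, 6, 7]
decoded = ' '.join(reverse_index.get(i, '<unk>') for i in encoded_review)
print(decoded)  # <start> the movie was great
```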
6. Design and implement a deep learning network for forecasting time series data.
import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from sklearn.preprocessing import MinMaxScaler

# 1. Generate synthetic time series data (sine wave)
time = np.arange(0, 100, 0.1)
data = np.sin(time) + 0.1 * np.random.randn(len(time))  # noisy sine wave

# 2. Normalize the data (flatten so each window below is a 1-D vector)
scaler = MinMaxScaler()
data_scaled = scaler.fit_transform(data.reshape(-1, 1)).flatten()

# 3. Prepare dataset (windowed sequences)
def create_dataset(series, window_size=10):
    X, y = [], []
    for i in range(len(series) - window_size):
        X.append(series[i:i+window_size])
        y.append(series[i+window_size])
    return np.array(X), np.array(y)

window_size = 10
X, y = create_dataset(data_scaled, window_size)

# 4. Split into training and testing
split = int(len(X) * 0.8)
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

# 5. Build a deep learning model (Dense-based)
model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(window_size,)))
model.add(Dense(32, activation='relu'))
model.add(Dense(1))

# 6. Compile and train the model
model.compile(optimizer='adam', loss='mse')
model.fit(X_train, y_train, epochs=20, verbose=1)

# 7. Evaluate and predict
loss = model.evaluate(X_test, y_test)
print(f"\nTest MSE: {loss:.4f}")

# 8. Forecast on test data
predicted = model.predict(X_test)
predicted = scaler.inverse_transform(predicted)
y_test_actual = scaler.inverse_transform(y_test.reshape(-1, 1))

# 9. Plot results
plt.figure(figsize=(10, 5))
plt.plot(y_test_actual, label='Actual')
plt.plot(predicted, label='Predicted')
plt.title("Time Series Forecasting (Test Data)")
plt.xlabel("Time Step")
plt.ylabel("Value")
plt.legend()
plt.grid(True)
plt.show()
Output :
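The network above predicts only one step ahead. Longer horizons are usually produced recursively: predict one step, append the prediction to the window, repeat. The feedback loop is shown here with a least-squares linear one-step predictor standing in for the trained network (an assumption made for brevity; the loop is identical with model.predict):

```python
import numpy as np

# Clean sine wave, windowed exactly like create_dataset above
series = np.sin(np.arange(0, 20, 0.1))
window = 10
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

# Linear one-step predictor (stand-in for the trained network)
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Recursive multi-step forecast: feed each prediction back into the window
history = list(series[-window:])
forecast = []
for _ in range(20):
    nxt = float(np.dot(w, history[-window:]))
    forecast.append(nxt)
    history.append(nxt)

print(len(forecast))  # 20 future steps
```

Note that errors compound over the horizon, which is why recursive forecasts degrade for distant steps.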
7. Write a program to use pre-trained models to classify a given image dataset.
import numpy as np
from tensorflow.keras.datasets import cifar10
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.image import resize

# 1. Load CIFAR-10 dataset
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# 2. Resize images to 96x96 (MobileNetV2 expects 96x96 or larger)
x_train_resized = resize(x_train, [96, 96]).numpy()
x_test_resized = resize(x_test, [96, 96]).numpy()

# 3. Preprocess input and one-hot encode labels
x_train_resized = preprocess_input(x_train_resized)
x_test_resized = preprocess_input(x_test_resized)
y_train_cat = to_categorical(y_train, 10)
y_test_cat = to_categorical(y_test, 10)

# 4. Load pre-trained model without top layer
base_model = MobileNetV2(weights='imagenet', include_top=False, input_shape=(96, 96, 3))
base_model.trainable = False  # freeze base model

# 5. Add custom classification layers
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(128, activation='relu')(x)
output = Dense(10, activation='softmax')(x)
model = Model(inputs=base_model.input, outputs=output)

# 6. Compile the model
model.compile(optimizer=Adam(), loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()

# 7. Train with augmentation
datagen = ImageDataGenerator(horizontal_flip=True, zoom_range=0.2)
model.fit(datagen.flow(x_train_resized, y_train_cat, batch_size=64),
          epochs=5,
          validation_data=(x_test_resized, y_test_cat))

# 8. Evaluate on test data
loss, acc = model.evaluate(x_test_resized, y_test_cat)
print(f"\nTest Accuracy: {acc:.4f}")
Output :
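Transfer learning hinges on the trainable flag: freezing the base excludes its weights from gradient updates, so only the new head learns. The mechanics are demonstrated here on a small stand-in model, so no pretrained weights need downloading (layer sizes are arbitrary illustrative choices):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

# Small stand-in for a pretrained base (MobileNetV2 in the program above)
base = Sequential([Input(shape=(8,)), Dense(16), Dense(16)])
base.trainable = False  # freeze: excluded from gradient updates

# New classification head stacked on the frozen base
model = Sequential([base, Dense(4), Dense(2)])
_ = model(np.zeros((1, 8), dtype='float32'))  # call once so all weights are built

trainable = sum(int(np.prod(w.shape)) for w in model.trainable_weights)
frozen = sum(int(np.prod(w.shape)) for w in model.non_trainable_weights)
print(trainable, frozen)  # 78 416: only the head's parameters will update
```

Fine-tuning typically follows: unfreeze some top base layers and retrain with a small learning rate.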
8. Simple Grid World Problem: Design a custom 2D grid world where the agent
navigates from a start position to a goal, avoiding obstacles. Environment: Custom
grid (easily implemented in Python)
import numpy as np

grid_size = 5
grid = np.zeros((grid_size, grid_size))
start = (0, 0)
goal = (4, 4)
obstacles = [(2, 2), (1, 3), (3, 1)]
for pos in obstacles:
    grid[pos] = -1  # Mark obstacles

def is_valid(pos):
    x, y = pos
    return 0 <= x < grid_size and 0 <= y < grid_size and grid[pos] == 0

current = start
path = [current]
while current != goal:
    # For demo: move right if possible, else down
    next_move = (current[0], current[1] + 1) if is_valid((current[0], current[1] + 1)) else (current[0] + 1, current[1])
    if not is_valid(next_move):
        break  # Agent is stuck
    path.append(next_move)
    current = next_move
print("Agent path:", path)
Output :
Agent path: [(0, 0), (0, 1), (0, 2), (0, 3), (0, 4), (1, 4), (2, 4), (3, 4), (4, 4)]
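The hard-coded right-then-down policy above only succeeds because the obstacles happen to lie off that route; it demonstrates the environment, not learning. The reinforcement-learning approach is tabular Q-learning, sketched below on the same 5x5 grid (reward values and hyperparameters are illustrative choices):

```python
import numpy as np

grid_size, goal = 5, (4, 4)
obstacles = {(2, 2), (1, 3), (3, 1)}
actions = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

def step(state, a):
    nxt = (state[0] + actions[a][0], state[1] + actions[a][1])
    if not (0 <= nxt[0] < grid_size and 0 <= nxt[1] < grid_size) or nxt in obstacles:
        return state, -1.0          # wall or obstacle: stay put, penalty
    return nxt, (10.0 if nxt == goal else -0.1)

Q = np.zeros((grid_size, grid_size, len(actions)))
rng = np.random.default_rng(0)
alpha, gamma, eps = 0.5, 0.9, 0.2    # learning rate, discount, exploration

for _ in range(500):                 # training episodes
    s = (0, 0)
    for _ in range(100):             # step limit per episode
        a = int(rng.integers(4)) if rng.random() < eps else int(np.argmax(Q[s]))
        nxt, r = step(s, a)
        # Q-learning update: move Q[s][a] toward reward + discounted best next value
        Q[s][a] += alpha * (r + gamma * np.max(Q[nxt]) - Q[s][a])
        s = nxt
        if s == goal:
            break

# Greedy rollout of the learned policy
s, path = (0, 0), [(0, 0)]
while s != goal and len(path) < 30:
    s, _ = step(s, int(np.argmax(Q[s])))
    path.append(s)
print("Learned path:", path)
```

Unlike the scripted demo, the learned policy routes around whatever obstacle layout the environment defines.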