# mlp-iris
A multi-layer perceptron (MLP) trained on the Iris dataset. It has a single hidden layer with 5 units and ReLU activation.

It takes four features as input: 'SepalLengthCm', 'SepalWidthCm', 'PetalLengthCm' and 'PetalWidthCm'. It predicts whether the species is 'Iris-setosa', 'Iris-versicolor' or 'Iris-virginica'.
It is a PyTorch adaptation of the scikit-learn model in Chapter 10 of Aurélien Géron's book 'Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow'. The original scikit-learn model can be found here: https://github.com/ageron/handson-ml3/blob/main/10_neural_nets_with_keras.ipynb
Code: https://github.com/sambitmukherjee/handson-ml3-pytorch/blob/main/chapter10/mlp_iris.ipynb
Experiment tracking: https://wandb.ai/sadhaklal/mlp-iris
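
For reference, a scikit-learn counterpart in the spirit of the notebook linked above might look like the following sketch; the hyperparameters here (`max_iter`, `random_state`) are illustrative assumptions, not necessarily the book's exact values.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, stratify=y, random_state=42)

# One hidden layer with 5 units, mirroring the PyTorch model in 'Usage' below.
clf = make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(5,), max_iter=1000, random_state=42))
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.4f}")
```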
## Usage
```python
!pip install -q datasets

from datasets import load_dataset

iris = load_dataset("scikit-learn/iris")
iris.set_format("pandas")
iris_df = iris['train'][:]

# Encode the string labels as integers.
label2id = {'Iris-setosa': 0, 'Iris-versicolor': 1, 'Iris-virginica': 2}
iris_df['Species'] = [label2id[species] for species in iris_df['Species']]

X = iris_df[['SepalLengthCm', 'SepalWidthCm', 'PetalLengthCm', 'PetalWidthCm']].values
y = iris_df['Species'].values

from sklearn.model_selection import train_test_split

X_train_full, X_test, y_train_full, y_test = train_test_split(X, y, test_size=0.1, stratify=y, random_state=42)
X_train, X_valid, y_train, y_valid = train_test_split(X_train_full, y_train_full, test_size=0.1, stratify=y_train_full, random_state=42)

# Standardization statistics, computed on the training set only.
X_means, X_stds = X_train.mean(axis=0), X_train.std(axis=0)

import torch
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin

device = torch.device("cpu")

class MLP(nn.Module, PyTorchModelHubMixin):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 5)  # 4 input features -> 5 hidden units.
        self.fc2 = nn.Linear(5, 3)  # 5 hidden units -> 3 classes.

    def forward(self, x):
        act = torch.relu(self.fc1(x))
        return self.fc2(act)  # Returns raw logits.

model = MLP.from_pretrained("sadhaklal/mlp-iris")
model.to(device)

X_new = X_test[:2]  # Contains data on 2 new flowers from the test set.
X_new = (X_new - X_means) / X_stds  # Standardize using the training-set statistics.
X_new = torch.tensor(X_new, dtype=torch.float32)

model.eval()
X_new = X_new.to(device)
with torch.no_grad():
    logits = model(X_new)

probas = torch.softmax(logits, dim=-1)
confidences, preds = probas.max(dim=-1)
print(f"Predicted classes: {preds}")
print(f"Predicted confidences: {confidences}")
```
## Metric
Accuracy on the test set: 0.9333
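
A sketch of how this figure can be reproduced, assuming the variables from the 'Usage' section (`model`, `X_test`, `y_test`, `X_means`, `X_stds`, `device`) are in scope; the evaluation code in the training notebook may differ in detail:

```python
import torch

# Standardize the full test set with the training-set statistics.
X_test_t = torch.tensor((X_test - X_means) / X_stds, dtype=torch.float32).to(device)
y_test_t = torch.tensor(y_test).to(device)

model.eval()
with torch.no_grad():
    test_logits = model(X_test_t)

test_preds = test_logits.argmax(dim=-1)
accuracy = (test_preds == y_test_t).float().mean().item()
print(f"Accuracy on the test set: {accuracy:.4f}")
```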
This model has been pushed to the Hub using the `PyTorchModelHubMixin` integration.