TensorBloom includes 16 pre-built architecture templates. Each creates a complete graph with a Data node, model layers, and loss function — ready to train.
## Getting Started
| Template | Dataset | Description |
|---|---|---|
| Simple MLP | MNIST | 3-layer feedforward classifier |
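To make the template concrete, here is an illustrative PyTorch sketch of what a 3-layer feedforward MNIST classifier looks like. The layer widths are examples, not necessarily TensorBloom's exact defaults.

```python
import torch
import torch.nn as nn

# A 3-layer feedforward classifier for 28x28 MNIST images.
# Hidden sizes (256, 128) are illustrative placeholders.
model = nn.Sequential(
    nn.Flatten(),        # 1x28x28 image -> 784-dim vector
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 128),
    nn.ReLU(),
    nn.Linear(128, 10),  # logits for the 10 digit classes
)

x = torch.randn(32, 1, 28, 28)  # a batch of 32 fake images
logits = model(x)
print(logits.shape)             # torch.Size([32, 10])
```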
## Vision
| Template | Dataset | Description |
|---|---|---|
| LeNet-5 | MNIST | Classic CNN (LeCun 1998) |
| ResNet-18 | CIFAR-10 | Skip connections (He 2015) |
| ViT | CIFAR-10 | Patch embed + Transformer |
| MobileNetV2 | CIFAR-10 | Inverted residual blocks |
| U-Net | ImageFolder | Segmentation with skip concat |
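The defining idea behind the ResNet-18 template is the skip connection: the block's input is added back to its conv output. A minimal sketch of one such basic block (channel counts are examples, not the template's exact settings):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A ResNet-style basic block (He 2015): two 3x3 convs, with the block
# input added back to the output before the final ReLU.
class BasicBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)  # skip connection: add input to output

block = BasicBlock(64)
x = torch.randn(8, 64, 32, 32)  # CIFAR-10-sized feature map
y = block(x)
print(y.shape)                  # same shape as the input
```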
## Language
| Template | Dataset | Description |
|---|---|---|
| Transformer | WikiText-2 | Encoder-decoder (Vaswani 2017) |
| nanoGPT | TinyShakespeare | 6-layer char-level GPT |
| BERT Base | AG News | 12-layer text classifier |
| LSTM Classifier | IMDB | Bidirectional LSTM sentiment |
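As one concrete example from this category, a bidirectional LSTM sentiment classifier can be sketched as follows: embed tokens, run a BiLSTM, and classify from the concatenated final hidden states. Vocabulary and layer sizes here are placeholders, not the template's defaults.

```python
import torch
import torch.nn as nn

# Sketch of a bidirectional LSTM text classifier (e.g. IMDB sentiment).
class BiLSTMClassifier(nn.Module):
    def __init__(self, vocab=20000, embed=128, hidden=256, classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, embed)
        self.lstm = nn.LSTM(embed, hidden, batch_first=True,
                            bidirectional=True)
        self.fc = nn.Linear(2 * hidden, classes)  # fwd + bwd states

    def forward(self, tokens):
        x = self.embed(tokens)              # (B, T, embed)
        _, (h, _) = self.lstm(x)            # h: (2, B, hidden)
        h = torch.cat([h[0], h[1]], dim=1)  # (B, 2*hidden)
        return self.fc(h)                   # logits (B, classes)

model = BiLSTMClassifier()
tokens = torch.randint(0, 20000, (4, 50))  # 4 sequences of length 50
logits = model(tokens)
print(logits.shape)                        # torch.Size([4, 2])
```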
## Audio
| Template | Dataset | Description |
|---|---|---|
| Whisper Encoder | SpeechCommands | Conv1d + Transformer |
| WaveNet Block | SpeechCommands | Dilated conv residuals |
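The "dilated conv residuals" idea behind the WaveNet block can be sketched like this: a dilated 1-D convolution with a gated activation, plus a residual connection back to the input. Channel counts and the dilation factor are illustrative, not the template's exact settings.

```python
import torch
import torch.nn as nn

# WaveNet-style residual block: gated dilated Conv1d + residual add.
class DilatedResidualBlock(nn.Module):
    def __init__(self, channels=32, dilation=2):
        super().__init__()
        pad = dilation  # keeps sequence length for kernel_size=3
        self.filt = nn.Conv1d(channels, channels, 3,
                              dilation=dilation, padding=pad)
        self.gate = nn.Conv1d(channels, channels, 3,
                              dilation=dilation, padding=pad)
        self.proj = nn.Conv1d(channels, channels, 1)

    def forward(self, x):
        # Gated activation: tanh "filter" modulated by sigmoid "gate"
        z = torch.tanh(self.filt(x)) * torch.sigmoid(self.gate(x))
        return x + self.proj(z)  # residual connection

block = DilatedResidualBlock()
x = torch.randn(2, 32, 1000)  # (batch, channels, samples)
y = block(x)
print(y.shape)                # torch.Size([2, 32, 1000])
```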
## Generative
| Template | Dataset | Description |
|---|---|---|
| Autoencoder | MNIST | FC encoder-decoder |
| Conv Autoencoder | MNIST | Conv + ConvTranspose |
| Embedding Classifier | MNIST | Conv + embedding head |
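The FC autoencoder idea reduces to two halves: an encoder that compresses each image to a small latent vector, and a decoder that reconstructs it, trained with a reconstruction loss. A minimal sketch (dimensions are illustrative):

```python
import torch
import torch.nn as nn

# FC autoencoder for MNIST: 784 -> 64-dim bottleneck -> 784.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(784, 64), nn.ReLU())
decoder = nn.Sequential(nn.Linear(64, 784), nn.Sigmoid())

x = torch.rand(16, 1, 28, 28)              # batch of 16 fake images
latent = encoder(x)                        # (16, 64) bottleneck codes
recon = decoder(latent).view(16, 1, 28, 28)
loss = nn.functional.mse_loss(recon, x)    # reconstruction loss
print(latent.shape, recon.shape)
```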
## Other
| Template | Dataset | Description |
|---|---|---|
| DenseNet Block | CIFAR-10 | Dense concat connectivity |
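"Dense concat connectivity" means each layer's output is concatenated onto its input along the channel axis (rather than added, as in ResNet), so channels grow by a fixed amount per layer. A sketch with example sizes:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# DenseNet-style layer: concatenate the new feature maps onto the input,
# growing the channel count by `growth` each layer.
class DenseLayer(nn.Module):
    def __init__(self, in_channels, growth=12):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, growth, 3, padding=1)

    def forward(self, x):
        return torch.cat([x, F.relu(self.conv(x))], dim=1)  # concat, not add

x = torch.randn(4, 16, 32, 32)
layer1 = DenseLayer(16)
layer2 = DenseLayer(28)   # input channels: previous 16 + 12 growth
y = layer2(layer1(x))
print(y.shape)            # channels: 16 + 12 + 12 = 40
```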
## Using Templates
1. Go to Insert > From Template
2. Browse by category or search by name
3. Click a template to load it into the graph
4. Click Start Training; everything is pre-configured
## Modifying Templates
Templates are starting points. After loading:
- Change layers: click any node to edit its parameters, or delete and add layers
- Swap datasets: click the Data node and select a different source
- Adjust training: switch to the Training tab to change the optimizer, learning rate, or epochs
- Auto-layout: press Ctrl+L after making structural changes