Code

Syntax-highlighted code blocks via Prism, with light and dark theme support.


Inline Code

Use single backticks: `const x = 42`.


Code Blocks

Specify the language after the opening triple backticks. Wrap in <Code> for a numbered, captioned, cross-referenceable block:


```python
import torch.nn as nn

class TransformerBlock(nn.Module):
    def __init__(self, d_model, n_heads, d_ff):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads)
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        x = self.norm1(x + self.attn(x, x, x)[0])
        x = self.norm2(x + self.ff(x))
        return x

def train_step(model, batch, optimizer):
    optimizer.zero_grad()
    loss = model(batch)  # assumes the model's forward returns a scalar loss
    loss.backward()
    optimizer.step()
    return loss.item()
```
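The `train_step` helper assumes `model(batch)` returns a scalar loss. A minimal sanity check under that assumption, using a hypothetical `ToyModel` (illustrative only, not part of this template) whose forward computes a toy reconstruction loss:

```python
import torch
import torch.nn as nn

def train_step(model, batch, optimizer):
    optimizer.zero_grad()
    loss = model(batch)  # forward must return a scalar loss
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy model matching train_step's contract: forward(batch) -> scalar loss.
class ToyModel(nn.Module):
    def __init__(self, d=8):
        super().__init__()
        self.linear = nn.Linear(d, d)

    def forward(self, batch):
        # Toy objective: reconstruct the input through a linear layer.
        return nn.functional.mse_loss(self.linear(batch), batch)

torch.manual_seed(0)
model = ToyModel()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
batch = torch.randn(4, 8)
losses = [train_step(model, batch, optimizer) for _ in range(20)]
```

After a few steps the loss on the fixed batch should shrink, confirming the gradient flow through `zero_grad`/`backward`/`step`.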

Reference them inline; each cross-reference renders as a numbered link to the corresponding block.


Supported Languages

| Language | Tag | Example | Description |
| --- | --- | --- | --- |
| Python | `python` | `def foo(): pass` | ML, data science, scripting |
| JavaScript | `js` | `const x = 42;` | Frontend, Node.js, React |
| TypeScript | `ts` | `let x: number = 42;` | Typed JavaScript |
| Bash | `bash` | `echo "hello"` | Shell scripts, CLI commands |
| C / C++ | `c` / `cpp` | `int main() {}` | Systems, CUDA kernels |
| Java | `java` | `System.out.println()` | Enterprise, Android |
| Rust | `rust` | `fn main() {}` | Systems, performance-critical |
| YAML / JSON | `yaml` / `json` | `key: value` | Configuration files |
| LaTeX | `latex` | `\frac{a}{b}` | Math typesetting |
| SQL | `sql` | `SELECT * FROM t` | Database queries |

Full list at Prism supported languages.


Algorithms

The <Algorithm> component renders a bordered pseudocode block with a numbered caption:

Input: learning rate $\eta$, initial parameters $\theta_0$, dataset $\mathcal{D}$

  1. for $t = 1, 2, \ldots, T$ do
  2.     Sample mini-batch $\mathcal{B} \subset \mathcal{D}$
  3.     $g_t \leftarrow \frac{1}{|\mathcal{B}|} \sum_{(x,y) \in \mathcal{B}} \nabla_\theta \ell(f_\theta(x), y)$
  4.     $\theta_t \leftarrow \theta_{t-1} - \eta \cdot g_t$
  5. end for

Output: trained parameters $\theta_T$
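The pseudocode above is plain mini-batch SGD; a minimal Python sketch of it, fitting a scalar mean with $f_\theta(x) = \theta$ and squared loss $\ell(f, y) = (f - y)^2$ (the function name `sgd` and all hyperparameter defaults are illustrative):

```python
import random

def sgd(data, eta=0.1, theta0=0.0, T=200, batch_size=4, seed=0):
    rng = random.Random(seed)
    theta = theta0
    for t in range(T):                        # for t = 1, ..., T do
        batch = rng.sample(data, batch_size)  # sample mini-batch B ⊂ D
        # g_t = (1/|B|) * sum of per-example gradients of (theta - y)^2,
        # i.e. the average of 2 * (theta - y) over the batch
        g = sum(2 * (theta - y) for y in batch) / batch_size
        theta = theta - eta * g               # theta_t = theta_{t-1} - eta * g_t
    return theta                              # trained parameters theta_T

data = [float(y) for y in range(10)]  # targets with mean 4.5
theta_T = sgd(data)                   # theta_T hovers near the mean
```

With this loss, each step contracts $\theta$ toward the current batch mean, so $\theta_T$ ends up near the dataset mean up to mini-batch noise.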

Algorithms can be cross-referenced inline the same way as code blocks, via a numbered link.