Architecture: LLMs are typically based on the Transformer architecture, introduced in the 2017 paper "Attention Is All You Need" by Vaswani et al. Without getting too into the weeds ...
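To make the core idea concrete, here is a minimal NumPy sketch of the Transformer's central operation, scaled dot-product attention: each token's query is compared against every key, the scores are softmax-normalized, and the result is a weighted sum of the values. This is an illustrative toy (random matrices, single head, no masking or learned projections), not a full implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the heart of the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # how relevant each key is to each query
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V                             # weighted average of value vectors

# Toy example: 3 tokens with 4-dimensional representations
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one output vector per token: (3, 4)
```

Each output row is a mixture of the value vectors, weighted by how strongly that token "attends" to every other token; stacking many such heads and layers gives the full architecture.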