Transformer-based large language models (LLMs) are rapidly expanding in both their applications and size. OpenAI's GPT, for example, has ballooned from 117 million to 175 billion parameters since its initial release in 2018.