LLMCO2: Advancing Accurate Carbon Footprint Prediction for LLM Inferences
Over its lifecycle, an LLM incurs significantly higher carbon emissions during inference than during training. Inference requests vary in batch size, prompt length, and the number of generated tokens, while cloud providers deploy heterogeneous GPU configurations to meet diverse service-level objectives. Unlike tra...
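As context for the abstract's claim, a naive per-request carbon estimate multiplies GPU energy by grid carbon intensity. This is an illustrative sketch, not the paper's LLMCO2 model; the power, GPU count, and intensity values are hypothetical assumptions, and such a flat model ignores the batch-size, prompt-length, and hardware effects the abstract highlights.

```python
# Illustrative sketch (not the LLMCO2 method): naive per-request carbon
# estimate from GPU power draw and grid carbon intensity.
# All default parameter values are hypothetical assumptions.

def inference_carbon_g(latency_s: float,
                       gpu_power_w: float = 400.0,      # assumed board power
                       num_gpus: int = 1,
                       grid_g_per_kwh: float = 400.0) -> float:
    """Grams of CO2e for one inference request.

    energy (kWh) = power (W) * num_gpus * time (s) / 3.6e6
    carbon (g)   = energy (kWh) * grid intensity (gCO2e/kWh)
    """
    energy_kwh = gpu_power_w * num_gpus * latency_s / 3.6e6
    return energy_kwh * grid_g_per_kwh

# A 2 s request on one 400 W GPU at 400 gCO2e/kWh:
# 400 W * 2 s / 3.6e6 ≈ 2.22e-4 kWh → ≈ 0.089 g CO2e
print(round(inference_carbon_g(2.0), 3))
```

The gap between this flat estimate and measured emissions under varying request shapes is precisely what motivates a learned predictor such as the one the article proposes.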
Published in: Energy Informatics Review, 2025-07, Vol. 5(2), p. 63-68