How Much Does It Cost To Run ChatGPT?
Cost and Scale of ChatGPT Usage
With ChatGPT processing millions of queries daily, the question of cost naturally arises. How much does it cost to run ChatGPT each day, and how does that compare to household energy use? OpenAI, the company behind ChatGPT, does not make its financial statements public, but several researchers and media outlets have attempted to estimate its daily running costs. The figure they arrived at was an eye-popping $700,000 per day, and with a power draw the size of a small city, it has broader implications for energy consumption and sustainability as AI models continue to grow in popularity. In this article, I’ll break down the numbers and show how that huge $700,000 daily figure was arrived at. I’ll also look at how much energy a single ChatGPT query uses and how that compares to some typical household activities. Let’s take a look.
How Much Energy Does a Single ChatGPT Query Use?
To understand how much energy it might take to run ChatGPT, we need to break it down to its smallest unit: a single query. According to estimates, each query consumes 0.005 kWh of electricity on average. That’s about ten times the energy needed for a standard Google search. At first glance, this seems negligible—but when multiplied across ChatGPT’s daily traffic, the numbers quickly become staggering. With 100 million queries per day, ChatGPT consumes approximately 500 MWh daily. Using an average industrial electricity cost of $0.10 per kWh, this translates to $50,000 per day spent on electricity alone.
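The back-of-the-envelope math above can be sketched in a few lines of Python. All inputs are the rough public estimates quoted in this article, not figures from OpenAI:

```python
# Rough estimate of ChatGPT's daily electricity consumption and cost.
# All inputs are estimates quoted in the article, not measured values.

ENERGY_PER_QUERY_KWH = 0.005    # ~0.005 kWh per query (estimate)
QUERIES_PER_DAY = 100_000_000   # ~100 million queries per day (estimate)
PRICE_PER_KWH_USD = 0.10        # average industrial electricity rate

daily_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY
daily_mwh = daily_kwh / 1_000
daily_cost = daily_kwh * PRICE_PER_KWH_USD

print(f"Daily consumption: {daily_mwh:,.0f} MWh")     # 500 MWh
print(f"Daily electricity cost: ${daily_cost:,.0f}")  # $50,000
```

Every input here is an assumption, so the output is only as good as the estimates going in, but it confirms the 500 MWh and $50,000/day figures above are internally consistent.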
Comparing ChatGPT’s Energy Use to Household Consumption
To put this into perspective, let’s compare a ChatGPT query to running some common household appliances:
- LED Light Bulb: A single ChatGPT query uses roughly the same energy as running a 10-watt LED bulb for about 30 minutes.
- Smartphone Charging: Charging a smartphone for 5–10 minutes consumes roughly the same energy as one query.
While a single query is small in the scheme of things, the total number of queries reveals the true scale of ChatGPT’s energy use. In the U.S., the average household consumes about 30 kWh per day. This means ChatGPT’s daily energy consumption of around 500 MWh could power over 16,600 homes for a day. That is enough power for a small city.
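The household comparison is easy to reproduce, assuming the 500 MWh/day and 30 kWh/household estimates used above:

```python
# How many average U.S. homes could run on ChatGPT's daily energy use?
# Both figures are the estimates used in this article.

CHATGPT_DAILY_MWH = 500    # estimated daily consumption
HOUSEHOLD_DAILY_KWH = 30   # average U.S. household usage per day

homes_powered = (CHATGPT_DAILY_MWH * 1_000) / HOUSEHOLD_DAILY_KWH
print(f"Homes powered for a day: {homes_powered:,.0f}")  # ~16,667
```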
How Do We Get To $700,000 Per Day Though?
The $700,000 figure includes much more than just electricity. OpenAI’s operations encompass hardware, software, and human expertise. Here’s a closer look at what makes up this hefty price tag:
Computational Costs: GPUs and Energy
- AI Model Size and Complexity:
- ChatGPT is based on large-scale language models like GPT-4, which require significant computational resources. These models can involve billions of parameters, demanding powerful hardware.
- GPUs for Inference:
- Each query processed by ChatGPT requires inference (real-time processing of input data). This is often performed on high-performance GPUs like NVIDIA A100 or H100.
- Costs for running GPUs are substantial. Each GPU can cost $1.50–$3.00/hour depending on the provider and location. OpenAI is likely running thousands of GPUs simultaneously.
- Energy Consumption:
- GPUs are energy-intensive, and the energy costs (depending on location and efficiency) can add up significantly.
User Base and Queries
- Active Daily Users:
- ChatGPT serves millions of users daily. Estimates often assume 10–20 million active daily users, generating on the order of 100 million queries per day.
- Query Processing Cost:
- Some reports estimate that each query costs around $0.01–$0.03 to process, based on energy and GPU usage. With millions of queries daily, this scales rapidly.
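Using the per-query range quoted above, the daily processing bill scales like this (a rough sketch; both the query volume and per-query costs are estimates):

```python
# Daily query-processing cost across the reported per-query range.
QUERIES_PER_DAY = 100_000_000  # ~100 million queries/day (estimate)

for cost_per_query in (0.01, 0.02, 0.03):
    daily = QUERIES_PER_DAY * cost_per_query
    print(f"At ${cost_per_query:.2f}/query: ${daily:,.0f}/day")
```

Even at the low end of the range, query processing alone runs to seven figures per day before any offsetting revenue is counted.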
Hosting and Cloud Infrastructure
- Cloud Providers:
- OpenAI partners with Microsoft for Azure cloud infrastructure. This partnership likely includes substantial discounts due to Microsoft’s $13 billion investment, but even with discounts, hosting costs are massive.
- Estimated hosting and data transfer costs are in the tens of thousands of dollars daily.
- Redundancy and Scaling:
- Systems must be robust and scalable, with additional servers for failover, testing, and global availability.
Software Development and Maintenance
- Continuous Updates:
- ChatGPT is continuously updated to improve its performance, add features, and address ethical concerns.
- Development and maintenance involve large teams of engineers, researchers, and support staff, incurring significant payroll expenses.
Support Staff and Other Operational Costs
- Human Moderators:
- Content moderation and human feedback for fine-tuning models are labor-intensive and costly.
- Customer Support:
- Handling user inquiries and managing subscription services (like ChatGPT Plus) require dedicated support staff.
Looking At The Daily Costs
- GPU Costs:
- Assume 2,000 GPUs running 24/7 at $2/hour = $96,000/day.
- Energy Costs:
- Estimate $50,000/day for power.
- Query Processing:
- Assume 100 million queries, at $0.02 per query:
- Up to $2,000,000/day for queries alone.
- Adjusted for subscription revenue and discounts, this might be $400,000/day.
- Cloud Infrastructure:
- Azure hosting, storage, and bandwidth = $50,000/day.
- Staffing and Other Costs:
- R&D, salaries, and moderation = $100,000/day.
Adding It All Together
- GPU + Energy Costs: $146,000/day
- Query Processing: $400,000/day
- Hosting: $50,000/day
- Staffing: $100,000/day
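Summing those line items reproduces the headline figure. All inputs are the rough estimates from the breakdown above:

```python
# Daily cost breakdown using the article's estimates.
GPUS = 2_000
GPU_RATE_PER_HOUR = 2.00

costs = {
    "GPUs (2,000 x 24h x $2/h)": GPUS * 24 * GPU_RATE_PER_HOUR,  # $96,000
    "Energy": 50_000,
    "Query processing (net of revenue)": 400_000,
    "Cloud hosting": 50_000,
    "Staffing and R&D": 100_000,
}

total = sum(costs.values())
for item, cost in costs.items():
    print(f"{item}: ${cost:,.0f}/day")
print(f"Total: ~${total:,.0f}/day")  # ~$696,000
```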
As you can see, once you add everything together, the estimated daily operational cost comes to roughly $700,000. With usage continuing to increase, the cost to run ChatGPT keeps growing, and these numbers are unlikely to shrink anytime soon.
The Bigger Picture: Why It Matters
Understanding the costs and energy demands of AI models like ChatGPT isn’t just about numbers—it’s about recognizing the broader implications. As the world increasingly relies on AI, balancing innovation with sustainability becomes critical. OpenAI has indicated plans to reduce its carbon footprint by investing in renewable energy, but the sheer scale of operations like ChatGPT raises important questions about efficiency, environmental impact, and the future of AI development.
Weighing the Costs and Benefits of Running ChatGPT
Running ChatGPT isn’t just a technological marvel; it’s a costly and energy-intensive operation. Each query may only cost fractions of a cent, but the cumulative daily expense reaches hundreds of thousands of dollars. Comparing its energy use to household activities helps contextualize its environmental footprint and underscores the need for sustainable solutions. As AI adoption grows, understanding these costs allows us to make informed decisions about how we use and support such technologies. What do you think? Is the cost of running AI like ChatGPT worth the benefits? Send me a message or sign up for the newsletter for more informative posts like this one.
Additional Information
The following links provide some more information on the energy costs for ChatGPT:
- “ChatGPT costs $700,000 to run daily, OpenAI may go bankrupt in 2024 – report” (TechNext): reports that OpenAI spends $700K daily on ChatGPT, with losses hitting $540M in 2023 amid user decline and competition from open-source models like Meta’s Llama 2.
- “ChatGPT’s Power Consumption: Ten Times More Than Google’s” (Heise Online): examines the energy demands of ChatGPT, highlighting that each request consumes approximately 2.9 watt-hours, significantly more than a standard Google search.
- “The Environmental Impact of ChatGPT” (Earth.org): looks at the carbon footprint associated with ChatGPT’s operations, discussing its annual CO₂ emissions and the broader implications for sustainability.
- “Is Generative AI Bad for the Environment? A Computer Scientist Explains the Carbon Footprint of ChatGPT and Its Cousins” (The Conversation): provides an in-depth analysis of the energy consumption involved in training and running large language models like ChatGPT, offering insights into their environmental costs.