Cost Management in GCP: Optimizing Your Cloud Spending 

Introduction 

 
Cloud computing has revolutionized how organizations operate, providing flexibility, scalability, and cost efficiency. Among the leading cloud providers, Google Cloud Platform (GCP) has gained significant traction due to its robust infrastructure, advanced AI/ML capabilities, and wide range of services catering to various industries. Over the past few years, GCP’s market share has grown steadily, capturing approximately 10% of the global cloud market as of 2024. This growth reflects an increasing reliance by organizations on GCP for hosting applications, managing data, and running critical workloads. 

Organizations across industries leverage GCP for a variety of use cases: 

  1. E-commerce platforms benefit from GCP’s scalability to handle peak traffic during sales. 
  2. Media and entertainment companies utilize its powerful analytics and data processing capabilities to deliver personalized content. 
  3. Healthcare providers rely on GCP’s compliance with industry standards like HIPAA to store and process sensitive data securely. 
  4. Startups and tech companies use its AI/ML tools to innovate rapidly without the overhead of managing physical infrastructure. 

While GCP provides immense potential, managing costs effectively is crucial for ensuring a good return on investment. This guide will walk you through the importance of cost management in GCP and proven strategies to optimize your spending. 

Why is Cost Management Important? 

  1. Cost Visibility: Helps organizations understand where their money is spent and identify high-cost services. 
  2. Budget Compliance: Ensures spending stays within predefined budgets, preventing unexpected expenses. 
  3. Resource Optimization: Identifies unused or underutilized resources, allowing for cost savings. 
  4. Business Continuity: Ensures financial predictability and stability for cloud operations. 
  5. Scalability: Enables businesses to scale resources efficiently without overspending. 

Strategies for Cost Optimization in GCP 

1. Enable Billing Alerts and Budgets 

Setting up budgets and alerts ensures you are notified when spending crosses predefined thresholds. Note that budgets alert on spend; they do not cap it. 

Implementation: 

  • Navigate to Billing > Budgets and Alerts in the GCP Console. 
  • Create a budget and set thresholds (e.g., 80%, 100%). 

Best Practice: 

Set up separate budgets for individual projects and services for better tracking and control. 
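
The same budget can also be created from the command line. A minimal sketch using gcloud (the billing account ID, display name, and amount below are placeholders to replace with your own):

gcloud billing budgets create \
    --billing-account=YOUR_BILLING_ACCOUNT_ID \
    --display-name="monthly-project-budget" \
    --budget-amount=1000USD \
    --threshold-rule=percent=0.8 \
    --threshold-rule=percent=1.0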

2. Use Cost Analysis and Billing Reports 

The Cost Analysis tool provides insights into spending patterns across projects, services, and regions. 

Implementation: 

  • Go to Billing > Cost Analysis in the GCP Console. 
  • Filter costs by service, region, or project to identify high-cost resources. 

Best Practice: 
 
Use the Breakdown by Service feature to pinpoint the most expensive services. 

3. Tagging and Cost Allocation 

Tags (or labels) categorize resources, making it easier to allocate costs across teams or departments. 

Implementation: 

  • Apply tags like department=marketing or environment=production during resource creation. 

Best Practice: 

Establish a consistent tagging schema to ensure comprehensive cost tracking. 
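
Labels can be attached at creation time or added to an existing VM with gcloud; the instance name, zone, and label keys/values below are illustrative:

# Attach labels when creating an instance
gcloud compute instances create web-server-1 \
    --zone=us-central1-a \
    --labels=department=marketing,environment=production

# Add labels to an existing instance
gcloud compute instances add-labels web-server-1 \
    --zone=us-central1-a \
    --labels=cost-center=cc-123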

4. Rightsize Resources 

Rightsizing identifies over-provisioned resources and recommends resizing them based on usage. 

Implementation: 

  • Use the Recommendations page in the GCP Console to find rightsizing suggestions for VMs. 

Best Practice: 

Enable autoscaling to dynamically adjust resources based on demand. 
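
Rightsizing suggestions can also be pulled from the command line via the Recommender; a sketch, assuming the Recommender API is enabled and the location matches the zone of your VMs:

gcloud recommender recommendations list \
    --project=YOUR_PROJECT_ID \
    --location=us-central1-a \
    --recommender=google.compute.instance.MachineTypeRecommender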

5. Use Preemptible VMs 

Preemptible VMs are short-lived, low-cost instances that can save up to 80% compared to standard VMs. 

Implementation: 

  • While creating a VM, select the Preemptible option under Machine Configuration. 

Best Practice: 

Use Preemptible VMs for non-critical workloads like batch processing or testing. 
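
A minimal sketch of creating a preemptible instance from the command line (instance name, zone, and machine type are placeholders):

gcloud compute instances create batch-worker-1 \
    --zone=us-central1-a \
    --machine-type=e2-medium \
    --preemptible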

6. Commit to Long-Term Contracts (Committed Use Discounts) 

Long-term commitments for specific resources offer significant discounts. 

Implementation: 

  • Navigate to Billing > Commitment Plans, and select a 1-year or 3-year commitment for services like Compute Engine. 

Best Practice: 

Commit only to resources you consistently use, such as VMs or databases. 
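
Commitments can also be purchased with gcloud; a sketch, assuming a 1-year commitment for 4 vCPUs and 16 GB of memory in a single region (name, region, and resource amounts are placeholders):

gcloud compute commitments create my-commitment \
    --region=us-central1 \
    --plan=12-month \
    --resources=vcpu=4,memory=16GB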

7. Optimize Cloud Storage Classes 

Choose storage classes like Coldline or Archive for data that is infrequently accessed. 

Implementation: 

  • Configure storage classes during object uploads in Cloud Storage. 
  • Set lifecycle management policies to automatically transition objects to cheaper storage classes. 

Best Practice: 

Periodically review storage policies to ensure they align with your data access patterns. 
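
Lifecycle transitions can be defined in a small JSON policy and applied with gsutil. A sketch that moves objects to Coldline after 90 days and Archive after 365 (the age thresholds are illustrative):

# lifecycle.json
{
  "rule": [
    {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
     "condition": {"age": 90}},
    {"action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
     "condition": {"age": 365}}
  ]
}

# Apply the policy to a bucket
gsutil lifecycle set lifecycle.json gs://YOUR_BUCKET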

8. Set Up Auto-Scaling for Compute Resources 

Auto-scaling dynamically adjusts resources based on workload demand. 

Implementation: 

  • Go to Compute Engine > Instance Groups, and configure auto-scaling policies based on metrics like CPU or memory usage. 

Best Practice: 

Use horizontal auto-scaling for stateless workloads and vertical auto-scaling for stateful workloads. 
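
Autoscaling for a managed instance group can also be configured with gcloud; a sketch scaling on CPU utilization (group name, zone, and bounds are placeholders):

gcloud compute instance-groups managed set-autoscaling my-mig \
    --zone=us-central1-a \
    --min-num-replicas=2 \
    --max-num-replicas=10 \
    --target-cpu-utilization=0.6 \
    --cool-down-period=90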

9. Regular Audits and Cleanup 

Periodic audits help identify and remove unused resources like idle VMs, old snapshots, and unattached disks. 

Implementation: 

  • Use Cloud Asset Inventory for resource audits. 
  • Automate cleanups with Cloud Functions. 

Best Practice: 

Run monthly audits and leverage the Recommendations tool for insights. 
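
A couple of gcloud list filters make a quick monthly sweep; the sketches below flag persistent disks with no attached users and snapshots older than 90 days (the filter expressions assume gcloud's standard filter syntax; adjust the age to taste):

# Unattached persistent disks
gcloud compute disks list --filter="-users:*"

# Snapshots older than 90 days
gcloud compute snapshots list --filter="creationTimestamp<-P90D"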

10. Monitor and Automate Cost Management with Cloud Monitoring (formerly Stackdriver) 

Cloud Monitoring (formerly Stackdriver) enables proactive tracking of resource usage and cost trends. 

Implementation: 

  • Use Cloud Monitoring to create dashboards for resource usage. 
  • Automate actions (e.g., stopping non-critical workloads) with Cloud Functions based on cost thresholds. 

Best Practice: 

Set up automated alerts and remediation policies to optimize costs in real-time. 
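
One way to wire up such automation is to point a budget at a Pub/Sub topic and have a Cloud Function subscribed to that topic stop non-critical resources when a threshold notification arrives. A minimal sketch of the budget side (billing account ID, project, topic, and amounts are placeholders):

gcloud billing budgets create \
    --billing-account=YOUR_BILLING_ACCOUNT_ID \
    --display-name="auto-remediation-budget" \
    --budget-amount=500USD \
    --threshold-rule=percent=0.9 \
    --notifications-rule-pubsub-topic=projects/YOUR_PROJECT_ID/topics/cost-alerts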

Ultimate Guide: Automating GCP Project-Wise Cost Reporting & Weekly Slack/Teams Updates using BigQuery, Python, Cloud Functions, and Cloud Scheduler 

🚀 Step 1: Enable Required APIs 

Before running the script, enable the following GCP APIs: 

gcloud services enable cloudbilling.googleapis.com \ 
    bigquery.googleapis.com \ 
    cloudfunctions.googleapis.com \ 
    cloudscheduler.googleapis.com 
 

These services let you fetch billing data, execute queries, deploy functions, and schedule tasks. 

🔹 Step 2: Enable Billing Export to BigQuery 

Why? 

GCP does not provide direct billing API access for detailed cost breakdowns. Instead, we export billing data to BigQuery. 

How to Set Up Billing Export? 

1️⃣ Go to Billing Export Settings 

2️⃣ Click “Edit Settings” under Billing Export 

3️⃣ Choose BigQuery Dataset (or create a new dataset) 

4️⃣ Enable “Detailed Usage Cost Data” export 

5️⃣ Wait up to 24 hours for the data to populate 

Your billing table will be named like: 

project.YOUR_DATASET.gcp_billing_export_v1_xxx_xxx 
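
Once data lands, you can sanity-check the export from the bq CLI (dataset and table names below are placeholders matching the naming pattern above):

bq query --nouse_legacy_sql \
    'SELECT COUNT(*) AS row_count FROM `YOUR_DATASET.gcp_billing_export_v1_xxx_xxx`'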
 

🔹 Step 3: Assign Required IAM Roles 

The service account running this script needs the following IAM roles: 

1️⃣ Billing Account Viewer → Access billing data 

2️⃣ BigQuery Data Viewer → Read billing export table 

3️⃣ Cloud Functions Admin → Deploy function 

4️⃣ Cloud Scheduler Admin → Schedule automatic execution 

Run the following commands (replace with your Project ID and Service Account): 

gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member=serviceAccount:YOUR_SERVICE_ACCOUNT \
    --role=roles/billing.viewer

 
 
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member=serviceAccount:YOUR_SERVICE_ACCOUNT \
    --role=roles/bigquery.dataViewer
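
Note that to actually run queries (as the script in Step 5 does), the service account also needs permission to create BigQuery jobs; granting BigQuery Job User covers this:

gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member=serviceAccount:YOUR_SERVICE_ACCOUNT \
    --role=roles/bigquery.jobUser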
 

🔹 Step 4: Query GCP Billing Data using BigQuery 

We need to fetch project-wise cost data for the last 7 days using a BigQuery SQL query. 

SQL Query: 

SELECT 
  project.id AS project_id, 
  SUM(cost) AS total_cost 
FROM `YOUR_BILLING_EXPORT_DATASET.billing_table` 
WHERE usage_start_time BETWEEN 'YYYY-MM-DD' AND 'YYYY-MM-DD' 
GROUP BY project_id 
ORDER BY total_cost DESC; 
 

This query returns results like: 

Project ID     | Total Cost (USD) 
project-12345  | $120.50 
project-67890  | $98.75 

Run This Query in BigQuery Console 

1️⃣ Open BigQuery Console 

2️⃣ Select your Billing Dataset 

3️⃣ Open the Query Editor, paste the query, and replace 'YYYY-MM-DD' with actual dates 

4️⃣ Click Run 

🔹 Step 5: Python Script to Fetch Cost Data & Send to Slack/Teams 

Now, we integrate the SQL query into a Python script that fetches the data and sends it to Slack/Teams. 

👉 Create a file: gcp_cost_report.py 

👉 Paste the following script: 

import os 
import datetime 

import requests 
import google.auth 
from google.cloud import bigquery 

# 🔹 Slack/Teams webhook URL (read from the environment, matching the
# --set-env-vars flag used in the deploy command in Step 6)
WEBHOOK_URL = os.environ.get("WEBHOOK_URL", "YOUR_SLACK_OR_TEAMS_WEBHOOK") 


def main(request): 
    # 🔹 Authenticate and initialize the BigQuery client 
    credentials, project = google.auth.default() 
    client = bigquery.Client(credentials=credentials, project=project) 

    # 🔹 Get the date range (last 7 days) 
    end_date = datetime.date.today() 
    start_date = end_date - datetime.timedelta(days=7) 

    # 🔹 Query GCP billing data for project-wise costs 
    query = f""" 
    SELECT 
      project.id AS project_id, 
      SUM(cost) AS total_cost 
    FROM `YOUR_BILLING_EXPORT_DATASET.billing_table` 
    WHERE usage_start_time BETWEEN '{start_date}' AND '{end_date}' 
    GROUP BY project_id 
    ORDER BY total_cost DESC 
    """ 

    # 🔹 Execute the query 
    query_job = client.query(query) 
    results = query_job.result() 

    # 🔹 Format the data as a Slack/Teams message 
    message = "\n*🚀 GCP Project-Wise Cost Report (Last 7 Days)*\n" 
    message += "```\nProject ID       | Total Cost (USD)\n" 
    message += "----------------------------------\n" 
    for row in results: 
        message += f"{row.project_id:<15} | ${row.total_cost:.2f}\n" 
    message += "```" 

    # 🔹 Send the report to Slack/Teams 
    requests.post(WEBHOOK_URL, json={"text": message}) 

    return "✅ Report sent successfully!" 
 

🔹 Step 6: Deploy as a Cloud Function 

To automate execution, deploy this script as a Cloud Function. 

Run the following command: 

gcloud functions deploy gcp-cost-report \
    --runtime python310 \
    --trigger-http \
    --allow-unauthenticated \
    --entry-point=main \
    --set-env-vars WEBHOOK_URL="YOUR_SLACK_OR_TEAMS_WEBHOOK"
 

Get the Function URL: 

gcloud functions describe gcp-cost-report --format="value(httpsTrigger.url)"
 

📌 Copy the Cloud Function URL. 

🔹 Step 7: Schedule Weekly Execution with Cloud Scheduler 

To automate execution every Monday, use Cloud Scheduler. 

Run: 

# Runs every Monday at 9 AM
gcloud scheduler jobs create http weekly-gcp-cost \
    --schedule="0 9 * * 1" \
    --uri="YOUR_CLOUD_FUNCTION_URL" \
    --http-method=POST
 

🚀 Weekly Cost Comparison 

To track weekly cost trends, modify the query to fetch: 

  • Last week’s total cost 
  • Previous week’s total cost 
  • Percentage increase/decrease 

SQL Query: 

WITH cost_data AS ( 
  SELECT 
    project.id AS project_id, 
    SUM(cost) AS total_cost, 
    DATE_TRUNC(DATE(usage_start_time), WEEK) AS week_start 
  FROM `YOUR_BILLING_EXPORT_DATASET.billing_table` 
  WHERE usage_start_time >= DATE_SUB(CURRENT_DATE(), INTERVAL 14 DAY) 
  GROUP BY project_id, week_start 
) 

SELECT 
  c1.project_id, 
  c1.total_cost AS last_week_cost, 
  c2.total_cost AS prev_week_cost, 
  ((c1.total_cost - c2.total_cost) / NULLIF(c2.total_cost, 0)) * 100 AS cost_change 
FROM cost_data c1 
LEFT JOIN cost_data c2 
  ON c1.project_id = c2.project_id AND c1.week_start = DATE_ADD(c2.week_start, INTERVAL 7 DAY) 
WHERE c1.week_start = DATE_TRUNC(DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY), WEEK) 
ORDER BY c1.total_cost DESC; 
 

Python Script for Weekly Comparison 

THRESHOLD_PERCENT = 20 
 
message = “\n*🚀 GCP Project-Wise Cost Report (Weekly Comparison)*\n” 
message += ““`\nProject ID       | Last Week  | Prev Week  | Change (%)\n” 
message += “——————————————————\n” 
 
for row in results: 
    cost_change = row.cost_change if row.cost_change is not None else 0 
    indicator = “🔴” if cost_change > THRESHOLD_PERCENT else “🟢” 
    message += f”{row.project_id:<15} | ${row.last_week_cost:.2f} | ${row.prev_week_cost:.2f} | {cost_change:.2f}% {indicator}\n” 
 
message += ““`” 
requests.post(WEBHOOK_URL, json={“text”: message}) 
 

✅ Final Workflow 

1️⃣ GCP exports cost data → BigQuery (Step 2) 

2️⃣ Python script fetches project-wise costs (Step 5) 

3️⃣ Cloud Function runs the script (Step 6) 

4️⃣ Cloud Scheduler executes it weekly & sends the report (Step 7) 

🚀 Now, every Monday, the report will be sent to Slack/Teams automatically! 

 
Conclusion 

As organizations increasingly rely on GCP, effective cost management is crucial to maintaining cloud efficiency and budget control. By automating project-wise cost reports using BigQuery, Cloud Functions, and Cloud Scheduler, teams can track and optimize costs effortlessly. 

💡 Key Takeaways: 

✅ Use BigQuery for detailed cost breakdowns 

✅ Automate weekly cost reports via Slack/Teams 

✅ Set up budgets, alerts, and rightsizing strategies 

✅ Optimize costs through auto-scaling, storage classes, and committed use discounts 
