Checking the public IP address of a GCP Cloud Function

When you deploy a Cloud Function without configuring a VPC, it operates within Google’s internal network. This means it leverages Google’s infrastructure to handle incoming and outgoing traffic.
- When your function needs to make outbound requests to external services (e.g., HTTP requests), it will use Google’s internal network to reach those services.
- The specific IP addresses used for outbound traffic will change over time.
This blog post details how to check this dynamic IP address (and log it to a database).
The source code can be found in the following repository https://github.com/mortie23/ml/tree/master/eng/misc/fn-misc-ipchecker.
Directory structure
We will start with the standard directory structure for a GCP Cloud Function, following on from the previous post on developing a GCP Cloud Function.
📁 fn-misc-ipchecker/
├── 📁 src/
│   ├── 📄 __init__.py
│   ├── 📄 misc.yaml
│   ├── 📄 main.py
│   └── 📄 requirements.txt
├── 📁 tests/
│   └── 📄 test_fn-misc-ipchecker.py
├── 📄 deploy.sh
├── 📄 poetry.lock
├── 📄 pyproject.toml
├── 📄 pytest.ini
└── 📄 .env
How it works
The configuration file defines the BigQuery database (project), schema (dataset) and table to which the IP addresses are logged, as well as the base URL of the service that returns the IP address that made the HTTP request.
project_id: 'prj-xyz-<env>-misc-ip-1'
dataset_name: 'demo'
table_name: 'ipchecker'
base_url: 'https://api64.ipify.org/?format=json'
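The post does not show how this file is turned into the cfg object used below, so here is one possible loader, a minimal sketch assuming PyYAML is available (the real project may use Hydra instead, given it appears in the test output later).

```python
# Hypothetical config loader (an assumption, not the post's actual code):
# read a flat YAML file like misc.yaml into an attribute-access object.
from types import SimpleNamespace

import yaml  # assumes PyYAML is installed


def load_config(path: str) -> SimpleNamespace:
    """Load a flat YAML config file into an object with attribute access."""
    with open(path) as f:
        return SimpleNamespace(**yaml.safe_load(f))


# Usage (path is illustrative):
# cfg = load_config("src/misc.yaml")
# cfg.base_url  # 'https://api64.ipify.org/?format=json'
```

SimpleNamespace keeps the cfg.base_url attribute style used in the function below without pulling in a heavier config framework.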
The Python function simply makes the HTTP request and extracts the IP address from the JSON response.
import json

import requests


def ip_checker() -> str:
    """Call a service that returns the calling network's IP address

    Returns:
        str: The IP address
    """
    # cfg is loaded from misc.yaml elsewhere in main.py
    data = requests.get(cfg.base_url)
    ip_address = json.loads(data.content.decode("utf-8"))["ip"]
    return ip_address
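The function then logs the returned address to the BigQuery table named in the config. The post does not show that code, so the sketch below is an assumption: the column names (ip_address, logged_at) and the use of insert_rows_json are illustrative, not the post's actual schema.

```python
# Hypothetical logging step (column names and client usage are assumptions).
from datetime import datetime, timezone


def build_ip_row(ip_address: str) -> dict:
    """Build the row dict to insert into the BigQuery table."""
    return {
        "ip_address": ip_address,
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }


def log_ip(project_id: str, dataset_name: str, table_name: str, ip_address: str) -> None:
    """Insert one row; requires google-cloud-bigquery and GCP credentials at runtime."""
    from google.cloud import bigquery  # deferred so the module imports without the library

    client = bigquery.Client(project=project_id)
    table_id = f"{project_id}.{dataset_name}.{table_name}"
    errors = client.insert_rows_json(table_id, [build_ip_row(ip_address)])
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```

insert_rows_json uses the streaming insert API, which suits this append-only, one-row-at-a-time logging pattern.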
Running the source code interactively
As a test, running the main.py script from the developer machine is a good way to get a comparison IP address logged to the BigQuery table.
Development and Deployment
Given this function is developed and deployed in the same way as a previous blog post, to avoid duplication, I will just reference it here: GCP Cloud Function, Development process and tooling.
However, this time I will outline the process to deploy the infrastructure using the GCP Terraform provider as opposed to the gcloud CLI.
Infrastructure as code
To ensure that all required infrastructure is deployed, we will use an Infrastructure as Code (IaC) solution, namely Terraform.
The example below is from the main.tf file and shows how the Google Terraform provider is declared and how it authenticates with Application Default Credentials (ADC).
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "4.51.0"
    }
  }
}

provider "google" {
  credentials = file("~/.config/gcloud/application_default_credentials.json")
  region      = var.region
}

resource "google_project" "misc_ip_1" {
  name            = "prj-xyz-${var.env}-misc-ip-1"
  project_id      = "prj-xyz-${var.env}-misc-ip-1"
  billing_account = var.billing_account
}
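The BigQuery dataset and table from the config file also need to exist. The post does not show those resources, but a sketch matching the misc.yaml values might look like this (resource names and attributes are my assumptions, not the post's actual main.tf).

```hcl
# Hypothetical: BigQuery dataset and table matching misc.yaml (an assumption).
resource "google_bigquery_dataset" "demo" {
  project    = google_project.misc_ip_1.project_id
  dataset_id = "demo"
  location   = var.region
}

resource "google_bigquery_table" "ipchecker" {
  project    = google_project.misc_ip_1.project_id
  dataset_id = google_bigquery_dataset.demo.dataset_id
  table_id   = "ipchecker"
}
```

Referencing google_project.misc_ip_1.project_id rather than repeating the string lets Terraform infer the dependency ordering between the project and the dataset.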
Requirements
I am using a WSL2 installation of the Ubuntu 20.04 distro. I have previously installed the following requirements on the machine.
- Terraform https://developer.hashicorp.com/terraform/tutorials/aws-get-started/install-cli
- Gcloud SDK https://cloud.google.com/sdk/docs/install#deb
As mentioned above, the GCP Terraform provider uses an Application Default Credential (ADC) to authenticate to GCP. Use the following to create yours.
# Create an ADC JSON file
gcloud auth application-default login
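Before running Terraform, it can be worth confirming that the ADC file referenced by the provider block actually exists at the default location gcloud writes it to. This small check is my own addition, not part of the post's workflow.

```python
# Sanity check (my own addition): confirm the ADC file the Terraform
# provider block references has been created by gcloud.
from pathlib import Path


def adc_path() -> Path:
    """Default location where gcloud writes Application Default Credentials."""
    return Path.home() / ".config" / "gcloud" / "application_default_credentials.json"


if __name__ == "__main__":
    print("ADC present:", adc_path().exists())
```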
Initialising the backend
The next step is to initialise the backend. Running terraform init:
- initialises and configures the Terraform working directory
- identifies the providers and modules used in your configuration files and downloads them
- loads the variables used in the Terraform configuration for managing different environments
# Change to the Terraform working directory
cd infra/tf/
# Initialise
terraform init -reconfigure -var-file="env-dev.tfvars"
Plan and Apply
First we run plan, which generates a detailed execution plan. The plan lets us know the changes that Terraform will make to the cloud infrastructure before we apply them, including:
- creating (add) new resources,
- modifying (change) existing ones, or
- deleting (destroy) unnecessary resources
terraform plan -var-file="env-dev.tfvars"
Example output:
google_project.misc_ip_1: Refreshing state... [id=projects/prj-xyz-dev-misc-ip-1]
...
Terraform used the selected providers to generate the following execution plan. Resource actions are indicated with the following symbols:
-/+ destroy and then create replacement
Terraform will perform the following actions:
# google_project.misc_ip_1
  -/+ resource "google_project" "misc_ip_1" {
        ~ id         = "projects/prj-xyz-dev-misc-ip-1" -> (known after apply)
        - labels     = {} -> null
        ~ name       = "prj-xyz-dev-misc-ip-1"
        ~ number     =
        ~ project_id = "prj-xyz-dev-misc-ip-1"
      }
...
Plan: 17 to add, 0 to change, 17 to destroy.
The above plan lets us know that 17 resources will be added (and, since these are replacements, 17 destroyed), and what they are.
Assuming we are happy with the plan, we continue with the apply step, which creates all the infrastructure defined in the configuration.
terraform apply -auto-approve -var-file="env-dev.tfvars"
Once a successful apply has finished, you can log into the Cloud console and check that the resources, such as the new project and/or new BigQuery dataset have been created.
Test calling the function
Once you have deployed the required infrastructure and the Cloud Function, it is time to run it. To call the Cloud Function we can use a basic curl command. Ensure you are logged in as a user that has invoke permissions.
curl https://australia-southeast1-prj-xyz-dev-misc-ip-1.cloudfunctions.net/fn-misc-ipchecker-0 \
-H "Authorization: Bearer $(gcloud auth print-identity-token)"
If you check the target table, it should include an extra row with the public IP address that was used to make the external HTTP request. The first record is from the earlier run of the source code from the development machine, which logged the public IP address of the development machine's network.
Where does the IP address geolocate to?
This result could be obvious, but I found it interesting nonetheless.
Testing
To run the unit tests during development we can run this:
pytest ./tests/test_fn-misc-ipchecker.py::test_ip_checker
====================== test session starts ===========================
platform linux -- Python 3.10.7, pytest-8.3.3, pluggy-1.5.0
rootdir: /mnt/c/git/github/mortie23/ml/eng/misc/fn-misc-ipchecker
configfile: pytest.ini
plugins: env-1.1.5, hydra-core-1.3.2
collected 1 item
tests/test_fn-misc-ipchecker.py . [100%]
====================== 1 passed in 2.72s =============================
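The passing test above takes a couple of seconds because it calls the live service, which is useful here since it logs a comparison row. For a variant that runs without network access, the HTTP call can be mocked; this is a sketch of my own, not the post's actual test file.

```python
# A hypothetical network-free version of the unit test: patch requests.get
# so ip_checker sees a canned JSON response (203.0.113.7 is a documentation IP).
import json
from unittest import mock

import requests  # the function under test calls requests.get


def ip_checker(base_url: str) -> str:
    """Same logic as the post's function, with the URL passed in for testability."""
    data = requests.get(base_url)
    return json.loads(data.content.decode("utf-8"))["ip"]


def test_ip_checker_mocked() -> None:
    fake = mock.Mock(content=json.dumps({"ip": "203.0.113.7"}).encode("utf-8"))
    with mock.patch("requests.get", return_value=fake):
        assert ip_checker("https://api64.ipify.org/?format=json") == "203.0.113.7"
```

Passing the base URL as a parameter (rather than reading cfg inside the function) is a small design change that makes the function easier to exercise in isolation.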