Cloud Storage Logging

Overview

This documentation provides a detailed walkthrough to send Google Cloud Storage logs directly to SigNoz. By the end of this guide, you will have a setup that automatically sends your Cloud Storage logs to SigNoz.

Here's a quick summary of what we will be doing in this guide:

  • Create and configure Cloud Storage Audit Logs
  • Create Pub/Sub topic
  • Create Log Router to route the Cloud Storage logs to SigNoz
  • Create OTel Collector to route logs from Pub/Sub topic to SigNoz Cloud
  • Create Cloud Storage bucket and objects
  • Send and Visualize the logs in SigNoz Cloud

Prerequisites

  • Google Cloud account with administrative privilege or Logging Admin privilege.
  • SigNoz Cloud account (we are using SigNoz Cloud for this demonstration). You will also need your ingestion details: to get your Ingestion Key and Ingestion URL, sign in to your SigNoz Cloud account and go to Settings >> Ingestion Settings.
  • Access to a project in GCP

Setup

Get started with Cloud Storage Audit Log Configuration

Enable the Google Cloud Storage Audit Logs using the following steps:

Step 1: Go to your GCP console and search for Audit Logs, then open Audit Logs under the IAM and Admin service.

Go to Audit Logs

Step 2: In the Data access audit logs configuration table, search for "Google Cloud Storage" using the table's search bar, and click on the corresponding record.

Search for Google Cloud Storage in Data Access Audit Logs Configuration

Step 3: Select the Google Cloud Storage record by clicking the checkbox to its left. This opens the permission settings panel on the right side of the screen.

Check the Data Write option, and click the SAVE button. This enables Data Write audit logs for the Google Cloud Storage service.

Info

You can also select the Admin Read and Data Read options from the Permission Types popup. Note, however, that enabling them can generate a very large number of audit logs and may increase your cost.

Enable Data Write Audit Logs
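
If you prefer the command line, the same Data Write audit logging can be enabled by editing the project's IAM policy with gcloud. This is a minimal sketch, assuming <gcp-project-id> is your project ID; edit the downloaded policy.yaml in place so existing bindings are preserved:

# Download the current IAM policy for the project
gcloud projects get-iam-policy <gcp-project-id> --format=yaml > policy.yaml

# Append the following block to policy.yaml to enable Data Write audit logs for Cloud Storage:
# auditConfigs:
# - service: storage.googleapis.com
#   auditLogConfigs:
#   - logType: DATA_WRITE

# Apply the edited policy back to the project
gcloud projects set-iam-policy <gcp-project-id> policy.yaml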

Create Pub/Sub Topic

Follow the steps mentioned in the Creating Pub/Sub Topic document to create the Pub/Sub topic.
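
For reference, the topic and a pull subscription for it can also be created from the command line. This is a minimal sketch, assuming the placeholder names <pubsub-topic> and <pubsub-topic's-subscription> that are reused later in the collector configuration:

# Create the topic that the Log Router will publish to
gcloud pubsub topics create <pubsub-topic>

# Create a pull subscription for the OTel Collector to read from
gcloud pubsub subscriptions create <pubsub-topic's-subscription> --topic=<pubsub-topic>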

Create Log Router to Pub/Sub Topic

Follow the steps mentioned in the Log Router Setup document to create the Log Router.

To ensure you route only the Cloud Storage logs, use the following filter condition:

resource.type="gcs_bucket"
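
If you prefer the command line, a Log Router sink with this filter can also be created using gcloud. This is a minimal sketch, assuming a hypothetical sink name <sink-name> and the Pub/Sub topic created earlier; after creation, the sink's writer identity still needs the Pub/Sub Publisher role on the topic, which the linked document covers:

gcloud logging sinks create <sink-name> \
  pubsub.googleapis.com/projects/<gcp-project-id>/topics/<pubsub-topic> \
  --log-filter='resource.type="gcs_bucket"'

# Print the sink's writer identity (the service account that needs the Pub/Sub Publisher role on the topic)
gcloud logging sinks describe <sink-name> --format='value(writerIdentity)'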

Setup OTel Collector

Follow the steps mentioned in the Creating Compute Engine document to create a Compute Engine instance. We will install the OTel Collector on this instance.

Install OTel Collector as agent

First, we will establish authentication using the following commands:

  1. Initialize gcloud:

gcloud init

  2. Authenticate into GCP:

gcloud auth application-default login

Let us now proceed to the OTel Collector installation:

Step 1: Download otel-collector tar.gz for your architecture

wget https://github.com/open-telemetry/opentelemetry-collector-releases/releases/download/v0.88.0/otelcol-contrib_0.88.0_linux_amd64.tar.gz

Step 2: Extract otel-collector tar.gz to the otelcol-contrib folder

mkdir otelcol-contrib && tar xvzf otelcol-contrib_0.88.0_linux_amd64.tar.gz -C otelcol-contrib

Step 3: Create config.yaml in the otelcol-contrib folder with the content below. Replace <region> with the appropriate SigNoz Cloud region, and replace <SigNoz-Key> with the ingestion key provided by SigNoz:

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
  googlecloudpubsub:
    project: <gcp-project-id>
    subscription: projects/<gcp-project-id>/subscriptions/<pubsub-topic's-subscription>
    encoding: raw_text
processors:
  batch: {}
exporters:
  otlp:
    endpoint: "ingest.<region>.signoz.cloud:443"
    tls:
      insecure: false
    headers:
      "signoz-ingestion-key": "<SigNoz-Key>"
service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp]
    logs:
      receivers: [otlp, googlecloudpubsub]
      processors: [batch]
      exporters: [otlp]
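
Optionally, you can check the configuration file for errors before starting the collector. Recent otelcol-contrib builds ship a validate subcommand; this is a quick sanity check rather than a substitute for a test run:

./otelcol-contrib validate --config ./config.yaml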

Step 4: Once we are done with the above configuration, we can run the collector service. From the otelcol-contrib folder, run the following command:

./otelcol-contrib --config ./config.yaml

Run in background

If you want to run the OTel Collector process in the background:

./otelcol-contrib --config ./config.yaml &> otelcol-output.log & echo "$!" > otel-pid

The above command sends the output of the OTel Collector to the otelcol-output.log file and writes the process ID of the background OTel Collector process to the otel-pid file.

If you want to see the output of the background process you've just set up, you can follow its logs with:

tail -f -n 50 otelcol-output.log
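
When you no longer need the background process, you can stop it using the process ID saved earlier (a small convenience, assuming the otel-pid file written by the command above):

kill "$(cat otel-pid)"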

Invoking Cloud Storage logs

Step 1: On the GCP console, search for Cloud Storage and go to the Cloud Storage service.

Step 2: Click on the CREATE button at the top of the Cloud Storage page to create a bucket.

Create Cloud Storage Bucket

Step 3: On the Create a Bucket page, provide an appropriate name for the bucket in the Name your bucket section, and click on CONTINUE.

In the Choose where to store your data section, choose an appropriate location type, and click on CONTINUE. You can select an appropriate option in the rest of the sections as per your requirements, or let them be as default.

After completing the form, click on the CREATE button at the bottom of the page. This will create the new bucket.

Name the Cloud Storage Bucket
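
The same bucket can also be created from the command line with the gcloud storage commands. This is a minimal sketch, assuming a hypothetical bucket name <bucket-name> and a location of your choice:

gcloud storage buckets create gs://<bucket-name> --location=<location>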

Step 4: Optionally, you can also upload files to the newly created bucket. To do so, click on the UPLOAD FILES button and select the file you want to upload.

Upload File to Cloud Storage Bucket
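
Equivalently, a file can be uploaded from the command line, which also generates Data Write audit logs. A minimal sketch, assuming a local file named sample.txt and the bucket created above:

gcloud storage cp ./sample.txt gs://<bucket-name>/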

This will generate Cloud Storage logs, which you can now see in SigNoz Cloud.

Cloud Storage Logs in SigNoz Cloud
