Using the API#

Authentication and Configuration#

  • For an overview of authentication in google-cloud-python, see Authentication.

  • In addition to any authentication configuration, you should also set the GOOGLE_CLOUD_PROJECT environment variable for the project you’d like to interact with. If you are running on Google App Engine or Google Compute Engine, this will be detected automatically.

  • The library now enables the gRPC transport for the logging API by default, assuming that the required dependencies are installed and importable. To disable this transport, set the GOOGLE_CLOUD_DISABLE_GRPC environment variable to a non-empty string; see the shell example after this list.

  • After configuring your environment, create a Client

    >>> from google.cloud import logging
    >>> client = logging.Client()
    

    or pass in credentials and project explicitly

    >>> from google.cloud import logging
    >>> client = logging.Client(project='my-project', credentials=creds)
    
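For reference, a minimal shell setup might look like this; my-project is a placeholder for your own project ID:

$ export GOOGLE_CLOUD_PROJECT=my-project
$ # Optional: disable gRPC and use the HTTP/JSON transport instead.
$ export GOOGLE_CLOUD_DISABLE_GRPC=true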

Writing log entries#

Write a simple text entry to a logger.

>>> from google.cloud import logging
>>> client = logging.Client()
>>> logger = client.logger('log_name')
>>> logger.log_text("A simple entry")  # API call
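
log_text also accepts optional keyword metadata. As a sketch, assuming your installed version of the library supports the severity keyword argument:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> logger = client.logger('log_name')
>>> # The severity keyword is an assumption; check your library version.
>>> logger.log_text("A simple entry", severity='ERROR')  # API call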

Write a dictionary entry to a logger.

>>> from google.cloud import logging
>>> client = logging.Client()
>>> logger = client.logger('log_name')
>>> logger.log_struct({
...     'message': 'My second entry',
...     'weather': 'partly cloudy'})  # API call

Retrieving log entries#

Fetch entries for the default project.

>>> from google.cloud import logging
>>> client = logging.Client()
>>> entries, token = client.list_entries()  # API call
>>> for entry in entries:
...    timestamp = entry.timestamp.isoformat()
...    print('%sZ: %s' %
...          (timestamp, entry.payload))
2016-02-17T20:35:49.031864072Z: A simple entry
2016-02-17T20:38:15.944418531Z: {'message': 'My second entry', 'weather': 'partly cloudy'}

Fetch entries across multiple projects.

>>> from google.cloud import logging
>>> client = logging.Client()
>>> entries, token = client.list_entries(
...     project_ids=['one-project', 'another-project'])  # API call

Filter entries retrieved using the Advanced Logs Filters syntax.

>>> from google.cloud import logging
>>> client = logging.Client()
>>> FILTER = "log:log_name AND textPayload:simple"
>>> entries, token = client.list_entries(filter_=FILTER)  # API call

Sort entries in descending timestamp order.

>>> from google.cloud import logging
>>> client = logging.Client()
>>> entries, token = client.list_entries(order_by=logging.DESCENDING)  # API call

Retrieve entries in batches of 10, iterating until done.

>>> from google.cloud import logging
>>> client = logging.Client()
>>> retrieved = []
>>> token = None
>>> while True:
...     entries, token = client.list_entries(page_size=10, page_token=token)  # API call
...     retrieved.extend(entries)
...     if token is None:
...         break

Retrieve entries for a single logger, sorting in descending timestamp order:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> logger = client.logger('log_name')
>>> entries, token = logger.list_entries(order_by=logging.DESCENDING)  # API call

Delete all entries for a logger#

>>> from google.cloud import logging
>>> client = logging.Client()
>>> logger = client.logger('log_name')
>>> logger.delete()  # API call

Manage log metrics#

Metrics are counters of entries which match a given filter. They can be used within Stackdriver Monitoring to create charts and alerts.

Create a metric:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> metric = client.metric(
...     "robots",
...     filter_='log:apache-access AND textPayload:robot',
...     description="Robots all up in your server")
>>> metric.exists()  # API call
False
>>> metric.create()  # API call
>>> metric.exists()  # API call
True

List all metrics for a project:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> metrics, token = client.list_metrics()
>>> len(metrics)
1
>>> metric = metrics[0]
>>> metric.name
"robots"

Refresh local information about a metric:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> metric = client.metric("robots")
>>> metric.reload()  # API call
>>> metric.description
"Robots all up in your server"
>>> metric.filter_
"log:apache-access AND textPayload:robot"

Update a metric:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> metric = client.metric("robots")
>>> metric.exists()  # API call
True
>>> metric.reload()  # API call
>>> metric.description = "Danger, Will Robinson!"
>>> metric.update()  # API call

Delete a metric:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> metric = client.metric("robots")
>>> metric.exists()  # API call
True
>>> metric.delete()  # API call
>>> metric.exists()  # API call
False

Export log entries using sinks#

Sinks allow exporting entries which match a given filter to Cloud Storage buckets, BigQuery datasets, or Cloud Pub/Sub topics.

Export to Cloud Storage#

Make sure that the storage bucket to which you want to export logs has cloud-logs@google.com as the owner. See Set permission for writing exported logs.

Add cloud-logs@google.com as the owner of my-bucket-name:

>>> from google.cloud import storage
>>> client = storage.Client()
>>> bucket = client.get_bucket('my-bucket-name')
>>> bucket.acl.reload()
>>> logs_group = bucket.acl.group('cloud-logs@google.com')
>>> logs_group.grant_owner()
>>> bucket.acl.add_entity(logs_group)
>>> bucket.acl.save()

Export to BigQuery#

To export logs to BigQuery you must log into the Cloud Platform Console and add cloud-logs@google.com to a dataset.

See: Setting permissions for BigQuery

>>> from google.cloud import bigquery
>>> from google.cloud.bigquery.dataset import AccessGrant
>>> bigquery_client = bigquery.Client()
>>> dataset = bigquery_client.dataset('my-dataset-name')
>>> dataset.create()
>>> dataset.reload()
>>> grants = dataset.access_grants
>>> grants.append(AccessGrant(
...     'WRITER', 'groupByEmail', 'cloud-logs@google.com'))
>>> dataset.access_grants = grants
>>> dataset.update()

Export to Pub/Sub#

To export logs to Pub/Sub you must log into the Cloud Platform Console and add cloud-logs@google.com to a topic.

See: Setting permissions for Pub/Sub

>>> from google.cloud import pubsub
>>> client = pubsub.Client()
>>> topic = client.topic('your-topic-name')
>>> policy = topic.get_iam_policy()
>>> policy.owners.add(policy.group('cloud-logs@google.com'))
>>> topic.set_iam_policy(policy)

Create a Cloud Storage sink:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> sink = client.sink(
...     "robots-storage",
...     'log:apache-access AND textPayload:robot',
...     'storage.googleapis.com/my-bucket-name')
>>> sink.exists()  # API call
False
>>> sink.create()  # API call
>>> sink.exists()  # API call
True

Create a BigQuery sink:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> sink = client.sink(
...     "robots-bq",
...     'log:apache-access AND textPayload:robot',
...     'bigquery.googleapis.com/projects/my-project/datasets/my-dataset')
>>> sink.exists()  # API call
False
>>> sink.create()  # API call
>>> sink.exists()  # API call
True

Create a Cloud Pub/Sub sink:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> sink = client.sink(
...     "robots-pubsub",
...     'log:apache-access AND textPayload:robot',
...     'pubsub.googleapis.com/projects/my-project/topics/my-topic')
>>> sink.exists()  # API call
False
>>> sink.create()  # API call
>>> sink.exists()  # API call
True

List all sinks for a project:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> sinks, token = client.list_sinks()
>>> for sink in sinks:
...     print('%s: %s' % (sink.name, sink.destination))
robots-storage: storage.googleapis.com/my-bucket-name
robots-bq: bigquery.googleapis.com/projects/my-project/datasets/my-dataset
robots-pubsub: pubsub.googleapis.com/projects/my-project/topics/my-topic

Refresh local information about a sink:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> sink = client.sink('robots-storage')
>>> sink.filter_ is None
True
>>> sink.reload()  # API call
>>> sink.filter_
'log:apache-access AND textPayload:robot'
>>> sink.destination
'storage.googleapis.com/my-bucket-name'

Update a sink:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> sink = client.sink("robots-storage")
>>> sink.reload()  # API call
>>> sink.filter_ = "log:apache-access"
>>> sink.update()  # API call

Delete a sink:

>>> from google.cloud import logging
>>> client = logging.Client()
>>> sink = client.sink(
...     "robots-storage",
...     filter_='log:apache-access AND textPayload:robot')
>>> sink.exists()  # API call
True
>>> sink.delete()  # API call
>>> sink.exists()  # API call
False

Integration with Python logging module#

It’s possible to tie the Python logging module directly into Google Cloud Logging. To use it, create a CloudLoggingHandler instance from your Logging client.

>>> import logging
>>> import google.cloud.logging # Don't conflict with standard logging
>>> from google.cloud.logging.handlers import CloudLoggingHandler
>>> client = google.cloud.logging.Client()
>>> handler = CloudLoggingHandler(client)
>>> cloud_logger = logging.getLogger('cloudLogger')
>>> cloud_logger.setLevel(logging.INFO) # defaults to WARNING
>>> cloud_logger.addHandler(handler)
>>> cloud_logger.error('bad news')

Note

This handler by default uses an asynchronous transport that sends log entries on a background thread. However, the API call will still be made in the same process. For other transport options, see the transports section.

All logs will go to a single custom log, which defaults to “python”. The name of the Python logger will be included in the structured log entry under the “python_logger” field. You can change the log name by providing a name to the handler:

>>> handler = CloudLoggingHandler(client, name="mycustomlog")

It is also possible to attach the handler to the root Python logger, so that, for example, a plain logging.warn call would be sent to Cloud Logging, as would calls on any other loggers you create. However, you must avoid infinite recursion from the logging calls the client itself makes. A helper method, setup_logging, is provided to configure this automatically:

>>> import logging
>>> import google.cloud.logging # Don't conflict with standard logging
>>> from google.cloud.logging.handlers import CloudLoggingHandler, setup_logging
>>> client = google.cloud.logging.Client()
>>> handler = CloudLoggingHandler(client)
>>> logging.getLogger().setLevel(logging.INFO) # defaults to WARNING
>>> setup_logging(handler)
>>> logging.error('bad news')

You can also exclude certain loggers:

>>> setup_logging(handler, excluded_loggers=('werkzeug',))

Python logging handler transports#

The Python logging handler can use different transports; the default is google.cloud.logging.handlers.BackgroundThreadTransport. An example of selecting a transport explicitly follows the list below.

1. google.cloud.logging.handlers.BackgroundThreadTransport is the default; it writes entries on a background threading.Thread.

2. google.cloud.logging.handlers.SyncTransport makes a direct API call on each logging statement to write the entry.
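
A minimal sketch of selecting a transport, assuming your version of CloudLoggingHandler accepts a transport argument and that the transports are importable from the path shown (import paths can vary between releases):

>>> import google.cloud.logging
>>> from google.cloud.logging.handlers import CloudLoggingHandler
>>> # Import path is an assumption; some releases expose transports
>>> # directly under google.cloud.logging.handlers.
>>> from google.cloud.logging.handlers.transports import SyncTransport
>>> client = google.cloud.logging.Client()
>>> handler = CloudLoggingHandler(client, transport=SyncTransport)

With SyncTransport, each logging call blocks on a live API request; this is easier to reason about in short scripts but slower than the background-thread default.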