Collect SecureLink logs

This document explains how to ingest SecureLink logs to Google Security Operations using Google Cloud Storage V2.

SecureLink (now Imprivata Privileged Remote Access) is a cloud-based third-party remote access platform for secure vendor and privileged access management. The SecureLink REST API provides programmatic access to session logs, authentication events, and administrative audit logs.

Before you begin

Make sure you have the following prerequisites:

  • A Google SecOps instance
  • A GCP project with Cloud Storage API enabled
  • Permissions to create and manage GCS buckets
  • Permissions to manage IAM policies on GCS buckets
  • Permissions to create Cloud Run services, Pub/Sub topics, and Cloud Scheduler jobs
  • Privileged access to the SecureLink admin portal with administrator permissions
  • A SecureLink API key with audit log read access

Create Google Cloud Storage bucket

  1. Go to the Google Cloud Console.
  2. Select your project or create a new one.
  3. In the navigation menu, go to Cloud Storage > Buckets.
  4. Click Create bucket.
  5. Provide the following configuration details:

    • Name your bucket: Enter a globally unique name (for example, securelink-logs)
    • Location type: Choose based on your needs (Region, Dual-region, Multi-region)
    • Location: Select the location (for example, us-central1)
    • Storage class: Standard (recommended for frequently accessed logs)
    • Access control: Uniform (recommended)
    • Protection tools: Optional: enable object versioning or a retention policy
  6. Click Create.
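
If you prefer the gcloud CLI, the following sketch creates an equivalent bucket. The bucket name, project ID, and location are example values; adjust them to match your environment.

  # Create the bucket with uniform bucket-level access (example values)
  gcloud storage buckets create gs://securelink-logs \
    --project=PROJECT_ID \
    --location=us-central1 \
    --default-storage-class=STANDARD \
    --uniform-bucket-level-access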

Generate API key

  1. Sign in to the SecureLink admin portal.
  2. Go to Administration > API Keys (or Administration > Integrations > API).
  3. Click Generate API Key.
  4. Enter a name for the API key (for example, Google SecOps Integration).
  5. Select the required permissions:
    • Audit Log Read: Access to session and authentication logs
    • Session Read: Access to remote access session data
  6. Click Generate.
  7. Copy and save the following details in a secure location:

    • API Key: The generated API key value
    • Organization ID: Your SecureLink organization ID (visible in the admin portal URL or Settings page)

Determine API base URL

The SecureLink API uses the following base URL:

  • Production: https://api.securelink.com/v1

Test API access

Test your credentials before proceeding with the integration:

    # Replace with your actual credentials
    SECURELINK_API_KEY="your-api-key"
    SECURELINK_ORG_ID="your-org-id"
    
    # Test API access - list recent audit logs
    curl -s -X GET "https://api.securelink.com/v1/audit-logs?limit=1" \
      -H "Authorization: Bearer ${SECURELINK_API_KEY}" \
      -H "X-Organization-Id: ${SECURELINK_ORG_ID}" \
      -H "Accept: application/json"
    

Create service account for Cloud Run function

The Cloud Run function needs a service account with permissions to write to the GCS bucket and to be invoked by Pub/Sub.

Create service account

  1. In the GCP Console, go to IAM & Admin > Service Accounts.
  2. Click Create Service Account.
  3. Provide the following configuration details:
    • Service account name: Enter securelink-logs-collector-sa
    • Service account description: Enter Service account for Cloud Run function to collect SecureLink logs
  4. Click Create and Continue.
  5. In the Grant this service account access to project section, add the following roles:
    1. Click Select a role.
    2. Search for and select Storage Object Admin.
    3. Click + Add another role.
    4. Search for and select Cloud Run Invoker.
    5. Click + Add another role.
    6. Search for and select Cloud Functions Invoker.
  6. Click Continue.
  7. Click Done.

These roles are required for:

  • Storage Object Admin: Write logs to GCS bucket and manage state files
  • Cloud Run Invoker: Allow Pub/Sub to invoke the function
  • Cloud Functions Invoker: Allow function invocation
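
As an alternative to the console steps, a gcloud sketch along the following lines creates the service account and grants the same roles at the project level. PROJECT_ID is a placeholder for your project.

  # Create the service account
  gcloud iam service-accounts create securelink-logs-collector-sa \
    --display-name="SecureLink logs collector"

  # Grant the roles listed above
  for ROLE in roles/storage.objectAdmin roles/run.invoker roles/cloudfunctions.invoker; do
    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member="serviceAccount:securelink-logs-collector-sa@PROJECT_ID.iam.gserviceaccount.com" \
      --role="${ROLE}"
  done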

Grant IAM permissions on GCS bucket

Grant the service account write permissions on the GCS bucket:

  1. Go to Cloud Storage > Buckets.
  2. Click on your bucket name (for example, securelink-logs).
  3. Go to the Permissions tab.
  4. Click Grant access.
  5. Provide the following configuration details:
    • Add principals: Enter the service account email (for example, securelink-logs-collector-sa@PROJECT_ID.iam.gserviceaccount.com)
    • Assign roles: Select Storage Object Admin
  6. Click Save.
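
The same binding can be applied with gcloud; the bucket name and project ID below are examples.

  # Grant the collector service account write access on the bucket
  gcloud storage buckets add-iam-policy-binding gs://securelink-logs \
    --member="serviceAccount:securelink-logs-collector-sa@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/storage.objectAdmin"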

Create Pub/Sub topic

Create a Pub/Sub topic that Cloud Scheduler will publish to and the Cloud Run function will subscribe to.

  1. In the GCP Console, go to Pub/Sub > Topics.
  2. Click Create topic.
  3. Provide the following configuration details:
    • Topic ID: Enter securelink-logs-trigger
    • Leave other settings as default
  4. Click Create.
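
Equivalently, from the command line:

  # Create the trigger topic
  gcloud pubsub topics create securelink-logs-trigger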

Create Cloud Run function to collect logs

The Cloud Run function will be triggered by Pub/Sub messages from Cloud Scheduler to fetch logs from the SecureLink REST API and write them to GCS.

  1. In the GCP Console, go to Cloud Run.
  2. Click Create service.
  3. Select Function (use an inline editor to create a function).
  4. In the Configure section, provide the following configuration details:

    • Service name: securelink-logs-collector
    • Region: Select a region matching your GCS bucket (for example, us-central1)
    • Runtime: Select Python 3.12 or later
  5. In the Trigger (optional) section:

    1. Click + Add trigger.
    2. Select Cloud Pub/Sub.
    3. In Select a Cloud Pub/Sub topic, choose the topic securelink-logs-trigger.
    4. Click Save.
  6. In the Authentication section:

    1. Select Require authentication.
    2. Check Identity and Access Management (IAM).
  7. Scroll down and expand Containers, Networking, Security.

  8. Go to the Security tab:

    • Service account: Select the service account securelink-logs-collector-sa.
  9. Go to the Containers tab:

    1. Click Variables & Secrets.
    2. Click + Add variable for each environment variable:
    • GCS_BUCKET: securelink-logs (GCS bucket name)
    • GCS_PREFIX: securelink (prefix for log files)
    • STATE_KEY: securelink/state.json (state file path)
    • SECURELINK_API_KEY: your-api-key (SecureLink API key)
    • SECURELINK_ORG_ID: your-org-id (SecureLink organization ID)
    • SECURELINK_API_BASE: https://api.securelink.com/v1 (SecureLink API base URL)
    • MAX_RECORDS: 5000 (maximum records per run)
    • PAGE_SIZE: 1000 (records per page)
    • LOOKBACK_HOURS: 24 (initial lookback period in hours)
  10. In the Variables & Secrets section, scroll down to Requests:

    • Request timeout: Enter 600 seconds (10 minutes)
  11. Go to the Settings tab:

    • In the Resources section:
      • Memory: Select 512 MiB or higher
      • CPU: Select 1
  12. In the Revision scaling section:

    • Minimum number of instances: Enter 0
    • Maximum number of instances: Enter 100 (or adjust based on expected load)
  13. Click Create.

  14. Wait for the service to be created (1-2 minutes).

  15. After the service is created, the inline code editor will open automatically.
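
Optionally, you can confirm the service configuration (including environment variables) from the command line once it exists; the region below is an example.

  # Inspect the deployed service configuration
  gcloud run services describe securelink-logs-collector --region=us-central1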

Add function code

  1. Enter main in the Entry point field.
  2. In the inline code editor, create two files:

    • First file main.py:

      import functions_framework
      from google.cloud import storage
      import json
      import os
      import urllib3
      from datetime import datetime, timezone, timedelta
      import time
      
      # Initialize HTTP client with timeouts
      http = urllib3.PoolManager(
        timeout=urllib3.Timeout(connect=5.0, read=30.0),
        retries=False,
      )
      
      # Initialize Storage client
      storage_client = storage.Client()
      
      # Environment variables
      GCS_BUCKET = os.environ.get('GCS_BUCKET')
      GCS_PREFIX = os.environ.get('GCS_PREFIX', 'securelink')
      STATE_KEY = os.environ.get('STATE_KEY', 'securelink/state.json')
      SECURELINK_API_KEY = os.environ.get('SECURELINK_API_KEY')
      SECURELINK_ORG_ID = os.environ.get('SECURELINK_ORG_ID')
      SECURELINK_API_BASE = os.environ.get('SECURELINK_API_BASE', 'https://api.securelink.com/v1')
      MAX_RECORDS = int(os.environ.get('MAX_RECORDS', '5000'))
      PAGE_SIZE = int(os.environ.get('PAGE_SIZE', '1000'))
      LOOKBACK_HOURS = int(os.environ.get('LOOKBACK_HOURS', '24'))
      
      def parse_datetime(value: str) -> datetime:
        """Parse ISO datetime string to datetime object."""
        if value.endswith("Z"):
          value = value[:-1] + "+00:00"
        return datetime.fromisoformat(value)
      
      @functions_framework.cloud_event
      def main(cloud_event):
        """
        Cloud Run function triggered by Pub/Sub to fetch SecureLink
        audit and session logs and write to GCS.
      
        Args:
          cloud_event: CloudEvent object containing Pub/Sub message
        """
      
        if not all([GCS_BUCKET, SECURELINK_API_KEY, SECURELINK_ORG_ID]):
          print('Error: Missing required environment variables')
          return
      
        try:
          bucket = storage_client.bucket(GCS_BUCKET)
      
          # Load state
          state = load_state(bucket, STATE_KEY)
      
          # Determine time window
          now = datetime.now(timezone.utc)
          last_time = None
      
          if isinstance(state, dict) and state.get("last_event_time"):
            try:
              last_time = parse_datetime(state["last_event_time"])
              # Overlap by 2 minutes to catch any delayed events
              last_time = last_time - timedelta(minutes=2)
            except Exception as e:
              print(f"Warning: Could not parse last_event_time: {e}")
      
          if last_time is None:
            last_time = now - timedelta(hours=LOOKBACK_HOURS)
      
          print(f"Fetching logs from {last_time.isoformat()} to {now.isoformat()}")
      
          # Fetch logs from multiple endpoints
          all_records = []
          newest_event_time = None
      
          for endpoint_type in ['audit-logs', 'sessions']:
            records, newest_time = fetch_logs(
              endpoint_type=endpoint_type,
              start_time=last_time,
              end_time=now,
              page_size=PAGE_SIZE,
              max_records=MAX_RECORDS,
            )
            all_records.extend(records)
            if newest_time:
              if newest_event_time is None or parse_datetime(newest_time) > parse_datetime(newest_event_time):
                newest_event_time = newest_time
      
          if not all_records:
            print("No new log records found.")
            save_state(bucket, STATE_KEY, now.isoformat())
            return
      
          # Write to GCS as NDJSON
          timestamp = now.strftime('%Y%m%d_%H%M%S')
          object_key = f"{GCS_PREFIX}/logs_{timestamp}.ndjson"
          blob = bucket.blob(object_key)
      
          ndjson = '\n'.join([json.dumps(record, ensure_ascii=False) for record in all_records]) + '\n'
          blob.upload_from_string(ndjson, content_type='application/x-ndjson')
      
          print(f"Wrote {len(all_records)} records to gs://{GCS_BUCKET}/{object_key}")
      
          # Update state with newest event time
          if newest_event_time:
            save_state(bucket, STATE_KEY, newest_event_time)
          else:
            save_state(bucket, STATE_KEY, now.isoformat())
      
          print(f"Successfully processed {len(all_records)} records")
      
        except Exception as e:
          print(f'Error processing logs: {str(e)}')
          raise
      
      def load_state(bucket, key):
        """Load state from GCS."""
        try:
          blob = bucket.blob(key)
          if blob.exists():
            state_data = blob.download_as_text()
            return json.loads(state_data)
        except Exception as e:
          print(f"Warning: Could not load state: {e}")
      
        return {}
      
      def save_state(bucket, key, last_event_time_iso: str):
        """Save the last event timestamp to GCS state file."""
        try:
          state = {'last_event_time': last_event_time_iso}
          blob = bucket.blob(key)
          blob.upload_from_string(
            json.dumps(state, indent=2),
            content_type='application/json'
          )
          print(f"Saved state: last_event_time={last_event_time_iso}")
        except Exception as e:
          print(f"Warning: Could not save state: {e}")
      
      def fetch_logs(endpoint_type: str, start_time: datetime, end_time: datetime, page_size: int, max_records: int):
        """
        Fetch logs from SecureLink REST API
        with cursor-based pagination and rate limiting.
      
        Args:
          endpoint_type: API endpoint type (audit-logs, sessions)
          start_time: Start time for log query
          end_time: End time for log query
          page_size: Number of records per page
          max_records: Maximum total records to fetch
      
        Returns:
          Tuple of (records list, newest_event_time ISO string)
        """
        api_base = SECURELINK_API_BASE.rstrip('/')
        endpoint = f"{api_base}/{endpoint_type}"
      
        headers = {
          'Authorization': f'Bearer {SECURELINK_API_KEY}',
          'X-Organization-Id': SECURELINK_ORG_ID,
          'Accept': 'application/json',
          'User-Agent': 'GoogleSecOps-SecureLinkCollector/1.0'
        }
      
        records = []
        newest_time = None
        page_num = 0
        backoff = 1.0
        cursor = None
      
        start_iso = start_time.strftime('%Y-%m-%dT%H:%M:%SZ')
        end_iso = end_time.strftime('%Y-%m-%dT%H:%M:%SZ')
      
        while True:
          page_num += 1
      
          if len(records) >= max_records:
            print(f"Reached max_records limit ({max_records}) for {endpoint_type}")
            break
      
          url = f"{endpoint}?startDate={start_iso}&endDate={end_iso}&limit={min(page_size, max_records - len(records))}"
          if cursor:
            url += f"&cursor={cursor}"
      
          try:
            response = http.request('GET', url, headers=headers)
      
            # Handle rate limiting with exponential backoff
            if response.status == 429:
              retry_after = int(response.headers.get('Retry-After', str(int(backoff))))
              print(f"Rate limited (429). Retrying after {retry_after}s...")
              time.sleep(retry_after)
              backoff = min(backoff * 2, 30.0)
              continue
      
            backoff = 1.0
      
            if response.status != 200:
              print(f"HTTP Error: {response.status}")
              response_text = response.data.decode('utf-8')
              print(f"Response body: {response_text}")
              return records, newest_time
      
            data = json.loads(response.data.decode('utf-8'))
      
            page_results = data.get('data', data.get('results', data.get('items', [])))
      
            if not page_results:
              print(f"No more results (empty page) for {endpoint_type}")
              break
      
            print(f"{endpoint_type} page {page_num}: Retrieved {len(page_results)} events")
      
            # Add endpoint type for identification
            for event in page_results:
              event['_securelink_log_type'] = endpoint_type
      
            records.extend(page_results)
      
            # Track newest event time
            for event in page_results:
              try:
                event_ts = event.get('timestamp') or event.get('createdAt') or event.get('startTime')
                if event_ts:
                  event_time = str(event_ts)
                  if newest_time is None or parse_datetime(event_time) > parse_datetime(newest_time):
                    newest_time = event_time
              except Exception as e:
                print(f"Warning: Could not parse event time: {e}")
      
            # Check for next cursor
            cursor = data.get('cursor') or data.get('nextCursor') or data.get('next_cursor')
            if not cursor:
              if len(page_results) < page_size:
                print(f"No more pages for {endpoint_type} (last page not full)")
              else:
                print(f"No more pages for {endpoint_type} (no cursor)")
              break
      
          except Exception as e:
            print(f"Error fetching {endpoint_type} logs: {e}")
            return records, newest_time
      
        print(f"Retrieved {len(records)} total {endpoint_type} records from {page_num} pages")
        return records, newest_time
      
    • Second file requirements.txt:

      functions-framework==3.*
      google-cloud-storage==2.*
      urllib3>=2.0.0
      
  3. Click Deploy to save and deploy the function.

  4. Wait for deployment to complete (2-3 minutes).
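
Optionally, you can exercise the function locally with the Functions Framework before relying on the deployed service. This sketch assumes you have Application Default Credentials configured locally (gcloud auth application-default login) and that the required environment variables are exported in the shell; the CloudEvent payload is a minimal Pub/Sub-style message whose contents the function ignores.

  # Install dependencies and start the function locally
  pip install -r requirements.txt
  export GCS_BUCKET="securelink-logs" SECURELINK_API_KEY="your-api-key" SECURELINK_ORG_ID="your-org-id"
  functions-framework --target=main --signature-type=cloudevent --port=8080

  # In a second terminal, send a minimal Pub/Sub-style CloudEvent ("e30=" is base64 for "{}")
  curl -s -X POST http://localhost:8080 \
    -H "Content-Type: application/json" \
    -H "ce-id: 1" -H "ce-specversion: 1.0" \
    -H "ce-type: google.cloud.pubsub.topic.v1.messagePublished" \
    -H "ce-source: //pubsub.googleapis.com/" \
    -d '{"message": {"data": "e30=", "messageId": "1"}}'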

Create Cloud Scheduler job

Cloud Scheduler will publish messages to the Pub/Sub topic at regular intervals, triggering the Cloud Run function.

  1. In the GCP Console, go to Cloud Scheduler.
  2. Click Create Job.
  3. Provide the following configuration details:

    • Name: securelink-logs-collector-hourly
    • Region: Select the same region as the Cloud Run function
    • Frequency: 0 * * * * (every hour, on the hour)
    • Timezone: Select a timezone (UTC recommended)
    • Target type: Pub/Sub
    • Topic: Select the topic securelink-logs-trigger
    • Message body: {} (empty JSON object)
  4. Click Create.
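
The same job can be created with gcloud; the location and time zone below are example values.

  # Create an hourly Pub/Sub scheduler job
  gcloud scheduler jobs create pubsub securelink-logs-collector-hourly \
    --location=us-central1 \
    --schedule="0 * * * *" \
    --time-zone="Etc/UTC" \
    --topic=securelink-logs-trigger \
    --message-body="{}"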

Schedule frequency options

Choose frequency based on log volume and latency requirements:

  • Every 5 minutes (*/5 * * * *): High-volume, low-latency
  • Every 15 minutes (*/15 * * * *): Medium volume
  • Every hour (0 * * * *): Standard (recommended)
  • Every 6 hours (0 */6 * * *): Low volume, batch processing
  • Daily (0 0 * * *): Historical data collection

Test the integration

  1. In the Cloud Scheduler console, find your job.
  2. Click Force run to trigger the job manually.
  3. Wait a few seconds.
  4. Go to Cloud Run > Services.
  5. Click on securelink-logs-collector.
  6. Click the Logs tab.
  7. Verify the function executed successfully. Look for:

    Fetching logs from YYYY-MM-DDTHH:MM:SS+00:00 to YYYY-MM-DDTHH:MM:SS+00:00
    audit-logs page 1: Retrieved X events
    sessions page 1: Retrieved X events
    Wrote X records to gs://securelink-logs/securelink/logs_YYYYMMDD_HHMMSS.ndjson
    Successfully processed X records
    
  8. Go to Cloud Storage > Buckets.

  9. Click on your bucket name (securelink-logs).

  10. Navigate to the securelink/ folder.

  11. Verify that a new .ndjson file was created with the current timestamp.
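
You can also trigger and verify a run from the command line; the job name, location, and bucket path below are the example values used earlier.

  # Trigger the scheduler job manually and list the newest objects
  gcloud scheduler jobs run securelink-logs-collector-hourly --location=us-central1
  gcloud storage ls gs://securelink-logs/securelink/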

If you see errors in the logs:

  • HTTP 401: Check API key and organization ID in environment variables
  • HTTP 403: Verify API key has required permissions in SecureLink admin portal
  • HTTP 429: Rate limiting; the function automatically retries with exponential backoff
  • Missing environment variables: Check all required variables are set

Configure a feed in Google SecOps to ingest SecureLink logs

  1. Go to SIEM Settings > Feeds.
  2. Click Add New Feed.
  3. Click Configure a single feed.
  4. In the Feed name field, enter a name for the feed (for example, SecureLink Logs).
  5. Select Google Cloud Storage V2 as the Source type.
  6. Select SecureLink as the Log type.
  7. Click Get Service Account. A unique service account email will be displayed, for example:

    chronicle-12345678@chronicle-gcp-prod.iam.gserviceaccount.com
    
  8. Copy this email address.

  9. Click Next.

  10. Specify values for the following input parameters:

    • Storage bucket URL: Enter the GCS bucket URI with the prefix path:

      gs://securelink-logs/securelink/
      
      • Replace:
        • securelink-logs: Your GCS bucket name.
        • securelink: Optional prefix/folder path where logs are stored (leave empty for root).
    • Source deletion option: Select the deletion option according to your preference:

      • Never: Never deletes any files after transfers (recommended for testing).
      • Delete transferred files: Deletes files after successful transfer.
      • Delete transferred files and empty directories: Deletes files and empty directories after successful transfer.

    • Maximum File Age: Include files modified in the last number of days (default is 180 days)

    • Asset namespace: The asset namespace

    • Ingestion labels: The label to be applied to the events from this feed

  11. Click Next.

  12. Review your new feed configuration in the Finalize screen, and then click Submit.

Grant IAM permissions to the Google SecOps service account

The Google SecOps service account needs Storage Object Viewer role on your GCS bucket.

  1. Go to Cloud Storage > Buckets.
  2. Click on your bucket name.
  3. Go to the Permissions tab.
  4. Click Grant access.
  5. Provide the following configuration details:
    • Add principals: Paste the Google SecOps service account email
    • Assign roles: Select Storage Object Viewer
  6. Click Save.
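
If you prefer gcloud, the binding looks like the following; replace the member with the service account email you copied during feed setup.

  # Grant the Google SecOps service account read access on the bucket
  gcloud storage buckets add-iam-policy-binding gs://securelink-logs \
    --member="serviceAccount:chronicle-12345678@chronicle-gcp-prod.iam.gserviceaccount.com" \
    --role="roles/storage.objectViewer"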

UDM mapping table

Log Field | UDM Mapping | Logic
additional_classification | additional.fields | Any important vendor-specific event data that cannot be adequately represented within the formal sections of the UDM model
mechanism | extensions.auth.mechanism | Authentication mechanism
N/A | extensions.auth.type | Authentication type
msg, inner_message, desc | metadata.description | Description of the event
N/A | metadata.event_type | Type of event (e.g., USER_LOGIN, NETWORK_CONNECTION)
resource_name, _application_protocol | network.application_protocol | Application protocol (e.g., HTTP, FTP)
method | network.http.method | HTTP method (e.g., GET, POST)
N/A | network.ip_protocol | IP protocol (e.g., TCP, UDP)
sessionId | network.session_id | Session ID
principal_host | principal.hostname | Source hostname
login_ip, src_ip, inner_ip, principal_host | principal.ip | Source IP address of the connection
principal_port | principal.port | Source port number
process_id | principal.process.pid | Process ID
principal_uri | principal.url | URL associated with the principal
username, key | principal.user.email_addresses | Email addresses associated with the user
username, key, user_id | principal.user.userid | User ID
security_action | security_result.action | Action taken (e.g., ALLOW, BLOCK)
category | security_result.category | Category of the security result
action, description | security_result.description | Description of the security result
priority | security_result.priority | Priority of the security result
method | security_result.severity | Severity of the security result
priority | security_result.severity_details | Severity details
reason | security_result.summary | Summary of the security result
application, product_event_type | target.application | Application name
pwd | target.file.full_path | Full path of the file
target_host | target.hostname | Destination hostname
dst_ip, target_host | target.ip | Destination IP address
target_port | target.port | Destination port number
command | target.process.command_line | Command line of the process
N/A | target.resource.type | Type of resource
url | target.url | URL associated with the target
username, key | target.user.email_addresses | Email addresses associated with the target user
name | target.user.user_display_name | Display name of the user
username, key, user_id | target.user.userid | User ID of the target user
N/A | metadata.product_name | Product name
N/A | metadata.vendor_name | Vendor/company name

Need more help? Get answers from Community members and Google SecOps professionals.