Logs

One feature that we recommend ForgeRock build is a page for viewing logs in the Identity Cloud platform. While working in the environment and troubleshooting, pulling logs is very cumbersome. As someone who has worked with more than one ForgeRock customer … this is the biggest complaint we hear. Even the ability to forward the raw logs on a continuous basis to a SIEM (not just the few that are currently integrated, like Splunk) would be helpful. If anyone has any easy workarounds that are friendly to non-developers, we are open to suggestions.


Hi @johnw19000, Thank you for the great feedback. A dedicated page for viewing logs or the ability to forward logs to a broader range of SIEMs would enhance the troubleshooting experience.

I suggest opening a support ticket and raising a Request for Enhancement (RFE) to address this, outlining the challenges, impact, and benefits. ForgeRock welcomes feedback and feature requests from our users, and our product managers and developers actively monitor these requests.
Submitting an RFE through our support channels is the best way to drive future product enhancements.

I hope this helps.
Many thanks!
Sheila

Hello John,

My team has also struggled with this for a while, but we have found a workaround: porting the logs via the REST interface into AWS CloudWatch and downstream into our enterprise Kibana stack.

Transporting all this data costs a bit of money, but you can use an AWS Lambda function in combination with the Boto3 CloudWatch Logs client, or a Filebeat setup that permanently tails the /monitoring/logs/tail endpoint; a rough sketch of the Lambda approach follows below.
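
To give an idea of the Lambda route, here is a minimal sketch (not our production code): the log group, stream, environment variable names, the handler name, the am-everything source, and the use of the requests library are my own assumptions, and it reuses the same API key/secret basic auth that the Filebeat config further down uses. In practice you would also carry the pagedResultsCookie between invocations so you neither miss nor duplicate entries.

# Hypothetical sketch: poll the Identity Cloud tail endpoint and forward the
# results to CloudWatch Logs. All names below are placeholders.
import json
import os
import time

import boto3
import requests  # not in the default Lambda runtime; bundle it with the deployment package

TENANT_URL = os.environ["FIDC_TENANT_URL"]              # e.g. https://<tenant>.forgeblocks.com
API_KEY = os.environ["FIDC_API_KEY_ID"]
API_SECRET = os.environ["FIDC_API_KEY_SECRET"]
LOG_GROUP = os.environ.get("LOG_GROUP", "/fidc/audit")  # assumed to exist already
LOG_STREAM = os.environ.get("LOG_STREAM", "tail")

cw_logs = boto3.client("logs")


def ensure_stream():
    # Create the log stream once; ignore the error if it already exists.
    try:
        cw_logs.create_log_stream(logGroupName=LOG_GROUP, logStreamName=LOG_STREAM)
    except cw_logs.exceptions.ResourceAlreadyExistsException:
        pass


def handler(event, context):
    # Lambda entry point: fetch one page of the tail and push it to CloudWatch.
    ensure_stream()
    resp = requests.get(
        f"{TENANT_URL}/monitoring/logs/tail",
        params={"source": "am-everything"},  # pick the sources you need
        auth=(API_KEY, API_SECRET),          # API key ID/secret, as in the Filebeat config
        timeout=60,
    )
    resp.raise_for_status()
    events = [
        {"timestamp": int(time.time() * 1000), "message": json.dumps(item)}
        for item in resp.json().get("result", [])
    ]
    if events:
        cw_logs.put_log_events(
            logGroupName=LOG_GROUP, logStreamName=LOG_STREAM, logEvents=events
        )
    return {"forwarded": len(events)}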

Don’t get me wrong, I also feel that ForgeRock and Ping can improve their product, but this solution direction helped my team become more agile, and that is why I hope it will do the same for yours.

Docs:

  1. CloudWatchLogs - Boto3 1.34.112 documentation
  2. Get audit and debug logs :: ForgeRock Identity Cloud Docs
  3. GitHub - lapinek/fidc-logs
  4. Example Filebeat config:
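# Two httpjson inputs: the first continuously tails /monitoring/logs/tail,
# the second pulls a time-bounded batch from /monitoring/logs.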
filebeat.inputs:
- type: httpjson
  config_version: 2
  enabled: ##FIDC_TAIL_ENABLED##
  interval: ##FIDC_TAIL_INTERVAL##
  tags: ["fidc"]
  fields_under_root: true
  publisher_pipeline.disable_host: true
  request.url: ##FIDC_TENANT_URL##/monitoring/logs/tail
  auth.basic:
    user: ##FIDC_API_KEY_ID##
    password: ##FIDC_API_KEY_SECRET##
  request.timeout: 1m
  request.transforms:
    - set:
        target: url.params.source
        value: '##FIDC_LOG_SOURCES##'
    - set:
        target: url.params._pagedResultsCookie
        value: '[[.last_response.body.pagedResultsCookie]]'
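  # Throttle polling according to the tenant's x-ratelimit response headers.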
  request.rate_limit:
    limit: '[[.last_response.header.Get "x-ratelimit-limit"]]'
    remaining: '[[.last_response.header.Get "x-ratelimit-remaining"]]'
    reset: '[[.last_response.header.Get "x-ratelimit-reset"]]'
  response.split:
    target: body.result
    type: array
    transforms:
      - set:
          target: body.tenant
          value: '##FIDC_TENANT_NAME##'
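# Second input: fetch a bounded window of historical logs from /monitoring/logs between beginTime and endTime.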
- type: httpjson
  config_version: 2
  enabled: ##FIDC_LOGS_ENABLED##
  interval: ##FIDC_LOGS_INTERVAL##
  tags: ["fidc"]
  fields_under_root: true
  publisher_pipeline.disable_host: true
  request.url: ##FIDC_TENANT_URL##/monitoring/logs
  auth.basic:
    user: ##FIDC_API_KEY_ID##
    password: ##FIDC_API_KEY_SECRET##
  request.timeout: 1m
  request.transforms:
    - set:
        target: url.params.source
        value: '##FIDC_LOG_SOURCES##'
    - set:
        target: url.params.beginTime
        value: '##FIDC_LOGS_BEGIN_TIME##'
    - set:
        target: url.params.endTime
        value: '##FIDC_LOGS_END_TIME##'
    - set:
        target: url.params._pagedResultsCookie
        value: '[[.last_response.body.pagedResultsCookie]]'
  request.rate_limit:
    limit: '[[.last_response.header.Get "x-ratelimit-limit"]]'
    remaining: '[[.last_response.header.Get "x-ratelimit-remaining"]]'
    reset: '[[.last_response.header.Get "x-ratelimit-reset"]]'
  response.split:
    target: body.result
    type: array
    transforms:
      - set:
          target: body.tenant
          value: '##FIDC_TENANT_NAME##'
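
# Shared processors: decode the JSON payload into the event root, use its own
# timestamp, and rename payload/source fields on text-type entries.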

processors:
  - decode_json_fields:
      fields: ["message"]
      process_array: true
      max_depth: 5
      target: ""
      overwrite_keys: true
      add_error_key: true
  - timestamp:
      field: timestamp
      ignore_failure: false
      layouts:
        - '2006-01-02T15:04:05.999999999Z'
      test:
        - '2021-03-16T16:39:40.410894588Z'
  - drop_fields:
      fields: ["timestamp"]
  - if:
      contains:
        type: "text"
    then:
      - rename:
          fields:
            - from: "payload"
              to: "text_payload"
            - from: "source"
              to: "fidc_source"
          ignore_missing: true
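
The ##…## tokens are template placeholders: fill in your tenant URL, log API key ID and secret, the log sources you want to pull, and the polling intervals and time window before starting Filebeat.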

Regards,
Roland
