camunda BPM / CAM-9612

Bulk-fetch complex/object variable values in the context of historic detail data

    Details

      Description

      AT:

      • given:
        • I have more than 100 million variables in the Camunda engine
        • at least 10 % of the variables are complex variables (e.g. JSON or XML)
      • when:
        • I fetch 10 000 historic variable updates from the historic detail table using the Optimize REST API or the historic detail endpoint
      • then:
        • it does not take more than 2 seconds
      • such that:
        • Even if the user has many complex variables, Optimize can import the data quickly

      Hints:
      When bulk fetching historic variable updates from the historic detail table, the current implementation (e.g. HistoricDetailQueryImpl#executeList and OptimizeHistoricVariableUpdateQueryCmd#fetchVariableValues) sequentially calls getTypedValue for each variable entry. For every complex/object variable, this performs one additional query in AbstractSerializableValueSerializer#readValue to resolve the actual value.
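The effect of that access pattern can be sketched as follows. This is a hypothetical simulation, not the engine's actual code: the class, record, and method names are illustrative stand-ins for the row mapping and for AbstractSerializableValueSerializer#readValue, and only the query counting matters here.

```java
import java.util.*;

// Hypothetical sketch of the N+1 pattern described above.
public class NPlusOneSketch {
    static int queryCount = 0;

    // Stand-in for a row from the historic detail table.
    record VariableRow(String name, boolean isComplex, long byteArrayId) {}

    // Simulates resolving a serialized value: one extra DB round trip
    // per complex variable, as readValue does today.
    static byte[] fetchByteArray(long id) {
        queryCount++;
        return new byte[0];
    }

    static void resolveSequentially(List<VariableRow> rows) {
        queryCount = 1;               // the initial bulk query for the rows
        for (VariableRow row : rows) {
            if (row.isComplex()) {
                fetchByteArray(row.byteArrayId()); // N additional queries
            }
        }
    }

    public static void main(String[] args) {
        List<VariableRow> rows = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) {
            // roughly 10% complex variables, matching the AT above
            rows.add(new VariableRow("var" + i, i % 10 == 0, i));
        }
        resolveSequentially(rows);
        System.out.println(queryCount); // prints 1001: 1 bulk + 1000 extra queries
    }
}
```

With 10 000 rows and 10% complex variables, one page fetch turns into 1 001 round trips, which is consistent with the roughly linear response times measured below.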

      This does not scale to maxResults values in the magnitude of several thousand, as used by Optimize when importing data:

      maxResults -> response time
          100    -> 400 ms
          200    -> 640 ms
          500    -> 1 s
        1 000    -> 2 s
       10 000    -> 20 s

      We need a better-scaling implementation of variable fetching for this use case, e.g. one that fetches all byteArray entries in a single bulk query.
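One possible shape of such a bulk fetch is sketched below. This is an assumption about the approach, not the implemented fix: class and method names are illustrative, and fetchByteArrays stands in for a single query such as SELECT ... WHERE ID_ IN (...).

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical sketch of the proposed bulk fetch: collect all byteArray
// ids from the fetched rows and resolve them in one query, instead of
// one query per complex variable.
public class BulkFetchSketch {
    static int queryCount = 0;

    record VariableRow(String name, boolean isComplex, long byteArrayId) {}

    // Simulates a single bulk query for all serialized payloads.
    static Map<Long, byte[]> fetchByteArrays(Set<Long> ids) {
        queryCount++;
        return ids.stream()
                  .collect(Collectors.toMap(id -> id, id -> new byte[0]));
    }

    static void resolveInBulk(List<VariableRow> rows) {
        queryCount = 1;               // the initial query for the rows
        Set<Long> ids = rows.stream()
            .filter(VariableRow::isComplex)
            .map(VariableRow::byteArrayId)
            .collect(Collectors.toSet());
        Map<Long, byte[]> payloads = fetchByteArrays(ids);
        // attach payloads to their rows here; total queries stay at 2
    }

    public static void main(String[] args) {
        List<VariableRow> rows = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) {
            rows.add(new VariableRow("var" + i, i % 10 == 0, i));
        }
        resolveInBulk(rows);
        System.out.println(queryCount); // prints 2, independent of row count
    }
}
```

Note that a production version would have to chunk the id list, since some databases cap IN-list sizes (e.g. Oracle limits an expression list to 1 000 entries).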

        Activity

        No work has yet been logged on this issue.

          People

          • Assignee: Unassigned
          • Reporter: Sebastian Bathke (sebastian.bathke)
          • Votes: 0
          • Watchers: 2

