Optimizely flags - payload way too big

We recently turned on two flags in Snowplow because we want to log Optimizely experiment and variation IDs. Does anyone know why the payload is so big? We don't need to log the actual Optimizely code snippet in Snowplow; all we want to log is the experiment and variation IDs. If you look at a single event in the raw Snowplow logs, it's huge.

Thanks for any help.

As you can see, we are getting this error in the Snowplow Chrome extension, and the Snowplow ETL is giving us some trouble as well, so I'm wondering if our ETL problems could be tied back to when we made the Optimizely flags live 4-5 days ago.

This is what we have in the JS on our website:

  optimizelyExperiments: true,
  optimizelyVariations: true,
  optimizelyVisitor: true,
  optimizelyAudiences: true

Even being able to drop the "code" field from the schema, so it never gets logged or reaches the ETL, would help.

Hi @mjensen - we recommend enabling the optimizelySummary context rather than the other Optimizely contexts you've tried, for exactly that reason: they grab a lot of data that simply isn't helpful. You can see the release notes associated with the summary context here:


Hat tip to @digdeep for showing us how to integrate Optimizely more elegantly.
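For reference, the swap amounts to something like this (a sketch only, not a full tracker config; the `window.GAplow` tracker name and collector endpoint are taken from the config posted later in this thread):

```javascript
// Before: four verbose Optimizely contexts, each pulling large payloads.
// After: the single summary context with just experiment/variation info.
window.GAplow('newTracker', 'cf-hosted-v2.7.0', 'sp.generalassemb.ly', {
  appId: 'WebSiteOld',
  contexts: {
    webPage: true,
    // optimizelyExperiments: true,  // removed
    // optimizelyVariations: true,   // removed
    // optimizelyVisitor: true,      // removed
    // optimizelyAudiences: true,    // removed
    optimizelySummary: true
  }
});
```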


Thanks, let me have the team change the code so we can test.

This is the change we're making:

@yali thanks for the suggestion, and @digdeep thanks for the work. We have this in staging now and are getting errors: for some reason the JS tracker (we are on 2.8.0) is sending activeExperimentId as an integer instead of a string. JSON schema 1-0-0 clearly declares both the experiment and variation IDs as strings with maxLength 12.

[{'level': 'error', 'message': 'error: instance type (integer) does not match any allowed primitive type (allowed: ["string"])\n level: "error"\n schema: {"loadingURI":"#","pointer":"/properties/activeExperimentId"}\n instance: {"pointer":"/activeExperimentId"}\n domain: "validation"\n keyword: "type"\n found: "integer"\n expected: ["string"]\n'}], 'failure_tstamp': '2017-07-03T19:15:54.455Z'}

You can clearly see that this is being sent as an integer from the JS instead of a string; the variation ID is being sent correctly as a string, though.

{\"schema\":\"iglu:com.optimizely.snowplow/optimizely_summary/jsonschema/1-0-0\",\"data\":{\"activeExperimentId\":8251671025,\"variation\":\"8247457437\",\"conditional\":false,\"manual\":true,\"name\":\"ABTST-75 Homepage Modal (Han)\"}}]}","vp":"249x716","ds":"249x7739","vid":"195","sid":"a2b7a72e-3237-47f9-be31-626f2cde9313","duid":"7281e18d-eb2a-4f8f-9dc6-2cc103a49c5b","fp":"767470003","stm":"1499105351708"}]}', headers=['Connection: upgrade', 'Host: sp.generalassemb.ly', 'X-Real-IP:', 'X-Forwarded-For:,', 'Content-Length: 1665', 'Accept: /', 'Accept-Encoding: gzip, deflate, br', 'Accept-Language: en-US, en;q=0.8\
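To make the mismatch concrete, here is a minimal sketch using the IDs from the payload above (plain JavaScript, no tracker involved):

```javascript
// The summary context data as the tracker currently emits it:
const summary = {
  activeExperimentId: 8251671025, // emitted as a JSON number (integer)
  variation: "8247457437"         // emitted as a JSON string
};

// The 1-0-0 schema declares both fields as "type": "string", so the
// first fails validation while the second passes.
console.log(typeof summary.activeExperimentId); // "number"
console.log(typeof summary.variation);          // "string"
```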

Live on staging:

  window.GAplow('newTracker', 'cf-hosted-v2.7.0', 'sp.generalassemb.ly', {
    appId: 'WebSiteOld',
    platform: 'web',
    cookieDomain: '.generalassemb.ly',
    discoverRootDomain: true,
    cookieName: "sp",
    encodeBase64: false,
    respectDoNotTrack: true,
    userFingerprint: true,
    userFingerprintSeed: 6485926835,
    pageUnloadTimer: 0,
    forceSecureTracker: false,
    stateStorageStrategy: 'cookie',
    post: true,
    bufferSize: 1,
    maxPostBytes: 45000,
    cookieLifetime: 86400 * 180,
    contexts: {
      webPage: true,
      gaCookies: true,
      optimizelySummary: true
    }
  });
  window.GAplow('enableActivityTracking', 30, 30);

Ouch… Not seen that before - interested to know if other Optimizely users have the same issue? (Not had it reported before.) Are you getting it consistently for all experiments? If so we should raise a ticket in https://github.com/snowplow/snowplow-javascript-tracker/issues.

In the meantime a quick workaround would be to:

  1. Make a local copy of the Optimizely summary context schema and change the field type to ["string", "integer"]
  2. Upload the schema to your local Iglu registry
  3. Make sure your Iglu resolver config is set so that your own registry has a higher priority than Iglu Central, as documented here. (Higher priority means a lower priority number, so give your local registry a lower number than Iglu Central.)
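For step 3, a resolver config along these lines should work (a sketch: the registry name and `iglu.mycompany.com` URI are placeholders for your own registry; note the lower `priority` number on the local registry):

```json
{
  "schema": "iglu:com.snowplowanalytics.iglu/resolver-config/jsonschema/1-0-1",
  "data": {
    "cacheSize": 500,
    "repositories": [
      {
        "name": "My Company Registry",
        "priority": 0,
        "vendorPrefixes": [ "com.optimizely.snowplow" ],
        "connection": { "http": { "uri": "http://iglu.mycompany.com" } }
      },
      {
        "name": "Iglu Central",
        "priority": 1,
        "vendorPrefixes": [ "com.snowplowanalytics" ],
        "connection": { "http": { "uri": "http://iglucentral.com" } }
      }
    ]
  }
}
```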

The updated schema should be something like this:

  {
    "$schema": "http://iglucentral.com/schemas/com.snowplowanalytics.self-desc/schema/jsonschema/1-0-0#",
    "self": {
      "vendor": "com.optimizely.snowplow",
      "name": "optimizely_summary",
      "format": "jsonschema",
      "version": "1-0-0"
    },
    "type": "object",
    "properties": {
      "activeExperimentId": {
        "type": ["string", "integer"],
        "maxLength": 12
      },
      "variation": {
        "type": "string",
        "maxLength": 12
      },
      "conditional": {
        "type": "boolean"
      },
      "manual": {
        "type": "boolean"
      },
      "name": {
        "type": "string",
        "maxLength": 256
      }
    },
    "additionalProperties": false
  }

Thanks @yali, that was my thought as well: make the JSON schema local and change it there. Let me test a little more as well.

Let us know how you go! Thanks @mjensen

Hello @mjensen,

I vaguely remember that when we were adding the Optimizely summary context to JS Tracker 2.7.0, these ID strings that always hold numeric values confused us as well. I asked for clarification on the Optimizely support forums and they confirmed that the ID would indeed always be a string (and in the past could contain non-numeric characters). I'm not 100% sure it was specifically about activeExperimentId, but it was one of the summary context's properties for sure.

I have a strong feeling that they changed this behavior recently and we need to reflect that change in our schema.


@yali thanks, we're good now and live with the new Optimizely summary schema. @anton yes, and thank you so much for helping us. Glad this is rolling.

dev=> select count(*) from atomic.com_optimizely_snowplow_optimizely_summary_1;

(1 row)

I decided to allow both integer and string for both IDs, to be safe.

Hello @mjensen,

FYI, we just released JS Tracker 2.8.1-rc1 with the Optimizely summary fix. It is available at http://d1fc8wv8zag5ca.cloudfront.net/2.8.1-rc1/sp.js.

We decided to cast the number to a string, so nobody has to change their schema.
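Presumably the fix amounts to something like the following (a sketch of the idea only, not the actual tracker code; `toSummaryId` is a made-up helper name):

```javascript
// Coerce the numeric ID Optimizely now returns into the string
// the 1-0-0 schema expects; leave null/undefined untouched.
function toSummaryId(id) {
  return id == null ? null : String(id);
}

console.log(toSummaryId(8251671025));   // "8251671025"
console.log(toSummaryId("8247457437")); // "8247457437" (already a string)
console.log(toSummaryId(null));         // null
```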


I see 2.8.1 is out. We will go with this in production now. Thanks!