Good morning! Our pipeline is currently set up with three loaders reading from our enriched stream:
- Elasticsearch
- S3
- PostgresDB
While events are being loaded into all three data stores, our custom context is only making it into two of them. For example, we can see the following context in both ES and S3. My understanding is that in PostgresDB a new table should be created and populated per context; however, that table is simply never created.
`contexts_com_my_org_app_context_1`

```json
{
  "env": "local"
}
```
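For reference, here is my understanding of how the loader derives a context's table name from the schema's vendor/name/version. This is a sketch of the naming convention as I understand it, not Snowplow's actual code:

```python
import re

def context_table_name(vendor: str, name: str, version: str) -> str:
    """Sketch of the table-naming convention: 'contexts_' + vendor with
    dots/dashes replaced by underscores + '_' + schema name in snake_case
    + '_' + the MODEL number (first component of the SchemaVer)."""
    model = version.split("-")[0]
    vendor_part = re.sub(r"[.\-]", "_", vendor).lower()
    # insert underscores before capital letters, then lowercase
    name_part = re.sub(r"([^A-Z_])([A-Z])", r"\1_\2", name).lower()
    return f"contexts_{vendor_part}_{name_part}_{model}"

print(context_table_name("my.org", "app_context", "1-0-0"))
# → contexts_my_org_app_context_1
```

(Worth noting: with vendor `my.org` this yields `contexts_my_org_app_context_1`, while the key we see in ES/S3 carries a `com_` prefix.)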
I’ve combed through the loader’s logs and found no errors. We are using the “atomic” database schema, as referenced in the Snowplow documentation, and several other enrichment and default context tables are populating correctly. Below is the configuration for the Postgres loader:
```hcl
module "postgres_loader_enriched" {
  source = "snowplow-devops/postgres-loader-kinesis-ec2/aws"

  name = "postgres-loader-enriched-server"

  vpc_id     = "<vpc_id>"
  subnet_ids = [<subnets>]

  in_stream_name = "enriched-stream"

  # Note: The purpose defines what the input data set should look like
  purpose = "ENRICHED_EVENTS"

  # Note: This schema is created automatically by the VM on launch
  schema_name = "atomic"

  ssh_key_name = "snowplow-ssh"

  db_sg_id    = "<sg_id>"
  db_host     = "<url>"
  db_port     = 5432
  db_name     = "enriched"
  db_username = "<user>"
  db_password = "<pass>"
}
```
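In case it helps, this is how I've been checking whether the table exists under any name in the `atomic` schema — a small sketch that just emits the SQL I run in `psql` against the `enriched` database (the `LIKE` pattern is my assumption based on the context name):

```python
# Sketch: look for any context table the loader may have created under a
# name I didn't expect. The pattern below is an assumption, not something
# from the Snowplow docs.
PATTERN = "contexts_%app_context%"

query = (
    "SELECT table_name FROM information_schema.tables "
    "WHERE table_schema = 'atomic' "
    f"AND table_name LIKE '{PATTERN}';"
)
print(query)
```

Running that query returns no rows, which matches what we see: the table is never created.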
This is the schema for the additional global context:
```json
{
  "$schema": "http://iglucentral.com/schemas/com.snowplowanalytics.self-desc/schema/jsonschema/1-0-0#",
  "self": {
    "vendor": "my.org",
    "name": "app_context",
    "format": "jsonschema",
    "version": "1-0-0"
  },
  "description": "Generic schema for additional data",
  "properties": {},
  "additionalProperties": true,
  "type": "object"
}
```
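As a sanity check that the self-describing wrapper itself is well-formed, I ran a quick structural check along these lines (just a sketch; `igluctl`'s lint command would be the authoritative check):

```python
import re

# The schema above, inlined as a dict for checking.
schema = {
    "$schema": "http://iglucentral.com/schemas/com.snowplowanalytics.self-desc/schema/jsonschema/1-0-0#",
    "self": {"vendor": "my.org", "name": "app_context",
             "format": "jsonschema", "version": "1-0-0"},
    "description": "Generic schema for additional data",
    "properties": {},
    "additionalProperties": True,
    "type": "object",
}

def check_self_describing(s: dict) -> list[str]:
    """Return a list of structural problems with the 'self' block
    (empty list = looks OK). A sketch, not the Iglu validator."""
    problems = []
    meta = s.get("self", {})
    for key in ("vendor", "name", "format", "version"):
        if key not in meta:
            problems.append(f"missing self.{key}")
    # SchemaVer is MODEL-REVISION-ADDITION, e.g. 1-0-0
    if not re.fullmatch(r"\d+-\d+-\d+", meta.get("version", "")):
        problems.append("version is not valid SchemaVer")
    return problems

print(check_self_describing(schema))  # → []
```

It comes back clean, so I don't believe the schema shape itself is the problem.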