Hi there,
I’m following Simo Ahava’s guide to set up Snowplow in GCP. I’m almost there, but I’m running into some issues when trying to set up a VM instance for the ETL process.
From what I understand, I need to add the following command to my startup script:
java -jar snowplow-bigquery-mutator-1.0.1.jar create --config $(cat bigquery_config.json | base64 -w 0) --resolver $(cat iglu_resolver.json | base64 -w 0)
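To make the context clearer, here is a minimal sketch of how I’m planning to wrap that in the startup script (the /snowplow path, and the assumption that both JSON files sit next to the jar, are just placeholders for wherever the files actually live on the instance):

#!/bin/bash
# Minimal sketch of the mutator "create" step in a GCE startup script.
# /snowplow is a placeholder path; adjust to wherever the jar and configs live.
set -e
cd /snowplow

# Base64-encode the two config files so they can be passed on the command line.
CONFIG=$(cat bigquery_config.json | base64 -w 0)
RESOLVER=$(cat iglu_resolver.json | base64 -w 0)

# Run the mutator's create step to set up the BigQuery table.
java -jar snowplow-bigquery-mutator-1.0.1.jar create \
  --config "$CONFIG" \
  --resolver "$RESOLVER"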
I’ve tried running it myself for testing purposes, and it returns this error:
Attempt to decode value on failed cursor: DownField(projectId)
DecodingFailure at .http.uri: Iglu Client Failure. Provided URI string violates RFC 2396: [Illegal character in path at index 1: ${iglu_server_url}]
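The second error makes it look like the ${iglu_server_url} placeholder in my iglu_resolver.json never got replaced with an actual URL. If I understand the resolver format correctly (I’m going from memory here, so the schema version and field names below are my assumption rather than copied from my file), the relevant part should end up looking something like this, with a real URI where the placeholder is:

{
  "schema": "iglu:com.snowplowanalytics.iglu/resolver-config/jsonschema/1-0-1",
  "data": {
    "cacheSize": 500,
    "repositories": [
      {
        "name": "Iglu Central",
        "priority": 0,
        "vendorPrefixes": ["com.snowplowanalytics"],
        "connection": {
          "http": {
            "uri": "http://iglucentral.com"
          }
        }
      }
    ]
  }
}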
The bigquery_config.json file appears to have been copied to my VM correctly, as has the snowplow-bigquery-mutator-1.0.1.jar file.
Here is my bigquery_config.json file:
{
  "schema": "iglu:com.snowplowanalytics.snowplow.storage/bigquery_config/jsonschema/1-0-0",
  "data": {
    "name": "Snowplow Page View Data",
    "id": "b19fea0e-380c-450c-ba33-d3b507173e25",
    "projectId": "myprojectid",
    "datasetId": "mydataset",
    "tableId": "pageviews",
    "input": "enriched-good-sub",
    "typesTopic": "bq-types",
    "typesSubscription": "bq-types-sub",
    "badRows": "bq-bad-rows",
    "failedInserts": "bq-failed-inserts",
    "load": {
      "mode": "STREAMING_INSERTS",
      "retry": false
    },
    "purpose": "ENRICHED_EVENTS"
  }
}
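Since the first error points at projectId, I also want to sanity-check that this file parses cleanly before it gets base64-encoded; assuming python3 is available on the VM, something like this should print the JSON back if it is valid and complain about the offending line otherwise:

python3 -m json.tool bigquery_config.json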
Thanks.