Snowplow System Setup:
Collector: Scala Stream Collector
Tracker: JavaScript Tracker (I have triggered a PageView event)
Enricher: Stream Enrich
AWS setup:
2 Kinesis Data Streams, i.e. snowplow-collected-good-events-stream & snowplow-collected-bad-events-stream
2 Kinesis Data Firehose delivery streams, i.e. snowplow-enriched-good-events-firehose & snowplow-enriched-bad-events-firehose
2 S3 buckets for data storage, i.e. snowplow-enriched-good-events & snowplow-enriched-bad-events
OK, now: if I use Firehose without a Lambda, the “snowplow-enriched-good-events” S3 bucket shows files in TSV format, which is good. But if I attach an AWS Lambda as a transformation function (any Lambda), Firehose creates a “processing-failed” folder in the “snowplow-enriched-good-events” bucket, and the files inside contain:
{"attemptsMade":4,"arrivalTimestamp":1600154159619,"errorCode":"Lambda.FunctionError","errorMessage":"The Lambda function was successfully invoked but it returned an error result.","attemptEndingTimestamp":1600154235846,"rawData":"****","lambdaArn":"arn:aws:lambda:ap-southeast-2:573188294151:function:snowplow-json-transformer-lambda:$LATEST"}
{"attemptsMade":4,"arrivalTimestamp":1600154161523,"errorCode":"Lambda.FunctionError","errorMessage":"The Lambda function was successfully invoked but it returned an error result.","attemptEndingTimestamp":1600154235846,"rawData":"*****=","lambdaArn":"arn:aws:lambda:ap-southeast-2:573188294151:function:snowplow-json-transformer-lambda:$LATEST"}
And when I look in the “snowplow-enriched-bad-events” S3 bucket, there is a file with these contents:
{"schema":"iglu:com.snowplowanalytics.snowplow.badrows/collector_payload_format_violation/jsonschema/1-0-0","data":{"processor":{"artifact":"snowplow-stream-enrich","version":"1.0.0"},"failure":{"timestamp":"2020-09-15T07:16:02.488Z","loader":"thrift","message":{"error":"error deserializing raw event: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)"}},"payload":"****="}}
Help me!