Hey Snowplow, we are following your Kinesis Stream Transformer docs and running into an issue with the SQS queue.
Problem
We can send test events through the pipeline (collector → enricher → transformer) and see them written to S3; however, no messages ever appear in the configured SQS queue.
Tried & Failed Fixes
- Gave the ECS task running the transformer very permissive IAM permissions.
- Attached a very permissive resource-based policy to the SQS queue.
Errors Encountered
```
software.amazon.awssdk.services.sqs.model.SqsException: Value shredding for parameter MessageGroupId is invalid. Reason: The request include parameter that is not valid for this queue type. (Service: Sqs, Status Code: 400, Request ID: 95402e98-c015-5186-bf35-75f9014fa56c, Extended Request ID: null)
```
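For what it's worth, this looks like the error the AWS SDK v2 returns when a SendMessage call carries a MessageGroupId but the target queue is a standard (non-FIFO) queue, which makes us wonder whether the transformer expects the SQS queue to be a FIFO queue. A minimal sketch of the kind of call that produces this class of error (the queue URL and payload below are placeholders, not our real values):

```scala
import software.amazon.awssdk.regions.Region
import software.amazon.awssdk.services.sqs.SqsClient
import software.amazon.awssdk.services.sqs.model.SendMessageRequest

object SqsRepro {
  def main(args: Array[String]): Unit = {
    val sqs = SqsClient.builder().region(Region.EU_WEST_1).build()

    // Hypothetical queue URL for illustration only.
    val queueUrl = "https://sqs.eu-west-1.amazonaws.com/123456789012/example-queue"

    // MessageGroupId is only valid for FIFO queues; sending it to a
    // standard queue fails with a 400 like the one shown above.
    val request = SendMessageRequest.builder()
      .queueUrl(queueUrl)
      .messageBody("""{"example": "payload"}""")
      .messageGroupId("shredding")
      .build()

    sqs.sendMessage(request)
  }
}
```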
Relevant Configuration
{ "input": { "type": "kinesis", "appName": ${?DDB_NAME}, "streamName": ${?INPUT_STREAM_NAME}, "region": "eu-west-1", "position": "TRIM_HORIZON" "retrievalMode": { "type": "Polling" "maxRecords": 10000 } "bufferSize": 3 } "output": { "path": ${?TRANSFORMED_ARCHIVE_PATH}, "compression": "GZIP", "region": "eu-west-1" } "windowing": "5 minutes" "queue": { "type": "sqs", "queueName": ${?SQS_QUEUE_NAME}, "region": "eu-west-1" } "formats": { "transformationType": "widerow", "default": "TSV", } "monitoring": { "metrics": { "cloudwatch": true } } … }
DDB_NAME (the name of the DynamoDB table, not its ARN), INPUT_STREAM_NAME (the name of the input Kinesis stream, not its ARN), TRANSFORMED_ARCHIVE_PATH (an S3 URI) and SQS_QUEUE_NAME (the name of the SQS queue, not its ARN) are configured as environment variables.
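In case it helps, here is a minimal sketch of how we understand the `${?VAR}` substitutions resolving from environment variables. This is just an illustration of the HOCON mechanism using the Lightbend Config library, not necessarily how the transformer itself loads its config:

```scala
import com.typesafe.config.ConfigFactory

object ConfigResolution {
  def main(args: Array[String]): Unit = {
    // A trimmed-down fragment of the config above; ${?SQS_QUEUE_NAME} is an
    // optional substitution that HOCON falls back to the environment for.
    val hocon =
      """
        |queue {
        |  type = "sqs"
        |  queueName = ${?SQS_QUEUE_NAME}
        |  region = "eu-west-1"
        |}
        |""".stripMargin

    val config = ConfigFactory.parseString(hocon).resolve()

    // If SQS_QUEUE_NAME is unset, the optional substitution silently drops
    // the key rather than failing at startup.
    if (config.hasPath("queue.queueName"))
      println(config.getString("queue.queueName"))
    else
      println("queue.queueName not set")
  }
}
```

Because `${?VAR}` substitutions are optional, an unset variable drops the key silently instead of raising an error, so we have been double-checking that the variables are actually visible to the ECS task.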
Are you able to help us diagnose this issue? Let us know if you require any additional information. Thanks