Hi everyone - I’m trying to configure a docker-compose.yml which will deploy Kafka & Snowplow locally for development / evaluation use.
I’ve hit a roadblock with the IP lookup enrichment:
" An error occured: Attempt to download s3://minio:9000/geoip/GeoLite2-City.mmdb to ./ip_geo failed: No AWS credentials provided"
I have set up a MinIO container in my docker-compose file to host this file. I can see and download the file through the MinIO web interface, so I assume that side is working correctly.
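For reference, my ip_lookups enrichment points at that bucket. Reconstructed from the error message above (the exact contents are approximate), it looks roughly like:

```json
{
  "schema": "iglu:com.snowplowanalytics.snowplow/ip_lookups/jsonschema/2-0-0",
  "data": {
    "name": "ip_lookups",
    "vendor": "com.snowplowanalytics.snowplow",
    "enabled": true,
    "parameters": {
      "geo": {
        "database": "GeoLite2-City.mmdb",
        "uri": "s3://minio:9000/geoip"
      }
    }
  }
}
```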
In the Snowplow enricher config.hocon, I have:
Then, in the Snowplow enricher service in docker-compose.yml, I have the actual values:
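i.e. credentials passed via the standard AWS SDK environment variables, along these lines (the values shown are placeholders, not my real ones):

```yaml
services:
  enrich:
    environment:
      - AWS_ACCESS_KEY_ID=minioadmin       # MinIO root user (placeholder)
      - AWS_SECRET_ACCESS_KEY=minioadmin   # MinIO root password (placeholder)
      - AWS_REGION=eu-west-1               # required by the SDK; MinIO ignores it
```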
But I still get the error above when starting the enricher container. This seems strange for two reasons: first, the MinIO bucket is set up to allow anonymous downloads, and second, I have supplied the credentials anyway.
Does anyone know how to make this work? Thanks
Hi @Ffxa4W8B I am not sure this is possible currently.
To use the AWS SDK with non-AWS endpoints, you need to provide a custom endpoint for each service you use. For Enrich Kinesis we provide overrides for Kinesis, CloudWatch and DynamoDB so that the app can run against Localstack, but it looks like S3 is not a supported override.
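To illustrate the gap: the existing Enrich Kinesis overrides in `config.hocon` look roughly like this (key names may vary between versions, so treat this as a sketch), and there is no analogous setting for the S3 client that fetches enrichment assets:

```hocon
# Sketch of the existing Enrich Kinesis endpoint overrides for Localstack
"input": {
  "type": "Kinesis"
  "customEndpoint": "http://localstack:4566"          # Kinesis override
  "dynamodbCustomEndpoint": "http://localstack:4566"  # DynamoDB (checkpointing)
}
# No equivalent override exists for the S3 asset download, hence the failure.
```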
For this to work, an option would need to be added to all of the Enrich builds allowing the S3 endpoint to be overridden, so that you could target a MinIO service instead (or, more generally, anything other than the default S3 endpoint).
All that being said - I think the best way for you to proceed is to stop mixing Kafka with AWS services and instead mount the database on a local volume that you can share directly into the Enrich containers.
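A minimal sketch of that approach (the image name, tag and paths here are assumptions - adjust to your setup):

```yaml
services:
  enrich:
    image: snowplow/snowplow-enrich-kafka:latest  # image name assumed
    volumes:
      # Share the database from the host instead of fetching it from S3/MinIO
      - ./geoip/GeoLite2-City.mmdb:/enrichments/geoip/GeoLite2-City.mmdb:ro
```

You would then point the IP-lookups enrichment at the mounted path rather than an `s3://` URI, which removes the AWS SDK from the picture entirely.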