Below is the content:
{
  "schema": "iglu:com.snowplowanalytics.iglu/resolver-config/jsonschema/1-0-2",
  "data": {
    "cacheSize": 500,
    "cacheTtl": 600,
    "repositories": [
      {
        "connection": {
          "http": {
            "uri": "http://iglucentral.com/schemas"
          }
        },
        "name": "Iglu Central",
        "priority": 10,
        "vendorPrefixes": ["com.snowplowanalytics"]
      },
      {
        "connection": {
          "http": {
            "uri": "http://mirror01.iglucentral.com/schemas"
          }
        },
        "name": "Iglu Central - Mirror 01",
        "priority": 20,
        "vendorPrefixes": ["com.snowplowanalytics"]
      },
      {
        "connection": {
          "http": {
            "apikey": "81bb9b5d-cdfd-4d63-9fd7-6ccffcac8dbd",
            "uri": "http://sp-iglu-lb-1498571896.us-west-1.elb.amazonaws.com/api"
          }
        },
        "name": "Iglu Server",
        "priority": 1,
        "vendorPrefixes": []
      }
    ]
  }
}
Let me know if it is correct. We need to fix it urgently.
mike
March 10, 2022, 8:56pm
You'll want to drop /schemas from your Iglu Central URIs. Your custom Iglu server looks like it should be fine, provided it's reachable.
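Following that advice, an Iglu Central repository entry would look something like this (same name, priority, and vendor prefixes as in the config above, but with the /schemas suffix removed from the URI):

```json
{
  "connection": {
    "http": {
      "uri": "http://iglucentral.com"
    }
  },
  "name": "Iglu Central",
  "priority": 10,
  "vendorPrefixes": ["com.snowplowanalytics"]
}
```

The mirror entry would change the same way.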
I added that '/schemas' recently for testing; I will drop it for sure.
Do you think there could be any caching in the Iglu server?
Another idea that came to my mind: is it possible that AWS permission controls are preventing the new schema from working?
Hi,
I am now receiving custom data in the enriched S3 bucket, but the issue with Postgres remains the same: the enriched custom data is not arriving in the Postgres database.
What should I look for in the good stream data so that I can fix the Postgres Loader issue for custom data?
Please suggest.
Thanks again.
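As a starting point for inspecting the good stream data, here is a minimal sketch. It assumes only that enriched events are tab-separated lines (Snowplow's enriched format) and that custom contexts and self-describing events are carried as embedded JSON strings, so it simply reports every field that parses as a JSON object rather than hard-coding column positions. The sample line and its contents are made up for illustration.

```python
import json

def json_fields(enriched_line):
    """Return (index, parsed) for every tab-separated field that is a JSON object.

    Custom contexts and self-describing events travel as embedded JSON
    strings in the enriched TSV, so these are the fields worth checking
    when custom data goes missing downstream.
    """
    results = []
    for i, field in enumerate(enriched_line.rstrip("\n").split("\t")):
        if field.startswith("{"):
            try:
                results.append((i, json.loads(field)))
            except json.JSONDecodeError:
                pass  # field merely starts with "{" but is not valid JSON
    return results

# Hypothetical enriched line: two ordinary fields plus a contexts field.
line = "web\t2022-03-13 00:00:00\t" + json.dumps(
    {"schema": "iglu:com.snowplowanalytics.snowplow/contexts/jsonschema/1-0-1",
     "data": []})
for index, payload in json_fields(line):
    print(index, payload["schema"])
```

If the custom context's iglu: schema URI shows up here but the rows never reach Postgres, the problem is on the loader side rather than in enrichment.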
mike
March 13, 2022, 10:19pm
What is the configuration for your Postgres loader? In theory rows should either load entirely (in atomic.events and other tables) or not at all.
I recommend using inline primary key definitions: id int generated always as identity primary key. This keeps your ids unique.
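As a sketch of that advice (the table and payload column are made up for illustration; only the id definition comes from the recommendation above):

```sql
-- Hypothetical table with an inline identity primary key.
-- GENERATED ALWAYS AS IDENTITY makes Postgres assign ids itself,
-- so every row gets a unique value automatically.
CREATE TABLE example_events (
    id int GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    payload jsonb
);
```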