While creating a custom table schema for the associated custom JSON schema, I get the following error:
ERROR: there is no unique constraint matching given keys for referenced table "events"
SQL state: 42830
The column "event_id" in the "events" table is not unique, which is why this error shows up.
Please help me fix this issue so that I can store custom data in my Postgres DB.
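For context, this error reproduces whenever a foreign key references a column that has no unique index. A minimal sketch of the failure (table and column names here are hypothetical placeholders, not the actual loader DDL):

```sql
-- "events" has no PRIMARY KEY or UNIQUE constraint on event_id,
-- so it cannot be the target of a foreign key.
CREATE TABLE events (
    event_id uuid,   -- not unique: duplicates are allowed
    app_id   text
);

-- Fails with SQL state 42830:
-- "there is no unique constraint matching given keys for referenced table"
CREATE TABLE custom_context (
    root_id uuid REFERENCES events (event_id),
    data    jsonb
);
```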
Can you paste the table definitions for your events table and your custom table? I didn't think the Postgres loader defined any constraints, but it sounds like maybe it does (or one is missing on a table).
If you want a quick fix, I would drop the foreign key from your table. Out of interest, can you link the source of the loader you are using? I would have thought it wouldn't be trying to enforce a constraint, as there's no truly unique constraint in the atomic.events table.
Edit: OK, so it looks like schema-ddl can generate a foreign key, but that's just for Redshift; as far as I can tell, the postgres-loader shouldn't.
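For reference, dropping an existing foreign key looks like the following (the table and constraint names below are placeholders; look up the real constraint name on your table first):

```sql
-- Find the actual constraint name first, e.g. in psql:
--   \d custom_context
-- then drop it (names here are placeholders):
ALTER TABLE custom_context
    DROP CONSTRAINT IF EXISTS custom_context_root_id_fkey;
```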
I have created the table without foreign key references and am tracking data sent to the DB, but no data is coming into the custom table, although data is definitely coming into the 'events' table.
Hi @achintya.c - there's not really enough information in your question to easily answer this. Are you running into errors? Is the data being shredded correctly? Is the Postgres connector able to connect and send data to the cluster? Which queries are failing / succeeding, and are they providing any output?
As far as my tracker is concerned, there is no error.
Postgres is getting data, but not all of it. Some basic data such as app ID, platform, etc. is being sent to the database, but custom data such as email, device type, device ID, etc. is not reaching the database.
Also, some custom data such as battery level, battery state, etc. is coming through to the enriched bucket in S3.
I suppose the custom table is not getting data because it is not connected to the 'events' table, which stores the basic data.
Is there any column which can be used as a reference for the custom table?
I hope I have described the issue with a good amount of information. If you feel I am missing something, please let me know.
It is correct that the schema is showing errors in the bad bucket in S3. Now my question is: if the schema error is fixed, will the data reach the database? The 'events' table and my custom table have no one-to-one relationship.
I will try to fix the schema error, but I am not sure how to establish the relationship between the tables.
Please suggest.
I appreciate you are thinking ahead in the pipeline, but let's take this one step at a time: the first port of call is to correct the schema violations. We can then look at the contexts/unstruct records in the good stream and hopefully answer your questions.
This suggests that your configuration (as part of your Iglu Resolver) is not able to find the schemas you are looking for. In your resolver you'll want to ensure you provide the endpoints of your Iglu server, rather than the path directly to your schemas, e.g.:
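A minimal resolver along those lines might look like the following sketch (the URI and API key for your own Iglu server are assumptions; substitute your actual host):

```json
{
  "schema": "iglu:com.snowplowanalytics.iglu/resolver-config/jsonschema/1-0-1",
  "data": {
    "cacheSize": 500,
    "repositories": [
      {
        "name": "Iglu Central",
        "priority": 0,
        "vendorPrefixes": [ "com.snowplowanalytics" ],
        "connection": {
          "http": { "uri": "http://iglucentral.com" }
        }
      },
      {
        "name": "My Iglu Server",
        "priority": 1,
        "vendorPrefixes": [ "com.mycompany" ],
        "connection": {
          "http": {
            "uri": "https://iglu.example.com/api",
            "apikey": "YOUR-API-KEY"
          }
        }
      }
    ]
  }
}
```

Note that each URI points at the root of the repository (or the server's API endpoint), not at an individual schema file.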
It also looks like Iglu Central isn't being found, which is quite unusual, as both Iglu Central and its mirror should be publicly accessible. Does the infrastructure you are running the loader on have a route and/or permission to reach the public internet?
Can you post your Iglu Resolver file? I think there are likely some configuration errors in this file that are preventing the schemas from being looked up.