Then, actually, it works. At first I thought I had not put any data into the entity yet, but I had. @IijimaYun, you're right; I remembered I had to do the same procedure about a month ago. So, as Carl suggested, I deleted the entity and re-created it, and this time I could create the unique index. I will never again forget to create the unique index before testing it. Thank you, indeed, Mai.

When I first migrated, one problem I had was related to how string columns work. I wanted to add unique=True and default=None to a field with blank=True and null=True, and the migration failed with:

psycopg2.errors.UniqueViolation: could not create unique index "users_user_email_243f6e77_uniq"
DETAIL: Key (email)=([email protected]) is duplicated.

The same failure turns up in many other contexts:

ERROR: could not create unique index "tb_foo_pkey"
DETAIL: Key (id_)=(3) is duplicated.

ERROR: could not create unique index "tbl_os_mmap_topoarea_pkey"
DETAIL: Key (toid)=(1000000004081308) is duplicated.

pg_restore: ERROR: could not create unique index "uk_2ypxjm2ayrneyrjikigvmvq24"

ERROR: could not create unique index "redirect_rd_from"
DETAIL: Key (rd_from)=(110) is duplicated.

The redirect table shouldn't be this messy, and it should have the unique index nevertheless. Somehow, I have ended up with an exactly duplicated row: every field is the same in these two rows. This is a "logical corruption": the table data itself violates the constraint, so the index cannot be rebuilt until the duplicates are removed.

REINDEX INDEX rank_details_pkey;
ERROR: could not create unique index "rank_details_pkey"
DETAIL: Table contains duplicated values.

> > That's pretty odd --- I'm inclined to suspect index corruption.
>> I also tried reindexing the table. But the problem comes right back in the next database-wide vacuum.

> "Paul B. Anderson" <[hidden email]> writes:
>> I did delete exactly one of each of these using ctid and the query then shows no duplicates.

Another report came from quiz upgrade testing: create some non-preview attempts with the same values of (quiz, userid) and overlapping attempt numbers, then upgrade to latest master and verify that (a) there are no errors during the upgrade, and (b) at the end of the upgrade there are no rows with preview = 1 in the quiz_attempts table.

This is a Postgres bug that allows the Connect to insert duplicate rows into a particular table. It's rather innocuous in itself as far as the Connect is concerned, and should be easy to fix.

The pg_statistic variant looks like this:

LOG: Apr 26 14:50:44 stationname postgres[5452]: [10-2] 2017-04-26 14:50:44 PHT postgres DBNAME 127.0.0.1 DETAIL: Key (starelid, staattnum, stainherit)=(2610, 15, f) is duplicated.

ERROR: could not create unique index "pg_statistic_relid_att_inh_index"
DETAIL: Key (starelid, staattnum, stainherit)=(2610, 15, f) is duplicated.

pg_statistic holds the per-column statistics gathered by ANALYZE; the statistics are then used by the query planner. A sketch of a fix appears at the end of these notes.

Hi, this is because the issue table has two or more records with the same (repo_id, index), which was caused by exactly the old version you were using. The only way to fix it is to delete these duplicated records manually (keeping only the one with the smallest ID); possible SQL to find the duplicates is sketched below.

Whatever the cause, the cleanup pattern is the same, and with Heroku Postgres handling the duplicates is simple. Using a CTE and window functions, find out which repeated values will be kept. The idea is to force the query to scan the table rather than just the index (which does not have the duplicates):
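A minimal sketch of that CTE-and-window-function cleanup, using the tb_foo table and id_ column from the error message above; the session settings and the rest of the query are illustrative, not taken from any one of the original reports:

    -- Discourage index scans for this session so the duplicates are
    -- found by scanning the heap, not the (incomplete) unique index.
    SET enable_indexscan = off;
    SET enable_bitmapscan = off;

    -- Rank the physical copies of each key; rn > 1 marks the extras.
    WITH ranked AS (
        SELECT ctid, id_,
               row_number() OVER (PARTITION BY id_ ORDER BY ctid) AS rn
        FROM tb_foo
    )
    SELECT * FROM ranked WHERE rn > 1;

    -- After checking the result, delete every copy except the first:
    WITH ranked AS (
        SELECT ctid,
               row_number() OVER (PARTITION BY id_ ORDER BY ctid) AS rn
        FROM tb_foo
    )
    DELETE FROM tb_foo
    WHERE ctid IN (SELECT ctid FROM ranked WHERE rn > 1);

Once the SELECT returns no rows, creating the unique index (or running REINDEX) should succeed.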
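For the issue-table report above, a query along these lines finds the duplicated (repo_id, index) pairs. This is a sketch under the assumption that the table is named issue with columns id, repo_id, and "index"; back up the database before running the DELETE:

    -- Show each (repo_id, index) pair that occurs more than once,
    -- together with the smallest id, which is the row to keep.
    SELECT repo_id, "index", count(*) AS copies, min(id) AS keep_id
    FROM issue
    GROUP BY repo_id, "index"
    HAVING count(*) > 1;

    -- Delete every duplicate except the row with the smallest id.
    DELETE FROM issue a
    USING issue b
    WHERE a.repo_id = b.repo_id
      AND a."index" = b."index"
      AND a.id > b.id;

The self-join keeps exactly the smallest id in each group, matching the "only keep the one with the smallest ID" advice above.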
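And for the pg_statistic report, the usual remedy is to delete the duplicated catalog row by ctid and let ANALYZE regenerate it. A sketch, assuming superuser access; the (2610, 15, f) key comes from the log above, but the ctid value here is purely illustrative:

    -- Find the physical locations of the duplicated statistics rows.
    SELECT ctid, starelid, staattnum, stainherit
    FROM pg_statistic
    WHERE starelid = 2610 AND staattnum = 15 AND stainherit = false;

    -- Remove one of the two ctids reported above (example value):
    DELETE FROM pg_statistic WHERE ctid = '(7,4)';

    -- Rebuild the index, then let ANALYZE repopulate the row.
    REINDEX INDEX pg_statistic_relid_att_inh_index;
    ANALYZE;

Deleting from pg_statistic is comparatively safe here, because ANALYZE simply rebuilds the statistics row afterwards.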