Exceptions during reindex
(Sander, could you take a look at this?)
Submitted by Eelco Visser on 18 March 2011 at 15:15
[java] Indexing Alias: 139000 - 140000
[java] Exception in thread "Lucene Merge Thread #0" org.apache.lucene.index.MergePolicy$MergeException: java.io.FileNotFoundException: /var/indexes/researchr/webdsl.generated.domain.Alias/_s8.fnm (No such file or directory)
[java] at org.apache.lucene.index.ConcurrentMergeScheduler.handleMergeException(ConcurrentMergeScheduler.java:351)
[java] at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:315)
[java] Caused by: java.io.FileNotFoundException: /var/indexes/researchr/webdsl.generated.domain.Alias/_s8.fnm (No such file or directory)
[java] at java.io.RandomAccessFile.open(Native Method)
[java] at java.io.RandomAccessFile.<init>(RandomAccessFile.java:212)
[java] at org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexInput$Descriptor.<init>(SimpleFSDirectory.java:78)
[java] at org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexInput.<init>(SimpleFSDirectory.java:108)
[java] at org.apache.lucene.store.NIOFSDirectory$NIOFSIndexInput.<init>(NIOFSDirectory.java:94)
[java] at org.apache.lucene.store.NIOFSDirectory.openInput(NIOFSDirectory.java:70)
[java] at org.apache.lucene.store.FSDirectory.openInput(FSDirectory.java:691)
[java] at org.apache.lucene.index.FieldInfos.<init>(FieldInfos.java:68)
[java] at org.apache.lucene.index.SegmentReader$CoreReaders.<init>(SegmentReader.java:119)
[java] at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:652)
[java] at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:622)
[java] at org.apache.lucene.index.IndexWriter$ReaderPool.get(IndexWriter.java:698)
[java] at org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:5145)
[java] at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:4675)
[java] at org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:235)
[java] at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:291)
[java] Exception in thread "main" org.hibernate.HibernateException: More than one row with the given identifier was found: 215a3b1a-5fe8-425d-962a-ac49a8cff744, for class: webdsl.generated.domain.AliasPublicationList
[java] at org.hibernate.loader.entity.AbstractEntityLoader.load(AbstractEntityLoader.java:108)
[java] at org.hibernate.loader.entity.EntityLoader.loadByUniqueKey(EntityLoader.java:160)
[java] at org.hibernate.persister.entity.AbstractEntityPersister.loadByUniqueKey(AbstractEntityPersister.java:1777)
[java] at org.hibernate.type.EntityType.loadByUniqueKey(EntityType.java:674)
[java] at org.hibernate.type.EntityType.resolve(EntityType.java:434)
[java] at org.hibernate.engine.TwoPhaseLoad.initializeEntity(TwoPhaseLoad.java:140)
[java] at org.hibernate.loader.Loader.initializeEntitiesAndCollections(Loader.java:898)
[java] at org.hibernate.loader.Loader.doQuery(Loader.java:773)
[java] at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:270)
[java] at org.hibernate.loader.Loader.doList(Loader.java:2294)
[java] at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2172)
[java] at org.hibernate.loader.Loader.list(Loader.java:2167)
[java] at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:448)
[java] at org.hibernate.hql.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:363)
[java] at org.hibernate.engine.query.HQLQueryPlan.performList(HQLQueryPlan.java:196)
[java] at org.hibernate.impl.SessionImpl.list(SessionImpl.java:1258)
[java] at org.hibernate.impl.QueryImpl.list(QueryImpl.java:102)
[java] at utils.ReIndex.indexAlias(ReIndex.java:100)
[java] at utils.ReIndex.main(ReIndex.java:16)
Issue Log
The exception was caused by duplicates in the AliasPublicationList_alias column of the _PublicationList table; there were 11 duplicates, to be exact. Such duplicates are not allowed by Hibernate, yet the uniqueness constraint that would prevent them was missing (probable cause: Hibernate does not add uniqueness constraints when dbmode is set to update).
I removed the duplicates and added a uniqueness constraint, so this cannot happen again. I also restarted the indexing and am now waiting for it to complete.
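For the record, a minimal sketch of the cleanup, assuming a MySQL database reachable over JDBC (the class name, connection URL, credentials, and constraint name are hypothetical; the table and column names come from the analysis above). It lists the alias references that occur more than once, and only adds the missing uniqueness constraint once no duplicates remain:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class FixDuplicateAliases {
        public static void main(String[] args) throws Exception {
            // Connection URL and credentials are placeholders.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:mysql://localhost/researchr", "user", "password");
                 Statement st = con.createStatement()) {

                // List alias ids referenced by more than one _PublicationList
                // row; these are the rows Hibernate's unique-key lookup trips over.
                boolean duplicatesLeft = false;
                try (ResultSet rs = st.executeQuery(
                        "SELECT AliasPublicationList_alias, COUNT(*) " +
                        "FROM _PublicationList " +
                        "WHERE AliasPublicationList_alias IS NOT NULL " +
                        "GROUP BY AliasPublicationList_alias " +
                        "HAVING COUNT(*) > 1")) {
                    while (rs.next()) {
                        duplicatesLeft = true;
                        System.out.println(rs.getString(1) + " occurs "
                                + rs.getInt(2) + " times");
                    }
                }

                // Once the surplus rows are gone, add the constraint that
                // Hibernate's update mode never created.
                if (!duplicatesLeft) {
                    st.executeUpdate(
                        "ALTER TABLE _PublicationList ADD CONSTRAINT " +
                        "uniq_aliaspublicationlist_alias " +
                        "UNIQUE (AliasPublicationList_alias)");
                }
            }
        }
    }

The surplus rows themselves are best inspected and deleted by hand: deciding which of the 11 duplicates to keep is a data question, not a schema one.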
Reindexing was restarted once more (see issue https://yellowgrass.org/issue/researchr/250), but the original run had not shown any duplicate key exceptions. Issue fixed.