I was testing the new alpha7 release of the Feeds module on an existing site. I have my feed set to unpublish any items not present during an import. During testing I noticed that the number of results on the front end kept fluctuating, which turned out to be my node items being unpublished incorrectly. I say in the title that this can potentially cause data loss, since Feeds can be configured to delete items not present. I don't know every situation in which this can occur, but I have found a reliable way to reproduce the issue on my end. I installed the Queue UI module to aid in debugging:

1. Set up a feed that has more records than can finish in a single cron run (mine has about 1,000).
2. Go to your feed view and click the 'Import in background' button. Queue UI should show one record for the feed queue at this point, which is the 'Fetch' stage.
3. Manually start the cron run and wait until Queue UI shows multiple entries, so that we know it's at the processing stage.
4. Go back to your feed view and click the Unlock button. If you look at Queue UI again, you should see that some records are still stuck in the queue.
5. Click the 'Import in background' button again on your feed view and manually kick off the cron again.
6. After the last step, if you view your queue job you'll see that a huge number of records (depending on your total import size) have queued up.

What seems to be happening is that when the second job starts running, it resets the "feeds_clean_list" table to contain all the records. Most of those are in the 'Cleaning' state.

I have the following URL of an image file which I want to import. However, the problem is that this file doesn't have a Content-Disposition header (so I can't use the patch from #1104378-8: Allow passing custom filename to FeedsParser.inc::getFile()), so my filename gets saved as ShowImageXML.asp, which fails with:

Error: Feeds exception 'PDOException' with message 'SQLSTATE: Integrity constraint violation: 1062 Duplicate entry 'public://images/ShowImageXML.asp' for key 'uri''

It seems to work for other feeds which have a proper JPG extension (like this). So I guessed the solution would be adding the ASP extension to allowedExtensions, but it didn't work. My FeedsEnclosure looks like:

FeedsEnclosure::__set_state(array(
  'allowedExtensions' => 'png gif jpg jpeg asp',

I've also tried to use a Tamper plugin, but upon changing the URL it gets broken and I get a 404. I think this is related to #1482530: Support remote files in file field, but I'm not sure. So I think the solution would be to save the file based on its MIME type?
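The MIME-type idea can be sketched like this. This is only an illustration of the approach, written in Python rather than Drupal's PHP; the function name and the mapping are assumptions, not part of the Feeds API. The point is to derive the saved extension from the response's Content-Type header instead of trusting the misleading .asp extension in the URL:

```python
# Hypothetical sketch of "save the file based on its MIME type":
# map the Content-Type header to a real extension and rewrite the
# filename taken from the URL. Not Feeds/Drupal code.
MIME_TO_EXT = {
    'image/jpeg': 'jpg',
    'image/png': 'png',
    'image/gif': 'gif',
}

def filename_for(url_basename: str, content_type: str) -> str:
    """Replace a misleading extension with one derived from the MIME type."""
    ext = MIME_TO_EXT.get(content_type)
    if ext is None:
        return url_basename  # unknown type: keep the original name
    stem = url_basename.rsplit('.', 1)[0]
    return f'{stem}.{ext}'

# A remote image served as image/jpeg but named ShowImageXML.asp would
# be saved as ShowImageXML.jpg instead, so two different images no longer
# collide on the same 'public://images/ShowImageXML.asp' URI.
print(filename_for('ShowImageXML.asp', 'image/jpeg'))  # ShowImageXML.jpg
```

This would also sidestep the allowedExtensions workaround entirely, since the stored name would always carry an extension matching the actual content.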