Index Search Engine - All Error
I was running through my automated jobs, and when I ran the Index Search Engine - All job, right at the end I got an "Invalid or Empty Node passed to getItem constructor." error.
Anyone have any idea what this is, how to fix it?
I have the same problem. I can only run the partial search.
I had to roll back to a previous backup because something went wrong with Workflows, and when I tried to run this job on the restored site I still hit the same error.
@theana27
@gavthompson
I just tried running "Index Search Engine - All" with success.
What version of concrete5 are you using?
No issues for me with 5.7.5.3 or 5.7.5.4.
I was running 5.7.5.4 at the time.
I haven't solved the issue, but I no longer see it because I had to roll our site back to 5.7.5.3 after being unable to fix an error with the Workflows page following the update to 5.7.5.4.
After updating to concrete5.7.4, everything works normally.
Same issue here with 5.7.5.6 - no idea what's wrong. Anyone else have any ideas?
I ran into the same issue today:
This is happening on concrete5.7.5.6
I debugged it a bit and found that the cause of this behaviour was a page with only one version, which wasn't approved.
So what happens is:
Reindexing is triggered here:
https://github.com/concrete5/concrete5/blob/develop/web/concrete/job...
Then this code is triggered to get a specific page in a specific version:
https://github.com/concrete5/concrete5/blob/develop/web/concrete/src...
Then this function fails to get the page version, since there is only one version and it isn't approved.
https://github.com/concrete5/concrete5/blob/develop/web/concrete/src...
Then this code is triggered and tries to get the page from the cache:
https://github.com/concrete5/concrete5/blob/develop/web/concrete/src...
But since the $version variable is empty, this library throws an exception:
https://github.com/tedious/Stash/blob/v0.12.1/src/Stash/Pool.php#L13...
because the key it is trying to fetch from the cache is `page/474//Concrete\Core\Page\Page`, which contains an empty node.
I don't know exactly where the faulty logic lives; I'm not a C5 developer, I only debugged the issue. So it would be better if somebody who knows the system finds a solution for this edge case.
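To illustrate the failure mode described above: when the page's only version is unapproved, the version string that goes into the cache key ends up empty, which produces a key with two consecutive slashes - exactly the "Invalid or Empty Node" shape that Stash rejects. This is a minimal sketch, not concrete5's actual code; the function names here (buildPageCacheKey, isValidCacheKey) are hypothetical.

```php
<?php
// Hypothetical sketch of the cache-key construction described above.
// buildPageCacheKey and isValidCacheKey are illustrative names, not
// concrete5's real API.

function buildPageCacheKey(int $cID, ?string $version): string
{
    // When the page's only version is unapproved, $version is null/empty,
    // yielding a key with an empty segment: "page/474//Concrete\Core\Page\Page"
    return 'page/' . $cID . '/' . ($version ?? '') . '/Concrete\Core\Page\Page';
}

function isValidCacheKey(string $key): bool
{
    // Stash treats each slash-separated segment as a node and rejects
    // empty ones, so consecutive slashes make the key invalid.
    foreach (explode('/', $key) as $node) {
        if ($node === '') {
            return false;
        }
    }
    return true;
}
```

A guard like isValidCacheKey (or simply skipping pages whose version object is null before touching the cache) would avoid the exception; where that guard belongs in the core is the open question.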
Thanks for posting this - I hadn't considered anything like that, and I knew that all the pages I had on the site were approved.
So I took a look in the Trash and found a bunch of old pages there. I deleted all the pages in the Trash, and now the job runs fine.
As you suggested, Juddc, I emptied the Trash and re-ran the task, and it completed successfully.
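If you'd rather find the offending pages than empty the whole Trash, a query like the one below can list pages that have no approved version at all. This is a hedged sketch: the table and column names (Pages, CollectionVersions, cvIsApproved) are based on my reading of the concrete5 5.7 schema, so verify them against your own database before relying on it.

```php
<?php
// Sketch: find page IDs with no approved version (assumed 5.7 schema).
// Run it with your own DB credentials; nothing here modifies data.
$sql = "
    SELECT p.cID
    FROM Pages p
    WHERE NOT EXISTS (
        SELECT 1
        FROM CollectionVersions cv
        WHERE cv.cID = p.cID
          AND cv.cvIsApproved = 1
    )
";
```

Any cID this returns is a candidate for the edge case described earlier in the thread (trashed pages typically lose their approved version, which is why emptying the Trash helps).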
A bug, maybe? Is the task supposed to index trashed pages?