Job won't stop running???

Hi,

I recently ran into a problem with the job that indexes pages for the search. I've added around 500 pages over the past 4 days, all using the same template, and I've run the index search job; it has now been running for more than 24 hours. Normally it takes around 30 seconds to finish.

The thing is, I'm on a shared hosting server and I don't have SSH access to see what's going on. Or if there is a way, it's beyond my knowledge. Can anyone help me?

Thanks

olacom
 
olacom replied:
Is there a way I can see what the index_search.php job is doing? I think it freezes on something during the scan, but I can't see where.

Is there any code I can add to the script to echo back some progress?
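To give an idea of what I mean, I was thinking of sprinkling something like this through the job so it appends progress lines to a file I can read over FTP (the log path and the $pageID variable are just placeholders, not real names from the job):

<?php 
// crude tracing: append a timestamped line each time a page is processed,
// so progress can be watched without ssh access
// ($pageID stands in for whatever page identifier the job's loop exposes)
error_log(date('Y-m-d H:i:s') . " indexing page " . $pageID . "\n", 3, '/path/to/job_trace.log');
?>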
ScottC replied:
It is probably under Jobs.jLastStatusCode in the db. It might have a 1 or something instead of a 0 (zero).
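If you can't get at phpMyAdmin on that host, you could dump the row from a throwaway page with something like this (I'm going from memory on the column names, so check them against your actual Jobs table):

<?php 
// quick dump of the Jobs table through concrete5's db wrapper
$db = Loader::db();
$rows = $db->GetAll("SELECT jHandle, jStatus, jLastStatusCode, jLastStatusText FROM Jobs");
print_r($rows);
?>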
olacom replied:
Hi Scott,

Here's what I've done so far. I checked the last data entries in the db to see if it was stopping at a particular place, and it seems to scan normally. I also did what you said and it worked, but if I restart the job it will not put the status back. Something is preventing the script from finishing the process, or from resetting the status back to finished.
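(To be clear, by resetting the status I mean running something like this against the db; the column and value are my guess from what my row showed, so it may not be the proper way.)

<?php 
// reset the stuck job flag so the job can be started again
// ('ENABLED' and the handle are guesses based on what the row looked like)
$db = Loader::db();
$db->Execute("UPDATE Jobs SET jStatus = 'ENABLED' WHERE jHandle = 'index_search'");
?>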

Any ideas on what else I can check?
ScottC replied:
I haven't had this problem, to be totally honest, but I know that anything will time out after your PHP max execution time, which is around 30 seconds by default. I could google it but you can too :)

I looked at the code, and jLastStatusText or jStatus is what you should look at in the db. Or set max_execution_time a little higher, although admittedly that isn't ideal.
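If you want to try bumping it, something like this near the top of config/site.php should raise it for c5 requests; the number is just an example, and some shared hosts ignore or cap this setting:

<?php 
// allow long-running requests such as the index job (value in seconds; 0 means no limit)
ini_set('max_execution_time', 300);
?>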
olacom replied:
The error turned out to be: Fatal error: Allowed memory size of 262144 bytes exhausted.

I did some research and had to put this in config/site.php:

<?php 
// remove PHP's memory limit entirely (-1 = unlimited)
ini_set('memory_limit', -1);
?>
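I also saw suggestions to set a fixed cap instead of removing the limit entirely, something like this, but I don't know which is better:

<?php 
// raise the memory cap to a fixed amount instead of removing it completely
ini_set('memory_limit', '256M');
?>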


Is there a drawback to changing this?