Index Search Engine runs "forever" after upgrading to 5.6.0.1
In the "automated jobs" section, everything worked fine until I
upgraded to 5.6.0.1.
I tried the "reset all running jobs", and that stops the job, but it won't finish running when trying again either.
I used to run this at least weekly, and after upgrading I get the following message:
Index Search Engine: Index the site to allow searching to work quickly and accurately. Currently Running (Since 9:52:27 AM)
Any suggestions?
I know this is an old post, but I'm getting the same thing (index search engine job never completing). Did you find a solution?
Have you tried running just the search job from the system command line? That will tell you if it is a server timing/resources issue, and also give a clearer error message if there is a code failure.
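For example, something like this with curl (the path and token here are placeholders; copy the actual Jobs URL shown on the Dashboard > System & Settings > Automated Jobs page):

# Hitting the jobs URL from a terminal means any fatal PHP error or timeout
# prints straight back instead of being swallowed by the dashboard's spinner.
# The URL shape and auth token are placeholders - use your site's real Jobs URL.
curl -v "http://www.example.com/index.php/tools/required/jobs?auth=YOUR_TOKEN"

The -v flag also shows the HTTP status and headers, which helps separate a server timeout from a PHP fatal error.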
Also, after it has failed once, the job can be left stuck marked as running in the Jobs table of the database, and the icon will continue to spin forever. It may not even be a job failure that leads to this: sometimes a browser, network or server glitch means the ajax call that runs the job never returns.
You can patch the database entry using phpMyAdmin, or from the dashboard remove the job and restore it with the newly released Restore Automated Jobs addon.
Sorry, I don't know what you mean by "the system command line."
I did try running just the search job from the dashboard page.
Also tried plugging the cron job URL into a browser to see if it gave me a PHP error or something.
Neither provided any more information, just lots of spinning.
Try right-clicking on the 'run' button to the left of the 'Index Search Engine' job and choose "Copy Link Address" (in Chrome). Paste that address into your address bar and hit Enter. Any errors should show up (maybe). If none do, view the source of the resulting white page in case an error is there but not displaying for some reason.
Also, make sure your system will actually display errors by going to "Dashboard->System and Settings->Debug Settings" and setting it to "Show errors in page".
Thanks for the step-by-step, mhawke.
I set the debug to show errors and got this:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 83 bytes) in /home/parispro/public_html/updates/concrete5.5.2.1/concrete/libraries/3rdparty/adodb/adodb.inc.php on line 1052
Is there something obvious to you I could update to alleviate this?
Would updating c5 do it?
*** edit ***
I see now that this install is on 5.5.2.1, not 5.6... so updating probably isn't the answer.
That is saying that you have exceeded the memory allocation (134217728 bytes is exactly 128M). How many pages is the site?
The solution is to delete pages (not feasible) or allocate more memory in php.ini (more feasible). Can you post the contents of your concrete5 'environment' (Dashboard->System and Settings->Environment) so we can see what's already in your php.ini?
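For reference, the change itself is one line. A minimal sketch, assuming your host lets you override settings (the value is illustrative):

; php.ini - raise the per-request memory ceiling
memory_limit = 256M

# or the equivalent in .htaccess, where per-directory overrides are allowed:
php_value memory_limit 256M

Note that php_value in .htaccess only works when PHP runs as a server module (as it does under LiteSpeed's SAPI); under CGI/FastCGI you would normally need a local php.ini instead.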
Odd.
The site is about 500 pages and the PHP memory limit is 128MB.
That seems like more than enough.
By the way, I really appreciate your attention!
Here's the environment info:
# concrete5 Version
5.5.2.1
# concrete5 Packages
Advanced Forms (1.5.2.6), Automatic Email Obfuscator (1.2.1), Designer Content (2.1), Fileset attribute (1.0), Force Single Sublevel (1.0), Scrapbook Display - Basic (1.4), tnSpacer (1.2), Whitespace Theme (1.0).
# concrete5 Overrides
blocks/ppg_current_market_old, blocks/ppg_answer, blocks/ppg_web_link, blocks/autonav, blocks/ppg_current_market, blocks/slideshow, blocks/ppg_architecture, blocks/ppg_glossary_term, blocks/map_points, blocks/page_list, blocks/image_text_wysiwyg, blocks/two_text_fields, blocks/page_image_text, blocks/ppg_place_listing, blocks/html, blocks/ppg_paris_map_google_text, controllers/page_types, helpers/image.php, helpers/ppg.php, jobs/generate_sitemap.php, jobs/flush_cache.php, js/fancybox, js/paris_map_google.js, js/infoBox_packed.js, js/jquery.galleriffic-2.0, js/markerClusterer_packed.js, js/jquery.mosne.map.min.js, js/raphael-min.js, js/paris_map_google.js.bak, js/paris-map.js, js/current_market.js, single_pages/page_not_found.php, single_pages/login.php, themes/ppg, tools/propInfoRequest.php, tools/slideshow.php
# Server Software
LiteSpeed
# Server API
litespeed
# PHP Version
5.2.17
# PHP Extensions
bcmath, bz2, calendar, ctype, curl, date, dom, exif, filter, ftp, gd, gettext, hash, iconv, imap, ionCube Loader, json, libxml, litespeed, mailparse, mbstring, mcrypt, mhash, mime_magic, mysql, mysqli, openssl, pcre, PDO, pdo_mysql, pdo_sqlite, posix, pspell, Reflection, session, SimpleXML, soap, sockets, SPL, SQLite, standard, tidy, timezonedb, tokenizer, wddx, xml, xmlreader, xmlrpc, xmlwriter, xsl, Zend Optimizer, zip, zlib.
# PHP Settings
log_errors_max_len - 1024
max_execution_time - 5
max_file_uploads - 20
max_input_nesting_level - 64
max_input_time - 60
max_input_vars - 1000
memory_limit - 128M
post_max_size - 250M
safe_mode - Off
safe_mode_exec_dir - /usr/local/php/bin
safe_mode_gid - Off
safe_mode_include_dir - (no value)
sql.safe_mode - Off
upload_max_filesize - 250M
mysql.max_links - Unlimited
mysql.max_persistent - Unlimited
mysqli.max_links - Unlimited
pcre.backtrack_limit - 100000
pcre.recursion_limit - 100000
session.cache_limiter - nocache
session.gc_maxlifetime - 7200
soap.wsdl_cache_limit - 5
safe_mode_allowed_env_vars - PHP_
safe_mode_protected_env_vars - LD_LIBRARY_PATH
It's comforting to know that you feel 128M is enough memory but your server feels differently and it's in charge right now ;-)
I think 500 pages is a lot so I would speak to your tech support and see what can be done about increasing your memory_limit.
I think your 'max_input_time' of 60 seconds is likely to cause a problem as well because 500 pages might take longer than 60 seconds to index but I might be wrong.
My shared hosting has these parameters:
max_input_time - 600
memory_limit - 384M
Have you also tried running the job that deletes all but the last 10 page versions? I'm not sure whether the indexing job looks through all the old versions, but with 500 pages I would try to keep on top of that. You need to run this job a bunch of times to clean things up completely, because it doesn't do your whole site at once; it works in small chunks to avoid overloading the server. I have a modified version that increases the size of the chunk and deletes all but the last 5 versions instead of 10. Let me know if you'd like to try it.
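For anyone curious, the gist of that modified job is roughly this. It is only a sketch against the 5.6-era API written from memory (PageList, VersionList, getVersionListArray, isApproved, delete), so verify the names against the core job in concrete/jobs/remove_old_page_versions.php before trusting it; the chunk size and keep-count are illustrative:

<?php
// Rough sketch only - verify class/method names against the core
// remove_old_page_versions job. Working in small chunks keeps each run
// inside memory_limit and max_input_time.
Loader::model('page_list');

$keep  = 5;   // versions to retain per page (the core job keeps 10)
$chunk = 20;  // pages handled per run; a real job would advance the
              // offset between runs the way the core job tracks progress

$pl = new PageList();
$pl->sortBy('cID', 'asc');
$pages = $pl->get($chunk, 0);

foreach ($pages as $page) {
    $vl = new VersionList($page);            // assumed 5.6 API
    $versions = $vl->getVersionListArray();  // assumed newest-first order
    foreach (array_slice($versions, $keep) as $v) {
        if (!$v->isApproved()) {             // never delete the live version
            $v->delete();
        }
    }
}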
heh heh... it's actually kind of disturbing that sometimes I talk like I know what's going on when clearly that isn't the case :-)
I bumped the memory up to 384M and the input time to 600 seconds. The first time I re-ran the jobs, the search index hung up again. I canceled that and re-ran it, and now it completes successfully. Thanks!
Wish I could mark your answer as "right" but I'm not the original poster. Maybe they'll come back and give you credit and help others.
Glad to be of service.
(I marked my answer on your behalf.)
Well done!
After all that work you deserve the karma.
At a slight tangent, I have a 're-enable jobs' job currently in the PRB. It won't fix jobs that always fail, but it will stop jobs that occasionally run over from becoming stuck as running.
(This link will only work once the addon is through the PRB - it will be free)
http://www.concrete5.org/marketplace/review_pending/-/view_detail/1...
Thanks, John (for this and your many, many other contributions).
good karma for you (just not the c5 kind)
The job will have been marked as 'running' in the Jobs table of the database, and that mark is stuck there. It's a 'feature' of the way the dashboard runs jobs: because the job is marked as running, the spinner will be set to spin whenever you visit that page from now on.
The way to reset this marker is to either:
- Go into the database table with phpMyAdmin and edit the entry from 'running' to 'enabled' (see the example below).
- Remove the job, then re-install it.
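For the phpMyAdmin route, assuming the 5.6-era schema (a Jobs table with a jStatus column, and 'index_search' as the job handle), the fix is a one-line update. Check the exact table, column and value spellings in your own database first:

-- Reset a stuck job marker. Names and value case assume the concrete5 5.6
-- schema - verify them in phpMyAdmin before running.
UPDATE Jobs
   SET jStatus = 'ENABLED'
 WHERE jHandle = 'index_search'
   AND jStatus = 'RUNNING';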
Awesome, thanks mhawke and JohntheFish.
I'll give these a shot (soon... this isn't top-of-list but I wanted to respond to you quickly).