-- setup problems
Hi,
Find out where Xampp stores the PHP error logs and see if there are any errors in it.
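One way to find the error log is to ask PHP itself which log it is configured to use (this assumes `php` is on your PATH; the commented paths below are only the usual XAMPP defaults and may differ on your install):

```shell
# Print the error_log setting from the PHP configuration currently in use
php -r "echo ini_get('error_log'), PHP_EOL;"

# Show which php.ini file is actually being loaded
php --ini

# Typical XAMPP locations to check (assumptions; adjust to your install):
#   C:\xampp\php\logs\php_error_log
#   C:\xampp\apache\logs\error.log
```

Note that the command-line PHP and the Apache module can read different php.ini files, so also check the Apache error log listed above.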
The debugging you were doing in your video seemed reasonable. A crawl is stopped when both the Scheduler and the Indexer write a particular file; the fact that neither was succeeding in doing this suggests either that the Indexer crashed or that there was some kind of failure in finding or writing the proper folder. The same sort of failure could explain why you aren't seeing any log files. Maybe try a crawl from the command line and tell me what happens. That is, turn off the QueueServer and Fetchers under Manage Machines. Open a command prompt or PowerShell, cd into src/executables, and type:
php QueueServer.php terminal
Open a second shell, cd into src/executables, and type:
php Fetcher.php terminal
Make sure that under Configure, error reporting is checked. Start a crawl; you only have to start it once. Wait and see if there are any error messages, and wait until it has crawled a couple of pages (in your case, wikipedia.org). Click Stop Crawl and wait until the index appears. When crawling from the command line, both the scheduler and the indexer run in the same process, so it is a little simpler. If it is still stuck, you can drop me a line at chris@pollett.org and maybe we can arrange a Skype chat.
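The whole command-line procedure above can be sketched as follows (the src/executables path is relative to wherever you installed Yioop; the web-interface steps are shown as comments):

```shell
# Shell 1: run the queue server in terminal mode
# (scheduler and indexer run together in this one process)
cd src/executables
php QueueServer.php terminal

# Shell 2, in a separate window: run a fetcher in terminal mode
cd src/executables
php Fetcher.php terminal

# Then, in the Yioop web interface:
#   1. Under Configure, make sure error reporting is checked
#   2. Start a crawl (once) and let it fetch a couple of pages
#   3. Click Stop Crawl and wait for the index to appear
# Watch both shells for PHP error messages the whole time.
```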
Best,
Chris
(Edited: 2019-08-25)