-- Installing is Bootlooped
Choose one version of PHP, say 7.3. It doesn't matter whether it's Windows, Linux, or Mac, so choose the system you want and try to debug that. The "crawl status is no longer updating" message shows up when you are on the Manage Crawls activity and then don't do anything for more than 20 minutes or so. If you did a page refresh, it would go away and you could see whether anything had been crawled. In any case, if crawling were occurring, you would see it within a minute or two; you don't need to wait twenty minutes.

My suggestion for debugging is to get as simple a test crawl as possible working:

1. Under Manage Crawls - Options, make sure the Server Channel is 0, Crawl Order is Page Importance, Max Depth is no limit, Repeat Type is No Repeat, and robots.txt is Always Follow. Make sure there are no sites listed under Disallowed Sites. Under Seed Sites, put a single site that you know should be crawlable, for example, https://www.yahoo.com/ . Don't use a site like Facebook, because their robots.txt forbids all crawlers other than a few, like Google, that are paying them. Save your settings.
2. Look at the Page Options activity and make sure that all the Page File Types are checked; if not, check them.
3. Go to the Manage Machines activity. For a simple crawl, you should have only one machine with a Queue Server and some Fetchers. Turn on the Queue Server and one Fetcher. Make sure you see log messages showing that each is running.
4. Go back to Manage Crawls and start the crawl.
5. Go back to Manage Machines and look at the logs for the Queue Server and the running Fetcher; you should see some changes indicating the crawl has started.
6. Go back to Manage Crawls; you should see that it says a few pages have been crawled.

If this doesn't happen within maybe two minutes, something isn't working. Post your Queue Server and Fetcher logs here, and I will look at them.
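To see why a site like Facebook is a bad seed choice, here is a small sketch of how robots.txt rules of that shape play out. Yioop itself is written in PHP, but the same idea can be shown with Python's standard urllib.robotparser; the robots.txt body and the "QueueServerBot" user-agent name below are made up for illustration, not taken from either product.

```python
from urllib.robotparser import RobotFileParser

# A Facebook-style robots.txt: one named crawler is allowed everything,
# every other user-agent is disallowed from the whole site.
robots_txt = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The named partner crawler may fetch pages...
print(parser.can_fetch("Googlebot", "https://example.com/page"))       # True
# ...but a generic crawler matches the "User-agent: *" block and is refused.
print(parser.can_fetch("QueueServerBot", "https://example.com/page"))  # False
```

With "robots.txt is Always Follow" set, a crawl seeded only with such a site would fetch nothing, which is why you want a seed you know is open to crawlers.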
Best,
Chris