2019-04-12

Getting Stuck at Robots.txt.

It seems like when I try to crawl, the software gets "stuck" right off the bat when it sees the robots.txt file. I've looked at these files and they aren't disallowing anything. Stopping and restarting the queue server sometimes fixes it, and rebooting the machine sometimes works too. Simply stopping the crawl and then starting a new crawl with a different name will also do the trick. It's really weird and I don't know why it does this. But when it sticks on robots.txt I have let it sit there for 30 minutes before, and it really is "stuck".
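To rule out the robots.txt files themselves, a quick sanity check like the one below can confirm they don't disallow anything. This is just an illustrative sketch (the file contents and URL are hypothetical, not from the actual sites crawled), using Python's standard-library robots.txt parser:

```python
import urllib.robotparser

# A robots.txt with an empty Disallow rule, i.e. one that blocks
# nothing -- the kind of file observed during the crawl.
# (Contents are illustrative, not from a real site in the crawl.)
robots_txt = """\
User-agent: *
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# An empty Disallow means every path is fetchable for every agent,
# so the file's rules can't be what blocks the crawler.
print(rp.can_fetch("*", "https://example.com/anything"))  # True
```

If this prints True for the files in question, the hang is more likely in fetching or processing robots.txt (e.g. a request that never times out) than in its rules.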
2019-04-14

-- Getting Stuck at Robots.txt
It really does seem like stopping and restarting the queue server fixes this issue.