"Spider not found" appears when crawling with Scrapy

I ran into a puzzling error just as I was about to crawl with Scrapy, so I'm leaving a note about it.

Symptom

I ran the following from the command prompt:

scrapy crawl websiteCrawl.py

raise KeyError("Spider not found: {}".format(spider_name))
KeyError: 'Spider not found: websiteCrawl.py'

This error was output. I hadn't configured anything unusual, so what does it mean that the Spider can't be found?

Solution

If you remove the .py and run it again, there is no error:

scrapy crawl websiteCrawl

The crawl command takes the spider's name (the name attribute defined in the spider class), not the name of the .py file.
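
To make that concrete, here is a minimal sketch of what such a spider file might look like; the class name, start URL, and parse body are my assumptions, but the point is that the name attribute is what scrapy crawl matches against:

import scrapy

class WebsiteCrawlSpider(scrapy.Spider):
    # scrapy crawl matches this value, not the file name websiteCrawl.py
    name = "websiteCrawl"
    start_urls = ["https://example.com"]  # placeholder URL

    def parse(self, response):
        # trivial example: yield the page title
        yield {"title": response.css("title::text").get()}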

It was an embarrassing careless mistake, but since tab completion fills in the file name including the .py extension, I suspect this is an easy error to make.
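
As a sanity check, running scrapy list inside the project directory prints the name of every spider the project registers, so a stray .py or a typo is easy to spot:

scrapy list

With the spider sketched above, the output would just be websiteCrawl.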
