Crawlers hitting AJAX requests

I have an ASP.NET MVC website with cascading drop-down lists: when the user selects an option in the first list, the other lists are populated via an AJAX call. According to my logs, crawlers try to access these AJAX methods as plain GETs, and my app logs errors as a result. I made those AJAX methods non-crawlable by returning a 404 when the request is not an AJAX call. Is this the best way to do it?

On the other hand, I have a multi-step page: the user fills in a form and then moves on to a second step. Each time the user completes a form I do an AJAX POST and save the input data. How should I handle this situation?
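For reference, the guard described above looks roughly like this (a sketch; `PopulateSecondList` and `GetItemsFor` are placeholder names, not from the original post):

```csharp
// Sketch of the non-AJAX guard; names below are placeholders.
public ActionResult PopulateSecondList(int firstListId)
{
    // Request.IsAjaxRequest() checks the X-Requested-With header,
    // which jQuery's AJAX helpers set automatically.
    if (!Request.IsAjaxRequest())
        return HttpNotFound(); // the 404 response crawlers currently receive

    var items = GetItemsFor(firstListId); // placeholder for the real lookup
    return Json(items, JsonRequestBehavior.AllowGet);
}
```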

Add URLs you don't want crawled to robots.txt.
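For example, assuming hypothetical routes for the AJAX actions (substitute your actual paths), a `robots.txt` at the site root could look like:

```
User-agent: *
# Hypothetical action routes; replace with your real AJAX URLs.
Disallow: /Home/PopulateSecondList
Disallow: /Wizard/SaveStep
```

Note that `robots.txt` is advisory: well-behaved crawlers honor it, but it is not an access control, so keeping a server-side check as well is reasonable.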

If a URL is reachable via GET, crawlers will try to crawl it. Returning a 404 is not technically correct, since the resource does exist, but it does work to deter crawlers from indexing the page.

Consider returning a 500 Internal Server Error or 501 Not Implemented.
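If you go that route, an action filter keeps the check out of every action. This is a sketch, not from the original post; `AjaxOnlyAttribute` is a name I made up, but `IsAjaxRequest` and `HttpStatusCodeResult` are standard ASP.NET MVC APIs:

```csharp
using System.Net;
using System.Web.Mvc;

// Sketch of a reusable filter that rejects non-AJAX requests
// with 501 Not Implemented instead of 404.
public class AjaxOnlyAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        if (!filterContext.HttpContext.Request.IsAjaxRequest())
            filterContext.Result =
                new HttpStatusCodeResult(HttpStatusCode.NotImplemented);
    }
}
```

You would then decorate each AJAX-only action with `[AjaxOnly]`, which also covers the POST actions in the multi-step form.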