You can tell web crawlers visiting your website what you do and do not want them to crawl by using:
robots.txt
If you don't already have a robots.txt:
Create a robots.txt file and save it in your webroot.
You can, for example, add the following lines to your robots.txt:
# Example: tell all crawlers not to crawl anything on this website
User-agent: *
Disallow: /
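The opposite case also works: an empty Disallow value means nothing is blocked. A minimal sketch:

```
# Example: explicitly allow all crawlers to crawl everything
User-agent: *
Disallow:
```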
You can configure robots.txt with rules for different User-agents (crawlers), folders, file types, and more. For details, look up a robots.txt guide with the search engine of your choice, or check the following one:
https://audisto.com/guides/robots.txt/
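As a sketch of such per-crawler rules, the following blocks a single crawler from one folder and from PDF files. "ExampleBot" is a placeholder name, and the wildcard patterns (* and $) are extensions honored by major search engines such as Google and Bing, not part of the original robots.txt standard:

```
# Example: keep one specific crawler out of a folder and out of PDF files
User-agent: ExampleBot
Disallow: /private/
Disallow: /*.pdf$
```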
Remark:
You can, for example, disallow all User-agents and then explicitly allow well-known search engine crawlers. Or, instead of disallowing everything, disallow only those User-agents that you know do things on your website you don't like. (Note that robots.txt only applies to crawlers, not to regular browsers, and that it is advisory: well-behaved bots follow it, but misbehaving ones may ignore it.)
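The first approach above could be sketched like this, using Googlebot (Google's crawler) as an example of an allowed crawler; a crawler uses the most specific User-agent group that matches it, so the general block does not apply to Googlebot here:

```
# Example: allow Googlebot, block all other crawlers
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
```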