Let us say I want to solve a large sparse linear system. Iterative solvers are said to be better than direct solvers in this case. But how large is large? What is the rough threshold beyond which I must use iterative solvers? A thousand-by-thousand matrix? A million-by-million matrix? I understand this depends on the particular situation, but I want some idea of how large a number we are talking about. Currently I have absolutely no idea what kind of size textbooks mean when they say "iterative methods are for large systems."
It is similar to having no idea which temperatures should be regarded as hot and which should be regarded as cold. I want somebody to tell me, "You can quite safely assume that anything beyond 40 °C is hot in normal circumstances, even though people whose primary occupation is to burn things might consider even 100 °C cold."
Let me make the question concrete. Please post the results of specific tests performed with direct and/or iterative methods. Please include information on the PDE (or other) problem, the preconditioners, the sizes, the dimensions, and so on: anything that might be important.
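To make it even clearer what kind of test I mean, here is a minimal sketch (assuming SciPy; the 2D Poisson model problem, the grid sizes, the solver pairing of SuperLU versus unpreconditioned CG, and the tolerance are all just illustrative placeholders I picked, not a claim about what the right benchmark is):

```python
import time
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def poisson_2d(n):
    """Standard 5-point-stencil Laplacian on an n-by-n grid (N = n^2 unknowns)."""
    T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
    I = sp.identity(n)
    return (sp.kron(I, T) + sp.kron(T, I)).tocsc()

for n in (100, 300, 1000):  # N = 1e4, 9e4, 1e6 unknowns; sizes chosen arbitrarily
    A = poisson_2d(n)
    b = np.ones(A.shape[0])

    t0 = time.perf_counter()
    x_direct = spla.spsolve(A, b)  # sparse direct solve (SuperLU)
    t_direct = time.perf_counter() - t0

    t0 = time.perf_counter()
    # Unpreconditioned CG; 'rtol' is the SciPy >= 1.12 keyword (older versions use 'tol')
    x_iter, info = spla.cg(A, b, rtol=1e-8)
    t_iter = time.perf_counter() - t0

    err = np.linalg.norm(x_iter - x_direct) / np.linalg.norm(x_direct)
    print(f"N = {A.shape[0]:>9d}: direct {t_direct:7.2f} s, "
          f"CG {t_iter:7.2f} s (info={info}, rel. diff {err:.1e})")
```

Timings from something like this on a real machine, together with the problem details and preconditioner choices, would already give me the rough sense of scale I am after.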
[Here is a frustrated rant: It feels like all textbooks, and even some comments and answers here, are trying to hide the exact numbers, much like those videos on the internet that claim to have something important to say, never actually say it, and in the end ask you to pay for the answer. They say "it depends on this, it depends on that," but never give any concrete numbers to convey at least a rough idea of the scale. I am sure there is a good reason, but I suspect it is that the experts no longer remember what it was like when they first learned the subject. It is almost as if they only want to talk amongst themselves, and only care what other experts think of what they say, even when they are addressing newcomers.]