So, I face this question every day. When I write code, sometimes I just swap some conditionals around to clean up the spaghetti I make, or replace one datatype with another, and suddenly my code runs much faster.
I'm not quite experienced enough to understand why, but my guess is that most of the time the optimizer recognizes some specific pattern in the code that it knows how to transform.
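To make that concrete, here's a minimal sketch of the kind of rewrite I mean (the function names and the branchless trick are just for illustration, not the exact change from my code):

```cpp
#include <cstdint>
#include <vector>

// Branchy version: the conditional inside the loop can keep
// some optimizers from auto-vectorizing it.
int64_t sum_positive_branchy(const std::vector<int32_t>& v) {
    int64_t sum = 0;
    for (int32_t x : v) {
        if (x > 0) sum += x;
    }
    return sum;
}

// Restructured version: the same logic written as a select,
// which optimizers more readily compile to branchless/SIMD code.
int64_t sum_positive_select(const std::vector<int32_t>& v) {
    int64_t sum = 0;
    for (int32_t x : v) {
        sum += (x > 0) ? x : 0;  // conditional move instead of a branch
    }
    return sum;
}
```

Whether the second version actually wins depends on the compiler, its version, and the flags, which is exactly what worries me: GCC and MSVC may make different choices on the same source.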
The thing is, I prefer to develop my applications on Linux, but I also target Windows. It's honestly a hassle to test on both platforms, so I just keep working on Linux and only test on Windows later.
That always leaves me with the question: "What if that change made it slower on MSVC?"
I wonder if there's a rule of thumb for when these optimizations tend to generalize across compilers, or if the only way to know is to profile on each one.
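For what it's worth, by profiling I mean at minimum a quick timing check like the sketch below (the loop body is just a stand-in; a real measurement would use a proper profiler such as perf or VTune and repeated runs):

```cpp
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <vector>

// Minimal timing harness, portable across GCC/Clang/MSVC.
int main() {
    std::vector<int32_t> data(1 << 24, 1);

    auto start = std::chrono::steady_clock::now();
    int64_t sum = 0;
    for (int32_t x : data) {
        sum += (x > 0) ? x : 0;  // the loop under test
    }
    auto end = std::chrono::steady_clock::now();

    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count();
    // Printing sum keeps the compiler from eliminating the loop entirely.
    std::printf("sum=%lld, %lld ms\n", (long long)sum, (long long)ms);
    return 0;
}
```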