At an advanced undergrad level, I encountered the "programming-versus-math" issue in the crypto course and in the equally "applied" but different error-correcting codes course that I developed in the late 1990s.
First, let me say that I myself find contemporary computers substantially helpful "even" for purposes that existed prior to them. Communication! Typesetting!
Computational issues in number theory (my general field of interest) are low-hanging fruit in terms of marketability to students, or to amateurs, and have considerable interest to any sensible person. But, of course, there's a limit to what experiment can suggest, even while keeping in mind that (if we were practical physicists) a thing untestable by experiment is nearly worthless.
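(By way of illustration, and not anything from the courses themselves, here is a tiny Python sketch of how experiment can mislead: Euler's polynomial n^2 + n + 41 is prime for every n from 0 to 39, so small-scale computation happily "suggests" it is always prime... yet it fails at n = 40, since 40^2 + 40 + 41 = 41^2.)

```python
# Euler's polynomial n^2 + n + 41: prime for every n from 0 to 39,
# so a small computational "experiment" suggests it is always prime...
# but it fails at n = 40, since 40^2 + 40 + 41 = 1681 = 41^2.

def is_prime(m):
    """Trial-division primality test; fine for numbers this small."""
    if m < 2:
        return False
    d = 2
    while d * d <= m:
        if m % d == 0:
            return False
        d += 1
    return True

for n in range(45):
    v = n * n + n + 41
    print(f"n={n:2d}   n^2+n+41={v:4d}   prime: {is_prime(v)}")
```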
Somewhat surprisingly to me, I found a bifurcation in the attitudes of students, between computational/experimental verification (or experiment!) and "proof".
In fact, much of the push-back against experiment from undergrads who claimed to exclusively endorse "proof" could be understood, under closer examination, to be a simple-minded resistance to "confusing inputs". In particular, a significant portion of the population was the sort of "math major" who "likes math" because "there are rules" and no general sensibility is relevant.
As a pathetic counterpoint, the people who showed up with some capacity to program (but mostly inefficient graphical interfaces, silly things, nothing high-performance, by a mile...) could not imagine that there would ever be any need to "prove" anything. In fact, the most bizarre conception I'd encountered up to that point was that... in computer science students' belief system... computers are so fast/good/whatever that no task is impossible... The reason I had trouble catching on to this conceit was that I'd have thought they'd have known that it is not known whether P equals NP, not to mention that the security of various cryptographic systems depends on certain tasks being difficult for anyone.
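(To put a number on that conceit, here is a back-of-the-envelope Python sketch, with the 10^12 keys-per-second rate purely an assumption for the sake of the arithmetic: exhaustively trying all 2^128 keys of a 128-bit cipher would still take on the order of 10^19 years.)

```python
# Back-of-the-envelope: brute-forcing a 128-bit key space.
keys = 2 ** 128                     # number of possible keys
rate = 10 ** 12                     # keys per second -- already very generous
seconds_per_year = 365.25 * 24 * 3600

years = keys / rate / seconds_per_year
print(f"exhaustive search: about {years:.2e} years")   # ~1.1e19 years
```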
Sad summary: the majority of kids attracted to math as an undergrad major were attracted for reasons violently opposite to computer-science reasons, while the computer-science kids had belief systems that made them unable to understand why/how mathematics was necessary or useful.
That is, apparently, there's a bifurcation in the general population between these ways of thinking. The time I've spent trying to cajole people into seeing the opposite possibility... has not been repaid in any way, sadly.
But, yes, in the ideal new world, people would learn a few computer languages, especially some scripting languages, and learn some mathematics, and be able to do things...
Here, yet again, I find that the implicit aspects of the question are the real trouble... Yes, I've tried to have an impact on curriculum, and on "attitude", but "computing" and "math" each do seem to have angry, uncooperative constituencies already established... Whah?
Anyway, I honestly think a person would be a fool to not learn about computing (not to mention communication) if they could encompass it, whether or not they were interested in mathematics. Srsly, this is a sort of tail obviously wagging the dog thing, for most purposes. (And, with luck, I will become rich by using Fourier analysis to break the stock market, of course.)