His response was that it was going to use more resources if it was array instead of text and that I should focus on my programming instead of wasting resources.
Evidence?
If he's arguing performance against what'd otherwise be a clarity and correctness improvement, the onus is on him to provide evidence.
Delimited text is usually the wrong way to do it - see e.g. this related question.
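As a rough illustration of why (the table and column names here are made up), compare what queries look like against a typed array column versus a delimited string:

```sql
-- Hypothetical table: the same list stored as a typed array and as delimited text.
CREATE TEMP TABLE demo (
    tag_ids integer[],  -- typed array
    tag_csv text        -- comma-delimited alternative
);

INSERT INTO demo VALUES (ARRAY[1, 2, 3], '1,2,3');

-- Array: the test is type-checked, and a GIN index on tag_ids would support @>.
SELECT * FROM demo WHERE tag_ids @> ARRAY[2];

-- Delimited text: every query has to re-parse and re-cast the string first.
SELECT * FROM demo WHERE 2 = ANY (string_to_array(tag_csv, ',')::integer[]);
```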
I don't know enough about PostgreSQL to know if the use of arrays in this case would be a concern.
It depends on how you're constructing the arrays. Try to create them in a single pass using array_agg, an ARRAY[] constructor, or similar. Iteratively building them by concatenating in a loop tends to be very slow, because at the SQL level arrays are immutable: the whole array has to be copied every time you append to it.
Of course, the same is true of text at the SQL level, and the same principles apply.
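For example (a minimal sketch; the element counts are arbitrary), the single-pass forms versus the iterative one:

```sql
-- Single pass (fast): let the aggregate build the array in one go.
SELECT array_agg(g) FROM generate_series(1, 100000) AS g;

-- Iterative (slow): each append copies the whole array.
DO $$
DECLARE
    a integer[] := '{}';
BEGIN
    FOR i IN 1 .. 100000 LOOP
        a := a || i;  -- the array is copied on every iteration
    END LOOP;
END
$$;

-- The same goes for text: prefer string_agg() over repeated || in a loop.
SELECT string_agg(g::text, ',') FROM generate_series(1, 100000) AS g;
```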
My questions are: do PostgreSQL dynamic arrays use up more memory than text?
It depends on the underlying type. In general, if t1 uses more memory than t2, then an array of t1 uses more memory than an array of t2.
The internal storage of arrays depends a little on how they're created, but in general they add little or no memory overhead over the base type.
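If you want numbers for your own data rather than a rule of thumb, pg_column_size() reports the bytes used to store a value, so you can compare the two representations directly (the sample values are arbitrary):

```sql
SELECT pg_column_size(ARRAY[1, 2, 3, 4, 5]::integer[])         AS int_array_bytes,
       pg_column_size('1,2,3,4,5'::text)                       AS int_csv_bytes,
       pg_column_size(ARRAY['alpha', 'beta', 'gamma']::text[]) AS text_array_bytes,
       pg_column_size('alpha,beta,gamma'::text)                AS text_csv_bytes;
```

Note that an array carries a small fixed header, so a very short array can measure larger than the equivalent short string.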
And if so, is it a significant enough concern that the use of arrays should be dropped on large queries?
Yes, but no more of a concern than using delimited text.
Try to return result sets, not huge single results.
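As a sketch of that (the orders table and its columns are hypothetical), the difference is between collapsing everything server-side into one big value the client must unpack, and simply streaming rows:

```sql
-- One huge single result per group: the whole array is built and shipped at once.
SELECT customer_id, array_agg(order_id) AS all_orders
FROM orders
GROUP BY customer_id;

-- A plain result set: the client fetches and processes rows as they arrive.
SELECT customer_id, order_id
FROM orders
ORDER BY customer_id;
```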