Before I start the question, I'd like you to know that I have checked the other threads on taking the log of variables, but I still think I have a question that hasn't been touched on yet. I would also like to thank whuber for his lengthy answer to another log question here.
This question specifically relates to one of the reasons why we take logs, namely transforming the distribution of the data. When we take the log of a variable, it is usually because the distribution of the variable is skewed and we want to give it a normal distribution. A common example of this in OLS regressions in economics is a variable denoting wages, income, GDP, etc.

However, no one ever seems to mention the central limit theorem (CLT). The CLT says that the sum of many random variables will be approximately normally distributed even if their underlying distributions are not normal. If the error is a combination of the random variables $Y$ and $X$, $\epsilon = Y - X\beta$, then surely the error will be normally distributed regardless of the distributions of $X$ and $Y$. If this holds (and the CLT seems to hold under pretty weak conditions), then why would we need to transform the variable?
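To make the CLT intuition behind my question concrete, here is a quick numerical sketch (the exponential distribution is just an arbitrary skewed example, and the averaging setup is my own illustration, not a model of the regression error itself):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_means(n_terms, n_obs=100_000):
    # Each observation is the average of n_terms draws from a
    # heavily right-skewed distribution (exponential, skewness = 2).
    draws = rng.exponential(scale=1.0, size=(n_obs, n_terms))
    return draws.mean(axis=1)

def skewness(x):
    # Standardized third moment of a sample.
    c = x - x.mean()
    return (c**3).mean() / (c**2).mean() ** 1.5

single = sample_means(1)      # raw exponential draws: skewness near 2
averaged = sample_means(100)  # averages of 100 draws: skewness near 2/sqrt(100)

print(skewness(single), skewness(averaged))
```

The averages of many skewed draws look far more normal than the raw draws, which is exactly the CLT behaviour I am appealing to when I suggest the error term should come out normal.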

