
I was asked this question in a mock interview, and I was really surprised by how awkward the answer turned out to be...

Consider this macro:

#define SQR(x) (x*x)

Example 1:

SQR(2) //prints 4

Example 2:

If SQR(1+1) is given, it doesn't first sum 1+1 to 2, but rather ...

SQR(1+1) //prints 3

Awkward, right? What is the reason? How does this code work?
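
For reference, here is a minimal program that reproduces this (assuming a standard C compiler; printf is only there to show the values):

#include <stdio.h>

#define SQR(x) (x*x)

int main(void)
{
    printf("%d\n", SQR(2));   /* prints 4 */
    printf("%d\n", SQR(1+1)); /* prints 3, not 4 */
    return 0;
}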

NOTE: I searched SO but couldn't find any relevant questions. If there are any, please share them!

abu

3 Answers


SQR(1+1) expands to 1+1*1+1, which is 3, not 4, because * binds more tightly than +.

A correct definition of the macro would be

#define SQR(x) ((x)*(x))

which expands to (1+1)*(1+1) and, more importantly, shows one of the reasons you shouldn't use macros where they aren't needed. The following is better:

inline int SQR(int x)
{
    return x*x;
}

Furthermore, SQR(i++) would be undefined behavior if SQR is a macro (i gets incremented twice with no sequence point in between), and completely correct if SQR is a function.
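
To make that difference concrete, here is a small sketch of my own (using static inline so it builds as a standalone C file; SQR_MACRO is just a name I picked for the macro version):

#include <stdio.h>

#define SQR_MACRO(x) ((x)*(x))

static inline int SQR(int x)
{
    return x*x;
}

int main(void)
{
    int i = 2;

    /* SQR_MACRO(i++) would expand to ((i++)*(i++)): i is modified twice
       with no sequence point in between, which is undefined behavior. */

    /* The function evaluates i++ exactly once: SQR receives 2, then i becomes 3. */
    int good = SQR(i++);
    printf("%d %d\n", good, i); /* prints "4 3" */
    return 0;
}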

Luchian Grigore

The problem is that macros do textual substitution before the code is compiled, so the macro expands to 1+1*1+1.
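
You can watch the substitution happen by running only the preprocessor (for example gcc -E sqr.c or clang -E sqr.c; the macro names below are just illustrative):

/* sqr.c */
#define SQR_BAD(x)  (x*x)
#define SQR_GOOD(x) ((x)*(x))

int a = SQR_BAD(1+1);  /* preprocessed output: int a = (1+1*1+1);     value 3 */
int b = SQR_GOOD(1+1); /* preprocessed output: int b = ((1+1)*(1+1)); value 4 */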

Devolus

That is why you should always put macro arguments in parentheses:

#define SQR(x) ((x)*(x))
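
A quick check of my own confirms the parenthesized version behaves as expected; note that it still evaluates its argument twice, so something like SQR(i++) remains a problem:

#include <stdio.h>

#define SQR(x) ((x)*(x))

int main(void)
{
    printf("%d\n", SQR(1+1)); /* expands to ((1+1)*(1+1)) and prints 4 */
    return 0;
}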
Philipp T.