7

According to K&R C section 1.6, a char is a type of integer. So why do we need `%c`, and why can't we use `%d` for everything?

Shash
  • @MaziarBouali Not necessarily. – Pubby Jun 08 '12 at 11:31
  • `printf` needs to know the size of the argument in order to print it (to cast it appropriately, so to speak). A `char` has size 1; an `int` has at least that, and on most machines more. Also, when using `%c` you want a character printed, not a number. In the `D` language, you would always use `%s` and let the compiler worry about the types. – Matej Jun 08 '12 at 11:32
  • @MatejNanut No, integer types smaller than `int` are promoted to `int` when being passed into a variadic function. – Pubby Jun 08 '12 at 11:35
  • @Pubby: thank you, I didn't know that. However, there is still this ambiguity with `long`s, which are integers but for which you can't (or shouldn't) use `%d`. – Matej Jun 08 '12 at 11:36

6 Answers

29

Because `%d` will print the numeric character code of the `char`:

printf("%d", 'a');

prints 97 (on an ASCII system), while

printf("%c", 'a');

prints `a`.

Fred Foo
  • And don't forget `scanf`: `%d` reads and converts numeric strings, whereas `%c` reads individual characters. – John Bode Jun 08 '12 at 15:00
4

While it's an integer, `%c` tells `printf` to interpret its numeric value as a character for display. For instance, for the character a:

If you used `%d` you'd get an integer, e.g., 97, the internal representation of the character a,

vs.

using `%c` to display the character a itself (if using ASCII).

I.e., it's a matter of internal representation vs. interpretation for external purposes (such as with `printf`).

Levon
1

Roughly speaking, `%c` prints the ASCII representation of the character, while `%d` prints its decimal value.

Diego Sevilla
1

If you use `%c`, you'll print (or scan) a character, or `char`. If you use `%d`, you'll print (or scan) an integer.

printf("%d", 0x70);

How would the machine know whether you want to output a character or its numeric (ASCII) value?

Donotalo
-1

With `%d`, the `char` argument is promoted to `int` (often 32-bit) and printed as an integer; with `%c`, the integer value is interpreted as a character code and the corresponding character is printed.

-1

`%d` is used to print a decimal (integer) number, while `%c` is used to print a character. If you try to print a character with the `%d` format, the computer will print the ASCII code of the character.

Kypros
kapil