7

I am comparing different CNN architectures for edge implementation. Some papers describing architectures refer to mult-adds, as in the MobileNet V1 paper, which claims the network uses 569M mult-adds, while others refer to floating-point operations (FLOPs), as in the CondenseNet paper, which claims 274M FLOPs.

Are these comparable? Is 1 multiply-add equivalent to 2 floating-point operations? Any direction will be greatly appreciated.

nbro
Quintus
  • Yes, if I have to take a guess. The clue is in the name: multiply with weights and add. Although it should be more FLOPs, since there are many multiplications in a single neuron's input. –  Sep 08 '20 at 19:00
  • One of the papers that you mention states "Throughout the paper, FLOPs refers to the number of multiplication-addition operations", and this is consistent with my knowledge of FLOPS, which are pretty standard in computational science (i.e. analysis and implementation of numerical algorithms, etc.). However, it is the first time I hear of "mult-add", to be honest, so I would need to check what those papers actually refer to when they calculate the "mult-adds". – nbro Sep 08 '20 at 22:23
  • This answer says they are equivalent, but it doesn't explain why. However, this answer seems to provide more insight. Maybe later I will provide a more formal answer. – nbro Sep 08 '20 at 22:35

1 Answer

4

According to this source, one MAC (multiply-accumulate) is roughly equivalent to two FLOPs, since a MAC performs one multiplication and one addition. My guess/understanding is that the distinction is made because neural networks spend their compute overwhelmingly on multiply-accumulate operations, so optimizations and statistics expressed in MACs are more meaningful for these workloads than raw FLOP counts.
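To make the relationship concrete, here is a minimal sketch of how MACs are typically counted for a standard 2D convolution layer, using the 1 MAC = 2 FLOPs convention from the answer. The layer dimensions in the example are taken from the first layer of MobileNet V1 (a 3x3 convolution, 3 input channels, 32 filters, producing a 112x112 output); the function names are my own for illustration.

```python
# Sketch: counting MACs and FLOPs for a standard 2D convolution.
# Each output element requires (K_h * K_w * C_in) multiply-accumulates,
# and there are (H_out * W_out * C_out) output elements.

def conv2d_macs(kernel_h, kernel_w, c_in, h_out, w_out, c_out):
    """Number of multiply-accumulate operations for one conv layer."""
    return kernel_h * kernel_w * c_in * h_out * w_out * c_out

def macs_to_flops(macs):
    """Convert MACs to FLOPs under the convention 1 MAC = 2 FLOPs
    (one multiplication + one addition)."""
    return 2 * macs

# Example: MobileNet V1's first layer (3x3 conv, 3 -> 32 channels,
# stride 2 on a 224x224 input, so the output is 112x112).
macs = conv2d_macs(3, 3, 3, 112, 112, 32)
print(macs)                  # 10838016, i.e. ~10.8M mult-adds
print(macs_to_flops(macs))   # 21676032, i.e. ~21.7M FLOPs
```

Summing this quantity over all layers gives the network-wide mult-add count that papers like MobileNet V1 report; multiplying by two gives the corresponding FLOP count under this convention.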

k.c. sayz 'k.c sayz'