
On the Machine Learning Mastery website, I don't understand this paragraph: "P(A|B) = P(B|A) * P(A) / P(B). We can simplify this calculation by removing the normalizing value of P(B) and describe the conditional probability as a proportional quantity. This is useful as we are not interested in calculating a specific conditional probability, but instead in optimizing a quantity."

Why do they say we can remove P(B), leaving just P(B|A)*P(A)? Won't that distort which value is the maximum?

I will give my own example (my view might be a little distorted). Both examples come from the same samples:

EX(1): P(B|A) = 3/4, P(A) = 1/2, so P(B|A)*P(A) = 3/8 and [P(B|A)*P(A)]/P(B) = 3/5.
EX(2): P(B|A) = 2/4, P(A) = 1/2, so P(B|A)*P(A) = 2/8 and [P(B|A)*P(A)]/P(B) = 2/3.

By [P(B|A)*P(A)]/P(B), EX(2) is better than EX(1), but when we remove P(B), EX(1) is clearly better. Am I missing something?
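A minimal sketch of the point the quoted paragraph is making: when the observed event B is held fixed and only the hypothesis A varies, P(B) is the same constant in every posterior, so dropping it cannot change which A wins. The numbers below are hypothetical, not the ones from the example above (where the two cases implicitly use different values of P(B)).

```python
# Compare two candidate hypotheses A1 and A2 for one fixed observation B.
# Hypothetical numbers for illustration only.
likelihood = {"A1": 0.75, "A2": 0.50}   # P(B|A) for each candidate A
prior      = {"A1": 0.50, "A2": 0.50}   # P(A)

# Unnormalized scores: P(B|A) * P(A)
scores = {a: likelihood[a] * prior[a] for a in likelihood}

# P(B) by the law of total probability -- the SAME constant for every A
p_b = sum(scores.values())

# Full posteriors: P(A|B) = P(B|A) * P(A) / P(B)
posteriors = {a: s / p_b for a, s in scores.items()}

# Dividing every score by the same positive constant preserves the ordering,
# so both rankings pick the same hypothesis.
best_unnormalized = max(scores, key=scores.get)
best_posterior = max(posteriors, key=posteriors.get)
assert best_unnormalized == best_posterior
print(best_unnormalized)  # -> A1
```

The confusion in the example above arises because EX(1) and EX(2) correspond to different events B (their implied P(B) values, 5/8 and 3/8, differ), so P(B) is not a shared constant there and dropping it is not valid across the two cases.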

  • See posterior is proportional to joint? and Tim's answer, specifically the "The constant P(D) can be dropped..." part. – user2974951 Jan 30 '24 at 13:49
  • @user2974951 No, what I want to ask is why we can remove P(B). From what I know, P(A|B) = P(B|A) * P(A) / P(B) gives us a normalized value, meaning the denominator keeps changing, but when we remove P(B) the denominator is held constant. Why can we maximize with P(B) removed? – Kelvin Wijaya Jan 30 '24 at 14:02
  • Is $P(B)$ varied in any way for the maximization problem? If not, then all you care about is whether $P(B)$ is negative (that would change the problem into a minimization). But you know $P(B)$ cannot be negative. – whuber Jan 30 '24 at 14:10
  • @whuber I will give my own example (my view might be a little distorted). Both examples come from the same samples: EX(1): P(B|A) = 3/4, P(A) = 1/2, P(B|A)P(A) = 3/8, [P(B|A)P(A)]/P(B) = 3/5. EX(2): P(B|A) = 2/4, P(A) = 1/2, P(B|A)P(A) = 2/8, [P(B|A)P(A)]/P(B) = 2/3. By [P(B|A)P(A)]/P(B), EX(2) is better than EX(1), but when we remove P(B), EX(1) is clearly better. Am I missing something? – Kelvin Wijaya Jan 30 '24 at 15:41
  • You aren't maximizing anything: you're just doing some calculations. – whuber Jan 30 '24 at 16:12

0 Answers