I have two metrics for measuring source code cohesion. To verify that the two agree, can I use Cohen's kappa? Both metrics take values in the range $[0, 1]$. They do not need to have identical values: if both metrics move in the same direction (both increasing or both decreasing), I count that as agreement (the "common" case); if one increases while the other decreases, I count that as a conflict.
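(A minimal sketch of one way to formalize the rule described above, not part of the original question: the continuous scores are discretized into a direction of change per version, which yields categorical labels that Cohen's kappa can be applied to. The metric values and variable names are made up for illustration.)

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical cohesion scores for consecutive versions of a module.
metric_a = np.array([0.60, 0.72, 0.65, 0.80, 0.75])
metric_b = np.array([0.55, 0.70, 0.68, 0.85, 0.79])

# Sign of the change between successive versions:
# +1 = increase, -1 = decrease, 0 = no change.
dir_a = np.sign(np.diff(metric_a)).astype(int)
dir_b = np.sign(np.diff(metric_b)).astype(int)

# Same sign -> the metrics agree ("common"); opposite signs -> conflict.
print(cohen_kappa_score(dir_a, dir_b))
```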
This needs more detail for a proper answer. Can you edit your question to include the nature of the metrics (continuous, categorical with how many categories and ordered or not ordered), a definition of what you mean by close and by conflict? – mdewey Apr 22 '19 at 12:34
1 Answer
If I understood correctly, you have a continuous variable: the values can be any number in the interval $[0, 1]$. In that case Cohen's kappa is not appropriate, since it is designed for categorical ratings.
For the continuous case I would suggest the mean squared deviation or the intra-class correlation coefficient (ICC), both of which quantify agreement between two raters on a continuous scale.
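(A minimal sketch of both suggestions, assuming made-up scores for six modules: the mean squared deviation is computed directly with NumPy, and the ICC via the third-party `pingouin` package.)

```python
import numpy as np
import pandas as pd
import pingouin as pg

# Hypothetical cohesion scores for 6 code modules from the two metrics.
metric_a = np.array([0.82, 0.45, 0.67, 0.91, 0.33, 0.58])
metric_b = np.array([0.78, 0.50, 0.71, 0.88, 0.40, 0.55])

# Mean squared deviation: 0 means perfect agreement.
msd = np.mean((metric_a - metric_b) ** 2)
print(f"Mean squared deviation: {msd:.4f}")

# Reshape to long format: one row per (module, metric) rating,
# treating each metric as a "rater" of each module.
df = pd.DataFrame({
    "module": np.tile(np.arange(len(metric_a)), 2),
    "metric": ["A"] * len(metric_a) + ["B"] * len(metric_b),
    "score": np.concatenate([metric_a, metric_b]),
})
icc = pg.intraclass_corr(data=df, targets="module",
                         raters="metric", ratings="score")
print(icc[["Type", "ICC"]])
```

`pingouin` reports several ICC variants; which one is appropriate depends on whether the two metrics are treated as fixed or as a random sample of possible raters.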