The input to my program is an (x, y) integer coordinate inside the blue region of this circle of radius 100. I want to scale the input coordinate from the blue area to the red area, maintaining the x and y ratios.
(link to the Desmos plot: https://www.desmos.com/calculator/61f4y2r7r4)
I know how to do this in one dimension - this answer gives a good overview of linear scaling. I attempted to apply this approach to the x and y axes separately. Here is some example code that I wrote to model the image.
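For reference, the one-dimensional mapping I'm trying to apply is roughly this (my sketch of the formula from that answer, where min_old/max_old are the source range and min_new/max_new the target range):

    new_val = (val - min_old) * (max_new - min_new) / (max_old - min_old) + min_new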
#include <math.h>
#include <stdio.h>

#define RADIUS 100

/* For a given x (or y) value, return the corresponding y (or x)
   value on the circle's circumference. */
static int find_point_on_circumference(int val) {
    return sqrt(RADIUS * RADIUS - val * val);
}

/* Linearly remap val from the range [min_old, max_old]
   to the range [min_new, max_new]. */
static int scale(int val, int min_old, int max_old, int min_new, int max_new) {
    return (max_old == min_old)
        ? val
        : (((val - min_old) * (max_new - min_new)) / (max_old - min_old) + min_new);
}

int main() {
    int x = 15;
    int y = 96;

    /* Old (blue) and new (red) inner boundaries. */
    int boundary_old_x = 10;
    int boundary_old_y = 10;
    int boundary_new_x = 30;
    int boundary_new_y = 20;

    /* Scale x between the old boundary and the circumference at this y,
       and the new boundary and that same circumference point. */
    int new_x = scale(
        x,
        boundary_old_x,
        find_point_on_circumference(y),
        boundary_new_x,
        find_point_on_circumference(y));

    /* Same for y, using the circumference at this x. */
    int new_y = scale(
        y,
        boundary_old_y,
        find_point_on_circumference(x),
        boundary_new_y,
        find_point_on_circumference(x));

    /* The scaled point should still be inside the circle. */
    if (sqrt(new_x * new_x + new_y * new_y) <= RADIUS) {
        printf("SUCCESS\n");
    } else {
        printf("FAIL\n");
    }
    return 0;
}
For this particular input, (15, 96), the result is outside of the circle. I can see that the reason is that my new max-x bound ends up less than my new min-x bound. I'm just not sure how I should be applying this scaling correctly in the first place.
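For what it's worth, here is my hand trace of the calls for (15, 96) (my own arithmetic, assuming C's truncating integer division):

    find_point_on_circumference(96) = (int)sqrt(10000 - 9216) = 28
    new_x = scale(15, 10, 28, 30, 28)    /* max_new (28) < min_new (30), gives new_x = 30 */
    find_point_on_circumference(15) = (int)sqrt(10000 - 225) = 98
    new_y = scale(96, 10, 98, 20, 98)    /* gives new_y = 96 */
    sqrt(30*30 + 96*96) = sqrt(10116) > 100   /* prints FAIL */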