Intro
I want to render celestial bodies based on their real-world positions in a geocentric system using OpenGL. I pull the daily updated real-world coordinates from NASA's Horizons system via a custom Python script and write them to a CSV file. Here is an example of the CSV file that I pass as input to my C++/OpenGL program:
The figure shows that the coordinates lie far outside the normalized device coordinate range [-1, 1], so OpenGL clips them and renders nothing. I store this positional data in a `std::vector<float>`, which I then pass to the vertex buffer object:
RenderingData convert(const EphemerisParsingResult &parsing_result) {
    const size_t num_positions = parsing_result.num_parsed_rows;
    const size_t vector_dimension = 3; // x, y, z
    std::vector<float> positions(num_positions * vector_dimension);
    for (size_t i = 0; i < num_positions; ++i) {
        positions[i * vector_dimension + 0] = static_cast<float>(parsing_result.position_x_km[i]); // x coordinate of ephemeris i
        positions[i * vector_dimension + 1] = static_cast<float>(parsing_result.position_y_km[i]); // y coordinate of ephemeris i
        positions[i * vector_dimension + 2] = static_cast<float>(parsing_result.position_z_km[i]); // z coordinate of ephemeris i
    }
    RenderingData renderingPacket;
    renderingPacket.positions = positions;
    return renderingPacket;
}
// ...
VertexBufferObject::VertexBufferObject(const std::vector<float> &vertices, const GLenum usage)
: AbstractOpenGlBufferObject(GL_ARRAY_BUFFER, vertices.size(), sizeof(float) * vertices.size()),
vertices_(vertices), usage_(usage) {
}
void VertexBufferObject::setBufferData() {
    glBufferData(GL_ARRAY_BUFFER, getUsedMemorySize(), vertices_.data(), usage_);
}
Question
How do I transform such real-world coordinates (in km) into normalized device coordinates? I would appreciate the exact transformation matrix.