11

I am no astronomer. I am a computer scientist. I recently read this article: JWST has changed the speed of discovery, for better or for worse - Astronomers are working at a furious pace to analyze and test whopping amounts of JWST data.

Are astronomers still searching the sky somewhat at random for points of interest, or is the astronomy community focused on observing specific regions of interest (say, a few hundred uncommon objects and events: black holes, supernovae, exoplanets, nebulae and galaxies)?

Is the main type of data astronomers get from JWST (or any other orbital telescope) optical images? It would amaze me if most of those daily gigabytes of data the article mentions were raw data (measurement values, for example, rather than pixels).

Does this field use any kind of algorithmic identification of points of interest, to discard the dull and common matter passing through the observing optics?

For example: after taking a high-resolution image of the sky in some part of the electromagnetic spectrum, the telescope would do the following:

  1. Catalogue the coordinates of the depicted region and other metadata such as the observation time (as I presume is already being done).
  2. Keep only the pixels of interest, as set by the astronomer: regions that exceed a luminosity threshold, or a density (cluster of pixels) of certain elements, or a distance from the telescope (I am not sure whether the last is something the telescope can measure).
  3. Preserve the filtered information and discard the rest, so that less data is sent back to Earth, in a format that can be reconstructed as an image (far from the true image of that region, but still usable for study, since the retained pixels carry the important data).
  4. Compress these data further with structural reconstruction algorithms: if the final image is thousands of pixels wide but the information that survived filtering occupies only a few small regions, then only the coordinates of those pixel clusters need be saved, with the rest assumed blank. The reconstructed image on Earth would still have the same dimensions.
  5. After reviewing these data, if the astronomer (or another filtering algorithm) deems a region worthy of interest, they could ask the telescope for a full, unfiltered exposure of it for further review.

Steps 1-4 could be executed by the telescope's computer before the data are sent to Earth, and steps 2 & 3 could even run while scanning (before the image is complete), without having to post-process the complete image.
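To make the idea concrete, here is a minimal sketch of steps 2-4 (a toy illustration only; the function names, threshold and packet format are hypothetical, not any real telescope pipeline):

```python
import numpy as np

def filter_and_compress(image, threshold):
    """Step 2+4: keep only pixels above a luminosity threshold and store
    them sparsely as (row, col, value) triples plus the image shape.
    Everything below the threshold is assumed blank on reconstruction."""
    rows, cols = np.nonzero(image > threshold)
    values = image[rows, cols]
    return {"shape": image.shape, "rows": rows, "cols": cols, "values": values}

def reconstruct(packet):
    """Step 3, ground side: rebuild a same-sized image; discarded pixels
    become zero, so the result is far from the true image but keeps the
    measured values of the interesting pixels."""
    image = np.zeros(packet["shape"], dtype=packet["values"].dtype)
    image[packet["rows"], packet["cols"]] = packet["values"]
    return image

# Example: a mostly dark 1000x1000 frame with one bright pixel cluster
frame = np.zeros((1000, 1000))
frame[500:503, 500:503] = 9.0
packet = filter_and_compress(frame, threshold=1.0)
print(len(packet["values"]))   # 9 pixels kept out of 1,000,000
restored = reconstruct(packet)
```

The sparse packet here is tiny compared with the full frame, which is the whole point of step 4.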

Forgive my ignorance if this pipeline is already applied. I'm not trying to sound smart.

Demis

3 Answers

22

JWST operates in a mode where groups of astronomers make detailed proposals to observe particular objects with particular instruments. There are no random pointings, although objects of interest may well be observed serendipitously in the same field of view. There will also be some deep survey fields which, although not targeted at specific objects, are targeted at particular positions in the sky. The amount of data taken is small compared with the rate at which the data can be transferred to Earth - so ALL the data is transmitted.

Each observing programme takes a fair bit of exposure time. You can see the observing schedules, and you will notice that a typical observing programme might take between 1 and 20 hours. During these programmes the instruments might take from a few to a few tens of images or spectra; i.e. the shutter is open for some considerable time before the exposure ends and the data are recorded as an image. You can find these raw images at the Mikulski Archive for Space Telescopes (MAST).

For example, I searched for JWST data on HIP 65426 (which was observed in July 2022). There are 34 data images; some of them are calibrations, others are exposures of thousands of seconds. Each image is about 40 MB in size (for this instrument). So getting ALL the data requires a transmission rate of about 34 x 40 MB / 6 hours ≈ 63 kB/s.
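The arithmetic, taking the figures above at face value (34 images of roughly 40 MB each, downlinked over a 6-hour window):

```python
# Average downlink rate needed to ship all the HIP 65426 images
# within the observing window (figures from the answer above).
n_images = 34
image_size_MB = 40
window_s = 6 * 3600                         # 6 hours in seconds

total_kB = n_images * image_size_MB * 1000  # 1 MB = 1000 kB
rate_kBps = total_kB / window_s
print(round(rate_kBps))                     # ≈ 63 kB/s
```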

Apparently, data transfer from JWST to the ground stations can take place at about 3.5 MB/s, so there is not the slightest problem in downloading all the data for later analysis.

The same is not true for other space missions. For example, the Gaia astrometry satellite takes vastly more data than its download speed to Earth allows. The difference here is that Gaia is a scanning, imaging instrument that is continuously surveying the whole sky. Here there is a scheme, much like you propose, where an onboard computer selects the portions of the data (basically postage stamps around stars) for transmission back to Earth.
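The "postage stamp" idea can be sketched like this (a toy illustration under simplified assumptions - Gaia's actual onboard detection and windowing is far more sophisticated):

```python
import numpy as np

def cut_postage_stamps(frame, detection_threshold, half_size=3):
    """Toy onboard source selection: find pixels above a detection
    threshold and keep only a small window ("postage stamp") around
    each one for transmission; the rest of the frame is dropped."""
    stamps = []
    for r, c in zip(*np.nonzero(frame > detection_threshold)):
        r0, r1 = max(r - half_size, 0), min(r + half_size + 1, frame.shape[0])
        c0, c1 = max(c - half_size, 0), min(c + half_size + 1, frame.shape[1])
        stamps.append(((r0, c0), frame[r0:r1, c0:c1].copy()))
    return stamps

rng = np.random.default_rng(0)
frame = rng.normal(0.0, 1.0, (200, 200))    # background noise
frame[100, 100] += 50.0                     # one bright star
stamps = cut_postage_stamps(frame, detection_threshold=10.0)
print(len(stamps))                          # 1 stamp of 7x7 pixels
```

Only the stamp (49 pixels plus its corner coordinate) would be transmitted instead of the 40,000-pixel frame.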

ProfRob
  • Very informative hyperlinks ! – Demis Dec 24 '22 at 14:49
  • I recently had the pleasure of speaking with people from ICRAR and the Pawsey Supercomputing Centre about radioastronomy and the Square Kilometre Array. As I understand it, practically none of this stuff happens in real time. They write a proposal, point the telescope, collect the data (all the data, no on-site filtering), and then spend a lot of core hours doing the analysis. That’s ground-based, but it sounds like JWST is similar: able to transfer the data plenty fast enough, so the big work is going to be processing it. – Tim Pederick Dec 25 '22 at 08:49
  • See Popular Science article of 2022-12, A fierce competition will decide James Webb Space Telescope’s next views of the cosmos, subtitled JWST had a busy first year in space, but astronomers are already vying for observation time in 2023. – Basil Bourque Dec 26 '22 at 06:04
  • Other examples of downlink-hungry space telescopes are the European Euclid (to launch in 2023) and NASA's Roman space telescope (in development, planned to launch in the late 2020s). Both are telescopes for cosmological surveys, to study how "dark energy" and "dark matter" work. Both need to observe large parts of the sky (thousands of square degrees), so data rates will be high - hundreds of gigabits for Euclid and up to several terabits for Roman, per day. – Heopps Dec 27 '22 at 07:16
  • @TimPederick Sure, most astronomical observations these days use recorded data. It's not like the old days of the lonely astronomer sitting in a cold mountain observatory, peering through a telescope in the wee hours of the morning. OTOH, looking at astronomical objects through a good telescope is an awesome experience. :) – PM 2Ring Dec 28 '22 at 19:38
  • @TimPederick The JWST does have procedures to respond to unpredictable "real time" phenomena, but even in those cases the turnaround time is a couple of days. See https://jwst-docs.stsci.edu/methods-and-roadmaps/jwst-target-of-opportunity-observations for details. Also see https://astronomy.stackexchange.com/q/47727/16685 – PM 2Ring Dec 28 '22 at 19:40
9

No, not randomly, see Is pointing a telescope at a random place a viable astronomical strategy?

The closest thing to "pointing randomly and seeing what is there" would be "deep field" images, pointing at a carefully chosen area of sky in which nothing seems to be present and looking for a long time to see what very distant galaxies are visible.

Nearly all observations are targeted and done with specific intent.

Your "pipeline" is at best pointless, probably harmful and certainly impossible. The "issue" is not with transferring data from the telescope to Earth, but with interpreting and analysing that data. The interpretation and analysis is a task for a natural intelligence (ie a scientist). That is because the interpretation is "hard". There's no bottleneck at the data transmission from the telescope, and if you want to "filter" the data, you can do that on Earth.

It seems that your basic assumption is that the JWST is taking random photos of the sky, and there is a problem handling the mass of irrelevant data. That's not the case. The "problem" is that there is a massive amount of relevant data that needs interpretation and analysis. I say "problem", because having too much good data is a nice problem to have.

So to answer the title question, the main occupation of scientists working with JWST data is understanding what we can learn from the data.

James K
  • Thank you for your clarifications. So the most common jobs for JWST is to further study existing points of interest and not for space exploration/cartography right ? And the main form of data sent back to earth is still of visual form (including spectra out of the human eye perception) ? – Demis Dec 24 '22 at 14:41
  • Yes, and yes, images and spectrograms are the main form of data. Strictly speaking not "visual", since it is an infrared telescope. – James K Dec 24 '22 at 14:49
  • @Demis JWST has too narrow a field of view to contribute to cartography. The exploration it is designed for is looking at objects extremely far away, which means it can only see a small patch of sky at a time. It's kind of like asking an algorithm to be both exponential and polynomial: it doesn't work. Regarding its spectrum, only one instrument barely reaches visible light. Visible light is about 380 nm to 750 nm; JWST sees 600 nm to 28,500 nm. The data arrive raw: you need an image processor to convert direct JWST data to an image. The images you see all have manual adjustments applied after arriving on Earth. – David S Dec 30 '22 at 22:01
9

Generally, observation time is scarce and the pointing of the telescopes is very accurate over extended periods of time; e.g. the pointing accuracy and stability of JWST are given as "The pointing stability for moving targets over a 1000-s exposure is estimated to be between 6.2 and 6.7 mas (1-σ)".

As a scientist you can apply for observation time in advance (usually 6 or 12 months prior). Your application will be peer-reviewed by the telescope's scientific panel and, depending on various criteria, time will be allotted to different research proposals. These are then given time frames in which the telescope will be commanded to look with the requested instruments at the proposed directions or objects.

See e.g. here for the JWST's approved observations. As one of the grant receivers you will get all the raw data of your observation and the relevant corresponding housekeeping data. For Earth-orbiting telescopes, bandwidth constraints are less of an issue.

Bandwidth WILL become an issue for deep-space applications. In those cases you might want to implement some sophisticated pre-processing on board in order to reduce the data rate or increase the SNR of what you transmit. One example, for solar imaging missions, would be deriving the Stokes parameters for polarization from solar images already on board, instead of transmitting 4 images. (The Sun can be imaged MUCH more quickly than any other astronomical object due to its brightness, so you can acquire at high frequency (images per second), as opposed to an image per hour or even per day for non-solar-system astronomy.)
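The Stokes example can be sketched like this (a standard reduction for four frames taken through a linear polarizer at 0°, 45°, 90° and 135°; the toy numbers are made up for illustration):

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Combine four polarizer-angle images into Stokes I, Q, U on board,
    so three images are downlinked instead of four (and the SNR of I
    improves because the four frames are co-added)."""
    I = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    Q = i0 - i90                        # linear polarization, 0°/90°
    U = i45 - i135                      # linear polarization, 45°/135°
    return I, Q, U

# Toy frames: unpolarized flux of 100 plus a purely Q-polarized component
i0 = np.full((4, 4), 60.0)
i90 = np.full((4, 4), 40.0)
i45 = np.full((4, 4), 50.0)
i135 = np.full((4, 4), 50.0)
I, Q, U = linear_stokes(i0, i45, i90, i135)
print(I[0, 0], Q[0, 0], U[0, 0])   # 100.0 20.0 0.0
```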

planetmaker
  • What is the SNR mentioned in "in order to reduce the data rate or increase SNR for what you transmit"? – Demis Dec 24 '22 at 14:48
  • SNR = signal-to-noise (ratio). That's what you usually increase by longer exposure, or by stacking multiple exposures of the same non-time-varying object, when doing imaging in astronomy. – planetmaker Dec 24 '22 at 14:55