I really like the Rubin because I think a lot of people focus too much on "deep" seeing (i.e., looking at individual objects or small groups at very high magnification, often only once). The Rubin does much more "wide" seeing, and this actually produces a ton of useful data: basically, enough data to collect reliable statistics about things. This helps refine cosmological models in ways that smaller individual observations cannot.
What's amazing to me is just how long it took to get to first photo- I was working on the design of the LSST scope well over 10 years ago, and the project had been underway for some time before that. It's hard to keep attention on projects for that long when a company can IPO and make billions in just a few years.
You worked on the design? That is interesting. I worked on simulating the LSST back in 2008-2010; the goal was to test the data reduction software. We were on the Image Simulation team.
It is surreal to see LSST/Rubin finally get first light.
Even more interesting to see who is still working on LSST, and who is not.
My feeling is the "deep" vs "wide" thing is a circumstance of which groups you interact with (and also which facilities you have access to, and even to some extent the culture of your science community). Rubin is an example of what you can do when you build something massive specifically for a single purpose, and as more of these kinds of facilities come online (SDSS and Gaia have been around for a while, but DESI, 4MOST and other similar facilities are coming, and let's not forget radio), it's what we get out of the whole suite supporting each other that gives the best science.
Speaking of wide-fields, check out the Xuntian space telescope, which has (will have) a 1.1 degree field of view and a 2.5 gigapixel camera.
Deep is still interesting in understanding the origins of the universe. Rubin seems highly practical on the flip side. It'll be a super helpful tool in predicting asteroid impacts.
Also microlensing events, supernovae, and many other things in our very dynamic universe.
Also new planets! Planet Nine should likely be resolved within months, one way or another.
> "Probably within the first year we’re going to see if there’s something there or not,” says Pedro Bernardinelli, an astronomer at the University of Washington."
https://www.nationalgeographic.com/science/article/is-there-...
nice
Welcome to Hacker News!
It is generally recommended to upvote a comment you appreciate rather than posting a comment that doesn't add substance. It helps keep the signal-to-noise ratio higher.
Or detecting more unusual interstellar objects like 'Oumuamua.
It gets depth through image stacking across repeated visits; the speed and FOV are what make the wide coverage possible.
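As a tiny illustration of the stacking idea (a plain mean co-add, which is a simplification of what the real pipelines do; all numbers are made up):

```python
import numpy as np

# Sketch of why repeated visits buy depth: co-adding N frames of the same
# field beats the per-pixel noise down by roughly sqrt(N).
rng = np.random.default_rng(0)
signal, sigma, n_frames = 1.0, 10.0, 100  # faint source, noisy single frames

frames = signal + rng.normal(0.0, sigma, size=(n_frames, 32, 32))
stack = frames.mean(axis=0)  # simple mean co-add

print(round(frames[0].std(), 1))  # ~10: a single frame is noise-dominated
print(round(stack.std(), 1))      # ~1: noise reduced by about sqrt(100)
```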
The "wide" mode is called "survey" astronomy, and there have been several large surveys like Rubin/LSST, going all the way back to the Sloan Digital Sky Survey, which started in 2000 (if you count surveys from before the era of digital sensors, there are surveys going back more than 100 years).[0] Rubin/LSST is just the newest and most advanced large, ground-based optical survey.
Both modes of observation - surveys and targeted observations of individual objects - are necessary for astronomical research. Often, large surveys are used to scan the sky, and then targeted observations are used to follow up on the most interesting objects.
0. https://en.wikipedia.org/wiki/Sloan_Digital_Sky_Survey
Note that "seeing" means something very specific in astronomy: https://en.wikipedia.org/wiki/Astronomical_seeing.
The asteroid detection capability is amazing: https://rubinobservatory.org/news/rubin-first-look/swarm-ast...
And supernovae: https://m.youtube.com/watch?v=Ch18t9cz-JU&pp=ygUETHNzdA%3D%3...
Among many other uses: https://m.youtube.com/watch?v=h6QYjNjivDE
That is likely the most unexpectedly unsettling video I have ever seen. Amazing storytelling, really.
It’s like swimming in a lake or river and thinking the water is just water but then you take a closer look and it’s just incredibly alive to the point of absurdity.
I suppose the weeds, bugs, bacteria, frogs, fish, and snakes are equally unlikely to harm us, but nonetheless. Holy shit!
I was just coming back to comment on the existential dread elicited by that video.
This is really going to revolutionize our ability to detect and predict asteroid impacts.
And just in the nick of time!
Whoa that's incredible.
(And amazing production of the actual video as well)
Pretty sure you can see some kind of masking for satellites in some of the frames of the asteroid videos.
Wow, they should have led with this.
Which also shows the astronomically low odds of asteroids hitting Earth, even with "so many" of them. To me it changes nothing.
If it has the potential to wipe out our entire species, but there's something we could do to prevent it (which I'm not sure about w/r/to asteroids), then it's worth looking out for the black swan event.
Doing some extremely rough math along these lines to double check myself:
* Gemini says that a dinosaur-extinction-level asteroid hits Earth about once every 100 million years. So in any given year the odds are 0.000001% (1e-8).
* Economists say a human life is worth about 10 million dollars. There are about 8 billion people on Earth. So the total value of all human life is $80,000,000,000,000,000 (or 8e+16).
* So in any given year, the present value of asteroid protection is $800,000,000 (likelihood of an impact that year times value of the human life it would wipe out).
* The Guardian says the Vera Rubin telescope cost about $2,000,000,000 (2 billion).
By that measure, assuming the Rubin telescope prevents any dinosaur-extinction-level asteroid impact, it will pay for itself in about two and a half years.
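The arithmetic above can be checked with a quick sketch (all figures are the rough, hypothetical ones from the comment, not authoritative estimates):

```python
# Back-of-envelope expected-value check, using the rough figures above.
p_impact_per_year = 1 / 100_000_000   # extinction-level impact ~once per 100 Myr
value_of_life = 10_000_000            # ~$10M "value of a statistical life"
population = 8_000_000_000

total_value = value_of_life * population              # 8e16 dollars
annual_expected_loss = p_impact_per_year * total_value

telescope_cost = 2_000_000_000
payback_years = telescope_cost / annual_expected_loss

print(f"${annual_expected_loss:,.0f} per year")  # $800,000,000 per year
print(f"{payback_years:.1f} years to pay off")   # 2.5 years to pay off
```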
It seems incredibly bizarre to assign a monetary value to the elimination of all human life given the concept of monetary value would be wiped out along with the people.
These numbers are not what I expected at all.
So you could actually make an argument that to a country like the US, full 100% reliable asteroid protection is only worth like $50M/year (even if an impact means full extinction)?
So if upkeep for a detection/deflection system costs more than that, we'd be "better off" just risking it?! That's insane. I would have expected this number to be much higher than $50M/year.
The economists calculated the value of one life. The calculation might be different if it extinguishes the whole of humanity (and thousands of other species). In a way, it also represents all future human lives. Should we include those?
I don't believe that this would change the outcome much: It seems hard to argue that preservation of a nonhuman species would be worth more than a million lives (=> negligible) and assuming global loss of all human life is already unreasonably pessimistic in my view-- (e.g. the Chicxulub impactor would not have achieved this).
I also think that fully accounting for multi-generational consequences is murky/questionable and not really something we do even in much more obvious cases: Eligible people deciding against having children are not punished for depriving future society of centuries of expected workyears, and neither are mothers/fathers rewarded for the reverse.
But even if you accounted for losing 3 full generations and some change (for biodiversity loss), that still leaves you in the ~$200M/year range.
Currently we don't have reliable asteroid-deflection capability at any price (though it would be technically somewhat in reach), but just imagine a future NASA budget discussion that goes "we're gonna have to mothball our asteroid deflector 3000 because it eats 5% of the yearly NASA budget and that's just not worth it". That could be the mathematically correct choice, which confounds me.
Around 500 tonnes of meteorites hit earth every year.
Tracking large near earth objects is wise for several global and domestic security reasons.
Have a great day =3
The wikipedia article is quite good - https://en.wikipedia.org/wiki/Vera_C._Rubin_Observatory (Edit: Treasure trove of details in the references if any of your interests are adjacent to this)
The image of the woman holding the model of the sensor is nice because it includes a moon for scale.
Question I was curious about is whether or not the focal plane was flat (it is).
This is an interesting tidbit:
> Once images are taken, they are processed according to three different timescales, prompt (within 60 seconds), daily, and annually.
> The prompt products are alerts, issued within 60 seconds of observation, about objects that have changed brightness or position relative to archived images of that sky position. Transferring, processing, and differencing such large images within 60 seconds (previous methods took hours, on smaller images) is a significant software engineering problem by itself. This stage of processing will be performed at a classified government facility so events that would reveal secret assets can be edited out.
They are estimating 10 million alerts per night, which will be released publicly after the previously mentioned assessment takes place.
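The core of the prompt pipeline is difference imaging. A minimal sketch of the idea (the real pipeline also aligns, PSF-matches, and photometrically calibrates the images; every name and number here is illustrative):

```python
import numpy as np

# Subtract an archived template from the new exposure and flag pixels
# that changed significantly; each flagged region becomes an alert.
rng = np.random.default_rng(42)

template = rng.normal(100.0, 5.0, size=(64, 64))     # archived sky image
new_visit = template + rng.normal(0.0, 5.0, size=(64, 64))
new_visit[10, 20] += 200.0                            # inject a transient

diff = new_visit - template
noise = diff.std()
alerts = np.argwhere(np.abs(diff) > 5 * noise)        # 5-sigma detections

print(alerts)  # includes the injected transient at (10, 20)
```

Doing this over a 3.2-gigapixel image, and fanning the resulting alerts out to subscribers within 60 seconds, is where the engineering challenge lives.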
>This stage of processing will be performed at a classified government facility so events that would reveal secret assets can be edited out.
Interesting, I'm guessing secret spy satellites?
"Let's look for spy satellites / orbiters" was an "application" I wondered about. My second thought about this was: maybe the US (and possibly other countries) already have something like this, but classified?
The US already has a very sophisticated system for this.
https://en.wikipedia.org/wiki/United_States_Space_Surveillan...
I expect a lot of events to get filtered that foreign governments expect to stay reasonably secret, even if they aren't friendly with the US. It's a game.
The thing that really saddens me is that the military gets to filter the data first and scientists only get to see the already manipulated data instead of a raw feed from their own instrument.
I thought all satellites already have known orbits?
Both because they can't be made invisible, and because you need to avoid collisions.
Many can (and do) change orbits.
It's spy satellites (mainly domestic). In some cases, they don't actually need to be removed, just embargoed until an orbital change.
.. and aliens, of course ...
Back in January 2010 I went on a blind date with a lady who's now my wife, an astrophysicist. We talked about this instrument and how Google would shuffle petabytes of raw observations, then distill them into datasets researchers could actually use (I don't know if Google is still involved?). We'll celebrate 15 years of marriage this January, and I have been following the progress of this telescope since 2007 or so. It's amazing how long it takes for these instruments to come online, but the benefits are significant.
> We’ll celebrate 15 years of marriage this January,
Congrats!
The amount of data this thing will be putting out every night is insane. For years now the community has been building the infrastructure to be able to efficiently consume it for useful science, but we still have work to do. Anyone interested in the problem of pipelining and distributing 10s of TB of data a night should check out the LSST and related GitHubs.
I've followed this project for over a decade and the amount of data they are moving around is fairly routine, given their budget size and access to computing and networking resources. The total storage (~40-50PB) is pretty large, but moving 10TB around the world isn't special engineering at this point.
It's not just about the size of the data in bytes; it's also the number of changes that need to be detected and alerts that need to be sent out (estimated at millions a night). Keep in mind the downstream consumers of this data are mostly small scientific outfits with extremely limited software engineering budgets.
Again, nothing special. The small outfits aren't going to be doing the critical processing.
…they do the science
I've worked on quite a few large-scale scientific collaborations like this (and also worked on/talked to the lead scientists of LSST) and typically, the end groups that do science aren't the ones handling the massive infrastructure. That typically goes to well-funded sites with great infrastructure who then provide straightforward ways for the smaller science groups to operate on the bits of data they care about.
Here's the canonical example: https://home.cern/science/computing/grid and a lab that didn't have enough horsepower using a different grid: https://osg-htc.org/spotlights/new-frontiers-at-thyme-lab.ht...
Personally, I have pointed the grid folks (I used to work on grid) towards cloud, and many projects like this have a tier 1 in the cloud. The data lives in S3, metadata in some database, and use cloud provider's notification system. The scientists work in adjacent AWS accounts that have access to those systems and can move data pretty quickly.
The difference with this project is the data from Rubin itself isn’t where most of the scientific value comes from. It’s from follow up observations. Coordinating multiple observatories all with varying degrees of programmatic access in order to get timely observations is a challenge. But hey if you insist on being an “everything is easy” Andy I won’t bother anymore.
If you’re dealing with a fairly constant amount of data every day for years, using the cloud will be way more expensive than necessary.
The whole thread comes off as an AWS sales pitch...
Why move the data? Why not just enable permissions on cloud sharing a la Snowflake or iceberg?
Is this not the same problem high resolution spy satellites have? Seems like a fair bit of crossover at least?
Spy sats are more bandwidth and power constrained. For low earth, you also can’t usually offload data over the target.
> For low earth, you also can’t usually offload data over the target.
That capability is coming with starlink laser modules. They've already tested this on a dragon mission, and they have the links working between some satellite shells. So you'd be able to offload data from pretty much everywhere starlink has presence.
Here's the SDSS view[0] of this featured[1] section from the Virgo Cluster, in comparison, to put the staggering depth of these exposures in their proper context:
[0] https://aladin.cds.unistra.fr/AladinLite/?target=12%2026%205...
[1] https://rubinobservatory.org/gallery/collections/first-look-...
With an opacity slider, for easy comparison:
https://aladin.cds.unistra.fr/AladinLite/?baseImageLayer=CDS...
Thanks for the link, I didn't know one can do this with Aladin Lite! But to be fair, if we compare to DESI LS, it looks much less impressive. I.e. all the shells/tidal debris are basically visible in DESI.
Agreed. Here is the link: https://aladin.cds.unistra.fr/AladinLite/?baseImageLayer=CDS...
Why do the brighter objects have the four-way cross artifact? My (apparently incorrect) understanding was that those types of artifacts were a result of the support structures holding a telescope's mirrors. But this camera just has a "standard" glass lens with nothing obstructing the light path to the sensor.
It’s a reflecting telescope, not a camera with a glass lens.
Ah, thanks. I had seen a bunch of hype about the camera itself (which is on its own very impressive) and assumed that was the complete device. Didn't realize it was part of a larger telescope.
So stoked for this observatory to go online! One cool use it'll excel at is taking "deltas" between images to detect moving stuff. Close asteroids are one obvious goal, but I'm more interested in the next 'Oumuamua / Borisov-like objects that come in from interstellar space. It would be amazing to get early warnings about those, and be able to study them with other powerful telescopes we have now.
> So stoked for this observatory to go online!
Second this, but other areas are of great interest too. Kuiper Belt discoveries and surveys FTW!
Counter-rotating spiral galaxies. Super neat! https://skyviewer.app/embed?target=186.66721+8.89072&fov=0.2...
> "?target=186.66721+8.89072"
(For those who haven't noticed, you can simply paste 186.66721+8.89072 or whichever target you're curious about into an astronomy database like Aladin[0], and there right-click on "What is this?")
[0] https://aladin.cds.unistra.fr/AladinLite/?target=12%2026%204...
I wonder if there's some kind of gravitational lensing going on. A lot of the galaxies look similar, but in different orientations.
https://skyviewer.app/embed?target=186.66721+8.89072&fov=0.2...
https://skyviewer.app/embed?target=185.46019+4.48014&fov=0.6...
https://skyviewer.app/embed?target=188.49629+8.40493&fov=1.3...
(Quick side note, if you go to /explorer instead of /embed you can zoom out so you can see the whole image at once)
https://skyviewer.app/explorer?target=187.69717+12.33897&fov...
That is interesting!
They look like they're roughly in the same plane. Is it safe to assume they're roughly in the same plane, or could they be really distant along the line of sight? The similarity in size makes me think they are, but I don't have any reason to be confident in that judgment.
I believe there would be a difference in their red/blue signatures if they were moving relative to each other, but as you say they clearly are on the same plane
Those are NGC 4411 a+b and they're indeed right next to each other,
https://noirlab.edu/public/images/iotw2421b/ ("thought to be right next to each other — both at a distance of about 50 million light-years")
What's going on directly above with what looks to be 3-4 galaxies interacting?
"like NGC 4410, above them in this image. The four interacting galaxies of that system are connected by tidal bridges, created by the gravity of each galaxy pulling on the others in the system."
It says that NGC 4410 is (gravitationally) interacting galaxies. After clicking through the link, it calls it RSCG 55 instead and explains more. I don't understand the naming scheme.
The naming scheme is based on the principle "tens of thousands of people have done this over thousands of years, and they all named things themselves". It's not uncommon for objects to have ~20 separate names[1], with some having over a hundred[2].
In this particular case, RSCG 55 means a group of galaxies[3], of which NGC 4410 is one member. Apparently RSCG is the "Redshift Survey Compact Groups" (https://cds.unistra.fr/cgi-bin/Dic-Simbad?RSCG) so 55 is just an index number.
That's also the case for the 4410 after NGC, which stands for "New General Catalog". In contrast, the Sloan Digital Sky Survey gave NGC 4410 the name SDSS J122628.29+090111.4, where the numbers indicate its position in the sky.
The "index number" and the "position of the sky" are the two most popular naming strategies.
[1] NGC 4410 has 37, but the NGC objects are among the more popular https://simbad.u-strasbg.fr/simbad/sim-id?Ident=+NGC+4410&Nb... [2] https://simbad.u-strasbg.fr/simbad/sim-id?Ident=M87&submit=s... [3] https://simbad.u-strasbg.fr/simbad/sim-id?Ident=RSCG+55&NbId...
Check out this video: https://rubinobservatory.org/gallery/collections/first-look-...
Incredible.
For anyone who hasn't clicked the link: it shows that in just a few days, the observatory has already found over 2,000 new asteroids. That is indeed very impressive.
something green: https://skyviewer.app/embed?target=186.82033+8.25479&fov=0.0...
Could be a satellite that moved into the frame during the green exposure.
There was a livestream presentation and press conference up on YouTube
https://www.youtube.com/live/Zv22_Amsreo?si=zQLeGfJokZoCPkji
At time 1:38:19 - one hour 38 minutes 19 seconds - into the livestream presentation, there's a slide that shows RGB streaks of fast-moving objects that were removed for the final image.
Those streaks are apparently asteroids.
Perhaps it is indeed a glitch or cosmic ray event.
(Is there a better URL for the slide deck?)
Might be bad cosmic-ray rejection during the green exposure.
Looks like two galaxies interacting/merging.
Petition to name those two mirrored galaxies "Wax on" and "wax off"?
I'll see myself out.
Took me a while but I got it
Related: When a Telescope Is a National-Security Risk [1];
TL;DR: VCRO is capable of imaging spy and other classified US satellites. An automated filtering system (involving routing through a government processing facility) is in place to remove them from the freshly captured raw data used for the public transient-phenomena alert service. Three days later, unredacted data is made available (by then the elusive, variable-orbit assets are long gone).
[1] https://www.theatlantic.com/science/archive/2024/12/vera-rub...
PetaPixel has a decent article / video on the topic from a visit to the observatory:
* https://petapixel.com/2025/06/23/hands-on-at-the-vera-c-rubi...
Not super technical, but a little higher level (with decent analogies to photography, for their traditional audience).
For a step by step tour: https://skyviewer.app/tours/cosmic-treasure-chest/
We got DOGE instead of using $100k of tax dollars making this into a super nice public mobile app...
While it would be cool to make a mobile app for this, having used both the tech stack behind this site (it's open source, and really great), and written a mobile app with a similar tech stack, $100k will get you one, but it's going to be a pain to debug all the various niggles around various devices, so you'll need at least double that to make a robust one (and then the question is, do you just go native and reimplement that tech stack).
I'm not typically this person, but Hacker News is just more and more "allow me to tell you". Buddy, I have 20 years of experience building mobile apps, including against scientific, open source, and plain old hack-AF platforms...
My God, it's full of stars
Brings up that old paradox (Olbers' paradox): shouldn't any line of sight ultimately end at a star?
Note that it makes a lot of assumptions beyond the stated ones, such as:
* the only objects in space are stars
* all stars are equally bright
* the average brightness is one that can be seen
(unless you roll all this into "homogeneous"?)
Every set of deep field imagery reminds me that any point of light we see could be a star, a galaxy, or a cluster of galaxies. The universe is unimaginably vast.
For observatories like Rubin, is there a plan for keeping them open after the funding ends? Is it feasible for Chile to take over the project and keep it going?
On a practical note, what happens to a facility like this if one day it's just locked up? Will it degrade without routine maintenance, or will it still be operational in the event someone can put together funding?
There are already facilities like this (obviously not as new as Rubin) degrading due to lack of funding, usually because there's no better purpose for them. Space monitoring has been used in the past as a second life for facilities (outreach too), but ~1 m class telescopes are good enough now that networks of them are better than a 40+ year old telescope. It's also worth noting bits can be reused: buildings gutted and repurposed, telescopes/instruments moved or sold on, etc. But the real issue is having the staff to look after these places, and many older facilities are not as amenable to automation as people (especially funding agencies) might like.
Arecibo was about 60 years old, for comparison, when it collapsed, but there are lots of facilities that are effectively ships of Theseus, with new instruments coming in over time which refresh the facility (and when that stops happening, then you get concerned).
It will continue with a new instrument after 10 years (spectroscopic) funding permitting. Tololo has been running since the 60s. In California, Lick has been running since the 1880s.
Even one zoom-in and I find something interesting.
What's that faint illuminated tendril extending from M61 (the large spiral galaxy at the bottom center of the image) upwards towards that red giant? It seems too straight and off-center to be an extension of the spiral arm.
EDIT: The supposed "Tidal tail" on M61 was evidently known from deep astrophotography, but only rarely detected & commented upon.
I was surprised by how many lensed objects I could spot.
Related: https://www.nytimes.com/2025/06/23/science/vera-rubin-scient...
(via https://news.ycombinator.com/item?id=44352455, but no comments there)
The zoomed images look grainy, as one would expect from raw data, but I would have expected them to do dark-frame subtraction for the chips to minimize this effect. Does anyone know if that's done (or expressly avoided) in this context, or why it might not be as helpful (e.g., for longer exposures)?
Seems this will be done on the 'nightly' release cadence. Found on page 11 of this doc, linked from the Wikipedia page:
https://docushare.lsstcorp.org/docushare/dsweb/Get/LSE-163/L...
Why are there lens-flare-like artifacts around some of the bright objects?
Those are diffraction spikes, caused by how the light interacts with the support structure holding the secondary mirror. Each telescope has a different pattern: Hubble, JWST, etc. I think they only appear for stars and not for galaxies (an easy way to tell which is which), but I might be wrong on that (IIRC faint stars may not show them).
> "Each telescope has different patterns"
This one's extra-special! The pattern is multiple + shapes, rotated and superimposed on top of each other. And they're different colors! That's this telescope's signature scanning algorithm—I don't know what that is, but, it's evident it takes multiple exposures, in different color filters, with the image plane rotated differently relative to the CCD plane in each exposure. I assume there's some kind of signal processing rationale behind that choice.
edit: Here's one of the bright stars, I think it's HD 107428:
https://i.ibb.co/HTmP0rqn/diffraction.webp
This one has asteroid streaks surrounding it (it's a toggle in one of the hidden menus), which gives a strong clue about the timing of the multiple exposures. The asteroids are moving in a straight line at a constant speed: the spacing and colors of the dots show what the exposure sequence was.
I think this quote explains the reason they want to rotate the camera:
> "The ranking criteria also ensure that the visits to each field are widely distributed in position angle on the sky and rotation angle of the camera in order to minimize systematic effects in galaxy shape determination."
https://arxiv.org/abs/0805.2366 ("LSST [Vera Rubin]: from Science Drivers to Reference Design and Anticipated Data Products")
> with the image plane rotated differently relative to the CCD plane in each exposure
LSST is an alt/az telescope. The Earth rotates. The sensor plane must rotate during the exposure to prevent stars from streaking, which it accomplishes via this platform: https://docushare.lsstcorp.org/docushare/dsweb/Get/Document-...
The fact that the sensor rotates without the spider rotating also spreads out the diffraction spikes.
But that rotation is limited, so between different exposures with different filters the image plane will be rotated relative to the sky.
As the quote says, the change in orientation has benefits for controlling systematics.
No, they happen for absolutely every externally-generated pixel of light (that is, not for shot noise, or fireflies that happen to fly between the mirrors). Where objects subtend more than one pixel, each pixel generates its own diffraction pattern, and the superposition of all of them is present in the final image. Of course, each diffraction pattern is offset from the next, so they mostly just broaden (smear out) rather than intensify.
However, the brightness of the diffraction effects is much lower than the light of the focused image itself. Where the image is itself dim, the diffraction effects might not add up to anything noticeable. Where the image supersaturates the detector (as can happen with a 1-pixel-wide star), the "much lower" fraction of that intensity can still be annoyingly visible.
It depends on the science you're doing, as even these small effects add up. There's a project within the LSST science team (which a colleague is working on) to reduce this scattered light (search for "low surface brightness"), with a whole lot of work around modelling and understanding the effect of the telescope system on the idealised single point that is a star.
There are projects (dragonfly and huntsman are the ones I know of) which avoid using mirrors and instead use lenses (which have their own issues) to reduce this scattered light.
The same effect is used for Bahtinov focusing masks. From what I know, all light bends around the structures, but stars are bright and focused enough for the spikes to be visible; in theory galaxies would show them too.
Diffraction spikes [1] are a natural result of the wave-like nature of light, so they occur for all objects viewed through a telescope, and the exact pattern depends on the number and thickness of the vanes.
My favourite fact about these in relation to astronomy is that you can actually get rid of the diffraction spikes if your support vanes are curved, which ends up smearing out the diffraction pattern over a larger area [2]. However this is often not what you want in professional astronomy, because the smeared light can obscure faint objects you might want to see, like moons orbiting planets, planets orbiting stars, or lensed objects behind galaxies in deep space. So you often want sharp, crisp diffraction spikes so you can resolve these faint objects next to or behind the bright object that's up front.
[1] https://www.celestron.com/blogs/knowledgebase/what-is-a-diff...
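The straight-vane case is easy to play with numerically: in the Fraunhofer approximation, the point-spread function is the squared magnitude of the Fourier transform of the pupil. A toy sketch (sizes are illustrative, not Rubin's actual geometry):

```python
import numpy as np

# Toy Fraunhofer-diffraction model: PSF = |FFT(pupil)|^2. A thin straight
# vane across a circular pupil throws light into a spike perpendicular to
# the vane, which is the diffraction-spike effect described above.
N = 512
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
pupil = (x**2 + y**2 < (N // 3) ** 2).astype(float)  # open circular aperture
pupil[np.abs(y) < 2] = 0.0                            # one horizontal vane

psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2

# The column through the centre (perpendicular to the vane) carries the
# spike; the row through the centre sees only the ordinary Airy pattern.
col_energy = psf[:, N // 2].sum()
row_energy = psf[N // 2, :].sum()
print(col_energy > row_energy)  # True: the spike is perpendicular to the vane
```

Replacing the straight vane with a curved one spreads that column of energy over many directions, which is the smearing effect mentioned above.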
Those are stars, they create those lens flares because they are so bright.
All the dim fuzzy objects are galaxies much further away.
"Space is big. You just won't believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it's a long way down the road to the chemist's, but that's just peanuts to space."
Jesus H Christ, the Universe is big.
“Space is big. You just won't believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it's a long way down the road to the chemist's, but that's just peanuts to space.” -Douglas Adams
[dead]
Sometimes I feel like a diatom floating in the ocean.
[dead]
[dead]
[dead]
[flagged]