It’s insane what these people do. They’re rewriting code from the ’60s to use even less memory, they have to test it in production without physical access, and it takes two days to see whether anything changed. It’s a remarkable piece of engineering, and it’s incredible that it’s still sending useful data.
I’d love to see what their test environments are like. You can’t test everything, but they can certainly test some things. A Raspberry Pi has more computing capability.
Yeah. I’m half-drunk but the first thing that I thought was, “I could use some gyros. Preferably with a buttload of tzatziki”. (The video is about gyroscopes though. Also cool. But not edible.)
Knowing what I know, I’m assuming the tiles were standardised and then normalised (fancy stats algorithms that keep everything in the same visual range) while the image was stitched together, and that the final product had its colouration (saturation) heavily enhanced. The colour differences are subtle or undetectable to the naked eye, but they exist, and they reflect the different minerals present. I’ve done this kind of work (raster stitching) with other imagery. OP was active in the comments with info, but I didn’t read up on it.
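For what it’s worth, here’s a minimal sketch of that “standardise then normalise” step, assuming grayscale tiles as plain NumPy arrays (the tile data, target values, and function name are made up for illustration, not whatever pipeline OP actually used): each tile is standardised to zero mean and unit variance, then mapped onto a shared brightness range so neighbouring tiles sit in the same visual range before stitching.

```python
import numpy as np

def normalize_tile(tile, target_mean=0.5, target_std=0.15):
    """Rescale one tile so its brightness statistics match a shared target,
    keeping adjacent tiles in the same visual range before stitching."""
    tile = tile.astype(np.float64)
    mean, std = tile.mean(), tile.std()
    if std == 0:
        return np.full_like(tile, target_mean)
    standardized = (tile - mean) / std                 # zero mean, unit variance
    rescaled = standardized * target_std + target_mean  # map to the shared range
    return np.clip(rescaled, 0.0, 1.0)

# Hypothetical usage: each tile is a grayscale array with values in [0, 1].
tiles = [np.random.default_rng(i).random((256, 256)) for i in range(4)]
matched = [normalize_tile(t) for t in tiles]
mosaic = np.hstack(matched)  # real stitching also aligns and blends the seams
```

A real mosaic workflow would also register the tiles and feather the seams; this only shows the statistical matching part.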
The colors don’t match what a human eye would see, but without going off on a philosophy tangent, color is extremely complex, and a huge part of what a human sees is the brain doing representation and mapping that isn’t perfectly present in the physical object being observed. In this photo the saturation has been increased (versus what a human eye would perceive) because it helps show the geological differences on the lunar surface. The reddish areas are high in iron and feldspar, and the blue-tinted zones have higher titanium content.

Instead of thinking of the color as “real” or “fake”, it’s probably better to think of it as a tool: it simulates what you’d see if you were a superhuman who could adjust saturation and detect metal composition with your eyes. Usually when a photo like this is shared by researchers and scientists, all this nuance and exposition is included, but then journalists and social media get hold of it and people start crying “fake” without understanding what the image is trying to accomplish.

TL;DR: The image isn’t what a human eye would see, but it isn’t just art made to look cool; the color modifications have physical meaning and serve a purpose.
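To make the “tool, not fake” point concrete, here’s a minimal sketch of the kind of saturation boost being described, using Pillow. The filename and factor are placeholders, and the actual processing behind the published image is almost certainly more involved:

```python
from PIL import Image, ImageEnhance

# Hypothetical filename; any RGB image of the Moon would do.
img = Image.open("moon_mosaic.png").convert("RGB")

# A factor of 1.0 leaves the image unchanged; values above 1.0 exaggerate
# the subtle red (iron/feldspar-rich) and blue (titanium-rich) tints that
# are nearly invisible at natural saturation.
boosted = ImageEnhance.Color(img).enhance(3.0)
boosted.save("moon_mosaic_saturated.png")
```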
I imagine the yellowish-tinted areas are mostly sulfur from volcanic ash emissions. In that middle picture, the section between the two maria looks like beach sand after it has been inundated with water. In general, most of the surface looks like pulverized beach sand from a high-level, abstracted perspective, but that one section between the maria looks wetted by comparison. Perhaps ash altered the consistency enough to create a similarly compacted appearance, but if there was water and volcanism in the area, perhaps that was the lunar version of Yellowstone.
Funny that the most recent research has now linked the anomalous regions inside the Earth’s mantle to the Theia collision through mantle hotspot activity, so the Moon and Yellowstone likely are directly connected. It would be interesting to find that the regional anomalies on the Moon are of a similar origin, and that Yellowstone’s doppelgänger has been sitting there in plain sight all along.
But how did they composite 81,000 images without worrying about atmospheric lensing distorting the proportions as the Moon moved across the sky over 4 days? Is it just negligible?
The Samsung moon actually just makes up a plausible-looking moon, which is hilarious given that the Moon essentially doesn’t change, so they could have just overlaid reference images. Instead, you get features on the Moon that don’t exist.
They didn’t. What they did was take 81,000 images, filter through them taking the best images of each region of the Moon, and then average and composite those.
It isn’t 81k images stitched together. It’s 81k images taken in the hopes of getting enough with perfect clarity to create the composite.
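A rough sketch of that “filter, then average” idea (often called lucky imaging), assuming aligned grayscale frames as NumPy arrays. The sharpness metric, function names, and keep fraction here are illustrative, not OP’s actual pipeline:

```python
import numpy as np
import cv2

def sharpness(frame):
    """Variance of the Laplacian: a common proxy for how crisp a frame is."""
    return cv2.Laplacian(frame, cv2.CV_64F).var()

def lucky_stack(frames, keep_fraction=0.1):
    """Keep only the sharpest fraction of frames and average them, which
    suppresses sensor noise while rejecting blurry, seeing-smeared frames."""
    scored = sorted(frames, key=sharpness, reverse=True)
    keep = max(1, int(len(scored) * keep_fraction))
    best = np.stack([f.astype(np.float64) for f in scored[:keep]])
    return best.mean(axis=0)

# Hypothetical demo with synthetic noisy frames; a real workflow would use
# thousands of captures of one lunar region and register (align) them first.
rng = np.random.default_rng(0)
frames = [rng.normal(0.5, 0.1, (128, 128)).astype(np.float32) for _ in range(50)]
stacked = lucky_stack(frames)
```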