 TOP STORY: NASA Is Not Altering Mars Colors.

Posted by: Kano
On: Sun January, 18 2004 @ 03:34 GMT
This article is a brief summarised explanation of how the PanCam on the Mars Spirit Rover operates, in relation to the strange appearance of the calibration sundial in some pictures. The question was first raised by ATS member AArchAngel, and has been discussed at length in the AboveTopSecret forum thread and ATSNN story:

Mars Spirit Rover Picture analysis.

In this thread I will attempt to summarise my posts to the larger thread.

What are you talking about?

Ok, the initial alarm was raised after it was noticed that the color-calibration sundial mounted on the rover looked markedly different in the Mars panorama shots compared to its regular appearance.



Immediately, wide-ranging theories began to pop up. At this stage I knew very little of the particulars of the PanCam, so I decided to go and see what the horse's mouth had to say. I sent out a swag of emails: to the NASA Mars rover team, to the Athena instrument team at Cornell University, and, as a long shot, to Assoc. Professor James Bell, who is the Pancam Payload Element Lead for the mission.

Now, after getting no response from the Athena team and only an automated response from the NASA team, I was amazed and delighted to see that Dr. Bell had indeed taken the time out of his busy schedule to help explain this quirk in the panorama pictures. His email response is below:
quote:
Thanks for writing. The answer is that the color chips on the sundial have different colors in the near-infrared range of Pancam filters. For example, the blue chip is dark near 600 nm, where humans see red light, but is especially bright at 750 nm, which is used as "red" for many Pancam images. So it appears pink in RGB composites. We chose the pigments for the chips on purpose this way, so they could provide different patterns of brightnesses regardless of which filters we used. The details of the colors of the pigments are published in a paper I wrote in the December issue of the Journal of Geophysical Research (Planets), in case you want more details...


All of us tired folks on the team are really happy that so many people around the world are following the mission and sending their support and encouragement...


Thanks,


Jim Bell
Cornell U.


Now, as far as the pink tab where the blue one should be, that email is in fact the complete answer. But it's not easily understandable to the layman, so below I will attempt to explain why this occurs.





Displaying the first 12 replies to this news story...
Posted by: Kano
On: Sun January, 18 2004 @ 03:35 GMT
Digital Cameras

Firstly, we need to understand how the PanCam, and indeed digital photography in general, works.

Luckily for us we have our good friends at http://www.howstuffworks.com to turn to.

How Digital Cameras Work

It would be worthwhile to read the entire article on howstuffworks, for a fuller understanding of the processes at work. But because I know you are all busy (lazy?) I will summarise.

Basically, the heart of a digital camera is the charge-coupled device, or CCD. The CCD converts light hitting it into electrical impulses: the brighter the light, the stronger the impulse. Now, CCDs are color-blind; all they do is signal how bright the light hitting them is. That is all well and good for black and white photography, but for color we need to do more. To get a color picture, we need to record images via the CCD through a series of 3 filters: a Red filter, a Green filter, and a Blue filter. These are then recombined afterwards to give a color representation of the scene. (Note: cheaper options like the Bayer filter pattern are often used in commercial digital cameras, but they use interpolation and are consequently less accurate than 3-filter methods.)
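To make that recombination step concrete, here is a minimal sketch in Python of stacking three filtered black-and-white exposures into one color image. (The filenames are placeholders of my own; any three same-sized greyscale images will do. It assumes the numpy and Pillow libraries are available.)

code:
import numpy as np
from PIL import Image

# Three greyscale exposures of the same scene, one per filter.
# These filenames are placeholders for illustration only.
red_plate   = np.asarray(Image.open("scene_red.png").convert("L"))
green_plate = np.asarray(Image.open("scene_green.png").convert("L"))
blue_plate  = np.asarray(Image.open("scene_blue.png").convert("L"))

# Stack the three brightness maps into the R, G and B channels.
rgb = np.dstack([red_plate, green_plate, blue_plate])
Image.fromarray(rgb, "RGB").save("scene_color.png")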

Never True Color

Quite a big deal has been made of NASA not sending 'True Color' images back from Mars. The problem with this argument is the fact that no digital images are ever 'True Color'. They are all composites. We cannot at present make a digital camera that sees images as the human eye does. The human eye also has 3-color receptors, but, being biological, there is a range over which the receptors pick up the colors.

http://science.howstuffworks.com/eye3.htm
quote:
From howstuffworks.com


In the diagram above, the wavelengths of the three types of cones (red, green and blue) are shown. The peak absorbancy of blue-sensitive pigment is 445 nanometers, for green-sensitive pigment it is 535 nanometers, and for red-sensitive pigment it is 570 nanometers.


Now, the PanCam

Some reading about the PanCam:
http://athena.cornell.edu/pdf/tb_pancam.pdf

From this document, we can find the wavelength values for the different filters on the PanCam:
quote:

LEFT CAMERA..............RIGHT CAMERA

L1. EMPTY................R1. 430 (SP) *
L2. 750 (20).............R2. 750 (20)
L3. 670 (20).............R3. 800 (20)
L4. 600 (20).............R4. 860 (25)
L5. 530 (20).............R5. 900 (25)
L6. 480 (25).............R6. 930 (30)
L7. 430 (SP)*............R7. 980 (LP)*
L8. 440 Solar ND.........R8. 880 Solar ND

*SP indicates short-pass filter; LP indicates long-pass filter

Table 2.1.2-1: Pancam Multispectral Filter Set: Wavelength (and Bandpass) in nm



Typical RGB wavelengths for recording and display are Red 600nm, Green 530nm and Blue 480nm. As we can see, these coincide with the L4, L5 and L6 filters on the PanCam. The difference is that in this panorama image, and in most images taken by the rover, the L2 filter is used for the red channel instead of the L4. The L2 is at 750nm, right at the extreme end of the visible spectrum, in the near infra-red range. This increases the range of the spectrum that can be recorded by the PanCam, allowing more detail to be recorded and making it easier to see into the shadows and so forth.

Color-Chip Pigments

As Dr. Bell explained in his email, and as can be seen by viewing the raw images hosted by NASA, the color chips are not as simple as they appear. The pigments are designed to have different brightnesses at a variety of wavelengths, not just at the RGB values, so as to "provide different patterns of brightnesses regardless of which filters we used". The blue pigment is very bright in the near-IR range, so the L2 plate has a very bright recording of the blue chip.
Posted by: Kano
On: Sun January, 18 2004 @ 03:37 GMT
Here's a quick way to re-create the effect yourself.

The name of each image in the raw images directory

http://marsrovers.jpl.nasa.gov/gallery/all/spirit.html

shows which filter the image was taken with.

For example 2P126644567ESF0200P2095L2M1.JPG was taken with the L2 filter, which we know is at 750nm.
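If you want to sort a pile of raw images by filter automatically, here is a minimal Python sketch. (The regular expression is only my guess at the naming pattern, based on the filenames quoted in this thread; the wavelength tables are from Table 2.1.2-1 of the Pancam brief above.)

code:
import re

# Wavelengths (nm) per filter, from the Pancam technical brief.
LEFT  = {1: None, 2: 750, 3: 670, 4: 600, 5: 530, 6: 480, 7: 430, 8: 440}
RIGHT = {1: 430, 2: 750, 3: 800, 4: 860, 5: 900, 6: 930, 7: 980, 8: 880}

def pancam_filter(filename):
    # e.g. "2P126644567ESF0200P2095L2M1.JPG" -> ('L', 2, 750)
    m = re.search(r"([LR])(\d)M\d\.JPG$", filename, re.IGNORECASE)
    if m is None:
        raise ValueError("filename doesn't match the expected pattern")
    camera, num = m.group(1).upper(), int(m.group(2))
    table = LEFT if camera == "L" else RIGHT
    return camera, num, table[num]

print(pancam_filter("2P126644567ESF0200P2095L2M1.JPG"))  # ('L', 2, 750)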

All the raw images seem to follow this format. Here is a way to re-create the effect of shifting the red point. (That's all that's been done, and it actually makes the surface seem less red.)

Photoshop-only explanation here.
Download these 2 sets of 3 images.
Series 1.
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/003/2P126632830ESF0200P2899L4M1.JPG
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/003/2P126632883ESF0200P2899L5M1.JPG
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/003/2P126632935ESF0200P2899L6M1.JPG

Series 2.
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/004/2P126725406ESF0200P2095L2M1.JPG
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/004/2P126725437ESF0200P2095L5M1.JPG
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/004/2P126725479ESF0200P2095L6M1.JPG

Now, in the first series the red component is from filter L4 (600nm), and in the second it is from L2 (750nm). The green and blue filters are the same for both: L5 (530nm) and L6 (480nm) respectively.

To combine these, we will start with the first series. Open the L4-filtered image in Photoshop first; this will be the background. Then open the L5 image and copy/paste it as a layer over the L4 (layer 1), then copy/paste the L6 image as a layer over both (layer 2). Now all you have to do is right-click on the L6 layer, go to Blending Options > Advanced Blending, and make sure only the blue channel is selected (deselect the other two); this makes that layer the blue channel. Then do the same for layer 1 (the L5), but select the green channel. You don't have to do anything to the background layer: provided you haven't changed the opacity, red is the only thing that can show through from it.

You should now have a regular, near-true-colour image of the sundial, as those are the typical RGB wavelengths.

Now, if we repeat the process with the second series of images, using the L2 plate as the background (and therefore the red channel), we get a completely different looking sundial.

Try it for yourself.
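If you don't have Photoshop handy, the same channel combination is a few lines of Python with the Pillow imaging library. This is only a sketch, and assumes you have downloaded the three Series 2 plates linked above into the working directory:

code:
from PIL import Image

# Series 2: L2 (750nm) as red, L5 (530nm) as green, L6 (480nm) as blue.
red   = Image.open("2P126725406ESF0200P2095L2M1.JPG").convert("L")
green = Image.open("2P126725437ESF0200P2095L5M1.JPG").convert("L")
blue  = Image.open("2P126725479ESF0200P2095L6M1.JPG").convert("L")

# Merging the three plates is the Photoshop layer/channel trick in one call.
Image.merge("RGB", (red, green, blue)).save("sundial_L2L5L6.jpg")

Swap in the Series 1 plates (L4 as red) and you should get the normal-looking sundial instead.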

Here are the two processed images:

Series 1:


Series 2:


All this from a shift of the red point by 150nm. You'll also notice it doesn't really change the look of much apart from the extreme colours. I will make a diagram to show how the color-space is transposed.

NOTE: This simple combination method is only appropriate in a few instances; an explanation follows a little later.
Posted by: Kano
On: Sun January, 18 2004 @ 03:38 GMT
But Why?

Ok, here is the explanation of why the color-chip pigments look so strange in the L2/L5/L6 filtered image.

Firstly we have the full visible spectrum.



We can then map our RGB colorspace onto it:



The curved grey region is the entire visible spectrum. The white triangle is the region of colours displayable by RGB. (The L4, L5 and L6 filters correspond to the points R, G and B.)

This is the space recorded by replacing the L4 filter with the L2 filter (i.e. shifting the red point by 150nm to the very edge of infra-red).



Notice there is a region recorded that is outside the visible spectrum. (The bottom right corner of the RGB triangle).

Now, when we display the composite RGB image back on our monitors, the colorspace recorded by the PanCam (with regions outside the visible) is transposed onto the displayable region shown in the first image. Thus a small region of infra-red is squashed into the displayable region and effectively added to the end of the red channel.

Now, this means anything that is very reflective in the near infra-red spectrum (for example the blue pigment) gets a massive boost in the red channel when transposed. By comparing the L2 and L4 images of the green chip, we can see the green chip is also quite a bit more reflective at L2 than at L4. Thus the blue pigment appears pink, and the green a kind of beige.

Also, you'll notice that the transposition would actually make the environment look less red, as anything in the true red range (600nm) is shifted to a slightly shorter apparent wavelength and appears more orange.
Posted by: Kano
On: Sun January, 18 2004 @ 03:40 GMT
That's It?

In a simplified way, yes, that is the explanation for the blue pigment showing as pink. There is a lot more to this story though. Firstly, we have to remember what Spirit's mission is: a geological one, not a sightseeing one. More than half of the filters on the PanCam are outside the visible spectrum. The way the filters and sundial are set up is to try to reduce the discoloration of the surface by the atmosphere, as it is better for the geological mission to see the colors of the rocks and ground as they would appear when white-lit, not with a pink/red cast over everything.

To most of us though, color pictures from Mars are much more satisfying than any data regarding the planet's geological history. Athena and NASA have purpose-built high-end image processing software to re-create the images as close as they can get to what the view would actually look like from the surface.
Posted by: Kano
On: Sun January, 18 2004 @ 03:41 GMT
Not Quite it.

That is obviously a shortened explanation of the reasons behind the blue pigment appearing pink. As shown earlier we can re-create this effect ourselves.

I also mentioned in that post that the simple equal mix of the RGB color plates (from the Spirit raw images hosted by NASA) is only accurate for some pictures.

Why not all?

To explain this we need to look a little more at the PanCam and how it transmits the data. From our Pancam Technical brief, we discover that the onboard computer on the rover (which controls PanCam) has the ability to perform a limited set of image-processing tasks, one of which is:
quote:
(4) rudimentary automatic exposure control capability to maximize the SNR of downlinked data while preventing data saturation


Channels Normalized
This means that the brightness of all three color plates has been amplified to give the highest range of brightness for each plate. The graphics term for this is a per-channel levels stretch, or normalization; an equivalent audio term would be something like "hard limiting", so I'll use that.

Basically, in each of the three filter pics the exposure has been set so that the brightest part of the picture from each filter corresponds to the absolute maximum brightness for that channel. For example, the brightest part of the red channel is FF0000, of the green 00FF00, and of the blue 0000FF. (They all come down as b/w pics, so in each black and white plate there is a full range from 000000, absolute black, to FFFFFF, absolute white.)

You can test this by opening one of the black and white plates (Photoshop again, sorry). Select either 000000 or FFFFFF as the working color, then go to the Select menu, choose Color Range, set fuzziness to zero and click OK. For each extreme you will find at least a few matching pixels.
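The same check without Photoshop, as a quick Python sketch (any of the raw plate JPEGs will do as input; note that JPEG compression may nudge a few pixel values, so expect the extremes to be at or very near 0 and 255):

code:
import numpy as np
from PIL import Image

plate = np.asarray(Image.open("2P126824700EFF0200P2303L2M1.JPG").convert("L"))

# A normalized ("hard limited") plate should span the full 8-bit range.
print("min:", plate.min(), "max:", plate.max())
print("pixels at 0:  ", np.count_nonzero(plate == 0))
print("pixels at 255:", np.count_nonzero(plate == 255))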

On Earth
You can test the converse of this with a photo taken on Earth. Choose any photo (a good one to try is that autumn road picture that comes with Windows XP), open it in Photoshop and set its blending options so only the blue channel is showing. It is very dark, and there are no 0000FF pixels at all; in fact there are only a few 0000AA pixels, and they are in the whitish parts. You can try this with any picture taken on Earth, but try to avoid pictures with solid black and white in them, or something silly like a rainbow: white requires bright amounts of all of R, G and B to show, and the rainbow is self-explanatory :P.

Reason
By sending each plate with its brightnesses spread across the full range, you gain the maximum amount of data from each plate. Once you know the calibration information, it is easy to scale each channel back down to its correct level and get the images looking as they should. If you were to send the images at equal exposure levels, the signal-to-noise ratio would be lower, and any slight error in one of the blue/green channels would be more noticeable.
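A toy illustration of that argument, with made-up numbers of my own rather than anything from the mission:

code:
import numpy as np

# A dim plate that only uses brightness values 0-50 out of 255.
dim = np.random.randint(0, 51, size=(64, 64))

# Stretched to the full 8-bit range before transmission...
stretched = (dim * 255 // 50).astype(np.uint8)

# ...the receiver can undo the stretch, and one digital level of
# noise in transit now costs only 50/255 of an original level.
recovered = stretched.astype(float) * 50 / 255
print(np.abs(recovered - dim).max())  # rounding error only, under 0.2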

Now, this only throws out the color-balance on images where the original plates were not already almost even, but unfortunately that covers most of the pics where the rover isn't visible. Remember, each plate is hard limited when transmitted back; for this not to change the look of the simple-combined image, the original plates would have to already be almost hard limited. There are a few where this is the case.

Now, a way to test this is to get these images:
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/005/2P126825055EFF0200P2303L2M1.JPG
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/005/2P126825088EDN0200P2303L5M1.JPG
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/005/2P126825120EDN0200P2303L6M1.JPG

You will have to shrink the first one from 1024 to 512. The 'EFF' code indicates 1024x1024 and 'EDN' indicates 512x512. I don't know why, that's just the pattern I've noticed :P. These are the 3 plates that make up the top of the little silver pole and corner of the sat-dish visible in the panorama.

http://marsrovers.jpl.nasa.gov/gallery/press/spirit/20040108a/PIA05015_br.jpg

Now, this pole has very bright, almost white areas in the reflection, so all plates should be fairly even in exposure levels.

Combining them in Photoshop (in the manner mentioned before), we get:


This is extremely close to the colors in the panorama, with slightly less of a red tint.

Yet when we use other 3-plate series from Sol 5 (which makes up most of the panorama), such as these:
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/005/2P126824700EFF0200P2303L2M1.JPG
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/005/2P126824735EDN0200P2303L5M1.JPG
http://marsrovers.jpl.nasa.gov/gallery/all/2/p/005/2P126824762EDN0200P2303L6M1.JPG

We get:


A completely different look, even though they are combined in the exact same manner. This is the effect of having all channels hard limited. You can re-create this effect by choosing Auto Levels in Photoshop. While this is often handy for brightening up images and so forth, it does not work well when you are dealing with images that are predominantly one color, and whose brightest and darkest points are not shades of grey.

How does NASA do it?
Well, clearly high-end, purpose-made image processing software is a big part of it. They also have all the relevant calibration and exposure information from the rover.

Are we boned?
Not at all. Any picture with white and black, or bright red, green and blue in it, will look almost exact when mixed evenly. And what is the one thing we know has these? The sundial.

So we can be fairly sure that any photo of the sundial (such as the ones shown earlier in the thread) will be accurate when mixed evenly. The convenient thing about the sundial is that it has mirrors on it to show the Martian sky, so with any series of L4, L5 and L6 filtered plates we can see a close approximation of the Martian sky.

For example:


We can see the sky color in the little mirrors at the edges of the sundial.

Now, the one flaw with all this is that a slight, constant hue of any sort would be removed by the equalisation of all the channels. So if anything, all these pics should really carry a slight red/orange tint.

Among the multitude of images from Spirit, a nice pair for comparison are these.

There is a series on sol 8 which looks like a test of almost all the filters at one hill. This was good news as it allows us to compare the difference when we use an L2 filter as the red channel and when we use an L4 filter.

The results are below. REMEMBER these are normalized color images, not real color.


The slide on the left looks less red than the one on the right. Obviously the channels are normalised, so the colours are not true, but it is a good visual example of how selecting the near-infrared filter for the red channel actually gives the appearance of less red than using the L4 filter for the red plate.
Posted by: Kano
On: Sun January, 18 2004 @ 03:42 GMT
Earth-Bound examples

Ok so, here is some further explanation of why normalizing the colors is not a way to find the 'true colors' of any given pics.

We have all seen those images from Mars where people have used Photoshop's 'auto-levels' function and 'proven' that Mars has a blue sky. This is so wrong as to be stupid. That function merely maximises each color channel; it cannot know the circumstances of each picture and 'fix them up'.

Here are a few examples of pictures taken on earth. Try it for yourself.

Firstly, this one, taken during the Canberra bushfires in January 2003. http://abc.net.au/news/indepth/featureitems/s766029.htm



In this one the color-change is extreme due to the original being largely red-tinted, with no white, blue or green coloring.

Further to this, I grabbed my trusty Canon A70, went out the back and took some photos, then came in and auto-levelled them. You can see the difference.


In all cases, the picture on the left is the Original, and on the right is the image where all channels have been equalized.

Tomato Plants.


The pool I am too lazy to clean.


The bush.


Now, the ones on the right seem to have more definition (as all ranges of brightness are covered by each channel), but the colors are simply wrong. You can try this yourself at home: use Photoshop's Auto Levels to equalize the color channels. Remember, pictures where the brightest part is a shade of grey or white will not be changed very much, and neither will pictures that have all three primary colors visible.
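If you'd rather script it than use Photoshop, here is a minimal per-channel levels stretch in Python that does essentially what Auto Levels does here. (A sketch only: the real Auto Levels also clips a small percentage of outlier pixels, which this skips, and "backyard.jpg" is a placeholder filename.)

code:
import numpy as np
from PIL import Image

def auto_levels(path):
    img = np.asarray(Image.open(path).convert("RGB")).astype(float)
    out = np.empty_like(img)
    # Stretch each channel independently to cover the full 0-255 range.
    for c in range(3):
        ch = img[..., c]
        lo, hi = ch.min(), ch.max()
        out[..., c] = (ch - lo) / max(hi - lo, 1) * 255
    return Image.fromarray(out.astype(np.uint8), "RGB")

auto_levels("backyard.jpg").save("backyard_equalized.jpg")

Run it on a heavily red-tinted photo and the same false blue cast appears.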
Posted by: Kano
On: Sun January, 18 2004 @ 03:44 GMT
Conclusions

Now, after all that ranting, I suppose we need to summarise.

Basically, there is no way for us to recombine all the 'Raw' images to show the final images. We'd need to know more about the exposure levels and calibration settings to do that.

But do not despair! There are still a few images from which we can get a very close approximation of the actual colors: for example, any picture that has the sundial or that pole (or any other white part of the rover) visible. These images, we can be sure, are close to the true-color images, with the only difference being that any overall red tint will be lost.

So from the images we already have, we can independently check the color of the ground and sky.

The sundial picture shows the reflected sky and the picture with the pole and sat-dish (I really should find out what that little pole is at some stage :P) shows the ground.





So all we are able to do so far is show that the sky and ground colors we have been seeing in the released NASA images are accurate. As you would expect, really.

Another relevant point: the 'raw' data on the NASA webservers is not technically raw, as NASA uses its own image compression to transmit back from Spirit, then converts to .jpg for hosting online, both to save bandwidth and because it is a much more palatable format.

Why don't they tell us this?

Some people have posed the question, 'why doesn't NASA tell us all this on their site?'. The simple response is 'why would they?'. The images shown by NASA are as close to the actual appearance from the surface as they can get. The colors are as true as a hundred million dollars' worth of camera and image processing software can get them: as accurate as any digital image can be.

There is simply no point in adding on their site "caution these images are not 100% precisely actual colors" when no digital image is really 'actual colors'. It would just give the conspiracy types more things to panic about.

We can already see for ourselves that the color of the ground and sky shown in the released panoramas is correct. There is no suggestion whatsoever that any modification has been made of the data coming in from Mars.
Posted by: Kano
On: Sun January, 18 2004 @ 03:45 GMT
Resources and Additional Reading

A brief list of related and useful sites regarding this matter:

Mars Rover Home at NASA:
http://marsrovers.jpl.nasa.gov/home/index.html

An Overview of the Mission of the Two Rovers:
http://marsrovers.jpl.nasa.gov/overview/

All Raw images from the Spirit Rover:
http://marsrovers.jpl.nasa.gov/gallery/all/spirit.html

Homepage of the Athena Instrument Team at Cornell University:
http://athena.cornell.edu/

Specifically the PanCam:
http://athena.cornell.edu/the_mission/ins_pancam.html

PanCam Technical Briefing:
http://athena.cornell.edu/pdf/tb_pancam.pdf

HowStuffWorks Page on Digital Cameras:
http://electronics.howstuffworks.com/digital-camera.htm

HowStuffWorks Page on the Eye and Vision:
http://science.howstuffworks.com/eye.htm

Georgia State University's Hyperphysics pages:
http://hyperphysics.phy-astr.gsu.edu/hbase/hph.html

Specifically Light and Vision:
http://hyperphysics.phy-astr.gsu.edu/hbase/ligcon.html

Including Color Vision:
http://hyperphysics.phy-astr.gsu.edu/hbase/vision/colviscon.html


There are probably more I have forgotten, but that is a good coverage of sites related to the topics discussed.

Enjoy!
Posted by: Kano
On: Sun January, 18 2004 @ 03:48 GMT
I've created a thread for members to discuss and ask questions about this story:

http://www.abovetopsecret.com/forum/viewthread.php?tid=30053
Posted by: William One Sac
On: Sun January, 18 2004 @ 07:39 GMT
What an amazing analysis you have provided, Kano! Thank you for uncovering the truth of this mystery.


:cool:
Posted by: mikromarius
On: Sun January, 18 2004 @ 18:26 GMT
Why don't they just use normal RGB colored images? Today's technology would make that possible. They could beam them back to Earth with lasers instead of radio emissions.

And that calibration tool they have added to their sundial is just about the silliest thing I've ever seen. To use four colors (and those four colors) to calibrate an image is just bogus. Hey, when I calibrate a screen, a scanner or a TV monitor, we use a chart with many colors, optimally the whole spectrum. NASA doesn't manage to turn me into a believer just like that.

Blessings,
Mikromarius

[Edited on 19-1-2004 by mikromarius]
Posted by: mikromarius
On: Sun January, 18 2004 @ 19:04 GMT
And came up with this result:


The pictures on the top are originals (and the reference calibration is also untouched). The ones below are calibrated images. Well calibrated. I used like two minutes in PS. If I had larger pictures I could have done a good job with them. But you "get the picture", don't you?

This means: by reverse "engineering" these images, you can say that what is blue becomes red, and what's green becomes yellow. Hmmmm......

Below is an attempt at "bringing the image back" to its original state. I only looked at the colors on the sundial; I didn't look at the photos. Startling.....



Blessings,
Mikromarius

[Edited on 19-1-2004 by mikromarius]
ATSNN.com is brought to you by the staff and members of the Above Top Secret discussion board, the Internet's most respected source for discussion on alternative topics.