A curious case where RAW isn't higher quality
For stills photography, many people advocate shooting RAW, since it allows more flexible processing and usually yields higher quality. However, there is one weird case where shooting JPEG in camera can actually yield higher quality: when the deBayer/demosaic algorithm in the camera happens to do a better job than the RAW processor (e.g. Adobe Camera RAW).
In the comparison above, Photoshop (I say Photoshop, but the processing is really done by Adobe Camera RAW) shows more color and mazing artifacts than the camera’s built-in processing. Weird but true: in this specific situation, the camera’s digital signal processing does a better job than the Photoshop RAW plug-in. In practice this is likely not a huge deal since:
A- The image was of a test pattern with lots of high-contrast fine detail. Real-world images tend not to look anything like this. RAW processing is typically designed to do well on real-world images, and it is real-world images that count, not evil test patterns.
B- Almost all of the time, the RAW workflow will yield higher quality. This specific situation is so rare that I wouldn’t worry about it.
Since these RAW images come from a Bayer pattern sensor, the sampling of red, green, and blue is not co-sited: the red, green, and blue pixels do not stack on top of each other, so the in-between data must be (essentially) guessed. This is not quite as bad as it seems, as our eye is also like the Bayer pattern in that our S, M, and L cones aren’t co-sited either. (The real answer is more complicated, as the human visual system doesn’t suffer from aliasing and deBayer artifacts in the same way that cameras do. I believe this is because the HVS can implement tricks to fix these issues, since the human eye moves when it samples a scene. Machine systems like scanners can move the sensor, but we cannot do this for still cameras. In practice, the Bayer pattern works very well for human viewing and is the choice for almost all dSLRs.)
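To make the “guessing” concrete, here is a minimal sketch of the simplest approach, bilinear demosaicing, assuming an RGGB mosaic and NumPy. Real converters are far more sophisticated than this:

```python
import numpy as np

def conv3(img, k):
    """3x3 weighted sum with reflected borders."""
    p = np.pad(img, 1, mode="reflect")
    out = np.zeros(img.shape)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def demosaic_bilinear(raw):
    """raw: HxW RGGB mosaic -> HxWx3 RGB; missing samples are filled
    by averaging the nearest same-color neighbors."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
    g_mask = 1 - r_mask - b_mask
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    return np.dstack([conv3(raw * r_mask, k_rb),
                      conv3(raw * g_mask, k_g),
                      conv3(raw * b_mask, k_rb)])

# a flat gray scene reconstructs exactly; edges are where trouble starts
rgb = demosaic_bilinear(np.full((6, 6), 0.5))
print(np.allclose(rgb, 0.5))  # True
```

Blind averaging like this is exactly what produces color fringes and zippering at edges, which is why real demosaicers go to so much more trouble.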
But back to Bayer… one of the tricks to good debayering is to give special handling to edges to avoid zippering artifacts (see the example images at this site; search for the word “zippering”). Mazing artifacts can occur when these tricks do not detect edges correctly, resulting in erroneous lines and edges. High-quality debayering can get quite complex in trying to solve these problems. The debayering algorithm also interacts with the design of the camera. A properly-engineered camera will implement an optical low-pass filter (OLPF) in front of the sensor. An OLPF can be thought of as a blurry piece of glass. The blur is actually desirable, as it reduces aliasing and “fills in the gaps” in the sampling structure. Images from cameras with an OLPF are less prone to mazing artifacts when processed through Adobe Camera RAW. Lenses will also introduce blur and can act as an OLPF. All this stuff interacts.
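Here is a sketch of the edge-handling idea (my own illustration, not any particular converter’s algorithm): before averaging green neighbors, compare the horizontal and vertical gradients and interpolate along the direction with the smaller one, so we average along an edge rather than across it:

```python
import numpy as np

def green_edge_directed(raw):
    """Fill in green at the non-green sites of an RGGB mosaic (interior
    only).  Interpolating along the lower-gradient direction avoids the
    zipper pattern that a blind 4-neighbor average produces at edges."""
    g = raw.astype(float).copy()
    h, w = raw.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if (y + x) % 2 == 0:  # R or B site: green is missing here
                dh = abs(raw[y, x - 1] - raw[y, x + 1])
                dv = abs(raw[y - 1, x] - raw[y + 1, x])
                if dh < dv:    # smoother horizontally: average left/right
                    g[y, x] = (raw[y, x - 1] + raw[y, x + 1]) / 2
                elif dv < dh:  # smoother vertically: average up/down
                    g[y, x] = (raw[y - 1, x] + raw[y + 1, x]) / 2
                else:          # no clear direction: fall back to 4-neighbor
                    g[y, x] = (raw[y, x - 1] + raw[y, x + 1]
                               + raw[y - 1, x] + raw[y + 1, x]) / 4
    return g

# a sharp vertical edge: columns 0-2 dark, columns 3-5 bright
raw = np.where(np.arange(6)[None, :] < 3, 0.1, 0.9) * np.ones((6, 6))
g = green_edge_directed(raw)
print(g[2, 2])  # 0.1 -- interpolated along the edge, no zipper
```

At (2, 2) the blind 4-neighbor average would mix in the bright side of the edge and produce a zippered value; when the gradient test misfires on ambiguous detail (like a test pattern), you get exactly the erroneous lines described above.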
The practical thing to do is to run a test, or look at someone else’s. Dpreview.com has comparisons between JPEG and Adobe Camera RAW in their reviews (here is their Leica M8 review and their Nikon D200 review). The Leica M8 is notable because it has no OLPF (perhaps because Leica lenses are very sharp and the M8 designers wanted to show that off). The M8 is therefore very prone to aliasing (and, by extension, Bayer artifacts).
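The aliasing risk is easy to demonstrate numerically. Below, a checkerboard at the pixel pitch (the worst case) is sampled at the red photosites of an RGGB mosaic, with and without a crude OLPF model — a 2x2 box blur, my simplification of the four-spot split a birefringent OLPF performs:

```python
import numpy as np

# checkerboard alternating 0/1 every pixel: detail at the sampling limit
scene = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)

# red photosites see only every other row and column (RGGB)
red_no_olpf = scene[0::2, 0::2]

# crude OLPF: average each pixel with its three (wrap-around) neighbors
olpf = (scene + np.roll(scene, 1, axis=0) + np.roll(scene, 1, axis=1)
        + np.roll(scene, (1, 1), axis=(0, 1))) / 4
red_with_olpf = olpf[0::2, 0::2]

# the true scene averages 50% gray; without the OLPF, every red
# photosite happens to land on a dark square and the channel aliases
# to solid black
print(red_no_olpf.mean(), red_with_olpf.mean())  # 0.0 0.5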
To evaluate whether your prefer the camera’s processing or a RAW converter’s, shoot a test of both RAW and JPEG and see what the differences are. Keep in mind other differences in signal processing will affect image quality too (e.g. sharpening, color, highlight and exposure recovery tricks, etc.).