Sony have finally solved the problem of how to make a DSLR camera with live preview on the rear view-screen, and they’ve made the view-screen tiltable (as on a camcorder) in the process.
There’s a review in the New York Times here. It says this about how Sony did it:
On an ordinary single-lens reflex camera [...], light enters from the lens and is split by a semi-transparent mirror. Part of the light goes to the eyepiece viewfinder, and the other part goes downward to the autofocus sensor.
When you press the shutter button, that mirror flips up out of the light’s path, revealing [...] the computer chip that records the photo.
[...] Why can’t you frame a photo using an S.L.R.’s back-panel screen, as you can on a little pocket camera? Actually, a few recent S.L.R. models do, in fact, have this Live View feature, but it’s mostly a disaster. It works by flipping that mirror up out of the way, so that light from the lens hits the image sensor, which feeds the image to the screen. Trouble is, once the mirror goes up, no light hits the autofocus sensor, so the camera can’t focus.
So here’s what happens when you press the shutter button. There’s a noisy clank as the mirror drops down again; the screen goes black; the camera computes focus and exposure; the mirror lifts again; the screen comes back to life; and finally — a second or so later — the shot is recorded.
[...] All of this silliness arises because the camera’s image sensor must do double duty: it’s responsible for supplying the screen with a live preview and for recording the shot.
Sony’s technical breakthrough on the A300, therefore, was this: “Duh! Put in another sensor!”
On this camera, turning on Live View sends light from that main mirror onto a second sensor, one that’s devoted solely to feeding the preview screen. The autofocus sensor works normally as you compose a shot, since the mirror never has to flip up.
I’d really like to own one of these, whether the α300 or the α350!