(NOTE: Long post ahead)
For as often(?) as this projector comes up on Woot, I ought to simply save my previous posts so I can just cut & paste.
- Very conspicuously missing from the specs is that this projector will not project VGA @ 1080p. Well, it does, but it does this weird "I'm not processing the data quickly enough" thing that makes the image look like you're projecting in a disco. I called Optoma and confirmed this quirk, so just make sure you're projecting video from your computer @ 720p.
Bradley, can you elaborate a bit more on this? When you say "VGA", are you referring to the spec max resolution of 800x600 pixels? Or are you talking about receiving a signal through a VGA cable to the VGA port on the projector?
I have the HD20 (recommended over the HD180) and it projected all my resolutions just fine. It's connected to my computer and my digital cable box. So I run resolutions anywhere from 640x480 up to 1920x1080. Keep in mind a couple of things...
1) 1080p resolution refers to a fixed resolution of 1920x1080 pixels. "P" refers to Progressive Scan and "I" refers to Interlaced. So I'm not sure how you are mixing "VGA" and "1080p" in the same thought. For the record, my computer outputs 1080p just fine to the HD20.
2) If you are using a VGA cable from your computer, you may be running into signal loss problems as a standard 15-pin D-sub VGA cable isn't spec'd to carry the data requirements of 1080p. And if you are using a VGA cable from your computer, for God's sake, child, upgrade your video card! Even your basic $20 video cards will at least have a DVI port, and probably even an HDMI port. Unless you're outputting from an older/basic laptop. Then I guess you're stuck. But even then, I had my HD20 hooked up to a VGA port on a laptop and didn't experience any problems.
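For anyone curious just how demanding 1080p60 is over an analog link, here's a quick back-of-the-envelope check. It uses the standard CEA/CTA-861 timing totals for 1080p60 (active pixels plus blanking), which is a general spec figure and nothing specific to the HD20 or any particular cable:

```python
# Pixel-clock arithmetic for 1080p60, using the standard CTA-861
# timing totals (active resolution + blanking intervals).
h_total = 2200   # 1920 active + horizontal blanking
v_total = 1125   # 1080 active + vertical blanking
refresh = 60     # Hz

pixel_clock_hz = h_total * v_total * refresh
print(pixel_clock_hz / 1e6, "MHz")  # 148.5 MHz
```

That's a roughly 148.5 MHz analog signal per color channel, which is why cheap, unshielded VGA cables tend to get flaky at this resolution even though lower resolutions look fine.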
(Stay with me here, Bradley, I'm gonna come back to you shortly.)
Another poster asked "What's the difference between the HD20 and HD180?"
For all intents and purposes, just get the HD20. It has slightly better contrast and supports more formats.
Supposedly the HD180 doesn't support 1080i and the HD20 does, which could be very important for people hooking it up to their cable box. Not many, if any, digital cable boxes output 1080p. Most, as I was informed, output 1080i, making the HD20 the better choice.
Other little differences? The HD20 supports more international TV formats. Supposedly the HD180 supports two more computer resolutions, but I didn't find that to be true. My HD20 outputted every resolution I threw at it. So essentially, there's no reason to get the HD180. Get the HD20.
Now, "Is this a good projector for the price?" A year ago, it was a no-brainer: "YES!" Today? Just "yes." The market is heating up in the sub-$1000 1080p range. BenQ, ViewSonic, and Acer are in this market now with competitive products. While I don't have any personal experience with anything but my Optoma HD20, if I had to look for a new projector (and I may be; more on that below), I would carefully consider these other manufacturers; for a few bucks more, you'd have a brand-new unit.
All that being said...
I bought my HD20 on Feb 7th, 2011 from sellout.woot. Today, I'm sending it in for warranty service, possibly for the issue that Bradley ("bjx") mentioned previously. (I bought an extended warranty from SquareTrade and I highly recommend them for almost all electronic purchases!)
The projector has a vertical refresh spec of 24Hz - 85Hz. This is important for people who use this as a theater projector and for computer or game console output.
For the best movie experience, the source playing your movie should be able to output at 24Hz and your projector should switch to 24Hz since movies are filmed @ 24fps. The HD20 does this.
Computer and TV signals are typically output @ 60Hz, and some computer games will switch the output rate to 50Hz. Again, the HD20 supports all these refresh rates at 1080p resolutions (with a 32-bit color palette).
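If you're wondering why a native 24Hz mode matters at all, here's a little sketch (plain Python, nothing projector-specific) of what happens when you push 24fps film into a 60Hz display: the frames can't each get the same number of refreshes, so they alternate between 3 and 2 refreshes each, the uneven "3:2 pulldown" cadence that causes judder. At 24Hz, every film frame gets exactly one refresh.

```python
film_fps, display_hz = 24, 60

# Distribute 60 refreshes across 24 film frames: each frame i is shown
# for the number of 60Hz ticks that fall within its 1/24s window.
counts = [0] * film_fps
for tick in range(display_hz):
    frame = tick * film_fps // display_hz
    counts[frame] += 1

print(counts[:6])  # [3, 2, 3, 2, 3, 2] -- the uneven 3:2 cadence
```

Run the same thing with `display_hz = 24` and every frame gets exactly 1 refresh, which is why a projector that honors a 24Hz signal gives you that smooth theater-style motion.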
For some reason, my projector decided a couple months ago that it was only truly happy displaying 60Hz signals. Whenever it would switch to 24Hz or 50Hz, the video processor would no longer maintain 32-bit color depth and would drop down to 16-bit color. What this means is that when watching a movie, the colors are not smooth: gradients show distinct breaks or bands between colors, and/or solid colors are simply displayed as a different color. For example, a light blue sky might take on a greenish-yellow hue. That's almost tolerable, but the flicker between the 16-bit and 32-bit color palettes was most definitely annoying and made viewing a movie @ 24Hz unwatchable. (Is this what you were experiencing, Bradley? Did you notice whether the projector showed a different refresh rate when the signal changed? It always displays this in the bottom-right corner when switching.)
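To illustrate why 16-bit color bands like that, here's a generic sketch of 16-bit RGB565-style quantization (this is just the general principle, not the HD20's actual internal processing): a channel that had 8 bits (256 levels) gets squeezed into 5 bits (32 levels), so a smooth gradient collapses into visible steps.

```python
# Quantize an 8-bit color channel down to 5 bits (as in 16-bit RGB565)
# and expand it back for display.
def to_5bit_and_back(v8):
    v5 = v8 >> 3                   # keep only the top 5 bits
    return (v5 << 3) | (v5 >> 2)   # expand back to the 8-bit range

gradient = list(range(256))                  # smooth 8-bit ramp
banded = [to_5bit_and_back(v) for v in gradient]

print(len(set(gradient)), len(set(banded)))  # 256 vs 32 distinct levels
```

256 shades of sky collapsing into 32 is exactly what you see as banding, and subtle tints can round off to a visibly different hue.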
Yes, I simply switched the signal back to 60Hz, and very few people would notice the difference. But I'm a movie and electronics buff, so I noticed the difference. (24Hz is the reason movies look like they do in the theater. When you have a 133" projection in your home theater like I do, you should expect the same experience.)
So...since it used to work and now it's flaky, it's going in for warrantied service.
Again, most people would never notice this, because most people have their signal input set to 60Hz or never bother with other refresh rates. With that in mind, it's not surprising that there would be zero complaints about this projector.