
Thursday, September 6, 2007

Zippy Serene (GP2-5600V)

With this review of the Serene we now have our second Zippy power supply on the test bench. For those unfamiliar with the company and its roots, we suggest reading our first Zippy review as well. Zippy has been around for quite some time, and in the server world they are recognized for building some of the highest quality power supplies on the market. Zippy is located in Xin Dian (Hsin Tien), a suburb of Taipei, and manufactures all of their power supplies at their factory there.

They are known for extremely reliable server power supplies, but recently Zippy has made the move into the retail desktop PSU market with several high-end offerings. The Gaming G1 power supply in our last review exhibited very high quality, but it could still use quite a bit of improvement to better target the retail desktop PC market. Today we will be looking at the Serene 600W (GP2-5600V), a power supply built with the goal of delivering the best efficiency possible. The package claims 86%, which is quite a lofty goal for a retail product.

As we have seen many times with other power supplies, the Serene comes with a single 12V rail. We have written previously that this does not conform to the Intel Power Supply Design Guidelines, but readers and manufacturers remain divided on the issue. Some say it is no problem at all, since protection circuitry will kick in before anything bad happens (i.e. overloading the power supply); others prefer to stick to the rules and have released power supplies with up to six 12V rails. The lower voltage rails each have 25A at their disposal, while the single 12V rail offers 40A and should have no difficulty powering everything a decent system needs.

Given the name, one area that will be of particular interest to us is how quiet this power supply manages to run. Granted, delivering a relatively silent 600W power supply is going to be a bit easier than making a "silent" 1000W unit, but we still need to determine whether or not the Zippy Serene can live up to its name.
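To put those ratings in perspective, here is a bit of rough arithmetic based only on the figures quoted above (the rail currents and the 86% efficiency claim); this is an illustrative sketch, not measured data.

```python
# Rough arithmetic on the Serene's quoted ratings; illustrative only, not measurements.

rail_12v_watts = 12 * 40       # single 12V rail: 12V x 40A = 480W
rail_5v_watts = 5 * 25         # 5V rail at 25A -> 125W
rail_3v3_watts = 3.3 * 25      # 3.3V rail at 25A -> 82.5W

rated_output = 600             # W, label rating
claimed_efficiency = 0.86      # package claim

# At full load and the claimed efficiency, this is the power drawn from the
# wall and the waste heat the fan has to remove.
wall_draw = rated_output / claimed_efficiency   # ~698W
waste_heat = wall_draw - rated_output           # ~98W

print(f"12V rail capacity: {rail_12v_watts}W")
print(f"Wall draw at 600W load: {wall_draw:.0f}W, waste heat: {waste_heat:.0f}W")
```

Less waste heat means the fan can run slower, which is why a high efficiency target goes hand in hand with the "Serene" branding.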

ASRock 4CoreDual-SATA2: Sneak Peek

When discussing current Intel chipsets, the phrase "budget sector" is somewhat of an oxymoron. While there are a lot of choices in the $45 to $60 range for Core 2 Duo compatible boards, these are mainly based on older designs that do not offer anything in the way of additional features, extended overclocking, or performance oriented chipsets. This is not to say they are in any way bad (our favorite Intel budget board in the lab is the VIA based ASRock 4CoreDual-VSTA), but rather that these boards are targeted at an audience that is price sensitive or just looking for the best bang for the buck.

ASRock has built a very good reputation on offering these types of solutions. The performance oriented crowd will often snub these products due to their sometimes quirky nature, but you cannot deny their value. In the case of the ASRock 4CoreDual-SATA2, this board allows you to move to the Core 2 Duo platform for a minimal cost. Besides offering good performance for a great price, this board also provides the capability to utilize DDR memory and an AGP graphics card.

We provided a series of reviews centered on the 775Dual-VSTA last year, which was the first board in the VIA PT880 based family of products from ASRock. That board was replaced by the 4CoreDual-VSTA last winter, which brought a move from the PT880 Pro to the Ultra version of the chipset along with quad core compatibility. ASRock is now introducing the 4CoreDual-SATA2, with the primary difference being a move from the VT8237A Southbridge to the VT8237S Southbridge that offers native SATA II compatibility.

Our article today is a first look at this new board to determine if there are any performance differences between it and the 4CoreDual-VSTA in a few benchmarks that are CPU and storage system sensitive. We are not providing a full review of the board and its various capabilities at this time; instead, this is a sneak peek to answer numerous reader questions about the differences between the two boards. We will test DDR, AGP, and even quad core capabilities in our next article, which will delve into the performance attributes of this board and several other new offerings from ASRock and others in the sub-$70 Intel market.

While most people would not run a quad core processor in this board, it does have the capability, and our Q6600 has been running stable for a couple of weeks now, though we have run across a couple of quirks that ASRock is working on. The reason we even mention this is that with Intel reducing the price of the Q6600 to the $260 range shortly, current users of the 4CoreDual series may want to upgrade CPUs without changing other components (yet). In the meantime, let's see if there are any initial differences between the two boards besides a new Southbridge.

Apple TV - Part 2: Apple Enters the Digital Home

What is the center of your digital home? To the majority of the population, it’s not a question that’s asked or even remotely understood. If we rephrased the question, you might be able to answer it a bit better: where do you keep all of your music, movies and photos? An educated guess on our part would be that the average AnandTech reader keeps most of their digital content on their computer, thus making the PC the center of the digital home.
Microsoft would be quite happy with that assessment but there’s one key distinction: PC does not have to mean Windows PC, it could very well mean a Mac. Both Microsoft and Apple have made significant headway into fleshing out the digital home. Microsoft’s attempts have been more pronounced; the initial release of Windows XP Media Center Edition was an obvious attempt at jump starting the era of the digital home. Microsoft’s Xbox 360 and even Windows Vista are both clear attempts to give Microsoft a significant role in the digital home. Microsoft wants you to keep your content on a Vista PC, whether it be music or movies or more, and then stream it to an Xbox 360 or copy it to a Zune to take it with you.
Apple’s approach, to date, has been far more subtle. While the iPod paved a crystal clear way for you to take your content with you, Apple had not done much to let you move your content around your home. If you have multiple computers running iTunes you can easily share libraries, but Apple didn’t apply its usual elegant simplicity to bridging the gap between your computer and your TV; Apple TV is the product that aims to change that.
Apple TV is nothing more than Apple’s attempt at a digital media extender, a box designed to take content from your computer and make it accessible on a TV. As Microsoft discovered with Media Center, you need a drastically different user interface if you're going to be connected to a TV. Thus the (expensive) idea of simply hooking your computer up to your TV died and was replaced with a much better alternative: keep your computer in place and just stream content from it to dumb terminals that will display it on a TV, hence the birth of the media extender. Whole-house networking became more popular, and barriers were broken with the widespread use of wireless technologies, paving the way for networked media extenders to enter the home.
The problem is that most of these media extenders were simply useless devices. They were either too expensive or too restrictive with what content you could play back on them. Then there were the usual concerns about performance and UI, not to mention compatibility with various platforms.
Microsoft has tried its hand at the media extender market, the latest attempt being the Xbox 360. If you've got Vista or XP Media Center Edition, the Xbox 360 can act as a media extender for content stored on your PC. With an installed user base of over 10 million, it's arguably the most pervasive PC media extender currently available. But now it's Apple's turn.

Skeptics are welcome, as conquering the media extender market is not as easy as delivering a simple UI. If that's all it took we'd have a lot of confidence in Apple, but the requirements for success are much higher here. Believe it or not, the iPod's success was largely due to the fact that you could play both legal and pirated content on it; the success of the iTunes Store came after the fact.
The iPod didn't discriminate; if you had an MP3, it would play it. Media extenders aren't as forgiving, mostly because hardware makers are afraid of the ramifications of building a device that is used predominantly for pirated content. Apple, with its obviously close ties to content providers, isn't going to release something that is exceptionally flexible (although there is hope for the unit from within the mod community). Apple TV will only play H.264 or MPEG-4 encoded video, with bit rate, resolution and frame rate restrictions (we'll get into the specifics later); there's no native support for DivX, XviD, MPEG-2 or WMV.
Already lacking the ability to play all of your content, is there any hope for Apple TV, or will it go down in history as another Apple product that just never caught on?

Dual Core Linux Performance: Two Penguins are Better than One

With all of the attention on dual core processors lately, it has been easy to overlook the one application that might benefit more from multiple cores than any other: Linux. OK, so technically Linux isn't an application, but the kernel has supported SMP for nine years, almost to the day. The road to SMP has not been an easy one for Linux, but in the last nine years, and particularly since 1999, Linux has received considerable attention as an operating system for 2-8 processor systems. If you need a reference, just look at how many Linux machines hold SPEC benchmark records in web serving and number crunching.
But does any of this translate to great desktop performance for dual core processors? We are going to look at that question today while also determining whether Intel or AMD is the better contender for the Linux desktop. We have some slightly non-traditional (but very replicable) tests planned that should demonstrate the strengths of each processor family, and we will also compare against some similar Windows tests that we have performed in the past on comparable configurations. Ultimately, we would love to see a Linux configuration perform the same task as a Windows machine, but faster.
Just to recap, the scope of today's exploration is to determine which configuration offers the best performance per dollar on Linux, and whether or not any of these configurations outperform similar Windows machines running similar benchmarks; it becomes all too easy to lose the scope of the analysis otherwise. We obtained some reasonably priced dual core Intel and AMD processors for our benchmarks today, and we will also throw in some benchmarks of newer single core chips to give a point of reference.

AMD's Opteron hits 3.2GHz

Introduction

"AMD has no answer to the armada of new Intel's CPUs.""Penryn will be the final blow."These two sentences have been showing up on a lot of hardware forums around the Internet. The situation in the desktop is close to desperate for AMD as it can hardly keep pace with the third highest clocked Core 2 Duo CPU, and there are several quad core chips - either high clocked expensive ones or cheaper midrange models - that AMD simply has no answer for at present. As AMD gets closer to the launch of their own quad core, even at a humble 2GHz, Intel let the world know it will deliver a 3GHz quad core Xeon with 12 MB L2 that only needs 80W, and Intel showed that 3.33GHz is just around the corner too. However, there is a reason why Intel is more paranoid than the many hardware enthusiasts.While most people focus on the fact that Intel's Core CPUs win almost every benchmark in the desktop space, the battle in the server space is far from over. Look at the four socket market for example, also called the 4S space. As we showed in our previous article, the fastest Xeon MP at 3.4GHz is about as fast as the Opteron at 2.6GHz. Not bad at all, but today AMD introduces a 3.2GHz Opteron 8224, which extends AMD's lead in the 4S space. This lead probably won't last for long, as Intel is very close to introducing its newest quad core Xeon MP Tigerton line, but it shows that AMD is not throwing in the towel. Along with the top-end 3.2GHz 8224 (120W), a 3GHz 8222 at 95W, 3.2GHz Opteron 2224 (120W) and 3GHz 2222 (95W) are also being introduced.The 3.2GHz Opteron 2224 is quite interesting, as it is priced at $873. This is the same price point as the dual core Intel Xeon 5160 at 3GHz and the quad core Intel Xeon 5355. The contrast with the desktop market is sharp: not one AMD desktop CPU can be found in the higher price ranges. So how does AMD's newest offering compare to the two Intel CPUs? Is it just an attempt at deceiving IT departments into thinking the parts are comparable, or does AMD have an attractive alternative to the Intel CPUs?

HD Video Decode Quality and Performance Summer '07

The current generation of graphics hardware is capable of delivering high definition video with lower CPU utilization and better quality than ever. Armed with the most recent drivers from AMD and NVIDIA, we have spent quite a bit of time testing and analyzing the current state of HD playback on the GPU, and we have to say that while there are certainly some very high points here, we have our concerns as well.

Since the last time we tested HD playback performance on the 8600 line, we have seen software support improve dramatically. PowerDVD, especially, has come quite a long way: it now fully supports both AMD and NVIDIA hardware with full hardware acceleration and is quite stable. Both camps have also added HD video quality improvements to their drivers in the form of post processing, and HD deinterlacing and noise reduction now (mostly) work as we would expect. This is in contrast to the across-the-board scores of 0 under HD HQV that we saw earlier this year.

This will be the first time we test AMD's new R600 and RV6xx based graphics cards using our video decode tests. Our RV6xx based Radeon HD 2600 and 2400 hardware features AMD's UVD video decode pipeline, which accelerates 100% of the HD video decode process for all codecs supported by HD-DVD and Blu-ray. NVIDIA's hardware falls short of AMD's offering in the VC-1 bitstream decoding department, as it leaves this task up to the CPU. We will try to evaluate just how much of an impact this difference will really have for end users. Here's a breakdown of the decode features for the hardware we will be testing:

While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", G84 and G86 based hardware offers the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding (where decode support is the same as the HD 2900 XT, lacking only bitstream processing). With software and driver support finally coming up to speed, we can begin to answer the remaining questions about the quality and efficacy of AMD and NVIDIA's mainstream hardware. These new parts are sorely lacking in 3D performance, and we've been very disappointed with what they've had to offer: neither camp has yet provided a midrange solution that bridges the gap between cost effectiveness and acceptable gaming performance (especially under current DX10 applications). Many have claimed that HTPC and video enthusiasts will be able to find value in low end current generation hardware, and we will certainly address this issue as well.

No 3G on the iPhone, but why? A Battery Life Analysis

Most of the initial reviews of Apple's iPhone shared one complaint in common: AT&T's EDGE network was slow, and it's the fastest cellular network the iPhone supported. In an interview with The Wall Street Journal, Steve Jobs explained Apple's rationale for not including 3G support in the initial iPhone:
"When we looked at 3G, the chipsets are not quite mature, in the sense that they're not low-enough power for what we were looking for. They were not integrated enough, so they took up too much physical space. We cared a lot about battery life and we cared a lot about physical size. Down the road, I'm sure some of those tradeoffs will become more favorable towards 3G but as of now we think we made a pretty good doggone decision."
The primary benefit of 3G support is obvious: faster data rates. Using dslreports.com's mobile speed test, we were able to pull an average of 100kbps off of AT&T's EDGE network as compared to 1Mbps on its 3G UMTS/WCDMA network.
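To give those averages some context, a quick back-of-the-envelope calculation shows how long a typical transfer takes at each rate; the 500KB page size below is a hypothetical figure chosen purely for illustration.

```python
# Back-of-the-envelope transfer times at the measured average throughputs.
# The 500KB page size is a hypothetical example, not a measured figure.

page_size_bits = 500 * 1024 * 8   # 500KB page expressed in bits

edge_kbps = 100                   # average measured on AT&T's EDGE network
umts_kbps = 1000                  # average measured on AT&T's 3G UMTS network

edge_seconds = page_size_bits / (edge_kbps * 1000)
umts_seconds = page_size_bits / (umts_kbps * 1000)

print(f"EDGE: ~{edge_seconds:.0f} seconds, UMTS: ~{umts_seconds:.0f} seconds")
# EDGE: ~41 seconds, UMTS: ~4 seconds (ignoring latency, which hurts EDGE further)
```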
Apple's stance is that the iPhone gives you a slower-than-3G solution with EDGE that doesn't consume a lot of power, and a faster-than-3G solution with Wi-Fi when you're in range of a network. Our tests showed that on Wi-Fi, the iPhone was able to pull between 1 and 2Mbps, which is faster than what we got over UMTS but not tremendously so. While we appreciate the iPhone's Wi-Fi support, the lightning quick iPhone interface makes those times that you're on EDGE feel even slower than on other phones. Admittedly it doesn't take too long to get used to, but we wanted to dig a little deeper and see what really kept 3G out of the iPhone.

Pointing at size and power consumption, Steve gave us two targets to investigate. The space argument is an easy one to confirm: we cracked open the Samsung Blackjack and looked at its 3G UMTS implementation, powered by Qualcomm:
Motherboard Battle: iPhone (left) vs. Blackjack (right); only one layer of the iPhone's motherboard is present

Mr. Jobs indicated that integration was a limitation to bringing UMTS to the iPhone, so we attempted to identify all of the chips Apple used for its GSM/EDGE implementation (shown in purple) vs. what Samsung had to use for its Blackjack (shown in red):

The largest chip on both motherboards contains the multimedia engine which houses the modem itself, GSM/EDGE in the case of the iPhone's motherboard (left) and GSM/EDGE/UMTS in the case of the Blackjack's motherboard (right). The two smaller chips on the iPhone appear to be the GSM transmitter/receiver and the GSM signal amplifier. On the Blackjack, the chip in the lower left is a Qualcomm power management chip that works in conjunction with the larger multimedia engine we mentioned above. The two medium sized ICs in the middle appear to be the UMTS/EDGE transmitter/receivers, while the remaining chips are power amplifiers.
The iPhone would have to be a bit thicker, wider or longer to accommodate the same 3G UMTS interface that Samsung used in its Blackjack. Instead, Apple went with Wi-Fi alongside GSM - the square in green shows the Marvell 802.11b/g WLAN controller needed to enable Wi-Fi.
So the integration argument checks out, but what about the impact on battery life? In order to answer that question we looked at two smartphones - the Samsung Blackjack and Apple's iPhone. The Blackjack would be our 3G vs. EDGE testbed, while we'd look at the impact of Wi-Fi on power consumption using the iPhone.

HP w2207: Shiny 22" Perfection?

Introduction

Let's be honest: we like big displays. Given the choice between any two computer LCDs, we would almost invariably take the larger display - provided that price isn't an overriding concern, naturally. That being the case, and looking at the current prices of LCDs, we have a serious problem even considering anything smaller than a 20" LCD. The difference in price between a 17" LCD and a 22" LCD can be as little as $75, and by the time you're looking at reasonable quality displays, the price difference can narrow even more. Widescreen displays are the trend these days, which is all the more reason to get something a bit larger if possible - note that in terms of screen surface area, a 19" widescreen is actually slightly smaller than a standard 19" 5:4 aspect ratio LCD.

It wasn't all that long ago that a typical 20" LCD could cost well over $500. After watching 20-22" CRTs bottom out at around $500 for more than five years, you certainly won't find us complaining about LCD prices dropping by 30% or more per year! There is a point of diminishing returns, however, and it's quite difficult to find an LCD of any size for under $150. Should you go out and purchase the least expensive (and probably lowest quality) LCD you can find for $150, or is it better to spend a bit more money to get one of the larger displays? Considering that the display is what you're actually looking at the whole time you use a computer, we continue to recommend spending more rather than less on that particular component, and the fact that a good quality display can last through several computer upgrades is merely one more reason to do so.
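The widescreen-versus-5:4 aside above is easy to verify with a little geometry; the sketch below computes viewable area from the nominal diagonal and aspect ratio, ignoring any rounding in manufacturers' specifications.

```python
from math import sqrt

def screen_area(diagonal_in, aspect_w, aspect_h):
    """Viewable area in square inches for a given diagonal and aspect ratio."""
    scale = diagonal_in / sqrt(aspect_w ** 2 + aspect_h ** 2)
    return (aspect_w * scale) * (aspect_h * scale)

print(f'19" 16:10 widescreen: {screen_area(19, 16, 10):.0f} sq in')  # ~162
print(f'19" 5:4 standard:     {screen_area(19, 5, 4):.0f} sq in')    # ~176
print(f'22" 16:10 widescreen: {screen_area(22, 16, 10):.0f} sq in')  # ~218
```

So a 19" widescreen gives up roughly 14 square inches to a 19" 5:4 panel, while stepping up to a 22" widescreen buys back far more area than either.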

We're looking at HP's 22" w2207 display today, which at $360 costs quite a bit more than the entry level 22" LCDs on the market. We previously looked at one such monitor, the Acer AL2216W, which currently sells for $230, so one of the first questions we need to answer is what exactly the w2207 provides that the Acer lacks. Besides a few extra features, we also need to look at performance, but for less demanding users we feel pretty confident in stating that you'll be hard-pressed to find an extra $130 of value in the HP offering. What about those of you who aren't so easy to please - is there a case to be made for the HP w2207? Let's find out.

Gigabyte GA-X38-DQ6: An Early Look at X38

Intel has certainly been on a roll for the past year, and from all indications there's a good chance this will continue for the foreseeable future. Even though they are looking down the barrel of AMD's K10 gun, with Barcelona scheduled to launch next week and Phenom later this year, they are continuing to release products at a brisk pace and seem unfazed by what their competitors are planning. It is this focus that has brought us the Core 2 processor family and a string of new chipsets since June of 2006.

Instead of taking a breather or resting on its laurels, Intel will release several new chipsets and processor families over the course of the upcoming year as it continues to march to the beat of a company looking to annihilate everyone and everything in its way. While you can debate the merits of their product lineup, potential performance improvements, costs, or what it all really means for the consumer, you cannot deny their tenacity in trying to address just about every sector of the market they serve with a class leading product.

One of those market sectors consists of a very loyal and very vocal group: the computer enthusiasts whose lust for the latest and greatest technology often drives the market to innovate at a pace faster than it normally would. The next chipset on Intel's product road map is designed specifically for this group, but it should also find its way into the workstation market in short order.

This chipset is the Intel X38, which will replace the venerable 975X as Intel's performance oriented offering in the enthusiast market. The P35 has been an excellent chipset, serving user needs across a broad range of products, starting with boards like the budget-priced/performance-oriented abit IP35-E, up through the midrange Gigabyte GA-P35-DS3R, and beyond that to the über-expensive and tweakable DFI LANParty UT P35-T2R in the DDR2 sector. The P35 has also been shown to be an admirable performer when matched with high speed/low latency DDR3 in boards such as the Gigabyte GA-P35T-DQ6 and ASUS P5K3-Deluxe.

Intel designed the P35 with the mainstream market in mind and always planned on the X38 being the chipset that would offer the next step in performance, whether that is in a multi-GPU configuration with dual x16 capability and enhanced throughput, improved memory/chipset performance, or as a great overclocking platform for a wide range of Core 2 processors including the upcoming Penryn series.

We will go into detail about the X38, its range of capabilities, and design aspects when the chipset officially launches on September 24th, but for now we will provide a very early look at its performance with the Gigabyte GA-X38-DQ6. In fact, this preview is based on an engineering sample board and BIOS, but we felt the progress of the X38 chipset on this particular board warranted a sneak peek at its performance capability - or in this case, its potential.

Our testing today is centered on a select group of synthetic and application benchmarks that will provide a general indication of how the X38 currently performs with beta hardware. We will follow up shortly with results from a retail board/BIOS combination that will focus on CrossFire (sorry, as of now there's no official SLI support), overclocking, power consumption, and additional application benchmarks.

MTRON 32GB SSD: Better in a Notebook?

We recently provided a brief overview of the MTRON 32GB SSD provided by DV Nation and found its performance on the desktop to be very robust in most tests. In fact, it competed very well against the Western Digital Raptor 150GB drive in the application benchmarks and just annihilated it in the FutureMark PCMark05 benchmarks. Besides the MTRON's excellent performance and excessive cost, we also discovered an issue with the latest Intel desktop chipsets that feature the ICH9 or ICH8 Southbridges.

Our first indication of a problem came during our theoretical throughput tests featuring HD Tach, which showed the NVIDIA 680i SLI MCP generating a sustained transfer rate of 95.1 MB/sec, write speeds of 74.7 MB/sec, and a burst rate of 100.4 MB/sec. The same MTRON drive on the Intel P35/ICH9R scored a sustained transfer rate of 79.4 MB/sec, write speeds of 67.2 MB/sec, and a burst rate of 82.7 MB/sec. Utilizing the NVIDIA 680i MCP showed a 17% improvement in sustained transfer rates, an 11% improvement in write speeds, and a 21% increase in burst rates. PCMark05 showed improvements up to 88%, while our current application benchmarks show anywhere from a 1% to 20% gain over the Intel ICH9R. We still do not have an answer as to why the latest Intel Southbridges cap sustained transfer rates at around 80 MB/sec with SSD drives, but we should have one soon.

We received numerous requests after our original MTRON article (we are still responding, for those awaiting answers) to show additional test results on a notebook platform. We were already in the process of testing this drive with our new Vista based testbed and application test suite as part of a 2.5" drive roundup, so we will provide a few initial results today. Of course, nothing is ever as easy as it seems and what can go wrong will go wrong. During preliminary testing we discovered the same throughput issues with the Intel PM965/ICH8-M combination used in the latest Crestline based notebooks. After several reloads, new driver combinations, and praying to the Intel gods, we still have the same problem and possibly more. Our current NVIDIA and ATI chipset based notebooks do not have this same cap, and it turns out an older 945PM/ICH7 unit we had is fine.

Not only were we having the 80 MB/sec cap issue with the MTRON unit, but our SanDisk 32GB SSD seemed to be capped at 26 MB/sec compared to its 60.7 MB/sec capability on the NVIDIA GeForce Go 6150 platform. Our Samsung Hybrid drive decided to chime in and give us some of the most inconsistent test results we have ever experienced, but that was cleared up with a new BIOS release - or so we hope as the benchmarks roll on. Also, our SanDisk 32GB SSD is reporting random access times around 14ms on both platforms compared to the 0.1ms results on our other SSD drives. We are still investigating these problems, but just in case, we have a new PM965/ICH8-M platform and SanDisk 32GB drive arriving on Monday for additional analysis.

Our quick take today is based on a limited test suite using Vista Home Premium and an NVIDIA/AMD based notebook platform. We will follow up in our 2.5" drive roundup with full test suite results on both the Intel and AMD CPU based platforms. In the meantime, let's take a quick look at this MTRON drive and see how it compares to our review units from Samsung and Seagate in the notebook sector.

Digital Photography from 20,000 Feet

It is remarkable how fast photography has shifted from film to digital imaging. If you doubt the shift is all but complete, check the impact on Kodak. The shutdown of US film operations has been accelerated several times, many thousands of employees have been cut, and Kodak stock has taken a beating as the company struggled to find secure footing in a new digital imaging world. All of this was happening while Kodak invested millions in developing digital imaging solutions in a market that was shifting like quicksand.

Digital, of course, is the domain of the computer, and the transition of artistic photographers to digital has been anything but smooth. The artistic types distrust turning their vision into cheap Adobe Photoshop tricks, and the tech-savvy are so enamored of technology and editing that they often don't have a clue about what makes a good photograph or what lens to use in a given situation. As AnandTech prepares to re-launch Digital Photography reviews, it is important that our readers understand at least the basics of digital photography. That is the purpose of this guide.

There are plenty of digital camera review sites out on the web, so you may ask why AnandTech is re-launching a Digital Photography section. If you are a photographer or serious photo hobbyist, you have many excellent review sites already available; they do a great job of providing the kind of information the serious photo hobbyist is looking for. However, our readers who visit those sites are often overwhelmed by the sheer amount of information and the background required to make that information accessible. For a computer enthusiast who wants to learn about digital cameras to make a buying decision, many current sites are a difficult place to find answers. Some sites assume the reader knows a lot more about photography than our average reader does, which often leads to much of the review being gibberish to a non-photographer. Other sites dwell on tests of things like "start-up times" that were important in early digital cameras but have become all but meaningless in today's digital SLR market unless you are a professional sports photographer. Still other sites, which are very well-grounded in traditional photography, show an obvious lack of knowledge about the computers and computer tools that make digital photography so flexible today.

Some of our readers may not like AT delving into digital camera reviews, and to them we say you just can't ignore digital photography any more. Today's digital imaging is nothing more than an optic stuck on a computer, and unfortunately there is very little left of the mechanical gems that once ruled the world of photography. It is our sincere belief that we can do digital camera reviews with a unique perspective for our readers and computer enthusiasts everywhere, but please help us as we try to reinvent this wheel.

There are some things about photography that have not changed in the move to digital, however. In the end, taking a digital photo still depends on the same set of "rules" as taking a film image; the only real difference between digital and film is what happens after the image is captured. This is particularly obvious when looking at digital SLR cameras, which are currently the fastest growing segment of the digital photography market. You will find all the traditional photography names here - Nikon, Canon, Pentax, Olympus, Minolta - and this is where the "real" photographers work. Names like Casio, HP, Sony, Fuji, Samsung and Kodak don't exist in SLR space, except as the odd offering based on the lenses of one of the "real" photography companies.

The reasons for this are really quite clear. Digital and computer imaging have concentrated on the sensor and ever increasing megapixel counts, while the people who take photographs for a living have continued to concentrate on the quality of the lenses they work with and the images that they sell. In both film and digital, all other things being equal, the best quality lens wins. Of course, the best quality lenses and the widest variety of lenses come from the traditional photo companies like Canon, Nikon, Minolta, Pentax and Olympus. These companies have taken years to develop their extensive lens lines, and these lenses are the ones in the hands of photographers. Today, it takes a lot of money and effort to develop a new lens line. As a result you have amalgams like Samsung using the Pentax lens line on their SLR, a Fuji Pro camera using Nikon lenses, and past Kodak Pro Digitals designed for both Canon and Nikon lens mounts - two models for each Pro camera.
Recently Sony introduced their first SLR, and one of our first digital camera reviews at AT will be the new Sony Alpha (A100). So did Sony break the rules? Sony is one of the world's largest manufacturers of digital sensors - the chip that captures an image in digital format. In fact, you will see Sony sensors in almost every brand of "serious" camera except Canon and Olympus: Sony makes sensors for Nikon, Pentax and Minolta. Canon is another huge sensor manufacturer and makes its own sensors for its cameras, while Kodak and Panasonic both make the Four Thirds sensors used by Olympus in their various models.

Sony has some very feature-rich and capable fixed lens cameras in their lineup, and their own form factor for memory, but Sony has coveted a big piece of the "serious" photography or SLR market. Sony apparently did not want to brand themselves a second tier player in the SLR market by offering an SLR for other brands' lenses. Instead they entered into a joint development agreement with Konica-Minolta last year. Then, early this year, Sony bought the Konica-Minolta camera business and announced they would continue development of the 20-year old Minolta auto-focus lens system to work with their own new digital SLR cameras.

The Sony Alpha A100 is the first camera that marries Sony technology with the Minolta system. It is a new digital SLR brand with a new Sony 10.2 megapixel sensor and an existing lens base of some 20 million Minolta auto-focus lenses. By purchasing the Konica Minolta camera business and assuming warranty responsibilities, Sony instantly became a major player with a full lens line. When you consider that only Sony and Canon make their own sensors for their digital SLR cameras, you can clearly see what Sony can leverage in the DSLR market, and why they were willing to buy an existing lens line. Sony didn't break the rules; they just bought instant credibility in a market that is difficult to crack.

If you want to learn about digital photography, you should find this guide a good place to start. If you are in the market for a new digital SLR, this is a good place to gain the background to intelligently compare these cameras. The digital SLR market is hot, and we will be covering the six new 10 megapixel cameras that sell for less than $1000 in detail in the coming months: the Sony A100, Nikon D80, Canon Rebel XTi, Olympus E-400 (Europe/Asia only), Pentax K10D, and Samsung GX-10.

Sony DSC-M1: Good Video, Disappointing Pictures

The DSC-M1 is a dual-function digicam from Sony that promises extended video functionality through the use of MPEG-4 compression in addition to a 5-megapixel still image mode. The camera has a unique vertical design with a flip-out LCD that can be rotated 270°. The M1 offers several still recording modes, including Auto, Program, and 9 different preset scene modes, along with advanced exposure controls such as exposure compensation, white balance adjustment, and ISO control. The video mode uses MPEG-4 compression to record stereo audio and video at a resolution of 640x480 at 30 fps.
In our review of the Sony DSC-M1, we discovered that although the camera has quite a bit going for it, it ultimately falls short because of the quality of its still images. For example, the camera does very well in terms of startup time, cycle times without the flash, and auto-focus/shutter lag times. The video mode is exceptional, offering the use of the optical zoom during recording as well as a few image adjustment options. The true downfall of the M1 is its mediocre still image quality: in our sample images, virtually every image suffers from soft/fuzzy details. We were also surprised to find JPEG artifacts in a number of images from the M1, and in general the images were much noisier than they should have been (even at ISO 100). For all the details, read on for our full review of the Sony DSC-M1.