
Saturday, September 12, 2009

ATI and NVIDIA: Quick Look at HDTV Overscan Compensation

It has been a while since ATI released their HDTV dongles, which provided HDTV output support for most of their Radeon line. In fact, we probably first experimented with their HDTV dongle back in July or August of 2002. Back then, HDTV output support was plagued by the overscan issue.

For those of you unfamiliar with "overscan", it is simply the part of the picture that is cropped off. Depending on whom you ask, it is also described as the space that bleeds or "scans" beyond the edges of the visible area of the screen. Typical televisions can lose up to 20% of the image to cropping; this lost portion is what is commonly known as overscan. Technically speaking, the "lost" picture information is not actually gone, it is just outside the visible area of your TV screen. A similar situation on the computer side is viewing a picture at 100% scaling on a monitor set to a lower resolution than the picture, i.e. a 1600 x 1200 picture on a 1280 x 1024 desktop. The difference is that on a computer, you can move the picture around to see the portions cut off by the visible area of the monitor.

We should clarify that overscan is not necessarily a bad thing. It is implemented deliberately on TV sets because of the different video input formats (composite, s-video, and so on) that the TV needs to support. If overscan were not implemented to account for these different formats, there would likely be underscanning of varying degrees on different TV sets, due to the different protocols and inherently different signals that the TV has to handle. (Underscanning is the opposite of overscanning: the image is smaller than the area on which it is being displayed.) It would be tantamount to zooming out on a picture, though in the case of TV sets, the space that doesn't reach the edges of the display would be black. The deliberate use of overscanning allows the screen to be completely filled, as opposed to underscanning, which could leave margins of varying size around the image.

The reason we notice overscan more on a computer is that we already know what the signal is supposed to look like: where the start menu button should be, where the clock should be, where the corners of the screen should be in relation to our desktop shortcuts. A DVD played on a DTV or even a regular TV usually encounters some measure of overscan, though we hardly notice it because we aren't used to its native signal. One way DVD player or TV manufacturers compensate for this is to provide a zoom-out function, which essentially tells the system to underscan. This is why we go crazy when we notice overscan from an Xbox rather than from a DVD signal; we know what the game menu is supposed to look like.

In theory, if an HDTV were designed only for HDTV signals, there would be no overscan from component computer video output. The main issue is that DTVs are built to handle more than just DTV signals. They accept many legacy sources: camcorders, s-video, composite, etc. All of this means that there must be cross-platform support for all formats, and the only way for that to occur is to either overscan or underscan. Underscanning would be more frustrating to the consumer, since the signal would be smaller than the displayed area, with black bars surrounding the image. Overscan ensures that the video signal always fills the screen, though this gets increasingly frustrating when you hook up an Xbox or output video from your computer and the signal is overscanned.

As Keith Rochford (Chief Engineer of eVGA) explained, when you switch to a DTV you are now talking about a high resolution display, and backwards engineering a pixel technology to a line-scan technology isn't a simple task. This backwards conversion is what leads to the large 10% to 15% overscan margins we have become accustomed to when outputting from a computer to a DTV. For those who own something like a plasma display that accepts direct computer video output via VGA or DVI, this obviously isn't an issue, since no backwards conversion is needed; it is essentially a really big computer monitor that keeps the video card's native output.
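To put those overscan margins in concrete numbers, here's a quick back-of-the-envelope sketch (the 1920 x 1080 source and per-edge crop percentages are illustrative assumptions, not measurements from any particular TV or card):

```python
# Rough illustration of how much of a source frame survives an overscan crop.
# The resolution and crop percentages are illustrative assumptions only.

def visible_resolution(width, height, crop_pct):
    """Resolution left visible after cropping crop_pct off each axis."""
    scale = 1.0 - crop_pct / 100.0
    return int(width * scale), int(height * scale)

for pct in (10, 15, 20):
    w, h = visible_resolution(1920, 1080, pct)
    lost = 100 * (1 - (w * h) / (1920 * 1080))
    print(f"{pct}% crop per axis: {w} x {h} visible (~{lost:.0f}% of the pixels gone)")
```

Run against a 1080p frame, a 10% crop per axis already hides nearly a fifth of the pixels, which is exactly why a taskbar or game menu pushed to the screen edges is the first thing to disappear.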

In the most practical sense, overscan is something you don't want, or at least want to minimize. Using your HDTV set as a substitute for your monitor can be awesome, but having part of the picture cropped is a major deterrent, especially when you want to play games, surf the web, or watch videos on that nice big screen.

There are more than just ATI and NVIDIA cards on the market, but most of us are still going to be stuck with one or the other, in which case you are most likely going to get some degree of overscan. Keep in mind that we can't track down every (or even most) DTV sets and check their degree of overscan, and even if we could, overscan varies between TV sets because of the manufacturer's design, which has no bearing on the video card. For these practical reasons, we are going to focus primarily on how ATI and NVIDIA approach HDTV overscan compensation.

Apple's 2009 MacBook Pro: Battery Life to Die For

I was so focused on the iPhone 3GS and Snow Leopard announcements from this year’s WWDC that I almost missed the gravity of the MacBook Pro announcements.

Apple announced price drops on nearly all of its laptops. The new lineup looks like this:

|              | MacBook              | MacBook Pro 13-inch  | MacBook Pro 15-inch  | MacBook Pro 17-inch          |
| CPU          | Core 2 Duo 2.13GHz   | Core 2 Duo 2.26GHz   | Core 2 Duo 2.53GHz   | Core 2 Duo 2.8GHz            |
| GPU          | NVIDIA GeForce 9400M | NVIDIA GeForce 9400M | NVIDIA GeForce 9400M | NVIDIA GeForce 9400M + 9600M |
| Memory       | 2GB DDR2             | 2GB DDR3             | 4GB DDR3             | 4GB DDR3                     |
| HDD          | 160GB                | 160GB                | 250GB                | 500GB                        |
| Battery Life | Up to 5 hours        | Up to 7 hours        | Up to 7 hours        | Up to 8 hours                |
| Price        | $999                 | $1199                | $1699                | $2499                        |

If you want an all aluminum body, you have to buy a MacBook Pro. There’s only a single MacBook model and it’s the white chassis that’s been around for a while now.

Apple added a 13” MacBook Pro to the lineup to fill in the gap, although it’s not clear to me whether this 13” MacBook Pro uses the same LCD panel as the old 13” aluminum MacBook or a derivative of the 15” MacBook Pro’s panel, which is superior.

Of course there are different models within each one of these categories that you can purchase, but they are irrelevant to the discussion we’re about to have. Look at the battery life row in the table above; Apple is claiming up to 7 hours of battery on the new MacBook Pros. The old specs used to be up to 5 hours.

Apple did some clever work on its own here. Standard lithium ion batteries are made up of cylindrical cells, similar to AA batteries. The problem with these batteries is that they waste a lot of space within a notebook (try cramming a lot of cylinders into a box, you end up with wasted space). This wasted space translates into larger batteries than are necessary, which makes for larger notebooks.

In order to continue to drive laptop thinness down, Apple started experimenting with using custom lithium polymer batteries instead of the industry standard lithium ion parts. Lithium polymer cells aren’t made of cylindrical cells (they’re rectangular), so there’s no wasted space. Not only does this make the batteries more compact, but it also gives you greater capacity since you’re using all available chassis volume for the battery.
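A rough way to see why the rectangular cells win: cylinders packed side by side into a rectangular bay can fill at most about pi/4 of its volume, while flat lithium polymer cells can use nearly all of it. Here is a quick sketch of that geometry (idealized figures, not Apple's actual cell dimensions):

```python
import math

# Idealized comparison of how much of a rectangular battery bay each cell
# shape can occupy. These are geometric estimates, not Apple's numbers.

cylindrical_fill = math.pi / 4   # cylinders in a simple grid: ~78.5% of the bay
prismatic_fill = 0.95            # flat Li-poly cells: assume ~5% lost to packaging

print(f"Cylindrical cells use ~{cylindrical_fill:.0%} of the bay")
print(f"Prismatic cells use   ~{prismatic_fill:.0%} of the bay")
print(f"Capacity gain at equal energy density: ~{prismatic_fill / cylindrical_fill - 1:.0%}")
```

Even under these simplified assumptions you pick up roughly 20% more usable volume, which is capacity Apple doesn't have to buy back with a thicker chassis.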


Makes sense. Courtesy, Apple.

Apple also found that it was wasting space in the removable enclosure for the batteries as well, so its lithium polymer offerings are no longer user removable. I suspect this part of the equation has more to do with cutting costs than saving space though.

Apple first used this lithium polymer battery technology in its MacBook Air. It gave Apple a very thin battery that allowed it to create the MacBook Air’s sweet form factor. Then came the new 17” MacBook Pro, without a removable battery. Apple claimed that this battery would last for five years before it needed replacing and resulted in up to an 8 hour battery life.

The extended life is supposedly due to an on-battery sensor, communicating with the system's management controller, that can dynamically sense the needs of each lithium polymer cell and feed that information back to the charging circuitry. The result is slight variations in charging current designed to optimally charge each and every cell, apparently reducing wasted charge cycles significantly. Apple claims that most cells will fall to 80% of their original capacity after 200 - 300 charge cycles, while its special lithium polymer batteries won't hit the 80% mark until as many as 1000 charge cycles. Apple credits its unique battery chemistry and microprocessor-managed charging (Adaptive Charging) for these gains, but it's a difficult claim to prove; we'll have to wait and see what happens after a few years of use.
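Taking those cycle counts at face value, the difference in usable lifespan is substantial. A back-of-the-envelope estimate (the charging frequency below is an assumed usage pattern, not an Apple figure):

```python
# Back-of-the-envelope lifespan estimate from the cycle counts quoted above.
# Four full charge cycles per week is an assumed usage pattern, not a spec.

CYCLES_PER_WEEK = 4

for label, cycles_to_80_percent in (("typical cells", 250), ("Apple's claim", 1000)):
    years = cycles_to_80_percent / (CYCLES_PER_WEEK * 52)
    print(f"{label}: ~{years:.1f} years before capacity falls to 80%")
```

With that usage pattern a conventional pack dips to 80% in a bit over a year, while Apple's 1000-cycle claim works out to just under five years, in line with the five-year figure quoted for the 17" MacBook Pro's battery.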

Apple TV - Part 1: Unboxed and Dissected

Although the less exciting of Apple's major announcements this year, Apple TV is finally upon us. As the world waits for the iPhone, it's time to look at Apple's latest entry into the convergence market. While we work on our review we thought you all might like to see the innards of the Apple TV.

The Apple TV box is stylish in Apple's usual fashion:





The inner box is like a giant iPod box, you open it like a book:



On the left you've got the Apple TV unit, on the right its remote. Beneath the Apple TV you'll find the only cable supplied with the unit and documentation:



The power cable that ships with the Apple TV is quite possibly the most unkempt cable we've seen ship in a recent Apple product. The cable itself is fine, but it's not wrapped in some ridiculously elegant way - given that the Apple TV is about a month late to market could it be that things were a little rushed on the assembly line?


This is exactly how the box arrived; you also get two Apple stickers with your purchase, each one good for 10HP


Intel's SoC Update: 1B Transistors Embedded in 5 Years

It’s announcement day! Well, not really. Intel made a few disclosures about its efforts in the embedded space, but nothing tremendous. It all starts with a little chip called Atom.


Oh, hai. You're bigger than I expected. You know, being called Atom and all.

Intel’s Atom processor is quite possibly Intel’s most important microarchitecture, yet it is hardly discussed, mostly because current implementations are hardly interesting. Today the Atom processor is little more than a very low power x86 chip that performs a lot like a 1.2GHz Pentium M; it’s not exactly setting any speed records.

We’ve already detailed the Atom architecture in depth here; to recap, it’s a 2-issue, in-order architecture with SMT support. The goal for Atom has always been to deliver the performance of a 4-year-old Pentium M in a 1W power envelope. As we found in our initial performance investigation of Atom, Intel was only partially able to meet this goal. At 1.6GHz, the Atom performs somewhere between an 800MHz Pentium M and a 1.6GHz Pentium M - we roughly approximated that to be the performance of a 1.2GHz Pentium M.


The problem is that at 1.6GHz, we’re at a 2W TDP. In order to drop below 1W we have to look at the 800MHz Atom Z500, which not only runs at half the clock speed but also lacks SMT support, which normally delivers a healthy performance increase on the in-order architecture.

So today Atom is hardly interesting beyond some niche markets. ASUS has embraced the chip for its Eee PC and Eee Box machines, but it’s either too slow for a real desktop, or too power hungry (and too large) for things like smartphones. You have to start somewhere though, and that’s what Atom is - a starting point. The next step is where things get more interesting...

Intel's 32nm Update: The Follow-on to Core i7 and More

Seven billion dollars.

That’s the amount that Intel is going to spend in the US alone on bringing up its 32nm manufacturing process in 2009 and 2010.

These are the fabs Intel is converting to 32nm:

In Oregon, Intel has the D1D fab, which is already producing 32nm parts, and D1C, which is scheduled to start 32nm production at the end of this year. Then there are two fabs in Arizona: Fab 32 and Fab 11X. Both of them come online in 2010.

By the end of next year the total investment just to enable 32nm production in the US will be approximately eight billion dollars. In a time where all we hear about are bailouts, cutbacks and recession, this is welcome news.

If anything, Intel should have a renewed focus on competition given that its chief competitor finally woke up. That focus is there. The show must go on. 32nm will happen this year. Let’s talk about how.


Gigabyte GA-EP45-UD3P - P45 at its Finest

It seems like ages ago that Intel released the P45 Express chipset. In fact it was just last June, but that is normally an eternity in the personal computer market. After our first look at the chipset, we were not convinced that it could be successful. The P35 Express chipset was mature, less expensive, and a very popular choice for first-time buyers and enthusiasts alike. Anyone needing high-end performance for a CrossFire setup had numerous options to choose from with the X38/X48 based motherboards. What seemed like the final nail in the coffin was that board pricing was closer to X38 territory than P35, and initial performance numbers just did not wow anyone.

Beyond that, it seemed like the upcoming Nehalem platform was getting more press than the P45. Most of us were wondering out loud why anyone would invest in a brand new chipset based on a previous generation processor when the mother of all platforms was getting ready to launch. Not to mention that, aside from CrossFire capabilities being upgraded from the P35's x16/x4 to a performance-friendly x8/x8 setup, what did the P45 really offer?

As it turns out, this chipset had a lot to offer. Of course, the stars seemed to align perfectly for its march to success. After a few rough patches with early BIOS releases, this chipset became the favorite upgrade choice for the enthusiast due to its incredible front-side bus and memory overclocking capabilities. AMD released two of the best value/performance video cards in recent memory, the HD 4870 and HD 4850, and all of a sudden you could run CrossFire on a mainstream board without spending a fortune. Intel pushed this chipset heavily and the motherboard manufacturers started pumping out various models from the low-end $80 market up to the high-end $250 sector. The P45 was everywhere and available at almost any price point - we last counted about 100 different models available from just about every manufacturer in the business.



It’s hard not to get lost in the sea of available models when searching the web sites of ASUS, MSI, Gigabyte, and others. Thanks to aggressive price cuts on the Core 2 series of processors, and with the Core i7 platform relegated to the high-end market until the end of this year, the opportunity for the P45’s star to shine brightly continues for the immediate future. Based on recent information from Intel, we can expect to see the P45 around until 2011.

One of the industry's leading supporters of the P45 chipset is Gigabyte. At last count, Gigabyte had fifteen P45 motherboards in their lineup, and they have already released six new second-generation P45 products based on their Ultra Durable 3 technology. We will be taking an in-depth look at the Ultra Durable 3 technology in a separate article shortly. In the meantime, today we are reviewing one of the top models in the Ultra Durable 3 lineup, the GA-EP45-UD3P.

This particular board offers CrossFire support in dual x8 mode, native support for DDR2-1366 memory speeds, a revised cooling system, dual PCI-E Gigabit LAN controllers with teaming, and Dolby Home Theater support via the Realtek ALC889a codec. Add in an integrated TPM data encryption chip, eight SATA ports, the Dynamic Energy Saver power management system, IEEE 1394a support, and several other features, and this board throws in just about everything but the kitchen sink. Speaking of buying, current retail pricing is around $135 with a $20 rebate available, meaning there is a lot of value packed into this blue wonder of a board.

Did the Gigabyte GA-EP45-UD3P impress us? Let’s find out.

Computex 2008 Day One

We spent the majority of today going from private suite to private suite at the Grand Hyatt hotel in Taipei, where several companies are showcasing their latest products. These suites offer a welcome change from the hustle and bustle of the main halls, providing a more casual and intimate atmosphere to discuss current product trends and technology. That said, let's run through a few of the many products we touched today.

ASRock

One of the more interesting motherboard companies we visit at Computex each year is ASRock. Their product line typically consists of previous generation chipsets targeting the budget sector. That focus has changed, and ASRock will now be offering the latest chipset technology at product launch. What has not changed, however, is their emphasis on providing the best possible performance-to-price ratios. Their primary lineup for Computex features several variations of boards based on the Intel P45, G43, and P43 chipsets. They also had a working version of the new AMD 790GX chipset with the SB750 Southbridge.


The G43 Twins-FullHD features the upcoming G43 (GMA X4500) chipset that is basically a G45 without hardware acceleration for VC-1 or H.264 formats. Also included is the ICH10 Southbridge, DDR3/DDR2 combo memory setup, Realtek ALC888 HD codec, and a DVI-Display Port card. This board should be available next month.


The ASRock P43R1600 Twins-110dB is based on the P43 chipset and features a DDR3/DDR2 memory combination setup. Also included is the ICH10R Southbridge, Realtek ALC890 HD codec, eSATA2, and PCIe Gigabit LAN.


The P45 TurboTwins2000 board features the P45/ICH10 combination along with four DDR3 and two DDR2 memory slots. AMD CrossFire is available with a 2x8 electrical configuration.


Overclocking Extravaganza: GTX 275's Complex Characteristics

After our in-depth look at overclocking AMD's Radeon HD 4890, many of our readers wanted to see the same thing done with NVIDIA's GTX 275. We had planned on looking at both parts from the beginning, but we knew each review would take a bit of time and effort to design and put together. Our goal has been to design tests that best show the particular overclocking characteristics of the different hardware, and shoehorning all of that into one review would be difficult. Different approaches are needed to evaluate overclocking with AMD and NVIDIA hardware.

For our AMD tests, we only needed to worry about memory and core clock speed. This gave us some freedom to look at clock scaling in order to better understand the hardware. On the other hand, NVIDIA divides their GPU up a bit more and has another, higher speed, clock domain for shader hardware. Throwing another variable in there has a multiplicative impact on our testing, and we had a hard time deciding what tests really mattered. If we had simply used the same approach we did with the 4890 article, we would have ended up with way too much data to easily present or meaningfully analyze.

We've kept a few key test points, as we will look at each clock at the highest speed we could achieve on its own (all other clocks set at stock speeds). We will also look at performance with all clocks set to the maximum we could hit. Beyond this, rather than looking at how performance scales over clock speed with memory and shader at their maximum and looking at how performance scales over shader speed with memory and core at their maximum, we decided it would be cleaner to look at just one more configuration. For this test, we chose core and shader speed at maximum with memory at stock.

As with our previous look at overclocking, we present our analysis based on percentage increases in performance but provide the raw data as well. The raw data is all pretty straightforward, and we include our highly overclocked 4890 as well as the 900MHz core clocked 4890 that can be picked up pre-overclocked from the manufacturer. For the bulk of the article we will just be considering the impact of overclocking on the GTX 275, but our conclusion will compare AMD and NVIDIA on overclocking in this segment.

The clock speeds we were able to pull out of our GTX 275 were not too shabby as far as overclocks go. Our core clock speed could have been better, but otherwise we did pretty well. Here is what we will be looking at today:

Core: 702MHz (vs. 633MHz stock)
Memory: 1296MHz (vs. 1134MHz stock)
Shader: 1656MHz (vs. 1404MHz stock)

These are 10.9, 14.3, and 17.9 percent increases respectively. First up, we'll look at the impact of overclocking the memory, then we'll move on to core and shader. After that it's on to fully overclocked and our core/shader combined overclock.
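For reference, those percentages fall straight out of the stock and overclocked frequencies listed above:

```python
# Percent increases for the GTX 275 overclocks listed above.
clocks = {
    "Core":   (633, 702),
    "Memory": (1134, 1296),
    "Shader": (1404, 1656),
}

for name, (stock, oc) in clocks.items():
    gain = (oc / stock - 1) * 100
    print(f"{name:>6}: {stock}MHz -> {oc}MHz (+{gain:.1f}%)")
```

Keep those relative gains in mind when looking at the scaling results: the shader domain had the most headroom on our sample, the core the least.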


Meedio Essentials 1.15.22.0, Part 2 - A Sleek and Clean HTPC Interface

We are wrapping things up in this second half of our Meedio Essentials review. For those of you who haven't been following along, read our Part 1 coverage of Meedio Essentials (aka ME). As we continue, keep in mind that Meedio Essentials is only one half of the "Meedio Suite", since Meedio TV is unavailable (at least at the time this article was written). Now, let's get on to the plug-ins and the rest...

Albatron Widio - Wireless Audio System, A Quick Look

Back when we did the CEO Forum, we already knew that the PC market was somewhat saturated, with too many motherboard makers, so we questioned whether companies were already foreseeing this. The only choice these companies had to make was whether or not they were going to change, and it was an important one at that; changing and diversifying could be a matter of survival.

We have seen some motherboard manufacturers pursue networking products, some take on consumer electronics, and others try their hand at displays. Still others have simply absorbed the near-fatal hit of market saturation. Today, we thought we would take a quick look at Widio, Albatron's latest attempt at diversification. And by all accounts, they are trying to generate a lot of hype. Let's dive in and find out if the Widio is all that they are making it out to be.

Microsoft's Windows Media Player 10 - Providing Some Pointers

Windows Media Player (WMP) is arguably the most prevalent media player on the market. Whether you like or dislike Microsoft "the company", and by extension their product, isn't relevant for the sake of this article - that debate is best saved for a business and/or OS discussion.

As most of you undoubtedly know, Microsoft has recently released Windows Media Player 10. And it is a pretty fair guess that you probably have it installed on the computer that you are currently using to read this article.

It really is pointless for us to simply catalog what has changed, both feature-wise and, to a lesser extent, in the design, because you are likely already putting it to use. However, in the past few years, Microsoft has developed a good grasp of aesthetics, usability, and features, and a knack for implementing a combination of all three in an end product. Our case in point would be MCE and its iterations (the 2004 edition and beyond). This is why we often make references to WMP in our personal video recorder (PVR) software reviews, as there are key pointers that other companies can take from Microsoft.

Speech Recognition - Ready for Prime Time?

The machine had been delivered two days ago on her first adult birthday. She had said, "But father, everybody - just everybody in the class who has the slightest pretensions to being anybody has one. Nobody but some old drips would use hand machines - "

The salesman had said, "There is no other model as compact on the one hand and as adaptable on the other. It will spell and punctuate correctly according to the sense of the sentence. Naturally, it is a great aid to education since it encourages the user to employ careful enunciation and breathing in order to make sure of the correct spelling, to say nothing of demanding a proper and elegant delivery for correct punctuation."

Even then her father had tried to get one geared for type-print as if she were some dried-up, old-maid teacher. But when it was delivered, it was the model she wanted - obtained perhaps with a little more wail and sniffle than quite went with the adulthood of fourteen - and copy was turned out in a charming and entirely feminine handwriting, with the most beautifully graceful capitals anyone ever saw. Even the phrase, "Oh, golly." somehow breathed glamour when the Transcriber was done with it.

--Isaac Asimov, Second Foundation - 1953



Here at AnandTech, we do our best to cover the topics that will interest our readers. Naturally, some topics are of interest to the vast majority of readers, while others target a more limited audience. At first glance, this article falls squarely into the latter category. However, when we think about where computers started and where they are now, and then try to extrapolate that and determine where they are heading in the future, certainly the User Interface has to play a substantial part in making computers easier to use for a larger portion of the population. Manual typewriters gave way to keyboards; text interfaces have been replaced by GUIs (mostly); and we have mice, trackballs, touchpads, and WYSIWYG interfaces now. Unfortunately, we have yet to realize the vision of Isaac Asimov and other science fiction writers where computers can fully understand human speech.

Why does any of this really matter? I mean, we're all basically familiar with using keyboards and mice, and they seem to get the job done quite well. Certainly, it's difficult to imagine speech recognition becoming the preferred way of playing games. (Well, some types of games at least.) There are also people in the world that can type at 140 wpm or faster -- wouldn't they just be slowed down by trying to dictate to the computer instead of typing?

There are plenty of seemingly valid concerns, and change can be a difficult process. However, think back for a moment to the first time you saw Microsoft's new wheel mouse. I don't know how other people reacted, but the first time I saw one I thought it was the stupidest gimmick I had ever seen. I already had a three button mouse, and while the right mouse button was generally useful, the middle mouse button served little purpose. How could turning the middle mouse button into a wheel possibly make anything better? Fast forward to today, and it irritates me to no end if I have to use a mouse that doesn't have a wheel. In fact, when I finally tried out the wheel mouse, it only took about two hours of use before I was hooked. I've heard the same thing from many other people. In other words, just because something is different or you haven't tried it before, don't assume that it's worthless.

There are a couple of areas in which speech recognition can be extremely useful. For one, there are handicapped people who don't have proper control over their arms and hands and yet can speak easily. Given how pervasive computers have become in everyday life, flat out denying access to certain people would be unconscionable. Many businesses are finding speech recognition to be useful as well -- or more appropriately, voice recognition. (The difference between speech recognition and voice recognition is that voice recognition generally only has to deal with a limited vocabulary.) As an example, warehousing job functions only require a relatively small vocabulary of around 400 words, and allowing a computer system to interface with the user via earphones and a microphone frees up the hands to do other things. The end result is increased productivity and reduced errors, which in turn yields better profitability.
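To illustrate the difference in scale, a warehouse voice system only has to decide which of a few hundred known words was spoken, a far more constrained problem than open-ended dictation. Here is a toy sketch of that idea (purely illustrative; a real recognizer matches acoustic features, not text strings, and the command list here is made up):

```python
import difflib

# Toy illustration of limited-vocabulary voice recognition: every utterance is
# forced onto a small, known command set. Purely illustrative, not a real system.

VOCABULARY = ["pick", "put", "skip", "repeat", "quantity", "aisle", "bin", "confirm"]

def recognize(utterance):
    """Return the closest vocabulary word, or None if nothing matches well."""
    matches = difflib.get_close_matches(utterance.lower(), VOCABULARY, n=1, cutoff=0.6)
    return matches[0] if matches else None

print(recognize("Pick"))     # 'pick'
print(recognize("confrm"))   # 'confirm' -- a garbled input still snaps to a known command
print(recognize("hello"))    # None -- outside the limited vocabulary
```

With only a few hundred candidate words, even sloppy input tends to snap to the right command, which is a big part of why limited-vocabulary voice recognition is already practical in settings like warehousing.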

Winter Audio Reference: On-Board, Consumer, and Pro Solutions

Introduction

It's been quite some time since AnandTech has tackled an audio review. With Intel feeding higher bandwidth to onboard solutions and ever more data available to add-in cards through PCI Express, we could start to see some changes in the way that the industry approaches audio. We already have DVD-Audio and SACDs on current storage formats. With HD-DVD or Blu-ray coming down the pipe shortly, we'll have larger storage devices to feed the bandwidth-hungry PCs of today. That means even better quality media.

Our drive in life is to stay ahead of the curve and help as many people as possible understand and ride the wave of upcoming technology. When AnandTech got started, the AMD/Intel war was just getting going and 3D hardware was just beginning to take off. Before the advent of hardware 3D graphics acceleration, the video card was basically a rasterizer that drew a 2D image to the screen over an analog output. When it came to image quality, everything rested on the DAC, which took the screen image in RAM and converted it from a digital grid of color values to an analog signal that the monitor could understand. Back in the day, Matrox started getting fancy and accelerated 2D Windows drawing calls so that the CPU didn't have to draw everything itself. Slowly, more and more drawing was handled by the graphics card, until we ended up moving complex 3D functions onto the graphics card and removing that overhead from the rest of the system.

Over the years, a much slower trend has been happening on sound cards that parallels the graphics card industry. We have 3D positional audio and hardware DSP effects that manipulate audio in order to make it sound like it's contained in an altogether different environment.

A few key factors have kept the audio industry from advancing as fast and furiously as the graphics industry. First, our ears are easier to fool than our eyes. In general, people just don't care as much about hearing exactly where things are if they can see them. But there are mold breakers: games like Doom 3, Thief 3, and The Chronicles of Riddick: Escape from Butcher Bay are aurally quite beautiful, and their sound not only adds to the experience but is essential to gameplay as well.

There hasn't been enough emphasis placed on 3D positional audio beyond 2-speaker setups yet. In our opinion, applying HRTFs (head-related transfer functions) to 2-speaker setups is on its way out. Solutions like Doom 3's 5.1-channel surround implementation are doable and sound more natural. As the average end user for any given game begins to have a 5.1 surround system rather than a 2 or 2.1 system, we will start to see more and more developers use better sounding techniques.

The minimum quality bar for PC speakers is also way too low. The speaker is the weakest link in the audio chain, and there's no point buying an expensive sound card if you're going to connect a cheap set of speakers to it. As people start to understand audio more, they will start to embrace it. The more realistic visuals become in games, the more obvious problems with audio will become. If for no other reason than that, we will see audio quality improve on the PC.

Today, we are going to take a look at a cross section of the audio industry. The lineup includes two cards from Creative (the Audigy 2 ZS Platinum Pro and Audigy 4 Pro), the Realtek Intel HD Audio solution, and the Echo Audio Gina3G. With these cards, we are covering our bases for the consumer add-in market, professional recording, and onboard audio solutions. Over time, as we review more audio solutions, we will compare against these cards as well.

Before we get to the cards and tests, we need to take a look at what exactly we will be doing. First, we will look at what goes into an audio solution, and then we'll take a look at RightMark Audio Analyzer. As most of our analysis will be based on RMAA, understanding what all of its tests mean is of the utmost importance.

Microsoft Windows XP Media Center Edition 2005: Feature and Performance Investigation

As impressed as we were with Windows XP Media Center Edition when it first launched, it's no surprise that the Microsoft OS has not taken the market by storm.

Distributed only to OEMs for use in custom built systems, this wasn't an OS you could go out and buy. Even though some managed to get it (through MSDN and other less legal routes), there were relatively steep hardware requirements keeping that barrier to entry nice and high. You had to have a hardware MPEG-2 encoder card, which at the time of the release of MCE was far from common (since then times have changed, mostly thanks to MCE). You had to have one of the fastest CPUs available on the market, which at the time was around a Pentium 4 3GHz. And you had to have the MCE remote control setup, which also wasn't readily available to end users.

Things have changed, however, and while it was still difficult to get hold of a copy of the OS, the rest of the items became much easier to acquire. Places like Newegg began selling the Media Center remote control, with the stipulation that you had to buy it with some sort of hardware to make it look like you were buying a PC with it. And the price of CPUs went down as the power of CPUs went up. The introduction of the Athlon 64 provided a very powerful, very capable alternative to the Pentium 4 with one very important feature: an on-die memory controller. The on-die memory controller would prove to be very helpful in making the Athlon 64 an extremely strong performer when it came to Media Center PCs.

Between MCE's maiden launch and today, Microsoft released a much-needed update to the OS: MCE 2004, which provided bug fixes and performance enhancements and introduced a few new tweaks and features. But it was clear that MCE 2004 was not an example of perfection, rather an example of the direction Microsoft was heading. There were still numerous features missing from the MCE equation; things like HDTV and multiple tuner support were left unaddressed, only to be serviced in the latest version of Microsoft's Media Center OS - MCE 2005.

Today marks the official launch of MCE 2005, and although there have already been reports on what's new in the updated OS, we've taken an in-depth look at it not only to evaluate the changes made to the OS, but also to finally investigate its performance and find out how fast a system you truly need to run this beast of an OS. There are many details within and tons of screenshots, but we strongly suggest that you read our original article on Windows XP Media Center Edition, as we will not be rehashing most of the information covered in that article.