Click

Saturday, September 8, 2007

Toshiba Satellite X205-S9359 Take Two: Displays and Drivers

Last week we took a first look at the Toshiba Satellite X205-S9359, a laptop sporting NVIDIA's fastest current mobile DirectX 10 offering. We examined gaming and general application performance and came away relatively impressed, but we weren't finished testing just yet. We're here today to wrap up our review of the X205 with a look at the LCD, additional performance benchmarks - including DirectX 10 results for a couple of games - and some commentary on other features of this laptop.
Jumping right into the thick of things, let's take a moment to discuss preinstalled software. Most OEMs do it, and perhaps there are some users out there that actually appreciate all of the installed software. Some of it can be truly useful, for instance applications that let you watch high-definition movies on the included HD DVD drive. If you don't do a lot of office work, Microsoft Works is also sufficient for the basics, although most people should probably just give in and purchase a copy of Microsoft Office. We're still a bit torn about some of the UI changes in Office 2007, but whether you choose the new version or stick with the older Office 2003, the simple fact of the matter is that most PCs need a decent office suite.

The standard X205 comes with a 60 day trial version of Office 2007 installed, however, which is unlikely to satisfy the needs of most users. If you don't need Office and will be content with Microsoft Works, there's no need to have a 60 day trial. Conversely, if you know you need Microsoft Office there's no need to have Works installed, and you would probably like the option to go straight to the full version of Office 2007. There is an option to purchase Microsoft Office Home and Student 2007 for $136, but it's not entirely clear if that comes in a separate box or if the software gets preinstalled - and if it's the latter, hopefully Microsoft Works gets removed as well.

As far as we can tell, Toshiba takes a "one-size-fits-all" approach to software, and we would really appreciate a few more options. In fact, what we would really appreciate is the ability to not have a bunch of extra software installed. Above is a snapshot of Windows Vista's Add/Remove Programs tool for the X205 we received, prior to installing or uninstalling anything. The vast majority of the extra software is what we would kindly classify as "junk". All of it is free and easily downloadable for those who are actually interested in having things like Wild Tangent games on their computer.

We prefer to run a bit leaner in terms of software, so prior to conducting most of our benchmarks we of course had to uninstall a lot of software, which took about an hour and several reboots. If this laptop is in fact intended for the "gamer on the go", which seems reasonable, we'd imagine that most gamers would also prefer to get a cleaner default system configuration. Alienware did an excellent job in this area, and even Dell does very well on their XPS line. For now, Toshiba holds the record for having the most unnecessary software preinstalled.
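For readers who want to see just how long the list of preinstalled titles is on their own machine, one quick way is to walk the standard Windows uninstall registry keys. The short Python sketch below is our own illustration (not a Toshiba or Microsoft tool); it simply prints the entries that Add/Remove Programs would show, assuming the usual Uninstall key locations.

import winreg

# Standard locations where Windows lists installed (removable) programs
UNINSTALL_KEYS = [
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
    r"SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall",  # 32-bit apps on 64-bit Windows
]

def list_installed_programs():
    names = set()
    for path in UNINSTALL_KEYS:
        try:
            root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path)
        except OSError:
            continue  # key may not exist (e.g., on 32-bit Windows)
        for i in range(winreg.QueryInfoKey(root)[0]):
            sub = winreg.OpenKey(root, winreg.EnumKey(root, i))
            try:
                name, _ = winreg.QueryValueEx(sub, "DisplayName")
                names.add(name)
            except OSError:
                pass  # entry has no display name; skip it
    return sorted(names)

if __name__ == "__main__":
    for name in list_installed_programs():
        print(name)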

Gateway FX530: Mad Cows and Quad Core Overclocking

There are a few things that we tend to take for granted in life: death, taxes, and a lack of overclocking support on PCs from major OEMs. Certainly, there are many people that don't need overclocking, and those are exactly the people that tend to purchase name brand PCs in the first place. If your typical computer use doesn't get much more complex than surfing the Internet, the difference between a massively overclocked CPU and the stock configuration is hardly going to be noticeable. What's more, overclocking tends to come with drawbacks. System stability is frequently suspect, and outside of a few boutique computer shops that factory overclock systems, you will generally void your warranty by overclocking.

Conversely, overclocking gives those willing to take a chance the ability to squeeze extra performance out of even the top performing parts. Alternately, consumers can save money by purchasing a cheaper processor and running it at speeds that would normally cost two or three times as much. Intel's latest Core 2 processors have rekindled interest in overclocking, in part simply because they overclock so well. In the past, the benchmark for highly overclockable chips has generally been set at 50% or more, with good overclocking chips achieving a 25% to 50% overclock. Core 2 Duo blows away some of these old conventions, with some chips like the E4300 managing massive 100% overclocks - and they manage this without breaking a sweat. With chips that overclock so well, it seems a shame to run them at stock speeds.

Over the years, we have seen a few factory overclocked systems, but rarely from a major OEM. The big OEMs like Dell, Gateway, HP, etc. tend to play it safe, but Gateway has broken with tradition by releasing a significantly overclocked Core 2 Extreme QX6700 system. What's more, they have done it at a price that is likely to turn a lot of heads - and yes, the factory warranty remains intact. We talked in the past about the type of people that can actually make use of a quad core system, and the people that are likely to want a quad core processor are often the people that stand to benefit the most from the additional performance overclocking provides. With Intel's QX6700 already reigning supreme as the fastest multi-core configuration on the market, why not add another 20% performance? We've seen similar configurations for sale from boutique manufacturers, often with astronomical prices. While the QX6700 certainly won't be cheap no matter how you slice it, Gateway offers their 20% overclock for a modest $100 price increase. Considering the price difference between the Q6600 and the QX6700 is $150 for a 266 MHz speed increase, doubling that speed increase for a mere $100 is a real bargain!

A super fast processor sounds great, especially if it still carries a factory warranty. However, warranties don't mean a lot if the system won't run stably. Beyond the processor, there are many other components that can affect system performance, and the type of work you plan on doing with the computer will also affect how much benefit a fast CPU gets you. We'll assume right now that anyone planning on purchasing a quad core system routinely needs a lot of CPU power, but unfortunately there are still CPU intensive tasks that can't properly utilize multiple processor cores. In order to see just how much faster this Gateway system is compared to other options, we will be comparing performance results with the test systems used in our AMD Quad FX article.
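To put those prices in perspective, here is a quick back-of-the-envelope calculation of our own, assuming the stock clocks of 2.40GHz for the Q6600 and 2.66GHz for the QX6700:

# Rough price-per-MHz comparison (assumed stock clocks: Q6600 = 2400MHz, QX6700 = 2666MHz)
q6600_mhz, qx6700_mhz = 2400, 2666
oc_percent = 0.20                      # Gateway's factory overclock on the QX6700

step_up_mhz = qx6700_mhz - q6600_mhz   # ~266MHz for the $150 step from Q6600 to QX6700
oc_mhz = qx6700_mhz * oc_percent       # ~533MHz for Gateway's $100 factory overclock

print(f"Q6600 -> QX6700: {step_up_mhz} MHz for $150 (${150 / step_up_mhz:.2f}/MHz)")
print(f"QX6700 +20% OC:  {oc_mhz:.0f} MHz for $100 (${100 / oc_mhz:.2f}/MHz)")

The overclock works out to roughly $0.19 per additional MHz versus about $0.56 per MHz for simply stepping up from the Q6600 to the stock QX6700, which is why the $100 premium looks like such a bargain.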
Before we get to the actual performance, however, let's take a closer look at the FX530.

A Messy Transition (Part 3): Vista Buys Some Time


As we saw in
part 1 of this series, large applications and games under Windows are getting incredibly close to hitting the 2GB barrier, the amount of virtual address space a traditional Win32 (32-bit) application can access. Once applications begin to hit this barrier, many of them will start acting up and/or crashing in unpredictable ways, which makes resolving the problem even harder. Developers can work around these issues, but short of building less resource-intensive games or switching to 64-bit, none of the workarounds will properly solve the problem without creating other serious issues.
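As one illustration of the kind of workaround available to developers, a 32-bit executable can be flagged as large address aware so that it receives more than 2GB of virtual address space when run on a 64-bit OS. That flag is a single bit in the PE/COFF file header; the Python sketch below is our own example (the path shown is hypothetical) and simply reads the bit directly from an executable.

import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # COFF Characteristics flag

def is_large_address_aware(exe_path):
    """Return True if the executable's PE header has the large-address-aware bit set."""
    with open(exe_path, "rb") as f:
        data = f.read(4096)                                  # headers typically fit in the first 4KB
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]      # e_lfanew field in the DOS header
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("not a PE executable")
    characteristics = struct.unpack_from("<H", data, pe_offset + 22)[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

# Example (hypothetical path):
# print(is_large_address_aware(r"C:\Games\SomeGame\game.exe"))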
Furthermore, as we saw in
part 2, games are consuming greater amounts of address space under Windows Vista than under Windows XP. This makes Vista less suitable for gaming, even though it is the version of Windows that will see the computing industry through the transition to 64-bit operating systems as the new standard. Microsoft knew about the problem, but up until now we were unable to get further details on what was going on and why. As of today that has changed.
Microsoft has published
knowledge base article 940105 on the matter, and with it has finalized a patch to reduce the high virtual address space usage of games under Vista. From this and our own developer sources, we can piece together the problem that was causing the high virtual address space issues under Vista.
As it turns out, our initial guess - that the issue was related to memory allocations being limited to the 2GB of user space for security reasons - was wrong; the issue is simpler than that. One of the features of the Windows Vista Display Driver Model (WDDM) is that video memory is no longer a limited-sharing resource that applications often take complete sovereign control of; instead the WDDM virtualizes video memory so that all applications can use what they think is video memory without needing to care about what else is using it - in effect removing much of the work of video memory management from the application. From both a developer's and a user's perspective this is great, as it makes game/application development easier and multiple 3D accelerated applications get along better, but it came with a cost.
All of that virtualization requires address space to work with; Vista uses an application's 2GB user allocation of virtual address space for this purpose, scaling the amount of address space consumed by the WDDM with the amount of video memory actually used. This feature is ahead of its time however as games and applications written to the DirectX 9 and earlier standards didn't have the WDDM to take care of their memory management, so applications did it themselves. This required the application to also allocate some virtual address space to its management tasks, which is fine under XP.
However, under Vista this results in the application and the WDDM effectively playing a game of chicken: both are consuming virtual address space out of the same 2GB pool, and neither is aware that the other is doing the exact same thing. Amusingly, given a big enough card (such as a 1GB Radeon HD 2900 XT), it's theoretically possible to consume all 2GB of virtual address space under Vista with just the WDDM and the application each trying to manage the video memory, which would leave no further virtual address space for anything else the application needs to do. In practice, both the virtual address space allocations for the WDDM and the application's video memory manager attempt to grow as needed, and ultimately crash the application as each passes 500MB+ of allocated virtual address space.
This obviously needed to be fixed, and for a multitude of reasons (such as Vista & XP application compatibility) such a fix needed to be handled by the operating system. That fix is KB940105, which is a change to how the WDDM handles its video memory management. Now the WDDM will not default to using its full memory management capabilities, and more importantly it will not be consuming virtual address space unless specifically told to by the application. This will significantly reduce the virtual address space usage of an application when video memory is the culprit, but at best it will only bring Vista down to the kind of virtual address space usage of XP.
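For those curious how close an application is getting to the 2GB wall, Windows reports a process's total and remaining virtual address space through the Win32 GlobalMemoryStatusEx call. The sketch below is our own (using Python's ctypes); note that it reports on the calling process, so it would need to run inside, or be built into, the application being examined.

import ctypes
from ctypes import wintypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_uint64),
        ("ullAvailPhys", ctypes.c_uint64),
        ("ullTotalPageFile", ctypes.c_uint64),
        ("ullAvailPageFile", ctypes.c_uint64),
        ("ullTotalVirtual", ctypes.c_uint64),        # size of the calling process's address space
        ("ullAvailVirtual", ctypes.c_uint64),        # how much of it is still unreserved
        ("ullAvailExtendedVirtual", ctypes.c_uint64),
    ]

def virtual_address_space():
    status = MEMORYSTATUSEX()
    status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    if not ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status)):
        raise ctypes.WinError()
    return status.ullTotalVirtual, status.ullAvailVirtual

if __name__ == "__main__":
    total, avail = virtual_address_space()
    # A 32-bit process without the large-address-aware flag reports roughly 2048 MB total.
    print(f"Virtual address space: {total / 2**20:.0f} MB total, {avail / 2**20:.0f} MB free")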

Laptop LCD Roundup: Road Warriors Deserve Better

Two of the areas where we've seen the most growth in the last few years are notebooks and flat-panel displays. The reasons for the tremendous growth differ, of course. Notebooks are a hot item because people are becoming enamored with wireless networks and portability, while LCDs have become popular because few manufacturers are making CRTs anymore and the small footprint of LCDs is desired by many people. We're working on increasing our coverage of both of these sectors, but up until now we haven't actually taken a close look at where they intersect.

Since the first laptops began shipping, LCDs have been the de facto display standard. Years before most people were using LCDs on their desktop, laptops were sporting these thin, sleek, attractive displays. As anyone who used one of the earlier laptops can tell you, however, the actual quality of the LCD panels was often severely lacking. With the ramp up in production of both LCD panels and notebook computers, you might be tempted to assume that the quality of laptop displays has improved dramatically over the years. That may be true to a certain degree, but with power considerations being a primary factor in the design of most notebooks, compromises continue to be made.

Without even running any objective tests, most people could pretty easily tell you that the latest and greatest desktop LCDs are far superior to any of the laptop LCDs currently available. While desktop LCDs have moved beyond TN panels to such technologies as S-IPS, S-PVA, and S-MVA, we are aware of only a few laptop brands that use something other than a TN panel. (Unfortunately, we have not yet been able to get any of those laptops for review.) We have also complained about desktop LCDs that have reached the point where they are actually becoming too bright, in an apparent attempt to win the marketing war for maximum brightness. The same can't be said of laptops, as very few can even break the 200 cd/m2 mark. Individual preferences definitely play a role, but outside of photography and print work most people prefer a brightness setting of somewhere between 200 and 300 cd/m2.

Luckily, there are plenty of new technologies being worked on that aim to improve the current situation. Not only should we get brighter laptop panels in the near future, but color accuracy may improve and power requirements may actually be reduced relative to current models. LED backlighting is one technology that holds a lot of promise, and it has only just begun to show up on desktop LCDs. Dynamic backlighting - where the brightness of some LEDs can be increased or decreased in zones depending on what content is currently being shown - is another technology that we may see sooner rather than later. Then there are completely new display technologies like OLED.

With the current laptop landscape in mind, we have decided that it's time for us to put a bigger focus on the quality of laptop LCDs. To accomplish this we have put together a roundup of the current notebooks that we have in-house. Future laptop reviews will continue this trend by including a section covering display analysis and quality, but we wanted to build a baseline of notebook display results in the meantime. While we only have four laptops at present, it is also important to remember that there are only a few companies that actually manufacture LCD panels.
We would also expect any companies that release notebooks with higher-quality LCDs to make a bullet point out of the fact, which means that if you don't see any particular emphasis placed on the display panel in a notebook's specifications it probably has a panel similar to one of the laptops we're looking at today.

Killing the Business Desktop PC Softly

Despite numerous attempts to kill it, it is still alive and kicking. It is "fat", some say, and it hogs up lots of energy and money. To others it is like a mosquito bearing malaria: nothing more than a transmitter of viruses and other parasites. This "source of all evil" in the IT infrastructure is also known as the business desktop PC. Back at the end of the nineties, Larry Ellison (Oracle) wanted to see the PC die, and proposed a thin client device as a replacement dubbed the NC (Network Computer). Unfortunately for Oracle, the only thing that died was the NC, as the desktop PC quickly adapted and became a more tamable beast.

When we entered the 21st century, it became clear that the thin client was back. Server based computing (SBC), the prime example being Citrix MetaFrame Presentation Server, has become quite popular, and it has helped to reduce the costs of traditional desktop PC computing. What's more, you definitely don't need a full blown desktop client to connect to Citrix servers, so a thin client should be a more cost friendly alternative. When Microsoft Windows Server 2003 came out with a decent Terminal Server, SBC became even more popular for light office work. However, the good old PC hung on. First, as interfaces and websites became more graphically intensive, the extra power found in typical PCs made thin clients feel slow. Second, the easily upgradeable PC offered better specs for the same price as the inflexible thin client. Third and most importantly, many applications were not - and still are not - compatible with SBC.

That all could change in 2007, and this time the attempt on the PC's life is much more serious. In fact, the murder is planned by nobody less than the "parents" of the PC. Father IBM is involved, and so is mother Compaq (now part of HP). Yes, two of the most important companies in the history of the PC are ready to slowly kill the 25-year-old. Will these super heavyweights finally offer a more cost friendly alternative to the desktop PC? Let's find out.

Silver Power Blue Lightning 600W

Most of our readers are probably not familiar with the company Silver Power, which is no surprise considering that this is a new brand name primarily targeting the European market. However, the parent company of Silver Power is anything but new and has been manufacturing a variety of power supplies for many years. MaxPoint is headquartered in Hamburg, Germany, and has ties to several other power supply brands, the most notable being Tagan.

The Tagan brand was established to focus on high-end gamers and enthusiasts, where quality is the primary concern and price isn't necessarily a limiting factor. Silver Power takes a slightly different route, expanding the product portfolio into more cost-conscious markets. Having diverse product lines that target different market segments is often beneficial for a company, though of course the real question is whether Silver Power can deliver good quality at a reduced price.

We were sent their latest model, the SP-600 A2C "Blue Lightning" 600W power supply, for testing. This PSU delivers 24A on the 3.3V rail and 30A on the 5V rail, which is pretty average for a 600W power supply. In keeping with the latest power supply guidelines, the 12V power is delivered on two rails, each capable of providing up to 22A. However, that's the maximum power each 12V rail can deliver on its own; the total combined power that can be delivered on the 3.3V, 5V, and 12V rails is 585W, and it's not clear exactly how much of that can come from the 12V rails, which are each theoretically capable of delivering up to 264W.
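As a quick sanity check on those label ratings, the rail math works out as follows (our own arithmetic, based on the figures above):

# Per-rail maximums from the SP-600 A2C label (volts x amps = watts)
rails = {
    "3.3V": 3.3 * 24,    # 79.2W
    "5V":   5.0 * 30,    # 150W
    "12V1": 12.0 * 22,   # 264W
    "12V2": 12.0 * 22,   # 264W
}
combined_limit = 585     # watts allowed across the 3.3V/5V/12V rails together

for name, watts in rails.items():
    print(f"{name}: {watts:.1f}W max")
print(f"Sum of per-rail maximums: {sum(rails.values()):.1f}W")
print(f"Combined limit: {combined_limit}W -> the rails cannot all peak at the same time")

The per-rail maximums add up to roughly 757W, well above the 585W combined limit, which is exactly why the label leaves the real-world 12V capacity ambiguous.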

RAID Primer: What's in a number?

The majority of home users have experienced the agony of at least one hard drive failure in their lives. Power users often experience bottlenecks caused by their hard drives when they try to accomplish I/O-intensive tasks. Every IT person who has been in the industry for any length of time has dealt with multiple hard drive failures. In short, hard drives have long caused the majority of support headaches in standard desktop or server configurations, with little hope of improvement in the near term.

With the increased use of computers in the daily lives of people worldwide, the dollar value of data stored on the average computer has steadily increased. Even as MTBF figures have moved from 8000 hours in the 1980s (example: MiniScribe M2006) to current levels of over 750,000 hours (Seagate 7200.11 series drives), this increase in data value has offset the relative decrease in hard drive failures. The increase in the value of data, and the general unwillingness of most casual users to back up their hard drive contents on a regular basis, has put increasing focus on technologies which can help users survive a hard drive failure. RAID (Redundant Array of Inexpensive Disks) is one of these technologies.

Drawing on whitepapers produced in the late 1970s, the term RAID was coined in 1987 by researchers at the University of California, Berkeley in an effort to put into practice theoretical gains in performance and redundancy which could be made by teaming multiple hard drives in a single configuration. While their paper proposed certain levels of RAID, the practical needs of the IT industry have brought several slightly differing approaches. Most common now are:

RAID 0 - Data Striping
RAID 1 - Data Mirroring
RAID 5 - Data Striping with Parity
RAID 6 - Data Striping with Redundant Parity
RAID 0+1 - Data Striping with a Mirrored Copy

Each of these RAID configurations has its own benefits and drawbacks, and is targeted at specific applications. In this article we'll go over each and discuss in which situations RAID can potentially help - or harm - you as a user.
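As a conceptual preview of how the parity-based levels survive a failure, the toy example below (our own sketch, operating on small byte strings rather than real disks) shows the XOR parity idea that RAID 5 builds on: the parity block is the XOR of the data blocks, so any single missing block can be rebuilt from the survivors.

def xor_blocks(*blocks):
    """XOR equal-length byte blocks together (the parity operation used by RAID 5)."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            result[i] ^= b
    return bytes(result)

# Three "data disks" holding one stripe each, plus a computed parity block
d0, d1, d2 = b"AAAA", b"BBBB", b"CCCC"
parity = xor_blocks(d0, d1, d2)

# Simulate losing disk 1: its contents can be rebuilt from the survivors plus parity
rebuilt_d1 = xor_blocks(d0, d2, parity)
assert rebuilt_d1 == d1
print("Disk 1 rebuilt from parity:", rebuilt_d1)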

OCZ Introduces DDR3-1800

Memory based on the exciting new Micron Z9 memory chips for DDR3 first appeared a couple of weeks ago, and we first looked at it in Super Talent & TEAM: DDR3-1600 is Here! As predicted in that review, it was only a matter of days until most of the major enthusiast memory makers began talking about their own products based on Micron Z9 chips. Some even announced fast availability of the new kits in the retail market.

The reasons for this are basic. All memory makers buy raw memory chips available in the open market. Some memory makers do not like to talk about the chips used in their DIMMs, as they consider that information proprietary, but this secrecy does not normally last very long. It is rare to see a memory manufacturer with a truly exclusive supply arrangement with a memory vendor, but several companies have been trying very hard to do just this, and we may see more of these attempts in the future.

The DIMM manufacturers then speed grade or "bin" the chips to create one or more speed grades from a single chip type. Memory chips are then surface-mounted on generic or proprietary circuit boards with SPD (Serial Presence Detect) chips programmed with generic code or custom SPD programming done by the DIMM maker. This is why fast new chips like the Micron Z9 often circulate rapidly through the enthusiast memory market, as each manufacturer tries to introduce products based on the new chips with new twists that outdo the competition. This does not mean the memory you buy from Super Talent, for example, is exactly the same as the Micron Z9-based memory you buy from Corsair. Companies pride themselves on the sophistication of their speed-grading technology, their design and/or sourcing of PCBs, and their skill at programming the SPD.

Despite the real differences that emerge in memory performance from different DIMM manufacturers, the normal pattern is that one company successfully uses a new chip in a top-performing new DIMM, and soon everyone in the market has a similar memory product based on the same chip. That is why every memory company has announced, or will soon be announcing, their own Micron Z9-based memory.

One of the more interesting of the announcements is OCZ DDR3-1800, rated at 8-8-8 timings at DDR3-1800, which is the fastest production DDR3 kit currently available. This new PC3-14400 Platinum Edition kit is specified to reach DDR3-1800 at 1.9V and is claimed to have substantial headroom above this speed. It certainly appears that OCZ is binning Micron Z9 chips for even higher memory speeds, along with possibly some other tweaks to squeeze more from these chips. The test results should tell us what these new DIMMs can actually do.
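For reference, the numbers in the product name work out as follows (a quick calculation of our own, assuming the standard 64-bit DDR3 module data bus):

# DDR3-1800 / PC3-14400 name check, plus the absolute CAS latency at 8-8-8
transfers_per_sec = 1800e6       # DDR3-1800 = 1800 million transfers per second
bus_width_bytes = 8              # standard 64-bit DIMM data bus
io_clock_mhz = 1800 / 2          # DDR: two transfers per I/O clock -> 900MHz
cas_cycles = 8                   # the first "8" in 8-8-8

bandwidth_mb_s = transfers_per_sec * bus_width_bytes / 1e6
cas_ns = cas_cycles / io_clock_mhz * 1000

print(f"Peak bandwidth: {bandwidth_mb_s:.0f} MB/s  (hence the PC3-14400 label)")
print(f"CAS latency:    {cas_ns:.1f} ns at CL{cas_cycles}")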

µATX Overview: Prelude to a Roundup


Our upcoming series of µATX articles has traveled a long road (Ed: that's an understatement!). When we first envisioned a long-overdue look at µATX form factor motherboards, we thought it would be your typical motherboard roundup with maybe a twist or two tossed in to keep it interesting. One thing led to another, and before we knew it our minds started to run rampant with additional items that we felt were important for the article. This led to scope creep, and those of us who manage projects - or who have been unlucky enough to be on a project that has featuritis - know what happens next.

That's right, we over-emphasized the new article features to the detriment of our primary focus: providing a motherboard roundup that featured the often ignored but market-leading µATX form factor. What started out with adding a couple of features such as IGP video quality comparisons and midrange CPU performance turned into a maze of thoughts and ideas that led us to believe it would be quite easy to add additional tests without affecting the overall schedule too much. We were wrong, but we hope that our future motherboard articles will be better for it.

How did we get stuck in the quagmire of µATX hell? It began with innocent thoughts of adding budget to midrange CPU coverage, low to midrange graphics comparisons against the IGP solutions, High Definition playback comparisons utilizing not one but each competing standard, Windows XP versus Vista versus Linux, onboard audio versus add-in cards, and even tests of input devices and external storage items. It ended with our project scope changing from being motherboard specific to platform encompassing.

We started down that path, but despite periodic excitement, at times we also ended up with a dreaded case of paralysis by analysis. Don't get us wrong: we do not regret the effort that has been expended on this roundup; however, we sincerely regret the time it has taken to complete it, and we apologize to those of you who have been waiting months for this information. It turns out that we ignored one of our favorite quotes from C. Gordon Bell: "The cheapest, fastest, and most reliable components are those that aren't there." That is one of the many factors that caused us problems, as it became quite obvious during testing that getting all of this equipment to work together and then benchmarking as planned was not exactly going to be a walk in the park.

We have been constantly waiting on that one BIOS or driver to fix a myriad of problems that we've discovered along the way. The manufacturers would ask - sometimes plead - for us to retest or wait as "that problem is being solved and a fix should be available immediately". Immediately, it turns out, means days and weeks, not hours. We also received several product revisions during the course of testing that required us to throw out the old results and start again. In the end, we hope our efforts paid off, and at least we have the knowledge that every supplier has had ample opportunity to fix any ills with their product.

Our experiences with a wide variety of components will be discussed extensively in a series of articles to be published over the coming month. However, at the end of the day, the star of this show is still the motherboard. If the CPU is the brain of a computer and the video card is its eyes, then the motherboard is the central nervous system.
It truly is the central focal point of the system, and having one that works correctly makes it far easier to put a system together.

As such, we are changing our testing emphasis from being primarily performance based to a combination of performance, features, stability, support, and those intangibles we experience during testing that might set one board apart from another. While performance is important, do a few tenths of a second or an additional two frames per second in a benchmark really mean that much when you cannot get a USB port working due to a crappy BIOS release, or your system does not properly recover from the S3 sleep state when you are set to record the last episode of the Sopranos? We didn't think so either, so we are changing our vantage point on motherboard testing.

While we are performance enthusiasts at heart, the fastest board available is not worth much if the included features do not work as advertised or the board constantly crashes when trying to use an application. Our testing emphasis, especially between boards based on the same chipset, will be focused on stability and compatibility with a wide range of peripherals in both stock and overclocked conditions. Speaking of features, we will place a renewed emphasis on networking, storage, memory, and audio performance. More importantly, we will provide additional analysis of overclocking, energy consumption, cooling capabilities, layout, and power management features where applicable.

We also want to take this opportunity to put the manufacturers on notice: we will not countenance delays, patches, and numerous updates again, particularly on products that are available in the retail market! If a lemon of a motherboard gets released to consumers and it needs more BIOS tuning or perhaps an entirely new revision, we are going to do our best to point this fact out to the readers. We understand that it can be difficult to get every single peripheral to work properly, especially with new devices coming out all the time, but when a motherboard fails to work properly with a large number of USB devices, memory types, GPUs, etc., that product shouldn't be on the market.

At the end of this journey we will provide three different platform recommendations based on the various components we have utilized in testing. Our platforms are designed around HTPC, Gaming, and Home/Office centric configurations, with a heavy emphasis on the systems being quiet, reliable, and affordable. Okay, we blew the budget on the HTPC configuration, but we will provide several alternatives to help control costs on that particular build. Let's find out what else is changing and exactly what will be included in our comprehensive review of µATX motherboards and surrounding technologies.