Archive for the ‘Technology’ Category.

What is Happening at the Japanese Nuclear Plants

This is the most helpful article I have found yet on the problems at earthquake-damaged nuclear plants.  As one can imagine, it is a lot more sensible than some of the garbage in the general media.

It cleared up one point of confusion I had - I was not sure why there was still heat generation after the control rods slammed down, killing the fission process.  But apparently there are a number of intermediate fission products created that continue to decay for several days, producing about 3% of the heat of the full fission process.  This heat is what boiled away the water in the reactor vessel once the flow of cooling water stopped.  It is this boiling that led to the necessity to release steam (to reduce pressure in the reactor vessel).  It was this steam that was partially dissociated into hydrogen and oxygen, which led to the explosion.
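
To put a rough number on that residual heat, here is a back-of-envelope sketch.  The reactor size below is my own assumed round number (the linked article does not give one); the ~3% decay-heat fraction is the figure cited above, and it falls off further over the following days.

    # Back-of-envelope decay-heat estimate -- assumed reactor size, cited 3% fraction.
    THERMAL_POWER_W = 2.0e9        # assumed full-power thermal output (2 GW)
    DECAY_FRACTION = 0.03          # ~3% of full power shortly after shutdown
    LATENT_HEAT_J_PER_KG = 2.26e6  # energy to boil 1 kg of water already at 100 C

    decay_heat_w = THERMAL_POWER_W * DECAY_FRACTION
    boil_rate_kg_per_s = decay_heat_w / LATENT_HEAT_J_PER_KG

    print(f"Decay heat: {decay_heat_w / 1e6:.0f} MW")
    print(f"Boil-off rate: {boil_rate_kg_per_s:.0f} kg/s "
          f"(~{boil_rate_kg_per_s * 3600 / 1000:.0f} tonnes per hour)")

With those assumptions you get roughly 60 MW of decay heat, enough to boil off nearly 100 tonnes of water an hour, which is why losing the cooling pumps matters so much.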

One fact that has been lost in all the hype, and may continue to be lost, is that the earthquake alone (which was 7 times larger than the plant was designed for) was necessary but not sufficient to lead to the current problems.  Everything probably would have been fine had it not been for the tsunami knocking out all the diesel generators the plant used in an emergency to keep the cooling pumps running.  Apparently the generators they rushed to the site later could not be used due to various incompatibilities, the type of real-world frustrating problem that will be immediately recognizable to any engineer with a troubleshooting background.

Update: Unfortunately, the author may have been overly optimistic.  The author implied the pile would stop producing new heat after a few days, but that does not seem to be the case, particularly since spent fuel rods apparently have to be kept in water to keep them cool for months or years after they were in service.  With the apparent rupture of the main pressure vessel around the core, all bets would seem to be off in terms of containing the most harmful radioactive elements.

I did troubleshooting at a refinery for years, and almost every time the worst disasters came from one improbable event and/or screwup piled on top of another.  The human mind seems to be unable to really grasp just how screwed up things can get.  The novel Jurassic Park was as much about this problem as it was about dinosaurs.

Update #2: This is the piece that was missing from the earlier linked report:

The sharp deterioration came after a frantic day and night of rescue efforts focused largely on the No. 2 reactor. There, a malfunctioning valve prevented workers from manually venting the containment vessel to release pressure and allow fresh seawater to be injected into it. That meant that the extraordinary remedy emergency workers had jury-rigged to keep the nuclear fuel from overheating no longer worked.

As a result, the nuclear fuel in that reactor was exposed for many hours, increasing the risk of a breach of the container vessel and more dangerous emissions of radioactive particles.

By Tuesday morning, Tokyo Electric Power said that it had fixed the valve and resumed seawater injections, but that it had detected possible leaks in the containment vessel that prevented water from fully covering the fuel rods.

Update #3:  Things are slightly better.

OK, This is Actually a Kind of Cool Application of Green Technology

MUCH better than stockpiling food and ammo in some crappy mountain retreat, this is the way to ride out the end of civilization.

Told Ya

Based on past studies of sudden acceleration (e.g. the finding that the vast majority of incidents mysteriously happen to senior citizens), I predicted that many of the Toyota failures would come down to operator error.  The incentives for operators, both psychological and monetary, to blame their own errors on Toyota are substantial, even before tort action.

The U.S. Department of Transportation has analyzed dozens of data recorders from Toyota Motor Corp. vehicles involved in accidents blamed on sudden acceleration and found that at the time of the crashes, throttles were wide open and the brakes were not engaged, people familiar with the findings said.

The results suggest that some drivers who said their Toyota and Lexus vehicles surged out of control were mistakenly flooring the accelerator when they intended to jam on the brakes. But the findings don't exonerate Toyota from two known issues blamed for sudden acceleration in its vehicles: sticky accelerator pedals and floor mats that can trap accelerator pedals to the floor.

The findings by the National Highway Traffic Safety Administration involve a sample of reports in which a driver of a Toyota vehicle said the brakes were depressed but failed to stop the car from accelerating and ultimately crashing.

The data recorders analyzed by NHTSA were selected by the agency, not Toyota, based on complaints the drivers had filed with the government.

The findings are consistent with a 1989 government-sponsored study that blamed similar driver mistakes for a rash of sudden-acceleration reports involving Audi 5000 sedans.

The Toyota findings, which haven't been released by NHTSA, support Toyota's position that sudden-acceleration reports involving its vehicles weren't caused by electronic glitches in computer-controlled throttle systems, as some safety advocates and plaintiffs' attorneys have alleged. More than 100 people have sued the auto maker claiming crashes were the result of faulty electronics.

Of course breast implants pretty clearly never caused immune disorders, but that did not stop tort lawyers from bankrupting an entire industry on that theory.  So it is nice that Toyota has the facts on its side, but that may or may not help in court, and it almost certainly will not help in Congress or the Administration, whose agendas have always been driven more by the desire to help domestic auto companies against a powerful foreign rival than by the facts.

Pretty Cool

I thought this was interesting, not only for the unified control algorithm, but also just for the lifting capacity of these little buggers.  Via Engadget

Commercial Jetpack!

It would be nice if it were more compact, but they are claiming a 30-minute flight time, which is huge compared to earlier efforts.

It does help to illustrate a different point I make about alternatives to internal combustion.  Note the device uses gasoline.  Nothing else that is so cheap and plentiful comes close to gasoline's energy-to-weight ratio, which is why it is so freaking hard to replace in cars.
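
To put some rough numbers on that claim (these are round figures I am supplying for illustration, not anything from the jetpack story): gasoline holds on the order of 13 kWh per kilogram, while a lithium-ion battery pack holds a few tenths of a kWh per kilogram.  Even after you discount for the lousy efficiency of a gasoline engine, the fuel still wins by an order of magnitude.

    # Rough energy-per-kilogram comparison -- all round, assumed figures.
    GASOLINE_KWH_PER_KG = 12.8   # ~46 MJ/kg
    LI_ION_KWH_PER_KG = 0.2      # typical lithium-ion pack, order of magnitude
    ENGINE_EFFICIENCY = 0.25     # assumed gasoline engine efficiency
    MOTOR_EFFICIENCY = 0.90      # assumed electric drivetrain efficiency

    raw_ratio = GASOLINE_KWH_PER_KG / LI_ION_KWH_PER_KG
    usable_ratio = (GASOLINE_KWH_PER_KG * ENGINE_EFFICIENCY) / (
        LI_ION_KWH_PER_KG * MOTOR_EFFICIENCY)

    print(f"Raw energy per kg: gasoline ~{raw_ratio:.0f}x the battery")
    print(f"Usable energy per kg: still ~{usable_ratio:.0f}x after efficiency losses")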

Things That Are Ticking Me Off Today

Overscan.  Yes, I just lost a day of my time to overscan.  Traditional TV sets do not show the entire image they receive from broadcast or DVDs.  They cut off 8-15% of the image around the edges.  This is to make sure there is no black or other border around the image, much as one does in a printing process.  This is fine for an Avatar DVD, where one might lose a few leaves in the jungle at the edge, but when I am projecting charts in one of my climate videos, it tends to cut off axes.  (When one is working with only 480 vertical pixels - the traditional DVD resolution - it is hard to get detailed charts to project well anyway, but losing resolution to overscan is a further pain.)  Making the situation more complicated, DVDs played on a computer or on some (but not all) modern flat screens do not have overscan.

Anyway, I am mostly done, and will post my latest effort here soon.

Some Decent Rechargeable Batteries?

This Is Still A Stupid Idea

I probably have posted on the electricity-generating speed bump more times than it deserves, but Glenn Reynolds linked this story and I am seeing it linked uncritically all over.  Here was the email I dashed off to Instapundit:

The speed bump / power device at the Burger King in New Jersey is the silliest technology I have ever seen, and I am amazed that so many people praise it or write uncritically that it provides free power.  Energy is never free; it comes from somewhere.  In this case, the energy is actually stolen from the car.  The electric power produced is equal to or less than the extra power the car has to expend going over the bump.

This electricity might be "free" if it is used where cars are braking anyway, say on a long down ramp in a parking garage, or on a suburban street or school zone where speed bumps already exist.  But the Burger King example, and in fact most of the examples I have seen of this installation, are just vampiric theft, very similar to what the US Government does in many of its programs, creating a large benefit for a single user and hoping that distributing the costs in small chunks across a wide number of people makes these costs invisible.
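
For anyone who doubts how small this "free" energy is, here is a back-of-envelope estimate.  Every number in it is my own assumption for illustration; the point is only the order of magnitude, and that every joule comes out of the driver's gas tank.

    # Rough estimate of energy harvested per car by a speed-bump generator.
    # All figures are assumed for illustration, not specs of the actual device.
    CAR_MASS_KG = 1500
    PLATE_DEFLECTION_M = 0.05   # assumed travel of the plate under each axle
    G = 9.8
    AXLES = 2

    # Roughly half the car's weight rests on each axle as it crosses the plate.
    energy_per_car_j = (CAR_MASS_KG / 2) * G * PLATE_DEFLECTION_M * AXLES
    energy_per_car_wh = energy_per_car_j / 3600

    print(f"~{energy_per_car_j:.0f} J per car (~{energy_per_car_wh:.2f} Wh)")
    # Even 1,000 cars a day yields only ~0.2 kWh -- a couple of cents of
    # electricity, all of it taken from the cars that drove over the bump.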

I wrote more about the technology here.

Rated Capacity

One needs to be a careful consumer of information when reading about the "rated capacity" of certain alternative energy plants. 

Take a 1MW nuclear plant, run it for 24 hours, and you get 24 MW-hours, or something fairly close to that, of electricity.

Leave 1MW worth of solar panels out in the sun for 24 hours and you get much less total electricity, depending on where you put them.  On an average day in New York City, you will get about 4 MW-hours.  In one of the best solar sites in the world, my home of Phoenix, you get about 6.5 MW-hours per day.  The key metric is peak sun-hours per day, and some example figures are here.  So, even in the best solar sites in the world, solar panels run at only about 25-30% of capacity.
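
The arithmetic is simple enough to put in a few lines, using the peak-sun-hour figures cited above:

    # Capacity factor from peak sun-hours, using the numbers in the text above.
    RATED_MW = 1.0
    PEAK_SUN_HOURS_PER_DAY = {"New York City": 4.0, "Phoenix": 6.5}

    for city, hours in PEAK_SUN_HOURS_PER_DAY.items():
        daily_mwh = RATED_MW * hours
        capacity_factor = daily_mwh / (RATED_MW * 24)
        print(f"{city}: ~{daily_mwh:.1f} MWh/day, "
              f"capacity factor ~{capacity_factor:.0%}")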

It turns out, not surprisingly, that the same relationship holds for wind.

It's not like it's a secret that wind turbines are an unreliable source of electrical power. Bryce points out that, "In July 2006, for example, wind turbines in California produced power at only about 10 percent of their capacity; in Texas, one of the most promising states for wind energy, the windmills produced electricity at about 17 percent of their rated capacity."

That means that there has to be nuclear, coal-fired or natural gas power plants functioning fulltime as a backup to the pathetically unreliable and inefficient wind farms. Moreover, what electricity they do generate is lost to some degree in the process of transmitting it over long distances to distribution facilities.

Now, this should not outright dissuade us from these technologies, but since no one has really licked the night-time / not-windy storage problem, it's certainly an issue.  I have looked at solar for my house a number of times, and the numbers just are not there (even with up to 50% government subsidies!) without a 2-5x decrease in panel costs.  Low yields can potentially be tolerated, but capital costs are going to have to be a lot lower before they make a ton of sense.

The Poverty Bomb

Who knew that one small piece of technology could turn a group of wealthy American urbanites into third world refugees.

Some Blu-Ray Advice

I am a bleeding edge guy when it comes to home theater, so I have had a Blu-Ray high-def disk player for over a year.  I am currently looking for a second player to replace the first, and I thought I might share a couple of thoughts.

The press has declared the high-def DVD format war over, with Toshiba pulling the plug on the HD-DVD format.  This makes it much easier to figure out what software to buy (though it is still really expensive -- some Blu-Ray disks are going for $40!)

However, the hardware issue is still a minefield.  This is related to how the Blu-ray standard is being run, which presents problems and opportunities.  Unlike your CD or DVD player, the Blu-ray standard continues to evolve.  A lot.  It is much more like a computer standard, and I suspect in fact that the computer guys (or at least the game console guys) are running the show here.  This means that new features continue to evolve and be added.  And these are not just add-on features, like additional hardware inputs, but software features that create compatibility issues between versions.   As a result, there are already at least 3 generations of players out there.  The original profile 1.0, and then profile 1.1, and now profile 2.0.  And even within these profiles, individual players may vary in their conformance to them.   Sometimes you can do a firmware upgrade to a newer spec, and sometimes you can't, but such upgrades are not a piece of cake, and involve burning a DVD from the Internet and running certain codes from the Blu-ray remote to make the firmware upload.

The net result is that the features on a certain disk may not work on your player, or the disk may not work in your player at all.  (Newer movies like Pirates of the Carib. III have multimedia title pages that won't load on my player, and when the title page won't load, there is no way to play the movie.)  My advice is, if you have waited this long, hold out until this summer for the newer profile 2.0 machines.  Also, you should confirm the player supports HDMI 1.3, so it can take advantage of the wider color gamut of newer TVs.  Players of this spec will start showing up in the next few months -- the Sony BDP-S350 will likely be a good choice available this summer.

By the way, good luck finding anything on the box or in a Best Buy store that says what profile the player conforms to.  Hardware makers have created a real compatibility mess with Blu-ray (it seems to be a very poorly run standard), but they want to hide this fact from consumers because they are only just now recovering from the format war with HD-DVD and don't want consumers to have another reason to wait to purchase.  So there is no way they are going to put the profile number on the box, I guess, so you need to do your research.

As a final thought, and maybe I am just old and out of step here, but I really find the insistence on multimedia content and bitchin-cool menu screens on Blu-ray disks to be tiresome.  I just want to watch the movie in beautiful high resolution, and having my software not work right because the menu doesn't work is just stupid.  Further, the addition of all these features has caused most Blu-ray players to have a boot-up cycle longer than Windows.  It can take 45 seconds for a Blu-ray player to boot up, and a similar amount of time to get the software to start playing.  Add in the time to plow through stupid menu screens, and it can take several minutes to get a movie started.

Tonight I watched Cloverfield on Blu-ray and it was awesome.  I was surprised the reviews on Amazon were so bad for Cloverfield, because I really liked it.  Yeah, it's different, but unlike movies like Bourne Ultimatum, there is actually an explainable reason for the jerky (and sometimes nauseating, I will admit) camera work.  I did not pay much attention to it when it came out in theaters -- is this one of those geek litmus-test videos that only a few of us hard-core nerds like (a la Serenity)?

Cheap Cables? Well Mostly

This post advocates always buying the cheap home theater cables.  I agree up to a point.  I have never been able to hear the difference in really, really expensive cables, say, for 3-foot interconnects.

But there is an exception to this, and it is interesting that the author Glenn is quoting actually uses this example -- long runs of video cable, particularly HDMI.  If your TV sits on top of your video source, and the video run is 6 feet or less, then the average person with the average equipment will not notice the difference in video cables.  But should your cable run extend to, say, 25 feet or more, then you are going to have problems.  Video is both very high bandwidth and very susceptible to noise.  HDMI and other digital cables are no exception -- the only thing that changes is the symptoms.

In an analog cable, you will start getting a lot of video noise with longer cable runs.  Computer VGA cables were notorious for this -- if you went more than 6 feet, your picture could be a real mess.  S-video had similar problems.  Now, with a digital cable, the picture does not gain noise, but at some point the signal is lost altogether and the picture drops out completely -- think of a YouTube video streaming over a bad wireless connection.  I will just about guarantee this will happen with 25 feet of JC Penney HDMI cable.

New iPod Warning and Update

A few days ago I wrote that there were a lot of bad reviews of the new iPod Classics.  The form factor and increased storage seem enticing, but people complained about the user interface.

Today I went to Best Buy and tried them out.  Yuk!!  Scrolling through the menu, even with the album cover flip thing off, is really bad.  All sense of precision is lost, and the speed is much slower.  Just getting to "artist" in the top menu was hard -- I kept scrolling past it.  There is just no sense of precise control.

I urge all of you to go try one before you buy, particularly if you are like me and are upgrading from a gen 5.5 classic.  Do not just buy it online sight unseen assuming it is just like the 5.5 but with more storage and a thinner form factor.  I am also told, but can't attest to the fact, that it is much harder to get video out to a TV, say in a hotel room, and takes new adapters and cradles to do so. 

This may get fixed in a software patch, but I am not entirely sure.  I have heard that new hardware on the touchpad is partially to blame, and there is no patch for that.  I can confirm that it did not feel like the old touch pad.

Machines Have Each Others' Back

Something about this seems oddly Judgment-Day-ish (in the Terminator sense):  Cameras watching cameras.  Via Hit and Run

Fluorescent Bulbs

I have to echo this post from Glenn Reynolds about fluorescent replacement bulbs for the home.  If you have not bought any in the last two years, they have come a long, long way.  They are much cheaper - Home Depot was running a screaming deal on multi-packs here this weekend.  The buzzing fluorescent sound is gone.  And the ones at Home Depot came in a range of three color temperatures - from warm white, which comes close to matching the light color and temperature of incandescent bulbs, to bright white and daylight.  The latter have a brighter, cooler (bluer) light that you might more closely associate with fluorescent.  I use the warm ones indoors and the cooler white ones outdoors.  I can barely tell the difference, even for bare bulbs in my ceiling cans, between the newer warm fluorescents and the older incandescents.  And if the bulb is in a lamp under a shade, I really can't tell the difference.

These are a total no-brainer.  They pay for themselves in longer life alone, and the 70-80% energy savings comes on top of that.  Highly recommended.
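
A quick payback estimate shows why it is a no-brainer.  The prices, usage, and electricity rate below are my own assumed round numbers, not figures from the post; only the 70-80% energy savings comes from above.

    # Rough payback estimate for a compact fluorescent replacing an incandescent.
    INCANDESCENT_W = 60
    CFL_W = 15                 # ~75% savings, in the 70-80% range cited above
    HOURS_PER_DAY = 3          # assumed daily use
    PRICE_PER_KWH = 0.10       # assumed electricity price in $/kWh
    CFL_COST = 2.00            # assumed multi-pack price per bulb

    kwh_saved_per_year = (INCANDESCENT_W - CFL_W) * HOURS_PER_DAY * 365 / 1000
    dollars_saved_per_year = kwh_saved_per_year * PRICE_PER_KWH
    payback_months = CFL_COST / dollars_saved_per_year * 12

    print(f"~{kwh_saved_per_year:.0f} kWh saved per year (~${dollars_saved_per_year:.2f})")
    print(f"Payback in roughly {payback_months:.0f} months, before counting longer bulb life")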

Another Crazy Patent Decision

Apple just lost $100 million to a Creative Labs suit over iPod menus.  Stephan Kinsella isolates exactly what unbelievable breakthrough Apple will now have to pay for.  Here it is.  Be amazed at the genius involved, all you small-minded folks who would never have been smart enough to think of this for yourselves:

A method of selecting at least one track from a plurality of tracks stored in a computer-readable medium of a portable media player configured to present sequentially a first, second, and third display screen on the display of the media player, the plurality of tracks accessed according to a hierarchy, the hierarchy having a plurality of categories, subcategories, and items respectively in a first, second, and third level of the hierarchy, the method comprising:

  • selecting a category in the first display screen of the portable media player;
  • displaying the subcategories belonging to the selected category in a listing presented in the second display screen;
  • selecting a subcategory in the second display screen;
  • displaying the items belonging to the selected subcategory in a listing presented in the third display screen; and
  • accessing at least one track based on a selection made in one of the display screens.

Oh my god, like, my head is going to explode, this is so revolutionary and complicated.  Someone just invented the hierarchical menu.  Jeez, how have we done without this all these years?  </sarcasm>
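
In case the claim language obscures just how mundane the "invention" is, here is the whole thing as a few lines of Python, with a made-up music library for illustration:

    # The patented three-screen hierarchical selection, in miniature.
    library = {                       # screen 1: categories
        "Artists": {                  # screen 2: subcategories
            "Rush": ["Tom Sawyer", "Limelight"],   # screen 3: items (tracks)
            "Yes": ["Roundabout"],
        },
        "Genres": {
            "Rock": ["Tom Sawyer", "Roundabout"],
        },
    }

    def select_track(category, subcategory, index):
        """Pick a category, then a subcategory, then a track -- that's the patent."""
        subcategories = library[category]      # displayed on the second screen
        tracks = subcategories[subcategory]    # displayed on the third screen
        return tracks[index]                   # access the selected track

    print(select_track("Artists", "Rush", 0))  # -> Tom Sawyer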

Time for Patent Reform

It's clearly time for patent reform as it applies to software.  In the last ten years, software engineers have apparently been able to convince hardware-centric patent examiners that some pretty basic software concepts are "non-obvious" and patentable.  Guestblogging at Overlawyered last week, I mentioned one such patent, the Amazon "1-click ordering" patent, which to me is clearly copyrightable, but not patentable.

Rob Pegoraro makes a similar point in the Washington Post, editorializing on the Blackberry suit:

No, the problem here is simpler. There are too many bogus patents getting handed out.

One solution would be to make more things unpatentable. Just as you can't -- or shouldn't -- be able to patent a mathematical equation, in this scenario you wouldn't be able to claim ownership of things like the general workings of software (any individual program is already protected by copyright) or business methods. The U.S. has been a pioneer in turning those things into new types of intellectual property; perhaps it's time to declare this experiment a failure.

Another, somewhat overlapping solution would make it harder to get any patent. The patent office would apply a higher standard of "non-obviousness" -- the idea that a patent shouldn't reward "inventions" any competent individual could have thought up. And any outside party could submit evidence against a patent before it became final.

I am generally sympathetic to Blackberry's plight, in part because I went to school with Jim Balsillie, the CEO of RIM.  One thing Pegoraro missed in his editorial:  The US Patent Office has already said it made a mistake in issuing the original patent that RIM was found to be violating.  The nullification of this patent is working through the system, and RIM is pleading that the injunction against it wait until this process is complete, sort of like a victim on death row begging not to be put to death because the prosecutor has admitted that, based on new evidence, he shouldn't have pursued the case in the first place.  RIM has offered to settle with NTP (the patent holder) with a give-back if the patent is invalidated in the future, but NTP has refused this.  This all makes for an interesting drama, with a lot of brinksmanship.

By the way, though I am sympathetic to RIM to some extent, that sympathy is diminished by this:

In 2002, RIM sued software developer Good Technology for its wireless mail-transfer technology and "smart phone" maker Handspring over its miniaturized keyboard design. Both wound up forking over licensing fees.

As I wrote before, what goes around, comes around when you use the legal system and the long hand of the government to step on competitors.

Computer Build

Well, I had a number of emails asking for the specifics of my computer build, so all you non-geeks can move on.  Hopefully I will get a post up about USA Today putting for-gods-sakes ethanol on the front page of today's paper.  Anyway, here are the components of my computer build:

  • ASUS A8N-SLI Premium motherboard.  This basic motherboard platform is rock-solid.  The premium version mainly brings a quieter heat-pipe design to cool the mobo chipset and a software rather than hardware switch for single to dual SLI.  It is one of the better overclocking platforms, with good BIOS options.  It has a couple of quirks, probably the most important of which is that it tends not to like RAM in 4 sticks -- better to use two.  I chose not to use the newer A8N32-SLI, which is supposed to increase the bandwidth when 2 SLI cards are used.  However, I think the Nvidia chipset for this was rushed (to please Dell), and tests show it's not necessarily faster, even with 2 SLI cards, than the one I bought.  Also, I wanted to shy away from the bleeding edge for my first build.
  • AMD Athlon 64 X2 (dual core) 4400+ microprocessor.  This is the 2.2GHz Toledo core with the larger cache.  As I mentioned yesterday, it's a notch or two below the top of the line, which tends to be a better value.  And the consensus opinion is that AMD is dusting Intel right now.  I got the large cache because you can always overclock but you can't overcache.  The dual core is clearly the wave of the future, and more games and programs will support it in the future.  I was a bit worried that I would have some compatibility problems at first, but I have had none, even on Star Wars Battlefront 2, which was reported to have a compatibility issue with dual core.
  • 2 gigs of memory from Corsair, in two 1GB sticks.  Corsair is a top company in memory.  I can't tell you how many people struggle to overclock their PC a few percent but have too little memory.  Tests show even going from 1 to 2 gigs shows real results.  I got the Twinx-2048-4000.  I debated between lower-speed (DDR400), lower-latency memory and higher-speed (DDR500), higher-latency memory.  I went with the latter, hoping that it was better for overclocking, but this is one issue not well addressed online.  The answer is probably here, but I decided it would not matter that much for me.  If you go with 512MB sticks rather than 1GB sticks there are more options for memory that is both low latency and higher DDR speed.
  • I wanted to try my hand at overclocking, so I wanted a good CPU fan.  Zalman has a lot of great products, so I went with the CNPS9500, which looks cool too.  It's quiet and keeps the CPU ice cold.  It looks huge, but it fit fine.
  • I may have made a mistake on the case.  I went with an Aspire X-Navigator, which is cool looking and keeps everything cool inside but is loud.  I might next time research for a quieter case.
  • I splurged and went with dual SLI, because I love games, and bought two eVGA 7800GT SLI cards.  I never really understood the variations in their 7800GT cards - some variations of memory speed, I think.  The Nvidia SLI chipset right now blows anything else away - it is the ONLY choice for gaming.  A pair of GTXs would have cost me $400 more.  Again, I find the best price-value point a step or two below top-of-the-line.  I didn't realize until later that DirectX 10 will be a pretty substantial upgrade, which will require new chips to support it.  That means that if you are a gamer, you will probably want a new card in 12-18 months.  Knowing this, I certainly wouldn't pay for a GTX right now and might have gone with only one card rather than 2.
  • I bought a couple of 250GB Seagate SATA 3Gb/s hard drives and put them in a RAID 0 configuration.  This makes a 500GB hard drive that is fast as hell.  This is cheaper than buying a single 500GB drive and it is faster, but it will be less reliable, since data is "striped" across the two drives, so that if either fails, you lose ALL the data (see the reliability sketch just after this list).  Because of this issue, I bought a smaller 160GB drive that runs separately as a backup for my data.  By the way, this was the one issue I had with my install.  Basically I had to leave this 160GB drive unplugged until I got Windows installed on the RAID 0 drives and made them bootable, or else the system would get confused.  Once Windows was installed on the RAID drives and was bootable, I plugged in the third drive and partitioned it and all was well.
  • Power supplies seem to be a nightmare in terms of failure rates.  I use a 650-watt Silverstone Zeus; it has been fine and it had all the cables I needed.  Note you need at least 500 watts, and probably 600, if you are going dual SLI.
  • Other components include a fast NEC DVD read-write drive (whichever one was highest rated on Newegg), a floppy drive (you HAVE to have one to load the drivers for this self-build if you are using a RAID drive array), and a nifty little drive that accepts all kinds of memory cards on the front panel.  And Windows, of course.
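
As promised in the hard drive bullet above, here is a quick sketch of the RAID 0 reliability trade-off.  The per-drive failure rate is an assumed round number, not a Seagate spec; the point is that striping roughly doubles the chance of losing everything, which is why the separate backup drive is there.

    # RAID 0 reliability sketch -- assumed per-drive annual failure probability.
    P_DRIVE_FAILS = 0.03   # assumed chance a single drive dies in a given year

    p_single = P_DRIVE_FAILS
    p_raid0 = 1 - (1 - P_DRIVE_FAILS) ** 2   # either drive dying loses ALL the data

    print(f"Single drive: {p_single:.1%} chance of data loss per year")
    print(f"RAID 0 pair:  {p_raid0:.1%} chance -- nearly double")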

This article on the Corsair web site provides an outstanding walk-through of how to build and set up your PC, demonstrably sufficient for even the noob since it got me through it.  I actually found this after I bought my components, so I was happy to see that the component selection in the article for a high-performance gaming box was very similar to mine.  I also have the Logitech cordless keyboard and mouse shown and love those too.

Have fun.

Update:  In response to the question in the comments, this build cost about $2000, which is expensive for a desktop, except that I expect to get much longer life out of this thing, with performance that stays top notch for a while and many upgrade paths.  It might have been more, but several parts were on weekend sale at Newegg and others had cross-promotions (i.e. if you buy the AMD processor and the eVGA card you get an extra $30 off).  Also note that this is a very competitive system to gaming rigs (e.g. Alienware) costing over $4000.  You could take a few steps to bring this under $1500:  One 6800 GT rather than two 7800GT graphics cards would save almost $400.  One graphics card would let you save about $50 or more on the power supply, and you could easily get a good case for $50-$75 less.  Making these subs would get you a very, very good rig for under $1500.  Dropping down a notch on the CPU could save another $200.  Smaller hard drive capacity could save $100-150, though hard drives are so cheap, I think it is short-sighted not to overdo it a bit.  I still remember my first hard drive card for my original IBM PC.  It was 10 meg, and my thought was "I'll never be able to fill that much memory."  LOL.

The build time was probably 8 hours, including Windows installation and disk formatting.  This includes three false starts:  one, when I thought the power supply was bad but I had just forgotten to hook up the on/off signal wire; two, when the floppy drive actually was bad and I had to run to CompUSA to get a new one; and three, when I struggled, as mentioned above, to get Windows installed with the hard drive configuration I had chosen.  If everything had gone smoothly, I could easily have done it in 4-5 hours.

Did I mention I love this rig?  It's like the geek version of showing up to your high school reunion in a Ford GT.

Moore's Law Alive and Well

I just bought, or rather built, a new home computer.  My last computer, a Dell, was about 18 months old.  I really enjoyed building my own this time, and Newegg.com makes it pretty easy, and there are lots of articles out there to help.  As with my old Dell, I bought a processor that was a notch or two below the fastest currently available, which tends to be a sweet spot in price-value.  This time, though, I switched from Intel to the AMD dual core ("Toledo") at 2.2 GHz.  Basically both my computers are/were fast machines for their times, though since I built this new one myself, I did a few extras, like going with parallel SLI graphics cards, and I overclocked the whole rig about 10-15%.

The result?  My 18-month-old machine gets a 3DMark score of 250.  My new machine gets a score of over 13,000, or over 50 times better!  Granted, this is on a synthetic benchmark mainly aimed at measuring 3D graphics performance, but this is still a huge leap in performance in 1.5 years.  I have also played around with my hard drive selection and configuration to get a jump in performance there as well.

I spent a lot of time researching and picking out my components -- that is the real joy of a self-build, that you know the quality and trade-offs of every single subsystem in the box.  If anyone out there is interested, email me and I will tell you the exact components I chose and why, or maybe I will do a post on it sometime.

UPDATE:  Components posted here.

Channeling my Grandparents

You know how, when you grew up, your parents and grandparents always said stuff like "I remember when I was a kid, we didn't even have X," where X was airplanes, or TVs, or ice, or whatever.  I actually found myself having one of those moments in the OfficeMax store today.  I remember when I got my first hard drive for my IBM PC in the very early '80s.  It was 10MB, cost about $500, and my one thought at the time was "I'll never be able to fill up this thing."

Today at the office supply store, at the register, I made an impulse purchase of a new USB memory key.  My son stole mine to use to take stuff back and forth to school, and I wanted a larger capacity drive anyway.  So here I was buying a 1GB key, with 100x the storage of that first hard drive in a package about 1/100th its size, and I was buying it at the cash register from a rack next to the gum.  Pretty cool.

However, I am not going to let scientists totally off the hook.  I am still waiting for my hover car, my jet pack, and my vacation on the moon, which I expected to have long before now.