
January 2006

Nikon D2X lens mount and/or sensor misalignment

Today I had the opportunity to test raw-file conversion speed on a PowerMac Quad. Converting 59 D2x NEF files took 6:14 on the Quad, and 6:23 on a PowerMac G5 dual 2.5GHz machine, about a 2.4% difference. Both computers used the same Firewire drive for the test; neither machine used the CPUs more than about 150% (out of 200% and 400% max). Both the Quad and the dual 2.5GHz machine run at 2.5GHz, so we can conclude that at the same clock speed, the Quad might be slightly faster (probably due to its faster RAM).
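For readers who want to check the math, here is a quick sketch (in Python, purely illustrative) of how the roughly 2.4% figure follows from the two timings:

```python
# Hypothetical helper comparing the two batch-conversion times quoted above.
# Times are minutes:seconds for converting the same 59 D2X NEF files.

def to_seconds(mm_ss: str) -> int:
    minutes, seconds = mm_ss.split(":")
    return int(minutes) * 60 + int(seconds)

quad_time = to_seconds("6:14")      # PowerMac G5 Quad
dual_time = to_seconds("6:23")      # PowerMac G5 dual 2.5GHz

# How much longer the dual-CPU machine took, relative to the Quad's time.
difference = (dual_time - quad_time) / quad_time * 100
print(f"Quad: {quad_time}s, Dual: {dual_time}s, difference: {difference:.1f}%")
# -> about 2.4%
```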

A look at the CPU Monitor graph (in Activity Viewer) showed that Nikon Capture does a poor job of utilizing the Quad’s four cores. Utilization does improve somewhat when noise reduction is enabled, with higher utilization of all 4 cores about 1/3 of the time; it seems to be a staged process which is single-threaded for a good part of the time, which limits the performance improvement.

In this blog’s January 22 entry, I showed a problem with optical misalignment of my Nikon 17-35/f2.8 EDIF. It appears that I might have been mistaken, as I began to suspect during a comparison of several Nikon wide-angle zooms (part of a future diglloyd review). While examining frames from each lens, I observed pronounced blurriness on one side of the frame with my trusted 17-55/f2.8 DX EDIF. At first, I groaned inwardly, assuming that another Nikkor had gone whacko. But because the 17-55 had always been tack-sharp, the facts didn’t add up, and I determined to investigate.

The problem appears to be a lens mount and/or sensor misalignment issue. I determined this today after a careful test with four different lenses.

For a detailed write-up on this problem, please see Lens Mount Misalignment. Unfortunately, this means my Nikon D2X will need to go into Nikon service, a really unhappy situation after receiving a defective Nikon D200.

The PowerMac G5 Quad and power usage

Today I performed some raw conversions using Aperture 1.0.1 at a friend’s house, using a PowerMac G5 Quad with 4.5 GB of memory and a single (non-RAID) drive.  The results will become part of the Raw-file Converters article.  Without a doubt, Aperture has the most attractive user interface ever produced for a raw-file converter (though Adobe Lightroom, still in beta, might become a competitor in that regard).

Aperture on the Quad is notably faster than anything I’ve seen before, roughly 2-3 seconds to convert a raw D2X NEF or 1DsMII CR2 to a 16-bit TIF (no specific timings were done). Compare that to the 7.2 seconds for converting a NEF on a dual 2.5GHz PowerMac using Nikon Capture (see Nikon Capture—Speed and Stability).  Aperture does use all 4 CPU cores fully, albeit very briefly, which must contribute to the fast performance.  Benchmarks using a large number of  the same raw files would be needed to make a precise comparison to other machines and converters, but I have no doubt that Aperture would come out near the top of the heap, if not the very top.

I noticed that Aperture by itself was using about 800MB of real memory (mapped to about 1.2GB of virtual memory).  Any test that makes claims about Aperture’s processing speed is worthless unless it verifies there is adequate memory.  Otherwise, the test is just benchmarking virtual memory system performance (the disk), and the results are meaningless in any setup that isn’t an exact match (in multiple ways).

Given the massive memory requirements, laptop users shouldn’t even consider Aperture with less than 1.5GB of memory. But even that may not be enough if you want to run Photoshop or other programs at the same time.    Plan on getting 2GB for your laptop if you intend to run Aperture.

What a quiet, beautifully-built machine! No PC I’ve ever seen comes close to this combination of low noise and elegant design.  It is very difficult to make the Quad break a sweat—nothing we had on hand could blip CPU usage on all 4 cores for more than a moment.

If you’ve put off buying a Quad because an Intel-based version may be coming, wait no longer—the G5 Quad is the first computer I’ve used that makes any normal task appear effortless.   But plan on getting at least a 2-drive striped RAID array, because unless you can feed the Quad’s voracious appetite, the 4 cores will be underutilized.

Also, programs that are single-threaded use just one core (equivalent), so the Quad won’t be any faster than a single-core machine for such tasks.   Unfortunately, it takes extra coding and testing to make a program “threaded”, and some developers don’t do that work.   Still, it’s like having 4 computers in one if you’re running several single-threaded programs at once.

One of my concerns about the PowerMac Quad (until today) was its power usage.  The power cable is noticeably larger in diameter than the one for my PowerMac G5 dual 2.5GHz.  Its specifications indicate power usage up to 1200 watts—a deal-killer for me in terms of the noise, heat and cost.  Try finding a 1200 watt power supply at your local PC superstore—the maximum you’ll see advertised is usually 600 watts.  The Quad is apparently liquid-cooled, a nifty feature that costs considerably extra in most PCs.

Knowing that it was unlikely to actually consume 1200 watts, I called Apple business sales, and the sales representative  put me on hold to “check with an engineer” on the power usage issue.  The answer came back that the Quad uses 1000 watts in a 4GB configuration.

This still made no sense, so I gave an old friend at Apple a call who plugged a Quad into a power meter. His power meter indicated a usage of up to 500 watts while booting, but that settled down to 180 watts at idle.  I believe that was with 4GB memory and one hard disk.

Today, I took my Smart-UPS XL 1000VA over to a friend’s house (see Uninterruptible Power Supplies (UPS) for your computer).   We plugged in the Quad to the UPS, and its indicator lights showed that at idle the Quad consumes about 160-200 watts, and a bit more at about 100% CPU usage (400% being full use of all 4 cores on the Quad).

In short, the PowerMac G5 Quad, though sporting a massive power supply (presumably for power-hungry PCI-Express cards), actually uses about the same or perhaps even slightly less power than my older PowerMac G5 dual 2.5GHz (with single-core CPUs).

New front-page photo “Yosymmetry”

Sometimes an idea strikes me while working on an image...the front page photo is an image which was mirrored with itself to create a symmetrical whole.  It’s not my usual thing, but I rather like it, and so do people who’ve seen it in print. It was taken with a Hasselblad XPan II in Yosemite’s Tenaya Canyon on Fuji Velvia.

Apple’s Aperture coming soon to Raw-file converters article

A friend of mine just purchased Apple’s Aperture, and has graciously offered to let me process the six sample files (three from the Nikon D2x and three from the Canon EOS 1Ds Mark II) on his PowerMac.

Results will be added to the Raw-file Converters article, which already contains instructive comparisons of 8 raw-file converters for Nikon NEF and 8 for Canon CR2. That's a total of six files and 8 converters for 48 combinations with very generously-sized samples from each file and each converter (the article is an 82MB download).  You could spend hundreds of dollars and many hours of your time performing your own comparisons, or you can skip that tedium and get the most useful raw-file converter comparison available today (for those concerned with image quality).

Discount on Realviz Stitcher 5

If you enjoy stitching multiple images together (for a higher-resolution image or for Quicktime VR), Stitcher 5 from Realviz is one of the best products around.  It can handle flat stitching (from shift lenses) as well as assembling panoramas taken by rotating the camera.  There is an Express version and a full version, as well as a trial version. You might get a 10% discount by contacting Wolfgang Santner at Realviz sales (wolf@realvizusa.com) instead of going through the website.

Lots of Universal Binary apps for Intel Mac appearing

I have received confirmation from Zeiss that the new ZF lenses for Nikon will offer automatic aperture control. See the 18 January entry in this blog for details.   This means that when you set the lens to, say, f5.6, the aperture closes down only when a picture is taken. Canon users won’t benefit from this feature if they use Zeiss ZF lenses via an adapter.

Yesterday’s discussion of ECC memory brought to light issues with data integrity and a running computer. But what if the computer stops running suddenly because the power fails? What kind of data integrity or data loss can you expect?  That’s hard to say, but it’s a bad thing to lose power suddenly.  Not only is data at risk, but voltage spikes and swings can damage sensitive computer equipment.

Yesterday evening, after a calm and sunny day, the power at my home popped on and off at least 3 times before dying a 4th time for the better part of an hour.  Aside from regular beeps and fan noise from the UPS units, my computers kept running without a hitch, including the diglloyd.com webserver.  That’s because several weeks ago, a 24-hour power failure took down all my computers, including the diglloyd.com webserver, which prompted me to look into some serious battery backup, also known as an Uninterruptible Power Supply (UPS).

I now have about 250 pounds of glorified lead-acid batteries which allow the web server to run for 24 hours, and my power-hungry PowerMac G5 for about 3 hours.   UPS devices are also excellent power conditioners, much better than plain-vanilla surge suppressors.  I chose units from APC, which offers a full line of UPS units.   Let me share some thoughts on what I purchased, and why.

I first looked at the entry-level units such as the Back-UPS RS 1500VA, which offers an output power capacity of 865 watts.  However, with a 400 watt load (eg a PowerMac G5 with screen and drives), it offers only 13 minutes of juice.  By adding the optional battery pack, the runtime can be increased to 51 minutes.  For the web server (a laptop), DSL router and ethernet switch, only about 50 watts is needed, and that would yield about six hours of runtime.   Still, that’s pretty limited runtime, especially for any future needs.  And from past experience I knew that it’s better to buy more than “just enough”—or I’d end up buying something more capable in short order.

So I looked at units with more runtime, and finally settled on the Smart-UPS XL 1000VA, which offers 800 watts of power output.   I also added a Smart-UPS XL 24V Battery Pack and two Power Distribution Units (highly recommended; don’t plug a redundant surge suppressor into a UPS).  Together, those reasonably-sized and priced units provide 2.5-3 hours of runtime with a 400 watt load.  The nice thing about the Smart-UPS XL is its expandability—up to 4 of the regular battery packs may be attached, thereby extending the runtime to nearly 9 hours at 400 watts.    For truly substantial runtime, up to four of the 12.2 X 17.6 X 29.7-inch, 280-pound Smart-UPS 24V Ultra Battery Packs may be attached, for a runtime of 35 hours at 400 watts. Presumably those humongous batteries need to ship on a pallet via truck—and then you’d need a couple of guys to move just one of them, so I stuck with the “wimpy” ones.  I purchased everything from thenerds.net, and got prompt, reasonably priced shipping as well as very competitive prices.

Now here’s the catch—if you like quiet computer operation as I do, it turns out that the Smart-UPS XL line switches on a rather noisy fan when the power draw exceeds 57% of capacity, which my PowerMac G5 system just barely managed to do, drawing about 500 watts (with router, ethernet switch, etc).   So now I had excessive noise on my hands.  In the end I purchased another Smart-UPS XL 1000VA and battery, and attached the web server, DSL router and ethernet switch to that unit, and the main G5 system to the other unit, which kept both units comfortably under 400 watts.  Voilà—no fan noise.  This yielded a setup with nearly 24 hours of runtime for my power-frugal web server (a PowerBook G3 laptop), and 3 hours or so for my PowerMac G5 system.
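For the curious, here is a rough sketch of the load arithmetic as I understand it (the 57% trip point is taken from above; the individual wattages are illustrative guesses, not measurements):

```python
# Back-of-envelope check of the Smart-UPS XL fan behavior described above.
# Assumption: the fan switches on when load exceeds roughly 57% of the
# 800-watt output capacity; actual firmware behavior may differ.

CAPACITY_WATTS = 800
FAN_THRESHOLD = 0.57 * CAPACITY_WATTS   # about 456 watts

combined_load = 500   # G5 system + web server + router + switch on one unit
split_loads = {"G5 system": 380, "web server + network gear": 50}  # illustrative numbers

print(f"Fan threshold: about {FAN_THRESHOLD:.0f} W")
print(f"One unit at {combined_load} W -> fan on: {combined_load > FAN_THRESHOLD}")
for name, watts in split_loads.items():
    print(f"Two units, {name} at {watts} W -> fan on: {watts > FAN_THRESHOLD}")
```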

Many small developers have released (or promise to soon release) Universal Binary applications (for use with the newly-introduced Intel Core Duo iMac and MacBook).   The folks at macintouch.com are maintaining a list of vendors.

According to the macintouch.com list, Adobe Systems will not be releasing its Creative Suite CS2 product as a Universal Binary.  This suggests that users will have to pay for the privilege of getting a Universal Binary version by upgrading to the next version (when it becomes available).  Does this mean a new release is coming soon?

Also, Nikon, Canon, Bibble Labs, DXO Optics and Phase One are conspicuously absent from the list.  When will software vendors learn that proactive honesty about their plans is the best type of customer relations they can offer?

Error Correcting Code memory

An acquaintance of mine wrote to me:

“The fact that RAW conversion is done by the OS means that Apple will have to update the OS each time a new camera is released.”

This seems to be the prevailing belief online.   As a veteran 20-year software developer, I’m skeptical—a sufficiently incompetent implementation might require a system software update, but there’s no reason I know of to believe that this is actually the case.

Raw conversion is likely done in just 1 or 2 libraries.   Consider that QuickTime, Java, DVD Player, security updates, iTunes, etc have all been updated in the past month alone, all without a system software version change.  Assuming that Aperture can’t and won’t be handled the same way is unfounded.

Still, for the sake of argument, let’s assume that supporting new cameras in Aperture does require a system software update.  Apple has released 3 or 4 system software updates each year, so the interval at which updates to Aperture could be made is around 3-4 months—not great, but not a disaster either.

If you’re considering a raw converter, you’ll find very useful comparisons in my Raw-file Converters article.  Aperture is not included, because I’m not yet willing to spend $500 on a program just to test its raw-conversion quality.  Perhaps a demo version will emerge, or I’ll be able to do some test conversions on a machine that a friend or acquaintance has.

With the large amounts of RAM being stuffed into computers these days, one has to question whether to buy Error Correcting Code (ECC) RAM, because as the amount of RAM goes up, the chance for errors does also.  Whereas 1GB was a lot of RAM a few years ago, 8GB is not uncommon today, offering 8 times the chance of a memory failure.   Errors are caused by esoteric cosmic rays, which have the energy to twiddle memory bits.  Higher elevations are more prone to problems, so if you live in Denver, it’s more of a concern than in San Francisco (maybe as high as ten times more).  Flying in an airplane is considerably riskier.

Now that your computer has 8GB instead of 1GB, has the per-gigabyte chance of a random “soft” error at least decreased to compensate?  Unfortunately, it appears the opposite may be true.   Densities have increased to the point that a single “event” can flip more than one bit.   Even ECC memory can only detect multi-bit errors, not correct them—it’s reboot time in such a case.

At least one online source analyzes the likelihood of memory errors, though the analysis is faulty in several ways.  For example, it assumes that home users turn off their computers. If you’re a Mac user like me, you never turn off your computer, relying instead on “sleep” mode.  That means the computer is really running 24 hours a day every day of the year (24X7).  A 1998 EE Times article quotes research that claims as high as 4 soft errors per gigabyte of RAM per month—and of course that’s in lower density parts than today’s computers use.  Other sources claim higher and lower numbers.  Assuming 4 soft errors per gigabyte, a user with 8GB of RAM might experience 32 errors per month, or one per day on average.  Some of those errors are in places that matter, though most are likely harmless.
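To make that arithmetic explicit, here is a tiny sketch using the quoted 1998 figure, keeping in mind that the error rate itself is the shaky part:

```python
# Rough estimate of soft-error counts, assuming the quoted rate of
# 4 soft errors per gigabyte of RAM per month for a machine running 24x7.
# The rate is the uncertain assumption here, not the arithmetic.

ERRORS_PER_GB_PER_MONTH = 4

def soft_error_estimate(ram_gb):
    per_month = ERRORS_PER_GB_PER_MONTH * ram_gb
    per_day = per_month / 30
    return per_month, per_day

for ram in (1, 8):
    per_month, per_day = soft_error_estimate(ram)
    print(f"{ram} GB RAM: ~{per_month} errors/month, ~{per_day:.1f} per day")
# 8 GB works out to about 32 errors per month, roughly one per day.
```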

High-end PC motherboards have offered ECC RAM for some time.  Apple’s new PowerMac G5 models, which can accept up to 16GB of RAM, now also offer ECC memory, which uses additional chips on the memory module to detect and correct single-bit errors, and to detect (but not correct) multi-bit errors.   Apple’s XServe G5 has offered ECC memory for some time, and there is a nice tech note showing the user interface that allows monitoring the memory status.   It’s not clear if any equivalent tool exists in regular MacOS X (as opposed to MacOS X Server).

Several approaches come to mind:

1. Buy the cheapest memory you can find, and hope that if the system crashes it isn’t the memory.  Cheap memory is more prone to “hard” errors (manufacturing defects), and certainly no less prone to “soft” (cosmic-ray induced) errors.  Of course, there’s no way to tell if errors are occurring.

2.  Buy high-quality memory, which should eliminate “hard” errors, and just assume that the chance of a soft error is small enough to be acceptable.  This is a reasonable strategy, but there’s always that nagging doubt about memory if your machine freezes once in a while.  And you never do know if your data has been subtly altered.

3.  Buy ECC memory and chances are you’ll never experience a memory-related problem.  If you do experience a crash or freeze, the MacOS system log is supposed to record a memory failure.

Of course, rebooting starts you off with a fresh copy of everything, effectively erasing all soft errors that have occurred. So maybe rebooting is not such a bad idea, once a month or so.   Then again, my Powerbook G3 has run as a mail and web server for four (4) years with nary a crash or freeze, but it only has 384MB of RAM, doesn’t access much data, and I’ve rebooted it a number of times after software updates.

I’ve come to value reliability more and more, and ECC memory is likely what I’ll go with in my next computer.   The downsides include cost, which will be at least 25% higher.  For example, Apple charges 50% more for ECC memory.  Companies like satech.com offer ECC memory for only 25% more.  Another claimed cost is a performance hit, but this appears to no longer be true, at least on PowerMacs according to the testers at barefeats.com.

Nikon 17-35/f2.8D AF-S EDIF optical misalignment

I bought one of the very first copies of the Nikon 17-35 when it was released about 6 years ago. Out of the box, it was obviously blurry even through the camera viewfinder.  Pictures at f8 were fuzzy.  Disappointed, I sent the lens into Nikon for service, as there were simply no replacements to be had.  Upon its return, it was a stunning performer.  At the time, it was almost certainly the best wide angle zoom ever made, and superior to even the Nikon prime lenses in its zoom range.  High sharpness, high contrast, low flare and superb color saturation are its hallmarks.  So too, apparently, is its propensity to become badly misaligned optically—

It served me well for about a year, until the following August when I noticed, again through the viewfinder, that one side of the frame was blurry.  Even at f11 the effect could be seen in the resulting images, and many frames were impacted (when you’re on a backpacking trip, camera stores are few and far between).  Again I had the lens serviced, and again it returned in fine shape. 

In no case was the lens banged or mistreated in any way.  I am at a complete loss as to how this optical misalignment materializes.

In October 2005 I used the lens while preparing the D2X vs EOS review, only to discover that the lens was again blurry on one side.  I sent the lens in for service, and this time, upon its return, it literally sat in a drawer until yesterday, when I finally found the time to verify its optical quality.  (It went unused for that time because I favor the 17-55/f2.8D DX for most work.)

Nikon did not fix the problem.  The blurring is so obvious that it’s hard to see how they could have returned the lens in this condition.  But perhaps it is not too surprising given the 4 trips my Nikon 12-24/f4 DX made to service, each time receiving a defective replacement.

The frame below was taken with mirror lockup on a tripod, 1/1600 second @ f2.8.  A portion of the bottom of the frame was removed to reduce file size.

Nikon 17-35/f2.8D AF-S EDIF @ f2.8, about 25mm

Below are crops showing the issue.  The blurriness is not confined to the far edge; it is well into the frame.  To see a 4288 X 300 pixel crop, click here.

Nikon 17-35/f2.8D AF-S EDIF @ f2.8, about 25mm
actual pixels
Crop from mid-left
Crop from mid-right

So it’s back to Nikon once more for the 17-35.

I often see comments online about how this or that lens is “not sharp”, “soft unless stopped well down”, etc.   The truth is that modern pro zooms from either Nikon or Canon are outstanding, provided that they are not optically out of whack.  Some lens-sharpness claims are made by incompetent users, but many are no doubt due to misalignment issues as demonstrated above. 

Assuming a lens is optically good because it is brand-new is not a good assumption. My opinion (based on personal experience with new zoom lenses) is that neither Nikon nor Canon spends much on quality control, and that their manufacturing tolerances are too loose to consistently produce zoom lenses that perform to their as-designed optical potential.

Be cautious buying a pre-owned “new” lens; the owner might be selling it because it’s a poor optical performer.  The good news is that in most cases optical problems can be fixed by the manufacturer (again from personal experience).  If you’re in the USA, be sure you buy a USA-warranty lens; buying a gray-market lens is sure to cause headaches if repair or servicing is needed, which Nikon or Canon USA might refuse to do for a non-USA lens.

RSS feed now available

Updates to the diglloyd web site are now posted in an RSS feed.  MacOS Safari users can simply click on the RSS icon in the address bar.  Other browsers may require an external RSS feed program.

The diglloyd blog (what you’re reading now) will continue in its present form; the RSS feed will nearly always reference the blog.

Color Temperature and Noise with Digital Cameras

If you’re in the habit of using warming and/or cooling filters with your digital camera, you’ll want to read my new article Color Temperature and Noise with Digital Cameras. Enjoy.

PowerMac wish list

Now that Apple has started the ball rolling on Intel-based Macs, one can only hope that in revising the existing PowerMac, more attention will be paid to the truly useful features for digital photographers.  Please, Mr. Jobs, make sure the desktops include the following features:

  • It’s about time for a built-in digital-camera memory card reader.  Impress us with dual slots, so we can download two cards at once.
  • 12 memory slots instead of 8. We don’t like paying twice the price per megabyte for higher-density DIMMs.  Give us four more slots.
  • Room inside the case for 4 SATA drives, and preferably 6.  That way, we can make a reasonably-fast striped RAID without having to add an external box.  Add motherboard support for those drives, too.
  • Motherboard support for eSATA with port-multiplier capability, so we can just plug in an external box.  Two ports, please (each of which supports many drives).
  • Another PCI Express slot or two.  Lots of users fill up all the available slots.
  • Keep Firewire 800, and provide two ports, not just one.  While you’re at it, fix the outrageous Firewire 800 sustained-write bug (described in various places on barefeats.com).  This bug causes PowerMac G5 performance to be 1/2 that of a PowerMac G4 when writing to Firewire 800 volumes.

Now, in addition to the above, give us long-deprived Mac users the opportunity for some really fast performance.  Offer the very fastest Intel chips with the big caches.  Offer dual and quad-chip versions, so that we can have 2, 4 or 8 cores (eg the PowerMac OctaCore).   But also spend some marketing dollars rewarding software vendors that design their products to actually exploit the potential of such machines.

Finally, no machine is worth the trouble if it isn’t absolutely rock-solid with great software.  Please stop uglifying the various parts of MacOS. Delete the useless Dashboard project, and spend that money on best-in-class stability and performance.

Nikon D200 review progresses

Noise, dynamic range and resolution are being explored.  So far, my Nikon D200 has not shown any of the striped noise pattern seen in some online posts.

The new Intel-based Macs

Apple yesterday announced new iMacs and “MacBook” computers using Intel’s power-efficient Intel® Core™ processors (code name “Yonah”).  These dual-core chips essentially put two processing units on one chip, as do the current PowerMac G5 models (which don’t yet have Intel-based replacements).  This arrangement allegedly gives tremendous computing power at little additional cost, and lower power consumption.

The promise of a 4X faster laptop (according to Apple) is certainly a tremendous advancement for those who need to work with images in the field.  However, the sluggish performance of the old PowerPC-based G4 laptops means that this 4X increase will just bring things up to the performance of a mid-range PowerMac G5.  Or will it?

Apple’s speed claims are based on industry standard “SPEC” benchmarks that have only modest correlation to real-world performance with real applications, especially with the snail-like hard drives found in today’s laptops (not an Apple-specific issue).  The fact that Apple is now quoting SPEC numbers is rather odd; performance figures in the past were quoted for real programs, like Photoshop, or Final Cut Pro (see the PowerMac G5 performance claims).   The SPEC numbers are almost certainly more flattering than real-world numbers, and thus make for far better marketing hype.

Furthermore, it’s not clear whether half of the 4X factor is due to the dual-core chip (eg each core 2X as fast and with 2 cores = 4X). If so, then the maximum improvement for many applications will be just 2X, not 4X, since very few applications can actually make full use of two threads (cores), and even those that can do so rarely use both cores fully for more than a brief time.  

Even applications such as Adobe Photoshop only briefly fully utilize both CPUs on the PowerMac dual 2.5 GHz desktop—and that’s with 7GB RAM and a 4-way SATA striped RAID where disk I/O speed is sustained at over 220MB/sec.  Most of the time CPU usage does not exceed 125% (200% being both CPUs at full use).  As shown in the Nikon Capture article, average CPU use by Nikon Capture when batch-processing raw files is somewhere around 125%.

For non-threaded applications (those that will use just one core), a real-world 1.5X performance improvement is likely, with a reasonable possibility of 2X improvement.  Much higher bandwidth memory and larger on-chip caches in the new MacBooks should increase the odds for 2X. I hope to be proven wrong, and see the full 4X improvement—but don’t buy a MacBook based on 4X hype just yet.
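The reasoning here is essentially Amdahl’s law: the overall speedup depends on how much of the work can actually use the second core. A minimal sketch, with made-up fractions purely for illustration:

```python
# Amdahl's-law style estimate of overall speedup on a dual-core machine.
# parallel_fraction is the share of the work that can use both cores;
# per_core_speedup is how much faster each new core is than the old chip.
# Both numbers below are illustrative assumptions, not measurements.

def overall_speedup(parallel_fraction, per_core_speedup, cores=2):
    serial = 1.0 - parallel_fraction
    # Time on the new machine relative to the old one:
    new_time = (serial + parallel_fraction / cores) / per_core_speedup
    return 1.0 / new_time

# Fully threaded workload, each core 2x faster: the marketing-friendly 4x.
print(f"fully threaded:  {overall_speedup(1.0, 2.0):.1f}x")
# Single-threaded workload, each core 2x faster: only 2x at best.
print(f"single-threaded: {overall_speedup(0.0, 2.0):.1f}x")
# A more typical app that is threaded perhaps a third of the time.
print(f"1/3 threaded:    {overall_speedup(0.33, 2.0):.1f}x")
```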

The foregoing all comes with a huge “gotcha”: Until you update your software to a “native” or “fat” version which is compiled for the Intel chip, it will run in emulation (eg relatively slowly). This is the reality today for Photoshop, all existing raw converters, and virtually all non-Apple software.  Expect to spend some time, trouble and money to get versions which run natively on the Intel-based machines.  Some software vendors will see this as a terrific money-making opportunity, and thus will be on it like flies on a fresh carcass.  They will trumpet the 3 or 4 pitiful features they’ve added in order to make you pay for the native version.  Seeking a return on investment is not greed, and is perfectly reasonable, but keep an eye on this phenomenon and observe—you can learn a lot just by looking.

Finally, all is not good news.  Firewire 800 has been eliminated in the new MacBooks.  Those using Firewire 800 drives externally will have to obtain a Firewire 800-to-400 cable, and accept slower transfer speeds.

In the long run, this transition offers huge promise, and Apple should be applauded for undertaking it, a transition which few companies are skillful enough to manage, but one which Apple will handle with agility and quality.  I look forward to future desktop Intel Macs.  A dual dual-core 3.8 GHz box will do just  fine.

Nikon D200 has arrived

My Nikon D200 is now in hand. A thorough comparative review will begin soon—I believe that standalone reviews have considerably less value, since performance relative to the competition makes for an informed purchasing decision, and allows additional insights into camera performance.

The intent at this point is primarily a comparison with its big brother, the Nikon D2X, but some comparisons with the Canon EOS 5D are likely too.  If you have particular concerns about the Nikon D200 that you would like to see addressed, please email me.  As the review progresses, various tidbits may appear here in this blog.

In keeping with my philosophy of providing excellent value, purchasers of the D2X vs EOS article will receive some sort of discount, amount to be determined (but not for use in combination with any other discount).

Composite images by stitching with shift lenses

If you’re shooting digitally, sooner or later (if not already), you’ve found that you need more and faster storage every year.  You want reliability and performance at a fair price, but don’t understand the options out there (SATA, eSATA, Firewire?), or how much you really need to pay and where the good deals are.  That’s what Rob-ART Morgan’s barefeats.com is all about.  While the site is mostly Mac-oriented, Windows products are also included, and RAID systems usually offer similar performance on both platforms.  Check out Rob’s terrific web site whenever you’re in the market for high-capacity or fast storage, graphics cards, flash drives, etc.  I have followed barefeats.com for several years, and Rob-ART offers information you can rely on.

See the home page.  Rabid bats flying about at 3pm in 100+ degree heat in Death Valley’s Cottonwood Canyon did not deter me from making this image.  Can you see the figure?  Some people see it immediately, and some don’t.  Once you see it, you cannot “not see it”.

Many readers have written to me to express their satisfaction at what they learned from the Raw-file Converters article.   You can spend hundreds of dollars buying and downloading the various raw converters, and several days evaluating them on your own.  But why not spend a fraction of the money and time and simply order your copy today—it’s also the least expensive diglloyd review, and you can see for yourself why diglloyd reviews are worth the money.   If you’re in the market for a Nikon D2X, Canon EOS 5D or Canon EOS 1Ds Mark II, you’ll also want to check out the D2X vs EOS article—my most comprehensive article ever.

If you’re reading this blog, please email me a quick note. Comments or suggestions are welcome (on this blog, or anything else).   Finally, diglloyd.com is a relatively new web site, and needs to grow its reader base—thank you for letting your friends and associates know about this site.

The long-promised article on creating high-resolution composite images using shift lenses is underway.  This article will cover background material on shift lenses and how they work, focal length and field of view, parallax, equipment considerations, optical considerations, operating procedure, assembly of the composite image, gotchas, tips and techniques and more!  An overview of all the available shift lenses for Nikon and Canon will also be included.    And as usual with diglloyd articles, plenty of examples will be included.

The article will take at least another month to complete, and its preparation may be interrupted by a review of the Nikon D200, but a number of readers are eager to see it completed soon—it will be.

Intel-based Macs not so fast (at least not yet)

I was perusing some Photo CD scans of old 35mm slides yesterday.  How good is digital today?  In one word: outstanding.  Of course, good drum scans of 35mm slides beat the pants off Photo CD, but at $30 or more a slide, it’s hardly economical.  Even at $1.00/slide (10 years ago), my Photo CD scans were still too expensive (considering the miserable scan quality) compared to what I’m getting today from the Nikon D2X and Canon EOS 1Ds Mark II.

With slides, it is easier to overlook a slightly misfocused image—at least a 4X loupe is required, 10X for critical viewing.  Film grain is much stronger than the piddly amount of noise from top-end digital SLRs. Film contrast and dynamic range are a delicate balancing act.  Film color rendition, while pleasing, is not exactly accurate for many subjects.   Filtration is mandatory for accurate color balance with film.

With digital, noise is low, contrast and dynamic range are excellent, color rendition is superb.  Digital is also the ultimate critic and teacher: weaknesses in our photographic skills are ruthlessly revealed as soon as an image pops up on screen: we can see that focus wasn’t quite right, and that f11 was required, not f5.6.  In some cases, we can see that the lens needs optical adjustment!  Those are just the technical factors; one still has to succeed at composition.

In short, 35mm film had its appeal, but I won’t miss it.

I called Steve Jobs’ performance claims for the new Intel-based Macs “less than honest” in my January 14 blog entry.  Was that an ignorant exaggeration?  Hardly.  Twenty years of professional software development, including disk driver software, benchmark software and compression software (with 3 patents) give me plenty of first-hand experience to draw on—and the Apple hype didn’t smell right to me then or now.

As suggested in the January 11 entry, performance improvements of the new Intel-based Macs might be modest even when native code is run, and quite poor when Rosetta is involved (PowerPC-based apps).   Macworld Labs test results should discourage any photographer looking to buy an Intel-based Mac.  While benchmarks with Photoshop and the various raw converters have yet to emerge, it’s a good bet that those programs will run like molasses.  Plan to wait until Photoshop and your favorite raw converter(s) are released in Intel-based versions.

That’s the bad news.  The good news is that the dual-core Intel chip appears to be no slower than the G5, and can be quite a bit faster when native code is involved.  Six months from now, the picture will be considerably improved as many, if not most, popular applications are released as “fat” (Universal Binary) applications.

Apple did the smart thing by introducing the Intel-based iMac first: the customer base for that machine won’t mind (or even notice) the performance hit, nor will it be as upset over the exaggerated performance claims.  But photographers accustomed to PowerMac G5 performance expect some real gusto under the hood.   By waiting to introduce Intel-based PowerMacs, Apple has shrewdly given both itself and software vendors time to prep for beefier machines to follow.

For a perspective on the Intel Core Duo iMac, see the article at macintouch.com.

Zeiss lenses for Nikon “F” mount announced

Zeiss yesterday announced the first two of a series of lenses for the Nikon F mount, the Planar T* 50mm/f1.4 and the Planar T* 85mm/f1.4.  The two “ZF” series lenses should be available in early summer 2006, with more to be announced later in 2006.   Zeiss promises high mechanical, optical and esthetic quality.  Further comments are available in the January Camera Lens News.   Key points/claims include (1) high esthetic quality, (2) made in Japan to Zeiss standards, (3) ambiguous statements about price, (4) optically-identical M42 screw-mount “ZS” versions.

The lenses cover full-frame, making them usable on a variety of other cameras besides Nikon.  Prospective Canon users might be better off purchasing the Nikon-mount version, and using an adapter on EOS bodies, thus covering the two major platforms.  (M42 screw-mount lenses are not adaptable to Nikon bodies, unless a quality-degrading intermediary optic is employed.)

One key point left unanswered is whether the lenses have an automatic diaphragm: they do have a manual aperture ring, but Zeiss does not indicate whether the diaphragm stays open at maximum aperture until the picture is taken, as it does with every Nikkor made within the last 30 years.   If setting the lens to f5.6 means that the lens stays stopped down to f5.6, the lenses will be impractical for many types of photography.

While the 50mm is an excellent focal length for a 2/3-frame DX sensor, such as that found in the Nikon D2X, D200, D70, etc, the 85mm is a far less useful focal length on a DX sensor, having a field of view equal to a 128mm lens on a full-frame camera.  Presumably that lens is targeted at full-frame cameras, such as the Canon EOS 1Ds Mark II and 5D.

In other words, Zeiss announced a lens (50mm) with two excellent Nikon alternatives (50mm/f1.4D and NOCT-Nikkor 58mm/f1.2), and another that’s not particularly useful, and again with an excellent Nikon alternative (85mm/f1.4D).  Then again, perhaps Zeiss glass really is better than Nikon or Canon glass—we shall see.

What should Zeiss have announced?

What Nikon doesn’t offer: top-quality, fast wide-angle lenses: a 24mm/f1.4, a 28mm/f1.2, a 35mm/f1.2 plus 24mm, 28mm, 35mm and 50mm shift lenses.  And perhaps an outstanding wide-angle zoom, though Nikon already offers some really great glass in that area.

California’s Muir Woods National Monument

On a pleasant January day, diglloyd and offspring explored the delightful Muir Woods, about a 25 minute drive north of San Francisco, California.  Presence of the latter disallowed any serious photography, but a few fun infrared snapshots resulted nonetheless (see below).

While visiting the San Francisco area, Muir Woods is a great place to relax, smell the fresh air, and at this time of year, look for endangered spawning Coho salmon in the creek (and Steelhead trout as well).  This year, unlike your author’s visit four years ago, there are few salmon (so far at least).  The best time to visit is early morning or evening on weekdays, when visitors are scarce, thus avoiding the clueless variety who feel that a prohibited cigarette enhances the wonderfully fresh air.

If you’re looking for an IR camera, my Nikon D70 is for sale for $850.  The camera works great—save yourself the hassle of sending yours in and waiting for it.  It has a hardened filter installed, which may be cleaned (unlike some conversions which use soft filters).  I will be converting a Nikon D200 to IR use and so won’t need the D70 IR much longer.

Infrared with the human body

Shooting with an infrared digital camera reveals whole new tonal relationships which are often quite beautiful.  Diglloyd intends to explore infrared shooting considerably more, preferably with a true monochrome camera (when one emerges, as discussed in yesterday’s blog entry).

An interesting application of infrared photography is its ability to “see” a short distance beneath the skin.  Hold a bright flashlight against one’s hand, and you’ll see that intense red is visible on the other side—not infrared, but infrared no doubt penetrates even more deeply. Portraits taken in infrared, particularly of elderly subjects, reveal a face that looks far more youthful and sometimes rather different—the mask of age is removed and a gentle softness is typically imparted.  

Young children have relatively transparent skin.  The image below is the green channel of an RGB image, manipulated to emphasize the veins:

Diglloyd’s Nikon D70 IR camera is for sale for $850. It will be replaced with an infrared D200 when one can be found and suitably modified.

Comments on the newly-announced Hasselblad H2D

Yesterday, Hasselblad/Imacon announced three variants of its 39-megapixel back: the CFH-39 digital back for the H1/H2, the standalone CF-39 and CF-39MS digital backs for other platforms, and the H2D-39 fully-integrated camera system.  (The “MS” variant is a multi-shot back for studio work.)  See this URL for a “docs” folder: http://www.imaconusamarketing.com/docs/

Sensor size

The sensor used is 36.7 X 49.0mm in size, thus making it 2.08 times as large as a full-frame “35mm” sensor.  But full-frame 645 film (56 X 42mm) is still 31% larger than this sensor, so the sensor is somewhat smaller than full-frame medium-format film.  The “field of view crop” factor is thus 1.14, making the H System’s widest lens, the HC-35/f3.5, equivalent to a 40mm lens, or roughly equivalent to a 26mm lens for a 35mm camera.  Hasselblad needs to introduce something wider to account for this—anyone coming from a full-frame digital SLR is going to miss that 17-25mm range.
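Here is a quick sketch of the geometry behind those numbers (sensor dimensions from the announcement; the 35mm and 645 frame sizes are the usual nominal figures):

```python
# Area and "field of view crop" arithmetic for the 39-megapixel back.
from math import hypot

def area(w, h):      # mm^2
    return w * h

def diagonal(w, h):  # mm
    return hypot(w, h)

back = (49.0, 36.7)      # Hasselblad CFH-39 sensor
ff35 = (36.0, 24.0)      # full-frame "35mm" sensor
film645 = (56.0, 42.0)   # full-frame 645 film, as quoted above

print(f"vs 35mm full frame: {area(*back) / area(*ff35):.2f}x the area")       # ~2.08x
print(f"645 film vs back:   {area(*film645) / area(*back) - 1:.0%} larger")   # ~31%
crop = diagonal(*film645) / diagonal(*back)
print(f"crop factor vs 645: {crop:.2f}")                                      # ~1.14
print(f"HC 35mm on this back frames like a {35 * crop:.0f}mm lens on 645 film")
```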

Hits

A very interesting aspect of this new announcement is Hasselblad’s claim that their new “Digital APO Correction” (DAC) occurs in-camera (!), the implication being that once the raw file is created, all correctable color aberrations have been eliminated.  The statement is that each H System lens has been carefully mapped to allow this computation to occur in-camera (back).  Kudos to Hasselblad for implementing such a ground-breaking feature.   It remains to be seen how effective this feature is, but based on Nikon Capture’s performance, and the mathematical basis for it, I suspect it might be very effective indeed.   It also remains to be seen whether Hasselblad extends this feature to its line of C-type lenses, which can be mounted on the H2 via an adapter.

Hasselblad also claims groundbreaking color management via the use of its new 3FR raw format and “Hasselblad RBG” [sic] color space.  This, like the DAC feature, remains to be proven, but it sounds promising.  Hasselblad also claims “Color Definition” of 16 bits.  Whether the back actually produces 16-bit tonality, or whether it’s just scaling 12 or 14-bit output into 16 bits of storage, remains to be seen.  The 16-bit issue really does matter for fine-art photography, where very dark tones can become posterized.

Also of note, and also worthy of praise, is that the new back supports lossless compression in the 3FR format, averaging (according to Hasselblad) a 36% reduction in stored file size.   The value of such a feature is not to be underestimated when shooting in the field—an uncompressed 39-megapixel image is 78 megabytes!  The compressed image is thus about 50MB—still huge, but small enough to allow storage of 56% more images in the same space, a substantial improvement. A 4GB card would be able to store about 76 fifty-megabyte images (3800MB used for the calculation; 4GB cards really offer about 3.8GB).  The 3FR format is apparently proprietary, but can later be converted to Adobe’s DNG format.  Diglloyd will gladly take the reduced file size in lieu of DNG.
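The storage arithmetic is easy to check; a small sketch under the same assumptions (16 bits per photosite, 36% average compression, roughly 3800MB usable on a “4GB” card):

```python
# File-size and card-capacity arithmetic for the 39-megapixel back.

MEGAPIXELS = 39
BYTES_PER_PIXEL = 2            # 16-bit values, one per photosite
COMPRESSION = 0.36             # average reduction claimed by Hasselblad
CARD_MB = 3800                 # assumed usable space on a nominal "4GB" card

uncompressed_mb = MEGAPIXELS * BYTES_PER_PIXEL          # ~78 MB
compressed_mb = uncompressed_mb * (1 - COMPRESSION)     # ~50 MB
extra_images = 1 / (1 - COMPRESSION) - 1                # ~56% more shots per card

print(f"uncompressed: {uncompressed_mb} MB, compressed: {compressed_mb:.0f} MB")
print(f"extra images in the same space: {extra_images:.0%}")
print(f"images per card: {CARD_MB // compressed_mb:.0f}")   # ~76
```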

Misses

There is apparently no downsized capture mode where the 39-megapixel capture could be downsampled or cropped in-camera.  Such a feature would greatly extend the flexibility of the camera by offering smaller files for those jobs that don’t need the ultimate resolution, as well as improved speed and/or field-of-view crop a la Nikon D2x.

The LCD screen on the back is comparatively tiny, being only 2.2 inches. Compare that to the 2.5 inch screen on the Canon EOS 5D and Nikon D2X.  A back this expensive deserves a 3.5-inch screen.  However, the screen is an OLED display, which may be more usable in bright sunlight.

Price.  Diglloyd won’t be able to afford the new H2D anytime soon.  Perhaps the price will drop substantially in a year or two, as competition intensifies and sensor prices drop.

Infrared-shooting fans will be disappointed to hear that the IR-blocking filter is integral to the CCD, thus making IR photography a non-starter. Point of interest: diglloyd has since learned that a competing product, the PhaseOne P45 back, based on the same Kodak sensor, is available with or without an IR filter.  Whether the H2D variants are also available this way is unclear.

Some Universal Binary apps emerging

I picked up the Nikon D200 today intending to do some shooting.  Mounting the optically superb Nikon 50mm/f1.4D on it, I noticed that the diaphragm was stuck fully stopped down.  Switching to an AF-S lens, the same problem occurred.  Pressing and releasing the stop-down lever opened the lens up, but within a few frames, the diaphragm stuck at minimum aperture again (f16 or f22), even if the frame was shot at f2 or f2.8.  When the stop-down lever is pressed and released, “Err” briefly flashes inside the viewfinder, just before the diaphragm opens up.

Unless there is some user-end cure for this problem, the D200 will have to go in for service, delaying the D200 review for an unknown period of time. Work will resume on the Stitching with Shift Lenses article in the meantime.

While on the subject of D200 problems, I've also noticed some “backfocus” behavior (the camera focuses behind the intended subject).  While “pilot error” is a possibility (however unlikely), I’ve seen this sort of problem before with a D2X, which had its focusing module replaced.  So maybe service on this brand-new D200 is not such a bad idea.

Given the semi-ambiguous coverage of the D2X/D200 focus sensors, controlled testing is required to say for certain if it’s a camera fault, though it should be noted that such issues have never occurred in my use of a Canon EOS 1Ds Mark II, which seems to have more pinpoint focus sensors.  

Focus problems are one reason all results are double-checked while preparing reviews, so that information presented in a diglloyd review is based on examples that are consistent with other, redundant frames of the same test.  There is far more shooting going on to prepare a review than is actually presented.

A reader pointed out that there is at least one semi-photo-related Universal Binary MacOS application out there now (a “fat” app containing both PowerPC and Intel code).  It is the utility program Renamer.  I don’t expect that applications like this will yield much benefit from being native code, since most of their activities are limited by the speed of the MacOS file system, which is already native code and which is in turn limited by the speed of the hard disk.

Additional internal drives in PowerMac dual-core desktops

Barefeats.com has a note on revised internal drive kits to fit the new dual and quad-core PowerMacs.  I’m hesitant to use such a kit, because the more heat inside your PowerMac, the louder the fans will be.  If you enjoy a quiet computing experience (and the PowerMacs are very quiet compared to most PCs), think carefully before stuffing more heat-producing drives into your PowerMac.

More on Intel Macs and Rosetta

I see a MacBook in my future for field work. But until there are more than zero (0) non-Apple photo-related applications that are Intel-native code, users are unlikely to see any performance improvement, and perhaps even poorer performance than on a Powerbook G4.  Have patience—wait for native Intel apps to emerge unless you need a notebook right now—in which case the Intel version is the way to go.

A reader writes:

Will the new software (that runs natively on Intel) still run on my PowerPC chipset?

It’s a good question.

The short answer is that this time around is no different than when Apple switched to PowerPC from the Motorola 68030/68040 chips. Vendors shipped “fat” binaries for a long time, and nobody worried about it—except perhaps for performance issues.  Gradually, applications became PowerPC-only, but that process took 2-4 years.

When a “fat” application is started, MacOS simply picks the appropriate PowerPC or Intel code to run. This should be completely transparent to the user. In theory, if Apple ported MacOS to Sun Sparc processors, a “fat” app could contain three binaries: code for PowerPC chips, code for Intel chips and code for Sparc chips (potato chips?).  I’m not predicting such a state of affairs; it is used for illustrative purposes only.

It is likely that most vendors will ship “fat” (dual-binary aka “Universal Binary”) applications, so that if you purchase new/updated software, you’ll be able to use it on either PowerPC or Intel boxes. A many-millions-strong PowerPC customer base exists, and vendors aren’t about to abandon that base.  It would be financial suicide to produce Intel-only software (except perhaps software that requires very specific high-end hardware yet to be revealed).

Let’s not forget that PowerMac towers using Intel chips are not yet available, and might not be available until the end of the year.  Intel chips or not, Apple will have to work hard to beat the performance of the PowerMac Quad, which already compares favorably to high-end PCs at photo-related tasks.   Perhaps we’ll see a 4-processor dual-core Powermac (8 cores)? We can only hope so.

In other words, no need to worry.  If you own, or are planning to own an Intel-based Mac, your immediate concern should be whether you can get any Intel-native apps, not whether your PowerPC apps will run!

Whither the digital monochrome camera?


The crop above is an actual pixels crop taken from a six-megapixel Kodak DCS 760m image (“m” for monochrome).  Move your mouse pointer over the image to see a version sharpened at {100, 0.3, 0} in Photoshop, and then move it off the image to see the unsharpened version (give it a moment to appear).  Note the “cleanliness” of the image.  No speckled aliasing and no weirdo patterns.  Just clean pixels.

The monochrome DCS 760m digital camera (circa 2002) has no color filter array as found in today’s color digital cameras. Because there is no light loss from these miniature color filters, its base ISO is 400—and that’s from a 3-4 year old camera.  There is no funky color aliasing, because there is no interpolation of color, just pure monochrome values. Color digital cameras are monochrome too (sort of), but each photosite receives red, green or blue light which later requires interpolation (smart guessing) as to the actual color.   If you ever put a colored filter on your digital camera, you can see how much it reduces the exposure.

Kodak only made about 80 or so DCS 760m cameras, most of them allegedly going to government agencies, and they still cost about $8000 used (though finding one is next to impossible).  The camera had its issues, but when operating perfectly, image quality is beautiful.  One can only imagine what image quality would be possible today, should Nikon decide to create a D2Xm, or Canon an EOS 1Ds Mark IIm.  For more insight into the DCS 760m, please see this article by fine-art photographer Pete Meyers.

I love color, but well-done black and white imagery is gorgeous too. And unlike conventional photographic film,  a monochrome digital camera allows full-spectrum capture, so that infrared, ultraviolet and visible light can all be captured in combination, or separately, using appropriate filtration. Witness the multiple web sites offering to convert color digital cameras to infrared as evidence that the market is sorely lacking a monochrome offering.  Still, such converted cameras cannot equal the image quality that a true monochrome camera would offer; even in a converted camera, the Bayer pattern color filters are still present, and still do their damage to dynamic range, sharpness, aliasing, etc.   These conversion services are also not cheap ($300-500).  That kind of premium over and above a color camera ought to provide an alluring profit margin for a major manufacturer.

Monochrome sensors should be much cheaper to manufacture than color ones (with sufficient production volume—therein lies the problem).  They require no interpolation as do Bayer Pattern sensors, so the camera “brains” can be simplified.  Because they have no color filter array over the photosites, they also offer about two stops more sensitivity than a color camera at the same level of digital noise. An ISO 50 or 100 monochrome sensor would produce stunningly-smooth images with extremely low noise.
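The “two stops” figure translates directly into a sensitivity multiplier; a trivial sketch of the conversion (the two-stop estimate itself is just that, an estimate):

```python
# Converting the estimated sensitivity gain of a monochrome sensor
# (no color filter array) from stops into an equivalent base ISO.
# The two-stop figure is the estimate from the text, not a measurement.

STOPS_GAINED = 2
color_base_iso = 100

sensitivity_factor = 2 ** STOPS_GAINED          # 4x the light reaches the photosites
mono_equivalent_iso = color_base_iso * sensitivity_factor

print(f"{STOPS_GAINED} stops -> {sensitivity_factor}x sensitivity")
print(f"ISO {color_base_iso} color sensor ~ ISO {mono_equivalent_iso} with no color filter array")
# Consistent with the DCS 760m's base ISO of 400 mentioned above.
```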

A 12 to 16-megapixel monochrome sensor with a wide dynamic range (12 stops or more) is on diglloyd’s short list (so long as it offers true 16-bit files).  Such a camera would produce breathtakingly sharp images as compared to a similar resolution color camera, but should be cheaper to manufacture, needing a less expensive sensor, less computing ability, less memory, etc in the camera.  Are you listening Nikon and Canon?

Emerging storage technology: port-multiplier external SATA II

I’ve followed Norwegian photographer Bjorn Rorslett’s very useful web site for years, and his outstanding review of the Nikon D200 is now online.   Bjorn’s vast experience lends a particularly useful and interesting viewpoint to anything he reviews.  The D200 review is his best review ever—don’t miss it if you have an interest in the D200!

Work continues on the diglloyd Nikon D200 review, presented as a comparative review with the Nikon D2X, and possibly with some Canon EOS 5D comparisons.    It will likely take several more weeks to complete.

I’ve never much liked noise reduction, but the results obtainable in Nikon Capture show that it is a welcome feature for some images. Read the article, and see for yourself.

Some new products have already been announced and I’ve learned through other means that more will follow from other vendors.  For external high-speed storage, Firewire 800 never delivered (especially with the write-performance bug  on G5 PowerMacs), and eSATA (external SATA) is now emerging as the de-facto performance standard.

SATA first emerged as an internal storage technology supporting 2 drives via the motherboard.   Now what we see emerging is external SATA II, capable of 3 Gbit/sec (twice the original SATA’s 1.5 Gbit/sec).  Only an 8-drive striped RAID array can saturate that kind of bandwidth, so there’s room to grow.

But more importantly, we see port multiplication emerging, which allows up to 5 drives per physical cable (at least in the Sonnet E4P implementation).   The Sonnet Technologies card has 4 ports, supporting 5 drives per port, or 20 drives with a single card.  The claim is 300MB/sec per port.  In theory, this would allow 1200MB/sec with 20 drives, though other factors are likely to limit that theoretical speed.  Of course this is “bleeding edge” stuff, with Sonnet the first Mac vendor to announce a product—I advise waiting 2-3 months before spending money on this technology.  As usual, barefeats.com is a great place to keep abreast of this sort of thing.
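Here is the bandwidth arithmetic as a rough sketch (the per-drive figure is my assumption for illustration, not a measurement):

```python
# Rough throughput arithmetic for a port-multiplier eSATA card such as
# the 4-port Sonnet E4P described above. Per-drive sustained throughput
# (40 MB/sec) is an illustrative assumption only.

PORTS = 4
DRIVES_PER_PORT = 5
PORT_BANDWIDTH_MB_S = 300        # SATA II: 3 Gbit/sec per port, ~300 MB/sec usable
DRIVE_MB_S = 40                  # assumed sustained transfer per drive

total_drives = PORTS * DRIVES_PER_PORT
card_ceiling = PORTS * PORT_BANDWIDTH_MB_S                     # 1200 MB/sec theoretical
drives_to_saturate_port = PORT_BANDWIDTH_MB_S / DRIVE_MB_S     # ~8 drives, as noted above

print(f"{total_drives} drives on one card, theoretical ceiling {card_ceiling} MB/sec")
print(f"roughly {drives_to_saturate_port:.0f} drives needed to saturate one port")
```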

Why is port multiplication good?  I currently use the FirmTek SeriTek/2eEN4, a very reasonably priced and very high quality external SATA I/II 4-drive storage unit.   The downside is that it does not support port multiplication and thus 4 cables and 4 ports are needed—one for each drive in the unit.   This not only leads to a tangle of cables, but it means that the PCI card supplying those 4 ports uses one of two slots in the PowerMac G5 (the GeForce 6800 Ultra DDL video card occupying two slots).  Addition of another external unit means the last slot would be scarfed up by another card.  With port multiplication, four (4) external units could be supported with 4 cables and one card, a much more flexible solution, especially if you want  a master unit, and an online backup unit.

Apple’s performance claims scrutinized

As predicted in this blog’s January 11 entry, shining a bright light on Steve Jobs’ performance claims for the new Intel-based Macs reveals a less than honest approach in presenting them to the unsuspecting public.

Two articles have emerged which discuss this. Infoworld has a brief discussion on it, as does Henry Norr at Macintouch.com.   The criticism to be levelled here is not that the new Intel-based Macs aren’t faster (they are of course), but that Apple has been less than honest in its explanation of when one can expect performance gains.

Steve Jobs could have said, “These are dual-core machines, so on some tasks they’ll be radically faster, since it’s like having two computers in one—plus they also run at higher clock speeds than the previous machines.”  Rather, he chose to quote SPEC numbers, and did not make it explicit that Apple used multithreaded benchmarks.  Real-world performance is what matters when buying a computer, and much of real-world performance is single-threaded (making use of only one core).  Technically astute users who had time on their hands could dig around and figure it out, but not 99% of the people hearing his message.

Those who already intend to purchase a new laptop or iMac should go ahead and get the new models; you will likely be pleased.  But if you’re on the fence, don’t spend a lot of money exchanging a perfectly good model for a newer one that might perform little better than the old one; wait until you see performance numbers for the applications that matter to you.   And do keep in mind that any of your software not yet available as a Universal (Intel-native) binary will run in emulation under Rosetta, and thus might perform more poorly than on the “slower” non-Intel machines until you upgrade to Intel-based versions of your software ($$).

Beware of camera “reviews”—that aren’t!

The diglloyd Nikon D200 review is in progress, and already it’s very clear that the battery life of the D200 is nowhere near that of the Nikon D2X. Side-by-side shooting shows much more rapid battery drain on the D200.  To be fair, the Nikon D200 battery hasn’t been cycled from fully drained to fully charged several times yet, so perhaps its performance will improve.

One of the limitations of the D200 viewfinder as compared with the D2X viewfinder is that it does not show the whole frame (roughly 95% coverage, versus essentially 100% on the D2X).  I like to shoot “tight”, and because the captured image includes area beyond what the viewfinder shows, this approach always yields extra stuff around the edges; precise framing is possible only for what is seen through the viewfinder.  A small thing, but one that impacts every shot taken.   The same issue exists with the Canon EOS 5D as compared with the Canon EOS 1D series.

Are you looking for a real review, not a rehash of manufacturer data sheets, boring specifications, and marketing hype?  You get what you pay for, though I won’t name the guilty web sites or magazines here (most magazines are particularly offensive in this regard; that is, inoffensive to their advertisers and thus not serving their readers).

Diglloyd reviews don’t waste your time or money on such things, nor do I even think it’s honest to call such mulch a “review”.  Instead, a wide variety of operational and image quality factors are explored with numerous examples and large, highest-quality crops—see for yourself by purchasing the D2X vs EOS review today.   You simply won’t find a higher quality review anywhere, at any price.  You can expect carefully conducted tests with results you can trust—and there are more than enough high-quality examples for even a cynic to come to his/her own conclusions.  I scrupulously analyze all results, reshooting tests if there is any doubt as to perfect execution.

Backing up is good to do...

If your livelihood is intellectual property (such as photographs), or even if it’s just your favorite photos which have value to you alone, wouldn’t it be nice to know they'll be there tomorrow, even if the computer burns into a pile of goo along with the house, or goes scuba diving during the next hurricane? Or the next nasty virus wipes the hard drives?  Or the hard drive just fails inexplicably?

Always make sure you have at least two current backups.  If they’re all in the same building, they’re at risk.  If they’re connected to the computer, they’re at risk.  How to handle this may be the subject of a future diglloyd article, but here are some quick pointers:

1. Consider using DVDs; they’re annoyingly small and slow to create, but cheap.  Burn two copies of everything, and store at least one set at a second location.  Be sure you also burn subsequent modifications to the originals, unless you’d like to repeat that work. Every 3-4 years, reassess the current state of the art, and make copies using the latest technology (technology obsolescence and/or media risk).

2. Hard drives are cheap.  Laziness sets in for everyone, sooner or later.  By keeping a 2nd hard drive connected, you have no excuse for not having an always-current backup.  This will only protect you from failure of the main hard drive; it won’t protect you from natural disasters or unnatural ones, such as viruses fond of trashing your computer.

3. Use a disciplined approach: Don’t skip a backup because it’s inconvenient, and verify now and then that the backup actually matches the original (see the sketch after this list).  Reread #2 above.

4. Apply the same discipline when shooting in the field.  If you download your flash card to an Epson P-4000 or the like, then erase it, you had better have two Epson P-4000s, one as the master, and one as the backup.  Or a laptop and a P-4000; you get the idea. If you only have one copy, it’s not backed up.  Alternatively, don’t erase your flash cards (they remain the master copy), and consider a single P-4000 (or something similar) the backup.
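
For the disciplined, here is a minimal verification sketch in Python: it compares a backup folder against the original by checksumming every file and reports anything missing or changed.  The paths are hypothetical examples, and this is an illustration of the idea rather than a replacement for real backup software.

```python
# Minimal backup-verification sketch: compare every file under ORIGINAL against BACKUP
# by MD5 checksum, reporting anything missing or changed. Paths are hypothetical.
import hashlib
from pathlib import Path

ORIGINAL = Path("/Volumes/Photos")          # master copy (example path)
BACKUP = Path("/Volumes/PhotosBackup")      # backup copy (example path)

def md5(path: Path) -> str:
    """Checksum a file in 1 MB chunks so large images don't exhaust memory."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

problems = 0
for src in ORIGINAL.rglob("*"):
    if not src.is_file():
        continue
    dst = BACKUP / src.relative_to(ORIGINAL)
    if not dst.is_file():
        print(f"MISSING from backup: {dst}")
        problems += 1
    elif md5(src) != md5(dst):
        print(f"MISMATCH (file differs): {dst}")
        problems += 1

print("Backup verified OK." if problems == 0 else f"{problems} problem(s) found.")
```

A backup you never verify is a backup you are merely hoping exists; run something like this after each backup session.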

Was diglloyd unfair to Windows XP in yesterday's article on Nikon Capture? How does this relate to digital photography?

“Considering the grossly inferior user interface, and the pervasive and severe virus and security problems of a Windows PC, MacOS is the only rational choice at this point in time.”

Diglloyd was listening to National Public Radio news yesterday, hearing that yet another security hole has been found in Microsoft Windows, one that apparently allows a machine to be infected just by visiting a website or reading an email containing an image file!  The newscaster’s advice was to “avoid unfamiliar websites”.   So much for the value of web search engines: just restrict your use of the web to a handful of websites!  If all this still leaves you feeling all warm and fuzzy, consider treating yourself to a nice Windows “rootkit” CD from Sony.

How does this relate to digital photography?

Digital photography requires a reliable computer that yields more benefits than drawbacks. It is hard enough just learning all the software and techniques that are prerequisites for productive work.  Add to that the insult of paying for protection from malicious hackers on a regular basis, the necessity of staying abreast of the latest threats...well that’s not diglloyd’s idea of a good use of one’s time or money.  It’s as if you just bought a new car, and if you don’t regularly add some Symantec No-Blo or McAfee No-Plode to the gas tank  (new formulations required every week), the car will explode, not start, or drive off on its own to assist in a bank heist (no affront to Symantec or McAfee intended—they're simply filling the gaping holes created by Microsoft).

Is MacOS immune to all of this?  No, of course not.  But you just don’t read about any actual virus infestations occurring on MacOS X (which, like Linux, is built on Unix, one of the most proven, secure, and reliable operating systems available).  At any rate, diglloyd has never used anti-virus software on his Mac, and that goes back to 1983, yet diglloyd has had only one virus, back in 1989 or so, under the classic MacOS (which is not MacOS X, and not Unix).

No computer user is immune to losing valuable data. The loss may come from a virus, a machine failure, a natural disaster, or your 7-year-old.  Make a backup of anything that you wouldn’t want to see gone forever—or you might as well throw it in the trash right now.
