Get all the tools you need to upgrade the factory HDD of any 2009-2019 iMac to a larger HDD or a modern SSD.

Sony FE 20-70mm f/4G: Assessing Distortion at 20mm, 24mm, 35mm, 50mm, 65mm

This page examines the distortion, uncorrected and corrected, of the Sony FE 20-70mm f/4 G at 20/24/35/50/65mm.

Comparison lenses include the Sony FE 12-24mm f/2.8 GM, Voigtlander FE 35mm f/2 APO, Voigtlander FE 50mm f/2 APO, and Voigtlander FE 65mm f/2 APO. The latter three are reference-grade lenses, and that is the point: to show the difference.

Sony FE 20-70mm f/4 G Distortion @ 20mm, 24mm, 35mm, 50mm, 65mm

Includes corrected and uncorrected images, and a comparison/reference lens at each focal length.

CLICK TO VIEW: Gear used here, and related

 

Dr S writes:

Your comment tells it all:

 “But if it’s for casual snapshots or vlogging or any task where high resolution is not a goal, then turn on distortion correction and be happy.

 Cam/Lens manufacturers have been building in distortion corrections for years.  Yes, that screws up high resolution. Sony programmed a huge amount in the 20-70/4 to sell a bunch of lenses......enough quality to satisfy the masses but not nearly enough for many of us.  Obviously there must be a large "Sony" market for a wider range zoom.  Just glancing at some of the forums one can see a ton of people with their new lenses.  Some are concerned about the excessive WA distortion, while others accept a lower quality image but the question, as you alluded in your blog entry, is what are the images used for?

I still shoot primes and will continue to do so for all my shots (well, I did get the GF 20-35 for the hobby part of me)  For some tele shots for events I use a 70-200.  Sharpness there is not primary.  

Don't know where the industry will ultimately end up but it seems to me the manufacturers are in a higher trajectory to build in a lot more software correction for imagery beyond distortion.  Going even further I'm waiting for the day soon when one will be able to buy a Sony MILC with a built-in telephone.  Maybe one exists but the massive amount of computational photography built into mobile phones has not been lost on the eyes of the major cam manufacturers.  Interesting time.

DIGLLOYD: I’m OK with software correction for things like lateral chromatic aberration, though it is not ideal; the results are not as good as those from a well-corrected lens, and never can be. Especially since longitudinal chromatic aberration has a substantial perceptual effect.

Distortion and the negative effects of distortion correction are a radically different case because of degraded sharpness and micro contrast—together with grossly misleading MTF charts that account neither for diffraction nor distortion correction, a defiance of how such designs MUST be used. It’s flat-out dishonest IMO. Simple math is all that’s required to understand the losses (no pictures required!). Though I’ve also shown directly (many times) how much sharpness is lost with/without correction. These points have clearly escaped the dpreview crew, but whether it is incompetence along with conflating unrelated types of design compromises, or intentional misrepresentation using inappropriate examples for “air cover” for advertising sponsors in reviews to come (the article hints as much), I don’t know.

BTW, I cannot (yet) show the sharpness losses with the Sony FE 20-70/4 because ACR as yet lacks a lens profile for correcting it. Indeed, the 20-70mm as it stands is unusable with RAW because there is no easy way to correct the extreme distortion (manual correction can help but it is only an approximation).



Sony FE 20-70mm f/4G: Fun-House-Mirror Distortion, as Expected

As I expected, and confirming my intense dislike of wide-range kit-style zooms, the Sony FE 20-70mm f/4G lives up to its design decisions. Its 3.5X zoom range spanning 20mm to 70mm inherently defies rational expectations with regard to distortion. Software distortion correction is mandatory, either by shooting JPEGs (enforced, no choice) or in RAW (Adobe Camera Raw as yet has no profile, so the distortion cannot easily be corrected).

CLICK TO VIEW: Mid-range zooms for Sony mirrorless

I’ll be showing how the 20-70mm compares to the Sony FE 12-24mm f/2.8 GM and also various reference lenses. Readers can decide for themselves how much sharpness they are willing to give up in exchange for convenience.

Clearly this is a “vlogger” lens, and with distortion correction it will be well loved for its extremely convenient zoom range, for JPEG snapshots and similar. But the 20-70mm is wholly unsuitable for landscape photography unless your quality standards for sharpness are low, at least over the good part of the zoom range where distortion correction must be used.

Correcting this level of barrel distortion will result in a major loss of sharpness in the outer zones.

At the long end, pincushion distortion is similarly unappealing, and correcting it will damage sharpness everywhere in the frame, making high micro contrast impossible.

In this example, distortion correction at the edges involves a pixel-stretch factor of up to 1.27X linearly, which means the potential sharpness in those outer zones drops from 60 to about 37 megapixels.
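The megapixel arithmetic can be sketched in a couple of lines (my own illustration; the 1.27X stretch factor is the example figure above):

```python
# Effective resolution after distortion correction: a linear pixel stretch
# of S spreads the captured detail over S^2 more area, so the resolving
# power drops by roughly that factor.
def effective_megapixels(native_mp: float, linear_stretch: float) -> float:
    return native_mp / (linear_stretch ** 2)

print(round(effective_megapixels(60, 1.27)))  # ~37 megapixels
```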

Toggle to compare uncorrected to corrected. The corrected version is an in-camera JPEG.


Fun Fact: Size of Subject Matter, Center vs Corner

I cannot fully conceptualize this optical behavior in my head, though of course it is fully explained by optical laws. It has something to do with the angle to the camera and the way the lens is required to project a rectilinear image; I just visualize it rather than go exploring optical laws.

Below, the same pot is shown as captured in the corner vs the center. No change in distance, only a change in how the pot is framed in the picture.

Hot tip: the corner is a great spot to put your mother-in-law for that holiday family photo. Set the lens at 12mm and you’re 'good', so long as you email the photos after everyone goes home.

CLICK TO VIEW: Outstanding Landscape Lenses for Sony mirrorless

This effect is why stitching with a wide-angle lens is problematic, particularly at close range. To compensate, I like to overlap the images much more. It’s also why I like to use at least a 35mm focal length for panoramas via stitching, with 50mm and 65mm my go-to focal lengths, on up to 105mm or so.

Same distance, 12mm lens: size of elements in center vs corner
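A rough way to quantify the corner enlargement, assuming an ideal rectilinear projection and full-frame geometry; the numbers are my own back-of-envelope illustration, not from the article:

```python
import math

# Rectilinear mapping: r = f * tan(theta). The local radial scale is
# dr/dtheta = f / cos^2(theta), so an object's radial extent in the
# corner is stretched by 1/cos^2(theta) relative to the center.
f_mm = 12.0                                  # example wide-angle focal length
half_diagonal_mm = 21.6                      # half-diagonal of a full-frame sensor
theta = math.atan(half_diagonal_mm / f_mm)   # field angle at the extreme corner

stretch = 1.0 / math.cos(theta) ** 2
print(f"corner field angle: {math.degrees(theta):.1f} degrees")
print(f"radial stretch vs center: {stretch:.1f}X")   # roughly 4X at 12mm
```

At 12mm the corner of the frame sits at about a 61° field angle, so a subject there is radially stretched to roughly four times its central scale, which matches how dramatically the pot grows in the corner.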

Jason W writes:

Short blog post, but this point is a big deal to me.

As you point out, it makes stitching a pain. Since objects in the corners that are close to the camera are distorted, this changes the scale and weight of them in the composition. It can be hard to visualize all the elements without seeing the final image in the viewfinder, which with stitching, you can't.

Back in the film days, rectilinear ultra wide angle was also a pain. On 6x17, you had optical viewfinders that sat on the top of the camera, but then you had parallax error where you'd have to tilt the frame down to have the same framing as the viewfinder on top of the camera. Many of the viewfinders also had different distortion at wide angles than the shooting lens, so again, more question marks. You could solve all of it by using ground glass, but that sucked on most non-view camera 6x17 models like the Technorama or GX617 because you had to unload the film.

Overall, I feel center cropping a GFX 100S frame and framing using the 65:24 guides is the most effective way to shoot a wide format image.

DIGLLOYD: yep, it’s really difficult, and at close range the true-size perspective also varies hugely over distances of a few feet. Yet your “eye” obeys the perceptual size-distance invariance principle, showing you a non-reality that the camera can never capture, because it does not exist. See Perception and Imaging by Richard D. Zakia @AMAZON.

As for really wide-format images, it’s more practical to shoot the Sony FE 12-24mm f/2.8 GM in the 12mm to 16mm range than deal with the Fujifilm GFX100S limits on angle of view (Fujifilm GF 20-35mm f/4).



2022 MacBook Pro M2 Max: Good Choice to Replace a Desktop Computer?

re: Video Presentation: Configuring the 2018 MacBook Pro as a Desktop Replacement
re: Apple 14/16-inch MacBook Pros Here, with M1 Max or M1 Pro CPU — and Why a Desktop is Often Better than a Laptop

2022 MacBook Pro M2 Max, Tigger hunting

Is the 2022 MacBook Pro M2 Max a good choice for one’s only computer, good for both travel and at home as a desktop workstation replacement?

REVIEWED: Apple 2022 MacBook Pro M2 Max

On the performance front, the answer is an emphatic yes—the MBP is not only a solid desktop, it offers workstation-class performance, trouncing my 28-core 2019 Mac Pro with 384GB memory and pricey Vega II video card on most tasks, and not far behind when it isn’t busy making the Mac Pro feel inadequate. With 96GB memory, there’s not much you can’t throw at the MBP.

Which pretty much leaves ergonomics: working on a laptop sucks. The tiny screen and hunching over it will do damage to your body sooner or later.

I’ve said before that most users are better off with the form factor of an iMac 5K. That remains true, except that the only option is the aging and discontinued 2020 iMac 5K. And I can no longer recommend it, with its inferior performance, over the 2022 MBP.

Solution

Attach a display, keyboard, and mouse for work at home/office, and you sidestep the ergonomic issues. Add the OWC Thunderbolt Go Dock for more ports along with extra storage, and you are looking good.

I’ve been considering the 2022 MacBook Pro M2 Max as a replacement for my 2019 iMac 5K for travel in my Sprinter van. I’d attach the NEC PA271Q to it while on the road (the pixel density is far more useful for image evaluation). It’s particularly attractive in that the laptop can be used all by itself, with the display attached for in-depth photographic work. But the $5299 cost (+$350 for AppleCare) is too much to handle right now. So I’ll keep using the 2019 iMac 5K and see if we get an iMac 5K M2 Max, or better yet an iMac 6K M3 Max, late this year or early next year.

Sony FE 20-70mm f/4G: Should Arrive in a Few Days for Testing

See previous post for details and my early take: Sony FE 20-70mm f/4G: Appealing Zoom Range, How Good Will It Be?.

If it arrives on time (early next week), I should be able to quickly determine its appropriateness for demanding subjects eg landscape photography.

My requirements are relatively low distortion along with minimal field curvature and edge-to-edge sharpness (corner-to-corner even better). Lenses often fail these requirements; for example, the Sony FE 24-70mm f/2.8 GM II suffers excessive field curvature, making across-the-frame sharpness at the same distance a non-thing.

I suspect that these two issues will be troublesome with the 20-70mm because they are always necessary compromises in a zoom, especially a 3.5X zoom of this range. Even the Sony FE 12-24mm f/2.8 GM is compromised with pronounced field curvature. I find it acceptable given the extreme focal lengths involved. But not for a 20-70mm or 24-70mm.

I hope to be pleasantly surprised.

Get Sony FE 20-70mm f/4 G at B&H Photo...



Reader Question: Fujifilm GF 20-35mm f/4 for Landscape?

Forrest G writes:

Do you recommend the Fujifilm GF 20-35 lens for landscape photography? I am using the Fujifilm GFX 50S II, mostly for black and white imaging.

DIGLLOYD: absolutely, terrific lens, and a great complementary lens to the Fujifilm GF 35-70mm f/4.5-5.6. I would have loved to own it myself after reviewing it, but budget precluded that.

The main flaw of both the Fujifilm GF 20-35mm f/4 and the Fujifilm GF 23mm f/4 is ghosting flare, which ruins night shots containing intense light sources, as well as shots into the sun. I’m not sure which lens is worse, but neither is satisfactory that way.




Drought Be Gone — California Rain and Snow Shock and Awe

re: mosquito

Normally this time of year, I’d have no qualms about heading over to the Eastern Sierra region and nearby desert like Death Valley and Alabama Hills. This year is radically different. I’m staying put for at least two more weeks, and it’s going to have to be all desert, as the snow is down deep even to the shores of Mono Lake, something I’ve never before seen.

It has been largely dry since the atmospheric river at the end of 2022, but now the rain has come back with a vengeance, to the tune of 5 inches in 3 days, and it’s still coming down. The backyard is ponding and hyper-soggy. And it’s as cold a sustained rain as I can remember in 40 years of living in this area (about 41°F while raining). Snow is on the nearby hills, a rarity.

Update March 1: hooray for a sunny day! The land is draining. About 6 inches of rain in 3 days at my place, plus another inch tonight; it’s raining again and supposed to rain for another week. Gah!

Tigger says that hunting rats and gophers in cold rain SUCKS and a warm bed is much better.

I am SO glad I had a new roof installed 18 months ago.

Meanwhile, the Sierra Nevada are more Nevada than in my lifetime. The snow is so deep that meltwater will breed mosquitoes almost to September. Any visit to the Sierra this year in June or July and most of August will be miserable below 11600 feet elevation or so in many places.

I expect to see much of the snow fail to melt by the end of the summer, as was the case back in 2016, though I’d bet we have much more snow this year. Maybe the Mt Conness and Mt Dana ice fields (former glaciers) will grow this year, reversing some of their steady decline.

A lot of Bighorn Sheep will die this year from avalanches, as was the case back in 2016.

Where does all the water go? Here in California it’s all figured out... downriver and into the ocean. No significant new storage has been built in over 40 years. As the snowmelt ramps up in earnest within 6 weeks, spillways on overflowing reservoirs will dump all that water.

Roads are closed — ALL of them!

Blizzard Warning Issued for Large Portion of the Sierra with 7 Feet of Snow in the Forecast

Reader Tait S writes with an amusing (assuming you’re at home!) advisory from the Mono County Sheriff’s Office:

“The roads are closed. All of them,” wrote the Mono County Sheriff’s Office on social media. “There is no alternate route, back way, or secret route. It’s a blizzard, people. You cannot see your hand in front of your face, let alone a snow stake to guide your way. Stay home. Or wherever you are if you aren’t home (and if you’re somewhere you shouldn’t be, you’ll have to sort that out with your significant other – we told you to make good choices).”

Skis or a snowmobile sound nice, if you happen to live in the area, once the storm breaks.

Jon L writes:

Have you ever tried picaridin-based insect repellent? I use the Sawyer 20% picaridin spray or cream. Lasts up to 12 hours. Nowhere near as toxic or obnoxious as DEET. And it really works, even against the ravenous blood-sucking bastards in Canada (including black flies and ankle biters). Also works against ticks, which are a problem in Arkansas when working outside in the spring. I can even put the lotion on my face at night to keep the mosquitos from buzzing around. Best stuff I have ever found.

Must be at least 20%. I’ve tried some other brands from Wally World, but the active ingredient amount is lower and the carrier solvents are more obnoxious.

DIGLLOYD: good tip, hadn’t heard of it!

Picaridin @AMAZON


FOR SALE: My Low-Mileage Sony A1

re: Sony A1

The Sony A1 is a fantastic camera, but for future work I’ll be using the Sony A7R V, because I do not need the advanced high-speed features of the A1, and I want the 60MP sensor and focus stacking support. While the resolution difference is hardly detectable, even a little bit is helpful for my work.

Selling:

  • Sony A1 in perfect working condition.
  • Original box, manual, and accessories as shipped.
  • Really Right Stuff L-Bracket for A1.

$4800 OBO. Contact Lloyd —  SOLD, no longer available



Sony FE 50mm f/1.4 GM

re: Sony FE 50mm f/1.2 GM

Sony FE 50mm f/1.4 GM

The about $1298 Sony FE 50mm f/1.4 GM [Sony specifications] was announced in February 2023, shipping in May 2023.

  • Full-Frame, f/1.4 to f/16
  • G-Master Design with Advanced Optic
  • Two XD Linear AF Motors
  • Internal Focus; Focus Hold Buttons & Iris Lock Switch
  • Physical Aperture Ring & De-Click Switch
  • Rounded 11-Blade Diaphragm
  • Nano AR II & Fluorine Coatings
  • Dust and Moisture-Resistant Construction

I’ll review the 50/1.4 GM when it becomes available in May.

CLICK TO VIEW: 50mm Lenses for Sony Mirrorless

vs Sony FE 50mm f/1.2 GM

Similar advanced optical construction makes one wonder how the lens will perform at f/1.4 vs the Sony FE 50mm f/1.2 GM. It is quite a bit smaller, lighter, and cheaper; might performance be as good, or even better?

Real-world testing of the f/1.2 GM shows that its MTF chart is a fantasy; f/5.6 is required for full sharpness across the field, with fantastically sharp results at center at f/1.2 but rapid deterioration towards the edges. Will the f/1.4 GM show similar deviations? Probably.

It is baffling why Sony produces MTF charts for f/8 that (a) do not incorporate diffraction and therefore do not represent physical reality, and (b) hide from us the performance at more interesting apertures such as f/4. F/8 is not particularly useful to look at.
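For context, diffraction alone caps what any honest f/8 MTF chart could show. A minimal sketch using the standard diffraction-limited MTF formula for a circular aperture in incoherent light; the wavelength and spatial frequency are my own illustrative choices:

```python
import math

# Diffraction-limited MTF for an aberration-free lens (circular pupil,
# incoherent light): MTF(v) = (2/pi) * (acos(s) - s * sqrt(1 - s^2)),
# where s = v / v_cutoff and v_cutoff = 1 / (wavelength * f_number).
def diffraction_mtf(lp_per_mm: float, f_number: float,
                    wavelength_mm: float = 550e-6) -> float:
    cutoff = 1.0 / (wavelength_mm * f_number)
    s = lp_per_mm / cutoff
    if s >= 1.0:
        return 0.0
    return (2.0 / math.pi) * (math.acos(s) - s * math.sqrt(1.0 - s * s))

# At 30 lp/mm and f/8 (550nm light), even a perfect lens cannot exceed
# ~0.83 contrast, so an f/8 chart showing more cannot include diffraction.
print(f"{diffraction_mtf(30, 8):.2f}")
```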

What I’d like to see in the 50/1.4 GM is (a) no focus shift, (b) minimal field curvature, (c) edge-to-edge sharpness. This is what the Voigtlander FE 50mm f/2 APO-Lanthar delivers, and without stopping down. But of course the Sony 50/1.4 GM has other design goals, and so maybe both lenses will be needed to cover different photographic challenges; certainly the 50/1.4 GM is a far better lens where (fast) autofocus is needed.

Claimed MTF for Sony FE 50mm f/1.4 GM, 10/30 lp/mm
diffraction-free computed fantasy MTF eg impossible at f/8

In my experience, the fantasy MTF charts from Sony have invariably failed to match what real lenses actually deliver, so claims of high performance need to be met with some skepticism. Still, I expect nothing less than very pleasing performance from the 50/1.4 GM.

Specifications

Nominal, as per Sony.

Specifications for Sony FE 50mm f/1.4 GM
Focal length: 50mm (nominal)
Aperture range: f/1.4 - f/16
Focusing range: 16.1in = 41cm
Angle of view: 47°
Number of elements/groups: 14 elements in 11 groups
Diaphragm: 11 blades, rounded
Magnification: 0.39X = 1:2.56
Filter thread: 67mm
Weight: 18.2 oz = 516g (nominal)
Dimensions: Approx. 3.2 x 3.8 in = 80.6 x 96mm
Street price: about $1298
Supplied with:

Front Lens Cap
Rear Lens cap
Lens Hood
Limited 1-Year Warranty

Sony Description

Sony web page for 50/1.4 GM...

Optical design for Sony FE 50mm f/1.4 GM

Breathtaking G Master quality right out to the image edges —  Two high-precision XA (extreme aspherical) lens elements effectively correct distortion and most types of aberration. This new optical design also employs an ED (extra-low dispersion) glass element to suppress chromatic aberration. And with our original Nano AR Coating II on lens surfaces, internal reflections that can cause flare and ghosting are dramatically reduced.

Innovative optics and an F1.4 aperture deliver magnificent bokeh — Innovative optics and an F1.4 maximum aperture combine to deliver magnificent bokeh. Smooth, naturally defocused background and foreground bokeh at F1.4 are ideal for making portrait subjects stand out from their surroundings. Advanced features that contribute to superb bokeh quality include XA (extreme aspherical) elements that effectively prevent onion-ring bokeh, and carefully controlled spherical aberration during design and manufacture.

A lightweight and compact lens with incredible performance — XA (extreme aspherical) elements precisely positioned in an innovative design, high-thrust XD linear motors, and the latest compact circular aperture unit work together in a large-aperture high-resolution lens that is only 3.17” (80.6 mm) in outer diameter and 3.78” (96.0 mm) in length while weighing just 18.23 oz (516 grams). A unique blend of mobility and low-light performance make it an ideal choice for everything from portraits to snapshots.

Fast, precise, quiet AF and tracking for stills and movies — The lens’s focus group is driven by our high-thrust XD (extreme dynamic) linear motors and an advanced control algorithm for smooth, responsive focus drive. Noise and vibration are minimized for refined, quiet AF operation that can be a huge advantage when shooting movies. Movie subjects are smoothly captured and tracked even when shooting at 120 fps or other high frame rates.

Intuitive operation supports the creator's vision — Linear Response MF provides manual focusing response that rivals mechanical lenses, and is more than responsive enough to produce smooth rack focus for movies. An iris lock switch allows the aperture ring to be fixed at the “A” position to prevent unwanted changes, or manually set from F1.4 to F16. Two customizable focus hold buttons and an AF/MF focus mode switch provide extra shooting versatility.

Outstanding reliability for any environment and application — The front lens element features a fluorine coating that repels fingerprints, dust, water, oil, and other contaminants while making it easier to wipe off any contaminants that do become attached to the lens surface. A dust and moisture resistant design3 provides extra reliability for outdoor use in challenging conditions.



Sony A7R V 16-Shot Sony Pixel Shift vs Fujifilm GFX100S: Huge Laurel at Bend in Alpine Creek

re: Sony A7R V
re: Fujifilm GFX100S

How does the 100-megapixel Fujifilm GFX100S compare to the 60-megapixel Sony A7R V using 16-shot pixel shift mode with Sony motion correction?

Sony A7R V 16-Shot Sony Pixel Shift vs Fujifilm GFX100S: Huge Laurel at Bend in Alpine Creek

The Sony A7R V images span single-shot, single-shot enhanced, 4-shot pixel shift, and 16-shot pixel shift images, for full perspective on what it can deliver.

Images and crops at 240MP and 120MP sizes are shown, with both cameras run through the same upsampling algorithm for consistent and comparable processing. Plus, the large sizes make it easier to see differences in detail between all the variants. Defects from pixel shift at 240MP are noted. Performance across the aperture range is also demonstrated.

f5.6 @ 0.7 sec, ISO 100; 2023-01-31 17:39:20
Fujifilm GFX100S + Fujifilm GF 35-70mm f/4.5-5.6 WR @ 34.4mm equiv (41.8mm) RAW: Enhance Details


Apple 2022 MacBook Pro M2 Max: Blows Away Intel Macs on Almost Everything; High-End Workstation Masquerading as Laptop

re: 2022 MacBook Pro M2 Max

While the 28 CPU cores of my 2019 Mac Pro win out for a few tasks, in general the Apple 2022 MacBook Pro M2 Max is a wolf in sheep’s clothing, a high-end desktop workstation masquerading as a laptop computer. Do I want one? Heck yeah. If only one did not have to run Apple’s worst OS ever, the feckless usability dumpster fire that is macOS Ventura... but dang the hardware is impressive.

Whether focus stacking or image scaling (incredible!) or raw file conversion or Adobe Lightroom or general Photoshop work, the thing is a speed demon you can use on the road, or at work/home as a high-end workstation. Incredible.

REVIEWED: Apple 2022 MacBook Pro M2 Max

See my full review as well as the recommendations page. Consult with me on turning it into a desktop workstation or field computer.

View: current Mac wishlist and all current OWC wishlists.
See the 2022 MacBook Pro M2 Max at B&H Photo; B&H Photo Payboo pays the sales tax for most states.

CLICK TO VIEW: Recommended MacBook Pro Configurations

2022 MacBook Pro M2 Max, showing Tigger hunting

Fujifilm GFX100S: Horizontal White Stripes Related to PDAF?

re: pattern noise

I reported quite often on the issues with Fujifilm GFX100S (and GFX100) pattern noise that shows up in very troublesome ways when converting to black and white. But my very first finding on that problem involved blue sky and in color.

Fujifilm GFX100: Horizontal White Stripes a Problem for Numerous Images from my last trip

Fujifilm has never provided an explanation or done anything about it that I can tell, so I will persist in periodically noting the issue and its manifestations. I can accept “it’s a camera limitation, blah blah blah”; what is unacceptable is to say and do nothing.

Reader Jason W writes:

This is a non-pixel shift shot on the GFX100S with the sharp 35-70 on a high contrast object.

You can see the purple/yellow lines which are the same spacing as the PDAF bands you can see in the sky.

DIGLLOYD: there is a particularly troublesome line about 1/3 up from bottom. I’d pay extra for a camera lacking the damned PDAF striping problem. PDAF speeds up AF at the cost of accuracy and image quality, which makes it a lead balloon for landscape photography.

This imaging defect is just one more argument to go to the Sony A7R V, which can outperform the Fujifilm GFX100S using Sony pixel shift and get very close in single-shot mode with the best lenses. And do so for outdoor scenes very successfully, as shown with multiple comparisons and different lenses.


Why Does Sony Pixel Shift Operate at Only 2 Frames Per Second?

re: Sony pixel shift

Reader Alfred W writes:

Shooting @ 1/60 second, ISO 100, how long will a four-shot Sony pixel shift take?

DIGLLOYD: in theory, it should be possible to capture pixel shift frames as fast as the camera can otherwise take pictures given a shutter speed supporting the maximum frame rate eg 10 fps for the A7R V.

That implies something approaching 4/10 second for 4-shot and 16/10 second for 16-shot.

In practice, the time required is molasses slow, for reasons I do not understand. Perhaps it relates to moving the sensor to the precise position required?

Sony A7R V

I tested the Sony A7R V at 1/500 second with Interval = Shortest — as fast as possible.

Sony A7R V:
4-shot pixel shift: 2 seconds
16-shot pixel shift: 8 seconds

In other words, the Sony A7R V takes 1/2 second per shot of the pixel shift capture, which equates to five (5) times longer than one might hope for. Why?

Sony A1

The Sony A1 is faster, which might be expected given its 20 fps raw capture rate (lossless raw, full-frame). But it is not twice as fast in spite of that. Indeed, the A1 takes 7.5 times longer than its frame rate allows.

Sony A1:
4-shot pixel shift: 1.5 seconds
16-shot pixel shift: 6.0 seconds

Actually, the A1 ought to be able to capture at 30 fps and still save as lossless compressed (instead of its "stupid mode" uncompressed raw), which would make it 11.25X slower than one might expect. All it has to do is capture, then compress them and write to card; it’s not a burst capture but a known-in-advance capture of 4 or 16 frames—no technical limit stopping 30 fps.
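The slowdown figures above reduce to simple arithmetic; a sketch using the measured times from this post and the cameras’ nominal maximum frame rates:

```python
# How many times slower is pixel shift than the camera's frame rate allows?
def slowdown(measured_seconds: float, shots: int, max_fps: float) -> float:
    ideal_seconds = shots / max_fps   # best case at the maximum frame rate
    return measured_seconds / ideal_seconds

print(slowdown(2.0, 4, 10))   # A7R V, 4-shot at 10 fps  -> 5.0
print(slowdown(1.5, 4, 20))   # A1, 4-shot at 20 fps     -> 7.5
print(slowdown(1.5, 4, 30))   # A1, hypothetical 30 fps  -> 11.25
```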



OpenAI Image Generation — the Death of Photography? +Reader Comment

Is OpenAI to be the death of photography? Or a new form of creativity, driven by natural-language descriptions? Or the new tool for insecure people to generate artificially attractive images of themselves? And a million other things.

For example, watch the how-to video on making a portrait of a young woman and be prepared to be astounded at how good it is overall, and yet how many small but problematic issues there are. Then see the community showcase. Assume that the power of the technology will double every year, so that 10 years from now it will be 1024 times more powerful and the kinks will be worked out.

You should not have been trusting any images or videos for years now. This will only get worse once deep-fake videos are readily creatable at the push of a button by anyone. Reality and artificial imagery will merge, and you will not be able to tell the difference. Already most people can’t with the good deep fakes.

First try — fail

https://labs.openai.com

Tigger is a mackerel tabby cat, a natural born killer good for ~300 rodents per year. Keeps the entire neighborhood free of vermin. Almost daily cleanup job in the garage. He once killed and ate 5 rabbits in 2.5 days—the whole neighborhood is de-rodented much to the delight of most of my neighbors. We took him in as a feral hissing beast at ~1 year old, healthy and in his prime.

Dall-E refuses to generate a predator/prey image

AI is still like a nanny with an IQ of 63 when it comes to judgment. Or maybe it’s just woke programming, which is even scarier. My first attempt at an AI image failed.

“Mackerel tabby stalking rabbit in field” is unacceptable, but “mackerel tabby hunting rabbit in field” is a go.

AI technology is the new good, and the new evil*. This below captures everything you need to know about wet robots (you and I), and how you will be caged by all the technology around you. AI is going to change everything so fast your head will spin and the world will be radically different within 10 years.

* Killer AI drones are already here, flying and quadruped. Humanoid next. The Terminator won’t be science fiction much longer.

Impressive that an AI can turn “mackerel tabby hunting rabbit in field” into these images at all.

But they are bad mashups, with the perspective and behavior all wrong. AI will presumably improve. And yes, I know that properly phrased queries can deliver pretty good results; it takes skill. I just wanted to see what it could do.

Dall-E: “mackerel tabby hunting rabbit in field”

Not like real life, but interesting: “mackerel tabby cat devouring rabbit”.

Sorry, but “mackerel tabby cat devouring bloody headless rabbit” is forbidden by the AI nanny.

Dall-E: “mackerel tabby cat devouring rabbit”

Wrong summit hut, wrong placement, wrong perspective. But the third one is a decent start. Variations were a failure.

Dall-E: “mackerel tabby cat flying over Mt Whitney summit hut”

Reader Jason W writes:

Event Photography —  Zero impact. AI can't generate images of your wedding or the birth of your child, nor would you want it to.

[DIGLLOYD: Why not? Kauai Hawaii seems perfect: “wedding photos in Kauai Hawaii at the ZZZ Hotel courtyard, the beach area, the top of XYZ, boudoir photos of my wife enhanced in suitable places, .... Wife in wedding dress like <famous person>, husband dressed like <whatever>, including Fred, Tom, Joe, Emily, Charlotte and their kids ....". ]

Stock Photography —  This medium is already f*cked by the immoderate heap of real image content already available; that you can use AI to generate already near-valueless photography hardly seems destabilizing.

[DIGLLOYD: Shutterstock already uses AI and its use will grow exponentially I’d bet]

Product Photography — AI can't generate images of things it has never seen. Got a new product? You need to take a series of photos of it to get it into the system. After that maybe you can have AI generate images since it has learned what the product looks like, but there's still an initial requirement. For stuff like cars, a lot of it is already 3D imagery to begin with.

[DIGLLOYD: many product images are already computer generated from models, no need for a picture at all. AI will only improve upon that including placing things in scenes]

Portrait Photography —  Again, same problem as stock photography. You need an actual image of yourself first. Maybe you can do this with an iPhone app and then generate stock photos of yourself, but people might object to a machine generating images of them because they know it isn't real.

[DIGLLOYD: lots of existing images—of course AI cannot start from zero... so what?.  And AI can make you younger or older or anything you like].

Fine Art Photography —  The value of art photography depends on placement and the artist. In this area, AI is a tool, not the artist. It'll come down to what attracts people and is original. Overall, AI currently has a style that is identifiable even when it is imitating, and I think that's off-putting. It also still needs an original artist to generate like-kind imagery.

[DIGLLOYD: AI has already won art contests.   Maybe it will win them all before long.   Art is a set of rules along with a style that are not that complicated. It's just a matter of the input and maybe AI will be more creative than humans, since it can understand perception and persuasion more than any human can, and thus manipulate those brain functions.].

Landscape Photography —  Most landscape photography is travel photography. It's people documenting a time and place on their personal journey, and there's no point generating fake versions. For high-end art landscape photography, it comes down to fine art and how much people like it, and I know from personal experience that people "default to truth" and assume a fine art landscape print is effectively an accurate representation. When they learn it isn't, they tend to not like it or re-categorize it. This is why Epson has altered and unaltered categories for the Pano Awards.

[DIGLLOYD: maybe real photographs might turn into a 'thing' like platinum prints. Most people like fake landscape already; just look online—hyper fake "reality" that never existed with super saturated color and enhancements.].

DIGLLOYD: my comments inline, above. I don’t presume to imagine where this is all going, but I’d bet it’s going much faster and much farther than my imagination can easily contemplate.

OTOH, with Dall-E people are often grotesquely rendered, location and environment way off, etc. I am sure there are way better AI rendering systems out there and indeed there are, at least for people, as in the intro.

For example, just about everything but the crudest details are wrong here. More like a lame science fair project than anything persuasive. But maybe I just lack the right AI and the skills for proper input.

Dall-E: “Mercedes Sprinter van with open sliding door, titanium mountain bicycle leaning against it, top of White Mountain Peak, red head woman straddling bike”

Not too bad...

Dall-E: “two mackerel tabby cats perched on chair wearing a party hat and yawning”

Reader Comments

Michael E writes:

I’ve been using Midjourney AI pretty much since its inception. For my own use, I don’t imagine I’m making “Art” with it, but rather I use it for illustration. It marks the death of stock photo sites that don’t include AI output as part of their offerings, IMO.

I blog to some 11,000 people each day, and I use Midjourney to Illustrate the stories or articles I write. It not only saves money from licensing graphics, it allows me to tweak images until they reflect the emotional content of the blogs and articles I produce.

What will happen to stock photos is what happened to typesetting, which vanished back in the late 1960s–1970s as word processors and graphics software arose.

I saw the same thing when cellphone photos took the heart out of many professional photographers, and left them gnashing their teeth and cursing fate, or when Spotify and the like morphed the music industry.

IMO, AI graphics like Midjourney mark a sea change that will sweep through the world of graphics like a field fire, perhaps the most powerful graphic tool ever imagined, where a picture is not only worth a thousand words, but can be created with carefully arranged words and some patience.

I already use it every day for illustrating. And yes, it will threaten those who do not recognize AI graphics as the tool it is and learn to program and use it. It is a liberator and tool, not the end of art.

DIGLLOYD: it’s always hard to anticipate what disruptive technology will change.

Chris R writes:

Following on from your article the other day regarding AI taking over and virtually rendering mainstream photographers out of future business, here is an image and article just confirming what you have been discussing on your blog.

https://www.thesun.co.uk/tech/21265059/sunset-photograph-wins-contest-scary-twist/

DIGLLOYD: nice "shot". I would have been suspicious of it at the outset, at least as a heavily manipulated image.


Sony 16-shot Pixel Shift Works in the Field, Trounces the Fujifilm GFX100S, Might Match the PhaseOne IQ4 150

re: Sony pixel shift

I wondered whether Sony 16-shot pixel shift was usable in the field. Would it be better than 4-shot mode, and would it pan out as an ultra high resolution medium format solution?

So I went and shot my Alpine Creek scene against the Fujifilm GFX100S tonight.

The crop below is at an image size of 180 megapixels. The 60-megapixel sensor of the Sony A7R V with 16-shot makes short work of the GFX100S. And of its own 4-shot mode. Not 240 megapixels worth of detail (lens performance and diffraction impose limits), but 180MP is about right, or you can just cut it in half to 120MP and be blown away by what you get.

This test was not with some easy scene to make it look good; it was 16 frames at 0.8 seconds each along with moving water and foliage that was not entirely still.

There will be practical situational limitations of course eg 16 frames at 4 seconds each might not fly and any wind will be a problem (camera shake), but this first effort makes me wonder whether I might have the equivalent of a PhaseOne IQ4 150 in terms of resolution, and with vastly lower noise*. Not something I can apply in every situation, but something I can apply quite a lot.

The storage used is annoyingly huge with 16 uncompressed raw files for a single capture. All fixable if Sony gets its act together, via lossless compressed and compressed DNG using Sony motion correction right in camera. With options to save only the stuff I want, and no need whatsoever to do any 'post' other than use the raw file straightaway. It’s such low-hanging fruit that I am baffled why Sony has not already done it. Free money! What’s the holdup, Sony?

Perhaps field shooting will turn up other issues. Perhaps some scenes won’t work out, but since you cannot lose no matter what, it’s a no-brainer for many situations. Given what I am seeing, it appears that I have an about 180 megapixel camera on my hands that I can apply to quite a few real outdoor shooting situations.

I even shot a 2-frame focus stack with the Sony FE 12-24mm f/2.8 GM at 12mm with impressive detail capture. No other camera on the planet can make such an image (that wide, with resolution that high).

* Using 16 frames, light capture is 16X greater, for 4X lower noise, since shot noise improves with the square root of the exposure increase. Put another way, ISO 100 behaves like ISO 6.
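The square-root relationship is easy to verify with a quick simulation. This is a minimal sketch, assuming a single pixel with Gaussian-approximated shot noise; the signal and noise values are arbitrary, chosen only to show the scaling:

```python
import random
import statistics

random.seed(42)

TRUE_SIGNAL = 1000.0   # hypothetical mean signal per frame for one pixel
NOISE_SIGMA = 31.6     # shot noise ~ sqrt(signal) for a Poisson-like process
FRAMES = 16
TRIALS = 20000

def capture(n_frames: int) -> float:
    """Simulate averaging n_frames noisy exposures of the same pixel."""
    return statistics.fmean(
        random.gauss(TRUE_SIGNAL, NOISE_SIGMA) for _ in range(n_frames)
    )

# Residual noise = std dev of the averaged value across many simulated captures.
single = statistics.stdev(capture(1) for _ in range(TRIALS))
stacked = statistics.stdev(capture(FRAMES) for _ in range(TRIALS))

print(f"1-frame noise:  {single:.2f}")
print(f"16-frame noise: {stacked:.2f}")
print(f"improvement:    {single / stacked:.2f}x  (theory: {FRAMES ** 0.5:.0f}x)")
```

The simulated improvement comes out very close to the theoretical 4X. Note this models only the noise benefit; pixel shift also improves color sampling, which the sketch ignores.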

Sony A7R V + Voigtlander FE 35mm f/2 APO-Lanthar at f/5.6 vs
Fujifilm GFX100S + Fujifilm GF 35-70mm f/4.5-5.6 @ f/7.1

Actual pixels crop from 180 megapixel image

 



All Too Often: the Garbage Image Quality of Camera Phones with HEIC or JPEG

re: computational photography

What I refer to as computational photography is all the good stuff that improves image quality (resolution, noise, whatever) without losing anything. For example, Adobe Camera Raw Enhance Details, frame averaging, focus stacking, deconvolution (blur removal), etc.

This article below perfectly demonstrates why I keep stating that iPhone image quality (HEIC or JPG) is total freaking garbage. Unless you shoot RAW*. Or unless you pretend it’s fine by looking at it on a small screen (eg iPhone or iPad) whose extreme pixel density hides reality from you.

I keep trying to get my kids to shoot RAW, but since all they see photos on is their iPhone, they keep shooting crap-quality JPEGs on their phones and do not believe me that the photos suck in terms of detail.

For Apple (and presumably other phone vendors), it’s all about minimizing image size while making images look good to the eye at a very crude/coarse level (eg a tiny display).

HEIC is the newest gaslighting joke: it does next to nothing for quality vs JPEG, saving a little more space while producing the same trash. For the most part, all fine details are discarded, and this is plain to see for anyone who actually looks. There are exceptions that meet the compression model’s expectations, but skin, cloth, any kind of landscape, etc are all mutilated, typically cutting real resolution by a factor of 4X to 10X.

* TIP: you can shoot RAW on your camera phone. This sidesteps all the issues discussed here. There is nothing inherently bad about a camera phone sensor; all the nasty badness comes in the processing pipeline to HEIC/JPEG. Which need not be so—it is possible to create very high quality JPEG files.

The Limits of "Computational Photography"

Jan 2023, by Will Yager

I was recently discussing laser etching with an engineer/font-designer friend of mine, and I wanted to show him a picture of some really good laser etching on a particular piece of optical equipment.

No problem, I figured - I would just snap a quick picture on my phone and send him a text. Unfortunately, I ran into an unexpected problem; my phone simply could not manage to take a picture of the text. This article is a bit of a rant spurred by this annoyance, so please forgive any hand-waving and epistemic sloppiness I’m engaging in so that I can pound this out before I stop being annoyed.

Every time I tried to take a picture of the engraved text, the picture on my phone looked terrible! It looked like someone had sloppily drawn the text with a paint marker. What was going on? Was my vision somehow faulty, failing to see the rough edges and sloppy linework that my iPhone seemed to be picking up?

   

What is going on here? Well, I noticed that when I first take the picture on my iPhone, for a split second the image looks fine. Then, after some processing completes, it’s replaced with the absolute garbage you see here. Something in the iPhone’s image processing pipeline is taking a perfectly intelligible and representative (if perhaps slightly blurry) image and replacing it with an “improved” image that looks like crap.

...

Significantly more objectionable are the types of approaches that impose a complex prior on the contents of the image. This is the type of process that produces the trash-tier results you see in my example photos. Basically, the image processing software has some kind of internal model that encodes what it “expects” to see in photos. This model could be very explicit, like the fake moon thing, an “embodied” model that makes relatively simple assumptions (e.g. about the physical dynamics of objects in the image), or a model with a very complex implicit prior, such as a neural network trained on image upscaling. In any case, the camera is just guessing what’s in your image. If your image is “out-of-band”, that is, not something the software is trained to guess, any attempts to computationally “improve” your image are just going to royally trash it up.

DIGLLOYD: the example shown is obvious in its failures, though it’s a little unfair in having different size/resolution. Still, the total mutilation of all detail except the coarsest edges is self evident in the phone picture and it is precisely what the iPhone does to just about everything.

Real life is far worse: an iPhone picture of my skin makes me look like I have some nasty disease, skies are banded/stepped, textural detail of just about everything is smeared to oblivion.

Anon MD writes:

Your comments about crappy iPhone pics has an analogy to the world of medical photography past and present.

With the iPhone and similar now in the hands of gazillions of customers combined with platforms such as Instagram and their ilk pushing the populace to generate more gazillions of mostly useless pictures, it all comes down to quantity over quality. And if all you are doing is looking at images on your phone, who cares if they look OK there but like shit on anything bigger than a 3x4” screen?

When I started practice in 1984 the ophthalmic images we took were either taken by the MD or by a trained professional medical photographer. You had to know what you were taking a picture of, what details you wanted to enhance, and how to adjust focus and exposure to capture the relevant details of a three dimensional image. To do this required years of experience. Also, we were limited to one roll of film or less so you had to make every shot count. Plus film images took up a lot of physical storage.

As digital supplanted film and digital storage got cheaper and cheaper it all changed to quantity over quality. Now the “technicians”, who for the most part are feckless morons, (medical photographers having gone the way of the dodo except at huge medical institutions) have the ability to take hundreds of pictures of one patient in the literal blink of an eye and screw focus, exposure, or framing of the desired portion of the image, etc. cuz that’s for losers. It’s not really the “technician’s" fault - they are just mimicking the photographic behavior they now do in their personal smartphone life.

I’m not sure there is a moral to this story - just a recognition that if you want a quality image it will take time, experience, patience, and quality equipment and software.

The photographer George Lepp used to run a digital photographic training institute in Los Osos years ago. One of his published portfolios was on the California poppy. I heard that once after a photo shoot comprising some 900 images, he kept only one image and deleted the rest. I guess this is halfway between the requirements of the film world and the advantages of the digital world. But George was certainly no iPhone photographer.

DIGLLOYD: I really enjoy emails like this.



Copyright © 2022 diglloyd Inc, all rights reserved.