Playwright or Selenium?

Playwright and Selenium are the two big choices when picking a browser UI test-automation tool. Selenium has been around a long time. Playwright is the new kid in town.

I’ve used Selenium before. It is a pita but it mostly works. Unfortunately, even 99% “mostly works” is a problem when you are running hundreds of tests. That means something is always failing. So you wind up writing all your code with “wait-until-something-is-true” and “try-this-multiple-times-until-it-succeeds” features. And then it still fails once in a while, so you just have to repeat the whole test and then it works. The bigger the project, the worse this problem becomes.
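
To make that concrete, here is the kind of wait-and-retry wrapper you end up writing around everything. This is a rough sketch using Selenium’s WebDriverWait, not code from the actual project; the locator and timeouts are placeholders:

import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// Wait until the element is clickable, click it, and retry the whole thing a few times.
public static void clickWithRetry(WebDriver driver, By locator) {
   for (int attempt = 1; attempt <= 3; attempt++) {
      try {
         WebElement element = new WebDriverWait(driver, Duration.ofSeconds(10))
               .until(ExpectedConditions.elementToBeClickable(locator));
         element.click();
         return;
      } catch (Exception e) {
         System.err.println("click attempt " + attempt + " failed: " + e.getMessage());
      }
   }
   throw new RuntimeException("gave up clicking " + locator);
}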

In short, we use Selenium because we have to, not because we like it.

I had the opportunity to start a new project so I tried Playwright. Still a learning curve. Still requires a lot of work. But they took all the “wait-until” stuff and moved it “behind the scenes”. It still has to be done, but you, the programmer, don’t have to handle it yourself, unless you want to. Much better.
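
Here is a minimal sketch of what that looks like in the Java bindings; the URL and the button id are placeholders, not my real app. The point is that click() auto-waits for the element to be attached, visible, stable, and enabled, with no wait code in sight:

import com.microsoft.playwright.*;

public class AutoWaitSketch {
   public static void main(String[] args) {
      try (Playwright playwright = Playwright.create()) {
         Browser browser = playwright.chromium().launch();
         Page page = browser.newPage();
         page.navigate("https://example.com/app");   // placeholder URL
         page.locator("#saveMenu").click();          // hypothetical id; auto-waits before clicking
      }
   }
}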

After 2 weeks working with Playwright, I was still impressed. Progress was slow but steady. The hardest part was that I am using Vaadin as the front-end development tool, and it makes serious use of the shadow DOM, so each new element type took trial and error to get working. This would have been the same amount of work in either Playwright or Selenium.

I was also fighting against the “best practices” of Playwright. I like to use “id” attributes whenever I am selecting something, and I really like to use XPath. Yes, XPath can be brittle, but don’t kid yourself: UI testing is always going to be brittle. Now, Playwright doesn’t support XPath inside the shadow DOM, and I was constantly running into that problem. Eventually, everything I was doing with XPath was easily handled using CSS. For example: select the element with tag “vaadin-form-layout”, class “user-prefs”, and id “userPrefsId”. So I was thinking “xpath-ish” and it translated easily into CSS: vaadin-form-layout[class="user-prefs"][id="userPrefsId"], and so forth. Anyway, personal preference and nothing to do with the subject of Playwright vs Selenium.
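
For what it’s worth, Playwright’s CSS locators pierce open shadow roots automatically, which is why this translation works at all. A minimal sketch of the idea, continuing the earlier Java example (the tag, class, and id are made-up, not from my real project):

// CSS locators pierce open shadow roots automatically; XPath does not.
Locator prefs = page.locator("vaadin-form-layout[class='user-prefs'][id='userPrefsId']");  // hypothetical attributes
prefs.waitFor();  // an explicit wait is still available when you want one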

And then I read this guy and he scared me:

https://javascript.plainenglish.io/playwrights-auto-waiting-is-wrong-2f9e24beb3b8

https://zhiminzhan.medium.com/waiting-strategies-for-test-steps-in-web-test-automation-aaae828eb3b3

https://zhiminzhan.medium.com/why-raw-selenium-syntax-is-better-than-cypress-and-playwright-a0f796aafc43

https://zhiminzhan.medium.com/correct-wrong-playwrights-advantage-over-selenium-part-1-playwright-is-modern-and-faster-than-0a652c7e9ee7

https://zhiminzhan.medium.com/why-raw-selenium-syntax-is-better-than-cypress-and-playwright-part-2-the-audience-matters-a8375e6918e4

https://zhiminzhan.medium.com/playwright-vs-selenium-webdriver-syntax-comparison-by-example-4ad74ca59dcc

https://medium.com/geekculture/optimize-selenium-webdriver-automated-test-scripts-speed-12d23f623a6

His arguments were reasonable but not sufficiently detailed to be definitively persuasive. To summarize at a very high level, the best arguments were:

  1. Playwright did waiting wrong.
  2. Selenium is built on the WebDriver web standard, and Google/Facebook will make sure it stays up to date. Playwright could get left behind.

Ok, both of these are serious accusations. And I don’t feel qualified to comment on their validity.

Emotionally, the author, Zhimin Zhan, seemed a bit cranky. He certainly seems like an expert, but sometimes experts get cranky when their favorite technology gets left behind. Either possibility seemed plausible.

So I decided to redo the last 2 weeks of Playwright work in Selenium. It only took a few hours.

As soon as I ran the same tests in Selenium, I remembered why it was always so frustrating:

The first problem was with the Save-Menu. On startup it is inactive (disabled=”true”) and my tests assert that. However, the Selenium method isEnabled() returned true. Google “selenium isenabled not working”. My God! How many years now and that is STILL broken?

We shouldn’t all have to write this crappy code:

import org.openqa.selenium.WebElement;

// Cross-check isEnabled() against the "disabled" attribute, which has proven reliable.
public static boolean isEnabled(WebElement element) {
   boolean enabled1 = element.isEnabled();              // this can be wrong
   String disabled = element.getAttribute("disabled");  // this is reliable
   boolean disabled2 = disabled != null && Boolean.parseBoolean(disabled);
   if (enabled1 == disabled2) {                         // these should disagree; equal means a discrepancy
      System.err.println("discrepancy in isEnabled");
      enabled1 = !disabled2;                            // trust the attribute
   }
   return enabled1;
}

The next problem was when I clicked on a VaadinSideNavItem. I got this error:

org.openqa.selenium.ElementClickInterceptedException: element click intercepted: Element <vaadin-side-nav-item path="domain/person" 
id="CometPersonInit-peopleNav" role="listitem" has-children="">...</vaadin-side-nav-item> is not clickable at point (127, 92).
Other element would receive the click: <html lang="en" theme="dark">...</html>

The ‘theme=”dark”‘ element is adjacent to the side-nav button. So something was wrong with the point location calculation.

Setting an implicit wait period did not work. An explicit wait period did not work either. One thing that really sucks about wait code is that it swallows the exception so you don’t see the actual problem. (You are ignoring the exception and not logging it.) So when it fails, all you know is that it did not work for X seconds; not why.
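
A partial workaround, sketched below rather than copied from the project, is to write the retry loop yourself so the underlying exception gets logged and attached as the cause when you finally give up:

import java.time.Duration;
import java.util.function.Supplier;

// Retry an action until it succeeds or the deadline passes, logging every failure instead of swallowing it.
public static <T> T waitLoudly(Supplier<T> action, Duration timeout) {
   long deadline = System.currentTimeMillis() + timeout.toMillis();
   RuntimeException last = null;
   while (System.currentTimeMillis() < deadline) {
      try {
         return action.get();
      } catch (RuntimeException e) {
         last = e;                                   // remember the real reason
         System.err.println("still failing: " + e); // and keep it visible in the log
         try {
            Thread.sleep(500);
         } catch (InterruptedException ie) {
            Thread.currentThread().interrupt();
            break;
         }
      }
   }
   throw new RuntimeException("timed out after " + timeout, last);
}

Called as, say, waitLoudly(() -> { navItem.click(); return true; }, Duration.ofSeconds(10)), so when it finally fails the stack trace still carries the ElementClickInterceptedException that caused it.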

The third issue is that Selenium’s isDisplayed() method checks the element’s values but does not actually check whether the element has been scrolled into view. Playwright does it correctly (my interpretation of “is visible” is literal). Playwright also automatically scrolls elements into view when you try to act upon them. Very nice.
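
In Selenium you end up scrolling manually before acting; the usual workaround is a JavascriptExecutor call, roughly like this sketch:

import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;

// Selenium will not scroll for you, so do it explicitly before clicking.
public static void scrollIntoViewAndClick(WebDriver driver, WebElement element) {
   ((JavascriptExecutor) driver).executeScript("arguments[0].scrollIntoView({block: 'center'});", element);
   element.click();
}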

So I had 3 immediate frustrations with Selenium that Playwright took care of.

I sat back and googled some more. I found a video I liked where the speaker asserted that there was no comparison between the two.

At this point I was inclined to agree with him. And I’d invested a day to verify I was making the best decision. Out with Selenium. In with Playwright.

Now I am not saying Playwright is without difficulties. The codegen tool is more miss than hit when I use it to try to auto-generate the Locators. And I believe I have found a bug where it just fails to work correctly with Chromium. That was a huge frustration that cost a week of stoppage hell until I ran out of ideas and tried Firefox instead of Chromium. Firefox worked, and I was able to start moving again.
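
On the plus side, swapping engines in Playwright is a one-line change, which is what made that experiment cheap (again a sketch, continuing the earlier Java example):

// Same test code, different engine: launch Firefox instead of Chromium.
Browser browser = playwright.firefox().launch();   // was: playwright.chromium().launch()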

Gimp batch mode with gmic

For photo work, I use Aurora HDR 2019 and Gimp. Within Gimp, I use the GMIC plugin a great deal, as it has the best noise reduction and hot pixel reduction.

The one big weakness of Gimp is its lack of a batch mode. You cannot record some action on an individual image, then save that action and apply it to a group of images.

And I do a lot of work on groups of images.

There is a heavy duty script processing engine in gimp, but I find it inaccessible. And I’m a programmer! The basic problem is that I am a lazy programmer and I really don’t need another language unless I really need another language.

Well there is BIMP for batch Gimp operation.

I’ve done practically no BIMP. The basic operations that BIMP provides I can do in Irfanview.

Well I found myself needing to remove the hot pixels from a few hundred photos. The gmic function is remove_hotpixels. But this is the first time I have seen this documentation and the example is confusing. After reading this I thought I needed to write:

+remove_hotpixels _mask_size=3, _threshold=10,

And I tried many many options and nothing worked.

However, some of the simpler commands worked, so I knew it was just a matter of finding the right syntax.

This post is to record what worked. The image below shows the successful Bimp settings. (Except the input field would not expand.)

The function name is plug-in-gmic-qt

The input layer is something besides 0 (holy crap, that just generated output with no changes whatsoever and was a real pita to figure out)

The output mode is 0. Maybe some other options work but 0 works.

The command line string is “remove_hotpixels 3 10“, where 3 is the mask size and 10 is the threshold.

Don’t use a “plus” or “minus”. Don’t name your arguments.

Well that is it so far. Another inch of knowledge.

Update from 2020:

Well, I have given up trying to run batch gmic from within gimp. Too much of a hassle, too iffy, and far too slow.

Instead, I installed gmic, the command line tool. You can find it at https://gmic.eu/download.html.

Better to fight a horrendously hard-to-figure-out uphill battle from the command line than the same battle inside a clunky Gimp dialog.

The positive of using the command line is that we have well-known tools for iterating over multiple files. This is important because the “iterating over multiple files” part of the gmic command line doesn’t seem to exist. I can only figure out how to write a gmic script that reads one file and writes one file.

I can’t yet even figure out how to tell gmic to read a jpg and write a tiff.

But the “remove hotpixels” command is (on a Windows platform, using “cmd”):

for /r %f in (.\*) do gmic %f ^
-remove_hotpixels 3,10,,,Merged ^
-o %f

From the command line, cd to the folder containing the files you want to convert and paste the above text. It will recursively find all files in this folder and all sub-folders. It will open each file, run the “remove hotpixels 3,10” command, then save over the same file. And it will do it orders of magnitude faster than the same thing in gimp with bimp.

I also find anisotropic smoothing to be very useful. It is a nightmare trying to find the right combination of arguments to a gmic command. The best way is to set up the command in Gimp, then alter the settings to display the arguments. Still a pita, as you have to eyeball the text and re-type it into the command line; copy-paste does not work.

Here is a starting point for anisotropic smoothing:

for /r %f in (.\*) do gmic %f ^
-fx_smooth_anisotropic 80,0.7,0.3,0.6,1.1,0.8,30,2,2,0,1,0,0,50,50 ^
-o %f

The actual settings are listed here, but good luck converting the argument types to the integer command line values.

Not sure where the best documentation and tutorials for gmic are. Everything I have found requires you to know what everything means before being able to do anything. More notes added here as I learn.

https://manpages.ubuntu.com/manpages/trusty/fr/man1/gmic.1.html

http://gimpchat.com/viewtopic.php?f=10&t=19008

https://gmic.eu/tutorial/

G’mic Command Line: First steps

RIP Chester Williams

The death of Chester Williams hit me very, very hard today. I’ve written before how much the World Cup Final between South Africa and New Zealand mattered … the subject of the movie Invictus. And how important Chester was to that victory and the future of South Africa. In how he delivered the first big tackle to Jonah Lomu that set the tone for the entire game. It was at that moment I began to believe we could somehow win. Chester carried a heavy weight on his shoulders as the sole black on the team. People worried he was a “token player” and it was a fair concern because he was the first black to make it. I’d been studying him intensely all tournament (who hadn’t?) and knew he deserved to be there. But sports can be cruel and heroics seemed too much to hope for. Then, as I saw Lomu shudder and collapse, I began to hope and believe … and hope and believe … and then simply hoped and prayed and hung on until the end … like all the players on both sides. The greatest game of all time in which no one could score.

I cried like a baby after that game — the only game that ever mattered so much — and the deepest tears were because of Chester. If we’d won but he’d been a liability on the field, it would have been a setback to a country’s future. Instead, he had the game of a lifetime. And whites experienced pride and love for the gift of a black man’s pure courage in an arena that they understood viscerally. Many for the first time. The celebration of Chester was perhaps the first, honest, positive feeling that all colours could experience together.

I am sorry he died so young. But he is a legend and had 24 years of that knowledge. I will never forget my admiration of and debt to him.

Aurora HDR 2019 – Questionable RAW support of Sony RX10M4

I’ve had support communication with Aurora about problems reading Sony RX10M4 RAW files, which they claim to handle. At the time, I was complaining about invalid cropping: the corners of a RAW image contain vignetting (the dark edges), and Aurora did not remove it. Aurora support said that this is normal operation.

The following image shows this artifact. It is the same image in RAW and JPG, processed by Aurora without any subsequent processing. On the left is the RAW image and you can see the vignetting. On the right is the JPG image and you can see the vignetting was cropped and the image was enlarged to the same pixel dimensions.

Note that the JPG image was generated by the camera, i.e., I am saving in RAW+JPG format.

I also use the Sony Imaging Edge app to read RAW images on the computer and save them to JPG. When I do this with the RAW image, I get the identical JPG that is stored on the camera. So this tells me that Imaging Edge DOES choose to apply the vignetting correction when converting RAW.

Well I can live with this decision by Aurora. But that isn’t the full story. As you can tell, the above images are not the same otherwise. For example, the colors are different. Perhaps this is due to Aurora having better information from the RAW images and making more informed decisions. I can live with this too, if it is true. Color can be altered.

It is when you get to the details that the serious problems appear:

  1. Aurora has made different decisions about the cropping/expansion.
  2. The RAW image has bad noise artifacts.
  3. The RAW image has chromatic aberrations.

Remember that the RAW image is stored with all the imperfections: The sensor itself has noise artifacts, chromatic and lens aberrations. But the RAW image also contains the information that lets the processor compensate for them. Aurora is not doing so.

Look at a close-up image. Below is a portion of the tent on the left-hand side. On the left is the RAW image processed by Aurora. On the right is the exact same region from the JPG image.

First, we see that we have different shapes. This is what leads me to suspect that Aurora is not using the lens information to correct for distortion. (I am assuming that Imaging Edge IS correcting for it, as is the in-camera software.)

Second, look at the noise. The RAW image contains a lot of noise. In comparison, the JPG has eliminated that noise, but now has JPG noise artifacts.

Another close-up image below, showing the noise problem again AND the chromatic problem. On the left, the RAW image shows serious color error along the borders. The JPG image does not.

If Aurora is doing “correct” RAW image processing, I don’t want it. (This is why I am currently saving in RAW+JPG and only working with the RAW when I absolutely have to.) But really, I am seeing so many issues that I am not convinced that Aurora is processing the RAW correctly.

Back to tech support …

…Back from tech support. Confirmation that this is a bug.

Aurora HDR 2019 – Blowouts

Much to learn about Aurora HDR 2019 still. This particular problem appears when importing a set of bracketed images into an HDR: the starting point has blowouts, i.e., the whites are crushed to 100% and the information is lost forever.

Below is a set of 9 images. These are bracketed exposures, with 1 EV separation.

After import into Aurora, this is what I see:

Notes: I am actually importing the RAW images. Auto-alignment, ghost reduction, color denoise, and chromatic aberration reduction are all turned off.

Ignore the vignettes. This is a separate issue I am dealing with (Aurora processes the RAW information but does not do lens correction).

I have “white highlighting” turned on. All the red areas in the image are 100% white, i.e., crushed/blown-out. If you look closely at the histogram, you will see a vertical line at the 100% mark. That is a problem.

That vertical line and the red splotches means that information is lost and cannot be recovered. No amount of adjustments in Aurora will let me fix this.

An HDR should not do this. This is a bug in the tool. We know for a fact that at least 1 (actually more than 1) image in the set is not blown out at these locations. That information must be retained for further editing.

(Note: I proved that fact by importing only the darkest image into Aurora and noting that there were no blowouts.)

I could probably fix this by removing one or more of the over-exposed images and re-importing. But that isn’t the solution.

We’ll see what Aurora support has to say about it.

Sebago Resort dewinterizing – Memorial 2019

This was the first time that Robyn could not make it. We had some rain the first day, but otherwise great weather. Cold enough at night to need fires in the cabins. The lake water was really cold. My first time with waders, thanks honey. Replaced 2 dead trees and everything looking pretty good. And, of course, my new Sony RX10M4 camera.

Aurora HDR 2019 software. Sony RX10M4, 5-image bracketing with 2-stop gaps. One sequence every 2 seconds. Pete Townshend “Dirty Water”. Many struggles with Aurora’s inability to accept the number “5” as the correct answer to the grouping problem.

Whassup with the Aurora HDR app sharpen halo?

Still a week 2 noob here, but going through some old HDR work to learn Aurora HDR 2019. I had an old series of HDR images from Sebago that was interesting: Boats on a dock. Boats that move constantly even on a calm day, and thus are very difficult to deal with.

The original base images are here:

base-1
base-2
base-3
base-4

I used Hugin to align the images and produce the HDR image shown below:

Hugin produced HDR image

I thought this was amazingly good. You can see the blur artifacts from the movement of the boats (the number 15; the motors, etc), but if you don’t look too closely this is a nice image.

Below is the attempt with Aurora HDR 2019, Version 1.0.0.2549. I imported the base images with Auto Alignment, Ghost Reduction, and Chromatic Aberration Reduction all turned on. Then I turned off all the Filters. (Meaning this is not going to be the final result.)

Aurora version

Obviously a fantastic job with the blurring. Amazing. I can work with this.

Except I can’t. Something is causing sharpening halos. I’ve tried turning all the filters on and off, but even with everything off, nothing eliminates those halos, afaict. So where did they come from?

Here is a closeup of the problem: Look at the sides of the posts. Hugin on the left shows that it is possible to combine the images without excessive halos. Aurora on the right shows unacceptable halos.

Not all my images show excessive halo. I did a series of work with a much better camera at http://www.clevercaboose.com/2019/05/07/tablerock-trip-20190414/. But now I needed to stop and figure out what was causing this.

I created a support ticket with Aurora. Quick response:

..it’s quite normal behavior of the software…the RAW photos you’ve sent to us are quite low-resolution ones, and they come out aligned pretty well considering their resolution and format. You might have achieved a better result with RAWs though. The halos you may see are not halos, but light on the photo increased by our tone-mapping powered by AI which increases the contrast.

Contrast that looks like a noob just discovered the unsharp mask filter! I’d rather have a more adult AI.

A note on the word “halo”. Specifically I am talking about overshoot and undershoot.

But perhaps the problem IS low resolution. These ARE old photos.

Try again with 2019 quality images:

hidef-base-1
hidef-base-2
hidef-base-3

Here is an HDR composite with ALL Aurora filters turned off.

ALL aurora filters turned off

The light on the horizon IS “haloed”, excuse me, AI-powered contrasting. But there is also contrast from the original images. Has Aurora added to this even with all filters turned off? Below is an area on the horizon zoomed in:

Zoomed in horizon

If there is any halo, it is present in the originals as well. I have no cause for complaint. Aurora is excused.

Caveat: Play around with the filters (in particular the “HDR Smart Structure”) and you will find many ways to create crappy sharpening. So don’t turn the knob to 11.

So the issue is resolved. But be careful working with low resolution images. The AI isn’t very smart all the time.

Update 6/1/2019: Here is an extreme example with hi-rez images. Base images first. This is 9 images with 1 EV bracketing.

And here is an HDR of all 9 images. I aligned the images (this was hand-held), applied no ghost reduction, and selected the “essential/vivid” template.

9-image HDR. Align images. No ghost reduction. Essential/vivid template.

Below is the same 9-image HDR with all the same settings EXCEPT medium ghost reduction is turned on. And here we see the problem: That is the worst halo I have ever seen.

Previously I was frustrated because I could not find a setting to turn off whatever was causing the halo. Now it is clear why: That halo is baked in when the image is created.

Robyn’s last game with Mike Cleary, Feb 18, 2018

Today was the last game of 10 years of my daughter playing for this man. When she started she could outrun anyone on the field, but a gust of wind might blow her over. A gazelle on a top team in the city. She got to travel far away to play in miserable weather. She got to learn that she is deeply competitive, even ruthless, but never dirty or unfair. She got to learn team success. She got to make mistakes that cost the team the tournament. She got to sit on a bench and wait for another chance. She got another chance and failed at that too. She got to endure two frustrating years playing only 1/3 of a game on a struggling no longer top team in the city. She got to learn that if failure is one’s fate, then not complaining and continuing to work hard every day is how you beat it. That character and perseverance is necessary to win in the long run.

She got to learn that she is a Marine by nature: That she’ll take that hill or die trying. Anything less is unthinkable. And that most people are unlike her, so she’ll probably spend much of her time climbing that hill alone. But that the view is worth it.

She got to learn that life is unfair. But that even if one is too small to hold onto the ball, one can always develop a great first touch so as never to be caught with the ball in the first place. Which now makes one a great passer. So weakness overcome at one stage of development becomes an advantage at the next. And that when height and strength finally do arrive, payback is beautiful.

The gazelle is now a panther.

I didn’t always have faith my daughter was going to be a good player. Those failures were painful, and it was my first rodeo. She had athleticism, but lacked skill and received no reward. I didn’t think anyone else had faith in her either, perhaps even her coach. 5 years in, after another disappointing year, I asked if we needed to move on. He was shocked at the question, answering: “Absolutely not. She is about to become very good and I have big plans for her.”

(What he only told us later was that he was alone in that opinion. All the other coaches had recommended he drop her.)

That meeting ended our concern. I trusted the man. That summer she grew, and that fall everything changed for the better.

3 girls who played all 10 years together. 8 who played the last 6. That just doesn’t happen. You have to be incredibly lucky.

Lucky to have a coach powerful enough to win the internal club battles and keep the team together. Who has always done the right thing for the player, as opposed to himself. Who cares about character and chemistry and playing soccer the right way. Who praises a loss played well, and criticizes a win played badly. Who criticizes without belittling. Who believes that fundamentals cannot be taught via shortcuts. Who demands you play it out of the backfield. That you pass to build up an attack instead of just kicking it downfield. Even if it means you lose for a few years. That you need to take that hill the way he tells you to take that hill, and if you die today while trying, today isn’t the most important matter.

Where you are in 10 years is the most important matter.

Thank you, Michael Cleary. Your character matters, and that is why this team always stayed together. My daughter has been incredibly lucky to have you as her coach. May you have many more teams of quality, and may you be respected and rewarded for being the great coach you are.