Regression tests failing?

Jouni K. Seppänen <jks@...278...> writes:

It looks like Jouni wrote that test... Jouni: could you verify that the
difference isn't significant? If it's not, we can just create a new
baseline image.

I can't replicate the problem: for me "nosetests
matplotlib.tests.test_image:test_figimage" passes (Python 2.6.1,
matplotlib revision 8276, OS X 10.6.3). Could you post the failing
images somewhere?

Thanks for the images, Michael. If I change all E1 bytes in the current
baseline image to E0 bytes, the images match perfectly. Sounds like a
change in the discretization of floating-point values to integer bytes.
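That kind of one-bit shift is easy to reproduce. Here is a minimal sketch (the value 0.8804 is just an arbitrary channel value chosen to sit near the 224.5/255 midpoint; this is an illustration, not the actual matplotlib conversion code):

```python
import numpy as np

# 0xE0 == 224 and 0xE1 == 225.  A float channel value just above the
# midpoint 224.5/255 becomes 0xE0 if the float-to-byte conversion
# truncates, but 0xE1 if it rounds first -- a one-bit difference per
# pixel, matching the E1 -> E0 shift seen in the baseline image.
x = 0.8804                          # x * 255 == 224.502
truncated = np.uint8(x * 255)       # truncation gives 224 (0xE0)
rounded = np.uint8(x * 255 + 0.5)   # rounding gives 225 (0xE1)
print(hex(truncated), hex(rounded))
```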

The difference is definitely not significant, but as I said, I get the
old image on my system. I don't object to changing the baseline image,
but the best solution would be to make the image diff less sensitive.
Unfortunately I don't have the time to pursue this now.
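One way to make the diff less sensitive is to fail only when the root-mean-square pixel difference exceeds a tolerance. This is a sketch with a made-up helper name, not the actual comparison code in the test harness:

```python
import numpy as np

def images_close(expected, actual, tol=10.0):
    """Hypothetical helper: pass when the RMS pixel difference between
    two uint8 image arrays is within tol, so one-bit quantization
    shifts (like 0xE1 vs 0xE0) don't fail the test."""
    diff = (np.asarray(expected, dtype=np.float64)
            - np.asarray(actual, dtype=np.float64))
    return float(np.sqrt(np.mean(diff ** 2))) <= tol

# A result that differs from the baseline by exactly one bit per pixel:
baseline = np.full((4, 4), 0xE1, dtype=np.uint8)
result = np.full((4, 4), 0xE0, dtype=np.uint8)
print(images_close(baseline, result))   # RMS is 1.0, within tolerance
```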


--
Jouni K. Seppänen
http://www.iki.fi/jks

Jouni K. Seppänen wrote:

Thanks for the analysis. I'll look into the image diffing and see what can be done.

Mike


--
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA

I made a small change to the test harness so that the tolerance can be set on a per-test basis. I increased the tolerance for the figimage test so it passes on my machine.

Cheers,
Mike
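The shape of that change can be sketched as a comparison decorator whose tolerance each test overrides. This is a simplified stand-in under assumed names, not matplotlib's actual test-harness code:

```python
import functools

import numpy as np

def image_comparison(baseline, tol=1e-3):
    """Simplified sketch of a per-test tolerance: compare the image a
    test returns against its baseline, failing only when the RMS
    difference exceeds the tolerance given for that particular test."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            actual = func(*args, **kwargs)
            diff = np.asarray(baseline, float) - np.asarray(actual, float)
            rms = float(np.sqrt(np.mean(diff ** 2)))
            assert rms <= tol, "image RMS %.3g exceeds tolerance %g" % (rms, tol)
            return actual
        return wrapper
    return decorator

# A test whose output is one bit off everywhere passes with tol=2.0:
baseline = np.full((2, 2), 0xE1, dtype=np.uint8)

@image_comparison(baseline, tol=2.0)
def test_figimage_like():
    return np.full((2, 2), 0xE0, dtype=np.uint8)

test_figimage_like()   # RMS is 1.0 <= 2.0, so the test passes
```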
