matplotlib.test(): no errors, but $ nosetests matplotlib.tests -> errors and a failure?

Hello, I have just updated to v1.0.0 and am trying to run the test
suite to make sure everything is OK. There seem to be two different
suites and I am not sure which is correct/current:

$ python -c 'import matplotlib; matplotlib.test()'
[...snipped output...]
Ran 138 tests in 390.991s
OK (KNOWNFAIL=2)

Running $ nosetests matplotlib.tests, I get:
[...snipped output...]
Ran 144 tests in 380.165s
FAILED (errors=4, failures=1)

Two of these errors are the known failures from above, and the other
two are in "matplotlib.tests.test_text.test_font_styles":
ImageComparisonFailure: images not close:
  /home/adam/result_images/test_text/font_styles.png vs.
  /home/adam/result_images/test_text/expected-font_styles.png (RMS 23.833)
ImageComparisonFailure: images not close:
  /home/adam/result_images/test_text/font_styles_svg.png vs.
  /home/adam/result_images/test_text/expected-font_styles_svg.png (RMS 12.961)

The test that fails is:

FAIL: matplotlib.tests.test_mlab.test_recarray_csv_roundtrip
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.6/dist-packages/nose-0.11.4-py2.6.egg/nose/case.py", line 186, in runTest
    self.test(*self.arg)
  File "/usr/local/lib/python2.6/dist-packages/matplotlib/tests/test_mlab.py", line 24, in test_recarray_csv_roundtrip
    assert np.allclose( expected['x'], actual['x'] )
AssertionError
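
For context, the failing assertion checks a CSV round trip through
matplotlib.mlab. Here is a rough sketch of the idea (simplified by hand;
the actual test body in matplotlib/tests/test_mlab.py may differ):

    import numpy as np
    from matplotlib import mlab
    from StringIO import StringIO  # Python 2, matching the traceback above

    # A small record array with a float column 'x', written out as CSV
    # and read back in; the test asserts the floats survive the trip.
    expected = np.rec.fromrecords([(1, 1.5), (2, 2.5)], names=['i', 'x'])

    buf = StringIO()
    mlab.rec2csv(expected, buf)
    buf.seek(0)
    actual = mlab.csv2rec(buf)

    assert np.allclose(expected['x'], actual['x'])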

I am not sure how important these are, but I wanted to ask whether I
should do anything or whether they can safely be ignored.

Thanks,
Adam.

Of course, we'd prefer to see all of the tests pass...

I'm surprised the two modes of running the tests give different results. Are you sure they are running the same Python interpreter? Does

    python `which nosetests` matplotlib.tests

give you the same result as

    nosetests matplotlib.tests

?

There must be some environmental difference between the two to cause the different results.
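
One way to pin that down (a sketch: these are ordinary introspection
one-liners, not anything matplotlib-specific) is to run the same few lines
under plain python and again under the interpreter named in the shebang of
`which nosetests`, and compare the output:

    # Run under both entry points and compare what gets printed.
    import sys
    import matplotlib
    import nose

    print(sys.executable)          # which interpreter is actually running
    print(matplotlib.__version__)  # should report 1.0.0 in both cases
    print(matplotlib.__file__)     # same installed copy of matplotlib?
    print(nose.__version__)        # and the same nose?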

Mike


Hmm... surprisingly, I am actually able to reproduce this sort of behaviour here. I'll look into it further.

Mike


Blast from the past!

I just ran into this, and it comes from the fact that
'matplotlib.tests.test_text' is not in the default_test_modules
variable inside matplotlib's __init__.py.

Here's the necessary diff:

index 82633a5..649e4d8 100644
--- a/lib/matplotlib/__init__.py
+++ b/lib/matplotlib/__init__.py
@@ -968,7 +968,8 @@ default_test_modules = [
     'matplotlib.tests.test_spines',
     'matplotlib.tests.test_image',
     'matplotlib.tests.test_simplification',
-    'matplotlib.tests.test_mathtext'
+    'matplotlib.tests.test_mathtext',
+    'matplotlib.tests.test_text'
     ]
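
This also explains the different test counts earlier in the thread:
matplotlib.test() runs only the modules named in default_test_modules,
whereas nosetests matplotlib.tests discovers every test module under the
package. Roughly, the mechanism looks like this (a simplified sketch, not
the exact 1.0.0 source):

    def test(verbosity=1):
        """Run matplotlib's test suite via nose (sketch)."""
        import nose
        # Only the modules listed in default_test_modules get collected,
        # so a module missing from the list (here, test_text) is silently
        # skipped. Plain "nosetests matplotlib.tests" has no such filter,
        # hence 144 tests instead of 138.
        return nose.run(defaultTest=default_test_modules)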

I opened a pull request for this two-line change, just in case
there was a specific reason to *exclude* test_text from the test
modules.

For instance, right now I get one failure in the test suite if I
include it. The failure is in test_text:test_font_styles, but it
has been failing for a while; it's just that these tests
weren't being run before.
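
For what it's worth, the RMS number in those ImageComparisonFailure
messages is a root-mean-square pixel difference between the rendered image
and the expected one. A minimal sketch of that metric (my own illustration,
assuming a 0-255 scale; matplotlib has its own comparison machinery in
matplotlib.testing):

    import numpy as np
    from matplotlib import pyplot as plt

    def rms_difference(expected_png, actual_png):
        # plt.imread returns PNG data as floats in [0, 1]; scale to 0-255
        # so the result is comparable to the RMS values reported above.
        expected = plt.imread(expected_png).astype(np.float64) * 255
        actual = plt.imread(actual_png).astype(np.float64) * 255
        return np.sqrt(np.mean((expected - actual) ** 2))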

Any developers want to chime in on this?

best,


--
Paul Ivanov
http://pirsquared.org | GPG/PGP key id: 0x0F3E28F7

