Hello, I have just updated to v1.0.0 and am trying to run the test
suite to make sure everything is OK. There seem to be two different
suites, and I am not sure which is correct/current:
$python -c 'import matplotlib; matplotlib.test()'
[...snipped output...]
Ran 138 tests in 390.991s
OK (KNOWNFAIL=2)
With $ nosetests matplotlib.tests I get:
[...snipped output...]
Ran 144 tests in 380.165s
FAILED (errors=4, failures=1)
Two of these errors are the known failures from above, and the other
two are in "matplotlib.tests.test_text.test_font_styles":
ImageComparisonFailure: images not close:
/home/adam/result_images/test_text/font_styles.png vs.
/home/adam/result_images/test_text/expected-font_styles.png (RMS
23.833)
ImageComparisonFailure: images not close:
/home/adam/result_images/test_text/font_styles_svg.png vs.
/home/adam/result_images/test_text/expected-font_styles_svg.png (RMS
12.961)
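The RMS number in these failures is a root-mean-square pixel difference between the rendered image and the stored baseline. A minimal sketch of the idea in plain Python (hypothetical pixel lists, not matplotlib's actual comparison code, which lives in matplotlib.testing.compare):

```python
import math

def rms_difference(expected, actual):
    """Root-mean-square difference between two equal-length
    sequences of 0-255 pixel values."""
    if len(expected) != len(actual):
        raise ValueError("images must have the same number of pixels")
    total = sum((e - a) ** 2 for e, a in zip(expected, actual))
    return math.sqrt(total / len(expected))

# Identical pixels give 0.0; small rendering differences (e.g. font
# hinting or antialiasing) give a small nonzero RMS, which the test
# compares against a tolerance.
print(rms_difference([10, 20, 30], [10, 20, 30]))  # 0.0
print(rms_difference([10, 20, 30], [13, 16, 30]))
```

An RMS in the 20s, as in the first failure above, means the average per-pixel difference is fairly large; in font tests that often points at different fonts or font-rendering settings on the local machine rather than a plotting bug.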
Of course, we'd prefer to see all of the tests pass...
I'm surprised the two modes of running the tests give different results. Are you sure they are running the same Python? Does
python `which nosetests` matplotlib.tests
give you the same result as
nosetests matplotlib.tests
?
There must be some environmental difference between the two to cause the different results.
Mike
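One quick way to check for such an environmental difference is to run a small script under both interpreters (plain `python`, and the interpreter named on the first line of `which nosetests`) and compare the output; this uses only standard sys attributes:

```python
import sys

# Print which interpreter is running and which matplotlib it imports;
# run this under both "python" and the interpreter used by nosetests.
print("interpreter:", sys.executable)
print("version:", sys.version.split()[0])

try:
    import matplotlib
    print("matplotlib:", matplotlib.__version__, "from", matplotlib.__file__)
except ImportError:
    print("matplotlib is not importable from this interpreter")
```

If the two runs print different interpreter paths or different matplotlib locations, that would explain the differing test counts.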
···
On 07/24/2010 05:09 PM, Adam wrote:
[...quoted text snipped...]
The module that fails is:
FAIL: matplotlib.tests.test_mlab.test_recarray_csv_roundtrip
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/local/lib/python2.6/dist-packages/nose-0.11.4-py2.6.egg/nose/case.py",
line 186, in runTest
self.test(*self.arg)
File "/usr/local/lib/python2.6/dist-packages/matplotlib/tests/test_mlab.py",
line 24, in test_recarray_csv_roundtrip
assert np.allclose( expected['x'], actual['x'] )
AssertionError
I am not sure of the importance level of these - but I wanted to ask
to see if I should do anything or if they can safely be ignored.
Thanks,
Adam.
_______________________________________________
Matplotlib-users mailing list
Matplotlib-users@lists.sourceforge.net
Hmm... surprisingly, I am actually able to reproduce this sort of behaviour here. I'll look into it further.
Mike
···
On 07/27/2010 09:49 AM, Michael Droettboom wrote:
[...quoted text snipped...]
I just ran into this: it comes from the fact that
'matplotlib.tests.test_text' is not in the default_test_modules
variable inside matplotlib's __init__.py.
I added a pull request for this two-line change, in case
there was a specific reason to *exclude* test_text from the test
modules.
For instance, right now I get one failure in the test suite if I
include it. The failure is in test_text:test_font_styles, but
this has been the case for a while; it's just that these tests
weren't running before.
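In other words, matplotlib.test() runs only an explicit list of modules, while nosetests matplotlib.tests discovers every test_*.py file in the package on disk, so a module missing from the list is silently skipped. A toy illustration (module names abridged and hypothetical, not the real list):

```python
# Hypothetical, abridged module lists illustrating the mismatch.
run_by_matplotlib_test = {
    'matplotlib.tests.test_agg',
    'matplotlib.tests.test_mlab',
}

# nose walks the package on disk, so it also finds modules that were
# never added to default_test_modules.
discovered_by_nosetests = run_by_matplotlib_test | {
    'matplotlib.tests.test_text',
}

missing = discovered_by_nosetests - run_by_matplotlib_test
print(sorted(missing))  # ['matplotlib.tests.test_text']
```

The two-line fix is simply appending 'matplotlib.tests.test_text' to default_test_modules so that both ways of running the suite see the same set of modules.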