Comparing PDF output in tests

Andrew Straw <strawman@...36...> writes:

Michael Droettboom wrote:

We can probably standardize the version of gs on the buildbot machines,
but it's been very useful up to now to have tests that can run on a
variety of developer machines as well.

I understood Jouni's idea to be to save the .pdfs as baseline images --
then the same version of gs would be used to generate the rasterized
images for the baseline and test result -- the version on your computer.
I think this is the way to go (either that or compare the PDFs directly
somehow).

Yes, that's what I meant: we want to test that the PDF file generated by
the code is "equivalent" to the baseline, and aside from some metadata
in the files, I think "equivalence" should mean that the files generate
the same rasterized output on some particular PDF renderer.
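A minimal sketch of that kind of comparison, assuming gs is on the PATH (the function names and flat byte-for-byte PNG comparison are illustrative, not matplotlib's actual test code):

```python
import subprocess

def gs_command(pdf_path, png_path, dpi=100):
    # Build the Ghostscript command line that rasterizes a PDF to PNG.
    return ["gs", "-dBATCH", "-dNOPAUSE", "-dSAFER",
            "-sDEVICE=png16m", "-r%d" % dpi,
            "-sOutputFile=%s" % png_path, pdf_path]

def pdfs_match(pdf_a, pdf_b, dpi=100):
    # Rasterize both PDFs with the *same* local gs and compare the
    # resulting PNG bytes; since one renderer produced both images,
    # any difference points at the PDFs, not at renderer versions.
    for pdf, png in [(pdf_a, "a.png"), (pdf_b, "b.png")]:
        subprocess.check_call(gs_command(pdf, png, dpi))
    with open("a.png", "rb") as fa, open("b.png", "rb") as fb:
        return fa.read() == fb.read()
```

A real test would likely compare pixels with a tolerance rather than raw file bytes, but the point is that both renderings come from the same gs install.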

I suppose Ghostscript is widespread enough that we can assume that it
exists in the test environment? Or is there some buildout magic that we
should add in some file?

···

--
Jouni K. Seppänen

Jouni K. Seppänen wrote:

···

We can always run "gs --version" with subprocess.check_call(), and if
it's not installed (or if there's some other error with it) just skip
comparing the pdf output. That way we still test that pdf generation at
least doesn't raise an exception, and a test pdf is generated for later
inspection if need be.
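A sketch of that availability check (the function name is made up here; only the "gs --version" probe is from the suggestion above):

```python
import subprocess

def ghostscript_available():
    # Probe for Ghostscript by running "gs --version". If the binary
    # is missing (OSError) or exits nonzero (CalledProcessError),
    # report it as unavailable so the caller can skip the PDF
    # comparison while still exercising PDF generation itself.
    try:
        subprocess.check_call(
            ["gs", "--version"],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL)
        return True
    except (OSError, subprocess.CalledProcessError):
        return False
```

The test decorator would then compare rasterized output only when this returns True, and otherwise just assert that the PDF was written without an exception.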

Jouni - I don't think this would be hard to add, but I'm swamped at
work. If this is an itch you'd like to scratch, feel free to hack away
on the image_comparison() function in
lib/matplotlib/testing/decorators.py -- it's a pretty straightforward
piece of code.

-Andrew