Format date tick labels

Hi,

I am having trouble customizing the format of date-axis tick labels. Below is a snippet that demonstrates my problem:

#------------------------------------- code ----------------------------------------------------

import datetime
import numpy as np
import matplotlib as mpl
from pylab import figure, arange, randn

# Make an example plot.
fig = figure()

tsta = mpl.dates.num2epoch(mpl.dates.date2num(datetime.datetime.now()))
tarr = tsta + arange(0, 60*60*0.5, 0.1)  # half hour, dt = 0.1 sec
x = np.array(mpl.dates.num2date(mpl.dates.epoch2num(tarr)))
nt = len(tarr)
y = randn(nt)

ax = fig.add_subplot(111)
ax.plot(x, y)
fig.canvas.draw()

# Make a formatter instance.
locator = mpl.dates.AutoDateLocator()
formatter = mpl.dates.AutoDateFormatter(locator)

# Customize the scaling.
formatter.scaled = {
    365.0          : '%Y',     # view interval > 365 days
    30.            : '%b %Y',  # view interval > 30 days but less than 365 days
    1.0            : '%b %d',  # view interval > 1 day but less than 30 days
    1./24.         : '%H:%M',  # view interval > 1 hour but less than 24 hours
    1./24./60.     : '%M:%S',  # view interval > 1 min but less than 1 hour
    1./24./60./60. : '%S',     # view interval < 1 min
}

# Apply the formatter and redraw the plot.
ax.get_xaxis().set_major_formatter(formatter)
fig.canvas.draw()

#---------------------------------- end of code -------------------------------------------------

I listed my expectations for the new scaling in the comments above. However, what I got was simply '%Y' after I applied the formatter. So, how can I make the new formatter work, please? Thank you very much.

Jianbao

Confirmed. It seems that code has fallen out of maintenance somewhere along the way. Luckily, there is a quick fix to get your code working again. Just add the following two lines after you set the major formatter, but before your final draw:

locator.set_axis(ax.xaxis)
locator.refresh()

Note that if you zoom in and out after that call, I doubt it will re-adjust your formatter. Could you file a bug report, please?
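For what it's worth, installing the locator on the axis as its major locator (rather than only passing it to the formatter) should also let the formatter track the current view on every draw. A rough sketch, not a tested guarantee; the Agg backend, `mdates` alias, and sample data are just for the example:

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, just for this sketch
import datetime
import matplotlib.pyplot as plt
import matplotlib.dates as mdates

fig, ax = plt.subplots()
t0 = datetime.datetime(2012, 10, 7)
x = [t0 + datetime.timedelta(seconds=10 * i) for i in range(180)]  # half hour
ax.plot(x, range(len(x)))

# Install the locator on the axis as well as in the formatter, so the
# locator always knows the current view interval and the formatter can
# derive its tick scale from it at draw time.
locator = mdates.AutoDateLocator()
formatter = mdates.AutoDateFormatter(locator)
ax.xaxis.set_major_locator(locator)
ax.xaxis.set_major_formatter(formatter)
fig.canvas.draw()
```

With the locator attached to the axis, there should be no need for a manual set_axis/refresh step after each zoom.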

Cheers!
Ben Root

···

On Sun, Oct 7, 2012 at 6:47 PM, Jianbao Tao <jianbao.tao@…287…> wrote:


Thanks, Ben.

Your fix works when the view interval is greater than one minute, but not so well when it is less than one minute.

BTW, what I am trying to accomplish is to use matplotlib to plot time-series data that can be as long as several days and as short as a few milliseconds. So, do you think I am better off handling the format manually myself instead of deriving it from the auto locator? I am skeptical about the auto locator now because it doesn't seem to be designed for intervals of less than 1 second.

Cheers,

Jianbao

···

On Mon, Oct 8, 2012 at 6:16 AM, Benjamin Root <ben.root@…1836…304…> wrote:


Jianbao, your smallest scale in your AutoDateFormatter dictionary is 1 second. The formatter is designed to find the smallest key that is still greater than or equal to the scale of your axis. So, if your scale is less than 1 second, then you will notice formatting issues. I would try adding a few more entries to see if that would help.
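For example, one might extend the dictionary downward past one second. The sketch below also mimics the formatter's lookup (smallest key that is still >= the tick scale, in days) in a hypothetical `pick_format` helper, just to show which entry a given scale would hit:

```python
# Candidate formatter.scaled entries extended below one second (keys are
# the tick scale in days, matplotlib's date unit).
scaled = {
    365.0: '%Y',
    30.0: '%b %Y',
    1.0: '%b %d',
    1. / 24.: '%H:%M',
    1. / (24. * 60.): '%M:%S',
    1. / (24. * 3600.): '%S',
    1. / (24. * 3600. * 1000.): '%S.%f',  # millisecond ticks
}

def pick_format(scale_days, scaled):
    # Mimic AutoDateFormatter's lookup: use the format of the smallest key
    # that is still >= the scale; for scales beyond the largest key, fall
    # back to the coarsest entry (a simplification of its default format).
    for key in sorted(scaled):
        if scale_days <= key:
            return scaled[key]
    return scaled[max(scaled)]

# A half-millisecond tick scale now lands on the sub-second entry.
print(pick_format(0.0005 / 86400., scaled))  # '%S.%f'
```

The key point: without an entry below 1/(24*3600), any sub-second scale falls through to the '%S' format, which is what the original dictionary produced.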

I know of a few people who have difficulties with matplotlib's datetime handling, but they are usually operating on the scale of milliseconds or less (lightning data), in which case one is already at the edge of the resolution handled by Python's datetime objects. However, we would certainly welcome any examples of how matplotlib fails when handling plots at the seconds scale and below.

Cheers!
Ben Root

···

On Mon, Oct 8, 2012 at 12:06 PM, Jianbao Tao <jianbao.tao@…287…> wrote:


I'll assume that the milliseconds above is a typo. From http://docs.python.org/library/datetime.html: "class datetime.timedelta — A duration expressing the difference between two date, time, or datetime instances to microsecond resolution." Still, what's a factor of 1000 amongst friends? :-)
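A quick interactive check of that limit:

```python
import datetime

# timedelta's documented finest resolution is one microsecond.
print(datetime.timedelta.resolution)  # 0:00:00.000001

# Sub-microsecond inputs are rounded away (round-half-to-even on the
# microsecond count), so half a microsecond rounds to zero.
half_us = datetime.timedelta(microseconds=0.5)
print(half_us.total_seconds())  # 0.0
```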

···

On 10/10/2012 14:29, Benjamin Root wrote:


--
Cheers.

Mark Lawrence.

PEP 418, "Add monotonic time, performance counter, and process time functions" (http://www.python.org/dev/peps/pep-0418/), has been implemented in Python 3.3 and talks about clocks with nanosecond resolution. I've flagged it up here just in case people weren't aware.
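For anyone curious, the PEP 418 additions look like this (Python 3.3+ only; the resolutions reported are platform-dependent):

```python
import time

# A monotonic, high-resolution counter for measuring short intervals.
t0 = time.perf_counter()
time.sleep(0.01)
elapsed = time.perf_counter() - t0
print('elapsed: %.6f s' % elapsed)

# Each PEP 418 clock advertises its properties via get_clock_info().
for name in ('monotonic', 'perf_counter', 'process_time'):
    info = time.get_clock_info(name)
    print('%s: resolution %g s, monotonic=%s'
          % (name, info.resolution, info.monotonic))
```

Note these are clocks for measuring elapsed time; they don't touch the datetime classes that matplotlib's date handling is built on.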

···

On 10/10/2012 15:41, Mark Lawrence wrote:


Ah, you are right, I meant microseconds.

With apologies to Spaceballs:

“Prepare to go to microsecond resolution!”
“No, no, microsecond resolution is too slow”

“Microsecond resolution is too slow?”
“Yes, too slow. We must use nanosecond resolution!”
“Prep-- Prepare Python, for nanosecond resolution!”

Cheers!
Ben Root

···

On Wed, Oct 10, 2012 at 10:55 AM, Mark Lawrence <breamoreboy@…225…> wrote:


Am I missing something here? Are seconds just floats internally? A
delta of 1e-6 is nothing (pardon the pun). A delta of 1e-9 is the
*least* I'd expect. Maybe even 1e-12. Perhaps the Python interpreter
doesn't do any denormalising
(http://stackoverflow.com/questions/9314534/why-does-changing-0-1f-to-0-slow-down-performance-by-10x)
when encountering deltas very close to zero...
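Internally, matplotlib's date support does boil down to float64 days since an epoch, so the resolution floor is set by floating-point spacing at the magnitude of a modern date, not by denormals. A back-of-the-envelope check (the numbers assume the 2012-era epoch of year 1, so a 2012 date is around 734,000 days):

```python
import numpy as np

# matplotlib (in 2012) stores dates as float64 days since 0001-01-01.
days_in_2012 = 734400.0

# Spacing between adjacent float64 values at that magnitude, in seconds:
eps_seconds = np.spacing(days_in_2012) * 86400.0
print('float64 date resolution near 2012: ~%.1e s' % eps_seconds)  # ~1e-05 s
```

So with this representation you get roughly 10-microsecond resolution for present-day dates, regardless of how fine timedelta or the system clocks are.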

···

On Wed, Oct 10, 2012 at 5:00 PM, Benjamin Root <ben.root@...1304...> wrote:


--
Damon McDougall
http://www.damon-is-a-geek.com
B2.39
Mathematics Institute
University of Warwick
Coventry
West Midlands
CV4 7AL
United Kingdom

What percentage of computer users wants a delta of 1e-12? I suspect that the vast majority of users couldn't care two hoots about minuscule time deltas in a world where changing time zones can cause chaos. Where some applications cannot handle years before 1970, or 1904, or 1900 or whatever. Or they can't go too far forward (2036, I think, but don't quote me). Where people like myself had to put a huge amount of effort into changing code so that applications would carry on working when the date flipped over from 31st December 1999 to 1st January 2000.

If things were that simple, why is matplotlib using third-party modules like dateutil and pytz? Why doesn't the "batteries included" Python already provide this functionality?

···

On 11/10/2012 10:55, Damon McDougall wrote:


Preach on, my brother! Preach on!

[psst – you are facing the choir…]

Cheers!
Ben Root

···

On Thu, Oct 11, 2012 at 4:53 PM, Mark Lawrence <breamoreboy@…225…> wrote:


Clearly I have misunderstood something and hit a nerve. Apologies.

···

On Thursday, October 11, 2012, Benjamin Root wrote:


I'm a little confused by this attitude. I recognize that there are issues around dates; I've written a few date libraries myself to get around insane Excel date issues (pop quiz for anyone at MS: was 1900 a leap year?) or just to simplify APIs for my own use. But do neither of you think that nanoseconds are important to scientists? I know of enough projects that work with picoseconds (and a few with femtoseconds). Even though I often work with climate data covering ~100s of years and used to work with geologic data covering ~billions of years, I may start working with raw laser data for distance measurements where nanoseconds can be a pretty big deal. These data would be collected over a few years' time, so a date utility that can handle that scale range would be useful. I guess I'll be writing my own date/time library again and hacking together some way of plotting the data in a meaningful way in matplotlib.

Don’t get me wrong, matplotlib shouldn’t have to reinvent the wheel here, but claiming that nobody could possibly care about 1e-12 seconds seems a little provincial. My apologies if that is not how the above statements were intended.

regards,

Ethan

···

On Oct 11, 2012, at 2:58 PM, Benjamin Root wrote:


Oh, don't get me wrong, those time scales are important. I was merely expressing my dismay over the number of problems that exist in the set of tools for everyday use. It is 2012, and Perl still has probably the best datetime library out there…

Python can do much better.

Ben Root

···

On Friday, October 12, 2012, Ethan Gutmann wrote:


I actually said "What percentage of computer users wants a delta of 1e-12? I suspect that the vast majority of users couldn't care two hoots about miniscule time deltas in a world where changing time zones can cause chaos...".

How does this translate into "claiming that nobody could possibly care about 1e-12 seconds seems a little provincial"?

···

On 12/10/2012 20:38, Ethan Gutmann wrote:


Like I said, my apologies if I misinterpreted. To me the statement "the vast majority of users couldn't care two hoots..." *implies* "since almost nobody needs this, we won't worry about it", especially when it is in response to someone who felt this was an important feature:
"A delta of 1e-9 is the *least* I'd expect. Maybe even 1e-12."
My response was as much an issue with how I perceived the tone as anything else (obviously, tone doesn't carry well over email ;-) ).

Don't get me wrong, I realize there are bigger fish to fry. I just want to add a vote that 1e-12 seconds (and less) can indeed be important to a significant number of people. I suspect that many experimental physicists would be unable to use a time utility that can't handle those timescales. Many of them will simply write their own utility, but then if they start running into any of the longer-time-scale issues (e.g. leap years/seconds) they end up having to reinvent the wheel. Others have also pointed out that databases [1], network packets, and stock trading transactions [2] may care about nanoseconds.

[1] python-dev thread "datetime nanosecond support" (ActiveState list archives)
[2] python-dev thread "datetime nanosecond support" (ActiveState list archives)

I'm glad to see that others are thinking about this and that future Python versions may get down to nanosecond (or better?) resolution, though I haven't found the PEP for it yet. Guido seems to have given his approval for more work on the matter at least, in the python-dev thread "datetime nanosecond support" (ActiveState list archives).

PEP 418 mentioned before doesn't touch the datetime class as far as I can tell: http://www.python.org/dev/peps/pep-0418/

regards,
Ethan

···

On Oct 12, 2012, at 4:15 PM, Mark Lawrence wrote:


I know that there are thousands of times more people dealing with order-processing systems, who want to know the number of working days (including bank holidays) in which to get their order processed, than experimental physicists playing with relatively nothing. You only have to look at the UK for real-world problems like this, as bank holidays in England and Wales differ from those in both Scotland and Northern Ireland.

From PEP 418: "This PEP proposes to add time.get_clock_info(name), time.monotonic(), time.perf_counter() and time.process_time() functions to Python 3.3." I've checked, and the functions have all been implemented in Python 3.3. If they're inadequate, patches are always welcome.

···

On 13/10/2012 00:37, Ethan Gutmann wrote:
