• CanadaPlus@futurology.today · 1 year ago
      1. Ok, but the time on the server clock and time on the client clock would never be different by a matter of decades.
      2. The system clock will never be set to a time that is in the distant past or the far future.

      Does this come up? I feel like if you’re doing retrocomputing you assume a certain level of responsibility for your software breaking.

      1. Ok, but the duration of one minute on the system clock will be pretty close to the duration of one minute on most other clocks.
      2. Fine, but the duration of one minute on the system clock would never be more than an hour.
      3. You can’t be serious.

      You can’t be, can you? Ditto on that being the user’s problem. My thing also isn’t portable to a Zuse Z2 or a billiard-ball computer you built in your garage.

      There’s some weird shit in the crowdsourced ones. I don’t even know where to start.

      • Redjard@lemmy.dbzer0.com · 11 months ago

        You’ve heard of standby and the like? What do you reckon that does to programs doing time calculations at that exact moment?

        • CanadaPlus@futurology.today · 11 months ago

          I… Actually don’t know.

          The real time clock continues to move in real time under reasonable conditions. If it’s in a weird year it’s either because you’ve decided to run a disk you found in a cave, left by the Ancient Ones, or you’re cheating at Animal Crossing.

          I’m a little unclear on how the rest of the clocks typically work together. If your program is drawing from one that gets stopped for a while, I guess yeah, a minute could totally be weeks long, and I’m in the picture as a falsehood believer.
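
          As far as I can tell the difference is observable directly, though. A minimal sketch, assuming Linux and Python 3.7+, where time.CLOCK_BOOTTIME is exposed:

          ```python
          import time

          # Wall-clock time: tracks the RTC/NTP and jumps if the clock is set.
          wall = time.time()
          # CLOCK_MONOTONIC: never jumps, but on Linux it historically does NOT
          # advance while the machine is suspended.
          mono = time.monotonic()
          # CLOCK_BOOTTIME: like CLOCK_MONOTONIC, but keeps counting through suspend.
          boot = time.clock_gettime(time.CLOCK_BOOTTIME)

          input("Suspend the machine, resume, then press Enter... ")

          print("wall-clock delta:", time.time() - wall)
          print("monotonic delta: ", time.monotonic() - mono)
          print("boottime delta:  ", time.clock_gettime(time.CLOCK_BOOTTIME) - boot)
          ```

          After a suspend, the monotonic delta comes up short of the other two, which is exactly the “minute that lasted weeks” case.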

    • AVincentInSpace@pawb.social · 10 months ago

      if the person who wrote all these could provide examples of why literally any of them are wrong, instead of just resorting to the standard “falsehoods programmers believe” fare of “you believe this? ha. it is wrong. therefore I am smarter than you”, I would very much appreciate it

  • ExLisper@linux.community · 1 year ago

    OMG, it’s so trivial. What you do is when T2 happens you send an atomic clock back in time to T1 and start counting till T2 happens again. If T1 and T2 happen in different locations you send two entangled clocks and collapse the state on T2 clock when the event happens measuring the exact moment on T1. How is this an issue?

      • fiah@discuss.tchncs.de · 1 year ago

        you know what will solve those problems though? blame them on someone else. “oh yeah that bug, yeah sorry the package we’re using messed it up, there’s a PR for that”

      • azertyfun@sh.itjust.works · 11 months ago

        EDIT: NVM I’m a goddamn idiot, Unix Time’s handling of leap seconds is moronic and makes everything I said below wrong.


        Unix Time is an appropriate tool for measuring time intervals, since it does not factor in leap seconds or any astronomical phenomenon and is therefore monotonically increasing… If T1 and/or T2 are given in another format, then it can get very hairy to do the conversion to an epoch time like Unix time, sure.

        The alt-text pokes fun at the fact that, due to relativity, time moves at different rates at astronomical scales. However, I would argue that this is irrelevant, as the comic itself talks about “anyone who’s worked on datetime systems”, vanishingly few of whom ever have to account for relativity (the only non-research use case being GPS, AFAIK).
        While the comic is funny, if:

        • Your time source is NTP or GPS
        • “event 1” and “event 2” both happen on Earth
        • You’re reasonably confident that the system clock is functioning properly

        (All of which are reasonable assumptions for any real use case.)
        Then ((time_t) t2) - ((time_t) t1) is precise well within the error margin of the available tools. Expanding the problem space to take relativistic phenomena into account would be a mistake in almost every case, and you’re not getting the job.
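
        In code that subtraction is as boring as it sounds. A minimal sketch, with Python standing in for raw time_t (and per the edit above, the leap-second caveat still applies):

        ```python
        from datetime import datetime, timezone

        # Two events recorded in any consistent, tz-aware form.
        t1 = datetime(2023, 12, 14, 16, 46, 2, tzinfo=timezone.utc)
        t2 = datetime(2023, 12, 15, 0, 46, 2, tzinfo=timezone.utc)

        # .timestamp() yields seconds since the Unix epoch (the time_t value);
        # subtracting gives the interval, independent of any local time zone.
        print(t2.timestamp() - t1.timestamp())  # 28800.0, i.e. 8 hours
        ```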

        • CanadaPlus@futurology.today · 1 year ago

          Clock misalignment comes up pretty frequently in some networking and networking-esque applications. Otherwise, yeah, the edge cases are indeed on the edge.

          Subsecond precision comes up often in common applications too, but you can just expand out to milliseconds or whatever.

        • mormegil@programming.dev · 11 months ago

          When you say Unix time does not include leap seconds, you are drawing exactly the wrong conclusion. Unix time is not a monotonically increasing number of seconds since the Epoch, because it excludes the seconds marked as leap seconds in UTC (and it even repeats a second whenever one is inserted). I.e. the time between now and the Epoch is larger than the current Unix time shows (by exactly the number of leap seconds in between). See e.g. https://en.wikipedia.org/wiki/Unix_time#Leap_seconds
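
          A concrete illustration, sketched in Python rather than raw time_t: a leap second was inserted at the end of 2016-12-31, and Unix time has no slot for it:

          ```python
          from datetime import datetime, timezone

          # 23:59:60 UTC existed on 2016-12-31, but Unix time cannot represent it.
          before = datetime(2016, 12, 31, 23, 59, 59, tzinfo=timezone.utc)
          after = datetime(2017, 1, 1, 0, 0, 0, tzinfo=timezone.utc)

          # The timestamps differ by 1 even though 2 SI seconds actually elapsed
          # (59 -> 60 -> 00), so the naive difference is off by the leap second.
          print(after.timestamp() - before.timestamp())  # 1.0
          ```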

          • azertyfun@sh.itjust.works · 11 months ago

            Aight I’m just dumb then. Now the question is who the fuck thought this was a good idea? Probably someone so naive they thought it’d make time conversions easy…

        • snowe@programming.dev · 11 months ago

          Unix time fails to work for the ‘simple’ case of time zones entirely. It’s not meant for time-zone-based data, and therefore Unix time in one time zone subtracted from Unix time in another will most likely give completely incorrect results. Even in the same time zone it will give incorrect results; see the ‘simple’ case of a country jumping across the international date line. Typically they skip entire days, none of which Unix time will account for, as that would require not just time zone data but location data as well.

          • azertyfun@sh.itjust.works · 11 months ago

            You misunderstand what Unix Time is. It’s the number of seconds since 1970-01-01T00:00+00:00. It’s always relative to UTC. And the number of seconds since epoch is always the same regardless of where you are on Earth.

            As I write this it’s 1702600950 for you, for me, and in Sydney. Time zones (and DST, and leap seconds, and other political fuckery) only play a role once you want to convert 1702600950 into a “human” datetime. It corresponds to 2023-12-15 00:42:30 UTC and 2023-12-14 16:42:30 PST (and the only sane and reliable way to do the conversion is to use a library that depends on the tzdata).
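
            For example, a minimal sketch using Python’s zoneinfo (stdlib since 3.9; it reads the tzdata):

            ```python
            from datetime import datetime, timezone
            from zoneinfo import ZoneInfo

            t = 1702600950  # one instant, the same number everywhere on Earth

            print(datetime.fromtimestamp(t, tz=timezone.utc))
            # 2023-12-15 00:42:30+00:00
            print(datetime.fromtimestamp(t, tz=ZoneInfo("America/Los_Angeles")))
            # 2023-12-14 16:42:30-08:00
            print(datetime.fromtimestamp(t, tz=ZoneInfo("Australia/Sydney")))
            # 2023-12-15 11:42:30+11:00
            ```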

  • tetris11@lemmy.ml · 1 year ago

    The past is the past. Everything that happened before time t_now should be set to Inf. I thank you for your ears.

    • CanadaPlus@futurology.today · 1 year ago

      Actually, while mathematically heavy, it’s easy to measure in GR, assuming you’ve got a metric solved (if you don’t, you’re fucked; that shit is intractable to the point where you can name every exact solution on one page, and inexact solutions can just be lies). However, you may have to ask additional questions about what sort of time you want, which probably stems from why you need it.
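
      Concretely, what a clock measures is proper time along its worldline. Given a solved metric it’s a single integral; the standard expression, assuming the (−,+,+,+) signature:

      ```latex
      % Proper time shown by a clock carried along a worldline x^\mu(\lambda):
      \tau = \int \sqrt{ -\frac{1}{c^2}\,
              g_{\mu\nu}\, \frac{dx^\mu}{d\lambda} \frac{dx^\nu}{d\lambda} }\; d\lambda
      ```

      The “additional questions” are about which worldline, and which foliation into “now” slices, you actually care about.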

  • CanadaPlus@futurology.today · 1 year ago

    Re: the mouseover text, is there a standard frame of reference for really general space stuff? If not, I propose a frame comoving with the CMB and coinciding with the center of the Earth at epoch 0.

  • sebsch@discuss.tchncs.de · 11 months ago

    I mean, as long as you only need the delta in milliseconds, it’s easy. Just count the milliseconds from 1970 to the event. The problem starts when you want a human-readable representation.

    It’s calendars that suck, not time.

    • mormegil@programming.dev · 11 months ago

      Well… unless you measure the number of [milli]seconds using something like time_t, which lies because of leap seconds. I.e. even such a seemingly simple interface, in fact, includes a calendar.

    • BehindTheBarrier@programming.dev · 1 year ago

      After using it, coming to Python and not having a super easy way to work with dates is a pain.

      But DateTime in .NET has horrible time zone support. It’s essentially either the local time zone, no time zone, or UTC, and the UTC part is somewhat rough. There’s DateTimeOffset and the like, but they don’t make working with time zones easy either.

    • CanadaPlus@futurology.today · 1 year ago

      I’m guessing it’s not alone. Every time format should come with a distance function and an order function, or equivalent. If you have a line, that could mean something like subtraction.

      Unfortunately, “should” isn’t always enough. Optimally, there’s also type structure on the function’s return value, so you can’t mix up seconds and days, or calendar days and (one of the) standard-length days.
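
      Python’s datetime is one example of a format that roughly has this structure; a minimal sketch:

      ```python
      from datetime import datetime, timedelta, timezone

      t1 = datetime(2024, 3, 1, tzinfo=timezone.utc)
      t2 = datetime(2024, 3, 8, tzinfo=timezone.utc)

      # Distance function: subtraction returns a distinct type (timedelta),
      # so an interval can't be silently confused with a point in time.
      gap = t2 - t1
      assert isinstance(gap, timedelta)

      # Order function: points in time compare directly.
      assert t1 < t2

      # The types still won't stop you conflating calendar days with
      # standard-length days, though: timedelta(days=1) is always 24 hours.
      print(gap.total_seconds())  # 604800.0
      ```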

  • Amaltheamannen@lemmy.ml · 1 year ago

    Sounds like a distributed-systems problem. While the time between events can be impossible to determine (you can’t guarantee clocks are synchronized), you can use a logical clock to get a causal ordering.
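
    A minimal sketch of the simplest version, a Lamport clock (names here are illustrative):

    ```python
    class LamportClock:
        """Orders events causally without assuming synchronized wall clocks."""

        def __init__(self) -> None:
            self.counter = 0

        def tick(self) -> int:
            # Called for local events and before sending a message.
            self.counter += 1
            return self.counter

        def receive(self, msg_timestamp: int) -> int:
            # Merge rule: jump past the sender's clock, then tick.
            self.counter = max(self.counter, msg_timestamp) + 1
            return self.counter

    a, b = LamportClock(), LamportClock()
    t_send = a.tick()           # a stamps an outgoing message: 1
    t_recv = b.receive(t_send)  # b's clock jumps to 2
    assert t_send < t_recv      # the receive is ordered after the send
    ```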

  • tias@discuss.tchncs.de · 11 months ago

    I feel like this is a solved and simple problem as long as there are no relativistic effects. Just make sure t1 and t2 are represented as seconds since a known reference time, e.g. the Unix epoch, and make sure that measure is accurate. You don’t need to bring the Gregorian calendar into it; use TAI represented as an integer.

    • jol@discuss.tchncs.de · 11 months ago

      Until you need to decide how many months are between t1 and t2, and then all answers are wrong.

      • tias@discuss.tchncs.de · 11 months ago

        To do that you first need to choose a calendar and a time zone, then convert to that representation. It can be done, but you need a good implementation that understands the entire history of what has transpired w.r.t. date conventions in that location and culture. For timestamps in the future it is impossible to do correctly, since you can’t know how date conventions will change.

        However, I should add that as far as mathematical operations go, calculating the number of months between t1 and t2 is an entirely different thing from the duration of time that passed between those timestamps. Even if it is expressed similarly in English, semantically it’s something else. It’s like asking “how many kilometers did your car go” vs. “how many houses did the car pass on the way”.
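
        For example, here’s one defensible month-counting convention among several; a sketch, with a made-up helper name:

        ```python
        from datetime import date

        def whole_months_between(d1: date, d2: date) -> int:
            """Count month boundaries crossed, minus one if the
            day-of-month hasn't been reached yet."""
            months = (d2.year - d1.year) * 12 + (d2.month - d1.month)
            if d2.day < d1.day:
                months -= 1
            return months

        # 30 days elapsed, but only 1 "whole month":
        print(whole_months_between(date(2024, 1, 31), date(2024, 3, 1)))  # 1
        # 29 days elapsed, also exactly 1 month:
        print(whole_months_between(date(2024, 2, 1), date(2024, 3, 1)))   # 1
        ```

        Two intervals with different durations can contain the same number of “months”, which is exactly the houses-passed vs. kilometers-driven distinction.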