Maybe I’m an idiot, but how would a base-60 system, per the article’s “Cleaner fractions means fewer approximations and more accurate maths, and the researchers suggest we can learn from it today”, make any difference when computers are powerful enough to generate answers with more accuracy than is ever needed in real-world applications?
None; in a modern context we can work in any base we desire, since all that basic stuff got generalized ages ago. No one is going to change computing systems to a Babylonian-style base. And the trigonometry stuff is the same thing we already knew, just discovered earlier than the Greeks.
It’s an important discovery for sure, especially for our understanding of ancient Mesopotamian cultures, but everything else is the authors and the article going bananas with conclusions.
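To illustrate the “any base” point, here’s a minimal Python sketch (the helper `to_base60_fraction` is just something I made up for this comment) that expands a rational number into sexagesimal digits. 1/3 terminates after a single digit in base 60, while 1/7 repeats forever, since 7 doesn’t divide 60:

```python
from fractions import Fraction

def to_base60_fraction(frac, places=6):
    """Expand the fractional part of a rational into base-60 digits."""
    digits = []
    for _ in range(places):
        frac *= 60
        digit = int(frac)     # integer part = next sexagesimal digit
        digits.append(digit)
        frac -= digit
        if frac == 0:         # expansion terminates exactly
            break
    return digits

print(to_base60_fraction(Fraction(1, 3)))   # [20] -> 1/3 = 0;20 exactly
print(to_base60_fraction(Fraction(1, 7)))   # [8, 34, 17, 8, 34, 17] -> repeats
```

Any language with exact rationals can do Babylonian-style arithmetic in a dozen lines, so there’s nothing here to switch our computing systems to.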
That’s kind of what I figured. I wish journalism didn’t need to be so incredibly sensationalist. I understand that it’s because the majority of the populace has the attention span of a gnat, but it doesn’t make me feel any less annoyed by it.
So, I’m a writer, not a researcher, but I’ve found the more tools I have stuffed into my brain, the more likely it is that two different things clank against each other and create something interesting.
I don’t think this is something unique to writing fiction–from my understanding of history, there are quite a few moments in science where two somewhat unrelated things bash against each other and spark a new idea.
Sure, computers can do things we already know how to do, but actual inventors/scientists/people making stuff still need to think things up before anyone can computerize them.
It’s possible that this WON’T do anything new in the realm of math, but it might create a string a researcher in a different domain–history, linguistics, whatever–can pull on to unravel something else. A diverse tool set leads to multiple ways to solve a given problem, and sometimes edge cases come up where one solution actually is better in some niche application because of something unique to the way it is shaped.
You’re not wrong that people can take inspiration from many different fields, but wild speculation about what could happen can be done for any new development, which makes it pointless and tiring when overused.
deleted by creator
Rule 4 of the community forbids changing the headline. So you’d really need to convince the mods.
Computers still run different algorithms internally, some of which are more prone to having undetected errors than others:
https://en.m.wikipedia.org/wiki/Pentium_FDIV_bug
Computers use base 2, binary. Whether humans use base 10 or base 60 is irrelevant.
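Right, and here’s a quick Python illustration of what “base 2” means in practice: decimal literals are already approximated the moment they’re stored, and exact-fraction types sidestep the base question entirely.

```python
# 0.1 and 0.2 have no finite base-2 expansion, so they are approximated
# before any arithmetic even happens.
print(0.1 + 0.2)            # 0.30000000000000004
print((0.1 + 0.2) == 0.3)   # False

# Exact rationals avoid the representation problem regardless of base.
from fractions import Fraction
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))   # True
```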
To generate most* solutions.
But I see your point.