The author makes a good point: "1700s" is both more intuitive and more concise than "18th century".
The very first episode of Alex Trebek's Jeopardy! in 1984 illustrates how confusing this can be:
Logic is in short supply and off-by-one errors are everywhere. Most people don't care. I think it's easier to learn to just live with that than to reprogram mankind.
I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless, and if possible it would be more useful to say something like "neoclassical art from the 1700s". "18th century" isn't an artistic category, but it kind of sounds like it is if you just glance at it. "Art from the 1700s" is clearly just referring to a time period.
Just to make sure I understood this: that would be used as "17th settecento" to mean the 1700s, right?
(This Xth century business always bothered and genuinely confused me to no end and everyone always dismissed my objections that it's a confusing thing to say. I'm a bit surprised, but also relieved, to see this thread exists. Yes, please, kill all off-by-one century business in favor of 1700s and 17th settecento or anything else you fancy, so long as it's 17-prefixed/-suffixed and not some off-by-anything-other-than-zero number)
"settecento" can be read as "seven hundred" in Italian; gramps is proposing to use a more specific word as a tag for Italian art from the 1700s. Of course, 700 is not 1700, hence the "drop 1000 years". The prefix seventeen in Italian is "diciassette-" so perhaps "diciasettecento" would be more accurate for the 1700s. (settecento is shorter, though.)
Hope this clarifies. Not to miss the forest for the trees, to reiterate, the main takeaway is that it may be better to define and use a specific tag to pinpoint a sequence of events in a given period (e.g. settecento) instead of gesturing with something as arbitrary and wide as a century (18th century art).
Think of it as the 700s, which is a weird way to refer to the 1700s, unless you are taking a cue from the common usage. That’s just how the periods are referenced by Italian art historians.
In Icelandic, 1-based counting toward the next unit is used almost everywhere. People do indeed say "the first decade of the 19th century" to refer to the 18-aughts, and the 90s is commonly referred to as "the tenth decade". This is also done with age ranges: people in their 20s (or 21-30, more precisely) are said to be þrítugsaldur (in the thirty age). Even the hour is sometimes counted toward (though this is rarer among young folks): "að ganga fimm" (or going 5) means 16:01-17:00.
Speaking for myself, this doesn't become any more intuitive the more you use it; people constantly confuse decades, get insulted by age ranges, and get freaked out when suddenly the clock is "going five". People are actually starting to refer to the 90s as nían (the nine) and the 20-aughts as tían (the ten), though I don't think it will stick. When I want to be unambiguous and non-confusing I usually add -og-eitthvað (and something) as a suffix to a year ending with zero, so the 20th century becomes nítjánhundruð-og-eitthvað, the 1990s nítíu-og-eitthvað, and a person in their 20s (including 20) becomes tuttugu-og-eitthvað.
> The most surreal part of implementing the new calendar came in October 1582, when 10 days were dropped from the calendar to bring the vernal equinox from March 11 back to March 21. The church had chosen October to avoid skipping any major Christian festivals.
The century in which the switch occurred (which was different in different countries) was shorter than the others. As were the decade, year, and month in which the switch occurred.
No, the first century began Jan 1, 0000. Whether that year actually existed or not is irrelevant - we shouldn't change our counting system in the years 100, 200 etc.
There is no year zero according to first-order pedants. Second-order pedants know that there is a year zero in both the astronomical year numbering system and in ISO 8601, so whether or not there is a year zero depends on context.
It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.
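For illustration, here's a minimal Python sketch (the function names are my own, nothing standard) of the astronomical convention, where 1 BC becomes year 0 and years behave like ordinary signed integers:

  # Astronomical year numbering: 1 BC -> 0, 2 BC -> -1, n BC -> 1 - n.
  def bc_to_astronomical(bc_year):
      return 1 - bc_year

  def label(astro_year):
      return f"{astro_year} CE" if astro_year >= 1 else f"{1 - astro_year} BCE"

  # With a year zero, durations are plain subtraction, even across the epoch:
  assert bc_to_astronomical(1) == 0           # 1 BC is year 0
  assert 10 - bc_to_astronomical(10) == 19    # 10 BC to AD 10 is 19 years
  assert label(0) == "1 BCE"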
Yes, but is there such a thing as a zeroth-order pedant, someone not pedantic about year ordinality? As a first-order meta-pedant, this would be my claim.
Moreover, I definitely find the ordinality of pedantry more interesting than the pedantry of ordinality.
> It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.
Or, just to add more fuel to the fire, we could use the Holocene/Human year numbering system to have a year zero and avoid any ambiguity between Gregorian and ISO dates.
We are all de facto ISO adherents by virtue of our lives being so highly computer-mediated and standardized. I'm fully on board with stating that there absolutely was a year zero, and translating from legacy calendars where necessary.
I vote for a year zero and for using two's complement for representing years before zero (because it makes computing durations that span zero a little easier).
What does that even mean? Do we allow for the distortion due to the shift from the Julian to the Gregorian calendar, such that the nth year is 11 days earlier? Of course not, because that would be stupid. Instead, we accept that the start point was arbitrary and refer to our normal counting system rather than getting hung up on the precise number of days since some arbitrary epoch.
It means just what it says. In the common calendar, the year after 1 BC (or BCE in the new notation) was 1 AD (or CE in the new notation). There was no "January 1, 0000".
There are numerous common concise ways to write the 18th century, at the risk of needing the right context to be understood, including “C18th”, “18c.”, or even “XVIII” by itself.
These are even more impractical, so I wonder what your point is? I can come up with an even shorter way to say 18th century, by using base26 for example, so let's denote it as "cR". What has been gained?
It's easy, we should have simply started counting centuries from zero. Centuries should be zero-indexed, then everything works.
We do the same with people's ages. For the entire initial year of your life you were zero years old. Likewise, from years 0-99, zero centuries had passed so we should call it the zeroth century!
At least this is how I justify to my students that zero-indexing makes sense. Everyone's fought the Xth-century vs. X-hundreds battle before, so they welcome the relief.
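A tiny Python sketch of that idea (my own illustration, not any standard convention): with zero-indexed centuries, the century number is just integer division by 100.

  # Zero-indexed centuries: years 0-99 are century 0, 1700-1799 are century 17.
  def century_zero_indexed(year):
      return year // 100

  # The conventional 1-indexed century needs the familiar off-by-one dance
  # (glossing over the missing year zero):
  def century_one_indexed(year):
      return year // 100 + 1

  assert century_zero_indexed(1776) == 17   # "the 1700s"
  assert century_one_indexed(1776) == 18    # "the 18th century"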
When we refer to 'the first year of life', we mean the time from birth until you turn 1.
Similarly, you'd say something like 'you're a child in the first decade of your life and slowly start to mature into a young adult by the end of the second decade', referring to 0-9 and 10-19, respectively.
But practically speaking we usually do. I always hear people refer to events in their life happening “when I was 26” and never “in the 27th year of my life”. Sure you could say the latter, but practically speaking people don’t (at least in English).
“Half one” is archaic English, and common German, for 12:30. Similarly “my 27th year” just sounds archaic to me: I wonder if you went through a bunch of 19th century writing if you’d see ages more often be “Xth year” vs “X-1 years old”.
There may be something cultural that caused such a shift, like a change in how math or reading is taught (or even that it’s nearly universally taught, which changes how we think and speak because now a sizeable chunk of the population thinks in visually written words rather than sounds).
Isn't "half one" used as a short form of "half past one" these days, i.e. 01:30? That has been a source of confusion for someone used to the Germanic way.
I had this exact discussion with an Irish coworker who lives in Germany and has trouble conveying the right time. For me as a German, „half one“ is half of one, so 12:30. Same for „Dreiviertel eins“ -> „three-quarter one“ being 12:45 and „Viertel eins“ -> „quarter one“ being 12:15.
To be fair, the logic behind this is itself a constant source of confusion, as some parts of Germany instead use „viertel vor“ or „viertel nach“ -> „quarter to“ / „quarter after“ and have no understanding of the three-quarter business.
That's not really indexing from 0 though. It's just rounding the amount of time you've lived down to the nearest year. You get the same number, but semantically you're saying roughly how old you are, not which year you're in. This becomes obvious when you talk to small children, who tend to insist on saying e.g. "I'm 4 and a half". And talking about children in their first year, no one says they're 0. They say they're n days/weeks/months old.
My preference is semi-compatible with both conventions:
First = 0
Second = 1
Toward = 2
Third = 3
…
This way, the semantic meaning of the words “first” (prior to all others) and “second” (prior to all but one) are preserved, but we get sensical indexing as well.
On your sixth birthday we put a big 5 on your cake and call you a 5 year old all year.
Can't say I've ever had to refer to someone's first year or first decade of their life, but sure I'd do that if it came up. Meanwhile, 0-indexed age comes up all the time.
In "figure of speech", or conventual use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.
We talk of a child in their 10th year as being age 10. Might even be younger. Try asking people if advice about a child in their "5th year of development" means you're dealing with a 5 year old. Most will say yes.
So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!
Similarly "centuries" don't have a century digit until the 100s, which would make that the 1st century and just call time spans less than that "in the first hundred years" (same syllables anyway).
It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.
> In "figure of speech", or conventual use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.
That’s not the case, though. They can vote (and drink, in quite a few countries) when they are at least 18 years old, not when they are in their 18th year (who would even say that?)
People are 18 years old (meaning that 18 years have passed since their date of birth) on their 18th birthday. There is no need to shoehorn in 0-based indexing or anything like that.
> Most will say yes.
Most people say something stupid if you ask tricky questions, I am not sure this is a very strong argument. Have you seriously heard anybody talking about a child’s “5th year of development”, except maybe a paediatrician? We do talk about things like “3rd year of school” or “2nd year of college”, but with the expected (1-indexed) meaning.
> So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!
It’s really not. To have experienced a full year, you need a year to have passed, which therefore has to be the first. I think that’s a cardinal versus ordinal confusion. The first year after an event is between the event itself and its first anniversary. I am not aware of any context in which this is not true, but obviously if you have examples I am happy to learn.
> It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.
Right. I know it is difficult to admit for some of us, but we are not computers and we do not work like computers (besides the fact that computers work just fine with 1-indexing). Some people would like it very much if we counted from 0, but that is not the case. It is more productive to understand how it works and why (and again cardinals and ordinals) than wishing it were different.
What? No. When you are 0, it is your first year. When you are 21, you have begun your 22nd year. In the US you are legal to drink in your 22nd year of life.
You are correct that nobody says "22nd year" in this context, but nobody says "21st year" either. The former is awkward but the latter is just incorrect.
It's just not true. You've completed being 17 years old on your 18th birthday, when you enter your 19th year and can count 18 years under your belt.
Consider a newborn. As soon as they're squeezed out they are in their first year of life. That continues until the first anniversary of their decanting, at which point they are one year old and enter their second year of life.
There is nobody, nobody, who refers to a baby as being in their zeroth year of life. Nor would they refer to a one-year-old as still being in their first year of life as if they failed a grade and are being held back.
The pattern continues for other countable things. Breakfast is not widely considered the zeroth meal of the day. Neil Armstrong has never been considered the zeroth man on the moon nor is Buzz Aldrin the first. The gold medal in the Olympics is not awarded for coming in zeroth place.
No one's saying it's true! All that's being claimed is that writers will often use phrases like "became an adult in their 18th year" or "was legally allowed to drink in their 21st year".
It's completely incorrect, but some people use it that way, and ultimately everyone understands what they actually mean.
The top response in your Quora link is that your 21st year "means you’re 20. You have had your 20th birthday, but not yet your 21st." That is the conventional definition.
People commonly make the mistake of thinking otherwise, but that's all it is. A mistake.
I don't think it is a mistake for Lua. The convention to zero-index arrays is not sacrosanct, it's just the way older languages did it (due to implementation details) and thus how people continue to do it. But it's very counter-intuitive, and I think it's fair game for new languages to challenge assumptions that we hold because we're used to past languages.
It's a C family (predecessors and descendants) idiosyncrasy that very unfortunately got out of hand. Most other old languages had either 1-based indexing or were agnostic. Most notably FORTRAN, which is the language for numerical calculations, is 1-based.
The seminal book Numerical Recipes was first published as 1-based for FORTRAN and Pascal, and they only later added a 0-based version for C.
Personally, coming from Pascal, I think the agnostic way is best. It is not only about 0-based or 1-based but that the type system encodes and verifies the valid range of index values, e.g. in Pascal you define an array like this:
var temperature: array [-35 .. 60] of real;
You will get an immediate compile-time error if you use
temperature[61] := 20.5;
At least with Turbo Pascal you could choose whether you wanted run-time checks as well.
I have a hard time wrapping my head around the fact that this feature is pretty much absent from every practically used language except Ada.
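As a rough sketch of the same idea in Python (purely my own illustration; Python has no ranged index types, so this only gets you the run-time half of the check):

  # A range-indexed array: valid indices run lo..hi inclusive, checked at
  # run time. Pascal and Ada can verify constant indices at compile time.
  class RangedArray:
      def __init__(self, lo, hi, fill=0.0):
          self.lo, self.hi = lo, hi
          self.data = [fill] * (hi - lo + 1)

      def _check(self, i):
          if not self.lo <= i <= self.hi:
              raise IndexError(f"index {i} outside {self.lo}..{self.hi}")

      def __getitem__(self, i):
          self._check(i)
          return self.data[i - self.lo]

      def __setitem__(self, i, value):
          self._check(i)
          self.data[i - self.lo] = value

  temperature = RangedArray(-35, 60)
  temperature[-35] = 12.5    # fine
  # temperature[61] = 20.5   # would raise IndexError, though only at run time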
It's not counter-intuitive at all, it only seems that way because people are now used to languages with zero-based indexing. That's almost entirely because of the C language, which used pointer offset arithmetic with its arrays.
Outside of that machine context, where an array is a contiguous block of RAM that can be indexed with memory pointers, there's no particular reason to do offset indexing. 1-based indexing - "first element, second element" - works just fine and is perfectly intuitive.
Different types of indexing can make sense in different situations. Some languages even allow that. In Ada, for example, arrays can start at whatever index you define.
This was confusing to me as a kid, especially as we entered the 21st. I also still remember learning about the Dutch golden age in elementary school, but can't remember if it was the 1600s or 16th century.
I've run into a similar issue recently. It turns out that many people saying they are '7 months pregnant' actually mean they are in the 7th month, which starts after 26 weeks (6 months!)
Here we say something like the "nineteen-hundred-era" for the 1900s, "nineteen-hundred-ten-era" for the 1910s, "nineteen-hundred-twenty-era", etc. In writing: 1900-era, 1910-era, 1920-era. The most recent decades are referred to with only the decade, e.g. "70-era" for the 70s. The word for age/epoch/era in our language is a lot more casual in this setting.
The 20xx vs 200x distinction does indeed leave some room for ambiguity in writing; verbally, most people say 20-hundred-era vs 20-null-null-era.
I find it weird when people take a long time to adjust to these little things. My wife still struggles with German numbers (85 = fünfundachtzig) and the half thing with time (8:30 = halb neun), even though I managed to switch over to them very quickly. I think how hard it is depends on the person.
A lot of this runaround is happening because people get hung up on the fact that the "AD" era began as AD 1. But that year is not magic--it didn't even correlate with the year of Jesus's birth or death. So let's just start the AD era a year before, and call that year "AD 0". It can even overlap with BC 1. BC 1 is the same as AD 0. Fine, we can handle that, right? Then the 00s are [0, 100), 100s are [100, 200), etc. Zero problem, and we can start calling them the 1700s etc., guilt free.
I would also accept that the 1st century has one less year than future centuries. Everyone said Jan 1, 2000 was "the new millennium" and "the 21st century". It didn't bother anyone except Lua programmers, I'm pretty sure.
Things like "17th century", "1600s", or "1990s" are rarely exact dates, and almost always fuzzy. It really doesn't matter what the exact start and end day is. If you need exact dates then use exact dates.
A calendar change like this is a non-starter. A lot of disruption for no real purpose other than pleasing some pedants.
Exactly. Historians often talk about things like "the long 18th century", running from 1688 (Britain's "Glorious Revolution") to 1815 (the defeat of Napoleon), because it makes sense culturally to have periods that don't exactly fit 100-year chunks.
I thought the article was going to argue against chunking ideas into centuries because it's an arbitrary, artificial construct superimposed on fluid human culture. I could get behind that, generally, while acknowledging that many academic pursuits need arbitrary bins other people understand for context. I did not expect to see arguments for stamping out the ambiguities in labelling these arbitrary time chunks. Nerdy pub trivia aside, I don't see the utility of instantly recalling the absolute timeline of the American revolution in relation to the enlightenment. The 'why's - the relationships among the ideas - hold the answers. The 'when's just help with context. To my eye, the century-count labels suit their purpose for colloquial usage, and the precise years work fine for more specific things. Not everything has to be good at everything to be useful enough for something.
This reminds me that centuries such as "the third century BC" are even harder to translate into date ranges. That one's 201 BC to 300 BC, inclusive, backward. Or you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750. [Edit: no it doesn't.]
In fact archeologists have adapted to writing "CE" and "BCE" these days, but despite that flexibility I've never seen somebody write a date range like "the 1200s BCE". But they should.
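For what it's worth, the conversion is mechanical. A small Python sketch (my own, representing BC years as plain positive numbers):

  # The nth century BC runs from year 100n BC down to year 100(n-1)+1 BC,
  # inclusive, counting backward.
  def century_bc_range(n):
      return (100 * n, 100 * (n - 1) + 1)   # (earliest year BC, latest year BC)

  assert century_bc_range(3) == (300, 201)  # the 3rd century BC: 300-201 BC
  assert century_bc_range(1) == (100, 1)    # the 1st century BC: 100-1 BC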
Some people have proposed resetting year 1 to 10,000 years earlier. The current year would be 12024. This way you can have pretty much all of recorded human history in positive dates, while still remaining mostly compatible with the current system. It would certainly be convenient, but I don't expect significant uptake any time soon.
For earlier dates "n years ago" is usually easier, e.g. "The first humans migrated to Australia approximately 50,000 years ago".
> you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750.
From comparing some online answers (see links), I'd conclude that even though the numbers are ordered backward, "first"/"last"/"early"/"late" would more commonly be understood to reference the years' relative position in a timeline. That is, "2000 to about minus 1750" would be the first quarter of the second millennium BC.
I’ve just taken to writing things like: 201X or 20XX. This is non-standard but I don’t care anymore, referencing events from 20 years ago is just too annoying otherwise.
In spoken conversation, I dunno, it doesn’t seem to come up all that often. And you can always just say “20 years ago” because conversations don’t stick around like writing, so the dates can be relative.
I thought this article was railing against the lumping together of entire spans of hundreds of years as being alike (i.e., we lump together 1901 and 1999 under the name "the 1900s" despite their sharing only numerical similarity), and was interested until I learned the author's real, much less interesting intention.
I wonder at what point we can just assume decades belong to the current century. Will "the twenties" in the US always primarily mean Prohibition, flappers, and Al Capone or will it ever mean this decade?
I say give it 11 years or so for 2020s kids to start coming of age, and "twenties babies" will refer to babies born in the 2020s and not centenarians.
You can always just say "the 2000s" for 2000-2010. If the context is such that you might possibly be talking about the far future then I guess "the 2000s" is no longer suitable, but how often does that happen in everyday conversation?
I got into a fight recently with a philosophy teacher about that. I changed the dates like the OP to be clearer in my writing; she took it so seriously, and it became a big fight about clarity vs. tradition, though really superficial and mean on both sides. Now I wish to be* more articulate and have a good debate. I wrote it my way on the final exam and passed; she had to deal with it, I guess.
* Sorry, I don't know how to write that in the past tense, like "haber sido" in Spanish, my main language.
> This leaves ambiguous how to refer to decades like 1800-1809.
There is the apostrophe convention for decades.
You can refer to the decade of 1800–1809 as "the '00s" when the century is clear from the context.
(The Chicago Manual of Style allows it: https://english.stackexchange.com/a/299512.)
If you wanted to upset people, you could try adding the century back: "the 18'00s". :-)
There is also the convention of replacing parts of a date with "X" characters, an em dash ("—"), or an ellipsis ("...") in fiction, like "in the year 180X".
It is less neat, but unambiguous about the range when it's one "X" per digit.
(https://tvtropes.org/pmwiki/pmwiki.php/Main/YearX has an interesting collection of examples.
A few give you the century, decade, and year and omit the millennium.)
Edit: It turns out the Library of Congress has adopted a date format based on ISO 8601 with "X" characters for unspecified digits: https://www.loc.gov/standards/datetime/.
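A small Python sketch of how such a masked year expands into a range (a hypothetical helper of my own, not part of any LoC tooling):

  # Expand a year mask like "180X" or "18XX" into its inclusive year range,
  # in the spirit of the "unspecified digit" notation.
  def year_mask_range(mask):
      lo = int(mask.replace("X", "0"))
      hi = int(mask.replace("X", "9"))
      return lo, hi

  assert year_mask_range("180X") == (1800, 1809)
  assert year_mask_range("18XX") == (1800, 1899)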
If I lived much longer than 100 years I might care more about the precision of such language. However, as it stands, I know what people mean when they say "remember the early 2000s". I know that doesn't mean the 2250s for example - a reasonable characterization for someone living in 3102 perhaps.
It's tiresome when people seem to think it's necessary to be annoyed that others make reference to their own cultural frame of reference in their writing.
Even more tiresome when they feel the need to comment about it.
Most people here, even non-Americans, likely have at least a rough idea of when the American Revolution was. And those who don't will either just gloss over it and think no more of it, or find the answer on the internet in a shorter amount of time than it took me to type this sentence. And then there are people like you. Look at the completely useless subthread you've spawned! Look at the time I've bothered to waste typing this out! Sigh.
Also the author does give the year in the next paragraph. So no googling required.
I also didn't know what the date of the American revolution was, but I understood it was just an example.
> if you’re like me, you’ll find the question much easier to answer given the second version of the sentence, because you remember the American revolution as starting in 1776, not in the 76th year of the 18th century.
The American and French Revolutions are a pretty big deal on the road to modern democracy, as well as being tied to 1700s Enlightenment ideals. Everyone educated should know this.
Of course they are important, but so are many other things - and speaking e.g. from a European POV, a lot of other events are simply much more salient and commonplace - and the same is probably even more true for other continents (would a random reasonably educated American or European person know when the Meiji Restoration happened, or when Latin America became independent?). You can't expect everyone to have memorised all the important dates.
America was a backwater at the time and therefore the best place to experiment with European enlightenment ideals. Which it did, and was a direct factor in the French Revolution. I also learned about numerous revolutions in Latin America from Mexico to Bolivar to San Martin over the early 1800s.
The events that directly affect the modern world should be covered in school. I’d say revolutions that created large modern states would be among them.
Of course we learn about the American Revolution in school, but people aren't going to remember every date they were taught. The founding of Rome or the Punic Wars are also hugely important for today's world, but not everyone can place them.
The reason most US Americans can probably place the American Revolution is, I assume, that it's so often commemorated there. In Germany, people would be much more likely to remember the years 1933, 1949 and 1989, because of how often they're referenced.
> I’d say revolutions that created large modern states would be among them.
The Russian revolution as well. And the Chinese one. It's quite difficult to make sense of the late 20th century (yes, I know) without them. Or the early 21st.
A pretty big deal in America. I don't think knowledge of the exact date of the American Revolution is a requirement for education outside America. At least no more than "17something...ish".
"17something...ish" is enough to answer (or at least make a high confidence guess at) the original question (was the American Revolution contemporary with the enlightenment?)
For that matter, a lot of historical dates we consider important are only important to us because A) we're westerners and B) we got them drilled into us by textbooks and classes.
The remaining majority of the world (the west is a minority) sincerely couldn't care less about the American or French or Industrial Revolutions or Columbus (re)discovering America or the Hundred Years War or the Black Death or the Fall of Rome or whatever else.
Kind of like how we as westerners generally couldn't care less about Asian, African, Middle Eastern, Indian, or Polynesian histories.
The culture we grow up in and become indoctrinated by determines what is important and what is not.
And just so we're clear, this bit of ignorance is perfectly fine: Life is short, ain't nobody got time for shit that happened to people you don't even know who lived somewhere you will never see.
Let me put it this way: Can you really blame someone for not knowing a historical fact that is completely irrelevant to their life, especially when they probably have more pressing concerns to learn and care about?
We only have so many hours in a day and so many days in a lifetime, while knowledge is practically infinite.
As very broad subjects? Yes. But any particular factoid about them is practically irrelevant for most people who don't actually have anything to do with that factoid.
Hell, I would even go as far as to say the American Revolution is irrelevant even for most Americans because it has nothing of practical value. We (Americans) all know about it to varying degrees, but again that is due to growing up and being indoctrinated in it.
The essay doesn't really say anything about when the singular "they" was invented.
What it says is that it used to be low-status and unsophisticated language.
> In the 1970s, fancy people would have sniffed at using “they” rather than “he” for a single person of unknown sex like this. But today, fancy people would sniff at not doing that. How did that happen?
> I think “they” climbed the prestige ladder—people slowly adopted it in gradually more formal and higher-status situations until it was everywhere.
The essay's narrative is oversimplified and misleading about details. Singular they has been common for centuries. The idea that it was lower status is a more recent invention. The author might be referring more to the use of they as a personal pronoun. Anyway, whatevs, language changes; that part of the author's message is good.
The aughts or naughts (or aughties) are a pretty easy-to-understand way to refer to 2000-2009, though saying "the early aughts" is clearly more verbose than saying 2000-2003 (except that 2000-2003 looks more specific than is meant).
I think that last point, "there’s no good way to refer to 2000-2009, sorry", was a bit tongue in cheek, refusing to acknowledge "the aughts", since it is a terrible, terrible, stupid way to refer to anything.
I've always written it like "1900s", and always considered "20th century" to be confusing. Having to mentally do c-- or c++ is confusing and annoying.
I deal with the "2000s-problem" by using "00s" to refer to the decade, which everyone seems to understand. Sometimes I also use "21st century"; I agree with the author that it's okay in that case, because no one is confused by it. For historical 00s I'd probably use "first decade of the 1700s" or something along those lines. But I'm not a historian and this hasn't really come up.
I think this is incorrect. Don't centuries start with the 00? In that case the first year of a century is year 0, and the 76th year would be '75, not '76 as the author writes:
> starting in 1776, not in the 76th year of the 18th century.
And there is a year zero in the ISO 8601:2004 system, the interchange standard for all calendar numbering systems (where year zero coincides with the Gregorian year 1 BC; see conversion table). [via Wikipedia]
It's funny that for me it feels right to associate the 20s with 2020 and the 40s with 1940, but somehow, the 30s is very foreign, and I can't think of 1930 or 2030 either.
Funny, when you say "20s" I think of the 1920s (aka the "Roaring 20s" here in the US). I wonder if it's an age thing (I'm in my 40s), or perhaps a cultural or regional thing.
The 30s as 1930s seems pretty solid to me: US Great Depression, start of WWII, not to mention many of the smaller conflicts that led up to it.
Meh. I acknowledge that the author can split a hair with their bare hand while blindfolded. But to convince everyone else they would have to lift the rest of the world to their level of pedantry.
Thankfully, most of us quit writing centuries in Roman numerals; it's about time we quit centuries as well :) Sadly, however, regnal numbers continue to persist.
I agree. Another good reason to get rid of counting centuries is that in some languages (e.g. Russian) centuries are written in Roman numerals. It's annoying having to pause and think of the conversion.
I thought the exact same thing... when I was eight.
Then I just learned the mental gymnastics of hearing, say, "20th century" and associating it with "1900", and got along with it.
Really, it's a bit dated, but if people can function with miles and furlongs, they can handle centuries...
It's funny, I'm in my 40s and my brain still pauses longer than I'd expect is necessary to do that conversion. I think for 21st or 20th century I'm fine (since I've lived in both of them), but anything prior and it takes me a beat to figure out which date range it is.
Technically, decades and centuries start in a January of a year ending in one or two zeros, respectively. So the 1700s and the 18th century are exactly the same interval of time.
ISO 8601-2:
> Decade: A string consisting of three digits represents a decade, for example “the 1960s”. It is the ten-year time interval of those years where the three specified digits are the first three digits of the year.
> Century: Two digits may be used to indicate the century which is the hundred year time interval consisting of years beginning with those two digits.
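In code terms, that convention is just digit-prefix arithmetic. A quick Python sketch (my own illustration, assuming four-digit years):

  # ISO 8601-2 style: a decade is named by a 3-digit prefix of the year,
  # a century by a 2-digit prefix.
  def decade_of(year):
      return str(year)[:3] + "0s"      # 1969 -> "1960s"

  def century_interval(two_digits):
      start = int(two_digits) * 100
      return start, start + 99         # "19" -> (1900, 1999)

  assert decade_of(1969) == "1960s"
  assert century_interval("19") == (1900, 1999)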
Took me a while to understand what's ambiguous about that. For anyone else, what they're saying is that this often (typically, I guess) refers to 2000-2099 and not the other 900 years.
But then how about [2000, 2010)? During the current time period, I expect people are more likely to refer to the decade rather than the century or millennium.
> There’s no good way to refer to 2000-2009, sorry.
The author is wrong here. The correct way (at least in spoken West Coast American English) is the twenty-aughts. There is even a Wikipedia page dedicated to the term: https://en.wikipedia.org/wiki/Aughts If you want to be fancy you could spell it like the 20-aughts. I suppose there is no spelling it with only digits+s though, which may be what the author was looking for.
I'm not sure why, but every time I hear someone use that term, I cringe. The word just feels... off... to me for some reason. Like it's an abomination.
I think it isn't as sexy/interesting as what I thought the article was going to be about (about a different way of talking about our history, in eras maybe vs centuries or something).
This strikes me as kind of like a small PR to our language that makes an incremental improvement to a clearly confusing thing. Should be easy to merge :)
Seriously, though, we should have learnt at this point: we cannot solve social issues with technology. Everybody is working on a different tree, and social dynamics that govern the spread of idioms and from which fork you cherry-pick or merge is much more complex than what git is designed to handle.
https://www.youtube.com/watch?v=KDTxS9_CwZA
The "Final Jeopardy" question simply asked on what date did the 20th century begin, and all three contestants got it wrong, leading to a 3-way tie.
It's the rare people that don't who actually change the world.
no it won't lol, people will pay just as much through the new dating system as they would through the old.
https://www.britannica.com/story/ten-days-that-vanished-the-....
https://en.wikipedia.org/wiki/Holocene_calendar
No, it isn't, since you explicitly said to start the first century on the date that doesn't exist. What does that even mean?
The point is that some days got skipped over the centuries, but there's no need to make the centuries have weird boundaries.
That's not what the poster I originally responded to is saying. He's saying the 1st century should start on a nonexistent day.
That allows for consistent zero-indexed centuries. It doesn't have any other practical consequences that matter.
(0, because only after the first episode do we actually have 1 episode performed. Consequently, the episode numbered 1 is then the second one.)
It’s actually the second.
Some of the characters in Death Stranding, namely the main one, follow a given-name, profession, employer naming convention -- as in Sam Porter Bridges.
Glenn Miller, Gregory Porter and Sam Smith just happen to have been more inclined to make music.
Yeah, but it's a rhetorical failure. This sounds terrible and far worse than the alternatives.
If we want a better system we'll need to abandon either the day or the Gregorian (Julian + drift) calendar.
Izzard had the right idea: https://youtu.be/uVMGPMu596Y?si=1aKZ2xRavJgOmgE8&t=643
No, we don't.
Sure, but no one ever uses that phrasing after you turn one. Then it's just "when they were one", "when they were five", whatever.
So sure, maybe we can continue to say "the 1st century", but for dates 100 and later, no more.
If you are going to be that pedantic, I would point out that one only has one birthday.
(Well, unless one's mother is extremely unlucky.)
“Birthday” really means “anniversary of the date of birth”.
In "figure of speech", or conventual use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.
We talk of a child in their 10th year as being age 10. Might even be younger. Try asking a people if advice about a child in their "5th year of development" means you're dealing with a 5 year old. Most will say yes.
So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!
Similarly "centuries" don't have a century digit until the 100s, which would make that the 1st century and just call time spans less than that "in the first hundred years" (same syllables anyway).
It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.
That’s not the case, though. They can vote (and drink, in quite a few countries) when they are at least 18 years old, not when they are in their 18th year (who would even say that?)
People are 18 years old (meaning that 18 years passed since their date of birth) on their 18th birthday. There is no need of shoehorning 0-based indexing or anything like that.
> Most will say yes.
Most people say something stupid if you ask tricky questions, I am not sure this is a very strong argument. Have you seriously heard anybody talking about a child’s “5th year of development”, except maybe a paediatrician? We do talk about things like “3rd year of school” or “2nd year of college”, but with the expected (1-indexed) meaning.
> So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!
It’s really not. To have experienced a full year, you need a year to have passed, which therefore has to be the first. I think that’s a cardinal versus ordinal confusion. The first year after an event is between the event itself and its first anniversary. I am not aware of any context in which this is not true, but obviously if you have examples I am happy to learn.
> It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.
Right. I know it is difficult to admit for some of us, but we are not computers and we do not work like computers (besides the fact that computers work just fine with 1-indexing). Some people would like it very much if we counted from 0, but that is not the case. It is more productive to understand how it works and why (and again cardinals and ordinals) than wishing it were different.
Writers.
And yes, cardinal versus ordinal is my point. The farther from the origin, the less people are likely to want them different.
You sure?
“He was born in the summer of his 22nd year…”
https://genius.com/John-denver-rocky-mountain-high-lyrics
On the contrary, enough people say it that it's a Quora question:
https://www.quora.com/What-does-it-mean-to-be-in-your-twenty...
Authors love phrases like this. Which, in turn, comes from another ordinal/cardinal confusion stemming back to common law:
"A person who has completed the eighteenth year of age has reached majority; below this age, a person is a minor."
That means they completed being 17, but that's just too confusing, so people think you stop being a minor in your 18th year.
Nevertheless, the traditional "how old are you" system uses a number 1 less.
Latin, like Lua, is 1-indexed.
https://douglasadams.com/dna/pedants.html
I will resent it till I die.
https://en.wikipedia.org/wiki/Long_eighteenth_century
https://en.wikipedia.org/wiki/1st_century_BC (the "last century BC") https://www.reddit.com/r/AskHistorians/comments/1akt4zm/this... https://www.quora.com/What-is-the-first-half-of-the-1st-cent... https://www.quora.com/What-is-meant-by-the-2nd-half-of-the-5... etc
I often get excited by some discovery sounding a lot older than it actually is, for reasons like this.
> There’s no good way to refer to 2000-2009, sorry.
This isn't really an argument against the new convention, since even in the old convention there was no convenient way of doing so.
People mostly just say "the early 2000s" or explicitly reference a range of years. Very occasionally you'll hear "the aughts".
I’ve no idea. When did the American revolution happen?
Not everyone’s cultural frame of reference is the same as yours. I can tell you when the Synod of Whitby happened, though.
I limited my initial statement to educated folks, and presumably those who would like to be counted among them.
/history/democracy/milestones -> relatively important.
I like the German Nullerjahre (roughly, the nil years). Naught years or twenty-naughts works pretty well too imho.
Not sure what a good word for this would be, but maybe just use what we already say — “hundreds”.
So, in the late 17th hundreds, …
In terms of music this is true.
We still say “20th century” though because that’s idiomatic.
Famous last words.
Citation heavily needed
Even changing all the places it got encoded into software probably wouldn't be easy.