comp.lang.ada
From: Optikos <ZUERCHER_Andreas@outlook.com>
Subject: Re: What is the history behind Natural'First = 0 ?
Date: Fri, 1 May 2020 11:14:19 -0700 (PDT)
Message-ID: <1c0449b3-9aa5-4222-88e8-aa6aab0b24b9@googlegroups.com>
In-Reply-To: <9f0215ca-2760-47cf-a7cb-50184892e1d0@googlegroups.com>

On Thursday, April 30, 2020 at 11:51:09 PM UTC-5, reinert wrote:
> I have been wondering about this for years:
> 
> Why Natural'First = 0 ?
> 
> There is no consensus about including 0 among the natural numbers.
> Since there is a Positive (Positive'First = 1), one may expect Natural'First = 0
> Except for this, I find little intuition in "Natural'First = 0".
> 
> 
> Copy form: https://en.wikipedia.org/wiki/Natural_number#History
> 
> Some definitions, including the standard ISO 80000-2,[1][2] begin the natural numbers with 0, corresponding to the non-negative integers 0, 1, 2, 3, …, whereas others start with 1, corresponding to the positive integers 1, 2, 3, …,[3][4] while others acknowledge both definitions.[5] Texts that exclude zero from the natural numbers sometimes refer to the natural numbers together with zero as the whole numbers, but in other writings, that term is used instead for the integers (including negative integers).[6]
> 
> Is the key point here: "the standard ISO 80000-2" ?
> 
> reinert

The key here is that there has always been a terminology divide between North America and Europe (with the UK and Canada sometimes going a third way, along British-Empire lines).

Generally, in the USA, the set of natural numbers is the set of positive integers, which is denoted ℕ domestically, or ℕ* when interacting with people outside of the USA to make the exclusion of zero explicit.  Generally, in the USA, the set of whole numbers is the set of nonnegative integers, which is denoted ℕ₀ or ℤ*.

Conversely, generally in the UK and Europe, the set of natural numbers is the set of nonnegative integers, which is denoted ℤ*.  (The dispute even extends that far: different double-struck/white-Z mnemonic notation, ℤ⁺ versus ℤ*.)  Generally in the UK and Europe, the set of counting numbers was formerly the set of positive integers, but the further we get from the 19th century, the more often whole numbers has become synonymous with the UK/European definition of natural numbers.

https://mathworld.wolfram.com/NaturalNumber.html

https://mathworld.wolfram.com/NonnegativeInteger.html

https://mathworld.wolfram.com/CountingNumber.html

https://en.wikipedia.org/wiki/Natural_number

Ichbiah showed his European culture by institutionalizing the European definition as the sole normative definition in Ada.  (And don't even get me started on billion, trillion, and milliard.)
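For reference, both subtypes are predefined in package Standard; the Ada Reference Manual (A.1) gives their declarations, in effect, as:

```ada
--  Predefined in package Standard (Ada RM A.1); shown here for reference.
subtype Natural  is Integer range 0 .. Integer'Last;
subtype Positive is Integer range 1 .. Integer'Last;
```

So Natural'First = 0 and Positive'First = 1 hold by definition, matching the European reading of "natural number".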


Thread overview: 12+ messages
2020-05-01  4:51 What is the history behind Natural'First = 0 ? reinert
2020-05-01  7:52 ` J-P. Rosen
2020-05-01  8:38   ` AdaMagica
2020-05-01 10:24     ` J-P. Rosen
2020-05-01 19:03   ` Keith Thompson
2020-05-01 21:36     ` Robert A Duff
2020-05-03 20:08       ` Keith Thompson
2020-05-04  3:02         ` Keith Thompson
2020-05-04  8:50           ` Paul Rubin
2020-05-04 14:22           ` Dennis Lee Bieber
2020-05-01 10:13 ` Jeffrey R. Carter
2020-05-01 18:14 ` Optikos [this message]