From: "Nick Roberts"
Subject: Architectures
Date: 1997/04/12
Message-ID: <01bc476f$8455f8e0$22f482c1@xhv46.dial.pipex.com>
References: <1997Apr2.202514.1843@nosc.mil> <01bc42b0$a88691c0$90f482c1@xhv46.dial.pipex.com> <1997Apr7.130018.1@eisner> <1997Apr9.214815.16233@ocsystems.com>
Organization: UUNet PIPEX server (post doesn't reflect views of UUNet PIPEX)
Newsgroups: comp.lang.ada

Joel VanLaven wrote in article <1997Apr9.214815.16233@ocsystems.com>...
> Given progress similar to history, 64 bits ought to be enough
> for many decades. Assuming that capacity doubles every two years and
> that only highly specialized systems today would need more than 1 terabyte
> of disk space, the first time 64-bit addresses will not cover a very large
> filesystem will be in about 48 years. The average 4gig home system
> (extrapolated) won't hit that barrier for another 16 years after that.
> Before either, UNIX 32-bit times will wrap around. In between we will
> probably think that 32-bit addressing wasn't enough but 64 bits is more
> than enough.

It's interesting that good old MULTICS (remember MULTICS?), back in the
swinging sixties, was based on a 48-bit (I think it was about that)
addressing scheme, divided into 24 bits for a segment selector and 24 for
the offset. The idea was that all files would (eventually) be mapped
straight into a segment each. Of course, in the end, it never happened (not
for technical reasons). Nowadays, 16MB would be considered much too small a
limit for file size, but it probably seemed plenty then.

It's not too difficult to imagine applications requiring data sets bigger
than 4GB to be loaded into (virtual) memory in the not-too-distant future;
for example, an application implementing very sophisticated voice
recognition. Tomorrow's architectures will have to cope with this sort of
thing.

Nick.
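
A back-of-the-envelope check of the doubling argument quoted above, as a
small Ada program. The starting figures (1 terabyte for a very large
filesystem today, 4 gigabytes for a home system, capacity doubling every
two years) are taken straight from Joel's post; the program itself and its
names are only an illustrative sketch.

with Ada.Text_IO; use Ada.Text_IO;

procedure Address_Extrapolation is
   --  Figures quoted from the post above (assumptions, not measurements):
   Address_Bits     : constant := 64;  --  width of a flat byte address
   Large_FS_Bits    : constant := 40;  --  ~1 terabyte filesystem today
   Home_Disk_Bits   : constant := 32;  --  ~4 gigabyte home system today
   Years_Per_Double : constant := 2;   --  capacity doubles every two years
begin
   --  Each doubling adds one address bit, so the gap in bits times the
   --  doubling period gives the number of years until the limit is hit.
   Put_Line ("Very large filesystem outgrows 64-bit addresses in about"
             & Integer'Image ((Address_Bits - Large_FS_Bits) * Years_Per_Double)
             & " years");
   Put_Line ("Average home system outgrows 64-bit addresses in about"
             & Integer'Image ((Address_Bits - Home_Disk_Bits) * Years_Per_Double)
             & " years");
end Address_Extrapolation;

This prints 48 and 64 years, matching the "about 48 years" and "another 16
years after that" in the quote.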
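
And a minimal sketch of the MULTICS-style split described above, assuming
the admittedly approximate figures of a 24-bit segment selector and a
24-bit offset; the type and procedure names here are made up for
illustration. It shows where a 16MB per-segment (and hence per-file)
ceiling would come from.

with Ada.Text_IO; use Ada.Text_IO;

procedure Segmented_Address is
   --  Approximate figures from the post above:
   Selector_Bits : constant := 24;
   Offset_Bits   : constant := 24;

   type Segment_Selector is mod 2 ** Selector_Bits;
   type Segment_Offset   is mod 2 ** Offset_Bits;

   --  A 48-bit address as a selector/offset pair; the idea being that a
   --  whole file is mapped into the segment named by the selector.
   type Segmented_Pointer is record
      Selector : Segment_Selector;
      Offset   : Segment_Offset;
   end record;

   --  Largest object (file) that fits in one segment: 2**24 bytes.
   Max_Segment_Bytes : constant := 2 ** Offset_Bits;
begin
   Put_Line ("Per-segment (per-file) limit:"
             & Integer'Image (Max_Segment_Bytes / (1024 * 1024)) & " MB");
end Segmented_Address;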