From: Robert Dewar
Subject: Re: Context switching (was: delay until and GNAT)
Newsgroups: comp.lang.ada
Date: 1999/05/11
Message-ID: <7h83ag$o8$1@nnrp1.deja.com>
References: <7gpukr$s82$1@nnrp1.dejanews.com> <7grkbb$cee$1@nnrp1.deja.com> <7grvka$lc5$1@nnrp1.deja.com> <7h1e10$drg$1@nnrp1.deja.com> <3736e102@eeyore.callnetuk.com>

In article <3736e102@eeyore.callnetuk.com>,
  "Nick Roberts" wrote:

> I'd just like to add a little note on context switching.
>
> Many processor architectures today provide built-in support for
> (normal) context switching, so that the operating system will
> usually have very little to do with the speed of these context
> switches.  Switches can generally be achieved within a few dozen
> memory clock cycles (typically out-of-cache), which will be, for
> most modern microcomputers, in the ballpark of 1 microsecond
> (+/- one order of magnitude).

Can you say what processor architectures you have in mind here?
Certainly none of the ones that GNAT is commonly used on ...

The context switch on the x86 in particular is horribly slow, and one
would like to avoid it in a high-efficiency x86 executive.  (I once saw
an RFP for an Ada compiler from Intel that required that tasks use the
hardware tasking of the x86.  There was a footnote saying that it was
understood that this requirement would degrade performance :-)
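
For the curious, one crude way to get a feel for the number on a given
GNAT target is to time a tight rendezvous loop between two tasks.  The
sketch below is only illustrative and not from this thread: the name
Switch_Cost, the Ponger task, and the 100_000 iteration count are
arbitrary choices.  Each Ping entry call involves at least two task
switches, so the reported per-rendezvous time is an upper bound on
roughly twice the switch cost plus rendezvous overhead.

   with Ada.Text_IO;   use Ada.Text_IO;
   with Ada.Real_Time; use Ada.Real_Time;

   procedure Switch_Cost is
      --  Number of rendezvous to time; an arbitrary illustrative value
      Iterations : constant := 100_000;

      task Ponger is
         entry Ping;
         entry Stop;
      end Ponger;

      task body Ponger is
      begin
         loop
            select
               accept Ping;       --  each accepted call forces switches
            or
               accept Stop;
               exit;
            end select;
         end loop;
      end Ponger;

      Start, Finish : Time;
   begin
      Start := Clock;
      for N in 1 .. Iterations loop
         Ponger.Ping;             --  caller blocks until Ponger accepts
      end loop;
      Finish := Clock;
      Ponger.Stop;
      Put_Line ("Seconds per rendezvous round trip:"
                & Duration'Image
                    (To_Duration (Finish - Start) / Iterations));
   end Switch_Cost;

Dividing the per-rendezvous figure by two gives a rough ceiling on the
switch cost itself; whether the runtime maps tasks to OS threads or
schedules them itself will of course change what you are measuring.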