From: "Jean-Pierre Rosen"
Subject: Re: Text_IO for other standard types
Date: 1998/01/17
Message-ID: <69q6k3$8fk$1@peuplier.wanadoo.fr>
References: <98011512220569@psavax.pwfl.com>
Organization: Adalog
Newsgroups: comp.lang.ada

Marin David Condic, 561.796.8997, M/S 731-96 wrote in message <98011512220569@psavax.pwfl.com>...
> I afraid I would have to respectfully disagree for a couple of
> different reasons.

I would respectfully continue to agree with myself :-). I think this discussion arises from a much deeper issue. From what you wrote, it appears that you consider the predefined types as the "normal" types, and user-defined types as an extra burden (or luxury) which is only cost-effective for high-security, long-lived, etc. projects.

I do not share that view. To me, data design is an important part of any design, just like algorithmic design. Any type you use has constraints, and you should choose a data type that matches the requirements. Note that you *may* choose a predefined type, provided it's a design decision, if your requirements are:
- No strong constraints on range (or accuracy for floating-point types).
- You are actually happy that the type changes depending on how powerful the machine is (big ranges on big machines, small ranges on small machines).
- You want your type to match the hardware closely (for efficiency or interfacing reasons). Note that in this case the types defined in package Interfaces might be a better match than the predefined types.
In the general case, however, you have some clear constraints on the type, and these must be translated into the code. The trouble is that too often people think of Integer as the mathematical integer. Of course they know it is limited, but they tend to think "well, it's an integer" (mathematically) and implement it as Integer (the predefined type). In general it works, because Integer is "big enough" (until it bombs), but it wastes space. How many people use 32 bits to count from 0 to 10! (I know, we have lots of memory, but...). The big issue here is that this is an implicit design choice, i.e. one that does not result from a design decision, but simply from not considering alternatives.

In most other languages, predefined types are the only basic types available. It is an important benefit of Ada that data analysis can be applied to basic types as well, and from a pedagogical point of view, it is important to teach that right from the start. From my experience in teaching Ada (and I gave my first Ada course in 1979!), if you start teaching with predefined types, people stick to them, even if you teach them more evolved types later. Actually, I have found it a very bad teaching practice to introduce something and later say that it should not be used. Therefore, I always start by teaching user-defined types, and only mention later: "oh, by the way, for some special purposes (like indexing strings), there is a predefined integer type named Integer".

Are user-defined types such a burden? I think not. During the first exercise session, people complain about those declarations and type incompatibilities that keep preventing their program from compiling. Typically, they need 2 to 3 hours to get their first successful compilation. At that point, they usually turn to me and say "now, we'll have to debug that thing". And guess what? In approximately 90% of the cases it works on the first run.
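To make the point concrete, here is a minimal sketch of the kind of declaration I mean (the names Slot_Count and Used are illustrative, not from any real project):

```ada
procedure Demo is
   --  The range *is* the specification: 0..10 is a design constraint,
   --  recorded in the code instead of being buried in a comment.
   type Slot_Count is range 0 .. 10;

   Used : Slot_Count := 0;
   I    : Integer    := 5;
begin
   Used := Used + 1;     --  fine: stays within the declared range

   --  Used := 11;       --  rejected at compile time: out of range
   --  Used := I;        --  rejected: Integer and Slot_Count are
   --                        distinct types; no silent mixing
end Demo;
```

The compile-time rejections shown in the comments are exactly the "type incompatibilities that keep preventing their program from compiling" mentioned above; they are the controls that later make the program work on the first run.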
And then the student turns to me and says: "well, maybe all those controls were really useful after all".

In short: YES, I would declare:
   type Counter is range 0 .. 10;
even in a 10-line program. And there is a good reason for this (also an important teaching subject!): if your mind is really quality-oriented, you don't wonder whether it's worth doing things cleanly; you *always* do it that way. And from my programming experience, I have a collection of memories of later regretting not having done things properly right from the start. I have *no* memories of regretting having done things properly.

> 3) Not all students are CS majors who need to be steeped in the
> theoretical culture because they will be designing the future.

Once again, I have found user-defined types extremely *easy* for beginners to understand. It's rather those with CS majors who have trouble getting rid of the predefined ones...

> (And, P.S.: Try explaining generic instantiation to a room full of
> non-programming engineers just so you can do simple I/O exercises
> with them and see how difficult this is!

It's not. Just give it as a "cooking recipe". The first day, I explain that to get I/O on type Counter, they have to use the magic formula:
   package Counter_IO is new Integer_IO (Counter);
and that they will later discover why (I don't even tell them it's a generic instantiation). People accept it without problems.
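For completeness, the "cooking recipe" above drops into a complete program like this (a sketch in Ada 95 style; the procedure name and prompt text are illustrative):

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Count_Demo is
   type Counter is range 0 .. 10;

   --  The magic formula: instantiate the generic Integer_IO package
   --  nested in Ada.Text_IO for our own type.
   package Counter_IO is new Ada.Text_IO.Integer_IO (Counter);

   C : Counter;
begin
   Put ("Enter a count (0..10): ");
   Counter_IO.Get (C);   --  reads a Counter; out-of-range input is rejected
   Put ("You entered: ");
   Counter_IO.Put (C);
   New_Line;
end Count_Demo;
```

Students can use Counter_IO.Get and Counter_IO.Put on the first day without knowing anything about generics; the "why" comes weeks later.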