From: "Warren W. Gay VE3WWG"
Newsgroups: comp.lang.ada
Subject: Re: A nongeneric bounded string array type (in database code)
Date: Thu, 16 Oct 2003 12:39:34 -0400

Robert I. Eachus wrote:
> Warren W. Gay VE3WWG wrote:
>
>> Ah, but "arbitrary number of matches" could mean billions of
>> rows! Do you really want to process columns that way? ;-)
>
> Ah, it could mean that, but, depending on the database, you can either
> ask for the number of records that match, or return at most say 50, and
> a count of how many more records remain. That is an implementation
> detail that the programmer needs to think about and handle appropriately.

Yes, depending on the database indeed. But you see how easy it is to
forget that important detail? ;-) And databases vary widely in their
level of support for this (MySQL supports a LIMIT clause, but many
others do not).

Using a fetch loop that processes one row at a time is so much easier
(at least where it is possible), and it eliminates the resource issue.
The CLIENT LIBRARY should already be handling the EFFICIENCY aspects of
fetching, in bunches of rows, thus freeing your code from such
implementation details. If you insist on array-at-a-time processing,
then you are usually focusing on efficiency, which is a very
implementation-minded design. That is what many Ada people try to get
away from (and it is certainly my focus).

There is another problem with the array approach. You start referring
to values by subscript (whether by constant, by variable, or by literal
number), and that opens up more possibilities for error. I suppose you
could use renaming, but then you have to ask yourself why you are
coding all of these extra language constructs. Whereas if you process a
row at a time, there is no possibility of pairing the wrong
Customer_Name with another customer's Customer_Account. It's readable,
it's simple, and it can be strongly typed.
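To make the row-at-a-time idea concrete, here is a rough sketch. The
DB package and its operations (Connect, Prepare, Execute, Fetch, Value,
Is_Null and so on) are only illustrative stand-ins for whatever binding
you happen to use -- this is NOT APQ's actual API, just the shape of a
strongly typed, one-row-at-a-time loop. It also shows the kind of
explicit NULL check I come back to below:

with Ada.Text_IO;          use Ada.Text_IO;
with Ada.Strings.Bounded;
with DB;   --  illustrative stand-in for a database binding

procedure List_Customers is

   package Names is
      new Ada.Strings.Bounded.Generic_Bounded_Length (Max => 40);

   --  One strongly typed record per row: every column is referred to
   --  by name, so there is no way to pair the wrong Customer_Name
   --  with a Customer_Account by getting a subscript wrong.
   type Customer is record
      Account : DB.Account_Number;
      Name    : Names.Bounded_String;
      Balance : DB.Money_Type;         --  this column may be NULL
   end record;

   Conn : DB.Connection_Type;
   Qry  : DB.Query_Type;
   Row  : Customer;

begin
   DB.Connect (Conn, Database => "customers");
   DB.Prepare (Qry, "SELECT ACCOUNT, NAME, BALANCE FROM CUSTOMER");
   DB.Execute (Qry, Conn);

   --  Fetch one row at a time.  The client library is free to buffer
   --  rows underneath, so this need not cost anything in efficiency.
   while DB.Fetch (Qry) loop
      Row.Account := DB.Value (Qry, 1);
      Row.Name    := Names.To_Bounded_String (DB.Value (Qry, 2));

      --  Allow for NULL explicitly.  Extracting a NULL column without
      --  checking should raise an exception, not hand back garbage.
      if DB.Is_Null (Qry, 3) then
         Put_Line (Names.To_String (Row.Name) & ": no balance on record");
      else
         Row.Balance := DB.Value (Qry, 3);
      end if;
   end loop;

   DB.Close (Qry);
   DB.Disconnect (Conn);
end List_Customers;

The point is that everything after the Fetch is plain, strongly typed
Ada; the buffering and the wire protocol stay inside the binding where
they belong.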
As a side note, your Ada_Strings_Bounded_Array package makes a common
mistake: it does not allow for the processing of NULL values. This is
one fault I see frequently in embedded SQL code, because the programmer
never took the trouble to code for that possibility, and it leads to
all sorts of hideous problems when NULLs do turn up in processing.
There is no doubt that Ada_Strings_Bounded_Array could be augmented to
handle this, but it does illustrate how easily this RDBMS feature gets
overlooked. Microsoft is another one that is guilty of this in their
database controls (try handling a null date in the date picker, for
example ;-) This is why in APQ, if the programmer doesn't allow for a
NULL value, he will get the appropriate exception raised. While this is
less than ideal (compile-time errors would be better), it does
guarantee that the problem gets addressed if a NULL value ever shows up
where it was not anticipated.

If you take arrays to the further level of columns x rows, then you
have a two-dimensional array to mess with -- which in my view is very
dangerous from a code readability and write-reliability point of view.

Don't get me wrong: arrays have their place. But here I can't help but
think that they are being used for programmer convenience and/or
efficiency, which IMO is not the best approach. No offense intended,
but this approach seems to transport the "Perl way" into Ada code. ;-)

--
Warren W. Gay VE3WWG
http://home.cogeco.ca/~ve3wwg