Archive for the ‘RPGII’ Category

RPG In Isolation

Saturday, November 7th, 2009

I was going to discourse on another topic, but a comment from Buck on my previous post (to whom I apologize, along with Bill, for being derelict in checking my site’s e-mail) demands a reply. He feels that lack of formal education is only a contributing factor; the real culprit, in his view, is the midrange habit of being exposed to only one programming language.

Actually, I am inclined to agree. The isolation in RPG is real. But we also do well to consider the background of the situation. The earliest IBM minicomputer, the System/3, had RPG II as its programming language. That was it. Small businesses that wanted that computer worked in that language - at least, I’m not aware of any others used on the machine besides assembler.

When the System/34 came out in 1977, the choice of compilers widened. BASIC, COBOL, and Fortran became available. But still, RPG dominated. I’m not surprised that Fortran and BASIC did not gain in popularity. But COBOL was big at the time. However, RPG was already a business-oriented language, so COBOL was not really needed from that standpoint, and RPG was perhaps felt to be terser and easier to learn. Then too, COBOL programmers were already being kept busy keeping the IBM mainframes humming, so there was probably not an excess of them floating around. COBOL has never been much more than a peripheral language on the IBM midrange, useful probably only to those whose business already depended in some way on COBOL and who perhaps had COBOL talent available.

In addition, and perhaps most important of all, the System/34 had no need to talk to the outside world unless it was to another IBM minicomputer or mainframe. There was no need to handle HTML; there was no HTML, because there was no Internet.

This is not to say that RPG was a perfect language. String manipulation beyond the realm of MOVE and MOVEL was very difficult. The two things the designers never got right until RPGIV came along were string manipulation and date arithmetic - being able to get the date 30 days after today, for instance. (One thing I will never comprehend is why they have never been able to provide multi-dimensional arrays. In earlier days, in a different environment, trying to emulate multi-dimensional arrays was a real pain.)

But since the programmers had no other languages to work with, they bent RPG to their will. They created plug-ugly array-manipulation routines to piece together strings. And they created complex routines to do date addition and date format conversion. We must not forget the famous “magic numbers” - to convert a date from month-day-year format to year-month-day, for instance:

     MMDDYY    MULT      10000.01      YYMMDD

I am still reminded of an RPG programmer of my acquaintance who, with the help of engineers, implemented sines, cosines, and other functions in RPGII. (For the curious, a sketch of why the magic-number trick works follows.)
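Why does the trick work? The 10000 part of the multiplier shifts the original number left so that its low-order YY lands in the top two positions of a six-digit result, and the .01 part slides a copy down so that MMDD lands in the bottom four positions; truncating the decimal positions and letting the high-order digits overflow out of a 6,0 result field leaves YYMMDD. Here is a rough Python sketch of the arithmetic - my own illustration, not the original code - using Decimal to mimic RPG’s exact packed-decimal MULT:

    from decimal import Decimal

    # Rough sketch of the MMDDYY -> YYMMDD "magic number" trick. RPG's MULT into
    # a 6,0 result field truncates the decimals and drops overflowing high-order
    # digits; Decimal keeps the arithmetic exact, as packed decimal would.

    def mmddyy_to_yymmdd(mmddyy: int) -> int:
        product = Decimal(mmddyy) * Decimal("10000.01")  # 110709 -> 1107091107.09
        truncated = int(product)                         # drop the decimal positions
        return truncated % 1_000_000                     # keep the low six digits

    print(mmddyy_to_yymmdd(110709))   # 91107 - a 6,0 field would show 091107,
                                      # i.e. year 09, month 11, day 07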

Now, there may have been elegant functions written in C for things like string manipulation and mathematics. But once the RPG routines were developed, the C functions were not needed. It really didn’t matter; they couldn’t use C anyway. If you knew what you were doing, you could write assembler subroutines - but that was it. I don’t think RPG could even call COBOL programs; at least, the techniques for doing so were not well known.

So when ILE came around, much of the motivation for calling the now-available C routines was missing. We didn’t need them. RPG routines were already doing the work. Unless there were some performance considerations, programming managers would probably take a dim view of programmers who ripped out the ugly-but-tried-and-true RPG routines just for their own amusement. One could make a valid argument that those routines were a maintenance nightmare. But unless management had a good deal of foresight, this might be a difficult thing to prove to management’s satisfaction.

And yes, programmer inertia was also a factor. My shop has had IBM midrange machines since the 1970s, and AS/400s since the ’90s. But the first RPGIV program on this system was written in 2007, by me, after I arrived.

But now, the IBM midrange does have to deal with the outside world. My company already has an Internet presence, and quite frankly we don’t have to know PHP or Java or the other languages in vogue in order to make the business work. All we have to do is make sure our Internet provider gets the necessary data. And RPGIV provides the tools to alleviate some of the issues that plagued us with earlier versions of RPG. If we need outside routines - APIs, C, Java, or REXX - they are easy enough to call. I have expressed my ambivalent feelings about freeform RPG in earlier posts.

But mentioning freeform brings up another question: how do our employers find replacements for us fixed-format-loving old codgers when we retire? That is a topic for another time.

Education and Programming Style

Saturday, October 3rd, 2009

I have been thinking a bit about how our backgrounds, educational and otherwise, may determine at least to some extent the style we use in our programming and indeed perhaps the language we use.

One of my fellow programmers (about my age) graduated from college with a background in computers. Almost immediately he was able to find a position programming in COBOL, then in ALGOL, which is a block-structured language with a relatively free format. He continued to do well at various positions; ultimately, he came to where I work, with many years of experience in programming, but none in RPG. The style here could be described as a blend of RPGII, RPGIII, and RPG/400; fixed format to the core, loaded with indicators. He learned RPG from the ground up in that environment. Sometime after I came along, he was shown freeform RPG.

I, on the other hand, learned RPGII from a book and from practical experience. I had no computer background when I started to learn. I was of an inquisitive spirit, though, and I advanced in my use of RPG as the language itself improved. I too ultimately became acquainted with freeform RPG.

Now, who do you think uses freeform RPG whenever he can, especially on new programs? You might have guessed it was my partner - if you did, you’d be right. Fixed-format RPG was actually stifling to him; given the freedom to use free-format RPG along with the BIFs that work well with it, he has been liberated. On the other hand, as you may judge from this blog, I just have not seen the light, or the point. When I write code for new programs, it is in fixed-format RPGIV.

We learned to love the way we first learned to program, and so our preference now is to do it the way we have always done it, just better.

A discussion I just read on an RPG forum illustrates the point that the way we program is greatly influenced by our programming origins. It concerned the significance of certain characters when writing CL (Control Language on the AS/400). Instead of indicating a “not equal” test with the common expression *NE, the code used the combination ¬=, where the ¬ is a character I can’t find on my keyboard - it looks like the upper-right-hand corner of a rectangle with the top leg longer than the right side. The character, as it turns out, is the “logical not”, a character often used in Boolean algebra.

Now, I don’t know who originated that code. Some experienced programmers, though, just did not know what it meant. Therein may lie a tale.

During the time I learned, many if not most of my fellow programmers were not products of any kind of computer science curriculum. They did not learn BASIC or Fortran first, and usually not COBOL, though that was a bit more common. They learned RPG as their first language. They did not learn “computer science concepts”; many, like me, either never attended college or did so only briefly. They may have started as a computer operator, or perhaps in another position in the company. Usually it was a small company, with the smallest-capacity IBM computer available: a System/3, System/32, or System/34. Something about the programming process intrigued or excited them; they persevered, and they advanced.

As time went on, though, and RPG increased in importance, people with a background in computers and computer science began to program in RPG. For a long time, they would find themselves stifled if they expected to program the way they had originally learned in other languages. And where the original programmers may have simply accepted the RPG cycle, for example, and comprehended it entirely, programmers who started learning RPG as RPGIII on the System/38 evidently found it mind-boggling.

I can’t count the number of times I have found, in code using an input or update primary file (usually written by programmers who came from the 38), a test for LR (last record) in the detail portion of the RPG cycle. Unless explicitly set on with a SETON instruction, the LR indicator will never be on during detail time. A programmer who started with RPG II would know this automatically - it would be part of his upbringing. A programmer who came to RPG later with a computer background would not necessarily know it. This would cause problems if the programmer intended to base output on that indicator test; usually it would mean that last-record calculations and total output were missing.

Another difference in outlook comes with file processing. A programmer raised on RPGII (usually in a small shop) doesn’t mind the system reading files (primary and secondary) without being told to. But I have noticed in discussion forums that people raised on other programming languages (usually learned in college) seem to get bent out of shape if they don’t get to do the file reading explicitly themselves. They don’t comprehend that this is actually quite an advanced feature, much closer in spirit to SQL than to Pascal.
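For anyone who never worked with it, here is a much-simplified Python sketch of what the fixed-logic cycle does with a single primary file. It is my own illustration, not IBM’s specification - it ignores record-identifying indicators, level breaks, and secondary or matching files - but it shows both why the cycle reads the file for you and why a test of LR at detail time never succeeds unless the program sets LR on itself:

    # A much-simplified sketch of the RPG fixed-logic cycle for one primary file.
    # detail_calcs and total_calcs stand in for the program's detail-time and
    # total-time calculations; the cycle itself does all of the reading.

    def run_cycle(primary_file, detail_calcs, total_calcs):
        lr = False
        for record in primary_file:      # the cycle reads each record for you
            detail_calcs(record, lr)     # LR is still off here, every time through
        lr = True                        # end of file: the cycle turns LR on
        total_calcs(lr)                  # total-time calcs and output run once

    # Example: a grand total accumulates at detail time and prints at total time.
    def demo():
        total = 0
        def detail(rec, lr):
            nonlocal total
            total += rec["qty"]          # a test of lr here would always find it off
        def totals(lr):
            print("grand total:", total)
        run_cycle([{"qty": 2}, {"qty": 5}], detail, totals)

    demo()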

It would be interesting to find a survey of programmers that showed what level of education they had when they first started programming professionally. I would suspect that as a group, RPG programmers would have a slightly lower level of educational advancement than, say, C or Java programmers. This would especially be true if the survey covered the most experienced RPG programmers. But then, it is results that matter. RPG programmers do not write compilers - we process information so that humans may comprehend it.

The Language We Learn - The Way We Think

Friday, July 24th, 2009

From time to time I read that the language we speak controls, to some extent, the way we think. Someone more versed in such things could say how true this is of natural languages, but I have found it to be especially true of computer languages. A chunk of code I recently came across while working on a program at my place of employment illustrates the point.

The point of the routine is to add a particular quantity to a total a number of times defined by a variable field.

Now, to most modern programmers, this is a trivial task. A classic DO loop will do the trick quite nicely. But this code was not written in a modern language. It was written in RPGII, which didn’t have DO and END opcodes to use.

But first, here is how the logic, as it appeared in the program, could have been coded using DO.
(The field names have been changed. The number of times the quantity is to be added is NOTMES; the quantity to be added is TOTQTY; the total that TOTQTY is added to is GDTOT.)


     C                       IF        *IN03=*ON AND *IN49=*OFF
     C                       DO        NOTMES
     C  TOTQTY               ADD       GDTOT         GDTOT
     C                       ENDDO
     C                       ENDIF
     

Now, you read above that it was written in RPGII. Here is how it could have been rendered in RPGII:


     C  N03
     COR 49                      GOTO      EIF001
     C                           Z-ADD     0             X
     C        LUP001             TAG
     C        X                  ADD       1             X
     C*  25 SET ON IF X<=NOTMES
     C        X                  COMP      NOTMES                  2525       <=
     C   25   TOTQTY             ADD       GDTOT         GDTOT
     C   25                      GOTO      LUP001
     C        EIF001             TAG                

But this is not even close to how the code was written.
In early RPG, programmers were apparently taught to use an indicator to test for every possible condition - a different one for every condition. And, apparently, the concept of a loop did not occur to this programmer. The value of NOTMES could be as high as 5, so, by George, we must test for every value from 1 to 5 and deal with each accordingly. When I dug into the code and came to comprehend what it was doing, I was absolutely stunned. What follows is the code as translated into RPGIV, so the number of lines is slightly greater than in the original; but the old code looked just as bad:


     C     TOTQTY        ADD       GDTOT         GDTOT
     C  N49NOTMES        COMP      2                                      75
     C  N49NOTMES        COMP      3                                      76
     C  N49NOTMES        COMP      4                                      20
     C  N49NOTMES        COMP      5                                      05
     C*  IN RPGII, NEXT CALC WOULD TAKE 4 LINES
     C   03
     CANN49
     CAN 05
     COR 03
     CANN49
     CAN 20
     COR 03
     CANN49
     CAN 76
     COR 03
     CANN49
     CAN 75TOTQTY        ADD       GDTOT         GDTOT
     C*  IN RPGII, NEXT CALC WOULD TAKE 3 LINES
     C   03
     CANN49
     CAN 05
     COR 03
     CANN49
     CAN 20
     COR 03
     CANN49
     CAN 76TOTQTY        ADD       GDTOT         GDTOT
     C*   IN RPGII, NEXT CALC WOULD TAKE 2 LINES
     C   03
     CANN49
     CAN 20
     COR 03
     CANN49
     CAN 05TOTQTY        ADD       GDTOT         GDTOT
     C   03
     CANN49
     CAN 05TOTQTY        ADD       GDTOT         GDTOT

I’m sorry, but I’m afraid I don’t have any profound insights as to how that monstrosity could have been avoided; yet in a weird way it gets the job done. There is a certain strange creativity about it. The logic is (a small sketch of the equivalence follows the list):
1. Add the quantity once.
2. If it needs to be added 2, 3, 4, or 5 times, add it again.
3. If it needs to be added 3, 4, or 5 times, add it again.
4. If it needs to be added 4 or 5 times, add it again.
5. If it needs to be added 5 times, add it again.
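To make the equivalence concrete, here is a tiny Python sketch - purely my own illustration, with NOTMES assumed to run from 1 to 5 as in the original code - showing that the cascading adds amount to the same thing as the DO loop:

    # The "cascading add" logic from the old code, next to the loop it amounts to.
    # notmes is how many times to add, totqty the quantity, gdtot the running total.

    def cascading_add(notmes, totqty, gdtot):
        gdtot += totqty                    # step 1: always add once
        for threshold in (2, 3, 4, 5):     # steps 2-5: add again if notmes >= threshold
            if notmes >= threshold:
                gdtot += totqty
        return gdtot

    def loop_add(notmes, totqty, gdtot):
        for _ in range(notmes):            # the DO-loop version shown earlier
            gdtot += totqty
        return gdtot

    assert all(cascading_add(n, 7, 100) == loop_add(n, 7, 100) for n in range(1, 6))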

The most basic computer literacy class in middle school today would teach the better way, but such classes weren’t around then. And for those who learned it this way early on, it’s probably the way they kept doing it for the next thirty years.

Oh, my.

Converting to Freeform II

Monday, June 1st, 2009

The rather slow acceptance of freeform RPG since its inception has intrigued me. For that reason, a recent discussion on the RPGIV forum has been interesting. Bob Cozzi, a well-known RPG expert, asked Hans Boldt, who evidently was on the IBM team that developed the original implementation, how long it took to actually code freeform.
Hans said that it took only about 1 PM (person-month?) to actually code it, but that considerably more time was spent in planning. I can believe this. The planning discussions must have been particularly interesting.
One of the biggest things that slowed acceptance of freeform was the decision not to implement MOVE and MOVEL. This is also the opinion of Cozzi and others in the field. Mr. Boldt touched on this and other issues.
Boldt’s first design included implementation of all opcodes. Others in the group, one in particular, argued vociferously against it, and Boldt eventually came around to their way of thinking. As he said, “That design point meant we didn’t have to worry about those goofy multi-part factors.” (I’m not clear on exactly what that means. But anyway...)
Once that was established, it was just a matter of deciding which opcodes to implement and which not to. GOTO was thrown out because of the sentiment in Niklaus Wirth’s famous title for Edsger Dijkstra’s article: “Go To Statement Considered Harmful”. I could quibble about that, but since it’s probably been 4 or 5 years since I last put a GOTO in a program, there wouldn’t be much point in arguing.
With MOVEL and MOVE, however, Mr. Boldt doesn’t understand why people are so attached to those opcodes. He says, “It’s hard to see how these opcodes can be missed given the other options available by free-form opcodes.”
To a certain extent, I agree with his comments. I myself have used MOVE/MOVEL very seldom in the last several years. New opcodes have for the most part been very workable. Using EVAL and many BIFs, I get along very nicely. When EVAL gets a little tedious, I often create data structures to put together data in the desired patterns.
The problem is, I am talking about new code. Old code from the ’70s, ’80s, and early ’90s did not have all this neat stuff available. About all you had for data manipulation was MOVE and MOVEL, along with their cousin MOVEA and perhaps data structures. From my observations over the years, I would say most programmers did not avail themselves of the help even data structures could provide; they did their work with MOVE(L). We are not talking about an easily disposable opcode like FORCE; we are talking about opcodes that appeared in almost every non-trivial RPG program before RPGIV, and in most of the trivial ones. Programmers used them to change data types (from alphanumeric to numeric and back). Above all, they used MOVE to manipulate strings, piecing together long strings from five or six (or more) smaller ones. One can complain that this is a very kludgy way of doing things, but it was all we had to work with. I would suspect that there are millions, if not tens of millions, of MOVEs in the legacy code sitting on today’s AS/400s (or whatever we’re supposed to be calling them today).
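For readers who never used them, here is a rough Python approximation of what MOVE and MOVEL do with character fields - my own sketch of the semantics (plain character moves only, no P extender, no numeric conversion), which is the piecing-together usage described above:

    # Rough approximation of RPG's MOVE and MOVEL on character fields (no P
    # extender, no numeric conversion). "result" is the field's current value,
    # already padded to its declared length.

    def move(factor2: str, result: str) -> str:
        """MOVE: copy factor 2 into the result right-adjusted; any leftmost
        positions not covered keep whatever the result field already held."""
        n = min(len(factor2), len(result))
        return result[:len(result) - n] + factor2[-n:]

    def movel(factor2: str, result: str) -> str:
        """MOVEL: copy factor 2 into the result left-adjusted; any rightmost
        positions not covered keep whatever the result field already held."""
        n = min(len(factor2), len(result))
        return factor2[:n] + result[n:]

    # Piecing a 10-byte field together from two 5-byte fields, MOVE-style:
    big = " " * 10
    big = movel("ABCDE", big)    # 'ABCDE     '
    big = move("12345", big)     # 'ABCDE12345'
    print(repr(big))
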
So what do we do in freeform once MOVE is taken away from us? As pointed out in the post I referenced a day or two ago, we must choose among 10 or 15 alternatives, depending on what we want to do - which in a sense is a demonstration of the power of the opcode. There is no easy conversion; it is not the same thing as changing KEY CHAIN MASTER to CHAIN KEY MASTER. I don’t think even artificial intelligence could come close to doing a good job of determining the best conversion; Linoma tried, but some MOVEs are just untranslatable without human eyes and brains to analyze them.
Therein lies the problem. If you rule out MOVE in freeform, a human must determine the best way to translate it. It simply isn’t financially practical to spend resources changing massive amounts of fixed-format code to freeform by hand, and the utilities available for automatic conversion do not do a clean job. You can use Linoma’s software; you can use WebSphere’s conversion facility, which does not handle the MOVE and does not even attempt the indentation; there may be other utilities around. In the end, you can convert legacy code to a degree automatically, but you will still be left with MOVEs to contend with by hand. You end up with ugly flip-flops between free and fixed format that may be even harder to read and deal with than the original fixed-format code.
It is very nice to say that, when we modify a program, we will also change it to freeform. But that is a slow way of converting what might be hundreds, or even thousands, of programs on your system. Remember, converting a MOVE to something else is not the same as reformatting a CHAIN; it must be carefully tested.
So, while freeform may ultimately win out, the transition could have gone a lot more smoothly if MOVE had been implemented in freeform. The more I think about it, the more I wonder if someone’s ego got in the way - someone who said, “I am going to teach these plebeians how programming SHOULD be done, instead of using these operations that no other reputable programming language uses.” (Down, XFOOT, down!!! Don’t bite him!)
And to think it could have been avoided by allowing the conversion of MOVE STRING1 STRING2 to the freeform equivalent: move string1 string2.

Converting to Freeform RPG

Sunday, May 31st, 2009

I didn’t think I’d be considering free-format RPG so soon as the end state for the current RPG programs on the system at work (some of which date back to the ’70s), but a number of interesting events have taken place.
Our IBM representative made us aware of a software package that converts RPGII, RPGIII, and RPG/400 programs into true RPGIV. This means (among other things) ditching the left-hand indicators and attempting to really use more modern operation codes. If you are at all familiar with what would be necessary, you can appreciate how difficult this task would be without some kind of automation. The painstaking code changes and testing would be incredible on a code base of any size.
This package is called RPG Toolbox, from Linoma Software.
So we downloaded the trial version, which allows 10 source conversions from another RPG dialect - or even from RPGIV that has simply been converted from an older format using IBM’s CVTRPGSRC utility. CVTRPGSRC simply reformats old code into RPGIV syntax; it does not attempt to modernize it. We converted a source member with a lot of left-hand indicators, and we were impressed by how it came out in a reasonably neat format without those indicators. What really impressed me was that it would attempt to convert a series of MOVE statements - commonly the method used to piece together larger fields - into EVAL statements. This could be a dangerous technique, but the RPG Toolbox utility carefully analyzes the code to make sure the conversion is safe. The manual describes how it makes its decisions about code conversion; anyone interested in this product would do well to download and read the manual before deciding what the program should do in the course of a conversion.
I also had an inspiration - why not use it to convert RPGIV to freeform? In earlier posts, I have expressed my reservations about how valuable freeform is. But, if it’s easy, why not live dangerously? So I tried converting one of the RPGIV programs in my file cross-reference system to freeform. The results were impressive. It looked VERY good. The more I looked at it, the more I liked it. So I said to myself, why not convert ALL our code to freeform?
But then my younger colleague brought me back down to earth. He wasn’t so sure that would be a good idea. So we converted the old-format code referred to above into freeform. The results were messy. There was much code that simply could not be converted, often involving the MOVE and MOVEL opcodes. (I have commented on this previously, too.) When this happens, the free-format code is enclosed in a /FREE - /END-FREE set of compiler directives and the unconvertible code drops back into fixed format. Switching back and forth between free and fixed can be maddeningly ugly.
So I have backed off the idea of a total conversion to freeform RPG. But we still feel that converting our code to a clean RPGIV that uses as much as possible of RPGIV’s modern syntax and opcodes would certainly be a good thing; and it appears that it would be relatively safe. We have ordered the software. Where the conversion of a program to freeform would not create too messy a result, we can still convert it.
I was also surprised to learn how quickly the most experienced member of our crew took to freeform. He first started programming in ALGOL, so freeform is not really that much of a leap for him. In the past several weeks he has written new programs in freeform RPG, of his own free will. My other partner has a Pascal/Java background, so freeform should come “naturally” to him. I am the one for whom the idea will take some getting used to.

“Subfiles in RPGIV” and Me

Saturday, May 2nd, 2009

Subfile processing is an integral part of RPGIV data entry and data display. To be a real pro, you need to handle subfiles properly. There are any number of ways to build and maintain them. Depending upon your level of expertise, you can do really sophisticated data processing tasks with subfiles. (Yes, I know the phrase “data processing” is so 1970s.)

Dealing with them at my place of employment is an ongoing task. My two fellow programmers, though having much experience in programming, are relatively new to RPGIV. To help enhance their skills, one of them bought the excellent book by Kevin Vandever, “Subfiles in RPGIV”.

One of them was using it to help him build the first subfile program he had ever written from scratch. And wouldn’t you know, his techniques were different from mine! As it turns out, the author had used my technique when he first started programming subfiles, but had since found a better one. Oh, my.

For the uninitiated, subfiles are a way to handle the display and update of data files: they let you display multiple instances of data in (usually) columnar format, then treat each line as a separate data record. In RPGII, subfiles were not available; to get the same effect, you had to plug your file data into arrays, the fields of an individual record linked by occupying the same element number in multiple arrays.

Anyway, the author recommends a technique that is clean and efficient. He uses the DDS keyword SFLNXTCHG to let the programmer flag a record for further processing. The operating system keeps track of changes, which allows the programmer to process only those subfile records that have been changed by the operator, using the READC (Read Changed record) RPG opcode. Using SFLNXTCHG with an indicator lets the programmer mark a line of data as “changed” so that it will be picked up again the second time around. If someone entered erroneous data and the program would not accept it, the record would still be marked as “changed” and would be read again by READC on the next pass, even if the data entry person did not correct or change anything.

My approach - the one Mr. Vandever set aside - is to keep track of the number of records (entries) in the subfile. To process the subfile, I CHAIN to each subfile record by relative record number, whether it has been changed or not, and process the data as needed. I can do this as many times as desired.
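To make the contrast concrete, here is a loose Python sketch of the two processing loops. It is only my illustration of the idea - a real program works through READC or CHAIN against a display-file subfile, and the validate and apply routines here are stand-ins:

    # A loose illustration of the two subfile-processing styles. Here the
    # "subfile" is a list of dicts, and validate/apply are stand-in routines.

    def process_with_readc(subfile, validate, apply):
        """Vandever's style: touch only records the operator changed (READC)."""
        for rec in subfile:
            if rec["changed"]:               # READC returns only changed records
                if validate(rec):
                    apply(rec)
                    rec["changed"] = False   # written back: change flag resets
                # else: leave it marked changed, as SFLNXTCHG would, so the next
                # READC pass sees it again even if the operator types nothing

    def process_with_chain(subfile, validate, apply):
        """The CHAIN-by-relative-record-number style: visit every record."""
        for rrn in range(1, len(subfile) + 1):   # CHAIN to each RRN in turn
            rec = subfile[rrn - 1]
            if validate(rec):
                apply(rec)

    # Minimal demonstration: three rows, one changed but invalid.
    rows = [{"fld": "A", "changed": False},
            {"fld": "",  "changed": True},
            {"fld": "B", "changed": False}]
    ok = lambda r: r["fld"] != ""
    done = []
    process_with_readc(rows, ok, done.append)   # applies nothing: the changed row fails
    process_with_chain(rows, ok, done.append)   # applies the two valid rows
    print(len(done))                            # 2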

Why do I do it my way? Because I’m an ignorant so-and-so, I guess - or at least I was when I was learning how to program subfiles many years ago. I vaguely remember attempting to use READC to process only changed subfile records. However, I was not aware of the SFLNXTCHG keyword, and when I read a subfile record and found an error, I couldn’t see how to tell the system to process the record the next time around. Suppose the screen had five subfile records displayed; I changed two of them, and one of the two was in error. Once I fixed the one in error and the program looped through with the READC opcode again, it would not pick up the correct one from the previous pass, because it couldn’t tell it had been changed! When I saw an example of the CHAIN technique, I realized that it would always work, so I stuck with it.

So will I change to the more efficient technique in the future? Wellll... probably not. I’m a creature of habit, for one thing; and changing my technique would not gain me all that much while making the “read subfile” task more complex.

But what about all the unnecessary CHAIN operations, which on a large subfile mean many more subfile accesses than READC would require? If you have read my blog from the start, you might remember the hissy fit I had when I worked on a program where a previous programmer read a detail data file by CHAIN instead of READ and increased the processing time by a factor of 10 - from 15 minutes to 2.5 hours.

I suppose that I have been corrupted by the greased lightning bottled in the latest machines. In my file cross-reference program, I load source files into subfiles so that you can drill down into the subfile and find file characteristics, or even view the data in a file by putting the cursor on a file name and pressing a function key. When I tell the system to display source, it can put 9,999 lines of source into the subfile in less than a second. Why sweat the difference between processing one record with READC and, say, 12 with CHAIN? In a large installation with hundreds of users on a slower machine, perhaps it would be a concern - but not now. My concerns about performance are pragmatic, not ideological. As I have said before (as a matter of fact, in the previous blog post I mentioned above), I don’t sweat over nanoseconds.

Also, at least with reference to my cross-reference program, using READC is irrelevant. I do not change the subfiles in any way. They are simply a jumping-off point for further processing. I use EXFMT (which means write the format, then accept input) to display them, but only to allow the input of a right-click or a command key or a search argument to use with a command key.

A minor quibble would also be that an edit that relied on READC would catch only errors that had been introduced by the current data entry operator. There is no way to guarantee that the data in the subfile is error-free when it is first displayed, especially if the data is also updated by other programs. If you edit every line, you can check for errors in all the data, including preexisting errors.

So my partners are free to use the new techniques, but I am not likely to choose to do so in the future.

When and How to Upgrade Code

Tuesday, December 30th, 2008

I work on code that, when I came to work for my client, was essentially RPGIII/RPG/400 or older. Some of it had been implemented way back in the 1970s and was migrated forward as newer machines were installed. And the code showed it. (I have found the large number of 80- and 96-byte files in some of the systems interesting - remnants of old punch-card processing programs.) You haven’t lived until you’ve attempted to update a program that has grown over the years into a 10,000+ line, indicator-laden monster. If you’ve been away from the program awhile and need to make a non-trivial change, you have to spend at least part of a day reviewing the program’s processing.

I am normally conservative about program changes. I generally do not rip apart code and rewrite it, especially if it is not broken. But if a program is to be revised, I do generally at least convert it to RPGIV before making even a minor change. As one of my favorite literary characters (Anne of Green Gables) would say, doing this allows “more scope for imagination”.

I was making code changes for a conversion project when I came across a particular program I had seen before and shaken my head at. For some reason, this time something snapped. I just HAD to change a particular chunk of code, even though the code in fact worked. No errors. But how far should I take the change? I will explain what I did. In the end, what I did will likely be seen as good by some, too intrusive by others, and not radical enough by still others.

(more…)

But They Loved Her

Saturday, May 24th, 2008

I was once sent to a client to make some fairly modest modifications to RPG programs on their System/34. As I went about making changes to a certain report program, I concerned myself with the requested changes, not paying much attention to how the program performed its main processing. When I submitted the program changes for testing, an employee told me I should expect it to run for about 2 or 3 hours. Having glanced at the processing, and being acquainted with the 34’s processing speed and the size of the file being processed, I smiled inside. “More like 15 or 20 minutes, I think,” I said to myself.

To my surprise, it in fact did take around 2 ½ hours. I was anxious to find out why I was so wrong, and what I found disgusted me, and for the first time made me angry about the work of another programmer.

The file being processed was a detail file. That is, it was part of a file setup common in data processing where the line items in, for example, an invoice, are stored in a separate file from the general information about the invoice.

Let us suppose that the invoice “header” has a key of invoice number - the file is organized to be read in invoice number order. For our example, let us say the invoice number is 123456. In later versions of RPG, you could define file keys from fields physically separate from one another in the record; but this was RPGII on the System/34, and the key data had to be contiguous. In this case the key to the “detail” file was the invoice number followed by a 3-digit sequence number. So the detail keys for invoice 123456 could range from 123456000 through 123456999, though in practice the 000 record was not usually used (since the files were no doubt designed by business people, who would think starting to count at 0 was nonsense :) ).

Now, the customary way to read these records was to set lower limits (SETLL) using the invoice number and the lowest possible key (123456000), determining the highest possible key (123456999), then READing the file until you either reached end of file or the key of the record read was more than 123456999. Not having looked closely at the code before, I had assumed that was the way it was done. Since you would very seldom find an invoice with 999 line items, this would be the logical way to do it.

But this was not the way the programmer did it. She instead used a separate counter; for every number from 1 to 999, she moved that number into the end of the key (getting 123456001, 123456002, 123456003, and so on) and performed a random-access CHAIN to the file, using the data if the access was successful. So, if there were 4 detail records for the invoice, she would get 4 successful reads and 995 unsuccessful ones, instead of 4 successful reads and 1 unsuccessful read the normal way. The program was spending over 99% of its time accessing the disk looking for records that didn’t exist!
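In rough Python terms, the two strategies look something like this. This is my own sketch - the in-memory structures stand in for the keyed disk file, and the key layout follows the example above:

    # Sketch of the two read strategies for the detail records of invoice 123456.
    # sorted_records stands in for the keyed file read sequentially by key;
    # records_by_key stands in for the same file accessed randomly by full key.

    def read_details_setll_read(sorted_records, invoice=123456):
        """The customary way: SETLL to the lowest possible key, then READ
        forward until the key passes the highest possible key."""
        low, high = invoice * 1000, invoice * 1000 + 999
        details = []
        for key, record in sorted_records:
            if key < low:
                continue                      # SETLL would position past these
            if key > high:
                break                         # past the invoice: stop reading
            details.append(record)
        return details

    def read_details_chain_loop(records_by_key, invoice=123456):
        """The way that program did it: CHAIN to every possible sequence
        number from 1 to 999, whether or not a record exists."""
        details = []
        for seq in range(1, 1000):
            key = invoice * 1000 + seq        # 123456001, 123456002, ...
            if key in records_by_key:         # a CHAIN that found a record
                details.append(records_by_key[key])
            # every miss is a wasted disk access on the real machine
        return details

    # Demonstration with four detail records for invoice 123456:
    data = {123456001: "line 1", 123456002: "line 2",
            123456003: "line 3", 123456004: "line 4"}
    print(read_details_chain_loop(data))
    print(read_details_setll_read(sorted(data.items())))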

What made the matter worse was that this logic was replicated in quite a few of their other major programs. Now, I’m not normally a person who likes to rock the boat, but this was too much for me. I went to my supervisor there, explained the situation and asked for permission to change these programs to do it the right way. Permission was granted, and I hunted down the other programs and modified the read routine in all of them to do it properly. And yes, they did complete in 15-20 minutes, just as I thought they should.

Unfortunately, this was too little, too late. I found other areas where the programs, though giving correct results, were very poorly designed. The damage this programmer and others had done to the firm was evident in the fact that, by the time I arrived, they were in the process of moving their work to another, non-IBM minicomputer using another dialect of RPG, because it had been decided that the System/34 just didn’t cut the mustard. It was too slow.

This may be an extreme example. Normally, I don’t sweat performance too much, though even today I think programs are more bloated than they need to be. (I once compiled an RPGIV program with only 1 line: eval *inlr=*on. Its size was around 80K.) But bad programming can really have a bad effect.

I prefer that programs be clear and understandable, so that they can be readily maintained; performance can be somewhat secondary. I think reducing disk access is the key to most performance issues. I really don’t worry about whether one opcode is 50 nanoseconds faster than another. But a few trillion nanoseconds here and there, and we’re talking some real time.

But sometimes you do have to stand up for a certain standard of performance. Unfortunately, in that company, quality control came in a little too late. Perhaps it was a human relations issue. While I was there, they talked about how much they liked that programmer as a person. She must have been a real charmer. Evidently, they liked her so much that they accepted whatever excuse she gave for a run that took 2.5 hours to produce a report. Apparently, she had no intellectual curiosity about why it ran so long. That, to me, was her cardinal sin - a lack of intellectual curiosity, a lack of desire to do what she could to improve her product. Her report produced correct results, but she could have done it right; she could have done it better.

I find I’m rambling here, in search of a point. Perhaps it’s just that quality is the important thing. Accuracy and clarity in the design. The ability to discern when something just isn’t right, and knowing how to fix it. Be willing, whether you are a user or a programmer, to speak up whenever you see something that isn’t quite right, and talk to the people who can do something about it - don’t just vent to your fellow programmers or users. If you are a user, tell the programmer when you think his program isn’t working right, and show him how. If his pride gets hurt, so be it.

And, in harmony with what may be a recurring theme here, don’t worry about doing it the newest way; just do it the best way. Just do it right. That may not be easy; others may think another way of doing something is better (more right). If they are correct, then be willing to change.

And keep your mind open for change that is not the result of someone prodding you, but of you prodding yourself.