But They Loved Her

I was once sent to a client to take care of some fairly modest modifications to some RPG programs on their System/34. As I went about making changes to a certain report program, I concerned myself only with the requested changes, not paying much attention to how the program performed its main processing. When I submitted the changes for testing, an employee told me to expect the program to run for about 2 or 3 hours. Having looked only lightly at the processing, and being acquainted with the 34’s processing speed and the size of the file being processed, I smiled inside. “More like 15 or 20 minutes, I think,” I said to myself.

To my surprise, it did in fact take around 2½ hours. I was anxious to find out why I had been so wrong, and what I found disgusted me and, for the first time, made me angry about the work of another programmer.

The file being processed was a detail file. That is, it was part of a file setup common in data processing where the line items in, for example, an invoice, are stored in a separate file from the general information about the invoice.

Let us suppose that the invoice “header” has a key of invoice number – the file is organized to be read in invoice number order. For our example, let us suppose the invoice number is 123456. In later versions of RPG, you could define file keys from fields physically separate from one another in the file; but this was RPGII on the System/34, and the key data had to be all together. In this case the key to the “detail” file was the invoice number followed by a 3-digit sequence number. So the detail keys for invoice 123456 could range from 123456000 through 123456999, though in practice the 000 record was not usually used (since the files were no doubt designed by business people, who would think starting to count with 0 was nonsense :) ).

Now, the customary way to read these records was to set the lower limit (SETLL) to the lowest possible key for the invoice (123456000), determine the highest possible key (123456999), then READ the file sequentially until you either reached end of file or the key of the record just read was greater than 123456999. Not having looked closely at the code before, I had assumed that was the way it was done. Since you would very seldom find an invoice with 999 line items, this would be the logical way to do it.
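
To make that concrete, here is a rough sketch of the pattern in modern free-form RPGIV rather than the fixed-form RPGII of the day. The file name DETAIL and the fields DTLKEY and INVNO are made up for illustration; the real programs would have differed in the details.

**free
// A sketch only: read all the line items for one invoice with SETLL
// followed by sequential READs. Assumes an externally described file
// DETAIL keyed on a 9-digit key (6-digit invoice number followed by a
// 3-digit sequence number), with DTLKEY as the key field. All of these
// names are hypothetical.
dcl-f detail keyed usage(*input);

dcl-s invNo zoned(6: 0) inz(123456);
dcl-s loKey zoned(9: 0);
dcl-s hiKey zoned(9: 0);

loKey = invNo * 1000;        // 123456000, lowest possible detail key
hiKey = loKey + 999;         // 123456999, highest possible detail key

setll loKey detail;          // position at the first key >= 123456000
read detail;
dow not %eof(detail) and dtlKey <= hiKey;
   // ... process this line item ...
   read detail;              // at most one "wasted" read past the range
enddo;

*inlr = *on;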

But this was not the way the programmer did it. She instead ran a counter from 1 to 999; for each value, she moved that number into the end of the key (getting 123456001, 123456002, 123456003, and so on) and performed a random-access CHAIN to the file, using the data only if the access was successful. So, if there were 4 detail records for the invoice, she would have 4 successful reads and 995 unsuccessful reads, instead of 4 successful reads and 1 unsuccessful read the normal way. The program was spending over 99% of its time accessing the disk, looking for records that didn’t exist!
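
In rough free-form RPGIV terms (again with made-up names, and using arithmetic where her fixed-form code would have moved the counter into the low-order positions of the key), the loop amounted to something like this:

**free
// A sketch of the pattern I actually found: one random CHAIN per
// possible sequence number, hit or miss. Same hypothetical names
// as in the sketch above.
dcl-f detail keyed usage(*input);

dcl-s invNo   zoned(6: 0) inz(123456);
dcl-s seq     zoned(3: 0);
dcl-s srchKey zoned(9: 0);

for seq = 1 to 999;
   srchKey = invNo * 1000 + seq;   // 123456001, 123456002, ... 123456999
   chain srchKey detail;           // random read by full key
   if %found(detail);
      // ... process this line item ...
   endif;                          // a miss still costs a disk access
endfor;

*inlr = *on;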

What made the matter worse was that this logic was replicated in quite a few of their other major programs. Now, I’m not normally a person who likes to rock the boat, but this was too much for me. I went to my supervisor there, explained the situation and asked for permission to change these programs to do it the right way. Permission was granted, and I hunted down the other programs and modified the read routine in all of them to do it properly. And yes, they did complete in 15-20 minutes, just as I thought they should.

Unfortunately, this was too little, too late. I found other areas where the programs, though giving correct results, were very poorly designed. The damage this programmer and others had done to the firm was evident in the fact that, by the time I arrived, they were already in the process of moving their processing to another, non-IBM minicomputer using another dialect of RPG, because it had been deemed that the System/34 just didn’t cut the mustard. It was too slow.

This may be an extreme example. Normally, I don’t sweat performance too much, though even today I think programs are more bloated than they need to be. (I once compiled an RPGIV program with only one line: eval *inlr=*on. Its size was around 80K.) But bad programming can have a very real effect.

I prefer that programs be clear and understandable, so that they can be more readily maintained; performance can be somewhat secondary. I think reducing disk access is the key to most performance issues. I really don’t worry about whether one opcode is 50 nanoseconds faster than another. But a few trillion nanoseconds here and there, and pretty soon we’re talking some real time.

But sometimes you do have to stand up for a certain standard of performance. Unfortunately, in that company, quality control came in a little too late. Perhaps it was a human relations issue. While I was there, people talked about how much they liked that programmer as a person. She must have been a real charmer. Evidently, they liked her so much that they accepted whatever excuse she gave for a run that took 2½ hours to produce a report. Apparently, she had no curiosity about why it ran so long. That, to me, was her cardinal sin: a lack of intellectual curiosity, a lack of desire to do what she could to improve her product. Her report produced correct results, but she could have done it right; she could have done it better.

I find I’m rambling here, in search of a point. Perhaps it’s just that quality is the important thing. Accuracy and clarity in the design. The ability to discern when something just isn’t right, and the knowledge of how to fix it. Be willing, whether you are a user or a programmer, to speak up whenever you see something that isn’t quite right, and talk to the people who can do something about it; don’t just vent to your fellow programmers or users. If you’re a user, tell the programmer when you think his program isn’t working right, and show him how it’s going wrong. If his pride gets hurt, so be it.

And, in harmony with what may be a recurring theme here, don’t worry about doing it the newest way; just do it the best way. Just do it right. That may not be easy; others may think another way of doing something is better (more right). If they are correct, then be willing to change.

And keep your mind open for change that is not the result of someone prodding you, but of you prodding yourself.

3 Responses to “But They Loved Her”

  1. Alan Hedblad Says:

    Fine article. A great example of the importance of good programming. You have managed to make the science of programming interesting to a man who uses his PC as little more than a word processor!

  2. RPG and Programming » Blog Archive » “Subfiles in RPGIV” and Me Says:

    [...] than necessary? If you have read my blog from the start, you might remember when I told about a hissy fit I had when I worked on a program where a previous programmer read a detail data file by CHAIN [...]

  3. Larry Holder Says:

    Probably the bad programming was initially hidden by a small set of test data, which is why, whenever possible, I test on a full production set, real or cloned to a test system. No reduction, no fudging. I remember once in BASIC having to rewrite to use a Shell-Metzner sort, as the classic bubble sort would have taken forever to complete on the full set of production data.
