Archive for the ‘RPG’ Category

To Amaze the Whole Room

Wednesday, July 4th, 2012

In my favorite novel, “Pride and Prejudice”, Elizabeth Bennet says archly to Darcy, “I have always seen a great similarity in the turn of our minds. We are each of an unsocial, taciturn disposition, unwilling to speak, unless we expect to say something that will amaze the whole room, and be handed down to posterity with all the eclat of a proverb.”

Perhaps a similar reason is why I have not been as faithful to this blog as I ought to be. I feel like I need to say something IMPORTANT. And INTERESTING. Sometimes no topic seems important enough to grip my interest; then I get home, go through all the other stuff a head of the house does after work, eat dinner, and rest in front of the TV for a short time. (Hmm… I’m beginning to sense a pattern here.) A little topic may come to mind at work, but by the time I get home, it’s gone. Even if I write it down, I then forget to look at the piece of paper.

The day-to-day life of a programmer is not really that interesting, unless he is absorbed in a project or problem.

The life of a maintenance programmer can be tedious when his mind is not fully occupied. I have a number of projects in progress, but nearly all of them are at a stopping point. Not done, but at a point where my next step waits on the actions of someone else, usually a user.

There are no issues of cosmic importance to be resolved. I am writing new code in freeform RPG. I am not messing with the structure of old programs just to improve them. I convert them to RPGIV when I make modifications to them, but usually I do not attempt to get them all the way to freeform; with all the indicators dancing around in them, freeform usually ends up being an even worse mess than what I start with.

There are old fashioned date routines in many of the programs. I do not touch them unless the modifications have to do directly with date calculation. The basic calculations can be easily replaced with my date functions based on IBM APIs; but they don’t always plug in neatly, which means intensive testing that I don’t want to do when I only want to change a heading or tweak one or two lines of code.

There are two RPG programmers in our shop. We both code in freeform, so there are no areas of conflict there that would make life in our shop exciting.

Somehow, the areas of contention with other programmers (in-house or online), like control break processing, seem rather dull. I do what I want, and no one is around to argue with me. (My wife gets after me because it sometimes seems like I like to argue…)

Of course, as I press on into my mid-60s, I suppose that decreased conflict is probably not a bad thing. Anything can happen, of course, but I see little likelihood of my job situation changing before I retire in another 10 or 15 years. :-)

But maybe if I picked up on the little ideas that pop up during the day and ran with them at night, I could be more productive. Maybe if I had a better way of holding onto them. Maybe if I could get a little spiral notebook… or a netbook… or a tablet…

The Human Factor

Saturday, December 3rd, 2011

Still another example of the human factor being one of the greatest hindrances in solving computer problems: the willingness to see what one expects to see, and to make assumptions.

I was writing a report that required me to accumulate data into arrays in the program, which, associated together in a data structure, could then be sorted appropriately with a SORTA statement. Below this, I had defined a separate, somewhat unrelated data structure.

In the array data structure, properly defined by the way, was an account number, followed by a short name. When I printed the report, to my surprise I found the account number slopping over into the short name and practically missing from the account number field.

I have WDSC, the PC program that color-codes fields and opcodes and creates outlines of the code, including a cross-reference of the fields in the program. The cross-reference indicated only one spot where the account number and name fields would be updated. The update was airtight, as far as I could see when I ran the program in debug. But at the point just before the second elements of the arrays were to be updated, the first elements had already changed!

I then looked a little more carefully at the “watch” facility of the debugger, which lets you tell the debugger to notify you when the value of a specific field changes. After the first element of the array was filled, I told it to watch for when the first element of the account number array changed. I then told the program to continue. Then the cursor stopped: at my input file definition!

“Now, what’s going on?” I said to myself. The file was externally defined, but one field from the file was used in the data structure below the data structure defining the arrays. I looked at that data structure, and everything seemed to be in order. I checked other things, and I came back to that data structure and puzzled over it. And then I saw. The line (with the DS in it) that forms the initial definition of the data structure had an asterisk in column 7; somewhere along the way, I had commented the line out and neglected to remove the asterisk when I was done.

With that asterisk on the definition line, the fields that followed became associated with the array data structure above. The file field thus overlaid the first 11 bytes of the array data structure, which included the account number and name. The field was read in, and the arrays changed.
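The mechanics of that overlay are easy to reproduce outside RPG. Here is a Python sketch (the field contents and layout are invented for illustration) of fixed-layout storage in which an orphaned field ends up sharing the bytes of the structure above it:

```python
# Python sketch of RPG-style fixed-layout storage: fields are just slices
# over one shared buffer. When a data structure loses its DS line, its
# fields attach to the previous structure's storage.

buf = bytearray(b" " * 16)          # storage for the array data structure

# Array DS layout, element 1: account number (7 bytes) + short name (4 bytes)
buf[0:7] = b"0012345"
buf[7:11] = b"ACME"

# The orphaned input field, accidentally overlaying bytes 0-10:
buf[0:11] = b"SOMETHINGXX"          # a record read fills this "field"

print(bytes(buf[0:7]))    # account number is now garbage: b'SOMETHI'
print(bytes(buf[7:11]))   # short name clobbered too: b'NGXX'
```

One record read through the overlaid field, and both "properly defined" fields change, exactly as the debugger's watch revealed.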

Even with the color coding of the editor, I completely missed the asterisk. I had seen what I expected to see: the DS in the appropriate column. I missed the asterisk. Now, one could say that this is an argument for using the // freeform notation for comments and devising a new, freeform, keyword-style syntax for field and data structure definitions. But there is more to it.

I would submit that, no matter what precautions we take, the human factor will come into play. At some point, no matter what notation we use, the problem being handled will become sufficiently complex that the human writing the program will start making assumptions so he can handle the volume of concepts and data coming in, and he will see the concepts in his program that he expects to see, even if they are not there.

Sometimes it’s best just to take a break.

The Real World

Friday, May 27th, 2011

I know it’s been quite a while since I’ve made an entry here, but sometimes real life intrudes on the fantasy world of the Internet, so I just became too distracted by the real world. Either that, or I’ve just gotten lazy.

I think of things to put in here every day while I’m at work. Unfortunately, I’m at work when I get these ideas; I don’t make a note of them, or I get too tired when I get home, and the idea evaporates.

One idea that has managed to sustain itself until this evening was an observation on the state of programming in the real world – that is, outside my insulated IBM-i world. Of necessity, even my client must run outside programs and communicate with the outside world, and that is when the sadly amusing state of the computing world presents itself.

I listen as others talk about getting new systems working on the new Windows 7 boxes. I listen to them go through the song and dance of installing this driver or that, and how difficult it becomes to keep existing programs running when you get new hardware, or install new software, or both. After a while, you notice that it’s a new rendition of an old song, with new singers and a new orchestra.

In the IBM-i world, of course, this stuff is nonsense. We are currently running programs that were written back in the 70s and 80s. With this IBM minicomputer (as it used to be called) family, it made no difference. The programs might need to be recompiled when you changed machines, but you didn’t need to create a new version of them; you only did that if you wanted to. No wasted effort. Nothing about the programs had to be changed if the operating system changed versions. But in the Windows world (and, I believe, in the Linux world too), there are no such guarantees. This always puzzled me when I looked out at the Windows world.

And it has touched my world. I played a little bit with VisualAge RPG, and one of my coworkers ran with the idea and wrote an entire system, albeit a relatively small one, using VARPG to talk to our AS/400. (I know, I know. It’s not what it’s called now, but it’s the name most people in the industry identify it with.) We thought it was a very slick piece of work, even though we were aware that IBM wasn’t exactly pushing it. All we knew was that it was a very easy way to leverage our RPG skills into the Windows world. The users loved it.

But now, as time goes on and we get Windows 7 machines in, we have to make sure that if the system goes on a Windows 7 machine, it is set up to run in XP-compatibility mode. But how long will that last? How long will it be before another “advancement” in Windows takes that option away from us? How long before it is no longer “cost effective” for Microsoft to maintain an environment in which our programs can run? Then we either have to rewrite them in another language, or we encase them in a virtual machine where it no longer matters what the outside operating system is. I suppose I should be thankful that we at least have that option.

And you can tell that the mindset of the computing world in general is that somehow this is a good thing. Throwaway software. They worry about 32-bit and 64-bit software and the incompatibilities inherent in them. But it’s “progress”. I call it wasteful – but who am I? Just a grunt programmer in an increasingly less popular computing ecosystem.

Sorry, but I don’t have any solutions. I hope somebody does.

The Last GOTO

Sunday, February 6th, 2011

I have been reading and thinking about the future of my favored programming language and the computer platform upon which I use it. The latter has been the subject of an extended discussion on LinkedIn. Some feel that the naming of the machine is of little consequence, unless you use the wrong one. It is felt that to call it the AS/400 is inaccurate, since the AS/400 is not called that anymore, which is, of course, true. It is felt that RPG will not survive for an extended period of time because of its intimate connection, via DDS (Data Description Specifications), with the 24 x 80 character green-screen terminal. Others feel that newer innovations may allow it to survive, but only if it dispenses as quickly as possible with the fixed-format (“punch-card”) calculation specifications and moves quickly to free format. There are not many defenders of fixed-format RPG around, and those that use it are probably expected to turn in their coding sheets soon and retire. And, as always, the proponents of each view have their own facts and anecdotes to support it.

I could not hope to make much of an impression on that forum, or to accomplish anything more than stirring up the pot of contention. It is much better to make a one-sided argument here. :-) Anyone who cares to argue may feel free to do so.

I Saw the Light

Sunday, November 21st, 2010

I have changed my mind again on this matter of using the RPGIII style of calling external programs instead of subprocedures, modules, service programs and all that sort of stuff. I am beginning to “see the light” about updating my coding style.

I don’t know why, but just out of the blue I started to mess around with the service program concept. I must have 15 or 20 variations on date conversion routines: GTOJ for MMDDYY to Julian, ISOTODOW for inputting an ISO date (CCYYMMDD) and outputting the day of the week, and ISOTOVBG for inputting ISO and outputting a spelled-out date (in the format September 30, 2010) are just a few of them. Somehow I guess I felt that there was a natural grouping evident there that ought to be respected.
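As an illustration of what a routine like GTOJ computes, here is a sketch of the MMDDYY-to-Julian (YYDDD) logic in Python. This is only a sketch, not the actual RPG service program, and for simplicity it assumes a fixed 1940-2039 window for the 2-digit year rather than a sliding one:

```python
from datetime import date

def gtoj(mmddyy: int) -> int:
    """Convert a 6-digit MMDDYY Gregorian date to a 5-digit Julian (YYDDD).
    Assumes a fixed 1940-2039 window for the 2-digit year."""
    mm = mmddyy // 10000
    dd = (mmddyy // 100) % 100
    yy = mmddyy % 100
    year = 1900 + yy if yy >= 40 else 2000 + yy
    doy = date(year, mm, dd).timetuple().tm_yday  # day of year, 1-366
    return yy * 1000 + doy

print(gtoj(123110))  # December 31, 2010 -> 10365
```

The other routines in the group (ISOTODOW, ISOTOVBG, and the rest) differ only in their input and output formats, which is exactly the natural grouping that suggests a service program.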

So anyway, I began exploring service programs a bit more. I think my main objection to modules was the fact that compilation would be a three-step process: compiling the two modules separately with CRTRPGMOD, then using CRTPGM to create the final program. I considered this to be an incredible nuisance. But I pressed on and put all my date modules into one service program. I then discovered that there was a way in the header specifications to use a “binding directory” to join the main program to modules. I found out how to put service programs into a binding directory, then reference the modules from there.

The final step was freeing myself from a misconception about the use of service programs. The article I was using for reference only referred to the use of CRTPGM. I found, though, that once the binding directory reference was inserted into the H spec, I could use CRTBNDRPG, just like I always do, so it could be a one-step compilation again. This was important beyond simple ease of use, because a number of other programs I had written, including my file/program cross-reference system, provided for recompilation of programs when requested. Before, the few programs that had external subprocedures always bombed on the CRTBNDRPG compile, and I could think of no easy way to create the proper CRTRPGMOD/CRTPGM combinations automatically. That is partly why I had retained the old CALL/PARM syntax, so I would not have to deal with that issue. Now, my problem has been solved.

So maybe there is hope for me yet.

Time Marches On

Monday, August 23rd, 2010

It has been a long time since my last entry. Much has happened since then. I made two 2,100-mile round trips by car back to Michigan, both of them in connection with my mother, who passed away in June at the age of 98. Other personal situations have cropped up too.

At my place of work, I continue attempting to gradually upgrade the system. In addition to my assigned projects and troubleshooting, I am attempting to use more modern techniques; but I find it interesting to look back and see what things I have tried and not tried to do.

One thing that comes to mind is that I am no longer determined to find a way to gradually move code to free format. If I write new code, it will probably be in RPG-free; but I am no longer attempting to translate fixed format to free. The job is just too daunting: so many of the old programs are hideously loaded with indicators, and a direct conversion using Linoma’s RPG Toolbox creates code that would, especially to a neophyte, be even more confusing than its fixed-format parent. With proper use of the Toolbox we can eliminate left-hand indicators while leaving the code in fixed format, and the tool does a very nice job of cleaning things up.

In another context, I am making changes that effectively advance beyond RPGIV as presently constituted. Using the IBM routines CEESCEN, CEEDAYS, and CEEDATE, I am trying to free the system from its self-imposed reliance upon a century that runs from 1961-2060 for 2-digit years, as well as the ILE RPG range of 1940-2039. The boldest move, of course, would be to change all date references in the data files to at least an 8-digit date, if not actual date formats; but with ancient code from multiple systems, converting hundreds of files and programs would take a prohibitive amount of time. What I have done instead is create a number of programs based upon the IBM routines above to do date conversions quickly. In doing this, I am implementing a sliding century, such that two-digit dates always fall within a 100-year window in which the current year is the 40th year. Currently, this means that my floating century is 1971-2070. In ten years, it will be 1981-2080.
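The sliding-century arithmetic can be sketched outside RPG. Here is a hedged Python illustration of the windowing logic (the function names are mine, not IBM's; the real work on the machine is done by CEESCEN and friends):

```python
from datetime import date
from typing import Optional, Tuple

def century_window(today: Optional[date] = None) -> Tuple[int, int]:
    """Sliding century: the current year is treated as the 40th year of
    the window, so the window starts 39 years back and spans 100 years."""
    year = (today or date.today()).year
    start = year - 39
    return start, start + 99

def expand_2digit_year(yy: int, today: Optional[date] = None) -> int:
    """Expand a 2-digit year to 4 digits using the sliding window."""
    start, _end = century_window(today)
    candidate = (start // 100) * 100 + yy   # same century as window start
    return candidate if candidate >= start else candidate + 100

# With "today" in 2010, the window is 1971-2070, as described above:
print(century_window(date(2010, 6, 1)))          # (1971, 2070)
print(expand_2digit_year(40, date(2010, 6, 1)))  # 2040
print(expand_2digit_year(75, date(2010, 6, 1)))  # 1975
```

Because the window is computed from the current date rather than hard-coded, it drifts forward automatically: run the same code in 2020 and the window becomes 1981-2080 with no source change.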

A typical call, to convert Gregorian (MMDDYY) to Julian, would be:

CALL 'GTOJ'
PARM  ING  6 0
PARM  JUL  5 0

You may note that I am using the RPGIII/RPG400 style of calling the program rather than the one-line form, CALLP GTOJ(ING:JUL), which requires the creation of appropriate prototypes. The program GTOJ, for instance, consists of a module DCONV, which sets up and runs the IBM routines, and a module GTOJ, which sets up the input and output for DCONV. They are then combined into program GTOJ. All the routines ultimately use DCONV.

However, I could not see the point of the extensive housekeeping involved in modularizing the programs that call GTOJ and the others. You have to create prototypes for each of the programs (about 15 of them, at last count). Of course, I could put them all in one service program, which to my mind creates another layer of unnecessary complexity. The chief justifications seem to be performance and the ability to catch inconsistencies between the parameters of the called and calling programs. To me, these are not sufficient reasons. In this installation, performance will never be an issue. Trust me. Our new machine (stupid word processor! I can’t put our favorite machine’s name here – OpenOffice keeps trying to capitalize it) runs at least 10 times as fast as the one it replaced. As for parameter checking, any errors last until the first test is run, after which they are corrected.

Another thing I have cooled on is trying to get away from MOVE, etc., in existing code. New code that I write contains very few MOVEs, except for enabling date arithmetic. I just don’t see the point in changing existing code to remove MOVE unless by doing so I can make the code clearer. IBM has not implemented MOVE in freeform, so I don’t see a meaningful amount of existing RPG code being moved to freeform. This means that any RPG neophyte is going to have to learn to deal with both freeform and fixed RPG formats in his maintenance programming.

So time marches on. We continue to watch the progress of IBM with its initiatives regarding RPG, and we wait to see what impact it will have on our day-to-day work. In a small installation like ours, I don’t think we will ever be cutting edge.

Bugged by the Debugger- and Larger Issues

Monday, May 31st, 2010

Sometimes the frustration you get from programming stems from a lack of knowledge. You think you have a mastery of a particular programming task, but the lack of knowledge comes back to bite you.

Such was the case a few weeks ago. I was debugging a program, and I wanted to check the results of a basic calculation. I had changed the processing to no longer use a particular field. This field was defined in the input specifications (it was not externally defined). As I stepped through the calculations, I wanted to check the field, so on the command line I entered EVAL FIELD1. The result? It was blank.

I said to myself, “This is ridiculous; it can’t be!” It was part of the key to the file, and the file had no blank keys. I checked the value of KEY1, which included FIELD1 (that is, it included the data in the record encompassed by FIELD1). It looked normal. This was exasperating. I know that fields in input specifications are stored in different areas of memory, even if they describe the same spot in the data record. But the field was blank. There were other fields that encompassed the same record space; some were blank, some weren’t.

I was vaguely aware that there was some sort of keyword in the H specifications you could use to eliminate unreferenced fields. But that would mean I shouldn’t be able to debug it at all; it should give an error if I tried to EVAL it. And I hadn’t used that keyword. I had one of my fellow programmers look over my shoulder as I ran the program in debug again. He didn’t have any idea what was going on either.

After all of this, we brainstormed a bit. I had no reason to suspect it before, but we thought: what if it is not showing up in debug because it is not being used? We could find nothing in the IBM RPG language documentation saying that anything like that is done, but I inserted a dummy calculation: EVAL FIELD1=FIELD1.

Voila! The field now showed up, filled with data, in debug.

The most disagreeable part of it all was that I couldn’t find this behavior described anywhere in the documentation. I spent a little time Googling the situation – and sure enough, others described this behavior of the debugger. Apparently it is some kind of optimization: if the program never uses the field, its value is never filled in for the debugger.

My reaction was, “Why bother?” Why assume that, just because another line of code doesn’t reference it, I’m not interested in it? (This is an old program, by the way.) If the field is part of the file’s definition, it should be assumed that I’m interested in its value. I haven’t been able to find a document that describes this behavior of the debugger; the documentation may say something about it – if there is documentation. I couldn’t find anything in the help text for the debugger that described my situation. And I would think that a debugger, which deliberately lets you sit at a line of code as long as you want, would be the last and least thing that needed optimizing.

But all of this speaks to a larger issue. Is this system getting too big? Is there anyone around who actually knows everything about the behavior of the RPGIV compiler and its supporting programs, like the debugger? I don’t mean someone who knows where to look for the information (assuming, of course, that it’s even all written down), but someone who actually knows all this stuff, right down into all of the nooks and crannies, like this thing with the debugger, where some programmer or low-level committee apparently decided, arbitrarily, that we didn’t need to see the value of an unreferenced field. If such a person exists, hide him from the world for six months and make him update the manuals so they include everything IN A COMPREHENSIBLE AND COMPREHENSIVE FORMAT.

I don’t pretend to know the answers to these questions. I don’t pretend to know the solutions to these problems. Maybe I’m just stupid. Maybe I just don’t know where to look. But I have for some time labored under the delusion that I have some measurable level of intelligence and that it shouldn’t be this hard to figure these things out.

This is a rant without a suggested solution. I can’t write my own compiler or supporting documentation. But it just strikes me that when Niklaus Wirth and a colleague can write a non-trivial graphical operating system (Oberon) in just over 12,000 lines of code in under 200K of space, other people just as smart could make these systems we work with (possibly) smaller and (certainly) more comprehensible. Maybe it’s too late. I hope not.

A Fantasy

Thursday, April 15th, 2010

In an alternative universe:
IBM Toronto
Press Release
April 1, 2010
RPG, from its very inception, has been designed to be an easy to use programming language. Since business data is stored in files, RPG was designed to make reading and writing files easy. To that end, its creators made sure that the processing of files was made easy by its READ and READE and READP operation codes, using a Pascal-like DO loop. Doing subtotals has been accomplished by the programmer storing intermediate data in temporary user-defined fields, then comparing group identifiers with the new identifiers and outputting the results before proceeding. Proper cascading logic enables the user to track multiple group changes that may need to be done at the same time. CHAIN and READ make it possible to easily link related files.
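The hand-rolled subtotal pattern the "release" describes (hold the previous group key, compare it against each new record, and output the subtotal when it changes) can be sketched in any language. A single-level Python illustration, with invented data:

```python
# Classic control-break processing: records must arrive sorted by group key.
records = [("A", 10), ("A", 5), ("B", 7), ("B", 3), ("C", 1)]

def control_break(rows):
    """Accumulate a subtotal per group; emit it each time the key changes."""
    prev_key, subtotal, out = None, 0, []
    for key, amount in rows:
        if prev_key is not None and key != prev_key:
            out.append((prev_key, subtotal))  # group changed: output subtotal
            subtotal = 0
        subtotal += amount
        prev_key = key
    if prev_key is not None:
        out.append((prev_key, subtotal))      # don't forget the final group
    return out

print(control_break(records))  # [('A', 15), ('B', 10), ('C', 1)]
```

Multi-level breaks (the "cascading logic" above) repeat this comparison for each key component, outputting the innermost totals first.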

Who’s the Best Programmer Around? Not Me

Tuesday, April 13th, 2010

My recent experience on the forum on Bob Cozzi’s RPGIV website underlined the fact that I am not in contention for the title of “World’s Best RPG Programmer”. I posed a question that involved the use of APIs and prototypes, and I submitted my sample code. I got my question answered, but not before it was made very clear to me that my abilities as to APIs and prototyping were considerably below cutting edge.

How do you get to be a good programmer? You have to be intelligent, which can mean you have to know when to be as stupid as the computer. You have to be intolerant of errors. And you have to be willing to learn from your mistakes, as you will make thousands of them.

But beyond these basic qualities and attitudes, other circumstances may determine how deeply you will get into the more arcane aspects of your chosen programming language that will allow you to be among the “best”.

For one thing, your circumstances have to be such that you are exposed to advanced programming techniques. This will likely also be a function of your intellectual curiosity. But perhaps just as importantly, your circumstances have to be such that you will actually have a need for these techniques. You may never be in a position where you will need to access user spaces. While you may see some benefit to variable length fields, you may not see any burning need to start using them. And while much of the benefit of ILE is built around things like APIs and prototypes, it may well occur that your site simply does not need a wholesale conversion of code to make use of called procedures.

As it happened, my experience above came as a result of a need for them. Two-digit years are still used on my system. In calculating the maturity date of a 30-year loan written in 2010, I bumped up against the ILE default window for two-digit date fields (1940 to 2039) and got an “invalid date” error when I attempted to generate 3/15/40 to represent 2040. To get around this, I decided to use an API I was aware of, CEESCEN, which lets you shift the 100-year window used to interpret two-digit years. It is used in association with the APIs CEEDAYS and CEEDATE, which let you format dates in numerous different ways. (Google the API names for details.)

However, I was not familiar with how to set up the prototypes needed, and as a result I made some stupid mistakes that leaped out at the knowledgeable participants in the forum, especially Bob. But anyway, with some help, I got the prototypes and program calls to work.

But my learning of things like these comes in response to a specific need. I don’t have a burning need to learn all, or even a substantial portion, of the APIs available. I simply don’t need them. The fact is that I am getting paid to write production code, not specifically to “learn new things”. And the fact is also that I am not in a position to spend a lot of time outside the workplace learning new stuff, since family and other personal needs and circumstances have first claim on my time.

Other people, by reason of education, career choices, and the employers they happened to have, may have had opportunities early in their careers to work on truly advanced concepts that I never was exposed to and likely never will be. Of course, their being smarter than me, not to mention better educated early on, would also make this more likely. :-) My task being to maintain and upgrade code that is 1980s and 1990s in style, if not always in vintage, I will likely be kept busy doing that for the balance of my career. I like to think I’m good, but I’ll never be the best.

Adventure in Modernization

Thursday, January 21st, 2010

But sometimes, to old codgers like me, it’s difficult to know just how far to go. In previous posts I have written about my vaguely negative feelings about freeform RPG, and I have written very clearly about how I feel about IBM’s unwillingness to implement the MOVE instruction in freeform RPG, pointing out how this can only hinder conversion of old code and diminish acceptance of the new RPG dialect.

Case in point: a program I was working on today. I was writing code to format a six-digit account number based upon the rightmost 6 digits of an 11-digit number. It was based upon some old code (the usual situation where I work), but the old code was hideous. The old program consisted of about 15 lines of MOVE and MOVEL statements. I said, this has got to go.

So I contemplated the best way to do it. I could set up a data structure and put pieces of the account number into it, using EVALs or MOVEs, with dashes embedded as needed. But that didn’t seem quite elegant enough. I have been working harder to modernize my own code, so I finally broke it down to these two possibilities, as illustrated in this test program. The result I am aiming for is the number formatted as 01-234-5.


     H DFTACTGRP(*NO)   ACTGRP(*CALLER)
     D BACTNO          S             11  0 INZ(99999012345)
     D ACC6            S              8    INZ(*BLANKS)
     D NUM6            S              6  0 INZ(0)
     D ACCW            C                   '0  -   - '
     C/FREE
       EVALR ACC6 = %EDITW(%DEC(%SUBST(%EDITC(BACTNO:'X'):6:6):6:0):ACCW);
      /END-FREE
     C                   MOVE      BACTNO        NUM6
     C                   EVALR     ACC6 = %EDITW(NUM6:ACCW)
     C                   EVAL      *INLR = *ON

One way mixes the old and the new, with an old-fashioned MOVE to the smaller field, followed by a new-fangled %EDITW BIF using a predefined edit word. The other goes full-bore new age, with one grand set of embedded functions. To get the 11 digit number in string form so I can substring it, I use %EDITC with an X edit code, which does the conversion. Next, I %SUBST (substring) the last six characters. I then use %DEC to convert those six characters back to numeric. Finally, I apply the %EDITW function to format those six digits as desired. (I would be interested in finding out about a shorter way to do it.)
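For comparison, the same pipeline reads naturally in a language with string slicing. This is only a sketch (the function name is mine), mirroring the %EDITC / %SUBST / %DEC / %EDITW steps:

```python
def format_account(bactno: int) -> str:
    """Format the rightmost 6 digits of an 11-digit number as NN-NNN-N,
    mirroring the nested BIF pipeline: number -> text -> substring -> edit."""
    digits = f"{bactno:011d}"   # like %EDITC(:'X'): number to fixed-width text
    last6 = digits[-6:]         # like %SUBST: the rightmost six characters
    # like %EDITW with the edit word '0  -   - ': insert the dashes
    return f"{last6[0:2]}-{last6[2:5]}-{last6[5]}"

print(format_account(99999012345))  # 01-234-5
```

The slicing version needs no round-trip back to numeric, which is part of what makes the RPG one-liner feel heavier than the job it does.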

But I have a problem with it. In an earlier post, I pointed out that the ability to create long, complex expressions in freeform is not necessarily a virtue. The mere fact that I felt the need to explain it here indicates that I am not comfortable with it. The code is short, but I do not feel that it is clear. On the other hand, while the two-line version using MOVE is short and sweet, and uses a BIF, it would force me to get out of freeform to use the MOVE; stylistically, that also seems wanting.

Since the thrust of my thinking is in trying to modernize the code so future generations of converted C programmers won’t be freaked out by the C in column 6, I am leaning toward the one-line version. But I don’t like it. It’s ugly.