Guru: SQL Can Read Program-described Data
September 21, 2020 Ted Holt
I thought I left program-described database files behind me in 1988. That’s when I left my last S/36 shop to begin working on the S/38. Well, I did, but not completely. From time to time I work on a system with program-described files, and even some externally described files have program-described fields. Fortunately — and I owe this to Scott Forstie — I have learned that SQL can read program-described data. Will wonders never cease?
You may be thinking, “This article doesn’t apply to me. Our files are externally described.” You may be right. Then again, you may be wrong. I have found plenty of program-described data in externally described files.
I can think of three places where program-described data is commonly found.
- Program-described data is found in program-described files. These files usually date back to the S/36 and its predecessors. I have no idea what percentage of IBM i shops still run S/36 applications, but I know they’re out there.
- Program-described data is found in control and configuration files. It is very common for shops to have externally-described files that contain multiple types of data. One True Lookup Tables also fit into this category.
- Program-described data is found in fields that are used to store data for which they were not designed. For example, a business might license an ERP system that has a 45-byte character field for a second item description. However, the business doesn’t use the second item description, but they do need to store a few data values for which there are no fields, so they plop them into the second item description and define them in their RPG programs with a data structure.
The SQL function that allows you to read embedded data is INTERPRET, and it has the following format:
INTERPRET ( value AS data-type )
The value argument is a binary value. In SQL terms, the column must be defined as BINARY, VARBINARY, or CHAR FOR BIT DATA. I’ve never used those data types when creating a table, but that’s not a problem, as I can cast character values to binary.
The data-type argument tells SQL how the data is actually stored, that is, its data type and size.
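To see INTERPRET in isolation, here is a minimal sketch of my own (it is not from the article's examples): the two bytes X'124C' are packed decimal for a positive 124.

-- A minimal sketch, not from the original examples: X'124C' is two
-- bytes of packed-decimal data (digits 1, 2, 4 and a positive sign
-- nibble). Cast to binary and interpreted as DEC(3,0), it returns 124.
values interpret(binary(x'124C') as dec(3,0))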
I’ve put together a couple of examples for you. If you have program-described data, I hope they help.
Example 1: A Program-described File
Let’s begin with the first case. I have a file that stores the locations of inventory items. Each record has the following information:
| From | To | Field | Type   | Size |
|------|----|-------|--------|------|
| 1    | 10 | Item  | Char   | 10   |
| 11   | 13 | Aisle | Zoned  | 3,0  |
| 14   | 16 | Bay   | Zoned  | 3,0  |
| 17   | 17 | Level | Char   | 1    |
| 18   | 21 | QOH   | Packed | 7,3  |
This file tells us how much of an item (QOH, or quantity on hand) is stored in a warehouse location (aisle, bay, and level). Notice the packed and zoned values.
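If you want to try this without a real program-described file, a quick stand-in works. The following is only my sketch, built on the assumption that the file presents itself to SQL as a single character column named ITEMLOC holding the raw 21-byte record.

-- A hypothetical stand-in, not from the article: one column holding
-- the entire 21-byte record image. The BINARY casts in the query below
-- supply the bit data that INTERPRET requires.
create table itemloc
   (itemloc char(21) for bit data)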
Here’s the SELECT statement that reads this data:
select substr(ItemLoc, 1, 10) as Item,
       interpret(binary(substr(ItemLoc, 11, 3)) as numeric(3)) as Aisle,
       interpret(binary(substr(ItemLoc, 14, 3)) as numeric(3)) as Bay,
       substr(ItemLoc, 17, 1) as Level,
       interpret(binary(substr(ItemLoc, 18, 4)) as dec(7,3)) as QOH
  from itemloc
Here’s the output of the query.
| Item   | Aisle | Bay | Level | QOH     |
|--------|-------|-----|-------|---------|
| AA-101 | 95    | 12  | C     | 45.000  |
| AA-101 | 62    | 8   | B     | 50.125  |
| AA-101 | 104   | 25  | E     | 2000.55 |
| BC-728 | 14    | 6   | A     | 0.000   |
| DE-345 | 87    | 19  | A     | -3.000  |
In the case of zoned decimal values that cannot be negative, you can just substring without using INTERPRET. Be aware that in this case, the returned data will be character, not zoned.
select substr(ItemLoc, 1, 10) as Item,
       substr(ItemLoc, 11, 3) as Aisle,
       substr(ItemLoc, 14, 3) as Bay,
       substr(ItemLoc, 17, 1) as Level,
       interpret(binary(substr(ItemLoc, 18, 4)) as dec(7,3)) as QOH
  from itemloc
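If you would rather get those substringed zoned fields back as numbers instead of character strings, one option (my suggestion, not from the article) is to cast the digits with DEC. This works only when the bytes really are plain digits, that is, non-negative zoned values.

-- A hedged alternative, not from the article: DEC turns the digit
-- characters back into numeric values. Valid only when the zoned bytes
-- contain nothing but the characters 0 through 9.
select substr(ItemLoc, 1, 10) as Item,
       dec(substr(ItemLoc, 11, 3), 3, 0) as Aisle,
       dec(substr(ItemLoc, 14, 3), 3, 0) as Bay,
       substr(ItemLoc, 17, 1) as Level,
       interpret(binary(substr(ItemLoc, 18, 4)) as dec(7,3)) as QOH
  from itemloc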
Example 2: A Control File
Let’s consider a simple control or configuration file.
| Field               | Type   | Size |
|---------------------|--------|------|
| Key                 | Char   | 8    |
| Sequence            | Packed | 3,0  |
| Record type         | Char   | 2    |
| Entry-specific data | Char   | 256  |
Key, sequence and record type are all easily accessible, but the values in the last column aren’t. Such a table usually has many record types, but for this illustration, three will suffice:
- CN: The company name; one row
- CP: The current financial period; one row
- DV: Divisions; multiple rows
Here are the formats of the entry-specific-data column for each record type:
Record type CN
| From | To | Type | Size | Field        |
|------|----|------|------|--------------|
| 1    | 24 | Char | 24   | Company name |
Record type CP
| From | To | Type   | Size | Field                     |
|------|----|--------|------|---------------------------|
| 1    | 2  | Packed | 3,0  | Current period number     |
| 3    | 7  | Packed | 8,0  | Current period begin date |
| 8    | 12 | Packed | 8,0  | Current period end date   |
Record type DV
| From | To | Type  | Size | Field             |
|------|----|-------|------|-------------------|
| 1    | 4  | Zoned | 4,0  | Division ID       |
| 5    | 29 | Char  | 25   | Division name     |
| 30   | 54 | Char  | 25   | Street address 1  |
| 55   | 74 | Char  | 20   | City              |
| 75   | 76 | Char  | 2    | State             |
| 77   | 86 | Char  | 10   | Postal code (ZIP) |
Reading the CN record type is easy. There’s only one value in the entry-specific data, and it’s character.
select sequence, rectype,
       substr(data, 1, 24) as CompanyName
  from control
 where RecType = 'CN'
| Sequence | RecType | CompanyName     |
|----------|---------|-----------------|
| 10       | CN      | ACME Industries |
The CP format has packed decimal only.
select sequence, rectype,
       interpret(binary(substr(data, 1, 2)) as dec(3)) as CurrentPeriod,
       interpret(binary(substr(data, 3, 5)) as dec(8)) as BeginDate,
       interpret(binary(substr(data, 8, 5)) as dec(8)) as EndDate
  from control
 where RecType = 'CP'
| Sequence | RecType | CurrentPeriod | BeginDate | EndDate  |
|----------|---------|---------------|-----------|----------|
| 20       | CP      | 9             | 20200901  | 20200930 |
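BeginDate and EndDate come back as eight-digit numbers. If you want genuine date values, one possible follow-on (my own suggestion, not part of the article) is to run the interpreted number through DIGITS and TIMESTAMP_FORMAT.

-- My own follow-on, not from the article: convert the eight-digit
-- packed dates to DATE values. DIGITS yields a fixed 'YYYYMMDD' string,
-- TIMESTAMP_FORMAT parses it, and DATE keeps the date portion.
select sequence, rectype,
       interpret(binary(substr(data, 1, 2)) as dec(3)) as CurrentPeriod,
       date(timestamp_format(digits(
            interpret(binary(substr(data, 3, 5)) as dec(8))), 'YYYYMMDD')) as BeginDate,
       date(timestamp_format(digits(
            interpret(binary(substr(data, 8, 5)) as dec(8))), 'YYYYMMDD')) as EndDate
  from control
 where RecType = 'CP'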
The DV record type has character and zoned data.
select sequence, rectype,
       interpret(binary(substr(data, 1, 4)) as numeric(4)) as DivisionID,
       substr(data, 5, 25) as DivisionName,
       substr(data, 30, 25) as Street,
       substr(data, 55, 20) as City,
       substr(data, 75, 2) as State,
       substr(data, 77, 10) as ZIP
  from control
 where RecType = 'DV'
| Sequence | RecType | DivisionID | DivisionName | Street       | City         | State | ZIP        |
|----------|---------|------------|--------------|--------------|--------------|-------|------------|
| 30       | DV      | 100        | Widgets      | 101 Main St  | Lost Angeles | KZ    | 12345      |
| 31       | DV      | 1204       | Doohickeys   | 999 Worse St | New Yolk     | MQ    | 23456-9876 |
| 32       | DV      | 1492       | Thingamajigs | 321 Easy St  | Last Vegas   | ZT    | 30405      |
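Once a decode query like the DV one works, a convenient next step (my suggestion, not something from the article; the view name DIVISIONS is just a placeholder) is to wrap it in a view so other queries and reporting tools can treat the embedded divisions as an ordinary table.

-- A hedged sketch, not from the article: a view over the DV decode so
-- the embedded division data can be queried like a normal table.
create view divisions as
   select sequence,
          interpret(binary(substr(data, 1, 4)) as numeric(4)) as DivisionID,
          substr(data, 5, 25) as DivisionName,
          substr(data, 30, 25) as Street,
          substr(data, 55, 20) as City,
          substr(data, 75, 2) as State,
          substr(data, 77, 10) as ZIP
     from control
    where RecType = 'DV'

After that, something like select * from divisions order by DivisionID behaves as though the divisions had their own externally described file.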
It’s not ideal, but it works.
Or as Charlie Daniels might say, “That’s how you do it, son.”
RELATED STORIES
IBM Knowledge Center: INTERPRET
Geico Insurance: Does Charlie Daniels Play A Mean Fiddle? TV Commercial
Ted, this is a very timely article. I’m having to fight this issue quite frequently particularly with a “One True Lookup Table” used by a Vendor’s application. Too bad for me that I’m not on 7.4 yet. Something else to look forward to…
I’m always glad to hear that someone can use something we write about. Thanks, Paul!