On 7/14/25 14:02, David Wade wrote:
> On 14/07/2025 21:36, Dan Espen wrote:
>> Peter Flass <Peter@Iron-Spring.com> writes:
>>> On 7/13/25 07:18, Niklas Karlsson wrote:
>>>> On 2025-07-13, Nuno Silva <nunojsilva@invalid.invalid> wrote:
>>>>> On 2025-07-13, Niklas Karlsson wrote:
>>>>>> Not EBCDIC, but your mention of square brackets reminded me of the
>>>>>> modified 7-bit ASCII that was used to write Swedish before ISO 8859-1
>>>>>> and later Unicode made it big.
>>>>>> "} { | ] [ \" were shown as "å ä ö Å Ä Ö" on Swedish-adapted equipment,
>>>>>> making C code look absolutely ridiculous. Similar conventions applied
>>>>>> for the other Nordic languages and German.
>>>>> I played with ISO-646-FI/SE once in a Televideo terminal, but not for
>>>>> long enough to figure out how to handle day-to-day usage of a UNIX-like
>>>>> system without these characters.
>>>>> I (barely) know C has (had?) syntax and also iso646.h for such cases,
>>>>> but how would e.g. shell scripting be handled?
>>>> Couldn't say. I came in a little too late to really have to butt heads
>>>> with that issue.
>>> That's why C had trigraphs. PL/I(F) did the same thing with its
>>> "48-character set".
>> I got onto my first UNIX-on-mainframe project and all the developers had
>> already accepted TRIGRAPHS. I found that totally unacceptable. It took
>> me a month or 2 to find a 3270 emulator that I could patch up to finally
>> be able to see and type square brackets.
>> To IBM's credit I used IBM's internally used 3270 emulator (MITE I
>> believe) with some patches I came up with. I dumped the binary, found
>> the translate table and fixed it.
>> I can't fathom why trigraphs were considered an acceptable solution.
> On a real 3178 there are no [] characters, so you either lose some
> other characters, or use tri-graphs.
> ... I worked on coloured book software on IBM VM
By golly, you're right. The 3278 APL keyboard had them. We used 3290s
with the APL keyboard; great piece of gear.
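For anyone who never had to type them: the nine trigraphs were ??= ??( ??) ??< ??> ??/ ??' ??! ??- standing for # [ ] { } \ ^ | ~, and they were replaced in the very first translation phase, so they applied everywhere, string literals included. A small sketch of what bracket-free C looked like (modern compilers have since dropped trigraph support, so compiling this today may need something like GCC's -trigraphs):

#include <stdio.h>

int main(void)
??<
    int a??(3??) = ??< 1, 2, 3 ??>;        /* int a[3] = {1, 2, 3};  */
    printf("%d\n", a??(0??) ??! a??(1??)); /* a[0] | a[1], prints 3  */
    return 0;
??>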
On 7/14/25 14:14, Scott Lurndal wrote:
> Dan Espen <dan1espen@gmail.com> writes:
>> Peter Flass <Peter@Iron-Spring.com> writes:
>>> On 7/13/25 07:18, Niklas Karlsson wrote:
>>>> On 2025-07-13, Nuno Silva <nunojsilva@invalid.invalid> wrote:
>>>>> On 2025-07-13, Niklas Karlsson wrote:
>>>>>> Not EBCDIC, but your mention of square brackets reminded me of the
>>>>>> modified 7-bit ASCII that was used to write Swedish before ISO 8859-1
>>>>>> and later Unicode made it big.
>>>>>> "} { | ] [ \" were shown as "å ä ö Å Ä Ö" on Swedish-adapted equipment,
>>>>>> making C code look absolutely ridiculous. Similar conventions applied
>>>>>> for the other Nordic languages and German.
>>>>> I played with ISO-646-FI/SE once in a Televideo terminal, but not for
>>>>> long enough to figure out how to handle day-to-day usage of a UNIX-like
>>>>> system without these characters.
>>>>> I (barely) know C has (had?) syntax and also iso646.h for such cases,
>>>>> but how would e.g. shell scripting be handled?
>>>> Couldn't say. I came in a little too late to really have to butt heads
>>>> with that issue.
>>> That's why C had trigraphs. PL/I(F) did the same thing with its
>>> "48-character set".
>> I got onto my first UNIX-on-mainframe project and all the developers had
>> already accepted TRIGRAPHS. I found that totally unacceptable. It took
>> me a month or 2 to find a 3270 emulator that I could patch up to finally
>> be able to see and type square brackets.
>> To IBM's credit I used IBM's internally used 3270 emulator (MITE I
>> believe) with some patches I came up with. I dumped the binary, found
>> the translate table and fixed it.
>> I can't fathom why trigraphs were considered an acceptable solution.
> Not many keypunches had a square bracket key. Granted, if one were
> skilled on the keypunch, one could synthesize any Hollerith sequence;
> so assuming one knew how the hardware translated the Hollerith into
> EBCDIC (and the C compiler used the same EBCDIC character) they
> could punch a square bracket, albeit rather painfully. Trigraphs
> were much more convenient.
I got pretty good at multi-punching at one time in the long ago.
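For context on the multi-punching: standard Hollerith encodes a digit as a single punch and a letter as a zone punch (row 12, 11, or 0) plus a digit punch; everything else took multi-punch combinations that depended on the device's translate hardware. A rough sketch of the standard part only (the bracket combinations are deliberately left out, since they varied by device and code page):

#include <stdio.h>
#include <ctype.h>

/* Print the punch rows for a character under standard Hollerith.
   Digits: one punch in rows 0-9.  A-I: 12 + 1-9.  J-R: 11 + 1-9.
   S-Z: 0 + 2-9.  Anything else needed a device-dependent multi-punch. */
static void punches(char c)
{
    if (isdigit((unsigned char)c))
        printf("'%c': row %c\n", c, c);
    else if (c >= 'A' && c <= 'I')
        printf("'%c': rows 12+%d\n", c, c - 'A' + 1);
    else if (c >= 'J' && c <= 'R')
        printf("'%c': rows 11+%d\n", c, c - 'J' + 1);
    else if (c >= 'S' && c <= 'Z')
        printf("'%c': rows 0+%d\n", c, c - 'S' + 2);
    else
        printf("'%c': multi-punch, device-dependent\n", c);
}

int main(void)
{
    punches('A'); punches('R'); punches('Z'); punches('7'); punches('[');
    return 0;
}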
On Mon, 14 Jul 2025 20:01:48 -0700, Peter Flass wrote:
> On 7/14/25 18:29, Lawrence D'Oliveiro wrote:
>> On Mon, 14 Jul 2025 16:36:19 -0400, Dan Espen wrote:
>>> I can't fathom why trigraphs were considered an acceptable solution.
>> What would have been better?
> FORTRAN used .OR., .AND., etc.
But C avoided using meaningful names for that kind of thing.
Peter Flass <Peter@Iron-Spring.com> writes:
> On 7/14/25 18:29, Lawrence D'Oliveiro wrote:
>> On Mon, 14 Jul 2025 16:36:19 -0400, Dan Espen wrote:
>>> I can't fathom why trigraphs were considered an acceptable solution.
>> What would have been better?
> FORTRAN used .OR., .AND., etc.
FORTRAN is not C. Trigraphs worked perfectly well,
irrespective of your personal feelings. Ugly, perhaps,
but not as ugly as .OR.
On Mon, 14 Jul 2025 19:56:56 -0700, Peter Flass wrote:
> On 7/14/25 14:02, David Wade wrote:
>> On 14/07/2025 21:36, Dan Espen wrote:
>>> Peter Flass <Peter@Iron-Spring.com> writes:
>>>> On 7/13/25 07:18, Niklas Karlsson wrote:
>>>>> On 2025-07-13, Nuno Silva <nunojsilva@invalid.invalid> wrote:
>>>>>> On 2025-07-13, Niklas Karlsson wrote:
>>>>>>> Not EBCDIC, but your mention of square brackets reminded me of the
>>>>>>> modified 7-bit ASCII that was used to write Swedish before ISO
>>>>>>> 8859-1 and later Unicode made it big.
>>>>>>> "} { | ] [ \" were shown as "å ä ö Å Ä Ö" on Swedish-adapted
>>>>>>> equipment, making C code look absolutely ridiculous. Similar
>>>>>>> conventions applied for the other Nordic languages and German.
>>>>>> I played with ISO-646-FI/SE once in a Televideo terminal, but not
>>>>>> for long enough to figure out how to handle day-to-day usage of a
>>>>>> UNIX-like system without these characters.
>>>>>> I (barely) know C has (had?) syntax and also iso646.h for such
>>>>>> cases, but how would e.g. shell scripting be handled?
>>>>> Couldn't say. I came in a little too late to really have to butt
>>>>> heads with that issue.
>>>> That's why C had trigraphs. PL/I(F) did the same thing with its
>>>> "48-character set".
>>> I got onto my first UNIX-on-mainframe project and all the developers
>>> had already accepted TRIGRAPHS. I found that totally unacceptable.
>>> It took me a month or 2 to find a 3270 emulator that I could patch up
>>> to finally be able to see and type square brackets.
>>> To IBM's credit I used IBM's internally used 3270 emulator (MITE I
>>> believe) with some patches I came up with. I dumped the binary, found
>>> the translate table and fixed it.
>>> I can't fathom why trigraphs were considered an acceptable solution.
>> On a real 3178 there are no [] characters, so you either lose some other
>> characters, or use tri-graphs.
> By golly, you're right. The 3278 APL keyboard had them. We used 3290s
> with the APL keyboard; great piece of gear.
APL keyboards had many strange and wondrous characters... The IBM 5120 had
a selector switch for BASIC or APL and had the APL character set, IIRC on
the front of the keycaps.
On 7/14/25 22:59, Lawrence D'Oliveiro wrote:
> On Mon, 14 Jul 2025 20:01:48 -0700, Peter Flass wrote:
>> On 7/14/25 18:29, Lawrence D'Oliveiro wrote:
>>> On Mon, 14 Jul 2025 16:36:19 -0400, Dan Espen wrote:
>>>> I can't fathom why trigraphs were considered an acceptable solution.
>>> What would have been better?
>> FORTRAN used .OR., .AND., etc.
> But C avoided using meaningful names for that kind of thing.
Not meaningful with the dots.
On Mon, 14 Jul 2025 16:36:19 -0400, Dan Espen wrote:
>> I can't fathom why trigraphs were considered an acceptable solution.
> What would have been better?
Digraphs. They give alternative spellings for the needed C tokens.
Trigraphs apply everywhere, including inside strings, and to lower the
chance of an accidental match they are deliberately obscure.
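The C95 digraphs, for comparison: <% %> <: :> %: %:%: are alternative spellings of the tokens { } [ ] # ##. Unlike trigraphs they are ordinary tokens, so they are never rewritten inside string literals. A minimal sketch:

%:include <stdio.h>    /* %: is the digraph for '#' (C95) */

int main(void)
<%
    int a<:2:> = <% 40, 2 %>;        /* int a[2] = { 40, 2 }; */
    printf("%d\n", a<:0:> + a<:1:>); /* prints 42 */
    /* unlike trigraphs, digraphs are tokens, so "<%" inside a
       string literal stays exactly as typed */
    printf("<%% stays literal\n");
    return 0;
%>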
On Tue, 15 Jul 2025 07:21:12 -0700, Peter Flass wrote:
> On 7/14/25 22:59, Lawrence D'Oliveiro wrote:
>> On Mon, 14 Jul 2025 20:01:48 -0700, Peter Flass wrote:
>>> On 7/14/25 18:29, Lawrence D'Oliveiro wrote:
>>>> On Mon, 14 Jul 2025 16:36:19 -0400, Dan Espen wrote:
>>>>> I can't fathom why trigraphs were considered an acceptable solution.
>>>> What would have been better?
>>> FORTRAN used .OR., .AND., etc.
>> But C avoided using meaningful names for that kind of thing.
> Not meaningful with the dots.
You think you can’t tell that “.OR.” came from “or”, and “.AND.” from
“and”?
On 7/15/25 20:59, Lawrence D'Oliveiro wrote:
> On Tue, 15 Jul 2025 07:21:12 -0700, Peter Flass wrote:
>> On 7/14/25 22:59, Lawrence D'Oliveiro wrote:
>>> On Mon, 14 Jul 2025 20:01:48 -0700, Peter Flass wrote:
>>>> On 7/14/25 18:29, Lawrence D'Oliveiro wrote:
>>>>> On Mon, 14 Jul 2025 16:36:19 -0400, Dan Espen wrote:
>>>>>> I can't fathom why trigraphs were considered an acceptable solution.
>>>>> What would have been better?
>>>> FORTRAN used .OR., .AND., etc.
>>> But C avoided using meaningful names for that kind of thing.
>> Not meaningful with the dots.
> You think you can’t tell that “.OR.” came from “or”, and “.AND.” from
> “and”?
Of course. What I meant was "not otherwise significant to the parser,"
so not confusable with anything else.
David Wade <g4ugm@dave.invalid> writes:
> On 14/07/2025 21:36, Dan Espen wrote:
>> Peter Flass <Peter@Iron-Spring.com> writes:
>>> On 7/13/25 07:18, Niklas Karlsson wrote:
>>>> On 2025-07-13, Nuno Silva <nunojsilva@invalid.invalid> wrote:
>>>>> On 2025-07-13, Niklas Karlsson wrote:
>>>>>> Not EBCDIC, but your mention of square brackets reminded me of the
>>>>>> modified 7-bit ASCII that was used to write Swedish before ISO 8859-1
>>>>>> and later Unicode made it big.
>>>>>> "} { | ] [ \" were shown as "å ä ö Å Ä Ö" on Swedish-adapted equipment,
>>>>>> making C code look absolutely ridiculous. Similar conventions applied
>>>>>> for the other Nordic languages and German.
>>>>> I played with ISO-646-FI/SE once in a Televideo terminal, but not for
>>>>> long enough to figure out how to handle day-to-day usage of a UNIX-like
>>>>> system without these characters.
>>>>> I (barely) know C has (had?) syntax and also iso646.h for such cases,
>>>>> but how would e.g. shell scripting be handled?
>>>> Couldn't say. I came in a little too late to really have to butt heads
>>>> with that issue.
>>> That's why C had trigraphs. PL/I(F) did the same thing with its
>>> "48-character set".
>> I got onto my first UNIX-on-mainframe project and all the developers had
>> already accepted TRIGRAPHS. I found that totally unacceptable. It took
>> me a month or 2 to find a 3270 emulator that I could patch up to finally
>> be able to see and type square brackets.
>> To IBM's credit I used IBM's internally used 3270 emulator (MITE I
>> believe) with some patches I came up with. I dumped the binary, found
>> the translate table and fixed it.
>> I can't fathom why trigraphs were considered an acceptable solution.
> On a real 3178 there are no [] characters, so you either lose some
> other characters, or use tri-graphs.
Did the 3178 come with an APL feature?
Real terminals went away pretty quickly.
The project I was on was using emulators except for some of us with
3290s.
Lynn Wheeler <lynn@garlic.com> writes:
> other trivia: account about biggest computer "goof" ever, 360s
> originally were going to be ASCII machines, but the ASCII unit record
> gear weren't ready ... so were going to start shipping with old BCD gear
> (with EBCDIC) and move later
> https://web.archive.org/web/20180513184025/http://www.bobbemer.com/P-BIT.HTM
I don't know what dreams they were having within IBM but those machines
were never going to be ASCII. It would be pretty hard to do 14xx
emulation with ASCII, and IBM NEVER EVER did a competent ASCII-EBCDIC
translate table.
On Mon, 14 Jul 2025 09:40:28 GMT, Charlie Gibbs wrote:
> In the mainframe world, lower case was generally held in low regard. The
> myth was that anything not in all caps didn't look appropriately
> computerish. This myth survived for decades afterwards.
I read somewhere that, when AT&T engineers were designing the first
teletypes, they had room to include either uppercase letters or lowercase,
but not both. Executives decided that all-uppercase was preferable to
all-lowercase, solely because “god” seemed like a less respectful way of
writing the name (or was it the occupation?) of their favourite deity
than “GOD”.
I have no idea if this story is credible or not ...
On Mon, 7 Jul 2025 16:10:25 -0000 (UTC), Waldek Hebisch wrote:
> Endianness matters for character/digit-addressable machines.
I thought such machines always stored the digits in order of ascending
significance, because it didn’t make sense to do it the other way.
Dan Espen <dan1espen@gmail.com> wrote:
> Lynn Wheeler <lynn@garlic.com> writes:
>> other trivia: account about biggest computer "goof" ever, 360s
>> originally were going to be ASCII machines, but the ASCII unit record
>> gear weren't ready ... so were going to start shipping with old BCD gear
>> (with EBCDIC) and move later
>> https://web.archive.org/web/20180513184025/http://www.bobbemer.com/P-BIT.HTM
> I don't know what dreams they were having within IBM but those machines
> were never going to be ASCII. It would be pretty hard to do 14xx
> emulation with ASCII, and IBM NEVER EVER did a competent ASCII-EBCDIC
> translate table.
Emulation would work without any change; the CPU and almost all microcode
would be the same. IIUC what would differ would be the translation tables
on output and input. This could require extra space in the case of ASCII
peripherals. But normal 1401 memory sizes were decimal, so lower than the
corresponding binary numbers, and the actual core had extra space for use
by microcode. So it does not look like a big problem.
It is hard to say what the technical problems with ASCII were.
BCD gear used properties of BCD, so rewiring it for ASCII
could require some effort. But it does not look like a
big effort. So they probably could have announced ASCII before the
I/O equipment was fully ready (after all, they announced
before they had working systems and did not ship some
of what was announced).
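The translation-table idea here is the same trick the S/360 TR instruction performed in one pass: index a 256-byte table with each input byte. A minimal sketch in C; the table entries below cover only a few EBCDIC code points as an illustration, not a full code page:

#include <stddef.h>

/* One-pass byte translation, the software equivalent of the S/360
   TR instruction: buf[i] = table[buf[i]].                          */
static void translate(unsigned char *buf, size_t n,
                      const unsigned char table[256])
{
    for (size_t i = 0; i < n; i++)
        buf[i] = table[buf[i]];
}

/* Fragment of an EBCDIC-to-ASCII table (illustrative entries only):
   0x40 -> ' ', 0xC1-0xC9 -> 'A'-'I', 0xF0-0xF9 -> '0'-'9'.          */
static unsigned char ebc2asc[256];

static void init_table(void)
{
    for (int i = 0; i < 256; i++)
        ebc2asc[i] = '?';                 /* unmapped code points    */
    ebc2asc[0x40] = ' ';
    for (int i = 0; i < 9; i++)
        ebc2asc[0xC1 + i] = (unsigned char)('A' + i);
    for (int i = 0; i < 10; i++)
        ebc2asc[0xF0 + i] = (unsigned char)('0' + i);
}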
Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
> On Mon, 7 Jul 2025 16:10:25 -0000 (UTC), Waldek Hebisch wrote:
>> Endianness matters for character/digit-addressable machines.
> I thought such machines always stored the digits in order of ascending
> significance, because it didn’t make sense to do it the other way.
I think that bit/digit-serial machines did arithmetic starting from the
lowest digit. But early computer equipment needed to cooperate with
punched card equipment, that is, accept a mixture of character and
numeric data written in English writing order.
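A sketch of why low-order-first storage suited digit-serial arithmetic: the adder consumes digits starting at the least significant end, so each carry is known before the next digit arrives. In C, over numbers stored one digit per element, least significant digit first (an illustration of the technique, not of any particular machine):

/* Add two n-digit decimal numbers stored least significant digit
   first, the order a digit-serial adder consumes them: the carry
   ripples forward as each digit pair arrives. */
static void add_digits(const int *a, const int *b, int *sum, int n)
{
    int carry = 0;
    for (int i = 0; i < n; i++) {   /* low-order digit first */
        int d = a[i] + b[i] + carry;
        sum[i] = d % 10;
        carry = d / 10;
    }
}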
In addition to any technical problem, there was the political problem
created by IBM's version of 8-bit ASCII vs. the rest of the industry's
version.
Instead of adding a high-order bit to the 7-bit code, IBM wanted to put
the extra bit in position 5 (counting from the right), thus splitting the
defined and undefined characters into "stripes" in the table. I have no
idea why they thought this was a good idea, but the rest of the industry
said FOAD, and the rest, as is said, is history.
scott@alfter.diespammersdie.us (Scott Alfter) writes:
> In article <md6n3pFgaflU8@mid.individual.net>,
> Bob Eager <news0009@eager.cx> wrote:
>> Don't forget the ACT Sirius. A DOS machine that crammed more data onto a
>> diskette by using a variable-speed drive (5 speeds, I think).
> Apple used the same trick with its 3.5" floppy drives to fit 800K onto a
> disk that was only good for 720K elsewhere.
And before the 800K floppy, there was the single-sided 400K floppy on the
same controller.
Also, lower-case letter shapes are more complicated, so upper case
is more robust to low-quality print ...
ASCII not, what your machine can do for you. -- IBM
Aye, I really like the internal 400K floppy on my 128K Mac because
you can hear the drive speeding up and slowing down depending on
which region is being read.
On Fri, 18 Jul 2025 23:23:12 GMT, Charlie Gibbs wrote:
> ASCII not, what your machine can do for you. -- IBM
... “ASCII what you can do for your machine”.
Sums up IBM equipment (and software) in a nutshell.
> Did the 3178 come with an APL feature?
>> On a real 3178 there are no [] characters, so you either lose some
>> other characters, or use tri-graphs.
Not unless you paid a lot of money. In those times every mod was an
expensive extra, even if it was a link of wire.
> Real terminals went away pretty quickly.
> The project I was on was using emulators except for some of us with
> 3290s.
I think you were late on the scene. I started on 2260s, which date
from 1964. The IBM PC wasn't released until 1981, some 17 years
later. 3270 emulation didn't happen until, I think, a couple of years
later, so almost 20 years after the first terminals. Yes, they quickly
replaced terminals once they were available, but they were around for
a long time...
On Fri, 18 Jul 2025 18:23:23 +0000, Waldek Hebisch wrote:
> Emulation would work without any change; the CPU and almost all microcode
> would be the same. IIUC what would differ would be the translation tables
> on output and input. This could require extra space in the case of ASCII
> peripherals. But normal 1401 memory sizes were decimal, so lower than the
> corresponding binary numbers, and the actual core had extra space for use
> by microcode. So it does not look like a big problem.
I worked on a mainframe that supported both ASCII and EBCDIC. There was a
mode bit which selected which it would use.
The difference was the conversion from decimal nibbles to normal bytes, in
that different zone bits were used.
Bob Eager <news0009@eager.cx> writes:
> On Fri, 18 Jul 2025 18:23:23 +0000, Waldek Hebisch wrote:
>> Emulation would work without any change; the CPU and almost all microcode
>> would be the same. IIUC what would differ would be the translation tables
>> on output and input. This could require extra space in the case of ASCII
>> peripherals. But normal 1401 memory sizes were decimal, so lower than the
>> corresponding binary numbers, and the actual core had extra space for use
>> by microcode. So it does not look like a big problem.
> I worked on a mainframe that supported both ASCII and EBCDIC. There was a
> mode bit which selected which it would use.
> The difference was the conversion from decimal nibbles to normal bytes, in
> that different zone bits were used.
Every 360 had an ASCII bit. That bit took quite a while to disappear
from the PSW. Never saw anyone attempt to turn it on.
antispam@fricas.org (Waldek Hebisch) writes:
> Dan Espen <dan1espen@gmail.com> wrote:
>> Lynn Wheeler <lynn@garlic.com> writes:
>>> other trivia: account about biggest computer "goof" ever, 360s
>>> originally were going to be ASCII machines, but the ASCII unit record
>>> gear weren't ready ... so were going to start shipping with old BCD gear
>>> (with EBCDIC) and move later
>>> https://web.archive.org/web/20180513184025/http://www.bobbemer.com/P-BIT.HTM
>> I don't know what dreams they were having within IBM but those machines
>> were never going to be ASCII. It would be pretty hard to do 14xx
>> emulation with ASCII, and IBM NEVER EVER did a competent ASCII-EBCDIC
>> translate table.
> Emulation would work without any change; the CPU and almost all microcode
> would be the same. IIUC what would differ would be the translation tables
> on output and input. This could require extra space in the case of
> ASCII peripherals. But normal 1401 memory sizes were decimal, so
> lower than the corresponding binary numbers, and the actual core had
> extra space for use by microcode. So it does not look like a big problem.
Can't make much sense of the above.
14xx programs in emulation, by definition, had to use BCD.
ASCII had a different collating sequence. It's not a translation issue.
On 7/19/25 12:28, Dan Espen wrote:
> Bob Eager <news0009@eager.cx> writes:
>> On Fri, 18 Jul 2025 18:23:23 +0000, Waldek Hebisch wrote:
>>> Emulation would work without any change; the CPU and almost all
>>> microcode would be the same. IIUC what would differ would be the
>>> translation tables on output and input. This could require extra space
>>> in the case of ASCII peripherals. But normal 1401 memory sizes were
>>> decimal, so lower than the corresponding binary numbers, and the actual
>>> core had extra space for use by microcode. So it does not look like a
>>> big problem.
>> I worked on a mainframe that supported both ASCII and EBCDIC. There was
>> a mode bit which selected which it would use.
>> The difference was the conversion from decimal nibbles to normal bytes,
>> in that different zone bits were used.
> Every 360 had an ASCII bit. That bit took quite a while to disappear
> from the PSW. Never saw anyone attempt to turn it on.
It never did anything. Its only defined effect was to change the signs
generated for packed-decimal data. I don't know what IBM was thinking.
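Concretely: the sign of a packed-decimal number sits in the low nibble of its last byte, and (if the old S/360 Principles of Operation is remembered right) setting the ASCII bit made the decimal instructions generate the sign codes A (plus) and B (minus) in place of the EBCDIC-preferred C and D. A sketch of the encoding:

#include <stdio.h>

/* Pack the value 123 as S/360 packed decimal: bytes 0x12 0x3S, where
   S is the sign nibble. EBCDIC mode preferred C (+) / D (-); with the
   PSW ASCII bit set, A (+) / B (-) were generated instead.           */
static void pack123(unsigned char out[2], int negative, int ascii_mode)
{
    unsigned char plus  = ascii_mode ? 0xA : 0xC;
    unsigned char minus = ascii_mode ? 0xB : 0xD;
    out[0] = 0x12;                                    /* digits 1, 2 */
    out[1] = (unsigned char)((3 << 4) | (negative ? minus : plus));
}

int main(void)
{
    unsigned char e[2], a[2];
    pack123(e, 0, 0);   /* EBCDIC mode: 12 3C */
    pack123(a, 0, 1);   /* ASCII  mode: 12 3A */
    printf("%02X%02X  %02X%02X\n", e[0], e[1], a[0], a[1]);
    return 0;
}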
David Wade <g4ugm@dave.invalid> writes:
>> Did the 3178 come with an APL feature?
>>> On a real 3178 there are no [] characters, so you either lose some
>>> other characters, or use tri-graphs.
> Not unless you paid a lot of money. In those times every mod was an
> expensive extra, even if it was a link of wire.
>> Real terminals went away pretty quickly.
>> The project I was on was using emulators except for some of us with
>> 3290s.
> I think you were late on the scene. I started on 2260s, which date
> from 1964. The IBM PC wasn't released until 1981, some 17 years
> later. 3270 emulation didn't happen until, I think, a couple of years
> later, so almost 20 years after the first terminals. Yes, they quickly
> replaced terminals once they were available, but they were around for
> a long time...
Me, late on the scene?
I started programming in 1964 on IBM 14xx in Autocoder.
Did my first 2260 project using BTAM and assembler in 1968.
One of my favorite 327xs was the 3279 color terminal. Great keyboards
on those things. Looking back there was the punched card era, the 3270
era, then the 327x emulator era. I think I put in more years in the
emulator era than the real terminal era.
--
Dan Espen
On Sat, 19 Jul 2025 15:16:03 -0400
Dan Espen <dan1espen@gmail.com> wrote:
> David Wade <g4ugm@dave.invalid> writes:
>>> Did the 3178 come with an APL feature?
>>>> On a real 3178 there are no [] characters, so you either lose some
>>>> other characters, or use tri-graphs.
>> Not unless you paid a lot of money. In those times every mod was an
>> expensive extra, even if it was a link of wire.
>>> Real terminals went away pretty quickly.
>>> The project I was on was using emulators except for some of us with
>>> 3290s.
>> I think you were late on the scene. I started on 2260s, which date
>> from 1964. The IBM PC wasn't released until 1981, some 17 years
>> later. 3270 emulation didn't happen until, I think, a couple of years
>> later, so almost 20 years after the first terminals. Yes, they quickly
>> replaced terminals once they were available, but they were around for
>> a long time...
> Me, late on the scene?
> I started programming in 1964 on IBM 14xx in Autocoder.
> Did my first 2260 project using BTAM and assembler in 1968.
> One of my favorite 327xs was the 3279 color terminal. Great keyboards
> on those things. Looking back there was the punched card era, the 3270
> era, then the 327x emulator era. I think I put in more years in the
> emulator era than the real terminal era.
Yeahbut I'd have to book the colour terminal way in advance - anyhow
green on black is more restful to the eyes. I missed out on Autocoder,
being a mere stripling.
Dan Espen <dan1espen@gmail.com> wrote:
> antispam@fricas.org (Waldek Hebisch) writes:
>> Dan Espen <dan1espen@gmail.com> wrote:
>>> Lynn Wheeler <lynn@garlic.com> writes:
>>>> other trivia: account about biggest computer "goof" ever, 360s
>>>> originally were going to be ASCII machines, but the ASCII unit record
>>>> gear weren't ready ... so were going to start shipping with old BCD
>>>> gear (with EBCDIC) and move later
>>>> https://web.archive.org/web/20180513184025/http://www.bobbemer.com/P-BIT.HTM
>>> I don't know what dreams they were having within IBM but those machines
>>> were never going to be ASCII. It would be pretty hard to do 14xx
>>> emulation with ASCII, and IBM NEVER EVER did a competent ASCII-EBCDIC
>>> translate table.
>> Emulation would work without any change; the CPU and almost all microcode
>> would be the same. IIUC what would differ would be the translation tables
>> on output and input. This could require extra space in the case of
>> ASCII peripherals. But normal 1401 memory sizes were decimal, so
>> lower than the corresponding binary numbers, and the actual core had
>> extra space for use by microcode. So it does not look like a big problem.
> Can't make much sense of the above.
> 14xx programs in emulation, by definition, had to use BCD.
Yes. And using ASCII in 360 OSes has nothing to do with the above.
> ASCII had a different collating sequence. It's not a translation issue.
Internally the emulator works in BCD. The only problem is to correctly
emulate I/O when working with ASCII peripherals. That is solved by using
a translation table (so that the BCD code from the emulator gives the
correct glyph on the printer, etc.).
antispam@fricas.org (Waldek Hebisch) writes:
> Dan Espen <dan1espen@gmail.com> wrote:
>> antispam@fricas.org (Waldek Hebisch) writes:
>>> Dan Espen <dan1espen@gmail.com> wrote:
>>>> Lynn Wheeler <lynn@garlic.com> writes:
>>>>> other trivia: account about biggest computer "goof" ever, 360s
>>>>> originally were going to be ASCII machines, but the ASCII unit record
>>>>> gear weren't ready ... so were going to start shipping with old BCD
>>>>> gear (with EBCDIC) and move later
>>>>> https://web.archive.org/web/20180513184025/http://www.bobbemer.com/P-BIT.HTM
>>>> I don't know what dreams they were having within IBM but those machines
>>>> were never going to be ASCII. It would be pretty hard to do 14xx
>>>> emulation with ASCII, and IBM NEVER EVER did a competent ASCII-EBCDIC
>>>> translate table.
>>> Emulation would work without any change; the CPU and almost all
>>> microcode would be the same. IIUC what would differ would be the
>>> translation tables on output and input. This could require extra space
>>> in the case of ASCII peripherals. But normal 1401 memory sizes were
>>> decimal, so lower than the corresponding binary numbers, and the actual
>>> core had extra space for use by microcode. So it does not look like a
>>> big problem.
>> Can't make much sense of the above.
>> 14xx programs in emulation, by definition, had to use BCD.
> Yes. And using ASCII in 360 OSes has nothing to do with the above.
>> ASCII had a different collating sequence. It's not a translation issue.
> Internally the emulator works in BCD. The only problem is to correctly
> emulate I/O when working with ASCII peripherals. That is solved by using
> a translation table (so that the BCD code from the emulator gives the
> correct glyph on the printer, etc.).
If printing is all your app does.
Cards are Hollerith. A close cousin of BCD.
The app would expect any card master file to be in BCD order.
Tapes and disk have the same issue.
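Dan's collating point in miniature: in EBCDIC (as in the BCD that preceded it) the letters sort below the digits, while in ASCII the digits sort below the letters, so a master file kept in sorted order under one code is out of order under the other no matter how faithfully each byte is translated:

#include <stdio.h>

/* EBCDIC: 'A' = 0xC1 ... '0' = 0xF0, so letters < digits.
   ASCII:  '0' = 0x30 ... 'A' = 0x41, so digits < letters.
   A file sorted under one code is out of order under the other. */
int main(void)
{
    unsigned char ebcdic_A = 0xC1, ebcdic_0 = 0xF0;
    unsigned char ascii_A  = 0x41, ascii_0  = 0x30;

    printf("EBCDIC: 'A' %s '0'\n", ebcdic_A < ebcdic_0 ? "<" : ">");
    printf("ASCII:  'A' %s '0'\n", ascii_A  < ascii_0  ? "<" : ">");
    return 0;
}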
"Kerr-Mudd, John" <admin@127.0.0.1> writes:
On Sat, 19 Jul 2025 15:16:03 -0400
Dan Espen <dan1espen@gmail.com> wrote:
David Wade <g4ugm@dave.invalid> writes:
Did the 3178 come with an APL feature?On a real 3178 there are no [] characters so you either lose some
other characters, or use tri-graphs.
Not unless you paid a lot of money. In those times every mod was an
expensive extra, even if it was a link of wire..
Real terminals went away pretty quickly.
The project I was on was using emulators except for some of us with
3290s.
I think you were late on the scene. I started on 2260's which date
from 1964. The IBM PC wasn't released until 1981, some 17 years
later. 3270 emulation didn't happen until I think a couple of years
later, so almost 20 years after the first terminals. Yes they quickly
replaced terminals once they were available, but they were around for
a long time...
Me, late on the scene?
I started programming in 1964 on IBM 14xx in Autocoder.
Did my first 2260 project using BTAM and assembler in 1968.
One of my favorite 327xs were the 3279 color terminals. Great keyboards >> on those things. Looking back there was the punched card era, the 3270
era, then the 327x emulator era. I think I put in more years in
emulator era than the real terminal era.
Yeahbut I'd have to book the colour terminal way in advance - anyhow
green on black is more restful to the eyes. I missed out on autocoder, being a mere stripling.
One of my more favorite pastimes was redoing IBMs default 4-color color scheme of their ISPF screens. A 3279 was a 7 color terminal with
reverse image, underlining. It's amazing how much better you can make
a screen look with a little artistic skill.
At Bell Labs I had the 3279 on my desk for a year or so.
A short-term works colleague who was planning on doing-up^wrebuilding a cottage in mid-Wales for the quiet country life translated the ISPF
panels into Welsh.
On 2025-07-13, Nuno Silva <nunojsilva@invalid.invalid> wrote:
> On 2025-07-13, Niklas Karlsson wrote:
>> Not EBCDIC, but your mention of square brackets reminded me of the
>> modified 7-bit ASCII that was used to write Swedish before ISO 8859-1
>> and later Unicode made it big.
>> "} { | ] [ \" were shown as "å ä ö Å Ä Ö" on Swedish-adapted equipment,
>> making C code look absolutely ridiculous. Similar conventions applied
>> for the other Nordic languages and German.
> I played with ISO-646-FI/SE once in a Televideo terminal, but not for
> long enough to figure out how to handle day-to-day usage of a UNIX-like
> system without these characters.
> I (barely) know C has (had?) syntax and also iso646.h for such cases,
> but how would e.g. shell scripting be handled?
Couldn't say. I came in a little too late to really have to butt heads
with that issue.
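On the C side, <iso646.h> (added in C95) names the operators, and the substitutions quoted above are why something like a[i] |= x; came out as aÄiÅ ö= x; on a Swedish-adapted display. A sketch of what the header gives you; note that it does nothing for the brackets themselves, which still needed trigraphs or digraphs:

#include <stdio.h>
#include <iso646.h>   /* C95: and, or, not, bitand, bitor, xor, compl, ... */

int main(void)
{
    int x = 5, y = 0;

    if (x and not y)            /* same as: x && !y */
        puts("x is set, y is not");

    y = x bitor 2;              /* same as: x | 2   */
    y = compl x;                /* same as: ~x      */
    printf("%d\n", y);          /* prints -6        */

    /* iso646.h names the operators, but [ ] { } still need
       trigraphs ??( ??) ??< ??> or digraphs <: :> <% %> on a
       keyboard that renders them as Ä Å ä å. */
    return 0;
}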
Dan Espen <dan1espen@gmail.com> wrote:
> antispam@fricas.org (Waldek Hebisch) writes:
>> Dan Espen <dan1espen@gmail.com> wrote:
>>> antispam@fricas.org (Waldek Hebisch) writes:
>>>> Dan Espen <dan1espen@gmail.com> wrote:
>>>>> Lynn Wheeler <lynn@garlic.com> writes:
>>>>>> other trivia: account about biggest computer "goof" ever, 360s
>>>>>> originally were going to be ASCII machines, but the ASCII unit record
>>>>>> gear weren't ready ... so were going to start shipping with old BCD
>>>>>> gear (with EBCDIC) and move later
>>>>>> https://web.archive.org/web/20180513184025/http://www.bobbemer.com/P-BIT.HTM
>>>>> I don't know what dreams they were having within IBM but those
>>>>> machines were never going to be ASCII. It would be pretty hard to do
>>>>> 14xx emulation with ASCII, and IBM NEVER EVER did a competent
>>>>> ASCII-EBCDIC translate table.
>>>> Emulation would work without any change; the CPU and almost all
>>>> microcode would be the same. IIUC what would differ would be the
>>>> translation tables on output and input. This could require extra space
>>>> in the case of ASCII peripherals. But normal 1401 memory sizes were
>>>> decimal, so lower than the corresponding binary numbers, and the actual
>>>> core had extra space for use by microcode. So it does not look like a
>>>> big problem.
>>> Can't make much sense of the above.
>>> 14xx programs in emulation, by definition, had to use BCD.
>> Yes. And using ASCII in 360 OSes has nothing to do with the above.
>>> ASCII had a different collating sequence. It's not a translation issue.
>> Internally the emulator works in BCD. The only problem is to correctly
>> emulate I/O when working with ASCII peripherals. That is solved by using
>> a translation table (so that the BCD code from the emulator gives the
>> correct glyph on the printer, etc.).
> If printing is all your app does.
> Cards are Hollerith. A close cousin of BCD.
> The app would expect any card master file to be in BCD order.
Yes, card reader and card punch also need translation tables.
That's why I wrote "etc." above.
> Tapes and disk have the same issue.
That is less clear: 1401 discs and tapes stored word marks, which made
them incompatible with usual 360 formats. And discs were usually read on
a system of the same type. So an extra translation program (needed anyway
due to word marks) could also handle the change of character codes when
transferring data between systems.
Clearly 1401 compatibility did not prevent the introduction of CKD discs.
And CKD means a different on-disk format than the 1401 disc.
On Mon, 21 Jul 2025 09:26:43 +0100, Kerr-Mudd, John wrote:
> A short-term works colleague who was planning on doing-up^wrebuilding a
> cottage in mid-Wales for the quiet country life translated the ISPF
> panels into Welsh.
For some reason, former Linux kernel developer Alan Cox immediately came
to mind ...