The Essential Guide aka The Technical Manual

Now I see, you had no OS to support an assembler! How did you store your programs? Did you have to re-enter it all after power up/bootstrap etc?
At the very start, I obviously had to get the core of the OS working well enough on paper to 'boot and do something' before I could move it to the machine (obviously typing it in by hand the first time or three!). At that stage, about the earliest routines I developed on paper were necessarily the ones for burning code to EEPROM and an interface (largely hardware, not much code) to a domestic audio cassette recorder. Once they had been written/debugged/built, I was then able to store the evolving OS on EEPROM or cassette as development proceeded. Eventually (after the first version of the 'word processor' was completed) I added a (100KB) 5.25" floppy drive.

I see, but surely you had an OS by the time you wrote a word processor otherwise how would you manage files, store files etc? Did you store them just in memory only?
Indeed - see above. OS in EPROM, programs/files initially stored on audio cassette and later on floppy.

Kind Regards, John.
 
That is true but the assembler that we used did not always adopt the most efficient methods and I assume the Z80 assembler may have been similar. For example (and I can't remember the details), bit shifting left/right during additions etc. We would sometimes fine-tune time-critical events and often save many machine cycles ....
Oh, sure, particularly in those days, assemblers were not always as clever as human beings at optimising - and they certainly could not do any 'creative lateral thinking'.

The same is even more true of compilers. I'm sure if there were human beings who could get their heads around 64-bit machine code (sounds a bit like N-dimensional chess to me!), both the performance and size of most current-day applications (most of which seem to be written in languages such as C++) could probably be considerably enhanced by 'tweaking' the code.

Kind Regards, John.
 
The same is even more true of compilers. I'm sure if there were human beings who could get their heads around 64-bit machine code (sounds a bit like N-dimensional chess to me!), both the performance and size of most current-day applications (most of which seem to be written in languages such as C++) could probably be considerably enhanced by 'tweaking' the code.


We sometimes used to look at the machine code result of a section of high-level compiled code (Pascal or C, for example). It was analogous to posting a small screwdriver packed in a 3x3 meter crate stuffed with polystyrene chips and old newspapers. You did eventually get the small screwdriver but it came with a lot of baggage :)

I haven't written any significant code in anger for 20-odd years. C++ (the object-orientated nature of it) is a complete mystery to me. I do remember trying to read it a few times but I just did not get it at all. Everyone tells me it's so much better in terms of structure, but my view is: what was ever unstructured about defining variables and constants, and calling functions and procedures that you had pre-written? :cry:
 
I think you have misunderstood BAS and myself. All my machine coding was done in hex, not binary. The 'address resolution' to which BAS and I have referred relates to the calculation of relative address offsets - which requires a physical map of memory, some counting, and then some hex addition or subtraction. An assembler does most of that for one.
Spot on.
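
To put some numbers on that offset arithmetic: on the Z80, a relative jump (JR) stores a signed 8-bit displacement measured from the address of the instruction that follows it, so doing it by hand means subtracting hex addresses and getting the two's complement right. A rough C sketch of the same sum - the addresses here are invented purely for illustration:

[code]
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Hypothetical addresses, chosen purely for illustration. */
    uint16_t jr_addr = 0x4010;   /* address of the JR opcode itself */
    uint16_t target  = 0x4000;   /* address we want to jump to      */

    /* Z80 JR: the displacement is relative to the byte AFTER the
       two-byte instruction, i.e. jr_addr + 2.                      */
    int16_t disp = (int16_t)(target - (jr_addr + 2));

    if (disp < -128 || disp > 127) {
        printf("Target out of range for a relative jump\n");
        return 1;
    }

    /* 0x18 is the JR opcode; the displacement byte is stored in
       two's complement, so -18 (decimal) becomes 0xEE.             */
    printf("JR bytes: 18 %02X  (displacement %d)\n",
           (uint8_t)disp, disp);
    return 0;
}
[/code]

An assembler does exactly that subtraction (and the range check) for you, which is most of what 'address resolution' saves.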


Now I see, you had no OS to support an assembler!
You don't need one.

I didn't have one on the PDP-8 I was using when my assembler code got to the size where there was no longer room for the assembler, and I had to start translating it into machine code by hand, doing all my address resolution by hand....

An OS is just a program - you can run other programs on computers.

I've taken 1990's vintage 32-bit minicomputers which normally ran sophisticated proprietary OS's and/or Unix, and written code to run on the bare metal.


How did you store your programs?
Don't know about John, but options would have been paper tape, punched card, magnetic media.


Did you have to re-enter it all after power up/bootstrap etc?
Back in the day, 'twas not unknown to have to enter the bootstrap manually from switches after power-up.

They weren't very long - you could memorise them, but even fairly short sequences did look impressive to the uninitiated as you whizzed through flicking switches up and down. :cool:


I see, but surely you had an OS by the time you wrote a word processor otherwise how would you manage files, store files etc? Did you store them just in memory only?
You can store files on offline media - don't need an OS.



An assembler is not an interpreter, or even a compiler. It is just an aid to writing the exact same machine code which one could write directly. The end-product is simply machine code - so unlike interpreted (and compiled in some cases) code, there is no performance consequence of writing machine code using an assembler.

That is true but the assembler that we used did not always adopt the most efficient methods and I assume the Z80 assembler may have been similar.
I've never used an assembler which did anything but faithfully, and simply, translate your letter opcodes into machine code etc.

If the machine code one ended up with was inefficient that was your own fault for writing inefficient assembler code.
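
For what it's worth, that 'faithful translation' point is easy to picture: at its heart an assembler is little more than a lookup from mnemonic to opcode byte(s), plus the address bookkeeping. A toy sketch in C, using a handful of genuine single-byte Z80 opcodes (operands, multi-byte instructions and labels are all glossed over):

[code]
#include <stdio.h>
#include <string.h>

/* A few genuine single-byte Z80 opcodes; a real assembler would also
   handle operands, multi-byte instructions, labels and so on.        */
struct op { const char *mnemonic; unsigned char code; };

static const struct op table[] = {
    { "NOP",  0x00 },
    { "HALT", 0x76 },
    { "RET",  0xC9 },
    { "EI",   0xFB },
    { "DI",   0xF3 },
};

int main(void)
{
    const char *source[] = { "DI", "NOP", "EI", "RET" };
    size_t n = sizeof source / sizeof source[0];

    for (size_t i = 0; i < n; i++) {
        for (size_t j = 0; j < sizeof table / sizeof table[0]; j++) {
            if (strcmp(source[i], table[j].mnemonic) == 0) {
                printf("%-4s -> %02X\n", source[i], table[j].code);
                break;
            }
        }
    }
    return 0;
}
[/code]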
 
Now I see, you had no OS to support an assembler!
You don't need one. .... An OS is just a program - you can run other programs on computers. .... I've taken 1990's vintage 32-bit minicomputers which normally ran sophisticated proprietary OS's and/or Unix, and written code to run on the bare metal.
Indeed. However, the problem I was describing was that, even if I could have afforded it, I doubt that there was a commercial Z80 assembler written in raw Z80 machine code at the time.

That is true but the assembler that we used did not always adopt the most efficient methods and I assume the Z80 assembler may have been similar.
I've never used an assembler which did anything but faithfully, and simply, translate your letter opcodes into machine code etc.
Me neither - it certainly did sound as if sparkticus was talking about a compiler. However, there is one sense in which he is sort-of right. If one is writing directly in machine code, and thinking sufficiently cleverly and laterally, one can sometimes get up to 'optimising tricks' which one could not code in assembler - for example, by taking advantage of knowledge of relationships between various memory addresses, using the same memory address for two or more different purposes at different points in the program, etc., etc. Such 'clever' code would be frowned upon today, since (unless very well documented) it would be difficult for anyone else to understand, hence difficult to maintain - but in the days when every byte and every machine cycle counted, it was sometimes a case of 'needs must'!
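
A tame modern analogue of that address-reuse trick is a C union, where two things deliberately occupy the same storage at different times. A minimal sketch of the idea (in C, obviously not how the original machine code did it):

[code]
#include <stdio.h>

/* Two pieces of data that are never needed at the same time can share
   one block of storage - the spirit of reusing an address for two
   purposes, made explicit (and safer) by the compiler.                */
union scratch {
    char line[64];      /* used while reading input       */
    int  totals[16];    /* reused later for accumulation  */
};

int main(void)
{
    union scratch s;

    /* Phase 1: the storage holds a text line. */
    snprintf(s.line, sizeof s.line, "phase one uses this buffer");
    printf("%s\n", s.line);

    /* Phase 2: the very same bytes are reused as counters. */
    for (int i = 0; i < 16; i++)
        s.totals[i] = i * i;
    printf("same storage, now totals[3] = %d\n", s.totals[3]);

    return 0;
}
[/code]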

Kind Regards, John.
 
I built a Z80 machine in 1976 and spent happy hours programming it in assembly.

I never got an assembler but I did write a FORTH compiler. This was a very quirky language, but highly compact. The core of the compiler was written in assembly and the bells & whistles in FORTH itself.

A FORTH program adds new instructions until the last one is the instruction to run the user program. It used integer arithmetic. Constants such as pi were expressed as a ratio, so pi = 355/113. The basic arithmetic operations included a */ operator to make scaling by ratios fast.
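
As I understand it, the point of FORTH's */ word is that the multiplication goes through a double-width intermediate before the division, so ratios like 355/113 lose as little precision as possible. Roughly the same idea in C (the FORTH syntax itself is not shown, and the numbers are just for illustration):

[code]
#include <stdio.h>
#include <stdint.h>

/* Scale n by the ratio num/den, keeping a wide intermediate so the
   multiply does not overflow before the divide - the same job that
   FORTH's */ word does on the stack.                                */
static int32_t scale(int32_t n, int32_t num, int32_t den)
{
    return (int32_t)(((int64_t)n * num) / den);
}

int main(void)
{
    int32_t diameter = 20000;

    /* pi approximated as 355/113, as in the post above. */
    int32_t circumference = scale(diameter, 355, 113);

    printf("circumference ~ %ld (exact: %.3f)\n",
           (long)circumference, diameter * 3.14159265);
    return 0;
}
[/code]

Incidentally, 355/113 is such a popular approximation because it matches pi to about seven significant figures while both numbers stay comfortably small.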

The arithmetic also used reverse Polish notation. Maybe some will remember early calculators that used reverse Polish notation.
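
For anyone who never met reverse Polish notation: operands go onto a stack and each operator pops its arguments and pushes the result, so '3 4 + 5 *' means (3 + 4) * 5. A bare-bones C sketch of an RPN evaluator (integer-only, with the expression hard-coded and no real error handling):

[code]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAX 32

static long stack[MAX];
static int  top = 0;

static void push(long v) { stack[top++] = v; }
static long pop(void)    { return stack[--top]; }

int main(void)
{
    /* "3 4 + 5 *" in RPN means (3 + 4) * 5 = 35 */
    const char *tokens[] = { "3", "4", "+", "5", "*" };
    size_t n = sizeof tokens / sizeof tokens[0];

    for (size_t i = 0; i < n; i++) {
        const char *t = tokens[i];
        if (strchr("+-*/", t[0]) && t[1] == '\0') {
            long b = pop();          /* top of stack = right-hand operand */
            long a = pop();
            switch (t[0]) {
                case '+': push(a + b); break;
                case '-': push(a - b); break;
                case '*': push(a * b); break;
                case '/': push(a / b); break;
            }
        } else {
            push(strtol(t, NULL, 10));    /* a number: just stack it */
        }
    }

    printf("result = %ld\n", pop());   /* prints 35 */
    return 0;
}
[/code]

FORTH goes further and keeps that stack as the central data structure for everything, not just the arithmetic.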
 
...with apologies to him whose thread has now been rather comprehensively hijacked (but I did vote in the poll :)) ...
I built a Z80 machine in 1976 and spent happy hours programming it in assembly.
I never got an assembler but I did write a FORTH compiler. This was a very quirky language, but highly compact. The core of the compiler was written in assembly and the bells & whistles in FORTH itself.
Very similar experiences, then - except that you were four or so years ahead of me. I produced a very simple implementation of a BASIC interpreter in machine code - with most of the main BASIC commands and 26 variables (A-Z).
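
The 26-variables-named-A-to-Z scheme is pleasingly cheap to implement, which is presumably why so many tiny BASICs used it: the whole variable store is just one small array indexed by letter. A guess at the general shape of it, in C rather than Z80 code:

[code]
#include <stdio.h>
#include <ctype.h>

/* One slot per letter: variable 'A' lives in vars[0], 'Z' in vars[25]. */
static int vars[26];

static void let(char name, int value) { vars[toupper(name) - 'A'] = value; }
static int  get(char name)            { return vars[toupper(name) - 'A']; }

int main(void)
{
    let('N', 7);                  /* BASIC:  LET N = 7     */
    let('X', get('N') * 6);       /* BASIC:  LET X = N * 6 */
    printf("X = %d\n", get('X')); /* BASIC:  PRINT X       */
    return 0;
}
[/code]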
The arithmetic also used reverse Polish notation. Maybe some will remember early calculators that used reverse Polish notation.
Again 'snap'. My machine (inspired by a 1980 series of articles in Wireless World by John Adams) used an MM57109 reverse Polish calculator chip as a 'numerical co-processor' (much easier than implementing the maths in machine code!), so my BASIC interpreter used reverse Polish. I was already familiar with RP, since my very earliest experiences of 'programming' in anger, I guess around 1976, were on a horrendously expensive (work, not mine!) HP programmable calculator which used RP - it incorporated a 'till roll' thermal printer as an 'output device' and, way ahead of its time, stored programs (sets of RP instructions, really, but with a few variables and simple looping/jumping structures etc.) on tiny magnetic strips/cards (which held 256 instructions, IIRC).

Kind Regards, John.
 
Back in the day, 'twas not unknown to have to enter the bootstrap manually from switches after power-up.
Just enough to get bytes from the tape reader into RAM at the start vector. Then load a short tape with the "intelligent" bootstrap program.
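
In C-as-pseudocode, that toggled-in first stage boils down to something like the loop below. The device addresses, status bit and load address are entirely invented - the real values were machine-specific, and the real thing was short enough to key in from the front panel switches:

[code]
#include <stdint.h>

/* Entirely hypothetical memory-mapped tape-reader registers and load
   address - the real values depended on the particular machine.      */
#define READER_STATUS (*(volatile uint8_t *)0xFF00u)
#define READER_DATA   (*(volatile uint8_t *)0xFF01u)
#define READY_BIT     0x01u

#define START_VECTOR  0x0100u   /* where the second-stage loader goes */
#define LOAD_LEN      128       /* its length in bytes                */

/* The whole job of the toggled-in bootstrap: pull bytes from the tape
   reader into RAM at the start vector, then jump to what was loaded.  */
void bootstrap(void)
{
    volatile uint8_t *dst = (volatile uint8_t *)START_VECTOR;

    for (int i = 0; i < LOAD_LEN; i++) {
        while (!(READER_STATUS & READY_BIT))
            ;                          /* wait for the next byte */
        dst[i] = READER_DATA;
    }

    ((void (*)(void))START_VECTOR)();  /* hand control to the loaded code */
}
[/code]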

If the machine code one ended up with was inefficient that was your own fault for writing inefficient assembler code.
True

Writing in assembler invariably produces faster, more efficient and more compact code than the same function written in C or another high-level language. But writing it can take a lot longer. In today's world of cheap and plentiful memory and fast (over-driven) processors, the art of writing compact code with minimal memory requirements is rapidly disappearing.
 
An OS is just a program - you can run other programs on computers.

That is true, but an OS provides a layer above which an application that knows little about the specific hardware can run. Of course you can code at a lower level (machine code, assembler etc.) but you need an intimate knowledge of the hardware. I know little about the Z80, though I do remember coding some limited machine code for it at college.


I've taken 1990's vintage 32-bit minicomputers which normally ran sophisticated proprietary OS's and/or Unix, and written code to run on the bare metal.

Yes (and that is very clever) but it requires a lot of knowledge about the specific architecture and intimate knowledge of the CPU registers, I/O registers, stack registers etc.

Back in the day, 'twas not unknown to have to enter the bootstrap manually from switches after power-up.
They weren't very long - you could memorise them, but even fairly short sequences did look impressive to the uninitiated as you whizzed through flicking switches up and down. :cool:

Yes, that is what we had to do on the TI960A. We memorized the bootstrap, entered it on the front-panel bit switches, then set the program counter to the start of the sequence. We regularly had to reboot the thing after crashes.


You can store files on offline media - don't need an OS.

That is beyond my experience. I was stuck with the TI960A/B. But thinking about it, we did "download" and "upload" via a data link to the mainframe, where we stored programs (originally on punch card, then tape, then disk). We manually entered a memory start address from which to start loading the program.

I've never used an assembler which did anything but faithfully, and simply, translate your letter opcodes into machine code etc.

The TI960 offered (through design flaws more than anything intentional) several ways to do I/O functions, computational functions and direct memory addressing. Originally we knew this and coded directly in machine code, based upon what we knew about the idiosyncratic architecture. Then an assembler came along (in beta form). The assembler did not offer the same flexibility in the ways to do things, especially for I/O addressing. I/O addressing was a tangle on the TI960 because it offered both limited parallel addressing via multiple registers and a serial controller called a control register unit (CRU).

If the machine code one ended up with was inefficient that was your own fault for writing inefficient assembler code.

That may well be partially true; we wrote the code in a noisy clean-room environment dressed in "bunny suits" and under pressure to keep production going, so we almost certainly made mistakes/were careless at times. However, we also (at times) had to save microseconds in an effort to capture time-critical I/O events (such as fast rising/falling edges) on signals which had significant jitter relative to their clock etc.

http://computermuseum.informatik.uni-stuttgart.de/pics/ti960b/ti_960b.jpg
 
Just found this link which brings back some old memories. I see they refer to the I/O as a Communications Resource Unit (CRU). I remember it as a Control Register Unit, but my memory is probably less reliable than the article - though hopefully more reliable than the many loose connections to the magnetic core memory in the original machine :D

http://www.computerhistory.org/collections/accession/X1619.99
 
Me neither - it certainly did sound as if sparkticus was talking about a compiler.


No, I was not intentionally referring to a compiler, and I apologize if I caused confusion. Compilers obviously add significant overhead.

However, there is one sense in which he is sort-of right. If one is writing directly in machine code, and thinking sufficiently cleverly and laterally, one can sometimes get up to 'optimising tricks' which one could not code in assembler - for example, by taking advantage of knowledge of relationships between various memory addresses, using the same memory address for two or more different purposes at different points in the program, etc., etc. Such 'clever' code would be frowned upon today, since (unless very well documented) it would be difficult for anyone else to understand, hence difficult to maintain - but in the days when every byte and every machine cycle counted, it was sometimes a case of 'needs must'!


Exactly :idea: That was my point. I worked on the TI960 (very little else). In our applications the 960 provided the control for industrial automated production test equipment for power semiconductor parametric test.
Most often we controlled (serviced) the material handling system in background mode (via ISR) and the parametric analysis in the foreground, though we did use ISRs to service other events as well. The TI960 had an idiosyncratic nature because there was most often more than one way to do something, and quite often four ways, each offering a specific advantage, especially when writing code for time-critical events such as capturing fast (fast for the day) rising/falling edges of a waveform relative to a clock source. Quite often the available sampling period would be only a few hundred microseconds (yes, a snail's pace by today's standards) but very tricky at the time, even with the aid of peripheral data capture cards, for which we also coded in machine code (as opposed to assembler) for the same reason.
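
That foreground/background split is still the standard shape for small control systems: the interrupt routine does the bare minimum for the time-critical event and leaves a flag, and the main loop does the slower analysis when it gets round to it. A generic C sketch (nothing here is TI960-specific, and the function names are made up):

[code]
#include <stdbool.h>
#include <stdint.h>

/* Set by the (hypothetical) interrupt service routine, consumed by the
   foreground loop; 'volatile' because it changes outside normal flow.  */
static volatile bool    handler_pending = false;
static volatile uint8_t latest_sample   = 0;

/* Background: the ISR fires on the time-critical event (e.g. an edge
   from the material-handling hardware), grabs what it must, and exits. */
void material_handling_isr(uint8_t sample_from_hardware)
{
    latest_sample   = sample_from_hardware;
    handler_pending = true;
}

/* Foreground: the slower parametric-analysis work runs here, picking up
   whatever the ISR has left for it.                                     */
void foreground_loop(void)
{
    for (;;) {
        if (handler_pending) {
            handler_pending = false;
            /* process latest_sample: update totals, drive the handler,
               queue the next measurement, and so on.                    */
        }
        /* ...the rest of the parametric analysis continues here...      */
    }
}
[/code]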
 
