Give me mercy...Removing a UPS can't be that hard!

Over the subsequent decades, I have quite often had to try to produce "idiot-proof" sets of instructions (a thankless task, given the almost limitless capacity some people have for idiocy!), but

I've seen an old saying:

Make something idiot-proof and the world creates a better idiot.
 
That was one of the big advantages of teaching at least some basic (small b) programming skills.
Yes, as I said, indeed so.

In fact, although (in the 60s) we had no access to any computer, and therefore could not do 'programming' as such, my A-Level and S-Level maths courses did include a fair bit of 'formal logic' (which we had to practise 'on paper') as somewhat of a prelude to what was to come.

My first encounter with true programming was at uni, in the late 60s, when they thought that it would be a good idea to expose us to such things, but that (using FORTRAN, and a little COBOL) was a bit of a joke, since 'the computer' was several miles down the road, and each week sent us back printed 'output', which usually said things like "syntax error in line 3" - it took us a whole term to write a tiny programme that actually worked :)

Kind Regards, John
 
I've seen an old saying ... Make something idiot-proof and the world creates a better idiot.
Indeed - and I suppose that's the next stage on from what I said about trying (but never succeeding!) to make it idiot-proof in the first place, namely:
JohnW2 said:
.... (a thankless task, given the almost limitless capacity some people have for idiocy!)
:)

Kind Regards, John
 
My first encounter with true programming was at uni, in the late 60s, when they thought that it would be a good idea to expose us to such things, but that (using FORTRAN, and a little COBOL) was a bit of a joke, since 'the computer' was several miles down the road, and each week sent us back printed 'output', which usually said things like "syntax error in line 3" - it took us a whole term to write a tiny programme that actually worked :)

I missed all of that - I went straight to building them, and programming them, in the early 70s. I tried lots of languages along the way, before I eventually became bored with the whole idea. I have not written anything for decades now.
 
That was one of the big advantages of teaching at least some basic (small b) programming skills.

Python is very much to the fore in universities at the moment (...and free!).
LabVIEW as a pseudo-language is almost as popular in the engineering and science disciplines (...and expensive!).
 
Python is very much to the fore in universities at the moment (...and free!).
LabVIEW as a pseudo-language is almost as popular in the engineering and science disciplines (...and expensive!).
I would think that the skills/lessons (in terms of 'precise clarity') we are talking about are probably equally well learned whilst learning to use almost any programming language/environment. To the best of my knowledge, very few even attempt to compensate for 'programming oversights' - and any that do are potentially 'dangerous', since, if there is a problem with literally following the instructions which have been programmed, a compiler can only attempt to guess (not necessarily correctly!) what the programmer's intentions actually were!

Accordingly, I would think that virtually any programming experience should help one to become very careful in making sure that a literal interpretation of one's 'instructions' (and "nothing but one's instructions") will achieve exactly what one wants.
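
By way of a tiny illustration, here is a Python sketch of my own (names entirely made up, purely to make the point - not anyone's actual code): the interpreter carries out exactly what has been written, not what was meant.

# sum_one_to_ten is intended to add up the numbers 1 to 10
def sum_one_to_ten():
    total = 0
    for n in range(1, 10):    # literally means 1, 2, ... 9 - the 10 is excluded
        total += n
    return total

print(sum_one_to_ten())       # prints 45, not the 55 the author probably intended

The code is entirely 'correct' as far as the language is concerned; it is only the intention that has been missed, and no compiler or interpreter could be expected to spot that.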

Kind Regards, John
 
JohnW2's comment about the compiler getting confused and not doing what the programmer wanted is valid.

'programming oversights' are less likely to create problems when the code is written in assembler.

My programming journey started on the Intel 4004 and went on through the Motorola 6800 and 6809, then Turbo Pascal and Microchip Assembler, as well as BASIC on a variety of machines.
 
Yes, as I said, indeed so.

In fact, although (in the 60s) we had no access to any computer, and therefore could not do 'programming' as such, my A-Level and S-Level maths courses did include a fair bit of 'formal logic' (which we had to practise 'on paper') as somewhat of a prelude to what was to come.

My first encounter with true programming was at uni, in the late 60s, when they thought that it would be a good idea to expose us to such things, but that (using FORTRAN, and a little COBOL) was a bit of a joke, since 'the computer' was several miles down the road, and each week sent us back printed 'output', which usually said things like "syntax error in line 3" - it took us a whole term to write a tiny programme that actually worked :)

Kind Regards, John

I was rather luckier - at UCL we had an IBM 360 in the computer centre, which was not far from the engineering building, but we had an Interdata Model 5 RJE (remote job entry) terminal in the eng building where we could submit jobs (on punched cards) and collect printouts. Turnaround was only an hour or so. Eventually the operator allowed a few of us to enter our own jobs, so we could do it at any time. We programmed in Waterloo Fortran IV (Watfiv). A few of us got bored with Fortran and taught ourselves PL/1, which our lecturer allowed us to use for coursework.

After that I moved on to the 6800/6809, the 68000 (16 bits - YAY!), the PDP-11 and then to VAXclusters.

The most important thing to remember about coding - there is always one more bug!
 
One of the benefits of Python can also be a drawback.
Its ease of access and ubiquity means it is possible to program quite complex code simply by cutting and pasting examples from, for example, stackoverflow.com.
Coding without having a basic (or should I say rudimentary! :) ) understanding has never been easier!
 
So my Granddad (who is in his mid-80s) has had a UPS on his router for a number of years, as he lives in a rural area and gets lots of power cuts and power dips. The UPS was 10-11 years old and had started to become unreliable, so I ordered him a new one.

In the meantime I sent him this via email (they also now have a VoIP service as their main phone line):


So today when I arrived to install the new UPS, I found that, rather than plugging the IEC C13 and C14 connectors into each other as I assumed he would based on my email above, my Granddad had cut the connectors off the ends of the wires and wired them together, in an open, cover-less box, using a terminal strip!

I guess I should have stated in red bold text in the email that no tools whatsoever are required and that the connectors just plug into each other - I assumed wrong!

Regards: Elliott.
For what it's worth, the same thing happened at a hilltop aerial site, except there the only service reconnected was one 4-way 13A socket and not the whole installation.
 
JohnW2's comment about the compiler getting confused and not doing what the programmer wanted is valid.
I'm not sure that 'confused' is the right word, and one couldn't really blame the compiler for 'doing the wrong thing' (in relation to the programmer's intentions) if there are errors/'oversights' in the programming.

Provided only that it is syntactically correct, the compiler will literally and precisely follow the instructions that have been programmed. It's only when the programmer's instructions, when taken literally, result in problems that a compiler might (although very few do) attempt to guess what was actually 'intended' - and, as above, if that ever happens, one couldn't really blame the compiler for 'guessing wrong'.
... 'programming oversights' are less likely to create problems when the code is written in assembler.
I've been talking about 'oversights' of logic, and they will exist regardless of how one programs. Taking my previous example, if the programmer forgets to make ('logical') provision for a possible situation in which a 'divide by zero' could arise, that problem will be as present if programmed in assembly language (or even machine code) as with a 'high-level' language.
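
To make that concrete, here is a minimal Python sketch (my own invented example, not anyone's actual code) of the sort of 'logical provision' the programmer has to make, whatever the language:

def mean(values):
    # Without this check, an empty list would produce a divide-by-zero at run time -
    # the same oversight would bite in assembler or machine code, just less politely!
    if not values:
        return None
    return sum(values) / len(values)

print(mean([3, 4, 5]))   # 4.0
print(mean([]))          # None, rather than a crash

No language, high-level or low-level, can supply that check on the programmer's behalf.
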
My programming journey started on the Intel 4004 and went on through the Motorola 6800 and 6809, then Turbo Pascal and Microchip Assembler, as well as BASIC on a variety of machines.
Not that much different here. Although by then I had a degree of familiarity with FORTRAN, COBOL, BASIC and Pascal (but very few opportunities to actually 'use' them on any computer), my first 'serious' programming experience (in late 70s) was in writing an 'operating system', 'word processor' and a crude forerunner of a 'spreadsheet' programme directly in Z80 machine code (none of this sissy assembler stuff :) ), for a machine I had designed and built - a task which (including all the inevitable debugging) took me many weeks, probably months.

The 'operating system' was stored in an EPROM, so (other than for a little 'data storage') at least did not use up any of the RAM. The 'word processor' was much more of a problem, since I initially had only 8k of RAM to play with, and that had to accommodate both the programme and the document being worked on - so I had to keep the programme down to 2k or so!

Kind Regards, John
 
I was rather luckier - at UCL we had an IBM 360 in the computer centre, which was not far from the engineering building, but we had an Interdata Model 5 RJE (remote job entry) terminal in the eng building where we could submit jobs (on punched cards) and collect printouts. Turnaround was only an hour or so.
When was this? I am also talking about UCL, and 'next door to' the Engineering Building. At the time I am talking about (around 1967) things worked out a bit differently to that for us, but perhaps only because our 'Introduction to Computing' was only a once-per-week phenomenon!

Our efforts/attempts were written on 'Coding Forms'. At the end of each (once-per-week) session, those forms would be 'sent down the road' for punching onto cards and running at the (London University) Computer Centre. It could be (and probably was) that they turned things around fairly quickly but, from our point of view, it was a week (until the next 'session') before we saw the printouts!
The most important thing to remember about coding - there is always one more bug!
Given that we've been talking about the importance of precision of statements, I think that is only correct if one regards it as a recursive statement, since the reality is that "there is always more than one more bug" :)

Kind Regards, John
 
Provided only that it is syntactically correct, the compiler will literally and precisely follow the instructions that have been programmed. It's only when the programmer's instructions, when taken literally, result in problems that a compiler might (although very few do) attempt to guess what was actually 'intended' - and, as above, if that ever happens, one couldn't really blame the compiler for 'guessing wrong'.

I have never known a compiler make a 'guess', they just compile your set of instructions, just as you expressed them. A guess involves some level of intelligence and being aware of the alternatives, which a compiler simply doesn't possess.
 
One of the benefits of Python can also be a drawback. Its ease of access and ubiquity means it is possible to program quite complex code simply by cutting and pasting examples from, for example, stackoverflow.com. Coding without having a basic (or should I say rudimentary! :) ) understanding has never been easier!
Indeed - but I think that the same is true of any programming language, since I doubt that you could find any language for which there wasn't an ocean of code 'out there' which could be copied/pasted by people with absolutely no 'understanding'!

Worse, it's not just languages - it also applies to many applications. I see this a lot in relation to Statistical software packages. It is now possible for anyone to get their hands on (sometimes even 'free') packages that have the ability to undertake highly complex mathematical analyses, simply by 'point and click', with no need for any programming at all, so we see a lot of problems arising from people using such software without any understanding of what the software is doing, how appropriate it is (very often not!) and how to interpret the results!

As software applications become increasingly 'accessible'/'user friendly', I'm sure that this problem must be arising in all sorts of fields.

Kind Regards, John
 
I have never known a compiler make a 'guess', they just compile your set of instructions, just as you expressed them. A guess involves some level of intelligence and being aware of the alternatives, which a compiler simply doesn't possess.
As I said, very few do, but I gather that some have tried.

The nearest thing to that which I am aware of, in relation to software I commonly use, is its fairly trivial ability to make assumptions (aka 'guesses') in relation to simple/common typographical errors, but only in situations in which it is essentially impossible that this assumption/guess will result in something that the programmer did not intend. The following extracts from the log illustrate this. Firstly, with the correct code:

592 proc print data = test ; run ;
NOTE: There were 100 observations read from the data set WORK.TEST.

In general, if one types the first word of that code ("proc") incorrectly, the compiler will throw up an error message and abort compilation (of the step):

598 prkk print data = test ;
----
180

ERROR 180-322: Statement is not valid or it is used out of proper order.

However, if one 'gets fairly close' to typing the word correctly, it merely issues a 'warning', indicating its assumption, but continues with compilation (and subsequent execution) on the basis of that assumption:

594 prok print data = test ; run ;
----
14

WARNING 14-169: Assuming the symbol PROC was misspelled as prok.
NOTE: There were 100 observations read from the data set WORK.TEST.

596 prox print data = test ; run ;
----
14

WARNING 14-169: Assuming the symbol PROC was misspelled as prox.
NOTE: There were 100 observations read from the data set WORK.TEST.

That really carries virtually no 'risk', since there is no syntactically-correct meaning that the programmer could possibly have intended other than what is 'assumed' by the compiler.

What some languages do do (with the disapproval of many professionals!) is allow one to configure the compiler to permit all sorts of 'imprecise programming' - which amounts to allowing the compiler to make 'assumptions'. Probably the best-known example is VB.NET, if one specifies "Option Strict Off". That provides compatibility with VB6 etc. and, for non-professionals like myself, enables one to write simpler code much more easily, but it also opens the door to 'programming errors/oversights', which is why many disapprove of it!
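
As a rough analogue in Python (again, just an invented illustration - the Option Strict terminology is VB.NET's, not Python's), Python itself takes the 'strict' approach and simply refuses to guess:

quantity = "2"                   # e.g. read from a text box or a file, so it is a string

try:
    total = quantity + 2         # Python will not silently convert the string - TypeError
except TypeError:
    total = int(quantity) + 2    # the conversion has to be stated explicitly

print(total)                     # 4

A 'permissive' setting such as Option Strict Off would quietly have converted one operand or the other - convenient, but precisely the sort of 'assumption' that can conceal an error or oversight.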

Kind Regards, John
 
