Artificial intelligence is an unnatural version of the real thing. There has been much talk lately of artificial intelligence and the threat it may present to us, but as far as I can see it presents no threat whatsoever. Artificial intelligence is a misnomer anyway; although it is undoubtedly artificial, there is nothing remotely intelligent about any computer software I have ever come across. Whether it is attached to a robot or to something like the program that corrects the spelling in this blog, it is highly unintelligent. Robots are good at doing simple repetitive jobs, like filling boxes, but no one would ever claim that this involves any intelligence at all. Even a simple job like polishing my shoes is a mind-bogglingly complex task. It involves finding the polish in the first place, making sure it is the right colour, ensuring any spots are removed from the leather and polishing the shoes to a suitable sheen. A robot might one day manage it, but it would still not be intelligent in any meaningful sense of the word.
There is obviously much further to go along the road before we can apply even the term comprehension to any computer program, let alone intelligence. The auto-correct facility that tries to put right spelling mistakes is pretty good at spotting the sort of typos I might make by misplacing a single letter in a long word, but with short words it always gets it wrong. Where a human being would immediately see the correct word from the context, it makes ridiculous assumptions based merely on the nearest spelling.
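To illustrate what I mean by correcting on the nearest spelling alone, here is a toy sketch in Python; the little dictionary and the helper `naive_correct` are my own invention, not any real auto-correct program. A human reading "a wird of advice" would pick "word" from the context, but a purely nearest-spelling approach happily picks "wired", simply because it is the closest match letter-for-letter.

```python
import difflib

# A tiny invented word list; a real auto-correct would use a far larger one.
dictionary = ["word", "ward", "bird", "wired"]

def naive_correct(typo):
    # Pick whichever dictionary word is closest in spelling,
    # with no regard whatsoever for the surrounding sentence.
    matches = difflib.get_close_matches(typo, dictionary, n=1, cutoff=0.0)
    return matches[0]

# In "a wird of advice", a human sees "word"; nearest spelling says otherwise.
print(naive_correct("wird"))  # → wired
```

The point is not that the code is wrong, but that spelling similarity is all it has to go on; without context, short words defeat it every time.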
What I am extremely dubious about is any originality of thought ever coming from a computer. Even in purely scientific terms, take for example the ‘Higgs boson’; the existence of this sub-atomic particle was postulated in 1964 by the eponymous researcher, acting on what I can only call intuition. It was only proved in recent years by the use of the CERN particle accelerator. I do not claim to have any knowledge of this abstruse branch of physics, but I do not see artificial intelligence coming up with similar revelations any time soon. Perhaps artificial intuition might. Because science can eventually be demonstrated by number crunching, it may be theoretically possible to accomplish anything scientific without human input. With the arts, however, I see no prospect of computers coming up with any truly innovative results. A computer may appear to be creative, but it will always be merely a mechanistic device. The original insights of a writer or an artist, or even of a musician (although music relies on mathematics to a large extent), I do not see as being within the grasp of artificial intelligence.
In the arts I see no prospect of computers supplanting human beings. I can contemplate the imagination being imitated by a non-imaginative machine, but I think the results would be bizarre rather than valid. As I said earlier, we are still far from even basic tasks being accomplished by artificial intelligence. I do not foresee even my grandchildren’s grandchildren needing to be unduly concerned by the threat of artificial intelligence. On the other hand, if advances eventually take care of all spelling errors, so much the better. If artificial intelligence can answer all the questions thrown up by molecular physics I would be amazed but not concerned. But if artificial intelligence could explain the value of a Shakespeare sonnet or a Keats ode I would be flabbergasted. For one thing, in such an ‘intelligent’ discussion there is no such thing as a ‘right’ answer, although there is an almost infinite number of ‘wrong’ ones. Or how about sarcasm or irony, where you say the opposite of what you mean; try getting a computer to understand that!
THE STORY OF MODERN TECHNOLOGY
I bought my first computer in 1997; it was an Apple Mac. It was good for its time, but woefully slow and generally inadequate by modern standards. Nonetheless it wasn’t cheap, and with a printer the whole set-up cost well over £2,000. PC World had recently opened in Norwich and we got our equipment from there. It had a hard disk of less than 500 megabytes, which I was assured was ample for anything I might require. It wasn’t that many years since the first Mac was available with just a floppy disk drive for storage; no hard disk at all! I connected to the internet after two or three months of playing games on my new toy, in October of ’97. This was a fairly advanced thing to do… in fact there were so few users of the brave new system that my son Peter and I discovered it was quite easy to win on-line competitions because there were so few entries! Peter was 10 years old; there were under 70 million internet users worldwide, most of them in the USA.
I entered into computing with enthusiasm. One thing I taught myself was HTML 4, the markup language in which web pages were written. I don’t think an awful lot of people are familiar with HTML (Hypertext Mark-up Language) coding. It isn’t as hard as it sounds, but anybody who builds web pages today uses a dedicated program to do the donkey work. The pages I built were pretty basic by modern standards, but I made websites for the Versator binocular magnifier (which I was still making at the time) and about Spixworth, the subject of a book I had just published. How different it all was from the position today; now building a page for my WordPress blog is simplicity itself. Although I use the retro theme that makes my page look rather ancient (it reminds me of the early days), it would be no harder to have a very up-to-date style. It is still possible to see the underlying code of a web page, although it is not as easy to find as it used to be.
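For anyone curious, a hand-coded page of the sort I was writing back then might have looked something like this. The wording and file name are invented for illustration, though the Versator magnifier and Spixworth book were real subjects of mine:

```html
<!-- A hand-coded page in the old HTML 4 style; the text is invented for illustration -->
<html>
<head>
<title>The Versator Binocular Magnifier</title>
</head>
<body>
<h1>The Versator Binocular Magnifier</h1>
<p>Welcome to my home page.</p>
<p><a href="spixworth.html">My book about Spixworth</a></p>
</body>
</html>
```

Every heading, paragraph and link had to be marked up by hand like that; viewing the source of a modern page shows just how much the dedicated programs now do for you.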
Email was not particularly useful to me back then because so few of my acquaintances had a computer, and virtually none had an internet connection. Even most businesses still used fax rather than email. It all changed rapidly, of course; everything except the painfully slow dial-up connection. That took a long time to replace, thanks to the tardiness of BT in introducing broadband. The dialling tone followed by the sound of fighting cats is something I don’t miss at all. It was several years before BT got round to installing broadband, and even then we used a rather unreliable BT wired modem. When I said to my wife that we would soon all be using laptops and wireless internet, she thought such things would be far beyond our pockets. Now we have laptops and tablets and smartphones, and a desktop computer is thoroughly old hat. The price of computers has dived and their power has soared.
Back in the early 70s the pocket calculator was the height of electronic sophistication; they were not cheap, and the Sinclair Cambridge that my father bought me cost £30, which was around the average weekly wage at the time. (Now they are even given away free as advertising.) My father was enthralled by these little gadgets, which he called ‘computers’. How he would have revelled in the real computer age. Real computers were still huge (and hugely expensive) machines, the size of a small bungalow. Data was input on punched cards and was entirely numerical. Output was also principally numerical and came via a printed paper tape. Keyboards were still restricted to typewriters, and screens to television sets.
I used the computer at Norwich Technical College as part of my business studies course, but only for about five minutes; computer time was too valuable for me to spend any longer. We played a game which involved landing a spaceship on the moon, or so they told me. But as I said, all the results were in arithmetical form, so the outcome of my lunar touchdown could have been anything as far as I was concerned. Anyway, they told me that I had crashed the landing craft. At that time Steve Jobs and Bill Gates were still schoolboys, and digital meant using your fingers.
The phenomenal growth of electronic communication has been the result of Tim Berners-Lee’s introduction of the World Wide Web a mere 25 years ago. In my earliest memories we were an unusual family in having a phone; for most people the way to get in touch with someone not in your immediate vicinity was to write a letter – by hand, of course!