Computer science got worse when “computer science” began to be taught in universities?

From the translator: Alan Kay is, without exaggeration, "our everything" in computer science, he is also known for his tough stance on many issues related to development. I decided to translate these few paragraphs because the trends that he outlined for teaching programming are reflected, like in a distorting mirror everywhere in the world. Each reader himself will find parallels. The purpose of the translation is to discuss these issues.






This is an interesting question! My first reaction [to the question in the title] was “Absolutely not, quite the opposite” ... but this reaction is clouded by old experiences from the 60s. This is because - step by step - the best "real computer science" * in the 60s was in major universities around the world (e.g. UK: Cambridge, Manchester, Imperial, Edinburgh, etc .; USA: Pennsylvania , MIT, Princeton, CMU, Illinois, Stanford, Berkeley, UCLA, etc.; as well as major European universities: ETH, Eindhoven, etc.).





(*) Based on Will Rasen's comment below, this is how we thought of "computer science" in the 60s, when the term was formulated as an aspiration and a question, not a ready-made business.



Science is an attempt to discover and collect phenomena to explain them by creating models (theories) of some kind that produce similar phenomena, and to do so in ways that try to bypass the weaknesses of our senses and thinking abilities.



So, if such a bridge is built, it produces phenomena, and it can be studied, modeled and better understood. Those. it could be "the science of bridges" (and "structures in general"). In general, this creates the "Science of the Artificial", ie. science that arise around the artifacts that create the animals, we mainly (see the book of Herb Simon's "science of the artificial." - winner of the Turing Award and the Nobel Prize, as well as one of the Turing Award founder Alan Perlis)



In the "science of bridges" wonderful that a deeper understanding and better models of "bridging", in turn, can be used to design and manufacture better bridges that have properties of their own that need to be studied ...



The Science of Artifacts is a delightful art and activity for those who love and are called to an upward adventure of understanding, leading to creation, leading to understanding, leading to ...



Most sciences - whether about nature or artifacts - will use mathematics of some kind species - often re-invented - to aid in the modeling process. As in physics, this should not be confused with the scientific side of things.



When Alan Perlis was asked what “the science of computing” could mean, he replied that it was “the science of processes; all processes ". He might just as well have said, “the science of systems; all systems ”(he would have meant the same answer).



This is the recognition that algorithms, etc., are a tiny part of what computation is. Computing is really about understanding, inventing and building systems. As in many cases in the science of the past, when the existing mathematics does not cope with this task, it is necessary to invent new mathematics. In this case, one of the needs for new ways of understanding what is happening is associated with the available degrees of freedom and the addition of the dimension of time.



The degrees of freedom and the degree of dynamic relationships in the desired artifacts usually mean that they need to be debugged, not proven. (And there are parts of math where the proofs are of the same quality - all proofs must be debugged; some proofs actually require simulating them on a computer to debug them.)



Some of the early pioneers realized that the computer is "meta" in the sense that it can be an excellent tool for modeling self-representations, so that much of the new mathematics needed can be "extracted" from the "process space" itself. Many computer "theories" are models of processes, written as running systems, that can be debugged and explored. (We are sometimes asked how Xerox Parc could have been so resourceful and productive in the 70s with only a few dozen computer scientists. One answer lies in the above. We thought in terms of process systems, created models of them, and ran those models on computer architectures. that we have invented and built. I would call what we have done a virtuous spiritual spiral of "computer science" in understanding things,







, , « », , 1980 , - MIT, CMU, Stanford . . - , 4000 .



: « ? 4000 - . , « »?



, 60- : - , , . - « » ("Imposters In The Temple").



"" , .



«-» « », , . (, ), , .



, , CS - - ( , - «» ) Java . , , « » « ».





, .



- - , , .



, « » (, F = ma, ).



, «» Google, , , . .



.



( 60- 70- , , , , ...)



, «» -, . «» - « » - 17 , «».



NSF ( ). . .



( ): , «» . , ( , « », ). , , «»: , - , , , .



From the point of view of a person from a hazy past, this is truly a shame.








All Articles