What happened to software? Why is so little creative software being made today (2010)? Are we at the end of software? What are the forces that led to this situation, which looks a bit like a step backward to the epoch of non-programmable accounting machines? Is there a way out? Yes, and a very simple one: make good software. With invention. Developing models and abstractions. It is difficult but absolutely possible. It is this re-start of software that I wish to explore here in this blog. Welcome
Wednesday, November 20, 2013
SEMAT: do we need one more methodology?
Same story again: software is engineering, software development is engineering, so we need theory and practice. Look, we have a new theory and practice. And it works well with the methodology in use today, Agile.
See the previous post, programmers as artisans. When will people stop defining a theory for a practice that is not yet well defined? Perhaps it never will be well defined.
And look, software is a different animal. Stop saying it is engineering.
Also, Major-league SEMAT: Why Should an Executive Care?
Saturday, October 19, 2013
Things I will not bother with next time
An insight on software engineering and methodologies: they are great for standard projects. But standard projects are not creative software. And is there really such a thing as a standard project?
"As for sound engineering principles in general, I think one important trait is to know when the established best-practices apply and when they do not.": Unlearning Software Engineering
Bad software example
Very bad software, with a detailed explanation of the reasons it is so bad: Microsoft Word.
The End of Software? No. (translation of the introduction)
What happened to software? Why is so little created today (2010)? Have we reached the end of software? How did it happen? What were the forces that led to this situation, which looks like a step backward to the era of non-programmable accounting machines?
Is there a way out? Yes, and it is very simple: make good software. As invention. Developing models and abstractions. It is difficult but entirely possible.
It is this re-start of software that I want to explore here in this blog.
Welcome.
The End of ERPs? Or the End of Software?
A Forbes article attacking SAP: For Enterprise IT, Time To Move Beyond SAP.
The attack is not really on SAP, which he perceives as the market leader, but on ERPs in general, including Oracle and others. That is incorrect; he should analyze the ERP market, and then he would have a different perspective. For example, one thing he does not mention is that ERPs are not static; they evolve. There are many niche ERPs, and the big ones, SAP and Oracle, have niche versions. Another aspect of the ERP market is that cloud solutions are appearing, still in their early days, which seem to compete with ERPs but are actually another way of selling the ERP. The market evolves.
Going deeper into the analysis, what he says is that when companies adopt a packaged solution, they lose the creativity that software allows. That is what this blog says: packaged software is no longer software. Programmability is lost, and with it the opportunity to be different. On this he is right. But be careful: in what way can a company differentiate itself and gain competitiveness with Payroll, for example? Perhaps it is better to have a standard ERP, the same as the competitor's, for bureaucratic, repetitive administrative matters, and leave creativity for the core business. That is the point.
Few companies think of IT as a creative opportunity to differentiate themselves from competitors. On the contrary, they try to put an ERP in every possible area of the company. They transfer creativity to, for example, BI, which is an off-the-shelf tool. Perhaps banks use IT creatively, or the credit card companies: they develop their own software and buy little packaged software.
The article mentions SAP's rigidity. It is right. All ERPs are more or less rigid. But even the least rigid are far from open and programmable, which is what would really leave room for creativity.
Does moving to the cloud make a difference? No. The cloud is the infrastructure where the ERP runs. Moving to the cloud changes nothing, just as outsourcing the infrastructure to companies like Tivit, IBM or Diveo changed nothing; outsourcing is simply cheaper. The user does not even notice.
"It's time to get creative again," he says. I agree, but I doubt it. It is not by moving to the cloud that one becomes creative; it is by developing different solutions. I do not see many companies doing that. On the contrary, they move to the cloud because the cloud solution is also an ERP, only cheaper or easier to use. Note: if one company adopts a cloud solution such as Salesforce.com, another can adopt it too; then everyone adopts it, and we fall into the same situation he describes from when everyone adopted ERPs. No creativity, no differentiation.
Sunday, October 6, 2013
Who Are Software Developers?
Interesting survey of Software Developers: "In direct contrast to outdated stereotypes, developers tend to be family men"
Saturday, September 28, 2013
Alan Kay interview
"One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects.
... In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.
So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture."
"Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves."
"that was the big revelation to me when I was in graduate school—when I finally understood that the half page of code on the bottom of page 13 of the Lisp 1.5 manual was Lisp in itself. These were “Maxwell’s Equations of Software!” This is the whole world of programming in a few lines that I can put my hand over."
" I think the style languages [Lisp, APL, SmallTalk] appeal to people who have a certain mathematical laziness to them. Laziness actually pays off later on, because if you wind up spending a little extra time seeing that “oh, yes, this language is going to allow me to do this really, really nicely, and in a more general way than I could do it over here,” usually that comes back to help you when you’ve had a new idea a year down the road. The agglutinative languages, on the other hand, tend to produce agglutinations and they are very, very difficult to untangle when you’ve had that new idea."
"Even if you’re designing for professional programmers, in the end your programming language is basically a user-interface design. You will get much better results regardless of what you’re trying to do if you think of it as a user-interface design."
"Corporate buyers often buy in terms of feature sets. But at PARC our idea was, since you never step in the same river twice, the number-one thing you want to make the user interface be is a learning environment—something that’s explorable in various ways, something that is going to change over the lifetime of the user using this environment. New things are going to come on, and what does it mean for those new things to happen?"
And many more great, strong ideas and concepts in this Alan Kay interview.
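Kay's "Maxwell's Equations of Software" refers to Lisp's evaluator written in Lisp itself. A rough sketch of that idea, in Python rather than Lisp and far simpler than the Lisp 1.5 original, might look like this (my own illustration, not Kay's code):

```python
# A tiny metacircular-style evaluator for a mini-Lisp: the whole
# language fits in a few lines. Lists are applications or special
# forms; strings are variables; everything else is self-evaluating.
def evaluate(expr, env):
    if isinstance(expr, str):                      # variable reference
        return env[expr]
    if not isinstance(expr, list):                 # self-evaluating atom
        return expr
    op, *args = expr
    if op == "quote":
        return args[0]
    if op == "if":
        cond, then, alt = args
        return evaluate(then if evaluate(cond, env) else alt, env)
    if op == "lambda":                             # build a closure
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)                         # function application
    return fn(*[evaluate(a, env) for a in args])

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
print(evaluate([["lambda", ["x"], ["*", "x", "x"]], 5], env))  # 25
```

The point of the sketch is the one Kay makes: a handful of cases covering variables, conditionals, abstraction, and application is already "the whole world of programming."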
Programming without variables
Follow Andrew Koenig's excellent blog, with a series on programming without variables.
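The flavor of "programming without variables" can be hinted at with a small sketch (my own example, not Koenig's): the same computation written with a mutable accumulator, and then as a fold with no assignment at all.

```python
# Computing a sum of squares without mutable variables: a fold
# (functools.reduce) replaces the loop-and-accumulator pattern.
from functools import reduce

# The version we are avoiding:
#   total = 0
#   for x in data: total += x * x

def sum_of_squares(data):
    # No assignment, no mutation: just a reduction over the data.
    return reduce(lambda acc, x: acc + x * x, data, 0)

print(sum_of_squares([1, 2, 3, 4]))  # 30
```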
Development vs. packaged software
Yet another current myth, in Forrester's view, is that custom application development is dead, having fallen out of favor and supplanted by packaged products, according to Ried's report. "Definitely wrong," he wrote. "Enterprises spend about the same on custom-developed business applications as on packaged business software." Packaged applications are accounting for 25.8 percent of software spending while spending on custom software stands at 25.6 percent, according to the report.
"Custom development can be better as long as the business logic isn't subject to legal or tax regulations, such as financial accounting software."
Monday, September 23, 2013
Lucid
Another great tip from my friend Orsoni: Lucid.
" If you have a chance, try googling esoteric programming languages, befunge, brainf**k, unlambda, or thue".
Lucid proposes a completely different model. You look at the flow of data, not the flow of control. It can be considered a "dataflow" language.
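A rough analogy for the dataflow idea in Python (my own sketch; Lucid's actual syntax, such as `n = 0 fby n + 1`, and its semantics differ): values are infinite streams, and a program is a network of transformations wired between streams.

```python
# Dataflow-flavored streams with Python generators: each name denotes
# an infinite stream, and operators transform whole streams.
from itertools import count, islice, accumulate

nats = count(0)                       # the stream 0, 1, 2, 3, ...
running_sum = accumulate(nats)        # the stream 0, 1, 3, 6, 10, ...

print(list(islice(running_sum, 5)))   # [0, 1, 3, 6, 10]
```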
Sunday, July 7, 2013
2 news from my friend Teca
Friday, July 5, 2013
About COBOL
Banks will stick with COBOL because Java has performance issues, claims quality guru Bill Curtis: he shows the real reasons for COBOL's continuity. They have nothing to do with the language; it is the maturation time of the programs, the quality of the programmers, the Web.
Also: Java apps have most flaws, Cobol apps the least, study finds
Sunday, May 19, 2013
More on first programming language
William Cushing's post in the ResearchGate discussion on which programming language to use in a first programming course. Very interesting:
Racket is a variant of Scheme that focuses on teaching.
http://htdp.org/
http://docs.racket-lang.org/drracket/htdp-langs.html
The Structure and Interpretation of Computer Programs (SICP) is also a classic textbook in computer science (which also teaches Scheme).
Teaching someone a language is a serious amount of responsibility. It amounts to teaching how to think. Teaching a low-level machine language will 'straightjacket' that individual into low-level thinking. That could be called inflicting brain damage (eminent computer scientists do say so -- such as Dijkstra), instead of teaching.
It is rare, and only will get rarer as compilers/interpreters/virtual-machines improve, that programmers are needed to produce fast programs nowadays. Optimizing before profiling is the root of project failure. Learning how to think at a higher level, profile, and optimize by hand only when truly necessary is the way to approach the present, and moreso the future.
For example, let's consider a typical domain where C is held to be the "best" language: numerical algorithms. Say Fast Fourier Transform. Please read: http://en.wikipedia.org/wiki/FFTW and http://www.fftw.org/faq/section2.html#languages
In particular note that the Fastest Fourier Transform is implemented in a combination of ML and OCaml. They wrote their own mini-compiler in those languages, that spits out C code better than anyone else's hand-written C code. They pulled that off because ML and OCaml allow them to easily write their own compiler customized to compiling Fast Fourier Transforms.
In 'principle' they could have written C code to manipulate C code; but in reality such a project would have failed. Likewise, in 'principle' we could implement an OS in assembly: but the result would be horribly poor in features, if it worked at all. From such considerations it should become clear: the path to the future is higher level languages. C is the new assembly (and has been for some time). Learn it only for the purpose of compiling to it (or because you have a burning desire to implement device drivers for the rest of your life), and learn it only after learning one of the many superior higher level languages out there.
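The FFTW point (write a small program that writes the fast program) can be caricatured in a few lines. This toy generator is my own illustration of the technique, not FFTW's actual genfft:

```python
# A miniature program generator: emit fully unrolled, specialized
# source code for a fixed problem size, then compile and return it.
def gen_dot(n):
    """Generate a dot-product function specialized to vectors of length n."""
    body = " + ".join(f"a[{i}] * b[{i}]" for i in range(n))
    src = f"def dot{n}(a, b):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)              # compile the generated source
    return namespace[f"dot{n}"]

dot3 = gen_dot(3)
print(dot3([1, 2, 3], [4, 5, 6]))     # 32
```

The generated function has no loop at all; the loop was run at generation time. That is the shape of the argument: a high-level language that manipulates programs easily makes this style practical.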
It is important to learn a higher level language first (ML, OCaml, Python, Ruby, Scala, Clojure, Scheme, Racket, Common LISP, ...) so that you do (less) harm to your ability to think clearly and concisely. Of these languages, Scheme has probably the best support for teaching, and one of the purest approaches; it may not be the language you use on a daily basis for the rest of your life, but it is an ideal language to have learned first.
And my answer, below:
I think there is a prejudice against low-level languages. If you think about it, Assembly or machine-level language is not "low level". It is very detailed, indeed. But a bit of syntactic sugar plus expression parsing can take care of the register-to-memory and memory-to-register operations. There are languages, like the old PL360, which do that: you can write complete expressions and the language translates them to machine language.
So, the point is not being low level, nor being a machine language. We had high-level machine languages, like Algol for the Burroughs machines, or Lisp for the Symbolics machines.
The point is that by learning machine language we learn a Universal Machine model: the von Neumann model. This experience is enlightening. The student understands that data and program have the same representation and are stored in the same memory. The student grasps what a compiler has to do to translate his "high level program".
Lisp, Scheme or Racket, as mentioned in William Cushing's post, and used in SICP, teach yet another Universal Machine model: the lambda calculus.
With the Universal Machine concepts understood, the student can then go to "standard" programming languages like C, Java, etc...
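That compiler-and-machine insight can be sketched concretely (my own illustrative example, not from the discussion): translate a "high level" expression into machine-like instructions, then run them on a tiny stack machine.

```python
# Compile nested-tuple expressions, e.g. ("+", 2, ("*", 3, 4)),
# into a flat instruction list, then interpret that list.
def compile_expr(e):
    if isinstance(e, tuple):
        op, lhs, rhs = e
        return compile_expr(lhs) + compile_expr(rhs) + [("OP", op)]
    return [("PUSH", e)]              # literal operand

def run(program):
    stack = []
    for kind, arg in program:
        if kind == "PUSH":
            stack.append(arg)
        else:                         # only "+" and "*" in this sketch
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if arg == "+" else a * b)
    return stack[0]

code = compile_expr(("+", 2, ("*", 3, 4)))
print(code)      # [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('OP', '*'), ('OP', '+')]
print(run(code)) # 14
```

Seeing the instruction list makes the student's job visible: the structure of the expression is gone, and only the machine's linear program remains.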
Saturday, May 18, 2013
5 pioneering paths for software development's new frontier
Beware of the complexity in the development environment. Question: how to keep it simple? And efficient.
Thursday, May 16, 2013
How to write bad software
Culture of the enterprise, lots of money invested, no strong drive, etc... see: "I Contribute to the Windows Kernel. We Are Slower Than Other Operating Systems. Here Is Why."
Saturday, April 20, 2013
Types in programming languages
An Introduction To Programming Type Systems makes me think that so-called high-level languages can be thought of as introducing a new kind of universal machine. A new model. Compare this with assembly languages: the von Neumann machine, where everything is a number. Or a set of bits. Or a representation in a code. It doesn't matter from the viewpoint of storage; what we do with a value is determined by the instruction. The process. Compare again with strictly functional languages: everything is a lambda expression. Even numbers. Different models.
Feeling: richer models complicate the very simple and effective models of universal machines like Turing's, von Neumann's, or the lambda calculus.
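The claim that in the lambda calculus "everything is a lambda expression, even numbers" can be made concrete with Church numerals. A small sketch in Python's lambda notation (my own example):

```python
# Church numerals: the number n is encoded as "apply f, n times".
zero = lambda f: lambda x: x
succ = lambda n: (lambda f: lambda x: f(n(f)(x)))
add  = lambda m: lambda n: (lambda f: lambda x: m(f)(n(f)(x)))

def to_int(n):
    """Decode a Church numeral by applying '+1' n times to 0."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))        # 5
```

Nothing here is a number in the machine sense until `to_int` decodes it; arithmetic is pure function application, which is exactly the "different model" the post describes.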
Friday, April 19, 2013
Spreadsheet programming problems
Mishaps of using conventional spreadsheets for serious data analysis.
However, look at this post showing that Excel (or any spreadsheet) can be used successfully for an enormous variety of applications and transformed into a powerful universal tool: Microsoft Excel is Everywhere
Sunday, April 14, 2013
Computer Science
Excellent post on Computer Science: Why Computer Science Matters. Defined as “The analysis of algorithms and processes”. So, it is not only about computers.
It is very very young. It is the future. "The future breakthroughs in science and technology will not be the new “social network”, or “web app”, it will be the insight that we get through the study of the theories discussed in Computer Science."
Saturday, April 13, 2013
Light ERP, in the cloud, configurable?
Is it really configurable? Let's have a look at NetSuite, a small-size ERP that competes with the big ones; Larry Ellison invested in it.
The Vietnam war of IT
It is the object vs. relational model mismatch. Hard to solve. Perhaps both models were too restrictive?
See also The Filesystem Test. What about having a simple model and building upon it?
To think about a different model: a universal machine. Turing? Von Neumann? Lambda calculus?
Friday, March 8, 2013
Answer to ResearchGate question:
I would bet on a mix of SML, Racket, Ruby and finally Haskell.
SML gives you notions like:
- Benefits of no mutation
- Algebraic datatypes, pattern matching
- Higher-order functions; closures
- Lexical scope
- Currying
- Syntactic sugar
- Parametric polymorphism and container types
- Type inference
- Abstract types
However, Racket shows interesting things like:
- Dynamic vs. static typing
- Laziness and streams
- Implementing languages, especially higher-order functions
- Macros
- Abstract types via dynamic type-creation
- Reflection
On the other hand, for the OOP, Ruby would be great showing:
- Dynamic dispatch
- Pure object-orientation
- Multiple inheritance, interfaces, and mixins
- OO vs. functional decomposition and extensibility
- Subtyping for records, functions, and objects
- Class-based subtyping
- Subtyping
- Subtyping vs. parametric polymorphism; bounded polymorphism
- Reflection in OOP
And after all this, I would introduce the laziness of Haskell and go deeper into all the Haskell stuff.
Of course, this is just one (of many) ways of learning programming-language techniques.
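Some of the notions listed above for SML, such as higher-order functions, closures, and currying, can be glimpsed even in Python. A tiny illustrative sketch of those three:

```python
# Three notions from the SML list, in Python.
def compose(f, g):
    """Higher-order: takes functions, returns a function."""
    return lambda x: f(g(x))

def adder(n):
    """Closure: the returned lambda captures n from this scope."""
    return lambda x: x + n

def curried_mul(a):
    """Currying: one argument at a time instead of mul(a, b)."""
    return lambda b: a * b

inc_then_double = compose(curried_mul(2), adder(1))
print(inc_then_double(4))             # (4 + 1) * 2 = 10
```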
-----------------------
See also the classical: Teach Yourself Programming in Ten Years
Saturday, February 16, 2013
Spreadsheet as a model
This blog points to the fact that spreadsheets are an excellent model for business. To be developed...
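One way the spreadsheet model might be "developed", as a hedged sketch of my own: cells are either constants or formulas over other cells, recomputed on demand, so changing one input changes every dependent result.

```python
# A minimal spreadsheet model: a cell holds a constant, or a formula
# (a function of the sheet); reading a formula cell recomputes it.
class Sheet:
    def __init__(self):
        self.cells = {}

    def set(self, name, value):
        """value is a constant, or a function of the sheet (a formula)."""
        self.cells[name] = value

    def get(self, name):
        v = self.cells[name]
        return v(self) if callable(v) else v

s = Sheet()
s.set("A1", 10)
s.set("A2", 32)
s.set("A3", lambda sh: sh.get("A1") + sh.get("A2"))  # like =A1+A2
print(s.get("A3"))                    # 42
s.set("A1", 100)
print(s.get("A3"))                    # 132
```

A real spreadsheet adds dependency tracking and caching, but even this skeleton shows the appeal of the model: declarative formulas over named values, with recalculation for free.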
Wednesday, February 13, 2013
Question: Which first language... best answer
@ Carlos Bazilio (as far as I remember you asked “Which first language do you think is the best for learning programming techniques?”):
Start with hypothesis: “Programming is hard. It's the process of telling a bunch of transistors to do something, where that something may be very clear to us fuzzy humans, with all our built-in pattern matching, language processing, and existing knowledge, but really, horrifically, tediously difficult to communicate to a bunch of dumb transistors. (Dethe Elza)”
Then explain to your students next:
1. Programming is art. (“The Art of Computer Programming”, Donald Knuth)
2. Programming is a strict way to solve problems using a "dumb" machine. ("Algorithms + Data Structures = Programs", Niklaus Wirth)
3. To do that you have to spend a lot of time learning. ("Algorithm = Logic + Control", Robert Kowalski)
4. At the end: “Follow your heart, and your curricula, and your primary goal in teaching.” No shortcuts are allowed, and use any programming language you like.
(from https://www.researchgate.net)
Friday, February 8, 2013
How to Write Code That Embraces Change
Good advice to OOP users. Can't we do it in a simpler manner with functional programming?
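As a hedged sketch of that question (my own toy example): in a functional style, the cases are plain data, and embracing a new requirement is often just one new function, with no class hierarchy to modify.

```python
# Shapes as plain data; operations as standalone functions.
import math

def area(shape):
    if shape[0] == "circle":
        _, r = shape
        return math.pi * r * r
    _, w, h = shape                   # ("rect", w, h)
    return w * h

def perimeter(shape):                 # a new operation: one new function
    if shape[0] == "circle":
        _, r = shape
        return 2 * math.pi * r
    _, w, h = shape
    return 2 * (w + h)

print(area(("rect", 3.0, 4.0)))       # 12.0
```

The trade-off cuts both ways: adding a new operation is easy here, while adding a new shape touches every function; OOP inverts that. Which is "simpler" depends on which axis of change the system actually faces.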
Saturday, February 2, 2013
Top 10 Algorithms
The Best of the 20th Century: Editors Name Top 10 Algorithms (SIAM News, Volume 33, Number 4)
Sunday, January 27, 2013
Universal Machines History
The Universal Computer: The Road from Leibniz to Turing: a review of Martin Davis's excellent book, which could be the basis for a computer history course.
Software Engineering ≠ Computer Science
Software engineering is different from computer science, by Chuck Connell. Several ideas: methods in engineering are approximate, but physics formulae are exact; they are mathematical formulae. And software is mathematics. There are no approximations, so no engineering. This is a fundamental quality of software. There is no "usually" or "in general".
The author said: "The line is the property 'directly involves human activity'. Software engineering has this property, while traditional computer science does not. The results from disciplines below the line might be used by people, but their results are not directly affected by people." This is not so: if the algorithm is correct, the software is maintainable and safe.
I agree that "Software engineering will never be a rigorous discipline with proven results, because it involves human activity." Agreed for the requirements, not for the algorithm.
I also agree that "If some area of software engineering is solved rigorously, you can just redefine software engineering not to include that problem." So, a good practice is to use more good software.
And "We should stop trying to prove fundamental results in software engineering and accept that the significant advances in this domain will be general guidelines."
See also Is Software Engineering Engineering? by Peter J. Denning and Richard D. Riehle (March 2009, vol. 52, no. 3, Communications of the ACM). Comments: software is the opposite of engineering. Problems have to be formulated in mathematical terms in order for the solution to be perfect. And abstraction, which is difficult, solves the complexity.
See also Software Developers' Views of End-Users and Project Success.
Friday, January 18, 2013
Open Source ideas
An anthropologist view of Open Source: Want to understand open source? Live with its developers
How open source is driving the future of cloud computing
And... MOOCs, which are open education: A lesson from 2012: Open education brings power of knowledge to the masses and MOOC Skepticism Persists Among University Presidents, Despite Rapid Growth Of Online Courses In 2012