Mihi, at IMHO, writes about measuring the productivity of programming languages:
The main problem is to compare these languages. A common method is counting the “lines of code” a programmer needs for a given program. But who is going to implement the same specification in hundrets [sic] of languages? So to make the test applicable we need a common concept for the comparison.
The common concept that he talks about is function points.
In answer to his rhetorical question, there have been a few studies that asked several volunteers to solve a standard programming problem in the language of their choice. Peter Norvig refers to these in Lisp as an Alternative to Java. First there was Lutz Prechelt’s Comparing Java vs. C/C++ Efficiency Issues to Interpersonal Issues, which he followed with a study that included Perl, Python, Rexx, and Tcl. Finally, Erann Gat collected some numbers for Lisp.
It’s been a while since I’ve paid any attention to what’s going on in the C++ world. I’ve somehow become a Java drone within a multinational semiconductor company. Now, how did that happen? Let your guard down for one minute, and you find yourself a cog in some grim engine of capitalist greed, mindlessly toiling away untold hours in a nondescript cubicle farm… but I digress.
Herb Sutter managed to pull my head out of the sand just long enough for me to notice that Microsoft is working to standardize an extension of C++ for the Common Language Infrastructure (CLI). The new language is called C++/CLI.
From the latest spec:
The goals used in the design of C++/CLI were as follows:
- Provide an elegant and uniform syntax and semantics that give a natural feel for C++ programmers.
- Provide first-class support for CLI features (e.g., properties, events, garbage collection, and generics) for all types including existing Standard C++ classes.
- Provide first-class support for Standard C++ features (e.g., deterministic destruction, templates) for all types including CLI classes.
- Preserve the meaning of existing Standard C++ programs by specifying pure extensions wherever possible.
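To give a flavor of what those goals look like in practice, here’s a small sketch in C++/CLI syntax. The class is hypothetical, and the code only compiles with Microsoft’s C++/CLI toolchain (cl /clr), so treat it as illustrative rather than authoritative:

```cpp
// Illustrative C++/CLI sketch (hypothetical class; requires cl /clr).
ref class Sensor {            // a CLI reference type, garbage-collected
public:
    property int Channel;     // first-class CLI property (trivial form)
    ~Sensor() {}              // destructor: deterministic cleanup still works
};

int main() {
    Sensor^ s = gcnew Sensor; // handle (^) to a GC-managed object
    s->Channel = 3;
    delete s;                 // runs the destructor deterministically,
                              // even though the memory is reclaimed by the GC
}
```

Note how the handle (`^`) and `gcnew` sit alongside familiar C++ destructor semantics, which is exactly the blend of CLI and Standard C++ features the goals describe.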
Plum Hall is keeping a repository of all the public drafts of the spec.
There is some more discussion on gender differences at the Edge. This time, Diane F. Halpern, Alison Gopnik, David Haig, and Nora S. Newcombe chime in.
In an EDN article, Warren Webb reports on a new board from Sheldon Instruments with a TI C6713 in a PC/104 form factor. Ho-hum.
Then I read this:
The board comes with DSP-software libraries that enable system engineers to directly program their DSP systems in Visual Basic or National Instruments (www.ni.com) LabView.
Did I read that right? It is possible to directly program a TI DSP in Visual Basic?
That can’t possibly be right. And if it is, there is something terribly wrong with the world we live in. Who would want to program a DSP in VB? What’s next? A COBOL compiler?
I’ve been reading John Gatto’s The Underground History of American Education. I thought I’d share a couple of excerpts that I found interesting.
No public school in the United States is set up to allow a George Washington to happen. Washingtons in the bud stage are screened, browbeaten, or bribed to conform to a narrow outlook on social truth. Boys like Andrew Carnegie who begged his mother not to send him to school and was well on his way to immortality and fortune at the age of thirteen, would be referred today for psychological counseling; Thomas Edison would find himself in Special Ed until his peculiar genius had been sufficiently tamed. [Link]
Forced schooling was the medicine to bring the whole continental population into conformity with these plans so that it might be regarded as a “human resource” and managed as a “workforce.” No more Ben Franklins or Tom Edisons could be allowed; they set a bad example. One way to manage this was to see to it that individuals were prevented from taking up their working lives until an advanced age when the ardor of youth and its insufferable self-confidence had cooled. [Link]
EDN is reporting that TI has improved floating-point arithmetic performance in the C6722, C6726, and C6727:
The processor-core enhancements include adding floating-point addition to the S unit on each side of the C67x core, so that the processor can execute four floating-point additions per cycle. This doubling of the number of parallel floating-point additions per cycle can boost FFT processing by 20%. In addition to supporting single- and double-precision floating-point operations, the processor core now supports mixed-mode floating-point functions that allow developers to operate on both a single- and double-precision value in the same operation. The C672x DSPs also have twice as many internal registers as C67x DSPs to improve compiler optimizations and reduce the overall number of memory accesses.
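As a back-of-the-envelope sanity check of my own (not TI’s analysis), the claimed 20% FFT boost is consistent with Amdahl’s law if floating-point additions account for roughly a third of the FFT’s cycle budget. The fraction below is an assumption, not a published figure:

```python
# Back-of-the-envelope check of the claimed 20% FFT speedup using
# Amdahl's law. The add_fraction value is an assumption, not a TI number.

def amdahl_speedup(fraction, factor):
    """Overall speedup when `fraction` of the work is sped up by `factor`."""
    return 1.0 / ((1.0 - fraction) + fraction / factor)

add_fraction = 1.0 / 3.0   # assume ~1/3 of FFT cycles are FP additions
speedup = amdahl_speedup(add_fraction, 2.0)  # additions per cycle doubled
print(round(speedup, 2))   # → 1.2, i.e. a 20% boost
```

If additions were a smaller share of the work, doubling their throughput would buy correspondingly less, which is why the doubling only shows up as 20% at the FFT level.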
In a Globe and Mail story, Kirk Makin reports that the Ontario government settled a lawsuit for $63 million over a software project gone horribly wrong. The lawsuit was brought by EDS Canada Ltd and a number of other companies after the government decided to cancel the Integrated Justice Project (IJP), a project meant to move the province’s entire judicial system onto computers, because it had run over budget:
The death knell came in late 2002, when then-provincial-auditor Erik Peters said the original cost estimate of the project had ballooned to more than $350-million from $180-million. He said that even if the IJP were completed, the prospective savings would be no higher than $250-million.
When the project was cancelled it had burned through $200 million. So why didn’t the government continue the project in the hope of recouping the $250 million it expected the project to save? Derek Freeman, a Toronto lawyer with “inside knowledge of the IJP,” provides a clue:
Mr. Freeman recalled that when the project began, the original private-sector partners got carried away and failed to convey how difficult it would be to translate their plans into a system-wide network.
It sounds to me like the private-sector partners failed to convey the difficulty because they were unaware of it themselves.
This is a classic problem in software. Estimates are often off by a factor of two or more because developers, when they first encounter a problem, haven’t yet acquainted themselves with its many intricacies.
The $350 million estimate to which the project ballooned was probably optimistic.