Discussion:
Technologies to teach in undergrad CS
David Alex Lamb
2006-02-15 03:17:09 UTC
Permalink
What technologies should we teach in undergraduate software engineering (or
those portions of computing science that overlap with software engineering)?
(not "subjects" but "technologies").

I expect a few to answer "none but our favourite programming language(s)"; I'm
slightly sympathetic to the "teaching technology doesn't belong in academia"
viewpoint but a lot less than most academics I talk to.

In the past I've tried to squeeze in some configuration management (at the
ancient RCS/make level) but not a whole lot else (there is often no room). A
few years ago I introduced UML into an introductory software engineering/
software architecture course. I can't at the moment recall any others. The
database courses usually do a little some-DBMS-or-other (DB2 in our case, last
time I looked), plus SQL. The graphics course uses OGL. I've been on medical
leave for several years, so I have probably missed a few others.

I'm not particularly interested in covering all, or even most, of the current
buzzwords -- just things that introduce ways of thinking that are likely
"fundamental" in the sense of being reasonably enduring and giving
interesting/different ways of looking at, and solving, problems.

I haven't learned XML and the associated tools yet, so that's one I wonder
about. I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.

Any others? or opinions on what I already listed?
--
"Yo' ideas need to be thinked befo' they are say'd" - Ian Lamb, age 3.5
http://www.cs.queensu.ca/~dalamb/ qucis->cs to reply (it's a long story...)
Ed Wegner
2006-02-15 04:34:06 UTC
Permalink
Post by David Alex Lamb
What technologies should we teach in undergraduate software engineering (or
those portions of computing science that overlap with software engineering)?
(not "subjects" but "technologies").
Excel (or its OpenOffice equivalent): myriad uses - e.g. for project
estimating, project planning and monitoring, trend analysis of bugs,
comparison of algorithm attributes, etc.
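Spreadsheet-friendly data is also easy to generate from code. As a sketch in Ruby (the scripting language ranked first below), here is a hypothetical weekly bug-trend table written as CSV, ready to open in Excel or OpenOffice; the week labels and counts are invented for illustration:

```ruby
require 'csv'

# Hypothetical weekly open-bug counts; in a real course these would
# come from the class bug tracker.
trend = {
  "2006-W05" => 12,
  "2006-W06" => 17,
  "2006-W07" => 9,
}

CSV.open("bug_trend.csv", "w") do |csv|
  csv << ["week", "open_bugs"]  # header row
  trend.each { |week, count| csv << [week, count] }
end
```

Opening bug_trend.csv in a spreadsheet then gives students the charting and trend-line machinery for free.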

Source Code Version Control: CVS or SVN - require the students to use it on
both individual and group projects.

Configuration Management for baselining consistency / compatibility of
Requirements, Testware, and Implementation

Bug tracking

A scripting language (any will do, but Perl and Ruby would be choices 2
and 1): software engineering is still one of the few professions (perhaps
the only one) where the practitioners make their own tools.
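To make the tool-building point concrete, here is a minimal sketch of such a self-made tool in Ruby; the script name, the `todo_counts` helper, and the marker convention are illustrative, not a standard utility:

```ruby
# todo_count.rb -- the kind of throwaway tool practitioners write for
# themselves: count TODO/FIXME markers in each named source file.

def todo_counts(paths)
  counts = Hash.new(0)
  paths.each do |path|
    File.foreach(path) do |line|
      counts[path] += 1 if line =~ /\b(TODO|FIXME)\b/
    end
  end
  counts
end

if __FILE__ == $PROGRAM_NAME
  # Usage: ruby todo_count.rb lib/*.rb src/*.c
  todo_counts(ARGV).sort_by { |_, n| -n }.each do |path, n|
    printf("%4d  %s\n", n, path)
  end
end
```

A dozen lines, yet it is a real tool: students see that scripting turns a vague chore ("how much cleanup debt do we have?") into a one-command answer.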

Finally, one that might be a "subject" - or even worse, a lowly "skill"
rather than a "technology", but you should include it, anyway.

activities that require reading and understanding lots of code written
by others (mostly good code, but some bad code to fix up).

<snip>

Ed.
Phlip
2006-02-15 05:33:43 UTC
Permalink
Post by David Alex Lamb
I'm slightly sympathetic to the "teaching technology doesn't belong in academia"
viewpoint but a lot less than most academics I talk to.
I love it when Robert C. Martin busts on graduates with CS degrees who never
wrote a line of code. They remind me of a certain local retail sales
specialist who likes to tease customers "I just do this because I have a
degree in philosophy."

What did you expect to do? Graduate and get a job in the philosophy
department of a major corporation??
Post by David Alex Lamb
I haven't learned XML and the associated tools yet, so that's one I wonder
about. I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.
How about using XPath to test web pages written as pure XHTML? No more
relying on browser forgiveness!
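One possible shape of such a test, sketched in Ruby with its bundled REXML library; the page fragment and XPath queries are invented for illustration, and the XHTML namespace is omitted to keep the expressions simple:

```ruby
require 'rexml/document'

# A fragment of a hypothetical XHTML page under test.
page = REXML::Document.new(<<~XHTML)
  <html>
    <body>
      <h1>Order Status</h1>
      <table id="orders">
        <tr><td>widget</td><td>shipped</td></tr>
      </table>
    </body>
  </html>
XHTML

# Because the page is well-formed XML, a test can address any element
# precisely with XPath instead of scraping browser-forgiven tag soup.
status = REXML::XPath.first(page, "//table[@id='orders']/tr/td[2]")
raise "order not shipped" unless status.text == "shipped"
```

The same technique extends to asserting structure: every form has a submit button, every page has exactly one h1, and so on.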
--
Phlip
http://www.greencheese.org/ZeekLand <-- NOT a blog!!!
James
2006-02-24 08:25:08 UTC
Permalink
Post by Phlip
Post by David Alex Lamb
I'm slightly sympathetic to the "teaching technology doesn't belong in academia"
It's not a case of technologies belonging in academia or otherwise. The
academy's job generally is to teach flexible, portable skills like
problem solving, writing style, research skills, etc. and not
necessarily to prepare students for industry.

In Computer Science specifically, the skills revolve around the
computational process, including computational analysis but also the
ability to describe and elaborate on existing techniques of computation.
Personally, I feel that if I have a good grasp of computational
processes, the principles of which have remained relatively static for
fifty years, some of which were established by Charles Babbage and Ada
Byron (Lady Lovelace) in the 19th century, then the actual
technologies are merely a matter of establishing and upholding
standards. Admittedly, this is lots of work but of a fairly mundane
sort. Technologies generally come and go with more frequency than any
scientist would, or should, feel comfortable with and the method of
consensus by which they emerge has no parallel in the scientific method.

Thus the question of which technologies *should* be taught in CS101 is
already decided with this outlook. In short, the answer is it doesn't
matter as long as these broader skills are adequately addressed. This
is virtually impossible to do, or at least, it is impossible to
investigate computation in any way that deserves the title 'Science',
without actually typing code, running it and seeing what it does.

On the other hand, the question that was actually asked, I believe, was
what technologies *are* being used. In my case, the University of
Queensland, like a good couple of dozen Universities around the globe,
has adopted MIT's choice of Scheme (a dialect of Lisp) for the
introductory CS course. In fact we also use the same textbook as well
as MIT's online tutorial material.
Post by Phlip
What did you expect to do? Graduate and get a job in the philosophy
department of a major corporation??
2 cents
Philosophy as a discipline is an enabling condition for all academic
enterprise. That's why the pinnacle of academic achievement is a Doctor
of Philosophy. The implication being that whatever your area of study,
if you take it to its extreme and actually create new knowledge, you
will touch some area of philosophical thought and you will need to read
some philosophy. Perhaps if major corporations had a philosophy
department, things like corporate ethics might be taken a little more
seriously.
Post by Phlip
Post by David Alex Lamb
I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.
I think these still fall into the purview of CS skills although there is
overlap into the area of 'training' (as opposed to 'education').
Like most of the suggestions I have read so far, they also constitute
advanced topics in the software process and not introductory courses.
J
David Lightstone
2006-02-15 11:18:06 UTC
Permalink
Post by David Alex Lamb
What technologies should we teach in undergraduate software engineering (or
those portions of computing science that overlap with software
engineering)?
(not "subjects" but "technologies").
Topic 1
A refresher course on - How to do a research paper. Covering
(1) How to collect citations
(2) How to construct an outline
(3) How to allocate citations to the outline
(4) How to write prose

When I first learned the skill (circa 9th or 10th grade) the technology was
4x6 index cards, indexed tabs and a metal box.

You may think that the response is flippant, but specialize the outline to
the present "approved" means for organizing the structure of a system, the
citations to wishes and wants of your favorite manager, and you have a
simple means for preparing a requirements specification (which probably, in
one form or another, has been used for over 4000 years).
Post by David Alex Lamb
I expect a few to answer "none but our favourite programming language(s)"; I'm
slightly sympathetic to the "teaching technology doesn't belong in academia"
viewpoint but a lot less than most academics I talk to.
In the past I've tried to squeeze in some configuration management (at the
ancient RCS/make level) but not a whole lot else (there is often no room). A
few years ago I introduced UML into an introductory software engineering/
software architecture course. I can't at the moment recall any others. The
database courses usually do a little some-DBMS-or-other (DB2 in our case, last
time I looked), plus SQL. The graphics course uses OGL. I've been on medical
leave for several years, so I have probably missed a few others.
I'm not particularly interested in covering all, or even most, of the current
buzzwords -- just things that introduce ways of thinking that are likely
"fundamental" in the sense of being reasonably enduring and giving
interesting/different ways of looking at, and solving, problems.
I haven't learned XML and the associated tools yet, so that's one I wonder
about. I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.
Any others? or opinions on what I already listed?
--
"Yo' ideas need to be thinked befo' they are say'd" - Ian Lamb, age 3.5
http://www.cs.queensu.ca/~dalamb/ qucis->cs to reply (it's a long story...)
H. S. Lahman
2006-02-15 18:55:04 UTC
Permalink
Responding to Lamb...
Post by David Alex Lamb
What technologies should we teach in undergraduate software engineering (or
those portions of computing science that overlap with software engineering)?
(not "subjects" but "technologies").
IMO, the most important "technology" by far is process engineering. No
matter what technologies, tools, methodologies, or whatever one actually
has at hand in a particular development environment, they have to be
glued together in a development process somehow so that they play together.

Second on my list would be process improvement. New technologies,
tools, and whatnot show up with alarming regularity. If they are to be
integrated effectively into existing development environments, the shop
has to have some sort of process improvement discipline.

Third on my list would be SQA, specifically defect prevention. The
industry is facing a paradigm shift where the old quality-by-testing
view will have to be replaced with a more process-oriented view. [Just
as what happened in the '80s in manufacturing when the PacRim started
providing products that didn't break nearly as often. Customers are
finally beginning to notice that the only thing that breaks in their
lives nowadays is software.]

One could argue none of these are technologies. But they are at least
techniques and they all have very well defined alternative disciplines
(e.g., the alphabet soup for process frameworks: CMM, ISO, etc.) that
are "teachable".
Post by David Alex Lamb
I expect a few to answer "none but our favourite programming language(s)"; I'm
slightly sympathetic to the "teaching technology doesn't belong in academia"
viewpoint but a lot less than most academics I talk to.
In the past I've tried to squeeze in some configuration management (at the
ancient RCS/make level) but not a whole lot else (there is often no room). A
few years ago I introduced UML into an introductory software engineering/
software architecture course. I can't at the moment recall any others. The
database courses usually do a little some-DBMS-or-other (DB2 in our case, last
time I looked), plus SQL. The graphics course uses OGL. I've been on medical
leave for several years, so I have probably missed a few others.
I agree here. The OO paradigm requires OOA/D for the fundamentals and
that requires UML, even if one ends up with an OOP-based process after
school.

Similarly, any RAD environment will demonstrate the data-oriented
paradigms for CRUD/USER processing, regardless of what specific IDE one
might use after school.
Post by David Alex Lamb
I'm not particularly interested in covering all, or even most, of the current
buzzwords -- just things that introduce ways of thinking that are likely
"fundamental" in the sense of being reasonably enduring and giving
interesting/different ways of looking at, and solving, problems.
I haven't learned XML and the associated tools yet, so that's one I wonder
about. I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.
I think some version of markup processing (XML, XMI, SOAP, etc.) should
be in the curriculum. Muck with any one of them and one can quickly
learn the others. One key idea to get across is that there is a whole
world of portable, interoperable parametric polymorphism out there
beyond type substitution and markup processing is a versatile mechanism
for achieving it.
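A sketch of that idea in Ruby using the bundled REXML library: the walker below (`element_census`, a name invented here) knows nothing about either document's vocabulary, yet handles both unchanged:

```ruby
require 'rexml/document'

# A generic walker: it is written against the markup mechanism, not any
# particular schema -- the kind of parametric polymorphism beyond type
# substitution that markup processing makes routine.
def element_census(xml)
  doc = REXML::Document.new(xml)
  census = Hash.new(0)
  doc.root.each_recursive { |el| census[el.name] += 1 }
  census
end

# The same routine handles two unrelated vocabularies unchanged.
order  = "<order><item/><item/></order>"
layout = "<ui><button/><button/><label/></ui>"
element_census(order)   # => {"item"=>2}
element_census(layout)  # => {"button"=>2, "label"=>1}
```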
Post by David Alex Lamb
Any others? or opinions on what I already listed?
I would be tempted to also provide some exposure to functional
programming. In particular, contrasting it to OO and P/R approaches.
[FWIW, I think it would be grand if courses in OO, P/R, and FP all had
the same lab problem application. Then anyone taking all three could
see just how different the approaches really are.]
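At toy scale, the contrast can already be shown in a few lines; here is a Ruby sketch of one small problem (sum of the squares of the even numbers) solved first procedurally, then functionally:

```ruby
data = [1, 2, 3, 4, 5, 6]

# Procedural style: explicit state mutated in a loop.
total = 0
data.each do |n|
  total += n * n if n.even?
end

# Functional style: no mutation, just composed transformations.
functional_total = data.select(&:even?).map { |n| n * n }.reduce(0, :+)

total             # => 56
functional_total  # => 56
```

Same answer, but the functional version has no state to update, which is exactly the difference a shared lab problem would make visible at full scale.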

I also think some layered model infrastructure should be taught. It
really doesn't matter which one so long as the course underscores the
basic features (layering, decoupling through interfaces, hiding grunt
work in canned infrastructure, serializing communications, etc.).

Finally, I think some presentation of MDA should be made. While MDA
doesn't affect application developers very much, it is rapidly becoming
hugely important to tool, OS, and infrastructure vendors because it
provides a systematic approach to implementing interoperability and plug
& play. IOW, if the curriculum has a course in OSes, then it should
have one in MDA.
--
*************
There is nothing wrong with me that could
not be cured by a capful of Drano.

H. S. Lahman
***@pathfindermda.com
Pathfinder Solutions -- Put MDA to Work
http://www.pathfindermda.com
blog: http://pathfinderpeople.blogs.com/hslahman
(888)OOA-PATH
j***@alum.mit.edu
2006-02-15 21:00:56 UTC
Permalink
Off the top of my head, I think 'gprof' or a similar profiling tool
would be worth it. After a section on gprof itself you could have a
quick discussion-only thing on deeper tools (purify/boundschecker kinds
of things on the one hand, code coverage tools on the other).
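gprof itself is driven from the shell against compiled programs, but the measure-before-optimizing habit it teaches can be demonstrated in any language; a sketch using Ruby's standard Benchmark library, with two deliberately contrasting implementations invented for illustration:

```ruby
require 'benchmark'

# Two implementations of the same task, timed side by side -- the
# measure-first habit that profilers such as gprof instill, at toy scale.
def sum_by_loop(n)
  total = 0
  (1..n).each { |i| total += i }
  total
end

def sum_by_formula(n)
  n * (n + 1) / 2
end

Benchmark.bm(10) do |bm|
  bm.report("loop")    { sum_by_loop(1_000_000) }
  bm.report("formula") { sum_by_formula(1_000_000) }
end
```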

--JMike
David Alex Lamb
2006-02-16 16:46:43 UTC
Permalink
I made a strategic error in setting followups to comp.edu -- theoretically
it's the more appropriate group, but it's close to dead. So here are the
responses so far, in hopes that there will be more comments:

From: Ed Wegner <***@tait.co.nz>
Date: Wed, 15 Feb 2006 17:34:06 +1300
Post by David Alex Lamb
What technologies should we teach in undergraduate software engineering (or
those portions of computing science that overlap with software engineering)?
(not "subjects" but "technologies").
Excel (or its OpenOffice equivalent): myriad uses - e.g. for project
estimating, project planning and monitoring, trend analysis of bugs,
comparison of algorithm attributes, etc.

Source Code Version Control: CVS or SVN - require the students to use it on
both individual and group projects.

Configuration Management for baselining consistency / compatibility of
Requirements, Testware, and Implementation

Bug tracking

A scripting language (any will do, but Perl and Ruby would be choices 2
and 1): software engineering is still one of the few professions (perhaps
the only one) where the practitioners make their own tools.

Finally, one that might be a "subject" - or even worse, a lowly "skill"
rather than a "technology", but you should include it, anyway.

activities that require reading and understanding lots of code written
by others (mostly good code, but some bad code to fix up).

From: "Phlip" <***@yahoo.com>
Date: Wed, 15 Feb 2006 05:33:43 GMT
Post by David Alex Lamb
I'm slightly sympathetic to the "teaching technology doesn't belong in academia"
viewpoint but a lot less than most academics I talk to.
I love it when Robert C. Martin busts on graduates with CS degrees who never
wrote a line of code. They remind me of a certain local retail sales
specialist who likes to tease customers "I just do this because I have a
degree in philosophy."

What did you expect to do? Graduate and get a job in the philosophy
department of a major corporation??
Post by David Alex Lamb
I haven't learned XML and the associated tools yet, so that's one I wonder
about. I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.
How about using XPath to test web pages written as pure XHTML? No more
relying on browser forgiveness!

From: "David Lightstone" <***@prodigy.net>
Date: Wed, 15 Feb 2006 11:18:06 GMT

Topic 1
A refresher course on - How to do a research paper. Covering
(1) How to collect citations
(2) How to construct an outline
(3) How to allocate citations to the outline
(4) How to write prose

When I first learned the skill (circa 9th or 10th grade) the technology was
4x6 index cards, indexed tabs and a metal box.

You may think that the response is flippant, but specialize the outline to
the present "approved" means for organizing the structure of a system, the
citations to wishes and wants of your favorite manager, and you have a
simple means for preparing a requirements specification (which probably, in
one form or another, has been used for over 4000 years).
Post by David Alex Lamb
I expect a few to answer "none but our favourite programming language(s)"; I'm
slightly sympathetic to the "teaching technology doesn't belong in academia"
viewpoint but a lot less than most academics I talk to.
In the past I've tried to squeeze in some configuration management (at the
ancient RCS/make level) but not a whole lot else (there is often no room). A
few years ago I introduced UML into an introductory software engineering/
software architecture course. I can't at the moment recall any others. The
database courses usually do a little some-DBMS-or-other (DB2 in our case, last
time I looked), plus SQL. The graphics course uses OGL. I've been on medical
leave for several years, so I have probably missed a few others.
I'm not particularly interested in covering all, or even most, of the current
buzzwords -- just things that introduce ways of thinking that are likely
"fundamental" in the sense of being reasonably enduring and giving
interesting/different ways of looking at, and solving, problems.
I haven't learned XML and the associated tools yet, so that's one I wonder
about. I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.
Any others? or opinions on what I already listed?
From: "H. S. Lahman" <***@verizon.net>
Date: Wed, 15 Feb 2006 18:55:04 GMT

IMO, the most important "technology" by far is process engineering. No
matter what technologies, tools, methodologies, or whatever one actually
has at hand in a particular development environment, they have to be
glued together in a development process somehow so that they play together.

Second on my list would be process improvement. New technologies,
tools, and whatnot show up with alarming regularity. If they are to be
integrated effectively into existing development environments, the shop
has to have some sort of process improvement discipline.

Third on my list would be SQA, specifically defect prevention. The
industry is facing a paradigm shift where the old quality-by-testing
view will have to be replaced with a more process-oriented view. [Just
as what happened in the '80s in manufacturing when the PacRim started
providing products that didn't break nearly as often. Customers are
finally beginning to notice that the only thing that breaks in their
lives nowadays is software.]

One could argue none of these are technologies. But they are at least
techniques and they all have very well defined alternative disciplines
(e.g., the alphabet soup for process frameworks: CMM, ISO, etc.) that
are "teachable".
Post by David Alex Lamb
I expect a few to answer "none but our favourite programming language(s)"; I'm
slightly sympathetic to the "teaching technology doesn't belong in academia"
viewpoint but a lot less than most academics I talk to.
In the past I've tried to squeeze in some configuration management (at the
ancient RCS/make level) but not a whole lot else (there is often no room). A
few years ago I introduced UML into an introductory software engineering/
software architecture course. I can't at the moment recall any others. The
database courses usually do a little some-DBMS-or-other (DB2 in our case, last
time I looked), plus SQL. The graphics course uses OGL. I've been on medical
leave for several years, so I have probably missed a few others.
I agree here. The OO paradigm requires OOA/D for the fundamentals and
that requires UML, even if one ends up with an OOP-based process after
school.

Similarly, any RAD environment will demonstrate the data-oriented
paradigms for CRUD/USER processing, regardless of what specific IDE one
might use after school.
Post by David Alex Lamb
I'm not particularly interested in covering all, or even most, of the current
buzzwords -- just things that introduce ways of thinking that are likely
"fundamental" in the sense of being reasonably enduring and giving
interesting/different ways of looking at, and solving, problems.
I haven't learned XML and the associated tools yet, so that's one I wonder
about. I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.
I think some version of markup processing (XML, XMI, SOAP, etc.) should
be in the curriculum. Muck with any one of them and one can quickly
learn the others. One key idea to get across is that there is a whole
world of portable, interoperable parametric polymorphism out there
beyond type substitution and markup processing is a versatile mechanism
for achieving it.
Post by David Alex Lamb
Any others? or opinions on what I already listed?
I would be tempted to also provide some exposure to functional
programming. In particular, contrasting it to OO and P/R approaches.
[FWIW, I think it would be grand if courses in OO, P/R, and FP all had
the same lab problem application. Then anyone taking all three could
see just how different the approaches really are.]

I also think some layered model infrastructure should be taught. It
really doesn't matter which one so long as the course underscores the
basic features (layering, decoupling through interfaces, hiding grunt
work in canned infrastructure, serializing communications, etc.).

Finally, I think some presentation of MDA should be made. While MDA
doesn't affect application developers very much, it is rapidly becoming
hugely important to tool, OS, and infrastructure vendors because it
provides a systematic approach to implementing interoperability and plug
& play. IOW, if the curriculum has a course in OSes, then it should
have one in MDA.

From: ***@alum.mit.edu
Date: 15 Feb 2006 13:00:56 -0800

Off the top of my head, I think 'gprof' or a similar profiling tool
would be worth it. After a section on gprof itself you could have a
quick discussion-only thing on deeper tools (purify/boundschecker kinds
of things on the one hand, code coverage tools on the other).
--
"Yo' ideas need to be thinked befo' they are say'd" - Ian Lamb, age 3.5
http://www.cs.queensu.ca/~dalamb/ qucis->cs to reply (it's a long story...)
David Alex Lamb
2006-02-22 02:56:30 UTC
Permalink
Post by David Alex Lamb
Date: Wed, 15 Feb 2006 11:18:06 GMT
Topic 1
A refresher course on - How to do a research paper. Covering
(1) How to collect citations
(2) How to construct an outline
(3) How to allocate citations to the outline
(4) How to write prose
Sounds like a good idea. Unfortunately the first course where we require
actual research papers is in fall of 4th year; that might be a reasonable
start, since pushing it down into earlier courses might involve some academic
politics.
I received a private note regarding my responses in this thread, from someone
who seemed to think I was opposed to teaching 'how to write a research
paper'. I'm not -- I just tried to say there would be ingrained resistance.

However, what I may have implied but didn't make clear is that neither profs
nor students would have any interest in this topic except in a course that
actually did a research paper (or research-like paper, such as requirements
analysis). This is my professional opinion, based on years of experience of such
resistance on various topics.

So
-- in what kind of course, and how soon in the curriculum, would you try to
teach the how-to-write-a-paper topic?
-- do you agree it would have to be in a course that actually required such a
paper?
-- if not, how would you overcome the resistance I mentioned above? Or do
you think I'm excessively pessimistic about the existence of such resistance?
--
"Yo' ideas need to be thinked befo' they are say'd" - Ian Lamb, age 3.5
http://www.cs.queensu.ca/~dalamb/ qucis->cs to reply (it's a long story...)
Barb Knox
2006-02-22 08:18:00 UTC
Permalink
Post by David Alex Lamb
Post by David Alex Lamb
Date: Wed, 15 Feb 2006 11:18:06 GMT
Topic 1
A refresher course on - How to do a research paper. Covering
(1) How to collect citations
(2) How to construct an outline
(3) How to allocate citations to the outline
(4) How to write prose
Sounds like a good idea. Unfortunately the first course where we require
actual research papers is in fall of 4th year; that might be a reasonable
start, since pushing it down into earlier courses might involve some academic
politics.
I received a private note regarding my responses in this thread, from someone
who seemed to think I was opposed to teaching 'how to write a research
paper'. I'm not -- I just tried to say there would be ingrained resistance.
However, what I may have implied but didn't make clear is that neither profs
nor students would have any interest in this topic except in a course that
actually did a research paper (or research-like paper, such as requirements
analysis). This is my professional opinion, based on years of experience of such
resistance on various topics.
So
-- in what kind of course, and how soon in the curriculum, would you try to
teach the how-to-write-a-paper topic?
Very soon, e.g. the first semester of a Software Engineering sequence.
Without decent requirements, you're toast -- "Any system whatsoever is
the full correct implementation of SOME specification". After learning
to READ specs, the next logical step is to CRITIQUE and IMPROVE (i.e.,
rewrite) them.
Post by David Alex Lamb
-- do you agree it would have to be in a course that actually required such a
paper?
Absolutely. Otherwise it will seem pointless to almost all students.
Post by David Alex Lamb
-- if not, how would you overcome the resistance I mentioned above? or, do
you think I'm excessively pessimistic about the existence of such resistance?
"A pessimist is a well-informed optimist." I once tried to teach (and
assess) design in an introductory programming class BEFORE teaching any
coding. The result was abysmal.
--
---------------------------
| BBB b \ Barbara at LivingHistory stop co stop uk
| B B aa rrr b |
| BBB a a r bbb | Quidquid latine dictum sit,
| B B a a r b b | altum viditur.
| BBB aa a r bbb |
-----------------------------
David Lightstone
2006-02-22 11:53:47 UTC
Permalink
Post by Barb Knox
Post by David Alex Lamb
Post by David Alex Lamb
Date: Wed, 15 Feb 2006 11:18:06 GMT
Topic 1
A refresher course on - How to do a research paper. Covering
(1) How to collect citations
(2) How to construct an outline
(3) How to allocate citations to the outline
(4) How to write prose
Sounds like a good idea. Unfortunately the first course where we require
actual research papers is in fall of 4th year; that might be a reasonable
start, since pushing it down into earlier courses might involve some academic
politics.
I received a private note regarding my responses in this thread, from someone
who seemed to think I was opposed to teaching 'how to write a research
paper'. I'm not -- I just tried to say there would be ingrained resistance.
However, what I may have implied but didn't make clear is that neither profs
nor students would have any interest in this topic except in a course that
actually did a research paper (or research-like paper, such as
requirements
analysis). This is my professional opinion, based on years of experience with such
resistance on various topics.
I as an individual did not have a clue as to how to systematically write a
research paper until well after I had left university. It's not that I didn't
write any while attending. Rather, it was because I was readily able to hack
them out without great effort.

The statement - nobody would be interested in learning something they
do not realize they need to know - is valid.

The statement - no faculty member will teach material they recognize as
significant but have no interest in - would (if true) be a problem.

The statement - no faculty member realizes that the material needs to be
taught at all - would be a problem.

Interest or lack of interest in the subject material correlates very
strongly with an awareness of the significance of the material to a task the
students will need to accomplish in their intended future career (i.e., if you
don't know you need to know something, lack of interest will definitely be
expressed).

The issue of concern, expressed in your original post, was identifying things
students need to learn.
(1) Do you now claim that the material does not need to be learned?
(2) Do you now claim that the material should be learned by attending a
course taught by another department?


The only problem is how you relate the material to something that the
students know they will need to do during the course of their professional
or academic career.
My guess is that a simple statement - you will have to do research papers
eventually - will be sufficient. The choice will be one of hacking them out
or doing them systematically.

If that fails, try - let's see if you know how to do them. How did you go
about writing your last research paper? It should not take long for you to
figure out whether they know how to do it. It should take them even less time
to realize which of their fellow students know something that they don't.
Is that not how study groups are formed?

Perhaps asking them to read a poorly written specification might be
sufficient. If you have any friends in the auto industry, I can tell you for
certain that they have more than a few poorly written requirements
specifications available (proprietary knowledge may make it impossible to
get anything current, but stuff 10 years old may still exist). Perhaps
requests for proposals issued by your favorite government will do (preferably
something not written by attorneys). Irrespective of the source, run it through
a peer review exercise. Once people see the mistakes that can and have been
made by others, they can be motivated not to make those mistakes.
Post by Barb Knox
Post by David Alex Lamb
So
-- in what kind of course, and how soon in the curriculum, would you try to
teach the how-to-write-a-paper topic?
Very soon, e.g. the first semester of a Software Engineering sequence.
Without decent requirements, you're toast -- "Any system whatsoever is
the full correct implementation of SOME specification". After learning
to READ specs, the next logical step is to CRITIQUE and IMPROVE (i.e.,
rewrite) them.
Post by David Alex Lamb
-- do you agree it would have to be in a course that actually required
such a
paper?
Absolutely. Otherwise it will seem pointless to almost all students.
Post by David Alex Lamb
-- if not, how would you overcome the resistance I mentioned above? or, do
you think I'm excessively pessimistic about the existence of such resistance?
"A pessimist is a well-informed optimist." I once tried to teach (and
assess) design in an introductory programming class BEFORE teaching any
coding. The result was abysmal.
See Donald Norman's book - The Design of Everyday Things. It's not software
design, but it does address many issues that need to be correctly understood.
Post by Barb Knox
--
---------------------------
| BBB b \ Barbara at LivingHistory stop co stop uk
| B B aa rrr b |
| BBB a a r bbb | Quidquid latine dictum sit,
| B B a a r b b | altum videtur.
| BBB aa a r bbb |
-----------------------------
Ron Ruble
2006-02-18 05:29:51 UTC
Permalink
David Alex Lamb wrote:
<snip>
Post by David Alex Lamb
In the past I've tried to squeeze in some configuration management (at the
ancient RCS/make level) but not a whole lot else (there is often no room).
Understood. Given the limited time, I would think it might be
better to introduce them to more modern, perhaps easier-to-learn
tools: Subversion or CVS with a graphical front-end.

Even if they only run on the client side (I don't know what
difficulties you might experience in setting up a source
repository for students), it would give them an introduction
to the concepts and a little practical use.

I favor graphical tools primarily because, in the corporate
world, one huge reason an RCS isn't used is impatience with
futzing with the commands.

As bad as working without an RCS is, it's really annoying
to have a local PC hard drive fail and the programmer tells
you he hasn't been using the RCS for weeks/months.
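The core idea all of these tools share -- storing and replaying deltas between revisions -- can be sketched with nothing but plain diff and patch. A minimal sketch (all file names below are invented for the example):

```shell
# Sketch of the delta idea behind RCS/CVS/Subversion, using only
# plain diff and patch.  All file names here are invented.
set -e
workdir=$(mktemp -d)
cd "$workdir"

printf 'line one\nline two\n' > report.txt         # "version 1"
cp report.txt report.v1.txt                        # keep the old revision
printf 'line one\nline 2, revised\n' > report.txt  # "version 2"

# A unified diff is exactly the delta a version-control tool stores.
# (diff exits 1 when the files differ, so tolerate that under set -e.)
diff -u report.v1.txt report.txt > v1-to-v2.patch || true

# Replaying the delta on version 1 reconstructs version 2.
patch -s report.v1.txt < v1-to-v2.patch
cmp -s report.v1.txt report.txt && echo "versions match"
```

Subversion and CVS wrap this same mechanism in repository bookkeeping, history, and merging; the sketch is only meant to show the delta concept, not a workflow.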

I'm of two minds on make. I use tools that manage makefiles
for me, but I have created and edited makefiles. I've often
planned to look at ANT and other tools, but never had time to
do so.
Post by David Alex Lamb
A
few years ago I introduced UML into an introductory software engineering/
software architecture course. I can't at the moment recall any others. The
database courses usually do a little some-DBMS-or-other (DB2 in our case, last
time I looked), plus SQL. The graphics course uses OGL. I've been on medical
leave for several years, so I have probably missed a few others.
I'm not particularly interested in covering all, or even most, of the current
buzzwords -- just things that introduce ways of thinking that are likely
"fundamental" in the sense of being reasonably enduring and giving
interesting/different ways of looking at, and solving, problems.
I haven't learned XML and the associated tools yet, so that's one I wonder
about.
As an aside, ANT uses XML files for its configuration.

XML manipulation, in practice, is filled with hundreds or
thousands of different ways to do it. I would personally delve
into XML only as part of teaching something where you can
teach the appropriate (to the specific course) way to manipulate
it.

In a Java course, teach using the Java XML classes. Especially
since you haven't dealt with XML.
Post by David Alex Lamb
I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.
Any others? or opinions on what I already listed?
You might want to offer a few brief practical examples of how
to use various traditional Unix text manipulation tools.

The suggestion on gprof is a good one. We tend to forget that
profiling and debugging are too important to just "pick up
entirely on the street." Teach how to debug; possibly without
using a modern debugger.
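To make the text-tools suggestion concrete, one short pipeline gives the flavor of grep, sed, and awk together. The compiler-log format below is made up for the example:

```shell
# Sketch: classic Unix text tools over an invented compiler log.
set -e
workdir=$(mktemp -d)
cd "$workdir"

cat > build.log <<'EOF'
gcc -c parser.c
parser.c:42: warning: unused variable 'tmp'
gcc -c lexer.c
lexer.c:7: error: expected ';'
lexer.c:9: warning: implicit declaration
EOF

grep 'warning' build.log                        # select: just the warnings
sed 's/warning/WARNING/' build.log > loud.log   # rewrite: edit a stream

# tally: diagnostics per file (fields split on ':')
awk -F: '/warning|error/ { n[$1]++ } END { for (f in n) print f, n[f] }' \
    build.log | sort
```

Each tool does one narrow job and they compose through pipes, which is the habit of mind worth teaching; the specific commands matter less than the composition.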

You've been given a number of good suggestions by others.
Phlip
2006-02-18 14:46:21 UTC
Permalink
Post by David Alex Lamb
What technologies should we teach in undergraduate software engineering (or
those portions of computing science that overlap with software
engineering)?
(not "subjects" but "technologies").
You may want to investigate how Ralph [GOF] Johnson starts freshmen at the
University of Illinois.

I wish I could sign up.
--
Phlip
http://www.greencheese.org/ZeekLand <-- NOT a blog!!!
David Alex Lamb
2006-02-19 04:27:32 UTC
Permalink
Post by Phlip
You may want to investigate how Ralph [GOF] Johnson starts freshmen at the
University of Illinois.
I wish I could sign up.
Got a link? I'll search if I have to, but see ''medical leave'' above.
--
"Yo' ideas need to be thinked befo' they are say'd" - Ian Lamb, age 3.5
http://www.cs.queensu.ca/~dalamb/ qucis->cs to reply (it's a long story...)
Phlip
2006-02-19 17:02:06 UTC
Permalink
Post by David Alex Lamb
Post by Phlip
You may want to investigate how Ralph [GOF] Johnson starts freshmen at
the University of Illinois.
Got a link? I'll search if I have to, but see ''medical leave'' above.
I can't find a citation with Google.

Ralph said he used to start his freshmen classes half using RUP and half
using XP. Then he switched them all to XP because the results were
consistently better.

Of course this is monkey-wrench Software Engineering, not enlightened
Computer Science, so maybe freshmen shouldn't write any code at all... ;-)
--
Phlip
http://www.greencheese.org/ZeekLand <-- NOT a blog!!!
H. S. Lahman
2006-02-18 15:57:44 UTC
Permalink
Responding to Lamb...

I posted this previously but somehow it never got onto the server.
Post by David Alex Lamb
What technologies should we teach in undergraduate software engineering (or
those portions of computing science that overlap with software engineering)?
(not "subjects" but "technologies").
I expect a few to answer "none but our favourite programming language(s)"; I'm
slightly sympathetic to the "teaching technology doesn't belong in academia"
viewpoint but a lot less than most academics I talk to.
My first would be one of the alphabet soup of process frameworks (CMM,
ISO, etc.). All the technologies need to be glued together so
that they play together properly. Any one of the frameworks would be
sufficient since fundamentally they are all the same.

My second would be a process improvement discipline (TQM, Baldrige,
etc.). IMO, this is really crucial to introducing a badly needed sense
of professionalism into the industry.

My third would be SQA, particularly defect prevention (e.g., Six Sigma).
We are facing a paradigm shift in the industry similar to that of the
'80s for manufacturing, where testing in quality simply doesn't cut it.
Customers are beginning to figure out that the only thing that breaks
nowadays is software. Again, it really doesn't make much difference
which one because the fundamentals are all pretty much the same.

One can argue that these are not technologies. However, they are at
least techniques, which is close enough for government work. In
addition, each of these things has several well-defined, teachable
alternative methodologies that are analogous to families of technologies
(e.g., XML, XMI, SOAP, etc.).

[Note that you could kill several of these birds with one stone by
teaching TSP, which implements all of them. However, since it is an
/implementation/, a course in it would need to provide the groundwork
to illustrate what the underlying fundamentals are and how they are
implemented.]
Post by David Alex Lamb
In the past I've tried to squeeze in some configuration management (at the
ancient RCS/make level) but not a whole lot else (there is often no room). A
few years ago I introduced UML into an introductory software engineering/
software architecture course. I can't at the moment recall any others. The
database courses usually do a little some-DBMS-or-other (DB2 in our case, last
time I looked), plus SQL. The graphics course uses OGL. I've been on medical
leave for several years, so I have probably missed a few others.
I agree that CRUD/USER processing is important enough to warrant
spending time on the basic RAD elements. I don't think it matters much
which IDE is used; the key is the UI/DB pipeline architecture that
allows layered models and highly focused "canned" infrastructures behind
IDEs, which in turn gives rise to a unique view of programming (P/R).
Post by David Alex Lamb
I'm not particularly interested in covering all, or even most, of the current
buzzwords -- just things that introduce ways of thinking that are likely
"fundamental" in the sense of being reasonably enduring and giving
interesting/different ways of looking at, and solving, problems.
I haven't learned XML and the associated tools yet, so that's one I wonder
about. I consider "flexible methods" and "extreme programming" to be software
process models rather than technology, so they're already covered (among
others) as academic subjects in our Software Process course.
I think the markup/scripting family of technologies is also important.
Again, the particular technology probably doesn't make much difference
so long as the course properly captures the underlying invariants of the
processing.

Another justification for XML et al is that it is a mechanism that
supports portable parametric polymorphism beyond just the 3GL type
substitution view. IMO, this is something that is sorely underutilized
in the OO paradigm.

However, I think the crucial thing is for the course to describe the
underlying structure so that it could be mapped to other members of the
family. In another thread I was pointing out to someone that the
biggest decision in OR is to decide which solution approach to use. I
pointed out that I never saw a useful practical characterization of the
approaches that would make such decisions relatively easy in any of my
text books. I learned the characterizations because I lucked out and
had a good prof who provided it.
Post by David Alex Lamb
Any others? or opinions on what I already listed?
If you don't have it already, I think you need a course on functional
programming. It is the logical evolution of procedural programming for
situations where requirements are stable over time.

[FWIW, I think it would be ideal if courses on OOP, FP, and P/R all had
the same lab project to develop. Then students taking two or more of
the courses would be able to contrast the techniques effectively and
understand how different they really are.]


*************
There is nothing wrong with me that could
not be cured by a capful of Drano.

H. S. Lahman
***@pathfindermda.com
Pathfinder Solutions -- Put MDA to Work
http://www.pathfindermda.com
blog: http://pathfinderpeople.blogs.com/hslahman
(888)OOA-PATH
David Alex Lamb
2006-02-19 04:31:21 UTC
Permalink
Post by H. S. Lahman
Responding to Lamb...
My first would be one of the alphabet soup of process frameworks (CMM,
ISO, etc.)
My second would be a process improvement discipline (TQM, Baldrige,
etc.).
My third would be SQA
One can argue that these are not technologies. However, they are at
least techniques, which is close enough for government work.
I'd consider them 'subjects' -- some may be covered already in our software
process course.
Post by H. S. Lahman
If you don't have it already, I think you need a course on functional
programming. It is the logical evolution of procedural programming for
situations where requirements are stable over time.
We have functional and logic programming.
Post by H. S. Lahman
[FWIW, I think it would be ideal if courses on OOP, FP, and P/R all had
the same lab project to develop.
What's P/R? or should I be embarrassed to ask?
--
"Yo' ideas need to be thinked befo' they are say'd" - Ian Lamb, age 3.5
http://www.cs.queensu.ca/~dalamb/ qucis->cs to reply (it's a long story...)
JXStern
2006-02-20 22:45:00 UTC
Permalink
Post by David Alex Lamb
What technologies should we teach in undergraduate software engineering (or
those portions of computing science that overlap with software engineering)?
(not "subjects" but "technologies").
I have an alternative suggestion, take some major commercial
"technology" and disect it. Do a compare and contrast. List the top
10 features and top 10 uglies. Teach students to do this, so they are
not smitten by every new hack and marketing hype they run across in
the real world. This would be sort of an "applied theory" class,
academically acceptable, teaching some degree of technology competence
as a side-effect. Might do three in a quarter, a different set
whenever the mood strikes.

J.
JXStern
2006-02-20 22:46:05 UTC
Permalink
oops, fell into your trap!
Post by David Alex Lamb
What technologies should we teach in undergraduate software engineering (or
those portions of computing science that overlap with software engineering)?
(not "subjects" but "technologies").
I have an alternative suggestion, take some major commercial
"technology" and disect it. Do a compare and contrast. List the top
10 features and top 10 uglies. Teach students to do this, so they are
not smitten by every new hack and marketing hype they run across in
the real world. This would be sort of an "applied theory" class,
academically acceptable, teaching some degree of technology competence
as a side-effect. Might do three in a quarter, a different set
whenever the mood strikes.

J.