
I was recently reading some papers on bridging the gap between academia and industry, specifically in undergraduate computer science and software engineering programs. The papers I read were published between the late 1970s and the mid 2010s. I noticed a stark contrast between papers published in the 1990s and those from the 1970s with regard to education, often in reference to mathematics and science.

A small passage from Essential Elements of Software Engineering Education, published in 1976:

It is clear from the above discussion that the education of a software engineer will involve the study of a variety of subjects combined with a considerable amount of practical experience which must be accumulated over a number of years. From a university standpoint, the subject matter not only cuts across a number of traditional disciplines and boundaries, but also covers topics that historically have not been part of academic curricula.

This passage follows a discussion of problems that arise from teaching software engineering, specifically the difficulties in mapping certain aspects of industry into the classroom. This discussion is a significant portion of the paper, and similar discussions are prevalent in other papers written around the same period of time. Generally, there's an understanding that a wide knowledge base is needed in the fields of computer science and software engineering, but the trend is to focus on maximizing the skills and knowledge needed in industry.

Throughout everything published in the mid-to-late 1970s, I get the idea that the people designing the computer science and software engineering curricula understand the industry. This includes technical and non-technical topics, but always serving the needs of the profession. It's well summarized in this statement, also from Essential Elements of Software Engineering Education:

A curriculum in software engineering must be multi-form and in fact be a collection of curricula to meet the diverse needs of existing professional groups.

Fast-forward to the 1990s. A passage from The SEI undergraduate curriculum in software engineering reads:

The mathematics and science content of the curriculum should help achieve two fundamental objectives. First, it should prepare students to participate competently in an increasingly technological society. This includes the ability to understand science and technology issues well enough to make informed political decisions. Second, the science and mathematics content should provide the students with an appropriate foundation for subsequent software engineering courses.

...

While the physical and life sciences are fundamental to traditional engineering disciplines, they provide virtually no basis for software engineering. The only significant exception is that electricity and magnetism, common topics in introductory physics courses, support the study of the computer itself, and software engineers need a basic understanding of the machine for which they are developing software. To achieve the first objective stated above, however, it is probably the case that basic knowledge of physics, chemistry, and biology are essential in almost any undergraduate curriculum. Chemistry and biology, in particular, are likely to be increasingly important in understanding society’s health care, environmental, and genetic engineering issues in the next century.

This is really the first time that I saw society come up in a discussion about the content of a curriculum. There were mentions of law and legal topics being relevant for software engineers and computer scientists in previous papers, but always from a professional standpoint. In the above quoted passage, the first objective (and I'm assuming the most important in the eyes of the author) of mathematics and science education is not to prepare the students for future course work or their first job in industry or for future research, but for functioning fully in modern society.

Beyond that, the authors even acknowledge that these topics have limited utility in many industrial settings for software engineers. They nevertheless recommend that basic knowledge of them be part of the curriculum for society's benefit.

My questions:

  1. What happened between the late 1970s and the 1990s that caused the shift in computer science and software engineering education from focusing on the profession and entering the workforce (both industry and academia/research) to the general needs of society?
  2. Was this phenomenon localized to computer science and software engineering education or was it a widespread event?
  3. What were the triggering events?
  • This is totally unfounded speculation on my part, but I wonder if the gradual penetration of the web (and the internet) into the public consciousness led to an increased sense that training in computing was important for participation in society itself. This is certainly the view of computing today.
    – Suresh
    Commented Jul 9, 2012 at 15:38
  • I agree with your point, but I'm interested in the view of mathematics and science education from computing professionals, not the study of computing being important for society. As an example, in the 1970s, the study of the natural sciences (with the exception of electricity and magnetism to understand the inner workings of computer hardware or a domain-specific topic) was called non-essential by at least two authors because it wasn't relevant to the workforce. In the 1990s, the stance was that sciences should be included to make the student a better participant in society. Commented Jul 9, 2012 at 15:53
  • "The web" was called first "ftp" and then "gopher" in the time period under discussion, was known to nearly nobody, and linked only a few large and wealthy institution together. It didn't support social interaction in the way the modern network does. Commented Jul 9, 2012 at 18:31
  • This question is perfect for this new SE site: undergraduates. If you find it useful, you can follow it and help us spread the word about it.
    – Daniele B
    Commented Jan 21, 2013 at 11:00

3 Answers


In the late 1970s when Freeman, Wasserman and Fairley wrote about software engineering curricula they were talking about graduate education. They assumed that most students would be working professionals returning to school to learn those things that they should have learned earlier, but which were not being taught when they had been in college. Those ideas were implemented in the early MSE programs at places like Wang Institute (where Dick Fairley and I met). We felt at the time that undergraduates could not appreciate some of the problems and techniques we taught, so it was not worth trying. I remember having arguments with Gary Ford at the SEI about exactly this point.

By the 1990s people were beginning to believe that software engineering should be taught to undergraduates, so that they wouldn't have to unlearn bad habits when they went to work. There was also an increasing interest in professional licensing of software engineering. Licensing would require ABET-accredited undergraduate programs, among other things. The focus on society is related to the interest in developing a discipline of software engineering aligned with ABET accreditation and professional boards of engineers.

This is not to say that we weren't interested in serving the needs of society in the 1970s and 1980s. We even had a version of a code of ethics that we taught Wang Institute students during orientation. But we assumed that students already knew their role in society. They had come back to school primarily to learn new methods and tools.

By the way, SWEBOK was actually started in 1998 as part of an effort to professionalize software engineering. The software engineering code of ethics was published at about the same time. Both of these projects were meant (by some of us) to support the eventual licensing of software engineering.


Since I asked this question, I have studied a number of factors. I believe that the two primary factors are the age of the disciplines and the education reform in the United States in the 1980s. I also believe that the growth of multi-disciplinary education and the prevalence of computers have spurred these changes more drastically in the computing-related fields (although I don't have sufficient knowledge at this time to be as confident in this assessment).


The first thing to consider is the age of the disciplines. At the time of the publication of the first articles, in the 1970s, the fields of computer science and software engineering were relatively new on the scene.

The first computer science degree program was started in 1953 at the University of Cambridge. In the US, the first computer science program wasn't founded until 1962. However, some of the papers noted that computer science education didn't make significant advances until the late 1960s, with the publication of the ACM Curriculum 68 and the ACM information systems curriculum, which established the topics central to computer science. That's roughly 15 years between the first CS program and significant advances in CS education.

Software engineering as a separate discipline wasn't even a thought until the NATO conferences in 1968 (Garmisch, Germany) and 1969 (Rome, Italy), and it took another 10 years (1979) before the first graduate program appeared; it wasn't until 1996 that an undergraduate software engineering program existed. If SE as a discipline follows trends similar to computer science's, I wouldn't expect significant advances in education techniques before the late 1980s or early 1990s, after the central topics had been identified and disseminated. As a point of reference, the IEEE's Guide to the Software Engineering Body of Knowledge (SWEBOK), which outlines the core knowledge areas and related disciplines of software engineering, wasn't even started until 1993, which puts it at roughly the same time-to-development as computer science.

I'm not an academic, but I would suspect that it's rather difficult to design a curriculum that's relevant to students seeking careers in industry without a solid framework, especially when the goal is to produce a solid, reliable curriculum that stands up to the rigor of engineering. On top of that, there is the work related to validation and accreditation of the programs. The papers from the 1970s were typically laying the groundwork for the work to come over the next 15-20 years by proposing the key topics and content. Upon further examination, nearly all of the topics presented in those papers were later identified as essential knowledge areas of software engineering or as related disciplines in the Guide to the SWEBOK.


I believe that educational reform also plays a role in the changes. According to Wikipedia, education reform was occurring around the world, starting in the early 1900s. Considering that the majority (all but one or two) of the papers that I read were written by someone in the United States, I focused my research on the educational reform that started in the 1980s and still continues.

In 1983, a report titled A Nation at Risk: The Imperative For Educational Reform (PDF) was published. Although the bulk of the report is centered on primary and secondary education (in the US, kindergarten through 12th grade), it also mentions a decline in SAT scores, a decline in College Board (AP) test scores, an increase in the teaching of remedial mathematics courses at public 4-year universities, and millions of dollars being spent by businesses and the military on remedial education and training programs. The report found that the average graduate (of secondary schools as well as higher education) is not as well-educated as the average graduate of the previous generation, and that smaller proportions are completing high school and college.

This report presents the "Learning Society". A learning society has as "a basic foundation the idea that education is important not only because of what it contributes to one's career goals but also because of the value it adds to the general quality of one's life." The focus becomes life-long learning, continuing well beyond the end of formal schooling. In contrast to this "learning society", the authors find that the American education system expresses its standards as "minimum requirements" and produces students who do the minimum amount of work to get by.

The report also identifies a number of problems:

  • Students are taking "general track" courses instead of vocational or college preparation courses in secondary education, but only small percentages complete courses like Algebra II, French I, and Calculus
  • Large numbers of credits are gained in physical/health education, remedial courses, and courses for training for adulthood
  • Grades rose even as the amount of effort required of students declined
  • Science-oriented students (4 years of science/math in secondary school) in the US spend significantly less time on these subjects (roughly one-third as much) than their counterparts in many other industrialized nations
  • A significant number of public colleges must accept all high school graduates from their state
  • Textbooks aren't being written by experienced teachers or scholars
  • Many textbooks don't challenge their readers.
  • School years are significantly shorter (in length and total days) than many other industrialized countries.
  • Teacher preparation curricula are focused on "educational methods" instead of subject matter
  • Shortages of teachers, especially in mathematics and science, leading to under-qualified teachers teaching these subjects.

Although many of the recommendations point to changes in secondary education, it only makes sense that ripples flow throughout higher education. Vocational schools, colleges, and universities most likely adjusted their curricula in two directions. The first would be to meet the needs of students who might have been underprepared by their secondary education, by adding courses covering subject matter that previously would have been expected of a high school graduate. The second would be to create the environment of the Learning Society by creating courses that expand the mind and enable students to "learn to learn" in the future.


I believe that scaaahu might be onto something in his answer as well. As mentioned in several of the papers from the 1970s, industry can drive academia. This is still evident today in things such as Industrial Advisory Boards, which allow representatives from industry to meet with university departments, provide feedback on the quality of graduates, and suggest curriculum improvements to ensure graduates have the skills needed in the workplace.

Specifically, I'm looking at domain knowledge. From what I've read, in the early days of the computing profession, computing professionals were expected to know computing and little else. The modern workplace, however, is often cross-functional, and domain knowledge seems to be more important on cross-functional teams because it facilitates communication.

However, going back to the idea of a Learning Society, even if the education isn't necessarily in the domain of one's work, the ability to learn to learn, along with critical thinking, problem solving, and collaboration (themes that cut across nearly every discipline and that are difficult to teach outside of practice), is critically important to success.

  • I believe I found valid answers to two of the three questions that I presented in the original question. However, due to a lack of experience, I don't believe I can answer whether the phenomenon is localized to the computing fields, though I think it's safe to assume it most likely is not. Commented Jul 10, 2012 at 22:36

This answer is from a retired software engineer's point of view. In the '60s and '70s, not many people got a close look at real computers. I wrote my first few FORTRAN programs on punched cards without ever seeing what a computer looked like. To me, a computer was like a black box. There was a magician living in the computer room who could understand the DO loops in my program and do the job for me. A lot of students thought the same way I did. The professors in CS departments faced tough choices. There was limited time for students to take courses. Do you teach them math and science first, or programming first? What would industry think if your computer science graduates could not even write programs? Naturally, the curriculum had to be focused on computer science itself.

In the late '70s, microcomputers came out. In the '90s, the PC became household furniture. A lot of software packages were available, and programming became an everyday skill, like driving a car. In the meantime, industry found out that CS graduates are hard to put to work because they don't have enough application domain knowledge. I, for example, had to borrow books from the library to learn how radar operates. In the last few years before I retired, I didn't need the programming language manuals because I knew the languages, but I had to get on Wikipedia many times every day because I was not familiar with the application I was working on. And I know I was not an exception; many of my colleagues were doing the same thing. Naturally, industry and academia have to train undergraduate students in math and science so that they will be more useful.

The above is my observation and my experience. I would not say who is right or wrong. The OP was wondering what happened; I believe it was a gradual process driven by a basic rule of economics: supply and demand.

  • The idea of application domain knowledge is something that was recognized in the 1970s, evidenced by the quote about meeting "diverse needs of existing professional groups" and discussed in depth in some other papers. However, I'm more concerned with the transition from including material in a curriculum because you need to know it to do your job (the application domain knowledge you mention) to including material because it benefits society despite being known to have little impact on the profession. Commented Jul 10, 2012 at 11:09
  • However, you might be onto something with the tough choices and supply/demand idea. Computer science wasn't an academic discipline until the 1950s, and even then didn't mature until the mid 1960s. There wasn't a software engineering graduate program until 1979 or an undergraduate program until 1996. So this specific condition that I'm describing might be unique to computing professions, but it requires the expertise of an academic to determine that. However, if it's not unique to the computing disciplines, then something must have triggered this change, and I would like to know what it is. Commented Jul 10, 2012 at 11:34
  • @ThomasOwens, I would like to point out that the society in the U.S. in the '60s and '70s was very different from the society in the '90s and thereafter. So, people might have changed their thinking because the society changed?
    – Nobody
    Commented Jul 10, 2012 at 12:15
  • Hence my third question in the body of the post: What triggered these changes? Was it a societal change? If it was, it was rather drastic over the span of a decade, especially since the sentiments of the 1990s were still being echoed in the 2000s and 2010s. Was it a reaction to (or part of) the reform movements in secondary education of the 1980s? Was it the immaturity of the academic disciplines in the computing field? Answering those questions is the only acceptable answer to my question. Commented Jul 10, 2012 at 12:24
  • @ThomasOwens, of course it's your call to accept the answer. I am not here for the reps. I submitted my answer because I was a computing professional and was also once in academia. I myself am interested in this question, but I am not too sure there was a (quick) trigger. I'll leave it to others to answer it.
    – Nobody
    Commented Jul 10, 2012 at 12:45
