History of software engineering
From its beginnings in the 1960s, writing software has evolved into a profession concerned with how best to maximize the quality of software and how best to create it. Quality can refer to how maintainable software is, to its stability, speed, usability, testability, readability, size, cost, security, and number of flaws or "bugs", as well as to less measurable qualities like elegance, conciseness, and customer satisfaction, among many other attributes. How best to create high-quality software is a separate and controversial problem covering software design principles, so-called "best practices" for writing code, as well as broader management issues such as optimal team size, process, how best to deliver software on time and as quickly as possible, workplace "culture", hiring practices, and so forth. All this falls under the broad rubric of software engineering.
Overview
The evolution of software engineering is notable in a number of areas:
- Emergence as a profession: By the early 1980s, software engineering had emerged as a profession in its own right, standing beside computer science and traditional engineering.
- Role of women: Before 1970 men filling the more prestigious and better paying hardware engineering roles often delegated the writing of software to women, and legends such as Grace Hopper or Margaret Hamilton filled many computer programming jobs.
- Processes: Processes have become a big part of software engineering. They are hailed for their potential to improve software but sharply criticized for their potential to constrict programmers.
- Cost of hardware: The relative cost of software versus hardware has changed substantially over the last 50 years. When mainframes were expensive and required large support staffs, the few organizations buying them also had the resources to fund large, expensive custom software engineering projects. Computers are now much more numerous and much more powerful, which has several effects on software. The larger market can support large projects to create commercial off-the-shelf software, as done by companies such as Microsoft. The cheap machines allow each programmer to have a terminal capable of fairly rapid compilation. The programs in question can use techniques such as garbage collection, which make them easier and faster for the programmer to write. On the other hand, many fewer organizations are interested in employing programmers for large custom software projects, instead using commercial off-the-shelf software as much as possible.
1945 to 1965: The origins
The NATO Science Committee sponsored two conferences on software engineering in 1968 and 1969, which gave the field its initial boost. Many believe these conferences marked the official start of the profession of software engineering.
1965 to 1985: The software crisis
Software engineering was spurred by the so-called software crisis of the 1960s, 1970s, and 1980s, which identified many of the problems of software development. Many projects ran over budget and schedule. Some projects caused property damage. A few projects caused loss of life. The software crisis was originally defined in terms of productivity, but evolved to emphasize quality. Some used the term software crisis to refer to their inability to hire enough qualified programmers.
- Cost and Budget Overruns: The OS/360 operating system was a classic example. This decade-long project from the 1960s eventually produced one of the most complex software systems of its time. OS/360 was one of the first large software projects. Fred Brooks claims in The Mythical Man-Month that he made a multimillion-dollar mistake by not developing a coherent architecture before starting development.
- Property Damage: Software defects can cause property damage. Poor software security allows hackers to steal identities, costing time, money, and reputations.
- Life and Death: Software defects can kill. Some embedded systems used in radiotherapy machines failed so catastrophically that they administered lethal doses of radiation to patients. The most famous of these failures is the Therac-25 incident.
1985 to 1989: "No Silver Bullet"
For decades, solving the software crisis was paramount to researchers and companies producing software tools. The cost of owning and maintaining software in the 1980s was twice as high as the cost of developing it.
- During the 1990s, the cost of ownership and maintenance increased by 30% over the 1980s.
- In 1995, statistics showed that half of surveyed development projects were operational, but were not considered successful.
- The average software project overshoots its schedule by half.
- Three-quarters of all large software products delivered to the customer are failures that are either not used at all, or do not meet the customer's requirements.
Software projects
- Tools: Tools were especially emphasized: structured programming, object-oriented programming, CASE tools such as ICL's CADES system, Ada, documentation, and standards were all touted as silver bullets.
- Discipline: Some pundits argued that the software crisis was due to the lack of discipline of programmers.
- Formal methods: Some believed that if formal engineering methodologies were applied to software development, then production of software would become as predictable an industry as other branches of engineering. They advocated proving all programs correct.
- Process: Many advocated the use of defined processes and methodologies like the Capability Maturity Model.
- Professionalism: This led to work on a code of ethics, licenses, and professionalism.
Debate about silver bullets raged over the following decade. Advocates for Ada, components, and processes continued arguing for years that their favorite technology would be a silver bullet. Skeptics disagreed. Eventually, almost everyone accepted that no silver bullet would ever be found. Yet, claims about silver bullets pop up now and again, even today.
Some interpret no silver bullet to mean that software engineering failed. However, with further reading, Brooks goes on to say: "We will surely make substantial progress over the next 40 years; an order of magnitude over 40 years is hardly magical..."
The search for a single key to success never worked. All known technologies and practices have only made incremental improvements to productivity and quality. Yet, there are no silver bullets for any other profession, either. Others interpret no silver bullet as proof that software engineering has finally matured and recognized that projects succeed due to hard work.
However, it could also be said that there are, in fact, a range of silver bullets today, including lightweight methodologies, spreadsheet calculators, customized browsers, in-site search engines, database report generators, integrated design-test coding-editors with memory/differences/undo, and specialty shops that generate niche software, such as information web sites, at a fraction of the cost of totally customized web site development. Nevertheless, the field of software engineering appears too complex and diverse for a single "silver bullet" to improve most issues, and each issue accounts for only a small portion of all software problems.
1990 to 1999: Prominence of the Internet
The rise of the Internet led to very rapid growth in the demand for international information display/e-mail systems on the World Wide Web. Programmers were required to handle illustrations, maps, photographs, and other images, plus simple animation, at a rate never before seen, with few well-known methods to optimize image display/storage.
The growth of browser usage, built on the HyperText Markup Language (HTML), changed the way information display and retrieval was organized. Widespread network connections led both to the international spread of computer viruses on MS Windows computers and to efforts to prevent them, while the vast proliferation of spam e-mail became a major design issue in e-mail systems, flooding communication channels and requiring semi-automated pre-screening. Keyword-search systems evolved into web-based search engines, and many software systems had to be re-designed for international searching, depending on search engine optimization techniques. Human natural-language translation systems were needed to attempt to translate the information flow in multiple foreign languages, with many software systems being designed for multi-language usage, based on design concepts from human translators. Typical computer user bases grew from hundreds or thousands of users to, often, many millions of international users.
2000 to 2015: Lightweight methodologies
With the expanding demand for software in many smaller organizations, the need for inexpensive software solutions led to the growth of simpler, faster methodologies that developed running software, from requirements to deployment, more quickly and easily. The use of rapid prototyping evolved into entire lightweight methodologies, such as Extreme Programming (XP), which attempted to simplify many areas of software engineering, including requirements gathering and reliability testing, for the growing, vast number of small software systems. Very large software systems still used heavily documented methodologies, with many volumes in the documentation set; however, smaller systems had a simpler, faster alternative approach to managing the development and maintenance of software calculations and algorithms, information storage/retrieval, and display.
Current trends in software engineering
Software engineering is a young discipline, and is still developing. The directions in which software engineering is developing include:
- Aspects: help software engineers deal with quality attributes by providing tools to add or remove boilerplate code from many areas in the source code. Aspects describe how all objects or functions should behave in particular circumstances. For example, aspects can add debugging, logging, or locking control into all objects of particular types. Researchers are currently working to understand how to use aspects to design general-purpose code. Related concepts include generative programming and templates. (A minimal code sketch appears at the end of this section.)
- Experimental software engineering: a branch of software engineering interested in devising experiments on software, in collecting data from the experiments, and in devising laws and theories from this data. Proponents of this method advocate that the nature of software is such that we can advance our knowledge of software only through experiments.
- Software product lines: a systematic way to produce families of software systems, instead of creating a succession of completely individual products. This method emphasizes extensive, systematic, formal code reuse, to try to industrialize the software development process.
The Future of Software Engineering (FOSE) conference, held at ICSE 2000, documented the state of the art of software engineering in 2000 and listed many problems to be solved over the next decade. The FOSE tracks at the ICSE 2000 and ICSE 2007 conferences also helped identify the state of the art in software engineering.
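As a brief illustration of the aspect idea described in the list above, the following sketch weaves a logging concern into a class without touching its business logic. It is a hypothetical example in plain Python rather than a dedicated aspect-oriented framework; the names logging_aspect, logged, and Account are invented for illustration.

    # Hypothetical sketch of an "aspect": a class decorator weaves logging
    # advice into every public method, keeping the boilerplate out of the
    # business logic itself.
    import functools
    import logging

    logging.basicConfig(level=logging.INFO)

    def logged(func):
        """Advice: log each call before delegating to the original method."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            logging.info("calling %s", func.__qualname__)
            return func(*args, **kwargs)
        return wrapper

    def logging_aspect(cls):
        """Weave the logging advice into every public method of a class."""
        for name, attr in list(vars(cls).items()):
            if callable(attr) and not name.startswith("_"):
                setattr(cls, name, logged(attr))
        return cls

    @logging_aspect
    class Account:
        def __init__(self, balance):
            self.balance = balance

        def deposit(self, amount):
            self.balance += amount
            return self.balance

    # Calling deposit() now also logs the call, even though Account itself
    # contains no logging code.
    Account(100).deposit(25)

Dedicated aspect-oriented tools such as AspectJ generalize this weaving step, letting a single aspect apply the same advice across many classes and join points.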
Software engineering today
The profession is trying to define its boundary and content. The Software Engineering Body of Knowledge (SWEBOK) was tabled as an ISO standard in 2006. In 2006, Money Magazine and Salary.com rated software engineering as the best job in America in terms of growth, pay, stress levels, flexibility in hours and working environment, creativity, and how easy it is to enter and advance in the field.
Sub-disciplines
Artificial intelligence
A wide variety of platforms has allowed different aspects of AI to develop, ranging from expert systems such as Cyc, to deep-learning frameworks, to robot platforms such as the Roomba with its open interface. Recent advances in deep artificial neural networks and distributed computing have led to a proliferation of software libraries, including Deeplearning4j, TensorFlow, Theano, and Torch.
A 2011 McKinsey Global Institute study found a shortage of 1.5 million highly trained data and AI professionals and managers, and a number of private bootcamps have developed programs to meet that demand, including free programs like The Data Incubator and paid programs like General Assembly.
Languages
Early symbolic AI inspired Lisp and Prolog, which dominated early AI programming. Modern AI development often uses mainstream languages such as Python or C++, or niche languages such as Wolfram Language.
Prominent figures in the history of software engineering
- Charles Bachman is particularly known for his work in the area of databases.
- Laszlo Belady, the editor-in-chief of the IEEE Transactions on Software Engineering in the 1980s.
- Fred Brooks, best known for managing the development of OS/360.
- Peter Chen, known for the development of entity-relationship modeling.
- Edsger Dijkstra developed the framework for a form of structured programming.
- David Parnas developed the concept of information hiding in modular programming.
- Michael A. Jackson, software engineering methodologist responsible for the JSP method of program design, the JSD method of system development, and the Problem Frames approach for analysing and structuring software development problems.
- David Pearson designed and developed the ICL CADES system (1968-1977) and went on to become a computer graphics pioneer.
- Richard Stallman created the GNU system utilities and championed free software.