One of the first problems to be studied in the 1950s, shortly after the invention of computers, was an LCT problem, namely the translation of human languages. The large amounts of funding poured into machine translation testify to the perceived importance of the field, right from the beginning. It was also in this period that scholars started to develop theories of language and communication based on scientific methods. In the case of language, it was Noam Chomsky who refined the goal of linguistics as a quest for a formal description of language, whilst Claude Shannon and Warren Weaver provided a mathematical theory that linked communication with information.

Computers and related technologies have provided a physical and conceptual framework within which scientific studies of language and communication can be pursued computationally. Indeed, this framework has been fruitful on a number of levels. For a start, it has given birth to a new discipline, known as natural language processing (NLP), or computational linguistics. This discipline studies, from a computational perspective, all levels of language, from the production of speech to the meanings of texts and dialogues. Over the past 40 years, NLP has produced an impressive computational infrastructure of resources, techniques, and tools for analyzing sound structure, word structure, grammatical structure, and meaning structure. As well as being important for language-based applications, this computational infrastructure makes it possible to investigate the structure of human language and communication at a deeper scientific level than was ever previously possible.

Moreover, NLP fits in naturally with other branches of computer science, and in particular with artificial intelligence (AI). From an AI perspective, language use is regarded as a manifestation of intelligent behaviour by an active agent. The emphasis in AI-based approaches to language and communication is on the computational infrastructure required to integrate linguistic performance into a general theory of intelligent agents, a theory that includes, for example, learning generalizations on the basis of particular experience, the ability to plan and reason about intentionally produced utterances, and the design of utterances that will fulfill a particular set of goals. Such work tends to be highly interdisciplinary in nature, as it needs to draw on ideas from fields such as linguistics, cognitive psychology, and sociology. LCT draws on and incorporates knowledge and research from all of these fields.
Today
Language and communication are so fundamental to human activity that it is not at all surprising to find that Language and Communication Technologies affect all major areas of society, including health, education, finance, commerce, and travel. Modern LCT is based on a dual tradition of symbols and statistics, which means that research on language nowadays requires access to large databases of information about words and their properties, to large-scale computational grammars, to computational tools for working with all levels of language, and to efficient inference systems for performing reasoning. By working computationally it is possible to get to grips with the deeper structure of natural languages and, in particular, to model the crucial interactions between the various levels of language and other cognitive faculties. Relevant areas of research in LCT include: