Propositional formula


In propositional logic, a propositional formula is a type of syntactic formula which is well formed and has a truth value. If the values of all variables in a propositional formula are given, it determines a unique truth value. A propositional formula may also be called a propositional expression, a sentence, or a sentential formula.
A propositional formula is constructed from simple propositions, such as "five is greater than three", or propositional variables such as P and Q, using connectives or logical operators such as NOT, AND, OR, or IMPLIES; for example: (p AND NOT q) IMPLIES (p OR q).
In mathematics, a propositional formula is often more briefly referred to as a "proposition", but, more precisely, a propositional formula is not a proposition but a formal expression that denotes a proposition, a formal object under discussion, just as an expression such as "x + y" is not a value but denotes a value. In some contexts, maintaining the distinction may be of importance.
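To make the evaluation idea concrete, here is a minimal sketch (an illustration, not part of the article): a propositional formula treated as a Python function of its variables. Once every variable is assigned a truth value, the formula yields a unique truth value; enumerating every assignment yields its truth table.

```python
from itertools import product

def formula(p, q):
    """An example propositional formula: (p AND q) OR (NOT p)."""
    return (p and q) or (not p)

# Assigning values to all variables determines a unique truth value;
# enumerating every assignment produces the formula's full truth table.
truth_table = {(p, q): formula(p, q)
               for p, q in product([False, True], repeat=2)}
```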

Propositions

For the purposes of the propositional calculus, propositions are considered to be either simple or compound. Compound propositions are considered to be linked by sentential connectives, some of the most common of which are "AND", "OR", "IF … THEN …", "NEITHER … NOR …", "… IS EQUIVALENT TO …". The linking semicolon ";" and the connective "BUT" are considered to be expressions of "AND". A sequence of discrete sentences is considered to be linked by "AND"s, and formal analysis applies a recursive "parenthesis rule" with respect to sequences of simple propositions.
Simple propositions are declarative in nature, that is, they make assertions about the condition or nature of a particular object of sensation e.g. "This cow is blue", "There's a coyote!". Thus the simple "primitive" assertions must be about specific objects or specific states of mind. Each must have at least a subject, a verb, and perhaps an adjective or adverb. "Dog!" probably implies "I see a dog" but should be rejected as too ambiguous.
For the purposes of the propositional calculus a compound proposition can usually be reworded into a series of simple sentences, although the result will probably sound stilted.

Relationship between propositional and predicate formulas

The predicate calculus goes a step further than the propositional calculus to an "analysis of the inner structure of propositions". It breaks a simple sentence down into two parts: its subject and its predicate. The predicate calculus then generalizes the "subject|predicate" form into a form with the following blank-subject structure "___|predicate", and the predicate is in turn generalized to all things with that property.
The generalization of "this pig" to a member of two classes, "winged things" and "blue things", means that it has a truth-relationship with both of these classes. In other words, given a domain of discourse "winged things", p is either found to be a member of this domain or not. Thus there is a relationship W between p and {T, F}: W(p) evaluates to {T, F}, where {T, F} is the set of the boolean values "true" and "false". Likewise for B and p: B(p) evaluates to {T, F}. So one can now analyze the connected assertions "B(p) AND W(p)" for their overall truth-value.
In particular, simple sentences that employ notions of "all", "some", "a few", "one of", etc. are treated by the predicate calculus. Along with the new function symbolism "F" two new symbols are introduced: ∀, and ∃. The predicate calculus, but not the propositional calculus, can establish the formal validity of the following statement:
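As a small illustration (the domain, the predicate, and all names below are invented for this sketch), the quantifiers ∀ and ∃ over a finite domain of discourse behave like Python's all() and any() applied to a predicate:

```python
# A finite domain of discourse and a predicate W ("is winged") that
# evaluates each member of the domain to True or False.
domain = ["sparrow", "pig", "bat"]
winged = {"sparrow": True, "pig": False, "bat": True}

def W(x):
    return winged[x]

forall_W = all(W(x) for x in domain)   # ∀x W(x): is every thing winged?
exists_W = any(W(x) for x in domain)   # ∃x W(x): is some thing winged?
```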

Identity

Tarski asserts that the notion of IDENTITY lies outside the propositional calculus; however, he notes that if a logic is to be of use for mathematics and the sciences it must contain a "theory" of IDENTITY. Some authors refer to "predicate logic with identity" to emphasize this extension. See more about this below.

An algebra of propositions, the propositional calculus

An algebra, loosely defined, is a method by which a collection of symbols called variables together with some other symbols such as parentheses and some sub-set of symbols such as *, +, ~, &, ∨, =, ≡, ∧, ¬ are manipulated within a system of rules. These symbols, and well-formed strings of them, are said to represent objects, but in a specific algebraic system these objects do not have meanings. Thus work inside the algebra becomes an exercise in obeying certain laws of the algebra's syntax rather than in semantics of the symbols. The meanings are to be found outside the algebra.
For a well-formed sequence of symbols in the algebra —a formula— to have some usefulness outside the algebra the symbols are assigned meanings and eventually the variables are assigned values; then by a series of rules the formula is evaluated.
When the values are restricted to just two and applied to the notion of simple sentences linked by propositional connectives this whole algebraic system of symbols and rules and evaluation-methods is usually called the propositional calculus or the sentential calculus.
While some of the familiar rules of arithmetic algebra continue to hold in the algebra of propositions, some do not.

Usefulness of propositional formulas

Analysis: In deductive reasoning, philosophers, rhetoricians and mathematicians reduce arguments to formulas and then study them for correctness. For example: Is the following argument sound?
Engineers analyze the logic circuits they have designed using synthesis techniques and then apply various reduction and minimization techniques to simplify their designs.
Synthesis: Engineers in particular synthesize propositional formulas from truth tables. For example, one might write down a truth table for how binary addition should behave given the addition of variables "b" and "a" and "carry_in" "ci", and the results "carry_out" "co" and "sum" Σ:
row | b | a | ci | b+a+ci | co | Σ
----+---+---+----+--------+----+---
 0  | 0 | 0 | 0  |   0    | 0  | 0
 1  | 0 | 0 | 1  |   1    | 0  | 1
 2  | 0 | 1 | 0  |   1    | 0  | 1
 3  | 0 | 1 | 1  |   2    | 1  | 0
 4  | 1 | 0 | 0  |   1    | 0  | 1
 5  | 1 | 0 | 1  |   2    | 1  | 0
 6  | 1 | 1 | 0  |   2    | 1  | 0
 7  | 1 | 1 | 1  |   3    | 1  | 1
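The table above synthesizes into the standard full-adder formulas Σ = b XOR a XOR ci and co = (b AND a) OR (ci AND (b XOR a)). The sketch below (a check added for illustration, not the article's own derivation) verifies these formulas against all eight rows:

```python
from itertools import product

def full_adder(b, a, ci):
    """Synthesized from the truth table: carry-out and sum of b + a + ci."""
    s = b ^ a ^ ci                    # Σ  = b XOR a XOR ci
    co = (b & a) | (ci & (b ^ a))     # co = (b AND a) OR (ci AND (b XOR a))
    return co, s

# The pair (co, Σ), read as a 2-bit binary number, equals the arithmetic sum.
for b, a, ci in product([0, 1], repeat=3):
    co, s = full_adder(b, a, ci)
    assert 2 * co + s == b + a + ci
```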

Propositional variables

The simplest type of propositional formula is a propositional variable. Propositions that are simple, symbolic expressions are often denoted by variables named p, q, or P, Q, etc. A propositional variable is intended to represent an atomic proposition, such as "It is Saturday" = p or "I only go to the movies on Monday" = q.

Truth-value assignments, formula evaluations

Evaluation of a propositional formula begins with assignment of a truth value to each variable. Because each variable represents a simple sentence, the truth values are being applied to the "truth" or "falsity" of these simple sentences.
Truth values in rhetoric, philosophy and mathematics: The truth values are only two: {TRUTH, FALSITY}. An empiricist puts all propositions into two broad classes: analytic (true no matter what) and synthetic (derived from experience and thereby susceptible to confirmation by third parties). Empiricists hold that, in general, to arrive at the truth-value of a synthetic proposition, meanings must first be applied to the words, and then these meaning-templates must be matched against whatever it is that is being asserted. For example, consider my utterance "That cow is blue!" Is this statement a TRUTH? Truly I said it. And maybe I am seeing a blue cow; unless I am lying, my statement is a TRUTH relative to the object of my perception. But is the blue cow "really there"? What do you see when you look out the same window? In order to proceed with a verification, you will need a prior notion of both "cow" and "blue", and an ability to match the templates against the object of sensation.
Truth values in engineering: Engineers try to avoid notions of truth and falsity that bedevil philosophers, but in the final analysis engineers must trust their measuring instruments. In their quest for robustness, engineers prefer to pull known objects from a small library: objects that have well-defined, predictable behaviors even in large combinations. The fewest behaviors of a single object are two, and these are put in correspondence with {0, 1}. Such elements are called digital; those with a continuous range of behaviors are called analog. Whenever decisions must be made in an analog system, quite often an engineer will convert an analog behavior to digital by use of a comparator.
Thus an assignment of meaning to the variables and the two value-symbols comes from "outside" the formula that represents the behavior of the compound object. An example is a garage door with two "limit switches", one for UP labelled SW_U and one for DOWN labelled SW_D, and whatever else is in the door's circuitry. Inspection of the circuit might reveal that, on the circuit board, "node 22" goes to +0 volts when the contacts of switch "SW_D" are mechanically in contact and the door is in the "down" position, and "node 29" goes to +0 volts when the door is 95% UP and the contacts of switch SW_U are in mechanical contact. The engineer must define the meanings of these voltages and all possible combinations, including the "bad" ones. The circuit mindlessly responds to whatever voltages it experiences without any awareness of TRUTH or FALSEHOOD, RIGHT or WRONG, SAFE or DANGEROUS.

Propositional connectives

Arbitrary propositional formulas are built from propositional variables and other propositional formulas using propositional connectives. Examples of connectives include:
The following are the connectives common to rhetoric, philosophy and mathematics together with their truth tables. The symbols used will vary from author to author and between fields of endeavor. In general the abbreviations "T" and "F" stand for the evaluations TRUTH and FALSITY applied to the variables in the propositional formula.
The connectives go by a number of different word-usages, e.g. "a IMPLIES b" is also said "IF a THEN b". Some of these are shown in the table.

Engineering connectives

In general, the engineering connectives are the same as the mathematics connectives except that they tend to evaluate with "1" = "T" and "0" = "F". This is done for the purposes of analysis/minimization and synthesis of formulas by use of the notion of minterms and Karnaugh maps. Engineers also use the words logical product (from Boole's notion) and logical sum (from Jevons' notion).

CASE connective: IF … THEN … ELSE …

The IF... THEN... ELSE... connective appears as the simplest form of CASE operator of recursion theory and computation theory and is the connective responsible for conditional goto's. From this one connective all other connectives can be constructed. Although "IF c THEN b ELSE a" sounds like an implication, it is, in its most reduced form, a switch that makes a decision and offers as outcome only one of two alternatives, "a" or "b".
The following three propositions are equivalent:
  1. IF c THEN b ELSE a
  2. (c → b) & (~c → a)
  3. (c & b) ∨ (~c & a)
Thus IF … THEN … ELSE, unlike implication, does not evaluate to an ambiguous "TRUTH" when the first proposition is false (i.e. when c = F). For example, most people would reject the following compound proposition as a nonsensical non sequitur because the second sentence is not connected in meaning to the first.
In recognition of this problem, the sign → of formal implication in the propositional calculus is called material implication to distinguish it from the everyday, intuitive implication.
The use of the IF... THEN... ELSE construction avoids controversy because it offers a completely deterministic choice between two stated alternatives; it offers two "objects", and it selects between them exhaustively and unambiguously. In the truth table below, d1 is the formula ( (c → b) & (~c → a) ). Its fully reduced form d2 is the formula ( (c & b) ∨ (~c & a) ). The two formulas are equivalent, as shown by the columns "=d1" and "=d2". Electrical engineers call the fully reduced formula the AND-OR-SELECT operator. The CASE operator is an extension of the same idea to n possible, but mutually exclusive, outcomes. Electrical engineers call the CASE operator a multiplexer.
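The equivalence can be checked by exhaustion. In this sketch (function names d1 and d2 follow the text; the formulas are the implication form and the reduced AND-OR-SELECT form), both agree with a deterministic if/else choice on every input:

```python
from itertools import product

def d1(c, b, a):
    return (not c or b) and (c or a)     # (c → b) & (~c → a)

def d2(c, b, a):
    return (c and b) or (not c and a)    # (c & b) ∨ (~c & a): AND-OR-SELECT

# Both formulas behave exactly like the deterministic choice "b if c else a".
for c, b, a in product([False, True], repeat=3):
    assert d1(c, b, a) == d2(c, b, a) == (b if c else a)
```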

IDENTITY and evaluation

The first table of this section stars (***) the entry logical equivalence to note the fact that "logical equivalence" is not the same thing as "identity". For example, most would agree that the assertion "That cow is blue" is identical to the assertion "That cow is blue". On the other hand, logical equivalence sometimes appears in speech, as in this example: "'The sun is shining' means 'I'm biking'". Translated into a propositional formula, the words become: "IF 'the sun is shining' THEN 'I'm biking', AND IF 'I'm biking' THEN 'the sun is shining'".
Different authors use different signs for logical equivalence: ↔, ≡, ⇔. Typically identity is written as the equals sign =. One exception to this rule is found in Principia Mathematica. For more about the philosophy of the notion of IDENTITY see Leibniz's law.
As noted above, Tarski considers IDENTITY to lie outside the propositional calculus, but he asserts that without the notion, "logic" is insufficient for mathematics and the deductive sciences. In fact the sign " = " comes into the propositional calculus when a formula is to be evaluated.
In some systems there are no truth tables, but rather just formal axioms, and the result of such a calculus will be another formula. Eventually, however, if one wants to use the calculus to study notions of validity and truth, one must add axioms that define the behavior of the symbols called "the truth values" relative to the other symbols.
For example, Hamilton uses two symbols = and ≠ when he defines the notion of a valuation v of any well-formed formulas A and B in his "formal statement calculus" L. A valuation v is a function from the wffs of his system L to the range {T, F}, given that each variable p1, p2, p3 in a wff is assigned an arbitrary truth value.
The two definitions define the equivalents of the truth tables for the ~ and → connectives of his system. The first derives F ≠ T and T ≠ F; in other words, v(A) and v(~A) never agree. The second specifies the row in which v(A) = T and v(B) = F, assigning the value F to the entire expression v(A → B); the other three rows then come from an application of the first definition. The definitions also serve as formation rules that allow substitution of a value previously derived into a formula.
Some formal systems specify these valuation axioms at the outset in the form of certain formulas such as the law of contradiction or laws of identity and nullity. The choice of which ones to use, together with laws such as commutation and distribution, is up to the system's designer as long as the set of axioms is complete.

More complex formulas

As shown above, the CASE connective is constructed either from the 2-argument connectives IF... THEN... and AND, or from OR and AND and the 1-argument NOT. Connectives such as the n-argument AND and OR are constructed from strings of two-argument AND and OR and written in abbreviated form without the parentheses. These, and other connectives as well, can then be used as building blocks for yet further connectives. Rhetoricians, philosophers, and mathematicians use truth tables and the various theorems to analyze and simplify their formulas.
Electrical engineering uses drawn symbols and connects them with lines that stand for the mathematical acts of substitution and replacement. Engineers then verify their drawings with truth tables and simplify the expressions as shown below by use of Karnaugh maps or the theorems. In this way engineers have created a host of "combinatorial logic" such as "decoders", "encoders", "multifunction gates", "majority logic", "binary adders", "arithmetic logic units", etc.

Definitions

A definition creates a new symbol and specifies its behavior, often for the purposes of abbreviation. Once the definition is presented, either form of the equivalent symbol or formula can be used. The symbolism =Df follows the convention of Reichenbach. Some examples of convenient definitions can be drawn from the symbol set and the variables. Each definition produces a logically equivalent formula that can be used for substitution or replacement.

Axiom and definition schemas

The definitions above for OR, IMPLICATION, XOR, and logical equivalence are actually schemas, that is, they are models for a general formula format but shown with specific letters a, b, c for the variables, whereas any variable letters can go in their places as long as the letter substitutions follow the rule of substitution below.

Substitution versus replacement

Substitution: The variable or sub-formula to be substituted must be replaced with the new variable, constant, or sub-formula in all instances throughout the overall formula.
Replacement: the formula to be replaced must be logically equivalent to the formula that replaces it, i.e. the two must lie within a tautology; unlike substitution, it is permissible for the replacement to occur in only one place.

Inductive definition

The classical presentation of propositional logic uses the connectives ¬, ∧, ∨, →, and ↔. The set of formulas over a given set of propositional variables is inductively defined to be the smallest set of expressions such that:
Each propositional variable in the set is a formula;
if A is a formula, then ¬A is a formula;
if A and B are formulas, then (A ∧ B), (A ∨ B), (A → B), and (A ↔ B) are formulas.
This inductive definition can be easily extended to cover additional connectives.
The inductive definition can also be rephrased in terms of a closure operation. Let V denote a set of propositional variables and let X_V denote the set of all strings over an alphabet including the symbols in V, left and right parentheses, and all the logical connectives under consideration. Each logical connective corresponds to a formula-building operation, a function taking one or two elements of X_V to an element of X_V:
The set of formulas over V is defined to be the smallest subset of X_V containing V and closed under all the formula-building operations.
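The formula-building operations can be sketched as string functions (a toy illustration; the names E_not, E_and, E_or are invented here): each maps well-formed strings to a new well-formed string, and the set of formulas is the closure of the variables under these maps.

```python
def E_not(x):
    return f"(~{x})"

def E_and(x, y):
    return f"({x} & {y})"

def E_or(x, y):
    return f"({x} ∨ {y})"

# Building up from the variables p and q by repeated application:
f = E_or(E_and("p", "q"), E_not("p"))   # ((p & q) ∨ (~p))
```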

Parsing formulas

The following "laws" of the propositional calculus are used to "reduce" complex formulas. The "laws" can be verified easily with truth tables. For each law, the principal connective is associated with logical equivalence ≡ or identity =. A complete analysis of all 2^n combinations of truth-values for its n distinct variables will result in a column of 1's underneath this connective. This finding makes each law, by definition, a tautology. And, for a given law, because its formulas on the left and right are equivalent, they can be substituted for one another.
Enterprising readers might challenge themselves to invent an "axiomatic system" that uses the symbols, the formation rules specified above, and as few as possible of the laws listed below, and then derive as theorems the others as well as the truth-table valuations for ∨, &, and ~. One set attributed to Huntington uses eight of the laws defined below.
If used in an axiomatic system, the symbols 1 and 0 are considered to be well-formed formulas and thus obey all the same rules as the variables. Thus the laws listed below are actually axiom schemas, that is, they stand in place of an infinite number of instances: the same schema might be instantiated with one set of formulas in one instance and with a different set in another.

Connective seniority (symbol rank)

In general, to avoid confusion during analysis and evaluation of propositional formulas, make liberal use of parentheses. However, quite often authors leave them out. To parse a complicated formula one first needs to know the seniority, or rank, that each of the connectives has over the other connectives. To "well-form" a formula, start with the connective with the highest rank and add parentheses around its components, then move down in rank. From most- to least-senior, with the predicate signs ∀x and ∃x, the IDENTITY sign = and the arithmetic signs added for completeness:
Thus a formula can be parsed by rank, but because NOT does not obey the distributive law, the parentheses around the inner formula are mandatory.

Commutative and associative laws

Both AND and OR obey the commutative law and associative law:
Omitting parentheses in strings of AND and OR: The connectives are considered to be unary and binary, so a string such as (a & b & c) abbreviates the fully parenthesized ((a & b) & c).
However, a truth-table demonstration shows that the form without the extra parentheses is perfectly adequate.
Omitting parentheses with regards to a single-variable NOT: While ~(a), where a is a single variable, is perfectly clear, ~a is adequate and is the usual way this literal would appear. When the NOT is over a formula with more than one symbol, then the parentheses are mandatory, e.g. ~(a ∨ b).

Distributive laws

OR distributes over AND and AND distributes over OR. NOT does not distribute over AND or OR. See below about De Morgan's law:
NOT, when distributed over OR or AND, does something peculiar:
Absorption, in particular the first one, causes the "laws" of logic to differ from the "laws" of arithmetic:
The sign " = " symbolizes the assignment of value or meaning. Thus a string such as (a & 0) symbolizes "0", i.e. it means the same thing as the symbol "0". In some "systems" this will be an axiom; in other systems, it may be derived in the truth table below:
A key property of formulas is that they can be uniquely parsed to determine the structure of the formula in terms of its propositional variables and logical connectives. When formulas are written in infix notation, as above, unique readability is ensured through an appropriate use of parentheses in the definition of formulas. Alternatively, formulas can be written in Polish notation or reverse Polish notation, eliminating the need for parentheses altogether.
The inductive definition of infix formulas in the previous section can be converted to a formal grammar in Backus-Naur form:

<formula> ::= <propositional variable>
           | ( ¬ <formula> )
           | ( <formula> ∧ <formula> )
           | ( <formula> ∨ <formula> )
           | ( <formula> → <formula> )
           | ( <formula> ↔ <formula> )

It can be shown that any expression matched by the grammar has a balanced number of left and right parentheses, and any nonempty initial segment of a formula has more left than right parentheses. This fact can be used to give an algorithm for parsing formulas. For example, suppose that an expression x begins with "(¬". Starting after the second symbol, match the shortest subexpression y of x that has balanced parentheses. If x is a formula, there is exactly one symbol left after this expression, this symbol is a closing parenthesis, and y itself is a formula. This idea can be used to generate a recursive descent parser for formulas.
This method locates, as count "1", the principal connective, the connective under which the overall evaluation of the formula occurs within the outermost parentheses. It also locates the innermost connective (the one with the highest count), where one would begin evaluation of the formula without the use of a truth table. As an example of parenthesis counting, beneath each symbol of the formula one writes a running count that increases at each "(" and decreases at each ")"; the count returns to 0 at the close of a well-formed formula.
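The parenthesis-counting idea can be sketched in code (an illustration; the helper name is invented): scan the formula while keeping a running depth, and the connective met at depth 1 is the principal connective.

```python
def principal_connective(formula, connectives="~&|>="):
    """Return (index, symbol) of the connective at parenthesis depth 1."""
    depth = 0
    for i, ch in enumerate(formula):
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
        elif ch in connectives and depth == 1:
            return i, ch
    return None   # no principal connective found (not a compound formula)

# The overall evaluation of ((p & q) | r) occurs beneath the "|":
result = principal_connective("((p & q) | r)")   # (9, "|")
```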

Well-formed formulas versus valid formulas in inferences

The notion of valid argument is usually applied to inferences in arguments, but arguments reduce to propositional formulas and can be evaluated the same as any other propositional formula. Here a valid inference means: "The formula that represents the inference evaluates to "truth" beneath its principal connective, no matter what truth-values are assigned to its variables", i.e. the formula is a tautology.
Quite possibly a formula will be well-formed but not valid. Another way of saying this is: "Being well-formed is necessary for a formula to be valid but it is not sufficient." The only way to find out if it is both well-formed and valid is to submit it to verification with a truth table or by use of the "laws":
A set of logical connectives is called complete if every propositional formula is tautologically equivalent to a formula with just the connectives in that set. There are many complete sets of connectives, including { ∧, ¬ }, { ∨, ¬ }, and { →, ¬ }. There are two binary connectives that are complete on their own, corresponding to NAND and NOR, respectively. Some pairs are not complete, for example { ∧, ∨ }.

The stroke (NAND)

The binary connective corresponding to NAND is called the Sheffer stroke, and written with a vertical bar | or vertical arrow ↑. The completeness of this connective was noted in Principia Mathematica. Since it is complete on its own, all other connectives can be expressed using only the stroke. For example, where the symbol " ≡ " represents logical equivalence:
In particular, the zero-ary connectives ⊤ (truth) and ⊥ (falsity) can be expressed using the stroke.
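Completeness of the stroke can be checked by exhaustion. The sketch below expresses NOT, AND, and OR using only NAND and verifies each identity over {0, 1}:

```python
from itertools import product

def nand(a, b):
    return 1 - (a & b)          # the Sheffer stroke, written here as nand

def NOT(a):
    return nand(a, a)           # ~a ≡ (a NAND a)

def AND(a, b):
    return nand(nand(a, b), nand(a, b))   # (a & b) ≡ NOT(a NAND b)

def OR(a, b):
    return nand(nand(a, a), nand(b, b))   # (a ∨ b) ≡ (~a NAND ~b)

# Verify every identity over all assignments of 0 and 1:
for a, b in product([0, 1], repeat=2):
    assert NOT(a) == 1 - a
    assert AND(a, b) == (a & b)
    assert OR(a, b) == (a | b)
```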

IF … THEN … ELSE

This connective, together with the constants 0 and 1, forms a complete set. In the following, the IF c THEN b ELSE a relation, written = d, represents ( (c & b) ∨ (~c & a) ) = d.
Example: The following shows how a theorem-based proof of the equivalence would proceed; below the proof is its truth-table verification.
In the following truth table the column labelled "taut" for tautology evaluates logical equivalence between the two columns labelled d. Because all four rows under "taut" are 1's, the equivalence indeed represents a tautology.

Normal forms

An arbitrary propositional formula may have a very complicated structure. It is often convenient to work with formulas that have simpler forms, known as normal forms. Some common normal forms include conjunctive normal form and disjunctive normal form. Any propositional formula can be reduced to its conjunctive or disjunctive normal form.

Reduction to normal form

Reduction to normal form is relatively simple once a truth table for the formula is prepared. But further attempts to minimize the number of literals require some tools: reduction by De Morgan's laws and truth tables can be unwieldy, but Karnaugh maps are very suitable for a small number of variables. Some sophisticated tabular methods exist for more complex circuits with multiple outputs, but these are beyond the scope of this article; for more see the Quine-McCluskey algorithm.

Literal, term and alterm

In electrical engineering a variable x or its negation ~(x) is lumped together into a single notion called a literal. A string of literals connected by ANDs is called a term. A string of literals connected by ORs is called an alterm. Typically the literal ~(x) is abbreviated ~x. Sometimes the &-symbol is omitted altogether in the manner of algebraic multiplication.
In the same way that a 2^n-row truth table displays the evaluation of a propositional formula for all 2^n possible values of its variables, n variables produce a Karnaugh map of 2^n squares. For example, 3 variables produce 2^3 = 8 truth-table rows and 8 Karnaugh squares; 4 variables produce 16 truth-table rows and 16 squares, and therefore 16 minterms. Each Karnaugh-map square and its corresponding truth-table evaluation represent one minterm.
Any propositional formula can be reduced to the "logical sum" of the active minterms. When in this form the formula is said to be in disjunctive normal form. But even though it is in this form, it is not necessarily minimized with respect to either the number of terms or the number of literals.
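Reading the "logical sum" of active minterms off a truth table can be sketched as follows (the formula chosen here is an arbitrary example, not one from the article):

```python
from itertools import product

def f(c, b, a):
    return (c and b) or (not c and a)   # example formula to normalize

minterms = []
for c, b, a in product([False, True], repeat=3):
    if f(c, b, a):                      # an "active" (1-valued) row
        lits = [v if val else "~" + v
                for v, val in zip("cba", (c, b, a))]
        minterms.append("(" + " & ".join(lits) + ")")

# The disjunctive normal form is the OR ("logical sum") of active minterms.
dnf = " ∨ ".join(minterms)
```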
In the following table, observe the peculiar numbering of the rows. The first column is the decimal equivalent of the digits "cba" read as a binary number; in other words, the row number is c·2^2 + b·2^1 + a·2^0.
This numbering comes about because as one moves down the table from row to row only one variable at a time changes its value. Gray code is derived from this notion. This notion can be extended to three and four-dimensional hypercubes called Hasse diagrams where each corner's variables change only one at a time as one moves around the edges of the cube. Hasse diagrams flattened into two dimensions are either Veitch diagrams or Karnaugh maps.
When working with Karnaugh maps one must always keep in mind that the top edge wraps around to the bottom edge, and the left edge wraps around to the right edge; the Karnaugh diagram is really a three- or four- or n-dimensional flattened object.
decimal equivalent of cba | c b a | minterm
            0             | 0 0 0 | (~c & ~b & ~a)
            1             | 0 0 1 | (~c & ~b & a)
            3             | 0 1 1 | (~c & b & a)
            2             | 0 1 0 | (~c & b & ~a)
            6             | 1 1 0 | (c & b & ~a)
            7             | 1 1 1 | (c & b & a)
            5             | 1 0 1 | (c & ~b & a)
            4             | 1 0 0 | (c & ~b & ~a)
            0             | 0 0 0 | (~c & ~b & ~a)
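The row ordering above is a reflected Gray code; the standard conversion from a sequential index i is g = i XOR (i >> 1), sketched below:

```python
def gray(i):
    return i ^ (i >> 1)   # binary-reflected Gray code

order = [gray(i) for i in range(8)]   # [0, 1, 3, 2, 6, 7, 5, 4]

# Adjacent codes (cyclically, matching the map's wrap-around) differ in
# exactly one bit, so only one variable changes from row to row.
for x, y in zip(order, order[1:] + order[:1]):
    assert bin(x ^ y).count("1") == 1
```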

Reduction by use of the map method (Veitch, Karnaugh)

Veitch improved the notion of Venn diagrams by converting the circles to abutting squares, and Karnaugh simplified the Veitch diagram by converting the minterms, written in their literal form, into numbers. The method proceeds as follows:

Produce the formula's truth table

Produce the formula's truth table. Number its rows using the binary-equivalents of the variables for n variables.
Example: the formula, call it q, is first written in conjunctive normal form directly from the truth table. However, this formula can be reduced both in the number of terms and in the total count of its literals.

Create the formula's Karnaugh map

Use the values of the formula found by the truth-table method and place them into their respective Karnaugh squares. If values of "d" for "don't care" appear in the table, this adds flexibility during the reduction phase.

Reduce minterms

Minterms of adjacent 1-squares can be reduced with respect to the number of their literals, and the number of terms also will be reduced in the process. Two abutting squares lose one literal, four squares in a 4 x 1 rectangle or 2 x 2 square lose two literals, eight squares in a rectangle lose 3 literals, etc. This process continues until all abutting squares are accounted for, at which point the propositional formula is minimized.
For example, squares #3 and #7 abut and can therefore be combined, losing one literal in the process.
Example: The map method usually is done by inspection. The following example expands the algebraic method to show the "trick" behind the combining of terms on a Karnaugh map:
Observe that by the Idempotency law (A ∨ A) = A, we can create more terms. Then, by the association and distributive laws, the variables slated to disappear can be paired, and then "disappeared" with the Law of contradiction (A & ~A) = 0. The following uses brackets only to keep track of the terms; they have no special significance:
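The combining "trick" can also be sketched as the Quine-McCluskey merge step (the helper name below is invented for this illustration): two minterms whose binary codes differ in exactly one bit position merge into one term, and the differing literal disappears.

```python
def combine(m1, m2):
    """Merge minterm codes like '011' and '111' into '-11' ('-' marks the
    vanished literal); return None if they differ in more than one bit."""
    diffs = [i for i, (x, y) in enumerate(zip(m1, m2)) if x != y]
    if len(diffs) != 1:
        return None
    i = diffs[0]
    return m1[:i] + "-" + m1[i + 1:]

# Squares #3 (011) and #7 (111) abut: the "c" literal is lost.
merged = combine("011", "111")   # "-11", i.e. the term (b & a)
```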

Impredicative propositions

Given the following examples-as-definitions, what does one make of the subsequent reasoning:
Then assign the variable "s" to the left-most sentence "This sentence is simple". Define "compound" c = "not simple" = ~s, and assign c = ~s to "This sentence is compound"; assign "j" to "It is conjoined by AND". The second sentence can then be expressed as: ( ~s & j ).
If truth values are to be placed on the sentences c = ~s and j, then all are clearly FALSEHOODS: e.g. "This sentence is compound" is a FALSEHOOD. So their conjunction is a falsehood. But when taken in its assembled form, the sentence is a TRUTH.
This is an example of the paradoxes that result from an impredicative definition—that is, when an object m has a property P, but the object m is defined in terms of property P. The best advice for a rhetorician or one involved in deductive analysis is avoid impredicative definitions but at the same time be on the lookout for them because they can indeed create paradoxes. Engineers, on the other hand, put them to work in the form of propositional formulas with feedback.

Propositional formula with "feedback"

The notion of a propositional formula appearing as one of its own variables requires a formation rule that allows the assignment of the formula to a variable. In general there is no stipulation that forbids this from happening.
The simplest case occurs when an OR formula becomes one of its own inputs, e.g. p = q. Begin with (p ∨ s) = q, then let p = q. Observe that q's "definition" depends on itself "q" as well as on "s" and the OR connective; this definition of q is thus impredicative.
Either of two conditions can result: oscillation or memory.
It helps to think of the formula as a black box. Without knowledge of what is going on "inside" the formula-"box" from the outside it would appear that the output is no longer a function of the inputs alone. That is, sometimes one looks at q and sees 0 and other times 1. To avoid this problem one has to know the state of the "hidden" variable p inside the box. When this is known the apparent inconsistency goes away.
To understand the behavior of formulas with feedback requires the more sophisticated analysis of sequential circuits. Propositional formulas with feedback lead, in their simplest form, to state machines; they also lead to memories in the form of Turing tapes and counter-machine counters. From combinations of these elements one can build any sort of bounded computational model.

Oscillation

In the abstract case the simplest oscillating formula is a NOT fed back to itself: ~p = q, with the output fed back to the input (p = q). Analysis of this abstract propositional formula in a truth table reveals an inconsistency for both the p=1 and p=0 cases: when p=1, q=0, and this cannot be because p=q; ditto for when p=0 and q=1.
Oscillation with delay: If a delay is inserted in the abstract formula between p and q, then p will oscillate between 1 and 0: 101010...101... ad infinitum. If either the delay or the NOT is not abstract (ideal), the type of analysis to be used will depend upon the exact nature of the objects that make up the oscillator; such things fall outside mathematics and into engineering.
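The delayed NOT oscillator can be sketched as a simple time-stepped simulation (a minimal sketch; the delay is modeled as one loop iteration holding the previous output):

```python
def not_gate(x):
    """Ideal NOT: 1 -> 0, 0 -> 1."""
    return 1 - x

p = 1          # initial value held in the delay element
trace = []
for _ in range(6):          # six time steps
    trace.append(p)         # observe the delayed output
    p = not_gate(p)         # feed it back through NOT

print(trace)  # [1, 0, 1, 0, 1, 0] -- oscillation ad infinitum
```

Each pass of the loop plays the role of one unit of delay, so p alternates indefinitely.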
Analysis requires a delay to be inserted and the loop then cut between the delay and the input "p". The delay must be viewed as a kind of proposition that has "qd" (q delayed) as output for "q" as input. This new proposition adds another column to the truth table. The inconsistency is now between "qd" and "p" as shown in red, resulting in two stable states:

Memory

Without delay, inconsistencies must be eliminated from a truth table analysis. With the notion of "delay", this condition presents itself as a momentary inconsistency between the fed-back output variable q and p = qdelayed.
A truth table reveals the rows where inconsistencies occur between p = qdelayed at the input and q at the output. After "breaking" the feedback, the truth table construction proceeds in the conventional manner. But afterwards, in every row, the output q is compared to the now-independent input p and any inconsistencies between p and q are noted; when the "line" is "remade", both are rendered impossible by the law of contradiction ~(p & ~p). Rows revealing inconsistencies are either considered transient states or just eliminated as inconsistent and hence "impossible".
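The procedure can be sketched as follows: cut the loop, treat p as an independent input, tabulate q, and flag rows where p ≠ q as transient. Using the OR memory q = p ∨ s as an illustrative example (the formula follows the feedback example above; the function name is an assumption):

```python
from itertools import product

def q_out(p, s):
    """q = p OR s, evaluated with the feedback line cut (p now independent)."""
    return int(p or s)

for p, s in product((0, 1), repeat=2):   # all rows of the truth table
    q = q_out(p, s)
    status = "stable" if p == q else "inconsistent (transient)"
    print(f"p={p} s={s} -> q={q}  {status}")
```

Only the row p=0, s=1 is flagged: it is the momentary state during which the memory is being "set", and it settles into the stable row p=1, s=1.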

Once-flip memory

About the simplest memory results when the output of an OR feeds back to one of its inputs, in this case output "q" feeds back into "p". Given that the formula is first evaluated with p=0 & q=0, it will "flip" once when "set" by s=1. Thereafter, output "q" will sustain "q" in the "flipped" condition. This behavior, now time-dependent, is shown by the state diagram to the right of the once-flip.
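The time-dependent behavior of the once-flip can be sketched as a step-by-step simulation (a minimal sketch assuming one unit of delay per step; names are illustrative):

```python
def step(q_prev, s):
    """One time step of the once-flip: q = q_prev OR s (output fed back to p)."""
    return int(q_prev or s)

q = 0                        # start with p=0 & q=0
inputs = [0, 0, 1, 0, 0]     # s is pulsed high exactly once
history = []
for s in inputs:
    q = step(q, s)
    history.append(q)

print(history)  # [0, 0, 1, 1, 1] -- once "flipped" by s=1, q stays set
```

After the single set pulse, the fed-back output sustains q=1 regardless of s.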

Flip-flop memory

The next simplest case is the "set-reset" flip-flop shown below the once-flip. Given that r=0 & s=0 and q=0 at the outset, it is "set" in a manner similar to the once-flip. It, however, has a provision to "reset" q=0 when r=1. An additional complication occurs if both set=1 and reset=1. In this formula, set=1 forces the output q=1, so when and if (s=0 & r=1) the flip-flop will be reset; or, if (s=1 & r=0), the flip-flop will be set. In the abstract instance in which s=1 ⇒ s=0 & r=1 ⇒ r=0 simultaneously, the formula q will be indeterminate. Due to delays in "real" OR, AND and NOT, the result will be unknown at the outset but thereafter predictable.
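One common formulation of the set-reset latch is q = s ∨ (q ∧ ¬r); the sketch below steps it through a set, a hold, a reset, and another hold (the formula and the set-wins behavior on s=r=1 are assumptions of this particular variant):

```python
def sr_step(q_prev, s, r):
    """One step of a set-reset latch: q = s OR (q_prev AND NOT r).
    In this variant, s=1 forces q=1 even when r=1 ("set wins")."""
    return int(s or (q_prev and not r))

q = 0
seq = [(1, 0), (0, 0), (0, 1), (0, 0)]   # (s, r): set, hold, reset, hold
out = []
for s, r in seq:
    q = sr_step(q, s, r)
    out.append(q)

print(out)  # [1, 1, 0, 0] -- set latches 1, reset restores 0
```

The two hold steps (s=0, r=0) show the feedback term q ∧ ¬r preserving whatever value was last forced.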

Clocked flip-flop memory

The formula known as "clocked flip-flop" memory is given below. It works as follows: When c = 0 the data d cannot "get through" to affect output q. When c = 1 the data d "gets through" and output q "follows" d's value. When c goes from 1 to 0 the last value of the data remains "trapped" at output "q". As long as c=0, d can change value without causing q to change.
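The clocked (transparent) latch behavior described above is commonly written q = (c ∧ d) ∨ (¬c ∧ q_prev); the following sketch steps through it (the specific formula is a standard gated-latch formulation, assumed here for illustration):

```python
def d_latch_step(q_prev, c, d):
    """Clocked latch: when c=1, q follows d; when c=0, q holds its last value."""
    return int((c and d) or ((not c) and q_prev))

q = 0
steps = [(1, 1), (0, 0), (0, 1), (1, 0), (0, 1)]   # (c, d) pairs
out = []
for c, d in steps:
    q = d_latch_step(q, c, d)
    out.append(q)

print(out)  # [1, 1, 1, 0, 0]
```

While c=0, changes in d (steps two and three, and the final step) leave q untouched; only while c=1 does q "follow" d.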
The state diagram is similar in shape to the flip-flop's state diagram, but with different labelling on the transitions.

Historical development

Russell lists three laws of thought that derive from Aristotle: The law of identity: "Whatever is, is.", The law of noncontradiction: "Nothing can both be and not be", and The law of excluded middle: "Everything must either be or not be."
The use of the word "everything" in the law of excluded middle renders Russell's expression of this law open to debate. If restricted to an expression about BEING or QUALITY with reference to a finite collection of objects -- the members of which can be investigated one after another for the presence or absence of the assertion -- then the law is considered intuitionistically appropriate. Thus an assertion such as "This object must either BE or NOT BE", or "This object must either have this QUALITY or NOT have this QUALITY", is acceptable. See more at Venn diagram.
Although a propositional calculus originated with Aristotle, the notion of an algebra applied to propositions had to wait until the early 19th century. In a reaction to the 2000-year tradition of Aristotle's syllogisms, John Locke's Essay concerning human understanding used the word semiotics. By 1826 Richard Whately had critically analyzed the syllogistic logic with a sympathy toward Locke's semiotics. George Bentham's work resulted in the notion of "quantification of the predicate". A "row" instigated by William Hamilton over a priority dispute with Augustus De Morgan "inspired George Boole to write up his ideas on logic, and to publish them as MAL in 1847".
About his contribution Grattan-Guinness and Bornet comment:
Gottlob Frege's massive undertaking resulted in a formal calculus of propositions, but his symbolism is so daunting that it had little influence except on one person: Bertrand Russell. First as a student of Alfred North Whitehead, he studied Frege's work and suggested an emendation with respect to it around the problem of an antinomy that he discovered in Frege's treatment. Russell's work led to a collaboration with Whitehead that, in the year 1910, produced the first volume of Principia Mathematica (PM). It is here that what we consider "modern" propositional logic first appeared. In particular, PM introduces NOT and OR and the assertion symbol ⊦ as primitives. In terms of these notions they define IMPLICATION →, then AND, then EQUIVALENCE p ←→ q as (p → q) & (q → p).
Computation and switching logic: