there are K-linear maps Δ: B → B ⊗ B (comultiplication) and ε: B → K (counit), such that (B, Δ, ε) is a coalgebra;
compatibility conditions, expressed by the commutativity of the following four diagrams (written here as the corresponding equations):
1. Multiplication ∇ and comultiplication Δ:
Δ ∘ ∇ = (∇ ⊗ ∇) ∘ (id_B ⊗ τ ⊗ id_B) ∘ (Δ ⊗ Δ),
where τ: B ⊗ B → B ⊗ B is the linear map defined by τ(x ⊗ y) = y ⊗ x for all x and y in B;
2. Multiplication ∇ and counit ε:
ε ∘ ∇ = ∇₀ ∘ (ε ⊗ ε), where ∇₀: K ⊗ K → K is the multiplication of K;
3. Comultiplication Δ and unit η:
Δ ∘ η = (η ⊗ η) ∘ Δ₀, where Δ₀: K → K ⊗ K is the canonical isomorphism;
4. Unit η and counit ε:
ε ∘ η = id_K.
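These conditions can be checked concretely. The following is a minimal sketch, assuming NumPy and taking as the bialgebra the group bialgebra of the two-element group Z/2 described in the Examples section below; ∇, η, Δ and ε are written as matrices with respect to the standard basis, tensor products are realized as Kronecker products, and each of the four diagrams is verified as a matrix identity.

```python
import numpy as np

n = 2  # order of the group Z/2 = {0, 1} with addition mod 2

# Structure maps of the group bialgebra R[Z/2], as matrices acting on
# coordinate vectors with respect to the basis (e_0, e_1).
nabla = np.zeros((n, n * n))           # multiplication  B ⊗ B → B, e_i ⊗ e_j ↦ e_{i+j}
for i in range(n):
    for j in range(n):
        nabla[(i + j) % n, i * n + j] = 1

eta = np.zeros((n, 1))                 # unit K → B, 1 ↦ e_0
eta[0, 0] = 1

Delta = np.zeros((n * n, n))           # comultiplication B → B ⊗ B, e_i ↦ e_i ⊗ e_i
for i in range(n):
    Delta[i * n + i, i] = 1

eps = np.ones((1, n))                  # counit B → K, e_i ↦ 1

I = np.eye(n)
tau = np.zeros((n * n, n * n))         # flip map τ(x ⊗ y) = y ⊗ x
for i in range(n):
    for j in range(n):
        tau[j * n + i, i * n + j] = 1

# Diagram 1: Δ∘∇ = (∇⊗∇)∘(id⊗τ⊗id)∘(Δ⊗Δ)
lhs1 = Delta @ nabla
rhs1 = np.kron(nabla, nabla) @ np.kron(np.kron(I, tau), I) @ np.kron(Delta, Delta)
assert np.array_equal(lhs1, rhs1)

# Diagram 2: ε∘∇ = ε⊗ε (identifying K ⊗ K with K)
assert np.array_equal(eps @ nabla, np.kron(eps, eps))

# Diagram 3: Δ∘η = η⊗η (identifying K with K ⊗ K)
assert np.array_equal(Delta @ eta, np.kron(eta, eta))

# Diagram 4: ε∘η = id_K
assert np.array_equal(eps @ eta, np.eye(1))
```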
Coassociativity and counit
The K-linear map Δ: B → B ⊗ B is coassociative if (id_B ⊗ Δ) ∘ Δ = (Δ ⊗ id_B) ∘ Δ. The K-linear map ε: B → K is a counit if (ε ⊗ id_B) ∘ Δ = id_B = (id_B ⊗ ε) ∘ Δ, identifying K ⊗ B ≅ B ≅ B ⊗ K. Coassociativity and the counit condition are expressed by the commutativity of two diagrams, which are the duals of the diagrams expressing associativity and the unit of an algebra.
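For a concrete illustration, the following short sketch (again assuming NumPy and the group bialgebra of Z/2 used in the sketch above) checks coassociativity and the counit condition numerically.

```python
import numpy as np

n = 2                                  # the group bialgebra of Z/2
Delta = np.zeros((n * n, n))
for i in range(n):
    Delta[i * n + i, i] = 1            # Δ(e_i) = e_i ⊗ e_i
eps = np.ones((1, n))                  # ε(e_i) = 1
I = np.eye(n)

# Coassociativity: (id ⊗ Δ) ∘ Δ = (Δ ⊗ id) ∘ Δ
assert np.array_equal(np.kron(I, Delta) @ Delta, np.kron(Delta, I) @ Delta)

# Counit: (ε ⊗ id) ∘ Δ = id = (id ⊗ ε) ∘ Δ  (identifying K ⊗ B ≅ B ≅ B ⊗ K)
assert np.array_equal(np.kron(eps, I) @ Delta, I)
assert np.array_equal(np.kron(I, eps) @ Delta, I)
```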
Compatibility conditions
The four commutative diagrams can be read either as "comultiplication and counit are homomorphisms of algebras" or, equivalently, "multiplication and unit are homomorphisms of coalgebras".

These statements are meaningful once we explain the natural structures of algebra and coalgebra in all the vector spaces involved besides B: (K, ∇₀, η₀) is a unital associative algebra in an obvious way, and (B ⊗ B, ∇₂, η₂) is a unital associative algebra with unit and multiplication

η₂ := (η ⊗ η): K ≅ K ⊗ K → B ⊗ B,
∇₂ := (∇ ⊗ ∇) ∘ (id_B ⊗ τ ⊗ id_B): (B ⊗ B) ⊗ (B ⊗ B) → B ⊗ B,

so that ∇₂((x₁ ⊗ x₂) ⊗ (y₁ ⊗ y₂)) = ∇(x₁ ⊗ y₁) ⊗ ∇(x₂ ⊗ y₂) or, omitting ∇ and writing multiplication as juxtaposition, (x₁ ⊗ x₂)(y₁ ⊗ y₂) = x₁y₁ ⊗ x₂y₂; similarly, (K, Δ₀, ε₀) is a coalgebra in an obvious way and B ⊗ B is a coalgebra with counit and comultiplication

ε₂ := (ε ⊗ ε): B ⊗ B → K ⊗ K ≅ K,
Δ₂ := (id_B ⊗ τ ⊗ id_B) ∘ (Δ ⊗ Δ): B ⊗ B → (B ⊗ B) ⊗ (B ⊗ B).

Then, diagrams 1 and 3 say that Δ: B → B ⊗ B is a homomorphism of the unital algebras (B, ∇, η) and (B ⊗ B, ∇₂, η₂):

Δ ∘ ∇ = ∇₂ ∘ (Δ ⊗ Δ): B ⊗ B → B ⊗ B, i.e. Δ(xy) = Δ(x) Δ(y) for all x, y in B, and
Δ ∘ η = η₂: K → B ⊗ B, i.e. Δ(1_B) = 1_B ⊗ 1_B;

diagrams 2 and 4 say that ε: B → K is a homomorphism of the unital algebras (B, ∇, η) and (K, ∇₀, η₀):

ε ∘ ∇ = ∇₀ ∘ (ε ⊗ ε): B ⊗ B → K, i.e. ε(xy) = ε(x) ε(y) for all x, y in B, and
ε ∘ η = η₀: K → K, i.e. ε(1_B) = 1_K.

Equivalently, diagrams 1 and 2 say that ∇: B ⊗ B → B is a homomorphism of the coalgebras (B ⊗ B, Δ₂, ε₂) and (B, Δ, ε):

Δ ∘ ∇ = (∇ ⊗ ∇) ∘ Δ₂: B ⊗ B → B ⊗ B, and
ε ∘ ∇ = ε₂: B ⊗ B → K;

diagrams 3 and 4 say that η: K → B is a homomorphism of the coalgebras (K, Δ₀, ε₀) and (B, Δ, ε):

Δ ∘ η = (η ⊗ η) ∘ Δ₀: K → B ⊗ B, and
ε ∘ η = ε₀: K → K,

where Δ₀: K → K ⊗ K is the canonical isomorphism and ε₀ = id_K.
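The algebra-homomorphism reading can also be tested on elements. The sketch below (assuming NumPy, the group bialgebra of Z/2 from the sketches above, and Kronecker-product index conventions) builds ∇₂ explicitly and checks Δ(xy) = Δ(x)Δ(y) and ε(xy) = ε(x)ε(y) for two arbitrary elements x and y.

```python
import numpy as np

n = 2
rng = np.random.default_rng(0)

# Structure maps of the group bialgebra R[Z/2] as matrices (as above).
nabla = np.zeros((n, n * n))
for i in range(n):
    for j in range(n):
        nabla[(i + j) % n, i * n + j] = 1
Delta = np.zeros((n * n, n))
for i in range(n):
    Delta[i * n + i, i] = 1
eps = np.ones((1, n))
I = np.eye(n)
tau = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        tau[j * n + i, i * n + j] = 1

# Multiplication ∇₂ on B ⊗ B:  (x₁ ⊗ x₂)(y₁ ⊗ y₂) = x₁y₁ ⊗ x₂y₂
nabla2 = np.kron(nabla, nabla) @ np.kron(np.kron(I, tau), I)

x, y = rng.random(n), rng.random(n)    # two arbitrary elements of B

# "Δ is an algebra homomorphism":  Δ(xy) = Δ(x)Δ(y)
lhs = Delta @ nabla @ np.kron(x, y)
rhs = nabla2 @ np.kron(Delta @ x, Delta @ y)
assert np.allclose(lhs, rhs)

# "ε is an algebra homomorphism":  ε(xy) = ε(x)ε(y)
assert np.allclose(eps @ nabla @ np.kron(x, y), (eps @ x) * (eps @ y))
```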
Examples
Group bialgebra
An example of a bialgebra is the set of functions from a finite group G to ℝ, which we may represent as a vector space ℝ^G consisting of linear combinations of standard basis vectors e_g for each g ∈ G; such a vector may represent a probability distribution over G in the case of vectors whose coefficients are all non-negative and sum to 1. An example of suitable comultiplication operators and counits which yield a counital coalgebra is

Δ(e_g) = e_g ⊗ e_g,

which represents making a copy of a random variable (extended to all of ℝ^G by linearity), and

ε(e_g) = 1,

which represents "tracing out" a random variable, i.e., forgetting the value of a random variable to obtain a marginal distribution on the remaining variables. Given the interpretation of (Δ, ε) in terms of probability distributions as above, the bialgebra consistency conditions amount to constraints on (∇, η) as follows:
The unit η prepares a normalized probability distribution, independent of all other random variables;
The product ∇ maps a probability distribution on two variables to a probability distribution on one variable;
Copying a random variable in the distribution given by η is equivalent to having two independent random variables in the distribution η;
Taking the product of two random variables, and preparing a copy of the resulting random variable, has the same distribution as preparing copies of each random variable independently of one another, and multiplying them together in pairs.
A pair (∇, η) satisfying these constraints is given by the convolution operator

∇(e_g ⊗ e_h) = e_{gh},

again extended to all of ℝ^G ⊗ ℝ^G by linearity; this produces a normalized probability distribution from a distribution on two random variables, and has as a unit the delta-distribution η = e_i, where i ∈ G denotes the identity element of the group G.
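As a minimal illustration (assuming NumPy and the cyclic group Z/6; the names convolve, mul, p and q are ad hoc choices), applying the convolution ∇ to a product distribution p ⊗ q yields the distribution of the product (here, the sum modulo 6) of the two random variables, and the delta-distribution at the identity acts as a unit.

```python
import numpy as np

n = 6                                   # the cyclic group Z/6 = {0, ..., 5}

def mul(g, h):
    return (g + h) % n                  # group operation

# Convolution ∇(e_g ⊗ e_h) = e_{gh}, applied to a product distribution p ⊗ q.
def convolve(p, q):
    out = np.zeros(n)
    for g in range(n):
        for h in range(n):
            out[mul(g, h)] += p[g] * q[h]
    return out

p = np.array([0.5, 0.5, 0, 0, 0, 0])    # uniform on {0, 1}
q = np.array([0, 0.5, 0.5, 0, 0, 0])    # uniform on {1, 2}
print(convolve(p, q))                   # distribution of the sum: [0, 0.25, 0.5, 0.25, 0, 0]

eta = np.zeros(n)
eta[0] = 1                              # unit: delta distribution at the identity
assert np.allclose(convolve(p, eta), p) # e_0 acts as a unit for convolution
```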
Other examples
Other examples of bialgebras include the tensor algebra, which can be made into a bialgebra by adding the appropriate comultiplication and counit; these are worked out in detail in that article. Bialgebras can often be extended to Hopf algebras, if an appropriate antipode can be found. Thus, all Hopf algebras are examples of bialgebras. Similar structures with different compatibility between the product and comultiplication, or different types of multiplication and comultiplication, include Lie bialgebras and Frobenius algebras. Additional examples are given in the article on coalgebras.