
Information-Theoretic Descriptors of Molecular States and Electronic Communications between Reactants.

Roman F. Nalewajski

Abstract

The classical (modulus/probability) and nonclassical (phase/current) components of molecular states are reexamined and their information contributions are summarized. The state and information continuity relations are discussed and a nonclassical character of the resultant gradient information source is emphasized. The states of noninteracting and interacting subsystems in the model donor-acceptor reactive system are compared and configurations of the mutually-closed and -open equidensity orbitals are tackled. The density matrices for subsystems in reactive complexes are used to describe the entangled molecular fragments and electron communications in donor-acceptor systems which determine the entropic multiplicity and composition of chemical bonds between reactants.


Keywords:  acid-base systems; density matrices; electron communications; reactivity theory; resultant information descriptors; state continuity

Year:  2020        PMID: 33286520      PMCID: PMC7517292          DOI: 10.3390/e22070749

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.524


1. Introduction

The information theory (IT) [1,2,3,4,5,6,7,8] of Fisher [1] and Shannon [3] has been successfully applied in the entropic interpretation of molecular electronic structure [9,10,11]. Several information principles have been investigated [9,10,11,12,13,14,15,16], and pieces of the molecular electron density attributed to atoms-in-molecules (AIM) have been approached [12,16,17,18,19,20], providing the IT basis for the intuitive stockholder division of Hirshfeld [21]. Patterns of entropic bond multiplicities have been extracted from electronic communications in molecules [9,10,11,22,23,24,25,26,27,28,29,30,31,32], information distributions in molecules have been explored [9,10,11,33,34], and the nonadditive Fisher (gradient) information [1,2,9,10,11,35,36] has been linked to the electron localization function (ELF) [37,38,39] of density functional theory (DFT) [40,41,42,43,44,45]. This analysis has formulated the contragradience (CG) probe for localizing chemical bonds [9,10,11,46], while the orbital communication theory (OCT) of the chemical bond, using the "cascade" propagations in molecular information systems, has identified the bridge interactions between AIM [11,47,48,49,50,51,52], realized through intermediate orbitals.

The quantum electronic states of molecular systems and their dynamics are determined by the Schrödinger equation (SE). These (complex) wavefunctions are specified by their modulus and phase components, which generate the probability and current distributions of the system electrons. Such physical attributes respectively reflect the complementary classical (static) and nonclassical (dynamic) structures of "being" and "becoming", which both contribute to the state overall entropy and information content. It is of interest to examine their continuity relations in order to establish the net productions of these properties and to identify the origins of their sources.
In quantum mechanics (QM), the wavefunction phase, or its gradient determining the effective velocity of the probability density, gives rise to nonclassical information and entropy supplements to the classical measures of Fisher [1] and Shannon [3]. In the resultant IT descriptors of electronic states, the information and entropy content of the probability (wavefunction modulus) distribution is combined with the relevant complement due to the current density (wavefunction phase) [53,54,55,56,57,58,59,60,61,62]. The overall (resultant) gradient information is then proportional to the expectation value of the kinetic energy of the system electrons. Such combined descriptors are also required for the phase distinction between the bonded (entangled) and nonbonded (disentangled) states of molecular subsystems, for example, the substrate fragments of reactive systems [63,64,65]. This generalized treatment allows one to interpret the variational principle for the electronic energy as an equivalent information rule, and to use the molecular virial theorem [66] in general reactivity considerations [67,68,69,70,71]. The elementary chemical processes have also been monitored using the classical entropy and information descriptors [72,73,74,75]. Elsewhere, the application of the DFT construction of Harriman, Zumbach, and Maschke (HZM) [76,77], generating the wavefunctions that yield a prescribed electron distribution, to the description of reactive systems has been examined [63,78,79]. In such density-constrained Slater determinants the defining equidensity orbitals (EO) of the Macke/Gilbert [80,81] type exhibit the same molecular probability density, with the orbital orthogonality being assured by the local phases alone. Such orbital configurations define a constrained multicomponent system, composed of the mutually-closed (disentangled) orbital units, with each subsystem characterized by its own phase and chemical-potential descriptors.
Their simultaneous opening onto a common electron reservoir, and hence also onto one another, generates an externally- and mutually-open orbital system, in which the EO fragments are effectively "bonded" (entangled) [63,82]. They then exhibit a common (molecular) phase descriptor and equalize their chemical potentials at the global reservoir level. In OCT, the chemical-bond pattern of molecular systems can be probed [9,10,11,22,23,24,25,26,27,28,29,30,31,32] using the techniques and descriptors developed in the IT of communication devices (information channels) [3,4,7,8]. The conditional probabilities of simultaneous orbital events can be tackled using amplitudes from the bond-projected superposition principle (SP) of QM. Their cascade ("bridge") propagations in molecules, involving intermediate orbitals, also determine the bridge contributions to chemical interactions between bonded atoms. This quantum approach to electronic communications defines the probabilities of observing specific orbital "events" conditional on the given "molecular" state of the system as a whole, which can be used to generate the entire network of the state quantum communications between atomic orbitals or general basis functions. In the single-Slater-determinant approximation of the familiar Hartree–Fock (HF) or Kohn–Sham (KS) methods, the communication amplitudes are related to the corresponding elements of the familiar charge-and-bond-order (CBO) matrix of quantum chemistry. This novel IT framework has also established an entropic perspective on the state overall bond multiplicity and its covalent and ionic components. The entropic measure of the overall bond "covalency" has been linked to the conditional entropy (average "noise") in the molecular information system, while the complementary ionicity descriptor has been identified with the mutual information (information "flow") in this channel.
Specific IT tools for detecting the effects of chemical bonds, and for predicting their spatial localization and chemical multiplicity, have also been developed. The direct communications between orbitals reflect the "through-space" bonding component, a result of the constructive interference between orbitals, while the indirect (cascade) propagations give rise to the "through-bridge" bond orders due to orbital intermediates. This approach has deepened our insight into the information origins of chemical bonding. In the present analysis we focus on the state continuity relations for the wavefunction modulus and phase components, as well as on the balance equations establishing the net productions of the resultant information and entropy descriptors. The nonclassical origins of the information sources in quantum systems are emphasized, and the quantum mixed character of the effective states of molecular fragments and interacting reactants is stressed. Reactive complexes involving the donor (basic, B) and acceptor (acidic, A) subsystems are reexamined, and the distinction between the interacting and separated (isolated) species is briefly explored. We also investigate the substrate density matrices and the inter-reactant electronic communications, which shape the entropic multiplicities of the chemical bonds between reactants.

2. Probability and Phase Continuities

For simplicity, let us consider a single electron in state |ψ(t)⟩ at time t, or the associated (complex) wavefunction in position representation:

ψ(r, t) = ⟨r|ψ(t)⟩ = R(r, t) exp[iφ(r, t)].  (1)

Its modulus (R) and phase (φ) components determine the state electron probability and current densities:

p(r, t) = |ψ(r, t)|² = R(r, t)²,  (2)

j(r, t) = [ħ/(2mi)] [ψ*∇ψ − ψ∇ψ*] = (ħ/m) p(r, t) ∇φ(r, t).  (3)

The effective velocity V(r, t) of the probability "fluid" measures its current-per-particle and reflects the state phase gradient:

V(r, t) = j(r, t)/p(r, t) = (ħ/m) ∇φ(r, t).  (4)

In the molecular scenario, an electron moves in the external potential v(r) due to the "frozen" nuclear frame of the familiar Born–Oppenheimer approximation. The electronic Hamiltonian

H(r) = −(ħ²/2m)∇² + v(r) ≡ T(r) + v(r)  (5)

ultimately determines the Schrödinger dynamics of the electronic state:

iħ ∂ψ(r, t)/∂t = H(r) ψ(r, t).  (6)

This SE of molecular QM also implies specific time evolutions of both components of the complex wavefunction of Equation (1) (see Section 3). The time derivatives of the modulus and phase parts of electronic states ultimately reflect the relevant continuity equations associated with the physical descriptors of the particle probability and current densities. It follows directly from the SE (6) and its complex conjugate that the quantum dynamics implies the probability continuity relation expressing the vanishing source of this distribution:

∂p/∂t + ∇·j = σ_p = 0.  (7)

This equation thus expresses the time evolution of the state modulus component,

∂R/∂t = −(ħ/m) [∇R·∇φ + (R/2) ∇²φ],  (8)

while the associated phase dynamics reads:

∂φ/∂t = (ħ/2m) [R⁻¹∇²R − (∇φ)²] − v/ħ.  (9)

The inflow of probability flux in Equation (7),

∇·j = ∇p·V + p ∇·V = ∇p·V,  (10)

also implies the vanishing divergence of the velocity field V:

∇·V = 0.  (11)

In the probability continuity relation of Equation (7), the negative divergence term, −∇·j(r, t), represents the local probability outflow and σ_p(r, t) stands for its (vanishing) local "source". The total time derivative of the distribution p(r, t) = p[r(t), t] thus determines the vanishing net production of p(r, t):

dp/dt = ∂p/∂t + ∇p·V = 0.  (12)

This total time derivative measures the rate of change in an infinitesimal volume element of the probability fluid moving with velocity V = dr/dt, whereas the partial derivative ∂p[r(t), t]/∂t refers to a volume element around the fixed point r in space.
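The Madelung-type decomposition above can be checked numerically. The following minimal sketch (atomic units, ħ = m = 1; the sample modulus and phase are made up for illustration) verifies on a 1D grid that the quantum current j = Im(ψ*∂ψ/∂x) computed directly from the wavefunction coincides with the modulus/phase form j = p ∂φ/∂x of Equation (3):

```python
import numpy as np

# Finite-difference check (a.u., hbar = m = 1) that j = Im(psi* psi')
# equals p * phi' for psi = R exp(i*phi).
# Sample state: Gaussian modulus with a quadratic local phase (illustrative).
x = np.linspace(-5.0, 5.0, 2001)
R = np.exp(-x**2 / 2.0)            # modulus (normalization irrelevant here)
phi = 0.7 * x + 0.1 * x**2         # local phase
psi = R * np.exp(1j * phi)

p = np.abs(psi)**2                 # probability density p = R^2
dpsi = np.gradient(psi, x)
j_direct = np.imag(np.conj(psi) * dpsi)   # j = Im(psi* psi')
j_madelung = p * np.gradient(phi, x)      # j = p * grad(phi)

max_err = np.max(np.abs(j_direct - j_madelung))
print(max_err)  # small finite-difference residual
```

The agreement is limited only by the finite-difference accuracy of `np.gradient`; refining the grid shrinks the residual quadratically.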
One also realizes that the effective velocity of the probability current, V = j/p, also determines the phase flux J_φ = φV and its divergence (see Equation (11)):

∇·J_φ = ∇φ·V + φ ∇·V = ∇φ·V.  (13)

This complementary flow descriptor ultimately generates a nonvanishing phase source:

σ_φ = dφ/dt = ∂φ/∂t + ∇·J_φ,  (14)

or, using Equation (13),

σ_φ = ∂φ/∂t + ∇φ·V = ∂φ/∂t + (ħ/m)(∇φ)².  (15)

Using Equation (9) finally gives:

σ_φ = (ħ/2m) [R⁻¹∇²R + (∇φ)²] − v/ħ.  (16)

To summarize, the effective velocity of the probability current also determines the phase flux in molecules. The source (net production) of the classical (probability) variable of the electronic state identically vanishes, while that of its nonclassical (phase) component remains finite. The phase source of Equation (16) is seen to be determined by both wavefunction components and by the external potential due to the system nuclei.

3. Resultant Entropy/Information Descriptors and State Continuity

Equation (1) identifies the following (additive) components of the wavefunction logarithm:

ln ψ(r) = ln R(r) + iφ(r) = (1/2) ln p(r) + iφ(r).  (17)

Together they generate the so-called resultant measures of the global and gradient contents of the state overall entropy or information [53,54,55,56,57,58,59,60,61,62]. For example, for the given time t, the complex global entropy descriptor [59,62], the quantum expectation value of the (non-Hermitian) multiplicative operator of the complex global entropy, S(r) = −2 ln ψ(r),

S[ψ] = ⟨ψ|S|ψ⟩ = ∫ p(r) S(r) dr = S[p] + iS[φ],  (18)

where S(r) stands for the entropy density per electron, combines the classical (Shannon) entropy S[p] ≡ −∫p(r) ln p(r) dr as its real part and the nonclassical phase supplement S[φ], which determines its imaginary component. The latter reflects the state average phase φ[ψ] = ∫p(r) φ(r) dr: S[φ] = −2φ[ψ]. The corresponding Fisher-type gradient measure of the state resultant information I[ψ] is defined by the quantum expectation I[ψ] = ⟨ψ|I|ψ⟩ of the (Hermitian) operator in position representation,

I(r) = −4∇² = (8m/ħ²) T(r),  (19)

giving

I[ψ] = I[p] + I[φ] = ∫ [∇p(r)]²/p(r) dr + 4∫ p(r) [∇φ(r)]² dr.  (20)

The associated gradient measure of the resultant entropy reads:

M[ψ] = I[p] − I[φ].  (21)

The generalized information measure I[ψ], related to the average kinetic energy T[ψ] = ⟨ψ|T|ψ⟩, combines the classical Fisher information in the probability distribution, I[p], and its nonclassical complement I[φ], due to inhomogeneities in the state phase distribution. These equations confirm the symmetrical role played by the additive components of Equation (17) in generating the overall entropy and information content of the quantum electronic state. The distribution of the local resultant gradient information reflects the density of the electronic kinetic energy. One also observes that the densities-per-electron of the gradient information and of the complex global entropy are mutually related,

I(r) = |∇S(r)|²,

so the gradient of the latter constitutes the (coherent) quantum amplitude of the former.
Both resultant densities of the entropic content of the electronic state are seen to include the nonclassical (phase/velocity) complements of the classical (modulus/probability) contributions. The preceding relation constitutes a natural (complex) generalization of the corresponding classical link between the local information and entropy descriptors of Fisher and Shannon,

I_p(r) = [∇S_p(r)]²,  S_p(r) = −ln p(r).

It should also be observed that this (noncoherent) classical density of Shannon's global entropy is devoid of any phase content. A reference to Equation (19) indicates that the state information functional I[ψ] is proportional to the average kinetic energy of electrons T[ψ] = ⟨ψ|T|ψ⟩, determined by the quantum operator of Equation (5):

I[ψ] = (8m/ħ²) T[ψ].

One also recalls that the average electronic energy in state |ψ(t)⟩ combines the kinetic and potential components,

E[ψ] = ⟨ψ|H|ψ⟩ = T[ψ] + V[ψ] = (ħ²/8m) I[ψ] + V[ψ],

where we have used the relevant integration by parts and V[ψ] = ∫p(r) v(r) dr denotes the average energy of the electron–nuclei attraction. Expressing SE in terms of the state modulus and phase components R and φ gives:

iħ (∂ln R/∂t) − ħ (∂φ/∂t) = −(ħ²/2m) {[∇²R/R − (∇φ)²] + i [2(∇ln R)·∇φ + ∇²φ]} + v.  (22)

Its imaginary part determines the continuity equation for the wavefunction modulus component (see Equation (8)),

∂ln R/∂t = −(ħ/m) [∇ln R·∇φ + (1/2)∇²φ],  (23)

where ln R is the real part of the state logarithm (17). The real component of Equation (22) similarly recovers the phase dynamics of Equation (9):

∂φ/∂t = (ħ/2m) [R⁻¹∇²R − (∇φ)²] − v/ħ.  (24)

One also observes that combining the preceding equation (multiplied by i) with the modulus continuity of Equation (23) gives the state logarithmic continuity relation

∂ln ψ/∂t = ∂ln R/∂t + i ∂φ/∂t.  (25)

This equation can also be recast to express the state logarithmic source,

σ[ln ψ] = d(ln ψ)/dt = d(ln R)/dt + i dφ/dt = iσ_φ,  (26)

since the source of the modulus component identically vanishes, d(ln R)/dt = (1/2) d(ln p)/dt = 0. The (complex) logarithmic continuity relation further emphasizes the classical (real) character of the modulus and probability descriptors and the nonclassical (imaginary) nature of the phase and current state variables. It introduces the (complex) wavefunction current and identifies the (nonclassical) state source, stressing the phase origin of the state production of its overall information and entropy density.
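The classical members of these resultant measures can be illustrated numerically. The sketch below evaluates the Shannon entropy S[p] = −∫p ln p dx and the Fisher information I[p] = ∫(∇p)²/p dx for a 1D Gaussian density of variance σ², for which the closed forms S = ½ ln(2πeσ²) and I = 1/σ² are well known; the grid and σ value are arbitrary choices:

```python
import numpy as np

# Shannon entropy and Fisher information of a 1D Gaussian density,
# compared against their known closed forms.
sigma = 1.3
x = np.linspace(-12 * sigma, 12 * sigma, 40001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

S_num = np.sum(-p * np.log(p)) * dx          # S[p] = -∫ p ln p dx
dp = np.gradient(p, x)
I_num = np.sum(dp**2 / p) * dx               # I[p] = ∫ (p')^2 / p dx

S_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
I_exact = 1.0 / sigma**2
print(S_num, S_exact)   # numerically close
print(I_num, I_exact)   # numerically close
```

For this Gaussian the Fisher–Shannon link I_p(r) = [∇S_p(r)]² can be confirmed analytically as well, since S_p(x) = −ln p(x) gives ∇S_p = x/σ² and ⟨(∇S_p)²⟩ = 1/σ².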
This combined treatment also reveals two independent sources of the resultant entropy and information descriptors of Equations (18), (20), and (21): the additive components of the logarithmic separation of Equation (17).

4. Integral Productions of Information and Entropy Descriptors

It is also of interest to examine the integral sources of the resultant measures of the quantum global entropy and gradient information. For the average production of the state complex entropy, one finds

σ_S[ψ] = dS[ψ]/dt = −2i ∫ p(r) [dφ(r)/dt] dr,  (30)

where we have recognized the probability continuity of Equation (7). Using Equation (30) then finally gives

σ_S[ψ] = −2i ∫ p(r) σ_φ(r) dr ≡ −2i σ_φ[ψ].  (31)

This expression again confirms the nonclassical (phase) origin of the complex entropy source, which reflects the average of the local production σ_φ of the state phase (see Equation (16)). The integral production σ_S[ψ] of the state overall entropy can also be discussed in terms of the local source contribution per electron, σ_S(r), and the associated continuity relation

σ_S(r) = ∂S(r)/∂t + ∇·J_S(r),

where J_S(r) = S(r) j(r) stands for the density of the entropy current carried by the probability flux j(r). Then it again follows from Equations (16) and (18) that the local source of the complex entropy has a purely nonclassical, phase origin: σ_S(r) = −2iσ_φ(r). One similarly determines the corresponding total time derivatives of the overall gradient measures of the state resultant information (Equation (20)) and entropy (Equation (21)). Using the probability and phase continuity relations, one ultimately obtains the following expressions for the average productions of these state functionals:

σ_I[ψ] = dI[ψ]/dt = 8∫ p(r) ∇φ(r)·∇σ_φ(r) dr = −σ_M[ψ] = −dM[ψ]/dt.  (36)

These expressions reveal the complementary character of the gradient information and entropy, with the positive source of one implying the negative production of the other. The time derivatives of these overall functionals manifest the nonclassical (phase) origins of the molecular productions of the electronic gradient entropy and information descriptors [64]. These integral sources can also be expressed in terms of the electron current density of Equation (3) (in a.u., j = p∇φ):

σ_I[ψ] = 8∫ j(r)·∇σ_φ(r) dr = −σ_M[ψ].  (37)

In close analogy to irreversible thermodynamics [83], they are seen to be determined by the product of the local flux and affinity densities, j(r) and ∇σ_φ(r), respectively.
Using Equations (11) and (16) finally gives the following explicit expression for the phase-source gradient:

∇σ_φ(r) = (ħ/2m) ∇[∇²R(r)/R(r)] − ∇v(r)/ħ.

Thus, only the wavefunction modulus and the shape of the external potential influence the affinity factor in the resultant information and entropy production of Equation (37). It should be recalled, however, that the phase gradient, ∇φ, determines the flux factor, j, in this product. The integral source of the gradient information (Equation (36)) can also be interpreted in terms of the local information source σ_I(r), entering the local continuity equation for the information density I(r) of Equation (20),

σ_I(r) = ∂I(r)/∂t + ∇·J_I(r),

which involves the information current density J_I(r) = I(r) j(r). These local continuity relations again emphasize the nonclassical (phase) origin of the information source. One observes that only the presence of the state local phase contribution generates a finite probability flow and a nonvanishing information production.

5. Isolated/Interacting and Open/Closed Subsystems

Consider the simplest case of the two-electron reactive complex consisting of the B (base, electron donor) and A (acid, electron acceptor) subsystems, each containing a single electron, N_A⁰ = N_B⁰ = 1, at the polarization (P) stage of the reactive system [63,64,65,67,68,69,70,71]:

R⁺ = (A⁺|B⁺).

This model system involves the mutually-closed substrates, at a finite distance R_AB between the two subsystems, with both (interacting) electrons, g(1,2) = 1/r₁₂ (a.u.), moving in the external potential v = v_A + v_B due to the fixed nuclei of both geometrically "frozen" reactants. Their infinite separation, R_AB → ∞, results in the sum of the isolated reactants {X⁰},

R∞ = A⁰ + B⁰ ≡ R⁰,

while the mutual opening of these molecular fragments in R* = (A*¦B*), at a finite distance R_AB, results in the global equilibrium state of R as a whole, after the optimum B→A charge transfer (CT):

R*(1, 2) = A*(1, 2) + B*(1, 2).

As indicated above, such open interacting subsystems assume an effective two-electron character, since in R* the two electrons are indistinguishable. Therefore, both mutually-open subsystems effectively explore the probability distribution p(r) of the whole complex. The equilibrium subsystems {X*} are then characterized by the subsystem densities {ρ_X*(r)} exhibiting fractional average numbers of electrons {N_X* = ∫ρ_X*(r) dr}. They generate the final, equilibrium molecular distribution of the whole reactive complex, and define the optimum amount of the B→A CT:

N_CT = N_A* − N_A⁰ = N_B⁰ − N_B*.

One further recalls that in the theory of chemical reactivity this (global) N_CT measure results from the electronegativity-equalization (EE) considerations [84,85,86,87,88,89,90], based upon the chemical potential [91,92,93,94,95] and hardness/softness [96] or Fukui-function [97] derivative descriptors of the (mutually-closed) polarized subsystems in R⁺.
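The simplest quantitative estimate of such a B→A charge transfer, not the full EE machinery referenced above but the familiar parabolic-model (Parr–Pearson) formula, uses finite-difference electronegativities χ = (I + A)/2 and hardnesses η = (I − A)/2 of the isolated reactants. A minimal sketch, with made-up ionization potentials and electron affinities purely for illustration:

```python
# Parabolic-model estimate of the amount of charge transferred from the
# donor B to the acceptor A: N_CT = (chi_A - chi_B) / (2*(eta_A + eta_B)).
# All input values (eV) below are hypothetical, for illustration only.
def parr_pearson_ct(I_A, A_A, I_B, A_B):
    chi_A, chi_B = (I_A + A_A) / 2, (I_B + A_B) / 2   # electronegativities
    eta_A, eta_B = (I_A - A_A) / 2, (I_B - A_B) / 2   # chemical hardnesses
    return (chi_A - chi_B) / (2 * (eta_A + eta_B))    # fraction of an electron

# Acceptor A more electronegative than donor B (hypothetical numbers):
N_CT = parr_pearson_ct(I_A=12.0, A_A=3.0, I_B=9.0, A_B=1.0)
print(N_CT)  # positive: electrons flow from B to A
```

A positive N_CT confirms the donor-to-acceptor direction of the flow; the magnitude is damped by the combined hardness of the pair, in line with the chemical-potential equalization picture.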
This model scenario thus involves the one-electron Hamiltonians {h_X⁰(i)} of the isolated, infinitely separated fragments {X⁰(i)} in R∞ (see Equation (5)),

h_X⁰(i) = T(i) + v_X(i),  X = A, B.

Their eigenvalue problems,

h_X⁰(i) ψ_k^X(i) = ε_k^X ψ_k^X(i),

define the (one-electron) orthonormal bases of the alternative complete sets of stationary states in the isolated fragments, also capable of representing any (two-electron) state Ψ(A, B) = Ψ(1, 2) of the whole reactive complex, for example, in the A expansion:

Ψ(1, 2) = ∑_k ψ_k^A(1) Φ_k(2).

The mutually- and externally-open, interacting parts of this model reactive system are in the mixed states described by the corresponding density matrices of subsystems [82]. Indeed, for the mutually-open, interacting fragments a simple product representation of this (pure) quantum state,

Ψ(1, 2) = ψ_A(1) ψ_B(2),

with each reactant described by a substrate wavefunction depending exclusively on its own internal coordinates, is not available. It exists only for the (disentangled) states of the noninteracting subsystems in R∞ ≡ R⁰,

Ψ⁰(1, 2) = ψ_A⁰(1) ψ_B⁰(2),

and for the distinguishable electrons attributed to the mutually-closed (c) reactants in the polarized reactive system R⁺ = (A⁺|B⁺),

Ψ⁺(1, 2) = ψ_A⁺(1) ψ_B⁺(2).

This product wavefunction is replaced by the corresponding Slater determinant |ψ_A⁺ ψ_B⁺| when the two polarized subsystems become mutually open (o) in R* = (A*¦B*), thus making the two electrons indistinguishable:

Ψ*(1, 2) = (1/√2) [ψ_A⁺(1) ψ_B⁺(2) − ψ_A⁺(2) ψ_B⁺(1)].

The (two-electron) Hamiltonians {H_X(1, 2)} describing the interacting subsystems in R⁺ at finite distances between the reactants generate the complete sets of their stationary states {Θ_u^X(1, 2)},

H_X(1, 2) Θ_u^X(1, 2) = E_u^X Θ_u^X(1, 2),

which then define the alternative (two-electron) bases for expanding the "molecular" state Ψ(1, 2):

Ψ(1, 2) = ∑_u C_u^X Θ_u^X(1, 2).

The equilibrium reactive complex, R* = (A*¦B*), at a finite distance between the two mutually-open substrates, corresponds to the electronic Hamiltonian of the reactive complex as a whole,

H(1, 2) = [T(1) + T(2)] + [v(1) + v(2)] + g(1, 2) ≡ H_R⁰(1, 2) + h(1, 2),

where h(1, 2) stands for the overall perturbation relative to the reference Hamiltonian H_R⁰(1, 2) of the separated-reactant limit (SRL).
In Appendix A the energetic implications of the mutual opening of the reactants are briefly examined using the first-order perturbation theory. The molecular Hamiltonian determines the stationary states of the whole reactive complex R*:

H(1, 2) Ψ_j(1, 2) = E_j Ψ_j(1, 2).

Their phase component is purely time dependent,

Ψ_j(1, 2; t) = Ψ_j(1, 2) exp(−iE_j t/ħ),

thus giving rise to a vanishing phase gradient, and hence zero probability current. A general "molecular" state Ψ(1, 2) of two indistinguishable electrons, which determines the electron density, also characterizes all equilibrium reactants {X*} in R* = (A*¦B*), since all mutually-open fragments explore the same "molecular" probability distribution. Their (mixed) quantum states are represented by the corresponding density operators, for example, those corresponding to the applied (external) thermodynamic conditions. The reactive system coupled to an external heat bath B(T) and electron reservoir R(μ) would be represented by the equilibrium grand-ensemble establishing the statistical mixture of {Ψ_j(1, 2)}. The state probabilities are then related to the absolute temperature T of B and the chemical potential μ of R [94,95]. Expanding a general (pure) state Ψ(1, 2) of R* in the stationary molecular basis {Ψ_j(1, 2)} gives:

Ψ(1, 2) = ∑_j C_j Ψ_j(1, 2).

In this molecular state the expectation value of a property F_A of subsystem A, represented by the associated quantum operator F_A(1), is given by an ensemble-average expression involving the partial (fragment) trace operation and the subsystem density matrix ρ(A) [82,98]. Indeed, using the fragment expansion of Equation (44) gives

⟨F_A⟩_Ψ = tr_A [ρ(A) F_A].  (55)

The diagonal element of the above subsystem density matrix (see the normalization condition of Equation (44)),

ρ(A)_kk = P[ψ_k^A|Ψ],

measures the conditional probability P[ψ_k^A|Ψ] of observing in Ψ the subsystem state ψ_k^A = R_k^A exp(iφ_k^A). These probabilities define the effective density operator of A in the molecular state Ψ,

d_A(Ψ) = ∑_k P[ψ_k^A|Ψ] |ψ_k^A⟩⟨ψ_k^A|,

which determines the effective (mixed) state of this fragment in the reactive system.
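The partial-trace averaging of Equation (55) is easy to verify numerically. In the minimal sketch below (random expansion coefficients, arbitrary basis sizes, all illustrative), a pure two-fragment state |Ψ⟩ = Σ_{a,b} C[a,b] |a⟩|b⟩ yields the subsystem density matrix ρ(A) = CC†, and the ensemble average tr[ρ(A)F_A] reproduces the full pure-state expectation of F_A ⊗ 1_B:

```python
import numpy as np

# Subsystem density matrix of fragment A from a pure two-fragment state,
# and the equivalence <F_A> = Tr(rho_A F_A) = <Psi| F_A x 1_B |Psi>.
rng = np.random.default_rng(0)
C = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))
C /= np.linalg.norm(C)                 # normalize: <Psi|Psi> = 1

rho_A = C @ C.conj().T                 # partial trace over fragment B
F_A = np.diag([0.0, 1.0, 2.0])         # some Hermitian subsystem-A observable

avg_mixed = np.trace(rho_A @ F_A).real          # ensemble average in mixed state of A

F_full = np.kron(F_A, np.eye(4))                # F_A x identity on B
psi = C.reshape(-1)                             # |Psi> as a flat vector (A-major)
avg_pure = (psi.conj() @ F_full @ psi).real     # pure-state expectation
print(avg_mixed, avg_pure)                      # equal up to rounding
```

The diagonal elements of `rho_A` are exactly the conditional probabilities P[ψ_a^A|Ψ] discussed above, and they sum to one.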
Its representation in the basis {ψ_k^A} of Equation (43) is diagonal. Therefore, by selecting F_A = φ_A in Equation (55), one obtains the following expression for the representative average phase of A in Ψ:

⟨φ_A⟩_Ψ = ∑_k P[ψ_k^A|Ψ] φ_k^A.

6. Equidensity Orbital Systems

As an illustration, consider the EO configurations defined by the Slater determinants of orbitals conserving the specified molecular probability distribution. In the HZM construction [76,77] of modern DFT [40,41,42,43,44,45], of the wavefunctions yielding a prescribed electron density, one introduces the plane-wave-type EO,

φ_l(r) = [p(r)]^(1/2) exp{i[k_l·f(r) + φ(r)]} ≡ R(r) exp[iΦ_l(r)],

which exactly reproduce the system probability distribution: |φ_l(r)|² = p(r). The density-dependent vector function f(r) = [f_x(r), f_y(r), f_z(r)] = f[p; r], for which the Jacobian determinant satisfies |∂f(r)/∂r| = p(r), then assures the orbital orthonormality:

⟨φ_l|φ_m⟩ = ∫ p(r) exp{i(k_m − k_l)·f(r)} dr = δ_l,m.

Here, k_l = (k_l,x, k_l,y, k_l,z) denotes the (constant) reduced momentum (wave-number) vector of EO φ_l and Φ_l(r) stands for its resultant phase. The latter is defined by the sum of the orthogonality phase F_l(r) = k_l·f(r) and its local "thermodynamic" supplement φ(r), common to all occupied EO of the electron configuration under consideration:

Ψ[N; p] = (N!)^(−1/2) det[φ_1, φ_2, …, φ_N].  (63)

Notice that in this HZM representation all orbital components are described by the local resultant phases {Φ_l(r) = Φ_l[p; r], l = 1, 2, …, N} originating from the same overall probability density p(r). The resultant local phase Φ_l(r) also generates the associated orbital current of Equation (3):

j_l(r) = (ħ/m) p(r) ∇Φ_l(r).

The optimum "thermodynamic" contribution φ(r), common to all occupied EO reconstructing the given electron density ρ(r) = N p(r), is determined from the subsidiary minimum-information principle [53,54,55,56,57]. It relates this phase contribution to the average wave vector in the configuration under consideration [78,79],

φ(r) = −⟨k⟩·f(r),  ⟨k⟩ = (1/N) ∑_l k_l,

where the summation extends over all occupied EO.
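The phase-based orthogonality of the equidensity orbitals can be demonstrated in one dimension, where the role of f(r) is played by the cumulative distribution u(x) of p(x) (the 1D Jacobian condition du/dx = p). A minimal sketch, with a Gaussian density chosen purely for illustration: the orbitals φ_l(x) = √p(x) exp(2πi l u(x)) all share |φ_l|² = p(x), yet are mutually orthonormal because ∫p exp[2πi(m−l)u] dx = ∫₀¹ exp[2πi(m−l)u] du = δ_lm:

```python
import numpy as np

# 1D Harriman-type equidensity orbitals: same density, orthogonality
# carried entirely by the local phases.
x = np.linspace(-8.0, 8.0, 20001)
dx = x[1] - x[0]
p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # prescribed (normalized) density
u = np.cumsum(p) * dx                        # u(x) = ∫_{-inf}^x p dx', in [0, 1]

def eo(l):
    """Equidensity orbital with reduced wave number l."""
    return np.sqrt(p) * np.exp(2j * np.pi * l * u)

overlap_01 = np.sum(np.conj(eo(0)) * eo(1)) * dx   # should be ~0
norm_1 = np.sum(np.abs(eo(1))**2) * dx             # should be ~1
print(abs(overlap_01), norm_1)
```

The small residual overlap is a pure discretization effect of the grid-based cumulative distribution; it vanishes as the grid is refined.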
Then the resultant EO phases are shaped by the displacements {δk_l = k_l[p] − ⟨k[p]⟩} of the orbital wave vectors {k_l[p]} from the configuration-average vector of the preceding equation:

{Φ_l(r) = δk_l·f(r)}.

These resultant phases then give rise to the vanishing overall current j(r) in the electron configuration (63), the sum of the equilibrium EO currents,

j(r) = ∑_l j_l(r) = 0,

since ∑_l δk_l = 0. The occupied EO in the HZM product state Ψ[N; p] represent the closed (c) orbital system Ψ(N), with each EO containing a single (distinguishable) electron, {n_l⁺ = 1}. These (disentangled) nonbonded orbital components are distinguished by their EO phases, with a different wave vector attributed to each orbital (see Equation (63)). One can also envisage the open (o) (bonded, entangled) analog Ψ*(N) of this N-orbital system, described by the Slater determinant of Equation (63). This mutual opening of EO in Ψ*(N), although still precluding the net electron flows between orbitals, due to the limiting occupations {n_l* = 1}, now formally opens electronic exchanges, since in the determinantal state all electrons are indistinguishable (see also Appendix A). The mutually-open state of orbital subsystems can also involve the external (thermodynamic) coupling of these orbital components to the heat bath B(T) and the ("molecular") electron reservoir R(μ) in the (macroscopic) composite system:

M(N) = [R(μ)¦Ψ*(N)¦B(T)].

This mutual and external opening of the EO fragments in M(N) implies their effectively "bonded" (entangled) character. It is reflected by the fractional orbital occupations {0 < n_l*(μ, T) < 1} marking partial electron outflows to the initially unoccupied (virtual) EO. Indeed, the externally open, thermodynamic-orbital fragments must be described by the statistical mixture of the EO states {|φ_l⟩}, defined by the equilibrium density operator

D(μ, T) = ∑_l P_l(μ, T) |φ_l⟩⟨φ_l|,

with the equilibrium orbital probabilities {P_l(μ, T)} reflecting the applied thermodynamic conditions, i.e., the chemical potential μ of the reservoir and the absolute temperature T of the heat bath.
This mixed state of the mutually-open (bonded, entangled) orbital components corresponds to an equalized (average) phase intensity and a common level of the chemical potential, fixed by the electron reservoir. The equilibrium probability of the φ_l "subsystem" in such an EO grand-ensemble is determined by the thermodynamic parameters μ and T, the equilibrium (fractional) orbital occupations {0 < n_l* < 1}, and the orbital energies {e_l = ⟨φ_l|H|φ_l⟩}:

P_l(μ, T) = Ξ_EO(μ, T)⁻¹ exp[β(μ n_l* − e_l)].

Here, Ξ_EO(μ, T) = ∑_l exp[β(μ n_l* − e_l)] denotes the EO grand-partition function, β = (k_B T)⁻¹, and k_B is the Boltzmann constant.
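The qualitative behavior of such grand-ensemble occupations can be illustrated with the simplest possible case: a single orbital level coupled to the reservoir R(μ) and bath B(T), with occupation n ∈ {0, 1} and weights ∝ exp[β(μn − en)]. This is a deliberately reduced sketch of the ensemble above (one level, integer occupation events), which reproduces the familiar Fermi-Dirac average occupation:

```python
import numpy as np

# Grand-canonical average occupation of one orbital level (k_B = 1):
# P(n) ∝ exp[beta*(mu - e)*n], n in {0, 1}, so <n> is the Fermi function.
def average_occupation(e, mu, T, kB=1.0):
    beta = 1.0 / (kB * T)
    weights = np.exp(beta * (mu - e) * np.array([0.0, 1.0]))
    P = weights / weights.sum()          # grand-ensemble probabilities
    return P[1]                          # <n> = 0*P(0) + 1*P(1)

n_below = average_occupation(e=-1.0, mu=0.0, T=0.05)  # level below mu: ~1
n_above = average_occupation(e=+1.0, mu=0.0, T=0.05)  # level above mu: ~0
print(n_below, n_above)
```

Levels below the reservoir chemical potential fill up, levels above it empty out, and raising T smears the step; the fractional occupations 0 < n* < 1 emerge exactly in this crossover region.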

7. Electron Communications

Let us now examine the molecular electron communications between the states of isolated subsystems in the donor-acceptor reactive system, R = A----B. For specificity and simplicity we again refer to the two-electron scenario of Section 5. The whole information system of the probability scattering between the stationary states of isolated reactants in such a molecular complex involves four blocks of conditional probabilities, defining the internal (diagonal) and external (off-diagonal) blocks of electronic communications within and between the reactants, respectively. In the Communication Theory of the Chemical Bond [9,10,11,22,23,24,25,26,27,28,29,30,31,32], the former determine the intra-reactant bonds, i.e., the substrate polarization and activation accompanying the chemical reaction, while the latter reflect the inter-reactant bond pattern, which directly probes the reactivity behavior. In what follows, we focus on this external part of the electronic communications alone. For specificity, we thus examine the A→B communications described by the molecular probabilities P(A→B). In accordance with the SP of QM [99,100], the conditional probability P[w′(B)|w(A)] ≡ P→(A→B) of observing the output state |ψ_B⟩ ≡ |w′(B)⟩ of the "receiver" part B of this partial reactive network, given the input state |ψ_A⟩ ≡ |w(A)⟩ in its "source" part A, is determined by the squared modulus of the corresponding scattering amplitude A[w′(B)|w(A)] ≡ A→(A→B) measuring their mutual projection in the molecular Hilbert space [62,100]:

P[w′(B)|w(A)] = |A[w′(B)|w(A)]|² = |⟨w(A)|w′(B)⟩|².

These probabilities satisfy the relevant normalization involving the summation over the complete set of all monitoring states in this inter-reactant communication "device",

∑_{w′} P[w′(B)|w(A)] = 1,

since the sum of the state projections {P′(B) = |w′(B)⟩⟨w′(B)|} then amounts to the identity operator: ∑ P′(B) = 1. Of interest also is the doubly conditional probability scattering, of the |w(A)⟩→|w′(B)⟩ communication in the specified "parameter" state |Ψ⟩ of the whole reactive complex.
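The superposition-principle probabilities above are easy to realize concretely: if the output basis is related to the input basis by a unitary matrix U, then P(b|a) = |⟨b|a⟩|² = |U[b,a]|², and completeness of the monitoring states makes each input's scattering probabilities sum to one. A minimal sketch with a random unitary standing in for the molecular overlap amplitudes:

```python
import numpy as np

# Conditional communication probabilities from overlap amplitudes:
# P[b, a] = |<b|a>|^2 with U unitary, hence doubly stochastic P.
rng = np.random.default_rng(1)
H = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(H)          # random unitary: column a holds |a> in basis {|b>}

P = np.abs(U)**2                # P[b, a] = |<b|a>|^2
col_sums = P.sum(axis=0)        # sum over all output ("monitoring") states b
print(col_sums)                 # each ≈ 1: the normalization condition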
This communication involves the intermediate state |Ψ⟩ in the bridge communication [11,47,51,62]:

|w(A)⟩ → |Ψ⟩ → |w′(B)⟩.

Its amplitude can thus be regarded as that of the single-cascade communication, determined by the product of the two-stage amplitudes,

A[w′(B)|w(A); Ψ] = ⟨w(A)|P_Ψ|w′(B)⟩ = ⟨w(A)|Ψ⟩⟨Ψ|w′(B)⟩,

where P_Ψ = |Ψ⟩⟨Ψ| stands for the projection operator onto the "molecular" reference state. The associated conditional probabilities then satisfy the intermediate ("bridge") normalization condition of Equation (80). Let us now examine more closely such A→B communications in the molecular state Ψ = Ψ(1, 2) between the stationary states {ψ_k^A(1)} and {ψ_l^B(2)} of the isolated reactants. They determine the relevant stage projections, ⟨ψ_k^A|Ψ⟩ and ⟨Ψ|ψ_l^B⟩, the associated (local) bridge amplitude, and the resulting density of the two-electron probabilities. Their global analogs, measuring probabilities between states rather than locations within states, involve integrations of such local scattering densities over all possible locations of Electron 1 in the network source state, ψ_k^A(1), and of Electron 2 in the system receiver state, ψ_l^B(2); these integrations determine the stage probabilities in the cascade of Equation (83). The double-conditional scattering probabilities between the states of isolated reactants indeed observe the bridge normalization of Equation (80). This inter-reactant, external communication system, defined by the blocks P(A→B) and P(B→A) of the conditional probabilities in this resolution of the stationary states of isolated reactants, ultimately generates the entropic multiplicities (in bits) of the chemical bonds between both substrates [9,10,11,22,23,24,25,26,27,28,29,30,31,32,62].
For the given molecular state of the whole reactive system, the conditional entropy of the output states, given the input states, ultimately defines the overall IT covalency in the inter-reactant bonds, a measure of the information noise in the underlying communication system, while the complementary descriptor of the overall bond IT ionicity reflects the mutual information in these reactant states, a measure of the information flow between the two subsystems. Elsewhere [101] we have examined the internal communications {P(X→X)} in the interacting subsystems, which shape the electronic structure of the polarized reactants. They have been shown to be determined by the fragment density matrices of Equations (55), (84), and (85).
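The covalency/ionicity split described above can be computed for any joint input-output probability matrix of the communication channel. A minimal sketch (the joint probabilities below are illustrative, not from any molecular calculation): the IT covalency is the conditional entropy H(B|A), the IT ionicity is the mutual information I(A:B), and the two always sum to the output entropy H(B):

```python
import numpy as np

# Entropic bond descriptors (in bits) from a joint probability matrix P[a, b].
def H(p):
    """Shannon entropy in bits of a probability vector (zeros skipped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

P_joint = np.array([[0.40, 0.10],
                    [0.05, 0.45]])     # illustrative two-orbital channel
pA = P_joint.sum(axis=1)               # input (source) distribution
pB = P_joint.sum(axis=0)               # output (receiver) distribution

covalency = H(P_joint.ravel()) - H(pA)          # H(B|A) = H(A,B) - H(A): "noise"
ionicity = H(pA) + H(pB) - H(P_joint.ravel())   # I(A:B): "information flow"
print(covalency, ionicity, covalency + ionicity, H(pB))
```

A noisier channel (joint matrix closer to the product of its marginals) shifts weight from ionicity to covalency at fixed H(B), which is the entropic picture of bond composition invoked in the text.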

8. Conclusions

Due to Heisenberg’s uncertainty principle of QM, the sharply specified locations of electrons in the position representation of quantum states, which defines the molecular wavefunction and the associated probability distribution in physical space, preclude the corresponding precise specification of electronic momenta. Therefore, only an effective measure of the latter, consistent with the probability-flux definition, is available in the quantum description. The current-per-particle measure of the probability velocity, which itself combines the incompatible position and momentum variables of electrons, appears as a natural choice for such an effective local “velocity” descriptor; its divergence vanishes in molecular QM [101]. This simplifies local continuity considerations for the electronic probability “fluid” and separates the “static” aspect of the electronic probability density, determined by the modulus component of the molecular wavefunction, from the “dynamic” feature of the electronic current distribution, reflecting the phase gradient of the molecular quantum state. To paraphrase Prigogine [102], the former reflects the state’s (static) electronic structure “of being”, while the latter constitutes its (dynamic) structure “of becoming”. The classical (probability) and nonclassical (current) degrees of freedom of molecular states thus respectively determine the system structures “of being” and “of becoming”. Both these patterns carry the information contained in the system’s (complex) quantum electronic state and contribute to the overall (resultant) entropy and information descriptors. The distributions of electrons and of their currents in a molecule determine the classical (modulus) and nonclassical (phase) contributions to the overall information content of the system’s quantum state.
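The separation of the “static” (modulus) and “dynamic” (phase) aspects can be checked on a toy one-dimensional wavefunction: writing ψ = √p · e^{iφ} (with ħ = m = 1), the probability current j = Im(ψ*ψ′) reduces to p·φ′, so the flow is carried entirely by the phase gradient. The Gaussian density and linear phase below are illustrative choices, not taken from the paper.

```python
import numpy as np

# 1D sketch (hbar = m = 1): psi = sqrt(p) * exp(i * phi) with a
# Gaussian probability density and a linear phase phi = k * x.
x = np.linspace(-5, 5, 2001)
dx = x[1] - x[0]
p = np.exp(-x**2) / np.sqrt(np.pi)   # normalized density (modulus squared)
k = 0.7                              # constant phase gradient (illustrative)
psi = np.sqrt(p) * np.exp(1j * k * x)

# Quantum-mechanical probability current j = Im(psi* dpsi/dx)
dpsi = np.gradient(psi, dx)
j = np.imag(np.conj(psi) * dpsi)

# For psi = sqrt(p) e^{i phi} the current equals p * dphi/dx: the
# modulus alone yields no flow; the phase gradient carries it all.
assert np.allclose(j, p * k, atol=1e-3)
```

Setting k = 0 (a real wavefunction) makes the current vanish identically, which is the numerical counterpart of the “structure of being” carrying no “becoming”.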
The minimum of the average resultant gradient measure of information, the expectation value of the (dimensionless) kinetic energy of electrons, establishes the information equilibria in the whole molecular system and its fragments, reflected by the local (“thermodynamic”) phase contribution. The phase aspect of such generalized, phase-transformed equilibrium states is vital for the coherent propagation of electronic communications in molecules; it also distinguishes the bonded (entangled) and nonbonded (disentangled) states of reactants. In the present analysis, we have reexamined the probability and phase continuities in QM and summarized the resultant measures of the information and entropy content, combining the classical and nonclassical contributions. The additive resolution of the wavefunction logarithm has generated the (complex) state continuity relation, with the relevant source contribution identified as the (imaginary) phase production. We have also discussed the sources of such overall entropy and information descriptors. The states of isolated and interacting reactants in a simple model of the reactive (donor-acceptor) system have been explored in some detail. We have emphasized the mixed character of electronic states of the entangled (interacting) molecular fragments. Indeed, such subsystems have been shown to be described by their partial density operators, with the quantum expectations of reactant properties determined by the subsystem density matrices for the specified (pure) molecular state. Information principles using the resultant entropy and information measures have also been used to determine the phase equilibria [51,52,53,54,55,56,57] in molecular systems and their constituent parts, marking the extreme values of alternative overall measures of the electronic entropy (uncertainty, “disorder”) or information (determinicity, “order”) content of electronic wavefunctions.
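A common form of the resultant gradient information, proportional to the average electronic kinetic energy, splits into a classical Fisher term ∫(∇p)²/p and a nonclassical phase term 4∫p(∇φ)². A one-dimensional numerical check of this decomposition, with an illustrative Gaussian density (analytic classical value 1/σ²) and a linear phase, might look as follows; all numbers are assumptions for the sketch.

```python
import numpy as np

# Classical Fisher (gradient) information of a 1D density:
#   I[p] = integral of (p'(x))^2 / p(x) dx
# For a Gaussian of variance sigma^2 the analytic value is 1/sigma^2.
sigma = 1.5
x = np.linspace(-12, 12, 4001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

dp = np.gradient(p, dx)
I_fisher = np.sum(dp**2 / p) * dx      # classical (modulus) contribution

# Nonclassical (phase) contribution for a linear phase phi = k * x:
#   4 * integral of p(x) * (phi'(x))^2 dx = 4 k^2 for normalized p
phase_k = 0.5
I_phase = 4 * np.sum(p * phase_k**2) * dx

I_resultant = I_fisher + I_phase       # resultant gradient information

assert abs(I_fisher - 1 / sigma**2) < 1e-3
```

The classical term depends only on the density (the “being” pattern), while the phase term vanishes for real, current-free states, mirroring the classical/nonclassical split of the resultant descriptors.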
These “thermodynamic” states represent phase-transforms of molecular wavefunctions and generate finite equilibrium currents. As an illustration, the disentangled (mutually closed) and thermodynamically entangled (mutually open) EO systems of the HZM construction have been examined. In this “plane-wave”-type representation, the fixed electron densities of molecular fragments generate finite electronic currents, due to the nonvanishing (local) EO phases, and hence also finite nonclassical contributions to the resultant IT descriptors.
References (12 in total):

1.  Information theory, atoms in molecules, and molecular similarity.

Authors:  R F Nalewajski; R G Parr
Journal:  Proc Natl Acad Sci U S A       Date:  2000-08-01       Impact factor: 11.205

2.  Conceptual density functional theory.

Authors:  P Geerlings; F De Proft; W Langenaeker
Journal:  Chem Rev       Date:  2003-05       Impact factor: 60.622

3.  Quantum-thermodynamic definition of electronegativity.

Authors:  E P Gyftopoulos; G N Hatsopoulos
Journal:  Proc Natl Acad Sci U S A       Date:  1968-07       Impact factor: 11.205

4.  Universal variational functionals of electron densities, first-order density matrices, and natural spin-orbitals and solution of the v-representability problem.

Authors:  M Levy
Journal:  Proc Natl Acad Sci U S A       Date:  1979-12       Impact factor: 11.205

5.  What is an atom in a molecule?

Authors:  Robert G Parr; Paul W Ayers; Roman F Nalewajski
Journal:  J Phys Chem A       Date:  2005-05-05       Impact factor: 2.781

6.  Electron localization function as information measure.

Authors:  Roman F Nalewajski; Andreas M Köster; Sigfrido Escalante
Journal:  J Phys Chem A       Date:  2005-11-10       Impact factor: 2.781

7.  Fisher information and steric effect: study of the internal rotation barrier of ethane.

Authors:  Rodolfo O Esquivel; Shubin Liu; Juan Carlos Angulo; Jesús S Dehesa; Juan Antolín; Moyocoyani Molina-Espíritu
Journal:  J Phys Chem A       Date:  2011-04-07       Impact factor: 2.781

8.  Role of electronic kinetic energy and resultant gradient information in chemical reactivity.

Authors:  Roman F Nalewajski
Journal:  J Mol Model       Date:  2019-08-16       Impact factor: 1.810

9.  Information-Theoretic Approaches to Atoms-in-Molecules: Hirshfeld Family of Partitioning Schemes.

Authors:  Farnaz Heidar-Zadeh; Paul W Ayers; Toon Verstraelen; Ivan Vinogradov; Esteban Vöhringer-Martinez; Patrick Bultinck
Journal:  J Phys Chem A       Date:  2018-04-20       Impact factor: 2.781

10.  Information equilibria, subsystem entanglement, and dynamics of the overall entropic descriptors of molecular electronic structure.

Authors:  Roman F Nalewajski
Journal:  J Mol Model       Date:  2018-07-19       Impact factor: 1.810

