Category Archives: Thermodynamics

Useful Thermochemistry from Gaussian Calculations


Statistical mechanics is the bridge between microscopic calculations and the thermodynamics of a particle ensemble. By computing a partition function, factored into electronic, rotational, translational, and vibrational contributions, one can obtain all the thermodynamic functions required to fully characterize a chemical reaction. Of these, the vibrational contribution, together with the electronic one, is the key element in getting those thermodynamic functions.
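As a reminder (this is standard statistical mechanics, not anything specific to Gaussian), the molecular partition function factorizes into those contributions, and the thermal corrections follow from its temperature derivatives; for instance:

\[
q = q_{\mathrm{elec}}\, q_{\mathrm{trans}}\, q_{\mathrm{rot}}\, q_{\mathrm{vib}},
\qquad
E_{\mathrm{thermal}} = N k_B T^{2} \left( \frac{\partial \ln q}{\partial T} \right)_{V}
\]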

Calculating the Free Energy change of any given reaction is a useful approach to assess its thermodynamic feasibility. A large negative change in Free Energy when going from reactants to products makes for a spontaneous (exergonic) reaction; nevertheless, the rate of the reaction is a different story, one that can be calculated as well.

Using the freq option in the route section of a Gaussian calculation is mandatory to ascertain that the current structure corresponds to a minimum on the potential energy hypersurface, but it also yields the thermochemistry and thermodynamic values for that structure. Thermochemistry calculations are not restricted to minima, however: they can also be applied to transition states, thereby yielding a full thermodynamic characterization of a reaction mechanism.
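A minimal input requesting both the optimization and the frequency (thermochemistry) analysis could look like the following sketch; the level of theory, basis set, and geometry are only placeholders, so adapt them to your own system:

%chk=water.chk
#P B3LYP/6-31G(d) opt freq

water, opt + freq example

0 1
O   0.000000   0.000000   0.117300
H   0.000000   0.757200  -0.469200
H   0.000000  -0.757200  -0.469200
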

A regular freq calculation yields the following output (all values in atomic units):

Zero-point correction=                           0.176113 (Hartree/Particle)
 Thermal correction to Energy=                    0.193290
 Thermal correction to Enthalpy=                  0.194235
 Thermal correction to Gibbs Free Energy=         0.125894
 Sum of electronic and zero-point Energies=           -750.901777
 Sum of electronic and thermal Energies=              -750.884600
 Sum of electronic and thermal Enthalpies=            -750.883656
 Sum of electronic and thermal Free Energies=         -750.951996

For any given reaction, say A + B -> C, one can take the value in the last row (let's call it G) for each of the three components of the reaction and perform the arithmetic ΔG = G(C) - [G(A) + G(B)], i.e., products minus reactants.
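A minimal sketch of that arithmetic in Python (the value for A is taken from the example output above, the other free energies are placeholders rather than results of any real calculation; only the conversion factor, 1 hartree = 627.5095 kcal/mol, is standard):

# "Sum of electronic and thermal Free Energies" read from each freq output, in hartree.
# A is taken from the output shown above; B and C are placeholders for illustration only.
G = {
    "A": -750.951996,
    "B": -115.123456,
    "C": -866.102345,
}

HARTREE_TO_KCAL = 627.5095  # kcal/mol per hartree

dG_hartree = G["C"] - (G["A"] + G["B"])   # products minus reactants
dG_kcal = dG_hartree * HARTREE_TO_KCAL

print(f"Delta G = {dG_hartree:.6f} hartree = {dG_kcal:.2f} kcal/mol")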

By default, Gaussian calculates these values (from the previously mentioned partition function) at standard conditions, T = 298.15 K and P = 1 atm. To assess the thermochemistry at other conditions, include in your route section the keywords Temperature=x.x and Pressure=x.x, in kelvin and atmospheres, respectively.
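For example, to re-evaluate the thermochemistry at body temperature the route section could read as follows (the level of theory is, again, just a placeholder):

#P B3LYP/6-31G(d) freq Temperature=310.15 Pressure=1.0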

(Huge) Disclaimer: Although calculating the thermochemistry of a reaction by means of DFT is a good (and potentially very useful) guide to chemical reactivity, getting quantitative results requires high-accuracy composite methods such as G3 or G4, collectively known as Gn methods, which consist of a pre-defined sequence of calculations. That sequence is carried out automatically, and no basis set should be specified. Other high-accuracy methods such as CBS-QB3 or W1U can also be considered whenever the Gn methods are too costly.
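In practice, requesting one of these compound methods only takes the corresponding keyword in the route section, for example:

#P G4

and likewise #P CBS-QB3 or #P W1U for the other compound models mentioned above.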

Estimation of pKa Values through Local Electrostatic Potential Calculations


Calculating the pKa value of a Brønsted acid is very hard, like really hard. A full thermodynamic cycle (fig. 1) needs to be calculated, along with high-accuracy solvation free energies for each of the species under consideration, not to mention the use of expensive methods, which will be reviewed here in another post in two weeks' time.

Fig. 1. Thermodynamic cycle for the pKa calculation of any given Brønsted acid, HA.
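The link between that cycle and the pKa is the standard relation below (nothing specific to our work), which also shows how unforgiving the conversion is: at 298.15 K every kcal/mol of error in the aqueous deprotonation free energy shifts the pKa by about 0.73 units.

\[
\mathrm{p}K_a = \frac{\Delta G^{\circ}_{\mathrm{aq}}}{RT \ln 10} \approx \frac{\Delta G^{\circ}_{\mathrm{aq}}}{1.364\ \mathrm{kcal\,mol^{-1}}} \quad (T = 298.15\ \mathrm{K})
\]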

Finding descriptors that circumvent the need for such sophisticated calculations can help a great deal in estimating the pKa value of any given acid. We have been interested in the reactivity of σ-hole-bearing groups in the past, and just like halogen, tetrel, pnicogen, and chalcogen bonds, hydrogen bonds are highly directional and their strength depends on the polarization of the O-H bond. Therefore, we suggested the use of the maximum surface electrostatic potential (VS,max) on the acidic hydrogen atom of carboxylic acids as a descriptor for the strength of their interaction with water, the first step in the deprotonation process.

We selected six basis sets, five density functionals, and the MP2 method, for a total of thirty-six levels of theory, to optimize thirty carboxylic acids and calculate their VS,max, for a grand total of 1,080 wavefunctions, which were then passed on to MultiWFN (all calculations used PCM with water as the solvent). The calculated VS,max values correlated well with the experimental pKa values across the levels of theory (R² > 0.9), except for B3LYP. Still, the best correlations were obtained with LC-wPBE/cc-pVDZ and wB97XD/cc-pVDZ; the linear fit for the latter level of theory yielded the following equation:

pKa = -0.2185(VS,max) + 16.1879
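A minimal sketch of how the fit would be used in practice (the VS,max value in the example is made up for illustration; we are assuming here that VS,max is expressed in kcal/mol, consistent with the magnitude of the fitted coefficients):

def estimate_pka(vs_max_kcal: float) -> float:
    """Estimate the pKa of a carboxylic acid from VS,max on its acidic hydrogen,
    using the linear fit quoted above (wB97XD/cc-pVDZ, PCM = water)."""
    return -0.2185 * vs_max_kcal + 16.1879

# Hypothetical VS,max value, for illustration only:
print(estimate_pka(52.0))  # ~4.83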

Differences between calculated and experimental pKa values turned out to be less than 0.5 units, which is remarkable for such a straightforward method; bear in mind that errors beyond chemical accuracy (1.0 kcal/mol) in a full thermodynamic-cycle calculation translate into pKa errors above 1.0 units.

We then put this equation to the test with 10 different carboxylic acids, and the predictions showed a 98% correlation with experiment (fig. 2).

Fig. 2. Calculated vs. experimental pKa values for a test set of 10 carboxylic acids, using the equation above.

I think this method can really catch on as a quick way to predict the pKa value of any carboxylic acid imaginable. We are now working on extending the model to other groups (e.g., Brønsted bases) and on putting together a black-box workflow to make it even more accessible and straightforward to use.

We recently published this work in the journal Molecules, an open-access publication. Thanks to Prof. Steve Scheiner for inviting us to participate in the special issue devoted to tetrel bonding. Thanks also to Guillermo Caballero for the inception of this project, to Dr. Jacinto Sandoval for taking time away from his research in photosynthesis to work on this pet project of ours, and of course to the rest of the students (Gustavo Mondragón, Marco Diaz, Raúl Torres) whose hard work made it possible.

Explaining Entropy can be a mess…


Another scientific concept that is hard for laypeople to grasp, and that in my opinion has been at the center of much distortion in the chemistry classroom, is the thermodynamic function Entropy, S.

More often than not, S is said to be a measure of “disorder”, and people just take it! If one were to define disorder, then one would also have to define order: Is my apartment too entropic? What about my life? Does nature understand order in the same way we do? How do we understand order inside a living cell, where many molecules and organelles are floating around? And if S were indeed a measure of disorder, why would it be important to measure it?

Entropy in a nutshell. There have been many attempts to define S in a way young students can understand, yet drawing parallels with ordinary everyday concepts is hard and often leads to misconceptions. A student of mine once asked: “If entropy is always increasing, how come bodies tend to cool down?” He meant to ask how come the translational motions of a molecular ensemble tend to decrease (the ensemble thereby achieving “order”).

Prof. Mayo Martínez-Kahn at UNAM in Mexico wrote a very interesting paper about Entropy in the local journal of the Chemistry School, “Educación Química”. The paper was entitled “The Tombs of Entropy”, a reference to the widely known fact that Boltzmann’s tomb is engraved with his famous equation relating Entropy to the number of microstates, W. Prof. Martínez then ventures into imagining what the tombs of other people who have made contributions to the concept and notion of S might look like. I distinctly remember Sadi Carnot’s, on which his famous thermodynamic cycle was displayed.

Entropy, in so many words, is a function that describes how many microstates (ways of distributing the energy over the available energy levels) are accessible to a thermodynamic system. The more of them there are, the higher the entropy. It also describes the spontaneity of a process, since in nature a system always tends to undergo changes that increase the combined entropy of the system and its surroundings.
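This counting picture is precisely what the equation engraved on Boltzmann’s tomb expresses,

\[
S = k_B \ln W,
\]

where W is the number of microstates compatible with the macroscopic state.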

How come the Gibbs and Helmholtz free energies don’t cause such confusion? My guess is that nobody has attached an everyday word to them!

PS. It is still important to make scientific concepts permeate the general audience. The recently deceased comedian George Carlin mentioned Entropy in the following video…
