As a continuation of our previous work on estimating pKa values from DFT calculations for carboxylic acids, we now present the complementary pKb values for amino groups by the same method, and the coupling of both methodologies for predicting the isoelectric point (pI) values of amino acids as a proof of concept.
Analogously to our work on pKa, we now used the Minimum Surface Electrostatic Potential, VS,min, as a descriptor of the availability of the nitrogen lone pair and correlated it with the experimental basicity of a large number of amines, separated into three groups: primary, secondary, and tertiary amines.
Interestingly, the correlation coefficient between experimental and calculated pKb values decreases in the following order: primary (R2 = 0.9519) > secondary (R2 = 0.9112) > tertiary (R2 = 0.8172). This could be due to steric effects, the change in s-character of the lone pair, or just plain old selection bias. Nevertheless, there is a good correlation between both values, and the resulting equations can predict the pKb value of an amino group to within less than one unit, which is very good for a statistical method that does not require the calculation of a full thermodynamic cycle.
We then took thirteen amino acids (those without titratable side chains) and simultaneously calculated VS,min and VS,max for the amino and carboxyl groups (the latter with equation 2 from our previous work published in Molecules), and the arithmetic average of both gave us their corresponding pI values with an agreement within one unit.
This work is now available at the Journal of Chemical Information and Modeling (DOI: 10.1021/acs.jcim.9b01173); as always a shoutout is due to the people working on it: Leonardo “Leo” Lugo, Gustavo “Gus” Mondragón and leading the charge Dr. Jacinto Sandoval-Lira.
We’re always happy at the lab when a student defends their dissertation thesis and now it was the turn of Raúl Márquez-Avilés to do so with flying colors.
The title of his dissertation is “Molecular Dynamics Simulations of 5 potential entry inhibitors for HIV-1”. He performed 500 ns long molecular dynamics simulations of the CD4 – gp120 proteins interacting with one or several molecules of various lead compounds with inhibitory properties. The leads were obtained previously in our group (by Durbis Castillo, now at McGill) from a massive docking library of ca. 16 million compounds, all having a central piperazine core (Fig. 1).
The protein gp120 is a glycoprotein located at the surface of the HIV virus which couples to the CD4 protein on T-lymphocytes, this being the first step in the infection process of a healthy cell; generating inhibitors of this coupling could help stop the infection from spreading systemically. Four systems were devised: (SB) the reference state, for which only gp120 and CD4 were considered; (S2) a single ligand molecule placed in the Phe43 cavity of gp120 to assess its inhibitory capacity; (S3) the ligand placed right outside the Phe43 cavity to assess its entry capacity; and (S4) five ligand molecules placed outside the Phe43 cavity of gp120 to force their entry (Fig. 2). Binding energies were calculated using MM-PBSA; although all five ligands show statistically similar results as inhibitors, all five exhibit a stronger binding energy than the reference, supporting their efficacy in preventing the coupling of the virus to the healthy cell. As a bonus, his research on system S4 shed light on the existence of an allosteric site on gp120 that warrants further research in our group.
This work is still pending publication.
Raúl Márquez has always proven to be a hard-working and very self-sufficient student, a very cheerful labmate, and, as I just learned yesterday, an avid chess player. I’m sure he has a bright future in whichever endeavor he chooses now. Congratulations Raúl Márquez-Avilés!
Funny enough, I was unable to log into my Linux (Ubuntu) session and I realized this might be a more common problem than it seemed. So, if you keep getting redirected to the login screen after typing your correct password over and over (and over and over), there’s no need to panic.
This usually has to do with the .Xauthority file, so from the login page press Ctrl+Alt+F1, which will bring you to the command line, where you can log in with your usual credentials. Once logged in, find the .Xauthority file and check that it is owned by you and not by root:
ls -l ~/.Xauthority
-rw------- 1 root root 1 Feb 11 13:13 /home/joaquin/.Xauthority
Use the following command to change the ownership (note the order: user first, then group):
chown username:group ~/.Xauthority
In my case both username and group are joaquin. You may need to ‘sudo’ it. If that doesn’t work, try deleting the file altogether; it will be created again upon login.
rm -rf ~/.Xauthority
In any case, once one of these solutions has worked, press Ctrl+Alt+F7 to go back to the graphical login screen, and now you should be able to get in.
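As a minimal sketch of the ownership fix, here is the same sequence run on a throw-away file rather than the real ~/.Xauthority, so you can try it safely (substitute your own user; ‘joaquin’ is just the example from this post):

```shell
# Stand-in demonstration of the ownership fix; the real target is ~/.Xauthority.
XAUTH=$(mktemp)                        # throw-away file standing in for ~/.Xauthority
ls -l "$XAUTH"                         # inspect owner, group and permissions
chown "$(id -un):$(id -gn)" "$XAUTH"   # note the order: user first, then group
[ -O "$XAUTH" ] && echo "owned by you" # -O tests that you own the file
rm -f "$XAUTH"                         # deleting ~/.Xauthority forces its recreation at login
```

On the real file you would replace `$XAUTH` with `~/.Xauthority` and likely prefix the chown with `sudo`, since the broken file is owned by root.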
These solutions are quite straightforward, but if the problem persists you may need to update the system, or downright install it again from the command line we opened at the beginning.
We’re sad to begin this year by saying farewell to Dr. Jacinto Sandoval-Lira who held a postdoc position in our lab for two years with a DGAPA – UNAM scholarship, a very competitive and highly sought-after position here in Mexico. Dr. Sandoval will now relocate to the Technological Institute of San Martín Texmelucan in Puebla, Mexico, whose students will be fortunate to have him as a tutor and a teacher of chemistry in the environmental engineering department.
During the past two years we’ve worked together on various projects, mainly the excitonic transference between photosynthetic pigments, but also on calculating reaction mechanisms and solving chemical equilibria problems with various computational approaches. Apart from the research, Dr. Sandoval was also a co-organizer of the past Meeting on Physical Chemistry, organized a local course on the use of DensToolKit (DTK) as well as our weekly lab seminars, and taught various graduate and undergraduate courses on molecular modeling, chemoinformatics, and computational chemistry, not to mention all the collaborations he has brought to our lab in the field of organic chemistry, of which I regard him as an expert. He really has been a force of nature!
Aside from being a brilliant and hard-working scientist, Jacinto is an exceptional human being and a great friend. His attention to detail, his drive, and his willingness to help others reach their full potential make him an ideal colleague and an ideal professor. I don’t wish you luck, Jacinto, you don’t need it: I wish you success!
This time I’m trying to deliver a personal video post to close the #IYPT2019 celebrations. I hope you find it interesting.
I invite you all to always imitate molecules and react!
It was my distinct pleasure to participate in the organization of the latest edition of the Mexican Meeting on Theoretical Physical Chemistry, RMFQT, which took place last week here in Toluca with the help of the School of Chemistry of the Universidad Autónoma del Estado de México.
This year the national committee created a Lifetime Achievement Award for Dr. Annik Vivier, Dr. Carlos Bunge, and Dr. José Luis Gázquez. This recognition from our community is awarded to these fine scientists not only for their contributions to theoretical chemistry but also for their pioneering work in the field in Mexico. The three of them were invited to talk about any topic of their choosing; in particular, Dr. Vivier stirred the imagination of younger students by showing pictures from the times when she used to hang out with Slater, Roothaan, Löwdin, etc. It is always nice to put faces to equations.
Continuing with a recent tradition we also had the pleasure to host three invited plenary lectures by great scientists and good friends of our community: Prof. William Tiznado (Chile), Prof. Samuel B. Trickey (USA), and Prof. Julia Contreras (France) who shared their progress on their recent work.
As I’ve abundantly pointed out in the past, the RMFQT is a joyous occasion for the Mexican theoretical community to get together with old friends and discuss very exciting research being done in our country and by our colleagues abroad. I’d like to add a big shoutout to Dr. Jacinto Sandoval-Lira for his valuable help with the organization of our event.
Elucidating the pairing of non-hydrogen-bonded unnatural base pairs (UBPs) is still a controversial subject due to the lack of specificity in their mutual interactions. Experimentally, NMR is the method of choice, but the DNA strand must be affixed to a template of sorts, such as a polymerase protein. Those discrepancies are well documented in a recent review which cites our previous computational work, both DFT and MD, on UBPs.
Since that last paper of ours on synthetic DNA, my good friend Dr. Rodrigo Galindo from the University of Utah and I have had serious doubts about the real pairing fashion exhibited by Romesberg’s famous hydrophobic nucleotides d5SICS – dNaM. While the authors claim a stacked pairing (within the context of the strand in the KlenTaq polymerase enzyme), our simulations showed that a Watson-Crick-like pairing was favored in the native form. To further shed light on the matter we performed converged, microsecond-long simulations, varying the force field (two recent AMBER force fields were explored: Bsc1 and OL15), the water model (TIP3P and OPC), and the ionic compensation scheme (Na+/Cl– or Mg2+/Cl–).
In the image below it can be observed that the pairing is consistently WC (dC1′–C1′ ≈ 10.4 Å) in the most populated clusters, regardless of the force field.
Also, a flipping experiment was performed in which both nucleotides were placed 180.0° outwards and the system was left to converge inwards, to explore a ‘de novo’ pairing guided solely by their mutual interactions and the template formed by the rest of the strand. Distance populations for C1′ – C1′ were 10.4 Å for Bsc1 (regardless of ionic compensation) and 9.8 Å for OL15 (10.4 Å when Mg2+ was used as charge compensation).
Despite the successful replication of these two nucleotides by a living organism -which is a fantastic feat!- there is little chance they can be used for real coding applications (biological or otherwise) due to the lack of structural control of the double helix. The work of Romesberg is impressive, make no mistake about it, but my money isn’t on hydrophobic unnatural nucleotides for information applications 🙂
All credit and glory is due to the amazing Dr. Rodrigo Galindo-Murillo from the University of Utah, where he works as a developer for the AMBER code, among many other things. Go check his impressive record!
I just came back from beautiful Cancun where I attended for the third time the IMRC conference invited by my good friend and awesome collaborator Dr. Eddie López-Honorato, who once again pulled off the organization of a wonderful symposium on materials with environmental applications.
Dr. López-Honorato and I have been working for a number of years now on the design and application of various kinds of materials that can eliminate arsenic species from drinking water supplies, an ever-present problem in northern Mexico and the southwestern US. So far we have successfully explored the idea of using calix[n]arene hosts for various arsenic (V) oxides and their derivatives, but now his group has been thoroughly exploring the use of graphene and graphene oxide (GO) to perform the task.
Our joint work is a wonderful example of what theory and experiment can achieve when working hand-in-hand. During this invited talk I had the opportunity to speak about the modeling side of graphene oxide, in which we’ve been able to rationalize why polar solvents seem to be -counterintuitively- more efficient than non-polar solvents at exfoliating graphene sheets from graphite through attrition milling, as well as to understand the electronic mechanism by which UV radiation degrades GO without significantly diminishing its arsenic-adsorbing properties. All these results are part of an upcoming paper, so more details will come ahead.
Thanks to Dr. Eddie López for his invitation and the opportunity provided to meet old friends and make new ones within the wonderful world of scientific collaborations.
Statistical Mechanics is the bridge between microscopic calculations and the thermodynamics of a particle ensemble. By calculating a partition function, divided into electronic, rotational, translational, and vibrational contributions, one can obtain all the thermodynamic functions required to fully characterize a chemical reaction. Among these, the vibrational contribution, together with the electronic one, is the key element for getting the thermochemistry.
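In symbols, a minimal sketch of that factorization and its link to the thermodynamic functions (written here for an ideal gas of N indistinguishable molecules) looks like this:

```latex
% The molecular partition function factorizes into independent contributions:
q = q_{\mathrm{elec}}\, q_{\mathrm{trans}}\, q_{\mathrm{rot}}\, q_{\mathrm{vib}}
% For an ideal gas of N indistinguishable molecules:
Q = \frac{q^{N}}{N!}
% From Q the thermodynamic functions follow, e.g.:
U - U_0 = k_B T^2 \left( \frac{\partial \ln Q}{\partial T} \right)_{N,V},
\qquad
S = k_B \ln Q + \frac{U - U_0}{T},
\qquad
G = H - TS
```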
Calculating the Free Energy change of any given reaction is a useful approach to assess its thermodynamic feasibility. A large negative change in Free Energy when going from reagents to products corresponds to a quantitatively spontaneous (exergonic) reaction; nevertheless, the rate of the reaction is a different story, one that can be calculated as well.
Using the freq option in the route section of a Gaussian calculation is mandatory to ascertain that the current wave function corresponds to a minimum on the potential energy hypersurface, but it also yields the thermochemistry and thermodynamic values for the current structure. However, thermochemistry calculations are not restricted to minima: they can also be applied to transition states, thereby yielding a full thermodynamic characterization of a reaction mechanism.
A regular freq calculation yields the following output (all values in atomic units):
Zero-point correction=                           0.176113 (Hartree/Particle)
Thermal correction to Energy=                    0.193290
Thermal correction to Enthalpy=                  0.194235
Thermal correction to Gibbs Free Energy=         0.125894
Sum of electronic and zero-point Energies=            -750.901777
Sum of electronic and thermal Energies=               -750.884600
Sum of electronic and thermal Enthalpies=             -750.883656
Sum of electronic and thermal Free Energies=          -750.951996
For any given reaction, say A + B -> C, one could take the values from the last row (let’s call them G) for all three components of the reaction and perform the arithmetic DG = G(C) – [G(A) + G(B)], i.e., products minus reagents.
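The arithmetic can be sketched in a few lines of Python; note that the values for G(B) and G(C) below are made up for illustration (only G(A) is taken from the output above), and the Hartree-to-kcal/mol conversion is applied at the end since the raw output is in atomic units:

```python
HARTREE_TO_KCAL = 627.509  # conversion factor: 1 Hartree = 627.509 kcal/mol

# "Sum of electronic and thermal Free Energies" values, in Hartree.
# G_A comes from the sample output above; G_B and G_C are illustrative only.
G_A = -750.951996
G_B = -115.532478
G_C = -866.520147

dG_hartree = G_C - (G_A + G_B)          # products minus reagents
dG_kcal = dG_hartree * HARTREE_TO_KCAL  # convert to chemists' units
print(f"DG = {dG_hartree:.6f} Hartree = {dG_kcal:.2f} kcal/mol")
```

A negative DG, as in this made-up example, indicates an exergonic (spontaneous) reaction.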
By default, Gaussian calculates these values (from the previously mentioned partition function) using normal conditions, T = 298.15 K and P = 1 atm. For an assessment of the thermochemistry at other conditions you can include in your route section the corresponding keywords Temperature=x.x and Pressure=x.x, in Kelvin and atmospheres, respectively.
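For example, a route section requesting thermochemistry at body temperature might look like the line below; the method and basis set here are placeholders, so use whatever level of theory fits your system:

```
#p M062X/6-311+G(d,p) freq Temperature=310.15 Pressure=1.0
```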
(Huge) Disclaimer: Although calculating the thermochemistry of any reaction by means of DFT calculations is a good (and potentially very useful) guide to chemical reactivity, getting quantitative results requires high-accuracy methods like G3 or G4, collectively known as Gn methods, which are composed of pre-defined stepwise calculations. The sequence of these calculations is carried out automatically; no basis set should be specified. Other high-accuracy methods like CBS-QB3 or W1U can also be considered whenever Gn methods are too costly.
The format of a research paper hasn’t changed much throughout history, despite the enormous changes in platforms available for their consumption and the near extinction of the library issue. Convenient electronic files such as PDFs still resemble printed-and-bound-in-issues papers in their layout instead of exploiting the seemingly endless capabilities of the electronic format.
For instance, why do we still need page numbers? A DOI is a full, traceable, and unique identifier for each work, and there are so many works nowadays that publishers have to pour them out as e-first, ASAP, and just-accepted articles before assigning them page numbers, a process which is still a concern for some researchers (and even for some of the organizations funding them or evaluating their performance). Numbers for issues, volumes, and pages are library indexes needed to sort and retrieve information from physical journals, but in the e-realm, where one can browse all issues online, perform a search, and download the results, these indexes are hardly of any use; only the year is helpful in establishing a chronological order for the development of ideas.

This brings me to the next issue (no pun intended): if bound issues are no longer a thing, then neither should covers be. Being selected for a cover is a huge honor; it means the editorial staff think your work stands out from the works published in the same period. But nowadays it is an honor that comes at a price, sometimes a high price. With the existence of covers, back covers, inner covers, inner back covers, and whatnot, at USD$1,500 a piece, the honor gets a bit diluted. Advertisers know this, and they now place their ads as banners, pop-ups, and other digital formats instead of -to some extent- paying for ad pages in the journals.
I recently posted a quick informal poll on Twitter about the scientific reading habits of chemists and I confirmed what I expected: only one in five still prefers to mostly read papers on actual paper*, the rest rely on an electronic version such as HTML full text or the most popular PDF on a suitable reader.
— Joaquin Barroso (@joaquinbarroso) June 3, 2019
What came as a surprise for me was that in the follow up poll, Reference Manager programs such as Mendeley, Zotero, EndNote or ReadCube are only preferred by 15% while 80% prefer the PDF reader (I’m guessing Acrobat Reader might be the most popular.) A minority seems to prefer the HTML full text version, which I think is the richest but hardly customizable for note taking, sharing, or, uhm hoarding.
A follow up on the previous poll. Dear #ChemTweeps, if you mostly read papers in electronic format what is your preferred platform?
— Joaquin Barroso (@joaquinbarroso) June 10, 2019
I’m a Mendeley user because I like the integration between users, its portability between platforms, and the synchronization features, but if I were to move to another reference manager it would be ReadCube. I like taking notes, highlighting text, and adding summaries and ideas onto the file, but above all I like the fact that I can conduct searches in the myriad of PDF files I’ve accumulated over the years. During my PhD studies I had piles of (physical) paper and folders with PDF files that were sometimes easier to print than to sort and organize (I even had a spreadsheet tracking the literature I’d read, a nightmarish project in itself!)
So, here is my wish list for what I want e-papers in the 21st century to do. Some features are already somewhat available in some journals, some can be achieved within the PDF itself, and others would require a new format or a new platform. Please comment on what other features you would like to have in papers.
- Say goodbye to the two columns format. I’m zooming to a single column anyway.
- Pop-up charts/plots/schemes/figures. Let me take a look at any graphical object by hovering (or 3D touching in iOS, whatever) on the “see Figure X” legend instead of having to move back and forth to check it, especially when the legend is “see figure SX” and I have to go to the Supporting Information file/section.
- Pop-up References. Currently some PDFs let you jump to the References section when you click on a citation, but you can’t jump back; you have to scroll and find the point where you left off.
- Interactive objects. Structures, whether from X-ray diffraction experiments or calculations, could be deposited as raw coordinate files for people to play with and, most importantly, to download** and work with. This would increase the hosting capacity journals need to devote to each work, so I’m not holding my breath.
- Audio output. This one is likely trickier, but by far the most helpful. I commute long hours, so having papers read out loud would be a huge time-saver, but it has to be smart. Currently I make Siri read papers by opening them in the Mendeley app, then “select all”, “voice”, but when it hits a formula or a set of equations the flow is lost (instead of reading water as ‘H-Two-O’, it reads ‘H-subscript Two-O’; try having the formula of a perovskite be read).
- A compiler that outputs the ‘traditional version‘ for printing. Sure, why not.
I realize this post may come out as shallow in view of the Plan-S or FAIR initiatives, sorry for that but comfort is not incompatible with accessibility.
What other features do you think research papers should have by now?
* It is true that our attention -and more importantly- our retention of information is not the same when we read on paper as when we read on a screen. Recently there was an interview on this matter on Science Friday.
** I absolutely hate having a Supporting Information section with long PDF lists of coordinates to copy-paste and fix into a new input file. OpenBabel, people!