Category Archives: Paper
The format of a research paper hasn’t changed much throughout history, despite the enormous changes in the platforms available for reading them and the near extinction of the library issue. Convenient electronic files such as PDFs still mimic the layout of printed, bound-in-issues papers instead of exploiting the seemingly endless capabilities of the electronic format.
For instance, why do we still need page numbers? A DOI is a full, traceable, and unique identifier for each work, and there are so many papers nowadays that publishers have to pour them out as e-first, ASAP, and just-accepted articles before page numbers are assigned, a process which is still a concern for some researchers (and even for some of the organizations funding them or evaluating their performance). Issue, volume, and page numbers are library indexes needed to sort and retrieve information from physical journals, but in the e-realm, where one can browse all issues online, perform a search, and download the results, these indexes are hardly of any use; only the year is helpful in establishing a chronological order for the development of ideas.

This brings me to the next issue (no pun intended): if bound issues are no longer a thing, then neither should be covers. Being selected for a cover is a huge honor; it means the editorial staff think your work stands out from the works published in the same period. But nowadays it is an honor that comes at a price, sometimes a high price. With the existence of covers, back covers, inner covers, inner back covers, and whatnot at USD $1,500 apiece, the honor gets a bit diluted. Advertisers know this, and they now place their ads as banners, pop-ups, and other digital formats instead of paying (to some extent) for ad pages in the journals.
I recently posted a quick informal poll on Twitter about the scientific reading habits of chemists, and I confirmed what I expected: only one in five still prefers to mostly read papers on actual paper*; the rest rely on an electronic version, whether the HTML full text or, most popularly, a PDF in a suitable reader.
— Joaquin Barroso (@joaquinbarroso) June 3, 2019
What came as a surprise to me was that in the follow-up poll, reference manager programs such as Mendeley, Zotero, EndNote or ReadCube are preferred by only 15%, while 80% prefer a plain PDF reader (I’m guessing Acrobat Reader might be the most popular). A minority seems to prefer the HTML full-text version, which I think is the richest but is hardly customizable for note-taking, sharing, or, uhm, hoarding.
A follow up on the previous poll. Dear #ChemTweeps, if you mostly read papers in electronic format what is your preferred platform?
— Joaquin Barroso (@joaquinbarroso) June 10, 2019
I’m a Mendeley user because I like the integration between users, its portability between platforms, and the synchronization features, but if I were to move to another reference manager it would be ReadCube. I like taking notes, highlighting text, and adding summaries and ideas onto the file, but above all I like the fact that I can conduct searches in the myriad of PDF files I’ve accumulated over the years. During my PhD studies I had piles of (physical) paper and folders of PDF files that were sometimes easier to print than to sort and organize (I even kept a spreadsheet of the literature I’d read, a nightmarish project in itself!)
So, here is my wish list for what I want e-papers in the 21st century to do. Some features are already available in some journals, and some can be achieved within the PDF itself; others would require a new format or a new platform. Please comment on what other features you would like papers to have.
- Say goodbye to the two-column format. I’m zooming to a single column anyway.
- Pop-up charts/plots/schemes/figures. Let me take a look at any graphical object by hovering (or 3D-touching in iOS, whatever) over the “see Figure X” legend instead of having to move back and forth to check it, especially when the legend is “see Figure SX” and I have to go to the Supporting Information file/section.
- Pop-up references. Currently some PDFs let you jump to the References section when you click on a citation, but you can’t jump back; you have to scroll and find the point where you left off.
- Interactive objects. Structures, whether from X-ray diffraction experiments or calculations, could be deposited as raw coordinate files for people to play with and, most importantly, to download** and work with. This would increase the hosting space journals need to devote to each work, so I’m not holding my breath.
- Audio output. This one is trickier, but by far the most helpful. I commute long hours, so having papers read out loud would be a huge time-saver, but it has to be smart. Currently I make Siri read papers by opening them in the Mendeley app, then “select all“, “voice“, but when it hits a formula or a set of equations the flow is lost (instead of reading water as ‘H-Two-O‘, it reads ‘H-subscript-Two-O‘; try having the formula of a perovskite read out).
- A compiler that outputs the ‘traditional version‘ for printing. Sure, why not.
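On the audio wish above: a smarter reader could preprocess formulas before handing the text to a TTS engine. A minimal sketch of the idea (`tts_friendly` is a hypothetical helper of my own, not any existing API):

```python
import re

# Map Unicode subscript digits to plain ones so "H₂O" becomes "H2O"
SUBSCRIPTS = str.maketrans("₀₁₂₃₄₅₆₇₈₉", "0123456789")

def tts_friendly(text: str) -> str:
    """Rewrite chemical formulas so a TTS engine reads them naturally."""
    text = text.translate(SUBSCRIPTS)
    # Space out digits that follow element symbols: "H2O" -> "H 2 O",
    # so the engine says "H two O" instead of "H-subscript-two-O"
    return re.sub(r"(?<=[A-Za-z])(\d+)", r" \1 ", text)

print(tts_friendly("H₂O"))      # "H 2 O"
print(tts_friendly("CH3COOH"))  # "CH 3 COOH"
```

A real implementation would need to handle charges, isotopes, and full equations, but even this level of cleanup would keep Siri from derailing on every formula.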
I realize this post may come across as shallow in view of the Plan S or FAIR initiatives; sorry for that, but comfort is not incompatible with accessibility.
What other features do you think research papers should have by now?
* It is true that our attention, and more importantly our retention of information, is not the same when we read on paper as when we read on a screen. There was recently an interview on this matter on Science Friday.
** I absolutely hate having a Supporting Information section with long PDF lists of coordinates to copy, paste, and fix into a new input file. OpenBabel, people!
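This is exactly the kind of cleanup Open Babel automates; for the simplest case, a few lines of Python are enough to turn a pasted coordinate table into a valid XYZ file. (The assumed input layout, one ‘Element x y z’ per line, is a guess about a typical SI table; the water geometry below is just a placeholder.)

```python
def block_to_xyz(block: str, comment: str = "extracted from SI") -> str:
    """Turn a pasted 'Element  x  y  z' coordinate block into XYZ format."""
    atoms = []
    for line in block.strip().splitlines():
        parts = line.split()
        if len(parts) < 4:
            continue  # skip headers or stray page numbers from the PDF
        x, y, z = map(float, parts[1:4])
        atoms.append(f"{parts[0]:2s} {x:12.6f} {y:12.6f} {z:12.6f}")
    # XYZ format: atom count, comment line, then one atom per line
    return f"{len(atoms)}\n{comment}\n" + "\n".join(atoms)

print(block_to_xyz("O 0.0 0.0 0.117\nH 0.0 0.757 -0.470\nH 0.0 -0.757 -0.470"))
```

Once in XYZ, `obabel` can convert the geometry into the input format of virtually any quantum chemistry code.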
Calculating the pKa value for a Brønsted acid is very hard, like really hard. A full thermodynamic cycle (fig. 1) needs to be calculated, along with high-accuracy solvation free energies for each of the species under consideration, not to mention the use of expensive methods, which will be reviewed here in another post in two weeks’ time.
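For context, the textbook relation behind such cycles links the pKa to the aqueous deprotonation free energy, which is why small energy errors blow up into large pKa errors:

```latex
\mathrm{p}K_a = \frac{\Delta G^{\circ}_{\mathrm{aq}}}{RT \ln 10}
\approx \frac{\Delta G^{\circ}_{\mathrm{aq}}\,[\mathrm{kcal/mol}]}{1.364}
\qquad (T = 298.15~\mathrm{K})
```

An error of only ~1.4 kcal/mol in ΔG therefore already shifts the predicted pKa by a full unit.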
Finding descriptors that circumvent the need for such sophisticated calculations can help a great deal in estimating the pKa of any given acid. We’ve been interested in the reactivity of σ-hole-bearing groups in the past, and just like halogen, tetrel, pnicogen, and chalcogen bonds, hydrogen bonds are highly directional and their strength depends on the polarization of the O–H bond. Therefore, we proposed the maximum surface electrostatic potential (VS,max) on the acidic hydrogen atom of carboxylic acids as a descriptor for the strength of their interaction with water, the first step in the deprotonation process.
We selected six basis sets and five density functionals plus the MP2 method, for a total of thirty-six levels of theory, to optimize thirty carboxylic acids and calculate their VS,max, for a grand total of 1,080 wavefunctions, which were then passed on to MultiWFN (all calculations were run with PCM = water). The correlation with experimental pKa values was great across levels of theory (R² > 0.9), except for B3LYP. The best correlations were obtained with LC-ωPBE/cc-pVDZ and ωB97XD/cc-pVDZ; from the latter level of theory, the linear fit yielded the following equation:
pKa = -0.2185(VS,max) + 16.1879
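As a quick illustration, the fit can be applied directly; the VS,max value below is a made-up input, and I’m assuming VS,max is expressed in kcal/mol, as is customary for surface electrostatic potentials:

```python
def predict_pka(vs_max: float) -> float:
    """pKa from the wB97XD/cc-pVDZ linear fit: pKa = -0.2185*VS,max + 16.1879."""
    return -0.2185 * vs_max + 16.1879

# e.g. a (hypothetical) VS,max of 50 kcal/mol maps to a pKa of about 5.26
print(predict_pka(50.0))
```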
Differences in pKa turned out to be less than 0.5 units, which is remarkable for such a straightforward method; bear in mind that even full thermodynamic-cycle calculations at chemical accuracy (1.0 kcal/mol) yield pKa errors above 1.0 units.
We then put this equation to the test with 10 different carboxylic acids, and the prediction had a correlation of 98% (fig. 2).
I think this method can really catch on as a quick way to predict the pKa of any carboxylic acid imaginable. We’re now working on extending the model to other groups (i.e. Brønsted bases) and putting together a black-box workflow to make it even more accessible and straightforward to use.
We’ve recently published this work in the journal Molecules, an open access publication. Thanks to Prof. Steve Scheiner for inviting us to participate in the special issue devoted to tetrel bonding. Thanks to Guillermo Caballero for the inception of this project and to Dr. Jacinto Sandoval for taking the time from his research in photosynthesis to work on this pet project of ours and of course the rest of the students (Gustavo Mondragón, Marco Diaz, Raúl Torres) whose hard work produced this work.
Two new papers on the development of chemosensors for different applications were recently published and we had the opportunity to participate in both with the calculation of electronic interactions.
A chemosensor requires a measurable response, and calculating either that response from first principles based on the electronic structure, or another physicochemical property related to it, are useful strategies in molecular design. Additionally, electronic structure calculations help us unveil the molecular mechanisms underlying the response and its efficiency, as well as provide a starting point for continuous improvement.
In the first paper, CdTe quantum dots (QDs) are used to visualize cell-membrane damage in real time through a Gd Schiff base sensitizer (GdQDs). This probe interacts preferentially with a specific sequence motif of the NHE-RF2 scaffold protein, which is exposed during cell damage. This interaction yields intensely fluorescent droplets which can be visualized in real time with standard instrumentation. Calculations at the M06-2X/LANL2DZ level of theory, plus an external double-zeta-quality basis set on Gd, were employed to characterize the electronic structure of the Gd³⁺ complex, the quantum dot, and their mutual interactions. The first challenge was to come up with the right multiplicity for Gd³⁺ (an f⁷ ion), for which we had no experimental evidence of its magnetic properties. From searching the literature and talking to my good friend, inorganic chemist Dr. Vojtech Jancik, it was more or less clear the multiplicity had to be an octet (all seven electrons unpaired).
As can be seen in figure 1a, the Gd-N interactions are mostly electrostatic in nature, a fact that is also reflected in the Wiberg bond indices, calculated as 0.16, 0.17, and 0.21 (a single bond would yield a WBI value closer to 1.0).
PM6 was employed to optimize the GdQD as a whole (figure 1f), and MM-UFF to characterize its union to a peptide sequence (figure 2), from which we observed, somewhat unsurprisingly, that Gd³⁺ interacts preferentially with the electron-rich residues.
This research was published in ACS Applied Materials and Interfaces. Thanks to Prof. Vojtech Adam from the Mendel University in Brno, Czech Republic for inviting me to collaborate with their interdisciplinary team.
The second sensor I want to write about today comes from a closer-to-home collaboration with Dr. Alejandro Dorazco, who developed a fluorescent porphyrin system that becomes strongly quenched in the presence of iodide but not of any other halide. This allows for fast detection of iodide anions, related to some gland diseases, in aqueous samples such as urine. This probe was also granted a patent which technically lists yours truly as an inventor, cool!
The calculated interaction energy between I⁻ and the porphyrin was huge, which supports the idea of an ionic interaction through which charge transfer quenches the fluorescence of the probe. Figure 3 above shows how the HOMO largely resides on the iodide, whereas the LUMO is located on the π electron system of the porphyrin.
This research was published in Sensors and Actuators B – Chemical.
I began my path in computational chemistry while I was still an undergraduate student, working on my thesis under Professor Cea at UNAM, synthesizing main group complexes with sulfur-containing ligands. Quite a mouthful, I know. My first calculations therefore dealt with obtaining bond indices for bidentate ligands bonded to tin, antimony, and even arsenic; yes, I worked with arsenic once! Happily, I keep a tight bond (pun intended) with inorganic chemists, and the two recent papers published with the group of Prof. Mónica Moya are proof of that.
In the first paper, cyclic metallaborates were formed with Ga and Al, but when a cycle of a given size formed with one metal it didn’t with the other (fig. 1), so I calculated the relative energies of both analogues while compensating for the change in the number of electrons with the following equation:
ΔE = E(MₙBₓOᵧ) − nE(M) + nE(M′) − E(M′ₙBₓOᵧ)   (Eq. 1)
A seamless substitution would imply ΔE = 0 when changing from M to M′.
The calculated ΔE were: ΔE(3/3′) = -81.38 kcal/mol; ΔE(4/4′) = 40.61 kcal/mol; ΔE(5/5′) = 70.98 kcal/mol
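Eq. 1 is simple bookkeeping and can be written as a small helper; the numbers in the test run below are placeholders, not the actual energies from the paper:

```python
def substitution_energy(e_m_ring: float, e_mprime_ring: float,
                        e_m: float, e_mprime: float, n: int) -> float:
    """Eq. 1: energy change for swapping n atoms of M for M' in a metallaborate
    ring, balancing the atom count with isolated-metal energies.
    All energies must share the same units; ΔE ≈ 0 means a seamless substitution."""
    return e_m_ring - n * e_m + n * e_mprime - e_mprime_ring

# placeholder energies (same units throughout), n = 2 metal centers
print(substitution_energy(-100.0, -90.0, -5.0, -3.0, 2))
```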
In all, the increased stability and higher covalent character of the Ga-O-Ga unit compared to that of the Al analogue favors the formation of different sized rings.
Additionally, a free-energy analysis was performed to assess the relative stability between compounds. Changes in free energy can be obtained easily from the thermochemistry section of a Gaussian FREQ calculation.
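Concretely, the line to look for in the output is “Sum of electronic and thermal Free Energies”; a minimal extraction sketch (the log fragment below is invented for illustration):

```python
import re

HARTREE_TO_KCAL = 627.5095  # 1 Hartree in kcal/mol

def free_energy_hartree(logtext: str) -> float:
    """Extract G (in Hartree) from the thermochemistry section of a Gaussian
    frequency-calculation output."""
    m = re.search(r"Sum of electronic and thermal Free Energies=\s*(-?\d+\.\d+)",
                  logtext)
    if m is None:
        raise ValueError("no thermochemistry section found")
    return float(m.group(1))

# hypothetical fragment of a .log file
sample = " Sum of electronic and thermal Free Energies=            -76.408951"
g = free_energy_hartree(sample)
print(g, g * HARTREE_TO_KCAL)
```

ΔG between two compounds is then just the difference of the extracted values, converted to kcal/mol.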
This paper is published in Inorganic Chemistry under the following citation: Erandi Bernabé-Pablo, Vojtech Jancik, Diego Martínez-Otero, Joaquín Barroso-Flores, and Mónica Moya-Cabrera* “Molecular Group 13 Metallaborates Derived from M−O−M Cleavage Promoted by BH3” Inorg. Chem. 2017, 56, 7890−7899
The second paper deals with heavier atoms and the bonds formed around yttrium complexes with triazoles, for which we calculated a more detailed distribution of the electron density and concluded that the coordination of Cp to Y involves a high component of ionic character.
This paper is published in Polyhedron: Ana Cristina García-Álvarez, Erandi Bernabé-Pablo, Joaquín Barroso-Flores, Vojtech Jancik, Diego Martínez-Otero, T. Jesús Morales-Juárez, Mónica Moya-Cabrera* “Multinuclear rare-earth metal complexes supported by chalcogen-based 1,2,3-triazole” Polyhedron 135 (2017) 10-16
We keep working on other projects, and I hope we keep doing so for the foreseeable future because those main group metals have been in my blood all this century. Thanks and a big shoutout to Dr. Mónica Moya for keeping me in her highly productive and competitive team of researchers; here is to many more years of joint work.
As is the case with proteins, the functioning of DNA is highly dependent on its 3D structure and not only on its sequence, but the difference is that protein tertiary structure has an enormous variety, whereas DNA is (almost) always a double helix with few variations. The canonical base pairs AT and CG stabilize the famous double helix, but the same cannot be guaranteed when non-canonical, unnatural base pairs (UBPs) are introduced.
When I first took a look at Romesberg’s UBPs, d5SICS and dNaM (referred to throughout the study as X and Y, see Fig. 1), it was evident that they could not form hydrogen bonds; in the end, they’re substituted naphthalenes with no discernible way of creating a synthon like their natural counterparts. That’s when I called Dr. Rodrigo Galindo at the University of Utah, who is one of the developers of the AMBER code and very knowledgeable on matters of DNA structure and dynamics; he immediately got on board, and soon enough we were launching molecular dynamics simulations and quantum mechanical calculations. That was more than two years ago.
Our latest paper in Phys. Chem. Chem. Phys. deals with the dynamical and structural stability of a DNA strand in which Romesberg’s UBPs are introduced sequentially, one pair at a time, into Dickerson’s dodecamer (a palindromic sequence) from the Protein Data Bank. Therein, the d5SICS-dNaM pair was inserted right in the middle, forming a tridecamer; as expected, 10+ microsecond molecular dynamics simulations exhibited the same stability as the control dodecamer (Fig. 2, left). We didn’t need to go far with the substitutions to get the double helix to go awry within a couple of microseconds: three non-consecutive inclusions of UBPs were enough to yield a less regular structure (Fig. 2, right), and with five, a globular structure was obtained for which it is not possible to get a proper average of the most populated structures.
X and Y don’t form hydrogen bonds, so the pairing is pretty much forced by the scaffold of the rest of the DNA double helix. There is some controversy as to how X and Y fit together, whether they overlap or wedge between each other; according to our results, a C1-C1′ distance of 11 Å is most stable, consistent with the wedging conformation. Still, much work is needed to understand the pairing between X and Y, and even more so to get a pair of useful UBPs. More papers on this topic in the near future.
Ever since I read the highly praised article by Floyd Romesberg in Nature back in 2013 I got really interested in synthetic biology. In said article, an unnatural base pair (UBP) was not only inserted into a DNA double strand in vivo but the organism was even able to reproduce the UBPs present in subsequent generations.
Inserting new unnatural base pairs into DNA works a lot like editing a computer’s code. Inserting a couple of UBPs in vitro is like inserting a comment; it won’t make a difference, but it’s still there. If the DNA sequence containing the UBPs can be amplified by molecular biology techniques such as PCR, it means that a polymerase enzyme is able to recognize it and set it in place; this is equivalent to inserting a ‘hello world’ section into working code: it will compile, but it’s pretty much useless. Inserting these UBPs in vivo means that the organism is able to thrive despite the large deformation in a short section of its genetic code, but having them replicated by the chemical machinery of the nucleus is an amazing feat that only a few molecules could allow.
The ultimate goal of synthetic biology would be to find a UBP which codes effectively and purposefully during translation of DNA. This last feat would be equivalent to inserting a working subroutine in a program with a specific purpose. But the use of UBPs could serve more than the purpose of expanding the genetic code from a quaternary (base four) to a senary (base six) system: the field of DNA origami could also benefit from an expansion in the chemical and structural possibilities of the famous double helix, and marking and editing a sequence would become easier with distinctive sections containing nucleotides other than A, T, C and G.
It is precisely in the concept of the double helix that our research takes place, since the available biochemical machinery for translation and replication can only work on a double helix; otherwise, the repair mechanisms get activated or the DNA will just stop serving its purpose (i.e. the code won’t compile).
My good friend, Dr. Rodrigo Galindo and I have worked on the simulation of Romesberg’s UBPs in order to understand the underlying structural, dynamical and electronic causes that made them so successful and to possibly design more efficient UBPs based on a set of general principles. A first paper has been accepted for publication in Phys.Chem.Chem.Phys. and we’re very excited for it; more on that in a future post.
The literature in synthetic chemistry is full of reactions that do occur, but very little or no attention is paid to those that do not proceed. The question here is: what can we learn from reactions that do not take place even when our chemical intuition tells us they’re feasible? Is there valuable knowledge to be acquired by studying the ‘anti-driving force’ that inhibits a reaction? This is the focus of a new manuscript recently published by our research group in Tetrahedron (DOI: 10.1016/j.tet.2016.05.058), which was the basis of Guillermo Caballero’s BSc thesis.
It is well known in organic chemistry that if a molecular structure has the possibility to be aromatic, it can somehow undergo an aromatization process to achieve that more stable state. During some experimental efforts, Guillermo Caballero found two compounds that could easily be regarded as non-aromatic tautomers of a substituted pyridine but which were not transformed into the aromatic compound by any means explored: treatment with strong bases, thermal conditions, or photochemical reaction conditions.
These results led us to investigate the causes that inhibit these aromatization reactions, and here is where computational chemistry took over. As a first approach, we proposed two plausible reaction mechanisms for the aromatization process and evaluated them with DFT transition state calculations at the M05-2X/6-31+G(d,p)//B3LYP/6-31+G(d,p) levels of theory. The results showed that although the aromatic tautomers are indeed more stable than their corresponding non-aromatic ones, a high activation free energy is needed to reach the transition states. Thus, the barrier heights are the first reason why aromatization is inhibited; there just isn’t enough thermal energy in the environment for the transformation to occur.
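To put ‘not enough thermal energy’ into numbers, the Eyring equation converts a computed barrier into a rate constant; the barrier values below are generic illustrations, not the actual values from the paper:

```python
import math

def eyring_rate(dg_act_kcal: float, temp: float = 298.15) -> float:
    """Rate constant (s^-1) from an activation free energy (kcal/mol)
    via the Eyring equation: k = (kB*T/h) * exp(-dG'/(R*T))."""
    k_b = 1.380649e-23   # Boltzmann constant, J/K
    h = 6.62607015e-34   # Planck constant, J*s
    r = 1.987204e-3      # gas constant, kcal/(mol*K)
    return (k_b * temp / h) * math.exp(-dg_act_kcal / (r * temp))

# A ~20 kcal/mol barrier is crossed within minutes at room temperature,
# while a ~40 kcal/mol barrier gives an essentially infinite half-life.
print(eyring_rate(20.0), eyring_rate(40.0))
```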
But this is only the proximal cause; we then went searching for the distal causes (i.e. the reasons behind the high barriers). The second part of the work was the calculation of the delocalization energies and frontier molecular orbitals of the non-aromatic tautomers at the HF/cc-pVQZ level of theory, to gain insight into the large barrier heights. The energies showed a strong delocalization of the nitrogen’s lone pair towards the oxygen atom of the carbonyl group. Such delocalization promoted the formation of an electron corridor built from frontier and close-to-frontier molecular orbitals, resembling an extended push-pull effect. The hydrogen atoms that could promote the aromatization process are shown to be chemically inaccessible.
Further calculations on a series of analogous compounds showed that the dimethylamino moiety plays a crucial role in preventing the aromatization process. When this group was changed for a nitro group, the calculations yielded a decrease in the barrier height, enough for the reaction to proceed. Electronically, the bonding electron corridor is interrupted by a pull-pull effect, which was assessed through the delocalization energies.
The identity of the compounds under study was confirmed through ¹H and ¹³C NMR as well as 2D NMR experiments (HMBC, HMQC), so we had to dive headlong into experimental techniques to back our calculations.
As part of an ongoing collaboration with the University of Arizona (UA) and the Center for Advanced Research and Studies (CINVESTAV – Saltillo), we are looking into the use of calix[n]arenes as bio-remediation agents capable of extracting arsenic(V) and (III) species from water. Water contamination by arsenic is a pressing issue in northern Mexico and the southern US, so any effort aimed at its elimination has strong social and health repercussions.
As in previous studies, all calixarenes were optimized along with their corresponding guests within the cavity, namely H₃AsO₄, H₂AsO₄⁻ and HAsO₄²⁻, at the DFT level with the so-called Minnesota functionals by Zhao and Truhlar, at the M06-2X/6-31G(d,p) level of theory. Interaction energies were calculated through the NBO deletion (NBODel) procedure. Calixarenes with R = SO₃H and PO₃H are the most promising leads. This study is now published in the Journal of Inclusion Phenomena and Macrocyclic Chemistry (DOI 10.1007/s10847-016-0617-0) as an online-first article.
This article is also the first to be published by our undergraduate (and in a month, grad student) Gustavo Mondragón, who took this project on alongside his own research on photosynthesis.
Now my colleagues in Arizona and Saltillo, Prof. Reyes Sierra and Dr. Eddie López, respectively, will work on the experimental side of the project. Further calculations are being undertaken to extend this study to As(III) and to the use of other potential extracting materials such as metallic nanoparticles to which calixarenes could be covalently linked.