2697 Publications

Investigating the membrane curvature sensing ability of the N-terminal domain of huntingtin

Shelli Frey, Jordyn Markle, A. Sahoo, et al.

Huntington's disease (HD) is an inherited neurodegenerative disorder associated with motor and cognitive decline, caused by a mutation in the poly-glutamine (polyQ) region near the N-terminus of the huntingtin (htt) protein. Expansion of the polyQ region results in disease characterized by oligomeric and fibrillar aggregates of the mutated protein. The first 17 amino acids (Nt17) of htt, which are adjacent to the polyQ tract, function as a lipid-binding domain, facilitated by the formation of an amphipathic α-helix. There is increasing evidence that lipid interactions may play a role in the toxic gain of function associated with the htt polyQ expansion, as membrane-related changes, including structural abnormalities of several organelles, are observed in HD. Given the uneven and curved shapes of organelles, it is important to examine the mechanisms that drive the preferential partitioning of Nt17 to curved membranes. To better understand the role of the cell membrane environment in the interaction and aggregation of htt, circular dichroism, fluorescence microscopy, and coarse-grained molecular dynamics were employed to measure the association of Nt17 with phospholipid vesicles and its subsequent effects over time. In zwitterionic curved membranes, sensing was driven by the bulky sidechains of phenylalanine residues, which are able to detect lipid packing defects in the curved regions of the membrane. However, in mixtures of zwitterionic and anionic lipids, curvature sensing is modulated by the anionic lipid content, implying that membrane surface charge affects the curvature sensing process. Salt screening experiments suggest that a balance between electrostatic and hydrophobic interactions governs the extent to which Nt17 can sense physiologically relevant regions of curvature.

Generalized Compressed Sensing for Image Reconstruction with Diffusion Probabilistic Models

We examine the problem of selecting a small set of linear measurements for reconstructing high-dimensional signals. Well-established methods for optimizing such measurements include principal component analysis (PCA), independent component analysis (ICA) and compressed sensing (CS) based on random projections, all of which rely on axis- or subspace-aligned statistical characterization of the signal source. However, many naturally occurring signals, including photographic images, contain richer statistical structure. To exploit such structure, we introduce a general method for obtaining an optimized set of linear measurements for efficient image reconstruction, where the signal statistics are expressed by the prior implicit in a neural network trained to perform denoising (known as a "diffusion model"). We demonstrate that the optimal measurements derived for two natural image datasets differ from those of PCA, ICA, or CS, and result in substantially lower mean squared reconstruction error. Interestingly, the marginal distributions of the measurement values are asymmetrical (skewed), substantially more so than those of previous methods. We also find that optimizing with respect to perceptual loss, as quantified by structural similarity (SSIM), leads to measurements different from those obtained when optimizing for MSE. Our results highlight the importance of incorporating the specific statistical regularities of natural signals when designing effective linear measurements.
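The PCA and random-projection (CS) baselines that the paper compares against can be illustrated with a minimal NumPy sketch. A correlated-Gaussian source stands in for image patches, and the diffusion-model prior itself is not implemented here; for such a source, PCA is the MSE-optimal linear scheme, so it beats a random measurement matrix of the same budget:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in signal source: correlated 16-dimensional Gaussians playing the
# role of image patches (the paper uses natural images and a diffusion prior).
d, n, m = 16, 2000, 4                      # signal dim, samples, measurements
A = rng.standard_normal((d, d))
X = rng.standard_normal((n, d)) @ A.T      # rows are signals
X -= X.mean(axis=0)

# PCA measurements: project onto the top-m principal directions.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
M_pca = Vt[:m]                             # m x d, orthonormal rows
X_hat_pca = (X @ M_pca.T) @ M_pca          # linear reconstruction
mse_pca = np.mean((X - X_hat_pca) ** 2)

# CS-style random measurements with the same budget, reconstructed by
# least squares via the pseudoinverse.
M_rand = rng.standard_normal((m, d))
X_hat_rand = (X @ M_rand.T) @ np.linalg.pinv(M_rand).T
mse_rand = np.mean((X - X_hat_rand) ** 2)

# For a Gaussian source, PCA minimizes MSE among linear schemes.
assert mse_pca <= mse_rand + 1e-9
```

The paper's contribution is replacing the Gaussian assumption with the richer prior implicit in a trained denoiser, which changes the optimal measurements away from the PCA directions.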

Observation of Floquet–Bloch states in monolayer graphene

Floquet engineering is a novel method of manipulating quantum phases of matter via periodic driving [1, 2]. It has been successfully utilized in platforms ranging from photonic systems [3] to optical lattices of ultracold atoms [4, 5]. In solids, light can serve as the periodic drive via coherent light-matter interaction, hybridizing Bloch electrons with photons and producing replica bands known as Floquet-Bloch states. After the direct observation of Floquet-Bloch states in a topological insulator [6], their manifestations have been seen in a number of other experiments [7-14]. By engineering the electronic band structure using Floquet-Bloch states, various exotic phase transitions have been predicted to occur [15-22]. To realize these phases, it is necessary to better understand the nature of Floquet-Bloch states in different materials. However, direct energy- and momentum-resolved observation of these states is still limited to only a few material systems [6, 10, 14, 23, 24]. Here, we report direct observation of Floquet-Bloch states in monolayer epitaxial graphene, the first material platform proposed for Floquet engineering [15]. Using time- and angle-resolved photoemission spectroscopy (trARPES) with mid-infrared (mid-IR) pump excitation, we detected replicas of the Dirac cone. The pump polarization dependence of these replica bands unequivocally shows that they originate from scattering between Floquet-Bloch states and photon-dressed free-electron-like photoemission final states, called Volkov states. Beyond graphene, our method can potentially be used to directly observe Floquet-Bloch states in other systems, paving the way for Floquet engineering in a wide range of quantum materials.

BAnG: Bidirectional Anchored Generation for Conditional RNA Design

Roman Klypa, A. Bietti, Sergei Grudinin

Designing RNA molecules that interact with specific proteins is a critical challenge in experimental and computational biology. Existing computational approaches require a substantial amount of experimentally determined RNA sequences for each specific protein or a detailed knowledge of RNA structure, restricting their utility in practice. To address this limitation, we develop RNA-BAnG, a deep learning-based model designed to generate RNA sequences for protein interactions without these requirements. Central to our approach is a novel generative method, Bidirectional Anchored Generation (BAnG), which leverages the observation that protein-binding RNA sequences often contain functional binding motifs embedded within broader sequence contexts. We first validate our method on generic synthetic tasks involving localized motifs similar to those appearing in RNAs, demonstrating its benefits over existing generative approaches. We then evaluate our model on biological sequences, showing its effectiveness for conditional RNA sequence design given a binding protein.

In-Context Denoising with One-Layer Transformers: Connections between Attention and Associative Memory Retrieval

We introduce in-context denoising, a task that refines the connection between attention-based architectures and dense associative memory (DAM) networks, also known as modern Hopfield networks. Using a Bayesian framework, we show theoretically and empirically that certain restricted denoising problems can be solved optimally even by a single-layer transformer. We demonstrate that a trained attention layer processes each denoising prompt by performing a single gradient descent update on a context-aware DAM energy landscape, where context tokens serve as associative memories and the query token acts as an initial state. This one-step update yields better solutions than exact retrieval of either a context token or a spurious local minimum, providing a concrete example of DAM networks extending beyond the standard retrieval paradigm. Overall, this work solidifies the link between associative memory and attention mechanisms first identified by Ramsauer et al., and demonstrates the relevance of associative memory models in the study of in-context learning.
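The correspondence between a softmax attention readout and one gradient-descent step on a dense associative memory energy can be checked numerically with a minimal NumPy sketch. Random vectors stand in for context and query tokens, and `beta` is an assumed inverse-temperature parameter rather than anything learned; the update used here is the standard modern-Hopfield rule, whose energy is guaranteed not to increase:

```python
import numpy as np

def softmax(z):
    z = z - z.max()                 # numerically stable softmax
    e = np.exp(z)
    return e / e.sum()

def dam_energy(q, K, beta):
    # Modern Hopfield / DAM energy: a log-sum-exp attraction toward the
    # stored patterns (rows of K) plus a quadratic regularizer on the state.
    lse = np.log(np.sum(np.exp(beta * K @ q))) / beta
    return -lse + 0.5 * q @ q

def attention_update(q, K, beta):
    # One softmax-attention readout over the context tokens K. Since
    # grad E(q) = q - softmax(beta K q) @ K, this readout is exactly a
    # single unit-step gradient-descent update on dam_energy.
    return softmax(beta * K @ q) @ K

rng = np.random.default_rng(1)
K = rng.standard_normal((8, 4))     # 8 context tokens acting as memories
q = rng.standard_normal(4)          # query token as the initial state
beta = 2.0

q_new = attention_update(q, K, beta)
assert dam_energy(q_new, K, beta) <= dam_energy(q, K, beta) + 1e-9
```

The resulting `q_new` is a softmax-weighted blend of context tokens rather than an exact copy of any single one, which is the "beyond exact retrieval" behavior the abstract describes.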

The first complete 3D reconstruction and morphofunctional mapping of an insect eye

Anastasia A Makarova, N. Chua, Anna V Diakova, Inna A Desyatirkina, P. Gunn, Song Pang, C Shan Xu, Herald F Hess, D. Chklovskii, Alexey A Polilov

The structure of compound eyes in arthropods has been the subject of many studies, revealing important biological principles. Until recently, these studies were constrained by the two-dimensional nature of available ultrastructural data. By taking advantage of a novel three-dimensional ultrastructural dataset obtained using volume electron microscopy, we present the first cellular-level reconstruction of the whole compound eye of an insect, the miniaturized parasitoid wasp Megaphragma viggianii. The compound eye of the female M. viggianii consists of 29 ommatidia and contains 478 cells. Despite the almost anucleate brain, all cells of the compound eye contain nuclei. As in larger insects, the dorsal rim area of the eye in M. viggianii contains ommatidia that are believed to be specialized for polarized-light detection, as reflected in their corneal and retinal morphology. We report the presence of three ‘ectopic’ photoreceptors. Our results offer new insights into the miniaturization of compound eyes and the scaling of sensory organs in general.

Heuristic energy-based cyclic peptide design

Q. Zhu, V. Mulligan, Dennis Shasha

Rational computational design is crucial to the pursuit of novel drugs and therapeutic agents. Meso-scale cyclic peptides, which consist of 7-40 amino acid residues, are of particular interest due to their conformational rigidity, binding specificity, degradation resistance, and potential cell permeability. Because there are few natural cyclic peptides, de novo design involving non-canonical amino acids is a potentially useful goal. Here, we develop an efficient pipeline (CyclicChamp) for cyclic peptide design. After converting the cyclic constraint into an error function, we employ a variant of simulated annealing to search for low-energy peptide backbones while maintaining peptide closure. Compared to the previous random sampling approach, which was capable of sampling conformations of cyclic peptides of up to 14 residues, our method both greatly accelerates conformational sampling for small macrocycles (ca. 7 residues) and addresses the high-dimensionality challenge that large macrocycle designs often encounter. As a result, CyclicChamp makes conformational sampling tractable for 15- to 24-residue cyclic peptides, thus permitting the design of macrocycles in this size range. Microsecond-length molecular dynamics simulations on the resulting 15-, 20-, and 24-residue cyclic designs identify designs with kinetic stability. To test their thermodynamic stability, we perform additional replica exchange molecular dynamics simulations and generate free energy surfaces. Three 15-residue designs, one 20-residue design, and one 24-residue design emerge as promising candidates.
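The core idea of annealing over backbone degrees of freedom while penalizing failure to close the ring can be sketched with a toy 2D chain: unit-length bonds and turn angles stand in for a peptide backbone, and the "energy" is just the closure error. This is an illustrative stand-in, not CyclicChamp's actual error function or move set:

```python
import math
import random

def chain_points(angles):
    # Build a 2D chain of unit-length bonds from successive turn angles.
    x, y, theta = 0.0, 0.0, 0.0
    pts = [(x, y)]
    for a in angles:
        theta += a
        x += math.cos(theta)
        y += math.sin(theta)
        pts.append((x, y))
    return pts

def closure_error(angles):
    # Penalty driving the chain's end back onto its start (ring closure).
    pts = chain_points(angles)
    return math.hypot(pts[-1][0] - pts[0][0], pts[-1][1] - pts[0][1])

def anneal(n_res=15, steps=20000, t0=1.0, seed=0):
    rng = random.Random(seed)
    angles = [rng.uniform(-math.pi, math.pi) for _ in range(n_res)]
    e = closure_error(angles)
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-4   # linear cooling schedule
        i = rng.randrange(n_res)
        old = angles[i]
        angles[i] += rng.gauss(0.0, 0.3)       # perturb one backbone angle
        e_new = closure_error(angles)
        # Metropolis criterion: always accept downhill, sometimes uphill.
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
        else:
            angles[i] = old                    # reject: restore the angle
    return angles, e

angles, err = anneal()
assert err < 0.5                               # chain has (nearly) closed
```

In the real pipeline the objective also includes backbone energy terms, so the annealer searches for conformations that are simultaneously closed and low-energy; the toy above isolates only the closure constraint.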

Distributional Associations vs In-Context Reasoning: A Study of Feed-forward and Attention Layers

Lei Chen, J. Bruna, A. Bietti

Large language models have been successful at tasks involving basic forms of in-context reasoning, such as generating coherent language, as well as storing vast amounts of knowledge. At the core of the Transformer architecture behind such models are feed-forward and attention layers, which are often associated with knowledge and reasoning, respectively. In this paper, we study this distinction empirically and theoretically in a controlled synthetic setting where certain next-token predictions involve both distributional and in-context information. We find that feed-forward layers tend to learn simple distributional associations such as bigrams, while attention layers focus on in-context reasoning. Our theoretical analysis identifies the noise in the gradients as a key factor behind this discrepancy. Finally, we illustrate how similar disparities emerge in pre-trained models through ablations on the Pythia model family on simple reasoning tasks.
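The two mechanisms being contrasted can be made concrete with a toy next-token predictor (an assumed setup for illustration, not the paper's exact synthetic task): a "distributional" predictor uses global bigram counts, while an "in-context" predictor copies whatever followed the query token's most recent prior occurrence in the sequence, and the two can disagree:

```python
from collections import Counter

def bigram_predict(prev, data):
    # Distributional association: the token that most often follows
    # `prev` anywhere in the data (what feed-forward layers tend to learn).
    counts = Counter(b for a, b in zip(data, data[1:]) if a == prev)
    return counts.most_common(1)[0][0] if counts else None

def in_context_predict(context):
    # In-context reasoning: find the latest prior occurrence of the last
    # token and copy its successor (what attention layers tend to learn).
    query = context[-1]
    for i in range(len(context) - 2, -1, -1):
        if context[i] == query:
            return context[i + 1]
    return None

seq = list("ababaca")
print(bigram_predict("a", seq))   # 'b': globally, 'b' most often follows 'a'
print(in_context_predict(seq))    # 'c': the previous 'a' was followed by 'c'
```

On this sequence the bigram statistics favor 'b' while the in-context rule returns 'c', which is exactly the kind of conflict the paper's controlled setting is built around.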

Superfast Direct Inversion of the Nonuniform Discrete Fourier Transform via Hierarchically Semiseparable Least Squares

Heather Wilber, Ethan N. Epperly, A. Barnett

A direct solver is introduced for solving overdetermined linear systems involving nonuniform discrete Fourier transform matrices. Such matrices can be transformed into a Cauchy-like form that has hierarchical low rank structure. The rank structure of this matrix is explained, and it is shown that the ranks of the relevant submatrices grow only logarithmically with the number of columns of the matrix. A fast rank-structured hierarchical approximation method based on this analysis is developed, along with a hierarchical least-squares solver for these and related systems. This result is a direct method for inverting nonuniform discrete transforms with a complexity that is usually nearly linear with respect to the degrees of freedom in the problem. This solver is benchmarked against various iterative and direct solvers in the setting of inverting the one-dimensional type-II (or forward) transform, for a range of condition numbers and problem sizes (up to (4 10
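A dense least-squares baseline for the type-II (forward) problem is easy to write down with NumPy: it recovers the Fourier coefficients accurately but costs O(MN^2), which is the scaling the hierarchical solver avoids. Jittered grid points are an assumption chosen so the example is well-conditioned; this sketch does not implement the Cauchy-like transformation or the hierarchically semiseparable structure:

```python
import numpy as np

rng = np.random.default_rng(2)

# Type-II (forward) NUDFT: f_j = sum_k c_k * exp(-2*pi*i * x_j * k),
# sampled at M nonuniform points x_j, with N Fourier modes (overdetermined).
M, N = 200, 64
x = (np.arange(M) + rng.uniform(-0.3, 0.3, M)) / M   # jittered sample points
k = np.arange(N)
A = np.exp(-2j * np.pi * np.outer(x, k))             # M x N NUDFT matrix

c_true = rng.standard_normal(N) + 1j * rng.standard_normal(N)
f = A @ c_true                                       # observed samples

# Dense least-squares inversion, O(M N^2). The paper's direct solver
# instead exploits hierarchical low-rank structure to reach near-linear
# complexity in the degrees of freedom.
c_hat, *_ = np.linalg.lstsq(A, f, rcond=None)
assert np.allclose(c_hat, c_true, atol=1e-6)
```

For well-separated points like these, the dense solve is stable; the interesting regimes in the paper are large M and N and poor conditioning, where neither dense direct solves nor unpreconditioned iterations remain practical.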
