Sitemap
A list of all the posts and pages found on the site. For you robots out there, there is an XML version available for digesting as well.
Pages
Posts
Future Blog Post
Published:
This post will show up by default. To disable scheduling of future posts, edit config.yml and set future: false.
Blog Post number 4
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 3
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 2
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 1
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
notes
Singular Value Decomposition
Pro tip for dummies: in a product of matrices, the number of columns of the matrix on the left must match the number of rows of the matrix on the right. For example, an m x n matrix (m rows and n columns) can only be multiplied by an n x l matrix, and the result is an m x l matrix. The column space has dimension at most equal to the number of rows, and the row space at most equal to the number of columns.
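As a quick worked instance of that rule (a made-up example, not taken from the note): a \(2 \times 3\) matrix times a \(3 \times 2\) matrix gives a \(2 \times 2\) matrix,
\[
\underbrace{\begin{pmatrix} 1 & 0 & 2 \\ 0 & 1 & 1 \end{pmatrix}}_{2 \times 3}
\underbrace{\begin{pmatrix} 1 & 1 \\ 2 & 0 \\ 0 & 3 \end{pmatrix}}_{3 \times 2}
=
\underbrace{\begin{pmatrix} 1 & 7 \\ 2 & 3 \end{pmatrix}}_{2 \times 2}.
\]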
The Great Recession
The Great Recession
Blackheads
Who is studying blackheads?
Motivation
Short-term and long-term rewards
My Hearing Tests
Here are the results of my hearing test before and after the surgery.
Protein intake
The International Society of Sports Nutrition's position stand:
Vitamin D
How to get vitamin D
High Performance Computing Servers
Important Warning
What follows might be bullshit. I am just trying to make sense of things from my experience and my readings, but I am not an expert in HPC.
Arch Linux
What the fuck is Arch Linux?
Bash scripting
Everything is taken from this amazing guide, or this other amazing guide.
Jekyll
Jekyll is a static site generator. This means that if you run the command jekyll build
within a folder containing a Gemfile (Jekyll is written in Ruby) and a bunch of appropriately organized folders and files, the command will produce a website ready to be hosted on a web server.
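A minimal sketch of that workflow, assuming the standard Ruby/Bundler setup (the commands below are illustrative, not quoted from the note):

```bash
# Install the gems listed in the Gemfile (Jekyll among them)
bundle install

# Build the site: Jekyll reads _config.yml, posts, pages and layouts,
# and writes the generated static site into the _site/ folder
bundle exec jekyll build

# Optionally, serve it locally (by default at http://localhost:4000)
# and rebuild on changes
bundle exec jekyll serve
```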
Linux Ecosystem
Terminology
- devices: any piece of hardware, including disk partitions
My Linux Distro Configuration
Distro: Linux Mint
The NMDA Receptor
NMDAR subunit composition
The NMDA receptor is a tetramer, meaning that it is composed of four subunits. Two of the subunits are always NR1, which contains the co-agonist binding site. One of the other subunits is always of the NR2 type. In the adult rat hippocampus, NR2A and NR2B are the predominant subunits. CA1 synapses contain multiple subtypes of NMDARs, including NR1/NR2A, NR1/NR2B and NR1/NR2A/NR2B.
STDP Models
Kempter et al. (1999) - Hebbian learning and spiking neurons
Receptor-Ligand Kinetics
Receptor-ligand kinetics is typically described by the chemical reaction:
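The excerpt cuts off before the reaction itself; the standard binding scheme it presumably refers to, written with the usual \(k_{\text{on}}\), \(k_{\text{off}}\) and dissociation constant \(K_d\), is
\[
\mathrm{R} + \mathrm{L} \;\underset{k_{\text{off}}}{\overset{k_{\text{on}}}{\rightleftharpoons}}\; \mathrm{RL},
\qquad
\frac{d[\mathrm{RL}]}{dt} = k_{\text{on}}[\mathrm{R}][\mathrm{L}] - k_{\text{off}}[\mathrm{RL}],
\qquad
K_d = \frac{k_{\text{off}}}{k_{\text{on}}}.
\]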
Analysis of Spike Trains
A spike train is a sequence of action potentials emitted by a neuron. For multiple reasons, the sequence of spikes displays some random behavior. Thus, they are often described using statistical quantities, and modeled as stochastic processes.
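The simplest such model, assuming the note follows the standard treatment (this is a reconstruction, not a quote), is the homogeneous Poisson process with constant firing rate \(r\): the number of spikes in a window of length \(T\) and the inter-spike intervals \(\tau\) satisfy
\[
P\{n \text{ spikes in } T\} = \frac{(rT)^n}{n!} e^{-rT},
\qquad
p(\tau) = r e^{-r\tau}.
\]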
Bike Valves
There are three types of bike valves that you may encounter:
- Schrader (or American) valve
- Presta (or French) valve
- Woods (or Dunlop or English) valve
Singular Value Decomposition
Singular Value Decomposition (SVD)
Time Series Analysis
Stationary Process and Autocorrelation function
Consider a time series \(\{ X_t\}_{t \in I}\), I being an index set. Assume \(E[X_t^2]<\infty \quad \forall t \in I\).
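For reference, the standard definitions this setup leads to (reconstructed here, not quoted from the note): the mean and autocovariance functions are
\[
\mu_X(t) = E[X_t],
\qquad
\gamma_X(t, s) = \mathrm{Cov}(X_t, X_s),
\]
and \(\{X_t\}\) is weakly stationary when \(\mu_X(t)\) is constant and \(\gamma_X(t, t+h)\) depends only on the lag \(h\), in which case the autocorrelation function is \(\rho_X(h) = \gamma_X(h)/\gamma_X(0)\).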
Ito Calculus
Here we denote a stochastic process as \(f(t, \omega)\), \(f: I \times \Omega \longrightarrow \mathbb{R}\).
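The excerpt stops at the notation; for context, the Ito integral of such a process with respect to Brownian motion \(B_t\) (assuming the usual adaptedness and square-integrability conditions) is the \(L^2\) limit of left-endpoint sums over partitions \(0 = t_0 < t_1 < \dots < t_n = T\):
\[
\int_0^T f(t, \omega)\, dB_t(\omega)
= \lim_{n \to \infty} \sum_{i=0}^{n-1} f(t_i, \omega)\,\bigl(B_{t_{i+1}}(\omega) - B_{t_i}(\omega)\bigr).
\]
Evaluating the integrand at the left endpoint is what distinguishes the Ito convention from the Stratonovich one.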
Probability Theory
Basics of Probability Theory
Stochastic Processes
Some definitions
publications
On the thermodynamic interpretation of deep learning systems
Published in International Conference on Geometric Science of Information, 2021
In the study of time evolution of the parameters in Deep Learning systems, subject to optimization via SGD (stochastic gradient descent), temperature, entropy and other thermodynamic notions are commonly employed to exploit the Boltzmann formalism. We show that, in simulations on popular databases (CIFAR10, MNIST), such simplified models appear inadequate: different regions in the parameter space exhibit significantly different temperatures and no elementary function expresses the temperature in terms of learning rate and batch size, as commonly assumed. This suggests a more conceptual approach involving contact dynamics and Lie Group Thermodynamics.
Download here
Effectiveness of Biologically Inspired Neural Network Models in Learning and Patterns Memorization
Published in Entropy, 2022
In this work, we propose an implementation of the Bienenstock–Cooper–Munro (BCM) model, obtained by a combination of the classical framework and modern deep learning methodologies. The BCM model remains one of the most promising approaches to modeling the synaptic plasticity of neurons, but its application has remained mainly confined to neuroscience simulations and few applications in data science. Methods: To improve the convergence efficiency of the BCM model, we combine the original plasticity rule with the optimization tools of modern deep learning. By numerical simulation on standard benchmark datasets, we prove the efficiency of the BCM model in learning, memorization capacity, and feature extraction. Results: In all the numerical simulations, the visualization of neuronal synaptic weights confirms the memorization of human-interpretable subsets of patterns. We numerically prove that the selectivity obtained by BCM neurons is indicative of an internal feature extraction procedure, useful for patterns clustering and classification. The introduction of competitiveness between neurons in the same BCM network allows the network to modulate the memorization capacity of the model and the consequent model selectivity. Conclusions: The proposed improvements make the BCM model a suitable alternative to standard machine learning techniques for both feature selection and classification tasks.
Download here
Astrocytes enhance plasticity response during reversal learning
Published in Communications Biology, 2024
Astrocytes play a key role in the regulation of synaptic strength and are thought to orchestrate synaptic plasticity and memory. Yet, how specifically astrocytes and their neuroactive transmitters control learning and memory is currently an open question. Recent experiments have uncovered an astrocyte-mediated feedback loop in CA1 pyramidal neurons which is started by the release of endocannabinoids by active neurons and closed by astrocytic regulation of the D-serine levels at the dendrites. D-serine is a co-agonist for the NMDA receptor regulating the strength and direction of synaptic plasticity. Activity-dependent D-serine release mediated by astrocytes is therefore a candidate for mediating between long-term synaptic depression (LTD) and potentiation (LTP) during learning. Here, we show that the mathematical description of this mechanism leads to a biophysical model of synaptic plasticity consistent with the phenomenological model known as the BCM model. The resulting mathematical framework can explain the learning deficit observed in mice upon disruption of the D-serine regulatory mechanism. It shows that D-serine enhances plasticity during reversal learning, ensuring fast responses to changes in the external environment. The model provides new testable predictions about the learning process, driving our understanding of the functional role of neuron-glia interaction in learning.
Download here