Sitemap

A list of all the posts and pages found on the site. For you robots out there, there is an XML version available for digesting as well.

Pages

Posts

Future Blog Post

less than 1 minute read

This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.
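
For reference, here is a minimal sketch of the relevant setting, assuming the standard Jekyll configuration layout (the rest of _config.yml is omitted):

    # _config.yml (excerpt)
    # Posts dated in the future are built only once their date arrives
    future: false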

Blog Post number 4

less than 1 minute read

This is a sample blog post. Lorem ipsum. I can't remember the rest of lorem ipsum, and I don't have an internet connection right now. Testing, testing, testing this blog post. Blog posts are cool.

Blog Post number 3

less than 1 minute read

This is a sample blog post. Lorem ipsum. I can't remember the rest of lorem ipsum, and I don't have an internet connection right now. Testing, testing, testing this blog post. Blog posts are cool.

Blog Post number 2

less than 1 minute read

This is a sample blog post. Lorem ipsum. I can't remember the rest of lorem ipsum, and I don't have an internet connection right now. Testing, testing, testing this blog post. Blog posts are cool.

Blog Post number 1

less than 1 minute read

This is a sample blog post. Lorem ipsum. I can't remember the rest of lorem ipsum, and I don't have an internet connection right now. Testing, testing, testing this blog post. Blog posts are cool.

Publications

Inferring long memory using extreme events

Dayal Singh Kalra and M. S. Santhanam, Chaos 31, 113131 (2021)

We study extreme events in time series with long-range correlations. We find that the long memory of a time series can be inferred from its extreme events alone.
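
As a toy numerical illustration of the idea (this is not the paper's analysis; the spectral exponent, the 98th-percentile threshold, and the helper lag1_interval_corr below are all illustrative choices), one can generate a long-range correlated series by Fourier filtering, define extreme events as threshold exceedances, and check that the memory survives in the return intervals between events:

    import numpy as np

    rng = np.random.default_rng(0)
    n, beta = 2**16, 0.8  # spectrum S(f) ~ f^(-beta): long-range correlated for 0 < beta < 1

    # Fourier filtering: shape white Gaussian noise to a 1/f^beta power spectrum
    freqs = np.fft.rfftfreq(n)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2)
    noise = rng.standard_normal(freqs.size) + 1j * rng.standard_normal(freqs.size)
    x = np.fft.irfft(amp * noise, n)
    x = (x - x.mean()) / x.std()

    def lag1_interval_corr(series, quantile=0.98):
        # Lag-1 correlation of return intervals between extreme events
        events = np.flatnonzero(series > np.quantile(series, quantile))
        tau = np.diff(events)
        return np.corrcoef(tau[:-1], tau[1:])[0, 1]

    # Memory shows up as correlated return intervals; shuffling destroys it
    print("long-range correlated:", round(lag1_interval_corr(x), 3))
    print("shuffled control:     ", round(lag1_interval_corr(rng.permutation(x)), 3))

For uncorrelated (or shuffled) data the return intervals are essentially independent, while long memory leaves a clear signature in their correlations, which is the kind of fingerprint in the extremes that the paper exploits.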

Initializing ReLU networks in an expressive subspace of weights

Dayal Singh and G. J. Sreejith, under review

We analyze signal propagation in ReLU networks with anti-correlated weights and demonstrate an order-to-chaos phase transition, which is absent in the uncorrelated case. Furthermore, we demonstrate that ReLU networks initialized at this criticality train faster.
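
For context, the uncorrelated baseline the paper contrasts with can be sketched with the standard mean-field length map: for a preactivation h ~ N(0, q), E[ReLU(h)^2] = q/2, so with i.i.d. Gaussian weights the layer-to-layer variance map is q -> sigma_w^2 * q / 2 + sigma_b^2, and sigma_w^2 = 2 (He initialization) is the critical point where signal norms are preserved. A minimal sketch of this i.i.d. case (the paper's anti-correlated analysis is not reproduced here):

    def length_map(q, sigma_w2, sigma_b2=0.0):
        # For h ~ N(0, q): E[ReLU(h)^2] = q / 2, hence a linear variance map
        return sigma_w2 * q / 2.0 + sigma_b2

    for sigma_w2 in (1.5, 2.0, 2.5):   # below, at, and above criticality
        q = 1.0
        for _ in range(50):            # a 50-layer-deep network
            q = length_map(q, sigma_w2)
        print(f"sigma_w^2 = {sigma_w2}: signal variance after 50 layers = {q:.2e}")

Signals die out below sigma_w^2 = 2 and blow up above it; only the critical point propagates them at a constant scale.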

Research

Statistical physics of deep neural networks

In the past few years, there has been steady progress in understanding deep neural networks (DNNs) through ideas from statistical physics. These studies have shed light on various theoretical questions about DNNs, including their function space and generalization properties. For my master's thesis, I worked on signal propagation in deep ReLU networks with correlated weights. In particular, we show that ReLU networks with anti-correlated weights have an order-to-chaos criticality, unlike the uncorrelated case. Furthermore, we propose initializing ReLU networks at this criticality and demonstrate that networks with the anti-correlated initialization train faster. A preprint is available on arXiv.
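
To make the signal-propagation picture concrete, here is a toy empirical check at the uncorrelated critical point (He initialization), tracking the cosine similarity of two inputs pushed through the same deep random ReLU network; the anti-correlated initialization itself is not reproduced, since its exact correlation structure is specified in the paper rather than in this summary:

    import numpy as np

    rng = np.random.default_rng(1)
    width, depth = 1024, 30

    # Build two unit-norm inputs with cosine similarity exactly 0.5
    x = rng.standard_normal(width)
    z = rng.standard_normal(width)
    z -= (z @ x) / (x @ x) * x            # orthogonalize z against x
    x /= np.linalg.norm(x)
    z /= np.linalg.norm(z)
    y = 0.5 * x + np.sqrt(1.0 - 0.5**2) * z

    for _ in range(depth):
        # One network, two inputs: the same He-initialized weights act on both signals
        W = rng.standard_normal((width, width)) * np.sqrt(2.0 / width)
        x = np.maximum(W @ x, 0.0)
        y = np.maximum(W @ y, 0.0)

    cos = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
    print(f"cosine similarity after {depth} ReLU layers: {cos:.3f}")

With i.i.d. weights the similarity creeps toward 1 with depth, so distinct inputs become harder to tell apart; an order-to-chaos transition of the kind described above would instead let nearby inputs decorrelate on the chaotic side.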

Talks