Szabolcs Horvát

Currently I am a postdoctoral researcher at the Stem-cell and Brain Research Institute (Inserm U846) in Lyon, France. I am working in the Cortical Architecture, Coding and Plasticity group led by Henry Kennedy.

Previously I worked at the Department of Physics and iCeNSA, University of Notre Dame, with Zoltán Toroczkai.

Contact me at:

By popular demand: how to pronounce my name.


Graph sampling with constraints

At the moment I am working on problems related to sampling random graphs (networks) with arbitrary constraints, and have developed a solution for the so-called degeneracy problem of Exponential Random Graph models (see below), which greatly extends their practical applicability.

Illustration of degeneracy in ERGs

The simplest type of sampling constraint is a sharp constraint, where the value of certain graph measures (quantities) is strictly prescribed: e.g. choose, with equal probability, any graph that has precisely \(m\) edges. There are no generally applicable methods for working with arbitrary sharp constraints.
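While no general method handles arbitrary sharp constraints, some particular ones are easy to satisfy directly. A minimal Python sketch (function name my own, for illustration only) that samples uniformly from all graphs on \(n\) vertices with exactly \(m\) edges, by drawing \(m\) distinct vertex pairs uniformly at random:

```python
import random

def uniform_graph_with_m_edges(n, m, seed=None):
    """Return the edge list of a graph chosen uniformly at random
    from all simple graphs on n labelled vertices with exactly m edges."""
    rng = random.Random(seed)
    # all possible edges of a simple undirected graph
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    # a uniform m-subset of possible edges is a uniform sample
    # from the sharp constraint "edge count = m"
    return rng.sample(pairs, m)
```

For more complicated sharp constraints (e.g. a full degree sequence combined with other measures), no such direct construction is available in general.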

An average constraint is more relaxed: sample from a graph distribution in which only the average of certain graph measures is fixed. There are many distributions with the same mean, and the principle of maximum entropy (E.T. Jaynes, 1957) selects the one which contains the least bias beyond the constraint. Applying average constraints to graph distributions leads to Exponential Random Graph (ERG) models.
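For a single constrained measure \(m(G)\), the maximum-entropy distribution takes the form \(P(G) \propto e^{\theta\, m(G)}\). The standard way to sample it is a Metropolis chain that toggles one edge at a time; here is a hedged Python sketch for the edge-count case (names and parameters my own, not code from the paper):

```python
import math
import random

def sample_erg_edges(n, theta, steps=20000, seed=None):
    """Metropolis sampler for the edge-count ERG, P(G) ∝ exp(theta * m(G)),
    where m(G) is the number of edges. Returns one sampled edge set.
    For this simple constraint the stationary distribution is G(n, p)
    with p = e^theta / (1 + e^theta)."""
    rng = random.Random(seed)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    edges = set()
    for _ in range(steps):
        e = rng.choice(pairs)
        delta = -1 if e in edges else 1      # change in edge count if toggled
        # Metropolis acceptance: min(1, exp(theta * delta))
        if rng.random() < math.exp(min(0.0, theta * delta)):
            if delta == 1:
                edges.add(e)
            else:
                edges.remove(e)
    return edges
```

With several constrained measures the exponent becomes a weighted sum, and only the `delta` computation changes; the acceptance rule is the same.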

ERG models have become a popular network analysis tool, but unfortunately they may suffer (depending on the particular constraints) from a significant drawback termed degeneracy: e.g. when the edge count of a graph is constrained on average, the sampled graphs may all be either very sparse or very dense, with almost none close to the average edge count. This precludes most practical applications, which assume a unimodal distribution concentrated around the average.

We have developed a practical solution for reducing degeneracy that relies on carefully chosen transformations of the constrained measures.

See the preprint on arXiv.

Analysing a novel synchronization model

Spontaneous synchronization occurs in many places in nature (fireflies flashing together, rhythmic applause, mechanically coupled metronomes, etc.) and has been widely studied. Most synchronization models fall into one of two main categories: phase-coupled oscillators (e.g. the Kuramoto model) or pulse-coupled oscillators (e.g. integrate-and-fire models).

An entirely different model (described by Nikitin et al., PRL 87, (2001)) uses oscillators that periodically output some signal: imagine flashing lights for an intuitive picture. The oscillators are bimodal, able to flash either quickly or slowly. A surprising result is that if each individual unit switches between the fast and slow modes in an attempt to keep the momentary total light intensity in the system at a fixed level, their flashing may become synchronized.

This model is remarkable because synchronization emerges as a non-obvious side effect of an interaction aimed at optimizing an entirely different property of the system. I worked with Prof. Zoltán Néda on understanding the behaviour of this model family and mapping its phase space.

Synchronization model phase space
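To make the mechanism concrete, here is a toy Python sketch in the spirit of the model described above (a discrete-time simplification of my own, not the exact dynamics of Nikitin et al.): each unit emits light during the first part of its cycle and, whenever it completes a cycle, picks the fast period if the current mean intensity is below a target level and the slow one otherwise.

```python
import random

def simulate(n=50, steps=4000, target=0.5, t_fast=8, t_slow=12,
             duty=0.5, seed=None):
    """Toy bimodal-oscillator sketch: n units, each with period t_fast or
    t_slow, 'on' during the first `duty` fraction of its cycle. On finishing
    a cycle, a unit switches to the fast mode if the momentary mean
    intensity is below `target`, otherwise to the slow mode.
    Returns the time series of mean intensity (fraction of units on)."""
    rng = random.Random(seed)
    period = [rng.choice((t_fast, t_slow)) for _ in range(n)]
    phase = [rng.randrange(p) for p in period]
    intensities = []
    for _ in range(steps):
        # a unit is "on" during the first `duty` fraction of its cycle
        on = sum(1 for i in range(n) if phase[i] < duty * period[i])
        intensities.append(on / n)
        for i in range(n):
            phase[i] += 1
            if phase[i] >= period[i]:        # cycle finished: choose next mode
                period[i] = t_fast if on / n < target else t_slow
                phase[i] = 0
    return intensities
```

In such simulations, collective oscillations of the total intensity can emerge even though each unit only tries to stabilize the intensity; whether and how strongly they appear depends on the parameters, which is exactly the kind of phase-space question studied in this work.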

Curriculum Vitae

GPG Public Key