
We define a novel metric, mutual information in frequency (MI-in-frequency), to detect and quantify the statistical dependence between different frequency components in the data and apply it to electrophysiological recordings from the brain to infer ...
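The kind of cross-frequency dependence that MI-in-frequency targets can be illustrated with a small sketch. The sketch below is not the article's estimator: it builds a synthetic signal whose 10 Hz and 80 Hz components share a common slow amplitude modulation, then estimates the mutual information between the STFT amplitude envelopes of those two bands with a plain histogram (plug-in) estimate. The signal, frequencies, bin count, and estimator are all illustrative assumptions.

```python
import numpy as np
from scipy.signal import stft

def histogram_mi(x, y, bins=16):
    """Plug-in mutual-information estimate (nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
fs = 1000.0
t = np.arange(0, 20, 1 / fs)
# Shared slow envelope modulates both rhythms: amplitude-amplitude
# cross-frequency coupling between the 10 Hz and 80 Hz components.
env = 1 + 0.8 * np.sin(2 * np.pi * 0.5 * t)
signal = env * (np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 80 * t))
signal += 0.1 * rng.standard_normal(t.size)

f, _, Z = stft(signal, fs=fs, nperseg=256)
a10 = np.abs(Z[np.argmin(np.abs(f - 10))])  # amplitude envelope near 10 Hz
a80 = np.abs(Z[np.argmin(np.abs(f - 80))])  # amplitude envelope near 80 Hz
print(histogram_mi(a10, a80))  # substantially above zero for coupled bands
```

A shuffled-surrogate comparison (permuting one envelope and re-estimating) is the usual way to check that the estimated dependence exceeds the small-sample bias of the histogram estimator.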

A fundamental problem in many science and engineering disciplines is inferring the characteristics of a physical or biological system from the dependencies in data recorded from the system. The dependencies in data, particularly in the case of signals re...

We consider the problem of estimating mutual information between dependent data, an important problem in many science and engineering applications. We propose a data-driven, non-parametric estimator of mutual information in this paper. The main novel...
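For context on what a data-driven, non-parametric MI estimator looks like, here is a sketch of the classic Kraskov–Stögbauer–Grassberger (KSG) k-nearest-neighbour estimator. This is a standard baseline from the literature, not the estimator proposed in this article; the choice of k and the Chebyshev metric follow the usual KSG "algorithm 1" conventions.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=4):
    """KSG (algorithm 1) kNN estimate of I(X; Y) in nats."""
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    # Chebyshev distance to the k-th nearest neighbour in the joint space
    # (k+1 because the query point is its own nearest neighbour).
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    tx, ty = cKDTree(x), cKDTree(y)
    # Count marginal neighbours strictly within eps of each point.
    shrink = 1 - 1e-10
    nx = np.array([len(tx.query_ball_point(xi, e * shrink, p=np.inf)) - 1
                   for xi, e in zip(x, eps)])
    ny = np.array([len(ty.query_ball_point(yi, e * shrink, p=np.inf)) - 1
                   for yi, e in zip(y, eps)])
    mi = digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
    return max(mi, 0.0)

rng = np.random.default_rng(0)
u = rng.standard_normal(2000)
v = 0.8 * u + 0.6 * rng.standard_normal(2000)
# Closed form for bivariate Gaussians: -0.5 * ln(1 - rho^2) with rho = 0.8.
print(ksg_mi(u, v))
```

Because it uses nearest-neighbour statistics rather than a fitted density, the estimator adapts to the local scale of the data, which is why kNN-based estimators are a common non-parametric reference point.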

In this paper, we develop a model-based and a data-driven estimator for directed information (DI) to infer the causal connectivity graph between electrocorticographic (ECoG) signals recorded from the brain and to identify the seizure onset zone (SOZ) i...
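Directed information captures the asymmetry of influence between two processes. As a hedged illustration of the idea (not the article's model-based or data-driven estimators), the sketch below computes a plug-in estimate of the DI rate for discrete sequences under an order-m Markov assumption, as the drop in conditional entropy of Y_t when past and present X are added to the conditioning: H(Y_t | Y past) - H(Y_t | Y past, X past and present). The toy data, memory order, and counting scheme are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def cond_entropy(pairs):
    """Plug-in H(target | context) in nats from (context, target) samples."""
    joint = Counter(pairs)
    ctx = Counter(c for c, _ in pairs)
    n = len(pairs)
    return sum((cnt / n) * -np.log(cnt / ctx[c])
               for (c, _), cnt in joint.items())

def di_rate(x, y, m=1):
    """Plug-in directed-information rate from x to y (nats per sample),
    assuming order-m Markov dependence:
      H(Y_t | Y_{t-m..t-1}) - H(Y_t | Y_{t-m..t-1}, X_{t-m..t})."""
    past_y = [(tuple(y[t - m:t]), y[t]) for t in range(m, len(y))]
    past_xy = [((tuple(y[t - m:t]), tuple(x[t - m:t + 1])), y[t])
               for t in range(m, len(y))]
    return cond_entropy(past_y) - cond_entropy(past_xy)

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
flips = (rng.random(5000) < 0.1).astype(int)
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1] ^ flips[1:]  # y copies x with a one-step lag and 10% flips
print(di_rate(x, y), di_rate(y, x))  # influence flows x -> y, not y -> x
```

Unlike (symmetric) mutual information, the two directions give different values here, which is what makes DI usable for inferring a directed connectivity graph.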