Zach Rewolinski

PhD Student



Contact


Curriculum vitae


[email protected]


Department of Statistics

UC Berkeley

337 Evans Hall, University of California
Berkeley, CA 94720





A Low-Rank Tensor Completion Approach for Imputing Functional Neuronal Data from Multiple Recordings


Workshop paper


Li Zheng, Zachary T. Rewolinski, Genevera I. Allen
2022 IEEE Data Science and Learning Workshop (DSLW), 2022

Semantic Scholar · DOI

Cite

APA
Zheng, L., Rewolinski, Z. T., & Allen, G. I. (2022). A Low-Rank Tensor Completion Approach for Imputing Functional Neuronal Data from Multiple Recordings. 2022 IEEE Data Science and Learning Workshop (DSLW).


Chicago/Turabian
Zheng, Li, Zachary T. Rewolinski, and Genevera I. Allen. “A Low-Rank Tensor Completion Approach for Imputing Functional Neuronal Data from Multiple Recordings.” 2022 IEEE Data Science and Learning Workshop (DSLW) (2022).


MLA
Zheng, Li, et al. “A Low-Rank Tensor Completion Approach for Imputing Functional Neuronal Data from Multiple Recordings.” 2022 IEEE Data Science and Learning Workshop (DSLW), 2022.


BibTeX

@inproceedings{zheng2022lowrank,
  title = {A Low-Rank Tensor Completion Approach for Imputing Functional Neuronal Data from Multiple Recordings},
  year = {2022},
  booktitle = {2022 IEEE Data Science and Learning Workshop (DSLW)},
  author = {Zheng, Li and Rewolinski, Zachary T. and Allen, Genevera I.}
}

Abstract

New neuroscience technologies have led to massive functional neuronal data sets that hold promise for better understanding large populations of neurons and how the brain works. At the same time, the fact that a large number of neurons in the brain cannot be recorded simultaneously calls for novel statistical and machine learning methodologies. In particular, many neuronal activity data sets consist of multiple recordings of different subsets of neurons while the same external stimulus is presented. In this paper, we focus on imputing this type of data set by modeling it as a noisy low-rank tensor completion problem with block-wise measurements. We propose a novel method consisting of a matrix completion algorithm applied to an unfolding of the tensor, followed by a further refinement based on gradient descent on the squared loss of a low-rank Tucker decomposition. The output of our algorithm can be applied in many downstream data analysis tasks, such as learning functional neuronal connectivity. We provide simulations and real data experiments to validate our method and demonstrate its potential in terms of both low imputation errors and accurate graph estimation when applying it to adjust for the effect of external stimulus on neuronal activities.
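The abstract describes a two-stage method: matrix completion on an unfolding of the tensor, then gradient-descent refinement of the squared loss of a low-rank Tucker model over the observed entries. The refinement stage can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function name `tucker_gd_completion`, the random initialization, the learning rate, and the iteration count are all illustrative assumptions, and the unfolding-based initialization from the first stage is omitted.

```python
import numpy as np

def tucker_gd_completion(T_obs, mask, ranks, lr=0.01, iters=2000, seed=0):
    """Illustrative sketch: impute a partially observed 3-way tensor by
    gradient descent on the squared loss of a low-rank Tucker model,
    fitting only the observed entries (where mask == 1)."""
    rng = np.random.default_rng(seed)
    d = T_obs.shape
    # Tucker parameters: core tensor G and factor matrices U[0], U[1], U[2]
    G = 0.1 * rng.standard_normal(ranks)
    U = [0.1 * rng.standard_normal((d[k], ranks[k])) for k in range(3)]
    for _ in range(iters):
        # Reconstruction: T_hat = G x_1 U[0] x_2 U[1] x_3 U[2]
        T_hat = np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])
        R = mask * (T_hat - T_obs)  # residual on observed entries only
        # Gradients of 0.5 * ||R||_F^2 with respect to each Tucker factor
        gG = np.einsum('ijk,ia,jb,kc->abc', R, U[0], U[1], U[2])
        gU = [
            np.einsum('ijk,abc,jb,kc->ia', R, G, U[1], U[2]),
            np.einsum('ijk,abc,ia,kc->jb', R, G, U[0], U[2]),
            np.einsum('ijk,abc,ia,jb->kc', R, G, U[0], U[1]),
        ]
        G -= lr * gG
        for k in range(3):
            U[k] -= lr * gU[k]
    # Completed tensor: model values fill in the unobserved entries
    return np.einsum('abc,ia,jb,kc->ijk', G, U[0], U[1], U[2])
```

In the block-wise setting the paper studies, `mask` would be 1 exactly on the neuron-by-neuron-by-recording blocks that were actually measured, rather than on entries sampled uniformly at random.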




