Kohei Hayashi
Researcher, Preferred Networks
Estimation of low-rank tensors via convex optimization
R Tomioka, K Hayashi, H Kashima
arXiv preprint arXiv:1010.0789, 2010
Tensor factorization using auxiliary information
A Narita, K Hayashi, R Tomioka, H Kashima
Data Mining and Knowledge Discovery 25, 298-324, 2012
Statistical performance of convex tensor decomposition
R Tomioka, T Suzuki, K Hayashi, H Kashima
Advances in Neural Information Processing Systems (NIPS), 137, 2011
Making tree ensembles interpretable: A Bayesian model selection approach
S Hara, K Hayashi
International conference on artificial intelligence and statistics, 77-85, 2018
Making tree ensembles interpretable
S Hara, K Hayashi
arXiv preprint arXiv:1606.05390, 2016
Exploring unexplored tensor network decompositions for convolutional neural networks
K Hayashi, T Yamaguchi, Y Sugawara, S Maeda
Advances in Neural Information Processing Systems 32, 2019
Cross-temporal link prediction
S Oyama, K Hayashi, H Kashima
2011 IEEE 11th International Conference on Data Mining, 1188-1193, 2011
Expected tensor decomposition with stochastic gradient descent
T Maehara, K Hayashi, K Kawarabayashi
Proceedings of the AAAI Conference on Artificial Intelligence 30 (1), 2016
On the extension of trace norm to tensors
R Tomioka, K Hayashi, H Kashima
NIPS workshop on tensors, kernels, and machine learning 7, 2010
On tensor train rank minimization: Statistical efficiency and scalable algorithm
M Imaizumi, T Maehara, K Hayashi
Advances in Neural Information Processing Systems 30, 2017
Real-time top-r topic detection on Twitter with topic hijack filtering
K Hayashi, T Maehara, M Toyoda, K Kawarabayashi
Proceedings of the 21st ACM SIGKDD International Conference on Knowledge …, 2015
Self-measuring similarity for multi-task Gaussian process
K Hayashi, T Takenouchi, R Tomioka, H Kashima
Proceedings of ICML Workshop on Unsupervised and Transfer Learning, 145-153, 2012
Factorized asymptotic Bayesian hidden Markov models
R Fujimaki, K Hayashi
arXiv preprint arXiv:1206.4679, 2012
Doubly decomposing nonparametric tensor regression
M Imaizumi, K Hayashi
International Conference on Machine Learning, 727-736, 2016
Exponential family tensor factorization for missing-values prediction and anomaly detection
K Hayashi, T Takenouchi, T Shibata, Y Kamiya, D Kato, K Kunieda, ...
2010 IEEE International Conference on Data Mining, 216-225, 2010
When Does Label Propagation Fail? A View from a Network Generative Model
Y Yamaguchi, K Hayashi
IJCAI, 2017
Factorized asymptotic Bayesian inference for latent feature models
K Hayashi, R Fujimaki
Advances in Neural Information Processing Systems 26, 2013
Tensor decomposition with smoothness
M Imaizumi, K Hayashi
International Conference on Machine Learning, 1597-1606, 2017
A tractable fully Bayesian method for the stochastic block model
K Hayashi, T Konishi, T Kawamoto
arXiv preprint arXiv:1602.02256, 2016
On random subsampling of Gaussian process regression: A graphon-based analysis
K Hayashi, M Imaizumi, Y Yoshida
International Conference on Artificial Intelligence and Statistics, 2055-2065, 2020