tensorly + minor
arranger1044 committed Oct 10, 2024
1 parent 1c363c7 commit 5571980
Showing 2 changed files with 4 additions and 3 deletions.
4 changes: 2 additions & 2 deletions cfp.md
@@ -1,9 +1,9 @@
 ---
-title: Call for Papers
+title: call for papers
 nav: true
 ---
 
-# Call for Papers
+# call for papers
 
 ## Important dates
 All times are end of day AOE.
3 changes: 2 additions & 1 deletion index.md
@@ -4,7 +4,7 @@ title: Home
 
 {% include figure.html img="colorai.png" alt="colorai logo" width="60%" %}
 
-Many of the recent advancements in AI are due to exploiting <i><b>structured low-rank representations</b></i>, from low-rank tensor factorizations <cite>(Kolda and Bader 2009)</cite> for scaling large language models (LLMs), e.g., via adapters <cite>(Hu et al. 2021)</cite> or structured matrices
+Many of the recent advancements in AI are due to exploiting <i><b>structured low-rank representations</b></i>, from low-rank tensor factorizations <cite>(Kolda and Bader 2009; Kossaifi et al. 2019)</cite> for scaling large language models (LLMs), e.g., via adapters <cite>(Hu et al. 2021)</cite> or structured matrices
 <cite>(Dao et al. 2022)</cite>; the diffusion of compact polynomial representations as powerful inductive biases for deep learning architectures <cite>(Cheng et al. 2024)</cite>; the emergence of probabilistic circuits to provide tractable probabilistic inference with guarantees <cite>(Loconte et al. 2024; Choi et al. 2020)</cite> and reliable neuro-symbolic AI <cite>(Ahmed et al. 2022)</cite>; and the wide application of tensor networks to solve and accelerate physics-related problems <cite>(Biamonte and Bergholm 2017)</cite> and quantum computing <cite>(Orus 2019)</cite>.
 
 ### *"How are all these representations related to each others? and how can we transfer knowledge across communities?"*
@@ -36,6 +36,7 @@ TBA
 - Hu et al. 2021 - [LoRA: Low-Rank Adaptation of Large Language Models](https://openreview.net/forum?id=nZeVKeeFYf9)
 - Loconte et al. 2024 - [What is the Relationship between Tensor Factorizations and Circuits (and How Can We Exploit it)?](https://arxiv.org/abs/2409.07953v1)
 - Dao et al. 2022 - [Monarch: Expressive Structured Matrices for Efficient and Accurate Training](https://proceedings.mlr.press/v162/dao22a/dao22a.pdf)
+- Kossaifi et al. 2019 - [TensorLy: Tensor Learning in Python](https://www.jmlr.org/papers/v20/18-277.html)
 - Cheng et al. 2024 - [Multilinear Operator Networks](https://openreview.net/forum?id=bbCL5aRjUx)
 - Choi et al. 2020 - [Probabilistic Circuits: A Unifying Framework for Tractable Probabilistic Models](https://yoojungchoi.github.io/files/ProbCirc20.pdf)
 - Ahmed et al. 2022 - [Semantic Probabilistic Layers for Neuro-Symbolic Learning](https://proceedings.neurips.cc/paper_files/paper/2022/hash/c182ec594f38926b7fcb827635b9a8f4-Abstract-Conference.html)
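As a side note on the low-rank adapter idea cited in the diff above (Hu et al. 2021), the following NumPy sketch illustrates why a rank-r update B @ A adds far fewer trainable parameters than a full d x d weight; the sizes d and r are illustrative assumptions, not taken from the source.

```python
import numpy as np

d, r = 1024, 8  # illustrative sizes: hidden width d, adapter rank r
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))         # frozen pretrained weight (not updated)
A = rng.standard_normal((r, d)) * 0.01  # low-rank down-projection (trainable)
B = np.zeros((d, r))                    # up-projection, zero-initialized so the
                                        # adapter starts as a no-op (LoRA-style)

W_eff = W + B @ A  # effective weight applied during fine-tuning

full_params = d * d            # parameters of a full-rank update
lora_params = A.size + B.size  # parameters of the rank-r update
print(full_params, lora_params)  # 1048576 vs 16384
```

With B initialized to zero, W_eff equals W exactly at the start of training, so the adapter only changes behavior as B and A are learned.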
