Commit f363498 (1 parent: 2b2d411)
Showing 9 changed files with 27 additions and 8 deletions.
@@ -0,0 +1,18 @@
@InCollection{desbrun2008discrete,
  author    = {Desbrun, Mathieu and Kanso, Eva and Tong, Yiying},
  booktitle = {Discrete Differential Geometry},
  publisher = {Springer},
  title     = {Discrete differential forms for computational modeling},
  year      = {2008},
  pages     = {287--324},
}

@Article{trask2022enforcing,
  author    = {Trask, Nathaniel and Huang, Andy and Hu, Xiaozhe},
  journal   = {Journal of Computational Physics},
  title     = {Enforcing exact physics in scientific machine learning: a data-driven exterior calculus on graphs},
  year      = {2022},
  publisher = {Elsevier},
}

@Comment{jabref-meta: databaseType:bibtex;}
@@ -2,6 +2,6 @@
  # (APPENDIX) Appendix {-}

- # Related work
+ # Glossary

  To be completed.
This file was deleted.
File renamed without changes.
File renamed without changes.
@@ -0,0 +1,7 @@
# Learning discrete exterior calculus operators with CCANNs | ||
The operator $G_{tr}=G\odot att$ of Equations \@ref(eq:attention1) and \@ref(eq:attention2) has an advantageous cross-cutting interpretation. First, recall that $G_{tr}$ has the same shape as the original operator $G$. More importantly, $G_{tr}$ can be viewed as a learnt version of $G$. For instance, if $G$ is the $k$-Hodge Laplacian $\mathbf{L}_k$, then the learnt attention version $G_{tr}$ represents a $k$-Hodge Laplacian adapted to the domain $\mathcal{X}$ for the learning task at hand. This perspective turns our attention framework into a tool for learning *discrete exterior calculus (DEC) operators* [@desbrun2008discrete]. We refer the interested reader to recent work along these lines [@smirnov2021hodgenet; @trask2022enforcing], in which neural networks are used to learn Laplacian operators for various shape analysis tasks.
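To make this concrete, the following PyTorch sketch builds $G_{tr} = G \odot att$ from a fixed operator $G$ and attention coefficients supported on the non-zero pattern of $G$. The class name `LearntDECOperator` and the single-head attention rule are illustrative stand-ins under simplifying assumptions (dense $G$, one attention channel), not the exact form of Equations \@ref(eq:attention1) and \@ref(eq:attention2):

```python
import torch
import torch.nn as nn

class LearntDECOperator(nn.Module):
    """Illustrative sketch: learn G_tr = G * att for a fixed DEC operator G
    (e.g. a k-Hodge Laplacian L_k). The attention rule below is a simple
    stand-in, not the exact form of eq:attention1/eq:attention2."""

    def __init__(self, G: torch.Tensor, in_channels: int):
        super().__init__()
        self.register_buffer("G", G)          # fixed operator, shape (n, n)
        self.register_buffer("mask", G != 0)  # attend only where G is non-zero
        self.a = nn.Parameter(torch.randn(2 * in_channels))

    def forward(self, H: torch.Tensor) -> torch.Tensor:
        # H: cochain features, shape (n, in_channels)
        n = H.shape[0]
        hi = H.unsqueeze(1).expand(n, n, -1)  # features of cell i
        hj = H.unsqueeze(0).expand(n, n, -1)  # features of cell j
        # pairwise attention logits e_ij = tanh(a^T [h_i ; h_j])
        e = torch.tanh(torch.cat([hi, hj], dim=-1) @ self.a)
        # normalise over each cell's neighbourhood (assumes every row of G
        # has at least one non-zero entry, e.g. a non-zero diagonal)
        e = e.masked_fill(~self.mask, float("-inf"))
        att = torch.softmax(e, dim=-1)
        G_tr = self.G * att                   # learnt version of G, same shape
        return G_tr @ H                       # apply it to the cochain
```

For instance, with $G = \mathbf{L}_0$ the module returns the action of a task-adapted graph Laplacian on the node features.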
Concretely, one of the main building blocks of DEC is a collection of linear operators of the form $\mathcal{A} \colon \mathcal{C}^i(\mathcal{X}) \to \mathcal{C}^j(\mathcal{X})$ that act on a cochain $\mathbf{H}$ to produce another cochain $\mathcal{A}(\mathbf{H})$. An example of such an operator $\mathcal{A}$ is the graph Laplacian. There are seven primitive DEC operators, including the discrete exterior derivative, the Hodge star, and the wedge product; these seven primitives can be combined to form other operators. In our setting, the discrete exterior derivatives are precisely signed versions of the incidence matrices defined in the context of cell/simplicial complexes. We denote the $k$-th signed incidence matrix defined on a cell/simplicial complex by $\mathbf{B}_k$. It is common in discrete exterior calculus [@desbrun2008discrete] to refer to $\mathbf{B}_k^T$ as the $k$-th *discrete exterior derivative* $d^k$. So, from a DEC point of view, the matrices $\mathbf{B}_0^T$, $\mathbf{B}_1^T$, and $\mathbf{B}_2^T$ are regarded as the discrete exterior derivatives $d^0$, $d^1$, and $d^2$; applied to 0-, 1-, and 2-cochains $\mathbf{H}$ defined on $\mathcal{X}$, they yield the discrete analogs of the gradient $\nabla \mathbf{H}$, the curl $\nabla\times \mathbf{H}$, and the divergence $\nabla \cdot \mathbf{H}$ of a smooth function defined on a smooth surface. We refer the reader to [@desbrun2008discrete] for a coherent list of DEC operators and their interpretations. Together, cochains and the operators that act on them provide a concrete framework for computing a cochain of interest, such as one obtained by solving a partial differential equation on a discrete surface.
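The following NumPy sketch makes these objects concrete on a toy domain, a single filled triangle. The $d^k = \mathbf{B}_k^T$ convention follows the text; the complex itself and the cochain values are made-up examples:

```python
import numpy as np

# Toy simplicial complex: one filled triangle with vertices {0, 1, 2}.
# B_0: signed node-to-edge incidence (rows: nodes; columns: edges
# 01, 02, 12, each oriented from lower to higher vertex).
B0 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]])

# B_1: signed edge-to-face incidence for the single 2-cell [0, 1, 2].
B1 = np.array([[ 1],
               [-1],
               [ 1]])

d0 = B0.T  # gradient-like operator: 0-cochains -> 1-cochains
d1 = B1.T  # curl-like operator:     1-cochains -> 2-cochains

# Exactness of the complex: d^1 after d^0 vanishes (curl of a gradient is 0).
assert np.all(d1 @ d0 == 0)

# Composite operators built from the primitives: the 0- and 1-Hodge
# Laplacians (L_0 is the ordinary graph Laplacian).
L0 = d0.T @ d0
L1 = d0 @ d0.T + d1.T @ d1

H = np.array([3.0, 5.0, 8.0])  # a toy 0-cochain (one value per node)
print(d0 @ H)                  # its discrete gradient on the edges: [2, 5, 3]
print(L0 @ H)                  # the graph Laplacian applied to the cochain
```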
Our attention framework can be viewed as a non-linear version of the DEC built from the linear operators $\mathcal{A}$, and it can be used to learn DEC operators on a domain $\mathcal{X}$ for a particular learning task. Specifically, a linear operator $\mathcal{A}$, as it appears in classical DEC, is a special case of Equations \@ref(eq:attention1) and \@ref(eq:attention2). Unlike existing work [@smirnov2021hodgenet; @trask2022enforcing], our DEC learning approach based on CCANNs generalizes and applies to all domains in which DEC is typically applicable, such as triangular and polygonal meshes [@crane2013digital]. In contrast, existing operator learning methods are defined only for particular types of DEC operators, and therefore cannot be used to learn arbitrary DEC operators.
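As a minimal illustration of the special-case claim, the toy snippet below (with hypothetical stand-in tensors, an all-ones attention matrix, and an identity non-linearity assumed for simplicity) shows the learnt operator collapsing back to a fixed linear operator $\mathcal{A}$:

```python
import torch

n, c = 4, 2
A = torch.randn(n, n)  # a stand-in linear DEC operator A
H = torch.randn(n, c)  # a cochain with c feature channels

# Degenerate attention: identically 1 on the support of A, with an
# identity non-linearity. The "learnt" operator then collapses to A.
att = torch.ones_like(A)
G_tr = A * att
assert torch.allclose(G_tr @ H, A @ H)  # identical action on cochains
```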
File renamed without changes.
This file was deleted.