Commit f363498
Added appendix D
papamarkou committed Aug 5, 2024
1 parent 2b2d411 commit f363498
Showing 9 changed files with 27 additions and 8 deletions.
18 changes: 18 additions & 0 deletions bib/appendix.bib
@@ -0,0 +1,18 @@
@InCollection{desbrun2008discrete,
author = {Desbrun, Mathieu and Kanso, Eva and Tong, Yiying},
booktitle = {Discrete Differential Geometry},
publisher = {Springer},
title = {Discrete differential forms for computational modeling},
year = {2008},
pages = {287--324},
}

@Article{trask2022enforcing,
author = {Trask, Nathaniel and Huang, Andy and Hu, Xiaozhe},
journal = {Journal of Computational Physics},
title = {Enforcing exact physics in scientific machine learning: a data-driven exterior calculus on graphs},
year = {2022},
publisher = {Elsevier},
}

@Comment{jabref-meta: databaseType:bibtex;}
2 changes: 1 addition & 1 deletion index.rmd
@@ -3,7 +3,7 @@ title: "Topological Deep Learning: Going Beyond Graph Data"
author: "Mustafa Hajij, Ghada Zamzmi, Theodore Papamarkou, Nina Miolane, Aldo Guzmán-Sáenz, Karthikeyan Natesan Ramamurthy, Tolga Birdal, Tamal K. Dey, Soham Mukherjee, Shreyas N. Samaga, Neal Livesay, Robin Walters, Paul Rosen, Michael T. Schaub"
date: "`r Sys.Date()`"
documentclass: krantz
bibliography: [bib/main.bib, bib/related-work.bib]
bibliography: [bib/main.bib, bib/related-work.bib, bib/appendix.bib]
biblio-style: apalike
fontsize: 11pt
link-citations: yes
2 changes: 1 addition & 1 deletion rmd/80-related-work.rmd → rmd/80-glossary.rmd
@@ -2,6 +2,6 @@

# (APPENDIX) Appendix {-}

# Related work
# Glossary

To be completed.
3 changes: 0 additions & 3 deletions rmd/81-glossary.rmd

This file was deleted.

File renamed without changes.
File renamed without changes.
7 changes: 7 additions & 0 deletions rmd/83-learning-dec-operators-with-ccanns.rmd
@@ -0,0 +1,7 @@
# Learning discrete exterior calculus operators with CCANNs

The operator $G_{tr}=G\odot att$ of Equations \@ref(eq:attention1) and \@ref(eq:attention2) has an advantageous cross-cutting interpretation. First, recall that $G_{tr}$ has the same shape as the original operator $G$. More importantly, $G_{tr}$ can be viewed as a learnt version of $G$. For instance, if $G$ is the $k$-Hodge Laplacian $\mathbf{L}_k$, then its learnt attention version $G_{tr}$ represents a $k$-Hodge Laplacian adapted to the domain $\mathcal{X}$ for the learning task at hand. This perspective turns our attention framework into a tool for learning *discrete exterior calculus (DEC) operators* [@desbrun2008discrete]. We refer the interested reader to recent works along these lines [@smirnov2021hodgenet; @trask2022enforcing], where neural networks are used to learn Laplacian operators for various shape analysis tasks.
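To make this concrete, the following is a minimal NumPy sketch (an illustration with placeholder attention scores, not this book's implementation) of forming $G_{tr} = G \odot att$ when $G$ is the 0-Hodge Laplacian of a small path graph. The learnt operator inherits the shape and sparsity pattern of $G$.

```python
import numpy as np

rng = np.random.default_rng(0)

# G: 0-Hodge Laplacian (graph Laplacian) of the path graph 0 - 1 - 2.
# It plays the role of the fixed operator in G_tr = G ⊙ att.
G = np.array([[ 1., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])

# Attention scores; in a CCANN these would come from a trainable
# attention mechanism, here they are random placeholders.
scores = rng.normal(size=G.shape)

# Restrict attention to the support of G (the neighborhood structure),
# then normalize row-wise with a masked softmax.
mask = G != 0
att = np.where(mask, np.exp(scores), 0.0)
att = att / att.sum(axis=1, keepdims=True)

# The learnt operator: same shape and same sparsity pattern as G.
G_tr = G * att

assert G_tr.shape == G.shape
assert np.array_equal(G_tr != 0, G != 0)
```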

Concretely, one of the main building blocks of DEC is a collection of linear operators of the form $\mathcal{A} \colon \mathcal{C}^i(\mathcal{X}) \to \mathcal{C}^j(\mathcal{X})$ that act on a cochain $\mathbf{H}$ to produce another cochain $\mathcal{A}(\mathbf{H})$. An example of such an operator $\mathcal{A}$ is the graph Laplacian. There are seven primitive DEC operators, including the discrete exterior derivative, the Hodge star, and the wedge product. These seven primitive operators can be combined to form other operators. In our setting, the discrete exterior derivatives are precisely a signed version of the incidence matrices defined in the context of cell/simplicial complexes. We denote the $k$-signed incidence matrix defined on a cell/simplicial complex by $\mathbf{B}_k$. It is common in the context of discrete exterior calculus [@desbrun2008discrete] to refer to $\mathbf{B}_k^T$ as the $k$-th *discrete exterior derivative* $d^k$. So, from a DEC point of view, the matrices $\mathbf{B}_0^T$, $\mathbf{B}_1^T$, and $\mathbf{B}_2^T$ are regarded as the discrete exterior derivatives $d^0$, $d^1$, and $d^2$; applied to a 0-, 1-, or 2-cochain $\mathbf{H}$ defined on $\mathcal{X}$, they yield $d^0(\mathbf{H})$, $d^1(\mathbf{H})$, and $d^2(\mathbf{H})$, the discrete analogs of the gradient $\nabla \mathbf{H}$, the curl $\nabla\times \mathbf{H}$, and the divergence $\nabla \cdot \mathbf{H}$ of a smooth function defined on a smooth surface. We refer the reader to [@desbrun2008discrete] for a coherent list of DEC operators and their interpretations. Together, cochains and the operators that act on them provide a concrete framework that facilitates computing a cochain of interest, such as a cochain obtained by solving a partial differential equation on a discrete surface.
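As a small worked example, assuming the convention above that $\mathbf{B}_k$ is the signed incidence matrix between $k$-cells and $(k+1)$-cells (so that $\mathbf{B}_k^T = d^k$), the sketch below builds the incidence matrices of a single filled triangle and checks two DEC facts: $d^1 \circ d^0 = 0$, and $\mathbf{L}_0 = \mathbf{B}_0 \mathbf{B}_0^T$ recovers the graph Laplacian.

```python
import numpy as np

# Single filled triangle on vertices {0, 1, 2}:
# edges (0,1), (0,2), (1,2); one 2-cell (0,1,2).
# B_k is the signed incidence matrix from k-cells to (k+1)-cells,
# so B_k^T is the k-th discrete exterior derivative d^k.

# Nodes x edges: column for edge (i, j) has -1 at row i, +1 at row j.
B0 = np.array([[-1., -1.,  0.],
               [ 1.,  0., -1.],
               [ 0.,  1.,  1.]])

# Edges x triangles: boundary of (0,1,2) = (1,2) - (0,2) + (0,1).
B1 = np.array([[ 1.],
               [-1.],
               [ 1.]])

d0 = B0.T  # discrete gradient: 0-cochains -> 1-cochains
d1 = B1.T  # discrete curl:     1-cochains -> 2-cochains

# Fundamental property of exterior derivatives: d^1 ∘ d^0 = 0.
assert np.allclose(d1 @ d0, 0.0)

# The 0-Hodge Laplacian L_0 = B_0 B_0^T is the graph Laplacian.
L0 = B0 @ B0.T
assert np.allclose(L0, np.array([[ 2., -1., -1.],
                                 [-1.,  2., -1.],
                                 [-1., -1.,  2.]]))
```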

Our attention framework can be viewed as a non-linear version of the DEC based on linear operators $\mathcal{A}$, and can be used to learn the DEC operators on a domain $\mathcal{X}$ for a particular learning task. Specifically, a linear operator $\mathcal{A}$, as it appears in classical DEC, can be considered a special case of Equations \@ref(eq:attention1) and \@ref(eq:attention2). Unlike existing work [@smirnov2021hodgenet; @trask2022enforcing], our DEC learning approach based on CCANNs generalizes to all domains in which DEC is typically applicable; examples of such domains include triangular and polygonal meshes [@crane2013digital]. In contrast, existing operator learning methods are defined only for particular types of DEC operators, and therefore cannot be used to learn arbitrary DEC operators.
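A hypothetical end-to-end sketch of operator learning in this spirit: gradient descent on per-entry attention coefficients supported on the sparsity pattern of a fixed operator $G$, so that the learnt $G_{tr} = G \odot att$ fits cochain input-output pairs. All names, the hidden target operator, and the quadratic loss are illustrative assumptions, not the CCANN training procedure itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed operator G: 0-Hodge Laplacian of the path graph 0 - 1 - 2.
G = np.array([[ 1., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])
mask = (G != 0).astype(float)

# Hidden target: an entrywise reweighting of G on its support, standing
# in for a task-adapted DEC operator that we wish to recover.
target = G * np.array([[1.0, 0.5, 0.0],
                       [0.5, 1.0, 2.0],
                       [0.0, 2.0, 1.0]])

X = rng.normal(size=(3, 64))  # random 0-cochains, one per column
Y = target @ X                # their images under the target operator

att = np.ones_like(G)         # learnable attention coefficients
lr = 0.002
for _ in range(5000):
    G_tr = G * att * mask             # learnt operator, support of G
    resid = G_tr @ X - Y
    # Gradient of the loss 0.5 * ||G_tr @ X - Y||^2 w.r.t. att.
    grad = (resid @ X.T) * G * mask
    att -= lr * grad

# The learnt operator matches the hidden target on the support of G.
assert np.allclose(G * att * mask, target, atol=1e-3)
```

Because the loss is quadratic in the masked attention coefficients, plain gradient descent suffices here; in an actual CCANN the attention weights are produced by a neural mechanism and trained jointly with the rest of the network.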
3 changes: 0 additions & 3 deletions rmd/84-learning-dec-operators-with-ccanns.rmd

This file was deleted.
