Rashevsky (1955) defined an entropy measure to characterize the structural information content of graphs. His definition is based on partitioning the $n$ vertices into $k$ classes of equivalent vertices according to their degrees. A probability is then assigned to each class: the number of vertices in that class divided by the total number of vertices. Thus, according to Rashevsky, the entropy of a graph $X$ is:

$$I(X) = -\sum_{i=1}^{k} \frac{n_i}{n} \log_2 \frac{n_i}{n}$$

where $n_i$ is the number of vertices in the $i$-th class.
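As a minimal sketch of this computation (the graph encoding and function name are ours), the degree-based vertex partition and its Shannon entropy can be computed directly:

```python
import math
from collections import Counter

def rashevsky_entropy(adjacency):
    """Rashevsky entropy (in bits) from the degree-based vertex partition.

    `adjacency` maps each vertex to the set of its neighbours; vertices
    are grouped into classes by degree, and the Shannon entropy of the
    class-size distribution is returned.
    """
    n = len(adjacency)
    # Partition the vertices into equivalence classes by degree.
    class_sizes = Counter(len(neigh) for neigh in adjacency.values())
    return -sum((ni / n) * math.log2(ni / n) for ni in class_sizes.values())

# A path on 4 vertices has two degree classes (the two end points and
# the two inner vertices), each with probability 1/2, so 1 bit.
path4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(rashevsky_entropy(path4))  # 1.0
```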
Trucco (1956) applied this concept to the edge automorphism group. An edge automorphism of a graph $X$ is a permutation of its edges that preserves edge adjacency (two edges are adjacent when they share a vertex). The more elements there are in the automorphism group of $X$, the more symmetric the graph is and the fewer equivalence classes its edges fall into. According to Trucco, the entropy measure is:

$$I_e(X) = -\sum_{i} \frac{m_i}{m} \log_2 \frac{m_i}{m}$$

where $m$ is the number of edges of $X$ and $m_i$ is the number of edges in the $i$-th equivalence class (orbit) under the edge automorphism group.
To perform this calculation we therefore need to identify the possible permutations of the graphs said to be "congruent" (edge isomorphic).
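A brute-force sketch for small graphs (names and encoding are ours): enumerate all vertex permutations, keep those that map edges to edges, and take the induced orbits of the edges as the congruence classes. This has factorial cost, so it only illustrates the definition:

```python
import math
from itertools import permutations

def trucco_edge_entropy(n, edges):
    """Trucco entropy (in bits) from edge orbits of a graph on 0..n-1.

    Enumerates all vertex permutations and keeps the automorphisms,
    i.e., those mapping every edge onto an edge; the edge orbits under
    these permutations are the equivalence classes.
    """
    edge_set = {frozenset(e) for e in edges}
    autos = [p for p in permutations(range(n))
             if all(frozenset({p[u], p[v]}) in edge_set for u, v in edges)]
    orbits, seen = [], set()
    for e in edge_set:
        if e in seen:
            continue
        u, v = tuple(e)
        # Orbit of e: every edge some automorphism maps e onto.
        orbit = {frozenset({p[u], p[v]}) for p in autos}
        orbits.append(orbit)
        seen |= orbit
    m = len(edge_set)
    return -sum(len(o) / m * math.log2(len(o) / m) for o in orbits)

# A path on 4 vertices has two edge orbits: the two outer edges and
# the middle edge, giving -(2/3)log2(2/3) - (1/3)log2(1/3) ~ 0.918.
print(trucco_edge_entropy(4, [(0, 1), (1, 2), (2, 3)]))
```

Note that this sketch uses vertex automorphisms acting on edges; Trucco's definition is stated for the edge automorphism group, which coincides with the induced group for all but a few small exceptional graphs.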
Example (graphs taken from Rashevsky's article)

Let L be the graph illustrated in figure 1-l above, and consider its automorphism group.
Note that graphs b, d, g, i, and k shown in figure 1 all have the same entropy. For example, the Rashevsky entropy of graph b is:
Since the entropies of Rashevsky and Trucco are based entirely on graph invariants (the number of vertices, edges, connections, and so on), their main limitation is that two structurally different graphs can have the same information content. For example, any two regular graphs, say a cycle and a complete graph on the same vertices, each have a single degree class and hence zero Rashevsky entropy. Such measures are said to be degenerate.
To overcome this problem, graph entropy measures based on metric properties were developed.
Bonchev and Trinajstic developed entropy measures that take into account several structural graph features, such as distances and vertex degrees.
Given a distance matrix $D$ of a graph $X$: the diagonal of the matrix is composed of zeros, and the matrix is symmetric, meaning that the upper-triangular content is the same as the lower-triangular one. Bonchev used the shortest path between every pair of connected nodes (the distance $d_{uv}$). Let $k_i$ be the number of distances of value $i$ in the upper triangle of $D$.
The information content of a graph can be defined as:

(1) the total information content of a given system, which according to Bonchev is:

$$I = N \log_2 N - \sum_{i} k_i \log_2 k_i$$

or

(2) the mean information content of one element of the system, which according to Bonchev is then:

$$\bar{I} = \frac{I}{N} = -\sum_{i} \frac{k_i}{N} \log_2 \frac{k_i}{N}$$

where $N = n(n-1)/2$ is the total number of distances in the upper triangle and $k_i$ is the number of distances of value $i$.
Let us consider the example of three graphs with alternative branchings, as depicted in the figure, together with their distance matrices.
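As a sketch of the computation on an arbitrary small graph (not necessarily one of the graphs in the figure), the distance distribution and both information measures can be obtained by breadth-first search:

```python
import math
from collections import Counter, deque

def distance_information(adjacency):
    """Bonchev-style information content of a connected graph.

    BFS from every vertex yields the shortest-path distance matrix;
    the N = n(n-1)/2 upper-triangular distances are partitioned by
    value, and the total (I) and mean (I / N) information in bits
    are returned.
    """
    nodes = list(adjacency)
    dist_counts = Counter()
    for i, s in enumerate(nodes):
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adjacency[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        for t in nodes[i + 1:]:  # upper triangle only
            dist_counts[dist[t]] += 1
    N = sum(dist_counts.values())
    total = N * math.log2(N) - sum(k * math.log2(k) for k in dist_counts.values())
    return total, total / N

# Star on 4 vertices: three distances of value 1 and three of value 2,
# so I = 6*log2(6) - 2*(3*log2(3)) = 6 bits and the mean is 1 bit.
star4 = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
print(distance_information(star4))  # (6.0, 1.0)
```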
Although partition-based measures of graph entropy, i.e., those built on the set of automorphisms or on the minimum distances between nodes, seem useful, they are not computationally efficient. For larger graphs this becomes an issue: there is no known efficient algorithm for checking whether two given graphs are isomorphic, and computing the distances between all pairs of nodes can be an expensive task.
To overcome those issues, Ye et al. (2014) extended the calculation of the von Neumann entropy from undirected graphs to directed graphs. Moreover, they simplified the equations so that they are expressed in terms of the nodes' in-degrees and out-degrees. According to Ye et al., for a simple strongly connected digraph (omitting the correction term for reciprocal edge pairs), the von Neumann entropy is approximately:

$$H_{VN}^{D} = 1 - \frac{1}{|V|} - \frac{1}{2|V|^{2}} \left\{ \sum_{(u,v) \in E} \frac{d_{u}^{\mathrm{in}}}{d_{v}^{\mathrm{in}} \left(d_{u}^{\mathrm{out}}\right)^{2}} \right\}$$
As we can see, the above equation contains two terms. The first is related to the graph size, while the second depends on the degree statistics of the graph. Exploring which topologies give the maximum and minimum entropies, it is clear from the equation that the entropy is minimal when the term in the curly brackets takes on its largest value.
On the other hand, when the term in the curly brackets takes on its smallest value, the entropy is maximal. This occurs for star graphs.
This makes sense to us because, when considering bubbles in de Bruijn graphs, the von Neumann entropy increases relative to a linear chain. We still have to check whether de Bruijn graphs are irreducible and aperiodic, since those are the conditions imposed in the paper for the von Neumann entropy approximation to hold.
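As a small sketch of this comparison, the degree-based approximation above can be evaluated directly from an edge list (the function name and the toy graphs are ours; a directed cycle stands in for a linear chain, closed so that the digraph stays strongly connected):

```python
from collections import Counter

def directed_vn_entropy(edges):
    """Approximate von Neumann entropy of a simple digraph.

    Implements the assumed degree-based approximation above:
    1 - 1/n - (1 / (2 n^2)) * sum over edges (u, v) of
    d_in(u) / (d_in(v) * d_out(u)^2).
    Nodes are taken from the edge list, so isolated nodes are ignored.
    """
    d_in, d_out = Counter(), Counter()
    for u, v in edges:
        d_out[u] += 1
        d_in[v] += 1
    n = len({x for e in edges for x in e})
    # For every edge (u, v), d_in(v) >= 1 and d_out(u) >= 1, so there
    # is no division by zero; source nodes contribute 0 via d_in(u).
    degree_term = sum(d_in[u] / (d_in[v] * d_out[u] ** 2) for u, v in edges)
    return 1 - 1 / n - degree_term / (2 * n ** 2)

# A 4-node directed cycle versus a "bubble" (two parallel paths from
# node 0 to node 3, closed by the edge 3 -> 0).
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
bubble = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 0)]
print(directed_vn_entropy(cycle))   # 0.625
print(directed_vn_entropy(bubble))  # 0.640625 -- the bubble is higher
```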
We are primarily interested in evaluating the entropy of graph structures without taking the graph size into account, so the authors proposed a normalization with respect to the graph size, removing the size dependence:
Note that the normalizing function