21

A claw is a $K_{1,3}$. A trivial algorithm will detect a claw in $O(n^4)$ time. It can be done in $O(n^{\omega+1})$ time, where $\omega$ is the exponent of fast matrix multiplication, as follows: for each vertex $v$, take the subgraph induced by $N[v]$ and find a triangle in its complement.
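The per-vertex approach can be sketched as follows (a minimal illustration, not a fast implementation: plain numpy integer products stand in for fast matrix multiplication, and the function name `has_claw` is mine):

```python
import numpy as np

def has_claw(adj):
    """Detect an induced claw: for each vertex v, look for a triangle in
    the complement of the subgraph induced by v's neighborhood -- three
    pairwise non-adjacent neighbors of v are the leaves of a claw."""
    n = adj.shape[0]
    for v in range(n):
        nbrs = np.flatnonzero(adj[v])          # N(v)
        if len(nbrs) < 3:
            continue
        sub = adj[np.ix_(nbrs, nbrs)]          # subgraph induced by N(v)
        comp = ~sub                            # complement among the neighbors
        np.fill_diagonal(comp, False)
        c = comp.astype(np.int64)
        # trace of A^3 is positive iff the graph contains a triangle
        if np.trace(c @ c @ c) > 0:
            return True
    return False
```

With fast matrix multiplication for each of the $n$ products this is the $O(n^{\omega+1})$ bound mentioned above.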

As far as I know, only these basic algorithms are known. Spinrad listed the detection of a claw in $o(n^{\omega+1})$ time as an open problem in his book "Efficient Graph Representations" (Problem 8.3, page 103). For the lower bound, we know that an $O(n^c)$-time algorithm would imply an $O(n^{\max(c,2)})$-time algorithm for finding a triangle, so we may consider $\Omega(n^\omega)$ a lower bound.
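The reduction behind the lower bound is simple enough to sketch: $G$ has a triangle if and only if $H$ has a claw, where $H$ is the complement of $G$ with one new vertex adjacent to everything (the triangle's vertices, pairwise non-adjacent in the complement, become the leaves). A sketch, assuming numpy boolean adjacency matrices (the function name is mine):

```python
import numpy as np

def triangle_to_claw_instance(adj):
    """Build H = complement(G) plus a universal apex vertex.
    Any claw in H has three pairwise non-adjacent leaves in the
    complement, i.e. a triangle in G; conversely a triangle of G
    forms a claw centered at the apex."""
    n = adj.shape[0]
    H = np.zeros((n + 1, n + 1), dtype=bool)
    H[:n, :n] = ~adj                   # complement of G
    np.fill_diagonal(H, False)         # no self-loops
    H[n, :n] = H[:n, n] = True         # apex adjacent to every original vertex
    return H
```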

Question:

  1. Is there any progress on this, or any progress on showing that it's impossible?
  2. Are there other natural problems for which the best known algorithms run in $O(n^{\omega+1})$ time?

Remark:

  1. I'm explicitly asking for the detection of a claw, instead of the recognition of claw-free graphs. Although an algorithm usually solves both, there are a few exceptions.
  2. It's claimed in Handbook of Algorithms and Theoretical Computer Science that it can be found in linear time, but that was only a typo (see "Efficient Graph Representations").
domotorp
Yixin Cao
  • You can use Alon et al.'s method of finding a triangle in $O(|E|^{1.41})$ time for each node, which ends up at $O(|V|\cdot |E|^{1.41})$; this is better than $|V|^{\omega+1}$ if the graph is not too dense. – R B Apr 30 '14 at 15:29
  • @RB, thank you for pointing this out. The basic idea is still to run some triangle-detection algorithm $n$ times, which is what I want to avoid. – Yixin Cao Apr 30 '14 at 15:43
  • How can we expect to find an algorithm that is not related to triangle finding? Whatever the algorithm is, it has to check the neighbors of each vertex; the property is a very local one, so up to a constant factor every vertex must be examined (or at least I cannot imagine any natural algorithm that avoids this locality). Do you have any idea, even a vague one? – Saeed Apr 30 '14 at 16:39
  • @Saeed what I want to avoid is not the triangle-finding algorithm, but calling it $n$ times. For example, would $\sqrt{n}$ or $\log{n}$ calls suffice? Or (as I have to guess) it's inevitable, which would mean it's the lower bound. – Yixin Cao Apr 30 '14 at 17:21
  • If you don't avoid triangle detection, then it's obvious that you cannot do better than a linear search, isn't it? – Saeed Apr 30 '14 at 17:24
  • Can you briefly explain why? For example, is it possible to find all independent sets of size $3$, store them in $O(n^3)$ space, and then look for a center? I'm not saying this works; what I want to say is that I cannot see the obvious necessity of running it $n$ times. – Yixin Cao Apr 30 '14 at 17:27
  • The way you're thinking is still avoiding triangle detection, which is nice, but I think it's not possible. I'll think about whether we can prove the two are equivalent (except for that linear search); if I can, I'll write it up as an answer. But in the case that we do use triangle detection, I think it is easy to give an adversary argument showing that a linear search is needed. – Saeed Apr 30 '14 at 17:34
  • 2
    Maybe it is good to mention that if we can find a claw in $f(n)$ time, we can also find a triangle in $f(n+1)$ time (just take the complement of the graph and add one more vertex adjacent to every other vertex), so we should not hope for anything better than $n^\omega$. – domotorp Apr 30 '14 at 18:11
  • 1
    @domotorp, that direction seems clear; the other direction is not, i.e., why we need a linear search. As Yixin also pointed out, maybe there is an algorithm that does not use triangle finding and solves the problem in $o(n^{\omega +1})$. I think such an algorithm is hard to find, and it is probably easier to show that any algorithm uses triangle finding as a subroutine (or can be converted to one that does) with a linear search on top of it. – Saeed Apr 30 '14 at 19:01

2 Answers

16

I think that we can do slightly better than $O(n^{1+\omega})$ for dense graphs by using rectangular matrix multiplication. A similar idea was used by Eisenbrand and Grandoni ("On the complexity of fixed parameter clique and dominating set", Theoretical Computer Science 326 (2004), 57–67) for 4-clique detection.

Let $G=(V,E)$ be the graph in which we want to detect the existence of a claw. Let $A$ be the set of (ordered) pairs of vertices in $V$. Consider the following Boolean matrix $M$ of size $|V|\times |A|$: each row is indexed by some $u\in V$, each column is indexed by some $(v,w)\in A$, and the corresponding entry is one if and only if $\{u,v\}\in E$, $\{v,w\}\in E$ and $\{u,w\}\notin E$. Consider the Boolean matrix product of $M$ and its transpose $M^T$. The graph $G$ has a claw if and only if there exists $\{u,v\}\notin E$ such that the entry of $MM^T$ in the row indexed by $u$ and the column indexed by $v$ is one.

The complexity of this algorithm is essentially the complexity of computing the Boolean product of an $n\times n^2$ matrix by an $n^2\times n$ matrix, where $n$ denotes the number of vertices in the graph. It is known that such a matrix product can be computed in $O(n^{3.3})$ time, which is better than $O(n^{1+\omega})$ for the best known upper bound on $\omega$. Of course, if $\omega=2$ (as conjectured) then the two approaches give the same complexity, $O(n^3)$.
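As a sanity check, the construction of $M$ and the claw test via $MM^T$ can be sketched in Python (plain numpy integer products stand in for fast rectangular Boolean multiplication; the function name `has_claw_rect` is mine):

```python
import numpy as np

def has_claw_rect(adj):
    """Rows of M are vertices u; columns are ordered pairs (v, w) of
    distinct vertices; an entry is 1 iff u~v, v~w and u is not adjacent
    to w (u, v, w distinct).  G has a claw iff some non-adjacent pair
    (u, u') has a nonzero entry in M @ M.T: then v is the center and
    u, u', w are three pairwise non-adjacent leaves."""
    n = adj.shape[0]
    pairs = [(v, w) for v in range(n) for w in range(n) if v != w]
    M = np.zeros((n, len(pairs)), dtype=np.int64)
    for j, (v, w) in enumerate(pairs):
        for u in range(n):
            if u != v and u != w and adj[u, v] and adj[v, w] and not adj[u, w]:
                M[u, j] = 1
    P = M @ M.T                # the single (n x n^2)(n^2 x n) product
    for u in range(n):
        for up in range(u + 1, n):
            if not adj[u, up] and P[u, up] > 0:
                return True
    return False
```

Note the distinctness condition $u\notin\{v,w\}$ on each entry; without it, a common neighbor of a non-adjacent pair would already light up the product.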

Yixin Cao
  • Great! This is exactly what I want for my first question: only one call of matrix multiplication, though a bigger one. I'll wait for more comments or answers on my second question before taking it as an answer. – Yixin Cao May 01 '14 at 13:33
15

I don't know how to avoid doing $n$ matrix multiplications, but you can analyze them in such a way that the total time is effectively that of a smaller number of them. This trick is from

Kloks, Ton; Kratsch, Dieter; Müller, Haiko (2000), "Finding and counting small induced subgraphs efficiently", Information Processing Letters 74 (3–4): 115–121, doi:10.1016/S0020-0190(00)00047-8, MR 1761552.

The first observation is that, when you're doing the matrix multiplications, the matrices are not really $n\times n$ but $d\times d$, where $d$ is the degree of the vertex, because what you're looking for is a co-triangle in the neighborhood of each vertex.

Second, in a claw-free graph, every vertex must have $O(\sqrt m)$ neighbors; otherwise, the set of neighbors would have too few edges to avoid having a triangle in the complement. So when you're doing the matrix multiplications, you only need matrices of size $O(\sqrt m)\times O(\sqrt m)$ rather than $n\times n$.

In addition, each edge can contribute to the size of the matrix multiplication problem for only two vertices, its endpoints. The worst case happens when the total of $2m$ for the sizes of these matrix multiplication problems is spread over $O(\sqrt m)$ subproblems of size $O(\sqrt m)$ each, which gives a total time bound of $O(m^{(1+\omega)/2})$, an improvement for sparse graphs over the $O(nm^{\omega/2})$ bound mentioned by R B.
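The analysis above can be sketched as follows, assuming numpy; the function name and the exact integer cutoff $2\lfloor\sqrt m\rfloor + 2$ are mine (any degree above it already forces the neighborhood's complement past Mantel's $d^2/4$ edge bound, hence a triangle and a claw):

```python
import math
import numpy as np

def has_claw_sparse(adj):
    """Per-vertex co-triangle search on d x d matrices only.  If
    deg(v) = d > 2*floor(sqrt(m)) + 2, the complement of G[N(v)] has
    at least C(d,2) - m > d^2/4 edges, so by Mantel's theorem it
    contains a triangle and a claw certainly exists."""
    n = adj.shape[0]
    m = int(adj.sum()) // 2
    for v in range(n):
        nbrs = np.flatnonzero(adj[v])
        d = len(nbrs)
        if d < 3:
            continue
        if d > 2 * math.isqrt(m) + 2:
            return True                # claw guaranteed by the degree bound
        comp = ~adj[np.ix_(nbrs, nbrs)]
        np.fill_diagonal(comp, False)
        c = comp.astype(np.int64)
        if np.trace(c @ c @ c) > 0:    # triangle among pairwise non-neighbors
            return True
    return False
```

The sketch uses cubic dense products; substituting fast multiplication on the $O(\sqrt m)$-sized matrices gives the $O(m^{(1+\omega)/2})$ bound.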

David Eppstein
  • Wow, that's a clever idea. I was thinking about whether a sublinear search is possible (actually about disproving it) and didn't even think about the intrinsic properties of the problem. – Saeed Apr 30 '14 at 20:44
  • Thank you David. I'll leave it open for a moment, as my second question seems not to have been noticed yet. – Yixin Cao Apr 30 '14 at 21:03