Introducing the Gram matrix complicates the problem (and makes computational study of it difficult). Instead, notice that $H(t)$ is invertible exactly when $A + t(B-A) = A + tC$ has full rank, where $C = B - A.$
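As a quick numerical illustration of that equivalence (the matrices M1 and M2 below are arbitrary stand-ins, not taken from the question), the Gram matrix of a rank-deficient matrix has a zero determinant, whereas that of a full-rank matrix does not:

M1 <- cbind(1:3, (1:3)*2)           # rank 1: the second column is twice the first
M2 <- cbind(1:3, c(1,0,0))          # rank 2
c(qr(M1)$rank, det(crossprod(M1)))  # 1, 0
c(qr(M2)$rank, det(crossprod(M2)))  # 2, 13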
We can study this in detail in terms of the Singular Value Decomposition (SVD) of $A,$
$$A = UDV^\prime,$$
where $U^\prime U = \mathbb{I}_d = VV^\prime$ and $D$ is the diagonal matrix of the $d$ singular values of $A,$ all of which are nonzero because $A$ is of full rank. Accordingly, both $U$ and $V$ are of full rank, so instead let's examine the rank of the $d\times d$ matrix
$$U^\prime (A + tC) V = D + tU^\prime C V = tD\left(t^{-1}\mathbb{I}_d + D^{-1}U^\prime C V\right),$$
whose rank is the same as that of $A + tC$ (because $U$ and $V$ have full rank) and hence as that of $H(t).$ (The rank of $H(0)$ is, of course, the rank of $A,$ so from now on assume $t\ne 0.$)
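As an aside, R's svd function returns exactly these thin factors, which is easy to check numerically (the random matrix A0 below is an arbitrary stand-in, not part of the question):

A0 <- matrix(rnorm(6), 3, 2)                 # a random 3 x 2 matrix, full rank almost surely
s <- svd(A0)                                 # s$u is 3 x 2, s$d has length 2, s$v is 2 x 2
all.equal(A0, s$u %*% diag(s$d) %*% t(s$v))  # TRUE: A0 = U D V'
all.equal(crossprod(s$u), diag(2))           # TRUE: U'U is the 2 x 2 identity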
Taking determinants and writing $p_X(\lambda) = \det(X - \lambda \mathbb I)$ for the characteristic polynomial of any matrix $X,$
$$\begin{aligned}
\det \left(U^\prime (A + tC)V\right) &= \det(tD)\det\left(t^{-1}\mathbb{I}_d + D^{-1}U^\prime C V\right)\\
&=\det(D)\,t^d\ p_{D^{-1}U^\prime C V}(-t^{-1}).
\end{aligned}$$
Since $\det(D)\ne 0$ and $t\ne 0,$ the values of $t$ for which $H(t)$ is singular are exactly those satisfying $p_X(-t^{-1}) = 0,$ where $X = D^{-1}U^\prime C V = D^{-1}U^\prime (B-A) V.$
Since the roots of $p_X$ are the eigenvalues of $X,$ we have found that the only $t$ for which $H(t)$ fails to be invertible are the negative reciprocals of the (nonzero) eigenvalues of $X.$ (This includes complex eigenvalues, by the way.)
Since any $d\times d$ matrix has at most $d$ distinct eigenvalues, there are only finitely many such $t,$ and we now have a means to compute them all. The ones relevant to the question come from the real eigenvalues in the interval $(-\infty, -1),$ whose negative reciprocals lie in $(0,1);$ there may be fewer than $d$ of these.
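Here is a minimal R sketch of the resulting procedure (the function name singularT is mine, a hypothetical helper, and the zero-eigenvalue tolerance is arbitrary): compute the thin SVD of $A,$ form $X = D^{-1}U^\prime(B-A)V,$ and return the negative reciprocals of its nonzero eigenvalues.

singularT <- function(A, B) {
  s <- svd(A)                                # thin SVD: A = U D V'
  X <- diag(1/s$d, nrow=length(s$d)) %*% t(s$u) %*% (B - A) %*% s$v
  ev <- eigen(X, only.values=TRUE)$values    # eigenvalues of X, possibly complex
  -1 / ev[abs(ev) > 1e-12]                   # negative reciprocals of the nonzero ones
}

Applied to the example below, singularT(A, B) returns $1$ and $1/3$ (up to floating-point error).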
Example
Here is a simple but non-trivial example with $w\times d = 3\times 2$ matrices. Let
$$A = \pmatrix{1&0\\0&1\\0&0},$$
which obviously is of full rank, and
$$B = \pmatrix{-1&-1\\-1&-1\\0&0}.$$
(Although the question stipulates that $B$ should be of full rank, nothing in the foregoing analysis requires that, and this example helps show why the assumption is unnecessary.)
Here is R code to assess the invertibility of $H(t)$ by computing its determinant:
# Determinant of H(t) = (A + t(B-A))'(A + t(B-A)), vectorized over t
detH <- Vectorize(function(x, A, B) det(crossprod(A + x*(B - A))), "x")
I vectorized the calculation to enable easy plotting of $\det (H(t)),$ as in this snippet:
A <- rbind(diag(1,2), 0)        # the 3 x 2 matrix A of the example
B <- rbind(matrix(-1,2,2), 0)   # the 3 x 2 matrix B of the example
curve(detH(t, A, B), 0, 1, xname="t", n=501, lwd=2)
abline(v = c(1/3, 1), col="Red", lty=3)   # reference lines at t = 1/3 and t = 1

Evidently $H(t)$ fails to be invertible precisely when $t\in\{1/3,1\},$ because those are the two points where the resulting graph touches $0.$
What does the analysis say? We must begin with the SVD of $A.$ Because $A$ already has orthonormal columns, we may take $U=A,$ whence $D=V=\mathbb{I}_2$ are both identity matrices. Thus
$$X = D^{-1}U^\prime (B-A) V = U^\prime (B-A) = \pmatrix{1&0&0\\0&1&0}\pmatrix{-2&-1\\-1&-2\\0&0}=\pmatrix{-2&-1\\-1&-2}.$$
Notice that although $A$ and $B$ are rectangular matrices, $X$ is a square $d\times d$ matrix. Its characteristic polynomial is
$$p_X(\lambda) = \det(X - \lambda\mathbb{I}_2) = \det\pmatrix{-2-\lambda&-1\\-1&-2-\lambda}=(2+\lambda)^2-1.$$
Its zeros all satisfy $(2+\lambda)^2=1,$ whence the eigenvalues are $\lambda\in\{-3,-1\}.$ Their negative reciprocals are $t\in\{1/3,1\},$ agreeing with the direct R calculation.
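As a check, the eigenvalues can also be computed numerically in R:

X <- matrix(c(-2,-1,-1,-2), 2, 2)
-1 / eigen(X, only.values=TRUE)$values   # 1 and 1/3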