Here is an approach that will work for some polynomials, but is not guaranteed to work for all of them.
It is guaranteed to work for all multilinear quadratic polynomials. It is also guaranteed to work if there is any variable $x$ such that $p(\cdots,x,\cdots)$ does not contain an $x^2$ term. However, for polynomials of the form $p(x,\cdots,z) = c_x \cdot x^2 + \cdots + c_z \cdot z^2 + p'(x,\cdots,z)$ where $p'$ is a multilinear quadratic polynomial (i.e., polynomials where every variable appears squared), I can only offer a heuristic. The heuristic may often work in practice, but I have no proof.
(I feel like there must be a cleaner solution lurking here somewhere....)
Let $\mathbb{F}$ be a field. Suppose $p(x,y,z,\cdots) \in \mathbb{F}[x,y,z,\cdots]$ is a multivariate quadratic polynomial with coefficients in $\mathbb{F}$. Group the terms by powers of $x$, to get
$$p(x,y,z,\cdots) = c \cdot x^2 + q(y,z,\cdots) \cdot x + r(y,z,\cdots).$$
Notice that $q$ must be linear/affine, $r$ must be quadratic, and $c$ must be a constant ($c \in \mathbb{F}$). We want to find an assignment of values to $x,y,z,\cdots$ that makes
$$c \cdot x^2 + q(y,z,\cdots) \cdot x + r(y,z,\cdots) = 0.$$
Now let $\Delta$ denote the discriminant of this quadratic equation (in $x$), i.e.,
$$\Delta(y,z,\cdots) = q(y,z,\cdots)^2 - 4c \cdot r(y,z,\cdots).$$
Assuming $c \ne 0$ (we handle $c = 0$ separately below) and $\mathbb{F}$ does not have characteristic 2, the original polynomial is satisfiable if and only if you can find an assignment of values to $y,z,\cdots$ that makes $\Delta(y,z,\cdots)$ a square (i.e., a quadratic residue) in $\mathbb{F}$. Notice that $\Delta(y,z,\cdots)$ is itself a multivariate quadratic polynomial (since $q$ is affine and $r$ is quadratic).
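For concreteness, here is a small sketch of testing whether $\Delta$ is a square using Euler's criterion (my own toy example over the prime field $\mathbb{F}_{11}$; the polynomial coefficients are made up):

```python
# Toy example over the prime field F_P, P = 11 (an assumed small odd prime).
# For p(x,y) = c*x^2 + q(y)*x + r(y), a solution for x exists
# iff Delta(y) = q(y)^2 - 4*c*r(y) is a square in F_P.

P = 11  # field modulus (assumption: chosen only for illustration)

def is_square(a: int) -> bool:
    """Euler's criterion: nonzero a is a square in F_P iff a^((P-1)/2) == 1."""
    a %= P
    return a == 0 or pow(a, (P - 1) // 2, P) == 1

# Hypothetical example polynomial: 2*x^2 + (3*y + 1)*x + (y^2 + 5)
c = 2
q = lambda y: (3 * y + 1) % P
r = lambda y: (y * y + 5) % P

def delta(y: int) -> int:
    return (q(y) ** 2 - 4 * c * r(y)) % P

# The values of y for which the quadratic in x has a root:
good_ys = [y for y in range(P) if is_square(delta(y))]
```

For this particular choice of coefficients, $y = 0$ gives $\Delta(0) = 1 - 40 \equiv 5 \pmod{11}$, which is a square ($4^2 = 16 \equiv 5$), so a root for $x$ exists there.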
At this point, our solution approach branches, based on whether $c=0$ (an easy case) or $c\ne 0$ (the harder case).
If $c=0$, solving this equation is easy: it is basically just a linear equation (it is linear in $x$, once you fix all the other variables).
As a result, there is a special class of polynomials where this problem is especially easy to solve: any polynomial $p$ where there exists a variable -- for simplicity, we will call it $x$ -- such that $x^2$ does not appear in $p$. For that class, you will have $c=0$ above, and then the following algorithm suffices to solve the problem:
Check whether $q$ has any assignment of values to $y,z,\cdots$ that makes $q(y,z,\cdots) \ne 0$. Since $q$ is linear, this is easy to check.
If yes, then choose any such assignment of values to $y,z,\cdots$. Next, set $x = - r(y,z,\cdots) q(y,z,\cdots)^{-1}$. By definition, $q(y,z,\cdots) \ne 0$, so $q(y,z,\cdots)$ has an inverse (since we're working in a field), so such a value of $x$ exists. This immediately yields an assignment of values to $x,y,z,\cdots$ that makes $p$ zero, and we're done.
If no, then we know $p(x,y,z,\cdots) = r(y,z,\cdots)$ for all possible assignments of values to the variables of $p$. Effectively, $p$ never depended on $x$ in the first place, so it was a mistake to think of it as a function of $x$. Put another way, we have eliminated one variable from $p$, and we get a new instance of the original problem; recursively apply our methods to $r(y,z,\cdots)$. Exception: if $p$ was a function of a single variable (i.e., $p(x)$), then $r$ is a constant, and $p$ is satisfiable if and only if that constant is zero.
This algorithm handles the easy case: namely, polynomials where there exists at least one variable $x$ that appears in $p$ but where $x^2$ does not appear in $p$. For this case, the algorithm runs in polynomial time and determines whether there exists an assignment of values to the variables that makes the polynomial equal to zero.
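The easy-case algorithm above can be sketched as follows (a minimal sketch in Python over a toy prime field; the coefficient-list representation of the affine $q$ is my own assumption):

```python
# Easy case: p has no x^2 term, so p = q(y,...)*x + r(y,...) with q affine.
# Assumed representation: q_coeffs = [a_1, ..., a_n, b] encodes
# q(v_1,...,v_n) = a_1*v_1 + ... + a_n*v_n + b; r is a callable on n values.

P = 11  # assumed odd prime modulus, for illustration

def solve_no_x2(q_coeffs, r):
    """Return (x, vals) with q(vals)*x + r(vals) == 0 (mod P),
    or None if q is identically zero (then recurse on r instead)."""
    *a, b = q_coeffs
    vals = [0] * len(a)
    if any(ai % P for ai in a):
        i = next(j for j, aj in enumerate(a) if aj % P)
        # pick v_i so that q(vals) != 0 (all other variables stay 0)
        vals[i] = 1 if (a[i] + b) % P else 2
    elif b % P == 0:
        return None  # q == 0 identically: p never depended on x
    qv = (sum(ai * vi for ai, vi in zip(a, vals)) + b) % P
    x = (-r(vals) * pow(qv, -1, P)) % P  # x = -r * q^{-1}
    return x, vals

# Example: p(x, y, z) = (3*y + 1)*x + (y^2 + 5)  -- made-up coefficients
```

The trick in the middle: if $a_i + b \equiv 0$, setting $v_i = 2$ gives $q = 2a_i + b \equiv a_i \ne 0$, so $q(\text{vals})$ is always invertible when we reach the last two lines.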
Now back to the hard case. If $p$ does not fall into the easy case, then every variable appears squared, i.e., we must have
$$p(x,\cdots,z) = c_x \cdot x^2 + \cdots + c_z \cdot z^2 + p'(x,\cdots,z),$$
where $p'(x,\cdots,z)$ has no squared terms (it is multilinear). Let's look at what methods we can use for this case.
Well, here's one situation where we can solve this. Suppose there exists a pair of variables $x,y$ such that $-c_x/c_y$ is a square (a quadratic residue) in $\mathbb{F}$, say, $-c_x/c_y=\alpha^2$. Then we can apply the change of variables $y' = y-\alpha x$, i.e., $y = y'+\alpha x$. Conveniently, we have
$$c_y \cdot y^2 = c_y \cdot (y'+\alpha x)^2 = c_y \cdot (y')^2 + 2\alpha c_y \cdot xy' + \alpha^2 c_y \cdot x^2 = c_y \cdot (y')^2 + 2\alpha c_y \cdot xy' - c_x \cdot x^2.$$
Plugging this change of variables into $p$, we get
$$p(x,y',\cdots) = c_x \cdot x^2 + c_y \cdot (y')^2 + 2\alpha c_y \cdot xy' - c_x \cdot x^2 + \cdots,$$
i.e.,
$$p(x,y',\cdots) = c_y \cdot (y')^2 + 2\alpha c_y \cdot xy' + \cdots,$$
where the omitted part does not contain an $x^2$ term, since the $c_x \cdot x^2$ and $\alpha^2 c_y \cdot x^2 = -c_x \cdot x^2$ terms cancel. (One caveat: if $p'$ contains a cross term $c_{xy} \cdot xy$, the substitution $y = y'+\alpha x$ also contributes $\alpha c_{xy} \cdot x^2$. In that case, choose $\alpha$ to be a root of $c_y \alpha^2 + c_{xy} \alpha + c_x = 0$ instead, which exists whenever $c_{xy}^2 - 4 c_x c_y$ is a square in $\mathbb{F}$; when $c_{xy}=0$ this reduces to the condition above.) Effectively, we have eliminated the $x^2$ term, so now we can apply the method above to $p(x,y',\cdots)$. Notice that $p(x,y',\cdots)=0$ is satisfiable if and only if $p(x,y,\cdots)=0$ is, since the change of variables is invertible. When we find a solution that makes the former zero, we can back-solve for $y = y'+\alpha x$, and we obtain an assignment to $x,y,z,\cdots$ that makes $p(x,y,z,\cdots)=0$.
Is it possible that no pair of variables allows us to eliminate a squared term? If $-1$ is a quadratic residue (a square) in $\mathbb{F}$, and if we have at least 3 variables, then that is not possible: $-c_x/c_z = (-1) \times (-c_x/c_y) \times (-c_y/c_z)$, and since the product of two quadratic non-residues is a quadratic residue, at least one of $-c_x/c_y$, $-c_y/c_z$, or $-c_x/c_z$ is guaranteed to be a quadratic residue, so at least one of the squared terms can be cancelled. However, if $-1$ is a quadratic non-residue, or we have only 1 or 2 variables, then it might not be possible to eliminate a squared term. That's the remaining difficult case. (The situation with only 1 or 2 variables is not difficult to handle, so really the difficult case is where $-1$ is a quadratic non-residue and every variable appears squared in $p$.) I have no general solution for this difficult case.
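The three-variable claim can be checked exhaustively for a small field (a quick brute-force sketch over $\mathbb{F}_{13}$, chosen because $13 \equiv 1 \pmod 4$, so $-1 = 5^2$ is a square there):

```python
# Check: over F_13 (where -1 = 5^2 is a square), for any nonzero c_x, c_y,
# c_z, at least one of -c_x/c_y, -c_y/c_z, -c_x/c_z is a quadratic residue.
P = 13

def is_square(a: int) -> bool:
    return pow(a % P, (P - 1) // 2, P) == 1  # Euler's criterion, a != 0

def neg_ratio(a: int, b: int) -> int:
    return (-a * pow(b, -1, P)) % P  # -a/b in F_P

claim_holds = all(
    any(is_square(neg_ratio(u, v)) for u, v in [(cx, cy), (cy, cz), (cx, cz)])
    for cx in range(1, P) for cy in range(1, P) for cz in range(1, P)
)
```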
However, I can suggest a heuristic that may often work, even for this difficult case.
In a finite field $\mathbb{F}$ of odd characteristic, about half of all field elements are squares (quadratic residues); if you pick a random field element, it will be a square with probability about $1/2$. Therefore, if we pick values for $y,z,\ldots$ randomly, we can predict that (heuristically) $\Delta(y,z,\cdots)$ should be a square with probability about $1/2$, if $\Delta$ acts like a random function. This suggests the following heuristic algorithm:
1. Randomly pick a variable to eliminate, say $x$.
2. Pick values for $y,z,\ldots$ randomly.
3. If $\Delta(y,z,\cdots)$ is a square in $\mathbb{F}$, then the equation $p(x,y,z,\cdots)=0$ has a solution for $x$, namely $x=(-q(y,z,\cdots) \pm \sqrt{\Delta(y,z,\cdots)})/(2c)$ (assuming $\mathbb{F}$ does not have characteristic 2). This gives us an assignment to the variables $x,y,z,\ldots$ that makes $p(x,y,z,\cdots)=0$, so we're done.
4. If $\Delta(y,z,\cdots)$ is not a square, go back to step 1.
If after many steps, you do not find any solution, then you might guess that the equation is unsatisfiable (but this is a heuristic, so your guess could be wrong).
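Putting the steps together, here is a sketch of the heuristic (over a toy prime field with $p \equiv 3 \pmod 4$, so that square roots are cheap via $\sqrt{a} = a^{(p+1)/4}$; the example coefficients below are made up):

```python
import random

P = 11  # assumed prime with P % 4 == 3, so square roots are easy

def is_square(a: int) -> bool:
    a %= P
    return a == 0 or pow(a, (P - 1) // 2, P) == 1

def sqrt_mod(a: int) -> int:
    return pow(a % P, (P + 1) // 4, P)  # valid only when a is a square

def heuristic_solve(c, q, r, n_vars, tries=1000, rng=random):
    """Try random assignments to y,z,... until Delta is a square.
    c: nonzero coefficient of x^2; q, r: callables on a list of values."""
    for _ in range(tries):
        vals = [rng.randrange(P) for _ in range(n_vars)]
        delta = (q(vals) ** 2 - 4 * c * r(vals)) % P
        if is_square(delta):
            # quadratic formula in F_P (characteristic != 2)
            x = (-q(vals) + sqrt_mod(delta)) * pow(2 * c, -1, P) % P
            return x, vals
    return None  # heuristic guess: probably unsatisfiable (no proof!)

# Example: p(x, y) = 2*x^2 + (3*y + 1)*x + (y^2 + 5) over F_11 (made up)
```

For a large field one would replace the $p \equiv 3 \pmod 4$ shortcut with a general square-root algorithm such as Tonelli–Shanks; the structure of the loop is unchanged.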
This algorithm might work well for many of the remaining difficult polynomials, but I have no proof that it will always work. In particular, there is a risk that all the possible discriminant polynomials $\Delta(\cdots)$ might have the property that their values are always non-squares (quadratic non-residues), in which case the algorithm above will fail.