The usual Shannon entropy, defined on a discrete set of probabilities, is non-negative, as it is an average of non-negative numbers (each $p_i \le 1$, so $\log(1/p_i) \ge 0$):
$$\sum_i p_i \log\left(\tfrac{1}{p_i}\right).$$
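A minimal numerical check of this non-negativity (a sketch in Python with numpy; the example distributions are my own choices):

```python
import numpy as np

# Shannon entropy of a discrete distribution: sum_i p_i * log(1/p_i).
# Each term is non-negative because 0 < p_i <= 1 implies log(1/p_i) >= 0.
def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # terms with p_i = 0 contribute 0
    return np.sum(p * np.log(1.0 / p))

print(shannon_entropy([0.5, 0.5]))     # log 2 ~ 0.693
print(shannon_entropy([1.0]))          # 0, the minimum (deterministic case)
print(shannon_entropy([0.9, 0.1]))     # ~0.325, still non-negative
```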
Differential entropy need not be positive. It is
$$\int p(x) \log\left(\tfrac{1}{p(x)}\right) dx,$$
which can be negative: $p(x)$ is a probability density, so it can be greater than one, making $\log\left(\tfrac{1}{p(x)}\right)$ negative. For example, the uniform density on $[0, 1/2]$ has $p(x) = 2$ on its support, giving differential entropy $-\log 2 < 0$. In fact, differential entropy can be viewed as the limit of Shannon entropy over infinitesimally small boxes of width $\epsilon$, with $\log(1/\epsilon)$ subtracted; without that subtraction the limit diverges as the boxes shrink:
$$
\lim_{\epsilon\to 0^+} \sum_i p_{[i\epsilon, (i+1)\epsilon]}
\log\left(\tfrac{1}{p_{[i\epsilon, (i+1)\epsilon]}}\right)
$$
$$
\approx
\lim_{\epsilon\to 0^+}
\sum_{i} p(i \epsilon)\,\epsilon
\log\left(\tfrac{1}{p(i \epsilon)\,\epsilon}\right)
$$
$$
=
\lim_{\epsilon\to 0^+}
\left(\sum_{i} p(i \epsilon)\,\epsilon
\log\left(\tfrac{1}{p(i \epsilon)}\right)
+ \log(1/\epsilon) \sum_i p(i\epsilon)\,\epsilon \right)
$$
$$
=
\int
p(x)
\log\left(\tfrac{1}{p(x)}\right) dx
+ \lim_{\epsilon\to 0^+}\log(1/\epsilon),
$$
where $\sum_i p(i\epsilon)\,\epsilon \to 1$ because $p$ integrates to one, and $\log(1/\epsilon) \to \infty$ as $\epsilon \to 0^+$.
For the Dirac delta the differential entropy is $-\infty$, so you are right.
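You can see the $-\infty$ concretely by letting a Gaussian $\mathcal N(0,\sigma^2)$, whose differential entropy is the standard closed form $\tfrac12\log(2\pi e\sigma^2)$, narrow toward a delta (a small sketch under the same assumptions as above):

```python
import numpy as np

# Differential entropy of N(0, sigma^2) is 0.5 * log(2*pi*e*sigma^2);
# as sigma -> 0 the density approaches a Dirac delta and h -> -infinity.
for sigma in [1.0, 0.1, 0.01, 1e-6]:
    h = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
    print(sigma, h)
```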