
I didn't know that could happen, and since I already asked the question here, I don't know what to do with this question. Should I delete it?

I am currently reading Baby Rudin as my second analysis book (after *Introduction to Real Analysis* by Robert G. Bartle and Donald R. Sherbert), and I was surprised that Rudin's book includes complex numbers. I tried to avoid complex numbers for a while because the way we were taught about them in school, and even in college, was unsatisfactory. They said something like "Mathematicians in the past didn't accept complex numbers and thought they were meaningless nonsense, but now they do," and then moved on with the course without explaining why mathematicians in the past rejected complex numbers and what made them accept them later. This question has been on my mind for quite some time, and I tried to avoid complex numbers as much as possible, as the question is probably very hard to answer (after all, there must be a reason why we were not taught the answer). But now, since I "have" to deal with complex numbers, I want to know the answer. After some thought, I remembered that complex numbers are not the only ones that were once viewed as meaningless and later became normal: zero and negative numbers were also rejected as nonsense in the past before becoming accepted.

So I want to ask for some references (books, articles, etc.) that explain what the problems were with accepting zero, negative numbers, and complex numbers, and how those problems were resolved.

pie
    Better asked on the history of mathematics stackexchange. – Ethan Bolker Nov 05 '23 at 00:27
    The only "problem" with zero, negative numbers, and complex numbers is that they're not immediately intuitive. For example, people felt that if zero is "nothing," and numbers represent "something," then zero can't be a number. Similarly, one cannot have $-3$ apples, and clearly there is no real number $x$ satisfying $x^2 = -1$. These problems were "solved" by humans understanding that mathematical ideas are just that: abstract concepts. All that's required for a mathematical object to exist is a precise definition and a rigorous proof of existence. –  Nov 05 '23 at 02:19
  • https://www.ferrovial.com/en/stem/complex-numbers/ – Will Jagy Nov 05 '23 at 02:28
  • https://www.math.uri.edu/~merino/spring06/mth562/ShortHistoryComplexNumbers2006.pdf – Will Jagy Nov 05 '23 at 02:30
  • Paul Nahin's book "An Imaginary Tale: The Story of $\sqrt{-1}$" is a fun read. – Jair Taylor Nov 05 '23 at 02:32
  • @JesseMadnick So why don't we define $\frac{1}{0}$ or other undefined expressions? For example, let's define a new number $g$ such that $|g| = -1$. – pie Nov 05 '23 at 14:35
    Because it's pretty easy to represent what's going on when you introduce $i$ - just work with pairs of numbers, and stipulate that $(0,1) \times (0,1) = (-1,0)$. To "create" $e$, you have to introduce the least upper bound property, which is non-constructive, and introduces a bunch of other weird behaviors. Going from $\mathbb Q$ to $\mathbb R$ is a much bigger ask than introducing $i$. –  Nov 05 '23 at 15:27
    And you can "define" $\frac {1}{0}$ - you just have to choose which properties of numbers you want to assume still hold, and which you're willing to give up. (When you introduce $i$, you have to give up $\lt$.) If you assume $\frac {1}{0}$ exists and keep the simplest field axioms, then you get that every number is equal to every other number. That's way too trivial for almost everybody. –  Nov 05 '23 at 15:31
    Well, $\mathbb C$ is usually built on top of $\mathbb R$, and in some sense there's a perfect copy of $\mathbb R$ sitting inside of $\mathbb C$, so there is a dependence of $\mathbb C$ on $\mathbb R$, but not the other way. Although you can also just look at $\mathbb Q[i]$, and add $i$ to $\mathbb Q$ without bothering to do $\mathbb R$. –  Nov 05 '23 at 15:36
  • @pie: You can define any object you like, but you have to say what set $S$ the object lives in and what operations $S$ has. For instance, $i = \sqrt{-1}$ doesn't belong to $S = \mathbb{R}$, but rather to $S = \mathbb{C}$, and $\mathbb{C}$ is a field. If you want to define an object $x = 1/0$, that's fine, but what set $S$ does $x$ live in? Maybe you define $S = \mathbb{R} \cup \{1/0\}$, but do you want to add and multiply elements of $S$? Well, you'll have to define that, too. But if you want all the familiar properties like $a+b=b+a$, etc., you'll end up proving that $0 = 1 = 2$ in $S$. –  Nov 05 '23 at 18:41
    @pie: The point is that the object $x = 1/0$ is fundamentally different from $i = \sqrt{-1}$. Whereas one can define a field called $\mathbb{C}$ that contains both $\mathbb{R}$ and $i = \sqrt{-1}$, there is no field (having $0 \neq 1$) that contains both $\mathbb{R}$ and $x = 1/0$. Similarly, if you want to define $g \in S$ to be an object that satisfies $|g| = -1$, then you can do this, but you'll need to say what $S$ is, and what the domain and codomain of the extended absolute value function is, etc. (Obviously, $g \notin \mathbb{R}$). –  Nov 05 '23 at 18:43
  • @JesseMadnick Thank you very much for your explanation: "These problems were 'solved' by humans understanding that mathematical ideas are just that: abstract concepts. All that's required for a mathematical object to exist is a precise definition and a rigorous proof of existence." I want to ask: what made modern mathematicians think this way, and when did this happen? – pie Nov 05 '23 at 18:49
    @pie: I'm not a historian, but I'd say that the advent of set theory probably had a large impact on how mathematicians viewed mathematics. You should probably ask this on history of mathematics stackexchange. –  Nov 05 '23 at 18:53
  • @pie "Why don't we define $1/0$?" We can and do - see extended real numbers. "lets define a new number $g$ with $|g| =-1$" - well, be my guest! You just have to define what $|\cdot|$ means for you. The problem is convincing other people to also use your definition. This was done with $i$ because it is extraordinarily useful - in electrical engineering, quantum mechanics, differential equations, improper integrals, etc. If your definition with $|g| = -1$ is that useful I'm sure people will come around to it! – Jair Taylor Nov 06 '23 at 05:36
  • @JairTaylor I know about the extended real numbers, but I have many questions about them; see https://math.stackexchange.com/questions/4796921/questions-about-the-extended-real-number-system. The answer I received was that they are just a convenient way to avoid writing theorems twice. – pie Nov 06 '23 at 11:33
  • @JairTaylor I thought that many or most mathematical theorems were developed abstractly, with no intention of applying them to the real world. I know that the number $g$ is silly, but could a new branch of math be formed from it, the way complex analysis grew out of $i$? – pie Nov 06 '23 at 11:36
    @pie Well, I'd say that most definitions at least have some application to other math, if not outside applications. In particular, the use of $i$ was important in the solution of the general cubic - see the book I mentioned for the details. As to whether a new branch can be formed with your $g$... well, go for it if you like! No one will stop you. They might not pay much attention, though, unless you can find a good reason. – Jair Taylor Nov 06 '23 at 18:34
  • $1/0$ cannot just be defined to be any exotic object. There is a reason division by zero is forbidden, and the issues do not vanish with exotic "resolutions" like the "extended real numbers" and similar constructions. This is completely different from the definition of $i$, which consistently extends the real numbers. –  Nov 07 '23 at 08:15
  • @Gae.S. That is my question. I asked it on MSE and then re-asked it here; I didn't know about migration, so the question is here twice. – pie Nov 12 '23 at 17:25
    @pie I wasn't actually asking, but whenever you flag a question as a duplicate the site creates a comment of yours asking if the post you're linking "solves the problem". I find it very annoying; I liked it better when it just said "Possible duplicate of ...". – Gae. S. Nov 12 '23 at 17:28
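The claim in the comments that keeping the field axioms while admitting $1/0$ collapses everything to a single number follows from a standard two-step argument (reconstructed here, not quoted from the thread):

```latex
% Step 1: in any field, 0 \cdot x = 0 for every x.
\[
0\cdot x = (0+0)\cdot x = 0\cdot x + 0\cdot x
\quad\Longrightarrow\quad 0\cdot x = 0 .
\]
% Step 2: if some x satisfies x = 1/0, i.e. 0 \cdot x = 1, then
\[
1 = 0\cdot x = 0,
\qquad\text{hence}\qquad
a = a\cdot 1 = a\cdot 0 = 0 \quad\text{for all } a .
\]
```

So every element equals $0$: the only "field" containing $1/0$ is the trivial one, which is exactly why adjoining $i$ (a genuine field extension) and "defining" $1/0$ are fundamentally different moves.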
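As an aside, the pairs-of-reals construction mentioned in the comments is easy to make concrete. The sketch below (the `Pair` class and its names are my own illustration, not from any library) defines multiplication on ordered pairs and checks the stipulation $(0,1) \times (0,1) = (-1,0)$:

```python
# Minimal sketch of the construction of C as ordered pairs of reals.
# The class name "Pair" is illustrative only.

class Pair:
    """A complex number represented as an ordered pair (a, b) of reals."""

    def __init__(self, a, b):
        self.a, self.b = a, b

    def __mul__(self, other):
        # Multiplication is *defined* by (a, b)(c, d) = (ac - bd, ad + bc).
        return Pair(self.a * other.a - self.b * other.b,
                    self.a * other.b + self.b * other.a)

    def __eq__(self, other):
        return (self.a, self.b) == (other.a, other.b)

    def __repr__(self):
        return f"({self.a}, {self.b})"

i = Pair(0, 1)            # the pair playing the role of i
print(i * i)              # -> (-1, 0)
print(i * i == Pair(-1, 0))  # -> True: "i squared is -1" with no mystery
```

Nothing imaginary is ever invoked: $i^2 = -1$ becomes an ordinary statement about pairs of reals, which is essentially Hamilton's way of dispelling the "nonsense" objection.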

0 Answers