(February 29, 2012 at 10:44 am)Categories+Sheaves Wrote: I thought we were talking about Zeno's paradoxes?
Our universe does seem to behave discretely, but Zeno's paradoxes are not proof of this fact. At least that's what I was arguing.
I absolutely agree with what you are saying here. Zeno didn't "prove" anything. Nor did I claim that he did.
However, what he did do is offer a reasonable hypothesis, one that has subsequently been observed to hold true via the science of Quantum Physics.
All I'm saying is that Zeno's hypothesis was correct. I'm not saying that he had necessarily 'proved' it. But I feel that he made very good arguments that have never been refuted.
And, as I say, modern proclamations that the calculus limit answers Zeno's concerns are simply false. It does no such thing.
Zeno was right.
And that's all I'm saying.
(February 29, 2012 at 10:44 am)Categories+Sheaves Wrote: (February 29, 2012 at 8:12 am)Abracadabra Wrote: The people who favored the continuum ultimately won in terms of shaping mathematical thinking and this is why we currently have a mathematics that based on the idea of a continuum.
It's not like discrete math went extinct or anything. If anything, our physics is based on real analysis. There's still plenty of mathematicians doing research in Algebra, Combinatorics, etc.
Oh absolutely!
Our modern "mathematics" is a truly ill-defined field in the larger picture. By that I mean that there are many different forms of analysis that are placed under the umbrella called "mathematics".
Hells bells, even Boolean algebra is considered to be "mathematics", yet Boolean algebra has nothing at all to do with ideas of 'quantity' as in normal mathematics. It's a totally different type of logical analysis having its own rules of operations that are totally separate from the rules of numerical arithmetic.
In arithmetic there are operations like addition, subtraction, multiplication, division, etc. In Boolean algebra there are operations like AND, OR, NOT, XOR, and so on. These are totally different concepts from the normal numerical concepts of quantity. Yet we lump this into "mathematics".
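To make the contrast concrete, here's a minimal sketch of my own (just an illustration, not anyone's formal system) showing how the Boolean operations obey laws that ordinary arithmetic does not:

```python
# Boolean operations follow their own laws, distinct from numerical
# arithmetic. One example: OR is idempotent (x OR x = x), whereas
# addition is not (x + x = 2x, not x).

def boolean_or(a: bool, b: bool) -> bool:
    """The Boolean OR operation."""
    return a or b

# Idempotence holds for OR:
assert boolean_or(True, True) == True    # x OR x = x
# ...but the analogous law fails for addition:
assert 1 + 1 != 1

# XOR is another operation with no counterpart among the ordinary
# arithmetic operations (though it happens to match addition mod 2):
for a in (False, True):
    for b in (False, True):
        print(f"{a} XOR {b} = {a ^ b}")
```

The point is only that the two systems run on different laws; whether they belong under one umbrella is the question at issue.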
The very term "Mathematics" has become nothing more than an umbrella-term used by universities as a box to shove anything into that appears to have a logical structure.
That itself is a very unwise thing to do.
Why is it unwise? Because by doing that we tend to lose sight of what's truly important in things like the study of "quantity" or "quantitative properties" that might be associated with our physical reality.
In other words, "mathematics" has become a hodgepodge collection of any sort of logical structures, and therefore it has lost sight of what's truly important or cardinally foundational to any one of them.
We'd be far better off keeping these totally different types of logical systems separate from each other.
(February 29, 2012 at 10:44 am)Categories+Sheaves Wrote: (February 29, 2012 at 8:12 am)Abracadabra Wrote: We don't need anything on the number line but rational numbers, and they are discrete (i.e. quantized).
You probably know this already, but quantization also means our universe doesn't require the entirety of the rational numbers either.
I never said it did.
In fact, IMHO, it's utterly absurd to even speak of "The entirety of the rational numbers"
What would that even mean? The rational numbers are infinite as an abstract concept. To speak of "the entirety of them" would imply that you could somehow include all of them. But that is Georg Cantor's mistake of trying to treat the infinite as though it were a finite thing.
All I'm saying is that "all you need" to explain the real world are rational numbers (in an absolute cardinal sense). But that doesn't mean that you need "all of them" to do this.
In fact, as you point out, in a quantized universe not only would you not need them all, but neither could they all exist in terms of 'real' quantitative properties.
Absolutely!
You're right on the money there.
(February 29, 2012 at 10:44 am)Categories+Sheaves Wrote: (February 29, 2012 at 8:12 am)Abracadabra Wrote: We have no need to support these notions of "real numbers" as though they must be an integral part of our idea of absolute quantity.
This is absolutely true. We support these notions of "real numbers" because modern analysis is incredibly useful. Did I mention that it forms the backbone of QM?
According to a theory that runs on infinite-dimensional spaces of complex functions, real-world quantities are discrete. I guess that means it's time to get rid of those whimsical abstractions called real numbers!
Can you see why I think this is unreasonable?
Sure I can. But this is only because you don't truly understand where I'm coming from. I have no problem with abstractions and things like infinite-dimensional spaces, complex functions, etc.
And neither am I saying that we need to get rid of whimsical abstractions called "real numbers". On the contrary, those concepts are indeed required.
What I'm actually saying is that we are simply treating them incorrectly.
I'm not saying that we need to do away with them altogether.
Pi will be Pi forever. The square root of two will be the square root of two forever. And both of those "numbers" will forevermore be incommensurable in terms of rational numbers. As will be "e" of the natural logarithms, and phi of the Fibonacci numbers. All of those irrational relationships will remain valid observations.
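For the record, the standard textbook argument for why the square root of two is incommensurable with the rationals (not my own result, just the classic proof) goes like this:

```latex
% Suppose, for contradiction, that \sqrt{2} = p/q where p and q are
% integers sharing no common factor. Then:
\sqrt{2} = \frac{p}{q} \quad\implies\quad 2q^2 = p^2
% So p^2 is even, hence p is even; write p = 2k:
2q^2 = (2k)^2 = 4k^2 \quad\implies\quad q^2 = 2k^2
% Then q is even too, contradicting the assumption that p/q was in
% lowest terms. No ratio of integers can equal \sqrt{2}.
```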
However, where the mistake is made is in demanding that these concepts be made into 'cardinal quantities' and be incorporated into a number line, etc.
That's where the mistake is made.
These irrational relationships arise from specific self-referenced situations and are not the result of cardinal properties of quantity. Especially in terms of sets or collections of "individual" cardinal objects.
So it's a mistake to try to force them to become part of the definition of cardinality, which is what mathematicians have been doing since the early 20th century.
It's a real mistake, and one that hasn't gone unrecognized by other mathematicians. Henri Poincare had this to say about Georg Cantor's set theory (which forms the cardinal basis of modern set theory).
It goes something like this, although this is not an exact quote:
"Georg Cantor's set theory based on nothing (i.e. the empty set) and producing transfinite numbers (infinities of different cardinal sizes) is a disease which future mathematicians will eventually need to be cured from"
I am totally in agreement with Henri Poincare's observations.
And he's not the only prominent mathematician who felt this way at the time. Kronecker also rejected Cantor's nonsense and stated the following:
"God gave us the integers, the rest is the work of men".
Hopefully Kronecker was speaking of 'God' in the same way Albert Einstein thought of 'God'. But I also agree with the words of Kronecker, especially in terms of a cardinal definition of number as an idea of absolute quantity.
There's simply no need to be destroying that notion with made up ideas of so-called "real cardinal numbers", that ultimately force us into the obvious paradox of being faced with infinities of different cardinal sizes.
These are not only unnecessary ideas, but they are ultimately "wrong".
They also blind us to the true nature of the relative essence of irrational relationships and the FACT that they are produced by self-referenced situations. The mathematical community doesn't even know this FACT.
They can't see it because they have indeed accepted irrational "numbers" in terms of cardinal ideas of quantity. Thus they aren't even looking for another explanation.
(February 29, 2012 at 10:44 am)Categories+Sheaves Wrote: I understand that you believe we'll get more mileage out of a system based on 'absolute' quantities. I do not understand why you think this system will afford us some problem-solving techniques that the current implementations of mathematics doesn't. A paradigm shift in mathematics would be awesome. But that sort of thing only happens when you find a system that's better than the current one.
I can not only offer a better system than the current one, but I can even point to precisely what changes need to be made, and why they need to be made.
Precisely how that will help in a practical matter in terms of science I cannot predict. However, surely a correct mathematical formalism would be more effective than the false one we are currently using.
One immediate thing I can point to is that the corrected system shows clearly where irrational relationships come from and why they should not be thought of in terms of "cardinal" absolute quantities.
So it makes a difference already. Plus the corrected system would also not require multiple-sized infinities, which is another bogus idea that we don't need.
Something is either finite or infinite and that's that. The idea that something could be more infinite than something else is a bogus and unfruitful idea.
(February 29, 2012 at 10:44 am)Categories+Sheaves Wrote: (February 29, 2012 at 8:12 am)Abracadabra Wrote: It's various axioms of set theory that's really at the base of the problem.
I'd like you to expand on that point... but you are aware that the machinery of QM relies on that stuff too, right?
Sure I'm aware of that. And this is precisely why it should be rock solid and true to reality.
I'm not saying that we should 'trash' set theory altogether. All I'm saying is that there have been mistakes made at the foundation of set theory. In fact, those errors have been introduced basically by one man - Georg Cantor.
Change those and let set theory evolve from the new foundation. Then we'll have a better set theory.
In fact, if we make the changes I propose (which have also been proposed by other mathematicians even back in the days when set theory was starting out), then another great feature of this is that Kurt Godel's incompleteness theorem would no longer apply to mathematical formalism.
The reason for that is quite involved, but that too is based on a concept of self-reference. Cantor's set theory is indeed a self-referenced system. The change I propose would produce a system that is not self-referenced.
Thus Godel's incompleteness theorem would not apply to my set theory.
But that's a whole other story.
(February 29, 2012 at 10:44 am)Categories+Sheaves Wrote: Axiom of choice? Clearly a worthless abstraction. It's not like physicists needed their hilbert spaces to have orthonormal bases or anything...
I think you're jumping to conclusions that may not have anything at all to do with the issues that I'm attempting to address.
I'm not saying that abstraction itself is the culprit. I'm quite sure that my ideas qualify as being 'abstract' as well. But then again, that depends on what you're defining as "abstract".
What does abstract even mean?
Does it mean vague or ill-defined?
I don't think mathematicians would be bragging about mathematics being abstract if that meant that mathematics is vague or ill-defined.
Does it mean non-tangible? (i.e. having no relation to the physical world)
Well, again such a thing should not be important to mathematics. Why would mathematics brag about mathematics not relating to anything real?
However, if abstract simply means - Applicable to many cases,...
Then my ideas are as abstract as it gets.
And IMHO, this is indeed the type of abstraction that is important to mathematics.
So when we speak about 'abstract concepts' we really need to look closely at what we mean by that.
If we simply mean whimsical, vague, or having no application to reality, then we need to question why we would even be interested in such concepts.
On the other hand, if we mean ideas that have many applications and can also be applied to correctly describing the quantitative physical properties of our universe, then I'm all for it.
(February 29, 2012 at 10:44 am)Categories+Sheaves Wrote: (February 29, 2012 at 8:12 am)Abracadabra Wrote: ...And thus with that frame of mind they are naturally going to go off in la-la land making up mathematics on their own whim with total disregard to whether or not is means anything.
Reminds me of a guy I know.
Are you sure it doesn't remind you of a guy that you truly do not know at all?