Hello.
So I've had the pleasure of having both of my HCJ seminar papers this academic year being on the topic of Logic and the Philosophy of Mathematics and in particular the works of Gottlob Frege and Bertrand Russell.
I might have described myself as agnostic before this injustice was visited upon me, but now I'm a solid theist who believes in a God that takes a perverse interest in dull human affairs and meddles accordingly. A God that just wanted to see me stutter my way through grotesquely complex material that I'll never, ever understand.
I'll try to write about it anyway in spite of the fact that I don't understand it. Wittgenstein's “whereof one cannot speak, thereof one must be silent” can stay on the shelf for now.
Logic and Technology
Logic is a means of working out a statement's truth value. A statement can be true, false, or meaningless (if there is no means of verifying it). Human beings are veritable verification machines: we approach everything in a logical manner, whether it looks that way or not. We're so complex that we don't even have to think consciously about the processes we go through to reach our conclusions. It's only in the last hundred years or so, since men like Frege attempted to map those processes out, that technology has been able to advance and mimic them.
'Intuitive' has recently become a buzzword for lauding consumer electronics and computer programs on their usefulness, and when you think about it, that's an absolutely ridiculous premise on which to judge a computer: 'intuition' is a very human concept, and a machine is, of course, constrained by the people who programmed it in the first place. You might as well judge it on whether it has a good sense of humour. It is only as computers reach levels of complexity that even begin to approach the human brain that these gadgets have become 'intuitive', and therefore pleasant to use instead of incredibly infuriating. I remember my dad had a Palm Pilot (this is seriously going back) when I was a tiny little girl, and it never did what you wanted it to do, purely because it was limited by its own rubbish hardware and programming. That doesn't really happen so much any more, which is great, and a testament to how far everything has come.
When we talk about how far technology has progressed, that's just shorthand for how far our ability to express and map out inconceivably complex logic has come.
Frege
Gottlob Frege was a German philosopher and mathematician. He was among the first to truly define numbers as 'the expression of pure deductive logic', stripping away the pervasive idea, dating back to Pythagoras, of numbers as mystical entities. Numbers and ratios held particular significance for Pythagoras and his followers; the 'golden ratio', for instance, was applied in art and architecture.
The de-mystification of mathematics helped pave the way for the idea that all philosophical reasoning can be reduced to a series of core arguments and an analysis of the way they have been put together (syntax); this is the underpinning of analytic philosophy. Frege showed that all of arithmetic could be broken down into a set of simple, logical premises. He also introduced new logical notation that could express generality, such as 'all', 'some', 'none' and 'if'. Frege maintained that the same rules that apply to numbers can apply to language, as long as the language used is clear and precise.
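To get a feel for what Frege's generality notation buys you, here is a minimal sketch (in modern terms, not Frege's own two-dimensional notation) of 'all', 'some' and 'none' expressed over a small finite domain. The domain and the `is_even` property are my own illustrative choices.

```python
# Quantifiers over a finite domain, sketched with Python's all()/any().
domain = [2, 4, 6, 8]

def is_even(n):
    return n % 2 == 0

every = all(is_even(n) for n in domain)      # "all x are even"
some = any(is_even(n) for n in domain)       # "some x is even"
none = not any(is_even(n) for n in domain)   # "no x is even"

print(every, some, none)  # True True False
```

The point is that 'all', 'some' and 'none' are not vague words here but precise operations whose truth value can be worked out mechanically.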
Bertrand Russell expanded upon what Frege had built. Frege's work mainly concerned the logic of language, whereas Russell was trying to prove that mathematics holds an inexorable link to logic because, much like music, it is a 'perfect' language. There are no connotations and no subjectivity: a 9 is a 9, and a D sharp is a D sharp (or an E flat).
Something humanity had always struggled with was pinning down exactly what numbers are. If they're not mystical entities or Platonic perfect forms, then it stands to reason that they are empirical generalisations of what you usually get when you add objects together: 1 bead + 1 bead = 2 beads. That's fine, but it breaks down when you add 1 drop of water to 1 drop of water and are still left with 1 drop of water.
Russell instead took his lead from Kant and put forward the notion that numbers are synthetic, a priori propositions that can be defined logically from a simple, limited set of axioms. Axioms, according to Russell, are self-evident logical truths.
Principia Mathematica
Russell's Principia Mathematica was an attempt to show that mathematics simply follows on from logic. He believed that the philosophy of logic enjoyed a high status, and that, given mathematics was made up from component pieces of pure logic, it was only fair that maths should have an equal status.
He began by defining a vocabulary for logical premises, i.e. X and ¬X ('the sky is blue' and 'the sky is not blue') to stand for propositions, and then gave examples of how you would use that vocabulary in practice. The syntax used to make 'deductions' is expressed as the following:
If X, then Y.
X.
Therefore Y.
So that could be something like:
If it rains, then the ground becomes wet.
It is raining.
Therefore the ground is wet.
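This deduction pattern (modus ponens) can be checked mechanically: whatever truth values X and Y take, whenever both "if X then Y" and "X" hold, "Y" holds too. A minimal sketch of that check, with helper names of my own choosing:

```python
from itertools import product

def implies(a, b):
    """Material implication: 'if a then b' is false only when a is true and b is false."""
    return (not a) or b

# Run through every possible truth assignment for X and Y and confirm
# that ((X -> Y) and X) -> Y is never false.
valid = all(
    implies(implies(x, y) and x, y)
    for x, y in product([True, False], repeat=2)
)
print(valid)  # True: the deduction is valid in every case
```

Because the check exhausts all four truth assignments, it shows the rule is a matter of pure form, independent of what X and Y actually say about rain or wet ground.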
He then moved on to the problems surrounding numbers, as mentioned earlier. The main problem is how to define a number without referring to the concept of a number itself – the same issue that would arise if you tried to explain what 'red' is to someone who can't see. It can quickly become very circular.
The solution was to think of numbers in terms of 'sets.' The number 2 is simply a way of saying 'the set of all sets of couples', in the same way that number 3 is 'the set of all sets of trios.'
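The trick that makes this non-circular is that 'having the same number' can be defined by pairing members off one-to-one, without ever using a number word. A rough sketch of that idea (the function name and examples are mine, not Russell's notation):

```python
def equinumerous(a, b):
    """Pair members off one at a time; two collections have the
    'same number' when both run out together. No counting words used."""
    a, b = list(a), list(b)
    while a and b:
        a.pop()
        b.pop()
    return not a and not b

# "2" is then the set of all sets equinumerous with some chosen couple:
print(equinumerous(("salt", "pepper"), ("knife", "fork")))  # True: both couples
print(equinumerous(("salt", "pepper"), ("a", "b", "c")))    # False: couple vs trio
```

Notice that `equinumerous` never mentions 2 or 3; the numbers are recovered afterwards as classes of collections that pair off with one another.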
Notation has since developed to the point where the Principia is considered unreadable to 'amateur' logicians, and people have pointed out inconsistencies in the work. For the sake of his quest to derive complex arithmetic from 'self-evident truths' (axioms), Russell was forced to introduce two further axioms to the reader: the axioms of infinity and reducibility. Critics noted that neither is a priori; both require experience to measure and prove.
Additional notes from lecture:
Natural numbers are special words used in counting; the act of counting is to create a group of items. Apes and stone-age tribes had very simple number systems that relied on natural numbers, and the extent of counting would consist of "one thing", "more than one thing" and "many things". They didn't need to go into any further complexity, and so these were the only numbers they needed.
Even for people from advanced cultures, small number words are functionally different from large number words. Most people instinctively know that there are up to seven objects in a group without having to stop and physically count them; you could say with conviction that there were only three people in a room, but you could not say that there were 37,498 people at a football match without some means of proving it.
Addition and multiplication are plurals of plurals. Creating words and abstract symbols for plural categories requires a system of number words (again, symbols) and a logical syntax for combining those number words to imply further, or predicate, number words. In this instance, the number 7,367 is a predicate symbol built from a bunch of more basic symbols organised according to a known syntax; the point is, you can't conceptualise what 7,367 looks like without breaking it down in this way.
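The "known syntax" in our case is place value: a large number word is just digit symbols combined by powers of ten. A small sketch of that breakdown (function name is my own):

```python
def decompose(n, base=10):
    """Break a number into (digit, place-value) pairs, least
    significant first, then reverse into reading order."""
    parts = []
    place = 1
    while n:
        n, digit = divmod(n, base)
        parts.append((digit, place))
        place *= base
    return parts[::-1]

print(decompose(7367))
# [(7, 1000), (3, 100), (6, 10), (7, 1)]
```

So '7,367' is shorthand for 7 thousands, 3 hundreds, 6 tens and 7 units: four basic symbols and a rule for combining them, exactly the kind of predicate structure described above.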
A predicate, in logic, can be analysed (similar to division and calculating squares and roots in mathematics). This statement is the basis of analytic philosophy, on which all modern technology bases itself. Computers follow a basic logical language; an ongoing debate is whether computers can ever reach the complexities of human logic.
Ancient civilisations used hieroglyphics for number words. Greek and Roman societies depended on numbered symbols, and their systems did not regard zero and one as numbers.
The introduction of the concept of 'zero' came from India, and was absolutely ground-breaking; the idea that you can describe 'nothing' had caused philosophers considerable agony for millennia, because the proposition that zero equals nothing in itself meant that it was something. It went against the Law of the Excluded Middle formulated by Aristotle: the idea that any proposition must be true, or its opposite must be. It excludes any other possibility, and zero, being neither anything nor nothing, did not fit this rule.
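The Law of the Excluded Middle itself can be stated and checked in one line: for any proposition P, "P or not P" comes out true under both possible assignments. A tiny sketch:

```python
# Excluded middle: P or not-P is true whether P is true or false.
print(all(p or not p for p in (True, False)))  # True
```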