[personal profile] zotz
Yesterday there was a Guardian article about the philosopher (and sculptor, and jazz pianist, and sailor, and cidermaker . . .) Dan Dennett, who it turns out is a man too talented to be allowed to live. I have recommended his work to some of you before - especially Consciousness Explained and Darwin's Dangerous Idea.

Date: 2004-04-18 03:26 pm (UTC)
From: [identity profile] steer.livejournal.com
Because of the arguments of Lucas and Searle, we know that consciousness cannot be implemented on a Turing machine.

I would say that actually Penrose believes this as his starting hypothesis but introduces the arguments of Lucas and Searle since they are classically used in such a situation. [You must admit that any decent coverage of the strong AI hypothesis would have to include those arguments.]

2: But everything we know how to build using the laws of modern physics can be implemented on a Turing machine.

I do not believe this part is true and I do not believe Penrose claims it. If there is a 50% chance that an atomic particle decays in time x, how do you simulate that with a TM? [Answer: you can't without a hidden variable.]

His falling for Searle's emotionally-laden nonsense, on the other hand, is just embarrassing in one so bright.

I believe Searle reversed his position a few years back. [And, as usual with AI type debates, countered his own argument with a different (and mutually contradictory) argument to the critics.]

Date: 2004-04-19 12:32 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
I would say that actually Penrose believes this as his starting hypothesis but introduces... Not sure I know what you mean here. If you mean he starts with that belief for emotional reasons, I agree. However his case rests on that of Lucas and Searle.

If there is a 50% chance that an atomic particle decays in time x how do you simulate that with a TM?

When discussing complexity classes like BPP or ZPP, one posits a Turing machine with access to a source of randomness. That's not the sort of uncomputability Penrose is thinking of. He concedes that physics as we know it does not give you the stuff to build a super-Turing machine; that's why he introduces the discussion on QGT.

His is by far the most intellectually careful and rigorous defence of biological supremacism ever, IMHO. Searle says (from memory) "In answer to the question 'can a machine think?' we answer 'yes, we are such machines'". But he never takes seriously explaining what's so special about brain-stuff that computer-stuff can't do the same thing. Penrose rises to the challenge.

I believe Searle reversed his position a few years back.

I heard Searle speak in public fairly recently and his position did not seem much different.

Date: 2004-04-19 03:22 am (UTC)
From: [identity profile] steer.livejournal.com
If you mean he starts with that belief for emotional reasons, I agree.

I do mean that.

However his case rests on that of Lucas and Searle.

(grin) Well, that rather depends on which side you see the burden of proof being on. For me the claim that consciousness can arise within the bounds of classical and quantum mechanics is a really rather extraordinary one in itself -- there is no proof of that, good, bad or indifferent.

That's not the sort of uncomputability Penrose is thinking of.

Indeed - I fully realise that. What I'm pointing out is that we don't need any new physics to be beyond what is reachable for a strict TM. Already you have to start introducing "a source of randomness" - what I'm trying (incoherently) to say is that you are gradually pushed back from strict Turing computability [as I mention to [livejournal.com profile] zotz, the next step is to require you to include sum over histories and use a non-deterministic TM]. QM already requires a looser definition of computable -- it would be no surprise if a complete theory of quantum gravity introduced something more.

I heard Searle speak in public fairly recently and his position did not seem much different.

See reference - though (to be frank) I remember reading Searle at the time and being completely nonplussed as to quite what about his argument he had changed.

http://www.artsci.wustl.edu/~philos/MindDict/searle.html

Date: 2004-04-19 04:03 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
No, randomness is the last wrinkle. In fact, even that's not necessary if you're looking only at Lucas's argument, which is the only one that even vaguely holds water - any problem that can be solved with bounded error on a randomized TM can be solved on a deterministic TM.
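To make that derandomization claim concrete, here is a sketch (my illustration, not from the thread) using Freivalds' randomized check of a matrix product as a stand-in for "a problem solved with bounded error on a randomized TM": the deterministic machine simply enumerates every possible random tape, which is exponentially slower but needs no randomness at all.

```python
import itertools
import random

def freivalds_once(A, B, C, r):
    """One trial of Freivalds' check: does A(Br) equal Cr for the 0/1 vector r?"""
    n = len(A)
    Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
    ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
    Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
    return ABr == Cr

def randomized_check(A, B, C, trials=20):
    """Randomized machine with bounded error: if AB != C, each random
    trial catches the mismatch with probability >= 1/2, so the overall
    error after `trials` independent trials is at most 2**-trials."""
    n = len(A)
    return all(freivalds_once(A, B, C, [random.randint(0, 1) for _ in range(n)])
               for _ in range(trials))

def deterministic_check(A, B, C):
    """Derandomized version: enumerate all 2**n possible random tapes.
    Exponentially slower, but decides exactly the same problem."""
    n = len(A)
    return all(freivalds_once(A, B, C, list(r))
               for r in itertools.product([0, 1], repeat=n))

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
print(deterministic_check(A, B, [[19, 22], [43, 50]]))  # True: C really is AB
print(deterministic_check(A, B, [[19, 22], [43, 51]]))  # False
```

The exchange says nothing about efficiency, of course - the deterministic simulation can be exponentially slower, which is exactly the computability-versus-complexity gap the rest of the thread turns on.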

[as I mention to zotz the next step is to require you to include sum over histories and use a non deterministic TM]

I'm sure you know that everything computable on a non-deterministic TM is computable on a deterministic TM.

Date: 2004-04-19 04:14 am (UTC)
From: [identity profile] steer.livejournal.com
No, randomness is the last wrinkle.

(grin) I suspect a wrinkle that you will already have a hard time straightening.

any problem that can be solved with bounded error on a randomized TM can be solved on a deterministic TM

Interesting -- but accepting "solved with bounded error" as the same as "computable" fundamentally changes the nature of what is computable.

http://mathworld.wolfram.com/ComputableNumber.html

Most irrationals are not computable. But they are certainly computable within bounded error.

I'm sure you know that everything computable on a non-deterministic TM is computable on a deterministic TM.

I'm sure I don't. In fact I'm sure I know that, while it is widely believed, the Church-Turing thesis has stood unproved for nearly 70 years now. [I'm not sure whether or not I _believe_ the Church-Turing Thesis - I think I will remain agnostic on the issue.]

Have you seen:
http://arxiv.org/abs/quant-ph/0203034

I'd be interested in your opinion -- I don't know enough QM to decide so I am very skeptical of it.

Date: 2004-04-19 04:45 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
It's more usual to think of yes/no questions than computing infinite streams; in this case the error bound is a bound on the probability that the machine gets the answer wrong. You can convert an "infinite stream" problem to a "yes/no" problem with a question like "Is bit n of (say) pi 1?". By this definition, most irrationals are not even computable within bounded error.
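To make the yes/no framing concrete (my example, using sqrt(2) rather than pi because its bits fall out of exact integer arithmetic): a real number is computable precisely when some machine answers "is fractional bit n a 1?" correctly for every n.

```python
from math import isqrt

def sqrt2_bit(n):
    """Decide the yes/no question "is fractional bit n of sqrt(2) a 1?".
    Exact: floor(sqrt(2) * 2**n) == isqrt(2 * 4**n), so there is no
    floating-point rounding to worry about."""
    return isqrt(2 * 4 ** n) % 2

# sqrt(2) = 1.01101010... in binary
print([sqrt2_bit(n) for n in range(1, 9)])  # [0, 1, 1, 0, 1, 0, 1, 0]
```

A non-computable irrational is exactly one for which no such bit-deciding procedure can exist - and wrapping it in an error bound doesn't rescue it, per the point above.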

If you don't know the proof of equivalence between a TM and an NDTM, you should look it up in any undergraduate text on the subject. The reduction is simple: given an NDTM to solve a given problem, evaluate all possible branches simultaneously. If any of them accept, then accept, otherwise don't accept. Just asked a colleague to recommend a suitable textbook, he suggests Michael Sipser, Introduction to the Theory of Computation, or a couple of others I'll add later.
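The branch-enumeration idea is easiest to see in the finite-state special case (an NFA rather than a full NDTM - a deliberate simplification on my part, roughly as the textbooks present it): instead of guessing one branch, the deterministic simulator tracks the set of all states the nondeterministic machine could be in, and accepts if any branch accepts.

```python
def nfa_accepts(transitions, start, accepting, s):
    """Deterministically simulate a nondeterministic finite automaton
    by evaluating all branches at once: `current` holds every state
    the NFA could possibly be in after the input read so far."""
    current = {start}
    for ch in s:
        current = {q2 for q in current
                   for q2 in transitions.get((q, ch), ())}
    return bool(current & accepting)  # accept iff some branch accepts

# NFA for binary strings ending in "01": nondeterministically guess
# where the final two characters begin.
T = {
    ("a", "0"): {"a", "b"},  # loop, or guess this 0 starts the suffix
    ("a", "1"): {"a"},
    ("b", "1"): {"c"},       # the guessed suffix finishes with 1
}
print(nfa_accepts(T, "a", {"c"}, "11001"))  # True
print(nfa_accepts(T, "a", {"c"}, "0110"))   # False
```

For a full NDTM the same idea becomes a breadth-first search over machine configurations; it can take exponentially longer, which is why the equivalence is one of computability rather than efficiency.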

The Church-Turing thesis can't be stated with enough formality to be mathematically proven. What we can do is think of lots of machines that fit the criteria they lay out, and prove that they are simulable on a TM, and that we've done many times. See Wikipedia: Church-Turing Thesis: "...the thesis does not have the status of a theorem and cannot be proven..."

Date: 2004-04-19 06:08 am (UTC)
From: [identity profile] steer.livejournal.com
most irrationals are not even computable within bounded error.

Interesting - something I should read more on.

If you don't know the proof of equivalence between a TM and an NDTM

Apologies - you are correct here. What I was trying to get across was the concept of a Probabilistic Turing machine with an energy minimisation principle factored into the probabilities. I had (erroneously) thought that an NDTM was the same sort of thing. [I made the same error in a reply to zotz.]

Sidenote: Irksomely, some web references say an NDTM is the same thing as a Quantum Turing Machine and others say this is a common fallacy. I shall give up investigating this on the web and do some work.

Date: 2004-04-19 07:21 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
It's a common fallacy. QC is believed to be a different computational class from NP.

Date: 2004-04-19 07:31 am (UTC)
From: [identity profile] steer.livejournal.com
But still an open research question I presume [I managed to find web references saying with assurance that NP was solvable in polynomial time using QC and that it certainly was not. Ah, the web, you can find any answer you want there.]

Thanks for the discussion - very interesting, helped me clarify my ideas and I found a few more things to read up on. [If you do ever have a chance to look at that paper I'd be interested on your thoughts as to whether it's lunacy or genius. Haven't studied QM since I was an undergraduate.]

Date: 2004-04-19 08:10 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
It is indeed open: we still don't even know if P=NP.

However if you remove the time bounds then all these machines are equivalent.

Here's what my colleague recommended:

Date: 2004-04-19 08:07 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
Introductions to computability theory; most of these contain some (basic) material on complexity as well -- except Davis:

(1) Hopcroft, J., Ullman, J. & R. Motwani: Introduction to Automata Theory, Languages, and Computation 2/E

Sadly, far less authoritative and complete than the first edition, although more up to date and more accessible to undergrads. Still a good intro, though.

(2) Sipser, M.: Introduction to the Theory of Computation.

Very, very clear and accessible, yet contains non-trivial proofs. Far from complete in coverage, but IMO the best starting point to obtain good intuitions regarding computability and complexity.

(3) Lewis, H. and C. Papadimitriou: Elements of the Theory of Computation

More coverage than Sipser regarding automata and languages, but less on complexity. A sound introduction, if a bit dry.

(4) Davis, Martin: Computability and Unsolvability.

Classic text on computability (*not* complexity). Available in an affordable Dover edition :-)

Personally, I'd go with Sipser as an interesting, accessible introduction to both computability and complexity. However, coverage is selective, so it is certainly not a reference book. Lewis and Papadimitriou is solid, but dry (comparable to trying to eat three Weetabix straight from the packet).

Hopcroft and Ullman is probably the best bet overall. Unfortunately, the second edition doesn't live up to the first: for a lot of basic results, the first edition is still a better reference, and may be worth tracking down at a college library. It's quite pricey, though. If you're not interested in complexity, then Davis' book is quite complete (and cheap!).

Rgds,

J.

Re: Here's what my colleague recommended:

Date: 2004-04-19 09:15 am (UTC)
From: [identity profile] steer.livejournal.com
(Grin) Thanks to the elegant minimalism of York Uni library this decision was made a lot easier. I now have a copy of Hopcroft on loan. Thanks for the info.

Date: 2004-04-19 11:04 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
What I was trying to get across was the concept of a Probabilistic Turing machine with an energy minimisation principle factored into to the probabilities.

Not sure exactly what you mean here; if you mean a quantum Turing Machine (i.e. one in which the probability of a particular result is determined by a sum of amplitudes) then that too is equivalent to a TM. Or if you mean something where each result has a weight and the one with the least weight wins, then that's equivalent too (so long as you work out the details in a reasonable way and don't try to slip in the ability to do infinite work in finite time).

Date: 2004-04-19 12:24 pm (UTC)
From: [identity profile] steer.livejournal.com
I guess I'm thinking aloud here so perhaps should stop -- the sort of consideration I'm thinking of is the way which QM systems can find a base energy from a variety of answers -- e.g. not only calculate a number of sums in parallel but select the one which is in some sense optimal. However, I'm afraid at this point my recollection of QM has grown amazingly hazy and I'm not sure whether I'm confusing this with other issues so I'd best stop while I'm behind. :-)

Date: 2004-04-19 04:38 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
For me the claim that consciousness can arise within the bounds of classical and quantum mechanics is a really rather extraordinary one in itself -- there no proof of that, good bad or indifferent.

No, when you encounter a phenomenon in nature the default claim should be that it is explicable in known physics. Only when you have a really good reason to think so should you reach for the idea that physics will need to be extended to explain that phenomenon.

Date: 2004-04-19 04:46 am (UTC)
From: [identity profile] steer.livejournal.com
when you encounter a phenomenon in nature the default claim should be that it is explicable in known physics

Well, that's a claim certainly. I would say it rather ignores how physicists have actually worked, which tends to be more "Bloody hell, that's a bit unusual. Doesn't seem to fit with my models. Mind you, it does fit with this new and interesting model."

Only when you have a really good reason to think so should you reach for the idea that physics will need to be extended to explain that phenomenon.

Which gets us into whether or not we have such a good reason to believe consciousness is an unusual phenomenon requiring an unusual explanation - I think it clearly is.

Date: 2004-04-19 05:08 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
How so? To posit new physics before discounting the possibility that existing physics could account for what you observe would seem to be a very straightforward case of multiplying entities beyond necessity.

Roentgen was excited by the fluorescence he observed from his Crookes tube because he took many steps to ensure that it could not be caused by anything already known in physics, and must therefore be the result of a new kind of ray. Becquerel knew that the fogging he saw on photographic plates was not a result of exposure to light, and no existing theory said that uranium salts could fog unexposed plates. Physics advances entirely through solving problems with existing theory (sometimes phenomena it can't account for, sometimes internal problems). It could hardly advance by someone positing a new theory for every leaf fall.

Date: 2004-04-19 05:49 am (UTC)
From: [identity profile] steer.livejournal.com
How so? To posit new physics before discounting the possibility that existing physics could account for what you observe would seem to be a very straightforward case of multiplying entities beyond necessity.

Well, perhaps we don't want to go too far down this argument because I think it is something of a side issue. It is usually impossible to know when the possibility of existing physics explaining a phenomenon can be discounted. Worse, it is usually subject to extreme controversy. Take Einstein's 1905 paper on quanta and black body radiation. At the time, most physicists argued that there were perfectly acceptable explanations within the body of conventional physics; it was just that nobody had come up with them.

In the case of consciousness, we have nothing which approaches an explanation through conventional physics (I take it you would agree with that?). This does not, of course, mean that such an explanation is impossible, but certainly it justifies the attempt to look for something more.

Date: 2004-04-19 11:19 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
We were arguing about burden of proof. Even in the cases you cite, the burden of proof was on Einstein to argue that it wasn't just a failure of imagination, but that it would fundamentally not be possible to account for the photoelectric effect in classical mechanics.

Penrose recognises this, which is why he attempts to demonstrate the same for consciousness.

To me, trying to solve the problems with consciousness by extending physical law is as bizarre as trying to solve some mystery with the course of the English Civil War by extending physical law. I just don't see that any of the problems are of the sort that would be solved that way. But then, I also probably don't see the problems as even close to as severe as you do - for the most part I think Dennett's account in "Consciousness Explained" is about right.

Date: 2004-04-19 12:34 pm (UTC)
From: [identity profile] steer.livejournal.com
We were arguing about burden of proof.

Indeed - but surely we were arguing more precisely about whether one needs to establish that "classical" explanations are insufficient (which Einstein did not do) or whether it is sufficient to have a new explanation with more predictive power (which I admit Penrose does not yet have).

for the most part I think Dennett's account in "Consciousness Explained" is about right.

(Grin) There I feel is where we fundamentally differ, but that is another and a very long discussion. For me it is a beautiful book, intelligently written by a philosopher I admire tremendously, but a better title would have been "Consciousness Ignored."
