BRETT HALL
  • Home
  • Physics
    • An anthropic universe?
    • Temperature and Heat
    • Light
    • General Relativity and the Role of Evidence
    • Gravity is not a force
    • Rare Earth biogenesis
    • Fine Structure
    • Errors and Uncertainties
    • The Multiverse
    • Galaxy Collisions
    • Olber's Paradox
  • About
  • ToKCast
    • Episode 100
    • Ep 111: Probability >
      • Probability Transcript
  • Blog
    • Draft Script
  • Philosophy
    • Epistemology
    • Fallibilism
    • Bayesian "Epistemology"
    • The Aim of Science
    • Physics and Learning Styles
    • Positive Philosophy >
      • Positive Philosophy 2
      • Positive Philosophy 3
      • Positive Philosophy 4
    • Inexplicit Knowledge
    • Philosophers on the Web
    • David Deutsch & Sam Harris
    • David Deutsch: Mysticism and Quantum Theory
    • Morality
    • Free Will
    • Humans and Other Animals
    • Principles and Practises: Preface >
      • Part 2: Modelling Reality
      • Part 3: Political Principles and Practice
      • Part 4: Ideals in Politics
      • Part 5: The Fundamental Conflict
    • Superintelligence >
      • Superintelligence 2
      • Superintelligence 3
      • Superintelligence 4
      • Superintelligence 5
      • Superintelligence 6
  • Korean Sydney
  • Other
    • Critical and Creative Thinking >
      • Critical and Creative Thinking 2
      • Critical and Creative Thinking 3
      • Critical and Creative Thinking 4
      • Critical and Creative Thinking 5
    • Learning >
      • Part 2: Epistemology and Compulsory School
      • Part 3: To learn you must be able to choose
      • Part 4: But don't you need to know how to read?
      • Part 5: Expert Children
      • Part 6: But we need scientific literacy, don't we?
      • Part 7: Towards Voluntary Schools
    • Cosmological Economics
    • The Moral Landscape Challenge
    • Schools of Hellas
  • Postive Philosophy blog
  • Alien Intelligence
  • High Finance
  • New Page
  • Serendipity In Science
  • Philosophy of Science
  • My YouTube Channel
  • The Nature of Philosophical Problems
  • The Nature of Philosophical Problems with Commentary
  • Subjective Knowledge
  • Free Will, consciousness, creativity, explanations, knowledge and choice.
    • Creativity and Consciousness
  • Solipsism
  • P
  • Image for Podcast
  • ToK Introduction
  • Begging the Big Ones
  • Blog
  • Our Most Important Problems
  • Corona Podcasts
    • Brendan and Peter
    • Jonathan Davis
  • Responses
  • Audio Responses
  • New Page
  • Critically Creative 1
  • Critically Creative 2
  • Critically Creative 3
  • Critically Creative 4
  • Critically Creative 5
  • David Deutsch Interview in German
  • Audio Files
  • Lookouts
  • Breakthrough!

Errors & uncertainties

The unexpected ways error correction is at the heart of
reason, science, democracy, learning and progress

Errors and Uncertainties

“I wish to replace...the question of the sources of our knowledge by the entirely different question: How can we come to detect and eliminate error?” - Karl Popper, “Knowledge without Authority” (1960)

Perhaps the most crucial lesson we can learn as we grow and understand the world is that there are not, or should not be, “authorities” to whom we defer about how our own lives, or the lives of any reasonable people, should be organised and controlled. What we should desire is the autonomy to direct our own actions in the way we see fit, in so far as it does not get in the way of anyone else doing the same in their pursuit of what seems good to them. But deference to authority is not like this: it is the belief that someone else - by virtue of greater strength or knowledge, or power conferred upon them by some institution - is better placed to decide for us because, presumably, they are less prone to error than we might be. But that is wrong. We can all, all of us, always be in error. We should know better by now.

Of course, deference to authority evolved from a good place: it was a reaction to the utter chaos of the uncivilised world and our tribal, indeed primitive, roots. It was a first approximation - a tentative attempt - at the creation of a stable society. Other species in the animal kingdom barely recognise authority (modulo some social animals, like large-brained mammals, where the hierarchical authority enforces its instinctive will through violence - as, indeed, all authority ultimately must). The natural state of things, especially in the animal world, is utter chaos. There the rule is: devour or be devoured. Our oldest cultures and customs evolved to impose order on this chaos, or perhaps even to birth it into being. Violence, destruction, cruelty and, eventually, the ruin and decay of all things: this is nature. And in order to escape the hazards of nature - to be the exception to the rule and excel and survive and indeed thrive - we humans have laboured long, both physically and intellectually, to build cultures (as well as technologies) that can be stable. And as David Deutsch has observed in "The Beginning of Infinity", the Enlightenment in the Western world has allowed a very peculiar, unique and special thing: we have created, evolved and until now nurtured a culture that is stable under change. This is no small thing. Most cultures that have ever existed have been "static societies" in which tradition reinforced not change but the status quo and, we should note, the vast majority of these have fallen into utter ruin and gone extinct. And this is because they were unable to allow change - the precondition for progress, and for solving problems faster than the rate at which the biggest, lethal problems arose. But ours, now, is different. So far. It allows for change. And progress. And the fastest rate of problem solving that has ever existed, from what we can tell, in the entire history of the cosmos.
And the reasons for that are not entirely clear. That is one reason we need to be very careful about changing our own culture too quickly: if it is roughly stable under change now, any major change made too suddenly for us to correct course if an error is made could cause the whole project to become unstable. And that could be dangerous, as it would be for any culture, static or dynamic.

But it has taken us millennia to get to this point where we, as people and as a civilisation, not only tolerate but welcome a high rate of change and progress. It is no surprise that at first the best idea we had as a species was to defer to authority in order to keep things stable: the strongest or the most cunning could enforce the rules. Perhaps the most cruel or the most wealthy used strong-arm tactics. Whatever it was, a hierarchy of power of some sort was set up in a community that enforced standards - typically through occasional, and not altogether arbitrary, violence. And this, it must be admitted, is far better than constant (often random) violence and chaos. The latter is a recipe for fear and constant misery; what is marginally better is some kind of framework within which there can be a kind of peace, if not freedom - where at least one’s children might be able to reach maturity without being indiscriminately murdered. Laws and customs might be arbitrary and themselves cruel - and mandated by some tyrant - but, again, this “system” is preferable to a state of constant war, conflict and suffering. And especially if traditions have existed where people can live out their natural lives in relative peace (if not freedom), then it is no wonder that humans have opted for a “strong” person to be in power: an authority to which we can defer.

So it is completely understandable that this “deference to authority” is deeply ingrained in our culture - politically, socially and even academically. In politics, we strive to elect someone who will “get things done” and make decisions on behalf of us all. Someone who will make the laws and control our civilisation and protect it from threats. Someone else, in short, who will solve our problems. We defer to their authority and expertise - we obey. Socially, the man has traditionally been “head of the household”, and many communities still organise their families so that the first-born son ranks more highly than the youngest daughter. Religions are patriarchal. And finally, in academia, we struggle to shrug off deference to the authority of the expert.

As David Deutsch observed during his interview with Sam Harris on The Waking Up Podcast, “I think that what happens when we consult experts, whether or not you use the word “authority” - it’s not quite that we think that they’re more competent - it’s...I think ...when you referred to error correction - that hits the nail on the head. I think that there’s a process of error correction in the scientific community that approximates to what I would use if I had the time and the background and the interest to pursue it there.”

So a rejection of authority is not a wholesale rejection of experts: it is a rejection of a certain way of thinking about them and about how knowledge is validated. We should not be interested in sources, but in methods. If we have some good explanation of how the method in some field of expertise comes to create the knowledge it claims to possess, and a good account of what the sources of error are, then we can assess whether that knowledge really maps onto reality or is going to be nonsense. After all, astrologers claim to be “experts” - but that is no reason to trust their prognostications. We rightly assess methods and means, not sources.

“Question Authority” was a bumper sticker my father bought me when I was a teenager, and perhaps it seeped into my unconscious. It was not “Reject Authority” - but question. Again: the source does not matter. If the person with the badge and gun who claims the authority of the state asks me to do something, it would be unwise to continually disobey. But it might be reasonable to question, if it seems safe to do so. The quote reproduced at the top of this page, which encapsulates Popper’s epistemology, is very much a call to academia and the wider community. We should not seek out sources of knowledge but rather ways in which to detect and eliminate error. Popper’s anti-authority philosophy extended even into political theory. His claim about democracy was that it was not a system for installing leaders, but rather a system for most effectively removing bad ones without violence. To use Popper’s own words, “The new problem, as distinct from the old “Who should rule?”, can be formulated as follows: how is the state to be constituted so that bad rulers can be got rid of without bloodshed, without violence?” (It is worth reading the quote in its original context in a classic piece he wrote for The Economist here.)

But this new, powerful anti-authority philosophy, which places progress and change at the centre of things, might work for such lofty notions as epistemology and political theory - but can it be applied to, say, teaching and learning? It can - and as I have explained here: teachers should not claim to be, or act as, authorities. But if you have a group of learners who want to learn from you, how best to help them? Of course, respond to their questions - but also help instil in them a sense of uncertainty (which is identical to a sense of discovery, in truth) and the idea that knowledge, useful and powerful as it is, is bound to be filled with error. After all, it is fallible humans who construct it. And there is no better area in which to see the power and utility of this "error and uncertainty" methodology than science. In learning about error and uncertainty we really do learn about science itself and not merely science facts. But all academic disciplines are filled with error, and democracy, as we have seen, is best conceived of as an error-in-leadership correction mechanism. But then, what makes science so special?

Learning Physical Sciences

One crucial, deal-breaking difference between science and non-science is, as Popper described, drawn by the demarcation line of “falsificationism”. If a theory can be falsified by some experimental observation, then it might be a scientific theory - but if it cannot be, even in principle, then it never can be. Experiment is thus at the heart of science. But that is not all. David Deutsch built upon Popper's progress in drawing a line between what is scientific and what is not, extending the line of demarcation into the realm of "good explanations". See his TED talk here for more on that. (The central claim is that a "good explanation" is "hard to vary while still accounting for the phenomena it purports to.") The purpose of science, then, is to create not merely "falsifiable" claims but actually good explanations of the physical world. "Falsifiable claims are two-a-penny", as David has said: "Every end-of-the-world prophet has a falsifiable claim to make." So we cannot be merely after predictive models. They are useful when scientific, but not the whole story. Kepler’s Laws do not exist merely to predict where points of light in the sky will be night after night. Instead they are explanations of things that really do exist - huge astronomical objects like planets and moons and suns really are in the sky, moving according to Kepler's Laws (to a first approximation, anyway). And given that they exist, Kepler’s Laws explain something of how they behave. So explanation, in terms of objects that really do exist, is the purpose of science. And those explanations, in part, refer to mathematical models that allow us to make predictions. But all of that - the explanation we create, the (in physics at least) mathematical model that follows and the predictions we derive - is error prone. The explanation might be completely wrong. The model might work - but only within certain limits. And the prediction will always be uncertain.
And, in physics, wonderfully, uncertain to a degree we can quantify...which is to say specify with some numbers.

In learning physical science, a valuable lesson is in recognising that not only are the predictions uncertain but, unlike in many other fields of study, we can actually quantify our uncertainty. This is very special and should be the envy of every other domain of science that purports to explain some part of physical reality. We can place numbers on our predictions to specify, in some sense, how confident we are that some number we have measured actually represents reality. The physical sciences are very much about measurements, which let us test our best explanations and use them to make ever better predictions: measuring, specifying and constraining those parts of theories we have conjectured. This is why error analysis is - or should be - at the heart of scientific experimentation and of learning about science. Students of science should want to test their own ideas and be able to provide some good indication of the ways they could be wrong, and to what degree. So, arguably (once we have created an explanation for the phenomena in question), reporting on all the sources - and possible sources - of error in our analysis is the most important aspect of science. This is what, in large part, separates good science from bad.

“A result quoted without uncertainty is, strictly, meaningless” wrote physicist David Deutsch in “The Beginning of Infinity”. We should take that seriously. Results in physics are always provided with some other numbers: +/- values which give the range within which some data point might lie, or error bars which do the same pictorially on a graph, or, increasingly commonly, some statistics - like confidence intervals - all of which acknowledge that although we have managed, with some effort, to obtain a result, we cannot possibly know it with absolute, final precision. We just might be wrong. And even when we think we're right about our results...we might still be wrong. Our instruments are limited by the way they are constructed - and this introduces an uncertainty. A ruler calibrated in centimetres is not much use for measuring the width of a human hair. But even if we use it to measure the height of a person, we can only measure to the nearest 1cm. Are they 164cm or 165cm? The reading might be closest to 164cm. But we cannot say, using such a ruler, that the person is 164.00000000cm. At best they are less than 164.5cm, since anything above that and our eyes would estimate the height as 165cm. In truth what we would say, given this one measurement, is that the person is (164+/-0.5)cm, indicating that at most they are 164.5cm tall and at least 163.5cm. But even here we might be wrong. After all, imagine we used an even better ruler and got a result of (164.535+/-0.005)cm - would this more precise result necessarily be more accurate (which is to say, closer to the truth)? No. For example: what if we simply forgot to ask the person to take their shoes off? All the precision instrumentation in the world and multiple trials cannot fix that. Only a better method, and an understanding of our underlying assumptions, can. And sometimes we just cannot easily know what all of these are until someone else points them out to us. Or we stumble across them ourselves.
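The reporting convention used above - quote the uncertainty to one significant figure, then round the value to the same decimal place - is mechanical enough to sketch in code. This is a hypothetical helper of my own, not anything from a standard library:

```python
import math

def round_result(value, uncertainty):
    """Round the uncertainty to one significant figure, then round the
    value to the same decimal place -- a common reporting convention."""
    # Exponent of the uncertainty's leading digit, e.g. 0.5 -> -1
    exp = math.floor(math.log10(abs(uncertainty)))
    unc_rounded = round(uncertainty, -exp)
    val_rounded = round(value, -exp)
    return val_rounded, unc_rounded

# A height read from a centimetre ruler: at best +/- half a division.
print(round_result(164.0, 0.5))   # -> (164.0, 0.5), i.e. (164 +/- 0.5) cm
```

Nothing deep here, but it makes the convention explicit: the uncertainty dictates how many digits of the value are worth reporting at all.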

Error analysis in physics can be a tricky thing. It can throw up some interesting trivialities and some cosmically significant anomalies. Take an ever so slightly more complicated problem than measuring a height. Let's say we want to know how long a pendulum takes to oscillate (this is called the period). In such a case we may need to take many, many measurements - which is to say, in good science, we should take more than one set of measurements; indeed, the more the better. In physics, pendulums can be used to find the acceleration due to gravity. To measure the period we might use a stopwatch. Usually, with basic equipment like that, what we do is record how long something like 10 oscillations takes, then divide the final result by 10. We'll ignore the division.

So perhaps, for 10 oscillations, it takes 20.32 seconds (s). Then we do it again and this time we get 21.10s. Then 19.83s. Now we have a range. Why shouldn't all the results be the same? Because people who operate stopwatches are inconsistent. Also, physical things like pendulums can be a bit unpredictable if we aren’t very careful: air currents might affect things, and so on. So with our 3 data points we take an average (it’s 20.42s). And the uncertainty? Well, then it becomes more complicated. One method is to take the highest value (21.10s), subtract the lowest (19.83s) and divide the result by 2. That produces an uncertainty of about 0.64s, and so we might quote the result of this investigation as: the time for 10 oscillations is (20.42 +/- 0.64)s, which would be better written with rounded numbers as (20.4 +/- 0.6)s. If we have larger data sets, sometimes this method produces an uncertainty which is too small (because some actual data lies outside the quoted uncertainty). For example, if I include one additional data point of 21.70s, then my average becomes 20.74s and my uncertainty becomes 0.94s (using the same method as before). But now, given that uncertainty, the quoted range runs from (20.74 - 0.94) = 19.80s up to a maximum of (20.74 + 0.94) = 21.68s. Although my upper limit (including the uncertainty) is 21.68s, I actually have a data point of 21.70s. So there’s a problem. We might call that point an "outlier" (which some error analysis techniques say to ignore), but in this case it is the very presence of the outlier that increased the uncertainty in the first place. And we cannot definitively rule out that the outlier is closer to the truth.
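The half-range recipe is simple enough to check by hand, but a short sketch makes the outlier problem vivid. The timings are the three (then four) from above; the function name is my own:

```python
def mean_and_half_range(times):
    """Average a set of repeated timings, estimating the uncertainty
    as half the spread between the largest and smallest values."""
    mean = sum(times) / len(times)
    half_range = (max(times) - min(times)) / 2
    return mean, half_range

trials = [20.32, 21.10, 19.83]        # three timings of 10 oscillations, in s
mean, unc = mean_and_half_range(trials)   # mean ~ 20.42 s, unc ~ 0.64 s

trials.append(21.70)                  # one more, rather high, timing
mean, unc = mean_and_half_range(trials)   # mean ~ 20.74 s, unc ~ 0.94 s

# The highest data point now sits outside the quoted range itself:
print(max(trials) > mean + unc)       # True
```

That final `True` is the whole puzzle: the very point that widened the uncertainty is the one the quoted range fails to cover.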

The point here is that, because we do not have in hand access to ultimate ontological reality, the best we can do is repeat the measurement and make the working assumption that there are no so-called “systematic” errors. These are errors that are part of the method itself and that, typically, we do not know about even when we are being careful. For instance, in this case, just maybe there was some kind of fault in the stopwatch that always added an extra 0.5s to any reading, or maybe whenever the pendulum was released it was given an extra push and we did not realise. But surely "authoritative" experts in physics at the very top of their game would not make systematic errors? Well, they must, of course, because - as I said - often such errors cannot be known about, or else they would have been corrected. Let us come to this shortly.
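The difference between random and systematic error can be made concrete with a toy simulation. Everything here is invented for illustration: a "true" time of 20.00s, random scatter of up to half a second either way, and the faulty stopwatch imagined above that always reads 0.5s long:

```python
import random

random.seed(1)
TRUE_TIME = 20.00     # the "true" time for 10 oscillations, in s (made up)
STOPWATCH_BIAS = 0.5  # hidden systematic error: the watch always reads long

def one_reading():
    random_error = random.uniform(-0.5, 0.5)  # shaky thumbs, air currents...
    return TRUE_TIME + random_error + STOPWATCH_BIAS

# Averaging many trials beats down the *random* error...
readings = [one_reading() for _ in range(10_000)]
mean = sum(readings) / len(readings)

# ...but the mean still sits about 0.5s from the truth: no amount of
# repetition reveals, let alone removes, the systematic error.
print(round(mean - TRUE_TIME, 2))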

In the physical sciences - physics being the preeminent example - discoveries always (or should always!) include the uncertainty, or "margin of error" as it is sometimes called in other disciplines. The error analysis is absolutely crucial. The Higgs Boson was discovered in 2012 and the scientists placed into popular science culture the notion of “5-sigma” significance. This means that if there were no Higgs Boson - if the signal were due to background noise alone - a result at least this strong would show up by chance only about 1 time in 3.5 million. So many concordant measurements were made (all the data saying much the same thing) that if the whole experiment were repeated some 3.5 million times, we would expect a fluke this large no more than about once. Here’s a good popular article about the significance of “significance” and 5-sigma https://blogs.scientificamerican.com/observations/five-sigmawhats-that/ The 5-sigma standard was also used for gravitational waves in 2014 https://www.scientificamerican.com/article/gravity-waves-cmb-b-mode-polarization/ but notice that this article says it provides “proof” of cosmic inflation. Proof? As though now we have the final demonstration of the truth. A “proof” is better considered as a mathematical process* in which rules are followed to reach conclusions. It is no guarantee that something is true. And certainly evidence cannot do that. The purpose of evidence is to decide between theories already guessed, as I explain here. http://www.bretthall.org/general-relativity-and-the-role-of-evidence Even accounts of science get how science works wrong rather more often than not. They fall into that strange desire for "certainty", where results are "error free". But that itself is a terrible error. *I say "mathematical process" here but this does not mean it is not also, and crucially, a physical process. This fact is yet another reason why certainty is not possible - even in mathematics.
All proofs are completed by some "prover" - an object, be it a human brain, a calculator, pen and paper or some computer. The objects that do the "proving" in a mathematical proof obey the laws of physics. And our knowledge of those laws tells us that physical objects cannot be Platonic ideals. They are subject to errors and wear-and-tear (from the second law of thermodynamics). They are subject to uncertainty, because we cannot have perfect knowledge of the values that the quantities possessed by these objects take in our universe at any time, and such values, as a rule, are subject to change at any moment (this is from quantum theory). And our instruments cannot provide perfect precision, for related reasons (partly the subject of this entire post). So there are physical reasons why mathematical proof is fallible and as liable to lead us into error as any other useful mechanism for producing knowledge.
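As an aside, the "1 in 3.5 million" figure attached to 5-sigma can be recovered from the normal distribution with nothing beyond the standard library. A small sketch (the function name is mine; physicists conventionally quote the one-sided tail probability):

```python
import math

def one_sided_p(sigma):
    """Probability that a normally distributed background fluctuation
    lands at least `sigma` standard deviations above the mean."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p5 = one_sided_p(5)
print(f"5-sigma: p = {p5:.2e} (about 1 in {1 / p5:,.0f})")

# A 4.7-sigma result -- like the fine structure measurement discussed
# below -- is considerably less rare than the 5-sigma convention:
p47 = one_sided_p(4.7)
print(f"4.7-sigma: about 1 in {1 / p47:,.0f}")
```

Running this gives roughly 1 in 3.5 million for 5-sigma, and for 4.7-sigma a chance several times larger. The gap between the two matters more than "round it up to 5" suggests.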

So is the 5-sigma standard foolproof? Hardly! One of my favourite experiments of the last few decades was some work done at my old alma mater, the University of New South Wales, by some professors and old classmates of mine. It was ground-breaking stuff - about whether the fine structure constant had changed. (The fine structure constant tells us how strong the electromagnetic force in our universe is.) I wrote about that here http://www.bretthall.org/fine-structure.html Three papers were published, each more confident than the last, culminating in a 4.7-sigma significance (so let’s round that up to 5... ;) that the fine structure constant was indeed changing. See here: https://arxiv.org/pdf/astro-ph/0306483.pdf That paper was published in MNRAS (The Monthly Notices of the Royal Astronomical Society), perhaps second only to ApJ (The Astrophysical Journal) for "authority" and prestige in the world of astrophysics and astronomy. A fine structure constant that varies would mean that the very laws of physics are not “constant” across the universe. At least the strength of the fundamental forces would not be, and that’s very strange indeed. If the constants are different elsewhere in the universe…well…is it really the same universe? Anyway - such problems are the reason many of us get excited about science - especially physics and astronomy - because what, in heaven’s name, could explain such a thing? Some set of laws deeper than the laws of physics? Is it really the case that at the edge of the observable universe another universe might begin? Or that the laws of physics were different in the past, evolved over time to become "biofriendly" and will continue to evolve towards something else again? All of these huge questions turn on such results. And at almost (!) 5-sigma significance, the chance that this was a mere fluke seemed close to that 1 in 3.5 million.

Well, as it turns out, 4-sigma, 4.7-sigma or 5-sigma - it does not matter. We should always be prepared for a surprise in science. In this case there was indeed an error - as reported in this paper https://arxiv.org/abs/1409.1923 by most of the very same authors who wrote those first papers - and the error was “systematic”: it was caused, in the words of the authors, by “long-range distortions of the quasar spectra's wavelength scales which would have caused significant systematic errors in our…measurements". In other words, there was something they didn't realise was affecting their results. They, of course, admitted as much originally. But anyone not familiar with error analysis might not. They might see 5-sigma, or almost-5-sigma, and think almost-certainty. But it's nothing like that. It never is. Error is ubiquitous.

The lesson here is wonderful and deep. First: the researchers only ever claimed small effects - that the fine structure constant (alpha) had changed by only a tiny amount - and the amount by which it was claimed to have changed was itself quoted with a range. Then they claimed this measurement was confident at the 4.7-sigma level. Finally, they reported a systematic error that undid all their previous results. Science self-corrected, and these researchers did excellent science at every step of the way. And it was so much about the errors. Meantime, all of this produced new and exciting astrophysical techniques for uncovering the mysteries of the universe with better precision. People won’t make the same mistake again (or at least not as often, we hope) and progress will continue. You can read more from Michael Murphy, one of the lead researchers, about how even this error leads to important progress in astrophysics here.

The process of quantifying uncertainty is a very useful one for anyone interested in learning the physical sciences. A student might very well be interested in astrophysics - and that’s great. But it is difficult for a student to fully appreciate how some community has come to agree that the universe is accelerating its rate of expansion, or that dark matter exists, or that there is a black hole at the centre of our galaxy. A student assumes that the community has a very good procedure for “detecting and eliminating error”, as all physical sciences do. But how to really appreciate this? If you do some of this yourself - if you can understand exactly what the process involves by performing an investigation and reporting on your findings and the errors - then you get a real sense of the tentativeness of the scientific enterprise. If you truly want to be a physical scientist - and not primarily something like a philosopher of physical science, or someone who merely appreciates physical science, or a purely theoretical physical scientist - then sometimes you might need to make measurements, or understand how the measurements are made. It is in learning some of these techniques that you begin to understand that there is no authority and no certainty and certainly no perfection in science - but there can be more or less reliability, and better or worse techniques. There is no authority, in that a person who claims to have the numbers might have the numbers wrong. And no certainty, because our data-taking devices can make errors, and even when we think they do not, they cannot be perfectly precise. All of this taken together means our final results in the sciences are “uncertain” to some degree, even if “useful” and “closer to the truth” than some other number.

Learning how uncertainties work, how they are calculated and what causes error allows us to think critically about science. It’s a skill that means we need never defer to authority; instead we can judge those who claim authority and expertise by our own standards of error correction. Is the expert making a claim we can check? What method did the expert use? Did that method report the sources of error and the degree of uncertainty? How was it done? If you can learn something about all this in physics - the most precise of all the physical sciences - then you will understand that if even there things are error prone, it is no stretch to realise that everywhere else errors are at least as common, and, as a rule, far less well reported upon.

Everything is prone to error and uncertainty. Elections can throw up errors, as it turns out, but the citizens have a means of correcting that error at the very next election for that is the purpose of democracy. Detecting and correcting errors is what allows progress in every domain. It’s the very thing that underpins the ability of reason to reach into all areas of human intellectual endeavour and improve them. What a marvellous thing correcting errors and being in a state of uncertainty is. And for anyone learning science, most especially, it's crucial to get a grip on the fact you might be wrong, and how...and how to tell everyone else just how wrong you might be and have been.