Showing posts with label continuity. Show all posts

Monday, April 7, 2025

Information Processing Finitism, Part II

In my previous post, I explored information processing finitism (IPF), the idea that nothing can essentially causally depend on an infinite amount of information about contingent things.

Since a real-valued parameter, such as mass or coordinate position, contains an infinite amount of information, a dynamics that fits with IPF needs some non-trivial work. One idea is to encode a real-valued parameter r as a countable sequence of more fundamental discrete parameters r1, r2, ... where ri takes its value in some finite set Ri, and then hope that we can make the dynamics be such that each discrete parameter depends only on a finite number of discrete parameters at earlier times.

In the previous post, I noted that if we encode real numbers as Cauchy sequences of rationals with a certain prescribed convergence rate, then we can do something like this, at least for a toy dynamics involving continuous functions on the interval [0,1]. However, an unhappy feature of the Cauchy encoding is that it’s not unique: a given real number can have multiple Cauchy encodings. This means that on such an account of physical reality, physical reality has more information in it than is expressed in the real numbers that are observable—for the encodings are themselves a part of reality, and not just the real numbers they encode.
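The non-uniqueness is easy to exhibit concretely. A minimal sketch, assuming for illustration a prescribed convergence rate of |ri − r| ≤ 1/2^i (the post does not fix a particular rate): two rational sequences that disagree at every term nonetheless encode the very same real number.

```python
from fractions import Fraction

# Two Cauchy encodings of the real number 1/2, each meeting the
# illustrative convergence rate |r_i - 1/2| <= 1/2**i.
below = [Fraction(1, 2) - Fraction(1, 2**i) for i in range(1, 9)]  # approach from below
above = [Fraction(1, 2) + Fraction(1, 2**i) for i in range(1, 9)]  # approach from above

# Both sequences meet the prescribed rate...
for i, (b, a) in enumerate(zip(below, above), start=1):
    assert abs(b - Fraction(1, 2)) <= Fraction(1, 2**i)
    assert abs(a - Fraction(1, 2)) <= Fraction(1, 2**i)

# ...yet they disagree at every term, so the encoded real (1/2)
# underdetermines which encoding is physically realized.
assert all(b != a for b, a in zip(below, above))
```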

So I’ve been wondering if there is some clever encoding method where each real number, at least between 0 and 1, can be uniquely encoded as a countable sequence of discrete parameters such that for every continuous function f from [0,1] to [0,1], the value of each discrete parameter corresponding to f(x) depends only on a finite number of discrete parameters corresponding to x.

Sadly, the answer is negative. Here’s why.

Lemma. For any nonempty proper subset A of [0,1], there are uncountably many sets of the form f−1[A] where f is a continuous function from [0,1] to [0,1].

Given the lemma, without loss of generality suppose all the parameters are binary. For the ith parameter, let Bi be the subset of [0,1] where the parameter equals 1. Let F be the algebra of subsets of [0,1] generated by the Bi; this is countable. Any information that can be encoded by a finite number of parameters corresponds to a member of F. Fix a nonempty proper A ∈ F, and suppose that for each continuous f, whether f(x) ∈ A depends on a finite number of the parameters of x. Then there is a C ∈ F such that x ∈ C iff f(x) ∈ A, i.e., C = f−1[A]. Since this holds for every continuous f, by the lemma F has uncountably many members, a contradiction.

Quick sketch of proof of the lemma: The easier case is where either A or its complement is non-dense in [0,1]—then piecewise linear choices of f do the job. If A and its complement are both dense, let (an) and (bn) be sequences decreasing to 0 such that an and bn are both within 1/2^(n+2) of 1/2^n, but an ∈ A and bn ∉ A. Then for any set U of positive integers, there is a strictly increasing continuous function fU such that fU(an) = an if n ∈ U and fU(bn) = an if n ∉ U, while fU(an) ∉ A if n ∉ U and fU(bn) ∉ A if n ∈ U (possible because the complement of A is dense). Note that fU−1[A] contains an if and only if n ∈ U and contains bn if and only if n ∉ U. So for different sets U, fU−1[A] is different, and hence there are continuum-many sets of the form fU−1[A].
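The obstruction already bites for the most natural unique encoding, binary digits. Here is a small sketch of my own (not from the post), using the continuous function f(x) = x + 1/3 (capped at 1 so that it maps [0,1] to [0,1]) near x = 1/6: two inputs that share their first twenty binary digits are sent to opposite sides of 1/2, so no finite number of digit-parameters of x settles even the first digit-parameter of f(x).

```python
from fractions import Fraction

def f(x):
    # Continuous on [0,1]; agrees with x + 1/3 below 2/3.
    return min(x + Fraction(1, 3), Fraction(1))

def first_binary_digit(x):
    return 1 if x >= Fraction(1, 2) else 0

n = 20
# x_low and x_high share their first n binary digits (the truncation of 1/6 = 0.0010101...):
prefix = Fraction(int(Fraction(1, 6) * 2**n), 2**n)
x_low = prefix                              # continue with a tail of 0s: x just below 1/6
x_high = prefix + Fraction(31, 2**(n + 5))  # continue with the tail 11111: x just above 1/6

assert first_binary_digit(f(x_low)) == 0    # f(x_low) lands just below 1/2
assert first_binary_digit(f(x_high)) == 1   # f(x_high) lands just above 1/2
```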

Tuesday, December 3, 2024

Continuous variation

Some arguments against restricted composition—the view that some but not all pluralities compose a whole—are based on the idea that a feature that cuts reality at the joints, such as composition, cannot be vague, and that if composition is restricted, one can have a continuous series of cases from a case of composition to a case of lack of composition.

But now suppose I owe you ten dollars. Then there is a continuous series of cases where the amount I pay you ranges from zero to $20. The properties wrong, right and supererogatory cut nature at the joints. But as my payment moves from $9.99 to $10.00, it switches from wrong to right, and as it hits $10.01, it switches from merely right to supererogatory. So, one can have a case where the presence of joint-cutting features depends on something that varies continuously. And there is no vagueness: if I pay less than $10, I definitely wrong you; if I pay $10 or more, I definitely do right.

Sunday, September 25, 2022

A strict propriety argument for probabilism without any continuity assumptions

Here’s an accuracy-theoretic argument for probabilism (the thesis that only probabilities are rationally admissible credences) on finite spaces that does not make any continuity assumptions on the scoring rule. I will assume all credence functions take values in [0,1].

  1. All probabilities are rationally admissible credences.

  2. If any non-probabilities are rationally admissible, then all non-probabilities satisfying Normalization (the whole space has credence 1) and Subadditivity (P(A ∪ B) ≤ P(A) + P(B) when A and B are disjoint) are rationally admissible, with the appropriate prevision given by a level set integral [correction: actually, I need LSI, not the version of LSI in the earlier blog post].

  3. A rationally appropriate scoring rule s satisfies strict propriety for all rationally admissible credences with an appropriate prevision: if Vu is the appropriate prevision for u, then Vu(s(u)) is better than Vu(s(v)) whenever u and v are different rationally admissible credences.

  4. There is a rationally appropriate scoring rule.

But now we have a cute theorem:

  • On any finite space Ω with at least two points, no scoring rule satisfies strict propriety for the credences with Normalization and Subadditivity and level set integral prevision.

It follows that no non-probabilities are rationally admissible.
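For a concrete sense of what strict propriety demands, here is a standard textbook illustration of my own, separate from the level-set-integral machinery of the argument: on a two-point space, the Brier score is strictly proper for probabilities, i.e. a probability u expects its own score (lower is better) to beat that of every other credence.

```python
# Brier (inaccuracy) score of credence v = (v0, v1) when outcome i obtains.
def brier(v, i):
    return sum((v[j] - (1 if j == i else 0)) ** 2 for j in range(2))

# Prevision (expected score) of credence v by the lights of probability u.
def expected_score(u, v):
    return sum(u[i] * brier(v, i) for i in range(2))

u = (0.3, 0.7)  # a probability on a two-point space
grid = [(a / 100, b / 100) for a in range(101) for b in range(101)]

# Lower Brier score is better; u uniquely minimizes its own expected score.
best = min(grid, key=lambda v: expected_score(u, v))
assert best == u
```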

Is this a good argument? I find (2) somewhat plausible—it’s hard to think of a less problematic weakening of the axioms of probability than from Additivity to Subadditivity, and I have not been able to find a better prevision than the level set integral one. Standard arguments for probabilism assume strict propriety for all probabilities. But it seems to me that a non-probabilist will find strict propriety for all probabilities plausible only insofar as they find strict propriety for all admissible credences plausible. Thus (3) is dialectically as good as the usual strict propriety assumption.

I think the non-probabilist’s best way out is to deny strict propriety or to deny that there is a rationally appropriate scoring rule. Both of these ways out work just as well against more standard arguments for probabilism, and I think both are good ways out.

Technically speaking, the advantage of this argument over standard arguments for probabilism is that it makes no assumptions of continuity.

Friday, September 23, 2022

Discontinuous epistemic utilities

I used to take it for granted that it’s reasonable to make epistemic utilities be continuous functions of credences. But this is not so clear to me right now. Consider a proposition really central to a person’s worldview, such as:

  • life has (or does not have) a meaning

  • God does (or does not) exist

  • we live (or do not live) in a simulation

  • morality is (or is not) objective.

I think a case can be made that if a proposition like that is in fact true, then there is a discontinuous upward jump in epistemic utility as one goes from assigning a credence less than 1/2 to assigning a credence more than 1/2.

Wednesday, July 17, 2019

Continuous choices

Suppose that at noon Alice is relaxed in an armchair listening to music, but at any given time she is capable of choosing to get up, walk over to the kitchen, and make herself a sandwich for lunch, which it is now time for. For fifteen minutes she continues listening to the music and then gets up at 12:15. It seems that she is continually responsible for continuing to sit until 12:15, and then she is responsible for getting up.

Here is one realistic question about what happened between 12:00 and 12:15:

  1. Did Alice make a vast number of choices, one at every moment until 12:15, to remain seated, and then at 12:15 a choice to get up?

In favor of a positive answer, it is difficult to see how she could be responsible for not getting up at a given time if she did not choose not to get up.

But a positive answer seems psychologically implausible. Indeed, it doesn’t seem like Alice would be enjoying the music if every moment she had to positively choose to stay.

Also, let’s think about what the reasons weighing in on each choice would be. On the one hand, there is a very weak reason to get up now. It’s a weak reason because getting up the next moment would be just as good hunger-wise. On the other hand, there is a very weak reason to keep sitting in order to enjoy the music between this moment and the next. It’s a weak reason because the amount of music involved is very small. Choices on the basis of such very weak reasons are hard to make, and the reasons would be hard to weigh against each other. And when making choices between hard-to-weigh reasons, it seems that the chances of going for either option should be of the same order of magnitude. But if Alice were to make a vast number of choices between getting up and staying between, say, 12:00 and 12:10, with each choice having roughly the same order of magnitude of probability, then it would be very unlikely that all these choices were choices to stay.
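The arithmetic behind that last step, on illustrative assumptions of my own (one choice per second, and a 1/2 chance of staying at each choice):

```python
p_stay = 0.5             # each option has the same order-of-magnitude probability
n_choices = 10 * 60      # one choice per second from 12:00 to 12:10
p_all_stay = p_stay ** n_choices
print(f"{p_all_stay:.1e}")  # about 2.4e-181: astronomically unlikely
```

Even with a stay-probability of 0.99 per choice, staying through all 600 would have probability around 0.002, so the conclusion is robust to the exact numbers.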

I find the responsibility argument pretty persuasive, though. Maybe, though, the right story that balances psychological plausibility with intuitions about responsibility is this: Alice made a small number of choices between 12:00 and 12:15. Most of these choices were a choice whether to think harder about whether to get up or just let the status quo continue “for a while”. Most of the time, she chose just to let the status quo roll on. At a time t during which the status quo was “just rolling on”, Alice’s responsibility for not getting up was derivative from her choice to stop thinking about the question. Sometimes, however, Alice decided to think harder about whether to get up. Finally, she thought harder, and got up.

Since the number of choices is smaller on this story, it doesn’t interfere as much with the enjoyment. There is some interference, but that’s realistic. And since the number of choices is smaller, the probabilities of each option can be of the same order of magnitude without this creating any problems.

Now, prescinding from the realism behind the discussion of (1), we can ask the also interesting question:

  2. Could it be that both (a) time is continuous and (b) Alice literally makes a choice to remain seated at every single moment of time between 12:00 and 12:15?

The answer, I think, is negative. For consider a choice at t. Alice would be choosing between the good of slightly more music and the good of slightly earlier relief of hunger. But how long are the “slightly more” and “slightly earlier”? Zero temporal length! For if time is continuous, and Alice is choosing at every moment, zero length of time elapses between choices. Indeed, there is no sense to the idea of “between choices”. So Alice would be choosing between zero-value goods. And that doesn’t make rational sense.

Sunday, March 13, 2016

Times that never become present

Could there be times that are never present? At first sight, this seems a contradiction: surely, each time t is present at itself. Given the B-theory of time, this indeed is automatically true.

Not so, however, for A-theories. There is no contradiction in the growing block growing by leaps and bounds. Imagine that suddenly a whole minute is added to the growing block. The times in the middle of that minute never got to be at the leading edge of reality, and hence never got to be present, since to be present is to be at the leading edge of reality, given growing block. Or consider the moving spotlight: the spotlight could jump ahead in the spacetime manifold by a minute or an hour or a year, skipping over the intervening bits of the manifold. It's less clear whether it is possible to have times that aren't ever present given presentism. Still, Dean Zimmerman has considered an eccentric version of presentism on which there still is a four-dimensional spacetime manifold. On such a view, times could be identified with hypersurfaces in some preferred foliation, and there might be some such hypersurfaces that never become present.

So, apart from the B-theory and many versions of presentism, we have a possibility of times that are never present. Why would we want to countenance such a nutty option, though?

I can think of two reasons. The first would be to reconcile Aristotle's theory of time with many physical theories. According to Aristotle, times are endpoints of changes, and any interval of time contains at most finitely many changes, so that time is discrete. (Causal finitism might be a reason to adopt such a theory.) But in many modern physical theories, from Newton at least through Einstein, time is a continuous coordinate. One can try to reconcile the two views by supposing that time is continuous, as Newton and Einstein suppose, but that only those times which are the endpoints of changes are ever present. Aristotle then may be right that times are discrete, as long as we understand him to be speaking only about the times that matter, namely those that ever become present.

The second motivation would be to have a flash ontology--an ontology on which physical things exist only during the discrete moments of quantum collapse--while softening the counterintuitive consequence that at most times the universe is empty. For we could identify the times that ever become present with the times at which a flash occurs. Then even if at most times, in the broad sense of the word "times", the universe is empty, still the universe is non-empty at all the times that matter, namely at all the times that become present.

Neither a B-theorist nor a standard presentist can suppose times that are never present. But she might still suppose something that plays a similar functional role. She could think of abstract times as numbers or as hypersurfaces in an abstract continuous manifold. Then real time could be discrete, while abstract time is continuous.

Wednesday, October 30, 2013

The vagueness argument against restricted compositionality

Lewis and Sider have argued that if restricted compositionality is true—some but not all pluralities of two or more objects compose a whole—then there will be cases where it's vague how many objects there are. For instance, imagine two universes, A and B, each with the same finite set of n particles with the same intrinsic properties. But in A, the particles are neatly arranged into galaxies, trees, tables, buildings, etc. And in B there is just a blooming buzzing confusion. If restricted compositionality holds, then, assuming there are no immaterial objects, universe B has exactly n or n+1 objects—it's just too messy to have any cases of composition, except perhaps for the universe as a whole (that's why it might be n+1 rather than n). But A is much like our universe, and so we would expect lots of cases of composition, and hence the number of objects will be a lot more than n+1, say n+m for some large m. However, we can now imagine a continuous sequence of universes ranging from A to B, differing continuously in how the particles are arranged. As we move along that continuous sequence, the number of objects will have to change from n+m down to at most n+1. But it is incredible that the object count should sharply change due to a very tiny shift in particle positions. Instead, the object count will at times be vague. But how many objects there are is a matter of which sentences using universal quantification, conjunction, negation and identity are true. But quantification, conjunction, negation and identity are not vague. So we have vagueness where we cannot have vagueness.

There may be some technical problems with the argument as I formulated it, given the assumption of no immaterial objects. Maybe we can't do without immaterial entities like God or numbers. One could reformulate the argument to restrict the counting to material entities, but "material" might actually be a vague term. Perhaps the best thing to do is to assume that these universes have no immaterial contingent entities, and then just count contingent entities. Contingency shouldn't be a vague matter, after all. The Aristotelian may balk at this. For it may well be that a necessary condition for a bunch of material entities to compose a whole is that they have a form, and forms are immaterial but contingent. Maybe, though, "form" is not vague, and so we can just count the contingent non-forms.

But talking of forms suggests a more serious difficulty. If there are Aristotelian forms, then how many material objects there are may well not supervene on how material objects are spatiotemporally arranged and what intrinsic properties they have. For objects to come to compose a whole, there must come into existence a form. There is nothing absurd about there being sharp laws of nature specifying under which precise conditions a form comes into existence. There is no need for the laws of nature to be continuous (and the possibility of fundamental discreteness is empirically still open). Or perhaps God decides on a case-by-case basis whether to create a form. Then there is no vagueness as to how many material objects there are: the number of material objects equals the number of forms of material objects that in fact inform some matter (the souls of the resurrected are forms of material objects but temporarily fail to inform any matter). Of course in transitional cases we won't have much confidence whether some objects compose a whole, but that's just because we are unable to see forms except through their normal effects.

Monday, June 1, 2009

Degrees of consciousness

There is one sense in which our consciousness, say of our surroundings, can vary continuously: the content of the consciousness can vary pretty much continuously in terms of the level of detail and discrimination. After waking up, I come to be aware more determinately of what is around me, after all. This is an uninteresting (from the point of view of my present interest) form of continuous variation of consciousness. The interesting form of continuous variation would be where the content is fixed, but somehow the degree of consciousness varies. It is hard to imagine what fixed-content variation in the level of consciousness would be like. I would be aware of exactly the same detail, but differently. More vibrantly? More focusedly or more concentratedly? Focus and concentration are a promising start. But it seems to me that less concentrated or focused consciousness does at least tend to have a lower level of detail of content. When I concentrate on a part of the visual field, I see more detail there.