
This is largely a reference request, but supplementary explanations are welcome.

I describe my thoughts on the paradox of analysis here.

I recently tried to derive the form of first-order logic more or less a priori (and got side-tracked).

My hypothesis was that first-order logic is an inevitable structure that emerges when you try to specify a set of rules for “valid reasoning”. One’s train of thought could go something like this:

Is there a most conservative thing I can say about the nature of “valid argumentation”? I’m not sure there’s a single most indubitable place to start. I can observe that sometimes when people say something, I tend to ‘agree’ or ‘disagree’, but sometimes neither. Some statements, like ‘3 + 3 = 6’, strike me as ‘true’ in a very strong sense - but at the same time, I feel that this is a certain ‘kind’ of truth, not ‘true’ in the way other things are true. One reason is that they are, in the end, only symbols; I could break free from them and deny the statement’s truth if I chose to change the meaning of the symbols…

And so on. Instead of diving head-on into established ideas in logic, such as ‘a statement is either true or false; there is an implication symbol; implication is transitive’, we are trying not to close any doors before we have explored where they lead.

I was thinking a generalization I felt comfortable with looked like this:

some kind of “thing”


+--------------------+
|  Units of meaning  |
+--------------------+
| statement          |
+--------------------+
| sentence           |
+--------------------+
| idea               |
+--------------------+
| claim              |
+--------------------+
| intuition          |
+--------------------+
| feeling            |
+--------------------+
| thing              |
+--------------------+
| opinion            |
+--------------------+
| fact               |
+--------------------+
| truth              |
+--------------------+
| suggestion         |
+--------------------+
| proposition        |
+--------------------+
| notion             |
+--------------------+
| …                  |
+--------------------+

some kind of “relationship”



+----------------------------------------------+
| Ways those things might relate to each other |
+----------------------------------------------+
| suggests                                     |
+----------------------------------------------+
| implies                                      |
+----------------------------------------------+
| is similar to                                |
+----------------------------------------------+
| relates to                                   |
+----------------------------------------------+
| goes together with                           |
+----------------------------------------------+
| fits in with                                 |
+----------------------------------------------+
| …                                            |
+----------------------------------------------+

I have tried not to make any “ontological commitments”. Everything is still open-ended. We can choose any of the above, and develop it with further “tentative considerations”. I could consider the case where one of the above relations is transitive:

If A being True implies B being True, and if B being True implies C being True, I will hold the tentative consideration that therefore A being True implies C being True.

Or, I could consider a case where a relation is not transitive:

There are cases where I might feel that A and B go together well, and where B and C go together well, but then I no longer feel that A and C go together well in the same way.

Either of the above increases the amount of information in the system. You could say it increases “definition” or “resolution”, like a high-resolution camera. In the beginning there are fewer rules, so “anything goes”; the more rules we add, the clearer it becomes when something violates them. The first example (above) is almost the axioms of category theory; we just need to add a few more ideas.

If we ever felt discontent with how a chain of hypothetical considerations evolved, we could go back and take some other basic conceptual elements as the starting point from which to derive a formal system. But we are also free to choose “all of the above” and see how they interact. (This reminds me of the recent idea of a “multi-way system”.)

I think of this as a ‘thinking methodology’ one could call “coherentizing”, a possible solution to the paradox of analysis. In a way, you begin from the murkiest, vaguest sketches of ideas, meanings, rules, claims, statements, or concepts. You try, in an inoffensive way, to develop them in minorly incontestable ways. Each time you do, you can see what the emergent system is like. Is it “self-coherent”? Does it act in the way you wanted it to? I would like to know about philosophical literature that basically takes the philosophy of “coherentism” and turns it into a methodology of thinking.
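To make the “resolution” point concrete, here is a minimal sketch (in Python; the function name and the toy relations are hypothetical, chosen only for illustration): treat a “relationship” as a set of ordered pairs over unspecified “units of meaning”, and test whether a tentative rule such as transitivity holds for it.

```python
def is_transitive(pairs):
    """Return True if the relation contains (a, c) whenever it
    contains both (a, b) and (b, c)."""
    rel = set(pairs)
    return all((a, c) in rel
               for (a, b) in rel
               for (b2, c) in rel
               if b == b2)

# "implies" behaving transitively, as in the first example above:
implies = {("A", "B"), ("B", "C"), ("A", "C")}

# "goes together with" failing transitivity, as in the second example:
goes_with = {("A", "B"), ("B", "C")}  # deliberately missing ("A", "C")

print(is_transitive(implies))    # True
print(is_transitive(goes_with))  # False
```

Adopting a rule like transitivity adds information in exactly the sense above: it gives a concrete way for a candidate system to be caught violating it, whereas with no rules adopted, nothing can be a violation.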

  • It sounds like Jackson's version in From Metaphysics to Ethics; see SEP: "On Jackson’s view, the role of conceptual analysis is to make explicit our ‘folk theory’ about a given matter, elucidating our concepts by considering how individuals classify possibilities."
    – Conifold
    Commented Apr 9 at 20:03
  • For "My hypothesis was that first-order logic is an inevitable structure that emerges when you try to specify a set of rules for “valid reasoning”", see Curtis Franks's "logical nihilism": "Suppose that we are interested in detecting and understanding whatever relationships we can find. Then we might wish not to be wedded to any point of view. We might, instead, try on a few hats until some interesting patterns appear where before there seemed to have been only disorder. We might find that one hat helps time and again, but we will be well-advised not to forget that we are wearing it. [...]
    – ac15
    Commented Apr 9 at 20:11
  • For if we never take it off, then we risk forever overlooking logical relationships of considerable interest. Worse, we risk coming to think of the relationships we can detect as “in the world,” “preconditions of thought,” or some such thing."
    – ac15
    Commented Apr 9 at 20:11

1 Answer


You ask:

Have any philosophers discussed “coherentizing” as a solution to the “paradox of analysis”?

As far as I can see, the paradox of analysis is closely related to the observation that deduction may show relationships among propositions and demonstrate their logical coherence, but it does not produce new information. See Conifold's take (PhilSE) on Hintikka's scandal of deduction (Springer) here. The attempt to extend deductive calculi into methods of providing new information is a fundamental misapprehension, one which confuses creating new propositions with creating new concepts. Philosophers seem to confuse all sorts of things about coherence, inferential certainty, propositions, and the generation of belief.

As such, deduction and coherentism, as the classical rationalists argue, are certainly great tools for moving towards a more rational set of beliefs, both globally and within subsets of beliefs, but coherentism doesn't inherently make new information. If you take coherentism as the IEP does, there are three aspects worth considering: logical consistency, explanatory relations, and various inductive (non-explanatory) relations. But it is largely the first that one equates with what it means for a passage to be coherent. Logical consistency is the backbone of the methods of classical rationalism and is rooted in human reason and rationality. My go-to on that topic is The Architecture of Reason by Robert Audi (GP), who also has the definitive contribution to the Oxford Handbook of Reason. But Audi puts coherentism in its appropriate place among the "superstructure" he erects to tackle the explication of reason and rationality.

Audi addresses coherence early on in Section 4, Chapter 1 (p. 24) and revisits it in Section 7, Chapter 2. There aren't any clean quotations that jump out in the context of this question, but he roughly makes the case that coherence and incoherence have distinct roles in the production of knowledge: coherence does not produce knowledge, but rather accompanies it from other sources, which for Audi include perception, memory, consciousness, and testimony. Incoherence allows us to reject beliefs from our knowledge, but coherence is necessary yet NOT sufficient for admitting beliefs. This is because knowledge generation is not the manipulation of propositions, but rather the experiences which prefigure our being able to state and rationally vet them. Thus coherentism accompanies the generation of knowledge, but it does not alone entail it. This is part of his project of creating a "well-groundedness conception of justification", which marginalizes the role of coherence in justification but still admits its utility, particularly in recognizing "conceptual coherentism" and the relevance of having concepts that are semantically coherent. (This would be the book I would consider the primary recommendation.)

As for the relationship between logical coherence and semantics and truth more broadly: without a doubt, first-order logic is an abstraction of natural language ontology and reasoning. It selects a subset of the semantics we use to reason about the world and formalizes it in a syntax. Such an activity has come to be termed formal semantics. From WP:

Formal semantics is the study of grammatical meaning in natural languages using formal tools from logic, mathematics and theoretical computer science. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. It provides accounts of what linguistic expressions mean and how their meanings are composed from the meanings of their parts. The enterprise of formal semantics can be thought of as that of reverse-engineering the semantic components of natural languages' grammars.

This is exactly what you're doing when you try to map out natural language expressions that have semantic import in logic, and it blurs the line between natural grammars and formal grammars by looking for isomorphisms between them. But appealing to this project as a source of generating new knowledge mistakes the production of propositions and propositional attitudes for the actual experiences that lead to their production. Carnap was obviously returning to Kant's analytic-synthetic divide in his quixotic quest to draw a neat line between observation statements and speculative, meaningless metaphysical statements, but Quine's attack on the divide, differentiating between the salva veritate of cognitive synonymy (see Two Dogmas) and normal paraphrasing and linguistic synonymy, I think makes the case that information generation, even in complex natural languages, does not rely on deductive processes.
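As a toy illustration of the compositional picture in the quoted passage (the fragment, the model, and all names below are my own invention, not from any cited source), one can assign denotations to atomic sentences and compute the meaning of a complex sentence from the meanings of its parts:

```python
# A tiny "model": which atomic sentences are true.
model = {"it_rains": True, "it_snows": False}

def denote(expr):
    """Evaluate a nested-tuple parse tree against the model.

    Atomic sentences are strings; complex expressions are tuples
    whose first element names a connective.
    """
    if isinstance(expr, str):  # atomic sentence: look up its denotation
        return model[expr]
    op, *args = expr
    if op == "not":
        return not denote(args[0])
    if op == "and":
        return denote(args[0]) and denote(args[1])
    if op == "or":
        return denote(args[0]) or denote(args[1])
    raise ValueError(f"unknown connective: {op}")

# "it rains and it does not snow"
print(denote(("and", "it_rains", ("not", "it_snows"))))  # True
```

The design point is compositionality itself: the denotation of each complex expression is a function of the denotations of its parts, which is the "reverse-engineering" of natural language semantics that the quoted passage describes, scaled down to a two-sentence fragment.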

Thus, trying to place deductive reasoning at the core of ontological production does not reflect how actual natural language ontology (SEP) evolves. A more adequate model for the generation of information can be found in Sellars, and perhaps, in the extreme, McDowell, where there is a complex interplay between empirical experience and rational language use (both in conceptual coherence and in the use of intuition for the production of novel theory-laden propositions). In other words, the production of knowledge lies in justification, and justification is far more than coherence. The end result is that the productivity of conceptual analysis lies not so much in vetting logical coherence or manipulating language to produce deductively valid propositions, but in using the spontaneity of perception, memory, consciousness, and testimony to produce novel propositions and subject them to reason that is also abductive and inductive.

At least, that's my understanding of the interplay of the giants I have cited. I'm always open to feedback!

  • I noticed your recent comment here. Next time, please do not upvote posts without checking the mathematics (which in this case is completely bogus). The 'detailed explanation' in that post shows beyond any doubt that the poster totally does not understand the incompleteness theorem, nor the diagonal lemma. Please delete your comment on that thread, and feel free to ask me if you want to know the correct mathematics. See you around!
    – user21820
    Commented Apr 14 at 5:28
  • @user21820 Wrong party, bub. Didn't contribute anything.
    – J D
    Commented Apr 15 at 14:18
  • Sorry the link to your comment didn't seem to work, but the thread is correct; you commented on Bumble's post saying "+1", but it is totally wrong.
    – user21820
    Commented Apr 16 at 10:11
