Two Cognitive Cases


This is the first draft of an article I'm developing; I appreciate any comments and corrections, which can be sent as responses.

The study of cognition can benefit from many directions, and people from areas as separate as mechanical engineering, artificial systems, and psychology show us that. In fact, from Gödel's theorem to dynamical systems to molecular genetics, each has made some kind of contribution to the understanding of mind. I want to present two new contenders here, to join the group of things that are useful for understanding cognition, and for understanding in general: Recency-Biased Information Selectivity and Happiness.

First I want to talk about selectivity. If one wants to understand more and be able to accomplish more, it is highly likely that one will have to learn (the other options being to invent, or to discover). We learn more than we invent because it is cheaper, in economic terms. Our cognitive reliance on learning is paralleled in the South Asian countries, whose development is founded on copying technologies developed in high-tech countries. Knowledge is not a rival good; that is, the fact that I have it does not prevent you from having it too. Knowledge, Newton aside, has nothing to do with apples.

So suppose our objective is to learn the most in the least time, and to be able to produce new knowledge in the least time. There is nothing more cognitive than increasing our descriptive and procedural knowledge in reasonable time. The first thing one ought to do is to twist the idea of learning upon itself and start learning about learning. There are many ways to improve learning that can themselves be learned. One can achieve higher efficiency by learning reading techniques. She could also learn to use different mental gadgets for the same topic (e.g., thinking of numbers as sounds if she usually thinks of them as written, and vice versa). She could also simply change her material tools: using a laptop instead of writing with a pen, writing in a different language to allow for different visual analogies, using her fingers to count. The borders between different cognitive tools are, of course, not clear. Writing in Chinese implies thinking through another grammatical scheme as well as looking at different symbols; one of these is more mental, the other more material, and both provide interesting cognitive connections to other concepts and thus improve thinking and learning. Another technique without clear frontiers is to use a cognitive enhancer. Coffee, the most widely used one, is a great enhancer. Except that it isn't: it works as a false alarm to the brain that everything is okay when it isn't, leading to dysrhythmia, anxiety, exhaustion, and so on. Modafinil is much better: healthier and less prone to causing tension. But are these mental or material gadgets? One thing is certain: they are part of the proof that the mind-body dichotomy has no bearing on reality. These are all interesting techniques for better learning, but I suggest they are not as powerful as selectivity.

Recency-Biased Information Selectivity is a pattern of seeking knowledge, what is informally called an "approach" to knowledge. A Recency-Biased Information Selector is a person with a particular pattern of behavior: as the name indicates, she looks for the solution to her problems mostly in the most recent publications she can find. That is, among all of her criteria for deciding whether to read something, whether to watch a video, or whether to join a dance group, being new ranks very close to the top. There are many reasons why this is a powerful technique, given our objectives. The first is the Law of Accelerating Returns, as proposed by Kurzweil (2005). According to it, the development of information technology is speeding up: we have an exponential increase in the amount of knowledge being produced, as well as in the amount of information being processed. This is Moore's law extended, and it can be extended to all levels of technological improvement, from the invention of the multicellular organism to genomic sequencing, from the invention of writing systems to powerful computing, and so on. Stephen Hawking (2001) points out that someone trying to keep up with everything being published in 2001 would have to move at some 145 kilometers per hour; this speed has probably doubled by now (2010). So information technology in general and knowledge in particular are speeding up. That means that if you cut two adjacent periods of equal size between now and the past, odds are high that the newer one contains more than twice the knowledge of the older one. If one were to distribute one's readings fairly among all there is to be read, one would already be shifted exponentially towards the present. So a fair distribution for obtaining knowledge is one that decreases exponentially towards the past. Let us say one reads 1000 pages, more or less three books, per month. If we divide time into 4 equal periods of, say, 20 years each, those pages should be divided according to the proportions 1 : 2 : 4 : 8. Now, 1x + 2x + 4x + 8x = 15x = 1000, so x ≈ 67, and we get the following distribution of pages:

67 to 1930 – 1950

133 to 1950 – 1970

267 to 1970 – 1990

533 to 1990 – 2010

In general: let S be the number of subspaces into which the division is made, and let T be the total number of pages to be read. The fair number of pages to dedicate to the Nth subspace, counting from the oldest, is given by the formula:

2^(N-1) · T / (2^0 + 2^1 + … + 2^(S-1)) = 2^(N-1) · T / (2^S - 1)
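To make the arithmetic concrete, here is a minimal Python sketch of the distribution just described. The function name and the base parameter are mine, chosen for illustration; base 2 reproduces the 1 : 2 : 4 : 8 split above, and base 3 corresponds to the stronger squeeze toward the present suggested later in the article.

```python
# A small sketch of the "fair" reading distribution described above (names are mine).

def pages_per_period(total_pages: float, subspaces: int, base: float = 2.0) -> list[float]:
    """Split total_pages across `subspaces` equal time periods, oldest first,
    giving each period `base` times as many pages as the one before it."""
    weights = [base ** n for n in range(subspaces)]   # 1, 2, 4, 8, ... for base 2
    unit = total_pages / sum(weights)                 # the "x" in 1x + 2x + 4x + 8x = 1000
    return [unit * w for w in weights]

print(pages_per_period(1000, 4))            # ≈ [67, 133, 267, 533], as in the list above
print(pages_per_period(1000, 4, base=3.0))  # [25, 75, 225, 675]: squeezed further toward the present
```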

Now, this is not how we usually reason. Since our minds are, in general, linear predictors, we suppose that fairness in acquiring knowledge would be to read the same amount for equal amounts of time. This is of course a mistake, a cognitive bias, meaning something engendered in our way of thinking in such a way that it leads us systematically to error. My first purpose is to make clear that the supposed wisdom of dividing cognitive pursuit equally through time is a myth. A first objection to my approach is that it is too abstract and highly mathematical: there are deep asymmetries between older and newer material that have not been considered, so one should not distribute her reading accordingly. Exactly! Let us examine those asymmetries.

First asymmetry: information flow. It is generally taken for granted by most people that the future has no influence on the past, whereas the past has influence on the future. More generally, an event X2 at time T2 will not influence an earlier event X1 at time T1, but might influence a later event X3 at time T3. Strictly speaking this is false, but we are allowed to make Newtonian approximations when dealing with the scales at which knowledge is represented, that is, paper scale, brain scale (Tegmark 2000), sound-wave scale, so it is true for our purposes. From this asymmetry of information flow it follows that what is contained in older knowledge could have influenced newer knowledge, but not the other way around. This is reason to take the fair distribution and squeeze it even more towards the present.

Second asymmetry: having survived long enough. This is the main objection I have seen against biasing towards the present. It consists in saying that the newest material has not yet passed through the filter of time (this could also be called the "it's not a classic" asymmetry) and is therefore more likely to be problematic. I have argued elsewhere that truthful memes are more likely to survive (Caleiro, forthcoming), and indeed this is a fair objection to the view I am proposing here. This asymmetry would make us stretch our reading back again. But in fact there is a limit to this filter: one has strong reasons not to read what came hot off the press (unless there are other factors in its favor), but few reasons not to read what has been in the meme pool for two years, for instance. The argument is strong and should be considered.

Third asymmetry: conceptual-scheme complexity. Recent material is embedded in a far more complex world and, in general, in a very complex scheme of things; that is, the concepts deployed are part of a complex web, highly sophisticated and deeply interacting. This web makes the concepts clearer, since they are more strongly interwoven with other concepts, theories, experiments, and so on. The same concept will usually have a much more refined conception today than it had two hundred years ago. Take the electron, for instance: we have learned enormous amounts about it, and the same word means much more today than it did. Even more remarkable is the refinement of fuzzy concepts like "mind", "cognition", "knowledge", "necessity", "a priori", and so on.

Fourth asymmetry: levels of meta-knowledge available. Finally we get to an interesting asymmetry, related to how many layers of scrutiny an area has passed through. In the early days we had "2+3·5+(9-3)" kinds of mathematics; then someone noticed we would do well with a meta-symbol for a given unknown number, so we had "x+3 = 2" kinds of mathematics; and someone else eventually figured out that a symbol could denote a constant, and we had "Ax+By+C = 0" kinds of mathematics. There are more layers, but the point is clear. Knowledge2, that is, meta-knowledge, depends on the availability of Knowledge1, and the same goes for meta-meta-knowledge, or Knowledge3. In psychology we first had some data from a few experiments with rats, then some meta-studies with many clusters of experiments with rats, then some experiments with humans, then meta-inter-species knowledge that allowed us to compare species, then some theories of how to achieve knowledge in the area, that is, epistemology of psychology, and so on. Now, it is usually impossible to create knowledge about something we have no data about, so there is no meta-knowledge without there first being knowledge. The number of layers is always increasing, for it is always possible to seek patterns at the highest level (though not always to find them!). More publications give us access to more layers of knowledge, and the more layers we have, the better our understanding.

These asymmetries give us the following picture: starting from the fair distribution, we should squeeze it a lot towards the recent past (some two years before the present) but resist the temptation to go all the way and start reading long-term-useless, unfiltered material like newspapers. A simple way to do that is to change the 2 in the general equation to a 3 (in the sketch above, this corresponds to setting the base to 3). Some interesting ideas on this topic of selective ignorance deserve mention:

"There are many things of which a wise man might wish to be ignorant."

Ralph Waldo Emerson

"Learning to ignore things is one of the great paths to inner peace."

Robert J. Sawyer – 2000

"What information consumes is rather obvious: it consumes the attention of its recipients. Hence, a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it."

Herbert Simon, Turing Award winner, Nobel Prize winner

"Just as modern man consumes both too many calories and calories of no nutritional value, information workers eat data both in excess and from the wrong sources."

"If you are reading an article that sucks, put it down and don't pick it back up. If you go to a movie and it's worse than The Matrix Revolutions, get the hell out of there before more neurons die. If you're full after half a plate of ribs, put the damn fork down and don't order dessert."

Timothy Ferriss – 2007

I have shown the names of those I am quoting because this points to one of the exceptions to the no-hot-off-the-press rule: the argument from authority. The argument from authority is fallacious in its usual form:

Source A says that p.
Source A is authoritative.
Therefore, p is true.

But it is reasonable in its Bayesian form ("~" is the symbol for "not"):

Source A says that p. Source B says that ~p.
Source A is authoritative. Source B isn’t.
Therefore, it is rational to consider that p is more likely to be true until further analysis.
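As a toy illustration of this Bayesian reading, here is a short Python sketch. The function and the reliability numbers are invented for the example, not taken from any source; the point is only that conflicting testimony from an authoritative and a non-authoritative source should shift a 50/50 prior toward the authority, pending further analysis.

```python
# Toy Bayesian treatment of "A says p, B says ~p" with invented reliabilities.

def posterior_p(prior: float, p_a_correct: float, p_b_correct: float) -> float:
    """P(p | A asserts p, B asserts ~p), assuming the two sources err independently."""
    like_if_p = p_a_correct * (1 - p_b_correct)       # testimony pattern if p is true
    like_if_not_p = (1 - p_a_correct) * p_b_correct   # testimony pattern if p is false
    return prior * like_if_p / (prior * like_if_p + (1 - prior) * like_if_not_p)

print(posterior_p(prior=0.5, p_a_correct=0.9, p_b_correct=0.55))  # ≈ 0.88: lean toward p for now
```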

The other exception in which we should read what is hot off the press (given our cognitive objective, as always in this article) is when it relates to one's specific line of work at the moment. Suppose I am studying happiness in order to write a review of current knowledge in the area; that gives me good grounds to read an article published this month, since I must be as up to date as possible to do my work. Exceptions aside, it is a good strategy to let others filter ultra-recent information for you and to remain at the upper levels of analysis. The same is true of old information: what is relevant is highly likely to have been either preserved, as I mentioned before, or rediscovered, as all the cultural evolutionary convergences show (Diamond 1999, Caleiro forthcoming).

First Case Conclusion:

Our natural conception of how to distribute our time in obtaining knowledge is biased in the wrong way, suggesting equal amounts of effort for equal amounts of time. To achieve greater and deeper knowledge, one should distribute her effort with exponentially more reading of recent periods than of older ones. In addition, she should counter this bias with another bias, shifting it even more towards the present but stopping short of it, allowing some basic knowledge filters to operate before choosing what to read. We end up with an exponential-looking curve that peaks in the recent past and falls abruptly before reaching the present.

Second Case: Happiness

All other things equal, most people would not choose to have every single day of their lives, from tomorrow onwards, be completely miserable. It is a truism that people do not want to suffer unless it is necessary, and most of the time not even then. Neutrality is good, but not good enough, so, all things equal, it is also true that most people would choose to have countless episodes of deep, fulfilling happiness for the rest of their lives, as opposed to being merely "not so bad". Some have noticed that this is not so unanimous; for instance, Bertrand Russell (1930) wrote: "Men who are unhappy, like men who sleep badly, are always proud of the fact."

I intend to discuss happiness from another perspective, the perspective of cognition. Is happiness good or bad for thinking? Assuming our cognitive objective, as before, let us examine happiness. Suppose we do not care about happiness; we just want to be cognitively good. Contrary to the popular legend that thinking equals suffering, and to the saying that "ignorance is bliss", current evidence suggests that happiness is positively correlated with (Gilbert 2007, Lyubomirsky 2007, Seligman 2002):

Sociability

Energy

Charity

Stronger immune system

Cooperation

Physical health

Earnings

Being Liked

Number of friends

Social support

Flexibility

Intelligence

Ingenuity in thinking

Productivity at work

Leadership skills

Negotiation skills

Resilience in the face of hardship

It is hard not to notice how many characteristics are on the list, and easy to see how many of them relate to being a better learner, a better teacher, and a better cognitive agent in general. This is true regardless of what one studies, whether the knowledge is descriptive, such as calculus, or procedural, such as dancing. There is also the evident fact that depressed people tend to lose productivity dramatically during their bad periods. This gives us good scientific grounding for believing that happiness is important for cognition: to learn better, to achieve more, and to be cognitively more apt in general. So we ought to be happier.

But should we be happier? How much happier? The reason I started this article is that I was reading in the park, listening to music, watching people coming and going, families, foreigners, kids. It was a beautiful sunny day and I had just exercised; I was reading something interesting and challenging, the music was exciting, I took a look around me and saw the shining sun reflecting on the trees, a breeze passed amidst the giggles of kids nearby, and I thought, "This is great!" In fact I thought more than that; I thought, "This is great! Still, it could be better." There is some background knowledge needed to qualify the power of this phrase. Once I saw a study in which a joke had been selected from among thousands by internet users and was therefore, so the claim went, a scientifically proven funny joke. Now, I am a happy person. In fact I am a very happy person. It took me a while to accept that. It is hard to accept that one is in the upper third of happiness, because that says a lot about the human condition and about how happy people are. So I was pondering this fact that people told me, and that I subjectively felt, and finally science came to my aid. The University of Pennsylvania hosts an online test called the Authentic Happiness Inventory. The website has 700,000 members. I did the test twice, with some 14 months in between. The website provides comparisons among those who took the test, which we can take to be at least some tens of thousands of people. The first time I did it, it showed: "You scored as high or higher than 100% of web users, 100% of your gender, 100% of your age group, 100% of your occupational group, 100% of your educational level and 100% of your Zip code". One year later, the first five bars were showing 99% and the last one 98%. Thinking I might have been having an exceptionally happy day that time, I took the test again, and to my surprise I was back to 100% in all categories. I knew I was happy, but that took the thing to a whole other level. So I was as scientifically proven to be happy as that joke was proven funny; I was at the very end of the tail of the curve.

Now think again about that phrase in that scene in the park: "This is great! Still, it could be better." I was not talking about myself (as I have been for a paragraph now); I was talking about Man. If you found yourself at the edge of the curve, you would know what I mean. If this is the best we can do, we are not there yet. I am not saying that being happy is not great, it is awesome, but it could be much better. I suggest that anyone who has had the experience of reading all those "100%"s on that website thought the same: this cannot be the very best, there must be more. This is what brings me to the Humanity+ motto:

Better than well.

The human condition is not happiness-driven. Evolutionarily speaking, we do what we can to have more grandchildren than our neighbors, whether this includes happiness or not. A mind that was satisfied all the time would not feel tempted to change its condition, so mother nature invented feelings such as anxiety, boredom, tiredness of the same activity, pain, and so on. Happiness, as designed by evolution, is fleeting and ephemeral (Morris 2004). How could we change that? There are several ways, the most obvious being chemical intervention. Technologies for direct stimulation of pleasure centers could also be developed to accepted levels of safety. Artifacts such as MP3 players also have an effect on happiness, since listening to music causes happiness (Lyubomirsky 2007); many artifacts have positive effects on happiness and, in the long term, may help improve the human condition. Art, philosophy, spirituality, and science have also had long-term effects on human happiness. So, in order to improve the human condition in the long term, we ought to work on all those bases. This would in turn provide us the means to achieve our proposed cognitive goal, through greater, cognitively enhancing happiness. Before moving on, I would like to make an effort to show a mistake most people are likely to make, due to some cognitive biases. I will first list the biases:

Status quo bias: people tend not to change an established behavior unless the incentive to change is compelling (Kahneman et al. 1991).

Bandwagon effect: the observation that people often do and believe things because many other people do and believe the same things. The effect is often called herd instinct. People tend to follow the crowd without examining the merits of a particular thing. The bandwagon effect is the reason for the bandwagon fallacy’s success.

From Yudkowsky (2009):

Confirmation bias: In 1960, Peter Wason conducted a now-classic experiment that became known as the '2-4-6' task (Wason 1960). Subjects had to discover a rule, known to the experimenter but not to the subject – analogous to scientific research. Subjects wrote three numbers, such as '2-4-6' or '10-12-14', on cards, and the experimenter said whether the triplet fit the rule or did not fit the rule. Initially subjects were given the triplet 2-4-6, and told that this triplet fit the rule. Subjects could continue testing triplets until they felt sure they knew the experimenter's rule, at which point the subject announced the rule.

Although subjects typically expressed high confidence in their guesses, only 21% of Wason’s subjects guessed the experimenter’s rule, and replications of Wason’s experiment usually report success rates of around 20%. Contrary to the advice of Karl Popper, subjects in Wason’s task try to confirm their hypotheses rather than falsifying them. Thus, someone who forms the hypothesis “Numbers increasing by two” will test the triplets 8-10-12 or 20-22-24, hear that they fit, and confidently announce the rule. Someone who forms the hypothesis X-2X-3X will test the triplet 3-6-9, discover that it fits, and then announce that rule. In every case the actual rule is the same: the three numbers must be in ascending order. In some cases subjects devise, “test”, and announce rules far more complicated than the actual answer.” […]

“Hot” refers to cases where the belief is emotionally charged, such as political argument. Unsurprisingly, “hot” confirmation biases are stronger – larger in effect and more resistant to change.”
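Here is a quick sketch of why the confirmatory strategy fails in the 2-4-6 task quoted above. The triplets and function names are mine, chosen for illustration: every triplet picked to confirm "increasing by two" also fits the true rule, so only a triplet the hypothesis forbids can tell the two apart.

```python
# The 2-4-6 task in miniature: the experimenter's rule vs. a typical subject's hypothesis.

def true_rule(t):
    """Experimenter's rule: the three numbers are in strictly ascending order."""
    return t[0] < t[1] < t[2]

def hypothesis(t):
    """A typical subject's hypothesis: numbers increasing by two."""
    return t[1] - t[0] == 2 and t[2] - t[1] == 2

confirming = [(8, 10, 12), (20, 22, 24)]   # chosen to fit the hypothesis
disconfirming = [(1, 2, 5)]                # forbidden by the hypothesis, allowed by the true rule

for t in confirming + disconfirming:
    print(t, "fits true rule:", true_rule(t), "| fits hypothesis:", hypothesis(t))
# The confirming triplets agree under both rules, so they carry no information about
# which rule is right; only the disconfirming test separates them.
```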

Let me restate a case of the status quo bias in another form. When people make a decision, they should consider only the benefits and costs of what they intend to do, and carefully analyse them. This is fairly obvious. It is also complete nonsense. What one ought to do when deciding whether or not to do something is to compare that thing with what she would do if she did not do it. Suppose I am a father who picks up his daughter from school every day. Then some friends invite me to play cards, and I reason as follows: "Well, playing cards is better than doing nothing", and I go play cards, leaving my poor child alone at school.
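A toy version of the card-playing example, with invented utility numbers, shows the difference between the naive comparison and the comparison against what one would otherwise do:

```python
# Invented utilities for the father's options; the point is the comparison, not the numbers.
utilities = {
    "do nothing": 0,
    "play cards": 3,
    "pick up my daughter from school": 10,
}

naive_says_go = utilities["play cards"] > utilities["do nothing"]
best_alternative = max(u for option, u in utilities.items() if option != "play cards")
careful_says_go = utilities["play cards"] > best_alternative

print("naive comparison says go:", naive_says_go)               # True: cards beat doing nothing
print("opportunity-cost comparison says go:", careful_says_go)  # False: cards lose to picking her up
```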

 

Another important topic is how someone can be happier than he usually is, right now. What is already available? What has been proven to increase satisfaction? The rest of the article is dedicated to this topic. Gilbert (2007) has many interesting words on this; they are worth quoting:

"My friends tell me that I have a tendency to point out problems without offering solutions, but they never tell me what I should do about it." […] "… you'll be heartened to learn that there is a simple method by which anyone can make strikingly accurate predictions about how they will feel in the future. But you may be disheartened to learn that, by and large, no one wants to use it.

Why do we rely on our imaginations in the first place? Imagination is the poor man's wormhole. We can't do what we'd really like to do – namely, travel through time, pay a visit to our future selves, and see how happy those selves are – and so we imagine the future instead of actually going there. But if we cannot travel in the dimension of time, we can travel in the dimensions of space, and the chances are pretty good that somewhere in those other three dimensions there is another human being who is actually experiencing the future event that we are merely thinking about." […] "it is also true that when people tell us about their current experiences […], they are providing us with the kind of report about their subjective state that is considered the gold standard of happiness measures. […] one way to make a prediction about our own emotional future is to find someone who is having the experience we are contemplating and ask them how they feel. […] Perhaps we should give up on remembering and imagining entirely and use other people as surrogates for our future selves.

This idea sounds all too simple, and I suspect you have an objection to it that goes something like this… “

This fine writer's message is simple: stop imagining, start asking someone who is already there. This is the main advice for those who want to predict how happy they will be in the future if they make a particular choice.

Now, Lyubomirsky offers many other happiness-increasing strategies. First, she proposed the 40% solution to happiness. Happiness is determined according to the following breakdown:

50% genes, 10% circumstances, and 40% intentional activities

That is, happiness is 50% genetically determined (that is, if you had to predict Natalie Portman's happiness, and she had a monozygotic twin separated at birth, it would be more useful to know how happy the twin is than to know every single fact you could gather about Natalie's way of life, past and present conditions, and reactions to life events), 10% due to life circumstances (this includes wealth, health, beauty, marriage, and so on), and 40% due to intentional activities. So, all things considered, if one wants to become happier right now, the best strategy is to work on these last 40%. How can we do it? Here I will list some proven ways of increasing general subjective happiness. I will not provide a detailed description of the experiments, but those can be found in the references of Lyubomirsky's book. My aim here is to give my reader a cognitive tool for increasing her happiness, since I have argued that achieving greater happiness is a good cognitive strategy.
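A toy simulation of the 50/10/40 split may help make the twin example concrete. The shares are Lyubomirsky's; the linear model, the standard-normal factors, and the twin setup below are mine, purely for illustration. Under this reading, identical twins share the set point, so their scores correlate at about 0.5, more than the 0.10 of the variance that circumstances could ever account for.

```python
import math
import random

# Weight each factor so it accounts for its stated share of the variance in the score.
W_GENES, W_CIRC, W_ACT = math.sqrt(0.50), math.sqrt(0.10), math.sqrt(0.40)

def happiness(set_point: float) -> float:
    """Toy score: shared genetic set point plus independent circumstances and activities."""
    return W_GENES * set_point + W_CIRC * random.gauss(0, 1) + W_ACT * random.gauss(0, 1)

pairs = []
for _ in range(100_000):
    g = random.gauss(0, 1)            # identical twins share the genetic set point
    pairs.append((happiness(g), happiness(g)))

xs, ys = zip(*pairs)
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in pairs) / len(pairs)
var_x = sum((x - mx) ** 2 for x in xs) / len(xs)
var_y = sum((y - my) ** 2 for y in ys) / len(ys)
print(round(cov / math.sqrt(var_x * var_y), 2))   # ≈ 0.50, the genetically shared half
```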

Bostrom, N. 2004. The Future of Human Evolution. In Death and Anti-Death: Two Hundred Years After Kant, Fifty Years After Turing, ed. Charles Tandy. Ria University Press, pp. 339-371. Available online: http://www.nickbostrom.com/fut/evolution.html

Diamond, J. 1999. Guns, Germs, and Steel: The Fates of Human Societies. W. W. Norton & Co.

Kahneman, D., Knetsch, J. L. & Thaler, R. H. 1991. Anomalies: The Endowment Effect, Loss Aversion, and Status Quo Bias. Journal of Economic Perspectives 5 (1), pp. 193-206.

Russell, B. 1930. The Conquest of Happiness. Available online: http://russell.cool.ne.jp/beginner/COH-TEXT.HTM

Tegmark, M. 2000. The importance of quantum decoherence in brain processes. Physical Review E 61: 4194-4206. Available online: http://arxiv.org/abs/quant-ph/9907009

Yudkowsky, E. 2009. Cognitive biases potentially affecting judgment of global risks. In Global Catastrophic Risks, eds. Nick Bostrom and Milan Cirkovic. Oxford University Press. Available online: http://yudkowsky.net/rational/cognitive-biases

10 responses to "Two Cognitive Cases"

  1. I found this commentary by David Pearce in H+ Magazine (http://hplusmagazine.com/digitaledition/2009-fall/) quite illuminating, and challenging to the "cognitive objective" assumption:

    I think it's fair to say the transhumanist community is mostly interested in intelligence-amplification — superintelligence rather than supersentience. I share an interest in cognitive enhancement, but in my opinion there is an important sense in which a congenitally blind person with an IQ of 220, or 920, is just as ignorant as a congenitally blind person with an IQ of 120. I worry more about our ignorance in the latter sense than I do about our limited reasoning powers. Psychedelic drugs can briefly give us a tiny insight into how "blind" we normally are; but we soon lapse into ignorance again. Such is the state-dependence of memory. If I'd never tried psychedelics, then I fear I would be scornful of their significance because of the incoherence of most users' descriptions of their effects. But using the blindness analogy again, someone congenitally blind who is surgically guaranteed the gift of sight can take years before they can make sense of the visual world… at first they are overwhelmed and confused by visual stimuli.

  2. Well, I'm not sure how I could manage to bring this into discussion again (since we've already discussed it in Logic Class), but still, I think writing it down can help make clear the arguments I'm trying to put forward, as well as clarify our discussion. Still, I want the background of what I'm writing here to be absolutely clear: I'm not claiming that your view is wrong, even less that I have the means of proving it. Rather, I'm just pointing out reasons why I can't fully endorse it, or, maybe I should say, questions I have that don't allow such endorsement, at least at first sight. Whether or not I'll end up having agreed with you from the beginning is not clear; still, either way, let's just see how it goes.😛

    I think the thing that tickles me the most is that, from the beginning, you seem to assume that knowledge is cumulative, that is, that the knowledge we produce today just adds to the knowledge we had before, or rather improves it, not being of a distinct kind from it at all. Which presumes, one way or another, that we are dealing with the same object, and thus that what was true back then about that object can't possibly have turned false, nor contradict anything else about the object that is true right now (since it's the same one from back then). If the object is the same now as it was before, then the knowledge we have now can't contradict the knowledge we had then, but only add to it, because when we say we know something about the object, we say we know that something is true about it. Therefore, the knowledge produced today can only add to our knowledge from the past, and, more than this, often (maybe most of the time) relies on past knowledge to be produced. Which, in one way or another, implies, I believe, from what you proposed, that we should read more texts from the present (in a broad sense) than from the distant past (again, in a broad sense; I know you specify these in your article, but I don't think it's really relevant to do that here).

    Well, while it's still possible that the consequent (i.e. that we should read more recently published texts than old ones) is true, I have a lot of problems with the antecedent. Do different knowledges truly deal with the same, exact object? I know João said that reality is the same for everyone, but that, I believe, is a naive way of dealing with epistemics. Not to imply that reality is absolutely subjective and, thus, that we're never talking even minimally about the same thing, or, in other words, that our "worlds" don't minimally converge, but just that I find no sense (or a very restricted sense) in talking of a reality which is absolutely independent of our subjective view of and participation in it. I'm saying, yes, that our subjective standpoint is part of the definition of the very reality we aim to know about. Which implies that we're never truly talking precisely about the same objects. But it doesn't, in my point of view, as contradictory as it may seem, imply that one knowledge is completely distinct from another. If my belief in different realities for different subjects doesn't mean that those realities don't converge, likewise, my belief that the knowledge we had before of an object isn't knowledge of the same object as the knowledge we have now doesn't mean that those knowledges don't somehow converge. But that's it: CONVERGE. I do believe that something is lost on the way. Whether or not what is lost is relevant may be another question, relying, precisely, on what we consider relevant, which, again, relies on our standpoint.

    I know I'm extending myself way too much, and I could go even further, but I'll try to sum it up: what you propose in your article seems, to me, to assume that knowledge is cumulative. I tried to sketch some of the reasons why I don't believe it is exactly like that. Reasons, therefore, that could make it plausible for one to read more articles from the past than recent ones. Or maybe not: maybe it just implies that we should read more from the past than we thought at first. Or maybe not even that; maybe it suffices to read just the amount you suggested so we can get from the past what the present cannot offer us. At any rate, if one could show that the knowledge from the past does not deal with the same object as current knowledge, then maybe one could find reasons why one should read more texts from the past than recent ones.

    But, in all honesty, I don't care too much about that. Like I said, I'm not really worried about the consequent of your implication. Rather, I'm worried about the antecedent. It does not bother me that it may be that, in general, people should read more recent publications than past ones. I have no grudge against that. But I do have problems with accepting, at first sight, that knowledge is cumulative. I may be persuaded of it, but right now I'm more inclined towards the position that it isn't FULLY cumulative, but also not completely non-cumulative. I'll be the first to admit that it's not an easy position to stand for, but nevertheless, it's the one that seems most satisfying to me.😛

    So, to make it yet clearer: I'm not so much discussing whether or not your conclusion is true as I am discussing whether or not knowledge is cumulative. It's not the only thing I'd like to discuss from your text, but it certainly strikes me as the one I most want to.😛

    (Also, if João would like to join the discussion, I’d very much appreciate it. Obviously, he’s the one who disagrees the most with my position – which only makes it even more interesting if he can play his part here. :-P)

    Well, that’s everything from “that pale lad from logic class” for now.😛

  3. P.S.: I thought a little about the question of whether or not the knowledge from the past deals with the same object as current knowledge. It's hard to make a precise definition of what I could mean by different objects, and it would take too much space here to attempt it, but, if it makes you more comfortable, I believe we can also talk about different knowledges of the same object, in a sense which could involve (but not be reduced to) intensional knowledge vs. extensional knowledge. Maybe there are past knowledges which deal, extensionally, with the same object as current ones, but not intensionally. That may well be one of the main differences, but I'm not claiming that it's the only one, nor that there aren't also differences between objects. I'm merely trying to provide an argument that makes it plausible to admit that there exists some sort of difference between past and current knowledge that doesn't allow knowledge to be fully cumulative.

    Well, I’d better stop now, or else I’ll be reported for spamming.😛

  4. You have given possible reasons for knowledge not to be cumulative.

    1) Difference of Object.

    You could have claimed

    2) Indeterminacy of translation (like Kuhn and Feyerabend do)

    Let us assume you have claimed 1 and 2.

    Let us, with Putnam, separate concept from conception. The conception of an electron (or of the sun) changes over time, but the concept remains somewhat the same. Sometimes words change meaning; this is what happened, for instance, with "apelido" and "sobrenome" in Portuguese, which are the inverse of their Spanish equivalents.

    All this goes in your favor.

    There is more.

    Literature is not very cumulative: new novels do not depend strongly on older novels.
    Fashion is not cumulative after a few generations: what people wear is not related to what people wore a century ago.
    The same goes for most of the arts; there is no connection between today's works and those of the very early days.

    But this is of course false of most of physics (not all of physics: the use of alpha, beta, and gamma is surely not a REQUIREMENT for good physics, but F = ma is).

    So there are things in the world that belong to one of two categories:

    1) Depends heavily on earlier theories.
    2) Is independent of earlier theories.

    Oddly enough, the more an area depends on old material, the more reason you have to read the newest material. This follows from the asymmetries I have exposed.

    Now, it is a question of figuring out which group philosophy belongs to: the one in which anything goes, or the one which relies on past information.

    Generally speaking, the guideline for cumulativity is reference to things of the world that are stable: things such as people, elephants, personalities, species, electrons, quarks, the sun, orbits, money, death, life, love.

    Those who declare philosophy is unrelated to these things: feel free to delve into the old ages of darkness, in which philosophers believed women have fewer teeth than men, that Man is good by nature (who knows about Woman), that Man is bad by nature, that God is great, that there is a specific distance between the angels in heaven, that earthly elements tend downwards, that we live in the best of all possible worlds, and other very similar and reasonable claims that every serious person should think about for years, especially when using government money.

  5. "Those who declare philosophy is unrelated to these things: feel free to delve into the old ages of darkness, in which philosophers believed women have fewer teeth than men, that Man is good by nature (who knows about Woman), that Man is bad by nature, that God is great, that there is a specific distance between the angels in heaven, that earthly elements tend downwards, that we live in the best of all possible worlds, and other very similar and reasonable claims that every serious person should think about for years, especially when using government money."

    One word: LOL.😛 Do I sense some sort of mockery, definitely NOT related to the discussion we had earlier? =P

    I can't really respond now, since I'm going to class, but I will once I get home. Either way, I really liked the answer, especially the smartass taunting. =PPP

  6. Lucas,

    I don't really know if I'm going to have the time to join the discussion. Even so, I would like to make abundantly clear, as clear as it is to everyone else who produces knowledge except for some dark continental philosophers, that one of the only epistemic positions ever to produce good knowledge, knowledge that provides for the well-being of people, is the epistemic position of the realist; the other one (the minority) is the one that doesn't say anything about the question.
    Ironically, the belief that philosophy, with its god's eye that sees all and knows all, can rise above ontology and epistemology is one of the best friends of the view that you so skeptically presented. Oddly enough, this view that you have put forward says that even though we cannot have any kind of access to the objective nature of reality, or even though it doesn't make sense to speak about this so-called "reality", we CAN talk about ideas, about knowledge, concepts and so on. It says that a hard fact about nature is impossible to assert, while a (soft?) fact about a hard fact is the only thing we can talk about. But this is nonsense. This destroys the very notion of knowledge altogether. This view is subject to the notion that philosophy, the forgotten and almighty queen, can, as Wittgenstein used to think, rise above the indeterminacies of the various languages and TALK about them in a more DETERMINATE sense. This is nonsense, not in the Wittgensteinian sense of nonsense, but in the lay person's sense of nonsense: something we ought not to think about because it leaves us nowhere. This is the view of an intellectual elite that abuses the power society gives it. A power that ought to be used with the intention of producing knowledge that can benefit society as a whole. This is not a power to be used with the one and only intention of producing flamboyant displays, of producing texts that, by some conditions of verifiability, cannot be distinguished from a random text generator. Texts that, like the peacock's tail, have the sole purpose of showing how good one can get at something that, excluding showing off and finding mates (which is a lot), has absolutely no practical consequence whatsoever.
    The areas of knowledge which refuse to submit to such indignity are known as the natural sciences, and they all submit to what you called a naive realist standpoint. No wonder you call them naive: they simply don't play the peafowl's game, they don't get it. For them it is all about the pursuit of truth and the pursuit of problems that, if solved, will bring about a better and happier society. These are also the areas of knowledge known for naively returning to society the greatest goods. Scientists innocently think that if society spends billions of dollars on them, they ought to pay it back. Funny or not, this naive trend is catching on: it is the largest area, with the most intense debates and discoveries, the area where the greatest geniuses decided to work, and also the area in which society makes most of its investments.
    Also, you said that one must adopt a reflexive standpoint and that science can only think about itself in its own terms. Continental philosophy, on the other hand, can only think in terms of the subject. Hegel created it in the 19th century, and the 20th century was marked by a futile battle by most continental philosophers trying to get away from the sphere of the subject. There is, however, a new science, a creation of the so-called naive trend, that explains ideas and knowledge in terms that go well beyond the subject, in evolutionary terms. This new science makes ideas themselves the fundamental entities and is called memetics. What is sometimes hard to understand for someone so immersed in the 'smart' trend, or should I call it tricky?, is that this view also cannot think about itself outside its own terms; the difference is that the tricky trend cannot see past its own nose and usually doesn't give much attention to its own practical consequences.

  7. João,

    I didn't mean that we don't have ANY kind of access to objective reality. Like I said, it's not because one can conceive that there are many realities that one cannot conceive that they converge. I meant that our access to objectivity is not one we can have because we have the guarantee of a unique, equal reality experienced by all of us; I meant that whatever access to objectivity we have is CONDITIONED by the fact that it's attained THROUGH our subjective reality, which means that there is no a priori guarantee that whatever appears to us as objective is indeed so. In a very, very poor analogy, just as the consistency of an arithmetical system cannot be proven within the scope of that very system (as determined by Gödel's theorem, one, I confess, I barely know or understand, but, considering how I'm relying on a very restricted and brief analogy, I hope this won't do any harm), the objectivity of a certain piece of knowledge cannot be shown within the subjectivity from which it was produced. To put it clearly: it may be that you have found a "hard fact" of nature, but you have no means of making it certain that you did, meaning, by "certain", any finitary way through which it is shown to be impossible to conceive such knowledge any other way than objectively.

    So, I'm certainly not saying that we have access to 'soft' facts but not to 'hard' facts. I don't mean that "a hard fact about nature is impossible to assert, meanwhile a (soft?) fact about a hard fact is the only thing we can talk about". No matter what we're talking about or how we're talking about it, there's nothing we can assert beyond doubt, and that's valid not only for science but for philosophy as well. I'm very far from considering philosophy some sort of all-seeing god. And I'm very far from assuming that we can only talk of things in philosophical terms. That we cannot guarantee the objectivity of natural-science knowledge doesn't mean that it's an expendable discourse, even less that philosophy is able to make up for it: we cannot guarantee its objectivity either. But it's not because we cannot guarantee that either of them is objective that they are completely devoid of it, or should be discarded.

    I do not concede that, in ANY respect, one can claim the absolute certainty of science or philosophy; yet I do not believe that the lack of this certainty forbids the progress of either, nor that their progress is made through the existence of any certainty of such a kind, nor that it NEEDS to rely on any such hopes.

    Which brings me to my point. I don't believe philosophy and science are fundamentally distinguished by such certainty of their knowledge, but rather by their approach. It's not a question of whether one's approach is closer to the truth than the other's, but rather of WHAT KIND OF KNOWLEDGE they produce.

    What I believe, in that case, is that one thing for which both science and philosophy are to blame is their obstinacy in sticking to their own approaches and neglecting the other's. Even though they are different, science CAN contribute to the kind of knowledge philosophy seeks, just as philosophy can contribute to furthering the advance of science, by questioning assumptions that may have contributed to its advance so far but are now impeding it from continuing to move forward. That may send shivers down your spine, but yes, I do believe that there is some way of approaching the ideas of philosophers such as Derrida, Hegel, Kant, Deleuze, Adorno, and so on, that can contribute to the progress of science. Just as I believe that there are many scientific theories which, properly considered, can provide many philosophical insights (and I'm certainly not excluding from those the very memetics to which you referred).

    Your response seemed to assume that I had some sort of hierarchical view, in which philosophy, in the terms you put it, as continental philosophy, was the Queen of knowledge, which by far surpassed in its developments any "naive" knowledge that science could provide us with. When I talked about naivety, I didn't mean to imply that. I hope I have made it sufficiently clear that that's not how I think, and that that wasn't the background of what I wrote. The idea that philosophy has to be the Queen of all knowledge is rather absurd to me, because the IDEAL I have in mind is one in which science and philosophy (not only analytical, but also continental) are PARTNERS. It's true that philosophy usually doesn't look past its own nose. But it's also true that it's not common for it to receive any sort of well-disposed attention. It's also true that it usually doesn't give much attention to its own "practical" consequences, in a DETERMINED sense of this word. But it also seems true to me that it's important to consider whether or not all that matters is that which can be understood under the particular conception of "practicality" that natural science seems to hold. If one is not careful enough to consider whether what one thinks is good to do and provide to others is unrestrictedly so, whether it may be bad in other senses, whether it may become bad in the future, whether it's good in the short term, in the long term, or in both perspectives, there might be severe consequences. If one TRULY cares about others' well-being, then that's a question that cannot be overemphasized. There is no lack of examples showing how much harm we can do to people by doing to/with/for them what we consider good, acting as if there were no reason to question whether or not it is so. If one truly cares about others' well-being, one will know better than to be arrogant about one's conceptions and beliefs, and will be careful to examine whether or not they really seem the best way to contribute to others' happiness.

  8. (By the way, I think that, by answering João's objections, I also answered the main objections, and tauntings 😛, that Diego made in his answer. Either way, I may post something else later commenting on whatever didn't fit in this last answer but, as of right now, I'm going to sleep, fellas. :-P)
