Impact of translational error-induced and error-free misfolding on the rate of protein evolution

Sunday, October 31, 2010

Molecular Systems Biology 6 Article number: 421 doi:10.1038/msb.2010.78
Published online: 19 October 2010
Citation: Molecular Systems Biology 6:421

Impact of translational error-induced and error-free misfolding on the rate of protein evolution

Jian-Rong Yang1,2, Shi-Mei Zhuang1 & Jianzhi Zhang2

Key Laboratory of Gene Engineering of the Ministry of Education, State Key Laboratory of Biocontrol, School of Life Sciences, Sun Yat-sen University, Guangzhou, PR China
Department of Ecology and Evolutionary Biology, University of Michigan, Ann Arbor, MI, USA

Correspondence to: Jianzhi Zhang2 Department of Ecology and Evolutionary Biology, University of Michigan, 1075 Natural Science Building, 830 North University Avenue, Ann Arbor, MI 48109, USA. Tel.: +1 734 763 0527; Fax: +1 734 763 0544; Email:

Received 11 May 2010; Accepted 31 August 2010; Published online 19 October 2010

What determines the rate of protein evolution is a fundamental question in biology. Recent genomic studies revealed a surprisingly strong anticorrelation between the expression level of a protein and its rate of sequence evolution. This observation is currently explained by the translational robustness hypothesis in which the toxicity of translational error-induced protein misfolding selects for higher translational robustness of more abundant proteins, which constrains sequence evolution. However, the impact of error-free protein misfolding has not been evaluated. We estimate that a non-negligible fraction of misfolded proteins are error free and demonstrate by a molecular-level evolutionary simulation that selection against protein misfolding results in a greater reduction of error-free misfolding than error-induced misfolding. Thus, an overarching protein-misfolding-avoidance hypothesis that includes both sources of misfolding is superior to the translational robustness hypothesis. We show that misfolding-minimizing amino acids are preferentially used in highly abundant yeast proteins and that these residues are evolutionarily more conserved than other residues of the same proteins. These findings provide unambiguous support to the role of protein-misfolding-avoidance in determining the rate of protein sequence evolution.


Subject Categories: simulation and data analysis; proteins

Keywords: evolutionary rate; expression level; mistranslation; protein misfolding




Building on the shoulders of giants in molecular biology...

Lies, damned lies in medical research

Saturday, October 30, 2010

Lies, Damned Lies, and Medical Science

By David H. Freedman


IN 2001, RUMORS were circulating in Greek hospitals that surgery residents, eager to rack up scalpel time, were falsely diagnosing hapless Albanian immigrants with appendicitis. At the University of Ioannina medical school’s teaching hospital, a newly minted doctor named Athina Tatsioni was discussing the rumors with colleagues when a professor who had overheard asked her if she’d like to try to prove whether they were true—he seemed to be almost daring her. She accepted the challenge and, with the professor’s and other colleagues’ help, eventually produced a formal study showing that, for whatever reason, the appendices removed from patients with Albanian names in six Greek hospitals were more than three times as likely to be perfectly healthy as those removed from patients with Greek names. “It was hard to find a journal willing to publish it, but we did,” recalls Tatsioni. “I also discovered that I really liked research.” Good thing, because the study had actually been a sort of audition. The professor, it turned out, had been putting together a team of exceptionally brash and curious young clinicians and Ph.D.s to join him in tackling an unusual and controversial agenda.

Last spring, I sat in on one of the team’s weekly meetings on the medical school’s campus, which is plunked crazily across a series of sharp hills. The building in which we met, like most at the school, had the look of a barracks and was festooned with political graffiti. But the group convened in a spacious conference room that would have been at home at a Silicon Valley start-up. Sprawled around a large table were Tatsioni and eight other youngish Greek researchers and physicians who, in contrast to the pasty younger staff frequently seen in U.S. hospitals, looked like the casually glamorous cast of a television medical drama. The professor, a dapper and soft-spoken man named John Ioannidis, loosely presided.

One of the researchers, a biostatistician named Georgia Salanti, fired up a laptop and projector and started to take the group through a study she and a few colleagues were completing that asked this question: were drug companies manipulating published research to make their drugs look good? Salanti ticked off data that seemed to indicate they were, but the other team members almost immediately started interrupting. One noted that Salanti’s study didn’t address the fact that drug-company research wasn’t measuring critically important “hard” outcomes for patients, such as survival versus death, and instead tended to measure “softer” outcomes, such as self-reported symptoms (“my chest doesn’t hurt as much today”). Another pointed out that Salanti’s study ignored the fact that when drug-company data seemed to show patients’ health improving, the data often failed to show that the drug was responsible, or that the improvement was more than marginal.

Salanti remained poised, as if the grilling were par for the course, and gamely acknowledged that the suggestions were all good—but a single study can’t prove everything, she said. Just as I was getting the sense that the data in drug studies were endlessly malleable, Ioannidis, who had mostly been listening, delivered what felt like a coup de grâce: wasn’t it possible, he asked, that drug companies were carefully selecting the topics of their studies—for example, comparing their new drugs against those already known to be inferior to others on the market—so that they were ahead of the game even before the data juggling began? “Maybe sometimes it’s the questions that are biased, not the answers,” he said, flashing a friendly smile. Everyone nodded. Though the results of drug studies often make newspaper headlines, you have to wonder whether they prove anything at all. Indeed, given the breadth of the potential problems raised at the meeting, can any medical-research studies be trusted?

That question has been central to Ioannidis’s career. He’s what’s known as a meta-researcher, and he’s become one of the world’s foremost experts on the credibility of medical research. He and his team have shown, again and again, and in many different ways, that much of what biomedical researchers conclude in published studies—conclusions that doctors keep in mind when they prescribe antibiotics or blood-pressure medication, or when they advise us to consume more fiber or less meat, or when they recommend surgery for heart disease or back pain—is misleading, exaggerated, and often flat-out wrong. He charges that as much as 90 percent of the published medical information that doctors rely on is flawed. His work has been widely accepted by the medical community; it has been published in the field’s top journals, where it is heavily cited; and he is a big draw at conferences. Given this exposure, and the fact that his work broadly targets everyone else’s work in medicine, as well as everything that physicians do and all the health advice we get, Ioannidis may be one of the most influential scientists alive. Yet for all his influence, he worries that the field of medical research is so pervasively flawed, and so riddled with conflicts of interest, that it might be chronically resistant to change—or even to publicly admitting that there’s a problem.

THE CITY OF IOANNINA is a big college town a short drive from the ruins of a 20,000-seat amphitheater and a Zeusian sanctuary built at the site of the Dodona oracle. The oracle was said to have issued pronouncements to priests through the rustling of a sacred oak tree. Today, a different oak tree at the site provides visitors with a chance to try their own hands at extracting a prophecy. “I take all the researchers who visit me here, and almost every single one of them asks the tree the same question,” Ioannidis tells me, as we contemplate the tree the day after the team’s meeting. “‘Will my research grant be approved?’” He chuckles, but Ioannidis (pronounced yo-NEE-dees) tends to laugh not so much in mirth as to soften the sting of his attack. And sure enough, he goes on to suggest that an obsession with winning funding has gone a long way toward weakening the reliability of medical research.

He first stumbled on the sorts of problems plaguing the field, he explains, as a young physician-researcher in the early 1990s at Harvard. At the time, he was interested in diagnosing rare diseases, for which a lack of case data can leave doctors with little to go on other than intuition and rules of thumb. But he noticed that doctors seemed to proceed in much the same manner even when it came to cancer, heart disease, and other common ailments. Where were the hard data that would back up their treatment decisions? There was plenty of published research, but much of it was remarkably unscientific, based largely on observations of a small number of cases. A new “evidence-based medicine” movement was just starting to gather force, and Ioannidis decided to throw himself into it, working first with prominent researchers at Tufts University and then taking positions at Johns Hopkins University and the National Institutes of Health. He was unusually well armed: he had been a math prodigy of near-celebrity status in high school in Greece, and had followed his parents, who were both physician-researchers, into medicine. Now he’d have a chance to combine math and medicine by applying rigorous statistical analysis to what seemed a surprisingly sloppy field. “I assumed that everything we physicians did was basically right, but now I was going to help verify it,” he says. “All we’d have to do was systematically review the evidence, trust what it told us, and then everything would be perfect.”

It didn’t turn out that way. In poring over medical journals, he was struck by how many findings of all types were refuted by later findings. Of course, medical-science “never minds” are hardly secret. And they sometimes make headlines, as when in recent years large studies or growing consensuses of researchers concluded that mammograms, colonoscopies, and PSA tests are far less useful cancer-detection tools than we had been told; or when widely prescribed antidepressants such as Prozac, Zoloft, and Paxil were revealed to be no more effective than a placebo for most cases of depression; or when we learned that staying out of the sun entirely can actually increase cancer risks; or when we were told that the advice to drink lots of water during intense exercise was potentially fatal; or when, last April, we were informed that taking fish oil, exercising, and doing puzzles doesn’t really help fend off Alzheimer’s disease, as long claimed. Peer-reviewed studies have come to opposite conclusions on whether using cell phones can cause brain cancer, whether sleeping more than eight hours a night is healthful or dangerous, whether taking aspirin every day is more likely to save your life or cut it short, and whether routine angioplasty works better than pills to unclog heart arteries.

But beyond the headlines, Ioannidis was shocked at the range and reach of the reversals he was seeing in everyday medical research. “Randomized controlled trials,” which compare how one group responds to a treatment against how an identical group fares without the treatment, had long been considered nearly unshakable evidence, but they, too, ended up being wrong some of the time. “I realized even our gold-standard research had a lot of problems,” he says. Baffled, he started looking for the specific ways in which studies were going wrong. And before long he discovered that the range of errors being committed was astonishing: from what questions researchers posed, to how they set up the studies, to which patients they recruited for the studies, to which measurements they took, to how they analyzed the data, to how they presented their results, to how particular studies came to be published in medical journals.

This array suggested a bigger, underlying dysfunction, and Ioannidis thought he knew what it was. “The studies were biased,” he says. “Sometimes they were overtly biased. Sometimes it was difficult to see the bias, but it was there.” Researchers headed into their studies wanting certain results—and, lo and behold, they were getting them. We think of the scientific process as being objective, rigorous, and even ruthless in separating out what is true from what we merely wish to be true, but in fact it’s easy to manipulate results, even unintentionally or unconsciously. “At every step in the process, there is room to distort results, a way to make a stronger claim or to select what is going to be concluded,” says Ioannidis. “There is an intellectual conflict of interest that pressures researchers to find whatever it is that is most likely to get them funded.”

Perhaps only a minority of researchers were succumbing to this bias, but their distorted findings were having an outsize effect on published research. To get funding and tenured positions, and often merely to stay afloat, researchers have to get their work published in well-regarded journals, where rejection rates can climb above 90 percent. Not surprisingly, the studies that tend to make the grade are those with eye-catching findings. But while coming up with eye-catching theories is relatively easy, getting reality to bear them out is another matter. The great majority collapse under the weight of contradictory data when studied rigorously. Imagine, though, that five different research teams test an interesting theory that’s making the rounds, and four of the groups correctly prove the idea false, while the one less cautious group incorrectly “proves” it true through some combination of error, fluke, and clever selection of data. Guess whose findings your doctor ends up reading about in the journal, and you end up hearing about on the evening news? Researchers can sometimes win attention by refuting a prominent finding, which can help to at least raise doubts about results, but in general it is far more rewarding to add a new insight or exciting-sounding twist to existing research than to retest its basic premises—after all, simply re-proving someone else’s results is unlikely to get you published, and attempting to undermine the work of respected colleagues can have ugly professional repercussions.
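The arithmetic behind the five-teams scenario can be sketched in a toy Monte Carlo model. This is an illustrative simplification, not anything from the article: it assumes every tested theory is in fact false and that each team runs one independent test with a 5 percent false-positive rate, so a theory "reaches the journals" whenever at least one team gets a chance positive.

```python
import random

def published_false_theories(n_theories=10_000, teams=5, alpha=0.05, seed=1):
    """Fraction of entirely false theories that still yield at least one
    publishable 'positive' result when several independent teams test them.
    Each team has probability `alpha` of a false positive."""
    rng = random.Random(seed)
    published = sum(
        any(rng.random() < alpha for _ in range(teams))
        for _ in range(n_theories)
    )
    return published / n_theories

# Analytically, 1 - (1 - 0.05)**5 is about 0.23: even with nothing true to
# find, roughly one false theory in four produces a headline-ready result.
print(published_false_theories())
```

The point of the sketch is only that selective publication of positives, not any individual team's dishonesty, is enough to fill journals with wrong findings.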

In the late 1990s, Ioannidis set up a base at the University of Ioannina. He pulled together his team, which remains largely intact today, and started chipping away at the problem in a series of papers that pointed out specific ways certain studies were getting misleading results. Other meta-researchers were also starting to spotlight disturbingly high rates of error in the medical literature. But Ioannidis wanted to get the big picture across, and to do so with solid data, clear reasoning, and good statistical analysis. The project dragged on, until finally he retreated to the tiny island of Sikinos in the Aegean Sea, where he drew inspiration from the relatively primitive surroundings and the intellectual traditions they recalled. “A pervasive theme of ancient Greek literature is that you need to pursue the truth, no matter what the truth might be,” he says. In 2005, he unleashed two papers that challenged the foundations of medical research.

Read more here/Leia mais aqui: The Atlantic


Why Most Published Research Findings Are False
John P. A. Ioannidis


There is increasing concern that most current published research findings are false. The probability that a research claim is true may depend on study power and bias, the number of other studies on the same question, and, importantly, the ratio of true to no relationships among the relationships probed in each scientific field. In this framework, a research finding is less likely to be true when the studies conducted in a field are smaller; when effect sizes are smaller; when there is a greater number and lesser preselection of tested relationships; where there is greater flexibility in designs, definitions, outcomes, and analytical modes; when there is greater financial and other interest and prejudice; and when more teams are involved in a scientific field in chase of statistical significance. Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true. Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias. In this essay, I discuss the implications of these problems for the conduct and interpretation of research.
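The framework in the abstract can be made concrete with the paper's positive predictive value formula, PPV = (1 − β)R / (R + α − βR), where R is the pre-study odds of a true relationship, α the significance threshold, and β the type II error rate. The two numeric scenarios below are illustrative assumptions of mine, not examples taken from the paper:

```python
def ppv(R, alpha=0.05, beta=0.2):
    """Post-study probability that a claimed finding is true (Ioannidis 2005).

    R     : pre-study odds of a true relationship among those tested
    alpha : type I error rate (significance threshold)
    beta  : type II error rate (power = 1 - beta)
    """
    return (1 - beta) * R / (R + alpha - beta * R)

# A well-powered trial of a plausible hypothesis (even pre-study odds):
print(round(ppv(R=1.0), 3))
# An exploratory search where only 1 in 100 tested relationships is real:
print(round(ppv(R=0.01), 3))
```

With even odds and 80 percent power, most claimed findings are true; with 1-in-100 pre-study odds, the same α and power leave most positive claims false, which is the abstract's central conclusion.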

Citation: Ioannidis JPA (2005) Why Most Published Research Findings Are False. PLoS Med 2(8): e124. doi:10.1371/journal.pmed.0020124

Published: August 30, 2005

Copyright: © 2005 John P. A. Ioannidis. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Competing interests: The author has declared that no competing interests exist.

Abbreviation: PPV, positive predictive value

John P. A. Ioannidis is in the Department of Hygiene and Epidemiology, University of Ioannina School of Medicine, Ioannina, Greece, and Institute for Clinical Research and Health Policy Studies, Department of Medicine, Tufts-New England Medical Center, Tufts University School of Medicine, Boston, Massachusetts, United States of America. E-mail:


What is a gene, post-ENCODE? History and updated definition

What is a gene, post-ENCODE? History and updated definition

Mark B. Gerstein1,2,3,9, Can Bruce2,4, Joel S. Rozowsky2, Deyou Zheng2, Jiang Du3, Jan O. Korbel2,5, Olof Emanuelsson6, Zhengdong D. Zhang2, Sherman Weissman7, and Michael Snyder2,8

-Author Affiliations

1 Program in Computational Biology & Bioinformatics, Yale University, New Haven, Connecticut 06511, USA;
2 Molecular Biophysics & Biochemistry Department, Yale University, New Haven, Connecticut 06511, USA;
3 Computer Science Department, Yale University, New Haven, Connecticut 06511, USA;
4 Center for Medical Informatics, Yale University, New Haven, Connecticut 06511, USA;
5 European Molecular Biology Laboratory, 69117 Heidelberg, Germany;
6 Stockholm Bioinformatics Center, Albanova University Center, Stockholm University, SE-10691 Stockholm, Sweden;
7 Genetics Department, Yale University, New Haven, Connecticut 06511, USA;
8 Molecular, Cellular, & Developmental Biology Department, Yale University, New Haven, Connecticut 06511, USA



While sequencing of the human genome surprised us with how many protein-coding genes there are, it did not fundamentally change our perspective on what a gene is. In contrast, the complex patterns of dispersed regulation and pervasive transcription uncovered by the ENCODE project, together with non-genic conservation and the abundance of noncoding RNA genes, have challenged the notion of the gene. To illustrate this, we review the evolution of operational definitions of a gene over the past century—from the abstract elements of heredity of Mendel and Morgan to the present-day ORFs enumerated in the sequence databanks. We then summarize the current ENCODE findings and provide a computational metaphor for the complexity. Finally, we propose a tentative update to the definition of a gene: A gene is a union of genomic sequences encoding a coherent set of potentially overlapping functional products. Our definition sidesteps the complexities of regulation and transcription by removing the former altogether from the definition and arguing that final, functional gene products (rather than intermediate transcripts) should be used to group together entities associated with a single gene. It also manifests how integral the concept of biological function is in defining genes.


Charbel Niño El-Hani and the changing gene concept: between the cross and the sword [good grief!]

Genetics and Molecular Biology, 30, 2, 297-307 (2007)

Copyright by the Brazilian Society of Genetics. Printed in Brazil

Between the cross and the sword: The crisis of the gene concept

Charbel Niño El-Hani

Instituto de Biologia, Universidade Federal da Bahia, Salvador, BA, Brazil.


Challenges to the gene concept have shown the difficulty of preserving the classical molecular concept, according to which a gene is a stretch of DNA encoding a functional product (polypeptide or RNA). The main difficulties are related to the overlaying of the Mendelian idea of the gene as a ‘unit’: the interpretation of genes as structural and/or functional units in the genome is challenged by evidence showing the complexity and diversity of genomic organization.

This paper discusses the difficulties faced by the classical molecular concept and addresses alternatives to it. Among the alternatives, it considers distinctions between different gene concepts, such as that between the ‘molecular’ and the ‘evolutionary’ gene, or between ‘gene-P’ (the gene as determinant of phenotypic differences) and ‘gene-D’ (the gene as developmental resource). It also addresses the process molecular gene concept, according to which genes are understood as the whole molecular process underlying the capacity to express a particular product, rather than as entities in ‘bare’ DNA; a treatment of genes as sets of domains (exons, introns, promoters, enhancers, etc.) in DNA; and a systemic understanding of genes as combinations of nucleic acid sequences corresponding to a product specified or demarcated by the cellular system. In all these cases, possible contributions to the advancement of our understanding of the architecture and dynamics of the genetic material are emphasized.

Key words: gene, classical molecular gene concept, Mendelian gene, genomic complexity.

Received: April 4, 2006; Accepted: November 3, 2006.


Thomas Nagel on reductionism and antireductionism

Reductionism and Antireductionism

Thomas Nagel

New York University Law School, 40 Washington Square South, New York, NY 10012, USA


Reductionism is the idea that all of the complex and apparently disparate things we observe in the world can be explained in terms of universal principles governing their common ultimate constituents: that physics is the theory of everything. Antireductionism comes in two varieties: epistemological and ontological. Epistemological anti-reductionism holds that, given our finite mental capacities, we would not be able to grasp the ultimate physical explanation of many complex phenomena even if we knew the laws governing their ultimate constituents. Therefore we will always need special sciences like biology, which use more manageable descriptions. There may be controversy about which special sciences cannot be replaced by reduction, but that there will be some is uncontroversial. Ontological antireductionism holds, much more controversially, that certain higher-order phenomena cannot even in principle be fully explained by physics, but require additional principles that are not entailed by the laws governing the basic constituents. With respect to biology, the question is whether the existence and operation of highly complex functionally organized systems, and the appearance of self-replicating systems in the universe, can be accounted for in terms of particle physics alone, or whether they require independent principles of order.

1998 The Limits of Reductionism in Biology. Wiley, Chichester (Novartis Foundation Symposium 213) p 3-14


Ddoh Bei – Avraham Fried


For, by and with Israel, always!

L'vinyomin – Avraham Fried


For, by and with Israel, always!

Modeh Ani – Avraham Fried


For, by and with Israel, always!

Atmospheric oxygen and the evolution of insect gigantism

2010 GSA Denver Annual Meeting (31 October –3 November 2010)

Paper No. 77-5 

Presentation Time: 9:00 AM-9:15 AM



VANDENBROOKS, John M., School of Life Sciences, Arizona State University, PO Box 874601, Tempe, AZ 85287, HARRISON, Jon Fewell, School of Life Sciences, Arizona State University, Mail Code 4501, Tempe, AZ 85287-4501, and KAISER, Alex, Dept. of Basic Science, Midwestern University, 19555 N. 59th Ave, Glendale, AZ 85308

Most models estimate that over the last 500 million years atmospheric oxygen has varied from ~12% to 35%. Most strikingly, the giant insects of the late Paleozoic (i.e. dragonflies) existed when atmospheric oxygen was hyperoxic, supporting a role for oxygen in the evolution of insect body size. However, the fact that not all groups during this time period were giant (i.e. cockroaches) coupled with the paucity of the insect fossil record and the complex interactions between oxygen, organisms and communities makes it difficult to definitively accept or reject the historical oxygen-size link. Nevertheless, we have successfully reared dragonflies, cockroaches and a variety of other insect species under varying oxygen levels and the results of these studies do support a link between oxygen and the evolution of insect size: 1) dragonflies and other insect groups do develop and evolve larger body sizes in hyperoxia, while almost all insects develop smaller body sizes in hypoxia; yet cockroaches show no size difference when reared under hyperoxia, 2) insects developmentally and evolutionarily reduce their investment in the tracheal respiratory system when living in higher oxygen levels; suggesting there are significant costs associated with tracheal system structure and function and 3) larger insects invest more of their body in the tracheal system, potentially leading to greater effects of oxygen on large insects. These results provide several mechanisms by which the tracheal oxygen delivery system may be involved in the small size of modern insects and hyperoxia-enabled Paleozoic gigantism. When we begin to examine the fossil record closely, we see that certain groups have responded more strongly to oxygen variation. While taxa such as Protodonata and Paleodictyoptera have gigantic members, they are outliers to an overall pattern of oxygen-mediated body size change. 
On the other hand, the Blattodea contain no giant representatives and demonstrate little effect on maximum body size, but do show shifts in average size correlated with changes in atmospheric oxygen levels. Here we examine the role of atmospheric oxygen in the evolution of insect body size and discuss the possibility of imaging fossil tracheae as a proxy for paleo-oxygen levels. This research was supported by NSF EAR 0746352 and DOD 3000654843 to JH and JVB.


Session No. 77
Paleontology IV - Environmental Controls on Ecology and Evolution
Colorado Convention Center: Room 605
8:00 AM-12:00 PM, Monday, 1 November 2010

© Copyright 2010 The Geological Society of America (GSA), all rights reserved. Permission is hereby granted to the author(s) of this abstract to reproduce and distribute it freely, for noncommercial purposes. Permission is hereby granted to any individual scientist to download a single copy of this electronic file and reproduce up to 20 paper copies for noncommercial purposes advancing science and education, including classroom use, providing all reproductions include the complete content shown here, including the author information. All other forms of reproduction and/or transmittal are prohibited without written permission from GSA Copyright Permissions.

The most important philosophical question [?] asked by Marcelo Gleiser

If I had the opportunity to meet the assumed designer, I'd ask what, to me, is the most important question of them all: ''Mr. Designer, who designed you?"

If the designer answers that it doesn't know, that perhaps it was also designed, we fall into an endless regression, straight back to the problem of the first cause, the one that needs no cause. At this point the mask tumbles and we finally discover the true identity of the IDists' Designer. We should capitalize the word, as this is how we are taught to refer to God.

Read more here/Leia mais aqui: The Edge


MARCELO GLEISER, a professor of physics and astronomy at Dartmouth College, is the author of 'The Prophet and the Astronomer: A Scientific Journey to the End of Time'.
Marcelo Gleiser's Edge Bio Page

Mary Midgley against the worship of humanism in a humanist publication

Against humanism

Mary Midgley

Of course we should love, honour and cherish our species, says Mary Midgley. But should we have to worship it too?

Does the term “humanism” really stand for a new and better form of religion? If so, what is that religion? Or is it something designed as a cure for religion itself, a way to get rid of it on Christopher Hitchens’s principle that “religion poisons everything”?

Many people, no doubt, agree with Hitchens. But Auguste Comte, the founding father of modern humanism, would not have been one of them. For him, “humanism” was a word parallel to “theism”. It just altered the object worshipped, substituting humanity for God. He called it the “religion of humanity” and devised ritual forms for it that were close to traditional Christian ones. He thought – and many others have agreed with him – that the trouble with religion was simply its having an unreal supernatural object, God. Apart from this, the attitudes and institutions characteristic of religion itself seemed to him valuable, indeed essential. And he certainly had no wish to get rid of the habit of worship, only to give it a more suitable object. Surely (he said) worshipping human beings – who are real natural entities – would easily be able to replace the existing idle and artificial practices? So he ruled that, for instance, the enlightened citizen should start his day by worshipping first his mother, then his wife and then his daughter – after, of course, ensuring that they all did exactly what they were told for the rest of the time. And the other occasions of life could be similarly hallowed. This would all be part of his positivistic enterprise of developing the human scientific faculties that would finally enable us to abandon superstition.

These precepts, however, did not work out easily. Comte’s new Christian-like institutions withered like alien vines once they were applied to their new objects, even though he carefully policed them and trained his priesthood in the newly-discovered skills of Sociology. I once saw the still extant Comtian temple in Paris, a tidy little Victorian church with round (not Gothic) arches, its walls lined with statues of the Saints of Humanity – Plato, Newton, Shakespeare, Beethoven. I asked its gloomy concierge whether she thought anybody ever worshipped there but she replied, “Nobody. I think, never.”

Plainly, Comte’s simple recipe for grafting a new object on to traditional institutions – a new head on to the old body – did not produce the improved life-form he hoped for. This may seem odd. It should (we think) surely be possible simply to celebrate and admire the lives of past and present humans without getting committed to any questionable doctrines – without those suspect claims to a background beyond familiar facts which create the poison of religion for people like Hitchens. And of course we do celebrate people unpretentiously in this way. But our doing so hardly seems to constitute an ism, a cause, a distinctive attitude that says something about the whole human species. There is also the further question – even if you want to get rid of God, is the human race the right thing to be worshipping instead? Of course it is important to us simply because it is ours, but should we think of it as central to the cosmos, or even to earthly life? Considering that it is already making other species extinct at an increasing rate, do we really want to give it a kind of divine status?

Serious celebrations of individual human merit do not usually take us in this direction. We do not celebrate people simply for being specially human but for particular things they have done or said. They have changed our attitudes to particular ideals and values, and this new thinking can inspire quite new visions of reality. When a fresh prophet – Newton or Blake or Pythagoras or Jesus or Nietzsche or Darwin or Marx or Einstein or the Dalai Lama – appears among the existing Saints of Humanity, this contribution has wide consequences. Not only can it alter our map of human life, it can also call on us to change our whole world-picture. New ideals do not just alter our conduct. They can gradually change our whole conception of reality.

The reason why we revere these people is that they have extended the bounds of human experience, showing us things that the rest of us simply had never thought of. They have therefore encountered quite new problems and have had to describe them in new language, often using rich seams of metaphor that can never be unpacked literally. Subsequent efforts to work out their meaning can call for profound shifts which make everything appear differently – including, of course, both some splendid inventions and some fearful mistakes. And these shifts often change the way in which we conceive reality itself.

In doing this, we are not forced to stick to the revelations of a particular group of prophets who were specially revered during the Enlightenment. Indeed, even if we wanted to halt there we could not do so. There is no fixed, unalterable background map of the “familiar facts” that must survive all such shifts, and certainly no fixed schedule dividing real entities from fishy, imaginary ones. Entities like Fate and Progress and the Logic of History and the Hidden Hand of the Market come and go.

Materialists take matter to be what is typically real, but matter itself is not at all what it used to be. Newton’s reassuringly solid, inert particles are long gone. Energy, which succeeded them, seems now to be dissolving into a succession of more exotic possibilities. At present, many respected physicists advocate belief in the Multiverse, by which they do not mean just a crowd of existing extra universes but an apparently limitless string of new ones that continually come into being whenever a quantum event is needed to decide between two possible alternatives. This idea strikes many of us today (as it would have struck most people earlier) as not just unlikely but meaningless, yet it is now viewed as the kind of thing that can merit Nobel Prizes.

Changes like this in ontology – in what is considered to be real – are known to be so common in human history that it seems surprising when people treat a current doctrine about it as a timeless truth. That, however, is what has happened to the rather crude form of materialism that Comte himself enshrined by his positivist doctrine. Positivism got rid of Cartesian dualism – the twofold world of Spirit and Matter that had seemed so obviously final to Newton – not by rethinking it but by simply eliminating Spirit, leaving Matter to manage on its own. The main reason for doing this was undoubtedly the fear of religion. The whole concept of Spirit was seen as too dangerous because of its history, notably, of course, the political oppression of the churches. Thus, as often happens, the new insight was shaped chiefly by contrast with the previous one and taken as a final refutation of it.

But Matter had been so carefully defined by dualists as inert and alien to life that it was really hard to see how it could do all that was now expected of it – how it could be the source of conscious, active living animals, including ourselves. The unlucky consequence of this clash can be seen in what is now called the Problem of Consciousness, the desperate ongoing attempt by many scientists to find ways of talking about human experience in “scientific” language – language that has been carefully designed to make all such talk impossible.

This problem began to distress people during the 1970s because that was when the behaviourist veto on ever mentioning subjectivity finally lost its force. Behaviourists had been following positivist principles in dismissing the phenomena of consciousness as effectively unreal, since they could not be described in physical terms, and they concluded that psychologists could only study outward behaviour, taking no notice of experience.

Not surprisingly, this worked so badly that the theory was officially abandoned. Yet the general suspicion of talking about conscious experience remained very strong. Odd though it sounds, psychologists seem still to have thought that attending to subjectivity was the same thing as being subjective – that is, biased and uncritical. The world, in fact, consisted solely of objects with no subjects to observe them. As Marilynne Robinson has lately pointed out in a very sharp little book called Absence of Mind, this meant that our inner life – the place where the whole drama of human thought had till now been carried out – had somehow been scientifically proved not to exist. Thus our only source of information about the outer world was no longer available.

Despite this ruling, however, the difficulty of discussing observation without an observer – the absurdity of enquirers trying to leave themselves out of their own enquiries – increasingly bothered scientists, especially ones concerned with evolution, where the role and origin of conscious experience needed to be considered. It is all very well to eliminate God from the intelligible universe but eliminating ourselves from it blocks all sorts of enquiries. Not much of a human world is left once this is done. Accordingly, in the ’70s consciousness itself began to be officially rated as a mentionable scientific problem. A few other terms too have since gradually been readmitted to polite society, including, in recent times, even daring adjectives such as spiritual.

The emphasis is still, however, on the need to reduce these concepts to Matter as traditionally conceived – to bring them within reach of the abstractions used by the existing physical sciences. The search for a “scientific explanation of consciousness” which goes on at the yearly conference at the Center for Consciousness Studies in Tucson, Arizona still centres not on trying to be scientific in the sense of using suitable methods, but on making consciousness respectable by somehow bringing it within the range of physics and chemistry, mainly at present through neurobiology.

Read more here/Leia mais aqui: New Humanist

O formato do genoma é tão importante quanto seu conteúdo?

Is the Shape of a Genome as Important as Its Content?

ScienceDaily (Oct. 29, 2010) — If there is one thing that recent advances in genomics have revealed, it is that our genes are interrelated, "chattering" to each other across separate chromosomes and vast stretches of DNA. According to researchers at The Wistar Institute, many of these complex associations may be explained in part by the three-dimensional structure of the entire genome.

Three-dimensional structure of the fission yeast genome. (Credit: S. Pombe)

A given cell's DNA spends most of its active lifetime in a tangled clump of chromosomes, which positions groups of related genes near to each other and exposes them to the cell's gene-controlling machinery. This structure, the researchers say, is not merely the shape of the genome, but also a key to how it works.

Their study, published online as a featured article in the journal Nucleic Acids Research, is the first to combine microscopy with advanced genomic sequencing techniques, enabling researchers to literally see gene interactions. It is also the first to determine the three-dimensional structure of the fission yeast genome, S. pombe. Applying this technique to the human genome may provide both scientists and physicians a whole new framework from which to better understand genes and disease, the researchers say.

"People are familiar with the X-shapes our chromosomes form during cell division, but what they may not realize is that DNA only spends a relatively small amount of time in that conformation," said Ken-ichi Noma, Ph.D., an assistant professor in Wistar's Gene Expression and Regulation program and senior author of the study. "Chromosomes spend the majority of their time clumped together in these large, non-random structures, and I believe these shapes reflect various nuclear processes such as transcription."

Read more here/Leia mais aqui: Science Daily


Mapping of long-range associations throughout the fission yeast genome reveals global genome organization linked to transcriptional regulation

Hideki Tanizawa, Osamu Iwasaki, Atsunari Tanaka, Joseph R. Capizzi, Priyankara Wickramasinghe, Mihee Lee, Zhiyan Fu and Ken-ichi Noma*

+Author Affiliations

The Wistar Institute, Philadelphia, Pennsylvania, USA

*To whom correspondence should be addressed. Tel: +1 215 898 3933; Fax: +1 215 573 7919; Email:

Received August 6, 2010.
Revision received September 29, 2010.
Accepted September 30, 2010.


We have comprehensively mapped long-range associations between chromosomal regions throughout the fission yeast genome using the latest genomics approach that combines next generation sequencing and chromosome conformation capture (3C). Our relatively simple approach, referred to as enrichment of ligation products (ELP), involves digestion of the 3C sample with a 4 bp cutter and self-ligation, achieving a resolution of 20 kb. It recaptures previously characterized genome organizations and also identifies new and important interactions. We have modeled the 3D structure of the entire fission yeast genome and have explored the functional relationships between the global genome organization and transcriptional regulation. We find significant associations among highly transcribed genes. Moreover, we demonstrate that genes co-regulated during the cell cycle tend to associate with one another when activated. Remarkably, functionally defined genes derived from particular gene ontology groups tend to associate in a statistically significant manner. Those significantly associating genes frequently contain the same DNA motifs at their promoter regions, suggesting that potential transcription factors binding to these motifs are involved in defining the associations among those genes. Our study suggests the presence of a global genome organization in fission yeast that is functionally similar to the recently proposed mammalian transcription factory.
© The Author(s) 2010. Published by Oxford University Press.

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License, which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
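The abstract notes that digesting the 3C sample with a 4 bp cutter yields a resolution of about 20 kb. A back-of-envelope calculation helps explain why a 4 bp recognition site gives finer resolution than the more common 6 bp cutters: in random DNA with uniform base composition, a specific k bp site occurs on average once every 4^k bases. The sketch below is only an illustration of that expectation, not the paper's actual pipeline; the enzyme, recognition sequence and function names are hypothetical.

```python
import random

def expected_fragment_length(site_len: int) -> int:
    """Expected distance between occurrences of a specific recognition
    site in random DNA with uniform base composition: each position
    matches with probability (1/4)**site_len, so sites occur on
    average once every 4**site_len bases."""
    return 4 ** site_len

def simulated_mean_fragment(site: str, genome_len: int, seed: int = 0) -> float:
    """Digest a random sequence at every occurrence of `site` (a
    hypothetical recognition sequence) and return the mean fragment
    length, as a sanity check on the expectation above."""
    random.seed(seed)
    seq = "".join(random.choice("ACGT") for _ in range(genome_len))
    n_cuts = 0
    i = seq.find(site)
    while i != -1:
        n_cuts += 1
        i = seq.find(site, i + 1)
    # n_cuts cut sites split the sequence into n_cuts + 1 fragments
    return genome_len / (n_cuts + 1)

# A 4 bp cutter cuts on average every 4**4 = 256 bp, whereas a
# 6 bp cutter cuts only every 4**6 = 4096 bp -- hence the finer
# fragment grid underlying the reported 20 kb map resolution.
print(expected_fragment_length(4))   # 256
print(expected_fragment_length(6))   # 4096
```

Real genomes are not uniform-random, so actual fragment sizes deviate from 4^k, but the scaling argument stands: shorter recognition sites cut more often and support higher-resolution interaction maps.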


Solsbury Hill - Peter Gabriel

Climbing up on Solsbury Hill
I could see the city light
Wind was blowing, time stood still
Eagle flew out of the night
He was something to observe
Came in close, I heard a voice
Standing stretching every nerve
I had to listen had no choice
I did not believe the information
Just had to trust imagination
My heart going boom boom, boom
“Son,” he said, “Grab your things, I’ve come to take you home.”
To keeping silence I resigned
My friends would think I was a nut
Turning water into wine
Open doors would soon be shut
So I went from day to day
Tho’ my life was in a rut
‘Till I thought of what I’d say
Which connection I should cut
I was feeling part of the scenery
I walked right out of the machinery
My heart going boom boom boom
“Hey,” he said, “grab your things, I’ve come to take you home.”
Yeah back home
When illusion spin her net
I’m never where I want to be
And liberty she pirouette
When I think that I am free
Watched by empty silhouettes
Who close their eyes, but still can see
No one taught them etiquette
I will show another me
Today I don’t need a replacement
I’ll tell them what the smile on my face meant
My heart going boom boom boom
“Hey,” I said, “You can keep my things, they’ve come to take me home.”