Showing posts with label science. Show all posts

Thursday, February 9, 2012

Evolution, death, living, and the 5 most important drugs

I’ve been procrastinating on writing this article for many a month, and the longer I sit on it (or it sits on my mind), the more lethargic I feel, a lethargy that spills into everything else. Anyway, evolution is my most obsessive topic in science, and I spend more time thinking and reading about it than about any other thought seed. Recently I watched a TED talk about the future of evolution (anyone too sensitive about the whole eugenics business, who feels their heart stabbed at even the hint of “controlling” evolution, can conveniently stop reading at this point and return to their routine without bothering with the linked video), and something mentioned in it unexpectedly ignited my interest in pharmacology. After much thought, the idea boils down to this: you don’t need to meddle with genes to have a conscious influence on our species’ evolution; we are already doing it – in the form of medical interventions of all kinds, especially those that delay death.

Until a few hundred or a few thousand years ago, man was, physiologically speaking, completely subject to the will of nature. Any harmful genes, and any genes that decreased an individual’s probability of survival – hereditary diseases, genes influencing moronic behaviors that lead to death, and so on – were eliminated from the human gene pool whenever the individual died before breeding. What we define as harmful in this context need not be something extreme like cancer or plague; even a mild infection that leads to gangrene and eventually death qualifies, as does a secondary infection during a simple fever (a time when the immune system is busy fighting the primary illness, so that even something as inconsequential as a dog bite or an infected bruise may not be dealt with sufficiently, increasing the chances of death). So nothing need be wrong with the genes themselves: the body’s immune system, which is a manifestation of those genes, can be easily influenced by external agents, and with it the chances of those genes being carried into the next generation. Or death might not come from a physiological dysfunction at all, but from being eaten by a predator or from plain stupidity, both of which are again manifestations of survival instincts shaped by genes. Evolution has its fingers in every pie. In as many cases as not, death used to occur well before breeding, and those genes were eliminated from the gene pool through natural selection – a phenomenon powered by the thin line that separates death and survival.**

However, with the institution of medical science, which has been gradually deepening over the past couple of millennia, we have been actively delaying death, and getting better at it, thereby managing to reproduce before death (not that there’s such a thing as ‘reproduce after death,’ but you get the point) and to pass on every kind of gene, good or bad, to our offspring, and they to theirs. The larger-perspective consequence of this medical intervention is that we are keeping natural selection from acting on those undesirable genes and ensuring that they remain present in the human gene pool. As a quick illustration of this effect, two graphs below show the average life expectancy of an individual over the past few centuries and the infant mortality rate for the 20th century, both of which suggest that medical intervention has increased the chances of gene survival in the gene pool:


(Remember, though, that death can also occur after infancy but before reproduction - the so-called "virgin deaths," for lack of a better term - so the argument holds for that demographic too, even though the graphs don't capture it)

The conclusion here is that we are not at the mercy of nature anymore to survive, and even without actually manipulating the genes directly, we are still consciously influencing their presence in our species. Since genes are what determine the course of evolution, we humans have already begun to control the future of our species on an evolutionary scale (you could also argue that veterinary science has done the same for animals, although to a lesser degree). And in addition to delaying death and ensuring reproduction, medical intervention also treats or cures inherited/congenital and other diseases arising from those genes or from external agents, which, as I said earlier, are again manifestations of genes.

The above-linked video discusses working directly on the genes themselves – manipulating them consciously to eliminate harmful genes altogether and tune our mortality – rather than indirectly through medical science as I just described, but that is not what I want to talk about here. I am more interested in how medical science has indirectly worked on our genes and helped you and me come into existence and stay in existence.

The following are five of the most important and impressive pharmacological miracles, ones we either owe our existence to or without which humankind would be much, much worse off. I say humankind as a collective term, of course. Individually, we all have, or will have later in life, a medicine that is as important to us personally as these are to mankind as a whole.

1. Penicillin.
The granddaddy of all modern antibiotics and the first antibiotic ever discovered. It is estimated that 75% of today’s world population would not be alive had penicillin not saved their ancestors from infections – 75% of us would never have known what it means “to live.” Penicillin, more than any other single factor, drove the exponential rise in population in the 20th century. Bacteria are gradually developing resistance to penicillin, rendering it ineffective against several infections, but secondary antibiotics derived from penicillin – and many other antibiotics still being discovered through the same method Dr. Fleming used when he (accidentally) discovered penicillin in 1928 – are rising up to fill the grandmaster’s place.




2. Ether
Ether – what we generally call anesthesia – was first used surgically in 1842. All surgeries performed until then were live surgeries, with the patient conscious. If a mass had to be removed, you were cut open while feeling every moment of the excruciating pain, and many died of the pain itself. It’s not that we are here and alive because of this chemical compound, but without it, the millions of surgeries taking place every year would either never take place, out of fear of the pain, or be so painful as to suck the life force out of you. In the awe-inspiring HBO mini-series John Adams, one of Adams’s daughters undergoes a breast removal for a malignant tumor – while still awake and feeling every inch of the blade as it cuts her breast off, as Mr. and Mrs. Adams cry in each other’s arms at their daughter’s screams of agony. When the cancer recurs in the other breast a while later, she refuses surgery and accepts death.

3. Smallpox vaccine

The first successful vaccine in medical history, it symbolizes man’s ascendancy over the dark side of nature – a symbol in the form of a healed lesion on your left shoulder. Smallpox was one of the deadliest epidemics in recorded time, notorious not just for its mortality rates but for the stretch of time it lurked in our species: over 12,000 years. Even the Spanish flu, which claimed over 50 million lives in the early 20th century, did not last longer than a few years. The vaccine used against the smallpox virus is, ironically, made of another virus – the cowpox virus. A classic case of fighting fire with fire, huh?




4. Antidepressants
There is a very sick mentality among the general population of judging people who are on antidepressants. I once briefly worked on a psychiatry department account and had a long stint with general-physician consulting in my line of work, and I never ceased to be surprised at the number of patients with chronic depression who refused to go on antidepressants simply out of fear of what their friends might think (I am not exaggerating; I’m barely even paraphrasing). What other people might think! I don’t think the patient is completely at fault here, because, as Col. Walter Kurtz rightly put it, “It’s judgment that defeats us.” We are all inherently afraid of being judged, and that fear surfaces in these depressed patients at the thought of being on a drug for the mind, emotionally helpless as they already are. But what sickens me to my gut is seeing or hearing someone judge – either vocally or with those subtle facial expressions through which people so wonderfully communicate condescension – a person who is on antidepressants. No one judges patients with diabetes, heart disease, renal failure, or any other physiological condition, but when it comes to a psychological issue, an emphatic “OH!” ensues. So here is the truth: clinical depression is neither a choice nor a reflection of the state of the sufferer’s life. It is as physiological a condition as osteoarthritis or congestive heart failure. Chronically depressed patients’ brains are physically unable to produce the appropriate amounts of the chemicals required to be happy. The only way to treat the condition is with an antidepressant drug, the way you would treat a backache with ibuprofen – i.e., with medicine! You can’t (read slowly: CAN’T) cure it by thinking optimistically, thinking happy thoughts, taking a freakin’ break, or following any other worthless piece of pop-culture advice.
Go watch this enlightening Stanford video, educate your ignorant, harebrained ass, and refrain from making any smartass judgment calls the next time you hear someone is on an antidepressant. You could potentially save a life just by being a decent human being. Too much to ask?

Maria Bustillos, examining the reasons for DFW's suicide, wrote the most sensible words I've ever read from a layperson on the subject:

I have known intimately and looked after depressed people, and have no illusions about my ability to understand the real nature of that illness. The sort of blues I occasionally suffer through compares to real depression like a broken fingernail compares to being shot in the head and then set on fire and drowned. But it seems to me that the victims of that terrible disorder are often trying all their lives in vain to figure out why this must be so. Why them. And maybe there really is just no reason, or the reason is completely random, a cluster of neurons misfiring one day by accident, a bad thing that happens and could not be helped.

On a similar note, as much as I admire and appreciate Sir Ken Robinson’s intricate critique of the sorry state of our school and education systems worldwide, a glimpse of this judgmental attitude towards psychogenic drugs can be seen in two (1, 2) of his million-plus-view YouTube videos. In the second video, although he correctly admits that he is not qualified to comment on ADHD, he somehow believes he is qualified to comment on ADHD drugs. The very least qualification required to comment on this class of drugs is to have tried one. But without knowing what it feels like to be on one, without conducting proper research into the drug, and without knowing what the drug does inside the brain, Sir Ken confidently posits that psychostimulant ADHD drugs numb your senses and let you focus on the “boring” subject by disabling the heightened state of enjoyment you feel while watching a work of art. The reality is hilariously contrary: ADHD medications heighten, NOT numb, your senses, and thereby enable you to focus on whatever you want to focus on, including art. As someone who has tried modafinil, I can vouch for this, and so can academic sources and the millions of others on it. Just try watching a film on modafinil and you’ll bet your life savings you’re enjoying it more than Sir Ken would. Daft statements like these show not just how uneducated (no pun intended) and biased he is against the science of mind and drugs, but also the credulity of the audience that nods its head and shares his judgment of ADHD drugs and of those who prescribe or support them. Another sad example comes from Doug Stanhope, one of my favorite and extremely bright comedians. As I said, I love both Robinson and Stanhope in all other respects but fall short of sympathizing with their disdain and judgment towards psych drugs. And all this is just a glimpse; the bigger picture is much uglier.

[I’m sorry about the tone of the two paragraphs above. I vent bitterly because I too was on an anxiolytic for a brief period last year, and, save a few friends, pretty much everyone else gave me a judgmental look for it. It wasn’t long before I realized this is a universal phenomenon – one that drives millions of people around the world to suicide simply because they’d rather bear the consequences of leaving their psychological condition untreated than be judged for accepting treatment.]

5. Antihypertensives
Approximately 1 billion people in the world currently suffer from hypertension (high blood pressure), and it is estimated that one out of every four people born will develop severe, chronic hypertension in adulthood, and another one of those four will develop moderate hypertension (hypertension classification here). Hypertension is called the “silent killer” because its symptoms are so mild, and go unfelt for so long, that you adjust to its abnormality as normalcy while it slowly eats away at your longevity. Then one day, thud! Hypertension can in rare cases occur even at a young age, but it normally sets in well into adulthood, so antihypertensives are drugs that extend your days, not (just) drugs that help you survive until you reproduce. Many of us owe our parents’ and grandparents’ long lives to this class of drugs.

Of course, needless to say, every class of drugs is important, not just the ones I have chosen to mention here: from nutritionals to antivirals, NSAIDs to antipyretics, analgesics (very commonly used) to ACE inhibitors, insulin to even birth-control and emergency-contraceptive pills – sometimes not just restoring normalcy but elevating it to higher standards of healthy living. Some are personally more important to us than any of the drugs mentioned above. But we all need them. Even those who prefer Ayurveda, homeopathy, and other alternative medicines, and who show a general disdain towards Western medicine, owe their existence to them, deny it as they might. Pharmacology and medical science are what make living today better than living in the past – more so than any other perk of human development.

____________

** (It's also worth noting that the death before reproduction of a single member carrying an undesirable gene won't necessarily eliminate that gene from the gene pool. The individual's siblings - who may have inherited the same gene from a common parent - and other members of the species carrying the same gene all influence how long it survives in the system, which is why undesirable genes tend to persist for many generations.)
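The footnote's point - that a gene lethal before reproduction can still linger for generations - can be sketched with the standard population-genetics recurrence for selection against a recessive homozygote. The function name, the starting frequency of 0.10, and the choice of s = 1.0 (death before breeding) are my own illustrative assumptions, not data:

```python
def next_freq(q, s=1.0):
    """One generation of selection against a recessive allele at frequency q,
    where homozygotes have fitness 1 - s (s = 1.0 means death before breeding)."""
    return q * (1 - s * q) / (1 - s * q * q)

q = 0.10            # hypothetical starting allele frequency
history = [q]
for _ in range(100):
    q = next_freq(q)
    history.append(q)

# With s = 1 the frequency falls only as q0 / (1 + n*q0): after 100
# generations of full lethality, nearly 1% of gene copies still remain.
```

Even under the harshest possible selection the allele persists, because heterozygous carriers - the siblings and other members of the footnote - never express it and keep passing it on.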

____________

I have some thoughts on how wealth affects genes and the human gene pool from an evolutionary perspective, but they are still raw, disorganized and incomplete, so hopefully by next year I can write a blog post about it. 

Monday, May 30, 2011

On the importance of variety


I'm sure you've all heard and read - to nauseating levels in pop culture, sometimes - the saying that "variety is the spice of life." I'm sure at least some of you have read nerdlikejazzy's recent viral blogpost '18 Things I Wish Someone Would’ve Told Me at 18,' point number 4 of which elaborates on "exploring new ideas and opportunities often." In this post, which in all likelihood will be my last for the next 6 months owing to my busy schedule, I'll add a few points supporting the imperative of varied experiences in life. Don't worry, I'm not gonna get into any heavy-handed, pseudophilosophical discussion about the meaning of life or the "painful journey" we call life. I'll stick to a scientific perspective and skim the surface of the basis that substantiates the claim.

The American psychologist Fred Attneave writes about the human brain:
"If there were just one brain cell to cope with each image that we can distinguish in all its presentations, the volume of the brain would have to be measured in cubic light years."

Meaning: only a tiny bit of information (information as in an image) can be stored in and processed by one brain cell, and since we come across literally infinite images (well, not literally!) each waking moment of our lives, each human brain would have to be the size of a galaxy to successfully process all the images (or other information) we see at any given time, all the time. But we know this isn't true, since we know how big our brains really are - anywhere from the size of a coconut, for most of us, to the size of a peanut, for some people like Arindam Chaudhuri and his disciples. Despite the size really required for the job, our brain processes information with extraordinary precision using only a billionth of that volume. How, though?

The brain has a workaround, discovered independently in the 1950s by Barlow and Attneave: it has evolved a very sophisticated and complex system for erasing redundancy. Redundancy is the opposite of information - redundancy is a measure of unsurprisingness, information a measure of surprisingness. To illustrate with an analogy, imagine what a worthless waste of paper it would be if the newspapers reported every single day that the sun rose in the east. It's redundant; everyone knows without being reminded that the sun rises in the east every day. But in the unlikely event that the sun rose in the west on some fateful day, that would be news. It would be a change from the daily routine of the sun's rising in the east, and no newspaper would fail to report it the following day. Its information content - measured as the 'surprise value' of the message - would be enormous. The brain works in a very similar way, whichever body sense it is responding to and whatever memory it is storing, because it has only a limited number of neurons to work with. In fact, the entire nervous system works this way. I'll explore two examples, of the senses in particular.
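The 'surprise value' described above has a standard formal measure: Shannon's information content, -log2(p) bits for an event of probability p. A minimal sketch - the two probabilities are made up purely for illustration:

```python
import math

def surprisal(p):
    """Shannon information content of an event with probability p, in bits."""
    return -math.log2(p)

# "The sun rose in the east": virtually certain, so near-zero information.
routine = surprisal(0.999999)
# "The sun rose in the west": vanishingly unlikely, so enormous information.
shock = surprisal(1e-9)   # roughly 30 bits
```

A certain event (p = 1) carries exactly zero bits, which is why the east-rising sun never makes the front page.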

The eye. Suppose you are looking at a black rectangle on a white background. The whole scene is projected onto your retina - think of the retina as a screen covered with a dense carpet of tiny photocells, the rods and cones. In theory, each photocell could report to the brain the exact state of the light falling upon it. But the scene we are looking at is massively redundant. Cells registering black are overwhelmingly likely to be surrounded by other cells registering black, and cells registering white are nearly all surrounded by other white-signalling cells. The important exceptions are cells on edges. Those on the white side of an edge signal white, and so do their neighbours further into the white area; but their neighbours on the other side are in the black area. Using a phenomenon called 'lateral inhibition,' the brain can reconstruct the whole scene with just the retinal cells on edges firing - a massive saving in nerve impulses. The eye tells the brain only about edges, and the brain fills in the boring bits in between. Any image you see, including these words as you read them and whatever is lying by your computer, is reconstructed in your brain through this same inhibition of redundancy.

You can see this lateral inhibition in action in the optical illusion below. Cover the central divide between the two shades of grey with your index finger and stare at your finger for a few seconds: both sides now appear to be made of the same shade of grey, or very nearly the same. Now move your finger off the screen, and the difference between the shades drastically increases. Can you reason why?



The ear. Hearing works very similarly to vision as described above (with audible sounds analogous to images): the system cuts down on redundancy in the pitch of a sound, filters in only the information about the variation in pitch, and the brain processes and reconstructs the sound - all in real time. But with sound there is an additional factor: time itself. To show just how dramatically the brain filters out redundancy, let me cite a study conducted by researchers at Rice University. Two groups of subjects listened to two sound pieces: the first group heard a sound of constant pitch, the second a sound of varying pitch. Neither group was told the duration of the sound, and none of the subjects had clocks, watches, or any other way of measuring time. When asked afterwards to guess the duration of the piece they had just heard, the subjects of the first (constant-pitch) group reported durations much shorter than those reported by the second group - despite the fact that both pieces were of exactly the same duration! The first group's brains not only cut down on the redundancy of the constant pitch but also shrank the subjective perception of the time associated with that redundant information. The second group's brains, because of the variation in pitch and thus greater 'surprise content,' stored not only more information about the sound but also more of the time associated with it.

Where I am going with all these illustrations and citations should be clear by now. Four years of college feel like four months for the same reason the brain remembers a constant-pitch sound as shorter than it is: repetitive, routine tasks are compacted by the brain into a few typical, representative prototypes, and the subjective perception of time is cut short in accordance with those compressed memories - filtering out not only the redundancies of the experience but also much of the perceived time the experience is made of. This is essentially how our brain retains boiled-down memories over an entire lifetime.
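This compaction of repetitive experience is loosely analogous to run-length encoding: a constant signal collapses to a single run, while a varying one keeps nearly all of its length. A toy analogy, not the brain's actual mechanism - the sample values are arbitrary:

```python
from itertools import groupby

def run_length_encode(seq):
    """Compress a sequence into (value, run_length) pairs - a crude stand-in
    for the brain collapsing repetitive experience into one prototype."""
    return [(v, len(list(g))) for v, g in groupby(seq)]

constant = [440] * 1000                          # constant-pitch "sound"
varying = [440 + (i % 7) for i in range(1000)]   # pitch that keeps changing

# The constant piece compresses to a single run; the varying piece to 1000.
```

Both sequences are the same length, but the constant one leaves almost nothing behind in storage - much like the constant-pitch group remembering a shorter sound.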

So, to counter this seeming 'drawback' in the design of our brain - to keep life from feeling short-lived, as if it had passed in a flash, to avoid the perception of time having flown like an arrow - it's most advisable to have varied experiences of different kinds that the brain can perceive as information or 'surprise content': to try new things if only for the feel of it, to break the tedious, recurrent routines that register as redundant, and to collect as many photos, videos, and souvenirs of the experience as you can. Because remember: there's only so much the brain can remember.
