Monthly Archives: May 2011

This IS Rocket Science… A brief introduction.

Hello there. I decided to write this after I prepared a short ten-minute segment on rocket technology for a podcast. In this article, I will be writing about the basic physics that underlies rocket propulsion and will also elaborate upon some of the basic types of rocket engines.

The Physics of Rocket Propulsion

The physics of rocket propulsion is actually quite simple and rests on Newton’s laws of motion, the third law in particular, which states that for every action there is an equal and opposite reaction. A rocket engine expels a rapidly moving, large-volume jet of gas in one direction and is propelled by the reaction force in the opposite direction. In most cases, this jet of gas is produced by the chemical process of combustion.

This is in fact very similar to how jet engines work, but the difference is that jet engines can only work when there is an external oxygen supply, while rockets can work anywhere due to having an onboard supply of oxidiser.

The combination of oxidiser and fuel and the combustion that follows is the main mechanism by which rocket engines produce that all-important propulsive jet. Some of the arrangements by which rocket engines achieve this will be discussed later.
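The reaction principle can be put in numbers: to a first approximation (neglecting the pressure term at the nozzle exit), the thrust a rocket produces is simply the mass of gas expelled per second multiplied by the exhaust velocity. A minimal sketch, with made-up figures purely for illustration:

```python
def momentum_thrust(mass_flow_rate_kg_s, exhaust_velocity_m_s):
    """Momentum thrust: the reaction force equals the rate at which
    momentum is carried away by the exhaust jet (pressure term neglected)."""
    return mass_flow_rate_kg_s * exhaust_velocity_m_s

# 250 kg of gas expelled per second at 3000 m/s gives 750 kN of thrust
thrust_newtons = momentum_thrust(250, 3000)
```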

Rocket Propulsion - Of Action and Reaction

A brief history of Rocketry.

Now, while Newton was the first to place the law of action and reaction in a mathematically rigorous framework, I think it is important to note that the idea of jets of gas as a means of propulsion goes back a long, long way. I would personally point to Hero of Alexandria’s rotating steam-powered device, the aeolipile, which used jets of steam to drive a solid body around an axis.
It may be possible to procure a classroom model of this device and see it in action.


The first true rockets (in the sense of containing their own oxidizer) were built by the Chinese, probably by the thirteenth century AD, and were powered by gunpowder. There are reports of rockets being used in combat not only by the Chinese but also by the Mongol hordes of Genghis Khan, who took the technology with them on their conquests.

Rocket technology reached Europe in the Middle Ages thanks to the Mongols and later the Ottoman Empire, and it was incorporated into the artillery practices of the era. Iron-cased rockets were first successfully employed by the Kingdom of Mysore against the soldiers of the English East India Company in the Anglo-Mysore Wars of the late 1790s. This kind of rocket was further developed by William Congreve, the Comptroller of the Royal Arsenal in London; he added long stabilizing tail-poles to improve accuracy and developed stronger rockets and new propellant mixtures to improve range.

Congreve Rockets look like this.

Congreve Rockets

William Hale came up with the idea of slightly vectored thrust which improved control of direction further. This negated the need for a tail-pole and thus contributed to rockets being lighter and more aerodynamic.

As with so many other things in science, science fiction was where the idea of space travel using rockets first began to take root; Jules Verne’s classic novel “From the Earth to the Moon” comes to mind.

Scientists like Konstantin Tsiolkovsky in Russia/the USSR, Hermann Oberth in Germany and Robert Goddard in the USA worked to give the science of rocketry a rigorous footing. It was eventually Robert Goddard who built a very successful rocket (by the standards of its day, in any case), based on design principles such as a dedicated combustion chamber and the use of an asymmetrical, hourglass-shaped (de Laval) nozzle to accelerate the propulsive gases to supersonic speeds. He and Tsiolkovsky also speculated about the idea of multi-stage rockets, which essentially stack multiple self-contained rocket engines.

To go on a little tangent here: multistage rockets extend range, and the ability to discard spent units makes it progressively easier to accelerate the payload to ever-increasing velocities. This is critical in achieving the velocity required to escape the Earth’s gravity, which eventually made space exploration using rockets possible.
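The benefit of staging can be illustrated with Tsiolkovsky’s rocket equation, Δv = vₑ · ln(m_initial / m_final). The masses below are hypothetical, chosen only to show that discarding spent structure mid-flight yields more total velocity from the same propellant:

```python
import math

def delta_v(exhaust_velocity, m_initial, m_final):
    """Tsiolkovsky rocket equation: ideal velocity gain of a rocket
    that burns its mass down from m_initial to m_final."""
    return exhaust_velocity * math.log(m_initial / m_final)

V_E = 3000  # exhaust velocity in m/s, assumed the same for every burn

# Single stage: 100 t at ignition, 15 t of structure and payload left
single_stage = delta_v(V_E, 100, 15)

# Two stages burning the same 85 t of propellant in total, but with
# 10 t of spent casing dropped between the burns (hypothetical masses)
two_stage = delta_v(V_E, 100, 45) + delta_v(V_E, 35, 5)

# two_stage exceeds single_stage: shedding dead weight mid-flight
# lets the same propellant accelerate the payload further
```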

Robert Goddard launched the world’s first liquid-fuelled rocket in 1926. Several companies around the world then started to dabble in rocketry, and there was significant activity until the Second World War broke out, including advances in the design of various engine components and in methods of implementing the principles of rocketry.

Goddard Rocket

The photograph above is of Goddard and his liquid fuelled rocket.

In World War II, advances were made in the military deployment of rocket technology, including large unguided rockets like the Soviet Katyusha and sophisticated guided missiles like the German V-2. Messerschmitt also built a rocket plane, the Me-163 Komet, which by all accounts was a pretty unsafe machine to handle, given its unstable propellant components, which had a tendency to explode.

The Messerschmitt Me-163 Komet

Following the war, rocket development, while still focussed on military uses, found offshoots in space exploration programmes. The Redstone rocket programme in the USA and the R-7 derivatives of the V-2 in the USSR laid the foundations for propelling objects into space, and the solving of the problems of atmospheric re-entry finally made space flight a reality.

The space age had arrived.

Types of Rocket Engines – A Brief Introduction

Right, as mentioned earlier, there are several extant ways by which rocket propulsion may be achieved, and I am now going to try to offer very brief explanations of the defining features of each.

Solid-propellant rockets use a solid-state combination of fuel and oxidizer called the grain. The grain does not spontaneously combust and therefore must be ignited by other means, such as a small powder charge. It is obviously difficult to stop and restart the combustion of solid propellants on demand, and they extract less thrust per unit of propellant than other types of rocket engine, making them less efficient; however, they can be stored for long periods of time without too many safety risks.

Solid Propellant Rocket Engine - Schematic.

Liquid-propellant rockets bring together oxidizer and fuel in liquid form, either through the use of pumps or by using pressurized gases. Some fuel-oxidizer combinations ignite spontaneously on contact, and rockets using these fuels don’t need a dedicated ignition system; where one is needed, electric sparks or a small amount of solid propellant can be used. These rockets offer more thrust and control, at the cost of increased complexity and storage concerns.

One type of Liquid Fuel Rocket Engine

Electric rockets use electricity to produce thrust. They are not as powerful as the other two types of rocket engine described so far and therefore, as far as I’m aware, are restricted to propulsion after a spacecraft has already escaped the Earth’s gravity. They use things like ions, jets of plasma, and the control of electric and magnetic fields to accelerate gases that are amenable to manipulation by such fields.

An Electric Rocket Engine - schematic

One other kind of rocket that has been envisaged uses heat from a nuclear reactor to raise liquid hydrogen to extremely high temperatures, producing a rapidly expanding gas jet that can generate a lot of thrust.

Nuclear Rocket - Schematic

I guess that is it for this article; I hope you enjoyed reading it. You could always peruse Wikipedia or the Encyclopaedia Britannica for more information, as it is quite an addictive thing to read about ;)

That is it from me.

-Ankur ‘Exploreable’ Chakravarthy

Exciting world of neurosciences

Hello everybody,

Today, I am going to write something about a subject that is very dear to me. I am fascinated by this subject, so I hope a little of my fascination rubs off on you. I am new to this subject and not an expert, but I hope that you will learn about and come to appreciate this subject just as I have.

The subject that I am going to talk about is THE BRAIN. Yes, I know. There are a million different things that can be said about this one “master organ”. It is a three-pound mass of jelly-like tissue (you can hold it in the palm of your hand!) made up of about 100 billion neurons, the class of cells which make up the brain and nerves. The brain is such a wonderful organ that it can contemplate the vastness of the universe, the meaning of infinity, God… In fact, it can even contemplate itself contemplating the meaning of infinity. Self-awareness, in my view, is the holy grail of the neurosciences.

All these neurons together give rise to the entire spectrum of human activities. There are several different methods for understanding how the brain works. One such method is to study brains that have been damaged. If the damage is confined to a small region of the brain (whether due to a genetic change or a physical injury), the brain doesn’t stop working altogether, and there is no reduction in cognitive ability on the whole. Instead, there is a highly selective loss of one function while other functions are preserved intact. This makes it easy to map function onto structure and understand how each structure contributes to overall activity.

Here are a few examples which help you understand this process:

Example number 1: Capgras Syndrome

According to Wikipedia, the Capgras delusion (or Capgras syndrome) is a disorder in which a person holds the delusion that a friend, spouse, parent, or other close family member has been replaced by an identical-looking impostor. It involves a very specific part of the brain called the fusiform gyrus (part of the occipitotemporal gyrus), which in layman’s terms can be called the face area of the brain. If this particular part of the brain is damaged outright, you can no longer recognise people just by seeing their faces; mind you, you can still recognise them by hearing them. In fact, you won’t even be able to recognise yourself in a mirror. Of course, you know it is you, because the reflection imitates your actions.

In this rare syndrome, the person will be completely lucid but still will not be able to recognise his own friends and family members. In older psychiatry textbooks, this was explained by a Freudian principle: the Oedipus complex in men and the Electra complex in women. According to this explanation, young children have a strong sexual attraction to their parents (“father-fixated” and “mother-fixated”). As they grow up, the cortex develops and inhibits these latent sexual feelings. If there is damage to the part of the brain which suppresses these feelings, then the sexual arousal returns.

You have to understand that I don’t necessarily believe in this principle; it just happens to be one of the explanations that was offered to resolve the Capgras delusion. But this principle cannot explain why a person with the Capgras delusion has difficulty recognising his own pet. The whole Freudian explanation (Oedipus and Electra) doesn’t really work for pets, does it?

To explain it, scientists looked at the visual areas in the brain (all 30 of them). A visual object is processed and the information sent to a small structure in the brain called the fusiform gyrus, where faces are perceived. From this structure, the message cascades into another structure called the amygdala, in the limbic system, which is involved in gauging the emotional significance of the object that has been visualised.

The patient with the Capgras delusion may have a problem in the pathway where neurons connect the fusiform gyrus with the amygdala. He recognises his friends, parents and so on, but the emotional significance is lost, so the patient starts thinking of his own friends and family as impostors. (Scientists determined this based on the galvanic skin response.)

You can read more about this methodology here.

How is this complex neural circuitry set-up in the brain? What is the reason for it? Is it nature, nurture or genes?

Example number 2: Phantom Limb

Recently, an uncle of mine had an amputation done on his leg (above the knee) because the doctors found giant cell tumours below his knee cap. I started reading up about the peculiar feeling that he had post-surgery; it was a vivid, compelling experience for him. Of course, I knew what a phantom limb was, but I had never appreciated the severity of the situation. He was bedridden for several months before the operation and experienced severe pain (due to a fracture in the same leg). His brain sent signals to his leg to move, but it got back visual feedback saying “NO”. This is called learned paralysis, because the brain becomes wired to expect that even after sending a command, there is no appropriate result.

Even after the operation, he continued to feel pain in the same place. He knew his limb was amputated; he could see that it was gone, but still he felt the pain. The learned paralysis allowed him to feel the pain, and the phantom limb behaved like a paralysed limb. The only way of dealing with this is to allow the brain to see that the phantom limb is moving according to its commands. This can be done using a mirror box, a simple but ingenious creation of Vilayanur S. Ramachandran. You can read more about this here.

Example number 3: Synaesthesia

Synaesthesia is a neurologically based condition in which stimulation of one sensory or cognitive pathway leads to automatic, involuntary experiences in a second sensory or cognitive pathway. In short, it is a mingling of the senses.

In one common form of synaesthesia, letters or numbers are perceived as inherently coloured: 5 is red, 3 is green, 1 is white, 2 is blue, and so on and so forth. These people are completely normal in other ways. Sometimes sound and colour get mixed up; C sharp is yellow, perhaps?

Did you know that synaesthesia is 8 times more common in artists, poets, novelists and other creative people?

In the brain, the “colour area” and the “number area” are next to each other. In people with synaesthesia, there is a cross-link between these two areas, thought to arise from a gene that alters interneuronal connections. Artists often have a way of perceiving seemingly mundane things in a different way (“Her lips were like a volcano that’s hot” – Elvis Presley), and there may be a link between these things. You can read more about this here.

Just a few more things to say: each neuron makes 1,000 to 10,000 contacts with other neurons in the brain. That is a lot! This post is not a review of any one particular paper; I will leave that to someone who is an expert in the field. What I want to achieve by posting this is to create an interest in the neurosciences amongst budding scientists. A lot is being done in this area, and yet we have taken only baby steps in our understanding of the brain and its functions. If we join this research, we will surely be able to understand more about ourselves and our behaviour.



Will Indian Society Finally Wake Up?

Hello there,

It isn’t often that I blog about social issues, but the one we have at the moment is one I deem important enough that overlooking it would be impossible. We are talking about a further, alarming skewing of the child sex ratio in India, as reported in a research paper in The Lancet. The paper in question is “Trends in selective abortions of girls in India: analysis of nationally representative birth histories from 1990 to 2005 and census data from 1991 to 2011”, Jha et al., The Lancet, doi:10.1016/S0140-6736(11)60649-1, which you may access for free upon registering on The Lancet website here.

The abstract of the study reads thus:


India’s 2011 census revealed a growing imbalance between the numbers of girls and boys aged 0—6 years, which we postulate is due to increased prenatal sex determination with subsequent selective abortion of female fetuses. We aimed to establish the trends in sex ratio by birth order from 1990 to 2005 with three nationally representative surveys and to quantify the totals of selective abortions of girls with census cohort data.


We assessed sex ratios by birth order in 0.25 million births in three rounds of the nationally representative National Family Health Survey covering the period from 1990 to 2005. We estimated totals of selective abortion of girls by assessing the birth cohorts of children aged 0—6 years in the 1991, 2001, and 2011 censuses. Our main statistic was the conditional sex ratio of second-order births after a firstborn girl and we used 3-year rolling weighted averages to test for trends, with differences between trends compared by linear regression.

The conditional sex ratio for second-order births when the firstborn was a girl fell from 906 per 1000 boys (99% CI 798—1013) in 1990 to 836 (733—939) in 2005; an annual decline of 0.52% (p for trend=0.002). Declines were much greater in mothers with 10 or more years of education than in mothers with no education, and in wealthier households compared with poorer households. By contrast, we did not detect any significant declines in the sex ratio for second-order births if the firstborn was a boy, or for firstborns. Between the 2001 and 2011 censuses, more than twice the number of Indian districts (local administrative areas) showed declines in the child sex ratio as districts with no change or increases. After adjusting for excess mortality rates in girls, our estimates of number of selective abortions of girls rose from 0—2.0 million in the 1980s, to 1.2—4.1 million in the 1990s, and to 3.1—6.0 million in the 2000s. Each 1% decline in child sex ratio at ages 0—6 years implied 1.2—3.6 million more selective abortions of girls. Selective abortions of girls totalled about 4.2—12.1 million from 1980—2010, with a greater rate of increase in the 1990s than in the 2000s.

Selective abortion of girls, especially for pregnancies after a firstborn girl, has increased substantially in India. Most of India’s population now live in states where selective abortion of girls is common.

They analysed statistics from three rounds of the nationally representative National Family Health Survey, together with census data from 1991, 2001 and 2011. They used a rolling three-year average of child sex ratios for evaluation and subjected the resulting values to linear regression analysis, which is used to identify trends. You may read more about the use of rolling averages here and about linear regression analysis here.
They accounted for factors such as female infant mortality due to non-abortion causes et cetera to arrive at the standard we should expect to see if sex-selective abortions were absent, and this is what renders their analysis rigorous.
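As a rough illustration of this methodology (the numbers below are invented, not the paper’s data): a three-year rolling average smooths out year-to-year noise, and fitting a straight line through the smoothed series gives the trend.

```python
# Hypothetical conditional sex ratios (girls per 1000 boys), one per year
ratios = [906, 898, 901, 884, 879, 870, 862, 855, 848, 836]

def rolling_mean(values, window=3):
    """Simple (unweighted) rolling average over a fixed window."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept of y against x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

smoothed = rolling_mean(ratios)
slope, _ = linear_fit(list(range(len(smoothed))), smoothed)
# A negative slope indicates a declining conditional sex ratio over time
```

The paper uses weighted averages and formal significance testing; this sketch only shows the shape of the calculation.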

Some of the findings that have emerged from the study are downright shocking.

Firstly, it would appear that the number of sex-selective abortions of girls, based on an estimate derived from a conditional measure using only second-order births after a firstborn girl, has shot up drastically. While the rate of increase appears to have slowed in the past decade compared with the nineties, the increase in absolute numbers is rather worrisome.

Secondly, and perhaps even more shockingly, the prevalence of sex-selective abortions based on their estimates seems to be greater where (i) mothers were more educated and (ii) households were richer, which is especially alarming. It would also appear that the number of districts in India reporting further skewing of the child sex ratio has increased, the conclusion being that the practice is on its way up and is spreading nationally. The authors of the paper suspect this may have to do with the ability to afford ultrasound scans (which are used for sex determination).

The implication is that we’ll end up with more men than women in Indian society in the next generation, since the cohort currently being studied will grow up to be the men and women who define it. If the problem isn’t dealt with quickly, we will, on a national level, end up with a severe disparity in the numbers of men and women, and given all the responsibilities women currently hold in the Indian setup and all they do, this could have major social repercussions.

One of the most striking trends that emerges from the published data is the spread of the phenomenon. Using data from three decades to see what proportion of the population had a skewed child sex ratio, the authors found that 56% of the population of India now lives in states where the child sex ratio (CSR) is skewed, as opposed to 10% in 1991 and 27% in 2001, which would indicate that sex-selective abortion has been spreading to more and more places in the country.

I have cited a graph from the original paper under fair use conditions for purposes of scholarship and education that illustrates this.

Distribution of the total population living in states with varying child sex ratios (girls per 1000 boys at ages 0—6 years), 1991, 2001, and 2011. Mean national values for each of the censuses are shown. The vertical grey bar represents a natural sex ratio at birth of 950—975 girls per 1000 boys, where the distribution of child sex ratios at ages 0—6 years would be centred in the hypothetical absence of selective abortion of girls and equal girl and boy child mortality rates.

Please right-click the graph and use the menu to bring up a larger image. I hope you can see how the practice has been taking hold in India and how it is increasingly skewing child sex ratios.

Here is a map illustrating geographical changes in child sex ratio, again from the same paper and cited under fair use provisions.

Child sex ratio of girls to boys at ages 0–6 years in 2001 and 2011, by district. Of the 623 districts, data were available for 596 in the 2001 census and 588 in the 2011 census. The blue highlighted states are Gujarat, Haryana, Himachal Pradesh, and Punjab, which have shown consistently lower child sex ratios at ages 0–6 years in the last three censuses. State names are Andhra Pradesh (AP); Assam (AS); Bihar (BR); Chattisgarh (CG); Gujarat (GJ); Haryana (HR); Himachal Pradesh (HP); Jammu and Kashmir (JK); Jharkhand (JH); Karnataka (KA); Kerala (KL); Maharashtra (MH); Orissa (OR); Punjab (PB); Rajasthan (RJ); Tamil Nadu (TN); Uttarakhand (UK); Uttar Pradesh (UP); West Bengal (WB); and Arunachal Pradesh, Manipur, Meghalaya, Mizoram, Nagaland, Sikkim, and Tripura (collectively NE).

Again, right-click the map and use the pop-up menu for a larger image.

Further Information and Links

You can also read an article by Kalpana Sharma of The Guardian here, read the BBC’s coverage of the reaction to the data that has emerged in Bihar (which is a very bad offender) here and in Kashmir here. You can also read a case report on female foeticide and infanticide here.

Spread the word, share these articles, talk about the issues at hand, pretending there isn’t a problem while maintaining a facade of perfection as a country is going to achieve bugger all! Only when people are willing to come to terms with the presence of a problem can efforts be made to look for solutions, only when there is strong consensus will people begin to make that effort. We just cannot put up with abortions being carried out because a foetus happened to be the second one in gestation and lacked a Y chromosome.

Thanks for reading, now go spread the word.
– Ankur “Exploreable” Chakravarthy

ELISA – Enzyme Linked Immunosorbent Assay.

ELISA is a fantastically versatile tool that is used for diagnosing viral infections and the like, but technically speaking, it can be used to detect and quantify any antigen or antibody that one may desire to look for. It stands for “Enzyme-Linked Immunosorbent Assay”, meaning a test which uses an immunosorbent (a substance that either is, or reacts with, an immune compound to bind to it) linked to an enzyme.

Before moving on, it is important to know that antibodies are chemical substances that the immune system makes, which bind in a highly specific manner to other molecules, called antigens. (This definition is somewhat circular: an antigen is a substance that binds to an antibody, and a substance that binds specifically to an antibody is an antigen.) It is also important to note that antibodies are proteins and may be suitably modified and linked to other proteins using good old chemistry. The process of doing this is called conjugation, and it is how one produces the enzyme-linked immunosorbents used in the assay.

Depending on whether one intends to detect antibodies or antigens, the setup used for ELISA may vary. If we are looking for antibodies, we can immobilize the antigen onto a substrate; if we are looking for antigens, we can immobilize antibodies onto the substrate.

ELISA is usually carried out on strips of cellulose acetate (this is called Dot ELISA) or in microtitre plates.

Photograph of Dot ELISA strips

Microtitre Plate used for ELISA.


Generalized Protocol.

[1] Depending on whether one is looking for antigens or antibodies, antibodies or antigens, respectively, are chemically bound to the surface of the wells or test strips. Scientists have to do this when creating their own assays, whereas kits used for diagnosis come with those molecules already bound.

[2] Areas of the well that haven’t been coated with the antigen/antibody are then blocked using a blocking buffer; the well is now ready to use.

[3] The sample is then added and if there are antibodies present to the bound antigens they will bind to the well, too. These wells are then washed to remove any unbound antibodies.

[4] A secondary antibody, covalently modified and linked to an enzyme, is then added; this antibody binds to any antibodies that bound to antigens in step [3]. Finally, an enzyme substrate is added, and the enzyme converts it to a product which can be measured using optical methods such as colorimetry. This enables quantification as well as detection.

Sandwich ELISA is used when we are looking for antigens. In this case, a primary, unlabelled antibody is bound to the wells; if antigens are present, they will bind to these antibodies. A second set of labelled antibodies is then introduced, which bind to the antigen held by the primary antibody. Again, a substrate for the enzyme is added, and this produces an optically measurable signal.

Some setups use an unlabelled antibody to bind to the antigen and a labelled antibody to bind to the antibody that is bound to the antigen. Again, it is one of the many possible configurations that may be employed.

Outlines of ELISA setups.

Please click on the image to bring up a larger version.


As already mentioned, ELISAs can be used for the detection and quantification of either antibodies or antigens, and this immediately has diagnostic implications. They can also be used in some circumstances to quantify the effect of drugs on the expression of particular proteins (in this case by analysing the quantity of bindable antigen) and so on and so forth. ELISA is used clinically not only for diagnosing viral infections but also for things like pregnancy tests, where assays are geared towards the detection of proteins like HCG (human chorionic gonadotrophin).
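To illustrate how quantification works in practice: the optical signal from a sample is compared against a standard curve built from known concentrations. The sketch below uses simple linear interpolation between hypothetical standards; real assays typically fit a four-parameter logistic curve instead, but the idea is the same.

```python
def concentration_from_absorbance(absorbance, standards):
    """Estimate antigen concentration by linear interpolation between
    known standards, given as (concentration, absorbance) pairs sorted
    by increasing absorbance."""
    for (c_lo, a_lo), (c_hi, a_hi) in zip(standards, standards[1:]):
        if a_lo <= absorbance <= a_hi:
            frac = (absorbance - a_lo) / (a_hi - a_lo)
            return c_lo + frac * (c_hi - c_lo)
    raise ValueError("absorbance outside the standard curve")

# Hypothetical standard curve: (ng/mL, optical density) pairs
curve = [(0, 0.05), (10, 0.30), (50, 1.10), (100, 1.90)]

# A sample reading of OD 0.70 falls between the 10 and 50 ng/mL standards
estimate = concentration_from_absorbance(0.70, curve)
```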

Further links.

You may watch videos summarizing the various methods of ELISA here.

That is all from me about ELISA at this juncture. Happy reading.


White people believe that they face the worst racism

…according to a study in the journal Perspectives on Psychological Science.

The study, called ‘Whites see racism as a zero-sum game that they are now losing‘, by Michael Norton and Samuel Sommers, suggests that white Americans surveyed think that they are now more widely discriminated against than black people, and that this supposed ‘anti-white bias’ is a bigger societal problem than the real anti-black bias. (This is an American-based study, and I think the problem is probably more prominent there, but since I only know the UK, and I see a similar trend happening over here, I will be using UK-based examples.)

After all of the wrongdoings of the past, governments are now at least trying to make society more equal for everyone, but the damage that has been done has penetrated society too deeply to disappear overnight. Ideas that black people and indeed people of other ethnicities are in some way inferior are ingrained in the collective consciousness, to the extent that when their position in society begins to improve, white people have started to cry ‘racism!’ Are we really selfish and shortsighted enough to convince ourselves that all along, all they were complaining about was the fact that white people had a more privileged position in society? Is history no longer taught in schools?

Many of my black friends have talked to me about the racism they experience in daily life, whether it’s a remark they have overheard from someone in the street, or discrimination by the authorities. A large proportion of my black friends have been subject to at least one stop and search by the police. As far as I know, none of my white friends have. Figures published in 2010 indicated that black people were seven times more likely to be stopped than white people.

These stops and searches under Section 44 were last January ruled illegal by the European Court of Human Rights for their arbitrary and widespread usage, as well as their disproportionate targeting of blacks and Asians. Now are we going to label this as racism against white people? Of course not, because it is utterly transparent that this was a case of inequality. So how can the same principle of removing inequality apply in other situations and be called ‘anti-white bias’?

This is how I see it. Most white people don’t even know they are white until they are in a room full of black people. They don’t have to; society holds them in a more privileged position by default, and in the past it was a rare occasion that a white person would find themselves in such a situation. Now, however, it is becoming more common.

Having used this example already, I will continue with it. The stop-and-search issue can be held at arm’s length and viewed relatively objectively by white people as an inequality that needs to be extinguished. Why? Because it just doesn’t happen to white people. But when the situation for non-Caucasians starts to improve in other, less subtle ways, why is the reaction turned on its head?

I have seen this time and time again in previous jobs: the anecdote of the poor white person who lost out on a job opportunity to a black person who ‘wasn’t even as good at the job’. ‘Political correctness gone mad’, they say. Whenever I hear this, it makes me cringe. Yes, there are guidelines saying that workforces (depending on who applies) need to be representative of the general population, but why do I find it hard to believe that a less skilled person would be favoured over a more skilled one? From a productivity point of view, it doesn’t make sense for a company to employ an inferior applicant. Surely it’s more likely that the two candidates were equally skilled, and the white person simply doesn’t know how to react to an unfamiliar situation, one that is bound to arise more often as standards of equality improve. I suspect that if the job had been given to another white person, then sure, the other candidate would be annoyed and would probably make some digs about capability, but as soon as preference is given to a black person, the problem becomes about skin colour. It’s an easy target.

To cite another article:

a co-author of the study called the results “surprising.” That’s putting it mildly. But maybe we’ve missed the way white Americans have been systemically deprived of access and opportunities. Maybe we’ve overlooked all the times whites have been targeted by implicit and explicit race-baiting attacks, whether they’re playing professional sports or seeking elected office. Maybe we didn’t get the memo on the way the legacy of discrimination against white Americans continues to manifest itself in worse outcomes in income, home ownership, health and employment for them, the way white people are told they’re “objectively” ugly, and the disgust so many Americans felt the last time a white person ran for president.

Now imagine all this actually happening to a white person. Quite simply, it would not happen. Our privileged position blinkers us to reality so thoroughly that it's only when someone uses an analogy like the one above that we realise it's really not so bad for us after all. Yet some people will continue to complain. Please open your eyes. Try shutting up, listening and learning something. Racism is still a part of daily life for non-white people, manifesting itself in all aspects of their lives.

So if you are a white person and you have read all this and disagree: Think you know racism? Think again.

From DNA to RNA – The process of transcription.

Right, I'm back to blogging about molecular biology et cetera. If you go back and re-read the last article I wrote, the first thing you'll notice is how significant the Human Genome Project was, since it enables us to understand how genotypes give rise to phenotypes.

The transition from genotype to phenotype is not always simple or clear cut, since varying numbers of different proteins, and interactions between proteins, can be involved in the development of traits from genes. Interactions with the environment can also play a role.

One of the central tenets of modern molecular biology is the central dogma, which forbids the transfer of sequence information from proteins to other proteins or to nucleic acids, but allows sequence information to pass from nucleic acids to proteins. You may trawl through the archives on this blog to find the article I wrote on it.

One molecular process that happens in all living organisms (barring a few viruses, which rely on reverse transcription) is transcription. In transcription, an RNA sequence is produced using DNA as a template, and a short introduction to this process will be the focus of my post.

The process of transcription per se can be split into several key stages: initiation, promoter clearance, elongation, and termination. A brief description of these stages follows…

[1] Initiation – this is the first stage of the transcription process (which should be bleeding obvious :P). Transcription can only begin once the transcriptional apparatus starts to liaise with the DNA that is to be transcribed.

Key steps in the initiation of transcription in eukaryotes. Note how the transcriptional complex assembles around the promoter, which is critical in indicating where transcription should start.

In eukaryotes (organisms with a true, double-membraned nucleus) the process is slightly more complex than in bacteria, but both involve the binding of RNA polymerase subunits to DNA sequences called promoters. Promoters bind RNA polymerase by means of the specific sequence motifs they carry, and as such they serve as starting points for transcription.

In eukaryotes, proteins called transcription factors are also required to bind to core promoter regions along with RNA polymerase for initiation to occur properly. The key difference is that while RNA polymerase can bind directly to promoters in prokaryotes, in eukaryotes this binding has to be mediated by transcription factors. This enables the evolution of genetic systems with a much more complex range of transcriptional regulatory interactions, since transcription factors can facilitate binding to a wider range of promoter sequences based on the interactions they have with their target sequences, and thus allow organisms to develop gene regulatory networks with more flexibility and variability. This can sometimes be a problem in things like cancer, since one potential implication is that there are more ways by which dysregulation may be achieved.

This illustration highlights the different ways in which transcription factors can affect transcription, either enhancing it or suppressing it. The presence of such a transcriptional apparatus enables a much more diverse range of interactions and molecular transcriptional regulation.

RNA polymerase is an enzyme made up of multiple subunits, and initiation can be said to have well and truly occurred once all the requisite factors and enzymes have assembled at the promoter. The next step is promoter clearance.

[2] Promoter Clearance/Abortive Initiation – The fully assembled, active RNA polymerase complex then begins to move along the unwound DNA (the unwinding is due to the action of a helicase enzyme that is part of the complex), producing very short fragments of RNA. Slowly it frees itself from the promoter, at which point a molecule in the complex, called the σ factor, will have rearranged. From this point onwards, the process of transcription is able to produce full-length transcripts.

[3] Elongation – this is the step in which long, full-length transcripts are produced; these later undergo translation to form proteins, or RNA processing to work as functional RNA molecules. The process itself is actually quite simple.

The Process of Transcription Elongation.

i) RNA Polymerase unwinds DNA through helicase activity.

ii) The 3′ – 5′ strand of DNA is used as a template for RNA synthesis. RNA polymerase incorporates nucleotides through complementary base pairing, just as in DNA, with one noticeable exception: instead of thymine, RNA has uracil, and this binds to adenine on the DNA template.

iii) And so a strand of RNA is synthesized. As the enzyme complex moves along the template DNA, it allows the hitherto separated strands of DNA to come back together, and the RNA dissociates. The whole process of elongation is powered by the hydrolysis of the incoming nucleoside triphosphates.
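The base-pairing logic in steps i) to iii) above can be sketched in a few lines of Python. This is a toy illustration only, and the sequence used is made up, not a real gene:

```python
# Transcription as string complementation. The template strand here is
# written 3'->5', so the resulting mRNA reads 5'->3'.
DNA_TO_RNA = {"A": "U", "T": "A", "G": "C", "C": "G"}

def transcribe(template_3to5: str) -> str:
    """Return the mRNA (5'->3') complementary to a 3'->5' DNA template."""
    return "".join(DNA_TO_RNA[base] for base in template_3to5.upper())

print(transcribe("TACGGAT"))  # -> AUGCCUA
```

Note how thymine in the template pairs with adenine in the transcript, while adenine in the template pairs with uracil rather than thymine.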

[4] Termination – This is the final step, where transcription is halted and the RNA transcript is released.

The process again differs between prokaryotes and eukaryotes; in prokaryotes there are two known mechanisms.

i) Rho-dependent termination – here, a protein factor called Rho binds to the transcript at the termination site and destabilizes the interaction between the DNA template and the RNA polymerase complex, ending transcription and releasing the transcript as a consequence.

ii) Rho-independent termination – here, a hairpin loop forms in a G-C rich region of the newly made RNA, destabilizing the association between the RNA strand and RNA polymerase and resulting in termination of transcription.

In eukaryotes there is enzymatic cleavage of the transcript followed by polyadenylation of the mRNA to produce the final transcript, which may undergo further post-transcriptional modifications, such as splicing, before translation.

A little tangent – Reverse Transcription.

Some viruses with RNA genomes (retroviruses) engage in reverse transcription, that is, the production of a DNA sequence using an RNA template. They do this with the help of an enzyme called reverse transcriptase. Basically, the process of reverse transcription is as follows…

i) Synthesis of DNA strand complementary to RNA template.
ii) Enzymatic degradation of RNA template.
iii) Conversion of DNA strand synthesized in step i) to a double stranded version through DNA polymerase activity.

Retrotransposons also use reverse transcription when they do their thing (i.e., self-replicate and integrate themselves into other sites in the genome). Howard Temin, Renato Dulbecco and David Baltimore won a Nobel Prize for their discovery of reverse transcriptase, and one may find the apposite Nobel lectures through the link here

More to read, more to learn

[1] Read an article from Nature Scitable on DNA transcription, here

[2] Watch a simplified video summary here

Alright then, I guess that is pretty much it for today.


Kill the gays – Uganda’s anti-homosexuality bill.

Hi all,

I’m sure many of you have heard about the anti-homosexuality bill that is likely to be passed within the next 48 hours in Uganda.

For those of you that haven’t, here’s a brief breakdown.

Homosexuality is already illegal to the extent that:

  • same-sex sexual activity is illegal
  • same-sex relationships are not recognised as valid
  • same-sex marriage is banned
  • same-sex adoption is illegal
  • homosexuals are not allowed to serve openly in the military
  • there is no anti-discrimination legislation
  • there are no laws concerning gender identity/expression

The proposed bill will broaden the criminalisation of homosexuality to include:

  • death penalty for people with previous convictions
  • death penalty for the HIV-positive (propagating the misconception that HIV can only be spread via anal sex)
  • death penalty for those engaging in same-sex acts with people under 18 years of age.
  • Ugandans engaging in same-sex sexual activity outside of Uganda can be extradited back to the country and punished
  • punishment for LGBT rights advocates
(Image from wikipedia)

How has this come about?

The current laws against homosexuality are remnants of British colonialism, and were enacted as a response to certain cultural and ritual practices. Indeed, there are several documented accounts of homosexual customs in pre-colonial Africa, and around the world.
However, many people view homosexuality as a western import, and due to religious influences are keen to distance themselves as much as possible from these ‘unnatural practices’, which is not surprising when you read Bible passages like this, explicitly laying down the law:
If a man also lie with mankind, as he lieth with a woman, both of them have committed an abomination: they shall surely be put to death; their blood shall be upon them. (Leviticus 20:13 KJV)

The consensus seems to be that the move towards this more severe bill has been influenced by American evangelical Christians; certainly among the strongest supporters of the bill is the Ugandan Pentecostal Pastor Solomon Male, who preaches that homosexuals actively ‘recruit’ people to their ‘ranks’, that nobody is born gay, and that homosexuals can be ‘cured’ of their ‘affliction’.

The Bible contains references in several of its books to homosexual ‘abominations’, including Genesis, Leviticus and Deuteronomy. Click here for a full list.

Sadly, having spent an extended period of time in several east African countries, I have seen the American evangelicals in action. In fact, they’re pretty hard to miss. Their billboards litter the roadsides; for every regular billboard, there are 10 depicting white Americans flashing broad smiles, encouraging the ‘morally underdeveloped’ Africans to come over to a ‘better’ way of life.

Whilst many may be evangelicals who place emphasis on mercy, forgiveness and love, there are still those that accept and teach nothing but a literal interpretation of the Bible, with an emphasis instead on God's wrath; examples include passages like the one I selected from Leviticus above. Like I said, it's pretty explicit, and there aren't many other ways in which passages like that can be interpreted.

Whether they are directly involved with the bill itself or not, they will still have blood on their hands if this bill passes, since it is these imported, fundamentalist convictions that have poisoned the minds of so many.

This bill, if passed (which it seems likely to at the moment), could result in hundreds of thousands of Ugandans losing their lives and their dignity, as they are outed in the press and brought to ‘justice’. That is, if propaganda-fueled citizens don’t beat the government to it by stoning the victims of this bill to death in the street.

If you are reading this, I urge you to sign the petitions here and here in an attempt to make the voices in opposition to this bill heard.

If you think that the passing of this bill is fine, then I'm sorry that propaganda and hateful religious dogma have warped your mind so much that you think it's OK to kill someone for something that is completely natural and beyond their control. You know, kind of like killing someone because of the colour of their skin. We've spoken out about that, now speak out about this.
Do not let this bill pass as law.

That’s it for now.

The Story of the Human Genome Project – A Short Narration

It is our inalienable heritage. It is humanity’s common thread

– Sir John Sulston.

The story I am going to tell you here is one that I consider extremely significant in the history of our species. It enabled us to identify ourselves at the molecular level, and also established a genetic link to the rest of the biosphere by allowing us to peer into the similarities and the differences we share with the rest of the living world.

It is a story of public co-operation on a scale that is not usually seen. It is a story of the ethical fight to keep the data from the Human Genome Project in the public domain, as opposed to its becoming the property of corporate owners who might very well have tried to monopolize genes and what they do. It is a story of scientific prowess and technological achievement. Finally, it is a story of where we come from and where we are headed, and it tells of great promises that may be fulfilled in our fight against disease and death, and that may greatly help us in our pursuit of good health. It promises to shine a light on the molecular mechanisms of development, on what makes us who we are, and on what goes wrong when we fall sick. I think it is one of the most marvellous stories one could narrate insofar as human scientific achievement is concerned.

The Thread of Life – DNA.

Before we move on to the rest of the article, I feel an introduction to DNA is warranted. DNA is a nucleic acid, a polymer of nucleotides, found in most organisms in a double-stranded configuration. DNA acts as a template for the body to make proteins, and organisms are made of proteins or of other molecules that are acted upon by proteins. DNA also has regions that regulate when genes are turned on and off, and these enable things such as signalling and feedback to be introduced. Despite being a bog-standard polymer, DNA has the ability to orchestrate all the complex chemistry that is essential to turn single cells carrying a sufficiently complex genome into extremely complex multicellular organisms. Everything that we are, except whatever behaviours we may learn from the environment, is down to DNA and processes that act on DNA.

I will be writing more on how this transition from genes to phenes happens in future posts, but until then it may be useful to learn what the structure of DNA is like and what implications this has. DNA in most organisms is made up of two strands wrapped around each other in a double helix. I say most organisms because some viruses, for example, have single stranded DNA.

Structural features of DNA - A Graphical Summary.

DNA basically consists of two antiparallel strands wound into a helix; the backbone of each strand is made of a sugar called deoxyribose and phosphate groups. The strands carry nitrogenous bases of two types, namely purines and pyrimidines. A purine always binds to a pyrimidine: adenine always pairs with thymine, and guanine with cytosine. We call this complementary base pairing. It is extremely relevant to how DNA works, since it explains how descendant cells acquire their DNA: either strand can be used as a template to produce a new double-stranded copy of DNA. It also explains how the sequence of DNA can determine the sequence of other molecules produced using DNA as a template, through processes such as transcription and translation. You may want to read the articles I've written on the Central Dogma of Molecular Biology and the Elucidation of the Genetic Code on this very blog if you feel like taking a little detour at this juncture, for we are just getting started. You may also find this interactive exercise on DNA structure and base pairing an interesting thing to do.
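Complementary base pairing means that either strand fully determines the other. Here is a minimal Python sketch of deriving the antiparallel partner strand; the sequence is made up purely for illustration:

```python
# Complementary base pairing: A<->T, G<->C. Because the two strands are
# antiparallel, the partner of a 5'->3' strand is read back-to-front.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(strand_5to3: str) -> str:
    """Return the complementary strand, also written 5'->3'."""
    return "".join(PAIR[base] for base in reversed(strand_5to3.upper()))

print(reverse_complement("ATGC"))  # -> GCAT
```

This is exactly the property that replication exploits: run the same operation on either strand and you recover its partner.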

A little experiment you can do

Sure, what better way to get you involved in knowing about the project than to make it possible for you to carry out one of the key steps involved in the HGP yourself, namely the isolation of DNA? While the HGP used human DNA, for the purposes of our little experiment many other organisms will do, and I suggest using plant material.

The following protocol may be carried out.

[1] Grind up some peas or onions, around 50 grams, in around 30-40 ml of warm water with 1-2 tablespoons of salt.

[2] Add some detergent to the mixture to lyse cells from the paste and to break down proteins. Filter this.

[3] To the filtrate, add alcohol drop by drop until you see white fibrous clumps moving into the alcohol layer, which is the upper layer.

[4] You have a DNA sample.

It must be noted that this process is very similar to what happens in research-grade DNA extraction; in fact, the chemistry is much the same! You may also carry out an exercise in isolating DNA, virtually, here. Go ahead and give it a try, you know you want to ;)

What did the Human Genome Project Entail & What technological advances were critical ?

It basically entailed a large-scale, collaborative sequencing effort to completely sequence the haploid genome (a single copy of each chromosome) of anonymous donors. This effort was helped greatly by the development of high-quality, rapid sequencing methods that allowed genome fragments to be sequenced and then put in place by the algorithmic arrangement of overlapping ends to produce a continuous sequence.

The sequencing technology used in the HGP was an automated form of Sanger's chain termination method, for which Fred Sanger won his second Nobel Prize in Chemistry. Automation enabled sequencing studies to handle ever larger genomes, and people could thereby map genomes both on a whole-genome basis and on a fragment-by-fragment basis.

The major technological platforms needed to successfully complete the HGP were improved and developed by working on the genomes of other organisms, ranging from viruses with cute little genomes to much more complex organisms such as the worm C. elegans (which by itself has won the people who studied facets of it four Nobel Prizes) and Saccharomyces cerevisiae (baker's yeast, the stuff that brews yer beer). Sequencing the Drosophila melanogaster genome was one of the first projects to utilize the whole genome shotgun method developed by Celera Genomics (Craig Venter's company).

How Sanger Sequencing Works

Sanger Sequencing (courtesy Scitable)

The idea here is that once you have single-stranded DNA with a starting primer sequence, DNA polymerase will extend it (this is the same principle used in the extension step of PCR, about which I have written on this blog before). In Sanger sequencing, you add to the mixture a labelled form of nucleotide (a dideoxynucleotide) that binds complementarily to the single-stranded DNA template but does not allow DNA polymerase to extend the chain any further, hence terminating it. After sorting the resulting chains by size using electrophoresis, we can read off the sequence of terminal bases using the fluorescent labels they have been treated with, and ta-da, we can read the sequence.
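The readout step can be mimicked with a toy simulation: pretend every position of the synthesized strand yielded one terminated fragment, sort the fragments by size (as electrophoresis does), and read off the terminal labels. This is only a cartoon of the principle, assuming a hypothetical error-free reaction:

```python
# Toy Sanger readout: one chain-terminated fragment per template position.
def sanger_read(synthesized: str) -> str:
    # Fragments of every length 1..n, each ending in a labelled terminator.
    fragments = [synthesized[:i] for i in range(1, len(synthesized) + 1)]
    fragments.sort(key=len)                   # electrophoresis: sort by size
    return "".join(f[-1] for f in fragments)  # read the terminal labels

print(sanger_read("GATTACA"))  # -> GATTACA
```

Reading each fragment's last (labelled) base, in order of fragment length, recovers the full sequence.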

Here is a video that explains the concept of Sanger chain termination sequencing to you.

Approaches to Genome Sequencing.

In this section, I describe the two predominant approaches that HGP teams used to organize samples and assemble data from sequencing those samples into genome draft sequences.

Hierarchical Genome Sequencing

This method was implemented by the publicly funded collaboration. The difference between this and the private Celera Genomics project lies in sample preparation and assembly.

Hierarchical Shotgun Sequencing.

In Hierarchical Shotgun Sequencing…

[1] Markers for regions of the genomes are identified.
[2] The genome is split, using restriction (cutting) enzymes, into fragments that each contain a known marker.
[3] These fragments are cloned in bacteria using bacterial plasmids; we call these constructs BACs (Bacterial Artificial Chromosomes).
[4] These fragments are individually sequenced using automated Sanger sequencing.
[5] Assembly of the genome is done on the basis of prior knowledge of the markers, which are used to localize sequenced fragments to their genomic locations. A computer stitches the sequences together using the markers as a reference guide.

Whole Genome Shotgun Sequencing

This method was employed by Celera Genomics, a private entity that was trying to monopolise the human genome sequence by patenting it; to do this they had to try and beat the publicly funded project, which is why they adopted whole genome shotgun sequencing.

Whole Genome Shotgun Sequencing


[1] A library is generated of random fragments of the human genome using restriction digestion followed by cloning.
[2] These fragments are then sequenced; we call each of these fragments a sequence read.
[3] Overlapping reads are then used to produce contiguous sequences (contigs).
[4] A scaffold is constructed using computationally predicted read pairs.
[5] Contiguous sequences are then computationally assembled together.
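The idea of joining reads via their overlapping ends (steps [2] and [3] above) can be sketched with a greedy toy assembler. Real assemblers must cope with sequencing errors, repeats and paired-end scaffolding; this hypothetical example assumes perfect, repeat-free reads:

```python
# Greedy overlap assembly: repeatedly merge the pair of reads with the
# longest suffix/prefix overlap until a single contig remains.
def overlap(a: str, b: str) -> int:
    """Length of the longest suffix of a that is also a prefix of b."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def assemble(reads):
    reads = list(reads)
    while len(reads) > 1:
        # Find the ordered pair of reads with the largest overlap.
        best_len, best_i, best_j = max(
            (overlap(a, b), i, j)
            for i, a in enumerate(reads)
            for j, b in enumerate(reads) if i != j
        )
        merged = reads[best_i] + reads[best_j][best_len:]
        reads = [r for idx, r in enumerate(reads)
                 if idx not in (best_i, best_j)] + [merged]
    return reads[0]

print(assemble(["GATTAC", "TTACAG", "ACAGGA"]))  # -> GATTACAGGA
```

Greedy merging works here because the toy reads overlap cleanly; with repetitive genomes it can produce misassemblies, which is part of why the hierarchical approach retains marker information.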

Now, the thing with Celera Genomics is that they did data integration using both the scaffold method and the marker method (drawing on publicly available data). As a result, they found that hierarchical sequencing was slightly more efficient, but the data could more or less be analysed and integrated using either approach.

At this juncture, you may want to take a little detour again to read a more detailed explanation of the technology involved on Nature Scitable, which you may find here

Nature of the Donors whose sequences were used

Due to ethical considerations the donors have remained anonymous, but the approach the public project took was to isolate white blood cells from two male and two female donors, mix the samples together, and then put them through one of the aforementioned sequencing workflows.

The Celera Genomics project used five anonymous donors, with samples taken from an initial pool of 21 donors. The source tissue again appears to be similar to that used in the public project.

The timeline of the Human Genome Project, and some comments.

Timeline of the HGP. Please click on the image for a large, high-resolution version.

[1] Much of the work that made sequencing the human genome possible actually took place more than a century ago, starting with the elucidation of the nature of inheritance, the subsequent localization of heredity to chromosomes, the confirmation of DNA as the genetic material, and finally the discovery of the double-helical structure of DNA. This was followed by the discovery of how DNA replication takes place, how DNA specifies protein sequences, and further insights into the role of genes and proteins in development.

[2] Along the way, we learned how variations in genes can contribute to disease, starting with the implication of genes in Huntington's chorea and the discovery of the variant gene that causes Duchenne muscular dystrophy. The development of Sanger sequencing was a landmark.

[3] Sequencing the genomes of several model organisms along the way is very significant, in my opinion, because it allows us to use genetic data from those organisms in relation to our own, so that it can be of import to human studies and biology. We already know of genes such as Pax6, which are universally conserved in the development of animal eyes, for instance. We are able to carry out precise comparisons between gene sequences in order to identify which genetic differences are responsible for phenetic differences.

[4] The immense amount of co-operation seen, with the publicly funded project actively involving a large number of international partners, is an indicator of what could be achieved if only people were willing to look beyond the barriers of nationalism, in my opinion. The fact that we had sequencing centres in North America, Europe & Asia is also an indicator of how highly the common heritage we all share can be held.

[5] The completion of sequencing of the first chromosome in the HGP, namely chromosome 22, was a significant landmark, since it showed the job could be done and that finishing the rest was only a matter of time.

[6] The announcement of the completed sequence in 2003 was a beautiful moment in the history of science, since a thirteen-year project had now officially ended with all its goals achieved.

Of Public Affairs and Private Affairs

Now this is a very interesting bit. People had been patenting genes for a long time, but the sequencing of the human genome brought with it its own aspirants to monopoly. Celera wanted to patent the human genome if it got there first, to charge researchers for the privilege of accessing said data, and to prevent redistribution. They also wanted to release data only annually, which could stymie progress.

The public effort, on the other hand, has had a policy of open access to the data it publishes, with new data mandatorily released within 24 hours of sequence assembly being complete. I have always been one for open access, and I find it tremendously relieving that I do not have to pay to access what is a part of my own molecular heritage; that would seriously suck.

Of spinoffs, benefits & implications.

[1] The completion of the draft enabled variation studies to be carried out, such as HapMap, which documents single-nucleotide variations (point mutations/substitutions) that have diagnostic and prognostic value.

[2] The availability of a reference sequence enabled the study of normal versus mutant gene function, and the derivation of a working understanding of how aberrant gene function may be linked with disease. This also encouraged the development of diagnostic markers for diseases, a case in point being the development of a test for BRCA1 and BRCA2 variants, which are good markers of susceptibility to familial breast cancers.

[3] Having reference sequences available along with variation data can help in the analysis of intragenic variation and of how evolution might be shaping the genome. It can also optimize the development of gene-silencing techniques, which may enable us not only to study gene function, but also to develop therapeutic strategies to combat genetic disorders.

[4] In diseases like cancer, it facilitates analysis of the alleles mutated or implicated in disease progression. This gives us an understanding of pathogenesis at the molecular level, and can inform the search both for drug targets that can be exploited to treat the disease and for the drugs to actually attack those targets.

[5] Projects like ENCODE, which aim to identify all expressed DNA elements, not just the 25,000 or so protein-coding sequences, will help us to further our understanding of how genes work in concert to bring phenes into existence, and as a consequence how our genomes make us who we are, with the exception of externally, environmentally shaped traits.

[6] The availability of the human genome sequence can further molecular diagnostics involving PCR, for instance, by enabling the design of PCR primers that do not cross-react with human transcripts, which means that pathogen DNA/RNA can be reliably amplified and identified for diagnostic purposes.

[7] The development of unique microarray probes, which has now made large-scale array-based expression studies possible, improving our understanding of the pathology of diseases, is also something that owes its origins to this project.

[8] As gene sequencing technology improves, and we are able to sequence more and more for less and less money, and to do it more quickly, we should start seeing masses of genetic data emerging that can be seamlessly integrated into databases dealing with the human genome and its variation. This will not only benefit researchers as they learn more and more from such data, but will also help the people who have their genomes sequenced, by letting them make use of the information gleaned from what researchers are looking at.

There is already technology being developed that could make genetic sequencing dizzyingly fast and efficient, including for instance Nanopore sequencing where nanotechnology is used to drive sequencing.

Another such large-scale sequencing technology under development is the Helicos synthesizer; if it works well, it could be a brilliant addition to the molecular geneticist's toolkit.

[9] The project helped to establish that all humans share a common heritage; this is summed up in the very first lines of this post by a quote from Dr. John Sulston, who was the director of the Wellcome Trust's Sanger Centre while the project was running. So this means anybody who starts to discriminate amongst people based on superficial variation can screw themselves, and IMO *should* screw themselves.

[10] It also raised ethical concerns about genetic privacy and about how insurance companies might treat people with susceptibility to certain disorders; in the USA, at least, a federal ban on genetic discrimination was introduced in about 2000.

It remains to be seen how effective legislation against genetic discrimination is, but I fervently hope that this doesn’t become a problem.

[11] On a much more individual level, it may help parents who are carriers of genetic diseases to choose to have children, by using genetic testing to check that an embryo is disease-free.

Delving into the Details – More you can do


[1] The Common Thread, Sir John Sulston & Georgina Ferry, Joseph Henry Press.

I've had the pleasure of reading this nearly autobiographical book, and it is a brilliant, gripping account of what happened during the HGP, written by someone who was at the forefront of research involved in the project: Sir John Sulston was the director at the Sanger Centre.

The Google Books entry for this can be found here.

[2] Genome : The autobiography of a species in 23 chapters by Matt Ridley.

This is a popular science treatment of the human genome & the HGP for laymen; you may find it a useful read if you are looking for a platform from which to delve into exploring more of this unique scientific milestone. You may find the apposite Amazon page here

[3] Our Posthuman Future, Francis Fukuyama, Picador Press.

This book is not per se about the Human Genome, but it covers issues like the ethics of biotechnology and genetic engineering and other technologies. It could be useful for anybody trying to understand where the ethics regarding biotechnology policymaking lie and what the implications are.


[1] The Gene Code – A BBC4 documentary written and presented by Dr. Adam Rutherford. This two-part programme introduces you to DNA and then takes you through the Human Genome Project and its implications, but be warned that some of the things mentioned therein are less than academically rigorous, such as the failure to differentiate between junk DNA and ncDNA, for instance.

[2] The Incredible Human Journey – A BBC documentary presented by Dr. Alice Roberts. This five-part programme traces the routes humanity took out of Africa and how our species spread all over the world. It is something that encapsulates the commonality of our heritage, both cultural and genetic, very well.


You may want to,

[1] Access a Nature Scitable book on the Human Genome Project here

[2] Access a Nature Scitable book on Genomes and their links to diseases

[3] How do we map genes and link genes to phenes? Find out more by browsing this nature Scitable book titled Gene Mapping: Then and Now.

[4] If you want to learn how to browse the human genome, look for genes, and take a tour into the world of genomic data, you may find this tutorial useful; it deals with a software portal called the UCSC Genome Browser, which is very handy when it comes to perusing genomic data. If you want to go through the genome and look at genes and what they do, this is one very versatile tool to facilitate that.

[5] Read the official HGP portal of the Department of Energy (the primary financier of the HGP in the USA) here

[6] Visit the Wellcome Trust's beautiful educational resource site for the HGP. You can find animations, explanations & activities listed on this site, which may make learning about the HGP, and about genes and genomes, a very interesting proposition.

That is all (!) from me as far as this particular post is concerned. I hope you had a happy time reading & stayed awake throughout; if not, I expect your gratitude for helping you doze off :P .

- Ankur “Exploreable” Chakravarthy.