Though risky, transplants are crucial treatment interventions for many patients. They can save the lives of cancer patients and of people with failing organs, and there are now trials aiming to make them a mainstream treatment for autoimmune diseases (I've just read an article on that topic and would really love to write about it soon, too).
But the major problem with transplants, beyond the agonizing wait on endless donor lists that can stretch for years, is that when things go wrong, the patient's life is at mortal risk. Almost 40% of transplant patients show rejection episodes within the first year after the operation. Detection of these immunological reactions usually comes so late that the only option left is to flood the patient's system with huge doses of immunosuppressive drugs, which are toxic themselves and can have debilitating effects, on cancer patients for instance. Moreover, to detect rejection at all, doctors must take biopsies of the new organ, a process that can damage the organ itself, let alone the stress it adds to an already fragile patient. Transplant patients have to undergo exploratory biopsies monthly for a whole year after the operation!
Will these risky, life-saving procedures be safer in the future?
Early detection of these reactions has long been a research focus for the cardiologist Hannah Valantine of Stanford University School of Medicine in Palo Alto, California. In 2009, she devised a new test that detects the immunological changes occurring in a transplant patient during a rejection episode. The test, called AlloMap, became the first of its kind to be approved by the FDA for detecting heart transplant rejection. Yet it failed to catch rejection early in about half the patients. This, of course, didn't satisfy Valantine.
Together with Stanford biophysicist Stephen Quake, she developed a much more sensitive test. The idea is that DNA from the new organ makes up around 1% of the cell-free DNA in the blood of transplant patients. Because this DNA is foreign to the patient's own, the test can detect it with great sensitivity, even though it circulates in minute amounts. To validate the test, they ran it on stored plasma samples from transplant patients who later showed signs of rejection. They found that the amount of the transplanted organ's DNA rises soon after such episodes begin, reaching around 3% of the cell-free DNA in plasma, and climbing much higher at the peak of the episode. They reported the results in the Proceedings of the National Academy of Sciences.
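The logic of the test is simple enough to sketch in a few lines of code. This is only an illustration of the idea, assuming the ~1% baseline and ~3% rejection-level figures mentioned above; the function names, read counts, and decision threshold are hypothetical, not from the actual assay.

```python
def donor_fraction(donor_reads: int, total_reads: int) -> float:
    """Fraction of circulating cell-free DNA that maps to the donor's genome."""
    return donor_reads / total_reads

def flag_rejection(fraction: float, threshold: float = 0.03) -> bool:
    """Flag a possible rejection episode when donor DNA exceeds ~3% (hypothetical cutoff)."""
    return fraction >= threshold

# Example: 350 donor-derived reads out of 10,000 cell-free DNA reads -> 3.5%
f = donor_fraction(350, 10_000)
print(f, flag_rejection(f))  # 0.035 True
```

The appeal of such a rule is that it needs nothing but a blood draw and sequencing, which is exactly why it is so much gentler than a biopsy.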
The good thing about this test, besides its high sensitivity, is that it is far less invasive than a biopsy; a biopsy would only be needed for confirmation when the test is positive and the donor DNA percentage is above normal. Early detection would also let doctors control the episode with much smaller doses of immunosuppressives, and therefore fewer side effects. Valantine is a cardiologist, but she believes the test can be used with other types of transplanted organs besides hearts.
Tags: organ rejection
If you have ever been one of the unlucky ones waiting for a cancer diagnosis from a biopsy, or have had a friend or relative go through the process, then you know how nerve-wracking the wait is. The standard procedure is to extract some cells from the suspect tissue with a biopsy needle, then wait about a week for the results to come out. To make matters worse, the results can sometimes be inconclusive or simply wrong.
Simple smartphone applications might be able to rapidly diagnose cancer in the future
Fortunately, a group of scientists at Massachusetts General Hospital (MGH) in Boston has developed a new technology that is much faster and nearly 100% accurate. They built a miniature NMR device (which identifies compounds by how their nuclei oscillate in a magnetic field) about the size of a coffee cup, and synthesized magnetic nanoparticles that stick to certain tumor-specific proteins. A patient would simply head to the clinic and have the needle biopsy performed; the cells are then mixed with the magnetic nanoparticles, and the results from the small NMR device are read using a simple smartphone application.
In the first trial, the technique was used on 50 patients, taking less than an hour to diagnose each. The device can detect nine tumor-associated proteins, and combining the results for four of them gave accurate diagnoses in 48 of the 50 patients. In a second trial, accuracy was 100% in the 20 patients tested. For comparison, the conventional tests average 74-84% accuracy.
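A panel decision rule of this kind ("call the sample malignant if enough markers are elevated") can be sketched as follows. Everything here is invented for illustration: the marker names, the cutoff values, and the two-of-four voting rule are my assumptions, not the combination actually used in the MGH study.

```python
# Hypothetical cutoffs for four tumor-associated proteins (arbitrary units).
CUTOFFS = {"MUC1": 1.8, "EGFR": 2.0, "HER2": 1.5, "EpCAM": 2.2}

def classify(levels: dict, min_positive: int = 2) -> bool:
    """Call the sample malignant if at least `min_positive` markers exceed their cutoffs."""
    positives = sum(levels.get(m, 0.0) > c for m, c in CUTOFFS.items())
    return positives >= min_positive

sample = {"MUC1": 2.5, "EGFR": 1.1, "HER2": 1.9, "EpCAM": 0.8}
print(classify(sample))  # True: MUC1 and HER2 exceed their cutoffs
```

Combining several weak markers into one vote is what lets a panel outperform any single protein measured alone.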
This new technology should also cut down on repeat biopsies, which can be very expensive, and the scientists hope it will find many other applications as well, such as cancer follow-up through quantitative analysis of the tumor-associated proteins. Maybe one day the biopsy won't be needed at all, and a simple blood test will do.
Tags: cancer diagnosis
, magnetic nanoparticles
, smart phones
, tumor associated proteins
For a long time, mental retardation was believed to be incurable, as it is usually caused by gene mutations that disrupt brain development from the very beginning, even before birth. But thanks to a lot of hard-working scientists, there are now trials aiming to improve the quality of life of these patients and of their caregivers. The work has focused on a condition known as fragile X syndrome, in which a mutation in a gene called FMR1 impairs production of a protein that regulates neural development, usually leading to mental retardation whose severity depends on the extent of the mutation.
Fragile X syndrome's common physical symptoms: elongated face, large ears, etc.
Another important contributor to the condition is the metabotropic glutamate receptor 5, abbreviated mGluR5. It controls protein synthesis at the neuronal synapses and becomes hyperactive in fragile X. Seeing an interesting therapeutic target, a major pharmaceutical company developed AFQ056, an mGluR5 blocker, in the hope that it would restore normal levels of synaptic protein synthesis. The results of the initial double-blind clinical trial, conducted on 30 patients, were evaluated through notes taken by caregivers on the patients' behavioral improvements: less repetitive behavior, less hyperactivity, fewer tantrums, and better chances of establishing communication with the patients.
What seemed like a puzzle was that some caregivers reported no change at all after the patients took the drug. After analyzing the data, the researchers found that the only patients who responded to the treatment (7 of the 30) shared a particular genetic marker: complete methylation of the FMR1 regulatory sequence, and therefore a complete lack of FMR1 transcription. Another disappointment was that the drug didn't improve cognition or memory, but this, they say, might be attributed to the short duration of the trial, which lasted only 4 weeks.
The next step is to repeat the trial on 160 patients selected after testing for the marker, this time lasting 3 months, in the hope of obtaining results that are more significant for people with this illness.
Sources: Wikipedia and Science News
Tags: fragile X syndrome
, mental retardation
, metabotropic glutamate receptor 5 blockers
Forget about the old Petri dishes and culture media! A brilliant new method for growing microorganisms in order to study their behavior, especially in large communities, again tracking the phenomenon of quorum sensing, has been developed and put to use.
The new invention, by Connell et al., resembles a trapping sack for microorganisms: bovine serum albumin covalently cross-linked by laser lithography into a three-dimensional structure. These harboring chambers are tiny, with a capacity of 2 to 6 picoliters, and are permeable, allowing a continuous influx of nutrients and the other small molecules essential to the bacteria growing inside.
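To get a feel for how small these chambers are, here is a back-of-the-envelope estimate of how many bacteria they could hold, assuming a typical E. coli cell volume of roughly 1 femtoliter (my assumption, not a figure from the paper):

```python
def max_cells(chamber_pl: float, cell_volume_fl: float = 1.0) -> int:
    """Upper bound on cells packed into a chamber; 1 pL = 1000 fL."""
    return int(chamber_pl * 1000 / cell_volume_fl)

for v in (2, 6):  # the 2-6 pL capacity range from the paper
    print(f"{v} pL chamber holds at most ~{max_cells(v)} cells")
```

So each trap can house a community of a few thousand cells at most, which is exactly the scale at which quorum-sensing behavior kicks in.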
Scientists have already compared the growth rates of Pseudomonas cells in the trap sacks to those in conventional culture media and in mouse lungs, and the results were promising! The new technique lets them study patterns of antibiotic resistance, infection, and biofilm formation more clearly and at earlier phases of bacterial growth.
Source: Science, Vol. 330, Issue 6004.
Image source: Microbiology Bytes
Tags: antibiotic resistance
, culture media
, microbial trapping
, pseudomonas biofilms
, quorum sensing
When you have to make a tough decision, a difficult choice that will affect your life and the lives of those around you, you should involve them in the process; often the wisest thing to do is to unite and make the decision collectively.
From us humans all the way down to bacteria, collective decision making can be a matter of life and death, and many factors can influence how we weigh our options and switch our behavior from one mode to another. In quorum sensing, bacteria behave in a completely different way once they reach a certain population density, compared to how each would behave individually.
This is all well known, but I've lately been wondering: do viruses exhibit any form of such an "attitude"? What is the equivalent of quorum sensing in the world of viruses?
The work in this field has focused on bacterial viruses, or bacteriophages, especially the temperate lambda phage, which infects E. coli. "Temperate" means that upon infecting a bacterial cell, the virus can choose between two scenarios:
Either: the lytic pathway, in which the virus uses the bacterial resources to replicate itself many times, then bursts out of the cell, killing it and releasing the viral progeny.
Or: the latency (lysogenic) pathway, in which the virus integrates its genetic material into that of the host, undergoes minimal transcription and translation, and simply replicates along with the host genome, transferring it vertically as the cell divides.
So what steers, or even forces, the virus in a particular direction? For a long time the choice was thought to be completely random and greatly affected by environmental conditions. But Joshua Weitz (assistant professor in the School of Biology, Georgia Tech) and his team were not satisfied with this answer. They wanted to explain the experimental observation that when one virus infects a cell, the result is lysis and cell death, whereas when two or more co-infect the host, the result is latency.
Their mathematical model, based on the gene regulatory dynamics of the λ phage, shows that the answer lies in the levels of gene expression. The process turns out to be controlled by three key genes: cro, cI, and cII. They are bound together in a decision loop (a feedback system) that is nonlinear, and thus extremely sensitive to small changes in the levels of their expression into proteins, which in turn depend on the total number of viral genomes in the host.
The negative feedback arm of the loop is linked to the cro gene and is triggered by the overall lower rate of mRNA transcription when a single genome is present: the Cro protein inhibits the genes responsible for producing the lysogenic proteins, and so lysis takes place.
In the case of co-infection by two or more viruses, the overall level of viral mRNA transcription is higher (even though the increase can be quite small!). The products activate transcription of the cI gene, which is translated into lysogenic proteins that activate and accelerate the positive feedback arm, leading to even higher production of the lysogenic proteins, and the cell is kept alive and kicking for a certain period.
The cII gene acts as a "gate" to the activation of the lysogenic cascade, being activated prior to cI. This is a figure I designed to simplify the idea.
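The core idea, that a nonlinear feedback loop flips its outcome when gene dosage doubles, can be caricatured in a few lines. This toy simulation is NOT the actual model from the Weitz et al. paper; all rate constants, the Hill-type self-activation term, and the decision threshold are invented purely to show how one co-infecting genome versus two can land on opposite sides of a switch.

```python
def ci_steady_state(genomes: int, steps: int = 4000, dt: float = 0.01) -> float:
    """Euler-integrate a toy CI protein dynamic: per-genome basal production
    plus cooperative (Hill-type) self-activation, minus first-order decay."""
    ci = 0.0
    for _ in range(steps):
        production = genomes * (0.1 + ci**2 / (4.0 + ci**2))
        ci += dt * (production - 0.5 * ci)  # 0.5 = decay/dilution rate
    return ci

for n in (1, 2):
    level = ci_steady_state(n)
    fate = "lysogeny" if level > 1.0 else "lysis"
    print(f"{n} genome(s): CI level {level:.2f} -> {fate}")
```

With one genome the loop settles at a low CI level (lysis); doubling the dosage pushes it past the cooperative threshold into a high-CI state (lysogeny), even though the change in input is modest. That disproportionate response is exactly what "nonlinear" buys the virus.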
Although the matter is far from settled, the ability of viruses to make collective decisions based on the number of viral genomes in the surrounding environment could be a very important life-history trait, critical to the evolution and survival of certain types of viruses. But what I found most interesting is this: knowing about these mechanisms could let us manipulate them in the future! We might be able to slow the aggressiveness of some viral infections by driving them into latency. Even if that is not a radical cure, it could greatly improve the quality of life of many patients. Maybe in the future we will find other viral functions driven by the same mechanisms, such as host resource usage or cellular penetration, and so defeat viruses in new, unconventional ways. So let us hope, and work!
Original paper: Collective Decision Making in Bacterial Viruses, Biophysical Journal, 15 Sept 2008 (available online)
References (Citation by ResearchBlogging)
Weitz, J., Mileyko, Y., Joh, R., & Voit, E. (2008). Collective decision making in bacterial viruses. Biophysical Journal, 95(6), 2673-2680. DOI: 10.1529/biophysj.108.133694
Edited on Sep 24, 2010 (07:21 CLT)
How about taking a "closer" look at a microbe? I'll try to take you on a journey deep into the nature of the particles constituting it, down to the subatomic level, in an attempt to learn more about the beginnings of life and matter…
Since being a nuclear physicist has always been a dream of mine (one that obviously didn't come true), I have been following with great interest the news about the restart of the Large Hadron Collider (LHC) at CERN, the experiments being conducted there to learn more about the composition of matter and the origins of our universe, and the effects these discoveries will have on all the different fields of science.
My reading on this topic led me to one of the fundamental building blocks of matter: the quark. Being an elementary particle, a quark theoretically can't be broken down into smaller units. Their existence was first proposed by physicists Murray Gell-Mann and George Zweig in 1964 as "the quark model", introduced to better explain the composition of atomic nuclei; at the time, however, there was little evidence that quarks actually existed. That changed with the "deep inelastic scattering" experiments conducted in 1968 at the SLAC National Accelerator Laboratory, operated by Stanford University. Since then, six types of quarks, also known as "the six flavors", have been discovered, divided into three generations: the 1st comprising the up and down quarks, the 2nd the charm and strange quarks, and the 3rd the bottom and top quarks. The top quark, first observed at Fermilab in 1995, was the last to be discovered. Attempts to prove the existence of a 4th generation have so far all failed, but with the current LHC experiments, who knows? After all, protons, neutrons, and even atoms were once considered fundamental units of matter with nothing beyond them!
Higher-generation quarks are heavier and less stable, so they decay into the more stable types, the up and down quarks, which are the most abundant in our universe. The higher-generation quarks can only be produced under extreme conditions of heat and pressure and with the help of high-energy collisions, a state believed to have existed just after the Big Bang that created our universe.
Unfortunately, quarks can't exist on their own in space. Because of two very important physical phenomena known as "color charge" and "the strong interaction", they form composites known as hadrons, the most stable of which are protons and neutrons (so quarks are the building units of the building units of the atomic nucleus!). These very strong bonds are mediated by energy carriers known as "gluons" (the name really does come from "glue"), and therefore quarks can't be isolated singly, which has made observing them no easy task for physicists over the years. Simulating the cosmic conditions just after the Big Bang, when quarks are thought to have existed freely in what is known as "the quark-gluon plasma", is one of the major aims of the CERN experiments at the LHC.
So you might ask yourself: if they can't be observed by themselves and can't exist singly under normal conditions, how was their existence proven in the first place? The answer, simply, is that proposing their presence explains a great many physical phenomena, fits the established physical models, and gives the right answers to many experiments, so science has to admit they are there!
Quarks have lots of interesting characteristics. For example, they carry fractional charges (like -1/3 and +2/3). Each quark has an antiquark with the same mass and mean lifetime but opposite charge (+1/3, -2/3). Hadrons, made of quarks, always have integer charges: a neutron, for example, has a charge of 0, consisting of 1 up quark (+2/3) and 2 down quarks (2 × -1/3), which sum to zero. Quarks are also the only elementary particles in the Standard Model of particle physics to experience all four fundamental interactions, also known as the fundamental forces (electromagnetism, gravitation, the strong interaction, and the weak interaction).
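The charge arithmetic above is easy to check exactly with rational numbers; here is a tiny worked example (the function name is mine, the charges are the standard ones):

```python
from fractions import Fraction

# Standard Model charges of the first-generation quarks.
CHARGE = {"up": Fraction(2, 3), "down": Fraction(-1, 3)}

def hadron_charge(quarks):
    """Sum the fractional charges of a hadron's constituent quarks."""
    return sum(CHARGE[q] for q in quarks)

print(hadron_charge(["up", "down", "down"]))  # neutron: 0
print(hadron_charge(["up", "up", "down"]))    # proton: 1
```

Using exact fractions rather than floats is what makes the "always an integer" property come out cleanly.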
As for the names: Gell-Mann took the word "quark" from James Joyce's Finnegans Wake ("Three quarks for Muster Mark!"), where it is often read as a bird's cry. The "strange" quarks were so termed because they had exceptionally long lifetimes. As for the "charm" type, Glashow, who co-proposed the charm quark with Bjorken, is quoted as saying, "We called our construct the 'charmed quark', for we were fascinated and pleased by the symmetry it brought to the subnuclear world."