Patients almost always want to know what will happen when they have a disease that may prove fatal. The classic tough-guy remark, "Am I gonna make it, doc?", is often bravado covering abject terror.
A recent NEJM article has an interesting piece on prognostics in medicine. We are now seeing more and more survival curves, paired with prognostic tests built on genetic information.
The patient wants to know but unfortunately the physician really does not have a clue. The article states:
We believe that at least as much attention should be paid to clinicians' communication about the uncertainty associated with prognostication as to the search for better prognostic models. We propose a framework of three central tasks that clinicians can perform to help patients and families manage uncertainty. Physicians should tailor this framework to the core values of the patient. Some patients will value quality of life more than quantity of life, and for these patients uncertainty about future well-being may be of greater concern than life expectancy.
My concerns about better prognostics fall into the following categories:
1. Prognostic data on survival works only for large groups, not for a single patient. When I look at the Kaplan-Meier survival data for therapeutics such as the new ones for melanoma, and even for transplants for MDS, I see 20-30% survival at the tails. That tells me something, beyond giving the patient hope of being in the tail. I ask: why that group, what happened there? Unfortunately we all too often do not have an answer (see the sketch after this list).
2. Prognostic data is now being used by the Government, namely by CMS and under the ACA, for treatment directives and for physician control and compensation. Just look at the prostate cancer debacle. The "Committees" making the decisions have decided that PSA tests are useless, so they will soon disallow them. They may be useless for some, but not for all. Yet to the "Committee" they apply to all; a classic Soviet-style prognostic decision process.
3. Genetic Tests Yield Prognostic Data: With the ability to collect massive genetic data from genes to SNPs we now have researchers announcing new prognostic tests which at best are problematic and at worst harm the patients. The problem is that all too often one can statistically get "prognostic" data from any correlation, meaningless as it may be.
4. Prognostic Data on Survival Applies to Only Some: I read a recent paper on MDS survival and was truly disappointed by what was, in my opinion, poor statistical analysis. It compared low-risk and high-risk patients undergoing bone marrow transplants. It showed the higher-risk patients with better survival. It also showed the higher-risk group having initially a faster drop in the Kaplan-Meier curve and then a sustained flattening at 30%. Why? One reason may be that those who died early had poor HLA matches while those who survived had better ones, say a 10-point match versus a 5-point match. But such an analysis was not to be.
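Since much of this turns on how those survival curves are constructed, a minimal Python sketch of the Kaplan-Meier (product-limit) estimator may help. The event times below are invented purely to reproduce the shape described in items 1 and 4: an early drop followed by a flat tail near 30%.

    # Minimal Kaplan-Meier (product-limit) estimator on invented data.
    # times: follow-up in months; events: 1 = death observed, 0 = censored.
    def kaplan_meier(times, events):
        pairs = sorted(zip(times, events))
        n_at_risk = len(pairs)
        survival = 1.0
        curve = [(0.0, 1.0)]
        i = 0
        while i < len(pairs):
            t = pairs[i][0]
            tied = [e for tt, e in pairs if tt == t]
            deaths = sum(tied)
            if deaths:
                survival *= (n_at_risk - deaths) / n_at_risk
                curve.append((t, survival))
            n_at_risk -= len(tied)   # everyone at time t leaves the risk set
            i += len(tied)
        return curve

    # Hypothetical cohort: a cluster of early deaths, then long-term
    # survivors who are censored, giving the early drop and the ~30% tail.
    times = [3, 4, 5, 6, 8, 9, 30, 36, 40, 48]
    events = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]
    for t, s in kaplan_meier(times, events):
        print(f"t={t:>4} months  S(t)={s:.2f}")

Note what the estimator does and does not say: after the last observed death the curve simply stops dropping because the survivors are censored, and nothing in the computation explains why that 30% group differs. That is exactly the unanswered "why that group?" question.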
The article then states:
Prognosis, and prognostic uncertainty, have a profound influence on physicians, as well as on patients and families. Physicians' generally optimistic bias is well documented. In one study, physicians overestimated the likely duration of survival of terminally ill patients by a factor of five, and the longer the duration of the patient–physician relationship, the more optimistic the estimate. Clinicians may also have trouble with prognostic uncertainty. Some react with an unwillingness to talk to the patient about the future at all (but commonly express this unwillingness in terms such as “we have to wait and see” or “no one can tell”). Others, ignoring the uncertainty inherent in prognostication, do more and more tests in the futile hope of improving their prediction. We believe that physicians need to recognize their reaction to uncertainty and how these reactions may influence their conversations with patients.
The question of what is going to happen is always on the mind of a patient and their family when a severe diagnosis is made. All too often the physician just does not know. Consider the lung cancer patient who on average is supposed to be dead in three months but lasts three years, or the pancreatic cancer patient who also was given three months but dies of old age. They may be the exceptions, but our prognostic capabilities are still quite weak, except on the average. Yet we have never seen an "average" patient.
Monday, June 24, 2013
Obesity and Congress
There is a letter today in the NY Times from Congressional members applauding the AMA panel decision on obesity and then recommending expanding Medicare coverage for drugs and therapy. They state:
We recently introduced a bipartisan, bicameral bill to lift the ban on obesity drug reimbursement under Medicare and encourage intensive behavioral counseling. This would open doors to examine different approaches to obesity treatment.
If the costs of obesity drugs and new treatments are covered, large research studies can examine what treatments have the most effective patient outcomes. Better understanding these outcomes would foster a healthier population and lead to significant health cost savings.
Now try and follow the logic here.
First, this is Medicare, those over 65, with a lifetime of obesity in almost all cases.
Second, the damage is already done; diabetes, kidney damage, atherosclerosis, retinal damage, nerve damage, etc. Most likely they are already on chronic care regimens.
Third, research, research on what? They are fat, they ate too much. And now at 65+ we want to add this to Medicare? For what benefit?
This is one of the dumbest statements I have ever read! And two are Republicans! No wonder we are collapsing as a nation.
Labels:
Health Care
Saturday, June 22, 2013
Imagine if a Chemical Engineer Were a NYC Employee
The NY Times today discussed the collapse of the Regents grading system. The quote by one of the Officials is as follows:
“We anticipated there to be bumps,” Ms. Hughes said, noting that the city had originally set a deadline of this coming Monday to have all tests graded. “Things are moving more slowly than we had hoped.”
So hundreds of students, if not thousands (for who believes any Government spokesperson, in my opinion), did not graduate or get their grades. Could you imagine some Chemical Engineer saying, "We expected some small explosions," or a surgeon saying, "We expected a few dead patients"? Bumps in the road are all too common in Government projects; just look at the new 911 system in NYC as well. Now this is 2013, software has been developed for sixty-plus years, and testing is a sophisticated science, so why all these mistakes? I would guess because it is the Government. You see, if this were some commercial enterprise it would lose customers and go out of existence. One suspects the folks involved will get a bonus.
Labels:
Education
Friday, June 21, 2013
Health Care Costs: Whose Fault?
In a recent article in Becker's, one of the current contenders for Governor of the Commonwealth of Massachusetts makes some rather extreme statements, in my opinion.
They state:
The failure of the healthcare system stems from six main areas of waste, Dr. Berwick said: overtreatment, failures to coordinate care, failures in care delivery, excessive administrative costs, excessive healthcare prices and fraud and abuse.
Now let us examine this a bit.
1. At no time does he put any blame on the patient. Obesity, smoking, diet, sexually transmitted diseases, drug abuse, and so on are all self-inflicted causes of chronic and costly disease. We can estimate the costs quite well, but in this person's mind, in my opinion, only Government can remedy this, with more care rather than better behavior and personal responsibility.
2. Excessive health care prices are a bit confusing, since 20% of health care is Medicare, which as a monopsonist sets the price. Insurance companies also have such power. No physician ever collects what they bill.
3. Failures are perhaps a human condition, but through better process they are controllable. So what does he suggest? We never seem to get an answer.
4. Administrative costs? Ever try to deal with the Government? They are the cause, NOT the solution.
The article also states:
"We live in a civilized and wealthy country," Dr. Berwick said. "We should make healthcare a human right."
A right? A costly right. Perhaps a right to privacy should come first. Also, when rights like this are mandated from on high, they often not only miss the mark but have very negative consequences, intended or not.
Labels:
Health Care
Wednesday, June 19, 2013
AMA and Obesity
The AMA has declared obesity a disease, says Medpage. They state:
Obesity should be called a disease and not simply a condition, the American Medical Association's policy-making House of Delegates voted on Tuesday. The vote -- approved by roughly 60% of the AMA's full House -- goes against the recommendation of its Council on Science and Public Health, which issued a report earlier this week saying that calling obesity a disease would be problematic.
"Problematic" is an understatement. In almost all cases obesity is self inflicted. Again it is simply input less output is net accumulation. Too much input and at 3500 kcal per pound you get fat. In almost all cases that is the rule.
The NY Times states:
“The suggestion that obesity is not a disease but rather a consequence of a chosen lifestyle exemplified by overeating and/or inactivity is equivalent to suggesting that lung cancer is not a disease because it was brought about by individual choice to smoke cigarettes,” the resolution said.
Now just parse this statement. Lung cancer caused by smoking is a self-induced disease; why? Because it cannot be reversed by lifestyle change and will kill you. It is still self-induced and preventable. Obesity is self-induced and absolutely reversible. The reversal can also often stop and reverse the underlying Type 2 Diabetes. Calling it a disease may make the physician's job easier by not demanding the patient do something, but it explodes the costs on those of us who watch our weight.
The Times also states:
Today, the AMA adopted policy that recognizes obesity as a disease requiring a range of medical interventions to advance obesity treatment and prevention. “Recognizing obesity as a disease will help change the way the medical community tackles this complex issue that affects approximately one in three Americans,” said AMA board member Patrice Harris, M.D. “The AMA is committed to improving health outcomes and is working to reduce the incidence of cardiovascular disease and type 2 diabetes, which are often linked to obesity.”
The Times also states:
One reason in favor, it said, was that it would reduce the stigma of obesity that stems from the widespread perception that it is simply the result of eating too much or exercising too little. Some doctors say that people do not have full control over their weight. Supporters of the disease classification also say it fits some medical criteria of a disease, such as impairing body function.
This creates a broad definition of disease. This is, in my opinion, one of the most ill-conceived things ever done in the field of medicine. People who cause the problem should face the consequences; instead they impose the costs of their own lack of will on others.
The AMA's statement quoted above makes it even worse. It says obesity is a disease and requires "medical intervention". It just requires shutting one's mouth. All one must do is look around a physician's office and see most of the staff morbidly obese, and then look behind the screen and see food lying all about, with continual consumption. The solution is simple: get physicians to start the process by denying any food in the office, and by seeking non-obese help, if at all possible.
Labels:
Health Care
Tuesday, June 18, 2013
Are Terrorists That Dumb?
Terrorists are most likely like spies; at least one would think they would or should be. Spies have been around for a real long time, a real long time. The Persians had spies in Greece some 2,500 years ago, so it is nothing new. Spies, and one would suspect terrorists, communicate amongst one another, and they would most likely do so in some secure manner, not, say, sending emails on some well-known site or planning in an open "chat room". But perhaps they are really that dumb.
There is an article in Science in which alleged "experts" applaud the current administration's massive tactics of seeking these folks by examining what is in plain sight. In fact they state:
“I can tell you that this kind of thing is extremely effective,” says Alex Pentland, a computational social scientist at the Massachusetts Institute of Technology in Cambridge who has studied phone networks.
Now, being told this by someone from the Media Lab is not really that comforting to me, but perhaps it is somewhat truthful; I would not know what their basis is. The article continues:
The first step, Uzzi says, would be to map who calls whom—with each person represented by a point or “node” and each person-to-person link represented by a line or “edge”—to create a simple communication network. Next, the analyst would study details of calls—their frequency, duration, and timing—to determine how closely connected each pair of people is. This step breaks the communication network into smaller, overlapping social networks. The third step would be to study the dynamics of a social network, to see how activity ebbs and wanes and the network evolves. The fourth step would be to try to correlate the dynamics of the network with external events,
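To make the quoted pipeline concrete, here is a minimal Python sketch of its first two steps, mapping who calls whom and weighting each edge by call frequency, on invented call records; real traffic analysis would add duration, timing, community structure, and correlation with external events.

    # Steps 1 and 2 of the quoted pipeline on invented (caller, callee) pairs:
    # build the communication graph, then weight edges by call frequency.
    from collections import Counter

    calls = [("A", "B"), ("A", "B"), ("B", "C"), ("A", "C"),
             ("D", "E"), ("D", "E"), ("D", "E"), ("E", "F")]

    # Each undirected edge's weight is how often that pair talked.
    edges = Counter(frozenset(pair) for pair in calls)

    for pair, n_calls in sorted(edges.items(), key=lambda kv: -kv[1]):
        a, b = sorted(pair)
        print(f"{a} -- {b}: {n_calls} call(s)")

Even this toy version makes the limitation plain: the graph contains only links that appear in the records, so anyone communicating outside them is invisible to it.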
This assumes the "plain sight" scenario. If we take spies as the example, they have two options; hide in plain sight and be totally covert. When in plain sight you must be transparent, not attract attention, and be as if you were a sleeper. You could not stand out, you don't communicate, and you must fit in with everyone else. If you are covert than you stand apart and in this case you must use secure and covert means of communicating. Thus codes, secure comm links, such as wireless HF digital links, so that no one knows just who you are calling.
The use of the above techniques works well in commercial applications, where people volunteer their information and have no desire to "hide". We can find that people on, say, Staten Island order more pizza deliveries per 1,000 households than any other area in the US, and further that they have the greatest BMI in the US as well. But if we had some Chinese espionage agent, I doubt that they would call the Embassy to transmit data, I would think.
I can recall all the means and methods of WW II and those of the Cold War, so why would they not be employed? Can we really detect those hiding in plain sight? Perhaps, but the Fort Hood terrorist was shouting out his plans and no one ever paid attention. What happened there? He apparently even gave presentations that should have alerted a mall guard.
Thus the issue is less the value than the very rights we would expect. As for Congress and its oversight, I have found that all too often those providing oversight just do not have a clue technically, so what good does it do?
I am still in favor of human intel, people, people where the action is. Again, just sitting in that bar in Istanbul, the coffee shop in Algiers, the shops in Islamabad. Words, accents, clothing, shoes, packages, hair cuts, people meeting, those are the tools of intel, and the ability to have a keen insight into the obvious.
Labels:
Commentary
The Philadelphia Chromosome
The book by Wapner, The Philadelphia Chromosome, is simply perfect. It is a wonderful balance of science, personalities, inventions, and medicine. It flows smoothly, giving the reader the opportunity to catch up each time a new step is taken. The topic is simply the best example of how we are beginning to understand cancer in a detailed manner, allowing us to develop therapeutics that turn many cancers from death sentences into chronic diseases. The focus of Wapner's book is CML, chronic myelogenous leukemia, a blood cancer in which the white cells start changing and slowly increase, and then in a blast crisis cause total collapse and death. This is the story of how we came to understand what causes it and how to stop it.
She takes the discovery of the Philadelphia chromosome and walks the reader step by step to the acceptance and use of imatinib. I remember a colleague in Vienna, Austria, who in 2002 came down with CML at a young age; I had been following the imatinib development, and after a few calls it was possible to get him on the therapeutic, and he survived. This tale has so many coincidences, most positive, that its telling is almost mandatory to best understand where cancer therapeutics is progressing today.
The tale presented by Wapner is fairly straightforward; it is a mixture of science, luck, coincidence, and human nature.
The first part of the book takes us from the early 1950s through the mid-1980s, where we go from not understanding to having a somewhat clear scientific comprehension of both the problem and a remedy. The first part moves through the following:
1. Using an innovative way to look at chromosomes, the 1959 discovery of a chromosomal alteration indicated that a hematological cancer, CML, presents with this abnormality. A short end on one chromosome and a longer end on another were observed. This was just six years after the Watson and Crick paper.
2. Recognition of a cancer-generating gene.
3. Recognition of the fusion of two chromosomes.
4. The understanding of how protein kinases work, via the src gene from chicken sarcomas. In fact, on p. 47 the author describes this process, and she does a superb job of highlighting what will become a significant key in the overall development of imatinib.
5. The discovery of the Abelson gene, abl, and its relationship to cancer. On p. 63 the author relates how Abelson was berated in 1969 by physicians and the president of AACR, the cancer research society. Frankly, at that time cancer was the realm of the surgeon, who often just cut as much as possible, often doing more damage than good.
6. The new Bcr/Abl protein: the fusion seen on the chromosomes is identified as the fusion of two genes. The author on p. 72 has a wonderful description of this insight.
7. Bcr/Abl causes cancer. On p. 83 the author discusses the work in David Baltimore's lab in the early 80s. At the same time they found that abl had come from chromosome 9 and was transported to chromosome 22, where it fused with bcr.
8. The tale of Druker and his persistence and dealings with Ciba-Geigy, eventually to become Novartis, starts at this point. Here we see the drive of pushing, connecting, developing, and frankly persevering, with a positive result. It was recognized that the Bcr/Abl fusion protein was the driver of CML, that its activity was driven by ATP, and that ATP binds at a pocket on the protein. Thus, if a new molecule could bind there and block the ATP, the gene's effects, namely cancer, could be blocked. Thus starts the second part of the book.
The second part of the book is much less linear and shows the complexity afoot in the pharma world. Even though Druker, and at this point many others, saw the path forward, it was necessary to engage the pharmas, with their massive powers as well as their bureaucracies. Thus we have the tale of this part. The author does a superb job of shedding some light on the development and testing of therapeutics. One walks away seeing the pharma as the most bureaucratic part, with the FDA and Government entities almost acting as sideline players at best.
The infighting in the pharmas, the conflicts between development, marketing, toxicology, and management, is a wonderful tale, typical of so many large institutions. The lesson from part one is the brilliance of the dedicated researchers; the lesson from part two is that one wonders how any therapeutic is ever developed. In today's world it would take fifty years to approve aspirin!
Wapner's book presents great history, but it raises the question of future progress. Although that was not Wapner's intent, she sets up the question quite nicely. She shows how research and development proceeded from the 1950s through the end of the century. I would argue that we are at the beginning of a new paradigm of development, and it is not clear that the institutions are at all prepared to deal with it.
There are several drivers which make the future even more interesting.
First, communications over the period of 1959 through 2002 were driven most often by personal contact, journals, and conferences. Today the Internet spreads results and data around the world instantaneously. The chance occurrences are increased by orders of magnitude.
Second, as we look at kinases we now understand them first as intracellular networks and pathways, and second as distributed spatio-temporal systems. This means that we are moving from the world of bench researchers and their singular focus to the "engineer" and their systematic approaches. Cancer is viewed as an unstable multidimensional system. Fortunately there are many tools, and much expertise, to deal with such a paradigm.
Third, and this is an exceptionally critical change, we have multi-national participation, with an explosion of work from countries like China. One is starting to see more and more of the fundamental work arising not just from the US and Europe but from Asia. These three factors will most likely be accelerators for the tale told by Wapner. However, she also includes the cautionary tale of the pharmas and the regulatory bodies, oftentimes the brakes on progress. That will most likely be the challenge to realizing true progress on cancer.
The future may very well be driven by the observations of Eddington and Einstein:
“It is also a good rule not to put too much confidence in the observational results that are put forward until they are confirmed by theory.” (Arthur Eddington)
“It is quite wrong to try founding a theory on observable magnitudes alone. It is the theory which decides what we can observe.” (Albert Einstein)
The two quotes frame the changes which are occurring in the understanding of cancer. The tale told by Wapner was initially data driven; there was no model, it had to be constructed. Now we have models: we understand pathways, and we understand where they fail and thus result in cancers. With the models we can hopefully make better and faster sense of the data. Wapner sets the path to best understand that progress.
Friday, June 14, 2013
Genes and the Left Wing
Apparently the left-wing bloggers have determined that Justice Scalia does not "believe" in evolution. One of them, who in my opinion is all too often a bit over-snarky, states:
The text goes on just like that: simply summarizing molecular biology. That’s right, Justice Scalia can’t confirm these details with his knowledge (valid) or his belief (um, what?).
Can't he hire a clerk to teach him molecular biology?
Now let us examine just what this complaint seems to be.
Justice Scalia states:
I join the judgment of the Court, and all of its opinion except Part I–A and some portions of the rest of the opinion going into fine details of molecular biology. I am unable to affirm those details on my own knowledge or even my own belief. It suffices for me to affirm, having studied the opinions below and the expert briefs presented here, that the portion of DNA isolated from its natural state sought to be patented is identical to that portion of the DNA in its natural state; and that complementary DNA (cDNA) is a synthetic creation not normally present in nature.
Now one should note several things. First, belief is a legal term of art, and before commenting the Justice indicated that this is a complex issue which may not have been fully illuminated. Second, and this is critical, Scalia, unlike the author of the opinion, uses the correct term for cDNA, complementary DNA, not the term we have commented on before. Thus Justice Scalia is spot on, both in delimiting his knowledge given the complexity of the issue and in using the correct term.
Thus, as is so often the case, he appears to be correct, and the left-wing bloggers just wrong. So what else is new?
Labels:
Commentary
Thursday, June 13, 2013
Women and Lunch
Someday I must cross the border to visit one of my former students in Ottawa, and at the same time visit Frances Woolley. Today she writes a long blog post about men and women having lunch together. Now, since I am old enough to be her father, perhaps I may be able to shed some light on the situation; that man and woman thing.
You see, when I grew up it was during WW II and there were few men around. The world as I saw it was controlled and managed by women: mother, grandmother, aunts, cousins. The men, I intuited, were sent off to war, and some never returned. I even envisioned that the women made all these choices; after all, even all my teachers were women, and the girls often got the best grades. There were indeed very few men. I was the minority, and there always was that concern about what happened when you grew up. But perhaps an observation may also be useful. The "Baby Boomers" came along after our "Silent Generation", and the key difference was that the men were home, during perhaps some Freudian key period in their lives. For us the men came home after we had been imprinted; for them the imprinting was dramatically different. Just an observation.
Now segue to the late sixties: I am teaching at MIT, my first major assignment the sophomore electronics course. I never saw a woman in class, and then, there in the front row, was a real woman. Yes, I made the mistake of asking if perhaps she was in the wrong class. Today that question would have you drawn and quartered, but then it was trying to be polite. No, she replied, she was a Biology major and she wanted to learn electronics; a laudable goal, I thought. From that point I never asked again, and the numbers grew exponentially.
By the late 1980s I had accumulated a few female students, and I would travel back and forth, but the thought that there could be any issues never crossed my mind, nor my wife's. You see, I still had my WW II memories, and now they were colleagues, not students. I knew their families, their spouses, and eventually even their children. They were professionals, and after all these years still are. I have even assembled cribs for my students' children, and perhaps they may be told later in life of such a heroic event; namely, my doing anything really mechanical!
Now Frances writes:
As a person becomes professionally more established, too, new challenges arise. Is it okay to go out for dinner with a co-author in another city? To go out drinking with a former supervisee? Nick Rowe, in his position as Associate Dean, will go out for drinks with the (male) Dean. But is it appropriate for a female Associate Dean and a male Dean to go out drinking?
Drinks and dinner: well, it all depends. In the context of normal professional relationships I see no problem; in the context of how it "looks", it all depends. Are there predatory individuals? Yes, and I have seen them, but I suspect that it is they who often look askance at others. In New York today there are no second glances, no issues with such meetings. Then again, I feel safe with age, there being nothing more disarming than being perceived as a grandfather.
So Frances, whenever I get to Ottawa I would love to have a drink, and I will bring my wife and my former student, of course assuming she can get a baby sitter.
Labels:
Commentary
The Myriad Decision and cDNA
The Supreme Court states:
They can also synthetically create exons-only strands of nucleotides known as composite DNA (cDNA). cDNA contains only the exons that occur in DNA, omitting the intervening introns.
As NCI states:
The next step is to convert the mRNA back into a DNA molecule in the test tube. This can be thought of simply as reversing what went on in the cell when the gene DNA was switched on and mRNA was made by base pairing. This is a two-stage process. First, each mRNA is copied into a new DNA strand using base pairing to form a mRNA-DNA duplex. Next, the mRNA is chopped up and removed, and the DNA strand is used to make a second DNA strand. This double-stranded DNA is called complementary DNA or cDNA. Thus, each cDNA in the test tube originally came from a specific mRNA in the cell.
But in the MIT cDNA course they call it complementary DNA, NOT "composite DNA" (sic). Even Wikipedia uses that term. It is amazing that on page 1 they make this massive fundamental mistake. Any AP Biology high school student knows this. One wonders who crafted this document. Perhaps that is what is wrong with our Government.
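For what it is worth, the "complementary" in cDNA is literal base pairing. A toy Python sketch of the first stage NCI describes, copying an mRNA into a DNA strand; the sequence is invented, and this is string bookkeeping, not chemistry:

    # Each mRNA base pairs with one DNA base (A-T, U-A, G-C, C-G).
    MRNA_TO_DNA = {"A": "T", "U": "A", "G": "C", "C": "G"}

    def first_strand_cdna(mrna):
        # Synthesis runs antiparallel, so read the complement in reverse.
        return "".join(MRNA_TO_DNA[base] for base in reversed(mrna))

    mrna = "AUGGCCUAA"               # invented message: start, Ala, stop
    print(first_strand_cdna(mrna))  # TTAGGCCAT

Nothing about this strand is "composite"; every base is determined by its complement in the message, which is exactly why the term matters.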
Labels:
Government,
Law
Lions, Tigers and Bears
I live less than 20 miles from the southern tip of Manhattan as the crow flies. It is in the most densely populated state in the US. We have red fox and coyote in the neighborhood, not to mention deer, rabbits, chipmunks, turkey, and the squirrel. But we now have a bear, yes, a 200 pound bear walks across my driveway at night eating up whatever we have left behind.
Now it is not that I have anything against bears, we have one in New Hampshire, but this guy likes my daylilies, my precious hybrids. Do I call Christie? What will EPA say? And yes he did check out my car...hopefully he does not decide to take a ride.
Labels:
Commentary
Theory vs Data: Or Just a New Gene?
What comes first, the chicken or the egg? In science we often think the data is the dominant sine qua non. We see that ever more today as we examine all of the researchers who “find” another gene “causing” cancer. The problem is that finding a new gene is just too easy, and the Press is all too ignorant to ask what it really means.
There are two quotes worth noting, one from the work on DNA itself and the second from the folks who brought us Quantum Mechanics.
First, there is a quote from the book by Judson on DNA (p. 93): “It is also a good rule not to put too much confidence in the observational results that are put forward until they are confirmed by theory,” Sir Arthur Eddington wrote in 1934; his paradoxical inversion of the inductive system as preached from Bacon to Russell has become an epigraph for the latter-day recension of the scientific method as practiced.
The second is a quote from a discussion of Quantum Mechanics by Gribbin (pp. 139-140): At one point Einstein had commented: “It is quite wrong to try founding a theory on observable magnitudes alone. It is the theory which decides what we can observe.”
In both cases there is the imperative to ultimately put all data in the context of a world view, a model of reality that links inputs and outputs, and which can become both the language of the very concepts and the sounding board upon which measurements are made.
All too often we see researchers just dumping a ton of new genes and arguing that they are causative. On the other hand we have detailed pathway models demonstrating cause and effect. Yet the discoverers of the new gene seem never to place them in a context. They are at best correlative, and most likely not causative.
References:
Judson, H., The Eighth Day of Creation, Touchstone (New York), 1979.
Gribbin, J., Erwin Schrodinger and the Quantum Revolution, Wiley (New York), 2013.
Labels:
Commentary
Carrot or Stick
The carrot or the stick is often the choice in motivating people to respond. The ACA appears to allow carrots but not sticks, except perhaps in some insurance pricing schemes where the carrot has failed. In a JAMA article the authors examine the ACA and the concept of incentives. They state:
But HIPAA did not render health factors completely irrelevant, and neither does the ACA. In fact, the ACA could be considered to strengthen the link between health status and insurance coverage terms in one respect. Under the HIPAA exception for “programs of health promotion and disease prevention,” employers are permitted to tie premiums or co-payments to tobacco use, body mass index (BMI), or other health factors as long as certain requirements are met. The ACA continues and expands on this policy, supporting the use of outcome-based health incentives within both public and private insurance.
The HIPAA issue related to privacy. The ACA, on the other hand, relates to "incentives", positive motivations to have the person do the right thing. The counter would be punishment for doing the wrong thing, say being taxed at ever-increasing rates if your BMI exceeds 25.0.
However, they do relate the ACA's double-edged sword approach, where they state:
Programs' health-contingent incentives could not in aggregate exceed 20% of insurance coverage costs. For example, if the cost of coverage were $5000, which is close to the average cost of employer-sponsored individual coverage, a plan could give a reward equal to $1000 to nonsmokers or, alternatively, impose a $1000 surcharge on smokers. Programs were also required to offer a “reasonable alternative standard” or waiver for those for whom it was “unreasonably difficult due to a medical condition” to meet a program standard
They continue to discuss the two sides of motivation. They conclude:
If incentive programs are poorly designed, however, they may do little to change health outcomes. A reward based on achieving a BMI of 25 might lead to weight loss among individuals with BMIs of 26 or 27 but not among those with BMIs of 36 or 37.
Research on recurring errors in individual decision making suggests that premium adjustments may be less effective motivators than incentive programs that incorporate immediate and frequent feedback as well as highly visible and salient rewards and that harness behavioral motivators such as anticipated regret.
If incentives fail to change behavior, higher-risk individuals may be left in a worse financial position, undermining the ACA's efforts to weaken the link between health status and insurance costs and potentially threatening distributional equity.
This is a bit confusing. What is meant by a "worse financial position"? Clearly if one has a BMI of 37 then one subjects oneself to very high risk for various diseases, and we then all must bear the ongoing burden of the costs of handling them. If however we actually computed the cost per incremental 0.1 in BMI and then applied it to each person, that would be an efficient cost-recovery mechanism (a sketch follows below). After all, we have the IRS in every other corner of our lives; why not have it do our annual physicals as well? Just imagine some morbidly obese GS-9 with an attitude telling you to get up on a scale and then yelling out to her associate your weight and BMI while hundreds of others wait in line. Not too far-fetched from where we seem to be going.
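As a Python sketch of that per-increment idea, with an entirely invented rate per 0.1 BMI (an actual rate would need actuarial cost data):

    # Hypothetical surcharge proportional to each 0.1 of BMI above 25.
    # The $12 per 0.1 BMI rate is invented purely for illustration.
    RATE_PER_TENTH = 12.00
    BASELINE_BMI = 25.0

    def annual_surcharge(bmi):
        tenths_over = max(0.0, bmi - BASELINE_BMI) * 10
        return round(tenths_over * RATE_PER_TENTH, 2)

    for bmi in (24.0, 27.0, 31.5, 37.0):
        print(f"BMI {bmi:>4}: ${annual_surcharge(bmi):,.2f}/yr")

Note that this graduated design avoids exactly the flaw the JAMA authors identify: a reward pegged to reaching a BMI of 25 gives someone at 37 no reachable target, while a per-increment price makes every 0.1 count.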
Labels:
Health Care
Genes and Patents
The US Supreme Court has just ruled in the Myriad case. The Court holds:
A naturally occurring DNA segment is a product of nature and not patent eligible merely because it has been isolated, but cDNA is patent eligible because it is not naturally occurring.
(a) The Patent Act permits patents to be issued to “[w]hoever invents or discovers any new and useful . . . composition of matter,” §101, but “laws of nature, natural phenomena, and abstract ideas” “‘are basic tools of scientific and technological work’” that lie beyond the domain of patent protection. The rule against patents on naturally occurring things has limits, however. Patent protection strikes a delicate balance between creating “incentives that lead to creation, invention, and discovery” and “imped[ing] the flow of information that might permit, indeed spur, invention.” This standard is used to determine whether Myriad’s patents claim a “new and useful . . . composition of matter,” §101, or claim naturally occurring phenomena.
(b) Myriad’s DNA claim falls within the law of nature exception. Myriad’s principal contribution was uncovering the precise location and genetic sequence of the BRCA1 and BRCA2 genes. cDNA is not a “product of nature,” so it is patent eligible under §101.
(c) cDNA does not present the same obstacles to patentability as naturally occurring, isolated DNA segments. Its creation results in an exons-only molecule, which is not naturally occurring. Its order of the exons may be dictated by nature, but the lab technician unquestionably creates something new when introns are removed from a DNA sequence to make cDNA.
(d) This case, it is important to note, does not involve method claims, patents on new applications of knowledge about the BRCA1 and BRCA2 genes, or the patentability of DNA in which the order of the naturally occurring nucleotides has been altered.
This is a game changing decision and is worth the reading.
Labels:
Law
Wednesday, June 12, 2013
Prostate Cancer Testing
I noticed a piece which gives prices for various prostate cancer gene tests. The article in Spectrum Online states:
The newest test was developed by Genomic Health Inc., which has sold a similar one for breast cancer since 2004. Doctors at first were leery of it until studies in more groups of women proved its value, and the same may happen with the prostate test, said Dr. Len Lichtenfeld, the American Cancer Society’s deputy chief medical officer.
The company will charge $3,820 for the prostate test and says it can save money by avoiding costlier, unnecessary treatment. Another test for assessing prostate cancer risk that came out last summer — Prolaris by Myriad Genetics Inc. — sells for $3,400.
We discussed the CCP test in a detailed White Paper and expressed our concerns, based not upon any specific deficiency but upon what is, in our opinion, a lack of reproducibility. We do not have any data on the second test. But if our understanding of the CCP test is correct, it is used after a biopsy and tests 31 genes in the cancer-ridden prostate. The test then provides a statistical measure of death within some period of time, given a scalar metric based upon what appears to be an undisclosed processing of gene expression.
Thus the test allegedly tells a patient who has undergone a biopsy, with some level of Gleason score, whether they have a good or bad chance of survival.
Thus the patient is charged for the biopsy, the path study, the test, and then gets to decide what? That is where I would have the problem.
The article continues:
About 240,000 men in the U.S. are diagnosed with prostate cancer each year, and about half are classified as low risk using current methods. Doctors now base risk estimates on factors such as a man’s age and how aggressive cells look from biopsies that give 12 to 14 tissue samples. But tumors often are spread out and vary from one spot to the other.
We have shown that if one uses 24 or more cores, depending on prostate volume, one can reduce the risk of missing the diseased segments. We have moved from six cores, to 12, to 24, and some go as high as 36. With a highly competent urologist and 24 cores one should manage a high detection probability. Furthermore, we know that if upon detection we have a Gleason of 7 or more, then most likely we have a serious problem. Yes, many Gleason 7 cancers progress slowly, and yes, it would be good to know which do not, yet the above tests in my opinion have a way to go. Just my opinion.
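A crude way to see why more cores help: if a single core sampled a small lesion with probability p, and cores were independent (they are not; real cores follow a spatial template, so this is only a sketch), the miss probability falls geometrically with the number of cores. A minimal Python illustration:

    # Miss probability under an optimistic independence assumption:
    # P(miss) = (1 - p)^n for n cores, each hitting with probability p.
    def miss_probability(p_hit_per_core, n_cores):
        return (1.0 - p_hit_per_core) ** n_cores

    # p = 0.10 per core is invented for illustration.
    for n in (6, 12, 24, 36):
        print(f"{n:>2} cores: P(miss) = {miss_probability(0.10, n):.3f}")

Even under this optimistic assumption a 12-core template misses such a lesion more than a quarter of the time, which is the argument for 24 or more cores.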
Yet one of my major concerns is the possibility that a prostate cancer stem cell may have migrated to a distant site, say the bone, and that the test would miss that event and provide a false sens of security. Furthermore, even in the biopsy samples the stem cell may have been missed and the cells detected may be considered indolent but the ore aggressive cells have remained in place.
Just grossly testing for the presence of certain genes is interesting but in my opinion far from conclusive.One need just look at the paper by Navin and Hicks and examine the various problems with the approaches mentioned above. As Navin and Hicks state:
Defining the pathways through which tumors progress is critical to our understanding and treatment of cancer. We do not routinely sample patients at multiple time points during the progression of their disease, and thus our research is limited to inferring progression a posteriori from the examination of a single tumor sample. Despite this limitation, inferring progression is possible because the tumor genome contains a natural history of the mutations that occur during the formation of the tumor mass.
There are two approaches to reconstructing a lineage of progression: (1) inter-tumor comparisons, and (2) intra-tumor comparisons. The inter-tumor approach consists of taking single samples from large collections of tumors and comparing the complexity of the genomes to identify early and late mutations. The intra-tumor approach involves taking multiple samples from individual heterogeneous tumors to compare divergent clones and reconstruct a phylogenetic lineage. Here we discuss how these approaches can be used to interpret the current models for tumor progression.
We also compare data from primary and metastatic copy number profiles to shed light on the final steps of breast cancer progression. Finally, we discuss how recent technical advances in single cell genomics will herald a new era in understanding the fundamental basis of tumor heterogeneity and progression.
Thus the multiplicity of ways tumors progress means that taking samples at one time and place most likely will not reflect upon the true status of the systemic disease.
The NY Times says:
The test looks at the activity level of 17 genes in the biopsy sample and computes a score from 0 to 100 showing the risk that cancer is aggressive. To see how well the test worked, testing was performed on archived biopsy samples from 412 patients who had what was considered low or intermediate-risk cancer but then underwent surgery. In many such cases, the tumor, which can be closely studied after it is surgically removed, turns out to be more aggressive than thought based on the biopsy, which looks at only a tiny sample of the tumor.
We have shown that if one uses 24 or more cores, depending on prostate volume, one can reduce the risk of missing the diseased segments. We have moved from six cores, to 12, to 24, and some go as high as 36. With a highly competent urologist and 24 cores one should manage to have a high detection probability. Furthermore, we know that if upon detection we have a Gleason of 7 or more, then most likely we have a serious problem. Yes, many Gleason 7 cancers progress slowly, and yes, it would be good to know which do not, yet the above tests in my opinion have a way to go. Just my opinion.
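To make the core-count arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. It assumes, quite unrealistically, that each core is an independent random draw and that the tumor focus occupies a fixed fraction of the sampled gland volume; real template biopsies are spatially correlated, so this overstates the gain from added cores, and the numbers are illustrative only.

    # Probability that at least one of n independent cores hits a
    # tumor focus occupying fraction f of the sampled gland volume.
    def detection_probability(n_cores, tumor_fraction):
        return 1.0 - (1.0 - tumor_fraction) ** n_cores

    # Illustrative run for a small (5%) focus:
    for n in (6, 12, 24, 36):
        print(n, round(detection_probability(n, 0.05), 2))
    # 6 -> 0.26, 12 -> 0.46, 24 -> 0.71, 36 -> 0.84

Even under these generous assumptions the curve flattens, which is consistent with the diminishing returns of moving much past 24 cores.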
Yet one of my major concerns is the possibility that a prostate cancer stem cell may have migrated to a distant site, say the bone, and that the test would miss that event and provide a false sense of security. Furthermore, even in the biopsy samples the stem cell may have been missed, and the cells detected may be considered indolent while the more aggressive cells have remained in place.
Just grossly testing for the presence of certain genes is interesting but in my opinion far from conclusive. One need just look at the paper by Navin and Hicks and examine the various problems with the approaches mentioned above. As Navin and Hicks state:
Defining the pathways through which tumors progress is critical to our understanding and treatment of cancer. We do not routinely sample patients at multiple time points during the progression of their disease, and thus our research is limited to inferring progression a posteriori from the examination of a single tumor sample. Despite this limitation, inferring progression is possible because the tumor genome contains a natural history of the mutations that occur during the formation of the tumor mass.
There are two approaches to reconstructing a lineage of progression: (1) inter-tumor comparisons, and (2) intra-tumor comparisons. The inter-tumor approach consists of taking single samples from large collections of tumors and comparing the complexity of the genomes to identify early and late mutations. The intra-tumor approach involves taking multiple samples from individual heterogeneous tumors to compare divergent clones and reconstruct a phylogenetic lineage. Here we discuss how these approaches can be used to interpret the current models for tumor progression.
We also compare data from primary and metastatic copy number profiles to shed light on the final steps of breast cancer progression. Finally, we discuss how recent technical advances in single cell genomics will herald a new era in understanding the fundamental basis of tumor heterogeneity and progression.
Thus the multiplicity of ways tumors progress means that taking samples at one time and place most likely will not reflect the true status of the systemic disease.
Labels:
Cancer,
Health Care
Tuesday, June 11, 2013
Data, Data, Data
This flap about the monitoring of personal data reminds me of the general problem of data. Back some decades ago I did some work tracking Soviet nuclear subs, a summer-job type thing. My task was to try various schemes on the massive data files to discriminate between whales and subs. Specifically, my job was to determine the sensitivity of discrimination to a single parameter, using a stored database of alleged whales and submarines. It was not clear whether this was ever used or even if it was important.
Now I worked this data to death. There were tons of data. It was the worst job I ever did in my life. Working for the New York Sanitation Department in January 1960 shoveling snow was better. At least there was an achievement: no snow at the crosswalks. I ran every possible variation, with little understanding of what needle in this haystack I was looking for.
But then what I did was to step back, re-frame the question, and ask whether the process I was contemplating could ever achieve what I had set out to do. I did a detailed analysis of the situation and when complete, the data notwithstanding, it was clear you could not do what I had set out to do, at least the way I had set out to do it. Then again this may have been a real challenge, or it may have just been one of those management games that large companies played; after all, it was just for the summer. What it did teach me was that one does not just wander aimlessly around data; one needs a theory, a model, a physical realization and embodiment. Once I created the model and did the analysis, the data could become meaningful or meaningless. Sometimes data helps; often it can confuse. In fact one may have captured the wrong data.
Now what does this have to do with this current issue? Before answering that let me give two other examples. First is Excel. I would argue that the collapse of the dot-com bubble was as much the fault of Excel as it was the hype; indeed the hype was Excel. For by then any moron could gin up data by the truckload, put it in an Excel spreadsheet, and come up with a trillion-dollar business worth billions. And since it was based upon data and done with Excel, it must be true. Nonsense!
The next example is the microarray and cancer genes. We have enabled folks to run arrays on hundreds of genes, and from that, using again their Excel spreadsheets, we have almost daily announcements of new genes causing some form of cancer. Namely, some loose correlation becomes causation.
Now to massive data. One needs discriminant functions; namely, one must have some idea as to what to look for. Frankly, given no initial theory, one can find anything, and anything can be big, real big. Data supports theories; it is not the theory. Data can often be wrong, wrong by interpretation or by collection.
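To illustrate the point that, absent a model, one can "find anything" in a big enough pile of data, here is a small Python sketch using only numpy. Everything in it is synthetic and assumed: 100 samples, 1,000 pure-noise features, random labels. Scanning for the best single-feature discriminant produces an in-sample rule that looks predictive and then collapses to coin-flipping on fresh noise.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 100, 1000                     # 100 samples, 1000 pure-noise features
    X = rng.normal(size=(n, p))
    y = rng.integers(0, 2, size=n)       # labels with no relation to X

    # "Discover" the feature whose class means differ most, then use
    # its midpoint as a one-dimensional discriminant.
    diff = np.abs(X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0))
    best = int(np.argmax(diff))
    rule = X[:, best] > X[:, best].mean()
    hits = (rule == y).mean()
    acc_in = max(hits, 1.0 - hits)       # orient the rule favorably

    # The same feature applied to fresh noise collapses to a coin flip.
    X2 = rng.normal(size=(n, p))
    y2 = rng.integers(0, 2, size=n)
    rule2 = X2[:, best] > X2[:, best].mean()
    acc_out = (rule2 == y2).mean()
    print(round(acc_in, 2), round(acc_out, 2))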
Now how does good Intel really work? The same old way it always has: feet on the ground, snippets at a bar in Athens, a coffee shop in Tangiers, a small bistro in Marseille. It is listening and gathering and having a team of dedicated, smart, loyal people, not Government employees. It used to work that way for a while. Today it has become all politics.
Besides, the current problem is what has been going on for a long time now: sloppy control of data. Solve that problem and you solve everything. We did that once; it worked somewhat better than this mess. Perhaps we just kill the computers and reintroduce the typewriter.
Labels:
Government
Sunday, June 9, 2013
Typing, A Skill or What?
In her classic style Frances Woolley has written a somewhat interesting piece regarding the fine art of typing. She poses the issue as follows:
Learning how to touch-type is a classic example of a human capital investment. It requires hours of practice, but there is a big productivity pay-off. When typing becomes a purely automatic process, when there is no need to ever even glance at the keyboard or think about which key to use, the writer is free to concentrate on writing - something called cognitive automaticity. A person who can type quickly can get things done rapidly and efficiently. I can answer emails, write memos or accomplish other routine tasks faster than most economists, because I can type quickly. Moreover, learning good posture at the keyboard has health benefits, because it reduces the risk of repetitive strain injury. So why isn't typing taught in school? Why doesn't every student graduate from high school knowing the best way to use a keyboard? Why do schools no longer offer the kind of intensive training in typing that is needed to become a highly proficient typist, working consistently at 80 words per minute or more? I don't know, but I have theories.
Now I never learned to type. I use two fingers and somehow still manage to get out well over 500,000 words per year. In fact my handwriting is so poor that I could finally practice Medicine. But to comment on what Frances was getting at, let me add an additional dimension.
Back in the 60s when I was doing my theses and drafts of my first books, I wrote everything in longhand on yellow pads and in pencil. I still have the first draft of the first book. The sweat stains from un-air-conditioned offices are still on the paper. When I wrote then, I wrote with the typist in mind, for the typist was a true barrier to thought. I could not type (yes, a little, but not as I do today). So I wrote in a rigid and final form. I had to be certain that what I wrote was the way it would be in final form, because I dreaded the typists; they really hated doing draft after draft. It was the end of the 60s and liberation of all types was exploding, except for those of us who hated typewriters.
With computers and word processors I can now type everything, and retype, cut and paste, insert images, graphs, and best of all equations. I really love equations. I have not had a secretary since 1990. My folks tried to get me one once, but for various reasons the secretary lasted three days, and I was not even on this continent.
But I really think differently when I type. I have no fear of a typist; I can make changes, I can edit, add, cut and paste, and integrate with images. In fact I now actually must type to think! That is a change. I cannot even create unless I put it to "paper", in a computer sense.
Frances continues:
When it comes right down to it, typing is manual labour. There is a fast way of doing it - the classic "home row" method. Self-taught typists can achieve good speeds - 50, 60 wpm - but not the ultra-fast speeds of a typist using the classic method. Creative and original approaches are generally sub-optimal. Yet telling students "your method is wrong" goes against the grain for teachers who want to encourage students to discover, explore, and work things out on their own.
Not really. To refer to Heidegger, typing is ready-to-hand: it is the taking of the idea and explaining it in a linear manner. Back in 1990 I wrote a paper on Multimedia Communications where I said:
In the development of a theory for design of computer systems involving the human user, Winograd and Flores invoke the theories of the German philosopher Heidegger. Specifically they refer to four key propositions of the philosopher that impact the overall end-user interface issue in the multimedia environment. These are:
1. Our implicit beliefs and assumptions cannot be made explicit. We all too often may make the statement, "You know what I mean." In so doing we are making two mistakes. First, the other may never know what we mean, just by the nature in which we individually perceive experiences and objects. Second, we may, ourselves, not have the insight into our own true beliefs, because we all too often find ourselves questioning them. Hermeneutics, the study of meaning in documents, has been expanded by Gadamer to investigate human reasoning. Thus, indicates Gadamer, our understandings can change with time and place. This changing makes the explicit articulation specious at best.
2. Practical understanding is more fundamental than detached theoretical understanding. Heidegger has a concept called "thrownness", part of being-in-the-world. We know something only by being thrown into, or involved in, it. We know what a radiologist does with an image and how he manipulates it for understanding by doing the process ourselves. We cannot expect the user to detail their beliefs, and in fact those understandings are time-varying.
3. We do not relate to things primarily through having representations of them. We relate to things themselves; we do not relate to a representation. The representation of the "thing itself" is done in the context of the task to be accomplished. For example, teleconferencing is useful if we are not to relate to the person but to a subject whose essence can be presented directly through the medium, rather than just a representation. We find that teleconferencing is inadequate for personal contact, since the contact is through a representation.
4. Meaning is fundamentally social and cannot be reduced to the meaning-giving activities of individual subjects. Meaning is obtained in dialog, in a conversational fashion, with the ability to reach consensus. Gadamer and Heidegger both relate meaning to the social process of communicating. Both also relate the evolution of meaning to the ongoing set of discourses. Specifically, social or conversational activity is the ultimate foundation of intelligibility. This means that both in the design process as well as in the operations process, it is critical that the communications channel be conversational if the intent is to convey intelligibility. If the intent is only to transfer a predefined package from one point to the other, then conversationality is not essential. In a multimedia environment, intelligibility must be achieved in the context of the various media, and thus intelligibility demands conversationality.
Thus when we type, we become one with what we are typing, whether we have had typing lessons or not. Typing is not manual labor, no more than writing notes, moving the paint brush, or slinging the hammer on marble. It is Heideggerian thrownness wherein we become one with the work.
Labels:
Commentary
Down the Rabbit Hole
This case regarding the release of secret info gets more Alice-like by the hour. Now we have some 29-year-old high school dropout in Hawaii having access to TS/SID/NF data from his local terminal! And we are worried about the Chinese! Why not just have the Intel Agencies post it directly on Facebook?
But wait: if this is too unbelievable, then perhaps it is not to be believed. Perhaps it is not what it appears to be. Why "escape" to Hong Kong? He apparently left his job because of seizures. So we have people with access to such secrets who also have seizures; the ADA at work!
Who is this guy, who does he work for, why did he go to Hong Kong? The list of questions goes on. This is starting to sound like a plot from some poorly done Hollywood movie. The truth, if there be any, is nowhere near the surface.
As the Guardian states:
He has had "a very comfortable life" that included a salary of roughly $200,000, a girlfriend with whom he shared a home in Hawaii, a stable career, and a family he loves. "I'm willing to sacrifice all of that because I can't in good conscience allow the US government to destroy privacy, internet freedom and basic liberties for people around the world with this massive surveillance machine they're secretly building."
Well, he is paid well above a GS-15. Strange that we pay high school dropouts such a salary while we send our PhDs back to China.
The tale here is much more complex than what we are told, I would believe. This is too simple, too neat, too direct. Let us see what happens in the next twenty-four hours.
The Right to be Left Alone
Some one hundred and twenty-three years ago Warren and Brandeis published a paper entitled The Right to Privacy. It opens as follows:
That the individual shall have full protection in person and in property is a principle as old as the common law; but it has been found necessary from time to time to define anew the exact nature and extent of such protection. Political, social, and economic changes entail the recognition of new rights, and the common law, in its eternal youth, grows to meet the demands of society. Thus, in very early times, the law gave a remedy only for physical interference with life and property, for trespasses vi et armis.
Then the "right to life" served only to protect the subject from battery in its various forms; liberty meant freedom from actual restraint; and the right to property secured to the individual his lands and his cattle. Later, there came a recognition of man's spiritual nature, of his feelings and his intellect. Gradually the scope of these legal rights broadened; and now the right to life has come to mean the right to enjoy life--the right to be let alone, the right to liberty secures the exercise of extensive civil privileges; and the term "property" has grown to comprise every form of possession-- intangible, as well as tangible.
Thus in the beginning mankind felt they had a right not to have physical harm inflicted upon them. In today's world we take that to be the case, except perhaps for the TSA and their deadly X-ray machines, but I lay that aside in this piece. They continue:
Thus, with the recognition of the legal value of sensations, the protection against actual bodily injury was extended to prohibit mere attempts to do such injury; that is, the putting another in fear of such injury. From the action of battery grew that of assault. Much later there came a qualified protection of the individual against offensive noises and odors, against dust and smoke, and excessive vibration. The law of nuisance was developed. So regard for human emotions soon extended the scope of personal immunity beyond the body of the individual.
His reputation, the standing among his fellow men, was considered, and the law of slander and libel arose. Man's family relations became a part of the legal conception of his life, and the alienation of a wife's affections was held remediable. Occasionally the law halted, as in its refusal to recognize the intrusion by seduction upon the honor of the family. But even here the demands of society were met. A mean fiction, the action per quod servitium amisit, was resorted to, and by allowing damages for injury to the parents' feelings, an adequate remedy was ordinarily afforded.
Similar to the expansion of the right to life was the growth of the legal conception of property. From corporeal property arose the incorporeal rights issuing out of it; and then there opened the wide realm of intangible property, in the products and processes of the mind, as works of literature and art, good-will, trade secrets, and trademarks.
Thus we see a natural progression protecting the person, a natural progression of civil law which Brandeis believed was building on the Constitution, not conflicting with it. Now for one of the most powerful parts of the paper:
Recent inventions and business methods call attention to the next step which must be taken for the protection of the person, and for securing to the individual what Judge Cooley calls the right "to be let alone."
Instantaneous photographs and newspaper enterprise have invaded the sacred precincts of private and domestic life; and numerous mechanical devices threaten to make good the prediction that "what is whispered in the closet shall be proclaimed from the house-tops." For years there has been a feeling that the law must afford some remedy for the unauthorized circulation of portraits of private persons; and the evil of the invasion of privacy by the newspapers, long keenly felt, has been but recently discussed by an able writer.
The alleged facts of a somewhat notorious case brought before an inferior tribunal in New York a few months ago, directly involved the consideration of the right of circulating portraits; and the question whether our law will recognize and protect the right to privacy in this and in other respects must soon come before our courts for consideration.
The right to be let alone is a fundamental right, but the current administration in the act of "protecting" us has taken away that right. From mandatory health care, to mandated tracking of all communications, to full body searches at any and all spots, to tracking of reading materials, to gathering information from employers and banks. Our right to be let alone has been dismembered beyond all repair. What would Brandeis think?
Labels:
Commentary,
Government
Friday, June 7, 2013
"Curiouser and curiouser" cried Alice.
The Guardian now reports on more high-level intel. One release is interesting, but a flow of such releases by the same person, at the same time the current president meets with the head of China, is, as Alice said, "Curiouser and curiouser".
The Guardian states:
It says the government will "identify potential targets of national importance where OCEO can offer a favorable balance of effectiveness and risk as compared with other instruments of national power".
It appears that these were planted and timed for maximum benefit. These are not the subtle intel practices that I saw in the 60s and 70s; this is hatchet-like. It appears that perhaps internal security has been breached, and badly so. Then one must wonder who the adversary is. This is a game; the Guardian seems to be just an intermediary, facilitating one side against the other.
From whence did these docs originate? If the classification was correct, then they should be traceable, the holders of them limited. Yet if they were obtained via electronic means we have a fundamental problem. Namely, such documents and facts should never leave the world of paper, never appear on a computer. Bring back typewriters, and place them in a safe. When you place them even on a word processor, you invite a compromise.
One should recall the classic release of highly classified data in the Falcon and the Snowman tale, when a tech in the TRW vault in Redondo Beach released highly classified data to the Soviets via a childhood friend who interfaced through the Mexico City Embassy of the USSR. The people were able to copy the documents, take them from TRW, and then deliver them to the Soviets.
Then there was the Hanssen episode, in the park behind the house I lived in, in Virginia. The neighborhood was filled with intel types, but the park apparently became a meeting location for all sorts. All of this a few blocks from Langley.
One need just understand that even with classic controls, things leak, in most cases because of sloppy oversight.
Labels:
Government
Employment June 2013
The current employment data brings little good news. Let us examine it as we always do by looking at the Romer Curve, that now infamous prediction of how well the current administration was to do based upon the wisdom of the economists.
Now according to Romer we would be well into a full recovery. Yet the real world has at best stalled.
The above is her error curve, demonstrating the digging of an ever-deeper hole.
Here we can see some light. There is continuing growth in those participating in the work force. True unemployment is now about 10.25%, down from the 12+% at the peak, but not really where we need it to be.
The above shows the narrowing of the gap between population and employment. In a sense we are making some slow progress.
However, when we look at the percent of the workforce employed, we still see that deadly and systemic drop. That is the problem to be addressed.
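For those who want the arithmetic behind a "true unemployment" number of this sort, the Python sketch below holds participation at a pre-recession benchmark and counts the missing labor force as unemployed. All four inputs are assumed placeholders for illustration, not the actual figures behind the 10.25% above.

    # "True unemployment" sketch: add the workers who left the labor
    # force (relative to a benchmark participation rate) back into
    # both the unemployed count and the labor force.
    civ_pop       = 245.0e6  # civilian noninstitutional population (assumed)
    participation = 0.634    # current participation rate (assumed)
    benchmark     = 0.660    # pre-recession participation rate (assumed)
    unemployed    = 11.8e6   # officially unemployed (assumed)

    labor_force = civ_pop * participation
    missing     = civ_pop * (benchmark - participation)
    true_rate   = (unemployed + missing) / (labor_force + missing)
    print(round(100 * true_rate, 2))  # about 11.2 with these inputs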
Labels:
Economy
CCP and Prostate Cancer
There is always an interest in determining the prognostic value of tumors and hopefully staging treatment. There has been a recent flurry of interest in using "cell cycle progression" (CCP) gene testing, a method of taking gene products from biopsy samples and then using them to ascertain the most likely progression of the tumor.
We summarize here our opinions as stated in a recent White Paper.
CCP is a methodology proposed to do this. We take no position in this opinion paper regarding the efficacy of CCP as applied to PCa, but we examine the original assertions in some detail. Conceptually it makes sense. It is as follows:
1. A handful of genes, if over-expressed, when combined with other metrics, can provide fairly accurate prognostic measures of PCa.
2. Selecting the genes can be accomplished in a variety of ways, ranging from logical and clear pathway-control genes such as PTEN to just a broad-based sampling wherein the results have a statistically powerful predictive result.
3. Measuring the level of expression in some manner, and from the measurements combining those in a reasonable fashion to determine a broad-based metric.
4. Combining the gene expression metric with other variables to ascertain a stronger overall metric.
The CCP work to date has been focused somewhat on these objectives.
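To fix ideas before turning to the published work, here is a hedged Python sketch of the general style of aggregate score that items 3 and 4 above describe: normalize panel-gene expression against housekeeping genes, take logs, and average. It follows the published description only loosely; the exact gene weights and normalization are precisely the transparency questions raised later, and all numbers are hypothetical.

    import math

    # One plausible form of a CCP-style aggregate: mean log2 expression
    # of the panel genes, normalized by the mean of housekeeping genes.
    # This is an assumed form, not Myriad's disclosed computation.
    def ccp_style_score(panel_levels, housekeeper_levels):
        ref = sum(housekeeper_levels) / len(housekeeper_levels)
        logs = [math.log2(g / ref) for g in panel_levels]
        return sum(logs) / len(logs)

    # Hypothetical raw expression values for a 31-gene panel:
    panel = [2.1 * (1 + 0.05 * i) for i in range(31)]
    housekeepers = [2.0, 1.9, 2.2]
    print(round(ccp_style_score(panel, housekeepers), 3))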
Let us now briefly update the work as detailed in the industry press. As indicated in a recent posting [1]:
.....initially measured the levels of expression of a total of 31 genes involved in CCP. They used these data to develop a predefined CCP "score" and then they set out to evaluate the value of the CCP score in predicting risk for progressive disease in the men who had undergone an RP, or risk of prostate cancer-specific mortality in the men who had been diagnosed by a TURP and managed by watchful waiting.
Thus there seems to be a strong belief in the use of CCP, especially when combined with other measures such as PSA.
The CCP test has been commercialized as Prolaris by Myriad. In a Medscape posting they state [2]:
"PSA retained a fair amount of its predictive value, but the predictive value of the Gleason score 'diminished' against the CCP score," he said. "Once you add the CCP score, there is little addition from the Gleason score, although there is some."
"Overall, the CCP score was a highly significant predictor of outcome in all of the studies, ... it was the dominant predictor in all but 1 of the studies in the multivariate analyses, and typically a unit change in the score was associated with a remarkably similar 2- to 3-fold increase in either death from prostate cancer or biochemical recurrence, indicating that this is a very robust predictor, and seems to work in a whole range of circumstances."
Thus there is some belief that CCP, when combined with other metrics, has strong prognostic value.
In this analysis we use CCP as both an end and a means to an end. CCP is one of many possible metrics to ascertain prognostic values. There is a wealth of them. We thus start with the selection of genes. We first consider general issues and then apply them to the CCP approach. This is the area where we have the majority of our problems.
Let us first examine how they obtained the data. We shall follow the text of the 2011 paper and then comment accordingly.
1. Extract RNA.
2. Treat the RNA with enzyme to generate cDNA.
3. Collect the cDNA and confirm the generation of key entities.
4. Amplify the cDNA.
5. Pre-amplify the cDNA prior to measuring in an array.
6. In arrays, record levels of expression.
Clearly there may be many sources of noise or error in this approach, especially in recording the level of fluorescent intensity. The problem, however, is that at each step we have the possibility of measurement bias or error. These become additive and can substantially alter the data results.
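The additivity claim is easy to see in a Monte Carlo sketch. Assume each preparation step (extraction, reverse transcription, cleanup, pre-amplification, readout) multiplies the true level by an independent log-normal error; the log-scale variances then add, and the spread of the final reading grows with every step. The per-step noise figure below is an assumption for illustration, not a measured assay characteristic.

    import numpy as np

    rng = np.random.default_rng(1)
    true_level = 10.0
    step_sd = 0.10   # assumed log-scale noise per preparation step
    k_steps = 5      # extraction, RT, cleanup, pre-amp, readout

    # Each reading is the true level times exp(sum of k step errors),
    # so the log-scale variance grows linearly with the step count.
    noise = rng.normal(0.0, step_sd, size=(100_000, k_steps)).sum(axis=1)
    readings = true_level * np.exp(noise)
    print(round(readings.std() / readings.mean(), 3))  # CV of roughly 0.23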
In this section we consider the calculations needed to develop a reliable classifier. This is a long-standing and classic problem. Simply stated:
"Assume you have N gene expression levels, G(i), and you desire to find some function g(G(1),…,G(N)) such that this function g divides the space created by the Gs into two regions, one with no disease progression and one with disease progression."
Alternatively we could ask for a function f(G(1),…,G(N)) such that the probability of disease progression, or an end point of death in a defined period, is f or some function derived therefrom.
Let us begin with general classifiers. First let us review the process of collecting data. The general steps are above. We start with a specimen and we end up with N measurements of gene expression. In the CCP case we have some 31 genes we are examining and ascertaining their relative excess expression. Now, as we posed the problem above, we are seeking either a function g which would bifurcate the space of N genes, or a function f from which we could ascertain survival based upon the N gene expression measurements.
Now from classic classifier analysis we can develop the two metrics: a simple bifurcating classifier and a probability estimator. The simple classifier generates a separation point, a line or plane, for which being below is benign and being above is problematic. This is akin to the simple PSA test of being above or below 4.0. However, we all know that this has its problems. Thus there may be some validity in the approach for prognostic purposes. Clearly a high value indicates a significant chance for mortality, one assumes directly related to this disease.
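A minimal Python sketch of the two metrics just described: a hard bifurcating rule in the style of the PSA 4.0 cutoff, and a smooth probability estimator over the same scalar score. The logistic form and its coefficients are assumptions for illustration; nothing here is the CCP procedure itself.

    import math

    # Hard classifier: above the cutoff is flagged, below is benign.
    def bifurcate(score, cutoff=2.0):
        return score > cutoff

    # Soft estimator: map the same score to P[progression] via a
    # logistic curve with assumed coefficients a and b.
    def risk_probability(score, a=-3.0, b=1.5):
        return 1.0 / (1.0 + math.exp(-(a + b * score)))

    for s in (0.5, 1.5, 2.5):
        print(s, bifurcate(s), round(risk_probability(s), 2))
    # 0.5 False 0.1 / 1.5 False 0.32 / 2.5 True 0.68

The hard rule throws away exactly the graded information the soft estimator retains, which is why the threshold form inherits the familiar problems of the PSA cutoff.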
Let us now examine the CCP index calculation in some detail. We use the 2011 paper as the source. The subsequent papers refer back to this, and thus we rely upon what little is presented there. The approach we take herein is to use what the original paper stated and then, line by line, establish a mathematical model; where there are concerns or ambiguities, we point them out for subsequent resolution. In our opinion the presentation of the quantitative model is seriously flawed in terms of its explanation, and we shall show the basis of our opinion below.
We have provided a detailed examination in our recent White Paper. In our opinion there is a lack of transparency and reproducibility in the 2011 paper, and thus one cannot utilize what is presented.
This area of investigation is of interest, but in my opinion it raises more questions than it answers. First is the issue of the calculation itself and its reproducibility. Second is the issue of the substantial noise inherent in the capture of the data.
1. Pathway Implications: Is this just another list of genes?
The first concern is the fact that we know a great deal about ligands, receptors, pathway elements, and transcription factors. Why, one wonders, do we seem to totally neglect that source of information?
2. Noise Factors: The number of genes and the uncertainties in measurements raise serious concerns as to the stability of outcomes.
Noise can be a severe detractor from the usefulness of the measurement. There are many sources of such noise, especially in measuring the fluorescent intensity. One wonders how they factor into the analysis. Many other sources are also present, from the PCR process and copy numbers to the very sampling and tissue-integrity factors.
3. Severity of Prognosis and Basis: For a measurement which is predicting patient death, one would expect total transparency.
The CCP discriminant argues for the most severe prognostication. Namely, it dictates death based upon specific discriminant values. However, as we have just noted, measurement noise can and most likely will introduce significant uncertainty in the "true" value of the metric.
4. Flaws in the Calculation Process: Independent of the lack of apparent transparency, there appear in my opinion to be multiple points of confusion in the exposition of the methodology.
In our opinion, there are multiple deficiencies in the presentation of the desired calculation of the proposed metric which make it impossible to reproduce. We detail them in our White Paper.
5. Discriminants, Classifiers, Probability Estimators: What are they really trying to do?
The classic question, when one has N independent genes and can measure relative expression, is how one takes that data and determines a discriminant function. All too often the intent is to determine a linear one-dimensional discriminant. At the other extreme is a multidimensional non-linear discriminant. This has always been the critical issue in classifier design since the early 1950s. In the case considered herein there is little if any description or justification of the method employed. One could assume that the authors are trying to obtain an estimate of the following:
P[Death in M months] = g(G(1), …, G(N))
where G(k) is the level of expression of one of the 31 genes. One would immediately ask: why and how? In fact we would be asked to estimate a Bayesian measure:
P[Death in M months | G(1), …, G(N)]
which states that we want the conditional probability. We know how to do this for systems, but this appears at best to be some observational measure. This in my opinion is one of the weak points.
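For concreteness, one standard way to fit such a conditional probability observationally is logistic regression over the gene panel. The Python sketch below does this on synthetic data; it is emphatically not the CCP fitting procedure, whose absence of description is the complaint here, and every quantity in it is assumed.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n_patients, n_genes = 400, 31
    G = rng.normal(size=(n_patients, n_genes))         # synthetic expression levels
    beta = rng.normal(scale=0.3, size=n_genes)         # hidden per-gene effects (assumed)
    p_death = 1.0 / (1.0 + np.exp(-(G @ beta - 1.0)))  # synthetic true risk
    died = rng.random(n_patients) < p_death            # synthetic outcomes

    # Fit P[Death in M months | G(1),...,G(N)] as a logistic model.
    model = LogisticRegression(max_iter=1000).fit(G, died)
    print(model.predict_proba(G[:3])[:, 1])            # estimated risks for 3 patients

Note that even this textbook estimator inherits every upstream problem already discussed: noisy inputs, correlated genes, and a single time-and-place sample.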
6. Causal Genes, where are they?
One of the major concerns is that one gene's expression may be caused by another gene's. In this case of 31 genes there may be some causality among them, and thus this may often skew results.
7. Which Cell?
One of the classic problems is measuring the right cell. Do we want the stem cells? If so, how are they found? Do we want metastatic cells? Then from where do we get them? Do we want just local biopsy cells? If so, perhaps they under-express the facts.
8. Why this when we have so many others?
We have PSA, albeit with issues, we have SNPs, we have ligands, receptors, pathway elements, transcription factors, miRNAs and the list goes on. What is truly causal?
Basically this approach has possible merit. The problem, in my opinion, is the lack of transparency in the description of the test metric. Also, the inherently noisy data is a concern in my opinion. Moreover, one wonders why so much press.
1. Cooperberg, M., et al., Validation of a Cell-Cycle Progression Gene Panel to Improve Risk Stratification in a Contemporary Prostatectomy Cohort. https://s3.amazonaws.com/myriad-library/Prolaris/UCSF+ASCO+GU.pdf
2. Cooperberg, M., et al., Validation of a Cell-Cycle Progression Gene Panel to Improve Risk Stratification in a Contemporary Prostatectomy Cohort, Journal of Clinical Oncology, 2012.
3. Cuzick, J., et al., Prognostic value of a cell cycle progression signature for prostate cancer death in a conservatively managed needle biopsy cohort, British Journal of Cancer (2012) 106, 1095–1099.
4. Cuzick, J., et al., Prognostic value of an RNA expression signature derived from cell cycle proliferation genes for recurrence and death from prostate cancer: A retrospective study in two cohorts, Lancet Oncol. 2011 March; 12(3): 245–255.
5. Duda, R., et al., Pattern Classification, Wiley (New York) 2001.
6. McGarty, T., Prostate Cancer Genomics, Draft 2, 2013, http://www.telmarc.com/Documents/Books/Prostate%20Cancer%20Systems%20Approach%2003.pdf
7. Theodoridis, S., K. Koutroumbas, Pattern Recognition, AP (New York) 2009.
Labels:
Cancer