These are the most recent St Louis FED Recession Comps. They tell a powerful tale that we are still mired in a long-term mess. Recall that employment looks good only because we lowered the denominator, namely the workforce. That means we have permanently removed 6 million people from doing anything productive. Why? Lots of answers, but, like terrorism, if we fail to identify it and call it what it is, then it will never go away. But the current Administration eschews such acts.
First, Production is not bad. It follows the average growth from a Recession, and looking at this, things may seem fine.
Now the above is Income growth. It is horrible; there is NONE. We have given a new bottom to Income.
Employment is rock bottom as well, and this only looks slightly better because of the elimination of people from the work force.
Retail Sales is on par, most likely driven by Government support via Income Transfers.
The GDP growth is also rock bottom. We may have some growth, but relative to all other Recessions we have been below them for the past 10 Quarters!
Now for the GDP elements we show them below in toto.
Personal Consumption is low. Government Consumption is also low. They are the two laggards. The others seem fine. Overall we still appear to have a concerned private sector and Washington seems to neglect it.
Saturday, January 31, 2015
Friday, January 30, 2015
Whose Money?
There are times that I truly wonder what has happened to some reporters. Medscape reports on the Administration's proposed genetic tracking program, yes that is what it really is, and they state:
Putting his money where his mouth is, President Obama will seek $215 million to finance the Precision Medicine Initiative he first mentioned in his State of the Union speech on January 20.
Putting whose money! It is NOT his $215 million, it is ours, you nitwit! This is the problem many folks have. They fail to understand the very fundamentals of our economy. They continue:
The NIH has been in contact with 200 studies that have at least 10,000 enrollees each, which it hopes to integrate into the overall cohort, Dr Collins said. Volunteers will be needed to round it out.
The data will be accessible to qualified researchers and likely also will be used by pharmaceutical, device, and diagnostic companies, he said. An early outcome "will be to take this field of pharmacogenomics — the right drug at the right dose for the right person — and really put it to the test," Dr Collins said. The FDA has approved more than 100 drugs with labeling urging DNA testing before use. "And yet it's not being done because the logistics are all wrong." But with a database that offers results on a million people, "it's a click of the mouse for the doctor to figure out whether it's a different drug or a different dose," he said.
The proposal was in NEJM. It stated:
The concept of precision medicine — prevention and treatment strategies that take individual variability into account — is not new1; blood typing, for instance, has been used to guide blood transfusions for more than a century. But the prospect of applying this concept broadly has been dramatically improved by the recent development of large-scale biologic databases (such as the human genome sequence), powerful methods for characterizing patients (such as proteomics, metabolomics, genomics, diverse cellular assays, and even mobile health technology), and computational tools for analyzing large sets of data. What is needed now is a broad research program to encourage creative approaches to precision medicine, test them rigorously, and ultimately use them to build the evidence base needed to guide clinical practice. The proposed initiative has two main components: a near-term focus on cancers and a longer-term aim to generate knowledge applicable to the whole range of health and disease. Both components are now within our reach because of advances in basic research, including molecular biology, genomics, and bioinformatics. Furthermore, the initiative taps into converging trends of increased connectivity, through social media and mobile devices, and Americans' growing desire to be active partners in medical research.
It simply is taking all of our personal genetic data and handing it over to the FEDs so they can figure out what is the "best" way they can deal with us.
One should beware. This is a massive intrusion into our lives, and the results could be catastrophic. Precision medicine is not personal medicine. It is the development of least-cost delivery, and the tails be damned. Namely, they will deal with the +/- one sigma and the rest may just go by the wayside. Where is Nancy Pelosi when we really need her? She could have made this really clear!
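To make the +/- one sigma point concrete: under a normal distribution only about 68% of the population falls within one standard deviation of the mean, leaving roughly a third in the tails. A minimal sketch (the normal model here is my own illustrative assumption):

```python
from statistics import NormalDist

# Fraction of a normally distributed population within +/- one sigma.
nd = NormalDist(mu=0.0, sigma=1.0)
covered = nd.cdf(1.0) - nd.cdf(-1.0)

print(f"within one sigma: {covered:.1%}")       # about 68.3%
print(f"left in the tails: {1 - covered:.1%}")  # about 31.7%
```

So a system optimized for the one-sigma middle leaves roughly one person in three outside its design point.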
Labels:
Health Care
Just What is New Here?
In a NY Times article today there is a big announcement of a program to use genetic analysis on disease. They state:
White House officials said the “precision medicine initiative,” also known as personalized or individualized medicine, would begin with a down payment of $215 million in the president’s budget request for the fiscal year that starts on Oct. 1.
Well, have we not been doing this for a few decades? We have BRCA for breast cancer, BRAF for melanoma, we have translocations for CML, and we have tons of putative diagnostic and prognostic tests for almost every cancer. So just what is new?
Then we have, in the same piece:
“Cancer,” he said, “is a disease of faulty genes. The goal of personalized medicine is to understand the unique characteristics of individual patients so therapies can be tailored to genetic mutations that underlie their disease.”
Well, kind of, if you assume that epigenetic factors such as miRNAs and methylation are genes in some broad sense.
The proposal is:
Federal officials described the project as a research consortium that would collect information from large numbers of people. The data could include medical records, laboratory test results, profiles of patients' genes, and information about their diet, tobacco use, lifestyle and environment. The president's budget request, to be unveiled on Monday, includes $130 million for the consortium, White House officials said. In addition, they said, ..... will request $70 million for the National Cancer Institute, the largest of the National Institutes of Health units, to investigate genes that may contribute to the risk of developing certain types of cancer, and then to use that knowledge in developing more effective treatments.
So we will start to collect all of this info. I am a bit confused. Did we not hand out a few billion to docs to get EHR systems, and under meaningful use were they not supposed to be doing this now? So what are the added millions for again?
Labels:
Health Care
Tuesday, January 27, 2015
Where's The Franchise?
Google announced a massive expansion of fiber to the home. As the Washington Post reports:
After months of speculation, Google confirmed Tuesday that its ultra-fast Internet service will soon be coming to four more cities — Atlanta; Charlotte, N.C.; Nashville, Tenn.; and Raleigh-Durham, N.C. Those regions, along with more than a dozen cities in their immediate vicinity, will be the latest to benefit from high-speed Internet provided by the search giant. Google Fiber already sells Internet service with download speeds of up to 1 gigabit per second — roughly 100 times faster than the national average — for $70 a month in other cities such as Provo, Utah. Google had been considering expanding to as many as nine metropolitan areas. In a blog post Tuesday, Google said it was still in talks with five of those cities — Phoenix, Portland, Salt Lake City, San Antonio and San Jose — and would decide whether to expand into those regions later this year. Construction in the four cities the company named Tuesday will begin in a few months, according to Google.
The questions are:
1. Does Google have to get Franchises and pole attachment agreements in all of these places, or is there some "deal" that goes around that, and if so, why? This is a total of 12 major cities. In my almost 40 years of experience, getting a Franchise, especially with an incumbent, is a long and costly process. Avoiding one would be a miracle, and very few miracles really happen, if any.
2. At the same time Google is buying into wireless. As we have argued, wireless is much less expensive, requires no franchise, is already enabled by customers, and has near equal capacity. So why waste billions on fiber? Do the shareholders care?
3. What is the Google strategy and what are its goals? It appears that they can afford to play many games. But to what end? They can control the distribution channel but then what?
Monday, January 26, 2015
The Insanity of Quality
HHS is mandating payment for quality for Medicare reimbursement. Specifically they state:
In a meeting with nearly two dozen leaders representing consumers, insurers, providers, and business leaders, Health and Human Services Secretary Sylvia M. Burwell today announced measurable goals and a timeline to move the Medicare program, and the health care system at large, toward paying providers based on the quality, rather than the quantity of care they give patients.
I would remind folks that in Zen and the Art of Motorcycle Maintenance it was the attempt to define quality that drove the main character insane. Quality is complex and near impossible to define, let alone measure.
They continue:
HHS has set a goal of tying 30 percent of traditional, or fee-for-service, Medicare payments to quality or value through alternative payment models, such as Accountable Care Organizations (ACOs) or bundled payment arrangements by the end of 2016, and tying 50 percent of payments to these models by the end of 2018. HHS also set a goal of tying 85 percent of all traditional Medicare payments to quality or value by 2016 and 90 percent by 2018 through programs such as the Hospital Value Based Purchasing and the Hospital Readmissions Reduction Programs. This is the first time in the history of the Medicare program that HHS has set explicit goals for alternative payment models and value-based payments.
Note that the goal is 85% tied to quality. We have written extensively on this topic and its complexity. The ability to deliver anything like a quality measure is not only problematic but, we believe, impossible. It is merely a Government plan to cut costs and ration service.
Labels:
Health Care
Wednesday, January 21, 2015
Why Are Economists So, Well Just Ignorant...?
I read a piece by a Stanford Economist who is alleging that one should not allow Common Carriage over the Internet Transport companies, such as CATV companies. He alleges:
Under net neutrality, Owen said, Internet service providers are unlikely to offer costly service improvements to anyone if they cannot recover the costs. "At least on the surface, it seems that net neutrality would condemn all users to the same not-terrific and slow-to-improve service," he said. By the end of the 20th century, Owen said, a broad consensus developed among economists that price regulation of industries was unlikely to improve consumer welfare. "Maintaining efficient prices and providing incentives for progressive management of regulated firms rarely works," he wrote.
Now, admittedly he is at Stanford, and down the street a ways are all those app companies, etc., but he is an economist after all, so we cannot expect much regarding technical reality.
You see, the Internet was designed and is looked at as an hourglass: thin in the middle, limited capabilities, just allowing, say, TCP/IP. The smarts were at the edge, the end, with the users. That has worked for a real long while. The problem is that we have allowed the CATV folks to get between the TCP/IP path and the end users. That is against the prime directive, and that is what the argument concerning Internet Neutrality is all about. Imagine a world with only MSNBC, and you have Comcast's view of life.
By making the carrier a Common Carrier they do what they are supposed to. Common Carriage does not mean rate regulation, never did, only in the minds of those who fail to understand it. It means openness and level playing fields, etc, those catch phrases so common in DC.
The Internet is meant to be minimalist and not really pay attention to what is being sent across it. Each packet is equal. Each packet gets charged the same. Competition helps, but frankly until the wireless companies get their acts together we are left with 1970 technology from CATV companies. Remember that we change out our mobile devices at least every other year while the average age of a cable modem is 10+ years. They have NO motivation to innovate. So why should they be motivated to do anything other than take actions to further disable their customers. The only answer is Common Carriage, as Elizabeth I set up in 1603!
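The "each packet is equal" principle amounts to nothing more than first-in, first-out forwarding with no inspection of who sent the packet. A toy sketch (the class and the packet labels are hypothetical, purely illustrative):

```python
from collections import deque

class NeutralRouter:
    """Forwards packets strictly in arrival order: no inspection, no fast lanes."""

    def __init__(self):
        self._queue = deque()

    def enqueue(self, packet):
        # A neutral carrier does not look at the sender or the payload.
        self._queue.append(packet)

    def forward(self):
        return self._queue.popleft() if self._queue else None

router = NeutralRouter()
for sender in ["video stream", "news site", "small blog"]:
    router.enqueue(sender)

# Packets leave in exactly the order they arrived, regardless of source.
print([router.forward() for _ in range(3)])
```

Paid prioritization would replace that single queue with priority lanes keyed to who is paying, which is precisely what the neutrality argument objects to.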
But also from an economics perspective, why should we tolerate bundling? CATV companies are notorious for that. We all understand that we should pay for what we get from them: transport to a meet point. Tell me what it is and don't get in the way! Simple economics, simple antitrust. But not simple for some folks.
Labels:
Internet Neutrality
Tuesday, January 20, 2015
Trust Fund Kids vs Community College
Fortunately I managed to go through college by means of scholarships, jobs, and no loans. Also I ate a lot of rutabagas and drank a lot of powdered milk. No car, no phone, one pair of shoes, never thought I needed two, and books were still cheap.
Nowadays college is a fortune, and kids cannot get scholarships as a result of academic performance; they must conform to some politically correct formula to fit the scholarship route. So no matter how well you did, welcome to paying full freight. And most likely bearing that load for years.
Now at the other extreme is Community College. I actually went there for a year recently. I was denied entry initially because I was over 65, until I informed the EVP of Equal Opportunity that the law applied to us old folks as well. Amazing what sending them a copy of the law will do. Then of course they demanded all my school records, High School, College, Grad School, Post Grad, Professional School, and then I was admitted. If I had a GED it would have been easier, but I guess they just did not want some old educated guy there. Then the instructor was asked whether I was qualified, and his response was that I was one of those over achievers. I guess it was not an actionable response.
The lesson learned from Community College was that 90% of the students failed to complete the course. Yet they all paid the tuition, most via Pell Grants; namely, we taxpayers footed the bill. Why did they drop out? Jobs, poor preparation, no support infrastructure, etc. Furthermore, at Community College the instructors are marginal at best. Better than those I had at Manhattan College, perhaps; there I was asked to teach Freshman Calculus and Sophomore Systems Theory because, in the first case, the instructor was terrified of the students and, in the second, the instructor admitted he had no idea what the subject was, being a structural engineer. But at Community College the Instructors may have a modicum of understanding, yet their approach is akin to a low tier High School. For example, Biology gives 100-question multiple-choice exams. Pure memorization, no understanding.
Now to the increase in the inheritance tax. If I were to generation-skip and leave the money to my grandchildren, then the tax increases. Under the proposal, if they work hard and get great grades but because of who they are cannot get a scholarship, and I would like to help them, then the Government will take 60% of my funds and give it to the 90% in Community College who never graduate! Smart, it is not!
I continue to wonder what is in the minds of those who come up with this scheme. Destroy those who perform well and create mediocrity. Well it looks that way to me.
Labels:
Politics
Saturday, January 17, 2015
Economists: What Value Are They?
Over the years we have been quite critical of economists. They hold themselves out as practitioners of some scientific discipline, yet their recommendations are often in conflict with one another and their ability to forecast is dismal. As one economist has recently noted in defense of his practice of the art:
Since the global financial crisis and recession of 2007-2009, criticism of the economics profession has intensified. The failure of all but a few professional economists to forecast the episode – the aftereffects of which still linger – has led many to question whether the economics profession contributes anything significant to society. If they were unable to foresee something so important to people's wellbeing, what good are they? Indeed, economists failed to forecast most of the major crises in the last century, including the severe 1920-21 slump, the 1980-82 back-to-back recessions, and the worst of them all, the Great Depression after the 1929 stock-market crash. In searching news archives for the year before the start of these recessions, I found virtually no warning from economists of a severe crisis ahead. Instead, newspapers emphasized the views of business executives or politicians, who tended to be very optimistic.
It was not a failure of a few but a failure of a community of them. Does the art of economics lend anything useful to society? His defense rests on the following quote:
But this criticism is unfair. We do not blame physicians for failing to predict all of our illnesses. Our maladies are largely random, and even if our doctors cannot tell us which ones we will have in the next year, or eliminate all of our suffering when we have them, we are happy for the help that they can provide. Likewise, most economists devote their efforts to issues far removed from establishing a consensus outlook for the stock market or the unemployment rate. And we should be grateful that they do.
This statement is unfair to physicians. Physicians do recognize the problems their patients will face and often cannot do anything. Just look at obesity. It leads to a plethora of disorders, but try and get someone to diet. Just like Congress. In reality, Economics should be compared to Civil Engineering. Now observe:
1. Civil Engineers have building codes based upon facts.
2. Civil Engineers have a science they all agree to. Try and get two Economists to agree on anything. They are the nastiest bunch I have ever seen. And each one has at least two opinions on everything and there is no concurrence.
3. Civil Engineers get sued if the bridge they designed falls. Ever hear of an Economist getting sued for anything? Physicians get sued, even lawyers get sued. But Economists? No jury could ever understand them anyhow.
4. Civil Engineers design and build bridges. The bridges work, they do what they were supposed to, unless of course politicians get in the middle. Economists cannot predict anything with the same sense of accuracy. Economists have lots of equations and theories. Civil Engineers have a few thousands of years of experience.
Imagine what would have happened to an Economist in Imperial Rome!
So please, until economists can agree on their "laws" and take responsibility for their failures they are at best witch doctors who somehow make a lot of money.
Labels:
Economics
Friday, January 16, 2015
Cancer Stem Cells Again
There is an interesting update on cancer stem cells in Science. They write:
THE CANCER STEM CELL model emerged in the mid-1990s, when stem cell biologist John Dick of the University of Toronto reported that his team had isolated rare cells in the blood of people with leukemia that seemed to play a key role in the cancer. Although such patients' blood teems with aberrant white blood cells, only a few of them were capable of growing into a new leukemia when injected into mice. Those cells appeared to be misguided versions of the normal adult blood stem cells that differentiate into mature blood cells. Like normal stem cells, the cancer stem cells carried distinctive surface proteins and were self-renewing: They could divide to produce both a regular cancer cell and a new stem cell.
Now many researchers have examined the stem cell model, and there are reasons for its validity. We have argued this for Prostate Cancer, and one suspects it applies to hematologic cancers such as MDS. The article focuses on Weinberg at MIT and his new company, where the authors state:
Verastem's strategy is to screen approved drugs and other chemicals for their ability to block focal adhesion kinase (FAK), an enzyme that helps tumor cells stick to each other and also helps cancer stem cells survive. In the body, Weinberg believes, blocking FAK kills cancer stem cells directly and also makes it harder for these rare cells within a primary tumor to travel through the bloodstream and seed metastases.
It should be interesting to see how this develops. Perhaps our understanding of the stem cell is not mature enough. It has also been argued that the stem cell uses exosomes to cause growth in other cells. There is still a great deal to understand.
Labels:
Cancer
Climate and Hot Summers
NASA has announced that last year was the hottest on record. Well, I would beg to differ; at least my plants tell me so. NASA states:
The year 2014 ranks as Earth’s warmest since 1880, according to two separate analyses by NASA and National Oceanic and Atmospheric Administration (NOAA) scientists. The 10 warmest years in the instrumental record, with the exception of 1998, have now occurred since 2000. This trend continues a long-term warming of the planet, according to an analysis of surface temperature measurements by scientists at NASA’s Goddard Institute of Space Studies (GISS) in New York. In an independent analysis of the raw data, also released Friday, NOAA scientists also found 2014 to be the warmest on record.
I ran their map program and obtained the following:
Now if one looks closely, one sees that my area was not really that hot. It was Europe, eastern Russia, Canada, and parts of Brazil that were hot.
Let us examine my backyard. Yes, my very nice backyard, and my sentinel plants, the genus Hemerocallis.
First we look at the date of first bloom on the three early species:
Now what we have plotted is the date of first bloom, where the date is the number of days from January 1 until the bloom date. Thus the larger the number, the colder the season, and a downward-sloping line means warming. H. minor going back to 1990 shows warming, but last year was the 4th coldest in 25 years. Not the warmest. The same can be said of the other two species, but H. dumortierii shows little evidence of warming.
Why plants? Well plants integrate climate over a year. They reflect what is happening in an integrated manner. They are better than surface measurements.
Now I looked at the date of mid cross. This is the date when I had reached 50% of my total crosses. It is an integral of an integral if you will. It looks across all hybrids and tells when the peak blooming occurs. The results are below:
Again, 2014 was not bad. The 50% point was about July 19th. In contrast, 2013 had July 8th, almost 10 days earlier, and 2012 showed a date of July 7th. Thus from my few thousand plants' perspective, 2014 was one of the coldest years on record.
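As a rough sketch of how these two metrics can be computed, here is a minimal Python example; the bloom date and cross counts below are invented for illustration, not my actual garden records.

```python
from datetime import date

# Illustrative sketch of the two season metrics described above; the
# bloom date and cross counts are invented, not actual garden records.
def day_of_year(d: date) -> int:
    """Days from January 1 to d, counting January 1 as day 1."""
    return (d - date(d.year, 1, 1)).days + 1

# First-bloom metric: a larger number means a later bloom, i.e. a colder season.
first_bloom_2014 = day_of_year(date(2014, 6, 2))  # hypothetical first bloom

# Mid-cross metric: the date by which 50% of the season's crosses were made.
def mid_cross_day(cross_counts: dict) -> date:
    total = sum(cross_counts.values())
    running = 0
    for d in sorted(cross_counts):
        running += cross_counts[d]
        if running >= total / 2:
            return d
    raise ValueError("no crosses recorded")

crosses = {date(2014, 7, 5): 40, date(2014, 7, 12): 60,
           date(2014, 7, 19): 80, date(2014, 7, 26): 20}
print(first_bloom_2014)        # 153
print(mid_cross_day(crosses))  # 2014-07-12
```

A later mid-cross date, like a later first bloom, integrates the whole season's warmth into a single number, which is the point of using the plants as sentinels.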
I guess one really should look at the data. Plants are great creatures. They use the sun more than any other creatures, so why not use them to tell us something? The reason is we really have so few Botanists. Perhaps we need a few at NASA.
Labels:
Climate Issues
Sunday, January 11, 2015
Entrepreneur?
The Harvard Crimson reports on a rather strange offer. Namely:
When edX courses Entrepreneurship 101 and 102 opened Friday, enrollees had an extra incentive to complete the courses: Users who pass either class will receive $1,000 in credit to spend on Amazon Web Services.
This is the ultimate in Millennial mindsets. Not only do they want to do something they like, or rather not work, but now they get "paid" to become an entrepreneur! You cannot make this up. Perhaps they should just skip everything and go directly to an IPO and cash out.
Labels:
Academy
FED Balance Sheet
As of this week the FED Balance sheet is seen as below:
The two main items are Treasuries and MBS. These we show below:
Both seem to be flattening out compared to a year ago; the MBS holdings are flat and the Treasuries have flattened as well. One expects some unwinding will occur. This will put pressure on rates, although world conditions may keep them low.
Labels:
Economy
The New Math as Politics
The New Math, by Phillips, is subtitled "A Political History". The New Math movement was an attempt to rejuvenate the teaching of mathematics in secondary schools and ultimately in the primary schools. To the proponents of this movement there was a staleness in the teaching of mathematics, reflected in their belief that it was merely an exercise in memorization and lacked any true understanding of the elements of mathematical thinking.
For example, most elementary school students had memorized multiplication tables, learned fractions and division as a mechanical process, and dealt with "word problems" with abject terror. Mathematics was taught as if it were some mechanical set of processes that one set to memory, and students failed to have an understanding of what they were doing.
On pp 13-15 the author provides some baseline information on the creation of the New Math. The intent was to imbue an understanding of what the student was doing, not just a set of rote manipulations to produce an answer. There was an abject fear that memorization was futile, and the expression of what the processes were became the goal. Students must learn why they could multiply 3 x 2, and then 3 x 2 + 4, and find an answer. They must understand the processes of manipulation, at the risk of never memorizing that 3 x 4 is 12.
On p 27 the author also discusses some of the political movements pressing for improvement. He mentions Rickover and Doolittle as applying their influence to promote improved education, owing to the need seen in WW II to "educate" many enlistees to be able to perform what were technical tasks. For example, to train an enlisted sailor in Fire Control Systems or in Radio or Radar, there was a prerequisite of understanding Geometry, Algebra and Trigonometry. Many schools never taught these skills to students, and thus the need to re-educate. Thus on one hand the need was to have a better baseline education, and on the other hand to attempt to emphasize fundamentals as the core of that education.
What then were the principles? It all depended on whom one spoke with. What happened was that a group of "mathematicians" decided that students needed an understanding of set theory, complex rules of algebra, base-n number systems and the like. This then changed the core of many of the courses.
The attack then also went to the teachers themselves. Teachers were all too often the product of teachers colleges, often state-run institutions, produced to manage the utilization of state-mandated texts and measured by state-mandated exams. The Regents of the State of New York were in many ways a classic example: Geometry was defined by them, and each instructor taught that material.
On p 31 the author refers to Hofstadter and his book on "anti-intellectualism" and the argument that teachers had become "estranged" from academia. In reality the Hofstadter book is a polemic of a Columbia professor against what he perceives as the "anti-intellectuals", namely the Republicans, Catholics, and anyone opposed to his politics. In Hofstadter's book, pp 138-141 contain one of his many rants against Catholics and the Church, ironic because the intellectuals were also strong supporters of Kennedy.
Thus the intellectualism at Columbia and of Hofstadter was at best problematic, and the author's use of Hofstadter as a baseline is also problematic. Likewise, for example, on p 394 of his anti-intellectual treatise Hofstadter calls the Partisan Review the "house organ" of the "intellectuals". It is in William Barrett's writings of his time at the Partisan Review that he noted its strong Communist bent. Thus the author's use of Hofstadter to set up the New Math as the "intellectuals' movement", and then to use this as the basis for arguing that its demise was the result of some right-wing attempt to defeat it, appears a bit of a straw man strategy.
One of the problems I have is that the author fails to clearly identify what he means by New Math and what the New Math was. In a classic 1965 paper criticizing the New Math, Feynman states:
Many of the books go into considerable detail on subjects that are only of interest to pure mathematicians. Furthermore, the attitude toward many subjects is that of a pure mathematician. But we must not plan only to prepare pure mathematicians. In the first place, there are very few pure mathematicians and, in the second place, pure mathematicians have a point of view about the subject which is quite different from that of the users of mathematics. A pure mathematician is very impractical; he is not interested - in fact, he is purposely disinterested - in the meaning of the mathematical symbols and letters and ideas; he is only interested in logical interconnection of the axioms.
This was the problem of the New Math. A radar technician does not need to understand set theory to understand the probability of a false alarm and the signal-to-noise ratio. Specifically, Feynman states:
What is the best method to obtain the solution to a problem? The answer is, any way that works. So, what we want in arithmetic textbooks is not to teach a particular way of doing every problem but, rather, to teach what the original problem is, and to leave a much greater freedom in obtaining the answer - but, of course, no freedom as to what the right answer should be.
Feynman, a product of the New York City school system, and then MIT and Princeton, is correct. His own technique was to intuit the answer and then find the framework to support it. I doubt he ever used a single element of set theory. The conclusion even in 1965 was that the core of the New Math was flawed as a pedagogical approach. It was in fact intellectualism gone astray.
On p 103 the author describes some of the texts which resulted from this effort. Take the Moise and Downs text on Geometry, and compare it to the text by Wells in 1908. Wells was brief and to the point, and one walked away understanding enough geometry to measure angles, understand triangles and the like. The Moise and Downs book makes the development of proofs impossible. The simple example on pp 190-191 (of the 1982 edition) is a classic obfuscation of the obvious: a proof of the existence of a perpendicular line.
Kline also discussed the shortcomings in his superb book "Why Johnny Can't Add". Mathematics is a tool for almost all of its users. It is "learned" by application. No user of an Excel spreadsheet would benefit from the New Math.
The author then proceeds to discuss the political opposition from the right to the New Math, and the Back to Basics movement. On p 145 he opens the Epilogue with the statement "Opponents of the new math won." In reality the weaknesses of the New Math caused its own demise. With the likes of Kline and Feynman against it, what chance would it have? It just did not work.
The last sentence is also worthy of comment:
"Yet math classroom will remain a political venue as long as learning math counts as learning to think. Debates about American math curriculum are debates about the nature of the American subject."
It is not clear to me what he means by the phrase "nature of the American subject". Is the subject the material taught, the individual, some broader idea not explained?
Overall this book has two tales. One is the intended one of the birth and death of the New Math. It has not totally died but is still found floating around a bit. It is also a tale of pedagogy in the state-run schools, and of who decides what students must know and why they must know it. For most of us, mathematics is a tool; it is a way to express facts and explore reality. My day is often driven by mathematical realities, albeit those of an engineer: pedantic, utilitarian, and lacking in questioning principles. I assume solutions exist; I do not really pay attention to uniqueness theorems, and I use mathematics as a tool kit to gain knowledge. Almost all who rely on mathematics do so. The pure mathematician asks fundamental questions, questions about fields, convergence, existence, measurability and the like. These do affect reality from time to time, but rarely; yet when they do, the impact is significant. Just look at the analysis of the Wiener process in dynamic systems, and the Ito integral.
However, for the most part we want students to understand technique, to a point. Out of the mix will come the engineers, the physicists, and yes the mathematicians, the very few mathematicians who have the unique capability for abstract thinking.
Overall the book is a reflection of the political processes surrounding education. This has been all too common, especially since the advent of Dewey and the education movement in which he was so prominent. This book is a useful exercise in grappling with the tendencies to make material relevant on the one hand and a facilitator of understanding society for good citizens on the other. The book has certain weaknesses but it also has certain positive points. It allows one to see how the arguments can be made. One may then ask whether these same arguments and this same process will follow through with Common Core.
The book also shows the break between the academic practitioners and the practitioners who teach the subjects. At the university level we still see a great deal of freedom. At MIT, for example, courses change on almost an annual basis as the technology and science progress, and the need for "standards" is non-existent. At the secondary level this is hardly the case, due to the size and complexity. That perhaps is worthy of a similar study.
But one of the important observations here is the movement of "protected" groups like the "mathematicians", who may very well have been used by political operatives to gain deeper control in schools. As indicated, colleges and universities are somewhat protected. But if the Government extends its control to Community Colleges we can easily see Washington's control moving there as well. That may very well be the unintended lesson of this book. Namely: beware the politician; they ultimately want to control everything.
Saturday, January 10, 2015
Treasury Spreads January 2015
The above are a few yield curves from the past five years. Note that as of yesterday the 30-year yield is dropping to one of its lows. Also note that the short end of the curve is starting to rise, showing higher yields at lower durations. An inverted yield curve again seems doubtful, but one can wonder.
The above is the spread of 30 day to 30 year, the maximum spread. It is approaching a long term low again.
The above is the 90 day to 10 year spread. It is far from a low and this reinforces the above conjecture.
This shows the dynamics of the details of the previous curve. One suspects that with the employment rates we could see the FED making a change, but again we must examine the FED's Balance Sheet. We will do this tomorrow.
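For concreteness, the two spreads discussed above can be computed directly from quoted yields. A minimal sketch, with illustrative numbers rather than actual January 2015 quotes:

```python
# Hypothetical Treasury yields (percent) for one day; illustrative only.
yields = {"30d": 0.02, "90d": 0.03, "2y": 0.55, "10y": 1.95, "30y": 2.50}

# The "maximum spread": 30-day bill versus 30-year bond.
spread_30d_30y = yields["30y"] - yields["30d"]

# The 90-day to 10-year spread; a negative value would mean an inverted curve.
spread_90d_10y = yields["10y"] - yields["90d"]

print(f"30d-30y spread: {spread_30d_30y:.2f} pct points")
print(f"90d-10y spread: {spread_90d_10y:.2f} pct points")
```

A narrowing 30d-30y spread with a still-wide 90d-10y spread is consistent with the long end falling while the belly of the curve holds, which is the pattern the charts above suggest.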
Labels:
Economy
MOOCs and Results
In a recent article in Science there was a discussion about obtaining performance data from MOOCs. Two things struck me:
1. The fact that there seems to have been limited if any thought as to how well the MOOC performed. One suspects that performance measures would have been at the heart of the effort. Namely, is all this money worth it, and worth what? Apparently not.
2. Also, the article in my opinion seems to ramble almost everywhere except in articulating any semblance of content or context. One wonders what the purpose of all those words was.
For example:
Using engagement data rather than waiting for learning data, using data from individual courses rather than waiting for shared data, and using simple plug-in experiments versus more complex design research are all sensible design decisions for a young field. Advancing the field, however, will require that researchers tackle obstacles elided by early studies. These challenges cannot be addressed solely by individual researchers. Improving MOOC research will require collective action from universities, funding agencies, journal editors, conference organizers, and course developers. At many universities that produce MOOCs, there are more faculty eager to teach courses than there are resources to support course production. Universities should prioritize courses that will be designed from the outset to address fundamental questions about teaching and learning in a field. Journal editors and conference organizers should prioritize publication of work conducted jointly across institutions, examining learning outcomes rather than engagement outcomes, and favoring design research and experimental designs over post hoc analyses. Funding agencies should share these priorities, while supporting initiatives—such as new technologies and policies for data sharing—that have potential to transform open science in education and beyond.
OK, now try to parse this one. First, what is engagement data? Second, what is learning data? I have now played around with over a dozen MOOCs. Some are good, most are horrible. My last attempt was a Materials Science course at MIT. The lectures were spent watching the instructor write the lecture notes, which we already had, on a large chalkboard in total silence except for the clicking of the chalk. So why? Second, the tests were really tests in reading comprehension: did you use the right units, and did you copy the value properly? Any errors were errors of transcription and not comprehension. How is that measured?
Frankly it seems that MOOC management has been stumbling around in some directionless manner. If there is no way to determine what the "return" on the investment is, then why invest?
Now the biggest problem I have with MOOCs is the discussion functions. In my experience I saw anonymous discussants who were a few steps from, shall we say, rather anti-social actions. The article states:
The most common MOOC experimental interventions have been domain-independent "plug-in" experiments. In one study, students earned virtual "badges" for active participation in a discussion forum. Students randomly received different badge display conditions, some of which caused more forum activity than others.
My experience, and anyone seeking proof need look no further than any anonymous discussion on the web, is that anonymity facilitates the worst and possibly near real anti-social behavior. Why would anyone want to participate? I tried once to make an observation, and some, in my opinion, socially inept person made comments that had me remove my remark. The remark from another made one shudder! Then there is the near-diabolical "peer review" method of having people who know nothing, from disparate cultures, attack others' work. One wonders who invented that scheme!
Thus one may wonder if the writer of the piece has had any hands-on experience. Apparently, from what is written, the answer is that they have not. Yet the ability to measure effectiveness is critical. When will someone credibly address that issue?
Labels:
MOOCs
Employment January 2015
Let's start with the standard chart. Yes, the DoL report says 5.6%, but if one were to use the baseline of 2006 participation rates it is still above 8%, although we do see participation increasing.
The chart above should be the concern. We have a lower participation rate, or a higher gap; there are now millions permanently unemployed. That is and should be the concern. Frankly, not only is there no change in that number, but it appears to be increasing.
The above demonstrates that concern.
Thus overall things are getting better, but the permanent gap is a real ongoing concern. It appears that at the older end we have lost people permanently, and at the younger end the millennials want meaningful work rather than a job. That is the real problem.
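The participation-rate adjustment behind these charts can be sketched in a few lines. The figures below are round, hypothetical numbers (millions), not actual BLS data, so the printed rates are illustrative only.

```python
# Sketch of the 2006 participation-rate re-baselining; hypothetical
# round figures in millions, not actual BLS data.
working_age_pop = 249.0   # civilian noninstitutional population
labor_force     = 156.0   # current labor force
employed        = 147.3   # employed persons

# Headline rate uses the current (shrunken) labor force as denominator.
headline_rate = (labor_force - employed) / labor_force

# Re-baseline: apply the 2006 participation rate to today's population
# to estimate how large the labor force "should" be.
participation_2006 = 0.662
baseline_force = participation_2006 * working_age_pop

# Count everyone missing from that baseline force as unemployed.
adjusted_rate = (baseline_force - employed) / baseline_force

print(f"headline rate: {headline_rate:.1%}")
print(f"2006-participation baseline rate: {adjusted_rate:.1%}")
```

The point of the exercise is that shrinking the denominator flatters the headline number; holding participation at its 2006 level makes the permanent gap visible.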
Labels:
Economy
Sunday, January 4, 2015
Words Mean Something
From today's NY Times weather map of 2014 they make the following statement:
The average temperature for the year was 54.5 degrees, 0.5 degrees below normal, which makes 2014 tied for the 56th-warmest year since 1869.
Now just think a bit:
1. The temperature was 54.5 and that was 0.5 below normal, call that average.
2. Then 2014 is tied for the 56th warmest year since 1869.
3. If it is below normal, say average, median or mean, then it means that half are above.
4. If it is since 1869 we have 145 years of data and assume that there are other ties, possibly even triplets.
5. Then if all were ties there would be about 72 distinct ranks, and being 56th makes sense, but calling it among the warmest in any measure is nonsense. The words just do not match.
6. How does it rank as coldest? I guess it is politically incorrect to ask such a question in front of the NYT readers.
This is a clear example of the duplicity in the discussion about climate, change or otherwise. Political correctness dominates facts. Pity.
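The back-of-envelope arithmetic in the list above can be checked directly. A minimal sketch, noting that counting 1869 through 2014 inclusively gives 146 seasons rather than 145, and ignoring ties for simplicity:

```python
# Check the ranking arithmetic: years of record from 1869 through 2014.
years = 2014 - 1869 + 1   # 146 counting both endpoints
rank = 56                 # "tied for the 56th-warmest year"

# Ignoring ties, the share of years warmer and cooler than 2014:
warmer_share = (rank - 1) / years
cooler_share = (years - rank) / years

print(years)                  # 146
print(f"{warmer_share:.0%}")  # share of the record that was warmer
print(f"{cooler_share:.0%}")  # share that was cooler
```

With ties in the record the distinct-rank count shrinks toward the roughly 72 noted above, which is exactly why a single "56th-warmest" phrase carries so little information on its own.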
Labels:
Global Warming
Prostate Cancer Markers
The MAC has just announced reimbursement for three prostate cancer genetic tests. As OncLive states:
After months of delay, the Medicare Administrative Contractor (MAC), with jurisdiction over most molecular diagnostic tests used to treat cancer, made a series of decisions this fall that will allow Medicare reimbursement for several well-known tests, including 3 used in the treatment of prostate cancer.
Specifically the tests were:
1. ConfirmMDx by MDxHealth, a methylation test. We considered this in WP 116.
2. Prolaris by Myriad, a complex gene-expression test, which we considered in WP 107.
3. Decipher by GenomeDx, which is akin to Prolaris.
The methylation test has interesting merit; the others seem correlative at best. One of the problems with prognostic tests for more aggressive PCa is just what one does next.
Labels:
Cancer,
Health Care
Saturday, January 3, 2015
A Worthwhile Contribution
Medieval Christianity by Madigan is a very readable and comprehensive book covering Western Europe from about 500 AD until 1400 AD, albeit edging down to 150 and up to 1500 at its extreme. The book is well balanced, well researched and accessible to all readers. The title also states it as “A New History” but just what is “new” and “well known” is not as clear perhaps as the author may have desired. Notwithstanding, what the author has presented is useful for the newly informed as well as the “well informed”.
The author starts with a brief discussion of early Christianity from 150 to 600. This has as its centerpiece Augustine and his writings. One of the most difficult problems with early Christianity is the complexity of Greek thought and the Eastern Church, and the slow evolution of a Western Church. Southern has examined this in detail, and it is the complexity of Eastern thought which in many ways was a point of departure for the West; it was its abandonment by Augustine, via his Roman way of thinking, that opened the Western Church and what we now think of as Medieval Christianity. Augustine introduced many ideas in a manner that reformed Western beliefs. His battle with Pelagius is clearly one, and his emphasis on grace another.
There is an interesting discussion on pp 29-30 on when this stage of early Christianity ended. One way to pinpoint this change is perhaps the time of Gregory I. The reason is that at this time Gregory broke with Byzantium by severing ties with the ruler in Ravenna and taking both religious and political control in Rome.
The author’s discussion of Gregory is very limited in scope, and here I would fault the author for missing an opportunity to use this figure as a major break point for the establishment of the Western Church (see pp 45-62). It can be argued that it was Gregory who de facto created Medieval Christianity.
The Bishop of Rome in 600 was still just that, the Bishop of Rome. The Emperor in Constantinople was the de facto head of the Church, calling various Councils to discuss major religious issues. Gregory had been in the court in Constantinople, had been Prefect of Rome, had come from an old-line Roman family, and desired to be a monk along the lines of Benedict. However, he was drawn to the post of Bishop of Rome by the people of Rome, who required his leadership.
Gregory was also looking westward, seeing Constantinople as an aging confluence of political intrigue. His communications with the Merovingian queen Brunhilda are a classic example of Rome becoming pari passu with leaders and influencing them via religion and charm. On the other hand, the likes of Brunhilda were brutal to the point of savagery, and Gregory in his writing seems to have avoided discussion of these facts. Likewise he dealt with the Lombards, as well as sending the Italian Bishop Augustine to England. This latter act, however, can be viewed as an affront to the Irish, who were still adhering to the Eastern Church's ways and saw Gregory as an equal in debate. In essence Gregory set up the conflict between Ireland and England. But it was Gregory and his looking westward rather than eastward that made for the seminal start of the Medieval Church.
In this section it would have been useful to explore in some detail the lengthy exchanges between Columbanus and Gregory I. There is but a brief mention of Columbanus on p 48. First, the Latin of Columbanus, the Irish monk, was dramatically different from Gregory's. Gregory had evolved to an almost koine-type Latin, while Columbanus seems to have retained an almost Ciceronian Latin. The Irish monks had learned Latin almost independently from Rome, based upon classic texts, and this in a way strongly influenced their style. In addition, Columbanus and all the Irish monks had never been under the Roman yoke, and thus in dealing with Rome they dealt with it almost as independent thinkers.
Chapter 4 introduces Charlemagne. Charlemagne was a follow-on to the Merovingians, albeit the descendant of a Merovingian court official. Charlemagne in 800 was crowned by the Bishop of Rome, now viewed as both a religious figure and a putative political player.
Chapter 5 deals with parochial life. There were local parishes alongside the monasteries. The local priests were typically less well educated than the monks, who spent much of their time reading and writing. In contrast, the local parish priest was dealing with local matters of lesser import. Chapter 6 deals with the Jews, an issue always made complex, especially in the West. Chapter 7 considers the Crusades and Islam. A great deal has been written on the Crusades, and this presentation is brief. Despite the complexity of its expansion and acceptance, Islam was often seen by the Christians as just another heretical sect, especially given its acceptance of polygamy. There did not seem to be any attempt to “understand” its thought throughout this period.
Starting in Chapter 8 the author moves to what he calls the era of High Medieval Christianity. This is from 1050 through 1300. There is a discussion of the reforms to what had become a Church with many small faults, and this included Rome itself. By this time Rome had clearly become a Papacy in terms of its singular position. Chapter 10 discusses some of the heretical movements during this early period of the High Medieval Church.
Chapters 11 and 12 present the Dominicans and the Franciscans respectively. Whereas the Dominicans were always positioned as an intellectual elite (Aquinas was a Dominican), the Franciscans presented a possible threat to Rome, advocating a return to the early Christian belief in poverty. However, Rome managed them quite well, and the net result was a Franciscan order that was on a par with the Dominicans and in a sense often superior. One need look no further than Ockham and his Franciscan followers.
The author then details many of the elements of religious life and affairs. At this time the Church was becoming a dominant part of the lives of the people.
On pp 262-266 the author presents Abelard and Heloise. This is one of the classic tales of this time, and this is one of the best descriptions in context that I have seen. This alone is worth reading.
On pp 277-283 there is a brief discussion of Aquinas. I would have liked to see a more detailed presentation. Aquinas became a figure of the Aristotelian movement, and after his death his works were banned by some, but they came back in the 19th century as the basis of Church belief and doctrine. Some more detailed discussion of his work would have been useful; I felt his presentation was too brief in passing.
As noted, the author discusses Aquinas but fails to discuss Ockham, the Franciscan, aside from a brief note on the next-to-last page (p 434). Ockham was a nominalist, one who denied universals, and thus stood in contradistinction to Aquinas. Ockham also reinvigorated the idea of the individual and as such was a catalyst for many works that emanated from his ideas. Also, Ockham demonstrated confrontational intellectual opposition to the Avignon Popes, resulting in his fleeing eastward and being supported by German princes. Here is an example of quasi-national opposition to the non-Roman Pope, a conflict that was just starting to brew.
Late Medieval Christianity runs from 1350 to 1500, and the author does a good job in detailing the key points. Again there are “heretical” movements such as Hus, Wyclif, and the Lollards. He discusses the changes and covers Prague in some detail. Prague was a cauldron of religious dissent, as the statue of Hus in the square of present-day Prague demonstrates, a statue I passed daily on my way to my office, ironically across from the house of Kafka! More attention to central Europe would have been helpful in explaining this effect.
The Avignon papacy, from 1309 to 1378 (pp 374-378), bridges High and Late Medieval Christianity and represents a clear distortion of the office of the Bishop of Rome and the attempted, and in many ways total, control of the Pope by the French throne. Here we have most likely the first instance of the Pope as an entity separate from the Bishop of Rome. For centuries before this, when the Pope qua leader of the Church was mentioned, the position was synonymous with the Bishop of Rome. In fact the true title should be Bishop of Rome, since that is the position of such a leader. It would have been helpful to have an expanded discussion of this topic. This period of fighting Popes has in my opinion left an indelible scar on the Western Church.
Overall this is a superb book, worth reading and rereading. The author builds upon Southern and his work, as he indicates. However, there are many other views of the issues he presents, and, space being limited, his presentation is fair, well balanced, and exceptionally readable. In contrast one might also read, if available, the works of Henri Daniel-Rops (a pseudonym of Henri Petiot) who, albeit an apologist for the Church, has added insight on many of the issues discussed by the author.
Labels:
Books
Friday, January 2, 2015
Cancer and "Bad Luck"
I commented yesterday on the brief Science piece by Vogelstein and colleagues. In the past twenty-four hours I have seen over two dozen news pieces from every continent except Antarctica pitching the "bad luck" tale. Even China Daily had the story on its front page! This is unfortunately now a typical response: no analysis, just repetition. The "bad luck" phrase was in the abstract, and they could not have chosen a better phrase to get picked up globally.
Now the results are not ground-shaking, and "bad luck" is not a scientific phrase. In reality the authors just observed that certain cancers are most likely driven by specific genetic changes already known or by personal life choices such as smoking. The rest are really unknown.
We are learning more and more about epigenetic effects, such as methylation, that result from inflammation, which may itself be a result of some lifestyle choice such as obesity. That has not been factored in, especially since breast and prostate cancers were not considered.
Thus the term "bad luck" may just be a "bad choice," but it does get Press. But is that what science is meant to do?
Labels:
Cancer
Thursday, January 1, 2015
Cancer By the Numbers
There is always a novel attempt to garner information or patterns from data on cancer. In a recent Science paper Vogelstein and colleagues have done some interesting "back of the envelope" analysis. They did the following:
1. Collected data which provided the number of stem cells of a particular cell type in a typical human lifetime. For example, they somehow determined from the literature the number of melanocyte stem cells in a lifetime. They did the same for many other cell types. For example, there are lots of basal cells per human lifetime, as one would expect. In contrast there were two orders of magnitude fewer melanocytes.
Let us call that NSC(i), where i denotes a specific cell type.
2. Then they plotted the incidence of cell related cancers versus the total lifetime stem cells by type. Thus we see that the incidence of basal cell cancer is high and so are the total number of lifetime stem cells of basal cells.
Let us call INC(i) the incidence of the specific cancer related to the specific cell type.
3. Then they fit a normalized line through the incidence-versus-stem-cell-count data and drew a chart showing how far below or above the average line each cancer fell. This waterfall-type chart then was the discussion point.
That is we have some generalized relationship:
INC(i) = K NSC(i), where K is a common constant obtained from a regression-type analysis. However, the actual INC(i) may be above or below the regression line.
4. The cancers well above the norm were those driven by some putative genetic or environmental factor, such as smoking and lung cancer. The rest, the authors state, are due simply to having lots of stem cell mutations.
That is, if INC actual(i) > INC regression(i), we attribute this excess to some genetic or environmental/lifestyle condition.
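The regression-and-residual procedure in steps 1 through 4 can be sketched in a few lines of Python. The tissue names and numbers below are invented purely for illustration, not taken from the paper, and the fit here is done in log-log space, one plausible way to handle data spanning many orders of magnitude; the authors' actual data and methodology differ in detail.

```python
import math

# Hypothetical data for illustration only: lifetime stem cell count NSC(i)
# and lifetime cancer incidence INC(i) per tissue type. Not from the paper.
data = {
    "basal cell":     (3.0e12, 3.0e-1),
    "melanocyte":     (4.0e10, 2.0e-2),
    "lung (smokers)": (1.0e10, 8.0e-2),
    "colon":          (1.2e12, 5.0e-2),
}

# Fit log INC = log K + b * log NSC by ordinary least squares.
xs = [math.log10(nsc) for nsc, _ in data.values()]
ys = [math.log10(inc) for _, inc in data.values()]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
logK = ybar - b * xbar

def excess(tissue):
    """Residual above the regression line; > 0 flags a candidate
    genetic or environmental/lifestyle driver."""
    nsc, inc = data[tissue]
    predicted = logK + b * math.log10(nsc)
    return math.log10(inc) - predicted

# Sorting by residual reproduces the waterfall-chart ordering.
for tissue in sorted(data, key=excess, reverse=True):
    flag = "above line" if excess(tissue) > 0 else "at/below line"
    print(f"{tissue:15s} residual = {excess(tissue):+.2f}  ({flag})")
```

With these made-up numbers, the smoking-related lung entry lands above the line (excess incidence attributed to an environmental factor) while melanocytes land below it, which is exactly the classification logic described in step 4.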
Interesting concept, but there are some issues:
1. Do they really mean stem cells? Melanocytes do not reproduce as quickly but it is not clear just what a melanocyte stem cell is. We have seen this in prostate cancers.
2. The authors admit epigenetic factors as well and one suspects that they could dominate.
3. The excess cancers such as smoking and lung are clearly environmental effects.
4. Somehow there is no discussion of breast and prostate cancers. One wonders why, since they are so prevalent.
Otherwise this is interesting and worth the read. Their conclusion is:
These results suggest that only a third of the variation in cancer risk among tissues is attributable to environmental factors or inherited predispositions. The majority is due to “bad luck,” that is, random mutations arising during DNA replication in normal, noncancerous stem cells. This is important not only for understanding the disease but also for designing strategies to limit the mortality it causes.
That is worth exploring. But, and this is a classic case, the Press has latched onto the "bad luck" phrase. We really do not know from this study what the underlying issue is. Thus again we have a confluence of words by the authors and the explosion of the press to enhance the piece. Frankly it is interesting but hardly conclusive of anything!
Labels:
Cancer