Sunday, January 31, 2010

H1N1 and Where Did It Go?

The CDC data on H1N1 shows a dramatic decline in incidence. They report:

During week 3 (January 17-23, 2010), influenza activity remained at approximately the same levels this week in the U.S.

  • 164 (4.6%) specimens tested by U.S. World Health Organization (WHO) and National Respiratory and Enteric Virus Surveillance System (NREVSS) collaborating laboratories and reported to CDC/Influenza Division were positive for influenza.
  • All subtyped influenza A viruses reported to CDC were 2009 influenza A (H1N1) viruses.
  • The proportion of deaths attributed to pneumonia and influenza (P&I) was above the epidemic threshold.
  • Five influenza-associated pediatric deaths were reported. Four deaths were associated with 2009 influenza A (H1N1) virus infection and one was associated with an influenza A virus for which the subtype was undetermined.
  • The proportion of outpatient visits for influenza-like illness (ILI) was 1.7% which is below the national baseline of 2.3%. Two of the 10 regions (Regions 4 and 9) reported ILI equal to their region-specific baseline.
  • No states reported widespread influenza activity, five states reported regional influenza activity, Puerto Rico and nine states reported local influenza activity, the District of Columbia, Guam, and 33 states reported sporadic influenza activity, and the U.S. Virgin Islands and three states reported no influenza activity.
It is interesting to consider, and possibly determine, why it has not had as high an impact as was originally anticipated. Was there a basic misunderstanding, did prevention work, or are people better educated? It was clear that at MIT and at Brigham and Women's there was an aggressive approach to hand sanitizing, and it clearly worked even in environments open to ready transmission.

This will be an interesting study some time when it is done.

Friday, January 29, 2010

GDP Grows at Recovering Rate

The BEA has released the Q4 GDP numbers for the economy. They summarize the data as follows:

Real gross domestic product -- the output of goods and services produced by labor and property located in the United States -- increased at an annual rate of 5.7 percent in the fourth quarter of 2009 (that is, from the third quarter to the fourth quarter), according to the "advance" estimate released by the Bureau of Economic Analysis. In the third quarter, real GDP increased 2.2 percent.

This is substantial growth and indicates a recovery.

However, we want to highlight a few facts that will be critical to the analysis.

Fact 1: Real GDP has grown, but the growth, although good, still leaves output on a path below its level at the beginning of the recession. We demonstrate that with the graph below.

Fact 2: M2 is still flat. Despite the flow of funds from the FED, the supply of money is very stable, and this seems to imply a low velocity of money and that people are saving more and spending less. Thus there will be a conservative recovery, unless there is a double dip resulting from a loss of faith in what Washington is going to do. Currently, gross uncertainty is a major factor in low spending.

Fact 3: M2 annualized rates of change are again in the negative range. There is little growth in the money supply, and thus fear of inflation is low, but prospects for a growing GDP are also low. The conundrum is somewhat driven by the assumption that people hold money if they believe that its value will not depreciate. We seem to be seeing this phenomenon in action.

Fact 4: The projected inflation rate based upon the changes in GDP, money velocity, and M2 is shown below. Clearly, at the current rate, inflation is not a concern. Yet we continue to urge caution in terms of the FED's actions; its current excess reserve policy, as the FED explained, is really a balance sheet strengthening, not a flooding of the money supply (see our prior posts on this).
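The projected inflation in Fact 4 follows from the quantity-theory identity MV = PQ: in growth-rate form, inflation is approximately M2 growth plus velocity growth minus real GDP growth. A minimal sketch, using purely illustrative numbers (not actual BEA or Federal Reserve data):

```python
def implied_inflation(m2_growth: float, velocity_growth: float,
                      real_gdp_growth: float) -> float:
    """Inflation implied by the quantity theory MV = PQ, in growth-rate
    form: %dP ~= %dM + %dV - %dQ. All arguments are in percent."""
    return m2_growth + velocity_growth - real_gdp_growth

# Illustrative: flat M2 (0%), slightly falling velocity (-2%), and the
# reported 5.7% real GDP growth together imply negative price pressure.
print(implied_inflation(0.0, -2.0, 5.7))
```

With flat M2 and falling velocity, the identity yields roughly -7.7 percent, consistent with the observation that inflation is not currently a concern.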

In summary, this should be received as good news for the economy. Now, if we can keep inflation in check. One underlying question, however, since we have not yet analyzed the GDP details, is what led to this rise. If it were just government spending, then we are still concerned.

Thursday, January 28, 2010

The iPad: Remember the Memex

Apple has introduced the iPad. I thought it would be interesting to review a small part of an article, "As We May Think", written by Vannevar Bush in the Atlantic Monthly in 1945.

Bush states:

"Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and to coin one at random, ``memex'' will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.

It consists of a desk, and while it can presumably be operated from a distance, it is primarily the piece of furniture at which he works. On the top are slanting translucent screens, on which material can be projected for convenient reading. There is a keyboard, and sets of buttons and levers. Otherwise it looks like an ordinary desk.

In one end is the stored material. The matter of bulk is well taken care of by improved microfilm. Only a small part of the interior of the memex is devoted to storage, the rest to mechanism. Yet if the user inserted 5000 pages of material a day it would take him hundreds of years to fill the repository, so he can be profligate and enter material freely.

Most of the memex contents are purchased on microfilm ready for insertion. Books of all sorts, pictures, current periodicals, newspapers, are thus obtained and dropped into place. Business correspondence takes the same path. And there is provision for direct entry. On the top of the memex is a transparent platen. On this are placed longhand notes, photographs, memoranda, all sort of things. When one is in place, the depression of a lever causes it to be photographed onto the next blank space in a section of the memex film, dry photography being employed."

He continues:

All this is conventional, except for the projection forward of present-day mechanisms and gadgetry. It affords an immediate step, however, to associative indexing, the basic idea of which is a provision whereby any item may be caused at will to select immediately and automatically another. This is the essential feature of the memex. The process of tying two items together is the important thing.

When the user is building a trail, he names it, inserts the name in his code book, and taps it out on his keyboard. Before him are the two items to be joined, projected onto adjacent viewing positions. At the bottom of each there are a number of blank code spaces, and a pointer is set to indicate one of these on each item. The user taps a single key, and the items are permanently joined. In each code space appears the code word. Out of view, but also in the code space, is inserted a set of dots for photocell viewing; and on each item these dots by their positions designate the index number of the other item.

Thereafter, at any time, when one of these items is in view, the other can be instantly recalled merely by tapping a button below the corresponding code space. Moreover, when numerous items have been thus joined together to form a trail, they can be reviewed in turn, rapidly or slowly, by deflecting a lever like that used for turning the pages of a book. It is exactly as though the physical items had been gathered together to form a new book. It is more than this, for any item can be joined into numerous trails.

Does any of this sound familiar? Bush was somewhat of a typical Yankee tinkerer and visionary. His family was from Provincetown, and he obtained a PhD from MIT in 1917. He was FDR's chief science adviser and administrator.

Bush was an innovator who actually got things done. General Groves reported to him during the period of the Manhattan Project. He had to manage the technical teams, each of which had its own way to purify uranium, explode it, deploy it, and the like. Unlike macroeconomists, however, there was always truth behind the science, and the truth was what Bush was good at getting out.

He did slow down the digital computer as conceived by Norbert Wiener but he later allowed it to catch up. He was always concerned with the practical things, like how long a vacuum tube would work.

I remember meeting him, I believe in 1965, as a young grad student; he always had his pipe in hand, an artifact of a now bygone age. But he was an ever-present advocate for the memex. I guess between Google Desktop, Google, and the Kindle/iPad, etc., we are there now. Good job, Professor Bush!

Where Has the Banting and Best World Gone?

Banting and Best, noted along with Macleod as the discoverers of insulin at Toronto, for which the Nobel Prize was awarded, accomplished the task with minimal resources, in a brief period of time, and under less than supportive conditions. Banting was clearly a driven man with the ability to survive the pressures of the external environment. The example of this team is that a small, focused group can achieve wonders.

I read an article today in Nature Genetics on the determination of several genes related to glycemic control in humans with Type 2 Diabetes. Like so many articles of this type, it states:

Levels of circulating glucose are tightly regulated. To identify new loci influencing glycemic traits, we performed meta-analyses of 21 genome-wide association studies informative for fasting glucose, fasting insulin and indices of beta-cell function (HOMA-B) and insulin resistance (HOMA-IR) in up to 46,186 nondiabetic participants.

Follow-up of 25 loci in up to 76,558 additional subjects identified 16 loci associated with fasting glucose and HOMA-B and two loci associated with fasting insulin and HOMA-IR. These include nine loci newly associated with fasting glucose (in or near ADCY5, MADD, ADRA2A, CRY2, FADS1, GLIS3, SLC2A2, PROX1 and C2CD4B) and one influencing fasting insulin and HOMA-IR (near IGF1).

We also demonstrated association of ADCY5, PROX1, GCK, GCKR and DGKB-TMEM195 with type 2 diabetes. Within these loci, likely biological candidate genes influence signal transduction, cell proliferation, development, glucose-sensing and circadian regulation. Our results demonstrate that genetic studies of glycemic traits can identify type 2 diabetes risk loci, as well as loci containing gene variants that are associated with a modest elevation in glucose levels but are not associated with overt diabetes.

Interesting, but there were some 176 authors! The list of authors was longer than the abstract! This is an amazing trend in research papers, and one I find disturbing. I see this even at the graduate level, where there are so many authors one wonders who really did the work. In the past, say 40 years ago, there was a single author. We knew who made the contribution, and we knew who made the mistake. In the biological sciences, the need to publish and the need to extend the reach of involvement, possibly for later plausible deniability, has, it appears, gone to the extreme.

If from experiments of this type a great discovery occurs, then we will not have a Banting and Best, Watson and Crick, and the like. We will just have a large bunch of folks. Pity.

Norbert Wiener, Markets, and Cycles

Norbert Wiener was the person who inspired my first writings and indeed my first book. Not that I am in any way a mathematician, for I am an engineer at heart, nor am I even a tabletop philosopher; but Wiener, being both a great mathematician and a well-versed student of philosophy, had many insights half a century ago which are worth sharing.

In a paper written in the mid-1950s (as quoted by Masani in his book on Wiener), Wiener says:

"Suppose, now, that a sum of money at the time of Christ had been left at 2% compound interest; for example the thirty pieces of silver off Judas. By what factor would it have multiplied up to the present time? We are approaching the year 2000 and in order to express our result in round numbers let us suppose that we are at the year 2000. Then one dollar at the time of Christ would amount, at 2%, to a quantity with over ninety-seven zeros. At any conceivable scale of evaluation one cent at the time of Christ put in a bank at 2% compound interest would amount to something like 10 to the 84 times all the value of the goods in the world at the present time. This is ridiculous, but it still has meaning."

He continues:

"The sums earned by money put out to interest have been wiped out time and time again by wars, famines, plagues, and other catastrophes. These catastrophes have been great enough to wipe out every single commercial undertaking of antiquity of thousands of years, and if they had not taken place, the rate of interest for long term investment could scarcely be two tenths of a percent."

Masani then states Wiener's conclusion:

"It follows that modern capitalism is able to offer attractive returns on private investments in long term undertakings only by its condescension of bankruptcies during down phases of its periodical trade cycles. For the well off the resulting losses are often on paper, but they are painfully real to poorer people thrown out of work. Thus the system is not socially homeostatic."

Wiener had a practical insight that many in today's complex world of macroeconomics should consider. For Wiener was a true mathematician, one of the best of the 20th century, and unlike those economists who attempt mathematics to hide a swath of frailties, Wiener made primal contributions, generalized harmonic analysis and Brownian motion being two which have affected the current world.
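Wiener's back-of-envelope figures can be checked directly. Under a simple annual-compounding assumption, the 2% factor over two millennia is astronomically large while the 0.2% factor is modest, which is precisely his point about catastrophes and long-term rates:

```python
def compound_factor(rate: float, years: int) -> float:
    """Growth factor of a sum left at annual compound interest."""
    return (1.0 + rate) ** years

# 2% for 2000 years: roughly 1.6e17, an absurdly large multiple.
print(f"{compound_factor(0.02, 2000):.3e}")
# Wiener's 0.2% alternative: only about a 54-fold growth.
print(f"{compound_factor(0.002, 2000):.1f}")
```

The tenfold difference in the rate produces a difference of some fifteen orders of magnitude in the outcome, which is why the survival of compounded wealth over millennia is, as Wiener says, "ridiculous."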

Wednesday, January 27, 2010

Global Warming Data Fabrication

The Times of London reports on the global warming data scandal. They state:

The impact of global warming has been exaggerated by some scientists and there is an urgent need for more honest disclosure of the uncertainty of predictions about the rate of climate change, according to the Government’s chief scientific adviser.

John Beddington was speaking to The Times in the wake of an admission by the Intergovernmental Panel on Climate Change (IPCC) that it grossly overstated the rate at which Himalayan glaciers were receding.

Professor Beddington said that climate scientists should be less hostile to sceptics who questioned man-made global warming. He condemned scientists who refused to publish the data underpinning their reports.

There seems to be one violation of the scientific trust after another. As relates to the specific UK group whose emails were hacked, the Times states:

The stolen e-mails , revealed on the eve of the Copenhagen summit, showed how the university’s Climatic Research Unit attempted to thwart requests for scientific data and other information, and suggest that senior figures at the university were involved in decisions to refuse the requests. It is not known who stole the e-mails.

Professor Phil Jones, the unit’s director, stood down while an inquiry took place. The ICO’s decision could make it difficult for him to resume his post.

Details of the breach emerged the day after John Beddington, the Chief Scientific Adviser, warned that there was an urgent need for more honesty about the uncertainty of some predictions. His intervention followed admissions from scientists that the rate of glacial melt in the Himalayas had been grossly exaggerated.

In one e-mail, Professor Jones asked a colleague to delete e-mails relating to the 2007 report by the Intergovernmental Panel on Climate Change.

This still remains a concerning set of facts.

An Excellent Review of the Perils of CCE

Dr. Groopman wrote an excellent piece in the New York Review of Books regarding one of my favorite topics, comparative clinical effectiveness. We have been arguing here for more than a year that CCE, as currently proposed, is one of the most damning elements of the health care proposals issued by the current Congress.

He commences with the quote from Orszag as follows:

In June 2008, testifying before Max Baucus's Senate Finance Committee, Orszag—at the time director of the Congressional Budget Office—expressed his belief that behavioral economics should seriously guide the delivery of health care. In subsequent testimony, he made it clear that he does not trust doctors and health administrators to do what is "best" if they do no more than consider treatment guidelines as the "default setting," the procedure that would generally be followed, but with freedom to opt out. Rather, he said,
To alter providers' behavior, it is probably necessary to combine comparative effectiveness research with aggressive promulgation of standards and changes in financial and other incentives. [Emphasis added.]

This is a chilling statement, in that he seems to be saying that when the Government promulgates a set of clinical guidelines, the Government will take whatever measures are necessary to see that physicians follow those guidelines, now mandates in his view it appears, and if not, there will be consequences. That is indeed a chilling effect.

He also references Sunstein and the nudge concept, which in essence contends that people will do very little to make personal choices, so the Government can select the "right" one and, by certain exogenous pressures, make them believe it is their own choice. Such subtle mind management reduces rejection on the part of the populace. He states:

Thaler and Sunstein build on behavioral economic research that reveals inertia to be a powerful element in how we act. Most people, they argue, will choose the "default option"—i.e., they will follow a particular course of action that is presented to them instead of making an effort to find an alternative or opt out. Further, they write,

These behavioral tendencies toward doing nothing will be reinforced if the default option comes with some implicit or explicit suggestion that it represents the normal or even the recommended course of action.

Thus, between the heavy hand of Orszag and the manipulative fingers of Sunstein, the current Administration wants to get CCE out there as the best way of doing things. As we have said many times before, one should be concerned about some GS-13 outsourcing the next version of Harrison's to the lowest Government contract bidder. The thought is terrifying.

Groopman states:

There is a growing awareness among researchers, including advocates of quality measures, that past efforts to standardize and broadly mandate "best practices" were scientifically misconceived. Dr. Carolyn Clancy of the Agency for Healthcare Research and Quality, the federal body that establishes quality measures, acknowledged that clinical trials yield averages that often do not reflect the "real world" of individual patients, particularly those with multiple medical conditions. Nor do current findings on best practices take into account changes in an illness as it evolves over time. Tight control of blood sugar may help some diabetics, but not others. Such control may be prudent at one stage of the malady and not at a later stage. For years, the standards for treatment of the disease were blind to this clinical reality.

Frankly, not only are these misconceived, as we argued almost a year ago regarding the touted PSA results, trials which were worthy when conceived but which, when completed, failed to adjust to the knowledge obtained in the interim; more fundamentally, medical knowledge changes on a daily basis, and communication among and between physicians is an ongoing process. It is iterative and collegial, and changing the process to one of officially chronicled results will lead to disaster. Why not just use Osler from, say, 1926?

Groopman then makes a compelling case for why health care in this country is in many ways the best, the most costly, and the most complex. He states:

Cost-effectiveness is going to be a hard sell to the American public, not only because of the great value placed on each life in the Judeo-Christian tradition, but because the federal government has devoted many hundreds of billions of dollars to bail out Wall Street. To perform mammograms for all American women in their forties costs some $3 billion a year, a pittance compared to the money put into the bank rescue. The Wall Street debacle also made many Americans suspicious of "quants," the math whizzes who developed computer models that in theory accurately assessed value in complex monetary instruments but in fact nearly brought down the worldwide financial system. When a medical statistician says that imposing a limit on mammography is a "no-brainer," people may recall George Tenet's claim that the case for invading Iraq was a "slam-dunk."

Finally Groopman ends with the following:

The care of patients is complex, and choices about treatments involve difficult tradeoffs. That the uncertainties can be erased by mandates from experts is a misconceived panacea, a "focusing illusion." If a bill passes, Cass Sunstein will be central in drawing up the regulations that carry out its principles. Let's hope his thinking prevails.

On this I disagree. As much as Groopman appears to admire the Sunstein approach, his very article states as its core argument that medical research is an ever-changing source of new information. Each patient treated educates the practitioner about the next. Patients themselves are part of the education process. Thus any system, soft or hard in its motivation, mandated from Washington will in all likelihood be to the detriment of the system, the physician, and the patient.

Tuesday, January 26, 2010

CBO Budget Outlook

The CBO has just issued its most recent outlook for the next ten years. It puts as good a spin on a bad situation as would be possible from Washington.

It states:

In 2010, under an assumption that no legislative changes occur, CBO estimates that federal spending will total $3.5 trillion and revenues will total $2.2 trillion. The resulting deficit of about $1.3 trillion would be just $65 billion less than last year’s shortfall and more than three times the size of the deficit recorded in 2008. Total outlays are projected to increase by just $5 billion, while revenues are projected to rise by $70 billion. The deficit for this year is on track to be about as large as last year’s because an expected decline in federal aid to the financial sector will be offset by increases in other outlays, particularly spending from last year’s stimulus legislation and outlays for income support programs, health care programs, Social Security, and net interest. At the same time, revenues are projected to increase only modestly primarily because of the slow pace of economic recovery forecast by CBO and the lagged effect of the recession on tax receipts.

In 2011, according to CBO’s baseline projections, the deficit falls to $980 billion, or 6.5 percent of GDP, as the economy improves, certain tax provisions expire as scheduled, and spending related to the economic downturn abates. Revenues are projected to rise by about $500 billion, an increase of 23 percent, while outlays are projected to increase by $126 billion, or 4 percent.
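The CBO figures quoted above can be cross-checked with simple arithmetic; the following sketch (all figures in billions of dollars, rounded as announced) confirms they are mutually consistent:

```python
# Cross-check of the quoted CBO figures, in billions of dollars.
spending_2010, revenue_2010 = 3500, 2200
deficit_2010 = spending_2010 - revenue_2010
print(deficit_2010)  # 1300, i.e. "about $1.3 trillion"

# 2011 baseline: revenues up ~$500B (23%), outlays up ~$126B (4%).
deficit_2011 = (spending_2010 + 126) - (revenue_2010 + 500)
print(deficit_2011)  # 926, in the neighborhood of the projected $980B

# A $980B deficit at 6.5% of GDP implies GDP of roughly $15.1 trillion.
implied_gdp = 980 / 0.065
print(round(implied_gdp))  # 15077
```

The small gaps between the derived and announced figures reflect the rounding in the CBO's own summary numbers.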

The outlook is as follows:

Severe economic downturns often sow the seeds of robust recoveries. During a slump in economic activity, consumers defer purchases, especially for housing and durable goods, and businesses postpone capital spending and try to cut inventories. Once demand in the economy picks up, the disparity between the desired and actual stocks of capital assets and consumer durable goods widens quickly, and spending by consumers and businesses can accelerate rapidly. Although CBO expects that the current recovery will be spurred by that dynamic, in all likelihood, the recovery will also be dampened by a number of factors. Those factors include the continuing fragility of some financial markets and institutions; declining support from fiscal policy as the effects of ARRA wane and tax rates increase because of the scheduled expiration of key tax provisions; and slow wage and employment growth, as well as a large excess of vacant houses.

In reality the issues are driven by the total economic uncertainty emanating from Washington. Business is uncertain as to tax rates, as to exogenous new taxes such as cap and trade and health care, as to the fragility of the dollar, as to the availability of credit to finance current operations, as to the depletion of equity for start-up opportunities, and the list goes on. That abject terror, which exists in the business community and especially among entrepreneurs, will delay a recovery; the numbers projected by the CBO post-recovery shall never be met, and things will get worse.

Broadband and Manners

About a week ago I gave an interview to the newspaper in Burlington, Vermont, on which I subsequently commented here. I am generally not interested in public expression to any degree, having been down that path before, but I was asked, and I responded.

What truly interests me is the responses to the article, and especially the ad hominem attacks by anonymous commenters. For example:

First Mr. McGarty from MIT I would ask for my money back from MIT or if you are teaching our young at MIT please get another job. You fail to see the added benifits (sic) this would bring to a rural area. You fail to take into consideration the future dependency of our economic and health care on HSI and your misleading statements can put people at risk of having no HSI at all.

Well, if the commenter, bandhog by name, had spent a few seconds on facts, he, and I assume it is a he, but alas one never knows, would see that I donate my time; there is no exchange of money, except from me to MIT, and they do not even pay for parking, let alone the 600-mile round trip. As for the satisfaction of students, just ask them, or check my name on theses. Also, I have no "job," for indeed I am old enough to be retired.

As to the benefits, one knows there are benefits; I enjoy them daily. As one who lives at times in our home in northern New Hampshire, I made a choice there as well. As to health care, I suspect that I know a bit more than bandhog; one does not diagnose solely by broadband, and Osler would be horrified. As to my statements, they are economic statements of fact; you see, I have done this most of my life, and thus have the distinct disadvantage of experience. If you choose to live in a Walden-like world, you take the consequences. It is your choice, and one suspects that others should not subsidize your choice at their expense. But bandhog seems to hide behind his nom de plume rather than engage in a true debate.

The second commenter, fibernetworks, states:

The first “industry observer”, Mr. McGarty from MIT, states that the cost per mile to build the Fiber network will be in the range of $50,000 per mile. This is wrong. The actual cost is in the range of $16,000 to $22,000 per mile, which, using his own logic, makes the customers “needed” per mile drop from 20 per mile to less than 10 per mile.

Well, I hate to disappoint the fibernetworks person, but when, in my experience, you average buried and pole-mounted plant, when you add the pole electronics, when you factor in the make-ready, when you account for dealing with whoever owns the poles and the attendant delays, when you complete the strand mapping, and on and on, it approaches $50,000. If you believe in $16,000 as the all-in cost, you may have a great surprise, or you may be in a very unique area. When we did the buildout budgets for Hanover, NH and the other almost 30 towns, we assumed $25,000 per mile but soon found it was closer to $50,000. For fibernetworks this means that the facts of experience may trump his opinion.
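The disagreement over cost per mile translates directly into the subscriber break-even point, since the required take scales linearly with build cost. A quick sketch using the figures from the exchange (illustrative of the argument, not an actual business case):

```python
def customers_needed_per_mile(cost_per_mile: float,
                              baseline_cost: float = 50_000,
                              baseline_customers: float = 20.0) -> float:
    """Break-even subscribers per mile, scaling linearly from the
    exchange's baseline of 20 subscribers at $50,000 per mile."""
    return baseline_customers * cost_per_mile / baseline_cost

for cost in (16_000, 22_000, 25_000, 50_000):
    print(f"${cost:,}/mile -> "
          f"{customers_needed_per_mile(cost):.1f} customers/mile")
```

At fibernetworks' $16,000 to $22,000 per mile this yields 6.4 to 8.8 customers per mile, matching his "less than 10" claim; the dispute is thus entirely over the cost input, not the arithmetic.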

Thus the lesson of this tale: the anonymous nature of the comments allows people to say whatever they wish, since one can never test the basis upon which they comment. Their comments are often hateful and baseless. This shows the type of folks who back the project, which in many ways calls into question the project itself.

Monday, January 25, 2010

The Rowe Conjecture and the Efficient Market Hypothesis

Nick Rowe of Toronto presented a conjecture concerning two variables as they relate to the efficient market hypothesis (EMH). (Note: We have written this up in some detail on the Telmarc web site for reference.)

Rowe defined them as follows:

Recall that the EMH, simply stated, is the assumption that the market value of a stock reflects all the information available regarding the stock. Now we know two things. First, there is a herd mentality in the market that makes many people believe that a stock has, or does not have, value independent of whatever information is available. In fact, some people may have information not available to others. Second, the herd mentality is driven by the percentage of those who believe the EMH, whereas as the herd develops, the true operation of the EMH may actually disappear.

Thus the two variables, the first being the belief in the EMH and the second being the actual operation of the EMH, are related. If the EMH truly holds, say 100%, then we have an efficient market, and herd mentality is at a minimum, because everyone distrusts the herd and does their own analysis, assuming equal access to information and to trading.

We now develop a dynamic model based on the Rowe conjectures. We have changed these variables slightly from what Rowe stated, so that they are probabilities and they are time dependent. Rowe sets the problem up as a supply and demand model, wherein he disregards temporal dynamics and treats the percentage of people as the quantity and the probability of validity as the price variable.

We disregard the supply demand paradigm and look at them as interlinked temporal variables. Rowe has presented a compelling model of market behavior. We build upon it and do so in a dynamic fashion.

We assume a generalized model of the following type:

Now this is a generalized model which we will add some structure to. We will do so by applying a discrete time version and then go back to the continuous time version to analyze the results in a phase plane methodology.

Let us now write:

This is a linear model. We will expand this shortly but this is a good place to commence the analysis. This simple model states the following:

1. At some time k+1, the percentage of people who now believe that the EMH is true is some multiple (which may be greater or less than one) of the percentage who believed before, plus some fraction of the probability that the EMH is actually in force.

2. The EMH is often true if those in the market are of the belief that it is not and that the market is not reflecting the true value and that they must do their own work to seek the truth.

3. The EMH is often false, namely its presence has a low probability, if there is a herd mentality. Namely the greater the belief in an EMH the smaller the probability that an EMH is true.

4. Market bubbles occur when the herd approaches 100%, which also means that the probability that the EMH holds is reduced to zero. When a market bubble occurs, the market is then subject to collapse, and the belief in the EMH drops precipitously.

5. Thus the model should reflect the dynamic as follows:

a. when the belief is low then the truth is high
b. when the belief is high, it grows the level of belief to a point and then collapses the level of belief
c. when the belief is high the truth is low
d. truth is dependent only upon the belief, and it is the belief that solely drives the Bubble

Thus we can create a model which can be written as follows. First for truth we have:

This is the Clock equation of Andronov et al., and it describes an oscillatory system in the space of x and dx/dt. Namely, we have a phase space of the two variables, orthogonal to one another, and there is an oscillatory behavior in the box as shown below. This can be simulated in discrete time by the following:
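The original equations are not reproduced here, so the following is only a minimal sketch of the coupled dynamics the text describes: belief in the EMH rises when its perceived validity is high, while validity falls as the herd grows. The coefficients, center point, and step size are illustrative assumptions, not Rowe's or the actual model's values.

```python
# Minimal Euler sketch of coupled belief/truth dynamics (assumed
# coefficients). Both variables are probabilities in [0, 1]: belief b
# rises when truth t is above its midpoint; truth t falls as the herd
# (belief b) grows. The linear coupling yields an oscillation.

def simulate(b0=0.3, t0=0.7, alpha=0.3, beta=0.3, dt=0.1, steps=400):
    b, t = b0, t0
    history = [(b, t)]
    for _ in range(steps):
        db = alpha * (t - 0.5)    # belief grows with perceived truth
        dtr = -beta * (b - 0.5)   # truth erodes as the herd grows
        b = min(1.0, max(0.0, b + dt * db))
        t = min(1.0, max(0.0, t + dt * dtr))
        history.append((b, t))
    return history

history = simulate()
beliefs = [b for b, _ in history]
truths = [t for _, t in history]
# The two series cycle against one another, belief lagging truth.
```

In this linearized sketch the pair traces an orbit around (0.5, 0.5); capturing the bubble-and-collapse behavior of point 4 (herd approaching 100%, then a precipitous drop) would require an additional nonlinear term.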

The typical solution for this may look as follows:

The above is a time plot of the two variables; they cycle back and forth over time. This shows the following:

1. There can be a model for the EMH that demonstrates the relationship between the two variables. It is not a supply-demand model.

2. The model demonstrates market cycles as expectations and reality cycle with each other.

3. The model can be tested against real data to ascertain its validity.

It would be interesting to see how this compares with reality.

More Thoughts on CCE

The NEJM published a recent article on the CMS approval of certain new medical technologies. The authors state:

In deciding whether to pay for new medical technologies, the Centers for Medicare and Medicaid Services (CMS) is becoming more specific about its requirements for evidence of improved health outcomes in the Medicare population. In our view, this is a positive and overdue step, but one whose rationale and likely consequences must be better understood by the medical community, policymakers, and the public. Expansions of access to health insurance under the health care reform legislation pending in Congress — and resulting financial pressures — would almost certainly intensify the emphasis on more relevant and robust evidence....

Over time and in fits and starts, Medicare has embraced this emphasis on “outcomes.” The program pays for broad categories of health care services (e.g., hospital and physicians’ services) but is prohibited by law from paying for items and services that are not “reasonable and necessary.” Although most coverage decisions are made by the regional health plans that administer the Medicare program, the CMS issues national coverage determinations (NCDs) each year for 10 to 15 technologies that are projected to have a major impact on care, for which an existing national policy requires updating, or for which regional policies are conflicting.

The NCD, or national coverage determination, is the mechanism for evaluating whether Medicare will reimburse for certain technologies. For example, this applies to robotic surgery for prostate cancer. There is currently a paucity of evidence that the robotic approach increases life span, yet it does reduce certain morbidities.

They continue:

Some physicians may be concerned about stricter evidentiary requirements, perceiving them as impeding access to important medical advances. Others may be disturbed by the idea of interference by “big government” in the doctor–patient relationship. Still others may suspect the motives underlying the requirement for evidence reviews, seeing the trend as part of a cost-containment agenda, as highlighted recently by the second-guessing of the motives behind changes to the screening guidelines for breast cancer and cervical cancer.

Yes indeed there is this concern, and perhaps it was a key element in the recent Massachusetts Senate race outcome. The breast cancer decision was a political disaster. The team making it was apparently devoid of disease-specific expertise. Furthermore it was devoid of essential patient input. For it is not just the test per se but the "ritual" of the patient and physician contact which is key. This ritual, along with its elements, mammograms, PSA tests and the like, affords substantial benefit well beyond the Government-defined end points.

The authors continue:

How do evidence requirements vary among different categories of technology, and how can that evidence be generated most efficiently? When can Medicare make reasonable inferences from studies undertaken in non-Medicare populations? When is it reasonable to extrapolate from surrogate markers studied in randomized, controlled trials to longer-term outcomes? When and how should observational data and other nonexperimental evidence be used? When should technology be reassessed in light of new information?

Part of the solution will come from having a more transparent, timely, and participatory process, and Congress and the CMS have worked to improve matters in this regard. Part of the solution will also come from smarter design and implementation of clinical trials and better synthesis of evidence. The CMS should continue to explore ways to enact flexible coverage policies in order to tie payment to outcomes. The agency has experimented with a policy of “coverage with evidence development,” which enables Medicare to cover the use of promising technologies for patients enrolled in studies that will better determine a technology’s risks and benefits.

The problem is that if CMS demands evidence for any and all treatments, then we have a chicken-and-egg problem. The arguments over CCE in the most recent health care bills are prime examples of the concern here. CMS already has the authority to enact many forms of control. It will be interesting to see whether CMS does to health care what the EPA has done to CO2 control, by fiat!

Microsoft and the Registry

Just a note for readers: I decided to get a new laptop, as the old one had no disk space left and was seven years old, so I got a great Dell and, like any techie, decided I would get Windows 7, the 64 bit version.

Well, you all can guess the ending. I installed a Bluetooth mouse, and then decided to download and install the updated drivers. Yes, then all hell broke loose. The download wiped out the Registry!

For those of you who do not know the Registry, it is akin to the limbic system of the brain, that internal element which interconnects things and gets us frazzled under stress. It imprints lasting memories on us so that we never again do anything like what we have just experienced. The limbic system imprints such things as never again touching a hot stove, or licking a frozen pipe on a dare in northern New Hampshire, or the like.

Thus from 8 PM till well past midnight I was in the hands of some well intentioned Dell techies in Mumbai who took control of my life, and I watched as they tried to determine what had happened. Now I know a bit about the Registry, and I know more about the limbic system, but alas I could not figure out what had been done. They then asked me to push a button which I had never seen, and there it was, it worked again.

I have been cleaning up that mess ever since. This is why I measure my life in MegaGates, units of human lives wasted on Microsoft induced disasters. Some of you may have sympathy, but alas, there are the techies who will tell me they have no trouble with the Registry, there will be Apple users who ask why I deal with a Registry at all, there will be the Luddites who say just buy another one and never touch them, and I am left in this nether world asking why!

Thursday, January 21, 2010

Prostate Cancer, CCE, and Testing

The British Journal of Cancer has just published an interesting article regarding Prostate Cancer. They state:

There is evidence that prostate cancer (PC) screening with prostate-specific antigen (PSA) serum test decreases PC mortality, but screening has adverse effects, such as a high false-positive (FP) rate. We investigated the proportion of FPs in a population-based randomised screening trial in Finland...An FP result is a common adverse effect of PC screening and affects at least every eighth man screened repeatedly, even when using a relatively high cutoff level. False-positive men constitute a special group that receives unnecessary interventions but may harbour missed cancers. New strategies are needed for risk stratification in PC screening to minimise the proportion of FP men.

The last statement is the most powerful. It says that a man may be told that an increased PSA is an indicator of possible Prostate Cancer, a biopsy then finds none, yet shortly thereafter he does come down with PCa. Namely, false positives may not truly be false positives but early true positives. Specifically, the histological test of looking at cells may not be the correct early assessment method.

The Cancer Research UK states in their assessment of the article the following:

The study, a clinical trial of the controversial PSA test for prostate cancer, tells us that false-positives are common. It also shows that men who get a false alarm:
  • are likely to get another one the next time they go for a PSA test
  • are likely to refuse future invitations to screening, and
  • are likely to actually be diagnosed with prostate cancer the next time round
The third result, in particular, is a fascinating one. It suggests that men who get a false-positive result through PSA testing, in the words of the researchers, “constitute a special group”. They could well go through unwarranted tests, but they could also harbour missed cancers that only turn up later.... As we mentioned above, there’s a large prostate screening trial running across Europe, called ESPRC. The new results, published in the British Journal of Cancer, (which Cancer Research UK owns) come from the Finnish part of this trial – its largest component.

It involves more than 80,000 men, some of whom were randomly invited to three rounds of PSA testing, with four-year gaps between each round. Roughly 30,000 men attended their first round of screening and more than 10,000 of these men went on to attend all three rounds.

The study showed that false-positives are a common part of PSA testing. In any individual round of testing, the majority of positive results are false alarms (between 60 and 70 per cent), while just over a quarter lead to an actual cancer diagnosis. Among the men who attended at least one round of screening, 1 in 8 had at least one false-positive result.

It’s worth noting that the researchers were using a fairly high cut-off level of PSA (4 ng/ml) – i.e. the level above which they were thought to have suspected prostate cancer. This sets a pretty high bar for a positive result and should minimise the number of false positives. Nonetheless, many still crept through.

Among the men who get a false alarm in one round, more than half will get another false alarm in the next one. Many men without tumours have persistently high PSA levels for some other reason, so they keep on testing positive. That’s a lot of extra worry and more potential for unneeded tests.

Indeed, in this trial, every third man who got a false alarm went through two biopsies within 4 years of their result. That’s probably an underestimate too, as it doesn’t account for any visits to private doctors.

However, the study also shows that false-positives aren’t entirely meaningless. If men had a false alarm during one round of screening, they were 3-9 times more likely to be diagnosed with prostate cancer during the next round...."

The analysis of the poor trials mentioned above is what we had commented on a year ago when the results were issued. Namely, they used the 4.0 PSA level, which we now know to be wrong, especially for men under 65. In addition, we also now know that the better measure is PSA velocity, namely the change in PSA in a year's time. If the change is 0.75 or greater, then there is a 90% chance of Prostate Cancer. That is a fairly good metric. Thus if you have a PSA of 1.5 one year and the next year it is 2.25, you have a 90% chance of incipient PCa.
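The velocity rule just described is simple arithmetic; as a sketch, using the figures from this post (the 0.75 ng/ml/year threshold and the 1.5 to 2.25 example):

```python
# PSA velocity: the change in PSA (ng/ml) over time, per year.
# The 0.75 ng/ml/year threshold is the figure cited in the post above.
def psa_velocity(psa_then, psa_now, years=1.0):
    return (psa_now - psa_then) / years

v = psa_velocity(1.5, 2.25)   # the example from the text
print(v)                      # 0.75
print(v >= 0.75)              # True: the velocity rule flags this as high risk
```

Note that this flags the example patient even though both readings sit well below the 4.0 ng/ml absolute cutoff used in the trials, which is precisely the point being argued.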

The 2003 NEJM article by Nelson et al on Prostate Cancer lays out the genetic progression of Prostate Cancer, and it is that progression which PSA somewhat follows. Yet it is that progression which most histological exams, using say a Gleason framework, do not follow. It is worth a simple review to see what we mean. Let us go through four simple steps:

1. Cancer is simply a breakdown of the normal cell cycle as shown below. Cells duplicate themselves via mitosis, and it is in that mitotic process that, say, old cells "die" and new cells are created. In fact the old cell just repairs itself and then duplicates itself. The classic process is as below.

Most of the time the cell is resting in G0. When in G1 the cell is getting ready to reproduce. It is in S that the DNA copies itself, and the cell then goes on to M for separation into new cells. The skin, blood, and many other cells are doing this all the time. However there may be problems. The cell DNA may be hit by radiation, by some chemical which damages the DNA, or the like. Cell DNA is quite fragile.

2. The cell begins its change to reproduce, and there are many internal control mechanisms. They take the cell almost through G1 up to an R point, at which, if the cell DNA has any problems, the corrective mechanism will kill the cell. However if the genes controlling this protective mechanism are not working due to some attack, then the cell goes past this R point, and does so again and again. That is the beginning of cancer.

3. Now there are many chemical pathways that try to stop errors from propagating. We show some of them below.

The most important for Prostate Cancer is the PTEN pathway. If this gene and its protein are in any way damaged, the result is loss of control of cell growth. Many environmental factors contribute to the breakdown of PTEN. Once it goes, the PSA starts to explode. The cancer then also becomes unmanageable. It is this final assault that will often result in death.

4. Cancer is a progressive disease of steps. The steps from Nelson (there are updated versions now, seven years later, but this one is quite reasonable) are shown as follows:

A simple healthy cell starts on the left and spends its whole life happy and well. Then all we need is one cell which gets attacked, and the process starts. But it takes many attacks, one after the other, to take the cell from a slight problem to a deadly mass. Understanding these steps, and being able to determine what is in the "bad" cells, will be a much better path than what we have now with PSA. But PSA is good; it works, and it does save lives.

Thus we argue three facts:

1. The clan of Comparative Clinical Effectiveness users is really a backward looking clan. In fact the PSA testing controversy shows how backward looking they can be. Yes, a 4.0 PSA cutoff will result in little improvement, for by the time the PSA gets there, especially in young men, it is too late. We need a forward looking clan of researchers on the clinical side. That however may be an oxymoron, since clinical researchers most often look backward.

2. The genetic markers are truly the best measures of what the problem may be. Yet we need better means and methods to measure them. We need, say, nanotechnology which will scrub through the prostate and scrape up telltales of the presence of the genetic markers. Are there any PTEN negative cells? If so, they are the clones which will reproduce and kill the patient. They are the ones which should be eliminated.

3. Genetic medicine is as we have argued recently the PC of medicine. It will be the sea change necessary to finally attain scale in the practice of medicine.

Wednesday, January 20, 2010

What Anger?

The current Administration seems to portray the Massachusetts vote as one compounded of anger. Frankly, having spent a great deal of time there, one is left to ask: what anger? Perhaps Gergen is angry because his academic arrogance was called into question, and perhaps the Kennedy machine is angry because it lost the last semblance of God given regality. Yet the people who voted for Brown were not angry. Anger was and is a negative emotion. The people felt empowered to say they did not agree with the ensconced pols who all too often take them for granted. This is not Venezuela, yet, and votes still do count. The "anger" tag is the last resort of the rascal: call your opponent a nasty name and hope it sticks.

This election was a restatement of the principle that if all else fails the politician should listen to the people, even in a Republic. The current administration should set aside Alinsky and return to the principles of the Founders. As I walk around the Jockey Hollow encampment of Washington, I can recall what those troops went through. They were from Massachusetts, New Jersey, New Hampshire, and they fought and died for individual freedom. That is what was reiterated to Washington last night, and the anger came from Washington, not Massachusetts.

Tuesday, January 19, 2010

Congratulations Senator Brown!

Senator Brown has accomplished the impossible, a Republican win in Massachusetts. Great job! It does send a message. As we have been arguing here for over a year now, we believe in health care reform, but what Congress has delivered is a sham. It creates an untenable situation which is more "gifts" to friends and nothing of merit to US citizens. There are parts we believe in, but as a whole it is a disaster.

In the late 60s and through the 70s I was a staunch Massachusetts Democrat. I ran a portion of the Tsongas for Congress campaign in Acton, I was a rep at the 1974 mini convention, I was a strong supporter even of McGovern, and I spent four years in and out of secondment in the Carter Administration. But alas, enough was enough. I know Massachusetts quite well, and as I looked at town after town the tale was overpowering. Woburn, Reading, Peabody, true blue collar Democratic towns, went powerfully for Brown. Falmouth, Bourne, Barnstable, solidly for Brown; just look at the map. The towns of Concord, Newton, and Lexington were, as expected, Democratic. The Concord-Lexington creep reached Acton but died at the border. Littleton went Brown. This was truly amazing. The central and western regions went Democratic as expected, but then they also wanted Guantanamo prisoners sent there as well.

Frankly, a major key to the win was the blatantly arrogant question of the Harvard based Gergen, presuming a Kennedy entitlement to the position, and Brown's brilliant retort that it was the people's Senator. The isolation of many academics from the people is dangerous, and one would suspect especially so in a School of Government. I spent time years ago walking and talking on the campaign trail. People do listen, they do think, they do decide. That is what Gergen missed in his insular academic mindset, and what the people responded to.

This will be a loud call for Washington to listen. Yet I suspect it will not be heeded. This was not a Glenn Beck cult but a massive shift. Hopefully we will not see Congress ignore the people.

Scale Economy in Medicine

There has been some discussion today about an article on Baumol's "cost disease" as an explanation for why the cost of health care will never decrease. Simply stated, the intent was to show that health care is akin to musicians in an orchestra: that there is no way there will ever be economies of scale in the delivery of health care.

As with a great deal of what Baumol has stated for a while, I beg to disagree. The change in computing, and in turn in our economy, was the deployment of the computer and especially the PC. The same will occur in health care with the deployment of genetic techniques to ascertain predispositions for diseases, to stage those diseases, to treat them, and in turn to prevent them.

As we have argued many times before, we sit on the precipice of this change. There are many ways in which we can now determine whether a person has a predisposition for disease: conditions as extreme and rare as Marfan syndrome, the BRCA genes for breast cancer, the PTEN gene expressed or not expressed in many cancers. The staging and assessing of many diseases, and especially many cancers, will allow for improved and more efficient treatment.

We are there now, we are at the edge of change. In many ways genetic therapy and prevention is akin to computing in 1947. If it continues to develop it will solve the health care dilemma and it will show that Baumol's cost disease will remain with musicians and will no longer apply to health care.

However if we institutionalize the current way of doing health care as is being done with the current plan then more than likely we will see this occur well beyond our collective lifetimes.

The Treasury: Rates and Change

We have been looking at the debt and its components for the past year and some interesting trends are appearing. The above is the debt and its public and intragovernmental elements. Readers should remember that the intragovernmental elements are what Congress steals from our Social Security and Medicare to pay for the redistribution programs and pork to keep them in DC. The public debt is what China and our other friends have been talked into buying.

Now let us look at the public side as below.

The above shows that purchases of the public debt are moving to the longer held instruments, since the shorter instruments are paying little if any interest. We showed this a few days ago when we commented on the yield curve.

This is good and bad. It is good in that one may say the buyers see little long term inflation. It is bad in that if there is long term inflation, they will sell these on the market even at a massive loss, and this will result in a downward spiral, a positive feedback mechanism, right in the middle of an inflationary period, which will exacerbate it.

How can this happen? Well, simply, the Fed can really start printing money. Why would this happen? Because the buyers just do not "feel" any trust in US policy.

My concern is that the yield curve is driving us to a possible tipping point, with inflation arising not just from printing but from a loss of faith in the country.

Monday, January 18, 2010

Broadband Stimulus: Some Questions

In 2004 I had a company, The Merton Group, which was awarded funding to build broadband in New Hampshire, Vermont, and Massachusetts. The above is the dummy check for Hanover, New Hampshire. We eventually never started and never took a penny of Government money. As we have noted before, it was due to the franchise requirements. Moreover there were parts of the towns where under no circumstances could the business ever be made to work. Those parts are similar to what a Vermont company called EC Fiber is allegedly targeting in Vermont.

I was interviewed a week ago by the Burlington Free Press regarding EC Fiber. As I told the reporter, I really knew nothing about the company other than what I had read on the Internet, but I had the experience of having tried this once before. I could speak only from my extensive industry experience.

The paper states:

Managers of the proposed network — a patchwork of 46,500 people living in 22 towns in four counties — say it will pay off its debt with its subscription revenue. ECF Board Chairman Loredo Sola of Pomfret said the nonprofit network hopes to hear within a few weeks whether it has been chosen from hundreds of applicants nationally for a $69 million federal stimulus loan from the Department of Agriculture’s Rural Utility Service. If that federal money becomes available, work on the “shovel-ready” project can begin quickly, Sola said.

If we take the above, we see there are about 20,000 households, and they desire a 50% penetration by year six, so they will have 10,000 subscribers. The RUS funding of $69 million is only 80% of the total required, so a total of about $83 million is needed. That is $8,300 per sub!
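As a back-of-the-envelope check of these figures (the 20,000 household count, the 20% match on top of the loan, and the rough rule of thumb used in this post that debt service runs about 1% of capital per sub per month are all the post's own working assumptions):

```python
# Back-of-the-envelope check of the per-subscriber figures quoted above.
households = 20_000            # ~46,500 people in 22 towns -> roughly 20,000 households
penetration = 0.50             # target take rate by year six
subscribers = households * penetration        # 10,000 subs

rus_loan = 69_000_000
total_capital = rus_loan * 1.2                # loan is ~80% of total -> ~$83M all-in
capital_per_sub = total_capital / subscribers
print(round(capital_per_sub))                 # ~8280, i.e. roughly $8,300 per sub

# Rough rule of thumb: debt service runs about 1% of capital per sub per month.
rus_per_sub_month = 0.01 * rus_loan / subscribers
print(round(rus_per_sub_month))               # ~$69 per sub per month just to repay RUS
```

That $69 per month figure, before video, Internet transit, or operating costs, is what makes the plan look so strained against ~$100 per sub per month of revenue.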

Just paying back RUS will run $69 per sub per month at a 50% take rate! That seems to me to be un-doable. I am quoted initially as follows:

Some industry observers are expressing concerns that the project is too costly and the market too competitive. Terry McGarty, a researcher at Massachusetts Institute of Technology with experience trying to develop fiber-to-the-home systems, said ECF’s chances of financial success are “highly problematic.”

Problematic is an understatement. If the revenue is $100 per sub per month, and just paying RUS at 50% penetration is $69, that is, 1% per month of the total capital per sub, then they are under water already, for the video content alone may exceed $30. And no other costs are included. The article continues:

Twenty-seven towns passed an advisory question on Town Meeting Day 2008 to build the ECF network. Ultimately, 22 signed an interlocal contract to proceed. “Basically, it was an instrument to allow the towns to get together and do things,” said Jim Dague, Granville road commissioner and the town’s liaison to the ECF project. “Everyone was in favor of doing that.” Granville, with a population of about 300, is one of the smallest communities in the network. “We have about four houses per mile and 17 miles of roads,” Dague said. “That’s why FairPoint is not doing it.”

Let me do another simple analysis.

1. We know that the target capital per sub should be $3,500 at the maximum, for that converts to $35 per month per sub for capex repayment. Why is $35 for capex important? Because the video costs $30 per month, the Internet access is $20 paid to the Internet backbone carriers, and the operating costs are $15. That totals the all-in revenue of $100, and thus at that point the business is break even.

2. Now we also know that of the capex we have $500 for the drop from the street to the home, $500 for the equipment in the home, and $500 for miscellaneous other centralized equipment. These are incremental costs. This leaves $2000 per home for the fiber in the streets and common equipment.

3. We know that fiber costs about $50,000 per mile to string up, including pole changes and the like. Since we have $2,000 per sub, we know we need about 25 subs per mile. At 50% penetration that means we need 50 houses per mile.

4. Oops, these towns do not seem to be even close!
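The four steps above reduce to a few lines of arithmetic, using the post's own figures (the 1% of capital per month capex repayment rule, and the per-sub cost breakdowns):

```python
# The break-even fiber economics from steps 1-4 above, as arithmetic.
revenue = 100                  # all-in revenue per sub per month
video, internet, opex = 30, 20, 15
capex_budget = revenue - video - internet - opex    # $35/month left for capex repayment
max_capital_per_sub = capex_budget * 100            # at ~1%/month, $35 supports $3,500

drop, home_equip, misc = 500, 500, 500              # incremental costs per subscriber
plant_per_sub = max_capital_per_sub - drop - home_equip - misc   # $2,000 for street fiber

fiber_cost_per_mile = 50_000
subs_needed_per_mile = fiber_cost_per_mile / plant_per_sub       # 25 subs per mile
penetration = 0.50
homes_needed_per_mile = subs_needed_per_mile / penetration       # 50 homes per mile

print(max_capital_per_sub, plant_per_sub, subs_needed_per_mile, homes_needed_per_mile)
# 3500 2000 25.0 50.0
```

Fifty homes per mile to break even, versus the roughly four homes per mile reported for a town like Granville, is the gap the analysis turns on.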

The analysis is all back of the envelope. Any experienced and competent person can do it. What has happened here? Well, the Burlington Free Press states I am an MIT researcher. Yes, that is what I do part time now, but I have built these networks worldwide, and have been, amongst many things, a Group President at Warner Cable, deploying the first single mode fiber in 1981, the COO of NYNEX Mobile, deploying a full New England network, and Founder and CEO of Zephyr, the dominant fiber network in Central and Eastern Europe. Thus I have the distinct disadvantage of experience, which the EC Fiber team seems to be weak on, especially if one looks at Burlington Telecom.

Now as to BT: I was asked in 2004 to meet with town officials there to give them advice regarding BT. I went through the simple analysis above. I think they got it, and I told them of the high risks. It seems they were not listening then, as they are not now.

The article continues:

ECF said RUS responded to that question: “Provided the prices and technical specs were within industry norms, the ‘no-bid’ aspect of the contracts would be outweighed by the ‘shovel ready’ advantage of having signed, complete contracts ready to go into operation as soon as financing is completed.”

When I did Merton a few years back, RUS, as well as common sense, required me to get competitive bids. We had dozens, and I made many trips to vendors and had vendors come to me. We finally chose a collection of them to do the work. This seems to be a sole source deal. It further seems to be one where RUS just wants the money out the door, if the above quote is correct.

It is a shame, because systems like this can be built in targeted areas, but if they fail the costs will land on the backs of those least able to pay, namely the taxpayers. And yes, there will be many who profit, while there will be many more who may not.