Tuesday, April 30, 2013

Prostate Cancer and SNPs

SNPs are single nucleotide polymorphisms: simply, a single base has been replaced by another somewhere in the DNA, and in some cases this results in a higher predisposition to PCa. In a recent Nature Genetics paper the authors state:

More than 70 prostate cancer susceptibility loci, explaining ~30% of the familial risk for this disease, have now been identified. On the basis of combined risks conferred by the new and previously known risk loci, the top 1% of the risk distribution has a 4.7-fold higher risk than the average of the population being profiled. These results will facilitate population risk stratification for clinical studies.

These are germ line mutations, and given the germ-line predisposition one's risk is increased substantially. They continue:

Epidemiological studies provide strong evidence for genetic predisposition to prostate cancer. Most susceptibility loci identified thus far are common, low-penetrance variants found through genome-wide association studies. Fifty-four loci have been identified so far.

Finally they state:

With the identification of these new loci, 77 susceptibility loci for prostate cancer have now been identified. On the basis of an overall twofold familial relative risk for the first-degree relatives of prostate cancer cases and on the assumption that SNPs combine multiplicatively, the new loci reported here, together with those already known, explain approximately 30% of the familial risk of prostate cancer. Taking into consideration these SNPs and this risk model, the top 1% of men in the highest risk stratum have a 4.7-fold greater risk relative to the population average, and the top 10% of men have a 2.7-fold greater risk.

What does this mean for health care? Is it prognostic, and if so why? One of the problems with SNPs is that the pathways they control are not clear. SNPs just sit there, and we know that if they are present then the risk is greater. The lingering question is still: what do they really do?
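
To make the multiplicative model in the quoted passage concrete, here is a minimal sketch in Python. The per-SNP relative risks and risk-allele frequencies are invented placeholders (the actual estimates are in the paper's supplementary tables); the point is only to show how modest per-allele risks compound into a wide risk distribution.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical per-allele relative risks and risk-allele frequencies
    # for a panel of 77 SNPs; the real values are in the paper's supplement.
    n_snps = 77
    rel_risk = rng.uniform(1.05, 1.20, n_snps)   # per-allele relative risk
    freq = rng.uniform(0.1, 0.5, n_snps)         # risk-allele frequency

    # Simulate genotypes (0, 1 or 2 risk alleles per SNP) for a population.
    n_people = 100_000
    genotypes = rng.binomial(2, freq, size=(n_people, n_snps))

    # Multiplicative model: overall risk is the product of per-allele risks.
    risk = np.prod(rel_risk ** genotypes, axis=1)

    # Compare the top 1% of the risk distribution to the population mean.
    top1 = np.sort(risk)[int(0.99 * n_people):]
    print("Top 1% mean risk / population mean:", top1.mean() / risk.mean())

With 77 such loci, even per-allele risks below 1.2 push the top percentile several-fold above the population average, which is the shape of the result the authors report.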

The Enlightenment

The Enlightenment by Pagden is a compelling book. It is not a history, it is not a work of comparative philosophy, and it is not a work of political theory. It is a view of the Enlightenment by topics, and through the focus of these topics it draws in the principal players of the Enlightenment again and again, intertwining their views in an ever more complex web. Each time Pagden does so he addresses another set of issues and often brings current affairs to the fore as well.

The core of the Enlightenment was the focus on reason and its tremendous powers, and the total abhorrence of institutional revealed religion. Faith conflicted with reason. The Enlightenment totally rejected the Scholastics and their use of reason and logic. In a strange way reason dominated even the experimental efforts that surrounded the Enlightenment figures. The author states in his Preface that reason was not overthrowing the passions, but that the claims of reason were to be rejected as well as accepted. The author also addresses the concern of Eurocentrism and the relegation of the Enlightenment to a secondary status, a place where he believes it does not belong.

The author begins with an attempt to define the Enlightenment, or “Enlightenment” as a process. To be enlightened meant being critical, and for this capability it meant the use of reason (p 21). He provides a remark from Kant that it is but a few men, since most men and all women are but sheep, who use reason. For the others, if they can pay someone to tell them such things as what to eat and what is moral, then there is no need to think, to reason. Yet amidst the mass of historical references the definition of either Enlightenment or the Enlightenment still is elusive. It is built upon reason, but it also appears to be a period based upon a revolt, a revolt from the way things were done, and especially the way one held religious belief.

Chapter 1 presents a somewhat historical context for the beginning of the Enlightenment. On p 33 there is a discussion of the end of the Thirty Years War, with the Peace of Westphalia. This event, the War, still hangs over much of central Europe. Many of the political divisions were religious divisions, and these divisions set the stage for conflicts for centuries to come.

Chapter 2 describes the change which the Enlightenment brought. It also presents one of the most convoluted sentences I have ever read. On p 66 the author states:

“The Enlightenment, and in particular that portion with which I am concerned, was in part, as we shall now see, an attempt to recover something of this vision of a unified and essentially benign humanity, of a potentially cosmopolitan world, without also being obliged to accept the theologians’ claim that this could only make sense as part of the larger plan of a well-meaning, if deeply inscrutable, deity.”

There are eight commas. But the sentence does accurately describe exactly what the author intends. Yet it is also exemplary of his style, which at times may be a bit daunting for the reader.

On p 69 the author provides insight into the debate that lurks below the surface between Hobbes and Rousseau. For Hobbes mankind was a fundamentally aggressive animal that needed the Leviathan to control it. For Rousseau mankind was originally pristinely pure and was thereafter corrupted. Both men reached their conclusions by reason devoid of any scientific evidence or facts. That in a sense was the fatal flaw of the Enlightenment. It assumed the overwhelming power of reason as a sine qua non.

On p 77 there is a discussion of natural law and its deficiencies. The author states:

“The entire scholastic theory of moral and political life rested, as we have seen, on the idea that our basic understanding of the law of nature was made up of certain “innate ideas” or “innate senses”.”

Then on p 79 he addresses the assault by Locke on this principle by stating:

“Few historians of philosophy have paid much attention to this lengthy onslaught on the notion that there might exist “innate Principles” or “Innate Characters of the Mind which are to be Principles of Knowledge” beyond “a desire of Happiness and aversion of Pain”.”

There was thus, on the one hand, a rejection of innate laws of nature, laws that could be arrived at by reason, and on the other the application of reason to all existence.

Chapter 3 is a chapter regarding a world without God. In some sense it is the Enlightenment's fracturing from the past centuries and an attempt to break loose. He contends that man, as a result of the Enlightenment, can adapt to a civil society sans religion. As he says on p 109:

“If it appears to do so now, that is only because of the fear that the Church has, over the centuries, inculcated in it.”

The author seems to align himself very much with those iconoclasts of the Enlightenment as one progresses through this chapter.

He continues in Chapter 7 with a discussion of laws. On the one hand we have Montesquieu, and on the other hand we have Robespierre. As he states on p 309:

“Furthermore, political virtue was conceived of as a sentiment and not, as Montesquieu put it, “the consequence of knowledge”. True, the virtuous citizens had to be able to distinguish good laws from bad, but they did not require any special knowledge to do that; they did not need to understand precisely what a republic actually was, or how its institutions operated, nor did they – as the ancients would, in fact, have assumed that they did – have to be actively involved in it, in order to love it; for the “last man in the state can have this sentiment as can the first”.”

There are times when I had difficulty discerning the author from his subjects. This sentence gave me pause. If citizens were to distinguish good laws from bad, how much did they truly have to know? Does this not apply especially to any republic, where representation in a legislative body reflects to some degree the public? I believe the author has some points to make here, but they are somewhat poorly extracted from the sources.

On pp 321-322 the author delves into the Great Society of Mankind through an interesting allusion to foreign aid. To him foreign aid as a concept would have been unacceptable in the Enlightenment, but as an act it would be congruent with Enlightenment thinking.

In his Conclusions he discusses the enemies of the Enlightenment. The discussion is generally in line with modern thinking, but there appear to be several divergences. On p 395 there is a discussion of Communitarianism. The author states that Communitarians have much in common with 19th Century nationalists. He discusses the source as Hegelian in part. But he sees the Communitarians as enemies of Enlightenment thinking. This discussion is very interesting and worth reading several times.

He again returns to the Thirty Years War as that seminal event which in a manner kicked off the Enlightenment. For most Americans this is an event at best hidden in the dark past of the World History book. However for a European, this is a dividing line between the past and the present. It was a war of the people, a war of faith, not a war of territory. Even today one feels it when dealing with Poland, Sweden, Austria, Germany, France and so forth.

But the Enlightenment is also a collection of characters. The author brings them to life in his style of topical discussion. Voltaire becomes almost a current day Cable TV commentator, irascible, while at the same time amassing a personal fortune. He went after the Catholic Church, in the guise of attacking religion, but praised the British for their religious tolerance while at the same time the British were massacring the Irish for their faith. At the other extreme he discusses de Tocqueville and his view of the Americas while not discussing de Tocqueville's writings on Ireland and the French Revolution.

There is a wealth of books on the Enlightenment, and those of Gay, Cassirer and Israel are but three that come to mind. This book is not in that class. The former are historical works that flow in some linear manner, either temporally or thematically. This book is kaleidoscopic in style, with flashes of insight coming and going, and then within those flashes incorporating vignettes of the main characters who are players on the stage of the Enlightenment. This is not a text of the type of Skinner, who would include all the players so that the ones we see so often are placed within an historical context.

This book, in summary, is a delight to read, albeit not in a linear fashion. It has brilliant flashes of insight and explanation, yet there are times when one yells back at the words in total disagreement. This book draws out thinking about the Enlightenment in more depth than a linear historical work would.

However it does pose the question: who is the Voltaire of today? Is it some "Talk Radio" personality, some Cable "News" commentator? They are irreverent, attacking the "system". Then again one may ask: who is the Robespierre?

Monday, April 29, 2013

Economics and Circuit Design

One of the things an engineer does when designing a circuit is to think through what goes up, and what goes down.

Now there are several ways of doing this. The young engineer writes out all the equations and then begins to solve them. The engineer was educated in the laws of electromagnetic theory and their applications to circuits; understanding these laws, which are demonstrable in the real world since it is from this world that they came, the engineer determines the performance of the circuit. The experienced engineer who has done this for many years can intuit the performance by saying this goes up and this then goes down.
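
For a concrete, if trivial, example of the first approach, consider a two-resistor voltage divider. A minimal sketch in Python, with arbitrary component values, writes out the equation and then perturbs one component to see what goes up and what goes down.

    # Voltage divider: Vout = Vin * R2 / (R1 + R2).
    # Solve it exactly, then perturb R2 to see what goes up and what goes down.

    def vout(vin, r1, r2):
        return vin * r2 / (r1 + r2)

    vin, r1, r2 = 10.0, 1000.0, 2000.0   # arbitrary values
    base = vout(vin, r1, r2)

    # Increase R2 by 10%: the divider ratio rises, so Vout goes up.
    perturbed = vout(vin, r1, r2 * 1.1)
    print(f"Vout: {base:.3f} V -> {perturbed:.3f} V")

The experienced engineer skips the computation entirely: R2 up means the divider ratio goes up, so Vout goes up.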

Now economists like to think that they too have theories based upon fact. Don't ever let them trick you on this one; they are based upon belief. A great example is in the NY Times today from the Gnome from the South, who states:

Why did spending plunge? Mainly because of a burst housing bubble and an overhang of private-sector debt — but if you ask me, people talk too much about what went wrong during the boom years and not enough about what we should be doing now. For no matter how lurid the excesses of the past, there’s no good reason that we should pay for them with year after year of mass unemployment. 

So what could we do to reduce unemployment? The answer is, this is a time for above-normal government spending, to sustain the economy until the private sector is willing to spend again. The crucial point is that under current conditions, the government is not, repeat not, in competition with the private sector. Government spending doesn’t divert resources away from private uses; it puts unemployed resources to work. Government borrowing doesn’t crowd out private investment; it mobilizes funds that would otherwise go unused.

Just read through this and you can see the belief set. For example, does Government spending take money from private use or not? Well, not really, but somewhat, yet again ....

The answer is it all depends; it is not black and white. People respond to beliefs, not to facts. I call this the Extended Rowe Conjecture. Namely, there is a battle between belief and fact. Belief often wins. If the Government increases corporate taxes to pay for its excess, or even alludes to the potential, companies move and respond so as to counter this.

Also there is a systemic loss from the labor pool. Tens of millions of jobs have just downright disappeared. Why? Outsourcing? Not really; we have become dramatically more productive, a fact that has been building for decades. This drives out human labor. It is inevitable. So should Government continue to pay these people in anticipation of a return? It is akin to Waiting for Godot. He will not come.

Government spending terrifies people. Why? Because of the added regulatory burden, the costs to comply, and the list goes on. Furthermore it poses a greater risk of uncertainty.

Friday, April 26, 2013

Why are Economists so Nasty?

I have been following the Reinhart and Rogoff debate. Now I have done thousands of spreadsheets over the years, more likely tens of thousands. Once I was interviewing a person to head my M&A group in a start up and I was explaining something to him. He found an error in my Excel spreadsheet and told me. I hired him on the spot, for two reasons: first, he found the error and corrected it, and second, he told me, especially during an interview. Now he did go to IIT in India and he did have an MBA from Chicago. Neither did I hold against him. After all, I am told that in India if you cannot get into IIT then you try MIT. MIT was sort of the Avis of high tech.

But still as I read some of the comments they are so vitriolic that one wonders if economics is no more than some religious cult. I have never seen such a battle over say the force balances on a three element truss! Maybe there is a "discussion" over the true function of PTEN on a cancer pathway but the way out is through experiment. It is not a belief set but a fact set.

Now this battle highlights several things about economists. First, the left leaning ones are real nasty. Second, there are no facts behind the theory. Third, the Canadians are real nice.

Thursday, April 25, 2013

Happy 60th Birthday to DNA

Sixty years ago today Watson and Crick published their seminal paper on the structure and replication of DNA.
In such a short communication they managed to start a movement that has led to the understanding of so many things about the biology of life itself.

So Happy 60th!

Wednesday, April 24, 2013

More Thoughts on Bell Labs


The Idea Factory is a well written presentation of what happened in Bell Laboratories in its early and middle lifetime. The author has captured the view from within the Lab and has presented a history told in many ways as the Lab people would have wanted it told. His conclusions, however, are subject to significant debate, if not downright wrong.

I write this review having heard the author present his work in Madison, NJ to an audience almost totally filled with hundreds of former Labs staff, and as one who spent a great deal of time at the Labs from 1964 through 1972 while going back and forth to MIT, plus over fifty years in the industry.

The author presents the often told tales of Shockley and the transistor, Shannon and information theory, as well as all the management types who formed, directed, and molded the Lab, like Kelly and others. Many of these people I knew firsthand, and as with any observer the view is all too often colored by one's position at the time.

The driving presumption of the author is best stated in his introduction where he says:

“Some contemporary thinkers would lead us to believe that twenty-first century innovation can only be accomplished by small groups of nimble profit seeking entrepreneurs working amid the frenzy of market competition. Those idea factories of the past, and perhaps their most gifted employees, have no lessons for those of us enmeshed in today’s complex world. This is too simplistic. To consider what occurred at Bell Labs, to glimpse the inner workings of its invisible and now vanished “production lines” is to consider the possibilities of what large human organizations might accomplish.”

This conclusion is frankly a significant over-reach, if not just outright wrong, since it is posited without any basis in fact contained within the book. The author never really looks at the many other parts of the Lab, the tens of thousands who worked on minuscule parts of large systems. The R&D group at Murray Hill was but a tiny part of an enterprise whose overall goal was to ensure the monopoly that AT&T had been granted by the Federal Government and to maximize the profit made from that monopoly.

To understand this one must recognize that in the old Bell System profit was defined as a return on investment, meaning the invested plant. Revenue thus equaled expense plus depreciation plus that profit construct; namely the company could charge whatever it wanted, subject to the regulators' limited control. The game was thus to maximize profit, which in turn meant maximizing the invested plant, not being maximally efficient in a competitive sense; there was no competition. Understanding the ground rules of the old Bell System is essential to the understanding of Bell Labs. No other company, save perhaps the power utilities, functioned in such a manner. This was the basis of the world view of the Labs, a world of monopolistic control.
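
The arithmetic of that construct is worth spelling out. Here is a minimal sketch, with invented numbers, of a rate-of-return revenue requirement:

    # Rate-of-return regulation: allowed revenue = expenses + depreciation
    #   + (allowed rate of return x invested plant). All numbers invented.

    def allowed_revenue(expenses, depreciation, rate_of_return, invested_plant):
        return expenses + depreciation + rate_of_return * invested_plant

    # Doubling the invested plant doubles the profit component,
    # regardless of how efficiently that plant is used.
    for plant in (1e9, 2e9):
        rev = allowed_revenue(expenses=5e8, depreciation=1e8,
                              rate_of_return=0.10, invested_plant=plant)
        print(f"plant ${plant:.0e}: allowed revenue ${rev:.2e}")

Doubling the plant doubles the profit no matter how inefficiently the plant is used, which is exactly the incentive described above.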

But the “creative destruction” of the free market did begin to surround the Labs. It surrounded the Labs in the areas in which the author appears paradoxically to make them most successful. Let me discuss just three examples.

Satellite Communications: The author speaks glowingly of Pierce and his vision of satellite communications. Yet Pierce wanted dozens of low orbit satellites, apparently driven by his desire for low time delay for voice. He wrote a paper in Scientific American proselytizing the idea. Based upon that proposal, COMSAT was formed and capitalized to fund this massive investment, not only in the space segment but also in the complex tracking earth stations. A few days after the COMSAT IPO, Hal Rosen and his team at Hughes launched Syncom I, the first synchronous satellite. Within weeks they launched Syncom II. Synchronous satellites provided global coverage with only three satellites, not the dozens demanded by Pierce's world view. COMSAT was then off with its own satellite, Intelsat I and its progeny, using not Pierce but Rosen. Somehow this minor fact is missing from the book.

Digital Switching: Fred Kappel was the Chairman of AT&T in the 60s during the time of the development of the first Electronic Switching System, the No 1 ESS. This system was developed by people such as Ray Ketchledge and others. They had deployed a computer based system, albeit one still using small analog mechanical cross-connect switches called ferreeds, which clicked and clacked. The ferreeds made the new computer elements the dog still wagged by this old technological tail, the cross-connect technology. Kappel wanted an all-digital switch and the Labs kept putting him off. But at the time he had another card up his sleeve. AT&T also owned Bell Canada and its Bell Labs entity, called Bell Northern Research. So off he went and got them to build the all-digital switch. The entity doing it became Northern Telecom, NORTEL. NORTEL subsequently became a major supplier of its new and better switches to the Operating Companies. Thus, in a true sense, Kappel used the entrepreneurial spirit of the Canadians to do what the mass of people at Bell Labs would not do.

The Internet: Now in the mid-1970s the ARPAnet was in early development and some of the basic principles were evolving from Government, Academia, and a bunch of small companies like Linkabit and BBN. ARPA, the DOD advanced research arm, had an office called IPTO, and it wanted to expand the network more aggressively using the public telephone network. Yet since AT&T was a monopoly they somehow had to co-opt AT&T to agree. A first step was to go to a meeting at Murray Hill and seek its support. So off went a couple of folks from ARPA, and at Murray Hill they were met by the standard Bell System meeting of a few dozen people. The senior person, a VP I was told, began to lecture them that if they wanted this accomplished they should just send the money and the Labs would deliver what it felt was the correct design. The ARPA folks walked away somewhat aghast and immediately reached the conclusion that they would develop what became the Internet totally independent of AT&T. This was, in a sense, the final straw, since it sowed, in my opinion, the seeds for AT&T's ultimate destruction, not the Judge Greene breakup.

The author, in my opinion, misses many other R&D entities which had a significant role in the evolution of technology, oftentimes well exceeding Bell Labs. Let me discuss just a few:

MIT Rad Lab: At the beginning of WW II Vannevar Bush set out to establish a center for R&D focusing on radar. Bell Labs had tried to capture this jewel, but Bush wanted a more innovative and competitive center, and as such he chose MIT, and from that came the Rad Lab. The Rad Lab was composed of engineers, but they were drawn from many places, and the best part was that when the war was over they went back to those many places. The Rad Lab designed radar, but radar had the same elements as communications, and specifically digital communications. Thus from the Rad Lab came such innovations as the modem, designed by Jack Harrington, to interconnect signals from distributed sites. From the Rad Lab came rapidly effected engineering systems, and the term system is critical, because the parts all worked together. From the Rad Lab came a set of books, the Rad Lab Series, which became the bible for engineers who entered the wireless world and the digital age. The Rad Lab was a petri dish that bred hundreds of engineers who went forth and created the core "startups" in the Cambridge/Route 128 area and also in Silicon Valley.

DoD Design Companies: It is well known that many of the transistor companies were driven by the demands of DOD. Many of the same types of companies in Silicon Valley and the Route 128 Corridor were driven by DOD money as well. Groups of engineers educated in the Rad Lab type entities of WW II came out and started small companies fed by DOD demands. It allowed many bright engineers to experience the "startup", albeit at the Government trough.

Thus this book has strengths and weaknesses. Its strengths are:

1. A well written story of some of the key players in Bell Labs.

2. A well described evolution of the development of the management techniques.

3. An excellent discussion of some of the major personalities in the R&D world at the time.

Its weaknesses however should be considered when taking the author’s conclusions to heart. Namely:

1. This is truly a tale written from the perspective of Bell Labs. It totally fails to consider the competitors and thus when reaching his conclusion the author does so without any basis in fact. He totally ignores the weaknesses of such a system as Bell Labs and moreover he fails to consider the alternative entities such as the Rad Lab and its offshoots. In my opinion this is the major failing of this book. It would have been much more credible and useful if the author had looked at Bell Labs in the context of the total environment; the strengths and weaknesses and the competitors and alternative models of research.

2. The monopolistic structure of AT&T was a major driver of what people did and why. The issue of return on investment being the profit, not revenue less expenses, is a true distortion of what is done and why. This world view is a formidable force. It molded what the Labs and AT&T did and why they did it. The author seems to be totally devoid of any notion of its import.

3. There were many failures at Bell Labs, and those failures were never truly perceived by those within the system, and it was this blind spot that in my opinion also led to its downfall. The author missed a great opportunity to follow up on this. Instead we see all these Herculean minds making great successes and yet the system collapses.

4. Bell Labs was enormous in size and scope at its high point. I spent time at Holmdel, Whippany, Indian Hill, Andover, and even a brief stint at the remains of West Street. Yet the focus is on Murray Hill, and a small part of a small part at that. This is especially disturbing in light of the author's global conclusion, which is reached without a single discussion of these areas. To do Bell Labs justice one must perforce cover these as well. The Pierce, Shockley and Shannon tales are told again and again, but the efforts of the hundreds of thousands of others over the decades remain silent. In the presentation by the author before a mostly former Bell Labs audience it was clear that my observation on this point had substantial merit.

Overall there is a significant story to be told but this author does not accomplish it. In fact the author’s statement denigrating the entrepreneur and the process of “creative destruction” is made without any attempt to understand the difference between a monopolistic structure and competitive markets. Perhaps if we had kept the old paradigm we would still have our black rotary dial phones.

Having a Grasp of Basic Facts

In Captive Audience, Crawford develops her argument through the vehicle of the Comcast and NBC merger. She is, in my opinion, a writer who had a good tale to tell but was ill equipped to tell it.

This is a difficult book to review. Not because of some of its conclusions, with which I am in hearty agreement based upon hands-on experience, but because the author all too often steps well outside her area of expertise and opines on things which are just wrong.

Moreover this is one of those books on Amazon which has some sort of following. No sooner had I posted my review than people started marking it as not helpful, without comments, at least as yet. Now this review gores both oxen: the cable company and the clique who seem to follow what the author writes with religious zeal. It will be interesting to see how these results evolve.

However her conclusions are of merit despite the confusion of her argument.

1. “Cable Companies are bad”. She has some very valid points here, as she demonstrates through the vehicle of the Comcast and NBC merger. She argues that such a merger should never have happened. One could provide substantial grounds for preventing it, mostly on antitrust issues, but they were never truly approached by the current administrations. The reason why is a tale unto itself.

2. “Fiber is the only way.” Here I argue she is clearly wrong and is so since she does not understand the technology. Since this is a key point in her argument one wonders why she did not at least reach out to find better support and understanding.

3. “Government is the best market regulator.” This is an extreme position which has mixed, at best, results. In fact it has been clear in the technology space that the Government is the worst regulator.

Let me address the above points and relate them to the text itself:

1. Wireless has substantially more capacity than the author understands.

2. The cost of fiber is dominated by local costs such as franchise acquisition, costs of pole attachments, and the delay costs of laying the fiber.

3. There exists a significant body of law, the antitrust laws, which can and should be used to manage the industry, not just regulation.

4. Cable companies are monopolies for the most part and should be regulated as such.

Let me now provide some details contained within the book specifically supporting the above:

On p 78 the author speaks of the abandonment by Verizon of fiber deployment. Why did Verizon abandon its buildout? Frankly there are two reasons. First, there were the exploding legal costs and delays in getting local franchises. These were exacerbated by the local cable companies but facilitated by the local towns, who often did not understand the economics of competition; they just asked for more, as they were advised by the incumbent cable operators. Second, and this is a critical element, was the success of wireless in expanding bandwidth efficiency. Namely, with only 1 bps/Hz a decade earlier, they were now at almost 10 bps/Hz and could see ultimately even another order of magnitude increase. This focus on wireless was most evident in the succession to the CEO position, with the wireless head taking the helm. Thus it was to some degree a problem with the incumbents, but it was also an understanding that the wireless alternative was more than viable.

On p 90 there is an interesting discussion regarding the “interstate access charges”. In fact these were the interconnect fees. The author refers to the Prodigy effort, but that effort was doomed from the start by the massive overhead put on it by IBM, while at the same time facing the overhead of AT&T. The access charge issue is a simple one. At that time there were local phone companies and long distance ones.

The local companies demanded and received a fee for interconnection. Even though the local companies were separately paid by the customer, they were allowed by the FCC to impose this charge on third parties such as AOL or Prodigy. Fortunately the FCC abandoned this stance. The author seems not to have fully understood this issue.

On p 95 the author tries to outline the history of online capabilities using AOL and Time Warner as an example. In fact it began in 1978 with Warner Cable and the QUBE system. This was the first two-way cable system that allowed interaction and online purchasing. Thus Warner, and perforce Time Warner, had been developing this for almost two decades. In the early 1980s Warner Cable developed the first “Electronic Shopping Mall”, a two-way video on demand system, in a joint venture between Warner, Bank of America and GTE, with Bell Atlantic and DEC participating. That effort collapsed when Warner ran into financial difficulties. Chase Bank and others did the same during the videotex period. The author appears to posit this as a sudden event with Time Warner and AOL when in reality there had been many trials, tests, and attempts.

On p 125 the author states that Edison invented the telegraph. What of Morse? Perhaps some checking of simple facts would help.

On p 129 and onward the author refers to Sen. Franken so many times that one wonders why. The book was not written by Franken, and based upon his public record he was both new to the office and definitely not an expert in regulatory issues and technology. This continual referencing becomes a distraction.

On p 133 there is a discussion of the new channels being cash cows. However there is a very serious issue here. The cable companies bundle up packages of programs, which they also own, and demand that anyone wanting one take the full package at premium prices. The consumer gets the full sports package and pays for it whether or not they have ever watched a single sports event. This is the major failing of the FCC and the FTC. Legally this is akin to bundling, a practice clearly prohibited by the antitrust laws. But to date the DoJ has never acted upon this, nor has the FTC.

On p 156 and onward the author delves into the cable versus wireless issue, and here she is well out of her depth. It is a pity, because this area is a significant one for possibilities. Let me first outline the wireless argument and then return to the text:

1. Wireless capacity can be measured by the number of bits per second, bps, it can deliver to a user.

2. The user demands bps depending on the application and the number of applications the user may have. For example, HDTV has been a big user of bandwidth.

3. Now two things have occurred technically over the past ten years. First, bandwidth efficiency, measured in bps/Hz, has increased from 1 bps/Hz to now 10 bps/Hz. At the same time the data rate required for video has collapsed, going from 100 Mbps down to 4 Mbps. Thus supply, that is bps/Hz, has exceeded demand, namely Mbps. We can now easily use wireless for HDTV.

4. The acquisition of spectrum by the wireless companies has continued and now provides almost universal service. Wireless does not require franchises or pole attachments, and can be deployed in short order.

5. Wireless efficiency, now at 10 bps/Hz, is anticipated to increase to 100 bps/Hz. That means that 20 MHz of spectrum could provide a 2 Gbps channel to a single user, and with multibeam antennas it can do so for a plethora of users. This makes wireless a direct competitor to fiber, and at a tenth the cost! The arithmetic is sketched below.
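
The arithmetic behind points 3 through 5 is simple enough to check directly. A minimal sketch in Python using the figures cited above, where capacity is spectral efficiency times bandwidth and a compressed HDTV stream needs roughly 4 Mbps:

    # Wireless capacity = spectral efficiency (bps/Hz) x bandwidth (Hz).
    # Figures are those cited in the text above.

    def capacity_bps(efficiency_bps_per_hz, bandwidth_hz):
        return efficiency_bps_per_hz * bandwidth_hz

    bandwidth = 20e6      # 20 MHz of spectrum
    hdtv_demand = 4e6     # ~4 Mbps per compressed HDTV stream

    for eff in (1, 10, 100):   # a decade ago, today, anticipated
        cap = capacity_bps(eff, bandwidth)
        print(f"{eff:>3} bps/Hz -> {cap / 1e6:>6.0f} Mbps "
              f"(~{cap / hdtv_demand:.0f} HDTV streams)")

At 100 bps/Hz the 20 MHz channel reaches 2 Gbps, the figure given in point 5.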

On p 160 the author again displays her lack of technical understanding. She states:

“When people want to download a lot of data, say to make a video call, they overwhelmingly opt for high speed wired connections.”

Perhaps she has not been made aware of the iPad.

This distortion continues throughout the chapter. She does it again on p 161.

On p 251 she states:

“Will wireless help America reach the president’s goal of one gigabit to every community? No.”

The answer is yes, and since the wireless companies have hundreds of MHz, not the 20 MHz above, they can well exceed that.

On p 258 she describes the franchises as being exclusive. In fact almost all were non-exclusive. The problem was the cost of overbuild.

On p 263 she demands: “For starters most Americans should have access to reasonably priced 1-Gb symmetric.” Now I assume she means 1 Gbps, not 1 Gb; it is a rate, not a totality.

On p 265 she begins her argument for moving to a utility model. She states: “To do this though America needs to move to a utility model.” Frankly the issue should be one of bundling or tying, and the control is the existing antitrust laws. The problems with the utility model are all too well known. The FCC controlled and was controlled by AT&T before divestiture. The result was very slow development of technology and capability. The utility model sets prices based on a return on investment; namely the provider is guaranteed a profit based on invested capital, and its costs are covered no matter how inefficient it is. The result is a capital intensive and bloated system. Better would be a real competitive market where the barriers to entry are not enforced by the Government, but the Government enforces the antitrust laws already on the books.

On p 267 she also makes statements regarding the costs of fiber. Based upon my experience doing this, her numbers are categorically wrong. The most significant cost not included is the franchise acquisition cost, often in excess of $1,000 per subscriber, plus the costs of pole attachments and the delay costs associated with dealing with local regulators.

On p 267 she further states: “The government standardizes, regulates, provides tax subsidies, and puts price supports in place every day.” One could just imagine the Government standardizing wireless or broadband structures. They have no technical depth, and furthermore the politics that would ensnarl such an effort would be unimaginable. The Government should just stand apart from any standards. Let the technical people deal with that, and live or die by their decisions.

On p 269 she gets into the Internet discussion. Again, for some reason, she uses Franken as a foil, but it is a distraction. The fact is that indeed ARPA, specifically IPTO, developed the early Internet deployment in the 1970s. In fact I ended up with the task of deploying the international circuits for IPTO. Through the early 1980s development somewhat slowed, but with the help of Kahn and Cerf the IETF was formed and began an open development of what could be called standards, albeit very flexible ones. Then the DOD abandoned it and spun off a separate network, and the result almost went nowhere, but at the end of the 80s we saw academic networks such as NYSERNET evolve and the NREN come forth. Thus the Internet history is a mixed bag of public and private parentage, and the bright line alluded to by the author is without merit.

The book is worth reading, but only if one can work through the mire of the author's statements for which she has no basis, or those which are just outright technical errors.

The classic book on telephone industry change is Coll's Deal of the Century, outlining the breakup of AT&T. Coll is a brilliant writer who deals with facts he both understands and can explain. The author of this book had such an opportunity, but she clearly went well beyond her ken, and the result is that between the facts and opinions sit prognostications based on fact-less presumptions. The issue she is focusing on is truly an important one and needs as much public understanding as possible. The cable companies have secured for themselves a protected niche and have further vertically integrated in a manner which late 19th century antitrust minds would find abhorrent. This is ever so true in view of the channels they control: information and communications.

Monday, April 22, 2013

Public Intellectuals and Religion

I have written extensively on the “Public Intellectual”, a term often self-applied as a term of adulation. But this becomes ever so more an issue when that person also assumes the mantle of religious spokesperson, at least for their point of view.

The University of Notre Dame, that football school out in the mid-west, is hosting a conference on this which is described as[1]:

This international conference, hosted by the Notre Dame Institute for Advanced Study, will focus on the roles played by public intellectuals—persons who exert a large influence in the contemporary society of their countries by virtue of their thought, writing, or speaking—in various countries around the world and in their different professional roles. Leading experts from multiple disciplines will come together to approach this elusive topic of public intellectualism from different perspectives.

The agenda includes such topics as[2]:

1.     The Religious Leader as Public Intellectual
2.     Islam and the Public Intellectual
3.     The Blogger as Public Intellectual
4.     The Economist as Public Intellectual
5.     The Former Diplomat as Public Intellectual
6.     The Philosopher as Public Intellectual

Now what do those who fill these roles have to say, is it of any value, and if not, who are the putative true Public Intellectuals in their areas who have such statements to make? We examine these issues somewhat herein. Moreover there is the compelling question of communitarianism versus individualism. Namely, as one looks at the religious issue, examining the now current issue of Social Justice, are we individually responsible or responsible as a community?

If one examines the recent Treatise from the Vatican[3], one would gather that the Church has almost eliminated individual responsibility in favor of communitarian approaches; in fact the very call of Social Justice is Justice emanating from Society as a whole and not from the individual to others. Social Justice demands that the individual be sublimated to the community, to society, and that there is some magic group of all-perfect society managers who make the “right” decisions for all. The Treatise specifically states (p 145, from Vatican II):

"If economic activity is to have a moral character it must be directed to all men and to all peoples. Everyone has the right to participate in economic life and the duty to contribute, each according to his own capacity, to the progress of his own country and to that of the entire human family. If, to some degree, everyone is responsible for everyone else, then each person also has the duty to commit himself to the economic development of all."

Now economic activity is neither moral nor amoral in and of itself. It is akin to farming, to hunting, to breathing, perhaps even to reproducing. The act has a moral element when and only when it causes an immoral act; that is, when it causes a robbery or a murder. Now viewed in economic terms, the statement "each according to his own capacity..." is as Marxian as "from each according to his means, to each according to his needs..." The statement is full of communitarian elements and devoid of individualistic responsibilities. This is a prototypical statement from Vatican II, and in ways it is, as we shall discuss, the basis of the conflict resulting therefrom as described by Wills.

The concept of there being a Social Doctrine of the Church is in many ways a total denial of individual responsibility. Yet the very essence of the teachings in the New Testament is directed towards the Individual. Thus in a way the progression of the Church’s doctrine has been in response to the Liberalism and Enlightenment doctrines, and rather than emphasizing the Individual duties the response was to create a parallel universe which “agrees” with these “pagans” but uses words more akin to what the Vatican would be comfortable with.

In an interview, one of the speakers, the religious one, defines Justice as follows[4]:

For me it is a commitment to justice, to making the good of others, the good of the world community the ground on which I make choices. Justice is more than niceness or random acts of kindness. Justice is a principle of life. For example, from a religious perspective in this day and age, I believe justice requires me to have as much respect for Islam, as I do for Catholicism. We do Islam a great disservice if we judge the entire tradition on the basis of a few radicals. My spirituality directs me to recognize the spiritual foundation, the truth and vision in every religion.

John Ryan, a Catholic priest writing from the late 19th through the mid-20th century, wrote a great deal on the concept of Justice, and Distributive Justice as he termed it. His view was that it was necessary to take from the rich to support the poor, and he was a severe critic of capitalism. Moreover his view was that “society” owed the poor, and he saw the individual qua individual as irrelevant.

As Ryan states[5]:

“The Christian conception of the intensive limitations of private ownership is well exemplified in the action of Pope Clement IV, who permitted strangers to occupy the third part of any estate which the proprietor refused to cultivate himself. Ownership understood as the right to do what one pleases with one’s possessions is due partly to the Roman law, partly to the Code Napoleon, but chiefly to the modern theories of individualism.”

What Ryan says is that no individual has a true and clear right to their property. In fact the Clement argument is one of using fallow land. Farmers left one third of their land fallow so as to allow it to recover. Clement then believed that fallow land should be put to use, albeit destroying its future value. No one has ever claimed Popes were infallible in science and agriculture.

Ryan then quoted Herbert Spencer:

“Violence, fraud, the prerogative of force, the claims of superior cunning, these are the sources to which these titles may be traced. The original deeds were written with the sword rather than the pen; not lawyers but soldiers were the conveyancers; blows were the current coin given in payment; and for seals blood was used in preference to wax.”

Strange as this may be as a quote, it does allow him to draw in Spencer, the strong promoter of 19th Century individualism, to justify his claims; a claim that simply states that no property has any “legal”, read “moral”, basis. In fact Ryan’s belief is that not only should the one third be shared but that there should be no private property at all.

The problem here is a serious one. In the Gospels there was always an acceptance of several important issues:

(1) What was the State was the State’s and what was God’s was God’s. For example you had to obey God’s law but if that violated the State’s law so be it and you thus suffered the consequences. Thus the martyrs. God’s law was imposed on the individual, not the group, martyrs died individually.

(2) The duty to obey God’s law was incumbent upon the individual. This was to a degree a break with the Old Testament Law as applied to the Israelites, who were viewed as a group. Jews as a group, and to a degree as individuals in the Group, had the duty. But since the Christians were polyglot and disparate, the group identity was abandoned in terms of their religious duty, and thus the burden was on the individual. What did “you” do, not what does the group do.

Now who are putatively the true Public Intellectuals in the area of religion? I would argue that Garry Wills fits the profile quite well. He is a true intellectual, as evidenced by his wealth of experience and understanding; he addresses key issues worth discussing, and he does so with a level of expertise worthy of an intellectual. Finally, he communicates in a manner which is readily accessible to a wide, albeit educated, audience, yet he does not write for the specialist.

I am reminded of my first Wills book, “Bare Ruined Choirs”, published in 1972 and reflecting on Vatican II and the change in the Catholic Church. He was not an apologist, far from it; he demonstrated a breadth of understanding that exceeded the classic religious writer. One may not agree with Wills on everything, and I am one who has many points of disagreement, but his writings are always worth reading. He is not a Hans Kung; namely he does not personalize and internalize his invective, and in fact one finds no vitriol in his words.

Thus who are the true Public Intellectuals? Posner[6] took a cut at that a few years ago, and I wrote a brief work examining that plus the issue of individualism[7]. The Posner discussion attempts to demonstrate the decline of the Public Intellectual. One can readily argue that the decline may in many ways have been a collapse after the advent of the Internet.

Let us examine two of the speakers at this conference; strangely both are somewhat obese, in my opinion, which may in itself reflect an attitude. The religious public intellectual purports to be both a contemplative and a rather outspoken public figure. Her writing is nowhere near that of Wills, and in fact can be somnolence-inducing. She appears to be confrontational with the Vatican and has developed a clan of believers who may be almost cultist in nature. She lacks the true intellectual base, and her issues are ephemeral rather than of substance.

The economics representative is acerbic and critical in the extreme of those with whom he differs. If one examines his blog one sees a flow of almost venom-like comments against those who show even the slightest level of disagreement. Both, however, apparently believe in the anti-individualism of the left.

Finally, I return to Wills and a possible cause of all this mess. I would argue that it was the strength of the arguments of Augustine, as demonstrated in the brief book by Wills[8], that set the path to this point of view. Strange as it may be, for on the one hand it was Augustine who introduced individual salvation via Grace, but on the other hand he institutionalized Original Sin, a communitarian artifact. Wills argues that there are hidden expressions of Augustine in such works as his Confessions, and I would argue that there are indeed.

Augustine became a believer in the evil of sex, and that in many ways was a result of his own lurid and self-centered life. He fathered a child and then abandoned him; the child eventually died at a young age. He abandoned his “wife” and took up the Faith, even though his “wife” was a Christian before him. As they say, the worst prohibitors of habits are those who have forsaken them; thus former smokers are adamant anti-smokers, and converts are more true believers than those born into the faith. Much of what we trouble over in today’s Catholic Faith may emanate from Augustine; the Reformation and Luther were a direct result therefrom. Perhaps Pelagius was right, as were the Donatists.

[3] See Compendium of the Social Doctrine of the Church, US Conference of Bishops, 2004.
[5] See Ryan, Distributive Justice, Macmillan, 1916, p. 23.
[6] See Posner, Public Intellectuals: A Study of Decline, Harvard U Press, 2003.
[8] Wills, G., Augustine, Viking, 1999.

Sequencing What Gene?

Today in the NY Times there is a story about the major hospitals sequencing the genes of cancer patients. The article states:

Sequencing an entire genome currently costs in the neighborhood of $5,000 to $10,000, not including the interpretation of the information. It is usually not reimbursed by insurance, which is more likely to cover tests for genetic mutations that are known to be responsive to drugs. The treatments themselves, which are sometimes covered, typically cost several times that. 

Even optimists warn that medicine is a long way from deriving useful information from routine sequencing, raising questions about the social worth of all this investment at a time of intense fiscal pressure on the health care system. 

“What’s the real health benefit?” said Dr. Robert C. Green, a Harvard professor and a medical geneticist at Brigham and Women’s Hospital in Boston. “If you’re a little bit cynical, you say, well, none, it’s foolish.” 

The real question is "what gene?" Namely, are we sequencing the genes of the person, the ones which made them what they are, or are we sequencing the genes of the cancer? And even worse, which cancer cells, since they have the nasty habit of mutating at a rather rapid rate?

The germ line genes may or may not tell us a great deal, unless we understand what went wrong and when. Very few cancers are germ line related; most are somatic. Stuff happens, and then it happens again and again.

As we have argued in our recent work on cancer cell dynamics, there is a highly complex but measurable and modelable process for examining this complex somatic progression. Furthermore we have explicitly demonstrated that for prostate cancer and melanoma.

Specifically any such program should:

1. Catalog the germ-line genes. It is always good to know where you are starting from.

2. Monitor the somatic genes of a cancer as it progresses.

3. Understand the "expression", not just the genes, since expression is modulated by epigenetic factors such as methylation and miRNAs.

4. Perform the analysis using a fact based model which is spatially and temporally grounded, along with recognizable mutation paths (a toy version is sketched after this list).

5. Validate the models and use them for prognostic purposes.
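
As an illustration of what item 4 might look like, here is a minimal sketch of a purely temporal progression model in Python. The states, the pathway, and the monthly transition probabilities are all invented for illustration; a real model would be fitted to measured somatic mutation data and would add the spatial dimension.

    import numpy as np

    # Hypothetical somatic progression pathway; the states and the monthly
    # transition probabilities below are invented for illustration only.
    states = ["normal", "PTEN loss", "second hit", "metastatic"]
    P = np.array([
        [0.995, 0.005, 0.000, 0.000],
        [0.000, 0.990, 0.010, 0.000],
        [0.000, 0.000, 0.980, 0.020],
        [0.000, 0.000, 0.000, 1.000],
    ])

    dist = np.array([1.0, 0.0, 0.0, 0.0])   # start in the normal state
    for month in range(120):                 # ten years, one step per month
        dist = dist @ P

    for s, p in zip(states, dist):
        print(f"{s:>12}: {p:.3f}")

A validated model of this general type, extended spatially, is what would serve the prognostic purpose in item 5.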


Wednesday, April 17, 2013

Mathematics and Language

As usual Frances Woolley has written a spot-on piece about "language", in this case about mathematics and economics.

She states:

Bad notation makes a paper difficult to follow. Papers that are hard to read and understand get rejected, or receive lower grades. But what makes for good notation? 
 
First, symbols should be easy to remember....

(Second) ... avoid multiple subscripts or subscript/superscript combinations whenever possible...

(Third) ... One other rule is ... Generally speaking, greek letters are used for parameters of the model ...

(Fourth) ... Once one has a model, one has to figure out what goes in between the equations (hint: economics). Every symbol used in a paper needs to be defined clearly the first time it is mentioned. If the symbol has not been used in a while, it is a good idea to give the reader a hint as to what it means.

Now I have struggled with this for decades. Here is a variation on Woolley, written with my own set of rules:

1. Write for your audience. If you are writing for pure mathematicians then you need all that stuff. If you are writing for the real world then just say what you mean and no more. You are NOT a pure mathematician and who cares if you have a Banach space. You never use that fact.

2. Follow Ockham:

a. Say as little as you need to explain the concept. Do not add all sorts of stuff to show how "smart" you are.

b. Be a nominalist; namely there are no abstract Platonic constructs, only the reality of the moment.

3. Write equations so that they are obvious. If w is a variable that depends on x and t then write w(x,t). That's all, no more. If w is one of i=1,...,n elements then write a simple subscript. As Frances says, choose variables that are obvious: x is good for position, t for time, and then go from there (see the example after this list).

4. Define it and repeat the definition as often as necessary. Why? Because I may not have read the obscure definition hidden in the middle of paragraph 23.

5. Structure your presentation. Long paragraphs are tedious. Number or bullet your points. Highlight the point that you are making. Unless you are a pure mathematician, do not assume it is clear to the reader, or that you can leave it as a simple exercise for the reader.
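
As an illustration of rules 2 and 3, here is a minimal, hypothetical example written in LaTeX. The dependence of w on position x and time t is explicit, each symbol carries at most one subscript, and the Greek letters are the parameters of the model.

    % w depends on position x and time t, so write w(x,t).
    % Greek letters (alpha_i, lambda_i) are the parameters of the model;
    % phi_i(x) are known basis functions, one subscript each, no more.
    w(x,t) = \sum_{i=1}^{n} \alpha_i \, \phi_i(x) \, e^{-\lambda_i t}

A reader can parse this without hunting for definitions: the dependencies are in the notation itself.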

I have found that economists try to pretend that they are pure mathematicians. I took a few pure mathematics courses at MIT and I can assure anyone that I am not, nor will I ever be, a pure mathematician. However I am a reasonably good engineer, so I write equations as an Ockhamist. I try to think of the reader, since I am using a language that should be clear and simple.

Sunday, April 14, 2013

High Tech Start Ups and New York City

Now I have done almost a dozen start ups and half a dozen turn arounds. In addition I was born in New York City and worked there as well. The new Cornell center is described in the NY Times as follows:

Though Cornell and the Technion are taking it further, the relationship between most engineering and computer science schools and the business world is already so fluid as to startle someone with a liberal arts background. Professors routinely take breaks from academia to go into business. Former students and professors create companies based on work done within university walls and reach back into them to collaborate and recruit talent. Universities often own pieces of new ventures. 

This kind of cross-pollination helped create thriving tech sectors in the areas surrounding the Technion, Stanford, the University of Texas and the Massachusetts Institute of Technology — something Mayor Bloomberg wants for New York. And it is of growing importance to universities, not just for their ability to draw top faculty and students, but also for their finances. “Technology transfer,” the private-sector use of university-born innovations, has become a multibillion-dollar source of revenue for schools. 

When Mayor Bloomberg asked business leaders about the city’s economic prospects, the complaint he said he heard most often was a shortage of top-notch talent in computer science and engineering. The hope was that a new graduate school could turn the tech sector into another pillar of the city’s economy, like finance, medicine and media. In 2010, the mayor announced a contest, offering city-owned land on Roosevelt Island worth hundreds of millions of dollars, and up to $100 million worth of capital improvements. 

Now New York is NOT Cambridge, it is not Silicon Valley, and it is certainly not Tel Aviv. It is New York. It costs a fortune to live there, the taxes now beat even California's, getting around is impossible, and the availability of start up space is just zero!

Also start up folks are born, not made, and they often come as one leader and a few good followers. I have always seen teams, but I have always seen the leader. I was with my first start up in 1969, funded by EG&G's venture arm, and it went belly up. I was just a technical consultant, but I learned more from that failure than almost anywhere else. There were lots of reasons, but the primary one was not paying attention to details.

I also found on my third return to MIT, or it may have been my fourth, that by the mid 2000s my doctoral students thought that to do a start up you needed a Harvard MBA and/or a lawyer to head it up, and that engineers could not do so. My reply: "What do I look like, chopped liver?" I still do not know how that would be translated into Mandarin.

What is needed is a critical mass of the right technical people in an economically livable environment, with access to good ideas and capital, as well as a technical support network. And New York City is, in my opinion, the last place in the world to find that!

One need just look at the MIT area. There is a wealth of talent, plenty of low cost facilities, plenty of capital, and plenty of places where one can live on a low budget. To some degree that is also the case in certain areas of Silicon Valley, at least for those renting. Yet New York City would at best require a substantial commute, and that back and forth eats up a great deal of the creative spirit. Brooklyn is becoming more costly, the Bronx is still a long commute on almost any train, office space is costly, and that does not even include the union problems. One must remember that almost everywhere in the city one cannot so much as drive a nail into a wall without union labor at extreme cost.

Go across the river to New Jersey and the world does change on several fronts: lower cost real estate, lower cost living, and easier travel. Yet the symbiotic melding of like minds is not there; in New Jersey there are no MITs or Stanfords. So where in New York will this work? Roosevelt Island? In my opinion, that is highly unlikely.

New York is great for finance and entertainment, for deals and dealers. It would hardly, in my opinion, be great for high tech start ups. On the other hand, there are tremendous opportunities there which can be monetized elsewhere. It is a gold mine of such opportunities, but actually doing a raw start up there is more than problematic.

Saturday, April 13, 2013

Hobby Shop

Making something real is often the best way to understand what really works. I thought this video would be of value.

Wednesday, April 10, 2013

A Toaster with a Faucet

Form follows function, or something like that. Industrial design is, I gather, an art, and like most art it is best seen through the eye of the creator. I have known a few industrial designers in my day, and I fear that they all share the belief that their embodiment of reality is the only true path. Sometimes it may work. Apple is an example of a very successful niche market. Microsoft is not in that league, however.

Take Windows 8. I think it is a disaster. Windows 7 works; it is XP upgraded. It is not Vista. It is a utility: it works, it is easy to use, and I did not have to change my world view. My Kindle is OK, not great, could be better, but compared to the Nexus 7 it at least works. Google seems unable to get hardware to work, but arrogance does that to you.

Now IDC reports what appears to be a collapse of the PC market. As reported by CBS Newswatch:

Global shipments of PCs fell 14 percent in the first three months this year, the sharpest plunge since research firm IDC started tracking the industry in 1994. The firm said Wednesday that the appeal of tablets and smartphones is pulling money away from PCs, but it also blames Microsoft's latest version of Windows, which has a new look and forces users to learn new ways to control their machines.

Microsoft launched Windows 8 on Oct. 26, hoping to revitalize the PC industry and borrow some of the excitement that surrounds tablets. PC shipments were already falling, but the latest report suggests the decline is speeding up. "Unfortunately, it seems clear that the Windows 8 launch not only didn't provide a positive boost to the PC market, but appears to have slowed the market," IDC Vice President Bob O'Donnell said.

Now let us go back to that form and function thing. Why do I use a PC? Well, I write on it, I do spreadsheets, I prepare presentations, I store and retrieve massive amounts of data, I use a wealth of special software for various analyses, I use image storage, and the list goes on. It is the engine that keeps me going forward. Now why do I use a pad, like a Kindle? I get email, I can read a document, somewhat, I can get to the web, and it is small and easy to use. I do not play games, so that leaves out a ton of usage, and I have no use for social media, too distracting.

As the BBC states:

IDC said 76.3 million units were shipped, a figure that underlines the appeal of tablets and smartphones as alternatives to PCs. The firm said Microsoft's latest version of Windows had failed to revitalise the industry. Recession had also led companies to put back renewal of their PCs, IDC said.
The firm's vice president, Bob O'Donnell, said: "Unfortunately, it seems clear that the Windows 8 launch not only didn't provide a positive boost to the PC market, but appears to have slowed the market."

Windows 8 is designed to work well with touch-sensitive screens, but the displays add to the cost of a PC. Together, the changes and higher prices "have made PCs a less attractive alternative to dedicated tablets and other competitive devices," Mr O'Donnell said. Microsoft was not immediately available for comment. IDC also said that, traditionally, companies replaced PCs every three years, but that during the economic downturn this was more likely to be every five years.

Thus for me the PC is a toaster: you put bread in the top, push the handle, and a few tens of seconds later out comes toast. But what would happen if you decided to add a faucet to it? The first question is why? Well, say the industrial designer and the marketing folks, because the other guys have that on their pads. Well, make a faucet and put it on a sink. Don't attach the faucet to my toaster!

But again, we have seen this tale before: it was Vista. People view the operating system today like a utility. It is an electrical outlet: I just want to plug stuff in and have it work. Microsoft does not want to be viewed that way; it thinks it is much more than that, and every time they try to show us, they mess it up again.

The issue is simple. Keep the PC for what it does well. It is a great client and even a server. It has great processing capability and storage capacity. It is not mobile, and for what many people use it for, it never will be. I have a laptop; it is portable, it goes from one place to another. Yet it is not really mobile. I have my Kindle; it is about as mobile as I want to get. But that is me.

Thus designers and marketing people must understand use and users. What was wrong with Windows 7? Nothing; it was great. That stupid screen that opens on Windows 8 on a high capacity PC is just an annoyance. Did anyone at Microsoft ever speak to a customer, really? How about listening to the customers as well? That would be a first. I really do not want a faucet on my toaster.

Tuesday, April 9, 2013

More On Circulating Tumor DNA

It seems that there is a significant amount of new work being done on evaluating cancers via circulating tumor cells and their DNA. Another paper in Nature states:

Cancers acquire resistance to systemic treatment as a result of clonal evolution and selection. Repeat biopsies to study genomic evolution as a result of therapy are difficult, invasive and may be confounded by intra-tumour heterogeneity. Recent studies have shown that genomic alterations in solid cancers can be characterized by massively parallel sequencing of circulating cell-free tumour DNA released from cancer cells into plasma, representing a non-invasive liquid biopsy. 

Here we report sequencing of cancer exomes in serial plasma samples to track genomic evolution of metastatic cancers in response to therapy. Six patients with advanced breast, ovarian and lung cancers were followed over 1–2 years. For each case, exome sequencing was performed on 2–5 plasma samples (19 in total) spanning multiple courses of treatment, at selected time points when the allele fraction of tumour mutations in plasma was high, allowing improved sensitivity. 

For two cases, synchronous biopsies were also analysed, confirming genome-wide representation of the tumour genome in plasma. Quantification of allele fractions in plasma identified increased representation of mutant alleles in association with emergence of therapy resistance. ... treatment with gefitinib ...

These results establish proof of principle that exome-wide analysis of circulating tumour DNA could complement current invasive biopsy approaches to identify mutations associated with acquired drug resistance in advanced cancers. Serial analysis of cancer genomes in plasma constitutes a new paradigm for the study of clonal evolution in human cancers.

Cancer Research UK commented on the work as follows:

Scientists ... used traces of tumour DNA, known as circulating tumour DNA (ctDNA) found in cancer patients’ blood to follow the progress of the disease as it changed over time and developed resistance to chemotherapy treatments.  

They followed six patients with advanced breast, ovarian and lung cancers and took blood samples, which contained small amounts of tumour ctDNA, over one to two years.

By looking for changes in the tumour ctDNA before and after each course of treatment, they were able to identify which changes in the tumour’s DNA were linked to drug resistance following each treatment session.

Using this new method they were able to identify several changes linked to drug-resistance in response to chemotherapy drugs such as paclitaxel (taxol) which is used to treat ovarian, breast and lung cancers, tamoxifen which is used to treat oestrogen-positive breast cancers and trastuzumab (Herceptin) which is used to treat HER2 positive breast cancers.

And they hope this will help shed new light on how cancer tumours develop resistance to some of our most effective chemotherapy drugs as well as providing an alternative to current methods of collecting tumour DNA – by taking a sample direct from the tumour – a much more difficult and invasive procedure.
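
To make concrete what "looking for changes in the tumour ctDNA before and after each course of treatment" might look like computationally, here is a minimal sketch in Python; the read counts, sample labels, and the allele_fraction helper are invented for illustration and are not from the paper:

# Minimal sketch: tracking a mutant allele fraction across serial plasma
# samples. All numbers are invented; a rising fraction after a course of
# treatment is the kind of signal the paper associates with resistance.

def allele_fraction(mutant_reads, total_reads):
    # Fraction of sequencing reads at a locus that carry the mutant allele.
    if total_reads == 0:
        raise ValueError("no coverage at this locus")
    return mutant_reads / total_reads

# Hypothetical serial samples: (label, mutant reads, total reads) at one locus.
samples = [
    ("baseline",        12, 1200),
    ("after course 1",  30, 1000),
    ("after course 2", 150, 1100),
]

previous = None
for label, mutant, total in samples:
    af = allele_fraction(mutant, total)
    trend = "" if previous is None else (" (rising)" if af > previous else " (falling)")
    print(f"{label}: allele fraction = {af:.3f}{trend}")
    previous = af

A steadily rising fraction for a given mutation across treatment courses is the pattern the authors link to an emerging resistant clone.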

As we noted in a previous post regarding the same set of procedures by other researchers, this is a useful method for detecting the progression of cancer.

However the following observations are of note:

1. Are these coming or going cells? Namely, are the cells on their way to a metastasis or the result of one?

2. Can we use these cells to determine the changes in DNA expression as the cells progress?

3. How effective a prognostic tool are these measurements?

4. What therapeutic methods can be applied now, knowing this information?

Thus, is this data of primary use or secondary? Notwithstanding its clinical use, it does represent an excellent tool for tracking genomic progression.

Reference:

Murtaza, M., et al., Non-invasive analysis of acquired resistance to cancer therapy by sequencing of plasma DNA, Nature (2013).

MIT vs Harvard: Location, Location


MIT is expanding Kendall Square, having gotten approval for almost 1 million square feet of new high tech office space, biotech specifically. As The Tech reports:

The rezoning allows for up to 980,000 new square feet of commercial development and at least 240,000 new square feet of residential development, in addition to the 800,000 square feet currently permitted for academic (including dormitory) uses. In some regions, the rezoning permits buildings as high as 300 feet, taller than the Green Building.

The Green Building is the tallest building on the MIT campus, but what one sees looking around is ever increasing building height and ever expanding coverage. The main problem is that there is but one T stop, and very poor parking and traffic flow. In fact, it is a disaster.


One sees taller and taller buildings and more and more people coming into the area. Frankly, that cannot be stopped, and if managed properly it can be a tremendous asset to MIT.

Now what of Harvard? Unfortunately, Harvard does not have such a convenient real estate location, at least in the Square. So does location portend destiny? Perhaps in this case. This section of Cambridge is exploding with biotech, MIT has been the intellectual focus, and frankly this may be the obvious evolution of MIT, from a pure tech school to an expanded biotech institute. Ironically, the driver for this may very well have been real estate.

Location is destiny, or at least a large part of it. Harvard Yard is a closed 17th century artifact; MIT is an unbounded amalgam of ever skyward buildings. I await the first true skyscraper, a bit of Manhattan, with an attitude to match.

An Interesting New Cancer Technology

The challenge, in determining whether a cancer has metastasized, is to find out where and how much. The classic approach is to look at the local draining lymph nodes and see if it has gone there. However, cancer cells may often escape through the blood system rather than the lymph system. Consider ocular melanoma: there is no lymphatic connection, and it spreads by hematological means only.

That means that by examining the blood we should be able to find the wandering malignant cells, at least in theory. A recent release by MedGadget relates developments at MGH in Boston as follows:

Circulating tumor cells (CTCs) are shed by primary tumors and allow the cancer to metastasize to the distant sites. While this is a devastating tool in cancer’s war chest, it offers clinicians a marker through which to diagnose and monitor progress of the disease. Since the discovery of CTCs over a hundred years ago, researchers have been developing ever more sensitive methods of capturing them since they’re extremely rare in whole blood.

In a recent development by Ozkumur et al. at MGH, the authors state:

Circulating tumor cells (CTCs) are shed into the bloodstream from primary and metastatic tumor deposits. Their isolation and analysis hold great promise for the early detection of invasive cancer and the management of advanced disease, but technological hurdles have limited their broad clinical utility. We describe an inertial focusing–enhanced microfluidic CTC capture platform, termed “CTC-iChip,” that is capable of sorting rare CTCs from whole blood at 10^7 cells/s. 

Most importantly, the iChip is capable of isolating CTCs using strategies that are either dependent or independent of tumor membrane epitopes, and thus applicable to virtually all cancers. We specifically demonstrate the use of the iChip in an expanded set of both epithelial and nonepithelial cancers including lung, prostate, pancreas, breast, and melanoma. 

The sorting of CTCs as unfixed cells in solution allows for the application of high-quality clinically standardized morphological and immunohistochemical analyses, as well as RNA-based single-cell molecular characterization. The combination of an unbiased, broadly applicable, high-throughput, and automatable rare cell sorting technology with generally accepted molecular assays and cytology standards will enable the integration of CTC-based diagnostics into the clinical management of cancer. 
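
To put that 10^7 cells per second figure in perspective, here is a back of the envelope sketch in Python; the total cell count per mL and the CTC rarity are rough orders of magnitude of my own choosing, not numbers from the paper:

# Back-of-the-envelope arithmetic; only SORT_RATE comes from the abstract.
CELLS_PER_ML = 5e9   # rough total blood cells per mL, dominated by red cells
SORT_RATE = 1e7      # CTC-iChip throughput, cells per second
CTCS_PER_ML = 1      # CTCs are extremely rare, on the order of 1-10 per mL

tube_ml = 10  # a typical blood draw
seconds = tube_ml * CELLS_PER_ML / SORT_RATE
print(f"Sorting a {tube_ml} mL tube takes about {seconds:.0f} s (~{seconds/3600:.1f} h)")
print(f"Expected CTCs in that tube: roughly {tube_ml * CTCS_PER_ML}")

Even at this remarkable rate, a single tube of blood takes over an hour to sort and yields perhaps a handful of the cells of interest, which is exactly why the rarity issue below matters.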

There are several problems here, however:

1. As we had demonstrated in some of our prior analyses, blood borne cancer cells are rare, but more importantly they are cells that are coming from and going to organs. Namely, they are in transit; from whence and to where we do not know.

2. The genetic state of each of these wandering cells may be a marker of whence it came. The problem is that we do not fully understand this genetic mutation process, and in fact, as we have shown before, it may actually be a Markov chain-like process (see the sketch after this list).

3. Understanding this change in cells may be of significant therapeutic value. However this again is uncertain given our current state of knowledge.

4. Again we come back to the cancer stem cell and ask if the few cells we find in the blood stream are the right cells to examine.
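
As a minimal sketch of the Markov chain idea in point 2, consider the following Python fragment; the states, their ordering, and the transition probabilities are entirely invented for illustration and are not estimates from any data:

# Toy Markov chain over hypothetical genetic states of a wandering tumor cell.
import random

P = {
    "normal":     {"normal": 0.90, "mutated": 0.10},
    "mutated":    {"mutated": 0.80, "invasive": 0.20},
    "invasive":   {"invasive": 0.70, "metastatic": 0.30},
    "metastatic": {"metastatic": 1.00},   # absorbing state
}

def step(state):
    # Draw the next state from the current row of the transition matrix.
    r, cumulative = random.random(), 0.0
    for next_state, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return state

def simulate(n_steps=20, start="normal"):
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate())

The defining property is that the next state depends only on the current state; if such states and probabilities could ever be estimated from data, a cell captured in the blood would carry information about where in the progression it sits.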

However, this advance could provide significant data to expand our understanding of mutating cancer cells.

Reference:

Ozkumur, E., et al., Inertial Focusing for Tumor Antigen–Dependent and –Independent Sorting of Rare Circulating Tumor Cells, Sci Transl Med, Vol. 5, Issue 179 (2013).