Thursday, August 31, 2017

A Modifier of a Verb is Called an Adverb

In the NY Times there is a piece recounting the alleged work ethic of Silicon Valley. It starts by stating:

Silicon Valley prides itself on “thinking different.” So maybe it makes sense that just as a lot of industries have begun paying more attention to work-life balance, Silicon Valley is taking the opposite approach — and branding workaholism as a desirable lifestyle choice. An entire cottage industry has sprung up there, selling an internet-centric prosperity gospel that says that there is no higher calling than to start your own company, and that to succeed you must be willing to give up everything.

First, grammatically it is "thinking differently" if one wants to modify how one thinks. If, however, it is used as a predicate, then "I think different" means, somehow, that this thing called different is "what" you think.

Now in the 14th Century all students took the Trivium: Grammar, Logic, and Rhetoric. Why, you may ask? To express themselves to each other clearly and correctly. Scholasticism was ending with Ockham and his minimalism, but this correct and unambiguous use of language allowed for the development of English Law, which may be one of the greatest creations of Mankind. I am not a lawyer, for full disclosure.

Now back to the work ethic. Just being busy for 24 hours is meaningless unless you are productively doing something. I managed for a decade to work more than 24 hours a day, traveling to beat the sun to the two dozen or so countries I covered. I have seen lawyers book more than 24 hours in a day the same way. But that is a trick.

The question is: what are you producing during that time? Writing code is not "producing"; it is effecting an idea. Creating a new system architecture is "producing", whereas writing code to do what you were told to do is akin to typing a Tolstoy novel. You are not Tolstoy; you are a typist. Thus the content of what is produced is critical. It raises the question of a Post Doc performing hundreds of tests for some Principal Investigator. Is that also akin to programming, or is value added? Or is that why they are just a Post Doc?

The article continues:

Good grief. The guy is developing an app that lets you visualize how a coffee table from a catalog might look in your living room. I suppose that’s cool, but is it really more important than seeing your kids? Is the chance to raise some venture-capital funding really “the ultimate reward”? 

There are two issues in this statement. First, is another App of any value, period? Fifty years ago one built a better communications system, an improved medical imaging system, an innovative therapeutic, or some other thing that was real and advanced humanity as a whole. But a new App? Get real. Second is the issue of who is doing what. In my experience the true start up is one, perhaps two, people. There is a visionary, a Founder, and that person assembles a team. That person is the "dream merchant": selling the vision, laying out its implementation, convincing customers, and yes, raising money. For the most part everyone else is an employee, a follower. Thus this raises the question: do all people have to hustle, or just the leader?

Recently, while trying to sort through some possible start up ideas at MIT, we had one where the alleged CEO disappeared. I finally found him, and the reason was that his wife had had a baby and he was taking his Paternity Leave while she was taking her Maternity Leave. My daughter was delivered by me and the Resident; I went back to the Institute and picked my wife and daughter up the next morning between classes. That was fifty years ago. Did not skip a beat. No leave anywhere, and that is truly a Hustle. Today I guess we all get leave, paid and all. Unless you are not in Silicon Valley, of course!

GDP April 2017


Here is the current GDP. It is growing, albeit still slowly. The chart above shows each element, not the total. Consumption shows good growth and Investment is increasing. Net ExIm has come back up again, however. Overall things are not that bad.
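
For reference, and not from the chart itself, the standard national-accounts identity behind these elements:

```latex
% Expenditure identity for GDP (standard national accounts):
%   C = consumption, I = investment, G = government spending,
%   X - M = net exports (the "Net ExIm" element above)
\mathrm{GDP} = C + I + G + (X - M)
```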

Sister Rosita


In the 5th Grade I had Sister Rosita, a short, rotund Sister of Charity whose sole purpose in life was to get me to spell correctly. This was well before the age of Dyslexia and special needs students such as myself. Sister Rosita would shout at me, "McGarty, when will you ever spell correctly!" And I would reply, "I will have people to correct my errors, Sister." Then "WHAM", the good old three foot (or should it be feet?) ruler across the knuckles.

Now Sister Rosita is a thing of the past and I have Microsoft Word. It always spells the word correctly; unfortunately, it is often the wrong word. No matter how I look at it the word always looks correct, at least for a month or so. Then I read it and see the error.

Now add to this mess such things as Twitter. I tried Twitter for a few months and found it useless. Also, as some may note, I cannot say much in few words. I think through something, write a draft, misspellings and all, then post it. Some folks actually read this stuff. Now along comes the NY Times and spelling[1]. They note as follows:

Actually, we should lay off everyone’s spelling. In a digital age of autocorrect and electronic publications that can be edited from afar, not to mention social media platforms that prize authenticity and immediacy over polish, misspelling has become a mostly forgivable mistake. You simply do not need to be able to spell as well as people once had to, because we now have tools that can catch and correct our errors — so it’s just not a big deal if, on your first draft, you write “heel” instead of “heal.”

People are very attached to spelling, of course. When I first floated the idea that politicians’ misspelling was a forgivable sin, I was dragged over the coals for it on Twitter. My wife got so upset that she quit talking to me for most of a day. When I emailed my editor to say I wanted to defend Mr. Trump’s misspelling, she wrote back, “You should listen to your wife.”

So I did what I normally do when confronted with people who are wrong on the internet: I researched the subject. I looked at the history of standardized spelling and what misspelling says about you cognitively. I uncovered a rich history of political misspelling. And I read a book by an Oxford professor on the shifting cultural attitudes toward spelling and then talked to him for a long time.

Yet there is an even deeper sort of elitism underlying the criticism of spelling mistakes. It stems from people correlating accurate spelling with a good education and outsize intelligence, which is actually incorrect. There is not much scientific evidence to suggest that spelling well is connected to high intelligence. In the same way that some people are naturally better at arithmetic than others, some are naturally better spellers than others (and some people have lexical disabilities, like dyslexia, that make spelling even more difficult). But if you spell well, you can still do lots of dumb things, and if you spell poorly, you can still be very smart.

Standardized spelling has been with English for at least a few hundred years, and it has mostly served us well. So I understand that the idea of abandoning it, or at least relaxing our adherence to it, may sound frightening, like the first step on a short march to civilizational decline…

Second, there’s little evidence that how one types on electronic media has much to say about how one functions otherwise. One study, in fact, showed that kids who frequently used “textese” tended to be better at grammar than those who didn’t.

All of this suggests that we are simply giving too much weight to spelling and other typographical mistakes. Focus on what people say, not how they spell it.

So is the art of spelling lost at last? It is akin to the "other left" syndrome, where someone says turn left when they mean turn right and corrects it by saying the "other left".

Spelling counts. But not on Twitter or even Facebook.
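
For what it is worth, the autocorrect tools the Times leans on are not magic. Here is a minimal sketch of the closest-match idea behind a spelling suggester, using Python's standard difflib; the tiny word list is a made-up placeholder, not a real dictionary:

```python
# Toy spelling suggester: rank dictionary words by string similarity
# to the misspelled input. The word list below is a hypothetical sample.
import difflib

dictionary = ["heal", "heel", "health", "hello", "wheel"]
suggestions = difflib.get_close_matches("heeel", dictionary, n=3, cutoff=0.6)
print(suggestions)  # ranked by similarity: ['heel', 'wheel', 'heal']
```

Of course, such a tool happily hands you "heel" when you meant "heal"; it fixes spelling, not meaning, which is exactly the Microsoft Word problem above.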

Wednesday, August 30, 2017

Look to the Left, Look to the Right

In the old days, when you started class at the University, there was a sense of terror of not succeeding. The old adage, look to your left and look to your right because one of you will not be graduating in four years, left you with a sense that you had to press on from day one.

Now the Institute President states:

“We are very lucky to have you!”

I guess so. But why are the students very lucky to be there?

Then it states:

"MIT is a unique crucible, where you will be faced with challenges you didn’t quite expect, at an important time of your life,” she said. “My advice here is quite simple: Embrace failure! If you haven’t already, you’ll soon realize that failures frequently, and I might say usually, allow you to learn far more than your successes.” Failure, she said, “lets you know that your knowledge lacked depth, or your understanding was incomplete, or maybe your expectations were a little unrealistic. Filling in those gaps adds to your knowledge base, and how you go about recovering from those failures will teach you lifelong lessons.”

I would gather that this is not something you would espouse at the Med School! Yes, we make mistakes, and yes, we learn from these mistakes. But in life, Failure is often not an option. Prior Planning Prevents Poor Performance, a dictum my father embedded in my thinking from day one. Experiments fail, but in so doing one hopefully understands why.

But embracing failure should not be an option. You don't drive eighty miles per hour on black ice; failure there can be severe!

What is in a Change?

Back in the early 1960s, Walker Memorial at MIT was a meeting place at lunch time and even for dinner. It was as close to High Table in an English University as one could get, but with no High Table. It was a watering hole, a meeting ground, a place to trade ideas, and a place to eat food lower in carbs than what is there today.

Walker Memorial is now 100 years old and MIT is celebrating. They note:

For many of those who have passed through Walker Memorial over the past 100 years, the most enduring images remain the murals in Morss Hall, which were painted by Edwin Howland Blashfield of the Class of 1869. Created and installed between 1923 and 1930, their allegories of alma mater receiving homage from scientific and academic disciplines have watched over countless MIT community functions, from dining hall breakfasts to the Assembly Ball and more.

For most MIT alumni and students, Walker Memorial holds indelible memories. A century after its completion, the tribute to President Walker has been realized in the best possible way — with the building continuing to serve as a community gathering place.

However, it is currently planned to be turned into a building for the Music Department. MIT notes:

Walker Memorial Hall is a significant campus building that has served many roles on the MIT campus over its nearly 100-year history. Currently in need of substantial renovation, Walker appears to be a good match with the programmatic needs of the Music and Theater Arts Department – a community in search of a new home. MIT is currently studying the possibility of renovating Walker to co-locate the Music and Theater Arts Department with its teaching and extracurricular activities. The new center would allow MIT to explore new frontiers of artistic and technological discovery.

Now I am not against music, but the loss of this "watering hole" is significant. It has been replaced by Starbuckian hangouts featuring high carb feasts and uncomfortable seating. I think that it is a loss. Just a thought.

A Statue, Perchance a Lesson

Now with all the fuss about statues, perhaps there is one which is acceptable. This is Thomas Paine, in a small park down the road from the center of Morristown, not far from the Inn where Washington and Hamilton would hang out from time to time.

Paine, one could say, was sinless, at least as per the current standards. He donated what little money he made from his publishing to the Revolutionary cause, and he then moved to France to assist that Revolution. He did get crosswise with Robespierre, but one could say anyone would have done so. He understood Liberté, Égalité, Fraternité simply: freedom, individualism as per de Tocqueville, and the freedom of associations. He did not especially like Jefferson, and Adams found him even more strident than he was, but his words resonated.

His social views ring clearly even today. So perhaps we should look for the statues of those whose insight has lingered in a positive sense. Good old Tom!

CAR-T Cell Approval by the FDA

A year ago we wrote extensively about the developments in CAR-T cells. We have been following this work for several years, and it has finally reached the clinical level with today's FDA approval of a Novartis therapeutic.

The FDA notes:

Kymriah, a cell-based gene therapy, is approved in the United States for the treatment of patients up to 25 years of age with B-cell precursor ALL that is refractory or in second or later relapse. Kymriah is a genetically-modified autologous T-cell immunotherapy.

Each dose of Kymriah is a customized treatment created using an individual patient’s own T-cells, a type of white blood cell known as a lymphocyte. The patient’s T-cells are collected and sent to a manufacturing center where they are genetically modified to include a new gene that contains a specific protein (a chimeric antigen receptor or CAR) that directs the T-cells to target and kill leukemia cells that have a specific antigen (CD19) on the surface. Once the cells are modified, they are infused back into the patient to kill the cancer cells.

ALL is a cancer of the bone marrow and blood, in which the body makes abnormal lymphocytes. The disease progresses quickly and is the most common childhood cancer in the U.S. The National Cancer Institute estimates that approximately 3,100 patients aged 20 and younger are diagnosed with ALL each year. ALL can be of either T- or B-cell origin, with B-cell the most common. Kymriah is approved for use in pediatric and young adult patients with B-cell ALL and is intended for patients whose cancer has not responded to or has returned after initial treatment, which occurs in an estimated 15-20 percent of patients.

ALL is most often a deadly childhood cancer, and I recall seeing my first case in March 1968. It was a cold rainy day; a ten year old had a fever and malaise, we ran the blood work, and it was a leukemia. Death was then a certainty. Now with this therapeutic, survival is a viable option.

It should be interesting to see how this can be applied to other cancers. Cell-surface markers must be available for the differing malignancies.

Wednesday, August 23, 2017

Yield Curves August 2017

The above shows the ever-closing gap in the Yield Curve. Short term rates have been rising rapidly but long term rates have not moved. One suspects a built-up delay on the long end; the short term action is the Fed rolling over its holdings and the buyers demanding more return. This has yet to be seen on the long end because there is much less of it in proportion to turnover.
The above is the spread. It is lower than at any time since we started tracking after the Banking Crash.
The above is a similar demonstration, but it shows the explosion of the short term rates. One expects a long term explosion in the coming year.
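
As a rough illustration of the tracking shown in these charts, here is a minimal sketch that pulls the 2-year and 10-year Treasury constant-maturity yields and computes the spread. It assumes the pandas-datareader package and FRED's standard series names (DGS2, DGS10); the date range is illustrative, not the one behind the charts above.

```python
# Minimal yield-spread tracker: 10-year minus 2-year Treasury yields.
# Assumes: pip install pandas-datareader; FRED series DGS2 and DGS10.
import pandas_datareader.data as web

rates = web.DataReader(["DGS2", "DGS10"], "fred", "2008-01-01", "2017-08-23")
spread = rates["DGS10"] - rates["DGS2"]   # long rate minus short rate
print(spread.dropna().tail())             # the most recent spread values
```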


Sunday, August 13, 2017

Programming and Programmers

My first computer was a used IBM 709 and the language was FAP, the FORTRAN Assembly Program. The input was a paper tape! Yes, boys and girls, there were such things. I used it to analyze acoustic feedback instabilities in public address systems.
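
For the curious, here is a minimal modern sketch of that kind of feedback analysis, in Python rather than FAP: a PA system howls at frequencies where the open-loop gain around the mic-amp-speaker-room loop reaches unity magnitude at roughly zero phase. All the gain, coupling, and delay numbers below are made-up placeholders, not the original study.

```python
# Toy acoustic-feedback check: flag frequencies where the open-loop
# transfer function has |H| >= 1 and phase near zero (howl candidates).
# Gain, coupling, and delay values are illustrative placeholders.
import numpy as np

freqs = np.linspace(20, 20_000, 10_000)   # audio band, Hz
amp_gain = 60.0                           # amplifier gain (illustrative)
room_coupling = 0.02                      # speaker-to-mic path loss (illustrative)
delay_s = 0.01                            # 10 ms acoustic path delay

# Open-loop transfer: constant magnitude, phase set by the path delay.
H = amp_gain * room_coupling * np.exp(-2j * np.pi * freqs * delay_s)

unstable = freqs[(np.abs(H) >= 1.0) & (np.abs(np.angle(H)) < 0.05)]
print(f"{unstable.size} candidate howl frequencies; first few: {unstable[:5]}")
```

With these illustrative numbers the loop gain magnitude is 1.2, so the flagged frequencies fall near multiples of 100 Hz, the reciprocal of the 10 ms delay.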

Does this talent make one better? In my opinion, not really. Programmers today are labor and VCs are capital. One should go back and read Marx.

Everything is divided into labor and capital

The software engineers are labor and the VCs and management are capital

Capital needs labor, especially in Silicon Valley, to think it is paid well, so that the value of capital's assets, such as real estate, increases

Capital demands labor be compliant and not rebel

However, there is a natural dialectic, labor vs. capital, and this is just the beginning of that Hegelian dialectic; there will be more.

Thesis is the current model of the allocation of capital and labor.

Antithesis is the recognition by labor that they are being manipulated.

Synthesis will be the revolutionary change that is inevitable in this industry.

Strange that quasi-Marxists, namely the current Capitalists, are in reality playing out Marx in real life, as capitalists!

Just a thought. 

Friday, August 11, 2017

Ontogeny Recapitulates Phylogeny, or What Comes First, the Chicken or the Egg

In Nature there is an article applying the AI world's approaches to plant systematics, namely the process of sorting and classifying plants. They note:

There are roughly 3,000 herbaria in the world, hosting an estimated 350 million specimens — only a fraction of which has been digitized. But the swelling data sets, along with advances in computing techniques, enticed computer scientist ... and botanist ..., to see what they could make of the data. ... team had already made progress automating plant identification through the Pl@ntNet project. It has accumulated millions of images of fresh plants — typically taken in the field by people using its smartphone app to identify specimens. Researchers trained similar algorithms on more than 260,000 scans of herbarium sheets, encompassing more than 1,000 species. The computer program eventually identified species with nearly 80% accuracy: the correct answer was within the algorithms’ top 5 picks 90% of the time. That, says Wilf, probably out-performs a human taxonomist by quite a bit.

Now having done this a few decades ago, and still proceeding to do so with Hemerocallis, the answer is not form but simple genetics. Sequence the genes, then, using a mutation-hypothesis base, determine how the genes evolved. Then use the shapes to see the effect. As is well known, the form-based process has been around since Linnaeus and it suffers from substantial defects. Just because a form looks close to something says nothing about the genetic evolution.
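
As a sketch of the sequence-first approach argued for here, the following builds a tree from aligned gene sequences rather than from form, using Biopython's distance-based neighbor-joining. The input file name is a hypothetical placeholder, and a crude identity distance stands in for a proper mutation model.

```python
# Build a phylogeny from aligned sequences (not from leaf shape).
# Assumes Biopython and a pre-aligned FASTA file; the file name is a
# hypothetical placeholder, and "identity" is a crude distance standing
# in for a proper substitution (mutation) model.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("hemerocallis_genes.fasta", "fasta")  # aligned sequences
dm = DistanceCalculator("identity").get_distance(alignment)    # pairwise distances
tree = DistanceTreeConstructor().nj(dm)                        # neighbor-joining
Phylo.draw_ascii(tree)                                         # inferred topology
```

The shapes can then be mapped onto the resulting tree to see the effect, rather than driving the classification.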

Sunday, August 6, 2017

Those in Glass Houses Should Not Throw Ginkgo Nuts

I grow Ginkgo trees. This is a view of this year's Ginkgo nuts, ready to drop and then off to the cooling-off period before potting for next season. Ginkgo trees are ancient, one of the gymnosperms or naked seed plants. They have survived for millions of years, amongst multiple climate changes and assaults. They line the streets of New York because they thrive on pollution. Good friends for humans. I have a few dozen on my land alone.

Now seventy-two years ago today my father and his shipmates were in the North Pacific preparing for an invasion of Japan. They were the survivors of the battles at Leyte, Saipan, and other places in the Pacific. They had already been given their winter gear in preparation for the invasion. They knew how bloody Okinawa was and were preparing for even worse.
The above is what they found in Manila. The Japanese did this. Thus they truly feared what they would be up against in the invasion of the islands.

Then late in the day of the 6th of August they heard about Hiroshima. They did not cheer, they cried. For they knew that this was the true beginning of the end and that they would now have a chance of seeing their children.

Thus when a revisionist "historian," such as the one in the New York Times, bemoans:

The Hiroshima ginkgos, the tenacious older siblings of the tender green trees in front of our North Carolina house, were able to resist the most devastating outcome of science and technology, the splitting of the atom, a destructive power that could turn the whole planet into rubble. Those trees’ survival was a message of hope in the midst of the black rain of despair: that we could nurture life and conserve it, that we must be wary of the forces we unleash.

He fails to understand that this plant had managed a survival of even greater proportions. Referring to the Japanese curator who introduced him to the tree, the author states:

By then middle-aged, his body was a testament to that war crime and its aftermath. One ear was flat and mangled, his hands were gnarled, and from a finger on each grew a black fingernail.

One can vehemently object to the use of the pejorative term "war crime", as if war itself is not the very crime he detests. The Pacific was strewn with bodies. My uncle was riddled with Japanese machine gun bullets on Okinawa, yet survived, along with his men, and was awarded the Distinguished Service Cross. The devastation on Okinawa was just a prelude to what would have happened on the main islands. I take umbrage at the term "war crime". Neither the survivors nor the tree deserve such.

Saturday, August 5, 2017

Guess Who Is Coming to Dinner?

Yep, it is that MIT Professor who thinks all of us out here in the real world are so stupid... Now he wants to teach us economics!

He tells us:

I’m excited to announce the launch of a new course on edX that covers Introductory Microeconomics. I’ve wanted to do a course like this for years. I have always found economics provides a terrific way to think about the world. Economics principles explain so much of what drives our everyday life: how people decide which goods to buy and how to spend their time; how firms set prices and hire workers, and whether the outcomes of markets are fair and efficient. These economics principles were inspirational to me when I first learned them as an undergraduate. I have gone on to apply them to a set of topics I am passionate about, both as a Professor at MIT and as a policy expert for both state and local governments. Whether in the classroom, in Washington D.C., or in state capitals, I have found that basic economic principles never lead me wrong in terms of explaining important aspects of the world.

But wait, I thought we were beyond the pale, uneducable, devoid of any intellectual merit. I guess it is like all those Shark movies: a shark can attack in any form. Zombie Shark, anyone!