Wednesday, August 31, 2016

Unions at the Academy

Let's totally destroy the University in the United States! How? Make all grad students join a union! Take away any merit-based rewards and make every student the same... they are not. As the Harvard Crimson states:

Overturning precedent, the National Labor Relations Board ruled that student assistants at private universities are considered employees with collective bargaining rights, a move that would force Harvard to legally recognize an elected graduate student union. The 3-1 decision handed down Tuesday marks a significant milestone for the unionization effort at Harvard, which began in April 2015 and has since grown in size and sophistication despite opposition from University administrators. However, the ruling has implications far beyond Harvard and comes as debate over the issue of graduate student unionization has roiled campuses across the country. The decision does not only affect Ph.D. students or graduate students; the NLRB ruled that employees under a collective bargaining unit could include undergraduate teaching assistants and research assistants as well. Harvard’s unionization movement, the Harvard Graduate Students Union-United Auto Workers, ramped up efforts over the last year to push Harvard to recognize a graduate student union and has already gained more than enough support among graduate students to call for a union election. 

The Academy is based on merit and performance. It is not a Ford assembly line. Now the useless grad student gets the same rights as a Nobel Prize winner. Next we will see Washington forcing unions on brain surgeons! Talk about dumbing down!

It's Amazing to See People Learn Things

As a follower of the CATV and Telco critic on Backchannel, I was amazed to see that she has finally grasped an element of reality, namely pole attachments. There are lots of hidden costs in building fiber and competing with the incumbents. The franchise is one, and a costly one. Pole attachments are another. To get that fiber from point A to point B you have to get some right of way. It has been that way for centuries; it is rooted in property law, an old English concept one would assume a lawyer would have some understanding of.

You see, you have no right to the poles; you have to negotiate with the Telco or the power company. You have to do pole counts, rights-of-way negotiations, etc.

Our young newcomer to reality notes:

But many cities don’t control their own poles. In some areas, poles are controlled by utilities, or even telecom companies. Anyone hoping to string fiber in those places faces two nightmarish, indefinite periods of delay and uncontrolled costs: first getting an agreement in place with the pole owners, and then getting the poles physically ready for a new wire. We’ll call these steps Swamp One and Swamp Two.

Well, that really is NOT new. Towns make money from the poles, but the poles are owned by third parties. So how can a town "control" something it has no property right to? First-year law school, anyone?

She continues:

Swamp One: Attachment. At the moment, the FCC gives regulatory assistance (“pole attachment rights”) in negotiations with utility pole owners only to cable TV providers, companies selling internet access, and phone companies. The FCC’s assistance comes in the form of mandatory deadlines and set formulas for calculating fees to be paid to the pole owner.

For anyone in the real world who has done this, the process is rife with delays. And as an old colleague once said to me, "Delay is the deadliest form of denial." They do not say no; they just kick the can down the street! And Google thought they would change this world? I told them no fifteen years ago, but I guess if you have a big ego and lots of money then Newton's laws do not apply to you!

She continues:

Swamp Two: Make-Ready. Even if a city wrestles into place an agreement with pole-owners that allows it to string fiber on their poles, or uses a company that has pole attachment help from the FCC, there’s still a gruesome, unpredictable process left to get the pole ready for a new attachment.

This is just a corollary to the first problem. First get a pole attachment agreement, and second, wait.
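
For the curious, the FCC "set formula" mentioned above is, roughly, a space factor times the pole owner's cost times a carrying charge. A minimal sketch follows; the 1 ft / 13.5 ft presumptions are the FCC's defaults, while the dollar and carrying-charge inputs below are purely illustrative assumptions:

```python
def cable_attachment_rate(net_bare_pole_cost, carrying_charge_rate,
                          space_occupied_ft=1.0, usable_space_ft=13.5):
    """Approximate FCC cable-formula maximum annual rate per pole.

    Rate = (space occupied / usable space) x net cost of a bare pole
           x carrying charge rate.
    The 1 ft occupied / 13.5 ft usable presumptions are the FCC's
    defaults; the cost inputs used below are illustrative only.
    """
    space_factor = space_occupied_ft / usable_space_ft
    return space_factor * net_bare_pole_cost * carrying_charge_rate

# Illustrative: a $300 net bare pole at a 30% carrying charge
rate = cable_attachment_rate(300.0, 0.30)
print(round(rate, 2))  # a few dollars per pole per year
```

The regulated rate itself is small change; the delay in getting to it is where the money goes.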

This is a prime example of why wireless is the way to go. But this person seems to just want to attack the mountain hoping somehow that it will collapse and she can get herself and all her minions to the promised land. Fat chance!

Saturday, August 27, 2016

If All Else Fails Listen to the Customer

Government works well, right? Well, just look at the Social Security Administration's new Internet security scheme. They call it "multifactor authentication." Simply put, when you log in with your ever-changing password you must also have a mobile phone with text messaging. Not an email connection, but a decades-old text system that costs you about $0.25 per message, at least. Then they, SSA, send you an additional authentication key, which you then enter into your SSA web site.

Now there is the Mini-Mental State Exam, that wonderful test we use to assess old folks for mental acuity. An 80-year-old comes in while on a dozen mind-altering meds and we ask them to count backwards from 100 by sevens! No child currently in a public school, on no meds at all, could do this, but we want Grandma to do it. Now we want Grandma to use a multi-transactional random key entry system, like arming a nuclear warhead, to get on her SSA site!

As SSA has stated:

On July 30, 2016, we began requiring you to sign into your my Social Security account using a one-time code sent via text message. We implemented this new layer of security, known as “multifactor authentication,” in compliance with a Presidential executive order to improve the security of consumer financial transactions.  SSA implemented the improvements aggressively because we have a fundamental responsibility to protect the public’s personal information.
However, multifactor authentication inconvenienced or restricted access to some of our account holders. We’re listening to your concerns and are responding by temporarily rolling back this mandate. As before July 30, you can now access your secure account using only your username and password. We highly recommend the extra security text message option, but it is not required. We’re developing an alternative authentication option, besides text messaging, that we’ll begin implementing within the next six months. We strive to balance security and customer service option...

Frankly, one should ask what moron came up with this approach. Most SSI recipients are on limited incomes, many have limited mental faculties to deal with this, and then we get a procedure that is not even used for launch codes! Why? Because they can't keep their own system secure. So what do they do? Put the burden on the customer. No business would survive with this type of action.

One wonders where they get the people who run these organizations. Take a look and you would be shocked, or perhaps not. If you think the Presidential race is an issue, you really should look behind the curtains at the million or so Federal employees.

Let's see what SSA comes up with next!

But one should read the comments on the SSA site referenced above. The individual in charge of this fiasco was Jim Borland, Assistant Deputy Commissioner, Communications. People said such things as:

Evidently you’re committed to making it impossible to use My Social Security. Extra burdens do NOT make things more secure. Thanks for nothing.

There are many simpler and more effective schemes. However, security starts with the team that operates the servers and comm interfaces. That means the SSA, and specifically the above-named person. Pushing it off to the "customers" is just reckless and abusive. But alas, it is our Government. Could it be worse? Try the EU.
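
For the record, one-time codes do not require a text message at all. The standard TOTP construction (RFC 6238) derives the code from a shared secret and the clock; the server and the customer's device compute the same number independently. A minimal sketch:

```python
import hmac, hashlib, struct, time

def totp(secret: bytes, t=None, step=30, digits=6):
    """Time-based one-time password (RFC 6238 style): HMAC-SHA1 over
    the current 30-second time counter, dynamically truncated."""
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and client sharing a secret derive the same code for the
# same 30-second window -- no text message, no $0.25 per login.
print(totp(b"shared-secret", t=59))
```

Fifty lines of standard cryptography, and no grandmother has to own a texting plan to check her benefits.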

Friday, August 26, 2016

The Gang Who Can't Code Straight!

Microsoft is at it again with W10. As Ars Technica reports:

As if that weren't bad enough, Microsoft has pushed out a bad update that breaks important PowerShell functionality. The update, KB3176934, is the latest cumulative update for the Anniversary Update and was released on August 23. It breaks two key PowerShell features: Desired State Configuration (DSC), which is used to deploy configurations to systems, and implicit remoting, a feature that makes it easier to use PowerShell commands that are installed on remote systems. The reason that these things have broken is remarkable. The Cumulative Update doesn't include all the files it needed to. One missing file breaks DSC; a second missing file breaks implicit remoting. A revised package that includes these missing files will be released on August 30; although Microsoft recognizes that the problem exists, it isn't apparently important enough to rush out a fix, so it'll have to wait for the next Tuesday.

Yep, a bad and broken update causing billions of dollars of downtime!

Microsoft notes:

Known issues in this update
Issue 1

After you apply this update and then start a remote Windows PowerShell session, the functionality to import a module (implicit remoting) no longer works. 

Issue 2

After you apply this update, PowerShell Desired State Configuration no longer works. Users will receive an "Invalid Property" error message when they run any DSC operation. 

 And then follow this to Microsoft:

On August 23, Windows update KB3176934 released for Windows Client. Due to a missing .MOF file in the build package, the update breaks DSC. All DSC operations will result in an “Invalid Property” error. In addition, due to a missing binary in the build package the update breaks PowerShell implicit remoting. Implicit remoting is a PowerShell feature where PowerShell commands work on a remote session instead of locally. Specifically, importing a remote session no longer works.

One has to ask if there are any adults in Seattle. Perhaps it is some form of massive substance excess. The solution starts with a new CEO who has the customer in mind! Where is the Board? That is the problem. There will be massive class action suits sooner or later!

Thursday, August 25, 2016

Rockport 2016

Just for those who want a moment of serene quiet: the waves lapping on the sand and the sun setting to the west behind the harbor in Rockport, MA.

Wednesday, August 24, 2016

Microsoft and the World

The gross disrespect of their customers continues. As Tech Republic reports:

The Windows 10 Anniversary Update, which began rolling out on August 2, came with some unfortunate side effects for some users—it killed their webcam. A Windows employee has addressed the issue, but it looks no fix will be available until September. The problem was initially noted by several users a few days after the update went live. Basically, this issue renders USB webcams and network-connected webcams inoperable in programs like Skype or Open Broadcaster Software (OBS), among others. The update also caused some devices to unintentionally freeze up. The reason for this behavior seems to rest in the changes that were made to how the OS access the camera in the Anniversary update. Before the update, only one application could access the camera at a time. With the Anniversary update, also known as version 1607, a new service called the Windows Camera Frame Server allows for multiple connections at once, and that's causing some problems. 

Yep! September, but don't count on that! All those billions of dead webcams; watch Skype just die on the vine. They will do nothing till September, most likely December. This is, in my opinion, abject evil. Where are the tort lawyers? This could become the world's largest class action suit!

It would serve Microsoft right to just disappear! Could you imagine what would happen if this were a drug company? Is there a Board somewhere at this useless company?

Friday, August 19, 2016

The Morons at Microsoft!

For those who may have noticed, Logitech and other cameras have stopped working on the 1607 W10 release. Why? Ars Technica gives a great rendition.

The version 1607 frame server, however, only supports uncompressed data. Microsoft's rationale for this is that most applications receiving compressed data will have to immediately uncompress that data in order to actually manipulate it. With the new camera-sharing capability, this means that multiple applications could be performing the MJPEG-to-YUV or H.264-to-YUV conversion simultaneously. It's more efficient in this situation to simply read YUV data from the camera in the first place and skip those extra conversions. H.264 adds additional complexity: applications can negotiate specific compression parameters with the camera to alter the compression quality on the fly. This isn't an issue when an application has exclusive control of the camera but becomes a problem when two different applications try to use different parameters with the same camera.

They continue:

By preventing the use of the compressed formats, Microsoft avoided these issues. But it came at a great cost. Applications demanding or expecting support for MJPEG or H.264 data have stopped working. This could manifest in strange ways. I have a Logitech C920 camera, and I use it with Skype. Skype progressively enhances video quality; a connection may start out using a lower quality, and it'll then be upgraded as bandwidth and processor usage settle down. What I found was that an initial video call would connect, with the application using something like 640×480 YUV data. After a few seconds, however, Skype would try to upgrade the call to 720p or 1080p video. This should work, and in old versions of Windows, was seamless. But with the Anniversary Update it means switching from an uncompressed data stream to a compressed stream—and so it fails. The video just freezes after a few seconds of correct operation.

Yes indeed, the folks in Seattle are totally clueless. H.264 is a compression technique used, you guessed it, everywhere! For years! You would think it would have made its way to Seattle. These folks are doomed! Doomed! If this keeps up we can all switch back to Windows 95, watch for the hourly reboot, and then load a Linux OS.
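
The bandwidth arithmetic shows why the cameras compress in the first place, and why "just use uncompressed YUV" was never a serious answer. A quick sketch; the 20:1 MJPEG ratio here is an assumed, typical figure, not anything from a spec:

```python
def raw_yuv_rate_mbps(width, height, fps, bytes_per_pixel=1.5):
    """Uncompressed YUV 4:2:0 data rate in megabits per second
    (12 bits, i.e. 1.5 bytes, per pixel)."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# Why cameras ship MJPEG/H.264 over USB: raw 1080p30 alone exceeds
# USB 2.0's ~480 Mbps signaling rate.
raw = raw_yuv_rate_mbps(1920, 1080, 30)
print(round(raw))        # ~746 Mbps uncompressed
print(round(raw / 20))   # ~37 Mbps at an assumed 20:1 MJPEG ratio
```

A C920-class camera on a USB 2.0 port physically cannot deliver raw 1080p30; the compressed formats Microsoft broke are the only way that hardware works.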

If these people worked for me they would be history! Along with the CEO, who apparently does not give a damn about the customers!

We need a Harvard Business School Case Study on this one, now!

USPSTF and What Happens Now?

The USPSTF, or U.S. Preventive Services Task Force, back in 2011 made the problematic recommendation, based on what in my and many others' professional opinions is flawed data, that men no longer get PSA tests. The result? Apparently a massive drop in the detection of early-stage prostate cancer. What does that mean? Has the incidence of PCa, for some strange reason, just stopped, slowed down, or disappeared? Or are we awaiting a time bomb of massive proportions, of men presenting with metastatic cancers? One need just read a 1950s edition of Harrison or Cecil to see. Men will just show up with terminal bone mets.

Science Daily reports on a JAMA study:

From 2012 to 2013, the localized/regional-stage prostate cancer incidence rates per 100,000 men declined from 356.5 to 335.4 in men 50 to 74 and from 379.2 to 353.6 in men 75 and older, according to the study. The authors note the decrease from 2012 to 2013 was smaller than that from 2011 to 2012 (6 percent vs. 19 percent). Previously reported findings indicate PSA testing rates decreased significantly between 2010 and 2013. Other factors that could contribute to the decline in incidence rates for early stage prostate cancer include changes in the prevalence of unknown risk factors and preventive measures. "In conclusion, the decrease in early-stage prostate cancer incidence rates from 2011 to 2012 in men 50 years and older persisted through 2013 in SEER registries, albeit at a slower pace. Whether this pattern will lead to a future increase in the diagnosis of distant-stage disease and prostate cancer mortality requires long-term monitoring because of the slow-growing nature of this malignant neoplasm," the research letter concludes.

I think we may already know the answer. As part of our "new" health care system we may very well be just letting the "old men die". Pity!


The FCC is auctioning off 126 MHz in the 600 MHz band. As noted in my recent critique of that attorney, who in my opinion is technically clueless, 600 MHz bends around corners! And with 126 MHz at 100 bps/Hz and a reuse factor of 10-50 in a multibeam environment, one gets phenomenal capacity, even in basements!
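
For those who want the arithmetic, a minimal sketch; the 100 bps/Hz and the 10-50 reuse figures are my aggressive multibeam assumptions above, not deployed numbers:

```python
def aggregate_capacity_gbps(bandwidth_mhz, bps_per_hz, reuse_factor):
    """Aggregate capacity = spectrum x spectral efficiency x spatial reuse."""
    return bandwidth_mhz * 1e6 * bps_per_hz * reuse_factor / 1e9

# 126 MHz of 600 MHz spectrum under the multibeam assumptions above
print(aggregate_capacity_gbps(126, 100, 10))   # 126.0 Gbps
print(aggregate_capacity_gbps(126, 100, 50))   # 630.0 Gbps
```

Hundreds of gigabits per second per service area, on spectrum that penetrates buildings. That is why the bidding is in the billions.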

As noted in Telegeography:

According to the Federal Communications Commission (FCC), the 600MHz Broadcast Television Spectrum Incentive Auction (‘Auction 1002’), which commenced on 16 August, has generated bids worth USD10.588 billion after five rounds of bidding. Round six is scheduled to commence today (Friday 19 August). As expected, spectrum allocations covering New York and Los Angeles have attracted the highest bids thus far, followed by the likes of Chicago, San Francisco, Baltimore-Washington, DC and Philadelphia. Interest in smaller markets has already began to wane, however, sources have noted. The current Auction 1002 ‘Forward Auction’ was preceded by a ‘Reverse Auction’ between the FCC and the TV broadcasters that held the 600MHz spectrum. This process saw the ‘clearing cost’ for 126MHz of spectrum established at USD86.423 billion, seriously exceeding analyst expectations. If that figure is not met in the Forward Auction, the FCC will reduce the amount of spectrum it will free up and resume bidding with TV broadcasters in a second stage of the Reverse Auction.

The FCC has a rather obscure auction process, backward and forward, but as of now it has topped $11 billion.

Fierce Wireless states:

The generic license blocks offered in the initial stage during the forward auction under this band plan will consist of a total of 4030 "Category 1" blocks (zero to 15 percent impairment) and a total of 18 "Category 2" blocks (greater than 15 percent and up to 50 percent impairment). The FCC said approximately 97 percent of the blocks offered for the forward auction will be "Category 1" blocks, and 99 percent of the "Category 1" blocks will be zero percent impaired. These figures likely will cheer wireless carriers and other auction bidders since unimpaired spectrum can be used more quickly.

This can be a game-changing play. Watch the process.

Thursday, August 18, 2016

Wealth Distribution

The CBO has an interesting report on wealth distribution. They note:

In 2013, aggregate family wealth in the United States was $67 trillion (or about four times the nation’s gross domestic product) and the median family (the one at the midpoint of the wealth distribution) held approximately $81,000, CBO estimates. For this analysis, CBO calculated that measure of wealth as a family’s assets minus its debt. CBO measured wealth as marketable wealth, which consists of assets that are easily tradable and that have value even after the death of their owner. Those assets include home equity, other real estate (net of real estate loans), financial securities, bank deposits, defined contribution pension accounts, and business equity. Debt is nonmortgage debt, including credit card debt, auto loans, and student loans, for example. In 2013, families in the top 10 percent of the wealth distribution held 76 percent of all family wealth, families in the 51st to the 90th percentiles held 23 percent, and those in the bottom half of the distribution held 1 percent. Average wealth was about $4 million for families in the top 10 percent of the wealth distribution, $316,000 for families in the 51st to 90th percentiles, and $36,000 for families in the 26th to 50th percentiles. On average, families at or below the 25th percentile were $13,000 in debt.

They present the following Figures:

The above is the relative distribution by percentile. It appears that the 90s sent many sky high.
The above is a more selective view.
And the above by age. So guess who is supporting whom?

Wednesday, August 17, 2016

Almost Twenty Five Years Ago

From a filing for Pioneer Preference in May 1992 I wrote the following:

From FCC Pioneer Preference May 3 1992

Telmarc Telecommunications


            5. The following technological approaches will be deployed, integrated, tested, and optimized to determine their effectiveness in providing the specified service quality goals.

(1) Adaptive Network Management: Adaptive Network Management, ANM, is a system that uses in-situ sensors to monitor the power and signal quality throughout the network. The number of sensors will greatly exceed the number of cell locations. This set of dynamic measurements will then be used in a feedback scheme to adaptively change the characteristics of the cell transmit power and other characteristics to maximize the service quality. Specifically, the Petitioners have individually designed a proprietary network management system that uses the in-situ sensors that monitor all key signal elements. These elements are power, frequency, interference, noise, and other significant signal parameters. The system then transmits these signals back to a central processor which then generates an optimal signal to control the cell site transmission characteristics, such as power, frequency and other factors. The overall objective is to optimize the system performance from the user's perspective.

(2) Gateway RF Digital Front Ends: A broadband, digital front end will be used to act as a gateway to interface the air interfaces of CDMA, TDMA and other access methods through the same cell and in the same frequency band. This system will permit multiple air interfaces to be gatewayed into the same network access point, thus reducing the need for a single standard and increasing the ability to provide a national network. This front end has been developed by Steinbrecher Assoc. of Woburn, MA. The system element allows, through its use of a large gain-bandwidth-product front end and fully digital RF processing, the ability to handle many different and simultaneous multiple access methods, such as TDMA and CDMA. This ability goes to the heart of interoperability and standards.

(3) CDMA Backbone Network: The Petitioner will use a CDMA air interface and access methodology. The Petitioner fully supports the efforts of QUALCOMM in their development and implementation of CDMA in the 800 MHz band and their recent movement of this to the 1.85-1.90 GHz bands. Although there is no uniqueness in the use of CDMA, the Petitioners argue that this technology has specific characteristics that allow for the delivery of maximum benefit to the public.

(4) Co-Located Distributed Switch Access: Unlike other proposed schemes which use redundant MTSO accesses, this trial will focus on Central Office Co-Location methods that reduce capital and operating cost redundancies. The co-location approach will minimize access line costs and eliminate the need for an MTSO. The adjunct processors at the Central Offices will be interconnected by a high speed bus to allow for adequate control and call hand-off. Co-Location is achieved via the intelligence that is contained in the CDMA cell sites and the adjunct processors' distributed communications and processing capabilities. The fundamental existence of this capability was demonstrated by QUALCOMM in their CDMA trial, albeit not in the Co-Location context. The QUALCOMM QTSO was in effect a non-Co-Located adjunct. The Petitioners propose to request access from the PUC in the Commonwealth of Massachusetts to access New England Telephone on a Co-Location basis. The public good is achieved through the reduction in costs and the ability to use existing capital assets provided by the LECs. The uniqueness of the Petitioners' proposal is the fact that extensive use of adjuncts will be made in the system operation.

(5) Adaptive Beam Forming Phased Array Technology: One of the current problems with cellular systems is the use of broad beam antennas and the inability to provide additional antenna gain on both transmit and receive to the individual portables. With the use of adaptive beam forming antennas, the service to lower power portables may be improved. The Petitioners' approach will include such capabilities. Time dynamic control of these multiple beam antennas will permit higher localized gain on portables, which will in turn allow for lower transmit power and thus longer portable battery life. The Petitioners have been discussing the use of the technology developed at the Massachusetts Institute of Technology's Lincoln Laboratory in this area.

All of the above are now becoming a reality in wireless. Timing is everything, so is living long enough!
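
Item (5) is just classical array theory, finally cheap enough in silicon. A minimal sketch of the array factor for a uniform linear array phase-steered to a target angle; the element count and angles below are illustrative:

```python
import cmath, math

def array_factor_db(n, theta_deg, steer_deg, spacing_wl=0.5):
    """Normalized array factor (dB) of an n-element uniform linear
    array with half-wavelength spacing, phase-steered to steer_deg."""
    k_d = 2 * math.pi * spacing_wl
    psi = k_d * (math.sin(math.radians(theta_deg)) -
                 math.sin(math.radians(steer_deg)))
    af = abs(sum(cmath.exp(1j * i * psi) for i in range(n))) / n
    return 20 * math.log10(max(af, 1e-12))

# Steering an 8-element array to 30 degrees: full gain on the target,
# sharply reduced response away from it.
print(round(array_factor_db(8, 30, 30), 1))   # 0.0 dB at the steered angle
print(round(array_factor_db(8, 10, 30), 1))   # well down off the beam
```

The 1992 filing proposed exactly this, time-dynamic beam control to buy link margin for low-power portables; today it is marketed as massive MIMO.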

Microsoft, Windows 10 and Privacy

The EFF has a great piece on the lack of privacy in W10 and more importantly the loss of any form of control.

They note:

The trouble with Windows 10 doesn’t end with forcing users to download the operating system. By default, Windows 10 sends an unprecedented amount of usage data back to Microsoft, and the company claims most of it is to “personalize” the software by feeding it to the OS assistant called Cortana. Here’s a non-exhaustive list of data sent back: location data, text input, voice input, touch input, webpages you visit, and telemetry data regarding your general usage of your computer, including which programs you run and for how long. While we understand that many users find features like Cortana useful, and that such features would be difficult (though not necessarily impossible) to implement in a way that doesn’t send data back to the cloud, the fact remains that many users would much prefer to opt out of these features in exchange for maintaining their privacy.
And while users can opt-out of some of these settings, it is not a guarantee that your computer will stop talking to Microsoft’s servers. A significant issue is the telemetry data the company receives. While Microsoft insists that it aggregates and anonymizes this data, it hasn’t explained just how it does so. Microsoft also won’t say how long this data is retained, instead providing only general timeframes. Worse yet, unless you’re an enterprise user, no matter what, you have to share at least some of this telemetry data with Microsoft and there’s no way to opt-out of it.

Thus Microsoft can track your every move. Worse, however, is that Microsoft can single-handedly block your emails to those who use MS email services such as Hotmail. Thus not only do they have potential access to any and all of your emails, they also decide with whom you may communicate. And you will never know. Namely, if MS decides, for reasons known only to them, that a sender's email is unacceptable, they block it. The MS email user will never know it was blocked. The blocked sender must go through multiple hoops, to the extent of threatening litigation, to free up the connection.

Overall in my opinion and based upon my experience the world would be a lot better with an alternative, soon!

More Thoughts on Wireless

Wireless can work in amazing ways. Just consider the above. In the classic cellular world we would have, say, 6 beams over 360 degrees, or six 60-degree beams. Each beam would carry, say, 20 MHz of a 40 MHz spectrum, with adjacent beams on alternating 20 MHz halves. Thus, using a classic QPSK system at even 3G's 1 bps/Hz, across the six beams we have say 120 Mbps capacity in 40 MHz. Now for 5G, we have OFDM and we have multibeam antennas. Here we have 20 beams and 100 bps/Hz, due to OFDM and higher EIRP per beam, and we get 40 Gbps per 40 MHz!
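
The arithmetic above can be sketched directly; the 100 bps/Hz figure is my aggressive multibeam assumption, not a deployed number:

```python
def sector_capacity_mbps(beams, mhz_per_beam, bps_per_hz):
    """Capacity = beams x bandwidth per beam x spectral efficiency
    (MHz x bps/Hz = Mbps)."""
    return beams * mhz_per_beam * bps_per_hz

# Classic cellular: 6 sectors, 20 MHz each, ~1 bps/Hz (QPSK-era 3G)
print(sector_capacity_mbps(6, 20, 1))             # 120 Mbps
# Multibeam "5G" sketch: 20 beams, 20 MHz, 100 bps/Hz as assumed above
print(sector_capacity_mbps(20, 20, 100) / 1000)   # 40.0 Gbps
```

Roughly a 300-fold jump from the same 40 MHz, purely from beams and spectral efficiency.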

That is only half the tale. The other half is that the data, say video, is being compressed at higher and higher amounts.

Thus as we expand capacity we are compressing content! There are of course even more ways to manipulate this process.

In a Nature article the author makes the following statement:

The most advanced commercial networks are now on 4G, which was introduced in the late 2000s to provide smartphones with broadband speeds of up to 100 megabits per second, and is now spreading fast. But to meet demand expected by the 2020s, say industry experts, wireless providers will have to start deploying fifth-generation (5G) technology that is at least 100 times faster, with top speeds measured in tens of billions of bits per second.The 5G signals will also need to be shared much more widely than is currently feasible, says Rahim Tafazolli, head of the Institute for Communication Systems at the University of Surrey in Guildford, UK. “The target is how can we support a million devices per square kilometre,” he says — enough to accommodate a burgeoning 'Internet of Things' that will range from networked household appliances to energy-control and medical-monitoring systems, and autonomous vehicles (see 'Bottleneck engineering').

Well given the simplistic example above the tools to do this are readily available. They have been known for several decades already, only now can silicon do this. The author continues:

MIMO is already used in Wi-Fi and 4G networks. But the small size of smartphones currently limits them to no more than four antennas each, and the same number on base stations. So a key goal of 5G research is to squeeze more antennas onto both. Big wireless companies have demonstrated MIMO with very high antenna counts in the lab and at trade shows. At the Mobile World Congress in Barcelona, Spain, in February, equipment-maker Ericsson ran live indoor demonstrations of a multiuser massive MIMO system, using a 512-element antenna to transmit 25 gigabits per second between a pair of terminals, one stationary and the other moving on rails. The system is one-quarter of the way to the 100-gigabit 5G target, and it transmits at 15 gigahertz, part of the high-frequency band planned for 5G. Japanese wireless operator NTT DoCoMo is working with Ericsson to test the equipment outdoors, and Korea Telecom is planning to demonstrate 5G services when South Korea hosts the next Winter Olympics, in 2018.

As noted above the MIMO function can be at the cell site not at the end user device. Thus the above argument is a straw man at best. At worst it may be a gross misrepresentation.

One comment on this article states:

Unfortunately, this article is unsound and should be withdrawn. Currently and for many years, congestion on the Internet backbone and most local broadband in the developed world is extremely rare. They are almost never a bottleneck, especially the backbone. While traffic has gone up, as noted, Moore's Law has brought down the cost of carrying bits at about the same rate. This has been established by, among others, ..... As ... notes, the only evidence of congestion in the article (except local like a convention center) is a failure of HBO to meet demand. This is almost certainly because HBO didn't buy enough capacity, not that the Internet couldn't handle the volume. I hate denigrating the work of another writer in these tough times, but this one is so misleading it should be retracted to get errors out of the public discussion

The above is in my opinion spot on. Perhaps Nature should stick to genes and molecules and leave the engineering to those who do, or have done, it for a living.

Saturday, August 13, 2016

Wireless v Fiber

There have been some recent moves in expanding wireless. Fiber is still fiber. Let us examine the differences and try to explain to some people what the facts are.

Recently in Backchannel[1] one of the writers, a lawyer I believe by calling, has made statements which in my opinion and my experience are not just wrong; they appear to be outright fabrications, based on nothing that would be acceptable to anyone with even a modicum of competence.

Let me first restate some bona fides. Besides a PhD in EECS from MIT, in communications, I have some fifty years of design and deployment experience in wireless and fiber. One need look no further than a list of hundreds of papers on the topic. I have built out fiber in about twenty countries and frankly found the US the most difficult, due to franchise rules and pole-attachment regulations. The incumbents in the US have a permanent barrier to entry against any new entrant. Put that aside for the moment.

Let us first compare fiber and wireless.

I. Fiber

Fiber has substantial capacity. Yet it faces many hurdles and costs an excessive amount of capital per subscriber. Let me list the hurdles:

1. Franchises: In every town and state there are franchise requirements. You just can't build out a fiber system; you must get permission. The problem is twofold: towns have Selectmen or the like who are generally clueless, often supported by cable companies, and willing to spend months if not years negotiating a franchise; and the process adds thousands of dollars to the cost per subscriber, a cost all too often not recognized.

2. Pole attachments: If you get a franchise, then you have to get pole rights or other rights of way. You cannot start those negotiations until you have a franchise, so the process is sequential, and the incumbent who owns the poles is in no hurry to get to the end.

3. Build Out: The laying of fiber has diseconomies of scale. Even assuming the fiber is free, which it is not, labor costs are always increasing and the delays are ever expanding. What may have cost $50,000 per mile five years ago is now $75,000.

4. Drops: Assuming you have achieved the above, you must then get to the subscriber. That is the drop. It generally must be buried, and if, say, you are in New England, the rocks and the like will drive the costs to extreme levels.

5. Capital: The capital per subscriber can readily exceed $5,000, which is quite excessive. If the above hurdles were non-existent, one could do it for nearly a tenth of that, but they are real. Our lawyer friend seems ignorant of these facts.
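The hurdles above compound into the capital-per-subscriber figure. A minimal cost-model sketch makes the arithmetic explicit; every input below other than the $75,000 per mile from the text is an illustrative assumption, not data from any actual build.

```python
# Minimal sketch of fiber capital per paying subscriber.
# Only cost_per_mile comes from the text; all other inputs are
# illustrative ASSUMPTIONS (suburban density, take rate, drop cost).

def capital_per_subscriber(cost_per_mile: float, homes_per_mile: float,
                           take_rate: float, drop_cost: float,
                           franchise_overhead: float) -> float:
    """Build cost allocated to each paying subscriber."""
    plant_per_home_passed = cost_per_mile / homes_per_mile
    plant_per_subscriber = plant_per_home_passed / take_rate
    return plant_per_subscriber + drop_cost + franchise_overhead

cost = capital_per_subscriber(
    cost_per_mile=75_000,    # today's construction cost (from the text)
    homes_per_mile=50,       # assumed suburban density
    take_rate=0.33,          # assumed: one home in three subscribes
    drop_cost=700,           # assumed buried-drop cost
    franchise_overhead=300,  # assumed legal/permitting cost per sub
)
print(f"Capital per subscriber: ${cost:,.0f}")
```

With those assumed inputs the figure lands above $5,000 per subscriber, consistent with the claim in point 5; note how a one-in-three take rate alone triples the plant cost each subscriber must carry.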

II. Wireless

Wireless is a totally different tale. The key difference is the lack of infrastructure. You do not need a franchise; you need a license, but if you already have one, so be it. Here are the advantages of wireless:

1. Ever-Scalable Technology: The introduction of 4G with OFDM allowed the bits per second per Hz to go from 1 to 10. For 5G we see that, using multiple-beam antennas, we can go from 10 times to 100 times! That means each user can get well in excess of 10 Gbps.

2. Capital is Incremental: Unlike fiber, and even more so unlike a satellite system, wireless capital per subscriber can be deployed incrementally. I demonstrated that twenty-five years ago! Again, we did it. Fiber requires a build of infrastructure up front. Wireless builds out as we follow the customers.

3. Technology Changes in Short Time Periods: Cable TV converters are on average 10-15 years old. They seem never to be replaced or upgraded. A wireless device is upgraded every 18 to 24 months! Thus the customer can follow the technology curve. As one upgrades cell sites using software-defined modems and the like, the technology is always at the leading edge and the capital burden on the infrastructure provider is low.

4. Distributed Systems Can Evolve: As we build out systems we can do so in a distributed manner. WiFi can be integrated with backbone wireless, and mesh networks are readily available.
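The throughput claim in point 1 is simple arithmetic: per-user throughput is channel bandwidth times spectral efficiency. In this sketch the 10 and 100 bits/s/Hz figures come from point 1; the 200 MHz channel allocation is my assumption.

```python
# Per-user throughput = bandwidth x spectral efficiency.
# The efficiency figures come from point 1 above; the 200 MHz
# channel is an ASSUMED allocation, not a specific band plan.

def throughput_bps(bandwidth_hz: float, bits_per_sec_per_hz: float) -> float:
    """Link throughput in bits per second."""
    return bandwidth_hz * bits_per_sec_per_hz

assumed_bw_hz = 200e6  # assumed channel allocation

for label, eff in [("4G (OFDM), ~10 bps/Hz", 10),
                   ("5G target, ~100 bps/Hz", 100)]:
    gbps = throughput_bps(assumed_bw_hz, eff) / 1e9
    print(f"{label}: {gbps:.0f} Gbps in 200 MHz")
```

At 100 bits/s/Hz even a 200 MHz channel yields 20 Gbps, which is where the "well in excess of 10 Gbps" claim comes from.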

We have argued again and again that the Google fiber builds were fruitless. Now we see they want to build out wireless. Is Google seeing the light? Not really; they needed licenses. If there is, however, a sharable band, then perhaps they can execute this strategy.

Now for what in my opinion are the falsehoods of this lawyer:

Statement 1 is:

One way to increase the information-carrying capacity of a wireless network is to encode data on those wobbling frequencies more efficiently. The standards you’ve heard about — CDMA, 3G, LTE — they’re all about jamming more data into each unit (hertz) of spectrum. A new 5G set of standards will do the same thing, in an even fancier way: the antennas for very, very high frequencies can be so tiny that you can put 8 or 16 of them into a handset or base station and then have them all work together in an array to create a beam of data. Tons and tons of data can be carried on those aggregated beams. Transmission beams in an array can be steered in milliseconds to point to an individual user. You couldn’t do this kind of thing at lower frequencies, because many antennas would need, say, three feet of space — and you can’t fit that into a handset.

Yes, you can use small antennas at the lower frequencies. Ever hear of Ham radio? I have a 140 MHz handheld set, and one can create a beam from a set of small antennas. Ever hear of WiFi? Even 802.11n uses MIMO, with multiple antennas. The above statement is just wrong. But even more so, the real antennas are beam-formers at the base station! I did this in 1992, and filed it with the FCC for my Pioneer Preference. We developed it jointly with MIT Lincoln Lab. The military has done this for decades. The statement as presented is just wrong, totally wrong!
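The physics behind the antenna-size argument is just wavelength: phased-array elements are typically spaced a half wavelength apart, and wavelength scales inversely with frequency. A quick sketch (the three example frequencies are my own choices for illustration):

```python
# Half-wavelength element spacing vs. carrier frequency.
# Array apertures shrink linearly with wavelength, which is why mmWave
# handsets can carry many elements, while at VHF the beam-forming array
# belongs at the base station, where a meter of spacing is no problem.

C = 299_792_458.0  # speed of light, m/s

def half_wavelength_m(freq_hz: float) -> float:
    """Typical phased-array element spacing (lambda/2), in meters."""
    return C / freq_hz / 2

for f in (140e6, 2.4e9, 28e9):  # Ham VHF, WiFi, proposed 5G mmWave
    print(f"{f/1e9:6.3f} GHz -> lambda/2 = {half_wavelength_m(f)*100:7.2f} cm")
```

At 140 MHz the spacing is about a meter, impractical in a handset but trivial at a tower; at 28 GHz it is about half a centimeter, which is what makes many-element handset arrays feasible. Neither fact contradicts beam-forming at lower frequencies done at the base station.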

She continues:

Until there’s a standard, carriers that want to be able to reach global markets won’t be anxious to make devices that will work in just a few places. They want to be able to use the same frequencies everywhere. Current phones and other widely-used private-sector communications devices have radios that transmit and receive only frequencies below 6 GHz, and the very, very high frequency spectrum that the FCC recently said it would open up for 5G purposes is all above 24 GHz. So we have a huge legacy replacement problem that will take a while to overcome and requires a standard to fix. All of this takes years.

Now back in 1990-92 I was COO of what is now Verizon Wireless. I worked with Qualcomm to introduce CDMA. We worried about turning the ship from analog to digital, but it worked, seamlessly. Frankly, I would suggest that the customer never noticed. Why? Simply because the replacement time for handsets is about two years! With such a short replacement cycle, the turnaround is painless. It did not and does not "take years". I did it! Good Lord, look at the facts!

She continues:

Again, wireless and fiber are complementary. Carriers know this. People call the cables between cell towers and central network offices “backhaul,” and when Verizon launched its 4G LTE network in the US covering 93% of the population it needed about 30,000 towers, each one of which had to have a fiber connection. But for a high-frequency 5G spectrum to cover that same population, you’d need to reach many millions of towers and base stations with fiber. Remember, you need to be very close to base stations to pick up and transmit these ginormous amounts of data across high-frequency airwaves. We’re going to have to have fiber interconnection points right next to houses and office buildings, and in many places fiber running inside those buildings. And to reach indoor areas with reliable high capacity, you’ll need multiple antennas inside rooms that can beam signals towards you from multiple angles (to avoid the “people as bags of water” problem).

Backhaul has always been with us. We actually used wireless for many backhaul links, but fiber works as well, and it can be shared among multiple carriers, as the cell towers are. I agree that the ultra-high bands proposed for the new releases are short-range; worse, they do not work well in humid environments (tried that one, folks). So the higher bands may not be the right choice, and why then does one assume millions of towers? That is, in my opinion, a stupid idea! One may instead have an evolving multi-tiered network, with micro- and nano-cell nodes at customer premises, as we have WiFi today.
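The short range of the high bands can be quantified with the standard Friis free-space formula: path loss grows with 20·log10 of frequency, so moving from 2 GHz to 28 GHz costs about 23 dB at the same distance, before the rain and humidity losses noted above. A sketch (the two frequencies and the 1 km distance are my own illustrative choices):

```python
# Friis free-space path loss: FSPL(dB) = 20*log10(4*pi*d*f/c).
# Illustrates why the ultra-high 5G bands are short-range: the
# frequency term alone adds ~23 dB going from 2 GHz to 28 GHz,
# before atmospheric and rain attenuation make things worse.

import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss in dB."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

d = 1000.0  # assumed 1 km link for comparison
loss_2g = fspl_db(2e9, d)
loss_28g = fspl_db(28e9, d)
print(f" 2 GHz @ 1 km: {loss_2g:.1f} dB")
print(f"28 GHz @ 1 km: {loss_28g:.1f} dB")
print(f"Penalty      : {loss_28g - loss_2g:.1f} dB")
```

That frequency penalty is why mmWave cells must be small, but small cells at customer premises, layered over the existing macro grid, are a far cry from "millions of towers."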

Also, 5G is a technological change, an evolution. One can use the old bands but with the new technology. One can get 100 bps/Hz, and with a few hundred MHz and adaptive beam-formed antennas one gets Gbps links to local users on demand.

It is my opinion that this lawyer has put forth a straw man that does not in any manner reflect reality. Indeed, no one is proposing building millions of towers. Engineers just are not that, shall we say, stupid. I cannot perhaps say the same for those technologically impaired.

Dog Days of August

In one of the left-wing blurbs the author writes:

Heat waves are a growing trend across the globe. This summer, the Middle East has experienced record temperatures. July saw the heat index in Iran and the U.A.E. hit an almost unimaginable 140ºF. Actual temperature in the region hit hemispheric records of 129ºF. According to Live Science, most people will experience hyperthermia after 10 minutes in 140ºF. While we are not seeing temperatures go quite that high — yet — it is the unrelenting heat that kills. High temperatures combined with high humidity mean no relief. Even at night, temperatures do not fall sufficiently for the human body to adequately cool itself. In 2010, 55,000 people in Russia died in a heat wave. In 2003, 70,000 people across Europe were killed by heat. For decades, scientists have been warning that climate change ...

Well, it just is August. For those of us who work outside, especially protecting ourselves from the rays of the sun, it is a challenge. But it is August. I have daily records for the past thirty years of hybridizing, and frankly there is no substantial change.

I recall the summer of 1964 at NY Tel, my summer job, with no air conditioning at 140 West Street. When the temps exceeded 90 for more than 4 days we got out at 2:30. That did happen frequently. Then I took the Broadway Line to 242nd Street, from 95F to 120F in the subway! With a half million or more people at rush hour! Try that one on.

So have things changed much? Slightly but not warranting the hysteria. Data and facts are strange things. Just watch the dogs. BTW it is 64F in Moscow right now!

Sunday, August 7, 2016

Windows 10 New Release

Somehow the folks at Microsoft must really hate their customers. Really, really, really!

The new Windows 10 update takes 5-10 hours to process! Really. Then you find that your Logitech interfaces no longer function: no video. Then you also find you must re-register some expensive software and hope it takes. It goes on and on. If the developers worked for me it would have been their last day on the job! But that seems not to be the case with Microsoft management.

For what? Well, they changed the format, so you now have to get used to a new layout. Yes folks, you spent a year learning the old one and now you have a new one.

They seem to be the spawn of Satan! Google works the same now as it did fifteen years ago. It just works, although we know a great deal has changed. Yet Microsoft seems to continuously annoy and aggravate its customers.

Stop it guys! Really! We paid for this piece of junk so stop messing around with it. Perhaps it is Seattle. Too much coffee?

Oh yes, and one more thing. In the previous incarnation W10 booted very quickly. Now it takes 5-8 minutes! On several machines, even a simple laptop! What have these morons done? It is faster to boot an XP machine.

Wednesday, August 3, 2016

Cable and Privacy

Privacy is a difficult issue, especially in the world of the Internet. Back in the telephone days the customer records were sacrosanct. One needed a court order to tap a line, to see customer records, and the like. In today's world we are all too eager to assume that such is still the case, and the FCC seems to be trying to make it so. Yet the CATV companies not only want unfettered access to our records but also want to monetize them.

ArsTechnica reports on the Comcast proposal to the FCC. They note:

Comcast executives met with FCC officials last week, and "urged that the Commission allow business models offering discounts or other value to consumers in exchange for allowing ISPs to use their data," Comcast wrote in an ex parte filing that describes the meeting.

Now one can view this another way: namely, the customer must pay more to keep their privacy, rather than get a discount. There is no price regulation in CATV; it is totally unregulated. Frankly, if one desires to allow the CATV company to see and monetize their information, fine, but that should be an "opt in" approach, not an "opt out".

Tuesday, August 2, 2016

Student Loans and the Next Bubble

The above is the summary of outstanding student loans. As of January 2016 they were in excess of $1.4 trillion, yes, "trillion". Unlike the housing bubble, there really is no asset behind it. Unless the student is a Chemical or Electrical Engineer, we really don't need any more Progressive Political Scientists. We need people to create, not to talk. In fact we most likely don't need any more MBAs or Lawyers.

What we really do not need is to continue this explosion. It is three times the Medicare load. And for that people paid in and continue to pay in.

Frankly this is truly terrifying. It is all unsecured and one Party is saying they will just wipe it out. Really!

Monday, August 1, 2016

Here is Another Reason for Healthcare Stress

In a recent Healio article they discuss the duty of physicians to identify CMS overpayment mistakes, then contact CMS and make restitution, all within 60 days. They state:

Have you ever heard someone in your office say, “I think Medicare may not have paid these claims correctly?” After that statement, did the relevant documents get buried in a pile on a desk — never to be reviewed again? Be careful. Earlier this year, CMS issued a final rule governing the responsibility of providers to report and return Medicare overpayments. The rule makes clear that providers are responsible for identifying and repaying overpayments that may have occurred within a 6-year lookback period, and repayment must be made to Medicare within 60 days of identifying the overpayment. Failure to take action could trigger liability under the “reverse” false claims provisions of the federal False Claims Act, which punish the retention of federal dollars paid in excess of the actual amount to which the provider was entitled....The rule applies to overpayments arising within Medicare Parts A and B, and it requires any Medicare provider or supplier that has identified an overpayment to report and return the overpayment within 60 days after the date on which the overpayment was identified. Somewhat helpfully, the final rule recognizes that providers cannot meaningfully “identify” an overpayment without also confirming and quantifying the overpayment; consequently, under the rule, “a person has identified an overpayment when the person has, or should have through the exercise of reasonable diligence, determined that the person has received an overpayment and quantified the amount of the overpayment.”

Now, on top of EHR management, paperwork, and the new CMS payment scheme, you have to become the auditor for CMS as well as their working-capital manager.

One should wonder how many such traps there are out there today. Physicians will all end up as employees and getting salaries, working 40 hour weeks, and ultimately getting paid accordingly. There will be no more Oslers, Cushings, Kaplans and the like.