Regulations are a critical factor in the access network. Unlike the “rest of the network,” the access network is burdened with federal, state, and local regulations, and this is only getting worse. I’ve written extensively in the past that net neutrality is a bad idea and that Title II is a gigabit killer.
Why is regulation bad for everyone, including Google? The regulated monopoly “phone companies” depreciated equipment over 30 years. Under asset-based pricing regulations, you want to keep your asset base as high as possible. Thus, the innovation cycle of the regulated voice industry was 30 years. In the unregulated data networking industry the desired depreciation cycle is five to seven years, with three to five years being a more common life span of equipment. Thus, the innovation cycle is three to five years. Today, service providers want to accelerate their innovation cycle to less than one year, and ideally three to four months, to be more competitive with “web companies” such as Google and Facebook.
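To make the incentive concrete, here is a minimal sketch of rate-of-return arithmetic. The dollar figures and the 8 percent allowed return are hypothetical numbers chosen for illustration, not from any actual tariff; the point is simply that allowed profit scales with the undepreciated asset base, so a 30-year schedule keeps that base, and the profit, high for decades, while a five-year schedule zeroes it out quickly.

```python
# Hypothetical illustration of rate-of-return regulation: the allowed
# profit is a fixed percentage of the undepreciated asset base, so a
# longer depreciation schedule preserves the base (and the profit).

def allowed_profit(cost, life_years, allowed_return, year):
    """Allowed profit in a given year under straight-line depreciation."""
    remaining_base = max(cost - (cost / life_years) * year, 0.0)
    return remaining_base * allowed_return

COST = 1_000_000   # hypothetical equipment cost ($)
RETURN = 0.08      # hypothetical allowed rate of return

# Allowed profit in year 10 under each depreciation schedule:
regulated = allowed_profit(COST, 30, RETURN, 10)    # 30-year "phone company" schedule
unregulated = allowed_profit(COST, 5, RETURN, 10)   # 5-year data-networking schedule

print(f"30-year schedule, year 10: ${regulated:,.0f} allowed profit")
print(f" 5-year schedule, year 10: ${unregulated:,.0f} allowed profit")
```

Under these assumed numbers, the 30-year schedule still yields roughly $53,000 of allowed profit in year 10, while the fully depreciated 5-year asset yields nothing, which is why a regulated carrier has little incentive to retire equipment early.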
Until recently the net neutrality debate was focused on adverse traffic impacts such as throttling P2P traffic. It’s widely reported that as few as 10 percent of users consume upwards of 80 percent of capacity. The numbers have changed with the proliferation of streaming video, but the issue remains. Mobile network operators have solved this problem with data caps. They also have programs where web companies can pay so their traffic doesn’t count against subscribers’ data caps. (This may be illegal soon as well.) When an analogous program (for example, a paid fast lane) was implemented in the broadband access market there was outrage.
Traditional content delivery networks (CDNs) can bypass much of the public Internet to improve quality of service. Companies that want to provide a better user experience can use CDNs and cache their content in select Tier 1 locations across the country. This helps; however, delivery from the Tier 1 cache to the user remains best effort. Once the traffic enters the local exchange carrier’s (LEC) network in a large metropolitan area, the “last 50” miles are best effort.
With this model OTT companies cannot ensure the quality of their service. Why shouldn’t they be able to pay the LEC for better traffic treatment? The argument is that this benefits the large companies to the detriment of start-up companies. It’s just another challenge innovative start-ups must overcome. This actually benefits consumers, as only those companies with a compelling offering will make it over the hurdle. Marginal companies with a marginal offering won’t flood the market and the network with garbage. This is a good thing. Isn’t the FCC all about protecting the consumer?
Can capitalism and the free market address the issue of a “digital divide”? Yes; a case in point is Comcast in the Boston area. The company offers $10/month broadband service to any family that has children on the free or subsidized school lunch program in the city of Boston. No laws, no regulations, just a solid, business-driven move by Comcast.
Service providers have invested billions of dollars deploying and managing broadband networks. Data rates have continuously increased. Gigabit networks are being deployed around the world by a range of companies and organizations. The free market is driving them. It’s counterintuitive to expect them to spend limited CAPEX if their return on investment is regulated or uncertain. Today, regulators are faced with conflicting priorities. On the one hand, they want to spur gigabit investments; on the other, they want to regulate broadband access. It’s obvious that you can’t get both. To repeat: Title II is a gigabit killer.