The following stories have been tagged technical

Early Lessons from Longmont - Community Broadband Bits Podcast 106

Longmont is about to break ground on its citywide FTTH gigabit network, but it is already offering services to local businesses and a few neighborhoods that started as pilot projects. Vince Jordan, previously a guest two years ago, is back to update us on their progress.

Until recently, Vince was the Telecom Manager for Longmont Power and Communications in Colorado. He has decided to return to his entrepreneurial roots now that the utility is moving forward with the citywide project. But he has such a great voice and presence that we wanted to bring him back to share some stories.

We talk about Longmont's progress and how they dealt with a miscalculation in costs that forced them to slightly modify prices for local businesses shortly after launching the service. And finally, we discuss the $50/month gigabit service and how Longmont has been able to drive the price so low.

You can read our full coverage of Longmont from this tag.

We want your feedback and suggestions for the show - please e-mail us or leave a comment below. Also, feel free to suggest other guests, topics, or questions you want us to address.

This show is 20 minutes long and can be played below on this page or via iTunes or via the tool of your choice using this feed.

Listen to previous episodes here. You can download this MP3 file directly from here.

Thanks to Waylon Thornton for the music, licensed using Creative Commons. The song is "Bronco Romp."

Wireless Commons Part 1: Interference Is a Myth, but the FCC Hasn't Caught on Yet

This is the first in a two-part series on spectrum basics and how we could better manage the spectrum to encourage innovation and prevent either large corporations or government from interfering with our right to communicate. Part 2 is available here.

We often think of all our wireless communications as traveling on separate paths: television, radio, Wi-Fi, cell phone calls, etc. In fact, these signals are all part of the same continuous electromagnetic spectrum. Different parts of the spectrum have different properties, to be sure - you can see visible light, but not radio waves. But these differences are more a question of degree than a fundamental difference in makeup.
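To make that "question of degree" concrete, here is a quick back-of-the-envelope sketch (ours, not from any source cited here) using the standard relation wavelength = speed of light / frequency. FM radio, Wi-Fi, and visible light differ only in how fast their waves oscillate:

C = 299_792_458  # speed of light, in meters per second

signals_hz = {
    "FM radio (100 MHz)": 100e6,
    "Wi-Fi (2.4 GHz)": 2.4e9,
    "Wi-Fi (5 GHz)": 5e9,
    "Visible green light (~545 THz)": 545e12,
}

for name, frequency in signals_hz.items():
    wavelength_m = C / frequency  # same physics applies to every entry
    print(f"{name:31s} wavelength ~ {wavelength_m:.2e} m")

Running it shows a smooth slide from a few meters down to hundreds of nanometers - one continuous spectrum, not separate kinds of signals.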

As radio, TV, and other technologies were developed and popularized throughout the 20th century, interference became a major concern. Any two signals using the same band of the spectrum in the same broadcast range would prevent both from being received, which you have likely experienced on your car radio when driving between stations on close frequencies – news and music vying with each other, both alternating with static. 

To mitigate the problem, the federal government did what any Econ 101 textbook says you should when you have a “tragedy of the commons” situation in which more people using a resource degrades it for everyone: they assigned property rights. This is why radio stations tend not to interfere with each other now.

The Federal Communications Commission granted exclusive licenses to the spectrum in slices known as bands to radio, TV, and eventually telecom companies, ensuring that they were the only ones with the legal right to broadcast on a given frequency range within a certain geographic area. Large bands were reserved for military use as well.

Originally, these licenses came free of charge, on the condition that broadcasters meet certain public interest requirements. Beginning in 1993, the government began to run an auction process, allowing companies to bid on spectrum licenses. That practice continues today whenever any space on the spectrum is freed up. (For a more complete explanation of the evolution of licensing, see this excellent Benton Foundation blog post.)

Although there have been several redistributions over the decades, the basic architecture remains. Communications companies own exclusive licenses for large swaths of the usable spectrum, with most other useful sections reserved for the federal government’s defense and communications purposes (e.g. aviation and maritime navigation). Only a few tiny bands are left open as free, unlicensed territory that anyone can use. 

NTIA Spectrum Map

This small unlicensed area is where many of the most innovative technologies of the last several decades have sprung up, including Wi-Fi, Bluetooth, Radio Frequency Identification (RFID), and even garage door openers and cordless phones. A recent report by the Consumer Electronics Association concluded that unlicensed spectrum generates $62 billion in economic activity, and that only takes into account a portion of direct retail sales of devices using the unlicensed spectrum. 

On its face, the current spectrum allocation regime appears to be an obvious solution: an efficient allocation of scarce resources that lets us consume all kinds of media with minimal interference or confusion, and even raises auction revenue for the government to boot.

Except that the spectrum is not actually a limited resource. Thanks to the constant evolution of broadcasting and receiving technologies, the idea of a finite spectrum has become obsolete, and with it the rationale for the FCC’s exclusive licensing framework. This topic was explored over a decade ago in a Salon article by David Weinberger, in which he interviews David P. Reed, a former MIT Computer Science Professor and early Internet theorist. 

Reed describes the fallacy of thinking of interference as something inherent in the signals themselves. Signals traveling on similar frequencies do not physically bump into each other in the air, scrambling the message sent. The signals simply pass through each other, meaning multiple signals can actually be overlaid on each other. (You don't have to understand why this happens, just know that it does.) Bob Frankston belittles the current exclusive licensing regime as handing out monopolies on colors.

As Weinberger puts it:

The problem isn’t with the radio waves. It’s with the receivers: “Interference cannot be defined as a meaningful concept until a receiver tries to separate the signal. It’s the processing that gets confused, and the confusion is highly specific to the particular detector,” Reed says. Interference isn’t a fact of nature. It’s an artifact of particular technologies.

In the past, our relatively primitive hardware-based technologies, such as car radios, could only differentiate signals that were physically separated by vacant spectrum. But with advances in both transmitters and receivers that have increased sensitivity, as well as software that can quickly and seamlessly sense what frequencies are available and make use of them, we can effectively expand the usable range of the spectrum. This approach allows for squeezing more and more communication capacity into any given band as technology advances, without sacrificing the clarity of existing signals. In other words (specifically those of Kevin Werbach and Aalok Mehta in a recent International Journal of Communications paper), "The effective capacity of the spectrum is a constantly moving target."
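As a purely conceptual sketch of that "sense, then transmit" idea (this is our own illustration, not Reed's or Werbach's, and measure_energy is a hypothetical stand-in for real spectrum-sensing hardware rather than any actual SDR API):

import random

def measure_energy(channel: int) -> float:
    """Hypothetical stand-in for a spectrum-sensing measurement, in dBm."""
    return random.uniform(-100, -40)  # placeholder occupancy reading

def pick_clearest_channel(channels, busy_threshold_dbm=-62.0):
    """Listen before talking: return the quietest channel below the busy
    threshold, or None if everything appears occupied right now."""
    readings = {ch: measure_energy(ch) for ch in channels}
    quietest = min(readings, key=readings.get)
    return quietest if readings[quietest] < busy_threshold_dbm else None

# Example: scan Wi-Fi-style channels 1-11 and transmit only if one is clear.
channel = pick_clearest_channel(range(1, 12))
if channel is not None:
    print("transmit on channel", channel)
else:
    print("defer: spectrum appears busy, try again shortly")

The point of the sketch is simply that "capacity" depends on how cleverly transmitters and receivers behave, which is why it keeps growing as the software improves.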

In the next post, we’ll look at how we can take advantage of current and future breakthroughs in wireless technology, and how our outdated approach to spectrum management is limiting important innovation.

Open Technology Institute Report Offers Overview of Public Broadband Options

Publication Date: May 6, 2014
Author(s): Ben Lennett, Open Technology Institute; Patrick Lucey, Open Technology Institute; Joanne Hovis, CTC Technology and Energy

The Open Technology Institute at the New America Foundation, along with CTC Technology and Energy, has released an overview of options for local governments that want to improve Internet access. The report is titled "The Art of the Possible: An Overview of Public Broadband Options."

The paper has been released at an opportune time: more communities than ever are considering what investments they can make at the local level. The Art of the Possible lays out different models, from muni ownership and partnerships to co-ops. The paper examines different business models and assesses the risk of various approaches.

It also includes a technical section, written for non-technical readers, that explains the differences among the various broadband technologies.

From the introduction:

The one thing communities cannot do is sit on the sidelines. Even the process of evaluating whether a public network is appropriate can be beneficial to community leaders as a means to better understand the communications needs of their residents, businesses, and institutions and whether existing services and networks are keeping pace.

The purpose of this report is to enable communities to begin the evaluation of their broadband options. The report begins with an overview of different network ownership and governance models, followed by an overview of broadband technologies to help potential stakeholders understand the advantages and disadvantages of each technology. It then provides a brief summary of several different business models for publicly owned networks. The final two chapters focus on the potential larger local benefits and the risks of a publicly funded broadband project.

Peering: Then and Now on Community Broadband Bits Podcast #96

This week we are welcoming Scott Bradner, a longtime doer, writer, and thinker on Internet matters. Thanks to a listener request, we had already recorded an interview last week discussing peering before the news broke that the FCC would be allowing paid prioritization peering arrangements, which many have said represents the end of network neutrality. We talked prior to the announcement of the FCC's upcoming rules, so we do not discuss them directly.

We explain what "peering" is and why it is essential to the Internet. It gets a little technical but we try to bring it back with simple examples.

Our take on the Comcast-Netflix deal may surprise some listeners because the arrangement is not as far from the tradition of paid interconnection arrangements as some strong supporters of network neutrality maintain. However, we are explicit in noting that monopoly providers like Comcast may abuse their market power to shake down companies like Netflix. That is worrisome, but it may be best dealt with by means other than changing the way peering has historically worked.

We end the show discussing the consolidation of ISPs and the role of symmetry in peering.

Scott recommended these two columns and I strongly encourage readers/listeners to read Barbara van Schewick's post on the subject.

We want your feedback and suggestions for the show - please e-mail us or leave a comment below. Also, feel free to suggest other guests, topics, or questions you want us to address.

This show is 20 minutes long and can be played below on this page or via iTunes or via the tool of your choice using this feed.

Listen to previous episodes here. You can download this MP3 file directly from here.

Thanks to Valley Lodge for the music, licensed using Creative Commons. The song is "Sweet Elizabeth."

Seattle, Gigabit Squared, the Challenge of Private Sector Cable Competition

This is the second in a series of posts exploring lessons learned from the Seattle Gigabit Squared project, which now appears unlikely to be built. The first post is available here and focuses on the advantages massive cable companies already have, as well as the limits of conduit and fiber in spurring new competition.

This post focuses on the business challenges an entity like Gigabit Squared would face in building the network it envisioned. I am not claiming this is exactly what Gigabit Squared faced, but these issues arise for any new provider in that circumstance. I aim to explain why the private sector has not, and generally will not, provide competition to companies like Comcast and Time Warner Cable.

Gigabit Squared planned to deliver voice, television, and Internet access to subscribers. Voice can be a bit of a hassle due to the many regulatory requirements, and Internet access is comparatively simple. But television - that is a headache. I've been told by some munis that 90 percent of the problems and difficulties they experience are with television services.

Before you can deliver ESPN, the Family Channel, or Comedy Central, you have to come to an agreement with big channel owners like Disney, Viacom, and others. Even a massive company like Comcast has to pay the channel owners more each year despite having over 10 million subscribers, so you can imagine how difficult it can be for a small firm to negotiate these contracts. Some channel owners may only negotiate with a provider after it has a few thousand subscribers - but getting a few thousand subscribers without good content is a challenge.

Many small firms (including most munis) join a buyers' cooperative called the National Cable Television Cooperative (NCTC) that makes many of these contracts available. But even with that substantial help, building a channel lineup is incredibly difficult, and the new competitor will almost certainly pay more for the same channels than a competitor like Comcast or Time Warner Cable does. And some munis, like Lafayette, faced steep barriers just joining the coop.


(An aside: if we are going to pretend that competition can work in the telecommunications space, Congress and/or the FCC have to ensure that small providers can access content on reasonable terms, or the ever-consolidating big providers will be all but unassailable by anyone but the likes of Google. Such regulations should include rigorous anti-monopoly enforcement on a variety of levels.)

Assuming a new provider can secure a reasonable channel lineup, it then needs to deliver that content to subscribers, and this is more complicated than one might imagine. From satellite dishes to industrial-strength encryption to set-top boxes, delivering Hollywood content is incredibly complicated.

When confronted with this challenge for its Kansas City network, Google evaluated all the options and decided the only one was to build its own technology for delivering television signals to subscribers. Google has some of the best engineers on the planet, and even they encountered significant challenges, suggesting that route is ill-advised for new companies. Even if Google were willing to share its approach, it was written for the Google ecosystem and would need significant porting to work for other firms.

Several of the recent triple-play municipal FTTH networks used Mediaroom, a technology developed by Microsoft that was recently sold to Ericsson, which has strong connections with AT&T. All of which suggests that delivering television channels is not becoming easier for small, local networks.

From the tremendous challenges of securing television channels to the difficulty of delivering them to subscribers, investors are aware of the mountain a new entrant has to climb before even starting to compete with a massive firm like Comcast.


It remains to be seen whether a network delivering only Internet access (or with telephone as well) will succeed today, but most observers have believed that television is needed to compete effectively for subscribers (and to generate enough revenue to pay for the network). Longmont is bucking that wisdom by deploying a gigabit Internet and phone network throughout its footprint north of Denver, and many are watching intently to see how it fares (our coverage here).

The main lesson from Part II of our Seattle Gigabit Squared analysis is the difficulty a small firm faces in competing against a massive cable company like Comcast, and the resulting reluctance of most investors to fund such firms.

This is not to say it is impossible for small entities to compete, especially entities that can handle a distant break-even point or justify their networks by the many indirect benefits created by such an investment - more jobs, lower prices for telecommunications services, and improved educational opportunities, to name three (see our recent podcast on this subject). In most cases, the kinds of entities that are willing to count indirect benefits on their balance sheets in addition to cash revenues are nonprofit entities.

We strongly support the right of communities to decide for themselves how to ensure their residents and businesses have the connections they need to thrive in the 21st century. We also recognize that many cities, particularly the larger metro areas, would prefer not to directly compete with some of the most powerful firms on the planet, even if those firms also rank among the most hated. Few local governments relish the opportunity to take on such a new challenge, and they understandably search for firms like Gigabit Squared that can assist them, reduce the risks of building a network, and shield them from charges of being godless communists by think tanks funded by the cable and telephone companies.

However, we are not optimistic that many communities will find success with this public-private partnership approach. Indeed, with recent news suggesting that Gigabit Squared left at least $50,000 in unpaid bills behind, the risks of going with such a solution may be greater than previously appreciated.

It is for the above reasons that we continue to believe most communities will be best served by building and operating their own networks, though some may choose to do so on an open access basis where multiple ISPs operate on the network.

That is where we will turn in the final segment of this series. Read that post here.

Big City Community Networks: Lessons from Seattle and Gigabit Squared

A few weeks ago, a Geekwire interview with outgoing Seattle Mayor Mike McGinn announced that the Gigabit Squared project there was in jeopardy. Gigabit Squared has had difficulty raising all the necessary capital for its project, which would build Fiber-to-the-Home in several neighborhoods, in part by using City-owned fiber to reduce the cost of building its trunk lines.

There are a number of important lessons, none of them new, that we should take away from this disappointing news. This is the first of a series of posts on the subject.

But first, some facts. Gigabit Squared is continuing to work on projects in Chicago and Gainesville, Florida. There has been a shake-up among the company's founders, and it is not clear what it will do next. Gigabit Squared was not the only vendor responding to Seattle's RFP, just the highest-profile one.

Gigabit Squared hoped to raise some $20 million for its Seattle project (for which the website is still live). The original announcement suggested twelve neighborhoods with at least 50,000 households and businesses would be connected. The project is not officially dead, but few have high hopes for it given the change in mayor and many challenges thus far.

The first lesson to draw from this is what we say repeatedly: the broadband market is seriously broken and there is no panacea to fix it. The big cable firms, while beating up on DSL, refuse to compete with each other. They are protected by a moat made up of advantages over potential competitors that includes vast economies of scale allowing them to pay less for advertising, content, and equipment; large existing networks already amortized; vast capacity for predatory pricing by cross-subsidizing from non-competitive areas; and much more.

So if you are an investor with $20 million in cash lying around, why would you ever want to bet against Comcast - especially by investing in an unknown entity that cannot withstand a multi-year price war? You wouldn't, and they generally don't. The private sector invests for a return, and overbuilding Comcast with fiber almost certainly requires many years before breaking even. In fact, Wall Street loves Comcast's position, as penned in an investor love letter on SeekingAlpha:

We're big fans of the firm's Video and High-Speed Internet businesses because both are either monopolies or duopolies in their respective markets.

Seattle Conduit

Seattle has done what we believe many communities should be doing - investing in conduit and fiber that it can use internally and lease out to other entities. This is a good idea, but it should not be oversold - these kinds of conduit and fiber projects are typically deployed along major corridors, where the fiber trunk lines are needed. But networks require far more investment in the distribution part of the network, which runs down each street to connect subscribers. With this heavy investment comes the modern-day reality that whoever owns the distribution network owns the subscriber - that owner decides from whom subscribers can take service. (We have more conduit tips from previous Seattle coverage.)

Additionally, different conduit and fiber segments may be owned by various entities, including different departments within a city. This can introduce administrative delays in leasing them, suggesting that local governments should devise a way of dealing with these issues before a network is actually deployed.

Even if a city wanted to lay conduit everywhere for the entire network (trunk and distribution), it would need a network design first. Different companies build different networks that require different layouts for fiber, huts, vaults, etc. Some designs may use far more fiber than others, depending on the network architect's preferences. The result is a limit on just how much conduit can or should be deployed in the hope of enticing an independent ISP to build in the community.

Decisions about the size of conduit and where to lay it either enable or foreclose different types of fiber network approaches (e.g., GPON vs. Active Ethernet). In turn, that can limit who is willing to build a fiber network in the community. The same can be true of aerial fiber attached to utility poles.
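To see why the architecture choice matters, here is a rough, illustrative comparison using common textbook figures (a 2.5 Gbps GPON wavelength shared across a 1:32 split versus a dedicated 1 Gbps Active Ethernet port); real deployments vary, and these numbers are assumptions made for the sake of the example:

# Rough, illustrative comparison; the figures below are assumptions, not a
# description of any particular vendor's gear or any specific city's design.
GPON_DOWNSTREAM_GBPS = 2.5   # one GPON wavelength shared by a splitter group
GPON_SPLIT_RATIO = 32        # a common 1:32 passive split
AE_PORT_GBPS = 1.0           # Active Ethernet: a dedicated lit port per subscriber

gpon_worst_case_mbps = GPON_DOWNSTREAM_GBPS / GPON_SPLIT_RATIO * 1000
print(f"GPON, all 32 subscribers active at once: ~{gpon_worst_case_mbps:.0f} Mbps each")
print(f"Active Ethernet, dedicated port:          {AE_PORT_GBPS * 1000:.0f} Mbps each")

The physical-plant consequence is what matters for conduit planning: roughly speaking, GPON pulls one feeder fiber per splitter group of homes through passive cabinets, while Active Ethernet wants a fiber and a powered port per subscriber, so the two approaches need different amounts of duct space and different hut and vault locations along the same streets.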

Investing in conduit and/or fiber along major corridors may go a long way to connect local businesses and some residents but almost certainly will not change the calculations for whether another company can suddenly compete against a massive firm like Comcast.

And paradoxically, beginning to connect some businesses with fiber through a private partner could make a citywide system less feasible. The firms that are prepared to meet the needs of local businesses may have neither the capacity nor the inclination to connect everyone. But without the high-margin business customers in the neighborhoods, a firm that wants to connect residents may struggle to build a successful business plan. Additionally, some firms may only be interested in serving high-end neighborhoods rather than low-income areas.


This is a major consideration in our continued advocacy for community owned networks. They have an interest in connecting businesses as the first step in connecting the entire community. An independent ISP may only find it profitable to focus on the businesses, though some ISPs share our values of ensuring everyone has access.

In the first Geekwire interview, Mayor McGinn returned to the position he took while campaigning - that the City itself should be playing a larger role and investing its own resources rather than pinning its hopes on distant firms.

McGinn noted that “we haven’t given up on the private sector,” but said that if he were continuing as mayor, he’d start garnering political support to build a municipal fiber utility. That’s actually something the mayor considered back in 2010, after a consultant recommended that the City find a way to build an open-access fiber-to-the-premises communication infrastructure to meet Seattle's goals and objectives.

A feasibility study looking at one particular way of building an open access fiber network put the cost at $700-$800 million. However, there were other alternatives the City did not pursue, opting instead for a far less risky (and far less rewarding) public-private partnership with Gigabit Squared.

Over the next few days, I will explore other lessons. A review of lessons from today:

  • Comcast and other cable companies have tremendous advantages that would-be competitors in the private sector will generally fail to overcome
  • City-owned conduit and fiber help to encourage competition but are subject to significant limitations
  • Communities should invest in conduit in conjunction with other capital projects but should not inadvertently weaken the business case for universal access

Update: The Gigabit Squared deal with Seattle is officially dead. Part II of this series is available here.

Tubes Offers an Internet Tour

If you have been trying to find a book that offers an engaging explanation of how the Internet physically works and the various networks interconnect, search no more. Tubes: A Journey to the Center of the Internet by Andrew Blum has done it.

The author was featured on Fresh Air way back in May, but not much has changed with Internet infrastructure since then.

In Tubes, journalist Andrew Blum goes on a journey inside the Internet's physical infrastructure to uncover the buildings and compounds where our data is stored and transmitted. Along the way, he documents the spaces where the Internet first started, and the people who've been working to make the Web what it is today.

He was also just on C-Span's "The Communicators."

I enjoyed the read and learned a few things along the way. Those looking for a dry, just-the-facts-ma'am approach may not enjoy the frequent musings of Blum on his experiences. But I did.

One of his trips took him to a community in Oregon called The Dalles, where a municipal network allowed Google to build its very first "built-from-scratch data center." More on that in a post to come soon...

Those who are doing their reading on tablets will be interested to know that the eBook is temporarily priced at $1.99. The deal lasts until New Year's, according to the author.

Why Wi-Fi Performance Varies Greatly

Why can Wi-Fi be so great in some places but so awful in others? (Ahem... Hotels.) Time to stop imagining Wi-Fi as magic and instead think of it just as a means of taking one connection and sharing it among many people without wires.

If you take a ho-hum connection and share it with 10 people, it becomes a bad connection. On the other hand, if you take a great connection and share it with those same 10 people, they will be very happy surfers. As Benoit Felten recently told us, the best wireless networks have been built in cities with the best wired infrastructure.
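A toy calculation (with made-up numbers, purely for illustration) shows just how much the backhaul matters when the same ten people share it:

users = 10

# Hypothetical backhaul speeds, in Mbps, chosen only to illustrate the point.
for label, backhaul_mbps in [("ho-hum DSL backhaul", 15), ("fiber backhaul", 1000)]:
    per_user_mbps = backhaul_mbps / users  # worst case: everyone active at once
    print(f"{label}: ~{per_user_mbps:.1f} Mbps per user when all {users} are busy")

That is 1.5 Mbps each versus 100 Mbps each - same Wi-Fi gear, same number of users, wildly different experience.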

Wi-Fi still has hiccups, but they are well worth it for convenience, mobility, and ubiquity. In a recent GigaOm article, Stacey Higginbotham detailed additional reasons that not all Wi-Fi is created equal.

Higginbotham lays out the limitations of Wi-Fi and breaks down the technology's touchiness into four main factors. She addresses the issue specifically for travelers. From the article:

Backhaul: For most, Wi-Fi is their access to the internet, but it’s actually just a radio technology that moves information over the air.

Density:…the more people you add to a network — even if those people are just checking their Facebook page — the worse the network will perform.

Movement: Wi-Fi connectivity is designed for fixed access, meaning the radios stay put...when you try to jump from hot spot to hot spot problems occur.
...
Device: Newer phones and tablets are supporting a dual-mode Wi-Fi radio, which means they can hop from the 2.4 GHz band to the 5 GHz band….now with phones like the latest iPhone that have dual-band support, you may hop to another band only to find a bunch of other users.

Higginbotham doesn't have the answers on how to improve service, but she offers practical advice:

Given these constraints, it hopefully makes a little more sense when you can’t download a movie while on Amtrak or your Facebook video keeps buffering as you surf on the jetway. Unfortunately, with so many variables, there’s not a lot that the Wi-Fi Alliance or even the hot spot provider can always do. If there’s a business case for faster backhaul and a better managed network, the provider will make it happen. But in places like a doctor’s office or free hotel Wi-Fi where that economic incentive isn’t always clear, they may not. And for crowded planes and trains, the solution may just be for users to grin and bear it.

There you go - it isn't magic. But we can do much, much better if we recognize that the incredible value of being able to connect anywhere with devices of our choosing is worth far more to us than it is to the cable and telephone companies, who care only about the maximum we are willing to pay to live in that world.

This is infrastructure, and the communities that can ensure residents and businesses have access to the networks they need at affordable prices will thrive.

How Chattanooga, Bristol, and Lafayette Built the Best Broadband in America

Publication Date: April 9, 2012
Author(s): Christopher Mitchell

We are thrilled to finally unveil our latest white paper: Broadband At the Speed of Light: How Three Communities Built Next-Generation Networks. This report was a joint effort of the Institute for Local Self-Reliance and the Benton Foundation.

We have chronicled how Bristol's BVU Authority, Chattanooga's EPB, and Lafayette's LUS built some of the most impressive broadband networks in the nation. The paper presents three case studies and then draws lessons from their common experiences to offer advice to other communities. Here is the press release:

The fastest networks in the nation are built by local governments, a new report by the Institute for Local Self-Reliance and Benton Foundation reveals

Chattanooga, Tennessee, is well known for being the first community with citywide access to a “gig,” or the fastest residential connections to the Internet available nationally. Less known are Bristol, Virginia, and Lafayette, Louisiana – both of which now also offer a gigabit throughout the community.

A new report just released by the Institute for Local Self-Reliance (ILSR) and the Benton Foundation explains how these communities have built some of the best broadband networks in the nation. Broadband At the Speed of Light: How Three Communities Built Next-Generation Networks is available here.

“It may surprise people that these cities in Virginia, Tennessee, and Louisiana have faster and lower cost access to the Internet than anyone in San Francisco, Seattle, or any other major city,” says Christopher Mitchell, Director of ILSR’s Telecommunications as Commons Initiative. “These publicly owned networks have each created hundreds of jobs and saved millions of dollars.”

“Communities need 21st century telecommunications infrastructure to compete in the global economy,” said Charles Benton, Chairman & CEO of the Benton Foundation. “Hopefully, this report will resonate with local government officials across the country.”

Mitchell is a national expert on community broadband networks and was recently named a “Top 25 Doer, Dreamer, and Driver” by Government Technology. He also regularly authors articles at MuniNetworks.org.

The new report offers in-depth case studies of BVU Authority’s OptiNet in Bristol, Virginia; EPB Fiber in Chattanooga, Tennessee; and LUS Fiber in Lafayette, Louisiana. Each network was built and is operated by a public power utility.

Mitchell believes these networks are all the more important given the slow pace of investment from major carriers. According to Mitchell, “As AT&T and Verizon have ended the expansion of U-Verse and FiOS respectively, communities that need better networks for economic development should consider how they can invest in themselves.”

Broadband At the Speed of Light: How Three Communities Built Next-Generation Networks is available here.

About ILSR: Institute for Local Self-Reliance (ILSR) proposes a set of new rules that builds community by supporting humanly scaled politics and economics. The Telecommunications as Commons Initiative believes that telecommunications networks are essential infrastructure and should be accountable to residents and local businesses.

About Benton: The Benton Foundation works to ensure that media and telecommunications serve the public interest and enhance our democracy. We pursue this mission by seeking policy solutions that support the values of access, diversity and equity, and by demonstrating the value of media and telecommunications for improving the quality of life for all.

Understanding Fiber-to-the-Home Video For Anyone

Thanks to the Fibre Evolution Blog for alerting us to a slick, short video that explains why FTTH is superior to the alternatives when it comes to accessing the Internet. The video was produced by the FTTH Council of Europe and is meant for a very general audience. Enjoy.
