ACG Research

We focus on the Why before the What

Thursday, June 23, 2011

Cloud Applications that Solve Your E-Mail Headaches: Stop Managing Your Inbox NOW!

Dealing with e-mail attachment limits can consume considerable time and resources. We’ve all experienced problems sending graphics files, video files or large documents. Because most companies impose limits on the size of files that can be sent, they need a file-transfer solution outside their internal e-mail systems. Several companies, such as Dropbox and YouSendIt, offer FTP-replacement services. These companies can easily transfer your file for free (the limit is 50 MB to 100 MB in most cases) or offer low-cost plans to solve your e-mail issues. Consider the following:
  • 107 trillion: The number of e-mails sent on the Internet in 2010
  • 294 billion: Average number of e-mail messages per day
  • 1.88 billion: The number of e-mail users worldwide
  • 480 million: New e-mail users since 2010
  • 89.1%: The share of e-mails that are spam
  • 262 billion: The number of spam e-mails per day (assuming 89.1% are spam)
  • 2.9 billion: The number of e-mail accounts worldwide
  • 25%: Share of e-mail accounts that are corporate
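The spam volume in the list follows directly from the daily total; a quick check, using the 294 billion and 89.1 percent figures above:

```python
# Sanity-check the spam estimate from the figures above.
daily_emails = 294e9   # average e-mail messages per day
spam_share = 0.891     # 89.1% of e-mails are spam

spam_per_day = daily_emails * spam_share
print(f"{spam_per_day / 1e9:.0f} billion spam e-mails per day")  # ~262 billion
```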
The majority of e-mail users are on Microsoft Exchange (52 percent), Lotus Notes (21 percent) or Novell GroupWise (6 percent). Analysts estimate that hosted options such as Microsoft Office 365 (formerly BPOS) and other cloud-based offerings will grow from 12 percent to 31 percent of the market by 2012.

These cloud companies offer suites of products that give a company, domain or department unique visibility, control and security without burdening your IT department. Other enhancements, such as Active Directory integration for Microsoft e-mail (52 percent of the market uses Microsoft), deliver seamless integration and plug-ins for Outlook and Exchange.

The benefit of using these cloud offerings is that they address IT security and compliance risk, the number one concern of enterprises. Most importantly, they complement the resources of your IT department and free your staff to focus on mission-critical goals instead of the time-consuming task of managing e-mail.

To read more from Lauren Robinette, click here.

Lauren Robinette

Core router spending to rise with demand for next-gen platforms

Although carriers are investing heavily in their networks, especially in edge network equipment and services, they have been neglecting core routers. Over the last few years carriers have under-invested in the core, choosing instead to put more intelligence in their edge networks and to develop more services. This has put stress on the core.

But that will change in the near term: the growing volume and complexity of IP traffic continues to stress networks, and core routing platforms are reaching the end of their life cycles or no longer meet network demands. These pressures will push vendors to develop routers with higher capacities, greater port density and improved scalability.

Read what Ray Mota and others have to say about next-gen platforms in SearchTelecom's complete article.

Ray Mota

Monday, June 20, 2011

Content Delivery Network: Market Growth in High Gear

Market will reach $5B by 2016

ACG Research has released its report on the content delivery network (CDN) service provider market. The CDN service provider market grew more than 20 percent in 2010 and is expected to grow at a 22 percent CAGR through 2016. The CDN market is being powered primarily by growth in the video delivery segment, which is expected to grow more than 30 percent per year over the next five years as a result of the surging popularity of over-the-top video.
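The forecast figures imply a current market size that can be backed out from the endpoint; a quick sketch, assuming the $5B figure applies to 2016 and the 22 percent CAGR compounds over the six years from 2010:

```python
# Back out the implied 2010 CDN market size from the forecast figures.
market_2016 = 5e9   # $5B forecast for 2016
cagr = 0.22         # 22% compound annual growth rate
years = 6           # 2010 through 2016 (assumed span)

implied_2010 = market_2016 / (1 + cagr) ** years
print(f"Implied 2010 market: ${implied_2010 / 1e9:.1f}B")  # ~$1.5B
```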

This rapid growth is changing the competitive landscape as incumbents and well-heeled new entrants (telcos, MSOs and cloud service providers) vie for market share. “We fully expect to see continued downward price movement as contenders push towards reaching critical mass,” commented David Dines, principal analyst, video infrastructure and CDN. “At the same time competitors will be differentiating themselves on service level agreements and value-add services such as video management and analytics.”

ACG’s comprehensive report includes a white paper, market share report (Excel spreadsheet), forecast and executive look PowerPoint presentation.

For more information about purchasing this report or about ACG’s CDN syndicated service, contact Karen Grenier,

Thursday, June 16, 2011

Leveraging Investment in Fiber Optic Communications

Though utilities use only a fraction of the broadband capacity they install to support smart grid applications, it is relatively easy to justify investing in that capacity. If utilities were to lease fiber optic capacity to providers of general broadband services, companies in both sectors would benefit, and so would their customers.
  • Why is it so much easier to build a business case for smart grid communications than for broadband services?
  • Why does it make sense for utilities to invest in fiber optics for smart grid even though its bandwidth requirements are modest?
  • Why does joint use of the communications facilities benefit consumers?
Click here to read Michael Kennedy's complete article (IEEE Smart Grid Newsletter, June 2011).

To read more from Michael Kennedy, click here.

Michael Kennedy

Wednesday, June 15, 2011

ACG Research Acquires Network Strategy Partners (NSP)

Merger will drive economic value for service providers and vendors with deep dive validated business case analyses

ACG Research announces the acquisition of Network Strategy Partners (NSP), a globally respected business modeling and TCO/ROI consulting firm. ACG Research, an analyst and consulting firm, provides market research, consulting, service creation, go-to-market strategies and playbooks in the service provider space.

NSP consultants will extend ACG Research’s strong presence in the service provider space by bringing an extensive multidisciplinary background in management consulting, industry analysis, financial analysis and communications engineering to create and communicate independent and unbiased business case analyses.

Michael Kennedy, PhD, managing partner and cofounder of NSP, will be responsible for ACG’s Business Case Modeling division. Dr. Kennedy’s professional experience includes leadership positions in Arthur D. Little’s enterprise network consulting practice, Director of Consulting Services for Strategic Networks, and Managing Partner of Network Strategy Partners. Notable engagements include helping service providers develop the concept of IP VPN, creation of the first business cases for the Metro Ethernet Forum, development of the most in-depth OpEx model for network operations, and planning and overseeing construction of the first fiber-optic network for three of the largest U.S. electric utilities. He also worked as an industry analyst as V.P. of Gartner’s Strategies in Telecommunications Services advisory service. As a financial analyst Michael provided equity research on the “sell-side” at Soundview Financial Corporation, and at AT&T and Bell Labs where he developed financial analyses of the strategic plan, econometric analysis for the defense of U.S. vs. AT&T, forecast telecommunications demand, and performed engineering cost studies of transmission and outside plant projects.

“Business case analyses must go beyond simple financial analyses and dive deep into the underlying technologies,” says Michael Kennedy. “This provides the rigorous business case analyses that service providers need to build solutions that control cost and develop differentiated and profitable new services.”

"We are extremely excited about the acquisition and having Dr. Kennedy on our team,” says Ray Mota, PhD. “The future of sales to service providers will be driven by aligning all features and benefits to an economic value. Vendors that align their product mix with economic value will succeed. And with Michael leading our business case analysis consulting services, ACG will be on the cutting edge in providing the services to service providers and vendors that help them determine their economic value.”

To download Michael Kennedy's business cases, click here.

Contact Michael at

Wednesday, June 8, 2011

M2M: A Big Deal for Networks or Just Another Flash in the Pan?

I find it amusing that the tech industry seizes on a topic and tries to make it a market space or definition that just does not make sense. If we look back over the last 10 years, we have many examples of pundits trying too hard to push ideas that are essentially fads or are not sufficiently differentiated. (Full disclosure: I am one of the industry analysts who contribute to this frenzy at times, so I am criticizing myself to a degree.)

When it comes to machine to machine (M2M), I do not understand why it is getting so much attention in the networking space. Sure, I see the forecasts predicting that there will be 10x as many machines as people and a need to connect 70 billion additional machines. Even if the forecasts are right, these machines will not create very much network traffic. The average device will not need constant monitoring and will not send a steady stream of data. How much data do you need to get from a refrigerator or thermostat? Temperature, humidity and amperage can be monitored continually, but even home energy monitoring software needs that data at most once a minute. Plus, we forget that they are machines: they need bits and bytes, not pretty pictures and video, to convey the relevant information.

How much data would this mean for the network? Even if, very generously, a device were to send data once every 10 seconds, that would be 8,640 transmissions per day; at 1 kilobyte per transmission, that equals 8.64 MB per day, or about 3.2 GB per year. This is the equivalent of one Netflix movie per year. Multiply that by the expected number of connected machines per household and we might see the equivalent network load of 64 GB per year per household in 5 to 10 years.
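The back-of-envelope arithmetic can be sketched directly (the 10-second interval and 1 KB payload are the generous assumptions used above):

```python
# Back-of-envelope M2M traffic per device, using generous assumptions.
seconds_per_day = 24 * 60 * 60
interval_s = 10     # one transmission every 10 seconds
payload_kb = 1      # 1 kilobyte per transmission

tx_per_day = seconds_per_day // interval_s      # 8,640 transmissions per day
mb_per_day = tx_per_day * payload_kb / 1000     # 8.64 MB per day
gb_per_year = mb_per_day * 365 / 1000           # ~3.2 GB per year
print(tx_per_day, mb_per_day, round(gb_per_year, 1))
```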

This is being generous because many of the devices will not have to transmit all of the data over the network; much of it will be kept locally, with aggregated data sent on a schedule or on demand. As a sanity check, I asked a developer in the smart grid/demand side management sector what their traffic pattern looks like. They have devices and meters that talk to each other (over ZigBee, not WiFi). One device acts as a gateway and sends ID and usage data every 15 minutes over IP to a centralized server. Using these assumptions, the data transiting the Internet would be 100 to 1,000 times less.
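The gateway pattern cuts the number of Internet transmissions sharply; a rough comparison, assuming (for illustration) that a gateway's 15-minute report is comparable in size to a single device's 10-second report:

```python
# Compare per-device streaming vs. a gateway batching readings every 15 minutes.
naive_tx_per_day = 24 * 60 * 60 // 10   # one report every 10 seconds: 8,640/day
gateway_tx_per_day = 24 * 60 // 15      # one report every 15 minutes: 96/day

reduction = naive_tx_per_day / gateway_tx_per_day
print(f"~{reduction:.0f}x fewer transmissions over the Internet")  # ~90x
```

With multiple devices batched behind each gateway, plus local aggregation, the per-device reduction moves toward the 100x-to-1,000x range.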

When I mention this data to some proponents, they respond that video for home monitoring is going to be the big M2M app. My argument is that it probably does not make sense to broadcast all of this data across the net if no one is watching. It will work much as webcams and video on demand do today: the stream will be set up when requested.

This also brings me to my other point. M2M does not deserve its own category; it is a technology that will be deployed as part of other application areas, such as home monitoring and smart grid, to serve a purpose for the application. It is not an end in itself.

To put this in perspective, OTT video is happening now; it is 40 percent of all traffic (Netflix alone is 30 percent of prime-time traffic) and will grow fivefold over the next five years. By my rough estimation, M2M traffic, if it takes off, will take three to five years to add the equivalent of one streamed movie per year per household. Other issues such as addressing, security and management will not be big issues either, because practicality dictates that most devices will use a low-power, self-organizing wireless protocol such as ZigBee or Z-Wave and will therefore not need an IP address.

Do you agree? I look forward to a debate on this.

David Dines

Monday, June 6, 2011

A Business Case for Scaling the Next-Generation Network with the Cisco ASR 9000 System

That video makes up the majority (90 percent) of total consumer IP traffic and is quickly overtaking mobile data traffic is not new information; the type of equipment vendors are developing and how they are responding to the video challenge, in some cases, is. With broadband operators demanding that routing solutions address the rapidly increasing bandwidth requirements video is putting on their networks, the pressure is on for vendors to respond.

Cisco has just announced a new routing solution that it claims can cost efficiently scale as traffic grows. Cisco also states that the ASR 9000 System reduces OpEx by as much as 71% and that its virtualized technology reduces TCO by up to 73% over competitive solutions that lack this technology.

To find out whether the ASR 9000 System enables service providers to profitably scale networks to support bandwidth-intensive applications, ACG Research conducted a business case analysis that compares the cash flow and six-year cumulative total cost of ownership of the Cisco ASR 9000 System with the routing solutions of two leading vendors.

In this business case we examine virtualized systems and determine whether the Cisco ASR 9000 System and its network virtualization technology achieve TCO savings when deployed across triple-play and quad-play access and aggregation networks. We determine whether the costs associated with router networks are reduced with the system, and we analyze the OpEx and CapEx claims.

To read the business case, click here to download the PDF.

For more information about ACG Research’s business case analysis service, contact our sales department at