Archive for June, 2014:

9 tech support requests you never want to hear from your family

If you’re a techie, it’s likely that someone in your family will ask you a question about their cell phone, their computer, their iPad, their you-name-it … and not long after, the whole family will be using you as their personal tech support. You become the go-to guy for every tech question, and it’s hard to turn down a plea from your mother or sweet Aunt Mabel. While some problems are easily solved (“There’s no ‘any’ key, Mom, it means just press any key”), other questions will make your blood run cold. Here are the worst of the worst family support requests. If you have favorite family support requests, please comment below, then follow me on Twitter and Facebook.

#1: “Why is Google broken?”
What do you mean “broken”, Aunt Flo? Well, I can’t get to its, what do you call it, home range. OK, what does your browser say? What’s a browser, dear? I’m using Internet Exploder … it says it’s version 6 … is that OK? No, there’s no lights on my, whadyacallit, router … oh, really? Why yes, I did unplug it when I was vacuuming …

#2: “Could you help me move my stuff to a new computer?”
This is the kind of request that makes Sisyphus’ labors look trivial. It’s going to suck up at least 12 hours of your life and Mom won’t understand why it’s taking so long; “I thought I’d be back playing online poker by now, dear. Will it take much longer? Perhaps I’ll have a little drink while you have fun.” Fun? Kill me now.

#3: “Why is my Internet slow?”
That’s ‘cause the Internet is full, Gran. OK, unplug the power from your gateway. Yep, that’s the little box with the lights. No, not the blue cable, that’s the Ethernet connection. There’s no blue one? OK then, green? Oh, there are two black cables; OK, then follow the one that’s plugged into the wall socket and unplug it. You can’t reach it? Well, unplug it where it connects to the box with the lights. It won’t unplug? Ah that’s because it’s the power supply. Follow that cable to the other box with lights … just a sec, I need a stiff drink … now, where were we?

#4: “What (computer | cellphone | pad | gadget) should I buy?”
Best answer: None of ‘em, Cousin Jack. Stick with pencil and paper and we’ll both be happier. Political answer: Well, what do you want to do with said device? How much do you want to spend? So, you can just about afford pencil and paper … see the best answer above.

#5: “I can’t find my file! Would you help me?”
Once again, Uncle Bob has saved his document in some random subdirectory, so you’ve got to walk him through finding it with the file manager … that is, if he ever saved it at all. Oh, yes, Uncle Bob doesn’t realize that most software just isn’t helpful in that way. If you don’t save the file and you just reflexively click on “yes” when asked if you’re sure — as Uncle Bob is wont to do — there’s no force in the universe that can get it back for you. But it doesn’t matter how many times you tell him, he’ll never remember … and so you’ll be finding files and/or looking for nonexistent ones until your last breath.
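If you’d rather hand Uncle Bob a script than click through the file manager with him, a few lines of Python will do. This is a minimal sketch, not a robust tool; the `*.doc*` pattern and the 24-hour window are assumptions you’d adjust to whatever he was writing:

```python
import time
from pathlib import Path

def recently_modified(root, pattern="*.doc*", hours=24):
    """List files under `root` matching `pattern` that changed in the last `hours`."""
    cutoff = time.time() - hours * 3600
    return sorted(
        p for p in Path(root).rglob(pattern)
        if p.is_file() and p.stat().st_mtime >= cutoff
    )
```

Point it at `Path.home()` and it returns every Word-ish document touched today, which is usually where the lost letter is hiding.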

#6: “My keyboard doesn’t work any more …”
Bro! Really? First, if it’s a wireless keyboard, change the batteries. What, no new batteries? You’re off the hook until your brother gets some. What, the keyboard still doesn’t work? Has the dongle come loose? (“Oh, I pulled it out to dust it and forgot to put it back.”) If that doesn’t work, you’ll have to get your brother to go through what’s plugged in where, whether drivers have been removed, whether it’s still paired … the whole nine yards of diagnosing the problem. And even then, your brother will probably finally mutter, “What’s this switch on the side of the keyboard?” At your end, you’ll wind up with key-shaped dents in your forehead …

#7: “My password doesn’t work any more …”
Oh, sweet Jesus. Aunt Mabel’s forgotten her password or she’s using the wrong account name. Again. Just pray to the gods above that she hasn’t been messing with her account details and changed her email address to some random Hotmail account. We all know that you shouldn’t write your account names and passwords down but for Aunt Mabel you need to make an exception.

#8: “Why doesn’t (program | web site) work any more?”
Of course, your sister isn’t necessarily being accurate here, because “doesn’t work” could mean anything from “I can’t find it” to “I used to know how to do (something) but I’ve lost my mind”. This can be a long conversation, because after you’ve slogged your way through what she actually means, you’ll either be running a training session or reinstalling an application from scratch. If it’s an application, just pray she’s got the registration key, because even though the missing key isn’t your fault, her anger has to go somewhere and your ear is the nearest target.

#9: “Remember that problem we fixed a year ago? Well, it’s gone wrong again and …”
This is annoying in every way you can think of. Of course you don’t remember the problem, and you can bet that Dad doesn’t really remember it either, but somehow he’s joined up the dots in some random fashion and is convinced the cause was something “we” (ha!) fixed. This is where you start crumpling a piece of paper in front of the phone and say, “Sorry, Dad, we have a lot of interference on the line, let me call you back in a bit …”


Microsoft kills off plan to pay people to write good things about Internet Explorer

It was only a couple of days ago that Microsoft released its Internet Explorer Developer Channel, “a fully functioning browser designed to give Web developers and early adopters a sneak peek at the Web platform features we’re working on.” Any chance IE might have gotten some long-term social media love was dashed after a clueless “social strategist on behalf of Microsoft” invited the wrong person to write something positive about Internet Explorer.

The “wrong” person was TechCrunch founder Michael Arrington who posted the unsolicited blog-for-pay letter on Uncrunched. The “strategist” was from the advocate marketing firm SocialChorus, which lists Bing as a customer. In part, the message stated:

The new Internet Explorer is a brand new experience with many different features. This reworked Internet Explorer lets you search smarter and do more with its cool new features, such as multitasking, pinnable sites, and full-screen browsing.

In this program, we are looking to spread the word about the new Internet Explorer web experience in a cool, visual way, which is where you come in! Internet Explorer has teamed up with many partners in gaming, entertainment, and more, and we’d love to see you talk about your opinions on these collaborations.

“Compensation” as well as “fun prizes and rewards” were offered for writing a flattering Internet Explorer post. SocialChorus asked bloggers to use specific hashtags. Instead of positive reviews, however, the hashtags #IEbloggers and #reThinkIE have turned into IE-bashing tools. Some of those tweets are funny!

Additionally, SocialChorus asked “rethink Internet Explorer” bloggers to sign a contract and receive “program access where you will see cool social content, the complete blog post prompt, and all required blog assets.” To be compensated, bloggers had to “Share your post or related photos with our hashtag (#IEbloggers) on 2 to 3 social networks (Instagram, Facebook, or Twitter).”

The link with more details about the program has since been deleted (you can see it in full here) but not before it caught the eye of Google’s Matt Cutts, who is the head of Google’s webspam team. Cutts tweeted that he was “asking for more info while the webspam team investigates.”

Both Google and Bing have policies about passing links from paid posts; Google penalized Chrome back in 2012 for a sponsored post scheme, meaning searching for “browser” on Google would not bring up Chrome.

Meanwhile Microsoft PR damage-control went into overdrive. Arrington later updated the post with Microsoft’s comment: “This action by a vendor is not representative of the way Microsoft works with bloggers or other members of the media. The program has been suspended.”

Just the same, SocialChorus had asked for the glowing Internet Explorer posts to be up by July 10, so if people who signed contracts weren’t informed that Microsoft yanked the money, you still might see some pro-IE posts around that time.


Ten reasons why open source is eating the world

Open source software, once just the domain of technology hobbyists, is taking over the software world. According to Gartner, open source software will be included in mission-critical software portfolios of virtually all Global 2000 enterprises by 2016. In fact, according to open source management vendor Black Duck Software, there are now a million different open source software projects. Here are 10 reasons for the surging popularity of open source software.

Faster innovation
Traditional software vendors create and develop their products in-house. Open source vendors, however, aren’t starting from zero – they innovate on top of a common base. “Open source provides a software foundation that alleviates the need to start development projects from scratch,” says analyst Jon Oltsik. “It can then be customized for specific purposes, which can help accelerate the development process.” With cloud services, for example, open source providers compete with the Amazon or Microsoft approach. “In this arena, I see service providers providing Apache OpenStack-powered clouds offering comparable services to Amazon EC2, but differentiating on the variety of service offerings, professional services, and custom-tailored service levels,” says Citrix’s Mark Hinkle.

Security
Security was once viewed as an open source liability, but that has changed. This year, 72 percent of Black Duck’s survey respondents said that they specifically chose open source because of security. Open source software allows users to review code for potential security flaws. “I really do like the transparency of open source,” says Daniel Polly, enterprise information security officer at First Financial Bank. “But more so, when a piece of software is interacting with data, I do like the fact that with open source you can see what’s going on in that data stream.” Polly says the bank uses the open source intrusion detection system Snort. Commercial vendors are now being pressed to match what open source can offer, both in security and in other areas, he adds.

Lower cost
Price also continues to be a factor. In this year’s Black Duck survey, 68 percent of respondents said that open source helped improve efficiency and reduce costs.

Of course, open source is not the same as free. Vendors can still charge for the software, for particular versions of the software, for support, or for custom development work. In addition, a company might need to spend internal resources on adapting or integrating open source software. But cost is no longer the leading factor. “It’s about more than just cost-cutting or any of the traditional reasons to simply use open source software,” says Lou Shipley, president and CEO of Black Duck. “Open source has proven its quality and security, and reached a point of widespread democratization and proliferation.”

Customer focus
Traditional proprietary software is often focused on the needs of a particular market segment, for example, enterprise or SMB. Open source projects typically don’t suffer from this problem, since they’re usually built around customer requirements.

“As a typical startup, we kicked off with an IT backbone built almost entirely on open source technology,” says Rafael Herrera, head of BI International at Groupon. “The key factor for us – besides the cost gains – was scalability. We needed a framework that could support dynamic growth from the outset.” For example, Groupon uses an open source data integration platform from Talend.

Feature set
According to a 2013 report from the Linux Foundation, 80 percent of companies plan to increase their use of Linux over the next five years, while only 20 percent plan to increase their use of Windows. The number of companies using Linux for mission-critical workloads grew from 60 percent in 2010, to 73 percent in 2012. And, sure, price was a factor. Even when adding in support costs, open source software is generally significantly cheaper overall. But, according to the Linux Foundation report, it was only the second-most important factor. The first was the feature set. This is a dramatic reversal from the early years of open source technology, when the commercial products were generally more complete and robust.


Adaptability
Open source software allows savvy users to go right into the source code and modify it. “I’ve been able to extend the open source software we’re using to fit our needs without engaging a third party,” says Paul Stadler, technology manager at the Chester County Cat Hospital. The hospital uses open source veterinary practice management software for its core operations, running on a Linux server and delivered via a Web-based interface to employee desktops and mobile devices.

In fact, the adaptability and flexibility of open source software was the fourth most important reason why companies chose it over proprietary software, according to this year’s Black Duck survey. This benefit of open source was ranked eighth last year.

Collaboration
In the past, when several companies needed the same functionality, they each built it from scratch, used a product from an outside vendor, or formed a consortium to create and maintain the product. Open source software streamlines this process by enabling competing companies to work together. This frees up time and money for companies to spend on projects that differentiate them. According to Black Duck’s survey, 50 percent of corporations contribute to open source, and 56 percent say that they will increase their contributions this year. By participating in development, enterprises can help influence the way the software evolves and build relationships with other developers.

De facto standards
It’s nice to think that standards are set by groups of intelligent thought leaders choosing the best possible path forward for an industry. In practice, however, what often happens is the emergence of de facto standards based on popular products, like, say, Microsoft Word’s .doc format. A successful open source project can provide the same function, without the associated risk of vendor lock-in. “Many times it’s easier to implement a standard as a result of adoption of real products,” says Citrix’s Hinkle. “Apache Web Server is a good example of massive adoption and an accessible platform that drove the adoption of many web standards in the earlier days of the Internet.”

Leading the way
In many areas, open source software is no longer trailing proprietary platforms but is instead leading the way. Cloud, mobile, Big Data and the Internet of Things all feature many high-profile open source projects that are driving the evolution of these platforms. Not to mention the Web itself, much of it built on the open stack of Linux, Apache, MySQL and PHP. Even in the latest hot tech news topic — virtual reality — there are several competing open source virtual environment platforms, including OpenSim, Open Wonderland, and Open Qwaq.

Quality
According to this year’s Black Duck Future of Open Source survey, quality was the top reason why respondents chose open source. That’s a big change: in 2011, quality was in fifth place. As open source projects gain adherents, more people contribute to improving stability, spotting or fixing bugs and streamlining interfaces. A related factor, ease of deployment, rose from sixth place in 2013 to third place today, another sign of the rapid maturation of open source projects. In fact, many open source tools are now as simple to install as their proprietary equivalents — simpler, if you take into account the fact that in many cases no purchase or procurement process is involved.


Review: FireEye fights off multi-stage malware

Specialized 10G appliance successfully defends against multi-stage malware attacks

You can’t see some malware until it’s too late. Sophisticated attacks arrive in pieces, each seemingly benign. Once these advanced attacks reassemble, the target is already compromised.

FireEye takes a new approach to malware detection with its NX appliances. As this Clear Choice test shows, the FireEye device allows advanced malware to proceed – but only onto virtual machines running inside the appliance.

In our tests, the FireEye appliance performed flawlessly. It detected all the multi-stage malware samples we threw at it, including some involving recent zero-day exploits. The top-of-the-line NX 10000 ran at speeds beyond 4Gbps in inline mode, and at better than 9Gbps in tap mode, both with and without attack traffic present.

The NX line fills a specialized niche, and complements rather than replaces existing security gear. It’s not an IDS on its own, though the company is working on an IDS module. Even then, the NX 10000 won’t be an all-in-one security device. Instead, it does one thing really well: It stops the most advanced forms of malware from passing into the enterprise.

This comprehensive protection doesn’t come cheap. The high-end system we tested has a list price of around $420,000, plus service contract. But that’s intended for 10G Internet links (which themselves are rather pricey), and it’s aimed at enterprises protecting assets worth far more than a half-million dollars. For comparison, FireEye’s entry-level NX 900 appliance works on 10-Mbit/s links and has a list price of $9,600.

While automated attacks from script kiddies remain a nuisance, a far more serious threat comes from sophisticated multi-stage malware. Some of these so-called advanced persistent threats (APT) are state-sponsored; in other cases organized crime is involved.

Whatever the origin, there are at least three phases. First, the exploit stage uses a vulnerability to place a bit of code used in later stages. The vulnerability typically hides inside a seemingly benign file, such as a Flash object or Javascript in a Web page.

New attacks, especially zero-day vulnerabilities, are often seen during this stage.

Second, the exploit on the infected client then downloads the actual malware, though not necessarily in one piece. The “dropper” might come in multiple pieces from multiple sources, each obscured to hide its nature.

Third, the compromised system phones home to a command-and-control network that executes the malware. By now, the attackers control the target system; they have its data and a pathway to the rest of the internal network.

Conventional approaches to fighting malware have limitations in combating multi-stage malware threats. A signature-based system might detect the existence of a malware binary file, but only once it’s been reassembled on the target – and by then the target is already compromised. Newer sandbox systems stop traffic before it reaches target machines, but they may not be able to assemble and analyze all the constituent parts of a multi-stage attack. Indeed, a key step in some exploit kits is to “fingerprint” versions of the hypervisor, OS, browser, and plug-ins before deciding whether to proceed.

The FireEye difference
Virtualization is FireEye’s key differentiator. Its appliances run multiple versions of Windows OSs, browsers, and plug-ins, each in its own virtual machine. Malware actually compromises a target (virtual) machine – and then and only then does the FireEye software record a successful attack. Network managers can configure the FireEye appliance to block such attacks, preventing their spread into the enterprise.

We tested the NX 10000 appliance, FireEye’s highest-speed device with two 10G Ethernet interfaces. It focuses specifically on Web-based attacks. The company has other product lines for email, mobile, and forensic analysis, but we did not test those.

The NX 10000 operates in tap or inline mode, with the latter optionally able to block attacks. Its content library is updated daily to include new exploits, something we verified in testing with recent zero-day vulnerabilities.

FireEye’s technology complements rather than replaces an intrusion detection system (IDS). Unlike an IDS or IPS, it doesn’t have a library of thousands of attack signatures. Instead, it looks for actual compromises on its virtual machines. The company says an IDS module is in beta testing, and is slated for general release by the end of the second quarter.

Once the appliance identifies multi-stage malware, it triggers an alert. With a conventional IDS/IPS, a malware alert might say something like “file trojan.exe was detected.” In contrast, an NX 10000 alert shows each component of the malware, including callback URLs used to contact command-and-control networks, as seen below.

FireEye’s NX 10000 offers detailed reporting on multi-stage malware, showing each component of an attack, including callback URLs used to contact command-and-control networks.

The appliance’s virtual machines represent various service pack levels of Windows 7 and Windows XP, along with many combinations of browser and Adobe Flash and Microsoft Silverlight versions. FireEye wrote its own hypervisor that makes virtual machines appear to run on bare metal. That’s useful to thwart exploit kits that skip execution on machines if they detect VMware hypervisors.

The version we tested doesn’t yet support Mac OS X virtual machines, though FireEye says Mac support will be available in the third quarter.

Like all security devices, the NX 10000 only detects attacks it can see, and that has network design implications. Placing the appliance at the network perimeter makes sense. So does a hub-and-spoke design that aggregates Internet traffic from branch offices and telecommuters. Enterprises with more decentralized designs might instead consider smaller appliances for each site.

One drawback to having a large appliance at a central site: The NX 10000 lacks built-in high-availability support, instead relying on external systems such as load balancers to avoid a single point of failure.

Coverage tests
We tested the NX 10000 in terms of multi-stage malware coverage and performance. We ran coverage and performance tests in both tap and inline modes, and conducted performance tests both with and without attack traffic present.

For the coverage tests, we replayed a collection of 60 multi-stage malware samples seen in the wild between January and April 2014. These samples, used with permission of, represent many aspects of multi-stage malware. They involve different kinds of exploit kits; zero-day exploits; dropper executable files; and callbacks to command-and-control networks. We did not tell FireEye which samples we’d use in testing.

In all 60 cases, the FireEye appliance correctly identified the individual components of each malware sample, both in inline and tap modes.

The FireEye device updates its library of multi-stage malware examples at least once every 24 hours. It’s possible the system would not detect a brand-new exploit, but we did not see that in testing. Indeed, the most recent of our samples was first posted on malware less than 24 hours before we used it in testing, and the updated FireEye device correctly identified it.

FireEye says its customers typically see one attempted malware exploit every three minutes; our tests were far more stressful than that. We replayed all malware samples consecutively at rates approaching 10G Ethernet wire speed. In contrast, each malware sample originally took seconds or even minutes to run from start to finish. Also, there was no gap between the end of one malware sample and the start of another.

Performance tests
The FireEye appliance also met its stated limits in performance tests. FireEye claims the NX 10000 can forward traffic at around 4Gbps in inline mode and at nearly 10Gbps in tap mode.

We evaluated these claims using Spirent Avalanche, a Layer 4-7 traffic generator/analyzer. We configured Avalanche to generate HTTP traffic from up to 40,000 clients, as in a large enterprise network. We measured performance in both inline and tap modes, and we also measured performance while the system was under attack.

With only benign web traffic, the FireEye device forwarded traffic at 4.224Gbps in inline mode and 9.259Gbps in tap mode. Both results are in line with FireEye’s performance claims.

We then repeated these tests while concurrently offering the multi-stage malware samples (again, consecutively, at the maximum possible rate). This time, the NX 10000 forwarded traffic at 4.207Gbps in inline mode and 9.298Gbps in tap mode. Those numbers are virtually identical to the benign-only results, with the minor differences most likely explained by bandwidth contention among many TCP flows.

The FireEye appliance again identified all components of all 60 malware samples offered in the inline tests. Some malware samples were not identified in the tap-mode tests, but we believe this was due to an overloaded CPU in the switch mirroring traffic to the FireEye device. The switch reported CPU utilization of 100 percent and became unresponsive during multiple iterations of the tap-mode tests. While the missed reports should not be “charged” to the FireEye device, this does point up the importance of using tap infrastructure capable of forwarding all traffic at 10G Ethernet wire speed.

Advanced attacks require advanced defenses. The NX 10000 represents an innovative and effective approach to combating multi-stage malware. Combined with a conventional IPS (or using its own IPS module, available soon), the FireEye appliance should help large enterprises keep malware off their networks.

Network World gratefully acknowledges the assistance of Spirent Communications, which supplied its Spirent Avalanche C100MP traffic appliance. Spirent’s Michelle Rhines, Ankur Chadda, Angus Robertson, and Chris Chapman also provided support for this project. Thanks, too, to, which provided permission to use its packet captures of recent multi-stage malware.

Newman is a member of the Network World Lab Alliance and president of Network Test, an independent test lab and engineering services consultancy. He can be reached at

How We Did It
We assessed FireEye’s NX 10000 in terms of features, attack coverage, and performance. Features testing required no separate methodology. Instead, we discovered functions supported by the device in the course of security and performance testing.

For attack coverage, we obtained 60 multi-stage malware packet captures from the website. These captures had been seen in the wild between January and April 2014. Captures included examples of exploit kits, dropper (infected) files, and callbacks to command-and-control servers.

We used the open-source tcpprep and tcprewrite tools to prepare these packet captures for use on our test bed, rewriting MAC and IP addresses. We then used the open-source tcpreplay tool to generate the multi-stage malware attacks. We generated these attacks using a single FreeBSD 10.0 server equipped with a multiport NIC.
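The capture-replay pipeline described above can be scripted end to end. The sketch below only builds the three command lines rather than running them; the flags are standard tcpreplay-suite options, but the interface name, endpoint addresses, and file paths are placeholders, not the values used in this test:

```python
from pathlib import Path

def build_replay_pipeline(pcap, iface="em0", endpoints="10.0.1.10:10.0.2.10"):
    """Build the tcpprep -> tcprewrite -> tcpreplay commands for one capture.

    The interface name and endpoint addresses here are test-bed placeholders.
    """
    pcap = Path(pcap)
    cache = pcap.with_suffix(".cache")
    rewritten = pcap.with_name(pcap.stem + "-rewritten.pcap")
    return [
        # Classify each packet as client- or server-side so it can be rewritten per side.
        ["tcpprep", "--auto=bridge", f"--pcap={pcap}", f"--cachefile={cache}"],
        # Rewrite the IP endpoints for the test bed subnets.
        ["tcprewrite", f"--cachefile={cache}", f"--endpoints={endpoints}",
         f"--infile={pcap}", f"--outfile={rewritten}"],
        # Replay the rewritten capture at the maximum possible rate.
        ["tcpreplay", f"--intf1={iface}", "--topspeed", str(rewritten)],
    ]
```

Each inner list can be handed to a process runner; repeating the loop over all 60 captures reproduces the back-to-back replay pattern used in the coverage tests.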

For performance testing, we used Spirent Avalanche, a layer 4-7 traffic generation tool to offer Web traffic from up to 40,000 users. In this case, Avalanche ran on Spirent’s C100MP appliance equipped with 10G Ethernet interfaces.

For security and performance tests, we constructed a routed test bed with three IP subnets. When tested in inline mode, the FireEye NX 10000 appliance resided in the middle segment, and bridged traffic between two layer-3 switches. When tested in tap mode, the layer-3 switches were directly connected to each other, and the NX 10000 listened to traffic using a mirror port configured on one switch. The Spirent and FreeBSD traffic generators resided in the outer subnets in this test bed, with one interface from each device connected to each outer subnet.

For the coverage tests, we configured tcpreplay to offer all 60 malware samples at the maximum rate possible. We monitored the NX 10000 continuously during and after the test, and verified that it correctly identified each attempted exploit. The success metrics in this case included the ability to identify source and destination IP addresses and name the various exploit kits, droppers, and command-and-control callbacks used by each malware sample.

For the performance tests, we configured Spirent Avalanche to emulate up to 40,000 clients requesting 64-kbyte Web objects as quickly as possible. We ran four permutations of performance tests: With benign traffic only, both in inline and tap modes; and with benign and malware traffic offered concurrently, again in inline and tap modes.
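The four permutations and the forwarding rates reported in the review tabulate in a few lines (Python here purely as notation):

```python
from itertools import product

# Measured forwarding rates from the review, in Gbps, keyed by (mode, traffic mix).
RESULTS_GBPS = {
    ("inline", "benign only"): 4.224,
    ("tap", "benign only"): 9.259,
    ("inline", "benign + malware"): 4.207,
    ("tap", "benign + malware"): 9.298,
}

def summarize():
    """Walk the four permutations and report each measured rate."""
    return [
        f"{mode:6s} / {mix:16s}: {RESULTS_GBPS[(mode, mix)]:.3f} Gbps"
        for mode, mix in product(("inline", "tap"), ("benign only", "benign + malware"))
    ]
```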





Google unseats Microsoft as the U.S. browser powerhouse

Google’s strength in mobile browsing pushes it past Microsoft’s Internet Explorer

Google has unseated rival Microsoft as the leading browser maker in the U.S. for the first time, Adobe said this week, citing data from its analytics platform.

The rise in Google’s domestic fortunes followed Microsoft’s reduction to second fiddle worldwide in May 2013.

According to the Adobe Digital Index (ADI), a measurement of browser usage based on tracking visits to the average U.S. website, Google’s desktop and mobile browsers — Chrome on both platforms, the aging Android browser on the latter only — slipped past Microsoft’s Internet Explorer (IE), which retained its premier position on the desktop but had little to show for its effort on smartphones.
The U.S. browser war reached a milestone in April as Google replaced Microsoft as the No. 1 maker of Web browsers, said Adobe. (Image: Adobe.)

For April, Google accounted for 31.8% of all browser usage in the United States. Meanwhile, Microsoft owned a 30.9% share.

Apple’s Safari was in third place with a combined desktop and mobile share of 25%, while Mozilla’s Firefox, which lacks a meaningful presence in mobile, was a distant fourth with just 8.7%.

The rise of Google’s browsers, and to a lesser extent Apple’s Safari, and the corresponding declines of both IE and Firefox, can be attributed to mobile browsing, primarily that conducted on smartphones. “Today, mobile [operating systems are] more important, giving Google and Apple a leg up with default status on Android and iOS,” said ADI analyst Tyler White in a statement.

Adobe tallied visits, which in analytics parlance is synonymous with a session on a website, a period during which a user may view numerous pages before leaving, or before a time limit of inactivity expires. Adobe thus actually measures a type of “usage share,” or how active users of each browser are on the Web.

Other analytic firms count differently. California-based Net Applications uses visitors, an expression of the number of unique individuals — actually their browsers, as the tracking is done with cookies — to measure “user share,” which is analogous to the number of copies of each browser in use during a specific period.
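The difference between the two metrics is easy to see in code. Here is a minimal sketch over a hypothetical visit log, where each record is a (cookie_id, browser) pair; one heavy user skews usage share but not user share:

```python
from collections import Counter

def usage_share(visits):
    """Adobe-style: every visit counts, so heavy users are weighted heavily."""
    counts = Counter(browser for _cookie, browser in visits)
    total = sum(counts.values())
    return {browser: n / total for browser, n in counts.items()}

def user_share(visits):
    """Net Applications-style: one vote per unique cookie, however active."""
    per_user = {cookie: browser for cookie, browser in visits}
    counts = Counter(per_user.values())
    total = sum(counts.values())
    return {browser: n / total for browser, n in counts.items()}

# A hypothetical log: one Chrome user with eight sessions, two one-visit IE users.
log = [("a", "Chrome")] * 8 + [("b", "IE"), ("c", "IE")]
```

On that log, Chrome takes 80% of usage share but only a third of user share, which is why the two firms can rank the same browsers differently.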

Because Adobe drew its data only from consumer-facing sites — some 10,000 of them during April — it was little surprise that the Chrome/Android browsers outpaced IE. Microsoft’s browser has a lock in businesses, where it’s often mandated as the only allowed desktop browser, but it has a less-dedicated — some would say less-coerced — base among consumers. On mobile, IE accounted for just 1.8% of usage.

Google’s climb to the top spot in the U.S. followed its push into that place globally by almost a year: Adobe’s data had Google’s Chrome/Android passing Microsoft’s IE in May 2013 worldwide. “Outside the U.S., Google’s browser share has grown even more rapidly,” an Adobe spokesman said in an email Friday.

Adobe’s take on the desktop versus mobile contest was in line with other, earlier calculations by Net Applications, in that while IE’s strength was its desktop browser, the rise in mobile browsing caused its overall share to drop six percentage points in the last year. Meanwhile, Chrome/Android and Safari benefited from their primary positions in mobile on Android and iOS, respectively, the two most-used mobile operating systems on the planet.

Chrome’s 31% usage share on the desktop, for example, lagged behind IE’s 43%, but Google’s mobile browsers made up for that shortfall in spades. Safari’s puny 10% on the desktop — in fourth place in the U.S. — was helped out of the cellar by its 59% usage share on mobile.
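A back-of-envelope check shows how the combined numbers fall out of the desktop and mobile figures. This assumes the combined share is a visit-weighted average of the two — an assumption on my part, since Adobe doesn’t publish its weighting — in which case Safari’s three numbers imply how much of Adobe’s tracked traffic was mobile:

```python
# Adobe's April figures for Safari (from the article).
safari_desktop = 10.0   # % usage share on desktop
safari_mobile = 59.0    # % usage share on mobile
safari_combined = 25.0  # % combined desktop + mobile

# If combined = desktop*(1-m) + mobile*m, solve for m, the mobile fraction:
# m = (combined - desktop) / (mobile - desktop)
mobile_weight = (safari_combined - safari_desktop) / (safari_mobile - safari_desktop)
print(round(mobile_weight, 2))  # ~0.31
```

Under that assumption, roughly 31% of the visits Adobe tracked were mobile — enough to lift a browser that is fourth on the desktop into third place overall.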

Mozilla, Adobe said, was in the weakest position of the Big Four because of its lack of a viable mobile strategy. In the last two years, Mozilla has lost Firefox desktop users in a steady drip while being hit by the increasing importance of mobile browsing; its total usage share has fallen from nearly 20% to under 9%.

Although Net Applications’ numbers are global and not U.S. specific — the company does not publicly release the latter — the trends shown by its data are similar to Adobe’s.

But not identical.
By Net Applications’ reckoning, Microsoft remained the top browser maker worldwide in May 2014 with a desktop + mobile user share of 48%, more than double that of runner-up Google and its 21%. Apple and Mozilla continued to battle for third place, with the former making strides in its move to pass the latter. In May, Firefox accounted for 13.8%, Safari for 13.4%. Safari cut the April gap between it and Firefox by more than half in May, and may become No. 3 as early as this month.

Firefox’s decline could not come at a worse time. Mozilla’s contract with Google for making the latter’s search engine the default on Firefox comes up for renewal in November. According to Net Applications, Firefox had a combined desktop + mobile user share of 21.1% three years ago when it negotiated the current contract, which paid Mozilla approximately $300 million annually, nearly all of its revenue.

Going into this year’s tête-à-tête with Google, Mozilla will be bargaining from a much weaker position, down 35% in total share since 2011.

Net Applications had mobile gaining more ground at the expense of desktop browsing in May. By the end of the month, mobile browsing had jumped to 18% of all browsing worldwide. At the current pace, mobile should reach the 20% milestone in October, and account for more than a quarter of all browsing by this time next year.


Sprint near deal to buy T-Mobile US for $50 billion, reports say

Sprint has reached a deal to buy T-Mobile US for about US$50 billion, according to news reports on Wednesday.

Sprint, owned by Japan’s Softbank, would pay about $40 per share for T-Mobile, The Wall Street Journal reported, citing people familiar with the matter. The deal could still fall apart, the Journal warned.
Such a deal would combine the nation’s third- and fourth-largest mobile operators, forming a larger rival to Verizon and AT&T but reducing the U.S. mobile market to just three major national carriers. Because of that change in the competitive lineup, the plan would probably face an uphill battle for regulatory approval.

If regulators rejected the plan, Sprint would have to pay T-Mobile more than $1 billion in cash and other assets, the Journal reported.

Under the proposed terms of the deal, T-Mobile parent company Deutsche Telekom would own 15 percent to 20 percent of the combined company.

Reports of a deal to combine Sprint and T-Mobile have circulated since late last year. Both carriers have struggled against much larger rivals in AT&T and Verizon, and each has only about half as many subscribers as either of the two big players. Just last year, Sprint agreed to sell 80 percent of its shares to Softbank in a deal that gave it a much-needed injection of cash to complete an elaborate network transformation.

However, the U.S. Department of Justice and Federal Communications Commission rejected an earlier attempted buyout of T-Mobile by AT&T, and since then T-Mobile has introduced plans that have helped change the way U.S. carriers sell phones and service. Some federal regulators have indicated they want the U.S. to remain a four-carrier market for just the kind of competitive pressure that an underdog like T-Mobile can put on prices and choice.

Absorbing T-Mobile would also be one more task for a very busy Sprint, which in recent years has been integrating multiple networks while phasing out others.


