Motorola uses NFC to enable touch-to-unlock for smartphones

The next best thing to a password pill, the Motorola Skip can both bolster a smartphone’s security and make it more convenient to use.

Earlier this year, Motorola’s head of advanced technologies, Regina Dugan, discussed an alternative to the increasingly vulnerable password method for authentication – a “password pill” that would store credentials within the user’s body.

The password pill would transmit an EKG-like signal to authenticate the user with any appropriately equipped device touched by the user. Dugan talked about the password pill much like a one-a-day vitamin.

Most smartphone users don’t set up authentication on their devices at all, and those who do limit it to a four-digit PIN, because anything longer makes it more difficult to check notifications as frequently.

It makes sense, then, for Motorola to introduce a non-invasive version. The Skip, a magnetized clip that can be worn on clothes without the intrusiveness of a password pill, looks like a derivative of the password pill research, providing strong authentication with simplicity of operation. It’s based on NFC technology, the same technology used to secure contactless payments and building access.
The clip-on NFC Motorola Skip at left, and the adhesive sticker versions at right.

The Skip does not at first stand out as something that “you didn’t know you needed until you had it,” but a security-cognizant smartphone user who frequently checks his or her smartphone might jump at it.

Few smartphone users employ passwords of sufficient length to really secure the device. Why? The National Institute of Standards and Technology (NIST) recommends a 12-character random password. Those who check their smartphones 50 times a day would find a 12-character password annoying.
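The gap between a four-digit PIN and a 12-character random password is easy to quantify. Here is a rough back-of-the-envelope sketch, assuming the PIN is drawn uniformly from 10 digits and the password uniformly from roughly 94 printable ASCII characters:

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy in a string chosen uniformly at random."""
    return length * math.log2(alphabet_size)

# Four-digit PIN: 10 possible symbols, 4 positions -> 10,000 combinations
pin_bits = entropy_bits(10, 4)    # ~13.3 bits

# 12-character random password over ~94 printable ASCII symbols
pwd_bits = entropy_bits(94, 12)   # ~78.7 bits

print(f"4-digit PIN:      {pin_bits:.1f} bits ({10**4:,} combinations)")
print(f"12-char password: {pwd_bits:.1f} bits")
```

Each additional bit doubles the guessing work, so the recommended password is not three times stronger than a PIN but astronomically so — which is exactly why nobody wants to type one 50 times a day.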

The Skip is installed from the Google Play store. Once installed, Skip authentication is enabled by touching the smartphone to it. Thereafter, the user only has to touch the smartphone to the Skip to unlock it.

The Skip also comes with three stickers, with embedded NFC tags encoded with the same level of authentication as the body-worn device. Presumably, the user might affix the tags to a desk or a car’s smartphone dock to simplify unlocking.

Losing one’s Skip is not a disaster, because the user can revert to a previously set pattern, PIN or password. Given the ease of the Skip’s NFC-based touch authentication, and the low risk involved with losing it, a user can afford a long, complex and secure password in the event the Skip is lost. Smartphone users can sleep better knowing they won’t be the victim of identity theft if their smartphone is lost or stolen.

The Skip is a good solution for people who keep sensitive data on their smartphones and want strong authentication, but also care about convenience.

MCTS Training, MCITP Training

Best Microsoft MCTS Certification, Microsoft MCITP Training at


Microsoft’s $100k hacker bounty sounds great but has a lot of loopholes
Winning vulnerabilities and exploits must be novel, generic, reasonable, reliable and impactful — whatever those mean

Microsoft is offering up to $100,000 for vulnerabilities found in Windows 8.1 that are paired with exploits, but it’s pretty much up to Microsoft to decide who gets paid how much based on a set of subjective criteria.

In order to pull down the full amount, a submission must be novel, generic, reasonable, reliable, impactful, work in user mode and be effective on the latest Windows OS, according to details of the new bounty program. Each of those criteria is subject to interpretation.

It will be up to Microsoft to convince potential participants in the program that their submissions will be treated fairly, says Ross Barrett, senior manager of security engineering for Rapid7.

“A lot of people don’t trust them,” Barrett says. Microsoft could find an attack technique good but not novel, and then patch the vulnerability without paying. “That’s paranoid, maybe, but that kind of paranoia tends to be par for the course in this industry,” he says.

“If I were Microsoft I would make a point of making sure that somebody gets this [$100,000]. It would do wonders for their reputation. It’s more about community relations.”

It’s also about economics, because $100,000 is “an almost insane amount of money” that will be hard to ignore, says Amol Sarwate, director of vulnerability labs at Qualys. In countries with weaker economies that amount would be even more significant, he says.

The sum is likely even more than researchers could make selling such exploits on the black market, he says, and submitting to the program doesn’t run the risk of getting caught by law enforcement.

These cash bounty programs have worked pretty well since TippingPoint (now part of HP) set up its Zero Day Initiative in 2005, Sarwate says, with others forming similar programs. Google’s vulnerability program, for example, has paid out more than $800,000 since it started in 2010.

Many researchers are satisfied getting public credit for finding vulnerabilities, he says. Sarwate says this recognition is valuable to them — so much so that citations of these credits routinely show up on the resumes of researchers who received them.

The effectiveness of Microsoft’s big-payoff program is in luring in “ethically neutral” researchers who have discovered exploits and want credit for it immediately, says Barrett. For many researchers that is the true prize. But they may not want to take the option of responsible disclosure in which they submit the vulnerability to the company and wait for perhaps months for it to issue a patch and give credit because the process takes too long.

Instead, they may disclose irresponsibly — posting the vulnerability to a public site where they get immediate credit, but the vulnerability is also available for criminals to exploit. It is these impatient researchers Microsoft can hope to attract, Barrett says; they may be willing to wait for credit if they are paid as well.

“It’s aimed at people who go straight to the press with their exploits, and it tries to win them over,” he says.






Microsoft opens new competitive fronts with cloud-based Windows Server

Amazon, VMware and even Microsoft partners threatened by Windows Azure move

Microsoft doesn’t want to admit it, but a Gartner analyst says the vendor’s decision to offer Windows Server instances in the Azure cloud is opening a new competitive front against partner hosting companies.

Before 2010 is over, Microsoft will update Windows Azure with the ability to run Windows Server 2008 R2 instances in the Microsoft cloud service. The move could blur the lines between platform-as-a-service (PaaS) clouds like Azure, which provide abstracted tools to application developers, and infrastructure-as-a-service (IaaS) clouds such as Amazon’s EC2, which provide raw access to compute and storage capacity.

Microsoft Windows Azure and Amazon EC2 on collision course

This move also improves Microsoft’s competitive stance against VMware, which is teaming with hosting companies to offer PaaS developer tools and VMware-based infrastructure clouds.

But the cloud-based Windows Server instances open up a new competitive front against Rackspace and other Web hosters who are Microsoft partners, according to Gartner analyst Neil MacDonald.

Microsoft has, to some extent, downplayed the new capabilities, saying the cloud-based Windows Server – which goes under the name Windows Azure Virtual Machine Role – is primarily an on-ramp to port some applications to the Azure cloud.

“What they really want is people using Azure,” MacDonald says. At the same time, VM Role “is a form of infrastructure-as-a-service,” he continues. “The reason Microsoft is being so vague is they really don’t want to upset their ecosystem partners, all the hosters out there in the world making good money hosting Windows workloads. Microsoft doesn’t really want to emphasize that it is competing against them.”

Whereas IaaS clouds provide access to raw compute power, in the form of virtual machines, and storage that is consumed by those VMs, PaaS clouds provide what could be described as a layer of middleware on top of the infrastructure layer. Developers using PaaS are given abstracted tools to build applications without having to manage the underlying infrastructure, but have less control over the basic computing and storage resources. With Azure, developers can use programming languages .Net, PHP, Ruby, Python or Java, and the development tools Visual Studio and Eclipse.

Microsoft officials have previously predicted that the lines between PaaS and IaaS clouds will blur over time, but stress that Windows Azure will remain a developer platform.

In response to MacDonald’s comment, Windows Azure general manager Doug Hauger says “our partners provide a vast range of services to customers for hosting an infrastructure-as-a-service [cloud]. The VM Role does not compete with them in this space.”

For what it’s worth, Rackspace does view Microsoft as a cloud competitor. “The cloud market is going to be huge and there are many ways to win in it,” Rackspace President Lew Moorman says. “Microsoft is serious about the market and we view them as an emerging competitor as well as partner. We are confident that our service difference will resonate to a large part of the market regardless of the technical offers that emerge from players such as Microsoft.”

In an interview this week, Hauger discussed both the similarities and differences between Microsoft’s cloud-based Windows Server instances and the virtual machine hosting provided by Amazon and other IaaS vendors.

“I think there is an incredibly broad, gray line between infrastructure-as-a-service pure-play and platform-as-a-service,” Hauger says.

Ultimately, the marketplace will only care about the technical capabilities of cloud services, not the taxonomies used to define them, Hauger continues. With VM Role, Azure customers will have to manage and patch their own guest operating system. This is clearly different from pure PaaS, in which developers write to endpoints and services through an API, and are “abstracted from even worrying about the operating system,” Hauger says.

But VM Role, when it becomes available later in 2010, will still have some of the developer tools and other benefits of PaaS, so “it’s not the ground floor of infrastructure-as-a-service,” Hauger says. “You’re taking the elevator up a little bit.”

Even though Microsoft is offering VM hosting, that does not mean customers will be able to create custom compute and storage configurations, as they might with an IaaS provider like Rackspace, Hauger says.

Custom storage configurations are “something we absolutely do not offer with the Windows Azure platform, because we’ve made an architectural decision to have a uniform storage pool.”

On the other hand, Azure customers don’t have to worry about writing multi-tenancy capabilities into their applications. Hauger argues that building applications that are resilient, scalable and automated is, while not impossible in an IaaS cloud, quite difficult when “you’re staring down the throat of a VM and you have to manage that yourself.”

Even with VM Role, and a Server Application Virtualization option that will let developers transfer application images to Azure, Hauger does not recommend that customers “forklift a big, monolithic application from on-premise and move it over to Windows Azure.”

VM Role could be used to move some “lightweight” HPC applications to the Azure cloud, Hauger says. If a customer needs large-scale data analysis, but only for a short amount of time, it makes sense to move that app to Azure temporarily and then take it back in-house, he says. Some customers are finding that purely Web-based applications, like Facebook games, also make sense for Azure, he says.

Microsoft officials are willing to admit that Azure’s capabilities are not limitless.

For example, Microsoft CTO Barry Briggs says his own team used Azure last year to build a charity auction application, but kept credit card processing on-premise “because PCI compliance is a big deal.”

“There are some things that will probably stay on-premise for a while and I suspect PCI compliance will be there, because customers want to take some time to understand what the capabilities and potentials of the [cloud] technology really are,” Briggs says.

Although Microsoft is expanding Azure by offering VM hosting, it’s important to note that the offer applies only to Windows Server 2008 R2. Microsoft clearly wouldn’t offer Linux VMs and offering older versions of Windows Server would not fit the Microsoft strategy, either, MacDonald says.

Amazon’s Elastic Compute Cloud, meanwhile, offers Windows Server 2003 and 2008, eight versions of Linux and OpenSolaris.

Although Amazon does offer a billing service, load balancing, databases and a variety of other tools designed for developers, Amazon has not made any significant moves into PaaS, MacDonald says. Amazon says its approach prevents customers from “being locked into a particular programming model, language or operating system.”

But Microsoft’s Windows Server hosting does put the two companies into more direct competition, MacDonald says.

Perhaps most crucially for Microsoft, the VM Role service gives CEO Steve Ballmer and his cloud team a more viable way of competing against VMware, which has partnerships designed to provide both PaaS offerings and the VM hosting capabilities needed to move applications to the cloud.

“The customers need to have an easier on-ramp to cloud computing, and Microsoft wasn’t providing that and their biggest competitor was,” MacDonald says. “This was a gap they had to fill and I’m glad to see they’ve done it. I’d say it’s two years late. It doesn’t mean they’re too late, but they should have done this from day one.”

As for the Web hosters who now find themselves in competition against Microsoft, MacDonald says they will simply “have to evolve.”





Windows 8 Update: Scarcity of touchscreens is hurting Windows 8

Dell deal on Windows RT, dubious Windows 8 sales numbers

Along with whatever other problems Windows 8 faces, Microsoft partners interested in making machines that show the operating system off to best advantage are handicapped by a short supply of touchscreens, the top Windows executive says.

The company hopes the problem will be solved in time for partners to make alluring devices for Christmas sales, says Tami Reller, chief marketing and financial officer for the Windows division, as quoted in this CITEworld story.

“We see that touch supply is getting so much better,” Reller says. “By the holidays we won’t see the types of restrictions we’ve seen on the ability of our partners and retail partners to get touch in the volume they’d like and that customers are demanding.”

Along with that are a slew of complaints about the Windows 8 user interface, many of which may be addressed by Windows Blue, the code name for the upgrade that is also coming out later this year, likely before the holidays, Reller says.

It’s still unclear what changes Windows Blue will include although rumors say the start button and start page so familiar in earlier versions of Windows will be restored. The specifics of Windows Blue – officially called Windows 8.1 – will be revealed at the Microsoft Build developers’ conference at the end of June, she says.

Although Reller didn’t mention it during her remarks at a JP Morgan tech conference in Boston, by the end of the year Intel’s Haswell chips should be in production, offering longer battery life, higher performance and improved graphics processing for a range of devices such as ultrabooks, convertibles and tablets.

This is a convergence of events that Microsoft no doubt would have welcomed last holiday season just after Windows 8 launched in October.

Windows RT deal
Dell has come out with a Windows RT tablet for $300 – $200 less expensive than the cheapest Microsoft Surface RT.

That’s a limited-time offer and a $150 discount off the regular price for its XPS 10, which sports a 10.1-inch display and, like all Windows RT devices, runs on ARM chips. Another short-term option tosses in a keyboard/dock for an extra $50.

At that price the bundle is still significantly cheaper than an iPad and may grab a few potential Apple customers.

When is 100 million not 100 million?
Microsoft says it’s sold 100 million Windows 8 licenses so far and seems proud of it, but the number is being picked apart by people who note that the number of licenses sold might be far higher than the number in actual use.

According to a story in ComputerWorld the count of machines running Windows 8 could be closer to 59 million.

Why would Microsoft release the higher number but not release the number of machines that have activated the software? The obvious answer: that number is embarrassingly small.

Windows 8 is bad for this business
Buffalo, N.Y.-based Synacor blames Windows 8 for a 16% drop in search-engine advertising revenues for its content-portal services. The reason: Windows 8 defaults to Bing as the search engine and sets MSN as the home page, according to this story in the Buffalo News. Part of Synacor’s business is to place its customers’ advertising pages into the start page of end users’ browsers.

“That hurts Synacor because the company generates revenue every time a subscriber uses the Google search box on the start pages that it designs, while a reduction in page views also hurts Synacor’s advertising sales on those start pages,” the News story says.

The situation has contributed to a 5% drop in revenues for Synacor.






Google Fiber divides users into ‘the fast’ and ‘the furious’

Google’s fiber push is making the ‘have nots’ mad. That’s a good thing.

Every day is a beautiful day in the fiberhood.

The chosen ones in Kansas City, Austin and Provo are getting Internet connections that are 100 times faster than average at very low prices, thanks to Google’s Fiber project.

Unfortunately, you and I don’t live there. So we’re stuck in a bandwidth backwater.

As Internet trolls like to say: U mad, bro?

If you don’t live in one of these cities, you should be mad. Google’s Fiber project demonstrates that very high Internet speeds are possible and nobody except Google has the vision or courage to make it happen.

At least one Internet provider has admitted it has the ability to provide much faster speeds to consumers, but has decided not to. Time Warner Cable CFO Irene Esteves said in February that Time Warner is perfectly capable of “delivering 1 gigabit, 10 gigabit-per-second” Internet connectivity to consumers, but that the company just doesn’t “see the need of delivering that to consumers.” I believe Esteves’ statement accurately represents the thinking of most existing Internet providers.

Now are you mad?

The issue isn’t really that consumers don’t want faster Internet speeds. And it’s not that cable providers don’t care.

It’s really a chicken-and-egg problem.
Why your Internet is so slow

Average U.S. Internet speeds rank 12th or 13th in the world, which is pathetic for the country that invented the Internet, is home to Silicon Valley, Hollywood and data-hungry Wall Street, and has a $15 trillion annual GDP.

Other countries are pulling away. A Sony-backed service recently announced 2Gbps download speeds in Tokyo for $51 per month: twice the speed of Google Fiber, 200 times faster than the U.S. average, and at a lower price than Google Fiber.

Now are you mad?

Gigabit fiber Internet access is affordable, but only if everybody gets it. But everybody isn’t going to get it unless it’s affordable.

And that’s why we can’t have nice things.

At least, that’s what Esteves really means when she says that users don’t want faster speeds. Providing consumers with the faster speeds Time Warner currently provides to some business customers is very expensive because only a few customers pay for it all.

It’s not that Time Warner Cable’s customers don’t want fast Internet. They don’t want Time Warner Cable’s price for fast Internet.

However, if you lay fiber to every home in a city, and if a majority of homes sign up to use it, the cost can come way down. And that’s what Google Fiber is all about. It’s about making a bet on the future and investing heavily to bootstrap widespread use and high demand.

Google’s Fiber project involves the actual digging of trenches and the actual laying of fiber optic cables all the way to homes. There are innumerable logistical and legal hurdles to overcome for each city.

Google is already providing the service in Kansas City, and is still expanding into new neighborhoods there. The company recently announced that it would roll it out to Austin, Texas, then Provo, Utah.

Google offers consumers three “plans.” The first is Internet comparable in speed to ordinary broadband, and it’s free. The second is 1Gbps speeds and 1 TB of Google Drive space for $70 per month. The third adds TV plus a 2TB DVR box for a total of $120 per month.

Getting Google Fiber service is just like getting cable Internet service (except 100 times faster). You get a Wi-Fi capable router, and you plug your PC into it via Ethernet for the full-speed experience.

Google is spending $84 million to build the infrastructure necessary to serve 149,000 Kansas City customers. That’s $563.76 per customer, for you math majors. (If that sounds like a lot of money, consider that the infrastructure gives you 100 times faster Internet for the rest of your life for the price of an iPad. Still, customers don’t have to pay for it up front — Google is doing that.) And it gets cheaper per customer with each new person who signs up.
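The per-customer arithmetic is easy to check. The dollar and customer figures below are the article’s; the extra sign-up counts are purely illustrative, to show how the fixed cost spreads thinner as more homes subscribe:

```python
# Back-of-the-envelope check of the Kansas City build-out economics.
build_cost = 84_000_000   # reported infrastructure spend, in dollars
customers  = 149_000      # customers the build-out is sized to serve

cost_per_customer = build_cost / customers
print(f"${cost_per_customer:,.2f} per customer")  # prints "$563.76 per customer"

# Each additional sign-up spreads the same fixed cost thinner
# (the 10,000 and 50,000 figures are hypothetical):
for extra in (0, 10_000, 50_000):
    total = customers + extra
    print(f"{total:>7,} customers -> ${build_cost / total:,.2f} each")
```

This is the heart of the chicken-and-egg argument above: the per-home cost only looks affordable once a large share of homes actually subscribe.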

Goldman Sachs estimates that it would cost $140 billion to deploy Google Fiber nationwide.

To put that in perspective, that one-time investment would give entrepreneurs in every state of the union a radical advantage globally, ignite an economic boom comparable to the nationwide deployment of electricity 100 years ago and enable incredible new services — all for less than what the U.S. loses each year in offshore tax havens.

Now are you mad?
Why mad users are the best thing about Google Fiber

It’s unlikely that Google will lay fiber to every city in the U.S., and less likely still that Google will do so internationally. And it doesn’t need to.

Google Fiber is already “inspiring” ISPs to boost speeds and investment. Google may be triggering an arms race for high-speed Internet connectivity, because it’s re-setting expectations about how fast the Internet should be.

This increasingly matters as HD movies and TV become more mainstream. Right now, Netflix alone consumes one-third of all the download bandwidth in the U.S. at peak times.

Hollywood and other movies-on-demand services had better get busy offering compelling services. More than half the upload bandwidth in the U.S. is consumed by BitTorrent.

I think the minority of providers who figure out how to offer vastly higher speeds at very low cost will survive, and the Time Warners will get out of the ISP business for good.

No, AT&T didn’t announce gigabit fiber in Austin

Hours after Google announced that Austin would get the Google Fiber treatment, AT&T (which is headquartered in Dallas) announced that it would build a gigabit fiber network of its own in Austin.

Or, at least that’s what the news reports would have you believe. But if you look at the press release, it was really a passive-aggressive bit of whining about Google getting special treatment from Austin authorities.

Instead of announcing a plan to build fiber optic connectivity in Austin, AT&T actually announced that “it is prepared to build an advanced fiber optic infrastructure in Austin,” according to the announcement press release.

“Prepared to build” does not mean “plans to build.”

Then the whining began: AT&T’s plans “anticipate it will be granted the same terms and conditions as Google on issues such as geographic scope of offerings, rights of way, permitting, state licenses and any investment incentives.”

The release ended with this zinger: “Our potential capital investment will depend on the extent we can reach satisfactory agreements.”

In other words, the whole reason for AT&T’s press release was not to announce the intention to build fiber optic gigabit Internet connectivity, but instead to complain about preferential treatment of Google by local authorities.

AT&T has a point. Local, state and government regulations and restrictions are a big part of why our Internet speeds are so slow. And that’s yet another reason why Google Fiber is so brilliant.
Google is simply smarter than AT&T

Rather than approaching individual cities and begging them for permission to lay fiber, Google held a big contest and said, in effect: “OK, we’re going to pick a city to gain a massive economic boost. You want it? What are you going to do for us?”

Then they started choosing from among the 1,100 applicant cities based on which ones were most serious about making Google Fiber possible.

In fact, Google Fiber triggered a gold rush of entrepreneurial investment and activity.

One enterprising local even rents their Google Fiber-connected home at a premium on AirBnB, and calls it “Hacker House.”

Google Fiber is creating a lot of hype and attention. It’s making people realize that affordable, ultra high-speed Internet connectivity is possible.

It’s making people look at their local governments and ISPs and ask: Why can’t I have this?

But mostly, Google Fiber is making people mad. And that’s the right emotion in the face of the incredible waste of time and money and opportunity that takes place every day that goes by while we’re held back by yesterday’s Internet speeds.

But let’s not just get mad. Let’s get fiber.





70-410 Microsoft Certified Solutions Associate (MCSA): Windows Server 2012

You work as an administrator at Certkingdom. The network consists of a single domain. All servers on the network have Windows Server 2012 installed. Certkingdom has a server, named Certkingdom-SR07, which has two physical disks installed. The C: drive
hosts the boot partition, while the D: drive is not being used. Both disks are online.
You have received instructions to create a virtual machine on Certkingdom-SR07. Subsequent to creating
the virtual machine, you have to connect the D: drive to the virtual machine.
Which of the following is TRUE with regards to connecting a physical disk to a virtual machine?

A. The physical disk should not be online.
B. The physical disk should be uninstalled and re-installed.
C. The physical disk should be configured as a striped disk.
D. The physical disk should be configured as a mirrored disk.

Answer: A


You work as a senior administrator at Certkingdom. The network consists of a single domain. All servers on the network have Windows Server 2012 installed.
You are running a training exercise for junior administrators. You are currently discussing the new
VHD format called VHDX.
Which of the following is TRUE with regards to VHDX? (Choose all that apply.)

A. It supports virtual hard disk storage capacity of up to 64 GB.
B. It supports virtual hard disk storage capacity of up to 64 TB.
C. It does not provide protection against data corruption during power failures.
D. It has the ability to store custom metadata about the file that the user might want to record.

Answer: B,D


You work as a senior administrator at Certkingdom. The network consists of a single domain. All servers on the network have Windows Server 2012
installed, and all workstations have Windows 8 installed.
You are running a training exercise for junior administrators. You are currently discussing a
Windows PowerShell cmdlet that activates previously de-activated firewall rules.
Which of the following is the cmdlet being discussed?

A. Set-NetFirewallRule
B. Enable-NetFirewallRule
C. Set-NetIPsecRule
D. Enable-NetIPsecRule

Answer: B


You work as a senior administrator at Certkingdom. The network consists of a single domain. All servers on the network have Windows Server 2012
installed, and all workstations have Windows 8 installed.
You are running a training exercise for junior administrators. You are currently discussing the
Always Offline Mode.
Which of the following is TRUE with regards to the Always Offline Mode? (Choose all that apply.)

A. It allows for swifter access to cached files and redirected folders.
B. To enable Always Offline Mode, you have to satisfy the forest and domain functional-level
requirements, as well as schema requirements.
C. It allows for lower bandwidth usage because users are always working offline.
D. To enable Always Offline Mode, you must have workstations running Windows 7 or Windows
Server 2008 R2.

Answer: A,C





70-431 Q&A / Study Guide / Testing Engine / Videos

Certkingdom has hired you as their database administrator. You create a database named
Development on ABC-DB01, which hosts an instance of SQL Server 2005 Enterprise Edition.
You perform weekly maintenance and conclude that the Development database is growing by about
100 MB per month. The network users have started complaining about poor performance of
queries run against the database.
There is 2 GB of RAM installed on ABC-DB01 and the database consumes 1.6 GB of RAM.
How would you determine whether additional RAM should be acquired for ABC-DB01?

A. You should consider monitoring the SQL Server: Memory Manager – Target Server Memory
(KB) Page Splits/sec counter in System Monitor.
B. You should consider monitoring the SQL Server: Buffer Manager counter in System Monitor.
C. You should consider monitoring the System – Processor Queue counter in System Monitor.
D. You should consider monitoring the SQL Server: Access Methods – Page Splits/sec counter in System Monitor.

Answer: B

Explanation: The SQL Server: Buffer Manager object is utilized to view information related to SQL Server memory usage, such as the buffer cache hit ratio, which indicates whether the server has sufficient physical memory.
You create a database on ABC-DB01 that is running an instance of SQL Server 2005 Enterprise Edition.
Certkingdom recently suffered a power outage, which forced you to restart ABC-DB01; the
SQL Server (MSSQLSERVER) service now fails to start. Certkingdom wants you to troubleshoot
the service failure.
What must be done to determine the cause of the service start failure?

A. You should consider reviewing the Event Viewer logs listed below:
The Event Viewer Applications log.
The Event Viewer System logs.
Microsoft Notepad should be utilized to manually view the Microsoft SQL
Server\MSSQL.1\MSSQL\LOG\ErrorLog file.

B. You should consider reviewing the Event Viewer logs listed below :
The Event Viewer Windows logs.
The Event Viewer Setup logs.
The Event Viewer Application log.

C. You should consider reviewing the Event Viewer logs listed below :
The Event Viewer Forwarded Event logs.
The Event Viewer Hardware events.
The Event Viewer Security log.

D. You should consider reviewing the Event Viewer logs listed below :
The Event Viewer Setup Events logs.
The Event Viewer Windows logs.
The Event Viewer Applications and Services logs.

Answer: A

Certkingdom has recently opened an office in Perth where you work as the database administrator.
The Certkingdom database infrastructure runs on computers utilizing SQL Server 2005 Enterprise
Edition. You create an Imports database with a backup schedule configured in the table below:

The Imports database contains a table named incoming, which was updated a week ago. During
the course of the day, a network user informs you that a table has been dropped from the Imports
database at 16:10. Certkingdom wants the incoming table restored to the Imports database.
What must be done to restore the table using minimal effort and ensuring data loss is kept to a minimum?

A. You should consider restoring the database from the most recent differential backup.
B. You should consider restoring Monday’s differential backup and Tuesday’s snapshot backup.
C. You should consider deleting all differential backups and database snapshots except the most recent backup.
D. You should consider recovering the table from the most recent database snapshot.

Answer: A
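
Whichever backup set is chosen, the restore order matters: the full backup must be restored WITH NORECOVERY before a differential can be applied on top. A T-SQL sketch of that sequence (the file paths and backup file names are hypothetical):

```sql
-- Restore the most recent full backup, leaving the database in a
-- restoring state so further backups can be applied.
RESTORE DATABASE Imports
FROM DISK = N'D:\Backups\Imports_full.bak'
WITH NORECOVERY, REPLACE;

-- Apply the most recent differential backup and bring the database online.
RESTORE DATABASE Imports
FROM DISK = N'D:\Backups\Imports_diff.bak'
WITH RECOVERY;
```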

Certkingdom hired you as the network database administrator. You create a database named
Customers on ABC-DB01, which runs an instance of SQL Server 2005 Enterprise Edition.
A custom application is used to access and query the database. The users recently reported that
the custom application experiences deadlock conditions constantly.
What must be done to observe the SQL Server session IDs involved in the deadlock scenario?

A. You should consider using SQL Server Profiler to monitor Errors and Warnings events.
B. You should consider using SQL Server Profiler to monitor Lock:Deadlock Chain events.
C. You should consider using SQL Server Profiler to monitor Objects events.
D. You should consider using SQL Server Profiler to monitor Performance events.

Answer: B
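
Alongside Profiler, SQL Server 2005 can also write deadlock details, including the session IDs involved, to the error log via a trace flag; a sketch:

```sql
-- Trace flag 1222 (SQL Server 2005 and later) writes deadlock graphs,
-- including the SPIDs of the sessions involved, to the SQL Server error log;
-- -1 applies the flag to all connections.
DBCC TRACEON (1222, -1);

-- After reproducing the deadlock, read the error log to find the sessions.
EXEC sys.sp_readerrorlog;
```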

You are the system administrator of a SQL Server 2005 Enterprise Edition server named DB01 that
uses Windows Authentication mode. Certkingdom uses a custom-developed application for running
queries against the database on DB01. Users complain that the custom application stops
responding, and you notice that CPU utilization is at 100% capacity.
You then try to connect to DB01 by utilizing SQL Server Management Studio, but DB01 still
does not respond. Certkingdom wants you to connect to DB01 to determine the problem.
What should be done to successfully determine the problem?

A. You should consider utilizing the osql –L command from the command prompt.
B. You should consider utilizing the sqlcmd –A command from the command prompt.
You could additionally use SQL Server Management Studio to open a Database Engine Query that
connects to DB01 using SQL Server Authentication.
C. You should consider utilizing the osql –E command from the command prompt.
D. You should consider utilizing the sqlcmd –N command from the command prompt.
E. You should consider utilizing the sqlcmd –R command from the command prompt.

Answer: B

Explanation: The sqlcmd –A command opens a dedicated administrator connection (DAC), a reserved connection that succeeds even when the server is not accepting standard connections.
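
In practice, a DAC session from the server’s own command prompt might look like this (the server name and diagnostic query are illustrative):

```
C:\> sqlcmd -A -S DB01 -E
1> SELECT session_id, status, cpu_time
2> FROM sys.dm_exec_requests
3> ORDER BY cpu_time DESC;
4> GO
```

The -A switch requests the dedicated administrator connection, -S names the server, and -E uses Windows authentication; note that only one DAC session can be open at a time.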

MCTS Training, MCITP Training

Google-led group warns of ‘patent privateers’

BlackBerry, Red Hat, Google and EarthLink say businesses use patent trolls as mercenaries to harass the competition

Patent trolls are increasingly becoming a weapon some companies can use to harm or harass their competitors, according to public comments jointly submitted today to the Federal Trade Commission and the Justice Department by lawyers for Google, Red Hat, BlackBerry and EarthLink.

The comments detail what the companies say is a rising tide of so-called “patent privateering” and call for a large-scale government probe of the matter. The term refers to the practice of selling patents to a patent-assertion entity (or patent troll), which lets the troll turn around and sue a competitor without the original company exposing itself to negative publicity or countersuits.

Google senior competition counsel Matthew Bye explained why the process works in an official blog post.

“Trolls use the patents they receive to sue with impunity – since they don’t make anything, they can’t be countersued. The transferring company hides behind the troll to shield itself from litigation, and sometimes even arranges to get a cut of the money extracted by troll lawsuits and licenses,” he wrote.

What’s more, according to the companies, patent privateering can be used to circumvent fair, reasonable and non-discriminatory licensing agreements – exposing businesses that made good-faith decisions to create products based on a given technology to infringement suits by trolls.

Google and its co-signers urged an FTC investigation into the practice, saying that the extent of patent privateering and its effects is difficult to quantify without additional information.

“The secrecy in which PAEs cloak their activities exacerbates all of these concerns and leaves the public without information needed to assess the likely competitive effects of patent outsourcing practices,” the companies said.

Google recently announced an Open Patent Non-Assertion Pledge, saying that it will agree never to sue over the use of some designated patents unless attacked first. The first 10 patents in the program all relate to MapReduce, a big data processing model.



14 dirty IT tricks, security pros edition

The IT security world is full of charlatans and wannabes. And all of us have been “advised” by at least one of them.

All you want in an IT security consultant is expertise, unbiased advice, and experienced recommendations at a reasonable price. But with some, you get much more than you bargained for.

For example: Big-ticket items that solve tiny problems you don’t have. Surprises about the feature set after you’ve already signed on the dotted line. Disregard for your deadlines or what happens to your systems once the work is done.

It’s often challenging to see the shady practices coming. After all, those who employ them sometimes work for the most prestigious firms, have the friendliest handshakes, and express compassion for your security woes. Some aren’t even malicious; they just don’t know how to efficiently solve your problems.

Here are 14 dirty IT security tricks to be aware of before you bring in that outside consultant or vendor. If you have experienced one of these or have another to offer, share it in the comments.

Dirty IT security consultant trick No. 1: Feigning practical experience
A funny TV commercial once depicted a couple of tech consultants getting nervous when asked to help deploy the solution they just designed. “Hey, we’re only consultants!” they retort.

Like most “Dilbert” cartoons, there’s more than a little bit of truth at work here: Many consultants have never deployed the solutions they are selling.

We’ve all encountered this ploy, either in the form of an outright lie about hands-on experience or just an IT consultant who is less forthcoming than they should be about how often they roll up their sleeves and get work done.

If you want to avoid consultants who employ this trick, just ask, “How many times have you implemented the specific solution you are recommending right now?” Then follow it up: “Can I have references?”

Dirty IT security consultant trick No. 2: Proposing one solution for all
Some IT security consultants are all too ready to describe their solution as the one solution you’ve been waiting for to solve all (or most) of your IT security problems.

Not that they take the time to even listen to your problems. Their eyes glaze over anytime they aren’t actively speaking. They can’t wait to interrupt you to start in again about this wonderful solution they’ve brought to you in the nick of time.

There’s just one problem: None of the consultant’s past customers has solved all their security problems.

When you ask a consultant employing this tactic whether prior customers solved their security issues, he’ll say yes. When you ask for customer references, he’ll look surprised, give you caveats, and push you not to call them. If you do call and find out the truth, wait to hear the consultant claim the installation failed because the customer didn’t implement the solution the way he told them to, customized it too much, or simply didn’t listen to him.

Don’t be fooled by claims of incompetence when it comes to previous customers.

Dirty IT security consultant trick No. 3: Knowledge bluffing
How many times has a consultant claimed to be an expert in a particular area, only to have their bluff unmasked because they muff the correct use of technical terms?

Sometimes you don’t even have to dig too deep or ask them anything technical. One of my favorite encounters with this particular practice was when a “certified novel expert” showed up to help my company with its Novell network. I kid you not. The guy claiming to be the master at a particular technology couldn’t even pronounce the name correctly. It’d be funny if it weren’t so embarrassing.

Dirty IT security consultant trick No. 4: Full-court sales press
Rushing decisions reeks of recommended sales tactics. How many times have we heard this: “Hey, I’ll give you 20 percent off the regular pricing if you buy today, before the end of our quarter.”

It doesn’t bother your security consultant that it’s the 13th of the month and you’re thinking his company has a weird fiscal calendar. I don’t know about you, but whenever I’m offered a discount to buy by a particular day, I always wait until after the day and expect the same discount.

I’m sure buying early would help make their bonus bigger, but I don’t care about their bonus. I care about my company. If they want a bigger bonus, they had better convince me I’d be an idiot not to implement their product today. An appeal to their own financial gain is the least of my concerns, especially if I feel they’re trying to rush my thoughtful consideration.

Dirty IT security consultant trick No. 5: Eye candy
I don’t mind vendors bringing beautiful people to a sales meeting, as long as they’re knowledgeable about the product. But when these trophy salespeople are clueless about the offering and have little to no experience in the industry, they’re wasting a seat in the conference room.

Employing models at a security conference is one thing. But when we’ve moved beyond handing out brochures and have begun the product demo and question-and-answer session, it’s time to get serious. Sway me with knowledge and experience, not a pretty smile.

Dirty IT security consultant trick No. 6: Recommending tiny solutions to specific problems for big money
Ever have a consultant pitch you a new, whiz-bang product that’s just great at detecting XYZ? “It’s a complex issue that is hard to stop, but this product does it better than anything else.”

Before you sign up for this expensive, targeted solution, ask yourself two questions: Has your company been exploited by XYZ before, and is your company likely to be exploited that way in the future?

If the answer is no to both of these questions, then reconsider the purchase no matter how awesome the solution.

Dirty IT security consultant trick No. 7: Travel bribes
They come in and insinuate that if you buy their product they will be able to “recommend” you as a visitor to their annual conference meeting in some exotic locale: “Buy our expensive IPS and you’ll have a week in Maui coming up soon.”

Or they fund an expensive “networking” trip for you before you buy the product.

I can’t say I really hate this technique, even though what your consultant is suggesting is usually unethical and sometimes illegal. Who doesn’t want to visit a nice vacation spot, stay in a five-star hotel, and eat in restaurants they could never otherwise afford?

Of course, it always pisses off the consultant when you decide not to buy. When I get offered something that might be mistaken for a bribe, I think it’s best if I don’t buy any product, just so no one gets the wrong idea. But thanks for the trip!

Dirty IT security consultant trick No. 8: “One last thing”
I hate this trick most of all. The consultant brags and brags about a particular solution, even demos its awesomeness. It is awesome. You’ll take 10 of them. Then after you’ve convinced management to allocate the money to buy it, the consultant tells you a tiny fact that crushes all the advantages.

I’ve been told after signing a contract that the data storage I was shown in the demo, which I thought was part of the product, costs extra. After signing a contract, I’ve been told the solution has a few bugs; those bugs, it turned out, invalidated the product. I’ve been told, after the fact, that the solution doesn’t work as well across my wider enterprise, though the consultant was very familiar with my environment. I’ve had consultants leave out annual service costs, mandated upgrades, and all sorts of details that tipped what I thought was a good decision into a bad one.

And they tell you the new information with a smile.

Dirty IT security consultant trick No. 9: Ignoring your deadline
From the outset, you tell the consultant or vendor your drop-dead date for finishing a particular implementation or project. They work with you, gain your trust, and their solution seems perfect for your company. You place your order, and all of a sudden they don’t have a product, installers, or trainers that can fit your schedule. It’s hurry up and wait.

You wonder how they didn’t hear you repeatedly at the beginning when you asked if they could make the date expectations you were directed to meet. Their changing date forces you to make another purchase decision, eat into another budget, or reschedule a major vacation. It’s never fun.

Dirty IT security consultant trick No. 10: Promoting product — and getting kickbacks
We expect consultants to be impartial and to recommend the best solutions for our companies. Lots of consultants make extra money from their “partners” to push particular solutions. We get that. But pushing a product without telling you about the possible conflict of interest goes beyond the pale.

I remember one consultant, many years ago, who advised me on what networking equipment to buy. He didn’t tell me that he was getting a vendor kickback, and after we became “friends,” or so I thought, he tricked me into buying more network equipment than I could ever have used. It was enough network ports for three times the number of Ethernet runs I needed.

To this day I have memories of all that equipment, hundreds of thousands of dollars’ worth, sitting unused in a backroom storage area. It was my mistake. The consultant? He bought a brand-new boat that year.

Dirty IT security consultant trick No. 11: Knowingly recommending products that will be discontinued
Twice recently I’ve encountered customers who were lured into buying solutions just months before their end of life.

In one case, it was high-speed networking equipment. The other was a network access control solution. Each spent megadollars to deploy what ended up being a discontinued product. In one instance, the consultant later let it slip that he was suspicious the solution was going to be discontinued because he had heard all the developers were let go last year.

Isn’t that a tidbit you might want to know before making a buying decision?

Dirty IT security consultant trick No. 12: Saying one thing, signing another
One thing consultants are very good at is translating your needs into a vendor’s purchasing nomenclature. This is especially important when customizing or purchasing a partial solution. You want X of this and Y of that, and the consultant ensures these needs are met, cutting through any possible miscommunication.

Except when they don’t.

No matter how many times you’re told what you’re going to get, make sure it’s part of the contract. Too often, the product arrives, the project is supposed to begin, and something is missing — something expensive. The customer goes back to the vendor and finds out the consultant didn’t include a particular item on the contract.

The consultant will retort that they were clear about what was and wasn’t on the contract, even if you are dead sure what they said verbally was different. Then you have to come up with the additional budget to get what you want or otherwise scratch the entire project.

Dirty IT security consultant trick No. 13: Shortchanging accountability
Doctors take an oath to leave their patients no worse off than when they first arrived. I wish consultants took a similar oath.

Too often consultants implement projects poorly, leaving their customers to endure service outages in their wake. It means nothing to them that the only thing that changed in your environment was what they just installed; instead, they openly wonder whether something unrelated is causing the outage on the very system they messed with.

Insist on a contract that makes your consultant accountable for unexpected service outages due to no fault of your own.

Dirty IT security consultant trick No. 14: Consultants who make big changes before leaving
Lastly, my favorite consultant trick is the one where they make a major change just before they get on a plane home for the weekend or take an extended vacation. Sure, the resulting outage isn’t always their fault, but if you’re going to make big changes to an IT network, do it a few days before you skip town. Nothing is worse than having to leave multiple, unanswered emails and phone calls to a consultant while your user base is experiencing downtime.



A new way to sell used IT equipment

MarkITx offers two-sided market for selling used IT gear

The buying and selling of used IT equipment is not a trivial market, but it doesn’t get enormous attention. In many cases, enterprises unload used equipment as a trade-in or at bargain price to a wholesaler because they just want the equipment off the floor.

A new Chicago-based startup says it has developed a system to help enterprises get as much value out of their old IT equipment as possible.

The company, MarkITx, is running an online system for selling equipment, but it’s not a new version of an eBay-like system. The seller and buyer remain invisible to each other, the money goes in escrow, and the transaction may even involve third-party vendors to refurbish equipment.

Ben Blair, the CTO of MarkITx, said the exchange works like “a two-sided market” rather than an auction. Buyers post the price they want to pay for a particular piece of equipment, including condition and quantity. Sellers post the quantity and current condition of the equipment they are selling and the amount they want to receive for it.
Blair provided this example: A telecom operator wants to sell a few cabinets of Cisco Unified Computing System (UCS) systems for $2.1 million, and a financial exchange operator is seeking a UCS for their back office.

MarkITx puts a value on the equipment, and in this case it recommends selling the Cisco equipment at $1 million, based on a range of $800,000 to $1.2 million. The telco lists all their equipment at the suggested price, but each unit is listed as a separate item at its fair market value. If the buyer and seller agree on the price, the trade is executed automatically and the payment is put in escrow with MarkITx.

Once that happens, the telco is given shipping instructions to an OEM- or ISO-certified refurbisher that MarkITx works with. That third party inspects, tests and verifies the equipment, which is then delivered to the buyer. When the buyer accepts the equipment, the funds in escrow are released.

MarkITx will also bring the equipment up to the condition that buyer wants. For example, a seller is offering 10 rack-mount LCD KVMs at $200 each but in “C” condition. However, the buyer wants them in “A” condition, with replaced back panels and polished glass. The refurbisher says it will cost $100 per unit to bring the equipment to “A” condition, and if the buyer is willing to pay $300 per unit, the trade executes.
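
The matching rule in the KVM example reduces to simple arithmetic: a trade executes when the buyer’s bid covers the seller’s ask plus any refurbishment cost needed to reach the requested condition. A toy sketch of that logic (the function and figures illustrate the article’s example, not MarkITx’s actual engine):

```python
def trade_executes(bid: float, ask: float, refurb_cost: float = 0.0) -> bool:
    """A trade clears when the bid covers the ask plus refurbishment."""
    return bid >= ask + refurb_cost

# The KVM example from the article: the seller asks $200 per unit in "C"
# condition, refurbishing to "A" costs $100 per unit, and the buyer bids $300.
assert trade_executes(bid=300, ask=200, refurb_cost=100)      # trade executes
assert not trade_executes(bid=250, ask=200, refurb_cost=100)  # bid too low
```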

Blair said the company, which began operating last May but just launched publicly this month, is clearing about $1 million in transactions a month. It has $5.5 million in supply on the market, and $13.5 million in demand, he said. The bulk of its customers are in telecom, data center, education and government. The company is paid by the seller through a default commission of 20%, but it has membership plans for lower commissions.

Blair believes there is an opportunity for the company because, he says, many enterprises are only interested in getting rid of their equipment, not in maximizing its potential resale value. He believes that once IT departments investigate the potential resale value of some of that equipment through the market, their minds will change.

“[Wholesalers] make their spread off ignorance,” said Blair, who added that his company’s model ensures transparency on price.

According to IDC, the market for used equipment in the U.S. is about $70 billion.

Joseph Pucciarelli, an analyst at IDC, said that a lot of companies are constrained in what they can spend on capital equipment and purchase used equipment to augment their existing systems. In many cases, they buy used equipment to maintain compatibility with existing systems as a way to keep their “support footprint” from expanding, he said.

Pucciarelli said companies typically will unload used equipment to an original equipment manufacturer (OEM) as part of a trade-in on an upgrade.

The company doing the upgrade won’t get top dollar for their old IT equipment, but they may not see a selling alternative as worth their time, said Pucciarelli. Relative to the overall size of the transaction in an equipment upgrade, “you are talking about something that is pennies on the dollar [for the used systems,]” he said.


Microsoft’s Outlook.com comes out of preview phase

Hotmail users will be upgraded to the new email service by summer

Microsoft has moved its Outlook.com email service out of the preview phase, and plans a marketing campaign to boost its adoption worldwide.

The service, which claims 60 million active users since the preview was released last July, will soon start to upgrade Hotmail users to the new service, David Law, director of product management for Outlook.com, wrote in a blog post on Monday.

At the launch of the preview, Microsoft said Outlook.com would eventually replace Hotmail. The migration of Hotmail users, to be completed by summer, will be seamless: users’ email addresses, passwords, messages, folders, contacts, rules, vacation replies, and other features will stay the same, with no disruption in service, Law wrote. He did not specify a date when the transition would be complete. Users won’t have to switch to an Outlook.com address if they prefer not to, he added.

Microsoft is also launching a large-scale marketing campaign to promote the service worldwide, stating that it is confident Outlook.com is ready to scale to a billion people.

“A number of people have expressed appreciation that Outlook.com replaces advertising with the latest updates from Facebook or Twitter when they’re reading email from one of their contacts,” Law wrote. On average, people saw 60% fewer ads when using Outlook.com because they now get much more relevant updates from their friends, he added.

Microsoft recently launched a campaign against Gmail in the U.S., targeting Google’s alleged practice of going through the contents of all Gmail messages to sell and target advertisements. The “Don’t Get Scroogled by Gmail” campaign on Microsoft’s Scroogled site promotes Outlook.com as an alternative to Gmail. Microsoft asked users to sign a petition to stop Google from going through personal email to sell ads.


BYOD to Change the Face of IT in 2013

The “Bring Your Own Device” phenomenon, largely driven by Apple iPhones and iPads, is changing the face of IT departments, perhaps reaching a tipping point. If CIOs thought mobile devices presented challenges before, they haven’t seen anything yet.

“IT departments need to be service organizations,” says CTO Aaron Freimark at services firm Tekserve, which helps Fortune 1000 companies adopt Apple products. “The most conservative financial institutions are seeing all of these iPhones on their networks and accessing Exchange servers. We’re reaching a critical mass this year, when companies are forced to deal with it.”

MobileIron and iPass released a joint study, 2013 Mobile Enterprise Report, that found IT increasingly losing control of mobility budgets to other departments. In 2011, 53 percent of IT departments managed the mobile budget. This number dipped to 48 percent in 2012.

This year just might be the year BYOD and the mobile workforce change how IT operates, or at least put more emphasis on services. A new report from Forrester found that at least a quarter of a billion information workers worldwide already practice BYOD in some form. A third of information workers, some 208 million people globally, want iPhones; nearly as many want Windows tablets.

Along these lines, the Mobile Enterprise Report found that tablet usage increased in all non-executive departments between 2011 and 2012, with legal and HR seeing the biggest hike followed by finance and accounting.

Part of what’s driving BYOD is the emergence of next-generation workers, the Millennials. Many of these workers, between the ages of 18 and 29, are willing to blend work and personal lives, which goes hand-in-hand with BYOD. According to the Forrester report, the rise of “anytime, anywhere” workers in the United States and Europe grew from 15 percent to 28 percent of employees between 2011 and 2012.

What does this mean for CIOs? Change is in the wind, one that’s blowing toward becoming a service organization.

The top two sources of BYOD frustration for an IT department are onboarding and supporting an increasing number of devices, according to the Mobile Enterprise Report. The latter is one of the concerns that CTO Bill Murphy at financial services firm Blackstone Group has about supporting BYOD tablets beyond the iPad. (For more on this, check out How a Big Financial Services Firm Faced BYOD iPads.)

“Right now, the amount we do for our users as it relates to mobile devices is vast,” Murphy says. “If we had to support 15 types of devices, we wouldn’t have the staff to be able to handle it.”

Echoing the sentiments of Murphy, Freimark and others, the Forrester report states:

“For CIOs, BYOD is both an opportunity to outsource cost to employees and also a call to action to implement security models and application architectures that are device-agnostic. Only in this way can you get out of the business of device provisioning and into the business of service provisioning, and that’s where you can make a real difference in employee’s satisfaction and productivity.”


Cisco fills out SDN family with 40G switch, controller, cloud gear for data center

Nexus 6000 designed for high-density 40G; InterCloud, for VM migration to hybrid clouds; ONE Controller for programming Cisco switches, routers

Cisco this week will fill out its programmable networking family with a new line of data center switches, cloud connectivity extensions and a software-based SDN controller.

The new products fill out Cisco’s ONE programmable networking strategy, which was unveiled last spring as the company’s answer to the software-defined networking trend pervading the industry. Cisco ONE includes APIs, controllers and agents, and overlay networking techniques designed to enable software programmability of Cisco switches and routers to ease operations, customize forwarding and more easily extend features, among other benefits.

This week’s data center SDN rollout comes after last week’s introduction of the programmable Catalyst 3850 switch for the enterprise campus.

The new Nexus 6000 comes in two configurations: the 4RU Nexus 6004 and the 1RU Nexus 6001. The 6004 scales from 48 Layer 2/3 40Gbps Ethernet ports, all at line rate, Cisco says, to 96 40G ports through four expansion slots. The switch also supports 384 Layer 2/3 10G Ethernet ports at line rate, and 1,536 Gigabit Ethernet/10G Ethernet ports using Cisco’s FEX fabric extenders.

The Nexus 6001 sports 48 10G ports and four 40G ports through its expansion slots. The Nexus 6000 line features 1 microsecond port-to-port latency and support for up to 75,000 virtual machines on a single switch, Cisco says. It also supports Fibre Channel over Ethernet (FCoE) tunneling on its 40G ports.

The Nexus 6000 will go up against 10G and 40G offerings in Arista Networks’ 7000 series switches, Dell’s Force10 switches and Juniper’s QFabric platforms. Cisco also announced 40G expansion modules for the Nexus 5500 top of rack switch and Nexus 2248PQ fabric extender to connect into the Nexus 6000 for 10G server access and 40G aggregation.

Cisco also unveiled the first service module for its Nexus 7000 core 10G data center switch. The Network Analysis Module-NX1 (NAM-NX1) provides visibility across physical, virtual and cloud resources, Cisco says, including Layer 2-7 deep packet inspection and performance analytics. A software version, called virtual NAM (vNAM), will also be available for deployment on a switch in the cloud.

For hybrid private/public cloud deployments, Cisco unveiled the Nexus 1000V InterCloud software. This runs in conjunction with the Nexus 1000V virtual switch on a server and provides a secure tunnel, using cryptography and firewalling, into the provider cloud for migration of VM workloads into the public cloud.

Once inside the public cloud, Nexus 1000V InterCloud provides a secure “container” to isolate the enterprise VMs from other tenants, essentially forming a Layer 2 virtual private cloud within the provider’s environment. The enterprise manages that container using Cisco’s Virtual Network Management Center InterCloud software on the customer side.

Within the context of Cisco ONE, Nexus 1000V InterCloud is an overlay, while the Nexus 6000 is a physical scaling element for the virtual data center. A key element of Cisco ONE is the new ONE Controller unveiled this week.

ONE Controller is software that runs on a standard x86 server. It controls the interaction between a Cisco network and the applications that run on it and manage it, through a set of northbound and southbound APIs that handle communication between those applications and the network.

Those APIs include Cisco’s onePK set, OpenFlow and others on the southbound side between the controller and the switches and routers; and REST, Java and others on the northbound side between the controller and Cisco, customer, ISV and open source applications.

Among the Cisco applications for the ONE Controller are a previously announced network slicing program for network partitioning, and two new ones: network tapping and custom forwarding.

Network tapping provides the ability to monitor, analyze and debug network flows; and custom forwarding allows operators to program specific forwarding rules across the network based on parameters like low latency.

Cisco also provided an update on the phased rollout of Cisco ONE across its product portfolio. OnePK platform APIs will be available on the ISR G2 and ASR 1000 routers, and Nexus 3000 switch in the first half of this year. They’ll be on the Nexus 7000 switch and ASR 9000 router in the second half of 2013.

OpenFlow agents will be on the Nexus 3000 in the first half of this year. This is in keeping with Cisco’s initial plan for OpenFlow, though that plan was changed last spring to have OpenFlow appear first on the Catalyst 3000. OpenFlow will now appear on the Catalyst 3000 and 6500, the Nexus 7000 switch and the ASR 9000 router in the second half of this year.

For Cisco ONE overlay networks, the Cloud Services Router 1000V, which was also introduced last spring, is now slated to ship this quarter. It was expected in the fourth quarter of 2012. Microsoft Hyper-V support in the Nexus 1000V virtual switch will appear in the first half of this year, as will a VXLAN Gateway for the 1000V. KVM hypervisor support will emerge in the second half of this year.

As for the product announced this week, the Nexus 6004 will ship this quarter and is priced from $40,000 for 12 40G ports to $195,000 for 48 40G ports. The Nexus 6001 will ship in the first half of this year and pricing will be announced when it ships.

The 40G module for the Nexus 5000 series will ship in the first half, with pricing to come at that time. The 40G-enabled Nexus 2248PQ will cost $12,000 and ship in the first quarter.

The NAM-NX1 for the Nexus 7000 will ship in the first half with pricing to come at shipment. The vNAM will enter proof-of-concept trials in the first half.

The Cisco ONE Controller will also be available in the first half. Pricing will be announced when it ships.


Microsoft Surface sales suck

Or do they? If you listen to some analysts, Surface, and other slates running Windows 8 or RT, started slow out of the gate. Considering how much tablets sapped PC shipments in Q4, slow forebodes trouble ahead. Or does it?

“There is no question that Microsoft is in this tablet race to compete for the long haul”, Ryan Reith, IDC program manager, says. “However, devices based upon its new Windows 8 and Windows RT operating systems failed to gain much ground during their launch quarter, and reaction to the company’s Surface with Windows RT tablet was muted at best”. He estimates that Microsoft shipped just 900,000 Surfaces during the fourth quarter, a figure that counts units shipped to stores, not actual sales to customers.

That number sure looks low compared to any manufacturer in the top 5. Even lowly ASUS shipped 3.1 million units. But sell-through matters more. Except for about 10 days of the quarter, Surface sold at retail exclusively through 66 shops in Canada and the United States, while Apple offered the iPad through an average of 390 shops, 150 of them outside the United States. Accounting for online sales and making some rough guesstimates, I get 14,680 iPads sold per Apple Store and (assuming 600,000 units) 9,090 Surfaces per Microsoft shop.

However, when adjusting for actual sales days (Microsoft’s slate was available for only about two-thirds of the quarter), Surface sell-through averages out a little higher than iPad’s on a per-store basis. Meaning: given limited distribution, Microsoft’s tablet sells better than IDC shipments suggest.
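The per-store figure is straightforward division. A minimal sketch of the arithmetic, using the article’s own assumptions (the iPad number folds in online-sales guesstimates, so only the Surface side is reproduced here):

```python
# Rough per-store Surface sell-through from the article's assumptions:
# about 600,000 units sold through 66 Microsoft retail shops.
surface_units = 600_000
microsoft_shops = 66
surface_per_shop = surface_units // microsoft_shops  # roughly 9,090
```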

Size Matters
Microsoft’s problem is something else: size. “We believe that Microsoft and its partners need to quickly adjust to the market realities of smaller screens and lower prices”, Reith emphasizes. That’s a polite way of saying Surface RT costs too much at $499 and that Pro, on sale starting February 9, is already overpriced. But are they? Really?

According to NPD DisplaySearch, market demand is shifting toward smaller, lower-cost models. The firm forecasts that slates with 7-7.9-inch displays will account for 45 percent of shipments this year. By contrast, 9.7-inchers, the size of the category-leading iPad, will fall to 17 percent. And Apple offers the 7.9-inch iPad mini, whereas Microsoft and its partners offer nothing in this rapidly growing size segment.

Apple tablets are pricey, too. Starting February 5, one iPad 4 configuration will sell for $929. But fruit-logo pricing starts lower, at $329 for the 16GB iPad mini with WiFi. Microsoft’s entry price is locked at $499 for a 10.6-inch slate. What the company needs is a broader range of sizes and prices, the strategy competitors like Apple, ASUS and Samsung pursue. That would also preserve current Surface pricing.

Such an approach doesn’t easily fit Microsoft’s current tablet strategy, which is all about making a traditional desktop operating system available on more form factors. But that’s not what the market wants today, when tablets displace some computer sales rather than replace PCs altogether.

Reith warns: “In the long run, consumers may grow to believe that high-end computing tablets with desktop operating systems are worth a higher premium than other tablets, but until then ASPs on Windows 8 and Windows RT devices need to come down to drive higher volumes”.

Give a Little
Simply stated: working with partners, Microsoft must make gaining market share the top priority. Tablet shipments grew about 75 percent, both year over year and quarter on quarter, to 52.5 million in Q4. Laptops lead the PC category, but NPD DisplaySearch predicts that tablet shipments will exceed notebooks this year. Again, that’s not so much slates replacing PCs as displacing new sales, as capabilities overlap. Microsoft doesn’t want to be left behind Android and iOS slates. This is a platform war that nobody wants to lose.


ASUS tablet shipments grew 402.3 percent year over year and Samsung’s by 263 percent, according to IDC. These are phenomenal gains, and both companies offer models running Windows 8 or RT alongside Android. Something else: they also sell what Microsoft doesn’t, smaller slates with 7-7.9-inch screens. Short term, Microsoft’s options are limited with Surface. But working with partners, Microsoft could bring Windows RT to smaller screens. That would preserve Surface pricing as well as the broader push to bring desktop Windows to new devices.

But there’s a wrinkle. Android costs ASUS and Samsung nothing, and Apple recovers the cost of iOS through its own research and development. Microsoft partners, by contrast, pay to license Windows RT. I wouldn’t recommend that Microsoft give tablet OEMs Windows for free, but co-marketing contributions and other incentives could temporarily make the fees essentially zero on smaller slates.

Already Apple feels the pinch. In Q4, iPad shipment share fell to 43.6 percent from 51.7 percent a year earlier, even as volumes increased to 22.9 million units from 15.1 million, according to IDC. That makes two consecutive quarters of declining iPad share.
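The share figure checks out against the quarterly shipment totals cited earlier; a quick sketch:

```python
# iPad's Q4 shipment share from IDC's figures: 22.9 million iPads
# out of the 52.5 million total tablets shipped in the quarter.
ipad_q4_millions = 22.9
tablets_q4_millions = 52.5
ipad_share = round(100 * ipad_q4_millions / tablets_q4_millions, 1)  # 43.6
```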

Apple’s falling tablet fortunes show just how dynamic the segment is, and that competitors can and will gain share. But for which platform, Android or Windows RT? Microsoft can answer the question, at least in part, by adjusting its tablet strategy.
