Archive for October, 2013:

5 IT security horror stories (and 5 solutions)

When it comes to security, your employees may be your weakest link. While policies and training can go a long way toward helping your employees keep devices and data safe, sometimes technology is the answer.

Your business relies on the security of its networks, storage and mobile devices to protect personal information and corporate data. But often, the weakest link in a data security plan is the human element. While education and training can go a long way toward helping your employees keep devices and data safe, sometimes it’s up to technology to save the day.

Jaspreet Singh, CEO and Founder of data protection and governance company Druva, outlines five of the worst data security horror stories and explains how they could have been prevented.

Problem: Mobile Device Loss
Almost 70,000 laptops, smartphones and other mobile devices are lost every year at airports, in hotel rooms and in taxis, Singh says. The loss of personal and business information can be crippling and embarrassing, and can leave your company at risk for even greater theft and data loss.

Mobile Device Loss Solution
Obviously, you want your employees to understand the importance of keeping their devices with them at all times. But in the event of a loss or theft, technology can come to the rescue. With continuous synchronization and data backup, even if a device is lost or stolen, its data can be quickly and easily reprovisioned on a new device. And with data loss prevention (DLP) software, sensitive data and information can be wiped from the device remotely, significantly reducing the chance of a breach.
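As a sketch of that workflow (not any particular DLP product's API; the class, statuses, and return value below are invented for illustration), a lost-device report can immediately trigger a queued remote wipe:

```python
# Minimal sketch of a lost-device response flow. The DeviceRegistry class and
# its statuses are illustrative assumptions, not a real MDM/DLP product's API.

class DeviceRegistry:
    def __init__(self):
        self.devices = {}  # device_id -> {"status": ..., "wiped": ...}

    def enroll(self, device_id):
        self.devices[device_id] = {"status": "active", "wiped": False}

    def report_lost(self, device_id):
        """Mark a device lost and immediately queue a remote wipe."""
        device = self.devices[device_id]
        device["status"] = "lost"
        device["wiped"] = True  # in practice, a wipe command is pushed to the device
        return "wipe-queued"

registry = DeviceRegistry()
registry.enroll("laptop-042")
print(registry.report_lost("laptop-042"))  # wipe-queued
```

The point of the sketch is that the wipe is policy-driven: one report, no human in the loop between "lost" and "wiped."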

Problem: Data Theft
In a highly publicized incident, one large storage and archiving company was the victim of a massive data theft when a huge number of encrypted drives were stolen from a van transporting them to an off-site facility. Don’t think it could get worse? The van was unlocked and unattended, making the theft much easier.

Data Theft Solution
The physical security of devices when in transport or in a storage facility is just as important as securing the data they contain. Make sure your off-site storage facility and the transportation method used to get your drives there are secure, and that staff is highly trained. You also should encrypt all the data and devices, which can mitigate risk in the event of a theft.

Problem: Laptop Theft
A physician at Lucile Packard Children’s Hospital at Stanford University reported that his hospital-issued laptop was stolen from his car, putting the information of about 57,000 patients at risk. While the computer was password-protected, it wasn’t immediately apparent what kind or how much data was on the computer.

Laptop Theft Solution
Installing eDiscovery software would have made it easier to determine that, fortunately, the information on that laptop was years out of date and didn’t contain any financial or personally identifying information. Such a theft is still a concern, of course, but it could have been much worse, Singh says.

Problem: BYOD
Bring Your Own Device offers employees flexibility and freedom, but it can also put confidential information and proprietary business information at risk, Singh says. If users access confidential files or personal information over unprotected wireless connections (or, as noted above, lose their devices), your business could be at risk.

BYOD Solution
Education is one of the first lines of defense against this sort of breach, Singh says. Make sure your employees understand the risks and know which files and information they shouldn’t access from their personal devices. If a device is lost or stolen, DLP software can wipe it and make it useless to the thief.

Problem: Web Traffic Detour
For about 18 minutes in April 2010, about 15 percent of U.S. government Internet traffic was redirected through China, including traffic to and from the sites of the U.S. Army, Navy, Marine Corps, Air Force, the office of the Secretary of Defense, the Senate and NASA, Singh says. Though the Chinese government denied it, a major flaw was found in a government data center that could easily have been exploited to redirect traffic.

Web Traffic Detour Solution
Singh says building in restricted user access could have prevented such an incident. By incorporating a blacklist and whitelist of authorized users, network administrators can control which users, which devices, and which specific IP addresses are permitted to access specific data, applications, and computing functions, he says.
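The restricted-access idea Singh describes can be sketched as a combined user allowlist/denylist plus permitted IP ranges; all names and ranges below are illustrative assumptions:

```python
import ipaddress

# Illustrative access lists; deny always wins over allow.
ALLOWED_USERS = {"alice", "bob"}
DENIED_USERS = {"mallory"}
ALLOWED_NETWORKS = [ipaddress.ip_network("10.0.0.0/8")]

def may_access(user, ip):
    """Permit access only for allowlisted users coming from a permitted range."""
    if user in DENIED_USERS:
        return False
    if user not in ALLOWED_USERS:
        return False
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

print(may_access("alice", "10.1.2.3"))     # True
print(may_access("alice", "203.0.113.9"))  # False
```

A real deployment would enforce this at the network edge or in the application gateway rather than in application code, but the deny-first logic is the same.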

MCTS Training, MCITP Training

Best Microsoft MCTS Certification, Microsoft MCITP Training at




How the cloud is blowing up the network

The cloud introduces much more dynamic characteristics to IT

CHICAGO – For networking folks, the good old days are fading away.

Applications used to be easy to manage, at least compared to today. Traditional network architecture approaches align networks with the applications they’re supporting. There are linear data flows, which lead to linear networking flows, and they evolve together. As the application grows, so does the network. These topologies are, relatively, easy to scale horizontally using tools like load balancers, and simple to monitor by tapping single points of traffic flow.

“Then, virtualization changed things,” said Eric Hanselman, chief analyst at the 451 Group, who presented a discussion on how the cloud is changing networking at the Cloud Connect event in Chicago this week. The fundamental difference is that virtualization allows applications to be mobile. “When you start to move workloads around, those data flows become much more complicated,” Hanselman said. “Those traditional networking tiers start to come undone.”

Now, cloud has introduced a whole new set of complexity beyond just virtualization. In a cloud environment, not only are virtual machines sliced up within a server, but they can be automatically provisioned and scaled. That requires even more network flexibility. The cloud brings other challenges too. Today, applications can run in a geographically dispersed setting all around the world. But when that happens, the underlying data that supports those apps still needs to be constantly updated and synchronized as well.

Customers may be used to replicating data for disaster recovery scenarios. But that active/backup model doesn’t quite cut it in a cloud world. Geographically dispersed applications need to be synchronized, creating active-active scenarios across multiple sites. But that’s tough to do.

Sometimes users settle for less than fully consistent data across these applications, accepting a system in which data is mostly up to date in real time and eventually replicates across the distributed environment, Hanselman said. Hyperscale data centers use this philosophy: When a user updates Facebook, that update may not show up immediately across the entire globe. But eventually it will work its way through the system. “Prepare to separate and distribute” your data, Hanselman recommends.
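The eventual-consistency model Hanselman describes can be sketched with last-write-wins merging, one common convergence strategy (assumed here for illustration; he doesn't name a specific mechanism). Each replica applies timestamped updates and keeps only the newest, so replicas converge even when updates arrive in different orders:

```python
# Last-write-wins replication sketch: replicas converge on the newest value.

def merge(replica, key, value, timestamp):
    """Apply an update only if it is newer than what the replica holds."""
    current = replica.get(key)
    if current is None or timestamp > current[1]:
        replica[key] = (value, timestamp)

site_a, site_b = {}, {}
# The same two updates arrive in a different order at each site...
merge(site_a, "status", "posted", 1)
merge(site_a, "status", "edited", 2)
merge(site_b, "status", "edited", 2)
merge(site_b, "status", "posted", 1)
# ...but both replicas still converge to the newest value.
print(site_a == site_b)  # True
```

This is exactly the trade the Facebook example makes: readers at some sites briefly see stale data, in exchange for writes that never block on global coordination.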

The cloud has introduced these new models for data to be distributed across the globe, and the networking needs to keep up. Cloud providers are trying to advance these networking paradigms along with their cloud services. Amazon offers Direct Connect, a dedicated link between its data centers and colocation facilities operated by a variety of partners, such as Equinix. Other companies are rolling out SDN-like qualities in their clouds, giving customers the ability to spin networks up and down on demand.

[FROM SMALL TO BIG: 5 tips for managing your cloud at scale]

The cloud introduces much more dynamic characteristics to IT, says Bernard Golden, director of the enterprise practice at Enstratius, a Dell-owned cloud management platform. Users usually understand, conceptually, the network changes needed to accommodate this, but they don’t confront the challenges until their systems are implemented.

A key to relieving some of these issues, he said, is to install a software layer between the network and these dynamic applications, whether that’s an SDN or the more palatable option of virtualized switches. “Essentially, you need to have smart software in the middle of the network now,” said Golden, who is also the author of the recent book Amazon Web Services for Dummies.

The takeaway, Hanselman said, is that virtualization, and especially the cloud, significantly alters traditional networking approaches. If the network doesn’t evolve along with new technologies like cloud computing, the entire system suffers.




Why green IT is good for business

As these companies have discovered, when IT projects focus on operational efficiency, sustainability benefits usually follow.

When Kevin Humphries talks about green IT at FedEx, you won’t hear much about reducing the company’s carbon footprint. FedEx embraced the new math of green IT when it engineered every inch of its new, LEED-certified 46,000-sq.-ft. data center for maximum operational efficiency. “We found the most optimal mathematical model for capacities and efficiencies,” says the senior vice president of IT at FedEx Corporate Services. The result is what he calls “a perfect blend” of green energy usage, fiscal savings and rational utilization of equipment and resources.

Elements of the design included flywheel backup power generators, variable-speed fans that help keep the facility’s Power Usage Effectiveness (PUE) rating low, and air-side economizers that generate 5,000 hours per year of free cooling for the Colorado Springs building. FedEx also raised the operating temperature in the data center by 5 degrees to cut the cooling bill. Meanwhile, specific rack cooling technologies, alternative energy sources and other systems were not included because FedEx felt they were risky, prone to failure or required undue maintenance. “We had to find the perfect blend of simplicity and advanced technology,” Humphries says.
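The PUE rating mentioned above has a simple definition worth spelling out: total facility energy divided by the energy delivered to the IT equipment, with 1.0 as the theoretical ideal. A minimal sketch with illustrative numbers (not FedEx's actual figures):

```python
# Power Usage Effectiveness: total facility energy over IT equipment energy.
# 1.0 would mean every watt entering the building reaches the IT gear.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,300 kWh overall to deliver 1,000 kWh to IT load:
print(round(pue(1300, 1000), 2))  # 1.3
```

Free cooling and variable-speed fans lower the numerator without touching the IT load, which is why they move PUE directly.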

Because the hype and excitement over green IT has diminished over the past few years, and the specter of carbon taxes has faded, organizations have begun to put sustainable IT initiatives on the back burner, or even dismiss them entirely. But successful green IT projects usually go hand in hand with operational efficiency initiatives, where benefits drop down to the bottom line while meeting corporate sustainability goals. “Did we make any trade-offs with efficiency versus cost? There were very few,” Humphries says.

The good news is that there are still plenty of relatively easy ways to make your facilities more eco-friendly. “Your average data center remains relatively inefficient,” says Simon Mingay, an analyst at Gartner. And that means green IT affords lots of opportunities for gaining favor with the CFO as well as the corporate sustainability advocate. There’s even a road map to follow: The best practices for energy efficiency are now well established and readily available from resources such as The Green Grid (see box below), and standards for water usage, carbon usage, renewable energy and e-waste are evolving rapidly.

Ian Bitterlin, chairman of The Green Grid’s EMEA Technical Work Group and CTO at Emerson Network Power Systems, says there are three steps to maximizing efficiency: Cut consumption, make IT processes more efficient and think about alternative energy — in that order. “If you put them in the wrong order, you’ll just waste renewable energy,” he says.

Virtualization End Game

When it comes to reducing consumption and improving operational efficiency, consolidation through server virtualization still offers the biggest bang for the buck. While many IT organizations are working to virtualize more of their legacy server infrastructures — FedEx is now 80% virtualized — Raytheon has already reached the finish line. “All legacy servers have been transitioned, [and] as a result, 2013 will be the last year that we capture and report on the energy and cost savings,” says Brian Moore, IT sustainability program lead. The emphasis now turns to desktop virtualization using energy-sipping thin clients, e-waste reduction, and the use of analytics and data center instrumentation to monitor, manage and reduce energy use.

At Northrop Grumman, server virtualization was part of a data center consolidation effort, completed in August, that eliminated 4,000 physical servers while consolidating 19 data centers and 81 smaller server rooms into three facilities. But like its desktop power management program, which cut energy use for desktops by more than 21%, the server virtualization was driven primarily by a desire to improve operational efficiencies and the bottom line. Helping the company’s GreenNG environmental sustainability program achieve its goal of reducing greenhouse gas emissions by 25% — a goal attained just one year into the three-year initiative — was a bonus. “Power consumption reductions were one of the benefits, but it was just part of our overall IT transformation strategy,” says Brad Furukawa, vice president and CTO for Northrop Grumman’s shared services division.

While server virtualization is well established, many organizations are still just getting started with desktop virtualization. But First National of Nebraska is well into a project to replace between 70% and 80% of its desktops with virtual desktops accessed through thin clients. The move will cut the hassle and headaches associated with disposal of desktop e-waste, which has been piling up in warehouses. “It was driving us crazy,” says James Cole, senior vice president and CIO at the financial services firm.

First National of Nebraska has even extended the concept of virtualization to its chillers. It uses an off-premises utility that provides chilled water to multiple tenants in the local business district. Normally, each business would have its own closed-loop system, each with its own excess capacity. Virtual chillers let each business share capacity, improving energy efficiency and cutting costs, says Cole.

Turning Up the Heat

A more controversial green computing initiative involves raising the maximum air intake temperature on equipment racks in data centers to as high as 80.6 degrees to reduce the energy demands of cooling systems, as Technical Committee 9.9 of the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommended in 2011. But today, just 7% of all data centers run at 75 degrees or higher, according to a recent Uptime Institute survey. The idea of raising the temperature may seem anathema to many data center managers, but some organizations are slowly inching up their thermostats.

At Earth Rangers, a nonprofit focused on environmental education, IT systems director Rob DiStefano raised temperatures in a small data center from the high 60s in 2010 to 77 degrees, but stopped there when network storage temperature alarms went off. “Storage units are the biggest heat monster in the room,” he says. He could have reconfigured the alarms from the factory defaults, but the idea made him uncomfortable. “We didn’t want to risk it,” DiStefano says. And with intake air temperatures at 77 degrees, air temperatures on the back of the racks were getting uncomfortably warm, says Andy Schonberger, director of the Earth Rangers center.

For his part, Humphries raised the temperature in the FedEx Colorado Springs data center by 5 degrees. He declined to say where the temperature is now set, but he says if he set the temperature at 76 degrees on the intake side of the racks, the temperature in the hot aisle would top 100 degrees. “Sending in someone to replace servers in 100-degree heat is not what we want,” he says. Humphries says the law of diminishing returns kicks in as you approach the upper range of the ASHRAE limit: Fans run longer and equipment works harder, and adding heat containment would have gone against FedEx’s commitment to simplicity in the data center. But raising the temperature 5 degrees in Colorado Springs yielded cost savings that are significant enough for the organization to begin phasing in a similar change at another major data center in Tennessee.

Savings also added up at Raytheon, which raised temperatures in the network distribution rooms in its Tucson, Ariz., facility from 65 to 75 degrees without running into problems. That step alone saved 112,000 kilowatt-hours per month — enough energy to power 100 homes, according to Moore. Raytheon has expanded the initiative to other facilities, but savings vary depending on location, total power use and other variables.
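For a rough sense of scale, the Tucson figure can be annualized. The 10-cent commercial rate below is an illustrative assumption, not Raytheon's actual tariff:

```python
# Raytheon's reported Tucson saving, annualized at an assumed rate.
monthly_savings_kwh = 112_000
annual_savings_kwh = monthly_savings_kwh * 12
rate_per_kwh = 0.10  # assumed commercial rate in dollars; illustrative only

print(annual_savings_kwh)                        # 1344000
print(round(annual_savings_kwh * rate_per_kwh))  # 134400
```

That is over a gigawatt-hour a year from a single thermostat change in one set of rooms.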

Roger Schmidt, an IBM fellow and chief engineer on data center efficiency, recommends that Web 2.0 and lower-tier data centers turn the needle closer to the 80.6-degree mark, but he says that even Tier 1 data centers in risk-averse industries such as banking can safely ease the mercury up to 75 degrees.

Another underappreciated strategy is to set up instrumentation in the data center that lets administrators monitor and manage both temperature and power use. Most IT organizations still don’t do this, according to Gartner. Schonberger advises building a business case for this by tracking the half-dozen pieces of equipment that are your company’s biggest energy consumers. “It doesn’t save you any money, but it allows you to prioritize,” he says.
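Schonberger's advice amounts to a simple ranking over instrumented power readings; the equipment names and draws below are illustrative:

```python
# Rank instrumented equipment by measured draw to find the biggest consumers.
readings_kw = {
    "storage-array-1": 42.0,
    "chiller-2": 88.5,
    "ups-1": 12.3,
    "rack-a-servers": 65.0,
    "crac-unit-3": 71.2,
    "network-core": 9.8,
}

def top_consumers(readings, n=3):
    """Return the n pieces of equipment drawing the most power."""
    return sorted(readings, key=readings.get, reverse=True)[:n]

print(top_consumers(readings_kw))  # ['chiller-2', 'crac-unit-3', 'rack-a-servers']
```

As he says, the ranking itself saves nothing; it tells you where an efficiency project will pay off first.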

Saving Green With Alternative Energy

Only after an organization has analyzed and instrumented its data center, eliminated redundancy and re-engineered to squeeze the maximum efficiency out of its IT infrastructure should alternative power come into play. First National of Nebraska became one of the first organizations to power a data center entirely on fuel cells when it built a new data center more than a decade ago. But when it was time to order new fuel cells this year, it was able to cut the power requirements from 600 to 400 kilowatts because management of its data center infrastructure had improved.

The operating cost, at 12 cents per kwh, is almost double the 6.2 cents First National’s utility would charge. But First National had designed the building to use the waste heat from fuel cells to warm its interior and melt snow on some outdoor surfaces in the winter. Fuel cells also provide a very stable power supply and meet management’s goals of using renewable energy, Cole says. But, he says, “if we were building the data center today, it would be a more difficult business decision.”
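To see why the decision would be harder today, the rate gap can be worked through for a continuous 400 kW load (the load matches the new fuel cells' capacity; treating it as constant year-round is my simplifying assumption):

```python
# Annual cost premium of fuel-cell power over utility power at the stated rates.
load_kw = 400
hours_per_year = 8760
fuel_cell_rate = 0.12   # dollars per kWh, from the article
utility_rate = 0.062    # dollars per kWh, from the article

annual_kwh = load_kw * hours_per_year
premium = annual_kwh * (fuel_cell_rate - utility_rate)
print(annual_kwh)      # 3504000
print(round(premium))  # 203232
```

Roughly $200,000 a year is the price of the stability, waste-heat reuse and renewable-energy goals Cole cites.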

FedEx uses solar and fuel cells in other facilities. But after considering solar, wind and fuel cells during the design phase for the Colorado Springs data center several years ago, it decided to pass. Of those technologies, fuel cells looked the most promising. The cost of power from fuel cells couldn’t match utility rates, but the business was more concerned about the availability of commercial utility power than the economics, says IT director Brad Hilliard. What killed the idea was the location of service, which at that time was concentrated on the East and West coasts. Today, however, the technology and associated support infrastructure is more mature. Were he reconsidering that decision now, Hilliard says, “We would spend more energy on that because that local utility risk is so important to manage.”

Fuel cells can be very efficient for data centers because the waste heat they generate can be used to cool the facility when fed into special absorption chillers. But from a purely economic standpoint, you still need fiscal incentives, such as tax breaks, to make them a viable business proposition, says Gartner’s Mingay.

Beyond the Data Center

KPMG’s IT team has taken the lead in driving green IT initiatives that go well beyond the data center. For example, IT pitched a 500 kW solar array to the real estate group for its Montvale, N.J., campus after sustainable IT leader Darren McGann heard a colleague at Microsoft discussing a solar project it had completed. The IT group also raised average operating temperatures to 79 degrees in its data centers after seeing a demonstration where Microsoft moved part of a data center into an outdoor tent to demonstrate the robustness of IT equipment. McGann claims that KPMG’s data center was the first to generate power using gas micro turbine generators, and it reuses the waste heat in combination with absorption chillers to help cool the data center.

Gas-powered microturbines are more economical than fuel cells, but you still can’t make the business case on energy savings alone, says Mingay. In most locations, you still need tax breaks, capital allowances or the policy tool known as “feed-in tariffs” to make them economically viable, although it can be more attractive in locations where natural gas is inexpensive.

IT is uniquely positioned to help make the business case for, and drive, other green initiatives that go beyond the data center, PCs and office automation equipment. At KPMG, for example, it was IT business process analysis and automation expertise that helped propel an initiative to create a paperless audit system.

IT also collaborated with the in-house travel group to encourage videoconferencing in lieu of travel. To do that, it created a JavaScript program within the travel services portal that determines when users request travel between locations where videoconferencing services are available and offers to redirect users to a videoconferencing reservations page before they complete their flight arrangements. “That increased videoconferencing by 85% in just three months,” which increased productivity by reducing travel downtime, saved energy and reduced the company’s carbon footprint by 60% in one fell swoop, McGann says. “What I’m most proud of is that the IT organization has been a constant contributor by engaging with nontraditional partners within the organization,” he adds.
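The portal logic described above is easy to sketch. KPMG's implementation is JavaScript inside its travel services portal; the version below is a Python sketch with invented site names:

```python
# Sites assumed (for illustration) to have videoconferencing rooms available.
VIDEOCONF_SITES = {"Montvale", "London", "Chicago"}

def suggest_videoconference(origin, destination):
    """Offer the videoconferencing redirect only when both endpoints have rooms."""
    return origin in VIDEOCONF_SITES and destination in VIDEOCONF_SITES

print(suggest_videoconference("Montvale", "London"))  # True
print(suggest_videoconference("Montvale", "Omaha"))   # False
```

The design point is the interception: the check runs before the flight booking is completed, when the traveler can still change course.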

Many companies have adopted virtualization-by-default policies, and IT needs to extend that to “design green by default,” says McGann. It’s easier to make the business case when starting fresh, because retrofits are more expensive. But it’s also important to have a clear understanding of what sort of commitment management is making toward energy and carbon reduction and align projects to meet those strategic goals.

“Many projects will align with an energy reduction goal or a leadership commitment the CEO has made,” he says. “And once you connect the dots you can get leadership to go forward.”





How startups should sell to the enterprise

CIOs sound off at DEMO Fall 2013 about the best – and worst – ways to establish business with them.

DEMO is all about startups pitching their new products, but a panel discussion on Thursday turned the tables, with CIOs telling startups what they can do to win business in the enterprise.

The panel was moderated by CITEWorld editorial director Matt Rosoff and featured Dish Network CIO Mike McClaskey, BDP International global CIO Angela Yochem, EchoSign co-founder Jason Lemkin, and Workday strategic CIO Steven John.

Citing the growing reach of technology into new departments of the enterprise, Lemkin warned that buying decisions for technology all eventually come back to the CIO. Even though other areas of the company may make small or even moderately sized purchases, CIOs may take note of the vendors that circumvent them when making the sale, and could block them from any future business. What seems like a short-term win could turn out to be a long-term problem.

However, some startups have been able to sell to enterprise customers after taking an alternative route. Yammer, for example, was mentioned during the panel as a company that gained traction with lower-level employees, often without the knowledge of the CIO. Once Yammer started to attract attention from executives, it embraced CIOs and scaled to meet their needs, satisfying both an enterprise customer’s users and decision makers.

Similarly, McClaskey mentioned the opportunity afforded through consumer technology outlets, such as the Google Play and Apple App Stores. Almost all CIOs use multiple devices that access these outlets, and if they come across a potentially useful enterprise technology while using a personal device, they’re more likely to seek more information about it later on.

Flexibility was mentioned as a key aspect for startup companies that are lucky enough to land large customers early on. McClaskey cited two cases in which Dish opted to work with startups. One of the most important aspects of the relationships was the younger companies’ willingness to incorporate Dish’s input on the product. The startups’ product development teams worked directly with Dish to help adjust aspects of the product to accommodate their needs. This is important not only to sustain business with early customers, but to help attract new customers in the future.

Lemkin cited the importance of use cases and references when trying to attract customers as a young company. Those that are willing to adapt in order to establish strong relationships early on will be more likely to build similar relationships with new customers.

The panel also discussed the importance of maintaining trust with customers and other connections throughout the IT industry. John cited trust as a main factor in all business decisions, from engaging in new business to hiring employees. Decision makers in large enterprises are more likely to side with people who they can trust, whether that trust comes directly from previous business or from word-of-mouth recommendations from others in the industry.

Yochem pointed out the benefit of establishing a good relationship even when failing to complete a sale. Regardless of how a discussion on new business goes, the connection made in the process is still valuable. Another important and often overlooked consideration is asking around for any other potential customers. Even if a potential customer company isn’t in a position to make a purchase, they might know of others who are. Yochem advised salespeople at startups to end every conversation with a potential customer by asking if they know anyone else they should talk to.

The panelists also gave valuable insight into the most effective, and ineffective, methods of engaging an enterprise customer. McClaskey mentioned being “bombarded” with cold calls, emails and webinar invites from sales representatives, and often even from third-party companies hired to do this work for them. These requests often receive the lowest priority, sometimes for no other reason than that they get lost in the white noise created by all the companies that want their business.

The best way to connect with a CIO, according to McClaskey, is through mutual connections – analysts, partners, references, or other companies they’ve done business with.

Lemkin also warned against bringing in outside employees to head up sales operations too early. Most entrepreneurs won’t have a clear idea of what they want in a president or vice president of sales until they’ve made a handful of meaningful sales on their own. He advised startups to hold off on hiring a sales vice president until they’ve made two sales to CIO-type customers themselves, and to learn from that experience.

For young companies selling tech products and services, Lemkin said the CIO is their best ally. Following his advice, and that of his colleagues, may set one startup apart from the rest of the crowd.



7 sneak attacks used by today’s most devious hackers

Most malware is mundane, but these innovative techniques are exploiting systems and networks of even the savviest users

Millions of pieces of malware and thousands of malicious hacker gangs roam today’s online world preying on easy dupes. Reusing the same tactics that have worked for years, if not decades, they do nothing new or interesting in exploiting our laziness, lapses in judgment, or plain idiocy.

But each year antimalware researchers come across a few techniques that raise eyebrows. Used by malware or hackers, these inspired techniques stretch the boundaries of malicious hacking. Think of them as innovations in deviance. Like anything innovative, many are a measure of simplicity.


Take the 1990s Microsoft Excel macro virus that silently, randomly replaced zeros with capital O’s in spreadsheets, immediately transforming numbers into text labels with a value of zero — changes that went, for the most part, undetected until well after backup systems contained nothing but bad data.

Today’s most ingenious malware and hackers are just as stealthy and conniving. Here are some of the latest techniques of note that have piqued my interest as a security researcher and the lessons learned. Some stand on the shoulders of past malicious innovators, but all are very much in vogue today as ways to rip off even the savviest users.

Stealth attack No. 1: Fake wireless access points

No hack is easier to accomplish than a fake WAP (wireless access point). Anyone using a bit of software and a wireless network card can advertise their computer as an available WAP that is then connected to the real, legitimate WAP in a public location.

Think of all the times you — or your users — have gone to the local coffee shop, airport, or public gathering place and connected to the “free wireless” network. Hackers at Starbucks who call their fake WAP “Starbucks Wireless Network” or at the Atlanta airport call it “Atlanta Airport Free Wireless” have all sorts of people connecting to their computer in minutes. The hackers can then sniff unprotected data from the data streams sent between the unwitting victims and their intended remote hosts. You’d be surprised how much data, even passwords, are still sent in clear text.

The more nefarious hackers will ask their victims to create a new access account to use their WAP. These users will more than likely use a common log-on name or one of their email addresses, along with a password they use elsewhere. The WAP hacker can then try using the same log-on credentials on popular websites — Facebook, Twitter, Amazon, iTunes, and so on — and the victims will never know how it happened.

Lesson: You can’t trust public wireless access points. Always protect confidential information sent over a wireless network. Consider using a VPN connection, which protects all your communications, and don’t recycle passwords between public and private sites.

Stealth attack No. 2: Cookie theft

Browser cookies are a wonderful invention that preserves “state” when a user navigates a website. These little text files, sent to our machines by a website, help the website or service track us across our visit, or over multiple visits, enabling us to more easily purchase jeans, for example. What’s not to like?

Answer: When a hacker steals our cookies and, by virtue of doing so, becomes us — an increasingly frequent occurrence these days. More precisely, the thief becomes authenticated to our websites as if they were us and had supplied a valid log-on name and password.

Sure, cookie theft has been around since the invention of the Web, but these days tools make the process as easy as click, click, click. Firesheep, for example, is a Firefox browser add-on that allows people to steal unprotected cookies from others. When used with a fake WAP or on a shared public network, cookie hijacking can be quite successful. Firesheep will show all the names and locations of the cookies it is finding, and with a simple click of the mouse, the hacker can take over the session (see the Codebutler blog for an example of how easy it is to use Firesheep).

Worse, hackers can now steal even SSL/TLS-protected cookies and sniff them out of thin air. In September 2011, an attack labeled “BEAST” by its creators proved that even SSL/TLS-protected cookies can be obtained. Further improvements and refinements this year, including the well-named CRIME, have made stealing and reusing encrypted cookies even easier.

With each released cookie attack, websites and application developers are told how to protect their users. Sometimes the answer is to use the latest crypto cipher; other times it is to disable some obscure feature that most people don’t use. The key is that all Web developers must use secure development techniques to reduce cookie theft. If your website hasn’t updated its encryption protection in a few years, you’re probably at risk.
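One concrete developer-side mitigation — a sketch, not a complete policy (the cookie name and value are illustrative) — is to mark session cookies so they are never sent over clear-text connections and never exposed to page scripts:

```python
from http.cookies import SimpleCookie

# Build a Set-Cookie header for a session cookie with defensive flags:
# Secure (sent over HTTPS only), HttpOnly (invisible to JavaScript),
# and SameSite=Strict (not sent on cross-site requests).
cookie = SimpleCookie()
cookie["session"] = "opaque-token-value"
cookie["session"]["secure"] = True
cookie["session"]["httponly"] = True
cookie["session"]["samesite"] = "Strict"

header = cookie["session"].OutputString()
print(header)
```

The Secure flag in particular would have blunted Firesheep-style sniffing, since the cookie never crosses the wire unencrypted.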

Lessons: Even encrypted cookies can be stolen. Connect to websites that use secure development techniques and up-to-date crypto, and make sure your own HTTPS websites use the latest protocol versions, including TLS 1.2.

Stealth attack No. 3: File name tricks

Hackers have been using file name tricks to get us to execute malicious code since the beginning of malware. Early examples included naming the file something that would encourage unsuspecting victims to click on it (like AnnaKournikovaNudePics) and using multiple file extensions (such as AnnaKournikovaNudePics.Zip.exe). To this day, Microsoft Windows and other operating systems readily hide “well known” file extensions, which makes AnnaKournikovaNudePics.Gif.Exe look like AnnaKournikovaNudePics.Gif.

Years ago, virus programs known as “twins,” “spawners,” or “companion viruses” relied on a little-known feature of Microsoft Windows/DOS, where even if you typed in the file name Start.exe, Windows would look for and, if found, execute Start.com instead. Companion viruses would look for all the .exe files on your hard drive and create a virus with the same name as the EXE, but with the file extension .com. This has long since been fixed by Microsoft, but its discovery and exploitation by early hackers laid the groundwork for inventive ways to hide viruses that continue to evolve today.

Among the more sophisticated file-renaming tricks currently employed is the use of Unicode characters that affect the output of the file name users are presented. For example, the Unicode character U+202E, called the Right to Left Override, reverses the on-screen order of the characters that follow it, so a file actually named AnnaKournikovaNude[U+202E]iva.exe is displayed by many systems as AnnaKournikovaNudeexe.avi.
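The trick fits in a few lines. The filename below is the article’s hypothetical example, and the “display” step simulates what a bidi-aware file manager would render:

```python
# U+202E (Right-to-Left Override) makes the tail of a filename render
# reversed, so an .exe can masquerade as an .avi in many file listings.
RLO = "\u202e"

# The real on-disk name: "AnnaKournikovaNude" + RLO + "iva.exe"
real_name = f"AnnaKournikovaNude{RLO}iva.exe"

# A display layer that honors the override renders the run after the
# RLO reversed; we simulate that here by reversing the tail ourselves.
prefix, _, tail = real_name.partition(RLO)
displayed = prefix + tail[::-1]

print(repr(real_name))   # the true name still ends in .exe
print(displayed)         # but the user sees ...exe.avi
```

The operating system executes the real name; only the human sees the disguise.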

Lesson: Whenever possible, make sure you know the real, complete name of any file before executing it.

Stealth attack No. 4: Location, location, location

Another interesting stealth trick that uses an operating system against itself is a file location trick known as “relative versus absolute.” In legacy versions of Windows (Windows XP, 2003, and earlier) and other early operating systems, if you typed in a file name and hit Enter, or if the operating system went looking for a file on your behalf, it would always start with your current folder or directory location first, before looking elsewhere. This behavior might seem efficient and harmless enough, but hackers and malware used it to their advantage.

For example, suppose you wanted to run the built-in, harmless Windows calculator (calc.exe). It’s easy enough (and often faster than using several mouse clicks) to open up a command prompt, type in calc.exe and hit Enter. But malware could create a malicious file called calc.exe and hide it in the current directory or your home folder; when you tried to execute calc.exe, it would run the bogus copy instead.
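The defense can be sketched simply (file and directory names here are illustrative): resolve a command to an absolute path within trusted system directories before executing it, rather than letting the lookup start in the current directory.

```python
import os

def resolve_trusted(command, trusted_dirs):
    """Return an absolute path for `command`, searching only trusted
    system directories -- never the current working directory."""
    for d in trusted_dirs:
        candidate = os.path.join(d, command)
        if os.path.isfile(candidate):
            return candidate
    raise FileNotFoundError(f"{command} not found in trusted locations")

# Example: only accept calc.exe if it lives in the real system folder.
# path = resolve_trusted("calc.exe", [r"C:\Windows\System32"])
# subprocess.run([path])  # launch by absolute path only
```

A planted calc.exe in the working directory is simply never consulted, which is the behavior modern Windows versions enforce by default.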

I loved this fault as a penetration tester. Oftentimes, after I had broken into a computer and needed to elevate my privileges to Administrator, I would take an unpatched version of a known, previously vulnerable piece of software and place it in a temporary folder. Most of the time all I had to do was place a single vulnerable executable or DLL, while leaving the entire, previously installed patched program alone. I would type the program executable’s file name from within my temporary folder, and Windows would load my vulnerable Trojan executable instead of the more recently patched version. I loved it — I could exploit a fully patched system with a single bad file.

Linux, Unix, and BSD systems have had this problem fixed for more than a decade. Microsoft fixed the problem in 2006 with the releases of Windows Vista/2008, although the problem remains in legacy versions because of backward-compatibility issues. Microsoft has also been warning and teaching developers to use absolute (rather than relative) file/path names within their own programs for many years. Still, tens of thousands of legacy programs are vulnerable to location tricks. Hackers know this better than anyone.

Lesson: Use operating systems that enforce absolute directory and folder paths, and look for files in default system areas first.

Stealth attack No. 5: Hosts file redirect

Unbeknownst to most of today’s computer users is the existence of a DNS-related file named Hosts. Located under C:\Windows\System32\Drivers\Etc in Windows, the Hosts file can contain entries that link typed-in domain names to their corresponding IP addresses. The Hosts file predates DNS as a way for hosts to resolve name-to-IP address lookups locally, without having to contact DNS servers and perform recursive name resolution. For the most part, DNS functions just fine, and most people never interact with their Hosts file, though it’s there.

Hackers and malware love to write their own malicious entries to Hosts, so that when someone types in a popular domain name, they are redirected somewhere else more malicious. The malicious redirection often contains a near-perfect copy of the original desired website, so that the affected user is unaware of the switch.
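Checking for such entries is easy to automate. Here is a minimal sketch (the watched-domain list, addresses, and file contents are illustrative): scan Hosts-style lines and flag any that pin a well-known domain to an unexpected address.

```python
# Domains we never expect to see pinned in the Hosts file
# (illustrative list; a real checker would watch the sites you use).
WATCHED = {"example.com", "www.example.com"}

def suspicious_entries(hosts_text):
    flagged = []
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments
        if not line:
            continue
        parts = line.split()
        ip, names = parts[0], parts[1:]
        for name in names:
            # Loopback pins are common and usually benign; anything
            # else for a watched domain deserves a closer look.
            if name.lower() in WATCHED and not ip.startswith("127."):
                flagged.append((ip, name))
    return flagged

sample = """
127.0.0.1   localhost
203.0.113.9 www.example.com   # attacker-added pin
"""
print(suspicious_entries(sample))
```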

This exploit is still in wide use today.

Lesson: If you can’t figure out why you’re being maliciously redirected, check out your Hosts file.

Stealth attack No. 6: Waterhole attacks

Waterhole attacks received their name from their ingenious methodology. In these attacks, hackers take advantage of the fact that their targeted victims often meet or work at a particular physical or virtual location. Then they “poison” that location to achieve malicious objectives.

For instance, most large companies have a local coffee shop, bar, or restaurant that is popular with company employees. Attackers will create fake WAPs in an attempt to get as many company credentials as possible. Or the attackers will maliciously modify a frequently visited website to do the same. Victims are often more relaxed and unsuspecting because the targeted location is a public or social portal.

Waterhole attacks became big news this year when several high-profile tech companies, including Apple, Facebook, and Microsoft, among others, were compromised because of popular application development websites their developers visited. The websites had been poisoned with malicious JavaScript redirects that installed malware (sometimes zero days) on the developers’ computers. The compromised developer workstations were then used to access the internal networks of the victim companies.

Lesson: Make sure your employees realize that popular “watering holes” are common hacker targets.

Stealth attack No. 7: Bait and switch

One of the most interesting ongoing hacker techniques is called bait and switch. Victims are told they are downloading or running one thing, and temporarily they are, but it is then switched out for a malicious item. Examples abound.

It is common for malware spreaders to buy advertising space on popular websites. The websites, when confirming the order, are shown a nonmalicious link or content. The website approves the advertisement and takes the money. The bad guy then switches the link or content with something more malicious. Often they will code the new malicious website to redirect viewers back to the original link or content if viewed by someone from an IP address belonging to the original approver. This complicates quick detection and take-down.
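The review-time cloaking described above boils down to a simple server-side decision; the sketch below (IP addresses and content strings are, of course, illustrative) shows why a spot check from the reviewer’s own machine finds nothing wrong:

```python
# Sketch of ad-network cloaking: serve the approved creative to the
# reviewer's known addresses and the switched-in payload to everyone else.
APPROVER_IPS = {"198.51.100.7"}   # illustrative reviewer address

def serve_ad(client_ip):
    if client_ip in APPROVER_IPS:
        return "original, approved creative"
    return "switched-in malicious redirect"

print(serve_ad("198.51.100.7"))  # the reviewer sees the clean version
print(serve_ad("192.0.2.55"))    # everyone else gets the payload
```

Effective take-down therefore requires re-checking content from addresses the attacker cannot predict.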

The most interesting bait-and-switch attacks I’ve seen of late involve bad guys who create “free” content that can be downloaded and used by anyone. (Think administrative console or a visitor counter for the bottom of a Web page.) Often these free applets and elements contain a licensing clause that says, in effect, “May be freely reused as long as the original link remains.” Unsuspecting users employ the content in good faith, leaving the original link untouched. Usually the original link contains nothing but a graphics emblem or something else trivial and small. Later, after the bogus element has been included in thousands of websites, the original malicious developer swaps the harmless content for something more malicious (like a harmful JavaScript redirect).

Lesson: Beware of any link to any content not under your direct control because it can be switched out on a moment’s notice without your consent.

Stealth fallout: Total loss of control

Hackers have been using stealth methods to hide their maliciousness since the earliest days of malware. Heck, the first IBM-compatible PC virus, Pakistani Brain, from 1986, redirected inquiring eyes to a copy of the unmodified boot sector when viewed by disk editors.

When a hacker modifies your system in a stealthy way, it isn’t your system anymore — it belongs to the hackers. The only defenses against stealth attacks are the same defenses recommended for everything (good patching, don’t run untrusted executables, and so on), but it helps to know that if you suspect you’ve been compromised, your initial forensic investigations may be circumvented and fought against by the more innovative malware out there. What you think is a clean system and what really is a clean system may all be controlled by the wily hacker.

MCTS Training, MCITP Training

Best Microsoft MCTS Certification,
Microsoft MCITP Training at


Posted in: TECH


Google relaxes access controls to Apps docs

People without a Google Account will be able to view documents stored in the Apps suite

Adding convenience possibly at the expense of security, Google will now let people without a Google Account view documents stored in its Apps cloud suite.

The move is meant to simplify how Apps customers share files with outsiders.

Until now, Apps customers could only grant document access to users with a Google Account. People who didn’t have an account or who weren’t logged in to their account couldn’t get into the documents even when invited to do so via an emailed link from an Apps user.

That will no longer be the case, Google said on Monday.

The change applies to word processing files created with Docs, presentations created with Slides and charts created with Drawings, which are all Google cloud productivity apps included in Apps, the company’s workplace collaboration and communication suite.

“As a result of this change, files shared outside your domain to an email address not linked to an existing Google Account can be viewed without having to sign in or create a new Google Account,” reads the Google blog post.

These recipients will only be able to view the file. They won’t be able to edit or add comments to it, actions that still require the recipient to be logged into a Google Account.

Google warns that “because no sign in is required, anyone may view the file with this sharing link.” In other words, the file could end up being viewed by unintended users who somehow get their hands on the link. This possibility is erased if the recipient creates a Google Account, at which point the link becomes unusable for others.

The company started to roll out the feature on Monday to Apps customers that are on the “rapid release” track, which delivers new and changed functions to administrators and end users as soon as they go live. The feature will later reach Apps customers on the “scheduled release” track, which delivers updates once a week and makes them available to administrators first.

Apps administrators will be able to disable this feature for their users on their domain control console.




Critics slam World Wide Web Consortium over inclusion of DRM in HTML5

New charter for HTML Working Group leaves controversial Encrypted Media Extension proposal in play

The latest version of the World Wide Web Consortium’s HTML Working Group charter includes provisions for ongoing work on restrictive content protection systems – a decision that has angered groups like the Electronic Frontier Foundation and Free Software Foundation.

The main opposition centers on the controversial Encrypted Media Extension proposal, which would build robust digital rights management capabilities directly into future HTML standards. While EME is still some distance from being officially accepted, its inclusion in the latest draft charter makes that outcome more likely.


DRM, which is used to control access to online media content like streaming video, is a contentious topic, particularly among free and open-source software advocates.

EME, the FSF wrote in a form letter earlier this year, would expose users to a wide array of restrictions on their web experience.

“[EME] would fly in the face of the W3C’s principle of keeping the Web royalty-free — this is simply a back door for media companies to require proprietary player software. It is willful ignorance to pretend otherwise just because the proposal does not mention particular technologies or DRM schemes by name,” the group said.

The EFF echoes the thrust of those remarks in a statement responding to the news that EME would be retained, saying on Wednesday that the group is “deeply disappointed” by the decision.

“By approving this idea, the W3C has ceded control of the “user agent” (the term for a Web browser in W3C parlance) to a third-party, the content distributor. That breaks a – perhaps until now unspoken – assurance about who has the final say in your Web experience, and indeed who has ultimate control over your computing device,” the EFF stated.

The current draft of the charter runs through June 2015, and also included provisions for work on a dual-licensing






Rivals to again review Google’s promises in EU antitrust case

The EU competition chief expects a resolution of the case involving search results for Google services ‘by next spring’

The European Union is to give Google’s competitors a chance to review the company’s latest proposals to avoid antitrust sanctions.

Google is accused of directing users to its own services in search results rather than those of competitors. In its first proposals to the European Commission to settle the antitrust case, it suggested labeling its own services as such in search results, but competitors were extremely unhappy with this. Some said it would even make matters worse.

The latest proposals from Google are significantly improved, according to the EU’s competition commissioner, Joaquin Almunia.

Almunia said a settlement is possible, and the best solution for consumers. At a special hearing in the European Parliament on Tuesday, he said he hopes to resolve the case by “next spring.” However, he did not rule out the possibility of sanctions if Google’s proposals prove unsatisfactory.

The first review process, or market test, led Almunia to doubt whether it would be possible to reach an agreement on the measures to which Google must commit.

“Many respondents during the market test said that in this Google proposal the links to rivals that would be displayed for certain categories of specialized search services were not visible enough. In my opinion, the new proposal makes these links significantly more visible.”

Google is keen to bring the case to a close, according to the company’s senior vice president and general counsel, Kent Walker.

“Given the feedback the European Commission received on our first proposal, they have insisted on further, significant changes to the way we display search results. While competition online is thriving, we’ve made the difficult decision to agree to their requirements in the interests of reaching a settlement,” Walker said.

The case has been ongoing since November 2010, but Google is also facing other investigations in the European Union. Its Motorola Mobility unit was sent a formal complaint by the European Commission for abusing its dominant position by imposing injunctions against Apple for the use of standards-essential patents.

Another probe that is at a preliminary stage concerns allegations received about some aspects of the Android ecosystem. Almunia said on Tuesday that he had not reached a decision on whether to launch a formal investigation into Android.




