Archive for April, 2013:

Google Fiber divides users into ‘the fast’ and ‘the furious’

Google’s fiber push is making the ‘have nots’ mad. That’s a good thing.

Every day is a beautiful day in the fiberhood.

The chosen ones in Kansas City, Austin and Provo are getting Internet connections that are 100 times faster than average at very low prices, thanks to Google’s Fiber project.

Unfortunately, you and I don’t live there. So we’re stuck in a bandwidth backwater.

As Internet trolls like to say: U mad, bro?

If you don’t live in one of these cities, you should be mad. Google’s Fiber project demonstrates that very high Internet speeds are possible, and that nobody except Google has had the vision or courage to make them happen.

One Internet bandwidth provider has admitted that it could deliver much faster speeds to consumers, but has decided not to. Time Warner Cable CFO Irene Esteves said in February that Time Warner is perfectly capable of “delivering 1 gigabit, 10 gigabit-per-second” Internet connectivity to consumers, but that the company just doesn’t “see the need of delivering that to consumers.” I believe Esteves’ statement accurately represents the thinking of most existing Internet providers.

Now are you mad?

The issue isn’t really that consumers don’t want faster Internet speeds. And it’s not that cable providers don’t care.

It’s really a chicken-and-egg problem.
Why your Internet is so slow

Average U.S. Internet speeds rank 12th or 13th in the world, which is pathetic for a country that invented the Internet, is home to Silicon Valley, Hollywood and data-hungry Wall Street, and has a $15 trillion annual GDP.

Other countries are pulling away. A Sony-backed service recently announced 2Gbps download speeds in Tokyo for $51 per month: twice the speed of Google Fiber, 200 times faster than the U.S. average, and at a lower price than Google Fiber.

Now are you mad?

Gigabit fiber Internet access is affordable, but only if everybody gets it. But everybody isn’t going to get it unless it’s affordable.

And that’s why we can’t have nice things.

At least, that’s what Esteves really means when she says that users don’t want faster speeds. Providing consumers with the faster speeds that Time Warner currently sells to some business customers is very expensive, because only a few customers pay for all of it.

It’s not that Time Warner Cable’s customers don’t want fast Internet. They don’t want Time Warner Cable’s price for fast Internet.

However, if you lay fiber to every home in a city, and if a majority of homes sign up to use it, the cost can come way down. And that’s what Google Fiber is all about. It’s about making a bet on the future and investing heavily to bootstrap widespread use and high demand.

Google’s Fiber project involves the actual digging of trenches and the actual laying of fiber optic cables all the way to homes. There are innumerable logistical and legal hurdles to overcome for each city.

Google is already providing the service in Kansas City and is still expanding into new neighborhoods there. The company recently announced that it would roll the service out to Austin, Texas, and then Provo, Utah.

Google offers consumers three “plans.” The first provides Internet access comparable in speed to ordinary broadband, and it’s free. The second provides 1Gbps speeds and 1TB of Google Drive space for $70 per month. The third adds TV service plus a 2TB DVR box for a total of $120 per month.

Getting Google Fiber service is just like getting cable Internet service (except 100 times faster). You get a Wi-Fi capable router, and you plug your PC into it via Ethernet for the full-speed experience.

Google is spending $84 million to build the infrastructure necessary to serve 149,000 Kansas City customers. That works out to roughly $564 per customer, for you math majors. (If that sounds like a lot of money, consider that the infrastructure gives you 100-times-faster Internet for the rest of your life for the price of an iPad. Still, customers don’t have to pay for it up front; Google is doing that.) And it gets cheaper per customer with each new person who signs up.
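
Spelled out, that per-customer arithmetic is straightforward; here is a minimal Python sketch using only the $84 million and 149,000-customer figures quoted above:

    # Rough per-customer cost of the Kansas City build-out,
    # using the figures quoted above ($84M to serve ~149,000 customers).
    build_cost = 84_000_000   # total infrastructure spend, in dollars
    customers = 149_000       # customers the build-out is sized to serve

    print(f"Cost per customer: ${build_cost / customers:,.2f}")  # about $563.76

    # The per-customer cost keeps falling as more people sign up on the same plant.
    for extra in (10_000, 50_000, 100_000):
        print(f"With {extra:,} more sign-ups: ${build_cost / (customers + extra):,.2f}")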

Goldman Sachs estimates that it would cost $140 billion to deploy Google Fiber nationwide.

To put that in perspective, that one-time investment would give entrepreneurs in every state in the union a radical advantage globally, ignite an economic boom comparable to the nationwide deployment of electricity 100 years ago and enable incredible new services, all for less than what the U.S. loses each year in offshore tax havens.

Now are you mad?
Why mad users are the best thing about Google Fiber

It’s unlikely that Google will lay fiber to every city in the U.S., and less likely still that it will do so internationally. And it doesn’t need to.

Google Fiber is already “inspiring” ISPs to boost speeds and investment. Google may be triggering an arms race for high-speed Internet connectivity, because it’s re-setting expectations about how fast the Internet should be.

This matters more and more as HD movies and TV become mainstream. Right now, Netflix alone consumes one-third of all the download bandwidth in the U.S. at peak times.

Hollywood and other movies-on-demand services had better get busy offering compelling services. More than half the upload bandwidth in the U.S. is consumed by BitTorrent.

I think the minority of providers who figure out how to offer vastly higher speeds at very low cost will survive, and the Time Warners will get out of the ISP business for good.

No, AT&T didn’t announce gigabit fiber in Austin

Hours after Google announced that Austin would get the Google Fiber treatment, AT&T (which is headquartered in Dallas) announced that it would build a gigabit fiber network of its own in Austin.

Or, at least that’s what the news reports would have you believe. But if you look at the press release, it was really a passive-aggressive bit of whining about Google getting special treatment from Austin authorities.

Instead of announcing a plan to build fiber optic connectivity in Austin, AT&T actually announced that “it is prepared to build an advanced fiber optic infrastructure in Austin,” according to the announcement press release.

“Prepared to build” does not mean “plans to build.”

Then the whining began: AT&T’s plans “anticipate it will be granted the same terms and conditions as Google on issues such as geographic scope of offerings, rights of way, permitting, state licenses and any investment incentives.”

The release ended with this zinger: “Our potential capital investment will depend on the extent we can reach satisfactory agreements.”

In other words, the whole reason for AT&T’s press release was not to announce the intention to build fiber optic gigabit Internet connectivity, but instead to complain about preferential treatment of Google by local authorities.

AT&T has a point. Local, state and federal regulations and restrictions are a big part of why our Internet speeds are so slow. And that’s yet another reason why Google Fiber is so brilliant.
Google is simply smarter than AT&T

Rather than approaching individual cities and begging them for permission to lay fiber, Google held a big contest and said, in effect: “OK, we’re going to pick a city to gain a massive economic boost. You want it? What are you going to do for us?”

Then it started choosing from among the 1,100 applicant cities based on which ones were most serious about making Google Fiber possible.

In fact, Google Fiber triggered a gold rush of entrepreneurial investment and activity.

One enterprising local even rents out a Google Fiber-connected home at a premium on Airbnb and calls it “Hacker House.”

Google Fiber is creating a lot of hype and attention. It’s making people realize that affordable, ultra high-speed Internet connectivity is possible.

It’s making people look at their local governments and ISPs and ask: Why can’t I have this?

But mostly, Google Fiber is making people mad. And that’s the right emotion in the face of the incredible waste of time, money and opportunity that occurs every day we’re held back by yesterday’s Internet speeds.

But let’s not just get mad. Let’s get fiber.


MCTS Training, MCITP Training

Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com

Hackers increasingly target shared Web hosting servers for use in mass phishing attacks

Nearly half of phishing attacks seen during the second half of 2012 involved the use of hacked shared hosting servers, APWG report says

Cybercriminals increasingly hack into shared Web hosting servers in order to use the domains hosted on them in large phishing campaigns, according to a report from the Anti-Phishing Working Group (APWG).

Forty-seven percent of all phishing attacks recorded worldwide during the second half of 2012 involved such mass break-ins, APWG said in the latest edition of its Global Phishing Survey report published Thursday.

In this type of attack, once phishers break into a shared Web hosting server, they update its configuration so that phishing pages are displayed from a particular subdirectory of every website hosted on the server, APWG said. A single shared hosting server can host dozens, hundreds or even thousands of websites at a time, the organization said.
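
Because a single configuration change exposes the same rogue path on every domain the box hosts, a hosting provider can sweep its domains for an unexpected subdirectory that suddenly starts answering requests everywhere at once. A minimal Python sketch of that kind of defensive sweep, assuming a hypothetical path name and an illustrative domain list (neither comes from the APWG report):

    # Defensive sweep: look for one rogue subdirectory answering on every hosted
    # domain at once, the server-wide compromise pattern APWG describes above.
    # The path "/~promo/" and the domain list are illustrative assumptions.
    import urllib.request
    import urllib.error

    HOSTED_DOMAINS = ["customer-site1.example", "customer-site2.example"]
    SUSPECT_PATH = "/~promo/"

    def path_is_live(domain: str, path: str, timeout: float = 5.0) -> bool:
        """Return True if the domain serves something at the given path."""
        try:
            with urllib.request.urlopen(f"http://{domain}{path}", timeout=timeout) as resp:
                return resp.status == 200
        except (urllib.error.URLError, OSError):
            return False

    hits = [d for d in HOSTED_DOMAINS if path_is_live(d, SUSPECT_PATH)]
    if hits and len(hits) == len(HOSTED_DOMAINS):
        print("Same path is live on every hosted domain; likely a server-wide compromise:", hits)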

APWG is a coalition of over 2000 organizations that include security vendors, financial institutions, retailers, ISPs, telecommunication companies, defense contractors, law enforcement agencies, trade groups, government agencies and more.

Hacking into shared Web hosting servers and hijacking their domains for phishing purposes is not a new technique, but this type of malicious activity reached a peak in August 2012, when APWG detected over 14,000 phishing attacks sitting on 61 servers. “Levels did decline in late 2012, but still remained troublingly high,” APWG said.

During the second half of 2012, there were at least 123,486 unique phishing attacks worldwide that involved 89,748 unique domain names, APWG said. This was a significant increase from the 93,462 phishing attacks and 64,204 associated domains observed by the organization during the first half of 2012.

“Of the 89,748 phishing domains, we identified 5,835 domain names that we believe were registered maliciously, by phishers,” APWG said. “The other 83,913 domains were almost all hacked or compromised on vulnerable Web hosting.”

In order to break into such servers, attackers exploit vulnerabilities in Web server administration panels like cPanel or Plesk and popular Web applications like WordPress or Joomla. “These attacks highlight the vulnerability of hosting providers and software, exploit weak password management, and provide plenty of reason to worry,” the organization said.

Cybercriminals break into shared hosting environments in order to use their resources in various types of attacks, not just phishing, APWG said. For example, since late 2012 a group of hackers has been compromising Web servers in order to launch DDoS (distributed denial-of-service) attacks against U.S. financial institutions.

In one mass attack campaign dubbed Darkleech, attackers compromised thousands of Apache Web servers and installed SSH backdoors on them. It’s not clear how the Darkleech attackers break into these servers in the first place, but vulnerabilities in Plesk, cPanel, Webmin or WordPress have been suggested as possible entry points.

 



70-450 PRO: Designing, Optimizing and Maintaining a Database Administrative Solution Using Microsoft SQL Server 2008

QUESTION 1
You work as a database administrator at Certkingdom.com. You are preparing to deploy a new database that will have 45 GB of storage space for the transaction log file and 280 GB of storage space for the database data file.
There are six 120 GB disk drives available for the database in the storage array. The disks are attached to a RAID controller that supports RAID levels 0, 1, 5 and 10.
You have received an instruction from the CIO to make sure that the transaction log’s write performance runs at optimum. The CIO has also instructed you to make sure that, in the event of a drive failure, the database and transaction log files are protected.
To achieve this goal, you decide to configure a storage solution.
Which of the following actions should you take?

A. You should consider using a RAID 1 volume as well as a RAID 5 volume in your storage configuration.
B. You should consider using a RAID 1 volume as well as a RAID 10 volume in your storage configuration.
C. You should consider using a RAID 3 volume as well as a RAID 5 volume in your storage configuration.
D. You should consider using a RAID 1 volume as well as a RAID 3 volume in your storage configuration.

Answer: A

Explanation: A RAID 1 (mirrored) volume gives the transaction log redundancy without parity overhead, which keeps its write performance high, and easily holds the 45 GB log. A RAID 5 volume across the remaining four drives yields about 360 GB of protected space for the 280 GB data file, whereas a four-drive RAID 10 volume would yield only about 240 GB.
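
The capacity arithmetic behind that answer is easy to verify; a small Python sketch, assuming the six 120 GB drives are split into a two-disk set for the log and a four-disk set for the data file:

    # Usable capacity of the candidate layouts for six 120 GB drives:
    # a 2-disk set for the 45 GB log plus a 4-disk set for the 280 GB data file.
    DRIVE_GB = 120

    def usable_gb(level: str, disks: int) -> int:
        """Approximate usable capacity for common RAID levels (ignores overhead)."""
        if level == "RAID1":    # mirrored pair
            return DRIVE_GB
        if level == "RAID5":    # one disk's worth of capacity lost to parity
            return DRIVE_GB * (disks - 1)
        if level == "RAID10":   # striped mirrors: half the raw capacity
            return DRIVE_GB * disks // 2
        raise ValueError(level)

    print(usable_gb("RAID1", 2))    # 120 GB -> holds the 45 GB log, mirrored, no parity writes
    print(usable_gb("RAID5", 4))    # 360 GB -> holds the 280 GB data file with protection
    print(usable_gb("RAID10", 4))   # 240 GB -> too small for the 280 GB data file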


QUESTION 2
You work as a database administrator at Certkingdom.com. Certkingdom.com has a database server named Certkingdom-DB04 with a SQL Server 2008 instance that hosts an extensive, mission-critical database that is constantly in use. Certkingdom-DB04 has a quad-core motherboard with four CPUs.
When it is reported that Certkingdom-DB04 often encounters CPU pressure, you receive an instruction from management to make sure that the available CPU cycles are not exhausted by online index rebuilds.
Which of the following actions should you take?

A. You should make use of the affinity I/O mask option.
B. You should make use of the optimize for ad hoc workloads option.
C. You should make use of the affinity mask option.
D. You should make use of the max degree of parallelism option.

Answer: D

Explanation: The max degree of parallelism option limits the number of processors that parallel operations, including online index rebuilds, can use, so a rebuild cannot consume all of the server’s CPU cycles.


QUESTION 3
You work as a database administrator at Certkingdom.com. Certkingdom.com has a database server named Certkingdom-DB01 with a SQL Server 2008 instance.
During routine monitoring on Certkingdom-DB01, you discover that the number of CXPACKET waits experienced by the instance is low, while the number of lazy writer waits is high.
You have been instructed to improve the performance of the instance.
Which of the following actions should you take?

A. You should consider setting up the Windows System Monitoring tool to better the performance.
B. You should consider setting up the Asynchronous database mirroring to better the performance.
C. You should consider using the SQLAGENT.OUT log to better the performance.
D. You should consider setting up the software non-uniform memory access (soft-NUMA) to better the performance.

Answer: D

Explanation:


QUESTION 4
You work as a database administrator at Certkingdom.com. Certkingdom.com has a database server named Certkingdom-DB01.
Certkingdom-DB01 is configured with 4 quad-core processors, 80 gigabytes of RAM, and multiple independent RAID volumes.
You are in the process of deploying a transactional database on the instance. It is anticipated that the transactional database will have a significant amount of INSERT, UPDATE, and DELETE activity, which includes the creation of new tables.
You receive an instruction from management to minimize contention in the storage allocation structures so that database performance is optimized and disk bandwidth is maximized.
Which of the following actions should you take?

A. You should consider enabling Server Auditing.
B. You should consider using multiple data files for the database.
C. You should consider using row-level compression.
D. You should consider using the checksum page verify option.

Answer: B

Explanation: Using multiple data files spreads allocations across the files, which reduces contention on the allocation structures (the GAM, SGAM and PFS pages) and lets the database spread its I/O across the independent RAID volumes.


QUESTION 5
You work as a database administrator at Certkingdom.com.
Certkingdom.com has informed you that a new database, named CertkingdomData, has to be installed on a SQL
Server 2008 instance. CertkingdomData is made up of several schemas, of which one will host a significant amount of
read-only reference information. Information is regularly inserted and updated on CertkingdomData.
You have received instructions from management to configure a physical database structure that enhances backup operations. Which of the following actions should you take?

A. This can be accomplished by using multiple filegroups and a single log file to set up the database.
B. This can be accomplished by using caching on the multiple data files.
C. This can be accomplished by using multiple downstream servers to create the database.
D. This can be accomplished by using the Database Engine Tuning Advisor tool to create the database.

Answer: A

Explanation:



Microsoft 70-693 exam study guides

Microsoft 70-693 simulator exam
70-693 MCITP
Microsoft certification validates the skills that today’s IT professionals need to advance their careers. Microsoft offers a wide range of certifications, such as Pro: Windows Server 2008 R2, Virtualization Administrator, along with other networking and development certifications. You just need to decide which certification is best suited to your career. Pro: Windows Server 2008 R2, Virtualization Administrator is an internationally recognized certification offered by Microsoft worldwide, and 70-693 is one of its most important exam codes.

You should expand your work experience and take advantage of Microsoft 70-693 training resources. Everything from practice tests, sample questions and answers, braindumps and a free study guide will help you become a Microsoft certified professional.

How do you pass your Pro: Windows Server 2008 R2, Virtualization Administrator exam?
70-693 MCITP
Microsoft is a leader in IT certification examination guides, exam simulations and study guide certification. Look for the upgraded versions of Microsoft 70-693 with the latest questions and PDF exams. Upgrade exams are necessary because of changes in the Microsoft 70-693 exam pattern.

We developed the 70-693 Microsoft exam materials with the help of our highly certified professionals, according to the latest Microsoft updates. Our study guide helps you pass your exam on your first attempt with high scores and become a Microsoft MCITP certified professional. You can download the certification test and start preparing for your MCITP 70-693 right now. This certification exam preparation guide not only helps you pass your MCITP 70-693 exam but also helps you understand the purpose of the exam.

MCTS Certification, MCITP Certification

 

Microsoft Certification, Microsoft Exam and over 2,000+
Exams with Lifetime Access Membership at http://www.actualkey.com

All Microsoft exam products are backed by a 100 percent money-back guarantee in case you do not pass your certification exam on your first attempt. Our answer lab contains the answers to every question that you might be asked in your Pro: Windows Server 2008 R2, Virtualization Administrator exam.

What’s inside the Microsoft 70-693 preparation kit?
70-693 MCITP
The Microsoft MCITP Pro: Windows Server 2008 R2, Virtualization Administrator exam is one of the core requirements of the 70-693 Microsoft certification. Passing the MCITP Pro: Windows Server 2008 R2, Virtualization Administrator exam is easy. Microsoft designed this exam preparation guide so that you do not need to search for other books and helping material about 70-693. This examination guide contains everything you need to pass your 70-693 Microsoft exam.

Microsoft offers you a comprehensive certification test solution to help you become a Microsoft certified professional. This certification preparation guide comes with a free study guide, sample questions and answers, PDF exams, braindumps and an answer lab that give you the experience of the actual 70-693 Microsoft certification exam. The preparation kit also contains study notes, a 70-693 Microsoft PDF, a 70-693 download, a Microsoft 70-693 practice test and a 70-693 Microsoft review.

Come & Join Us
70-693 MCITP
If you have decided to become a Microsoft 70-693 certified professional, we are here to help you achieve your goal. We know what you need to pass your MCITP 70-693 exams. Our commitment is to provide you with quality braindumps, practice tests, questions and answers, study guides, tutorials and other course-related material. Get everything you need to pass your 70-693 Microsoft exam.

1. Your environment includes a Windows Server 2008 R2 Hyper-V failover cluster and a single Windows Server 2008 R2 Hyper-V server. You are designing a migration strategy. You need to ensure that you can perform a SAN migration to move virtual machines (VMs) from the single server into the failover cluster. Which two actions should you perform? (Each correct answer presents part of the solution. Choose two.)
A. Add the Storage Manager for SANs feature.
B. Install a Virtual Disk Service (VDS) hardware provider.
C. Use Cluster Shared Volumes (CSVs) to store the VM files.
D. Install Microsoft System Center Virtual Machine Manager (VMM) 2008 R2.
Answers: BD
Free download: Microsoft 70-693


70-410 Microsoft Certified Solutions Associate (MCSA): Windows Server 2012

QUESTION 1
You work as an administrator at Certkingdom.com. The Certkingdom.com network consists of a single domain
named Certkingdom.com. All servers on the Certkingdom.com network have Windows Server 2012 installed.
Certkingdom.com has a server, named Certkingdom-SR07, which has two physical disks installed. The C: drive
hosts the boot partition, while the D: drive is not being used. Both disks are online.
You have received instructions to create a virtual machine on Certkingdom-SR07. Subsequent to creating
the virtual machine, you have to connect the D: drive to the virtual machine.
Which of the following is TRUE with regards to connecting a physical disk to a virtual machine?

A. The physical disk should not be online.
B. The physical disk should be uninstalled and re-installed.
C. The physical disk should be configured as a striped disk.
D. The physical disk should be configured as a mirrored disk.

Answer: A

Explanation: A physical (pass-through) disk must be taken offline on the host before it can be attached to a virtual machine; Hyper-V cannot attach a disk that is online.


QUESTION 2
You work as a senior administrator at Certkingdom.com. The Certkingdom.com network consists of a single
domain named Certkingdom.com. All servers on the Certkingdom.com network have Windows Server 2012
installed.
You are running a training exercise for junior administrators. You are currently discussing the new
VHD format called VHDX.
Which of the following is TRUE with regards to VHDX? (Choose all that apply.)

A. It supports virtual hard disk storage capacity of up to 64 GB.
B. It supports virtual hard disk storage capacity of up to 64 TB.
C. It does not provide protection against data corruption during power failures.
D. It has the ability to store custom metadata about the file that the user might want to record.

Answer: B,D

Explanation: VHDX supports virtual hard disks of up to 64 TB and can store custom metadata about the file. It also protects against data corruption during power failures by logging updates to the VHDX metadata structures, which is why options A and C are incorrect.


QUESTION 3
You work as a senior administrator at Certkingdom.com. The Certkingdom.com network consists of a single
domain named Certkingdom.com. All servers on the Certkingdom.com network have Windows Server 2012
installed, and all workstations have Windows 8 installed.
You are running a training exercise for junior administrators. You are currently discussing a
Windows PowerShell cmdlet that activates previously de-activated firewall rules.
Which of the following is the cmdlet being discussed?

A. Set-NetFirewallRule
B. Enable-NetFirewallRule
C. Set-NetIPsecRule
D. Enable-NetIPsecRule

Answer: B

Explanation: Enable-NetFirewallRule re-enables firewall rules that were previously disabled, whereas Set-NetFirewallRule only modifies the properties of existing rules.


QUESTION 4
You work as a senior administrator at Certkingdom.com. The Certkingdom.com network consists of a single
domain named Certkingdom.com. All servers on the Certkingdom.com network have Windows Server 2012
installed, and all workstations have Windows 8 installed.
You are running a training exercise for junior administrators. You are currently discussing the
Always Offline Mode.
Which of the following is TRUE with regards to the Always Offline Mode? (Choose all that apply.)

A. It allows for swifter access to cached files and redirected folders.
B. To enable Always Offline Mode, you have to satisfy the forest and domain functional-level
requirements, as well as schema requirements.
C. It allows for lower bandwidth usage because users are always working offline.
D. To enable Always Offline Mode, you must have workstations running Windows 7 or Windows
Server 2008 R2.

Answer: A,C

Explanation:


 


70-646 Q&A / Study Guide / 70-646 Videos / Testing Engine

QUESTION 1
You work as the Enterprise administrator at Certkingdom.com. The Certkingdom.com network has a domain
named Certkingdom.com. The servers on the Certkingdom.com network run Windows Server 2008 and all client
computers run Windows Vista.
The Certkingdom.com network contains more than 3,000 computers. Certkingdom.com wants to make use of Windows Server Update Services (WSUS) updates. You thus need to set up the appropriate storage mechanism so that it provides high availability.
Where should you store the WSUS updates?

A. In a storage subsystem as a RAID 10.
B. In a network load balancing cluster.
C. In a newly created Group Policy.
D. In a Distributed File System (DFS) link that is configured to utilize several replicating targets.

Answer: D

Explanation: You need to keep the updates on a Distributed File System (DFS) link that uses multiple replicating targets. This will ensure that the updates are highly available. DFS presents users with a virtual view of folders and files, regardless of where those files physically reside on the network.
Reference: Step 4: Set up a DFS share
http://technet.microsoft.com/en-us/library/cc708533.aspx


QUESTION 2
You work as the Enterprise administrator at Certkingdom.com. The Certkingdom.com network has a forest with two domains named us.Certkingdom.com and uk.Certkingdom.com. The functional level of the forest is set at Windows Server 2008.
A new Certkingdom.com security policy requires that the local guest accounts and administrator accounts be renamed. You have to ensure that the local guest accounts are disabled after they have been renamed.
How can this be achieved?

A. By using a custom network profile.
B. By using a Group Policy object (GPO) for every domain.
C. By using a folder redirection on all the root domain controllers.
D. By using the ServerManagerCMD tool for the root domain.

Answer: B

Explanation: You need to use a Group Policy object (GPO) for every domain. With a GPO you can rename the administrator accounts as well as rename and disable the local guest accounts. Windows Server 2003 also permits you to modify the administrator and guest account names with a Group Policy.
Reference: HOW TO: Rename the Administrator and Guest Account in Windows Server 2003
http://support.microsoft.com/kb/816109


QUESTION 3
You work as the Enterprise administrator at Certkingdom.com. The Certkingdom.com network consists of a single
Active Directory domain named Certkingdom.com. All servers on the Certkingdom.com network run Windows
Server 2008 and all client computers run Windows Vista.
Certkingdom.com has its headquarters in Chicago where you are located and a branch office in Dallas that
employs a number of helpdesk staff. You have to implement a new server named Certkingdom-SR10 in
the Dallas office. The setup policy of Certkingdom.com states that all helpdesk staff have the necessary
permissions to manage services. The helpdesk staff should also be able to configure server roles
on Certkingdom-SR10. You need to accomplish this ensuring that the helpdesk staff have the least
amount of permissions.
How can this be achieved?

A. You should make the helpdesk staff members of the global security group.
B. You should make the helpdesk staff members of the Server Operators group on Certkingdom-SR10.
C. You should make the helpdesk staff members of the Account Operators group on Certkingdom-SR10.
D. You should make the helpdesk staff members of the Administrators group on Certkingdom-SR10.

Answer: D

Explanation: Adding the helpdesk staff to the local Administrators group on Certkingdom-SR10 gives them full administrative access to that computer. A user must be a member of the Administrators group to stop and start services or install server roles.
Reference: Using Default Group Accounts
http://technet.microsoft.com/en-us/library/bb726982.aspx
Reference: Securing the Local Administrators Group on Every Desktop
http://www.windowsecurity.com/articles/Securing-Local-Administrators-Group-Every-Desktop.html


QUESTION 4
You work as the Enterprise administrator at Certkingdom.com. The Certkingdom.com network has a domain
named Certkingdom.com. The servers on the Certkingdom.com network are configured to run Windows Server
2008 and the client computers run Windows Vista.
Certkingdom.com has its headquarters in Paris and branch offices in London and Stockholm. You are in
the process of devising a file sharing policy to ensure standardization throughout the network.
Your policy needs to ensure that the Certkingdom.com offices are able to access the files using a Universal Naming Convention (UNC) path. In the event of a server failure, files should still be accessible, and minimum bandwidth should be used.
Which components need to be added to your policy?

A. You should add a DFS namespace that is domain-based and uses replication.
B. You should add the Hyper-V feature to your policy.
C. You should use failover clusters with three servers, one for each office.
D. You should add a DFS namespace that is server-based and uses replication.

Answer: A

Explanation: To comply with the requirements, you need to use a domain-based DFS namespace that uses replication. To implement a domain-based DFS namespace, the servers need to be members of the Active Directory domain. Furthermore, domain-based DFS supports multiple replicas, which also provide some fault tolerance.


QUESTION 5
You work as the Enterprise administrator at Certkingdom.com. The Certkingdom.com network has a domain
named Certkingdom.com. All servers on the Certkingdom.com network run Windows Server 2008 and all client
computers run Windows Vista.
The Certkingdom.com network contains a Windows Server 2008 failover cluster that hosts a database application. During routine monitoring you discover that the database application uses almost half of the processor and memory allocated to it. You want to make sure that this level of performance is maintained on the cluster.
How can this be achieved? (Choose TWO. Each answer forms part of the solution.)

A. By using the Windows System Resource Manager (WSRM)
B. By using event subscriptions.
C. By using the Microsoft System Center Configuration Manager (SCCM)
D. By establishing a resource-allocation policy for process-based management.
E. By establishing Performance Monitor alerts.

Answer: A,D

Explanation: You need to use Windows System Resource Manager (WSRM) and set up a
resource-allocation policy for process-based management. The Windows System Resource
Manager (WSRM) enables the allocation of resources, including processor and memory
resources, among multiple applications based on business priorities. You can set the CPU and
memory allocation policies on applications. Furthermore, Windows System Resource Manager
(WSRM) does not manage address windowing extensions (AWE) memory. It also does not
manage large page memory, locked memory, or OS pool memory.
Reference: Windows System Resource Manager Fast Facts
http://www.microsoft.com/windowsserver2003/techinfo/overview/wsrmfastfacts.mspx



70-431 Q&A / Study Guide / Testing Engine / Videos


QUESTION 1
Certkingdom has hired you as their database administrator. You create a database named Development on ABC-DB01, which hosts an instance of SQL Server 2005 Enterprise Edition.
You perform weekly maintenance and conclude that the Development database is growing by about 100 MB per month. The network users have started complaining about poor performance of queries run against the database.
There is 2 GB of RAM installed on ABC-DB01, and the database consumes 1.6 GB of RAM.
How would you determine whether additional RAM should be acquired for ABC-DB01?

A. You should consider monitoring the SQL Server: Memory Manager – Target Server Memory (KB) counter in System Monitor.
B. You should consider monitoring the SQL Server: Buffer Manager counter in System Monitor.
C. You should consider monitoring the System – Processor Queue counter in System Monitor.
D. You should consider monitoring the SQL Server: Access Methods – Page Splits/sec counter in System Monitor.

Answer: B

Explanation: The SQL Server: Buffer Manager object provides counters, such as Buffer cache hit ratio and Page life expectancy, that indicate whether the instance is under memory pressure.


QUESTION 2
You create a database on ABC-DB01, which is running an instance of SQL Server 2005 Enterprise Edition.
Certkingdom recently suffered a power outage, which forced you to restart ABC-DB01. The SQL Server (MSSQLSERVER) service now fails to start. Certkingdom wants you to troubleshoot the service failure.
What must be done to determine the cause of the service start failure?

A. You should consider reviewing the Event Viewer logs listed below:
The Event Viewer Applications log.
The Event Viewer System logs.
Microsoft Notepad should be utilized to manually view the Microsoft SQL
Server\MSSQL.1\MSSQL\LOG\ErrorLog file.

B. You should consider reviewing the Event Viewer logs listed below :
The Event Viewer Windows logs.
The Event Viewer Setup logs.
The Event Viewer Application log.

C. You should consider reviewing the Event Viewer logs listed below :
The Event Viewer Forwarded Event logs.
The Event Viewer Hardware events.
The Event Viewer Security log.

D. You should consider reviewing the Event Viewer logs listed below :
The Event Viewer Setup Events logs.
The Event Viewer Windows logs.
The Event Viewer Applications and Services logs.

Answer: A


QUESTION 3
Certkingdom has recently opened an office in Perth, where you work as the database administrator.
The Certkingdom database infrastructure runs on computers utilizing SQL Server 2005 Enterprise Edition. You create an Imports database with the backup schedule configured in the table below:

The Imports database contains a table named Incoming, which was updated a week ago. During the course of the day, a network user informs you that a table has been dropped from the Imports database at 16:10. Certkingdom wants the Incoming table restored to the Imports database.
What must be done to restore the table using minimal effort while ensuring that data loss is kept to a minimum?

A. You should consider restoring the database from the most recent differential backup.
B. You should consider having the differential of Monday and snapshot backup from Tuesday restored.
C. You should consider deletion of all differential and database snapshots except the recent backup.
D. You should consider having the database table recovered from the recent database snapshot.

Answer: A


QUESTION 4
Certkingdom hired you as the network database administrator. You create a database named Customers on ABC-DB01, which runs an instance of SQL Server 2005 Enterprise Edition.
A custom application is used to access and query the database. Users recently reported that the custom application constantly experiences deadlock conditions.
What must be done to identify the SQL Server session IDs involved in the deadlock scenario?

A. You should consider using SQL Server Profiler to monitor Error and Warning events.
B. You should consider using SQL Server Profiler to monitor Lock:Deadlock Chain events.
C. You should consider using SQL Server Profiler to monitor Objects.
D. You should consider using SQL Server Profiler to monitor performance.

Answer: B


QUESTION 5
You are the system administrator of a SQL Server 2005 Enterprise Edition server named ABC-DB01 that uses Windows Authentication mode. Certkingdom uses a custom-developed application for running queries against the database on ABC-DB01. Users complain that the custom application stops responding, and you notice that CPU utilization is at 100%.
You then try to connect to ABC-DB01 by using SQL Server Management Studio, but ABC-DB01 still does not respond. Certkingdom wants you to connect to ABC-DB01 to determine the problem.
What should be done to successfully determine the problem?

A. You should consider utilizing the osql –L command from the command prompt.
B. You should consider utilizing the sqlcmd –A command from the command prompt.
You could additionally use SQL Server Management Studio to access Database Engine Query for
connecting to DB01 with the SQL Server Authentication mode.
C. You should consider utilizing the osql –E command from the command prompt.
D. You should consider utilizing the sqlcmd –N command from the command prompt.
E. You should consider utilizing the sqlcmd –R command from the command prompt.

Answer: B

Explanation: The sqlcmd –A command opens a dedicated administrator connection (DAC), which lets you connect to the instance even when it is not responding to normal connections.



Google-led group warns of ‘patent privateers’

BlackBerry, Red Hat, Google and EarthLink say businesses use patent trolls as mercenaries to harass the competition

Patent trolls are increasingly becoming a weapon some companies can use to harm or harass their competitors, according to public comments jointly submitted today to the Federal Trade Commission and the Justice Department by lawyers for Google, Red Hat, BlackBerry and EarthLink.

The comments detail what the companies say is a rising tide of so-called “patent privateering” and call for a large-scale government probe of the matter. The term refers to the practice of selling patents to a patent-assertion entity (or patent troll), which enables the troll to turn around and sue a competitor without the original company having to expose itself to negative publicity or countersuits.

Google senior competition counsel Matthew Bye explained why the process works in an official blog post.

“Trolls use the patents they receive to sue with impunity – since they don’t make anything, they can’t be countersued. The transferring company hides behind the troll to shield itself from litigation, and sometimes even arranges to get a cut of the money extracted by troll lawsuits and licenses,” he wrote.

What’s more, according to the companies, patent privateering can be used to circumvent fair, reasonable and non-discriminatory licensing agreements – exposing businesses that made good-faith decisions to create products based on a given technology to infringement suits by trolls.

Google and its co-signers urged an FTC investigation into the practice, saying that the extent of patent privateering and its effects is difficult to quantify without additional information.

“The secrecy in which PAEs cloak their activities exacerbates all of these concerns and leaves the public without information needed to assess the likely competitive effects of patent outsourcing practices,” the companies said.

Google recently announced an Open Patent Non-Assertion Pledge, saying that it will agree never to sue over the use of some designated patents unless attacked first. The first 10 patents in the program all relate to MapReduce, a big data processing model.
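
For readers unfamiliar with the model those patents cover: MapReduce expresses a computation as a map step that emits key-value pairs and a reduce step that aggregates the values collected for each key. The toy word-count sketch below illustrates only the programming model, not Google’s patented, distributed implementation:

    # Toy illustration of the MapReduce programming model: word count.
    # Only the map/shuffle/reduce flow is shown; there is no distributed engine here.
    from collections import defaultdict

    def map_phase(document):
        """Map: emit a (word, 1) pair for every word in the document."""
        for word in document.split():
            yield word.lower(), 1

    def reduce_phase(word, counts):
        """Reduce: sum the counts collected for one word."""
        return word, sum(counts)

    documents = ["the quick brown fox", "the lazy dog", "the fox"]

    grouped = defaultdict(list)            # shuffle: group intermediate values by key
    for doc in documents:
        for word, count in map_phase(doc):
            grouped[word].append(count)

    results = dict(reduce_phase(w, c) for w, c in grouped.items())
    print(results)   # {'the': 3, 'quick': 1, 'brown': 1, 'fox': 2, 'lazy': 1, 'dog': 1}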


 


Windows XP decline stalls as users hold onto aged OS, flout 2014 deadline

A third of all Windows users could still be running XP when Microsoft pulls patch plug in 53 weeks

The decline in usage share of Windows XP, which is slated for retirement in 53 weeks, has slowed significantly, hinting that millions of its users will hold onto the operating system much longer than some, including Microsoft, expect.

Data published monthly by California-based Web analytics company Net Applications indicates that XP’s long-running slide has virtually stalled since Jan. 1.

In the past three months, Windows XP’s monthly drop in share has averaged just 0.12 of a percentage point. That’s less than a fifth as much as the 12-month average of 0.68 percentage points.

Other averages point to a major deceleration in declining usage share: XP’s most recent six-month average decrease of 0.42 percentage points was less than half the 0.94 point average for the prior six months.

Likewise for longer timespans. In the last 12 months, Windows XP has dropped an average of 0.68 percentage points, while in the 12 months prior it fell by 0.83 percentage points.

In other words, in the second half of a 12-month stretch, XP’s decline slowed by 55%; in the second year of a two-year span, it slowed 18%.
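
Those slowdown figures follow directly from the averages quoted above; a quick Python check of the arithmetic:

    # Verify the slowdown percentages from the average monthly declines quoted above
    # (all figures are percentage points of usage share lost per month).
    recent_6mo, prior_6mo = 0.42, 0.94      # six-month averages
    recent_12mo, prior_12mo = 0.68, 0.83    # twelve-month averages

    six_month_slowdown = (prior_6mo - recent_6mo) / prior_6mo
    twelve_month_slowdown = (prior_12mo - recent_12mo) / prior_12mo

    print(f"Six-month slowdown: {six_month_slowdown:.0%}")        # ~55%
    print(f"Twelve-month slowdown: {twelve_month_slowdown:.0%}")  # ~18%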

The slowdown paints a picture that must depress Microsoft, which has been banging the upgrade drum at Windows XP users for nearly two years, and has repeatedly warned them that free security updates will stop after April 8, 2014.

Net Applications’ data can also be used to roughly plot XP’s future usage share.

If the average decline of the last 12 months holds, XP will still account for 30% of all personal computers at the end of April 2014, or 33% of all systems expected to be running Windows at that time.

Recent estimates of XP’s future by analysts, however, have been more conservative, with experts from Gartner and Forrester Research predicting that 10% to 20% of enterprise systems will still be on the aged OS when support stops.

Microsoft has not pegged XP’s current corporate share, but the Redmond, Wash., software developer clearly knows it’s large: In January, during the company’s last quarterly earnings call, CFO Peter Klein said 60% of all enterprise PCs were running Windows 7.

Since few businesses adopted Windows Vista — and with Vista’s usage share now under 5%, some that did likely ditched it — the remaining 40% must, by default, largely be Windows XP.

Windows XP will not suddenly stop working 53 weeks from now; it will boot, run applications and connect to the Internet as it did before. But it will not be served with security updates. Minus patches, and knowing how frequently cyber criminals uncover vulnerabilities, security experts expect hackers to exploit XP bugs that users will have no way of quashing.

Those same experts have split on whether Microsoft will extend Windows XP’s support to protect what increasingly looks to be a major chunk of Windows users. But Microsoft has not signaled any desire to do so.

Granted, Microsoft will have supported XP for 12 years and 5 months, or about two-and-a-half years longer than its usual decade. That will be a record, as XP this month tied the previous Methuselah, Windows NT, which received 11 years and five months of support.

But Microsoft could still rethink its XP policy, and mimic rival Apple, which has continued to support OS X Snow Leopard, an operating system that, like XP, maintains a robust usage share.

Apple, which has never spelled out its security update policies, typically has stopped supporting “n-2,” where “n” is the most current edition of OS X, around the time it releases “n.”

Snow Leopard — “n-2” in that formula, having been superseded by Lion and Mountain Lion, the latter representing “n” — has continued to receive security updates, most recently on March 14, or about eight months after Mountain Lion’s launch.

By continuing to update Snow Leopard, which powered 27% of all Macs last month, Apple was able to patch 91% of all Macs.

Microsoft could do even better — cover 96% of all current Windows PCs — by continuing to support XP after April 2014.

But one expert thought that very unlikely. “I think they have to draw a line in the sand,” said John Pescatore, then an analyst with Gartner, now with the SANS Institute, in an interview last December. “They’ve supported XP longer than anything else, so they’d be pretty clean from the moral end.”

To track how long XP has before retirement, users can browse to an online countdown clock maintained by Camwood, a U.K. firm that specializes in helping businesses migrate to newer operating systems.



How valuable are security certifications today?

When it comes to education, most people agree that more is better. No one embodies that principle, at least with regard to IT certifications, better than Jerry Irvine. CIO of IT consulting firm Prescient Solutions and a member of the National Cyber Security Task Force, Irvine holds more than 20 IT certifications, of which at least six are specifically information-security-oriented.

“I’ll stop getting certifications when I’m dead,” says Irvine, though one wonders if even that will dissuade him. Irvine is a strong believer in the notion that the value of certifications in general and security certifications in particular shows up in your wallet.

“My opinion is the more certified you are, the more marketable you are. You can prove you know more because you have those certifications,” says Irvine. “People look at you and say, ‘This guy really does know his stuff.’ That gives you the opportunity to make more money.”

Anyone who puts in the time and spends the money to get certified is showing they care about staying current with security trends and techniques. That quality makes someone more desirable to an employer, he adds.

As a practical matter, many of today’s information security certifications require extensive hands-on application of skills. CompTIA’s CASP (CompTIA Advanced Security Practitioner), for example, requires candidates to configure firewalls and routers and perform other security-related tasks as part of the test. Being able to pass proves to a potential employer that you can do certain things, potentially giving you an edge over those who do not hold the certification.

For some jobs, obtaining a particular security certification, whether for information security or physical security, is a prerequisite for even being considered. In that case, you will know if there is a certification you need to obtain. Beyond that, however, attaining certifications is generally a matter of personal and/or employer choice. Some certifications require a great deal of work both in and out of the classroom, as well as sitting for the test. The question: do they generate a return on your investment?

Certifications should not be the end goal so much as a tool you can use in furthering your career, cautions Chris Brenton, an instructor at the SANS Institute and director of information security for CloudPassage, a cloud security provider. Brenton has been delivering certification training for quite a few years but, perhaps surprisingly, does not hold any certifications himself.

Certifications are one way to prove what you know, says Brenton, but there are other ways, especially if you’re a good communicator.

“It’s how much do you know and how good are you at conveying what you know?” he says.
As someone who oversees the hiring of security professionals for his company, Brenton looks for experience beyond certifications that shows the job candidate has practical skills. For example, if the candidate created a piece of open-source security software (such as a tool for vulnerability scanning or for implementing host-level security), that indicates real-world knowledge, he says.

“If the candidate has an active blog or has written a book about security, that tells me more about their expertise than just looking at their resume with certifications,” he says. In that case, holding a certification would probably not result in the candidate getting a higher salary offer. Certifications do give an edge to someone when weighed against another candidate without any demonstrated expertise, he adds.

And taking a class or obtaining a certification can be a handy way to fill a gap in your expertise, says Brenton.

“Let’s say they understand most aspects of network security but there are still some black-box areas where they need more training.”

His students often come for certification when they want to switch jobs or even careers.

The world of threats, both physical and information-based, moves so quickly that certification is a way to show you have training and understand the issues. That said, a certification can quickly become dated as technologies and threats morph and change. A certification that emphasizes perimeter security skills, for example, might well be perceived as less valuable now than one that focuses on vulnerability assessment and mitigation. And there is sure to be a hot new certification in 18 months to two years, if that long.

Those who obtain one security certification may feel the need to keep going as certifications change with the times. That could translate to more money in the certification provider’s wallet than yours. This is less true when it comes to physical security certifications, as physical security threats at least arguably do not change as quickly as information security threats.

Whether or not security certification will earn you more, now or in the future, depends a lot on the organization, the job and the industry. If your company values continuing education (and will help foot some of the bill for the training), that is a good indication that certification will elevate your status. If not, you may still want to pursue certification if you are someone like Jerry Irvine, for whom education is its own reward, or if you need to build up your resume in anticipation of making a move.

Irvine stands by his record.
“I hire security people. I look for certifications. Getting certified really does show something about a person,” he says. “We hire people with certifications.”


