3 Mar 2014

Introducing Windows Server Update Services (WSUS)

   Windows Server Update Services (WSUS) is a Windows Server role that allows system administrators to control how and when updates are installed within the network. WSUS is a great way to centrally administer and monitor servers and workstations and to decide which updates are best to install. WSUS connects to Microsoft's update servers and downloads the updates to your server. You can then test each update and decide whether it should be deployed within your network. In a WSUS infrastructure, each workstation runs the Windows Update client, which verifies whether the latest updates are installed. Before downloading any update, the client verifies the digital signature and the Secure Hash Algorithm (SHA) hash of each update to ensure that the updates are legitimate and signed by Microsoft.
The WSUS settings can be centrally administered using the Group Policy Management Console. The policies can be found under the Computer Configuration/Policies/Administrative Templates/Windows Components/Windows Update node. There are many configurable policies available in this section, so I will let you explore each of them:
WSUS options
   Note that besides these policies, you can also enable user-based Windows Update settings. Just follow the same path under User Configuration and you will discover 3 more options available (note that I'm using a Windows Server 2012 version):
WSUS settings
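
If you would rather script these client-side settings than click through the policy editor, the same values the GPO writes can be set directly in the registry. Below is a minimal PowerShell sketch, assuming the standard WindowsUpdate policy keys; the server name and port are placeholders, so adjust them to your own WSUS server:

#Sketch only: point the Windows Update client at an internal WSUS server
#These values mirror the "Specify intranet Microsoft update service location"
#and "Configure Automatic Updates" policies; wsus.ppscu.com and port 8530 are placeholders
$wu = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate'
$au = "$wu\AU"
if (-not (Test-Path $au)) { New-Item -Path $au -Force | Out-Null }
Set-ItemProperty -Path $wu -Name WUServer -Value 'http://wsus.ppscu.com:8530'
Set-ItemProperty -Path $wu -Name WUStatusServer -Value 'http://wsus.ppscu.com:8530'
Set-ItemProperty -Path $au -Name UseWUServer -Value 1 -Type DWord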

   When planning to deploy a WSUS infrastructure you'll need to consider several things. If you are using a single network location, it's ideal to use one server; but if your network spans several geographical areas, you should deploy a server in every location and build a hierarchical infrastructure. Clients connect to the WSUS server using either HTTP or HTTPS to download updates. Only one server should connect directly to Microsoft's update servers; the other WSUS servers should then copy their updates from it. The connection between the WSUS server and Microsoft's update servers is made using the HTTP protocol. The most important aspect to consider when deploying a WSUS infrastructure is bandwidth consumption. Some updates or service packs are several hundred megabytes in size, so the overall network bandwidth can be severely affected if multiple computers are downloading updates.
   You would also need to consider WSUS replication and update approvals. In each location you can choose to deploy either a replica WSUS server or an individually managed server. A replica server acts just like the main WSUS server: the settings and update approvals configured on the upstream server are applied to the replica.
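
As a rough illustration of the replica idea, on Windows Server 2012 the UpdateServices PowerShell module can chain a downstream server to an upstream one. This is only a sketch under that assumption, and the host name is a placeholder:

#Sketch, assuming the UpdateServices module that ships with WSUS on Windows Server 2012
#wsus-hq.ppscu.com is a placeholder for the upstream server
Set-WsusServerSynchronization -UssServerName 'wsus-hq.ppscu.com' -PortNumber 8530 -Replica
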
   Disk space is another factor that needs consideration. If you choose to store updates locally, your WSUS server will require several GB of free space, depending on the number of updates and language packs needed. The server will also host a local database containing the list of updates stored on the local disk.
   When implementing updates in a large enterprise, one important aspect is to ensure that each workstation has received the latest updates and is protected against external attacks. The health state of your network devices can be verified using different tools available with Windows: the WSUS console, System Center Configuration Manager (the successor of SMS) or NAP (we've learned how to verify Windows updates with NAP in a previous article).
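
For a quick spot check of individual machines, a small PowerShell sketch like the one below can list the most recently installed updates; the computer names are placeholders:

#Sketch only: list the five most recent hotfixes on a couple of servers
#srv1 and srv2 are placeholder computer names
'srv1', 'srv2' | ForEach-Object {
    Get-HotFix -ComputerName $_ |
        Sort-Object InstalledOn -Descending |
        Select-Object -First 5 Source, HotFixID, InstalledOn
}
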
That's it for this short introduction about WSUS. In the next article we will continue discovering this awesome feature available with Windows OS. Wish you all the best and have a great day!

19 Feb 2014

MsMpEng.exe eating too much CPU

Hi folks,
Today I had a strange problem regarding one of our IIS web servers. I received a complaint about some web applications that were performing really badly. Note that IIS was running on Windows Server 2008 R2 and protected with Microsoft Forefront Endpoint Protection. In such situations you would normally establish an RDP connection to the problematic server and check its performance. From the beginning I noticed that the RDP session was really slow and I could barely open Task Manager.
I then switched to the Performance tab in Task Manager and saw that the CPU was running at 100% capacity. One of the running processes caught my eye because it was constantly consuming more than 50% of the processor's capacity. The executable was MsMpEng.exe, which is the Microsoft Antimalware Service:
Microsoft Antimalware Service

I know that this service is used by Microsoft FEP to protect users from malware and other potentially unwanted software, but I didn't know what was causing this behavior. I tried using the Process Explorer utility to analyze the problem, but it didn't help much. My salvation came when I used Process Monitor (by Sysinternals) to see what was going on behind this process. The antivirus software was repeatedly trying to access ServerManager.log and was locking the file:
Process Monitor

This process was repeated over and over again, so the CPU was constantly working at 100 percent. I then added the path of the log file to the excluded files and locations section and the problem was finally fixed:
Microsoft Forefront Endpoint Protection

Now, when I open Task Manager, the overall CPU usage is within normal parameters:
Task Manager

I've read about this problem over the Internet and some users were suggesting adding the following paths to the excluded files and locations section:
C:\ProgramData\Microsoft\Microsoft Forefront Endpoint Protection 2010 Server Management
C:\ProgramData\Microsoft\Microsoft Antimalware
C:\Program Files\Microsoft Security Client\MsMpEng.exe
Note that these solutions didn't work in my situation and only adding the ServerManager.log file to the exclusion list fixed my problem. The same fixes can be applied to Microsoft Security Essentials running on Windows desktop versions.
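
If you ever need to push such an exclusion to many servers at once, a scripted approach is possible. The sketch below relies on an assumption about where FEP/SCEP stores its path exclusions in the registry (verify it against your FEP version before relying on it), and the log file location is simply the usual one on Server 2008 R2:

#Sketch: add a path exclusion for Microsoft Antimalware (FEP/SCEP) via the registry
#The registry location is an assumption about how FEP/SCEP stores exclusions;
#C:\Windows\Logs\ServerManager.log is where the log usually lives on Server 2008 R2
$exclusions = 'HKLM:\SOFTWARE\Microsoft\Microsoft Antimalware\Exclusions\Paths'
if (-not (Test-Path $exclusions)) { New-Item -Path $exclusions -Force | Out-Null }
New-ItemProperty -Path $exclusions -Name 'C:\Windows\Logs\ServerManager.log' `
    -Value 0 -PropertyType DWord -Force
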
Hope you'll find this article useful; if anything is unclear, post a comment in our dedicated section and I will try to respond as soon as possible. Don't forget to enjoy your day and stay tuned for the following articles from IT training day.

10 Feb 2014

Why SSL Certificate is Important When it Comes to Web Hosting

SSL stands for Secure Sockets Layer, and an SSL certificate keeps your website protected against fraud and other malicious online transactions; this is why an SSL certificate is important when it comes to web hosting. The certificate establishes an encrypted connection between the host, which is your server, and the web browser. The connection between the host and the browser should be kept private because your customers will have to key in their personal information on your site. You cannot expect your customers to trust your site if you do not have an SSL certificate. Learn more at http://www.sslcertificatereviews.net



Boost Up Your Website’s Conversion Rate
One of the reasons why it is important for your web hosting to have an SSL certificate is that it helps you establish trust and confidence with your customers. Those who shop online are now very careful with their transactions, given the number of frauds and scams going on all over the World Wide Web. Before they decide to shop from any website, they will check the site's security features, and one of these is the SSL certificate.

If your site does not have this, then you cannot expect your customers to trust you. But if you do, it will be easy for your customers to put their trust in you, and this can result in an increase in your site's visitor conversion rate. Visit the website http://buycertificate.com to get more information.

Keep Hackers and Scammers Away
If you have an SSL certificate on your web hosting, hackers and scammers are far less likely to include you on their list of victims, and this is another reason why an SSL certificate is important when it comes to web hosting. This technology helps to lessen the risk of visits from these malicious individuals to your site.
What these people actually target are websites that are not fully secure, because they know they can easily get on with their crime if a website lacks security features such as an SSL certificate. If these hackers and scammers manage to get through to your site, you are not only putting your customers at risk but yourself as well.

These people have ways to steal your account information and can jeopardize all transactions on your site so they can steal money from you. So before any of this can happen, make sure to look for web hosting that comes with SSL certification.

Choose a Reliable Web Hosting Company
In order to ensure that your SSL certificate is reliable, look for a highly reputable web hosting company that provides reliable SSL certificates. All of the websites on your server should have this security feature. But remember that in order to obtain the SSL certificate, you will be asked to get a dedicated IP address, which might cost you extra money. Nevertheless, investing in this is definitely worth it.

Now that you know why an SSL certificate is important, it's about time to purchase this security feature for your site.

Choosing a Domain Name for your Niche Website

A domain name is the address of your website that people will use to find it. Basically, this is the “physical address” that you give to customers so they can locate your site. By having a domain name, your consumers will not have a hard time finding your place on the World Wide Web.
This should be the first criterion to consider when starting an online business. But deciding on a domain name requires that you familiarize yourself with your market as well as your target consumers. So to help you with this, here is your guide on choosing a domain name for your niche website. You can check for URL availability at http://checkurlavailability.com.




Market Research
First of all, you need to conduct market research. This gives you a better chance of getting found through Google and other search engines. As you know, people these days rely heavily on these search engines when looking for something online. This is actually the hardest part of choosing the right domain name for your site; sometimes you will need to spend several days on market research alone. Fortunately, Google has found a way to make it easier for ecommerce sites to reach their target consumers, and that is the Google Keyword tool. Learning how to use this tool can help you in choosing a domain name for your niche website. More URL information at http://www.howtourl.com.
 
Choose .com Domains
If you are new to the World Wide Web and you are still building your presence on the web, then go for .com domain names if possible. People are used to .com domains, so when they search for something on the search engines, they will most likely choose results with .com domains. Having this domain also lends more credibility to your business than using .biz, .info, and the like.

Keyword-Rich Domains
Your domain should be keyword rich: keep the name short and include the most commonly searched terms related to the kind of business that you have. For example, if your business has something to do with flowers, then when choosing a domain name look for one that has the “flower” keyword in it. This ensures that your website has a higher chance of ranking on the first few pages of Google and other search engines.

Short Domains
Your domain must be short yet concise, so your customers can easily remember you. If you choose a long domain name, your customers may end up misspelling it, and the worst that could happen is that they end up on your competitor’s website instead.

Keywords of three to four words are still acceptable to a lot of people, but anything longer than that can make it hard for your target consumers to find you on the web.

Choosing a domain name may be difficult for beginners, but the tips above should be able to help you.

Essential Tips on Web Hosting

These days, you will find quite a lot of web hosting companies that provide different kinds of services. With this, it can be a bit difficult to decide which of these companies to consider and what type of web hosting package you should choose. These essential tips on web hosting should be able to help you in coming up with the best decision.  Learn more about monthly pay hosting.



Adopt a Yearly Plan
If you cannot find a highly reliable web service provider, it is always a great idea to consider getting a yearly plan for your hosting package. There is no setup fee charged if you get a new web hosting package on a yearly plan. There are various online web hosting providers these days that offer very affordable yearly hosting plans.

Ensure the Bandwidth and Server Space
It is best to take time researching the net and reading reviews of various web hosting companies to find out which ones can provide the bandwidth and server space specific to your needs. If you only need a small amount of space, a free web hosting service may be a perfect choice for you.
But if you would prefer to get a much bigger website for your business, then you should choose the package that can give you enough bandwidth and server space. Getting a paid professional web hosting provider is always the best choice and is indeed one of the most essential tips on web hosting.  More info about monthly hosting.

Determine the Web Hosting Company Agreements
Before you pay any web hosting company, make sure that you understand their agreement well. The agreement lists the legal parameters that should be included in a web hosting contract. You must also ensure that the company is properly authorized, and the best way to do this is by reading reviews as well as the site agreements. If your budget won't be enough to pay for a certain web hosting package, try to negotiate with the company and see if they can come up with an affordable plan for you.

Ensure a Protected Support Domain
Most web hosting companies provide technically stable support and services 24 hours a day, seven days a week. Reliance on the Internet has been increasing rapidly along with the advancement of technology and the ease of accessibility. You can gain more traffic to your site if you use top-notch and popular domains. Most web hosting providers these days can provide top-class domains to all customers who sign up for their yearly plan.

Familiarize Yourself with the Recent Updates on Software
A lot of web hosting companies launch new versions of their software from time to time. This helps attract more customers to the website and ensures high traffic. In order to know how this software can help you, try to familiarize yourself with the recent updates to their software.

When looking for the best web hosting company, make sure to remember all these tips on web hosting. 

4 Feb 2014

Powershell script to create new IIS application

Internet Information Services

Hello folks,
I just want to show you a script I've created in PowerShell for adding a new web application to an IIS server. It's probably much easier to configure such an app using the IIS Manager console, but with scripting you can make your life much easier and save a lot of time. When deploying a new application on several load-balanced IIS servers, the work can get tedious, so it's better to use scripting for such an operation.
That being said, I'll just paste the code with the description:

#Import the web administration module and create the paths for the new application
import-module webadministration
$SiteName = "test.ppscu.com"
$PathAppPool = "IIS:\AppPools\" + $SiteName
$PathWebSite = "IIS:\Sites\" +$SiteName

#Creating folders in which the application and logs will be stored
New-Item -ItemType directory -name $SiteName -path "C:\inetpub\sites" -Force
New-item -ItemType directory -name $SiteName -Path "C:\inetpub\logs" -Force

#Creating and configuring the App Pool (it will use the Network Service identity, .NET Framework 2.0 and the Classic pipeline mode)
New-WebAppPool -Name $SiteName -Force
Set-ItemProperty -Path $PathAppPool -Name processmodel.identityType -Value NetworkService
Set-ItemProperty -Path $PathAppPool -Name managedRuntimeVersion -Value v2.0
Set-ItemProperty -Path $PathAppPool -Name managedPipelineMode -Value Classic
Restart-WebItem $PathAppPool

#Create Website, binding and set the physical location
New-WebSite -name $SiteName -port 80 -hostheader $SiteName -PhysicalPath "C:\inetpub\sites\$SiteName" -ApplicationPool $SiteName
Set-ItemProperty -Path $PathWebSite -name applicationPool -value $SiteName
Restart-WebItem $PathWebSite

#Add log file location
Set-ItemProperty -Path $PathWebSite -name logFile.directory -value "C:\inetpub\logs\$SiteName"
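
If you want to quickly check the result after running the script, the same module can list the new site and application pool. This is just a small verification sketch that reuses the variables from above:

#Sketch: confirm that the site and the application pool were created
Get-Website -Name $SiteName | Format-List Name, State, PhysicalPath, ApplicationPool
Get-Item $PathAppPool | Select-Object Name, State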

That's it for this short script, I hope you'll find the code useful when deploying IIS applications. Wish you all the best and have a great day!

How to enable Output Caching in IIS

Hello folks,
   In this short article we will talk about the Output Caching feature available with IIS servers. We will see what the main aspects behind this technology are and how to configure it to benefit our web applications.
   Before going straight to the configuration part, we have to talk about the concept of caching, what caching actually means in IIS, and when it's recommended to use this feature. Output caching is a method of improving the web server's performance by storing dynamic content in memory. Caching can be enabled for classic ASP, ASP.NET, PHP and other dynamic content.
   By default, IIS will cache static content such as images or HTML files, but for dynamic content this feature has to be configured and customized manually. I'm saying that the caching feature must be customized because it's not recommended for some dynamic objects and can even cause problems for your web application. Make sure that your web application actually benefits from output caching, because it may otherwise cause instability. This feature should be configured for dynamic content that does not change with every request based on the header or URL. In IIS, output caching can vary the cached response based on two variables: the query string (varyByQueryString) and header information (varyByHeaders).
   Because dynamic content changes its information frequently, cached resources must be removed before updated information can be served. This is why the cache memory must be flushed or invalidated. IIS provides two methods of invalidating information:
- a timeout period (CacheForTimePeriod)
- a change detection mechanism (CacheUntilChange)
   For a resource to be cached by the IIS server, it must be requested a number of times within a predefined period of time. IIS offers two parameters to configure the timing and the number of requests: frequentHitTimePeriod and frequentHitThreshold.
If a number of requests (frequentHitThreshold) are made for the same item within the configured period of time (frequentHitTimePeriod), the resource is cached so that the IIS server can respond faster to future requests. When a resource has met these two conditions we say that it has become "worthy".
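
These two thresholds live in the serverRuntime section of the IIS configuration. The sketch below shows how they could be adjusted with the WebAdministration module; the values are just examples, not recommendations, and the defaults are usually fine:

#Sketch: tune the "worthiness" thresholds at the server level
#frequentHitThreshold and frequentHitTimePeriod belong to system.webServer/serverRuntime
Import-Module WebAdministration
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
    -Filter 'system.webServer/serverRuntime' -Name 'frequentHitThreshold' -Value 2
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
    -Filter 'system.webServer/serverRuntime' -Name 'frequentHitTimePeriod' -Value '00:00:10'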

There are two methods available when configuring Output Caching on your IIS server:
Configure Output Caching using the IIS Manager console 
You can enable output caching for the whole IIS server or for each website individually. Open the IIS Manager console, navigate to your web application and click on Output Caching:
IIS management console

Now click on the Add button from the right section to configure a new caching rule:
Output Caching

Windows Server supports two caching methods:
  • User-mode caching - uses a local cache stored in the IIS worker process
  • Kernel-mode caching - uses a cache stored in the Http.sys driver. 

Note that even though Kernel-mode caching is much faster than user-mode caching, it does not support features that must run in user mode (such as authentication and authorization). Which caching method you use depends a lot on the application's purpose and requirements.
For this example I've created a cache rule for .php files to use change notifications:
IIS cache rule

Note that you can press the Advanced button and enable caching of different versions of a file based on query string variables and/or headers:
Cache rule

There are some options available in both User-mode and Kernel-mode caching:
  • Using file change notifications: an item will be removed from the cache once a newer version of the file is added in the web application.
  • At time intervals (hh:mm:ss): items will be removed from the cache once the period of time has elapsed. 
  • Prevent all caching: this option prevents caching for the specified type of files
Once you've configured all these parameters, the application will be configured for caching. 

Configure Output Caching by modifying the config file of your web application
Navigate to your web application's physical location, open the web.config file and enter the following lines:
<configuration>
  <location path="mywebsite.php">
    <system.webServer>
      <caching>
        <profiles>
          <add extension=".php" policy="CacheForTimePeriod"
               duration="00:00:01" location="Any"
               varyByQueryString="*" />
        </profiles>
      </caching>
    </system.webServer>
  </location>
</configuration>
(Source: Microsoft's website)
The policy="CacheForTimePeriod" parameter can be changed to kernelCachePolicy to enable Kernel-mode caching.
That's about it for this article folks, hope you'll find it interesting. Don't forget to enjoy your day and stay tuned for the following articles from IT training day.

30 Jan 2014

Introduction to remote connections

Large enterprises often offer their traveling employees remote access to their internal network. This means that not only do remote users have access to the network, but the connection is also encrypted to protect them against external attackers. Windows Server supports two remote connection types: dial-up and VPN connections. I've mentioned dial-up connections because this feature is available with Windows Server 2008, although this connection type is rarely used because it offers poor performance. Dial-up connections also require phone lines to connect to the remote access server. VPN connections are easier to configure and maintain and offer increased speed. The only requirement of VPN connections is that both servers and clients must have an active Internet connection. With dial-up connections you have a secured communication channel over phone lines, unlike VPN connections, where you expose your VPN servers to the Internet. This means that before the encrypted channel is established, the VPN server will accept authentication requests from external hosts.

Now let's take a look at the advantages and disadvantages of dial-up and VPN connections:
Dial-up
  • offers a relatively secure connection, because the public switched telephone network is more secure than the Internet. Note that dial-up connections do not offer encryption mechanisms. 
  • an Internet connection is not required, because dial-up connections use phone lines. This means that no authentication requests are sent to your server from the Internet, so it is not exposed
  • constant speed

  • although dial-up connections offer a constant speed, the performance is often poor. The maximum bandwidth of such connections is 56 Kbps
  • because each remote user requires a dedicated phone line and modem, scaling a dial-up infrastructure is hard and expensive
VPN
  • because VPN connections use the public Internet, lower costs are required to implement this technology.
  • offers higher bandwidth than dial-up connections 

  • VPN connections can sometimes suffer from latency because of the Internet connection
  • Internet connection is required on both sides (server and remote user). This poses a certain risk level because the internal network is exposed to external authentication requests. 
Since dial-up connections are an outdated solution, I will not show you how to install and configure dial-up servers. Instead we will focus on VPN connections, since this is the preferred remote connection method these days. Windows Server 2008 supports several VPN protocols, as follows (a quick client-side example follows the list):

L2TP or Layer Two Tunneling Protocol - offers connectivity between non-Microsoft and Microsoft products. It provides user authentication using the PPP protocol and computer authentication using IPsec. It also enhances security by providing integrity, authentication and encryption. This VPN technology is also compatible with IPv6 connections.
PPTP or Point-to-Point Tunneling Protocol - Microsoft's proprietary VPN protocol that uses PPP (Point-to-Point Protocol) for user authentication and MPPE (Microsoft Point-to-Point Encryption) for encryption.
SSTP or Secure Socket Tunneling Protocol - this VPN technology uses the PPP protocol for user authentication and SSL (Secure Sockets Layer)  for data integrity, encryption and authentication. SSTP can be implemented using AD Certificate Services and requires that VPN clients trust the CA that issued the certificate installed on the VPN server.
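
To illustrate the client side, on Windows 8 / Server 2012 and later a connection for one of these tunnel types can be created from PowerShell. This is only a sketch and the server address is a placeholder:

#Sketch: create an SSTP VPN connection on a modern Windows client
#vpn.ppscu.com is a placeholder; requires the VpnClient module (Windows 8 / Server 2012 or later)
Add-VpnConnection -Name 'Corp VPN' -ServerAddress 'vpn.ppscu.com' `
    -TunnelType Sstp -AuthenticationMethod MSChapv2 -EncryptionLevel Required
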
This was a short introduction to VPN connections; in the next article we will see how to install and configure a VPN server. I hope you'll find this article interesting, don't forget to rate & share. If anything is unclear, leave a comment. Enjoy your day and stay tuned for the following articles.

27 Jan 2014

How to configure Event Subscriptions

There are several steps you must take to enable event forwarding from one machine to another. You will need to configure both the forwarder and the collector with the appropriate settings. Event forwarding uses the HTTP or HTTPS protocol, so it is easy to allow through a firewall; you will probably need to allow HTTP traffic between the forwarder and the collector. Another important aspect is to enable the Windows Event Collector and Windows Remote Management (WinRM) services on both machines. I will configure both machines from the beginning:
Forwarding machine
Open a command prompt with administrative credentials and type in the following:
winrm quickconfig
You will then need to add the computer account of the collector machine to the local Event Log Readers group. The group can be found in Computer Management:
Event Log Readers group

If you want to enable Event forwarding on multiple machines I recommend that you use GPO and run a local script on all machines. You can use the following command to add the computer account of the collector machine in the local group:
net localgroup "Event Log Readers" srv1$@ppscu.com /add where srv1$@ppscu.com is the collector's name
The computer account has been added to the specified group:
Event Log Readers

Collector machine - open a command prompt with administrative credentials and type wecutil qc. When prompted, press Y to configure the Windows Event Collector service:
Configuring Windows Event Collector

Now we'll need to create a subscription. Open Event Viewer console, navigate to the Subscription section and press the Create Subscription button from the right side of the window:
Create event subscription

There are two types of subscriptions, collector initiated and source computer initiated; choose the one that suits your needs better. You'll have to select which events are sent to the collector computer and, if desired, you can customize advanced options (which protocol is used to send events, which account has read access, and the overall bandwidth consumption):
Advanced Subscription Settings

If you've chosen to use HTTPS as the transport method for events, you will need to deploy computer certificates on the forwarding machine from the local CA. You will also need to create the appropriate firewall rules for the WinRM HTTPS port (443 with WinRM 1.1, 5986 with WinRM 2.0 and later) and configure WinRM for HTTPS transport (open a command prompt and type winrm quickconfig -transport:https). The collector computer must trust the enterprise CA, and the subscription's advanced options must be set to use HTTPS transport.
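
As a hedged sketch of that HTTPS setup on a Windows Server 2012-era machine (the firewall rule name is arbitrary and port 5986 assumes WinRM 2.0 or later):

#Sketch: open the WinRM HTTPS listener port and switch WinRM to HTTPS transport
#Port 5986 assumes WinRM 2.0+; the rule name is arbitrary
New-NetFirewallRule -DisplayName 'WinRM over HTTPS (event forwarding)' `
    -Direction Inbound -Protocol TCP -LocalPort 5986 -Action Allow
winrm quickconfig -transport:https
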
Event subscriptions are checked every 15 minutes by default. If you want to change this value, run wecutil ss "Subscription_name" /cm:custom and then wecutil ss "Subscription_name" /hi:6000 from a command prompt. Note that the heartbeat interval (/hi) is expressed in milliseconds, so 6000 corresponds to 6 seconds (60000 would be one minute).
These are the steps that you need to take to enable event forwarding between two machines. Hope you've understood the principles behind this technology, if you have any misunderstandings leave a comment and I will try to respond as soon as possible. Wish you all the best and have a great day!

21 Jan 2014

How to configure NAP components

In this article we will see how to configure Security Health Validators (SHVs) when configuring NAP in your Windows infrastructure. Before you can go straight into configuring an SHV, you'll need to install the Network Policy and Access Services server role. Note that for this demonstration I will be using a Windows Server 2008 R2 virtual machine.
Windows SHVs can be configured in Network Policy and Access Services/NPS/Network Access Protection/Windows Security Health Validator:
Windows Security Health Validator

There are two configurable sections here:
Settings - Windows Server 2008 will have a built-in configuration under this node. In this section you specify the policy settings for your Windows machines. There are two configurable options, one for Windows 7/Windows Vista and the other one for Windows XP. You can configure the SHV to verify whether the firewall, antivirus, antispyware and automatic updates are enabled and up to date on your NAP clients.
Windows Security Health Validators

Error codes - in this section you configure what behavior the SHV takes when errors are encountered during the health validation process. 
Windows Security Health Validator

The default SHV will check several Windows components. It verifies whether the antivirus is turned on and updated with the latest virus definitions. Automatic updates, Windows Firewall and antispyware software updates are also checked.
Network policies - are used to allow or deny access to a NAP client based on the criteria specified in the policy. To configure a network policy, open the Network Policy and Access Services console, navigate to NPS/Policies/Network Policies and press the New button from the right corner of the panel:
Network policies

Once the Wizard has started, you will need to enter a policy name and select the NAP server type that will take advantage of this policy:
Network policy

In the Conditions tab, press Add and select the desired conditions that NAP clients must meet to receive network access. I will select NAP Capable Computers because I want to grant access only to these machines:
NAP network policy

There are many common conditions that can be configured here, so I suggest studying all of them before configuring a network policy. You can set conditions based on the Operating System, Health Policy state, Policy Expiration and so on. 
Once the conditions have been configured, click Next:
Network Policy

In the following section you'll need to specify the access permission settings for the policy. Select the Access granted option and click Next. If you select Access denied, the health validation check will not occur:
Network policy access permission

In the next section you can configure one or more authentication types used by connection requests:
NAP authentication methods

If desired, you can also configure network constraints for a network policy:
Network policy constraints

Now click Next and proceed to the Configure Settings page. Here you can specify additional settings that affect the network policy. Navigate to the Network Access Protection page and select the access level for this policy. There are three options available in this section, as follows:
New Network Policy Wizard
  • Allow full network access - this option is usually configured when creating the network policy for healthy NAP clients.
  • Allow full network access for a limited time - this option will grant network access to NAP clients for a specified period of time. Once the configured time has elapsed, non-compliant computers will only be able to access the restricted network. When using this method, click the Configure button from the bottom section and select a Remediation Server Group and a troubleshooting URL:
Remediation Servers and Troubleshooting URL
  • Allow limited access - this option is configured for non-compliant computers and will give access only to the specified Remediation Server Group
Once you click Next, review the newly configured network policy and click Finish:
Configuring a Network Policy

For troubleshooting purposes it's recommended that you enable NAP logging for authentication requests. This gives system administrators an overall picture of the NAP infrastructure. Open the NPS console, right-click the NPS (Local) node and select Properties. In the General tab, check the two available options: rejected authentication requests and successful authentication requests:

Network Policy Server Properties

Note that NAP errors are also logged in Event Viewer, so don't forget to check out this tool. For detailed NAP logging you can enable event tracing on the NAP client by running the netsh nap client set tracing state = enable level = verbose command from cmd (tracing files are stored in C:\Windows\Tracing).
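
While you're in that command prompt, two read-only commands are handy for checking the client side of a NAP setup (just a quick sketch, output omitted):

#Sketch: inspect the NAP client from the command line
netsh nap client show state
netsh nap client show configuration
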
That's it for this post folks, by now we've covered the main aspects about NAP and all this info should be sufficient to install and configure a NAP infrastructure. Wish you all the best and have a great day!
