Windows file server performance optimization

Merge this into the registry, reboot and enjoy increased performance:


Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem]
"NtfsDisable8dot3NameCreation"=dword:00000001
"NtfsMemoryUsage"=dword:00000002

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
"NumTcbTablePartitions"=dword:00000008

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\Interfaces\{INTERFACE NUMBER}]
"TcpAckFrequency"=dword:0000000d

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"PagedPoolSize"=dword:ffffffff
"LargeSystemCache"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Executive]
"AdditionalDelayedWorkerThreads"=dword:00000020

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\RpcXdr\Parameters]
"DefaultNumberOfWorkerThreads"=dword:00000040

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\NfsSvr\Parameters]
"OptimalReads"=dword:00000001
"RdWrHandleLifeTime"=dword:0000000a
"RdWrNfsReadHandlesLifeTime"=dword:0000000a
"RdWrNfsHandleLifeTime"=dword:0000003c
"RdWrThreadSleepTime"=dword:0000003c
"SecureHandleLevel"=dword:00000000
"NfsHandlesCacheSizeLowWatermark"=dword:003d08ce
"NfsHandlesCacheSizeMax"=dword:003d0900
"NtfsHandlesCacheSizeLowWatermark"=dword:000249be
"NtfsHandlesCacheSizeMax"=dword:000249f0
"FileHandleCacheSizeInMB"=dword:3de00000
"LockFileHandleCacheInMemory"=dword:00000001
"MaxIcbNfsReadHandlesCacheSize"=dword:00001f40
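
The {INTERFACE NUMBER} part has to be replaced with the GUID of the NIC that actually carries the file traffic. A minimal sketch of listing the candidates and then applying the file, assuming the settings above are saved as fileserver-tuning.reg (the file name is just a placeholder):

rem List the TCP/IP interface GUIDs; pick the one matching the file-serving NIC
reg query "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\Interfaces"

rem Import the tuning file, then reboot
reg import fileserver-tuning.reg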

Also, check the NTFS log size (chkdsk /L) and increase it to 65536 KB if it isn't already that size. That covers Windows CIFS and NFS and was tested on 32-bit Windows 2003 (I believe W2K8 and 64-bit platforms can be optimised the same way; will test). This comes from the SPEC file server benchmarking results and configuration notes for the HP ProLiant DL585 G2 Storage Server.
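
A minimal sketch of the log check and resize, assuming D: is the data volume (chkdsk /L with no size only reports the current value; adding :65536 changes it):

rem Show the current NTFS log size for the volume
chkdsk D: /L

rem Grow the log to 65536 KB and confirm
chkdsk D: /L:65536
chkdsk D: /L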


Check out other systems and results – some interesting information there.


It is a good idea to check the performance before and after changing the system parameters. You don't need to purchase SPEC tests to do that – there are free tools available. Stay tuned for some details, or search away (if your OS of choice is Windows, use "sqlio" as the search term).
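
For reference, a before/after run might look like the sketch below; the flags are standard SQLIO options, but the duration, block size, queue depth and test file name are arbitrary assumptions (the test file must already exist):

rem 5 minutes of random 8 KB unbuffered reads, 8 outstanding I/Os, with latency stats
sqlio -kR -s300 -frandom -b8 -o8 -LS -BN testfile.dat

rem Same pattern for writes
sqlio -kW -s300 -frandom -b8 -o8 -LS -BN testfile.dat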

How not to make decisions

In the past week I had a number of discussions about information security and technology in general. With colleagues, we identified a few common patterns in corporate decision-making – and they are case studies in how decisions shouldn't be made. Here are some examples:


We need mature solutions. Can anybody define maturity when it comes to IT? Is IntranetWare a mature solution for network file and print services? Whenever you hear "maturity" or "business acumen" or something like that, reach for your wallet. Fact: early adoption of technology works better in most cases. That's because you get better support from the technology partner, more features, more time before the next upgrade, and staff who feel good because they are working on something new.


Everyone else does it, so it must be good. This is the "best practice" fallacy. Cases in point: don't broadcast the WLAN SSID; VLANs are for security; multihoming servers (and running separate physical connections to different security zones) is a security feature. These myths don't withstand a reality check (e.g. scenario-based threat analysis), but they persist in minds and get embedded in assorted standards like PCI – resulting in costlier infrastructures that are more complex to build and support.


We don't really know what we're doing, but let's do it anyway. That is, decisions large and small are made based on uncertainty and lack of knowledge. Cases in point: we don't know what this software update does, so let's have a full system restore as the backout plan; I heard that a virtual machine will have some kind of issue running our application, so please use physical (the last one comes from a Microsoft engineer, with no details as to the issue given despite repeated questions); and we don't know how the database server will perform when the database reaches 4 TB, so let's go Oracle RAC. If you don't know what a software update is doing, find out by looking inside the installation package. If you have concerns about database performance, create a performance baseline and try to come up with an automated stress test of some sort; the database size itself doesn't mean much.
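
For Windows Installer packages, for instance, looking inside takes one line: an administrative install unpacks the MSI contents for review without actually installing anything (a sketch; the package name and extraction path are placeholders):

rem Extract the MSI payload to C:\extract for inspection
msiexec /a package.msi TARGETDIR=C:\extract /qn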


Decisions should be made based on knowledge and facts.

US Senate: security through (more) bureaucracy

When I first read the news on the Washington Post web site, I thought it was a 1 April joke: Senate Legislation Would Federalize Cybersecurity. April Fools' Day has come and gone, but all the signs are that this is for real: the press releases trumpeting the arrival of the legislation are still there. The bill's summary is available from the US Senate web site (I cannot find the full text of the proposed legislation yet). The problem definition is typical scaremongering:


This comprehensive legislation addresses our country’s unacceptable vulnerability to massive cyber crime, global cyber espionage, and cyber attacks that could cripple our critical infrastructure. We presently have systems to protect our nation’s secrets and our government networks against cyber espionage, and it is imperative that those cyber defenses keep up with our enemies’ cyber capabilities. However, another great vulnerability our country faces is the threat to our private sector critical infrastructure–banking, utilities, air/rail/auto traffic control, telecommunications–from disruptive cyber attacks that could literally shut down our way of life.


So get ready for a digital Pearl Harbor. A real one, the Conficker virus (another April Fools' event, which some described as just that), caused zero noticeable impact.


Coming from professional politicians, the bill unsurprisingly proposes to improve the cybersecurity situation by introducing a colossal new bureaucracy, headed by the US Cybersecurity Fuehrer (or Tzar, or Leader, if you so wish). If it becomes law, the government will have control over information security matters in the private sector:


The legislation would require the National Institute of Standards and Technology to establish measureable and auditable cybersecurity standards that would be applicable both to government and the private sector.


Although the press release and the summary specifically mention critical infrastructure controlled by private entities – utilities, banking, transportation, health and telecommunications – the bill's scope is apparently not limited thereto. That would dwarf the Sarbanes-Oxley and HIPAA information security rackets and create a massive compliance burden on the economy. Layers upon layers of firewalls, "endpoint security" and "intrusion prevention" technologies, and regular compliance audits may become mandated by law.


The bill would also attempt to place a dollar value on cybersecurity risk. Ironically placed under the "Foster innovation" section, it means this:


The legislation would require the Advisor to provide a report on the feasibility of creating a market for cybersecurity risk management, to include civil liability and government insurance.


Welcome to the cybersecurity cap-and-trade scheme!


This is not the first attempt to create cybersecurity bodies in the government. Think of the DHS and its Cybersecurity Center, the people who brought us this:


Current Threat Level


Yet, according to the senators, all those efforts have basically failed. Maybe that signifies a problem with the approach? It does. Government-mandated dogma is not a substitute for a pragmatic approach to security threats.