Over the years I have delivered a number of presentations and architectural engagements where we used various tools to track and establish performance baselines, set consistent expectations, and build out formulas for scaling. Microsoft now has new tools that help with this process. The Windows Performance Analyzer (WPA) is a set of performance monitoring tools that can build in-depth performance profiles of Microsoft Windows operating systems and applications. Although these tools cannot discover, quantify, or otherwise adapt to flaws in your underlying hardware, they do let you understand the limits within a given class of hardware. This … Continue reading Windows 2008 – Windows Performance Analyzer (WPA)
New research from the University of Michigan duo Jie Yu and Satish Narayanasamy looks into encoding a set of tested, correct interleavings in a program’s binary executable using Predecessor Set (PSet) constraints. These constraints are efficiently enforced at runtime using processor support, which ensures that the runtime follows a tested interleaving. They analyze several bugs in open source applications such as MySQL, Apache, Mozilla, etc., and show that, by enforcing PSet constraints, one can avoid not only data races and atomicity violations, but also other forms of concurrency bugs. Source: A Case for Interleaving Constrained Shared Memory Conclusions: Testing and verifying … Continue reading A Case for an Interleaving Constrained Shared Memory
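To make the PSet idea concrete, here is a minimal, hypothetical Python sketch of the two halves of the scheme: learning, from tested runs, which remote (other-thread) instructions may immediately precede each instruction, then flagging a run whose interleaving was never tested. This is an illustration only — the paper's actual mechanism operates per memory instruction with processor support, and all names below are my own, not from the authors' artifact.

```python
# Toy sketch of PSet (Predecessor Set) constraints. Assumptions: an
# execution is a list of (thread_id, op_id) events on one shared
# variable; the real scheme is per-instruction with hardware support.
from collections import defaultdict

def learn_psets(tested_runs):
    """For each op, collect the remote ops observed to immediately
    precede it across the tested (correct) interleavings."""
    psets = defaultdict(set)
    for run in tested_runs:
        prev = None
        for thread, op in run:
            if prev is not None and prev[0] != thread:
                psets[op].add(prev[1])  # remote immediate predecessor
            prev = (thread, op)
    return psets

def violates_pset(run, psets):
    """Return the first op whose remote immediate predecessor was never
    seen in testing, i.e. a potentially buggy untested interleaving."""
    prev = None
    for thread, op in run:
        if prev is not None and prev[0] != thread:
            if prev[1] not in psets[op]:
                return op
        prev = (thread, op)
    return None

# Two tested interleavings of threads T1 and T2:
tested = [
    [("T1", "write_x"), ("T1", "read_x"), ("T2", "write_x")],
    [("T2", "write_x"), ("T1", "write_x"), ("T1", "read_x")],
]
psets = learn_psets(tested)

# Untested interleaving: T2's write lands between T1's write and read.
suspect = [("T1", "write_x"), ("T2", "write_x"), ("T1", "read_x")]
print(violates_pset(suspect, psets))  # flags "read_x"
```

In the paper, a PSet violation does not simply report an error: the hardware can stall or re-execute to steer the schedule back onto a tested interleaving, which is what makes the constraints enforceable rather than merely diagnostic.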
Over the past two weeks I have been conducting performance testing on Autodesk Inventor using three different operating systems. Essentially, the results revealed that unless applications are optimized for specific hardware, much of the horsepower we buy goes unused and wasted. One has to wonder whether the built-in monitoring tools in Windows are really giving us the full picture. Perhaps the easiest thing to point at is Windows 7's current lack of GPU monitoring. Diagram of a possible Windows 8 Task Manager. The next Windows version after Windows 7 will be tackling new hurdles … Continue reading The Windows 8 Task Manager? – GPU Usage monitoring becomes a requirement
Well, we can hardly call multi-core processors new technology. Those who really know the ins and outs of how applications run will tell you there is inefficiency in the way operating systems, and the applications that run on them, use the hardware. This inefficiency can actually make multi-core processors run slower than their single-core predecessors for non-optimized applications. The most efficient applications are specifically designed to support newer technologies like Hyper-Threading and multi-core, but those are not the business productivity applications you would expect. Yes, it is game development that is most preoccupied with the specifics of … Continue reading Big engine no gas – Multi-Core OS with native support for the hardware we buy still a future prospect.
Windows Server 2008 R2 is the newest Windows Server operating system from Microsoft. Designed to help organizations reduce operating costs and increase efficiencies, Windows Server 2008 R2 provides enhanced management control over resources across the enterprise. It is designed to provide better energy efficiency and performance by reducing power consumption and lowering overhead costs. It also helps provide improved branch office capabilities, exciting new remote access experiences, and streamlined server management, and it expands the Microsoft virtualization strategy for both client and server computers. Powerful Hardware and Scaling Features: Windows Server 2008 R2 was designed to perform as well as or better for the … Continue reading Top 10 Reasons to Upgrade to Windows Server 2008 R2
Save space on primary and backup storage. Hyper-V 2008 R2 boasts 87% of native drive system performance. Microsoft is separating out the data that is fixed from the information that changes by creating a new location for transitional data. The size of this data is unknown. Jeff Loucks, Available Technology
The Virtualization team is reporting 87% of native performance when using dynamically expanding drives. A number of MVPs were emailing today about best practices for high availability using Hyper-V and Cluster Shared Volumes. A great topic in and of itself, and one on which I will bring forward more extensive information in the near future. A side topic came up. Oliver Sommer, a respected MVP from Germany, introduced me to the increased performance of dynamically expanding drives in Hyper-V R2. This should not be confused with dynamic versus basic disk formats; this is dynamically expanding storage versus fixed storage in Hyper-V. There … Continue reading The Numbers: Hyper-V R2 and Dynamically Expanding Storage Performance