
Tuesday, September 2, 2008

PC Overclocking, Part 1

Traditionally, the performance of a computer is estimated by its processor. It is generally accepted that this unquestionably important element determines the basic functional capabilities, class, price, and prestige of a modern PC. The 8086, 8088, 286, 386, 486, Pentium, Pentium Pro, Pentium MMX, Pentium II, Celeron, and Pentium III processors are the basic landmarks separating one generation of computers from another. Processors from AMD, VIA (Cyrix), IBM, and certain other manufacturers have also had a huge influence on the development of the computer industry. Experienced users, however, know that other PC components are of no less importance and cannot be neglected. These include the hard drives, the motherboard and its chipset, the 2D/3D video adapter, the monitor, the CD-ROM and DVD-ROM drives, the sound card, a high-performance network adapter for computers on a local area network (LAN), and a high-speed modem for computers connected to the Internet. This list could go on; its length depends on the tasks you intend to solve with the PC, which in turn determine the computer's functional requirements and technical characteristics. At the same time, of course, performance remains one of the most important parameters.
Both the performance and the functional capabilities of a PC depend to a great extent on the parameters of its individual components, their compatibility, and their coordination. Selecting a computer according to its technical specification alone is not sufficient: if you really want to get the most out of your PC, you will need to optimize its setup and configuration. Like any complex device, a modern computer requires proper maintenance and servicing. If these requirements are neglected, a modern, powerful, and expensive computer will end up inferior to its optimally tuned (and, incidentally, much less expensive) predecessors.
You should optimize your PC whenever you change the tasks it is intended to perform, and also after upgrading its hardware or software configuration. By specifying appropriate settings and running special utilities, it is possible to compensate, to a certain extent, for the natural degradation of electronic components over prolonged use. To maintain a high level of performance, update device drivers and BIOS code from time to time. Maintenance procedures, such as defragmenting and scanning the hard drive(s) for errors, should be performed on a regular basis. Neglecting these operations may, in the best case, slow down your system; in the worst case, it may result in the loss of data. You should also periodically check for the latest bug fixes, updates, and service packs released for operating systems and office programs. Compressing the hard drive with programs like DriveSpace or Compression Agent may significantly increase the drive space available to the user; what's more, in certain situations it may even speed up data access, and therefore increase the operating speed of most programs.
However, even a carefully optimized and tuned PC that undergoes regular maintenance cannot always meet ever-growing performance demands. Sooner or later, every user runs into the problem of insufficient performance. Once comprehensive optimization has exhausted the standard performance reserves, you will have to take more radical steps. Some users solve the problem by buying a new computer, while others upgrade the existing one. Either solution costs money, often a considerable sum. Notice that users often need to replace or upgrade even relatively new computers in good operating condition, bought only a year or two ago (or even less)!
Note, however, that besides hardware and software optimization and upgrades, there is another way to prolong the life cycle of obsolete computer equipment that is still in good operating condition, one that often gives new life to computers that can no longer be called modern. This method is known as overclocking. The idea is that certain elements and components of the computer can be forced to run faster than the speeds they were rated for. Raising the operating frequency of the overclocked equipment increases overall system performance. You should realize, however, that this decreases system reliability, and you must be ready to accept that overclocking shortens the operating life of the equipment. In many cases, though, this trade-off is justifiable and does not represent much of a problem.
Indeed, due to the constant improvement of computer technology, the period during which hardware components can be used expediently is constantly shrinking. As soon as newer, better-quality, higher-performance components appear on the market, it becomes economically unreasonable to keep using their aging predecessors, in spite of advanced production technology, increased reliability, and the prolonged trouble-free service life of PC hardware. At present, processors, video adapters, and hard drives are typically used for no more than 2–3 years. This is an average value; many users replace these still-serviceable components with newer, higher-performance ones even before it becomes really necessary. At the same time, modern hardware components are so reliable that they can be expected to operate for more than 10 years, yet newer, more advanced, higher-performance devices appear on the market every few months. For this reason, you can accept a small loss of reliability and service-life reserve (say, from 10 years down to 5), since the period you will actually use these components is relatively short, and you really don't need the hardware to last that long. There is always a small chance of system crashes and hang-ups even in a correctly overclocked computer; however, if overclocking was done correctly, a system crash is a rare event, and even when it happens, the results are in most cases not fatal. Of course, you shouldn't use overclocking modes in servers, in systems that control mission-critical or potentially dangerous operations (nuclear power plants, missile launching stations, etc.), or in situations where a life is at stake (operating rooms, etc.). Errors in such systems are neither harmless nor allowable.
It should be emphasized that overclocking has recently become popular even among owners of completely new PCs. Aiming to increase performance further, such users often ask technical specialists to set overclocking modes for their processors as soon as they purchase a brand-new PC. More experienced users often perform the operation themselves at home, choosing the optimal mode while strictly controlling and thoroughly testing hardware components at every step of the overclocking procedure.
The user's natural desire to constantly improve his or her computer is not the only reason overclocking is now so popular. Rather, the main point of overclocking is usually to save money. Indeed, this procedure, which by the way is applicable not only to processors, allows you to attain relatively high performance for a relatively low price. The CPU performance gain can reach 20–30%; if you decide to use more extreme, and therefore riskier, modes, you can gain up to 50%, and possibly more. You can apply similar methods to increase the performance of other components, such as the RAM, the video adapter, and even the hard drive. Doing this immediately places the computer into a higher category: overclocked components that started at a lower performance level can often compete with their more powerful and more expensive successors. What's also important is that this is achieved practically without spending any money; the savings on the processor alone can run into hundreds of dollars. And the slight decrease in reliability and stability that comes with overclocking can be minimized by taking a few measures and following certain recommendations.
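The arithmetic behind such gains is simple. For a classic front-side-bus CPU, the core clock is the bus frequency multiplied by the core multiplier, so raising the bus frequency raises the core clock proportionally. Here is a minimal sketch of that calculation; the frequencies and multiplier below are hypothetical examples for illustration, not tuning recommendations:

```python
# Sketch of overclocking arithmetic: core clock = FSB frequency x multiplier.
# All figures are hypothetical examples, not recommendations for real hardware.

def core_clock_mhz(fsb_mhz: float, multiplier: float) -> float:
    """Core frequency of a classic FSB-based CPU, in MHz."""
    return fsb_mhz * multiplier

rated = core_clock_mhz(100.0, 5.0)        # CPU rated at 100 MHz x 5 = 500 MHz
overclocked = core_clock_mhz(133.0, 5.0)  # same multiplier on a 133 MHz bus

gain_percent = (overclocked - rated) / rated * 100
print(f"{rated:.0f} MHz -> {overclocked:.0f} MHz (+{gain_percent:.0f}%)")
```

A bus raised from 100 MHz to 133 MHz yields a 33% core-clock gain, squarely in the 20–30%-and-beyond range discussed above; whether the resulting performance gain materializes depends on how well the rest of the system keeps up.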
Although the main reason for overclocking computer components is obviously to save money, you really shouldn't look at this method of increasing your PC's performance from that angle alone. Quite often, even the newest hardware components, whose performance is already very high, are used in overclocking modes as well. Doing so takes advantage of the latent capabilities of PC components.
Overclocking such components raises their performance and improves their functional capabilities even further. Thus, although the 500 MHz mark for processors with the traditional architecture was reached relatively recently (and has, of course, already been surpassed), certain overclocking enthusiasts have managed to reach heights in the area of 1000 MHz with such processors. Such efforts are often accompanied by corresponding measures to increase the performance of the other subsystems of a modern computer.
The spread of overclocking experience affects the financial interests of hardware manufacturers, who, for understandable reasons, do not want to lose any part of their profits. Besides, overclocking methods are also often used by so-called remarkers, who intentionally change the marking on computer components (processors or memory modules, for example) and pass them off as higher-performance, and thus more expensive, products than they actually are. Some companies, generally smaller ones, go even further: they release devices (video adapters, motherboards, or even entire computers) whose elements have already been overclocked, and, obviously, keep this a secret from the potential customer.
To guard against such forgery and to defend their commercial interests, most well-known manufacturers have included various safeguards in their products that inhibit the fabrication of fake components and limit the performance gains available through overclocking modes.
Nonetheless, despite the determined efforts of certain processor manufacturers to prevent their products from being overclocked, the popularity of overclocking is growing at a rapid rate. It is further promoted by the release of certain motherboards, chipsets, and even software intended to simplify overclocking experiments, and the computer market offers a wide range of products for cooling computer components. All this makes setting the appropriate modes, adjusting the parameters, and testing much simpler.
Not only enthusiastic individuals but also many serious companies have dedicated their efforts to investigating overclocking modes and developing recommendations for their use. Sometimes such investigations are even performed with support from hardware manufacturers. An example of cooperation in this field is the collaboration between Kryotech and AMD: as a result of their investigations, an AMD processor, under experimental conditions of extreme overclocking, was able to reach 1 GHz long before the release of processors rated to run at that frequency.
The keen interest that a number of computer companies take in overclocking is easily explained. Such studies aid in improving the technology, enhancing the architecture, and increasing the performance of hardware components. Furthermore, these investigations serve as a source of statistics on errors and faults, which allows manufacturers to develop efficient and reliable hardware and software. After all, the ability of hardware components to remain stable and reliable in overclocking modes is an excellent promotion for their manufacturers. Certain manufacturers welcome this research because they can use the experience and technology gained by the leading firms investigating overclocking problems to release high-performance hardware. Compaq, for example, offers a platform for high-performance servers based on technology developed by Kryotech, which provided extreme cooling for AMD Athlon processors overclocked to frequencies 1.5–2 times higher than normal.
As the community of overclocking enthusiasts expands, the number of specialized Web sites related to overclocking problems grows accordingly. These sites are intended to make overclocking even more popular. Generally, materials published there discuss various aspects of the problems that may arise, and give corresponding recommendations.
The materials presented in this book attempt to develop a systematic approach to the overclocking problem and to formulate general and specific recommendations on setting overclocking parameters for the most important hardware components.
