
Code Tuning Strategies for Performance

The document discusses various strategies for code tuning to improve a program's performance. It describes considering efficiency from the perspectives of program requirements, design, classes, routines, operating system interactions, and compilation before tuning code. The Pareto principle and common misconceptions about code tuning are explained. Common sources of inefficiency like input/output operations, paging, system calls, and interpreted languages are identified. Precise measurement of code performance is emphasized along with iterating the tuning process.

Uploaded by Ateeqa Kokab

Code Tuning Strategies

SAWERA NAWAZ 20011598-106
HADIQA ASIF 20011598-132
NOSHEEN SHAKEEL 20011598-115
Performance Overview

Code tuning is one way of improving a program’s performance. You can often find
other ways to improve performance more—and in less time and with less harm to
the code—than by code tuning. This section describes the options.
Performance and Code Tuning

Once you’ve chosen efficiency as a priority, whether its emphasis is on speed or on size, you
should consider several options before choosing to improve either speed or size at the code level.
Think about efficiency from each of these viewpoints:
o Program requirements
o Program design
o Class and routine design
o Operating-system interactions
o Code compilation
o Hardware
o Code tuning
Introduction to Code Tuning

Code tuning is appealing for several reasons. One attraction is that it seems to defy the laws of
nature. It’s incredibly satisfying to take a routine that executes in 20 microseconds, tweak a few
lines, and cut the execution time to 2 microseconds.
Two ideas frame the common problems in code tuning:
o The Pareto Principle
o Old Wives’ Tales
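
A speedup of that magnitude usually comes from removing repeated work inside a loop. Below is a minimal Python sketch of the idea, not taken from the slides: the same routine before and after tuning, compared with the standard `timeit` module (the function and variable names are invented for illustration).

```python
import timeit

def total_len_slow(words):
    # Untuned: re-evaluates len(words) and does an index lookup on every pass.
    total = 0
    for i in range(len(words)):
        total += len(words[i])
    return total

def total_len_fast(words):
    # Tuned: same result, but iteration and summation happen in C inside sum().
    return sum(len(w) for w in words)

words = ["tune", "the", "hot", "spots"] * 1000
assert total_len_slow(words) == total_len_fast(words)

slow = timeit.timeit(lambda: total_len_slow(words), number=200)
fast = timeit.timeit(lambda: total_len_fast(words), number=200)
print(f"untuned: {slow:.4f}s  tuned: {fast:.4f}s")
```

The crucial habit is the final comparison: never assume the tweak helped; time both versions and keep the change only if the measurements confirm it.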
The Pareto Principle

The Pareto Principle, also known as the 80/20 rule, states that you can get 80
percent of the result with 20 percent of the effort. The principle applies to a lot of
areas other than programming, but it definitely applies to program optimization.
Old Wives’ Tales

Much of what you’ve heard about code tuning is false, including the following common
misapprehensions:
o Reducing the lines of code in a high-level language improves the speed or size of the resulting
machine code—false! Many programmers cling tenaciously to the belief that if they can write code
in one or two lines, it will be the most efficient possible.
o Certain operations are probably faster or smaller than others—false!
o You should optimize as you go—false!
o A fast program is just as important as a correct one—false!
Kinds of Fat and Molasses

In code tuning you find the parts of a program that are as slow as molasses in winter and as big as
Godzilla and change them so that they are as fast as greased lightning and so skinny they can hide in
the cracks between the other bytes in RAM. You always have to profile the program to know with
any confidence which parts are slow and fat, but some operations have a long history of laziness and
obesity, and you can start by investigating them.
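
The profiling step described above can be sketched with Python's built-in `cProfile` module. The functions `hot` and `cold` below are invented stand-ins for a program's expensive and cheap parts; the point is that the statistics, not intuition, show where the time goes.

```python
import cProfile
import io
import pstats

def hot():
    # The "molasses": deliberately does most of the program's work.
    return sum(i * i for i in range(200_000))

def cold():
    # Cheap by comparison; tuning this first would waste effort.
    return sum(range(100))

def program():
    for _ in range(5):
        hot()
    cold()

profiler = cProfile.Profile()
profiler.enable()
program()
profiler.disable()

# Print the five most expensive entries, sorted by cumulative time;
# the hot() rows dominate the report.
report = io.StringIO()
pstats.Stats(profiler, stream=report).sort_stats("cumulative").print_stats(5)
print(report.getvalue())
```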
Common Sources of Inefficiency

Here are several common sources of inefficiency:


o Input/output operations
o Paging
o System calls
o Interpreted languages
o Errors
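
The first item on that list, input/output overhead, is easy to demonstrate. This hedged Python sketch (file names and record counts are arbitrary) writes the same data twice: once with one raw `os.write` system call per record, and once through Python's buffered file object, which batches the records into a few large writes.

```python
import os
import tempfile
import time

lines = [f"record {i}\n" for i in range(50_000)]

with tempfile.TemporaryDirectory() as d:
    slow_path = os.path.join(d, "per_line.txt")
    fast_path = os.path.join(d, "buffered.txt")

    # One os.write per record: a system call for every line.
    start = time.perf_counter()
    fd = os.open(slow_path, os.O_WRONLY | os.O_CREAT)
    for line in lines:
        os.write(fd, line.encode())
    os.close(fd)
    per_line = time.perf_counter() - start

    # Buffered file object: Python accumulates data and issues few large writes.
    start = time.perf_counter()
    with open(fast_path, "w", newline="") as f:
        for line in lines:
            f.write(line)
    buffered = time.perf_counter() - start

    with open(slow_path, newline="") as a, open(fast_path, newline="") as b:
        same = a.read() == b.read()

print(f"per-line syscalls: {per_line:.3f}s  buffered: {buffered:.3f}s  identical output: {same}")
```

On most systems the buffered version is markedly faster even though both produce byte-identical files, because crossing the user/kernel boundary dominates the cost of each tiny write.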
Paging
25.4 Measurement:

• Small parts of a program usually consume a disproportionate share of the run time; measure your
code to find the hot spots.
• Once you’ve found the hot spots and optimized them, measure the code again to assess how
much you’ve improved it.
• Many aspects of performance are counterintuitive.
• The original code was straightforward, but performance of the matrix-summation routine was
critical, and I knew that all the array accesses and loop tests had to be expensive. Even though the
tuned code wasn’t as readable as the first code, especially to programmers who aren’t C++ experts,
I was magnificently pleased with myself.
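
The matrix-summation anecdote (the original was C++) can be mimicked in Python. This is an illustrative sketch, not the book's code: the straightforward version pays for explicit loop tests and double indexing on every element, while the tuned version pushes the inner loop into the C-level built-in `sum`.

```python
ROWS, COLS = 300, 300
matrix = [[r * COLS + c for c in range(COLS)] for r in range(ROWS)]

def sum_indexed(m):
    # Straightforward: explicit loop counters and m[row][col] lookups throughout.
    total = 0
    for row in range(ROWS):
        for col in range(COLS):
            total += m[row][col]
    return total

def sum_rows(m):
    # Tuned: each row is summed by the built-in, removing per-element indexing.
    return sum(sum(row) for row in m)

# Both versions must agree before any speed comparison is meaningful.
assert sum_indexed(matrix) == sum_rows(matrix)
print(sum_rows(matrix))  # prints 4049955000
```

Whether such a rewrite is worth the readability cost is exactly the trade-off the text describes; measure first, and keep the less readable version only if the routine is genuinely critical.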
Measurements Need to Be Precise:
• Performance measurements need to be precise. Timing your program with a stopwatch or by
counting “one elephant, two elephant, three elephant” isn’t precise. Profiling tools are useful, or
you can use your system’s clock and routines that record the elapsed times for computing
operations.

• Whether you use someone else’s tool or write your own code to make the measurements, make
sure that you’re measuring only the execution time of the code you’re tuning. Use the number of
CPU clock ticks allocated to your program rather than the time of day. Otherwise, when the
system switches from your program to another program, your routines can be falsely charged for
the time the other program spends running.
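
The distinction between the time of day and the CPU time actually charged to your process is visible in Python's standard clocks: `time.perf_counter` measures wall-clock time, while `time.process_time` counts only the CPU time your process consumes. A sleep, like time spent running another program, advances the first clock but barely touches the second. A small sketch:

```python
import time

start_wall = time.perf_counter()
start_cpu = time.process_time()

time.sleep(0.2)               # wall clock advances; our process burns almost no CPU
_ = sum(range(1_000_000))     # CPU-bound work advances both clocks

wall = time.perf_counter() - start_wall
cpu = time.process_time() - start_cpu
print(f"wall: {wall:.3f}s  cpu: {cpu:.3f}s")  # cpu excludes the 0.2 s sleep
```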
25.5 Iteration:

Once you’ve identified a bottleneck, tune it, measure the improvement, and then tune and
measure again. Gains from repeated passes over the same hot spot compound, so keep iterating
until the code meets its performance goal or further changes stop paying off.

Common questions


Profiling is emphasized as a critical component in the code tuning process because it provides empirical data on which parts of the code consume the most resources ('hot spots'). This information allows developers to focus their optimization efforts on parts of the code that will yield the most significant performance improvements. Profiling helps avoid the pitfalls of guessing or relying on intuition about inefficient areas, which is often fallible and can lead to wasted effort on less impactful areas.

Iteration is important in the code tuning process as it allows for continuous refinement and validation of performance improvements. By repeatedly measuring performance metrics, making changes, and assessing outcomes, developers can incrementally improve efficiency. This process acknowledges that initial optimizations may not achieve the desired effect or might introduce new issues. Iteration ensures that improvements are quantitatively assessed and contribute positively to reaching performance goals.

The document describes old wives' tales in code tuning as prevailing but erroneous beliefs that can misguide developers. These include oversimplified assumptions about code efficiency and premature optimization. The suggested approach to counter these misconceptions involves relying on empirical data from profiling tools to determine actual inefficiencies, focusing on known bottlenecks, and avoiding optimizations without solid evidence of their necessity. This practical approach helps maintain code quality and focuses efforts on impactful areas.

The document suggests a trade-off between code readability and performance optimization. Performance enhancements often lead to less readable code due to increased complexity or the use of system-specific optimizations. While these changes can bring substantial performance benefits, they can also make the code harder to understand and maintain, especially for programmers who are not experts in the language or system used. Identifying the balance between achieving necessary performance levels while maintaining code readability is crucial.

Common sources of inefficiency in programs, such as input/output operations, paging, system calls, interpreted languages, and errors, can often be identified through profiling tools that highlight 'hot spots' of code where the most resources are consumed. Once these areas are identified, developers can focus their tuning efforts on optimizing or restructuring code, reducing unnecessary computations, or improving the logic to minimize the impact of these inefficiencies. Continuous measurement and iteration after changes allow developers to confirm improvements and further refine performance if necessary.

Operating-system interactions can significantly impact code performance as they govern how effectively a program can utilize system resources such as memory and CPU time. The document highlights elements like paging and system calls as potential sources of inefficiency, where frequent or inefficient interactions can lead to performance bottlenecks. Considerations involving how a program accesses memory and communicates with the OS (such as using efficient I/O operations) are vital in minimizing these overheads, thereby improving overall performance.

Considering program requirements and design choices before code tuning is vital because these foundational aspects dictate the overall architecture and operational constraints within which performance improvements must be made. Efficient design can often prevent the need for extensive code-level optimizations by addressing potential performance bottlenecks at a higher system level. Prioritizing these elements ensures that efforts are aligned with broader strategic goals and constraints, avoiding premature or unnecessary tuning that might negatively impact code maintainability or readability.

Precise measurement is crucial in code tuning to accurately identify inefficiencies and verify the impact of optimizations. Precise performance assessment requires using tools like profilers or system routines to track execution times to the granularity of CPU clock ticks rather than using arbitrary or imprecise methods such as timing with a stopwatch. Accurate measurements help in pinpointing specific bottlenecks and in quantifying improvements after optimizations, thereby ensuring that tuning efforts are effectively addressing performance issues.

The Pareto Principle, or the 80/20 rule, in the context of code tuning suggests that 80% of the performance gains can often be achieved by optimizing just 20% of the code. This principle implies that instead of attempting to optimize all parts of a program indiscriminately, developers should focus on identifying and tuning the key sections or 'hot spots' that contribute most significantly to inefficiency. This targeted approach can lead to significant improvements in performance with relatively less effort compared to a blanket optimization strategy.

Several common misconceptions about code tuning include the beliefs that reducing lines of code in a high-level language improves efficiency, that certain operations are inherently faster or smaller, that optimization should be done continuously during development, and that a fast program is as important as a correct one. These are considered false because efficiency depends more on identifying and optimizing critical sections of the code than on reducing code volume. Blindly assuming certain operations are efficient, or optimizing without profiling, can lead to misguided effort. Moreover, while both correctness and performance matter, focusing solely on speed can sacrifice correctness.
