Docsity

Understanding Algorithms and Computer Processing: CPU, Memory, and Instruction Execution, Study notes of Computer Science

An in-depth explanation of algorithms, their role in problem-solving, and the process of how computers use them to solve problems. It covers the CPU and memory components, the role of the control unit and arithmetic/logic unit, and the translation of high-level languages to machine language. Additionally, it discusses the importance of programming and the steps involved in creating algorithms.

Typology: Study notes

Pre 2010

Uploaded on 07/23/2009

koofers-user-nzo 🇺🇸

Computers use algorithms to solve problems.
An algorithm is a detailed (every step must be stated) and unambiguous (every
step must be perfectly clear) set of step-by-step procedures for solving a problem.
Algorithms are general purpose (they are the general way to solve a whole class of
problems) because no data is hard-coded into the instructions; the data is
represented symbolically, using a field, a spreadsheet cell address, or a variable name.
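As a small illustration (the function and data here are hypothetical, not from the notes), an algorithm for averaging grades works for any input precisely because the data is referenced by a variable name rather than hard-coded:

```python
def average(grades):
    """Average any list of grades.

    The data is referenced symbolically (the variable name
    `grades`), not hard-coded, so the same steps solve the
    problem for any input -- this is what makes the algorithm
    general purpose.
    """
    total = 0
    for g in grades:                # step: add each grade to a running total
        total += g
    return total / len(grades)      # step: divide the total by the count

print(average([80, 90, 100]))       # 90.0
```

The same steps run unchanged on any list of grades; only the data bound to the name changes.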
The process activity is the key to understanding how a computer actually solves
problems. The processor unit contains the CPU and memory, the two major H/W
components needed for processing. The CPU controls all the H/W and performs the
actual processing. Memory is the “working space” for the CPU, as everything being
processed must be in memory.
The CPU is a single chip, but is divided into the CU (control unit) and the ALU
(arithmetic/logic unit).
For every S/W instruction, the CU must Fetch (get the instruction from
memory), Decode (determine what that instruction means), and Execute (inform
the correct H/W device what to do) it. This is an algorithm telling the computer
how to follow an algorithm. In essence, the CU runs S/W.
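The fetch-decode-execute cycle can be sketched as a short loop. This is a toy model with a hypothetical three-instruction machine language, not a real CPU:

```python
# A minimal sketch of the control unit's cycle. The program and
# its instruction set (LOAD, ADD, HALT) are hypothetical.
memory = ["LOAD 5", "ADD 3", "HALT"]     # program stored in memory

accumulator = 0
pc = 0                                   # program counter: address to fetch next
while True:
    instruction = memory[pc]             # Fetch: get the instruction from memory
    opcode, *operand = instruction.split()  # Decode: determine what it means
    pc += 1
    if opcode == "LOAD":                 # Execute: tell the hardware what to do
        accumulator = int(operand[0])
    elif opcode == "ADD":
        accumulator += int(operand[0])
    elif opcode == "HALT":
        break

print(accumulator)                       # 8
```

Note that the loop itself is an algorithm for following algorithms: the CU repeats fetch, decode, execute until the program halts.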
The ALU is the part of the processor that actually performs the processing.
Memory is the second major H/W component needed for processing. The
fundamental unit of memory is a byte, and each byte of memory has its own address. A
S/W field is a placeholder and descriptor of what is in memory: as a placeholder, it
reserves a byte of memory and remembers its address. Everything the CPU is processing
is in memory, which is why the amount of data sitting in storage doesn’t affect the
speed of processing.
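The byte-and-address picture can be modeled with a Python bytearray standing in for RAM (the addresses and values below are made up for illustration):

```python
# A sketch of memory as addressed bytes. The bytearray stands in
# for RAM; the address and stored value are hypothetical.
ram = bytearray(8)          # eight bytes of "working space", addresses 0-7

field_address = 3           # a S/W field remembers the address it reserved
ram[field_address] = 65     # store a byte pattern at that address

# The CPU reaches any byte directly by its address:
print(ram[field_address])   # 65
```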
The only language the computer understands is binary. The 0 and 1 represent the two
positions a CPU switch can be in (on or off).
There are several different forms of binary:
Numbers are represented in base 2. They are typically written in groups of
eight bits (binary digits), because a group of eight bits forms one byte (the basic
measure of storage or memory).
Characters are represented in ASCII.
Instructions are represented in ML (machine language).
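The three forms are all the same kind of thing, bit patterns; what differs is the interpretation. A quick sketch of one byte read three ways:

```python
# One byte pattern, interpreted as a number, as an ASCII
# character, and shown as its raw bits.
pattern = 0b01000001             # eight bits = one byte

print(pattern)                   # as a base-2 number: 65
print(chr(pattern))              # as an ASCII character: 'A'
print(format(pattern, "08b"))    # the raw bit pattern: 01000001
```

To a CPU, the same pattern could equally be a machine-language instruction; context decides the meaning.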
A programming language is a collection of possible instructions for the CPU, and
programmers use them to create application packages.
Programmers use High Level Languages (HLL) to make programming easier.
Because a HLL uses words and symbols, it is much closer to English than the ones and
zeros of ML.
The computer can’t understand a HLL, however, so it must be translated to ML.
The computer performs this translation through the HLL-to-ML
“dictionary.” This is a file that contains all the HLL statements and all the
equivalent ML binary patterns. Using this dictionary, the


computer finds the HLL statement that the programmer entered and converts it to the associated ML statement. This translation is possible because a HLL has a limited syntax (only certain forms are allowed to represent expressions) and exact semantics (every HLL statement has only one meaning in ML). Because the computer performs this translation, a HLL can be used on different types of computers (it’s “portable”).

Programming is level one problem solving, because the programmer is creating algorithms. The steps in programming (notice that the first three are level 1 problem solving and the last is level 2):

Understand the problem. Without a total understanding of the problem, the algorithms to solve that problem cannot be created.
Plan the solution. Use pseudocode (an algorithm written in a human language) to develop an outline.
Create code (an algorithm written in a computer language).
Test the application.

S/W applications don’t interact directly with the H/W. Systems S/W controls all the H/W and makes using it easier for users and other S/W.

One kind of systems S/W is translation S/W. Translation S/W contains the “dictionary” that translates HLL to ML. There are two kinds of translation S/W:

Interpreter: translates one HLL statement, executes that statement, and repeats until all statements have been translated and executed.
Compiler: translates all HLL statements into a group of ML, and that group is saved as a file for execution.

A compiled application is faster than an interpreted one, and it requires no translation S/W to run (because it has already been changed to ML).

Another kind of systems S/W is booting S/W. Booting S/W is the start-up instructions for the computer. It performs the self-test of the H/W and loads the Operating System (OS). Because the instructions a computer is processing must be in memory, and RAM is erased when the power supply is cut off, booting S/W must be in ROM.
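The translation “dictionary” and the compiler style of translation can be sketched in a few lines. All the HLL statements and bit patterns below are hypothetical, invented for illustration:

```python
# A toy HLL-to-ML dictionary: every allowed HLL statement maps to
# exactly one ML bit pattern (limited syntax, exact semantics).
# All entries are hypothetical.
dictionary = {
    "ADD X": "10100001",
    "SUB X": "10100010",
    "STOP":  "11111111",
}

def compile_program(hll_statements):
    """Compiler style: translate every statement up front, producing
    a group of ML that can be saved as a file and run later."""
    return [dictionary[s] for s in hll_statements]

ml = compile_program(["ADD X", "SUB X", "STOP"])
print(ml)   # ['10100001', '10100010', '11111111']
```

An interpreter would instead look up and execute one statement at a time inside a loop, which is why interpreted programs run slower and always need the translation S/W present.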
Booting S/W loads the OS, which controls and manages all the H/W and other S/W in the system. The OS is a command interpreter: it inputs, interprets, and executes each user command (an OS command, an application package’s file name, etc.). The OS is the interface between H/W and other S/W. The OS is also a resource manager: it manages the use of scarce resources, and this is the most important job of a large computer OS.
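The OS's command-interpreter loop can be sketched as follows. The commands and responses are hypothetical, chosen only to show the input-interpret-execute cycle:

```python
# A toy sketch of the OS as a command interpreter: input each
# command, interpret what kind of command it is, execute it.
def run_shell(commands):
    output = []
    for command in commands:              # input the user command
        if command == "dir":              # interpret: an OS command
            output.append("listing files...")
        elif command.endswith(".exe"):    # interpret: an application's file name
            output.append(f"launching {command}")
        else:                             # interpret: not recognized
            output.append(f"unknown command: {command}")
    return output

print(run_shell(["dir", "game.exe"]))
```

A real OS shell does the same three things continuously, which is what makes it the interface between the user, the other S/W, and the H/W.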