Hello friends, this section is dedicated to readers who want to strengthen their fundamentals of computer science. It contains Computer Science Engineering multiple choice questions with answers (MCQs). Questions like these regularly appear in university exams, competitive exams, and interviews, so practice them well and aim for the score you want.
1. What is a light pen?
- Mechanical Input device
- Optical input device
- Electronic input device
- Optical output device
A light pen is an optical input device that was commonly used with early computer systems and CRT displays. It is a handheld device with a light-sensitive tip or sensor at one end and a cord connecting it to the computer at the other.
To use a light pen, the user touches its tip to the surface of the display. The pen detects the instant the CRT's electron beam sweeps past that spot as it scans across the screen; from the timing of that pulse, the computer determines the position the pen is pointing at.
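That timing-to-position calculation can be sketched in Python (an illustrative model with a made-up 640-pixel scanline; real hardware latched counter registers rather than doing arithmetic in software):

```python
# Illustrative raster model: the beam visits H_PIXELS positions per
# scanline, left to right, top to bottom. The resolution is assumed.
H_PIXELS = 640

def beam_position(pixel_clock: int) -> tuple[int, int]:
    """Map the number of pixel-clock ticks since the start of a frame
    to the (x, y) screen position the beam is illuminating."""
    y, x = divmod(pixel_clock, H_PIXELS)
    return x, y

# If the pen's sensor fires 64,050 ticks into the frame, the beam
# (and hence the pen) is at column 50 of row 100.
print(beam_position(100 * 640 + 50))  # (50, 100)
```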
2. BCD is
- Binary Coded Decimal
- Bit Coded Decimal
- Binary Coded Digit
- Bit Coded Digit
BCD stands for Binary Coded Decimal. BCD is a binary representation of decimal numbers, where each decimal digit is represented by a four-bit binary code.
In BCD, each decimal digit from 0 to 9 is represented by its corresponding four-bit binary value. For example:
0 = 0000
1 = 0001
2 = 0010
3 = 0011
4 = 0100
5 = 0101
6 = 0110
7 = 0111
8 = 1000
9 = 1001
BCD is often used in digital systems, particularly in applications that require precise decimal representation, such as digital clocks, calculators, and numerical displays. BCD representation allows for direct conversion between binary and decimal values and simplifies decimal arithmetic operations in digital circuits.
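The digit-by-digit mapping above can be sketched in Python (a small illustration; the helper names are my own):

```python
def to_bcd(n: int) -> str:
    """Encode a non-negative decimal integer as a BCD bit string,
    four bits per decimal digit."""
    return " ".join(format(int(d), "04b") for d in str(n))

def from_bcd(bits: str) -> int:
    """Decode a space-separated BCD bit string back to an integer."""
    return int("".join(str(int(group, 2)) for group in bits.split()))

print(to_bcd(59))             # 0101 1001
print(from_bcd("0101 1001"))  # 59
```

Note that each decimal digit gets its own 4-bit group, so 59 becomes 0101 1001 in BCD, whereas plain binary for 59 is 111011.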
3. ASCII stands for
- American Stable Code for International Interchange
- American Standard Case for Institutional Interchange
- American Standard Code for Information Interchange
- American Standard Code for Interchange Information
ASCII stands for American Standard Code for Information Interchange. It is a character encoding scheme that assigns unique numerical values to represent characters, symbols, and control codes used in computers and communication systems.
The ASCII standard was developed in the early 1960s by an American Standards Association committee, with Robert W. Bemer among its major contributors. It originally defined a 7-bit character set of 128 characters: uppercase and lowercase letters, digits, punctuation marks, and control characters.
In ASCII, each character is assigned a 7-bit binary value, commonly written padded to eight bits. For example, the uppercase letter 'A' (decimal 65) is represented by 01000001, while the digit '7' (decimal 55) is represented by 00110111.
ASCII has been widely used as a character encoding standard in many computing systems, programming languages, and communication protocols. It provides a consistent way to represent and exchange textual information across different platforms and devices.
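These codes are easy to check in Python, whose built-in `ord` and `chr` functions return the standard values for ASCII characters:

```python
# ord() returns a character's code point; for the 128 ASCII characters
# it matches the 7-bit standard. Shown here padded to 8 bits.
print(ord("A"), format(ord("A"), "08b"))  # 65 01000001
print(ord("7"), format(ord("7"), "08b"))  # 55 00110111

# chr() reverses the mapping.
print(chr(65))  # A

# Upper- and lowercase letters differ by a single bit (value 32).
print(ord("a") - ord("A"))  # 32
```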
4. Which of the following is first generation of computer?
- EDSAC
- IBM-1401
- CDC-1604
- ICL-2900
EDSAC (Electronic Delay Storage Automatic Calculator) is the first-generation computer among the options; the IBM-1401, CDC-1604, and ICL-2900 are later, transistor- or IC-based machines. EDSAC was a pioneering computer built at the University of Cambridge in the United Kingdom.
EDSAC was developed by Maurice Wilkes and his team, and it became operational in 1949. It utilized vacuum tubes for processing and employed a mercury delay line memory system for storing data. It was a stored-program computer, meaning that both program instructions and data were stored in the same memory.
EDSAC played a crucial role in advancing computer science and hosted several important early programs and research projects. Notably, it ran "OXO" ("Noughts and Crosses", i.e. Tic-Tac-Toe), often cited as one of the first graphical computer games.
EDSAC's design and operation significantly influenced subsequent computer systems, and it is often considered one of the early milestones in the development of digital computers.
5. Chief component of first generation computer was
- Transistors
- Vacuum Tubes and Valves
- Integrated Circuits
- None of above
The chief component of first-generation computers was vacuum tubes, also known as electronic valves. Vacuum tubes were the primary electronic components used for processing and amplification in early computers.
Vacuum tubes were glass tubes containing metal electrodes, which were used to control the flow of electrons. They provided the means for performing calculations, storing and manipulating data, and controlling the overall operation of the computer.
The use of vacuum tubes in first-generation computers had some drawbacks, including their large size, high power consumption, and limited lifespan. However, they paved the way for the development of electronic digital computers and marked a significant milestone in the history of computing technology.
6. FORTRAN is
- File Translation
- Format Translation
- Formula Translation
- Floppy Translation
FORTRAN is an acronym for "Formula Translation." It is a high-level programming language specifically designed for scientific and engineering calculations. FORTRAN was developed in the 1950s by a team led by John Backus at IBM.
Key features of FORTRAN include:
1. Numerical Computation: FORTRAN was designed to efficiently handle numerical computations and mathematical operations. It supports a wide range of mathematical functions, arrays, and matrix operations.
2. Portability: FORTRAN programs can be easily transferred or ported across different computer systems and architectures. This portability made it a widely adopted language in the scientific and engineering communities.
3. Efficiency: FORTRAN is known for its ability to generate highly efficient code. It includes features such as loop optimization and array processing that allow for efficient utilization of computer resources.
4. Scientific Programming: FORTRAN includes built-in support for scientific programming, with features such as complex number handling, trigonometric functions, and input/output operations tailored for scientific calculations.
5. Legacy and Current Usage: FORTRAN has a long history and remains in use today, particularly in fields such as scientific research, engineering, and computational mathematics. Modern versions of FORTRAN, such as Fortran 90, Fortran 95, and Fortran 2003, have introduced additional features and improvements while maintaining compatibility with older versions.
FORTRAN played a crucial role in the early development of computer programming and contributed to the advancement of scientific and engineering disciplines by enabling complex calculations and simulations.
7. EEPROM stands for
- Electrically Erasable Programmable Read Only Memory
- Easily Erasable Programmable Read Only Memory
- Electronic Erasable Programmable Read Only Memory
- None of the above
EEPROM stands for Electrically Erasable Programmable Read-Only Memory.
EEPROM is a type of non-volatile memory that can be electrically erased and reprogrammed. It allows for data to be written, modified, and erased multiple times, making it useful for applications that require frequent updates or modifications to stored information.
Unlike traditional ROM (Read-Only Memory), which is programmed at the time of manufacturing and cannot be changed, EEPROM can be reprogrammed using electrical signals. It retains its stored data even when the power is turned off, hence the term "non-volatile."
EEPROM is commonly used in various electronic devices, including microcontrollers, embedded systems, computer peripherals, and consumer electronics. It provides a flexible and reliable means of storing small to medium-sized amounts of data that need to be modified or updated during the device's lifetime.
8. Second Generation computers were developed during
- 1949 to 1955
- 1956 to 1965
- 1965 to 1970
- 1970 to 1990
Second-generation computers were developed roughly between 1956 and 1965. This period marked a significant advancement in computer technology over the first generation.
Key characteristics of second-generation computers include:
1. Transistors: The major innovation in second-generation computers was the replacement of vacuum tubes with transistors. Transistors were smaller, more reliable, and consumed less power than vacuum tubes. They allowed for faster and more efficient data processing.
2. Magnetic Core Memory: Second-generation computers utilized magnetic core memory for data storage. Magnetic cores were small rings of magnetic material that could be magnetized or demagnetized to represent binary data. This form of memory was faster, more reliable, and had higher storage capacity than the mercury delay line memory used in first-generation computers.
3. Assembly Language and High-Level Languages: Second-generation computers saw the emergence of assembly language and high-level programming languages. Assembly language allowed programmers to write instructions using mnemonic codes that were easier to understand than machine language. High-level languages, such as COBOL and FORTRAN, provided even greater abstraction and ease of programming.
4. Batch Processing and Operating Systems: Second-generation computers introduced batch processing, where multiple jobs or tasks were collected and processed together in a sequence. This improved overall system efficiency. Additionally, operating systems were developed to manage and control the execution of programs, memory allocation, and input/output operations.
5. Decreased Size and Increased Reliability: Second-generation computers were smaller, more compact, and more reliable compared to their first-generation counterparts. The use of transistors and improved manufacturing techniques allowed for miniaturization and enhanced overall system reliability.
Prominent examples of second-generation computers include IBM 1401, IBM 7090, and UNIVAC 1107. These computers were faster, more powerful, and more accessible than their predecessors, and they played a significant role in the widespread adoption of computing technology in various industries and sectors.
9. The computer size was very large in
- First Generation
- Second Generation
- Third Generation
- Fourth Generation
The computer size was very large in the first generation of computers.
During the first generation of computers in the 1940s and 1950s, computer systems were massive and took up entire rooms. They were typically housed in large cabinets or racks and required extensive cooling and power supply systems. The physical size of these computers was due to the large number of vacuum tubes used for electronic circuitry, along with other components like punch card readers, tape drives, and bulky memory systems.
In the second generation of computers in the 1950s to early 1960s, there was a significant reduction in size compared to the first generation. The use of transistors instead of vacuum tubes allowed for smaller and more compact computer systems. However, they still occupied considerable space and often required specialized rooms or facilities to accommodate them.
It was during subsequent generations, particularly from the third generation onward, that computers started to become smaller and more compact. The development of integrated circuits, miniaturization of components, and advancements in semiconductor technology contributed to the steady reduction in computer size over time.
Today, we have a wide range of computing devices, from desktop computers to laptops, tablets, and smartphones, which are significantly smaller and more portable than the early generations of computers.
10. Microprocessors as switching devices are for which generation computers
- First Generation
- Second Generation
- Third Generation
- Fourth Generation
Microprocessors as switching devices characterize the fourth generation of computers.
The fourth generation began around 1971 and continues to the present. Its defining technology is large-scale integration (LSI), and later very-large-scale integration (VLSI), which made it possible to place an entire central processing unit (CPU) on a single chip: the microprocessor. Third-generation computers (mid-1960s to early 1970s) used integrated circuits, but their processors were still built from many separate chips.
A microprocessor serves as the "brain" of the computer, combining the arithmetic and logic unit, the control unit, and the registers on a single piece of silicon.
With the introduction of microprocessors, computers became more compact, faster, and more cost-effective. Microprocessors enabled greater computational power, improved efficiency, and enhanced capabilities for data processing, control, and communication.
Notable early microprocessors include the Intel 4004 and Intel 8008, introduced in 1971 and 1972, respectively. These chips ushered in the fourth generation and made smaller, more affordable computers, and eventually the personal computer, possible.
Overall, the adoption of the microprocessor as the switching device defines the fourth generation and drove the continued miniaturization and growth in processing power of computers.