Shared Flashcard Set

Details

Title: CMI
Description: Chapter 2
Total Cards: 23
Subject: Other
Level: Not Applicable
Created: 09/27/2005

Cards

Term
Match each base with its corresponding number system. One will not be used. A) Base 10, B) Base 16, C) Base 2, D) Base 15. __ Hexadecimal, __ Binary, __ Decimal
Definition
A = Decimal, B = Hexadecimal, C = Binary
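A minimal sketch of the three matched systems (Python is assumed here; the card itself names no language), showing the same value written in each base:

value = 255
print(bin(value))  # '0b11111111' -> binary, base 2
print(value)       # 255          -> decimal, base 10
print(hex(value))  # '0xff'       -> hexadecimal, base 16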
Term
What is the most basic unit understood by computers?
Definition
Bit
Term
The 2 possible values for one bit are __ and __.
Definition
1 and 0
Term
How many bits does each have? Byte = ___, Word = ___, Nibble = ___, Crumb = ___
Definition
Byte = 8 bits; Word = multiple bytes; Nibble = 4 bits; Crumb = 2 bits
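A small Python sketch of these sizes (the two-byte word is an assumption taken from the Word card later in this set):

SIZES_IN_BITS = {"crumb": 2, "nibble": 4, "byte": 8, "word": 2 * 8}
for unit, bits in SIZES_IN_BITS.items():
    print(f"{unit}: {bits} bits")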
Term
In the binary number system, a 1 indicates the electricity is ___, while a 0 indicates the electricity is ___.
Definition
On ; Off
Term
This is a small symbol placed at the bottom right-hand corner of a number. It is used to indicate the base system of the number.
Definition
Subscript
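On paper the base is shown with a subscript; in code the base is stated explicitly. A hedged Python sketch, in which int's base argument plays the subscript's role:

print(int("1010", 2))   # 10   -> 1010 read as binary
print(int("1010", 16))  # 4112 -> 1010 read as hexadecimal
print(int("1010", 10))  # 1010 -> 1010 read as decimal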
Term
The lowest binary number that can be created with one byte is ________; its decimal equivalent is zero.
Definition
00000000
Term
The highest binary number that can be created with one byte is 11111111; its decimal equivalent is ___.
Definition
255
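A worked check of this card, sketched in Python: each 1 bit in 11111111 contributes a power of two, and 2^0 through 2^7 sum to 255.

bits = "11111111"
total = sum(2 ** i for i, bit in enumerate(reversed(bits)) if bit == "1")
print(total)               # 255
print(int("11111111", 2))  # 255, the same result via built-in conversion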
Term
This code assigns a unique Byte of information to all uppercase and lowercase letters of the English alphabet, as well as to commonly-used formatting commands and text symbols.
Definition
ASCII
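A short Python illustration (assumed, not part of the card): ord() returns the numeric code assigned to a character, and chr() maps a code back to its character.

print(ord("A"))  # 65
print(ord("a"))  # 97
print(chr(66))   # 'B'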
Term
This system is a comprehensive collection of characters, glyphs, and formatting commands (including standard ASCII code) that incorporates many international language symbols and is supported by many current web design languages.
Definition
Unicode
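A brief Python sketch of the same idea for Unicode: characters beyond ASCII also get numeric code points, and they can be encoded for the web as UTF-8 bytes.

print(ord("€"))             # 8364, the euro sign's code point
print(chr(0x00E9))          # 'é'
print("é".encode("utf-8"))  # b'\xc3\xa9', the byte form used on the web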
Term
ASCII-
Definition
American Standard Code for Information Interchange. The industry-standard 8-bit character code used to define English text characters.
Term
Binary Number System-
Definition
A base 2 numbering scheme which uses 2 possible values for each digit: 0 and 1
Term
Bit (Binary Digit)-
Definition
A bit is a single binary digit.
Term
Byte-
Definition
8 contiguous bits.
Term
Decimal Number system-
Definition
A number system with a base of 10 (using 10 unique digits, 0-9).
Term
Decimal Number system-
Definition
A number system with a base of 10 (using 10 unique digits, 0-9). This is the most common number system used by humans.
Term
Hexadecimal Number system-
Definition
A number system with a base of 16, using the digits 0-9 and the six letters A-F; used in computers to make binary values easier to read and write.
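A minimal Python sketch of why hexadecimal makes binary easier: each hex digit covers exactly one nibble (4 bits), so a byte is always two hex digits.

print(int("FF", 16))    # 255, the highest value one byte can hold
print(hex(0b1010))      # '0xa'  -> the nibble 1010 is the single hex digit A
print(hex(0b11111111))  # '0xff' -> one byte, two hex digits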
Term
Crumb-
Definition
A collection of 2 bits.
Term
Nibble-
Definition
A collection of 4 bits.
Term
Subscript-
Definition
Subscripts are small numbers or letters that appear at the bottom right of a number. They denote what number system is being represented.
Term
Système International d'Unités-
Definition
A scientific method of expressing the magnitudes or quantities of seven important natural phenomena.
Term
Unicode-
Definition
A code system that incorporates many international character symbols and combines and simplifies other separate code systems for glyphs or characters; especially useful in website design.
Term
Word-
Definition
A group of multiple bytes. The smallest word is 2 bytes, or 16 bits.
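A small Python sketch of the smallest word (assuming 2 bytes, as the card states): 16 bits can represent 2^16 distinct values.

word_bits = 2 * 8
print(word_bits)       # 16
print(2 ** word_bits)  # 65536 distinct values (0 through 65535 unsigned)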