Character encoding is used to represent a repertoire of characters by some kind of encoding system. Depending on the abstraction level and context, corresponding code points and the resulting code space may be regarded as bit patterns, octets, natural numbers, electrical pulses, etc. A character encoding is used in computation, data storage, and transmission of textual data.[1]
Data can be numbers, text, sound, images, animation, video, and so on. To write data down in the real world we use numbers (0, ..., 9), the alphabet (A, ..., Z), and symbols (@, [, \, ...), or a combination of them; an address such as "12B Main St.", for example, combines all three.
Computers do not understand numbers (0, ..., 9), the alphabet (A, ..., Z), or symbols (@, [, \, ...) directly, so in order to process this information a unique code must be assigned to each character. That unique code is a binary numeral.
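As a quick illustration (a Python sketch; the lesson itself does not name a programming language), each character is stored as a unique number, and that number can be written out in binary:

```python
# Each character gets a unique numeric code; format(code, '08b') shows it in binary.
for ch in ['7', 'Z', '@']:        # a number, a letter, and a symbol
    code = ord(ch)                # ord() returns the code assigned to the character
    print(ch, code, format(code, '08b'))
# '7' -> 55 -> 00110111
# 'Z' -> 90 -> 01011010
# '@' -> 64 -> 01000000
```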
ASCII (pronounced "ask-ee") is a type of binary code that uses 7 bits for each character (numbers, the alphabet, and symbols), giving a total of 128 characters (2^7 = 128). For example:
Number (0 - 9)
Alphabet (A - Z)
Symbols (@ < ( & ^ % $ #...)
ASCII stands for American Standard Code for Information Interchange.
View the ASCII character table at http://www.asciicodes.us
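The following Python sketch (an illustration, not part of the original lesson) checks that a few ASCII characters all have codes in the range 0 - 127 and therefore fit in 7 bits:

```python
# ASCII assigns codes 0-127, so every character fits in 7 bits.
for ch in ['A', 'a', '0', '#']:
    code = ord(ch)
    assert 0 <= code <= 127              # all ASCII codes fit in 7 bits
    print(ch, code, format(code, '07b')) # the 7-bit binary pattern
# 'A' -> 65 -> 1000001
# 'a' -> 97 -> 1100001
# '0' -> 48 -> 0110000
# '#' -> 35 -> 0100011
```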
Extended ASCII uses 8 bits (1 byte) for each character (numbers, the alphabet, and symbols), giving a total of 256 characters (2^8 = 256), as sketched below.
View Extended ASCII character table at http://ascii-code.com
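A small sketch (assuming Latin-1 / ISO 8859-1, one common flavour of extended ASCII): a character such as 'é' has a code above 127 and so needs the full 8 bits:

```python
# Latin-1 (ISO 8859-1) is one common 8-bit "extended ASCII" code page.
ch = 'é'
code = ch.encode('latin-1')[0]    # one byte per character
print(ch, code, format(code, '08b'))
# 'é' -> 233 -> 11101001  (above 127, so 7 bits are not enough)
```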
Unicode, in its basic 16-bit form, uses 16 bits for each character (numbers, the alphabet, and symbols), giving a total of 65,536 characters (2^16 = 65,536); modern Unicode defines many more characters beyond this 16-bit range. An example is sketched below.
View Unicode character table at http://unicode-table.com/en/#control-character
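A hedged Python sketch: the 16-bit figure above corresponds to Unicode's Basic Multilingual Plane, where each character is one 16-bit unit in the UTF-16 encoding; ord() gives the code point directly:

```python
# Characters in the Basic Multilingual Plane fit in 16 bits (one UTF-16 code unit).
for ch in ['A', '€', '→']:
    code = ord(ch)                      # the Unicode code point
    utf16 = ch.encode('utf-16-be')      # the same value as big-endian UTF-16 bytes
    print(ch, code, format(code, '016b'), utf16.hex())
# 'A' -> 65   -> 0000000001000001 -> 0041
# '€' -> 8364 -> 0010000010101100 -> 20ac
# '→' -> 8594 -> 0010000110010010 -> 2192
```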