
Character encoding


The purpose of character encoding schemes is to provide a means of storing, retrieving, comparing, inputting, and outputting characters such as letters, digits, and other symbols.

Computers represent data as a series of bits. Therefore, any other representation of data (such as alphabetic characters) must derive from an agreed-upon interpretation (i.e., a "standard") of how those bits will be interpreted. Any such standard is essentially arbitrary, but certain standards for character encoding have emerged over the years. Since most modern computers deal with bytes (8-bit groupings), or multiples of bytes, most character encodings are defined at the byte level. There may or may not be a correlation between a given byte value and the character it represents in any given encoding scheme. For instance, the byte value 01001110 (hexadecimal 4E) represents the letter "N" in ASCII,[1] but the symbol "+" in EBCDIC.
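
As an illustration, here is a minimal Python sketch that decodes that same byte value under both interpretations; it assumes the cp500 codec, one of the EBCDIC variants included in Python's standard codec set.

    # The same byte, 0x4E (binary 01001110), decoded under two different standards.
    raw = bytes([0x4E])
    print(raw.decode("ascii"))  # 'N' under ASCII
    print(raw.decode("cp500"))  # '+' under EBCDIC (code page 500)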

The main character encoding standards today are ASCII (sometimes ambiguously referred to as "ANSI") and Unicode. Older standards include EBCDIC, Baudot, and Radix-50.
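
As a brief illustration of the difference, the Python sketch below encodes text under both standards; UTF-8 (the most common Unicode encoding) agrees with ASCII for the first 128 characters but uses multiple bytes for others. The accented character is purely an illustrative choice.

    text = "N"
    print(text.encode("ascii"))  # b'N' -- one byte, value 0x4E
    print(text.encode("utf-8"))  # b'N' -- identical to ASCII for this character

    accented = "é"
    print(accented.encode("utf-8"))  # b'\xc3\xa9' -- two bytes in UTF-8
    # accented.encode("ascii") would raise UnicodeEncodeError,
    # because é lies outside ASCII's 128-character range.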

References

