A character literal is a type of literal in programming for the representation of a single character's value within the source code of a computer program. Languages that have a dedicated character data type generally include character literals; these include C, C++, Java,[1] and Visual Basic.[2] Languages without a character data type (such as Python[3] or PHP[4]) typically use strings of length 1 to serve the same purpose that a character data type would fulfil. This simplifies the implementation and basic usage of a language, but also introduces new scope for programming errors.
A common convention is to use single quotes (') for character literals, as contrasted with double quotes (") for string literals. For example, 'a' indicates the single character a, while "a" indicates the string a of length 1.
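As a minimal sketch of this distinction in C (assuming an ASCII-compatible execution character set), the following contrasts a character literal with the corresponding one-character string literal; note that in C the literal 'a' itself has type int, while the string "a" is a two-byte array holding the character plus a terminating NUL:

    #include <stdio.h>

    int main(void) {
        char c = 'a';       /* character literal: one code value */
        char s[] = "a";     /* string literal: array {'a', '\0'} */

        printf("sizeof 'a' = %zu\n", sizeof 'a');  /* sizeof(int) in C, typically 4 */
        printf("sizeof s   = %zu\n", sizeof s);    /* 2: the 'a' plus the NUL */
        printf("c = %c\n", c);
        return 0;
    }

In C++, by contrast, 'a' has type char, so sizeof 'a' is 1 there.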
The representation of a character within computer memory, in storage, and in data transmission depends on a particular character encoding scheme. For example, an ASCII (or extended ASCII) scheme uses a single byte of computer memory per character, while a UTF-8 scheme uses one to four bytes, depending on the particular character being encoded.
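The following C sketch illustrates this encoding dependence. It assumes that both the source file and the execution environment use UTF-8, which is common on modern Unix-like systems but is not guaranteed by the language:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char ascii[] = "A";   /* ASCII character: one byte plus NUL */
        const char utf8[]  = "é";   /* U+00E9, two bytes in UTF-8, plus NUL */

        printf("strlen(ascii) = %zu\n", strlen(ascii));  /* 1 */
        printf("strlen(utf8)  = %zu\n", strlen(utf8));   /* 2 under UTF-8 */
        return 0;
    }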
Alternative ways to encode character values include specifying an integer value for a code point, such as an ASCII code value or a Unicode code point. This may be done directly, by converting an integer literal to a character, or via an escape sequence.
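A brief C sketch of these alternatives, assuming an ASCII execution character set: an integer literal can be assigned to a char directly, and hexadecimal, octal, and named escape sequences all denote code values within a character literal:

    #include <stdio.h>

    int main(void) {
        char from_int = 65;      /* integer literal: the ASCII code for 'A' */
        char hex_esc  = '\x41';  /* hexadecimal escape sequence, also 'A' */
        char oct_esc  = '\101';  /* octal escape sequence, also 'A' */
        char newline  = '\n';    /* named escape for the newline control character */

        printf("%c %c %c\n", from_int, hex_esc, oct_esc);  /* prints: A A A */
        printf("newline code: %d\n", newline);             /* 10 in ASCII */
        return 0;
    }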