Although the term is usually associated with content recorded on a storage medium, recordings are not required for live broadcasting and online networking.
Wire and transmission lines emerged as communication tools in the 19th century, starting with the telegraph. Samuel Morse began developing his telegraph in 1832, using wires to transmit electrical signals over long distances. In 1844, the first successful telegraph line was established in the United States, and in the 1850s, telegraph cables were laid across the Atlantic, connecting North America and Europe.[2] As the telegraph became mainstream, the need to transmit images over wire emerged. The first commercially successful fax machine, Giovanni Caselli's pantelegraph, was patented in 1861, allowing printed images to be transmitted over a wire.[3]
The telephone was another breakthrough in electronic communication, allowing people to communicate using voice rather than written messages. Alexander Graham Bell pioneered the first successful telephone transmission in 1876, and by the 1890s, telephone lines were being laid worldwide.[4] Because all of these breakthroughs relied on transmission lines, improvements to the lines themselves mattered: the English engineer Oliver Heaviside patented the coaxial cable in 1880, allowing for greater bandwidth and longer transmission distances.[5]
Significant improvements in the mode of transmission were made in the last seventy years with the introduction of fiber optics, wireless transmission, satellite transmission, Free Space Optics, and the internet. Fiber optics was first developed in the 1950s but became commercially viable in the 1970s. Wireless communication, meanwhile, was a major advance in the mode of transmission, doing away with wires in favor of electromagnetic waves. Guglielmo Marconi pioneered radio transmission in 1897, and by the early 1900s, radio had become a mainstream source of news, entertainment, and military communication.[6] Satellite communication allowed data to be transmitted over much longer distances than previously possible. The United States pioneered satellite communication in 1958 when it launched Explorer 1, its first satellite.[7]
Free Space Optics (FSO), which uses lasers to transmit data through the air, was first developed in the 1960s. However, it was only in the 1990s that the technology advanced enough to become commercially viable.[8] The internet, meanwhile, emerged in the second half of the 20th century. In the 1960s, the first protocols for transferring files between computers were developed. In 1989, Tim Berners-Lee created the World Wide Web, making it much easier to share information through hyperlinks. In 1996, the Real-Time Transport Protocol (RTP) was introduced, allowing live audio and video to be streamed over the internet. RTP was a breakthrough in online entertainment, allowing real-time events to be broadcast live to audiences worldwide.
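As a rough illustration of how RTP carries real-time media, the following sketch packs the 12-byte fixed header defined in RFC 3550; the field values and function name are illustrative, not details drawn from the sources above.

```python
import struct

def build_rtp_header(seq, timestamp, ssrc, payload_type=96, marker=0):
    """Pack the 12-byte fixed RTP header from RFC 3550
    (version 2, no padding, no extension, no CSRC entries)."""
    byte0 = 2 << 6                        # V=2, P=0, X=0, CC=0
    byte1 = (marker << 7) | payload_type  # M bit plus 7-bit payload type
    return struct.pack("!BBHII", byte0, byte1, seq, timestamp, ssrc)

# One header precedes each media packet; receivers use the sequence
# number to detect loss and the timestamp to schedule playback.
header = build_rtp_header(seq=1, timestamp=0, ssrc=0x12345678)
assert len(header) == 12
```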
The history of display and output technology is long and fascinating, beginning in the early 19th century with the development of the galvanometer, which was used to detect and measure small electrical currents. In 1844, the telegraph sounder was developed, which used an electromagnet to produce a clicking sound corresponding to the transmission of electrical signals over a telegraph line.[9] It was followed by the telephone receiver, which used a diaphragm to convert electrical signals into sound. In the late 1800s and early 1900s, the first forms of electric lighting were developed, including the red-glowing neon lamp. These were used in various applications, including lighting for displays and signs.
In 1910, the teleprinter was invented, allowing text messages to be transmitted over a wire. The cathode-ray tube (CRT), which grew out of William Crookes's experiments with vacuum tubes, became widely available by the 1920s and was used for early television and computer displays.[10] Radio and television tuners were also developed in the early 20th century, allowing people to receive and tune in to broadcast signals. Speakers and headphones were invented in the late 1800s and early 1900s and were used for listening to audio signals from radios, phonographs, and, later, other electronic devices.
In the 1950s and 1960s, LEDs (light-emitting diodes) and LCDs (liquid-crystal displays) were developed, allowing for more compact and efficient displays for applications such as lighting and television monitors.[11] In the 1970s, laser light shows were introduced, using lasers to produce dramatic visual effects for concerts and other events. The first computer monitor was developed in the 1950s, and the first commercial PC monitor was introduced in 1976. Large electronic displays were introduced in 1985, allowing for large-scale displays in stadiums, arenas, and other public spaces. The term HDTV was first used in 1936, but it was not until the 1990s that standards were established for producing and broadcasting high-definition television signals.[10] The head-mounted display (HMD) was introduced in 1968 and continues to be developed and improved to this day, enabling immersive virtual reality experiences and other applications.
The history of electrical signal processing is closely tied to the development of electronic communications technology, beginning in the mid-18th century with the invention of the capacitor, which allowed electrical charges to be captured and stored. In the 1830s, encoding methods such as Morse code were developed, allowing information to be transmitted over long distances using electrical signals.[2] Electronic modulation, developed between 1832 and 1927, was another crucial step in the history of telecommunications.
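As a minimal illustration of that kind of symbol-for-letter encoding, the sketch below maps text to Morse code; the truncated table and function name are this example's own.

```python
# Minimal sketch: encoding text as Morse code, the symbol-for-letter
# scheme used on telegraph lines. Table truncated for brevity.
MORSE = {
    "S": "...", "O": "---", "E": ".", "T": "-",
    "A": ".-", "N": "-.",
}

def to_morse(text):
    """Translate text to Morse, separating letters with spaces."""
    return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

print(to_morse("SOS"))  # ... --- ...
```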
Electronic multiplexing, which allowed multiple signals to be transmitted over a single channel, was first developed in 1853 using a technique called time-division multiplexing (TDM).[12] Digitizing, or converting analog signals into digital form, was first developed in 1903 with the invention of pulse-code modulation (PCM) for telephone communications.[13] Electronic encryption, which allowed information to be transmitted securely over electronic channels, was developed between 1935 and 1945 and played a crucial role in electronic communications during World War II. Online routing, or the ability to direct electronic signals to specific destinations, was first developed in 1969 with the creation of the ARPANET, a precursor to the modern internet.[14] Electronic programming, or the ability to use electronic signals to control and automate processes, has been developed since the 1940s and remains an important area of research in electrical signal processing.
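A small sketch may help clarify what digitizing involves: the code below samples a sine wave at fixed intervals and quantizes each sample to one of 256 integer levels, which is the basic idea behind PCM. The sample rate, bit depth, and function are illustrative choices, not details from the sources cited above.

```python
import math

SAMPLE_RATE = 8000   # samples per second (classic telephony rate)
BIT_DEPTH = 8        # bits per sample -> 2**8 = 256 quantization levels

def pcm_encode(duration_s=0.001, freq_hz=1000.0):
    """Digitize a sine wave: sample it at fixed intervals, then
    quantize each sample to an integer code in 0..255."""
    levels = 2 ** BIT_DEPTH
    codes = []
    for i in range(int(SAMPLE_RATE * duration_s)):
        analog = math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)  # -1..1
        codes.append(round((analog + 1) / 2 * (levels - 1)))        # 0..255
    return codes

print(pcm_encode())  # eight integer codes for 1 ms of signal
```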
The history of electronic information storage dates back to the 18th century, with the invention of punched cards and paper tape in 1725 and 1846, respectively. These early forms of storage were used to hold simple text and numerical data.[15] In the 19th century, the phonograph cylinder and disc, invented in 1857 and 1877 respectively, allowed audio data to be recorded and stored. In 1876, the invention of film allowed moving images to be recorded and stored.[15]
In 1941, the invention of random-access memory (RAM) allowed digital data to be stored and retrieved at high speed; RAM remains in use today.[15] Barcodes were invented in 1952 for use in grocery stores, and the Universal Product Code (UPC) was standardized in 1973, allowing product information to be stored and retrieved in a digital format.[16]
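As a concrete example of how a UPC encodes product data reliably, the sketch below computes the standard UPC-A check digit that validates the other eleven digits; the function name is this example's own.

```python
def upc_check_digit(first_11_digits):
    """Compute the 12th (check) digit of a UPC-A barcode: triple the
    digits in odd positions, add the even-position digits, and take
    the value that brings the total up to a multiple of 10."""
    digits = [int(d) for d in first_11_digits]
    odd_sum = sum(digits[0::2])   # positions 1, 3, ..., 11 (1-indexed)
    even_sum = sum(digits[1::2])  # positions 2, 4, ..., 10
    return (10 - (3 * odd_sum + even_sum) % 10) % 10

# The widely cited example code 036000291452 is self-consistent:
assert upc_check_digit("03600029145") == 2
```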
In 1969, the invention of the laser disc allowed for the storage and playback of high-quality video and audio data, but the format never achieved mass-market success after its commercial introduction in 1978. Compact discs (CDs) were introduced in 1982 and quickly became a popular medium for storing and playing back digital audio.[15] DVDs followed in the mid-1990s, offering higher storage capacity and the ability to store video data.
Content or media refers to the different types of digital information that can be stored, transmitted, and consumed through electronic devices. The history of content formats dates back to the late 19th century, when the first audio recordings were created.
Audio Recording: In 1877, Thomas Edison invented the phonograph, the first machine that could record and play back audio.[17] The invention marked the beginning of audio recording and led to a succession of audio formats, including vinyl records, magnetic tape, and digital audio files. Vinyl records were introduced in the late 19th century and remained the primary format for music until the late 20th century, when digital audio formats such as MP3 and AAC were introduced. Vinyl, however, remains a cultural icon despite its obsolescence, retaining an aura of sanctity immune to symbolic pollution.[18] The MP3 was invented by Karlheinz Brandenburg of the Fraunhofer Institute.[19] The institute's encoder software, which sold for $250 at the time, enabled individuals to digitize their audio files using a compression algorithm called MPEG-1 Layer III, and the compressed files were stored on CDs.[19]
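A quick back-of-the-envelope calculation shows why the compression mattered; the 128 kbit/s figure below is a common MP3 bitrate rather than one taken from the sources above.

```python
# Uncompressed CD audio: 44,100 samples/s, 16 bits per sample, 2 channels.
cd_bps = 44_100 * 16 * 2   # 1,411,200 bits per second
mp3_bps = 128_000          # a typical MP3 bitrate

print(f"CD audio: {cd_bps:,} bps")
print(f"128 kbit/s MP3 shrinks it roughly {cd_bps / mp3_bps:.0f}-fold")
# An ~11:1 reduction is what made shuttling digitized music over
# 1990s networks and storage media practical.
```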
Video Recording: Video recording technology was first introduced in 1952, when Charles Ginsburg created the first videotape recorder at Ampex Corporation. The technology was later refined, leading to different video formats such as Betamax, VHS, and DVD. Betamax, released by Sony in 1975, was an analog cassette-format video recorder, but it did not survive for long. The Video Home System (VHS), developed by the Victor Company of Japan in 1976, won the videotape format war, trumping Betamax and gaining widespread use worldwide.[20] The main difference between the two was that Betamax offered sharper, clearer images while VHS offered a longer run time.[20] In the 21st century, however, both formats belong to a bygone era; digital video formats such as MPEG-4 and H.264 have become the dominant recording and playback formats.
Digital File Formats: The introduction of digital file formats marked a significant shift in how content was stored and transmitted. In the early days of digital computing, text-based formats such as ASCII and RTF were used to store and transmit textual content. ASCII was based on telegraphic code and had a very narrow scope of use, as it defined only 128 code points.[21] RTF (Rich Text Format), developed by Microsoft in 1987, allowed documents to be shared across platforms and was especially valued because it could preserve document information such as formatting, font, and style. Later, image formats such as JPEG and PNG were introduced, allowing digital images to be stored and transmitted, and digital audio and video formats further expanded the range of file formats available.
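ASCII's 128-code-point limit is easy to demonstrate; in the short sketch below, plain English text encodes cleanly while an accented character falls outside the set.

```python
# ASCII defines only code points 0-127, so anything outside that
# range cannot be represented.
print(len([chr(i) for i in range(128)]))  # 128 code points in total
print("cafe".encode("ascii"))             # fine: every code point < 128

try:
    "café".encode("ascii")                # 'é' is code point 233
except UnicodeEncodeError as err:
    print("not representable in ASCII:", err.reason)
```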
Database Content and Formats: Databases have been used to store and manage digital content since the 1960s. E.F. Codd conceptualized the relational database model in 1970.[citation needed] The model allowed applications to find data by searching its content rather than by following links between records. It rested on predicate logic and set theory and set the stage for future databases.[citation needed] The first commercially available relational database management system (DBMS) was introduced in 1979 by Relational Software, Inc. (later renamed Oracle Corporation). Since then, many database systems have been introduced, including relational, object-oriented, and NoSQL databases.
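A minimal sketch of the content-based querying the relational model enabled, using Python's built-in SQLite bindings; the table, columns, and sample rows are hypothetical, not drawn from any system mentioned above.

```python
import sqlite3

# Rows are retrieved by describing their content (a declarative
# query), not by navigating stored record-to-record links.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE albums (title TEXT, artist TEXT, year INTEGER)")
con.executemany(
    "INSERT INTO albums VALUES (?, ?, ?)",
    [("Kind of Blue", "Miles Davis", 1959),
     ("Abbey Road", "The Beatles", 1969)],
)

for (title,) in con.execute("SELECT title FROM albums WHERE year < 1965"):
    print(title)  # Kind of Blue
```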
Interactivity refers to the ability of electronic media to respond to user input, allowing for a more immersive and engaging experience. The history of interactivity can be traced back to the development of input devices such as the control panel.
Control Panel: Control panels were first introduced in the early days of computing as a way to interact with computer systems. These panels typically consisted of a series of switches and knobs that could be used to input data and commands into the computer.
Input Device: The development of input devices such as the keyboard and mouse marked a significant advancement in the field of interactivity. The introduction of graphical user interfaces (GUIs) allowed for more intuitive interaction with computer systems.
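As a small illustration of GUI-style interactivity, the sketch below binds mouse and keyboard events to callbacks using Python's standard tkinter toolkit; the window and its label are, of course, this example's own.

```python
import tkinter as tk

# A GUI responds to input events (mouse clicks, key presses) through
# callbacks, rather than through banks of physical switches.
root = tk.Tk()
label = tk.Label(root, text="Click or press a key...")
label.pack(padx=40, pady=40)

root.bind("<Button-1>", lambda e: label.config(text=f"Click at ({e.x}, {e.y})"))
root.bind("<Key>", lambda e: label.config(text=f"Key pressed: {e.keysym}"))
root.mainloop()
```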
Game Controller: The game controller arrived in the late 1970s with the Atari 2600 video game console. The Atari 2600 had no disk storage and only 128 bytes of RAM.[22] Its graphics clock ran at 12 MHz, while its cartridge ROM held only 4 kilobytes.[22] Despite such limitations, the console let users interact with video games more immersively, paving the way for more advanced game controllers.
Handheld: The introduction of handheld devices such as the Nintendo Game Boy and the Sony PlayStation Portable allowed for interactive gaming on the go. The Game Boy was released in Japan in 1989 and was criticized for lacking a backlight and color graphics. Sony's first PlayStation, a home console, was released in Japan in 1994; the handheld PlayStation Portable followed in 2004, while more sophisticated consoles such as the PlayStation 3, 4, and 5 came later. The early handheld devices featured built-in controllers and small screens, allowing users to play games anywhere and anytime.
Wired Glove: The wired glove was first introduced in the early 1980s as a way to interact with virtual reality environments. These gloves were equipped with sensors that detected hand movements, allowing users to manipulate virtual objects and navigate virtual environments. Wired gloves were a significant improvement over the mouse, joystick, or trackball for virtual interaction, but they were expensive, which limited their spread.[23]
Brain-Computer Interface (BCI): The brain-computer interface (BCI) is the latest development in interactivity. The technology allows users to control electronic devices using their brainwaves, bypassing the need for physical input devices such as keyboards or controllers. While still in the experimental stage, BCI technology has the potential to revolutionize the way we interact with machines.