In the field of computing, bandwidth refers to the maximum amount of data that can be moved through a connection. Since the term literally means "width of the band", it can also refer to how many bits wide a connection is. For instance, a connection that is 16 bits wide must split a 32-bit double word into two halves in order to transmit it.
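The split described above can be sketched in a few lines. This is a minimal illustration (the function names are invented for this example, not part of any standard API): a 32-bit value is divided into a high and a low 16-bit half, sent separately, and reassembled on the other side.

```python
def split_32_to_16(word: int) -> tuple[int, int]:
    """Split a 32-bit double word into (high, low) 16-bit halves
    so it fits a 16-bit-wide connection, one half per transfer."""
    high = (word >> 16) & 0xFFFF
    low = word & 0xFFFF
    return high, low

def join_16_to_32(high: int, low: int) -> int:
    """Reassemble the two 16-bit halves on the receiving end."""
    return (high << 16) | low

# A round trip recovers the original 32-bit value.
high, low = split_32_to_16(0xDEADBEEF)
assert join_16_to_32(high, low) == 0xDEADBEEF
```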
Bandwidth is sometimes used informally to signify throughput, the amount of data that can be transferred within a set amount of time. For instance, a USB 2.0 port has a theoretical bandwidth of 480 megabits/second (60 MB/s), or an internet connection has a bandwidth of 1.5 megabits/second. In these instances, while throughput may be the technically accurate term, bandwidth is more commonly used.
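Note that connection speeds are usually quoted in megabits per second while file sizes are quoted in bytes, so a conversion by a factor of 8 is often needed. A small sketch of that conversion (the function name is illustrative):

```python
def megabits_to_megabytes(mbit_per_s: float) -> float:
    """Convert a data rate in megabits/second to megabytes/second.
    There are 8 bits in a byte."""
    return mbit_per_s / 8

# USB 2.0's theoretical 480 megabits/second works out to 60 MB/s.
assert megabits_to_megabytes(480) == 60.0
```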
In a synchronous system, throughput is data/transfer * transfers/second. A 64-bit (8-byte) bus operating at 200 megahertz, performing one transfer per clock cycle, would be capable of 1.6 gigabytes per second of throughput.
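The formula above can be written out directly. This sketch assumes one transfer per clock cycle, as in the example:

```python
def throughput_bytes_per_s(bus_width_bits: int, clock_hz: float) -> float:
    """Throughput of a synchronous bus: bytes per transfer
    (bus width / 8) times transfers per second (the clock rate)."""
    return (bus_width_bits / 8) * clock_hz

# 64-bit bus at 200 MHz -> 8 bytes * 200,000,000 = 1.6 GB/s
assert throughput_bytes_per_s(64, 200e6) == 1.6e9
```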
Bandwidth is often referred to as the "speed" of a connection. However, latency plays as large a role (if not a larger one) in the effective speed of many connections. Latency measures the amount of time required for a signal to cross the connection. The effect of latency can be seen in a brief (very simplified) example: a user wishes to view a 4 MB file located on another computer. The user sends the request, which takes 0.5 seconds to reach the host. The host then sends the file back along the same route, at 1 MB/s. In this example, it will take the user 5 seconds to get the file: 0.5 seconds for the request to reach the host computer, 0.5 seconds for the first pieces of the file to arrive after being sent, and 4 seconds for the entire file to move through the connection.
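The arithmetic in the example above generalizes to a simple model: total time is the one-way latency twice (request out, first bytes back) plus the file size divided by the bandwidth. A minimal sketch under the same simplified assumptions:

```python
def total_transfer_time(latency_s: float, file_bytes: float,
                        bandwidth_bytes_per_s: float) -> float:
    """Simplified transfer-time model: one-way latency for the
    request, one-way latency for the first bytes to return, then
    the whole file streaming at the connection's bandwidth."""
    return 2 * latency_s + file_bytes / bandwidth_bytes_per_s

# 4 MB file, 0.5 s one-way latency, 1 MB/s bandwidth -> 5 seconds
assert total_transfer_time(0.5, 4e6, 1e6) == 5.0
```

Doubling the bandwidth in this model only saves 2 of the 5 seconds, while the 1 second of latency is untouched, which is why latency can dominate for small transfers.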