Analytic dissection

Analytic dissection is a tool used in U.S. copyright law, particularly in the analysis of computer software, for determining whether a work accused of copyright infringement is substantially similar to a copyright-protected work.

In analytic dissection, the unprotectable elements of a work are dissected out and discarded before the two works are compared. These unprotectable components include ideas (as contrasted with expression), scènes à faire (conventional elements typical of a genre), material in the public domain, and functional aspects. As the Ninth Circuit explained in the 1988 Data East case, the fact that such elements are common to two works does not create substantial similarity; rather, a finding of infringement must be based on the similarity of what remains after the unprotectable elements are dissected out.[1]

Subsequently, in Computer Associates International, Inc. v. Altai, Inc.,[2] the Second Circuit applied this conceptual tool, under the name of the "Abstraction-Filtration-Comparison" test, to determine whether two computer programs were substantially similar. As the Tenth Circuit concisely explained this test in Gates Rubber v. Bando Chemical Industries:

[First, a] court should dissect the program according to its varying levels of generality as provided in the abstractions test. Second, poised with this framework, the court should examine each level of abstraction in order to filter out those elements of the program that are unprotectable. Filtration should eliminate from comparison the unprotectable elements of ideas, processes, facts, public domain information, merger material, scènes à faire material, and other unprotectable elements suggested by the particular facts of the program under examination. Third, the court should then compare the remaining protectable elements with the allegedly infringing program to determine whether the defendants have misappropriated substantial elements of the plaintiff's program.[3]

This legal test has generally "been applied in subsequent [copyright law] decisions, to the extent that it is recognised in the USA, and elsewhere, as the accepted standard."[4]

Parallels in patent law

A conceptually similar approach has at times been applied in US, UK, and European patent law. In Neilson v. Harford, the English Court of Exchequer adopted a method of analyzing the patent eligibility of inventions based on a natural principle or phenomenon of nature: the principle is treated as if it were part of the prior art, and the remainder of the invention (i.e., the mechanical implementation of the principle) is evaluated for patentability under the usual tests of novelty and the like. The US Supreme Court followed this approach in O'Reilly v. Morse and in subsequent decisions including Parker v. Flook and Mayo v. Prometheus. A similar analysis of obviousness or inventive step has been used under the name of the "point of novelty" test, an approach invited by the Jepson claim format, in which the preamble recites what is conventional and the body recites the asserted improvement.

References

  1. See Data East USA, Inc. v. Epyx, Inc., 862 F.2d 204 (9th Cir. 1988) (video game case).
  2. 982 F.2d 693 (2d Cir. 1992).
  3. Gates Rubber Co. v. Bando Chemical Industries, Ltd., 9 F.3d 823 (10th Cir. 1993).
  4. Stanley Lai, The Copyright Protection of Computer Software in the United Kingdom 30 (2000) (collecting authorities).
