21 January 2020

Bobby Chesney, Strauss Center Director and James Baker Professor in Law, recently published an article on Lawfare in response to Facebook’s new deepfake policy, which he regards as an insufficient, albeit welcome, development. Chesney defines deepfakes as “realistic-looking video or audio falsehoods, which show real people doing or saying things they never did or said,” generated using artificial intelligence or machine learning technology. After citing recent examples of political deepfake controversies, Chesney puts forth his two primary critiques of Facebook’s new policy. First, while the policy bans manipulated audio or video showing people saying something they never said (excluding parody and satire), it does not ban audio or video portraying individuals doing something they never did, such as making an obscene gesture. Second, the policy does not extend to “cheapfakes”—manipulated audio or video produced by less sophisticated means. Chesney regards this loophole as especially troubling given the current political climate, as cheapfakes are easier to produce and therefore more abundant than deepfakes. Despite these shortcomings, Chesney concludes by noting that Facebook’s attempt to regulate such complex content is heartening.

Read the full article here.
