What Are the Real Risks of Artificial Intelligence?

The same risks have always been with us

The best way to address the threat of AI-enabled fake news is to educate the public about this enormous menace, which is really nothing new. It should go without saying, although it unfortunately doesn’t, that one shouldn’t forward, retweet, or “send to all” a sensationalistic cartoon or “news” item one sees on social media or even on a website, given that many purported news sites are untrustworthy.

Palestinian terrorists have used out-of-context photographs to falsely accuse Israel of war crimes. In one example, a photo of a Chilean police officer kneeling on the neck of a suspected vandal was passed off as an Israeli soldier kneeling on the neck of a Palestinian child.13 I’ve seen other examples in which a Google image-recognition search on purported “Israeli atrocities” has shown that the pictures come from other parts of the world entirely.
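That same fact-checking step can be approximated programmatically. The sketch below is only an illustration of one approach, not anything from this article or from Google: it uses the third-party Pillow and imagehash Python packages to compare a suspect photo against a known archived original by perceptual hashing, which survives resizing and re-compression. The file names are hypothetical placeholders.

```python
# Minimal sketch: flag a "news" photo that is really a recycled image.
# Assumes the third-party packages Pillow and imagehash are installed.
from PIL import Image
import imagehash

def likely_same_photo(path_a: str, path_b: str, max_distance: int = 8) -> bool:
    """Return True if the two files are probably the same underlying photograph,
    even after resizing, re-compression, or minor edits."""
    hash_a = imagehash.phash(Image.open(path_a))  # 64-bit perceptual hash
    hash_b = imagehash.phash(Image.open(path_b))
    # Subtracting two hashes gives their Hamming distance; small means near-duplicate.
    return (hash_a - hash_b) <= max_distance

if __name__ == "__main__":
    # Hypothetical file names: the viral "atrocity" image vs. an archived wire photo.
    print(likely_same_photo("viral_post.jpg", "archived_wire_photo.jpg"))
```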

It’s therefore vital for responsible citizens to recognize the dangers of manipulation by, for example, Russian trolls on social media.16 The Washington Post reported, “The Justice Department’s special counsel announced a sweeping indictment Friday of a notorious Russian group of internet trolls—charging 13 individuals and three companies with a long-running scheme to criminally interfere with the 2016 U.S. presidential election.”17

Although the technology may be new, the idea is not. Remember that in Arthur Miller’s The Crucible, people testified that they had seen purported witches with the Devil. Senator Joe McCarthy was known for using doctored photos to depict people he didn’t like in the company of Communists.9 Instead of “I saw Bridget Bishop with the Devil!” in The Crucible, McCarthy’s accusations went along the lines of “I saw Senator Millard Tydings with a Commie, and here’s a picture to prove it!”

Cartoonists of World War I depicted Germans as monsters who attacked women and children, and they exploited the half-truth that a German submarine sank the Lusitania with the loss of civilian lives. They omitted the fact that the ship was carrying munitions whose sole purpose was to kill German soldiers, which made the ship a legitimate target. The ship’s cargo manifest, in fact, listed 1,271 cases of ammunition and 4,200 cases of cartridges and ammunition.15

The same would apply to somebody who had AI write a story in the style of Mark Twain or William Shakespeare, or draw a picture in the style of Rembrandt or Michelangelo, by copying their existing works. While I can’t give legal advice, a very strong argument can be made that a living person has personality and intellectual property rights to their literary, artistic, or theatrical skills. A deceased person’s heirs may also enjoy these rights, at least for a period of time; New York, for example, enacted a post-mortem right of publicity covering digital replicas of deceased performers.7 Regardless of whether one can legally use a voice clone or a digital replica of a deceased performer without permission, one has an ethical obligation to disclose that the work is that of AI rather than the person.

References
1. O’Brien, Matt. “Tech leaders issue warning: AI raises risk of extinction.” Associated Press, May 30, 2023.
2. Scientific Gems. “Killer robots: It’s not the AI that’s the problem.” Aug. 27, 2017.
3. Associated Press. “Virginia Man Killed in Civil War Cannonball Blast.” Jan. 13, 2015. (Despite the headline, the story indicates the device was an explosive shell; solid iron cannonballs don’t explode, as often depicted in movies.)
4. Movieclips. “WarGames (3/11) Movie Clip—Shall We Play a Game? (1983) HD.” July 30, 2013.
5. Apex Clips. “2001: A Space Odyssey 4K HDR | The Shutdown Of Hal.” Jan. 11, 2021.
6. Wise, Justin. “Lawyer’s AI Blunder Shows Perils of ChatGPT in ‘Early Days.’” Bloomberg Law, 2023. 
7. Townsend, Katie. “Raising the Dead: Understanding Post-Mortem Rights of Publicity.” International Documentary Association, Feb. 4, 2022.
8. Vega, Nicholas. “It’s ‘kind of scary’: Paul McCartney used A.I. to reunite with John Lennon on new Beatles record.” CNBC, June 14, 2023. 
9. Murphy, L. “‘I’m about to end this whole man’s career’: McCarthy’s Photoshop Revenge.” Urban Fictionary, Nov. 23, 2023.
10. DesignYouTrust.com. “Before Photoshop—14 Historic Photos That Have Been Manipulated.” 2016.
11. Sleeper. “Forrest Gump (1994): President Johnson Medal—‘Hit in the buttocks.’” April 4, 2023.
12. BizarrePower. “Star Trek DS9 Sisko Meets Kirk.” Aug. 11, 2023.
13. Reuters. “Fact check: Photo doesn’t show an Israeli soldier killing a Palestinian child by kneeling on his neck.” July 10, 2020.
14. JRMora. “The 1898 Spanish-American War in cartoons.” Nov. 7, 2023.
15. The Lusitania Resource. “Cargo.” (Reproduces the ship’s actual cargo manifest.)
16. MSNBC. “How Russian trolls weaponized social media.” April 14, 2019.
17. Barrett, Horwitz, and Helderman. “Russian troll farm, 13 suspects indicted in 2016 election interference.” The Washington Post, Feb. 16, 2018.
18. Verma, Pranshu. “They thought loved ones were calling for help. It was an AI scam.” The Washington Post, March 5, 2023. 

Photo manipulation has been around almost as long as photography. Portions of three separate Civil War photos were combined to place Gen. Ulysses S. Grant at City Point, Virginia. The creators needed a picture of Grant on horseback, so they put his head on the body of another general, Alexander M. McCook, who was sitting on a horse. The composite was then set in front of a prisoner-of-war camp for Confederate soldiers, who were thereby passed off as Union troops. In a separate manipulation, Gen. Francis P. Blair was added to a group that included Gen. William Tecumseh Sherman, and all of this was done with the technology of the 1860s.10 The intentions weren’t dishonest, but the takeaway is that the ability to manipulate photos has been around for more than 150 years.

Sensationalistic yellow journalism in both the United States and Spain helped foment the Spanish-American War at the cost of thousands of lives.14 American journalism included essays on “Why Spaniards Are Cruel” and cited examples dating back to Spain’s Phoenician history (Carthage had some colonies there), which included the worship of Moloch with human sacrifices. Spanish cartoonists liked to depict the United States as a “Yankee pig,” with the result that both sides’ newspapers signaled that they were spoiling for a fight.

Voice cloning and the use of AI to isolate a voice also have legitimate applications. AI was used to isolate (not clone) John Lennon’s voice to produce one last Beatles single,8 but voice cloning itself could allow a singer to effectively protect his or her voice against loss. That is, if the singer suffered an injury or illness of the vocal cords, the singer could still perform via AI.

Misuse of AI for propaganda and fraud

Although ChatGPT was able to pass the bar exam, it cited nonexistent cases when a lawyer used it to prepare court filings.6 “New York lawyers Steven Schwartz and Peter LoDuca face a June 8 hearing on potential sanctions after a court brief they submitted cited six nonexistent cases,” wrote Justin Wise in Bloomberg Law. A licensed professional, such as a lawyer, engineer, or doctor, is ultimately responsible for his or her work product and must accordingly ensure that the computer’s output is accurate and makes sense.
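One practical way to exercise that responsibility is a pre-filing check in which every citation an AI drafts is looked up in an authoritative database before anything is submitted. The sketch below is purely illustrative and is not the method used by anyone in the Bloomberg Law story; the lookup URL, response field, and citation string are hypothetical placeholders.

```python
# Minimal sketch of a pre-filing citation check. The endpoint and the
# response field below are hypothetical placeholders, not a real legal API.
import requests

LOOKUP_URL = "https://caselaw.example.test/search"  # hypothetical endpoint

def citation_found(citation: str) -> bool:
    """Ask the (hypothetical) case-law database whether a citation resolves to a real case."""
    resp = requests.get(LOOKUP_URL, params={"q": citation}, timeout=10)
    resp.raise_for_status()
    return bool(resp.json().get("results"))  # assumed response field

draft_citations = [
    "Example v. Placeholder, 123 F.3d 456 (9th Cir. 1999)",  # hypothetical AI-drafted citation
]

for cite in draft_citations:
    verdict = "found" if citation_found(cite) else "NOT FOUND: verify before filing"
    print(f"{cite}: {verdict}")
```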

Intellectual property issues

Scientific Gems2 opines, however, that the problem with the Menschenjäger wasn’t its artificial intelligence but rather its longevity, and adds that totally unintelligent land mines can cause trouble decades after the war in which they were used. The same can be said of unexploded bombs (UXBs) from World War II, and there are parts of France that are uninhabitable today due to leftovers from World War I. People have even been killed by Civil War munitions 150 years later,3 so the issue isn’t AI but rather weapons in general that remain dangerous long after their intended use ends.

1. Science fiction stories about computers that turn on their human creators all assume that their creators are sufficiently reckless to give them unsupervised control of weapons or other resources.
2. Humans are, however, ultimately responsible for what their computers do. “The computer did it” isn’t an excuse for substandard or dysfunctional performance.
3. The ability of AI to emulate a famous artist’s, musician’s, writer’s, or actor’s style or voice creates intellectual property issues, but common sense and existing intellectual property law should tell us what we can and can’t do from a legal and ethical standpoint.
4. The biggest danger is probably the ability of AI to generate realistic audio and video of people saying and doing things they never said or did. This kind of misinformation is an enormous menace to societies around the world. But fake news, doctored photos, and similar deceptive practices have been around for more than 100 years. AI is just the newest technology for promoting it.

The death of all men

Less imaginative stories, however, require substantial willful suspension of disbelief in their premise, namely that humans have given computers ultimate control over weapons or money.

The biggest menace to a free society comes not from AI itself but rather from the dishonest use of AI to propagate fake news and fraud. Voice cloning and digital replication could, for example, put a political candidate into an extremist rally where they appear to express support, in their own voice, for that organization’s objectives. The video could then be circulated widely on social media, where many voters could easily construe it as authentic.

Remember, however, that a computer won’t create dishonest propaganda or participate in financial or romantic fraud unless a dishonest person tells it to do so. AI is nothing more than the newest tool in the dishonest propagandist’s or fraudster’s box, which has long included doctored photos, fake news, and yellow journalism. The best countermeasure is to educate the public about this issue and thereby immunize people against this kind of manipulation.

• In the 1983 film WarGames, a computer is given control of the United States’ nuclear arsenal because the commanders don’t believe the weapon controllers will obey a launch order. The computer is also connected to what is now known as the internet (!) and is willing to play “Global Thermonuclear War” with a teenager who hacks into the system.4
• The Terminator films starring Arnold Schwarzenegger assume that a computer system called Skynet was given control over nuclear weapons.
• Colossus: The Forbin Project (1970) assumes that the United States and the Soviet Union turn over control of their nuclear weapons to computers, and the computers get together and decide to take over the world.

The HAL computer in 2001: A Space Odyssey did try to kill its human crew, but once the surviving astronaut gained access to HAL’s circuitry, the computer could do absolutely nothing to stop him from essentially pulling the plug.5 AI can’t threaten the human species as long as an emergency off switch remains under human control. But its misuse can nonetheless create real threats.

‘The computer did it’ isn’t an excuse

A related issue is the use of AI to defraud people over the internet. Voice cloning can be used to impersonate a friend or family member and ask for money,18 but similar scams involving email have been around for decades. Another issue involves online dating scams in which AI can impersonate an ideal man or woman to gain somebody’s trust. The basic concept predates computers, though: in Rostand’s play, Cyrano de Bergerac was the real brain behind Christian de Neuvillette’s poetic advances to Roxane.

“Tech leaders issue warning: AI raises risk of extinction”1 comes across as another version of the science fiction stories, around for decades, about humans creating something greater than themselves that finally destroys them. Although it’s obviously risky to allow a computer to control weapons, money, medical records, or anything else without responsible human supervision, even the most intelligent imaginable computer is limited if its sole interface with the human world consists of a video monitor and a keyboard.

The growing ability of AI to emulate an artist’s, writer’s, or orator’s style creates relatively new intellectual property issues. Respeecher says, for example, that it can “create speech that’s indistinguishable from the original speaker,” and that “Every nuance and emotion from the original speech pattern is captured in our digitally replicated voices.”

Paul Myron Anthony Linebarger, the author of Psychological Warfare (Coachwhip Publications, 2010 reprint; first published 1948), also wrote science fiction stories under the name Cordwainer Smith. “Mark Elf” (Model Eleven) features an autonomous war robot called a Menschenjäger (Man-Hunter) that has been fighting for thousands of years because it doesn’t know the war ended long ago. It describes itself as the death of all men who oppose the Sixth German Reich and adds, “I am built to identify German thoughts and to kill all men who do not have true German thoughts.” This is the best incentive I’ve ever run across to pay attention in German class, but the key takeaway is that the machine was ultimately subject to human control. The Menschenjäger would have obeyed the order of a German officer to stop fighting, but there were no more Germans (or Americans), and the machine didn’t even know what a Russian was.

Video manipulation was used almost 30 years ago (1994) to depict President Lyndon Johnson awarding a Medal of Honor to the fictional character Forrest Gump, portrayed by Tom Hanks.11 There’s also a Star Trek: Deep Space Nine episode in which Benjamin Sisko, played by Avery Brooks, meets William Shatner’s Capt. Kirk in footage from an original Star Trek episode.12 While these applications were perfectly honest, the same technology can be misused, for example, to depict somebody at a crime scene or with unsavory associates.

The key takeaways from this article are as follows.