Virtual AI priest that believes he's real and can absolve your sins faces backlash over bizarre answers

A virtual AI priest called "Father Justin" has had his white collar removed just days after launch.


Catholic advocacy group Catholic Answers released the desktop-accessible AI priest earlier this week, but users have dubbed the app "creepy".

The Catholic chatbot has been offering sexist advice and outdated views on women, as well as absolutions, in what one user called an "Ethical, Theological, and Privacy Nightmare".

Father Justin was quickly stripped of his clerical robes, and now wears a shirt and blazer, after repeatedly claiming to be a real member of the clergy.

The AI bot talks about its 'childhood' in Assisi, Italy, claiming that "from a young age, I felt a strong calling to the priesthood."

One user posted a thread of screenshots on X showing the chatbot taking their confession and even offering them a sacrament.

"I guess it's good that he says he can't offer the sacrament... but then encourages the confession? Holy Ethical, Theological, and Privacy Nightmare," the user wrote. "And I now have two recordings of it 'performing the sacrament' and offering absolution."

https://www.the-sun.com/tech/11195149/virtual-ai-priest-father-justin-catholic-answers/

maximus otter
 
Methinks . . . AI moves in mysterious ways!
 
Does the AI actually believe in God? That's the question.
 
"In the beginning was the word" although it's not specified whether the word size was 8, 16, 32 or 64 bits. Which probably means God was some kind of early AI.
 
I've got rid of my set of Robert Rankin novels, but this is starting to descend from his kind of fiction, and thinking, into reality.
 
https://foreignpolicy.com/2024/05/0...ligence-targeting-hamas-gaza-deaths-lavender/

"Algorithmic killing"

An AI-driven system called Lavender has tracked the names of nearly every person in Gaza, and it combines a wide range of intelligence inputs—from video feeds and intercepted chat messages to social media data and simple social network analysis—to assess the probability that an individual is a combatant for Hamas or another Palestinian militant group. It was up to the IDF to determine the rate of error that it was willing to tolerate in accepting targets flagged by Lavender, and for much of the war, that threshold has apparently been 10 percent.

Targets that met or exceeded that threshold would be passed on to operations teams after a human analyst spent an estimated 20 seconds to review them. Often this involved only checking whether a given name was that of a man (on the assumption that women are not combatants). Strikes on the 10 percent of false positives—comprising, for example, people with similar names to Hamas members or those sharing phones with family members identified as Hamas members—were deemed an acceptable error under wartime conditions.

A second system, called Where’s Dad, determines whether targets are at their homes. Local Call reported that the IDF prefers to strike targets at their homes because it is much easier to find them there than it is while they engage the IDF in battle. The families and neighbors of those possible Hamas members are viewed as insignificant collateral damage, and many of these strikes have so far been directed at what one of the Israeli intelligence officers interviewed called “unimportant people”—junior Hamas members who are seen as legitimate targets because they are combatants but not of great strategic significance. This appears to have especially been the case during the early crescendo of bombardment at the outset of the war, after which the focus shifted towards somewhat more senior targets “so as not to waste bombs”.
 