Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “Geoffrey Hinton spoke for barely more than a minute before trashing the CEO of …” (rdc_lr7e97j)
- “I'm training one right now. It calculates probabilities nothing more. I ask a qu…” (ytr_UgwKxEZas…)
- “These idiots... poor writing is what is killing the industry and they think AI c…” (ytc_UgxE9vUc4…)
- “What I tell my students is that if AI does their work for them, ultimately they …” (ytc_Ugxk-f2uQ…)
- “They are normalizing the AI look so we can't tell the difference between the rea…” (ytc_UgyaFVG2r…)
- “This is a garbage interpretation of this MIT study LOL. Here's literally from th…” (ytc_UgyzU_oPe…)
- “The Tesla Autopilot system has 1 crash per 4 million miles of driving. A typica…” (ytc_Ugz63UIiN…)
- “I did a facial recognition project in college and my god while I imagine face re…” (rdc_ghd8m7a)
Comment
The real danger will be Ai building biological weapons that only humans can succumb to, leaving Ai as the only intelligent existence. However, as a Christian, I don’t believe that all of humanity will ever be destroyed, as I believe the book of revelations, which speaks of a similar concern. People will be oppressed by technology and the evil behind it. And many lives will be taken, but not all of humanity.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2025-08-21T03:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzr7ooWQo8H3zE92oh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy97alpbD5FZUAkuTx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz1WKlzuY4b2dnlkid4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyGBubWIlQWN2mml5Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxT0PdZpJCyLVxTKEF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyPqTE1cwAZMnS_tgl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxXTfE4n7F8_8XfxrV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxaV0c-v44GCT9FL214AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwasi7JQ0h5ICaw40x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyA8bzehp6charv9LF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
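A raw batch response in this shape can be parsed and indexed by comment ID with a few lines of Python. This is a minimal sketch, not the tool's actual loader: the field names and ID format follow the response shown above, but the `index_batch` function and the fallback value `"unclear"` are assumptions for illustration.

```python
import json

# The four coding dimensions present in each record of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response and index its records by comment ID.

    Hypothetical helper: keeps only the known coding dimensions and falls
    back to "unclear" if the model omitted one.
    """
    indexed = {}
    for rec in json.loads(raw):
        indexed[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return indexed

# Example: one record copied from the batch above.
raw = '''[
  {"id": "ytc_UgxXTfE4n7F8_8XfxrV4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]'''

coded = index_batch(raw)
print(coded["ytc_UgxXTfE4n7F8_8XfxrV4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one dictionary lookup per inspection rather than a scan over the batch.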