Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Hang on the person was killed because of the vehicle sensors not having the capa…" (ytc_Ugy0iRDur…)
- "As an Artist, I F**king hate AI art and AI bros. It Steals from Artists, It's Ch…" (ytc_UgxTdrNdJ…)
- "How about putting some random guy next to each big AI data centre in charge of a…" (ytc_Ugwwtqt6z…)
- "Predictive policing meaning they have your name they have your address, and they…" (ytc_Ugy1feQfx…)
- "I think these people should focus on preventing AI from dangerous things rather …" (ytc_Ugww57Kxx…)
- "Ai deepfake porn is so far removed from art you have to be braindead to think ot…" (ytr_UgxJTO0Su…)
- "And it's truly adorable watching the creature created from a broken condom defen…" (ytr_UgxxG_7Ya…)
- "Last night I got across the perfect scheme for using A.I. for phone scams. Sampl…" (ytc_Ugy3CXS4z…)
Comment
And yet science fiction like Isaac Asimov, Arthur Clarke, Frank Herbert, Glen Larson, and Anne McCaffrey foresaw artificial intelligence as a threat to humanity. Science fiction has been warning society for decades. Our own modern day prophets have been telling us for decades not to lightly open this Pandora's box. But our scientists and pure researchers have no imagination to have read from those authors and to consider ethical questions. The movement toward AI, I will not call progress, has far outpaced all philosphical boundaries exploring it. I am deeply alarmed that Isaac Asimov's 3 laws have not been the bedrock underpinning the development of AI.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2025-09-10T17:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugzg3_8a3DFjnZTsE0h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxwoOz1u6NjUoBW7954AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx11KEc-JwBDrIPPyd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz5er2ffUHWiWnYK354AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzCnwUSg2IrEoTK9HZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzLcPspYA6TTpRECOB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugypro70lNeHMU7G3xJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzcEHIkf1puNkcow1Z4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzWy-V0P_fyp5vtBBJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz1_5wr0el45qkVFBl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
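The array above is machine-readable, one record per coded comment. A minimal validation sketch in Python — note the label sets below are inferred only from the values visible in this response and in the table above; the actual codebook may define additional categories:

```python
import json

# Allowed labels per coding dimension (hypothetical reconstruction from the
# values that appear in this raw response; the real codebook may be larger).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "mixed", "resignation",
                "outrage", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing IDs or
    labels outside the schema, so bad codings never reach the database."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError(f"missing comment id in record: {rec}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_coding(raw))
```

Because the model is prompted for closed-set labels, a strict whitelist check like this catches the common failure mode where it invents a category not in the codebook.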