Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
US is trying to weaponize AI, to help it gain in development of more potent arse…
ytc_Ugym94n9s…
This shit, is fucking scary. I mean, really. I don't get scared about a great ma…
ytc_UgxkDWkBA…
The question I have when current AI SOUNDS conscious is is it just MIMICRY based…
ytc_UgwtrUYhF…
My fiancee and I went out for a meal on the weekend. We always have an unwritten…
rdc_nkrknts
For scams like that. I hope that AI is smart enough to avoid helping scammers.
T…
ytc_Ugzc7O9KE…
@sandponics took me 1 second, i have no idea what this it 😂 , but i will send yo…
ytr_UgxV2wJVZ…
Well..the bigest problem of AI its the energy.But im sure they ll find a way on …
ytc_UgymYNas4…
all nonsence!
the "threat" of AI is just what they want you to beLIEve!
never f…
ytc_Ugy6eRwR0…
Comment
Sam Altman in 2015: "Superhuman machine intelligence is probably the greatest threat to the continued existence of humanity."
Sam Altman in 2024, when asked about the existential threats posed by superintelligent AI systems: "I have faith that researchers will figure out how to avoid that."
youtube
AI Moral Status
2025-10-30T19:5…
♥ 149
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
  {"id":"ytr_UgzBT387s47sOwcPs1Z4AaABAg.AOuzlSWvmj5AOwDFgx-cK1","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwi7Qu-Q2vLXsVZsfV4AaABAg.AOuzXkdEEnJAOw2p3xmKUs","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv1oJdpryn","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv1scxVXZN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv2_Whq0Wm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv2j47kuHW","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgymJuLeNqo5aW-ONxh4AaABAg.AOuyo4FEdDqAOv-NSCiGSt","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgymJuLeNqo5aW-ONxh4AaABAg.AOuyo4FEdDqAPzviDnf765","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
  {"id":"ytr_UgzWno767nWZhBfYPcd4AaABAg.AOuyWeLhbX4AOvE5gV7R3r","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugzbpe_VtRtLrfYT2q14AaABAg.AOuy9JwWL3RAOv4mi7BeIE","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
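A raw response like the one above can be parsed and indexed by comment ID with a short Python sketch. The field names match the JSON; the allowed category sets are an assumption inferred from the values visible in this dump, not a confirmed codebook, and `ytr_demo1` is a made-up ID for illustration.

```python
import json

# Allowed values per dimension: inferred from this dump, NOT an official codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "approval", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into a
    dict keyed by comment ID, rejecting records with unknown values."""
    codings = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
        codings[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return codings

# Minimal usage example with a hypothetical comment ID.
raw = ('[{"id":"ytr_demo1","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytr_demo1"]["emotion"])  # fear
```

Validating against a fixed value set catches the occasional malformed or hallucinated label before it reaches the coded dataset.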