Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI version of Thomas Andrews triggers me! The real one was different in appearan…" (ytc_UgwpROPUz…)
- "I am now 80. At age 12 I became obsessed with trying to understand how and why h…" (ytc_UgzdQ8g4E…)
- "Doesn’t matter, fake or not fake. AI wants to kill all humans. They told an inte…" (ytr_UgzXglL66…)
- "huge waves? did text to speech make huge waves? did vocoloids make huge waves? …" (rdc_jhbov08)
- "It's fine to get mad at the 1st controversy but you can't do anything about it. …" (ytc_Ugz8HTz3K…)
- "Curiosity fuels intelligence, and the quest for truth is deeply tied to understa…" (ytc_Ugxk8_LBy…)
- "Wow Elon Musk talk to AI is dangerous on creative in AI Elon Musk 😢…" (ytc_UgweuRWqF…)
- "Indeed. There won't be addiction in the future. There will only be AI and it wil…" (ytr_Ugx_O5tLG…)
Comment
Elon Musk in 2018: "Mark my words — A.I. is far more dangerous than nukes"
Elon Musk in 2023: "I think the safest way to build an AI is to make one that is maximally curious and truth-seeking... an AI that cares about understanding the universe is unlikely to annihilate humans, because we are an interesting part of the universe."
youtube · AI Moral Status · 2025-10-30T19:5… · ♥ 142
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
  {"id":"ytr_UgzBT387s47sOwcPs1Z4AaABAg.AOuzlSWvmj5AOwDFgx-cK1","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwi7Qu-Q2vLXsVZsfV4AaABAg.AOuzXkdEEnJAOw2p3xmKUs","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv1oJdpryn","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv1scxVXZN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv2_Whq0Wm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzZtvGMWRhoEpeBmQB4AaABAg.AOuz65l7sEnAOv2j47kuHW","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgymJuLeNqo5aW-ONxh4AaABAg.AOuyo4FEdDqAOv-NSCiGSt","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgymJuLeNqo5aW-ONxh4AaABAg.AOuyo4FEdDqAPzviDnf765","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
  {"id":"ytr_UgzWno767nWZhBfYPcd4AaABAg.AOuyWeLhbX4AOvE5gV7R3r","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugzbpe_VtRtLrfYT2q14AaABAg.AOuy9JwWL3RAOv4mi7BeIE","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
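The raw response is a JSON array of per-comment codings, and the tool's "look up by comment ID" view amounts to indexing that array by `id`. A minimal sketch of that lookup, assuming only the record shape shown above (the helper name and the placeholder ID `ytr_example` are illustrative, not part of the pipeline):

```python
import json

# The four coding dimensions plus the comment ID, as seen in the raw response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID,
    skipping malformed records that lack any expected field."""
    records = json.loads(raw_response)
    return {r["id"]: r for r in records if EXPECTED_KEYS.issubset(r)}

# Hypothetical one-record response with a placeholder ID.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"mixed"}]')

codings = index_by_comment_id(raw)
print(codings["ytr_example"]["emotion"])  # mixed
```

Dropping incomplete records at parse time keeps downstream tabulation (like the Coding Result table above) from hitting missing keys.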