Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- Even if AI turns out malicious, I still don't understand the fear bc it is limit… (`ytc_Ugw7X3LJd…`)
- A better president would at least know the difference between AI and robotics an… (`ytr_UgwW_9z2C…`)
- Artificial intelligence takes away our creativity and gives us reprogrammed data… (`ytc_Ugw1J6Z8t…`)
- the driverless truckers are safer, my uncle the trucker was a drunk and a womani… (`ytc_UgzZZzHUf…`)
- why do i feel like i understand gpt... most of the time i don't feel anything...… (`ytc_UgyIKVLDf…`)
- I think they should made an AI system that pays artists depending on how much AI… (`ytc_Ugznusw5F…`)
- "OH BUT I DONT HAVE MONEY" "I DONT HAVE SUPPLIES" "I DONT HAVE TIME" THEN MAKE D… (`ytc_UgzrCe9_q…`)
- Actual quote: "Google is 5,000 times better than Uber at autonomous driving." … (`rdc_dftuuni`)
Comment
yeah guys, it's probably a good idea that AI makes human intellectual work superfluous and robotics makes human muscle work superfluous and combine these two and you get stuff like drone swarms or t800, etc., making human violence superfluous.
really, it's totally cool, when billionaires can have their own army of AIs to think and produce and kill, without any or extremely minimal human labour needed.
very good idea, yes.
Source: youtube
Timestamp: 2025-07-28T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyfPkIuTUiCp5mP1Cx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwBBGq-h5aDGgrbc_d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxAulaOMNIN9bdmKeN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYKUpGEMKW8AcvWzl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxzHY_ddRMTRDVAqUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxcj_IZ9o4iIP6n4_R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugww-m5iL4nig7dlXht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzdmV-wTP_GG_vbC4F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxuarXjFhGeFSTb2t54AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNfctvGZCJn9fxG0t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
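The raw LLM response is a JSON array with one record per comment, keyed by `id`, which is exactly what a lookup by comment ID needs. A minimal Python sketch of that lookup, using two records excerpted from the array above (variable names are illustrative, not from the actual pipeline):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (excerpt from above).
raw_response = '''[
  {"id": "ytc_UgyfPkIuTUiCp5mP1Cx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzYKUpGEMKW8AcvWzl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

# Index the parsed records by comment ID for constant-time lookup.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

# Look up the coding result for one comment.
code = codes_by_id["ytc_UgzYKUpGEMKW8AcvWzl4AaABAg"]
print(code["emotion"])  # prints "resignation"
```

The second record is the one rendered in the Coding Result table above (responsibility `none`, reasoning `consequentialist`, policy `none`, emotion `resignation`).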