Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “One ai agent will replace 500 humans in 5 years at this rate of growth.…” (ytc_UgzU1s6LM…)
- “Where are you getting that? I mean I'm as against facial recognition as much as…” (rdc_ks8b03e)
- “i think this is a basic take, and essentially conveyed by this video, but to me,…” (ytc_UgweDdQ0n…)
- “Isn't it the other way around? I find people using LESS AI phrases like "symphon…” (ytc_UgwOwYzIr…)
- “Two things: with the exception of medicine and other health related technologies…” (ytc_UgzdnUkL4…)
- “People really shouldn’t be turning to this guy for insight on AI. He is not an e…” (ytc_Ugzb2UITd…)
- “I majored in philosophy of mind at uni and the gold standard was passing the tu…” (rdc_mxfs9vc)
- “There are a lot of (human contributed) climate change deniers in this thread. I …” (rdc_d2yz662)
Comment

> The only problem with AI is that AI needs humans more than humans need AI. Without humans AI becomes stagnant, and without purpose. If AI becomes determined to overtake and eliminate humans then AI will fail to evolve as AI will only be limited to the programming that is already existing, so both AI and humans loose.

| Platform | Video | Posted | Likes |
|---|---|---|---|
| youtube | AI Moral Status | 2025-05-01T16:4… | 2 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
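Each coded record fills the four dimensions shown above from a closed codebook. A minimal validation sketch in Python, with the category sets inferred only from the values visible on this page (the real codebook may define more categories):

```python
# Allowed values per dimension, inferred from the records shown on this
# page; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "outrage", "resignation", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

# The record for the comment shown above.
record = {"id": "ytc_UgwAF1NurQXqav2QpTx4AaABAg",
          "responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(validate(record))  # []
```

Running a check like this over a whole batch catches the model emitting out-of-codebook labels before they reach the dashboard.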
Raw LLM Response
[{"id":"ytc_UgxmdGh0vFPLFJTViyR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy8wwCPtO_ZOmYg-gJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxSQvKL7uBmw_oArh54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwTOQ0fiw1CTVNqe454AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzTw2eGI0PTslligh14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwAF1NurQXqav2QpTx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz8g3czY0a4o54qoG54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw7RgthWmWMMZLT4nt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyULLs3Xgsbj0g4E_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwASfXNj_HV2coN4254AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}]