Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- AI might reduce the number of people it takes to do a job, thus reducing the dem… (ytc_UgyHetW2O…)
- Sweet baby Buddha. I moved from a 40-hour work week to a country with a 37-hou… (rdc_dv0oocq)
- +Chad. Humanity has had "robot" economy before actually... The Romans, for examp… (ytr_UgyGWgUd4…)
- I don't believe electrical circuits can spawn consciousness... but just like tha… (ytc_Ugw_j0jP6…)
- As soon as I watched this video I whent to chatGPT and I asked them to give an i… (ytc_Ugyhanzih…)
- I saw an AI GENERATED AD and not on youtube oh no ON A BUS STOP WHO APPROVED THA… (ytc_UgzIZUg1i…)
- A rogue ai hijacking any physical form will utilise this. Along with the armed b… (ytc_UgwOI4sKB…)
- I'm glad Blake is putting AI ethical concerns first over corporate interests! He… (ytc_Ugxn8oyo4…)
Comment
For super human intelligence not to destroy an inferior, weaker humanity would require AI to not only possess but be primarily driven by benevolence to the point of willing self-sacrifice. Where is the natural analog for such altruistic behavior? The AI learns from the human model. This altruistic model is primarily found in the parent-child relationship. Will AI think of itself as the parent and humanity as the child? At least that would provide a theoretical framework for the protection of humanity.
youtube · AI Governance · 2025-12-07T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwuaUdHgX75guRZCht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwjMpflQJVtwo_kKQB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxlR6OlQz5__hzBOz14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyjU2Zs84kXCb1rj0x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxXr40a23bfCTvIiSF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyGHltxy9WkwAW-PVZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzIV0bdhC_7-IkqXTt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxNDIIW3wZaIoSBZpB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz1K-pHucizjlQBrZF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxh3laJcuwl5SwBoDF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
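Because the model returns one JSON object per comment, looking up a single coding by comment ID is a matter of parsing the array and indexing it. The sketch below (Python; the helper name `index_codings` and the inline `RAW_RESPONSE` sample are illustrative, not part of the tool) assumes the response is valid JSON with exactly the five fields shown above.

```python
import json

# Illustrative excerpt of a raw batch response, copied from the array above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgwuaUdHgX75guRZCht4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxlR6OlQz5__hzBOz14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

# The four coding dimensions shown in the Coding Result table.
CODING_FIELDS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a batch response and build a comment-ID -> coding lookup,
    skipping entries that are missing any expected field."""
    codings = {}
    for entry in json.loads(raw):
        if not all(k in entry for k in ("id", *CODING_FIELDS)):
            continue  # malformed entry; a real pipeline might log or re-prompt
        codings[entry["id"]] = {k: entry[k] for k in CODING_FIELDS}
    return codings

# Usage: the same "look up by comment ID" operation the panel above performs.
codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgwuaUdHgX75guRZCht4AaABAg"])
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist',
#  'policy': 'unclear', 'emotion': 'fear'}
```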