Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@titankronos65173 what even is your point, man?
People you are talking about, wo…
ytr_UgwD-NkWF…
Why are people worried about not having a job in a world where nobody has a job?…
ytc_UgyPFToCQ…
5:30 = Impact of AI? Teachers are finding students are learning zero. They are j…
ytc_UgzWhhcce…
Encouraging curiosity may be a better way to align hypothetical Human-like AI. …
ytr_UgwSAEAsi…
Um.. It won't take people!/artist job. Yes, number Of people working on the job …
ytc_Ugx0mF3zq…
What AI Should be: Hehe funny donald trump robbing king supers in a mario costum…
ytc_UgzxIMspT…
I make a point to do research from print books and journals and thus avoid AI de…
ytc_Ugztp7snh…
Currently Autopilot is still work in progress. Before any Tesla owner can access…
ytc_UgyoednUG…
Comment
@nmnm7742 I actually don’t believe AI will naturally become malevolent. Look at any of the GPT’s. They all use large language models that require months of training before they can be used. How AI’s are trained, programming hard guardrails and limitations can all be used to prevent AI’s from running amok.
youtube
AI Governance
2025-08-14T16:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugz_cITMkV0Ru-Ow3NV4AaABAg.ALoBbi7hYqMALoNks23oYM","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_Ugz_cITMkV0Ru-Ow3NV4AaABAg.ALoBbi7hYqMALoQuqUxpxG","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugz_cITMkV0Ru-Ow3NV4AaABAg.ALoBbi7hYqMALoaWrfsJX8","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugz_cITMkV0Ru-Ow3NV4AaABAg.ALoBbi7hYqMALoiVGl4u4X","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyP4S6itfDbysIxEfl4AaABAg.ALoBAt_bL_XAQEWY7RBVMT","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyP4S6itfDbysIxEfl4AaABAg.ALoBAt_bL_XAV7aczyJMkz","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugwgch7-6-py8POyE4J4AaABAg.ALoAM4cLQwcALoBGm4JO6c","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_Ugwgch7-6-py8POyE4J4AaABAg.ALoAM4cLQwcALoBInJEQ3O","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgycyUcswE6mb4w-qXF4AaABAg.ALo7ba07_rXALo8CxFJ5AZ","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgycyUcswE6mb4w-qXF4AaABAg.ALo7ba07_rXATQAOpLIDiB","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
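The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be parsed, validated, and indexed by comment ID before storage (the allowed value sets below are inferred only from the values appearing in this sample, not from an actual codebook, and the function and variable names are hypothetical):

```python
import json

# Dimension values observed in this sample batch; a real codebook
# (assumed, not shown here) may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"industry_self", "regulate", "unclear", "none"},
    "emotion": {"approval", "fear", "outrage", "mixed", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID.

    Raises ValueError if any row carries a value outside the allowed
    set for its dimension, so bad batches fail before being stored.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        # Keep everything except the ID itself as the coding result.
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded

# Example with a shortened, hypothetical ID:
raw = ('[{"id":"ytr_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"approval"}]')
coded = validate_batch(raw)
print(coded["ytr_x"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each coded comment resolves in one dictionary access.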