Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "On one end I can understand the fear for AI,on the other hand as someone that's …" (`ytc_Ugxz0S2OC…`)
- "Been wondering about that: 5000 AI drones, from a transport plane, sent into a s…" (`rdc_oi3pttx`)
- "If AI is logic based for "reasoning" humans already are more intelligent as we h…" (`ytc_UgwHuR7mW…`)
- "All these ex-researchers that keep warning about A.I are writing complete fanfic…" (`ytc_UgxqbHWKv…`)
- "Please do a video on how to remove copilot from the new outlook. It will get a l…" (`ytc_UgySpraAn…`)
- "Amazon says the public reason for this isn't related to AI but Covid Over hiring…" (`ytc_UgwBuOIas…`)
- "Google probably doesn't want a sentient Ai so they don't have to cut it a payche…" (`ytc_UgzzBPQuJ…`)
- "you guys know about a thing know as pulling the plug and repluging it? plus ai …" (`ytc_UgwZh_Z4z…`)
Comment
Y'all to scared of ai. Just like Aliens its unclear what the wants are for ai and it could be dangerous to build ai mimicking humans. Humans are inherently violent and greedy but the the weaker humans come together to protect them selves from those individuals that have power. Power + Human = bad. Making ai with greed and and lust for power is dangerous but ai without that is alot safer, not totally safe but definitely safer.
youtube · AI Moral Status · 2020-06-12T18:0… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx1cBIxtZ0K6ntiT9F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxWksYQegmFYDnoDAB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyZtk_aHWPsFKw_xBJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxDiX1z9mssTd3hU_R4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxCUXPijU25S6BT9U94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwW1YxRNRsLY1ldJb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgwBIUeXbNvCZFEysUB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzzDVP7OIvyM2vAX194AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwl0nMb7P2KcWijvJ94AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugya-r-k7yRuYKO2Ng14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
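The lookup-by-ID view above can be reproduced with a short sketch: parse the raw batch response, check that each row carries the four coding dimensions from the result table (responsibility, reasoning, policy, emotion), and index the rows by comment ID. The function name `index_by_id` and the strict missing-field check are illustrative assumptions, not part of the tool; the sample rows are taken from the response above.

```python
import json

# Two rows copied from the raw LLM response shown above (truncated for brevity).
raw = """
[
 {"id":"ytc_UgyZtk_aHWPsFKw_xBJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
 {"id":"ytc_UgzzDVP7OIvyM2vAX194AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
"""

# The four dimensions displayed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    by_id = {}
    for row in json.loads(raw_json):
        missing = [d for d in DIMENSIONS if d not in row]
        if missing:
            # Reject malformed rows rather than storing partial codings.
            raise ValueError(f"{row.get('id')}: missing {missing}")
        by_id[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return by_id

codings = index_by_id(raw)
print(codings["ytc_UgyZtk_aHWPsFKw_xBJ4AaABAg"])
# → {'responsibility': 'distributed', 'reasoning': 'virtue', 'policy': 'industry_self', 'emotion': 'mixed'}
```

Note that the first row's values match the Coding Result table for the displayed comment (distributed / virtue / industry_self / mixed), which is how the per-comment view and the raw response can be cross-checked.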