Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Nice A.I created video. None of this actually happened. This video is what happe…" (ytc_UgycN0Ygy…)
- "@joannroeter2421 Yes. How about an ai system that is just for determining if st…" (ytr_UgxbtxoYn…)
- "Looks like you need to go post the instructions on other subs. r/LifeProTips, r/…" (rdc_mnje45c)
- "Why? AI will be *safer* than human operators. Robots & AI already run factories …" (ytr_UgzVVa4Ht…)
- "The second word in AI is intelligence and you can't control intelligence. Even n…" (ytc_Ugw2EUdnn…)
- "Not gonna lie, autopilot is not as good as I thought it was. There no way Tesla…" (ytc_UgwFdAFcL…)
- "If Ai is replacing all jobs then we caan haave Ai government officials..Ai pres…" (ytc_UgxLU1a5j…)
- "Don't life insurance companies already hire life risk assessors to determine the…" (ytc_Ugw9dFEvN…)
Comment (source: youtube, posted 2019-01-12T21:0…):

None of the most powerful countries Can say "We wont make autonomous weapons" because it would grant any other powerful country that didn't stop a military advantage that will make Nukes look like pointy sticks in comparison. There are massive, world changing ramifications for a large scale nuclear conflict, but sending drones instead of people doesn't irradiate half the world or kill a hundred million civilians, once you have them there is no reason not to use them in every conflict, and pandoras box is already open. I support a global ban on autonomous weaponry, but also understand that it will never ever happen. This isn't the Nuclear non-proliferation treaty, it would be like trying to ban the world from using planes or tanks or guns.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyrplhsBT3nFWgCI9B4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyksSvqXMK_yGMUxGh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzm7evec27OcKA8eOB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPFsAYP-e5Vuw8fBV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz70OAVGz0ggTxQuZt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7jIJbwjjOYNVRtf94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw6_kKDHyaWdkLVhVt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxOvlAcjFraYeFgGad4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxbcmSie5k5VI-jOWB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzp8zSaWb45ssGAl994AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
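The raw response above is a JSON array of per-comment codings, which makes the "look up by comment ID" workflow a simple parse-and-index step. Below is a minimal sketch in Python: it parses a response of this shape, builds an ID-to-coding map, and flags any value that falls outside the sets observed in this page's output. The placeholder IDs and the `ALLOWED` value sets are assumptions inferred from the sample; the project's actual codebook may permit additional values.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, shaped like the
# sample above. The IDs here are shortened placeholders, not real comment IDs.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

# Value sets observed in this sample output; the real codebook may allow more.
ALLOWED = {
    "responsibility": {"government", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

codings = json.loads(raw_response)

# Index by comment ID for the "look up by comment ID" workflow.
by_id = {c["id"]: c for c in codings}

# Flag any (id, dimension, value) triple outside the expected value sets,
# which catches malformed or off-codebook model output early.
problems = [
    (c["id"], dim, c.get(dim))
    for c in codings
    for dim in ALLOWED
    if c.get(dim) not in ALLOWED[dim]
]

print(by_id["ytc_example1"]["policy"])  # regulate
print(problems)  # [] when every coding uses known values
```

Keeping the validation step separate from the lookup means a single off-codebook value is reported rather than silently stored alongside clean codings.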