Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews):
- "It so the first vid is AI for sure, like no one sits down anymore ESPECIALLY on …" (ytc_Ugz-BF25a…)
- "And yet, in truth, AI has had very little impact on the average person's life in…" (ytc_UgyXm4efb…)
- "Ill be honest this mostly seems like a scam to drum up fear and therefore make i…" (ytc_UgxUXgME2…)
- "I worry this might be a double edged sword. A few weeks ago Google released thei…" (ytc_UgwIKCBtX…)
- "its funny that Sam Altman does the whole culty lovebombing thing to stroke peopl…" (ytc_UgxCdW88Z…)
- "in the current state of AI humans wont have any choices in the next 50 years...…" (ytc_UgxhqOGd7…)
- "I think it’s fine to be used as long as people are just using it for Fonzi’s and…" (ytc_UgxzGxh2y…)
- "Whats disgusting is not only did engineers ignore nearly a century of automotive…" (ytc_Ugyb-M-j4…)
Comment
the only thing more worrying than swarms of shaped-charge drones is swarms of SJW idiots who make petitions to "Ban autonomous weapons":
1. I am sure they have the same petition in Russia/China/Iran/Syria/Turkey/NorthKorea
2. Of course, if enough people sign the petition, we are going to prevent the mere integration of otherwise readily accessible technologies. And by doing so, we would ensure that the terrorists won't take the lead with developing a super powerful weapon while we are sleeping on this concept and ways to counter it.
3. We are far better off with our current military weapons:
- Carpet bombing
- Some teenager driving a tank
- Nuclear weapons
youtube · AI Harm Incident · 2018-11-01T18:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwxt0ZYo_8UPxhmMeF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwzrvj0NqodLMa21Ft4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx75cHjhyOWxs_QjaJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy7kqa4H6EtheQE54p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyeymu0Ei4-PrBjm5h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwMdf4OLHL9QnNF-w94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwaLE2xshherIxxKNF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2pkXyhE-U4r1TRct4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyTu7Z0jdApHPa2U7R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugzqi_jhUKJQ-9pRT1t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]