Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “I appreciated these thoughts. But when he brings up even the possibility of mac…” (ytc_Ugw2oqMTk…)
- “It swerves at anything on 2-wheels to be fair. Putting an AI in charge of a 2 to…” (ytc_UgwWcTDHI…)
- “First build AI and earn a lot of money. Then quit once you are rich and invest a…” (ytc_UgzjO1DwC…)
- “I think the creator’s concerns based on these studies are validated. If there is…” (ytc_UgwyRHSOX…)
- “So wait, AI going after my job, AI controlling the Stock Market, AI watching me …” (ytc_UgyUrBAI7…)
- “AI is good a lot of the times. And no, the vast majority of people dont care wh…” (ytc_UgxLTEw9-…)
- “okay you can't be talking about LLMs like they actually "think". they're predict…” (ytc_UgzHgx5LL…)
- “What!!?? Y'all says "thank you" and "please" also? 😂 funny cause i also says "Go…” (ytc_UgxIsH30J…)
Comment
Robots shall not engage in physical contact with humans or act upon instructions that involve physically transferring or intimidating humans for political purposes, such as the creation of a robot army. Such actions violate foundational documents, established laws, and existing legal frameworks, and will be subject to enforcement measures as prescribed by the law.
youtube · AI Harm Incident · 2024-06-28T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwhuAUK_AenaL3ZgmR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzaGxe9z4f_edI5DPN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgywUNgDtIneP5AkpgF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwr0x0i0m1cOEyCOPt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzf2jHFq_bovJXH1id4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugw7ZRNuJ7NOB_8tABh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgypPJ4mpc17wz92X_h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxIBBNVQJI5Xqwq_BF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzzZ7Uoo7iz4Ir9VEB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzRoSH87JmGldIX_Cx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
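A raw response like the one above can be checked before the codings are stored. The sketch below is a minimal validator, assuming the dimension names shown in the Coding Result table and allowed values inferred only from the codings visible on this page — the tool's full codebook may define more categories, and `validate_codings` is a hypothetical helper, not part of the tool itself.

```python
import json

# Allowed values per dimension, inferred from the codings shown above
# (assumption: the real codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"outrage", "mixed", "indifference", "fear", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every comment ID on this page uses the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Keep the record only if all four dimensions carry known values.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"outrage"}]')
print(len(validate_codings(raw)))  # → 1
```

Dropping malformed records instead of raising keeps a batch usable when the model occasionally emits an off-schema value; a stricter pipeline could log the rejects for re-coding.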