Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The quality of this video is really amazing. How did you get these animations? S…" (ytc_UgypQ4h8n…)
- "So an author reads a few books and gathers knowledge about vampires and write hi…" (ytc_UgyE-GPy5…)
- "uhoh. if AI takes all the jobs, how will people prove they deserve the right to …" (ytc_UgxM69hLV…)
- "@Agudname ... No... You find prints/poster/etc... Which is a physical item some…" (ytr_UgxchyFSo…)
- "On the topic of styles it is different Artist do get inspired by other people's …" (ytc_Ugy51O4_3…)
- "@laurentiuvladutmanea This whole debate is like people that sold horses are sudd…" (ytr_UgzZXnEMT…)
- "Sure, a real artists' artwork may have actual soul and motivation behind it. And…" (ytc_UgzBhAc3M…)
- "Idk if this is true. My team hired way less people this past cycle, because of A…" (rdc_oaby4uj)
Comment
Most "Experts" dont say that this will happen. Most experts say that, because we are making robots do really only one job, theyu dont have the capability to purposely kill. While you can make a robot that is designed to kill, they would also be easy to kill. It's always humanoid robots that people are scared of. Humanoid robots are very slow (unless it is the old Atlas robot) and have such a hard time balancing. Bipedal robots are very terrible in principal. Robots can have guns, but a robot having to aim a gun is very terrible and hard. I dont know. People fear what they do not know.
youtube · AI Harm Incident · 2025-09-04T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzMb2OxiiEgG0_JyZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxDiqVLhXKADVi-Hyp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwOGp2xGhQWxNe1tNF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0iqwXHFrLL5Z97RR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx2ZOLqO-nspmgo_hZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyQs-GBZIJaeoCoFCl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgztRvDUxHWY0qPrQzJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygU4X9faQRpdW9xlN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxqRKPnBHop_3OvtYx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyHgj5vj72BRXzCDZ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
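A raw response in this shape can be parsed and queried per comment ID in a few lines. The sketch below is a minimal example, assuming the response is a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields as shown above; the `lookup` helper and variable names are illustrative, not part of any tool shown here. The two records are copied verbatim from the response above.

```python
import json

# Two entries copied from the raw LLM response above (assumed shape:
# a JSON array of per-comment coding objects).
raw_response = """
[
  {"id": "ytc_UgzMb2OxiiEgG0_JyZx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwOGp2xGhQWxNe1tNF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

def lookup(codes, comment_id):
    """Return the coding dict for one comment ID, or None if absent."""
    return next((c for c in codes if c["id"] == comment_id), None)

codes = json.loads(raw_response)
code = lookup(codes, "ytc_UgwOGp2xGhQWxNe1tNF4AaABAg")
print(code["emotion"])  # -> approval
```

Looking codes up by ID rather than by list position is what makes the coding result auditable: the same ID appears in the sample list, the raw response, and the rendered table.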