Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Robots are just tools in the hands of man, therefore they do not need rights. Th…" (ytc_Ugwl_Pd8U…)
- "I most definitely want a robot that looks like Dolores from Westworld. If I coul…" (ytc_UgzSk3HS_…)
- "It`s so nice for the wealthy people that created AI to tell everyone how screwed…" (ytc_Ugx9eXqdw…)
- "I hope A.I. really do this job completely. I worked in a callcenter for years! I…" (ytc_Ugy52k25T…)
- "NO TO FACIAL RECOGNITION, THIS IS IDENTITY THEFT, THE RULE OF LAW IS UPHELD, UPH…" (ytc_UgxrTgv_f…)
- "Surely with Hegseth at it's helm. the Pentagon has all the "artificial intellige…" (ytc_UgxasJHOp…)
- "Maybe having just one tail light on the back of the motorcycle might help, in th…" (ytc_UgxXVSuIh…)
- "As a musician, I get periods of self doubt over my work. Like every artist, I ha…" (ytc_Ugy0CYuoB…)
Comment
Compared to AI, a nuclear bomb is just a child's toy. In the near future it will be virtually impossible to distinguish between reality and AI generated things. And with this, 'traditional values' like good/bad, true/false, right/wrong etc. will completely disappear, as will any basis for trust.
AI is just a tool, and as a tool it is neither good nor bad. The big problem is that the rules will be defined by people, and that with good or (more likely) bad intentions. And exactly this, and not 'intelligence', will define the output. As we can already see, big companies, for example, will never use AI in favour of their customers, but only in their own favour.
Source: youtube · AI Harm Incident · 2025-12-31T10:2… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ugyaos0CjOSWACNuVJ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0aOBJJ3g01kpp6Sh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyl-obNHU5JSYt6gdl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgybJJ_pvgt32w0RiNV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxTGhLYFiNDITMXXk94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxB13KC9GhcjZ4MsaJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyfOILANgckOhBiR2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwI4jq_hp6ypmAbJLh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwtdpc4R9jSV55cOy54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugw6UtqxiAjWVdj3L8N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```
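The "look up by comment ID" step above amounts to parsing the raw response and indexing it by `id`. A minimal sketch, assuming the response is a JSON array of per-comment objects as shown and that missing comments fall back to `unclear` on every dimension (the `lookup` helper and the fallback behavior are assumptions, not part of the tool):

```python
import json

# Raw LLM response: a JSON array with one coding object per comment
# (two entries from the sample above, abbreviated for illustration).
raw = """[
  {"id": "ytc_Ugyaos0CjOSWACNuVJ94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyl-obNHU5JSYt6gdl4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment; unknown IDs get 'unclear'."""
    default = {"id": comment_id, "responsibility": "unclear",
               "reasoning": "unclear", "policy": "unclear",
               "emotion": "unclear"}
    return codings.get(comment_id, default)

print(lookup("ytc_Ugyl-obNHU5JSYt6gdl4AaABAg")["emotion"])  # outrage
```

The `unclear` fallback mirrors the Coding Result table above, where a comment the model did not (or could not) code shows `unclear` in every dimension.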