Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgylLTF8c…: I somehow made flamingo simp for me all because I made him drink fabreeze (the a…
- ytc_UgxhvofqZ…: It's the " create a criminal program" there's really no algorithm to it.. they'r…
- ytc_UgwyEHTWy…: Lol I once said "I'll never touch AI! It needs to die! Worst creation since the …
- ytc_UgwOquxPq…: AI artists need to be sent to some room without any internet,,js paper and penci…
- ytc_Ugwl2hJGf…: The scenario makes a good screenplay, like an update of the Terminator.... But h…
- ytr_Ugz8eQXCx…: Because ai art uses other peoples art as a source, taking peoples hard work with…
- ytc_Ugx6HnJY3…: I highly doubt AI can fix a clogged sink or put in a new light fixture in the fu…
- ytc_UgwJ_rqwr…: Calling AI "intelligent" is really ludicrous. It's a huge pot of stolen stuff an…
Comment
The conflict between existing law and future law is one of the most difficult issues. Hypothetical people's agreement had been one of the solutions. The problem is people will agree with the AI's decision. Many people already do not believe our judicial system which many professionals have put their effort to build for justice. When can we meet the singularity?
youtube | AI Harm Incident | 2019-05-14T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyReg2RJcQbRU8fXqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwg9zPgDxoVHbvC0MR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyNM-AKHsF-2MXWWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6kL7XHAZLi2NywJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgRIau2zSrD54ZIb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgZBpAS47AyZs-L4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgfBkZSlB2KJ9236h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgpFP6xoDLiHG7IYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
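The Coding Result table above corresponds to one entry of this JSON array, selected by its comment ID. A minimal sketch of that lookup in Python, assuming the model output parses as a flat JSON array of coding rows (the `lookup` helper and the two-row sample string are illustrative, not the app's actual code):

```python
import json

# Illustrative two-row sample in the same shape as the raw LLM response above.
RAW_RESPONSE = """[
{"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzgpFP6xoDLiHG7IYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]"""

def lookup(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding row for one comment ID."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            return row
    return None  # ID not present in this batch

coding = lookup(RAW_RESPONSE, "ytc_UgzFCEuEWdDiAtznUXV4AaABAg")
# coding["responsibility"] is "ai_itself" and coding["emotion"] is "fear",
# matching the Coding Result table for this comment.
```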