Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> Tesla auto pilot just needs a better accuracy than the human mind, like Facebook's face recognition which is better than humans recognising each other. Not really that far away i guess. Ai gets better with their expanding dataset

Source: youtube · Topic: AI Harm Incident · Posted: 2021-06-25T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxqxhrmCSg_-wJRRQ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzYkM-5HAiz5imIXqZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFwDAyzNBitOcUqxd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyM3KtUBvZlpAbBz4J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyrTs0o6jMqibIjf0B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8Na9QkwjBcaMfB0l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyX6Fp1Q5aWTfd7OyV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTkAvhNX6zGDf7YKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKIPRVdwlPPVhTRVN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwapLfbHNJFPMGanfp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"})