Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_Ugw7JKO_z…: Is scrambling Shakespeare or any other literary work “transformative”? That’s re…
- ytc_UgwsKeMB6…: Imagine logging into Twitter and the only thing being talked about is an A.I. th…
- ytc_UgwVcTaAx…: 26:45 "Here we're inventing a replacement for human mind. A new inventor capable…
- ytc_UgxobjdoG…: One word that came from your brain. Something AI could never do. Say. That was …
- ytc_UgyrTI92P…: well it is just a piece of paper, I have a Bachelors degree as well, Ai is also …
- ytc_UgzvG4JWN…: So pretty much using references to make you're own art (something basically ALL …
- ytc_Ugw_rQIJ7…: Ai is basicly a supeeinteligent human. It has flaws, but its smart, when we thre…
- ytc_Ugw9f9Yl-…: someone like me who actually know how Ai Algorithms work and has done some proje…
Comment
Well, in the future, we should make sure that trucks hold their objects better and not drop them. I mean, if we could make this advanced cars, we could prevent this. And because we can program all cars to follow the same system of certain crucial safety rules such as speed limit or distance between each car, or even the lanes where trucks & cars can go, this may not happen. Furthermore, safer parts will arise in the future, I believe.
It's just a bit scary how the butterfly effect could occur in these instances. If, for example, one company may dominate the majority of cars, and that company makes a small mistake with the software, it could cause drastic negative consequences. So... I'm not all-in for self-driving cars.
youtube · AI Harm Incident · 2017-06-22T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
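
Each dimension draws from a small closed label set. As a reference, here is a minimal sketch of that schema in Python, assuming the allowed values are exactly the ones visible in this page's coding result and raw responses; the actual codebook may define more labels:

```python
from dataclasses import dataclass
from enum import Enum

# Label sets inferred from the outputs shown on this page (an assumption,
# not the tool's published codebook).
class Responsibility(Enum):
    COMPANY = "company"
    DEVELOPER = "developer"
    USER = "user"
    NONE = "none"

class Reasoning(Enum):
    CONSEQUENTIALIST = "consequentialist"
    DEONTOLOGICAL = "deontological"
    UNCLEAR = "unclear"

class Policy(Enum):
    REGULATE = "regulate"
    BAN = "ban"
    LIABILITY = "liability"
    NONE = "none"

class Emotion(Enum):
    APPROVAL = "approval"
    OUTRAGE = "outrage"
    INDIFFERENCE = "indifference"

@dataclass
class CodedComment:
    id: str                      # e.g. "ytc_UghemGmrqvMpTHgCoAEC"
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```

Keeping the labels in Enums means an out-of-vocabulary value from the model fails at construction time instead of slipping silently into downstream counts.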
Raw LLM Response
[
{"id":"ytc_UggttszQdOIT0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugjeinq77JWQDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi9KaGK7Pz36HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjT75c_hYfYXngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghemGmrqvMpTHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugg9VdcEV0AJu3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiMecIFIV9nLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugio-4_UVi4xP3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugh2ii7t431fIXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UggBZZ06kKai4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
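
The raw response is a plain JSON array with one record per comment in the batch, which is what makes the ID lookup above possible. Here is a minimal sketch of parsing and indexing such a response, assuming the label vocabulary inferred earlier; `parse_raw_response` is an illustrative name, not this tool's actual API:

```python
import json

# Assumed label vocabulary, inferred from the responses shown on this page.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"approval", "outrage", "indifference"},
}

def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse one raw batch response and index its records by comment ID."""
    records = json.loads(raw)
    indexed: dict[str, dict] = {}
    for rec in records:
        # Fail loudly on a missing field or an out-of-vocabulary label.
        for field, allowed in ALLOWED.items():
            if rec.get(field) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {field}={rec.get(field)!r}")
        indexed[rec["id"]] = rec
    return indexed

# Example lookup against the response above:
# codings = parse_raw_response(raw_text)
# codings["ytc_UghemGmrqvMpTHgCoAEC"]["policy"]  # -> "regulate"
```

Indexing by comment ID up front keeps the lookup O(1) per query, and validating every record before returning means one malformed batch is rejected whole rather than partially ingested.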