Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:
- You got this wrong… “The most important thing for most Ai leaders is to create …” (ytc_Ugxi2bN4f…)
- It’s like the AI geeks have never watched a single AI based sci-fi movie…not a o… (ytc_UgzpRcm0f…)
- The thing about selfdriving cars being safer and making less trafic only applies… (ytc_Ugy-o9Zig…)
- We are already a twisted society. This was always going to happen. However, with… (ytr_UgxXYLWeE…)
- Have you ever seen how xenophobic most of the world is? This AI is simply modeli… (ytc_UgwTzCBYX…)
- In regards to the Ai having self preservation, have every action checked by anot… (ytc_UgwYo-cob…)
- Yeah I already accepted that AI is here to stay. But the fact that it looks like… (ytc_UgyHX9nyo…)
- If AI is so intelligent, then why would it even consider destroying the minion r… (ytc_Ugz4xWmAR…)
Comment (source: youtube, posted 2026-02-18T21:0…)

> To long, couldn't finish. Will stop you, right there at 5 minutes in, tell you that people are awful drivers. It is a low bar for full self driving cars to cross to be a better driver than human drivers. The robot has cameras that can see and computers that can make decisions at a fraction of a what it takes humans. So...like it or not...the autonomous vehicles will be much safer to such an extent that it might be the humans that are banned from driving. There is something like 50,000 average people that die in automobile accidents per year. In major US cities there is at least one fatality accident per day where human error is most always attributed.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxCTczrF5_qG1UUpDV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzYPogieASQSzrl2ph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugygc3uEdJ-Q5Y5dhXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtyonzmXa5B4InHBt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzD7E8oNfnyM88rAj14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzj_C2Xy6ktxcCW6AV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJMzlSrGOZBk96mfN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSv52XjbclNkHhgyl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3mrf9OIXdGjryaZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTjAGRcXrUJB3m_6J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
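The raw response is a JSON array with one object per comment in the batch, so a coded record can be recovered by its comment ID. Below is a minimal sketch of that lookup, assuming the array is saved to a file as shown above; the file path and function names are illustrative, not part of the tool.

```python
import json


def load_batch(path: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a lookup table keyed by comment ID."""
    with open(path, encoding="utf-8") as f:
        items = json.load(f)
    return {item["id"]: item for item in items}


def coding_for(batch: dict[str, dict], comment_id: str) -> dict:
    """Return the coded record for one comment; raises KeyError
    if that ID was not coded in this batch."""
    return batch[comment_id]


if __name__ == "__main__":
    # "raw_llm_response.json" is a hypothetical path for illustration.
    batch = load_batch("raw_llm_response.json")
    record = coding_for(batch, "ytc_UgxCTczrF5_qG1UUpDV4AaABAg")
    # The same four dimensions shown in the Coding Result table above.
    for dimension in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"{dimension}: {record[dimension]}")
```

Given a comment ID, this returns the same four dimensions displayed in the Coding Result table for that comment.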