Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples

- ytc_Ugz7X4pkx…: "AI is creating tons of jobs because of how much money is being invested into it …"
- ytc_UgweMXfH2…: "AI will be great, but as you already see, people will hate on it for no reason, …"
- ytc_Ugx-HKGeF…: "Algorithms are not perfect. They kept this interview from me for 4 weeks even th…"
- ytc_UgzQbCIlp…: "I don't think even AI could make sound arguments for the existence of God. You'r…"
- ytc_Ugz62csxU…: "Steven is part of the problem, performance green guy but in reality media figur…"
- ytc_Ugx0PlZ1o…: "Ai artists soon lose their income and goals like traditional artists when AI com…"
- ytc_UgxnuQFUz…: "Great to see the discussion on AI touching spirituality. Whatever faith you beli…"
- rdc_oa9ctsj: "Won't this be great when we have self-driving cars with AI and photo recognition…"
Comment

> Self driving cars are not programmed for every single scenario and roadway they may encounter. I would suspect the best of the best to only be programmed to do one thing: learn how to drive. by watching humans do it. They will likely employ machine learning and hence will not be making calculated "decisions" to minimize loss of life, but rather respond more analogously to how a human driver would react.

| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Harm Incident |
| Posted | 2019-01-21T00:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
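Each coded comment is constrained to a small closed set of values per dimension. A minimal validation sketch, with value sets inferred only from the codes visible on this page (the real codebook may define more categories), could look like this; `invalid_fields` is an illustrative helper name, not part of the actual tooling:

```python
# Allowed values per dimension, inferred from the codes shown on this
# page -- an assumption, since the full codebook is not visible here.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval"},
}


def invalid_fields(code: dict) -> list[str]:
    """Return the dimensions whose value is missing or outside the allowed set."""
    return [dim for dim, allowed in DIMENSIONS.items()
            if code.get(dim) not in allowed]


# The coding result shown in the table above passes validation:
print(invalid_fields({"responsibility": "ai_itself",
                      "reasoning": "consequentialist",
                      "policy": "none",
                      "emotion": "indifference"}))  # []
```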
Raw LLM Response
```json
[
  {"id":"ytc_UgyReg2RJcQbRU8fXqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwg9zPgDxoVHbvC0MR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxyNM-AKHsF-2MXWWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6kL7XHAZLi2NywJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgRIau2zSrD54ZIb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwgZBpAS47AyZs-L4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgfBkZSlB2KJ9236h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzgpFP6xoDLiHG7IYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```