Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Robots taking our jobs - a disaster? Ana, this has been happening for the last 200 years at least. Automated machinery kept requiring fewer and fewer workers. We didn't die because of it. We started living better lives, safer lives.
No benefit from AI? Ana, pull up the statistics of deaths from traffic accidents. Now compare that to the statistics of traffic accidents caused by self-driving cars. Now compare that to modern-day war casualties. Then tell me with a straight face that AI is bad.
We have proven countless times, ever since the middle 90s, that computers are better at making specialized decisions, even than the most well trained human specialist, even in fields like medicine. Any opposing opinion on this topic is just ignorance of fact.
Source: youtube · Posted: 2015-07-30T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UggLZ6M2z5JqTngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugi3e0GA4HfH8HgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugizge_QLY4xw3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UghHaxdOpGagangCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgieDwB_j4qUKngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjvbmAc83_c83gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugia_nrfbV5-d3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggRCZjvTN6Mg3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjRoWWlA3PONXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugh0VoxfRhV-tXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
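Before a raw response like the one above is stored as a coding result, it is useful to validate that each record carries an expected comment ID and only in-schema values. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred solely from the codes visible on this page, not from the project's full codebook, and the function name is illustrative.

```python
import json

# Allowed values inferred from the codes visible on this page.
# The actual codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "liability", "industry_self"},
    "emotion": {"approval", "fear", "indifference", "resignation", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only in-schema records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every coded dimension must take a known value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UggLZ6M2z5JqTngCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate(raw)))  # → 1
```

A record that fails either check (malformed ID or an out-of-schema label) is silently dropped here; a production pipeline would more likely log it and queue the comment for re-coding.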