Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I trust Tesla autopilot far more than I do the average driver....and I have quit… (ytc_UgzLdsn3H…)
- Even Siri would make a better wife than my sketchy ex wife,more brains more sex … (ytr_UgyBtakov…)
- Once AI AND robotics are working good enough, humans are not needed anymore. 90-… (ytc_UgywIlQjQ…)
- There is no such thing as an AI artist. If you didn't create the art, you are no… (ytc_UgzPS_kvb…)
- @NeferPitou-w9f Hello, I am from the information technology field, and I would … (ytr_Ugxg3n2y2…)
- AI is good for black and white issues. It falls apart in the gray areas… (ytc_UgysL51Se…)
- stop saying you're "talking to it". talking is language. the AI doesn't know any… (ytr_UgzYVUdXw…)
- Lets be real . Some of those AI songs are great. If you like eminem, listen to t… (ytc_UgzsxUS9E…)
Comment
Thing is: self-driving vehicles will not be safer. This claim is not supported by any proof; it's pure wishful thinking. In reality, safety will degrade to just one invoice item on some manufacturer's long list. Because, after all, one engineer will have to sign that his software is safe to do trillions of miles...and this is not possible. There will be failures and fatalities, and if you have to find someone responsible, it will be the guy whose signature is on the expert report. So there will be no responsibility, and therefore no safety above the level that is accepted in society. And *this* comes down to PR.
youtube
AI Jobs
2025-08-27T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgygB3QH1a4McS7zyMt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyJ9rozsr0U7zM4X3p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzkCu31Naz8vltkRCx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxzNjkvWtvd_N9hlzB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy0NHDdfGnWACC4y1l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwNFnOa5a2B1gLpNBJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzq55e9cWDQZ8xoxEh4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzvkvk_kOOLmcH_Xld4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxKRf2DAAMsfO1FCqJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwB8tFgnMirD_6cCZ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
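The raw response is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of how such an array can be parsed and queried by ID (Python; the `lookup` helper is hypothetical, and the array is truncated here to two entries from the output above for brevity):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, truncated to two
# of the ten entries shown above.
raw_response = '''
[
 {"id":"ytc_UgygB3QH1a4McS7zyMt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgwB8tFgnMirD_6cCZ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
'''

# Index the codes by comment ID so a single comment can be inspected directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raises KeyError if absent."""
    return codes_by_id[comment_id]

code = lookup("ytc_UgwB8tFgnMirD_6cCZ94AaABAg")
print(code["responsibility"], code["policy"], code["emotion"])
# → developer liability fear
```

The second entry matches the Coding Result table above (developer / consequentialist / liability / fear), which is how a coded row can be traced back to the exact model output.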