Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with their comment IDs):

- ytc_Ugw_nioyh… · "one can argue that they use AI to cut cost but like just use official art dawg t…"
- ytc_Ugza7jH6-… · "As an artist, I say Fuck the writers no one gave a fuck about artist Let the A…"
- ytc_Ugxwz5aiW… · "This is the third child who I have heard took their lives because of A.I or chat…"
- ytc_UgwfByng9… · "I think it's smart that the cops pull over Waymo. The provider gets instant feed…"
- ytc_Ugwc8Q169… · "35:10 - Meanwhile, in CCP Chy-Na - The western dogs are slowing their A.I resea…"
- ytc_Ugw07y6zC… · "Too late. Ai was never created for the benefit of humanity, it was invented for …"
- ytc_UgxPvMIY3… · "The AI has become so intelligent. I figure it out how to back engineer itself in…"
- ytc_UgysPsxTd… · "Keep in mind every single time a prediction like this is made it is wrong. Ai wi…"
Comment
A common misrepresentation of AI is that "it does not have a gut feeling" and often combined with human examples that displayed a good gut feeling. Humans have a terrible gut feeling when it comes to complex decision making (the book "thinking fast and slow" addresses this quite well). And even in this video many examples were given where humans made bad decisions but another one saved the day. An AI does have a gut feeling if trained on data representing "gut feelings" - an informal linguistic concept to convey decision making based on uncertain data, past experience and not fully conscious statistical neural net computations of the brain.
Let's just face it: Giving away a trigger for nukes to anyone is a scary thought, AI, human or anything inbetween.
youtube · AI Governance · 2023-07-16T21:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwm539up-aNfMAu-Y94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxJ6M36leEfuwblouV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxyEJUh6fIT-0rZK-94AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwWT342ZW-02_qPXdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxdJkW-cjGPEWhhh9Z4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkwMCqKl9jv6-k4UZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwCUztRmb-RlwsW0zB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxYjOQNhaSRlc6k18h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzayHyV0WLgioR-dAZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyY_KISsykn8ZTCCZx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```