Raw LLM Responses
Inspect the exact model output for any coded comment: look a comment up directly by its ID, or pick one of the random samples below.
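As a rough sketch of what such a lookup amounts to, assuming the coded records are stored one-per-line in a JSON Lines file (the file name and schema here are assumptions, not the tool's actual storage):

```python
import json

def lookup_raw_record(comment_id: str, path: str = "coded_comments.jsonl"):
    """Scan a JSONL file and return the record matching comment_id, or None."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# This full ID appears in the raw batch at the bottom of this page.
print(lookup_raw_record("ytc_Ugxh3riF0-4UK4etQ0d4AaABAg"))
```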
Random samples — click to inspect:

- While holding it upside down, I taught my calculator to say the word “boobs” 😲 b… (ytc_UgwcyPni-…)
- Yes We are extremely smart however the most important part to intelligence is co… (ytc_Ugw8UcTwx…)
- What you’re discussing toward the end is literally what the gospel promises: Jes… (ytc_UgxCijJsP…)
- I thought ai art was kinda neat, but I felt bad using it too much. It was great … (ytc_UgwAED2pH…)
- AI might regulate the government itself, considering how things going. Still bet… (ytc_UgxjQ_T0M…)
- The trick is not to be first in, but to be the last company standing. Given the… (ytr_Ugw_0NU8p…)
- So it begins huh? The guy who fears a.i. takeover is researching and funding the… (ytc_Ugwx1EKq9…)
- the fact that we aren't zealously unionizing every workplace worries me. when th… (ytc_UgwSDMatp…)
Comment

> We build cars without airbags, seatbelts and many others safety features first. It's hard to find any human progress without making mistakes along the way and it's easy to say just "make the landing gear for AI first" but the problem is in the first place that noone knows what that would even look like and there is an opporunity cost to just "slow down", not to mention how such a slow down would even look/work in practise. I mean what exactly would that entail?
>
> It's like telling physicists in the early 20th century to slow down with all that physics stuff so we have more time to figure out how we might deal with nuclear weapons down the line, it's just not how technology works.
Source: youtube · Video: AI Moral Status · 2025-11-02T19:5…
Coding Result
| Field | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
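A coded record can be sanity-checked against the codebook before use. The allowed value sets below are only inferred from the batch at the bottom of this page; the real codebook may well define more categories:

```python
# Allowed values per coding dimension (inferred from observed output, an assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"approval", "fear", "resignation", "indifference", "mixed"},
}

def validate(record: dict) -> list:
    """Return a list of validation errors for one coded record."""
    errors = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

coded = {"id": "ytc_Ugxh3riF0-4UK4etQ0d4AaABAg",
         "responsibility": "distributed", "reasoning": "consequentialist",
         "policy": "industry_self", "emotion": "resignation"}
assert validate(coded) == []
```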
Raw LLM Response
```json
[
  {"id":"ytc_UgyjZyTJQdV33bw0vop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCMEtyTtZwynwkXrV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxh3riF0-4UK4etQ0d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugw7UPSqMIu1xFiIUSl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzrp8HbL5oyccS7tDh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyJZ5WYBWtWhye6KXN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw7_T-EMPRxzTRgF_N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy2ds2xE56wcAnbRrZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwRM1UtUh06iVVjG654AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw22W8hz_3dOr8fC7h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
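When post-processing a batch like this, a defensive parse is useful, since models occasionally wrap the array in code fences or prepend stray prose. A minimal sketch, not necessarily what this pipeline actually does:

```python
import json

def parse_llm_batch(raw: str) -> list:
    """Parse the JSON array out of a raw model response.

    Falls back to slicing between the first '[' and the last ']' in case
    the model wrapped the array in fences or added text around it.
    """
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        start, end = raw.find("["), raw.rfind("]")
        if start == -1 or end <= start:
            raise
        return json.loads(raw[start:end + 1])

# "raw_response.txt" is a stand-in path for wherever the raw text is stored.
with open("raw_response.txt", encoding="utf-8") as f:
    records = parse_llm_batch(f.read())
by_id = {r["id"]: r for r in records}
```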