Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytc_Ugz9JMtI0… — "It seems the godfather of AI has already issued plenty of warnings. It would be …"
- ytc_UgyQQkinX… — "So he had to go to school and get a new job.Because the robot took his job yet, …"
- ytc_UgzjEKWdG… — "The claim that \"AI trains on water\" fundamentally misrepresents both artificial …"
- ytr_Ugwv_f52F… — "The point is that the ai actually saved the girl's life. If this is a real perso…"
- ytc_UgywVfDkh… — "I'm not a Luddite by any stretch of the imagination but we as a collective whole…"
- ytc_UgwyATiPs… — "AI war is the most scary thing. Thats what they are all aiming for, they just do…"
- rdc_n7sux2g — "Yeah, I'm not trying to downplay the huge leaps we've had, but until we start br…"
- ytr_UgxCTNapp… — "I have to disagree. There are many artists who've pushed their style far enough …"
Comment
Nah, here's the real answer: people will, just like always, choose to use AI or not use AI, and of course rich people will hire employees JUST to talk to AI all day and make their businesses more efficient and produce more wealth, and regular people will just mess around with it and not really even know what to do with it in any organized way. The wealth gap will then increase EXPONENTIALLY. Altman is acting like everybody will just all of a sudden know what to use AI for and how, and have the MOTIVATION to use it consistently enough to be competitive, know what its shortcomings are, etc. Get ready for a wealth gap even more absurdly immoral than today. Literally you'll either be in Gaza or a space colony--Elysium I think is probably one of the most prophetic sci-fi films.
youtube · AI Moral Status · 2025-07-29T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx8NxoyMXZIBXeVout4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxxTw-p293PuvM2mi94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzw2wNiChpyHydm3hl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzTYX4cTJB5ETUPhuB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyQaDdEvpV-AyQdHgd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw3ODm_eQ1YMxFgDpR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwinob3QEaQ1dU_lYd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgywbhFt6zB-1KPnapZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwJV_eEDRkHIV4FR4J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyoOMBOhCe6U3aSmPl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
```