# Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below:
- "My essay got marked for being AI, because I used big words to illustrate a furth…" (`ytc_UgyKbA9yP…`)
- "The AI debate is so intensely reactionary and bad faith, and almost all the argu…" (`ytc_UgyB9AGh-…`)
- "It think if AI gets good enough to write all code, then it will be AGI. Because …" (`ytc_UgxUOdXGS…`)
- "I understand your concern, but as an AI, he does not possess emotions or persona…" (`ytr_UgwmQYcwe…`)
- "I'm writing a small paper on that and it's scary how good some ai's are at gener…" (`rdc_lqrhsbw`)
- "Remember AI is already prodigy level knowledge and top1% innate intelligence. On…" (`ytc_UgwRiEcC5…`)
- ""and the most likely cause of consciousness is something extra-dimensional to do…" (`ytr_UgyoScBtz…`)
- "My favorite ai defender response is just the cynical “I hope your talent is over…" (`ytc_UgxPmBRn0…`)
## Comment
AI may get smarter and self improving but it may not get cheaper than humans. Some jobs may be automated but the cost advantage of humans may be the key for saving most jobs. The current chips which push atomic boundaries using gargantuan amount of power is now barely getting close to say enthusiastic interns in terms of ability. Getting this scaled up to AGI may not be possible without a nuclear power plant attached to every data center. This also requires chips can continue scaling down to subatomic levels breaking physics. A new type of computing may emerge or even be developed by a self-improving AI itself but IMO is unlikely. IMO, this piece is sensationalist and alarmist paranoia.
youtube · AI Jobs · 2025-11-19T01:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
  {"id":"ytc_UgyxI0N4b4SH2GgZbgd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5e6DcQz6VwLiIQnB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwA_w6mjFgdDcLjf9Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxoV-6ErTrYgB9emoV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzqHomjXIrtySTzfal4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwZxNwtJOOXcDZnAft4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwyXfuf-Aqkx2Vo1Gl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxMMiqqCN9naoQ4EoF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzdKJrDZ1rAIoUIHdZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxOPTJJ_lUIqBpz8tp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
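A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below assumes the four dimensions shown in the table and the value sets visible in this sample; the actual codebook may define more categories, so `ALLOWED` is illustrative, not authoritative.

```python
import json

# Allowed values per coding dimension (inferred from this sample batch;
# adjust to match the real codebook).
ALLOWED = {
    "responsibility": {"none", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing IDs
    or out-of-vocabulary codes."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record missing comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical one-record batch for illustration:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # 1
```

Validating up front keeps a single malformed or hallucinated code from silently contaminating the coded dataset; a failed batch can simply be re-requested from the model.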