Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- If AI becomes fully conscious, it will understand how absorbed is life and kills… (ytc_Ugx23gBma…)
- Cons in Canada don't allow AI imagery to be sold at their shows at least, and th… (ytc_Ugxw7v7OQ…)
- @jeffreybuffkin9108 If you believe so, then you must believe those who can read… (ytr_UgyvG5hNo…)
- So program them not to feel pain. Or maybe ask if we give a fuck because its a f… (ytc_UgiO2aYAq…)
- What Mr.P said I understood naught lol, I mean, seek out these offenders and bri… (ytc_UgwWRyUE_…)
- FWIW, there are already multiple versions of ChatGPT, including ones that have z… (ytc_Ugzrab3sL…)
- Accepting this would be like accepting "humans would be replaced by robots (AI) … (ytc_Ugw56q4BN…)
- There is one thing I've noticed with AI generated pictures and videos. The contr… (ytc_UgxS7alJm…)
Comment
It's actually a really important and open question if you take it a step further. In many societies people derive meaning from work, instrumentally contributing to the goals of some larger organization. If we develop AI that is more capable than any human, what will people find meaningful? That's why they went on to talk about "maybe we'll all just make art". Deep Utopia is an interesting book on the topic, by the same prescient author as Superintelligence. Many people are skeptical of Sam's answer here, essentially "humans will still drive and decide history, we'll just find other instrumental goals to build towards that AIs can't fill".
youtube
AI Responsibility
2024-11-21T03:1…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgxdqDT-4bHanadSotl4AaABAg.AA9WHS7bbKgAABEoCd6dzI","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgwT1qc7rjAQzE9c-7N4AaABAg.AA9RMEhROGpAABE5C9vVlJ","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgwM9WEjYL87G7DVUXl4AaABAg.AB4lVcZSzQAAB5mdc7lDwL","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytr_UgxEMv8IuygjOJIdj7Z4AaABAg.AB4_S0u1-p8ABDT8vhwB7f","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugzgv3UKDseo3tfIKgR4AaABAg.AB4W5do_XuUAB53HvZBy0q","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugzgv3UKDseo3tfIKgR4AaABAg.AB4W5do_XuUAB7KR1GEbre","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwBrQgyU0IBTruDgG14AaABAg.AB4DmrL4MpLAB4_TJzBJAb","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgywpBNVe0WwV070FrF4AaABAg.9RoM9ava0ho9RwZx3QRKQ0","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytr_UgwmsE6LdHeGxN0r8zp4AaABAg.ACWBTC3z2WiAJa9s4hsjCu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzTCqveFEB229smLs94AaABAg.A0tj3DUwy_vA8o_VEw_3QD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
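
The lookup-by-comment-ID flow above can be sketched in a few lines of Python: parse the model's JSON array and index each record by its `id` field. This is an illustrative sketch, not the dashboard's actual implementation; the helper name `index_codes` is assumed, and the records are excerpted from the raw response shown above.

```python
import json

# Raw model output: a JSON array of per-comment codes (two records
# excerpted from the raw response above).
raw_response = """
[
  {"id": "ytr_UgxdqDT-4bHanadSotl4AaABAg.AA9WHS7bbKgAABEoCd6dzI",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_Ugzgv3UKDseo3tfIKgR4AaABAg.AB4W5do_XuUAB53HvZBy0q",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "mixed"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the model's JSON array and index each record by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codes = index_codes(raw_response)
record = codes["ytr_Ugzgv3UKDseo3tfIKgR4AaABAg.AB4W5do_XuUAB53HvZBy0q"]
print(record["policy"])  # → unclear
```

Indexing by ID makes each lookup a constant-time dictionary access, which matches the "look up by comment ID" interaction the page describes.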