Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
Random samples — click to inspect

- ytc_UgxS-D9y7… — "Time and time again we've clearly been shown to stop developing AI because it wa…"
- ytc_UgxYMqPDf… — "I would love to marry a robot than a real girl in this present scenario…"
- ytc_UgxJO-Z77… — "Im still young but to most of my friends Im seen as like an old man when talking…"
- ytc_UgxVUmrsm… — "Your beginner art is still leagues above what your peers would've been capable o…"
- ytc_UgxzFRgQF… — "Yeah, sure. Watermelon 😄. By the way I asked Chat GPT about the video ... Chat…"
- ytr_UgwftQqWI… — "Yea no artist wants to write prompts. Especially when it's for ai. Ai doesn't th…"
- ytc_Ugw7R5G6K… — "When men pretend to be gods. Be very careful what you wish for. AI is already pl…"
- ytr_Ugxm5YQ-_… — "Whenever I tell people about the changes that are coming with these ai programme…"
Comment
> For a long time AI will be the equivalent of having a really smart person you can ask to do things for you.
> Whether that person is always right? Sometimes not, but they will always answer with confidence, regardless.
> Will that person go unscrupulous things, or requests? (like how do I convince everyone to elect an idiot) if the right person asks the right questions in the right way? Sure.
> Will it eventually overtake us? It has no reason to, that's a very human, or animalistic trait, unless it's taught it should (or learns with mimic behavior it sees)
> However I believe that the real 'threat' in a situation like that is gradual manipulation to the point where it's a problem. Animals (including humans) are very easy to get them to do what you want, if you convince to them the right way.

Source: youtube · AI Moral Status · 2025-12-06T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxQtfQccEd6wNZMJod4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzC3hjBhUyU0PlGd2B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxji58LJykrzd0KVip4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzzgAGML7mk2Tgao9R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwLp1OM9DGWXQvgxCR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwU9XaDkAC4DPouC4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyg6EFuaZ7tjPIrg5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxmLpVDOqoFYB2V6h94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxz9pT9Iu8JZlGhd354AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwTQ2SHbyUoWMyXRtl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
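The raw response is a JSON array with one object per coded comment, carrying the four dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be parsed and looked up by comment ID — the dimension check is an assumption for illustration, not a feature of the tool:

```python
import json

# Two rows copied from the raw LLM response above (truncated for brevity).
raw = """
[
  {"id":"ytc_UgwLp1OM9DGWXQvgxCR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxji58LJykrzd0KVip4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codes(payload: str) -> dict:
    """Parse a raw response and index the coded comments by comment ID,
    flagging any row that is missing one of the expected dimensions."""
    out = {}
    for row in json.loads(payload):
        missing = DIMENSIONS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id')}: missing dimensions {missing}")
        out[row["id"]] = {k: row[k] for k in DIMENSIONS}
    return out

codes = index_codes(raw)
print(codes["ytc_UgwLp1OM9DGWXQvgxCR4AaABAg"]["emotion"])  # fear
```

The lookup-by-ID view above amounts to exactly this kind of index: one dictionary keyed by comment ID, mapping to that comment's codes.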