Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> The problem is that the ai is most likely told to “do the task as efficiently as possible” and since it cant do the task when turned off, and humans can turn it off, the most efficient way to do the task is genocide. It will not feel or know what it is doing, or know its killing its creators

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2025-06-14T20:5… |
| Likes | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
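Each coded comment carries one value per dimension. A minimal validation sketch — the category vocabularies below are inferred from the values visible on this page, not from the project's actual codebook, so treat them as assumptions:

```python
# Hypothetical per-dimension vocabularies, inferred from values seen on this
# page (the real codebook may define more or different categories).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record looks well-formed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above.
example = {
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "fear",
}
print(validate(example))  # []
```

A check like this catches the common failure mode of structured-output coding, where the model invents a label outside the taxonomy.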
Raw LLM Response
```json
[
  {"id": "ytr_UgxyARTCIsg81omiiPF4AaABAg.AKSDx2qv8LyAKX_qIv3868", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxLWJ3uONAvavI13Op4AaABAg.AK8mkGAe3-tAK9Ud574pbQ", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxUWFqVah2FJzgdBrN4AaABAg.AK-Uu-RQe_HALHId0NY7Tv", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxUWFqVah2FJzgdBrN4AaABAg.AK-Uu-RQe_HALHlLQRrUkO", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugw4_0jrPEWzN0wLAtx4AaABAg.AJlDdknxjp5AJmZ5bSqo1j", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwjoXODDUBCclk1VmR4AaABAg.AJOgLrBhuc_AKxW9Q7n2nC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgwjoXODDUBCclk1VmR4AaABAg.AJOgLrBhuc_AL956VxR3ra", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxJLdqC8G7J4boEzZt4AaABAg.AJKHYm_IJ8sAJMogj2FkD-", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxJLdqC8G7J4boEzZt4AaABAg.AJKHYm_IJ8sALP1iuRubVZ", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgwoK-5RApA_9dHm4Nh4AaABAg.AJDGnm4jn6-AJJ9fSwAI-M", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
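Because the raw response is a JSON array of `{id, responsibility, reasoning, policy, emotion}` objects, looking a coded comment up by its ID reduces to a parse-and-index step. A minimal sketch — the comment ID below is a made-up placeholder, not a real one from this dataset:

```python
import json

# Abbreviated raw response in the same shape as the array above;
# "ytr_example.hypothetical01" is an illustrative, invented ID.
raw = """[
  {"id": "ytr_example.hypothetical01",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

# Index the codings by comment ID so each lookup is a dict access.
codings = {row["id"]: row for row in json.loads(raw)}

record = codings["ytr_example.hypothetical01"]
print(record["emotion"])  # fear
```

Building the index once and reusing it is what makes the "inspect any coded comment" lookup cheap even when a batch response contains many records.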