Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking a comment up by its ID or by browsing the random samples below.
Random samples

- "AI isn’t ready to be left unchecked. I have used AI and it is extremely inconsi…" (ytc_Ugyb3W_Jl…)
- "AI is going to be the best and worst thing to happen depending on how you view t…" (ytc_Ugx2PPFpj…)
- "People could only work 4-6 hours than the machines, but the machines needs to co…" (ytc_UgwUDqi9g…)
- "I mainly blame tech industry for this (ai) word calculator hype. How this word c…" (ytc_Ugyyf6a5L…)
- "I want this AI just so I can see what a Slipknot album would sound like through …" (ytc_UgyZ3QgJV…)
- "Why do I feel like we will be to AI what dogs are to us from this conversation?…" (rdc_kvx4f3r)
- "AI is happening we are fuked end of the human race as we know it comprende SAVY …" (ytc_Ugz98YKPX…)
- "Electric, automatic locking systems and pulling in to ports to be chained by the…" (ytr_UgxJqB-3M…)
Comment

> Mobile phones will evolve. Because we will make them do so, and we will eventually become the mobile phones themselves.
> Perhaps the next evolutionary step is to blend both organic and silicone computers with one another.
> I'm not sure computer's on their own will become conscious. But! Blend them together, a conscious being AND a computer and voilà, a conscious sentient computer.
> This may seem scary and weird, but it's already happening with people using phones.
> We could literally live forever if we are part robot, and part human.
> Space travel is more likely and reasonable. Because any technology will evolve.
- Source: youtube
- Topic: AI Moral Status
- Posted: 2023-11-30T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzZy7Tw_2zFp8Qei4B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwr_JeHItVJi0z-COh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzgMjLLf79_TrPDPvh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzqK659iwfbgHvB38h4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw5RWRc9Fyi6eIXI4d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz_0BaLrF7yhJ268I54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxlCCfPcoWeVQffCwB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwGqS0I52JT32aqOTJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDD9UqqCZrb9EqBz94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyNzpyyTRyrh9mUuMd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
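A response in the shape above can be parsed and indexed for lookup by comment ID. The sketch below is illustrative only (the variable names and the two sample rows are assumptions, not part of the tool); it shows one way to turn the raw JSON array into a dictionary keyed by `id`:

```python
import json

# Raw LLM response: a JSON array of per-comment codings, using the
# schema shown above (id, responsibility, reasoning, policy, emotion).
raw = '''[
  {"id": "ytc_UgzZy7Tw_2zFp8Qei4B4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwr_JeHItVJi0z-COh4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

# Index the codings by comment ID so any coded comment can be
# looked up directly, as the page's "look up by comment ID" feature does.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_Ugwr_JeHItVJi0z-COh4AaABAg"]
print(coding["emotion"])  # fear
```

Indexing by ID makes each lookup a single dictionary access rather than a scan of the array, which matters when a batch response contains many coded comments.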