Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytr_UgwqGXeu6…`: "Yes, I'm worried about not knowing if I'm talking to a human or an AI so I treat…"
- `ytr_Ugwjof6X6…`: "Is that really your first question: how would a robot get there quicker than a p…"
- `ytc_UgyktXQ5q…`: "If AI is going to take all the jobs who is left to pay for goods and services? B…"
- `ytc_UgwAHDy30…`: "Where does your subconscious come from....and it emerges fully tuned into the No…"
- `ytc_UgwZ0WpqQ…`: "I only use AI for the purpose of designing a template for my character until I c…"
- `ytc_UgwK3RxPw…`: "You know what's scary.... Robots... are basically humans... who don't get tired.…"
- `ytc_UghgtcFB4…`: "i dont think AI would want rights. we need right to protect our lives and feeli…"
- `ytc_UgyIipp-a…`: "So I got my degree in CompE where the lqnguagea to learn were C,C++& Java. did I…"
Comment
Your conclusion is reliant on the presumption that AI will always be just as useful as any other tool. But that's just unjustified imo. With the proliferation of super powerful computers in this race for AI and control, really, everyone is trying to produce their own chips and there are rapid advancements in Quantum computing, I could surely see the possibility of an AGI that surpasses all human-exclusivism in the near future. We've been saying only humans can do this and only humans can do that and we've been proven wrong on so many of those things that it's no longer reasonable to have such assumptions. I'd say we're no longer the only ones who can use tools, we may very well be the tools that computers and AI uses. And that's a scary thought but it's not far-fetched at all.
Source: youtube | Posted: 2023-08-30T15:4… | Likes: 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
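The four dimensions in the table above take values from closed label sets, so a coded record can be validated on ingest. A minimal sketch, assuming the label vocabularies inferred from the values visible on this page (the exact sets used by the coding pipeline are an assumption and may be incomplete):

```python
# Hypothetical validator for one coded record. The allowed label sets below
# are inferred from values visible on this page, not from the pipeline's spec.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none"},
    "emotion": {"fear", "approval", "indifference", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose values fall outside the allowed sets."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# A well-formed record yields no violations.
assert validate({"responsibility": "none", "reasoning": "consequentialist",
                 "policy": "none", "emotion": "fear"}) == []
```

Running the validator over a whole batch before storing it catches malformed or hallucinated labels early, rather than at analysis time.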
Raw LLM Response
```json
[
  {"id":"ytr_Ugw0FsLVFmnSqfG0qhd4AaABAg.9tnDKm4peuA9tsD1uyVM0T","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxTorq7V9o2sLu92wJ4AaABAg.9tP8oJaT2J69uX0ZVFqNdY","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxXlbU4GEvtPCkNE_J4AaABAg.9tLQLowYv_89u3HwR_3Kz7","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx1lzNSE5QEBF1gho54AaABAg.9tC3JJ23g6K9tNgTd2rCEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxlDKVGN9w4WFpMXWF4AaABAg.9t5usdplrq_9u2FNlarsiB","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxlDKVGN9w4WFpMXWF4AaABAg.9t5usdplrq_9u2NSnL3cxB","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxlDKVGN9w4WFpMXWF4AaABAg.9t5usdplrq_9u2jUH5J31u","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxlDKVGN9w4WFpMXWF4AaABAg.9t5usdplrq_9u2k77yU6xf","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwRrdN4ObLi77vdRb94AaABAg.9t5KXf_tIBi9tP9UkF6UM1","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwjE1GeJpxUTR3uf9x4AaABAg.9sgEBOgE-T89t2KZ-HbuYZ","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```