Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by comment ID, or pick from the random samples below.
- “They’ve been screaming fast food was going to be automated out of existence for …” — `rdc_mt7y8xh`
- “11:00 Datacenter coolant is only replaced every 6 to 12 months. So it's a non is…” — `ytc_Ugy3Oiky-…`
- “The A.I. is NOT biased. It's calculating based on only mathematical FACTS and no…” — `ytc_Ugy1CUuob…`
- “All these Ai people , they have no faith in God . Their god is artificial intell…” — `ytc_UgxsgpfBQ…`
- “did that man just say that they take shit off of shows and movies in order to 'i…” — `ytc_Ugxa8XyuU…`
- “Demon technology Not saying that I'm religious but I am righteous Any AI will t…” — `ytc_UgzDBgazm…`
- “Tell me you know NOTHING about AI whilst simultaneously being the guy which runs…” — `ytc_UgwN9phxw…`
- “Why not use ai to call businesses?if that is the job of ai receptionist to sell…” — `ytc_Ugy-EBSVd…`
Comment
@5:26
"There are only two options here."
There actually isn't. If I get 8/10 on a test, did I lie about the 2 questions I got wrong? No, I was just incorrect. When ChatGPT says things like "I'm excited" it's not lying to you, it's just wrong. It's trained on data from the internet, where people constantly say and/or type things like "I'm excited for ***" or things of that nature. There's not actually any excitement.
The takeaway here isn't that ChatGPT is consciousness, but that we're easily fooled by things that appear human, but aren't.
Source: youtube · AI Moral Status · 2024-08-04T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzp2tZt81a2ENceMQF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzuijGqUYmqvn0oCiR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzy-xS6TFR9y0hY9Wd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzHnoaaIV4qx4psxU94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy9sI54APglMcRWJ7d4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyG9w4m31N3jMQPQdN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugzl4WfKaGY2oxYYGdp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzZz67QI4uQiYTe0kl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwCnXpViIBBj1ClJ6F4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxk9RE_jALJbWunvMV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
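Since the model returns codings for a whole batch as one JSON array, looking up a single comment means indexing that array by `id`. A minimal sketch, assuming the raw response is a JSON array of objects like the one above (only two entries are inlined here for brevity):

```python
import json

# Two entries copied from the raw batch response above.
raw_response = """[
  {"id": "ytc_Ugzp2tZt81a2ENceMQF4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzHnoaaIV4qx4psxU94AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]"""

# Build a comment-ID -> coding index, as the "look up by comment ID" view needs.
by_id = {row["id"]: row for row in json.loads(raw_response)}

coding = by_id["ytc_UgzHnoaaIV4qx4psxU94AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

The same dict-comprehension index works unchanged for the full ten-entry batch; if the model ever returns duplicate IDs, the last occurrence silently wins, so a validation pass may be worth adding upstream.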