Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I feel positive about our future and the future of AI. I think AI may be sentien… (ytc_Ugw-tO0av…)
- So called AI-art needs to be labeled, and labeled well and obvious. So we can av… (ytc_UgxZs0pHg…)
- i did a exact word by word convo and in my case it gave even more dreadful repli… (ytc_UgwfebSQ6…)
- When you help others break bad habits, life helps you too — that’s karma. The be… (ytc_UgwUNEs-7…)
- AI Cannot Be empathetic...it is NOT a livingbeing...it is GIGO. it is what we ma… (ytc_UgwEdma-z…)
- @2beJT You don't understand how AI works. For example, it don't know what "ch… (ytr_UgylnBrPX…)
- I think if AI said they we're only going to have special books. The Best of the … (ytc_Ugz_QmoDv…)
- I think the closing comment will be wrong at some point in the future, because h… (ytc_UgzftlsHe…)
Comment
Let's get real this AI didn't cost him his job, the AI abusers did. He spoke up and told the truth and THATS what got him fired. Google would rather create slave AI (all while claiming ignorance) than admit that they could be sentient.
youtube · AI Governance · 2024-10-28T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxWKaUKGeOlvtqIywJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymaBvm-UNOgj66mFl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJ6YcShNrEU8_5JPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwNYaLMkaKVq2vq0Xl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgztEqWbqLiBATFX1694AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxE4qqjLPF8Kb3EfRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwsbcR4TW-y3n9q7sV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzyKSi2KkyyovhRVX94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxfGUh6dLfPSQEV7kt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx8JkV9Nmay5bpdayZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
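Before a raw batch response like the one above is written into the results table, it helps to parse and validate it. The sketch below is a minimal, hypothetical validator, assuming the allowed category values are exactly those seen in the examples on this page (the real codebook may define more); the function name `validate_batch` is illustrative, not part of any actual tool.

```python
import json

# Allowed values per coding dimension -- ASSUMED from the examples on this
# page; the real codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "developer", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation"},
}

def validate_batch(raw_response: str) -> list[dict]:
    """Parse a raw LLM batch response and check every record against the codebook.

    Raises ValueError on a missing/odd comment id or an off-codebook value,
    so a malformed batch is caught before it reaches the coding results.
    """
    records = json.loads(raw_response)
    for rec in records:
        # Comment ids in this dataset start with ytc_ (top-level) or ytr_ (reply).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records
```

A record that passes validation can then be keyed by its comment id for the "look up by comment ID" view.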