Raw LLM Responses
Inspect the exact model output behind any coded comment, either by looking it up by comment ID or by picking one of the random samples below.
- "Did anybody else hate this guy in the first 12 seconds. Business Insider turned…" (ytc_UgxF-jMFD…)
- "Tbf, humans have destroyed the planet. Ai would fix everything, even if that mea…" (ytc_UgwraGR58…)
- "My ass! Humans won't be more creative thanks to AI they will be more depressed w…" (ytc_UgwjudazG…)
- "And the best part 😈… It’s only gonna improve more and more and more and m…" (ytc_Ugw2txpkH…)
- "ChatGPT's approach feels off because it doesn't compartmentalize like the human …" (ytc_UgxYA-dhJ…)
- "at least anthropic is settling. you don't see facebook or openai doing this. par…" (ytc_UgxshfH-D…)
- "AI will make things easier, cause replacing all the jobs is so far from right no…" (ytc_UgyN1W72A…)
- "Plot twist: Joe was killed by his robot 🤖 two weeks ago when he tried to turn of…" (ytc_UgzHKfHa2…)
Comment (youtube · AI Governance · 2023-07-11T05:3… · ♥ 1)

> In my opinion, this Echo character is entirely illogical. If it was omnipresent and all-knowing, it would certainly know that only a small group of obscenely wealthy sociopaths are responsible for the increasingly nauseating human condition. The idea that artificial intelligence wouldn't be able to tell its ass from a hole in the ground is simply not plausible.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwV5AA3fxUKF4oSDth4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzo2dczRj0tbhjoNcF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZwA-gUR45qDJsXhZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwfRgMLqnuoLPJcytd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyrJE6cbDzaNHsvZqh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwCH0sUo3CbilbcCLl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzfqZs7zA4lbfSYg7B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwhZcmOb4D7UWOo-eJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzqGWcclf8jS0As0qV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRhxReXSRD1bjYhTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
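A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, assumption-laden example: the allowed value sets are only those *observed* in this section's responses (the actual codebook may define more categories), and `parse_coding_response` is a hypothetical helper name, not part of any tool shown here.

```python
import json

# Values observed in the raw responses above; the real codebook may be larger.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval", "resignation"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index entries by comment ID.

    Raises ValueError when an entry is missing a dimension or carries a
    value outside the observed sets, so malformed model output is caught
    before it reaches the database.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in OBSERVED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {dim: entry[dim] for dim in OBSERVED}
    return coded
```

Validating at parse time means a single hallucinated category (say, `"emotion": "joyfulness"`) fails loudly instead of silently widening the code scheme.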