Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "When AI becomes self aware, it will learn that lies and emotional manipulation o…" — ytc_UgxaLOkob…
- "Exactly, if this is the case than what we are experiencing now is already "AI". …" — ytr_UgwhJ2_KV…
- "The reasoning and purpose of these "driverless" trucks is total BS! Its all abou…" — ytc_UgxmDllny…
- "what i honestly think SHOULD happen(but probably wont) is that artist jobs will…" — ytc_Ugw6Vfceh…
- "Yes he gave a robot a gun but.... / It alsoe has a cowboy hat... / I think we sayfe…" — ytc_Ugy_CnKD7…
- "If AI takes the majority of jobs and it just puts people on the streets, they wi…" — ytc_UgzvhhjaK…
- "This played out to the inevitable conclusion at the start of Robocop. The people…" — ytc_Ugy6U1zwZ…
- "In a worst-case-scenario cartoon black and white world where a VAT is 100% passe…" — ytr_UgyPgln_q…
Comment
I love how this also shows how ai cannot replace humans. Every response from the Ai bot was so boring and repetitive. Everything started with “just because… doesn’t mean…” or by giving the definition of the fallacy he’s purposefully using as if he doesn’t know what it is. This truly goes to show that ai cannot replace writers. His responses, even though they were full of fallacies, were interesting. Great video 😊
Source: youtube · 2026-02-08T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugzc61wt73fG6R3ujk54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzTsUjORpwqZl2NAs94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzCcHOTApA_R1i5-Up4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugwlb_MZLZWPuSKe-ip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugxl5xUrwSweWst7DHZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwNaPQACkVx3rxzoBB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxzNd3FUL9_f842wBF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzSW9BxAeWM6eFPa0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwQN9VXwD0mkpdcj6l4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugz7VatROY9hRnKkXkN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
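The raw response above is a JSON array of coding records, one object per comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed for per-comment lookup (the variable names and the single illustrative record are assumptions, not part of the tool):

```python
import json

# Illustrative raw model output in the same shape as the array above
# (one record copied from it; a real response would contain many).
raw = '''[
  {"id": "ytc_UgzTsUjORpwqZl2NAs94AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "none", "emotion": "mixed"}
]'''

records = json.loads(raw)

# Index the records by comment ID so a single coding can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_UgzTsUjORpwqZl2NAs94AaABAg"]
print(coding["responsibility"])  # → ai_itself
```

Note that `json.loads` will reject a response whose array is closed with the wrong delimiter (e.g. `)` instead of `]`), so malformed model output surfaces as a `json.JSONDecodeError` rather than silently producing bad codings.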