# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Comment

> Nothing you say matters. AI will eventually replace more and more jobs and things that need a human touch as of right now. No matter how many speak out we are only delaying the inevitable. Human art will still have a place in the world tho, either in the abstract department or just because some people like the fact that it’s made by a human and not a machine. It’s just the price of progress 😔

Platform: youtube
Source: Viral AI Reaction
Posted: 2022-12-28T16:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response

```json
[
{"id":"ytc_UgyoUI_Ws_nHhwNSRkl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzjUbkn3BgLk2wfXW14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw3zPtKFN9EZDYJCrZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxDT4N6HC7OMjBPg894AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyH1xuB1dSq2Dcvez14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx74rcyUAMELJ1AbhN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz-DEgfRNXK0d2xY4Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx2WIblDVG3K_6zg-p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw0EoKqN-G6s_c-6pp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugw8Q5HSQoqhi-TZ9lh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"approval"}
]
```
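The coding result shown above is the record in this batch response whose `id` matches the inspected comment (`ytc_UgyH1xuB1dSq2Dcvez14AaABAg`). A minimal sketch of that lookup, assuming the raw response parses as a JSON array of records like the one shown (the function name `lookup_coding` is illustrative, not part of the tool):

```python
import json

# Excerpt of a raw batch response as shown above (two records kept for brevity).
raw_response = """
[
{"id":"ytc_UgyH1xuB1dSq2Dcvez14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz-DEgfRNXK0d2xY4Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding record for one comment ID.

    Returns None if the ID is absent from the batch.
    """
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return by_id.get(comment_id)

coded = lookup_coding(raw_response, "ytc_UgyH1xuB1dSq2Dcvez14AaABAg")
print(coded["emotion"])  # resignation
```

Building the `by_id` dictionary once lets the same parsed batch serve repeated lookups; a response that fails `json.loads` would surface here as a `JSONDecodeError` rather than a silently wrong match.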