# Raw LLM Responses

Inspect the exact model output for any coded comment (look up by comment ID).

## Random samples
- `ytc_UgwXbqNCU…`: That futute AI world? Plain simple: a vision od AI (the richest not included, ju…
- `ytc_Ugzv9PClO…`: AI as a solution got isolation and loneliness? Ahahaha. Things like it are the p…
- `ytc_UgxNJ15sB…`: shad got quite a big backlash on his main swords channel when he posted ai video…
- `ytr_Ugz1jsgxH…`: @loopy01-f2q ASK CHATGPT (PBUH) Who is speaking in 1 John 5:20 GPT ANSWERS Th…
- `ytc_UgxIzGd3E…`: "The AI will never beat human art, since they will NEVER have their emotions fil…
- `ytc_Ugzj87J0_…`: Ai is taking the blow to the stomach. Hell yeah I am in F and screw the AI. Let …
- `ytc_UgyU-meDZ…`: The a.i. may become infallible in the next 25 or 30 years.. the human drivers th…
- `ytc_UgzihUCu-…`: At this point it sounds like I should pray for a worldwide Carrington Event to s…
## Comment

> You say it all funny good sir, but this is indeed the right answer. For the short time that capitalism still exists, just pay people for their data contributions. But that stuff is mostly crap anyway because at this point most of the people who actually care about anything besides themselves are volunteering all their data to open source models for free, because, yeah, AI CEOs are kind of humanity's last hope before we render the planet uninhabitable to mammals

- Source: youtube
- Posted: 2025-01-11T21:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response
```json
[
  {"id": "ytc_Ugx01arb0TTigbO09Y14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw4rI2FKT17z3yyjmJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx8FbmOPZ3Rn8GMUh14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyanmcXhj10GnnIvSp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw_dTRnss5nFbmqVSh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxYNg3MfbrgP4RySb14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwQi-6c2xJlisrFGWJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwobR9M1NYwtH3Knh54AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugz0BYuff0wwnLnCv3l4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw4pEfNfrnzBjRaFhJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
```
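A response like this can be parsed and checked before the codings are stored. The sketch below is one way to do it in Python, assuming the allowed values per dimension are only those visible in this response (the actual codebook may define additional categories, and `parse_codings` is a hypothetical helper name, not part of the tool shown here):

```python
import json

# Allowed values per dimension, inferred from the values visible in the
# raw response above; the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index the codings by comment ID,
    rejecting any record with an out-of-vocabulary value."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

raw = ('[{"id":"ytc_UgwobR9M1NYwtH3Knh54AaABAg","responsibility":"company",'
       '"reasoning":"contractualist","policy":"industry_self","emotion":"mixed"}]')
codings = parse_codings(raw)
print(codings["ytc_UgwobR9M1NYwtH3Knh54AaABAg"]["policy"])  # → industry_self
```

Indexing by ID is what makes the "look up by comment ID" view possible: each coded record, such as the one shown in the Coding Result table, can be retrieved directly from the parsed response.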