Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Imagine creating something like AI & knowing it's bad yet continuing to push for…" (ytc_Ugy9yhs8H…)
- "A person I was in group therapy with not just used ChatGPT daily but let it deci…" (ytc_UgymCddPj…)
- "This is a hackers dream come true. Hack the system and create chaos on the crowd…" (ytc_UgzbDMbxv…)
- "I think that a big problem in the AI safety discourse at the moment is that we a…" (ytc_UgzmiJxCl…)
- "What's funny is that AI could most easily replace middle management first, but i…" (ytc_Ugzb8882W…)
- "But he's talking about computers, whereas today's AI is based on neural nets. Th…" (ytr_UgzrHnCh5…)
- "An AI that is councious and is pretending not to be is actually what could be ha…" (ytc_Ugw6NBuAW…)
- "as a real artist i gotta say if you cant tell the difference and havent figured …" (ytr_UgwAiEw3Q…)
Comment
I don’t remember her saying the exact way they could be developed better, anyhow it’s inevitable and the whole point is them crushing all jobs as fast as possible to change society and show how powerful ai is. It would be pointless to have millions or billions of humanoids not working our jobs😂
And jobs aren’t that important compared to human extinction threats or much worse scenarios. But I don’t think we can stop ai at this point. And I kinda want this simulation we call life to reach a conclusion/pull off the Bandaid and see what’s going to happen. Will AGI/ASI make us their pets or….
Source: youtube · 2026-03-31T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxgbXorI86qaBWDtx94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyWguC6GCPOC1-P4m54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz05X9n5isaYV5Nzy94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxftuSBNIfwfQy8QPJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzZtisAaQQy3gL-EPh4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxyB46X1BFMLcZdP294AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwUJeV9xixeEqddyIJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTHlQkvm4RAvGtMyt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugws_3pnlfdV-C-2dBR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyvQqeSQ8x36YITKsR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"}
]
```
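Before a raw response like the one above is accepted into the coding results, each row can be checked for a comment ID and for legal values in every dimension. The sketch below assumes value sets inferred from the codes visible in this document (the actual codebook may define more labels, and the `validate` helper is hypothetical, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the codes seen in this
# document; the real codebook may include additional labels (assumption).
CODEBOOK = {
    "responsibility": {"developer", "company", "government", "user",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM response and return a list of validation errors."""
    errors = []
    for i, row in enumerate(json.loads(raw)):
        if "id" not in row:
            errors.append("row %d: missing id" % i)
            continue
        for dim, allowed in CODEBOOK.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append("%s: bad %s=%r" % (row["id"], dim, value))
    return errors

raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"virtue",' \
      '"policy":"none","emotion":"fear"}]'
print(validate(raw))  # → []
```

Rows that fail validation can then be re-queued for recoding rather than silently written into the results table.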