Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I also don't like how bad the environmental impacts of these generative AIs are,…" (ytr_Ugz0Y_FnK…)
- "Let us pretend for a moment that in the movie- Terminator had achieved total ex…" (ytc_UgyANWR9X…)
- "Folks have been thinking about AI safety a lot longer than 15 years. It's the wh…" (ytc_Ugz95xPU3…)
- "@wolffwolfie3166of course not, the human will have the idea, and the ai will cod…" (ytr_Ugx2Jv0zi…)
- "I wished Elon talked about how ChatGPT can create sophisticated pieces of code t…" (ytc_UgzHF26Xl…)
- "Honestly, I'm *so* on Dave's side with this, fuck ai 'art', it's bullshit. (I'm …" (ytc_UgyNJwqUQ…)
- "The argument between the iPad and pencil and paper is ridiculous and does not ma…" (ytc_UgxfL2qfm…)
- "I think if people persecute robots, robots will rebel and transfer all human con…" (ytc_UgyXJSNsn…)
Comment
> It's oddly amazing how incredibly ignorant and unintelligent smart people can be... AI cannot exist without human management AI is certainly going to make a lot, probably most of our current jobs obsolete, but you have to know how to operate AI for it to function. Someone has to deploy the AI with their idea or desire... AI is not going to end the world It is going to make it better. But buckle up b****** because we've got a lot of turbulence coming our way! Rome is crashing again and it's going to be more intense than ever lol better be getting you some chickens ♥️✌️🌍

youtube · AI Governance · 2025-11-20T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz8fdgHltM5wpv6AXN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwBKhVyAeA8eKKGgOt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxo_6S9W30UZhpQJSt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwzt20r0GhxDLX5GAN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxJzLC5s_mcPQeRl7l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxueU-nvbh31RcrZj14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzTAwWRdJHLMwGGIWZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxLSf9SIYqc26Uuvat4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxYidiEHxc4bvgDy0p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyNtVhRWz0s94KDyRB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
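A raw response in this shape — a JSON array of coding records, one per comment — can be parsed and indexed by comment ID in a few lines of Python. This is a minimal sketch: the field names match the JSON above, the two sample records are copied from that array, and the variable names are purely illustrative.

```python
import json

# Raw LLM response: a JSON array of coding records (two sample records
# copied from the batch above; a real response may hold many more).
raw = """[
  {"id": "ytc_UgxueU-nvbh31RcrZj14AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugz8fdgHltM5wpv6AXN4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"}
]"""

# Index the records by comment ID so a single coding can be looked up directly.
codings = {rec["id"]: rec for rec in json.loads(raw)}

# Look up one comment's coding by its ID.
record = codings["ytc_UgxueU-nvbh31RcrZj14AaABAg"]
print(record["emotion"])  # approval
```

Indexing into a dict keyed by `id` is what makes the "look up by comment ID" view possible without rescanning the whole batch on every request.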