Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a coding by comment ID, or browse the random samples below.
- ytc_Ugwx0WJ3m…: The officers participating in this "Predictive Policing" BS need to take a step …
- ytr_UgwBuef20…: Hello! It seems like you're using emojis in your question, but I'm here to help.…
- ytc_Ugx474M8N…: The problem isn't robots or AI taking over jobs. The problem is the company owne…
- ytc_Ugyv_D_za…: It's not the fault of ChatGPT. It's the fault of people. People are easily corru…
- ytc_Ugzrluxq6…: If we don't protect the artists, we won't protect ourselves. AI is coming for ev…
- ytc_UgzJpsJE5…: The idea that an LLM can be conscious is laughable. They're completely static an…
- ytc_UgwQ4P61h…: AI can be designed to destroy or control infrastructures and government systems …
- ytc_Ugy_--Xp4…: I say please thank you wish it a good day lol 😂 Chatgpt is the only one that gen…
Comment
this MI movie on AI wasn't that exciting, as during the entire movie was expecting something exciting to happen but ML was just escaping from AI the whole time, and also there was no conclusion kind of, may be i missed something,
but the movie Transcendence was on another level showcasing AI stuff,
youtube · AI Governance · 2025-05-28T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
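A coding like the one above can be sanity-checked against the label sets for each dimension. The sketch below is a minimal validator; the allowed values are inferred only from the codings visible on this page, not from an exhaustive codebook, so treat the label sets as assumptions.

```python
# Example label sets inferred from the codings shown on this page
# (an assumption, not the full codebook).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "mixed", "approval", "indifference"},
}

def validate(record: dict) -> list:
    """Return the dimension names whose value falls outside the known set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above validates cleanly:
bad = validate({"responsibility": "none", "reasoning": "unclear",
                "policy": "none", "emotion": "indifference"})
print(bad)  # []
```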
Raw LLM Response
[
{"id":"ytc_UgwvuPy4g4voDhgAXqh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugw80dDUnd-OwVYden94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9uHR4Ii_Box8sMfJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw1v6QvaQ6-X9ozatF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyx7zPraCvuDdbUpW14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxx2GpOFyPGYckl1tN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVlXSjIRgHe1kEUuJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwej-mA9cyf6gQNQ7F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy7INBaFPCN0UzZyrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxBL5cjaYQqRo2R2lh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
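The raw response is a JSON array with one coding record per comment, so look-up by comment ID reduces to parsing the array and indexing it by `id`. A minimal sketch, assuming the raw response text is available as a string (only two of the ten records are reproduced here for brevity):

```python
import json

# Raw model output: a JSON array of coding records, one per comment.
raw = '''[
  {"id":"ytc_UgwvuPy4g4voDhgAXqh4AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxBL5cjaYQqRo2R2lh4AaABAg","responsibility":"none",
   "reasoning":"deontological","policy":"none","emotion":"indifference"}
]'''

# Index the codings by comment ID for constant-time look-up.
codings = {rec["id"]: rec for rec in json.loads(raw)}

record = codings["ytc_UgwvuPy4g4voDhgAXqh4AaABAg"]
print(record["responsibility"])  # developer
```

Indexing once and reusing the dictionary avoids re-scanning the array on every look-up when many comment IDs are queried.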