Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "AI made me lose interest in my major but helped me succeed in another major that…" (ytc_UgxOBThii…)
- "I use Co-Pilot a lot, mainly getting info from Steam Games. It can read the manu…" (ytc_UgzCP1pZK…)
- "Ai preferred Men over women 😂 probably because men are logical therefore can hel…" (ytc_Ugz7FxfdT…)
- "I had technology company for more than 10 years. Yes, I have earned much (run 12…" (ytc_UgzmmSbEE…)
- "The "AI copying is the same as human inspiration" talking point is so thoroughly…" (ytc_Ugx3TptpJ…)
- "This is an interesting topic that raises so many questions that cannot be answer…" (ytc_Ugyo_HXtJ…)
- "Humans have a singular knack for putting themselves at the center of the univers…" (ytc_UgwzeC9BU…)
- "Grok is much better, though not perfect. Is it ok to be proud to be black? Yes…" (ytc_UgxGbSqIw…)
Comment

> Fr tho ai should not have consciousness, humans should not give up control. We should NEVER consider put out brains into a machine. Yeah it's one thing to have a pacemaker or maybe a computer chip can help people who are paralyzed or something but having the possibility of controlling or affecting your thoughts or actions from a keyboard should be completely illegal. We were born with free will and should never surrender it

Source: youtube · AI Harm Incident · 2025-09-01T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
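Each coded record can be sanity-checked against the label sets that appear in the model output. A minimal validation sketch, assuming the label sets below, which are only those visible on this page; the full codebook may define more categories:

```python
# Allowed label sets per dimension, inferred from the visible output only
# (assumption: the real codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "intellectual", "none"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation"},
}

def validate_record(rec: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    if not str(rec.get("id", "")).startswith("ytc_"):
        problems.append("missing or malformed comment id")
    for dim, allowed in ALLOWED.items():
        if rec.get(dim) not in allowed:
            problems.append(f"unexpected {dim}: {rec.get(dim)!r}")
    return problems

# First record from the raw response below, copied verbatim:
record = {"id": "ytc_UgzTlHVp6Q1BsgGRy-B4AaABAg", "responsibility": "developer",
          "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
print(validate_record(record))  # → []
```

Running a check like this over every batch catches the common failure mode where the model invents a label outside the codebook.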
Raw LLM Response

```json
[
{"id":"ytc_UgzTlHVp6Q1BsgGRy-B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwmLWg9YPbGOO7Gh7F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4RLdbZZZvm8RFfvN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxQeA4pGo_PtPElS-V4AaABAg","responsibility":"intellectual","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaVPCGlxZnuvwdE6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz4X-1N4XIk-JYCSQ14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwCYafao9N1i7qyhQ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugz92bairmfuiRE9NZp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz31T1cUq1ePVO9Avh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwcY8__jhFEoOW1x9F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
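The raw response is a JSON array of records keyed by comment ID, so looking up the exact model output for one coded comment reduces to parsing the array and indexing it. A small sketch (not the app's actual code) using two records copied verbatim from the response above:

```python
import json

# Two records copied verbatim from the raw LLM response; in practice the
# full array string would be loaded instead of this excerpt.
raw = '''[
{"id":"ytc_UgzTlHVp6Q1BsgGRy-B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz4RLdbZZZvm8RFfvN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # index once for O(1) lookup

rec = by_id["ytc_Ugz4RLdbZZZvm8RFfvN4AaABAg"]
print(rec["policy"], rec["emotion"])  # → ban outrage
```

Building the dictionary once avoids rescanning the array on every lookup, which matters when responses are batched across many pages of comments.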