Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Through the Ages it are the Intellectual Classes that conjure reason and rationale that culminate into harmful and deathly situations for significant segments of Human population; why should the advent of Artificial Intelligence be deemed different save for its potential to be the most fatal for ALL humanity ?😅😅😅😅

| Field | Value |
|---|---|
| Platform | youtube |
| Incident | AI Harm Incident |
| Published | 2025-09-01T14:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzTlHVp6Q1BsgGRy-B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwmLWg9YPbGOO7Gh7F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4RLdbZZZvm8RFfvN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxQeA4pGo_PtPElS-V4AaABAg","responsibility":"intellectual","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaVPCGlxZnuvwdE6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz4X-1N4XIk-JYCSQ14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwCYafao9N1i7qyhQ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugz92bairmfuiRE9NZp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz31T1cUq1ePVO9Avh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwcY8__jhFEoOW1x9F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
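The raw response is a JSON array with one coding object per comment. A minimal sketch of how such a batch could be parsed, validated, and indexed by comment ID — the allowed value sets below are an assumption inferred from the codings visible on this page, not the tool's actual codebook:

```python
import json

# Allowed values per dimension -- an assumption inferred from the codings
# shown above, not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "intellectual", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID.

    Raises ValueError if a row is missing a dimension or uses a value
    outside the ALLOWED sets above.
    """
    index = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r} value {value!r}")
        index[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return index

# Usage: look up one coding by its comment ID.
raw = ('[{"id":"ytc_UgzTlHVp6Q1BsgGRy-B4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UgzTlHVp6Q1BsgGRy-B4AaABAg"]["policy"])  # regulate
```

Validating every row before indexing means a single malformed coding fails the whole batch loudly, rather than silently entering the dataset with an out-of-schema value.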