Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- `ytc_UgxbghNUG…`: My thought is, the driver should be paying attention to the road and always be r…
- `ytc_Ugy--cJZJ…`: not that im for replacing humans with AI, but if the employees webcam "isnt work…
- `ytr_Ugy0Xrlgw…`: Very few people coming out of college make $250k. Most start between $50k to $80…
- `ytc_UgwiWUOen…`: In my experience you probably wouldn't have had that accident if a human was dri…
- `ytc_UgwGDUqhV…`: If you understand, how LLM works it is not scary at all. It is total BS. It is L…
- `ytc_UgwNU0FeX…`: AI is not replacing jobs, companies are restructuring who knows AI in there indu…
- `ytc_UgxiCxEBI…`: Programming will switch from a standardized programming language like C++ to pro…
- `ytc_Ugwlephpo…`: I you can't recognize when a LLM "hallucinates" given an order, you shouldn't g…
Comment (youtube, 2026-04-04T04:3…, ♥ 1)

> @KO-sx9uy Y'know King In Yellow? If not, look it up. Basically if you see him you get too much knowledge for humans to handle and are driven to madness. Mabye AI would get treatment like that.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwLEHnUrgjSAJZGYSt4AaABAg.ARH0tFfE18aASFyC5eywZm","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgzIlOP21OViEwVThKN4AaABAg.AQdwyuAO0u7AQsLUDPpI64","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxPbs-orV0cWuk4tMF4AaABAg.AQ2MUhKlPaZAQNOMMXxRrT","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgxPbs-orV0cWuk4tMF4AaABAg.AQ2MUhKlPaZAQNpdFvmPJJ","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxPbs-orV0cWuk4tMF4AaABAg.AQ2MUhKlPaZARvMMtNaQgi","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugx5yPvbayTrEKtw0Nl4AaABAg.APoonGeQtVDAPrmli1Rxxc","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugyp6ZXk7UwhtpKd2np4AaABAg.APGg6qewiWxARvNBcF0VQu","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxJ8I_z6ffrSwYDD0l4AaABAg.AOTUa5f-DRkAOdOA2X2Dp9","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugylta7DXki56LLngYB4AaABAg.AO-4egeHU8dAVA4qRlWD9X","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugy7c_-Fsxi_Srk4OFt4AaABAg.ANqsAc2Z9XEAOiJjaNfDKU","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
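Each element of the raw response is one coded comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a batch might be parsed and sanity-checked before use (the dimension names come from the coding-result table above; the validation logic itself is illustrative, not part of the original pipeline):

```python
import json

# Dimensions expected in every coded record, per the coding-result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# A one-record sample taken from the raw response shown above.
raw = """[
  {"id": "ytr_Ugy7c_-Fsxi_Srk4OFt4AaABAg.ANqsAc2Z9XEAOiJjaNfDKU",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]"""

records = json.loads(raw)
for rec in records:
    # Flag any record the model returned without all four dimensions.
    missing = [d for d in DIMENSIONS if d not in rec]
    assert not missing, f"{rec.get('id', '?')}: missing {missing}"
    print(rec["id"], {d: rec[d] for d in DIMENSIONS})
```

Looking up a record by comment ID is then a dictionary build away: `by_id = {r["id"]: r for r in records}`. In practice the model's output may also fail to parse at all, so wrapping `json.loads` in a `try/except json.JSONDecodeError` is a reasonable extra guard.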