Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgyjUQK0U… : "AI is data entry. Targeting all people colored dark in the Americas is preposter…"
- ytc_UgwCeZykj… : "Remember!...these robots are 'programmed!'... what they say!...is likely the sam…"
- ytc_UgzBO_tyk… : "AI isnt smart at all. What he means is that through predictive tools it has an e…"
- ytc_Ugz1E8pVa… : "AI costs less money, is effective, and brings more value to art production. Befo…"
- ytc_Ugx_KdB0F… : "AI can be incredibly helpful, but it relies entirely on the information it’s giv…"
- ytc_UgxdDc8wL… : "A tic-tac-toe game algorithm cannot be called AI under many definitions of what …"
- ytc_UgzAx33Iv… : "Is it just me or does it seem like the officials making these decisions don't un…"
- ytc_UgyrJhG2O… : "This is legitimately very creepy. This instance is terrible enough, but just thi…"
Comment
Not a lot of people know this but sea cucumber are actually submarines for little fish. It makes bigger fish think twice about eating them, one because they don't see the tiny fish and two because every fish knows that cucumber is a gastrointestinal timebomb that will repeat on ya at some point in the very near future. I'd like to see AI try and use a sea cucumber as a submarine.
Platform: youtube
Topic: AI Governance
Posted: 2025-07-08T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzZGopR7p3vYrtAkyF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwCBsPY63H7SvIMdnN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy15PwR-9O2qGxc4bx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxiyitVP_k1VvlsH_94AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwdHCHXEHm1tWg3_yd4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxXQh5EnNbzqgw8bVR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzWY9MjTO5qCpsLYeZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw7ZZc11GxJm5TqZPt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgygUNUhumAwTrIRSKN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgykHtzHTtCDOTcJkX14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"}
]
```