Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- "It is definitely a video film in which real people are used throughout, and fake…" (ytc_Ugx24DxGp…)
- "And ChatGPT is not gonna let it slide after introducing their new way making a b…" (ytc_UgxpJZCws…)
- "How are you blaming chatGPT for his suicide when the program even gave him the p…" (ytc_UgzHREBeR…)
- "To be honest, ai would only freak out if it was in danger. The best way to skir…" (ytr_UgwV5VNsp…)
- "How are atonomous driver and equipment checks done???? Does autonomous know wh…" (ytc_UgzAfNrFO…)
- "I already knew AI 🤖 will cost more harm than good because in a very short time h…" (ytc_Ugx-vJ1h7…)
- "what about the other way around, what if you use ai art to trace and make real a…" (ytc_UgytOC_Um…)
- "Driverless cars are a hazard. They've already proved that, several times. Driver…" (ytc_Ugw94BjwH…)
Comment

> AI told me the tower of Babel story is a myth. Back then God confused the languages of the earth because the people were getting too big for their boots. If God is? (I think He is. "I am the great I am") then we are probably not far away from another divine intervention!

youtube · AI Governance · 2025-09-07T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxbPXBs24PHAub1dQZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgydzkdUFJ18Ks5MkLx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzyE89NDRBKR4VwIl94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxmVnEBP9_5juJOyyd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzn3yK3ykP0L6FOORx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8QBP4LSUTZe8lVb94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwRAQAh_SBW5t7B7xZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzmKKzuKe8VOIzyfux4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-SaLxTREgt-bDV_R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgySDKpwc4MOWF5vNjh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
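The raw response above is a JSON array of coded records, one per comment, with four coding dimensions each. A minimal sketch of how such a batch might be parsed and sanity-checked before ingestion — note that the allowed value sets below are inferred only from the values visible on this page, not from the full codebook, and are therefore assumptions:

```python
import json

# Allowed values per coding dimension -- inferred from the records shown
# above; the real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "government", "distributed",
                       "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "unclear", "none"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    fall inside the allowed sets."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs on this page start with "ytc_" or "ytr_".
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgxbPXBs24PHAub1dQZ4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1 valid record
```

A check like this catches the common failure mode where the model hallucinates a category label outside the codebook; such records can then be queued for re-coding instead of silently polluting the results.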