Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgyQGNVWM…`: @voteformethanks5307 Yeah but not that many will do what a singer says. Where a…
- `ytc_UgwNBlzdX…`: This calls for a resurgence of the Army of Nedd Ludd. Search it and understand …
- `ytc_UgzCz47Zh…`: Andrew Ng’s talk on AI's potential for business got me thinking about its huge p…
- `ytc_UgxsmvVvf…`: Wake up people! Do you want the 'Red Pill or the 'Blue Pill'? As smartphones, so…
- `ytc_Ugywn_tnZ…`: Can Ai sort out her eyebrows ? 😊 You would think with a job like that someone wo…
- `ytc_UgzwsUQCg…`: It works from my experience but you need serious prompting to achieve it. not ju…
- `ytc_UgxJSn3-E…`: It seems like ai should be built with the highest initial inputs being towards a…
- `ytc_UgxJHIUga…`: Wait until the left starts fighting for AI's rights... They'll want you to pay i…
Comment

> Did AI make this video, because it is not very smart. There’s a problem from almost the beginning when they show the calculator coming up with the answer of 30,50,400. Everyone knows the coma is in the wrong place. And secondly it doesn’t check its own work. This is the very reason applications are thrown out by HR’s in companies.

youtube · AI Governance · 2025-09-19T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz75BRy43mxZOrXEu54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx3HQwFppCl1j1Zha14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgweCBjWGgPoWFNaOyR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugye7H2e2tZZ8uxGzx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx2sBiRspZRqpAvUMp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwoX5GrLveiF9UZPPt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxgcE43MvydC4pYifV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw22vRnyjYiGy9ZDGp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3yjrnX4DHTwySuV14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgweAu6DIW3Xa7xP5QZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
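The coding result shown above is extracted from this raw JSON batch. As a minimal sketch of how such a response might be parsed and validated, the following assumes the allowed category values inferred from the samples on this page (the real codebook may define more categories, and `parse_batch` is a hypothetical helper, not part of the actual pipeline):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}, rejecting invalid values."""
    coded = {}
    for entry in json.loads(raw):
        comment_id = entry["id"]
        codes = {dim: entry[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: invalid {dim!r} value {value!r}")
        coded[comment_id] = codes
    return coded
```

Under that assumption, looking up one comment's codes, as the inspector does, is a plain dict access: `parse_batch(raw)["ytc_UgweAu6DIW3Xa7xP5QZ4AaABAg"]`.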