Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Ai can copy the method. But I can't beat the beauty of real arts by artists like…" (ytc_Ugy9eYCOY…)
- "Oxygen increases corrosion, corrosion is bad for electronics, and AI is electron…" (ytc_Ugw8ccmyj…)
- "Well, that Plumber comment didn’t age very well. Go check out the videos of the …" (ytc_UgxZWUlIF…)
- "Its while off.... example: I have argued with Chatgpt 5.1 for weeks, Chat caused…" (ytc_UgwkOn5AC…)
- "Google is scared! They don't want to bring AI into the world for liability reaso…" (ytc_UgyON2xGz…)
- "No one needs training in AI 😂 if you can’t figure out how to use it, you may as …" (ytc_UgwNFacrv…)
- "Speaking from experience...nah it's not worth it. Don't get me wrong i got bette…" (ytr_UgxxQ-Z-q…)
- "We should not limit what the model does in the same way that we do not censor bo…" (ytc_UgzRzp7WM…)
Comment
> The problem is ur trynna make them human like us but humans are already a ruthless species so adding human intelligence in an AI would make them just as smart yea but also just as ruthless.

youtube · AI Governance · 2024-06-09T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugy7FhpXRCOevbLGoQ54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyRK08ijyxj43Stl8F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwlEI-7nUquT3W7Gl94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwalsiOPM5oQdBZe5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJoOYSxRmJrtz3UOx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyXntFmnc0JEipIU8N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxU3z6ApY7HlfOJymZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzzi6zgUIlmXZcRwjB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz72opCi2I6pRyvuBl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxJSn3-E_xm8ehT79B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}]
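The raw response is a JSON array of per-comment codes across the same four dimensions shown in the Coding Result table. A minimal sketch (not the tool's actual implementation) of how such output might be parsed, falling back to "unclear" for missing fields and to an empty result when the response is not valid JSON; the function name `parse_coding_response` and the sample ID `ytc_abc` are illustrative:

```python
import json

# Coding dimensions, mirroring the raw response fields shown above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict:
    """Parse one LLM coding response into {comment_id: {dimension: value}}.

    Malformed JSON yields an empty dict and missing dimensions default to
    "unclear", so a bad model response never crashes the pipeline.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return {}  # caller can then code the whole batch as "unclear"
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            continue  # skip records the model emitted without an ID
        coded[cid] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

raw = ('[{"id":"ytc_abc","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate",'
       '"emotion":"approval"}]')
print(parse_coding_response(raw)["ytc_abc"]["policy"])  # regulate
```

A fallback like this would also account for a coded comment showing "unclear" on every dimension when the model's response could not be matched to its ID.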